Search results for: toxicity testing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3972

2472 Experimenting with Error Performance of Systems Employing Pulse Shaping Filters on a Software-Defined-Radio Platform

Authors: Chia-Yu Yao

Abstract:

This paper presents experimental results on testing the symbol-error-rate (SER) performance of quadrature amplitude modulation (QAM) systems employing symmetric pulse-shaping square-root (SR) filters designed by minimizing the roughness function and by minimizing the peak-to-average power ratio (PAR). The device used in the experiments is the 'bladeRF' software-defined-radio platform. PAR is a well-known measure, whereas the roughness function is a measure of jitter-induced interference. The experimental results show that the system employing minimum-roughness pulse-shaping SR filters outperforms the system employing minimum-PAR pulse-shaping SR filters in terms of SER performance.
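As a rough illustration of the PAR figure of merit discussed in this abstract, the sketch below generates a 16-QAM signal, shapes it with a square-root raised-cosine (SR) filter, and computes its peak-to-average power ratio. The filter parameters (roll-off 0.25, 8 samples per symbol, 6-symbol span) are illustrative choices, not values from the paper.

```python
import numpy as np

def srrc_taps(beta, sps, span):
    """Square-root raised-cosine impulse response (unit symbol period T=1)."""
    t = np.arange(-span * sps, span * sps + 1) / sps
    h = np.zeros_like(t)
    for i, ti in enumerate(t):
        if np.isclose(ti, 0.0):
            h[i] = 1.0 - beta + 4 * beta / np.pi
        elif np.isclose(abs(ti), 1.0 / (4 * beta)):
            h[i] = (beta / np.sqrt(2)) * ((1 + 2 / np.pi) * np.sin(np.pi / (4 * beta))
                                          + (1 - 2 / np.pi) * np.cos(np.pi / (4 * beta)))
        else:
            num = (np.sin(np.pi * ti * (1 - beta))
                   + 4 * beta * ti * np.cos(np.pi * ti * (1 + beta)))
            den = np.pi * ti * (1 - (4 * beta * ti) ** 2)
            h[i] = num / den
    return h / np.sqrt(np.sum(h ** 2))   # normalize to unit energy

rng = np.random.default_rng(0)
# Random 16-QAM symbols, upsampled by sps, then shaped with the SR filter
levels = np.array([-3, -1, 1, 3])
syms = rng.choice(levels, 4096) + 1j * rng.choice(levels, 4096)
sps = 8
up = np.zeros(len(syms) * sps, dtype=complex)
up[::sps] = syms
tx = np.convolve(up, srrc_taps(beta=0.25, sps=sps, span=6))

power = np.abs(tx) ** 2
par_db = 10 * np.log10(power.max() / power.mean())
print(f"PAR = {par_db:.2f} dB")
```

A filter minimizing PAR would lower `par_db`; the paper's point is that this metric alone does not predict SER under timing jitter.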

Keywords: pulse-shaping filters, FIR filters, jittering, QAM

Procedia PDF Downloads 342
2471 Binding Mechanism of Synthesized 5β-Dihydrocortisol and 5β-Dihydrocortisol Acetate with Human Serum Albumin to Understand Their Role in Breast Cancer

Authors: Monika Kallubai, Shreya Dubey, Rajagopal Subramanyam

Abstract:

Our study concerns the biological interactions of synthesized 5β-dihydrocortisol (Dhc) and 5β-dihydrocortisol acetate (DhcA) with the carrier protein human serum albumin (HSA). A cytotoxicity study was performed on the breast cancer cell line (MCF-7) and the normal human embryonic kidney cell line (HEK293); the IC50 values for MCF-7 cells were 28 and 25 µM for Dhc and DhcA, respectively, whereas no toxicity in terms of cell viability was observed with the HEK293 cell line. Further experiments showed that Dhc and DhcA induced 35.6% and 37.7% early apoptotic cells and 2.5% and 2.9% late apoptotic cells, respectively. Morphological observation of cell death through the TUNEL assay revealed that Dhc and DhcA induced apoptosis in MCF-7 cells. Quenching in the HSA–Dhc and HSA–DhcA complexes was static; the binding constants (K) were 4.7±0.03×10⁴ M⁻¹ and 3.9±0.05×10⁴ M⁻¹, and the binding free energies were found to be -6.4 and -6.16 kcal/mol, respectively. Displacement studies confirmed that lidocaine (1.4±0.05×10⁴ M⁻¹) displaced Dhc and phenylbutazone (1.5±0.05×10⁴ M⁻¹) displaced DhcA, indicating that domain I and domain II are the binding sites for Dhc and DhcA, respectively. Further, CD results revealed that the secondary structure of HSA was altered in the presence of Dhc and DhcA. Furthermore, atomic force microscopy and transmission electron microscopy showed that dimensions such as the height and molecular size of the HSA–Dhc and HSA–DhcA complexes were larger than those of HSA alone. Detailed analysis through molecular dynamics simulations also supported the greater stability of the HSA–Dhc and HSA–DhcA complexes, and root-mean-square fluctuation identified the binding site of Dhc as domain IB and that of DhcA as domain IIA. This information is valuable for the further development of steroid derivatives with improved pharmacological significance as novel anti-cancer drugs.
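The reported binding free energies follow from the binding constants via the standard relation ΔG° = -RT ln K. The short check below reproduces values close to those in the abstract; the temperature of 298 K is an assumption, since the abstract does not state the experimental temperature (the small differences from the reported -6.4/-6.16 kcal/mol likely reflect this).

```python
import math

R = 1.987e-3  # gas constant, kcal mol^-1 K^-1
T = 298.0     # assumed temperature in K (not stated in the abstract)

def binding_free_energy(K):
    """Standard binding free energy (kcal/mol) from an association constant K (M^-1)."""
    return -R * T * math.log(K)

for name, K in [("HSA-Dhc", 4.7e4), ("HSA-DhcA", 3.9e4)]:
    print(f"{name}: K = {K:.1e} M^-1 -> dG = {binding_free_energy(K):.2f} kcal/mol")
```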

Keywords: apoptosis, dihydrocortisol, fluorescence quenching, protein conformations

Procedia PDF Downloads 132
2470 Frequency Modulation in Vibro-Acoustic Modulation Method

Authors: D. Liu, D. M. Donskoy

Abstract:

The vibroacoustic modulation method is based on the modulation of a high-frequency ultrasonic wave (carrier) by a low-frequency vibration in the presence of various defects, primarily contact-type defects such as cracks, delamination, etc. The presence and severity of the defect are measured by the ratio of the spectral sidebands to the carrier in the spectrum of the modulated signal. This approach, however, does not differentiate between amplitude and frequency modulation, AM and FM, respectively. It was experimentally shown that both modulations can be present in the spectrum, yet each may be associated with different physical mechanisms. AM mechanisms are quite well understood and widely covered in the literature. This paper is a first attempt to explain the generation mechanisms of FM and its correlation with flaw properties. We propose two possible mechanisms leading to FM, based on nonlinear local defect resonance and dynamic acousto-elastic models.
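The distinction drawn in this abstract can be illustrated numerically: both AM and FM place sidebands at fc ± fm, so the sideband-to-carrier ratio alone does not identify the modulation type. The sketch below uses illustrative frequencies and a modulation index of 0.2 for both cases (for AM the ratio is m/2; for narrowband FM it is approximately J1(β)/J0(β)).

```python
import numpy as np

fs, n = 8192, 8192             # 1 s of signal; integer numbers of cycles per bin
t = np.arange(n) / fs
fc, fm = 1000.0, 50.0          # carrier and modulating frequencies (Hz), illustrative

am = (1 + 0.2 * np.cos(2 * np.pi * fm * t)) * np.cos(2 * np.pi * fc * t)
fm_sig = np.cos(2 * np.pi * fc * t + 0.2 * np.sin(2 * np.pi * fm * t))

def sideband_ratio(x):
    """First-sideband-to-carrier magnitude ratio from the spectrum."""
    spec = np.abs(np.fft.rfft(x))
    carrier = spec[int(fc)]                       # bin k corresponds to k Hz here
    side = 0.5 * (spec[int(fc - fm)] + spec[int(fc + fm)])
    return side / carrier

print(f"AM sideband/carrier: {sideband_ratio(am):.3f}")
print(f"FM sideband/carrier: {sideband_ratio(fm_sig):.3f}")
```

Both ratios come out near 0.1, which is why separating the AM and FM contributions requires demodulation rather than sideband magnitudes alone.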

Keywords: non-destructive testing, nonlinear acoustics, structural health monitoring, acousto-elasticity, local defect resonance

Procedia PDF Downloads 154
2469 Stabilization of Pb, Cr, Cd, Cu and Zn in Solid Waste and Sludge Pyrolysis by Modified Vermiculite

Authors: Yuxuan Yang, Zhaoping Zhong

Abstract:

Municipal solid waste and sludge are important sources of waste energy, and their proper disposal is of great importance. Pyrolysis can fully decompose solid waste and sludge, and the pyrolysis products (char, oil and gas) have significant recovery value. Due to the complex composition of solid waste and sludge, the pyrolysis process at high temperatures is prone to heavy metal emissions, which are harmful to humans and the environment and reduce the safety of the pyrolysis products. In this paper, heavy metal emissions during pyrolysis of municipal sewage sludge, paper mill sludge, municipal domestic waste, and aged refuse at 450-650°C were investigated, and the emissions and hazards of heavy metals (Pb, Cr, Cd, Cu and Zn) were effectively reduced by adding modified vermiculite as an additive. The vermiculite was modified by intercalation with cetyltrimethylammonium bromide, which more than doubled the original layer spacing. Afterward, the intercalated vermiculite was exfoliated into vermiculite flakes. The expansion rate of the flakes was then increased by Mg²⁺ modification and thermal activation, and the expanded flakes were acidified to improve their textural characteristics. The modified vermiculite was analysed by XRD, FT-IR, BET and SEM to clarify the modification effect. Its incorporation resulted in more than 80% retention of all heavy metals at 450°C, with Cr, Cu and Zn retained better than Pb and Cd. The incorporation of modified vermiculite effectively reduced the risk posed by the heavy metals, with low risk levels for Pb, Cr, Cu and Zn. The toxicity of all heavy metals was greatly reduced, and their speciation shifted from the exchangeable and acid-soluble (F1) and reducible (F2) fractions to the oxidizable (F3) and residual (F4) fractions.
In addition, increasing temperature favored the stabilization of heavy metal forms. This study provides new insight into the cleaner use of energy and the safe management of solid waste.
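The speciation shift described above (F1/F2 toward F3/F4) is often summarized as the share of metal bound in the stable fractions. The sketch below computes this share for hypothetical, purely illustrative fraction percentages — they are not data from the study.

```python
# Hypothetical sequential-extraction fractions (%) before and after adding
# modified vermiculite; the percentages are illustrative, not measured data.
fractions_raw      = {"F1": 35, "F2": 25, "F3": 22, "F4": 18}
fractions_modified = {"F1": 10, "F2": 12, "F3": 38, "F4": 40}

def stable_share(f):
    """Share of metal in the oxidizable (F3) + residual (F4) fractions."""
    total = sum(f.values())
    return (f["F3"] + f["F4"]) / total

print(f"stable share, raw pyrolysis:    {stable_share(fractions_raw):.0%}")
print(f"stable share, with vermiculite: {stable_share(fractions_modified):.0%}")
```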

Keywords: heavy metal, pyrolysis, vermiculite, solid waste

Procedia PDF Downloads 70
2468 Verification of Low-Dose Diagnostic X-Ray as a Tool for Relating Vital Internal Organ Structures to External Body Armour Coverage

Authors: Natalie A. Sterk, Bernard van Vuuren, Petrie Marais, Bongani Mthombeni

Abstract:

Injuries to the internal structures of the thorax and abdomen remain a leading cause of death among soldiers. Body armour is a standard-issue piece of military equipment designed to protect the vital organs against ballistic and stab threats. When configured for maximum protection, the excessive weight and size of the armour may limit soldier mobility and increase physical fatigue and discomfort. Providing soldiers with more armour than necessary may, therefore, hinder their ability to react rapidly in life-threatening situations. The capability to determine the optimal trade-off between the amount of essential anatomical coverage and the hindrance to soldier performance may significantly enhance the design of armour systems. The current study aimed to develop and pilot a methodology for relating internal anatomical structures to actual armour plate coverage in real time using low-dose diagnostic X-ray scanning. Several pilot scanning sessions were held at the Lodox Systems (Pty) Ltd head office in South Africa. Testing involved using the Lodox eXero-dr to scan dummy trunk rigs at various angles and heights of measurement, as well as human participants wearing correctly fitted body armour while positioned in supine, prone shooting, seated and kneeling shooting postures. The sizing and metrics obtained from the Lodox eXero-dr were then confirmed against a verification board with known dimensions. Results indicated that the low-dose diagnostic X-ray can clearly identify the vital internal structures of the aortic arch, heart, and lungs in relation to the position of the external armour plates. Further testing is still required in order to fully and accurately identify the inferior liver boundary, inferior vena cava, and spleen. The scans produced in the supine, prone, and seated postures provided superior image quality over the kneeling posture.
The X-ray source and detector distances from the object must be standardised to control for possible magnification changes and to allow comparison. To account for this, specific scanning heights and angles were identified to allow parallel scanning of the relevant areas. The low-dose diagnostic X-ray provides a non-invasive, safe, and rapid technique for relating vital internal structures to external structures. This capability can be used for the re-evaluation of the anatomical coverage required for essential protection while optimising armour design and fit for soldier performance.

Keywords: body armour, low-dose diagnostic X-ray, scanning, vital organ coverage

Procedia PDF Downloads 124
2467 Effects of Vitamin C and Spondias mombin Supplementation on Hematology, Growth, Egg Production Traits, and Eggshell Quality in Japanese Quails (Coturnix coturnix japonica) in a Hot-Humid Tropics

Authors: B. O. Oyebanji, I. O. Dudusola, C. T. Ademola, S. A. Olaniyan

Abstract:

A 56-day study was conducted to evaluate the effect of dietary inclusion of Spondias mombin (SM) on hematological, growth, egg production and eggshell quality parameters of Japanese quails, Coturnix coturnix japonica. One hundred birds were used for this study; they were allocated randomly into 5 groups, each replicated twice. Group 1 served as a control without extract; groups 2, 3 and 4 received 200 mg/kg, 400 mg/kg and 800 mg/kg of SM, respectively; and group 5 received 600 mg/kg of vitamin C. The birds were weighed weekly to determine weight change. The blood parameters analyzed at the completion of the experiment were PCV, Hb, RBC, WBC and differential WBC count; MCH, MCHC and MCV were afterwards calculated from these parameters. Five eggs were collected from each group, and egg weight, eggshell weight, eggshell diameter, yolk weight, albumen weight, yolk diameter, yolk height, albumen percentage, yolk percentage and shell percentage were determined. There was no significant difference among the groups for the hematological parameters measured and calculated. The egg weight and albumen weight of quails on 800 mg/kg were the highest of all the groups; all other egg parameters measured showed no significant difference. The birds supplemented with vitamin C had the highest weight gain (40.8±2.5 g) and the lowest feed conversion ratio (2.25). No mortality was recorded in any group except the SM800 group, which had 10% mortality. It can be concluded from this experiment that vitamin C supplementation has a positive effect on quail production in the humid tropics and that the inclusion of Spondias mombin leaf extract has a dose-dependent toxicity in quails.

Keywords: hematology, quails, Spondias mombin, vitamin C

Procedia PDF Downloads 357
2466 Cognitive Weighted Polymorphism Factor: A New Cognitive Complexity Metric

Authors: T. Francis Thamburaj, A. Aloysius

Abstract:

Polymorphism is one of the main pillars of the object-oriented paradigm. It induces hidden forms of class dependencies which may impact software quality, resulting in a higher cost for comprehending, debugging, testing, and maintaining the software. In this paper, a new cognitive complexity metric called Cognitive Weighted Polymorphism Factor (CWPF) is proposed. Apart from the structural complexity of the software, it incorporates cognitive complexity on the basis of type. The cognitive weights are calibrated based on 27 empirical studies with 120 persons. A case study and experimentation with the new software metric show positive results. Further, a comparative study is made, and a correlation test has shown that the CWPF complexity metric is a better, more comprehensive, and more realistic indicator of software complexity than Abreu’s Polymorphism Factor (PF) complexity metric.
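For context, Abreu's Polymorphism Factor, which CWPF is compared against, is the ratio of actual method overrides to the maximum possible overrides in a class hierarchy. A minimal sketch on a made-up three-class hierarchy (the class data is illustrative, not from the paper):

```python
# Minimal sketch of Abreu's MOOD Polymorphism Factor (PF):
#   PF = sum of overriding methods / sum of (new methods * descendant count)
classes = {
    # name: (new_methods, overriding_methods, descendant_count)
    "Shape":  (3, 0, 2),   # base class with 2 descendants
    "Circle": (1, 2, 0),
    "Square": (0, 3, 0),
}

overrides = sum(o for _, o, _ in classes.values())
possible = sum(n * dc for n, _, dc in classes.values())
pf = overrides / possible if possible else 0.0
print(f"PF = {pf:.2f}")
```

CWPF, as described in the abstract, additionally weights each contribution by a calibrated cognitive weight rather than counting overrides uniformly.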

Keywords: cognitive complexity metric, object-oriented metrics, polymorphism factor, software metrics

Procedia PDF Downloads 461
2465 Distribution and Historical Trends of PAHs Deposition in Recent Sediment Cores of the Imo River, SE Nigeria

Authors: Miranda I. Dosunmu, Orok E. Oyo-Ita, Inyang O. Oyo-Ita

Abstract:

Polycyclic aromatic hydrocarbons (PAHs) are a class of priority-listed organic pollutants due to their carcinogenicity, mutagenicity, acute toxicity and persistence in the environment. The distribution and historical changes of PAH contamination in recent sediment cores from the Imo River were investigated using gas chromatography coupled with mass spectrometry. The concentrations of total PAHs (TPAHs), ranging from 402.37 ng/g dry weight (dw) at the surface layer of the Estuary zone (ESC6; 0-5 cm) to 92,388.59 ng/g dw at the near-surface layer of the Afam zone (ASC5; 5-10 cm), indicate that PAH contamination varied not only between sample sites but also within the same core. Sediment-depth profiles for the four cores (Afam, Mangrove, Estuary and illegal petroleum refinery) revealed irregular distribution patterns in the TPAH concentrations, except that the levels peaked at the near-surface layers (5-10 cm), corresponding to a geological time frame of about 1996-2004. This time scale coincided with the period of intensive bunkering and oil pipeline vandalization by the Niger Delta militant groups. A general slight decline was also found in the TPAH levels from the near-surface layers (5-10 cm) to the most recent top layers (0-5 cm) of the cores, attributable to the recent effort by the Nigerian government in clamping down on the illegal activity of these economic saboteurs. Therefore, the recent amnesty period granted to the militant groups should be extended. Although the mechanism of perylene formation remains enigmatic, examination of its distribution down the cores indicates natural biogenic, pyrogenic and petrogenic origins for the compound at different zones. Thus, the characteristic features of the Imo River environment provide a means of tracing diverse origins for perylene.
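The dating of core layers rests on an assumed sedimentation rate: depth divided by rate gives the time before sampling. The sketch below shows the arithmetic; the sampling year and constant rate are assumptions chosen so the 5-10 cm layer lands roughly in the reported 1996-2004 window, not values from the study.

```python
# Sketch of converting core depth to deposition year under a constant
# sedimentation rate; both parameters below are assumptions for illustration.
sampling_year = 2008
rate_cm_per_yr = 0.8   # assumed constant sedimentation rate

def deposition_year(depth_cm):
    """Approximate calendar year at which a given depth was deposited."""
    return sampling_year - depth_cm / rate_cm_per_yr

for depth in (0, 5, 10):
    print(f"{depth:>2} cm -> ~{deposition_year(depth):.0f}")
```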

Keywords: perylene, historical trend, distribution, origin, Imo River

Procedia PDF Downloads 251
2464 Efficient Moment Frame Structure

Authors: Mircea I. Pastrav, Cornelia Baera, Florea Dinu

Abstract:

A different concept for the design and detailing of reinforced concrete precast frame structures is analyzed in this paper. The new detailing of the joints derives from the special hybrid moment frame joints. The special reinforcements of this alternative detailing, named the modified special hybrid joint, are bondless with respect to both columns and beams. Full-scale tests were performed on a plane model representing part of a 5-story structure, cropped at the middle of the beam and column spans. A theoretical approach was developed based on the test results of the twice-repaired model subjected to lateral seismic-type loading. A discussion of the behavior of the modified special hybrid joint, and of the further research needed, concludes the paper.

Keywords: modified hybrid joint, repair, seismic loading type, acceptance criteria

Procedia PDF Downloads 523
2463 Serological IgG Testing to Diagnose Alimentary Induced Diseases and Monitoring Efficacy of an Individual Defined Diet in Dogs

Authors: Anne-Margré C. Vink

Abstract:

Background: Food-related allergies and intolerances occur frequently in dogs. Diagnosis, and monitoring of elimination efficiency according to the 'gold standard', are time-consuming, expensive, and require an expert clinical setting. To enable rapid, robust, quantitative testing for intolerance and identification of the individual offending foods, a serological test is proposed. Method: Since we had previously developed the Medisynx IgG Human Screening Test ELISA, and the dog's immune system is very similar to the human one, we were able to develop the Medisynx IgG Dog Screening Test ELISA as well. In this study, 47 dogs suffering from canine atopic dermatitis (CAD) and several secondary induced reactions were included in the serological Medisynx IgG Dog Screening Test ELISA (SD < 0.02%). Results were expressed as titers relative to the standard OD readings to diagnose alimentary induced diseases and to monitor the efficacy of an individual elimination diet. Split-sample analysis was performed by independently sending 2 × 3 ml serum under two unique codes. Results: The veterinarian monitored these dogs at least at 3, 7, 21, 49 and 70 days and after 6 and 12 months on an individual negative diet, with a positive challenge (retrospectively) at 6 months. Data for each dog were recorded in a screening form; a complete recovery of all clinical manifestations was observed at or before 70 days (between 50 and 70 days) in the majority of dogs (44 out of 47 = 93.6%). Conclusion: Challenge results showed 100% specificity and a 100% positive predictive value, while sensitivity and the negative predictive value were both 95.7%.
In conclusion, an individual diet based on the IgG ELISA in dogs provides a significant improvement of atopic dermatitis and pruritus, including all other non-specifically defined allergic skin reactions such as erythema, itching, and biting and gnawing at the toes, as well as several secondary manifestations such as chronic diarrhoea, chronic constipation, otitis media, obesity, laziness or inactive behaviour, pain and muscular stiffness causing movement disorders, excessive lacrimation, hyperactive or nervous behaviour, inability to stay home alone, anxiety, biting and aggressive behaviour, and disobedience. Furthermore, we conclude that a relatively more severe systemic candidiasis, as shown by a relatively higher titer (class 3 and 4 IgG reactions to Candida albicans), prolongs the recovery from clinical manifestations in affected dogs. These findings are consistent with our preliminary human clinical studies.
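The reported diagnostic figures can be reproduced from a confusion matrix. The counts below are hypothetical, chosen only because they yield the percentages stated in the abstract (the study does not report the raw counts):

```python
# Hypothetical confusion-matrix counts reproducing the reported figures:
# sensitivity/NPV ~95.7%, specificity/PPV 100%.
tp, fp, fn, tn = 22, 0, 1, 22

sensitivity = tp / (tp + fn)   # true-positive rate
specificity = tn / (tn + fp)   # true-negative rate
ppv = tp / (tp + fp)           # positive predictive value
npv = tn / (tn + fn)           # negative predictive value
print(f"sensitivity={sensitivity:.1%} specificity={specificity:.1%} "
      f"PPV={ppv:.1%} NPV={npv:.1%}")
```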

Keywords: allergy, canine atopic dermatitis, CAD, food allergens, IgG-ELISA, food-incompatibility

Procedia PDF Downloads 323
2462 Biodiesel Production from Palm Oil Using an Oscillatory Baffled Reactor

Authors: Malee Santikunaporn, Tattep Techopittayakul, Channarong Asavatesanupap

Abstract:

Biofuel production, especially that of biodiesel, has gained tremendous attention during the last decade due to environmental concerns and the shortage of petroleum oil reserves. This research aims to investigate the influence of operating parameters, such as the alcohol-to-oil molar ratio (4:1, 6:1, and 9:1) and the amount of catalyst (1, 1.5, and 2 wt.%), on the transesterification of refined palm oil (RPO) in a medium-scale oscillatory baffled reactor. It has been shown that an increase in the methanol-to-oil ratio resulted in an increase in fatty acid methyl ester (FAME) content, while the amount of catalyst had an insignificant effect on the FAME content. Engine testing was performed on B0 (100 v/v% diesel) and a blended fuel, B50 (50 v/v% diesel). Combustion of B50 was found to give lower torque compared to pure diesel, and the exhaust gas from B50 contained lower concentrations of CO and CO2.
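The methanol-to-oil molar ratios studied translate into methanol masses via simple stoichiometric bookkeeping. In the sketch below, the average molar mass of palm oil triglycerides (~847 g/mol) is an assumed literature value, not a figure from the abstract.

```python
# Methanol required for a given methanol-to-oil molar ratio.
MW_OIL = 847.0      # g/mol, assumed average for palm oil triglycerides
MW_MEOH = 32.04     # g/mol, methanol

def methanol_mass(oil_mass_g, molar_ratio):
    """Grams of methanol for a given oil mass and molar ratio."""
    return oil_mass_g / MW_OIL * molar_ratio * MW_MEOH

for ratio in (4, 6, 9):
    print(f"{ratio}:1 -> {methanol_mass(1000, ratio):.0f} g MeOH per kg oil")
```

The stoichiometric minimum for transesterification is 3:1 (three methanol per triglyceride); the excess ratios studied drive the equilibrium toward FAME.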

Keywords: biodiesel, palm oil, transesterification, oscillatory baffled reactor

Procedia PDF Downloads 178
2461 Flexible Arm Manipulator Control for Industrial Tasks

Authors: Mircea Ivanescu, Nirvana Popescu, Decebal Popescu, Dorin Popescu

Abstract:

This paper addresses the control problem of a class of hyper-redundant arms. In order to avoid discrepancy between the mathematical model and the actual dynamics, the dynamic model with uncertain parameters of this class of manipulators is inferred. A procedure to design a feedback controller which stabilizes the uncertain system has been proposed. A PD boundary control algorithm is used in order to control the desired position of the manipulator. This controller is easy to implement from the point of view of measuring techniques and actuation. Numerical simulations verify the effectiveness of the presented methods. In order to verify the suitability of the control algorithm, a platform with a 3D flexible manipulator has been employed for testing. Experimental tests on this platform illustrate the applications of the techniques developed in the paper.
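As a minimal illustration of PD position control of a lightly damped mode (not the paper's distributed-parameter model), the sketch below simulates a single second-order mode under a PD law; all plant and gain values are made up. Note the small steady-state offset, a known property of PD control without integral action.

```python
# PD control of a single flexible mode:
#   x'' = -w0^2 x - 2*zeta*w0 x' + u,   u = Kp (x_ref - x) - Kd x'
# Plant and gain values are illustrative, not the paper's model.
w0, zeta = 2.0, 0.02          # lightly damped mode
kp, kd = 100.0, 6.0           # PD gains
x_ref, dt = 1.0, 1e-3

x, v = 0.0, 0.0
for _ in range(20000):        # 20 s of semi-implicit Euler integration
    u = kp * (x_ref - x) - kd * v
    a = -w0**2 * x - 2 * zeta * w0 * v + u
    v += a * dt
    x += v * dt
print(f"final position: {x:.3f} (steady state {kp / (kp + w0**2):.3f})")
```

The closed loop settles at kp/(kp + w0²)·x_ref rather than x_ref exactly, which is one reason boundary-control designs are tuned and verified experimentally as in the paper.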

Keywords: distributed model, flexible manipulator, observer, robot control

Procedia PDF Downloads 321
2460 An Enhanced Harmony Search (ENHS) Algorithm for Solving Optimization Problems

Authors: Talha A. Taj, Talha A. Khan, M. Imran Khalid

Abstract:

Optimization techniques attract researchers aiming to formulate a problem and determine its optimum solution. This paper presents an Enhanced Harmony Search (ENHS) algorithm for solving optimization problems. The proposed algorithm converges faster and is more efficient than the standard Harmony Search (HS) algorithm. The paper discusses the novel techniques in detail and also provides a strategy for tuning the decisive parameters that affect the efficiency of the ENHS algorithm. The algorithm is tested on various benchmark functions, a real-world optimization problem, and a constrained objective function. The results of ENHS are compared to those of standard HS and various other optimization algorithms. The ENHS algorithm proves to be significantly better and more efficient than the other algorithms. The simulation and testing of the algorithms are performed in MATLAB.
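For reference, the standard HS algorithm that ENHS builds on maintains a harmony memory and creates new harmonies via memory consideration (rate HMCR), pitch adjustment (rate PAR, bandwidth bw), and random selection. A minimal sketch minimizing the sphere function (parameter values are illustrative, not the paper's tuning):

```python
import random

def harmony_search(f, dim, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.1, iters=5000):
    """Standard Harmony Search minimizing f over [lo, hi]^dim (sketch)."""
    lo, hi = bounds
    memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    costs = [f(h) for h in memory]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if random.random() < hmcr:          # memory consideration
                x = random.choice(memory)[d]
                if random.random() < par:       # pitch adjustment
                    x += random.uniform(-bw, bw)
            else:                               # random selection
                x = random.uniform(lo, hi)
            new.append(min(max(x, lo), hi))
        worst = max(range(hms), key=costs.__getitem__)
        c = f(new)
        if c < costs[worst]:                    # replace worst harmony
            memory[worst], costs[worst] = new, c
    return min(costs)

random.seed(1)
best = harmony_search(lambda v: sum(x * x for x in v), dim=5, bounds=(-5, 5))
print(f"best sphere-function value: {best:.4f}")
```

Enhancements such as ENHS typically adapt PAR and bw during the run to speed up this convergence.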

Keywords: optimization, harmony search algorithm, MATLAB, electronic

Procedia PDF Downloads 464
2459 Sol-Gel Coated Fabric for Controlled Release of Mosquito Repellent

Authors: Bhaskar M. Murai, Neeraj Banchor, Ishveen Chabbra, Madhusudhan Nadgir, S. Vidhya

Abstract:

Sol-gel technology combined with electronics and biochemistry helps overcome the problems caused by mosquitoes by enabling a portable, low-cost device for the controlled release of a compound trapped inside it. Sol-gel processing is a wet-chemical technique used primarily for the fabrication of silicate gels, which are usually allowed to dry as required; the outcome is a solid, rock-hard material that is porous and has many applications in different fields. Taking porosity as a key factor, allethrin, a synthetic analogue of naturally occurring pyrethrins with molecular mass 302.40, was entrapped inside the sol-gel matrix as a dopant. Allethrin is commonly used as an insecticide and is a key ingredient in commercially available mosquito repellents in Asian and subtropical countries. It has low toxicity for humans and birds and is used in many household insecticides, such as RAID, as well as in mosquito coils. It is, however, highly toxic to fish and bees. Insects exposed to it become paralyzed (a nervous system effect) before dying. It is also used as an ultra-low-volume spray for outdoor mosquito control. Therefore, there is a need for controlled release of allethrin into the environment. For controlled release of allethrin from the sol-gel matrix, we utilized temperature-based controlled evaporation through the porous sol-gel. Different types of fabric, such as cotton, Terri-cotton, polyester, surgical cap and knee-cap material, were studied, and the one with the maximum absorption capacity was selected to hold the largest quantity of the sol-gel matrix. For sol-gel coating, 2 cm × 2 cm cloth pieces were dipped in the sol-gel solution for 10 minutes, and from the weight differences we concluded that Terri-cotton is best suited for our project. An electronic circuit with a heating plate was developed to test the controlled release of the compound; an oscillatory circuit is used to produce the required heat.

Keywords: sol-gel, allethrin, TEOS, biochemistry

Procedia PDF Downloads 378
2458 Reduction of Plutonium Production in Heavy Water Research Reactor: A Feasibility Study through Neutronic Analysis Using MCNPX2.6 and CINDER90 Codes

Authors: H. Shamoradifar, B. Teimuri, P. Parvaresh, S. Mohammadi

Abstract:

One of the main characteristics of heavy-water-moderated reactors is their high production of plutonium. This article demonstrates the possibility of reducing plutonium and other actinides in a Heavy Water Research Reactor. Among the many ways of reducing plutonium production in a heavy water reactor, this research focused on changing the fuel from natural uranium to mixed thorium-uranium fuel. The main fissile nucleus in thorium-uranium fuels is U-233, which is produced after neutron absorption by Th-232, so thorium-uranium fuels have some known advantages over uranium fuels. Accordingly, four thorium-uranium fuels with different composition ratios were chosen for our simulations: a) 10% UO2-90% ThO2 (enrichment 20%); b) 15% UO2-85% ThO2 (enrichment 10%); c) 30% UO2-70% ThO2 (enrichment 5%); d) 35% UO2-65% ThO2 (enrichment 3.7%). Natural uranium oxide (UO2) is considered the reference fuel; in other words, all calculated data are compared with the corresponding data for uranium fuel. Neutronic parameters were calculated and used as comparison parameters. All calculations were performed with a Monte Carlo (MCNPX2.6) steady-state reaction rate calculation linked to a deterministic depletion calculation (CINDER90). The computational data showed that thorium-uranium fuels with the four fissile composition ratios can satisfy the safety and operating requirements of a Heavy Water Research Reactor. Furthermore, thorium-uranium fuels have very good proliferation resistance and consume less fissile material than uranium fuels for the same reactor operation time. Using mixed thorium-uranium fuels reduced the long-lived α-emitting, highly radiotoxic wastes and the radiotoxicity level of the spent fuel.
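The U-233 breeding chain mentioned above, Th-232(n,γ) → Th-233 → Pa-233 → U-233, is rate-limited by the roughly 27-day half-life of Pa-233. A minimal sketch of the U-233 inventory under a constant, arbitrary production rate (Th-233's ~22-minute half-life is neglected as effectively instantaneous):

```python
import math

# Pa-233 bottleneck model: dPa/dt = R - lambda*Pa, U-233 = integral of lambda*Pa.
LAMBDA_PA = math.log(2) / 27.0   # Pa-233 decay constant, 1/day
R = 1.0                          # constant Pa-233 production rate (arbitrary units/day)

def u233_inventory(t_days):
    """U-233 produced by day t under constant production."""
    pa = R / LAMBDA_PA * (1 - math.exp(-LAMBDA_PA * t_days))  # Pa-233 still in transit
    return R * t_days - pa                                    # the rest has become U-233

for t in (27, 100, 365):
    print(f"day {t:>3}: U-233 = {u233_inventory(t):.1f} (arb. units)")
```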

Keywords: heavy water reactor, burnup, minor actinides, neutronic calculation

Procedia PDF Downloads 246
2457 Influence of Glass Plates Different Boundary Conditions on Human Impact Resistance

Authors: Alberto Sanchidrián, José A. Parra, Jesús Alonso, Julián Pecharromán, Antonia Pacios, Consuelo Huerta

Abstract:

Glass is a commonly used building material; there is no unique design solution, as plates with different numbers of layers and interlayers may be used. In most façades, security glazing has to be used according to its performance in the impact pendulum test. The European standard EN 12600 establishes an impact test procedure for the classification, from the point of view of human safety, of flat plates of different thickness, using a pendulum with two tires and a 50 kg mass that impacts the plate from different heights. However, this test does not replicate the actual dimensions and boundary conditions used in building configurations, so the real stress distribution is not determined by it. The influence of different boundary conditions, such as those employed on construction sites, is not well taken into account when testing the behaviour of safety glazing, and there is no detailed procedure or criteria to determine glass resistance against human impact. To reproduce the actual boundary conditions on site, when needed, the pendulum test is arranged to be performed 'in situ', with no control of load or stiffness and without a standard procedure. The fracture stress of small and large glass plates fits a Weibull distribution with quite a large dispersion, so conservative values are adopted for the admissible fracture stress under static loads. In fact, tests performed for human impact give a fracture strength two or three times higher, often without total fracture of the glass plate. Newer standards, for example DIN 18008-4, allow an admissible fracture stress 2.5 times higher than the one used for static and wind loads. Two working areas are now open: a) defining a standard for the 'in situ' test; b) preparing a laboratory procedure that allows testing with a more realistic stress distribution.
To work on both research lines, a laboratory that allows testing of medium-size specimens with different boundary conditions has been developed. A special steel frame allows reproducing the stiffness of the glass support substructure, including a rigid condition used as reference. The dynamic behaviour of the glass plate and its support substructure has been characterized with finite element models updated with modal test results. In addition, a new portable impact machine is being used to obtain sufficient force and direction control during the impact test; an impact energy of 100 J is used. To avoid problems with broken glass plates, the tests have been done using an aluminium plate of 1000 mm × 700 mm and 10 mm thickness supported on four sides; three different substructure stiffness conditions are used. Detailed control of the dynamic stiffness and the behaviour of the plate is achieved with modal tests. The repeatability of the test and the reproducibility of the results prove that a procedure to control both the stiffness of the plate and the impact level is necessary.
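The Weibull description of glass fracture stress mentioned above leads directly to an admissible stress for a chosen failure probability: P_f = 1 − exp(−(σ/σ0)^m), hence σ(P_f) = σ0·(−ln(1−P_f))^(1/m). The modulus and scale values below are illustrative assumptions, not the paper's data.

```python
import math

# Weibull fracture model for glass: P_f = 1 - exp(-(s/s0)^m).
m, s0 = 6.0, 70.0   # assumed Weibull modulus (-) and scale stress (MPa)

def design_stress(p_fail):
    """Stress at which the failure probability equals p_fail."""
    return s0 * (-math.log(1.0 - p_fail)) ** (1.0 / m)

for p in (0.01, 0.05, 0.50):
    print(f"P_f = {p:>4.0%}: stress = {design_stress(p):5.1f} MPa")
```

The large spread between low-probability and median fracture stress is exactly why static design values are conservative relative to the strengths observed in impact tests.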

Keywords: glass plates, human impact test, modal test, plate boundary conditions

Procedia PDF Downloads 308
2456 Electrochemical Growth and Properties of Cu2O Nanostructures

Authors: A. Azizi, S. Laidoudi, G. Schmerber, A. Dinia

Abstract:

Cuprous oxide (Cu2O) is a well-known oxide semiconductor with a band gap of 2.1 eV and natural p-type conductivity, making it an attractive material for device applications because of its abundant availability, non-toxicity, and low production cost. It has a high absorption coefficient in the visible region, and its minority carrier diffusion length is suitable for use as a solar cell absorber layer; it has been explored in junctions with n-type ZnO for photovoltaic applications. Cu2O nanostructures have been made by a variety of techniques; among them, electrodeposition has emerged as one of the most promising processing routes, as it offers advantages such as low cost, low temperature, and a high level of purity in the products. In this work, Cu2O nanostructures prepared by electrodeposition from an aqueous cupric sulfate solution with citric acid at 65°C onto fluorine-doped tin oxide (FTO) coated glass substrates were investigated. The effects of the deposition potential on the electrochemical behaviour, surface morphology, structural and optical properties of the Cu2O thin films were examined. From cyclic voltammetry experiments, the potential interval in which the electrodeposition of Cu2O takes place was established. Mott–Schottky (M-S) plots demonstrate that all the films are p-type semiconductors, and the flat-band potential and acceptor density of the Cu2O thin films were determined. AFM images reveal that the applied potential has a very significant influence on the surface morphology and crystallite size of the Cu2O films. XRD measurements indicated that all the films display the Cu2O cubic structure with a strong preferential orientation along the (111) direction. The optical transmission spectra in the UV-visible domain revealed a highest transmission of 75%, and the calculated gap values increased from 1.93 to 2.24 eV with increasing potential.
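The acceptor density and flat-band potential quoted above come from the slope and intercept of the Mott–Schottky plot. The sketch below fits synthetic 1/C² vs. V data for a p-type film and recovers the assumed parameters; the permittivity, electrode area, NA and Vfb values are illustrative, not the paper's measurements.

```python
import numpy as np

# Mott-Schottky relation for a p-type semiconductor:
#   1/C^2 = (2 / (e*eps*eps0*NA*A^2)) * (Vfb - V - kT/e)
# All numbers below (NA, Vfb, eps, area) are illustrative assumptions.
e, eps0, kT_e = 1.602e-19, 8.854e-12, 0.0257
eps, area = 7.6, 1e-4                   # relative permittivity, electrode area (m^2)
NA_true = 2e18 * 1e6                    # acceptor density: 2e18 cm^-3 in m^-3
Vfb_true = 0.35                         # flat-band potential (V)

V = np.linspace(-0.2, 0.2, 20)
inv_C2 = (2 / (e * eps * eps0 * NA_true * area**2)) * (Vfb_true - V - kT_e)

slope, intercept = np.polyfit(V, inv_C2, 1)      # negative slope => p-type
NA_fit = -2 / (e * eps * eps0 * area**2 * slope)
Vfb_fit = -intercept / slope + kT_e
print(f"NA = {NA_fit:.2e} m^-3, Vfb = {Vfb_fit:.3f} V")
```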

Keywords: Cu2O, electrodeposition, Mott–Schottky plot, nanostructure, optical properties, XRD

Procedia PDF Downloads 356
2455 Protective Impact of Some Natural Extracts Against Acute Hepatotoxicity in Wistar Rats: DNA Protecting, Antioxidant and Anti-Inflammatory Effects

Authors: Yara Mohamed Taha, Mohamed Ali El Desouky, Heba Kamal Abdel Hakim, Maha Hanafy Mahmoud

Abstract:

Hepatotoxicity due to drugs and toxic chemicals constitutes a crucial health problem nowadays. Medicinal plants are widely used for protecting against many liver disorders and inflammatory conditions. This study aims to evaluate the hepatoprotective impact of green tea extract (GTE), rosemary extract (RE), and rosmarinic acid (RA) against the hepatotoxins ferric nitrilotriacetate (Fe-NTA) and diethylnitrosamine (DEN) in rats. Five groups of male Wistar rats were included: one negative control, while the other groups were treated intraperitoneally with DEN at 160 mg.kg-1 b.w. on the 15th day and Fe-NTA at 5 mg.kg-1 b.w. on the 33rd day. One of these served as the positive control. The other three groups were pre-administered daily protective oral doses of either 200 mg.kg-1 b.w. of RE, 1 g.kg-1 b.w. of GTE, or 50 mg.kg-1 b.w. of RA, starting two weeks prior to DEN exposure and continued until the end of the experimental period. The data revealed a highly significant increase in MDA, 8-OHdG, and DNA damage percent, a significant depletion of GSH, and elevated Gr-1 protein expression in hepatocytes, together with histopathological changes in the liver tissue of rats exposed to DEN+Fe-NTA. Pre-administration of protective doses of RE, GTE, and RA to DEN+Fe-NTA-treated rats normalized the altered biochemical, histopathological, and immunohistochemical parameters. In conclusion, RE, GTE, and RA showed a hepatoprotective effect against liver toxicity induced by DEN+Fe-NTA, with RA and GTE exhibiting the best antioxidant and anti-inflammatory impact. The study therefore suggests that rosemary, green tea, and products enriched with rosmarinic acid should be included in the daily diet of people exposed to chemicals and environmental toxins to protect against hepatotoxicity.

Keywords: hepatotoxicity, diethylnitrosamine and ferric nitrilotriacetate, rosemary extract (RE), green tea extract (GTE), rosmarinic acid (RA)

Procedia PDF Downloads 93
2454 From By-product To Brilliance: Transforming Adobe Brick Construction Using Meat Industry Waste-derived Glycoproteins

Authors: Amal Balila, Maria Vahdati

Abstract:

Earth is a green building material with very low embodied energy and almost zero greenhouse gas emissions. However, it lacks strength and durability in its natural state; with responsibly sourced stabilisers, its strength can be enhanced. This research draws inspiration from the robustness of termite mounds, where termites incorporate glycoproteins from their saliva during construction. Biomimicry explores the potential of these termite stabilisers in producing bio-inspired adobe bricks. The meat industry generates significant waste during slaughter, including blood, skin, bones, tendons, gastrointestinal contents, and internal organs. While abundant, many meat by-products raise concerns regarding human consumption on religious, cultural, and ethical grounds, and they contribute heavily to environmental pollution. Extracting and utilising proteins from this waste is vital for reducing pollution and increasing profitability. Exploring this untapped potential, this research investigates how glycoproteins derived from meat industry waste could transform adobe brick construction. Bovine serum albumin (BSA) from cows' blood and mucin from porcine stomachs were the glycoproteins chosen as stabilisers for adobe brick production. Despite their wide usage across various fields, both have very limited utilisation in food processing and were therefore identified as potential stabilisers in this study. Two soil types were used to prepare adobe bricks for testing, comparing unstabilised control bricks with glycoprotein-stabilised ones. All bricks underwent testing for unconfined compressive strength and erosion resistance. The primary finding of this study is the efficacy of BSA, a glycoprotein derived from cows' blood and a by-product of the beef industry, as an earth construction stabiliser.
Adding 0.5% by weight of BSA resulted in a 17% and 41% increase in the unconfined compressive strength for British and Sudanese adobe bricks, respectively. Further, adding 5% by weight of BSA led to a 202% and 97% increase in the unconfined compressive strength for British and Sudanese adobe bricks, respectively. Moreover, using 0.1%, 0.2%, and 0.5% by weight of BSA resulted in erosion rate reductions of 30%, 48%, and 70% for British adobe bricks, respectively, with a 97% reduction observed for Sudanese adobe bricks at 0.5% by weight of BSA. However, mucin from the porcine stomach did not significantly improve the unconfined compressive strength of adobe bricks. Nevertheless, employing 0.1% and 0.2% by weight of mucin resulted in erosion rate reductions of 28% and 55% for British adobe bricks, respectively. These findings underscore BSA's efficiency as an earth construction stabiliser for wall construction and mucin's efficacy for wall render, showcasing their potential for sustainable and durable building practices.
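The percentage gains above translate into absolute strengths only once a baseline is fixed. A trivial sketch, with assumed (not reported) baseline strengths, shows how the reported increases would scale an unstabilised brick's unconfined compressive strength:

```python
def stabilised_strength(baseline_mpa, percent_increase):
    """Unconfined compressive strength after stabilisation, given the
    reported percentage increase over the unstabilised control."""
    return baseline_mpa * (1.0 + percent_increase / 100.0)

# Illustrative baselines in MPa (hypothetical values, not from the study)
british_baseline, sudanese_baseline = 2.0, 1.5

# Reported gains at 5% BSA by weight: +202% (British), +97% (Sudanese)
print(f"British:  {stabilised_strength(british_baseline, 202):.2f} MPa")
print(f"Sudanese: {stabilised_strength(sudanese_baseline, 97):.2f} MPa")
```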

Keywords: biomimicry, earth construction, industrial waste management, sustainable building materials, termite mounds

Procedia PDF Downloads 53
2453 Efficacy Testing of a Product in Reducing Facial Hyperpigmentation and Photoaging after a 12-Week Use

Authors: Nalini Kaul, Barrie Drewitt, Elsie Kohoot

Abstract:

Hyperpigmentation is the third most common pigmentary disorder for which dermatologic treatment is sought. It affects all ages, resulting in skin darkening due to melanin accumulation. An uneven skin tone caused by sun exposure (solar lentigos/age spots/sun spots), by skin disruption following acne or rashes (post-inflammatory hyperpigmentation, PIH), or by hormonal changes (melasma) can lead to significant psychosocial impairment. Dyschromia results from various alterations in the biochemical processes regulating melanogenesis. Treatments include the daily use of sunscreen together with lightening, brightening, and exfoliating products. Depigmentation is achieved by various depigmenting agents: common examples are hydroquinone, arbutin, azelaic acid, aloesin, mulberry, licorice extracts, kojic acid, niacinamide, ellagic acid, green tea, turmeric, soy, ascorbic acid, and tranexamic acid. These agents affect pigmentation by interfering with mechanisms before, during, and after melanin synthesis. While immediate correction is much sought after, patience and diligence are key. Our objective was to assess the effects of a facial product with pigmentation treatment and UV protection in 35 healthy female subjects (35-65 y) meeting the study criteria. Subjects with mild to moderate hyperpigmentation and fine lines, with no use of skin-lightening products in the six months or dermatological procedures in the twelve months before the study started, were included. Efficacy parameters included expert clinical grading for hyperpigmentation, radiance, skin tone and smoothness, fine lines, and wrinkles; bioinstrumentation (Corneometer®, Colorimeter®); digital photography and imaging (Visia-CR®); and self-assessment questionnaires. Safety included grading for erythema, edema, dryness, and peeling, and self-assessments for itching, stinging, tingling, and burning.
Our results showed statistically significant improvement in clinical grading scores, bioinstrumentation, and digital photos for hyperpigmentation-brown spots, fine lines/wrinkles, skin tone, radiance, pores, skin smoothness, and overall appearance compared to baseline. The product was also well-tolerated and liked by subjects. Conclusion: Facial hyperpigmentation is of great concern, and treatment strategies are increasingly sought. Clinical trials with both subjective and objective assessments, imaging analyses, and self-perception are essential to distinguish evidence-based products. The multifunctional cosmetic product tested in this clinical study showed efficacy, tolerability, and subject satisfaction in reducing hyperpigmentation and global photoaging.
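Where the abstract reports statistically significant improvement in grading scores compared to baseline, the underlying comparison is typically a paired test on before/after grades. A minimal sketch with made-up scores (not the study's data), using only the standard library:

```python
import math
from statistics import mean, stdev

def paired_t(before, after):
    """Paired t-statistic for before/after scores (e.g. expert
    hyperpigmentation grades at baseline vs. week 12). With lower scores
    meaning better skin, differences are taken as before - after, so a
    large positive t indicates improvement. Returns (t, degrees of freedom)."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n)), n - 1

# Hypothetical 0-9 grading scores for 8 subjects (illustrative only)
baseline = [6, 5, 7, 6, 5, 6, 7, 5]
week12   = [4, 4, 5, 5, 4, 4, 5, 4]
t, df = paired_t(baseline, week12)
print(f"t = {t:.2f} on {df} df")
```

The resulting t-statistic would then be compared against the t-distribution with n-1 degrees of freedom to obtain a p-value.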

Keywords: hyperpigmentation; photoaging, clinical testing, expert visual evaluations, bio-instruments

Procedia PDF Downloads 79
2452 An Automated Magnetic Dispersive Solid-Phase Extraction Method for Detection of Cocaine in Human Urine

Authors: Feiyu Yang, Chunfang Ni, Rong Wang, Yun Zou, Wenbin Liu, Chenggong Zhang, Fenjin Sun, Chun Wang

Abstract:

Cocaine is the most frequently used illegal drug globally, with an annual prevalence of cocaine use of 0.3% to 0.4% of the adult population aged 15-64 years. The growing consumption of cocaine and the associated drug crimes are a great concern; urine testing has therefore become an important noninvasive sampling method, as cocaine and its metabolites (COCs) are usually present at high concentrations and have relatively long detection windows. However, direct analysis of urine samples is not feasible because the complex urine matrix often causes low sensitivity and selectivity in the determination. Moreover, the presence of low doses of analytes in urine makes an extraction and pretreatment step important before determination; especially in cases of group drug-taking, the pretreatment step becomes tedious and time-consuming. Developing a sensitive, rapid, and high-throughput method for detecting COCs in the human body is therefore indispensable for law enforcement officers, treatment specialists, and health officials. In this work, a new automated magnetic dispersive solid-phase extraction (MDSPE) sampling method followed by high performance liquid chromatography-mass spectrometry (HPLC-MS) was developed for quantitative enrichment of COCs from human urine, using prepared magnetic nanoparticles as adsorbents. The nanoparticles were prepared by silanizing magnetic Fe3O4 nanoparticles and modifying them with divinylbenzene and vinylpyrrolidone, which gives them the ability to specifically adsorb COCs. This kind of magnetic particle facilitated the pretreatment steps through electromagnetically controlled extraction, achieving full automation. The proposed device significantly improved sample preparation efficiency, processing 32 samples in one batch within 40 min.
Optimization of the preparation procedure for the magnetic nanoparticles was explored, and the performance of the nanoparticles was characterized by scanning electron microscopy, vibrating sample magnetometry, and infrared spectroscopy. Several analytical parameters were studied, including the amount of particles, adsorption time, elution solvent, and extraction and desorption kinetics, and the proposed method was validated. The limits of detection for cocaine and its metabolites were 0.09-1.1 ng·mL-1, with recoveries ranging from 75.1% to 105.7%. Compared to traditional sampling methods, this method is time-saving and environmentally friendly. It was confirmed that the proposed automated method is a highly effective approach for trace analysis of cocaine and its metabolites in human urine.
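Limits of detection like those quoted above are conventionally estimated from a calibration curve. A sketch using the common 3.3·σ/slope convention (ICH Q2) with fabricated spiked-urine calibration points; the numbers are illustrative, not the study's data:

```python
import numpy as np

def lod_from_calibration(conc, signal):
    """Limit of detection as 3.3 * (residual std dev) / slope, a common
    calibration-curve convention (ICH Q2). Units follow `conc`."""
    slope, intercept = np.polyfit(conc, signal, 1)
    residuals = signal - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)   # ddof=2: two fitted parameters
    return 3.3 * sigma / slope

# Hypothetical calibration points: concentration (ng/mL) vs. peak area
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
area = np.array([52.0, 99.0, 204.0, 498.0, 1003.0, 1998.0])
print(f"LOD ~ {lod_from_calibration(conc, area):.2f} ng/mL")
```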

Keywords: automatic magnetic dispersive solid-phase extraction, cocaine detection, magnetic nanoparticles, urine sample testing

Procedia PDF Downloads 204
2451 MyAds: A Social Adaptive System for Online Advertisement from Hypotheses to Implementation

Authors: Dana A. Al Qudah, Alexandra I. Critea, Rizik M. H. Al Sayyed, Amer Obeidah

Abstract:

Online advertisement is one of the major sources of income for many companies; it plays a role in the overall business flow and directly affects consumer behavior. Unfortunately, most users tend to block or ignore ads. MyAds is a social adaptive hypermedia system for online advertising whose main goal is to explore how to make online ads more acceptable. To achieve this goal, various technologies and techniques are used. This paper presents a theoretical framework as well as the system architecture for MyAds, designed based on a set of hypotheses and an exploratory study. The system was then implemented, and a pilot experiment was conducted to validate it. The main outcomes suggest that the system provided personalized ads for users, and the main implication is that the system can be used for further testing and validation.

Keywords: adaptive hypermedia, e-advertisement, social, hypotheses, exploratory study, framework

Procedia PDF Downloads 413
2450 Nanopharmaceutical: A Comprehensive Appearance of Drug Delivery System

Authors: Mahsa Fathollahzadeh

Abstract:

The various nanoparticles employed in drug delivery applications include micelles, liposomes, solid lipid nanoparticles, polymeric nanoparticles, functionalized nanoparticles, nanocrystals, cyclodextrins, dendrimers, and nanotubes. Micelles, composed of amphiphilic block copolymers, can encapsulate hydrophobic molecules, allowing for targeted delivery. Liposomes, vesicular structures made up of phospholipids, can encapsulate both hydrophobic and hydrophilic molecules, providing a flexible platform for delivering therapeutic agents. Solid lipid nanoparticles (SLNs) and nanostructured lipid carriers (NLCs) are designed to improve the stability and bioavailability of lipophilic drugs. Polymeric nanoparticles, such as poly(lactic-co-glycolic acid) (PLGA), are biodegradable and can be engineered to release drugs in a controlled manner. Functionalized nanoparticles, coated with targeting ligands or antibodies, can specifically target diseased cells or tissues. Nanocrystals, engineered to have specific surface properties, can enhance the solubility and bioavailability of poorly soluble drugs. Cyclodextrins, doughnut-shaped molecules with hydrophobic cavities, can be complex with hydrophobic molecules, allowing for improved solubility and bioavailability. Dendrimers, branched polymers with a central core, can be designed to deliver multiple therapeutic agents simultaneously. Nanotubes and metallic nanoparticles, such as gold nanoparticles, offer real-time tracking capabilities and can be used to detect biomolecular interactions. The use of these nanoparticles has revolutionized the field of drug delivery, enabling targeted and controlled release of therapeutic agents, reduced toxicity, and improved patient outcomes.

Keywords: nanotechnology, nanopharmaceuticals, drug-delivery, proteins, ligands, nanoparticles, chemistry

Procedia PDF Downloads 55
2449 The Impact of Bitcoin on Stock Market Performance

Authors: Oliver Takawira, Thembi Hope

Abstract:

This study will analyse the relationship between Bitcoin price movements and the Johannesburg Stock Exchange (JSE). The aim is to determine whether Bitcoin price movements affect stock market performance. As cryptocurrencies continue to gain prominence as a safe asset during periods of economic distress, the question arises of whether Bitcoin's prosperity could affect investment in the stock market. To identify the existence of a short-run and long-run linear relationship, the study will apply the Autoregressive Distributed Lag (ARDL) bounds test and a Vector Error Correction Model (VECM) after testing the data for unit roots and cointegration using the Augmented Dickey-Fuller (ADF) and Phillips-Perron (PP) tests. The Non-Linear Autoregressive Distributed Lag (NARDL) model will then be used to check for a non-linear relationship between Bitcoin prices and stock market prices.
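The unit-root step described above can be sketched with a bare-bones Dickey-Fuller regression (no augmentation lags, unlike the full ADF test the study would use). The series below is simulated, and the -2.86 threshold is the usual 5% critical value for a regression with a constant:

```python
import numpy as np

def dickey_fuller_t(y):
    """t-statistic of the simplest Dickey-Fuller regression,
    dy_t = alpha + beta * y_{t-1} + e_t (no augmentation lags).
    A value well below ~ -2.86 (5% critical value, constant included)
    argues against a unit root, i.e. for stationarity."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)   # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)    # OLS covariance matrix
    return beta[1] / np.sqrt(cov[1, 1])

# Simulated stationary AR(1) series, phi = 0.5 (a stand-in for a return series)
rng = np.random.default_rng(42)
e = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.5 * y[t - 1] + e[t]
print(f"DF t-statistic: {dickey_fuller_t(y):.2f}")  # strongly negative -> stationary
```

In practice one would use a library implementation with lag selection (e.g. an augmented test) rather than this minimal version.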

Keywords: bitcoin, stock market, interest rates, ARDL

Procedia PDF Downloads 107
2448 Failure Mode Analysis of a Multiple Layer Explosion Bonded Cryogenic Transition Joint

Authors: Richard Colwell, Thomas Englert

Abstract:

In cryogenic liquefaction processes, brazed aluminum core heat exchangers are used to minimize the surface area/volume ratio of the exchanger. Aluminum alloy (5083-H321; UNS A95083) piping must transition to higher-melting-point 304L stainless steel piping outside of the heat exchanger kettle or cold box for safety reasons. Since aluminum alloys and austenitic stainless steel cannot be directly welded together, a transition joint consisting of five explosively bonded layers of different metals is used. Failures of two of these joints resulted in process shut-down and loss of revenue. Failure analyses, FEA analysis, and mock-up testing were performed by multiple teams to gain further understanding of the failure mechanisms involved.

Keywords: explosion bonding, intermetallic compound, thermal strain, titanium-nickel Interface

Procedia PDF Downloads 219
2447 Integrated Design of Froth Flotation Process in Sludge Oil Recovery Using Cavitation Nanobubbles for Increase the Efficiency and High Viscose Compatibility

Authors: Yolla Miranda, Marini Altyra, Karina Kalmapuspita Imas

Abstract:

Oily sludge wastes accumulate throughout upstream and downstream petroleum industry processes. The sludge still contains oil that can be used for energy. Recycling sludge is a way of handling it that reduces its toxicity, and recovery of the remaining oil, around 20% of the sludge volume, is very probable. Froth flotation is a common chemical-unit method for separating fine solid particles from an aqueous suspension. Its basic principle is the capture of oil droplets or small solids by air bubbles in an aqueous slurry, followed by their levitation and collection in a froth layer. The method is known for its modest energy requirement and ease of application, but low efficiency and the inability to treat high-viscosity feeds are the biggest problems of a froth flotation unit. This study presents a design that first manages the high viscosity of the sludge and then feeds the froth flotation unit, which includes a cavitation tube to reduce the bubbles to the nano scale. Recovery in flotation starts with the collision and adhesion of hydrophobic particles to air bubbles, followed by transport of the hydrophobic particle-bubble aggregate from the collection zone to the froth zone, drainage and enrichment of the froth, and finally its overflow removal from the cell top. Effective particle separation by froth flotation relies on the efficient capture of hydrophobic particles by air bubbles in three steps, of which the critical one is collision: decreasing the bubble size increases the collision probability, making the process more efficient. The pre-treatment, froth flotation, and cavitation tube are integrated with each other, and the design shows the integrated unit and its process.
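The claim that smaller bubbles improve collision can be illustrated with Sutherland's classical estimate of bubble-particle collision efficiency, Ec ≈ 3·d_p/d_b (valid for particles much smaller than bubbles). The droplet and bubble sizes below are assumed for illustration, not taken from the paper:

```python
def sutherland_collision_efficiency(d_particle_um, d_bubble_um):
    """Sutherland's classical estimate of bubble-particle collision
    efficiency, Ec ~ 3 * d_p / d_b (valid for d_p << d_b). Shrinking
    bubbles from the macro toward the micro/nano scale raises the
    capture probability of fine oil droplets."""
    return 3.0 * d_particle_um / d_bubble_um

d_oil = 10.0  # oil droplet diameter in micrometres (assumed)
for d_b in (1000.0, 100.0, 50.0):  # conventional vs. cavitation-generated bubbles
    ec = sutherland_collision_efficiency(d_oil, d_b)
    print(f"bubble {d_b:6.0f} um -> Ec = {ec:.2f}")
```

A 20-fold reduction in bubble diameter thus raises the collision efficiency 20-fold under this approximation, which is the rationale for the cavitation tube in the integrated design.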

Keywords: sludge oil recovery, froth flotation, cavitation tube, nanobubbles, high viscosity

Procedia PDF Downloads 381
2446 Experimental and Numerical Analysis on Enhancing Mechanical Properties of CFRP Adhesive Joints Using Hybrid Nanofillers

Authors: Qiong Rao, Xiongqi Peng

Abstract:

In this work, multi-walled carbon nanotubes (MWCNTs) and graphene nanoplatelets (GNPs) were dispersed into an epoxy adhesive to investigate their synergistic effects on the shear properties and the mode I and mode II fracture toughness of unidirectional composite bonded joints. Test results showed that the incorporation of MWCNTs and GNPs significantly improved the shear strength, mode I fracture toughness, and mode II fracture toughness by 36.6%, 45%, and 286%, respectively. In addition, the fracture surfaces of the bonding area and the toughening mechanism of the nanofillers were analyzed. Finally, a nonlinear cohesive/friction coupled model for delamination analysis of the adhesive layer under shear and normal compression loadings was proposed and implemented in ABAQUS/Explicit via the user subroutine VUMAT.
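Cohesive zone models of the kind mentioned here are commonly built on a bilinear traction-separation law: linear elastic loading up to damage onset, then linear softening to complete failure. A plain-Python sketch of that standard law (not the authors' coupled model), with hypothetical onset/failure separations and penalty stiffness:

```python
def bilinear_traction(delta, delta0, deltaf, K):
    """Bilinear cohesive traction-separation law: linear elastic up to the
    damage-onset separation delta0, then linear softening to complete
    failure at deltaf. K is the initial penalty stiffness.
    Returns (traction, damage variable d in [0, 1])."""
    if delta <= delta0:
        return K * delta, 0.0          # undamaged elastic branch
    if delta >= deltaf:
        return 0.0, 1.0                # fully failed, no load transfer
    # standard linear-softening damage evolution
    d = (deltaf * (delta - delta0)) / (delta * (deltaf - delta0))
    return (1.0 - d) * K * delta, d

# Hypothetical parameters: onset at 0.01 mm, failure at 0.1 mm, K = 1000 N/mm^3
for delta in (0.005, 0.01, 0.055, 0.2):
    t, d = bilinear_traction(delta, 0.01, 0.1, 1000.0)
    print(f"delta = {delta:5.3f} -> traction = {t:5.2f}, damage = {d:.3f}")
```

The peak traction occurs at delta0 (here 10 units), and traction decays linearly to zero at deltaf, which is the shape a VUMAT implementation of such a law reproduces pointwise at each material point.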

Keywords: nanofillers, adhesive joints, fracture toughness, cohesive zone model

Procedia PDF Downloads 133
2445 A Detailed Computational Investigation into Copper Catalyzed Sonogashira Coupling Reaction

Authors: C. Rajalakshmi, Vibin Ipe Thomas

Abstract:

Sonogashira coupling reactions are widely employed in the synthesis of molecules of biological and pharmaceutical importance. Copper-catalyzed Sonogashira coupling reactions are gaining importance owing to the low cost and lower toxicity of copper compared to palladium catalysts. In the present work, a detailed computational study has been carried out on the Sonogashira coupling reaction between aryl halides and terminal alkynes catalyzed by a Copper(I) species with trans-1,2-diaminocyclohexane as ligand. All calculations are performed at the Density Functional Theory (DFT) level, using the hybrid Becke3LYP (B3LYP) functional. Cu and I atoms are described using an effective core potential (LANL2DZ) for the inner electrons and its associated double-ζ basis set for the outer electrons; for all other atoms, the 6-311+G* basis set is used. We have identified that the active catalytic species is a neutral three-coordinate trans-1,2-diaminocyclohexane-ligated Cu(I) alkyne complex, and found that oxidative addition and reductive elimination occur in a single step proceeding through one transition state, owing to the ease of reductive elimination involving coupling of Csp2-Csp carbon atoms and the low stability of the Cu(III) intermediate. This shows that the mechanism of copper-catalyzed Sonogashira coupling is quite different from that of palladium-catalyzed coupling. To gain further insight into the mechanism, substrates containing various functional groups are considered in our study to assess their effect on the feasibility of the reaction. We have also explored the effect of the ligand on the catalytic cycle of the coupling reaction. The theoretical results obtained are in good agreement with experimental observations, showing the relevance of a combined theoretical and experimental approach to rationally improving cross-coupling reactions.
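Mechanistic DFT studies like this one compare stationary points by energy differences, with total energies in hartree converted to kcal/mol. Illustrative arithmetic only; the energies below are hypothetical, not values from this study:

```python
HARTREE_TO_KCAL = 627.509  # kcal/mol per hartree

def barrier_kcal(e_ts_hartree, e_reactant_hartree):
    """Activation barrier from DFT total energies (hartree), converted
    to kcal/mol, as transition-state energies from a B3LYP calculation
    are typically reported."""
    return (e_ts_hartree - e_reactant_hartree) * HARTREE_TO_KCAL

# Hypothetical total energies for a reactant complex and its transition state
e_reactant = -1923.4512
e_ts = -1923.4180
print(f"barrier = {barrier_kcal(e_ts, e_reactant):.1f} kcal/mol")
```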

Keywords: copper catalysed, density functional theory, reaction mechanism, Sonogashira coupling

Procedia PDF Downloads 118
2444 Theoretical and Experimental Bending Properties of Composite Pipes

Authors: Maja Stefanovska, Svetlana Risteska, Blagoja Samakoski, Gari Maneski, Biljana Kostadinoska

Abstract:

The aim of this work is to determine the theoretical and experimental properties of filament-wound glass fiber/epoxy resin composite pipes with different winding designs subjected to bending. To determine the bending strength of the composite samples, three-point bending tests were conducted according to the ASTM D790 standard. Good correlation between theoretical and experimental results was obtained, with sample No. 4 showing the highest bending strength. All samples demonstrated matrix cracking and fiber failure followed by layer delamination during testing. It was also found that smaller winding angles lead to an increase in bending stress. SEM analysis confirmed good bonding between the glass fibers and the epoxy resin.
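The ASTM D790 three-point bending tests mentioned above evaluate flexural stress from the standard's formula, sigma = 3FL/(2bd²). A sketch with illustrative specimen dimensions (the values are assumed, not taken from the paper):

```python
def flexural_stress(force_n, span_mm, width_mm, depth_mm):
    """Three-point bending (flexural) stress per ASTM D790:
    sigma = 3FL / (2 b d^2). Returns MPa when force is in N
    and all dimensions are in mm."""
    return 3.0 * force_n * span_mm / (2.0 * width_mm * depth_mm**2)

# Illustrative specimen: 800 N peak load, 100 mm support span,
# 15 mm width, 4 mm depth (hypothetical values)
sigma = flexural_stress(force_n=800.0, span_mm=100.0, width_mm=15.0, depth_mm=4.0)
print(f"flexural stress = {sigma:.1f} MPa")
```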

Keywords: bending properties, composite pipe, winding design, SEM

Procedia PDF Downloads 329
2443 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies - it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution's detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate the inaccuracies, weaknesses, and biases of any one individual model. Over time, the framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California, using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework's predictions and real-life observations, with an overall model accuracy of 92%. The combined model predicts more accurately than any of the individual models, and it reliably forecasts season-based variations in air quality levels. The top air quality predictor variables were identified through the measurement of mean decrease in accuracy.
This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
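The averaging of the three top-performing models described in the abstract amounts to soft voting: average the per-class probabilities across models, then take the argmax. A minimal sketch with fabricated per-model probabilities (the data and class labels are hypothetical):

```python
import numpy as np

def ensemble_predict(probas):
    """Soft-voting ensemble: average the class-probability matrices of
    several models and take the argmax per sample. Each matrix has shape
    (n_samples, n_classes)."""
    return np.mean(probas, axis=0).argmax(axis=1)

# Hypothetical probabilities for 4 samples over classes {0: good, 1: bad} air
model_a = np.array([[0.9, 0.1], [0.4, 0.6], [0.2, 0.8], [0.7, 0.3]])
model_b = np.array([[0.8, 0.2], [0.6, 0.4], [0.3, 0.7], [0.6, 0.4]])
model_c = np.array([[0.7, 0.3], [0.3, 0.7], [0.1, 0.9], [0.8, 0.2]])
preds = ensemble_predict([model_a, model_b, model_c])
print(preds)  # per-sample predicted class index
```

Averaging probabilities (rather than hard labels) lets a confident model outvote two marginal ones, which is one reason the combined model can beat each individual model.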

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 130