Search results for: mutation testing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3173

2123 Development and Total Error Concept Validation of Common Analytical Method for Quantification of All Residual Solvents Present in Amino Acids by Gas Chromatography-Head Space

Authors: A. Ramachandra Reddy, V. Murugan, Prema Kumari

Abstract:

Residual solvents in pharmaceutical samples are monitored using gas chromatography with headspace (GC-HS). Based on current regulatory and compendial requirements, measuring residual solvents is mandatory for all release testing of active pharmaceutical ingredients (API). Generally, isopropyl alcohol is used as the residual solvent in proline and tryptophan; methanol in cysteine monohydrate hydrochloride, glycine, methionine and serine; ethanol in glycine and lysine monohydrate; and acetic acid in methionine. In order to have a single method for determining these residual solvents (isopropyl alcohol, ethanol, methanol and acetic acid) in all seven amino acids, a sensitive and simple method was developed using the gas chromatography headspace technique with flame ionization detection. During development, poor reproducibility, retention time variation, and bad peak shape of the acetic acid peak were identified, caused by the reaction of acetic acid with the stationary phase (cyanopropyl dimethyl polysiloxane) of the column and the dissociation of acetic acid in water (if used as diluent) while applying the temperature gradient. Therefore, dimethyl sulfoxide was used as the diluent to avoid these issues, whereas most published methods for acetic acid quantification by GC-HS use a derivatisation technique to protect acetic acid. As per compendia, a risk-based approach was selected as appropriate to determine the degree and extent of the validation process to assure the fitness of the procedure; therefore, the total error concept was selected to validate the analytical procedure. An accuracy profile of ±40% was selected for the lower level (quantitation limit level) and ±30% for the other levels, with a 95% confidence interval (5% risk profile). The method was developed using a DB-WAXetr column (Agilent; internal diameter 530 µm, film thickness 2.0 µm, length 30 m). Helium was used as the carrier gas at a constant flow of 6.0 mL/min in constant make-up mode.
The present method is simple, rapid, and accurate, and is suitable for rapid analysis of isopropyl alcohol, ethanol, methanol and acetic acid in amino acids. The range of the method is 50 ppm to 200 ppm for isopropyl alcohol, 50 ppm to 3000 ppm for ethanol, 50 ppm to 400 ppm for methanol, and 100 ppm to 400 ppm for acetic acid, which covers the specification limits given in the European Pharmacopoeia. The accuracy profile and risk profile generated as part of validation were found to be satisfactory. Therefore, this method can be used for testing residual solvents in amino acid drug substances.
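The total-error acceptance logic described above can be sketched in a few lines. This is a simplified illustration with hypothetical recovery data, using a plain mean ± k·SD interval rather than the β-expectation tolerance interval used in formal accuracy-profile validation:

```python
import statistics

def accuracy_profile(measured, nominal, acceptance=0.30, k=2.0):
    """Simplified total-error check: the interval (relative bias +/- k*RSD)
    must lie inside the acceptance limits (e.g. +/-30%)."""
    rel = [(m - nominal) / nominal for m in measured]
    bias = statistics.mean(rel)
    half = k * statistics.stdev(rel)
    lo, hi = bias - half, bias + half
    return bias, (lo, hi), (-acceptance <= lo and hi <= acceptance)
```

For example, recoveries of 98-102 ppm at a 100 ppm nominal level pass a ±30% acceptance criterion, while recoveries around 60-65 ppm fail it.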

Keywords: amino acid, head space, gas chromatography, total error

Procedia PDF Downloads 137
2122 A Cross-Cultural Validation of the Simple Measure of Impact of Lupus Erythematosus in Youngsters (Smiley) among Filipino Pediatric Lupus Patients

Authors: Jemely M. Punzalan, Christine B. Bernal, Beatrice B. Canonigo, Maria Rosario F. Cabansag, Dennis S. Flores, Paul Joseph T. Galutira, Remedios D. Chan

Abstract:

Background: Systemic lupus erythematosus (SLE) is one of the most common autoimmune disorders and predominates in women of childbearing age. The Simple Measure of Impact of Lupus Erythematosus in Youngsters (SMILEY) is the only health-specific quality of life tool for pediatric SLE; it has been translated into several languages, but not into Filipino. Objective: The primary objective of this study was to develop a Filipino translation of the SMILEY and to examine the validity and reliability of this translation. Methodology: The SMILEY was translated into Filipino by a bilingual individual and back-translated by another bilingual individual blinded to the original English version. The translation was evaluated for content validity by a panel of experts and subjected to pilot testing. The pilot-tested translation was used in the validity and reliability testing proper. The SMILEY, together with the previously validated PEDSQL 4.0 Generic Core Scale, was administered to pediatric lupus patients and their parents on two separate occasions: a baseline and a re-test seven to fourteen days apart. Tests for convergent validity, internal consistency, and test-retest reliability were performed. Results: A total of fifty children and their parents were recruited. The mean age was 15.38±2.62 years (range 8-18 years), with mean education at high school level. The mean duration of SLE was 28 months (range 1-81 months). Subjects found the questionnaires to be relevant and easy to understand and answer. The validity of the SMILEY was demonstrated in terms of content validity, convergent validity, internal consistency, and test-retest reliability. Age, socioeconomic status and educational attainment did not show a significant effect on the scores. The difference between the scores of child and parent reports was shown to be significant for the SMILEY total (p=0.0214), effect on social life (p=0.0000), and PEDSQL physical function (p=0.0460).
Child reports showed higher scores than parent reports for these domains. Conclusion: SMILEY is a brief, easy-to-understand, valid and reliable tool for assessing pediatric SLE-specific HRQOL. It will be useful in providing better care and understanding, and may offer critical information regarding the effect of SLE on the quality of life of our pediatric lupus patients. It will help physicians understand the needs of their patients, covering not only treatment of the specific disease but also the impact of that treatment on their daily lives.
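Internal consistency in questionnaire validation is commonly summarized with Cronbach's alpha. The abstract does not state which coefficient was used, so the following is only a generic illustration of how such a coefficient is computed from per-item scores:

```python
def cronbach_alpha(items):
    """Cronbach's alpha. items: one score list per questionnaire item,
    with respondents in the same order in every list."""
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    k = len(items)
    n = len(items[0])
    totals = [sum(it[i] for it in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

Perfectly correlated items yield alpha = 1; weaker inter-item correlation lowers the coefficient.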

Keywords: systemic lupus erythematosus, pediatrics, quality of life, Simple Measure of Impact of Lupus Erythematosus in Youngsters (SMILEY)

Procedia PDF Downloads 429
2121 Stating Best Commercialization Method: An Unanswered Question from Scholars and Practitioners

Authors: Saheed A. Gbadegeshin

Abstract:

Commercialization method is a means of making inventions available at the market for final consumption. It is described as an important tool for keeping business enterprises sustainable and for improving national economic growth. Thus, there are several scholarly publications on it, either presenting or testing different methods of commercialization. However, young entrepreneurs, technologists and scientists would like to know the best method to commercialize their innovations. Hence, this question arises: what is the best commercialization method? To answer the question, a systematic literature review was conducted, and practitioners were interviewed. The literature results revealed that there are many methods, but new methods are needed to improve commercialization, especially in these times of economic crisis and political uncertainty. Similarly, the empirical results showed that there are several methods, but the best method is the one that reduces costs, reduces the risks associated with uncertainty, and improves customer participation and acceptability. Therefore, it was concluded that a new commercialization method is essential for today's high technologies, and such a method was presented.

Keywords: commercialization method, technology, knowledge, intellectual property, innovation, invention

Procedia PDF Downloads 332
2120 Some Pertinent Issues and Considerations on CBSE

Authors: Anil Kumar Tripathi, Ratneshwer Gupta

Abstract:

Software engineering research and best industry practices aim at providing software products with a high degree of quality and functionality at low cost and in less time. These requirements are addressed by Component Based Software Engineering (CBSE) as well. CBSE, which deals with software construction by assembling components, is a revolutionary extension of software engineering. CBSE must define and describe processes to assure the timely completion of high-quality software systems that are composed of a variety of pre-built software components. Though these features provide distinct and visible benefits in software design and programming, they also raise some challenging problems. The aim of this work is to summarize the pertinent issues and considerations in CBSE in the form of concepts and observations that may lead to the development of newer ways of dealing with the problems and challenges in CBSE.

Keywords: software component, component based software engineering, software process, testing, maintenance

Procedia PDF Downloads 388
2119 Investigation on Dry Sliding Wear for Laser Cladding of Stellite 6 Produced on a P91 Steel Substrate

Authors: Alain Kusmoko, Druce Dunne, Huijun Li

Abstract:

Stellite 6 was deposited by laser cladding on a chromium-bearing substrate (P91) with energy inputs of 1 kW (P91-1) and 1.8 kW (P91-1.8). The chemical compositions and microstructures of the coatings were characterized by atomic absorption spectroscopy, optical microscopy and scanning electron microscopy. The microhardness of the coatings was measured, and the wear mechanism of the coatings was assessed using a pin-on-plate (reciprocating) wear testing machine. The results showed less cracking and pore development for the Stellite 6 coating applied to the P91 steel substrate with the lower heat input (P91-1). Further, the Stellite coating for P91-1 was significantly harder than that obtained for P91-1.8. The wear test results indicated that the weight loss for P91-1 was much lower than for P91-1.8. It is concluded that the lower hardness of the coating for P91-1.8, together with the softer underlying substrate structure, markedly reduced the wear resistance of the Stellite 6 coating.
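The inverse relation between coating hardness and weight loss reported here is often rationalized with Archard's wear law, V = k·F·s/H. The abstract does not state that this model was used; the sketch below is purely illustrative, with an assumed dimensionless wear coefficient k:

```python
def archard_wear_volume(k, load_n, sliding_dist_m, hardness_pa):
    """Archard's wear law: V = k * F * s / H (worn volume in m^3).
    k is a dimensionless wear coefficient (assumed, material-dependent)."""
    return k * load_n * sliding_dist_m / hardness_pa
```

Under this model, halving the coating hardness doubles the predicted wear volume for the same load and sliding distance, consistent with the trend observed between P91-1 and P91-1.8.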

Keywords: friction and wear, laser cladding, P91 steel, Stellite 6 coating

Procedia PDF Downloads 428
2118 Mechanical Behavior of PVD Single Layer and Multilayer under Indentation Tests

Authors: K. Kaouther, D. Hafedh, A. Ben Cheikh Larbi

Abstract:

Thin films of various structures and compositions were deposited on 100C6 (AISI 52100) steel substrates by a PVD magnetron sputtering system. The morphological properties were evaluated using atomic force microscopy (AFM). Vickers microindentation tests were performed with a Shimadzu HMV-2000 hardness testing machine. Hardness measurement was carried out using the Jonsson and Hogmark model. The results show that the coating topography was dominated by domes and craters. Mechanical behavior and failure modes under microindentation depended on coating structure and composition. The TiAlN multilayer showed exceptional microindentation resistance compared to the TiN single layer and the TiAlN/TiAlN nanolayer. The stacked structure provides an increase in failure resistance and a decrease in crack propagation.
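The Jonsson and Hogmark approach estimates the composite hardness of a coated system as an area-weighted mixture of film and substrate hardness. The sketch below shows that idea in simplified form; the constant c is an empirical value that depends on whether the film deforms or fractures under the indent, and the default used here is only an assumption:

```python
def composite_hardness(h_sub, h_film, t, d, c=0.14):
    """Area law-of-mixtures composite hardness (Jonsson-Hogmark type).
    t: film thickness, d: indent diagonal, c: empirical constant
    (assumed value; depends on film deformation vs. fracture)."""
    a = 2 * c * t / d - (c * t / d) ** 2   # film area fraction of the indent
    return a * h_film + (1 - a) * h_sub
```

As t/d goes to zero the measured hardness approaches the substrate value, which is why thin hard films require shallow indents for meaningful film hardness.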

Keywords: PVD thin films, multilayer, microindentation, cracking, damage mechanisms

Procedia PDF Downloads 395
2117 Experimenting with Error Performance of Systems Employing Pulse Shaping Filters on a Software-Defined-Radio Platform

Authors: Chia-Yu Yao

Abstract:

This paper presents experimental results on testing the symbol-error-rate (SER) performance of quadrature amplitude modulation (QAM) systems employing symmetric pulse-shaping square-root (SR) filters designed by minimizing the roughness function and by minimizing the peak-to-average power ratio (PAR). The device used in the experiments is the 'bladeRF' software-defined-radio platform. PAR is a well-known measure, whereas the roughness function is a concept for measuring jitter-induced interference. The experimental results show that the system employing minimum-roughness pulse-shaping SR filters outperforms the system employing minimum-PAR pulse-shaping SR filters in terms of SER performance.
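PAR (often written PAPR) is simply the ratio of peak to average signal power. A minimal computation over complex baseband samples, independent of the particular filter designs compared in the paper, is:

```python
import math

def papr_db(samples):
    """Peak-to-average power ratio of complex baseband samples, in dB."""
    powers = [abs(s) ** 2 for s in samples]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))
```

A constant-envelope signal gives 0 dB; any amplitude fluctuation, such as that introduced by pulse shaping, raises the figure.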

Keywords: pulse-shaping filters, FIR filters, jittering, QAM

Procedia PDF Downloads 331
2116 Frequency Modulation in Vibro-Acoustic Modulation Method

Authors: D. Liu, D. M. Donskoy

Abstract:

The vibro-acoustic modulation method is based on the modulation of a high-frequency ultrasonic wave (carrier) by low-frequency vibration in the presence of various defects, primarily contact-type defects such as cracks, delamination, etc. The presence and severity of the defect are measured by the ratio of the spectral sidebands to the carrier in the spectrum of the modulated signal. This approach, however, does not differentiate between amplitude and frequency modulation, AM and FM, respectively. It has been experimentally shown that both modulations can be present in the spectrum, yet each may be associated with different physical mechanisms. AM mechanisms are quite well understood and widely covered in the literature. This paper is a first attempt to explain the generation mechanisms of FM and their correlation with the flaw properties. Here we propose two possible mechanisms leading to FM, based on nonlinear local defect resonance and dynamic acousto-elastic models.
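The sideband-to-carrier ratio that quantifies modulation can be illustrated with a synthetic AM signal: for modulation index m, each sideband has amplitude m/2 relative to the carrier. The frequencies and index below are arbitrary illustrative values, not parameters from the study:

```python
import cmath
import math

def tone_amp(x, f, fs):
    """Amplitude of frequency f in x via a single-bin DFT correlation."""
    n = len(x)
    s = sum(x[k] * cmath.exp(-2j * math.pi * f * k / fs) for k in range(n))
    return 2 * abs(s) / n

fs, fc, fm, m = 8000, 1000, 50, 0.2   # sample rate, carrier, vibration, AM index
x = [(1 + m * math.cos(2 * math.pi * fm * k / fs))
     * math.cos(2 * math.pi * fc * k / fs) for k in range(fs)]
carrier = tone_amp(x, fc, fs)          # ~1.0
sideband = tone_amp(x, fc + fm, fs)    # ~m/2 = 0.1
ratio = sideband / carrier
```

Distinguishing AM from FM requires looking beyond this single ratio, e.g. at the phase relationship between the two sidebands, which is the distinction the paper addresses.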

Keywords: non-destructive testing, nonlinear acoustics, structural health monitoring, acousto-elasticity, local defect resonance

Procedia PDF Downloads 137
2115 Verification of Low-Dose Diagnostic X-Ray as a Tool for Relating Vital Internal Organ Structures to External Body Armour Coverage

Authors: Natalie A. Sterk, Bernard van Vuuren, Petrie Marais, Bongani Mthombeni

Abstract:

Injuries to the internal structures of the thorax and abdomen remain a leading cause of death among soldiers. Body armour is a standard-issue piece of military equipment designed to protect the vital organs against ballistic and stab threats. When configured for maximum protection, the excessive weight and size of the armour may limit soldier mobility and increase physical fatigue and discomfort. Providing soldiers with more armour than necessary may, therefore, hinder their ability to react rapidly in life-threatening situations. The capability to determine the optimal trade-off between the amount of essential anatomical coverage and hindrance to soldier performance may significantly enhance the design of armour systems. The current study aimed to develop and pilot a methodology for relating internal anatomical structures to actual armour plate coverage in real time using low-dose diagnostic X-ray scanning. Several pilot scanning sessions were held at the Lodox Systems (Pty) Ltd head office in South Africa. Testing involved using the Lodox eXero-dr to scan dummy trunk rigs at various measurement heights and angles, as well as human participants wearing correctly fitted body armour while positioned in supine, prone shooting, seated and kneeling shooting postures. Sizing and metrics obtained from the Lodox eXero-dr were then verified against a board with known dimensions. Results indicated that the low-dose diagnostic X-ray has the capability to clearly identify the vital internal structures of the aortic arch, heart, and lungs in relation to the position of the external armour plates. Further testing is still required in order to fully and accurately identify the inferior liver boundary, inferior vena cava, and spleen. The scans produced in the supine, prone, and seated postures provided superior image quality over the kneeling posture.
The X-ray source-to-object and detector-to-object distances must be standardised to control for possible magnification changes and for comparison purposes. To account for this, specific scanning heights and angles were identified to allow for parallel scanning of relevant areas. The low-dose diagnostic X-ray provides a non-invasive, safe, and rapid technique for relating vital internal structures to external structures. This capability can be used for the re-evaluation of the anatomical coverage required for essential protection while optimising armour design and fit for soldier performance.
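The magnification concern above follows from point-source projection geometry: an object between source and detector is magnified by M = SID/SOD (source-to-image over source-to-object distance). A small sketch; the variable names and example distances are ours, not from the study:

```python
def magnification(source_detector_mm, source_object_mm):
    """Point-source geometric magnification M = SID / SOD."""
    return source_detector_mm / source_object_mm

def true_size(measured_mm, source_detector_mm, source_object_mm):
    """Back-correct a measured projection to true object size."""
    return measured_mm / magnification(source_detector_mm, source_object_mm)
```

This is why standardised scanning heights matter: a structure 200 mm closer to the source projects noticeably larger than the same structure at the detector plane.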

Keywords: body armour, low-dose diagnostic X-ray, scanning, vital organ coverage

Procedia PDF Downloads 114
2114 Cognitive Weighted Polymorphism Factor: A New Cognitive Complexity Metric

Authors: T. Francis Thamburaj, A. Aloysius

Abstract:

Polymorphism is one of the main pillars of the object-oriented paradigm. It induces hidden forms of class dependency which may impact software quality, resulting in higher costs for comprehending, debugging, testing, and maintaining the software. In this paper, a new cognitive complexity metric called Cognitive Weighted Polymorphism Factor (CWPF) is proposed. Apart from the structural complexity of the software, it includes cognitive complexity on the basis of type. The cognitive weights are calibrated based on 27 empirical studies with 120 persons. A case study and experimentation with the new software metric show positive results. Further, a comparative study is made, and a correlation test has proved that the CWPF complexity metric is a better, more comprehensive, and more realistic indicator of software complexity than Abreu's Polymorphism Factor (PF) complexity metric.
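For context, Abreu's Polymorphism Factor, which CWPF extends, is commonly defined in the MOOD metrics suite as the ratio of actual method overrides to the maximum possible overrides. A minimal sketch of that baseline metric (the cognitive weights of CWPF itself are calibrated in the paper and not reproduced here):

```python
def polymorphism_factor(classes):
    """Abreu's MOOD Polymorphism Factor: overriding methods divided by
    the maximum possible overrides (new methods x descendant count).
    classes: dicts with 'overriding', 'new_methods', 'descendants'."""
    num = sum(c['overriding'] for c in classes)
    den = sum(c['new_methods'] * c['descendants'] for c in classes)
    return num / den if den else 0.0
```

For a base class declaring 2 new methods with 2 descendants, of which each descendant overrides one method, PF is 2/4 = 0.5.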

Keywords: cognitive complexity metric, object-oriented metrics, polymorphism factor, software metrics

Procedia PDF Downloads 437
2113 Efficient Moment Frame Structure

Authors: Mircea I. Pastrav, Cornelia Baera, Florea Dinu

Abstract:

A different concept for the design and detailing of reinforced concrete precast frame structures is analyzed in this paper. The new detailing of the joints derives from the special hybrid moment frame joints. The special reinforcements of this alternative detailing, named the modified special hybrid joint, are bondless with respect to both column and beams. Full-scale tests were performed on a plane model representing part of a 5-story structure, cropped at the middle of the beam and column spans. A theoretical approach was developed based on test results from the twice-repaired model subjected to lateral seismic-type loading. A discussion of the modified special hybrid joint behavior, and of the wider research still needed, concludes the presentation.

Keywords: modified hybrid joint, repair, seismic loading type, acceptance criteria

Procedia PDF Downloads 513
2112 Serological IgG Testing to Diagnose Alimentary Induced Diseases and Monitoring Efficacy of an Individual Defined Diet in Dogs

Authors: Anne-Margré C. Vink

Abstract:

Background: Food-related allergies and intolerances occur frequently in dogs. Diagnosis, and monitoring of elimination efficiency according to the 'gold standard', is time-consuming, expensive, and requires an expert clinical setting. In order to facilitate rapid and robust quantitative testing of intolerance and to determine the individual offending foods, a serological test is indicated. Method: As we had previously developed the Medisynx IgG Human Screening Test ELISA, and the dog's immune system is very similar to that of humans, we were able to develop the Medisynx IgG Dog Screening Test ELISA as well. In this study, 47 dogs suffering from Canine Atopic Dermatitis (CAD) and several secondary induced reactions were included in the serological Medisynx IgG Dog Screening Test ELISA (within <0.02% SD). Results were expressed as titers relative to the standard OD readings to diagnose alimentary induced diseases and to monitor the efficacy of an individual elimination diet in dogs. Split-sample analysis was performed by independently sending two 3 mL serum samples under two unique codes. Results: The veterinarian monitored these dogs and checked their results at least at 3, 7, 21, 49, and 70 days, and after periods of 6 and 12 months, on an individual negative diet, with a positive challenge (retrospectively) at 6 months. Data for each dog were recorded in a screening form, and a complete recovery of all clinical manifestations was observed at or before 70 days (between 50 and 70 days) in the majority of dogs (44 out of 47 dogs, 93.6%). Conclusion: Challenge results showed 100% specificity as well as a 100% positive predictive value; sensitivity was 95.7% and the negative predictive value was 95.7%.
In conclusion, an individual diet based on IgG ELISA in dogs provides a significant improvement of atopic dermatitis and pruritus, including all other non-specifically defined allergic skin reactions such as erythema, itching, and biting and gnawing at toes, as well as several secondary manifestations such as chronic diarrhoea, chronic constipation, otitis media, obesity, laziness or inactive behaviour, pain and muscular stiffness causing movement disorders, excessive lacrimation, hyperactive or nervous behaviour, inability to stay alone at home, anxiety, biting and aggressive behaviour, and disobedience. Furthermore, we conclude that relatively more severe systemic candidiasis, as shown by relatively higher titers (class 3 and 4 IgG reactions to Candida albicans), influences the duration of recovery from clinical manifestations in affected dogs. These findings are consistent with our preliminary human clinical studies.
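The reported sensitivity, specificity, and predictive values all follow from a standard 2x2 confusion matrix. The counts in the example are hypothetical, chosen only to produce figures close to those reported:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv
```

With zero false positives, specificity and PPV are both 100% regardless of the other counts, matching the pattern in the challenge results.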

Keywords: allergy, canine atopic dermatitis, CAD, food allergens, IgG-ELISA, food-incompatibility

Procedia PDF Downloads 311
2111 Biodiesel Production from Palm Oil Using an Oscillatory Baffled Reactor

Authors: Malee Santikunaporn, Tattep Techopittayakul, Channarong Asavatesanupap

Abstract:

Biofuel production, especially that of biodiesel, has gained tremendous attention during the last decade due to environmental concerns and shortages in petroleum reserves. This research aims to investigate the influence of operating parameters, such as the alcohol-to-oil molar ratio (4:1, 6:1, and 9:1) and the amount of catalyst (1, 1.5, and 2 wt.%), on the transesterification of refined palm oil (RPO) in a medium-scale oscillatory baffled reactor. It has been shown that an increase in the methanol-to-oil ratio resulted in an increase in fatty acid methyl ester (FAME) content. The amount of catalyst had an insignificant effect on the FAME content. Engine testing was performed on B0 (100 v/v% diesel) and a blended fuel, B50 (50 v/v% diesel). Combustion of B50 was found to give lower torque compared to pure diesel. Exhaust gas from B50 was found to contain lower concentrations of CO and CO2.
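The alcohol-to-oil molar ratios studied translate into a methanol charge per batch via simple stoichiometric bookkeeping. A sketch, assuming an average palm-oil triglyceride molecular weight of about 847 g/mol (our assumption for illustration, not a figure from the paper):

```python
def methanol_mass_kg(oil_kg, molar_ratio, oil_mw=847.0, meoh_mw=32.04):
    """Methanol charge for a given alcohol-to-oil molar ratio.
    oil_mw: assumed average MW of a palm-oil triglyceride (g/mol)."""
    return oil_kg / oil_mw * molar_ratio * meoh_mw
```

Since transesterification consumes only 3 mol of methanol per mol of triglyceride, ratios of 6:1 and 9:1 represent a deliberate excess that drives the equilibrium toward FAME.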

Keywords: biodiesel, palm oil, transesterification, oscillatory baffled reactor

Procedia PDF Downloads 161
2110 Assessing the Blood-Brain Barrier (BBB) Permeability in PEA-15 Mutant Cat Brain using Magnetization Transfer (MT) Effect at 7T

Authors: Sultan Z. Mahmud, Emily C. Graff, Adil Bashir

Abstract:

Phosphoprotein enriched in astrocytes 15 kDa (PEA-15) is a multifunctional adapter protein associated with the regulation of apoptotic cell death. It has recently been discovered that PEA-15 is crucial in the normal neurodevelopment of domestic cats, a gyrencephalic animal model, although the exact function of PEA-15 in neurodevelopment is unknown. This study investigates how PEA-15 affects blood-brain barrier (BBB) permeability in the cat brain, which can cause abnormalities in tissue metabolite and energy supplies. Severe polymicrogyria and microcephaly have been observed in cats with a loss-of-function PEA-15 mutation, affecting the normal neurodevelopment of the cat. This suggests that the vital role of PEA-15 in neurodevelopment is associated with gyrification. Neurodevelopment is a highly energy-demanding process, and the mammalian brain depends on glucose as its main energy source. PEA-15 plays a very important role in glucose uptake and utilization by interacting with phospholipase D1 (PLD1). Mitochondria also play a critical role in bioenergetics and are essential to supplying the energy needed for neurodevelopment. Cerebral blood flow regulates metabolite supply, and recent findings have shown that blood plasma contains mitochondria as well. The BBB can therefore play a very important role in regulating metabolite and energy supply in the brain. In this study, the blood-brain permeability in the cat brain was measured using the MRI magnetization transfer (MT) effect on the perfusion signal. Perfusion is the tissue-mass-normalized supply of blood to the capillary bed; it also accommodates the supply of oxygen and other metabolites to the tissue. A fraction of the arterial blood can diffuse into the tissue, depending on BBB permeability. This fraction is known as the water extraction fraction (EF).
MT is a process of saturating macromolecules, which affects blood water that has diffused into the tissue while having minimal effect on intravascular blood water that has not exchanged with the tissue. Measuring the perfusion signal with and without MT enables estimation of microvascular blood flow, EF, and the permeability surface area product (PS) in the brain. All experiments were performed on a Siemens 7T Magnetom with a 32-channel head coil. Three control cats and three PEA-15 mutant cats were used for the study. For the control cats, average EF in white and gray matter was 0.9±0.1 and 0.86±0.15, respectively; perfusion in white and gray matter was 85±15 mL/100g/min and 97±20 mL/100g/min, respectively; and PS in white and gray matter was 201±25 mL/100g/min and 225±35 mL/100g/min, respectively. For the PEA-15 mutant cats, average EF in white and gray matter was 0.81±0.15 and 0.77±0.2, respectively; perfusion in white and gray matter was 140±25 mL/100g/min and 165±18 mL/100g/min, respectively; and PS in white and gray matter was 240±30 mL/100g/min and 259±21 mL/100g/min, respectively. These results show that the BBB is compromised in the PEA-15 mutant cat brain: EF is decreased while perfusion and PS are increased in the mutant cats compared to the controls. These findings may further explain the function of PEA-15 in neurodevelopment.
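EF, flow, and PS are commonly related through the Renkin-Crone expression E = 1 − exp(−PS/F), i.e. PS = −F·ln(1 − E). The abstract does not state the exact model used, but the reported values are roughly consistent with this relation:

```python
import math

def ps_from_extraction(flow, ef):
    """Renkin-Crone relation: PS = -F * ln(1 - E), in the units of flow."""
    return -flow * math.log(1.0 - ef)

# control white matter: F = 85 mL/100g/min, EF = 0.9
ps_white = ps_from_extraction(85, 0.9)      # ~196, near the reported 201
# mutant white matter: F = 140 mL/100g/min, EF = 0.81
ps_mutant = ps_from_extraction(140, 0.81)   # ~232, near the reported 240
```

The relation also explains the direction of the findings: a large rise in flow can raise PS even while the extraction fraction falls.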

Keywords: BBB, cat brain, magnetization transfer, PEA-15

Procedia PDF Downloads 125
2109 Flexible Arm Manipulator Control for Industrial Tasks

Authors: Mircea Ivanescu, Nirvana Popescu, Decebal Popescu, Dorin Popescu

Abstract:

This paper addresses the control problem for a class of hyper-redundant arms. In order to avoid discrepancies between the mathematical model and the actual dynamics, a dynamic model with uncertain parameters is inferred for this class of manipulators. A procedure to design a feedback controller that stabilizes the uncertain system is proposed. A PD boundary control algorithm is used to control the desired position of the manipulator. This controller is easy to implement from the point of view of measuring techniques and actuation. Numerical simulations verify the effectiveness of the presented methods. To verify the suitability of the control algorithm, a platform with a 3D flexible manipulator was employed for testing. Experimental tests on this platform illustrate the applications of the techniques developed in the paper.
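The PD position-control idea can be sketched on a unit-mass double integrator, a drastic simplification of the flexible-arm dynamics; the gains and model below are illustrative, not the paper's:

```python
def pd_step_response(kp, kd, steps=4000, dt=0.001, target=1.0):
    """PD position control of a unit-mass double integrator
    (a simplified stand-in for the boundary-controlled arm)."""
    x, v = 0.0, 0.0
    for _ in range(steps):
        u = kp * (target - x) - kd * v   # PD law on position error
        v += u * dt                      # unit mass: acceleration = u
        x += v * dt
    return x

final = pd_step_response(kp=100.0, kd=20.0)  # critically damped gain pair
```

With kd = 2*sqrt(kp) the response is critically damped, reaching the target without overshoot; the appeal of PD boundary control is that it needs only position and velocity measurements at the boundary.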

Keywords: distributed model, flexible manipulator, observer, robot control

Procedia PDF Downloads 312
2108 An Enhanced Harmony Search (ENHS) Algorithm for Solving Optimization Problems

Authors: Talha A. Taj, Talha A. Khan, M. Imran Khalid

Abstract:

Optimization techniques attract researchers to formulate a problem and determine its optimum solution. This paper presents an Enhanced Harmony Search (ENHS) algorithm for solving optimization problems. The proposed algorithm converges faster and is more efficient than the standard Harmony Search (HS) algorithm. The paper discusses the novel techniques in detail and also provides a strategy for tuning the decisive parameters that affect the efficiency of the ENHS algorithm. The algorithm is tested on various benchmark functions, a real-world optimization problem, and a constrained objective function. The results of ENHS are also compared to standard HS and various other optimization algorithms. The ENHS algorithm proves to be significantly better and more efficient than the other algorithms. The simulation and testing of the algorithms are performed in MATLAB.
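For reference, the standard HS algorithm that ENHS improves upon improvises each new solution from a harmony memory via memory consideration (rate HMCR), pitch adjustment (rate PAR, bandwidth bw), and random selection. A minimal sketch follows (the paper's own implementation is in MATLAB; the parameter values here are common defaults, not the paper's):

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.1,
                   iters=3000, seed=0):
    """Minimize f over box `bounds` with standard Harmony Search."""
    rng = random.Random(seed)
    # harmony memory: hms random solutions and their scores
    hm = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [f(x) for x in hm]
    for _ in range(iters):
        new = []
        for j, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:              # memory consideration
                v = hm[rng.randrange(hms)][j]
                if rng.random() < par:           # pitch adjustment
                    v = min(max(v + bw * rng.uniform(-1, 1), lo), hi)
            else:                                # random selection
                v = rng.uniform(lo, hi)
            new.append(v)
        s = f(new)
        worst = max(range(hms), key=scores.__getitem__)
        if s < scores[worst]:                    # replace worst harmony
            hm[worst], scores[worst] = new, s
    best = min(range(hms), key=scores.__getitem__)
    return hm[best], scores[best]

# 3-D sphere benchmark: global minimum 0 at the origin
x, fx = harmony_search(lambda v: sum(t * t for t in v), [(-5, 5)] * 3)
```

Enhancements such as ENHS typically adapt PAR and bw over the run instead of keeping them fixed, which is one route to the faster convergence the paper reports.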

Keywords: optimization, harmony search algorithm, MATLAB, electronic

Procedia PDF Downloads 448
2107 Systematic Identification of Noncoding Cancer Driver Somatic Mutations

Authors: Zohar Manber, Ran Elkon

Abstract:

Accumulation of somatic mutations (SMs) in the genome is a major driving force of cancer development. Most SMs in the tumor's genome are functionally neutral; however, some cause damage to critical processes and provide the tumor with a selective growth advantage (termed cancer driver mutations). Current research on the functional significance of SMs is mainly focused on finding alterations in protein-coding sequences. However, the exome comprises only 3% of the human genome, and thus, SMs in the noncoding genome significantly outnumber those that map to protein-coding regions. Although our understanding of noncoding driver SMs is very rudimentary, it is likely that disruption of regulatory elements in the genome is an important, yet largely underexplored, mechanism by which somatic mutations contribute to cancer development. The expression of most human genes is controlled by multiple enhancers, and therefore, it is conceivable that regulatory SMs are distributed across different enhancers of the same target gene. Yet, to date, most statistical searches for regulatory SMs have considered each regulatory element individually, which may reduce statistical power. The first challenge in considering the cumulative activity of all the enhancers of a gene as a single unit is to map enhancers to their target promoters. Such mapping defines, for each gene, its set of regulating enhancers (termed its "set of regulatory elements" (SRE)). Considering the multiple enhancers of each gene as one unit holds great promise for enhancing the identification of driver regulatory SMs. However, the success of this approach depends greatly on the availability of comprehensive and accurate enhancer-promoter (E-P) maps. The discovery of driver regulatory SMs has also been hindered, to date, by insufficient sample sizes.
In this study, we analyzed more than 2,500 whole-genome sequence (WGS) samples provided by The Cancer Genome Atlas (TCGA) and The International Cancer Genome Consortium (ICGC) in order to identify such driver regulatory SMs. Our analyses took into account the combinatorial aspect of gene regulation by considering all the enhancers that control the same target gene as one unit, based on E-P maps from three genomics resources. The identification of candidate driver noncoding SMs is based on their recurrence. We searched for SREs of genes that are "hotspots" for SMs (that is, they accumulate SMs at a significantly elevated rate). To test the statistical significance of recurrence of SMs within a gene's SRE, we used both global and local background mutation rates. Using this approach, we detected - in seven different cancer types - numerous "hotspots" for SMs. To support the functional significance of these recurrent noncoding SMs, we further examined their association with the expression level of their target gene (using gene expression data provided by the ICGC and TCGA for samples that were also analyzed by WGS).
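The SRE recurrence test described above can be sketched as a Poisson test of observed versus expected mutation counts under a background rate. All numbers in the example are hypothetical, and the study's actual statistics also use local background rates, which this sketch omits:

```python
from math import exp

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam), via the complementary CDF."""
    term, cdf = exp(-lam), 0.0
    for i in range(k):
        cdf += term
        term *= lam / (i + 1)
    return max(0.0, 1.0 - cdf)

def sre_hotspot_pvalue(observed, sre_len_bp, n_samples, bg_rate):
    """Recurrence p-value: observed mutations across a gene's SRE vs the
    count expected under a per-bp, per-sample background mutation rate."""
    expected = sre_len_bp * n_samples * bg_rate
    return poisson_sf(observed, expected)
```

Pooling all enhancers of a gene into one SRE enlarges the tested territory and aggregates mutations scattered across enhancers, which is what lends the approach extra statistical power over per-element tests.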

Keywords: cancer genomics, enhancers, noncoding genome, regulatory elements

Procedia PDF Downloads 96
2106 Influence of Glass Plates Different Boundary Conditions on Human Impact Resistance

Authors: Alberto Sanchidrián, José A. Parra, Jesús Alonso, Julián Pecharromán, Antonia Pacios, Consuelo Huerta

Abstract:

Glass is a commonly used material in building; there is not a unique design solution as plates with a different number of layers and interlayers may be used. In most façades, a security glazing have to be used according to its performance in the impact pendulum. The European Standard EN 12600 establishes an impact test procedure for classification under the point of view of the human security, of flat plates with different thickness, using a pendulum of two tires and 50 kg mass that impacts against the plate from different heights. However, this test does not replicate the actual dimensions and border conditions used in building configurations and so the real stress distribution is not determined with this test. The influence of different boundary conditions, as the ones employed in construction sites, is not well taking into account when testing the behaviour of safety glazing and there is not a detailed procedure and criteria to determinate the glass resistance against human impact. To reproduce the actual boundary conditions on site, when needed, the pendulum test is arranged to be used "in situ", with no account for load control, stiffness, and without a standard procedure. Fracture stress of small and large glass plates fit a Weibull distribution with quite a big dispersion so conservative values are adopted for admissible fracture stress under static loads. In fact, test performed for human impact gives a fracture strength two or three times higher, and many times without a total fracture of the glass plate. Newest standards, as for example DIN 18008-4, states for an admissible fracture stress 2.5 times higher than the ones used for static and wing loads. Now two working areas are open: a) to define a standard for the ‘in situ’ test; b) to prepare a laboratory procedure that allows testing with more real stress distribution. 
To work on both research lines, a laboratory facility that allows testing of medium-size specimens with different boundary conditions has been developed. A special steel frame allows reproducing the stiffness of the glass support substructure, including a rigid condition used as reference. The dynamic behaviour of the glass plate and its support substructure has been characterized with finite element models updated with modal test results. In addition, a new portable impact machine is being used to obtain sufficient force and direction control during the impact test. An impact energy of 100 J is used. To avoid problems with broken glass plates, the tests have been done using an aluminium plate of 1000 mm x 700 mm and 10 mm thickness supported on four sides; three different substructure stiffness conditions are used. Detailed control of the dynamic stiffness and the behaviour of the plate is performed with modal tests. The repeatability of the test and the reproducibility of the results prove that a procedure to control both the stiffness of the plate and the impact level is necessary.
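The Weibull treatment of fracture stress mentioned in the abstract can be sketched in a few lines. The following is a minimal illustration, assuming synthetic fracture-stress data (the shape and scale values are hypothetical, not taken from the study):

```python
import numpy as np
from scipy import stats

# Synthetic fracture stresses (MPa): a hypothetical Weibull sample
# standing in for measured glass fracture data.
data = stats.weibull_min.rvs(5.0, scale=50.0, size=200, random_state=0)

# Two-parameter fit (location fixed at zero, the usual choice for strength data)
shape, loc, scale = stats.weibull_min.fit(data, floc=0)

# A conservative design value: the stress with 95% survival probability
sigma_design = stats.weibull_min.ppf(0.05, shape, scale=scale)
print(f"shape={shape:.2f}, scale={scale:.1f} MPa, 5th percentile={sigma_design:.1f} MPa")
```

The large dispersion noted in the abstract corresponds to a low Weibull shape parameter, which pushes the low-probability-of-failure design stress well below the mean strength.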

Keywords: glass plates, human impact test, modal test, plate boundary conditions

Procedia PDF Downloads 301
2105 Management of Caverno-Venous Leakage: A Series of 133 Patients with Symptoms, Hemodynamic Workup, and Results of Surgery

Authors: Allaire Eric, Hauet Pascal, Floresco Jean, Beley Sebastien, Sussman Helene, Virag Ronald

Abstract:

Background: Caverno-venous leakage (CVL) is a devastating although little-known disease, the leading cause of major physical impairment in men under 25 and responsible for 50% of resistance to phosphodiesterase 5 inhibitors (PDE5-I), which affects 30 to 40% of users of this medication class. In this condition, premature blood drainage from the corpora cavernosa prevents penile rigidity and penetration during sexual intercourse. The role of conservative surgery in this disease remains controversial. Aim: To assess the complications and results of combined open surgery and embolization for CVL. Method: Between June 2016 and September 2021, 133 consecutive patients underwent surgery in our institution for CVL causing severe erectile dysfunction (ED) resistant to oral medical treatment. Procedures combined vein embolization and ligation with microsurgical techniques. We performed pre- and post-operative clinical (Erection Hardness Score: EHS) and hemodynamic evaluation by duplex sonography in all patients. Before surgery, the CVL network was visualized by computed tomography cavernography. Penile EMG was performed in cases of diabetes or other suspected neurological conditions. All patients were optimized for hormonal status; data were prospectively recorded. Results: Clinical signs suggesting CVL were ED since before age 25, loss of erection when changing position, and penile rigidity varying according to position. The main complications were minor pulmonary embolism in two patients (one after airline travel, one with a heterozygous Factor V Leiden mutation), one infection, three hematomas requiring reoperation, and one case of decreased glans sensitivity lasting more than one year. Mean pre-operative pharmacologic EHS was 2.37+/-0.64 and mean post-operative pharmacologic EHS was 3.21+/-0.60, p<0.0001 (paired t-test). The mean EHS change was 0.87+/-0.74. After surgery, 81.5% of patients had a pharmacologic EHS of 3 or more, allowing intercourse with penetration. 
Three patients (2.2%) experienced a lower post-operative EHS. The main cause of failure was leakage from the deep dorsal aspect of the corpora cavernosa. At 14 months of follow-up, 83.2% of patients had a clinical EHS of 3 or more, allowing sexual intercourse with penetration, one-third of them without any medication. Five patients received a penile implant after unsuccessful conservative surgery. Conclusion: Open surgery combined with embolization is an efficient approach to CVL causing severe erectile dysfunction.

Keywords: erectile dysfunction, cavernovenous leakage, surgery, embolization, treatment, result, complications, penile duplex sonography

Procedia PDF Downloads 136
2104 From By-product To Brilliance: Transforming Adobe Brick Construction Using Meat Industry Waste-derived Glycoproteins

Authors: Amal Balila, Maria Vahdati

Abstract:

Earth is a green building material with very low embodied energy and almost zero greenhouse gas emissions. However, it lacks strength and durability in its natural state. By responsibly sourcing stabilisers, it is possible to enhance its strength. This research draws inspiration from the robustness of termite mounds, where termites incorporate glycoproteins from their saliva during construction. Through biomimicry, this study explores the potential of these termite stabilisers for producing bio-inspired adobe bricks. The meat industry generates significant waste during slaughter, including blood, skin, bones, tendons, gastrointestinal contents, and internal organs. While abundant, many meat by-products raise concerns regarding human consumption, religious practices, and cultural and ethical beliefs, and they also contribute heavily to environmental pollution. Extracting and utilising proteins from this waste is vital for reducing pollution and increasing profitability. Exploring this untapped potential, the research investigates how glycoproteins could revolutionize adobe brick construction. Bovine serum albumin (BSA) from cows' blood and mucin from porcine stomachs were the glycoproteins chosen as stabilisers for adobe brick production. Despite their wide usage across various fields, they have very limited utilisation in food processing; thus, both were identified as potential stabilisers for adobe brick production in this study. Two soil types were used to prepare adobe bricks for testing, comparing unstabilised control bricks with glycoprotein-stabilised ones. All bricks underwent testing for unconfined compressive strength and erosion resistance. The primary finding of this study is the efficacy of BSA, a glycoprotein derived from cows' blood and a by-product of the beef industry, as an earth construction stabiliser. 
Adding 0.5% by weight of BSA resulted in a 17% and 41% increase in the unconfined compressive strength for British and Sudanese adobe bricks, respectively. Further, adding 5% by weight of BSA led to a 202% and 97% increase in the unconfined compressive strength for British and Sudanese adobe bricks, respectively. Moreover, using 0.1%, 0.2%, and 0.5% by weight of BSA resulted in erosion rate reductions of 30%, 48%, and 70% for British adobe bricks, respectively, with a 97% reduction observed for Sudanese adobe bricks at 0.5% by weight of BSA. However, mucin from the porcine stomach did not significantly improve the unconfined compressive strength of adobe bricks. Nevertheless, employing 0.1% and 0.2% by weight of mucin resulted in erosion rate reductions of 28% and 55% for British adobe bricks, respectively. These findings underscore BSA's efficiency as an earth construction stabiliser for wall construction and mucin's efficacy for wall render, showcasing their potential for sustainable and durable building practices.

Keywords: biomimicry, earth construction, industrial waste management, sustainable building materials, termite mounds

Procedia PDF Downloads 39
2103 Efficacy Testing of a Product in Reducing Facial Hyperpigmentation and Photoaging after a 12-Week Use

Authors: Nalini Kaul, Barrie Drewitt, Elsie Kohoot

Abstract:

Hyperpigmentation is the third most common pigmentary disorder for which dermatologic treatment is sought. It affects all ages, resulting in skin darkening because of melanin accumulation. An uneven skin tone caused by sun exposure (solar lentigos/age spots/sun spots), skin disruption following acne or rashes (post-inflammatory hyperpigmentation, PIH), or hormonal changes (melasma) can lead to significant psychosocial impairment. Dyschromia results from various alterations in the biochemical processes regulating melanogenesis. Treatments include the daily use of sunscreen with lightening, brightening, and exfoliating products. Depigmentation is achieved by various depigmenting agents: common examples are hydroquinone, arbutin, azelaic acid, aloesin, mulberry and licorice extracts, kojic acid, niacinamide, ellagic acid, green tea, turmeric, soy, ascorbic acid, and tranexamic acid. These agents affect pigmentation by interfering with mechanisms before, during, and after melanin synthesis. While immediate correction is much sought after, patience and diligence are key. Our objective was to assess the effects of a facial product with pigmentation treatment and UV protection in 35 healthy females (35-65 years) meeting the study criteria. Subjects with mild to moderate hyperpigmentation and fine lines, with no use of skin-lightening products in the six months or any dermatological procedures in the twelve months before the study started, were included. Efficacy parameters included expert clinical grading for hyperpigmentation, radiance, skin tone and smoothness, and fine lines and wrinkles; bioinstrumentation (Corneometer®, Colorimeter®); digital photography and imaging (Visia-CR®); and self-assessment questionnaires. Safety included grading for erythema, edema, dryness, and peeling, and self-assessments for itching, stinging, tingling, and burning. 
Our results showed statistically significant improvement in clinical grading scores, bioinstrumentation, and digital photos for hyperpigmentation-brown spots, fine lines/wrinkles, skin tone, radiance, pores, skin smoothness, and overall appearance compared to baseline. The product was also well-tolerated and liked by subjects. Conclusion: Facial hyperpigmentation is of great concern, and treatment strategies are increasingly sought. Clinical trials with both subjective and objective assessments, imaging analyses, and self-perception are essential to distinguish evidence-based products. The multifunctional cosmetic product tested in this clinical study showed efficacy, tolerability, and subject satisfaction in reducing hyperpigmentation and global photoaging.

Keywords: hyperpigmentation, photoaging, clinical testing, expert visual evaluations, bio-instruments

Procedia PDF Downloads 64
2102 An Automated Magnetic Dispersive Solid-Phase Extraction Method for Detection of Cocaine in Human Urine

Authors: Feiyu Yang, Chunfang Ni, Rong Wang, Yun Zou, Wenbin Liu, Chenggong Zhang, Fenjin Sun, Chun Wang

Abstract:

Cocaine is the most frequently used illegal drug globally, with an annual prevalence of cocaine use of 0.3% to 0.4% of the adult population aged 15-64 years. The growing trend of cocaine abuse and associated drug crimes is a great concern; urine testing has therefore become an important noninvasive sampling approach, as cocaine and its metabolites (COCs) are usually present in urine in high concentrations and with relatively long detection windows. However, direct analysis of urine samples is not feasible because the complex urine matrix often causes low sensitivity and selectivity in the determination. On the other hand, the presence of low doses of analytes in urine makes an extraction and pretreatment step important before determination. Especially in group drug-taking cases, the pretreatment step becomes even more tedious and time-consuming. Developing a sensitive, rapid, and high-throughput method for the detection of COCs in the human body is therefore indispensable for law enforcement officers, treatment specialists, and health officials. In this work, a new automated magnetic dispersive solid-phase extraction (MDSPE) sampling method followed by high performance liquid chromatography-mass spectrometry (HPLC-MS) was developed for the quantitative enrichment of COCs from human urine, using prepared magnetic nanoparticles as adsorbents. The nanoparticles were prepared by silanizing magnetic Fe3O4 nanoparticles and modifying them with divinylbenzene and vinylpyrrolidone, which gives them the ability to specifically adsorb COCs. These magnetic particles simplified the pretreatment steps through electromagnetically controlled extraction, achieving full automation. The proposed device significantly improved sample preparation efficiency, processing 32 samples in one batch within 40 min. 
The preparation procedure for the magnetic nanoparticles was optimized, and their performance was characterized by scanning electron microscopy, vibrating sample magnetometry, and infrared spectroscopy. Several analytical parameters were studied, including the amount of particles, adsorption time, elution solvent, and extraction and desorption kinetics, and the proposed method was verified. The limits of detection for cocaine and its metabolites were 0.09-1.1 ng·mL-1, with recoveries ranging from 75.1 to 105.7%. Compared to traditional sampling methods, this method is time-saving and environmentally friendly. It was confirmed that the proposed automated method is a highly effective approach for trace analysis of cocaine and its metabolites in human urine.
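For readers unfamiliar with the recovery figures quoted above, spike recovery is simply the measured concentration divided by the spiked concentration. A minimal sketch, with hypothetical concentrations rather than the study's data:

```python
# Spike-recovery calculation: recovery (%) = measured / spiked * 100.
# The spiked and measured concentrations below are hypothetical examples.
spiked_ng_per_ml   = [1.0, 10.0, 100.0]
measured_ng_per_ml = [0.78, 9.1, 103.4]

recoveries = [100.0 * m / s for m, s in zip(measured_ng_per_ml, spiked_ng_per_ml)]
print([round(r, 1) for r in recoveries])  # [78.0, 91.0, 103.4]
```

Recoveries between roughly 70% and 120% across the calibration range are the usual acceptance window for trace bioanalytical methods, which the reported 75.1-105.7% satisfies.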

Keywords: automatic magnetic dispersive solid-phase extraction, cocaine detection, magnetic nanoparticles, urine sample testing

Procedia PDF Downloads 193
2101 MyAds: A Social Adaptive System for Online Advertisement, from Hypotheses to Implementation

Authors: Dana A. Al Qudah, Alexandra I. Cristea, Rizik M. H. Al Sayyed, Amer Obeidah

Abstract:

Online advertisement is one of the major sources of income for many companies; it has a role in the overall business flow and affects consumer behavior directly. Unfortunately, most users tend to block or ignore ads. MyAds is a social adaptive hypermedia system for online advertising whose main goal is to explore how to make online ads more acceptable. In order to achieve this goal, various technologies and techniques are used. This paper presents a theoretical framework as well as the system architecture for MyAds, which was designed based on a set of hypotheses and an exploratory study. The system was then implemented, and a pilot experiment was conducted to validate it. The main outcomes suggest that the system provided personalized ads for users. The main implication is that the system can be used for further testing and validation.

Keywords: adaptive hypermedia, e-advertisement, social, hypotheses, exploratory study, framework

Procedia PDF Downloads 399
2100 The Impact of Bitcoin on Stock Market Performance

Authors: Oliver Takawira, Thembi Hope

Abstract:

This study will analyse the relationship between Bitcoin price movements and the Johannesburg Stock Exchange (JSE). The aim is to determine whether Bitcoin price movements affect stock market performance. As cryptocurrencies continue to gain prominence as safe assets during periods of economic distress, the question arises of whether Bitcoin's prosperity could affect investment in the stock market. To identify the existence of a short-run and long-run linear relationship, the study will apply the Autoregressive Distributed Lag (ARDL) bounds test and a Vector Error Correction Model (VECM) after testing the data for unit roots and cointegration using the Augmented Dickey-Fuller (ADF) and Phillips-Perron (PP) tests. The Non-linear Autoregressive Distributed Lag (NARDL) model will then be used to check whether there is a non-linear relationship between Bitcoin prices and stock market prices.
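The error-correction idea behind the ARDL/VECM machinery can be sketched in a few lines. The following is a minimal illustration on synthetic cointegrated series, with made-up data standing in for the Bitcoin and JSE price series; it is not the study's estimation code:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 400
x = np.cumsum(rng.normal(size=n))   # hypothetical I(1) Bitcoin (log) price
y = 0.8 * x + rng.normal(size=n)    # hypothetical JSE index, cointegrated with x

dy, dx = np.diff(y), np.diff(x)
# Error-correction regression: dy_t = a + b*y_{t-1} + c*x_{t-1} + d*dx_t + e_t.
# A significantly negative b signals adjustment toward a long-run relation.
X = np.column_stack([np.ones(n - 1), y[:-1], x[:-1], dx])
(a, b, c, d), *_ = np.linalg.lstsq(X, dy, rcond=None)
print(f"adjustment speed b = {b:.2f}, implied long-run slope -c/b = {-c / b:.2f}")
```

In applied work this single-equation form is estimated within the ARDL bounds framework (e.g. via statsmodels' ARDL class), with lag orders chosen by information criteria rather than fixed as here.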

Keywords: bitcoin, stock market, interest rates, ARDL

Procedia PDF Downloads 93
2099 Failure Mode Analysis of a Multiple Layer Explosion Bonded Cryogenic Transition Joint

Authors: Richard Colwell, Thomas Englert

Abstract:

In cryogenic liquefaction processes, brazed aluminum core heat exchangers are used to minimize the surface area/volume of the exchanger. Aluminum alloy (5083-H321; UNS A95083) piping must transition to higher-melting-point 304L stainless steel piping outside of the heat exchanger kettle or cold box for safety reasons. Since aluminum alloys and austenitic stainless steel cannot be directly welded together, a transition joint consisting of five explosively bonded layers of different metals is used. Failures of two of these joints resulted in process shutdown and loss of revenue. Failure analyses, FEA analysis, and mock-up testing were performed by multiple teams to gain further understanding of the failure mechanisms involved.

Keywords: explosion bonding, intermetallic compound, thermal strain, titanium-nickel interface

Procedia PDF Downloads 202
2098 Experimental and Numerical Analysis on Enhancing Mechanical Properties of CFRP Adhesive Joints Using Hybrid Nanofillers

Authors: Qiong Rao, Xiongqi Peng

Abstract:

In this work, multi-walled carbon nanotubes (MWCNTs) and graphene nanoplates (GNPs) were dispersed into epoxy adhesive to investigate their synergy effects on the shear properties, mode I and mode II fracture toughness of unidirectional composite bonded joints. Testing results showed that the incorporation of MWCNTs and GNPs significantly improved the shear strength, the mode I and mode II fracture toughness by 36.6%, 45% and 286%, respectively. In addition, the fracture surfaces of the bonding area as well as the toughening mechanism of nanofillers were analyzed. Finally, a nonlinear cohesive/friction coupled model for delamination analysis of adhesive layer under shear and normal compression loadings was proposed and implemented in ABAQUS/Explicit via user subroutine VUMAT.
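The cohesive-zone idea behind the delamination model can be illustrated with a bilinear traction-separation law, the simplest common form. This is a generic sketch with illustrative stiffness and separation values, not the parameters of the study's VUMAT:

```python
def bilinear_traction(delta, delta0=0.01, deltaf=0.1, k=1000.0):
    """Bilinear cohesive law: linear loading up to the peak at separation
    delta0, then linear softening to complete failure at deltaf.
    Units and parameter values are illustrative."""
    if delta <= 0:
        return 0.0
    if delta < delta0:
        return k * delta                      # elastic (undamaged) branch
    if delta < deltaf:
        damage = (delta - delta0) / (deltaf - delta0)
        return k * delta0 * (1.0 - damage)    # softening branch
    return 0.0                                # fully failed, no traction

print(bilinear_traction(0.005), bilinear_traction(0.055), bilinear_traction(0.2))
```

The mode I and mode II fracture toughnesses reported above correspond, in this picture, to the areas under the respective traction-separation curves; the study's coupled model additionally adds friction under normal compression.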

Keywords: nanofillers, adhesive joints, fracture toughness, cohesive zone model

Procedia PDF Downloads 125
2097 Theoretical and Experimental Bending Properties of Composite Pipes

Authors: Maja Stefanovska, Svetlana Risteska, Blagoja Samakoski, Gari Maneski, Biljana Kostadinoska

Abstract:

The aim of this work is to determine the theoretical and experimental properties of filament-wound glass fiber/epoxy resin composite pipes with different winding designs subjected to bending. To determine the bending strength of the composite samples, three-point bending tests were conducted according to the ASTM D790 standard. Good correlation between theoretical and experimental results was obtained, with sample No. 4 showing the highest bending strength. All samples demonstrated matrix cracking and fiber failure followed by layer delamination during testing. It was also found that smaller winding angles lead to an increase in bending stress. SEM analysis confirmed good bonding between the glass fibers and the epoxy resin.
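For reference, the peak outer-fibre stress in a three-point bending test of a rectangular section is computed as sigma = 3FL/(2bd^2), the flexural-stress form used in ASTM D790. A small sketch with made-up dimensions (not the study's specimen sizes):

```python
def flexural_stress(F, L, b, d):
    """Peak outer-fibre stress in three-point bending of a rectangular
    section: sigma = 3*F*L / (2*b*d**2). SI units: N, m -> Pa."""
    return 3.0 * F * L / (2.0 * b * d**2)

# Hypothetical specimen: 500 N load, 100 mm span, 20 mm width, 4 mm depth
print(flexural_stress(500.0, 0.1, 0.02, 0.004))
```

For pipes (as tested here) the full tubular section modulus applies rather than the rectangular form, but the rectangular expression is the ASTM D790 baseline against which coupon-level bending strengths are usually reported.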

Keywords: bending properties, composite pipe, winding design, SEM

Procedia PDF Downloads 315
2096 A Report of a 5-Month-Old Baby with a Balanced Chromosomal Rearrangement along with Phenotypic Abnormalities

Authors: Mohit Kumar, Beklashwar Salona, Shiv Murti, Mukesh Singh

Abstract:

We report here the case of a five-month-old male baby, born as the second child of non-consanguineous parents with no notable history of genetic abnormality, who was referred to our cytogenetic laboratory for chromosomal analysis. Dysmorphic facial features, including a mongoloid face, cleft palate, and simian crease, as well as developmental delay, were observed. We present this case with a unique balanced autosomal translocation, t(3;10)(p21;p13). The risk of phenotypic abnormalities based on a de novo balanced translocation is estimated at 7%. The association of a balanced chromosomal rearrangement with Down syndrome features such as multiple congenital anomalies, facial dysmorphism, and congenital heart anomalies is very rare in a five-month-old male child. Trisomy 21 is a common chromosomal abnormality associated with birth defects, while balanced translocations are frequently observed in patients with secondary infertility or recurrent spontaneous abortion (RSA). For chromosomal analysis, two millilitres of heparinized peripheral blood was cultured in RPMI-1640 for 72 hours, supplemented with 20% fetal bovine serum, phytohemagglutinin (PHA), and antibiotics. A total of 30 metaphase images were captured using an Olympus BX51 microscope and analyzed using Bio-view karyotyping software through GTG-banding (G-bands by trypsin and Giemsa) according to the International System for Human Cytogenetic Nomenclature 2016. The results showed a balanced translocation between the short arm of chromosome 3 and the short arm of chromosome 10. The karyotype of the child was found to be 46,XY,t(3;10)(p21;p13). Chromosomal abnormalities are one of the major causes of birth defects in newborn babies. The index case presented with dysmorphic facial features and a balanced translocation 46,XY,t(3;10)(p21;p13). 
This translocation, with breakpoints at (p21;p13), has not been reported in the literature in a child with facial dysmorphism. To the best of our knowledge, this is the first report of the novel balanced translocation t(3;10) with these breakpoints in a child with dysmorphic features. We found a balanced chromosomal translocation rather than a trisomy or unbalanced aberrations, along with some phenotypic abnormalities. Therefore, we suggest that such novel balanced translocations with abnormal phenotypes should be reported in order to give pathologists, pediatricians, and gynecologists better insight into the intricacies of chromosomal abnormalities and their associated phenotypic features. We hypothesize that the dysmorphic features seen in this case may result from a change in the pattern of genes located at the breakpoint areas of the balanced translocation, or from deletion or mutation of genes located on the p-arms of chromosomes 3 and 10.

Keywords: balanced translocation, karyotyping, phenotypic abnormalities, facial dysmorphism

Procedia PDF Downloads 199
2095 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (i.e., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California, using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall 92% model accuracy. The combined model predicts more accurately than any of the individual models, and it reliably forecasts season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. 
This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
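The averaging step described above, soft-voting over the three top-performing models, can be sketched in a few lines. The per-model probabilities below are hypothetical placeholders, not outputs of the study's actual models:

```python
import numpy as np

# Hypothetical per-model probabilities that the next day's air quality
# is "unhealthy", for four forecast days.
p_logreg = np.array([0.62, 0.30, 0.55, 0.10])
p_forest = np.array([0.70, 0.20, 0.45, 0.15])
p_nnet   = np.array([0.58, 0.35, 0.60, 0.05])

# Average the three top-performing models (soft voting), then threshold.
p_ensemble = (p_logreg + p_forest + p_nnet) / 3
prediction = (p_ensemble >= 0.5).astype(int)
print(p_ensemble.round(3), prediction)
```

Averaging probabilities rather than hard class votes lets a confident model outweigh two marginal ones, which is one way an ensemble can beat each of its members, as reported in the abstract.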

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 108
2094 Changes in Postural Stability after Coordination Exercise

Authors: Ivan Struhár, Martin Sebera, Lenka Dovrtělová

Abstract:

The aim of this study was to find out whether a special type of exercise with an elastic cord can improve postural stability. The exercise programme was conducted twice a week for 3 months. The participants were randomly divided into an experimental group and a control group. An electronic balance board was used for testing postural stability. During the experiment, all participants trained for 18 hours without any special form of coordination programme; the experimental group additionally performed 90 minutes of coordination exercise. The results showed that differences between pre-test and post-test occurred in the experimental group. The nonparametric Wilcoxon signed-rank test for paired samples was used (p=0.012; 95% significance level). Effect size was calculated using Cohen's d: in the experimental group, d is 1.96, which indicates a large effect; in the control group, d is 0.04, which confirms no significant improvement.
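The two statistics reported above, a Wilcoxon signed-rank test on paired pre/post scores and Cohen's d for the effect size, can be reproduced in a few lines. The scores below are hypothetical stand-ins for the study's balance-board measurements:

```python
import numpy as np
from scipy import stats

# Hypothetical paired pre/post balance scores (lower = more stable),
# standing in for the experimental group's measurements.
pre  = np.array([12.1, 11.4, 13.0, 12.8, 11.9, 12.5, 13.2, 12.0])
post = np.array([10.2,  9.8, 11.0, 11.0, 10.4, 10.8, 11.1, 10.6])

# Nonparametric Wilcoxon signed-rank test for paired samples
stat, p = stats.wilcoxon(pre, post)

# Cohen's d for paired data: mean of the differences / SD of the differences
diff = pre - post
d = diff.mean() / diff.std(ddof=1)
print(f"p = {p:.4f}, Cohen's d = {d:.2f}")
```

Note that for paired designs Cohen's d is computed from the difference scores; by the usual conventions d above 0.8 counts as a large effect, matching the study's d = 1.96 for the experimental group.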

Keywords: balance board, balance training, coordination, stability

Procedia PDF Downloads 379