Search results for: validation

572 Enhanced Flight Dynamics Model to Simulate the Aircraft Response to Gust Encounters

Authors: Castells Pau, Poetsch Christophe

Abstract:

The effect of gust and turbulence encounters on aircraft is a wide field of study which allows different approaches, from high-fidelity multidisciplinary simulations to more simplified models adapted to industrial applications. The main goal is typically to predict the gust loads on the aircraft in order to ensure a safe design and achieve certification. Another widely studied topic is gust load reduction through an active control law. The impact of gusts on aircraft handling qualities is also of interest in the analysis of in-service events, so as to evaluate the aircraft response and the performance of the flight control laws. Traditionally, gust loads and handling qualities are addressed separately with different models adapted to the specific needs of each discipline. In this paper, an assessment of the differences between both models is presented, and a strategy to better account for the physics of gust encounters in a typical flight dynamics model is proposed, based on the model used for gust loads analysis. The applied corrections aim to capture the gust unsteady aerodynamics and propagation as well as the effect of dynamic flexibility at low frequencies. Results from the gust loads model at different flight conditions and measurements from real events are used for validation. A possible extension of steady aerodynamic nonlinearities to the low-frequency range is also assessed. The proposed corrections provide an effective means to evaluate the performance and possible adjustments of the flight control laws.

Keywords: flight dynamics, gust loads, handling qualities, unsteady aerodynamics

Procedia PDF Downloads 124
571 An Inverse Docking Approach for Identifying New Potential Anticancer Targets

Authors: Soujanya Pasumarthi

Abstract:

Inverse docking is a relatively new technique that has been used to identify potential receptor targets of small molecules. Our docking software package MDock is well suited for such an application, as it is computationally efficient while showing adequate results in binding affinity predictions and enrichment tests. As a validation study, we present the first-stage results of an inverse-docking study which seeks to identify potential direct targets of PRIMA-1. PRIMA-1 is well known for its ability to restore mutant p53's tumor suppressor function, leading to apoptosis in several types of cancer cells. For this reason, we believe that potential direct targets of PRIMA-1 identified in silico should be experimentally screened for their ability to inhibit cancer cell growth. The highest-ranked human protein in our PRIMA-1 docking results is oxidosqualene cyclase (OSC), which is part of the cholesterol synthetic pathway. The results of two follow-up experiments which treat OSC as a possible anti-cancer target are promising. We show that both PRIMA-1 and Ro 48-8071, a known potent OSC inhibitor, significantly reduce the viability of BT-474 breast cancer cells relative to normal mammary cells. In addition, like PRIMA-1, we find that Ro 48-8071 results in increased binding of mutant p53 to DNA in BT-474 cells (which highly express p53). For the first time, Ro 48-8071 is shown to be a potent agent in killing human breast cancer cells. The potential of OSC as a new target for developing anticancer therapies is worth further investigation.

Keywords: inverse docking, in silico screening, protein-ligand interactions, molecular docking

Procedia PDF Downloads 413
570 The Effect of Crack Size, Orientation and Number on the Elastic Modulus of a Cracked Body

Authors: Mark T. Hanson, Alan T. Varughese

Abstract:

Osteoporosis is a disease affecting bone quality which in turn can increase the risk of low-energy fractures. Treatment of osteoporosis using bisphosphonates has the beneficial effect of increasing bone mass but has at the same time been linked to the formation of atypical femoral fractures. This has led to increased study of micro-fractures in the bones of patients undergoing bisphosphonate treatment. One of the mechanics-related issues identified in this regard is the loss in stiffness of bones containing one or many micro-fractures. Different theories have been put forth using fracture mechanics to determine the effect of crack presence on elastic properties such as modulus. However, validation of these results in a deterministic way has not been forthcoming. The present analysis seeks to provide this deterministic evaluation of a fracture's effect on the elastic modulus. In particular, the effect of crack size, crack orientation, and crack number on elastic modulus is investigated. The Finite Element method is used to explicitly determine the elastic modulus reduction caused by the presence of cracks in a representative volume element. Single cracks of various lengths and orientations are examined, as well as cases of multiple cracks. Cracks in tension as well as under shear stress are considered. Although the focus is predominantly two-dimensional, some three-dimensional results are also presented. The results obtained show the explicit reduction in modulus caused by the crack size, orientation, and number parameters noted above. The present results allow the interpretation of the various theories which currently exist in the literature.

Keywords: cracks, elastic, fracture, modulus

Procedia PDF Downloads 86
569 2D Numerical Modeling of Ultrasonic Measurements in Concrete: Wave Propagation in a Multiple-Scattering Medium

Authors: T. Yu, L. Audibert, J. F. Chaix, D. Komatitsch, V. Garnier, J. M. Henault

Abstract:

Linear ultrasonic techniques play a major role in Non-Destructive Evaluation (NDE) of civil engineering structures in concrete since they can meet operational requirements. Interpretation of ultrasonic measurements could be improved by a better understanding of ultrasonic wave propagation in a multiple-scattering medium. This work aims to develop a 2D numerical model of ultrasonic wave propagation in a heterogeneous medium, like concrete, integrating the multiple-scattering phenomena in the SPECFEM software. The coherent field of multiple scattering is obtained by averaging numerical wave fields, and it is used to determine the effective phase velocity and attenuation corresponding to an equivalent homogeneous medium. First, this model is applied to one scattering element (a cylinder) in a homogeneous medium in a linear-elastic system, and it is validated through comparison with the analytical solution. Then, cases of multiple scattering by a set of randomly located cylinders or polygons are simulated to perform parametric studies on the influence of frequency and of scatterer size, concentration, and shape. The effective properties are also compared with the predictions of the Waterman-Truell model to verify its validity. Finally, the mortar's viscoelastic behavior is introduced in the simulation in order to consider the dispersion and attenuation due to porosity in the cement paste. In the future, further steps will be developed: comparisons with experimental results, interpretation of NDE measurements, and optimization of NDE parameters before auscultation.
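
As an illustration of the post-processing step described above, the sketch below estimates an effective phase velocity and attenuation from an ensemble of simulated wavefields at two receivers. It is a minimal sketch under assumed conventions (the receiver layout, array shapes, and sign handling are hypothetical, and phase unwrapping is ignored); the actual SPECFEM workflow is not reproduced here.

```python
import numpy as np

def effective_properties(fields, dt, dx, freq):
    """Effective phase velocity [m/s] and attenuation [Np/m] of the coherent
    field between two receivers separated by dx. fields is a hypothetical
    (n_realizations, 2, n_samples) array of time signals, one row per random
    scatterer configuration."""
    coherent = fields.mean(axis=0)          # ensemble average removes the incoherent coda
    spec = np.fft.rfft(coherent, axis=-1)
    freqs = np.fft.rfftfreq(coherent.shape[-1], d=dt)
    k = np.argmin(np.abs(freqs - freq))     # bin closest to the frequency of interest
    dphi = np.angle(spec[1, k]) - np.angle(spec[0, k])   # assumes |dphi| < pi (no unwrapping)
    v_phase = 2.0 * np.pi * freqs[k] * dx / abs(dphi)
    alpha = np.log(np.abs(spec[0, k]) / np.abs(spec[1, k])) / dx
    return v_phase, alpha
```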

Keywords: attenuation, multiple-scattering medium, numerical modeling, phase velocity, ultrasonic measurements

Procedia PDF Downloads 230
568 Tomato-Weed Classification by RetinaNet One-Step Neural Network

Authors: Dionisio Andujar, Juan López-Correa, Hugo Moreno, Angela Ri

Abstract:

The increased number of weeds in tomato crops sharply lowers yields. Weed identification using machine learning is important to carry out site-specific control. Recent advances in computer vision are a powerful tool to address the problem. The analysis of RGB (Red, Green, Blue) images through Artificial Neural Networks has developed rapidly in the past few years, providing new methods for weed classification. The development of algorithms for crop and weed species classification aims at a real-time classification system using object detection algorithms based on Convolutional Neural Networks. The study site was located in commercial corn fields. The classification system has been tested, and the procedure can detect and classify weed seedlings in tomato fields. The input to the Neural Network was a set of 10,000 RGB images with a natural infestation of Cyperus rotundus L., Echinochloa crus-galli L., Setaria italica L., Portulaca oleracea L., and Solanum nigrum L. The validation process was done with a random selection of RGB images containing the aforementioned species. The mean average precision (mAP) was established as the metric for object detection. The results showed agreement higher than 95%. The system will provide the input for an online spraying system. Thus, this work plays an important role in Site-Specific Weed Management by reducing herbicide use in a single step.

Keywords: deep learning, object detection, CNN, tomato, weeds

Procedia PDF Downloads 79
567 Classification of EEG Signals Based on Dynamic Connectivity Analysis

Authors: Zoran Šverko, Saša Vlahinić, Nino Stojković, Ivan Markovinović

Abstract:

In this article, the classification of target letters is performed using data from the EEG P300 Speller paradigm. Neural networks trained with the results of dynamic connectivity analysis between different brain regions are used for classification. Dynamic connectivity analysis is based on an adaptive window size and the imaginary part of the complex Pearson correlation coefficient. Brain dynamics are analysed using the relative intersection of confidence intervals for the imaginary component of the complex Pearson correlation coefficient method (RICI-imCPCC). The RICI-imCPCC method overcomes the shortcomings of currently used dynamic connectivity analysis methods: the low reliability and low temporal precision for short connectivity intervals encountered in constant sliding window analysis with a wide window, and the high susceptibility to noise encountered in constant sliding window analysis with a narrow window. It does so by dynamically adjusting the window size using the RICI rule, extracting information about brain connections for each time sample. Seventy percent of the extracted brain connectivity information is used for training and thirty percent for validation. Classification of the target word is also performed based on the same analysis method. To the best of our knowledge, this research shows for the first time that dynamic connectivity can be used as a parameter for classifying EEG signals.
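
To make the core quantity concrete, the sketch below computes the imaginary part of the complex Pearson correlation coefficient between two channels, assuming analytic signals obtained via the Hilbert transform; the adaptive RICI windowing itself is not reproduced. The function name and the example signals are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.signal import hilbert

def imaginary_cpcc(x, y):
    """Imaginary part of the complex Pearson correlation coefficient between
    two real EEG channel segments x and y (converted to analytic signals)."""
    xa, ya = hilbert(x), hilbert(y)
    xa = xa - xa.mean()
    ya = ya - ya.mean()
    r = np.sum(xa * np.conj(ya)) / (
        np.sqrt(np.sum(np.abs(xa) ** 2)) * np.sqrt(np.sum(np.abs(ya) ** 2)))
    return r.imag   # insensitive to zero-lag (volume conduction) coupling

# Example: two noisy channels sharing a 90-degree phase-shifted 10 Hz component
t = np.linspace(0, 1, 500)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
y = np.cos(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
print(imaginary_cpcc(x, y))
```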

Keywords: dynamic connectivity analysis, EEG, neural networks, Pearson correlation coefficients

Procedia PDF Downloads 172
566 Design and Analysis of a Lightweight Fire-Resistant Door

Authors: Zainab Fadil, Mouath Alawadhi, Abdullah Alhusainan, Fahad Alqadiri, Abdulaziz Alqadiri

Abstract:

This study investigates how a lightweight fire-resistant door performs with different types of insulation materials. Data were initially collected from various websites, scientific books, and research papers. Results show that several layers of insulation in a single door can perform better than one insulator. Furthermore, insulation materials that are lightweight, high strength, and of low thermal conductivity are the most preferred for fire-rated doors, whereas heavy, low-strength, high-thermal-conductivity materials are the least preferred. Fire-rated door specifications, theoretical test methodology, structural analysis, and a comparison between five different models with diverse insulation layers are presented. Five door models with different insulation materials and arrangements are investigated. Model 1 contains an air gap between the door layers. Model 2 includes phenolic foam, mild steel, and polyurethane. Model 3 includes phenolic foam and glass wool. Model 4 includes polyurethane and glass wool. Model 5 includes only rock wool between the door layers. Model 5 is the most efficient model, and its design is simple compared to the other models. For this model, numerical calculations were performed to check its efficiency, and the results were compared to experimental data for validation. Good agreement was observed.
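
The thermal side of such a comparison reduces, in the simplest steady-state view, to the series thermal resistance of the door layers. The sketch below computes an overall U-value for a rock-wool core between steel skins; the layer thicknesses, conductivities, and surface film coefficients are placeholder values, not the paper's data.

```python
# Hypothetical door build-up: (name, thickness [m], thermal conductivity [W/(m.K)])
layers = [
    ("steel skin", 0.002, 45.0),
    ("rock wool",  0.050, 0.040),
    ("steel skin", 0.002, 45.0),
]

def u_value(layers, h_in=8.0, h_out=25.0):
    """1-D steady-state U-value: surface film resistances plus sum of t/k."""
    r_total = 1.0 / h_in + 1.0 / h_out + sum(t / k for _, t, k in layers)
    return 1.0 / r_total  # W/(m^2.K)

print(f"U = {u_value(layers):.2f} W/(m^2.K)")  # lower U -> better insulation
```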

Keywords: fire resistance, insulation, strength, thermal conductivity, lightweight, layers

Procedia PDF Downloads 56
565 The Neutrophil-to-Lymphocyte Ratio after Surgery for Hip Fracture in a New, Simple, and Objective Score to Predict Postoperative Mortality

Authors: Philippe Dillien, Patrice Forget, Harald Engel, Olivier Cornu, Marc De Kock, Jean Cyr Yombi

Abstract:

Introduction: Hip fracture commonly precedes death in elderly people. Identification of high-risk patients may help target the patients in whom optimal management, resource allocation, and trial efficiency are needed. The aim of this study is to construct a predictive score of mortality after hip fracture on the basis of the objective prognostic factors available: the neutrophil-to-lymphocyte ratio (NLR), age, and sex. C-reactive protein (CRP) is also considered as an alternative to the NLR. Patients and methods: After IRB approval, we analyzed our prospective database including 286 consecutive patients with hip fracture. A score was constructed combining age (1 point per decade above 74 years), sex (1 point for males), and NLR at postoperative day +5 (1 point if >5). A receiver operating characteristic (ROC) curve analysis was performed. Results: Of the 286 patients included, 235 were analyzed (72 males and 163 females, 30.6%/69.4%), with a median age of 84 (range: 65 to 102) years and mean NLR values of 6.47+/-6.07. At one year, 82/280 patients had died (29.3%). Graphical analysis and the log-rank test confirm a highly statistically significant difference (P<0.001). Performance analysis shows an AUC of 0.72 [95%CI 0.65-0.79]. CRP shows no advantage over the NLR. Conclusion: We have developed a score based on age, sex, and the NLR to predict the risk of mortality at one year in elderly patients after surgery for a hip fracture. After external validation, it may be included in clinical practice as well as in clinical research to stratify the risk of postoperative mortality.
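
A minimal sketch of the score and its discrimination analysis is shown below, assuming the age term is rounded up per started decade above 74 (the abstract does not spell out the rounding, so that detail is an assumption, as are the toy data).

```python
import math
import numpy as np
from sklearn.metrics import roc_auc_score

def nlr_score(age, is_male, nlr_day5):
    """1 point per decade above 74 years (rounding assumed), 1 point for male
    sex, 1 point if the postoperative day +5 NLR exceeds 5."""
    points = max(0, math.ceil((age - 74) / 10))
    points += 1 if is_male else 0
    points += 1 if nlr_day5 > 5 else 0
    return points

# Toy cohort: (age, male?, NLR at day +5), with 1-year death indicator
patients = [(84, True, 7.2), (67, False, 3.1), (92, False, 9.8), (78, True, 4.0)]
died = np.array([1, 0, 1, 0])
scores = np.array([nlr_score(*p) for p in patients])
print("AUC:", roc_auc_score(died, scores))
```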

Keywords: neutrophil-to-lymphocyte ratio, hip fracture, postoperative mortality, medical and health sciences

Procedia PDF Downloads 389
564 Highly Glazed Office Spaces: Simulated Visual Comfort vs Real User Experiences

Authors: Zahra Hamedani, Ebrahim Solgi, Henry Skates, Gillian Isoardi

Abstract:

Daylighting plays a pivotal role in promoting productivity and user satisfaction in office spaces. There is an ongoing trend of designing office buildings with a high proportion of glazing, which increases the risk of visual discomfort. Providing a more realistic lighting analysis can be of high value at the early stages of building design, when necessary changes can be made at very low cost. This holistic approach can be achieved by incorporating subjective evaluation and user behaviour into computer simulation, providing a comprehensive lighting analysis. In this research, a detailed computer simulation model was built using Radiance and Daysim and then validated by measurements and user feedback. The case study building is the school of science at Griffith University, Gold Coast, Queensland, which features highly glazed office spaces. In this paper, the visual comfort predicted by the model is compared with a preliminary survey of the building users to evaluate how user behaviour, such as desk position, orientation selection, and user movement caused by daylight changes and other visual variations, can inform perceptions of visual comfort. This work supports preliminary design analysis of visual comfort incorporating the effects of gaze shift patterns and views, with the goal of designing effective layouts for office spaces.

Keywords: lighting simulation, office buildings, user behaviour, validation, visual comfort

Procedia PDF Downloads 181
563 A Laboratory-Designed Activity in Ecology to Demonstrate the Allelopathic Property of the Philippine Chromolaena odorata L. (King and Robinson) Leaf Extracts

Authors: Lina T. Codilla

Abstract:

This study designed a laboratory activity in ecology to demonstrate the allelopathic property of Philippine Chromolaena odorata L. (hagonoy) leaf extracts on Lycopersicum esculentum (M), commonly known as tomato. Ethanol extracts of C. odorata leaves were tested on seed germination and seedling growth of L. esculentum over 7-day and 14-day observation periods. Analysis of variance and Tukey's HSD post hoc test were used to determine differences among treatments, while a pre-test/post-test experimental design was used to determine the effectiveness of the designed laboratory activity. Results showed that the 0.5% concentration of ethanol leaf extract significantly inhibited germination and seedling growth of L. esculentum in both observation periods. These results were used as the basis for the development of instructional material in ecology. The laboratory activity underwent face validation by five (5) experts in various fields of specialization, namely Biological Sciences, Chemistry, and Science Education. The readability of the designed laboratory activity was determined using a Cloze test. Pilot testing showed that the laboratory activity developed is a very effective tool for supplementing learning about allelopathy in an ecology class. It is thus recommended for use in ecology classes, with modifications on a small-scale basis to reduce the time required.
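
The treatment comparison described above maps directly onto a one-way ANOVA followed by Tukey's HSD. The sketch below shows the pattern with made-up germination counts; the concentrations and counts are illustrative, not the study's data.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical germinated-seed counts per dish for three treatments
control = [18, 19, 20, 17, 19]
conc_01 = [15, 14, 16, 15, 13]   # e.g. 0.1% extract
conc_05 = [6, 7, 5, 8, 6]        # e.g. 0.5% extract

f_stat, p_value = stats.f_oneway(control, conc_01, conc_05)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4g}")

values = np.concatenate([control, conc_01, conc_05])
groups = ["control"] * 5 + ["0.1%"] * 5 + ["0.5%"] * 5
print(pairwise_tukeyhsd(values, groups, alpha=0.05))  # which pairs differ
```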

Keywords: allelopathy, Chromolaena odorata L. (hagonoy), designed laboratory activity, organic herbicide, students' performance

Procedia PDF Downloads 268
562 Development of GIS-Based Geotechnical Guidance Maps for Prediction of Soil Bearing Capacity

Authors: Q. Toufeeq, R. Kauser, U. R. Jamil, N. Sohaib

Abstract:

Foundation design of a structure needs soil investigation to avoid failures due to settlement. Such soil investigation is expensive and time-consuming. The development of new residential societies involves extensive leveling of large sites, accompanied by heavy land filling. Poor landfill practices at deep depths cause differential settlement and consolidation of the underlying soil, which sometimes results in the collapse of structures. The extent of filling remains unknown to the individual developer unless a soil investigation is carried out, and soil investigation cannot be performed on every available site due to the costs involved. However, a fair estimate of bearing capacity can be made if such tests have already been done in the surrounding areas. Geotechnical guidance maps can provide a fair assessment of soil properties. Previously, GIS-based approaches have used extrapolation and interpolation techniques to develop maps of bearing capacity, underground recharge, soil classification, geological hazards, landslide hazards, socio-economics, and soil liquefaction. Standard penetration test (SPT) data from surrounding sites were already available. Google Earth was used for digitization of the collected data. A few points were set aside for data calibration and validation. The resulting Geographic Information System (GIS)-based guidance maps are helpful for anticipating bearing capacity in the real estate industry.
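
The keywords point to inverse distance weighting as the interpolation step; a minimal sketch of that step is given below. The coordinates, capacities, and power parameter are illustrative assumptions.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse-distance-weighted interpolation.
    xy_known: (n, 2) borehole coordinates; values: (n,) SPT-derived bearing
    capacities; xy_query: (m, 2) points where an estimate is wanted."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)          # avoid division by zero at sampled points
    w = 1.0 / d ** power
    return (w @ values) / w.sum(axis=1)

boreholes = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
capacity_kpa = np.array([150.0, 220.0, 180.0])
print(idw(boreholes, capacity_kpa, np.array([[50.0, 50.0]])))  # interpolated kPa
```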

Keywords: bearing capacity, soil classification, geographical information system, inverse distance weighted, radial basis function

Procedia PDF Downloads 107
561 Constraining the Potential Nickel Laterite Area Using Geographic Information System-Based Multi-Criteria Rating in Surigao Del Sur

Authors: Reiner-Ace P. Mateo, Vince Paolo F. Obille

Abstract:

The traditional method of classifying potential mineral resources requires a significant amount of time and money. This paper presents an alternative way to classify potential mineral resources through a GIS application in Surigao del Sur. The three (3) analog map data inputs integrated into GIS are a geologic map, a topographic map, and a land cover/vegetation map. The indicators used in the classification of potential nickel laterite from these inputs are a geologic indicator, namely the presence of ultramafic rock from the geologic map; a slope indicator and the presence of plateau edges from the topographic map; and areas of forest land, grassland, and shrubland from the land cover/vegetation map. The mineral potential of the area was classified from low to very high. The produced mineral potential classification map of Surigao del Sur shows an estimated 4.63% low, 42.15% medium, 43.34% high, and 9.88% very high nickel laterite potential across its ultramafic terrains. For validation, the produced map was compared with known occurrences of nickel laterite in the area using a nickel mining tenement map with the application of remote sensing. Three (3) prominent nickel mining companies were delineated in the study area. The generated potential classification map of nickel laterite in Surigao del Sur may aid mining companies currently in the exploration phase in the study area. Also, the currently operating nickel mines in the study area can help validate the reliability of the produced mineral classification map.

Keywords: mineral potential classification, nickel laterites, GIS, remote sensing, Surigao del Sur

Procedia PDF Downloads 97
560 Experimental Set-up for the Thermo-Hydric Study of a Wood Chips Bed Crossed by an Air Flow

Authors: Dimitri Bigot, Bruno Malet-Damour, Jérôme Vigneron

Abstract:

Many studies have been made on the use of bio-based materials in buildings. The goal is to reduce buildings' environmental footprint by analyzing their life cycle, which can lead to lower carbon emissions or energy consumption. A previous work numerically studied the feasibility of using wood chips to regulate relative humidity inside a building. It showed the capability of a wood chips bed to regulate humidity inside the building and to improve thermal comfort, and so potentially to reduce building energy consumption. However, it also showed that some physical parameters of the wood chips must be identified to validate the proposed model and the associated results. This paper presents an experimental setup able to study such a wood chips bed under different solicitations. It consists of a simple duct filled with wood chips and crossed by an air flow with variable temperature and relative humidity. Its main objective is to study the thermal behavior of the wood chips bed by controlling the temperature and relative humidity of the air that enters it and by observing the same parameters at the outlet. First, the experimental setup is described in light of previous results, with a focus on the particular properties that have to be characterized. Then, some case studies are presented in relation to the previous results in order to identify the key physical properties. Finally, the feasibility of the proposed technology is discussed, and some model validation paths are given.

Keywords: wood chips bed, experimental set-up, bio-based material, desiccant, relative humidity, water content, thermal behaviour, air treatment

Procedia PDF Downloads 94
559 Acoustic Modeling of a Data Center with a Hot Aisle Containment System

Authors: Arshad Alfoqaha, Seth Bard, Dustin Demetriou

Abstract:

A new multi-physics acoustic modeling approach using ANSYS Mechanical FEA and FLUENT CFD methods is developed for modeling servers mounted in racks, such as IBM Z and IBM Power Systems, in data centers. This new approach allows users to determine the thermal and acoustic conditions that people are exposed to within the data center. The sound pressure level (SPL) exposure for a human working inside a hot aisle containment system in the data center is studied. The SPL is analyzed at the noise source, at the human body, on the rack walls, on the containment walls, and on the ceiling and flooring plenum walls. In the acoustic CFD simulation, it is assumed that a four-inch-diameter sphere with monopole acoustic radiation, placed in the middle of each rack, provides a single-source representation of all noise sources within the rack. The Ffowcs Williams and Hawkings (FWH) acoustic model is employed. The target frequency is 1000 Hz, and the total simulation time for the transient analysis is 1.4 seconds, with a very small time step of 3e-5 seconds and 10 iterations to ensure convergence and accuracy. A User Defined Function (UDF) is developed to accurately simulate the acoustic noise source, and a dynamic mesh is applied to ensure acoustic wave propagation. Initial validation of the acoustic CFD simulation is performed against a closed-form solution for the spherical propagation of an acoustic point source.
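
The closed-form check mentioned at the end is just spherical spreading from a point monopole; a sketch is below. The source power and distance are example numbers, not values from the study.

```python
import numpy as np

RHO, C = 1.21, 343.0    # air density [kg/m^3] and speed of sound [m/s]
P_REF = 2e-5            # reference pressure [Pa]

def monopole_spl(acoustic_power_w, r_m):
    """Free-field SPL at distance r from a point monopole:
    intensity I = W / (4*pi*r^2) and p_rms^2 = rho * c * I."""
    p_sq = RHO * C * acoustic_power_w / (4.0 * np.pi * r_m ** 2)
    return 10.0 * np.log10(p_sq / P_REF ** 2)

print(f"{monopole_spl(1e-3, 1.0):.1f} dB")  # ~79 dB for a 1 mW source at 1 m
```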

Keywords: data centers, FLUENT, acoustics, sound pressure level, SPL, hot aisle containment, IBM

Procedia PDF Downloads 149
558 Simulation of Glass Breakage Using Voronoi Random Field Tessellations

Authors: Michael A. Kraus, Navid Pourmoghaddam, Martin Botz, Jens Schneider, Geralt Siebert

Abstract:

Fragmentation analysis of tempered glass gives insight into the quality of the tempering process and also defines a certain degree of safety. Different standards, such as the European EN 12150-1 or the American ASTM C 1048/CPSC 16 CFR 1201, define a minimum number of fragments required for soda-lime safety glass classification on the basis of fragmentation test results. This work presents an approach to glass breakage pattern prediction using a Voronoi tessellation over random fields. The random Voronoi tessellation is trained with, and validated against, data from several breakage patterns. The fragments in observation areas of 50 mm x 50 mm were used for training and validation. All glass specimens used in this study were commercially available soda-lime glasses at three thickness levels of 4 mm, 8 mm, and 12 mm. The results of this work form a Bayesian framework for the training and prediction of breakage patterns of tempered soda-lime glass using a Voronoi random field tessellation. Uncertainties occurring in this process can be well quantified, and several statistical measures of the pattern can be preserved with this method. Within this work, it was found that different random fields as the basis for the Voronoi tessellation lead to differing goodness of fit of the statistical properties of the glass breakage patterns. As the methodology is derived and kept general, the framework could also be applied to other random tessellations and crack pattern modelling purposes.
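
For orientation, the sketch below builds the simplest special case of such a model: a Voronoi tessellation seeded by a homogeneous Poisson point process in a 50 mm x 50 mm window. The seed intensity is a placeholder; the paper's random-field-modulated intensity and Bayesian fitting are not reproduced.

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(0)
window = 50.0          # observation window edge length [mm]
intensity = 0.05       # hypothetical seed density [seeds per mm^2]

n_seeds = rng.poisson(intensity * window ** 2)
seeds = rng.uniform(0.0, window, size=(n_seeds, 2))
vor = Voronoi(seeds)

# Each bounded Voronoi region is a candidate glass fragment
bounded = [r for r in vor.regions if r and -1 not in r]
print(f"{n_seeds} seeds -> {len(bounded)} bounded fragments")
```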

Keywords: glass breakage prediction, Voronoi random field tessellation, fragmentation analysis, Bayesian parameter identification

Procedia PDF Downloads 132
557 The Development of an Automated Computational Workflow to Prioritize Potential Resistance Variants in HIV Integrase Subtype C

Authors: Keaghan Brown

Abstract:

The prioritization of drug resistance mutations impacting protein folding or protein-drug and protein-DNA interactions within macromolecular systems is critical to the success of treatment regimens. With the continual increase in computational tools to assess these impacts, scalability and reproducibility have become essential components of computational analysis and experimental research. Here we introduce a bioinformatics pipeline that combines several structural analysis tools in a simplified workflow, optimizing the available computational hardware and software to automatically ease the flow of data transformations. Utilizing pre-established software tools, it was possible to develop a pipeline with a set of pre-defined functions that automate mutation introduction into the HIV-1 Integrase protein structure, calculate the gain and loss of polar interactions, and calculate the change in protein folding energy. Additionally, an automated molecular dynamics analysis was implemented, which reduces the constant need for user input and output management. The resulting pipeline, Automated Mutation Introduction and Analysis (AMIA), is an open-source set of scripts designed to introduce and analyse the effects of mutations on the static protein structure as well as on the multi-conformational states from molecular dynamics simulations. The workflow allows the user to visualize all outputs in a user-friendly manner, thereby enabling the prioritization of variant systems for experimental validation.

Keywords: automated workflow, variant prioritization, drug resistance, HIV Integrase

Procedia PDF Downloads 40
556 Realization of Hybrid Beams Inertial Amplifier

Authors: Somya Ranjan Patro, Abhigna Bhatt, Arnab Banerjee

Abstract:

Inertial amplifiers have recently gained increasing attention as a new mechanism for vibration control of structures. Theoretical investigations are currently being undertaken by researchers to reveal their fundamentals and to understand the underlying principles by which they alter the structural response to dynamic loading. This paper presents experimental and analytical studies on the dynamic characteristics of a hybrid beam inertial amplifier (HBIA). The analytical formulation of the HBIA has been derived by implementing the spectral element method and rigid body dynamics. This formulation gives the relation between the dynamic force and the response of the structure in the frequency domain. Further, experiments have been performed to validate the proposed HBIA. The experimental setup consists of a 3D-printed HBIA of polylactic acid (PLA) material screwed to the base plate of a shaker system. Two accelerometers are used to study the response: one at the base plate of the shaker and a second placed at the top of the inertial amplifier. A force transducer is also placed between the base plate and the inertial amplifier to measure the total load transferred from the base plate to the inertial amplifier. The time-domain responses obtained from the accelerometers were converted into the frequency domain using the Fast Fourier Transform (FFT) algorithm. The experimental transmittance values are successfully validated against the analytical results, providing essential confidence in the proposed methodology.
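
The transmittance computation described above amounts to a spectral ratio of the two accelerometer signals. Below is a minimal sketch of that step; the helper name and the plain spectral ratio are illustrative. In practice an H1 estimator built from cross- and auto-spectra (e.g. scipy.signal.csd and scipy.signal.welch) would be more robust to measurement noise.

```python
import numpy as np

def transmittance_db(base_acc, top_acc, fs):
    """Magnitude ratio of top-to-base acceleration spectra in dB.
    base_acc, top_acc: equal-length time signals sampled at fs [Hz]."""
    freqs = np.fft.rfftfreq(len(base_acc), d=1.0 / fs)
    h = np.abs(np.fft.rfft(top_acc)) / np.abs(np.fft.rfft(base_acc))
    return freqs, 20.0 * np.log10(h)
```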

Keywords: inertial amplifier, fast Fourier transform, natural frequencies, polylactic acid, transmittance, vibration absorbers

Procedia PDF Downloads 71
555 Stability-Indicating High-Performance Thin-Layer Chromatography Method for Estimation of Naftopidil

Authors: P. S. Jain, K. D. Bobade, S. J. Surana

Abstract:

A simple, selective, precise, and stability-indicating high-performance thin-layer chromatographic method for the analysis of naftopidil, both in bulk and in pharmaceutical formulation, has been developed and validated. The method employed HPTLC aluminium plates precoated with silica gel as the stationary phase. The solvent system consisted of hexane: ethyl acetate: glacial acetic acid (4:4:2 v/v). The system was found to give a compact spot for naftopidil (Rf value of 0.43±0.02). Densitometric analysis of naftopidil was carried out in absorbance mode at 253 nm. The linear regression analysis data for the calibration plots showed a good linear relationship, with r² = 0.999±0.0001 with respect to peak area in the concentration range of 200-1200 ng per spot. The method was validated for precision, recovery, and robustness. The limits of detection and quantification were 20.35 and 61.68 ng per spot, respectively. Naftopidil was subjected to acid and alkali hydrolysis, oxidation, and thermal degradation, and was found to degrade under all of these conditions. The degradation product was well resolved from the pure drug, with a significantly different Rf value. Statistical analysis shows that the method is repeatable, selective, and accurate for the estimation of the investigated drug. The proposed HPTLC method can be applied for the identification and quantitative determination of naftopidil in bulk drug and pharmaceutical formulation.
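
The linearity and detection limits quoted above follow the usual calibration-curve arithmetic (ICH-style LOD = 3.3σ/S and LOQ = 10σ/S, with σ the residual standard deviation and S the slope). The sketch below reproduces that arithmetic on made-up peak-area data, not the paper's measurements.

```python
import numpy as np
from scipy import stats

# Hypothetical calibration data: amount per spot [ng] vs. densitometric peak area
x = np.array([200, 400, 600, 800, 1000, 1200], dtype=float)
y = np.array([1510, 2980, 4490, 6010, 7480, 9020], dtype=float)

fit = stats.linregress(x, y)
residual_sd = np.std(y - (fit.slope * x + fit.intercept), ddof=2)

lod = 3.3 * residual_sd / fit.slope
loq = 10.0 * residual_sd / fit.slope
print(f"r^2 = {fit.rvalue**2:.4f}, LOD = {lod:.1f} ng/spot, LOQ = {loq:.1f} ng/spot")
```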

Keywords: naftopidil, HPTLC, validation, stability, degradation

Procedia PDF Downloads 377
554 High-Resolution Spatiotemporal Retrievals of Aerosol Optical Depth from Geostationary Satellite Using SARA Algorithm

Authors: Muhammad Bilal, Zhongfeng Qiu

Abstract:

Aerosols, suspended particles in the atmosphere, play an important role in the Earth's energy budget, climate change, degradation of atmospheric visibility, urban air quality, and human health. To fully understand aerosol effects, retrieval of aerosol optical properties such as aerosol optical depth (AOD) at high spatiotemporal resolution is required. Therefore, in the present study, hourly AOD observations at 500 m resolution were retrieved from the Geostationary Ocean Color Imager (GOCI) using the Simplified Aerosol Retrieval Algorithm (SARA) over the urban area of Beijing for the year 2016. The SARA requires top-of-atmosphere (TOA) reflectance, solar and sensor geometry information, and surface reflectance observations to retrieve an accurate AOD. For validation of the GOCI-retrieved AOD, AOD measurements were obtained from the Aerosol Robotic Network (AERONET) version 3 level 2.0 (cloud-screened and quality-assured) data. The errors and uncertainties were reported using the root mean square error (RMSE), relative percent mean error (RPME), and the expected error (EE = ±(0.05 + 0.15 × AOD)). Results showed that the high spatiotemporal GOCI AOD observations were well correlated with the AERONET AOD measurements, with a correlation coefficient (R) of 0.92, an RMSE of 0.07, and an RPME of 5%, and 90% of the observations fell within the EE. The results suggested that the SARA is robust and able to retrieve high-resolution spatiotemporal AOD observations over an urban area using a geostationary satellite.
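
The validation statistics reported here are straightforward to compute once retrieved and reference AODs are matched in time and space; a sketch is below (the matched arrays are assumed inputs, and the example values are illustrative).

```python
import numpy as np

def validate_aod(retrieved, reference):
    """RMSE, correlation, and fraction within the expected error envelope
    EE = +/-(0.05 + 0.15 * AOD) for matched AOD pairs."""
    retrieved = np.asarray(retrieved, dtype=float)
    reference = np.asarray(reference, dtype=float)
    rmse = np.sqrt(np.mean((retrieved - reference) ** 2))
    r = np.corrcoef(retrieved, reference)[0, 1]
    ee = 0.05 + 0.15 * reference
    frac_within_ee = np.mean(np.abs(retrieved - reference) <= ee)
    return rmse, r, frac_within_ee

rmse, r, frac = validate_aod([0.31, 0.52, 0.18], [0.28, 0.55, 0.20])
print(f"RMSE={rmse:.3f}, R={r:.2f}, within EE: {frac:.0%}")
```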

Keywords: AERONET, AOD, SARA, GOCI, Beijing

Procedia PDF Downloads 140
553 Global Developmental Delay and Its Association with Risk Factors: Validation by Structural Equation Modelling

Authors: Bavneet Kaur Sidhu, Manoj Tiwari

Abstract:

Global Developmental Delay (GDD) is a common pediatric condition; however, its etiologies may differ in developing countries. In the last decade, sporadic families have been reported in various countries. To the authors' best knowledge, many risk factors and their correlation with the prevalence of GDD have been studied, but a statistical correlation analysis has not been performed. We therefore propose the present study, targeting the risk factors and prevalence of GDD and their statistical correlation. The FMR1 gene was studied to confirm the disease and its penetrance. A complete questionnaire-based survey was designed for the statistical studies, covering personal, past, and present medical history along with socio-economic status. Methods: We divided the children into four age groups at 5-year intervals and applied structural equation modeling (SEM) techniques, Spearman's rank correlation coefficient, Karl Pearson's correlation coefficient, and the chi-square test. Results: A total of 1100 families were enrolled in this study; among them, 330 were clinically and biologically confirmed (radiological studies) for the disease, of whom 204 were males (61.8%) and 126 were females (38.18%). We found that 27.87% of cases were genetic and 72.12% sporadic; of the sporadic cases, 43.28% were from urban and 56.72% from rural localities. The mothers' literacy rate was 32.12%, and 41.21% of mothers were employed. Conclusions: There is a significant association between mother's age and GDD prevalence, followed by mother's literacy rate and mother's occupation, whereas there was no association between father's age and GDD.

Keywords: global developmental delay, FMR1 gene, Spearman's rank correlation coefficient, structural equation modeling

Procedia PDF Downloads 107
552 Revisiting the Historical Narratives of the Old Churches in Albay, Bikol Region, Philippines

Authors: Ruby Ann L. Ayo

Abstract:

As cultural heritage reflects the historical origin of a certain group of people, it reveals the customs, traits, beliefs, practices, and even values they have held for years. Tangible examples of cultural heritage include physical structures such as old churches. The study looked into the existing historical narratives of four century-old Catholic churches in the Province of Albay, Bikol Region, Philippines: Nuestra Señora de Salvacion in Joroan, Tiwi, Albay; Our Lady of the Gate in Daraga, Albay; San Juan de Bautista in Tabaco City; and St. John the Baptist in Camalig, Albay. The historical narratives were analysed in terms of the validity and reliability of the secondary documents, with reference to the elements of history, revealing the consistency and adequacy of the historical facts. The contents were examined using a modified Checklist of Historical Documents. The historical narratives were likewise submitted to content experts for validation as regards historical authenticity and accuracy. The contents of the narratives were scrutinized according to the following codes: (1.1) the patron saints; (1.2) factors that led to the churches' construction; (1.3) the people responsible for their construction; (1.4) misconceptions about their construction; and (1.5) their contributions to Bikol heritage. Based on the codes, themes were identified as: (2.1) Marian devotees and Christ-centered patron saints; (2.2) geographical, socio-political, and cultural factors; (2.3) church and government officials; (2.4) misconceptions about the dates of construction and original sites; and (2.5) popular pilgrim sites and well-admired architectural designs.

Keywords: historical narratives, old churches, cultural heritage, historical validity and reliability, elements of history

Procedia PDF Downloads 268
551 Unlocking the Puzzle of Borrowing Adult Data for Designing Hybrid Pediatric Clinical Trials

Authors: Rajesh Kumar G

Abstract:

A challenging aspect of any clinical trial is to carefully plan the study design to meet the study objective in an optimal way and to validate the assumptions made during protocol design. In a pediatric study, there is the added challenge of stringent guidelines and difficulty in recruiting the necessary subjects. Unlike adult trials, not much historical data is available for pediatrics, yet such data is required to validate assumptions when planning pediatric trials. Typically, pediatric studies are initiated as soon as approval is obtained for a drug to be marketed for adults, so with the historical information from the adult study and the available pediatric pilot study data or simulated pediatric data, the pediatric study can be well planned. Generalizing a historical adult study to a new pediatric study is a tedious task; however, it is possible by integrating various statistical techniques and exploiting the advantages of a hybrid study design, which helps achieve the study objective smoothly even in the presence of many constraints. This paper explains how a hybrid study design can be planned together with an integrated technique (SEV) to plan the pediatric study. In brief, the SEV technique (Simulation, Estimation using borrowed adult data and Bayesian methods, and Validation) simulates the planned study data and obtains the desired estimates to validate the assumptions. This method of validation can be used to improve the accuracy of data analysis, ensuring that results are as valid and reliable as possible, and allows informed decisions to be made well ahead of study initiation. Based on the collected data, this technique offers insight into best practices when using data from a historical study and simulated data alike.
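
One standard way to borrow adult data in the estimation step is a power prior, which down-weights the historical likelihood by a factor a0 in [0, 1]. The sketch below shows the conjugate beta-binomial case for a response rate; all counts and the borrowing weight are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Historical adult data and (pilot or simulated) pediatric data, hypothetical
n_adult, x_adult = 200, 120    # adult subjects and responders
n_ped, x_ped = 30, 17          # pediatric subjects and responders
a0 = 0.5                       # power-prior borrowing weight in [0, 1]

# Beta(1, 1) initial prior; the power prior scales the adult counts by a0
alpha = 1 + a0 * x_adult + x_ped
beta = 1 + a0 * (n_adult - x_adult) + (n_ped - x_ped)

draws = np.random.default_rng(1).beta(alpha, beta, size=100_000)
lo, hi = np.quantile(draws, [0.025, 0.975])
print(f"posterior mean = {draws.mean():.3f}, 95% CrI = ({lo:.3f}, {hi:.3f})")
```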

Keywords: adaptive design, simulation, borrowing data, Bayesian model

Procedia PDF Downloads 42
550 Application of Nonparametric Geographically Weighted Regression to Evaluate the Unemployment Rate in East Java

Authors: Sifriyani Sifriyani, I Nyoman Budiantara, Sri Haryatmi, Gunardi Gunardi

Abstract:

East Java Province ranks first among Indonesian provinces in the number of counties and cities and has the largest population. In 2015, the population reached 38,847,561, a figure reflecting very high population growth. High population growth is feared to increase unemployment. In this study, the researchers mapped and modeled the unemployment rate with six variables presumed to influence it. Modeling was done by nonparametric geographically weighted regression with a truncated spline approach. This method was chosen because the spline is flexible; such models tend to find their own estimates. In this modeling, there were knot points, which mark changes in the behavior of the data. The optimum knot points were selected by choosing the minimum value of Generalized Cross Validation (GCV). Based on the research, six variables were found to affect the level of unemployment in East Java: the percentage of the population educated beyond high school, the rate of economic growth, the population density, the ratio of investment to the total labor force, the regional minimum wage, and the ratio of the number of large and medium-scale industries to the workforce. The nonparametric geographically weighted regression model with a truncated spline approach had a coefficient of determination of 98.95% and an MSE of 0.0047.

Keywords: East Java, nonparametric geographically weighted regression, spatial, spline approach, unemployment rate

Procedia PDF Downloads 290
549 A Study of Laminar Natural Convection in Annular Spaces between Differentially Heated Horizontal Circular Cylinders Filled with Non-Newtonian Nano Fluids

Authors: Behzad Ahdiharab, Senol Baskaya, Tamer Calisir

Abstract:

Heat exchangers are among the most widely used systems in factories, refineries, etc. In this study, natural convection heat transfer using nano-fluids between two cylinders is numerically investigated. The inner and outer cylinders are kept at constant temperatures. One of the most important assumptions in the project is that the working fluid is non-Newtonian. In recent years, the use of nano-fluids in industrial applications has increased profoundly. In this study, non-Newtonian nano-fluids containing metal particles with high heat transfer coefficients have been used, and all fluid properties, such as homogeneity, have been calculated. Solutions have been obtained under unsteady conditions, the base fluid was water, and the effects of various parameters on heat transfer have been investigated. These parameters are the Rayleigh number (10³ < Ra < 10⁶), power-law index (0.6 < n < 1.4), aspect ratio (0 < AR < 0.8), nano-particle composition, horizontal and vertical displacement of the inner cylinder, rotation of the inner cylinder, and volume fraction of nanoparticles. Results such as the inner cylinder's average and local Nusselt number variations, temperature contours, and flow lines are presented and discussed in detail. A validation study showed very good agreement between the present results and those from the open literature. It was found that heat transfer is always affected by the investigated parameters; however, the degree to which it is affected varies over a wide range.

Keywords: heat transfer, circular space, non-Newtonian, nano fluid, computational fluid dynamics

Procedia PDF Downloads 389
548 Design, Construction, and Validation of a Simple, Low-Cost Phi Meter

Authors: Gabrielle Peck, Ryan Hayes

Abstract:

The use of a phi meter allows the equivalence ratio to be determined during a fire test. Previous phi meter designs used expensive catalysts and had restricted portability due to the large furnace and the requirement for pure oxygen. The new design of the phi meter does not require a catalyst. The furnace design was based on the existing micro-scale combustion calorimetry (MCC) furnace, with operating conditions based on the secondary oxidizer furnace used in the steady state tube furnace (SSTF). Preliminary tests were conducted to study the effects of varying furnace temperatures on combustion efficiency. The SSTF was chosen to validate the phi meter measurements as it can both pre-set and independently quantify the equivalence ratio during a test. The data were in agreement with the data obtained on the SSTF, and the design was also validated by comparing CO2 yields obtained from the SSTF oxidizer with those obtained by the phi meter. The phi meter designed and constructed in this work was proven to work effectively at bench scale. It was then used to measure the equivalence ratio in a series of large-scale ISO 9705 tests for numerous fire conditions, using a range of non-homogeneous materials such as polyurethane. The measurements corresponded accurately to the data collected, showing that the novel design can be used from bench-scale to large-scale tests to measure the equivalence ratio. This cheaper, more portable, safer, and easier-to-use phi meter design will enable more widespread use and the ability to quantify the fire conditions of tests, allowing for a better understanding of flammability and smoke toxicity.
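
For readers unfamiliar with the quantity being measured: the equivalence ratio compares the actual fuel-to-air ratio with the stoichiometric one, so phi > 1 indicates under-ventilated (fuel-rich) burning. The worked numbers below are illustrative, and the stoichiometric ratio shown is the familiar value for methane, not data from this work.

```python
# Equivalence ratio: phi = (m_fuel / m_air) / (m_fuel / m_air)_stoichiometric
STOICH_FUEL_AIR_CH4 = 0.058   # approx. mass-based stoichiometric fuel/air ratio for methane

def equivalence_ratio(m_fuel, m_air, stoich_fuel_air=STOICH_FUEL_AIR_CH4):
    """phi > 1: fuel-rich (under-ventilated); phi < 1: fuel-lean (well-ventilated)."""
    return (m_fuel / m_air) / stoich_fuel_air

print(f"phi = {equivalence_ratio(m_fuel=2.0, m_air=60.0):.2f}")  # 2 g/s fuel in 60 g/s air
```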

Keywords: phi meter, smoke toxicity, fire condition, ISO 9705, novel equipment

Procedia PDF Downloads 78
547 Feature Analysis of Predictive Maintenance Models

Authors: Zhaoan Wang

Abstract:

Research in predictive maintenance modeling has improved in recent years to predict failures and needed maintenance with high accuracy, saving cost and improving manufacturing efficiency. However, classic prediction models provide little valuable insight into the most important features contributing to a failure. By analyzing and quantifying feature importance in predictive maintenance models, cost savings can be optimized based on business goals. First, multiple classifiers are evaluated with cross-validation to predict the multiple classes of failures. Second, predictive performance with features provided by different feature selection algorithms is further analyzed. Third, features selected by different algorithms are ranked and combined based on their predictive power. Finally, the linear SHAP (SHapley Additive exPlanations) explainer is applied to interpret classifier behavior and provide further insight into the specific roles of features in both local predictions and global model behavior. The results of the experiments suggest that certain features play dominant roles in predictive models while others have significantly less impact on overall performance. Moreover, for multi-class prediction of machine failures, the most important features vary with the type of machine failure. The results may lead to improved productivity and cost savings by prioritizing sensor deployment, data collection, and data processing of the more important features over the less important ones.
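
A compact illustration of the cross-validation plus SHAP workflow is sketched below, using a synthetic stand-in for the sensor-feature matrix and a binary label (the paper's setting is multi-class, and all names and numbers here are illustrative).

```python
import shap
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in for sensor features and failure labels
X, y = make_classification(n_samples=500, n_features=10, n_informative=4, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X, y)
print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

explainer = shap.LinearExplainer(clf, X)   # linear explainer, as named in the abstract
shap_values = explainer.shap_values(X)
shap.summary_plot(shap_values, X)          # global ranking of per-feature contributions
```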

Keywords: automated supply chain, intelligent manufacturing, predictive maintenance, machine learning, feature engineering, model interpretation

Procedia PDF Downloads 105
546 Effect of Project Control Practices on the Performance of Building Construction Companies in Uganda: A Case Study of Kampala City

Authors: Tukundane Hillary

Abstract:

This research paper analytically evaluates the levels of project control practice used by building construction companies within Kampala, Uganda, and assesses the impact of project control practices on the performance of those companies. The research was performed to ascertain the current control practices among 160 respondents from various construction companies registered with the Uganda Registration Services Bureau. The variables were obtained by amalgamating multiple sources from the literature. The research adopts 34 standard control practices across four vital project control duties: planning, monitoring, analyzing, and reporting. These project control tasks were ranked using mean response ratings based on their relevance to the construction companies. Results showed that evaluating performance with the use of curves (4.32), timely access to information and encouragement (4.55), report representation using quantitative tools (4.75), and application of cost-value comparison during analysis (4.76) were rated lowest among the control practices. On the other hand, the top project control practices included formulation of the project schedule (8.88), project feasibility validation (8.86), budgeting for each activity (8.84), key project route definition (8.81), team awareness of the budget (8.77), setting realistic targets for projects (8.50), and consultation with subcontractors (8.74). From the responses obtained, it can be concluded that planning is the most vital project control task practiced in the building construction industry in Uganda. In addition, this research ascertained a substantial relationship between project control practices and the performance of building construction companies. Accordingly, this research recommends that project control practices be effectively observed by both contracting and consulting companies to enhance their overall performance and governance.

Keywords: cost value, project control, cost control, time control, project performance, control practices

Procedia PDF Downloads 33
545 Validation of Electrical Field Effect on Electrostatic Desalter Modeling with Experimental Laboratory Data

Authors: Fatemeh Yazdanmehr, Iulian Nistor

Abstract:

The scope of the current study is the evaluation of the electric field effect in a mathematical model of electrostatic desalting against laboratory data. This research was focused on developing a model for an existing desalting unit of an Iranian heavy oil field with a production capacity of 75 MBPD. The high temperature of the inlet oil to the dehydration unit reduces oil recovery, so mathematical modeling of the desalter operating parameters is very significant. The operating data of the existing production unit were used to check the accuracy of the mathematical model of the desalting plant. The inlet oil temperature to the desalter was decreased from 110 to 80°C, and the desalter electrical field was increased from 0.75 to 2.5 kV/cm. The model results show that these changes to the desalter parameters meet the water-oil specification while increasing oil production and, consequently, annual income. In addition, changing the desalter operating conditions reduces the environmental footprint through flare gas reduction. To specify the accuracy of the selected electrostatic desalter electrical field, laboratory data were used. Experimental data serve to confirm the effect of the electrical field change on the desalter; therefore, a lab test was done on a crude oil sample. The results include the dehydration efficiency in the presence of a demulsifier under electrical field (0.75 kV) conditions at various temperatures. Comparing the laboratory results with those of the electrostatic desalter mathematical model shows an acceptable error of 1-3 percent, which confirms the validity of the desalter specification and the changes to the operating conditions.

Keywords: desalter, electrical field, demulsification, mathematical modeling, water-oil separation

Procedia PDF Downloads 97
544 The Design of a Vehicle Traffic Flow Prediction Model for a Gauteng Freeway Based on an Ensemble of Multi-Layer Perceptron

Authors: Tebogo Emma Makaba, Barnabas Ndlovu Gatsheni

Abstract:

The cities of Johannesburg and Pretoria, both located in the Gauteng province, are separated by a distance of 58 km. The traffic queues on the Ben Schoeman freeway, which connects these two cities, can stretch for almost 1.5 km. Vehicle traffic congestion impacts negatively on business and on commuters' quality of life. The goal of this paper is to identify variables that influence the flow of traffic and to design a vehicle traffic prediction model that will predict the traffic flow pattern in advance. The model will enable motorists to make appropriate travel decisions ahead of time. The data used were collected by Mikro's Traffic Monitoring (MTM). A multi-layer perceptron (MLP) was used individually to construct the model, and the MLP was also combined with the bagging ensemble method to train on the data. The cross-validation method was used for evaluating the models. The results obtained from the techniques were compared using predictive and prediction costs, where the cost was computed using a combination of the loss matrix and the confusion matrix. The prediction models designed show that the status of the traffic flow on the freeway can be predicted using the following parameters: travel time, average speed, traffic volume, and day of month. The implications of this work are that commuters will be able to spend less time travelling on the route and more time with their families, and the logistics industry will save more than twice what it is currently spending.
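
A minimal sketch of the single-versus-bagged MLP comparison is shown below, with a synthetic stand-in for the four MTM-derived features (travel time, average speed, traffic volume, day of month); all data and hyperparameters are illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Hypothetical stand-in for the four traffic features -> congestion class
X, y = make_classification(n_samples=2000, n_features=4, n_informative=4,
                           n_redundant=0, random_state=0)

mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
# sklearn >= 1.2 uses estimator=; older versions use base_estimator=
bagged = BaggingClassifier(estimator=mlp, n_estimators=10, random_state=0)

for name, model in [("single MLP", mlp), ("bagged MLP", bagged)]:
    acc = cross_val_score(model, X, y, cv=5).mean()   # cross-validated accuracy
    print(f"{name}: {acc:.3f}")
```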

Keywords: bagging ensemble methods, confusion matrix, multi-layer perceptron, vehicle traffic flow

Procedia PDF Downloads 315
543 Change Detection Analysis on Support Vector Machine Classifier of Land Use and Land Cover Changes: Case Study on Yangon

Authors: Khin Mar Yee, Mu Mu Than, Kyi Lint, Aye Aye Oo, Chan Mya Hmway, Khin Zar Chi Winn

Abstract:

The dynamic changes in Land Use and Land Cover (LULC) in Yangon have generally resulted in improved human welfare and economic development over the last twenty years. Mapping LULC is crucially important for the sustainable development of the environment. However, it is difficult to obtain exact data on how environmental factors influence the LULC situation at various scales, because the natural environment is composed of non-homogeneous surface features, so the features in satellite data also contain mixed pixels. The main objective of this study is the calculation of accuracy based on change detection of LULC changes by Support Vector Machines (SVMs). For this research work, the main data were satellite images from 1996, 2006, and 2015. Change detection statistics were used to compile a detailed tabulation of changes between two classification images, and the SVM process was applied with a soft approach at the allocation as well as the testing stage to achieve higher accuracy. The results of this paper showed that vegetation and cultivated areas decreased (by an average total of 29% from 1996 to 2015) because of conversion to built-up area, which more than doubled (an average total increase of 30% from 1996 to 2015). The error matrix and confidence limits led to the validation of the result for LULC mapping.
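
The accuracy-assessment step described above, training an SVM on labeled pixels and tabulating an error (confusion) matrix, follows a standard pattern; a sketch with a synthetic stand-in for per-pixel spectral features is given below (all data are illustrative).

```python
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Hypothetical per-pixel spectral features and four LULC classes
X, y = make_classification(n_samples=3000, n_features=6, n_informative=5,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

svm = SVC(kernel="rbf", probability=True).fit(X_train, y_train)  # soft (probabilistic) outputs
y_pred = svm.predict(X_test)
print("overall accuracy:", accuracy_score(y_test, y_pred))
print(confusion_matrix(y_test, y_pred))  # the error matrix used for validation
```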

Keywords: land use and land cover change, change detection, image processing, support vector machines

Procedia PDF Downloads 99