Search results for: OSA calibration
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 448

208 Stability Indicating Method Development and Validation for Estimation of Antiasthmatic Drugs in Combined Dosage Form by RP-HPLC

Authors: Laxman H. Surwase, Lalit V. Sonawane, Bhagwat N. Poul

Abstract:

A simple stability-indicating high performance liquid chromatographic method has been developed for the simultaneous determination of Levosalbutamol Sulphate and Ipratropium Bromide in bulk and pharmaceutical dosage form using a reverse-phase Zorbax Eclipse Plus C8 column (250 mm × 4.6 mm), with a mobile phase of phosphate buffer (0.05 M KH2PO4): acetonitrile (55:45 v/v), pH 3.5 adjusted with ortho-phosphoric acid; the flow rate was 1.0 mL/min and detection was carried out at 212 nm. The retention times of Levosalbutamol Sulphate and Ipratropium Bromide were 2.2007 and 2.6611 min, respectively. The correlation coefficients of Levosalbutamol Sulphate and Ipratropium Bromide were found to be 0.997 and 0.998. Calibration plots were linear over the concentration range 10-100 µg/mL for both Levosalbutamol Sulphate and Ipratropium Bromide. The LOD and LOQ of Levosalbutamol Sulphate were 2.520 µg/mL and 7.638 µg/mL, while those for Ipratropium Bromide were 1.201 µg/mL and 3.640 µg/mL. The accuracy of the proposed method was determined by recovery studies and found to be 100.15% for Levosalbutamol Sulphate and 100.19% for Ipratropium Bromide. The method was validated for accuracy, linearity, sensitivity, precision, robustness, and system suitability. The proposed method could be utilized for routine analysis of Levosalbutamol Sulphate and Ipratropium Bromide in bulk and pharmaceutical capsule dosage form.

Keywords: levosalbutamol sulphate, ipratropium bromide, RP-HPLC, phosphate buffer, acetonitrile

Procedia PDF Downloads 351
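The abstract above reports LOD and LOQ values derived from calibration data. As an illustration only (not the authors' code), a minimal sketch of the standard ICH-style calculation, fitting a calibration line by least squares and applying LOD = 3.3σ/S and LOQ = 10σ/S; the calibration points and σ below are hypothetical:

```python
import math

def linear_fit(x, y):
    """Ordinary least-squares fit y = slope*x + intercept; returns slope, intercept, r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / math.sqrt(sxx * syy)  # correlation coefficient
    return slope, intercept, r

def lod_loq(sigma, slope):
    """ICH Q2(R1) estimates: LOD = 3.3*sigma/S, LOQ = 10*sigma/S (S = calibration slope)."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical calibration points: concentration (ug/mL) vs. peak area
conc = [10, 25, 50, 75, 100]
area = [102, 251, 498, 752, 1001]
slope, intercept, r = linear_fit(conc, area)
lod, loq = lod_loq(sigma=7.6, slope=slope)  # sigma = SD of response, assumed value
```

The LOQ/LOD ratio is fixed at 10/3.3 under these formulas, which is consistent with the roughly threefold gap between the reported LOD and LOQ values.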
207 A New Social Vulnerability Index for Evaluating Social Vulnerability to Climate Change at the Local Scale

Authors: Cuong V Nguyen, Ralph Horne, John Fien, France Cheong

Abstract:

Social vulnerability to climate change is increasingly being acknowledged, and proposals to measure and manage it are emerging. Building upon this work, this paper proposes an approach to social vulnerability assessment using a new mechanism to aggregate and account for causal relationships among components of a Social Vulnerability Index (SVI). To operationalize this index, the authors propose a means of developing an appropriate primary dataset through the application of a specifically designed household survey questionnaire. The data collection and analysis, including calibration and calculation of the SVI, are demonstrated through application in a case study city in central coastal Vietnam. Calculating the SVI at the fine-grained local neighbourhood scale provides high resolution in vulnerability assessment and also obviates the need for secondary data, which may be unavailable or problematic, particularly at the local scale in developing countries. The SVI household survey is underpinned by the results of a Delphi survey, in-depth interviews, and focus group discussions with local environmental professionals and community members. The research reveals inherent limitations of existing SVIs but also indicates their potential for use in assessing social vulnerability and in making decisions associated with responding to climate change at the local scale.

Keywords: climate change, local scale, social vulnerability, social vulnerability index

Procedia PDF Downloads 435
206 Study and Calibration of Autonomous UAV Systems with Thermal Sensing Allowing Screening of Environmental Concerns

Authors: Raahil Sheikh, Abhishek Maurya, Priya Gujjar, Himanshu Dwivedi, Prathamesh Minde

Abstract:

UAVs have been part of our environment since their first use by Austrian forces in the assault on Venice. At that stage, they were just pilotless balloons equipped with bombs to be dropped on enemy territory. Over time, technological advancements allowed UAVs to be controlled remotely or autonomously. This study mainly focuses on the enhancement of pre-existing manual drones by equipping them with a variety of sensors, making them autonomous and capable, and purposing them for a variety of roles, including thermal sensing, data collection, tracking creatures, forest fires, volcano detection, hydrothermal studies, urban heat island measurement, and other environmental research. The system can also be used for reconnaissance, research, 3D mapping, and search and rescue missions. The study also aims to automate tedious tasks, reduce human error as much as possible, reduce deployment time, and increase the overall efficiency, efficacy, and reliability of the UAVs. A comprehensive Ground Control System UI (GCS) was created, enabling less-trained professionals to use the UAV with maximum potency. With the inclusion of such an autonomous system, flight paths can be planned intelligently and environmental gusts and other concerns can be avoided.

Keywords: UAV, drone, autonomous system, thermal imaging

Procedia PDF Downloads 75
205 Comparison Between Droplet Digital PCR and Real-Time PCR Methods in Quantification of HBV DNA

Authors: Surangrat Srisurapanon, Chatchawal Wongjitrat, Navin Horthongkham, Ruengpung Sutthent

Abstract:

HBV infection is a potentially serious public health problem. The ability to quantify HBV DNA concentration is important and is continuously being improved. In quantitative polymerase chain reaction (qPCR), several factors are hard to standardize: the source of material, the calibration standard curve, and PCR efficiency. Digital PCR (dPCR) is an alternative PCR-based technique for absolute quantification that uses Poisson statistics and does not require a standard curve. The aim of this study is therefore to compare the HBV DNA data generated by the dPCR and qPCR methods. All samples were quantified by Abbott's real-time PCR, and 54 samples with 2-6 log10 HBV DNA were selected for comparison with dPCR. Of these 54 samples, two outliers were defined as negative by dPCR, whereas 52 samples were positive by both tests. The difference between the two assays was less than 0.25 log IU/mL in 24/52 (46%) of paired samples, less than 0.5 log IU/mL in 46/52 (88%), and less than 1 log in 50/52 (96%). The correlation coefficient was r=0.788 with P-value <0.0001. Compared to qPCR, data generated by dPCR tended to overestimate samples with low HBV DNA concentrations and underestimate samples with high viral loads. The variation in the dPCR measurements might be due to pre-amplification bias of the template. Moreover, a minor drawback of dPCR is the large quantity of DNA required compared to qPCR. Since the technology is relatively new, the limitations of this assay are expected to be overcome as it matures.

Keywords: hepatitis B virus, real time PCR, digital PCR, DNA quantification

Procedia PDF Downloads 481
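As background to the Poisson statistics mentioned in the abstract above, a minimal sketch (with hypothetical partition counts and partition volume, not the authors' data) of how dPCR converts the fraction of positive partitions into an absolute copy number without a standard curve:

```python
import math

def dpcr_copies_per_partition(positive, total):
    """Poisson correction: lambda = -ln(1 - p), where p is the fraction of
    positive partitions; corrects for partitions holding more than one copy."""
    p = positive / total
    if p >= 1.0:
        raise ValueError("all partitions positive: concentration above dynamic range")
    return -math.log(1.0 - p)

def dpcr_concentration(positive, total, partition_volume_ul):
    """Absolute target concentration in copies/uL of reaction volume."""
    lam = dpcr_copies_per_partition(positive, total)
    return lam / partition_volume_ul

# Hypothetical run: 10,000 of 20,000 partitions positive, 0.85 nL partitions
conc = dpcr_concentration(10000, 20000, partition_volume_ul=0.00085)
```

With half the partitions positive, the Poisson correction gives ln 2 ≈ 0.693 copies per partition rather than the naive 0.5, which is why dPCR can remain accurate at moderately high occupancy.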
204 Probabilistic Model for Evaluating Seismic Soil Liquefaction Based on Energy Approach

Authors: Hamid Rostami, Ali Fallah Yeznabad, Mohammad H. Baziar

Abstract:

The energy-based method for evaluating seismic soil liquefaction has two main components. The first is the demand energy, which is the energy dissipated by an earthquake at a site, and the second is the capacity energy, a representation of soil resistance against liquefaction hazard. In this study, using a statistical analysis of data recorded by 14 down-hole array sites in California, an empirical equation was developed to estimate the demand energy at sites. Because determining the capacity energy at a site requires calculating several site calibration factors obtained from experimental tests, the standard penetration test (SPT) N-value was adopted in this study as a proxy for the capacity energy. Based on this assumption, the empirical equation was used to calculate the demand energy for 193 liquefied and non-liquefied sites, and these values were plotted against the corresponding SPT numbers for all sites. Subsequently, a discriminant analysis was employed to determine the equations of several boundary curves for various liquefaction likelihoods. Finally, a comparison was made between the probabilistic model and the commonly used stress method. The results clearly showed that the energy-based method can be more reliable than the conventional stress-based method in evaluating liquefaction occurrence.

Keywords: energy demand, liquefaction, probabilistic analysis, SPT number

Procedia PDF Downloads 367
203 Study and Calibration of Autonomous UAV Systems with Thermal Sensing for Multi-Purpose Roles

Authors: Raahil Sheikh, Prathamesh Minde, Priya Gujjar, Himanshu Dwivedi, Abhishek Maurya

Abstract:

UAVs have been part of our environment since their first use by Austrian forces in the assault on Venice. At that stage, they were just pilotless balloons equipped with bombs to be dropped on enemy territory. Over time, technological advancements allowed UAVs to be controlled remotely or autonomously. This study mainly focuses on the enhancement of pre-existing manual drones by equipping them with a variety of sensors, making them autonomous and capable, and purposing them for a variety of roles, including thermal sensing, data collection, tracking creatures, forest fires, volcano detection, hydrothermal studies, urban heat island measurement, and other environmental research. The system can also be used for reconnaissance, research, 3D mapping, and search and rescue missions. The study also aims to automate tedious tasks, reduce human error as much as possible, reduce deployment time, and increase the overall efficiency, efficacy, and reliability of the UAVs. A comprehensive Ground Control System UI (GCS) was created, enabling less-trained professionals to use the UAV with maximum potency. With the inclusion of such an autonomous system, flight paths can be planned intelligently and environmental gusts and other concerns can be avoided.

Keywords: UAV, autonomous systems, drones, geothermal imaging

Procedia PDF Downloads 85
202 Multimedia Firearms Training System

Authors: Aleksander Nawrat, Karol Jędrasiak, Artur Ryt, Dawid Sobel

Abstract:

The goal of this article is to present a novel Multimedia Firearms Training System. The system was developed to compensate for major problems of existing shooting training systems. The designed and implemented solution has five major advantages: an algorithm for automatic geometric calibration, an algorithm for photometric recalibration, firearms hit-point detection using a thermal imaging camera, an IR laser spot tracking algorithm for after-action review analysis, and an implementation of ballistics equations. The combination of these advantages in a single multimedia firearms training system creates a comprehensive solution for detection and tracking of the target point, usable in shooting training systems and for improving intervention tactics of uniformed services. The introduced algorithms of geometric and photometric recalibration allow the use of economically viable, commercially available projectors in systems that require long and intensive use, without most of the negative impacts on color mapping seen in existing multi-projector multimedia shooting range systems. The article presents the results of the developed algorithms and their application in real training systems.

Keywords: firearms shot detection, geometric recalibration, photometric recalibration, IR tracking algorithm, thermography, ballistics

Procedia PDF Downloads 222
201 Analysis of the Extreme Hydrometeorological Events in the Theoretical Hydraulic Potential and Streamflow Forecast

Authors: Sara Patricia Ibarra-Zavaleta, Rabindranarth Romero-Lopez, Rosario Langrave, Annie Poulin, Gerald Corzo, Mathias Glaus, Ricardo Vega-Azamar, Norma Angelica Oropeza

Abstract:

The progressive change in climatic conditions worldwide has increased the frequency and severity of extreme hydrometeorological events (EHE). Mexico is an example: it has been affected by EHEs, suffering economic, social, and environmental losses. The objective of this research was to apply a Canadian distributed hydrological model (DHM) to tropical conditions and to evaluate its capacity to predict flows in a basin in the central Gulf of Mexico. In addition, the DHM (once calibrated and validated) was used to calculate the theoretical hydraulic power and to assess its performance in predicting streamflow in the presence of an EHE. The goodness-of-fit indicators between observed and simulated flows are satisfactory in calibration (NSE=0.83, RSR=0.021 and BIAS=-4.3) and in temporal validation, assessed at two points: point one (NSE=0.78, RSR=0.113 and BIAS=0.054) and point two (NSE=0.825, RSR=0.103 and BIAS=0.063). The DHM showed its applicability in tropical environments and its ability to characterize the rainfall-runoff relationship in the study area. This work can serve as a tool for identifying flood vulnerabilities and for the rational and sustainable management of water resources.

Keywords: HYDROTEL, hydraulic power, extreme hydrometeorological events, streamflow

Procedia PDF Downloads 341
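The NSE, RSR, and BIAS statistics quoted in the abstract above are standard hydrological goodness-of-fit measures. A minimal sketch of one common set of definitions (the paper's exact sign convention for BIAS is not stated, so this is illustrative only):

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations (1 = perfect fit)."""
    mo = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sso = sum((o - mo) ** 2 for o in obs)
    return 1.0 - sse / sso

def rsr(obs, sim):
    """RMSE standardized by the standard deviation of the observations (0 = perfect)."""
    mo = sum(obs) / len(obs)
    rmse = math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))
    sd = math.sqrt(sum((o - mo) ** 2 for o in obs) / len(obs))
    return rmse / sd

def pbias(obs, sim):
    """Percent bias; positive means underestimation under this sign convention."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)
```

Note that a simulation equal to the observed mean scores NSE = 0, so positive NSE values such as those reported indicate skill beyond a constant-mean prediction.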
200 Visibility Measurements Using a Novel Open-Path Optical Extinction Analyzer

Authors: Nabil Saad, David Morgan, Manish Gupta

Abstract:

Visibility has become a key component of air quality and is regulated in many areas by environmental laws such as the EPA Clean Air Act and Regional Haze Rule. Typically, visibility is calculated by estimating the optical absorption and scattering of both gases and aerosols. A major component of aerosols' climatic effect is due to their scattering and absorption of solar radiation, which are governed by their optical and physical properties. However, the accurate assessment of this effect on global warming, climate change, and air quality is made difficult by uncertainties in the calculation of the single scattering albedo (SSA). Experimental complications arise in determining the single scattering albedo of an aerosol particle, since it requires the simultaneous measurement of both scattering and extinction. In fact, aerosol optical absorption in particular is a difficult measurement to perform, and it is often associated with large uncertainties when using filter methods or difference methods. In this presentation, we demonstrate the use of a new open-path Optical Extinction Analyzer (OEA) in conjunction with a nephelometer and two particle sizers, emphasizing the benefits that co-deployment of the OEA offers in deriving the complex refractive index of aerosols and their single scattering albedo. Various use cases, data reproducibility, and instrument calibration will also be presented to highlight the value proposition of this novel open-path OEA.

Keywords: aerosols, extinction, visibility, albedo

Procedia PDF Downloads 90
199 Validating Texture Analysis as a Tool for Determining Bioplastic (Bio)Degradation

Authors: Sally J. Price, Greg F. Walker, Weiyi Liu, Craig R. Bunt

Abstract:

Plastics, due to their long lifespan, become an environmental concern once their useful life is over. There is a vast array of different types of plastic; they can be found in almost every ecosystem on Earth and are of particular concern in terrestrial environments, where they can become incorporated into the food chain. Bioplastics have therefore recently become of interest to manufacturers and the public, as they have the ability to (bio)degrade in commercial and home composting situations. However, tools with which to quantify how they degrade in response to environmental variables are still being developed. One such approach is texture analysis: a TA.XT Texture Analyser (Stable Microsystems) was used to determine the force required to break or punch holes in standard ASTM D638 Type IV 3D-printed bioplastic "dogbones", depending on their thickness. The manufacturer's recommendations for calibrating the Texture Analyser are one approach to standardising results; however, an independent technique using dummy dogbones and a substitute for the bioplastic was used alongside the samples. This approach was unexpectedly more valuable than realised at the start of the trial, as irregular results with the substitute material were discovered before valuable samples collected from the field were lost to a possible machine malfunction. This work will show the value of an independent approach to machine calibration for accurate analysis of bioplastic samples with a Texture Analyser.

Keywords: bioplastic, degradation, environment, texture analyzer

Procedia PDF Downloads 205
198 Floodplain Modeling of River Jhelum Using HEC-RAS: A Case Study

Authors: Kashif Hassan, M.A. Ahanger

Abstract:

Floods have become more frequent and severe due to the effects of global climate change and human alteration of the natural environment. Flood prediction, forecasting, and control are among the greatest challenges facing the world today. Flood forecasting is achieved with hydraulic models such as HEC-RAS, which are designed to simulate the flow processes of surface water. Extreme flood events in the river Jhelum, lasting from a day to a few days, are a major disaster in the State of Jammu and Kashmir, India. In the present study, the HEC-RAS model was applied to two different reaches of the river Jhelum in order to estimate the flood levels corresponding to 25-, 50- and 100-year return-period flood events at important locations, and to deduce the flood vulnerability of important areas and structures. The flow rates for the two reaches were derived from flood-frequency analysis of 50 years of historic peak-flow data. Manning's roughness coefficient n was selected using detailed analysis. Rating curves were also generated to serve as the basis for determining the boundary conditions. Calibration and validation procedures were applied to ensure the reliability of the model. Sensitivity analysis was also performed to ensure the accuracy of Manning's n in generating water surface profiles.

Keywords: flood plain, HEC-RAS, Jhelum, return period

Procedia PDF Downloads 426
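Flood-frequency analysis of annual peak flows, as used in the study above to derive the 25-, 50- and 100-year design flows, is commonly done with an extreme-value distribution. A hedged sketch using the Gumbel (EV1) method-of-moments fit, with made-up peak-flow statistics (the abstract does not state which distribution the authors fitted):

```python
import math

def gumbel_quantile(mean_q, std_q, T):
    """Design flood for return period T (years) from the Gumbel (EV1) distribution,
    fitted to annual peak flows by the method of moments:
    Q_T = mean + K_T * std, with the frequency factor
    K_T = -(sqrt(6)/pi) * (0.5772 + ln(ln(T/(T-1))))."""
    k = -(math.sqrt(6.0) / math.pi) * (0.5772 + math.log(math.log(T / (T - 1.0))))
    return mean_q + k * std_q

# Hypothetical mean and standard deviation of annual peak flows, m^3/s
peaks_mean, peaks_std = 1200.0, 450.0
q25 = gumbel_quantile(peaks_mean, peaks_std, 25)
q100 = gumbel_quantile(peaks_mean, peaks_std, 100)
```

The frequency factor grows roughly with ln(T), so the 100-year flow exceeds the 25-year flow by a modest margin rather than a factor of four.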
197 Simultaneous Extraction and Estimation of Steroidal Glycosides and Aglycone of Solanum

Authors: Karishma Chester, Sarvesh Paliwal, Sayeed Ahmad

Abstract:

Solanum nigrum L. (family: Solanaceae) is an important Indian medicinal plant that has been used in various traditional formulations for hepato-protection. It has been reported to contain significant amounts of steroidal glycosides such as solamargine and solasonine, as well as their aglycone solasodine. As important pharmacologically active metabolites of several members of the Solanaceae, these markers have previously been extracted and quantified, but separately for the glycoside and aglycone parts because of their opposite polarities. Here, we propose for the first time the simultaneous extraction and quantification of the aglycone (solasodine) and the glycosides (solamargine and solasonine) in leaves and berries of S. nigrum using solvent extraction followed by HPTLC analysis. Simultaneous extraction was carried out by sonication in a chloroform-methanol solvent mixture. Quantification was done using silica gel 60 F254 HPTLC plates as the stationary phase and chloroform: methanol: acetone: 0.5% ammonia (7: 2.5: 1: 0.4 v/v/v/v) as the mobile phase, at 400 nm after derivatization with anisaldehyde-sulfuric acid reagent. The method was validated as per ICH guidelines for calibration, linearity, precision, recovery, robustness, specificity, LOD, and LOQ. The statistical data obtained for validation showed that the method can be used routinely for quality control of the various solanaceous drugs reported to contain these markers, as well as of traditional formulations containing those plants as an ingredient.

Keywords: solanum nigrum, solasodine, solamargine, solasonine, quantification

Procedia PDF Downloads 329
196 The Comparison and Optimization of the Analytic Method for Canthaxanthin, Food Colorants

Authors: Hee-Jae Suh, Kyung-Su Kim, Min-Ji Kim, Yeon-Seong Jeong, Ok-Hwan Lee, Jae-Wook Shin, Hyang-Sook Chun, Chan Lee

Abstract:

Canthaxanthin is a keto-carotenoid produced from beta-carotene, and it has been approved in many countries for use as a food coloring agent. Canthaxanthin has been analyzed using High Performance Liquid Chromatography (HPLC) systems with various pretreatment methods. Four official methods for verification of canthaxanthin, from the FSA (UK), AOAC (US), EFSA (EU) and MHLW (Japan), were compared in order to improve the analytical and pretreatment methods. The linearity, limit of detection (LOD), limit of quantification (LOQ), accuracy, precision, and recovery ratio were determined for each method with modifications in the pretreatment. All HPLC methods exhibited correlation coefficients of the calibration curves for canthaxanthin of 0.9999. The analysis methods from the FSA, AOAC, and MHLW showed LODs of 0.395 ppm, 0.105 ppm, and 0.084 ppm, and LOQs of 1.196 ppm, 0.318 ppm, and 0.254 ppm, respectively. Among the tested methods, the HPLC method of the MHLW, with modifications in pretreatment, was finally selected for the analysis of canthaxanthin in the lab, because it exhibited a resolution factor of 4.0 and a selectivity of 1.30. This analysis method showed a correlation coefficient of 0.9999 and the lowest LOD and LOQ. Furthermore, the precision ratio was lower than 1 and the accuracy was almost 100%. The method presented a recovery ratio of 90-110% with the modified pretreatment method. The cross-validation coefficient of variation was 5 or less among the three institutions tested in Korea.

Keywords: analytic method, canthaxanthin, food colorants, pretreatment method

Procedia PDF Downloads 683
195 Outdoor Anomaly Detection with a Spectroscopic Line Detector

Authors: O. J. G. Somsen

Abstract:

One of the tasks of optical surveillance is to detect anomalies in large amounts of image data. However, if the anomaly is very small, limited information is available to distinguish it from the surrounding environment. Spectral detection provides a useful source of additional information and may help to detect anomalies a few pixels in size or less. Unfortunately, spectral cameras are expensive because of the difficulty of resolving two spatial dimensions in addition to one spectral dimension. We investigate the possibility of modifying a simpler spectral line detector for outdoor detection. This may be especially useful if the area of interest forms a line, such as the horizon. We use a monochrome CCD that also enables detection into the near infrared. A simple camera is attached to the setup to determine which part of the environment is spectrally imaged. Our preliminary results indicate that sensitive detection of very small targets is indeed possible. Spectra could be taken from the various targets by averaging columns in the line image. By imaging a set of lines of various widths, we found narrow lines that could not be seen in the color image but remained visible in the spectral line image. A simultaneous analysis of the entire spectrum can produce better results than visual inspection of the spectral line image. We are presently developing calibration targets for spatial and spectral focusing and alignment with the spatial camera. This will yield improved results and broader use in outdoor applications.

Keywords: anomaly detection, spectroscopic line imaging, image analysis, outdoor detection

Procedia PDF Downloads 481
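The column-averaging step described in the abstract above can be sketched as follows. The axis convention (rows as wavelength bins, columns as positions along the imaged line) is an assumption, since the abstract does not specify the image layout:

```python
def target_spectrum(line_image, col_start, col_end):
    """Average the columns covering one target to extract its spectrum.

    Assumed layout: each row of `line_image` is one wavelength bin, each
    column one spatial position along the imaged line; averaging the columns
    that span a target reduces noise in that target's spectrum."""
    width = col_end - col_start
    return [sum(row[col_start:col_end]) / width for row in line_image]

# Toy 2-wavelength x 3-position line image
img = [[1.0, 2.0, 3.0],
       [4.0, 5.0, 6.0]]
spectrum = target_spectrum(img, 0, 2)  # target occupies the first two positions
```

Averaging over N columns improves the signal-to-noise ratio by roughly the square root of N for uncorrelated noise, which is why small targets remain detectable in the spectral line image.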
194 Comparison of Unit Hydrograph Models to Simulate Flood Events at the Field Scale

Authors: Imene Skhakhfa, Lahbaci Ouerdachi

Abstract:

To ensure the overall coherence of simulated results, it is necessary to develop a robust validation process. In many applications, it is no longer sufficient to calibrate and validate the model only against the hydrograph measured at the outlet; instead, one tries to better simulate the functioning of the watershed in space. The timing is therefore also assessed against other variables, such as water-level measurements at intermediate stations or groundwater levels. In this work, we limit ourselves to modeling floods of short duration, for which the process of evapotranspiration is negligible. The main parameters to identify in the models are related to the unit hydrograph (UH) method. Three different models were tested: Snyder, Clark, and SCS. These models differ in their mathematical structure and in the parameters to be calibrated, while the hydrological data are the same: the initial water content and precipitation. The models are compared on the basis of their performance in terms of six objective criteria: three global criteria and three criteria representing volume, peak flow, and mean square error. The first type of criterion gives more weight to strong events, whereas the second considers all events to be of equal weight. The results show that the calibrated parameter values are interdependent, and also highlight the problems associated with simulating low-flow events and intermittent precipitation.

Keywords: model calibration, intensity, runoff, hydrograph

Procedia PDF Downloads 486
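Of the three unit hydrograph models compared above, the SCS model has the simplest closed form. A hedged sketch of the SCS triangular unit hydrograph peak in its common SI form, using the usual lag = 0.6*Tc assumption; the paper's exact parameterization is not given, so the formula and numbers below are illustrative:

```python
def scs_peak_discharge(area_km2, runoff_cm, tc_hr, rain_dur_hr):
    """SCS triangular unit-hydrograph peak (SI form): Qp = 2.08 * A * Q / Tp,
    where A is catchment area (km^2), Q is runoff depth (cm), and the time to
    peak is Tp = D/2 + 0.6*Tc (D = rainfall duration, Tc = time of concentration,
    both in hours). Returns the peak discharge in m^3/s."""
    tp = rain_dur_hr / 2.0 + 0.6 * tc_hr
    return 2.08 * area_km2 * runoff_cm / tp

# Hypothetical small catchment: 10 km^2, 1 cm of runoff, Tc = 2 h, 1 h storm
qp = scs_peak_discharge(10.0, 1.0, 2.0, 1.0)
```

Because only Tc (and the implied lag) is calibrated, the SCS model has fewer free parameters than Snyder or Clark, which is one reason such comparisons are informative.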
193 Challenges for Implementing Standards Compliant with ISO/IEC 17025 for Narcotics and DNA Laboratories

Authors: Blerim Olluri

Abstract:

The forensic science laboratory in Kosovo had never been organized at the level of most modern forensic science laboratories. This became possible after the 1999 war, with help and support from the United States. The United States Government/ICITAP provided 9.5 million dollars to support this project, and this support has greatly benefited law enforcement in Kosovo. With the establishment of operative procedures of work and the law on the Kosovo Agency of Forensics (KAF), accreditation of the KAF labs to ISO/IEC 17025 became mandatory. Since 2012, the DNA/Serology and Narcotics laboratories have been reviewing and harmonizing their procedures according to ISO/IEC 17025. The focus of this work was to create quality manuals, procedures, work instructions, quality documentation, and quality records. Furthermore, during this time, the validation of work methods was carried out by scientifically qualified KAF personnel, without any help from foreign agencies or an accreditation body. In October 2014, we had the first evaluation based on the ISO 17025 standards. According to the initial report of this assessment, we had nonconformities in test and calibration methods and in accommodation and environmental conditions. We identified several issues of extreme importance to KAF. One of the most important is to create a professional group of KAF experts, who will work on all the obligations required by ISO/IEC 17025. The lesson learned on this path to accreditation is that laboratories need to take corrective action: all nonconformances must be addressed and corrective action taken before accreditation can be granted.

Keywords: accreditation, assessment, narcotics, DNA

Procedia PDF Downloads 364
192 Modeling of Ductile Fracture Using Stress-Modified Critical Strain Criterion for Typical Pressure Vessel Steel

Authors: Carlos Cuenca, Diego Sarzosa

Abstract:

Ductile fracture occurs by the mechanism of void nucleation, void growth, and coalescence. Potential initiation sites are second-phase particles or non-metallic inclusions. Modelling ductile damage at the microscopic level is a very difficult and complex task for engineers. Therefore, conservative predictions of ductile failure using simple models are necessary during the design and optimization of critical structures like pressure vessels and pipelines. It is now well known that the initiation phase is strongly influenced by the stress triaxiality and plastic deformation at the microscopic level. A simple model used to study ductile failure under multiaxial stress conditions is the Stress-Modified Critical Strain (SMCS) approach. Ductile rupture has been studied for a structural steel under different stress triaxiality conditions using the SMCS method. Experimental tests were carried out on notched round bars to characterize the relation between stress triaxiality and equivalent plastic strain. After calibration of the plasticity and damage properties, predictions were made for low-constraint bending specimens with and without side grooves. The evolution of the stress/strain fields is compared between the different geometries. Advantages and disadvantages of the SMCS methodology are discussed.

Keywords: damage, SMCS, SEB, steel, failure

Procedia PDF Downloads 297
191 Surface Enhanced Raman Substrate Detection on the Structure of γ-Aminobutyric Acid (GABA) Connected with Modified Gold-Chitosan Nanoparticles by Mercaptopropionic Acid (MPA)

Authors: Bingjie Wang, Su-Yeon Kwon, Ik-Joong Kang

Abstract:

Surface-enhanced Raman scattering (SERS), the enhancement of Raman scattering by molecules adsorbed on rough metal surfaces or on nanostructures, is used to detect changes in the concentration of γ-aminobutyric acid (GABA). The gold-chitosan nanoshell is made by first crosslinking chitosan with sodium tripolyphosphate (TPP) to form chitosan nanoparticles, which are subsequently covered with gold. The size of the fabricated product was around 100 nm. The method exploits the fact that the sulfur end of MPA forms a very strong S-Au bond with gold, while the carboxyl group at the other end of the MPA can easily adsorb GABA. GABA is the main inhibitory neurotransmitter in the mammalian central nervous system and plays a significant role in reducing neuronal excitability throughout the nervous system. Once the system was formed, it generated SERS signals whose Raman scattering intensity differed clearly across the range of GABA concentrations. A calibration curve relating GABA concentration to the SERS scattering was thus obtained from the experiment. In this study, DLS, SEM, FT-IR, UV, and SERS were used to analyze the products and reach the conclusions.

Keywords: chitosan-gold nanoshell, mercaptopropionic acid, γ-aminobutyric acid, surface-enhanced Raman scattering

Procedia PDF Downloads 263
190 Emotional Artificial Intelligence and the Right to Privacy

Authors: Emine Akar

Abstract:

The majority of privacy-related regulation has traditionally focused on concepts that are perceived to be well-understood or easily describable, such as certain categories of data and personal information or images. In the past century, such regulation appeared reasonably suitable for its purposes. However, technologies such as AI, combined with ever-increasing capabilities to collect, process, and store "big data", not only require calibration of these traditional understandings but may require re-thinking of entire categories of privacy law. In the presentation, it will be explained, against the background of various emerging technologies under the umbrella term "emotional artificial intelligence", why modern privacy law will need to embrace human emotions as potentially private subject matter. This argument can be made on a jurisprudential level, given that human emotions can plausibly be accommodated within the various concepts that are traditionally regarded as the underlying foundation of privacy protection, such as dignity, autonomy, and liberal values. However, the practical reasons for regarding human emotions as potentially private subject matter are perhaps more important (and very likely more convincing from the perspective of regulators). In that respect, it should be regarded as alarming that, according to most projections, the usefulness of emotional data to governments and, particularly, private companies will not only lead to radically increased processing and analysing of such data but, concerningly, to exponential growth in its collection. In light of this, it is also necessary to discuss options for how regulators could address this emerging threat.

Keywords: AI, privacy law, data protection, big data

Procedia PDF Downloads 88
189 A Generative Pretrained Transformer-Based Question-Answer Chatbot and Phantom-Less Quantitative Computed Tomography Bone Mineral Density Measurement System for Osteoporosis

Authors: Mian Huang, Chi Ma, Junyu Lin, William Lu

Abstract:

Introduction: Bone health has attracted increasing attention recently, and an intelligent question-and-answer (QA) chatbot for osteoporosis is helpful for science popularization. With Generative Pretrained Transformer (GPT) technology developing, we built an osteoporosis corpus dataset and then fine-tuned LLaMA, a well-known open-source GPT foundation large language model (LLM), on our self-constructed osteoporosis corpus. As evaluated by clinical orthopedic experts, our fine-tuned model outperforms vanilla LLaMA on the osteoporosis QA task in Chinese. Three-dimensional quantitative computed tomography (QCT)-measured bone mineral density (BMD) has in recent years come to be considered more accurate than DXA for BMD measurement. We developed an automatic phantom-less QCT (PL-QCT) that is more efficient for BMD measurement, since no external phantom is needed for calibration. Combined with the LLM for osteoporosis, our PL-QCT provides efficient and accurate BMD measurement for our chatbot users. Material and Methods: We built an osteoporosis corpus containing about 30,000 Chinese publications whose titles are related to osteoporosis. The whole process is automated, including crawling publications in .pdf format, localizing text/figure/table regions with a layout segmentation algorithm, and recognizing text with an OCR algorithm. We trained our model by continuous pre-training with Low-rank Adaptation (LoRA, rank=10) technology to adapt the LLaMA-7B model to the osteoporosis domain; the basic principle is to mask the next word in the text and make the model predict that word. The loss function is defined as the cross-entropy between the predicted and ground-truth words. The experiment was run on a single NVIDIA A800 GPU for 15 days. Our automatic PL-QCT BMD measurement adopts an AI-assisted region-of-interest (ROI) generation algorithm for localizing a vertebrae-parallel cylinder in cancellous bone. Because no phantom is available for BMD calibration, we calculate ROI BMD from the CT-BMD of the subject's muscle and fat.
Results & Discussion: Clinical orthopaedic experts were invited to design 5 osteoporosis questions in Chinese to evaluate the performance of vanilla LLaMA and our fine-tuned model. Our model outperforms LLaMA on over 80% of these questions, demonstrating understanding of ‘Expert Consensus on Osteoporosis’, ‘QCT for osteoporosis diagnosis’, and ‘Effect of age on osteoporosis’. Detailed results are shown in the appendix. Future work may involve training a larger LLM on the whole of orthopaedics with more high-quality domain data, or a multi-modal GPT that combines and understands X-ray images and medical text for orthopaedic computer-aided diagnosis. However, GPT models sometimes give unexpected outputs, such as repetitive text or seemingly normal but wrong answers (called ‘hallucinations’). Even when GPT gives correct answers, they cannot be considered valid clinical diagnoses in place of those of clinical doctors. The PL-QCT BMD system provided by Bone's QCT (Bone's Technology (Shenzhen) Limited) achieves mean absolute errors (MAE) of 0.1448 mg/cm² (spine) and 0.0002 mg/cm² (hip) and linear correlation coefficients R² = 0.9970 (spine) and R² = 0.9991 (hip) (compared to QCT-Pro (Mindways)) on 155 patients in a three-center clinical trial in Guangzhou, China. Conclusion: This study builds a Chinese osteoporosis corpus and develops a fine-tuned, domain-adapted LLM as well as a PL-QCT BMD measurement system. Our fine-tuned GPT model shows better capability than the LLaMA model on most osteoporosis test questions. Combined with our PL-QCT BMD system, we look forward to providing science popularization and early screening for potential osteoporotic patients.
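The LoRA pre-training objective described above can be sketched numerically. This is a minimal illustration, not the authors' training code: the dimensions, initialization constants, and function names are assumptions, and the low-rank update and next-word cross-entropy follow the general LoRA recipe (frozen weight W plus trainable A @ B of rank r).

```python
import numpy as np

# Minimal sketch of the LoRA idea (rank r = 10): the frozen pretrained
# weight W is augmented with a trainable low-rank update A @ B, so only
# d*r + r*k parameters are trained instead of d*k.
rng = np.random.default_rng(0)
d, k, r = 512, 512, 10
W = rng.normal(size=(d, k))          # frozen pretrained weight
A = rng.normal(size=(d, r)) * 0.01   # trainable
B = np.zeros((r, k))                 # trainable; zero-init so the update starts at 0

def adapted_forward(x):
    # effective weight is W + A @ B, computed without materializing it
    return x @ W + (x @ A) @ B

def next_token_cross_entropy(logits, target_id):
    # the loss named in the abstract: cross-entropy between the predicted
    # distribution over the vocabulary and the ground-truth next word
    z = logits - logits.max()
    log_probs = z - np.log(np.exp(z).sum())
    return -log_probs[target_id]
```

Because B starts at zero, the adapted model initially reproduces the frozen model exactly; training then moves only A and B.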

Keywords: GPT, phantom-less QCT, large language model, osteoporosis

Procedia PDF Downloads 71
188 Groundwater Flow Assessment Based on Numerical Simulation at Omdurman Area, Khartoum State, Sudan

Authors: Adil Balla Elkrail

Abstract:

Visual MODFLOW computer codes were selected to simulate head distribution, calculate the groundwater budgets of the area, evaluate the effect of external stresses on the groundwater head, and demonstrate how the groundwater model can be used as a comparative technique in order to optimize utilization of the groundwater resource. A conceptual model of the study area, aquifer parameters, and boundary and initial conditions were used to simulate the flow model. The trial-and-error technique was used to calibrate the model. The most important criteria used to check the calibrated model were the Root Mean Square error (RMS), Mean Absolute error (MAE), Normalized Root Mean Square error (NRMS), and mass balance. The maps of simulated heads showed acceptable model calibration when compared with the map of observed heads. A time length of eight years and the observed heads of the year 2004 were used for model prediction. The predictive simulation showed that continued pumping will cause relatively large changes in head distribution and in the components of the groundwater budget, whereas the low computed deficit (7122 m³/d) between inflows and outflows cannot create a significant drawdown of the potentiometric level. Hence, the area under consideration may represent a high-permeability, productive zone and is strongly recommended for further groundwater development.
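The calibration criteria named in the abstract can be computed in a few lines. This is a hedged sketch with illustrative head values, not the study's data; NRMS is taken here as RMS divided by the range of observed heads, a common convention in groundwater model calibration.

```python
import numpy as np

# Hypothetical observed vs. simulated hydraulic heads (m) at five wells
observed  = np.array([385.2, 384.7, 383.9, 382.5, 381.8])
simulated = np.array([385.0, 384.9, 383.6, 382.9, 381.5])

residuals = simulated - observed
rms  = np.sqrt(np.mean(residuals**2))                   # Root Mean Square error
mae  = np.mean(np.abs(residuals))                       # Mean Absolute error
nrms = 100.0 * rms / (observed.max() - observed.min())  # Normalized RMS, in percent

# a common rule of thumb accepts the calibration when NRMS is below ~10%
print(f"RMS={rms:.3f} m, MAE={mae:.3f} m, NRMS={nrms:.1f}%")
```

The mass-balance check (total inflow minus total outflow, e.g. the 7122 m³/d deficit quoted above) comes directly from the MODFLOW budget output rather than from head residuals.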

Keywords: aquifers, model simulation, groundwater, calibration, trial-and-error, prediction

Procedia PDF Downloads 241
187 Exploring Data Leakage in EEG Based Brain-Computer Interfaces: Overfitting Challenges

Authors: Khalida Douibi, Rodrigo Balp, Solène Le Bars

Abstract:

In the medical field, applications related to human experiments are frequently associated with reduced sample sizes, which makes the training of machine learning models quite sensitive and therefore neither very robust nor generalizable. This is notably the case in Brain-Computer Interface (BCI) studies, where the sample size rarely exceeds 20 subjects or a small number of trials. To address this problem, several resampling approaches are often used during the data preparation phase, which is a particularly critical step in a data science analysis process. One naive approach usually applied by data scientists consists of transforming the entire database before the resampling phase. However, this can cause a model's performance to be incorrectly estimated when making predictions on unseen data. In this paper, we explored the effect of data leakage observed during our BCI experiments for device control through the real-time classification of SSVEPs (Steady-State Visually Evoked Potentials). We also studied potential ways to ensure optimal validation of the classifiers during the calibration phase to avoid overfitting. The results show that the scaling step is crucial for some algorithms, and that it should be applied after the resampling phase to avoid data leakage and improve results.
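The leakage pattern discussed above can be made concrete with a small sketch. This is an illustration under assumed synthetic data, not the authors' pipeline: fitting a scaler on the entire dataset before splitting lets test-set statistics leak into training, whereas the safe order is to split first and fit the scaler on the training fold only.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(loc=5.0, scale=2.0, size=(100, 4))  # synthetic feature matrix
train, test = X[:80], X[80:]

# Leaky procedure: scaling statistics computed from ALL samples,
# including the held-out test fold
mu_leaky, sd_leaky = X.mean(axis=0), X.std(axis=0)

# Correct procedure: statistics computed from the training fold only,
# then reused unchanged on the unseen test fold
mu, sd = train.mean(axis=0), train.std(axis=0)
train_scaled = (train - mu) / sd
test_scaled  = (test - mu) / sd

# the two procedures genuinely produce different parameters
assert not np.allclose(mu, mu_leaky)
```

The same ordering applies to any fitted preprocessing step (scaling, PCA, feature selection): fit inside the training fold of each resampling iteration, never on the whole database.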

Keywords: data leakage, data science, machine learning, SSVEP, BCI, overfitting

Procedia PDF Downloads 153
186 Selective and Highly Sensitive Measurement of ¹⁵NH₃ Using Photoacoustic Spectroscopy for Environmental Applications

Authors: Emily Awuor, Helga Huszar, Zoltan Bozoki

Abstract:

Isotope analysis has found numerous applications in environmental science, the most common being the tracing of environmental contaminants on both regional and global scales. Many environmental contaminants involve ammonia (NH₃), since it is the most abundant alkaline gas in the atmosphere and its largest sources are agricultural and industrial activities. NH₃ isotopes (¹⁴NH₃ and ¹⁵NH₃) are therefore important and can be used in traceability studies of these atmospheric pollutants. The goal of the project is the construction of a photoacoustic spectroscopy system capable of selectively measuring the concentration of the ¹⁵NH₃ isotope. A further objective is for the system to be robust, easy to use, and automated. This is achieved by using two telecommunication-type near-infrared distributed feedback (DFB) diode lasers and a laser coupler as the light source in the photoacoustic measurement system. The central wavelength of the lasers in use was 1532 nm, with a tuning range of ±1 nm. In this range, strong absorption lines can be found for both ¹⁴NH₃ and ¹⁵NH₃. For the selective measurement of ¹⁵NH₃, wavelengths were chosen where the cross effect of ¹⁴NH₃ and water vapor is negligible. We completed the calibration of the photoacoustic system; as a result, the lowest detectable concentration was 3.32 ppm (3σ) for ¹⁵NH₃ and 0.44 ppm (3σ) for ¹⁴NH₃. The results are most useful in environmental pollution measurement and analysis.
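A 3σ detection limit like the 3.32 ppm quoted above is conventionally derived from the calibration: fit a linear signal-versus-concentration curve, then divide three times the blank noise by the slope. The sketch below illustrates that convention with made-up numbers; the concentrations, signals, and blank noise are assumptions, not the authors' data.

```python
import numpy as np

# Hypothetical photoacoustic calibration points
conc   = np.array([0.0, 1.0, 2.0, 4.0, 8.0])        # concentration (ppm)
signal = np.array([0.02, 1.05, 1.98, 4.03, 7.96])   # photoacoustic amplitude (a.u.)

# Linear calibration fit: signal = slope * conc + intercept
slope, intercept = np.polyfit(conc, signal, 1)

sigma_blank = 0.03                  # assumed standard deviation of the blank signal
lod = 3.0 * sigma_blank / slope     # 3-sigma limit of detection, in ppm

print(f"slope={slope:.3f} a.u./ppm, LOD={lod:.3f} ppm")
```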

Keywords: ammonia isotope, near-infrared DFB diode laser, photoacoustic spectroscopy, environmental monitoring

Procedia PDF Downloads 148
185 Condition Monitoring for Twin-Fluid Nozzles with Internal Mixing

Authors: C. Lanzerstorfer

Abstract:

Liquid sprays of water are frequently used in air pollution control for gas cooling and gas cleaning. Twin-fluid nozzles with internal mixing are often used for these purposes because of the small size of the drops produced. In these nozzles, the liquid is dispersed by compressed air or another pressurized gas. In high-efficiency scrubbers for particle separation, several nozzles are operated in parallel because of the size of the cross section. In such scrubbers, the scrubbing water has to be re-circulated. Precipitation of some solid material can occur in the liquid circuit, caused by chemical reactions. When such precipitates detach from the place of formation, they can partly or totally block the liquid flow to a nozzle. Due to the resulting unbalanced supply of the nozzles with water and gas, the separation efficiency decreases. Thus, the nozzles have to be cleaned when a certain fraction of blockages is reached. The aim of this study was to provide a tool for continuously monitoring the status of the nozzles of a scrubber based on the available operation data (water flow, air flow, water pressure, and air pressure). The difference between the air pressure and the water pressure is not well suited for this purpose, because the difference is quite small and very exact calibration of the pressure measurement would therefore be required. Instead, an equation for the reference air flow of a nozzle at the actual water flow and operating pressure was derived. This flow can be compared with the actual air flow to assess the status of the nozzles.
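The monitoring idea can be sketched as a simple check: compare the measured air flow of each nozzle against a reference air flow predicted from the current water flow and operating pressure. The correlation below is a generic power-law placeholder with made-up coefficients, not the equation derived in the paper, and the tolerance threshold is likewise an assumption.

```python
def reference_air_flow(water_flow, pressure, c=1.8, alpha=-0.25, beta=0.6):
    # Hypothetical fitted correlation (placeholder): air flow decreases
    # with water flow and increases with operating pressure.
    return c * (water_flow ** alpha) * (pressure ** beta)

def nozzle_status(measured_air_flow, water_flow, pressure, tolerance=0.15):
    # Flag a nozzle whose measured air flow deviates from the reference
    # by more than the assumed tolerance (15% here).
    ref = reference_air_flow(water_flow, pressure)
    deviation = abs(measured_air_flow - ref) / ref
    return "ok" if deviation <= tolerance else "check nozzle"

print(nozzle_status(4.0, 1.0, 4.0))
```

A partly blocked nozzle receives less water, so its air flow drifts away from the reference value at the same operating point, which is what the comparison detects.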

Keywords: condition monitoring, dual flow nozzles, flow equation, operation data

Procedia PDF Downloads 265
184 Coupled Analysis for Hazard Modelling of Debris Flow Due to Extreme Rainfall

Authors: N. V. Nikhil, S. R. Lee, Do Won Park

Abstract:

The Korean peninsula receives about two thirds of its annual rainfall during the summer season. The extreme rainfall pattern due to typhoons and heavy rainfall results in severe mountain disasters, of which 55% are debris flows, a major natural hazard especially when occurring around major settlement areas. The basic mechanism underlying this kind of failure is unsaturated shallow slope failure through reduction of matric suction due to infiltration of water, followed by liquefaction of the failed mass due to generation of positive pore water pressure, leading to an abrupt loss of strength and the commencement of flow. An empirical model alone, however, cannot simulate this complex mechanism. Hence, we have employed a combined empirical-physical approach for hazard analysis of debris flow using TRIGRS, a debris flow initiation criterion, and DAN-3D on Mount Woonmyun, South Korea. The debris flow initiation criterion is required to discern the potential landslides that can transform into debris flows. DAN-3D, being a new model, does not have calibrated rheology parameter values for Korean conditions. Thus, in our analysis we have used the recent 2011 debris flow event on Mount Woonmyun for calibration of both the TRIGRS model and DAN-3D, thereafter identifying and predicting the debris flow initiation points, path, run-out velocity, and area of spreading for future extreme rainfall scenarios.

Keywords: debris flow, DAN-3D, extreme rainfall, hazard analysis

Procedia PDF Downloads 247
183 Characterization of Aquifer Systems and Identification of Potential Groundwater Recharge Zones Using Geospatial Data and Arc GIS in Kagandi Water Supply System Well Field

Authors: Aijuka Nicholas

Abstract:

A research study was undertaken to characterize the aquifers and identify the potential groundwater recharge zones in the Kagandi district. Quantitative characterization of the hydraulic conductivities of aquifers is of fundamental importance to the study of groundwater flow and contaminant transport in aquifers. A conditional approach is used to represent the spatial variability of hydraulic conductivity. Briefly, it involves using qualitative and quantitative geologic borehole-log data to generate a three-dimensional (3D) hydraulic conductivity distribution, which is then adjusted through calibration of a 3D groundwater flow model using pumping-test data and historic hydraulic data. The approach consists of several steps. The study area was divided into five sub-watersheds on the basis of artificial drainage divides. A digital terrain model (DTM) was developed using Arc GIS to determine the general drainage pattern of the Kagandi watershed. Hydrologic characterization involved the determination of the various hydraulic properties of the aquifers. Potential groundwater recharge zones were identified by integrating various thematic maps pertaining to the digital elevation model, land use, and drainage pattern in Arc GIS and Surfer (Golden Software). The study demonstrates the potential of GIS in delineating groundwater recharge zones and shows that the developed methodology will be applicable to other watersheds in Uganda.

Keywords: aquifers, Arc GIS, groundwater recharge, recharge zones

Procedia PDF Downloads 147
182 Using Flow Line Modelling and Remote Sensing for Reconstructing a Glacier Volume Loss Model for Athabasca Glacier, Canadian Rockies

Authors: Rituparna Nath, Shawn J. Marshall

Abstract:

Glaciers are among the most sensitive climatic indicators, as they respond strongly to small climatic shifts. We develop a flow line model of glacier dynamics to simulate the past and future extent of glaciers in the Canadian Rocky Mountains, with the aim of coupling this model within larger-scale regional climate models of glacier response to climate change. This paper focuses on glacier-climate modeling and reconstructions of glacier volume from the Little Ice Age (LIA) to the present for Athabasca Glacier, Alberta, Canada. Glacier thickness, volume, and mass change will be reconstructed using flow line modelling and examination of different climate scenarios that are able to give good reconstructions of LIA ice extent. With the availability of SPOT 5 imagery, digital elevation models, and the GIS Arc Hydro tool, ice catchment properties (glacier width and LIA moraines) have been extracted using automated procedures. Simulation of glacier mass change will inform estimates of meltwater runoff over the historical period, and model calibration from the LIA reconstruction will aid in future projections of the effects of climate change on glacier recession. Furthermore, the model developed will be useful for future studies of ensembles of glaciers.

Keywords: flow line modeling, Athabasca Glacier, glacier mass balance, remote sensing, Arc Hydro tool, Little Ice Age

Procedia PDF Downloads 268
181 Mathematical Modeling of the Calculation of Absorbed Dose in Uranium Production Workers with Genetic Effects

Authors: P. Kazymbet, G. Abildinova, K. Makhambetov, M. Bakhtin, D. Rybalkina, K. Zhumadilov

Abstract:

Cytogenetic research was conducted in workers of the Stepnogorsk Mining-Chemical Combine (Akmola region), with 26,341 chromosomal metaphases studied. Using regression analysis with the program DataFit, version 5.0, the dependence between exposure dose and the following cytogenetic indices was studied: frequency of aberrant cells, frequency of chromosomal aberrations, and the combined frequency of dicentric chromosomes and centric rings. Experimental data on the "dose-effect" calibration curves enabled the development of a mathematical model allowing the absorbed dose at the time of the study to be calculated from the frequency of aberrant cells, chromosomal aberrations, and the combined frequency of dicentric chromosomes and centric rings. In the dose range of 0.1 Gy to 5.0 Gy, the dependence of the cytogenetic parameters on dose followed these equations: Y = 0.0067e^(0.3307x) (R² = 0.8206) for the frequency of chromosomal aberrations; Y = 0.0057e^(0.3161x) (R² = 0.8832) for the frequency of cells with chromosomal aberrations; Y = 5×10⁻⁵e^(0.6383x) (R² = 0.6321) for the combined frequency of dicentric chromosomes and centric rings per cell. On the basis of the cytogenetic parameters and regression equations, the absorbed dose calculated for the uranium production workers at the time of the study did not exceed 0.3 Gy.
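Each calibration curve has the exponential form Y = a·e^(b·x), so an observed cytogenetic frequency Y can be inverted to an absorbed-dose estimate x = ln(Y/a)/b. The sketch below applies that inversion using the coefficients quoted in the abstract; the function and dictionary names are illustrative.

```python
import math

# Calibration coefficients (a, b) for Y = a * exp(b * x), x in Gy,
# as quoted in the abstract
CURVES = {
    "chromosomal_aberrations": (0.0067, 0.3307),
    "aberrant_cells":          (0.0057, 0.3161),
    "dicentrics_plus_rings":   (5e-5,   0.6383),
}

def dose_from_frequency(curve, y):
    # invert Y = a * exp(b * x)  =>  x = ln(y / a) / b
    a, b = CURVES[curve]
    return math.log(y / a) / b  # absorbed dose in Gy

# e.g. a measured aberration frequency of 0.0074 inverts to roughly 0.3 Gy,
# consistent with the upper bound reported for the surveyed workers
print(round(dose_from_frequency("chromosomal_aberrations", 0.0074), 2))
```

Note the inversion is only meaningful inside the calibrated range (0.1 Gy to 5.0 Gy); frequencies at or below the intercept a would give non-positive doses.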

Keywords: Stepnogorsk, mathematical modeling, cytogenetic, dicentric chromosomes

Procedia PDF Downloads 477
180 Mercaptopropionic Acid (MPA) Modifying Chitosan-Gold Nano Composite for γ-Aminobutyric Acid Analysis Using Raman Scattering

Authors: Bingjie Wang, Su-Yeon Kwon, Ik-Joong Kang

Abstract:

The goal of this experiment is to develop a sensor that can quickly determine concentration using nanoparticles made of chitosan and gold. Chitosan nanoparticles are first formed by crosslinking chitosan with sodium tripolyphosphate (TPP); these are subsequently coated with gold. The size of the fabricated product was around 100 nm. The method is based on the fact that the sulfur end of MPA forms a very strong S–Au bond with gold, while the carboxyl group at the other end of the MPA readily binds GABA. GABA is the primary inhibitory neurotransmitter in the mammalian central nervous system and plays a significant role in reducing neuronal excitability throughout the nervous system. Surface-enhanced Raman scattering (SERS), the enhancement of Raman scattering by molecules adsorbed on rough metal surfaces or on nanostructures, is used to detect concentration changes of γ-aminobutyric acid (GABA). When the system is assembled, it generates SERS, which produces a clear difference in Raman scattering intensity across the investigated range of GABA concentrations; from the experiment, a calibration curve relating SERS intensity to GABA concentration was obtained. In this study, DLS, SEM, FT-IR, UV spectroscopy, and SERS were used to characterize the products.

Keywords: mercaptopropionic acid, chitosan-gold nanoshell, γ-aminobutyric acid, surface-enhanced Raman scattering

Procedia PDF Downloads 275
179 Quantitative Assessment of Soft Tissues by Statistical Analysis of Ultrasound Backscattered Signals

Authors: Da-Ming Huang, Ya-Ting Tsai, Shyh-Hau Wang

Abstract:

Ultrasound signals backscattered from soft tissues mainly depend on the size, density, distribution, and other elastic properties of the scatterers in the interrogated sample volume. Quantitative analysis of ultrasonic backscattering is frequently implemented using statistical approaches, because backscattered signals tend to behave as random variables. Thus, statistical analyses, such as Nakagami statistics, have been applied to characterize the density and distribution of scatterers in a sample. Yet the accuracy of statistical analysis can be readily affected by the received signals, which depend on the nature of the incident ultrasound wave and the acoustic properties of the samples. In the present study, efforts were therefore made to explore the effects of the ultrasound operational mode and the attenuation of biological tissue on the estimation of the corresponding Nakagami statistical parameter (m parameter). In vitro measurements were performed on healthy and fibrotic porcine livers using different single-element ultrasound transducers and incident tone-burst duty cycles ranging respectively from 3.5 to 7.5 MHz and from 10 to 50%. Results demonstrated that the estimated m parameter tends to be sensitively affected by the ultrasound operational mode as well as by tissue attenuation. Healthy and pathological tissues may be characterized quantitatively by the m parameter under fixed measurement conditions and proper calibration.
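The Nakagami m parameter of an envelope signal R is commonly estimated with the inverse-normalized-variance (moments) estimator, m = (E[R²])² / Var(R²). The sketch below is a generic illustration of that estimator on synthetic data, not the authors' processing chain; m ≈ 1 corresponds to Rayleigh statistics, with m < 1 (pre-Rayleigh) and m > 1 (post-Rayleigh) indicating different scatterer concentrations and arrangements.

```python
import numpy as np

def nakagami_m(envelope):
    # Moments-based estimator: m = (E[R^2])^2 / Var(R^2)
    r2 = np.asarray(envelope, dtype=float) ** 2
    return r2.mean() ** 2 / r2.var()

# Sanity check on a synthetic Rayleigh envelope, for which m should be ~1
rng = np.random.default_rng(2)
rayleigh = rng.rayleigh(scale=1.0, size=200_000)
print(round(nakagami_m(rayleigh), 2))
```

Effects like those studied in the abstract (operational mode, attenuation) enter through the envelope samples themselves, which is why the estimator must be applied under fixed, calibrated measurement conditions for tissue comparison.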

Keywords: ultrasound backscattering, statistical analysis, operational mode, attenuation

Procedia PDF Downloads 323