Search results for: quantification accuracy

4164 A Review of Kinematics and Joint Load Forces in Total Knee Replacements Influencing Surgical Outcomes

Authors: Samira K. Al-Nasser, Siamak Noroozi, Roya Haratian, Adrian Harvey

Abstract:

A total knee replacement (TKR) is a surgical procedure necessary when there is severe pain and/or loss of function in the knee. Surgeons balance the load in the knee and the surrounding soft tissue by feeling the tension at different ranges of motion. This method can be unreliable and lead to early failure of the joint. The ideal kinematics and load distribution have been debated extensively in previous biomechanical studies of both TKRs and normal knees. Intraoperative sensors like VERASENSE and eLibra provide a method for quantifying the loads that indicate a balanced knee. A review of the literature on intraoperative sensors and the tension/stability of the knee was conducted. Studies still debate the quantification of the load in the medial and lateral compartments specifically; however, most research reported that following a TKR the medial compartment was loaded more heavily than the lateral compartment. In several cases, this loading pattern was shown to increase the success of the surgery because it mimics the normal kinematics of the knee. In conclusion, most research agrees that an intercompartmental load differential of between 10 and 20 pounds, with the medial load higher than the lateral, and an absolute load of less than 70 pounds is ideal. However, further intraoperative sensor development could help improve the accuracy and the understanding of how load distribution affects surgical outcomes in a TKR. A reduction in early revision surgeries for TKRs would provide an improved quality of life for patients and reduce the economic burden placed on both the National Health Service (NHS) and the patient.
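
A minimal sketch of how the balance criteria summarised in this review could be checked against intraoperative sensor readings; the thresholds (10-20 lb medial-lateral differential, absolute loads below 70 lb) come from the abstract, while the function and variable names are purely illustrative, not any cited sensor's API.

```python
# Hypothetical check of the reviewed balance criteria against sensor readings.
def is_balanced(medial_lb: float, lateral_lb: float) -> bool:
    """Return True if intraoperative sensor loads meet the reviewed criteria."""
    differential = medial_lb - lateral_lb          # medial should exceed lateral
    within_differential = 10.0 <= differential <= 20.0
    within_absolute = medial_lb < 70.0 and lateral_lb < 70.0
    return within_differential and within_absolute

print(is_balanced(45.0, 30.0))  # True: 15 lb differential, both loads < 70 lb
print(is_balanced(80.0, 60.0))  # False: absolute load too high
```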

Keywords: intraoperative sensors, joint load forces, kinematics, load balancing, and total knee replacement

Procedia PDF Downloads 136
4163 Quantification of Learned Non-Use of the Upper-Limb After a Stroke

Authors: K. K. A. Bakhti, D. Mottet, J. Froger, I. Laffont

Abstract:

Background: After a cerebrovascular accident (or stroke), many patients use excessive trunk movements to move their paretic hand towards a target (while the elbow is kept flexed), even though they can use the upper limb when the trunk is restrained. This phenomenon is labelled learned non-use and is known to be detrimental to neuroplasticity and recovery. Objective: The aim of this study is to quantify learned non-use of the paretic upper limb during a hand-reaching task using 3D movement analysis. Methods: Thirty-four participants post supratentorial stroke were asked to reach a cone placed in front of them at 80% of their arm length. The reaching movement was repeated 5 times with the paretic hand, and then 5 times with the less-impaired hand. This sequence was first performed with the trunk free, then with the trunk restrained. Learned non-use of the upper limb (LNUUL) was obtained from the difference in the amount of trunk compensation between the free-trunk condition and the restrained-trunk condition. Results: LNUUL was significantly higher for the paretic hand, with individual values ranging from 1% to 43% and one-half of the patients showing an LNUUL higher than 15%. Conclusions: Quantification of LNUUL can be used to objectively diagnose patients who need trunk rehabilitation. It can also be used for monitoring rehabilitation progress. Quantification of LNUUL may guide upper-limb rehabilitation towards more optimal motor recovery, avoiding maladaptive trunk compensation and its consequences for neuroplasticity.
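
A minimal sketch of the LNUUL computation as described: the difference in trunk compensation between the free-trunk and restrained-trunk conditions. The use of peak trunk displacement as the compensation measure and the normalisation by the free-trunk value are assumptions made so the score lands on the 0-100% scale reported for patients.

```python
import numpy as np

def trunk_compensation(trunk_positions: np.ndarray) -> float:
    """Peak trunk displacement (m) over one reach; input shape (frames, 3)."""
    return float(np.max(np.linalg.norm(trunk_positions - trunk_positions[0], axis=1)))

def lnuul(free_trials, restrained_trials) -> float:
    """LNUUL as the free-minus-restrained compensation, in % of the free value."""
    free = np.mean([trunk_compensation(t) for t in free_trials])
    restrained = np.mean([trunk_compensation(t) for t in restrained_trials])
    return 100.0 * (free - restrained) / free   # avoidable trunk compensation
```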

Keywords: learned non-use, rehabilitation, stroke, upper limb

Procedia PDF Downloads 238
4162 The Comparison and Optimization of the Analytic Method for Canthaxanthin, Food Colorants

Authors: Hee-Jae Suh, Kyung-Su Kim, Min-Ji Kim, Yeon-Seong Jeong, Ok-Hwan Lee, Jae-Wook Shin, Hyang-Sook Chun, Chan Lee

Abstract:

Canthaxanthin is a keto-carotenoid produced from beta-carotene, and it has been approved for use as a food coloring agent in many countries. Canthaxanthin has been analyzed using High Performance Liquid Chromatography (HPLC) systems with various pretreatment methods. Four official methods for the verification of canthaxanthin, from the FSA (UK), AOAC (US), EFSA (EU) and MHLW (Japan), were compared in order to improve the analytical and pretreatment methods. The linearity, limit of detection (LOD), limit of quantification (LOQ), accuracy, precision and recovery ratio were determined for each method with modifications in the pretreatment method. All HPLC methods exhibited correlation coefficients of the calibration curves for canthaxanthin of 0.9999. The analysis methods from the FSA, AOAC, and MHLW showed LODs of 0.395 ppm, 0.105 ppm, and 0.084 ppm, and LOQs of 1.196 ppm, 0.318 ppm, and 0.254 ppm, respectively. Among the tested methods, the HPLC method of the MHLW, with modifications in the pretreatment, was finally selected for the analysis of canthaxanthin in the laboratory, because it exhibited a resolution factor of 4.0 and a selectivity of 1.30. This analysis method showed a correlation coefficient of 0.9999 and the lowest LOD and LOQ. Furthermore, the precision value was lower than 1 and the accuracy was almost 100%. The method presented a recovery ratio of 90-110% with the modified pretreatment method. In cross-validation among the three institutions tested in Korea, the coefficient of variation was 5 or less.
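
The LOD and LOQ figures quoted above can be reproduced in form (not in value) with the standard ICH calibration-curve estimates, LOD = 3.3 σ/S and LOQ = 10 σ/S, where σ is the residual standard deviation of the fit and S its slope; a sketch with made-up placeholder data:

```python
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])           # ppm, hypothetical standards
area = np.array([51.0, 99.5, 201.2, 498.8, 1001.5])   # detector response

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                         # residual SD of the fit

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.3f} ppm, LOQ = {loq:.3f} ppm")
```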

Keywords: analytic method, canthaxanthin, food colorants, pretreatment method

Procedia PDF Downloads 683
4161 Quantification of Soft Tissue Artefacts Using Motion Capture Data and Ultrasound Depth Measurements

Authors: Azadeh Rouhandeh, Chris Joslin, Zhen Qu, Yuu Ono

Abstract:

The centre of rotation of the hip joint is needed for an accurate simulation of joint performance in many applications, such as pre-operative planning simulation, human gait analysis, and the study of hip joint disorders. In human movement analysis, the hip joint centre can be estimated using a functional method based on the relative motion of the femur to the pelvis, measured using reflective markers attached to the skin surface. The principal source of error in estimating the hip joint centre location with functional methods is soft tissue artefact, the relative motion between the markers and the bone. One of the main objectives in human movement analysis is therefore the assessment of soft tissue artefact, as the accuracy of functional methods depends upon it. Various studies have measured soft tissue artefact invasively, using intra-cortical pins, external fixators, percutaneous skeletal trackers, or Roentgen photogrammetry. The goal of this study is to present a non-invasive method to assess the displacements of the markers relative to the underlying bone, using optical motion capture data and tissue thickness from ultrasound measurements, during flexion, extension, and abduction of the hip joint (all with the knee extended). Results show that the marker displacements due to the artefact are non-linear and larger in areas closer to the hip joint. Marker displacements also depend on the movement type and are relatively larger in abduction. The quantification of soft tissue artefacts can be used as a basis for a correction procedure for hip joint kinematics.
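
A much-simplified sketch of the non-invasive idea: ultrasound supplies the soft tissue thickness under each marker, an estimated bone point is obtained by pushing the marker inward along the skin-surface normal, and the artefact is taken as the frame-to-frame variation of the marker-to-bone offset. The normals, the data, and this whole decomposition are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np

def bone_points(markers, normals, thickness):
    """markers, normals: (frames, 3); thickness: (frames,), same length units."""
    return markers - normals * thickness[:, None]

def artefact_displacement(markers, normals, thickness):
    """Artefact magnitude per frame, relative to the first frame."""
    bone = bone_points(markers, normals, thickness)
    offset = markers - bone                     # marker-to-bone vector per frame
    drift = offset - offset[0]                  # change relative to first frame
    return np.linalg.norm(drift, axis=1)
```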

Keywords: hip joint center, motion capture, soft tissue artefact, ultrasound depth measurement

Procedia PDF Downloads 281
4160 HPTLC Based Qualitative and Quantitative Evaluation of Uraria picta Desv: A Dashmool Species

Authors: Hari O. Saxena, Ganesh

Abstract:

In the present investigation, chemical fingerprints of methanolic extracts of the roots, stem and leaves of Uraria picta were developed using the HPTLC technique. These fingerprints will be useful for authentication as well as for differentiating the species from adulterants. They will also serve as a biochemical marker for this valuable species in the pharmaceutical industry and in plant systematic studies. The roots, stem and leaves of Uraria picta were further evaluated for quantification of the active ingredient lupeol, in order to find alternatives to the roots. Results showed a higher lupeol content in the stem (0.048%, dry wt.) than in the roots (0.017%, dry wt.), suggesting that the stem can be utilized in place of the roots. This would avoid uprooting this prestigious plant, which ultimately will promote its conservation.

Keywords: chemical fingerprints, lupeol, quantification, Uraria picta

Procedia PDF Downloads 257
4159 The Network Relative Model Accuracy (NeRMA) Score: A Method to Quantify the Accuracy of Prediction Models in a Concurrent External Validation

Authors: Carl van Walraven, Meltem Tuna

Abstract:

Background: Network meta-analysis (NMA) quantifies the relative efficacy of three or more interventions from studies containing a subgroup of interventions. This study applied the analytical approach of NMA to quantify the relative accuracy of prediction models with distinct inclusion criteria that are evaluated on a common population (‘concurrent external validation’). Methods: We simulated binary events in 5000 patients using a known risk function. We biased the risk function and modified its precision by pre-specified amounts to create 15 prediction models with varying accuracy and distinct patient applicability. Prediction model accuracy was measured using the Scaled Brier Score (SBS). Overall prediction model accuracy was measured using fixed-effects methods that accounted for model applicability patterns. Prediction model accuracy was summarized as the Network Relative Model Accuracy (NeRMA) Score, which ranges from -∞ through 0 (the accuracy of random guessing) to 1 (the accuracy of the most accurate model in the concurrent external validation). Results: The unbiased prediction model had the highest SBS. The NeRMA score correctly ranked all simulated prediction models by the extent of bias from the known risk function. A SAS macro and an R function were created to implement the NeRMA Score. Conclusions: The NeRMA Score makes it possible to quantify the accuracy of binomial prediction models having distinct inclusion criteria in a concurrent external validation.
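
The Scaled Brier Score used here as the accuracy measure can be sketched as SBS = 1 - BS/BS_ref, where BS is the mean squared error of the predicted probabilities and BS_ref is the Brier score of always predicting the observed event rate; the simulated data below are placeholders in the spirit of the study's set-up, not its actual simulation.

```python
import numpy as np

def scaled_brier_score(y_true: np.ndarray, p_pred: np.ndarray) -> float:
    """SBS: 0 for an uninformative model, 1 for perfect prediction."""
    brier = np.mean((p_pred - y_true) ** 2)
    base_rate = y_true.mean()
    brier_ref = np.mean((base_rate - y_true) ** 2)
    return 1.0 - brier / brier_ref

rng = np.random.default_rng(0)
p_true = rng.uniform(0.05, 0.6, size=5000)       # known risk function
y = rng.binomial(1, p_true)                      # simulated binary events
print(scaled_brier_score(y, p_true))             # close to the best achievable
```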

Keywords: prediction model accuracy, scaled brier score, fixed effects methods, concurrent external validation

Procedia PDF Downloads 236
4158 Investigation of Complexity Dynamics in a DC Glow Discharge Magnetized Plasma Using Recurrence Quantification Analysis

Authors: Vramori Mitra, Bornali Sarma, Arun K. Sarma

Abstract:

Recurrence is a ubiquitous feature of any real dynamical system: the states on the phase space trajectory of a system have an inherent tendency to return to the same state, or to a close one, after a certain time lapse. The recurrence quantification analysis (RQA) technique, based on this fundamental feature of dynamical systems, detects the evolution of the state under variation of a control parameter of the system. This paper presents an investigation of the nonlinear dynamical behavior of plasma floating potential fluctuations, obtained using a Langmuir probe in different magnetic fields under variation of the discharge voltage. The main RQA measures considered are determinism (DET), the longest diagonal line (Lmax), and entropy. An increase in the DET and Lmax variables indicates that the predictability and periodicity of the system are increasing. The Lmax variable indicates that chaoticity diminishes as the magnetic field drops, while an increasing magnetic field enhances the chaotic behavior. The fractal properties of the plasma time series, estimated by the detrended fluctuation analysis (DFA) technique, show that the long-range correlation of the plasma fluctuations decreases, and the fractal dimension increases, with increasing magnetic field, which corroborates the RQA results.
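
A minimal numpy sketch of the RQA measures named above (DET, Lmax, and the entropy of diagonal line lengths), computed from a scalar time series; the unembedded recurrence matrix, the threshold, and the signal are illustrative simplifications, not the paper's analysis settings.

```python
import numpy as np

def recurrence_matrix(x, eps):
    d = np.abs(x[:, None] - x[None, :])        # pairwise distances (no embedding)
    return (d < eps).astype(int)

def diagonal_line_lengths(R):
    n = len(R)
    lengths = []
    for k in range(-(n - 1), n):
        if k == 0:
            continue                           # skip the line of identity, as usual in RQA
        run = 0
        for v in np.diagonal(R, offset=k):
            if v:
                run += 1
            elif run:
                lengths.append(run)
                run = 0
        if run:
            lengths.append(run)
    return np.array(lengths)

x = np.sin(np.linspace(0, 20 * np.pi, 500))    # placeholder "fluctuation" signal
L = diagonal_line_lengths(recurrence_matrix(x, eps=0.1))

det = L[L >= 2].sum() / L.sum()                # determinism (lines of length >= 2)
lmax = L.max()                                 # longest diagonal line
counts = np.bincount(L[L >= 2])
p = counts[counts > 0] / counts[counts > 0].sum()
entropy = -(p * np.log(p)).sum()               # Shannon entropy of line lengths
print(f"DET = {det:.3f}, Lmax = {lmax}, ENT = {entropy:.3f}")
```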

Keywords: detrended fluctuation analysis, chaos, phase space, recurrence

Procedia PDF Downloads 328
4157 Video-Based System for Support of Robot-Enhanced Gait Rehabilitation of Stroke Patients

Authors: Matjaž Divjak, Simon Zelič, Aleš Holobar

Abstract:

We present a dedicated video-based monitoring system for quantification of a patient’s attention to visual feedback during robot-assisted gait rehabilitation. Two different approaches to eye gaze and head pose tracking are tested and compared. Several metrics for assessment of the patient’s attention are also presented. Experimental results with healthy volunteers demonstrate that unobtrusive video-based gaze tracking during robot-assisted gait rehabilitation is possible and is sufficiently robust for quantification of the patient’s attention and assessment of compliance with the rehabilitation therapy.

Keywords: video-based attention monitoring, gaze estimation, stroke rehabilitation, user compliance

Procedia PDF Downloads 426
4156 Does Sustainability Disclosure Improve Analysts’ Forecast Accuracy? Evidence from European Banks

Authors: Albert Acheampong, Tamer Elshandidy

Abstract:

We investigate the extent to which sustainability disclosure from the narrative section of European banks’ annual reports improves analyst forecast accuracy. We capture sustainability disclosure using a machine learning approach and use forecast error to proxy analyst forecast accuracy. Our results suggest that sustainability disclosure significantly improves analyst forecast accuracy by reducing forecast error. In further analysis, we also find that the introduction of Directive 2014/95/EU is associated with increased disclosure content, which in turn reduces forecast error. Collectively, our results suggest that sustainability disclosure improves forecast accuracy, and that the introduction of the new EU directive strengthens this improvement. These results hold after several further and robustness analyses. Our findings have implications for market participants and policymakers.
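
An illustrative computation of the forecast-error proxy: one common convention scales the absolute gap between actual and consensus-forecast EPS by share price, though the paper's exact scaling is not stated in the abstract. Data and column names are hypothetical.

```python
import pandas as pd

df = pd.DataFrame({
    "eps_actual":   [2.10, 1.05, 0.80],
    "eps_forecast": [2.00, 1.20, 0.70],
    "price":        [40.0, 25.0, 16.0],
    "disclosure_score": [0.62, 0.31, 0.45],   # e.g. from a text-mining model
})
df["forecast_error"] = (df["eps_actual"] - df["eps_forecast"]).abs() / df["price"]
# Lower forecast_error means higher accuracy; the paper relates it to disclosure.
print(df[["disclosure_score", "forecast_error"]])
```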

Keywords: sustainability disclosure, machine learning, analyst forecast accuracy, forecast error, European banks, EU directive

Procedia PDF Downloads 76
4155 Contributing to Accuracy of Bid Cost Estimate in Construction Projects

Authors: Abdullah Alhomidan

Abstract:

This study was conducted to identify the main factors affecting the accuracy of pre-tender cost estimates in building construction projects in Saudi Arabia from the owners’ perspective. Forty-four factors affecting pre-tender cost estimates were identified through a literature review and discussions with construction experts. The results show that the top factors affecting pre-tender cost estimate accuracy are: the level of competition in tendering, material price changes, communication with suppliers, communication with the client, and the estimating method used.

Keywords: cost estimate, accuracy, pre-tender, estimating, bid estimate

Procedia PDF Downloads 556
4154 Microstructural Investigation and Fatigue Damage Quantification of Anisotropic Behavior in AA2017 Aluminum Alloy under Cyclic Loading

Authors: Abdelghani May

Abstract:

This paper reports on experimental investigations concerning the underlying reasons for the anisotropic behavior observed during the cyclic loading of AA2017 aluminum alloy. Initially, we quantified the evolution of fatigue damage resulting from controlled proportional cyclic loadings along the axial and shear directions. Our primary objective at this stage was to verify the anisotropic mechanical behavior recently observed. To accomplish this, we utilized various models of fatigue damage quantification and conducted a comparative study of the obtained results. Our analysis confirmed the anisotropic nature of the material under investigation. In the subsequent step, we performed microstructural investigations aimed at understanding the origins of the anisotropic mechanical behavior. To this end, we utilized scanning electron microscopy to examine the phases and precipitates in both the transversal and longitudinal sections. Our findings indicate that the structure and morphology of these entities are responsible for the anisotropic behavior observed in the aluminum alloy. Furthermore, results obtained from Kikuchi diagrams, pole figures, and inverse pole figures have corroborated these conclusions. These findings demonstrate significant differences in the crystallographic texture of the material.

Keywords: microstructural investigation, fatigue damage quantification, anisotropic behavior, AA2017 aluminum alloy, cyclic loading, crystallographic texture, scanning electron microscopy

Procedia PDF Downloads 76
4153 Fast Prototyping of Precise, Flexible, Multiplexed, Printed Electrochemical Enzyme-Linked Immunosorbent Assay System for Point-of-Care Biomarker Quantification

Authors: Zahrasadat Hosseini, Jie Yuan

Abstract:

Point-of-care (POC) diagnostic devices based on lab-on-a-chip (LOC) technology have the potential to revolutionize medical diagnostics. However, the development of an ideal microfluidic system based on LOC technology for diagnostics purposes requires overcoming several obstacles, such as improving sensitivity, selectivity, portability, cost-effectiveness, and prototyping methods. While numerous studies have introduced technologies and systems that advance these criteria, existing systems still have limitations. Electrochemical enzyme-linked immunosorbent assay (e-ELISA) in a LOC device offers numerous advantages, including enhanced sensitivity, decreased turnaround time, minimized sample and analyte consumption, reduced cost, disposability, and suitability for miniaturization, integration, and multiplexing. In this study, we present a novel design and fabrication method for a microfluidic diagnostic platform that integrates screen-printed electrochemical carbon/silver chloride electrodes on flexible printed circuit boards with flexible, multilayer, polydimethylsiloxane (PDMS) microfluidic networks to accurately manipulate and pre-immobilize analytes for performing electrochemical enzyme-linked immunosorbent assay (e-ELISA) for multiplexed quantification of blood serum biomarkers. We further demonstrate fast, cost-effective prototyping, as well as accurate and reliable detection performance of this device for quantification of interleukin-6-spiked samples through electrochemical analytics methods. We anticipate that our invention represents a significant step towards the development of user-friendly, portable, medical-grade, POC diagnostic devices.

Keywords: lab-on-a-chip, point-of-care diagnostics, electrochemical ELISA, biomarker quantification, fast prototyping

Procedia PDF Downloads 83
4151 Loss Quantification of Archaeological Sites in a Watershed Due to the Use and Occupation of Land

Authors: Elissandro Voigt Beier, Cristiano Poleto

Abstract:

The main objective of this research is to assess losses, through the quantification of material culture (archaeological fragments), in rural areas at sites explored economically by machinery on seasonal and permanent crops, in a hydrographic subsystem of the Camaquã River in the state of Rio Grande do Sul, Brazil. The study area consists of different micro-basins that differ in area, ranging between 10,000 m² (the largest) and 1,000 m² (the smallest), all with a large number of occurrences and outcrop locations of archaeological material at high density in an intensively farmed environment. The first stage of the research aimed to identify the dispersion of archaeological material through a field survey, plotting points with the Global Positioning System (GPS) within each river basin; a concise bibliography on the topic in the region assisted, theoretically, in understanding the former landscape and the occupation preferences of ancient historical peoples through their settlements, relating these to what was observed in the field. The mapping was followed by cartographic development of the region, with land elevation products that contributed to understanding the distribution of the materials, to defining the extent of the dispersed material, and to mapping the turnover of in situ material caused by mechanization; density maps of the materials found were also prepared, linking natural environments conducive to ancient historical occupation with the current human occupation. The third stage of the project concerned the systematic collection of archaeological material without alteration of, or interference with, the subsurface of the indigenous settlements; the material was then prepared and treated in the laboratory to remove excess soil, cleaned following the methodology described previously, measured, and quantified. Approximately 15,000 archaeological fragments were identified, belonging to different periods of the ancient history of the region, all collected outside their environmental and historical context, which has itself been considerably altered and modified. The material was identified and catalogued considering features such as object weight, size, and type of material (lithic, ceramic, bone, historical porcelain, and their association with ancient history), while principles such as the individual lithology and the functionality of each object were disregarded. As preliminary results, we can point to the displacement of materials by heavy mechanization and the consequent soil disturbance processes, which transport archaeological materials. As a next step, an estimate of potential losses will be sought through a mathematical model. It is expected that this process will yield a reliable, high-accuracy model that can be applied to archaeological sites of lower density without significant error.

Keywords: degradation of heritage, quantification in archaeology, watershed, use and occupation of land

Procedia PDF Downloads 277
4150 Standardization of a Methodology for Quantification of Antimicrobials Used for the Treatment of Multi-Resistant Bacteria Using Two Types of Biosensors and Production of Anti-Antimicrobial Antibodies

Authors: Garzon V., Bustos R., Salvador J. P., Marco M. P., Pinacho D. G.

Abstract:

Bacterial resistance to antimicrobial treatment has increased significantly in recent years, making it a public health problem. Large numbers of bacteria are resistant to all or nearly all known antimicrobials, creating the need for the development of new types of antimicrobials or the use of “last line” antimicrobial drug therapies for the treatment of multi-resistant bacteria. Some of the chemical groups of antimicrobials most used in the clinic for the treatment of infections caused by multiresistant bacteria are the glycopeptides (vancomycin), polymyxins (colistin), lipopeptides (daptomycin) and carbapenems (meropenem), all molecules that require therapeutic drug monitoring (TDM). For this reason, a nanobiotechnology-based methodology using optical and electrochemical biosensors is being developed which allows the plasma levels of antimicrobials such as glycopeptides, polymyxins, lipopeptides and carbapenems to be evaluated quickly, at low cost, and with high specificity and sensitivity, and which could in the future be implemented in public and private hospitals. The project was divided into four steps: i) design of specific anti-drug antibodies for each type of antimicrobial, produced in rabbits, with the results evaluated by immunoassay analysis (ELISA); ii) quantification of the reference antimicrobials by means of an electrochemical biosensor allowing high sensitivity and selectivity; iii) comparison of this antimicrobial quantification with an optical biosensor; iv) validation of the biosensor methodologies by means of an immunoassay. It was found that it is possible to quantify the antibiotics with both the optical and the electrochemical biosensor at concentrations averaging 1,000 ng/mL, with the antibodies being sensitive and specific for each of the antibiotic molecules; these results were compared with immunoassays and HPLC. This work thus contributes to the safe use of these drugs, which are commonly employed in clinical practice, and of new antimicrobial drugs.
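
A hedged sketch of the calibration step implied by such immunoassay-style quantification: a four-parameter logistic (4PL) curve, the standard shape for ELISA and biosensor calibration, fitted and then inverted to read sample concentrations off the curve. All concentrations and signals are invented placeholders, not the project's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """a: signal at zero dose, d: signal floor, c: inflection (EC50), b: slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

conc = np.array([10.0, 100.0, 500.0, 1000.0, 5000.0, 20000.0])  # ng/mL
signal = np.array([1.95, 1.80, 1.40, 1.00, 0.45, 0.20])         # e.g. absorbance

popt, _ = curve_fit(four_pl, conc, signal, p0=[2.0, 1.0, 1000.0, 0.1])

def concentration(sig, a, b, c, d):
    """Invert the 4PL to read a sample concentration from its signal."""
    return c * ((a - d) / (sig - d) - 1.0) ** (1.0 / b)

print(concentration(1.0, *popt))  # roughly 1000 ng/mL for the mid-curve signal
```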

Keywords: antibiotics, electrochemical biosensor, optical biosensor, therapeutic drug monitoring

Procedia PDF Downloads 83
4149 Satellite Image Classification Using Firefly Algorithm

Authors: Paramjit Kaur, Harish Kundra

Abstract:

In recent years, the swarm-intelligence-based firefly algorithm has become a major focus for researchers solving real-time optimization problems. Here, the firefly algorithm is applied to satellite image classification. For experimentation, the Alwar area is considered, containing multiple land features such as vegetation, barren, hilly, residential, and water surface. The Alwar dataset consists of seven-band satellite images. The firefly algorithm is based on the attraction of less bright fireflies towards brighter ones. For the evaluation of the proposed concept, accuracy assessment parameters are calculated using an error matrix: the kappa coefficient, the overall accuracy, and the feature-wise user’s and producer’s accuracies. The overall results are compared with BBO, PSO, hybrid FPAB/BBO, hybrid ACO/SOFM, and hybrid ACO/BBO on the basis of the kappa coefficient and overall accuracy.
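
The accuracy-assessment parameters named above follow directly from the error matrix; a sketch with a made-up 3x3 matrix (rows as reference classes, columns as classified classes):

```python
import numpy as np

E = np.array([[50,  3,  2],     # rows: reference classes
              [ 4, 45,  6],     # columns: classified classes
              [ 1,  5, 40]])

n = E.sum()
overall = np.trace(E) / n                       # overall accuracy
producers = np.diag(E) / E.sum(axis=1)          # producer's accuracy (omission)
users = np.diag(E) / E.sum(axis=0)              # user's accuracy (commission)
expected = (E.sum(axis=1) * E.sum(axis=0)).sum() / n**2
kappa = (overall - expected) / (1 - expected)   # chance-corrected agreement

print(f"overall = {overall:.3f}, kappa = {kappa:.3f}")
print("producer's:", producers.round(3), "user's:", users.round(3))
```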

Keywords: image classification, firefly algorithm, satellite image classification, terrain classification

Procedia PDF Downloads 401
4148 Cracks Detection and Measurement Using VLP-16 LiDAR and Intel Depth Camera D435 in Real-Time

Authors: Xinwen Zhu, Xingguang Li, Sun Yi

Abstract:

Cracks are among the most common forms of damage in buildings, bridges, roads, and other structures, and may pose safety hazards. They occur in structures of various materials, and the traditional methods of manual detection and measurement, which are known to be subjective, time-consuming, and labor-intensive, are gradually becoming unable to meet the needs of modern development. In addition, crack detection and measurement must be performed safely, considering space limitations and danger, so intelligent crack detection has become a necessary line of research. In this paper, an efficient method for crack detection and quantification using a 3D sensor, a LiDAR, and a depth camera is proposed. This method works even in a dark environment, which is usual in real-world applications. The LiDAR rapidly spins to scan the surrounding environment, firing its lasers thousands of times per second and providing a rich 3D point cloud in real time with quite accurate depth information: the distance to each point can be determined to within about ±3 cm, and top-of-the-range models can see beyond 100 m. This accuracy is, however, still too coarse for some high-precision structures and materials, so to make the crack depth much more accurate, a depth camera is needed; the cracks are scanned by the depth camera at the same time. Finally, all data from the LiDAR and the depth camera are analyzed, and the size of the cracks can be quantified successfully. The comparison shows that the minimum and mean absolute percentage errors between the measured and calculated widths are about 2.22% and 6.27%, respectively. The experiments and results are presented in this paper.
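
A small sketch of the reported error metric, the absolute percentage error between measured and calculated crack widths, summarised by its minimum and mean; the widths below are hypothetical values in millimetres, not the paper's measurements.

```python
import numpy as np

measured = np.array([3.2, 5.1, 7.8, 2.4, 10.3])    # ground-truth widths (mm)
calculated = np.array([3.3, 4.8, 7.5, 2.5, 10.9])  # sensor-derived widths (mm)

ape = np.abs(calculated - measured) / measured * 100.0
print(f"min APE = {ape.min():.2f}%, mean APE = {ape.mean():.2f}%")
```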

Keywords: LiDAR, depth camera, real-time, detection and measurement

Procedia PDF Downloads 224
4147 Comparative Study of Accuracy of Land Cover/Land Use Mapping Using Medium Resolution Satellite Imagery: A Case Study

Authors: M. C. Paliwal, A. K. Jain, S. K. Katiyar

Abstract:

Accuracy assessment is very important in the classification of satellite imagery. In order to determine the accuracy of a classified image, assumed-true data are usually derived from ground truth data collected with the Global Positioning System. The data from the satellite imagery and the ground truth are then compared to find the accuracy of the data, and error matrices are prepared; overall and individual accuracies are calculated using different methods. The study illustrates advanced classification and accuracy assessment of land use/land cover mapping using satellite imagery. IRS-1C LISS-IV data were used for the classification. The satellite image was classified using the software into fourteen classes, including water bodies, agricultural fields, forest land, urban settlement, barren land and unclassified area. Classification of the satellite imagery and calculation of accuracy were done using ERDAS Imagine software in order to find the best method. This study is based on data collected for the Bhopal city boundaries in the Madhya Pradesh State of India.

Keywords: resolution, accuracy assessment, land use mapping, satellite imagery, ground truth data, error matrices

Procedia PDF Downloads 508
4146 Skull Extraction for Quantification of Brain Volume in Magnetic Resonance Imaging of Multiple Sclerosis Patients

Authors: Marcela De Oliveira, Marina P. Da Silva, Fernando C. G. Da Rocha, Jorge M. Santos, Jaime S. Cardoso, Paulo N. Lisboa-Filho

Abstract:

Multiple Sclerosis (MS) is an immune-mediated disease of the central nervous system characterized by neurodegeneration, inflammation, demyelination, and axonal loss. Magnetic resonance imaging (MRI), due to the richness of the information it provides, is the gold standard exam for the diagnosis and follow-up of neurodegenerative diseases such as MS. Brain atrophy, the gradual loss of brain volume, is quite extensive in multiple sclerosis, nearly 0.5-1.35% per year, far beyond the limits of normal aging. Thus, brain volume quantification becomes an essential task for future analysis of the occurrence of atrophy. The analysis of MRI has become a tedious and complex task for clinicians, who have to manually extract important information; this manual analysis is prone to errors and time-consuming due to intra- and inter-operator variability. Nowadays, computerized methods for MRI segmentation are extensively used to assist doctors in quantitative analyses for disease diagnosis and monitoring. The purpose of this work was therefore to evaluate brain volume in MRI scans of MS patients. We used MRI scans with 30 slices from five patients diagnosed with multiple sclerosis according to the McDonald criteria. The computational analysis of the images was carried out in two steps: segmentation of the brain and brain volume quantification. The first image processing step was to perform brain extraction by skull stripping of the original image: the algorithm registers a grayscale atlas image to the grayscale patient image, and the associated brain mask is propagated using the registration transformation. This mask is then eroded and used for a refined brain extraction based on level sets (the edge of the brain-skull border, with dedicated expansion, curvature, and advection terms). In the second step, brain volume quantification was performed by counting the voxels belonging to the segmentation mask and converting the count to cubic centimetres (cc). We observed an average brain volume of 1469.5 cc. We conclude that the automatic method applied in this work can be used for the brain extraction process and for brain volume quantification in MRI. The development and use of such computer programs can help health professionals in the diagnosis and monitoring of patients with neurodegenerative diseases. In future work, we expect to implement more automated methods for the assessment of cerebral atrophy and the quantification of brain lesions, including machine-learning approaches. Acknowledgements: This work was supported by a grant from the Brazilian agency Fundação de Amparo à Pesquisa do Estado de São Paulo (number 2019/16362-5).
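
The volume-quantification step described (counting mask voxels and converting to cc) reduces to a few lines; a sketch assuming a NIfTI segmentation mask and the nibabel library, with a placeholder file name:

```python
import nibabel as nib

mask_img = nib.load("brain_mask.nii.gz")        # binary segmentation mask (placeholder path)
mask = mask_img.get_fdata() > 0

dx, dy, dz = mask_img.header.get_zooms()[:3]    # voxel size in mm
voxel_volume_mm3 = dx * dy * dz

volume_cc = mask.sum() * voxel_volume_mm3 / 1000.0  # 1 cc = 1000 mm^3
print(f"brain volume = {volume_cc:.1f} cc")
```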

Keywords: brain volume, magnetic resonance imaging, multiple sclerosis, skull stripper

Procedia PDF Downloads 146
4145 Extraction and Quantification of Triclosan in Wastewater Samples Using Molecularly Imprinted Membrane Adsorbent

Authors: Siyabonga Aubrey Mhlongo, Linda Lunga Sibali, Phumlane Selby Mdluli, Peter Papoh Ndibewu, Kholofelo Clifford Malematja

Abstract:

This paper reports on the successful extraction and quantification of triclosan (C₁₂H₇Cl₃O₂), an antibacterial and antifungal agent present in some consumer products, generally found in wastewater or effluents, using a molecularly imprinted membrane adsorbent (MIMs), followed by quantification and removal on a high-performance liquid chromatography (HPLC) system. Triclosan is present in consumer products such as toothpaste, soaps, detergents, toys, and surgical cleaning treatments. The MIMs was fabricated using polyvinylidene fluoride (PVDF) polymer with selective micro-composite particles known as molecularly imprinted polymers (MIPs), via a phase inversion by immersion precipitation technique. This resulted in improved hydrophilicity and mechanical behaviour of the membranes. Wastewater samples were collected from the central effluent treatment plant of the Umbogintwini Industrial Complex (UIC; south coast of Durban, KwaZulu-Natal, South Africa) and pre-treated before analysis. Experimental parameters such as sample size, contact time, and stirring speed were optimised. The resultant MIMs had a TCS adsorption efficiency of 97%, compared with 92% and 88% for the non-imprinted membranes (NIMs) and the bare membrane, respectively. The analytical method utilised in this study had limits of detection (LoD) and quantification (LoQ) of 0.22 and 0.71 µg L-1 in wastewater effluent, respectively. The percentage recovery for the effluent samples was 68%. The detection of TCS was monitored for 10 consecutive days; the highest TCS concentration detected in the treated wastewater was 55.0 μg/L, on day 9 of the monitored days, while the lowest detected was 6.0 μg/L. Since the analyte concentrations found in the effluent samples were not very diverse, this study suggests that MIMs could be a strong potential adsorbent for the continued development and progress of membrane technology and the environmental sciences, with capabilities extending to desalination.

Keywords: molecularly imprinted membrane, triclosan, phase inversion, wastewater

Procedia PDF Downloads 124
4144 Development and Total Error Concept Validation of Common Analytical Method for Quantification of All Residual Solvents Present in Amino Acids by Gas Chromatography-Head Space

Authors: A. Ramachandra Reddy, V. Murugan, Prema Kumari

Abstract:

Residual solvents in pharmaceutical samples are monitored using gas chromatography with headspace (GC-HS). Based on current regulatory and compendial requirements, measuring residual solvents is mandatory in all release testing of active pharmaceutical ingredients (APIs). Generally, isopropyl alcohol is used as the residual solvent in proline and tryptophan; methanol in cysteine monohydrate hydrochloride, glycine, methionine and serine; ethanol in glycine and lysine monohydrate; and acetic acid in methionine. In order to determine these residual solvents (isopropyl alcohol, ethanol, methanol and acetic acid) in all seven of these amino acids with a single method, a sensitive and simple gas chromatography headspace method with flame ionization detection was developed. During development, poor reproducibility, retention time variation and bad peak shape were identified for the acetic acid peaks, due to the reaction of acetic acid with the stationary phase of the column (cyanopropyl dimethyl polysiloxane) and the dissociation of acetic acid in water (when used as diluent) while applying the temperature gradient. Therefore, dimethyl sulfoxide was used as the diluent to avoid these issues; most published methods for acetic acid quantification by GC-HS instead use a derivatisation technique to protect the acetic acid. As per the compendia, a risk-based approach was selected as appropriate to determine the degree and extent of the validation process and to assure the fitness of the procedure; the total error concept was therefore selected to validate the analytical procedure. An accuracy profile of ±40% was selected for the lower level (quantitation limit level) and ±30% for the other levels, with a 95% confidence interval (5% risk profile). The method was developed using a DB-WAXetr column manufactured by Agilent (internal diameter 530 µm, film thickness 2.0 µm, length 30 m). A constant flow of 6.0 mL/min of helium in constant make-up mode was selected as the carrier gas. The present method is simple, rapid, and accurate, and is suitable for the rapid analysis of isopropyl alcohol, ethanol, methanol and acetic acid in amino acids. The range of the method is 50 ppm to 200 ppm for isopropyl alcohol, 50 ppm to 3000 ppm for ethanol, 50 ppm to 400 ppm for methanol, and 100 ppm to 400 ppm for acetic acid, which covers the specification limits given in the European Pharmacopoeia. The accuracy profile and risk profile generated as part of the validation were found to be satisfactory; therefore, this method can be used for testing residual solvents in amino acid drug substances.
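
A simplified sketch of the total-error accuracy-profile idea used for this validation: at each concentration level, bias and precision are combined into an expectation tolerance interval and checked against the acceptance limits (±40% at the quantitation limit level, ±30% elsewhere, per the abstract). The mean ± t·SD interval below is a simplification that ignores variance-component decomposition, and the replicate data are invented.

```python
import numpy as np
from scipy import stats

def accuracy_profile(nominal, replicates, limit_pct):
    """Relative bias, 95% expectation tolerance interval, and pass/fail flag."""
    recovery = 100.0 * np.asarray(replicates) / nominal
    bias = recovery.mean() - 100.0
    sd = recovery.std(ddof=1)
    k = stats.t.ppf(0.975, len(recovery) - 1)
    low, high = bias - k * sd, bias + k * sd
    ok = (low >= -limit_pct) and (high <= limit_pct)
    return bias, (low, high), ok

# e.g. methanol at the 50 ppm quantitation level, six hypothetical replicates
print(accuracy_profile(50.0, [47.5, 52.0, 49.1, 50.8, 46.9, 51.5], limit_pct=40))
```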

Keywords: amino acid, head space, gas chromatography, total error

Procedia PDF Downloads 148
4143 Satellite LiDAR-Based Digital Terrain Model Correction Using Gaussian Process Regression

Authors: Keisuke Takahata, Hiroshi Suetsugu

Abstract:

Forest height is an important parameter for forest biomass estimation, and precise elevation data are essential for accurate forest height estimation. There are several globally or nationally available digital elevation models (DEMs), such as SRTM and ASTER, but their accuracy is reported to be low, particularly in mountainous areas with closed canopy or steep slopes. Recently, space-borne LiDAR missions, such as the Global Ecosystem Dynamics Investigation (GEDI), have started to provide sparse but accurate ground elevation and canopy height estimates. Several studies have reported a high degree of accuracy in these elevation products on their exact footprints, but it is not clear how this sparse information can be used over wider areas. In this study, we developed a digital terrain model correction algorithm that spatially interpolates the difference between existing DEMs and GEDI elevation products using a Gaussian process (GP) regression model. The results show that our GP-based methodology can reduce the mean bias of the elevation data from 3.7 m to 0.3 m when airborne LiDAR-derived elevation information is used as ground truth. Our algorithm is also capable of quantifying the uncertainty of the elevation data, which is a critical requirement for biomass inventory. Upcoming satellite LiDAR missions, like MOLI (Multi-footprint Observation Lidar and Imager), are expected to contribute to more accurate digital terrain model generation.
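
A hedged sketch of the correction idea: learn the spatial field of differences (GEDI ground elevation minus DEM elevation) at the sparse footprints with a GP, then predict that difference, with uncertainty, anywhere on the DEM grid. Coordinates, kernel choices, and values are synthetic placeholders, and scikit-learn stands in for the authors' implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
xy = rng.uniform(0, 10_000, size=(200, 2))          # footprint coordinates (m)
diff = 3.7 + np.sin(xy[:, 0] / 2_000) + 0.3 * rng.standard_normal(200)

kernel = 1.0 * RBF(length_scale=1_000.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(xy, diff)

grid = np.stack(np.meshgrid(np.linspace(0, 10_000, 50),
                            np.linspace(0, 10_000, 50)), axis=-1).reshape(-1, 2)
correction, sigma = gp.predict(grid, return_std=True)
# corrected_dem = dem_on_grid + correction; sigma quantifies the uncertainty
print(correction.mean(), sigma.mean())
```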

Keywords: digital terrain model, satellite LiDAR, gaussian processes, uncertainty quantification

Procedia PDF Downloads 183
4142 Lipschitz Classifiers Ensembles: Usage for Classification of Target Events in C-OTDR Monitoring Systems

Authors: Andrey V. Timofeev

Abstract:

This paper introduces an original method for guaranteed estimation of the accuracy of an ensemble of Lipschitz classifiers. The solution is obtained as a finite closed set of alternative hypotheses which contains the object of classification with a probability not less than a specified value; thus, the classification is represented by a set of hypothetical classes. In this case, the smaller the cardinality of the discrete set of hypothetical classes, the higher the classification accuracy. Experiments have shown that if the cardinality of the classifiers ensemble is increased, then the cardinality of this set of hypothetical classes is reduced. The problem of guaranteed estimation of the accuracy of an ensemble of Lipschitz classifiers is relevant to the multichannel classification of target events in C-OTDR monitoring systems. Results of the practical use of the suggested approach for accuracy control in C-OTDR monitoring systems are presented.

Keywords: Lipschitz classifiers, confidence set, C-OTDR monitoring, classifiers accuracy, classifiers ensemble

Procedia PDF Downloads 492
4141 A TgCNN-Based Surrogate Model for Subsurface Oil-Water Phase Flow under Multi-Well Conditions

Authors: Jian Li

Abstract:

The uncertainty quantification and inversion problems of subsurface oil-water phase flow usually require extensive repeated forward calculations for new runs with changed conditions. To reduce the computational time, various forms of surrogate models have been built. Related research shows that deep learning has emerged as an effective surrogate modelling approach, yet most deep learning surrogate models are purely data-driven, which often leads to poor robustness and abnormal results. To make the model more consistent with physical laws, a coupled theory-guided convolutional neural network (TgCNN) based surrogate model is built to improve computational efficiency while maintaining satisfactory accuracy. The model is a convolutional neural network based on multi-well reservoir simulation. The core notion of the proposed method is to bridge two separate blocks on top of an overall network; they underlie the TgCNN model in a coupled form, which reflects the coupled nature of pressure and water saturation in the two-phase flow equations. The model is driven not only by labeled data but also by scientific theories, including governing equations, stochastic parameterization, boundary and initial conditions, well conditions, and expert knowledge. The results show that the TgCNN-based surrogate model exhibits satisfactory accuracy and efficiency in subsurface oil-water phase flow under multi-well conditions.
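
A hedged sketch of the theory-guided training idea: the loss combines a data-mismatch term with a physics residual evaluated by automatic differentiation. A generic 1D pressure-diffusion equation stands in for the paper's coupled two-phase flow system, so everything here is illustrative rather than the authors' TgCNN.

```python
import torch

net = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)

def pde_residual(xt):
    """Residual of a stand-in governing equation p_t = p_xx at points (x, t)."""
    xt = xt.requires_grad_(True)
    p = net(xt)
    grads = torch.autograd.grad(p.sum(), xt, create_graph=True)[0]
    p_x, p_t = grads[:, 0:1], grads[:, 1:2]
    p_xx = torch.autograd.grad(p_x.sum(), xt, create_graph=True)[0][:, 0:1]
    return p_t - p_xx

xt_data = torch.rand(100, 2)                 # labeled samples (placeholders)
p_data = torch.rand(100, 1)
xt_coll = torch.rand(500, 2)                 # collocation points for the physics term

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(1000):
    opt.zero_grad()
    data_loss = torch.mean((net(xt_data) - p_data) ** 2)
    physics_loss = torch.mean(pde_residual(xt_coll) ** 2)
    loss = data_loss + physics_loss          # theory-guided: data + governing equation
    loss.backward()
    opt.step()
```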

Keywords: coupled theory-guided convolutional neural network, multi-well conditions, surrogate model, subsurface oil-water phase

Procedia PDF Downloads 86
4140 Two-Dimensional Modeling of Spent Nuclear Fuel Using FLUENT

Authors: Imane Khalil, Quinn Pratt

Abstract:

In a nuclear reactor, an array of fuel rods containing stacked uranium dioxide pellets clad with zircalloy is the heat source for a thermodynamic cycle of energy conversion from heat to electricity. After fuel is used in a nuclear reactor, the assemblies are stored underwater in a spent nuclear fuel pool at the nuclear power plant while heat generation and radioactive decay rates decrease, before the fuel is placed in packages for dry storage or transportation. A computational model of a Boiling Water Reactor spent fuel assembly is developed using FLUENT, the computational fluid dynamics package. Heat transfer simulations were performed on the two-dimensional 9x9 spent fuel assembly to predict the maximum cladding temperature for different inputs to the FLUENT model. Uncertainty quantification is used to predict the heat transfer and the maximum temperature profile inside the assembly.

Keywords: spent nuclear fuel, conduction, heat transfer, uncertainty quantification

Procedia PDF Downloads 220
4139 Real-Time Episodic Memory Construction for Optimal Action Selection in Cognitive Robotics

Authors: Deon de Jager, Yahya Zweiri, Dimitrios Makris

Abstract:

The three most important components in the cognitive architecture for cognitive robotics are memory representation, memory recall, and action selection performed by the executive. In this paper, action selection, performed by the executive, is defined as a memory quantification and optimization process. The methodology describes the real-time construction of episodic memory through semantic memory optimization. The optimization is performed by set-based particle swarm optimization, using an adaptive entropy memory quantification approach for fitness evaluation. The performance of the approach is evaluated experimentally by simulation, where a UAV is tasked with the collection and delivery of a medical package. The experiments show that the UAV dynamically uses the episodic memory to autonomously control its velocity while successfully completing its mission.

Keywords: cognitive robotics, semantic memory, episodic memory, maximum entropy principle, particle swarm optimization

Procedia PDF Downloads 156
4138 Development and Validation of High-Performance Liquid Chromatography Method for the Determination and Pharmacokinetic Study of Linagliptin in Rat Plasma

Authors: Hoda Mahgoub, Abeer Hanafy

Abstract:

Linagliptin (LNG) belongs to the dipeptidyl peptidase-4 (DPP-4) inhibitor class; DPP-4 inhibitors represent a new therapeutic approach to the treatment of type 2 diabetes in adults. The aim of this work was to develop and validate an accurate and reproducible HPLC method for the determination of LNG in rat plasma with high sensitivity. The method involved separation of both LNG and pindolol (the internal standard) at ambient temperature on a Zorbax Eclipse XDB C18 column, with a mobile phase composed of 75% methanol : 25% 0.1% formic acid (pH 4.1) at a flow rate of 1.0 mL.min-1. UV detection was performed at 254 nm. The method was validated in compliance with ICH guidelines and found to be linear in the range of 5-1000 ng.mL-1. The limit of quantification (LOQ) was found to be 5 ng.mL-1 based on 100 µL of plasma. The variations for intra- and inter-assay precision were less than 10%, and the accuracy values ranged between 93.3% and 102.5%. The extraction recovery (R%) was more than 83%. The method involved a single extraction step from a very small plasma volume (100 µL). The assay was successfully applied to an in-vivo pharmacokinetic study of LNG in rats administered a single oral dose of 10 mg.kg-1 LNG. The maximum concentration (Cmax) was found to be 927.5 ± 23.9 ng.mL-1, and the area under the plasma concentration-time curve (AUC0-72) was 18285.02 ± 605.76 h.ng.mL-1. In conclusion, the good accuracy and low LOQ of the bioanalytical HPLC method made it suitable for monitoring the full pharmacokinetic profile of LNG in rats. The main advantages of the method are its sensitivity, small sample volume, single-step extraction procedure, and short analysis time.
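
A minimal sketch of the two pharmacokinetic summaries reported: Cmax read directly off the concentration-time profile, and AUC(0-72) from the linear trapezoidal rule. The times and concentrations below are invented, not the study data.

```python
import numpy as np

t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 24.0, 48.0, 72.0])            # h
c = np.array([0.0, 410.0, 780.0, 920.0, 850.0, 600.0, 250.0, 80.0, 20.0]) # ng/mL

cmax = c.max()
tmax = t[c.argmax()]
auc_0_72 = np.trapz(c, t)            # linear trapezoidal rule, h*ng/mL

print(f"Cmax = {cmax} ng/mL at Tmax = {tmax} h, AUC(0-72) = {auc_0_72:.1f}")
```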

Keywords: HPLC, linagliptin, pharmacokinetic study, rat plasma

Procedia PDF Downloads 241
4137 Quantification of Hydrogen Sulfide and Methyl Mercaptan in Air Samples from a Waste Management Facility

Authors: R. F. Vieira, S. A. Figueiredo, O. M. Freitas, V. F. Domingues, C. Delerue-Matos

Abstract:

The presence of sulphur compounds like hydrogen sulphide and mercaptans is one of the reasons why wastewater treatment and waste management are associated with odour emissions. In this context, having a quantification method for these compounds helps in the optimization of treatments aimed at their elimination, namely biofiltration processes. The aim of this study was the development of a method for quantification of odorous gases in air samples from waste treatment plants. A method based on headspace solid-phase microextraction (HS-SPME) coupled with gas chromatography - flame photometric detection (GC-FPD) was used to analyse H2S and methyl mercaptan (MM). The extraction was carried out with a 75-μm Carboxen-polydimethylsiloxane fiber coating at 22 ºC for 20 min, and the analysis was performed on a GC 2010 Plus A from Shimadzu with a sulphur filter detector: splitless mode (0.3 min), with a column temperature program from 60 ºC, increased at 15 ºC/min to 100 ºC (2 min). The injector temperature was held at 250 ºC, and the detector at 260 ºC. For the calibration curve, a gas dilution unit (digital Hovagas G2 Multi-Component Gas Mixer) was used to prepare the standards. This unit had two input connections, one for a stream of the gas to be diluted and another for a stream of nitrogen, and an output connected to a glass bulb. A 40 ppm H2S cylinder and a 50 ppm MM cylinder were used. The equipment was programmed to the selected concentration and automatically carried out the dilution into the glass bulb. The mixture was left flowing through the glass bulb for 5 min, and then the extremities were closed. This method allowed calibration between 1-20 ppm for H2S, and between 0.02-0.1 ppm and 1-3.5 ppm for MM. Several quantifications of air samples from the inlet and outlet of a biofilter operating in a waste management facility in the north of Portugal allowed the evaluation of the biofilter's performance.

Keywords: biofiltration, hydrogen sulphide, mercaptans, quantification

Procedia PDF Downloads 476
4136 Voxel Models as Input for Heat Transfer Simulations with Siemens NX Based on X-Ray Microtomography Images of Random Fibre Reinforced Composites

Authors: Steven Latré, Frederik Desplentere, Ilya Straumit, Stepan V. Lomov

Abstract:

A method is proposed in order to create a three-dimensional finite element model representing fibre reinforced insulation materials for the simulation software Siemens NX. VoxTex software, a tool for quantification of µCT images of fibrous materials, is used for the transformation of microtomography images of random fibre reinforced composites into finite element models. An automatic tool was developed to execute the import of the models to the thermal solver module of Siemens NX. The paper describes the numerical tools used for the image quantification and the transformation and illustrates them on several thermal simulations of fibre reinforced insulation blankets filled with low thermal conductive fillers. The calculation of thermal conductivity is validated by comparison with the experimental data.

Keywords: analysis, modelling, thermal, voxel

Procedia PDF Downloads 287
4135 On the Added Value of Probabilistic Forecasts Applied to the Optimal Scheduling of a PV Power Plant with Batteries in French Guiana

Authors: Rafael Alvarenga, Hubert Herbaux, Laurent Linguet

Abstract:

The uncertainty concerning the power production of intermittent renewable energy sources is one of the main barriers to the integration of such assets into the power grid. Efforts have thus been made to develop methods to quantify this uncertainty, allowing producers to make more reliable and profitable engagements related to their future power delivery. Even though a diversity of probabilistic approaches has been proposed in the literature with promising results, the added value of adopting such methods for scheduling intermittent power plants is still unclear. In this study, the profits obtained by a decision-making model used to optimally schedule an existing PV power plant connected to batteries are compared when the model is fed with deterministic and probabilistic forecasts generated with two of the most recent methods proposed in the literature. Moreover, deterministic forecasts with different accuracy levels were used in the experiments, testing the utility of probabilistic methods and their capability to model progressively increasing uncertainty. Even though probabilistic approaches have unquestionably advanced in the recent literature, the results of a case study show that deterministic forecasts still provide the best performance if accurate, ensuring a gain of 14% in final profits compared to the average performance of probabilistic models conditioned on the same forecasts. As the accuracy of the deterministic forecasts progressively decreases, probabilistic approaches become competitive options, until they completely outperform deterministic forecasts when the latter are very inaccurate, generating 73% more profit in the case considered.

Keywords: PV power forecasting, uncertainty quantification, optimal scheduling, power systems

Procedia PDF Downloads 87