Search results for: software-based validation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1345

835 Mapping of Urban Micro-Climate in Lyon (France) by Integrating Complementary Predictors at Different Scales into Multiple Linear Regression Models

Authors: Lucille Alonso, Florent Renard

Abstract:

The characterization of the urban heat island (UHI) and of its interactions with climate change and urban climates is a major research and public health issue, given the increasing urbanization of the population. Addressing it requires better knowledge of the UHI and of the micro-climate in urban areas, obtained by combining measurements and modelling. This study contributes to this topic by evaluating microclimatic conditions in dense urban areas of the Lyon Metropolitan Area (France), combining data traditionally used, such as topography, with LiDAR (Light Detection And Ranging) data, Landsat 8 and Sentinel satellite observations, and ground measurements collected by bicycle. These bicycle-based weather measurements form the database of the variable to be modelled, the air temperature, over Lyon’s hyper-center. This study aims to model the air temperature, measured during 6 mobile campaigns in Lyon in clear weather, using multiple linear regressions based on 33 explanatory variables. These variables fall into various categories, such as meteorological parameters from remote sensing, topographic variables, vegetation indices, the presence of water, humidity, bare soil, buildings, radiation, urban morphology, or proximity and density to various land uses (water surfaces, vegetation, bare soil, etc.). The acquisition sources are multiple: the Landsat 8 and Sentinel satellites, LiDAR point clouds, and cartographic products downloaded from the Greater Lyon open data platform. For the presence of low, medium, and high vegetation, buildings, and bare ground, several buffer radii around the measurement points were tested (5, 10, 20, 25, 50, 100, 200 and 500 m). The buffers showing the best linear correlations with air temperature are 5 m for bare ground and for low and medium vegetation, 50 m for buildings, and 100 m for high vegetation. The explanatory model of the dependent variable is obtained by multiple linear regression of the remaining explanatory variables (retained after screening with a Pearson correlation matrix, |r| < 0.7, and a variance inflation factor, VIF < 5), using a stepwise selection algorithm. Moreover, holdout cross-validation (80% training, 20% testing) is performed because of its ability to detect over-fitting, even though multiple regression provides internal validation and randomization. Multiple linear regression explained, on average, 72% of the variance for the study days, with an average RMSE of only 0.20°C. Surface temperature is the most important variable in the estimation of air temperature. Other recurrent variables include distance to subway stations, distance to water areas, NDVI, the digital elevation model, the sky view factor, average vegetation density, and building density. Changing urban morphology influences the city's thermal patterns. The thermal environment of dense urban areas can only be analysed at the micro-scale, in order to account for the local impact of trees, streets, and buildings. There is currently no sufficiently dense network of fixed weather stations in central Lyon or in most major urban areas. It is therefore necessary to use mobile measurements, followed by modelling, to characterize the city's multiple thermal environments.
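
As a rough illustration of the workflow above (correlation and VIF screening, forward stepwise selection, 80/20 holdout validation), the following Python sketch uses synthetic data; the predictor names, thresholds, and model settings are assumptions for illustration, not the study's actual variables or code.

```python
# Sketch: Pearson/VIF screening, forward stepwise selection, 80/20 holdout.
# Synthetic stand-in data; predictor names are placeholders (e.g. LST, NDVI, SVF).
import numpy as np
import pandas as pd
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(300, 8)),
                 columns=[f"predictor_{i}" for i in range(8)])
y = 20 + 2 * X["predictor_0"] - 1.5 * X["predictor_3"] + rng.normal(0, 0.2, 300)

def screen(X: pd.DataFrame, r_max=0.7, vif_max=5.0) -> list:
    """Drop predictors that are mutually correlated (|r| >= r_max) or have VIF >= vif_max."""
    keep = list(X.columns)
    corr = X.corr().abs()
    for col in list(keep):
        others = [c for c in keep if c != col]
        if others and corr.loc[col, others].max() >= r_max:
            keep.remove(col)
    while len(keep) > 1:
        vifs = [variance_inflation_factor(X[keep].values, i) for i in range(len(keep))]
        if max(vifs) < vif_max:
            break
        keep.pop(int(np.argmax(vifs)))
    return keep

X = X[screen(X)]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.8, random_state=0)
sfs = SequentialFeatureSelector(LinearRegression(), direction="forward",
                                n_features_to_select=3).fit(X_tr, y_tr)
cols = X_tr.columns[sfs.get_support()]
model = LinearRegression().fit(X_tr[cols], y_tr)
pred = model.predict(X_te[cols])
print("R2:", round(r2_score(y_te, pred), 3),
      "RMSE:", round(mean_squared_error(y_te, pred) ** 0.5, 3))
```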

Keywords: air temperature, LIDAR, multiple linear regression, surface temperature, urban heat island

Procedia PDF Downloads 127
834 From Intuitive to Constructive Audit Risk Assessment: A Complementary Approach to CAATTs Adoption

Authors: Alon Cohen, Jeffrey Kantor, Shalom Levy

Abstract:

The use of the audit risk model in auditing has faced limitations and difficulties, leading auditors to apply it only at a conceptual level. The qualitative approach to assessing risks has resulted in divergent risk assessments, affecting the quality of audits and decision-making on the adoption of CAATTs. This study aims to investigate risk factors impacting the implementation of the audit risk model and to propose a complementary risk-based instrument of key risk indicators (KRIs) to form substantive risk judgments and mitigate the heightened risk of material misstatement (RMM). The study addresses the question of how risk factors impact the implementation of the audit risk model, improve risk judgments, and aid in the adoption of CAATTs. The study uses a three-stage scale development procedure involving a pretest and a subsequent study with two independent samples. The pretest involves an exploratory factor analysis, while the subsequent study employs confirmatory factor analysis for construct validation. Additionally, the authors test the ability of the KRIs to predict the audit effort needed to mitigate heightened RMM. Data were collected through two independent samples involving 767 participants. The collected data were analyzed using exploratory factor analysis and confirmatory factor analysis to assess scale validity and construct validation. The suggested KRIs, comprising two risk components and seventeen risk items, are found to have high predictive power in determining the audit effort needed to reduce RMM. The study validates the suggested KRIs as an effective instrument for risk assessment and for decision-making on the adoption of CAATTs. This study contributes to the existing literature by implementing a holistic approach to risk assessment and providing a quantitative expression of assessed risks. It bridges the gap between intuitive risk evaluation and the theoretical domain, clarifying the mechanism of risk assessments. It also helps improve the uniformity and quality of risk assessments, aiding audit standard-setters in issuing updated guidelines on CAATT adoption. A few limitations and recommendations for future research should be mentioned. First, the scale was developed in the Israeli auditing market, which follows the International Standards on Auditing (ISAs). Although ISAs are adopted in European countries, for greater generalization, future studies could focus on other countries that adopt additional or local auditing standards. Second, this study revealed risk factors that have a material impact on the assessed risk. However, there could be additional risk factors that influence the assessment of the RMM. Therefore, future research could investigate other risk segments, such as operational and financial risks, to bring broader generalizability to our results. Third, although the sample size in this study fits accepted scale development procedures and enables drawing conclusions from the body of research, future research may develop standardized measures based on larger samples to reduce the generation of equivocal results and suggest an extended risk model.

Keywords: audit risk model, audit efforts, CAATTs adoption, key risk indicators, sustainability

Procedia PDF Downloads 66
833 Estimation and Validation of Free Lime Analysis of Clinker by Quantitative Phase Analysis Using X-Ray Diffraction

Authors: Suresh Palla, Kalpna Sharma, Gaurav Bhatnagar, S. K. Chaturvedi, B. N. Mohapatra

Abstract:

Determining the content of free lime is especially important for judging the reactivity of the raw materials and the quality of the clinker. The free lime limit is not the same for all cements; it depends on several factors, especially the temperature reached during burning and the grain size distribution of the cement after grinding. Estimation of free lime by the conventional method is influenced by the presence of portlandite and can misrepresent the actual free lime content of the clinker under quality-control conditions. To verify whether the product meets the standard specifications, a reliable, precise, and highly reproducible way to quantify the relative phase abundances in Portland cement clinker and Portland cements is X-ray diffraction (XRD) combined with the Rietveld method. In the present study, a methodology using XRD was proposed to validate the free lime results obtained by the conventional method. The XRD and TG/DTA results confirm the presence of portlandite in the clinker, supporting the decision on the free lime results obtained by the conventional method.

Keywords: free lime, quantitative phase analysis, conventional method, X-ray diffraction

Procedia PDF Downloads 126
832 An Ultrasonic Signal Processing System for Tomographic Imaging of Reinforced Concrete Structures

Authors: Edwin Forero-Garcia, Jaime Vitola, Brayan Cardenas, Johan Casagua

Abstract:

This research article presents the integration of electronic and computer systems into an ultrasonic signal processing system that captures, conditions, and digitizes ultrasonic signals and then processes and visualizes them. Signal capture and conditioning were carried out by an analog electronic system designed and implemented in three stages: 1. impedance coupling; 2. analog filtering; 3. signal amplification. After conditioning, the ultrasonic signal was digitized by a microcontroller for subsequent processing. Digital processing of the signals was carried out in MATLAB to produce A-scan, B-scan, and D-scan ultrasonic images. Advanced processing was then performed using the synthetic aperture focusing technique (SAFT) to improve the resolution of the B-scan images. The resulting ultrasonic images were displayed in a user interface developed in .NET with Visual Studio. For system validation, ultrasonic signals were acquired from reinforced concrete structures, allowing non-invasive inspection and identification of the existing pathologies in them.
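
The SAFT step mentioned above is essentially a delay-and-sum reconstruction. The following Python sketch shows the idea on synthetic B-scan data; the geometry, wave speed, and array names are assumed for illustration and do not reproduce the authors' MATLAB implementation.

```python
# Sketch: synthetic aperture focusing technique (SAFT) by delay-and-sum.
# bscan[k, t] holds the A-scan recorded at transducer position x_pos[k].
import numpy as np

def saft(bscan, x_pos, fs, c, x_img, z_img):
    """Focus a B-scan onto a (z, x) image grid by summing delayed A-scans."""
    n_samples = bscan.shape[1]
    image = np.zeros((z_img.size, x_img.size))
    for i, z in enumerate(z_img):
        for j, x in enumerate(x_img):
            # two-way travel time from each transducer position to pixel (x, z)
            dist = np.sqrt((x_pos - x) ** 2 + z ** 2)
            idx = np.round(2.0 * dist / c * fs).astype(int)
            valid = idx < n_samples
            image[i, j] = bscan[valid, idx[valid]].sum()
    return image

# Purely illustrative synthetic data: one point reflector at (0.05 m, 0.03 m)
fs, c = 50e6, 5900.0                       # sampling rate [Hz], wave speed [m/s]
x_pos = np.linspace(0, 0.1, 32)            # 32 transducer positions [m]
t = np.arange(2000) / fs
bscan = np.zeros((x_pos.size, t.size))
for k, xk in enumerate(x_pos):
    tof = 2 * np.sqrt((xk - 0.05) ** 2 + 0.03 ** 2) / c
    bscan[k, int(tof * fs)] = 1.0          # ideal echo from the reflector
img = saft(bscan, x_pos, fs, c, np.linspace(0, 0.1, 64), np.linspace(0.01, 0.05, 64))
print(img.shape)
```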

Keywords: acquisition, signal processing, ultrasound, SAFT, HMI

Procedia PDF Downloads 96
831 Rapid, Direct, Real-Time Method for Bacteria Detection on Surfaces

Authors: Evgenia Iakovleva, Juha Koivisto, Pasi Karppinen, J. Inkinen, Mikko Alava

Abstract:

Preventing the spread of infectious diseases throughout the world is one of the most important tasks of modern health care. Infectious diseases not only account for one fifth of deaths worldwide, but also cause many pathological complications for human health. Touch surfaces are an important vector for the spread of infections by various microorganisms, including antimicrobial-resistant organisms. Further, antimicrobial resistance is the response of bacteria to the overuse or inappropriate use of antibiotics. The biggest challenges in bacterial detection with existing methods are indirect determination, long analysis times, sample preparation, the use of chemicals and expensive equipment, and the need for qualified specialists. Therefore, high-performance, rapid, real-time detection is demanded for practical bacterial detection and for controlling the epidemiological hazard. Among the known methods for determining bacteria on surfaces, hyperspectral methods can be used as direct and rapid methods for microorganism detection on different kinds of surfaces based on fluorescence, without sampling, sample preparation, or chemicals. The aim of this study was to assess the relevance of such systems for remote sensing of surfaces for microorganism detection, to prevent a global spread of infectious diseases. Bacillus subtilis and Escherichia coli at different concentrations (from 0 to 10⁸ cells/100 µL) were detected with a hyperspectral camera using different filters, to visualize bacteria and background spots on a steel plate. A method of internal standards was applied to monitor the correctness of the analysis results. The distances from the sample to the hyperspectral camera and to the light source are 25 cm and 40 cm, respectively. Each sample is optically imaged from the surface by the hyperspectral imaging system, utilizing a JAI CM-140GE-UV camera. The light source is a BeamZ FLATPAR DMX Tri-light with 3 W tri-colour LEDs (red, blue and green); light colors are changed through a DMX USB Pro interface. The developed system was calibrated following a standard procedure of setting the exposure and was focused for light at λ = 525 nm. The filter is a Thorlabs Kurios hyperspectral filter controller with wavelengths from 420 to 720 nm. All data collection, pre-processing, and multivariate analysis were performed using LabVIEW and Python software. Bacterial stains, both visible and invisible to the human eye, clustered apart from the reference steel material in cluster analysis using different light sources and filter wavelengths. The calculation of the random and systematic errors of the analysis results proved the applicability of the method in real conditions. Validation experiments were carried out with photometry and ATP swab tests. The lower detection limit of the developed method is several orders of magnitude lower than for both validation methods. All parameters of the experiments were the same, except for the light. The hyperspectral imaging method allows the separation not only of bacteria and surfaces, but also of different types of bacteria, such as Gram-negative Escherichia coli and Gram-positive Bacillus subtilis. The developed method allows skipping sample preparation and the use of chemicals, unlike all other microbiological methods. The analysis time with the novel hyperspectral system is a few seconds, which is innovative in the field of microbiological testing.
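
As a hedged illustration of how hyperspectral pixels can be clustered to separate bacterial spots from the steel background, the sketch below applies k-means to per-pixel spectra; the cube dimensions and the choice of k-means are assumptions, not the exact chemometric pipeline used in the study.

```python
# Sketch: clustering hyperspectral pixels to separate bacterial spots from the
# steel background. Cube shape and the use of k-means are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# cube[y, x, band]: image stack over spectral bands (e.g. 420-720 nm)
cube = np.random.rand(100, 120, 31)            # placeholder data
h, w, bands = cube.shape
spectra = cube.reshape(-1, bands)              # one spectrum per pixel

spectra = StandardScaler().fit_transform(spectra)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(spectra)
label_map = labels.reshape(h, w)               # cluster id per pixel

# Pixels falling outside the dominant (background) cluster are candidate stains.
background = np.bincount(labels).argmax()
candidate_mask = label_map != background
print("candidate stain pixels:", int(candidate_mask.sum()))
```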

Keywords: Escherichia coli, Bacillus subtilis, hyperspectral imaging, microorganisms detection

Procedia PDF Downloads 210
830 Simultaneous Determination of p-Phenylenediamine, N-Acetyl-p-phenylenediamine and N,N-Diacetyl-p-phenylenediamine in Human Urine by LC-MS/MS

Authors: Khaled M. Mohamed

Abstract:

Background: p-Phenylenediamine (PPD) is used in the manufacture of hair dyes and skin decoration. In some developing countries, suicidal, homicidal, and accidental poisoning cases involving PPD have been recorded. In this work, a sensitive LC-MS/MS method for the determination of PPD and its metabolites N-acetyl-p-phenylenediamine (MAPPD) and N,N-diacetyl-p-phenylenediamine (DAPPD) in human urine has been developed and validated. Methods: PPD, MAPPD and DAPPD were extracted from urine with methylene chloride at alkaline pH. Acetanilide was used as the internal standard (IS). The analytes and IS were separated on an Eclipse XDB-C18 column (150 × 4.6 mm, 5 µm) using a mobile phase of acetonitrile-1% formic acid in gradient elution. Detection was performed by LC-MS/MS using electrospray positive ionization in multiple reaction monitoring mode. The transition ions m/z 109 → 92, m/z 151 → 92, m/z 193 → 92, and m/z 136 → 77 were selected for the quantification of PPD, MAPPD, DAPPD, and IS, respectively. Results: Calibration curves were linear in the range 10–2000 ng/mL for all analytes. The mean recoveries for PPD, MAPPD and DAPPD were 57.62, 74.19 and 50.99%, respectively. Intra-assay and inter-assay imprecisions were within 1.58–9.52% and 5.43–9.45%, respectively, for PPD, MAPPD and DAPPD. Inter-assay accuracies were between −7.43 and 7.36% for all compounds. PPD, MAPPD and DAPPD were stable in urine at −20 °C for 24 hours. Conclusions: The method was successfully applied to the analysis of PPD, MAPPD and DAPPD in urine samples collected from suicidal cases.
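
A minimal sketch of internal-standard calibration and back-calculation, as used in quantitative methods of this kind; the peak areas, ratios, and concentrations below are invented for illustration and are not data from this study.

```python
# Sketch: internal-standard calibration for PPD quantification by LC-MS/MS.
# All peak areas and concentrations below are invented placeholders.
import numpy as np

cal_conc = np.array([10, 50, 100, 500, 1000, 2000])    # ng/mL calibration levels
analyte_area = np.array([1.1e3, 5.4e3, 1.08e4, 5.5e4, 1.09e5, 2.2e5])
is_area = np.full_like(analyte_area, 5.0e4)             # internal standard areas

ratio = analyte_area / is_area                           # response ratio
slope, intercept = np.polyfit(cal_conc, ratio, 1)        # linear calibration
r = np.corrcoef(cal_conc, ratio)[0, 1]
print(f"calibration: ratio = {slope:.3e}*conc + {intercept:.3e}, r = {r:.4f}")

# Back-calculate an unknown urine sample from its measured peak-area ratio
unknown_ratio = 0.75
unknown_conc = (unknown_ratio - intercept) / slope
print(f"estimated PPD concentration: {unknown_conc:.1f} ng/mL")
```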

Keywords: p-Phenylenediamine, metabolites, urine, LC-MS/MS, validation

Procedia PDF Downloads 348
829 Solar Energy Generation Based Urban Development: A Case of Jodhpur City

Authors: A. Kumar, V. Devadas

Abstract:

India has the most favorable year-round sunny conditions along with the second-highest solar irradiation in the world, so the country holds the potential to become a global solar hub. Solar- and wind-based generation capacity has skyrocketed in India through the successful efforts of the Ministry of Renewable Energy, whereas the potential of rooftop-based solar power generation has yet to be explored for the proposed solar cities in India. The research aims to analyze the gap in the energy scenario of Jodhpur City and proposes interventions of solar energy generation systems as a catalyst for urban development. The research is based on the systems concept, which deals with simulating the city system as a whole and the interactions between its different subsystems. A system-dynamics-based mathematical model is developed by identifying the control parameters using regression and correlation analysis to assess the gap in the energy sector. The base model is validated using the past 10 years of time-series data collected from secondary sources. Further, projections of energy consumption and solar energy generation are made to test different scenarios and to conclude on the feasibility of maintaining city-level energy independence until 2031.

Keywords: city, consumption, energy, generation

Procedia PDF Downloads 121
828 Performance Analysis of Compression Socks Strips

Authors: Hafiz Faisal Siddique, Adnan Ahmed Mazari, Antonin Havelka

Abstract:

Compression socks are a highly recommended textile garment for exerting pressure on the lower part of the leg. The extent of compression that a patient can easily manage depends on the stage of the venous disease (limb size and shape) and on the patient's activities (mobility, age). Due to dynamic mechanical influences, the socks lose their ability to exert pressure around the leg. The main aim of this research is to investigate how the performance of compression socks deteriorates under the mechanical impacts expected during wear. These wearing impacts influence the durability parameter, i.e., tensile energy loss. To measure tensile energy loss, cut-strip samples were subjected to constant-rate loading and unloading, cyclic loading up to the 15th cycle at ±5 mm extension (representing muscle expansion and relaxation), and a dwell of 3 minutes at the 25%, 50%, and 75% extension levels. Statistical validation of the tensile energy loss was performed using correlation measures, p-values (≤ 0.05), and R-square values in MINITAB 17 software.
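
Tensile energy loss can be understood as the area enclosed between the loading and unloading force-extension curves. The sketch below computes it by trapezoidal integration on synthetic curves; the curve shapes and magnitudes are assumptions for illustration only.

```python
# Sketch: tensile energy loss of a sock strip as the area between the loading
# and unloading force-extension curves. The curves below are synthetic.
import numpy as np

def trapezoid_area(y, x):
    """Area under curve y(x) by the trapezoidal rule."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

extension = np.linspace(0.0, 5.0, 100)            # mm, up to +5 mm
force_load = 2.0 * extension ** 1.2               # N, loading branch (synthetic)
force_unload = 1.6 * extension ** 1.2             # N, unloading branch (synthetic)

energy_in = trapezoid_area(force_load, extension)     # work done on loading [N*mm]
energy_out = trapezoid_area(force_unload, extension)  # work recovered on unloading
energy_loss = energy_in - energy_out
loss_percent = 100.0 * energy_loss / energy_in
print(f"tensile energy loss: {energy_loss:.2f} N*mm ({loss_percent:.1f}% of input work)")
```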

Keywords: compression socks, loading and unloading, 15th cyclic loading, Dwell time effect

Procedia PDF Downloads 149
827 Image Encryption Using Eureqa to Generate an Automated Mathematical Key

Authors: Halima Adel Halim Shnishah, David Mulvaney

Abstract:

Applying traditional symmetric cryptography algorithms for encryption and decryption requires secret keys that remain immune to different attacks. One popular technique for generating secret keys automatically is evolutionary computing using the Eureqa API tool, which gained attention in 2013. In this paper, we generate automated secret keys for image encryption and decryption using the Eureqa API (a tool used in the evolutionary computing technique). The Eureqa API models pseudo-random input data obtained from a suitable source to generate secret keys. The validity of the generated secret keys is investigated by performing various statistical tests (histogram, chi-square, correlation of two adjacent pixels, correlation between original and encrypted images, entropy, and key sensitivity). Experimental results obtained from these methods, including histogram analysis, correlation coefficient, entropy, and key sensitivity, show that the proposed image encryption algorithms are secure and reliable, with the potential to be adapted for secure image communication applications.
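
Two of the statistical tests listed above (entropy and the correlation of adjacent pixels) can be sketched as follows; the random stand-in cipher image and the interpretation comments are illustrative assumptions, not the paper's implementation.

```python
# Sketch: Shannon entropy and horizontal adjacent-pixel correlation, two of the
# statistical validation measures named above, computed on a grayscale image.
import numpy as np

def shannon_entropy(img):
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())            # close to 8 for a good cipher image

def adjacent_pixel_correlation(img):
    x = img[:, :-1].ravel().astype(float)             # pixel
    y = img[:, 1:].ravel().astype(float)               # horizontal neighbour
    return float(np.corrcoef(x, y)[0, 1])              # close to 0 for a good cipher image

cipher = np.random.randint(0, 256, (256, 256), dtype=np.uint8)  # stand-in cipher image
print("entropy:", round(shannon_entropy(cipher), 4))
print("adjacent-pixel correlation:", round(adjacent_pixel_correlation(cipher), 4))
```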

Keywords: image encryption algorithms, Eureqa, statistical measurements, automated key generation

Procedia PDF Downloads 473
826 A Comparative Analysis on QRS Peak Detection Using BIOPAC and MATLAB Software

Authors: Chandra Mukherjee

Abstract:

The present paper describes work in the field of ECG signal analysis using the MATLAB 7.1 platform. An accurate and simple ECG feature extraction algorithm is presented, and the developed algorithm is validated using BIOPAC software. To detect the QRS peak, the ECG signal is processed in the following stages: first derivative, second derivative, and then squaring of the second derivative. The efficiency of the developed algorithm is tested on ECG samples from different databases and on real-time ECG signals acquired using the BIOPAC system. First, a lead-wise threshold value is specified; the samples above that value are marked, and the points in the original signal where these marked samples show a change of slope are identified as R-peaks. The changes of slope on the left and right sides of the R-peak are identified as the Q and S peaks, respectively. The built-in detection algorithm of the BIOPAC software is then run on the same samples, and both outputs are compared. ECG baseline modulation correction is done after detecting the characteristic points. The efficiency of the algorithm is assessed using validation parameters such as sensitivity and positive predictivity, and satisfactory values of these parameters were obtained.
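
A minimal Python sketch of the stages just described (first derivative, second derivative, squaring, thresholding, and local-maximum R-peak localization); the threshold factor, refractory window, and synthetic signal are assumptions for illustration, not the validated MATLAB implementation.

```python
# Sketch: QRS detection by derivative, squaring and thresholding. The threshold
# factor and minimum R-R distance are assumed values, not those tuned in the paper.
import numpy as np

def detect_r_peaks(ecg, fs, k=0.5):
    d1 = np.diff(ecg)                        # first derivative
    d2 = np.diff(d1)                         # second derivative
    feature = d2 ** 2                        # squaring emphasises QRS slopes
    threshold = k * feature.max()            # lead-wise threshold
    above = np.where(feature > threshold)[0]
    peaks, refractory = [], int(0.2 * fs)    # ignore peaks closer than 200 ms
    for idx in above:
        if not peaks or idx - peaks[-1] > refractory:
            # locate the R-peak as the local maximum of the raw signal nearby
            lo, hi = max(idx - refractory, 0), min(idx + refractory, ecg.size)
            peaks.append(lo + int(np.argmax(ecg[lo:hi])))
    return np.array(sorted(set(peaks)))

fs = 360.0
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 64      # crude synthetic train of sharp peaks
print("detected R-peaks (first five):", detect_r_peaks(ecg, fs)[:5])
```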

Keywords: first derivative, variable threshold, slope reversal, baseline modulation correction

Procedia PDF Downloads 402
825 Development of Anterior Lumbar Interbody Fusion (ALIF) Peek Cage Based on the Korean Lumbar Anatomical Information

Authors: Chang Soo Chon, Cheol Woong Ko, Han Sung Kim

Abstract:

The aim of this study is to develop an anterior lumbar interbody fusion (ALIF) PEEK cage suitable for Korean people. In this study, CT images were obtained from a Korean male (173 cm, 71 kg), and 3D Korean lumbar models were reconstructed based on the CT images to investigate anatomical characteristics. The major design parameters of the anterior lumbar interbody fusion (ALIF) PEEK cage were selected using the morphological measurements of the Korean lumbar models. Through finite element analysis and mechanical tests, the developed ALIF PEEK cage prototype was compared with the Fidji Cage (Zimmer Inc., USA), and it was found that the ALIF prototype showed similar and/or superior mechanical performance compared to the Fidji Cage. Clinical validation of the ALIF PEEK cage prototype was also carried out to check for foreseeable problems in surgical operations. Finally, the convenience and stability of the prototype are considered to have been clinically verified.

Keywords: inter-body anterior fusion, ALIF cage, PEEK, Korean lumbar, CT image, animal test

Procedia PDF Downloads 510
824 Model-Free Distributed Control of Dynamical Systems

Authors: Javad Khazaei, Rick Blum

Abstract:

Distributed control is an efficient and flexible approach for coordination of multi-agent systems. One of the main challenges in designing a distributed controller is identifying the governing dynamics of the dynamical systems. Data-driven system identification is currently undergoing a revolution. With the availability of high-fidelity measurements and historical data, model-free identification of dynamical systems can facilitate the control design without tedious modeling of high-dimensional and/or nonlinear systems. This paper develops a distributed control design using consensus theory for linear and nonlinear dynamical systems, based on sparse identification of system dynamics. Compared with existing consensus designs that heavily rely on knowing the detailed system dynamics, the proposed model-free design can accurately capture the dynamics of the system with available measurements and input data and provide guaranteed performance in consensus and tracking problems. Heterogeneous damped oscillators are chosen as examples of dynamical systems for validation purposes.
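
The identification step can be sketched with sequentially thresholded least squares, the core regression behind sparse identification of nonlinear dynamics; the candidate library, threshold, and damped-oscillator example below are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch: sequentially thresholded least squares (the core of sparse
# identification of dynamics) applied to data from a damped oscillator.
import numpy as np
from scipy.integrate import solve_ivp

def damped_oscillator(t, x, zeta=0.1, w=2.0):
    return [x[1], -2 * zeta * w * x[1] - w ** 2 * x[0]]

# Generate trajectory data and numerical derivatives
t = np.linspace(0, 10, 2001)
sol = solve_ivp(damped_oscillator, (0, 10), [1.0, 0.0], t_eval=t)
X = sol.y.T                                   # states [x, v]
dX = np.gradient(X, t, axis=0)                # time derivatives

# Candidate function library: [1, x, v, x^2, x*v, v^2]
Theta = np.column_stack([np.ones_like(t), X[:, 0], X[:, 1],
                         X[:, 0] ** 2, X[:, 0] * X[:, 1], X[:, 1] ** 2])

def stlsq(Theta, dX, threshold=0.05, iters=10):
    Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
    for _ in range(iters):
        Xi[np.abs(Xi) < threshold] = 0.0      # enforce sparsity
        for k in range(dX.shape[1]):          # refit only the surviving terms
            big = np.abs(Xi[:, k]) >= threshold
            if big.any():
                Xi[big, k] = np.linalg.lstsq(Theta[:, big], dX[:, k], rcond=None)[0]
    return Xi

Xi = stlsq(Theta, dX)
print(np.round(Xi, 3))   # expect dx/dt = v and dv/dt = -w^2 x - 2 zeta w v
```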

Keywords: consensus tracking, distributed control, model-free control, sparse identification of dynamical systems

Procedia PDF Downloads 256
823 Use of Structural Family Therapy and Dialectical Behavior Therapy with High-Conflict Couples

Authors: Eman Tadros, Natasha Finney

Abstract:

The following case study, involving a high-conflict couple referred by the Children’s Services Bureau (CSB), is analyzed and reviewed through an integrated lens of structural family therapy and dialectical behavior therapy. In structural family therapy, normal family development is not characterized by a lack of problems, but instead by the family having developed a functional structure for dealing with its problems. In dialectical behavior therapy, by contrast, normal family development can be characterized by a supportive and validating environment, where all family members feel a sense of acceptance and validation for who they are and where they are in life. The clinical case conceptualization highlights the importance of conceptualizing how change occurs within a therapeutic setting. In the current case study, the couple not only experienced high conflict but also faced issues of substance use, health problems, and other complicating factors. Clinicians should view their clients holistically and tailor their treatment to fit their unique needs. In this framework, change occurs within the family unit by accepting each member as they are, while at the same time working together to change maladaptive familial structures.

Keywords: couples, dialectical behavior therapy, high-conflict, structural family therapy

Procedia PDF Downloads 334
822 Comparison of Methods for Detecting and Quantifying Amplitude Modulation of Wind Farm Noise

Authors: Phuc D. Nguyen, Kristy L. Hansen, Branko Zajamsek

Abstract:

The existence of special characteristics of wind farm noise such as amplitude modulation (AM) contributes significantly to annoyance, which could ultimately result in sleep disturbance and other adverse health effects for residents living near wind farms. In order to detect and quantify this phenomenon, several methods have been developed which can be separated into three types: time-domain, frequency-domain and hybrid methods. However, due to a lack of systematic validation of these methods, it is still difficult to select the best method for identifying AM. Furthermore, previous comparisons between AM methods have been predominantly qualitative or based on synthesised signals, which are not representative of the actual noise. In this study, a comparison between methods for detecting and quantifying AM has been carried out. The results are based on analysis of real noise data which were measured at a wind farm in South Australia. In order to evaluate the performance of these methods in terms of detecting AM, an approach has been developed to select the most successful method of AM detection. This approach uses a receiver operating characteristic (ROC) curve which is based on detection of AM in audio files by experts.
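
As a hedged sketch of how detector scores and expert labels combine into a receiver operating characteristic curve, the fragment below uses randomly generated placeholders rather than wind farm recordings.

```python
# Sketch: building an ROC curve for an AM detection method against expert audio
# labels. Scores and labels here are random placeholders, not measured data.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(0)
expert_labels = rng.integers(0, 2, size=200)            # 1 = experts heard AM
detector_scores = expert_labels * 0.6 + rng.normal(0.3, 0.25, size=200)

fpr, tpr, thresholds = roc_curve(expert_labels, detector_scores)
print("area under the ROC curve:", round(auc(fpr, tpr), 3))
# A method whose curve hugs the top-left corner (AUC close to 1) agrees most
# consistently with the expert panel.
```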

Keywords: amplitude modulation, wind farm noise, ROC curve

Procedia PDF Downloads 132
821 Simulator Dynamic Positioning System with Azimuthal Thruster

Authors: Robson C. Santos, Christian N. Barreto, Gerson G. Cunha, Severino J. C. Neto

Abstract:

This paper covers the construction of a prototype azimuthal thruster, assembled from low-cost and easily accessible materials and tested in a controlled environment to measure its performance and characteristics and to assess the feasibility of future projects. It also covers the construction of dynamic positioning simulation software, responsible for simulating a vessel and repositioning it when necessary. Tests for partial and full validation of the model were conducted; the model operates independently of the control system and executes the azimuth and propeller rotation commands. The system provides a user interface, simulates unfavorable positioning conditions for a vessel, and accurately calculates the azimuth angle, the direction of rotation of the propeller, and the time for which it should be turned on so that the vessel returns to its original position. Serial communication connects the dynamic positioning simulation system with the embedded system, so that the user-generated data simulating the DP system arrives in the form of control signals at the propeller motors. This article addresses issues relevant to the marine industry.

Keywords: azimuthal thruster, dynamic positioning, embedded system, simulator dynamic positioning

Procedia PDF Downloads 454
820 The Best Prediction Data Mining Model for Breast Cancer Probability in Women Residents in Kabul

Authors: Mina Jafari, Kobra Hamraee, Saied Hossein Hosseini

Abstract:

The prediction of breast cancer is one of the challenges in medicine. In this paper, we collected 528 records of women living in Kabul, including demographic, lifestyle, diet, and pregnancy data. There are many classification algorithms for breast cancer prediction, and we tried to find the best model, with the most accurate results and the lowest error rate. We evaluated common supervised data mining algorithms to find the best model for predicting breast cancer among Afghan women living in Kabul, with the mammography result as the target variable. For evaluating these algorithms, we used cross-validation, which is a reliable method for measuring the performance of models. After comparing the error rate and accuracy of three models (Decision Tree, Naive Bayes, and Rule Induction), the Decision Tree, with an accuracy of 94.06% and an error rate of 15%, was found to be the best model for predicting breast cancer based on the health care records.
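
A minimal sketch of the model comparison described above, using a decision tree and naive Bayes with cross-validation; the synthetic features stand in for the demographic, lifestyle, diet, and pregnancy variables of the actual records.

```python
# Sketch: decision-tree vs. naive Bayes classification with 10-fold
# cross-validation. Synthetic data stands in for the 528 collected records.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=528, n_features=12, n_informative=6,
                           random_state=0)              # placeholder records

for name, model in [("Decision Tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
                    ("Naive Bayes", GaussianNB())]:
    scores = cross_val_score(model, X, y, cv=10)          # 10-fold cross-validation
    print(f"{name}: accuracy {scores.mean():.3f} +/- {scores.std():.3f}")
```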

Keywords: decision tree, breast cancer, probability, data mining

Procedia PDF Downloads 128
819 Intelligent Computing with Bayesian Regularization Artificial Neural Networks for a Nonlinear System of COVID-19 Epidemic Model for Future Generation Disease Control

Authors: Tahir Nawaz Cheema, Dumitru Baleanu, Ali Raza

Abstract:

In this research work, we design intelligent computing through Bayesian regularization artificial neural networks (BRANNs), introduced to solve the mathematical model of the infectious disease COVID-19. The dynamical transmission is due to the interaction of people, and its mathematical representation is based on a system of nonlinear differential equations. The dataset of the COVID-19 model is generated using the power of the explicit Runge-Kutta method for different countries of the world, such as India, Pakistan, Italy, and many more. The generated dataset is used for the training, testing, and validation processes, with frequent updates, in Bayesian regularization backpropagation to capture the numerical behavior of the dynamics of the COVID-19 model. The performance and effectiveness of the designed BRANN methodology are checked through mean squared error, error histograms, numerical solutions, absolute error, and regression analysis.
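
As a hedged sketch of the pipeline described above, the fragment below generates data for a simple SIR-type model with an explicit Runge-Kutta solver and fits an L2-regularized neural network; the SIR model and the L2 penalty are stand-ins for the paper's COVID-19 model and its Bayesian regularization backpropagation training.

```python
# Sketch: Runge-Kutta data generation plus a regularized neural network fit.
# The SIR model and the L2-penalized MLP are illustrative stand-ins only.
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

def sir(t, y, beta=0.3, gamma=0.1):
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

t = np.linspace(0, 160, 801)
sol = solve_ivp(sir, (0, 160), [0.99, 0.01, 0.0], t_eval=t, method="RK45")

X = t.reshape(-1, 1)                        # input: time
Y = sol.y.T                                 # targets: S, I, R fractions
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)

net = MLPRegressor(hidden_layer_sizes=(20, 20), alpha=1e-3,  # alpha = L2 penalty
                   max_iter=5000, random_state=0).fit(X_tr, Y_tr)
mse = np.mean((net.predict(X_te) - Y_te) ** 2)
print("test mean squared error:", mse)
```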

Keywords: mathematical models, bayesian regularization, bayesian-regularization backpropagation networks, regression analysis, numerical computing

Procedia PDF Downloads 133
818 Bone Fracture Detection with X-Ray Images Using Mobilenet V3 Architecture

Authors: Ashlesha Khanapure, Harsh Kashyap, Abhinav Anand, Sanjana Habib, Anupama Bidargaddi

Abstract:

Rapidly developing technologies are emerging daily in a variety of disciplines, particularly in the medical field. For the purpose of detecting bone fractures in X-ray images of different body segments, our work compares the ResNet-50 and MobileNetV3 architectures. It evaluates accuracy and computational efficiency with X-rays of the elbow, hand, and shoulder from the MURA dataset. Through training and validation, the models are evaluated on normal and fractured images. While ResNet-50 showcases superior accuracy in fracture identification, MobileNetV3 offers superior speed and resource optimization. Despite ResNet-50’s accuracy, MobileNetV3’s swifter inference makes it a viable choice for real-time clinical applications, emphasizing the importance of balancing computational efficiency and accuracy in medical imaging. We also created a graphical user interface (GUI) for bone fracture detection with the MobileNetV3 model. This research underscores MobileNetV3’s potential to streamline bone fracture diagnosis, potentially revolutionizing orthopedic medical procedures and enhancing patient care.
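
A hedged Keras sketch of a MobileNetV3-based binary classifier (fractured vs. normal) using transfer learning; the dataset directories, image size, and training settings are hypothetical placeholders, not the configuration used in this work.

```python
# Sketch: transfer learning with MobileNetV3 for binary fracture classification.
# Directory names and hyperparameters below are hypothetical placeholders.
import tensorflow as tf

IMG_SIZE = (224, 224)

base = tf.keras.applications.MobileNetV3Large(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False                                    # freeze the backbone first

inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = base(inputs, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
x = tf.keras.layers.Dropout(0.2)(x)
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)  # fracture probability
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="binary_crossentropy", metrics=["accuracy"])

# Hypothetical dataset layout with one folder per class (normal / fractured):
# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "xrays/train", image_size=IMG_SIZE, batch_size=32)
# val_ds = tf.keras.utils.image_dataset_from_directory(
#     "xrays/val", image_size=IMG_SIZE, batch_size=32)
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```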

Keywords: CNN, MobileNet V3, ResNet-50, healthcare, MURA, X-ray, fracture detection

Procedia PDF Downloads 42
817 Simulation of the Asphaltene Deposition Rate in a Wellbore Blockage via Computational Fluid Dynamic

Authors: Xiaodong Gao, Pingchuan Dong, Qichao Gao

Abstract:

Much published work has focused on asphaltene deposition on smooth pipes under steady conditions, while particle deposition in blocked wellbores under transient conditions has not been well elucidated. This work attempts to predict the deposition rate of asphaltene particles in a blocked tube through CFD simulation. The Euler-Lagrange approach has been applied to the flow of crude oil and asphaltene particles. The net gravitational force, virtual mass, pressure gradient, Saffman lift, and drag forces are incorporated in the simulation process. The CFD simulation results are validated against benchmark experiments from the previous literature. Furthermore, the effect of blockage location, blockage length, and blockage thickness on the deposition rate is also analyzed. The simulation results indicate that the maximum deposition rate of asphaltene occurs in the blocked tube section, and the greater the deposition thickness, the greater the deposition rate. Moreover, the deposition amount and the maximum deposition rate along the length of the tube follow the same trend. The results of this study help to better understand the deposition of asphaltene particles in production and to deal with asphaltene challenges.

Keywords: asphaltene deposition rate, blockage length, blockage thickness, blockage diameter, transient condition

Procedia PDF Downloads 188
816 Vulnerability of Groundwater to Pollution in Akwa Ibom State, Southern Nigeria, using the DRASTIC Model and Geographic Information System (GIS)

Authors: Aniedi A. Udo, Magnus U. Igboekwe, Rasaaq Bello, Francis D. Eyenaka, Michael C. Ohakwere-Eze

Abstract:

Groundwater vulnerability to pollution was assessed in Akwa Ibom State, Southern Nigeria, with the aim of locating areas with high potential for resource contamination, especially due to anthropogenic influence. The electrical resistivity method was utilized in the collection of the initial field data. Additional data inputs, which included depth to static water level, drilled well log data, aquifer recharge data, percentage slope, as well as soil information, were obtained from secondary sources. The initial field data were interpreted both manually and with computer modeling to provide information on the geoelectric properties of the subsurface. The interpreted results, together with the secondary data, were used to develop the DRASTIC thematic maps. A vulnerability assessment was performed using the DRASTIC model in a GIS environment, and areas with high vulnerability that need immediate attention were clearly mapped out and presented using an aquifer vulnerability map. The model was subjected to validation, and the rate of validity was 73% within the area of study.
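
The DRASTIC index itself is a weighted sum of seven rated parameters. The sketch below applies the standard DRASTIC weights to made-up ratings for a single grid cell; the ratings are not data from the Akwa Ibom study.

```python
# Sketch: the DRASTIC vulnerability index as a weighted sum of seven rated
# parameters, using the standard DRASTIC weights. Ratings below are made up.
standard_weights = {
    "Depth to water": 5, "net Recharge": 4, "Aquifer media": 3,
    "Soil media": 2, "Topography": 1, "Impact of vadose zone": 5,
    "hydraulic Conductivity": 3,
}

example_ratings = {                      # ratings are usually on a 1-10 scale
    "Depth to water": 7, "net Recharge": 6, "Aquifer media": 8,
    "Soil media": 6, "Topography": 10, "Impact of vadose zone": 8,
    "hydraulic Conductivity": 6,
}

drastic_index = sum(standard_weights[p] * example_ratings[p] for p in standard_weights)
print("DRASTIC index for this cell:", drastic_index)   # higher = more vulnerable
# In a GIS workflow the same weighted sum is applied cell-by-cell to the seven
# rated raster layers to produce the aquifer vulnerability map.
```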

Keywords: groundwater, vulnerability, DRASTIC model, pollution

Procedia PDF Downloads 200
815 Functional Instruction Set Simulator (ISS) of a Neural Network (NN) IP with Native BF-16 Generator

Authors: Debajyoti Mukherjee, Arathy B. S., Arpita Sahu, Saranga P. Pogula

Abstract:

A functional model that mimics the functional correctness of a Neural Network Compute Accelerator IP is crucial for design validation. Neural network workloads are based on the Brain Floating Point (BF-16) data type. The major challenge we were facing was the incompatibility of gcc compilers with the BF-16 data type, which we addressed with a native BF-16 generator integrated into our functional model. Moreover, working with large GEMM (General Matrix Multiplication) or SpMM (Sparse Matrix Multiplication) workloads (dense or sparse) and debugging failures related to data integrity is highly painstaking. In this paper, we address the quality challenge of such a complex Neural Network Accelerator design by proposing a functional-model-based scoreboard, or software model, written in SystemC. The proposed functional model executes the assembly code based on the ISA of the processor IP, decodes all instructions, and executes them as the DUT is expected to do. This model provides extensive visibility and debug capability during DUT bring-up by exposing the micro-steps of execution.
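
The BF-16 representation keeps the upper 16 bits of an IEEE-754 float32. The following Python sketch illustrates the conversion with round-to-nearest-even; it only illustrates the data type and is not the authors' native generator, which is integrated into the C/SystemC functional model.

```python
# Sketch: converting float32 values to Brain Floating Point (BF16) by keeping
# the upper 16 bits with round-to-nearest-even on the dropped mantissa bits.
import random
import struct

def float32_to_bf16_bits(x: float) -> int:
    bits = struct.unpack("<I", struct.pack("<f", x))[0]   # raw float32 bits
    lower = bits & 0xFFFF
    upper = bits >> 16
    # round to nearest, ties to even, on the truncated lower half
    if lower > 0x8000 or (lower == 0x8000 and (upper & 1)):
        upper += 1
    return upper & 0xFFFF

def bf16_bits_to_float(b: int) -> float:
    return struct.unpack("<f", struct.pack("<I", b << 16))[0]

for _ in range(3):
    x = random.uniform(-4.0, 4.0)
    b = float32_to_bf16_bits(x)
    print(f"{x:+.6f} -> 0x{b:04X} -> {bf16_bits_to_float(b):+.6f}")
```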

Keywords: ISA (instruction set architecture), NN (neural network), TLM (transaction-level modeling), GEMM (general matrix multiplication)

Procedia PDF Downloads 75
814 SEM Image Classification Using CNN Architectures

Authors: Güzin Tirkeş, Özge Tekin, Kerem Kurtuluş, Y. Yekta Yurtseven, Murat Baran

Abstract:

A scanning electron microscope (SEM) is a type of electron microscope mainly used in the nanoscience and nanotechnology areas. Automatic image recognition and classification are among the general areas of application concerning SEM. In line with these usages, the present paper proposes a deep learning algorithm that classifies SEM images into nine categories by means of an online application to simplify the process. The NFFA-EUROPE - 100% SEM data set, containing approximately 21,000 images, was used to train and test the algorithm, at 80% and 20%, respectively. Validation was carried out using a separate data set obtained from the Middle East Technical University (METU) in Turkey. To increase the accuracy of the results, the Inception-ResNet-V2 model was used with a fine-tuning approach. By using a confusion matrix, it was observed that the coated-surface category has a negative effect on the accuracy of the results, since it contains other categories in the data set, thereby confusing the model when detecting category-specific patterns. For this reason, the coated-surface category was removed from the training data set, increasing accuracy up to 96.5%.

Keywords: convolutional neural networks, deep learning, image classification, scanning electron microscope

Procedia PDF Downloads 114
813 Soil Mixed Constructed Permeable Reactive Barrier for Groundwater Remediation: Field Observation

Authors: Ziyda Abunada

Abstract:

In-situ remediation of contaminated land with deep mixing can deliver a multi-technique remedial strategy. A field trial including a permeable reactive barrier (PRB) took place at a severely contaminated site in Yorkshire, in the north of the UK, through the SMiRT (Soil Mix Remediation Technology) project in May 2011. SMiRT involved the execution of the largest research field trials in the UK to provide field validation. Innovative modified bentonite materials, in combination with zeolite and organoclay, were used to construct the six different walls of a hexagonal PRB. Field monitoring and testing were carried out, and site cores were collected from the PRB twice: once 2 months after the construction and again in March 2014 (almost 34 months later). This paper presents an overview of the results of the PRB materials' relative performance, with an initial 3-year time-related assessment. Results from the monitoring program and the site cores are presented. Some good correlations are seen, together with some clear differences in the materials' efficiency. These preliminary observations indicate potential for further investigation and highlight the main lessons learned at field scale.

Keywords: in-situ remediation, groundwater, permeable reactive barrier, site cores

Procedia PDF Downloads 194
812 Relation between Pavement Roughness and Distress Parameters for Highways

Authors: Suryapeta Harini

Abstract:

Road surface roughness is one of the essential aspects of a road's functional condition, indicating riding comfort in both the transverse and longitudinal directions. The Government of India has made maintaining good surface evenness a prerequisite for all highway projects. Pavement distress data were collected with a Network Survey Vehicle (NSV) on a National Highway; the NSV determines the smoothness and frictional qualities of the pavement surface, which are related to driving safety and ease. Based on the data obtained in the field, a regression equation was created relating the IRI value to the visual distresses. The suggested system can use wireless acceleration sensors and GPS to gather vehicle status and location data, as well as calculate the international roughness index (IRI). According to the current study, pavement roughness is related to potholes, raveling, rut depth, cracked area, and repair work. The study was carried out at one location. Data collected using a bump integrator were used for the validation: the bump integrator (BI) value, obtained from the deflection measured by the network survey vehicle, was correlated with the distress parameters to establish an equation.

Keywords: roughness index, network survey vehicle, regression, correlation

Procedia PDF Downloads 166
811 Challenges, Practices, and Opportunities of Knowledge Management in Industrial Research Institutes: Lessons Learned from Flanders Make

Authors: Zhenmin Tao, Jasper De Smet, Koen Laurijssen, Jeroen Stuyts, Sonja Sioncke

Abstract:

Today, the quality of knowledge management (KM) has become one of the underpinning factors in the success of an organization, as it determines the effectiveness of capitalizing on the organization’s knowledge. Overall, KM in an organization consists of five aspects: (knowledge) creation, validation, presentation, distribution, and application. Among others, KM in research institutes is considered the cornerstone, as their activities cover all five aspects. Furthermore, KM in a research institute helps the steering committee to envision the future roadmap, identify knowledge gaps, and make decisions on future research directions. Likewise, KM is even more challenging in industrial research institutes. From a technical perspective, technology advancement in the past decades calls for combinations of breadth and depth in expertise, which poses challenges in talent acquisition and, therefore, knowledge creation. From a regulatory perspective, the strict intellectual property protection required by industry collaborators and/or the contractual agreements made with possible funding authorities form extra barriers to knowledge validation, presentation, and distribution. From a management perspective, seamless KM activities are only guaranteed by inter-disciplinary talents that combine technical background knowledge, management skills, and leadership, let alone international vision. From a financial perspective, the long feedback period of new knowledge, together with the massive upfront investment costs and the low reusability of fixed assets, leads to a low RORC (return on research capital) that jeopardizes KM practice. In this study, we aim to address the challenges, practices, and opportunities of KM in Flanders Make, a leading European research institute specialized in the manufacturing industry. In particular, the analyses encompass an internal KM project which involves functionalities ranging from management to technical domain experts. This wide range of functionalities provides comprehensive empirical evidence on the challenges and practices w.r.t. the abovementioned KM aspects. We then ground our analysis in the critical dimensions of KM: individuals, socio-organizational processes, and technology. The analyses have three steps. First, we lay the foundation and define the environment of this study by describing the KM roles played by different functionalities in Flanders Make. Second, we zoom in on the CoreLab MotionS, where the KM project is located. In this step, given the technical domains covered by MotionS products, the challenges in KM are addressed w.r.t. the five KM aspects and the three critical dimensions. Third, by detailing the objectives, practices, results, and limitations of the MotionS KM project, we justify the practices and opportunities derived in the execution of KM w.r.t. the challenges addressed in the second step. The results of this study are twofold. First, a KM framework that consolidates past knowledge is developed. A library based on this framework can therefore 1) provide an overview of past research output, 2) accelerate ongoing research activities, and 3) help envision future research projects. Second, the challenges in KM on both the individual level (actions) and the socio-organizational level (e.g., interactions between individuals) are identified. By doing so, suggestions and guidelines are provided for KM in the context of industrial research institutes. Finally, the results of this study are reflected against the findings in the existing literature.

Keywords: technical knowledge management framework, industrial research institutes, individual knowledge management, socio-organizational knowledge management

Procedia PDF Downloads 103
810 South Atlantic Architects Validation of the Construction Decision Making Inventory

Authors: Tulio Sulbaran, Sandeep Langar

Abstract:

Architects are an integral part of the construction industry and continuously make decisions that influence projects during their life cycle. These decisions aim at selecting the best alternative from those available. Unfortunately, this decision-making process remains largely unexplored in the construction industry. No instrument has existed to measure construction decisions based on the knowledge base of decision-makers, and limited literature is available on the topic. Recently, an instrument to gain an understanding of the construction decision-making process was developed by Dr. Tulio Sulbaran from the University of Texas at San Antonio. The instrument's name is the 'Construction Decision Making Inventory (CDMI)'. The CDMI is an innovative idea to measure the 'what, when, how, and who' of the construction decision-making process. As an innovative idea, its statistical validity (the accuracy of the assessment) has yet to be assessed. Thus, the purpose of this paper is to describe the results of a case study with architects in the southeast of the United States aimed at determining the CDMI's validity. The results of the case study are important because they assess the validity of the tool. Furthermore, as the architects evaluated each question within the measurements, this study also guides the enhancement of the CDMI.

Keywords: decision, support, inventory, architect

Procedia PDF Downloads 319
809 Application of Support Vector Machines in Forecasting Non-Residential Construction

Authors: Wiwat Kittinaraporn, Napat Harnpornchai, Sutja Boonyachut

Abstract:

This paper deals with the application of a novel machine learning technique, the so-called Support Vector Machine (SVM). The objective of this study is to explore the variables and parameters of forecasting factors in the construction industry in order to build a forecasting model for construction quantity in Thailand. The scope of the research is the non-residential construction quantity in Thailand. There are 44 sets of yearly data available, ranging from 1965 to 2009. The correlation between economic indicators and construction demand with a lag of one year was developed by Apichat Buakla. The selected variables are used to develop SVM models to forecast the non-residential construction quantity in Thailand. The parameters are selected using the ten-fold cross-validation method. The results are reported in terms of the Mean Absolute Percentage Error (MAPE). The MAPE value for the non-residential construction quantity predicted by epsilon-SVR in combination with the Radial Basis Function (RBF) kernel is 5.90. Analysis of the experimental results shows that the support vector machine modelling technique can be applied to forecast construction quantity time series, which is useful for decision planning and management purposes.
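
A hedged scikit-learn sketch of epsilon-SVR with an RBF kernel, ten-fold cross-validation for parameter selection, and MAPE as the error measure; the synthetic yearly data and parameter grid are assumptions standing in for the 44 observations (1965-2009).

```python
# Sketch: epsilon-SVR (RBF kernel) with 10-fold CV and MAPE scoring.
# The indicator data and grid values are synthetic placeholders.
import numpy as np
from sklearn.metrics import make_scorer, mean_absolute_percentage_error
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.normal(size=(44, 5))                    # lagged economic indicators (synthetic)
y = 100 + 20 * X[:, 0] - 10 * X[:, 1] + rng.normal(scale=5, size=44)  # quantity

pipe = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
grid = {"svr__C": [1, 10, 100], "svr__epsilon": [0.01, 0.1, 1.0],
        "svr__gamma": ["scale", 0.1]}
scorer = make_scorer(mean_absolute_percentage_error, greater_is_better=False)
search = GridSearchCV(pipe, grid, cv=10, scoring=scorer).fit(X, y)

print("best parameters:", search.best_params_)
print("MAPE (%):", round(-search.best_score_ * 100, 2))
```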

Keywords: forecasting, non-residential, construction, support vector machines

Procedia PDF Downloads 426
808 CFD Simulation and Experimental Validation of the Bubble-Induced Flow during Electrochemical Water Splitting

Authors: Gabriel Wosiak, Jeyse da Silva, Sthefany S. Sena, Renato N. de Andrade, Ernesto Pereira

Abstract:

Bubble formation during hydrogen production by electrolysis and in several other electrochemical processes is an inherent phenomenon and can impact the energy consumption of these processes. In this work, both experimental and computational results are reported that describe the effect of bubble displacement, which, under the cases investigated, leads to the formation of a convective flow in the solution. The process is self-sustained, and a solution vortex is formed, which modifies bubble growth and coverage at the electrode surface. Using the experimental data, we built a model to simulate the process, which describes the phenomena with high accuracy. We then simulated many different experimental conditions and evaluated the effects of the boundary conditions on the bubble coverage of the electrode surface. We observed a position-dependent bubble coverage of the surface, which has an effect on the water-splitting efficiency. It was shown that the bubble coverage is not uniform over the electrode surface, and, using statistical analysis, it was possible to evaluate the influence of the gas type (H2 and O2), current density, and bubble size (and their cross-effects) on the coverage fraction and the asymmetric behavior over the electrode surface.

Keywords: water splitting, bubble, electrolysis, hydrogen production

Procedia PDF Downloads 89
807 Application and Assessment of Artificial Neural Networks for Biodiesel Iodine Value Prediction

Authors: Raquel M. De sousa, Sofiane Labidi, Allan Kardec D. Barros, Alex O. Barradas Filho, Aldalea L. B. Marques

Abstract:

Several parameters are established in order to measure biodiesel quality. One of them is the iodine value, an important parameter that measures the total unsaturation within a mixture of fatty acids. Limiting unsaturated fatty acids is necessary, since heating a higher quantity of them results either in the formation of deposits inside the engine or in damage to the lubricant. Determination of the iodine value by the official procedure tends to be very laborious, with high costs and toxic reagents; this study therefore uses an artificial neural network (ANN) to predict the iodine value as an alternative to these problems. The network development methodology used 13 fatty acid esters as inputs, and backpropagation-type convergence algorithms were optimized in order to obtain an architecture for predicting the iodine value. This study allowed us to demonstrate the neural networks' ability to learn the correlation between biodiesel quality properties, in this case the iodine value, and the molecular structures that make up the biodiesel. The model developed in the study reached a correlation coefficient (R) of 0.99 for both network validation and network simulation, with the Levenberg-Marquardt algorithm.

Keywords: artificial neural networks, biodiesel, iodine value, prediction

Procedia PDF Downloads 597
806 Forklift Allocation in Warehouse Operations with Restricted Halls

Authors: Mauricio Becerra Fernández, Olga Rosana Romero Quiroga, Elsa Cristina González La Rotta

Abstract:

The design and construction of logistics facilities is one of the strategic decisions that critically affect the performance of a company, from the economic perspective and in its relationships with customers. The case study company is the leader of the Colombian logistics sector, with over 60 years of experience and sales of about one hundred twenty million dollars at the end of 2014. The preliminary design for the warehouse layout and operation includes a customer that provides approximately 17% of the profits of the company and considers the possibility of moving two forklifts in the warehouse halls. Some changes were not considered in previous stages of design: operations required forklifts with different characteristics, whose size does not allow the circulation of more than one forklift at a time in the halls. Therefore, it is necessary to assess the impact of this restriction on the warehouse operation, so that decision makers can implement actions to achieve efficient operation. The problem is addressed by identifying the logistics processes that take place in the warehouse, collecting information on process behavior, simulating the current situation using ProModel software, validating the model, making the required adjustments, designing experiments, and drawing conclusions and recommendations for the company.

Keywords: design, discrete events simulation, forklift allocation, logistics facilities, warehouse

Procedia PDF Downloads 294