Search results for: accuracy
1651 Leukocyte Detection Using Image Stitching and Color Overlapping Windows
Authors: Lina, Arlends Chris, Bagus Mulyawan, Agus B. Dharmawan
Abstract:
Blood cell analysis plays a significant role in the diagnosis of human health. As an alternative to the traditional technique conducted by laboratory technicians, this paper presents an automatic white blood cell (leukocyte) detection system using Image Stitching and Color Overlapping Windows. The advantage of this method is that it provides a detection technique for white blood cells that is robust to imperfect cell shapes and varying image quality. The input to this application is a set of images from a microscope-slide translation video. In the preprocessing stage, the input images are stitched: first, the overlapping parts of the images are determined, then the stitching and blending of the two input images are performed. Next, Color Overlapping Windows is applied for white blood cell detection, consisting of color filtering, window candidate checking, window marking, finding window overlaps, and window cropping. Experimental results show that this method achieves an average detection accuracy of 82.12% on the leukocyte images.
Keywords: color overlapping windows, image stitching, leukocyte detection, white blood cell detection
Procedia PDF Downloads 310
1650 Levels of Selected Heavy Metals in Varieties of Vegetable Oils Consumed in the Kingdom of Saudi Arabia and Health Risk Assessment of the Local Population
Authors: Muhammad Waqar Ashraf
Abstract:
Selected heavy metals, namely Cu, Zn, Fe, Mn, Cd, Pb, and As, in seven popular varieties of edible vegetable oils collected in Saudi Arabia were determined by graphite furnace atomic absorption spectrometry (GF-AAS) using microwave digestion. The accuracy of the procedure was confirmed with certified reference materials (NIST 1577b). The concentrations of copper, zinc, iron, manganese, lead, and arsenic were in the ranges of 0.035 - 0.286, 0.955 - 3.10, 17.3 - 57.8, 0.178 - 0.586, 0.011 - 0.017, and 0.011 - 0.018 µg/g, respectively. Cadmium was found in the range of 2.36 - 6.34 ng/g. The results are compared internationally and against the standards laid down by world health agencies. A risk assessment study was carried out to assess exposure to these metals via the consumption of vegetable oils, with a comparison against the safe intake levels recommended by the Institute of Medicine of the National Academies (IOM), the US Environmental Protection Agency (US EPA), and the Joint FAO/WHO Expert Committee on Food Additives (JECFA). The results indicated that the dietary intake of the selected heavy metals from daily consumption of 25 g of edible vegetable oils by a 70 kg individual should pose no significant health risk to the local population.
Keywords: vegetable oils, heavy metals, contamination, health risk assessment
Procedia PDF Downloads 451
1649 F-VarNet: Fast Variational Network for MRI Reconstruction
Authors: Omer Cahana, Maya Herman, Ofer Levi
Abstract:
Magnetic resonance imaging (MRI) scans are lengthy because of their long acquisition times, which are largely dictated by the traditional sampling theorem and the lower bound it places on sampling. It is nevertheless possible to accelerate a scan with a different approach, such as compressed sensing (CS) or parallel imaging (PI); these two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. For this to work, two properties must hold: i) the signal must be sparse under a known transform domain, and ii) the sampling must be incoherent. In addition, a nonlinear reconstruction algorithm needs to be applied to recover the signal. Despite the rapid advances in deep learning (DL), which has demonstrated tremendous success in various computer vision tasks, the field of MRI reconstruction is still at an early stage. In this paper, we present an extension of the state-of-the-art model in MRI reconstruction, VarNet. We extend VarNet with dilated convolutions at different scales, which enlarges the receptive field to capture more contextual information. Moreover, we simplified the sensitivity map estimation (SME), which contained many layers unnecessary for this task. These improvements yield significantly lower computation costs as well as higher accuracy.
Keywords: MRI, deep learning, variational network, computer vision, compressed sensing
Procedia PDF Downloads 162
1648 Quasi-Photon Monte Carlo on Radiative Heat Transfer: An Importance Sampling and Learning Approach
Authors: Utkarsh A. Mishra, Ankit Bansal
Abstract:
At high temperatures, radiative heat transfer is the dominant mode of heat transfer. It is governed by phenomena such as photon emission, absorption, and scattering. Solving the governing integro-differential equation of radiative transfer is a complex process, especially when the effects of a participating medium and wavelength-dependent properties are taken into consideration. Although a generic formulation of such a radiative transport problem can be modeled for a wide variety of problems with non-gray, non-diffusive surfaces, there is always a trade-off between the simplicity and the accuracy of the solution. Recently, statistical methods that solve complicated mathematical problems by randomizing naturally occurring phenomena have gained significant importance. Photon bundles with discrete energy can be replicated with random numbers describing the emission, absorption, and scattering processes. Photon Monte Carlo (PMC) is a simple yet powerful technique for solving radiative transfer problems in complicated geometries with an arbitrary participating medium. The method increases the accuracy of estimation on the one hand and the computational cost on the other. The participating media, generally gases such as CO₂, CO, and H₂O, present complex emission and absorption spectra. Modeling the emission and absorption accurately with random numbers requires weighted sampling, as different sections of the spectrum carry different importance. Importance sampling (IS) was implemented to sample random photons of arbitrary wavelength, and the sampled data provided unbiased training of MC estimators for better results. A better replacement for uniform random numbers is deterministic, quasi-random sequences; Halton, Sobol, and Faure low-discrepancy sequences are used in this study. They possess better space-filling properties than a uniform random number generator and give rise to low-variance, stable quasi-Monte Carlo (QMC) estimators with faster convergence. An optimal supervised learning scheme was further considered to reduce the computational cost of the PMC simulation. A one-dimensional plane-parallel slab problem with a participating medium was formulated, and the histories of some randomly sampled photon bundles were recorded to train an artificial neural network (ANN) back-propagation model. The flux calculated with the standard quasi-PMC was taken as the training target. Results obtained with the proposed model for the one-dimensional problem are compared with the exact analytical solution and with the PMC model using a line-by-line (LBL) spectral model. The approximate variance obtained was around 3.14%. Results were analyzed with respect to time and total flux in both cases. A significant reduction in variance, as well as a faster rate of convergence, was observed for the QMC method over the standard PMC method. However, the ANN method resulted in greater variance (around 25-28%) compared with the other cases. There is great scope for machine learning models to reduce the computational cost further once trained successfully. Multiple ways of selecting the input data, as well as various architectures, will be tried so that the problem can be fully represented to the ANN model, and better results can be achieved in this unexplored domain.
Keywords: radiative heat transfer, Monte Carlo method, pseudo-random numbers, low-discrepancy sequences, artificial neural networks
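The contrast drawn above between pseudo-random and low-discrepancy sampling can be illustrated in a few lines. The sketch below (not the authors' code; the integrand and sample count are hypothetical) estimates a simple two-dimensional integral with a hand-rolled Halton sequence versus a pseudo-random generator.

```python
import random

def halton(i, base):
    # radical inverse of integer i in the given base -> a point in [0, 1)
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def estimate(points):
    # Monte Carlo estimate of the integral of f(x, y) = x*y over the unit square
    # (the exact value is 0.25)
    return sum(x * y for x, y in points) / len(points)

n = 1024
# quasi-random points: Halton sequence with coprime bases 2 and 3
qmc_pts = [(halton(i, 2), halton(i, 3)) for i in range(1, n + 1)]
# pseudo-random points from a uniform generator
random.seed(0)
pmc_pts = [(random.random(), random.random()) for _ in range(n)]

qmc_err = abs(estimate(qmc_pts) - 0.25)
pmc_err = abs(estimate(pmc_pts) - 0.25)
```

For smooth integrands the Halton estimate typically converges close to O(1/n), versus O(1/sqrt(n)) for the pseudo-random estimate, which is the variance reduction the abstract reports.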
Procedia PDF Downloads 223
1647 DNpro: A Deep Learning Network Approach to Predicting Protein Stability Changes Induced by Single-Site Mutations
Authors: Xiao Zhou, Jianlin Cheng
Abstract:
A single amino acid mutation can have a significant impact on the stability of a protein structure. Thus, predicting the protein stability change induced by single-site mutations is critical and useful for studying protein function and structure. Here, we present a deep learning network with the dropout technique for predicting protein stability changes upon single amino acid substitution. While using only the protein sequence as input, the overall prediction accuracy of the method on a standard benchmark is >85%, which is higher than existing sequence-based methods and comparable to methods that use not only the protein sequence but also the tertiary structure, pH value, and temperature. The results demonstrate that deep learning is a promising technique for protein stability prediction. The good performance of this sequence-based method makes it a valuable tool for predicting the impact of mutations on most proteins, whose experimental structures are not available. Both the downloadable software package and the user-friendly web server (DNpro) that implement the method are freely available for the community to use.
Keywords: bioinformatics, deep learning, protein stability prediction, biological data mining
Procedia PDF Downloads 468
1646 Recommender System Based on Mining Graph Databases for Data-Intensive Applications
Authors: Mostafa Gamal, Hoda K. Mohamed, Islam El-Maddah, Ali Hamdi
Abstract:
In recent years, many digital documents have been created on the web due to the rapid growth of 'social application' communities, or data-intensive applications. The evolution of online multimedia data poses new challenges in storing and querying large amounts of data for online recommender systems. Graph data models have been shown to be more efficient than relational data models for processing complex data. This paper explains the key differences between graph and relational databases, their strengths and weaknesses, and why graph databases are the best technology for building a real-time recommendation system. The paper also discusses several similarity metric algorithms that can be used to compute a similarity score for pairs of nodes based on their neighbourhoods or their properties. Finally, the paper shows how NLP strategies provide a basis for improving the accuracy and coverage of real-time recommendations by extracting information from stored unstructured knowledge, which makes up the bulk of the world's data, to enrich the graph database. As the size and number of data items are increasing rapidly, the proposed system should meet current and future needs.
Keywords: graph databases, NLP, recommendation systems, similarity metrics
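As a rough sketch of the neighbourhood-based similarity metrics mentioned above (independent of any particular graph database), Jaccard similarity compares the neighbour sets of two nodes; the toy user-item graph below is hypothetical.

```python
def jaccard(neigh_a, neigh_b):
    # similarity of two nodes based on the overlap of their neighbour sets
    a, b = set(neigh_a), set(neigh_b)
    return len(a & b) / len(a | b) if a | b else 0.0

# toy graph: users -> items they interacted with (hypothetical data)
graph = {
    "alice": {"item1", "item2", "item3"},
    "bob":   {"item2", "item3", "item4"},
    "carol": {"item9"},
}

score_ab = jaccard(graph["alice"], graph["bob"])    # 2 shared of 4 total
score_ac = jaccard(graph["alice"], graph["carol"])  # no overlap
```

In a recommender, a high neighbourhood similarity between two users suggests that items seen by one are good candidates for the other.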
Procedia PDF Downloads 104
1645 Hydro-Gravimetric ANN Model for Prediction of Groundwater Level
Authors: Jayanta Kumar Ghosh, Swastik Sunil Goriwale, Himangshu Sarkar
Abstract:
Groundwater is one of the most valuable natural resources that society consumes for its domestic, industrial, and agricultural water supply. Bulk and indiscriminate consumption affects the groundwater resource, and the groundwater recharge rate has often been found to be much lower than the demand. Thus, to maintain water and food security, it is necessary to monitor and manage groundwater storage. However, it is challenging to estimate groundwater storage (GWS) with existing hydrological models. To overcome these difficulties, machine learning (ML) models are being introduced for the evaluation of the groundwater level (GWL). The objective of this research is therefore to develop an ML-based model for the prediction of GWL. This objective has been realized through the development of an artificial neural network (ANN) model based on hydro-gravimetry. The model was developed using training samples from field observations spread over 8 months and was tested for the prediction of GWL in an observation well. The root mean square error (RMSE) for the test samples was found to be 0.390 meters. It can thus be concluded that the hydro-gravimetric ANN model can be used for the prediction of GWL. To improve the accuracy, however, more hydro-gravimetric parameters may be considered and tested in future work.
Keywords: machine learning, hydro-gravimetry, groundwater level, predictive model
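The RMSE criterion used above to evaluate the model is a straightforward computation; the sketch below uses hypothetical well readings, not the study's data.

```python
import math

def rmse(predicted, observed):
    # root mean square error between predicted and observed groundwater levels
    n = len(observed)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)

# hypothetical observation-well readings (metres) vs model predictions
observed  = [12.1, 12.4, 12.9, 13.0]
predicted = [12.0, 12.6, 12.7, 13.3]
error = rmse(predicted, observed)
```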
Procedia PDF Downloads 127
1644 Model-Based Field Extraction from Different Classes of Administrative Documents
Authors: Jinen Daghrir, Anis Kricha, Karim Kalti
Abstract:
The amount of incoming administrative documents is massive, and manually processing these documents is a costly task, especially in terms of time. This problem has driven a significant amount of research and development on automatically extracting fields from administrative documents, in order to reduce costs and increase citizen satisfaction with administrations. In this context, we introduce an administrative document understanding system. Given a document in which a user has selected the fields to be retrieved for a document class, a document model is automatically built. A document model is represented by an attributed relational graph (ARG), where nodes represent the fields to extract and edges represent the relations between them; both vertices and edges carry feature vectors. When another document arrives at the system, its layout objects are extracted and an ARG is generated. Field extraction is then cast as a problem of matching two ARGs, which relies mainly on comparing the spatial relationships between layout objects. Experimental results yield accuracy rates from 75% to 100% over eight document classes. Our proposed method performs well considering that the document model is constructed from only a single document.
Keywords: administrative document understanding, logical labelling, logical layout analysis, fields extraction from administrative documents
Procedia PDF Downloads 213
1643 Finite Element Analysis of Thermally-Induced Bistable Plate Using Four Plate Elements
Authors: Jixiao Tao, Xiaoqiao He
Abstract:
The present study deals with the finite element (FE) analysis of a thermally-induced bistable plate using various plate elements. The quadrilateral plate elements include the 4-node conforming plate element based on the classical laminate plate theory (CLPT), the 4-node and 9-node Mindlin plate elements based on the first-order shear deformation laminated plate theory (FSDT), and a displacement-based 4-node quadrilateral element (RDKQ-NL20). Using von Kármán's large deflection theory and the total Lagrangian (TL) approach, the nonlinear FE governing equations for a plate under thermal load are derived. A convergence analysis for the four elements is first conducted. These elements are then used to predict the stable shapes of the thermally-induced bistable plate. Numerical tests show that the plate elements based on FSDT, namely the 4-node and 9-node Mindlin elements, and the RDKQ-NL20 plate element predict two stable cylindrical shapes, while the 4-node conforming plate element predicts a saddle shape. Compared against ABAQUS simulation results, the RDKQ-NL20 element shows the best accuracy among all the elements.
Keywords: bistable, finite element method, geometrical nonlinearity, quadrilateral plate elements
Procedia PDF Downloads 220
1642 Dynamic Analysis of Nanosize FG Rectangular Plates Based on Simple Nonlocal Quasi 3D HSDT
Authors: Sabrina Boutaleb, Fouad Bourad, Kouider Halim Benrahou, Abdelouahed Tounsi
Abstract:
In the present work, the dynamic analysis of functionally graded rectangular nanoplates is studied. Nonlocal elasticity theory based on the quasi-3D higher-order shear deformation theory (quasi-3D HSDT) has been employed to determine the natural frequencies of the nanosized FG plate. In the HSDT, a cubic function of the thickness coordinate is employed to introduce the influences of transverse shear deformation and thickness stretching. Nonlocal elasticity theory is utilized to examine the impact of the small scale on the natural frequency of the FG rectangular nanoplate. The equations of motion are deduced by implementing Hamilton's principle. To demonstrate the accuracy of the proposed method, the calculated results for specific cases are compared with available results in the literature, and a good agreement is observed. Finally, the influence of various parameters, such as the nonlocal coefficient, the material index, the aspect ratio, and the thickness-to-length ratio, on the dynamic properties of the FG nanoplates is illustrated and discussed in detail.
Keywords: nonlocal elasticity theory, FG nanoplate, free vibration, refined theory, elastic foundation
Procedia PDF Downloads 120
1641 Digital Retinal Images: Background and Damaged Areas Segmentation
Authors: Eman A. Gani, Loay E. George, Faisel G. Mohammed, Kamal H. Sager
Abstract:
Digital retinal images are well suited to the automatic screening of diabetic retinopathy. Unfortunately, a significant percentage of these images are of poor quality, which hinders further analysis, due to many factors (such as patient movement, inadequate or non-uniform illumination, acquisition angle, and retinal pigmentation). Retinal images of poor quality need to be enhanced before the extraction of features and abnormalities, and segmentation of the retinal image is essential for this purpose: it is employed to smooth and strengthen the image by separating the background and damaged areas from the overall image, resulting in retinal image enhancement and less processing time. In this paper, methods for segmenting colored retinal images are proposed to improve the quality of retinal image diagnosis. The methods generate two segmentation masks: a background segmentation mask for extracting the background area, and a poor-quality mask for removing the noisy areas from the retinal image. The standard retinal image databases DIARETDB0, DIARETDB1, STARE, and DRIVE, along with some images obtained from ophthalmologists, have been used to validate the proposed segmentation technique. Experimental results indicate that the introduced methods are effective and can lead to high segmentation accuracy.
Keywords: retinal images, fundus images, diabetic retinopathy, background segmentation, damaged areas segmentation
Procedia PDF Downloads 403
1640 Phenotypical and Genotypical Assessment Techniques for Identification of Some Contagious Mastitis Pathogens
Authors: Ayman El Behiry, Rasha Nabil Zahran, Reda Tarabees, Eman Marzouk, Musaad Al-Dubaib
Abstract:
Mastitis is one of the most economically important diseases affecting dairy cows worldwide. Its classic diagnosis, using bacterial culture and biochemical findings, is a difficult and prolonged method. In this research, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) permitted the identification of different microorganisms with high accuracy and rapidity (only 24 hours for microbial growth and analysis). Using MALDI-TOF MS, one hundred twenty strains of Staphylococcus and Streptococcus species isolated from the milk of cows affected by clinical and subclinical mastitis were identified, and the results were compared with those obtained by traditional methods such as the API and VITEK 2 systems. 37 of a total of 39 strains (~95%) of Staphylococcus aureus (S. aureus) were correctly detected by MALDI-TOF MS and then confirmed by a nuc-based PCR technique, whereas accurate identification was observed in 100% of the coagulase-negative staphylococci (CNS; 50 isolates) and Streptococcus agalactiae (31 isolates). In brief, our results demonstrate that MALDI-TOF MS is a fast and reliable technique with the capability to replace conventional identification of several bacterial strains usually isolated in clinical microbiology laboratories.
Keywords: identification, mastitis pathogens, mass spectrometry, phenotypical
Procedia PDF Downloads 333
1639 Tracking Filtering Algorithm Based on ConvLSTM
Authors: Ailing Yang, Penghan Song, Aihua Cai
Abstract:
The nonlinear maneuvering target tracking problem is mainly a state estimation problem when the target motion model is uncertain. Traditional solutions include Kalman filtering based on the Bayesian filtering framework and extended Kalman filtering. However, these methods need prior knowledge such as a kinematics model and the state system distribution, and their performance is poor in state estimation for complex dynamic systems without such priors. Therefore, in view of the problems with traditional algorithms, a convolutional LSTM target state estimation (SAConvLSTM-SE) algorithm based on self-attention memory (SAM) is proposed to learn the historical motion state of the target and the distribution of the measurement error at the current time. The measured track-point data of an airborne radar are processed into data sets. After supervised training, the data-driven deep neural network based on SAConvLSTM can directly obtain the target state at the next moment. Through experiments on two different maneuvering targets, we find that the network has stronger robustness and better tracking accuracy than the existing tracking methods.
Keywords: maneuvering target, state estimation, Kalman filter, LSTM, self-attention
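For contrast with the learned estimator, the traditional Kalman filtering mentioned above can be sketched in its simplest scalar form; the constant-state model and the noise parameters below are illustrative assumptions, not the paper's radar setup.

```python
def kalman_1d(measurements, q=1e-3, r=0.1):
    # scalar Kalman filter: constant-state model, process noise q, measurement noise r
    x, p = measurements[0], 1.0   # initial state estimate and covariance
    estimates = []
    for z in measurements:
        p = p + q                 # predict: covariance grows by process noise
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with the measurement residual
        p = (1 - k) * p
        estimates.append(x)
    return estimates

# noisy readings of a quantity whose true value is ~1.0 (hypothetical)
noisy = [1.1, 0.9, 1.05, 0.98, 1.02, 0.95, 1.03]
smoothed = kalman_1d(noisy)
```

The filter's reliance on the assumed model (here, a constant state) is exactly the prior knowledge the data-driven approach tries to dispense with.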
Procedia PDF Downloads 176
1638 Design of an Instrumentation Setup and Data Acquisition System for a Gas Turbine Engine Using Suitable DAQ Software
Authors: Syed Nauman Bin Asghar Bukhari, Mohtashim Mansoor, Mohammad Nouman
Abstract:
An engine test-bed system is a fundamental tool for measuring the dynamic parameters, economic performance, and reliability of an aircraft engine, and its automation and accuracy directly influence the precision of the acquired and analysed data. In this paper, we present the design of a digital data acquisition (DAQ) system for a vintage aircraft engine test bed that lacks the capability of displaying all the analysed parameters at one convenient location (one panel, one screen). Recording such measurements on the vintage test bed is not only time-consuming but also prone to human error. Digitizing such a measurement system requires a DAQ system capable of recording these parameters and displaying them on a one-screen, one-panel monitor. The challenge in designing an upgrade to a vintage system arises from the need to build and integrate a digital measurement system from scratch, with a minimal budget and minimal modifications to the existing system. The proposed design not only displays all the key performance and maintenance parameters of the gas turbine engine for the operator and the quality inspector on separate screens but also records the data for further processing and archiving.
Keywords: gas turbine engine, engine test cell, data acquisition, instrumentation
Procedia PDF Downloads 123
1637 Diagnosis of Alzheimer Diseases in Early Step Using Support Vector Machine (SVM)
Authors: Amira Ben Rabeh, Faouzi Benzarti, Hamid Amiri, Mouna Bouaziz
Abstract:
Alzheimer's is a disease that affects the brain. It causes degeneration of nerve cells (neurons), in particular cells involved in memory and intellectual functions. Early diagnosis of Alzheimer's disease (AD) raises ethical questions, since there is at present no cure to offer patients, and medicines from therapeutic trials appear to slow the progression of the disease only moderately, with sometimes severe side effects. In this context, the analysis of medical images has become an essential tool for clinical applications, because it provides effective assistance both at diagnosis and during therapeutic follow-up. Computer-assisted diagnosis (CAD) systems are one possible solution for managing these images efficiently. In our work, we propose an application to detect Alzheimer's disease. For detecting the disease at an early stage, we use three sections: frontal, to extract the hippocampus (H); sagittal, to analyse the corpus callosum (CC); and axial, to work with the variation features of the cortex (C). Our classification method is based on the Support Vector Machine (SVM). The proposed system yields a 90.66% accuracy in the early diagnosis of AD.
Keywords: Alzheimer's disease (AD), computer-assisted diagnosis (CAD), hippocampus, corpus callosum (CC), cortex, support vector machine (SVM)
Procedia PDF Downloads 384
1636 Jordan Water District Interactive Billing and Accounting Information System
Authors: Adrian J. Forca, Simeon J. Cainday III
Abstract:
The Jordan Water District Interactive Billing and Accounting Information System is designed to raise the efficiency and effectiveness of the Jordan Water District's services to its customers. It computes water bills accurately and quickly by automating the manual process and ensuring that the correct rates and fees are applied. In addition to the billing process, a mobile app will be integrated to support rapid and accurate water bill generation, and an interactive feature will support electronic billing for customers who wish to receive water bills by electronic mail. The system will also improve and organize accounting processes and avoid data inaccuracy, because data will be stored in a database designed to be logically correct through normalization. Furthermore, strict programming constraints will be enforced to validate account access privileges based on job function and on the data being stored and retrieved, to ensure data security, reliability, and accuracy. The system will be able to cater to the billing and accounting services of the Jordan Water District, replacing the manual process and adapting to modern technological innovations.
Keywords: accounting, bill, information system, interactive
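A tiered-rate computation of the kind such billing systems automate might look like the sketch below; the consumption blocks and rates are hypothetical, not the Jordan Water District's actual tariff.

```python
def water_bill(cubic_meters, tiers=((10, 1.50), (20, 2.25), (float("inf"), 3.00))):
    # hypothetical tiers: first 10 m³ at 1.50/m³, next 10 m³ at 2.25/m³, the rest at 3.00/m³
    total, previous_limit = 0.0, 0
    for limit, rate in tiers:
        if cubic_meters > previous_limit:
            billable = min(cubic_meters, limit) - previous_limit
            total += billable * rate
        previous_limit = limit
    return round(total, 2)
```

Automating this kind of rule in one place, rather than computing it by hand per customer, is what guarantees that "correct rates and fees are applied" consistently.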
Procedia PDF Downloads 251
1635 3D Numerical Investigation of Asphalt Pavements Behaviour Using Infinite Elements
Authors: K. Sandjak, B. Tiliouine
Abstract:
This article presents the main results of a three-dimensional (3-D) numerical investigation of asphalt pavement structure behaviour using a coupled finite element-mapped infinite element (FE-MIE) model. The validity and numerical performance of this model are assessed by comparing critical pavement responses with Burmister's solution and with FEM simulation results for multi-layered elastic structures. The coupled model is then used to perform 3-D simulations of a typical asphalt pavement structure in order to investigate the impact of two tire configurations (a conventional dual tire and a new-generation wide-base tire) on critical pavement response parameters. The numerical results show the effectiveness and accuracy of the coupled FE-MIE model. In addition, the simulation results indicate that, compared with the conventional dual tire assembly, the single wide-base tire causes slightly greater fatigue asphalt cracking and subgrade rutting potential, and can thus be considered in view of its potential to provide numerous mechanical, economic, and environmental benefits.
Keywords: 3-D numerical investigation, asphalt pavements, dual and wide base tires, infinite elements
Procedia PDF Downloads 215
1634 A Case Study of Deep Learning for Disease Detection in Crops
Authors: Felipe A. Guth, Shane Ward, Kevin McDonnell
Abstract:
In precision agriculture, one of the main tasks is the automated detection of diseases in crops. Machine learning algorithms have been studied for such tasks in recent decades in view of the economic gains that automated disease detection could bring to crop fields. The latest generation of deep learning convolutional neural networks has achieved significant results in image classification. Accordingly, this work tested a deep learning convolutional neural network architecture for the detection of diseases in different types of crops. A data augmentation strategy was used to meet the data requirements of the algorithm, which was implemented with a deep learning framework. Two test scenarios were deployed: the first trained a neural network on images from a controlled environment only, while the second used images from both the field and the controlled environment. The results evaluated the generalisation capacity of the neural networks with respect to the two types of images. The general classification accuracy was 59% in scenario 1 and 96% in scenario 2.
Keywords: convolutional neural networks, deep learning, disease detection, precision agriculture
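A data augmentation strategy of the kind mentioned above typically generates flipped and rotated variants of each training image. A minimal sketch over a toy 2x2 "image" (the actual work would operate on crop photographs through a deep learning framework) follows.

```python
def augment(image):
    # return simple augmented variants of a 2-D image (list of rows):
    # horizontal flip, vertical flip, and 180-degree rotation
    h_flip = [row[::-1] for row in image]
    v_flip = image[::-1]
    rot180 = [row[::-1] for row in image[::-1]]
    return [h_flip, v_flip, rot180]

img = [[1, 2],
       [3, 4]]
variants = augment(img)
```

Each variant is a plausible new training sample with the same disease label, which is how augmentation stretches a limited dataset to meet a network's data requirements.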
Procedia PDF Downloads 259
1633 High-Resolution Computed Tomography Imaging Features during Pandemic 'COVID-19'
Authors: Sahar Heidary, Ramin Ghasemi Shayan
Abstract:
With the emergence of novel coronavirus (2019-nCoV) pneumonia, chest high-resolution computed tomography (HRCT) has become one of the main investigative tools. To achieve timely and accurate diagnosis, defining the radiological features of the infection is of great value. The purpose of this study was to review the imaging manifestations of early-stage coronavirus disease 2019 (COVID-19) and to establish an imaging basis for the early detection of suspected cases and stratified intervention. The positive predictive rate of HRCT was 85%, and its sensitivity was 73% for all patients; overall accuracy was 68%. There was no significant difference in these values between symptomatic and asymptomatic persons, and the results were likewise independent of the interval between imaging and the onset of symptoms or exposure. Therefore, we suggest that HRCT is an excellent adjunct for the early identification of COVID-19 pneumonia in both symptomatic and asymptomatic individuals, in addition to its role as a prognostic indicator for COVID-19 pneumonia. Patients underwent non-contrast chest HRCT examinations, and images were reconstructed in a thin 1.25 mm lung window. Images were evaluated for the presence of lung lesions, and a CT severity score was assigned to each patient based on the number of lung lobes involved.
Keywords: COVID-19, radiology, respiratory diseases, HRCT
Procedia PDF Downloads 142
1632 Enhancing a Recidivism Prediction Tool with Machine Learning: Effectiveness and Algorithmic Fairness
Authors: Marzieh Karimihaghighi, Carlos Castillo
Abstract:
This work studies how machine learning (ML) may be used to increase the effectiveness of a criminal recidivism risk assessment tool, RisCanvi. The two key dimensions of this analysis are predictive accuracy and algorithmic fairness. The ML-based prediction models obtained in this study are more accurate at predicting criminal recidivism than the manually-created formula used in RisCanvi, achieving AUCs of 0.76 and 0.73 in predicting violent and general recidivism, respectively. However, the improvements are small, and algorithmic discrimination can easily be introduced between groups such as nationals vs. foreigners, or young vs. old. We describe how effectiveness and algorithmic fairness objectives can be balanced by applying a method in which a single error disparity, in terms of the generalized false positive rate, is minimized while calibration is maintained across groups. The results show that this bias mitigation procedure can substantially reduce generalized false positive rate disparities across multiple groups. Based on these results, we propose that ML-based criminal recidivism risk prediction should not be introduced without applying algorithmic bias mitigation procedures.
Keywords: algorithmic fairness, criminal risk assessment, equalized odds, recidivism
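The false positive rate disparity that such a mitigation procedure minimizes can be illustrated with a small sketch; the group labels, true outcomes, and predictions below are hypothetical, not data from RisCanvi.

```python
def false_positive_rate(y_true, y_pred):
    # FPR = false positives / actual negatives
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    neg = sum(1 for t in y_true if t == 0)
    return fp / neg if neg else 0.0

# hypothetical (y_true, y_pred) pairs split by a protected attribute
groups = {
    "young": ([0, 0, 1, 0, 1], [1, 0, 1, 0, 1]),
    "old":   ([0, 0, 0, 1, 1], [0, 0, 0, 1, 1]),
}
fpr = {g: false_positive_rate(t, p) for g, (t, p) in groups.items()}
disparity = max(fpr.values()) - min(fpr.values())
```

A nonzero disparity here means one group is wrongly flagged as high-risk more often than another; a mitigation procedure adjusts the model or its thresholds to shrink this gap while keeping calibration.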
Procedia PDF Downloads 152
1631 Predicting Potential Protein Therapeutic Candidates from the Gut Microbiome
Authors: Prasanna Ramachandran, Kareem Graham, Helena Kiefel, Sunit Jain, Todd DeSantis
Abstract:
Microbes that reside inside the mammalian GI tract, commonly referred to as the gut microbiome, have been shown to have therapeutic effects in animal models of disease. We hypothesize that specific proteins produced by these microbes are responsible for this activity and may be used directly as therapeutics. To speed up the discovery of these key proteins from large-scale metagenomics data, we have applied machine learning techniques. Using amino acid sequences of known epitopes and their corresponding binding partners, protein interaction descriptors (PIDs) were calculated, forming a positive interaction set. A negative interaction set was calculated from sequences of proteins known not to interact with the same binding partners. Using Random Forest with the positive and negative PIDs, a machine learning model was trained and used to predict interacting versus non-interacting proteins. Furthermore, a continuous variable, the cosine similarity between interaction descriptors, was used to rank bacterial therapeutic candidates. Laboratory binding assays were conducted to test the candidates for their potential as therapeutics. Results from the binding assays reveal the accuracy of the machine learning predictions and are subsequently used to further improve the model.
Keywords: protein-interactions, machine-learning, metagenomics, microbiome
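The cosine-similarity ranking step described above can be sketched as follows (an illustrative snippet, not the authors' pipeline; the 4-dimensional descriptor vectors are hypothetical stand-ins for real PIDs):

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two interaction-descriptor vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def rank_candidates(reference_pid, candidate_pids):
    """Rank candidate proteins by descriptor similarity to a known
    positive interaction (higher similarity = stronger candidate)."""
    sims = [cosine_similarity(reference_pid, c) for c in candidate_pids]
    order = np.argsort(sims)[::-1]
    return [(int(i), sims[i]) for i in order]

# Hypothetical 4-dimensional interaction descriptors
known_positive = np.array([1.0, 0.5, 0.0, 0.2])
candidates = [np.array([0.9, 0.6, 0.1, 0.2]),   # similar descriptor
              np.array([0.0, 0.1, 1.0, 0.9])]   # dissimilar descriptor
ranking = rank_candidates(known_positive, candidates)
print(ranking[0][0])  # 0: the similar candidate ranks first
```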
Procedia PDF Downloads 376
1630 Evaluation of Features Extraction Algorithms for a Real-Time Isolated Word Recognition System
Authors: Tomyslav Sledevič, Artūras Serackis, Gintautas Tamulevičius, Dalius Navakauskas
Abstract:
This paper presents a comparative evaluation of feature extraction algorithms for a real-time isolated word recognition system based on an FPGA. The Mel-frequency cepstral, linear frequency cepstral, linear predictive, and linear predictive cepstral coefficients were implemented in a hardware/software co-design. The proposed system was investigated in speaker-dependent mode on 100 different Lithuanian words. The robustness of the feature extraction algorithms was tested by recognizing speech records at different signal-to-noise ratios. Experiments on clean records show the highest accuracy for the Mel-frequency cepstral and linear frequency cepstral coefficients. For records with a 15 dB signal-to-noise ratio, the linear predictive cepstral coefficients give the best results. The hardware and software parts of the system are clocked at 50 MHz and 100 MHz, respectively. For classification, a pipelined dynamic time warping core was implemented. The proposed word recognition system satisfies real-time requirements and is suitable for applications in embedded systems.
Keywords: isolated word recognition, features extraction, MFCC, LFCC, LPCC, LPC, FPGA, DTW
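The pipelined DTW core compares feature sequences of different lengths. A plain-software sketch of the distance it computes (illustrative only, not the FPGA implementation, and over 1-D sequences rather than full cepstral vectors) is:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D feature
    sequences, via the classic cumulative-cost recurrence."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

# A time-stretched copy of a sequence aligns at zero cost: warping
# absorbs the tempo difference between two utterances of one word.
seq = [0.0, 1.0, 2.0, 1.0, 0.0]
stretched = [0.0, 0.0, 1.0, 2.0, 2.0, 1.0, 0.0]
print(dtw_distance(seq, stretched))  # 0.0
```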
Procedia PDF Downloads 495
1629 Audio-Visual Recognition Based on Effective Model and Distillation
Authors: Heng Yang, Tao Luo, Yakun Zhang, Kai Wang, Wei Qin, Liang Xie, Ye Yan, Erwei Yin
Abstract:
Recent years have shown that audio-visual recognition has great potential in strong-noise environments. Existing audio-visual recognition methods have explored approaches based on ResNet and feature fusion. However, on the one hand, ResNet occupies a large amount of memory, restricting its application in engineering. On the other hand, feature-level fusion also introduces interference in high-noise environments. To solve these problems, we propose an effective framework with bidirectional distillation. First, in consideration of its good performance in feature extraction, we chose a light model, EfficientNet, as our spatial feature extractor. Second, self-distillation was applied to learn more information from the raw data. Finally, we propose bidirectional distillation for decision-level fusion. Our experimental results are based on a multi-modal dataset from 24 volunteers. The lipreading accuracy of our framework was increased by 2.3% compared with existing systems, and our framework improves audio-visual fusion in high-noise environments compared with an audio-only recognition system.
Keywords: lipreading, audio-visual, EfficientNet, distillation
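Distillation of any flavor (self-distillation or the bidirectional, decision-level variant described here, where each modality's branch teaches the other) is typically trained with a temperature-softened KL divergence. A minimal numpy sketch of that loss (our illustration, not the paper's implementation):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over a logit vector."""
    e = np.exp((z - z.max()) / T)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between softened teacher and student output
    distributions, scaled by T^2 as is conventional in knowledge
    distillation; in a bidirectional scheme each branch takes a
    turn as teacher."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * np.log(p / q)))

teacher = np.array([3.0, 1.0, 0.2])
student = np.array([2.5, 1.2, 0.3])
print(distillation_loss(student, teacher) > 0.0)  # True: KL is positive
```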
Procedia PDF Downloads 134
1628 A Study of Adaptive Fault Detection Method for GNSS Applications
Authors: Je Young Lee, Hee Sung Kim, Kwang Ho Choi, Joonhoo Lim, Sebum Chun, Hyung Keun Lee
Abstract:
The purpose of this study is to develop an efficient fault detection method for Global Navigation Satellite System (GNSS) applications based on adaptive estimation. Because they depend on radio-frequency signals, GNSS measurements are dominated by systematic errors in the receiver's operating environment. Thus, to utilize GNSS for aerospace or ground vehicles requiring a high level of safety, unhealthy measurements must be treated seriously. For this reason, this paper proposes an adaptive fault detection method to deal with unhealthy measurements in various harsh environments. In the proposed method, the test statistic for fault detection is generated from the estimated measurement noise. Pseudorange and carrier-phase measurement noise are obtained at the time propagations and measurement updates of Carrier-Smoothed Code (CSC) filtering, respectively. The performance of the proposed method was evaluated with field-collected GNSS measurements. To evaluate the fault detection capability, intentional faults were added to the measurements. The experimental results show that the proposed method is efficient in detecting unhealthy measurements and improves the accuracy of GNSS positioning under fault occurrence.
Keywords: adaptive estimation, fault detection, GNSS, residual
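The CSC filtering step that supplies the residuals can be sketched in software as a Hatch filter: noisy pseudoranges are smoothed with precise carrier-phase deltas, and the raw-minus-smoothed residual serves as a noise estimate for the test statistic. A minimal single-frequency illustration (not the authors' implementation; the numbers are synthetic):

```python
def hatch_filter(pseudoranges, carrier_phases, window=100):
    """Carrier-smoothed code (Hatch) filter: blend each noisy
    pseudorange with the previous smoothed value propagated by the
    carrier-phase delta, with a growing (capped) averaging window."""
    smoothed = [pseudoranges[0]]
    for k in range(1, len(pseudoranges)):
        n = min(k + 1, window)
        predicted = smoothed[-1] + (carrier_phases[k] - carrier_phases[k - 1])
        smoothed.append(pseudoranges[k] / n + predicted * (n - 1) / n)
    return smoothed

# Static receiver: the code is noisy, the carrier deltas are clean
# zeros, so smoothing pulls the pseudorange back toward the true range.
smoothed = hatch_filter([100.0, 100.4, 99.6], [50.0, 50.0, 50.0])
print([round(s, 3) for s in smoothed])  # [100.0, 100.2, 100.0]
```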
Procedia PDF Downloads 575
1627 Performance Prediction of a SANDIA 17-m Vertical Axis Wind Turbine Using Improved Double Multiple Streamtube
Authors: Abolfazl Hosseinkhani, Sepehr Sanaye
Abstract:
Different approaches have been used to predict the performance of vertical axis wind turbines (VAWTs), including experimental, computational fluid dynamics (CFD), and analytical methods. Analytical methods, such as momentum models that use streamtubes, have low computational cost and sufficient accuracy. The double multiple streamtube (DMST) model is one of the most commonly used momentum models; it divides the rotor plane of the VAWT into upwind and downwind halves. In practice, results from the DMST method have shown some discrepancy with experimental results, because the Darrieus turbine is a complex and aerodynamically unsteady configuration. In this study, analytical- and experimental-based corrections, including dynamic stall, streamtube expansion, and finite blade length corrections, are used to improve the DMST method. The results indicate that applying these corrections to a SANDIA 17-m VAWT improves the DMST predictions.
Keywords: vertical axis wind turbine, analytical, double multiple streamtube, streamtube expansion model, dynamic stall model, finite blade length correction
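Momentum models of this kind solve, for each streamtube, an induction-factor balance between blade loading and momentum theory. A toy sketch of that inner fixed-point iteration, using the basic actuator-disk relation CT = 4a(1-a) rather than the full DMST with its corrections:

```python
def induction_factor(ct, tol=1e-10, max_iter=200):
    """Fixed-point iteration for the axial induction factor a in the
    momentum relation CT = 4 a (1 - a): the kind of per-streamtube
    iteration a DMST code performs for each upwind and downwind tube."""
    a = 0.0
    for _ in range(max_iter):
        a_new = ct / (4.0 * (1.0 - a))
        if abs(a_new - a) < tol:
            return a_new
        a = a_new
    return a

# For CT = 0.75 the momentum relation gives a = 0.25
print(round(induction_factor(0.75), 6))  # 0.25
```

In a real DMST code this loop runs once per streamtube for the upwind half, and the downwind half then sees the decelerated wake velocity as its inflow.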
Procedia PDF Downloads 135
1626 A Different Approach to Smart Phone-Based Wheat Disease Detection System Using Deep Learning for Ethiopia
Authors: Nathenal Thomas Lambamo
Abstract:
More than 85% of the labor force and 90% of export earnings in Ethiopia come from agriculture, which can be said to be the backbone of the country's overall socio-economic activity. Among the cereal crops that the agricultural sector provides, wheat ranks third after teff and maize. Today, wheat is in ever higher demand owing to the expansion of industries that use it as the main ingredient in their products. The local supply of wheat for these companies covers only 35 to 40% of demand, and the remaining 60 to 65% is imported, which exhausts the country's foreign currency reserves. These facts show that the need for this crop is very high while, conversely, its productivity is very low. Wheat disease is the most devastating factor contributing to this imbalance between demand and supply, reducing both the yield and the quality of the crop by 27% on average and by up to 37% in severe cases. This study aims to detect the most frequent and damaging wheat diseases, Septoria and leaf rust, using the most widely used subset of machine learning technology, deep learning. As a state-of-the-art approach, a deep learning classification technique called the Convolutional Neural Network (CNN) has been used to detect the diseases, and an accuracy of 99.01% was achieved.
Keywords: septoria, leaf rust, deep learning, CNN
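The convolution at the heart of a CNN layer slides a small learned kernel over the leaf image and responds to local patterns such as lesion edges. An illustrative numpy sketch of that core operation (not the study's network; the patch and kernel values are made up):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation: the core operation of the
    convolutional layers used for leaf-image classification."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

# A vertical-edge kernel responds strongly at the lesion boundary
leaf_patch = np.array([[0.0, 0.0, 1.0, 1.0],
                       [0.0, 0.0, 1.0, 1.0],
                       [0.0, 0.0, 1.0, 1.0]])
edge_kernel = np.array([[-1.0, 1.0],
                        [-1.0, 1.0]])
response = conv2d(leaf_patch, edge_kernel)
print(response[0])  # peak of 2.0 at the boundary column
```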
Procedia PDF Downloads 76
1625 Machine Learning Approach for Mutation Testing
Authors: Michael Stewart
Abstract:
Mutation testing is a type of software testing proposed in the 1970s in which program statements are deliberately changed to introduce simple errors, so that test cases can be validated by determining whether they detect those errors. Test cases are executed against the mutant code to determine whether any of them fail; a failing test detects the error and helps ensure the program is correct. One major issue with this type of testing is that it becomes computationally intensive to generate and test all possible mutations for complex programs. This paper uses reinforcement learning and parallel processing in the context of mutation testing to select mutation operators and test cases in a way that reduces the computational cost of testing and improves test suite effectiveness. Experiments were conducted on sample programs to determine how well the reinforcement learning-based algorithm performed with one live mutation, multiple live mutations, and no live mutations. The experiments, measured by mutation score, were used to update the algorithm and improve its prediction accuracy. Performance was then evaluated on multi-processor computers. With reinforcement learning, the number of mutation operators used was reduced by 50-100%.
Keywords: automated-testing, machine learning, mutation testing, parallel processing, reinforcement learning, software engineering, software testing
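The mutation score used to measure the experiments is the fraction of generated mutants the test suite kills; surviving ("live") mutants indicate gaps in the suite. A minimal sketch (names are ours, not from the paper):

```python
def mutation_score(results):
    """Mutation score: fraction of generated mutants killed by the
    test suite. 'killed' means at least one test failed against that
    mutant; a surviving mutant is a live mutation."""
    killed = sum(1 for r in results if r == "killed")
    return killed / len(results)

# Hypothetical run: 3 mutants killed, 1 survives (one live mutation)
print(mutation_score(["killed", "killed", "survived", "killed"]))  # 0.75
```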
Procedia PDF Downloads 198
1624 Tibyan Automated Arabic Correction Using Machine-Learning in Detecting Syntactical Mistakes
Authors: Ashwag O. Maghraby, Nida N. Khan, Hosnia A. Ahmed, Ghufran N. Brohi, Hind F. Assouli, Jawaher S. Melibari
Abstract:
Arabic is one of the most important languages. Learning it matters to many people around the world because of its religious and economic importance, and the real challenge lies in using it without grammatical or syntactical mistakes. This research focuses on detecting and correcting syntactic mistakes in Arabic according to their position in the sentence, concentrating on two of the main syntactical rules in Arabic: the dual and the plural. The system analyzes each sentence in the text using the Stanford CoreNLP morphological analyzer and a machine-learning approach in order to detect syntactical mistakes and then correct them. A prototype of the proposed system was implemented and evaluated. It uses the support vector machine (SVM) algorithm to detect Arabic grammatical errors and corrects them using a rule-based approach. The prototype achieves a fair accuracy of 81%. In general, it provides a set of useful grammatical suggestions covering mistakes the user may make while writing, whether from unfamiliarity with the grammar or from the speed of writing, such as alerting the user when a plural term is used to refer to one person.
Keywords: Arabic language acquisition and learning, natural language processing, morphological analyzer, part-of-speech
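One rule-based correction of the kind described, dual/plural number agreement, might look like the following toy sketch. The single suffix rule shown (nominative case only) is our illustration, not the Tibyan implementation, which relies on a morphological analyzer rather than raw suffixes:

```python
# Toy rule: a dual noun (nominative suffix "ان", -aan) should be
# followed by a dual adjective, not a sound-plural one ("ون", -uun).
DUAL_SUFFIX = "ان"
PLURAL_SUFFIX = "ون"

def fix_dual_agreement(noun, adjective):
    """If the noun is dual but the adjective carries a plural suffix,
    rewrite the adjective with the dual suffix. Hypothetical rule
    covering only the nominative case; a real system consults a
    morphological analysis of both words."""
    if noun.endswith(DUAL_SUFFIX) and adjective.endswith(PLURAL_SUFFIX):
        return adjective[: -len(PLURAL_SUFFIX)] + DUAL_SUFFIX
    return adjective

# "Two teachers" (dual) followed by "diligent" in the plural form
# gets corrected to the dual form of the adjective.
print(fix_dual_agreement("معلمان", "مجتهدون"))  # مجتهدان
```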
Procedia PDF Downloads 152
1623 CFD-DEM Modelling and Analysis of the Continuous Separation of Sized Particles Using Inertial Microfluidics
Authors: Hui Zhu, Yuan Wang, Shibo Kuang, Aibing Yu
Abstract:
The inertial differences induced by microfluidics inside a curved micro-channel have great potential to provide a fast, inexpensive, and portable solution to the separation of micro- and sub-micro particles in many applications, such as aerosol collection, airborne bacteria and virus detection, and particle sorting. In this work, the separation behaviors of different-sized particles inside a previously reported curved micro-channel have been studied by a combined approach of computational fluid dynamics for the gas and a discrete element model for the particles (CFD-DEM). The micro-channel is operated by controlling the gas flow rates at each of its branches, which are used respectively to load particles, introduce gas streams, and collect particles of various sizes. The validity of the model has been examined by comparing the calculated separation efficiencies of different-sized particles against measurements. On this basis, the separation mechanisms of the inertial microfluidic separator are elucidated in terms of the interactions between particles, between particle and fluid, and between particle and wall. The model is then used to study the effect of feed solids concentration on separation accuracy and efficiency. The results demonstrate that the CFD-DEM approach provides a convenient way to study particle separation behaviors in micro-channels of various types.
Keywords: CFD-DEM, inertial effect, microchannel, separation
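On the DEM side, each particle's motion is integrated under fluid forces. A minimal sketch of one drag-driven step for a single particle (Stokes drag only; gravity, lift, and particle-particle contacts omitted, and all parameter values are hypothetical, not from the paper):

```python
def dem_step(pos, vel, fluid_vel, dt, diameter, rho_p, mu):
    """One explicit DEM time step for a particle driven by Stokes
    drag from the gas. tau is the particle relaxation time; it scales
    with diameter^2, so small particles follow the flow closely while
    large ones lag: the size dependence an inertial separator exploits."""
    tau = rho_p * diameter ** 2 / (18.0 * mu)
    vel = vel + (fluid_vel - vel) / tau * dt
    pos = pos + vel * dt
    return pos, vel

# A 1 um particle released at rest in a 1 m/s gas stream begins to
# follow the gas within microseconds; a 10 um particle responds
# about 100x more slowly.
pos, vel = 0.0, 0.0
for _ in range(10):
    pos, vel = dem_step(pos, vel, fluid_vel=1.0, dt=1e-7,
                        diameter=1e-6, rho_p=1000.0, mu=1.8e-5)
print(0.0 < vel < 1.0)  # True: relaxing toward the gas velocity
```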
Procedia PDF Downloads 292
1622 Neural Networks for Distinguishing the Performance of Two Hip Joint Implants on the Basis of Hip Implant Side and Ground Reaction Force
Authors: L. Parisi
Abstract:
In this research work, neural networks were applied to classify two types of hip joint implants based on the relative hip joint implant side speed and the three components of the ground reaction force. Walking gait at normal velocity was used, and trials were carried out with each of the two hip joint implants assessed. The temporal changes of the ground reaction force components were considered in the first approach but discarded in the second. Ground reaction force components were obtained from eighteen patients under this gait condition, half of whom had a hip implant of type I-II, whilst the other half had the implant defined as type III by Orthoload®. After pre-processing the raw gait kinetic data and selecting the time frames needed for the analysis, the ground reaction force components were used to train an MLP neural network, which learnt to distinguish the two hip joint implants under the abovementioned condition. After training, unknown hip implant side and ground reaction force components were presented to the neural network, which assigned those features to the correct class with reasonably high accuracy for both the type I-II and the type III implants. The results suggest that neural networks could be successfully applied in the performance assessment of hip joint implants.
Keywords: kinetic gait data, neural networks, hip joint implant, hip arthroplasty, rehabilitation engineering
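A forward pass of the kind of MLP described, mapping a four-element feature vector (implant-side speed plus the three GRF components) to class probabilities, can be sketched as follows. The weights and the feature vector are made up for illustration; they are not trained values from the study:

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    """Forward pass of a one-hidden-layer MLP: tanh hidden units and
    a softmax output giving the probability of each implant class."""
    h = np.tanh(W1 @ x + b1)
    z = W2 @ h + b2
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical weights: 4 inputs -> 5 hidden units -> 2 classes
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(5, 4)), np.zeros(5)
W2, b2 = rng.normal(size=(2, 5)), np.zeros(2)

# Hypothetical features: [implant-side speed, GRF x, GRF y, GRF z]
gait_features = np.array([1.2, 0.1, 0.8, 9.6])
probs = mlp_forward(gait_features, W1, b1, W2, b2)
print(round(float(probs.sum()), 6))  # 1.0: a distribution over classes
```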
Procedia PDF Downloads 354