Search results for: bagging ensemble methods
14517 Tongue Image Retrieval Using Machine Learning
Authors: Ahmad FAROOQ, Xinfeng Zhang, Fahad Sabah, Raheem Sarwar
Abstract:
In Traditional Chinese Medicine (TCM), tongue diagnosis is a vital inspection tool. In this study, we explore the potential of machine learning in tongue diagnosis. The study begins with a cataloguing of the various classifications and characteristics of the human tongue: we infer 24 kinds of tongues from the material and coating of the tongue, and we identify 21 attributes of the tongue. The next step is to apply machine learning methods to the tongue dataset. We use the Weka machine learning platform to conduct the experiments for performance analysis. The 457 instances of the tongue dataset are used to test the performance of five different machine learning methods, including SVM, Random Forests, Decision Trees, and Naive Bayes. Based on accuracy and Area under the ROC Curve (AUC), the Support Vector Machine algorithm was shown to be the most effective for tongue diagnosis.
Keywords: medical imaging, image retrieval, machine learning, tongue
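As an illustration of the classifier comparison described above, a minimal sketch in Python with scikit-learn, assuming a synthetic stand-in for the tongue dataset (the real data has 457 instances, 21 attributes, and multi-class labels; this is not the Weka workflow itself):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score, roc_auc_score

# Stand-in data: 457 instances, 21 attributes, as in the abstract.
X, y = make_classification(n_samples=457, n_features=21, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "SVM": SVC(probability=True, random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: accuracy={acc:.3f}, AUC={auc:.3f}")
```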
Procedia PDF Downloads 81
14516 Performance Evaluation of Dynamic Signal Control System for Mixed Traffic Conditions
Authors: Aneesh Babu, S. P. Anusha
Abstract:
A dynamic signal control system combines traditional traffic lights with an array of sensors to intelligently control vehicle and pedestrian traffic. The present study focuses on evaluating the performance of dynamic signal control systems for mixed traffic conditions. Data collected from the four approaches to a typical four-legged signalized intersection in Trivandrum city, in the Kerala state of India, are used for the study. The performance of three dynamic signal control methods, namely (i) a non-sequential method, (ii) Webster design for consecutive signal cycles using flow as input, and (iii) dynamic signal control using RFID delay as input, was evaluated. The evaluation of the dynamic signal control systems was carried out using a calibrated VISSIM microsimulation model. Python programming was used to integrate the dynamic signal control algorithm through the COM interface in VISSIM. The intersection delay obtained from the different dynamic signal control methods was compared with the delay obtained from fixed signal control. Based on the study results, it was observed that the intersection delay was reduced significantly by using dynamic signal control methods. The dynamic signal control method using delay from RFID sensors resulted in a higher percentage reduction in delay and hence is a suitable choice for implementation under mixed traffic conditions. The developed dynamic signal control strategies can be implemented in ITS applications under mixed traffic conditions.
Keywords: dynamic signal control, intersection delay, mixed traffic conditions, RFID sensors
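The Python-to-VISSIM coupling mentioned above is typically done through the COM server that PTV Vissim registers on Windows; a minimal sketch, assuming pywin32 is available, with a hypothetical network file and controller number (the control rule itself is a placeholder, not the paper's algorithm):

```python
import win32com.client  # pywin32; PTV Vissim exposes a COM server on Windows

# Attach to the Vissim COM server (assumes Vissim is installed/registered).
vissim = win32com.client.Dispatch("Vissim.Vissim")
vissim.LoadNet(r"C:\models\intersection.inpx")  # hypothetical network file

sim = vissim.Simulation
sc = vissim.Net.SignalControllers.ItemByKey(1)  # hypothetical controller no.

for step in range(3600):          # one simulated hour at 1 step/s
    sim.RunSingleStep()           # advance the simulation one step
    if step % 60 == 0:            # re-time the signal once per minute
        for sg in sc.SGs:         # iterate the controller's signal groups
            # A real controller would set states from flow or RFID delay
            # data; forcing green here is only a placeholder action.
            sg.SetAttValue("SigState", "GREEN")
```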
Procedia PDF Downloads 107
14515 Induction Heating Process Design Using Comsol® Multiphysics Software Version 4.2a
Authors: K. Djellabi, M. E. H. Latreche
Abstract:
Induction heating computer simulation is a powerful tool for process design and optimization, induction coil design, and equipment selection, as well as for education and business presentations. The authors share their vast experience in the practical use of computer simulation for different induction heating and heat treating processes. This paper deals with the mathematical modeling and numerical simulation of induction heating furnaces with axisymmetric geometries. For the numerical solution, we propose finite element methods (FEM) combined with boundary element methods for the electromagnetic model, using COMSOL® Multiphysics software. Some numerical results for an industrial furnace operating at high frequency are shown.
Keywords: numerical methods, induction furnaces, induction heating, finite element method, Comsol multiphysics software
Procedia PDF Downloads 450
14514 Information and Communication Technology (ICT) and Yoruba Language Teaching
Authors: Ayoola Idowu Olasebikan
Abstract:
The global community has become increasingly dependent on various kinds of technologies, of which Information and Communication Technologies (ICTs) appear to be the most prominent. ICTs have become multipurpose tools that have had a revolutionary impact on how we see the world and how we live in it. Yoruba is the most widely spoken African language outside Africa, but it remains one of the most poorly spoken languages in the world as a result of the outdated teaching methods in African schools, which prevent its standard version from being spoken and written. This paper conducts a critical review of the traditional methods of teaching the Yoruba language. It then examines the possibility of leveraging ICTs for improved methods of teaching Yoruba, to achieve a global standard and spread. It identifies the key ICT platforms that can be deployed for the teaching of the Yoruba language and the constraints facing each of them. The paper concludes that Information and Communication Technologies appear to provide a veritable opportunity for a paradigm shift in the methods of teaching the Yoruba language. It also opines that the Yoruba language has the potential to transform the economic fortune of Africa for sustainable development, provided its teaching is taken beyond the brick-and-mortar classroom to the virtual classroom, the global information superhighway called the internet, or any other ICT medium. It recommends that students and teachers of the Yoruba language be encouraged to acquire basic skills in computer and internet technology in order to enhance their ability to develop and retrieve electronic Yoruba language teaching materials.
Keywords: Africa, ICT, teaching method, Yoruba language
Procedia PDF Downloads 361
14513 Analysis of Dynamics Underlying the Observation Time Series by Using a Singular Spectrum Approach
Authors: O. Delage, H. Bencherif, T. Portafaix, A. Bourdier
Abstract:
The main purpose of time series analysis is to learn about the dynamics behind some time-ordered measurement data. Two approaches are used in the literature to gain better knowledge of the dynamics contained in observational data sequences. The first of these approaches concerns time series decomposition, an important analysis step allowing patterns and behaviors to be extracted as components, providing insight into the mechanisms producing the time series. In many cases, time series are short, noisy, and non-stationary. To provide components which are physically meaningful, methods such as Empirical Mode Decomposition (EMD), the Empirical Wavelet Transform (EWT) or, more recently, Empirical Adaptive Wavelet Decomposition (EAWD) have been proposed. The second approach is to reconstruct the dynamics underlying the time series as a trajectory in state space, by mapping the time series into a set of lag vectors in Rᵐ using the method of delays (MOD). Takens proved that the trajectory obtained with the MOD technique is equivalent to the trajectory representing the dynamics behind the original time series. This work introduces singular spectrum decomposition (SSD), a new adaptive method for decomposing non-linear and non-stationary time series into narrow-banded components. The method takes its origin from singular spectrum analysis (SSA), a nonparametric spectral estimation method used for the analysis and prediction of time series. As the first step of SSD is to constitute a trajectory matrix by embedding a one-dimensional time series into a set of lagged vectors, SSD can also be seen as a reconstruction method like MOD. We first give a brief overview of the existing decomposition methods (EMD, EWT, EAWD). The SSD method is then described in detail and applied to experimental time series of total ozone column observations. The results obtained are compared with those provided by the previously mentioned decomposition methods. We also compare the reconstruction quality of the observed dynamics obtained from the SSD and MOD methods.
Keywords: time series analysis, adaptive time series decomposition, wavelet, phase space reconstruction, singular spectrum analysis
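For context on the method of delays (MOD) and the trajectory matrix construction shared with SSD, a minimal sketch of time-delay embedding in Python, assuming the embedding dimension m and lag tau are chosen by the analyst:

```python
import numpy as np

def delay_embed(x, m, tau):
    """Map a 1-D series into R^m lag vectors (method of delays)."""
    x = np.asarray(x)
    n = len(x) - (m - 1) * tau       # number of complete lag vectors
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

# Toy example: a noisy sine as the observed scalar series.
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.05 * np.random.default_rng(0).standard_normal(t.size)
trajectory = delay_embed(x, m=3, tau=25)   # trajectory matrix in R^3
print(trajectory.shape)                    # (1950, 3)
```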
Procedia PDF Downloads 104
14512 Estimation of Stress Intensity Factors from near Crack Tip Field
Authors: Zhuang He, Andrei Kotousov
Abstract:
All current experimental methods for the determination of stress intensity factors are based on the assumption that the state of stress near the crack tip is plane stress. Therefore, these methods rely on strain and displacement measurements made outside the near-crack-tip region affected by three-dimensional effects or by the process zone. In this paper, we develop and validate an experimental procedure for the evaluation of stress intensity factors from measurements of the out-of-plane displacements in the surface area controlled by 3D effects. The evaluation of stress intensity factors is possible when the process zone is sufficiently small and the displacement field generated by the 3D effects is fully encapsulated by the K-dominance region.
Keywords: digital image correlation, stress intensity factors, three-dimensional effects, transverse displacement
Procedia PDF Downloads 615
14511 The Structure and Function Investigation and Analysis of the Automatic Spin Regulator (ASR) in the Powertrain System of Construction and Mining Machines with the Focus on Dump Trucks
Authors: Amir Mirzaei
Abstract:
The powertrain system is one of the most basic and essential components of a machine; motion is practically impossible without it. Power generated by the engine is transmitted by the powertrain system to the wheels, the last parts of the system. The powertrain system has different components according to the type of use and design. When the force generated by the engine reaches the wheels, the amount of frictional force between the tire and the ground determines the amount of traction or slip. On surfaces such as icy, muddy, and snow-covered ground, the friction coefficient between the tire and the ground decreases dramatically and considerably, which in turn increases force losses and drastically reduces vehicle traction. This condition is caused by the phenomenon of slipping, which, in addition to wasting the energy produced, causes premature wear of the driving tires. It also causes the transmission oil temperature to rise excessively, reducing the oil's quality, dirtying it, and shortening the useful life of the clutch disks and plates inside the transmission. This issue is much more important in road construction and mining machinery than in passenger vehicles and is always one of the most significant topics in design discussions. One of the methods of overcoming it is the automatic spin regulator system, abbreviated ASR. The importance of this method, and its structure and function, have addressed one of the biggest challenges of the powertrain system in the field of construction and mining machinery; this is what the present research examines.
Keywords: automatic spin regulator, ASR, methods of reducing slipping, methods of preventing the reduction of the useful life of clutches disk and plate, methods of preventing the premature dirtiness of transmission oil, method of preventing the reduction of the useful life of tires
Procedia PDF Downloads 79
14510 Optimization of Proton Exchange Membrane Fuel Cell Parameters Based on Modified Particle Swarm Algorithms
Authors: M. Dezvarei, S. Morovati
Abstract:
In recent years, the increasing usage of electrical energy has opened a widespread field for investigating new methods to produce clean electricity with high reliability and cost management. Fuel cells are a new, clean generation technology that produces electricity and thermal energy together with high performance and no environmental pollution. Given the expansion of fuel cell usage in different industrial networks, the identification and optimization of their parameters is really significant. This paper presents the optimization of proton exchange membrane fuel cell (PEMFC) parameters based on modified particle swarm optimization with real-valued mutation (RVM) and clonal algorithms. The mathematical equations of this type of fuel cell are presented as the main model structure in the optimization process. Parameters optimized by the clonal and RVM algorithms are compared with the desired values in the presence and absence of measurement noise. This paper shows that these methods can improve the performance of traditional optimization methods. Simulation results are employed to analyze and compare the performance of these methodologies in order to optimize the proton exchange membrane fuel cell parameters.
Keywords: clonal algorithm, proton exchange membrane fuel cell (PEMFC), particle swarm optimization (PSO), real-valued mutation (RVM)
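A minimal sketch of particle swarm optimization with a real-valued mutation step, assuming a generic quadratic objective stands in for the PEMFC parameter-identification loss (the coefficient values are illustrative, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Stand-in loss; a real study would compare PEMFC model output
    # (e.g. stack voltage vs. current) against measured data here.
    return np.sum((x - 1.5) ** 2)

n_particles, dim, iters = 30, 4, 200
w, c1, c2, p_mut = 0.7, 1.5, 1.5, 0.1  # inertia, cognitive, social, mutation rate

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([objective(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    # Real-valued mutation: Gaussian perturbation of a random particle subset.
    mask = rng.random(n_particles) < p_mut
    pos[mask] += rng.normal(0, 0.5, (mask.sum(), dim))
    f = np.array([objective(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("best parameters:", gbest)
```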
Procedia PDF Downloads 351
14509 Practice of Applying MIDI Technology to Train Creative Teaching Skills
Authors: Yang Zhuo
Abstract:
This study explores, from the perspective of teaching practice, the integration of MIDI technology, one of the important digital technologies in music teaching, into the process of cultivating students' teaching skills. The framework elements of the learning environment for music education students are divided into four aspects: a digital-technology-supported learning space, new knowledge learning, teaching methods, and teaching evaluation. In teaching activities, more attention should be paid to students' subjectivity and to the interaction between them, so as to enhance their emotional experience in simulated teaching practice. In the process of independent exploration and cooperative interaction, problems should be discovered and solved, and basic knowledge of music and teaching methods should be exercised in practice.
Keywords: music education, educational technology, MIDI, teacher training
Procedia PDF Downloads 84
14508 Recommendations Using Online Water Quality Sensors for Chlorinated Drinking Water Monitoring at Drinking Water Distribution Systems Exposed to Glyphosate
Authors: Angela Maria Fasnacht
Abstract:
Detection of anomalies due to the presence of contaminants, also known as early detection in water treatment plants, has become a critical point that deserves in-depth study for its improvement and adaptation to current requirements. The design of these systems requires detailed analysis and processing of the data in real time, so it is necessary to apply various statistical methods appropriate to the data generated, such as Spearman's correlation, factor analysis, cross-correlation, and k-fold cross-validation. Statistical analysis and methods allow the evaluation of large data sets to model the behavior of variables; in this sense, statistical treatment or analysis can be considered a vital step in developing advanced models focused on machine learning that allow optimized data management in real time, applied to early detection systems in water treatment processes. These techniques facilitate the development of new technologies used in advanced sensors. In this work, these methods were applied to identify possible correlations between the measured parameters and the presence of the glyphosate contaminant in a single-pass system. The interaction between the initial concentration of glyphosate and the location of the sensors on the readings of the reported parameters was studied.
Keywords: glyphosate, emergent contaminants, machine learning, probes, sensors, predictive
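A minimal sketch of two of the named methods, Spearman correlation screening and k-fold cross-validation, assuming a hypothetical table of sensor readings with a binary glyphosate-presence label (the feature names and data are stand-ins, not the study's measurements):

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, KFold

rng = np.random.default_rng(0)
# Hypothetical stand-in: 500 samples of 4 water-quality parameters
# (e.g. pH, conductivity, turbidity, ORP) and a glyphosate-presence label.
X = rng.normal(size=(500, 4))
y = (X[:, 1] + 0.5 * rng.normal(size=500) > 0).astype(int)

# Spearman's correlation between each parameter and the contaminant label.
for j in range(X.shape[1]):
    rho, p = spearmanr(X[:, j], y)
    print(f"parameter {j}: rho={rho:.2f}, p={p:.3g}")

# k-fold cross-validation of a simple predictive model.
scores = cross_val_score(LogisticRegression(), X, y,
                         cv=KFold(n_splits=5, shuffle=True, random_state=0))
print("5-fold accuracy:", scores.mean().round(3))
```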
Procedia PDF Downloads 123
14507 Comparative Study between Classical P-Q Method and Modern Fuzzy Controller Method to Improve the Power Quality of an Electrical Network
Authors: A. Morsli, A. Tlemçani, N. Ould Cherchali, M. S. Boucherit
Abstract:
This article presents two methods for the compensation of harmonics generated by a nonlinear load. The first is the classical P-Q method. The second is a controller based on a modern artificial intelligence method, specifically fuzzy logic. Both methods are applied to a shunt Active Power Filter (sAPF) based on a three-phase voltage converter with a five-level NPC topology. To calculate the harmonic reference currents, we use the P-Q algorithm, and for pulse generation, we use intersective PWM. For flexibility and dynamics, we use fuzzy logic. The results show clearly that the Total Harmonic Distortion obtained with fuzzy logic is better than with the P-Q method.
Keywords: fuzzy logic controller, P-Q method, pulse width modulation (PWM), shunt active power filter (sAPF), total harmonic distortion (THD)
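For reference, the instantaneous p-q computation at the core of the classical method; a minimal sketch on toy three-phase signals, assuming the power-invariant Clarke transform and one common sign convention for q (conventions vary in the literature):

```python
import numpy as np

def clarke(a, b, c):
    """Power-invariant Clarke (abc -> alpha-beta) transform."""
    alpha = np.sqrt(2 / 3) * (a - 0.5 * b - 0.5 * c)
    beta = np.sqrt(2 / 3) * (np.sqrt(3) / 2) * (b - c)
    return alpha, beta

# Toy signals: fundamental voltage, load current with a 5th harmonic.
t = np.linspace(0, 0.04, 2000)
w = 2 * np.pi * 50
va, vb, vc = (np.cos(w * t - k * 2 * np.pi / 3) for k in range(3))
ia, ib, ic = (np.cos(w * t - k * 2 * np.pi / 3)
              + 0.2 * np.cos(5 * w * t + k * 2 * np.pi / 3) for k in range(3))

v_al, v_be = clarke(va, vb, vc)
i_al, i_be = clarke(ia, ib, ic)

p = v_al * i_al + v_be * i_be   # instantaneous real power
q = v_be * i_al - v_al * i_be   # instantaneous imaginary power (one convention)
# The oscillating parts of p and q (extracted by filtering) yield the
# harmonic reference currents the shunt APF must inject.
print(p.mean().round(3), q.mean().round(3))
```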
Procedia PDF Downloads 548
14506 Unsupervised Classification of DNA Barcodes Species Using Multi-Library Wavelet Networks
Authors: Abdesselem Dakhli, Wajdi Bellil, Chokri Ben Amar
Abstract:
A DNA barcode is a short mitochondrial DNA fragment whose nucleotides are made up of three subunits: a phosphate group, a sugar, and a nucleic base (A, T, C, or G). DNA barcodes provide good sources of the information needed to classify living species. Such intuition has been confirmed by many experimental results, and species classification with DNA barcode sequences has been studied by several researchers. The classification problem assigns unknown species to known ones by analyzing their barcodes. This task has to be supported by reliable methods and algorithms. To analyze species regions or entire genomes, it becomes necessary to use sequence similarity methods. A large set of sequences can be simultaneously compared using Multiple Sequence Alignment, which is known to be NP-complete. To make this type of analysis feasible, heuristics, like progressive alignment, have been developed. Another tool for similarity search against a database of sequences is BLAST, which outputs shorter regions of high similarity between a query sequence and matched sequences in the database. However, all these methods are still computationally very expensive and require significant computational infrastructure. Our goal is to build predictive models that are highly accurate and interpretable. This method avoids the complex problem of form and structure in different classes of organisms. It is evaluated on empirical data, and its classification performance is compared with other methods. Our system consists of three phases. The first is called transformation and is composed of three steps: Electron-Ion Interaction Pseudopotential (EIIP) coding of the DNA barcodes, the Fourier transform, and power spectrum signal processing. The second is called approximation and is empowered by the use of Multi-Library Wavelet Neural Networks (MLWNN). The third is the classification of DNA barcodes, realized by applying a hierarchical classification algorithm.
Keywords: DNA barcode, electron-ion interaction pseudopotential, Multi Library Wavelet Neural Networks (MLWNN)
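A minimal sketch of the transformation phase (EIIP coding, Fourier transform, power spectrum), assuming the commonly cited EIIP values per nucleotide; the wavelet-network approximation and hierarchical classification stages are omitted:

```python
import numpy as np

# Commonly cited electron-ion interaction pseudopotential (EIIP) values.
EIIP = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}

def barcode_power_spectrum(seq):
    """Codify a DNA barcode via EIIP, then return its Fourier power spectrum."""
    signal = np.array([EIIP[base] for base in seq.upper()])
    signal = signal - signal.mean()          # remove the DC component
    return np.abs(np.fft.rfft(signal)) ** 2

ps = barcode_power_spectrum("ATGCGTACGTTAGCCATAGGCTA")
print(ps[:5].round(4))
```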
Procedia PDF Downloads 318
14505 Literature Review: Application of Artificial Intelligence in EOR
Authors: Masoumeh Mofarrah, Amir NahanMoghadam
Abstract:
Higher oil prices and increasing oil demand are the main reasons for the great attention paid to Enhanced Oil Recovery (EOR). Comprehensive research has been carried out to develop, appraise, and improve EOR methods and their application. Recently, Artificial Intelligence (AI) has gained popularity in the petroleum industry, as it can help petroleum engineers solve fundamental petroleum engineering problems such as reservoir simulation, EOR project risk analysis, well log interpretation, and well test model selection. This study presents a historical overview of the most popular AI tools in the petroleum industry, including neural networks, genetic algorithms, fuzzy logic, and expert systems, and discusses two case studies representing the application of two of these AI methods to selecting an appropriate EOR method, based on reservoir characterization, in a feasible and effective way.
Keywords: artificial intelligence, EOR, neural networks, expert systems
Procedia PDF Downloads 408
14504 Time Delay Estimation Using Signal Envelopes for Synchronisation of Recordings
Authors: Sergei Aleinik, Mikhail Stolbov
Abstract:
In this work, a method of time delay estimation for dual-channel acoustic signals (speech, music, etc.) recorded under reverberant conditions is investigated. Standard methods based on cross-correlation of the signals show poor results in cases involving strong reverberation, large distances between microphones, and asynchronous recordings. Under such conditions, a method based on cross-correlation of the temporal envelopes of the signals delivers a delay estimate of acceptable quality. This method and its properties are described and investigated in detail, including its limits of applicability. The method's optimal parameter estimation and a comparison with other known methods of time delay estimation are also provided.
Keywords: cross-correlation, delay estimation, signal envelope, signal processing
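A minimal sketch of envelope-based delay estimation, assuming two synthetic channels where one is a delayed, noisy copy of the other (Hilbert-transform envelopes, then cross-correlation of the envelopes rather than the raw signals):

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)
fs = 16000
x = rng.normal(size=fs) * np.exp(-np.linspace(0, 4, fs))  # toy decaying burst
true_delay = 120                                           # samples
y = np.concatenate([np.zeros(true_delay), x[:-true_delay]])
y += 0.1 * rng.normal(size=fs)                             # channel noise

# Temporal envelopes via the Hilbert transform.
env_x = np.abs(hilbert(x))
env_y = np.abs(hilbert(y))

# Cross-correlate the mean-removed envelopes; the peak gives the delay.
cc = np.correlate(env_y - env_y.mean(), env_x - env_x.mean(), mode="full")
lag = cc.argmax() - (len(x) - 1)
print("estimated delay:", lag, "samples")   # expected ~120
```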
Procedia PDF Downloads 485
14503 Methodology for Temporary Analysis of Production and Logistic Systems on the Basis of Distance Data
Authors: M. Mueller, M. Kuehn, M. Voelker
Abstract:
In small and medium-sized enterprises (SMEs), the challenge is to create a well-grounded and reliable basis for process analysis, optimization, and planning despite a lack of data. SMEs have limited access to methods with which they can effectively and efficiently analyse processes and identify cause-and-effect relationships in order to generate the necessary database and derive optimization potential from it. The implementation of digitalization within the framework of Industry 4.0 thus becomes a particular necessity for SMEs. For these reasons, this abstract presents an analysis methodology whose objective is an SME-appropriate methodology for efficient, temporarily deployable data collection and evaluation in flexible production and logistics systems, as a basis for process analysis and optimization. The overall methodology focuses on retrospective, event-based tracing and analysis of material flow objects. The technological basis consists of Bluetooth Low Energy (BLE) transmitters, so-called beacons, and smart mobile devices (SMDs), e.g. smartphones, as receivers, between which distance data can be measured and motion profiles derived. The distance is determined using the Received Signal Strength Indicator (RSSI), which is a measure of the signal field strength between transmitter and receiver. The focus is the development of a software-based methodology for the interpretation of relative movements of transmitters and receivers based on distance data. The main research concerns the selection and implementation of pattern recognition methods for automatic process recognition, as well as methods for the visualization of relative distance data. Because the database is already categorized by process type, classification methods (e.g. Support Vector Machines) from the field of supervised learning are used. The necessary data quality requires the selection of suitable methods, as well as filters for smoothing the signal variations that occur in the RSSI, the integration of methods for determining correction factors depending on possible sources of signal interference (columns, pallets), and the configuration of the technology used. The parameter settings on which the respective algorithms are based have a further significant influence on the result quality of the classification methods, correction models, and methods used for visualizing the position profiles. The accuracy of classification algorithms can be improved by up to 30% through selected parameter variation; this has already been proven in studies. Similar potential can be observed with parameter variation of the methods and filters for signal smoothing. Thus, there is increased interest in obtaining detailed results on the influence of parameter and factor combinations on data quality in this area. The overall methodology is realized with a modular software architecture consisting of independent modules for data acquisition, data preparation, and data storage. The demonstrator for initialization and data acquisition is available as a mobile Java-based application. The data preparation, including the methods for signal smoothing, is Python-based, with the possibility of varying parameter settings and storing them in the database (SQLite). The evaluation is divided into two separate software modules with database connections: the automated assignment of defined process classes to distance data using selected classification algorithms, and visualization and reporting by means of a graphical user interface (GUI).
Keywords: event-based tracing, machine learning, process classification, parameter settings, RSSI, signal smoothing
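A minimal sketch of the RSSI-to-distance step with a simple smoothing filter, assuming the standard log-distance path-loss model with a hypothetical 1-metre reference RSSI and path-loss exponent (calibration values would come from the deployed beacons):

```python
import numpy as np

def rssi_to_distance(rssi, rssi_1m=-60.0, n=2.0):
    """Log-distance path-loss model: d = 10^((RSSI_1m - RSSI) / (10 n)).

    rssi_1m: measured RSSI at 1 m (hypothetical calibration value, dBm)
    n:       path-loss exponent (~2 in free space, larger indoors)
    """
    return 10 ** ((rssi_1m - np.asarray(rssi)) / (10 * n))

# Smooth a noisy RSSI stream with a moving average before converting.
rssi_stream = np.array([-70, -72, -69, -75, -71, -68, -74], dtype=float)
smoothed = np.convolve(rssi_stream, np.ones(3) / 3, mode="valid")
print(rssi_to_distance(smoothed).round(2))   # estimated distances in metres
```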
Procedia PDF Downloads 131
14502 Production of High-Content Fructo-Oligosaccharides
Authors: C. Nobre, C. C. Castro, A.-L. Hantson, J. A. Teixeira, L. R. Rodrigues, G. De Weireld
Abstract:
Fructo-oligosaccharides (FOS) are produced from sucrose by Aureobasidium pullulans in yields between 40 and 60% (w/w). To increase the purity of the FOS, it is necessary to remove the small, non-prebiotic sugars present. Two methods for producing high-purity FOS have been developed: the use of microorganisms able to consume the small saccharides, and the use of continuous chromatography to separate the sugars, namely simulated moving bed (SMB) chromatography. It is herein proposed to combine both methods. The aim of this study is to optimize the composition of the fermentative broth (in terms of salts and sugars) that will be further purified by SMB. A yield of 0.63 g FOS per g sucrose was obtained with A. pullulans using low amounts of salts in the initial fermentative broth. By removing the small sugars, Saccharomyces cerevisiae and Zymomonas mobilis increased the percentage of FOS from around 56.0% to 83% (w/w) on average, with only 10% (w/w) of the FOS lost during the recovery process.
Keywords: fructo-oligosaccharides, microbial treatment, Saccharomyces cerevisiae, Zymomonas mobilis
Procedia PDF Downloads 308
14501 Object Detection in Digital Images under Non-Standardized Conditions Using Illumination and Shadow Filtering
Authors: Waqqas-ur-Rehman Butt, Martin Servin, Marion Pause
Abstract:
In recent years, object detection has gained much attention and become a very encouraging research area in the field of computer vision. Robust detection of object boundaries in an image is demanded in numerous applications of human-computer interaction and automated surveillance systems. Many methods and approaches have been developed for automatic object detection in various fields, such as automotive, quality control management, and environmental services. Unfortunately, to the best of our knowledge, object detection under varying illumination with shadow consideration has not been well solved yet. Furthermore, this problem is one of the major hurdles keeping object detection methods from practical application. This paper presents an approach to automatic object detection in images under non-standardized environmental conditions. A key challenge is how to detect the object, particularly under uneven illumination conditions. Because image capturing conditions vary, the algorithms need to consider a variety of possible environmental factors, as the colour information, lighting, and shadows vary from image to image. Existing methods mostly fail to produce appropriate results due to variation in colour information, lighting effects, threshold specifications, histogram dependencies, and colour ranges. To overcome these limitations, we propose an object detection algorithm, with pre-processing methods, to reduce the interference caused by shadow and illumination effects without fixed parameters. We use the YCrCb colour model without any specific colour ranges or predefined threshold values. The segmented object regions are further classified using morphological operations (erosion and dilation) and contours. The proposed approach was applied to a large image data set acquired under various environmental conditions for wood stack detection. Experiments show the promising results of the proposed approach in comparison with existing methods.
Keywords: image processing, illumination equalization, shadow filtering, object detection
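A minimal sketch of the pre-processing chain named above (YCrCb conversion, erosion/dilation, contour extraction) using OpenCV; Otsu's threshold on the luma channel is an assumption standing in for the paper's parameter-free segmentation, and the input image is synthetic:

```python
import cv2
import numpy as np

# Synthetic stand-in for a captured scene: dark background, brighter "object".
img = np.full((240, 320, 3), 40, np.uint8)
cv2.rectangle(img, (80, 60), (240, 180), (90, 120, 160), -1)

ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)   # YCrCb colour model
y, cr, cb = cv2.split(ycrcb)

# Otsu's threshold on luma (an assumption; no fixed ranges are hard-coded).
_, mask = cv2.threshold(y, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Morphological clean-up: erosion removes speckle, dilation restores extent.
kernel = np.ones((5, 5), np.uint8)
mask = cv2.dilate(cv2.erode(mask, kernel), kernel)

# Contours of the surviving regions are the candidate object boundaries.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
print("candidate regions:", len(contours))
```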
Procedia PDF Downloads 216
14500 Assessing the Effects of Entrepreneurship Education and Moderating Variables on Venture Creation Intention of Undergraduate Students in Ghana
Authors: Daniel K. Gameti
Abstract:
The paper explores the effects of active and passive entrepreneurship education methods on the venture creation intention of undergraduate students in Ghana. The study also examines the moderating effects of gender and negative personal characteristics (risk tolerance, stress tolerance, and fear of failure) on students' venture creation intention. A deductive approach was used to collect quantitative data from 555 business students from one public university and one private university through self-administered questionnaires. Descriptive statistics were used to determine the dominant method of entrepreneurship education used in Ghana. Further, a structural equation model was used to test four hypotheses. The results of the study show that the dominant teaching method used in Ghana was lectures, and the least used was field trips. The study further revealed that passive methods of education are less effective than active methods, which were statistically significant for venture creation intention among students. There was also a statistical difference between male and female students' venture creation intention, which was stronger among male students. Finally, the only personal characteristic that influenced students' intention was stress tolerance, as risk tolerance and fear of failure were statistically insignificant.
Keywords: entrepreneurship education, Ghana, moderating variables, venture creation intention, undergraduate students
Procedia PDF Downloads 453
14499 The Marker Active Compound Identification of Calotropis gigantea Roots Extract as an Anticancer
Authors: Roihatul Mutiah, Sukardiman, Aty Widyawaruyanti
Abstract:
Calotropis gigantea (L.) R. Br. (Apocynaceae), commonly called "Biduri" or "giant milkweed", is a weed well known to many cultures for treating various disorders. Several studies have reported that C. gigantea roots have anticancer activity. The main aim of this research was to isolate and identify an active marker compound of C. gigantea roots for quality-control purposes of its extract in its development as a natural anticancer product. The isolation methods were bioactivity-guided column chromatography, TLC, and HPLC. The anticancer activity of the isolated substances was evaluated using the MTT assay. The structure of the active compound was identified by UV, ¹H-NMR, ¹³C-NMR, HMBC, and HMQC spectra and other references. The results showed that the marker active compound was identified as calotropin.
Keywords: calotropin, Calotropis gigantea, anticancer, marker active
Procedia PDF Downloads 336
14498 Evaluation of Drilling-Induced Delamination of Flax/Epoxy Composites by Non-Destructive Testing Methods
Authors: Hadi Rezghimaleki, Masatoshi Kubouchi, Yoshihiko Arao
Abstract:
The use of natural fiber composites (NFCs) is growing at a fast rate in industrial applications and fundamental research due to their eco-friendly, renewable nature and low density and cost. Drilling is one of the most important machining operations carried out on natural fiber composites. Delamination is a major concern in the drilling of NFCs, as it affects the structural integrity and long-term reliability of the machined components. Flax fiber reinforced epoxy composite laminates were prepared by the hot press technique. In this research, we evaluated the drilling-induced delamination of flax/epoxy composites by X-ray computed tomography (CT), ultrasonic testing (UT), and optical methods, and compared the results.
Keywords: natural fiber composites, flax/epoxy, X-ray CT, ultrasonic testing
Procedia PDF Downloads 299
14497 Machine Learning for Disease Prediction Using Symptoms and X-Ray Images
Authors: Ravija Gunawardana, Banuka Athuraliya
Abstract:
Machine learning has emerged as a powerful tool for disease diagnosis and prediction. The use of machine learning algorithms has the potential to improve the accuracy of disease prediction, thereby enabling medical professionals to provide more effective and personalized treatments. This study focuses on developing a machine-learning model for disease prediction using symptoms and X-ray images. The importance of this study lies in its potential to assist medical professionals in accurately diagnosing diseases, thereby improving patient outcomes. Respiratory diseases are a significant cause of morbidity and mortality worldwide, and chest X-rays are commonly used in the diagnosis of these diseases. However, accurately interpreting X-ray images requires significant expertise and can be time-consuming, making it difficult to diagnose respiratory diseases in a timely manner. By incorporating machine learning algorithms, we can significantly enhance disease prediction accuracy, ultimately leading to better patient care. The study utilized the Mask R-CNN algorithm, a state-of-the-art method for object detection and segmentation in images, to process the chest X-ray images. The model was trained and tested on a large dataset of patient information, which included both symptom data and X-ray images. The performance of the model was evaluated using a range of metrics, including accuracy, precision, recall, and F1-score. The results showed that the model achieved an accuracy rate of over 90%, indicating that it was able to accurately detect and segment regions of interest in the X-ray images. In addition to X-ray images, the study also incorporated symptoms as input data for disease prediction. The study used three different classifiers, namely Random Forest, K-Nearest Neighbor, and Support Vector Machine, to predict diseases based on symptoms. These classifiers were trained and tested using the same dataset of patient information as the X-ray model. The results showed promising accuracy rates for predicting diseases using symptoms, with ensemble learning techniques significantly improving the accuracy of disease prediction. The study's findings indicate that the use of machine learning algorithms can significantly enhance disease prediction accuracy, ultimately leading to better patient care. The model developed in this study has the potential to assist medical professionals in diagnosing respiratory diseases more accurately and efficiently. However, it is important to note that the accuracy of the model can be affected by several factors, including the quality of the X-ray images, the size of the dataset used for training, and the complexity of the disease being diagnosed. In conclusion, the study demonstrated the potential of machine learning algorithms for disease prediction using symptoms and X-ray images. The use of these algorithms can improve the accuracy of disease diagnosis, ultimately leading to better patient care. Further research is needed to validate the model's accuracy and effectiveness in a clinical setting and to expand its application to other diseases.
Keywords: K-nearest neighbor, mask R-CNN, random forest, support vector machine
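A minimal sketch of the symptom-based stage, combining the three named classifiers with soft voting as one plausible ensemble technique (the abstract does not specify which ensemble method was used, and the symptom data here is a synthetic stand-in):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Stand-in symptom data: feature indicators plus a disease label.
X, y = make_classification(n_samples=1000, n_features=30, n_informative=10,
                           random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(random_state=0)),
        ("knn", KNeighborsClassifier()),
        ("svm", SVC(probability=True, random_state=0)),
    ],
    voting="soft",   # average predicted probabilities across the three models
)
print("CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean().round(3))
```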
Procedia PDF Downloads 156
14496 Designing Directed Network with Optimal Controllability
Authors: Liang Bai, Yandong Xiao, Haorang Wang, Songyang Lao
Abstract:
The directedness of links is crucial in determining the controllability of complex networks; indeed, the edge directions alone can determine the controllability of a complex network. Obviously, for a given network, we wish to design its edge directions so that the network approaches optimal controllability. In this work, we first introduce two methods to enhance a network by assigning edge directions. However, these two methods cannot completely mitigate the negative effects of inaccessibility and dilations. Thus, to approach optimal network controllability, the edge directions must mitigate the negative effects of inaccessibility and dilations as much as possible. Finally, we propose an edge-direction assignment for optimal controllability. The optimal method has been successfully applied to real-world and synthetic networks.
Keywords: complex network, dynamics, network control, optimization
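For context, structural controllability of a directed network is commonly quantified by the minimum number of driver nodes, N_D = max(1, N - |M*|), where M* is a maximum matching (the framework of Liu et al.); a minimal sketch, assuming the standard bipartite reduction and using networkx:

```python
import networkx as nx

def num_driver_nodes(G):
    """Minimum driver nodes of a directed graph via maximum matching
    (structural controllability framework of Liu et al.)."""
    B = nx.Graph()
    out_nodes = [("out", u) for u in G.nodes]
    B.add_nodes_from(out_nodes, bipartite=0)
    B.add_nodes_from([("in", v) for v in G.nodes], bipartite=1)
    B.add_edges_from((("out", u), ("in", v)) for u, v in G.edges)
    matching = nx.bipartite.maximum_matching(B, top_nodes=out_nodes)
    matched_edges = len(matching) // 2   # the dict stores both directions
    return max(1, G.number_of_nodes() - matched_edges)

G = nx.gnp_random_graph(50, 0.05, directed=True, seed=0)
print("driver nodes:", num_driver_nodes(G))
```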
Procedia PDF Downloads 185
14495 Effect of Different Processing Methods on the Quality Attributes of Pigeon Pea Used in Bread Production
Authors: B. F. Olanipekun, O. J. Oyelade, C. O. Osemobor
Abstract:
Pigeon pea is a very good source of protein and micronutrients, but it is underutilized in Nigeria because of several constraints. This research considered the effect of different processing methods on the quality attributes of pigeon pea used in bread production, towards enhancing its utility. Pigeon pea was obtained at a local market and processed into flour using three processing methods, soaking, sprouting, and roasting, and the flours were used to bake bread in different proportions. The chemical composition and sensory attributes of the breads were thereafter determined. The highest values of protein and ash contents were obtained from 20% substitution of sprouted pigeon pea in wheat flour, which may be attributable to complex biochemical changes occurring during hydration that invariably lead to protein constituents being broken down. Hydrolytic activities of the enzymes in the sprouted sample resulted in an improvement in total protein, probably due to a reduction in carbohydrate content. Sensory quality analyses showed that bread produced with soaked and roasted pigeon pea flours at 5% and 10% inclusion, respectively, was more accepted than the other blends, and products with sprouted pigeon pea flour were least accepted. The findings of this research suggest that supplementing wheat flour with sprouted pigeon pea has more nutritional potential. However, by the sensory analysis indices, the soaked and roasted pigeon peas at up to 10% inclusion are the most accepted and can also improve nutritional status. Overall, this will be very beneficial to populations dependent on plant protein, in order to combat malnutrition problems.
Keywords: pigeon pea, processing, protein, malnutrition
Procedia PDF Downloads 250
14494 Influence of Selected Finishing Technologies on the Roughness Parameters of Stainless Steel Manufactured by Selective Laser Melting Method
Authors: J. Hajnys, M. Pagac, J. Petru, P. Stefek, J. Mesicek, J. Kratochvil
Abstract:
The new progressive method of metal 3D printing, SLM (Selective Laser Melting), is increasingly expanding into normal operation. As a result, greater demands are placed on the surface quality of the parts produced in this way. The article deals with research on selected finishing methods (tumbling, face milling, sandblasting, shot peening, and brushing) and their impact on the final surface roughness. Specimens of 20 x 20 x 7 mm, produced using SLM additive technology on a Renishaw AM400, were subjected to testing of these finishing methods with various parameter adjustments. The areal surface roughness parameters Sa and Sz were chosen as the evaluation criteria, and the profile parameters Ra and Rz were used as additional measurements. Optical measurement of surface roughness was performed on an Alicona Infinite Focus 5. An experiment conducted to optimize the surface roughness revealed, as expected, that the best roughness parameters were achieved through the face milling operation. Tumbling is particularly suitable for 3D-printed components, as tumbling media are able to reach even complex shapes and, after a change to polishing bodies, achieve a high surface gloss. Surface quality after tumbling depends on the process time. Other methods with satisfactory results are shot peening and tumbling, which should be the focus of further research.
Keywords: additive manufacturing, selective laser melting, SLM, surface roughness, stainless steel
Procedia PDF Downloads 131
14493 Generation of Knowledge with Self-Learning Methods for Ophthalmic Data
Authors: Klaus Peter Scherer, Daniel Knöll, Constantin Rieder
Abstract:
Problem and Purpose: Intelligent systems are available and helpful for supporting the human decision process, especially when complex surgical eye interventions are necessary and must be performed. Normally, such a decision support system consists of a knowledge-based module, which is responsible for the real assistance power, provided by explanation and logical reasoning processes. The interview-based acquisition and generation of the complex knowledge itself is very crucial, because there are different correlations between the complex parameters. So, in this project, (semi-)automated self-learning methods are researched and developed to enhance the quality of such a decision support system. Methods: For ophthalmic data sets of real patients in a hospital, advanced data mining procedures seem to be very helpful. In particular, subgroup analysis methods are developed, extended, and used to analyze and discover the correlations and conditional dependencies between the structured patient data. After finding causal dependencies, a ranking must be performed for the generation of rule-based representations. For this, anonymized patient data are transformed into a special machine-readable format. The imported data are used as input for conditional probability algorithms to calculate the parameter distributions with respect to a given goal parameter. Results: In the field of knowledge discovery, advanced methods and applications could be used to produce operation- and patient-related correlations. New knowledge was generated by finding causal relations between the operational equipment, the medical instances, and patient-specific history through a dependency ranking process. After transformation into association rules, logically based representations were available for the clinical experts to evaluate the new knowledge. The structured data sets take account of about 80 parameters as characteristic features per patient. For patient groups of different sizes (100, 300, 500), single as well as multiple target values were set for the subgroup analysis, so the newly generated hypotheses could be interpreted with regard to their dependence on the number of patients. Conclusions: The aim and the advantage of such a semi-automated self-learning process are the extension of the knowledge base by finding new parameter correlations. The discovered knowledge is transformed into association rules and serves as a rule-based representation of the knowledge in the knowledge base. Moreover, more than one goal parameter of interest can be considered by the semi-automated learning process. With ranking procedures, the strongest premises and conjunctively associated conditions can be found for concluding the goal parameter of interest. Thus, the knowledge hidden in structured tables or lists can be extracted as a rule-based representation. This is real assistance power for communication with the clinical experts.
Keywords: expert system, knowledge-based support, ophthalmic decision support, self-learning methods
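A minimal sketch of the conditional-probability ranking idea described above, assuming a hypothetical tabular patient data set with binary features and one goal parameter; real subgroup-discovery tooling adds systematic search and pruning on top of this:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Hypothetical patient table: binary clinical features and a goal parameter.
df = pd.DataFrame(rng.integers(0, 2, size=(300, 4)),
                  columns=["feature_a", "feature_b", "feature_c", "goal"])

baseline = df["goal"].mean()
rows = []
for col in ["feature_a", "feature_b", "feature_c"]:
    subgroup = df[df[col] == 1]
    p_cond = subgroup["goal"].mean()         # P(goal | feature present)
    rows.append({"premise": col,
                 "P(goal|premise)": round(p_cond, 3),
                 "lift": round(p_cond / baseline, 3),
                 "support": len(subgroup)})

# Rank candidate rules by lift over the baseline goal rate.
ranking = pd.DataFrame(rows).sort_values("lift", ascending=False)
print(ranking.to_string(index=False))
```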
Procedia PDF Downloads 253
14492 A Review of Self-Healing Concrete and Various Methods of Its Scientific Implementation
Authors: Davoud Beheshtizadeh, Davood Jafari
Abstract:
Concrete, with its special properties and advantages, is widely and increasingly used in the construction industry, especially in the infrastructure of the country. On the other hand, certain defects of concrete, most importantly the micro-cracks that appear in the concrete after setting, have driven up the cost of repairing and maintaining infrastructure; therefore, self-healing concretes have received attention in other countries in recent years. These concretes heal through general mechanisms, such as physical, chemical, biological, and combined mechanisms, each of which has different subsets and methods of execution and operation. Some of these mechanisms are of high importance, which has led to special production methods; as this subject is new in Iran, this knowledge is almost unknown, or at least part of it has not been considered at all. The present article comprehensively introduces the various self-healing mechanisms as a review and tries to present the disadvantages and advantages of each method along with its scope of application.
Keywords: micro-cracks, self-healing concrete, microcapsules, concrete, cement, self-sensitive
Procedia PDF Downloads 146
14491 AHP and TOPSIS Methods for Supplier Selection Problem in Medical Devices Company
Authors: Sevde D. Karayel, Ediz Atmaca
Abstract:
The subject of supplier selection is vital for the development of competitiveness and performance of firms, which require right, rapid, and low-cost procurement. Considering the fact that competition between firms is now between their supply chains, it is very clear that the performance of a firm depends not only on its own success but also on the success of all departments in the supply chain. For this purpose, firms want to work with suppliers that are cost-effective, flexible in terms of demand, and of a high quality level for customer satisfaction. However, given the diversification and redundancy of their expectations from suppliers, supplier selection needs to be solved as a hard problem. In this study, the supplier selection problem is discussed, in a firm that produces medical devices, for a critical part that is used in almost all of the firm's products and has lead-time troubles with its supplier. Analysis of the firm's current supplier selection policy indicates that selection is made based on the purchasing department's experience and other authorized persons' general judgments. Because selection is not based on analytical methods, it causes disruptions in production, lateness, and extra cost. To solve the problem, AHP and TOPSIS, multi-criteria decision-making techniques that are effective, easy to implement, and able to analyze many criteria simultaneously, are used to make a selection among alternative suppliers.
Keywords: AHP-TOPSIS methods, multi-criteria decision making, supplier selection problem, supply chain management
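A minimal TOPSIS sketch, assuming a hypothetical decision matrix of four suppliers scored on three benefit criteria, with weights that could come from an AHP pairwise comparison (all numbers are illustrative):

```python
import numpy as np

# Hypothetical decision matrix: rows = suppliers, cols = criteria
# (cost-effectiveness, flexibility, quality), all treated as benefit criteria.
X = np.array([[7.0, 9.0, 6.0],
              [8.0, 7.0, 8.0],
              [9.0, 6.0, 7.0],
              [6.0, 8.0, 9.0]])
w = np.array([0.5, 0.2, 0.3])            # criterion weights, e.g. from AHP

R = X / np.linalg.norm(X, axis=0)        # vector-normalize each criterion
V = R * w                                # weighted normalized matrix

ideal, anti = V.max(axis=0), V.min(axis=0)   # ideal / anti-ideal solutions
d_pos = np.linalg.norm(V - ideal, axis=1)    # distance to the ideal
d_neg = np.linalg.norm(V - anti, axis=1)     # distance to the anti-ideal

closeness = d_neg / (d_pos + d_neg)          # relative closeness coefficient
print("supplier ranking (best first):", np.argsort(-closeness) + 1)
```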
Procedia PDF Downloads 264
14490 A Philosophical Study of Men's Rights Discourses in Light of Feminism
Authors: Michael Barker
Abstract:
Men's rights activists are largely against feminism. Evaluation of men's rights discourses, however, shows that men's rights goals would be better achieved by working with feminism. Discussion of men's rights discourses, though, is prone to confusion because there is no commonly used men's rights language. In this presentation, 'male sexism', 'matriarchy', and 'masculism' will be unpacked as part of a suggested men's rights language. Once equipped with a men's rights vocabulary, a sustained philosophical assessment of the extent to which several categories of male disadvantage are wrongful will be offered. Following this, the conditions that cause each category of male sexism will be discussed. It shall be argued that male sexism is caused more by matriarchy than by patriarchy or by feminism. In closing, the success with which various methods address the categories of male sexism will be contrasted. Ultimately, it will be shown that male disadvantages are addressed more successfully by methods that work with, rather than against, feminism.
Keywords: gender studies, feminism, patriarchy, men's rights, male sexism, matriarchy, masculism
Procedia PDF Downloads 371
14489 Combining Shallow and Deep Unsupervised Machine Learning Techniques to Detect Bad Actors in Complex Datasets
Authors: Jun Ming Moey, Zhiyaun Chen, David Nicholson
Abstract:
Bad actors are often hard to detect in data that imprints their behaviour patterns, because they are comparatively rare events embedded in non-bad-actor data. An unsupervised machine learning framework is applied here to detect bad actors in financial crime datasets that record millions of transactions undertaken by hundreds of actors (<0.01% bad). Specifically, the framework combines 'shallow' (PCA, Isolation Forest) and 'deep' (autoencoder) methods to detect outlier patterns. Detection performance analysis for both the individual methods and their combination is reported.
Keywords: detection, machine learning, deep learning, unsupervised, outlier analysis, data science, fraud, financial crime
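A minimal sketch of the 'shallow' side of such a framework, combining PCA reconstruction error with Isolation Forest scores on synthetic stand-in transaction features; the autoencoder half would contribute a third, learned reconstruction-error score in the same way:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import IsolationForest
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, (5000, 10))          # typical transactions
bad = rng.normal(4, 1, (5, 10))                # rare bad-actor transactions
X = StandardScaler().fit_transform(np.vstack([normal, bad]))

# Shallow method 1: PCA reconstruction error as an outlier score.
pca = PCA(n_components=3).fit(X)
recon = pca.inverse_transform(pca.transform(X))
pca_score = np.sum((X - recon) ** 2, axis=1)

# Shallow method 2: Isolation Forest anomaly score.
iso = IsolationForest(random_state=0).fit(X)
iso_score = -iso.score_samples(X)              # higher = more anomalous

# Combine by rank-averaging the two scores.
combined = (pca_score.argsort().argsort() + iso_score.argsort().argsort()) / 2
print("top-5 suspects:", np.argsort(-combined)[:5])   # expect indices >= 5000
```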
Procedia PDF Downloads 95
14488 A Comparison of Methods for Estimating Dichotomous Treatment Effects: A Simulation Study
Authors: Jacqueline Y. Thompson, Sam Watson, Lee Middleton, Karla Hemming
Abstract:
Introduction: The odds ratio (estimated via logistic regression) is a well-established and common approach for estimating covariate-adjusted binary treatment effects when comparing a treatment and a control group with dichotomous outcomes. Its popularity is primarily due to its stability and robustness to model misspecification. However, the situation is different for the relative risk and the risk difference, which are arguably easier to interpret and better suited to specific designs such as non-inferiority studies. So far, there is no equivalent, widely acceptable approach for estimating an adjusted relative risk or risk difference when conducting clinical trials. This is partly due to the lack of a comprehensive evaluation of the available candidate methods. Methods/Approach: A simulation study is designed to evaluate the performance of the relevant candidate methods for estimating relative risks, representing conditional and marginal estimation approaches. We consider: the log-binomial generalised linear model (GLM) with iteratively weighted least squares (IWLS) and model-based standard errors (SEs); the log-binomial GLM with convex optimisation and model-based SEs; the log-binomial GLM with convex optimisation and permutation tests; the modified-Poisson GLM with IWLS and robust SEs; log-binomial generalised estimating equations (GEE) with robust SEs; marginal standardisation with delta-method SEs; and marginal standardisation with permutation-test SEs. Independent and identically distributed datasets are simulated from a randomised controlled trial to evaluate these candidate methods. Simulations are replicated 10,000 times for each scenario across all possible combinations of sample size (200, 1000, and 5000), outcome rate (10%, 50%, and 80%), and covariate effect (ranging from -0.05 to 0.7), representing weak, moderate, or strong relationships. Treatment effects (0, -0.5, and 1 on the log scale) cover null (H0) and alternative (H1) hypotheses, to evaluate coverage and power in realistic scenarios. Performance measures (bias, mean square error (MSE), relative efficiency, and convergence rates) are evaluated across scenarios covering a range of sample sizes, event rates, covariate prognostic strengths, and model misspecifications. Potential Results, Relevance & Impact: There are several methods for estimating unadjusted and adjusted relative risks. However, it is unclear which method is the most efficient, preserves the type-I error rate, is robust to model misspecification, or is the most powerful when adjusting for non-prognostic and prognostic covariates. GEE estimates may be biased when the outcome distributions are not from marginal binary data. Also, it seems that marginal standardisation and convex optimisation may perform better than the GLM IWLS log-binomial approach.
Keywords: binary outcomes, statistical methods, clinical trials, simulation study
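A minimal sketch of one of the candidates, the modified-Poisson estimator of the relative risk (a Poisson GLM with a log link fitted to binary outcomes, with robust sandwich standard errors), on simulated trial data; statsmodels is assumed available, and the data-generating values are illustrative:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
treat = rng.integers(0, 2, n)                 # randomised treatment arm
covar = rng.normal(size=n)                    # a prognostic covariate
# True model: log relative risk of 0.5 for treatment.
p = np.clip(np.exp(-1.5 + 0.5 * treat + 0.3 * covar), 0, 1)
y = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([treat, covar]))
# Modified Poisson: Poisson family + log link on binary outcomes,
# with a robust (sandwich) covariance to correct the misspecified variance.
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC1")
rr = np.exp(fit.params[1])                    # adjusted relative risk
ci = np.exp(fit.conf_int()[1])                # 95% CI on the RR scale
print(f"RR = {rr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```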
Procedia PDF Downloads 115