Search results for: virtual Machine
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3969

2619 Scalar Modulation Technique for Six-Phase Matrix Converter Fed Series-Connected Two-Motor Drives

Authors: A. Djahbar, M. Aillerie, E. Bounadja

Abstract:

In this paper, we treat a new structure of a high-power actuator used in either industry or electric traction. The actuator is constituted by two induction motors: a six-phase motor connected in series, via the stators, with a three-phase motor. The whole is supplied by a single static converter. Our contribution in this paper is the optimization of the system supply source, namely feeding the multimotor group by a direct frequency converter without using a DC-link capacitor. The modelling of the components of the multimotor system is presented first. Only the first component of the stator currents is used to produce the torque/flux of the first machine in the group. The second component of the stator currents provides additional degrees of freedom that can be used for power conversion for the other connected motors. The decoupling of each motor in the group is obtained using the direct vector control scheme. Simulation results demonstrate the effectiveness of the proposed structure.

Keywords: induction machine, motor drives, scalar modulation technique, three-to-six phase matrix converter

Procedia PDF Downloads 548
2618 Facilitating Written Biology Assessment in Large-Enrollment Courses Using Machine Learning

Authors: Luanna B. Prevost, Kelli Carter, Margaurete Romero, Kirsti Martinez

Abstract:

Writing is an essential scientific practice, yet in several countries the increasing size of university science classes limits the use of written assessments. Written assessments allow students to demonstrate their learning in their own words and permit faculty to evaluate students' understanding. However, the time and resources required to grade written assessments prohibit their use in large-enrollment science courses. This study examined the use of machine learning algorithms to automatically analyze student writing and provide timely feedback to faculty about students' writing in biology. Written responses to questions about matter and energy transformation were collected from large-enrollment undergraduate introductory biology classrooms. Responses were analyzed using the LightSide text mining and classification software. Cohen's Kappa was used to measure agreement between the LightSide models and human raters. Predictive models achieved agreement with human coding of 0.7 Cohen's Kappa or greater. The models captured that, when writing about matter-energy transformation at the ecosystem level, students focused primarily on the concepts of heat loss, recycling of matter, and conservation of matter and energy. Models were also produced to capture writing about processes such as decomposition and biochemical cycling. The models created in this study can be used to provide automatic feedback about students' understanding of these concepts to biology faculty who wish to use formative written assessments in large-enrollment biology classes but do not have the time or personnel for manual grading.
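
As an illustration of the kind of automated scoring described above, the following minimal sketch (not the LightSide models themselves) trains a text classifier on a tiny set of hypothetical student responses and reports Cohen's Kappa against the human codes; the data, labels, and model choice are placeholders.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import cohen_kappa_score
from sklearn.pipeline import make_pipeline

# Toy stand-in for human-coded constructed responses (1 = mentions conservation/recycling of matter)
responses = [
    "matter is recycled through decomposition in the ecosystem",
    "energy is lost as heat at each trophic level",
    "atoms are conserved and cycle between organisms and the environment",
    "the plant uses sunlight to grow bigger",
    "carbon moves from the air into plants and back through respiration",
    "heat escapes the food web and cannot be reused",
]
human_codes = [1, 0, 1, 0, 1, 0]

X_train, X_test, y_train, y_test = train_test_split(
    responses, human_codes, test_size=0.33, stratify=human_codes, random_state=0)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(X_train, y_train)

# Agreement between the model and the human raters, as in the study's Kappa >= 0.7 criterion
print("Cohen's kappa:", cohen_kappa_score(y_test, model.predict(X_test)))
```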

Keywords: machine learning, written assessment, biology education, text mining

Procedia PDF Downloads 281
2617 Milling Simulations with a 3-DOF Flexible Planar Robot

Authors: Hoai Nam Huynh, Edouard Rivière-Lorphèvre, Olivier Verlinden

Abstract:

Manufacturing technologies are becoming continuously more diversified over the years. The increasing use of robots for various applications such as assembly, painting, and welding has also affected the field of machining. Machining robots can deal with larger workspaces than conventional machine tools at a lower cost and thus represent a very promising alternative for machining applications. Furthermore, their inherent structure gives them great flexibility of motion to reach any location on the workpiece with the desired orientation. Nevertheless, machining robots suffer from a lack of stiffness at their joints, restricting their use to applications involving low cutting forces, especially finishing operations. Vibratory instabilities may also occur while machining and deteriorate the precision, leading to scrap parts. Some researchers are therefore concerned with the identification of optimal parameters in robotic machining. This paper continues the development of a virtual robotic machining simulator in order to find optimized cutting parameters in terms of, for example, depth of cut or feed per tooth. The simulation environment combines an in-house milling routine (DyStaMill), which computes the cutting forces and material removal, with an in-house multibody library (EasyDyn), which is used to build a dynamic model of a 3-DOF planar robot with flexible links. The position of the robot end-effector submitted to milling forces is controlled through an inverse kinematics scheme while the position of its joints is controlled separately. Each joint is actuated through a servomotor whose transfer function has been computed in order to tune the corresponding controller. The output results feature the evolution of the cutting forces when the robot structure is deformable or not, together with the tracking errors of the end-effector. Illustrations of the resulting machined surfaces are also presented. The consideration of link flexibility has highlighted an increase in the cutting force magnitude. This proof of concept aims to enrich the database of results in robotic machining for potential improvements in production.

Keywords: control, milling, multibody, robotic, simulation

Procedia PDF Downloads 249
2616 Comparison Between Genetic Algorithms and Particle Swarm Optimization Optimized Proportional Integral Derivative and PSS for Single Machine Infinite System

Authors: Benalia Nadia, Zerzouri Nora, Ben Si Ali Nadia

Abstract:

Among the many different modern heuristic optimization methods, genetic algorithms (GA) and the particle swarm optimization (PSO) technique have been attracting a lot of interest. The GA has gained popularity in academia and business mostly because of its simplicity, intuitiveness, and ability to solve highly nonlinear mixed-integer optimization problems that are typical of complex engineering systems. The mechanics of the PSO methodology, a relatively recent heuristic search tool, are modeled after the swarming or cooperative behavior of biological groups. It is suitable to compare the performance of the two techniques since they both aim to optimize a particular objective function but make use of distinct computing methods. In this article, PSO and GA optimization approaches are used for the parameter tuning of the power system stabilizer and the proportional integral derivative regulator. Load angle and rotor speed variations in the single machine infinite bus system are used to measure the performance of the suggested solution.
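
A minimal sketch of how PSO can tune PID gains against a simulated plant is given below; the second-order plant, PSO settings, and gain bounds are illustrative assumptions and not the authors' SMIB model.

```python
import numpy as np

# Hypothetical plant G(s) = 1 / (s^2 + 2s + 1), simulated with Euler steps under unit-step reference
def closed_loop_ise(gains, dt=0.01, t_end=5.0):
    kp, ki, kd = gains
    x1 = x2 = 0.0            # plant states
    integ = prev_err = 0.0   # PID states
    ise = 0.0
    for _ in range(int(t_end / dt)):
        err = 1.0 - x1
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        prev_err = err
        x1_dot, x2_dot = x2, u - 2.0 * x2 - x1   # x1'' + 2 x1' + x1 = u
        x1 += x1_dot * dt
        x2 += x2_dot * dt
        ise += err ** 2 * dt                     # integral of squared error
    return ise

# Minimal particle swarm over (Kp, Ki, Kd)
rng = np.random.default_rng(0)
n_particles, n_iter, dim = 20, 50, 3
lb, ub = np.zeros(dim), np.array([20.0, 10.0, 5.0])
pos = rng.uniform(lb, ub, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([closed_loop_ise(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lb, ub)
    vals = np.array([closed_loop_ise(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("Tuned PID gains (Kp, Ki, Kd):", gbest)
```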

Keywords: SMIB, genetic algorithm, PSO, transient stability, power system stabilizer, PID

Procedia PDF Downloads 83
2615 Capturing the Stress States in Video Conferences by Photoplethysmographic Pulse Detection

Authors: Jarek Krajewski, David Daxberger

Abstract:

We propose a stress detection method based on an RGB camera using heart rate detection, also known as Photoplethysmography Imaging (PPGI). This technique focuses on the measurement of the small changes in skin colour caused by blood perfusion. A stationary lab setting with simulated video conferences is chosen, using constant light conditions and a sampling rate of 30 fps. The ground truth measurement of heart rate is conducted with a common PPG system. The proposed pulse peak detection is based on a machine learning approach, applying brute-force feature extraction for the prediction of heart rate pulses. The statistical analysis showed good agreement (correlation r = .79, p < 0.05) between the reference heart rate system and the proposed method. Based on these findings, the proposed method could provide a reliable, low-cost, and contactless way of measuring HR parameters in daily-life environments.
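
The sketch below illustrates the general PPGI workflow of band-pass filtering a skin-colour trace, detecting pulse peaks, and correlating the camera-derived heart rate with a reference; the synthetic traces and filter settings are assumptions and do not reproduce the study's brute-force feature extraction.

```python
import numpy as np
from scipy.signal import find_peaks, butter, filtfilt
from scipy.stats import pearsonr

fs = 30.0  # 30 fps camera, as in the study

def estimate_hr(trace, fs=fs):
    """Estimate heart rate (bpm) from a mean skin-colour trace via band-pass filtering + peak detection."""
    b, a = butter(3, [0.7 / (fs / 2), 3.0 / (fs / 2)], btype="band")   # roughly 42-180 bpm band
    filtered = filtfilt(b, a, trace - trace.mean())
    peaks, _ = find_peaks(filtered, distance=int(fs * 0.4))            # at least 0.4 s between beats
    return 60.0 / np.mean(np.diff(peaks) / fs)

# Synthetic stand-in for per-window camera traces and reference PPG heart rates
rng = np.random.default_rng(1)
true_hr = rng.uniform(55, 95, size=20)                                 # reference device (bpm)
t = np.arange(0, 10, 1 / fs)
camera_hr = np.array([
    estimate_hr(np.sin(2 * np.pi * hr / 60 * t) + 0.3 * rng.standard_normal(t.size))
    for hr in true_hr
])

r, p = pearsonr(camera_hr, true_hr)
print(f"agreement with reference: r = {r:.2f}, p = {p:.3g}")
```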

Keywords: heart rate, PPGI, machine learning, brute force feature extraction

Procedia PDF Downloads 123
2614 Prediction of All-Beta Protein Secondary Structure Using Garnier-Osguthorpe-Robson Method

Authors: K. Tejasri, K. Suvarna Vani, S. Prathyusha, S. Ramya

Abstract:

Proteins are chains of amino acids joined by peptide bonds. Many different conformations of the chains are possible due to the multiple combinations of amino acids and rotation at numerous positions along the chain. Protein structure prediction is one of the crucial goals pursued by researchers in bioinformatics and theoretical chemistry. Among the four structural levels of proteins, we emphasize mainly the secondary structure, which generally comprises alpha-helices and beta-sheets. Multi-class classification of imbalanced data is a real challenge and has to be addressed for the beta-strands: in an imbalanced data distribution, a couple of the classes have very limited training samples compared with the other classes. The secondary structure data are extracted from the protein primary sequence, and the beta-strands are predicted using suitable machine learning algorithms.
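
As a rough sketch of window-based beta-strand prediction with class balancing (not the GOR statistical method itself), the toy example below one-hot encodes a sliding sequence window and trains a class-weighted classifier; the sequence, labels, and window size are purely illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def window_features(sequence, window=7):
    """One-hot encode a sliding window around each residue (hypothetical feature scheme)."""
    half = window // 2
    padded = "X" * half + sequence + "X" * half
    feats = []
    for i in range(len(sequence)):
        vec = np.zeros(window * len(AMINO_ACIDS))
        for j, aa in enumerate(padded[i:i + window]):
            if aa in AA_INDEX:
                vec[j * len(AMINO_ACIDS) + AA_INDEX[aa]] = 1.0
        feats.append(vec)
    return np.array(feats)

# Toy sequence and labels (E = beta-strand, C = coil); real labels would come from DSSP/PDB annotations
seq    = "MKVLAVLVLLFTGSEAEIVKT"
labels = np.array(list("CCEEEEECCCCCEEEECCCCC"))

X = window_features(seq)
# class_weight="balanced" counteracts the scarcity of beta-strand residues
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=0)
clf.fit(X, labels)
print(clf.predict(window_features("MKVLAVL")))
```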

Keywords: proteins, secondary structure elements, beta-sheets, beta-strands, alpha-helices, machine learning algorithms

Procedia PDF Downloads 94
2613 Predicting Wealth Status of Households Using Ensemble Machine Learning Algorithms

Authors: Habtamu Ayenew Asegie

Abstract:

Wealth, as opposed to income or consumption, implies a more stable and permanent status. Due to natural and human-made difficulties, households' economies may be diminished and their well-being may fall into trouble. Hence, governments and humanitarian agencies devote considerable resources to poverty and malnutrition reduction efforts. One key factor in the effectiveness of such efforts is the accuracy with which low-income or poor populations can be identified. As a result, this study aims to predict a household's wealth status using ensemble machine learning (ML) algorithms. In this study, design science research methodology (DSRM) is employed, and four ML algorithms, Random Forest (RF), Adaptive Boosting (AdaBoost), Light Gradient Boosted Machine (LightGBM), and Extreme Gradient Boosting (XGBoost), have been used to train models. The Ethiopian Demographic and Health Survey (EDHS) dataset is accessed for this purpose from the Central Statistical Agency (CSA)'s database. Various data pre-processing techniques were employed, and the model training was conducted using scikit-learn Python library functions. Model evaluation is executed using various metrics like accuracy, precision, recall, F1-score, the area under the receiver operating characteristic curve (AUC-ROC), and subjective evaluations by domain experts. An optimal subset of hyper-parameters for the algorithms was selected through the grid search function for the best prediction. The RF model performed better than the rest of the algorithms, achieving an accuracy of 96.06%, and is better suited as a solution model for our purpose. Following RF, the LightGBM, XGBoost, and AdaBoost algorithms have accuracies of 91.53%, 88.44%, and 58.55%, respectively. The findings suggest that features like 'Age of household head', 'Total children ever born' in a family, 'Main roof material' of their house, the 'Region' they live in, whether a household uses 'Electricity' or not, and the 'Type of toilet facility' of a household are determinant factors and should be a focal point for economic policymakers. The determinant risk factors, extracted rules, and designed artifact achieved 82.28% in the domain experts' evaluation. Overall, the study shows that ML techniques are effective in predicting the wealth status of households.
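
A minimal sketch of the grid-search workflow described above, using a Random Forest on a hypothetical EDHS-style table, is shown below; the file name, column names, and parameter grid are assumptions.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import classification_report

# Hypothetical file/column names; the EDHS data itself is not redistributed here.
df = pd.read_csv("edhs_households.csv")
X = pd.get_dummies(df.drop(columns=["wealth_status"]))   # dummy-code categorical survey answers
y = df["wealth_status"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Grid search over a small hyper-parameter subset, scored by accuracy as in the study
param_grid = {"n_estimators": [200, 500], "max_depth": [None, 10, 20]}
search = GridSearchCV(RandomForestClassifier(random_state=42),
                      param_grid, cv=5, scoring="accuracy", n_jobs=-1)
search.fit(X_train, y_train)

print("best parameters:", search.best_params_)
print(classification_report(y_test, search.best_estimator_.predict(X_test)))
```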

Keywords: ensemble machine learning, households wealth status, predictive model, wealth status prediction

Procedia PDF Downloads 38
2612 Hybrid Feature Selection Method for Sentiment Classification of Movie Reviews

Authors: Vishnu Goyal, Basant Agarwal

Abstract:

Sentiment analysis research provides methods for identifying people's opinions written in blogs, reviews, social networking websites, etc. Sentiment analysis seeks to understand what opinion people have about any given entity, object, or thing. Sentiment analysis research can be broadly categorised into three types of approaches, i.e., semantic orientation, machine learning, and lexicon-based approaches. Feature selection methods improve the performance of machine learning algorithms by eliminating irrelevant features. The information gain feature selection method has been considered among the best methods for sentiment analysis; however, it has the drawback of requiring the selection of a threshold. Therefore, in this paper, we propose a hybrid feature selection method comprising information gain and a newly proposed feature selection method. Initially, features are selected using Information Gain (IG), and then the noisier features are eliminated using the proposed feature selection method. Experimental results show the efficiency of the proposed feature selection methods.
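
The first stage of such a hybrid scheme, information-gain-style feature selection on bag-of-words features, can be sketched as follows on a toy review corpus; the data are placeholders, and the second, noise-removing selection stage from the paper is not reproduced.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import Pipeline

# Tiny toy corpus standing in for movie reviews (1 = positive, 0 = negative)
reviews = ["great acting and a moving plot", "terrible pacing and a dull script",
           "wonderful cinematography, loved it", "boring, predictable and badly acted",
           "a brilliant, touching film", "awful dialogue and a weak story"]
labels = [1, 0, 1, 0, 1, 0]

# Stage 1: information-gain-style selection via mutual information on bag-of-words counts
model = Pipeline([
    ("bow", CountVectorizer()),
    ("ig", SelectKBest(mutual_info_classif, k=10)),
    ("clf", MultinomialNB()),
])
model.fit(reviews, labels)
print(model.predict(["a dull and predictable story", "loved the brilliant acting"]))
```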

Keywords: feature selection, sentiment analysis, hybrid feature selection

Procedia PDF Downloads 338
2611 Second Language Development with an Intercultural Approach: A Pilot Program Applied to Higher Education Students from an Escuela Normal in Atequiza, Mexico

Authors: Frida C. Jaime Franco, C. Paulina Navarro Núñez, R. Jacob Sánchez Nájera

Abstract:

The importance of developing multi-language abilities in our global society is noteworthy. However, the necessity, interest, and awareness of the significance of developing another language, apart from the mother tongue, are not the same in all contexts as they are in multicultural communities, especially in rural higher education institutions immersed in small communities. Creating opportunities for digital interaction between learners from Mexico and partners abroad provides scaffolding towards not only language skill development but also intercultural communicative competence (ICC). This study leads us to consider the best approach for applying a program of ICC integrated into the practice of EFL. By analyzing the roots of the language, it is possible to recover the main objective of learning another language, namely to communicate with a functional purpose, as well as to attach social practices to the learning process, giving functionality and significance to the target language. Hence, the collateral impact of collaborative learning aims to contribute to a better global understanding as well as to self- and other-cultural awareness through intercultural communication. While communicating in the target language through online collaboration among students on long-distance communication platforms, language is used as a tool of interaction to broaden students' perspectives, reaching substantial improvement with the help of their differences. This process should consider the application of the target language in the inquiry of sociocultural information, expecting the learners to integrate communicative skills to handle cultural differentiation at the same time as they apply the knowledge of their target language in a real scenario of communication, despite being through virtual resources.

Keywords: collaborative learning, communicative approach, culture, interaction, interculturalism, target language, virtual partnership

Procedia PDF Downloads 130
2610 Detection and Classification of Rubber Tree Leaf Diseases Using Machine Learning

Authors: Kavyadevi N., Kaviya G., Gowsalya P., Janani M., Mohanraj S.

Abstract:

Hevea brasiliensis, also known as the rubber tree, is one of the foremost crop assets in the world. One of the most significant advantages of the rubber plant in terms of air oxygenation is its capacity to reduce the likelihood of an individual developing respiratory allergies like asthma. To construct a system that can properly identify crop diseases and pests and then build a database of insecticides for each pest and disease, we must first provide treatment for the illness that has been detected. We primarily examine three major, economically damaging leaf diseases in this article: bird's eye spot, algal spot, and powdery mildew. The proposed work focuses on disease identification on rubber tree leaves, accomplished by employing one of the superior algorithms. The processing technique follows the stages of input, preprocessing, image segmentation, feature extraction, and classification, and aims to replace the time-consuming manual procedures currently used to detect the sickness. As a consequence, the main ailments, underlying causes, and signs and symptoms of the diseases that harm the rubber tree are covered in this study.
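
As a sketch of the classification stage only (not the full segmentation pipeline the abstract lists), a small convolutional network could be trained on labelled leaf images as below; the directory layout, image size, and architecture are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE, CLASSES = (128, 128), 3   # bird's eye spot, algal spot, powdery mildew

# Hypothetical directory layout: rubber_leaves/<class_name>/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "rubber_leaves", validation_split=0.2, subset="training",
    seed=42, image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "rubber_leaves", validation_split=0.2, subset="validation",
    seed=42, image_size=IMG_SIZE, batch_size=32)

model = models.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
```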

Keywords: image processing, python, convolution neural network (CNN), machine learning

Procedia PDF Downloads 76
2609 The Affordances and Challenges of Online Learning and Teaching for Secondary School Students

Authors: Hahido Samaras

Abstract:

In many cases, especially with the pandemic playing a major role in fast-tracking the growth of the digital industry, online learning has become a necessity or even a standard educational model nowadays, reliably overcoming barriers such as location, time and cost and frequently combined with a face-to-face format (e.g., in blended learning). This being the case, it is evident that students in many parts of the world, as well as their parents, will increasingly need to become aware of the pros and cons of online versus traditional courses. This fast-growing mode of learning, accelerated during the years of the pandemic, presents an abundance of exciting options especially matched for a large number of secondary school students in remote places of the world where access to stimulating educational settings and opportunities for a variety of learning alternatives are scarce, adding advantages such as flexibility, affordability, engagement, flow and personalization of the learning experience. However, online learning can also present several challenges, such as a lack of student motivation and social interactions in natural settings, digital literacy, and technical issues, to name a few. Therefore, educational researchers will need to conduct further studies focusing on the benefits and weaknesses of online learning vs. traditional learning, while instructional designers propose ways of enhancing student motivation and engagement in virtual environments. Similarly, teachers will be required to become more and more technology-capable, at the same time developing their knowledge about their students’ particular characteristics and needs so as to match them with the affordances the technology offers. And, of course, schools, education programs, and policymakers will have to invest in powerful tools and advanced courses for online instruction. By developing digital courses that incorporate intentional opportunities for community-building and interaction in the learning environment, as well as taking care to include built-in design principles and strategies that align learning outcomes with learning assignments, activities, and assessment practices, rewarding academic experiences can derive for all students. This paper raises various issues regarding the effectiveness of online learning on students by reviewing a large number of research studies related to the usefulness and impact of online learning following the COVID-19-induced digital education shift. It also discusses what students, teachers, decision-makers, and parents have reported about this mode of learning to date. Best practices are proposed for parties involved in the development of online learning materials, particularly for secondary school students, as there is a need for educators and developers to be increasingly concerned about the impact of virtual learning environments on student learning and wellbeing.

Keywords: blended learning, online learning, secondary schools, virtual environments

Procedia PDF Downloads 100
2608 Investigation of Boll Properties on Cotton Picker Machine Performance

Authors: Shahram Nowrouzieh, Abbas Rezaei Asl, Mohamad Ali Jafari

Abstract:

Cotton, as a strategic crop, plays an important role in meeting human food and clothing needs because of its oil, protein, and fiber. Iran was one of the largest cotton producers in the world in the past, but unfortunately, for economic reasons, its production is now reduced. One of the ways to reduce the cost of cotton production is to expand the mechanization of cotton harvesting. However, Iranian farmers have not embraced cotton harvesters, one reason being the level of field losses caused by these machines. As a result, the majority of cotton fields are harvested by hand. Although the correct setting of the harvesting machine is very important for limiting cotton losses, the morphological properties of the cotton plant also affect the performance of cotton harvesters. In this study, the effect of some cotton morphological properties, such as plant height, the number and length of sympodial and monopodial branches, boll dimensions, boll weight, the number of carpels, and the bract angle, was evaluated on the performance of a cotton picker. The efficiency of the John Deere 9920 spindle cotton picker was investigated on five different Iranian cotton cultivars. The results indicate a significant difference between the five cultivars in terms of machine harvest efficiency. The Golestan cultivar showed the best cotton harvester performance, with an average of 87.6% of the total harvestable seed cotton, while the Khorshid cultivar had the lowest performance. The principal component analysis, explaining 50.76% of the variability, showed that cotton picker efficiency is affected positively by the bract angle and negatively by boll dimensions, the number of carpels, and plant height. The seed cotton remaining (on the plant and on the ground) after harvesting fell in the same zone of the PCA scatter plot as boll dimensions and the number of carpels.

Keywords: cotton, bract, harvester, carpel

Procedia PDF Downloads 135
2607 Comparison of Deep Learning and Machine Learning Algorithms to Diagnose and Predict Breast Cancer

Authors: F. Ghazalnaz Sharifonnasabi, Iman Makhdoom

Abstract:

Breast cancer is a serious health concern that affects many people around the world. According to a study published in the Breast journal, the global burden of breast cancer is expected to increase significantly over the next few decades. The number of deaths from breast cancer has been increasing over the years, although the age-standardized mortality rate has decreased in some countries. It is important to be aware of the risk factors for breast cancer and to get regular check-ups to catch it early if it does occur. Machine learning techniques have been used to aid in the early detection and diagnosis of breast cancer. These techniques, which have been shown to be effective in predicting and diagnosing the disease, have become a research hotspot. In this study, we consider two deep learning approaches, Multi-Layer Perceptron (MLP) and Convolutional Neural Network (CNN). We also consider five machine learning algorithms, Decision Tree (C4.5), Naïve Bayesian (NB), Support Vector Machine (SVM), K-Nearest Neighbors (KNN), and XGBoost (eXtreme Gradient Boosting), on the Breast Cancer Wisconsin Diagnostic dataset. We have carried out the process of evaluating and comparing the classifiers, selecting appropriate metrics to evaluate classifier performance and an appropriate tool to quantify this performance. The main purpose of the study is to predict and diagnose breast cancer by applying the mentioned algorithms and to discover the most effective one with respect to the confusion matrix, accuracy, and precision. CNN outperformed all other classifiers and achieved the highest accuracy (0.982456). The work is implemented in the Anaconda environment based on the Python programming language.
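
A minimal sketch of such a comparison on the Breast Cancer Wisconsin Diagnostic dataset (as shipped with scikit-learn) is shown below for a subset of the listed classifiers; default parameters are used, and the CNN and XGBoost models are omitted for brevity.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

X, y = load_breast_cancer(return_X_y=True)   # Wisconsin Diagnostic dataset

classifiers = {
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier()),
    "MLP": make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000, random_state=0)),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
    print(f"{name:>13}: {scores.mean():.3f} +/- {scores.std():.3f}")
```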

Keywords: breast cancer, multi-layer perceptron, Naïve Bayesian, SVM, decision tree, convolutional neural network, XGBoost, KNN

Procedia PDF Downloads 75
2606 Collaborative Data Refinement for Enhanced Ionic Conductivity Prediction in Garnet-Type Materials

Authors: Zakaria Kharbouch, Mustapha Bouchaara, F. Elkouihen, A. Habbal, A. Ratnani, A. Faik

Abstract:

Solid-state lithium-ion batteries have garnered increasing interest in modern energy research due to their potential for safer, more efficient, and sustainable energy storage systems. Among the critical components of these batteries, the electrolyte plays a pivotal role, with LLZO garnet-based electrolytes showing significant promise. Garnet materials offer intrinsic advantages such as high Li-ion conductivity, wide electrochemical stability, and excellent compatibility with lithium metal anodes. However, optimizing ionic conductivity in garnet structures poses a complex challenge, primarily due to the multitude of potential dopants that can be incorporated into the LLZO crystal lattice. The complexity of material design, influenced by numerous dopant options, requires a systematic method to find the most effective combinations. This study highlights the utility of machine learning (ML) techniques in the materials discovery process to navigate the complex range of factors in garnet-based electrolytes. Collaborators from the materials science and ML fields worked with a comprehensive dataset previously employed in a similar study and collected from various literature sources. This dataset served as the foundation for an extensive data refinement phase, where meticulous error identification, correction, outlier removal, and garnet-specific feature engineering were conducted. This rigorous process substantially improved the dataset's quality, ensuring it accurately captured the underlying physical and chemical principles governing garnet ionic conductivity. The data refinement effort resulted in a significant improvement in the predictive performance of the machine learning model. Originally starting at an accuracy of 0.32, the model underwent substantial refinement, ultimately achieving an accuracy of 0.88. This enhancement highlights the effectiveness of the interdisciplinary approach and underscores the substantial potential of machine learning techniques in materials science research.

Keywords: lithium batteries, all-solid-state batteries, machine learning, solid state electrolytes

Procedia PDF Downloads 61
2605 A Less Complexity Deep Learning Method for Drones Detection

Authors: Mohamad Kassab, Amal El Fallah Seghrouchni, Frederic Barbaresco, Raed Abu Zitar

Abstract:

Detecting objects such as drones is a challenging task, as their relative size and maneuvering capabilities deceive machine learning models and cause them to misclassify drones as birds or other objects. In this work, we investigate applying several deep learning techniques to benchmark real data sets of flying drones. A deep learning paradigm is proposed for the purpose of mitigating the complexity of those systems. The proposed paradigm is a hybrid of the AdderNet deep learning paradigm and the Single Shot Detector (SSD) paradigm. The goal was to minimize the number of multiplication operations in the filtering layers within the proposed system and, hence, reduce complexity. Some standard machine learning techniques, such as SVM, are also tested and compared to the other deep learning systems. The data sets used for training and testing were either complete or filtered in order to remove the images with small objects. The types of data were RGB or IR data. Comparisons were made between all these types, and conclusions were presented.

Keywords: drones detection, deep learning, birds versus drones, precision of detection, AdderNet

Procedia PDF Downloads 182
2604 Machine Translation Analysis of Chinese Dish Names

Authors: Xinyu Zhang, Olga Torres-Hostench

Abstract:

This article presents a comparative study evaluating the quality of machine translation (MT) output for Chinese gastronomy nomenclature. Chinese gastronomic culture is experiencing increased international acknowledgment nowadays. The nomenclature of Chinese gastronomy not only reflects a specific aspect of culture but is also related to other areas of society such as philosophy, traditional medicine, etc. Chinese dish names are composed of several types of cultural references, such as ingredients, colors, flavors, culinary techniques, cooking utensils, toponyms, anthroponyms, metaphors, and historical tales, among others. These cultural references constitute one of the biggest difficulties in translation, in which the use of translation techniques is usually required. Given the lack of Chinese food-related translation studies, especially in Chinese-Spanish translation, and the current massive use of MT, the quality of the MT output of Chinese dish names is questioned. Fifty Chinese dish names with different types of cultural components were selected for this study. First, all of these dish names were translated by three different MT tools (Google Translate, Baidu Translate and Bing Translator). Second, a questionnaire was designed and completed by 12 Chinese online users (Chinese graduates of a Hispanic Philology major) in order to find out user preferences regarding the collected MT output. Finally, human translation techniques were observed and analyzed to identify which translation techniques appear more often in the preferred MT proposals. The results reveal that the MT output of Chinese gastronomy nomenclature is not of high quality. It would be advisable not to trust MT in contexts such as restaurant menus, TV culinary shows, etc. However, the MT output could be used as an aid for tourists to get a general idea of a dish (the main ingredients, for example). Literal translation turned out to be the most observed technique, followed by borrowing, generalization and adaptation, while amplification, particularization and transposition were infrequently observed, possibly because current MT engines are limited to relating equivalent terms and offering literal translations without taking into account the whole contextual meaning of the dish name, which is essential to the application of those less observed techniques. This could give insight into the post-editing of Chinese dish name translation. By observing and analyzing the translation techniques in the machine translators' proposals, post-editors could better decide which techniques to apply in each case so as to correct mistakes and improve the quality of the translation.

Keywords: Chinese dish names, cultural references, machine translation, translation techniques

Procedia PDF Downloads 137
2603 Educational Turn towards Digitalization by Changing Leadership, Networks and Qualification Concepts

Authors: Patricia Girrbach

Abstract:

Currently, our society is facing a new and sweeping technological revolution named digitalization. In order to face the related challenges, organizations have to be prepared. They need appropriate circumstances in order to cope with current issues concerning digital transformation processes. Nowadays, digitalization has emerged as a top issue for companies and business leaders. In this context, companies are under pressure to have a positive, productive digital culture, and indeed, organizations realize that they need to address this important issue: 87 percent of organizations cite culture and engagement as one of their top challenges in any change process, but especially in terms of the digital turn. Executives can give their company a competitive advantage and attract top talent by having a strong workplace culture that supports digitalization. Many current studies attest to that fact. Digital-oriented companies can hire more easily, have the lowest voluntary turnover rates, deliver better customer service, and are more profitable over the long run. Based on this background, it is important to provide companies with starting points and practical measures for reaching this goal. The major findings are that firms need to make sense of digitalization. In this context, they should focus on internal but also on external stakeholders. Furthermore, they should create certain working conditions, and they should support the qualification of employees, e.g., by Virtual Reality. These measures can create positive experiences with digitalization in order to ensure the support of staff for the digital turn. Based on several current studies and literature research, this paper provides concrete measures for companies in order to enable the digital turn. The aim of this paper is therefore to provide possible practical starting points which support both the education of employees through digitalization and the digital turn itself within the organization.

Keywords: digitalization, industry 4.0, education 4.0, virtual reality

Procedia PDF Downloads 159
2602 Discrimination and Classification of Vestibular Neuritis Using Combined Fisher and Support Vector Machine Model

Authors: Amine Ben Slama, Aymen Mouelhi, Sondes Manoubi, Chiraz Mbarek, Hedi Trabelsi, Mounir Sayadi, Farhat Fnaiech

Abstract:

Vertigo is a sensation of feeling off balance; the cause of this symptom is very difficult to interpret and requires a complementary examination. Generally, vertigo is caused by an ear problem. Some of the most common causes include benign paroxysmal positional vertigo (BPPV), Meniere's disease, and vestibular neuritis (VN). In clinical practice, different tests of the videonystagmographic (VNG) technique are used to detect the presence of vestibular neuritis. The topographical diagnosis of this disease presents a large diversity of characteristics, which poses a mixture of problems for the usual etiological analysis methods. In this study, a vestibular neuritis analysis method is proposed for videonystagmography (VNG) applications, using an estimation of pupil movements in the case of uncontrolled motion to obtain efficient and reliable diagnosis results. First, an estimation of the pupil displacement vectors using the Hough Transform (HT) is performed to approximate the location of the pupil region. Then, temporal and frequency features are computed from the rotation angle variation of the pupil motion. Finally, optimized features are selected using Fisher criterion evaluation for discrimination and classification of the VN disease. Experimental results are analyzed in two categories: normal and pathologic. By classifying the reduced features using a Support Vector Machine (SVM), a classification accuracy of 94% is achieved. Compared to recent studies, the proposed expert system is extremely helpful and highly effective in resolving the problem of VNG analysis and providing an accurate diagnosis for medical devices.
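
The Fisher-criterion feature ranking followed by SVM classification can be sketched as follows on synthetic features standing in for the temporal/frequency descriptors of pupil motion; the data and the number of retained features are assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_classification

def fisher_score(X, y):
    """Fisher criterion per feature: between-class variance over within-class variance."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        den += len(Xc) * Xc.var(axis=0)
    return num / (den + 1e-12)

# Synthetic stand-in for two classes (normal vs. pathologic) with 30 candidate features
X, y = make_classification(n_samples=120, n_features=30, n_informative=8, random_state=0)

scores = fisher_score(X, y)
top = np.argsort(scores)[::-1][:10]            # keep the 10 most discriminative features
acc = cross_val_score(SVC(kernel="rbf"), X[:, top], y, cv=5).mean()
print(f"cross-validated accuracy with Fisher-selected features: {acc:.2f}")
```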

Keywords: nystagmus, vestibular neuritis, videonystagmographic system, VNG, Fisher criterion, support vector machine, SVM

Procedia PDF Downloads 136
2601 Machine Learning Techniques in Bank Credit Analysis

Authors: Fernanda M. Assef, Maria Teresinha A. Steiner

Abstract:

The aim of this paper is to compare and discuss classifier algorithm options for credit risk assessment by applying different machine learning techniques. Using records from a Brazilian financial institution, this study uses a database of 5,432 companies that are clients of the bank, of which 2,600 are classified as non-defaulters, 1,551 as defaulters, and 1,281 as temporarily defaulters, meaning that these clients are overdue on their payments for up to 180 days. For each case, a total of 15 attributes was considered for a one-against-all assessment using four different techniques: Artificial Neural Networks Multilayer Perceptron (ANN-MLP), Artificial Neural Networks Radial Basis Functions (ANN-RBF), Logistic Regression (LR), and Support Vector Machines (SVM). For each method, different parameters were analyzed in order to obtain different results, from which the best of each technique was compared. Initially, the data were coded using thermometer coding (for numerical attributes) or dummy coding (for nominal attributes). The methods were then evaluated for each parameter setting, and the best result of each technique was compared in terms of accuracy, false positives, false negatives, true positives, and true negatives. This comparison showed that the best method in terms of accuracy was ANN-RBF (79.20% for non-defaulter classification, 97.74% for defaulters, and 75.37% for the temporarily defaulter classification). However, the best accuracy does not always represent the best technique. For instance, in the classification of temporarily defaulters, this technique was surpassed, in terms of false positives, by SVM, which had the lowest rate (0.07%) of false positive classifications. All these details are discussed in light of the results found, and an overview of what was presented is given in the conclusion of this study.
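
A minimal sketch of a one-against-all assessment with dummy coding of nominal attributes is given below; the file name, column names, and SVM parameters are hypothetical, and the bank's confidential data is not reproduced.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC
from sklearn.pipeline import Pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

# Hypothetical columns; target 'status' in {non_defaulter, defaulter, temp_defaulter}
df = pd.read_csv("clients.csv")
numeric = ["revenue", "debt_ratio", "years_as_client"]
nominal = ["sector", "region"]

pre = ColumnTransformer([("num", StandardScaler(), numeric),
                         ("nom", OneHotEncoder(handle_unknown="ignore"), nominal)])
model = Pipeline([("pre", pre),
                  ("clf", OneVsRestClassifier(SVC(kernel="rbf", C=1.0)))])  # one-against-all scheme

X_train, X_test, y_train, y_test = train_test_split(
    df[numeric + nominal], df["status"], stratify=df["status"], random_state=0)
model.fit(X_train, y_train)
print(confusion_matrix(y_test, model.predict(X_test)))
```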

Keywords: artificial neural networks (ANNs), classifier algorithms, credit risk assessment, logistic regression, machine learning, support vector machines

Procedia PDF Downloads 103
2600 Feasibility Study of Measurement of Turning Based-Surfaces Using Perthometer, Optical Profiler and Confocal Sensor

Authors: Khavieya Anandhan, Soundarapandian Santhanakrishnan, Vijayaraghavan Laxmanan

Abstract:

In general, the measurement of surfaces is carried out using traditional methods such as contact-type stylus instruments. This prevalent approach is challenged by non-contact instruments such as optical profilers, coordinate measuring machines, laser triangulation sensors, machine vision systems, etc. Recently, the confocal sensor has begun to be used in the surface metrology field. Such a sensor is explored in this study to determine the surface roughness value of various turned surfaces. Turning is a crucial machining process used to manufacture products such as grooves, tapered domes, threads, tapers, etc. Turned surfaces with roughness values in the range of 0.4-12.5 µm were taken for analysis. Three instruments were used, namely a perthometer, an optical profiler, and a confocal sensor. Among these, the confocal sensor is the least explored, despite its good resolution of about 5 nm. Thus, this high-precision sensor was used in this study to explore the possibility of measuring turned surfaces. Further, using these data, the measurement uncertainty was also studied.

Keywords: confocal sensor, optical profiler, surface roughness, turned surfaces

Procedia PDF Downloads 134
2599 Use of Machine Learning Algorithms to Pediatric MR Images for Tumor Classification

Authors: I. Stathopoulos, V. Syrgiamiotis, E. Karavasilis, A. Ploussi, I. Nikas, C. Hatzigiorgi, K. Platoni, E. P. Efstathopoulos

Abstract:

Introduction: Brain and central nervous system (CNS) tumors form the second most common group of cancers in children, accounting for 30% of all childhood cancers. MRI is the key imaging technique used for the visualization and management of pediatric brain tumors. Initial characterization of tumors from MRI scans is usually performed via a radiologist's visual assessment. However, different brain tumor types do not always demonstrate clear differences in visual appearance. Using only conventional MRI to provide a definite diagnosis could potentially lead to inaccurate results, and so histopathological examination of biopsy samples is currently considered to be the gold standard for obtaining definite diagnoses. Machine learning is the study of computational algorithms that can use mathematical relationships and patterns, complex or not, from empirical and scientific data to make reliable decisions. Accordingly, machine learning techniques could provide effective and accurate ways to automate and speed up the analysis and diagnosis of medical images. Machine learning applications in radiology are, or could potentially be, useful in practice for medical image segmentation and registration, computer-aided detection and diagnosis systems for CT, MR or radiography images, and functional MR (fMRI) images for brain activity analysis and neurological disease diagnosis. Purpose: The objective of this study is to provide an automated tool, which may assist in the imaging evaluation and classification of brain neoplasms in pediatric patients by determining the glioma type and grade and differentiating between different brain tissue types. A further purpose is to present an alternative way of quick and accurate diagnosis in order to save time and resources in the daily medical workflow. Materials and Methods: A cohort of 80 pediatric patients with a diagnosis of posterior fossa tumor was used: 20 ependymomas, 20 astrocytomas, 20 medulloblastomas and 20 healthy children. The MR sequences used for every patient were the following: axial T1-weighted (T1), axial T2-weighted (T2), Fluid-Attenuated Inversion Recovery (FLAIR), axial diffusion-weighted images (DWI), and axial contrast-enhanced T1-weighted (T1ce). From every sequence, only a principal slice was used, manually traced by two expert radiologists. Image acquisition was carried out on a GE HDxt 1.5-T scanner. The images were preprocessed in a number of steps, including noise reduction, bias-field correction, thresholding, coregistration of all sequences (T1, T2, T1ce, FLAIR, DWI), skull stripping, and histogram matching. A large number of candidate features was chosen, including age, tumor shape characteristics, image intensity characteristics, and texture features. After selecting the features that achieve the highest accuracy using the least number of variables, four machine learning classification algorithms were used: k-Nearest Neighbour, Support Vector Machines, C4.5 Decision Tree, and Convolutional Neural Network. The machine learning schemes and the image analysis are implemented in the WEKA platform and the MATLAB platform, respectively. Results-Conclusions: The results and the accuracy of image classification for each type of glioma by the four different algorithms are still in progress.
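
Purely as an illustration of feature-based slice classification (not the WEKA/MATLAB pipeline used in the study), the sketch below extracts GLCM texture descriptors from synthetic slices and cross-validates an SVM; the data, feature choices, and class structure are placeholders, and the GLCM functions assume scikit-image 0.19 or newer.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def texture_features(slice_2d):
    """GLCM texture descriptors from a single (already preprocessed) MR slice."""
    img = np.uint8(255 * (slice_2d - slice_2d.min()) / (np.ptp(slice_2d) + 1e-9))
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2], levels=256,
                        symmetric=True, normed=True)
    return np.hstack([graycoprops(glcm, p).ravel()
                      for p in ("contrast", "homogeneity", "energy", "correlation")])

# Synthetic stand-in for traced tumour ROIs: four classes with different texture smoothness
rng = np.random.default_rng(0)
slices, labels = [], []
for c, sigma in enumerate((0.5, 1.0, 2.0, 4.0)):      # ependymoma, astrocytoma, medulloblastoma, healthy
    for _ in range(20):
        slices.append(gaussian_filter(rng.random((64, 64)), sigma=sigma))
        labels.append(c)

X = np.array([texture_features(s) for s in slices])
print(cross_val_score(SVC(), X, np.array(labels), cv=5).mean())
```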

Keywords: image classification, machine learning algorithms, pediatric MRI, pediatric oncology

Procedia PDF Downloads 149
2598 Machine Learning Techniques for COVID-19 Detection: A Comparative Analysis

Authors: Abeer A. Aljohani

Abstract:

The spread of the COVID-19 virus has been one of the most extreme pandemics across the globe. It is also referred to as coronavirus, a contagious disease that continuously mutates into numerous variants. Currently, the B.1.1.529 variant, labeled omicron, has been detected in South Africa. The huge spread of COVID-19 has affected several lives and has placed exceptional pressure on healthcare systems worldwide. Everyday life and the global economy have also been at stake. This research aims to predict COVID-19 disease in its initial stage in order to reduce the death count. Machine learning (ML) is nowadays used in almost every area. Numerous COVID-19 cases have produced a huge burden on hospitals as well as health workers. To reduce this burden, this paper predicts COVID-19 disease based on the symptoms and medical history of the patient. This research presents a unique architecture for COVID-19 detection using ML techniques integrated with feature dimensionality reduction. This paper uses a standard UCI dataset for predicting COVID-19 disease, comprising the symptoms of 5,434 patients. This paper also compares several supervised ML techniques within the presented architecture. The architecture utilizes a 10-fold cross-validation process for generalization and the principal component analysis (PCA) technique for feature reduction. Standard metrics are used to evaluate the proposed architecture, including F1-score, precision, accuracy, recall, receiver operating characteristic (ROC), and area under the curve (AUC). The results show that decision tree, random forest, and neural networks outperform all other state-of-the-art ML techniques. This result can help effectively in identifying COVID-19 infection cases.
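
A minimal sketch of the PCA-plus-10-fold-cross-validation architecture is given below; the file and column names for the symptom dataset are assumptions.

```python
import pandas as pd
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

# Hypothetical file/column names for a binary symptom dataset (5,434 patients in the study)
df = pd.read_csv("covid_symptoms.csv")
X, y = df.drop(columns=["covid_positive"]), df["covid_positive"]

models = {
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Random forest": RandomForestClassifier(random_state=0),
    "Neural network": MLPClassifier(max_iter=1000, random_state=0),
}
for name, clf in models.items():
    pipe = Pipeline([("scale", StandardScaler()),
                     ("pca", PCA(n_components=0.95)),   # keep 95% of the variance
                     ("clf", clf)])
    scores = cross_val_score(pipe, X, y, cv=10, scoring="roc_auc")
    print(f"{name:>14}: AUC = {scores.mean():.3f}")
```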

Keywords: supervised machine learning, COVID-19 prediction, healthcare analytics, random forest, neural network

Procedia PDF Downloads 92
2597 Statistical Quality Control on Assignable Causes of Variation on Cement Production in Ashaka Cement PLC Gombe State

Authors: Hamisu Idi

Abstract:

The present study focuses on the impact of assignable causes of variation on the quality of cement production. Exploratory research was done on a monthly basis, with data obtained from a secondary source, i.e., the records kept by an automated recompilation machine. The machine keeps all the records of the mills' downtime, which the process manager checks for validation, referring any fault to the department responsible for maintenance or measurement so as to prevent future occurrences. The findings indicated that the products of Ashaka Cement Plc. were of acceptable quality, since all the production processes were found to be in control (within preset specifications), with the exception of natural causes of variation, which are normal in the production process and do not affect the outcome of the product. Natural variation is reduced to the barest minimum since it cannot be totally eliminated. It is hoped that the findings of this study will be of great assistance to the management of the Ashaka cement factory, and to the process manager in particular, at various levels in the monitoring and implementation of statistical process control. This study therefore contributes greatly to knowledge in this regard, and it is hoped that it will open up more research in this direction.
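
As an illustration of the control-chart reasoning behind separating natural from assignable causes, the sketch below builds an X-bar chart with the standard A2 constant on hypothetical subgroup data; the subgroup values are simulated, not the plant's records.

```python
import numpy as np

# Hypothetical subgroups of a cement quality characteristic, 12 months x 5 measurements per month
rng = np.random.default_rng(3)
subgroups = rng.normal(loc=350.0, scale=4.0, size=(12, 5))

xbar = subgroups.mean(axis=1)              # subgroup means
rbar = np.ptp(subgroups, axis=1).mean()    # average subgroup range
A2 = 0.577                                 # X-bar chart constant for subgroup size n = 5

centre = xbar.mean()
ucl, lcl = centre + A2 * rbar, centre - A2 * rbar
out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]

print(f"centre line = {centre:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
print("months signalling an assignable cause:",
      out_of_control + 1 if out_of_control.size else "none")
```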

Keywords: cement, quality, variation, assignable cause, common cause

Procedia PDF Downloads 261
2596 “CheckPrivate”: Artificial Intelligence Powered Mobile Application to Enhance the Well-Being of Sexually Transmitted Diseases Patients in Sri Lanka under Cultural Barriers

Authors: Warnakulasuriya Arachichige Malisha Ann Rosary Fernando, Udalamatta Gamage Omila Chalanka Jinadasa, Bihini Pabasara Amandi Amarasinghe, Manul Thisuraka Mandalawatta, Uthpala Samarakoon, Manori Gamage

Abstract:

The surge in sexually transmitted diseases (STDs) has become a critical public health crisis demanding urgent attention and action. Like many other nations, Sri Lanka is grappling with a significant increase in STDs due to a lack of education and awareness regarding their dangers. Presently, the available applications for tracking and managing STDs cover only a limited number of easily detectable infections, resulting in a significant gap in effectively controlling their spread. To address this gap and combat the rising STD rates, it is essential to leverage technology and data. Employing technology to enhance the tracking and management of STDs is vital to prevent their further propagation and to enable early intervention and treatment. This requires adopting a comprehensive approach that involves raising public awareness about the perils of STDs, improving access to affordable healthcare services for early detection and treatment, and utilizing advanced technology and data analysis. The proposed mobile application aims to cater to a broad range of users, including STD patients, recovered individuals, and those unaware of their STD status. By harnessing cutting-edge technologies like image detection, symptom-based identification, prevention methods, doctor and clinic recommendations, and virtual counselor chat, the application offers a holistic approach to STD management. In conclusion, the escalating STD rates in Sri Lanka and across the globe require immediate action. The integration of technology-driven solutions, along with comprehensive education and healthcare accessibility, is the key to curbing the spread of STDs and promoting better overall public health.

Keywords: STD, machine learning, NLP, artificial intelligence

Procedia PDF Downloads 81
2595 New Advanced Medical Software Technology Challenges and Evolution of the Regulatory Framework in Expert Software, Artificial Intelligence, and Machine Learning

Authors: Umamaheswari Shanmugam, Silvia Ronchi, Radu Vornicu

Abstract:

Software, artificial intelligence, and machine learning can improve healthcare through innovative and advanced technologies that are able to use the large amount and variety of data generated during healthcare services every day. As we read the news, over 500 machine learning or other artificial intelligence medical devices have now received FDA clearance or approval, the first ones even preceding the year 2000. One of the big advantages of these new technologies is the ability to gain experience and knowledge from real-world use and to continuously improve their performance. Healthcare systems and institutions can benefit greatly, because the use of advanced technologies improves the efficiency and the efficacy of healthcare at the same time. Software defined as a medical device is stand-alone software intended to be used for one or more specific medical purposes: diagnosis, prevention, monitoring, prediction, prognosis, treatment or alleviation of a disease or other health conditions; replacing or modifying any part of a physiological or pathological process; or managing information received from in vitro specimens derived from the human body, without achieving its principal intended action by pharmacological, immunological or metabolic means. Software qualified as a medical device must comply with the general safety and performance requirements applicable to medical devices. These requirements are necessary to ensure high performance and quality and also to protect patients' safety. The evolution and continuous improvement of software used in healthcare must take into consideration the increase in regulatory requirements, which are becoming more complex in each market. The gap between these advanced technologies and the new regulations is the biggest challenge for medical device manufacturers. Regulatory requirements can be considered a market barrier, as they can delay or obstruct device approval, but they are necessary to ensure performance, quality, and safety; at the same time, they can be a business opportunity if the manufacturer is able to define the appropriate regulatory strategy in advance. This abstract provides an overview of the current regulatory framework, the evolution of the international requirements, and the standards applicable to medical device software in potential markets all over the world.

Keywords: artificial intelligence, machine learning, SaMD, regulatory, clinical evaluation, classification, international requirements, MDR, 510k, PMA, IMDRF, cyber security, health care systems

Procedia PDF Downloads 89
2594 A Machine Learning-Based Model to Screen Antituberculosis Compound Targeted against LprG Lipoprotein of Mycobacterium tuberculosis

Authors: Syed Asif Hassan, Syed Atif Hassan

Abstract:

Multidrug-resistant tuberculosis (MDR-TB) is an infection caused by resistant strains of Mycobacterium tuberculosis that do not respond to either isoniazid or rifampicin, the most important anti-TB drugs. The increase in the occurrence of drug-resistant strains of MTB calls for an intensive search for novel target-based therapeutics. In this context, LprG (Rv1411c), a lipoprotein from MTB, plays a pivotal role in the immune evasion of Mtb, leading to survival and propagation of the bacterium within the host cell. Therefore, a machine learning method will be developed for generating a computational model that could predict the potential anti-LprG activity of novel antituberculosis compounds. The present study will utilize a dataset from the PubChem database maintained by the National Center for Biotechnology Information (NCBI). The dataset involves compounds screened against MTB, categorized as active or inactive based upon the PubChem activity score. PowerMV, a molecular descriptor generation and visualization tool, will be used to generate the 2D molecular descriptors for the active and inactive compounds present in the dataset. The 2D molecular descriptors generated from PowerMV will be used as features. We feed these features into three different classifiers, namely a random forest, a deep neural network, and a recurrent neural network, to build separate predictive models, choosing the best-performing model based on the accuracy of predicting novel antituberculosis compounds with anti-LprG activity. Additionally, the predicted active compounds will be screened using a SMARTS filter to choose molecules with drug-like features.
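
A minimal sketch of the random-forest branch of this workflow is shown below, assuming a descriptor table exported from PowerMV; the file and column names are hypothetical.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, roc_auc_score

# Hypothetical descriptor table: one row per PubChem compound, 2D descriptors + activity label
df = pd.read_csv("lprg_descriptors.csv")          # e.g., exported from PowerMV
X = df.drop(columns=["CID", "active"])
y = df["active"]                                   # 1 = active against MTB, 0 = inactive

X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, test_size=0.2, random_state=42)

clf = RandomForestClassifier(n_estimators=500, class_weight="balanced", random_state=42)
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
proba = clf.predict_proba(X_test)[:, 1]
print("accuracy:", accuracy_score(y_test, pred))
print("ROC AUC :", roc_auc_score(y_test, proba))
```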

Keywords: antituberculosis drug, classifier, machine learning, molecular descriptors, prediction

Procedia PDF Downloads 391
2593 Component Based Testing Using Clustering and Support Vector Machine

Authors: Iqbaldeep Kaur, Amarjeet Kaur

Abstract:

Software reusability is an important part of software development, so component-based software development in the context of software testing has gained a lot of practical importance in the field of software engineering, both from the academic research and the software development industry perspective. Finding test cases for efficient reuse is one of the important problems addressed by researchers. Clustering reduces the search space and supports test case reuse by grouping similar entities according to requirements, ensuring reduced time complexity because it lowers the search time for retrieving test cases. In this research paper, we propose an approach for the reusability of test cases based on unsupervised learning, using k-means clustering together with a Support Vector Machine. We have designed an algorithm for requirement and test case document clustering according to their tf-idf vector space, and the output is a set of highly cohesive pattern groups.
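
The clustering-then-routing idea can be sketched as follows on a toy set of test-case descriptions: tf-idf vectors are grouped with k-means, and an SVM then assigns new requirements to a cluster of reusable test cases; the data and cluster count are illustrative, not the authors' implementation.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

# Toy requirement/test-case descriptions standing in for a real repository
documents = [
    "verify login with valid username and password",
    "verify login fails with invalid password",
    "check report export to pdf format",
    "check report export to csv format",
    "validate payment with expired credit card",
    "validate payment with valid credit card",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(documents)

# Step 1: group similar test cases to shrink the retrieval search space
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("cluster assignments:", kmeans.labels_)

# Step 2: train a classifier on the cluster labels so a new requirement
# can be routed directly to the most relevant group of reusable test cases
router = LinearSVC().fit(X, kmeans.labels_)
new_req = tfidf.transform(["export the monthly report as pdf"])
print("reuse test cases from cluster:", router.predict(new_req)[0])
```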

Keywords: software testing, reusability, clustering, k-mean, SVM

Procedia PDF Downloads 430
2592 Digital Platform of Crops for Smart Agriculture

Authors: Pascal François Faye, Baye Mor Sall, Bineta Dembele, Jeanne Ana Awa Faye

Abstract:

In agriculture, estimating crop yields is key to improving productivity and decision-making processes such as financial market forecasting and addressing food security issues. The main objective of this paper is to provide tools to predict and improve the accuracy of crop yield forecasts using machine learning (ML) algorithms such as CART, KNN and SVM. We developed a mobile app and a web app that use these algorithms for practical use by farmers. The tests show that our system (collection and deployment architecture, web application and mobile application) is operational and validates empirical knowledge of agro-climatic parameters in addition to providing proactive decision-making support. In the experimental results obtained on the agricultural data, the performance of the ML algorithms is compared using cross-validation in order to identify the most effective ones for the agricultural data. The proposed applications demonstrate that the proposed approach is effective in predicting crop yields and provides timely and accurate responses to farmers for decision support.
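
A minimal sketch of comparing CART, KNN and SVM by cross-validation on agro-climatic features is given below; the file and column names are assumptions, and yield prediction is treated here as a regression task.

```python
import pandas as pd
from sklearn.tree import DecisionTreeRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Hypothetical columns: agro-climatic parameters plus observed yield (t/ha)
df = pd.read_csv("crop_records.csv")
X = df[["rainfall_mm", "mean_temp_c", "soil_ph", "fertilizer_kg_ha"]]
y = df["yield_t_ha"]

models = {
    "CART": DecisionTreeRegressor(random_state=0),
    "KNN": make_pipeline(StandardScaler(), KNeighborsRegressor()),
    "SVM": make_pipeline(StandardScaler(), SVR()),
}
for name, model in models.items():
    mae = -cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error").mean()
    print(f"{name}: mean absolute error = {mae:.2f} t/ha")
```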

Keywords: prediction, machine learning, artificial intelligence, digital agriculture

Procedia PDF Downloads 80
2591 Evaluation of the CRISP-DM Business Understanding Step: An Approach for Assessing the Predictive Power of Regression versus Classification for the Quality Prediction of Hydraulic Test Results

Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter

Abstract:

Digitalisation in production technology is a driver for the application of machine learning methods. Through the application of predictive quality, the great potential for saving necessary quality control can be exploited by means of data-based prediction of product quality and states. However, the serial use of machine learning applications is often prevented by various problems. Fluctuations occur in real production data sets, which are reflected in trends and systematic shifts over time. To counteract these problems, data preprocessing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets to extract stable features. Successful process control of the target variables aims to centre the measured values around a mean and minimise variance. Competitive leaders claim to have mastered their processes. As a result, much of the real data has a relatively low variance. For the training of prediction models, the highest possible generalisability is required, which is made more difficult by this data availability. The implementation of a machine learning application can be interpreted as a production process. The CRoss Industry Standard Process for Data Mining (CRISP-DM) is a process model with six phases that describes the life cycle of data science. As in any process, the costs to eliminate errors increase significantly with each advancing process phase. For the quality prediction of hydraulic test steps of directional control valves, the question arises in the initial phase whether a regression or a classification is more suitable. In the context of this work, the initial phase of CRISP-DM, the business understanding, is critically compared for the use case at Bosch Rexroth with regard to regression and classification. The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predict the quality characteristics of workpieces. Suitable methods for leakage volume flow regression and for classification of the inspection decision are applied. Impressively, classification is clearly superior to regression and achieves promising accuracies.

Keywords: classification, CRISP-DM, machine learning, predictive quality, regression

Procedia PDF Downloads 144
2590 A Novel Combined Finger Counting and Finite State Machine Technique for ASL Translation Using Kinect

Authors: Rania Ahmed Kadry Abdel Gawad Birry, Mohamed El-Habrouk

Abstract:

This paper presents a brief survey of the techniques used for sign language recognition, along with the types of sensors used to perform the task. It presents a modified method for the identification of an isolated sign language gesture using the Microsoft Kinect with the OpenNI framework. It describes how to extract robust features from the depth image provided by the Microsoft Kinect and the OpenNI interface and to use them in creating a robust and accurate gesture recognition system for the purpose of ASL translation. PrimeSense's Natural Interaction Technology for End-user (NITE™) was also used in the C++ implementation of the system. The algorithm combines a simple finger counting algorithm for static signs with a directional Finite State Machine (FSM) description of the hand motion in order to help in translating a sign language gesture. This includes both letters and numbers performed by a user, which in turn may be used as input for voice pronunciation systems.
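
A directional FSM for a dynamic gesture can be sketched as follows (in Python for illustration; the authors' system is implemented in C++); the states, directions, and gesture definition are hypothetical.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    MOVED_RIGHT = auto()
    MOVED_DOWN = auto()
    RECOGNISED = auto()

# Hypothetical gesture: move right, then down
TRANSITIONS = {
    (State.IDLE, "RIGHT"): State.MOVED_RIGHT,
    (State.MOVED_RIGHT, "RIGHT"): State.MOVED_RIGHT,   # stay while still moving right
    (State.MOVED_RIGHT, "DOWN"): State.MOVED_DOWN,
    (State.MOVED_DOWN, "DOWN"): State.RECOGNISED,
}

def classify(directions):
    """Feed quantised per-frame hand-motion directions (e.g. from Kinect tracking) through the FSM."""
    state = State.IDLE
    for d in directions:
        state = TRANSITIONS.get((state, d), state)     # ignore directions with no defined transition
        if state is State.RECOGNISED:
            return True
    return False

print(classify(["RIGHT", "RIGHT", "DOWN", "DOWN"]))    # True: gesture completed
print(classify(["DOWN", "RIGHT"]))                     # False: wrong order of motions
```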

Keywords: American sign language, finger counting, hand tracking, Microsoft Kinect

Procedia PDF Downloads 296