Search results for: motor for washing machine
2799 Designing Energy Efficient Buildings for Seasonal Climates Using Machine Learning Techniques
Authors: Kishor T. Zingre, Seshadhri Srinivasan
Abstract:
Energy consumption by the building sector is increasing at an alarming rate throughout the world and leading to more building-related CO₂ emissions into the environment. In buildings, the main contributors to energy consumption are heating, ventilation, and air-conditioning (HVAC) systems, lighting, and electrical appliances. It is hypothesised that energy efficiency in buildings can be achieved by implementing sustainable technologies such as i) enhancing the thermal resistance of fabric materials to reduce heat gain (in hotter climates) and heat loss (in colder climates), ii) enhancing daylight and lighting systems, iii) improving the HVAC system, and iv) occupant localization. The energy performance of these sustainable technologies is highly dependent on climatic conditions. This paper investigated the use of machine learning techniques for accurate prediction of air-conditioning energy in seasonal climates. The data required to train the machine learning techniques are obtained from computational simulations performed on a 3-story commercial building using the EnergyPlus program with the OpenStudio plug-in and Google SketchUp. The EnergyPlus model was calibrated against experimental measurements of surface temperatures and heat flux prior to being employed for the simulations. It has been observed from the simulations that the performance of sustainable fabric materials (for walls, roof, and windows) such as phase change materials, insulation, cool roofs, etc. varies with the climate conditions. Renewable technologies were also applied to the building flat roofs in various climates to investigate the potential for electricity generation. It has been observed that the proposed technique overcomes the shortcomings of existing approaches, such as local linearization or over-simplifying assumptions. In addition, the proposed method can be used for real-time estimation of building air-conditioning energy.
Keywords: building energy efficiency, energyplus, machine learning techniques, seasonal climates
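As a hedged illustration of the prediction step described in this abstract (the authors' exact algorithm and feature set are not stated here), a regression model can be fitted to EnergyPlus simulation outputs; the file name, column names, and the random-forest choice below are assumptions made for the sketch.

```python
# Minimal sketch: train a regressor on EnergyPlus simulation results to
# predict air-conditioning energy. Feature names and the model choice are
# illustrative assumptions, not the authors' exact setup.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Hypothetical table exported from EnergyPlus/OpenStudio runs
df = pd.read_csv("energyplus_runs.csv")
features = ["outdoor_temp", "solar_radiation", "wall_u_value",
            "roof_u_value", "window_shgc", "occupancy"]
X, y = df[features], df["ac_energy_kwh"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out runs:", r2_score(y_test, model.predict(X_test)))
```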
Procedia PDF Downloads 113
2798 Socio-Motor Experience between Affectivity and Movement from Harry Potter to Lord of the Rings
Authors: Manuela Gamba, Niki Mandolesi
Abstract:
Teenagers today have little knowledge about how to move or play together. The adults who are part of sports culture must find an effective way to foster this essential ability. Our research in Italy uses a 'holistic model' based on fantasy literature to explore the relationships between the game identities and self-identities of young people and the achievement of psycho-motor, emotional and social well-being in the realms of sport and education. Physical activity projects were carried out in schools and extra-curricular associations in Rome, combining outdoor activities and distance learning. This holistic and malleable game model is inspired by fantasy accounts of the journeys taken in The Lord of the Rings and Harry Potter books. We know that many have a lot of resistance to the idea of using fantasy and play as a pedagogical tool, but the results obtained in this experience are surprising. Our interventions and investigations focused on promoting self-esteem, awareness, a sense of belonging, social integration, cooperation, well-being, and informed decision making: a basis for healthy and effective citizenship. For teenagers, creative thinking is the right stimulus to involve them and to compare the stories of characters to their own journey through social and self-reflective identity analysis. We observed how important it is to engage students emotionally as well as cognitively, and to enable them to play with identity through relationships with peers. There is a need today for a multidisciplinary synthesis of analog and digital values, especially in response to recent distance-living experiences. There is a need for a global reconceptualization of free time and nature in the human experience.
Keywords: awareness, creativity, identity, play
Procedia PDF Downloads 189
2797 An Automated R-Peak Detection Method Using Common Vector Approach
Authors: Ali Kirkbas
Abstract:
R peaks in an electrocardiogram (ECG) are signs of cardiac activity in individuals that reveal valuable information about cardiac abnormalities, which can lead to mortalities in some cases. This paper examines the problem of detecting R-peaks in ECG signals, which is, in fact, a two-class pattern classification problem. To handle this problem with reliably high accuracy, we propose to use the common vector approach, which is a successful machine learning algorithm. The dataset used in the proposed method is obtained from MIT-BIH, which is publicly available. The results are compared with those of other popular methods under the same performance metrics. The obtained results show that the proposed method performs better than the compared methods in terms of diagnostic accuracy and simplicity, and it can be operated on wearable devices.
Keywords: ECG, R-peak classification, common vector approach, machine learning
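The following is a minimal sketch of how a common vector approach classifier can be set up for two classes of fixed-length ECG windows; it assumes the usual CVA setting where the feature dimension exceeds the per-class sample count, and the windowing, MIT-BIH loading, and preprocessing used in the paper are omitted.

```python
# Minimal sketch of a common vector approach (CVA) classifier for fixed-length
# ECG windows labelled "R-peak" vs "non-R-peak". Data here are synthetic.
import numpy as np

def class_model(X):
    """X: (n_samples, n_features) for one class. Returns (basis, common_vec)."""
    diffs = (X[1:] - X[0]).T                      # difference-subspace vectors
    U, s, _ = np.linalg.svd(diffs, full_matrices=False)
    U = U[:, s > 1e-10]                           # orthonormal basis of the difference subspace
    common = X[0] - U @ (U.T @ X[0])              # same for every sample of the class
    return U, common

def predict(x, models):
    dists = []
    for U, common in models:
        remainder = x - U @ (U.T @ x)             # project onto the orthogonal complement
        dists.append(np.linalg.norm(remainder - common))
    return int(np.argmin(dists))

# Hypothetical usage with two classes of ECG windows (180 samples per window)
rng = np.random.default_rng(0)
X_nonpeak, X_peak = rng.normal(size=(20, 180)), rng.normal(size=(20, 180))
models = [class_model(X_nonpeak), class_model(X_peak)]   # label 0, label 1
print(predict(rng.normal(size=180), models))
```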
Procedia PDF Downloads 62
2796 The Principle of a Thought Formation: The Biological Base for a Thought
Authors: Ludmila Vucolova
Abstract:
The thought is a process that underlies consciousness and cognition, and understanding its origin and processes is a longstanding goal of many academic disciplines. By integrating over twenty novel ideas and hypotheses of this theoretical proposal, we can speculate that thought is an emergent property of coded neural events, translating the electro-chemical interactions of the body with its environment—the objects of sensory stimulation, X, and Y. The latter is a self-generated feedback entity, resulting from the arbitrary pattern of the motion of a body's motor repertory (M). A culmination of these neural events gives rise to a thought: a state of identity between an observed object X and a symbol Y. It manifests as a "state of awareness" or "state of knowing" and forms our perception of the physical world. The values of the variables of a construct—X (object), S1 (sense for the perception of X), Y (object), S2 (sense for perception of Y), and M (motor repertory that produces Y)—will specify the particular conscious percept at any given time. The proposed principle of interaction between the elements of a construct (X, Y, S1, S2, M) is universal and applies to all modes of communication (normal, deaf, blind, deaf and blind people) and to various language systems (Chinese, Italian, English, etc.). The particular arrangement of modalities of each of the three modules S1 (5 of 5), S2 (1 of 3), and M (3 of 3) defines a specific mode of communication. This multifaceted paradigm demonstrates a predetermined pattern of relationships between X, Y, and M that passes from generation to generation. The presented analysis of a cognitive experience encompasses the key elements of embodied cognition theories and unequivocally accords with the scientific interpretation of cognition as the mental action or process of acquiring knowledge and understanding through thought, experience, and the senses; that is, cognition means thinking and awareness. By assembling the novel ideas presented in twelve sections, we can reveal that in the invisible "chaos" there is an order, a structure with landmarks and principles of operation, and that mental processes (thoughts) are physical and have a biological basis. This innovative proposal explains the phenomenon of mental imagery, gives the first insight into the relationship between mental states and brain states, and supports the notion that mind and body are inseparably connected. The findings of this theoretical proposal are supported by the current scientific data and are substantiated by the records of the evolution of language and human intelligence.
Keywords: agent, awareness, cognitive, element, experience, feedback, first person, imagery, language, mental, motor, object, sensory, symbol, thought
Procedia PDF Downloads 383
2795 Dependence of Autoignition Delay Period on Equivalence Ratio for i-Octane, Primary Reference Fuel
Authors: Sunil Verma
Abstract:
In today's world, non-renewable energy sources are depleting quickly, so there is a need to produce efficient and unconventional engines to minimize the use of fuel. Also, there are many fatal accidents happening every year during the extraction, distillation, transportation, and storage of fuel. The reason for explosions of gaseous fuels is unwanted autoignition. The autoignition characteristics of fuels must therefore be studied in order to build efficient engines and to avoid accidents. This report is concerned with the study of the autoignition delay characteristics of iso-octane using a rapid compression machine. The paper clearly explains the dependence of the ignition delay characteristics on the variation of the equivalence ratio from lean to rich mixtures. The equivalence ratio is varied from 0.3 to 1.2.
Keywords: autoignition, iso-octane, combustion, rapid compression machine, equivalence ratio, ignition delay
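For reference, the equivalence ratio varied in these experiments is the standard normalized fuel-air ratio, so the 0.3-1.2 range spans lean mixtures through slightly rich ones:

```latex
\phi = \frac{(F/A)_{\mathrm{actual}}}{(F/A)_{\mathrm{stoich}}},
\qquad
\phi < 1 \;\text{(lean)}, \quad \phi = 1 \;\text{(stoichiometric)}, \quad \phi > 1 \;\text{(rich)}
```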
Procedia PDF Downloads 442
2794 A Combined Approach Based on Artificial Intelligence and Computer Vision for Qualitative Grading of Rice Grains
Authors: Hemad Zareiforoush, Saeed Minaei, Ahmad Banakar, Mohammad Reza Alizadeh
Abstract:
The quality inspection of rice (Oryza sativa L.) during its various processing stages is very important. In this research, an artificial intelligence-based model coupled with computer vision techniques was developed as a decision support system for the qualitative grading of rice grains. For conducting the experiments, first, 25 samples of rice grains with different levels of percentage of broken kernels (PBK) and degree of milling (DOM) were prepared, and their qualitative grade was assessed by experienced experts. Then, the quality parameters of the same samples examined by the experts were determined using a machine vision system. A grading model was developed based on fuzzy logic theory in MATLAB software to relate the qualitative characteristics of the product to its quality. In total, 25 rules were used for qualitative grading based on the AND operator and the Mamdani inference system. The fuzzy inference system consisted of two input linguistic variables, namely DOM and PBK, which were obtained by the machine vision system, and one output variable (quality of the product). The model output was finally defuzzified using the Center of Maximum (COM) method. In order to evaluate the developed model, the output of the fuzzy system was compared with the experts' assessments. It was revealed that the developed model can estimate the qualitative grade of the product with an accuracy of 95.74%.
Keywords: machine vision, fuzzy logic, rice, quality
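A heavily simplified sketch of the Mamdani-style inference described above is shown below; the membership functions, the single rule, and the mean-of-maximum defuzzification are illustrative stand-ins for the paper's 25-rule MATLAB system with COM defuzzification.

```python
# Simplified Mamdani-style sketch: two inputs (DOM, PBK), one output (quality).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

quality = np.linspace(0, 100, 1001)            # output universe

def grade(dom, pbk):
    # input fuzzification (illustrative sets, not the paper's)
    dom_high = tri(dom, 60, 100, 140)          # "well milled"
    pbk_low  = tri(pbk, -20, 0, 20)            # "few broken kernels"
    # one Mamdani rule: IF DOM is high AND PBK is low THEN quality is good
    firing  = min(dom_high, pbk_low)           # AND via min
    good    = tri(quality, 60, 100, 140)       # output set "good"
    clipped = np.minimum(good, firing)         # implication via min
    if clipped.max() == 0:
        return 0.0
    # mean-of-maximum defuzzification of the (single-rule) aggregate
    return quality[clipped == clipped.max()].mean()

print(grade(dom=85.0, pbk=5.0))
```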
Procedia PDF Downloads 418
2793 Predicting Football Player Performance: Integrating Data Visualization and Machine Learning
Authors: Saahith M. S., Sivakami R.
Abstract:
In the realm of football analytics, particularly focusing on predicting football player performance, the ability to forecast player success accurately is of paramount importance for teams, managers, and fans. This study introduces an elaborate examination of predicting football player performance through the integration of data visualization methods and machine learning algorithms. The research entails the compilation of an extensive dataset comprising player attributes, conducting data preprocessing, feature selection, model selection, and model training to construct predictive models. The analysis within this study will involve delving into feature significance using methodologies like Select Best and Recursive Feature Elimination (RFE) to pinpoint pertinent attributes for predicting player performance. Various machine learning algorithms, including Random Forest, Decision Tree, Linear Regression, Support Vector Regression (SVR), and Artificial Neural Networks (ANN), will be explored to develop predictive models. The evaluation of each model's performance utilizing metrics such as Mean Squared Error (MSE) and R-squared will be executed to gauge their efficacy in predicting player performance. Furthermore, this investigation will encompass a top player analysis to recognize the top-performing players based on the anticipated overall performance scores. Nationality analysis will entail scrutinizing the player distribution based on nationality and investigating potential correlations between nationality and player performance. Positional analysis will concentrate on examining the player distribution across various positions and assessing the average performance of players in each position. Age analysis will evaluate the influence of age on player performance and identify any discernible trends or patterns associated with player age groups. The primary objective is to predict a football player's overall performance accurately based on their individual attributes, leveraging data-driven insights to enrich the comprehension of player success on the field. By amalgamating data visualization and machine learning methodologies, the aim is to furnish valuable tools for teams, managers, and fans to effectively analyze and forecast player performance. This research contributes to the progression of sports analytics by showcasing the potential of machine learning in predicting football player performance and offering actionable insights for diverse stakeholders in the football industry.
Keywords: football analytics, player performance prediction, data visualization, machine learning algorithms, random forest, decision tree, linear regression, support vector regression, artificial neural networks, model evaluation, top player analysis, nationality analysis, positional analysis
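A compact sketch of the kind of pipeline outlined above (feature selection, a regressor for the overall rating, and MSE/R-squared scoring) is given below; the dataset file, column names, and model settings are assumptions, not the study's exact choices.

```python
# Minimal sketch: feature selection followed by a regressor predicting an
# "overall" rating, scored with MSE and R^2. Column names are assumptions.
import pandas as pd
from sklearn.feature_selection import RFE
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

df = pd.read_csv("players.csv")                       # hypothetical dataset
X = df.drop(columns=["overall", "name", "nationality"])
y = df["overall"]

# Recursive Feature Elimination around a linear model to pick attributes
selector = RFE(LinearRegression(), n_features_to_select=15).fit(X, y)
X_sel = X.loc[:, selector.support_]

X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.2, random_state=42)
model = RandomForestRegressor(n_estimators=300, random_state=42).fit(X_tr, y_tr)
pred = model.predict(X_te)
print("MSE:", mean_squared_error(y_te, pred), "R2:", r2_score(y_te, pred))
```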
Procedia PDF Downloads 36
2792 Direct Drive Double Fed Wind Generator
Authors: Vlado Ostovic
Abstract:
An electric machine topology characterized by single tooth winding in both stator and rotor is presented. The proposed machine is capable of operating as a direct drive double fed wind generator (DDDF, D3F) because it requires no gearbox and only a reduced-size converter. A wind turbine drive built around a D3F generator is cheaper to manufacture, requires less maintenance, and has a higher energy yield than its conventional counterparts. The single tooth wound generator of a D3F turbine has superb volume utilization and lower stator I²R losses due to its extremely short end-windings. Both stator and rotor of a D3F generator can be manufactured in segments, which simplifies its assembly and transportation to the site, and makes production cheaper.
Keywords: direct drive, double fed generator, gearbox, permanent magnet generators, single tooth winding, wind power
Procedia PDF Downloads 189
2791 Machine Learning Based Approach for Measuring Promotion Effectiveness in Multiple Parallel Promotions' Scenarios
Authors: Revoti Prasad Bora, Nikita Katyal
Abstract:
Promotion is a key element in the retail business. Thus, analysis of promotions to quantify their effectiveness in terms of Revenue and/or Margin is an essential activity in the retail industry. However, measuring the sales/revenue uplift is based on estimations, as the actual sales/revenue without the promotion is not observable. Further, the presence of Halo and Cannibalization in a multiple parallel promotions' scenario complicates the problem. Calculating the Baseline from inter-brand/competitor items, or estimating the Halo and Cannibalization impact on Revenue by interpreting the Baseline as an item's unit sales in neighboring non-promotional weeks taken individually, may not capture the overall Revenue uplift in the case of multiple parallel promotions. Hence, this paper proposes a Machine Learning based method for calculating the Revenue uplift by considering the Halo and Cannibalization impact on the Baseline and the Revenue. In the first section of the proposed methodology, the Baseline of an item is calculated by incorporating the impact of the promotions on its related items. In the later section, the Revenue of an item is calculated by considering both Halo and Cannibalization impacts. Hence, this methodology enables correct calculation of the overall Revenue uplift due to a given promotion.
Keywords: Halo, Cannibalization, promotion, Baseline, temporary price reduction, retail, elasticity, cross price elasticity, machine learning, random forest, linear regression
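One hedged way to read this methodology is sketched below: the Baseline model is trained on non-promotional weeks using the item's own history plus the promotion status of related and competitor items (so Halo and Cannibalization enter the Baseline), and uplift is the gap between actual and predicted revenue. Column names and the random-forest choice are assumptions, not the paper's exact formulation.

```python
# Rough sketch: learn an item's Baseline (expected non-promoted units) from
# its history plus related items' promotion flags, then report uplift.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

df = pd.read_csv("weekly_sales.csv")          # hypothetical: one row per item-week
features = ["own_price", "lag_units_1w", "lag_units_2w",
            "related_item_on_promo", "competitor_item_on_promo", "week_of_year"]

train = df[df["on_promo"] == 0]               # fit on non-promoted weeks only
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(train[features], train["units"])

promo = df[df["on_promo"] == 1].copy()
promo["baseline_units"] = model.predict(promo[features])
promo["revenue_uplift"] = (promo["units"] - promo["baseline_units"]) * promo["price"]
print(promo.groupby("promo_id")["revenue_uplift"].sum())
```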
Procedia PDF Downloads 175
2790 Automatic Detection of Suicidal Behaviors Using an RGB-D Camera: Azure Kinect
Authors: Maha Jazouli
Abstract:
Suicide is one of the most important causes of death in the prison environment, both in Canada and internationally. Rates of suicide attempts and self-harm have been on the rise in recent years, with hangings being the most frequent method resorted to. The objective of this article is to propose a method to automatically detect suicidal behaviors in real time. We present a gesture recognition system that consists of three modules: model-based movement tracking, feature extraction, and gesture recognition using machine learning algorithms (MLA). The proposed system gives satisfactory results. This smart video surveillance system can help assist staff responsible for the safety and health of inmates by alerting them when suicidal behavior is detected, which helps reduce mortality rates and save lives.
Keywords: suicide detection, Azure Kinect, RGB-D camera, SVM, machine learning, gesture recognition
Procedia PDF Downloads 187
2789 'CardioCare': A Cutting-Edge Fusion of IoT and Machine Learning to Bridge the Gap in Cardiovascular Risk Management
Authors: Arpit Patil, Atharav Bhagwat, Rajas Bhope, Pramod Bide
Abstract:
This research integrates IoT and ML to predict heart failure risks, utilizing the Framingham dataset. IoT devices gather real-time physiological data, focusing on heart rate dynamics, while ML, specifically Random Forest, predicts heart failure. Rigorous feature selection enhances accuracy, achieving a prediction rate of over 90%. This amalgamation marks a transformative step in proactive healthcare, highlighting early detection's critical role in cardiovascular risk mitigation. Challenges persist, necessitating continual refinement for improved predictive capabilities.
Keywords: cardiovascular diseases, internet of things, machine learning, cardiac risk assessment, heart failure prediction, early detection, cardio data analysis
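A minimal sketch of the modelling step only is shown below, assuming the public Framingham teaching-dataset column names; the IoT data-ingestion side and the authors' exact feature-selection procedure are not reproduced.

```python
# Illustrative sketch: Random Forest on Framingham-style risk factors with
# simple feature selection. Column names are assumptions here.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

df = pd.read_csv("framingham.csv").dropna()
X, y = df.drop(columns=["TenYearCHD"]), df["TenYearCHD"]

X_sel = SelectKBest(f_classif, k=8).fit_transform(X, y)
X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.25,
                                          random_state=1, stratify=y)
clf = RandomForestClassifier(n_estimators=300, random_state=1).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```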
Procedia PDF Downloads 9
2788 A Monte Carlo Fuzzy Logistic Regression Framework against Imbalance and Separation
Authors: Georgios Charizanos, Haydar Demirhan, Duygu Icen
Abstract:
Two of the most impactful issues in classical logistic regression are class imbalance and complete separation. These can result in model predictions heavily leaning towards the imbalanced class on the binary response variable or over-fitting issues. Fuzzy methodology offers key solutions for handling these problems. However, most studies propose the transformation of the binary responses into a continuous format limited within [0,1]. This is called the possibilistic approach within fuzzy logistic regression. Following this approach is more aligned with straightforward regression since a logit-link function is not utilized, and fuzzy probabilities are not generated. In contrast, we propose a method of fuzzifying binary response variables that allows for the use of the logit-link function; hence, a probabilistic fuzzy logistic regression model with the Monte Carlo method. The fuzzy probabilities are then classified by selecting a fuzzy threshold. Different combinations of fuzzy and crisp input, output, and coefficients are explored, aiming to understand which of these perform better under different conditions of imbalance and separation. We conduct numerical experiments using both synthetic and real datasets to demonstrate the performance of the fuzzy logistic regression framework against seven crisp machine learning methods. The proposed framework shows better performance irrespective of the degree of imbalance and presence of separation in the data, while the considered machine learning methods are significantly impacted.
Keywords: fuzzy logistic regression, fuzzy, logistic, machine learning
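The sketch below is one loose possible reading of such a probabilistic fuzzy scheme: the binary responses are perturbed over Monte Carlo draws, a logit model is fitted to each draw, the predicted probabilities are averaged, and a fuzzy threshold produces the final class. The fuzzification rule, noise level, and threshold are illustrative assumptions, not the authors' formulation.

```python
# Very loose sketch of a Monte Carlo "fuzzified response" logit classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def mc_fuzzy_logit(X, y, X_new, n_draws=200, flip_degree=0.1, threshold=0.5):
    probs = np.zeros(len(X_new))
    for _ in range(n_draws):
        # fuzzified response: each label may flip with a small membership degree
        y_draw = np.where(rng.random(len(y)) < flip_degree, 1 - y, y)
        model = LogisticRegression(max_iter=1000).fit(X, y_draw)
        probs += model.predict_proba(X_new)[:, 1]
    probs /= n_draws                      # averaged "fuzzy probabilities"
    return (probs >= threshold).astype(int), probs

# synthetic demonstration data
X = rng.normal(size=(300, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
labels, fuzzy_p = mc_fuzzy_logit(X, y, X[:20])
print(labels)
```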
Procedia PDF Downloads 71
2787 Advances in Machine Learning and Deep Learning Techniques for Image Classification and Clustering
Authors: R. Nandhini, Gaurab Mudbhari
Abstract:
Ranging from the field of health care to self-driving cars, machine learning and deep learning algorithms have revolutionized the field with the proper utilization of images and visual-oriented data. Segmentation, regression, classification, clustering, dimensionality reduction, etc., are some of the Machine Learning tasks that helped Machine Learning and Deep Learning models become state-of-the-art models for fields where images are key datasets. Among these tasks, classification and clustering are essential but difficult because of the intricate and high-dimensional characteristics of image data. This study examines and assesses advanced techniques in supervised classification and unsupervised clustering for image datasets, emphasizing the relative efficiency of Convolutional Neural Networks (CNNs), Vision Transformers (ViTs), Deep Embedded Clustering (DEC), and self-supervised learning approaches. Due to the distinctive structural attributes present in images, conventional methods often fail to effectively capture spatial patterns, resulting in the development of models that utilize more advanced architectures and attention mechanisms. In image classification, we investigated both CNNs and ViTs. One of the most promising models, well known for its ability to detect spatial hierarchies, is the CNN, and it serves as a core model in our study. The ViT is another core model, reflecting a modern classification method that uses a self-attention mechanism; this makes ViTs more robust, as self-attention allows them to learn global dependencies in images without relying on convolutional layers. This paper evaluates the performance of these two architectures based on accuracy, precision, recall, and F1-score across different image datasets, analyzing their appropriateness for various categories of images. In the domain of clustering, we assess DEC, Variational Autoencoders (VAEs), and conventional clustering techniques like k-means, which are used on embeddings derived from CNN models. DEC, a prominent model in the field of clustering, has gained the attention of many ML engineers because of its ability to combine feature learning and clustering into a single framework; its main goal is to improve clustering quality through better feature representation. VAEs, on the other hand, are well known for using latent embeddings to group similar images without requiring prior labels, by utilizing a probabilistic clustering approach.
Keywords: machine learning, deep learning, image classification, image clustering
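As a small sketch of the clustering side discussed here, k-means can be run on embeddings already extracted from a CNN backbone and compared against held-back labels; the embedding file and the choice of k are assumptions.

```python
# Sketch: k-means on pre-extracted CNN embeddings, scored against true labels.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

data = np.load("cnn_embeddings.npz")          # hypothetical: features + labels
features, labels = data["features"], data["labels"]

kmeans = KMeans(n_clusters=10, n_init=10, random_state=0).fit(features)
print("Adjusted Rand Index vs. true classes:",
      adjusted_rand_score(labels, kmeans.labels_))
```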
Procedia PDF Downloads 6
2786 Accessing Motional Quotient for All Round Development
Authors: Zongping Wang, Chengjun Cui, Jiacun Wang
Abstract:
The concept of intelligence has been widely used to assess an individual's cognitive abilities to learn, form concepts, understand, apply logic, and reason. According to the multiple intelligence theory, there are eight distinguished types of intelligence. One of them is bodily-kinaesthetic intelligence, which links to the capacity of an individual to control his body and work with objects. Motor intelligence, on the other hand, reflects the capacity to understand, perceive and solve functional problems by motor behavior. Both bodily-kinaesthetic intelligence and motor intelligence refer directly or indirectly to bodily capacity. Inspired by these two intelligence concepts, this paper introduces motional intelligence (MI). MI is two-fold. (1) Body strength, which is the capacity of various organ functions manifested by muscle activity under the control of the central nervous system during physical exercises. It can be measured by the magnitude of muscle contraction force, the frequency of repeating a movement, the time to finish a movement of body position, the duration to maintain muscles in a working status, etc. Body strength reflects the objective side of MI. (2) The level of psychiatric willingness toward physical events. It is subjective and is determined by an individual's self-consciousness of physical events and resistance to fatigue. As such, we call it subjective MI. Subjective MI can be improved through education and proper social events. The improvement of subjective MI can lead to that of objective MI. A quantitative score of an individual's MI is the motional quotient (MQ). MQ is affected by several factors, including genetics, physical training, diet and lifestyle, family and social environment, and personal awareness of the importance of physical exercise. Genes determine one's body strength potential. Physical training, in general, makes people stronger, faster and swifter. Diet and lifestyle have a direct impact on health. Family and social environment largely affect one's passion for physical activities, as does personal awareness of the importance of physical exercise. The key to the success of the MQ study is developing an acceptable and efficient system that can be used to assess MQ objectively and quantitatively. We should apply different assessment systems to different groups of people according to their ages and genders. Field tests, laboratory tests and questionnaires are among the essential components of MQ assessment. A scientific interpretation of the MQ score is part of an MQ assessment system, as it will help an individual to improve his MQ. IQ (intelligence quotient) and EQ (emotional quotient) and their tests have been studied intensively. We argue that IQ and EQ study alone is not sufficient for an individual's all-round development. The significance of the MQ study is that it complements IQ and EQ study. MQ reflects an individual's mental level as well as bodily level of intelligence in physical activities. It is well known that the American Springfield College seal includes the Luther Gulick triangle with the words "spirit," "mind," and "body" written within it. MQ, together with IQ and EQ, echoes this education philosophy. Since its inception in 2012, MQ research has spread rapidly in China. By now, six prestigious universities in China have established research centers on MQ and its assessment.
Keywords: motional intelligence, motional quotient, multiple intelligence, motor intelligence, all round development
Procedia PDF Downloads 162
2785 A Non-Destructive Estimation Method for Internal Time in Perilla Leaf Using Hyperspectral Data
Authors: Shogo Nagano, Yusuke Tanigaki, Hirokazu Fukuda
Abstract:
Vegetables harvested early in the morning or late in the afternoon are valued in plant production, and so the time of harvest is important. The biological functions known as circadian clocks have a significant effect on this harvest timing. The purpose of this study was to non-destructively estimate the circadian clock and so construct a method for determining a suitable harvest time. We took eight samples of green busil (Perilla frutescens var. crispa) every 4 hours, six times over 1 day, and analyzed all samples at the same time. A hyperspectral camera was used to collect spectrum intensities at 141 different wavelengths (350–1050 nm). Calculation of the correlations between the spectrum intensity at each wavelength and the harvest time suggested the suitability of the hyperspectral camera for non-destructive estimation. However, even the most highly correlated wavelength showed only a weak correlation, so we used machine learning to raise the accuracy of estimation and constructed a machine learning model to estimate the internal time of the circadian clock. Artificial neural networks (ANN) were used for machine learning because this is an effective analysis method for large amounts of data. Using the estimation model resulted in an error between estimated and real times of 3 min. The estimations were made in less than 2 hours. Thus, we successfully demonstrated this method of non-destructively estimating internal time.
Keywords: artificial neural network (ANN), circadian clock, green busil, hyperspectral camera, non-destructive evaluation
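A minimal sketch of the estimation step is given below: a small neural network maps the 141-wavelength spectra to internal time. The file name, network size, and the sine/cosine phase encoding used to handle the 24 h wrap-around are assumptions for illustration.

```python
# Sketch: regress internal (circadian) time from 141-band spectra.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

data = np.load("perilla_spectra.npz")            # hypothetical measurement file
X = data["intensity"]                            # shape (n_leaves, 141)
hours = data["internal_time_h"]                  # 0-24 h

# encode circular time as (sin, cos) so 23:59 and 00:01 are treated as close
y = np.column_stack([np.sin(2 * np.pi * hours / 24),
                     np.cos(2 * np.pi * hours / 24)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=5000,
                   random_state=0).fit(X_tr, y_tr)
pred = net.predict(X_te)
pred_hours = (np.arctan2(pred[:, 0], pred[:, 1]) % (2 * np.pi)) * 24 / (2 * np.pi)
print(pred_hours[:5])
```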
Procedia PDF Downloads 297
2784 Supervised Learning for Cyber Threat Intelligence
Authors: Jihen Bennaceur, Wissem Zouaghi, Ali Mabrouk
Abstract:
The major aim of cyber threat intelligence (CTI) is to provide sophisticated knowledge about cybersecurity threats to ensure internal and external safeguards against modern cyberattacks. Inaccurate, incomplete, outdated, and low-value threat intelligence is the main problem. Therefore, data analysis based on AI algorithms is one of the emergent solutions to overcome threat information-sharing issues. In this paper, we propose a supervised machine learning-based algorithm to improve threat information sharing by providing a sophisticated classification of cyber threats and data. Extensive simulations investigate the accuracy, precision, recall, F1-score, and overall support to validate the designed algorithm and to compare it with several supervised machine learning algorithms.
Keywords: threat information sharing, supervised learning, data classification, performance evaluation
Procedia PDF Downloads 146
2783 The Lateral and Torsional Vibration Analysis of a Rotor-Bearing System Using Transfer Matrix Method
Authors: Mohammad Hadi Jalali, Mostafa Ghayour, Saeed Ziaei-Rad, Behrooz Shahriari
Abstract:
The vibration problems that can occur under the operational conditions of rotating machines may cause damage to the machine or even its complete failure. Therefore, the dynamic analysis of rotors is vital in the design and development stages of rotating machines. In this study, the uncoupled torsional and lateral vibration analysis of a rotor-bearing system is carried out using the transfer matrix method. The Campbell diagram, the critical speed, and the mode shape corresponding to the critical speed are obtained in order to evaluate the dynamic behavior of the rotor.
Keywords: transfer matrix method, rotor-bearing system, Campbell diagram, critical speed
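To illustrate the method for the torsional case, the sketch below propagates the state vector {twist angle, torque} through point matrices for rigid discs and field matrices for massless shaft segments, and finds the free-free natural frequencies where the end torque vanishes; the inertia and stiffness values are arbitrary example data, and the lateral (bending) analysis and bearing models are omitted.

```python
# Transfer matrix sketch for a free-free torsional disc-shaft chain.
import numpy as np

J = [2.0, 1.5, 1.0]          # disc polar inertias [kg m^2] (assumed values)
k = [5.0e4, 8.0e4]           # shaft torsional stiffnesses [N m/rad]

def end_torque(omega):
    """Propagate state {theta, T} from the left end (theta=1, T=0) to the right end."""
    state = np.array([1.0, 0.0])
    for i, Ji in enumerate(J):
        state = np.array([[1.0, 0.0], [-omega**2 * Ji, 1.0]]) @ state   # disc (point matrix)
        if i < len(k):
            state = np.array([[1.0, 1.0 / k[i]], [0.0, 1.0]]) @ state   # shaft (field matrix)
    return state[1]           # residual end torque; zero at a natural frequency

# scan for sign changes, then refine by bisection
omegas = np.linspace(1.0, 1500.0, 15000)
vals = [end_torque(w) for w in omegas]
for w_lo, w_hi, v_lo, v_hi in zip(omegas[:-1], omegas[1:], vals[:-1], vals[1:]):
    if v_lo * v_hi < 0:
        for _ in range(60):
            w_mid = 0.5 * (w_lo + w_hi)
            if end_torque(w_mid) * v_lo < 0:
                w_hi = w_mid
            else:
                w_lo, v_lo = w_mid, end_torque(w_mid)
        print(f"natural frequency ~ {w_mid / (2 * np.pi):.1f} Hz")
```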
Procedia PDF Downloads 489
2782 Dynamic Compensation for Environmental Temperature Variation in the Coolant Refrigeration Cycle as a Means of Increasing Machine-Tool Precision
Authors: Robbie C. Murchison, Ibrahim Küçükdemiral, Andrew Cowell
Abstract:
Thermal effects are the largest source of dimensional error in precision machining, and a major proportion is caused by ambient temperature variation. The use of coolant is a primary means of mitigating these effects, but there has been limited work on coolant temperature control. This research critically explored whether CNC-machine coolant refrigeration systems adapted to actively compensate for ambient temperature variation could increase machining accuracy. Accuracy data were collected from operators' checklists for a CNC 5-axis mill and statistically reduced to bias and precision metrics for observations of one day over a sample period of 27 days. Temperature data were collected using three USB dataloggers in ambient air, the chiller inflow, and the chiller outflow. The accuracy and temperature data were analysed using Pearson correlation, then the thermodynamics of the system were described using system identification with MATLAB. It was found that 75% of thermal error is reflected in the hot coolant temperature but that this is negligibly dependent on ambient temperature. The effect of the coolant refrigeration process on hot coolant outflow temperature was also found to be negligible. Therefore, the evidence indicated that it would not be beneficial to adapt coolant chillers to compensate for ambient temperature variation. However, it is concluded that hot coolant outflow temperature is a robust and accessible source of thermal error data which could be used for prevention strategy evaluation or as the basis of other thermal error strategies.
Keywords: CNC manufacturing, machine-tool, precision machining, thermal error
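A tiny sketch of the correlation step described above: align daily accuracy metrics with the logged temperatures and compute Pearson coefficients. The file and column names are assumptions about how such logs might be stored.

```python
# Sketch: Pearson correlation between daily machining error and temperatures.
import pandas as pd
from scipy.stats import pearsonr

errors = pd.read_csv("daily_accuracy.csv", parse_dates=["date"])        # bias, precision
temps = pd.read_csv("datalogger_daily_means.csv", parse_dates=["date"])  # logger means
df = errors.merge(temps, on="date")

for col in ["ambient_c", "coolant_in_c", "coolant_out_c"]:
    r, p = pearsonr(df["bias_um"], df[col])
    print(f"bias vs {col}: r = {r:.2f} (p = {p:.3f})")
```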
Procedia PDF Downloads 88
2781 Assessment of DNA Sequence Encoding Techniques for Machine Learning Algorithms Using a Universal Bacterial Marker
Authors: Diego Santibañez Oyarce, Fernanda Bravo Cornejo, Camilo Cerda Sarabia, Belén Díaz Díaz, Esteban Gómez Terán, Hugo Osses Prado, Raúl Caulier-Cisterna, Jorge Vergara-Quezada, Ana Moya-Beltrán
Abstract:
The advent of high-throughput sequencing technologies has revolutionized genomics, generating vast amounts of genetic data that challenge traditional bioinformatics methods. Machine learning addresses these challenges by leveraging computational power to identify patterns and extract information from large datasets. However, biological sequence data, being symbolic and non-numeric, must be converted into numerical formats for machine learning algorithms to process effectively. So far, some encoding methods, such as one-hot encoding or k-mers, have been explored. This work proposes additional approaches for encoding DNA sequences in order to compare them with existing techniques and determine if they can provide improvements or if current methods offer superior results. Data from the 16S rRNA gene, a universal marker, were used to analyze eight bacterial groups that are significant in the pulmonary environment and have clinical implications. The bacterial genera included in this analysis are Prevotella, Abiotrophia, Acidovorax, Streptococcus, Neisseria, Veillonella, Mycobacterium, and Megasphaera. These data were downloaded from the NCBI database in GenBank file format, followed by a syntactic analysis to selectively extract relevant information from each file. For data encoding, a sequence normalization process was carried out as the first step. From approximately 22,000 initial data points, a subset was generated for testing purposes. Specifically, 55 sequences from each bacterial group met the length criteria, resulting in an initial sample of approximately 440 sequences. The sequences were encoded using different methods, including one-hot encoding, k-mers, the Fourier transform, and the wavelet transform. Various machine learning algorithms, such as support vector machines, random forests, and neural networks, were trained to evaluate these encoding methods. The performance of these models was assessed using multiple metrics, including the confusion matrix, ROC curve, and F1 score, providing a comprehensive evaluation of their classification capabilities. The results show that accuracies between encoding methods vary by up to approximately 15%, with the Fourier transform obtaining the best results for the evaluated machine learning algorithms. These findings, supported by the detailed analysis using the confusion matrix, ROC curve, and F1 score, provide valuable insights into the effectiveness of different encoding methods and machine learning algorithms for genomic data analysis, potentially improving the accuracy and efficiency of bacterial classification and related genomic studies.
Keywords: DNA encoding, machine learning, Fourier transform, Fourier transformation
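Two of the encodings compared in the study are sketched below on a toy sequence: one-hot encoding and a Fourier-transform representation of numerically mapped bases. The base-to-number mapping and the retained feature length are illustrative choices, not the study's exact preprocessing.

```python
# Sketch of one-hot and Fourier-transform encodings of a DNA sequence.
import numpy as np

BASES = "ACGT"

def one_hot(seq):
    m = np.zeros((len(seq), 4))
    for i, b in enumerate(seq):
        if b in BASES:
            m[i, BASES.index(b)] = 1.0
    return m.ravel()                      # flatten for a classical classifier

def fourier_encoding(seq, n_coeffs=64):
    mapping = {"A": 1.0, "C": 2.0, "G": 3.0, "T": 4.0}   # assumed mapping
    signal = np.array([mapping.get(b, 0.0) for b in seq])
    spectrum = np.abs(np.fft.rfft(signal))
    return spectrum[:n_coeffs]            # keep a fixed-length feature vector

seq = "ACGTTGCAACGTAGCTAGCTAGGCTAACGT"
print(one_hot(seq).shape, fourier_encoding(seq, 8))
```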
Procedia PDF Downloads 22
2780 Machine Learning for Exoplanetary Habitability Assessment
Authors: King Kumire, Amos Kubeka
Abstract:
The synergy of machine learning and advances in astronomical technology is giving rise to the new space age, which is pronounced by better habitability assessments. To initiate this discussion, it should be recorded for definition purposes that the symbiotic relationship between astronomy and improved computing has been code-named the Cis-Astro gateway concept. The cosmological fate of this phrase has been unashamedly plagiarized from the cis-lunar gateway template and its associated Lagrange points, which act as an orbital bridge to the moon from our planet Earth. However, for this study, the scientific audience is invited to bridge toward the discovery of new habitable planets. It is imperative to state that cosmic probes of this magnitude can be utilized as the starting nodes of the astrobiological search for galactic life. This research can also assist by acting as the navigation system for future space telescope launches through the delimitation of target exoplanets. The findings and the associated platforms can be harnessed as building blocks for the modeling of climate change on planet Earth. The notion that the human genus might exhaust the resources of planet Earth, or that a bug of some sort might make the Earth uninhabitable for humans, explains the need to find an alternative planet to inhabit. The scientific community, through interdisciplinary discussions of the International Astronautical Federation, so far has the common position that engineers can reduce space mission costs by constructing a stable cis-lunar orbit infrastructure for refilling and carrying out other associated in-orbit servicing activities. Similarly, the Cis-Astro gateway can be envisaged as a budget optimization technique that models extra-solar bodies and can facilitate the scoping of future mission rendezvous. It should be registered as well that this broad and voluminous catalog of exoplanets shall be narrowed along the way using machine learning filters. The gist of this topic revolves around the indirect economic rationale of establishing a habitability scoping platform.
Keywords: machine-learning, habitability, exoplanets, supercomputing
Procedia PDF Downloads 88
2778 Stable Tending Control of Complex Power Systems: An Example of Localized Design of Power System Stabilizers
Authors: Wenjuan Du
Abstract:
The phase compensation method was proposed based on the concept of damping torque analysis (DTA). It is a method for the design of a PSS (power system stabilizer) to suppress local-mode power oscillations in a single-machine infinite-bus power system. This paper presents the application of the phase compensation method to the design of a PSS in a multi-machine power system. The application is achieved by examining the direct damping contribution of the stabilizer to the power oscillations. By using the linearized equal area criterion, a theoretical proof of the application for PSS design is presented. Hence, the PSS design in the paper is an example of stable tending control by a localized method.
Keywords: phase compensation method, power system small-signal stability, power system stabilizer
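For context, a conventional speed-input PSS has the washout-plus-lead-lag structure shown below (textbook notation, not necessarily the paper's); in the phase compensation method, the lead-lag time constants are tuned so that the stabilizer path cancels the phase lag of the forward path G_ep(s) from stabilizer output to electric torque at the oscillation frequency, leaving an approximately pure damping torque contribution:

```latex
G_{\mathrm{PSS}}(s) = K_{s}\,\frac{sT_{w}}{1+sT_{w}}\cdot
\frac{1+sT_{1}}{1+sT_{2}}\cdot\frac{1+sT_{3}}{1+sT_{4}},
\qquad
\arg\!\left[G_{\mathrm{PSS}}(j\omega_{s})\,G_{\mathrm{ep}}(j\omega_{s})\right]\approx 0
```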
Procedia PDF Downloads 637
2777 Develop a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform
Authors: Reza Mohammadzadeh
Abstract:
The major challenges in geotechnical engineering in underground spaces arise from uncertainties and different probabilities. The collection, collation, and collaboration of existing data, to incorporate them in analysis and design for a given prospect evaluation, would be a reliable, practical problem-solving method under uncertainty. Machine learning (ML) is a subfield of artificial intelligence in statistical science which applies different techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming, etc.) to data to automatically learn and improve from them without being explicitly programmed, and to make decisions and predictions. In this paper, a conceptual database schema of geotechnical risks in underground coal mining based on a cloud system architecture has been designed. A new approach to risk assessment using a three-dimensional risk matrix supported by the level of knowledge (LoK) has been proposed in this model. Subsequently, the stages of the model workflow methodology have been described. For data training and LoK model deployment, an ML platform has been implemented. IBM Watson Studio, as a leading data science tool and data-driven cloud integration ML platform, is employed in this study. As a use case, a dataset of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and accordingly, the results have been outlined.
Keywords: data model, geotechnical risks, machine learning, underground coal mining
Procedia PDF Downloads 274
2776 A Survey on Ambient Intelligence in Agricultural Technology
Abstract:
Despite the advances made in various new technologies, the application of these technologies to agriculture still remains a formidable task, as it involves the integration of diverse domains for monitoring the different processes involved in agricultural management. Advances in ambient intelligence technology represent one of the most powerful means of increasing the yield of agricultural crops, mitigating the impact of water scarcity and climatic change, and providing methods for managing pests, weeds, and diseases. This paper proposes a GPS-assisted, machine-to-machine solution that combines information collected by multiple sensors for the automated management of paddy crops. To maintain the economic viability of paddy cultivation, the various techniques used in agriculture are discussed, and a novel system which uses ambient intelligence techniques is proposed in this paper. The ambient intelligence based agricultural system offers great scope.
Keywords: ambient intelligence, agricultural technology, smart agriculture, precise farming
Procedia PDF Downloads 605
2775 Operational Characteristics of the Road Surface Improvement
Authors: Iuri Salukvadze
Abstract:
Construction has played an important role in the history of mankind; there is hardly a product in our lives in which the builder's work is not materialized, because creating all of it requires setting up factories, roads, bridges, etc. The function of the Republic of Georgia as part of the Europe-Asia connecting transport corridor has significantly increased. In the context of this transit function, a large part of the cargo traffic belongs to motor transport; hence, the improvement of road transport infrastructure is rather important and raises new, increased operational demands for existing as well as new motor roads. Construction of a durable road surface involves rather large costs, but it offers high transport-operational properties, such as higher speeds, lower fuel consumption, less tire wear, etc. If the traffic intensity is high, the expenses are therefore recovered rapidly and income increases accordingly. If the traffic intensity is relatively small, it is recommended to use lightened road-carpet structures so that capital investments do not exceed the normative level. The road carpet is divided into the following basic types: asphaltic concrete and cement concrete. Asphaltic concrete is the most advanced type of road carpet. It is arranged in two or three layers on a rigid foundation and is compacted. Asphaltic concrete is an artificial building material whose layers are made of a selected and measured stone skeleton and sand, bound together by bitumen and a mixture of mineral powder. A similar but less strictly selected material is called a bitumen-mineral mixture. Asphaltic concrete is a non-rigid building material that withstands vertical loads well; it is less resistant to the impact of horizontal forces. Cement concrete is a monolithic and durable material; it withstands horizontal loads well and is less resistant to vertical loads. Cement concrete consists of strictly selected, measured stone material and sand; the binder is cement. A cement concrete road carpet consists of separate slabs with sizes from 3–5 up to 6–8 meters. The slabs are reinforced by a rather complex system. Between the slabs, joints are arranged that are designed to avoid additional stresses due to temperature fluctuations along the length of the slabs. For the joint behavior of the separate slabs, they are connected by metal rods. The rods accommodate changes in the length of the slabs and distribute vertical forces and bending moments to the slabs. The foundation layers must be extremely durable, which requires high-quality stone material, cement, and metal. The qualification work aims to improve traffic conditions on motor roads, to prolong their operational life, and to improve their characteristics. The work consists of three chapters, 80 pages, 5 tables and 5 figures. It presents general concepts as well as tests carried out by various companies using modern methods, together with their results. Chapter III presents the tests we carried out on this issue and specific examples of improving the operational characteristics.
Keywords: asphalt, cement, cylindrical sample of asphalt, building
Procedia PDF Downloads 221
2774 Design and Development of Tandem Dynamometer for Testing and Validation of Motor Performance Parameters
Authors: Vedansh More, Lalatendu Bal, Ronak Panchal, Atharva Kulkarni
Abstract:
The project aims at developing a cost-effective test bench capable of testing and validating the complete powertrain package of an electric vehicle. An Emrax 228 high-voltage synchronous motor was selected as the prime mover for the study. A tandem-type dynamometer was developed, comprising two loading methods: inertial, using standard inertia rollers, and absorptive, using a separately excited DC generator with resistive coils. The absorptive loading of the prime mover was achieved by implementing a converter circuit through which the duty cycle of the input field voltage was controlled. This control was efficacious in changing the magnetic flux and hence the generated voltage, which was ultimately dropped across resistive coils assembled in a load bank in an all-parallel configuration. The prime mover and loading elements were connected via a chain drive with a 2:1 reduction ratio, which allows flexibility in the placement of components and a relaxed rating of the DC generator. The development will not only aid in the determination of essential characteristics like torque-RPM, power-RPM, torque factor, RPM factor, heat loads of devices, and battery pack state-of-charge efficiency, but will also provide a significant financial advantage over existing versions of dynamometers with its cost-effective solution.
Keywords: absorptive load, chain drive, chordal action, DC generator, dynamometer, electric vehicle, inertia rollers, load bank, powertrain, pulse width modulation, reduction ratio, road load, testbench
Procedia PDF Downloads 228
2773 Hybrid Approach for Software Defect Prediction Using Machine Learning with Optimization Technique
Authors: C. Manjula, Lilly Florence
Abstract:
Software technology is developing rapidly, which leads to the growth of various industries. Nowadays, software-based applications have been adopted widely for business purposes. For any software industry, the development of reliable software is becoming a challenging task because a faulty software module may be harmful to the growth of the industry and business. Hence, there is a need to develop techniques which can be used for the early prediction of software defects. Due to the complexities of manual prediction, automated software defect prediction techniques have been introduced. These techniques are based on learning patterns from previous software versions and finding the defects in the current version. They have attracted researchers due to their significant impact on industrial growth by identifying the bugs in software. Based on this, several studies have been carried out, but achieving desirable defect prediction performance is still a challenging task. To address this issue, here we present a machine learning based hybrid technique for software defect prediction. First of all, a Genetic Algorithm (GA) is presented, where an improved fitness function is used for better optimization of features in the data sets. Later, these features are processed through a Decision Tree (DT) classification model. Finally, an experimental study is presented where results from the proposed GA-DT based hybrid approach are compared with those from the DT classification technique. The results show that the proposed hybrid approach achieves better classification accuracy.
Keywords: decision tree, genetic algorithm, machine learning, software defect prediction
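A hedged sketch of the hybrid idea follows: a small genetic algorithm searches binary feature masks, with fitness defined as a Decision Tree's cross-validated accuracy on the selected features. The GA settings and the synthetic data are illustrative, and the paper's improved fitness function is not reproduced.

```python
# GA-based feature selection wrapped around a Decision Tree classifier.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_classification

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=20, n_informative=6,
                           random_state=0)

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    scores = cross_val_score(DecisionTreeClassifier(random_state=0),
                             X[:, mask.astype(bool)], y, cv=5)
    return scores.mean()

pop = rng.integers(0, 2, size=(30, X.shape[1]))          # random feature masks
for gen in range(20):
    fits = np.array([fitness(ind) for ind in pop])
    # tournament selection of parents
    parents = pop[[max(rng.choice(len(pop), 3, replace=False), key=lambda i: fits[i])
                   for _ in range(len(pop))]]
    # single-point crossover + bit-flip mutation
    children = parents.copy()
    for i in range(0, len(children) - 1, 2):
        cut = rng.integers(1, X.shape[1])
        children[i, cut:], children[i + 1, cut:] = (parents[i + 1, cut:].copy(),
                                                    parents[i, cut:].copy())
    flip = rng.random(children.shape) < 0.02
    children[flip] = 1 - children[flip]
    pop = children

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best), "CV accuracy:", fitness(best))
```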
Procedia PDF Downloads 328
2772 Artificial Intelligence-Based Thermal Management of Battery System for Electric Vehicles
Authors: Raghunandan Gurumurthy, Aricson Pereira, Sandeep Patil
Abstract:
The escalating adoption of electric vehicles (EVs) across the globe has underscored the critical importance of advancing battery system technologies. This has catalyzed a shift towards the design and development of battery systems that not only exhibit higher energy efficiency but also boast enhanced thermal performance and sophisticated multi-material enclosures. A significant leap in this domain has been the incorporation of simulation-based design optimization for battery packs and Battery Management Systems (BMS), a move further enriched by integrating artificial intelligence/machine learning (AI/ML) approaches. These strategies are pivotal in refining the design, manufacturing, and operational processes for electric vehicles and energy storage systems. By leveraging AI/ML, stakeholders can now predict battery performance metrics—such as State of Health, State of Charge, and State of Power—with unprecedented accuracy. Furthermore, as Li-ion batteries (LIBs) become more prevalent in urban settings, the imperative for bolstering thermal and fire resilience has intensified. This has propelled Battery Thermal Management Systems (BTMs) to the forefront of energy storage research, highlighting the role of machine learning and AI not just as tools for enhanced safety management through accurate temperature forecasts and diagnostics but also as indispensable allies in the early detection and warning of potential battery fires.
Keywords: electric vehicles, battery thermal management, industrial engineering, machine learning, artificial intelligence, manufacturing
Procedia PDF Downloads 95
2771 Machining Stability of a Milling Machine with Different Preloaded Spindle
Authors: Jui-Pin Hung, Qiao-Wen Chang, Kung-Da Wu, Yong-Run Chen
Abstract:
This study aimed to investigate the machining stability of a spindle tool with different preload amounts. To this end, vibration tests were conducted on the spindle unit with different preloads to assess the dynamic characteristics and machining stability of the spindle unit. Current results demonstrate that the tool-tip frequency response characteristics and the machining stabilities in the X and Y directions change for spindles with different preloads. As can be found from the results, a spindle tool with high preload shows a higher limiting cutting depth at the mid position, while a spindle with low preload shows a higher limiting depth. This implies that the machining stability of the spindle tool system is affected by the machine frame structure. Besides, such an effect is quite different and varies with the preload of the spindle.
Keywords: bearing preload, dynamic compliance, machining stability, spindle
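The link between a measured tool-tip frequency response function and the limiting (chatter-free) depth of cut can be illustrated with the classical single-frequency criterion below; the modal parameters and cutting coefficient are assumed numbers, and the full milling-specific stability-lobe construction (directional factors, spindle-speed mapping) is omitted.

```python
# Illustrative link between a tool-tip FRF and the limiting depth of cut,
# using the classical single-frequency chatter criterion
#   a_lim(w) = -1 / (2 * Ks * Re[G(jw)])   (valid where Re[G] < 0).
import numpy as np

# synthetic single-mode tool-tip FRF: G(jw) = 1 / (k - m w^2 + j c w)
m, c, k = 0.05, 80.0, 2.0e7          # kg, N s/m, N/m  (assumed)
Ks = 6.0e8                           # specific cutting force coefficient, N/m^2

w = 2 * np.pi * np.linspace(2000, 4500, 2000)     # rad/s
G = 1.0 / (k - m * w**2 + 1j * c * w)

neg = G.real < 0
a_lim = np.full_like(w, np.nan)
a_lim[neg] = -1.0 / (2.0 * Ks * G.real[neg])      # metres

print("minimum limiting depth ~ %.2f mm" % (1e3 * np.nanmin(a_lim)))
```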
Procedia PDF Downloads 384
2770 Machine Learning Techniques for Estimating Ground Motion Parameters
Authors: Farid Khosravikia, Patricia Clayton
Abstract:
The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site condition. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and potentially subsequent risk assessment of different types of structures. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of the linear regression methods, such models may not capture more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates potential benefits from employing other machine learning techniques as a statistical method in ground motion prediction, such as Artificial Neural Networks, Random Forests, and Support Vector Machines. The algorithms are adjusted to quantify event-to-event and site-to-site variability of the ground motions by implementing them as random effects in the proposed models to reduce the aleatory uncertainty. All the algorithms are trained using a selected database of 4,528 ground motions, including 376 seismic events with magnitude 3 to 5.8, recorded over the hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. The main reason for considering this database stems from the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for these states. The accuracy of the models in predicting intensity measures, the generalization capability of the models for future data, as well as the usability of the models are discussed in the evaluation process. The results indicate the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data is available, all the alternative algorithms tend to provide more accurate estimates compared to the conventional linear regression-based method, and particularly, Random Forest outperforms the other algorithms. However, the conventional method is a better tool when limited data is available.
Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine
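A minimal sketch of the statistical setup is given below: a Random Forest maps magnitude, distance, and a site term to the logarithm of the intensity measure and is compared with an ordinary linear model. The flat-file path and column names are assumptions, and the random-effects treatment of event and site variability is not reproduced.

```python
# Sketch: compare a linear model and a Random Forest for log-PGA prediction.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

df = pd.read_csv("gm_flatfile.csv")                      # hypothetical flat file
X = pd.DataFrame({"mag": df["magnitude"],
                  "log_rhyp": np.log10(df["rhyp_km"]),
                  "vs30": df["vs30"]})
y = np.log10(df["pga_g"])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
for name, model in [("linear", LinearRegression()),
                    ("random forest", RandomForestRegressor(n_estimators=400,
                                                            random_state=0))]:
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name}: RMSE(log10 PGA) = {rmse:.3f}")
```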
Procedia PDF Downloads 121