Search results for: predictive data mining

24808 Hydro Geochemistry and Water Quality in a River Affected by Lead Mining in Southern Spain

Authors: Rosendo Mendoza, María Carmen Hidalgo, María José Campos-Suñol, Julián Martínez, Javier Rey

Abstract:

The impact of mining environmental liabilities and mine drainage on surface water quality has been investigated in the hydrographic basin of the La Carolina mining district (southern Spain). This abandoned mining district is characterized by important mineralizations of Pb-Ag sulfoantimonides and Cu-Fe sulfides. All surface waters reach the main river of this mining area, the Grande River, which ends its course in the Rumblar reservoir. This waterbody is intended to supply water to 89,000 inhabitants, as well as for irrigation and livestock. Therefore, the analysis and control of the metal(loid) concentrations in these surface waters is an important issue because of the potential pollution derived from metallic mining. A hydrogeochemical campaign consisting of 20 water sampling points was carried out in the hydrographic network of the Grande River, as well as at two sampling points in the Rumblar reservoir and at the main tailings impoundment draining to the river. Although acid mine drainage (pH below 4) is discharged into the Grande River from some mine adits, the pH values in the river water are always neutral or slightly alkaline. This is mainly the result of the dilution of the small volumes of mine water by the net alkaline waters of the river. However, during the dry season, the surface waters present high mineralization due to a constant discharge from the abandoned flooded mines and a decrease in the contribution of surface runoff. The concentrations of dissolved Cd and Pb in the water reach values of 2 and 81 µg/l, respectively, exceeding the limits established by the Environmental Quality Standard for surface water. In addition, the concentrations of dissolved As, Cu, and Pb in the waters of the Rumblar reservoir reached values of 10, 20, and 11 µg/l, respectively. These values are higher than the maximum allowable concentrations for human consumption, which is especially alarming.

Keywords: environmental quality, hydrogeochemistry, metal mining, surface water

Procedia PDF Downloads 129
24807 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training purposes. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is whether the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and an accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data, e.g., the area under the curve for some models is higher than 80% when trained on data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
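
A minimal sketch (not the authors' code) of the simulation idea: a linear-kernel SVM is trained on labels corrupted at a 40% error rate, and its accuracy against clean test labels is compared with the accuracy of the noisy training labels themselves. The synthetic dataset and its dimensions are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

error_rate = 0.4                                  # fraction of training labels flipped
flip = rng.random(len(y_train)) < error_rate
y_noisy = np.where(flip, 1 - y_train, y_train)    # imperfect ("weak") labels

model = LinearSVC(C=1.0, max_iter=10000).fit(X_train, y_noisy)
print("accuracy of the noisy training labels:", accuracy_score(y_train, y_noisy))
print("model accuracy on clean test labels:", accuracy_score(y_test, model.predict(X_test)))
```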

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 177
24806 Customer Preference in the Textile Market: Fabric-Based Analysis

Authors: Francisca Margarita Ocran

Abstract:

Underwear, and more particularly bras and panties, are defined as intimate clothing. Strictly speaking, they enhance the place of women in the public or private sphere. Therefore, women's lingerie is a complex garment with a high involvement profile, motivating consumers to buy it not only for its functional utility but also for the multisensory experience it provides them. Customer behavior models are generally based on customer data mining, and each model is designed to answer questions at a specific time. Predicting the customer experience is uncertain and difficult. Thus, consumers' tastes in lingerie deserve to be treated as an experiential product, where the dimensions of the experience motivating consumers to buy a lingerie product and to remain faithful to it must be analyzed in detail by manufacturers and retailers in order to engage and retain consumers. This is why this research aims to identify the variables that push consumers to choose their lingerie product, based on an in-depth analysis of the types of fabrics used to make lingerie. The data used in this study come from online purchases. A machine learning approach using the Python programming language and PyCaret gives precisions of 86.34%, 85.98%, and 84.55% for the three algorithms used to classify a buyer's preference across a range of lingerie. Gradient boosting, random forest, and k-nearest neighbors were used in this study; they show strong promise for classifying preferences in the textile industry.
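
A minimal sketch of the PyCaret workflow described above; the input file and column names are hypothetical placeholders for the online purchase data, and the model codes correspond to gradient boosting ('gbc'), random forest ('rf'), and k-nearest neighbors ('knn').

```python
import pandas as pd
from pycaret.classification import setup, compare_models

# hypothetical export of the online purchase data, with a binary "preferred" label
df = pd.read_csv("lingerie_purchases.csv")

setup(data=df, target="preferred", session_id=42)
# compare gradient boosting, random forest and k-nearest neighbors, as in the study
best = compare_models(include=["gbc", "rf", "knn"])
print(best)
```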

Keywords: consumer behavior, data mining, lingerie, machine learning, preference

Procedia PDF Downloads 70
24805 Industrial Process Mining Based on Data Pattern Modeling and Nonlinear Analysis

Authors: Hyun-Woo Cho

Abstract:

Unexpected events may occur with serious impacts on industrial processes. This work utilizes a data representation technique to model and analyze process data patterns for the purpose of diagnosis. In this work, the use of a triangular representation of process data is evaluated on a simulated process. Furthermore, the effects of different pre-treatment techniques, based on linear or nonlinear reduced spaces, are compared. This work extracts the fault pattern in the reduced space, not in the original data space. The results show that the diagnosis method based on the nonlinear technique produces more reliable results and outperforms the linear method.
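
A minimal sketch (not the paper's implementation) contrasting a linear reduced space with a nonlinear one for fault-pattern extraction, here using PCA and kernel PCA as representative techniques; the simulated process data are an illustrative assumption.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(1)
normal = rng.normal(0.0, 1.0, size=(200, 10))          # normal operating data
fault = rng.normal(0.0, 1.0, size=(50, 10)) + 2.0      # shifted fault pattern
X = np.vstack([normal, fault])

linear_space = PCA(n_components=2).fit_transform(X)
nonlinear_space = KernelPCA(n_components=2, kernel="rbf", gamma=0.05).fit_transform(X)

# diagnosis is then performed on the fault pattern in the reduced space,
# not in the original 10-dimensional data space
print(linear_space.shape, nonlinear_space.shape)
```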

Keywords: process monitoring, data analysis, pattern modeling, fault, nonlinear techniques

Procedia PDF Downloads 373
24804 Application Potential of Forward Osmosis-Nanofiltration Hybrid Process for the Treatment of Mining Waste Water

Authors: Ketan Mahawer, Abeer Mutto, S. K. Gupta

Abstract:

Mining wastewater contains inorganic metal salts, which makes it saline and additionally contributes to contaminating the surface and underground freshwater reserves that exist near mineral processing industries. Therefore, treatment of the wastewater and water recovery by any available technology is obligatory before it is discharged into the environment. Currently, reverse osmosis (RO) is the commercially accepted conventional membrane process for saline wastewater treatment, but it consumes an enormous amount of energy, which makes the process expensive. To solve this industrial problem with minimum energy consumption, we tested the feasibility of a forward osmosis-nanofiltration (FO-NF) hybrid process for mining wastewater treatment. Experimental results for the FO-NF process treating 0.029 M saline wastewater with a 0.42 M sodium-sulfate-based draw solution show that its specific energy consumption was slightly higher (by 0.5-1 kWh/m³) than that of the conventional standalone NF process. However, the average freshwater recovery was 30% higher than that of standalone NF under the same feed and operating conditions. Hence, the FO-NF process, in place of RO/NF, offers a huge possibility for treating mining industry wastewater and concentrating the metals as by-products without consuming an excessive amount of energy. In addition, it mitigates fouling over long treatment periods, which also decreases the maintenance and replacement costs of the separation process.

Keywords: forward osmosis, nanofiltration, mining, draw solution, divalent solute

Procedia PDF Downloads 104
24803 Revolutionizing Accounting: Unleashing the Power of Artificial Intelligence

Authors: Sogand Barghi

Abstract:

The integration of artificial intelligence (AI) in accounting practices is reshaping the landscape of financial management. This paper explores the innovative applications of AI in the realm of accounting, emphasizing its transformative impact on efficiency, accuracy, decision-making, and financial insights. By harnessing AI's capabilities in data analysis, pattern recognition, and automation, accounting professionals can redefine their roles, elevate strategic decision-making, and unlock unparalleled value for businesses. This paper delves into AI-driven solutions such as automated data entry, fraud detection, predictive analytics, and intelligent financial reporting, highlighting their potential to revolutionize the accounting profession. Artificial intelligence has swiftly emerged as a game-changer across industries, and accounting is no exception. This paper seeks to illuminate the profound ways in which AI is reshaping accounting practices, transcending conventional boundaries, and propelling the profession toward a new era of efficiency and insight-driven decision-making. One of the most impactful applications of AI in accounting is automation. Tasks that were once labor-intensive and time-consuming, such as data entry and reconciliation, can now be streamlined through AI-driven algorithms. This not only reduces the risk of errors but also allows accountants to allocate their valuable time to more strategic and analytical tasks. AI's ability to analyze vast amounts of data in real time enables it to detect irregularities and anomalies that might go unnoticed by traditional methods. Fraud detection algorithms can continuously monitor financial transactions, flagging any suspicious patterns and thereby bolstering financial security. AI-driven predictive analytics can forecast future financial trends based on historical data and market variables. This empowers organizations to make informed decisions, optimize resource allocation, and develop proactive strategies that enhance profitability and sustainability. Traditional financial reporting often involves extensive manual effort and data manipulation. With AI, reporting becomes more intelligent and intuitive. Automated report generation not only saves time but also ensures accuracy and consistency in financial statements. While the potential benefits of AI in accounting are undeniable, there are challenges to address. Data privacy and security concerns, the need for continuous learning to keep up with evolving AI technologies, and potential biases within algorithms demand careful attention. The convergence of AI and accounting marks a pivotal juncture in the evolution of financial management. By harnessing the capabilities of AI, accounting professionals can transcend routine tasks, becoming strategic advisors and data-driven decision-makers. The applications discussed in this paper underline the transformative power of AI, setting the stage for an accounting landscape that is smarter, more efficient, and more insightful than ever before. The future of accounting is here, and it's driven by artificial intelligence.

Keywords: artificial intelligence, accounting, automation, predictive analytics, financial reporting

Procedia PDF Downloads 52
24802 Implementation of Knowledge and Attitude Management Based on Holistic Approach in Andragogy Learning, as an Effort to Solve the Environmental Problems of Post-Coal Mining Activity

Authors: Aloysius Hardoko, Susilo

Abstract:

The root cause of environmental damage from coal mining activities in East Kalimantan province, a corridor of the masterplan for the acceleration and expansion of Indonesia's economic development (MP3EI), is the behavior of adults. Adult behavior can be changed through knowledge and attitude management. Based on this root cause, the objective of the research is to apply knowledge and attitude management based on a holistic approach in andragogy learning as an effort to solve environmental problems after coal mining activities. To achieve this objective, a quantitative method with a pretest-posttest group design was used. Knowledge and attitude management based on a holistic approach in adult learning is applied through opening, core, and closing learning activities built around cases of environmental damage. The research instrument is a description of a case of environmental damage. The data analysis uses a t-test to assess the effect of knowledge and attitude management based on the holistic approach before and after adult learning. The research location and sample comprised 20 adults in Kutai Kertanegara District, one of the districts in East Kalimantan province that suffered the worst environmental damage. The research concludes that the application of knowledge and attitude management in adult learning influences adults' knowledge and attitudes toward overcoming environmental problems after coal mining activity.

Keywords: knowledge management and attitude, holistic approach, andragogy learning, environmental issue

Procedia PDF Downloads 192
24801 Predictive Maintenance Based on Oil Analysis Applicable to Transportation Fleets

Authors: Israel Ibarra Solis, Juan Carlos Rodriguez Sierra, Ma. del Carmen Salazar Hernandez, Isis Rodriguez Sanchez, David Perez Guerrero

Abstract:

In the present paper, we explain the analysis techniques used on the lubricating oil over a maintenance period of a city bus (Mercedes Benz Boxer 40) operating on the ‘R-24 route’ of the Coecillo Centro SA de CV line in Leon, Guanajuato, in order to estimate the optimal time for an oil change. Devices such as the rotational viscometer and the atomic absorption spectrometer can detect, at an incipient stage, when the oil loses its lubricating properties and can therefore no longer protect the mechanical components of diesel engines such as those in these buses. Timely detection of these lost properties in the oil allows us to plan preventive maintenance for the fleet.

Keywords: atomic absorption spectrometry, maintenance, predictive velocity rate, lubricating oils

Procedia PDF Downloads 547
24800 Heavy Metal Pollution of the Soils around the Mining Area near Shamlugh Town (Armenia) and Related Risks to the Environment

Authors: G. A. Gevorgyan, K. A. Ghazaryan, T. H. Derdzyan

Abstract:

The heavy metal pollution of the soils around the mining area near Shamlugh town and the related risks to human health were assessed. The investigations showed that the soils were polluted with heavy metals that can be ranked by anthropogenic pollution degree as follows: Cu>Pb>As>Co>Ni>Zn. The main sources of the anthropogenic metal pollution of the soils were the copper mining area near Shamlugh town, the Chochkan tailings storage facility, and the trucks transferring ore from the mining area. The degree of copper pollution at some observation sites was unacceptable for agricultural production. The total non-carcinogenic chronic hazard index (THI) values in some places, including observation sites in Shamlugh town, were above the safe level (THI<1) for children living in this territory. Although the highest heavy metal enrichment in the soils was registered for copper, the highest health risks to humans, especially children, were posed by cobalt, which is explained by the fact that heavy metals have different toxicity levels and penetration characteristics.

Keywords: Armenia, copper mine, heavy metal pollution of soil, health risks

Procedia PDF Downloads 402
24799 Reinforcement Learning for Quality-Oriented Production Process Parameter Optimization Based on Predictive Models

Authors: Akshay Paranjape, Nils Plettenberg, Robert Schmitt

Abstract:

Producing faulty products can be costly for manufacturing companies and wastes resources. To reduce scrap rates in manufacturing, process parameters can be optimized using machine learning. Thus far, research has mainly focused on optimizing specific processes using traditional algorithms. To develop a framework that enables real-time optimization based on a predictive model for an arbitrary production process, this study explores the application of reinforcement learning (RL) in this field. Based on a thorough review of the literature on RL and process parameter optimization, a model based on maximum a posteriori policy optimization that can handle both numerical and categorical parameters is proposed. A case study compares the model to state-of-the-art traditional algorithms and shows that RL can find optima of similar quality while requiring significantly less time. These results are confirmed in a large-scale validation study on data sets from both production and other fields. Finally, multiple ways to improve the model are discussed.
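
A highly simplified sketch of the idea (not the paper's maximum a posteriori policy optimization implementation): a predictive quality model is learned from historical process data, and a cross-entropy-method search stands in for the RL agent proposing optimized parameters. The parameter names, bounds, and quality function are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# 1) Train a predictive model of product quality from historical process data
params_hist = rng.uniform([150.0, 10.0], [250.0, 60.0], size=(500, 2))   # temp, pressure
quality_hist = -((params_hist[:, 0] - 200) ** 2) / 100 - ((params_hist[:, 1] - 35) ** 2) / 10
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(params_hist, quality_hist)

# 2) Search for parameters that maximize predicted quality (stand-in for the RL agent)
mean, std = np.array([200.0, 35.0]), np.array([30.0, 15.0])
for _ in range(20):
    candidates = rng.normal(mean, std, size=(200, 2))
    scores = model.predict(candidates)
    elite = candidates[np.argsort(scores)[-20:]]          # keep the best 10%
    mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-6

print("recommended parameters (temp, pressure):", mean.round(2))
```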

Keywords: reinforcement learning, production process optimization, evolutionary algorithms, policy optimization, actor critic approach

Procedia PDF Downloads 79
24798 Did Nature of Job Matters - Impact of Perceived Job Autonomy on Turnover Intention in Sales and Marketing Managers: Moderating Effect of Procedural and Distributive Justice

Authors: Muhammad Babar Shahzad

Abstract:

The purpose of our study is to investigate the relationship between perceived job autonomy and turnover intention in sales and marketing staff. Perceived job autonomy is one of the most studied dimensions of the Job Characteristics Model, yet there is still confusion among scholars about its predictive role in turnover intention. In line with more complex research on this relationship, we investigated the relationship between perceived job autonomy and turnover intention and whether the nature of the job has any impact on it. Following calls from several authors, we examined the interactive effect of perceived job autonomy and procedural justice on turnover intention. The predictive role of distributive justice for employee outcomes is undeniable, but it is prone to different contextual influences. The interactive role of distributive justice and perceived job autonomy has also not been tested before. We collected data from 279 marketing and sales managers working in financial institutions, FMCG industries, the pharmaceutical industry, and banks. A strong, direct negative relationship was found between perceived job autonomy, distributive justice, and procedural justice and turnover intention. Distributive and procedural justice also amplify the negative relationship between perceived job autonomy and turnover intention. Limitations and future directions for research are also discussed.

Keywords: perceived job autonomy, turnover intention, procedural justice, distributive justice

Procedia PDF Downloads 495
24797 Innovative Predictive Modeling and Characterization of Composite Material Properties Using Machine Learning and Genetic Algorithms

Authors: Hamdi Beji, Toufik Kanit, Tanguy Messager

Abstract:

This study aims to construct a predictive model proficient in foreseeing the linear elastic and thermal characteristics of composite materials, drawing on a multitude of influencing parameters. These parameters encompass the shape of inclusions (circular, elliptical, square, triangular), their spatial coordinates within the matrix, orientation, volume fraction (ranging from 0.05 to 0.4), and variations in contrast (spanning from 10 to 200). A variety of machine learning techniques are deployed, including decision trees, random forests, support vector machines, k-nearest neighbors, and an artificial neural network (ANN), to facilitate this predictive model. Moreover, this research goes beyond the predictive aspect by delving into an inverse analysis using genetic algorithms. The intent is to unveil the intrinsic characteristics of composite materials by evaluating their thermomechanical responses. The foundation of this research lies in the establishment of a comprehensive database that accounts for the array of input parameters mentioned earlier. This database, enriched with this diversity of input variables, serves as a bedrock for the creation of machine learning and genetic algorithm-based models. These models are meticulously trained not only to predict but also to elucidate the mechanical and thermal behavior of composite materials. Remarkably, the coupling of machine learning and genetic algorithms has proven highly effective, yielding predictions with remarkable accuracy, boasting scores ranging between 0.97 and 0.99. This achievement marks a significant breakthrough, demonstrating the potential of this innovative approach in the field of materials engineering.
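
A condensed sketch of the two-stage idea (forward prediction followed by genetic-algorithm inverse analysis), not the authors' code; the microstructure descriptors and the synthetic effective-property function are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Forward model: predict an effective property from (volume fraction, contrast)
vf = rng.uniform(0.05, 0.4, 2000)
contrast = rng.uniform(10, 200, 2000)
X = np.column_stack([vf, contrast])
prop = 1.0 + vf * np.log(contrast)            # stand-in for homogenized responses
forward = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, prop)

# Inverse analysis: a tiny genetic algorithm searching for parameters whose
# predicted response matches a measured thermomechanical response
target = 2.2
pop = np.column_stack([rng.uniform(0.05, 0.4, 100), rng.uniform(10, 200, 100)])
for _ in range(50):
    fitness = -np.abs(forward.predict(pop) - target)
    parents = pop[np.argsort(fitness)[-20:]]                       # selection
    children = parents[rng.integers(0, 20, 100)] + rng.normal(0, [0.01, 2.0], (100, 2))
    pop = np.clip(children, [0.05, 10], [0.4, 200])                # mutation + bounds

best = pop[np.argmax(-np.abs(forward.predict(pop) - target))]
print("identified (volume fraction, contrast):", best.round(3))
```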

Keywords: machine learning, composite materials, genetic algorithms, mechanical and thermal properties

Procedia PDF Downloads 44
24796 A Location Routing Model for the Logistic System in the Mining Collection Centers of the Northern Region of Boyacá-Colombia

Authors: Erika Ruíz, Luis Amaya, Diego Carreño

Abstract:

The main objective of this study is to design a mathematical model for the logistics of the mining collection centers in the northern region of the department of Boyacá (Colombia), determining the structure that facilitates the flow of products along the supply chain. In order to achieve this, it is necessary to define a suitable design of the distribution network, taking into account the products, the customers' characteristics, and the availability of information. Likewise, some other aspects must be defined, such as the number and capacity of collection centers to establish and the routes that must be taken to deliver products to the customers, among others. This research uses an operations research formulation for the design of distribution networks known as the Location Routing Problem (LRP).
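
A minimal sketch of the location side of such a model using PuLP; the routing component of the full LRP is omitted for brevity, so this reduces to a capacitated facility-location formulation, and the candidate centers, demands, and costs are illustrative assumptions.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

centers = ["C1", "C2", "C3"]                       # candidate collection centers
mines = ["M1", "M2", "M3", "M4"]                   # mining production points
open_cost = {"C1": 100, "C2": 120, "C3": 90}
ship_cost = {(m, c): 10 + 3 * (i + j) for i, m in enumerate(mines) for j, c in enumerate(centers)}
demand = {"M1": 20, "M2": 35, "M3": 25, "M4": 30}
capacity = {"C1": 60, "C2": 80, "C3": 50}

y = LpVariable.dicts("open", centers, cat=LpBinary)
x = LpVariable.dicts("assign", [(m, c) for m in mines for c in centers], cat=LpBinary)

prob = LpProblem("mining_collection_centers", LpMinimize)
prob += lpSum(open_cost[c] * y[c] for c in centers) + \
        lpSum(ship_cost[m, c] * x[m, c] for m in mines for c in centers)
for m in mines:
    prob += lpSum(x[m, c] for c in centers) == 1                    # every mine served once
for c in centers:
    prob += lpSum(demand[m] * x[m, c] for m in mines) <= capacity[c] * y[c]

prob.solve()
print("centers to open:", [c for c in centers if value(y[c]) > 0.5])
```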

Keywords: location routing problem, logistic, mining collection, model

Procedia PDF Downloads 203
24795 Inclusion of Students with Disabilities (SWD) in Higher Education Institutions (HEIs): Self-Advocacy and Engagement as Central

Authors: Tadesse Abera

Abstract:

This study aimed to investigate the contribution of self-advocacy and engagement to the inclusion of SWDs in HEIs. A convergent parallel mixed methods design was employed; this article reports the quantitative strand. A total of 246 SWDs were selected through a stratified proportionate random sampling technique from five public HEIs in Ethiopia. Data were collected through a self-advocacy questionnaire, a student engagement scale, and a college student experience questionnaire, and analyzed through frequency, percentage, mean, standard deviation, correlation, one-sample t-test, and multiple regression. Both self-advocacy and engagement were found to have predictive power on the inclusion of respondents in the HEIs, with engagement being the stronger predictor. Among the components of self-advocacy, knowledge of self and leadership, and among the engagement dimensions, sense of belonging, cognitive engagement, and valuing, in that order, were found to have the strongest predictive power on the inclusion of respondents in the institutions. Based on the findings, it was concluded that if students with disabilities work hard to be self-determined, strive to realize social justice, exert quality effort, and seek active involvement, their inclusion in the institutions would be ensured.

Keywords: self-advocacy, engagement, inclusion, students with disabilities, higher education institution

Procedia PDF Downloads 59
24794 Computer-Assisted Management of Building Climate and Microgrid with Model Predictive Control

Authors: Vinko Lešić, Mario Vašak, Anita Martinčević, Marko Gulin, Antonio Starčić, Hrvoje Novak

Abstract:

With 40% of total world energy consumption, building systems are developing into technically complex large energy consumers suitable for the application of sophisticated power management approaches that largely increase energy efficiency and even make them active energy market participants. A centralized control system of building heating and cooling managed by economically optimal model predictive control shows promising results, with an estimated 30% increase in energy efficiency. The research is focused on the implementation of such a method in a case study performed on two floors of our faculty building, with corresponding wireless sensor data acquisition, remote heating/cooling units, and a central climate controller. Building walls are mathematically modeled with their corresponding material types, surface shapes, and sizes. The models are then exploited to predict thermal characteristics and changes in different building zones. Exterior influences such as environmental conditions and weather forecasts, people's behavior, and comfort demands are all taken into account when deriving price-optimal climate control. Finally, a DC microgrid with photovoltaics, a wind turbine, a supercapacitor, batteries, and fuel cell stacks is added to make the building a unit capable of active participation in a price-varying energy market. The computational burden of applying model predictive control to such a complex system is relaxed through a hierarchical decomposition of the microgrid and climate control, where the former is designed as the higher hierarchical level with pre-calculated price-optimal power flow control, and the latter is designed as the lower-level control responsible for ensuring thermal comfort and exploiting the optimal supply conditions enabled by microgrid energy flow management. Such an approach is expected to enable the inclusion of more complex building subsystems in order to further increase energy efficiency.
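
A toy sketch of the price-optimal climate MPC idea for a single zone with a linear thermal model, written with cvxpy; the dynamics coefficients, price profile, and comfort band are illustrative assumptions, not the project's controller.

```python
import numpy as np
import cvxpy as cp

N = 24                                            # prediction horizon (hours)
a, b = 0.9, 0.5                                   # zone dynamics: T[k+1] = a*T[k] + b*u[k] + d[k]
d = 0.1 * np.sin(np.linspace(0, 2 * np.pi, N))    # weather disturbance forecast
price = np.where(np.arange(N) < 8, 0.08, 0.20)    # time-varying energy price

u = cp.Variable(N)                                # heating/cooling power
T = cp.Variable(N + 1)                            # zone temperature

constraints = [T[0] == 21.0, u >= -10, u <= 10]
for k in range(N):
    constraints += [T[k + 1] == a * T[k] + b * u[k] + d[k],
                    T[k + 1] >= 20.0, T[k + 1] <= 24.0]     # comfort band

cost = price @ cp.abs(u)                          # price-optimal energy use
cp.Problem(cp.Minimize(cost), constraints).solve()
print("planned power profile:", np.round(u.value, 2))
```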

Keywords: price-optimal building climate control, microgrid power flow optimisation, hierarchical model predictive control, energy efficient buildings, energy market participation

Procedia PDF Downloads 448
24793 A Data-Mining Model for Protection of FACTS-Based Transmission Line

Authors: Ashok Kalagura

Abstract:

This paper presents a data-mining model for fault-zone identification of flexible AC transmission systems (FACTS)-based transmission lines, including a thyristor-controlled series compensator (TCSC) and a unified power-flow controller (UPFC), using ensembles of decision trees. Given the randomness of the decision trees stacked inside the random forests model, it provides an effective decision for fault-zone identification. Half-cycle post-fault current and voltage samples from fault inception are used as the input vector, with a binary target output indicating whether the fault lies after or before the TCSC/UPFC. The algorithm is tested on simulated fault data with wide variations in the operating parameters of the power system network, including a noisy environment, providing a reliability measure of 99% with a fast response time (3/4 of a cycle from fault inception). The results of the presented approach using the RF model indicate reliable identification of the fault zone in FACTS-based transmission lines.
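
A minimal sketch of such a fault-zone classifier (not the paper's simulation setup): half-cycle post-fault samples form the input vector and a Random Forest predicts the zone label; the synthetic signals are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_cases, n_samples = 2000, 40                  # ~half a cycle of samples per case
zone = rng.integers(0, 2, n_cases)             # 0: fault before, 1: fault after the compensator
# fault signatures differ slightly between zones; noise models measurement error
X = rng.normal(0, 1, (n_cases, n_samples)) + zone[:, None] * 0.8

X_tr, X_te, y_tr, y_te = train_test_split(X, zone, test_size=0.25, random_state=0)
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("fault-zone identification accuracy:", accuracy_score(y_te, rf.predict(X_te)))
```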

Keywords: distance relaying, fault-zone identification, random forests, RFs, support vector machine, SVM, thyristor-controlled series compensator, TCSC, unified power-flow controller, UPFC

Procedia PDF Downloads 412
24792 Implementing a Neural Network on a Low-Power and Mobile Cluster to Aide Drivers with Predictive AI for Traffic Behavior

Authors: Christopher Lama, Alix Rieser, Aleksandra Molchanova, Charles Thangaraj

Abstract:

New technologies like Tesla’s Dojo have made high-performance embedded computing more available. Although automobile computing has developed and benefited enormously from these more recent technologies, the costs are still high, prohibitively high in some cases for broader adoption, particularly for the after-market and enthusiast markets. This project aims to implement a Raspberry Pi-based low-power (under one hundred Watts) highly mobile computing cluster for a neural network. The computing cluster, built from off-the-shelf components, is more affordable and therefore makes wider adoption possible. The paper describes the design of the neural network, the Raspberry Pi-based cluster, and the applications the cluster will run. The neural network will use input data from sensors and cameras to project a live view of the road state as the user drives. The neural network will be trained to predict traffic behavior and generate warnings when potentially dangerous situations are predicted. The significant outcomes of this study will be twofold: firstly, to implement and test the low-cost cluster, and secondly, to ascertain the effectiveness of the predictive AI implemented on the cluster.

Keywords: CS pedagogy, student research, cluster computing, machine learning

Procedia PDF Downloads 79
24791 Predictive Modelling Approaches in Food Processing and Safety

Authors: Amandeep Sharma, Digvaijay Verma, Ruplal Choudhary

Abstract:

Food processing is an activity across the globe that helps in the better handling of agricultural produce, including dairy, meat, and fish. The operations carried out in the food industry include raw material quality and authenticity checks; sorting and grading; processing into various products using thermal treatments such as heating, freezing, and chilling; packaging; and storage at the appropriate temperature to maximize the shelf life of the products. All this is done to safeguard the food products and to ensure their distribution up to the consumer. Approaches to developing predictive models based on mathematical or statistical tools, as well as empirical model development, have been reported for various milk processing activities, including plant maintenance and wastage. Recently, AI has become a key factor in the fourth industrial revolution. AI plays a vital role in the food industry, not only in quality and food security but also in areas such as manufacturing, packaging, and cleaning. A new conceptual model was developed, which shows that a smaller sample size would be required, as only spectra are needed to predict the other values; this leads to savings on raw materials and chemicals otherwise used for experimentation during research and new product development. It would be a futuristic approach if these tools could be further combined with mobile phones through software development for real-time application in the field for quality checks and product traceability.

Keywords: predictive modelling, ANN, AI, food

Procedia PDF Downloads 68
24790 Comparison of Two Neural Networks To Model Margarine Age And Predict Shelf-Life Using Matlab

Authors: Phakamani Xaba, Robert Huberts, Bilainu Oboirien

Abstract:

The present study was aimed at developing and comparing two neural-network-based predictive models to predict the shelf-life/product age of South African margarine, using free fatty acid (FFA), water droplet size (D3.3), water droplet distribution (e-sigma), moisture content, peroxide value (PV), anisidine value (AnV), and total oxidation (totox) value as input variables to the model. Brick margarine products with ages ranging from fresh (week 0) to week 47 were sourced; these products had been stored at 10 and 25 °C and were characterized. JMP and MATLAB models to predict shelf-life/margarine age were developed, and their performances were compared. The key performance indicators used to evaluate the models were the correlation coefficient (CC), root mean square error (RMSE), and mean absolute percentage error (MAPE) relative to the actual data. The MATLAB-developed model showed better performance on all three indicators: its correlation coefficient was 99.86% versus 99.74% for the JMP model, its RMSE was 0.720 compared to 1.005, and its MAPE was 7.4% compared to 8.571%. The MATLAB model was selected as the most accurate, and the number of hidden neurons/nodes was then optimized to develop a single predictive model. The optimized MATLAB model with 10 hidden neurons performed better than the models with 1 and 5 hidden neurons. The developed models can be used by margarine manufacturers, food research institutions, researchers, etc., to predict shelf-life/margarine product age, optimize the addition of antioxidants, extend the shelf-life of products, and proactively troubleshoot problems related to changes that impact the shelf-life of margarine, without conducting expensive trials.
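
An equivalent open-source sketch of such a shelf-life network (the study itself used JMP and MATLAB); the CSV file and column names are hypothetical, while the seven inputs and the 10-neuron hidden layer mirror the abstract.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score, mean_squared_error

df = pd.read_csv("margarine_ageing.csv")        # hypothetical characterization data
inputs = ["FFA", "D3.3", "e_sigma", "moisture", "PV", "AnV", "totox"]
X, y = df[inputs], df["age_weeks"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
print("R2:", r2_score(y_te, pred), "RMSE:", mean_squared_error(y_te, pred) ** 0.5)
```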

Keywords: margarine shelf-life, predictive modelling, neural networks, oil oxidation

Procedia PDF Downloads 181
24789 A Quadratic Model to Early Predict the Blastocyst Stage with a Time Lapse Incubator

Authors: Cecile Edel, Sandrine Giscard D'Estaing, Elsa Labrune, Jacqueline Lornage, Mehdi Benchaib

Abstract:

Introduction: The use of incubators equipped with time-lapse technology in Artificial Reproductive Technology (ART) allows continuous surveillance. With morphocinetic parameters, algorithms are available to predict the potential outcome of an embryo. However, the proposed time-lapse algorithms do not take missing data into account, and so some embryos cannot be classified. The aim of this work is to construct a predictive model that works even in the case of missing data. Materials and methods: Patients: A retrospective study was performed in the reproductive biology laboratory of the hospital ‘Femme Mère Enfant’ (Lyon, France) between 1 May 2013 and 30 April 2015. Embryos (n=557) obtained from couples (n=108) were cultured in a time-lapse incubator (Embryoscope®, Vitrolife, Goteborg, Sweden). Time-lapse incubator: The morphocinetic parameters obtained during the first three days of embryo life were used to build the predictive model. Predictive model: A quadratic regression was performed between the number of cells and time: N = a·T² + b·T + c, where N is the number of cells at time T (in hours). The regression coefficients were calculated with Excel software (Microsoft, Redmond, WA, USA); a program in Visual Basic for Applications (VBA) (Microsoft) was written for this purpose. The quadratic equation was used to find a value that allows prediction of blastocyst formation: the synthetize value. The area under the curve (AUC) obtained from the ROC curve was used to assess the performance of the regression coefficients and the synthetize value. A cut-off value was calculated for each regression coefficient and for the synthetize value to obtain two groups for which the difference in blastocyst formation rate according to the cut-off value was maximal. The data were analyzed with SPSS (IBM, Chicago, IL, USA). Results: Among the 557 embryos, 79.7% had reached the blastocyst stage. The synthetize value corresponds to the value of the fit calculated at a time value of 99, for which the highest AUC was obtained. The AUC was 0.648 (p < 0.001) for the regression coefficient ‘a’, 0.363 (p < 0.001) for the regression coefficient ‘b’, 0.633 (p < 0.001) for the regression coefficient ‘c’, and 0.659 (p < 0.001) for the synthetize value. The results are presented as the blastocyst formation rate below the cut-off value versus above the cut-off value. For the regression coefficient ‘a’, the optimum cut-off value was -1.14×10⁻³ (61.3% versus 84.3%, p < 0.001); for the regression coefficient ‘b’, 0.26 (83.9% versus 63.1%, p < 0.001); for the regression coefficient ‘c’, -4.4 (62.2% versus 83.1%, p < 0.001); and for the synthetize value, 8.89 (58.6% versus 85.0%, p < 0.001). Conclusion: This quadratic regression allows prediction of the outcome of an embryo even in the case of missing data. The three regression coefficients and the synthetize value could represent the identity card of an embryo. The ‘a’ coefficient represents the acceleration of cell division, and the ‘b’ coefficient the speed of cell division. We hypothesize that the ‘c’ coefficient could represent the intrinsic potential of an embryo, which could depend on the oocyte from which the embryo originated. These hypotheses should be confirmed by studies analyzing the relationship between the regression coefficients and ART parameters.
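
A minimal sketch of the quadratic fit N = a·T² + b·T + c described above (the authors used Excel/VBA and SPSS); the example cell-count trajectory is an illustrative assumption.

```python
import numpy as np

T = np.array([25.0, 28.0, 37.0, 40.0, 52.0, 62.0])   # observation times (h)
N = np.array([2, 2, 4, 4, 8, 8])                      # cell counts from time-lapse images

a, b, c = np.polyfit(T, N, deg=2)                     # quadratic regression coefficients
synthetize = a * 99 ** 2 + b * 99 + c                 # value of the fit at a time value of 99
print(f"a={a:.5f}, b={b:.3f}, c={c:.2f}, synthetize value={synthetize:.2f}")
```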

Keywords: ART procedure, blastocyst formation, time-lapse incubator, quadratic model

Procedia PDF Downloads 293
24788 Enhancing Plant Throughput in Mineral Processing Through Multimodal Artificial Intelligence

Authors: Muhammad Bilal Shaikh

Abstract:

Mineral processing plants play a pivotal role in extracting valuable minerals from raw ores, contributing significantly to various industries. However, the optimization of plant throughput remains a complex challenge, necessitating innovative approaches for increased efficiency and productivity. This research paper investigates the application of Multimodal Artificial Intelligence (MAI) techniques to address this challenge, aiming to improve overall plant throughput in mineral processing operations. The integration of multimodal AI leverages a combination of diverse data sources, including sensor data, images, and textual information, to provide a holistic understanding of the complex processes involved in mineral extraction. The paper explores the synergies between various AI modalities, such as machine learning, computer vision, and natural language processing, to create a comprehensive and adaptive system for optimizing mineral processing plants. The primary focus of the research is on developing advanced predictive models that can accurately forecast various parameters affecting plant throughput. Utilizing historical process data, machine learning algorithms are trained to identify patterns, correlations, and dependencies within the intricate network of mineral processing operations. This enables real-time decision-making and process optimization, ultimately leading to enhanced plant throughput. Incorporating computer vision into the multimodal AI framework allows for the analysis of visual data from sensors and cameras positioned throughout the plant. This visual input aids in monitoring equipment conditions, identifying anomalies, and optimizing the flow of raw materials. The combination of machine learning and computer vision enables the creation of predictive maintenance strategies, reducing downtime and improving the overall reliability of mineral processing plants. Furthermore, the integration of natural language processing facilitates the extraction of valuable insights from unstructured textual data, such as maintenance logs, research papers, and operator reports. By understanding and analyzing this textual information, the multimodal AI system can identify trends, potential bottlenecks, and areas for improvement in plant operations. This comprehensive approach enables a more nuanced understanding of the factors influencing throughput and allows for targeted interventions. The research also explores the challenges associated with implementing multimodal AI in mineral processing plants, including data integration, model interpretability, and scalability. Addressing these challenges is crucial for the successful deployment of AI solutions in real-world industrial settings. To validate the effectiveness of the proposed multimodal AI framework, the research conducts case studies in collaboration with mineral processing plants. The results demonstrate tangible improvements in plant throughput, efficiency, and cost-effectiveness. The paper concludes with insights into the broader implications of implementing multimodal AI in mineral processing and its potential to revolutionize the industry by providing a robust, adaptive, and data-driven approach to optimizing plant operations. In summary, this research contributes to the evolving field of mineral processing by showcasing the transformative potential of multimodal artificial intelligence in enhancing plant throughput. 
The proposed framework offers a holistic solution that integrates machine learning, computer vision, and natural language processing to address the intricacies of mineral extraction processes, paving the way for a more efficient and sustainable future in the mineral processing industry.

Keywords: multimodal AI, computer vision, NLP, mineral processing, mining

Procedia PDF Downloads 54
24787 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier

Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh

Abstract:

This study researches the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning not only increases diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancements over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic approximations.
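
A minimal sketch of the text side of such a pipeline (not the authors' code): TF-IDF features extracted from the report text feed a Random Forest classifier; the file name, column names, and label are hypothetical placeholders for the Indiana University dataset fields.

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

reports = pd.read_csv("indiana_reports.csv")          # hypothetical export of the dataset
X_text = reports["findings"].fillna("")               # free-text radiological findings
y = reports["abnormal"]                               # 1 = health issue identified

X_tr, X_te, y_tr, y_te = train_test_split(X_text, y, test_size=0.2, random_state=0)
vec = TfidfVectorizer(max_features=5000, ngram_range=(1, 2))
clf = RandomForestClassifier(n_estimators=400, random_state=0)
clf.fit(vec.fit_transform(X_tr), y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(vec.transform(X_te))))
```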

Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, indiana university dataset, machine learning in healthcare, predictive modeling, clinical decision support systems

Procedia PDF Downloads 20
24786 Virtual Dimension Analysis of Hyperspectral Imaging to Characterize a Mining Sample

Authors: L. Chevez, A. Apaza, J. Rodriguez, R. Puga, H. Loro, Juan Z. Davalos

Abstract:

The Virtual Dimension (VD) procedure is used to analyze hyperspectral image (HSI) data in order to estimate the abundance of the mineral components of a mining sample. Hyperspectral images coming from reflectance spectra (NIR region) are pre-treated using the Standard Normal Variate (SNV) and Minimum Noise Fraction (MNF) methodologies. The endmember components are identified by the Simplex Growing Algorithm (SGA) and afterwards fitted to the reflectance spectra of reference databases using the Simulated Annealing (SA) methodology. The mineral abundances obtained for the studied sample are very close to those obtained using XRD, with a total relative error of 2%.
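
A minimal sketch of the SNV pre-treatment step, which centers and scales each reflectance spectrum individually before MNF reduction and endmember extraction; the spectra array is an illustrative assumption (pixels by wavelengths).

```python
import numpy as np

def snv(spectra: np.ndarray) -> np.ndarray:
    """Standard Normal Variate: center and scale every spectrum individually."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

# hypothetical NIR reflectance cube flattened to (pixels, wavelengths)
spectra = np.random.default_rng(0).uniform(0.1, 0.9, size=(1000, 256))
spectra_snv = snv(spectra)
print(spectra_snv.mean(axis=1)[:3], spectra_snv.std(axis=1)[:3])   # ~0 and ~1 per spectrum
```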

Keywords: hyperspectral imaging, minimum noise fraction, MNF, simplex growing algorithm, SGA, standard normal variate, SNV, virtual dimension, XRD

Procedia PDF Downloads 141
24785 MP-SMC-I Method for Slip Suppression of Electric Vehicles under Braking

Authors: Tohru Kawabe

Abstract:

In this paper, a new SMC (Sliding Mode Control) method with MP (Model Predictive Control) integral action for the slip suppression of an EV (Electric Vehicle) under braking is proposed. The proposed method introduces an integral term alongside the standard SMC gain, where the integral gain is optimized for each control period by the MPC algorithm. The aim of this method is to improve the safety and stability of EVs under braking by controlling the wheel slip ratio. Numerical simulation results are also included to demonstrate the effectiveness of the method.

Keywords: sliding mode control, model predictive control, integral action, electric vehicle, slip suppression

Procedia PDF Downloads 544
24784 Case Study about Women Driving in Saudi Arabia Announced in 2018: Netnographic and Data Mining Study

Authors: Majdah Alnefaie

Abstract:

A netnographic study and data mining have been used to monitor public interaction on Social Media Sites (SMSs) in order to understand what motivational factors influenced Saudi intentions regarding allowing women to drive in Saudi Arabia in 2018. The netnographic study monitored the public's textual and visual communications on Twitter, Snapchat, and YouTube. This form of communication between SMS users is also known as electronic word of mouth (eWOM). The netnography methodology is still in its initial stages, as it depends on manual extraction, reading, and classification of SMS users' text. Data mining, on the other hand, comes from a computer and physical sciences background, where it is much harder to extract meaning from unstructured qualitative data. In addition, recent developments in data mining software do not support Arabic text, especially the local slang of Saudi Arabia. Therefore, collaboration between social and computer scientists, combining the netnographic study and data mining, enhances the efficiency of this study's methodology, leading to a comprehensive research outcome. eWOM communications between individuals on SMSs can promote a sense that sharing their preferences and experiences regarding politics and government social regulations is a part of their daily life, highlighting the importance of using SMSs to assist in promoting political and social participation. Therefore, public interactions on SMSs are an important tool for comprehending people's intentions regarding new government regulations in the country. This study aims to answer the question: 'What factors influenced Saudi Arabians' intentions regarding Saudi females' car driving in 2018?' The study utilized a qualitative method known as the netnographic study. It used RStudio to collect and analyze 27,000 Saudi users' comments from 25 May until 25 June 2018. The study developed a data collection model that supports importing and analyzing Arabic text in the local slang. The data collection model was clustered based on different types of social networks, gender, and the study's main factors. Social network analysis was employed to collect comments from SMS accounts owned by government organizations, celebrities, vloggers, social activists, and news outlets. The comments were collected from both male and female SMS users. The sentiment analysis shows that the total number of positive comments on Saudi females' car driving was higher than the number of negative comments. The data revealed the most important factors influencing Saudi Arabians' intentions regarding Saudi females' car driving, including culture and environment, freedom of choice, equal opportunities, and security and safety. The most interesting finding indicated that women driving would play a role in increasing individual freedom of choice: Saudi females will be able to drive cars to fulfill their daily life and family needs without being stressed by a lack of transportation. The study outcome will help the Saudi government to improve women's quality of life by increasing their ability to find jobs and studies, increasing income by decreasing spending on transport such as taxis, and providing more freedom of choice in women's daily life needs. The study underscores the importance of using marketing research to measure public opinion on new government regulations in the country. The study also explains its limitations and suggestions for future research.

Keywords: netnographic study, data mining, social media, Saudi Arabia, female driving

Procedia PDF Downloads 135
24783 Predictive Semi-Empirical NOx Model for Diesel Engine

Authors: Saurabh Sharma, Yong Sun, Bruce Vernham

Abstract:

Accurate prediction of NOx emission is a continuous challenge in the field of diesel engine-out emission modeling. Performing experiments for every condition and scenario costs a significant amount of money and man-hours; therefore, a model-based development strategy has been implemented in order to solve this issue. NOx formation is highly dependent on the burned gas temperature and the O2 concentration inside the cylinder. Current empirical models are developed by calibrating parameters representing the engine operating conditions against the measured NOx, which limits the predictions of purely empirical models to the region in which they have been calibrated. An alternative solution is presented in this paper, which focuses on the utilization of in-cylinder combustion parameters to form a predictive semi-empirical NOx model. The result of this work is a fast, predictive NOx model built from physical parameters and empirical correlations. The model is developed based on steady-state data collected over the entire operating region of the engine and a predictive combustion model, which is developed in Gamma Technology (GT)-Power using the Direct Injected (DI)-Pulse combustion object. In this approach, the temperatures of both the burned and unburned zones are considered during the combustion period, i.e., from Intake Valve Closing (IVC) to Exhaust Valve Opening (EVO). The oxygen concentration consumed in the burned zone and the trapped fuel mass are also considered while developing the reported model. Several statistical methods are used to construct the model, including individual machine learning methods and ensemble machine learning methods. A detailed validation of the model on multiple diesel engines is reported in this work. A substantial number of cases is tested for different engine configurations over a large span of speed and load points. Different sweeps of operating conditions, such as Exhaust Gas Recirculation (EGR), injection timing, and Variable Valve Timing (VVT), are also considered for the validation. The model shows very good predictability and robustness at both sea-level and altitude conditions under different ambient conditions. Its various advantages, such as high accuracy and robustness at different operating conditions, low computational time, and the lower number of data points required for calibration, establish a platform where the model-based approach can be used for the engine calibration and development process. Moreover, the focus of this work is toward establishing a framework for future model development for other targets such as soot, Combustion Noise Level (CNL), NO2/NOx ratio, etc.
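
A minimal sketch of the semi-empirical idea (not the authors' model): an ensemble regressor trained on in-cylinder combustion parameters taken from a predictive combustion model; the file name and column names are hypothetical placeholders.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

data = pd.read_csv("engine_steady_state.csv")     # hypothetical GT-Power + test-bench export
features = ["burned_zone_temp", "unburned_zone_temp", "o2_burned_zone",
            "trapped_fuel_mass", "egr_rate", "injection_timing"]
X_tr, X_te, y_tr, y_te = train_test_split(data[features], data["nox_ppm"],
                                          test_size=0.2, random_state=0)

model = GradientBoostingRegressor(n_estimators=500, learning_rate=0.05, random_state=0)
model.fit(X_tr, y_tr)
print("R2 on held-out operating points:", model.score(X_te, y_te))
```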

Keywords: diesel engine, machine learning, NOₓ emission, semi-empirical

Procedia PDF Downloads 101
24782 Predicting Expectations of Non-Monogamy in Long-Term Romantic Relationships

Authors: Michelle R. Sullivan

Abstract:

Positive romantic relationships and marriages offer a buffer against a host of physical and emotional difficulties. Conversely, poor relationship quality and marital discord can have deleterious consequences for individuals and families. Research has described non-monogamy (infidelity and consensual non-monogamy) as both a consequence and a cause of relationship difficulty, or as a unique way a couple strives to make a relationship work. Much research on consensual non-monogamy has built on feminist theory and critique. To the author's best knowledge, to date, no studies have examined the predictive relationship between individual and relationship characteristics and expectations of non-monogamy. The current longitudinal study: 1) estimated the prevalence of expectations of partner non-monogamy and 2) evaluated whether gender, sexual identity, age, education, how a couple met, and relationship quality were predictive of expectations of partner non-monogamy. This study utilized the publicly available longitudinal dataset How Couples Meet and Stay Together. Adults aged 18 to 98 years old (n=4002) were surveyed by phone over 5 waves from 2009-2014. Demographics and how a couple met were gathered through self-report in Wave 1, and relationship quality and expectations of partner non-monogamy were gathered through self-report in Waves 4 and 5 (n=1047). The prevalence of expectations of partner non-monogamy (encompassing both infidelity and consensual non-monogamy) was 4.8%. Logistic regression models indicated that sexual identity, gender, education, and relationship quality were significantly predictive of expectations of partner non-monogamy. Specifically, male gender, lower education, identifying as lesbian, gay, or bisexual, and lower relationship quality scores were predictive of expectations of partner non-monogamy. Male gender was not predictive of expectations of partner non-monogamy in the follow-up logistic regression model. Age and whether a couple met online were not associated with expectations of partner non-monogamy. Clinical implications include awareness of the increased likelihood of lesbian, gay, and bisexual individuals to have an expectation of non-monogamy and the sequelae of relationship dissatisfaction that may be related. Future research could differentiate between non-monogamy subtypes and the person and relationship variables that lead to the likelihood of consensual non-monogamy and infidelity as separate constructs, as well as explore the relationship between predicted partner behavior and actual partner behavioral outcomes.
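
A minimal sketch of the kind of logistic regression described above, using the statsmodels formula API; the file and variable names are hypothetical placeholders for the How Couples Meet and Stay Together survey fields.

```python
import pandas as pd
import statsmodels.formula.api as smf

hcmst = pd.read_csv("hcmst_waves_4_5.csv")     # hypothetical extract of the public dataset

# binary outcome: expects partner non-monogamy; predictors as listed in the abstract
model = smf.logit(
    "expects_nonmonogamy ~ C(gender) + C(lgb_identity) + age + education"
    " + relationship_quality + met_online",
    data=hcmst,
).fit()
print(model.summary())                          # exponentiate model.params for odds ratios
```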

Keywords: open relationship, polyamory, infidelity, relationship satisfaction

Procedia PDF Downloads 144
24781 Data Analysis Tool for Predicting Water Scarcity in Industry

Authors: Tassadit Issaadi Hamitouche, Nicolas Gillard, Jean Petit, Valerie Lavaste, Celine Mayousse

Abstract:

Water is a fundamental resource for industry. It is taken from the environment, either from municipal distribution networks or from various natural sources such as the sea, the ocean, rivers, and aquifers. Once used, water is discharged into the environment or reprocessed at the plant or at treatment plants. These withdrawals and discharges have a direct impact on natural water resources. The impacts can concern the quantity of water available, the quality of the water used, or effects that are less direct and more complex to measure, such as the health of the population downstream of the watercourse. Based on the analysis of data (meteorological data, river characteristics, physicochemical substances), we wish to predict water stress episodes and anticipate prefectoral decrees, which can impact plant performance; propose improvement solutions; help industrialists choose the location of a new plant; visualize possible interactions between companies to optimize exchanges and encourage the pooling of water treatment solutions; and set up circular economies around the issue of water. The development of a system for the collection, processing, and use of data related to water resources requires the functional constraints specific to these data to be made explicit. The system must be able to store a large amount of data from sensors, which are the main type of data in plants and their environment. In addition, manufacturers need near-real-time processing of information in order to make the best decisions (to be rapidly notified of an event that would have a significant impact on water resources). Finally, the visualization of data must be adapted to its temporal and geographical dimensions. In this study, we set up an infrastructure centered on the TICK application stack (Telegraf, InfluxDB, Chronograf, and Kapacitor), a set of loosely coupled but tightly integrated open-source projects designed to manage huge amounts of time-stamped information. The software architecture is coupled with the Cross-Industry Standard Process for Data Mining (CRISP-DM) methodology. The robust architecture and the methodology used have demonstrated their effectiveness on the case study of learning the level of a river with a 7-day horizon. The management of water, and of the plant activities that depend on this resource, should be considerably improved thanks, on the one hand, to the learning that allows the anticipation of periods of water stress and, on the other hand, to the information system that can warn decision-makers with alerts created from the formalization of prefectoral decrees.
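
A minimal sketch of the 7-day-ahead river-level learning task mentioned above, using lagged levels and rainfall as features (the platform itself runs on the TICK stack); the data file and column names are hypothetical.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

df = pd.read_csv("river_gauge.csv", parse_dates=["date"]).sort_values("date")

# lag features: the last 7 daily levels and rainfall totals
for lag in range(1, 8):
    df[f"level_lag{lag}"] = df["level"].shift(lag)
    df[f"rain_lag{lag}"] = df["rainfall"].shift(lag)
df["target"] = df["level"].shift(-7)              # river level 7 days ahead
df = df.dropna()

features = [c for c in df.columns if "lag" in c]
split = int(len(df) * 0.8)                        # chronological train/test split
model = GradientBoostingRegressor(random_state=0).fit(df[features][:split], df["target"][:split])
print("7-day-ahead R2:", model.score(df[features][split:], df["target"][split:]))
```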

Keywords: data mining, industry, machine learning, shortage, water resources

Procedia PDF Downloads 109
24780 Syndromic Surveillance Framework Using Tweets Data Analytics

Authors: David Ming Liu, Benjamin Hirsch, Bashir Aden

Abstract:

Syndromic surveillance aims to detect or predict disease outbreaks through the analysis of medical data sources. Using social media data such as tweets for syndromic surveillance has become increasingly popular, aided by open platforms for data collection and the advantages of microblogging text and mobile geographic location features. In this paper, a syndromic surveillance framework with a machine learning kernel using tweet data analytics is presented. Influenza and the three cities of Abu Dhabi, Al Ain, and Dubai in the United Arab Emirates are used as the test disease and trial areas. Hospital case data provided by the Health Authority of Abu Dhabi (HAAD) are used for correlation purposes. In our model, a Latent Dirichlet Allocation (LDA) engine is adapted to perform supervised learning classification, and N-fold cross-validation confusion matrices are given as the simulation results, with an overall system recall of 85.595% achieved.
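
A minimal sketch of one way to adapt an LDA engine for supervised tweet classification (topic proportions feed a downstream classifier, evaluated with k-fold cross-validation); this is an illustration rather than the framework's exact pipeline, and the tweet file and columns are hypothetical.

```python
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

tweets = pd.read_csv("uae_tweets.csv")             # hypothetical: tweet text + influenza label
pipeline = make_pipeline(
    CountVectorizer(max_features=10000, stop_words="english"),
    LatentDirichletAllocation(n_components=20, random_state=0),   # LDA topic features
    LogisticRegression(max_iter=1000),                            # supervised classification
)
recall = cross_val_score(pipeline, tweets["text"], tweets["flu_related"], cv=10, scoring="recall")
print("10-fold mean recall:", recall.mean())
```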

Keywords: syndromic surveillance, tweets, machine learning, data mining, latent Dirichlet allocation (LDA), influenza

Procedia PDF Downloads 98
24779 A Comprehensive Review of Artificial Intelligence Applications in Sustainable Building

Authors: Yazan Al-Kofahi, Jamal Alqawasmi

Abstract:

In this study, a systematic literature review (SLR) was conducted, with the main goal of assessing the existing literature on how artificial intelligence (AI), machine learning (ML), and deep learning (DL) models are used in sustainable architecture applications and issues, including thermal comfort satisfaction, energy efficiency, cost prediction, and many other issues. The search strategy drew on several databases, including Scopus, Springer, and Google Scholar. The inclusion criteria were based on two search strings related to DL, ML, and sustainable architecture. The timeframe for the inclusion of papers was open, although most of the included papers were from the previous four years. As a filtering strategy, conference papers and books were excluded from the database search results. Using these inclusion and exclusion criteria, the search was conducted, and a sample of 59 papers was selected for the final analysis. In the data extraction phase, the needed data were extracted from these papers and then analyzed and correlated. The results of this SLR showed that there are many applications of ML and DL in sustainable buildings and that this topic is currently trending. It was found that most of the papers focused their discussions on addressing environmental sustainability issues and factors using machine learning predictive models, with a particular emphasis on the use of decision tree algorithms. Moreover, it was found that the Random Forest regressor demonstrates strong performance across all feature selection groups as a machine learning predictive model for building cost prediction.

Keywords: machine learning, deep learning, artificial intelligence, sustainable building

Procedia PDF Downloads 46