Search results for: predictive insights
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2821


2641 Data-Driven Strategies for Enhancing Food Security in Vulnerable Regions: A Multi-Dimensional Analysis of Crop Yield Predictions, Supply Chain Optimization, and Food Distribution Networks

Authors: Sulemana Ibrahim

Abstract:

Food security remains a paramount global challenge, with vulnerable regions grappling with issues of hunger and malnutrition. This study embarks on a comprehensive exploration of data-driven strategies aimed at improving food security in such regions. Our research employs a multifaceted approach, integrating data analytics to predict crop yields, optimize supply chains, and enhance food distribution networks. The study unfolds as a multi-dimensional analysis, commencing with the development of robust machine learning models harnessing remote sensing data, historical crop yield records, and meteorological data to forecast crop yields. These predictive models, underpinned by convolutional and recurrent neural networks, furnish critical insights into anticipated harvests, empowering proactive measures to confront food insecurity. Subsequently, the research scrutinizes supply chain optimization to address food security challenges, capitalizing on linear programming and network optimization techniques. These strategies aim to mitigate loss and wastage while streamlining the distribution of agricultural produce from field to fork. In conjunction, the study investigates food distribution networks with a particular focus on network efficiency, accessibility, and equitable food resource allocation. Network analysis tools, complemented by data-driven simulation methodologies, unveil opportunities for augmenting the efficacy of these critical lifelines. This study also considers the ethical implications and privacy concerns associated with the extensive use of data in the realm of food security. The proposed methodology outlines guidelines for responsible data acquisition, storage, and usage. The ultimate aspiration of this research is to forge a nexus between data science and food security policy, providing actionable insights to mitigate the ordeal of food insecurity. The holistic approach, converging data-driven crop yield forecasts, optimized supply chains, and improved distribution networks, aspires to revitalize food security in the most vulnerable regions, elevating the quality of life for millions worldwide.
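
As an illustration of the yield-forecasting step, the sketch below builds a small convolutional-plus-recurrent regressor of the general kind described; the input shapes, hyperparameters, and synthetic data are assumptions for demonstration only, not details taken from the study.

```python
# Illustrative sketch (not the authors' exact architecture): a small
# convolutional + recurrent network mapping a season-long sequence of
# remote-sensing and weather features to a crop-yield estimate.
import numpy as np
import tensorflow as tf

T, F = 36, 8                      # hypothetical: 36 ten-day periods, 8 features each
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(T, F)),
    tf.keras.layers.Conv1D(32, kernel_size=3, activation="relu"),  # local temporal patterns
    tf.keras.layers.LSTM(16),                                      # season-long dynamics
    tf.keras.layers.Dense(1),                                      # predicted yield (e.g. t/ha)
])
model.compile(optimizer="adam", loss="mse")

# Dummy data standing in for historical records; real inputs would come from
# satellite indices (e.g. NDVI), rainfall, and temperature time series.
X = np.random.rand(200, T, F).astype("float32")
y = np.random.rand(200, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:3]).ravel())
```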

Keywords: data-driven strategies, crop yield prediction, supply chain optimization, food distribution networks

Procedia PDF Downloads 35
2640 Fuzzy Logic Based Fault Tolerant Model Predictive MLI Topology

Authors: Abhimanyu Kumar, Chirag Gupta

Abstract:

This work presents a comprehensive study on the employment of Model Predictive Control (MPC) for a three-phase voltage-source inverter to regulate the output voltage efficiently. The inverter is modeled via the Clarke Transformation, considering a scenario where the load is unknown. An LC filter model is developed, demonstrating its efficacy in Total Harmonic Distortion (THD) reduction. The system, when implemented with fault-tolerant multilevel inverter topologies, ensures reliable operation even under fault conditions, a requirement that is paramount with the increasing dependence on renewable energy sources. The research also integrates a Fuzzy Logic-based fault-tolerance system that identifies and manages faults, ensuring consistent inverter performance. The efficacy of the proposed methodology is substantiated through rigorous simulations and comparative results, shedding light on the voltage prediction efficiency and the robustness of the model even under fault conditions.
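
For context on how such a controller evaluates candidate actuations, here is a deliberately simplified one-step finite-control-set MPC sketch for an inverter feeding an LC filter; the single-phase equivalent model, parameter values, and cost function are illustrative assumptions rather than the paper's formulation.

```python
# Minimal one-step finite-control-set MPC sketch for an inverter with an LC
# filter, treating the (unknown) load current as a disturbance set to zero.
import numpy as np

L, C, Ts, Vdc = 2.4e-3, 15e-6, 50e-6, 400.0        # assumed filter and sampling values
candidates = np.array([-Vdc, 0.0, Vdc])             # admissible inverter voltage levels

def predict(iL, vC, v_inv):
    """Forward-Euler prediction of inductor current and capacitor voltage."""
    iL_next = iL + Ts / L * (v_inv - vC)
    vC_next = vC + Ts / C * iL                       # load current assumed unknown -> 0
    return iL_next, vC_next

def mpc_step(iL, vC, vC_ref):
    """Pick the voltage level minimising the predicted voltage tracking error."""
    costs = [(vC_ref - predict(iL, vC, v)[1]) ** 2 for v in candidates]
    return candidates[int(np.argmin(costs))]

# One control step: current state (iL, vC) and a sinusoidal voltage reference sample.
print(mpc_step(iL=1.0, vC=120.0, vC_ref=230.0 * np.sqrt(2) * np.sin(0.3)))
```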

Keywords: total harmonic distortion, fuzzy logic, renewable energy sources, MLI

Procedia PDF Downloads 77
2639 A High Content Screening Platform for the Accurate Prediction of Nephrotoxicity

Authors: Sijing Xiong, Ran Su, Lit-Hsin Loo, Daniele Zink

Abstract:

The kidney is a major target for toxic effects of drugs, industrial and environmental chemicals and other compounds. Typically, nephrotoxicity is detected late during drug development, and regulatory animal models have not solved this problem. Validated or accepted in silico or in vitro methods for the prediction of nephrotoxicity are not available. We have established the first and currently only pre-validated in vitro models for the accurate prediction of nephrotoxicity in humans and the first predictive platforms based on renal cells derived from human pluripotent stem cells. In order to further improve the efficiency of our predictive models, we recently developed a high content screening (HCS) platform. This platform employed automated imaging in combination with automated quantitative phenotypic profiling and machine learning methods. 129 image-based phenotypic features were analyzed with respect to their predictive performance in combination with 44 compounds of different chemical structures, including drugs, environmental and industrial chemicals, and herbal and fungal compounds. The nephrotoxicity of these compounds in humans is well characterized. A combination of chromatin and cytoskeletal features resulted in high predictivity with respect to nephrotoxicity in humans. Test balanced accuracies of 82% or 89% were obtained with human primary or immortalized renal proximal tubular cells, respectively. Furthermore, our results revealed that a DNA damage response is commonly induced by different PTC-toxicants with diverse chemical structures and injury mechanisms. Together, the results show that the automated HCS platform allows efficient and accurate nephrotoxicity prediction for compounds with diverse chemical structures.
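
To make the classification step concrete, the snippet below mirrors the general pattern (a classifier trained on image-based phenotypic features, scored by balanced accuracy); the feature matrix, labels, and the random-forest choice are placeholders, not the platform's actual feature set or learner.

```python
# Schematic reproduction of the analysis pattern: train a classifier on
# phenotypic features and report test balanced accuracy. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(44, 129))        # 44 compounds x 129 phenotypic features
y = rng.integers(0, 2, size=44)       # 1 = nephrotoxic in humans, 0 = not

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("test balanced accuracy:", balanced_accuracy_score(y_te, clf.predict(X_te)))
```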

Keywords: high content screening, in vitro models, nephrotoxicity, toxicity prediction

Procedia PDF Downloads 290
2638 Uncertainty Estimation in Neural Networks through Transfer Learning

Authors: Ashish James, Anusha James

Abstract:

The impressive predictive performance of deep learning techniques on a wide range of tasks has led to their widespread use. Estimating the confidence of these predictions is paramount for improving the safety and reliability of such systems. However, the uncertainty estimates provided by neural networks (NNs) tend to be overconfident and unreasonable. Ensembles of NNs typically produce good predictions, but their uncertainty estimates tend to be inconsistent. Inspired by these observations, this paper presents a framework that can quantitatively estimate uncertainties by leveraging advances in transfer learning through slight modifications to existing training pipelines. This promising algorithm is developed with the intention of deployment in real-world problems that already boast good predictive performance, by reusing those pretrained models. The idea is to capture the behavior of the NNs trained for the base task by augmenting them with uncertainty estimates from a supplementary network. A series of experiments with known and unknown distributions show that the proposed approach produces well-calibrated uncertainty estimates with high-quality predictions.
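
A rough sketch of the general idea follows: reuse a pretrained predictor and attach a supplementary network that supplies the uncertainty estimate. Here the supplementary network is simply trained on the base model's squared residuals, which is an assumption made for illustration rather than the paper's training procedure.

```python
# Illustration only: pretrained base model kept fixed, supplementary network
# trained on log squared residuals to provide a per-input uncertainty estimate.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1 + 0.2 * (X.ravel() > 0), size=500)

base = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0).fit(X, y)
residual_sq = (y - base.predict(X)) ** 2
# Supplementary network models log-variance for numerical stability.
unc = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
unc.fit(X, np.log(residual_sq + 1e-8))

x_new = np.array([[0.5], [-2.0]])
mean = base.predict(x_new)
std = np.sqrt(np.exp(unc.predict(x_new)))
print(list(zip(mean.round(3), std.round(3))))    # prediction paired with uncertainty estimate
```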

Keywords: uncertainty estimation, neural networks, transfer learning, regression

Procedia PDF Downloads 100
2637 Breast Cancer Mortality and Comorbidities in Portugal: A Predictive Model Built with Real World Data

Authors: Cecília M. Antão, Paulo Jorge Nogueira

Abstract:

Breast cancer (BC) is the first cause of cancer mortality among Portuguese women. This retrospective observational study aimed to identify comorbidities associated with female BC patients admitted to Portuguese public hospitals (2010-2018), investigate the effect of comorbidities on the BC mortality rate, and build a predictive model using logistic regression. Results showed that BC mortality in Portugal decreased in this period and reached 4.37% in 2018. Adjusted odds ratios indicated that secondary malignant neoplasms of the liver and of bone and bone marrow, congestive heart failure, and diabetes were associated with an increased chance of dying from breast cancer. Although the Lisbon district (the most populated area) accounted for the largest percentage of BC patients, the logistic regression model showed that, besides patient’s age, being resident in the Bragança, Castelo Branco, or Porto districts was directly associated with an increased mortality rate.
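
For readers unfamiliar with the modelling step, the sketch below shows how adjusted odds ratios fall out of a logistic regression of mortality on age and comorbidity indicators; the variable names and simulated data are placeholders and do not reproduce the study's hospital records.

```python
# Adjusted odds ratios from a logistic regression (illustrative data only).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "age": rng.normal(62, 12, n),
    "liver_mets": rng.integers(0, 2, n),
    "bone_mets": rng.integers(0, 2, n),
    "heart_failure": rng.integers(0, 2, n),
    "diabetes": rng.integers(0, 2, n),
})
logit = -6 + 0.05 * df["age"] + 1.2 * df["liver_mets"] + 0.8 * df["bone_mets"]
df["death"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = sm.Logit(df["death"], sm.add_constant(df.drop(columns="death"))).fit(disp=0)
print(np.exp(model.params))          # adjusted odds ratios
print(np.exp(model.conf_int()))      # 95% confidence intervals
```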

Keywords: breast cancer, comorbidities, logistic regression, adjusted odds ratio

Procedia PDF Downloads 60
2636 What the Future Holds for Social Media Data Analysis

Authors: P. Wlodarczak, J. Soar, M. Ally

Abstract:

The dramatic rise in the use of Social Media (SM) platforms such as Facebook and Twitter provides access to an unprecedented amount of user data. Users may post reviews of products and services they bought, write about their interests, share ideas, or give their opinions and views on political issues. There is growing interest among organisations in the analysis of SM data for detecting new trends, obtaining user opinions on their products and services, or finding out about their online reputations. A recent research trend in SM analysis is making predictions based on sentiment analysis of SM. Often, indicators derived from historic SM data are represented as time series and correlated with a variety of real-world phenomena such as the outcome of elections, the development of financial indicators, box office revenue, and disease outbreaks. This paper examines the current state of research in the area of SM mining and predictive analysis and gives an overview of the analysis methods using opinion mining and machine learning techniques.

Keywords: social media, text mining, knowledge discovery, predictive analysis, machine learning

Procedia PDF Downloads 392
2635 Langerian Mindfulness and School Manager’s Competencies: A Comprehensive Model in Khorasan Razavi Educational Province

Authors: Reza Taherian, Naziasadat Naseri, Elham Fariborzi, Faride Hashmiannejad

Abstract:

Effective management plays a crucial role in the success of educational institutions and training organizations. This study aims to develop and validate a professional competency model for managers in the education and training sector of Khorasan Razavi Province using a mindfulness approach based on Langerian theory. Employing a mixed exploratory design, the research involved qualitative data collection from experts and top national and provincial managers, as well as quantitative data collection using a researcher-developed questionnaire. The findings revealed that 81% of the competency of education and training managers is influenced by the dimensions of Langerian mindfulness, including engagement, novelty seeking, novelty producing, and flexibility. These dimensions were found to be predictive of the competencies of education and training managers, which encompass specialized knowledge, professional skills, pedagogical knowledge, commitment to Islamic values, personal characteristics, and creativity. This research provides valuable insights into the essential role of mindfulness in shaping the competencies of education and training managers, shedding light on the specific dimensions that significantly contribute to managerial success in Khorasan Razavi Province.

Keywords: school managers, school manager’s competencies, mindfulness, Langerian mindfulness

Procedia PDF Downloads 25
2634 A Mega-Analysis of the Predictive Power of Initial Contact within Minimal Social Network

Authors: Cathal Ffrench, Ryan Barrett, Mike Quayle

Abstract:

It is accepted in social psychology that categorization leads to ingroup favoritism, without further thought given to the processes that may co-occur with or even precede categorization. These categorizations move away from the conceptualization of the self as a unique social being toward an increasingly collective identity. Subsequently, many individuals derive much of their self-evaluations from these collective identities. The seminal literature on this topic argues that it is primarily categorization that evokes instances of ingroup favoritism. In contrast to these theories, we argue that categorization acts to enhance and further intergroup processes rather than to define them. More precisely, we propose that categorization aids initial ingroup contact, and that this first contact is predictive of subsequent favoritism at individual and collective levels. This analysis focuses on Virtual Interaction APPLication (VIAPPL) based studies, a software interface that addresses the flaws of the original minimal group studies. The VIAPPL allows the exchange of tokens in an intra- and inter-group manner. This token exchange is how we classified first contact. The study involves binary longitudinal analysis to better understand the subsequent exchanges of individuals based on whom they first interacted with. Studies were selected on the criteria of evidence of explicit first interactions and two-group designs. Our findings paint a compelling picture in support of a motivated contact hypothesis, which suggests that an individual’s first motivated contact toward another has strong predictive capability for future behavior. This contact can lead to habit formation and specific favoritism towards individuals with whom contact has been established. This has important implications for understanding how group conflict occurs, and how intra-group individual bias can develop.

Keywords: categorization, group dynamics, initial contact, minimal social networks, momentary contact

Procedia PDF Downloads 124
2633 Computer-Assisted Management of Building Climate and Microgrid with Model Predictive Control

Authors: Vinko Lešić, Mario Vašak, Anita Martinčević, Marko Gulin, Antonio Starčić, Hrvoje Novak

Abstract:

Accounting for 40% of total world energy consumption, building systems are developing into technically complex, large energy consumers suitable for the application of sophisticated power management approaches that can greatly increase energy efficiency and even make them active energy market participants. A centralized control system for building heating and cooling, managed by economically optimal model predictive control, shows promising results with an estimated 30% increase in energy efficiency. The research is focused on the implementation of such a method in a case study performed on two floors of our faculty building, with wireless sensor data acquisition, remote heating/cooling units, and a central climate controller. Building walls are mathematically modeled with the corresponding material types, surface shapes, and sizes. The models are then exploited to predict thermal characteristics and changes in different building zones. Exterior influences such as environmental conditions and weather forecasts, people's behavior, and comfort demands are all taken into account in deriving price-optimal climate control. Finally, a DC microgrid with photovoltaics, a wind turbine, a supercapacitor, batteries, and fuel cell stacks is added to make the building a unit capable of active participation in a price-varying energy market. The computational burden of applying model predictive control to such a complex system is relaxed through a hierarchical decomposition of the microgrid and climate control, where the former is designed as the higher hierarchical level with pre-calculated price-optimal power flow control, and the latter is designed as the lower-level control responsible for ensuring thermal comfort and exploiting the optimal supply conditions enabled by microgrid energy flow management. Such an approach is expected to enable the inclusion of more complex building subsystems in order to further increase energy efficiency.
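
To give a flavour of the lower-level climate layer, here is a toy price-optimal MPC for a single zone with an assumed first-order thermal model and comfort bounds; the dynamics, tariff, and limits are illustrative placeholders, and the study's actual formulation (detailed wall models, weather forecasts, microgrid coupling) is far richer.

```python
# Toy price-optimal climate MPC for one zone under an assumed first-order model.
import cvxpy as cp
import numpy as np

N = 24                                   # horizon (hours)
a, b = 0.9, 0.05                         # assumed discrete thermal dynamics
T_out = 10 + 5 * np.sin(np.linspace(0, 2 * np.pi, N))
price = 0.1 + 0.1 * (np.arange(N) >= 8)  # higher daytime tariff (illustrative)

T = cp.Variable(N + 1)                   # zone temperature
u = cp.Variable(N, nonneg=True)          # heating power
constraints = [T[0] == 19]
for k in range(N):
    constraints += [T[k + 1] == a * T[k] + (1 - a) * T_out[k] + b * u[k],
                    T[k + 1] >= 21, T[k + 1] <= 24, u[k] <= 50]

cp.Problem(cp.Minimize(price @ u), constraints).solve()
print("optimal heating profile (kW):", np.round(u.value, 2))
```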

Keywords: price-optimal building climate control, microgrid power flow optimisation, hierarchical model predictive control, energy efficient buildings, energy market participation

Procedia PDF Downloads 440
2632 Predicting Machine-Down of Woodworking Industrial Machines

Authors: Matteo Calabrese, Martin Cimmino, Dimos Kapetis, Martina Manfrin, Donato Concilio, Giuseppe Toscano, Giovanni Ciandrini, Giancarlo Paccapeli, Gianluca Giarratana, Marco Siciliano, Andrea Forlani, Alberto Carrotta

Abstract:

In this paper, we describe a machine learning methodology for Predictive Maintenance (PdM) applied to woodworking industrial machines. PdM is a prominent strategy comprising all the operational techniques and actions required to ensure machine availability and to prevent a machine-down failure. One of the challenges with the PdM approach is the design and development of an embedded smart system to monitor the health status of the machine. The proposed approach allows screening multiple connected machines simultaneously, thus providing real-time monitoring that can be integrated with maintenance management. This is achieved by applying temporal feature engineering techniques and training an ensemble of classification algorithms to predict the Remaining Useful Lifetime of woodworking machines. The effectiveness of the methodology is demonstrated by testing on an independent sample of additional woodworking machines without a machine-down event.
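
The snippet below condenses that pattern into a minimal example: per-machine temporal features feed a soft-voting ensemble that outputs a machine-down probability. The sensor columns, aggregation windows, and labels are invented for illustration and are not the paper's feature set.

```python
# Minimal sketch: temporal feature engineering + ensemble classifier for PdM.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier, VotingClassifier

rng = np.random.default_rng(3)
raw = pd.DataFrame({
    "machine_id": np.repeat(np.arange(40), 100),
    "spindle_current": rng.normal(5, 1, 4000),
    "vibration": rng.normal(0.2, 0.05, 4000),
})
# Aggregate statistics per machine as simple temporal features
# (a stand-in for the rolling-window features used in practice).
feats = (raw.groupby("machine_id")
            .agg(cur_mean=("spindle_current", "mean"),
                 cur_std=("spindle_current", "std"),
                 vib_max=("vibration", "max")))
labels = rng.integers(0, 2, size=len(feats))   # 1 = machine-down within horizon

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("gb", GradientBoostingClassifier(random_state=0))],
    voting="soft")
ensemble.fit(feats, labels)
print(ensemble.predict_proba(feats.iloc[:3])[:, 1])   # down-probability per machine
```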

Keywords: predictive maintenance, machine learning, connected machines, artificial intelligence

Procedia PDF Downloads 197
2631 Derivation of a Risk-Based Level of Service Index for Surface Street Network Using Reliability Analysis

Authors: Chang-Jen Lan

Abstract:

The current Level of Service (LOS) index adopted in the Highway Capacity Manual (HCM) for signalized intersections on surface streets is based on the intersection average delay. The delay thresholds for defining LOS grades are subjective and unrelated to critical traffic conditions. For example, an intersection delay of 80 sec per vehicle for the failing LOS grade F does not necessarily correspond to the intersection capacity. Also, a specific measure of average delay may result from delay minimization, delay equality, or other meaningful optimization criteria. To that end, a reliability version of the intersection critical degree of saturation (v/c) is introduced as the LOS index. Traditionally, the level of saturation at a signalized intersection is defined as the ratio of the critical volume sum (per lane) to the average saturation flow (per lane) during all available effective green time within a cycle. The critical sum is the sum of the maximal conflicting movement-pair volumes in the northbound-southbound and eastbound-westbound rights of way. In this study, both movement volume and saturation flow are assumed to follow log-normal distributions. Because, when the conditions of the central limit theorem hold, the product of independent, positive random variables tends toward a log-normal distribution in the limit, the critical degree of saturation is expected to be log-normally distributed as well. Derivation of the risk index predictive limits is complex due to the maximum and absolute value operators, as well as the ratio of random variables. A fairly accurate functional form for the predictive limit at a user-specified significance level is obtained. The predictive limit is then compared with the designated LOS thresholds for the intersection critical degree of saturation (denoted as X
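
As a sanity check on that reasoning, the Monte Carlo sketch below simulates log-normal movement volumes and saturation flows, forms the critical degree of saturation, and reads off an upper predictive limit at a chosen significance level; every parameter value is an assumption chosen only for illustration.

```python
# Monte Carlo illustration of the log-normal critical degree of saturation.
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
# Conflicting movement-pair volumes (veh/h/lane) for N-S and E-W rights of way.
v_ns = rng.lognormal(mean=np.log(400), sigma=0.2, size=(n, 2))
v_ew = rng.lognormal(mean=np.log(350), sigma=0.2, size=(n, 2))
critical_sum = v_ns.max(axis=1) + v_ew.max(axis=1)
sat_flow = rng.lognormal(mean=np.log(1800), sigma=0.1, size=n)   # veh/h/lane
g_over_c = 0.45                                                   # effective green ratio

X = critical_sum / (sat_flow * g_over_c)     # critical degree of saturation
alpha = 0.05
print("mean X:", X.mean().round(3),
      " upper 95% predictive limit:", np.quantile(X, 1 - alpha).round(3))
```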

Keywords: reliability analysis, level of service, intersection critical degree of saturation, risk based index

Procedia PDF Downloads 112
2630 Evaluating the Suitability and Performance of Dynamic Modulus Predictive Models for North Dakota’s Asphalt Mixtures

Authors: Duncan Oteki, Andebut Yeneneh, Daba Gedafa, Nabil Suleiman

Abstract:

Most agencies lack the equipment required to measure the dynamic modulus (|E*|) of asphalt mixtures, necessitating the use of predictive models. This study compared measured |E*| values for nine North Dakota asphalt mixes with predictions from the original Witczak, modified Witczak, and Hirsch models. The influence of temperature on the |E*| models was investigated, and Pavement ME simulations were conducted using measured |E*| and predictions from the most accurate |E*| model. The results revealed that the original Witczak model yielded the lowest Se/Sy and highest R² values, indicating the lowest bias and highest accuracy, while the poorest overall performance was exhibited by the Hirsch model. Using predicted |E*| as inputs in Pavement ME generated conservative distress predictions compared to using measured |E*|. The original Witczak model was recommended for predicting |E*| for low-reliability pavements in North Dakota.
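
For reference, the Se/Sy and R² statistics quoted above can be computed as in the short helper below; the measured and predicted moduli are placeholder numbers, not the study's data, and the degrees-of-freedom convention for Se is a simplifying assumption.

```python
# Goodness-of-fit helper: Se/Sy and R-squared for measured vs. predicted |E*|.
import numpy as np

def se_sy_and_r2(measured, predicted):
    """Se/Sy = standard error of estimate over standard deviation of measured values."""
    measured, predicted = np.asarray(measured, float), np.asarray(predicted, float)
    se = np.sqrt(np.sum((measured - predicted) ** 2) / (len(measured) - 1))
    sy = measured.std(ddof=1)
    r2 = 1 - np.sum((measured - predicted) ** 2) / np.sum((measured - measured.mean()) ** 2)
    return se / sy, r2

E_measured = np.array([2100., 5400., 9800., 14500., 18800.])     # MPa, illustrative
E_predicted = np.array([2300., 5100., 10200., 14000., 19500.])
print("Se/Sy = %.3f, R2 = %.3f" % se_sy_and_r2(E_measured, E_predicted))
```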

Keywords: asphalt mixture, binder, dynamic modulus, MEPDG, pavement ME, performance, prediction

Procedia PDF Downloads 23
2629 Effectiveness of the Lacey Assessment of Preterm Infants to Predict Neuromotor Outcomes of Premature Babies at 12 Months Corrected Age

Authors: Thanooja Naushad, Meena Natarajan, Tushar Vasant Kulkarni

Abstract:

Background: The Lacey Assessment of Preterm Infants (LAPI) is used in clinical practice to identify premature babies at risk of neuromotor impairments, especially cerebral palsy. This study attempted to establish the validity of the Lacey assessment of preterm infants for predicting neuromotor outcomes of premature babies at 12 months corrected age and to compare its predictive ability with that of brain ultrasound. Methods: This prospective cohort study included 89 preterm infants (45 females and 44 males) born below 35 weeks gestation who were admitted to the neonatal intensive care unit of a government hospital in Dubai. Initial assessment was done using the Lacey assessment after the babies reached 33 weeks postmenstrual age. Follow-up assessment of neuromotor outcomes was done at 12 months (± 1 week) corrected age using two standardized outcome measures, i.e., the Infant Neurological International Battery and the Alberta Infant Motor Scale. Brain ultrasound data were collected retrospectively. Data were statistically analyzed, and the diagnostic accuracy of the Lacey assessment of preterm infants (LAPI) was calculated when used alone and in combination with brain ultrasound. Results: In comparison with brain ultrasound, the Lacey assessment showed superior specificity (96% vs. 77%), higher positive predictive value (57% vs. 22%), and a higher positive likelihood ratio (18 vs. 3) for predicting neuromotor outcomes at one year of age. The sensitivity of the Lacey assessment was lower than that of brain ultrasound (66% vs. 83%), whereas specificity was similar (97% vs. 98%). A combination of Lacey assessment and brain ultrasound results showed higher sensitivity (80%), positive (66%) and negative (98%) predictive values, positive likelihood ratio (24), and test accuracy (95%) than the Lacey assessment alone in predicting neurological outcomes. The negative predictive value of the Lacey assessment was similar to that of its combination with brain ultrasound (96%). Conclusion: The results of this study suggest that the Lacey assessment of preterm infants can be used as a supplementary assessment tool for premature babies in the neonatal intensive care unit. Due to its high specificity, the Lacey assessment can be used to identify babies at low risk of abnormal neuromotor outcomes at a later age. When used along with the findings of brain ultrasound, the Lacey assessment has better sensitivity to identify preterm babies at particular risk. These findings have applications in identifying premature babies who may benefit from early intervention services.
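
The reported accuracy figures all derive from a 2x2 table of test result against outcome; the small helper below shows the arithmetic with hypothetical counts (the true cell counts are not given in the abstract).

```python
# Diagnostic accuracy metrics from a 2x2 table (hypothetical counts).
def diagnostic_metrics(tp, fp, fn, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    lr_pos = sens / (1 - spec)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return dict(sensitivity=sens, specificity=spec, ppv=ppv, npv=npv,
                positive_likelihood_ratio=lr_pos, accuracy=accuracy)

# Hypothetical counts: 4 true positives, 3 false positives, 2 false negatives, 80 true negatives.
for name, value in diagnostic_metrics(tp=4, fp=3, fn=2, tn=80).items():
    print(f"{name}: {value:.2f}")
```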

Keywords: brain ultrasound, Lacey assessment of preterm infants, neuromotor outcomes, preterm

Procedia PDF Downloads 112
2628 Psychological Testing in Industrial/Organizational Psychology: Validity and Reliability of Psychological Assessments in the Workplace

Authors: Melissa C. Monney

Abstract:

Psychological testing has been of interest to researchers for many years as a useful tool in assessing and diagnosing various disorders as well as in understanding human behavior. However, for over 20 years now, researchers and laypersons alike have been interested in using psychological tests for other purposes, such as determining factors in employee selection, promotion, and even termination. In recent years, psychological assessments have been useful in facilitating workplace decision processes regarding employee movement within organizations. This literature review explores four of the most commonly used psychological tests in workplace environments, namely cognitive ability, emotional intelligence, integrity, and personality tests, as organizations have used these tests to assess different factors of human behavior as predictive measures of future employee behavior. The findings suggest that, while there is much controversy and debate regarding the validity and reliability of these tests in workplace settings, as they were not originally designed for these purposes, the use of such assessments in the workplace has been useful in decreasing costs and employee turnover as well as increasing job satisfaction by ensuring the right employees are selected for their roles.

Keywords: cognitive ability, personality testing, predictive validity, workplace behavior

Procedia PDF Downloads 216
2627 Risk Propagation in Electricity Markets: Measuring the Asymmetric Transmission of Downside and Upside Risks in Energy Prices

Authors: Montserrat Guillen, Stephania Mosquera-Lopez, Jorge Uribe

Abstract:

An empirical study of market risk transmission between electricity prices in the Nord Pool interconnected market is presented. Crucially, risk propagation is differentiated between the two tails of the price variation distribution; thus, downside risk spillovers are distinguished from upside risk spillovers. The results document an asymmetric nature of risk and risk propagation in the two tails of the electricity price log variations. Risk spillovers following price increments in the market are transmitted to a larger extent than those following price reductions. Also documented are asymmetries related both to the size of the transaction area and to whether a given area behaves as a net exporter or net importer of electricity. For instance, on the one hand, the bigger the area of the transaction, the smaller the size of the volatility shocks that it receives. On the other hand, exporters of electricity, alongside countries with a significant dependence on renewable sources, tend to be net transmitters of volatility to the rest of the system. Additionally, insights on the predictive power of positive and negative semivariances for future market volatility are provided. It is shown that, depending on the forecasting horizon, downside and upside shocks to the market are featured by a distinctive persistence, and that upside volatility impacts more on net importers of electricity, while the opposite holds for net exporters.
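
For readers unfamiliar with the semivariance decomposition, the short example below computes realized variance and its downside/upside semivariance components from log price variations; the simulated price path is a placeholder for actual Nord Pool series.

```python
# Realized variance and its downside/upside semivariance decomposition.
import numpy as np

rng = np.random.default_rng(5)
prices = 40 * np.exp(np.cumsum(rng.normal(0, 0.02, 1000)))   # simulated hourly spot prices
r = np.diff(np.log(prices))                                   # log variations

rv = np.sum(r ** 2)                       # realized variance
rs_down = np.sum(r[r < 0] ** 2)           # negative (downside) semivariance
rs_up = np.sum(r[r >= 0] ** 2)            # positive (upside) semivariance
print(f"RV={rv:.4f}, RS-={rs_down:.4f}, RS+={rs_up:.4f}, check={rs_down + rs_up:.4f}")
```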

Keywords: electricity prices, realized volatility, semivariances, volatility spillovers

Procedia PDF Downloads 151
2626 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights

Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan

Abstract:

The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses. To effectively analyze huge datasets, efficient NoSQL databases are needed. The analysis of post-COVID-19 health and well-being outcomes and the evaluation of the effectiveness of government efforts during the pandemic are made possible by this research’s integration of several datasets, which cuts down on query processing time and creates predictive visual artifacts. We recommend applying sharding and indexing technologies to improve query effectiveness and scalability as the dataset expands. Effective data retrieval and analysis are made possible by distributing the datasets across a sharded database and indexing the individual shards. Analysis of the connections between governmental activities, poverty levels, and post-pandemic well-being is the key goal. We aim to evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels by utilising advanced data analysis and visualisations. The findings provide relevant data that supports the advancement of UN sustainable development objectives, future pandemic preparation, and evidence-based decision-making. This study shows how Big Data and NoSQL databases may be used to address problems in global health.
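
As a concrete illustration of the recommended setup, the sketch below creates a compound index and shards a hypothetical covid.cases collection with pymongo; the field names, shard key, and connection string are assumptions, and the sharding commands only succeed when issued against a mongos router of a sharded cluster.

```python
# Indexing and sharding sketch with pymongo (hypothetical database and fields).
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")   # placeholder connection string
cases = client["covid"]["cases"]

# Compound index to speed up typical country/date range queries.
cases.create_index([("country", ASCENDING), ("report_date", ASCENDING)])

# Distribute the collection across shards on a hashed country key.
client.admin.command("enableSharding", "covid")
client.admin.command("shardCollection", "covid.cases", key={"country": "hashed"})

print(cases.index_information())
```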

Keywords: big data, COVID-19, health, indexing, NoSQL, sharding, scalability, well-being

Procedia PDF Downloads 43
2625 Identification and Force Control of a Two Chambers Pneumatic Soft Actuator

Authors: Najib K. Dankadai, Ahmad 'Athif Mohd Faudzi, Khairuddin Osman, Muhammad Rusydi Muhammad Razif, IIi Najaa Aimi Mohd Nordin

Abstract:

Research on soft actuators is now growing rapidly because of their suitability for application in sectors such as medicine, agriculture, biology, and welfare. This paper presents system identification (SI) and control of the force generated by a two-chamber pneumatic soft actuator (PSA). A mathematical model of the actuator force was identified experimentally using a data acquisition card and the MATLAB SI toolbox. Two control techniques, predictive functional control (PFC) and a conventional proportional-integral-derivative (PID) scheme, are proposed and compared based on the identified model of the soft actuator's flexible mechanism. The results of this study showed that both proposed controllers ensure accurate tracking when the closed-loop system is tested with step, sinusoidal, and multi-step reference inputs in MATLAB simulation, although the PFC provides a better response than the PID.
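
To make the comparison concrete for the simpler of the two schemes, the following is a bare-bones discrete PID loop tracking a step force reference on an assumed first-order actuator model; the plant parameters and gains are placeholders, and the PFC counterpart is not shown.

```python
# Discrete PID loop on an assumed first-order force model of the actuator.
K, tau, Ts = 2.0, 0.15, 0.01          # assumed identified gain, time constant, sample time
Kp, Ki, Kd = 1.2, 6.0, 0.01           # illustrative PID gains

y, integ, prev_err = 0.0, 0.0, 0.0
ref = 5.0                             # desired force (N)
for k in range(300):
    err = ref - y
    integ += err * Ts
    deriv = (err - prev_err) / Ts
    u = Kp * err + Ki * integ + Kd * deriv
    prev_err = err
    # First-order plant update: tau*dy/dt + y = K*u (forward Euler)
    y += Ts / tau * (K * u - y)
print("force after 3 s:", round(y, 3))
```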

Keywords: predictive functional control (PFC), proportional integral and derivative (PID), soft actuator, system identification

Procedia PDF Downloads 294
2624 Predicting Match Outcomes in Team Sport via Machine Learning: Evidence from National Basketball Association

Authors: Jacky Liu

Abstract:

This paper develops a team sports outcome prediction system with potential for wide-ranging applications across various disciplines. Despite significant advancements in predictive analytics, existing studies in sports outcome prediction possess considerable limitations, including insufficient feature engineering and underutilization of advanced machine learning techniques, among others. To address these issues, we extend the Sports Cross Industry Standard Process for Data Mining (SRP-CRISP-DM) framework and propose a unique, comprehensive predictive system, using National Basketball Association (NBA) data as an example to test this extended framework. Our approach follows a holistic methodology in feature engineering, employing both time series and non-time-series data, as well as conducting exploratory data analysis and feature selection. Furthermore, we contribute to the discourse on target variable choice in team sports outcome prediction, asserting that point spread prediction yields higher profits than game-winner prediction. Using machine learning algorithms, particularly XGBoost, results in a significant improvement in the predictive accuracy of team sports outcomes. Applied to point spread betting strategies, it offers an astounding annual return of approximately 900% on an initial investment of $100. Our findings not only contribute to the academic literature but have critical practical implications for sports betting. Our study advances the understanding of team sports outcome prediction, a burgeoning area in complex system prediction, and paves the way for potential profitability and more informed decision making in sports betting markets.
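
As a minimal sketch of the point-spread regression idea, the code below trains an XGBoost regressor on synthetic stand-ins for engineered game features; none of the features, data, or error figures correspond to the study's actual pipeline.

```python
# Point-spread regression sketch with XGBoost (synthetic data only).
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(6)
X = rng.normal(size=(1500, 20))                 # e.g. rolling team stats, rest days, ratings
spread = X[:, 0] * 4 + X[:, 1] * 2 + rng.normal(0, 6, 1500)   # home minus away points

X_tr, X_te, y_tr, y_te = train_test_split(X, spread, test_size=0.2, random_state=0)
model = xgb.XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(X_tr, y_tr)
print("MAE on held-out games:", round(mean_absolute_error(y_te, model.predict(X_te)), 2))
```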

Keywords: machine learning, team sports, game outcome prediction, sports betting, profits simulation

Procedia PDF Downloads 68
2623 Improved Predictive Models for the IRMA Network Using Nonlinear Optimisation

Authors: Vishwesh Kulkarni, Nikhil Bellarykar

Abstract:

Cellular complexity stems from the interactions among thousands of different molecular species. Thanks to the emerging fields of systems and synthetic biology, scientists are beginning to unravel these regulatory, signaling, and metabolic interactions and to understand their coordinated action. Reverse engineering of biological networks has several benefits, but poor data quality combined with the difficulty of reproducing it limits the applicability of these methods. A few years back, many of the commonly used predictive algorithms were tested on a network constructed in the yeast Saccharomyces cerevisiae (S. cerevisiae) to resolve this issue. The network was a synthetic network of five genes regulating each other, built for the so-called in vivo reverse-engineering and modeling assessment (IRMA). The network was constructed in S. cerevisiae since it is a simple and well-characterized organism. The synthetic network included a variety of regulatory interactions, thus capturing the behaviour of larger eukaryotic gene networks on a smaller scale. We derive a new set of algorithms by solving a nonlinear optimization problem and show how these algorithms outperform other algorithms on these datasets.

Keywords: synthetic gene network, network identification, optimization, nonlinear modeling

Procedia PDF Downloads 130
2622 Transfer of Business Anti-Corruption Norms in Developing Countries: A Case Study of Vietnam

Authors: Candice Lemaitre

Abstract:

During the 1990s, an alliance of international intergovernmental and non-governmental organizations proposed a set of regulatory norms designed to reduce corruption. Many governments in developing countries, such as Vietnam, enacted these global anti-corruption norms into their domestic law. This article draws on empirical research to understand why these anti-corruption norms have failed to reduce corruption in Vietnam and many other developing countries. Rather than investigating state compliance with global anti-corruption provisions, a topic that has already attracted considerable attention, this article aims to explore the comparatively under-researched area of business compliance. Based on data collected from semi-structured interviews with business managers in Vietnam and archival research, this article examines how businesses in Vietnam interpret and comply with global anti-corruption norms. It investigates why different types of companies in Vietnam engage with and respond to these norms in different ways. This article suggests that global anti-corruption norms have not been effective in reducing corruption in Vietnam because there is fragmentation in the way companies in Vietnam interpret and respond to these norms. This fragmentation results from differences in the epistemic (or interpretive) communities that companies draw upon to interpret global anti-corruption norms. This article uses discourse analysis to understand how the communities interpret global anti-corruption norms. This investigation aims to generate some predictive insights into how companies are likely to respond to anti-corruption regimes based on global anti-corruption norms.

Keywords: anti-corruption, business law, legal transfer, Vietnam

Procedia PDF Downloads 136
2629 Comparison of Two Neural Networks to Model Margarine Age and Predict Shelf-Life Using MATLAB

Authors: Phakamani Xaba, Robert Huberts, Bilainu Oboirien

Abstract:

The present study was aimed at developing and comparing two neural-network-based predictive models to predict the shelf-life/product age of South African margarine, using free fatty acid (FFA), water droplet size (D3.3), water droplet distribution (e-sigma), moisture content, peroxide value (PV), anisidine value (AnV), and total oxidation (totox) value as input variables to the model. Brick margarine products with ages ranging from fresh, i.e. week 0, to week 47 were sourced. The brick margarine products had been stored at 10 and 25 °C and were characterized. JMP and MATLAB models to predict shelf-life/margarine age were developed, and their performances were compared. The key performance indicators used to evaluate the model performances were the correlation coefficient (CC), root mean square error (RMSE), and mean absolute percentage error (MAPE) relative to the actual data. The MATLAB-developed model showed a better performance on all three performance indicators. The correlation coefficient of the MATLAB model was 99.86% versus 99.74% for the JMP model, the RMSE was 0.720 compared to 1.005, and the MAPE was 7.4% compared to 8.571%. The MATLAB model was selected as the most accurate, and then the number of hidden neurons/nodes was optimized to develop a single predictive model. The optimized MATLAB model with 10 neurons showed a better performance compared to the models with 1 and 5 hidden neurons. The developed models can be used by margarine manufacturers, food research institutions, researchers, etc., to predict shelf-life/margarine product age, optimize the addition of antioxidants, extend the shelf-life of products, and proactively troubleshoot problems related to changes which have an impact on the shelf-life of margarine, without conducting expensive trials.
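
A simplified Python analogue of the hidden-neuron comparison is sketched below (the study itself used MATLAB and JMP); the seven input columns mimic the variables listed above, but the data, network settings, and resulting errors are synthetic and purely illustrative.

```python
# Compare small feed-forward networks with 1, 5, and 10 hidden neurons
# on a synthetic shelf-life (product age) regression task.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, mean_absolute_percentage_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
X = rng.uniform(size=(300, 7))          # FFA, D3.3, e-sigma, moisture, PV, AnV, totox
age_weeks = 47 * (0.4 * X[:, 0] + 0.3 * X[:, 4] + 0.3 * X[:, 6]) + rng.normal(0, 1, 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, age_weeks, test_size=0.25, random_state=0)
for n_hidden in (1, 5, 10):
    net = MLPRegressor(hidden_layer_sizes=(n_hidden,), max_iter=5000, random_state=0)
    net.fit(X_tr, y_tr)
    pred = net.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    mape = 100 * mean_absolute_percentage_error(y_te + 1, pred + 1)  # shift avoids division by ~0
    print(f"{n_hidden:>2} neurons: RMSE={rmse:.2f} weeks, MAPE={mape:.1f}%")
```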

Keywords: margarine shelf-life, predictive modelling, neural networks, oil oxidation

Procedia PDF Downloads 168
2620 Dual-Channel Reliable Breast Ultrasound Image Classification Based on Explainable Attribution and Uncertainty Quantification

Authors: Haonan Hu, Shuge Lei, Dasheng Sun, Huabin Zhang, Kehong Yuan, Jian Dai, Jijun Tang

Abstract:

This paper focuses on the classification task of breast ultrasound images and conducts research on the reliability measurement of the classification results. A dual-channel evaluation framework was developed based on the proposed inference reliability and predictive reliability scores. For the inference reliability evaluation, human-aligned and doctor-agreed inference rationales based on the improved feature attribution algorithm SP-RISA are applied. Uncertainty quantification is used to evaluate the predictive reliability via test-time enhancement. The effectiveness of this reliability evaluation framework has been verified on the breast ultrasound clinical dataset YBUS, and its robustness is verified on the public dataset BUSI. The expected calibration errors on both datasets are significantly lower than those of traditional evaluation methods, which demonstrates the effectiveness of the proposed reliability measurement.
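
Since the expected calibration error (ECE) is the headline metric here, the stand-alone helper below shows one common way to compute it for a binary classifier by binning predicted probabilities; the binning scheme and the random predictions are illustrative choices, not the paper's exact protocol.

```python
# Expected calibration error (ECE) for binary probabilities via equal-width bins.
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    probs, labels = np.asarray(probs), np.asarray(labels)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (probs > lo) & (probs <= hi)
        if mask.any():
            acc = labels[mask].mean()          # observed frequency of the positive class
            conf = probs[mask].mean()          # average predicted confidence in the bin
            ece += mask.mean() * abs(acc - conf)
    return ece

rng = np.random.default_rng(8)
p = rng.uniform(size=1000)                     # predicted probability of "malignant"
y = rng.binomial(1, p)                         # labels consistent with those probabilities
print("ECE:", round(expected_calibration_error(p, y), 4))
```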

Keywords: medical imaging, ultrasound imaging, XAI, uncertainty measurement, trustworthy AI

Procedia PDF Downloads 61
2619 The Predictive Value of Serum Bilirubin in the Post-Transplant De Novo Malignancy: A Data Mining Approach

Authors: Nasim Nosoudi, Amir Zadeh, Hunter White, Joshua Conrad, Joon W. Shim

Abstract:

De novo malignancy has become one of the major causes of death after transplantation, so early cancer diagnosis and detection can drastically improve survival rates post-transplantation. Most previous work focuses on using artificial intelligence (AI) to predict transplant success or failure outcomes. In this work, we focused on predicting de novo malignancy after liver transplantation using AI. We selected patients who developed malignancy after liver transplantation and had no history of malignancy pre-transplant. Their donors were cancer-free as well. We analyzed 254,200 patient profiles with post-transplant malignancy from the US Organ Procurement and Transplantation Network (OPTN). Several popular data mining methods were applied to the resultant dataset to build predictive models to characterize de novo malignancy after liver transplantation. The recipient's bilirubin, creatinine, weight, gender, number of days on the transplant waiting list, Epstein-Barr virus (EBV) status, international normalized ratio (INR), and ascites are among the most important factors affecting de novo malignancy after liver transplantation.

Keywords: De novo malignancy, bilirubin, data mining, transplantation

Procedia PDF Downloads 79
2618 The Investigation of Predictor Affect of Childhood Trauma, Dissociation, Alexithymia, and Gender on Dissociation in University Students

Authors: Gizem Akcan, Erdinc Ozturk

Abstract:

The purpose of the study was to determine some psychosocial variables that predict dissociation in university students. These psychosocial variables were perceived childhood trauma, alexithymia, and gender. 150 (75 male, 75 female) university students (bachelor, master, and postgraduate) were enrolled in this study. They were chosen from universities in Istanbul during the 2016-2017 academic year. The Dissociative Experiences Scale (DES), the Childhood Trauma Questionnaire (CTQ), and the Toronto Alexithymia Scale were used to assess the related variables. A demographic information form was given to the students in order to collect their demographic information. Frequency distribution, linear regression analysis, and t-tests were used for statistical analysis. Childhood trauma and alexithymia were found to have predictive value for dissociation among university students. However, the physical abuse, physical neglect, and emotional neglect sub-dimensions of childhood trauma and the externally-oriented thinking sub-dimension of alexithymia did not have predictive value for dissociation. Moreover, there was no significant difference between males and females in terms of participants' dissociation scores.

Keywords: childhood trauma, dissociation, alexithymia, gender

Procedia PDF Downloads 366
2617 Performance and Limitations of Likelihood Based Information Criteria and Leave-One-Out Cross-Validation Approximation Methods

Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer

Abstract:

Model assessment, in the Bayesian context, involves evaluation of the goodness-of-fit and the comparison of several alternative candidate models for predictive accuracy and improvements. In posterior predictive checks, the data simulated under the fitted model are compared with the actual data. Predictive model accuracy is estimated using information criteria such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the Deviance information criterion (DIC), and the Watanabe-Akaike information criterion (WAIC). The goal of an information criterion is to obtain an unbiased measure of out-of-sample prediction error. Since posterior checks use the data twice, once for model estimation and once for testing, a bias correction which penalises model complexity is incorporated in these criteria. Cross-validation (CV) is another method used for examining out-of-sample prediction accuracy. Leave-one-out cross-validation (LOO-CV) is the most computationally expensive variant among the CV methods, as it fits as many models as the number of observations. Importance sampling (IS), truncated importance sampling (TIS), and Pareto-smoothed importance sampling (PSIS) are generally used as approximations to the exact LOO-CV and utilise the existing MCMC results, avoiding expensive recomputation. The reciprocals of the predictive densities calculated over posterior draws for each observation are treated as the raw importance weights. These are in turn used to calculate the approximate LOO-CV of the observation as a weighted average of posterior densities. In IS-LOO, the raw weights are used directly. In contrast, the larger weights are replaced by their modified truncated weights in calculating TIS-LOO and PSIS-LOO. Although information criteria and LOO-CV are unable to reflect the goodness-of-fit in an absolute sense, the differences can be used to measure the relative performance of the models of interest. However, the use of these measures is only valid under specific circumstances. This study developed 11 models using normal, log-normal, gamma, and Student's t distributions to improve PCR stutter prediction with forensic data. These models comprise four with profile-wide variances, four with locus-specific variances, and three which are two-component mixture models. The mean stutter ratio in each model is modeled as a locus-specific simple linear regression against a feature of the alleles under study known as the longest uninterrupted sequence (LUS). The use of AIC, BIC, DIC, and WAIC in model comparison has some practical limitations. Even though IS-LOO, TIS-LOO, and PSIS-LOO are considered to be approximations of the exact LOO-CV, the study observed some drastic deviations in the results. However, there are some interesting relationships among the logarithms of pointwise predictive densities (lppd) calculated under WAIC and the LOO approximation methods. The estimated overall lppd is a relative measure that reflects the overall goodness-of-fit of the model. Parallel log-likelihood profiles for the models, conditional on equal posterior variances in lppds, were observed. This study illustrates the limitations of the information criteria in practical model comparison problems. In addition, the relationships among the LOO-CV approximation methods and WAIC, with their limitations, are discussed. Finally, useful recommendations that may help in practical model comparisons with these methods are provided.
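
A small numerical illustration of the IS-LOO construction described here is given below: with raw importance weights equal to the reciprocal predictive densities, the pointwise LOO predictive density reduces to the harmonic mean of the per-draw densities. The simulated log-likelihood draws are placeholders, not output from the stutter models.

```python
# IS-LOO from a matrix of pointwise log-likelihoods over posterior draws.
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(9)
S, N = 4000, 50
log_lik = rng.normal(loc=-1.0, scale=0.3, size=(S, N))   # log p(y_i | theta_s), simulated

# IS-LOO pointwise: log( sum_s w_s p_is / sum_s w_s ) with w_s = 1 / p_is,
# i.e. the log harmonic mean of the per-draw predictive densities.
loo_pointwise = -(logsumexp(-log_lik, axis=0) - np.log(S))
elpd_is_loo = loo_pointwise.sum()

# lppd (as used inside WAIC) for comparison: log of the plain posterior mean density.
lppd = (logsumexp(log_lik, axis=0) - np.log(S)).sum()
print(f"elpd (IS-LOO) = {elpd_is_loo:.2f},  lppd = {lppd:.2f}")
```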

Keywords: cross-validation, importance sampling, information criteria, predictive accuracy

Procedia PDF Downloads 368
2616 Design and Development of Real-Time Optimal Energy Management System for Hybrid Electric Vehicles

Authors: Masood Roohi, Amir Taghavipour

Abstract:

This paper describes a strategy to develop an energy management system (EMS) for a charge-sustaining power-split hybrid electric vehicle. This kind of hybrid electric vehicle (HEV) benefits from the advantages of both parallel and series architectures. However, it becomes relatively more complicated to manage the power flow between the battery and the engine optimally. The strategy applied in this paper is based on a nonlinear model predictive control approach. First, an appropriate control-oriented model, which was accurate enough yet simple, was derived. Towards utilization of this controller in real time, the problem was solved off-line for a wide range of reference signals and initial conditions, and the computed manipulated variables were stored in look-up tables. Look-up tables require only a small amount of memory. Also, the computational load decreased dramatically, because to find the required manipulated variables the controller only needs a simple interpolation between the tables.
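
A toy version of the look-up-table deployment is sketched below: an off-line "optimal" map over a grid of battery state of charge and power demand is replaced here by a dummy control law, and the on-line step is a single interpolation. The grid ranges and stored values are illustrative assumptions, not results of the paper's NMPC.

```python
# Off-line table of pre-computed control actions + on-line interpolation.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

soc_grid = np.linspace(0.3, 0.9, 13)           # battery state of charge
pdem_grid = np.linspace(0.0, 60.0, 31)         # power demand (kW)

# Placeholder for the off-line optimal engine-power map an NMPC solver would produce.
SOC, PDEM = np.meshgrid(soc_grid, pdem_grid, indexing="ij")
engine_power_table = np.clip(PDEM - 40.0 * (SOC - 0.6), 0.0, 60.0)

lookup = RegularGridInterpolator((soc_grid, pdem_grid), engine_power_table)

# On-line step: a single interpolation replaces the optimisation.
print("engine power command (kW):", round(lookup([[0.55, 22.0]])[0], 2))
```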

Keywords: hybrid electric vehicles, energy management system, nonlinear model predictive control, real-time

Procedia PDF Downloads 320
2615 Drivers of E-Participation: Case of Saudi Arabia

Authors: R. Alrashedi, A. Persaud

Abstract:

This study provides insights into the readiness of users to participate in e-government activities in Saudi Arabia. A user-centric model of e-participation is developed based on a review of the literature and empirically tested. The findings are based on an online survey of a sample of 200 Saudi citizens and residents living in Saudi Arabia. The study found that trust in the government, attitude towards e-participation, e-participation through the use of social media, and social influence and social identity positively influence e-participation, while the perceived benefits of e-government are negatively related to e-participation. This study contributes to the literature by providing empirical evidence of the drivers of e-participation. The study also provides insights that could be used by policymakers to increase the level of e-participation in Saudi Arabia.

Keywords: e-government, e-participation, social media, trust, social influence and social identity

Procedia PDF Downloads 434
2614 Exploring Unexplored Horizons: Innovative Applications of Applied Fluid Mechanics in Sustainable Energy

Authors: Elvira S. Castillo, Surupa Shaw

Abstract:

This paper delves into the uncharted territories of innovative applications of applied fluid mechanics in sustainable energy. By exploring the intersection of fluid mechanics principles with renewable energy technologies, the study uncovers untapped potential and novel solutions. Through theoretical analyses, the research investigates how fluid dynamics can be strategically leveraged to enhance the efficiency and sustainability of renewable energy systems. The findings contribute to expanding the discourse on sustainable energy by presenting innovative perspectives and practical insights. This paper serves as a guide for future research endeavors and offers valuable insights for implementing advanced methodologies and technologies to address global energy challenges.

Keywords: fluid mechanics, sustainable energy, sustainable practices, renewable energy

Procedia PDF Downloads 20
2613 Development of a Predictive Model to Prevent Financial Crisis

Authors: Tengqin Han

Abstract:

Delinquency has been a crucial factor in economics throughout the years. Commonly seen in credit cards and mortgages, it played a crucial role in causing the most recent financial crisis, in 2008. In each case, a delinquency is a sign that the borrower is unable to pay off the debt, and it may thus cause a loss of property in the end. Individually, one case of delinquency seems unimportant compared to the entire credit system. In China, as an emerging economic entity, national strength and economic strength have grown rapidly, and the gross domestic product (GDP) growth rate has remained as high as 8% over the past decades. However, potential risks exist behind the appearance of prosperity. Among these risks, the credit system is the most significant one. Due to the long term and large balance of mortgages, it is critical to monitor the risk during the performance period. In this project, about 300,000 mortgage account records are analyzed in order to develop a predictive model to predict the probability of delinquency. Through univariate analysis, the data are cleaned up, and through bivariate analysis, the variables with strong predictive power are detected. The project is divided into two parts. In the first part, the 2005 analysis data are split into two parts, 60% for model development and 40% for in-time model validation. The KS for model development is 31, and the KS for in-time validation is 31, indicating the model is stable. In addition, the model is further validated by out-of-time validation, which uses 40% of the 2006 data, and the KS is 33. This indicates the model is still stable and robust. In the second part, the model is improved by the addition of macroeconomic indexes, including GDP, the consumer price index, the unemployment rate, the inflation rate, etc. The data from 2005 to 2010 are used for model development and validation. Compared with the base model (without macroeconomic variables), the KS is increased from 41 to 44, indicating that the macroeconomic variables can be used to improve the separation power of the model and make the prediction more accurate.
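
For readers unfamiliar with the KS statistic used throughout, the snippet below computes it as the maximum gap between the cumulative score distributions of non-delinquent and delinquent accounts; the simulated scores and delinquency rate are placeholders, not the project's mortgage data.

```python
# KS separation statistic between "good" and "bad" score distributions (0-100 scale).
import numpy as np

def ks_statistic(scores, bad_flag):
    order = np.argsort(scores)
    bad = np.asarray(bad_flag)[order]
    cum_bad = np.cumsum(bad) / bad.sum()
    cum_good = np.cumsum(1 - bad) / (len(bad) - bad.sum())
    return 100 * np.max(np.abs(cum_bad - cum_good))

rng = np.random.default_rng(10)
n = 10_000
bad = rng.binomial(1, 0.08, n)                       # assumed 8% delinquency rate
score = rng.normal(loc=600 - 40 * bad, scale=50)     # bads score lower on average
print("KS:", round(ks_statistic(score, bad), 1))
```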

Keywords: delinquency, mortgage, model development, model validation

Procedia PDF Downloads 202
2612 Online Learning for Modern Business Models: Theoretical Considerations and Algorithms

Authors: Marian Sorin Ionescu, Olivia Negoita, Cosmin Dobrin

Abstract:

This scientific communication reports and discusses learning models adaptable to modern business problems, as well as models specific to digital concepts and paradigms. In the PAC (probably approximately correct) learning model approach, the learning process begins by receiving a batch of learning examples; this set of examples is used to acquire a hypothesis, and once learning is complete, the hypothesis is used to predict new operational examples. For complex business problems, many models should be introduced and evaluated to estimate the induced results, so that the totality of the results can be used to develop a predictive rule that anticipates the choice of new models. In contrast, for online learning-type processes, there is no separation between the learning (training) phase and the predictive phase. Every time a business model is approached, it is first treated as a test example, and a prediction is made before the outcome considered correct from the point of view of the business decision is known. After that part of the business model is chosen, the label with the logical value "true" becomes known. The business models are then used as learning (training) examples, which helps to improve the prediction mechanisms for future business models.
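
A tiny sketch of the online regime in code is given below (stochastic-gradient logistic regression updated one example at a time): the learner predicts before the true label is revealed, then updates. The features, labels, and learning rate are synthetic placeholders.

```python
# Online learning loop: predict on each incoming case, then update on its label.
import numpy as np

rng = np.random.default_rng(11)
w = np.zeros(3)
lr, mistakes = 0.1, 0

for t in range(1000):
    x = rng.normal(size=3)                       # features of the incoming business case
    y = 1 if x[0] + 0.5 * x[1] > 0 else 0        # true (initially unknown) outcome
    p = 1 / (1 + np.exp(-w @ x))                 # predict before the label is revealed
    mistakes += int((p > 0.5) != y)
    w += lr * (y - p) * x                        # update once the label is known

print("online mistakes over 1000 rounds:", mistakes)
```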

Keywords: machine learning, business models, convex analysis, online learning

Procedia PDF Downloads 118