Search results for: predictive analytics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1244

344 The Determinants of Corporate Hedging Strategy

Authors: Ademola Ajibade

Abstract:

Previous studies have explored several rationales for hedging strategies, but the evidence they provide remains ambiguous. Using a hand-collected dataset of 2,460 observations of non-financial firms in eight African countries covering 2013-2022, this paper investigates the determinants and extent of corporate hedge use. In particular, it focuses on the link between country-specific conditions and the corporate hedging behaviour of firms. To our knowledge, this is the first African study to investigate the association between country-specific factors and corporate hedging policy. Evidence from both univariate and multivariate analyses reveals that country-level corruption and government quality are important indicators of the decision to hedge and the extent of hedge use among African firms. The role of country-specific factors as a rationale for corporate hedge use is stronger for firms located in highly corrupt countries, suggesting that firms in corrupt countries are more motivated to hedge because of the larger exposure they face. In addition, we test the risk management theories and observe that CEOs' educational qualifications and experience shape corporate hedging behaviour. We use lagged variables in a panel data setting to address endogeneity concerns and include an interaction term between governance indices and firm-specific variables to test for robustness. Overall, our findings reveal that institutional factors shape risk management decisions and have predictive power in explaining corporate hedging strategy.

Keywords: corporate hedging, governance quality, corruption, derivatives

Procedia PDF Downloads 70
343 Prediction of Damage to Cutting Tools in an Earth Pressure Balance Tunnel Boring Machine (EPB TBM): A Case Study of Guadalajara Metro Line 3 (Mexico)

Authors: Silvia Arrate, Waldo Salud, Eloy París

Abstract:

The wear of cutting tools is one of the most decisive elements when planning tunneling works, programming maintenance stops, and sizing the optimum stock of spare parts over the course of the excavation. Being able to predict the behavior of cutting tools can give a very competitive advantage in terms of costs and excavation performance, optimized to the needs of the TBM itself. The rapid evolution of data science in recent years makes it possible to analyze the key and most critical machinery parameters in order to know how the cutting head is performing against the excavated ground. Taking Metro Line 3 of Guadalajara, Mexico, as a case study, this work assesses the feasibility of using Specific Energy versus data science applied to parameters such as Torque, Penetration, and Contact Force to predict the behavior and status of cutting tools. The results obtained through both techniques are analyzed and verified as a function of the wear and the field conditions observed in the excavation, in order to determine their effectiveness regarding predictive capacity. In conclusion, the application of digital tools and calculation algorithms to the analysis of cutting head wear, compared to purely empirical methods, allows early detection of possible damage to cutting tools, which is reflected in optimized excavation performance and a significant improvement in costs and deadlines.

Keywords: cutting tools, data science, prediction, TBM, wear

Procedia PDF Downloads 32
342 Evaluation of Firearm Injury Syndromic Surveillance in Utah

Authors: E. Bennion, A. Acharya, S. Barnes, D. Ferrell, S. Luckett-Cole, G. Mower, J. Nelson, Y. Nguyen

Abstract:

Objective: This study aimed to evaluate the validity of a firearm injury query in the Early Notification of Community-based Epidemics syndromic surveillance system. Syndromic surveillance data are used at the Utah Department of Health for early detection of and rapid response to unusually high rates of violence and injury, among other health outcomes. The query of interest was defined by the Centers for Disease Control and Prevention and used chief complaint and discharge diagnosis codes to capture initial emergency department encounters for firearm injury of all intents. Design: Two epidemiologists manually reviewed electronic health records of emergency department visits captured by the query from April-May 2020, compared results, and sent conflicting determinations to two arbiters. Results: Of the 85 unique records captured, 67 were deemed probable, 19 were ruled out, and two were undetermined, resulting in a positive predictive value of 75.3%. Common reasons for false positives included non-initial encounters and misleading keywords. Conclusion: Improving the validity of syndromic surveillance data would better inform outbreak response decisions made by state and local health departments. The firearm injury definition could be refined to exclude non-initial encounters by negating words such as “last month,” “last week,” and “aftercare”; and to exclude non-firearm injury by negating words such as “pellet gun,” “air gun,” “nail gun,” “bullet bike,” and “exit wound” when a firearm is not mentioned.
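The positive predictive value reported here is the fraction of query-captured records confirmed as true firearm-injury encounters; a minimal sketch of that calculation (the study's exact denominator handling is an assumption, and the counts below are illustrative):

```python
def positive_predictive_value(confirmed: int, total_flagged: int) -> float:
    # PPV = records confirmed as true firearm injuries / all records the query captured
    return confirmed / total_flagged

# Illustrative counts, not the study's exact denominator handling:
print(round(100 * positive_predictive_value(67, 89), 1))  # → 75.3
```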

Keywords: evaluation, health information system, firearm injury, syndromic surveillance

Procedia PDF Downloads 156
341 Different Approaches to Teaching a Database Course to Undergraduate and Graduate Students

Authors: Samah Senbel

Abstract:

Database design is a fundamental part of the Computer Science and Information Technology curricula in any school, as well as in the study of management, business administration, and data analytics. In this study, we compare the performance of two groups of students studying the same database design and implementation course at Sacred Heart University in the fall of 2018. Both courses used the same textbook and were taught by the same professor: one to seven graduate students and one to 26 undergraduate students (juniors). The undergraduate students were around 20 years old with little work experience, while the graduate students averaged 35 years old and were all employed in computer- or management-related jobs. The textbook used was 'Database Systems: Design, Implementation, and Management' by Coronel and Morris, and the course was designed to follow the textbook at roughly a chapter per week. The first six weeks covered the design aspect of a database, followed by a paper exam. The next six weeks covered the implementation of the database using SQL, followed by a lab exam. Since the undergraduate students were on a 16-week semester, we spent the last three weeks of their course covering NoSQL; this part of the course was not included in this study. After the course was over, we analyzed the results of the two groups. An interesting discrepancy was observed: in the database design part of the course, the average grade of the graduate students was 92%, while that of the undergraduate students was 77% on the same exam. In the implementation part of the course, we observed the opposite: the average grade of the graduate students was 65%, while that of the undergraduate students was 73%. The overall grades were quite similar: the graduate average was 78% and the undergraduate average was 75%. Based on these results, we concluded that having both classes follow the same schedule was not beneficial, and an adjustment was needed: the graduates could spend less time on design, and the undergraduates would benefit from more design time. In the fall of 2019, 30 students registered for the undergraduate course and 15 students registered for the graduate course. To test our conclusion, the undergraduates spent about 67% of class time (eight classes) on the design part of the course and 33% (four classes) on the implementation part, using the same exams as the previous year. This improved their average grade on the design part from 77% to 83% and their implementation average from 73% to 79%. In conclusion, we recommend using two separate schedules for teaching the database design course. For undergraduate students, it is important to spend more time on the design part rather than the implementation part of the course, while for the older graduate students, we recommend spending more time on the implementation part, as that is the part they struggle with, even though they have a stronger grasp of the design component of databases.

Keywords: computer science education, database design, graduate and undergraduate students, pedagogy

Procedia PDF Downloads 103
340 Consumer Value and Purchase Behaviour: The Mediating Role of Consumers' Expectations of Corporate Social Responsibility in Durban, South Africa

Authors: Abosede Ijabadeniyi, Jeevarathnam P. Govender

Abstract:

Prevailing strategic Corporate Social Responsibility (CSR) research is predominantly centred on the predictive implications of the construct for behavioural outcomes, which limits the depth of our understanding of the trajectory of strategic CSR. The purpose of this paper is to investigate the mediating effects of CSR expectations on the relationship between consumer value and purchase behaviour by identifying the implications of the multidimensionality of CSR (economic, legal, ethical, and philanthropic) for the latter. Drawing on stakeholder theory and its interplay with the prevalence of Ubuntu values, the underlying force governing the values of South African camaraderie, we hypothesise that the multidimensionality of CSR expectations has positive mediating effects on the relationship between consumer value and purchase behaviour. Partial Least Squares (PLS) path modelling was employed, using six measures of the average path coefficient (APC) to test the relationships between the constructs. Results from a sample of mall shoppers (n=411), based on a survey conducted across five major malls in Durban, South Africa, indicate that only the legal dimension of CSR serves as a mediating factor in the relationship among the constructs. South Africa’s unique history of segregation, leading to the proliferation of a spontaneous organisational approach to CSR and higher expectations of organisational legitimacy, is identified as an antecedent of consumers’ reliance on the law (legal CSR) to redress the ills of the past and to promote sustainable development and socially responsible behaviour. The paper also highlights theoretical and managerial implications for future research.

Keywords: consumer value, corporate marketing, corporate social responsibility, purchase behaviour, Ubuntu

Procedia PDF Downloads 350
339 A Physiological Approach for Early Detection of Hemorrhage

Authors: Rabie Fadil, Parshuram Aarotale, Shubha Majumder, Bijay Guargain

Abstract:

Hemorrhage is the loss of blood from the circulatory system and a leading cause of battlefield and postpartum-related deaths. Early detection of hemorrhage remains the most effective strategy to reduce the mortality rate from traumatic injuries. In this study, we investigated the physiological changes via non-invasive cardiac signals at rest and under different hemorrhage conditions simulated through graded lower-body negative pressure (LBNP). Simultaneous electrocardiogram (ECG), photoplethysmogram (PPG), blood pressure (BP), impedance cardiogram (ICG), and phonocardiogram (PCG) signals were acquired from 10 participants (age: 28 ± 6 years, weight: 73 ± 11 kg, height: 172 ± 8 cm). The LBNP protocol consisted of applying -20, -30, -40, -50, and -60 mmHg pressure to the lower half of the body. Beat-to-beat heart rate (HR), systolic blood pressure (SBP), diastolic blood pressure (DBP), and mean arterial pressure (MAP) were extracted from the ECG and blood pressure signals. Systolic amplitude (SA), systolic time (ST), diastolic time (DT), and left ventricular ejection time (LVET) were extracted from the PPG during each stage. Preliminary results showed that the application of -40 mmHg, i.e., the moderate simulated hemorrhage stage, resulted in significant changes in HR (85 ± 4 bpm vs 68 ± 5 bpm, p < 0.01), ST (191 ± 10 ms vs 253 ± 31 ms, p < 0.05), LVET (350 ± 14 ms vs 479 ± 47 ms, p < 0.05), and DT (551 ± 22 ms vs 683 ± 59 ms, p < 0.05) compared to rest, while no change was observed in SA (p > 0.05) as a consequence of LBNP application. These findings demonstrate the potential of cardiac signals for detecting moderate hemorrhage. In the future, we will analyze all the LBNP stages and investigate the feasibility of other physiological signals to develop a predictive machine learning model for early detection of hemorrhage.
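Beat-to-beat heart rate extraction from ECG reduces, once R-peaks are detected, to simple R-R interval arithmetic; a minimal sketch (R-peak detection is assumed done upstream, and the intervals below are illustrative):

```python
def beat_to_beat_hr(rr_intervals_ms):
    # Instantaneous heart rate in bpm from successive R-R intervals (ms):
    # HR = 60,000 ms per minute / R-R interval
    return [60000.0 / rr for rr in rr_intervals_ms]

print(beat_to_beat_hr([1000, 800, 750]))  # → [60.0, 75.0, 80.0]
```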

Keywords: blood pressure, hemorrhage, lower-body negative pressure, LBNP, machine learning

Procedia PDF Downloads 151
338 Short Life Cycle Time Series Forecasting

Authors: Shalaka Kadam, Dinesh Apte, Sagar Mainkar

Abstract:

The life cycle of products is becoming shorter and shorter due to increased competition in the market, shorter product development times, and increased product diversity. Short life cycles are normal in the retail industry, style business, entertainment media, and the telecom and semiconductor industries. Accurate demand forecasting for short-lifecycle products is of special interest to many researchers and organizations. Because of the short life cycle, the amount of historical data available for forecasting is minimal, or even absent when new or modified products are launched. Companies dealing with such products want to increase forecasting accuracy so that they can utilize the full potential of the market without oversupplying. The challenge is therefore to develop a forecasting model that forecasts accurately while handling large variations in data and accounting for the complex relationships between its parameters. Many statistical models have been proposed in the literature for forecasting time series data, but traditional time series models do not work well for short life cycles due to the lack of historical data, and artificial neural network (ANN) models are very time-consuming. We have studied the existing forecasting models and their limitations. This work proposes an effective and powerful approach for short life cycle time series forecasting. Our approach takes into consideration different scenarios of data availability for short-lifecycle products and combines statistical analysis with structured judgement; it can also be applied across domains. We then describe the method of creating a profile from analogous products, which can be used to forecast new products with the historical data of their analogues.
We have designed an application which combines data, analytics, and domain knowledge using point-and-click technology. The forecasting results generated are compared using MAPE, MSE, and RMSE error scores. Conclusion: based on the results, it is observed that no single approach is sufficient for short life-cycle forecasting, and two or more approaches need to be combined to achieve the desired accuracy.
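The MAPE, MSE, and RMSE scores used to compare the forecasts are standard error metrics; a minimal sketch of all three (the actual/forecast values below are illustrative only):

```python
import math

def mape(actual, forecast):
    # Mean absolute percentage error (requires non-zero actuals)
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

def mse(actual, forecast):
    # Mean squared error
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    # Root mean squared error, in the same units as the data
    return math.sqrt(mse(actual, forecast))

actual, forecast = [100, 120, 90], [110, 115, 95]
print(round(mape(actual, forecast), 2))  # → 6.57
print(mse(actual, forecast))             # → 50.0
print(round(rmse(actual, forecast), 2))  # → 7.07
```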

Keywords: forecast, short life cycle product, structured judgement, time series

Procedia PDF Downloads 337
337 A West Coast Estuarine Case Study: A Predictive Approach to Monitor Estuarine Eutrophication

Authors: Vedant Janapaty

Abstract:

Estuaries are wetlands where fresh water from streams mixes with salt water from the sea. Known as the “kidneys of our planet”, they are extremely productive environments that filter pollutants, absorb floods from sea level rise, and shelter a unique ecosystem. However, eutrophication and the loss of native species are ailing our wetlands, and there is a lack of uniform data collection and sparse research on correlations between satellite data and in situ measurements. Remote sensing (RS) has shown great promise in environmental monitoring. This project attempts to use satellite data and correlate its metrics with in situ observations collected at five estuaries. Satellite images were processed in Python to calculate seven spectral indices (SIs), and average SI values were computed per month over 23 years. Publicly available data from 6 sites at ELK were used to obtain 10 in situ parameters (OPs), likewise averaged per month over 23 years. Linear correlations between the 7 SIs and 10 OPs were computed and found to be inadequate (correlation = 1 to 64%). Fourier transform analysis was then performed on the 7 SIs; dominant frequencies and amplitudes were extracted, and a machine learning (ML) model was trained, validated, and tested for the 10 OPs. Better correlations were observed between SIs and OPs at certain time delays (0-, 3-, 4-, and 6-month delays), and ML was performed again. The OPs saw improved R² values in the range of 0.2 to 0.93. This approach can provide periodic analyses of overall wetland health from satellite indices. It shows that remote sensing can be used to develop correlations with critical in situ parameters that measure eutrophication and can be used by practitioners to easily monitor wetland health.
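The Fourier-transform step described above, extracting dominant frequencies and amplitudes from a monthly index series before training the ML model, can be sketched as follows (the signal here is synthetic, with an annual cycle standing in for a real satellite index):

```python
import numpy as np

def dominant_components(series, n_components=3):
    # Strongest harmonics of a mean-removed monthly time series:
    # returns (frequency in cycles/month, amplitude) pairs, largest first.
    spectrum = np.fft.rfft(series - np.mean(series))
    freqs = np.fft.rfftfreq(len(series), d=1.0)
    amps = 2.0 * np.abs(spectrum) / len(series)  # one-sided amplitude scaling
    order = np.argsort(amps)[::-1][:n_components]
    return [(freqs[i], amps[i]) for i in order]

# Synthetic 23-year monthly signal (matching the study period) with an annual cycle:
t = np.arange(23 * 12)
signal = 2.0 * np.sin(2 * np.pi * t / 12) + 0.3 * np.random.default_rng(0).normal(size=t.size)
freq, amp = dominant_components(signal, 1)[0]
print(round(1 / freq))  # dominant period → 12 (months)
```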

Keywords: estuary, remote sensing, machine learning, Fourier transform

Procedia PDF Downloads 83
336 Obstetric Outcome after Hysteroscopic Septum Resection in Patients with Uterine Septa of Various Sizes

Authors: Nilanchali Singh, Alka Kriplani, Reeta Mahey, Garima Kachhawa

Abstract:

Objective: Resection of larger uterine septa does improve obstetric performance, but whether smaller septa need resection, and their impact on obstetric outcome, is not clear. We aimed to evaluate the role of resection of septa of various sizes in obstetric performance. Methods: This retrospective cohort study comprised 107 patients with uterine septum. The patients were categorized on the basis of the extent of the uterine septum into four groups: a) subsepta (< 1/3 of the uterine cavity), b) septum > 1/3 to 1/2, c) septum > 1/2 to the whole uterine cavity, and d) septum traversing the whole uterine cavity and cervix. Of these 107 patients, 74 could be contacted by telephone and their outcomes recorded. Sensitivity and specificity of the investigative modalities were calculated. Results: Infertility was seen in the maximum number of cases with complete septa (100%), whereas abortions were seen most commonly with subsepta (18%). MRI had the maximum sensitivity and positive predictive value, followed by hysterosalpingography. Tubal block, fibroids, endometriosis, pelvic adhesions, and ovarian pathologies were seen in some patients, but no definite association of these pathologies was seen with any subgroup of septa. Almost five years of follow-up was recorded in all the subgroups. A significant reduction in infertility was seen after septum resection in all septal subgroups (p=0.046, 0.032, and 0.05) except subsepta (< 1/3 of the uterine cavity). Abortions were significantly reduced (p=0.048) in the third subgroup (i.e., septum > 1/2 up to the internal os) after hysteroscopic septum resection. The take-home baby rate was 33% with subsepta and around 50% in the remaining subgroups. Conclusions: Septal resection improves obstetric performance in patients with uterine septa of various sizes. Whether it improves obstetric performance in patients with subsepta or very small septa is controversial; larger studies addressing this issue need to be planned.

Keywords: septal resection, obstetric outcome, infertility, septum size

Procedia PDF Downloads 300
335 Optimization of Springback Prediction in U-Channel Process Using Response Surface Methodology

Authors: Muhamad Sani Buang, Shahrul Azam Abdullah, Juri Saedon

Abstract:

There is little effective guidance on the selection of design parameters for springback of advanced high-strength steel sheet metal in the U-channel cold forming process. This paper presents the development of a predictive model for springback in the U-channel process on advanced high-strength steel sheet employing Response Surface Methodology (RSM). The experiments were performed on dual-phase steel sheet, DP590, in the U-channel forming process, while a design of experiments (DoE) approach was used to investigate the effects of four factors, namely blank holder force (BHF), clearance (C), punch travel (Tp), and rolling direction (R), as input parameters at two levels each in a full factorial design (2⁴). A statistical analysis of variance (ANOVA) showed that blank holder force (BHF), clearance (C), and punch travel (Tp) had significant effects on the springback of the flange angle (β2) and wall opening angle (β1), while the rolling direction (R) factor was insignificant. The significant parameters were optimized to reduce springback using a Central Composite Design (CCD) in RSM, and the optimum parameters were determined. A regression model for springback was developed, and the effect of individual parameters and their responses was also evaluated. The results obtained from the optimum model are in agreement with the experimental values.
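The 2⁴ full factorial design mentioned above simply enumerates every combination of the four factors at coded low/high levels; a minimal sketch (factor names follow the abstract, and the coded −1/+1 levels stand for each factor's low and high values):

```python
from itertools import product

# Blank holder force, clearance, punch travel, rolling direction
factors = ["BHF", "C", "Tp", "R"]
# Every run of a two-level full factorial design: 2**4 = 16 combinations
design = [dict(zip(factors, levels)) for levels in product([-1, 1], repeat=len(factors))]

print(len(design))  # → 16
print(design[0])    # → {'BHF': -1, 'C': -1, 'Tp': -1, 'R': -1}
```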

Keywords: advanced high strength steel, U-channel process, springback, design of experiments, optimization, response surface methodology (RSM)

Procedia PDF Downloads 525
334 Managing Data from One Hundred Thousand Internet of Things Devices Globally for Mining Insights

Authors: Julian Wise

Abstract:

Newcrest Mining is one of the world’s top five gold and rare earth mining organizations by production, reserves, and market capitalization. This paper elaborates on the data acquisition processes employed by Newcrest, in collaboration with the Fortune 500 listed organization Insight Enterprises, to standardize machine learning solutions which process data from over one hundred thousand distributed Internet of Things (IoT) devices located at mine sites globally. Through the use of cloud technologies and edge computing, these developments enable standardized machine learning applications that inform the strategic optimization of mineral processing. Target objectives of the machine learning optimizations include time savings in mineral processing, production efficiencies, risk identification, and increased production throughput. The data acquired for predictive modelling is processed through edge computing and collectively stored within a data lake. The digital transformation has necessitated standardizing the software architecture that manages the machine learning models submitted by vendors, to ensure effective automation and continuous improvement of the mineral process models. Operating at scale, the system processes hundreds of gigabytes of data per day from distributed mine sites across the globe, for the purposes of improved worker safety and production efficiency through big data applications.

Keywords: mineral technology, big data, machine learning operations, data lake

Procedia PDF Downloads 94
333 Contribution of Satellite Remote Sensing to the Geological Mapping of the Imiter Inlier: Application to the Mineralized Structures of the Principal Corps B3 (CPB3) of the Imiter Mine (Anti-Atlas, Morocco)

Authors: Bouayachi Ali, Alikouss Saida, Baroudi Zouhir, Zerhouni Youssef, Zouhair Mohammed, El Idrissi Assia, Essalhi Mourad

Abstract:

The world-class Imiter silver deposit is located on the northern flank of the Precambrian Imiter inlier. The deposit is formed by epithermal veins hosted in the sandstone-pelite formations of the lower complex and in the basal conglomerates of the upper complex; these veins are controlled by a regional-scale fault cluster oriented N70°E to N90°E. The present work addresses the contribution of remote sensing to the geological mapping of the Imiter inlier, with application to the mineralized structures of the Principal Corps B3. Mapping from satellite images is a very important tool in mineral prospecting: it allows the localization of zones of interest, orienting field missions by helping to locate the major structures, which facilitates the interpretation, programming, and orientation of the mining works. A predictive map also allows for the correction of field mapping work, especially the direction and dimensions of structures such as dykes, corridors, or scrapings. The use of a series of processing techniques such as SAM, PCA, MNF, and unsupervised and supervised classification on a Landsat 8 satellite image of the study area allowed us to highlight the main facies of the Imiter area. To improve the exploration research, we applied further processing to map the spatial distribution of alteration mineral indices, and applied several filters to the different bands to produce lineament maps.

Keywords: principal corps B3, remote sensing, Landsat 8, Imiter II, silver mineralization, lineaments

Procedia PDF Downloads 81
332 Application of Bayesian Model Averaging and Geostatistical Output Perturbation to Generate Calibrated Ensemble Weather Forecast

Authors: Muhammad Luthfi, Sutikno Sutikno, Purhadi Purhadi

Abstract:

Weather forecasting has necessarily been improved to provide communities with accurate and objective predictions. To this end, numerical weather forecasting was extensively developed to reduce the subjectivity of forecasts. Yet Numerical Weather Prediction (NWP) outputs are issued without taking dynamical weather behavior and local terrain features into account; thus, NWP outputs cannot accurately forecast weather quantities, particularly for medium- and long-range forecasts. The aim of this research is to aid and extend the development of ensemble forecasting for the Meteorology, Climatology, and Geophysics Agency of Indonesia. The ensemble method is an approach that combines various deterministic forecasts to produce a more reliable one. However, such a forecast is biased and uncalibrated due to its underdispersive or overdispersive nature. As a parametric method, Bayesian Model Averaging (BMA) generates a calibrated ensemble forecast and constructs a predictive PDF for a specified period. It can utilize an ensemble of any size but does not take spatial correlation into account, whereas spatial dependencies between the site of interest and nearby sites are influenced by dynamic weather behavior. Meanwhile, Geostatistical Output Perturbation (GOP) reckons with spatial correlation to generate future weather quantities; though built from only a single deterministic forecast, it is also able to generate an ensemble of any size. This research applies both BMA and GOP to generate calibrated ensemble forecasts of daily temperature at a few meteorological sites near an Indonesian international airport.
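BMA's calibrated predictive PDF is a weighted mixture of densities, each centred on one (bias-corrected) ensemble member forecast; a minimal Gaussian-mixture sketch follows. The member values, weights, and spread parameter below are purely illustrative assumptions, since in practice they are estimated by EM on training data:

```python
import math

def bma_predictive_density(x, members, weights, sigma):
    # BMA predictive PDF at x: sum of member-centred Gaussians, each weighted
    # by that member's posterior model probability (weights sum to 1)
    def normal_pdf(x, mu, s):
        return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))
    return sum(w * normal_pdf(x, m, sigma) for w, m in zip(weights, members))

members = [26.1, 27.3, 25.8]   # hypothetical daily-temperature forecasts (°C)
weights = [0.5, 0.3, 0.2]      # hypothetical posterior model weights
density = bma_predictive_density(26.0, members, weights, sigma=1.0)
```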

Keywords: Bayesian Model Averaging, ensemble forecast, geostatistical output perturbation, numerical weather prediction, temperature

Procedia PDF Downloads 267
331 Transcriptomic Analysis of Non-Alcoholic Fatty Liver Disease in Cafeteria Diet Induced Obese Rats

Authors: Mohammad Jamal

Abstract:

Non-alcoholic fatty liver disease (NAFLD) has become one of the most common chronic liver diseases, prevalent among people with morbid obesity. NAFLD does not usually develop into clinically significant liver disease; however, cirrhosis and liver cancer develop in a subset of patients, and there are currently no approved therapies for the treatment of NAFLD. This study aimed to understand the key genes involved in the mechanism of NAFLD, which can be valuable for developing diagnostic and predictive biomarkers based on the histologic stage of the liver. The study was conducted on 16 male Sprague Dawley rats. The animals were divided into two groups: a control group (n=8) fed ad libitum on normal chow and regular water, and a cafeteria group (CAF) (n=8) fed a high-fat/high-carbohydrate diet. The animals received their respective diets from 4 weeks of age until 25 weeks. Livers were extracted, and an RT² Profiler PCR Array was used to assess NAFLD-related genes. Histological evaluation was performed using H&E staining of liver tissue sections. Our PCR array results showed that genes involved in anti-inflammatory activity (Ifng, Il10), fatty acid uptake/oxidation (Fabp5), apoptosis (Fas), lipogenesis (Gck and Srebf1), insulin signalling (Igfbp1), and metabolic pathways (Pdk4) were upregulated in the liver of cafeteria-fed obese rats. Bloated hepatocytes, displaced nuclei, and higher lipid content were seen in the liver of cafeteria-fed obese rats. Although liver biopsies remain the gold standard in evaluating NAFLD, an approach based on non-invasive markers could be used to understand the physiology, the therapeutic potential, and the targets to combat NAFLD.

Keywords: biomarkers, cafeteria diet, obesity, NAFLD

Procedia PDF Downloads 121
330 Modeling Breathable Particulate Matter Concentrations over Mexico City Retrieved from Landsat 8 Satellite Imagery

Authors: Rodrigo T. Sepulveda-Hirose, Ana B. Carrera-Aguilar, Magnolia G. Martinez-Rivera, Pablo de J. Angeles-Salto, Carlos Herrera-Ventosa

Abstract:

In order to diminish health risks, it is of major importance to monitor air quality; however, monitoring carries high costs in physical and human resources. In this context, this research is carried out with the main objective of developing a predictive model for concentrations of inhalable particles (PM10-2.5) using remote sensing. To develop the model, satellite images of Mexico City's Metropolitan Area, mainly from Landsat 8, were used. Using historical PM10 and PM2.5 measurements from RAMA (the Automatic Environmental Monitoring Network of Mexico City) and processing the available satellite images, a preliminary model was generated in which it was possible to observe critical opportunity areas that will allow the generation of a robust model. Applying the preliminary model to scenes of Mexico City, three types of zones of particular interest were identified due to the presumed high concentration of PM: those with high plant density, bodies of water, and soil without constructions or vegetation. To date, work continues on this line to improve the proposed preliminary model. In addition, a brief analysis was made of six models presented in articles developed in different parts of the world, in order to identify the optimal bands for the generation of a model suitable for Mexico City. It was found that infrared bands have helped modelling in other cities, but the effectiveness these bands could provide under the geographic and climatic conditions of Mexico City is still being evaluated.

Keywords: air quality, modeling pollution, particulate matter, remote sensing

Procedia PDF Downloads 139
329 The Determinants of Customer’s Purchase Intention of Islamic Credit Card: Evidence from Pakistan

Authors: Nasir Mehmood, Muhammad Yar Khan, Anam Javeed

Abstract:

This study scrutinizes the dynamics that tend to impact customers' purchase intention of Islamic credit cards, and the nexus of product knowledge and religiosity with the attitude of potential Islamic credit card customers. The theory of reasoned action strengthens the idea that intentions, owing to their proven predictive power, are most likely to instigate intended consumer behavior. In particular, the study examines the relationships of perceived financial cost (PFC), subjective norms (SN), and attitude (ATT) with the intention to purchase Islamic credit cards. Using a convenience sampling approach, data were collected from 450 customers of banks located in Rawalpindi and Islamabad. A five-point Likert-scale self-administered questionnaire was used to collect the data, which were analyzed using the Statistical Package for the Social Sciences (SPSS) through principal component and multiple regression analysis. The results suggest that customers' religiosity and product knowledge are strong indicators of attitude towards buying Islamic credit cards. Likewise, subjective norms, attitude, and perceived financial cost have a significant positive impact on customers' purchase intent of Islamic banks' credit cards. This study charts a useful path for future researchers to further investigate the underlying phenomenon, along with a variety of psychodynamic factors still in their infancy, at least in the Pakistani banking sector. The study also provides insight to practitioners and Islamic bank managers for directing their efforts toward educating customers regarding the use of Islamic credit cards and other financial products.

Keywords: attitude, Islamic credit card, religiosity, subjective norms

Procedia PDF Downloads 125
328 Entrepreneurship Education and Student Entrepreneurial Intention: A Comprehensive Review, Synthesis of Empirical Findings, and Strategic Insights for Future Research Advancements

Authors: Abdul Waris Jalili, Yanqing Wang, Som Suor

Abstract:

This research paper explores the relationship between entrepreneurship education and students' entrepreneurial intentions. It aims to determine whether entrepreneurship education reliably predicts students' intention to become entrepreneurs, how and when this relationship occurs, which factors influence it, and which mediating or moderating factors are at play. A thorough and systematic search and review of empirical articles published between 2013 and 2023 was conducted. Three databases, Google Scholar, Science Direct, and PubMed, were explored to gather relevant studies. Criteria such as reporting empirical results, publication in English, and addressing the research questions were used to select 35 papers for analysis. The collective findings of the reviewed studies suggest a generally positive relationship between entrepreneurship education and student entrepreneurial intentions. However, recent findings indicate that this relationship may be more complex than previously thought. Mediators and moderators have been identified, highlighting instances where entrepreneurship education indirectly influences student entrepreneurial intentions. The review also emphasizes the need for more robust research designs to establish causality in this field. This research adds to the existing literature by providing a comprehensive review of the relationship between entrepreneurship education and student entrepreneurial intentions. It highlights the complexity of this relationship and the importance of considering mediators and moderators. The study also calls for future research to explore different facets of entrepreneurship education independently and to examine complex relationships more comprehensively.

Keywords: entrepreneurship, entrepreneurship education, entrepreneurial intention, entrepreneurial self-efficacy

Procedia PDF Downloads 42
327 Cirrhosis Mortality Prediction as Classification using Frequent Subgraph Mining

Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride

Abstract:

In this work, we use machine learning and novel data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models were applied to predict one-year mortality. A comprehensive feature space including demographic information, comorbidities, clinical procedures, and laboratory tests was analyzed, and a temporal pattern mining technique called Frequent Subgraph Mining (FSM) was used. The Model for End-Stage Liver Disease (MELD) prediction of mortality was used as a comparator. All of our models statistically significantly outperform the MELD-score model and show an average 10% improvement in the area under the curve (AUC). The FSM technique on its own does not improve the model significantly, but FSM combined with a machine learning technique called an ensemble further improves model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. However, due to the sparsity of the temporal information needed by FSM, the FSM model alone does not yield significant improvements. To the best of our knowledge, this is the first work to apply modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients, and it builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
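For readers unfamiliar with the AUC metric used here to compare models against the MELD score, it can be computed directly from the Mann-Whitney formulation: the probability that a randomly chosen positive case is scored above a randomly chosen negative one. The scores and labels below are invented for illustration, not taken from the study's patient data.

```python
# AUC via the Mann-Whitney formulation: the probability that a randomly
# chosen positive case is scored above a randomly chosen negative case,
# with ties counted as half. All scores and labels below are invented.

def auc(labels, scores):
    """Area under the ROC curve from 0/1 labels and risk scores."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels     = [1, 1, 1, 0, 0, 0, 0]                       # 1 = died within a year
meld_like  = [0.60, 0.40, 0.35, 0.50, 0.30, 0.20, 0.45]  # weaker risk scorer
model_like = [0.90, 0.80, 0.55, 0.50, 0.30, 0.20, 0.45]  # stronger risk scorer
```

Here `auc(labels, model_like)` exceeds `auc(labels, meld_like)`, mirroring the kind of improvement over the MELD score that the study reports.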

Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning

Procedia PDF Downloads 118
326 A Good Start for Digital Transformation of the Companies: A Literature and Experience-Based Predefined Roadmap

Authors: Batuhan Kocaoglu

Abstract:

Nowadays, digital transformation is a hot topic in both service and production businesses. Companies that want to stay alive in the following years must change how they do business. Industry leaders have started to augment backbone technologies such as ERP (Enterprise Resource Planning) with digital advances such as analytics, mobility, sensor-embedded smart devices, AI (Artificial Intelligence), and more. Selecting the appropriate technology for the related business problem is also a hot topic. Moreover, to operate in the modern environment and fulfill rapidly changing customer expectations, a digital transformation of the business is required, one that changes the way the business runs and affects how it does business. Even though the term digital transformation is trendy, the literature is limited and covers mostly the philosophy rather than a solid implementation plan. Current studies urge firms to start their digital transformation, but few tell us how. Huge investments, together with blurry definitions and concepts, scare companies. The aim of this paper is to solidify the steps of digital transformation and offer a roadmap for companies and academicians. The proposed roadmap is developed based on insights from a literature review, semi-structured interviews, and expert views to explore and identify the crucial steps. We introduce our roadmap in the form of eight main steps: Awareness; Planning; Operations; Implementation; Go-live; Optimization; Autonomation; and Business Transformation; these include a total of 11 sub-steps with examples. This study also emphasizes four dimensions of digital transformation: readiness assessment; building organizational infrastructure; building technical infrastructure; and maturity assessment. Finally, the roadmap maps these steps to three terms used in the digital transformation literature: Digitization; Digitalization; and Digital Transformation.
The resulting model shows that 'business process' and 'organizational issues' should be resolved before technology decisions and 'digitization'. Companies can start their journey with these solid steps, using the proposed roadmap to increase the success of their project implementations. The roadmap is also adaptable to related Industry 4.0 and enterprise application projects, and it will be useful for companies in persuading their top management to invest. Our results can be used as a baseline for further research on readiness assessment and maturity assessment.

Keywords: digital transformation, digital business, ERP, roadmap

Procedia PDF Downloads 147
325 The Competitiveness of Small and Medium Sized Enterprises: Digital Transformation of Business Models

Authors: Chante Van Tonder, Bart Bossink, Chris Schachtebeck, Cecile Nieuwenhuizen

Abstract:

Small and Medium-Sized Enterprises (SMEs) play a key role in national economies around the world as contributors to economic and social well-being. The success, growth, and competitiveness of SMEs are therefore critical. However, many factors undermine this, such as resource constraints, poor information and communication technology (ICT) infrastructure, skills shortages, and poor management. The Fourth Industrial Revolution offers the SME sector new tools and opportunities, such as digital transformation and business model innovation (BMI), to enhance its competitiveness. Adopting and leveraging digital technologies such as cloud, mobile technologies, big data, and analytics can significantly improve business efficiencies, value propositions, and customer experiences. Digital transformation can contribute to the growth and competitiveness of SMEs. However, SMEs are lagging behind in digital transformation. Extant research lacks conceptual and empirical work on how digital transformation drives BMI and on the impact this has on the growth and competitiveness of SMEs. The purpose of the study is therefore to close this gap by developing and empirically validating a conceptual model to determine whether SMEs are achieving BMI through digital transformation and how this impacts their growth, competitiveness, and overall business performance. An empirical study is being conducted on 300 SMEs, consisting of 150 South African and 150 Dutch SMEs, to achieve this purpose. Structural equation modeling is used, since it is a multivariate statistical technique for analysing structural relationships and is thus a suitable method for testing the hypotheses in the model. Empirical research is needed to gather more insight into whether and how SMEs are digitally transformed and how BMI can be driven through digital transformation. The findings of this study can be used by SME business owners, managers, and employees at all levels.
The findings will indicate whether digital transformation can indeed impact the growth, competitiveness, and overall performance of an SME, reiterating the importance and potential benefits of adopting digital technologies. In addition, the findings will exhibit how BMI can be achieved in light of digital transformation. This study contributes to the body of knowledge on a highly relevant topic in management studies by analysing the impact of digital transformation on BMI across a large number of SMEs that differ distinctly in economic and cultural factors.

Keywords: business models, business model innovation, digital transformation, SMEs

Procedia PDF Downloads 221
324 Prediction of Antibacterial Peptides against Propionibacterium acnes from the Peptidomes of Achatina fulica Mucus Fractions

Authors: Suwapitch Chalongkulasak, Teerasak E-Kobon, Pramote Chumnanpuen

Abstract:

Acne vulgaris is a common skin disease mainly caused by the Gram-positive pathogenic bacterium Propionibacterium acnes. This bacterium stimulates the inflammation process in human sebaceous glands. The giant African snail (Achatina fulica) is an alien species that reproduces rapidly and seriously damages agricultural products in Thailand. There have been several research reports on the medical and pharmaceutical benefits of this snail's mucus peptides and proteins. This study aimed to predict in silico multifunctional bioactive peptides from the A. fulica mucus peptidome, using several bioinformatic tools for the determination of antimicrobial (iAMPpred), anti-biofilm (dPABBs), cytotoxic (ToxinPred), cell-membrane-penetrating (CPPpred), and anti-quorum-sensing (QSPpred) peptides. The three candidate peptides with the highest predictive scores were selected and re-designed/modified to improve the required activities. Structural and physicochemical properties of the six anti-P. acnes (APA) peptide candidates were assessed with the PEP-FOLD3 program and the five aforementioned tools. All candidates had a random-coil structure and were named APA1-ori, APA2-ori, APA3-ori, APA1-mod, APA2-mod, and APA3-mod. To validate the APA activity, these peptide candidates were synthesized and tested against six isolates of P. acnes. The modified APA peptides showed high APA activity against some isolates. Therefore, our biomimetic mucus peptides could be useful for preventing acne vulgaris and should be further examined for other activities important to medical and pharmaceutical applications.

Keywords: Propionibacterium acnes, Achatina fulica, peptidomes, antibacterial peptides, snail mucus

Procedia PDF Downloads 118
323 Ontology-Driven Knowledge Discovery and Validation from Admission Databases: A Structural Causal Model Approach for Polytechnic Education in Nigeria

Authors: Bernard Igoche Igoche, Olumuyiwa Matthew, Peter Bednar, Alexander Gegov

Abstract:

This study presents an ontology-driven approach for knowledge discovery and validation from admission databases in Nigerian polytechnic institutions. The research aims to address the challenges of extracting meaningful insights from vast amounts of admission data and utilizing them for decision-making and process improvement. The proposed methodology combines the knowledge discovery in databases (KDD) process with a structural causal model (SCM) ontological framework. The admission database of Benue State Polytechnic Ugbokolo (Benpoly) is used as a case study. The KDD process is employed to mine and distill knowledge from the database, while the SCM ontology is designed to identify and validate the important features of the admission process. The SCM validation is performed using the conditional independence test (CIT) criteria, and an algorithm is developed to implement the validation process. The identified features are then used for machine learning (ML) modeling and prediction of admission status. The results demonstrate the adequacy of the SCM ontological framework in representing the admission process and the high predictive accuracies achieved by the ML models, with k-nearest neighbors (KNN) and support vector machine (SVM) achieving 92% accuracy. The study concludes that the proposed ontology-driven approach contributes to the advancement of educational data mining and provides a foundation for future research in this domain.
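The k-nearest-neighbours classifier that reached 92% accuracy in the study can be illustrated with a minimal sketch; the admission features (exam score, age) and all records below are hypothetical, not drawn from the Benpoly database.

```python
# Minimal k-nearest-neighbours sketch: classify an applicant's admission
# status from numeric features. Feature names and records are invented.
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label); majority vote of k nearest."""
    nearest = sorted(train, key=lambda rec: math.dist(rec[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# (exam_score, age) -> admission status; entirely invented records.
train = [((72, 19), "admitted"), ((65, 22), "admitted"),
         ((80, 18), "admitted"), ((40, 25), "rejected"),
         ((35, 21), "rejected"), ((50, 30), "rejected")]
status = knn_predict(train, (70, 20), k=3)
```

In practice, the features fed to KNN would be those validated by the SCM ontology's conditional independence tests, typically after normalization so that no single feature dominates the distance.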

Keywords: admission databases, educational data mining, machine learning, ontology-driven knowledge discovery, polytechnic education, structural causal model

Procedia PDF Downloads 36
322 Smart Sensor Data to Predict Machine Performance with IoT-Based Machine Learning and Artificial Intelligence

Authors: C. J. Rossouw, T. I. van Niekerk

Abstract:

The global manufacturing industry is utilizing the internet and cloud-based services to further explore and optimize manufacturing processes in support of the movement into the Fourth Industrial Revolution (4IR). From a third-world and African perspective, the 4IR is hindered by the fact that many manufacturing systems developed during the third industrial revolution are not inherently equipped to utilize the internet and the services of the 4IR, holding back the progression of these manufacturing industries into the 4IR. This research focuses on the development of a non-invasive and cost-effective cyber-physical IoT system that exploits a machine’s vibration to expose semantic characteristics of the manufacturing process and utilizes these results through a real-time, cloud-based machine condition monitoring system, with the intention of optimizing the system. A microcontroller-based IoT sensor was designed to acquire a machine’s mechanical vibration data, process it in real time, and transmit it to a cloud-based platform via Wi-Fi and the internet. Time-frequency Fourier analysis was applied to the vibration data to form an image representation of the machine’s behaviour. This data was used to train a Convolutional Neural Network (CNN) to learn semantic characteristics in the machine’s behaviour and relate them to a state of operation. The same data was also used to train a Convolutional Autoencoder (CAE) to detect anomalies in the data. Real-time, edge-based artificial intelligence was achieved by deploying the CNN and CAE on the sensor to analyse the vibration. A cloud platform was deployed to visualize the vibration data and the results of the CNN and CAE in real time. The cyber-physical IoT system was deployed on a semi-automated metal granulation machine with a set of trained machine learning models. Using a single sensor, the system was able to accurately visualize three states of the machine’s operation in real time.
The system was also able to detect a variance in the material being granulated. The research demonstrates how non-IoT manufacturing systems can be equipped with edge-based artificial intelligence to establish a remote machine condition monitoring system.
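The time-frequency analysis step described above can be sketched with a plain discrete Fourier transform over one frame of the signal; stacking such frames over time yields the image-like representation fed to a CNN. The sampling rate and the pure 50 Hz "vibration" below are synthetic stand-ins for real sensor data.

```python
# Sketch of the time-frequency step: magnitude of a discrete Fourier
# transform over one frame of a vibration signal. The 400 Hz sampling rate
# and the pure 50 Hz tone are synthetic stand-ins for real sensor data.
import cmath
import math

def dft_magnitudes(frame):
    """Normalized DFT magnitudes for the non-negative frequency bins."""
    n = len(frame)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * t / n)
                    for t, x in enumerate(frame))) / n
            for k in range(n // 2)]

rate, n = 400, 64                       # 64 samples = 8 full 50 Hz periods
signal = [math.sin(2 * math.pi * 50 * t / rate) for t in range(n)]
spectrum = dft_magnitudes(signal)
peak_bin = max(range(len(spectrum)), key=lambda k: spectrum[k])
peak_hz = peak_bin * rate / n
```

On an embedded sensor, an FFT routine would replace this O(n²) DFT, and consecutive frames would be windowed and overlapped before stacking.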

Keywords: IoT, cyber-physical systems, artificial intelligence, manufacturing, vibration analytics, continuous machine condition monitoring

Procedia PDF Downloads 72
321 The Theory of the Mystery: Unifying the Quantum and Cosmic Worlds

Authors: Md. Najiur Rahman

Abstract:

This hypothesis posits a profound and symmetrical connection that crosses the boundaries of quantum physics and cosmology, aiming to revolutionize our understanding of the fundamental building blocks of the cosmos; hence its name, ‘The Theory of the Mystery’. The theory rests on an elegantly simple equation, R = ∆r / √∆m, which relates the radius (R) of an elementary particle or galaxy, the relative change in radius (∆r), and the mass difference (∆m) between related entities. The formula asserts a synchronization in which every elementary particle and every celestial entity falls into alignment with its respective mass and radius. A supporting equation defines the mass-radius connection of an entity as R = √m / N, where N is an empirically established constant, determined to be approximately 42.86 kg/m, representing the proportionality between mass and radius. The hypothesis offers precise predictions, collects empirical evidence, and explores far-reaching consequences for theories such as General Relativity. This symmetry is claimed to reveal a fundamental principle underpinning the cosmos: each component, whether small or large, follows a precise mass-radius relationship and exerts gravity by a universal law. The hypothesis is presented as a step towards a unified theory of physics, and the pursuit of experimental verification is intended to show that each particle and galaxy is bound by gravity and plays a unique but harmonious role in shaping the universe, promising to reveal the great symphony of the cosmos. The predictive power claimed for the hypothesis invites the exploration of entities at the farthest reaches of the cosmos, providing a bridge between the known and the unknown.

Keywords: unified theory, quantum gravity, mass-radius relationship, dark matter, uniform gravity

Procedia PDF Downloads 53
320 Improved Technology Portfolio Management via Sustainability Analysis

Authors: Ali Al-Shehri, Abdulaziz Al-Qasim, Abdulkarim Sofi, Ali Yousef

Abstract:

The oil and gas industry has played a major role in improving the prosperity of mankind and driving the world economy. According to estimates from the International Energy Agency (IEA) and the U.S. Energy Information Administration (EIA), the world will continue to rely heavily on hydrocarbons for decades to come. This growing energy demand mandates sustainability measures to prolong the availability of reliable and affordable energy sources and to lower their environmental impact. Unlike those of any other industry, oil and gas upstream operations are energy-intensive and scattered over large zonal areas. These challenging conditions require unique sustainability solutions. In recent years, there has been a concerted effort by the oil and gas industry to develop and deploy innovative technologies to maximize efficiency, reduce the carbon footprint and CO2 emissions, and optimize resource and material consumption. In the past, research and development (R&D) in the exploration and production sector was driven primarily by maximizing profit through higher hydrocarbon recovery and new discoveries. Environmentally friendly and sustainable technologies are increasingly being deployed to balance sustainability and profitability, and analysis of a technology's sustainability impact is increasingly used in corporate decision-making for improved portfolio management and for allocating valuable resources toward technology R&D. This paper articulates and discusses a novel workflow to identify strategic sustainable technologies for improved portfolio management by addressing existing and future upstream challenges. It uses a systematic approach that relies on sustainability key performance indicators (KPIs), including an energy efficiency quotient, carbon footprint, and CO2 emissions. The paper provides examples of various technologies, including CCS, water-cut reduction, automation, renewables, and energy efficiency.
The use of 4IR technologies such as artificial intelligence, machine learning, and data analytics is also discussed. Overlapping technologies, areas of collaboration, and synergistic relationships are identified. The unique sustainability analyses enable improved decision-making on technology portfolio management.

Keywords: sustainability, oil & gas, technology portfolio, key performance indicator

Procedia PDF Downloads 167
319 Customer Churn Prediction by Using Four Machine Learning Algorithms Integrating Features Selection and Normalization in the Telecom Sector

Authors: Alanoud Moraya Aldalan, Abdulaziz Almaleh

Abstract:

A crucial component of maintaining a customer-oriented business, as in the telecom industry, is understanding the reasons and factors that lead to customer churn. Competition between telecom companies has greatly increased in recent years, and it has become more important to understand customers' needs in this strong market, especially for customers who are looking to switch service providers. Churn prediction is therefore now a mandatory requirement for retaining those customers, and machine learning can be utilized to accomplish this. Churn prediction has become a very important machine learning classification topic in the telecommunications industry, and understanding the factors of customer churn and how customers behave is essential to building an effective churn prediction model. This paper aims to predict churn and identify the factors behind customers' churn based on their past service usage history. Toward this objective, the study makes use of feature selection, normalization, and feature engineering. The study then compared the performance of four different machine learning algorithms on the Orange dataset: Logistic Regression, Random Forest, Decision Tree, and Gradient Boosting. Performance was evaluated using the F1 score and ROC-AUC, and comparing the results of this study with existing models has proven to produce better results. The results showed that Gradient Boosting with the feature selection technique outperformed the others in this study, achieving a 99% F1-score and 99% AUC, and all other experiments achieved good results as well.
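The F1 score used for evaluation above can be computed directly from a classifier's confusion counts, as the harmonic mean of precision and recall. This is a generic sketch with invented label vectors, not the study's Orange-dataset results.

```python
# The F1 score from a classifier's confusion counts: the harmonic mean of
# precision and recall. The label vectors are invented for demonstration;
# 1 marks a churned customer.

def f1_score(actual, predicted):
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return (2 * precision * recall / (precision + recall)
            if precision + recall else 0.0)

actual    = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
predicted = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
score = f1_score(actual, predicted)       # tp=3, fp=1, fn=1
```

Because churn datasets are heavily imbalanced (most customers do not churn), F1 and ROC-AUC are preferred over raw accuracy for comparing the four algorithms.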

Keywords: machine learning, gradient boosting, logistic regression, churn, random forest, decision tree, ROC, AUC, F1-score

Procedia PDF Downloads 121
318 Navigating Disruption: Key Principles and Innovations in Modern Management for Organizational Success

Authors: Ahmad Haidar

Abstract:

This research paper investigates the concept of modern management, concentrating on the development of managerial practices and the adoption of innovative strategies in response to the fast-changing business landscape shaped by Artificial Intelligence (AI). The study begins by examining the historical context of management theories, tracing the progression from classical to contemporary models and identifying key drivers of change. Through a comprehensive review of existing literature and case studies, this paper provides valuable insights into the principles and practices of modern management, offering a roadmap for organizations aiming to navigate the complexities of the contemporary business world. The paper examines the growing role of digital technology in modern management, focusing on incorporating AI, machine learning, and data analytics to streamline operations and facilitate informed decision-making. Moreover, the research highlights the emergence of new principles, such as adaptability, flexibility, public participation, trust, transparency, and a digital mindset, as crucial components of modern management. The role of business leaders is also investigated by studying contemporary leadership styles, such as transformational, situational, and servant leadership, emphasizing the significance of emotional intelligence, empathy, and collaboration in fostering a healthy organizational culture. Furthermore, the research delves into the crucial roles of environmental sustainability, corporate social responsibility (CSR), and corporate digital responsibility (CDR) as organizations strive to balance economic growth with ethical considerations and long-term viability. The primary research question for this study is: "What are the key principles, practices, and innovations that define modern management, and how can organizations effectively implement these strategies to thrive in the rapidly changing business landscape?"
The research contributes to a comprehensive understanding of modern management by examining its historical context, the impact of digital technologies, the importance of contemporary leadership styles, and the role of CSR and CDR in today's business landscape.

Keywords: modern management, digital technology, leadership styles, adaptability, innovation, corporate social responsibility, organizational success, corporate digital responsibility

Procedia PDF Downloads 50
317 Spatial Analytics of Ramayan to Geolocate Lanka

Authors: Raj Mukta Sundaram

Abstract:

The location of Ayodhya is distinctly described along the river Sarayu in the epic Ramayan. On the contrary, even elaborate descriptions of Lanka and its environs are still proving elusive to human ingenuity to find a direct correlation on the ground. Historically, there were hardly any attempts to locate Lanka, but some speculations have been made very recently, of which Sri Lanka has gained widespread public acceptance for obvious reasons, namely the words 'Sri' and 'Lanka' in its name. This belief is almost secured by the impression of Ram Setu on satellite images, which has led the government to initiate a scientific mission to determine its age. In fact, other viewpoints believe Lanka to be somewhere far-flung along the equator, and another has long proclaimed it to be in the central regions of India, but both are diminished by contemporary belief. This study emanates from the fact that Sri Lanka has no correlation to the epic and, more importantly, that satellite images are deceptive. So the objectives are twofold: firstly, to interpret the text through a holistic approach by analyzing the ecosystem, settlements, geological aspects, and, most importantly, the timeline of key events; secondly, to explain the pitfalls in the rationale behind contemporary belief. At the outset, it categorically rejects the notion of Ram Setu, which, in geological terms, is merely a part of the continental shelf developed millions of years ago. It also refutes the misconception created by the word "Sri," which is, in fact, an official name adopted by the country in the seventies with no correlation whatsoever with the events of the Ramayana. Likewise, the study questions the plausibility of establishing a prosperous kingdom on a remote island with adverse climatic conditions for any civilization at that time. Eventually, the study demonstrates that the travel time for the distances covered by Lord Rama does not corroborate the description in the epic. It all leads to one conclusion: that Lanka cannot be in Sri Lanka.
Rather, it needs to be somewhere in the central-eastern parts of India. That region justifies the environs and timelines for the journeys undertaken by Lord Rama, besides the fact that the tribes of the region show strong allegiance to Ravana. The study strongly recommends looking into the central-east region of India for the golden abode of the demon king and rejuvenating tourism in a scenic and culturally rich region hitherto marred by disturbances.

Keywords: spatial analysis, Ramayan, heritage, tourism

Procedia PDF Downloads 39
316 Empirical Evidence to Beliefs and Perceptions About Mental Health Disorder and Substance Abuse: The Role of a Social Worker

Authors: Helena Baffoe

Abstract:

Context: In the United States, there have been significant advancements in programs aimed at improving the lives of individuals with mental health disorders and substance abuse problems. However, public attitudes and beliefs regarding these issues have not improved correspondingly. This study aims to explore the perceptions and beliefs surrounding mental health disorders and substance abuse in the context of data analytics in the field of social work. Research Aim: The aim of this research is to provide empirical evidence on the beliefs and perceptions regarding mental health disorders and substance abuse. Specifically, the study seeks to answer the question of whether being diagnosed with a mental disorder implies a diagnosis of substance abuse. Additionally, the research aims to analyze the specific roles that social workers can play in addressing individuals with mental disorders. Methodology: This research adopts a data-driven methodology, acquiring comprehensive data from the Substance Abuse and Mental Health Services Administration (SAMHSA). A causal connection between mental disorders and substance abuse is often assumed, a relationship that the current literature tends to leave critically underexamined. To address this gap, we applied logistic regression with an instrumental variable (IV) approach, effectively mitigating potential endogeneity issues in the analysis and ensuring robust and unbiased results. This methodology allows for a rigorous examination of the relationship between mental disorders and substance abuse. Empirical Findings: The analysis of the data reveals that depressive, anxiety, and trauma/stressor mental disorders are the most common in the United States. However, the study does not find statistically significant evidence to support the notion that being diagnosed with these mental disorders necessarily implies a diagnosis of substance abuse.
This suggests that there is a misconception among the public regarding the relationship between mental health disorders and substance abuse. Theoretical Importance: The research contributes to the existing body of literature by providing empirical evidence to challenge prevailing beliefs and perceptions regarding mental health disorders and substance abuse. By using a novel methodological approach and analyzing new US data, the study sheds light on the cultural and social factors that influence these attitudes.
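As a sketch of what a logistic regression with an instrumental variable can look like, the block below implements one common predictor-substitution (two-stage) variant: stage 1 regresses the endogenous regressor on the instrument by least squares, and stage 2 fits a logistic model on the stage-1 fitted values. All data are synthetic and the setup is hypothetical; the study's exact estimator may differ.

```python
# Two-stage sketch of an instrumental-variable logistic regression
# (predictor substitution): stage 1 regresses the endogenous regressor on
# the instrument by least squares; stage 2 fits a logistic model on the
# stage-1 fitted values by gradient descent. All data below are synthetic.
import math

def ols_1d(z, x):
    """Intercept and slope of x ~ z by least squares."""
    n = len(z)
    mz, mx = sum(z) / n, sum(x) / n
    slope = (sum((zi - mz) * (xi - mx) for zi, xi in zip(z, x))
             / sum((zi - mz) ** 2 for zi in z))
    return mx - slope * mz, slope

def fit_logistic(x, y, lr=0.5, steps=2000):
    """One-predictor logistic regression via batch gradient descent."""
    b0 = b1 = 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += p - yi
            g1 += (p - yi) * xi
        b0 -= lr * g0 / len(x)
        b1 -= lr * g1 / len(x)
    return b0, b1

z = [0, 1, 2, 3, 4, 5, 6, 7]                  # instrument
x = [0.2, 0.7, 1.1, 1.6, 2.1, 2.4, 3.1, 3.5]  # endogenous regressor
y = [0, 0, 0, 0, 1, 1, 1, 1]                  # binary outcome
a, b = ols_1d(z, x)
x_hat = [a + b * zi for zi in z]              # stage-1 fitted values
b0, b1 = fit_logistic(x_hat, y)               # stage-2 coefficients
```

Replacing the endogenous regressor with its instrument-predicted values is what breaks the correlation with the error term; the trade-off is that second-stage standard errors then need adjustment (often by bootstrapping).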

Keywords: mental health disorder, substance abuse, empirical evidence, logistic regression with IV

Procedia PDF Downloads 45
315 Comparison of Different Machine Learning Algorithms for Solubility Prediction

Authors: Muhammet Baldan, Emel Timuçin

Abstract:

Molecular solubility prediction plays a crucial role in various fields, such as drug discovery, environmental science, and material science. In this study, we compare the performance of five machine learning algorithms for predicting molecular solubility using the AqSolDB dataset: linear regression, support vector machines (SVM), random forests, gradient boosting machines (GBM), and neural networks. The dataset consists of 9,981 data points with their corresponding solubility values. MACCS keys (166 bits), RDKit properties (20 properties), and structural properties (3) are extracted for every SMILES representation in the dataset, giving a total of 189 features for training and testing for every molecule. Each algorithm is trained on a subset of the dataset and evaluated using accuracy scores. Additionally, the computational time for training and testing is recorded to assess the efficiency of each algorithm. Our results demonstrate that the random forest model outperformed the other algorithms in terms of predictive accuracy, achieving a 0.93 accuracy score. Gradient boosting machines and neural networks also exhibit strong performance, closely followed by support vector machines. Linear regression, while simpler in nature, demonstrates competitive performance but with slightly higher errors compared to the ensemble methods. Overall, this study provides valuable insights into the performance of machine learning algorithms for molecular solubility prediction, highlighting the importance of algorithm selection in achieving accurate and efficient predictions in practical applications.
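The 189-dimensional feature assembly and the accuracy metric described above can be sketched as follows; the zero-filled descriptor values are placeholders rather than real MACCS or RDKit output.

```python
# Sketch of the feature-assembly and evaluation steps: MACCS bits (166),
# RDKit descriptors (20), and structural properties (3) concatenated into a
# 189-dimensional vector, plus the accuracy metric used to score models.
# The zero-filled values are placeholders, not real descriptor output.

def build_feature_vector(maccs_bits, rdkit_props, structural_props):
    assert len(maccs_bits) == 166 and len(rdkit_props) == 20
    assert len(structural_props) == 3
    return list(maccs_bits) + list(rdkit_props) + list(structural_props)

def accuracy(actual, predicted):
    """Fraction of matching labels, as reported for each model."""
    return sum(a == p for a, p in zip(actual, predicted)) / len(actual)

features = build_feature_vector([0] * 166, [0.0] * 20, [0.0] * 3)
acc = accuracy([1, 0, 1, 1], [1, 0, 0, 1])
```

In the actual pipeline, the MACCS bits and descriptors would come from RDKit applied to each SMILES string, and the resulting 189-vectors would feed the five classifiers being compared.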

Keywords: random forest, machine learning, comparison, feature extraction

Procedia PDF Downloads 20