Search results for: machine capacity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6697

5497 An Intelligent Baby Care System Based on IoT and Deep Learning Techniques

Authors: Chinlun Lai, Lunjyh Jiang

Abstract:

Due to the heavy burden and pressure of caring for infants, an integrated automatic baby watching system based on IoT smart sensing and deep learning machine vision techniques is proposed in this paper. By monitoring infant body conditions such as heartbeat, breathing, body temperature and sleeping posture, as well as surrounding conditions such as dangerous/sharp objects, light, noise, humidity and temperature, the proposed system can analyze and predict obvious or potential dangerous conditions from the observed data and then take suitable actions in real time to protect the infant from harm, thus reducing the caregiver's burden and improving the safety and efficiency of care. The experimental results show that the proposed system performs infant care successfully and can be implemented practically in various fields of daily life.

Keywords: baby care system, Internet of Things, deep learning, machine vision

Procedia PDF Downloads 211
5496 Flexural Strengthening of Steel Beams Using Fiber Reinforced Polymers

Authors: Sally Hosny, Mona G. Ibrahim, N. K. Hassan

Abstract:

Fiber reinforced polymers (FRP) offer one of the most environmentally friendly methods for strengthening and retrofitting steel structures. The behaviour of steel I-beams flexurally strengthened with FRP was investigated. Finite element (FE) models were developed in ANSYS® as verification cases to simulate the experimental behaviour of steel I-beams strengthened in flexure with FRP strips. Two experimental studies were selected for verification; the first examined the effect of different thicknesses and moduli of elasticity, while the second studied the effect of applying different carbon fiber reinforced polymer (CFRP) bond lengths. The proposed FE models were in good agreement with the experimental results in terms of failure modes, load bearing capacities and strain distribution on the CFRP strips. The verified FE models were then utilized in a parametric study in which various widths (40, 50, 60, 70 and 80 mm), thicknesses (1.2, 2 and 4 mm) and lengths (1500, 1700 and 1800 mm) of CFRP were analyzed. The results clearly revealed that the load bearing capacity increased significantly (+7%) as the width and thickness increased, whereas it was only slightly affected by longer CFRP strips. Moreover, glass fiber reinforced polymer (GFRP) strips of 1500 mm length, 50 mm width and thicknesses of 1.2, 2 and 4 mm were investigated. The load bearing capacity of I-beams strengthened with GFRP is on average 8% lower than with CFRP. Statistical analysis was conducted using Minitab®.

Keywords: FRP, strengthened steel I-beams, flexural, FEM, ANSYS

Procedia PDF Downloads 259
5495 Classification of IoT Traffic Security Attacks Using Deep Learning

Authors: Anum Ali, Kashaf ad Dooja, Asif Saleem

Abstract:

The future trend of smart cities will be towards the Internet of Things (IoT), which creates dynamic connections in a ubiquitous manner. Smart cities offer ease and flexibility for daily life. With small devices connected to cloud servers, IoT network traffic is growing exponentially, and its security is a serious concern, since the rising rate of cyber attacks leaves this traffic vulnerable. This paper first reviews the latest machine learning approaches in related work; to tackle the increasing rate of cyber attacks, a machine learning algorithm is then applied to IoT-based network traffic data. The proposed algorithm trains itself on the data and, using supervised learning, identifies different segments of device interaction with a classifier tied to a specific class of IoT device. The simulation results clearly identify the attacks and produce fewer false detections.

Keywords: IoT, traffic security, deep learning, classification

Procedia PDF Downloads 133
5494 Heart Ailment Prediction Using Machine Learning Methods

Authors: Abhigyan Hedau, Priya Shelke, Riddhi Mirajkar, Shreyash Chaple, Mrunali Gadekar, Himanshu Akula

Abstract:

The heart is the coordinating centre of the body's major endocrine activity, producing hormones that profoundly affect the body's operation, and diagnosing cardiovascular disease is a difficult but critical task. By extracting knowledge and information about the disease from patient data, data mining offers a practical way to help doctors detect disorders. We use a variety of machine learning methods here, including logistic regression, support vector classifiers (SVC), k-nearest neighbour classifiers (KNN), decision tree classifiers, random forest classifiers and gradient boosting classifiers. These algorithms are applied to patient data containing 13 different factors to build a system that predicts heart disease in less time and with greater accuracy.
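
The classifier-comparison step can be illustrated with a minimal k-nearest-neighbours sketch in pure Python (the feature values, and the reduction to three factors, are hypothetical illustrations, not the paper's 13-factor dataset):

```python
import math

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among the k nearest training
    points (Euclidean distance), as in a KNN classifier."""
    dists = sorted(
        (math.dist(row, x), label) for row, label in zip(train_X, train_y)
    )
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical 3-factor patient vectors (e.g. age, cholesterol,
# max heart rate); labels: 1 = heart disease present.
train_X = [[63, 233, 150], [67, 286, 108], [41, 204, 172], [56, 236, 178]]
train_y = [1, 1, 0, 0]

prediction = knn_predict(train_X, train_y, [60, 250, 120])
```

The same train/predict interface applies to each of the other classifiers the paper compares; only the decision rule changes.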

Keywords: logistic regression, support vector classifier, k-nearest neighbour, decision tree, random forest, gradient boosting

Procedia PDF Downloads 31
5493 Investigating the Performance of Machine Learning Models on PM2.5 Forecasts: A Case Study in the City of Thessaloniki

Authors: Alexandros Pournaras, Anastasia Papadopoulou, Serafim Kontos, Anastasios Karakostas

Abstract:

The air quality of modern cities is an important concern, as poor air quality contributes to human health problems and environmental issues. Reliable air quality forecasting has, thus, gained scientific and governmental attention as an essential tool that enables authorities to take proactive measures for public safety. In this study, the potential of Machine Learning (ML) models to forecast PM2.5 at local scale is investigated in the city of Thessaloniki, the second largest city in Greece, which has been struggling with the persistent issue of air pollution. ML models, with proven ability to address timeseries forecasting, are employed to predict the PM2.5 concentrations and the respective Air Quality Index 5 days ahead by learning from daily historical air quality and meteorological data from 2014 to 2016, gathered from two stations with different land-use characteristics in the urban fabric of Thessaloniki. The performance of the ML models on PM2.5 concentrations is evaluated with common statistical measures, such as the coefficient of determination (r²) and Root Mean Squared Error (RMSE), using a portion of the stations' measurements as a test set. A multi-categorical evaluation is utilized for the assessment of performance on the respective AQIs. Several conclusions were drawn from the experiments. Experimenting with the MLs' configuration revealed a moderate effect of the various parameters and training schemas on the models' predictions. All models were found to produce satisfactory results on PM2.5 concentrations. In addition, their application to untrained stations showed that these models can perform well, indicating generalized behavior. Moreover, their performance on the AQI was even better, showing that the MLs can be used as predictors for the AQI, which is the information provided directly to the general public.
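
The two evaluation measures named above have standard definitions; a minimal sketch in pure Python (the daily PM2.5 values are illustrative, not the Thessaloniki measurements):

```python
import math

def rmse(obs, pred):
    """Root Mean Squared Error between observed and predicted series."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def r_squared(obs, pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

# Illustrative daily PM2.5 values (µg/m³) on a held-out test set.
obs  = [20.0, 35.0, 28.0, 42.0, 31.0]
pred = [22.0, 33.0, 30.0, 39.0, 29.0]
error = rmse(obs, pred)
fit = r_squared(obs, pred)
```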

Keywords: air quality, AQ forecasting, AQI, machine learning, PM2.5

Procedia PDF Downloads 55
5492 Towards Resource Sufficiency in Engineering Education in Sub-Saharan Africa

Authors: Iyabosola B. Oronti, Adeoluwawale A. Adewusi, Olubusola O. Nuga

Abstract:

Sub-Saharan Africa has long been known to be a region rife with poverty, inadequate health facilities, food shortages, high transport and communication costs and very low pace of infrastructural and technological development. These factors combined have led to decades of resource paucity in engineering education. Engineering is core to global development and building of capacity in engineering education with available resources in sub-Saharan Africa has become imperative. This paper identifies core political issues and policy shifts contributing adversely to this present state of affairs, and also explores the offshoots of the changing global political environment as it affects engineering education in the developing nations of sub-Saharan Africa. Opportunities for instituting resource sufficiency are examined and corrective measures that can be taken to resuscitate and stabilize the educational sector in the region are also suggested.

Keywords: capacity building, engineering education, resource sufficiency, sub-Saharan Africa

Procedia PDF Downloads 416
5491 StockTwits Sentiment Analysis on Stock Price Prediction

Authors: Min Chen, Rubi Gupta

Abstract:

Understanding and predicting stock market movements is a challenging problem. It is believed that stock markets are partially driven by public sentiment, which has led to numerous research efforts to predict stock market trends using public sentiment expressed on social media such as Twitter, but with limited success. Recently, the microblogging website StockTwits has become increasingly popular for users to share their discussions and sentiments about stocks and the financial market. In this project, we analyze the text content of StockTwits tweets and extract financial sentiment using text featurization and machine learning algorithms. StockTwits tweets are first pre-processed using techniques including stopword removal, special character removal, and case normalization to remove noise. Features are extracted from these preprocessed tweets through a text featurization process using bags of words, N-gram models, TF-IDF (term frequency-inverse document frequency), and latent semantic analysis. Machine learning models are then trained to classify each tweet's sentiment as positive (bullish) or negative (bearish). The correlation between the aggregated daily sentiment and daily stock price movement is then investigated using Pearson's correlation coefficient. Finally, the sentiment information is applied together with time series stock data to predict stock price movement. Experiments on five companies (Apple, Amazon, General Electric, Microsoft, and Target) over a nine-month period demonstrate the effectiveness of our approach in improving prediction accuracy.
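
The correlation step described above can be sketched in pure Python (the daily sentiment fractions and price changes below are hypothetical, not the study's data):

```python
import math

def pearson(x, y):
    """Pearson's correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical aggregated daily sentiment (fraction of bullish tweets)
# and same-day price change (%).
sentiment = [0.6, 0.2, 0.8, 0.4, 0.7]
price_move = [1.1, -0.8, 1.9, -0.2, 1.3]
r = pearson(sentiment, price_move)
```

A value of r near +1 would indicate that bullish days tend to coincide with price rises, which is the relationship the study investigates.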

Keywords: machine learning, sentiment analysis, stock price prediction, tweet processing

Procedia PDF Downloads 134
5490 Static Response of Homogeneous Clay Stratum to Imposed Structural Loads

Authors: Aaron Aboshio

Abstract:

A numerical study of the static response of a homogeneous clay stratum over a wide range of cohesion values and subject to foundation loads is presented. A linear elastic-perfectly plastic constitutive relation with the von Mises yield criterion was utilised to develop a numerically cost-effective finite element model for the soil, while a rigid body constraint was imposed on the foundation footing. From the analyses carried out, estimates of the bearing capacity factor Nc, the ultimate load-carrying capacities of these soils, the effect of cohesion on foundation settlements, stress fields and failure propagation were obtained. These are consistent with other findings in the literature and hence can be a useful guide in the design of safe foundations in clay soils for buildings and other structures.
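
For undrained (φ = 0) clay, the bearing capacity factor Nc relates cohesion to ultimate pressure via q_ult = c·Nc; a sketch using Prandtl's classical strip-footing value (the cohesion and safety factor are illustrative, not values from the study):

```python
import math

NC_STRIP = 2 + math.pi  # Prandtl's Nc ≈ 5.14 for a strip footing on φ = 0 clay

def ultimate_bearing_pressure(cohesion_kpa, nc=NC_STRIP):
    """q_ult = c * Nc for undrained clay, surcharge neglected (kPa)."""
    return cohesion_kpa * nc

def safe_bearing_pressure(cohesion_kpa, fos=3.0):
    """Ultimate pressure divided by a factor of safety."""
    return ultimate_bearing_pressure(cohesion_kpa) / fos

q_ult = ultimate_bearing_pressure(50.0)   # c = 50 kPa
q_safe = safe_bearing_pressure(50.0)
```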

Keywords: bearing capacity factors, finite element method, safe bearing pressure, structure-soil interaction

Procedia PDF Downloads 282
5489 A Comparative Time-Series Analysis and Deep Learning Projection of Innate Radon Gas Risk in Canadian and Swedish Residential Buildings

Authors: Selim M. Khan, Dustin D. Pearson, Tryggve Rönnqvist, Markus E. Nielsen, Joshua M. Taron, Aaron A. Goodarzi

Abstract:

Accumulation of radioactive radon gas in indoor air poses a serious risk to human health by increasing the lifetime risk of lung cancer and is classified by the IARC as a Group 1 carcinogen. Radon exposure risks are a function of geologic, geographic, design, and human behavioural variables and can change over time. Using time series and deep machine learning modelling, we analyzed long-term radon test outcomes as a function of building metrics from 25,489 Canadian and 38,596 Swedish residential properties constructed between 1945 and 2020. While Canadian and Swedish properties built between 1970 and 1980 are comparable (96–103 Bq/m³), innate radon risks subsequently diverge, rising in Canada and falling in Sweden such that 21st Century Canadian houses show 467% greater average radon (131 Bq/m³) relative to Swedish equivalents (28 Bq/m³). These trends are consistent across housing types and regions within each country. The introduction of energy efficiency measures within Canadian and Swedish building codes coincided with opposing radon level trajectories in each nation. Deep machine learning modelling predicts that, without intervention, average Canadian residential radon levels will increase to 176 Bq/m³ by 2050, emphasizing the importance and urgency of future building code intervention to achieve systemic radon reduction in Canada.

Keywords: radon health risk, time-series, deep machine learning, lung cancer, Canada, Sweden

Procedia PDF Downloads 70
5488 Automatic Detection and Classification of Diabetic Retinopathy Using Retinal Fundus Images

Authors: A. Biran, P. Sobhe Bidari, A. Almazroe, V. Lakshminarayanan, K. Raahemifar

Abstract:

Diabetic retinopathy (DR) is a severe retinal disease caused by diabetes mellitus. It leads to blindness when it progresses to the proliferative level. Early indications of DR are the appearance of microaneurysms, hemorrhages and hard exudates. In this paper, an automatic algorithm for the detection of DR is proposed. The algorithm is based on a combination of several image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering and thresholding. A Support Vector Machine (SVM) classifier is then used to classify retinal images as normal or abnormal, the latter including non-proliferative and proliferative DR. The proposed method has been tested on images selected from the Structured Analysis of the Retina (STARE) database using MATLAB code. The method detects DR reliably: its sensitivity, specificity, and accuracy are 90%, 87.5%, and 91.4%, respectively.
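
The three reported figures follow from the confusion matrix of the classifier; a sketch of the definitions (the counts below are hypothetical and chosen only to illustrate the formulas, not the actual STARE results):

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and accuracy from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Hypothetical counts: 50 DR images, 40 normal images.
sens, spec, acc = diagnostic_metrics(tp=45, fn=5, tn=35, fp=5)
```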

Keywords: diabetic retinopathy, fundus images, STARE, Gabor filter, support vector machine

Procedia PDF Downloads 278
5487 L-Carnitine Supplementation and Exercise-Induced Muscle Damage

Authors: B. Nakhostin-Roohi, F. Khoshkhahesh, KH. Parandak, R. Ramazanzadeh

Abstract:

Introduction: The protective effect of antioxidants in diminishing the post-exercise rise of serum CK and LDH in individuals trained for competitive sports has come to light in recent years. This study was conducted to assess the effect of two-week L-carnitine supplementation on exercise-induced muscle damage, as well as on antioxidant capacity, after a bout of strenuous exercise in active healthy young men. Methodology: Twenty active healthy men volunteered for this study. Participants were randomized in a double-blind placebo-controlled fashion into two groups: an L-carnitine group (C group; n = 10) and a placebo group (P group; n = 10). The participants took the supplement (2000 mg L-carnitine) or placebo (2000 mg lactose) daily for two weeks before the main trial. Then, participants ran 14 km. Blood samples were taken before supplementation, before exercise, and immediately, 2 h and 24 h after exercise. Creatine kinase (CK), lactate dehydrogenase (LDH), and total antioxidant capacity (TAC) were measured. Results: Serum CK and LDH significantly increased after exercise in both groups (p < 0.05). Serum LDH was significantly lower in the C group than the P group 2 h and 24 h after exercise (p < 0.05). Furthermore, CK was significantly lower in the C group compared with the P group 24 h after exercise (p < 0.05). Plasma TAC increased significantly 14 days after supplementation and 24 h after exercise in the C group compared with the P group (p < 0.05). Discussion and conclusion: These results suggest that two weeks of daily oral L-carnitine supplementation can promote antioxidant capacity before and after exercise and decrease muscle damage markers, possibly through inhibition of exercise-induced oxidative stress.

Keywords: L-carnitine, muscle damage, creatine kinase, Lactate dehydrogenase

Procedia PDF Downloads 428
5486 Flexural Response of Sandwiches with Micro Lattice Cores Manufactured via Selective Laser Sintering

Authors: Emre Kara, Ali Kurşun, Halil Aykul

Abstract:

The lightweight sandwiches obtained with various core materials such as foams, honeycombs and lattice structures, which have high energy absorbing capacity and high strength-to-weight ratio, are suitable for several applications in the transport industry (automotive, aerospace, shipbuilding), where fuel savings, increased load carrying capacity, vehicle safety and reduced emission of harmful gases are very important. While sandwich structures with foam and honeycomb cores have been applied for many years, there is growing interest in a new generation of sandwiches with micro lattice cores. Various production methods for these core structures have emerged with the development of technology. One of these is an additive manufacturing technique called selective laser sintering/melting (SLS/SLM), which is very popular nowadays because it saves production time and enables the production of complex topologies. The static bending and dynamic low-velocity impact behaviour of sandwiches with carbon fiber/epoxy skins and micro lattice cores produced via SLS/SLM has been reported in only a few studies. The goal of this investigation was the analysis of the flexural response of sandwiches consisting of glass fiber reinforced plastic (GFRP) skins and micro lattice cores manufactured via SLS under thermo-mechanical loads, comparing the results in terms of peak load and absorbed energy with respect to the effects of core cell size, temperature and support span length. The micro lattice cores were manufactured using SLS technology, which builds the product from a design drawn in 3D computer-aided design (CAD) software. The lattice cores, designed as a body-centered cubic (BCC) model with two different cell sizes (d = 2 and 2.5 mm) and a strut diameter of 0.3 mm, were produced from titanium alloy (Ti6Al4V) powder.
During the production of all the core materials, the same production parameters, such as laser power, laser beam diameter and building direction, were kept constant. The vacuum infusion (VI) method was used to produce the skin materials, made of [0°/90°] woven S-glass prepreg laminates. The core and skins were then combined under VI. Three-point bending tests were carried out on a servo-hydraulic test machine with different support span distances (L = 30, 45, and 60 mm) at various temperatures (T = 23, 40 and 60 °C) in order to analyze the influence of support span and temperature. The failure modes of the collapsed sandwiches were investigated using 3D computed tomography (CT), which allows a three-dimensional reconstruction of the analyzed object. The main results of the bending tests are load-deflection curves, peak force and absorbed energy values. The results were compared according to the effects of cell size, support span and temperature. The obtained results are of particular importance for applications that require lightweight structures with a high capacity for energy dissipation, such as the transport industry, where collision and crash problems have increased in recent years.
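
Peak force is read directly from the load-deflection curve, and absorbed energy is the area under it; a sketch using the trapezoidal rule (the curve below is illustrative, not measured data from the tests):

```python
def absorbed_energy(deflection_mm, load_n):
    """Area under the load-deflection curve (trapezoidal rule).
    mm * N gives mJ, so divide by 1000 to return joules."""
    area_mj = sum(
        (load_n[i] + load_n[i + 1]) / 2 * (deflection_mm[i + 1] - deflection_mm[i])
        for i in range(len(load_n) - 1)
    )
    return area_mj / 1000.0

# Illustrative three-point bending record.
deflection = [0.0, 1.0, 2.0, 3.0, 4.0]   # mm
load = [0.0, 400.0, 900.0, 700.0, 300.0]  # N

peak = max(load)                          # peak force, N
energy = absorbed_energy(deflection, load)  # J
```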

Keywords: light-weight sandwich structures, micro lattice cores, selective laser sintering, transport application

Procedia PDF Downloads 325
5485 Forensic Analysis of Thumbnail Images in Windows 10

Authors: George Kurian, Hongmei Chi

Abstract:

Digital evidence plays a critical role in most legal investigations, and in many cases thumbnail databases reveal important information. The probability of retrieving digital evidence from a computer or smart device has increased, even when the previous user has removed data and deleted apps on those devices. With advances in digital forensics, the ability to recover residual information from various thumbnail applications has improved. This paper focuses on investigating thumbnail information in Windows 10. Thumbnail images of interest in forensic investigations may remain intact even when the original pictures have been deleted. Our research goal is to recover useful information from thumbnails. In this research project, we use various forensic tools to collect residual thumbnail information from deleted videos or pictures. We examine and describe the various thumbnail sources in Windows and propose a methodology for thumbnail collection and analysis on laptops or desktops. A machine learning algorithm is adopted to help speed up the extraction of content from thumbnail pictures.

Keywords: digital forensic, forensic tools, soundness, thumbnail, machine learning, OCR

Procedia PDF Downloads 112
5484 Design and Implementation of an AI-Enabled Task Assistance and Management System

Authors: Arun Prasad Jaganathan

Abstract:

In today's dynamic industrial world, traditional task allocation methods often fall short in adapting to evolving operational conditions. This paper introduces an AI-enabled task assistance and management system designed to overcome the limitations of conventional approaches. By using artificial intelligence (AI) and machine learning (ML), the system intelligently interprets user instructions, analyzes tasks, and allocates resources based on real-time data and environmental factors. Additionally, geolocation tracking enables proactive identification of potential delays, ensuring timely interventions. With its transparent reporting mechanisms, the system provides stakeholders with clear insights into task progress, fostering accountability and informed decision-making. The paper presents a comprehensive overview of the system architecture, algorithm, and implementation, highlighting its potential to revolutionize task management across diverse industries.

Keywords: artificial intelligence, machine learning, task allocation, operational efficiency, resource optimization

Procedia PDF Downloads 31
5483 Comparative Analysis of Traditional and Modern Roundabouts Using Sidra Intersection

Authors: Amir Mohammad Parvini, Amir Masoud Rahimi

Abstract:

Most parts of the world have now shifted from traditional roundabouts to modern roundabouts, given the role of modern roundabouts in reducing accidents, increasing safety and lowering maintenance costs compared with traffic circles, whose functional and safety record is poor. In this study, field data collected from an existing traditional roundabout were analyzed with the software AIMSUN and the resulting figures were recorded. The modern roundabout was designed by modifying the traditional one according to the geometric standards listed in regulations. The modern roundabout was then analyzed under heterogeneous traffic with the micro-simulation software SIDRA (5.1). The function, capacity, and safety of the roundabout were analyzed assuming the superiority of modern roundabouts and an acceptable LOS. The obtained results indicate that the function, capacity, and safety of modern roundabouts are better than those of traditional ones.

Keywords: traditional roundabout, traffic circles, modern roundabout, AIMSUN, SIDRA

Procedia PDF Downloads 375
5482 Visitor Management in the National Parks: Recreational Carrying Capacity Assessment of Çıralı Coast, Turkey

Authors: Tendü H. Göktuğ, Gönül T. İçemer, Bülent Deniz

Abstract:

National parks, which are rich in natural and cultural resource values and are protected in the context of sustainable development, are among the most important recreational areas, with demand increasing with each passing day. Increasing or unplanned recreational use negatively affects both resource values and visitor satisfaction. The intent of national park management is to protect the natural and cultural resource values and to provide visitors with a quality recreational experience as well. In this context, current studies aimed at improving tourism and recreation planning and visitor management have focused on recreational carrying capacity analysis. The aim of this study is to analyze the recreational carrying capacity of Çıralı Coast in the Bey Mountains Coastal National Park, to compare the results of the analysis with the current pattern of use, and to develop alternative management strategies. In the first phase of the study, the annual and daily visitation, the geographic, bio-physical and managerial characteristics of the park, and the types and areas of recreational use were analyzed. In addition, ecological observations were carried out in order to determine recreation-based pressures on the ecosystems. On-site questionnaires were administered to a sample of 284 respondents in August 2015 and 2016 to collect data concerning demographics and visit characteristics. In the second phase of the study, the coastal area was separated into four different usage zones, and the methodology proposed by Cifuentes (1992) was used for the capacity analyses. This method enables the calculation of physical, real and effective carrying capacities by combining environmental, ecological, climatic and managerial parameters in a formulation. The estimated numbers for the three levels of carrying capacity were compared to the current numbers of the national park's visitors.
The study determined that current recreational use in the north of the beach caused ecological pressures, and that current visitor numbers in the south of the beach far exceed the estimated numbers. Based on these results, management strategies were defined and appropriate management tools were developed in accordance with these strategies. The authors are grateful for the financial support of this project by The Scientific and Technological Research Council of Turkey (No: 114O344).
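
The three capacity levels in the Cifuentes methodology can be sketched as follows (all numbers and correction factors below are illustrative, not the Çıralı measurements):

```python
def physical_cc(area_m2, area_per_visitor_m2, rotation_factor):
    """PCC: area divided by space per visitor, times daily rotations."""
    return area_m2 / area_per_visitor_m2 * rotation_factor

def real_cc(pcc, correction_factors):
    """RCC: PCC reduced by limiting correction factors (each in 0..1)."""
    rcc = pcc
    for cf in correction_factors:
        rcc *= cf
    return rcc

def effective_cc(rcc, management_capacity):
    """ECC: RCC scaled by the fraction of management capacity available."""
    return rcc * management_capacity

# Illustrative beach zone: 1 ha, 5 m² per visitor, 2 rotations per day.
pcc = physical_cc(area_m2=10000, area_per_visitor_m2=5, rotation_factor=2)
rcc = real_cc(pcc, [0.8, 0.9])   # e.g. rainfall and erosion corrections
ecc = effective_cc(rcc, 0.75)
```

Comparing observed visitor counts per zone against ECC is what flags the over-used southern zone described above.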

Keywords: Çıralı Coast, national parks, recreational carrying capacity, visitor management

Procedia PDF Downloads 258
5481 Detecting Cyberbullying, Spam and Bot Behavior and Fake News in Social Media Accounts Using Machine Learning

Authors: M. D. D. Chathurangi, M. G. K. Nayanathara, K. M. H. M. M. Gunapala, G. M. R. G. Dayananda, Kavinga Yapa Abeywardena, Deemantha Siriwardana

Abstract:

Due to the growing popularity of social media platforms, there are various concerns, mostly cyberbullying, spam, bot accounts, and the spread of incorrect information. To develop a risk score calculation system as a thorough method for deciphering and exposing unethical social media profiles, this research explores the most suitable algorithms, to the best of our knowledge, for detecting the concerns mentioned. Multiple models, such as Naïve Bayes, CNN, KNN, Stochastic Gradient Descent and Gradient Boosting Classifier, were examined, and the best-performing ones were taken into the development of the risk score system. For cyberbullying, the logistic regression algorithm achieved an accuracy of 84.9%, while the spam-detecting MLP model gained 98.02% accuracy. The Random Forest algorithm identifying bot accounts obtained 91.06% accuracy, and 84% accuracy was achieved for fake news detection using an SVM.
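
A risk score that combines the four detectors could be as simple as a weighted average of their per-profile probabilities; this sketch is an assumption about how such a score might be formed, not the paper's actual formula:

```python
def risk_score(probabilities, weights):
    """Weighted combination of per-detector probabilities (each in 0..1),
    normalised to a 0-100 risk score."""
    total = sum(w * p for w, p in zip(weights, probabilities))
    return 100 * total / sum(weights)

# Hypothetical detector outputs for one profile, in the order:
# cyberbullying, spam, bot behaviour, fake news (equal weights assumed).
score = risk_score([0.7, 0.1, 0.9, 0.3], [1.0, 1.0, 1.0, 1.0])
```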

Keywords: cyberbullying, spam behavior, bot accounts, fake news, machine learning

Procedia PDF Downloads 17
5480 Preliminary Phytopharmacological Evaluation of Methanol and Petroleum Ether Extracts of Selected Vegetables of Bangladesh

Authors: Mohammad Abdul Motalib Momin, Sheikh Mohammad Adil Uddin, Md Mamunur Rashid, Sheikh Arman Mahbub, Mohammad Sazzad Rahman, Abdullah Faruque

Abstract:

The present study was designed to investigate the antioxidant and cytotoxic potential of methanol and petroleum ether extracts of Lagenaria siceraria (LM, LP), Cucumis sativus (CSM, CSP) and Cucurbita maxima (CMM, CMP). For phytochemical screening, the crude extracts were tested for the presence of different chemical groups. In Lagenaria siceraria the following groups were identified: alkaloids, steroids, glycosides and saponins in the methanol extract, and alkaloids, steroids, glycosides, tannins and saponins in the petroleum ether extract. Glycosides, steroids, alkaloids, saponins and tannins are present in the methanol extract of Cucumis sativus; the petroleum ether extract contains alkaloids, steroids and saponins. Glycosides, steroids, alkaloids, saponins and tannins are present in both the methanol and petroleum ether extracts of Cucurbita maxima. The in vitro antioxidant activity of the extracts was assessed using DPPH radical scavenging, nitric oxide (NO) scavenging, total antioxidant capacity, total phenol content, total flavonoid content, and cupric reducing antioxidant capacity assays. The most prominent antioxidant activity in the DPPH free radical scavenging test was observed with CSM, with an IC50 value of 1667.23±11.00271 μg/ml, as opposed to that of standard ascorbic acid (IC50 of 15.707±1.181 μg/ml). In the total antioxidant capacity method, CMP showed the highest activity (427.81±11.4 mg ascorbic acid/g). The total phenolic and flavonoid contents were determined by the Folin-Ciocalteu reagent and aluminium chloride colorimetric methods, respectively. The highest total phenol and total flavonoid contents were found in CMM and LP, with values of 79.06±16.06 mg gallic acid/g and 119.0±1.41 mg quercetin/g, respectively. In nitric oxide (NO) scavenging, the most prominent antioxidant activity was observed in CMM, with an IC50 value of 8.119±0.0036 μg/ml.
The cupric reducing capacity of the extracts was strong and dose-dependent, with CSM showing the lowest reducing capacity. Cytotoxicity was determined by the brine shrimp lethality test, and among these extracts the most potent cytotoxicity was shown by CMM, with an LC50 value of 16.98 µg/ml. The obtained results indicate that the investigated plants could be potential sources of natural antioxidants and could be useful against various types of diseases.
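
An IC50 of the kind reported above can be estimated by interpolating the dose-response series for the concentration giving 50% inhibition; a minimal sketch (the dose-response data are illustrative, not the extracts' measurements):

```python
def ic50(concentrations, inhibition_pct):
    """Linearly interpolate the concentration giving 50% inhibition
    from a monotonically increasing dose-response series."""
    points = list(zip(concentrations, inhibition_pct))
    for (c1, i1), (c2, i2) in zip(points, points[1:]):
        if i1 <= 50 <= i2:
            return c1 + (50 - i1) * (c2 - c1) / (i2 - i1)
    raise ValueError("50% inhibition not bracketed by the data")

# Illustrative DPPH scavenging data: concentration (µg/ml) vs % inhibition.
conc = [5.0, 10.0, 20.0, 40.0]
inhib = [20.0, 38.0, 61.0, 85.0]
value = ic50(conc, inhib)
```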

Keywords: antioxidant, cytotoxicity, methanol, petroleum ether

Procedia PDF Downloads 555
5479 [Keynote Speech]: Feature Selection and Predictive Modeling of Housing Data Using Random Forest

Authors: Bharatendra Rai

Abstract:

Predictive data analysis and modeling involving machine learning techniques become challenging in the presence of too many explanatory variables or features. Too many features in machine learning are known not only to slow algorithms down but also to decrease model prediction accuracy. This study involves a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider while buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used for developing predictive random forest models. The study also explores five different data partitioning ratios; their impact on model accuracy is captured using the coefficient of determination (r-square) and root mean square error (RMSE).
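
Boruta's core idea, confirming only features whose importance beats that of permuted "shadow" copies, can be sketched in pure Python. Two simplifications are assumed here: absolute correlation stands in for random forest importance, and a deterministic cyclic shift stands in for random shuffling; the toy feature names are hypothetical:

```python
def importance(feature, target):
    """Crude importance proxy: |Pearson correlation| with the target.
    (Boruta itself uses random forest importances.)"""
    n = len(feature)
    mf, mt = sum(feature) / n, sum(target) / n
    cov = sum((f - mf) * (t - mt) for f, t in zip(feature, target))
    sf = sum((f - mf) ** 2 for f in feature) ** 0.5
    st = sum((t - mt) ** 2 for t in target) ** 0.5
    return abs(cov / (sf * st))

def boruta_like(features, target):
    """Confirm features whose importance beats the best 'shadow' copy."""
    shadow_best = 0.0
    for col in features.values():
        shadow = col[1:] + col[:1]  # cyclic shift as the permuted copy
        shadow_best = max(shadow_best, importance(shadow, target))
    return [name for name, col in features.items()
            if importance(col, target) > shadow_best]

# Toy housing data: 'area' drives price, 'lot_id' is noise.
target = [100.0, 150.0, 200.0, 250.0, 300.0]   # sale price
features = {
    "area":   [50.0, 75.0, 100.0, 125.0, 150.0],  # informative
    "lot_id": [3.0, 1.0, 4.0, 1.0, 5.0],          # unrelated
}
confirmed = boruta_like(features, target)
```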

Keywords: housing data, feature selection, random forest, Boruta algorithm, root mean square error

Procedia PDF Downloads 303
5478 Significance of High Specific Speed in Circulating Water Pump, Which Can Cause Cavitation, Noise and Vibration

Authors: Chandra Gupt Porwal

Abstract:

Excessive vibration means increased wear, increased repair effort, poor product selection and quality, and high energy consumption. It may be caused by cavitation or by suction/discharge recirculation, which can occur only when the net positive suction head available (NPSHA) drops below the net positive suction head required (NPSHR). Excessive cavitation can cause axial surging, frequently damage mechanical seals, bearings and possibly other pump components, and shorten the life of the impeller. This paper explains suction energy (SE), specific speed (Ns), suction specific speed (Nss), NPSHA and NPSHR and their significance, the possible causes of cavitation and internal recirculation, their diagnostics, and remedial measures to arrest and prevent cavitation. A case study is presented by the author highlighting that the root cause of unwanted noise and vibration was cavitation, caused by high specific speeds or inadequate net positive suction head available, which damaged the material surfaces of the impeller and suction bells and degraded the machine's performance, capacity and efficiency. The author strongly recommends revisiting the technical specifications of CW pumps for future projects to provide NPSH margin ratios > 1.5 and to limit Nss to 8500-9000 for cavitation-free operation.
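The quantities discussed can be sketched numerically. The snippet below computes suction specific speed with the customary US-units formula Nss = N·sqrt(Q)/NPSHR^0.75 and checks it against the recommended 8500-9000 limit and the >1.5 NPSH margin ratio; the pump figures are hypothetical, not from the case study:

```python
import math

def suction_specific_speed(rpm, flow_gpm, npshr_ft):
    """Nss = N * sqrt(Q) / NPSHR^0.75 in US units (rpm, gpm, ft).
    For double-suction impellers, Q is taken per impeller eye."""
    return rpm * math.sqrt(flow_gpm) / npshr_ft ** 0.75

def npsh_margin_ratio(npsha_ft, npshr_ft):
    """Margin ratio NPSHA/NPSHR; >1.5 is the recommended minimum."""
    return npsha_ft / npshr_ft

# hypothetical circulating-water pump operating point
nss = suction_specific_speed(rpm=590, flow_gpm=100_000, npshr_ft=28)
print(round(nss))                            # well above the 8500-9000 limit
print(npsh_margin_ratio(35.0, 28.0) > 1.5)   # prints False: margin too small
```

Both checks failing, as here, would flag the pump specification for the kind of cavitation-driven noise and vibration the case study describes.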

Keywords: best efficiency point (BEP), net positive suction head NPSHA, NPSHR, specific speed NS, suction specific speed NSS

Procedia PDF Downloads 243
5477 Current Methods for Drug Property Prediction in the Real World

Authors: Jacob Green, Cecilia Cabrera, Maximilian Jakobs, Andrea Dimitracopoulos, Mark van der Wilk, Ryan Greenhalgh

Abstract:

Predicting drug properties is key in drug discovery to enable de-risking of assets before expensive clinical trials and to find highly active compounds faster. Interest from the machine learning community has led to the release of a variety of benchmark datasets and proposed methods. However, it remains unclear for practitioners which method or approach is most suitable, as different papers benchmark on different datasets and methods, leading to varying conclusions that are not easily compared. Our large-scale empirical study links together numerous earlier works on different datasets and methods, thus offering a comprehensive overview of the existing property classes, datasets, and their interactions with different methods. We emphasise the importance of uncertainty quantification and the time, and therefore cost, of applying these methods in the drug development decision-making cycle. We observe that the optimal approach varies depending on the dataset and that engineered features with classical machine learning methods often outperform deep learning. Specifically, QSAR datasets are typically best analysed with classical methods such as Gaussian processes, while ADMET datasets are sometimes better described by trees or deep learning methods such as graph neural networks or language models. Our work highlights that practitioners do not yet have a straightforward, black-box procedure to rely on, and it sets a precedent for creating practitioner-relevant benchmarks. Deep learning approaches must be proven on these benchmarks to become the practical method of choice in drug property prediction.
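A hedged sketch of the "classical methods with uncertainty quantification" point above: a Gaussian process regressor fit on engineered descriptors returns both a prediction and a standard deviation, the latter being the uncertainty estimate the study argues matters for drug-development decisions. The descriptors and activity values below are synthetic stand-ins, not QSAR data from the study:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.random((60, 16))                  # 16 engineered descriptors per compound
y = 2.0 * X[:, 0] - X[:, 1] + rng.normal(0, 0.05, 60)  # synthetic activity

# RBF kernel for smooth structure, WhiteKernel for observation noise
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X[:50], y[:50])

# predictive mean AND standard deviation for the held-out compounds
mean, std = gp.predict(X[50:], return_std=True)
print(mean.shape, std.shape)              # (10,) (10,)
```

The per-compound `std` is what distinguishes this setup from a plain point-prediction model: compounds with large predicted uncertainty can be deprioritised or sent for assay rather than trusted.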

Keywords: activity (QSAR), ADMET, classical methods, drug property prediction, empirical study, machine learning

Procedia PDF Downloads 56
5476 Predictive Models of Ruin Probability in Retirement Withdrawal Strategies

Authors: Yuanjin Liu

Abstract:

Retirement withdrawal strategies are very important for minimizing the probability of ruin in retirement. The ruin probability is modeled as a function of initial withdrawal age, gender, asset allocation, inflation rate, and initial withdrawal rate. It is obtained by simulation based on the 2019 period life table for Social Security, IRS Required Minimum Distribution (RMD) worksheets, US historical bond and equity returns, and inflation rates. Several popular machine learning models are built: a generalized additive model, random forest, support vector machine, extreme gradient boosting, and an artificial neural network. Model validation and selection are based on test errors, using hyperparameter tuning and a train-test split. The optimal model is recommended for retirees to monitor the ruin probability, and the optimal withdrawal strategy can be obtained from the optimal predictive model.
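A minimal Monte Carlo sketch of the ruin probability being modeled, i.e. the fraction of simulated retirements in which the portfolio is exhausted within the horizon. The normal return model, fixed 30-year horizon, and all parameter values are simplifying assumptions, not the paper's inputs (which used historical returns, RMD worksheets, and the 2019 period life table):

```python
import random

def ruin_probability(initial_wealth, withdrawal_rate, years=30,
                     mean_return=0.05, sd_return=0.12,
                     inflation=0.03, n_sims=20_000, seed=1):
    """Fraction of simulated retirements that run out of money."""
    random.seed(seed)
    ruined = 0
    for _ in range(n_sims):
        wealth = initial_wealth
        withdrawal = initial_wealth * withdrawal_rate
        for _ in range(years):
            # withdraw at the start of the year, then apply a random return
            wealth = (wealth - withdrawal) * (1 + random.gauss(mean_return, sd_return))
            withdrawal *= 1 + inflation        # inflation-adjusted withdrawal
            if wealth <= 0:
                ruined += 1
                break
    return ruined / n_sims

print(ruin_probability(1_000_000, 0.04))       # the classic "4% rule"
```

The machine learning step in the study then fits models to a table of such simulated probabilities across ages, genders, allocations, and rates, so that the ruin probability can be predicted without rerunning the simulation.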

Keywords: ruin probability, retirement withdrawal strategies, predictive models, optimal model

Procedia PDF Downloads 59
5475 Effect of Cavities on the Behaviour of Strip Footing Subjected to Inclined Load

Authors: Ali A. Al-Jazaairry, Tahsin T. Sabbagh

Abstract:

One of the important concerns within the field of geotechnical engineering is the presence of cavities in soils. The present work is an attempt to understand the behaviour of a strip footing subjected to inclined load and constructed on cavitied soil. The failure mechanism of a strip footing located above such soils was studied analytically. The capability of the analytical model to correctly predict the system behaviour is assessed by carrying out verification analyses against available studies. The study was prepared with finite element software (PLAXIS), in which an elastic-perfectly plastic soil model was used. The results indicate that the load carrying capacity of a foundation constructed over a cavity can be analysed well using such analysis. The research covered many foundation cases; in each case there is a critical depth below which the presence of cavities has minimal impact on foundation performance. When cavities lie above this critical depth, the load carrying capacity of the foundation varies with several factors, such as the location and size of the cavity and the footing depth. Figures relating the load carrying capacity to the factors studied are presented; they offer information useful for the design of strip footings resting over underground cavities. Moreover, the results may be used to design shallow foundations constructed on cavitied soil, and the obtained failure mechanisms may be employed to improve numerical solutions for this kind of problem.

Keywords: axial load, cavity, inclined load, strip footing

Procedia PDF Downloads 238
5474 A Systematic Review Investigating the Use of EEG Measures in Neuromarketing

Authors: A. M. Byrne, E. Bonfiglio, C. Rigby, N. Edelstyn

Abstract:

Introduction: Neuromarketing employs numerous methodologies when investigating product and advertisement effectiveness. Electroencephalography (EEG), a non-invasive measure of electrical activity from the brain, is commonly used in neuromarketing. EEG data can be considered using time-frequency (TF) analysis, where changes in the frequency of brainwaves are calculated to infer participants' mental states, or event-related potential (ERP) analysis, where changes in amplitude are observed in direct response to a stimulus. This presentation discusses the findings of a systematic review of EEG measures in neuromarketing. A systematic review summarises evidence on a research question, using explicit measures to identify, select, and critically appraise relevant research papers. This systematic review identifies which EEG measures are the most robust predictors of customer preference and purchase intention. Methods: Search terms identified 174 papers that used EEG in combination with marketing-related stimuli. Publications were excluded if they were written in a language other than English or were not published as journal articles (e.g., book chapters). The review investigated which TF effect (e.g., theta-band power) and ERP component (e.g., N400) most consistently reflected preference and purchase intention. Machine-learning prediction was also investigated, along with the use of EEG combined with physiological measures such as eye-tracking. Results: Frontal alpha asymmetry was the most reliable TF signal, where an increase in activity over the left side of the frontal lobe indexed a positive response to marketing stimuli, while an increase in activity over the right side indexed a negative response. The late positive potential, a positive amplitude increase around 600 ms after stimulus presentation, was the most reliable ERP component, reflecting the conscious emotional evaluation of marketing stimuli.
However, each measure showed mixed results when related to preference and purchase behaviour. Predictive accuracy was greatly improved through machine-learning algorithms such as deep neural networks, especially when combined with eye-tracking or facial expression analyses. Discussion: This systematic review provides a novel catalogue of the most effective uses of each EEG measure commonly employed in neuromarketing. Notable findings are the identification of frontal alpha asymmetry and the late positive potential as markers of preferential responses to marketing stimuli. Machine-learning algorithms achieved predictive accuracies as high as 97%, and future research should therefore focus on machine-learning prediction when using EEG measures in neuromarketing.
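For readers unfamiliar with the frontal alpha asymmetry measure highlighted above, the sketch below computes one conventional version of it: log alpha-band power at a right frontal electrode minus log alpha-band power at its left homologue. The F3/F4 pairing, 8-13 Hz band, sampling rate, and synthetic signals are illustrative assumptions, not the reviewed studies' pipelines:

```python
import numpy as np
from scipy.signal import welch

def frontal_alpha_asymmetry(f3_signal, f4_signal, fs=256):
    """FAA = ln(alpha power at F4, right) - ln(alpha power at F3, left).
    Alpha power is inversely related to cortical activity, so a positive
    score indexes relatively greater LEFT frontal activity, associated
    with approach/positive responses to stimuli."""
    def alpha_power(x):
        freqs, psd = welch(x, fs=fs, nperseg=fs * 2)
        band = (freqs >= 8) & (freqs <= 13)      # alpha band, 8-13 Hz
        return psd[band].sum()
    return np.log(alpha_power(f4_signal)) - np.log(alpha_power(f3_signal))

# synthetic 4 s recording: stronger 10 Hz alpha on the right channel
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1 / 256)
f3 = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.3, t.size)
f4 = 1.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.3, t.size)
print(frontal_alpha_asymmetry(f3, f4) > 0)       # prints True
```

Machine-learning pipelines in the reviewed papers typically feed features like this one, alongside ERP amplitudes and eye-tracking metrics, into a classifier of preference.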

Keywords: EEG, ERP, neuromarketing, machine-learning, systematic review, time-frequency

Procedia PDF Downloads 97
5473 Climate Changes in Albania and Their Effect on Cereal Yield

Authors: Lule Basha, Eralda Gjika

Abstract:

This study is focused on analyzing climate change in Albania and its potential effects on cereal yields. Initially, monthly temperature and rainfall in Albania were studied for the period 1960-2021. Climatic variables are important when modeling cereal yield behavior, especially when significant changes in weather conditions are observed. For this purpose, in the second part of the study, linear and nonlinear models explaining cereal yield are constructed for the same period, 1960-2021. Multiple linear regression analysis and the lasso regression method are applied to the data, relating cereal yield to each independent variable: average temperature, average rainfall, fertilizer consumption, arable land, land under cereal production, and nitrous oxide emissions. In our regression model, heteroscedasticity is not observed, the data follow a normal distribution, and there is low correlation between factors, so there is no multicollinearity problem. Machine-learning methods, such as random forest, are used to predict cereal yield responses to climatic and other variables. Random forest showed high accuracy compared with the other statistical models in predicting cereal yield. We found that changes in average temperature negatively affect cereal yield, while the coefficients of fertilizer consumption, arable land, and land under cereal production affect production positively. Our results show that random forest is an effective and versatile machine-learning method for cereal yield prediction compared with the other two methods.
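A compact sketch of the model comparison described above, fitting lasso regression and a random forest to the same predictors and comparing held-out error. The data are synthetic placeholders, with 62 rows standing in for the yearly records 1960-2021; the true coefficients and noise level are invented:

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)
n = 62                                  # one row per year, 1960-2021
# columns stand in for: avg temperature, avg rainfall, fertilizer,
# arable land, land under cereal production
X = rng.random((n, 5))
# temperature hurts yield; fertilizer and land area help (as in the study)
y = -1.5 * X[:, 0] + 0.8 * X[:, 2] + 0.6 * X[:, 3] + rng.normal(0, 0.1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=42)
for name, model in [("lasso", Lasso(alpha=0.01)),
                    ("random forest", RandomForestRegressor(random_state=42))]:
    model.fit(X_tr, y_tr)
    print(name, round(mean_absolute_error(y_te, model.predict(X_te)), 3))
```

With real data, the lasso's coefficient signs would be inspected for the temperature and fertilizer effects, while the random forest would be compared on held-out error, mirroring the study's two-part analysis.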

Keywords: cereal yield, climate change, machine learning, multiple regression model, random forest

Procedia PDF Downloads 74
5472 Identifying the Phases of Indian Agriculture Towards Desertification: An Introspect of Karnataka State, India

Authors: Arun Das

Abstract:

Indian agriculture dates back to the Indus civilization (2500 BC). Since then, there has been tremendous expansion in both extent and technology, with abrupt technological growth over the past century and a half. As a consequence, the land first brought under agriculture no longer possesses its original physical condition: it has either lost its productive capacity or been modified into semi-agricultural land. On the grounds of its capacity and interwoven characteristics, seven phases of the agricultural scenario have been identified. Most of the land is on the march towards desertification. Identifying the stages and phases of the agricultural scenario is highly relevant from the point of view of food security at the regional, national and global levels. Furthermore, decisive measures can arrest the degenerating environmental conditions. GIS and remote sensing applications have been used to identify the phases of agriculture.

Keywords: agriculture phases, desertification, deforestation, foods security, transmigration

Procedia PDF Downloads 413
5471 Genetic Algorithms for Feature Generation in the Context of Audio Classification

Authors: José A. Menezes, Giordano Cabral, Bruno T. Gomes

Abstract:

Choosing good features is an essential part of machine learning, and recent techniques aim to automate this process. For instance, feature learning intends to learn the transformation of raw data into a representation useful for machine learning tasks. This is interesting in automatic audio classification, since audio, which is usually complex information, needs to be transformed into a computationally convenient input for processing. Another technique generates features by searching a feature space; genetic algorithms, for instance, have been used to generate audio features by combining or modifying them. We find this approach particularly interesting and, despite the undeniable advances of feature learning approaches, we take a step forward in the use of genetic algorithms to find audio features, combining them with more conventional methods, like PCA, and inserting search control mechanisms, such as constraints over a confusion matrix. This work presents the results obtained on particular audio classification problems.
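A toy version of the idea of evolving combined features with a genetic algorithm: each individual encodes a new feature as an operation over two base descriptors, and fitness is a simple class-separability score. The base "audio" descriptors and the Fisher-like fitness are illustrative assumptions, not the authors' setup (which uses classifier-based evaluation with confusion-matrix constraints):

```python
import random
import statistics

random.seed(0)
OPS = {"add": lambda a, b: a + b,
       "sub": lambda a, b: a - b,
       "mul": lambda a, b: a * b}
D = 6  # number of base descriptors (stand-ins for e.g. MFCC means)

def fisher_score(ind, X, y):
    """Separation of the generated feature's class means, scaled by spread."""
    op, i, j = ind
    v = [OPS[op](row[i], row[j]) for row in X]
    g0 = [x for x, t in zip(v, y) if t == 0]
    g1 = [x for x, t in zip(v, y) if t == 1]
    spread = statistics.pstdev(g0) + statistics.pstdev(g1) + 1e-9
    return abs(statistics.mean(g0) - statistics.mean(g1)) / spread

def random_ind():
    return (random.choice(list(OPS)), random.randrange(D), random.randrange(D))

def evolve(X, y, pop_size=30, generations=25):
    pop = [random_ind() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fisher_score(ind, X, y), reverse=True)
        survivors = pop[:pop_size // 2]          # truncation selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            child = (a[0], b[1], a[2])           # gene-wise crossover
            if random.random() < 0.2:            # mutation: fresh individual
                child = random_ind()
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda ind: fisher_score(ind, X, y))

# synthetic data: only the product of descriptors 0 and 1 separates the classes
X = [[random.gauss(0, 1) for _ in range(D)] for _ in range(300)]
y = [int(row[0] * row[1] > 0) for row in X]
best = evolve(X, y)
print(best, round(fisher_score(best, X, y), 2))
```

The GA should discover a multiplicative combination of descriptors 0 and 1, which no single base feature or linear method (like PCA alone) would expose.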

Keywords: feature generation, feature learning, genetic algorithm, music information retrieval

Procedia PDF Downloads 415
5470 Optimization and Operation of Charging and Discharging Stations for Hybrid Cars and their Effects on the Electricity Distribution Network

Authors: Ali Heydarimoghim

Abstract:

In this paper, the placement of charging and discharging stations is optimized to determine the location and capacity of the stations, reducing the cost of electric vehicle owners' losses, the cost of distribution system losses, and the costs associated with the stations. The permissible limits of bus voltage, the capacity of the stations, and the distance between them are treated as constraints of the problem. Given the traffic situation in different areas of a city, we estimate the amount of energy required to charge, and the amount of energy provided by discharging, electric vehicles in each area. We then introduce the electricity distribution system of the city in question, followed by the problem scenarios and the objective and constraint functions. Finally, the simulation results for different scenarios are compared.

Keywords: charging & discharging stations, hybrid vehicles, optimization, placement

Procedia PDF Downloads 120
5469 Fuzzy Data, Random Drift, and a Theoretical Model for the Sequential Emergence of Religious Capacity in Genus Homo

Authors: Margaret Boone Rappaport, Christopher J. Corbally

Abstract:

The ancient ape ancestral population from which living great ape and human species evolved had demographic features affecting their evolution. The population was large, had great genetic variability, and natural selection was effective at honing adaptations. The emerging populations of chimpanzees and humans were affected more by founder effects and genetic drift because they were smaller. Natural selection did not disappear, but it was not as strong. Consequences of the 'population crash' and the human effective population size are introduced briefly. The history of the ancient apes is written in the genomes of living humans and great apes. The expansion of the brain began before the human line emerged. Coalescence times for some genes are very old – up to several million years, long before Homo sapiens. The mismatch between gene trees and species trees highlights the anthropoid speciation processes, and gives the human genome history a fuzzy, probabilistic quality. However, it suggests traits that might form a foundation for capacities emerging later. A theoretical model is presented in which the genomes of early ape populations provide the substructure for the emergence of religious capacity later on the human line. The model does not search for religion, but its foundations. It suggests a course by which an evolutionary line that began with prosimians eventually produced a human species with biologically based religious capacity. The model of the sequential emergence of religious capacity relies on cognitive science, neuroscience, paleoneurology, primate field studies, cognitive archaeology, genomics, and population genetics. And, it emphasizes five trait types: (1) Documented, positive selection of sensory capabilities on the human line may have favored survival, but also eventually enriched human religious experience. 
(2) The bonobo model suggests a possible down-regulation of aggression and increase in tolerance while feeding, as well as paedomorphism – but, in a human species that remains cognitively sharp (unlike the bonobo). The two species emerged from the same ancient ape population, so it is logical to search for shared traits. (3) An up-regulation of emotional sensitivity and compassion seems to have occurred on the human line. This finds support in modern genetic studies. (4) The authors’ published model of morality's emergence in Homo erectus encompasses a cognitively based, decision-making capacity that was hypothetically overtaken, in part, by religious capacity. Together, they produced a strong, variable, biocultural capability to support human sociability. (5) The full flowering of human religious capacity came with the parietal expansion and smaller face (klinorhynchy) found only in Homo sapiens. Details from paleoneurology suggest the stage was set for human theologies. Larger parietal lobes allowed humans to imagine inner spaces, processes, and beings, and, with the frontal lobe, led to the first theologies composed of structured and integrated theories of the relationships between humans and the supernatural. The model leads to the evolution of a small population of African hominins that was ready to emerge with religious capacity when the species Homo sapiens evolved two hundred thousand years ago. By 50-60,000 years ago, when human ancestors left Africa, they were fully enabled.

Keywords: genetic drift, genomics, parietal expansion, religious capacity

Procedia PDF Downloads 322
5468 Predictive Pathogen Biology: Genome-Based Prediction of Pathogenic Potential and Countermeasures Targets

Authors: Debjit Ray

Abstract:

Horizontal gene transfer (HGT) and recombination lead to the emergence of bacterial antibiotic resistance and pathogenic traits. HGT events can be identified by comparing a large number of fully sequenced genomes across a species or genus to define the phylogenetic range of HGT and find potential sources of new resistance genes. In-depth comparative phylogenomics can also identify subtle genome or plasmid structural changes or mutations associated with phenotypic changes. Comparative phylogenomics requires accurately sequenced, complete and properly annotated genomes of the organism. Assembling closed genomes requires additional mate-pair reads or “long read” sequencing data to accompany short-read paired-end data. To bring down the cost and time required to produce assembled genomes and to annotate genome features that inform drug resistance and pathogenicity, we are analyzing the genome-assembly performance of data from the Illumina NextSeq, which has faster throughput than the Illumina HiSeq (~1-2 days versus ~1 week) and, compared to the Illumina MiSeq, shorter reads (150 bp paired-end versus 300 bp paired-end) but higher capacity (150-400M reads per run versus ~5-15M). Bioinformatics improvements are also needed to make rapid, routine production of complete genomes a reality. Modern assemblers such as SPAdes 3.6.0 running on a standard Linux blade can, in a few hours, convert mixes of reads from different library preps into high-quality assemblies with only a few gaps. Remaining breaks in scaffolds are generally due to repeats (e.g., rRNA genes) and are addressed by our software for gap closure, which avoids custom PCR or targeted sequencing. Our goal is to improve the understanding of the emergence of pathogenesis using sequencing, comparative genomics, and machine learning analysis of ~1000 pathogen genomes.
Machine learning algorithms will be used to digest the diverse features (changes in virulence genes, recombination, horizontal gene transfer, patient diagnostics). Temporal data and evolutionary models can thus determine whether a particular isolate is likely to have originated from the environment (could it have evolved from previous isolates?). This can be useful for comparing differences in virulence along or across the tree; more intriguingly, it can test whether there is a direction to virulence strength. This would open new avenues in the prediction of uncharacterized clinical bugs, multidrug resistance evolution, and pathogen emergence.

Keywords: genomics, pathogens, genome assembly, superbugs

Procedia PDF Downloads 183