Search results for: accurate forecast
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2799

1509 A Neural Network System for Predicting the Hardness of Titanium Aluminium Nitride (TiAlN) Coatings

Authors: Omar M. Elmabrouk

Abstract:

In the high-speed machining process, the cutting tool is consistently subjected to high localized stress at the tool tip, tip temperatures exceeding 800°C, and chip sliding along the rake face. These conditions affect tool wear, cutting tool performance, the quality of the produced parts, and tool life. Therefore, a thin-film coating on the cutting tool should be considered to improve the tool's surface properties while maintaining its bulk properties. One of the common coating processes for applying thin films for hard-coating purposes is PVD magnetron sputtering. In this paper, the effects of the PVD magnetron sputtering coating process parameters, sputter power in the range of 4.81-7.19 kW, bias voltage in the range of 50.00-300.00 V, and substrate temperature in the range of 281.08-600.00 °C, on coating hardness were predicted using an artificial neural network (ANN). The results were compared with previously published results obtained with a response surface methodology (RSM) model. The ANN was found to be more accurate in predicting tool hardness, which will not only extend tool life but also significantly enhance the efficiency of the machining process.
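As a rough illustration of the ANN approach described above, the sketch below trains a minimal one-hidden-layer network in pure Python on (sputter power, bias voltage, substrate temperature) → hardness pairs. The input ranges come from the abstract, but the training targets, network size, and learning rate are invented for demonstration only and are not the authors' data.

```python
import math, random

random.seed(1)

# Parameter ranges from the abstract, used for min-max scaling of the inputs.
LO = (4.81, 50.0, 281.08)   # sputter power (kW), bias voltage (V), substrate temp (degC)
HI = (7.19, 300.0, 600.0)

def scale(x):
    return [(v - a) / (b - a) for v, a, b in zip(x, LO, HI)]

# Synthetic (inputs -> hardness) pairs; hardness is pre-scaled to [0, 1].
data = [((4.81, 50.0, 281.1), 0.45), ((7.19, 300.0, 600.0), 0.90),
        ((6.00, 175.0, 440.0), 0.70), ((5.40, 120.0, 350.0), 0.55),
        ((6.80, 250.0, 520.0), 0.85)]

H = 4  # hidden neurons
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x):
    h = [sigmoid(sum(w * v for w, v in zip(w1[j], x)) + b1[j]) for j in range(H)]
    return h, sum(w2[j] * h[j] for j in range(H)) + b2

def mse():
    return sum((forward(scale(x))[1] - y) ** 2 for x, y in data) / len(data)

loss0 = mse()
lr = 0.5
for _ in range(2000):               # plain stochastic gradient descent
    for x, y in data:
        xs = scale(x)
        h, yhat = forward(xs)
        err = yhat - y
        for j in range(H):
            grad_h = err * w2[j] * h[j] * (1.0 - h[j])
            for k in range(3):
                w1[j][k] -= lr * grad_h * xs[k]
            b1[j] -= lr * grad_h
            w2[j] -= lr * err * h[j]
        b2 -= lr * err

print(round(loss0, 4), round(mse(), 4))  # training error before and after
```

A real study would use more data, a held-out test set, and a comparison against the RSM fit; this sketch only shows the scaling-plus-backpropagation mechanics.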

Keywords: artificial neural network, hardness, prediction, titanium aluminium nitride coating

Procedia PDF Downloads 550
1508 Hyperspectral Image Classification Using Tree Search Algorithm

Authors: Shreya Pare, Parvin Akhter

Abstract:

Remote sensing image classification becomes a very challenging task owing to the high dimensionality of hyperspectral images. Pixel-wise classification methods fail to take into account the spatial structure of an image; therefore, to improve classification performance, spatial information can be integrated into the classification process. In this paper, a multilevel thresholding algorithm based on a modified fuzzy entropy (MFE) function is used to perform the segmentation of hyperspectral images. The fuzzy parameters of the MFE function have been optimized using a new meta-heuristic based on the tree-search algorithm. The segmented image is classified by a large distribution machine (LDM) classifier. Experimental results are shown on a hyperspectral image dataset. The experimental outputs indicate that the proposed technique (MFE-TSA-LDM) achieves much higher classification accuracy for hyperspectral images when compared to state-of-the-art classification techniques. The proposed algorithm provides accurate segmentation and classification maps, making it more suitable for the classification of images with large spatial structures.
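The multilevel thresholding step can be illustrated with a deliberately simplified stand-in: classical Kapur-style Shannon entropy (not the paper's modified fuzzy entropy) maximized by exhaustive search over two thresholds (in place of the tree-search meta-heuristic), applied to a toy grey-level band. All values below are synthetic.

```python
import math
from itertools import combinations

# Tiny synthetic grey-level "image" (values 0..7) standing in for one hyperspectral band.
img = [0, 0, 1, 1, 1, 2, 2, 3, 3, 4, 4, 4, 5, 5, 6, 6, 6, 7, 7, 7, 3, 4, 5, 2]
L = 8
hist = [img.count(g) / len(img) for g in range(L)]

def seg_entropy(lo, hi):
    """Shannon entropy of the class occupying grey levels [lo, hi)."""
    p = sum(hist[lo:hi])
    if p == 0:
        return 0.0
    return -sum(h / p * math.log(h / p) for h in hist[lo:hi] if h > 0)

# Two thresholds (t1, t2) split the range into classes [0,t1), [t1,t2), [t2,L);
# exhaustive search maximizes the sum of class entropies.
best = max(combinations(range(1, L), 2),
           key=lambda t: (seg_entropy(0, t[0])
                          + seg_entropy(t[0], t[1])
                          + seg_entropy(t[1], L)))
print(best)
```

For realistic bit depths and more thresholds the search space explodes, which is exactly why a meta-heuristic such as the tree-search algorithm is used in the paper instead of exhaustive enumeration.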

Keywords: classification, hyperspectral images, large distribution margin, modified fuzzy entropy function, multilevel thresholding, tree search algorithm

Procedia PDF Downloads 169
1507 Data Mining in Healthcare for Predictive Analytics

Authors: Ruzanna Muradyan

Abstract:

Medical data mining is a crucial field in contemporary healthcare that offers cutting-edge tactics with enormous potential to transform patient care. This abstract examines how sophisticated data mining techniques could transform the healthcare industry, with a special focus on how they might improve patient outcomes. Healthcare data repositories have dynamically evolved, producing a rich tapestry of diverse, multi-dimensional information that includes genetic profiles, lifestyle markers, electronic health records, and more. By utilizing data mining techniques inside this vast library, a variety of prospects for precision medicine, predictive analytics, and insight generation become visible. Predictive modeling for illness prediction, risk stratification, and therapy efficacy evaluations are important points of focus. Healthcare providers may use this abundance of data to tailor treatment plans, identify high-risk patient populations, and forecast disease trajectories by applying machine learning algorithms and predictive analytics. Better patient outcomes, more efficient use of resources, and early treatments are made possible by this proactive strategy. Furthermore, data mining techniques act as catalysts to reveal complex relationships between apparently unrelated data pieces, providing enhanced insights into the causes of disease, genetic susceptibilities, and environmental factors. Healthcare practitioners can get practical insights that guide disease prevention, customized patient counseling, and focused therapies by analyzing these associations. The abstract also explores the problems and ethical issues that come with using data mining techniques in the healthcare industry. In order to use these approaches properly, it is essential to strike a balance between data privacy, security concerns, and the interpretability of complex models.
Finally, this abstract demonstrates the revolutionary power of modern data mining methodologies in transforming the healthcare sector. Healthcare practitioners and researchers can uncover unique insights, enhance clinical decision-making, and ultimately elevate patient care to unprecedented levels of precision and efficacy by employing cutting-edge methodologies.

Keywords: data mining, healthcare, patient care, predictive analytics, precision medicine, electronic health records, machine learning, predictive modeling, disease prognosis, risk stratification, treatment efficacy, genetic profiles, precision health

Procedia PDF Downloads 54
1506 Estimation of Aquifer Properties Using Pumping Tests: Case Study of Pydibhimavaram Industrial Area, Srikakulam, India

Authors: G. Venkata Rao, P. Kalpana, R. Srinivasa Rao

Abstract:

Adequate and reliable estimates of aquifer parameters are of utmost importance for the proper management of vital groundwater resources. At present, the groundwater is polluted by industrial waste disposed of on the land, and the contaminants are transported through the aquifer from one area to another depending on the characteristics of the aquifer and of the contaminants. To model this contaminant transport, accurate estimation of the aquifer properties is needed. Conventionally, these properties are estimated through pumping tests carried out on water wells, since the occurrence and movement of groundwater in the aquifer are characteristically defined by the aquifer parameters. The pumping (aquifer) test is the standard technique for estimating the hydraulic properties of aquifer systems, viz. transmissivity (T), hydraulic conductivity (K), and storage coefficient (S), for which the graphical method is widely used. The study area for conducting the pumping tests is the Pydibhimavaram industrial area near the coastal belt of Srikakulam, Andhra Pradesh, India. The main objective of the present work is to estimate the aquifer properties for developing a contaminant transport model for the study area.

Keywords: aquifer, contaminant transport, hydraulic conductivity, industrial waste, pumping test

Procedia PDF Downloads 441
1505 Targeting and Developing the Remaining Pay in an Ageing Field: The Ovhor Field Experience

Authors: Christian Ihwiwhu, Nnamdi Obioha, Udeme John, Edward Bobade, Oghenerunor Bekibele, Adedeji Awujoola, Ibi-Ada Itotoi

Abstract:

Understanding the complexity in the distribution of hydrocarbon in a simple structure with flow baffles and connectivity issues is critical in targeting and developing the remaining pay in a mature asset. Subtle facies changes (heterogeneity) can have a drastic impact on the movement of reservoir fluids, and this can be crucial to identifying sweet spots in mature fields. This study aims to evaluate selected reservoirs in Ovhor Field, Niger Delta, Nigeria, with the objective of optimising production from the field by targeting undeveloped oil reserves and bypassed pay, and gaining an improved understanding of the selected reservoirs to increase the company’s reservoir limits. The task at the Ovhor field is complicated by poor stratigraphic seismic resolution over the field. 3-D geological (sedimentology and stratigraphy) interpretation, results from quantitative interpretation, and a proper understanding of production data have been used in recognizing flow baffles and undeveloped compartments in the field. The full-field 3-D model has been constructed so as to capture the heterogeneities and the various compartments in the field, to aid proper simulation of fluid flow for future production prediction, proper history matching, and the design of well trajectories that adequately target undeveloped oil. Reservoir property models (porosity, permeability, and net-to-gross) have been constructed by biasing log-interpreted properties to a defined environment-of-deposition model whose interpretation captures the heterogeneities expected in the studied reservoirs. At least two scenarios have been modelled for most of the studied reservoirs to capture the range of uncertainties involved. The total original oil in place for the four reservoirs studied is 157 MMstb.
The cumulative production from the selected reservoirs is 67.64 MMstb of oil and 9.76 Bscf of gas, with a current production rate of about 7035 bopd and 4.38 MMscf/d (as at 31/08/2019). Dynamic simulation and production forecasting on the four reservoirs gave an undeveloped reserve of about 3.82 MMstb from two identified oil restoration activities: side-tracking and re-perforation of existing wells. This integrated approach led to the identification of bypassed oil in some areas of the selected reservoirs and an improved understanding of the studied reservoirs. New wells have been, and are being, drilled to test the results of our studies, and the outcomes so far are confirmatory and satisfying.

Keywords: facies, flow baffle, bypassed pay, heterogeneities, history matching, reservoir limit

Procedia PDF Downloads 124
1504 Application of Liquid Chromatographic Method for the in vitro Determination of Gastric and Intestinal Stability of Pure Andrographolide in the Extract of Andrographis paniculata

Authors: Vijay R. Patil, Sathiyanarayanan Lohidasan, K. R. Mahadik

Abstract:

Gastrointestinal stability of andrographolide was evaluated in vitro in simulated gastric (SGF) and intestinal (SIF) fluids using a validated HPLC-PDA method. The method was validated using a 5 μm Thermo Hypersil GOLD C18 column (250 mm × 4.0 mm) and a mobile phase of water:acetonitrile (70:30, v/v) delivered isocratically at a flow rate of 1 mL/min with UV detection at 228 nm. Andrographolide, in pure form and in the extract of Andrographis paniculata, was incubated at 37°C in an incubator shaker in USP simulated gastric and intestinal fluids with and without enzymes. A systematic protocol as per FDA guidance was followed for the stability study, and samples were assayed at 0, 15, 30, and 60 min for the gastric study and at 0, 15, 30, and 60 min and 1, 2, and 3 h for the intestinal stability study. The stability study was also extended to 24 h to observe the degradation pattern in SGF and SIF (with and without enzyme). The developed method was found to be accurate, precise, and robust. Andrographolide was found to be stable in SGF (pH ∼ 1.2) for 1 h and in SIF (pH 6.8) for up to 3 h. The relative difference (RD) between the amount of drug added and the amount found at all time points was < 3%. The present study suggests that drug loss in the gastrointestinal tract may take place by membrane permeation rather than by a degradation process.

Keywords: andrographolide, Andrographis paniculata, in vitro stability, gastric, intestinal, HPLC-PDA

Procedia PDF Downloads 242
1503 Calibration of Contact Model Parameters and Analysis of Microscopic Behaviors of Cuxhaven Sand Using The Discrete Element Method

Authors: Anjali Uday, Yuting Wang, Andres Alfonso Pena Olare

Abstract:

The Discrete Element Method is a promising approach to modeling the microscopic behavior of granular materials. The quality of the simulations, however, depends on the model parameters utilized. The present study focuses on calibration and validation of the discrete element parameters for Cuxhaven sand based on experimental data from triaxial and oedometer tests. A sensitivity analysis was conducted during the sample preparation stage and the shear stage of the triaxial tests. The influence of parameters such as rolling resistance, inter-particle friction coefficient, confining pressure, and effective modulus on the void ratio of the generated sample was investigated. During the shear stage, the effect of parameters such as inter-particle friction coefficient, effective modulus, rolling resistance friction coefficient, and normal-to-shear stiffness ratio was examined. The parameters were calibrated such that the simulations reproduce macro-mechanical characteristics such as dilation angle, peak stress, and stiffness. The calibrated parameters were then validated by simulating an oedometer test on the sand. The oedometer test results are in good agreement with the experiments, which demonstrates the suitability of the calibrated parameters. In the next step, the calibrated and validated model parameters were applied to forecast micromechanical behavior, including the evolution of contact force chains, buckling of columns of particles, non-coaxiality, and sample inhomogeneity during a simple shear test. The evolution of contact force chains vividly shows the distribution and alignment of strong contact forces. The changes in coordination number are in good agreement with the volumetric strain exhibited during the simple shear test. The vertical inhomogeneity of void ratios is documented throughout the shearing phase, showing looser structures in the top and bottom layers.
Buckling of columns is not observed due to the small rolling resistance coefficient adopted for simulations. The non-coaxiality of principal stress and strain rate is also well captured. Thus the micromechanical behaviors are well described using the calibrated and validated material parameters.

Keywords: discrete element model, parameter calibration, triaxial test, oedometer test, simple shear test

Procedia PDF Downloads 119
1502 Assessing Vertical Distribution of Soil Organic Carbon Stocks in Westleigh Soil under Shrub Encroached Rangeland, Limpopo Province, South Africa

Authors: Abel L. Masotla, Phesheya E. Dlamini, Vusumuzi E. Mbanjwa

Abstract:

Accurate quantification of the vertical distribution of soil organic carbon (SOC) in relation to land cover transformations associated with shrub encroachment is crucial because deeper-lying horizons have been shown to have a greater capacity to sequester SOC. Despite this, in-depth soil carbon dynamics remain poorly understood, especially in arid and semi-arid rangelands. The objective of this study was to quantify and compare the vertical distribution of soil organic carbon stocks (SOCs) in shrub-encroached and open grassland sites. To achieve this, soil samples were collected at 10 cm depth intervals under both sites. The results showed that SOC was on average 19% and 13% greater in the topsoil and subsoil, respectively, under shrub-encroached grassland compared to open grassland. In both topsoil and subsoil, greater SOC stocks were found under shrub-encroached grassland (4.53 kg m⁻² and 3.90 kg m⁻²) relative to open grassland (4.39 kg m⁻² and 3.67 kg m⁻²). These results demonstrate that deeper soil horizons play a critical role in the storage of SOC in savanna grasslands.
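SOC stocks of the kind reported above (kg m⁻²) are typically computed per depth increment as concentration × bulk density × layer thickness and then summed over the profile. A minimal sketch with assumed layer values (not the study's measurements):

```python
# Hypothetical profile for one soil core, top to bottom:
# (SOC concentration in %, bulk density in g/cm^3) for each 10 cm layer.
layers = [(1.20, 1.35), (0.90, 1.40), (0.70, 1.45), (0.50, 1.50)]

THICKNESS = 0.10  # layer thickness, m

def layer_stock(conc_pct, bd_g_cm3):
    """SOC stock of one layer, kg m^-2: fraction x bulk density x thickness."""
    bd_kg_m3 = bd_g_cm3 * 1000.0        # g/cm^3 -> kg/m^3
    return conc_pct / 100.0 * bd_kg_m3 * THICKNESS

profile_stock = sum(layer_stock(c, bd) for c, bd in layers)
print(profile_stock)   # total stock over 0-40 cm, kg m^-2
```

Comparing such profile sums between shrub-encroached and open sites, layer by layer, is what yields the depth-resolved contrasts reported in the abstract; coarse-fragment and equivalent-soil-mass corrections are often applied in practice but omitted here for brevity.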

Keywords: savanna grasslands, shrub-encroachment, soil organic carbon, vertical distribution

Procedia PDF Downloads 132
1501 Ethical Enhancement Strategies for Development of Mass Media Profession Conducted for the Ethical Promotion of Undergraduate Students in Communication Science

Authors: Supranee Wattanasin

Abstract:

This was a qualitative documentary research study using in-depth interviews with experts in the field who have both the knowledge and experience to inform a strategic plan to enhance students’ ethics. The findings revealed five areas that require attention: honesty, factual accuracy, human rights, speed, and responsibility. The development of the strategic plan to enhance ethics for students majoring in communication arts can be concluded as follows. First, the government, private, and religious sectors need to come together and set up activities to promote ethical standards in schools, universities, and organizations. Second, it is important to cultivate the understanding that ethics is important to professional work, especially in mass communication and media. Third, the Philosophy of Sufficiency Economy should be explained to students in order for them to have some immunity to negative behaviors such as drinking alcohol, gambling, cutting classes, and cheating on exams. Fourth, experts in the field of ethics should be engaged to provide more knowledge to students and to allow students to participate in activities that will increase their experience and knowledge of real-world problems.

Keywords: communication arts, ethics, mass communication, media, strategy

Procedia PDF Downloads 332
1500 The Use of SD Bioline TB AgMPT64® Detection Assay for Rapid Characterization of Mycobacteria in Nigeria

Authors: S. Ibrahim, U. B. Abubakar, S. Danbirni, A. Usman, F. M. Ballah, C. A. Kudi, L. Lawson, G. H. Abdulrazak, I. A. Abdulkadir

Abstract:

Performing culture and characterization of mycobacteria in low-resource settings like Nigeria is a very difficult task to undertake because very few laboratories carry out such tests; this is largely due to the stringent and laborious nature of the tests. Hence, a rapid, simple, and accurate test for characterization is needed. The SD BIOLINE TB Ag MPT64 Rapid® is a simple and rapid immunochromatographic test used to differentiate members of the Mycobacterium tuberculosis complex (MTBC) from non-tuberculous mycobacteria (NTM). A total of 100 sputa obtained from patients suspected of being infected with tuberculosis who presented themselves to hospitals for check-up and treatment were included in the study. The samples were cultured in a class III biosafety cabinet, and level III biosafety practices were followed. Forty isolates were obtained from the cultured sputa, and they were identified as acid-fast bacilli (AFB) using the Ziehl-Neelsen acid-fast stain. All the isolates (AFB positive) were then subjected to the SD BIOLINE assay. A total of 31 (77.5%) were characterized as MTBC, while nine (22.5%) were NTM. The total turnaround time for the rapid assay was just 30 minutes, compared to the days required by phenotypic and genotypic methods. The assay is a simple, rapid, and reliable test to differentiate MTBC from NTM.

Keywords: culture, mycobacteria, non tuberculous mycobacterium, SD Bioline

Procedia PDF Downloads 337
1499 Deep Learning and Accurate Performance Measure Processes for Cyber Attack Detection among Web Logs

Authors: Noureddine Mohtaram, Jeremy Patrix, Jerome Verny

Abstract:

As an enormous number of online services have been developed into web applications, security problems based on web applications are becoming more serious. Most intrusion detection systems rely on each request to find the cyber-attack rather than on user behavior, and these systems can only protect web applications against known vulnerabilities rather than zero-day attacks. In order to detect new attacks, we analyze the HTTP traffic of web servers to divide requests into two categories: normal traffic and malicious attacks. On the other hand, the quality of the results obtained by deep learning (DL) in various areas of big data has provided an important motivation to apply it to cybersecurity. Deep learning for attack detection in cybersecurity has the potential to be a robust tool, generalizing from small transformations to new attacks, due to its capability to extract higher-level features. This research takes a new approach, applying deep learning to cybersecurity, to classify these two categories in order to eliminate attacks and protect the web servers of the defense sector, which encounters different web traffic compared to other sectors (such as e-commerce and web apps). The results show that with this machine learning approach, a higher accuracy rate and a lower false alarm rate can be achieved.
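As a deliberately tiny stand-in for the pipeline described above, the sketch below classifies toy HTTP request paths as normal or malicious using hand-crafted features and a single logistic unit trained by gradient descent; a real system of the kind the paper describes would instead learn features from raw logs with a deep network. Paths, features, and labels are invented for illustration.

```python
import math

# Toy request paths labelled benign (0) / malicious (1); illustrative only.
logs = [
    ("/index.html", 0), ("/about", 0), ("/products?id=7", 0),
    ("/search?q=shoes", 0), ("/img/logo.png", 0),
    ("/login.php?user=admin'--", 1), ("/item?id=1 UNION SELECT", 1),
    ("/view?f=../../etc/passwd", 1), ("/q?x=<script>alert(1)</script>", 1),
    ("/cgi-bin/;cat /etc/passwd", 1),
]

SUSPECT = ["'", "<", ">", "..", ";", " UNION ", "--"]

def features(path):
    # Three crude features: length, count of suspicious tokens, path depth.
    return [len(path) / 40.0,
            float(sum(path.count(s) for s in SUSPECT)),
            path.count("/") / 5.0]

# Single logistic unit trained with stochastic gradient descent.
w = [0.0, 0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(500):
    for path, y in logs:
        x = features(path)
        p = 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
        g = p - y                       # gradient of cross-entropy loss
        w = [wi - lr * g * xi for wi, xi in zip(w, x)]
        b -= lr * g

def predict(path):
    x = features(path)
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z)) > 0.5

acc = sum(predict(p) == bool(y) for p, y in logs) / len(logs)
print(acc)   # training accuracy on the toy set
```

The hand-written `SUSPECT` list is exactly the brittle, signature-like knowledge that a deep model is meant to replace by learning representations directly from the raw request strings.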

Keywords: anomaly detection, HTTP protocol, logs, cyber attack, deep learning

Procedia PDF Downloads 205
1498 Experimental and Numerical Analysis of the Effects of Ball-End Milling Process upon Residual Stresses and Cutting Forces

Authors: Belkacem Chebil Sonia, Bensalem Wacef

Abstract:

Most ball-end milling models include only the influence of cutting parameters (cutting speed, feed rate, depth of cut), and this influence is studied in most works only with respect to cutting forces. This study therefore proposes an accurate model of the ball-end milling process that also includes the influence of tool-workpiece inclination. In addition, a characterization of the residual stresses resulting from thermo-mechanical loading in the workpiece is presented, and the influence of tool-workpiece inclination and cutting parameters on the residual stress distribution is studied. In order to predetermine the cutting forces and residual stresses during a milling operation, a thermo-mechanical three-dimensional numerical model of ball-end milling was developed. Furthermore, an experimental campaign of ball-end milling tests was carried out on a 5-axis machining center to determine the cutting forces and characterize the residual stresses. The simulation results are compared with the experiments to validate the finite element model and subsequently identify the optimum inclination angle and cutting parameters.

Keywords: ball end milling, cutting forces, cutting parameters, residual stress, tool-workpiece inclination

Procedia PDF Downloads 304
1497 On Estimating the Low Income Proportion with Several Auxiliary Variables

Authors: Juan F. Muñoz-Rosas, Rosa M. García-Fernández, Encarnación Álvarez-Verdejo, Pablo J. Moya-Fernández

Abstract:

Poverty measurement is an important topic in many studies in the social sciences. One of the most important indicators when measuring poverty is the low income proportion, which gives the proportion of a population classified as poor. This indicator is generally unknown, and for this reason it is estimated using survey data obtained from official surveys carried out by statistical agencies such as Eurostat. The main feature of such survey data is that they contain several variables. The variable used to estimate the low income proportion is called the variable of interest. The survey data may also contain additional variables, known as auxiliary variables, related to the variable of interest; when this is the case, they can be used to improve the estimation of the low income proportion. In this paper, we use Monte Carlo simulation studies to analyze numerically the performance of estimators based on several auxiliary variables. In this simulation study, we considered real data sets obtained from the 2011 European Union Statistics on Income and Living Conditions. Results derived from this study indicate that the estimators based on auxiliary variables are more accurate than the naive estimator.
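The gain from auxiliary information can be sketched with a regression-type estimator of the low income proportion: the naive sample proportion is adjusted using the known population mean of a single auxiliary variable correlated with income. The population, poverty line (60% of the median income), sample size, and number of replications below are all synthetic; the paper itself uses several auxiliary variables and real EU-SILC data.

```python
import math, random, statistics as st

random.seed(42)

# Synthetic population: auxiliary variable x (e.g. a register-based income proxy)
# and a log-normal survey income y correlated with it.
N = 20000
pop = [(x, 1000.0 * math.exp(0.6 * x + random.gauss(0.0, 0.5)))
       for x in (random.gauss(0.0, 1.0) for _ in range(N))]

line = 0.6 * st.median(y for _, y in pop)            # at-risk-of-poverty line
z = [(x, 1.0 if y < line else 0.0) for x, y in pop]  # poverty indicator
true_p = sum(p for _, p in z) / N
Xbar = sum(x for x, _ in z) / N                      # known population mean of x

def naive(s):
    return sum(p for _, p in s) / len(s)

def regression(s):
    # Regression estimator: adjust the sample proportion using (Xbar - x-bar).
    n = len(s)
    mx = sum(x for x, _ in s) / n
    mp = sum(p for _, p in s) / n
    beta = (sum((x - mx) * (p - mp) for x, p in s)
            / sum((x - mx) ** 2 for x, _ in s))
    return mp + beta * (Xbar - mx)

# Paired Monte Carlo comparison of the two estimators.
R, n = 300, 400
se_n = se_r = 0.0
for _ in range(R):
    s = random.sample(z, n)
    se_n += (naive(s) - true_p) ** 2
    se_r += (regression(s) - true_p) ** 2
print(se_n / R, se_r / R)   # Monte Carlo MSE of naive vs regression estimator
```

The variance reduction grows with the squared correlation between the poverty indicator and the auxiliary variable, which is why well-chosen auxiliary variables matter.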

Keywords: inclusion probability, poverty, poverty line, survey sampling

Procedia PDF Downloads 452
1496 Textile Based Physical Wearable Sensors for Healthcare Monitoring in Medical and Protective Garments

Authors: Sejuti Malakar

Abstract:

Textile sensors have gained a lot of interest in recent years as they are instrumental in monitoring physiological and environmental changes for better diagnosis, which can be useful in various fields such as medical textiles, sports textiles, protective textiles, agro-textiles, and geo-textiles. Moreover, with the development of flexible textile-based wearable sensors, the functionality of smart clothing is augmented for an improved user experience when it comes to technical textiles. In this context, conductive textiles using new composites and nanomaterials are being developed while considering their compatibility with textile manufacturing processes. This review aims to provide a comprehensive and detailed overview of contemporary advancements in textile-based wearable physical sensors used in the fields of medicine, security, surveillance, and protection, from a global perspective. The methodology used is to analyse various examples of the integration of wearable textile-based sensors with clothing for daily use, keeping in mind the technological advances in the field. By comparing various case studies, we identify various challenges of textile sensors in terms of stability, comfort of movement, and reliable sensing components that enable accurate measurements, in spite of progress in the engineering of wearables. Addressing such concerns is critical for the future success of wearable sensors.

Keywords: flexible textile-based wearable sensors, contemporary advancements, conductive textiles, body conformal design

Procedia PDF Downloads 175
1495 Unlocking Academic Success: A Comprehensive Exploration of Shaguf Bites’s Impact on Learning and Retention

Authors: Joud Zagzoog, Amira Aldabbagh, Radiyah Hamidaddin

Abstract:

This research aims to test and observe whether artificial intelligence (AI) software and applications can actually be effective, useful, and time-saving for those who use them. Shaguf Bites, a web application that uses AI technology, claims to help students study and memorize information more effectively in less time. The website uses smart learning, or AI-powered bite-sized repetitive learning, by transforming documents or PDFs into summarized interactive smart flashcards (Bites, n.d.). To properly test the website’s effectiveness, both qualitative and quantitative methods were used in this research. An experiment was conducted with a number of students, who were first asked to use Shaguf Bites without any prior knowledge or explanation of how to use it. Second, they were asked for feedback through a survey on their experience after using it and whether it was helpful, efficient, time-saving, and easy to use for studying. After reviewing the collected data, we found that the majority of students considered the website straightforward and easy to use: 58% of the respondents agreed that the website accurately formulated the flashcard questions, and 53% reported that they are likely to use the website again in the future and to recommend it to others. Overall, the results indicate that Shaguf Bites has proved to be beneficial, accurate, and time-saving for the majority of the students.

Keywords: artificial intelligence (AI), education, memorization, spaced repetition, flashcards

Procedia PDF Downloads 174
1494 Second Order Statistics of Dynamic Response of Structures Using Gamma Distributed Damping Parameters

Authors: Badreddine Chemali, Boualem Tiliouine

Abstract:

This article presents the main results of a numerical investigation of the uncertainty of the dynamic response of structures with statistically correlated, Gamma-distributed random damping. A computational method based on a Linear Statistical Model (LSM) is implemented to predict second order statistics of the response of a typical industrial building structure. The significance of random damping with correlated parameters and its implications for the sensitivity of structural peak response in the neighborhood of a resonant frequency are discussed in light of considerable ranges of damping uncertainties and correlation coefficients. The results are compared to those generated using Monte Carlo simulation techniques. The numerical results obtained show the importance of damping uncertainty and of the statistical correlation of damping coefficients when obtaining accurate probabilistic estimates of the dynamic response of structures. Furthermore, the effectiveness of the LSM in efficiently predicting uncertainty propagation for structural dynamic problems with correlated damping parameters is demonstrated.
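The effect of random damping on peak response near resonance can be sketched with a plain Monte Carlo experiment on a single-degree-of-freedom oscillator: damping ratios are drawn from a Gamma distribution (the mean and spread below are assumed, not the paper's values), and the steady-state dynamic amplification factor is averaged. By Jensen's inequality the mean amplification exceeds the amplification computed at the mean damping, which is one reason damping uncertainty matters near resonance.

```python
import math, random, statistics as st

random.seed(7)

# Damping ratio zeta ~ Gamma(k, theta): mean = k*theta = 0.05, std = sqrt(k)*theta = 0.02.
k, theta = 6.25, 0.008

def amplification(zeta, beta=1.0):
    """Steady-state dynamic amplification of an SDOF oscillator at frequency ratio beta."""
    return 1.0 / math.sqrt((1.0 - beta ** 2) ** 2 + (2.0 * zeta * beta) ** 2)

# Monte Carlo sampling of the amplification at resonance (beta = 1).
samples = [amplification(random.gammavariate(k, theta)) for _ in range(20000)]
mean_amp = st.mean(samples)
std_amp = st.stdev(samples)
det_amp = amplification(0.05)   # deterministic response at the mean damping

print(round(det_amp, 2), round(mean_amp, 2), round(std_amp, 2))
```

The LSM of the paper aims to reproduce such second order statistics (mean and standard deviation of the response) without the cost of full Monte Carlo sampling, including the case where several damping coefficients are mutually correlated.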

Keywords: correlated random damping, linear statistical model, Monte Carlo simulation, uncertainty of dynamic response

Procedia PDF Downloads 276
1493 Use of Multistage Transition Regression Models for Credit Card Income Prediction

Authors: Denys Osipenko, Jonathan Crook

Abstract:

Because of the variety of card holders’ behaviour types and income sources, each consumer account can move through a variety of states. An account can be inactive, a transactor, a revolver, delinquent, or defaulted, and each state requires an individual model for income prediction. Estimating transition probabilities between states at the account level helps to avoid the memorylessness of the Markov chain approach. This paper investigates approaches to estimating transition probabilities for credit card income prediction at the account level. The key question of the empirical research is which approach gives more accurate results: multinomial logistic regression, or multistage conditional logistic regression with a binary target. Both models have shown moderate predictive power. Prediction accuracy for the conditional approach depends on the order of stages of the conditional binary logistic regressions. On the other hand, multinomial logistic regression is easier to use and gives integrated estimates for all states without prioritization. Thus, further investigations can be concentrated on alternative modeling approaches such as discrete choice models.
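The memoryless baseline that account-level regression models aim to improve on can be illustrated by the empirical transition matrix of account states, estimated by counting month-to-month transitions and row-normalizing. The state histories below are invented for illustration only.

```python
from collections import Counter

# Toy monthly state histories for a few card accounts:
# 'I' = inactive, 'T' = transactor, 'R' = revolver, 'D' = delinquent, 'X' = defaulted.
histories = [
    "ITTTRRRD",
    "TTRRRRDX",
    "IIITTTTT",
    "RRRDDRRR",
    "TTTTRRDD",
]

states = "ITRDX"

# Count month-to-month transitions across all accounts.
counts = Counter()
for h in histories:
    for a, b in zip(h, h[1:]):
        counts[(a, b)] += 1

# Row-normalized empirical transition matrix (the memoryless Markov baseline).
P = {}
for a in states:
    row_total = sum(counts[(a, b)] for b in states)
    if row_total:
        P[a] = {b: counts[(a, b)] / row_total for b in states}

for a, row in P.items():
    print(a, {b: round(p, 2) for b, p in row.items()})
```

The paper's point is that these probabilities should not be constant: conditioning them on account-level covariates via multinomial or staged binary logistic regressions lets the "matrix" vary per account and per month, avoiding the memorylessness of this pooled estimate.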

Keywords: multinomial regression, conditional logistic regression, credit account state, transition probability

Procedia PDF Downloads 479
1492 Simulation of Technological, Energy and GHG Comparison between a Conventional Diesel Bus and E-bus: Feasibility to Promote E-bus Change in High Lands Cities

Authors: Riofrio Jonathan, Fernandez Guillermo

Abstract:

Renewable energy represented around 80% of Ecuador's power generation matrix in 2020, so current public policy focuses on taking advantage of the high share of renewable sources to carry out several electrification projects. These projects are part of the portfolio sent to the United Nations Framework Convention on Climate Change (UNFCCC) as a commitment to reduce greenhouse gas (GHG) emissions under the established nationally determined contribution (NDC). In this sense, the Ecuadorian Organic Energy Efficiency Law (LOEE), published in 2019, promotes e-mobility as one of its main milestones; in fact, it states that new vehicles for urban and interurban use must be e-buses from 2025 onward. For this technological change to be implemented successfully in the national context, it is important to carry out technical and geographical surveys so as to maintain the quality of service in both the electricity and transport sectors. This research therefore presents a technological and energy comparison between a conventional diesel bus and its equivalent e-bus. Both vehicles fulfill all the technical requirements to operate in the case-study city of Ambato, in the province of Tungurahua, Ecuador. In addition, the analysis includes the development of a model for the energy estimation of both technologies applied to a highland city such as Ambato: the altimetry of the most important bus routes in the city varies from 2557 m.a.s.l. at the lowest point to 3200 m.a.s.l. at the highest. These operating conditions lend a degree of novelty to this paper. The technical specifications of the diesel buses follow the common features of buses registered in Ambato, while the specifications for the e-buses come from the most common units introduced in Latin America, because there is not yet enough evidence from similar cities.
The achieved results will be useful input data for decision-makers, since electricity demand forecasts, energy savings, costs, and greenhouse gas emissions are computed. Indeed, the GHG figures matter because they support reporting under the transparency framework that is part of the Paris Agreement. Finally, the presented results correspond to stage I of the project “Analysis and Prospective of Electromobility in Ecuador and Energy Mix towards 2030” supported by Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ).
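As a rough illustration of the kind of route-level energy estimation such a comparison relies on, the sketch below evaluates a simplified longitudinal vehicle model over a climbing and a descending leg at Ambato-like altitudes. All parameter values (mass, drag, rolling resistance, efficiencies) are illustrative assumptions, not the study's calibrated inputs.

```python
# Simplified longitudinal energy model for a bus on a route with altimetry.
# All parameters below are illustrative assumptions, not the paper's values.

G = 9.81          # gravity (m/s^2)
RHO = 0.85        # air density at ~2800 m a.s.l. (kg/m^3), lower than at sea level
MASS = 14000.0    # loaded bus mass (kg)
CRR = 0.008       # rolling resistance coefficient
CD_A = 6.0        # drag coefficient * frontal area (m^2)
ETA = 0.85        # drivetrain efficiency when drawing energy
ETA_REGEN = 0.6   # fraction of braking energy recovered (E-bus only)

def segment_energy_kwh(dist_m, dz_m, speed_ms, regen=False):
    """Traction energy for one route segment; negative grades may regenerate."""
    f_roll = CRR * MASS * G                       # rolling resistance force (N)
    f_aero = 0.5 * RHO * CD_A * speed_ms ** 2     # aerodynamic drag force (N)
    f_grade = MASS * G * dz_m / dist_m            # grade force (N)
    work_j = (f_roll + f_aero + f_grade) * dist_m
    if work_j >= 0:
        return work_j / ETA / 3.6e6               # J -> kWh, drivetrain losses
    return (work_j * ETA_REGEN / 3.6e6) if regen else 0.0

# A leg climbing from 2557 to 3200 m over 12 km at 30 km/h (cf. Ambato's altimetry)
climb = segment_energy_kwh(12000, 3200 - 2557, 30 / 3.6, regen=True)
descent = segment_energy_kwh(12000, 2557 - 3200, 30 / 3.6, regen=True)
print(f"climb: {climb:.1f} kWh, descent: {descent:.1f} kWh (negative = recovered)")
```

Such a per-segment model, applied over a full route profile, is what turns altimetry data into the demand forecasts and energy savings the abstract mentions.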

Keywords: high altitude cities, energy planning, NDC, e-buses, e-mobility

Procedia PDF Downloads 145
1491 A Theorem Related to Sample Moments and Two Types of Moment-Based Density Estimates

Authors: Serge B. Provost

Abstract:

Numerous statistical inference and modeling methodologies are based on sample moments rather than the actual observations. A result justifying the validity of this approach is introduced. More specifically, it will be established that, given the first n moments of a sample of size n, one can recover the original n sample points. This implies that a sample of size n and its first n associated moments contain precisely the same amount of information. However, it is efficient to make use of a limited number of initial moments, as most of the relevant distributional information is included in them. Two types of density estimation techniques that rely on such moments will be discussed. The first expresses a density estimate as the product of a suitable base density and a polynomial adjustment whose coefficients are determined by equating the moments of the density estimate to the sample moments. The second assumes that the derivative of the logarithm of a density function can be represented as a rational function; this gives rise to a system of linear equations involving sample moments, and the density estimate is then obtained by solving a differential equation. Unlike kernel density estimation, these methodologies are ideally suited to modeling ‘big data’ as they only require a limited number of moments, irrespective of the sample size. What is more, they produce simple closed-form expressions that are amenable to algebraic manipulations. They also turn out to be more accurate, as will be shown in several illustrative examples.
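The sample-recovery result can be illustrated numerically: the first n moments give the power sums, Newton's identities convert these into the elementary symmetric polynomials, and the sample points are the roots of the corresponding monic polynomial. A minimal sketch, not the author's implementation:

```python
import numpy as np

def recover_sample(power_means, n):
    """Recover the n sample points from their first n moments m_k = (1/n) * sum(x_i^k).

    Newton's identities convert the power sums p_k = n * m_k into the
    elementary symmetric polynomials e_k; the sample points are the roots
    of the monic polynomial whose coefficients are the (signed) e_k.
    """
    p = [n * m for m in power_means]                 # power sums p_1..p_n
    e = [1.0]                                        # e_0 = 1
    for k in range(1, n + 1):
        s = sum((-1) ** (i - 1) * e[k - i] * p[i - 1] for i in range(1, k + 1))
        e.append(s / k)                              # k*e_k = sum (-1)^(i-1) e_{k-i} p_i
    # prod(x - x_i) = x^n - e1*x^(n-1) + e2*x^(n-2) - ...
    coeffs = [(-1) ** k * e[k] for k in range(n + 1)]
    return np.sort(np.roots(coeffs).real)

sample = np.array([1.0, 2.0, 4.0, 7.0])
moments = [np.mean(sample ** k) for k in range(1, 5)]
print(recover_sample(moments, 4))   # ≈ [1. 2. 4. 7.]
```

The round trip confirms that the n sample points and their first n moments carry the same information.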

Keywords: density estimation, log-density, polynomial adjustments, sample moments

Procedia PDF Downloads 161
1490 Using Bidirectional Encoder Representations from Transformers to Extract Topic-Independent Sentiment Features for Social Media Bot Detection

Authors: Maryam Heidari, James H. Jones Jr.

Abstract:

Millions of online posts about different topics and products are shared on popular social media platforms. One use of this content is to provide crowd-sourced information about a specific topic, event or product. However, this use raises an important question: what percentage of the information available through these services is trustworthy? In particular, might some of it be generated by a machine, i.e., a bot, instead of a human? Bots can be, and often are, purposely designed to generate enough volume to skew an apparent trend or position on a topic, yet the consumer of such content cannot easily distinguish a bot post from a human post. In this paper, we introduce a model for social media bot detection which uses Bidirectional Encoder Representations from Transformers (Google BERT) for sentiment classification of tweets to identify topic-independent features. Our use of a natural language processing approach to derive topic-independent features for our new bot detection model distinguishes this work from previous bot detection models. We achieve 94% accuracy classifying content as generated by a bot or a human, where the most accurate prior work achieved an accuracy of 92%.

Keywords: bot detection, natural language processing, neural network, social media

Procedia PDF Downloads 111
1489 Multiple Linear Regression for Rapid Estimation of Subsurface Resistivity from Apparent Resistivity Measurements

Authors: Sabiu Bala Muhammad, Rosli Saad

Abstract:

Multiple linear regression (MLR) models for fast estimation of true subsurface resistivity from apparent resistivity field measurements are developed and assessed in this study. The parameters investigated were apparent resistivity (ρₐ), horizontal location (X) and depth (Z) of measurement as the independent variables, and true resistivity (ρₜ) as the dependent variable. To achieve linearity in both resistivity variables, the datasets were first transformed into the logarithmic domain, following diagnostic checks for normality of the dependent variable and for heteroscedasticity, to ensure accurate models. Four MLR models were developed based on hierarchical combinations of the independent variables. The generated MLR coefficients were applied to another dataset to estimate ρₜ values for validation. Contours of the estimated ρₜ values were plotted and compared to plots of the observed data, using the same colour scale and blanking, for visual assessment. The accuracy of the models was assessed using the coefficient of determination (R²), the standard error (SE) and the weighted mean absolute percentage error (wMAPE). It is concluded that the MLR models can estimate ρₜ with a high level of accuracy.
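A minimal sketch of the regression step, fitting log-transformed true resistivity on (log ρₐ, X, Z) by ordinary least squares; the data are synthetic stand-ins for the field measurements, and all coefficients are illustrative:

```python
import numpy as np

# Hierarchical MLR in log-resistivity space: log10(rho_t) regressed on
# log10(rho_a), X and Z. Synthetic data, not the study's measurements.
rng = np.random.default_rng(0)
n = 200
rho_a = 10 ** rng.uniform(1, 3, n)          # apparent resistivity (ohm-m)
X = rng.uniform(0, 100, n)                  # horizontal location (m)
Z = rng.uniform(1, 20, n)                   # depth (m)
log_rho_t = (0.2 + 1.1 * np.log10(rho_a) + 0.002 * X - 0.01 * Z
             + rng.normal(0, 0.05, n))      # "true" resistivity, log domain

A = np.column_stack([np.ones(n), np.log10(rho_a), X, Z])   # design matrix
beta, *_ = np.linalg.lstsq(A, log_rho_t, rcond=None)       # OLS fit

pred = A @ beta
ss_res = np.sum((log_rho_t - pred) ** 2)
ss_tot = np.sum((log_rho_t - log_rho_t.mean()) ** 2)
r2 = 1 - ss_res / ss_tot                                   # coefficient of determination
print("coefficients:", np.round(beta, 3), " R^2:", round(r2, 3))
```

Estimated ρₜ values would then be recovered as 10 raised to the predicted log values before contouring.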

Keywords: apparent resistivity, depth, horizontal location, multiple linear regression, true resistivity

Procedia PDF Downloads 270
1488 Evaluating Performance of Value at Risk Models for the MENA Islamic Stock Market Portfolios

Authors: Abderrazek Ben Maatoug, Ibrahim Fatnassi, Wassim Ben Ayed

Abstract:

In this paper, we investigate the issue of market risk quantification for Middle East and North Africa (MENA) Islamic equity markets. We use Value-at-Risk (VaR) as a measure of potential risk in Islamic stock markets, for long and short positions, based on the RiskMetrics model and conditional parametric ARCH-class volatility models with normal, Student and skewed Student distributions. The sample consists of daily data for 2006-2014 on 11 Islamic stock market indices. We conduct the Kupiec test and the Engle and Manganelli test to evaluate the performance of each model. Our main empirical findings show (i) the superior performance of VaR models based on the Student and skewed Student distributions, at the α=1% significance level, for all Islamic stock market indices and for both long and short trading positions; and (ii) that the RiskMetrics model and the VaR model based on conditional volatility with a normal distribution provide the most accurate VaR estimates for both long and short trading positions at the α=5% significance level.
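For illustration, a simplified unconditional parametric VaR under Student and normal distributions can be sketched as below. The paper uses RiskMetrics and conditional ARCH-class volatility models, so this is only a stand-in showing why fat-tailed distributions yield larger VaR at the 1% level; the return series is simulated, not MENA index data:

```python
import numpy as np
from scipy import stats

def parametric_var(returns, alpha=0.01, dist="t"):
    """One-day parametric VaR for long (left tail) and short (right tail) positions.

    Illustrative sketch: fits an unconditional Student-t or normal law to the
    return series, unlike the paper's conditional-volatility approach.
    """
    if dist == "t":
        nu, loc, scale = stats.t.fit(returns)
        q_lo = stats.t.ppf(alpha, nu, loc, scale)
        q_hi = stats.t.ppf(1 - alpha, nu, loc, scale)
    else:
        mu, sigma = returns.mean(), returns.std(ddof=1)
        q_lo = stats.norm.ppf(alpha, mu, sigma)
        q_hi = stats.norm.ppf(1 - alpha, mu, sigma)
    return -q_lo, q_hi   # VaR for long and short positions, respectively

# Simulated fat-tailed daily returns (Student-t, 5 degrees of freedom)
rets = stats.t.rvs(df=5, scale=0.01, size=2000, random_state=42)
var_long_t, var_short_t = parametric_var(rets, alpha=0.01, dist="t")
var_long_n, var_short_n = parametric_var(rets, alpha=0.01, dist="normal")
print(f"1% VaR (long): t = {var_long_t:.4f}, normal = {var_long_n:.4f}")
```

The Student-t VaR exceeds the normal VaR at the 1% level, consistent with the finding that fat-tailed specifications perform better at tighter significance levels.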

Keywords: value-at-risk, risk management, islamic finance, GARCH models

Procedia PDF Downloads 589
1487 Moving Object Detection Using Histogram of Uniformly Oriented Gradient

Authors: Wei-Jong Yang, Yu-Siang Su, Pau-Choo Chung, Jar-Ferr Yang

Abstract:

Moving object detection (MOD) is an important issue in advanced driver assistance systems (ADAS). Two important classes of moving objects in ADAS are pedestrians and scooters. In real-world systems, there are two major challenges for MOD: computational complexity and detection accuracy. Histogram of oriented gradient (HOG) features can easily detect the edges of objects with invariance to changes in illumination and shadowing. However, to reduce the execution time for real-time systems, the image must be down-sampled, which increases the influence of outliers. For this reason, we propose histogram of uniformly-oriented gradient (HUG) features to obtain a more accurate description of the contour of the human body. In the testing phase, a support vector machine (SVM) with a linear kernel function is used. Experimental results show the correctness and effectiveness of the proposed method. With SVM classifiers, the real testing results show that the proposed HUG features achieve better classification performance than the HOG ones.
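The gradient-orientation histogram underlying HOG-style features can be sketched for a single cell as follows; the HUG modification (restricting the descriptor to uniformly-oriented gradients) is not reproduced here, so this only illustrates the base descriptor:

```python
import numpy as np

def cell_orientation_histogram(patch, n_bins=9):
    """Unsigned gradient-orientation histogram for one HOG cell (numpy sketch)."""
    gx = np.zeros_like(patch, dtype=float)
    gy = np.zeros_like(patch, dtype=float)
    gx[:, 1:-1] = patch[:, 2:] - patch[:, :-2]      # central differences, x
    gy[1:-1, :] = patch[2:, :] - patch[:-2, :]      # central differences, y
    mag = np.hypot(gx, gy)                          # gradient magnitude
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0    # unsigned orientation
    bins = np.minimum((ang / (180.0 / n_bins)).astype(int), n_bins - 1)
    hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
    norm = np.linalg.norm(hist) + 1e-12             # L2 normalisation
    return hist / norm

# A horizontal intensity ramp: all gradients point in the 0-degree direction
patch = np.tile(np.arange(8.0), (8, 1))
h = cell_orientation_histogram(patch)
print(np.argmax(h))   # dominant bin 0, i.e. ~0-degree gradient direction
```

Magnitude-weighted voting into orientation bins is what makes the descriptor robust to illumination changes: uniform brightness shifts leave the gradients unchanged.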

Keywords: moving object detection, histogram of oriented gradient, histogram of uniformly-oriented gradient, linear support vector machine

Procedia PDF Downloads 590
1486 Implementation of an Open Source ERP for SMEs in the Automotive Sector in Peru: A Case Study

Authors: Gerson E. Cornejo, Luis A. Gamarra, David S. Mauricio

Abstract:

Enterprise Resource Planning (ERP) systems allow the integration of all the business processes of a company's functional areas in order to automate and standardize those processes, obtain accurate information and improve decision-making in real time. In Peru, 79% of small and medium-sized enterprises (SMEs) do not use any management software, because ERPs are believed to be expensive, complex and difficult to implement. However, Open Source ERPs have existed for more than 20 years; they are more accessible and offer the same benefits as proprietary ERPs, but there is little information on their implementation process. This work presents a case study showing the implementation process of an Open Source ERP, Odoo, based on the ASAP (Accelerated SAP) methodology and applied to a company providing corrective and preventive vehicle maintenance services. The ERP allowed the SME to standardize its business processes and increase its productivity, reducing the duration of certain processes by up to 40%. The case study shows that it is feasible and profitable to implement an Open Source ERP in SMEs in the automotive sector of Peru. In addition, it shows that the ASAP methodology is adequate for carrying out Open Source ERP implementation projects.

Keywords: ASAP, automotive sector, ERP implementation, open source

Procedia PDF Downloads 330
1485 A Crop Growth Subroutine for Watershed Resources Management (WRM) Model

Authors: Kingsley Nnaemeka Ogbu, Constantine Mbajiorgu

Abstract:

Vegetation has a marked effect on runoff and has become an important component in hydrologic models. The Watershed Resources Management (WRM) model, a process-based, continuous, distributed-parameter simulation model developed for hydrologic and soil erosion studies at the watershed scale, lacks a crop growth component. As such, it assumes constant values for the vegetation and hydraulic parameters throughout the duration of a hydrologic simulation. Our approach is to develop a crop growth algorithm based on the original plant growth model used in the Environmental Policy Integrated Climate (EPIC) model. This paper describes the development of a single crop growth model capable of simulating all crops using unique parameter values for each crop. The simulated crop growth processes will reflect the vegetative seasonality of the natural watershed system. An existing model for evaluating vegetative resistance from hydraulic and vegetative parameters was incorporated into the WRM model. The improved WRM model will be able to evaluate the seasonal variation of the vegetative roughness coefficient with depth of flow, further enhancing the model's capability for accurate hydrologic studies.
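A minimal sketch of the heat-unit-driven development logic an EPIC-style crop subroutine would add to WRM; the base temperature, potential heat units and LAI curve shape below are illustrative assumptions, not calibrated EPIC or WRM values:

```python
import math

# Heat-unit-driven crop development in the style of the EPIC plant growth
# model. All coefficients are illustrative, not calibrated values.

T_BASE = 8.0    # base temperature (deg C) below which no growth occurs
PHU = 1500.0    # potential heat units required to reach maturity

def heat_unit_index(daily_mean_temps):
    """Fraction of the growing season completed (0..1)."""
    hu = sum(max(t - T_BASE, 0.0) for t in daily_mean_temps)
    return min(hu / PHU, 1.0)

def leaf_area_index(hui, lai_max=3.0, l1=3.06, l2=13.39):
    """S-shaped LAI development curve, LAI = LAI_max * hui / (hui + exp(l1 - l2*hui)).

    l1, l2 are fitted here so that ~5% of LAI_max is reached at 15% of the
    heat units and ~95% at 50% (an illustrative shape only)."""
    return lai_max * hui / (hui + math.exp(l1 - l2 * hui))

# A warm 60-day spell: LAI (and hence the vegetative roughness coefficient)
# would be updated daily from values like these instead of held constant.
temps = [22.0] * 60
hui = heat_unit_index(temps)       # 60 * (22 - 8) / 1500 = 0.56
print(round(hui, 2), round(leaf_area_index(hui), 2))
```

Driving the roughness coefficient from a daily LAI of this kind is what replaces the constant vegetation parameters the current WRM model assumes.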

Keywords: crop yield, roughness coefficient, PAR, WRM model

Procedia PDF Downloads 403
1484 Pulmonary Disease Identification Using Machine Learning and Deep Learning Techniques

Authors: Chandu Rathnayake, Isuri Anuradha

Abstract:

Early detection and accurate diagnosis of lung diseases play a crucial role in improving patient prognosis. However, conventional diagnostic methods rely heavily on subjective symptom assessment and medical imaging, often causing delays in diagnosis and treatment. To overcome this challenge, we propose a novel lung disease prediction system that integrates patient symptoms and X-ray images to provide a comprehensive and reliable diagnosis. In this project, we develop a mobile application specifically designed for detecting lung diseases. Our application leverages both patient symptoms and X-ray images to facilitate diagnosis. By combining these two sources of information, it delivers a more accurate and comprehensive assessment of the patient's condition, minimizing the risk of misdiagnosis. Our primary aim is to create a user-friendly and accessible tool, particularly important given the current circumstances where many patients face limitations in visiting healthcare facilities. To achieve this, we employ several state-of-the-art algorithms. First, the decision tree algorithm is used for efficient symptom-based classification: it analyzes patient symptoms and builds a tree-like model to predict the presence of specific lung diseases. Second, we employ the random forest algorithm, which enhances predictive power by aggregating multiple decision trees; this ensemble technique improves the accuracy and robustness of the diagnosis. Furthermore, we incorporate a deep learning model using a Convolutional Neural Network (CNN) with the pre-trained ResNet50 model. CNNs are well suited to image analysis and feature extraction; by training the CNN on a large dataset of X-ray images, it learns to identify patterns and features indicative of lung diseases. The ResNet50 architecture, known for its excellent performance in image recognition tasks, enhances the efficiency and accuracy of our deep learning model.
By combining the outputs of the decision tree-based algorithms and the deep learning model, our mobile application generates a comprehensive lung disease prediction. The application provides users with an intuitive interface to input their symptoms and upload X-ray images for analysis. The prediction generated by the system offers valuable insights into the likelihood of various lung diseases, enabling individuals to take appropriate actions and seek timely medical attention. Our proposed mobile application has significant potential to address the rising prevalence of lung diseases, particularly among young individuals with smoking addictions. By providing a quick and user-friendly approach to assessing lung health, our application empowers individuals to monitor their well-being conveniently. This solution also offers immense value in the context of limited access to healthcare facilities, enabling timely detection and intervention. In conclusion, our research presents a comprehensive lung disease prediction system that combines patient symptoms and X-ray images using advanced algorithms. By developing a mobile application, we provide an accessible tool for individuals to assess their lung health conveniently. This solution has the potential to make a significant impact on the early detection and management of lung diseases, benefiting both patients and healthcare providers.
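A hedged sketch of the fusion step, combining symptom-based decision-tree and random-forest probabilities with an image-model score. Synthetic data stand in for the symptom records, a random-noise stub stands in for the ResNet50 output, and the weighted soft vote is an assumption for illustration, not the authors' exact combination rule:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

# Late-fusion sketch: symptom classifiers plus an image-model probability.
# Everything below is illustrative; the CNN branch is a stub.
rng = np.random.default_rng(0)
n = 300
symptoms = rng.integers(0, 2, size=(n, 6))                 # 6 binary symptom flags
disease = (symptoms[:, :3].sum(axis=1) >= 2).astype(int)   # synthetic label

tree = DecisionTreeClassifier(max_depth=4).fit(symptoms, disease)
forest = RandomForestClassifier(n_estimators=50, random_state=0).fit(symptoms, disease)

p_tree = tree.predict_proba(symptoms)[:, 1]
p_forest = forest.predict_proba(symptoms)[:, 1]
p_cnn = np.clip(disease + rng.normal(0, 0.2, n), 0, 1)     # stand-in for ResNet50 scores

p_final = 0.25 * p_tree + 0.25 * p_forest + 0.5 * p_cnn    # weighted soft vote
accuracy = ((p_final > 0.5) == disease).mean()
print(f"fused accuracy on synthetic data: {accuracy:.2f}")
```

A soft vote of this kind lets either branch compensate for the other when one source of evidence (symptoms or imaging) is weak.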

Keywords: CNN, random forest, decision tree, machine learning, deep learning

Procedia PDF Downloads 71
1483 Spectral Analysis Approaches for Simultaneous Determination of Binary Mixtures with Overlapping Spectra: An Application on Pseudoephedrine Sulphate and Loratadine

Authors: Sara El-Hanboushy, Hayam Lotfy, Yasmin Fayez, Engy Shokry, Mohammed Abdelkawy

Abstract:

Simple, specific, accurate and precise spectrophotometric methods are developed and validated for the simultaneous determination of pseudoephedrine sulphate (PSE) and loratadine (LOR) in a combined dosage form, based on spectral analysis techniques. Pseudoephedrine (PSE) in the binary mixture could be analyzed either by using its resolved zero-order absorption spectrum at its λmax of 256.8 nm after subtraction of the LOR spectrum, or in the presence of the LOR spectrum by the absorption correction method at 256.8 nm, the dual wavelength (DWL) method at 254 nm and 273 nm, the induced dual wavelength (IDWL) method at 256 nm and 272 nm, and the ratio difference (RD) method at 256 nm and 262 nm. Loratadine (LOR) in the mixture could be analyzed directly at 280 nm without any interference from the PSE spectrum, or at 250 nm using its recovered zero-order absorption spectrum obtained by constant multiplication (CM). In addition, the simultaneous determination of PSE and LOR in their mixture could be performed by the induced amplitude modulation method (IAM) coupled with amplitude multiplication (PM).
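The dual wavelength principle can be demonstrated numerically: choosing two wavelengths at which the interfering component absorbs equally cancels its contribution, leaving an absorbance difference proportional to the other component's concentration. The spectra below are synthetic Gaussians, purely illustrative of the technique and not the real PSE/LOR spectra:

```python
import numpy as np

wl = np.linspace(230, 320, 901)                     # wavelength grid (nm)
eps_pse = np.exp(-((wl - 257) / 8) ** 2)            # unit-concentration spectra
eps_lor = np.exp(-((wl - 280) / 15) ** 2)           # (synthetic Gaussians)

c_pse, c_lor = 0.7, 0.4                             # mixture concentrations
mix = c_pse * eps_pse + c_lor * eps_lor             # Beer-Lambert additivity

i1 = int(np.argmin(np.abs(wl - 254)))
# find a second wavelength (above 290 nm) where LOR absorbs the same as at 254 nm
mask = wl > 290
i2 = np.nonzero(mask)[0][np.argmin(np.abs(eps_lor[mask] - eps_lor[i1]))]

delta = mix[i1] - mix[i2]                           # LOR contribution cancels out
slope = eps_pse[i1] - eps_pse[i2]                   # calibration from PSE standards
print(round(float(delta / slope), 3))               # recovers c_pse = 0.7
```

The same cancellation logic, with wavelengths chosen from the real drug spectra, underlies the DWL determination of PSE at 254 nm and 273 nm.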

Keywords: dual wavelength (DW), induced amplitude modulation method (IAM) coupled with amplitude multiplication (PM), loratadine, pseudoephedrine sulphate, ratio difference (RD)

Procedia PDF Downloads 317
1482 Multi-Spectral Deep Learning Models for Forest Fire Detection

Authors: Smitha Haridasan, Zelalem Demissie, Atri Dutta, Ajita Rattani

Abstract:

Aided by the wind, all it takes is one ember and a few minutes to start a wildfire. Wildfires are growing in frequency and size due to climate change, and they and their consequences are among the major environmental concerns. Every year, millions of hectares of forest are destroyed around the world, causing mass destruction and human casualties. Early detection of wildfires is therefore a critical component of mitigating this threat. Many computer vision-based techniques have been proposed for the early detection of forest fires using video surveillance, and several methods predict and detect forest fires in various spectra, namely RGB, HSV, and YCbCr. The aim of this paper is to propose a multi-spectral deep learning model that combines information from different spectra at intermediate layers for accurate fire detection. A heterogeneous dataset assembled from publicly available datasets is used for model training and evaluation in this study. The experimental results show that multi-spectral deep learning models can obtain an improvement of about 4.68% over those based on a single spectrum for fire detection.
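An input-level sketch of the multi-spectral idea, stacking the RGB, HSV and YCbCr planes of an image; the paper fuses the spectra at intermediate network layers (one branch per colour space), so this only illustrates the colour-space preparation:

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

def rgb_to_ycbcr(rgb):
    """ITU-R BT.601 full-range RGB -> YCbCr for float images in [0, 1]."""
    m = np.array([[ 0.299,     0.587,     0.114],
                  [-0.168736, -0.331264,  0.5],
                  [ 0.5,      -0.418688, -0.081312]])
    out = rgb @ m.T
    out[..., 1:] += 0.5          # centre the chroma channels
    return out

def multispectral_stack(rgb):
    """Stack RGB, HSV and YCbCr planes into one 9-channel array.

    Input-level sketch only; an intermediate-layer fusion model would
    instead process each colour space in its own branch before merging."""
    return np.concatenate([rgb, rgb_to_hsv(rgb), rgb_to_ycbcr(rgb)], axis=-1)

# A 2x2 patch with one pure "flame red" pixel
patch = np.zeros((2, 2, 3))
patch[0, 0] = [1.0, 0.0, 0.0]
stacked = multispectral_stack(patch)
print(stacked.shape)          # (2, 2, 9)
```

Flame pixels are highly saturated and chroma-red, which is why the HSV saturation and YCbCr Cr planes carry complementary evidence to raw RGB.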

Keywords: deep learning, forest fire detection, multi-spectral learning, natural hazard detection

Procedia PDF Downloads 233
1481 Sentiment Analysis of Creative Tourism Experiences: The Case of Girona, Spain

Authors: Ariadna Gassiot, Raquel Camprubi, Lluis Coromina

Abstract:

Creative tourism involves the participation of tourists in the co-creation of their own experiences in a tourism destination. Consequently, creative tourists move from passive to active behavior, and tourism destinations address this type of tourism by changing the scenario: instead of merely offering tourism products and services, they let tourists learn and participate while they travel. In creative tourism experiences, tourists are in close contact with locals and their culture. In destinations where culture (e.g. food, heritage) is the basis of the offer, such as Girona, Spain, tourism stakeholders must especially consider, analyze, and further foster the co-creation of authentic tourism experiences. They should focus on discovering more about these experiences, their main attributes, visitors’ opinions, and so on. Creative tourists do not only participate while they travel around the world; they also behave actively after the trip, writing freely about their tourism experiences in different channels. User-generated content becomes crucial for any tourism destination when analyzing the market, making decisions, planning strategies, and addressing issues such as reputation and performance. Sentiment analysis is a methodology used to automatically analyze semantic relationships and meanings in texts, and thus a way to extract tourists’ emotions and feelings. Tourists normally express their views and opinions regarding tourism products and services; these may be positive, neutral or negative feelings, such as anger, love, hate, sadness or joy, conveyed through verbs, nouns, adverbs and adjectives, among other parts of speech. Sentiment analysis may help tourism professionals in a range of areas, from marketing to customer service.
For example, sentiment analysis allows tourism stakeholders to forecast tourism expenditure and tourist arrivals, or to analyze tourists’ profiles. While there is an increasing presence of creativity in tourists’ experiences, there is also an increasing need to explore tourists’ expressions about these experiences and to know how they feel about participating in specific tourism activities. Thus, the main objective of this study is to analyze the meanings, emotions and feelings that tourists express about their creative experiences in Girona, Spain, using sentiment analysis methodology. The results show the diversity of tourists who actively participate in tourism in Girona. Their opinions refer both to tangible aspects (e.g. food, museums) and to intangible aspects (e.g. friendliness, nightlife) of tourism experiences. Tourists express love, liking and other sentiments towards tourism products and services in Girona. This study can help tourism stakeholders understand tourists’ experiences and feelings; consequently, they can offer more customized products and services and more effectively involve tourists in the co-creation of their own experiences.
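A toy lexicon-based polarity scorer illustrating the kind of sentiment extraction applied to visitor reviews; both the lexicon and the negation rule below are hand-picked stand-ins for illustration, not the study's actual tooling:

```python
# Minimal lexicon-based sentiment scoring of review text. The lexicon and
# negation handling are illustrative assumptions, not the study's method.
LEXICON = {
    "love": 3.0, "friendly": 2.0, "delicious": 2.5, "beautiful": 2.0,
    "crowded": -1.5, "expensive": -1.5, "hate": -3.0, "boring": -2.0,
}
NEGATORS = {"not", "never", "no"}

def polarity(review):
    """Mean lexicon score of matched tokens; a negator flips the next match."""
    tokens = review.lower().replace(",", " ").replace(".", " ").split()
    score, hits, flip = 0.0, 0, 1
    for tok in tokens:
        if tok in NEGATORS:
            flip = -1            # flip the sign of the next sentiment word
            continue
        if tok in LEXICON:
            score += flip * LEXICON[tok]
            hits += 1
        flip = 1
    return score / hits if hits else 0.0

print(polarity("We love the food markets, so friendly and delicious."))       # positive
print(polarity("The old town was beautiful but very crowded and expensive.")) # negative
```

Scores of this kind, aggregated over many reviews, separate the tangible aspects (food, museums) from the intangible ones (friendliness, nightlife) that the study reports on.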

Keywords: creative tourism, sentiment analysis, text mining, user-generated content

Procedia PDF Downloads 175
1480 Geometrically Non-Linear Axisymmetric Free Vibration Analysis of Functionally Graded Annular Plates

Authors: Boutahar Lhoucine, El Bikri Khalid, Benamar Rhali

Abstract:

In this paper, the non-linear free axisymmetric vibration of a thin annular plate made of functionally graded material (FGM) has been studied using the energy method and a multimode approach. The FGM properties vary continuously and non-homogeneously through the thickness of the plate. The theoretical model is based on the classical plate theory and the Von Kármán geometrical non-linearity assumptions. An approximation adopted in the present work consists of neglecting the in-plane deformation in the formulation. Hamilton’s principle is used to derive the governing equation of motion. The problem is solved by a numerical iterative procedure in order to obtain more accurate results for vibration amplitudes up to 1.5 times the plate thickness. The numerical results are given for the first axisymmetric non-linear mode shape over a wide range of vibration amplitudes, and they are presented in tabular and graphical form to show that the vibration amplitude and the variation in material properties have significant effects on the frequencies and the bending stresses in large-amplitude vibration of the functionally graded annular plate.

Keywords: non-linear vibrations, annular plates, large amplitudes, functionally graded material

Procedia PDF Downloads 357