Search results for: time prediction algorithms
Paper Count: 20399

18419 Time-Frequency Feature Extraction Method Based on Micro-Doppler Signature of Ground Moving Targets

Authors: Ke Ren, Huiruo Shi, Linsen Li, Baoshuai Wang, Yu Zhou

Abstract:

Since discriminative features are required for ground moving target classification, we propose a new feature extraction method based on the micro-Doppler signature. First, time-frequency analysis of measured data indicates that the time-frequency spectrograms of three kinds of ground moving targets, i.e., a single walking person, two people walking, and a moving wheeled vehicle, are discriminative. Then, a three-dimensional time-frequency feature vector is extracted from the time-frequency spectrograms to capture these differences. Finally, a Support Vector Machine (SVM) classifier is trained with the proposed three-dimensional feature vector. The accuracy of classifying the measured data into the three kinds of ground moving targets is found to be over 96%, which demonstrates the good discriminative ability of the proposed micro-Doppler feature.
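
As an illustration of the classification stage, the sketch below trains scikit-learn's SVC on synthetic three-dimensional feature vectors standing in for the extracted time-frequency features; the class centers and spreads are invented placeholders, not the paper's measured data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Illustrative 3D time-frequency feature vectors for the three target
# classes (single walker, two walkers, wheeled vehicle); real features
# would be extracted from measured spectrograms.
n = 200
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(n, 3))
               for c in ([0, 0, 0], [1.5, 0.5, 0], [3, 2, 1])])
y = np.repeat([0, 1, 2], n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print(f"accuracy: {clf.score(X_te, y_te):.3f}")
```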

Keywords: micro-Doppler, time-frequency analysis, feature extraction, radar target classification

Procedia PDF Downloads 385
18418 Exploring Data Leakage in EEG Based Brain-Computer Interfaces: Overfitting Challenges

Authors: Khalida Douibi, Rodrigo Balp, Solène Le Bars

Abstract:

In the medical field, applications related to human experiments are frequently limited by small sample sizes, which makes the training of machine learning models quite sensitive and therefore neither very robust nor generalizable. This is notably the case in Brain-Computer Interface (BCI) studies, where the sample size rarely exceeds 20 subjects or a few trials. To address this problem, several resampling approaches are often used during the data preparation phase, which is a critical step in a data science analysis process. One naive approach usually applied by data scientists consists of transforming the entire database before the resampling phase. However, this can cause a model's performance to be incorrectly estimated when making predictions on unseen data. In this paper, we explored the effect of data leakage observed during our BCI experiments for device control through the real-time classification of SSVEPs (Steady State Visually Evoked Potentials). We also studied potential ways to ensure optimal validation of the classifiers during the calibration phase to avoid overfitting. The results show that the scaling step is crucial for some algorithms, and it should be applied after the resampling phase to avoid data leakage and improve results.
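
The point about scaling order can be made concrete with scikit-learn: fitting the scaler on the whole dataset before cross-validation leaks test-fold statistics into training, whereas placing the scaler inside a pipeline confines it to each training fold. The data below are synthetic stand-ins for SSVEP trial features.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 32))        # illustrative SSVEP trial features
y = rng.integers(0, 2, size=120)      # illustrative binary labels

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=1)

# Leaky: the scaler sees the full dataset (test folds included) before
# cross-validation, which can inflate the estimated performance.
X_leaky = StandardScaler().fit_transform(X)
leaky = cross_val_score(SVC(), X_leaky, y, cv=cv)

# Correct: scaling happens inside the pipeline, so it is re-fitted on
# each training fold only.
clean = cross_val_score(make_pipeline(StandardScaler(), SVC()), X, y, cv=cv)
print(f"leaky: {leaky.mean():.3f}, leak-free: {clean.mean():.3f}")
```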

Keywords: data leakage, data science, machine learning, SSVEP, BCI, overfitting

Procedia PDF Downloads 135
18417 Blockchain-Resilient Framework for Cloud-Based Network Devices within the Architecture of Self-Driving Cars

Authors: Mirza Mujtaba Baig

Abstract:

Artificial Intelligence (AI) is evolving rapidly, and one of the areas this field has influenced is automation. The automobile, healthcare, education, and robotics industries deploy AI technologies constantly, and the automation of tasks frees time for knowledge-based work while adding convenience to everyday human endeavors. The paper reviews the challenges faced by current implementations of autonomous self-driving cars by exploring the machine learning, robotics, and artificial intelligence techniques employed in the development of this innovation. The controversy surrounding the development and deployment of autonomous machines, e.g., vehicles, calls for an exploration of the configuration of the programming modules. This paper seeks to add to the body of knowledge by assisting researchers in decreasing the inconsistencies in current programming modules. Blockchain is a technology whose applications are mostly found in the financial, pharmaceutical, manufacturing, and artificial intelligence domains. Registering events in a secure manner and applying the external algorithms required for data analytics are especially helpful for integrating, adapting, maintaining, and extending to new domains, especially predictive analytics applications.

Keywords: artificial intelligence, automation, big data, self-driving cars, machine learning, neural networking algorithm, blockchain, business intelligence

Procedia PDF Downloads 100
18416 Design and Implementation of a Counting and Differentiation System for Vehicles through Video Processing

Authors: Derlis Gregor, Kevin Cikel, Mario Arzamendia, Raúl Gregor

Abstract:

This paper presents a self-sustaining mobile system for counting and classifying vehicles through video processing. It proposes a counting and classification algorithm divided into four steps that can be executed multiple times in parallel on an SBC (Single Board Computer), such as the Raspberry Pi 2, so that it can run in real time. The first step of the proposed algorithm limits the zone of the image that will be processed. The second step performs the detection of moving objects using a BGS (Background Subtraction) algorithm based on the GMM (Gaussian Mixture Model), together with a shadow removal algorithm using physics-based features, followed by morphological operations. In the third step, vehicle detection is performed using edge detection algorithms, and vehicle tracking is done with Kalman filters. The last step of the proposed algorithm registers each passing vehicle and classifies it according to its area. A self-sustaining system is proposed, powered by batteries and photovoltaic solar panels, with data transmission done through GPRS (General Packet Radio Service), eliminating the need for external cabling and facilitating deployment and relocation to any location where the system could operate. The self-sustaining trailer will allow the counting and classification of vehicles in specific zones with difficult access.
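
A minimal sketch of the detection-and-classification core (steps 1, 2, and 4) using OpenCV's GMM-based MOG2 background subtractor follows; the file name, region of interest, and area thresholds are hypothetical, and the Kalman-filter tracking of step 3 is omitted for brevity.

```python
import cv2
from collections import defaultdict

# Illustrative parameters; the paper's exact ROI and thresholds are not given.
AREA_MIN, AREA_LARGE = 500, 5000      # pixel-area bands used for classification

cap = cv2.VideoCapture("traffic.mp4")                          # hypothetical input
bgs = cv2.createBackgroundSubtractorMOG2(detectShadows=True)   # GMM-based BGS
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
counts = defaultdict(int)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    roi = frame[200:480, :]            # step 1: limit the processed zone
    mask = bgs.apply(roi)              # step 2: GMM background subtraction
    mask[mask == 127] = 0              # MOG2 marks shadow pixels as 127
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Step 4 (area-based classification); without step-3 tracking this
    # tallies detections per frame rather than unique vehicles.
    for c in contours:
        area = cv2.contourArea(c)
        if area > AREA_MIN:
            counts["car" if area < AREA_LARGE else "truck/bus"] += 1

cap.release()
print(dict(counts))
```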

Keywords: intelligent transportation system, object detection, vehicle counting, vehicle classification, video processing

Procedia PDF Downloads 304
18415 Determination of Verapamil Hydrochloride in Tablets and Injection Solutions With the Verapamil-Selective Electrode and Possibilities of Application in Pharmaceutical Analysis

Authors: Faisal A. Salih

Abstract:

Verapamil hydrochloride (Ver) is a drug used in medicine as a calcium channel blocker for arrhythmia, angina, and hypertension. For the quantitative determination of Ver in dosage forms, the HPLC method is most often used. A convenient alternative to the chromatographic method is potentiometry using a Ver-selective electrode, which does not require expensive equipment, can be used without separation from the matrix components (significantly reducing the analysis time), and does not use toxic organic solvents, making it a "green", environmentally friendly technique. It has been established in this study that a rational choice of the membrane plasticizer and of the preconditioning and measurement algorithms, which prevent non-exchangeable extraction of Ver into the membrane phase, makes it possible to achieve excellent analytical characteristics for Ver-selective electrodes based on commercially available components. In particular, an electrode with the membrane composition PVC (32.8 wt %), ortho-nitrophenyloctyl ether (66.6 wt %), and tetrakis-4-chlorophenylborate (0.6 wt %, or 0.01 M) has a lower detection limit of 4 × 10⁻⁸ M and a potential reproducibility of 0.15–0.22 mV. Both direct potentiometry (DP) and potentiometric titration (PT) can be used for the determination of Ver in tablets and injection solutions. The masses of Ver per average tablet weight determined by DP and PT for the same set of 10 tablets were (80.4 ± 0.2) and (80.7 ± 0.2) mg, respectively. The masses of Ver in injection solutions, determined by DP for two ampoules from one batch, were (5.00 ± 0.015) and (5.004 ± 0.006) mg. In all cases, good reproducibility and excellent correspondence with the declared quantities were observed.
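
The direct potentiometry step reduces to a Nernstian calibration, E = E0 + S·log10(C), inverted for the measured sample potential; for a monovalent cation such as Ver, the ideal slope is near +59.2 mV/decade at 25 °C. The sketch below illustrates this with invented calibration data.

```python
import numpy as np

# Hypothetical calibration data for a Ver-selective electrode: standard
# concentrations (M) and measured cell potentials (mV).
c_std = np.array([1e-5, 1e-4, 1e-3, 1e-2])
e_std = np.array([152.0, 210.5, 269.8, 328.9])

slope, e0 = np.polyfit(np.log10(c_std), e_std, 1)   # E = E0 + S*log10(C)
print(f"slope = {slope:.1f} mV/decade")

e_sample = 245.3                                    # hypothetical tablet extract
c_sample = 10 ** ((e_sample - e0) / slope)          # invert the calibration
print(f"estimated Ver concentration: {c_sample:.2e} M")
```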

Keywords: verapamil, potentiometry, ion-selective electrode, pharmaceutical analysis

Procedia PDF Downloads 69
18414 Evotrader: Bitcoin Trading Using Evolutionary Algorithms on Technical Analysis and Social Sentiment Data

Authors: Martin Pellon Consunji

Abstract:

Due to the rise in popularity of Bitcoin and other crypto assets as a store of wealth and speculative investment, there is an ever-growing demand for automated trading tools, such as bots, to gain an advantage over the market. Traditionally, trading in the stock market was done by professionals with years of training who understood patterns and exploited market opportunities in order to earn a profit. Nowadays, however, a larger portion of market participants are at minimum aided by market-data processing bots, which can generally generate more stable signals than the average human trader. The rise in trading bot usage can be credited to the inherent advantages that bots have over humans in processing large amounts of data, their lack of emotions such as fear or greed, and their ability to predict market prices using past data and artificial intelligence; hence, a growing number of approaches have been brought forward to tackle this task. The general limitation of these approaches, however, comes down to the fact that limited historical data does not always determine the future, and that many market participants are still human, emotion-driven traders. Moreover, developing markets such as the cryptocurrency space have even less historical data to interpret than most well-established markets. Because of this, some human traders have gone back to tried-and-tested traditional technical analysis tools for exploiting market patterns and simplifying the broader spectrum of data involved in making market predictions. This paper proposes a method that applies neuroevolution techniques to both sentiment data and the more traditionally human-consumed technical analysis data in order to obtain a more accurate forecast of future market behavior and to account for the way both automated bots and human traders affect the market prices of Bitcoin and other cryptocurrencies. This study's approach uses evolutionary algorithms to automatically develop increasingly improved populations of bots which, using the latest inflows of market analysis and sentiment data, evolve to efficiently predict future market price movements. The effectiveness of the approach is validated by testing the system in a simulated historical trading scenario, in a real Bitcoin market live trading scenario, and by testing its robustness in other cryptocurrency and stock market scenarios. Experimental results during a 30-day period show that this method outperformed the buy-and-hold strategy by over 260% in terms of net profit, even when taking standard trading fees into consideration.
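
The evolutionary loop can be sketched in a drastically simplified form: instead of evolving neural networks, the fragment below evolves linear signal weights over synthetic technical and sentiment features, with fitness defined as fee-adjusted backtest profit. All data, parameters, and the linear policy are illustrative stand-ins for the paper's neuroevolution setup.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic market data: returns plus illustrative "technical" and
# "sentiment" features (real inputs would come from market/sentiment feeds).
T = 500
returns = rng.normal(0, 0.02, T)
feats = np.column_stack([
    np.convolve(returns, np.ones(5) / 5, mode="same"),    # momentum proxy
    np.convolve(returns, np.ones(20) / 20, mode="same"),  # trend proxy
    rng.normal(0, 1, T),                                  # sentiment proxy
])

def fitness(w):
    # Bot goes long/flat from the sign of a linear signal; fitness is
    # cumulative return net of a 0.1% fee per position change
    # (timing subtleties of a real backtest are ignored here).
    pos = (feats @ w > 0).astype(float)
    fees = 0.001 * np.abs(np.diff(pos, prepend=0.0))
    return float(np.sum(pos * returns - fees))

pop = rng.normal(size=(40, 3))                  # initial population of bots
for gen in range(100):
    scores = np.array([fitness(w) for w in pop])
    elite = pop[np.argsort(scores)[-10:]]       # keep the 10 best bots
    children = elite[rng.integers(0, 10, 30)] + rng.normal(0, 0.1, (30, 3))
    pop = np.vstack([elite, children])          # mutation-only reproduction

print("best fitness:", max(fitness(w) for w in pop))
```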

Keywords: neuro-evolution, Bitcoin, trading bots, artificial neural networks, technical analysis, evolutionary algorithms

Procedia PDF Downloads 102
18413 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria

Authors: Isaac Kayode Ogunlade

Abstract:

Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ was designed using a PIC18F4550 microcontroller communicating with a Personal Computer (PC) through USB (Universal Serial Bus). The research deployed knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device using an LM35 sensor to measure weather parameters, and used artificial intelligence (an Artificial Neural Network, ANN) and a statistical approach (Autoregressive Integrated Moving Average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) to evaluate its performance. Both devices (standard and designed) were subjected to 180 days of the same atmospheric conditions for data collection (temperature, relative humidity, and pressure). The acquired data was trained in the MATLAB R2012b environment using ANN and ARIMA to predict precipitation (rainfall). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), the coefficient of determination (R²), and Mean Percentage Error (MPE) were used as standard evaluation metrics to assess the performance of the models in predicting precipitation. The results show that the developed device has an efficiency of 96% and is also compatible with Personal Computers (PCs) and laptops. The simulation results for the acquired data show that the ANN precipitation (rainfall) predictions for two months (May and June 2017) exhibited a disparity error of 1.59%, while that of ARIMA was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
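
For the statistical branch, an ARIMA forecast of the kind described can be sketched with statsmodels; the series and the (p, d, q) order below are illustrative stand-ins for the study's 180-day record.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical 180-day series standing in for the DAQ device's record;
# the study's real inputs were temperature, humidity, and pressure.
rng = np.random.default_rng(7)
series = 20 + np.cumsum(rng.normal(0, 0.5, 180))

model = ARIMA(series, order=(1, 1, 1))   # (p, d, q) chosen for illustration
fit = model.fit()
forecast = fit.forecast(steps=60)        # roughly two months ahead
print(forecast[:5])
```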

Keywords: data acquisition system, device design, weather station, precipitation prediction, FUTA standard device

Procedia PDF Downloads 70
18412 Thermo-Mechanical Analysis of Composite Structures Utilizing a Beam Finite Element Based on Global-Local Superposition

Authors: Andre S. de Lima, Alfredo R. de Faria, Jose J. R. Faria

Abstract:

Accurate prediction of thermal stresses is particularly important for laminated composite structures, as large temperature changes may occur during fabrication and field application. The transverse normal deformation plays an important role in the prediction of such stresses, especially for problems involving thick laminated plates subjected to uniform temperature loads. Bearing this in mind, the present study investigates the thermo-mechanical behavior of laminated composite structures using a new beam element based on global-local superposition, accounting for through-the-thickness effects. The element formulation is based on a global-local superposition in the thickness direction, utilizing a cubic global displacement field in combination with a linear layerwise local displacement distribution, which ensures the zig-zag behavior of the stresses and displacements. By enforcing interlaminar stress (normal and shear) and displacement continuity, as well as traction-free conditions at the upper and lower surfaces, the number of degrees of freedom in the model is kept independent of the number of layers. Moreover, the proposed formulation allows the transverse shear and normal stresses to be determined directly from the constitutive equations, without the need for post-processing. Numerical results obtained with the beam element were compared to analytical solutions, as well as to results obtained with commercial finite elements, yielding satisfactory results for a range of length-to-thickness ratios. The results confirm the need for an element with through-the-thickness capabilities and indicate that the present formulation is a promising alternative for such analysis.
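
A representative form of such a global-local displacement assumption (not necessarily the paper's exact formulation) is:

```latex
% Representative global-local in-plane displacement field for layer k:
% a cubic global field superposed with a linear layerwise (zig-zag) term,
% where z_k is the midplane coordinate and h_k the thickness of layer k.
u^{k}(x,z) =
    \underbrace{u_{0}(x) + z\,u_{1}(x) + z^{2}\,u_{2}(x) + z^{3}\,u_{3}(x)}_{\text{cubic global field}}
  + \underbrace{\xi_{k}\,\bar{u}_{k}(x)}_{\text{linear local field}},
\qquad
\xi_{k} = \frac{2\,(z - z_{k})}{h_{k}} .
```

Enforcing interlaminar continuity of the displacement and of the transverse stresses, together with the traction-free surface conditions, eliminates the layerwise unknowns, which is what keeps the number of degrees of freedom independent of the layer count.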

Keywords: composite beam element, global-local superposition, laminated composite structures, thermal stresses

Procedia PDF Downloads 141
18411 Control Strategy for Two-Mode Hybrid Electric Vehicle by Using Fuzzy Controller

Authors: Jia-Shiun Chen, Hsiu-Ying Hwang

Abstract:

Hybrid electric vehicles can reduce pollution and improve fuel economy. Power-split hybrid electric vehicles (HEVs) provide two power paths between the internal combustion engine (ICE) and the energy storage system (ESS) through the gears of an electrically variable transmission (EVT). The EVT allows the ICE to operate independently of vehicle speed at all times; therefore, the ICE can operate in the efficient region of its characteristic brake specific fuel consumption (BSFC) map. The two-mode powertrain can operate in input-split or compound-split EVT modes and in four different fixed-gear configurations. The power-split architecture is advantageous because it combines conventional series and parallel power paths. This research focuses on the input-split and compound-split modes of the two-mode power-split powertrain. Fuzzy logic control (FLC) for the ICE and PI control for the electric machines (EMs) are derived for an urban driving cycle simulation. These control algorithms reduce vehicle fuel consumption and improve ICE efficiency while maintaining the state of charge (SOC) of the energy storage system in an efficient range.
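
As a sketch of how an FLC can supervise the ICE, the fragment below implements a zero-order Sugeno fuzzy map from battery SOC and driver power demand to an ICE power command; the membership breakpoints and rule outputs are illustrative, not the paper's calibrated values.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def engine_power_cmd(soc, demand_kw):
    """Zero-order Sugeno fuzzy rules mapping battery SOC (0-1) and driver
    power demand (kW) to an ICE power command (kW); all numbers are
    illustrative placeholders."""
    soc_low = tri(soc, 0.0, 0.3, 0.5)
    soc_ok = tri(soc, 0.3, 0.5, 0.7)
    soc_high = tri(soc, 0.5, 0.7, 1.0)
    dem_low = tri(demand_kw, -10, 0, 30)
    dem_high = tri(demand_kw, 10, 60, 110)
    rules = [
        (min(soc_low, dem_low), 25.0),   # low SOC: run ICE near its sweet spot
        (min(soc_low, dem_high), 60.0),
        (min(soc_ok, dem_low), 10.0),
        (min(soc_ok, dem_high), 45.0),
        (min(soc_high, dem_low), 0.0),   # high SOC, low demand: EV mode
        (min(soc_high, dem_high), 35.0),
    ]
    w = sum(r[0] for r in rules)
    return sum(r[0] * r[1] for r in rules) / w if w > 0 else 0.0

print(engine_power_cmd(soc=0.45, demand_kw=40))
```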

Keywords: hybrid electric vehicle, fuel economy, two-mode hybrid, fuzzy control

Procedia PDF Downloads 365
18410 Time Variance and Spillover Effects between International Crude Oil Price and Ten Emerging Equity Markets

Authors: Murad A. Bein

Abstract:

This paper empirically examines the time-varying relationship and spillover effects between the international crude oil price and ten emerging equity markets, namely three oil-exporting countries (Brazil, Mexico, and Russia) and seven Central and Eastern European (CEE) countries (Bulgaria, Croatia, Czech Republic, Hungary, Poland, Romania, and Slovakia). The results reveal that there are spillover effects from oil markets into almost all the emerging equity markets except Slovakia. Moreover, the oil supply glut had a homogeneous effect on the emerging markets, both the net oil-exporting and the oil-importing (CEE) countries. Further, the time variance increased drastically during financial turmoil. Indeed, the time variance remained high from 2009 to 2012 in response to aggregate demand shocks (the global financial crisis and the Eurozone debt crisis) and quantitative easing measures. Interestingly, the time variance was slightly higher for the oil-exporting countries than for some of the CEE countries. Decision-makers in emerging economies should therefore seek policy coordination when dealing with financial turmoil.
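
The abstract does not spell out the time-varying estimator used; as a simple proxy for the idea, a rolling correlation between oil and equity returns traces how the co-movement strengthens or weakens over time, as sketched below with synthetic data.

```python
import numpy as np
import pandas as pd

# Synthetic daily returns standing in for crude oil and one emerging
# equity index; a rolling correlation is only an illustrative proxy for
# the paper's (unspecified) time-varying spillover estimator.
rng = np.random.default_rng(3)
n = 1000
oil = rng.normal(0, 1.5, n)
equity = 0.4 * oil + rng.normal(0, 1.0, n)   # built-in spillover from oil

rets = pd.DataFrame({"oil": oil, "equity": equity},
                    index=pd.bdate_range("2008-01-01", periods=n))
rolling_corr = rets["oil"].rolling(250).corr(rets["equity"])  # ~1 trading year
print(rolling_corr.dropna().describe())
```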

Keywords: crude oil, spillover effects, emerging equity, time-varying, aggregate demand shock

Procedia PDF Downloads 104
18409 A Probabilistic Study on Time to Cover Cracking Due to Corrosion

Authors: Chun-Qing Li, Hassan Baji, Wei Yang

Abstract:

Corrosion of steel in reinforced concrete structures is a major problem worldwide. The volume expansion of corrosion products causes concrete cover cracking, which can lead to delamination of the concrete cover. The time to cover cracking plays a key role in the assessment of the serviceability of reinforced concrete structures subjected to corrosion. Many analytical, numerical, and empirical models have been developed to predict the time to cracking initiation due to corrosion. In this study, a numerical model based on finite element modeling of the corrosion-induced cracking process is used. In order to predict the service life based on the time to cover cracking, the numerical approach is coupled with a probabilistic procedure. In this procedure, all the influential factors affecting the time to cover cracking are modeled as random variables. The results show that the time to cover cracking is highly variable. It is also shown that the rust expansion ratio and the size of the more porous concrete zone around the rebar are the most influential factors in predicting the service life of corrosion-affected structures.
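
The probabilistic procedure can be sketched as a Monte Carlo simulation over the influential factors. The limit-state expression and all distributions below are illustrative placeholders rather than the paper's calibrated FE-based model; only the Faraday conversion of about 0.0116 mm/year per uA/cm^2 is a standard figure.

```python
import numpy as np

rng = np.random.default_rng(11)
N = 100_000

# Influential factors treated as random variables (illustrative
# distributions, not the paper's calibrated ones).
i_corr = rng.lognormal(np.log(1.0), 0.4, N)     # corrosion rate, uA/cm^2
cover = rng.normal(40.0, 5.0, N)                # concrete cover, mm
d_bar = rng.normal(16.0, 0.5, N)                # bar diameter, mm
alpha = rng.uniform(2.0, 3.5, N)                # rust expansion ratio
porous = rng.normal(12.5e-3, 2.5e-3, N)         # porous zone thickness, mm

# Placeholder limit state: cracking starts once the radial rust expansion
# fills the porous zone plus a cover/diameter-dependent accommodation.
x_crit = (porous + 0.011 * cover / d_bar) / (alpha - 1.0)   # mm
rate = 0.0116 * i_corr                                      # mm/year (Faraday)
t_crack = x_crit / rate                                     # years

print(f"mean {t_crack.mean():.1f} yr, std {t_crack.std():.1f} yr, "
      f"P(t < 5 yr) = {(t_crack < 5).mean():.2%}")
```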

Keywords: corrosion, crack width, probabilistic, service life

Procedia PDF Downloads 193
18408 Data Mining in Healthcare for Predictive Analytics

Authors: Ruzanna Muradyan

Abstract:

Medical data mining is a crucial field in contemporary healthcare that offers cutting-edge tactics with enormous potential to transform patient care. This abstract examines how sophisticated data mining techniques could transform the healthcare industry, with a special focus on how they might improve patient outcomes. Healthcare data repositories have dynamically evolved, producing a rich tapestry of different, multi-dimensional information that includes genetic profiles, lifestyle markers, electronic health records, and more. By utilizing data mining techniques inside this vast library, a variety of prospects for precision medicine, predictive analytics, and insight production become visible. Predictive modeling for illness prediction, risk stratification, and therapy efficacy evaluations are important points of focus. Healthcare providers may use this abundance of data to tailor treatment plans, identify high-risk patient populations, and forecast disease trajectories by applying machine learning algorithms and predictive analytics. Better patient outcomes, more efficient use of resources, and early treatments are made possible by this proactive strategy. Furthermore, data mining techniques act as catalysts to reveal complex relationships between apparently unrelated data pieces, providing enhanced insights into the cause of disease, genetic susceptibilities, and environmental factors. Healthcare practitioners can get practical insights that guide disease prevention, customized patient counseling, and focused therapies by analyzing these associations. The abstract explores the problems and ethical issues that come with using data mining techniques in the healthcare industry. In order to properly use these approaches, it is essential to find a balance between data privacy, security issues, and the interpretability of complex models. Finally, this abstract demonstrates the revolutionary power of modern data mining methodologies in transforming the healthcare sector. Healthcare practitioners and researchers can uncover unique insights, enhance clinical decision-making, and ultimately elevate patient care to unprecedented levels of precision and efficacy by employing cutting-edge methodologies.

Keywords: data mining, healthcare, patient care, predictive analytics, precision medicine, electronic health records, machine learning, predictive modeling, disease prognosis, risk stratification, treatment efficacy, genetic profiles, precision health

Procedia PDF Downloads 38
18407 Impact of Reclamation on the Water Exchange in Bohai Bay

Authors: Luyao Liu, Dekui Yuan, Xu Li

Abstract:

As one of the most important bays of China, Bohai Bay has a water exchange capacity that influences the economic development and urbanization of the surrounding cities. However, rapid reclamation in recent years has further affected the already weak water exchange capacity of this semi-enclosed bay. This paper sets up two hydrodynamic models of Bohai Bay with the shorelines before and after reclamation. The mean value and distribution of the turn-over time, the distribution of the residual current, and the features of the tracer paths are compared. The comparison shows that Bohai Bay keeps its characteristic pattern: water exchange takes longer in the northern part than in the southern part, and longer inshore than offshore. However, the mean water exchange time becomes longer after reclamation. In addition, material spreading is blocked by the inwardly extending shorelines, and its direction changes from along the shoreline to toward the center of the bay after reclamation.

Keywords: Bohai Bay, water exchange, reclamation, turn-over time

Procedia PDF Downloads 117
18406 Trauma Scores and Outcome Prediction After Chest Trauma

Authors: Mohamed Abo El Nasr, Mohamed Shoeib, Abdelhamid Abdelkhalik, Amro Serag

Abstract:

Background: Early assessment of the severity of chest trauma, whether blunt or penetrating, is of critical importance in predicting patient outcome. Different trauma scoring systems are widely available and are based on anatomical or physiological parameters to anticipate patient morbidity or mortality. To date, there is no ideal, universally accepted trauma score that can be applied in all trauma centers and is suitable for assessing the severity of chest trauma. Aim: Our aim was to compare various trauma scoring systems regarding their ability to predict morbidity and mortality in chest trauma patients. Patients and Methods: This was a prospective study including 400 patients with chest trauma who were managed at Tanta University Emergency Hospital, Egypt over a period of 2 years (March 2014 until March 2016). The patients were divided into 2 groups according to the mode of trauma: blunt or penetrating. The collected data included age, sex, hemodynamic status on admission, intrathoracic injuries, and associated extra-thoracic injuries. Patient outcomes, including mortality, need for thoracotomy, need for ICU admission, need for mechanical ventilation, length of hospital stay, and the development of acute respiratory distress syndrome, were also recorded. The relevant data were used to calculate the following trauma scores: 1. anatomical scores, including the Abbreviated Injury Scale (AIS), Injury Severity Score (ISS), New Injury Severity Score (NISS), and Chest Wall Injury Scale (CWIS); 2. physiological scores, including the Revised Trauma Score (RTS) and Acute Physiology and Chronic Health Evaluation II (APACHE II) score; 3. a combined score, the Trauma and Injury Severity Score (TRISS); and 4. a chest-specific score, the Thoracic Trauma Severity Score (TTSS). All these scores were analyzed statistically to determine their sensitivity and specificity and were compared regarding their power to predict mortality and morbidity in blunt and penetrating chest trauma patients. Results: The incidence of mortality was 3.75% (15/400). Eleven patients (11/230) died in the blunt chest trauma group, while four patients (4/170) died in the penetrating trauma group. The mortality rate increased more than threefold, to 13% (13/100), in patients with severe chest trauma (ISS > 16). The physiological scores APACHE II and RTS had the highest predictive value for mortality in both blunt and penetrating chest injuries. The physiological score APACHE II, followed by the combined score TRISS, was more predictive of intensive care admission in penetrating injuries, while RTS was more predictive in blunt trauma. Also, RTS had a higher predictive value for the need for mechanical ventilation, followed by the combined score TRISS. The APACHE II score was more predictive of the need for thoracotomy in penetrating injuries, and the chest-specific score TTSS was more predictive in blunt injuries. The anatomical score ISS and the TTSS were more predictive of prolonged hospital stay in penetrating and blunt injuries, respectively. Conclusion: Trauma scores that include physiological parameters have a higher predictive power for mortality in both blunt and penetrating chest trauma. They are more suitable for assessing injury severity and predicting patient outcome.
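
For reference, the RTS named above combines coded Glasgow Coma Scale, systolic blood pressure, and respiratory rate values with standard weights from the trauma-score literature, as in the sketch below.

```python
def coded(value, bands):
    """Return the 0-4 coded value for a physiological measurement."""
    for code, (lo, hi) in bands.items():
        if lo <= value <= hi:
            return code
    return 0

# Standard RTS coding bands and weights from the trauma-score literature.
GCS_BANDS = {4: (13, 15), 3: (9, 12), 2: (6, 8), 1: (4, 5), 0: (3, 3)}
SBP_BANDS = {4: (90, 999), 3: (76, 89), 2: (50, 75), 1: (1, 49), 0: (0, 0)}
RR_BANDS = {4: (10, 29), 3: (30, 999), 2: (6, 9), 1: (1, 5), 0: (0, 0)}

def revised_trauma_score(gcs, sbp, rr):
    return (0.9368 * coded(gcs, GCS_BANDS)
            + 0.7326 * coded(sbp, SBP_BANDS)
            + 0.2908 * coded(rr, RR_BANDS))

# Example: GCS 14, systolic BP 82 mmHg, respiratory rate 24/min
print(f"RTS = {revised_trauma_score(14, 82, 24):.3f}")   # max is 7.841
```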

Keywords: chest trauma, trauma scores, blunt injuries, penetrating injuries

Procedia PDF Downloads 402
18405 Design and Development of Sustained Release Floating Tablet of Stavudine

Authors: Surajj Sarode, G. Vidya Sagar, G. P. Vadnere

Abstract:

The purpose of the present study was to prolong the gastric residence time of Stavudine by developing a gastric floating drug delivery system (GFDDS), and to study the influence of different polymers on its release rate using gas-forming agents such as sodium bicarbonate and citric acid. Floating tablets were prepared by the wet granulation method using PVP K-30 as a binder; the other polymers included pullulan gum and HPMC K100M. Six formulations with varying concentrations of the polymers were prepared, and the tablets were evaluated in terms of their pre-compression parameters (bulk density, tapped density, Hausner ratio, angle of repose, compressibility index), post-compression physical characteristics, in vitro release, buoyancy, floating lag time (FLT), total floating time (TFT), and swelling index. All formulations showed a good floating lag time, i.e., less than 3 min. The batch containing the combination of pullulan gum and HPMC K100M (F-6) showed a total floating time of more than 12 h and the highest swelling index among all the prepared batches. The drug release was found to follow zero-order kinetics.
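
Zero-order kinetics means the cumulative release grows linearly in time, Q(t) = Q0 + k0·t; fitting this model to (hypothetical) dissolution data is a few lines:

```python
import numpy as np

# Hypothetical cumulative-release data for a batch like F-6; real values
# would come from the in vitro dissolution test.
t = np.array([1, 2, 4, 6, 8, 10, 12], dtype=float)        # hours
q = np.array([8.2, 16.5, 33.1, 48.9, 65.0, 81.4, 97.2])   # % released

k0, q0 = np.polyfit(t, q, 1)       # zero-order model: Q(t) = Q0 + k0*t
pred = q0 + k0 * t
r2 = 1 - np.sum((q - pred) ** 2) / np.sum((q - q.mean()) ** 2)
print(f"k0 = {k0:.2f} %/h, R^2 = {r2:.4f}")   # R^2 near 1 supports zero order
```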

Keywords: Stavudine, floating, total floating time (TFT), gastric residence

Procedia PDF Downloads 376
18404 Customer Preference in the Textile Market: Fabric-Based Analysis

Authors: Francisca Margarita Ocran

Abstract:

Underwear, and more particularly bras and panties, is defined as intimate clothing; strictly speaking, it shapes the place of women in the public or private sphere. Women's lingerie is therefore a complex, high-involvement garment, motivating consumers to buy it not only for its functional utility but also for the multisensory experience it provides. Customer behavior models are generally based on customer data mining, and each model is designed to answer questions at a specific time. Predicting the customer experience is uncertain and difficult. Thus, consumers' tastes in lingerie deserve to be treated as an experiential product, where the dimensions of the experience motivating consumers to buy a lingerie product and to remain faithful to it must be analyzed in detail by manufacturers and retailers in order to engage and retain consumers. This research therefore aims to identify the variables that push consumers to choose their lingerie product, based on an in-depth analysis of the types of fabrics used to make lingerie. The data used in this study come from online purchases. A machine learning approach using the Python programming language and PyCaret gives precision scores of 86.34%, 85.98%, and 84.55% for the three algorithms used to predict a buyer's preference among a range of lingerie. Gradient boosting, random forest, and k-nearest neighbors were used in this study; they are very promising for preference classification in the textile industry.
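
A minimal sketch of the three-classifier comparison follows, using scikit-learn directly instead of PyCaret and a synthetic stand-in for the purchase dataset.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for the online-purchase dataset: fabric type, price
# band, etc., encoded as numeric features.
X, y = make_classification(n_samples=2000, n_features=12, n_informative=6,
                           n_classes=2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for model in (GradientBoostingClassifier(random_state=0),
              RandomForestClassifier(random_state=0),
              KNeighborsClassifier()):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, f"accuracy: {model.score(X_te, y_te):.4f}")
```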

Keywords: consumer behavior, data mining, lingerie, machine learning, preference

Procedia PDF Downloads 67
18403 Modeling and Analysis of Laser Sintering Process Scanning Time for Optimal Planning and Control

Authors: Agarana Michael C., Akinlabi Esther T., Pule Kholopane

Abstract:

In order to sustain the advantages of an advanced manufacturing technique such as laser sintering, minimization of the total processing cost of the parts being produced is very important. Efficient time management is usually essential for optimal cost attainment, which ultimately results in efficient advanced manufacturing process planning and control. During Selective Laser Sintering (SLS) procedures, it is possible to adjust various manufacturing parameters that influence the mechanical and other properties of the products. In this study, modeling and mathematical analysis, including sensitivity analysis, of the laser sintering process time were carried out. The results of the analyses are presented as graphs, from which conclusions were drawn. It was specifically observed that achieving optimal total scanning time is key to the economic efficiency required for the sustainability of the process.
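
A simple illustrative scanning-time model (not the paper's own) is t_total = n·(A/(h·v) + t_r) for n layers of cross-section area A, hatch spacing h, scan speed v, and per-layer recoat delay t_r; its normalized sensitivities can be derived symbolically:

```python
import sympy as sp

# Illustrative per-part scanning-time model (a placeholder, not the
# paper's): n layers, each needing A/(h*v) of scanning plus a delay t_r.
A, h, v, t_r, n = sp.symbols("A h v t_r n", positive=True)
t_total = n * (A / (h * v) + t_r)

# Normalized sensitivity of total time to each parameter: (p/t) * dt/dp.
for p in (A, h, v, t_r):
    s = sp.simplify(p / t_total * sp.diff(t_total, p))
    print(p, "->", s)
```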

Keywords: modeling and analysis, optimal planning and control, laser sintering process, scanning time

Procedia PDF Downloads 80
18402 Random Forest Classification for Population Segmentation

Authors: Regina Chua

Abstract:

To reduce the costs of re-fielding a large survey, a Random Forest classifier was applied to measure the accuracy of classifying individuals into their assigned segments with the fewest possible questions. Given a long survey, one needed to determine the most predictive ten or fewer questions that would accurately assign new individuals to custom segments. Furthermore, the solution needed to be quick in its classification and usable in non-Python environments. In this paper, a supervised Random Forest classifier was modeled on a dataset with 7,000 individuals, 60 questions, and 254 features. The Random Forest consisted of an iterative collection of individual decision trees that result in a predicted segment with robust precision and recall scores compared to a single tree. A random 70-30 stratified sampling for training the algorithm was used, and accuracy trade-offs at different depths for each segment were identified. Ultimately, the Random Forest classifier performed at 87% accuracy at a depth of 10 with 20 instead of 254 features and 10 instead of 60 questions. With an acceptable accuracy in prioritizing feature selection, new tools were developed for non-Python environments: a worksheet with a formulaic version of the algorithm and an embedded function to predict the segment of an individual in real-time. Random Forest was determined to be an optimal classification model by its feature selection, performance, processing speed, and flexible application in other environments.
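
The core of this workflow, training a depth-limited forest, ranking feature importances, and retraining on the top 20 features, can be sketched as follows with a synthetic stand-in for the survey data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the survey data: 7,000 respondents, 254 features,
# four custom segments.
X, y = make_classification(n_samples=7000, n_features=254, n_informative=25,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.7,
                                          stratify=y, random_state=0)

full = RandomForestClassifier(max_depth=10, random_state=0).fit(X_tr, y_tr)
top20 = np.argsort(full.feature_importances_)[-20:]   # keep the 20 best features

small = RandomForestClassifier(max_depth=10, random_state=0)
small.fit(X_tr[:, top20], y_tr)
print(f"full: {full.score(X_te, y_te):.3f}, "
      f"top-20: {small.score(X_te[:, top20], y_te):.3f}")
```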

Keywords: machine learning, supervised learning, data science, random forest, classification, prediction, predictive modeling

Procedia PDF Downloads 77
18401 Nonstationarity Modeling of Economic and Financial Time Series

Authors: C. Slim

Abstract:

Traditional techniques for analyzing time series are based on the notion of stationarity of the phenomena under study, but in reality most economic and financial series do not satisfy this hypothesis, which implies the use of specific tools for the detection of such behavior. In this paper, we study tests for nonstationary, non-seasonal time series in a non-exhaustive manner. We formalize the problem of nonstationary processes with numerical simulations and take stock of their statistical characteristics. The theoretical aspects of some of the most common unit root tests are discussed. We detail the specification of the tests, showing the advantages and disadvantages of each. The empirical study focuses on the application of these tests to the exchange rate (USD/TND) and the Consumer Price Index (CPI) in Tunisia, in order to compare the power of these tests given the characteristics of the series.
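
A minimal illustration of the workhorse unit root test, the augmented Dickey-Fuller (ADF) test from statsmodels, applied to a simulated random walk (unit root, like many exchange-rate series) and a stationary AR(1) process:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(5)
walk = np.cumsum(rng.normal(size=500))    # random walk: has a unit root
ar1 = np.zeros(500)                       # stationary AR(1), phi = 0.5
for t in range(1, 500):
    ar1[t] = 0.5 * ar1[t - 1] + rng.normal()

for name, x in (("random walk", walk), ("AR(1)", ar1)):
    stat, pval, *_ = adfuller(x, autolag="AIC")
    # Small p-value -> reject the unit-root null (series is stationary).
    print(f"{name}: ADF stat = {stat:.2f}, p-value = {pval:.3f}")
```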

Keywords: stationarity, unit root tests, economic time series, ADF tests

Procedia PDF Downloads 405
18400 Forecast Financial Bubbles: Multidimensional Phenomenon

Authors: Zouari Ezzeddine, Ghraieb Ikram

Abstract:

Drawing on results in the academic literature that point to the limitations of previous studies, this article shows why the prediction of financial bubbles is a multidimensional problem. A new framework for modeling and predicting financial bubbles is proposed, linking a set of variables presented across several dimensions, which dictates its multidimensional character and takes into account the preferences of financial actors. A multicriteria anticipation of the appearance of bubbles in international financial markets helps in the fight against a possible crisis.

Keywords: classical measures, predictions, financial bubbles, multidimensional, artificial neural networks

Procedia PDF Downloads 551
18399 Real-Time Implementation of Self-Tuning Fuzzy-PID Controller for First Order Plus Dead Time System Base on Microcontroller STM32

Authors: Maitree Thamma, Witchupong Wiboonjaroen, Thanat Suknuan, Karan Homchat

Abstract:

A first-order plus dead time (FOPDT) system is highly dynamic; therefore, its controller must be intelligent. This paper presents the development and implementation of a self-tuning fuzzy-PID controller for controlling an FOPDT system. The water level process used represents an FOPDT system, and the mathematical model of the system was approximated using the System Identification toolbox in Matlab. The control program and fuzzy-PID algorithm were developed in Matlab/Simulink and run on an STM32 microcontroller.
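
A minimal simulation-only sketch of the idea: an FOPDT plant under a PID loop whose proportional gain is scheduled by a fuzzy "error is big" membership. The plant parameters, gains, and breakpoints are illustrative, and the STM32/Simulink implementation is not reproduced.

```python
# Illustrative FOPDT water-level plant: G(s) = K * exp(-L*s) / (tau*s + 1)
K, tau, L, dt = 2.0, 50.0, 5.0, 0.5
delay = int(L / dt)

def fuzzy_kp(err):
    """Self-tuning sketch: blend a small and a large proportional gain
    through a fuzzy 'error is big' ramp membership (breakpoints are
    illustrative placeholders)."""
    big = min(1.0, max(0.0, (abs(err) - 0.2) / 0.8))
    return (1.0 - big) * 0.8 + big * 2.0

y = 0.0
u_buf = [0.0] * delay          # transport-delay buffer of L seconds
integ, prev_err, setpoint = 0.0, 0.0, 1.0
for _ in range(2000):
    err = setpoint - y
    integ += err * dt
    u = fuzzy_kp(err) * err + 0.05 * integ + 1.0 * (err - prev_err) / dt
    prev_err = err
    u_buf.append(u)
    y += dt * (K * u_buf.pop(0) - y) / tau   # first-order lag + dead time
print(f"water level after {2000 * dt:.0f} s: {y:.3f} (setpoint {setpoint})")
```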

Keywords: real-time control, self-tuning fuzzy-PID, FOPDT system, water level process

Procedia PDF Downloads 265
18398 Improving Part-Time Instructors’ Academic Outcomes with Gamification

Authors: Jared R. Chapman

Abstract:

This study introduces a type of motivational information system called an educational engagement information system (EEIS). An EEIS draws on principles of behavioral economics, motivation theory, and learning cognition theory to design information systems that help students want to improve their performance. This study compares academic outcomes for course sections taught by part- and full-time instructors both with and without an EEIS. Without an EEIS, students in the part-time instructor's course sections demonstrated significantly higher failure rates (a 143.8% increase) and dropout rates (a 110.4% increase) with significantly fewer students scoring a B- or higher (39.8% decrease) when compared to students in the course sections taught by a full-time instructor. It is concerning that students in the part-time instructor’s course without an EEIS had significantly lower academic outcomes, suggesting less understanding of the course content. This could impact retention and continuation in a major. With an EEIS, when comparing part- and full-time instructors, there was no significant difference in failure and dropout rates or in the number of students scoring a B- or higher in the course. In fact, with an EEIS, the failure and dropout rates were statistically identical for part- and full-time instructor courses. When using an EEIS (compared with not using an EEIS), the part-time instructor showed a 62.1% decrease in failures, a 61.4% decrease in dropouts, and a 41.7% increase in the number of students scoring a B- or higher in the course. We are unaware of other interventions that yield such large improvements in academic performance. This suggests that using an EEIS such as Delphinium may compensate for part-time instructors’ limitations of expertise, time, or rewards that can have a negative impact on students’ academic outcomes. The EEIS had only a minimal impact on failure rates (7.7% decrease) and dropout rates (18.8% decrease) for the full-time instructor. This suggests there is a ceiling effect for the improvements that an EEIS can make in student performance. This may be because experienced instructors are already doing the kinds of things that an EEIS does, such as motivating students, tracking grades, and providing feedback about progress. Additionally, full-time instructors have more time to dedicate to students outside of class than part-time instructors and more rewards for doing so. Using adjunct and other types of part-time instructors will likely remain a prevalent practice in higher education management courses. Given that using part-time instructors can have a negative impact on student graduation and persistence in a field of study, it is important to identify ways we can augment part-time instructors’ performance. We demonstrated that when part-time instructors use an EEIS, it can result in significantly lower students’ failure and dropout rates and an increase in the rate of students earning a B- or above; and bring their students’ performance to parity with the performance of students taught by a full-time instructor.

Keywords: gamification, engagement, motivation, academic outcomes

Procedia PDF Downloads 56
18397 A Detailed Experimental Study and Evaluation of Springback under Stretch Bending Process

Authors: A. Soualem

Abstract:

The design of multi-stage deep drawing processes requires the evaluation of many process parameters, such as the intermediate die geometry, the blank shape, the sheet thickness, the blank holder force, friction, lubrication, etc. These process parameters have to be determined for the optimum forming conditions before the process design. In general, sheet metal forming may involve stretching, drawing, or various combinations of these basic modes of deformation. It is important to determine the influence of the process variables in the design of sheet metal working processes. In particular, the punch and die corner radii in deep drawing affect formability. At the same time, the prediction of sheet metal springback after deep drawing is an important issue for the control of manufacturing processes. Nowadays, the importance of this problem is increasing because of the use of high-strength steel sheet as well as aluminum alloys. The aim of this paper is to give a better understanding of springback and its effect in various sheet metal forming processes, such as expansion and restrained deep drawing in the cup drawing process, by varying the die radius and lubricant for two commercially available materials, e.g., galvanized steel and aluminum sheet. To achieve these goals, experiments were carried out and compared with other results. The originality of our approach consists in tests performed by adapting a U-type stretching-bending device on a tensile testing machine, with which we studied and quantified the variation of the springback.

Keywords: springback, deep drawing, expansion, restricted deep drawing

Procedia PDF Downloads 437
18396 Performance Comparison of Space-Time Block and Trellis Codes under Rayleigh Channels

Authors: Jing Qingfeng, Wu Jiajia

Abstract:

Due to crowded orbits and the shortage of frequency resources, utilizing MIMO technology to improve spectrum efficiency and increase capacity has become a necessary trend in broadband satellite communication. We analyze the main influencing factors and compare the BER performance of the space-time block code (STBC) scheme and the space-time trellis code (STTC) scheme. This paper emphatically studies the bit error rate (BER) performance of STTC and STBC under a Rayleigh channel. The main emphasis is placed on the effects of factors such as the terminal environment and elevation angles on the BER performance of the STBC and STTC schemes. Simulation results indicate that the performance of STTC under a Rayleigh channel improves markedly as the number of transmitting and receiving antennas increases, but the number of encoder states has little impact on the performance. Under a Rayleigh channel, the performance of the Alamouti code is better than that of STTC.
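
The Alamouti scheme referred to above admits a compact Monte Carlo BER check; the sketch below simulates 2x1 Alamouti STBC with BPSK over flat Rayleigh fading, using the standard orthogonal combining at the receiver (perfect channel knowledge assumed).

```python
import numpy as np

rng = np.random.default_rng(9)

def alamouti_ber(snr_db, n_pairs=200_000):
    """BER of 2x1 Alamouti STBC with BPSK over flat Rayleigh fading."""
    s = 1.0 - 2.0 * rng.integers(0, 2, size=(n_pairs, 2))   # BPSK pairs (s1, s2)
    h = (rng.normal(size=(n_pairs, 2))
         + 1j * rng.normal(size=(n_pairs, 2))) / np.sqrt(2)  # Rayleigh taps
    sigma = np.sqrt(1.0 / (2.0 * 10 ** (snr_db / 10.0)))
    noise = sigma * (rng.normal(size=(n_pairs, 2))
                     + 1j * rng.normal(size=(n_pairs, 2)))
    # Two received samples per block (power split across two antennas):
    r1 = (h[:, 0] * s[:, 0] + h[:, 1] * s[:, 1]) / np.sqrt(2) + noise[:, 0]
    r2 = (-h[:, 0] * np.conj(s[:, 1])
          + h[:, 1] * np.conj(s[:, 0])) / np.sqrt(2) + noise[:, 1]
    # Orthogonal combining recovers each symbol scaled by |h1|^2 + |h2|^2:
    s1_hat = np.conj(h[:, 0]) * r1 + h[:, 1] * np.conj(r2)
    s2_hat = np.conj(h[:, 1]) * r1 - h[:, 0] * np.conj(r2)
    bits_hat = np.sign(np.real(np.stack([s1_hat, s2_hat], axis=1)))
    return np.mean(bits_hat != s)

for snr in (0, 5, 10, 15):
    print(f"SNR {snr:2d} dB: BER = {alamouti_ber(snr):.4e}")
```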

Keywords: MIMO, space time block code (STBC), space time trellis code (STTC), Rayleigh channel

Procedia PDF Downloads 329
18395 Application of the Global Optimization Techniques to the Optical Thin Film Design

Authors: D. Li

Abstract:

Optical thin films are used in a wide variety of optical components, and many software tools have been programmed for advancing multilayer thin film design. The available software packages for designing thin film structures may not provide optimum designs. Normally, almost all current software programs obtain their final designs either by optimizing a starting guess or by techniques, which may or may not involve a pseudorandom process, that give different answers every time, depending upon the initial conditions. With the increasing power of personal computers, functional methods for the optimization and synthesis of optical multilayer systems have been developed, such as DGL optimization, simulated annealing, genetic algorithms, needle optimization, inductive optimization, and flip-flop optimization. Among these, DGL optimization has proved its efficiency in optical thin film design. The application of the DGL optimization technique to the design of optical coatings is presented. A DGL optimization technique is provided, and its main features are discussed. Guidelines on the application of the DGL optimization technique to various types of design problems are given. The innovative global optimization strategies used in a software tool, OnlyFilm, to optimize multilayer thin film designs through different filter designs are outlined. OnlyFilm is a powerful, versatile, and user-friendly thin film software package, which combines optimization and synthesis design capabilities with powerful analytical tools for optical thin film designers. It is also the only thin film design software that offers a true global optimization function.
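
Whatever optimizer is used (DGL, simulated annealing, needle, ...), its merit function ultimately evaluates the spectral response of a candidate stack, typically via the characteristic-matrix method. A minimal normal-incidence, lossless sketch:

```python
import numpy as np

def reflectance(n_layers, d_layers, n_inc, n_sub, wavelengths):
    """Normal-incidence reflectance of a lossless multilayer stack using
    the standard characteristic-matrix method (indices n, thicknesses d
    in the same length units as the wavelengths)."""
    R = []
    for lam in wavelengths:
        M = np.eye(2, dtype=complex)
        for n, d in zip(n_layers, d_layers):
            delta = 2 * np.pi * n * d / lam      # layer phase thickness
            M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                              [1j * n * np.sin(delta), np.cos(delta)]])
        B, C = M @ np.array([1.0, n_sub])
        r = (n_inc * B - C) / (n_inc * B + C)
        R.append(abs(r) ** 2)
    return np.array(R)

# Example: quarter-wave MgF2 antireflection layer on glass at 550 nm
# (textbook indices; an optimizer would perturb the layer thicknesses and
# indices to minimize a merit function built on this evaluation).
lams = np.linspace(400, 700, 7)
R = reflectance([1.38], [550 / (4 * 1.38)], 1.0, 1.52, lams)
for lam, r in zip(lams, R):
    print(f"{lam:5.0f} nm: R = {100 * r:.2f}%")
```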

Keywords: optical coatings, optimization, design software, thin film design

Procedia PDF Downloads 294
18394 Optimization of a Convolutional Neural Network for the Automated Diagnosis of Melanoma

Authors: Kemka C. Ihemelandu, Chukwuemeka U. Ihemelandu

Abstract:

The incidence of melanoma has been increasing rapidly over the past two decades, making melanoma a current public health crisis. Unfortunately, even as screening efforts continue to expand in an effort to ameliorate the death rate from melanoma, there is a need to improve diagnostic accuracy and decrease misdiagnosis. Artificial intelligence (AI), a new frontier in patient care, has the ability to improve the accuracy of melanoma diagnosis. The convolutional neural network (CNN), a form of deep neural network most commonly applied to analyzing visual imagery, has been shown to outperform the human brain in pattern recognition. However, there are noted limitations in the accuracy of CNN models. Our aim in this study was the optimization of convolutional neural network algorithms for the automated diagnosis of melanoma. We hypothesized that optimal selection of the momentum and batch hyperparameters increases model accuracy. Our most successful model developed during this study showed that optimal selection of a momentum of 0.25 and a batch size of 2 led to superior performance and a faster model training time, with an accuracy of ~83% after nine hours of training. We did notice a lack of diversity in the dataset used, with a noted class imbalance favoring lighter versus darker skin tones. Training set image transformations did not result in superior model performance in our study.
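
A minimal Keras sketch of the training configuration studied (SGD momentum 0.25, batch size 2) is shown below; the architecture, image size, and dataset path are illustrative assumptions, not the paper's model.

```python
import tensorflow as tf

# Illustrative binary CNN for dermoscopic images; only the optimizer
# hyperparameters (momentum 0.25, batch size 2) come from the abstract.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # benign vs. melanoma
])

model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.25),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# Hypothetical dataset directory, split elsewhere:
# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "melanoma_images/train", image_size=(128, 128), batch_size=2)
# model.fit(train_ds, epochs=30)
model.summary()
```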

Keywords: melanoma, convolutional neural network, momentum, batch hyperparameter

Procedia PDF Downloads 88
18393 Biomechanical Prediction of Veins and Soft Tissues beneath Compression Stockings Using Fluid-Solid Interaction Model

Authors: Chongyang Ye, Rong Liu

Abstract:

Elastic compression stockings (ECSs) have been widely applied in the prophylaxis and treatment of chronic venous insufficiency of the lower extremities. The medical function of an ECS is to improve venous return and increase the muscular pumping action that facilitates blood circulation, which is largely determined by the complex interaction between the ECS and lower limb tissues. Understanding the mechanical transmission of the ECS along the skin surface, deeper tissues, and the vascular system is essential to assess the effectiveness of ECSs. In this study, a three-dimensional (3D) finite element (FE) model of the leg-ECS system, integrated with a 3D fluid-solid interaction (FSI) model of the leg-vein system, was constructed to analyze the biomechanical properties of veins and soft tissues under different ECS compressions. Magnetic resonance imaging (MRI) of the human leg was segmented into three regions: soft tissues, bones (tibia and fibula), and veins (peroneal vein, great saphenous vein, and small saphenous vein). ECSs with pressures ranging from 15 to 26 mmHg (Classes I and II) were adopted in the developed FE-FSI model. The soft tissue was assumed to follow a Neo-Hookean hyperelastic model with the bones fixed, and the ECSs were regarded as an orthotropic elastic shell. The interfacial pressure and stress transmission were simulated by the FE model, and the venous hemodynamic properties were simulated by the FSI model. Experimental validation indicated that the simulated interfacial pressure distributions were in accordance with the pressure measurements. The developed model can be used to predict the interfacial pressure, stress transmission, and venous hemodynamics exerted by ECSs and to optimize the structure and material properties of ECS designs, thus improving the efficiency of compression therapy.

Keywords: elastic compression stockings, fluid-solid interaction, tissue and vein properties, prediction

Procedia PDF Downloads 95
18392 Applying a Noise Reduction Method to Reveal Chaos in the River Flow Time Series

Authors: Mohammad H. Fattahi

Abstract:

Chaotic analysis was performed on river flow time series before and after applying wavelet-based de-noising techniques in order to investigate the effect of noise content on the chaotic nature of the flow series. In this study, 38 years of monthly runoff data from three gauging stations were used. The gauging stations were located in the Ghar-e-Aghaj river basin, Fars province, Iran. The noise level of the time series was estimated with the aid of a Gaussian kernel algorithm. This step was found to be crucial in preventing the removal of vital information, such as memory, correlation, and trend, along with the noise during the de-noising process.
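
A typical wavelet de-noising pass of the kind described (decomposition, soft-thresholding of detail coefficients with a noise level estimated from the finest scale, and reconstruction) can be sketched with PyWavelets; the series, wavelet, and decomposition level below are illustrative.

```python
import numpy as np
import pywt

rng = np.random.default_rng(13)
# Synthetic stand-in for 38 years of monthly runoff (456 points): a
# seasonal signal plus Gaussian noise.
t = np.arange(456)
clean = 50 + 30 * np.sin(2 * np.pi * t / 12)
noisy = clean + rng.normal(0, 8, t.size)

coeffs = pywt.wavedec(noisy, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # MAD noise estimate
thresh = sigma * np.sqrt(2 * np.log(noisy.size))      # universal threshold
coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: noisy.size]

print(f"noise std before: {np.std(noisy - clean):.2f}, "
      f"after: {np.std(denoised - clean):.2f}")
```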

Keywords: chaotic behavior, wavelet, noise reduction, river flow

Procedia PDF Downloads 452
18391 A Theoretical Study of Phase Change Material Layered Roofs under Specific Climatic Regions in Turkey and the United Kingdom

Authors: Tugba Gurler, Irfan Kurtbas

Abstract:

The roof considerably influences the energy demand of buildings. In order to reduce this energy demand, various solutions have been proposed, such as roofs with variable thermal insulation, cool roofs, green roofs, heat exchangers and ventilated roofs, and phase change material (PCM) layered roofs. PCMs suffer from relatively low thermal conductivity despite their promise for energy-efficiency initiatives in thermal energy storage (TES). This study not only presents the thermal performance of a concrete roof with PCM layers but also evaluates products with different design configurations and thicknesses under the weather conditions of the Central Anatolia Region, Turkey, and Nottinghamshire, UK. System design limitations and proposed prediction models are discussed. A two-dimensional numerical model was developed, and the governing equations were solved at each time step. The upper surfaces of the roofs were modeled with heat flux conditions and the lower surfaces with appropriate boundary conditions; in addition, suitable roofs were modeled under symmetry boundary conditions. The results for the designed concrete roofs with PCM layers were compared with those of common concrete roofs in Turkey and the UK, and the numerical modeling results were validated against data given in the literature.

Keywords: phase change material, regional energy demand, roof layers, thermal energy storage

Procedia PDF Downloads 84
18390 Solid Particles Transport and Deposition Prediction in a Turbulent Impinging Jet Using the Lattice Boltzmann Method and a Probabilistic Model on GPU

Authors: Ali Abdul Kadhim, Fue Lien

Abstract:

Solid particle distribution on an impingement surface has been simulated utilizing a graphics processing unit (GPU). An in-house computational fluid dynamics (CFD) code has been developed to investigate a 3D turbulent impinging jet using the lattice Boltzmann method (LBM) in conjunction with large eddy simulation (LES) and the multiple relaxation time (MRT) model. This paper proposes an improvement to the LBM-cellular automata (LBM-CA) probabilistic method. In the current model, the fluid flow utilizes the D3Q19 lattice, while the particle model employs the D3Q27 lattice. The particle numbers are defined at the same regular LBM nodes, and the transport of particles from one node to its neighboring nodes is determined in accordance with the particle bulk density and velocity, considering all external forces. Previous models distributed particles at each time step without considering the local velocity and the number of particles at each node. The present model overcomes these deficiencies and can therefore better capture the dynamic interaction between particles and the surrounding turbulent flow field. Despite the increasing popularity of the LBM-MRT-CA model in simulating complex multiphase fluid flows, this approach is still expensive in terms of the memory size and computational time required to perform 3D simulations. To improve the throughput of each simulation, a single GeForce GTX TITAN X GPU is used in the present work. The CUDA parallel programming platform and the cuRAND library are utilized to form an efficient LBM-CA algorithm. The methodology was first validated against a benchmark test case involving particle deposition on a square cylinder confined in a duct. The flow was unsteady and laminar at Re=200 (Re is the Reynolds number), and simulations were conducted for different Stokes numbers. The present LBM solutions agree well with other results available in the open literature. The GPU code was then used to simulate particle transport and deposition in a turbulent impinging jet at Re=10,000. The simulations were conducted for L/D=2, 4, and 6, where L is the nozzle-to-surface distance and D is the jet diameter. The effect of changing the Stokes number on the particle deposition profile was studied at different L/D ratios. For comparative purposes, another in-house serial CPU code was also developed, coupling the LBM with a classical Lagrangian particle dispersion model. The agreement between the results obtained with the LBM-CA and LBM-Lagrangian models and the experimental data is generally good. The present GPU approach achieves a speedup ratio of about 350 against the serial code running on a single CPU.
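
The CA transport step can be illustrated in a deliberately reduced form: the toy 2D sketch below hops integer particle counts to axis neighbors with probabilities set by the local velocity while conserving the total count. It is only a schematic of the redistribution idea, not the paper's D3Q27 GPU model with external forces.

```python
import numpy as np

rng = np.random.default_rng(21)

# Toy 2D illustration: particle counts stored at lattice nodes hop to the
# four axis neighbors with probabilities derived from a (frozen) velocity
# field; a min() guard prevents oversubscribing a node's population.
N = 32
particles = rng.poisson(5, size=(N, N))
ux = np.full((N, N), 0.3)    # illustrative velocity components
uy = np.full((N, N), 0.1)

dirs = [(1, 0, lambda: np.maximum(ux, 0)), (-1, 0, lambda: np.maximum(-ux, 0)),
        (0, 1, lambda: np.maximum(uy, 0)), (0, -1, lambda: np.maximum(-uy, 0))]

for step in range(10):
    moved = np.zeros_like(particles)
    stay = particles.copy()
    for dx, dy, pfun in dirs:
        p = np.clip(pfun(), 0, 1)             # hop probability per particle
        hop = rng.binomial(particles, p)      # stochastic redistribution
        hop = np.minimum(hop, stay)           # cannot move more than remain
        stay -= hop
        moved += np.roll(hop, shift=(dx, dy), axis=(0, 1))
    particles = stay + moved

print("total particles conserved:", particles.sum())
```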

Keywords: CUDA, GPU parallel programming, LES, lattice Boltzmann method, MRT, multi-phase flow, probabilistic model

Procedia PDF Downloads 184