Search results for: random disturbance

1994 Internal Node Stabilization for Voltage Sense Amplifiers in Multi-Channel Systems

Authors: Sanghoon Park, Ki-Jin Kim, Kwang-Ho Ahn

Abstract:

This paper discusses the undesirable charge transfer caused by the parasitic capacitances of the input transistors in a voltage sense amplifier. Because of the amplifier's intrinsic rail-to-rail voltage transitions, the input sides are inevitably disturbed, which can disturb the stability of the reference voltage levels. The problem becomes more serious in multi-channel systems, where the disturbance propagates to the other channels and degrades the linearity of the system. In order to alleviate the internal node voltage transition, an internal node stabilization technique is proposed that utilizes an additional biasing circuit. It achieves 47% and 43% improvements in node stabilization and input-referred disturbance, respectively.

Keywords: voltage sense amplifier, voltage transition, node stabilization, biasing circuits

Procedia PDF Downloads 461
1993 Acoustic Induced Vibration Response Analysis of Honeycomb Panel

Authors: Po-Yuan Tung, Jen-Chueh Kuo, Chia-Ray Chen, Chien-Hsing Li, Kuo-Liang Pan

Abstract:

The main-body structure of a satellite is constructed mainly from lightweight materials, yet it must withstand substantial vibration loads during launch. Because conditions in space can vary in many ways, studying the random vibration response of the satellite structure is extremely important. Based on the reciprocity relationship between sound and structural response, this paper evaluates the dynamic response of the satellite main body under random acoustic load excitation. The paper examines the technical process and verifies the feasibility of acoustically induced vibration analysis. A simple plate exposed to a uniform acoustic field is used to obtain important parameters and to validate the acoustic field model of the reverberation chamber. Both the structural and acoustic field chamber models are then imported into vibro-acoustic coupling analysis software to predict the structural response. During the modeling process, experimental verification is performed to ensure the quality of the numerical models. Finally, the surface vibration level is calculated through the modal participation factors, and the analysis results are presented as PSD spectra.
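
As a toy illustration of the random-vibration relation behind the PSD results above, the sketch below computes the response PSD of a single panel mode under broadband acoustic loading via S_yy(f) = |H(f)|² S_ff(f); the modal mass, damping, stiffness, and load level are assumed values, not the satellite panel's.

```python
import numpy as np

# Single-mode random-vibration sketch: response PSD = |FRF|^2 * load PSD.
m, c, k = 1.2, 8.0, 4.0e5          # assumed modal mass, damping, stiffness
f = np.linspace(1.0, 500.0, 2000)  # frequency axis [Hz]
w = 2.0 * np.pi * f

H = 1.0 / (k - m * w**2 + 1j * c * w)   # receptance FRF of the mode
S_ff = np.full_like(f, 100.0)           # flat acoustic load PSD [N^2/Hz]
S_yy = np.abs(H)**2 * S_ff              # displacement response PSD

df = f[1] - f[0]
rms = np.sqrt(np.sum(S_yy) * df)        # RMS response = area under the PSD
print(f"peak near {f[np.argmax(S_yy)]:.1f} Hz, rms = {rms:.3e} m")
```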

Keywords: vibration, acoustic, modal, honeycomb panel

Procedia PDF Downloads 545
1992 A Multigrid Approach for Three-Dimensional Inverse Heat Conduction Problems

Authors: Jianhua Zhou, Yuwen Zhang

Abstract:

A two-step multigrid approach is proposed to solve the inverse heat conduction problem in a 3-D object under laser irradiation. In the first step, the location of the laser center is estimated using a coarse and uniform grid system. In the second step, the front-surface temperature is recovered with good accuracy using a multiple grid system in which a fine mesh is used at the laser spot center to capture the drastic temperature rise in this region, while a coarse mesh is employed in the peripheral region to reduce the total number of sensors required. The effectiveness of the two-step approach and the multiple grid system is demonstrated by illustrative inverse solutions. If the measurement data for the temperature and heat flux on the back surface do not contain random error, the proposed multigrid approach yields more accurate inverse solutions. When the back-surface measurement data contain random noise, accurate inverse solutions cannot be obtained if both temperature and heat flux are measured on the back surface.
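
The conjugated gradient route named in the keywords can be illustrated on a toy linear inverse problem; the sketch below (assumed forward operator, not the 3-D laser model) recovers a sharp flux profile from smoothed, noisy data with conjugate gradient iterations on the normal equations, where early stopping plays the regularizing role.

```python
import numpy as np

# Toy linear inverse problem: recover a sharp heat-flux profile q from
# blurred, noisy readings T = A q using conjugate gradient (CG) on the
# normal equations; early stopping of CG acts as the regularizer.
rng = np.random.default_rng(0)
n = 100
x = np.linspace(0.0, 1.0, n)

# Assumed forward operator: diffusion blurs the flux (Gaussian kernel rows).
A = np.exp(-(x[:, None] - x[None, :])**2 / (2 * 0.05**2))
A /= A.sum(axis=1, keepdims=True)

q_true = np.exp(-(x - 0.5)**2 / (2 * 0.02**2))        # sharp laser-spot flux
T_meas = A @ q_true + 0.001 * rng.standard_normal(n)  # noisy back-surface data

q = np.zeros(n)
r = A.T @ (T_meas - A @ q)     # residual of the normal equations
p = r.copy()
for _ in range(30):            # iteration count = regularization knob
    Ap = A.T @ (A @ p)
    alpha = (r @ r) / (p @ Ap)
    q += alpha * p
    r_new = r - alpha * Ap
    p = r_new + ((r_new @ r_new) / (r @ r)) * p
    r = r_new

print("relative error:", np.linalg.norm(q - q_true) / np.linalg.norm(q_true))
```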

Keywords: conduction, inverse problems, conjugated gradient method, laser

Procedia PDF Downloads 351
1991 The Effect of Institutions on Economic Growth: An Analysis Based on Bayesian Panel Data Estimation

Authors: Mohammad Anwar, Shah Waliullah

Abstract:

This study investigated panel data regression models, using Bayesian and classical methods to study the impact of institutions on economic growth with data from 1990 to 2014, especially in developing countries. Under both the classical and Bayesian methodologies, two panel data models were estimated: the common-effects and fixed-effects models. For the Bayesian approach, prior information is used, with a normal-gamma prior adopted for the panel data models. The analysis was carried out with the WinBUGS14 software. The estimated results showed that panel data models are valid models under the Bayesian methodology. In the Bayesian approach, all independent variables had positive and significant effects on the dependent variable. Based on the standard errors of all models, the fixed-effects model is the best model in the Bayesian estimation of panel data models; it also had the lowest standard error compared to the other models.
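
The paper's estimation was done in WinBUGS14; as a self-contained, hedged analogue, the sketch below estimates a fixed-effects panel regression on synthetic data with a conjugate normal-gamma prior, for which the posterior is available in closed form (all dimensions and prior settings are assumptions).

```python
import numpy as np

rng = np.random.default_rng(1)
n_units, n_periods, k = 50, 25, 2            # assumed panel dimensions

X = rng.standard_normal((n_units, n_periods, k))
alpha = rng.standard_normal(n_units)          # unit-specific fixed effects
beta_true = np.array([0.8, -0.3])
y = X @ beta_true + alpha[:, None] + 0.5 * rng.standard_normal((n_units, n_periods))

# Within transformation sweeps out the fixed effects.
Xw = (X - X.mean(axis=1, keepdims=True)).reshape(-1, k)
yw = (y - y.mean(axis=1, keepdims=True)).ravel()

# Normal-gamma prior: beta | tau ~ N(b0, (tau L0)^-1), tau ~ Gamma(a0, r0);
# the posterior is conjugate and available in closed form.
b0, L0, a0, r0 = np.zeros(k), np.eye(k), 2.0, 1.0

Ln = L0 + Xw.T @ Xw
bn = np.linalg.solve(Ln, L0 @ b0 + Xw.T @ yw)   # posterior mean of beta
an = a0 + 0.5 * len(yw)
rn = r0 + 0.5 * (yw @ yw + b0 @ L0 @ b0 - bn @ Ln @ bn)

print("posterior mean of beta:", bn)
print("posterior mean of error precision tau:", an / rn)
```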

Keywords: Bayesian approach, common effect, fixed effect, random effect, dynamic random effect model

Procedia PDF Downloads 58
1990 A Machine Learning Approach for Intelligent Transportation System Management on Urban Roads

Authors: Ashish Dhamaniya, Vineet Jain, Rajesh Chouhan

Abstract:

Traffic management is a major issue on urban roads in almost all metropolitan cities in India. Speed is one of the critical traffic parameters for effective Intelligent Transportation System (ITS) implementation, as it determines the arrival rate of vehicles at intersections, which are the main points of congestion. The study aimed to leverage Machine Learning (ML) models to produce precise predictions of speed on urban roadway links. The research objective was to assess how categorized traffic volume and road width, serving as predictor variables, influence speed prediction. Four tree-based regression models, namely Decision Tree (DT), Random Forest (RF), Extra Tree (ET), and Extreme Gradient Boost (XGB), are employed for this purpose. The models' performance was validated using test data, and the results demonstrate that Random Forest surpasses the other machine learning techniques and a conventional utility-theory-based model in speed prediction. The study is useful for managing urban roadway network performance under mixed traffic conditions and for effective implementation of ITS.
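
A minimal sketch of the modeling setup on synthetic data (the study's traffic dataset is not public); scikit-learn's GradientBoostingRegressor stands in for XGBoost so the example stays dependency-free:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import (ExtraTreesRegressor, GradientBoostingRegressor,
                              RandomForestRegressor)
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
n = 2000
volume = rng.integers(1, 6, n)             # categorized traffic volume (1-5)
width = rng.uniform(5.5, 14.0, n)          # road width [m]
speed = 60 - 6 * volume + 1.5 * width + rng.normal(0, 3, n)  # synthetic speed

X = np.column_stack([volume, width])
X_tr, X_te, y_tr, y_te = train_test_split(X, speed, random_state=0)

models = {
    "DT": DecisionTreeRegressor(max_depth=6, random_state=0),
    "RF": RandomForestRegressor(n_estimators=200, random_state=0),
    "ET": ExtraTreesRegressor(n_estimators=200, random_state=0),
    "GB (XGB stand-in)": GradientBoostingRegressor(random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: R^2 = {r2_score(y_te, model.predict(X_te)):.3f}")
```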

Keywords: stream speed, urban roads, machine learning, traffic flow

Procedia PDF Downloads 43
1989 Application of GeoGebra into Teaching and Learning of Linear and Quadratic Equations amongst Senior Secondary School Students in Fagge Local Government Area of Kano State, Nigeria

Authors: Musa Auwal Mamman, S. G. Isa

Abstract:

This study was carried out to investigate the effectiveness of the GeoGebra software in the teaching and learning of linear and quadratic equations amongst senior secondary school students in Fagge Local Government Area, Kano State, Nigeria. Five research items were formulated as objectives, research questions, and hypotheses. A random sampling method was used to select 398 students from a population of 2,098 SS2 students. The experimental group was taught using the GeoGebra software, while the control group was taught using the conventional teaching method. The instrument used for the study was the Mathematics Performance Test (MPT), which was administered at the beginning and at the end of the study. The data obtained were analyzed using the t-test. The results revealed that students taught with the GeoGebra software (experimental group) performed better than students taught with the traditional teaching method.
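
A minimal sketch of the study's t-test analysis on hypothetical MPT scores (the group means, spreads, and sizes below are assumptions, not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
geogebra = rng.normal(68, 10, 199)       # assumed experimental-group scores
conventional = rng.normal(58, 10, 199)   # assumed control-group scores

t_stat, p_value = stats.ttest_ind(geogebra, conventional)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```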

Keywords: GeoGebra Software, mathematics performance, random sampling, mathematics teaching

Procedia PDF Downloads 233
1988 Progressive Type-I Interval Censoring with Binomial Removal-Estimation and Its Properties

Authors: Sonal Budhiraja, Biswabrata Pradhan

Abstract:

This work considers statistical inference based on progressive Type-I interval censored data with random removal. The scheme of progressive Type-I interval censoring with random removal can be described as follows. Suppose n identical items are placed on test at time T0 = 0, with k pre-fixed inspections at pre-specified times T1 < T2 < . . . < Tk, where Tk is the scheduled termination time of the experiment. At inspection time Ti, Ri of the Si remaining surviving units are randomly removed from the experiment. The removal follows a binomial distribution with parameters Si and pi for i = 1, . . . , k, with pk = 1. In this censoring scheme, the number of failures in the different inspection intervals and the number of randomly removed items at the pre-specified inspection times are observed. Asymptotic properties of the maximum likelihood estimators (MLEs) are established under some regularity conditions. A β-content γ-level tolerance interval (TI) is determined for the two-parameter Weibull lifetime model using the asymptotic properties of the MLEs. The minimum sample size required to achieve the desired β-content γ-level TI is determined. The performance of the MLEs and the TI is studied via simulation.
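
The censoring scheme described above is straightforward to simulate; the following sketch draws Weibull lifetimes and applies binomial removals at the inspection times (all parameter values are assumed):

```python
import numpy as np

# Simulate progressive Type-I interval censoring with binomial removal.
rng = np.random.default_rng(3)
n, shape, scale = 100, 1.5, 10.0          # items on test; Weibull(shape, scale)
T = np.array([2.0, 4.0, 6.0, 8.0])        # inspection times T1 < ... < Tk
p = np.array([0.1, 0.1, 0.1, 1.0])        # removal probabilities, with pk = 1

alive = scale * rng.weibull(shape, n)     # latent lifetimes of items on test
t_prev = 0.0
for Ti, pi in zip(T, p):
    d = np.sum((alive > t_prev) & (alive <= Ti))  # failures in (T_{i-1}, T_i]
    alive = alive[alive > Ti]                     # survivors S_i at time T_i
    Ri = rng.binomial(len(alive), pi)             # binomial number removed
    alive = alive[rng.permutation(len(alive))[Ri:]]
    print(f"T={Ti}: failures={d}, removed={Ri}, remaining={len(alive)}")
    t_prev = Ti
```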

Keywords: asymptotic normality, consistency, regularity conditions, simulation study, tolerance interval

Procedia PDF Downloads 229
1987 Random Variation of Treated Volumes in Fractionated 2D Image Based HDR Brachytherapy for Cervical Cancer

Authors: R. Tudugala, B. M. A. I. Balasooriya, W. M. Ediri Arachchi, R. W. M. W. K. Rathnayake, T. D. Premaratna

Abstract:

Brachytherapy involves placing a source of radiation near the cancer site and gives a promising prognosis for cervical cancer treatment. The purpose of this study was to evaluate the random variation of treated volumes between fractions in 2D image based fractionated high dose rate brachytherapy for cervical cancer at the National Cancer Institute Maharagama (NCIM), Sri Lanka. Dose plans were analyzed for 150 cervical cancer patients treated with orthogonal-radiograph (2D) based brachytherapy. The ICRU treated volumes were modeled by translating the applicators with the help of the Multisource HDR plus software. The difference in treated volumes with respect to the applicator geometry was analyzed using the SPSS 18 software, to derive patient-population-based estimates of delivered treated volumes relative to ideally treated volumes. Packing was evaluated according to bladder dose, rectum dose, and the geometry of the dose distribution by three consultant radiation oncologists. The difference in treated volumes depends on the type of applicator used in fractionated brachytherapy. The mean Difference of Treated Volume (DTV) was -0.48 cm³ (X̄₁) for the evenly activated tandem length (ET) group and 11.85 cm³ (X̄₂) for the unevenly activated tandem length (UET) group. The range of the DTV was 35.80 cm³ for the ET group, whereas it was 104.80 cm³ for the UET group. A one-sample t-test was performed to compare the DTV with the ideal treated volume difference (0.00 cm³); the p-value was 0.732 for the ET group and 0.00 for the UET group. Moreover, an independent two-sample t-test was performed to compare the ET and UET groups, and the calculated p-value was 0.005. Packing was evaluated under three categories: 59.38% of treatments used a convenient packing technique, 33.33% a fairly convenient packing technique, and 7.29% a not convenient packing technique. The random variation of treated volume in the ET group is much lower than in the UET group, and there is a significant difference (p<0.05) between the ET and UET groups, which affects the dose distribution of the treatment. Furthermore, it can be concluded that nearly 92.71% of patients' packing used an acceptable packing technique at NCIM, Sri Lanka.

Keywords: brachytherapy, cervical cancer, high dose rate, tandem, treated volumes

Procedia PDF Downloads 184
1986 Inference for Compound Truncated Poisson Lognormal Model with Application to Maximum Precipitation Data

Authors: M. Z. Raqab, Debasis Kundu, M. A. Meraou

Abstract:

In this paper, we analyze maximum precipitation data recorded during a particular period of time at different stations in the Global Historical Climatology Network of the USA. One important point to mention is that some stations are shut down on certain days for one reason or another, and the maximum values are recorded by excluding those readings. It is assumed that the number of stations that operate follows a zero-truncated Poisson random variable and that the daily precipitation follows a lognormal random variable; we call this the compound truncated Poisson lognormal model. The proposed model has three unknown parameters and can take a variety of shapes. The maximum likelihood estimators can be obtained quite conveniently using the Expectation-Maximization (EM) algorithm, and approximate maximum likelihood estimators are also derived. The associated confidence intervals can be obtained from the observed Fisher information matrix. Simulations have been performed to check the performance of the EM algorithm, and it is observed that the EM algorithm works quite well in this case. When we analyze the precipitation data set using the proposed model, we observe that it provides a better fit than some of the existing models.
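
The generative model is easy to simulate; the sketch below draws the number of operating stations from a zero-truncated Poisson and records the maximum of that many lognormal precipitation values (parameter values are assumed):

```python
import numpy as np

# Simulate the compound truncated Poisson lognormal model: N operating
# stations ~ zero-truncated Poisson(lam); the recorded value is the max
# of N lognormal precipitation draws.
rng = np.random.default_rng(5)
lam, mu, sigma, n_days = 4.0, 1.0, 0.5, 10000

def ztpoisson(lam, size, rng):
    """Zero-truncated Poisson via rejection of zeros."""
    out = rng.poisson(lam, size)
    while np.any(out == 0):
        zeros = out == 0
        out[zeros] = rng.poisson(lam, zeros.sum())
    return out

N = ztpoisson(lam, n_days, rng)
maxima = np.array([rng.lognormal(mu, sigma, k).max() for k in N])
print(f"sample mean = {maxima.mean():.3f}, sample max = {maxima.max():.3f}")
```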

Keywords: compound Poisson lognormal distribution, EM algorithm, maximum likelihood estimation, approximate maximum likelihood estimation, Fisher information, skew distribution

Procedia PDF Downloads 95
1985 Churn Prediction for Savings Bank Customers: A Machine Learning Approach

Authors: Prashant Verma

Abstract:

Commercial banks are facing immense pressure, including financial disintermediation, interest rate volatility, and digital modes of finance. Retaining an existing customer is 5 to 25 times less expensive than acquiring a new one. This paper explores customer churn prediction based on various statistical and machine learning models and uses under-sampling to improve the predictive power of these models. The results show that, among the various machine learning models, Random Forest, which predicts churn with 78% accuracy, is the most powerful model for this scenario. Customer vintage, customer age, average balance, occupation code, population code, average withdrawal amount, and average number of transactions were found to be the variables with high predictive power for the churn prediction model. The model can be deployed by commercial banks to prevent customer churn so that they may retain the funds kept by savings bank (SB) customers. The article suggests a customized campaign to be initiated by commercial banks to avoid SB customer churn. By providing better customer satisfaction and experience, commercial banks can limit customer churn and maintain their deposits.
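
A hedged sketch of the under-sampling plus Random Forest pipeline on synthetic data (the feature list follows the abstract, but the values, effect sizes, and churn rate are assumptions):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(11)
n = 20000
X = np.column_stack([
    rng.integers(0, 40, n),       # customer vintage [years]
    rng.integers(18, 90, n),      # age
    rng.lognormal(8, 1, n),       # average balance
    rng.integers(1, 30, n),       # average number of transactions
])
logit = -2.5 + 0.02 * (X[:, 1] - 50) - 0.08 * (X[:, 3] - 15)  # assumed effects
churn = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, churn, stratify=churn,
                                          random_state=0)

# Random under-sampling: keep all churners plus an equal number of
# randomly chosen non-churners, then fit the Random Forest.
pos = np.flatnonzero(y_tr == 1)
neg = rng.choice(np.flatnonzero(y_tr == 0), size=len(pos), replace=False)
idx = np.concatenate([pos, neg])
clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(X_tr[idx], y_tr[idx])

print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
print("ROC-AUC :", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```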

Keywords: savings bank, customer churn, customer retention, random forests, machine learning, under-sampling

Procedia PDF Downloads 124
1984 Simulation of an Active Controlled Vibration Isolation System for Astronaut’s Exercise Platform

Authors: Shield B. Lin, Sameer Abdali

Abstract:

Computer simulations were performed in MATLAB/Simulink for a vibration isolation system for an astronaut's exercise platform. Simulation parameters were initially based on an ongoing experiment in a laboratory at NASA Johnson Space Center; later simulations were expanded to include other parameters. A discrete proportional-integral-derivative controller with a low-pass filter, commanding a linear actuator, serves as the active control unit, pushing and pulling a counterweight to balance the disturbance forces. A spring-damper device is used as an optional passive control unit. Simulation results indicated that such a design could achieve near-complete vibration isolation with small displacements of the exercise platform.
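
A simplified sketch of the control loop described above, with a discrete PID controller and a first-order low-pass filter driving a counterweight force against a sinusoidal exercise disturbance (the mass, gains, and cutoff below are assumptions, not the NASA test-bed parameters):

```python
import numpy as np

dt, T = 0.001, 10.0
m = 100.0                                   # assumed platform mass [kg]
kp, ki, kd = 4000.0, 800.0, 400.0           # illustrative PID gains
fc = 20.0                                   # low-pass cutoff [Hz]
alpha = dt / (dt + 1.0 / (2 * np.pi * fc))  # first-order filter coefficient

x = v = integ = prev_err = u_filt = 0.0
for k in range(int(T / dt)):
    t = k * dt
    d = 50.0 * np.sin(2 * np.pi * 1.5 * t)   # exercise disturbance force [N]
    err = 0.0 - x                            # regulate the platform to rest
    integ += err * dt
    deriv = (err - prev_err) / dt
    u = kp * err + ki * integ + kd * deriv   # raw PID command
    u_filt += alpha * (u - u_filt)           # low-pass filtered actuator force
    v += dt * (d + u_filt) / m               # platform dynamics
    x += dt * v
    prev_err = err

print(f"final displacement: {x:.2e} m")
```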

Keywords: control, counterweight, isolation, vibration

Procedia PDF Downloads 134
1983 Design of Permanent Sensor Fault Tolerance Algorithms by Sliding Mode Observer for Smart Hybrid Powerpack

Authors: Sungsik Jo, Hyeonwoo Kim, Iksu Choi, Hunmo Kim

Abstract:

In the SHP, an LVDT sensor detects the length changes of the EHA output, and the thrust of the EHA is controlled via the pressure sensor. A sensor can suffer a hardware fault due to an internal problem or an external disturbance, and the EHA of the SHP can become uncontrollable when it is controlled by feedback from such uncertain information. In this paper, a sliding mode observer algorithm estimates the original sensor output in the presence of a permanent sensor fault. The proposed algorithm recovers from disconnection and short-circuit faults and also detects various sensor fault modes.
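
A conceptual sketch of the fault-tolerance idea on an assumed first-order EHA model (not the SHP dynamics): a sliding mode observer tracks the LVDT signal, and a residual-based fault flag switches the feedback to the observer estimate when the sensor disconnects.

```python
import numpy as np

dt, T = 0.001, 4.0
a, b = 2.0, 1.5              # assumed plant: x' = -a*x + b*u
L, rho = 5.0, 3.0            # observer gains (linear + switching term)
fault_at, threshold = 2.0, 0.3

x = x_hat = 0.0
for k in range(int(T / dt)):
    t = k * dt
    u = 1.0 if t < 2.5 else -0.5               # test input command
    x += dt * (-a * x + b * u)                 # true plant state
    y = 0.0 if t >= fault_at else x            # LVDT output; disconnection -> 0

    residual = y - x_hat
    faulty = abs(residual) > threshold         # crude fault detection
    if not faulty:                             # SMO: model + switching injection
        x_hat += dt * (-a * x_hat + b * u + L * residual + rho * np.sign(residual))
    else:                                      # fault: run the observer open-loop
        x_hat += dt * (-a * x_hat + b * u)

    feedback = x_hat if faulty else y          # fault-tolerant feedback signal

print(f"true x = {x:.3f}, estimate = {x_hat:.3f}")
```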

Keywords: smart hybrid powerpack (SHP), electro hydraulic actuator (EHA), permanent sensor fault tolerance, sliding mode observer (SMO), graphic user interface (GUI)

Procedia PDF Downloads 534
1982 Understanding the Thermal Transformation of Random Access Memory Cards: A Pathway to Their Efficient Recycling

Authors: Khushalini N. Ulman, Samane Maroufi, Veena H. Sahajwalla

Abstract:

Globally, electronic waste (e-waste) continues to grow at an alarming rate. Several technologies have been developed to recover valuable materials from e-waste; however, their efficiency can be increased with better knowledge of the e-waste components. Random access memory cards (RAMs) are considered high-value scrap by e-waste recyclers. Despite their high precious metal content, RAMs are still recycled in a conventional manner, resulting in a huge loss of resources. Our research work highlights the precious-metal-rich components of a RAM. Inductively coupled plasma (ICP) analyses of RAMs from six different generations have been carried out, and the trends in their metal content have been investigated. Over the past decade, the copper content of RAMs has halved and their tin content has increased by 70%. Stricter environmental laws have facilitated a ~96% drop in the lead content of RAMs. To establish the fundamentals of the thermal transformation of RAMs, our research provides a detailed kinetic study. This can assist e-waste recyclers in optimizing their metal recovery processes. Thus, understanding the chemical and thermal behavior of RAMs can open new avenues for efficient e-waste recycling.

Keywords: electronic waste, kinetic study, recycling, thermal transformation

Procedia PDF Downloads 128
1981 Designing Stochastic Non-Invasively Applied DC Pulses to Suppress Tremors in Multiple Sclerosis by Computational Modeling

Authors: Aamna Lawrence, Ashutosh Mishra

Abstract:

Tremors occur in 60% of the patients who have Multiple Sclerosis (MS), the most common demyelinating disease affecting the central and peripheral nervous system, and are the primary cause of disability in young adults. While pharmacological agents provide minimal benefit, surgical interventions like Deep Brain Stimulation and Thalamotomy are riddled with dangerous complications, which makes non-invasive electrical stimulation an appealing treatment of choice for dealing with tremors. We therefore hypothesized that if the non-invasive electrical stimulation parameters (mainly frequency) can be computed by mathematically modeling the nerve fibre, taking into consideration the minutest details of the axon morphology, tremors due to demyelination can be optimally alleviated. In this computational study, we modeled the random demyelination pattern that typically manifests in MS using the high-density Hodgkin-Huxley model with suitable modifications to account for the myelin. The internode of the nerve fibre in our model could have up to ten demyelinated regions, each with random length and myelin thickness. The arrival times of action potentials traveling along the demyelinated and the normally myelinated nerve fibres between two fixed points in space were noted, and their relationship with the nerve fibre radius, ranging from 5 µm to 12 µm, was analyzed. Interestingly, there was no overlap between the arrival times of action potentials traversing the demyelinated and normally myelinated nerve fibres, even when only a single internode of the nerve fibre was demyelinated. The study gave us the opportunity to design DC pulses whose frequency of application would be a function of the random demyelination pattern, so as to block only the delayed, tremor-causing action potentials. The DC pulses could be delivered to the peripheral nervous system non-invasively by an electrode bracelet that would suppress any shakiness beyond it, thus paving the way for wearable neuro-rehabilitative technologies.
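
A minimal single-compartment Hodgkin-Huxley sketch with the classic squid-axon parameters is shown below; the study's multi-internode cable model with random demyelinated regions extends this membrane model, and that extension is not reproduced here.

```python
import numpy as np

# Classic Hodgkin-Huxley membrane model (squid axon parameters).
C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3           # uF/cm^2, mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.387                # reversal potentials [mV]

def am(V): return 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
def bm(V): return 4.0 * np.exp(-(V + 65) / 18)
def ah(V): return 0.07 * np.exp(-(V + 65) / 20)
def bh(V): return 1.0 / (1 + np.exp(-(V + 35) / 10))
def an(V): return 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
def bn(V): return 0.125 * np.exp(-(V + 65) / 80)

dt, t_end = 0.01, 50.0                            # time step, duration [ms]
V, m, h, n = -65.0, 0.053, 0.596, 0.317           # resting initial conditions
spikes, prev_V = 0, V
for k in range(int(t_end / dt)):
    I_ext = 10.0 if 5.0 <= k * dt <= 45.0 else 0.0   # stimulus [uA/cm^2]
    INa = gNa * m**3 * h * (V - ENa)
    IK = gK * n**4 * (V - EK)
    IL = gL * (V - EL)
    m += dt * (am(V) * (1 - m) - bm(V) * m)
    h += dt * (ah(V) * (1 - h) - bh(V) * h)
    n += dt * (an(V) * (1 - n) - bn(V) * n)
    V += dt * (I_ext - INa - IK - IL) / C
    if prev_V < 0.0 <= V:                         # upward zero crossing = spike
        spikes += 1
    prev_V = V

print(f"action potentials fired: {spikes}")
```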

Keywords: demyelination, Hodgkin-Huxley model, non-invasive electrical stimulation, tremor

Procedia PDF Downloads 112
1980 Stabilization Technique for Multi-Inputs Voltage Sense Amplifiers in Node Sharing Converters

Authors: Sanghoon Park, Ki-Jin Kim, Kwang-Ho Ahn

Abstract:

This paper discusses the undesirable charge transfer through the parasitic capacitances of the input transistors in a multi-input voltage sense amplifier. Its intrinsic rail-to-rail voltage transitions at the output nodes inevitably disturb the input sides through the capacitive coupling between the outputs and inputs, which can degrade the stability of the reference voltage levels. The problem becomes more serious in multi-channel systems, where the disturbance propagates to the other channels and degrades the linearity of the overall system. In order to alleviate the internal node voltage transition, internal node stabilization techniques are proposed. They achieve 45% and 40% improvements in node stabilization and input-referred disturbance, respectively.

Keywords: voltage sense amplifier, multi-inputs, voltage transition, node stabilization, biasing circuits

Procedia PDF Downloads 547
1979 Erosion Modeling of Surface Water Systems for Long Term Simulations

Authors: Devika Nair, Sean Bellairs, Ken Evans

Abstract:

Flow and erosion modeling provides an avenue for simulating fine suspended sediment in surface water systems such as streams and creeks. Fine suspended sediment is highly mobile, and many contaminants released by catchment disturbance attach themselves to these sediments; a knowledge of fine suspended sediment transport is therefore important in assessing contaminant transport. The CAESAR-Lisflood landform evolution model, which includes a hydrologic model (TOPMODEL) and a hydraulic model (Lisflood), is being used to assess sediment movement in tropical streams resulting from a disturbance in the catchment of a creek and to determine the dynamics of sediment quantity in the creek by simulating the model over future years. The accuracy of future simulations depends on calibrating and validating the model against past and present events. Calibration and validation involve finding a combination of model parameters which, when applied and simulated, gives model outputs similar to those observed at the real site for the corresponding input data. Calibrating the sediment output of the CAESAR-Lisflood model at the catchment level and using it to study the equilibrium conditions of the landform is an area yet to be explored. The aim of the study was therefore to calibrate and validate the CAESAR-Lisflood model so that it can be run in future simulations to study how the landform evolves over time. To achieve this, the model was run for a rainfall event, with discharge and sediment data supplied at the input point of the catchment, to analyze how closely the model output matched the discharge and sediment data observed at the output point of the catchment. The model parameters were then adjusted until the model closely approximated the real site values, and the model was validated by running it for a different set of events and checking that it gave results similar to the real site values. The outcomes demonstrated that while the model can be calibrated well for hydrology (discharge output) throughout the year, the sediment output calibration could be slightly improved by allowing parameters to change to take into account the seasonal vegetation growth at the start and end of the wet season. This study is important for assessing hydrology and sediment movement in seasonal biomes. The understanding of sediment-associated metal dispersion processes in rivers can be used in a practical way to help river basin managers control and remediate catchments affected by present and historical metal mining more effectively.
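
A conceptual sketch of the calibration step (CAESAR-Lisflood itself runs externally; a placeholder model and synthetic series stand in here): sweep a parameter, score simulated against observed sediment output with the Nash-Sutcliffe efficiency, and keep the best value.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    return 1.0 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)

def toy_model(rainfall, erodibility):
    """Placeholder for a CAESAR-Lisflood run: sediment flux ~ k * excess^1.5."""
    return erodibility * np.maximum(rainfall - 2.0, 0.0)**1.5

rng = np.random.default_rng(9)
rainfall = rng.gamma(2.0, 3.0, 365)                       # synthetic daily rain
observed = toy_model(rainfall, 0.45) * rng.normal(1.0, 0.1, 365)

best_k, best_nse = max(
    ((k, nash_sutcliffe(observed, toy_model(rainfall, k)))
     for k in np.linspace(0.1, 1.0, 19)),
    key=lambda kv: kv[1])
print(f"best erodibility = {best_k:.2f}, NSE = {best_nse:.3f}")
```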

Keywords: erosion modelling, fine suspended sediments, hydrology, surface water systems

Procedia PDF Downloads 67
1978 On the Design of Robust Governors of Steam Power Systems Using Polynomial and State-Space Based H∞ Techniques: A Comparative Study

Authors: Rami A. Maher, Ibraheem K. Ibraheem

Abstract:

This work presents a comparative study of state-space and polynomial methods for the design of a robust governor for load frequency control of steam turbine power systems. The robust governor is synthesized using the two approaches, and the comparison extends to time- and frequency-domain performance, controller order, uncertainty representation, weighting filters, and optimality versus sub-optimality. The results are presented in tables and curves, with explanations of the similarities and dissimilarities.

Keywords: robust control, load frequency control, steam turbine, H∞-norm, system uncertainty, load disturbance

Procedia PDF Downloads 394
1977 Evaluation of Spatial Correlation Length and Karhunen-Loeve Expansion Terms for Predicting Reliability Level of Long-Term Settlement in Soft Soils

Authors: Mehrnaz Alibeikloo, Hadi Khabbaz, Behzad Fatahi

Abstract:

The spectral random field method is one of the most widely used methods for obtaining reliable and accurate results in geotechnical problems involving material variability. The Karhunen-Loeve (K-L) expansion method was applied to perform random field discretization of cross-correlated creep parameters. The K-L expansion method is based on the eigenfunctions and eigenvalues of the covariance function, adopting a Kernel integral solution. In this paper, the accuracy of the Karhunen-Loeve expansion was investigated for predicting the long-term settlement of soft soils, adopting an elastic visco-plastic creep model. For this purpose, a parametric study was carried out to evaluate the effect of the number of K-L expansion terms and the spatial correlation length on the reliability of the results. The results indicate that small values of the spatial correlation length require more K-L expansion terms. Moreover, as the spatial correlation length increases, the coefficient of variation (COV) of the creep settlement increases, confirming a more conservative and safer prediction.
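
A minimal sketch of discrete K-L random field generation on a 1-D soil column, assuming an exponential covariance (the paper solves the Kernel integral; here the covariance matrix is simply eigendecomposed):

```python
import numpy as np

rng = np.random.default_rng(2)
n, depth = 200, 20.0                 # grid points, domain size [m]
theta = 4.0                          # spatial correlation length [m]
sigma = 0.3                          # std. dev. of the creep parameter

z = np.linspace(0.0, depth, n)
C = sigma**2 * np.exp(-np.abs(z[:, None] - z[None, :]) / theta)

eigval, eigvec = np.linalg.eigh(C)
order = np.argsort(eigval)[::-1]     # sort eigenpairs, largest first
eigval, eigvec = eigval[order], eigvec[:, order]

M = 20                               # number of K-L expansion terms
xi = rng.standard_normal(M)          # independent standard normal variables
field = eigvec[:, :M] @ (np.sqrt(eigval[:M]) * xi)   # one field realization

captured = eigval[:M].sum() / eigval.sum()
print(f"{M} terms capture {captured:.1%} of the field variance")
```

Shrinking theta in this sketch lowers the captured-variance fraction for a fixed M, which mirrors the paper's finding that smaller correlation lengths require more expansion terms.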

Keywords: Karhunen-Loeve expansion, long-term settlement, reliability analysis, spatial correlation length

Procedia PDF Downloads 140
1976 Random Subspace Neural Classifier for Meteor Recognition in the Night Sky

Authors: Carlos Vera, Tetyana Baydyk, Ernst Kussul, Graciela Velasco, Miguel Aparicio

Abstract:

This article describes the Random Subspace Neural Classifier (RSC) for the recognition of meteors in the night sky. We used images of meteors entering the atmosphere at night between 8:00 p.m. and 5:00 a.m. The objective of this project is to classify meteor and star images (with stars as the image background); the monitoring of the sky and the classification of meteors are intended for future applications by scientists. The image database was collected from different websites. We worked with RGB images with dimensions of 220x220 pixels stored in the bitmap (BMP) format. Window scanning and processing were then carried out for each image: the scanning window from which the features were extracted had a size of 20x20 pixels, with a scanning step of 10 pixels. Brightness, contrast, and contour-orientation histograms were used as inputs to the RSC. The RSC classified images into two classes: 1) with meteors and 2) without meteors. Different tests were carried out by varying the number of training cycles and the number of images used for training and recognition, and the percentage error of the neural classifier was calculated. The results show a good RSC response, with 89% correct recognition. The results of these experiments are presented and discussed.
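
The random subspace idea can be sketched with a tree ensemble in which each member sees only a random subset of the histogram features; the authors' RSC is a neural classifier, so the example below (with synthetic stand-ins for the brightness, contrast, and contour-orientation histograms) illustrates the subspace principle rather than their network.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(4)
n, n_features = 1000, 48                     # e.g., 3 histograms x 16 bins
X = rng.random((n, n_features))
y = rng.integers(0, 2, n)                    # 1 = meteor, 0 = no meteor
X[y == 1, :16] += 0.15                       # meteors brighten some bins

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# bootstrap_features=True with bootstrap=False gives each tree a random
# feature subspace rather than a bootstrap sample of rows.
rsc = BaggingClassifier(n_estimators=50, max_features=0.3,
                        bootstrap=False, bootstrap_features=True,
                        random_state=0)
rsc.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, rsc.predict(X_te)))
```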

Keywords: contour orientation histogram, meteors, night sky, RSC neural classifier, stars

Procedia PDF Downloads 126
1975 Motion Detection Method for Clutter Rejection in the Bio-Radar Signal Processing

Authors: Carolina Gouveia, José Vieira, Pedro Pinho

Abstract:

Cardiopulmonary signal monitoring without contact electrodes or any type of in-body sensor has several applications, such as sleep monitoring and the continuous monitoring of vital signs in bedridden patients. The system also has applications in the vehicular environment to monitor the driver, in order to avoid accidents in case of cardiac failure. The bio-radar system proposed in this paper can measure vital signs accurately by using the Doppler effect principle, which relates the received signal properties to the change in distance between the radar antennas and the person's chest wall. Since the bio-radar is meant to monitor subjects in real time and over long periods, patient immobilization cannot be guaranteed, and the subjects' random motion will interfere with the acquired signals. In this paper, a mathematical model of the bio-radar is presented, together with its simulation in MATLAB. The algorithm used for breathing rate extraction is explained, and a method for DC offset removal based on a motion detection system is proposed. Furthermore, experimental tests were conducted to prove that the unavoidable random motion can be used to estimate the DC offsets accurately and thus remove them successfully.
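
The DC offsets can be illustrated with a simple geometric idea: the IQ samples lie on an arc of a circle centred at the DC offset, and random body motion widens the arc, improving the fit. The sketch below uses an algebraic (Kasa-style) circle fit on simulated data rather than the paper's exact ellipse-fitting routine; all signal parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 2000
dc = np.array([0.7, -0.4])                  # true DC offsets (to be estimated)
wavelength = 0.0125                         # assumed carrier wavelength [m]
t = np.linspace(0.0, 10.0, n)

chest = 0.004 * np.sin(2 * np.pi * 0.3 * t)             # breathing motion [m]
chest = chest + 1e-4 * rng.standard_normal(n).cumsum()  # random body motion
phase = 4 * np.pi * chest / wavelength
I = dc[0] + np.cos(phase) + 0.01 * rng.standard_normal(n)
Q = dc[1] + np.sin(phase) + 0.01 * rng.standard_normal(n)

# Algebraic circle fit: x^2 + y^2 = 2a x + 2b y + c is linear in (a, b, c),
# and (a, b) is the circle centre, i.e., the DC offset estimate.
M = np.column_stack([2 * I, 2 * Q, np.ones(n)])
(a, b, c), *_ = np.linalg.lstsq(M, I**2 + Q**2, rcond=None)
print("estimated DC offsets:", a, b, "(true:", dc, ")")
```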

Keywords: bio-signals, DC component, Doppler effect, ellipse fitting, radar, SDR

Procedia PDF Downloads 121
1974 Fuzzy Sliding Mode Control of a Flexible Structure for Vibration Suppression Using MFC Actuator

Authors: Jinsiang Shaw, Shih-Chieh Tseng

Abstract:

Active vibration control is well suited for low-frequency excitation and has the advantages of light weight and adaptability. This paper uses a macro-fiber composite (MFC) actuator for vibration suppression in a cantilevered beam because of its high output force for suppressing the disturbance. A fuzzy sliding mode controller is developed and applied to this system. Experimental results illustrate that the controller and the MFC actuator are very effective in attenuating the structural vibration near the first resonant frequency. Furthermore, this controller is shown to outperform the traditional skyhook controller, with nearly 90% of the vibration suppressed at the first resonant frequency of the structure.

Keywords: fuzzy sliding mode controller, macro-fiber-composite actuator, skyhook controller, vibration suppression

Procedia PDF Downloads 384
1973 Stock Prediction and Portfolio Optimization Thesis

Authors: Deniz Peksen

Abstract:

This thesis aims to predict the trend movement of stock closing prices and to maximize a portfolio by utilizing those predictions. In this context, the study defines a stock portfolio strategy from models created using Logistic Regression, Gradient Boosting, and Random Forest. Predicting the trend of stock prices has recently gained a significant role in making buy and sell decisions and in generating returns with investment strategies formed by machine-learning-based decisions. There are plenty of studies in the literature on the prediction of stock prices in capital markets using machine learning methods, but most of them focus on closing prices instead of the direction of the price trend. Our study differs from the literature in terms of target definition: it is a classification problem focused on the market trend over the next 20 trading days. To predict the trend direction, fourteen years of data were used for training, the following three years for validation, and the final three years for testing: training data span 2002-06-18 to 2016-12-30, validation data span 2017-01-02 to 2019-12-31, and testing data span 2020-01-02 to 2022-03-17. We set the hold-stock portfolio, the best-stock portfolio, and the USD-TRY exchange rate as the benchmarks to outperform, and we compared the return of our machine-learning-based portfolio on the test data with the returns of these benchmarks. Model performance was assessed with ROC-AUC scores and lift charts, and Logistic Regression, Gradient Boosting, and Random Forest were tuned with a grid search over their hyper-parameters. As a result of the empirical study, the existence of an uptrend or downtrend in the five stocks could not be predicted by the models: when these predictions were used to define buy and sell decisions, the resulting model-based portfolio failed on the test dataset, generating returns that could not outperform the non-model portfolio strategies. We found that any effort to predict a trend formulated on stock prices is a challenge, and our results agree with the Random Walk Theory, which claims that stock prices and price changes are unpredictable. Although we built several good models on the validation dataset, our model iterations failed on the test dataset. We also discovered that the more complex models provided no advantage or additional performance compared with Logistic Regression; more complexity did not lead to better performance, so using a complex model is not the answer to the stock prediction problem. Our approach of predicting the trend instead of the price converted the problem into classification, but this labeling approach does not solve the stock prediction problem, nor does it refute the Random Walk Theory for stock prices.

Keywords: stock prediction, portfolio optimization, data science, machine learning

Procedia PDF Downloads 62
1972 An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic

Authors: Diogen Babuc

Abstract:

The motivating question of this work is how many devote themselves to discovery in the world of science, where much has been discerned and revealed but much remains unknown. Methods: The insightful elements of this algorithm are drawn from the ciphering and deciphering algorithms of Playfair, Caesar, and Vigenère; only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given, with a random value between a and b = a+3; the obtained value is stored in a variable so that it remains constant during the run of the algorithm. In accordance with the given key, the string is divided into several groups of substrings, each of length k characters. The next step encodes each substring in the list of substrings. Encoding is performed on the basis of the Caesar algorithm, i.e., shifting by k characters; however, k is incremented by 1 when moving to the next substring in the list, and when the value of k becomes greater than b+1, it returns to its initial value. The algorithm proceeds in this way until the last substring in the list is traversed. Results: Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The character x is used only when the number of characters in a substring is incompatible with the expected length. The algorithm is simple to implement, but it is questionable whether it works better than the other methods in terms of execution time and storage space.
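
A direct sketch of the procedure described above (the alphabet handling and the exact key exchange are assumptions where the abstract leaves them open):

```python
import string

ALPHABET = string.ascii_lowercase

def caesar(text, k):
    """Caesar shift of lowercase letters by k; other characters pass through."""
    return "".join(ALPHABET[(ALPHABET.index(c) + k) % 26] if c in ALPHABET
                   else c for c in text)

def latin_djokovic(text, key, a=3, encode=True):
    b = a + 3                                   # per the abstract, b = a + 3
    assert a <= key <= b, "key is drawn randomly from [a, b]"
    chunks = [text[i:i + key] for i in range(0, len(text), key)]
    if chunks and len(chunks[-1]) < key:
        chunks[-1] += "x" * (key - len(chunks[-1]))   # pad short tail with 'x'
    out, k = [], key
    for chunk in chunks:
        out.append(caesar(chunk, k if encode else -k))
        k += 1                                  # shift grows per substring
        if k > b + 1:                           # wrap k back to its initial value
            k = key
    return "".join(out)

ciphertext = latin_djokovic("attackatdawn", key=4)
print(ciphertext, "->", latin_djokovic(ciphertext, key=4, encode=False))
```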

Keywords: ciphering, deciphering, authentic, algorithm, polyalphabetic cipher, random key, methods comparison

Procedia PDF Downloads 87
1971 Systems Lens: Towards Sustainable Management of Maintenance and Renewal of Wire-Based Infrastructure: The Case of Water Network in the City of Linköping, Sweden

Authors: E. Hegazy, S. Anderberg, J. Krook

Abstract:

A city's wire-based infrastructure systems (WBIS) are responsible for the delivery of electricity, telecommunications, sanitation, drainage, and district heating, and are a necessity for sustainable modern urban life. Maintaining the functionality of these structures involves high costs, brings disturbance to the local community, and has effects on the environment. One key reason for this is that the cables and pipes are placed under streets, so system parts wear easily, their service lifetimes are reduced, and all maintenance and renewal rely on recurrent excavation. In Sweden, a significant part of the wire-based infrastructure is already outdated and will need to be replaced in the coming decades. The replacement of these systems will entail massive costs as well as significant traffic disruption and environmental disturbance. However, this challenge may also open a unique opportunity to introduce new, more sustainable technologies and management practices. There is no comprehensive approach to transforming WBIS management for long-term sustainability while meeting maintenance and renewal needs. A systemic approach, which considers technical and non-technical aspects as well as time-related factors, may inform WBIS management; nevertheless, there is limited systemic knowledge of how different factors influence current management practices. The aim of this study is to address this knowledge gap and contribute to the understanding of the factors that influence current WBIS management practice. A case study approach is used to identify current management practices, the underlying factors that influence them, and their implications for sustainability outcomes. The case study is based on quantitative data on the local system and on data from interviews and workshops with local practitioners and other stakeholders. Linköping was selected as the case since it provided good access to the water administration and relevant data for analyzing water infrastructure management strategies, and it is a sufficiently important city in Sweden for identifying challenges that are, to some extent, common to all Swedish cities. By uncovering current practices and the factors influencing them in Linköping, knowledge gaps and uncertainties related to sustainability consequences were highlighted. The findings show that the goals, priorities, and policies controlling management are short-term, and decisions on maintenance and renewal are often restricted to finding solutions to the most urgent issues. Sustainability transformation in the infrastructure area will not be possible through individual efforts without coordinated technical, organizational, business, and regulatory changes.

Keywords: case study, infrastructure, management, practice, Sweden

Procedia PDF Downloads 67
1970 Parameter Estimation for Contact Tracing in Graph-Based Models

Authors: Augustine Okolie, Johannes Müller, Mirjam Kretzchmar

Abstract:

We adopt a maximum-likelihood framework to estimate the parameters of a stochastic susceptible-infected-recovered (SIR) model with contact tracing on a rooted random tree. Given the number of detectees per index case, our estimator makes it possible to determine the degree distribution of the random tree as well as the tracing probability. Since we do not discover all infectees via contact tracing, this estimation is non-trivial. To keep things simple and stable, we develop an approximation suited to realistic situations (small contact tracing probability, or small probability of detecting index cases). In this approximation, the only epidemiological parameter entering the estimator is the basic reproduction number R0. The estimator is tested in a simulation study and applied to COVID-19 contact tracing data from India. The simulation study underlines the efficiency of the method. For the empirical COVID-19 data, we are able to compare different degree distributions and perform a sensitivity analysis. We find that a power-law and a negative binomial degree distribution, in particular, fit the data well and that the tracing probability is rather large. The sensitivity analysis shows no strong dependence on the reproduction number.

Keywords: stochastic SIR model on graph, contact tracing, branching process, parameter inference

Procedia PDF Downloads 64
1969 Machine Learning Techniques for Estimating Ground Motion Parameters

Authors: Farid Khosravikia, Patricia Clayton

Abstract:

The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques for forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site conditions. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and potentially in the subsequent risk assessment of different types of structures. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as statistical methods in ground motion prediction, namely the Artificial Neural Network, Random Forest, and Support Vector Machine. The algorithms are adjusted to quantify the event-to-event and site-to-site variability of the ground motions by implementing these as random effects in the proposed models, reducing the aleatory uncertainty. All the algorithms are trained on a selected database of 4,528 ground motions, including 376 seismic events with magnitudes 3 to 5.8, recorded over a hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. The choice of this database stems from the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for these states. The accuracy of the models in predicting intensity measures, the generalization capability of the models for future data, and the usability of the models are discussed in the evaluation process. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates than the conventional linear regression-based method, with Random Forest in particular outperforming the other algorithms. However, the conventional method is the better tool when only limited data are available.

Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine

Procedia PDF Downloads 109
1968 Vibration Control of a Flexible Structure Using MFC Actuator

Authors: Jinsiang Shaw, Jeng-Jie Huang

Abstract:

Active vibration control is well suited for low-frequency excitation and has the advantages of light weight and adaptability. This paper employs a macro-fiber composite (MFC) actuator for vibration suppression in a cantilevered beam because of its high output force for rejecting the disturbance. A notch filter with an adaptive tuning algorithm, the leaky filtered-X least mean square (leaky FXLMS) algorithm, is developed and applied to the system. Experimental results show that the controller and the MFC actuator were very effective in attenuating the structural vibration. Furthermore, this notch filter controller was compared with the traditional skyhook controller and was found to perform better, with over 88% vibration suppression near the first resonant frequency of the structure.

Keywords: macro-fiber composite, notch filter, skyhook controller, vibration suppression

Procedia PDF Downloads 442
1967 Modeling Biomass and Biodiversity across Environmental and Management Gradients in Temperate Grasslands with Deep Learning and Sentinel-1 and -2

Authors: Javier Muro, Anja Linstadter, Florian Manner, Lisa Schwarz, Stephan Wollauer, Paul Magdon, Gohar Ghazaryan, Olena Dubovyk

Abstract:

Monitoring the trade-off between biomass production and biodiversity in grasslands is critical for evaluating the effects of management practices across environmental gradients. New generations of remote sensing sensors and machine learning approaches can model grasslands' characteristics with varying accuracies. However, studies often fail to cover a sufficiently broad range of environmental conditions, and evidence suggests that prediction models might be case specific. In this study, biomass production and biodiversity indices (species richness and Fisher's α) are modeled in 150 grassland plots at three sites across Germany. These sites represent a North-South gradient and are characterized by distinct soil types, topographic properties, climatic conditions, and management intensities. The predictors used are derived from Sentinel-1 & 2 and a set of topoedaphic variables. The transferability of the models is tested by training and validating at different sites, and the performance of feed-forward deep neural networks (DNN) is compared to that of a random forest algorithm. While biomass predictions across gradients and sites were acceptable (r² = 0.5), predictions of biodiversity indices were poor (r² = 0.14). DNN showed a higher generalization capacity than random forest when predicting biomass across gradients and sites (relative root mean squared error of 0.5 for DNN vs. 0.85 for random forest). DNN also achieved high performance when using the Sentinel-2 surface reflectance data rather than different combinations of spectral indices, Sentinel-1 data, or topoedaphic variables, simplifying dimensionality. This study demonstrates the necessity of training biomass and biodiversity models over a broad range of environmental conditions and of ensuring spatial independence, so that the models are realistic and transferable and plot-level information can be upscaled to the landscape scale.

Keywords: ecosystem services, grassland management, machine learning, remote sensing

Procedia PDF Downloads 199
1966 Modeling and Control of a 4DoF Robotic Assistive Device for Hand Rehabilitation

Authors: Christopher Spiewak, M. R. Islam, Mohammad Arifur Rahaman, Mohammad H. Rahman, Roger Smith, Maarouf Saad

Abstract:

For those who have lost the ability to move their hand, going through repetitious motions with the assistance of a therapist is the main method of recovery. We have developed a robotic assistive device to rehabilitate hand motions in place of the traditional therapy. The developed assistive device (RAD-HR) comprises four degrees of freedom, enables basic movements and hand function, and assists in supporting the hand during rehabilitation. We used a nonlinear computed torque control technique to control the RAD-HR. The accuracy of the controller was evaluated in simulations (MATLAB/Simulink environment). To assess the robustness of the controller, an external disturbance representing modeling uncertainty (±10% of the joint torques) was added to each joint.
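
A sketch of the nonlinear computed-torque law on an assumed single-joint model (the RAD-HR's 4-DoF dynamics are not reproduced here), including the ±10% torque perturbation used as the robustness test:

```python
import numpy as np

dt, T = 0.001, 5.0
m, l, g = 2.0, 0.3, 9.81                    # assumed link mass, length, gravity
Kp, Kd = 100.0, 20.0                        # critically damped tracking gains

def M(q):  return m * l**2                  # inertia (point mass at distance l)
def G(q):  return m * g * l * np.cos(q)     # gravity torque

q = qd = 0.0
rng = np.random.default_rng(6)
for k in range(int(T / dt)):
    t = k * dt
    q_des = 0.5 * np.sin(t)                 # desired trajectory
    qd_des, qdd_des = 0.5 * np.cos(t), -0.5 * np.sin(t)
    e, ed = q_des - q, qd_des - qd
    tau = M(q) * (qdd_des + Kd * ed + Kp * e) + G(q)   # computed torque law
    tau *= 1.0 + rng.uniform(-0.1, 0.1)     # +/-10% torque uncertainty
    qdd = (tau - G(q)) / M(q)               # plant dynamics
    qd += qdd * dt
    q += qd * dt

print(f"final tracking error: {abs(0.5 * np.sin(T) - q):.2e} rad")
```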

Keywords: biorobotics, rehabilitation, robotic assistive device, exoskeleton, nonlinear control

Procedia PDF Downloads 452
1965 Predicting the Diagnosis of Alzheimer’s Disease: Development and Validation of Machine Learning Models

Authors: Jay L. Fu

Abstract:

Patients with Alzheimer's disease progressively lose their memory and thinking skills and, eventually, the ability to carry out simple daily tasks. The disease is irreversible, but early detection and treatment can slow its progression. In this research, publicly available MRI data and demographic data from 373 MRI imaging sessions were utilized to build models to predict dementia. Various machine learning models, including logistic regression, k-nearest neighbors, support vector machine, random forest, and neural network, were developed. The data were divided into training and testing sets, where the training sets were used to build the predictive models and the testing sets were used to assess the accuracy of the predictions. Key risk factors were identified, and the models were compared to determine the best prediction model. Among these models, the random forest model performed best, with an accuracy of 90.34%. MMSE, nWBV, and gender were the three most important factors contributing to the detection of Alzheimer's. Across all the models used, the percentage of testing inputs for which at least 4 of the 5 models agreed on the diagnosis was 90.42%. These machine learning models allow early detection of Alzheimer's with good accuracy, which ultimately leads to early treatment of these patients.

Keywords: Alzheimer's disease, clinical diagnosis, magnetic resonance imaging, machine learning prediction

Procedia PDF Downloads 125