Search results for: time series models
23594 The Influence of Contact Models on Discrete Element Modeling of the Ballast Layer Subjected to Cyclic Loading
Authors: Peyman Aela, Lu Zong, Guoqing Jing
Abstract:
Recently, there has been growing interest in numerical modeling of ballast railway tracks. A commonly used mechanistic modeling approach for ballast is the discrete element method (DEM). Up to now, the effects of the contact model on ballast particle behavior have not been precisely examined. In this regard, selecting the appropriate contact model is mainly associated with the particle characteristics and the loading condition. Since ballast is a cohesionless material, different contact models, including the linear spring, Hertz-Mindlin, and Hysteretic models, could be used to calculate particle-particle or wall-particle contact forces. Moreover, the simulation of a dynamic test is vital to investigate the effect of damping parameters on ballast deformation. In this study, ballast box tests were simulated by DEM to examine the influence of different contact models on the mechanical behavior of the ballast layer under cyclic loading. This paper shows how the contact model can affect the deformation and damping of a ballast layer subjected to cyclic loading in a ballast box.
Keywords: ballast, contact model, cyclic loading, DEM
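For reference, the normal-force laws behind two of the contact models named above can be written in their standard DEM forms (a sketch using generic symbols, not necessarily the exact parameterization used in the study):

```latex
F_n^{\text{linear}} = k_n \, \delta_n , \qquad
F_n^{\text{Hertz}} = \tfrac{4}{3}\, E^{*} \sqrt{R^{*}}\; \delta_n^{3/2},
\quad \text{with} \quad
\frac{1}{E^{*}} = \frac{1-\nu_1^{2}}{E_1} + \frac{1-\nu_2^{2}}{E_2},
\qquad
\frac{1}{R^{*}} = \frac{1}{R_1} + \frac{1}{R_2}
```

Here δₙ is the normal overlap, kₙ a spring stiffness, and E*, R* the effective modulus and radius of the contacting pair; the damping term added on top of these elastic laws is what distinguishes the hysteretic variant under cyclic loading.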
Procedia PDF Downloads 197
23593 Discrete-Event Modeling and Simulation Methodologies: Past, Present and Future
Authors: Gabriel Wainer
Abstract:
Modeling and Simulation (M&S) methods have been used to better analyze the behavior of complex physical systems, and it is now common to use simulation as a part of the scientific and technological discovery process. M&S advanced thanks to improvements in computer technology, which, in many cases, resulted in the development of simulation software using ad-hoc techniques. Formal M&S appeared in order to improve the development of very complex simulation systems. Some of these techniques proved successful in providing a sound base for the development of discrete-event simulation models, improving the ease of model definition and enhancing application development, reducing costs and favoring reuse. The DEVS formalism is one of these techniques, which proved successful in providing means for modeling while reducing development complexity and costs. DEVS model development is based on a sound theoretical framework. The independence of M&S tasks made it possible to run DEVS models on different environments (personal computers, parallel computers, real-time equipment, and distributed simulators) and middleware. We will present a historical perspective of discrete-event M&S methodologies, showing different modeling techniques. We will introduce the origins and general ideas of DEVS and compare it with some of these techniques. We will then show the current status of DEVS M&S, and we will discuss a technological perspective to solve current M&S problems (including real-time simulation, interoperability, and model-centered development techniques). We will show some examples of the current use of DEVS, including applications in different fields. We will finally show current open topics in the area, which include advanced methods for centralized, parallel or distributed simulation, the need for real-time modeling techniques, and our view on these fields.
Keywords: modeling and simulation, discrete-event simulation, hybrid systems modeling, parallel and distributed simulation
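As background for the discussion above, the classic atomic DEVS specification can be stated as a tuple (the standard textbook form, independent of any particular tool):

```latex
M = \langle X,\, Y,\, S,\, \delta_{\mathrm{int}},\, \delta_{\mathrm{ext}},\, \lambda,\, ta \rangle
```

where X and Y are the input and output event sets, S the set of states, δ_int : S → S the internal transition function, δ_ext : Q × X → S the external transition function on the total state set Q = {(s, e) : s ∈ S, 0 ≤ e ≤ ta(s)}, λ : S → Y the output function, and ta : S → R⁺₀,∞ the time-advance function.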
Procedia PDF Downloads 323
23592 Influence of Atmospheric Pollutants on Child Respiratory Disease in Cartagena De Indias, Colombia
Authors: Jose A. Alvarez Aldegunde, Adrian Fernandez Sanchez, Matthew D. Menden, Bernardo Vila Rodriguez
Abstract:
Up to five statistical pre-processings have been carried out on the pollutant records of the stations present in Cartagena de Indias, Colombia, also taking into account the childhood asthma incidence surveys conducted in hospitals in the city by the Health Ministry of Colombia for this study. These pre-processings consisted of different techniques, such as the determination of the quality of data collection, determination of the quality of the registration network, identification and debugging of errors in data collection, completion of missing data, and data purification, as well as the improvement of the time scale of the records. The characterization of the quality of the data was conducted by means of density analysis of the pollutant registration stations using ArcGIS software and through mass balance techniques, making it possible to determine inconsistencies in the records by relating the registration data between stations following linear regression. The results obtained in this process highlighted the positive quality of the pollutant registration process. Consequently, the debugging of errors allowed us to identify certain data as statistically non-significant in the incidence and contamination series. These data, together with certain missing records in the series recorded by the measuring stations, were completed by statistical imputation equations. Following the application of these prior processes, the basic series of incidence data for respiratory disease and pollutant records allowed the characterization of the influence of pollutants on respiratory diseases such as, for example, childhood asthma. This characterization was carried out using statistical correlation methods, including visual correlation, simple linear regression correlation and spectral analysis with PAST software, which identifies maximum and minimum periodicity cycles under the formula of the Lomb periodogram. Among the results obtained, up to eleven contemporaneous maxima and minima between the incidence records and the particulate records were identified by visual comparison. The spectral analyses performed on the incidence and the PM2.5 returned a series of similar maximum periods in both registers, one at a period of one year and another every 25 days (0.9 and 0.07 years). The bivariate analysis ranked the variable "Daily Vehicular Flow" ninth in importance out of a total of 55 variables. However, the statistical correlation did not obtain a favorable result, yielding a low value of the R2 coefficient. The series of analyses conducted demonstrates the importance of the influence of pollutants such as PM2.5 in the development of childhood asthma in Cartagena. The quantification of the influence of the variables determined that there is a 56% probability of dependence between PM2.5 and childhood respiratory asthma in Cartagena. On this basis, the study could be completed through the application of the BenMap software, producing a series of spatial results of interpolated values of the pollutant records that exceeded the established legal limits (represented by homogeneous units down to the neighborhood level) and results of the impact on the exacerbation of pediatric asthma.
As a final result, an economic estimate (in Colombian pesos) of the monthly and individual savings derived from the percentage reduction of the influence of pollutants on visits to the hospital emergency room due to asthma exacerbation in pediatric patients has been provided.
Keywords: asthma incidence, BenMap, PM2.5, statistical analysis
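A hedged sketch of the spectral step described above (illustrative SciPy code on synthetic data; the real series, sampling, and units differ):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 5 * 365.0, 300))            # uneven sampling, in days
incidence = 10 + 3 * np.sin(2 * np.pi * t / 365.0) + rng.standard_normal(t.size)

periods = np.linspace(10.0, 2 * 365.0, 2000)           # candidate periods (days)
freqs = 2 * np.pi / periods                            # angular frequencies
power = lombscargle(t, incidence - incidence.mean(), freqs, normalize=True)

print("dominant period (days):", periods[np.argmax(power)])  # ~365 here
```

The Lomb method is the natural choice here because incidence surveys and station records are rarely sampled at even intervals, which rules out a plain FFT periodogram.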
Procedia PDF Downloads 116
23591 Real-time Rate and Rhythms Feedback Control System in Patients with Atrial Fibrillation
Authors: Mohammad A. Obeidat, Ayman M. Mansour
Abstract:
Capturing the dynamic behavior of the heart to improve control performance, enhance robustness, and support diagnosis is very important in establishing real-time models for the heart. Control techniques and strategies have been utilized to improve system costs, reliability, and estimation accuracy for different types of systems, such as biomedical and industrial systems, that require tuning of the input/output relation and/or monitoring. Simulations are performed to illustrate potential applications of the technology. In this research, a new control technology scheme is used to enhance the performance of the AF system and meet the design specifications.
Keywords: atrial fibrillation, dynamic behavior, closed loop, signal, filter
Procedia PDF Downloads 420
23590 Aspects Concerning Flame Propagation of Various Fuels in Combustion Chamber of Four Valve Engines
Authors: Zoran Jovanovic, Zoran Masonicic, S. Dragutinovic, Z. Sakota
Abstract:
In this paper, results concerning flame propagation of various fuels in a particular combustion chamber with four tilted valves are elucidated. Flame propagation is represented by the evolution of the spatial distribution of temperature in various cut-planes within the combustion chamber, while the flame front location is determined from the zones with maximum temperature gradient. The results presented are only a small part of a broader on-going scrutinizing activity in the field of multidimensional modeling of reactive flows in combustion chambers with complicated geometries, encompassing various models of turbulence, different fuels and combustion models. In the case of turbulence, two different models were applied, i.e. the standard k-ε model and the k-ξ-f model. In this paper, flame propagation results are analyzed and presented for two different hydrocarbon fuels, CH4 and C8H18. In the case of combustion, all differences ensuing from different turbulence models, obvious for non-reactive flows, are annihilated entirely: the interplay between the fluid flow pattern and flame propagation is invariant with respect to both the turbulence model and the fuel applied, indicating that flame propagation through the unburned mixture of CH4 and C8H18 is not chemically controlled.
Keywords: automotive flows, flame propagation, combustion modelling, CNG
Procedia PDF Downloads 292
23589 Machine Learning Model to Predict TB Bacteria-Resistant Drugs from TB Isolates
Authors: Rosa Tsegaye Aga, Xuan Jiang, Pavel Vazquez Faci, Siqing Liu, Simon Rayner, Endalkachew Alemu, Markos Abebe
Abstract:
Tuberculosis (TB) is a major cause of disease globally. In most cases, TB is treatable and curable, but only with the proper treatment. Drug-resistant TB occurs when bacteria become resistant to the drugs that are used to treat TB. Current strategies to identify drug-resistant TB bacteria are laboratory-based, and it takes a long time to identify the drug-resistant bacteria and treat the patient accordingly. Machine learning (ML) and data science can offer new approaches to the problem. In this study, we propose to develop an ML-based model to predict the antibiotic resistance phenotypes of TB isolates in minutes and give the right treatment to the patient immediately. The study used the whole genome sequences (WGS) of TB isolates, extracted from the NCBI repository and containing samples from different countries, as training data to build the ML models. Samples from different countries were included in order to generalize over the large group of TB isolates from different regions of the world. This supports the model in training on different behaviors of the TB bacteria and makes the model robust. The model training considered three types of information extracted from the WGS data: all variants found within the candidate genes (F1), predetermined resistance-associated variants (F2), and resistance-associated gene information for the particular drug. Two major datasets were constructed using these three types of information: F1 and F2 were treated as two independent datasets, and the third type of information was used as the class label for both. Five machine learning algorithms were considered to train the model: support vector machine (SVM), random forest (RF), logistic regression (LR), gradient boosting, and AdaBoost. The models were trained on the datasets F1, F2, and F1F2, i.e., the F1 and F2 datasets merged. Additionally, an ensemble approach was used: the F1 and F2 datasets were run through the gradient boosting algorithm, and the output was used as one dataset, called the F1F2 ensemble dataset, on which models were trained using the five algorithms. As the experiments show, the ensemble-approach model trained with the gradient boosting algorithm outperformed the rest of the models. In conclusion, this study suggests the ensemble approach, that is, the RF + gradient boosting model, to predict the antibiotic resistance phenotypes of TB isolates, as it outperformed the rest of the models.
Keywords: machine learning, MTB, WGS, drug resistant TB
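A hedged sketch of the pipeline described above (illustrative scikit-learn code; the matrices and labels are random placeholders, and the in-sample stacking shown here is a simplification: a faithful implementation would use out-of-fold predictions for the ensemble features):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X_f1 = rng.random((200, 50))      # placeholder: variants in candidate genes (F1)
X_f2 = rng.random((200, 20))      # placeholder: known resistance variants (F2)
y = rng.integers(0, 2, 200)       # resistant / susceptible phenotype labels

X_f1f2 = np.hstack([X_f1, X_f2])  # merged F1F2 dataset

# ensemble dataset: gradient-boosting scores on F1 and F2 become new features
gb1 = GradientBoostingClassifier().fit(X_f1, y)
gb2 = GradientBoostingClassifier().fit(X_f2, y)
X_ens = np.column_stack([gb1.predict_proba(X_f1)[:, 1],
                         gb2.predict_proba(X_f2)[:, 1]])

for name, X in [("F1", X_f1), ("F2", X_f2), ("F1F2", X_f1f2), ("ensemble", X_ens)]:
    acc = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```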
Procedia PDF Downloads 52
23588 An ALM Matrix Completion Algorithm for Recovering Weather Monitoring Data
Authors: Yuqing Chen, Ying Xu, Renfa Li
Abstract:
The development of matrix completion theory provides new approaches for data gathering in Wireless Sensor Networks (WSN). The existing matrix completion algorithms for WSN mainly consider how to reduce the sampling number without considering the real-time performance when recovering the data matrix. In order to guarantee the recovery accuracy and reduce the recovery time simultaneously, we propose a new ALM algorithm to recover weather monitoring data. Extensive experiments have been carried out to investigate the performance of the proposed ALM algorithm using different parameter settings, sampling rates and sampling models. In addition, we compare the proposed ALM algorithm with some existing algorithms in the literature. Experimental results show that the ALM algorithm obtains better overall recovery accuracy with less computing time, which demonstrates that the ALM algorithm is an effective and efficient approach for recovering real-world weather monitoring data in WSN.
Keywords: wireless sensor network, matrix completion, singular value thresholding, augmented Lagrange multiplier
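The singular value thresholding named in the keywords admits a compact sketch. The following is a simplified SVT-style completion loop (a generic stand-in, not the authors' ALM implementation, which differs in how the Lagrange multiplier is updated; the parameter values are illustrative):

```python
import numpy as np

def svt_complete(M, mask, tau=5.0, delta=1.2, iters=200):
    """Recover a low-rank matrix from entries where mask == 1
    by iteratively soft-thresholding singular values."""
    Y = np.zeros_like(M)
    X = np.zeros_like(M)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U * np.maximum(s - tau, 0.0)) @ Vt   # shrinkage step
        Y += delta * mask * (M - X)               # correct observed entries only
    return X

# toy usage: a rank-2 "sensor" matrix with half of the entries observed
rng = np.random.default_rng(0)
M = rng.random((30, 2)) @ rng.random((2, 40))
mask = (rng.random(M.shape) < 0.5).astype(float)
M_hat = svt_complete(M * mask, mask)
print("relative error:", np.linalg.norm(M_hat - M) / np.linalg.norm(M))
```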
Procedia PDF Downloads 384
23587 Decision Support: How Explainable A.I. Can Improve Transparency and Trust with Human Users
Authors: Devon Brown, Liu Chunmei
Abstract:
This paper presents an analysis, as part of the researcher's dissertation topic, focusing on the intersection of affective and analytical directed acyclic graphs (DAGs) in the context of Decision Support Systems (DSS). The work involves analyzing decision theory models, such as affective and Bayesian decision theory models, and how they could be implemented under an affective computing framework using information fusion and human-centered design. Additionally, the researcher is beginning work on an Affective-Analytic Decision Framework (AADF) model for the dissertation research and is looking to merge logic and analytic models with empathetic insights into affective DAGs. Data collection begins in Fall 2024; in preparation, this paper analyzes previous research in this area, introduces the AADF framework, and proposes conceptual models for consideration. The emphasis is placed on analyzing Bayesian networks and Markov models, which offer probabilistic techniques for decision-making under uncertainty. Ideally, including affect in analytic models will increase user trust in algorithms by accounting for emotional states and the user's experience, with the goal of developing emotionally intelligent A.I. systems that can begin to navigate the complex fabric of human emotion during decision-making.
Keywords: decision support systems, explainable AI, HCAI techniques, affective-analytical decision framework
Procedia PDF Downloads 20
23586 Study of Climate Change Process on Hyrcanian Forests Using Dendroclimatology Indicators (Case Study of Guilan Province)
Authors: Farzad Shirzad, Bohlol Alijani, Mehry Akbary, Mohammad Saligheh
Abstract:
Climate change and global warming are very important issues today. The process of climate change, especially changes in temperature and precipitation, is the most important issue in the environmental sciences. Climate change means a change in the long-run averages. Iran is located in arid and semi-arid regions due to its proximity to the equator and its location in the subtropical high-pressure zone. In this respect, the Hyrcanian forest is a green necklace between the Caspian Sea and the south of the Alborz mountain range; in the forty-third session of UNESCO, it was registered as the second natural heritage of Iran. Beech is one of the most important tree species and the most industrial species of the Hyrcanian forests. In this research, dendroclimatology was applied using tree-ring widths, climatic data of temperature and precipitation from the Shanderman meteorological station located in the study area, the non-parametric Mann-Kendall statistical method to investigate the trend of climate change over a 202-year time series of growth rings, and the Pearson statistical method to correlate the growth rings of beech trees with climatic variables in the region. The results obtained from the time series of beech growth rings showed that the changes in the growth rings had a downward, negative trend, significant at the 5% level, and that climate change has occurred. The minimum, mean, and maximum temperatures and evaporation in the growing season had an increasing trend, and the annual precipitation had a decreasing trend. Fitting the Pearson correlation of ring width with climate showed a negative correlation with the mean temperature in July, August, and September, a positive correlation with the mean maximum temperature in February, significant at the 95% level, and a positive correlation with precipitation in June, also significant at the 95% level.
Keywords: climate change, dendroclimatology, hyrcanian forest, beech
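A minimal sketch of the two statistical tests named above (a generic implementation on synthetic data, not the authors' code or their actual series; the Mann-Kendall version below omits the tie correction):

```python
import numpy as np
from scipy.stats import norm, pearsonr

def mann_kendall(x):
    """Mann-Kendall trend test: returns the S statistic and two-sided p-value."""
    x = np.asarray(x)
    n = x.size
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, 2 * (1 - norm.cdf(abs(z)))

rng = np.random.default_rng(0)
years = 202
ring_width = 1.5 - 0.002 * np.arange(years) + 0.1 * rng.standard_normal(years)
july_temp = 24 + 0.01 * np.arange(years) + rng.standard_normal(years)

print("MK (S, p):", mann_kendall(ring_width))          # expect a negative S
print("Pearson (r, p):", pearsonr(ring_width, july_temp))
```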
Procedia PDF Downloads 104
23585 Nuclear Fuel Safety Threshold Determined by Logistic Regression Plus Uncertainty
Authors: D. S. Gomes, A. T. Silva
Abstract:
Analysis of the uncertainty quantification related to nuclear safety margins applied to nuclear reactors is an important concept for preventing future radioactive accidents. Nuclear fuel performance codes may involve tolerance levels determined by traditional deterministic models, producing acceptable results at burnup cycles under 62 GWd/MTU. The behavior of nuclear fuel can be simulated by applying a series of material properties under irradiation and physics models to calculate the safety limits. In this study, theoretical predictions of nuclear fuel failure under transient conditions investigate extended radiation cycles at 75 GWd/MTU, considering the behavior of fuel rods in light-water reactors under reactivity accident conditions. The fuel pellet can melt due to the quick increase of reactivity during a transient. Large power excursions in the reactor are the subject of interest, leading to a treatment known as the Fuchs-Hansen model. The point-kinetics neutron equations show the characteristics of non-linear differential equations. In this investigation, multivariate logistic regression is employed for a probabilistic forecast of fuel failure. A comparison of computational simulation and experimental results showed acceptable agreement. The experiments carried out used pre-irradiated fuel rods subjected to a rapid energy pulse, which exhibits the same behavior as during a nuclear accident. The propagation of uncertainty utilizes Wilks' formulation. The variables chosen as essential to failure prediction were the fuel burnup, the applied peak power, the pulse width, the oxidation layer thickness, and the cladding type.
Keywords: logistic regression, reactivity-initiated accident, safety margins, uncertainty propagation
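A hedged sketch of the probabilistic failure model described above (illustrative scikit-learn code; the synthetic data, units, and coefficients are placeholders, not the study's experimental database):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 300
X = np.column_stack([
    rng.uniform(20, 80, n),     # fuel burnup (GWd/MTU)
    rng.uniform(50, 200, n),    # applied peak power (illustrative units)
    rng.uniform(5, 80, n),      # pulse width (ms)
    rng.uniform(5, 120, n),     # oxidation layer thickness (um)
    rng.integers(0, 2, n),      # cladding type (binary encoding)
])
# placeholder failure labels, loosely correlated with burnup and oxide thickness
y = (0.03 * X[:, 0] + 0.02 * X[:, 3] + rng.normal(0, 1, n) > 3.0).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
rod = X[:1]
print(f"predicted failure probability: {model.predict_proba(rod)[0, 1]:.2f}")
```

The multivariate logit gives a failure probability rather than a hard pass/fail threshold, which is what makes it compatible with the Wilks-style uncertainty propagation mentioned in the abstract.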
Procedia PDF Downloads 292
23584 Sustainable Traditional Architecture and Urban Planning in Hot-Arid Climate of Iran
Authors: Farnaz Nazem
Abstract:
The aim of sustainable architecture is to design buildings with the least adverse effects on the environment and to provide better conditions for people. What building forms make the best use of land? This question was addressed in the late 1960s at the Centre for Land Use and Built Form Studies in Cambridge, leading to a number of influential papers that had a great influence on the practice of urban design. This paper concentrates on the results of sustainability arising from climatic conditions in Iranian traditional architecture in hot-arid regions. As people spent a significant amount of their time in houses, it was very important for those houses to fulfill their needs physically and spiritually as well as to satisfy the cultural and religious aspects of their lifestyles. In a vast country such as Iran, with different climatic zones, traditional builders have presented a series of logical solutions for human comfort. These solutions have been able to respond to the environmental problems for a long period of time. As a result, by considering the experience of traditional architecture in the hot-arid climate of Iran, it is possible to attain sustainable architecture.
Keywords: hot-arid climate, Iran, sustainable traditional architecture, urban planning
Procedia PDF Downloads 472
23583 Effects of the Fractional Order on Nanoparticles in Blood Flow through the Stenosed Artery
Authors: Mohammed Abdulhameed, Sagir M. Abdullahi
Abstract:
In this paper, based on the applications of nanoparticles, blood flow along with nanoparticles through a stenosed artery is studied. The blood is acted on by periodic body acceleration, an oscillating pressure gradient and an external magnetic field. The mathematical formulation is based on the Caputo-Fabrizio fractional derivative without singular kernel. The model of ordinary blood, corresponding to time-derivatives of integer order, is obtained as a limiting case. Analytical solutions of the blood velocity and temperature distribution are obtained by means of the Hankel and Laplace transforms. The effects of the order of the Caputo-Fabrizio time-fractional derivative and of three different nanoparticles, i.e. Fe3O4, TiO4 and Cu, are studied. The results highlight that models with fractional derivatives bring significant differences compared to the ordinary model. It is observed that the addition of Fe3O4 nanoparticles reduced the resistance impedance of the blood flow and the temperature distribution through bell-shaped stenosed arteries as compared to TiO4 and Cu nanoparticles. On entering the stenosed area, the blood temperature increases slightly, then increases considerably and reaches its maximum value in the stenosis throat. The shear stress varies from a constant in the area without stenosis and is higher in the layers located far from the longitudinal axis of the artery. This fact can be important for some clinical applications in therapeutic procedures.
Keywords: nanoparticles, blood flow, stenosed artery, mathematical models
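The fractional operator named above has a standard closed form; for reference (the usual Caputo-Fabrizio definition for 0 < α < 1, with M(α) a normalization function such that M(0) = M(1) = 1):

```latex
{}^{CF}\!D_t^{\alpha} f(t)
  = \frac{M(\alpha)}{1-\alpha} \int_0^{t} f'(s)\,
    \exp\!\left(-\frac{\alpha\,(t-s)}{1-\alpha}\right) ds
```

The exponential kernel is non-singular at s = t, which is the property the abstract highlights, and the ordinary integer-order derivative is recovered in the limit α → 1.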
Procedia PDF Downloads 267
23582 Improved Soil and Snow Treatment with the Rapid Update Cycle Land-Surface Model for Regional and Global Weather Predictions
Authors: Tatiana G. Smirnova, Stan G. Benjamin
Abstract:
The Rapid Update Cycle (RUC) land surface model (LSM) was a land-surface component in several generations of operational weather prediction models at the National Centers for Environmental Prediction (NCEP) at the National Oceanic and Atmospheric Administration (NOAA). It was designed for short-range weather predictions with an emphasis on severe weather, and it was originally intentionally simple to avoid uncertainties from poorly known parameters. Nevertheless, the RUC LSM, when coupled with the hourly-assimilating atmospheric model, can produce a realistic evolution of time-varying soil moisture and temperature, as well as the evolution of snow cover on the ground surface. This result is possible only if the soil/vegetation/snow component of the coupled weather prediction model has sufficient skill to avoid long-term drift. The RUC LSM was first implemented in the operational NCEP Rapid Update Cycle (RUC) weather model in 1998 and later in the Weather Research and Forecasting (WRF) Model-based Rapid Refresh (RAP) and High-Resolution Rapid Refresh (HRRR). Being available to the international WRF community, it was implemented in operational weather models in Austria, New Zealand, and Switzerland. Based on feedback from the US weather service offices and the international WRF community, and also based on our own validation, the RUC LSM has matured over the years. A sea-ice module was also added to the RUC LSM for surface predictions over Arctic sea ice. Other modifications include refinements to the snow model and a more accurate specification of albedo, roughness length, and other surface properties. At present, the RUC LSM is being tested in the regional application of the Unified Forecast System (UFS). The next-generation UFS-based regional Rapid Refresh FV3 Standalone (RRFS) model will replace the operational RAP and HRRR at NCEP. Over time, the RUC LSM has participated in several international model intercomparison projects to verify its skill using observed atmospheric forcing. ESM-SnowMIP was the last of these experiments, focused on the verification of snow models for open and forested regions. The simulations were performed for ten sites located in different climatic zones of the world, forced with observed atmospheric conditions. While most of the 26 participating models have more sophisticated snow parameterizations than RUC, the RUC LSM received a high ranking in simulations of both snow water equivalent and surface temperature. However, the ESM-SnowMIP experiment also revealed some issues in the RUC snow model, which will be addressed in this paper. One of them is the treatment of grid cells partially covered with snow. The RUC snow module computes the energy and moisture budgets of snow-covered and snow-free areas separately and aggregates the solutions at the end of each time step. Such treatment elevates the importance of the snow cover fraction computed in the model. Improvements to the original simplistic threshold-based approach have been implemented and tested both offline and in the coupled weather model. A detailed description of the changes to the snow cover fraction and other modifications to the RUC soil and snow parameterizations will be given in this paper.
Keywords: land-surface models, weather prediction, hydrology, boundary-layer processes
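The partial-snow aggregation described above can be written compactly: for any surface budget term X (a flux or a state update), the grid-cell value is weighted by the snow cover fraction f_s (a sketch of the weighting, not the model's exact code):

```latex
X_{\text{cell}} = f_s \, X_{\text{snow}} + (1 - f_s)\, X_{\text{snow-free}},
\qquad 0 \le f_s \le 1
```

This is why an accurate f_s matters: any bias in the fraction propagates directly into the aggregated energy and moisture budgets.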
Procedia PDF Downloads 88
23581 Simulation of the Visco-Elasto-Plastic Deformation Behaviour of Short Glass Fibre Reinforced Polyphthalamides
Authors: V. Keim, J. Spachtholz, J. Hammer
Abstract:
The importance of fibre-reinforced plastics continually increases due to their excellent mechanical properties and low material and manufacturing costs combined with significant weight reduction. Today, components are usually designed and calculated numerically using finite element methods (FEM) to avoid expensive laboratory tests. These programs are based on material models that include material-specific deformation characteristics. In this research project, material models for short glass fibre reinforced plastics are presented to simulate the visco-elasto-plastic deformation behaviour. Prior to modelling, specimens of the material EMS Grivory HTV-5H1, consisting of a polyphthalamide matrix reinforced by 50 wt.-% of short glass fibres, are characterized experimentally in terms of the highly time-dependent deformation behaviour of the matrix material. To minimize the experimental effort, the cyclic deformation behaviour under tensile and compressive loading (R = −1) is characterized by isothermal complex low cycle fatigue (CLCF) tests. By combining cycles at two strain amplitudes, strain rates spanning three orders of magnitude, and relaxation intervals into one experiment, the visco-elastic deformation is characterized. To identify visco-plastic deformation, monotonic tensile tests, either displacement-controlled or strain-controlled (CERT), are compared. All relevant modelling parameters for this complex superposition of simultaneously varying mechanical loadings are quantified by these experiments. Subsequently, two different material models are compared with respect to their accuracy in describing the visco-elasto-plastic deformation behaviour. First, an extended 12-parameter model (EVP-KV2) based on Chaboche is used to model cyclic visco-elasto-plasticity at two time scales. The parameters of the model, which includes a total separation of elastic and plastic deformation, are obtained by computational optimization using a genetic algorithm, an evolutionary algorithm based on a fitness function. Second, the 12-parameter visco-elasto-plastic material model by Launay is used. In detail, this model contains a different type of flow function, based on the definition of the visco-plastic deformation as a part of the overall deformation. The accuracy of the models is verified by corresponding experimental LCF testing.
Keywords: complex low cycle fatigue, material modelling, short glass fibre reinforced polyphthalamides, visco-elasto-plastic deformation
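For orientation, the Chaboche family of models referenced above typically combines a Norton-type viscoplastic flow rule with nonlinear kinematic hardening of the Armstrong-Frederick type (standard textbook forms; the 12-parameter EVP-KV2 variant used in the study differs in detail):

```latex
\dot{p} = \left\langle \frac{J(\boldsymbol{\sigma}-\boldsymbol{X}) - R - k}{K} \right\rangle^{n},
\qquad
\dot{\boldsymbol{X}}_i = \tfrac{2}{3} C_i\, \dot{\boldsymbol{\varepsilon}}_p - \gamma_i \boldsymbol{X}_i\, \dot{p},
\qquad
\boldsymbol{X} = \sum_i \boldsymbol{X}_i
```

Here p is the accumulated plastic strain, X the back stress, J(·) an equivalent stress measure, and C_i, γ_i, K, n, k, R material parameters of the kind identified from the CLCF and CERT tests.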
Procedia PDF Downloads 215
23580 Experimental Parameters’ Effects on the Electrical Discharge Machining Performances (µEDM)
Authors: Asmae Tafraouti, Yasmina Layouni, Pascal Kleimann
Abstract:
The growing market for Microsystems (MST) and Micro-Electromechanical Systems (MEMS) is driving the search for manufacturing techniques that are alternatives to microelectronics-based technologies, which are generally expensive and time-consuming. Hot embossing and micro-injection molding of thermoplastics appear to be industrially viable processes. However, both require the use of master models, usually made of hard materials such as steel. These master models cannot be fabricated using standard microelectronics processes. Thus, other micromachining processes are used, such as laser machining or micro-electrical discharge machining (µEDM). In this work, µEDM has been used. The principle of µEDM is based on the use of a thin cylindrical micro-tool that erodes the workpiece surface. The two electrodes are immersed in a dielectric, separated by a distance of a few micrometers (the gap). When an electrical voltage is applied between the two electrodes, electrical discharges are generated, which cause material removal. In order to produce master models with high resolution and smooth surfaces, it is necessary to control the discharge mechanism well. However, several problems are encountered, such as the randomness of the electrical discharge process, the fluctuation of the discharge energy, the inversion of the electrodes' polarity, and the wear of the micro-tool. The effect of different parameters, such as the applied voltage, the working capacitor, the micro-tool diameter, and the initial gap, has been studied. This analysis helps to improve the machining performances, such as the workpiece surface condition and the lateral crater gap.
Keywords: craters, electrical discharges, micro-electrical discharge machining (µEDM), microsystems
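In a capacitor-based (RC-type) discharge circuit of the kind implied by the "applied voltage" and "working capacitor" parameters above, a standard first-order estimate (not a measured value from the study) bounds the energy released per discharge by the energy stored in the capacitor:

```latex
E_{\max} = \tfrac{1}{2}\, C\, U^{2}
```

so reducing either the working capacitance C or the applied voltage U reduces the crater size, typically at the cost of a lower material removal rate.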
Procedia PDF Downloads 96
23579 Statistical Modeling of Mobile Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes
Authors: Jihad S. Daba, J. P. Dubois
Abstract:
Understanding the statistics of non-isotropic scattering multipath channels that fade randomly with respect to time, frequency, and space in a mobile environment is crucial for the accurate detection of received signals in wireless and cellular communication systems. In this paper, we derive stochastic models for the probability density function (PDF) of the shift in the carrier frequency caused by the Doppler effect on the received illuminating signal in the presence of a dominant line of sight. Our derivation is based on a generalized Clarke model and a two-wave partially developed scattering model, where the statistical distribution of the frequency shift is shown to be consistent with the power spectral density of the Doppler-shifted signal.
Keywords: Doppler shift, filtered Poisson process, generalized Clarke model, non-isotropic scattering, partially developed scattering, Rician distribution
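As the isotropic baseline that the generalized model extends, the classical Clarke model gives a closed-form density for the Doppler shift f over (−f_d, f_d), with f_d = v/λ the maximum Doppler frequency (the standard result; the non-isotropic and line-of-sight cases derived in the paper modify it):

```latex
p(f) = \frac{1}{\pi \sqrt{f_d^{2} - f^{2}}}, \qquad |f| < f_d
```

Its shape corresponds to the familiar U-shaped (Jakes) Doppler spectrum; a dominant line-of-sight component adds a discrete spectral line at f_d cos θ₀, where θ₀ is the angle of arrival of the direct path.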
Procedia PDF Downloads 372
23578 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing
Authors: Yehjune Heo
Abstract:
As biometric systems become widely deployed, identification systems can easily be attacked with various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the types of loss functions and optimizers. The types of CNNs used in this paper include AlexNet, VGGNet, and ResNet. By using various loss functions, including cross-entropy, center loss, cosine proximity, and hinge loss, and various optimizers, including Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We found that choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. Using a subset of the LivDet 2017 database, we validate our approach by comparing generalization power. It is important to note that the same subset of LivDet is used across all training and testing for each model. This way, we can compare the performance, in terms of generalization, on the unseen data across all the different models. The best CNN (AlexNet) with the appropriate loss function and optimizer results in more than 3% performance gain over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports the parameter counts and mean average error rates of the models, to find the model that consumes the least memory and computation time for training and testing. Although AlexNet has less complexity than the other CNN models, it proved to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and should run very fast with high anti-spoofing performance. For our deployed version on smartphones, additional processing steps, such as quantization and pruning algorithms, have been applied to our final model.
Keywords: anti-spoofing, CNN, fingerprint recognition, loss function, optimizer
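A hedged sketch of the loss/optimizer sweep described above (illustrative Keras code on a small placeholder CNN, not the authors' implementation; center loss is omitted because it needs a custom layer, and the dataset loading is left as a stub):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def make_cnn(num_classes=2):
    # small stand-in network; the paper uses AlexNet / VGGNet / ResNet
    return models.Sequential([
        layers.Input((64, 64, 1)),
        layers.Conv2D(16, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"), layers.GlobalAveragePooling2D(),
        layers.Dense(num_classes, activation="softmax"),
    ])

losses = ["categorical_crossentropy", "hinge", "cosine_similarity"]
optimizers = ["adam", "sgd", "rmsprop", "adadelta", "adagrad", "nadam"]

results = {}
for loss in losses:
    for opt in optimizers:
        model = make_cnn()
        model.compile(optimizer=opt, loss=loss, metrics=["accuracy"])
        # history = model.fit(x_train, y_train, validation_data=(x_val, y_val))
        # results[(loss, opt)] = history.history["val_accuracy"][-1]
```

Holding the architecture and data split fixed while sweeping only the (loss, optimizer) pair is what isolates their contribution to generalization, which is the comparison the paper makes.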
Procedia PDF Downloads 136
23577 Identifying and Quantifying Factors Affecting Traffic Crash Severity under Heterogeneous Traffic Flow
Authors: Praveen Vayalamkuzhi, Veeraragavan Amirthalingam
Abstract:
Studies of safety on highways are becoming the need of the hour, as over 400 lives are lost every day in India due to road crashes. In order to evaluate the factors that lead to different levels of crash severity, it is necessary to investigate the level of safety of highways and its relation to crashes. In the present study, an attempt is made to identify the factors that contribute to road crashes and to quantify their effect on the severity of road crashes. The study was carried out on a four-lane divided rural highway in India. The variables considered in the analysis include components of the horizontal alignment of the highway (straight or curved section); time of day; driveway density; presence of a median; median openings; gradient; operating speed; and annual average daily traffic. These variables were selected after a preliminary analysis. The major complexities in the study are the heterogeneous traffic and the speed variation between different classes of vehicles along the highway. To quantify the impact of each of these factors, statistical analyses were carried out using logit models and negative binomial regression. The output of the statistical models showed that the horizontal alignment components, driveway density, time of day, operating speed, and annual average daily traffic have a significant relation with crash severity (fatal as well as injury crashes). Further, the annual average daily traffic has a more significant effect on severity than the other variables. The contribution of the highway's horizontal components to crash severity is also significant. Logit models can predict crashes better than the negative binomial regression models. The results of the study will help transport planners to consider these aspects at the planning stage itself for highways operated under heterogeneous traffic flow conditions.
Keywords: geometric design, heterogeneous traffic, road crash, statistical analysis, level of safety
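A hedged sketch of the two model families named above (illustrative statsmodels code; `df`, the variable names, and the severity coding are assumptions, not the study's actual data):

```python
import statsmodels.api as sm
import statsmodels.formula.api as smf

# df: assumed pandas DataFrame of crash records, one row per observation
logit = smf.logit(
    "severe ~ curved_section + driveway_density + night + operating_speed + aadt",
    data=df,
).fit()
print(logit.summary())          # coefficient signs quantify each factor's effect

negbin = smf.glm(
    "crash_count ~ curved_section + driveway_density + night + operating_speed + aadt",
    data=df,
    family=sm.families.NegativeBinomial(),
).fit()
print(negbin.summary())
```

The logit targets the probability of a severe outcome per crash, while the negative binomial targets crash counts with overdispersion; comparing their fit is how the study concludes the logit predicts better here.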
Procedia PDF Downloads 302
23576 A Non-Parametric Based Mapping Algorithm for Use in Audio Fingerprinting
Authors: Analise Borg, Paul Micallef
Abstract:
Over the past few years, online multimedia collections have grown at a fast pace. Several companies have shown interest in studying ways to organize this amount of audio information without the need for human intervention to generate metadata. In the past few years, many applications have emerged on the market that are capable of identifying a piece of music in a short time. Different audio effects and degradations make it much harder to identify the unknown piece. In this paper, an audio fingerprinting system that makes use of a non-parametric based algorithm is presented. Parametric analysis is also performed using Gaussian Mixture Models (GMMs). The feature extraction methods employed are the Mel spectrum coefficients and the MPEG-7 basic descriptors. Bin numbers replaced the extracted feature coefficients during the non-parametric modelling. The results show that the non-parametric analysis offers promising results, comparable to the ones mentioned in the literature.
Keywords: audio fingerprinting, mapping algorithm, Gaussian Mixture Models, MFCC, MPEG-7
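A hedged sketch of the parametric (GMM) side described above: per-track Gaussian mixtures over MFCC frames (illustrative code; the file names are assumptions, and the paper's non-parametric bin-mapping step is not shown):

```python
import librosa
from sklearn.mixture import GaussianMixture

y, sr = librosa.load("reference_track.wav", sr=22050)
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T    # frames x 13 coefficients

# one mixture model per reference track acts as its statistical "fingerprint"
gmm = GaussianMixture(n_components=8, covariance_type="diag",
                      random_state=0).fit(mfcc)

y_q, _ = librosa.load("query_snippet.wav", sr=22050)
mfcc_q = librosa.feature.mfcc(y=y_q, sr=sr, n_mfcc=13).T
print("mean log-likelihood of query:", gmm.score(mfcc_q))  # rank tracks by this
```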
Procedia PDF Downloads 421
23575 A False Introduction: Teaching in a Pandemic
Authors: Robert Michael, Kayla Tobin, William Foster, Rachel Fairchild
Abstract:
The COVID-19 pandemic has caused significant disruptions in education, particularly in the teaching of health and physical education (HPE). This study examined a cohort of teachers who were preservice and then first-year teachers during various stages of the pandemic. Qualitative data were collected by interviewing six teachers from different schools in the Eastern U.S. over a series of structured interviews. Thematic analysis was employed to analyze the data. The pandemic significantly impacted the way HPE was taught as schools shifted to virtual and hybrid models. The findings revealed five major themes: (a) You want me to teach HOW?, (b) PE without equipment and six feet apart, (c) Behind the Scenes, (d) They're back…I became a behavior management guru, and (e) The Pandemic Crater. Overall, this study highlights the significant challenges faced by preservice and first-year teachers in teaching physical education during the pandemic and underscores the need for ongoing support and resources to help them adapt and succeed in these challenging circumstances.
Keywords: teacher education, preservice teachers, first year teachers, health and physical education
Procedia PDF Downloads 185
23574 Towards an Enhanced Compartmental Model for Profiling Malware Dynamics
Authors: Jessemyn Modiini, Timothy Lynar, Elena Sitnikova
Abstract:
We present a novel enhanced compartmental model for malware spread analysis in cyber security. This paper applies cyber security data features to epidemiological compartmental models to model the infectious potential of malware. Compartmental models are most efficient for calculating the infectious potential of a disease. In this paper, we discuss and profile epidemiologically relevant data features from a Domain Name System (DNS) dataset. We then apply these features, together with network traffic features, to epidemiological compartmental models. This paper demonstrates how epidemiological principles can be applied to the novel analysis of key cybersecurity behaviours and trends, and provides insight into threat modelling beyond that of kill-chain analysis. In applying deterministic compartmental models to a cyber security use case, the authors analyse their deficiencies and provide an enhanced stochastic model for cyber epidemiology. This enhanced compartmental model (the SUEICRN model) is contrasted with the traditional SEIR model to demonstrate its efficacy.
Keywords: cybersecurity, epidemiology, cyber epidemiology, malware
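For reference, the traditional SEIR baseline against which the SUEICRN model is contrasted is the standard system of ODEs (textbook form; the authors' enhanced model adds further compartments and stochasticity):

```latex
\frac{dS}{dt} = -\frac{\beta S I}{N}, \qquad
\frac{dE}{dt} = \frac{\beta S I}{N} - \sigma E, \qquad
\frac{dI}{dt} = \sigma E - \gamma I, \qquad
\frac{dR}{dt} = \gamma I
```

with β the transmission rate, σ the rate at which exposed hosts become infectious, γ the recovery (remediation) rate, and N = S + E + I + R the total host population.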
Procedia PDF Downloads 107
23573 Application of Fourier Series Based Learning Control on Mechatronic Systems
Authors: Sandra Baßler, Peter Dünow, Mathias Marquardt
Abstract:
A Fourier series based learning control (FSBLC) algorithm for tracking trajectories of mechanical systems with unknown nonlinearities is presented. Two processes are introduced to which the FSBLC with a PD controller is applied. One is a simplified service robot capable of climbing stairs due to special wheels, and the other is a propeller-driven pendulum with nearly the same control requirements. In addition to investigating the learning of the feedforward for the desired trajectories, some considerations are made on the implementation of such an algorithm on low-cost microcontroller hardware. Simulations of the service robot as well as practical experiments on the pendulum show the capability of the FSBLC algorithm to improve the control behavior for repetitive tasks of such mechanical systems.
Keywords: climbing stairs, FSBLC, ILC, service robot
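A heavily hedged sketch of one plausible construction of the idea above (a generic Fourier-series learning update, not the authors' algorithm): the feedforward is a truncated Fourier series over the trial period, and after each repetition its coefficients are corrected by projecting the recorded tracking error onto each harmonic.

```python
import numpy as np

T, N, gamma = 2.0, 5, 0.5              # trial period [s], harmonics, learning gain
t = np.linspace(0.0, T, 400, endpoint=False)
a = np.zeros(N)                         # cosine coefficients of the feedforward
b = np.zeros(N)                         # sine coefficients of the feedforward
w0 = 2.0 * np.pi / T

def u_ff(ts):
    """Feedforward evaluated at times ts, added to the PD controller output."""
    return sum(a[k] * np.cos((k + 1) * w0 * ts) +
               b[k] * np.sin((k + 1) * w0 * ts) for k in range(N))

def learn_from_trial(error):
    """Update Fourier coefficients from the tracking error of the last trial."""
    for k in range(N):
        a[k] += gamma * (2.0 / T) * np.trapz(error * np.cos((k + 1) * w0 * t), t)
        b[k] += gamma * (2.0 / T) * np.trapz(error * np.sin((k + 1) * w0 * t), t)
```

On a microcontroller, the projection integrals reduce to running sums accumulated per sample, which is one reason this class of algorithm suits low-cost hardware.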
Procedia PDF Downloads 314
23572 A Hazard Rate Function for the Time of Ruin
Authors: Sule Sahin, Basak Bulut Karageyik
Abstract:
This paper introduces a hazard rate function for the time of ruin in order to calculate the conditional probability of ruin over very small intervals. We call this function the force of ruin (FoR). We obtain the expected time of ruin and the conditional expected time of ruin from the exact finite-time ruin probability with exponential claim amounts. We then introduce the FoR, which gives the conditional probability of ruin given that ruin has not occurred at time t. We analyse the behavior of the FoR function for different initial surpluses over a specific time interval. We also obtain the FoR under an excess-of-loss reinsurance arrangement and examine the effect of reinsurance on the FoR.
Keywords: conditional time of ruin, finite time ruin probability, force of ruin, reinsurance
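Following the hazard-rate analogy above, if ψ(u, t) denotes the finite-time ruin probability for initial surplus u, the force of ruin takes the generic form of a hazard rate for the ruin time (a sketch consistent with the description, not necessarily the authors' exact notation):

```latex
\mathrm{FoR}(t) = \frac{\partial \psi(u,t) / \partial t}{1 - \psi(u,t)},
\qquad
\mathrm{FoR}(t)\, dt \approx
\Pr\!\big(\text{ruin in } (t, t+dt] \mid \text{no ruin by } t\big)
```

in direct analogy to the force of mortality in life contingencies.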
Procedia PDF Downloads 406
23571 Practical Modelling of RC Structural Walls under Monotonic and Cyclic Loading
Authors: Reza E. Sedgh, Rajesh P. Dhakal
Abstract:
Shear walls have been used extensively as the main lateral force resisting systems in multi-storey buildings. The recent development of performance-based design urges practicing engineers to conduct nonlinear static or dynamic analysis to evaluate the seismic performance of multi-storey shear wall buildings by employing distinct analytical models suggested in the literature. For practical purposes, the application of macroscopic models to simulate the global and local nonlinear behavior of structural walls outweighs that of microscopic models. The required skill level, the computational time and the limited access to specialized RC finite element packages prevent the general application of the latter method in performance-based design or assessment of multi-storey shear wall buildings in design offices. Hence, this paper sets out to verify the capability of the nonlinear shell element in a commercially available package (SAP2000) in simulating the results of specimens under monotonic and cyclic loads with the rather simplified cyclic material laws available in the analytical tool. The selection of constitutive models, the determination of the related parameters of the constituent material, and an appropriate nonlinear shear model are presented in detail. Adoption of the proposed simple model demonstrated that the predicted results follow the overall trend of the experimental force-displacement curve. Although the prediction of the ultimate strength and the overall shape of the hysteresis model agreed to some extent with experiment, the prediction of the ultimate displacement (the point of significant strength degradation) remains challenging in some cases.
Keywords: analytical model, nonlinear shell element, structural wall, shear behavior
Procedia PDF Downloads 404
23570 Simplified Modeling of Post-Soil Interaction for Roadside Safety Barriers
Authors: Charly Julien Nyobe, Eric Jacquelin, Denis Brizard, Alexy Mercier
Abstract:
The performance of roadside safety barriers depends largely on the dynamic interactions between post and soil. These interactions play a key role in the response of barriers to crash testing. In the literature, soil-post interaction is modeled in crash test simulations using three approaches. Many researchers have initially used the finite element approach, in which the post is embedded in a continuum soil modelled by solid finite elements. This is the most comprehensive and detailed approach, employing a mesh-based continuum to model the soil's behavior and its interaction with the post. Although it takes all soil properties into account, it is nevertheless very costly in terms of simulation time. In the second approach, all the points of the post located at a predefined depth are fixed. Although this approach reduces CPU computing time, it overestimates the soil-post stiffness. The third approach involves modeling the post as a beam supported by a set of nonlinear springs in the horizontal directions; for support in the vertical direction, the posts are constrained at a node at ground level. This approach is less costly, but the literature does not provide a simple procedure to determine the constitutive law of the springs. The aim of this study is to propose a simple and low-cost procedure to obtain the constitutive law of the nonlinear springs that model the soil-post interaction. To achieve this objective, we first present a procedure to obtain the constitutive law of the nonlinear springs from the simulation of a soil compression test. The test consists of compressing the soil contained in a tank with a rigid solid, up to a vertical displacement of 200 mm. The resultant force exerted by the ground on the rigid solid and its vertical displacement are extracted, and a force-displacement curve is determined. The proposed procedure for replacing the soil with springs must be tested against a reference model. The reference model consists of a wooden post embedded in the ground and struck by an impactor. Two simplified models with springs are studied. In the first model, called the Kh-Kv model, the springs are attached to the post in the horizontal and vertical directions. The second, the Kh model, is the one described in the literature. The two simplified models are compared with the reference model according to several criteria: the displacement of a node located at the top of the post in the vertical and horizontal directions, the displacement of the post's center of rotation, and the impactor velocity. The results given by both simplified models are very close to the reference model results. It is noticeable that the Kh-Kv model is slightly better than the Kh model. Further, the former is more interesting than the latter as it involves fewer arbitrary conditions. The simplified models also reduce the simulation time by a factor of 4. The Kh-Kv model can therefore be used as a reliable tool to represent the soil-post interaction in future research and development of road safety barriers.
Keywords: crash tests, nonlinear springs, soil-post interaction modeling, constitutive law
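A minimal sketch of the last step of the proposed procedure (illustrative code; the force-displacement values are placeholders, not the paper's compression-test results):

```python
import numpy as np

# force-displacement curve extracted from the simulated soil compression test
disp_mm = np.array([0.0, 25.0, 50.0, 100.0, 150.0, 200.0])
force_kN = np.array([0.0, 3.0, 5.0, 8.0, 10.0, 11.0])

def spring_force(u_mm):
    """Nonlinear spring law F(u) to assign to the beam-on-springs post model."""
    return np.interp(u_mm, disp_mm, force_kN)

print(spring_force(75.0))   # interpolated soil reaction at 75 mm
```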
Procedia PDF Downloads 30
23569 Increases in Serum Erythropoietin Hormone in Recreational Breath-Hold Divers Following a Series of Repeated Apnoeas: Apnoea beyond Freediving
Authors: Antonis Elia, Theo Loizou, Gladys Onambele-Pearson, Matthew Barlow, Georgina Stebbings
Abstract:
Hypoxic conditions have been reported to enhance red blood cell production in both acclimatised lowlanders and altitude-adapted populations. This process is mediated by the hormone erythropoietin, which is released predominantly by the hypoxic kidney. A higher haemoglobin concentration was previously reported in elite breath-hold divers when compared to elite skiers and untrained individuals. Therefore, the present study aimed to investigate whether apnoea-induced hypoxia could induce a significant increase in serum erythropoietin concentration in recreational breath-hold divers, which would provide an explanation for the higher haemoglobin levels observed in elite breath-hold divers. Identifying whether apnoea-induced hypoxia induces a significant increase in serum erythropoietin might suggest that apnoea can be used as an alternative acclimatisation method to high-altitude exposure. Seven healthy, recreational male breath-hold divers performed two sets of five 180-second breath-holds with a ten-minute supine rest between each set and a two-minute seated rest between each apnoea. During each breath-hold, the participant's heart rate and peripheral oxygen saturation were recorded every 10 seconds until the end of the 180-second breath-hold. After each 180-second breath-hold, a capillary blood sample was collected from the finger to identify circulating haemoglobin levels. Following completion of the apnoeic protocol, three blood samples were collected at 30, 90 and 180 minutes to measure circulating erythropoietin levels. A significant interaction between erythropoietin and time was observed (F(3,18) = 4.72, p < 0.001), with significant increases in erythropoietin evident at 30 (t(6) = -5.035, p < 0.05), 90 (t(6) = -6.162, p < 0.05) and 180 (t(6) = -7.232, p < 0.001) minutes post the last apnoea when compared to baseline. The corresponding average increases compared to baseline were 16% at 30, 23% at 90 and 40% at 180 minutes post the last apnoea. A significant interaction between haemoglobin and time was observed (F(78,84) = 20.814, p < 0.001), with significant increases in haemoglobin evident at the fifth (t(29) = -1.124, p < 0.001), ninth (t(29) = -1.357, p < 0.001) and tenth (t(29) = -1.211, p < 0.05) apnoeas when compared to baseline. A significant interaction between peripheral oxygen saturation and time was also observed (F(10,60) = 408.23, p < 0.001). The present study demonstrates that a series of ten 180-second breath-holds is sufficient to induce a significant increase in the circulating erythropoietin concentration of recreational breath-hold divers. These observations may suggest that apnoea-induced hypoxia could be used as an alternative acclimatisation method to high-altitude exposure.
Keywords: apnoea, breath-holding, diving reflex, erythropoietin, haemoglobin
Procedia PDF Downloads 180
23568 An Inquiry of the Impact of Flood Risk on Housing Market with Enhanced Geographically Weighted Regression
Authors: Lin-Han Chiang Hsieh, Hsiao-Yi Lin
Abstract:
This study aims to determine the impact of the disclosure of the flood potential map on housing prices. The disclosure is supposed to mitigate market failure by reducing information asymmetry. On the other hand, opponents argue that official disclosure of simulated results will only create unnecessary disturbances in the housing market. This study identifies the impact of the disclosure of the flood potential map by comparing the hedonic price of flood potential before and after the disclosure. The flood potential map used in this study was published by the Taipei municipal government in 2015 and is the result of a comprehensive simulation based on geographical, hydrological, and meteorological factors. The residential property sales data from 2013 to 2016 are used in this study, collected from the actual sales price registration system of the Department of Land Administration (DLA). The results show that the impact of flood potential on the residential real estate market is statistically significant both before and after the disclosure. But the trend is clearer after the disclosure, suggesting that the disclosure does have an impact on the market. The results also show that the impact of flood potential differs by the severity and frequency of precipitation: the negative impact of a relatively mild, high-frequency flood potential is stronger than that of a heavy, low-probability flood potential. This indicates that home buyers are more concerned with the frequency than with the intensity of flooding. Another contribution of this study is methodological. Classic hedonic price analysis with OLS regression suffers from two spatial problems: the endogeneity problem caused by omitted spatially-related variables, and the heterogeneity problem arising from the presumption that regression coefficients are spatially constant. These two problems are seldom considered in a single model. This study deals with the endogeneity and heterogeneity problems together by combining the spatial fixed-effect model and geographically weighted regression (GWR). A series of studies indicates that the hedonic price of certain environmental assets varies spatially when GWR is applied. Since the endogeneity problem is usually not considered in typical GWR models, it is arguable that omitted spatially-related variables might bias the results of GWR models. By combining the spatial fixed-effect model and GWR, this study concludes that the effect of the flood potential map is highly sensitive to location, even after controlling for spatial autocorrelation at the same time. The main policy implication of this result is that it is improper to determine the potential benefit of a flood prevention policy by simply multiplying the hedonic price of flood risk by the number of houses. The effect of flood prevention might vary dramatically by location.
Keywords: flood potential, hedonic price analysis, endogeneity, heterogeneity, geographically-weighted regression
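The GWR specification referred to above lets each coefficient vary with the location (u_i, v_i) of observation i (the standard form; the study augments it with spatial fixed effects):

```latex
y_i = \beta_0(u_i, v_i) + \sum_{k} \beta_k(u_i, v_i)\, x_{ik} + \varepsilon_i
```

with the local coefficients estimated by weighted least squares, the weights decaying with distance d_ij under a kernel such as w_ij = exp(−d_ij² / h²) for bandwidth h.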
Procedia PDF Downloads 290
23567 On Periodic Integer-Valued Moving Average Models
Authors: Aries Nawel, Bentarzi Mohamed
Abstract:
This paper deals with the study of some probabilistic and statistical properties of a Periodic Integer-Valued Moving Average model (PINMA_S(q)). Closed forms of the mean, the second moment and the periodic autocovariance function are obtained. Furthermore, the time reversibility of the model is discussed in detail. Moreover, the underlying parameters are estimated by the Yule-Walker method, the Conditional Least Squares (CLS) method and the Weighted Conditional Least Squares (WCLS) method. A simulation study is carried out to evaluate the performance of the estimation methods. Finally, an application to a real data set is provided.
Keywords: periodic integer-valued moving average, periodically correlated process, time reversibility, count data
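Integer-valued MA models of this type are commonly built on the binomial thinning operator; a periodic-coefficient form consistent with the abstract (a generic sketch, as the paper's exact parameterization may differ) is:

```latex
X_t = \varepsilon_t + \sum_{i=1}^{q} \theta_{i,t} \circ \varepsilon_{t-i},
\qquad
\theta \circ \varepsilon = \sum_{j=1}^{\varepsilon} B_j,\;\;
B_j \overset{iid}{\sim} \mathrm{Bernoulli}(\theta),
\qquad
\theta_{i,t+S} = \theta_{i,t}
```

where (ε_t) is a non-negative integer-valued innovation sequence and S is the period of the coefficients, so that the process is periodically correlated rather than stationary.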
Procedia PDF Downloads 202
23566 Sensitive Analysis of the ZF Model for ABC Multi Criteria Inventory Classification
Authors: Makram Ben Jeddou
Abstract:
The ABC classification is widely used by managers for inventory control. The classical ABC classification is based on the Pareto principle and uses the annual use value as the only criterion. Single-criterion classification is often insufficient for close inventory control. Multi-criteria inventory classification models have been proposed by researchers in order to take other important criteria into account. Among these models, we consider the ZF model in order to perform a sensitivity analysis on the composite score calculated for each item. In fact, this score, based on a normalized average between a good and a bad optimized index, can affect the ABC classification of items. We then focus on the weights assigned to each index and propose a classification compromise.
Keywords: ABC classification, multi criteria inventory classification models, ZF-model
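In the ZF (Zhou-Fan) model, each item i receives a good index G_i (scored under its most favorable weights) and a bad index B_i (under its least favorable weights), which are combined after normalization into the composite score targeted by the sensitivity analysis above (a common statement of the index; λ = 0.5 in the original proposal):

```latex
Z_i = \lambda\, \frac{G_i - G_{\min}}{G_{\max} - G_{\min}}
      + (1-\lambda)\, \frac{B_i - B_{\min}}{B_{\max} - B_{\min}},
\qquad 0 \le \lambda \le 1
```

Varying λ away from 0.5 is precisely the kind of weight perturbation whose effect on the A/B/C boundaries a sensitivity analysis measures.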
Procedia PDF Downloads 508
23565 Bayesian Meta-Analysis to Account for Heterogeneity in Studies Relating Life Events to Disease
Authors: Elizabeth Stojanovski
Abstract:
Associations between life events and various forms of cancer have been identified. The purpose of a recent random-effects meta-analysis was to identify studies that examined the association between breast cancer risk and adverse life events associated with changes in financial status, including decreased income. The same association was studied in four separate studies, whose characteristics, such as study design, location and time frame, were not consistent between studies. It was of interest to pool information from the various studies to help identify characteristics that differentiated the study results. Two random-effects Bayesian meta-analysis models are proposed to combine the reported estimates of the described studies. The proposed models allow major sources of variation to be taken into account, including study-level characteristics, between-study variance, and within-study variance, and illustrate the ease with which uncertainty can be incorporated using a hierarchical Bayesian modelling approach.
Keywords: random-effects, meta-analysis, Bayesian, variation
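A generic hierarchical form of the random-effects structure described above (a sketch of the usual Bayesian meta-analysis skeleton, not the authors' exact specification) is:

```latex
y_i \mid \theta_i \sim \mathcal{N}(\theta_i,\, s_i^{2}), \qquad
\theta_i = \mu + \boldsymbol{x}_i^{\top}\boldsymbol{\beta} + u_i, \qquad
u_i \sim \mathcal{N}(0,\, \tau^{2})
```

where y_i and s_i² are the reported effect estimate and within-study variance of study i, x_i holds study-level characteristics (e.g., design, location, time frame), τ² is the between-study variance, and μ, β, τ receive weakly informative priors.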
Procedia PDF Downloads 160