Search results for: packet loss probability estimation
6199 Voice over IP Quality of Service Evaluation for Mobile Ad Hoc Network in an Indoor Environment for Different Voice Codecs
Authors: Lina Abou Haibeh, Nadir Hakem, Ousama Abu Safia
Abstract:
In this paper, the performance and quality of Voice over IP (VoIP) calls carried over a Mobile Ad Hoc Network (MANET) with a number of SIP nodes registered on a SIP proxy are analyzed. The testing campaigns are carried out in an indoor corridor structure with well-defined channel characteristics and models for the different voice codecs G.711, G.727, and G.723.1, which are commonly used in VoIP technology. The calls' quality is evaluated using four Quality of Service (QoS) metrics, namely mean opinion score (MOS), jitter, delay, and packet loss. The relationship between the wireless channel's parameters and the optimum codec is well established. According to the experimental results, the voice codec G.711 has the best performance for the proposed MANET topology.
Keywords: wireless channel modelling, VoIP, MANET, session initiation protocol (SIP), QoS
Procedia PDF Downloads 226
6198 A Hyperexponential Approximation to Finite-Time and Infinite-Time Ruin Probabilities of Compound Poisson Processes
Authors: Amir T. Payandeh Najafabadi
Abstract:
This article considers the problem of evaluating the infinite-time (or finite-time) ruin probability of a given compound Poisson surplus process by approximating the claim size distribution with a finite mixture of exponentials, i.e., a hyperexponential distribution. It restates the infinite-time (or finite-time) ruin probability as a solvable ordinary differential equation (or a partial differential equation). An application of our findings is given through a simulation study.
Keywords: ruin probability, compound Poisson processes, mixture exponential (hyperexponential) distribution, heavy-tailed distributions
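As a plain Monte Carlo counterpoint to the differential-equation approach above, the finite-time ruin probability of a compound Poisson surplus process with hyperexponential claims can be estimated by simulating surplus paths. The sketch below is illustrative only; the function name and parameter choices are ours, not the paper's:

```python
import random

def ruin_probability(u, c, lam, probs, rates, horizon, n_paths=5000, seed=1):
    """Monte Carlo estimate of the finite-time ruin probability for a
    compound Poisson surplus process U(t) = u + c*t - S(t), with
    hyperexponential (mixture-of-exponentials) claim sizes."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, surplus = 0.0, float(u)
        while True:
            wait = rng.expovariate(lam)      # time to next claim arrival
            t += wait
            if t > horizon:
                break                        # survived the horizon
            surplus += c * wait              # premium income since last claim
            # draw a hyperexponential claim: pick a component, then sample it
            x, k = rng.random(), 0
            for k, p in enumerate(probs):
                x -= p
                if x <= 0:
                    break
            surplus -= rng.expovariate(rates[k])
            if surplus < 0:                  # ruin occurs at a claim instant
                ruined += 1
                break
    return ruined / n_paths
```

With a single exponential component the mixture degenerates to the classical exponential-claims case, for which ψ(0) = 1/(1+θ) is known in closed form and can be used as a sanity check.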
Procedia PDF Downloads 341
6197 Reservoir Properties Effect on Estimating Initial Gas in Place Using Flowing Material Balance Method
Authors: Yousef S. Kh. S. Hashem
Abstract:
Accurate estimation of initial gas in place (IGIP) plays an important role in the decision to develop a gas field. One of the methods available in the industry to estimate IGIP is material balance. This method requires that the well be shut in while pressure is measured as it builds up to the average reservoir pressure. Since gas demand is high and shut-in well surveys are very expensive, flowing gas material balance (FGMB) is sometimes used instead. This work investigated the effect of reservoir properties (pressure, permeability, and reservoir size) on the estimation of IGIP when using FGMB. A gas reservoir simulator that accounts for friction loss, wellbore storage, and the non-Darcy effect was used to simulate 165 different possible cases (3 pressures, 5 reservoir sizes, and 11 permeabilities). Both tubing pressure and bottom-hole pressure were analyzed using FGMB. The results showed that the FGMB method is very sensitive for tight reservoirs (k < 10). They also showed which method is best to use for different reservoir properties. This study can be used as a guideline for the application of the FGMB method.
Keywords: flowing material balance, gas reservoir, reserves, gas simulator
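The core of any material-balance IGIP estimate is the straight-line relation p/z = (p/z)ᵢ(1 − Gp/G): extrapolating the fitted line to p/z = 0 gives the initial gas in place G. A minimal sketch of that extrapolation step (the data values below are hypothetical, and the flowing-pressure corrections the paper studies are omitted):

```python
def igip_from_pz(gp, p_over_z):
    """Material-balance straight line: p/z = (p/z)_i * (1 - Gp/G).
    Fit p/z vs. cumulative production Gp by least squares and return
    the Gp-axis intercept, which is the initial gas in place G."""
    n = len(gp)
    mean_x = sum(gp) / n
    mean_y = sum(p_over_z) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(gp, p_over_z)) \
            / sum((x - mean_x) ** 2 for x in gp)
    intercept = mean_y - slope * mean_x
    return -intercept / slope   # Gp at which p/z extrapolates to zero
```

In the flowing variant, the same fit is applied to flowing (tubing or bottom-hole) pressures after correcting them toward average reservoir pressure, which is where the sensitivity to tight reservoirs enters.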
Procedia PDF Downloads 151
6196 Mobility Management via Software Defined Networks (SDN) in Vehicular Ad Hoc Networks (VANETs)
Authors: Bilal Haider, Farhan Aadil
Abstract:
A Vehicular Ad hoc Network (VANET) provides various services to end-users traveling on the road at high speeds. However, this high-speed mobility of mobile nodes can cause frequent service disruptions. Various mobility management protocols exist for managing node mobility, but due to their centralized nature, they tend to suffer in the VANET environment. In this research, we propose a distributed mobility management protocol using software-defined networks (SDN) for VANETs. Instead of relying on a centralized mobility anchor, the mobility functionality is distributed across multiple infrastructure nodes. The protocol is based on the classical Proxy Mobile IP version 6 (PMIPv6). It is evident from simulation results that this work improves network performance with respect to node throughput, delay, and packet loss.
Keywords: SDN, VANET, mobility management, optimization
Procedia PDF Downloads 169
6195 An Insight into the Probabilistic Assessment of Reserves in Conventional Reservoirs
Authors: Sai Sudarshan, Harsh Vyas, Riddhiman Sherlekar
Abstract:
The oil and gas industry has been unwilling to adopt a stochastic definition of reserves. Nevertheless, Monte Carlo simulation methods have gained acceptance among engineers, geoscientists, and other professionals who want to evaluate prospects or otherwise analyze problems that involve uncertainty. One of the common applications of Monte Carlo simulation is the estimation of recoverable hydrocarbons from a reservoir. Monte Carlo simulation makes use of random samples of parameters or inputs to explore the behavior of a complex system or process. It finds application whenever one needs to make an estimate, forecast, or decision where there is significant uncertainty. First, the project focuses on performing Monte Carlo simulation on a given data set using the U.S. Department of Energy's MonteCarlo software, a freeware E&P tool. Further, an algorithm for the simulation has been developed in MATLAB; the program performs the simulation by prompting the user for input distributions and the parameters associated with each distribution (i.e., mean, standard deviation, minimum, maximum, most likely, etc.). It also prompts the user for the desired probability at which reserves are to be calculated. The algorithm developed and tested in MATLAB was then implemented in Python, where existing libraries for statistics and graph plotting were imported to generate better outcomes. With Qt Designer (PyQt), code for a simple graphical user interface was also written. The plotted graph is then validated against results available from the U.S. DOE MonteCarlo software.
Keywords: simulation, probability, confidence interval, sensitivity analysis
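The workflow described above (sample each volumetric input from a distribution, multiply, and read off percentiles) can be sketched in a few lines. The distributions and ranges below are purely illustrative assumptions, not the paper's data set:

```python
import random

def reserves_percentiles(n=20000, seed=42):
    """Monte Carlo sketch of recoverable-volume estimation: sample each
    volumetric input from an assumed distribution, multiply, and read
    percentiles off the resulting sample (all ranges are hypothetical)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        area = rng.triangular(100.0, 300.0, 180.0)    # low, high, most likely
        thickness = rng.triangular(10.0, 50.0, 25.0)
        porosity = rng.uniform(0.10, 0.25)
        recovery = rng.uniform(0.15, 0.35)
        samples.append(area * thickness * porosity * recovery)
    samples.sort()
    def percentile(p):   # P90 = conservative value, exceeded 90% of the time
        return samples[int((1.0 - p / 100.0) * (n - 1))]
    return percentile(90), percentile(50), percentile(10)
```

The ordering P90 < P50 < P10 follows the industry convention that P90 is the conservative (high-confidence) estimate.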
Procedia PDF Downloads 381
6194 Using Indigenous Games to Demystify Probability Theorem in Ghanaian Classrooms: Mathematical Analysis of Ampe
Authors: Peter Akayuure, Michael Johnson Nabie
Abstract:
Similar to many colonized nations in the world, one indelible mark left by the colonial masters after Ghana's independence in 1957 is that many contexts used to teach statistics and probability concepts are alien and do not resonate with the social domain of the indigenous Ghanaian child. This has seriously limited the understanding, discovery, and application of mathematics for national development. With the recent curriculum demand of making the Ghanaian child mathematically literate, this qualitative study involved video recordings and mathematical analysis of play sessions of an indigenous girls' game called Ampe, with the aim of demystifying the concepts of probability theory, which are applied in mathematics-related fields of study. The mathematical analysis shows that the game of Ampe, which is widely played by schoolgirls in Ghana, is suitable for learning concepts of probability theory. It was also revealed that, as a girls' game, the use of Ampe provides good lessons for educators, textbook writers, and teachers to rethink the selection of mathematics tasks and learning contexts that are sensitive to gender. As we undertake to transform teacher education and student learning, the use of indigenous games should be critically revisited.
Keywords: Ampe, mathematical analysis, probability theorem, Ghanaian girl game
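A classroom-style simulation of a single Ampe round can make the probability concrete. Under the simplifying assumption (ours, not necessarily the paper's exact model) that each player independently thrusts the left or right foot with equal probability and the leader scores when the feet match, the theoretical scoring probability is 1/2:

```python
import random

def ampe_leader_score_probability(trials=20000, seed=0):
    """Simulate rounds of Ampe under a simplifying assumption: each of
    the two players independently thrusts the left or right foot with
    equal probability, and the leader scores when the feet match.
    The empirical frequency should approach the theoretical 1/2."""
    rng = random.Random(seed)
    wins = sum(rng.choice("LR") == rng.choice("LR") for _ in range(trials))
    return wins / trials
```

Pupils can vary the assumption (e.g. a player who favors one foot) and watch the empirical frequency move away from 1/2.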
Procedia PDF Downloads 369
6193 An Assessment of Vegetable Farmers' Perceptions about Post-harvest Loss Sources in Ghana
Authors: Kofi Kyei, Kenchi Matsui
Abstract:
Loss of vegetable products has been a major constraint in the post-harvest chain. Sources of post-harvest loss in the vegetable industry range from the time of harvesting to handling and the various market centers. Identifying vegetable farmers' perceptions about post-harvest loss sources is one way of addressing this issue. In this paper, we assessed farmers' perceptions about sources of post-harvest losses in the Ashanti Region of Ghana. We also identified the factors that influence their perceptions. To clearly understand farmers' perceptions, we selected the Sekyere-Kumawu District, one of the major producers of vegetables in the region. Based on a questionnaire survey, 100 vegetable farmers growing tomato, pepper, okra, cabbage, and garden egg were purposively selected from five communities in the Sekyere-Kumawu District. For farmers' perceptions, a five-point Likert scale was employed. On a scale from 1 (no loss) to 5 (extremely high loss), we processed the scores for each vegetable harvest. To clarify the factors influencing farmers' perceptions, Pearson correlation analysis was used. Our findings revealed that farmers perceive post-harvest loss by pest infestation as the most extreme loss. However, vegetable farmers did not perceive loss during transportation as a serious source of post-harvest loss. The Pearson correlation analysis further revealed that farmers' age, gender, level of education, and years of experience influenced their perceptions. This paper then discusses some recommendations to minimize post-harvest loss in the region.
Keywords: Ashanti Region, pest infestation, post-harvest loss, vegetable farmers
Procedia PDF Downloads 181
6192 The Effects of Some Organic Amendments on Sediment Yield, Splash Loss, and Runoff of Soils of Selected Parent Materials in Southeastern Nigeria
Authors: Leonard Chimaobi Agim, Charles Arinzechukwu Igwe, Emmanuel Uzoma Onweremadu, Gabreil Osuji
Abstract:
Soil erosion has been linked to stream sedimentation, ecosystem degradation, and loss of soil nutrients. A study was conducted to evaluate the effect of some organic amendments on the sediment yield, splash loss, and runoff of soils of selected parent materials in southeastern Nigeria. A total of 20 locations, five from each of four parent materials, namely Asu River Group (ARG), Bende Ameki Group (BAG), Coastal Plain Sand (CPS), and Falsebedded Sandstone (FBS), were used for the study. Collected soil samples were analyzed with standard methods for the initial soil properties. Rainfall simulation at an intensity of 190 mm hr⁻¹ was conducted for 30 minutes on the soil samples both at the initial stage and after amendment to obtain the erosion parameters. The influence of parent material on sediment yield, splash loss, and runoff based on rainfall simulation was tested using one-way analysis of variance, while the organic materials and their combinations were factorially fitted into a randomized complete block design. The organic amendments comprised goat droppings (GD), poultry droppings (PD), municipal solid waste (MSW), and their combination (COA), applied at four rates of 0, 10, 20, and 30 t ha⁻¹, respectively. Data were analyzed using analysis of variance suitable for a factorial experiment. Significant means were separated using LSD at the 5% probability level. Results showed significantly (p ≤ 0.05) lower values of sediment yield, splash loss, and runoff following amendment. For instance, organic amendment reduced sediment yield under wet and dry runs by 12.91% and 26.16% in Ishiagu, 40.76% and 45.67% in Bende, 16.17% and 50% in Obinze, and 22.80% and 42.35% in Umulolo, respectively. Goat droppings and the combination of amendments gave the best results in reducing sediment yield.
Keywords: organic amendment, parent material, rainfall simulation, soil erosion
Procedia PDF Downloads 342
6191 Channel Estimation Using Deep Learning for Reconfigurable Intelligent Surfaces-Assisted Millimeter Wave Systems
Authors: Ting Gao, Mingyue He
Abstract:
Reconfigurable intelligent surfaces (RISs) are expected to be an important part of next-generation wireless communication networks due to their potential to reduce the hardware cost and energy consumption of millimeter-wave (mmWave) massive multiple-input multiple-output (MIMO) technology. However, owing to the lack of signal processing ability at the RIS, perfect channel state information (CSI) in RIS-assisted communication systems is difficult to acquire. In this paper, uplink channel estimation for mmWave systems with a hybrid active/passive RIS architecture is studied. Specifically, a deep learning-based estimation scheme is proposed to estimate the channel between the RIS and the user. In particular, the sparse structure of the mmWave channel is exploited to formulate channel estimation as a sparse reconstruction problem. To this end, the proposed approach is trained to obtain the distribution of non-zero entries in the sparse channel. After that, the channel is reconstructed using the least-squares (LS) algorithm and compressed sensing (CS) theory. The simulation results demonstrate that the proposed channel estimation scheme is superior to existing solutions even in low signal-to-noise ratio (SNR) environments.
Keywords: channel estimation, reconfigurable intelligent surface, wireless communication, deep learning
Procedia PDF Downloads 149
6190 Modeling of Glycine Transporters in Mammals Using the Probability Approach
Authors: K. S. Zaytsev, Y. R. Nartsissov
Abstract:
Glycine is one of the key inhibitory neurotransmitters in the central nervous system (CNS), and glycinergic transmission is highly dependent on its appropriate reuptake from the synaptic cleft. Glycine transporters (GlyT) of types 1 and 2 are the proteins providing glycine transport back into neuronal and glial cells along with Na⁺ and Cl⁻ co-transport. The distribution and stoichiometry of GlyT1 and GlyT2 differ in their details, and GlyT2 is the more interesting for this research, as it reuptakes glycine into neurons, whereas GlyT1 is located in glial cells. During GlyT2 activity, the translocation of the amino acid is accompanied by the consecutive binding of one chloride and three sodium ions (two sodium ions for GlyT1). In the present study, we developed a computer simulator of GlyT2 and GlyT1 activity, based on known experimental data, for the quantitative estimation of membrane glycine transport. The functioning of a single protein was described using the probability approach, where each transporter state is considered separately. The resulting scheme of transporter functioning, realized as a sequence of elementary steps, takes into account each event of substrate association and dissociation. Computer experiments using up-to-date kinetic parameters yield the number of translocated glycine molecules, Na⁺ ions, and Cl⁻ ions per time period. The flexibility of the developed software makes it possible to evaluate the glycine reuptake pattern over time under different internal characteristics of the conformational transitions. We investigated the behavior of the system over a wide range of the equilibrium constant (from 0.2 to 100), which has not been determined experimentally. A significant influence of the equilibrium constant on the glycine transfer process is shown in the range from 0.2 to 10; when the constant lies outside this range, environmental conditions such as ion and glycine concentrations become decisive.
Keywords: glycine, inhibitory neurotransmitters, probability approach, single protein functioning
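The state-by-state probability approach described above can be illustrated with a toy continuous-time Markov chain of a transporter cycle. The three states and all rate constants below are hypothetical stand-ins for the paper's much richer kinetic scheme:

```python
import random

def glycine_translocations(rates, t_end=1.0, seed=7):
    """Continuous-time Markov chain sketch of a transporter cycle.
    States: 0 = outward-facing empty, 1 = substrate bound,
    2 = inward-facing; the 2 -> 0 step releases one glycine inside.
    All rates (per second) are hypothetical."""
    rng = random.Random(seed)
    transitions = {0: [(1, rates["bind"])],
                   1: [(0, rates["unbind"]), (2, rates["translocate"])],
                   2: [(0, rates["release"])]}
    state, t, carried = 0, 0.0, 0
    while True:
        options = transitions[state]
        total = sum(r for _, r in options)
        t += rng.expovariate(total)          # exponential dwell time
        if t > t_end:
            return carried
        x = rng.random() * total             # pick next state by its rate
        for nxt, r in options:
            x -= r
            if x <= 0:
                break
        if state == 2 and nxt == 0:          # one glycine delivered inside
            carried += 1
        state = nxt
```

Sweeping one rate (e.g. the bound-state equilibrium between unbinding and translocation) while counting translocations per second mirrors the equilibrium-constant sweep reported in the abstract.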
Procedia PDF Downloads 117
6189 Preliminary Results on a Maximum Mean Discrepancy Approach for Seizure Detection
Authors: Boumediene Hamzi, Turky N. AlOtaiby, Saleh AlShebeili, Arwa AlAnqary
Abstract:
We introduce a data-driven method for seizure detection drawing on recent progress in machine learning. The method is based on embedding probability measures in a high- (or infinite-) dimensional reproducing kernel Hilbert space (RKHS), where the Maximum Mean Discrepancy (MMD) is computed. The MMD is a metric between probability measures, computed as the distance between the means of the measures after they are embedded in the RKHS. Working in an RKHS provides a convenient, general functional-analytic framework for the theoretical understanding of data. We apply this approach to the problem of seizure detection.
Keywords: kernel methods, maximum mean discrepancy, seizure detection, machine learning
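For concreteness, a biased (V-statistic) estimator of the squared MMD between two one-dimensional samples can be written directly from kernel evaluations. The RBF kernel and the bandwidth `gamma` below are illustrative choices, not taken from the paper:

```python
import math

def mmd_squared(x, y, gamma=1.0):
    """Biased (V-statistic) estimate of squared MMD between two 1-D
    samples using an RBF kernel k(a, b) = exp(-gamma * (a - b)^2):
    MMD^2 = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)]."""
    def k(a, b):
        return math.exp(-gamma * (a - b) ** 2)
    kxx = sum(k(a, b) for a in x for b in x) / len(x) ** 2
    kyy = sum(k(a, b) for a in y for b in y) / len(y) ** 2
    kxy = sum(k(a, b) for a in x for b in y) / (len(x) * len(y))
    return kxx + kyy - 2.0 * kxy
```

In a detection setting, windows of EEG features whose MMD against a reference (non-seizure) sample exceeds a threshold would be flagged.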
Procedia PDF Downloads 237
6188 A New Criterion Using Pose and Shape of Objects for Collision Risk Estimation
Authors: DoHyeung Kim, DaeHee Seo, ByungDoo Kim, ByungGil Lee
Abstract:
As much recent research in aviation and maritime domains shows, strong doubts have been raised concerning the reliability of collision risk estimation. It has been shown that using only the position and velocity of objects can lead to imprecise results. In this paper, therefore, a new approach to the estimation of collision risk using the pose and shape of objects is proposed. Simulation results are presented validating the accuracy of the new criterion when adapted to a fuzzy-logic-based collision risk algorithm.
Keywords: collision risk, pose, shape, fuzzy logic
Procedia PDF Downloads 528
6187 Classic Training of a Neural Observer for Estimation Purposes
Authors: R. Loukil, M. Chtourou, T. Damak
Abstract:
This paper investigates the training of a multilayer neural network using the classic approach. Then, for estimation purposes, we suggest the use of a specific neural observer, whose back-propagation training algorithm we study both when the state is available and when the state is unmeasurable. A MATLAB simulation example is studied to highlight the usefulness of this kind of observer.
Keywords: training, estimation purposes, neural observer, back-propagation, unmeasurable state
Procedia PDF Downloads 573
6186 Off-Line Parameter Estimation for the Induction Motor Drive System
Authors: Han-Woong Ahn, In-Gun Kim, Hyun-Seok Hong, Dong-Woo Kang, Ju Lee
Abstract:
It is important to accurately identify machine parameters for direct vector control. To obtain the parameter values, traditional methods such as no-load and locked-rotor tests can be used. However, there are many differences between the values obtained from these traditional tests and the actual values. In addition, they have the drawback that additional equipment and cost are required for the experiment, making it difficult to estimate induction motor parameters in a temporary operation. Therefore, this paper deals with an estimation algorithm for induction motor parameters that requires neither motor operation nor measurements from additional equipment such as sensors and a dynamometer. The validity and usefulness of the estimation algorithm, which accounts for inverter nonlinearity, are verified by comparing the conventional method with the proposed method.
Keywords: induction motor, parameter, off-line estimation, inverter nonlinearity
Procedia PDF Downloads 527
6185 Efficient Model Selection in Linear and Non-Linear Quantile Regression by Cross-Validation
Authors: Yoonsuh Jung, Steven N. MacEachern
Abstract:
The check loss function is used to define quantile regression. In cross-validation, it is also employed as a validation function when the underlying truth is unknown. However, our empirical study indicates that validation with the check loss often leads to choosing overfitted models. In this work, we suggest a modified, L2-adjusted check loss, which rounds the sharp corner in the middle of the check loss and thereby guards against overfitting to some extent. Through various simulation settings of linear and non-linear regression, the improvement of the check loss by the L2 adjustment is empirically examined. The adjustment is devised to shrink to zero as the sample size grows.
Keywords: cross-validation, model selection, quantile regression, tuning parameter selection
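One standard way to "round the sharp corner" is to replace the check loss on a small interval [−δ, δ] with the quadratic that matches the linear pieces in value and slope at ±δ. The sketch below is our reconstruction of that idea, not necessarily the paper's exact adjustment; the corner width `delta` plays the role of the quantity that shrinks to zero with sample size:

```python
def check_loss(r, tau):
    """Standard quantile check loss: rho_tau(r) = r * (tau - 1{r < 0})."""
    return tau * r if r >= 0 else (tau - 1.0) * r

def adjusted_check_loss(r, tau, delta=0.1):
    """L2-adjusted check loss: on [-delta, delta] the corner at r = 0 is
    replaced by the quadratic r^2/(4*delta) + (tau - 1/2)*r + delta/4,
    which matches the check loss in value and slope at r = +/-delta."""
    if abs(r) <= delta:
        return r * r / (4.0 * delta) + (tau - 0.5) * r + delta / 4.0
    return check_loss(r, tau)
```

Because the adjusted loss agrees with the check loss outside [−δ, δ] and is strictly positive at r = 0, small residuals of an overfitted model are still penalized during validation.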
Procedia PDF Downloads 437
6184 Deep Learning-Based Channel Estimation for RIS-Assisted Unmanned Aerial Vehicle-Enabled Wireless Communication System
Authors: Getaneh Berie Tarekegn
Abstract:
Wireless communication via unmanned aerial vehicles (UAVs) has drawn a great deal of attention due to its flexibility in establishing line-of-sight (LoS) communications. However, in complex urban and dynamic environments, the movement of UAVs can be blocked by trees and high-rise buildings that obstruct directional paths. With reconfigurable intelligent surfaces (RIS), this problem can be effectively addressed. To achieve this goal, accurate channel estimation in RIS-assisted UAV-enabled wireless communications is crucial. This paper proposes an accurate channel estimation model using long short-term memory (LSTM) for a multi-user RIS-assisted UAV-enabled wireless communication system. According to simulation results, LSTM can improve the channel estimation performance of RIS-assisted UAV-enabled wireless communication.
Keywords: channel estimation, reconfigurable intelligent surfaces, long short-term memory, unmanned aerial vehicles
Procedia PDF Downloads 55
6183 Yield Loss Estimation Using Multiple Drought Severity Indices
Authors: Sara Tokhi Arab, Rozo Noguchi, Tofeal Ahamed
Abstract:
Drought is a natural disaster that occurs in a region due to a lack of precipitation and high temperatures over a continuous period, or in a single season, as a consequence of climate change. Precipitation deficits and prolonged high temperatures mostly affect the agricultural sector, water resources, socioeconomics, and the environment, causing agricultural product loss, food shortage, famine, migration, and natural resource degradation in a region. Since agriculture is the first sector affected by drought, it is important to develop an agricultural drought risk and loss assessment to mitigate the drought impact. In this context, the main purpose of this study was to assess yield loss using a composite drought index (CDI) in the drought-affected vineyards. The CDI was developed for the years 2016 to 2020 by combining five indices: the vegetation condition index (VCI), the temperature condition index (TCI), the deviation of NDVI from the long-term mean (NDVI DEV), the normalized difference moisture index (NDMI), and the precipitation condition index (PCI). Moreover, a quantitative principal component analysis (PCA) approach was used to assign a weight to each input parameter, and the weighted indices were then combined into one composite drought index. Finally, Bayesian regularized artificial neural networks (BRANNs) were used to evaluate the yield variation in each affected vineyard. The composite drought index indicated that moderate to severe droughts occurred across Kabul Province during 2016 and 2018, with no vineyard in extreme drought conditions; we therefore considered only the severe and moderate conditions. The BRANN models gave R = 0.87 and R = 0.94 under severe drought conditions and R = 0.85 and R = 0.91 under moderate drought conditions for 2016 and 2018, respectively. Within the two drought periods, there was a significant yield deficit in the vineyards of Kabul Province: 2018 had the highest rate of loss, almost 7 t/ha, while in 2016 the loss rate was about 1.2 t/ha. This research will support stakeholders in identifying drought-affected vineyards and help farmers during severe drought.
Keywords: grapes, composite drought index, yield loss, satellite remote sensing
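The PCA weighting step (derive a weight for each input index from the first principal component, then combine the indices linearly) can be sketched with power iteration on the covariance matrix of the standardized index columns. The function names and the toy data are ours; the paper's actual index values are remote-sensing products:

```python
def pca_weights(data):
    """First-principal-component weights via power iteration on the
    covariance matrix of standardized columns: a sketch of the PCA
    weighting step for a composite drought index."""
    n, m = len(data), len(data[0])
    cols = list(zip(*data))
    def standardize(c):
        mu = sum(c) / len(c)
        s = (sum((v - mu) ** 2 for v in c) / len(c)) ** 0.5 or 1.0
        return [(v - mu) / s for v in c]
    z = list(zip(*[standardize(c) for c in cols]))          # rows again
    cov = [[sum(z[k][i] * z[k][j] for k in range(n)) / n
            for j in range(m)] for i in range(m)]
    w = [1.0] * m
    for _ in range(200):                                    # power iteration
        w = [sum(cov[i][j] * w[j] for j in range(m)) for i in range(m)]
        norm = sum(v * v for v in w) ** 0.5
        w = [v / norm for v in w]
    total = sum(abs(v) for v in w)
    return [abs(v) / total for v in w]                      # weights sum to 1

def composite_index(row, weights):
    """Weighted combination of one observation's index values into a CDI."""
    return sum(v * w for v, w in zip(row, weights))
```

With five columns (VCI, TCI, NDVI DEV, NDMI, PCI) the returned weights would be the per-index contributions to the CDI.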
Procedia PDF Downloads 156
6182 Conflation Methodology Applied to Flood Recovery
Authors: Eva L. Suarez, Daniel E. Meeroff, Yan Yong
Abstract:
Current flooding risk modeling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being caused by nuisance flooding, and its long-term effects on communities, are not typically included in risk assessments. An approach was developed to address the probability of recovering from a severe flooding event combined with the probability of community performance during a nuisance event. A consolidated model, namely the conflation flooding recovery (&FR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The &FR model assesses the variation contribution of each independent input and generates a weighted output that favors the distribution with minimum variation. This approach is especially useful if the input distributions have dissimilar variances. The &FR is defined as a single distribution resulting from the product of the individual probability density functions. The resulting conflated distribution resides between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation, and civil order, after a flooding event. The &FR model is more accurate than averaging individual observations before calculating the mean and variance, or than averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution. The main disadvantage of these traditional methods is that the resulting measure of central tendency is exactly equal to the average of the input distributions' means, without the additional information provided by each individual distribution's variance. When dealing with exponential distributions, such as resilience from severe flooding events and from nuisance flooding events, conflation results are equivalent to the weighted least squares method or best linear unbiased estimation. The combination of severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as in South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources: severe flooding events and nuisance flooding events.
Keywords: community resilience, conflation, flood risk, nuisance flooding
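The defining property of conflation (the normalized product of the input densities) is easiest to see with normal inputs, where the product is again normal with precision-weighted mean, i.e. exactly the inverse-variance/BLUE weighting the abstract mentions. A minimal illustrative sketch (normal inputs chosen for tractability; the paper works with exponential recovery distributions):

```python
def conflate_normals(means, variances):
    """Conflation of independent normal densities: the normalized
    product is again normal, with combined precision (1/variance)
    and a precision-weighted mean - the BLUE / inverse-variance rule."""
    precisions = [1.0 / v for v in variances]
    var = 1.0 / sum(precisions)
    mean = var * sum(m * p for m, p in zip(means, precisions))
    return mean, var
```

Note how the conflated mean is pulled toward the input with the smaller variance, which is precisely the "favors the distribution with minimum variation" behavior described above.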
Procedia PDF Downloads 102
6181 Signal Estimation and Closed Loop System Performance in Atrial Fibrillation Monitoring with Communication Channels
Authors: Mohammad Obeidat, Ayman Mansour
Abstract:
In this paper, a unique issue arising from feedback control of an atrial fibrillation monitoring system with embedded communication channels is investigated. One of the important factors in measuring the performance of a feedback closed-loop system is the disturbance and noise attenuation factor: it is important that the feedback system can attenuate such disturbances on the atrial fibrillation heart rate signals. Communication channels depend on network traffic conditions and deliver different throughput, implying that the sampling intervals may change. Since the signal estimate is updated on the arrival of new data, its dynamics actually change with the sampling interval. Consequently, the interaction among sampling, signal estimation, and the controller introduces new issues in a remotely controlled atrial fibrillation system. This paper treats a remotely controlled atrial fibrillation system with one communication channel, which connects the heart rate and rhythm measurements to the remote controller. A typical and optimal signal estimation scheme is represented by a signal-averaging filter with its time constant derived from the step size of the signal estimation algorithm.
Keywords: atrial fibrillation, communication channels, closed loop, estimation
Procedia PDF Downloads 378
6180 Apricot Insurance Portfolio Risk
Authors: Kasirga Yildirak, Ismail Gur
Abstract:
We propose a model to measure the hail risk of an agricultural insurance portfolio. Hail is one of the major catastrophic events, causing large losses to insurers; moreover, it is very hard to predict due to its peculiar atmospheric characteristics. We make use of parcel-based claims data on apricot damage collected by the Turkish Agricultural Insurance Pool (TARSIM). As our ultimate aim is to compute the loadings assigned to specific parcels, we build a portfolio risk model that makes use of the PD and the severity of the exposures. The PD is computed by spherical-linear and circular-linear regression models, as the data carry coordinate information and seasonality. Severity is mapped into integer brackets so that the probability generating function can be employed. Individual regressions are run on each cluster, estimated under different criteria. The loss distribution is constructed by the Panjer recursion technique. We also show that the one-risk-one-crop model can easily be extended to a multi-risk, multi-crop model by assuming conditional independence.
Keywords: hail insurance, spherical regression, circular regression, spherical clustering
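The Panjer recursion mentioned above is a short, exact computation once severities are mapped to integer brackets. For a compound Poisson aggregate loss with frequency rate λ and severity pmf f on {1, 2, …}, it reads g(0) = e^(−λ) and g(s) = (λ/s) Σⱼ j·f(j)·g(s−j). A minimal sketch (the function name and inputs are ours):

```python
import math

def panjer_poisson(lam, severity, s_max):
    """Panjer recursion for the aggregate-loss pmf of a compound Poisson
    distribution: frequency ~ Poisson(lam), severity pmf on {1, 2, ...}.
    Returns g[0..s_max], where g[s] = P(aggregate loss = s)."""
    g = [math.exp(-lam)]                       # g(0) = P(no claims)
    for s in range(1, s_max + 1):
        total = 0.0
        for j in range(1, s + 1):
            total += j * severity.get(j, 0.0) * g[s - j]
        g.append(lam * total / s)
    return g
```

With a degenerate severity concentrated at 1, the aggregate loss is just the claim count, so the recursion must reproduce the Poisson pmf, which makes a convenient correctness check.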
Procedia PDF Downloads 251
6179 Probability of Passing the Brake Test at Ministry of Transport Facilities of Each City at Alicante Region from Spain
Authors: Carolina Senabre Blanes, Sergio Valero Verdú, Emilio Velasco Sánchez
Abstract:
The objective of this research is to obtain a percentage of success for the Ministry of Transport (MOT) facilities of each city of the Alicante region, in the Comunidad Valenciana, Spain, by comparing results obtained using different brake testers. We studied which types of brake tester are currently used in each city. Different types of brake testers are in use, and the mechanical engineering staff of the Miguel Hernández University have studied the differences between all of them and obtained measurements from each type. A probability of success is given for each MOT station when attempting to pass the test with the same car, with the same characteristics and the same wheels. In other words, the parameters of the vehicle were controlled to be the same in all tests; therefore, the variability in brake measurements is due to the type of tester used at the MOT station. A probability of passing the brake test in each city is given by comparing test results.
Keywords: brake tester, MOT station, probability to pass the exam, brake tester characteristics
Procedia PDF Downloads 291
6178 Analysis of Cooperative Hybrid ARQ with Adaptive Modulation and Coding on a Correlated Fading Channel Environment
Authors: Ibrahim Ozkan
Abstract:
In this study, a cross-layer design that combines adaptive modulation and coding (AMC) and hybrid automatic repeat request (HARQ) techniques for a cooperative wireless network is investigated analytically. Previous analyses of such systems in the literature are confined to the case where the fading channel is independent at each retransmission, which can be unrealistic unless the channel varies very fast. On the other hand, temporal channel correlation can have a significant impact on the performance of HARQ systems. In this study, utilizing a Markov channel model that accounts for the temporal correlation, the performance of non-cooperative and cooperative networks is investigated in terms of packet loss rate and throughput metrics for the Chase combining HARQ strategy.
Keywords: cooperative network, adaptive modulation and coding, hybrid ARQ, correlated fading
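The standard way to capture temporal correlation at the packet level is a two-state (Gilbert-Elliott) Markov channel: the channel lingers in a Good or Bad state, each with its own loss probability, so losses arrive in bursts rather than independently. A simulation sketch of that model (parameter values in the test are illustrative, and the HARQ retransmission logic itself is omitted):

```python
import random

def packet_loss_rate(p_gb, p_bg, loss_good, loss_bad, n=50000, seed=3):
    """Two-state Gilbert-Elliott Markov channel. p_gb / p_bg are the
    Good->Bad and Bad->Good transition probabilities per packet slot;
    each state has its own loss probability. Returns the empirical
    packet loss rate, whose mean is pi_B*loss_bad + pi_G*loss_good
    with stationary pi_B = p_gb / (p_gb + p_bg)."""
    rng = random.Random(seed)
    state, lost = "G", 0
    for _ in range(n):
        if rng.random() < (loss_good if state == "G" else loss_bad):
            lost += 1
        if state == "G":
            state = "B" if rng.random() < p_gb else "G"
        else:
            state = "G" if rng.random() < p_bg else "B"
    return lost / n
```

Making `p_gb` and `p_bg` small while keeping their ratio fixed leaves the average loss rate unchanged but lengthens the bursts, which is exactly the correlation effect that independent-fading analyses miss.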
Procedia PDF Downloads 143
6177 Mathematics Model Approaching: Parameter Estimation of Transmission Dynamics of HIV and AIDS in Indonesia
Authors: Endrik Mifta Shaiful, Firman Riyudha
Abstract:
Acquired immunodeficiency syndrome (AIDS) is one of the world's deadliest diseases, caused by the human immunodeficiency virus (HIV), which infects white blood cells and causes a decline in the immune system. AIDS quickly became a world epidemic disease that affects almost all countries. Therefore, a mathematical modeling approach to the spread of HIV and AIDS is needed to anticipate their spread. The purpose of this study is to estimate the parameters of a mathematical model of HIV and AIDS transmission using cumulative data on people with HIV and AIDS each year in Indonesia. In this model, there is a parameter r ∈ [0,1) representing the effectiveness of the treatment of patients with HIV. If the value of r is close to 1, the number of people with HIV and AIDS declines toward zero. The estimation results indicate that when the value of r is close to unity, there is a significant decline in HIV patients, whereas the number of AIDS patients steadily decreases toward zero.
Keywords: HIV, AIDS, parameter estimation, mathematical models
Procedia PDF Downloads 248
6176 Multivariate Control Chart to Determine Efficiency Measurements in Industrial Processes
Authors: J. J. Vargas, N. Prieto, L. A. Toro
Abstract:
Control charts are commonly used to monitor processes involving either variable or attribute quality characteristics, and determining the control limits is a critical task for quality engineers seeking to improve those processes. Nonetheless, in some applications it is necessary to include an estimate of efficiency. In this paper, the ability to assess the efficiency of an industrial process was added to a control chart by incorporating a data envelopment analysis (DEA) approach. Specifically, Bayesian estimation was performed to calculate the posterior probability distributions of the parameters, namely the means and the variance-covariance matrix. This technique allows the data set to be analysed without relying on the hypothetical large sample implied in the problem, treating the posterior as an approximation to the finite-sample distribution. A rejection simulation method was carried out to generate random variables from the parameter distributions. Each resulting vector was fed into a stochastic DEA model over several cycles to establish the distribution of the efficiency measures for each DMU (decision-making unit). A control limit was calculated with the resulting model; if a DMU presents a low efficiency level, the system efficiency is declared out of control. The efficiency calculation reaches a global optimum, which ensures model reliability.
Keywords: data envelopment analysis, DEA, multivariate control chart, rejection simulation method
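The rejection simulation step can be sketched generically (this is an illustrative sampler, not the paper's implementation): propose from a uniform distribution over a bounded support and accept in proportion to an unnormalized target density. The density, bounds, and bound constant below are hypothetical.

```python
import random

def rejection_sample(target_pdf, lo, hi, m, n, seed=0):
    """Draw n samples from an unnormalized density target_pdf on [lo, hi]
    by rejection: propose x uniformly, accept with probability
    target_pdf(x) / m, where m bounds target_pdf on the interval."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = rng.uniform(lo, hi)
        if rng.random() < target_pdf(x) / m:
            out.append(x)
    return out
```

Each accepted vector of parameter draws would then be passed to the stochastic DEA model to build the empirical distribution of the efficiency score for each DMU.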
Procedia PDF Downloads 373
6175 Blind Super-Resolution Reconstruction Based on PSF Estimation
Authors: Osama A. Omer, Amal Hamed
Abstract:
Successful blind image super-resolution algorithms require exact estimation of the point spread function (PSF). In the absence of any prior information about the imaging system and the true image, this estimation is normally done by trial-and-error experimentation until an acceptable restored image quality is obtained. Multi-frame blind super-resolution algorithms often suffer from slow convergence and sensitivity to complex noise. This paper presents a super-resolution image reconstruction algorithm based on an estimate of the PSF that yields the optimum restored image quality. The PSF is estimated by the knife-edge method, implemented by measuring the spreading of edges in the reproduced HR image itself during the reconstruction process. The proposed reconstruction approach uses L1-norm minimization and robust regularization based on a bilateral prior to deal with different data and noise models. A series of experimental results shows that the proposed method outperforms previous work robustly and efficiently.
Keywords: blind, PSF, super-resolution, knife-edge, blurring, bilateral, L1 norm
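The knife-edge idea can be sketched in one dimension: differentiate the edge-spread function measured across a blurred step to obtain the line-spread function, whose width estimates the Gaussian PSF width. This is a simplified stand-in for the in-loop measurement described above; the Gaussian PSF assumption and the sigma values are illustrative.

```python
import math

def edge_spread(psf_sigma, n=201):
    """Synthetic edge-spread function: an ideal step blurred by a Gaussian
    PSF of width psf_sigma (a hypothetical 1-D knife-edge scan)."""
    xs = [i - n // 2 for i in range(n)]
    return [0.5 * (1 + math.erf(x / (psf_sigma * math.sqrt(2)))) for x in xs]

def estimate_psf_sigma(esf):
    """Differentiate the ESF to obtain the line-spread function, then
    estimate the Gaussian PSF width as the LSF's standard deviation."""
    lsf = [esf[i + 1] - esf[i] for i in range(len(esf) - 1)]
    total = sum(lsf)
    xs = [i + 0.5 for i in range(len(lsf))]
    mean = sum(x * w for x, w in zip(xs, lsf)) / total
    var = sum(w * (x - mean) ** 2 for x, w in zip(xs, lsf)) / total
    return var ** 0.5
```

Running the estimator over the edge profile of the current HR iterate, as the abstract describes, would give the blur width to plug back into the reconstruction.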
Procedia PDF Downloads 364
6174 The Impact of Malicious Attacks on the Performance of Routing Protocols in Mobile Ad-Hoc Networks
Authors: Habib Gorine, Rabia Saleh
Abstract:
Mobile ad-hoc networks are a special type of wireless network sharing common security requirements with other networks, such as confidentiality, integrity, authentication, and availability, which need to be addressed in order to secure data transfer through the network. Their routing protocols are vulnerable to various malicious attacks, which could have devastating consequences for data security. In this paper, three types of attack, namely selfish, gray hole, and black hole attacks, are applied to the two most important MANET routing protocols, Dynamic Source Routing (DSR) and Ad-hoc On-demand Distance Vector (AODV), in order to analyse and compare their impact on network performance in terms of throughput, average delay, packet loss, and energy consumption, using the NS2 simulator.
Keywords: MANET, wireless networks, routing protocols, malicious attacks, wireless networks simulation
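For reference, the trace-level metrics named above can be computed as follows. The record format is a hypothetical stand-in for a parsed simulator trace, not the paper's actual NS2 trace parser.

```python
def evaluate_trace(records, duration):
    """Compute packet delivery ratio, throughput (bits/s), and average
    end-to-end delay from (send_time, recv_time, size_bytes) tuples,
    where recv_time is None for dropped packets."""
    delivered = [(s, r, b) for s, r, b in records if r is not None]
    pdr = len(delivered) / len(records)
    throughput = sum(b for _, _, b in delivered) * 8 / duration
    avg_delay = sum(r - s for s, r, _ in delivered) / len(delivered)
    return pdr, throughput, avg_delay
```

Comparing these three numbers between the baseline run and each attack scenario is what the throughput/delay/loss comparison in the abstract amounts to.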
Procedia PDF Downloads 319
6173 Seasonal and Species Variations in Incidence of Foetal Loss at the Maiduguri Abattoir in Northern Nigeria
Authors: Abdulrazaq O. Raji, Abba Mohammed, Ibrahim D. Mohammed
Abstract:
This study was conducted to investigate foetal loss among slaughtered livestock species at the Maiduguri abattoir from 2009 to 2013. Records of animals slaughtered monthly and foetuses recovered were collected from the management of the Maiduguri abattoir. Data were subjected to analysis of variance using the General Linear Model of SPSS 13.0, with season, species, and their interaction as fixed factors. The average yearly slaughter at the Maiduguri abattoir was 63,225 animals, with cattle, camel, goat, and sheep accounting for 19,737, 7,374, 19,281, and 17,540 of the total. The corresponding numbers of pregnant animals were 3,117, 839, 2,281, and 2,432, out of a total of 8,522. Thus, cattle, camel, goat, and sheep accounted for 30.87, 11.53, 30.16, and 27.44%, respectively, of the animals slaughtered at the abattoir, and for 35.96, 9.68, 26.31, and 28.05% of the foetal loss. The effects of season and species on foetal loss were significant (P < 0.05). The numbers of pregnant animals slaughtered and foetal losses were higher during the wet season than the dry season. Similarly, foetal loss at the abattoir was highest in the month of May for camel, goat, and sheep, and in August for cattle. Camel was the least slaughtered animal and had the fewest pregnant females. Foetal loss (%) was higher (P < 0.05) for cattle than for the other species. The interaction showed that camel was the least slaughtered species in both seasons and that cattle in the wet season had the highest foetal loss.
Keywords: abattoir, foetal loss, season, species
Procedia PDF Downloads 533
6172 Probability-Based Damage Detection of Structures Using Kriging Surrogates and Enhanced Ideal Gas Molecular Movement Algorithm
Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee
Abstract:
Surrogate models have received increasing attention for detecting structural damage from vibration modal parameters. However, uncertainties in the measured vibration data may lead to false or unreliable output from such models. In this study, an efficient approach based on Monte Carlo simulation is proposed to take the effect of these uncertainties into account when developing a surrogate model. The probability of damage existence (PDE) is calculated from the probability density functions of the undamaged and damaged states. The kriging technique allows one to genuinely quantify the surrogate error and is therefore chosen as the metamodelling technique. An enhanced version of the ideal gas molecular movement (EIGMM) algorithm is used as the main algorithm for model updating. The developed approach is applied to detect simulated damage in numerical models of a 72-bar space truss and a 120-bar dome truss. The simulation results show that the proposed method performs well in probability-based damage detection of structures, with less computational effort than the direct finite element model.
Keywords: probability-based damage detection (PBDD), kriging, surrogate modeling, uncertainty quantification, artificial intelligence, enhanced ideal gas molecular movement (EIGMM)
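The PDE computation can be sketched as follows, under the simplifying assumption that the damage indicator is Gaussian-distributed in each state; the means, noise levels, and sample counts are illustrative, not the paper's surrogate outputs.

```python
import random

def mc_damage_indicator(true_value, noise_std, n, seed):
    """Propagate measurement uncertainty: draw n noisy Monte Carlo
    realizations of a scalar damage indicator (e.g. a frequency shift
    predicted by the surrogate)."""
    rng = random.Random(seed)
    return [rng.gauss(true_value, noise_std) for _ in range(n)]

def probability_of_damage(undamaged, damaged):
    """PDE estimate: fraction of paired Monte Carlo draws in which the
    indicator of the current state exceeds the undamaged baseline."""
    return sum(1 for u, d in zip(undamaged, damaged) if d > u) / len(damaged)
```

A PDE near 0.5 means the two distributions overlap completely (no evidence of damage), while a value near 1 indicates damage with high probability despite measurement noise.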
Procedia PDF Downloads 239
6171 Parameter Estimation via Metamodeling
Authors: Sergio Haram Sarmiento, Arcady Ponosov
Abstract:
Based on appropriate multivariate statistical methodology, we suggest a generic framework for efficient parameter estimation for ordinary differential equations and the corresponding nonlinear models. In this framework, classical linear regression strategies are refined into a nonlinear regression by a locally linear modelling technique known as metamodelling. The approach identifies those latent variables of the given model that accumulate the most information about it among all approximations of the same dimension. The method is applied to several benchmark problems, in particular to the so-called "power-law systems", nonlinear differential equations typically used in Biochemical Systems Theory.
Keywords: principal component analysis, generalized law of mass action, parameter estimation, metamodels
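As an illustration of the linearization that makes power-law systems tractable, a single power-law rate term y = a·x^g becomes linear in log space. This sketch recovers the parameters by ordinary least squares on hypothetical data; it is a toy reduction, not the paper's full metamodelling framework.

```python
import math

def fit_power_law(xs, ys):
    """Estimate (a, g) in y = a * x**g by ordinary least squares in
    log-log space, the linear reduction of a power-law rate term from
    Biochemical Systems Theory."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(xs)
    mx, my = sum(lx) / n, sum(ly) / n
    g = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
    a = math.exp(my - g * mx)
    return a, g
```

On noise-free data the log-linear fit recovers the rate constant and kinetic order exactly, which is why such local linearizations are a natural building block for the latent-variable metamodels described above.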
Procedia PDF Downloads 516
6170 Density-based Denoising of Point Cloud
Authors: Faisal Zaman, Ya Ping Wong, Boon Yian Ng
Abstract:
Point cloud source data for surface reconstruction are usually contaminated with noise and outliers. To overcome this, we present a novel approach using a modified kernel density estimation (KDE) technique with bilateral filtering to remove noisy points and outliers. First, we present a method for estimating the optimal bandwidth of a multivariate KDE using a particle swarm optimization technique, which ensures robust performance of the density estimation. Then, we use the mean-shift algorithm to find the local maxima of the density estimate, which give the centroids of the clusters. Next, we compute the distance of each point from its cluster centroid; points identified as outliers are removed by an automatic thresholding scheme, yielding an accurate and economical point surface. The experimental results show that our approach is comparably robust and efficient.
Keywords: point preprocessing, outlier removal, surface reconstruction, kernel density estimation
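A one-dimensional sketch of the density-based outlier test (omitting the PSO bandwidth selection and mean-shift clustering steps described above; the bandwidth and threshold values are illustrative, not the paper's automatic choices):

```python
import math

def kde_density(points, x, bandwidth):
    """Gaussian kernel density estimate at x (1-D sketch; the paper's
    bandwidth would come from particle swarm optimization)."""
    c = 1 / (math.sqrt(2 * math.pi) * bandwidth * len(points))
    return c * sum(math.exp(-0.5 * ((x - p) / bandwidth) ** 2) for p in points)

def remove_outliers(points, bandwidth, threshold):
    """Keep points whose estimated density exceeds the threshold; isolated
    (low-density) points are treated as outliers and dropped."""
    return [p for p in points if kde_density(points, p, bandwidth) > threshold]
```

Points inside a dense cluster receive mutually reinforcing kernel contributions and survive the threshold, while an isolated point is supported almost only by its own kernel and is removed.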
Procedia PDF Downloads 344