Search results for: forward collision probability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2543

2423 Comparative Study of Soliton Collisions in Uniform and Nonuniform Magnetized Plasma

Authors: Renu Tomar, Hitendra K. Malik, Raj P. Dahiya

Abstract:

Similar to sound waves in air, plasmas support the propagation of ion waves, which evolve into solitary structures when the effects of nonlinearity and dispersion are balanced. Ion acoustic solitary waves have been investigated in detail in homogeneous plasmas, inhomogeneous plasmas, and magnetized plasmas. Ion acoustic solitary waves are also found to reflect from a density gradient or boundary present in the plasma after propagating through it. Another interesting feature of solitary waves is their collision. In the present work, we carry out analytical calculations for the head-on collision of solitary waves in a magnetized plasma which contains dust grains in addition to the ions and electrons. For this, we employ the Poincaré-Lighthill-Kuo (PLK) method. To lowest nonlinear order, the problem of colliding solitary waves leads to KdV (modified KdV) equations and also yields the phase shifts that occur in the interaction. These calculations are carried out for uniform and nonuniform plasmas, and the results on the soliton properties are discussed in detail.
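
For readers who want to see a soliton collision numerically, the following is a minimal Python sketch (not the PLK derivation of the paper) that integrates the standard KdV equation pseudo-spectrally and lets a faster soliton overtake a slower one; the grid size, time step, and soliton parameters are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: integrate u_t + 6 u u_x + u_xxx = 0 pseudo-spectrally and let a
# fast soliton overtake a slow one; after the interaction both reappear with the
# same shapes but shifted phases. Grid and soliton parameters are illustrative.
N, L = 256, 100.0
x = np.linspace(0.0, L, N, endpoint=False)
k = 2.0 * np.pi * np.fft.fftfreq(N, d=L / N)

def soliton(x, c, x0):
    # exact single-soliton solution of amplitude c/2 travelling at speed c
    return 0.5 * c / np.cosh(0.5 * np.sqrt(c) * (x - x0)) ** 2

u = soliton(x, 2.0, 20.0) + soliton(x, 0.5, 40.0)   # fast soliton placed behind a slow one

def rhs(u):
    u_hat = np.fft.fft(u)
    ux = np.real(np.fft.ifft(1j * k * u_hat))
    uxxx = np.real(np.fft.ifft((1j * k) ** 3 * u_hat))
    return -6.0 * u * ux - uxxx

dt, steps = 1e-3, 20000                             # integrate to t = 20 with classical RK4
for _ in range(steps):
    k1 = rhs(u)
    k2 = rhs(u + 0.5 * dt * k1)
    k3 = rhs(u + 0.5 * dt * k2)
    k4 = rhs(u + dt * k3)
    u = u + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

peaks = [i for i in range(1, N - 1) if u[i] > 0.2 and u[i] > u[i - 1] and u[i] > u[i + 1]]
print("soliton peak positions after the overtaking collision:", x[peaks])
```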

Keywords: inhomogeneous magnetized plasma, dust charging, soliton collisions, magnetized plasma

Procedia PDF Downloads 449
2422 Gaussian Probability Density for Forest Fire Detection Using Satellite Imagery

Authors: S. Benkraouda, Z. Djelloul-Khedda, B. Yagoubi

Abstract:

We present a method for the early detection of forest fires from a thermal infrared satellite image, using the image matrix of the probability of belonging. The principle of the method is to compare a theoretical mathematical model to an experimental one. We consider each line of the image matrix as a realization of a non-stationary random process. Since the distribution of pixels in the satellite image is statistically dependent, we divide these lines into small stationary and ergodic intervals in order to characterize the image by an adequate mathematical model. A standard deviation was chosen to generate the random variables, so that each interval behaves naturally like white Gaussian noise. This Gaussian model was selected as the mathematical model representing the large majority of pixels, which can be considered as the image background. Before modeling the image, we applied a few preprocessing steps; the parameters of the theoretical Gaussian model were then extracted from the modeled image, and these parameters are used to calculate the probability that each interval of the modeled image belongs to the theoretical Gaussian model. High-intensity pixels are regarded as foreign to this model, so they receive a low probability, while pixels belonging to the image background receive a high probability. Finally, we present the inverse of the matrix of interval probabilities for better fire detection.
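
As a rough illustration of the belonging-probability idea (not the authors' exact pipeline), the sketch below splits each row of a synthetic thermal image into short intervals, scores each interval against a global Gaussian background model, and inverts the probabilities so that hot anomalies stand out; the image size, intensities, and interval length are assumptions.

```python
import numpy as np

# Hedged sketch: per-interval probability of belonging to a Gaussian background
# model, inverted so that hot anomalies (possible fires) have the highest score.
rng = np.random.default_rng(0)
image = rng.normal(loc=120.0, scale=5.0, size=(64, 256))   # synthetic thermal background
image[30:33, 100:110] += 80.0                               # synthetic hot spot ("fire")

interval = 16                                               # stationary window length
rows, cols = image.shape
prob = np.zeros((rows, cols // interval))

mu, sigma = image.mean(), image.std()                       # theoretical Gaussian model
for r in range(rows):
    for j, c in enumerate(range(0, cols, interval)):
        block = image[r, c:c + interval]
        # log-likelihood of the interval under the background model
        loglik = -0.5 * np.sum(((block - mu) / sigma) ** 2)
        prob[r, j] = np.exp(loglik / interval)              # per-pixel geometric mean

fire_map = 1.0 - prob / prob.max()                          # "inverse" of the probability matrix
print("most suspicious interval (row, column-block):",
      np.unravel_index(fire_map.argmax(), fire_map.shape))
```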

Keywords: forest fire, forest fire detection, satellite image, normal distribution, theoretical gaussian model, thermal infrared matrix image

Procedia PDF Downloads 116
2421 Comparison of Two Online Intervention Protocols on Reducing Habitual Upper Body Postures: A Randomized Trial

Authors: Razieh Karimian, Kim Burton, Mohammad Mehdi Naghizadeh, Maryam Karimian

Abstract:

Introduction: Habitual upper body postures are associated with online learning during the COVID-19 pandemic. This study explored whether adding an exercise routine to an ergonomic advice intervention improves these postures. Methods: In this randomized trial, 42 male adolescent students with a forward head posture were randomly divided into two equal groups, one allocated to ergonomic advice alone and the other to ergonomic advice plus an exercise routine. The angles of forward head, shoulder, and back postures were measured with a photogrammetric profile technique before and after the 8-week intervention period. Findings: During home quarantine, 76% of the students used their mobile phones, while 35% used a table-chair-computer setup for online learning. While significant reductions of the forward head, shoulder, and back angles were found in both groups (P < 0.001), the effect was significantly greater in the exercise group (P < 0.001): forward head, shoulder, and back angles were reduced by some 9, 6, and 5 degrees respectively, compared with 4 degrees for the forward head angle and 2 degrees for the shoulder and back angles with ergonomic advice alone. Conclusion: The exercise routine produced a greater improvement in habitual upper body postures than ergonomic advice alone, a finding that may extend beyond online learning at home.

Keywords: randomized trial, online learning, adolescent, posture, exercise, ergonomic advice

Procedia PDF Downloads 45
2420 'Call Drop': A Problem for Handover Minimizing the Call Drop Probability Using Analytical and Statistical Method

Authors: Anshul Gupta, T. Shankar

Abstract:

In this paper, we analyze call drop in order to provide a good quality of service to the user. By optimizing it, we can increase the coverage area and also reduce the interference and congestion created in a network. Handover is the transfer of a call from one cell site to another during the call. Here we analyze the whole network by two methods: a statistical model and an analytical model. In the statistical model, we collected all the data of a network during the busy hour and over a normal 24-hour period; in the analytical model, we use the equation from which the call drop probability is found. By avoiding unnecessary handovers, we can increase the number of calls per hour. The most important parameter is the coefficient of variation, on which the whole paper is based.
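
To make the statistical quantities concrete, here is a tiny Python sketch computing the coefficient of variation of busy-hour drop counts and a crude empirical drop probability; the sample numbers are invented for illustration and are not the paper's network data.

```python
import numpy as np

# Illustrative sketch only: coefficient of variation (std / mean) of hypothetical
# busy-hour call-drop counts, plus an empirical call-drop probability.
drops_per_hour = np.array([12, 9, 15, 11, 14, 10, 13])       # hypothetical dropped calls
calls_per_hour = np.array([950, 870, 1020, 900, 980, 910, 940])

mean_drops = drops_per_hour.mean()
std_drops = drops_per_hour.std(ddof=1)
cv = std_drops / mean_drops                                   # coefficient of variation
drop_probability = drops_per_hour.sum() / calls_per_hour.sum()

print(f"CV of dropped calls: {cv:.3f}")
print(f"empirical call-drop probability: {drop_probability:.4f}")
```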

Keywords: coefficient of variation, mean, standard deviation, call drop probability, handover

Procedia PDF Downloads 464
2419 Dynamical Characteristics of Interaction between Water Droplet and Aerosol Particle in Dedusting Technology

Authors: Ding Jue, Li Jiahua, Lei Zhidi, Weng Peifen, Li Xiaowei

Abstract:

With the rapid development of modern industry, people have begun to pay attention to the environmental pollution and harm caused by industrial dust. On this basis, a numerical study of dedusting technology for industrial environments was conducted. Dynamic models of multicomponent particle collision and coagulation, breakage, and deposition are developed, and the interaction of water droplets and aerosol particles in a two-dimensional flow field is investigated by the Eulerian-Lagrangian method and the Multi-Monte Carlo method. The effects of droplet size, droplet velocity, and flow field structure on the scavenging efficiency are analyzed. The results show that, under the conditions considered, a droplet size of 30 μm gives the best scavenging efficiency. At an initial droplet speed of 1 m/s, droplets and aerosol particles have more time to interact, so the scavenging efficiency for the particles is better.

Keywords: water droplet, aerosol particle, collision and coagulation, multi-monte carlo method

Procedia PDF Downloads 282
2418 Tectonic Setting of Hinterland and Foreland Basins According to Tectonic Vergence in Eastern Iran

Authors: Shahriyar Keshtgar, Mahmoud Reza Heyhat, Sasan Bagheri, Ebrahim Gholami, Seyed Naser Raiisosadat

Abstract:

Various tectonic interpretations have been presented by different researchers to explain the geological evolution of eastern Iran, but there are still many ambiguities and disagreements about the geodynamic nature of the Paleogene mountain range of eastern Iran. The purpose of this research is to clarify and discuss the tectonic position of the foreland and hinterland regions of eastern Iran from the perspective of sedimentary basins. In the tectonic model of oceanic crust subduction under the Afghan block, the hinterland is located to the east, on the Afghan block, and the foreland is located on the passive margin of the Sistan open ocean in the west. After the collision of the two microcontinents, the foreland basin must be located somewhere on the passive margin of the Lut block. This basin can accumulate thick Paleocene to Oligocene sediments on top of the Cretaceous and older sediments. Thrust faults here will verge towards the west. If we accept the subduction model of the Sistan Ocean under the Lut block, the hinterland is located to the west, towards the Lut block, and the foreland basin is located towards the Sistan Ocean in the east. After the collision of the two microcontinents, the foreland basin with Paleogene sediments should expand over the Sefidaba basin. Thrust faults here will verge towards the east. If we consider the two-sided subduction model of the oceanic crust under both the Lut and Afghan continental blocks, the tectonic position of the foreland and hinterland basins will not change and will be similar to that of the one-sided subduction models. After the collision of the two microcontinents, the foreland basin should develop in the central part of the eastern Iranian orogen. In the oroclinal buckling model, the foreland basin will continue not only in the east and west but also continuously in the north. In this model, since there is practically no collision, the foreland basin does not develop, and the remnants of the Sistan Ocean ophiolites and their deep turbidite sediments appear in the axial part of the mountain range, where the Neh and Khash complexes are located. The structural data from this research along the northern border of the Sistan belt and the Lut block indicate convergence of the tectonic vergence directions towards the interior of the Sistan belt (towards the southwest in the Ahangaran area, towards the south-southeast in the north of Birjand, and towards the southeast in the Sechengi area). According to this research, not only does the general movement of the thrust sheets not follow the linear orogeny models, but the expected active foreland basins have not formed in the places mentioned in eastern Iran. Therefore, these results do not support previous tectonic models for eastern Iran (i.e., rifting of the eastern Iran continental crust and subsequent linear collision of the Lut and Afghan blocks); rather, it seems that the range was produced by oroclinal buckling in the Late Eocene-Oligocene.

Keywords: foreland, hinterland, tectonic vergence, orocline buckling, eastern Iran

Procedia PDF Downloads 34
2417 Probability Model of Motorcyclist Accidents Based on Driver's Personality

Authors: Margareth E. Bolla, Ludfi Djakfar, Achmad Wicaksono

Abstract:

The increase in the number of motorcycle users in Indonesia is in line with the increase in accidents involving motorcycles. Several previous studies have shown that humans are the biggest factor causing accidents, and the driver's personality affects his behavior on the road. This study was conducted to see how a person's personality traits affect the probability of having an accident while driving. The Big Five Inventory (BFI) questionnaire and the Honda Riding Trainer (HRT) simulator were used as measuring tools, and logistic regression analysis was carried out. The results of the descriptive analysis of the respondents' personalities based on the BFI show that the majority of drivers have the dominant trait of neuroticism (34%), while the smallest group consists of drivers with the dominant trait of openness (6%). The percentage of motorists who were not involved in an accident was 54%. The logistic regression analysis yields the following mathematical model: Y = -3.852 - 0.288 X1 + 0.596 X2 + 0.429 X3 - 0.386 X4 - 0.094 X5 + 0.436 X6 + 0.162 X7, where the results of hypothesis testing indicate that the variables openness, conscientiousness, extraversion, agreeableness, neuroticism, history of traffic accidents, and age at starting driving did not have a significant effect on the probability of a motorcyclist being involved in an accident.
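
Read as a logit, the fitted equation converts to an accident probability through the logistic function; the short Python sketch below does this for one hypothetical rider. The mapping of X1-X7 to the listed variables and the example scores are assumptions for illustration, not the paper's data.

```python
import math

# Sketch: apply the fitted logit from the abstract and convert it to a probability.
coef = {                       # assumed mapping of X1..X7 to the variables as listed
    "intercept": -3.852,
    "openness": -0.288,
    "conscientiousness": 0.596,
    "extraversion": 0.429,
    "agreeableness": -0.386,
    "neuroticism": -0.094,
    "accident_history": 0.436,
    "age_started_driving": 0.162,
}

x = {                          # hypothetical BFI scores and rider attributes
    "openness": 3.0, "conscientiousness": 3.5, "extraversion": 2.8,
    "agreeableness": 3.2, "neuroticism": 4.0,
    "accident_history": 1.0, "age_started_driving": 17.0,
}

logit = coef["intercept"] + sum(coef[k] * x[k] for k in x)
p_accident = 1.0 / (1.0 + math.exp(-logit))   # logistic link
print(f"predicted accident probability: {p_accident:.3f}")
```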

Keywords: accidents, BFI, probability, simulator

Procedia PDF Downloads 122
2416 Application of the Reliability Method for the Analysis of the Stability Limit States of Large Concrete Dams

Authors: Mustapha Kamel Mihoubi, Essadik Kerkar, Abdelhamid Hebbouche

Abstract:

According to the randomness of most of the factors affecting the stability of a gravity dam, probability theory is generally used to assess the risk of failure; since there is no clear-cut transition from the stable state to the failed state, the stability failure process is treated as a random event. Controlling the risk of failure is of capital importance and follows from a cross-analysis of the severity of the consequences and the probability of occurrence of identified major accidents, which can pose a significant risk to concrete dam structures. Probabilistic risk analysis models are used to provide a better understanding of the reliability and structural failure of such works, in particular when assessing the stability of large structures exposed to major risk in the event of an accident or breakdown. This work studies the probability of failure of concrete dams through the application of reliability analysis methods, including the methods used in engineering. In our case, level II methods are used via the study of the limit state. Hence, the probability of failure is estimated by analytical methods of the FORM (First Order Reliability Method) and SORM (Second Order Reliability Method) type. By way of comparison, a level III method was also used, which performs a full analysis of the problem by integrating the joint probability density function of the random variables over the safe domain using Monte Carlo simulation. Taking into account the change in stress under the normal, exceptional, and extreme load combinations acting on the dam, the calculation results provide acceptable failure probability values that largely corroborate the theory; indeed, the probability of failure tends to increase with increasing load intensity, causing a significant decrease in strength, especially for combinations of unusual and extreme loads. Shear forces then induce sliding that threatens the reliability of the structure with intolerable values of the probability of failure, especially in the case of increased uplift under a hypothetical failure of the drainage system.
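
As a complement to the FORM/SORM analysis described above, the following Python sketch shows the level III idea in its simplest form: sample the random variables, evaluate a sliding safety factor, and count the fraction of samples with SF < 1. The limit state, distributions, and section values are illustrative assumptions, not the dam studied in the paper.

```python
import numpy as np

# Level-III style Monte Carlo sketch of the sliding limit state: P_f = P(SF < 1).
rng = np.random.default_rng(1)
n = 200_000

phi = rng.normal(35.0, 3.0, n)            # friction angle (deg), assumed distribution
c = rng.lognormal(np.log(0.5), 0.25, n)   # cohesion (MPa), assumed distribution
uplift = rng.normal(0.4, 0.08, n)         # uplift coefficient, assumed distribution

W, H_thrust, area = 1200.0, 450.0, 60.0   # weight (MN), horizontal thrust (MN), base area (m^2), per metre
effective_weight = W * (1.0 - uplift)

sf = (effective_weight * np.tan(np.radians(phi)) + c * area) / H_thrust
p_failure = np.mean(sf < 1.0)
print(f"estimated sliding failure probability: {p_failure:.2e}")
```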

Keywords: dam, failure, limit state, monte-carlo, reliability, probability, sliding, Taylor

Procedia PDF Downloads 299
2415 Evaluation of DNA Paternity Testing Accuracy of Child Trafficking Cases

Authors: Wing Kam Fung, Kexin Yu

Abstract:

Child trafficking has been a serious problem in modern China. The Chinese government has established a national anti-trafficking DNA database to help reunite missing children with their families. The database collects DNA information from missing children's parents and from trafficked and homeless children, and then conducts paternity tests to find matched pairs. This paper considers the matching accuracy in such cases by looking into the exclusion probability in paternity testing. First, the situation of child trafficking in China is introduced. Next, derivations of the exclusion probability for both the one-parent and two-parent cases are given, followed by an extension to allow for one or two mutations. The accuracy of paternity testing in child trafficking cases is then assessed using the exclusion probabilities and available data. Finally, the number of loci that should be used to ensure a correct match is investigated.
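
A Monte Carlo sketch of the single-locus exclusion probability in the one-parent (mother typed) case is given below; the allele frequencies are hypothetical rather than the database values used in the paper, and the multi-locus combination assumes independent loci.

```python
import numpy as np

# Hedged Monte Carlo sketch: the chance that a random non-father shares no allele
# with the child's possible paternal alleles, given the mother's genotype.
rng = np.random.default_rng(2)
freqs = np.array([0.30, 0.25, 0.20, 0.15, 0.10])      # hypothetical allele frequencies
alleles = np.arange(len(freqs))
draw = lambda: rng.choice(alleles, size=2, p=freqs)   # one random genotype

n_trials, excluded = 50_000, 0
for _ in range(n_trials):
    mother, father, random_man = draw(), draw(), draw()
    child = (rng.choice(mother), rng.choice(father))   # Mendelian transmission
    # child alleles that could have come from the father, given the mother's genotype
    possible_paternal = {a for i, a in enumerate(child) if child[1 - i] in mother}
    if possible_paternal.isdisjoint(random_man):       # no shared allele -> exclusion
        excluded += 1

pe_locus = excluded / n_trials
print(f"single-locus exclusion probability ≈ {pe_locus:.3f}")
print(f"combined over 20 independent loci ≈ {1 - (1 - pe_locus) ** 20:.6f}")
```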

Keywords: child trafficking, DNA database, exclusion probability, paternity testing

Procedia PDF Downloads 430
2414 Forward Stable Computation of Roots of Real Polynomials with Only Real Distinct Roots

Authors: Nevena Jakovčević Stor, Ivan Slapničar

Abstract:

Any polynomial can be expressed as the characteristic polynomial of a complex symmetric arrowhead matrix. This expression is not unique. If the polynomial is real with only real distinct roots, the matrix can be chosen to be real. By using an accurate forward stable algorithm for computing the eigenvalues of real symmetric arrowhead matrices, we derive a forward stable algorithm for computing the roots of such polynomials in O(n^2) operations. The algorithm computes each root to almost full accuracy. In some cases, the algorithm invokes extended precision routines, but only in the non-iterative part. Our examples include numerically difficult problems, like the well-known Wilkinson's polynomials. Our algorithm compares favorably to other methods for polynomial root-finding, such as MPSolve or Newton's method.
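
To see why such problems are numerically difficult, the short sketch below runs a generic companion-matrix eigenvalue root finder (numpy.roots), not the authors' forward stable arrowhead algorithm, on Wilkinson's polynomial with roots 1 through 20; the sizeable relative errors illustrate the accuracy gap the paper addresses.

```python
import numpy as np

# Baseline illustration only: a backward-stable generic method applied in the
# monomial basis, which loses accuracy on Wilkinson's polynomial.
true_roots = np.arange(1, 21)
coeffs = np.poly(true_roots)          # expand prod (x - k) into monomial coefficients

computed = np.sort(np.roots(coeffs).real)
rel_err = np.abs(computed - true_roots) / true_roots
print("largest relative error of a generic root finder:", rel_err.max())
```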

Keywords: roots of polynomials, eigenvalue decomposition, arrowhead matrix, high relative accuracy

Procedia PDF Downloads 390
2413 Integrated Design of Froth Flotation Process in Sludge Oil Recovery Using Cavitation Nanobubbles to Increase Efficiency and High-Viscosity Compatibility

Authors: Yolla Miranda, Marini Altyra, Karina Kalmapuspita Imas

Abstract:

Oily sludge wastes arise throughout upstream and downstream petroleum industry processes. The sludge still contains oil that can be used for energy. Recycling the sludge is a way of handling it that reduces toxicity, and it is very probable that the remaining oil, around 20% of its volume, can be recovered. Froth flotation is a common chemical unit operation for separating fine solid particles from an aqueous suspension. The basic principle of froth flotation is the capture of oil droplets or small solids by air bubbles in an aqueous slurry, followed by their levitation and collection in a froth layer. This method is known for its low energy requirement and ease of application, but low efficiency and the inability to treat high-viscosity feeds are the biggest problems of the froth flotation unit. This study gives a design that first manages the high viscosity of the sludge and then feeds it into froth flotation with a cavitation tube that turns the bubbles into nanobubbles. The recovery in flotation starts with the collision and adhesion of hydrophobic particles to the air bubbles, followed by transportation of the hydrophobic particle-bubble aggregate from the collection zone to the froth zone, drainage and enrichment of the froth, and finally its overflow removal from the cell top. Effective particle separation by froth flotation relies on the efficient capture of hydrophobic particles by air bubbles in three steps, of which the most important is collision. Decreasing the bubble size increases the collision rate, making the process more efficient. The pre-treatment, froth flotation, and cavitation tube are integrated with each other, and the design shows the integrated unit and its process.

Keywords: sludge oil recovery, froth flotation, cavitation tube, nanobubbles, high viscosity

Procedia PDF Downloads 340
2412 Hybrid EMPCA-Scott Approach for Estimating Probability Distributions of Mutual Information

Authors: Thuvanan Borvornvitchotikarn, Werasak Kurutach

Abstract:

Mutual information (MI) is widely used in medical image registration. In the analysis of different medical images, it is difficult to choose an optimal number of bins for calculating the probability distributions in MI. As a result, this paper presents a new adaptive bin-number selection approach named the hybrid EMPCA-Scott approach. This work combines expectation maximization principal component analysis (EMPCA) and a modified Scott's rule. The proposed approach solves the binning problem arising from the varied intensity values in medical images. Experimental results show lower registration errors compared to other adaptive binning approaches.
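
The Scott step of the hybrid approach can be sketched in a few lines: the bin width follows Scott's rule from the sample standard deviation, and MI is then read off the joint histogram. The EMPCA preprocessing of the paper is not reproduced here, and the synthetic intensities are only for illustration.

```python
import numpy as np

# Sketch: Scott's rule bin selection plus a histogram-based mutual information estimate.
rng = np.random.default_rng(3)
fixed = rng.normal(100.0, 20.0, 10_000)               # synthetic "fixed image" intensities
moving = 0.8 * fixed + rng.normal(0.0, 10.0, 10_000)  # correlated "moving image" intensities

def scott_bins(x):
    h = 3.49 * x.std(ddof=1) * len(x) ** (-1.0 / 3.0)   # Scott's rule bin width
    return max(1, int(np.ceil((x.max() - x.min()) / h)))

bins = (scott_bins(fixed), scott_bins(moving))
joint, _, _ = np.histogram2d(fixed, moving, bins=bins)
pxy = joint / joint.sum()
px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)

nz = pxy > 0
mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
print(f"bins: {bins}, estimated MI: {mi:.3f} nats")
```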

Keywords: mutual information, EMPCA, Scott, probability distributions

Procedia PDF Downloads 227
2411 Performance of Nakagami Fading Channel over Energy Detection Based Spectrum Sensing

Authors: M. Ranjeeth, S. Anuradha

Abstract:

Spectrum sensing is the main feature of cognitive radio technology. Spectrum sensing detects the presence of primary users in a licensed spectrum. In this paper, we compare theoretical detection probabilities for different fading environments, such as Rayleigh, Rician, and Nakagami-m fading channels, with simulation results using energy detection based spectrum sensing. The numerical results are plotted as P_d versus P_f for different SNR values and fading parameters. It is observed that the Nakagami fading channel performs better than the other fading channels when energy detection is used for spectrum sensing. A MATLAB simulation test bench has been implemented to evaluate the performance of energy detection in different fading channel environments.
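
The paper's curves were produced in MATLAB; purely as an illustration of the same kind of experiment, here is a hedged Python/SciPy Monte Carlo sketch that picks the energy-detector threshold from a target false-alarm probability and estimates the detection probability over a Nakagami-m channel (modelled through a gamma-distributed instantaneous SNR). The sample count, m, and average SNR are assumptions.

```python
import numpy as np
from scipy import stats

# Monte Carlo sketch of P_d versus P_f for energy detection over Nakagami-m fading.
rng = np.random.default_rng(4)
N = 1000                       # samples per sensing slot
m, snr_db = 3.0, -5.0          # Nakagami shape and average SNR (assumed)
snr = 10 ** (snr_db / 10)
trials = 20_000

pf_target = np.logspace(-3, 0, 20)
# under H0 the normalised test statistic is chi-square with N degrees of freedom
threshold = stats.chi2.ppf(1 - pf_target, df=N)

h2 = rng.gamma(shape=m, scale=snr / m, size=trials)   # instantaneous SNR under Nakagami-m fading
pd = np.empty_like(pf_target)
for i, lam in enumerate(threshold):
    # under H1 the statistic is (approximately) noncentral chi-square
    stat = stats.ncx2.rvs(df=N, nc=N * h2, random_state=rng)
    pd[i] = np.mean(stat > lam)

for pf, p in zip(pf_target, pd):
    print(f"Pf={pf:.3f}  Pd={p:.3f}")
```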

Keywords: spectrum sensing, energy detection, fading channels, probability of detection, probability of false alarm

Procedia PDF Downloads 508
2410 A Closed-Loop Design Model for Sustainable Manufacturing by Integrating Forward Design and Reverse Design

Authors: Yuan-Jye Tseng, Yi-Shiuan Chen

Abstract:

In this paper, a new concept of a closed-loop design model is presented. The closed-loop design model is developed by integrating forward design and reverse design. Based on this new concept, a closed-loop design model for sustainable manufacturing is developed through the integrated evaluation of forward design, reverse design, and green manufacturing using a fuzzy analytic network process. In the design stage of a product, with a given product requirement and objective, there can be different ways to design the detailed components and specifications, and hence different design cases that achieve the same requirement and objective. Thus, in the design evaluation stage, the different design cases must be analyzed and evaluated. The purpose of this research is to develop a model for evaluating the design cases by integrated evaluation of the forward design, reverse design, and green manufacturing models. A fuzzy analytic network process model is presented for the integrated evaluation of the criteria in the three models. The comparison matrices for evaluating the criteria in the three groups are established. The total relational values among the three groups represent the total relational effects. In application, a supermatrix can be created, and the total relational values can be used to evaluate the design cases for decision-making and to select the final design case. An example product is demonstrated in this presentation. It shows that the model is useful for the integrated evaluation of forward design, reverse design, and green manufacturing to achieve the closed-loop design objective for sustainable manufacturing.

Keywords: design evaluation, forward design, reverse design, closed-loop design, supply chain management, closed-loop supply chain, fuzzy analytic network process

Procedia PDF Downloads 651
2409 Curriculum-Based Multi-Agent Reinforcement Learning for Robotic Navigation

Authors: Hyeongbok Kim, Lingling Zhao, Xiaohong Su

Abstract:

Deep reinforcement learning has been applied to various problems in robotics, such as autonomous driving and unmanned aerial vehicles. However, because of the sparse reward and the penalty for collisions with obstacles during the navigation mission, the agent may fail to learn the optimal policy or may require a long time to converge. Therefore, in this paper, using obstacles and enemy agents, we present a curriculum-based boost learning method to effectively train compound skills during multi-agent reinforcement learning. First, to enable the agents to solve challenging tasks, we gradually increased the learning difficulty by adjusting the reward shaping instead of constructing different learning environments. Then, in a benchmark environment with static obstacles and moving enemy agents, the experimental results showed that the proposed curriculum learning strategy enhanced cooperative navigation and compound collision avoidance skills in uncertain environments while improving learning efficiency.

Keywords: curriculum learning, hard exploration, multi-agent reinforcement learning, robotic navigation, sparse reward

Procedia PDF Downloads 67
2408 A New Dual Forward Affine Projection Adaptive Algorithm for Speech Enhancement in Airplane Cockpits

Authors: Djendi Mohmaed

Abstract:

In this paper, we propose a dual adaptive algorithm based on the combination of the forward blind source separation (FBSS) structure and the affine projection algorithm (APA). The proposed algorithm combines the advantages of the source separation properties of the FBSS structure and the fast convergence characteristics of the APA algorithm. The proposed algorithm needs two noisy observations to provide an enhanced speech signal. This process is done in a blind manner, without the need for any a priori information about the source signals. The proposed dual forward blind source separation affine projection algorithm, denoted DFAPA, is used for the first time in an airplane cockpit context to enhance communication to and from the airplane. Intensive experiments were carried out to evaluate the performance of the proposed DFAPA algorithm.
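
For orientation, the sketch below shows a plain affine projection update identifying an unknown FIR path from a noisy observation; it illustrates only the APA building block, not the dual forward BSS structure of the paper, and the filter length, projection order, and step size are assumptions.

```python
import numpy as np

# Sketch of the affine projection algorithm (APA): the last P regressors are used
# jointly in each update, which speeds up convergence relative to NLMS.
rng = np.random.default_rng(5)
L, P, mu, delta = 16, 4, 0.5, 1e-3        # filter length, projection order, step size, regularisation
h_true = rng.normal(size=L); h_true /= np.linalg.norm(h_true)

n_samples = 5000
x = rng.normal(size=n_samples)                        # excitation signal
d = np.convolve(x, h_true)[:n_samples] + 0.01 * rng.normal(size=n_samples)

w = np.zeros(L)
for n in range(L + P, n_samples):
    # data matrix of the last P input regressors (P x L)
    X = np.array([x[n - p - L + 1:n - p + 1][::-1] for p in range(P)])
    e = d[n - np.arange(P)] - X @ w                   # a-priori error vector
    w += mu * X.T @ np.linalg.solve(X @ X.T + delta * np.eye(P), e)

misalignment_db = 20 * np.log10(np.linalg.norm(w - h_true) / np.linalg.norm(h_true))
print(f"final misalignment: {misalignment_db:.1f} dB")
```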

Keywords: adaptive algorithm, speech enhancement, system mismatch, SNR

Procedia PDF Downloads 114
2407 Young’s Modulus Variability: Influence on Masonry Vault Behavior

Authors: Abdelmounaim Zanaz, Sylvie Yotte, Fazia Fouchal, Alaa Chateauneuf

Abstract:

This paper presents a methodology for the probabilistic assessment of the bearing capacity and the prediction of the failure mechanism of masonry vaults at the ultimate state, with consideration of the natural variability of the Young's modulus of the stones. First, the computation model is explained. The failure mode considered is the most commonly reported one, i.e., the four-hinge mechanism. Based on this assumption, the study of a vault composed of 16 segments is presented. The Young's modulus of the segments is considered as a random variable defined by a mean value and a coefficient of variation CV. A relationship linking the vault bearing capacity to the modulus variation of the voussoirs is proposed. The failure mechanisms, in addition to that observed in the deterministic case, are identified for each CV value, as well as their probabilities of occurrence. The results show that the mechanism observed in the deterministic case has a decreasing probability of occurrence with increasing CV, while the number of other mechanisms and their probability of occurrence increase with the coefficient of variation of the Young's modulus. This means that if a significant variation of the Young's modulus of the segments is established, taking it into account in the computations becomes mandatory, both for determining the vault bearing capacity and for predicting its failure mechanism.

Keywords: masonry, mechanism, probability, variability, vault

Procedia PDF Downloads 420
2406 A Strategy for the Application of Second-Order Monte Carlo Algorithms to Petroleum Exploration and Production Projects

Authors: Obioma Uche

Abstract:

Due to the recent volatility in oil & gas prices as well as increased development of non-conventional resources, it has become even more essential to critically evaluate the profitability of petroleum prospects prior to making any investment decisions. Traditionally, simple Monte Carlo (MC) algorithms have been used to randomly sample probability distributions of economic and geological factors (e.g. price, OPEX, CAPEX, reserves, productive life, etc.) in order to obtain probability distributions for profitability metrics such as Net Present Value (NPV). In recent years, second-order MC algorithms have been shown to offer an advantage over simple MC techniques due to the added consideration of uncertainties associated with the probability distributions of the relevant variables. Here, a strategy for the application of the second-order MC technique to a case study is demonstrated to analyze its effectiveness as a tool for portfolio management.
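
A minimal two-level (second-order) Monte Carlo sketch of this idea is shown below: the outer loop samples the uncertain parameters of the input distributions, and the inner loop samples the inputs themselves and computes NPV, so the output is a distribution over profitability metrics rather than a single value. All prices, volumes, and costs are invented for illustration, not a real prospect.

```python
import numpy as np

# Second-order Monte Carlo sketch: epistemic (outer) and aleatory (inner) sampling.
rng = np.random.default_rng(6)
outer, inner, rate, years = 200, 2_000, 0.10, 10
discount = 1.0 / (1.0 + rate) ** np.arange(1, years + 1)

p_positive = np.empty(outer)
for i in range(outer):
    # epistemic layer: the distribution parameters are themselves uncertain
    price_mean = rng.normal(70.0, 10.0)          # $/bbl
    reserves_mean = rng.normal(8.0, 1.5)         # MMbbl recoverable

    # aleatory layer: sample the inputs given those parameters
    price = rng.normal(price_mean, 15.0, inner)
    reserves = np.maximum(rng.normal(reserves_mean, 2.0, inner), 0.1)
    capex = rng.normal(250.0, 40.0, inner)       # $MM
    opex = rng.normal(12.0, 3.0, inner)          # $/bbl

    yearly_prod = reserves[:, None] / years      # flat production profile, MMbbl/yr
    cashflow = yearly_prod * (price - opex)[:, None]
    npv = (cashflow * discount).sum(axis=1) - capex
    p_positive[i] = np.mean(npv > 0)

print(f"P(NPV > 0): median {np.median(p_positive):.2f}, "
      f"5th-95th percentile [{np.percentile(p_positive, 5):.2f}, {np.percentile(p_positive, 95):.2f}]")
```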

Keywords: Monte Carlo algorithms, portfolio management, profitability, risk analysis

Procedia PDF Downloads 307
2405 Transient Enhanced LDO Voltage Regulator with Improved Feed Forward Path Compensation

Authors: A. Suresh, Sreehari Rao Patri, K. S. R. Krishnaprasad

Abstract:

An ultra-low-power capacitor-less low-dropout voltage regulator with improved transient response, using gain-enhanced feed-forward path compensation, is presented in this paper. It is based on a cascade of a voltage amplifier and a transconductor stage in the feed-forward path, together with the regular error amplifier, to form a composite gain-enhanced feed-forward stage. This broadens the gain bandwidth and thus improves the transient response without a substantial increase in power consumption. The proposed LDO, designed for a maximum output current of 100 mA in UMC 180 nm, requires a quiescent current of 69 µA. An undershoot of 153.79 mV is observed when the load current changes from 0 mA to 100 mA, and an overshoot of 196.24 mV when it changes from 100 mA to 0 mA. The settling time is approximately 1.1 µs for the output voltage undershoot case. The load regulation is 2.77 µV/mA at a load current of 100 mA. The reference voltage is generated using an accurate 0.8 V bandgap reference circuit. The costly SoC features such as total chip area and power consumption are drastically reduced by using a total compensation capacitance of only 6 pF, with a power consumption of 0.096 mW.

Keywords: capacitor-less LDO, frequency compensation, transient response, latch, self-biased differential amplifier

Procedia PDF Downloads 432
2404 Modeling Binomial Dependent Distribution of the Values: Synthesis Tables of Probabilities of Errors of the First and Second Kind of Biometrics-Neural Network Authentication System

Authors: B. S.Akhmetov, S. T. Akhmetova, D. N. Nadeyev, V. Yu. Yegorov, V. V. Smogoonov

Abstract:

We estimate the probabilities of errors of the first and second kind for non-ideal biometrics-neural transducers with 256 outputs, and construct nomograms of the error probabilities for 'own' and 'alien' users as functions of the mathematical expectation and standard deviation of the normalized Hamming distance measures.

Keywords: modeling, errors, probability, biometrics, neural network, authentication

Procedia PDF Downloads 464
2403 Random Access in IoT Using Naïve Bayes Classification

Authors: Alhusein Almahjoub, Dongyu Qiu

Abstract:

This paper deals with the random access procedure in next-generation networks and presents a solution to reduce the total service time (TST), which is one of the most important performance metrics in current and future internet of things (IoT) based networks. The proposed solution focuses on the calculation of the optimal transmission probability, which maximizes the success probability and reduces the TST. It uses the information about idle preambles in every time slot and, based on it, estimates the number of backlogged IoT devices using Naïve Bayes estimation, which is a type of supervised learning in the machine learning domain. The estimation of the backlogged devices is necessary since the optimal transmission probability depends on it and the eNodeB has no direct information about it. Simulations are carried out in MATLAB and verify that the proposed solution gives excellent performance.
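
The estimation step can be illustrated with a much-simplified Bayes-style rule rather than the paper's trained Naïve Bayes classifier: the observed number of idle preambles in a slot is matched against its expected distribution for candidate backlog sizes, and the estimate then sets the transmission probability. The preamble count, backlog, and binomial idle-count model below are all simplifying assumptions.

```python
import numpy as np
from math import comb

# Hedged sketch: infer the backlog from the idle-preamble count in one slot with a
# flat prior over candidate backlog sizes, then pick a throughput-maximising p.
rng = np.random.default_rng(7)
M = 54                      # available preambles (assumed)
true_n = 120                # backlogged IoT devices (unknown to the eNodeB)
p_tx = 0.4                  # current transmission probability

# one contention slot: each active device picks one preamble uniformly at random
active = rng.random(true_n) < p_tx
choices = rng.integers(0, M, size=int(active.sum()))
idle_observed = M - len(np.unique(choices))

candidates = np.arange(1, 501)              # candidate backlog sizes
p_idle = (1.0 - p_tx / M) ** candidates     # probability a given preamble stays idle
# approximate likelihood of the idle count: Binomial(M, p_idle) at idle_observed
lik = np.array([comb(M, idle_observed) * q**idle_observed * (1.0 - q)**(M - idle_observed)
                for q in p_idle])
n_hat = candidates[lik.argmax()]            # flat prior -> posterior mode = ML estimate

p_opt = min(1.0, M / n_hat)                 # classic throughput-maximising choice
print(f"idle preambles: {idle_observed}, estimated backlog: {n_hat}, suggested p: {p_opt:.2f}")
```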

Keywords: random access, LTE/LTE-A, 5G, machine learning, Naïve Bayes estimation

Procedia PDF Downloads 125
2402 A New Resonance Solution to Suppress the Voltage Stresses in the Forward Topology Used in a Switch Mode Power Supply

Authors: Maamar Latroch, Mohamed Bourahla

Abstract:

The forward topology used in switch mode power supplies (SMPS) is one of the best-known configurations for feeding DC systems such as telecommunication systems and other specific applications where galvanic isolation is required. This configuration benefits from the high-frequency operation of the transformer to provide a small size and light weight for the overall system. However, the stresses on the power switch during ON/OFF commutation limit the power transmitted to the DC load. This paper investigates the main causes of the voltage stresses occurring during a commutation cycle and suggests a low-cost solution that eliminates the overvoltage. As a result, it becomes possible to use this configuration in higher-power applications. Simulation results show the effectiveness of the presented method.

Keywords: switch mode power supply, forward topology, resonance topology, high frequency commutation

Procedia PDF Downloads 415
2401 Data Collection with Bounded-Sized Messages in Wireless Sensor Networks

Authors: Min Kyung An

Abstract:

In this paper, we study the data collection problem in Wireless Sensor Networks (WSNs) under two interference models: the graph model and the more realistic physical interference model known as Signal-to-Interference-plus-Noise Ratio (SINR). The main issue is to compute schedules with the minimum number of timeslots, that is, minimum latency schedules, such that data from every node can be collected at a sink node without any collision or interference. While existing works studied the problem with unit-sized and unbounded-sized message models, we investigate the problem with the bounded-sized message model and introduce a constant factor approximation algorithm. To the best of our knowledge, ours is the first result for the data collection problem with the bounded-sized message model in both interference models.
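
Any schedule in the physical interference model must pass an SINR feasibility check of the following form; the sketch below is a generic illustration with an assumed path-loss exponent, threshold, and transmit power, and is not part of the paper's approximation algorithm.

```python
import numpy as np

# Sketch of the SINR feasibility check for concurrent transmissions; constants are assumed.
alpha, beta_thr, noise, power = 3.0, 2.0, 1e-9, 1.0   # path-loss exponent, SINR threshold, noise (W), Tx power (W)

def sinr_ok(senders, receivers, pair_index):
    """Check whether link `pair_index` succeeds given all concurrent senders."""
    rx = receivers[pair_index]
    d = np.linalg.norm(senders - rx, axis=1)           # distances from every sender to this receiver
    gains = power / d**alpha
    signal = gains[pair_index]
    interference = gains.sum() - signal
    return signal / (noise + interference) >= beta_thr

senders = np.array([[0.0, 0.0], [50.0, 0.0]])
receivers = np.array([[10.0, 0.0], [60.0, 0.0]])
print([sinr_ok(senders, receivers, i) for i in range(2)])   # both links feasible here
```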

Keywords: data collection, collision-free, interference-free, physical interference model, SINR, approximation, bounded-sized message model, wireless sensor networks

Procedia PDF Downloads 192
2400 Parameter Interactions in the Cumulative Prospect Theory: Fitting the Binary Choice Experiment Data

Authors: Elzbieta Babula, Juhyun Park

Abstract:

Tversky and Kahneman's cumulative prospect theory assumes symmetric probability cumulation with regard to the reference point within decision weights. Theoretically, the model should be invariant under a change of the direction of probability cumulation. In the present study, this phenomenon is investigated by creating a reference model that allows the parameter interactions in cumulative prospect theory specifications to be verified. Simultaneous parametric fitting of the utility and weighting functions is applied to binary choice data from the experiment. The results show that the flexibility of the probability weighting function is a crucial characteristic for preventing parameter interactions when estimating cumulative prospect theory.
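
For reference, the pieces being fitted look as follows for a binary prospect; the functional forms are the standard Tversky-Kahneman ones and the parameter values are the classic 1992 estimates, used here purely to illustrate the model rather than the authors' fitted specification.

```python
# Sketch of CPT evaluated on a two-outcome prospect (x_win with probability p_win, else x_lose).
alpha, beta, lam = 0.88, 0.88, 2.25      # value-function curvature and loss aversion
gamma_gain, gamma_loss = 0.61, 0.69      # probability-weighting parameters

def value(x):
    return x**alpha if x >= 0 else -lam * (-x)**beta

def weight(p, gamma):
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def cpt_binary(x_win, x_lose, p_win):
    """CPT value of a binary prospect, assuming x_win >= x_lose."""
    if x_win >= 0 and x_lose >= 0:       # pure gain prospect: cumulate from the best outcome
        w_hi = weight(p_win, gamma_gain)
        return w_hi * value(x_win) + (1 - w_hi) * value(x_lose)
    if x_win <= 0 and x_lose <= 0:       # pure loss prospect: cumulate from the worst outcome
        w_lo = weight(1 - p_win, gamma_loss)
        return w_lo * value(x_lose) + (1 - w_lo) * value(x_win)
    # mixed prospect: gains and losses are weighted separately
    return weight(p_win, gamma_gain) * value(x_win) + weight(1 - p_win, gamma_loss) * value(x_lose)

print(f"CPT value of (win 100 w.p. 0.5, lose 50): {cpt_binary(100, -50, 0.5):.2f}")
```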

Keywords: binary choice experiment, cumulative prospect theory, decision weights, parameter interactions

Procedia PDF Downloads 189
2399 Optimal Mitigation of Slopes by Probabilistic Methods

Authors: D. De-León-Escobedo, D. J. Delgado-Hernández, S. Pérez

Abstract:

A probabilistic formulation to assess slope safety under the hazard of strong storms is presented and illustrated through a slope in Mexico. The formulation is based on the classical safety factor (SF) used in practice to appraise slope stability, but the treatment of uncertainties is introduced, and the slope failure probability is calculated as the probability that SF < 1. As the main hazard is the rainfall on the area, statistics of rainfall intensity and duration are considered and modeled with an exponential distribution. The expected life-cycle cost is assessed by assigning a monetary value to the slope failure consequences. Alternative mitigation measures are simulated, and the formulation is used to identify the optimal one (minimum life-cycle cost). For the example considered, the optimal mitigation measure is a reduction of the slope inclination angle.
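
The decision rule can be sketched in a few lines: for each mitigation option, estimate P(SF < 1) under random storm loading by Monte Carlo and pick the option with the lowest expected life-cycle cost. The toy limit state, rainfall scale, and costs below are illustrative assumptions, not the Mexican case study.

```python
import numpy as np

# Sketch: expected life-cycle cost = mitigation cost + P(SF < 1) * failure cost.
rng = np.random.default_rng(8)
n = 100_000
intensity = rng.exponential(scale=30.0, size=n)       # storm rainfall intensity (mm/h), assumed

def failure_probability(slope_angle_deg):
    # toy limit state: resistance falls with slope angle, demand grows with rainfall
    resistance = rng.normal(1.8 - 0.02 * slope_angle_deg, 0.15, n)
    demand = 1.0 + 0.004 * intensity
    return np.mean(resistance / demand < 1.0)

options = {          # mitigation option: (slope angle after regrading, cost in $k)
    "do nothing (35 deg)": (35.0, 0.0),
    "regrade to 30 deg":   (30.0, 120.0),
    "regrade to 25 deg":   (25.0, 260.0),
}
failure_cost = 2_000.0    # $k consequence of a slope failure (assumed)

for name, (angle, cost) in options.items():
    pf = failure_probability(angle)
    expected_cost = cost + pf * failure_cost
    print(f"{name}: Pf={pf:.3f}, expected life-cycle cost={expected_cost:.0f} $k")
```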

Keywords: expected life-cycle cost, failure probability, slopes failure, storms

Procedia PDF Downloads 137
2398 A Statistical Model for the Dynamics of Single Cathode Spot in Vacuum Cylindrical Cathode

Authors: Po-Wen Chen, Jin-Yu Wu, Md. Manirul Ali, Yang Peng, Chen-Te Chang, Der-Jun Jan

Abstract:

The dynamics of cathode spots has become a major topic in vacuum arc discharge research, with high academic interest and wide application potential. In this article, using a three-dimensional statistical model, we simulate the distribution of the ignition probability of a new cathode spot occurring under different magnetic pressures on the old cathode spot surface and at different arcing times. This model for the ignition probability of a new cathode spot was proposed for two typical situations: one for pure isotropic random walk in the absence of an external magnetic field, and the other for retrograde motion in an external magnetic field parallel to the cathode surface. We mainly focus on the relationship between the ignition probability density distribution of a new cathode spot and the external magnetic field.

Keywords: cathode spot, vacuum arc discharge, transverse magnetic field, random walk

Procedia PDF Downloads 411
2397 Bandwidth Efficient Cluster Based Collision Avoidance Multicasting Protocol in VANETs

Authors: Navneet Kaur, Amarpreet Singh

Abstract:

In Vehicular Adhoc Networks, data dissemination is a challenging task. A number of techniques, types, and protocols are available for disseminating the data, but preserving the limited bandwidth while disseminating the maximum amount of data over the network makes the task more challenging. There are broadcasting, multicasting, and geocasting based protocols; multicasting based protocols are found to be best for conserving the bandwidth. One such protocol, named BEAM, improves the performance of Vehicular Adhoc Networks by reducing the number of in-network message transactions and thereby efficiently utilizing the bandwidth during an emergency situation. However, this protocol may result in multi-car chain collisions, as it has no V2V communication. Therefore, this paper proposes a new protocol named Enhanced Bandwidth Efficient Cluster Based Multicasting Protocol (EBECM) that overcomes the limitations of the existing BEAM protocol. Simulation results show the improved performance of EBECM in terms of routing overhead, throughput, and packet delivery ratio (PDR) when compared with the BEAM protocol.

Keywords: BEAM, data dissemination, emergency situation, vehicular adhoc network

Procedia PDF Downloads 324
2396 Basics of Gamma Ray Burst and Its Afterglow

Authors: Swapnil Kumar Singh

Abstract:

Gamma-ray bursts (GRBs), short and intense pulses of low-energy γ-rays, have fascinated astronomers and astrophysicists since their unexpected discovery in the late sixties. GRBs are accompanied by long-lasting afterglows, and they are associated with core-collapse supernovae. The detection of delayed emission at X-ray, optical, and radio wavelengths, or "afterglow," following a γ-ray burst can be described as the emission of a relativistic shell decelerating upon collision with the interstellar medium. While it is fair to say that there is strong diversity amongst the afterglow population, probably reflecting diversity in the energy, luminosity, shock efficiency, baryon loading, progenitor properties, circumstellar medium, and more, the afterglows of GRBs do appear more similar than the bursts themselves, and it is possible to identify common features within afterglows that lead to some canonical expectations. After an initial flash of gamma rays, a longer-lived "afterglow" is usually emitted at longer wavelengths (X-ray, ultraviolet, optical, infrared, microwave, and radio). It is a slowly fading emission created by collisions between the burst ejecta and interstellar gas. At X-ray wavelengths, the GRB afterglow fades quickly at first and then transitions to a less-steep decline (later phases follow, but we ignore them here). During these early phases, the X-ray afterglow has a power-law spectrum: flux F ∝ E^β, where E is the photon energy and β is the spectral index. This kind of spectrum is characteristic of synchrotron emission, which is produced when charged particles spiral around magnetic field lines at close to the speed of light. In addition to the outgoing forward shock that ploughs into the interstellar medium, there is also a so-called reverse shock, which propagates backward through the ejecta. In many ways, "reverse" shock can be misleading; this shock is still moving outward at relativistic velocity in the rest frame of the star, but it ploughs backward through the ejecta in their frame and slows the expansion. This reverse shock can be dynamically important, as it can carry energy comparable to that of the forward shock. The early phases of the GRB afterglow are still well described by this picture even if the GRB is highly collimated, since the individual emitting regions of the outflow are not in causal contact at large angles and so behave as though they are expanding isotropically. The majority of afterglows, at the times typically observed, fall in the slow cooling regime, and the cooling break lies between the optical and the X-ray. Numerous observations support this broad picture, for example in the spectral energy distribution of the afterglow of very bright GRBs. The bluer light (optical and X-ray) appears to follow the typical synchrotron forward-shock expectation (the apparent features in the X-ray and optical spectrum are due to the presence of dust within the host galaxy). More research in GRBs and particle physics is needed to unfold the mysteries of the afterglow.
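
Since the spectral index β in F ∝ E^β is just the slope of the spectrum in log-log space, it can be read off photometry with a one-line fit; the sketch below uses synthetic flux points, not real GRB data.

```python
import numpy as np

# Sketch: a power law F ∝ E^beta is a straight line in log-log space, so beta is
# recovered by a linear fit. Energies, normalisation, and noise are synthetic.
rng = np.random.default_rng(9)
energy = np.logspace(0, 2, 15)                      # photon energy grid (arbitrary band)
beta_true = -0.9
flux = 5.0 * energy**beta_true * rng.lognormal(0.0, 0.05, energy.size)

beta_fit, log_norm = np.polyfit(np.log10(energy), np.log10(flux), 1)
print(f"fitted spectral index beta ≈ {beta_fit:.2f}")
```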

Keywords: GRB, synchrotron, X-ray, isotropic energy

Procedia PDF Downloads 70
2395 Reliability and Probability Weighted Moment Estimation for Three Parameter Mukherjee-Islam Failure Model

Authors: Ariful Islam, Showkat Ahmad Lone

Abstract:

The Mukherjee-Islam model is commonly used as a simple lifetime distribution to assess system reliability. The model exhibits a better fit to failure data and provides more appropriate information about the hazard rate and other reliability measures, as shown by various authors. It is possible to introduce a location parameter (i.e., a time before which failure cannot occur), which makes it a more useful failure distribution than the existing ones. Even after shifting the location of the distribution, it can represent decreasing, constant, and increasing failure rates. It has been shown to represent the appropriate lower tail of the distribution of random variables having a fixed lower bound. This study presents the reliability computations and probability weighted moment estimation for the three-parameter model. A comparative analysis is carried out between the three-parameter finite range model and some existing bathtub-shaped curve-fitting models. Since the probability weighted moment method is used, the results obtained can also be applied to small-sample cases. The maximum likelihood estimation method is also applied in this study.

Keywords: comparative analysis, maximum likelihood estimation, Mukherjee-Islam failure model, probability weighted moment estimation, reliability

Procedia PDF Downloads 248
2394 Parameter Estimation for Contact Tracing in Graph-Based Models

Authors: Augustine Okolie, Johannes Müller, Mirjam Kretzchmar

Abstract:

We adopt a maximum-likelihood framework to estimate the parameters of a stochastic susceptible-infected-recovered (SIR) model with contact tracing on a rooted random tree. Given the number of detectees per index case, our estimator allows us to determine the degree distribution of the random tree as well as the tracing probability. Since we do not discover all infectees via contact tracing, this estimation is non-trivial. To keep things simple and stable, we develop an approximation suited for realistic situations (contact tracing probability small, or the probability of detection of index cases small). In this approximation, the only epidemiological parameter entering the estimator is the basic reproduction number R0. The estimator is tested in a simulation study and applied to COVID-19 contact tracing data from India. The simulation study underlines the efficiency of the method. For the empirical COVID-19 data, we are able to compare different degree distributions and perform a sensitivity analysis. We find that a power-law and a negative binomial degree distribution, in particular, fit the data well and that the tracing probability is rather large. The sensitivity analysis shows no strong dependency on the reproduction number.
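
A heavily simplified version of the inference idea can be sketched as follows: detectees per index case are modelled as a binomial thinning, with tracing probability p, of a negative-binomial offspring count with known mean R0, and p is recovered by maximum likelihood. This ignores the rooted-tree structure of the paper's estimator; all parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import nbinom, binom

# Hedged sketch: recover the tracing probability from detectee counts by ML,
# mixing over the (unobserved) offspring count of each index case.
rng = np.random.default_rng(10)
R0, k, p_true, n_cases = 2.5, 0.5, 0.6, 2_000

offspring = rng.negative_binomial(k, k / (k + R0), size=n_cases)   # secondary cases per index case
detectees = rng.binomial(offspring, p_true)                         # traced contacts actually found

support = np.arange(0, 200)
off_pmf = nbinom.pmf(support, k, k / (k + R0))                       # offspring distribution (mean R0)

def neg_loglik(p):
    # P(detectees = d) = sum_m P(offspring = m) * Binomial(d | m, p)
    lik = np.array([np.sum(off_pmf * binom.pmf(d, support, p)) for d in detectees])
    return -np.sum(np.log(lik + 1e-300))

res = minimize_scalar(neg_loglik, bounds=(0.01, 0.99), method="bounded")
print(f"true tracing probability {p_true}, estimated {res.x:.3f}")
```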

Keywords: stochastic SIR model on graph, contact tracing, branching process, parameter inference

Procedia PDF Downloads 57