Search results for: Probability.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 553

433 Unsupervised Feature Selection Using Feature Density Functions

Authors: Mina Alibeigi, Sattar Hashemi, Ali Hamzeh

Abstract:

Since dealing with high-dimensional data is computationally complex and sometimes even intractable, several feature reduction methods have recently been developed to reduce the dimensionality of the data and simplify analysis in applications such as text categorization, signal processing, image retrieval, and gene expression analysis. Among feature reduction techniques, feature selection is one of the most popular methods because it preserves the original features. In this paper, we propose a new unsupervised feature selection method that removes redundant features from the original feature space using the probability density functions of the individual features. To show the effectiveness of the proposed method, popular feature selection methods have been implemented and compared. Experimental results on several datasets from the UCI repository illustrate the effectiveness of the proposed method in comparison with the other methods, in terms of both classification accuracy and the number of selected features.

Keywords: Feature, Feature Selection, Filter, Probability Density Function

432 Mathematics Anxiety among Male and Female Students

Authors: Wern Lin Yeo, Choo Kim Tan, Sook Ling Lew

Abstract:

The purpose of this study is to determine the relationship between anxiety level and gender among undergraduates at a private university in Malaysia. A convenience sampling method was used, in which the students were selected based on the grouping assigned by the faculty. A total of 214 undergraduates registered in probability courses participated in this study. The Mathematics Anxiety Rating Scale (MARS) was the instrument used to determine students' anxiety level towards probability. The reliability and validity of the instrument were established before the main study was conducted. In the main study, students were briefed about the study; participation was voluntary, and students were given a consent form to indicate whether they agreed to participate. They then had two weeks to complete the online questionnaire. The collected data were analyzed using the Statistical Package for the Social Sciences (SPSS) to determine the anxiety level. There were three anxiety levels: low, average, and high. Each student's anxiety level was determined by comparing the score obtained with the mean and standard deviation: scores more than one standard deviation below the mean indicated low anxiety, scores within one standard deviation of the mean indicated average anxiety, and scores more than one standard deviation above the mean indicated high anxiety. Results showed that both genders had an average anxiety level. At the low, average, and high anxiety levels, the frequency of males was higher than that of females; accordingly, the mean score for males (M = 3.62) was higher than that for females (M = 3.42). For the difference in anxiety level between genders to be significant, the p-value should be less than .05; the p-value obtained in this study was .117, which is greater than .05. Thus, there was no significant difference in anxiety level between genders; in other words, anxiety level was not related to gender.
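
The score-banding rule described in the abstract can be written as a short function. This is only an illustrative sketch; the sample score, mean, and standard deviation below are hypothetical values, not data from the study.

```python
def anxiety_level(score, mean, sd):
    """Band a MARS score into low / average / high anxiety using the
    mean +/- one standard deviation rule described in the abstract."""
    if score < mean - sd:
        return "low"
    if score > mean + sd:
        return "high"
    return "average"

# Hypothetical values, not the study's data:
print(anxiety_level(3.1, mean=3.5, sd=0.6))   # -> "average"
```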

Keywords: Anxiety level, gender, mathematics anxiety, probability and statistics.

431 Centralized Cooperative Spectrum Sensing with MIMO in the Reporting Network over κ − μ Fading Channel

Authors: S Hariharan, K Chaitanya, P Muthuchidambaranathan

Abstract:

The IEEE 802.22 working group aims to use the Digital Video Broadcasting-Terrestrial (DVB-T) bands for data communication in rural areas without interfering with TV broadcasts. In this paper, we arrive at a closed-form expression for the average detection probability of a fusion center (FC) with multiple antennas over the κ − μ fading channel model. We consider a centralized cooperative multiple-antenna network for reporting. The DVB-T samples forwarded by the secondary users (SUs) are combined using a maximum ratio combiner at the FC, and energy detection is performed to make the decision. The fading effects of the channel degrade the detection probability of the FC; a generalized independent and identically distributed (IID) κ − μ channel and an additive white Gaussian noise (AWGN) channel are considered for reporting and sensing, respectively. The performance of the proposed system is verified through simulation results.

Keywords: IEEE 802.22, Cooperative spectrum sensing, Multiple antenna, κ − μ .

430 Performance Analysis of the Time-Based and Periodogram-Based Energy Detector for Spectrum Sensing

Authors: Sadaf Nawaz, Adnan Ahmed Khan, Asad Mahmood, Chaudhary Farrukh Javed

Abstract:

Classically, an energy detector is implemented in the time domain (TD). However, a frequency domain (FD) based energy detector has demonstrated improved performance. This paper presents a comparison between the two approaches to analyze their pros and cons. A detailed performance analysis of the classical TD energy detector and the periodogram-based detector is performed. Exact and approximate mathematical expressions for the probability of false alarm (Pf) and probability of detection (Pd) are derived for both approaches. The derived expressions naturally lead to analytical as well as intuitive reasoning for the improved Pf and Pd in different scenarios. Our analysis suggests that the improvement depends on the buffer size: Pf is improved in the FD detector, whereas Pd is enhanced in the TD energy detector. Finally, Monte Carlo simulation results corroborate the analysis based on the derived expressions.
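
As an illustration of how Pf and Pd can be estimated empirically for a time-domain energy detector, the following minimal Monte Carlo sketch assumes a zero-mean Gaussian primary signal; the buffer size, SNR, and target false-alarm rate are illustrative choices and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64            # samples per sensing window (buffer size), illustrative
snr_db = -5.0     # assumed SNR of the primary signal
trials = 20000

noise_var = 1.0
sig_var = noise_var * 10 ** (snr_db / 10)

# H0: noise only; H1: Gaussian signal + noise
e_h0 = np.sum(rng.normal(0, np.sqrt(noise_var), (trials, N)) ** 2, axis=1)
e_h1 = np.sum(rng.normal(0, np.sqrt(sig_var + noise_var), (trials, N)) ** 2, axis=1)

# Threshold chosen empirically for a target false-alarm probability
target_pf = 0.1
thr = np.quantile(e_h0, 1 - target_pf)

pf = np.mean(e_h0 > thr)
pd = np.mean(e_h1 > thr)
print(f"Pf = {pf:.3f}, Pd = {pd:.3f}")
```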

Keywords: Cognitive radio, energy detector, periodogram, spectrum sensing.

429 Noise-Improved Signal Detection in Nonlinear Threshold Systems

Authors: Youguo Wang, Lenan Wu

Abstract:

We discuss signal detection through nonlinear threshold systems. The detection performance is assessed by the probability of error Per. We establish that: (1) when the signal is completely suprathreshold, noise always degrades signal detection, both in the single threshold system and in the parallel array of threshold devices; (2) when the signal is slightly subthreshold, noise degrades signal detection in the single threshold system, but in the parallel array noise can improve signal detection, i.e., stochastic resonance (SR) exists in the array; (3) when the signal is predominantly subthreshold, noise can always improve signal detection, and SR exists not only in the single threshold system but also in the parallel array; (4) the array can further improve signal detection by increasing the number of threshold devices. These results further extend the applicability of SR in signal detection.
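
A minimal Monte Carlo sketch of the subthreshold case is given below. It assumes an equiprobable on/off signal, identical hard-threshold devices with independent Gaussian noise, and a likelihood-ratio fusion of the device outputs; the amplitude, threshold, array size, and noise levels are illustrative and the fusion rule is a generic choice, not necessarily the one used by the authors. At very small and very large noise levels the error stays high, while an intermediate noise level minimizes it, which is the SR effect described in items (2) and (3).

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def per_threshold_array(amp, thr, sigma, n_dev, trials=40000):
    """Monte Carlo error probability for detecting an equiprobable binary
    signal (amp vs. 0) with an array of n_dev hard-threshold devices, each
    corrupted by independent Gaussian noise of standard deviation sigma.
    The fusion rule is a likelihood-ratio test on the number of firing devices."""
    present = rng.integers(0, 2, trials)                 # 1 = signal present
    s = present * amp
    fires = (s[:, None] + rng.normal(0, sigma, (trials, n_dev))) > thr
    k = fires.sum(axis=1)
    # Per-device firing probabilities under each hypothesis
    p1 = norm.sf(thr - amp, scale=sigma)
    p0 = norm.sf(thr, scale=sigma)
    # Log-likelihood ratio of observing k firings out of n_dev
    llr = k * np.log(p1 / p0) + (n_dev - k) * np.log((1 - p1) / (1 - p0))
    decision = (llr > 0).astype(int)
    return np.mean(decision != present)

# Subthreshold signal (amp < thr): an intermediate noise level minimizes Per.
for sigma in (0.05, 0.2, 0.4, 0.8, 1.5):
    print(f"sigma={sigma:4.2f}  Per={per_threshold_array(0.4, 1.0, sigma, 15):.3f}")
```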

Keywords: Probability of error, signal detection, stochastic resonance, threshold system.

428 A Stochastic Diffusion Process Based on the Two-Parameter Weibull Density Function

Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos

Abstract:

Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e., to be able to predict what conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the bi-Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution has the ability to assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who consider it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model, namely the explicit expression of the process, its trends, and its distribution, by transforming the diffusion process into a Wiener process as shown by the Ricciardi theorem. We then develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyse, with simulated data, the computational problems associated with the parameter estimation, an issue of great importance for application to real data, using convergence analysis methods. Overall, the use of a stochastic model reflects a pragmatic decision on the part of the modeler: given the data that are available and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.

Keywords: Diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion equation, trends functions, bi-parameters Weibull density function.

427 Modulation Identification Algorithm for Adaptive Demodulator in Software Defined Radios Using Wavelet Transform

Authors: P. Prakasam, M. Madheswaran

Abstract:

A generalized digital modulation identification algorithm for an adaptive demodulator has been developed and is presented in this paper. The algorithm is verified using the wavelet transform and histogram computation to identify QPSK and QAM along with GMSK and M-ary FSK modulations. It has been found that the histogram peaks simplify the identification procedure. The simulation results show that correct modulation identification is possible down to a lower bound of 5 dB and 12 dB for GMSK and QPSK, respectively. When the SNR is above 5 dB, the throughput of the proposed algorithm is more than 97.8%. The receiver operating characteristic (ROC) has been computed to measure the performance of the proposed algorithm, and the analysis shows that the probability of detection (Pd) drops rapidly when the SNR is 5 dB and the probability of false alarm (Pf) is smaller than 0.3. The performance of the proposed algorithm has been compared with existing methods, and it is found to identify all digital modulation schemes at low SNR.

Keywords: Bit Error rate, Receiver Operating Characteristics, Software Defined Radio, Wavelet Transform.

426 Reliability Assessment of Bangladesh Power System Using Recursive Algorithm

Authors: Nahid-Al-Masood, Jubaer Ahmed, Amina Hasan Abedin, S. R. Deeba, Faeza Hafiz, Mahmuda Begum

Abstract:

An electric utility's main concern is to plan, design, operate, and maintain its power supply so as to provide an acceptable level of reliability to its users. This clearly requires that standards of reliability be specified and used in all three sectors of the power system, i.e., generation, transmission, and distribution. That is why the reliability of a power system is always a major concern to power system planners. This paper presents a reliability analysis of the Bangladesh Power System (BPS). The reliability index loss of load probability (LOLP) of BPS is evaluated using a recursive algorithm, assuming no derated states of the generators. BPS has sixty-one generators and a total installed capacity of 5275 MW, and its maximum demand is about 5000 MW. The relevant generator data and hourly load profiles are collected from the National Load Dispatch Center (NLDC) of Bangladesh, and the reliability index LOLP is assessed for the last ten years.
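
The recursive algorithm referred to above is commonly implemented as a capacity-outage probability table built by adding one generating unit at a time. A minimal sketch follows; the unit list and load profile are purely hypothetical and are not the BPS data.

```python
from collections import defaultdict

def outage_table(units):
    """Build the capacity-outage probability table recursively.
    units: list of (capacity_mw, forced_outage_rate) tuples."""
    table = {0.0: 1.0}                      # start: zero outage with probability 1
    for cap, q in units:
        new = defaultdict(float)
        for out, p in table.items():
            new[out] += p * (1.0 - q)       # unit available
            new[out + cap] += p * q         # unit on forced outage
        table = dict(new)
    return table

def lolp(units, hourly_loads_mw):
    """Loss-of-load probability: expected fraction of hours in which the
    available capacity falls short of the load."""
    installed = sum(c for c, _ in units)
    table = outage_table(units)
    hours_lost = 0.0
    for load in hourly_loads_mw:
        reserve = installed - load
        # probability that the capacity on outage exceeds the reserve margin
        hours_lost += sum(p for out, p in table.items() if out > reserve)
    return hours_lost / len(hourly_loads_mw)

# Hypothetical generators and hourly loads (MW), for illustration only:
units = [(300, 0.05)] * 5 + [(150, 0.08)] * 10
loads = [2400, 2600, 2700, 2500, 2300, 2800]
print(f"LOLP = {lolp(units, loads):.4f}")
```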

Keywords: Recursive algorithm, LOLP, forced outage rate, cumulative probability.

425 Real Time Speed Estimation of Vehicles

Authors: Azhar Hussain, Kashif Shahzad, Chunming Tang

Abstract:

This paper gives a novel approach to real-time speed estimation of multiple traffic vehicles using fuzzy logic and image processing techniques with a proper arrangement of camera parameters. The described algorithm consists of several important steps. First, the background is estimated by computing the median over a time window of specific frames. Second, the foreground is extracted using a fuzzy similarity approach (FSA) between the estimated background pixels and the current frame pixels containing foreground and background. Third, the traffic lanes are divided into two parts, one for each direction of travel, for parallel processing. Finally, the speeds of the vehicles are estimated by a maximum a posteriori probability (MAP) estimator. The true ground speed is determined using infrared sensors for three different vehicles, and the proposed algorithm agrees with these measurements to within ±0.74 km/h.

Keywords: Defuzzification, Fuzzy similarity approach, lane cropping, Maximum a posteriori probability (MAP) estimator, Speed estimation.

424 Conflation Methodology Applied to Flood Recovery

Authors: E. L. Suarez, D. E. Meeroff, Y. Yong

Abstract:

Current flooding risk modeling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being by nuisance flooding and its long-term effects on communities are not typically included in risk assessments. An approach was developed to address the probability of recovering from a severe flooding event combined with the probability of community performance during a nuisance event. A consolidated model, namely the conflation flooding recovery (&FR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The &FR model assesses the variation contribution of each independent input and generates a weighted output that favors the distribution with minimum variation. This approach is especially useful if the input distributions have dissimilar variances. The &FR is defined as a single distribution resulting from the product of the individual probability density functions. The resulting conflated distribution resides between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation, and civil order, after a flooding event. The &FR model is more accurate than averaging individual observations before calculating the mean and variance or averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution. The main disadvantage of these traditional methods is that the resulting measure of central tendency is exactly equal to the average of the input distribution’s means without the additional information provided by each individual distribution variance. When dealing with exponential distributions, such as resilience from severe flooding events and from nuisance flooding events, conflation results are equivalent to the weighted least squares method or best linear unbiased estimation. The combination of severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as in South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources, severe flooding events and nuisance flooding events.
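
As a numerical illustration of conflation as the normalized product of densities, the sketch below conflates two hypothetical recovery-time densities. Normal densities are used so the inverse-variance weighting is easy to verify; the means and standard deviations are illustrative only and are not results from the paper.

```python
import numpy as np

# Conflation of two densities on a common grid: normalized pointwise product.
x = np.linspace(0.0, 60.0, 6001)

mu1, s1 = 20.0, 8.0      # hypothetical severe-event recovery time (days)
mu2, s2 = 6.0, 2.0       # hypothetical nuisance-event recovery time (days)

f1 = np.exp(-0.5 * ((x - mu1) / s1) ** 2) / (s1 * np.sqrt(2 * np.pi))
f2 = np.exp(-0.5 * ((x - mu2) / s2) ** 2) / (s2 * np.sqrt(2 * np.pi))

conflated = f1 * f2
conflated /= np.trapz(conflated, x)             # renormalize to integrate to 1

mean_c = np.trapz(x * conflated, x)
# For normals the conflated mean is the inverse-variance weighted average,
# so it lies between the parents and favors the low-variance input:
expected = (mu1 / s1**2 + mu2 / s2**2) / (1 / s1**2 + 1 / s2**2)
print(f"numerical mean = {mean_c:.2f}, closed form = {expected:.2f}")
```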

Keywords: Community resilience, conflation, flood risk, nuisance flooding.

423 Continuous Wave Interference Effects on Global Position System Signal Quality

Authors: Fang Ye, Han Yu, Yibing Li

Abstract:

Radio interference is one of the major concerns in using the global positioning system (GPS) for civilian and military applications. Interference signals are produced not only by various electronic systems but also by illegal jammers. Among the different types of interference, continuous wave (CW) interference has a strong adverse impact on the quality of the received signal. In this paper, we present a more detailed analysis of CW interference effects on GPS signal quality. Based on the C/A code spectrum lines, the influence of CW interference on the acquisition performance of GPS receivers is further analysed, and this influence is supported by simulation results using a GPS software receiver. As the most important user parameter of GPS receivers, the mathematical expression for the bit error probability is also derived in the presence of CW interference, and the expression is consistent with Monte Carlo simulation results. This research on CW interference provides a theoretical basis and new insights for monitoring the radio noise environment and improving the anti-jamming ability of GPS receivers.

Keywords: GPS, CW interference, acquisition performance, bit error probability, Monte Carlo.

422 Performance Analysis of a Dynamic Channel Reservation-Like Technique for Low Earth Orbit Mobile Satellite Systems

Authors: W. Kiamouche, S. Lasmari, M. Benslama

Abstract:

In order to derive important parameters concerning mobile subscribers (MSs) with ongoing calls in Low Earth Orbit Mobile Satellite Systems (LEO MSSs), a positioning system has to be integrated into the MSS to localize MSs and track them during the connection; such integration is regarded as a complex implementation. In this paper we propose a novel method, based on the advantages of the mobility model of LEO MSSs, which allows the instant of the subsequent handover of an MS to be evaluated even if its location is unknown. This method is used to propose a Dynamic Channel Reservation (DCR)-like scheme based on the DCR scheme previously proposed in the literature. The results presented show that the DCR-like technique gives different QoS performance from DCR: an improvement in handover blocking probability and an increase in new call blocking probability are observed for the DCR-like technique.

Keywords: cellular layout, DCR, LEO mobile satellite system, mobility model, positioning system

421 Nonconforming Control Charts for Zero-Inflated Poisson Distribution

Authors: N. Katemee, T. Mayureesawan

Abstract:

This paper develops c-Charts for a Zero-Inflated Poisson (ZIP) process approximated by a geometric distribution with parameter p. The estimated p that fits the ZIP distribution is used to calculate the mean, median, and variance of the geometric distribution, from which the c-Chart is constructed by three different methods. The cg-Chart constructs its control limits from the mean and variance of the geometric distribution; the cmg-Chart uses the mean to construct the control limits; and the cme-Chart derives its control limits from the median and variance of the geometric distribution. The performance of the charts is assessed by the average run length and the average coverage probability. We found that, for an in-control process, the cg-Chart is superior for low mean levels at all levels of the zero proportion. For an out-of-control process, the cmg-Chart and cme-Chart are best for means of 2, 3, and 4 at all parameter levels.
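
A minimal sketch of the mean/variance construction used by the cg-Chart is shown below, assuming a geometric variable that counts failures before the first success (mean (1−p)/p, variance (1−p)/p²) and conventional three-sigma limits; the value of p is hypothetical, and this is only one plausible reading of the construction described in the abstract.

```python
import math

def cg_chart_limits(p):
    """Shewhart-style limits for a c-Chart based on a geometric(p) approximation,
    following the mean/variance construction sketched in the abstract.
    Assumes the geometric variable counts failures before the first success."""
    mean = (1 - p) / p
    var = (1 - p) / p ** 2
    ucl = mean + 3 * math.sqrt(var)
    lcl = max(0.0, mean - 3 * math.sqrt(var))   # counts cannot be negative
    return lcl, mean, ucl

print(cg_chart_limits(0.3))   # hypothetical estimated p
```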

Keywords: average coverage probability, average run length, geometric distribution, zero-inflated Poisson distribution

420 Mobile Robot Path Planning Utilizing Probability Recursive Function

Authors: Ethar H. Khalil, Bahaa I. Kazem

Abstract:

In this work, a software simulation model is proposed for path planning of a mobile robot with two driven wheels that can navigate in a dynamic environment with statically distributed obstacles. The work utilizes the Bezier curve method in a proposed Nth-order matrix form to engineer the mobile robot path. The drawbacks of the Bezier curve in this field have been diagnosed, and a two-direction (up and right) Probability Recursive Function (PRF) is proposed to overcome them. The PRF functionality is developed through a proposed obstacle detection function, an optimization function that can predict the optimum path without comparing all feasible paths, and an Nth-order Bezier curve function that ensures the drawing of the obtained path. The simulation results show that the mobile robot travels successfully from its starting point to its goal point while avoiding all obstacles located in its way; this navigation is carried out successfully using the proposed PRF technique.
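
The PRF itself is not specified in the abstract, so the sketch below only illustrates the standard Nth-order Bezier evaluation in Bernstein form that such a planner uses to draw the obtained path; the control points are hypothetical waypoints, not output of the proposed planner.

```python
import numpy as np
from math import comb

def bezier(control_points, num=100):
    """Evaluate an Nth-order Bezier curve in Bernstein form.
    control_points: (N+1, 2) array of planner waypoints."""
    pts = np.asarray(control_points, dtype=float)
    n = len(pts) - 1
    t = np.linspace(0.0, 1.0, num)[:, None]
    curve = np.zeros((num, pts.shape[1]))
    for i, p in enumerate(pts):
        curve += comb(n, i) * (t ** i) * ((1 - t) ** (n - i)) * p
    return curve

# Hypothetical control points from start (0, 0) to goal (4, 3):
path = bezier([(0, 0), (1, 2), (3, 2.5), (4, 3)])
print(path[0], path[-1])     # endpoints coincide with start and goal
```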

Keywords: Mobile robot, path planning, Bezier curve.

419 Fragility Analysis of Weir Structure Subjected to Flooding Water Damage

Authors: Oh Hyeon Jeon, WooYoung Jung

Abstract:

In this study, seepage analysis was performed for the water level difference between the upstream and downstream sides of a weir structure, in order to evaluate the safety of the structure against flooding. The Monte Carlo simulation method was employed, considering the probability distribution of the adjacent ground parameter, i.e., the permeability coefficient of the weir structure. Moreover, the weir structure was modeled using a commercially available finite element program (ABAQUS). Based on this model, the characteristics of water seepage during flooding were determined at each water level, taking into account the uncertainty of the corresponding permeability coefficient. Subsequently, a fragility function was constructed from this numerical response; the resulting fragility function can be used to determine the weakness of the weir structure subjected to a flooding disaster. It can also serve as reference data for comprehensively predicting the probability of failure and the degree of damage of a weir structure.

Keywords: Weir structure, seepage, flood disaster fragility, probabilistic risk assessment, Monte-Carlo Simulation, permeability coefficient.

418 Video-On-Demand QoE Evaluation across Different Age-Groups and Its Significance for Network Capacity

Authors: Mujtaba Roshan, John A. Schormans

Abstract:

Quality of Experience (QoE) drives churn in the broadband networks industry, and good QoE plays a large part in the retention of customers. QoE is known to be affected by the Quality of Service (QoS) factors packet loss probability (PLP), delay, and delay jitter caused by the network. Earlier results have shown that the relationship between these QoS factors and QoE is non-linear and may vary from application to application. We use the network emulator Netem as the basis for experimentation and evaluate how QoE varies as we change the emulated QoS metrics. Focusing on Video-on-Demand, we discovered that the reported QoE may differ widely for users of different age groups, and that the most demanding age group (the youngest) can require an order of magnitude lower PLP than the most widely studied age group to achieve the same QoE. We then used a bottleneck TCP model to evaluate the capacity cost of achieving an order-of-magnitude decrease in PLP and found that, almost always, a 3-fold increase in link capacity was required.
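
The abstract does not reproduce the bottleneck model, but a common back-of-the-envelope approximation, the Mathis et al. TCP throughput formula (throughput ∝ 1/√PLP), gives the same order of magnitude: lowering the PLP by a factor of ten raises the capacity a set of saturating TCP flows can carry by √10 ≈ 3.2, roughly the 3-fold increase reported. The snippet below is only this approximation, not the authors' model.

```python
import math

# Mathis et al. approximation: per-flow TCP throughput ≈ MSS / (RTT * sqrt(p)),
# with p the packet loss probability.  At a saturated bottleneck the carried
# load matches capacity, so capacity at loss probability p scales as 1/sqrt(p).
def relative_capacity(p_old, p_new):
    return math.sqrt(p_old / p_new)

print(relative_capacity(1e-2, 1e-3))   # ≈ 3.16: an order-of-magnitude lower PLP
                                       # costs roughly a 3-fold capacity increase
```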

Keywords: Quality of experience, quality of service, packet loss probability, network capacity.

417 Efficient Detection Using Sequential Probability Ratio Test in Mobile Cognitive Radio Systems

Authors: Yeon-Jea Cho, Sang-Uk Park, Won-Chul Choi, Dong-Jo Park

Abstract:

This paper proposes a smart design strategy for a sequential detector to reliably detect the primary user's signal, especially in fast fading environments. We study the computation of the log-likelihood ratio for coping with fast-changing received signal and noise sample variances, which are treated as random variables. First, we analyze the detectability of the conventional generalized log-likelihood ratio (GLLR) scheme when the statistics of the unknown parameters change rapidly due to fast fading effects. Secondly, we propose an efficient sensing algorithm for performing the sequential probability ratio test in a robust and efficient manner when the channel statistics are unknown. Finally, the proposed scheme is compared with the conventional method through simulations, in terms of the average number of samples required to reach a detection decision.
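
For reference, a minimal sketch of the classical Wald sequential probability ratio test between two known Gaussian hypotheses is given below; it is not the authors' fast-fading algorithm, and the means, noise level, and error targets are illustrative.

```python
import math
import numpy as np

rng = np.random.default_rng(2)

def sprt_gaussian(samples, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test between two Gaussian means.
    Returns the decision ('H0'/'H1'/'undecided') and the number of samples used."""
    upper = math.log((1 - beta) / alpha)     # accept H1 above this bound
    lower = math.log(beta / (1 - alpha))     # accept H0 below this bound
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # log f1(x)/f0(x) for equal-variance Gaussian hypotheses
        llr += (x - mu0) ** 2 / (2 * sigma**2) - (x - mu1) ** 2 / (2 * sigma**2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

# Primary user present: mean shifted from 0 to 0.5 in unit-variance noise
obs = rng.normal(0.5, 1.0, 500)
print(sprt_gaussian(obs, mu0=0.0, mu1=0.5, sigma=1.0))
```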

Keywords: Cognitive radio, fast fading, sequential detection, spectrum sensing.

416 Non-equilibrium Statistical Mechanics of a Driven Lattice Gas Model: Probability Function, FDT-violation, and Monte Carlo Simulations

Authors: K. Sudprasert, M. Precharattana, N. Nuttavut, D. Triampo, B. Pattanasiri, Y. Lenbury, W. Triampo

Abstract:

The study of non-equilibrium systems has attracted increasing interest in recent years, mainly due to the lack of theoretical frameworks, unlike their equilibrium counterparts. Studying the steady state and/or simple systems is thus one of the main interests. Hence, in this work we have focused our attention on the driven lattice gas (DLG) model, consisting of interacting particles subject to an external field E. The dynamics of the system are given by the hopping of particles to nearby empty sites, with rates biased for jumps in the direction of E. Using small two-dimensional DLG systems, the stochastic properties at the non-equilibrium steady state were studied analytically. To understand the non-equilibrium phenomena, we applied an analytic approach via the master equation to calculate the probability function and to analyze the violation of detailed balance in terms of the fluctuation-dissipation theorem. Monte Carlo simulations have been performed to validate the analytic results.

Keywords: Non-equilibrium, lattice gas, stochastic process

415 A Degraded Practical MIMOME Channel: Issues in Secret Data Communications

Authors: Mohammad Rakibul Islam

Abstract:

In this paper, a Gaussian multiple-input multiple-output multiple-eavesdropper (MIMOME) channel is considered, where a transmitter communicates with a receiver in the presence of an eavesdropper. We present a technique for determining the secrecy capacity of the multiple-input multiple-output (MIMO) channel under Gaussian noise. We transform the degraded MIMOME channel into multiple single-input multiple-output (SIMO) Gaussian wire-tap channels and then use a scalar approach to convert them into two equivalent multiple-input single-output (MISO) channels. The secrecy capacity model is then developed for the case where the channel state information (CSI) of the main channel only is known to the transmitter. The results show that secret communication is possible when the eavesdropper channel noise is greater than a cutoff noise level. The outage probability of the secrecy capacity is also analyzed, together with the effect of fading.

Keywords: Secrecy capacity, MIMO, wiretap channel, covariance matrix, fading.

414 Carbon Disulfide Production via Hydrogen Sulfide Methane Reformation

Authors: H. Hosseini, M. Javadi, M. Moghiman, M. H. Ghodsi Rad

Abstract:

Carbon disulfide is widely used for the production of viscose rayon, rubber, and other organic materials, and it is a feedstock for the synthesis of sulfuric acid. The objective of this paper is to analyze the possibilities for efficient production of CS2 from sour natural gas reformation (H2SMR) (2H2S + CH4 = CS2 + 4H2). The effect of the H2S to CH4 feed ratio and the reaction temperature on carbon disulfide production is also investigated numerically in a reforming reactor. The chemical reaction model is based on an assumed Probability Density Function (PDF) parameterized by the mean and variance of the mixture fraction and a β-PDF shape. The results show that the major factor influencing CS2 production is the reactor temperature. The yield of carbon disulfide increases with an increasing H2S to CH4 feed gas ratio (H2S/CH4 ≤ 4). The yield of C(s) also increases with increasing temperature until the temperature reaches 1000 K; then, due to the increase in CS2 production and the consumption of C(s), the yield of C(s) drops with a further increase in temperature. The predicted CH4 and H2S conversions and the carbon disulfide yield are in good agreement with the results of Huang and T-Raissi.

Keywords: Carbon disulfide, sour natural gas, H2SMR, probability density function.

413 Fault Localization and Alarm Correlation in Optical WDM Networks

Authors: G. Ramesh, S. Sundara Vadivelu

Abstract:

For high-speed networks, providing resilience against failures is an essential requirement. A main feature in designing next-generation optical networks is protecting and restoring high-capacity WDM networks from failures. Quick detection, identification, and restoration make networks more robust and reliable, even though failures cannot be avoided. Hence, it is necessary to develop fast, efficient, and dependable fault localization or detection mechanisms. In this paper we propose a new fault localization algorithm for WDM networks which can identify the location of a failure on a failed lightpath. Our algorithm detects the failed connection and then attempts to reroute the data stream through an alternate path. In addition, we develop an algorithm to analyze the information in the alarms generated by the components of an optical network in the presence of a fault; it uses alarm correlation to reduce the list of suspected components shown to the network operators. Our simulation results show that the proposed algorithms achieve lower blocking probability and delay while attaining higher throughput.

Keywords: Alarm correlation, blocking probability, delay, fault localization, WDM networks.

412 Improving Order Quantity Model with Emergency Safety Stock (ESS)

Authors: Yousef Abu Nahleh, Alhasan Hakami, Arun Kumar, Fugen Daver

Abstract:

This study considers the problem of calculating safety stocks in inventory systems that face demand uncertainties in disaster situations. Safety stocks are essential for a supply chain, which is controlled by forecasts of customer needs, to respond to demand uncertainties and to reach predefined service levels. To address the uncertainties caused by disaster situations affecting the industry sector, the concept of Emergency Safety Stock (ESS) is proposed. While there exists a huge body of literature on determining safety stock levels, this literature does not address the problems that arise from disasters. In this paper, the Order Quantity Model is improved to deal with demand uncertainty due to disasters by incorporating the ESS concept, which is based on the probability of disaster occurrence and uses a probability matrix calculated from historical data.

Keywords: Emergency Safety Stocks, Safety stocks, Order Quantity Model, Supply chain.

411 Ship Detection Requirements Analysis for Different Sea States: Validation on Real SAR Data

Authors: Jaime Martín-de-Nicolás, David Mata-Moya, Nerea del-Rey-Maestre, Pedro Gómez-del-Hoyo, María-Pilar Jarabo-Amores

Abstract:

Ship detection is nowadays quite an important issue in tasks related to sea traffic control, fishery management, and ship search and rescue. Although it has traditionally been carried out by patrol ships or aircraft, coverage limitations, weather conditions, and sea state can become a problem. Synthetic aperture radars can surpass these coverage limitations and work under any climatological condition. A fast CFAR ship detector, based on robust statistical modeling of sea clutter with respect to sea state in SAR images, is used. In this paper, the minimum SNR required to obtain a given detection probability with a given false alarm rate for any sea state is determined. A Gaussian target model using real SAR data is considered. Results show that the required SNR does not depend heavily on the class considered. Provided there is some variation in the backscattering of targets in SAR imagery, the detection probability is limited, and a post-processing stage based on morphology would be suitable.

Keywords: SAR, generalized gamma distribution, detection curves, radar detection.

410 Landowners' Participation Behavior on the Payment for Environmental Service (PES): Evidence from Taiwan

Authors: Wan-Yu Liu

Abstract:

To respond to the Kyoto Protocol, the Payment for Environmental Service (PES) policy entitled the "Plain Landscape Afforestation Program (PLAP)" was certified by the Executive Yuan in Taiwan on 31 August 2001 and has been implemented for six years since 1 January 2002. Although the PLAP has received many positive comments, there are still many difficulties in its implementation, such as insufficient afforestation technology, private landowners' low interest in participating in the PLAP, insufficient subsidies, and so on, which are potential threats that may hinder the PLAP from moving forward in the future. In this paper, taking Ping-Tung County in Taiwan as a sample region and targeting private landowners with and without the intention to participate in the PLAP, we conduct an empirical analysis based on the Logit model to investigate the factors that determine whether private landowners join the PLAP, so as to identify the incentive effects of the PLAP on the personal decision to afforest. The possible factors that might determine a private landowner's participation in the PLAP include landowner characteristics, cropland characteristics, and policy factors. Among them, the policy factors include the afforestation subsidy amount (+), the duration of the afforestation subsidy (+), the rules on adjoining and adjacent areas (+), and so on; these do not reach statistical significance, but the signs of the variables are consistent with the intuition behind the policy. As for the landowner characteristics, each of the age (+), education level (−), and annual household income (+) variables is significant at the 10% level; as for the cropland characteristics, each of cropland area (+), cropland price (−), and the number of cropland parcels (−) is significant at the 1% level. In light of the above, the cropland characteristics are the dominant factor determining the probability of a landowner's participation in the PLAP. In the Logit model established in this paper, the probability of correctly classifying non-participants is 98%, the probability of correctly classifying participants is 71.8%, and the overall classification accuracy is 95%. In addition, the Hosmer-Lemeshow test and the omnibus test also revealed that the Logit model in this paper provides good goodness of fit and good predictive power in forecasting private landowners' participation in this program. The empirical results of this paper are expected to help the implementation of afforestation programs in Taiwan.
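
The abstract does not report the fitted coefficients, so the sketch below only illustrates how a Logit model turns covariates into a participation probability. The variable ordering follows the factors listed above, but every coefficient value and covariate value is hypothetical; only the signs mirror those reported.

```python
import numpy as np

def logit_probability(x, beta):
    """Participation probability under a Logit model: P(y=1|x) = 1/(1+exp(-x.beta)).
    Coefficients and covariate values below are purely illustrative, not the
    estimates reported in the paper; only their signs follow the abstract."""
    return 1.0 / (1.0 + np.exp(-np.dot(x, beta)))

# [intercept, age, education, income, cropland_area, cropland_price, parcels]
beta = np.array([-3.0, 0.04, -0.2, 0.3, 0.8, -0.5, -0.4])
x = np.array([1.0, 55, 2, 1.2, 1.5, 0.6, 2])
print(f"Estimated participation probability: {logit_probability(x, beta):.2f}")
```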

Keywords: Forestry policy, logit, afforestation subsidy, afforestation policy.

409 Application of Heuristic Integration Ant Colony Optimization in Path Planning

Authors: Zeyu Zhang, Guisheng Yin, Ziying Zhang, Liguo Zhang

Abstract:

This paper mainly studies path planning based on ant colony optimization (ACO) and proposes heuristic integration ant colony optimization (HIACO). The paper not only analyzes and optimizes the underlying principle, but also simulates and analyzes the parameters related to the application of HIACO in path planning. Compared with the original algorithm, the improved algorithm optimizes the probability formula, the tabu table mechanism, and the updating mechanism, and introduces more reasonable heuristic factors. The optimized HIACO not only draws on the good ideas of the original algorithm, but also alleviates, to some extent, premature convergence, convergence to suboptimal solutions, and improper exploration. HIACO can be used to achieve better simulation results and the desired optimization. Several parameters of HIACO are tested in combination with the probability formula and the update formula. This paper demonstrates the principle of HIACO and gives the best parameter ranges for path planning.
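
For context, the classical ACO transition rule that HIACO modifies is sketched below; this is the baseline formula p_ij ∝ τ_ij^α · η_ij^β over unvisited nodes, not the authors' improved probability formula, and the pheromone matrix, distances, and parameters are illustrative.

```python
import numpy as np

def transition_probabilities(tau, eta, current, tabu, alpha=1.0, beta=2.0):
    """Classical ACO transition rule: p_ij proportional to tau_ij^alpha * eta_ij^beta
    over the nodes j not yet visited (the tabu list)."""
    n = tau.shape[0]
    allowed = [j for j in range(n) if j not in tabu]
    weights = np.array([tau[current, j] ** alpha * eta[current, j] ** beta
                        for j in allowed])
    probs = weights / weights.sum()
    return dict(zip(allowed, probs))

# Tiny illustrative instance: uniform pheromone tau, heuristic eta = 1/distance
dist = np.array([[0, 2, 3, 4],
                 [2, 0, 2, 3],
                 [3, 2, 0, 2],
                 [4, 3, 2, 0]], dtype=float)
tau = np.ones((4, 4))
eta = 1.0 / (dist + 1e-9)          # epsilon avoids division by zero on the diagonal
print(transition_probabilities(tau, eta, current=0, tabu={0}))
```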

Keywords: Ant colony optimization, heuristic integration, path planning

408 Dynamic Reroute Modeling for Emergency Evacuation: Case Study of Brunswick City, Germany

Authors: Yun-Pang Flötteröd, Jakob Erdmann

Abstract:

Human behavior during evacuations is quite complex, and one of the critical behaviors affecting the efficiency of an evacuation is route choice. Therefore, the respective simulation modeling needs to function properly. In this paper, the current dynamic route modeling during evacuation in Simulation of Urban Mobility (SUMO), i.e. the rerouting functions, is examined with a real case study, and the consistency between the simulation results and reality is checked as well. Four influencing factors, (1) the time to receive information, (2) the probability of cancelling a trip, (3) the probability of using navigation equipment, and (4) the rerouting and information updating period, are considered in order to analyze possible traffic impacts during the evacuation and to examine the rerouting functions in SUMO. Furthermore, some behavioral characteristics of the case study are analyzed with the corresponding detector data and applied in the simulation. The experimental results show that the dynamic route modeling in SUMO can handle the proposed scenarios properly. Some issues and functional needs related to route choice are discussed, and further improvements are suggested.

Keywords: Evacuation, microscopic traffic simulation, rerouting, SUMO.

407 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength

Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos

Abstract:

Increasing our ability to solve complex engineering problems is directly related to the processing capacity of computers, by means of which numerical algorithms can be run quickly and accurately. Besides the increasing interest in numerical simulations, probabilistic approaches are also of great importance, and statistical tools have shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems assume that the random variables involved follow a normal distribution. This assumption tends to provide incorrect results when skewed data are present, since normal distributions are symmetric about their means. Thus, in order to visualize and quantify this aspect, nine statistical distributions (symmetric and skewed) have been considered to model a hypothetical slope stability problem. The data modeled are the friction angle of a superficial soil in Brasilia, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained in consolidated-drained triaxial tests and saturated direct shear tests have been modeled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb rupture criterion. Based on this analysis, it is possible to explicitly derive the failure probability considering the friction angle as a random variable. Furthermore, it is possible to compare the stability analysis when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution; this comparison leads to relevant differences when analyzed in light of risk management.
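
The paper derives the safety-factor PDF analytically; as a purely illustrative counterpart, the Monte Carlo sketch below estimates the failure probability of an infinite, dry, cohesionless slope (FS = tan φ / tan β) with a normally distributed friction angle. The slope angle and friction-angle statistics are hypothetical, and a normal distribution is used only for simplicity, even though the paper found a Dagum distribution to fit best.

```python
import numpy as np

rng = np.random.default_rng(3)

# Infinite, dry, cohesionless slope: FS = tan(phi) / tan(beta).
beta_deg = 25.0                                           # hypothetical slope angle
phi_deg = rng.normal(loc=30.0, scale=3.0, size=200000)    # hypothetical friction angle

fs = np.tan(np.radians(phi_deg)) / np.tan(np.radians(beta_deg))
p_failure = np.mean(fs < 1.0)          # probability that FS falls below unity
print(f"Probability of failure ≈ {p_failure:.4f}")
```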

Keywords: Statistical slope stability analysis, Skew distributions, Probability of failure, Functions of random variables.

406 Convex Restrictions for Outage Constrained MU-MISO Downlink under Imperfect Channel State Information

Authors: A. Preetha Priyadharshini, S. B. M. Priya

Abstract:

In this paper, we consider the MU-MISO downlink scenario under imperfect channel state information (CSI). The main issue under imperfect CSI is to keep the probability that each user's achievable rate falls into outage below a given threshold level. Such rate outage constraints present significant analytical challenges. Many probabilistic methods have been used to handle the transmit optimization problem under imperfect CSI. Here, decomposition-based large deviation inequality and Bernstein-type inequality convex restriction methods are used to solve the optimization problem under imperfect CSI. These methods are used to achieve improved output quality and lower complexity, and they provide a safe, tractable approximation of the original rate outage constraints. Based on these implementations, the performance has been evaluated in terms of the feasible rate and the average transmission power. The simulation results show that both methods offer significantly improved outage quality and lower computational complexity.

Keywords: Imperfect channel state information, outage probability, multiuser multi-input single-output.

405 Model Parameters Estimating on Lyman–Kutcher–Burman Normal Tissue Complication Probability for Xerostomia on Head and Neck Cancer

Authors: Tsair-Fwu Lee , Hui-Min Ting , Pei-Ju Chao, Jing-Chuan Jiang, Min-Yuan Chao, Wen-Cheng Chen, Long-Chang Chen, Jia-Ming Wu

Abstract:

The purpose of this study is to derive parameter estimates for the Lyman–Kutcher–Burman (LKB) normal tissue complication probability (NTCP) model, using an analysis of scintigraphy assessments and quality of life (QoL) questionnaires for the parotid gland (xerostomia). In total, 31 patients with head-and-neck (HN) cancer were enrolled. Salivary excretion factor (SEF) and EORTC QLQ-H&N35 questionnaire datasets are used for the NTCP modeling to describe the incidence of grade 4 xerostomia. Assuming that n = 1, the fitted NTCP parameters are TD50 = 43.6 Gy, m = 0.18 for the SEF analysis, and TD50 = 44.1 Gy, m = 0.11 for the QoL measurements. The SEF and QoL datasets validate the Quantitative Analyses of Normal Tissue Effects in the Clinic (QUANTEC) guidelines well, resulting in NPVs of 100% for both datasets, and suggest that the QUANTEC 25/20 Gy gland-sparing guidelines are suitable for clinical use in the HN cohort to effectively avoid xerostomia.
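
With n = 1 the generalized equivalent uniform dose in the LKB model reduces to the mean dose, and NTCP = Φ((D − TD50)/(m·TD50)). The sketch below evaluates this standard form with the SEF-based parameters quoted above; the example doses are illustrative, not patient data.

```python
import math

def lkb_ntcp(mean_dose_gy, td50=43.6, m=0.18):
    """LKB NTCP with n = 1, where the generalized EUD reduces to the mean dose:
    NTCP = Phi(t), t = (EUD - TD50) / (m * TD50).
    Default parameters are the SEF-based estimates quoted in the abstract."""
    t = (mean_dose_gy - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

for dose in (20, 26, 35, 43.6, 55):
    print(f"mean parotid dose {dose:5.1f} Gy -> NTCP = {lkb_ntcp(dose):.2f}")
```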

Keywords: HN, NTCP, SEF, QoL, QUANTEC

404 Wireless Transmission of Big Data Using Novel Secure Algorithm

Authors: K. Thiagarajan, K. Saranya, A. Veeraiah, B. Sudha

Abstract:

This paper presents a novel algorithm for secure, reliable, and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. Two-hop wireless networks consist of source, relay, and destination nodes. Big data have to be transmitted from the source to the relay and from the relay to the destination with security deployed at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes of unknown location. The novel algorithm, which ensures secure and energy-balanced transmission of big data, includes selecting the data transmission region, segmenting the selected region, determining the probability ratio for each node (capture node, non-capture node, and eavesdropper node) in every segment, and evaluating the probability using a binary-based evaluation. If the transmission is secure, the two-hop transmission of the big data proceeds; otherwise, the attackers are countered by the cooperative jamming scheme and the data are then transmitted over the two hops.

Keywords: Big data, cooperative jamming, energy balance, physical layer, two-hop transmission, wireless security.
