Search results for: exceedance probability
468 Comparison of Wind Fragility for Window System in the Simplified 10 and 15-Story Building Considering Exposure Category
Authors: Viriyavudh Sim, WooYoung Jung
Abstract:
A window system in a high-rise building is occasionally subjected to excessive wind intensity, particularly during typhoons. The failure of a window system does not affect the overall safety of the structural performance; however, it can endanger the safety of the residents. In this paper, the fragility curves for the window systems of two residential buildings were compared. The probability of failure for each individual window was determined with the Monte Carlo Simulation method. Then, a lognormal cumulative distribution function was used to represent the fragility. The results showed that windows located on the edge of the leeward wall were more susceptible to wind load and that the probability of failure for each window panel increased at higher floors.
Keywords: Wind fragility, window system, high rise building.
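A minimal sketch of the Monte Carlo / lognormal-CDF fragility construction the abstract describes, assuming a hypothetical lognormal window resistance and a simple quadratic wind-load model (neither taken from the paper):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def failure_probability(wind_speed, n_samples=100_000):
    """Monte Carlo estimate of P(load > resistance) for one window panel."""
    # Hypothetical lognormal glass resistance (kPa) and quadratic wind pressure (kPa).
    resistance = rng.lognormal(mean=np.log(3.0), sigma=0.2, size=n_samples)
    load = 0.5 * 1.225e-3 * wind_speed**2 * rng.normal(1.0, 0.1, n_samples)
    return np.mean(load > resistance)

speeds = np.arange(20, 90, 5)                       # wind speeds, m/s
pf = np.array([failure_probability(v) for v in speeds])

# Fit a lognormal CDF (median theta, dispersion beta) to represent the fragility.
mask = (pf > 0) & (pf < 1)
a, b = np.polyfit(np.log(speeds[mask]), norm.ppf(pf[mask]), 1)
beta_disp, theta = 1.0 / a, np.exp(-b / a)
print(f"fragility: P(fail | v) = Phi(ln(v / {theta:.1f}) / {beta_disp:.3f})")
```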
467 A Formal Approach for Proof Constructions in Cryptography
Authors: Markus Kaiser, Johannes Buchmann
Abstract:
In this article we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of some cryptographic algorithms are of great interest. Beside some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we computer-prove. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities). These are given in the formal language of the formal proof system Isabelle/HOL. Moreover, we computer-prove Bayes' Formula. Besides, we describe an application of the presented formalized probability distributions to cryptography. Furthermore, this article shows that computer proofs of complex cryptographic functions are possible by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards computer verification of cryptographic primitives. They describe a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research, if the corresponding basic mathematical knowledge is available in a database.
Keywords: prime numbers, primality tests, (conditional) probability distributions, formal proof system, higher-order logic, formal verification, Bayes' Formula, Miller-Rabin primality test.
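For reference, the Bayes' Formula the authors computer-prove, stated in standard notation for events with positive probability (not the Isabelle/HOL rendering):

```latex
P(A_j \mid B) \;=\; \frac{P(B \mid A_j)\,P(A_j)}{P(B)}
\;=\; \frac{P(B \mid A_j)\,P(A_j)}{\sum_i P(B \mid A_i)\,P(A_i)},
\qquad \{A_i\} \text{ a partition of the sample space with } P(A_i) > 0.
```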
466 Computer Verification in Cryptography
Authors: Markus Kaiser, Johannes Buchmann
Abstract:
In this paper we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of some cryptographic algorithms are of great interest. Beside some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we computer-prove. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities). These are given in the formal language of the formal proof system Isabelle/HOL. Moreover, we computer-prove Bayes' Formula. Besides, we describe an application of the presented formalized probability distributions to cryptography. Furthermore, this paper shows that computer proofs of complex cryptographic functions are possible by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards computer verification of cryptographic primitives. They describe a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research, if the corresponding basic mathematical knowledge is available in a database.
Keywords: prime numbers, primality tests, (conditional) probability distributions, formal proof system, higher-order logic, formal verification, Bayes' Formula, Miller-Rabin primality test.
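A compact Python sketch of the Miller-Rabin primality test mentioned in both abstracts (a probabilistic test with random bases; not the formally verified Isabelle/HOL implementation itself):

```python
import random

def is_probable_prime(n: int, rounds: int = 20) -> bool:
    """Miller-Rabin: a composite n is declared 'prime' with probability <= 4**(-rounds)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 = 2**r * d with d odd.
    r, d = 0, n - 1
    while d % 2 == 0:
        r, d = r + 1, d // 2
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False          # base a witnesses that n is composite
    return True

print(is_probable_prime(2**61 - 1))   # True: a Mersenne prime
```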
465 Performance Evaluation of the OCDM/WDM Technique for Optical Packet Switches
Authors: V. Eramo, L. Piazzo, M. Listanti, A. Germoni, A. Cianfrani
Abstract:
The performance of the Optical Code Division Multiplexing/Wavelength Division Multiplexing (OCDM/WDM) technique for the Optical Packet Switch is investigated. The impact on performance of the impairments due to both Multiple Access Interference and beat noise is studied. The Packet Loss Probability due to output packet contentions is evaluated as a function of the main switch and traffic parameters when coherent optical Gold codes are adopted. The Packet Loss Probability of the OCDM/WDM switch can reach 10^-9 when M = 16 wavelengths, a Gold code of length L = 511, and only 24 wavelength converters are used in the switch.
Keywords: Optical code division multiplexing, bufferless optical packet switch, performance evaluation.
464 Evaluation of Seismic Damage for Gisha Bridge in Tehran by HAZUS Methodology
Authors: Langroudi B., Salehi E., Keshani S., Baghersad M.
Abstract:
Transportation is of great importance in the current life of human beings. The transportation system plays many roles, from economic development to post-catastrophe aid such as rescue operations in the first hours and days after an earthquake. In the post-earthquake response phase, the transportation system acts as a basis for ground operations, including rescue and relief operations and providing food for victims. It is obvious that partial or complete obstruction of this system brings these operations to a stop. Bridges are among the most important elements of a transportation network. Failure of a bridge, in the most optimistic case, cuts the connection between two regions and, in more developed countries, cuts the connection between numerous regions. In this paper, to evaluate the vulnerability and estimate the damage level of Tehran's bridges, the HAZUS method, developed by the Federal Emergency Management Agency (FEMA) with the aid of the National Institute of Building Sciences (NIBS), is used for the first time in Iran. In this method, fragility curves are used to evaluate the collapse probability. Iran is located on a seismic belt and is therefore vulnerable to earthquakes, so studying the probability of collapse of bridges, as an important part of the transportation system, during earthquakes is of great importance. The purpose of this study is to provide fragility curves for Gisha Bridge, one of the longest steel bridges in Tehran, as an important lifeline element. In addition, the damage probability for this bridge during a specific earthquake, introduced as the scenario earthquake, is calculated. The fragility curves show that for the considered scenario, the probability of complete collapse of the bridge is 8.6%.
Keywords: Bridge, Damage evaluation, Fragility curve, Lifelines, Seismic vulnerability.
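In the HAZUS convention referenced above, the fragility for each damage state is a lognormal CDF of the ground-motion intensity. A small sketch with hypothetical medians and dispersion (illustrative values, not the Gisha Bridge parameters):

```python
from math import log
from scipy.stats import norm

# Hypothetical median spectral accelerations (g) per damage state, and dispersion.
damage_states = {"slight": 0.35, "moderate": 0.55, "extensive": 0.80, "complete": 1.10}
beta = 0.6   # lognormal standard deviation (dispersion)

def p_exceed(sa, median):
    """P(damage >= state | Sa) = Phi(ln(Sa / median) / beta)."""
    return norm.cdf(log(sa / median) / beta)

sa_scenario = 0.45   # spectral acceleration of the scenario earthquake (g)
for state, median in damage_states.items():
    print(f"P(>= {state:10s}) = {p_exceed(sa_scenario, median):.3f}")
```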
463 A New Algorithm for Enhanced Robustness of Copyright Mark
Authors: Harsh Vikram Singh, S. P. Singh, Anand Mohan
Abstract:
This paper discusses a new heavy-tailed-distribution-based data hiding scheme for the discrete cosine transform (DCT) coefficients of an image, which provides statistical security as well as robustness against steganalysis attacks. Unlike other data hiding algorithms, the proposed technique does not introduce much effect in the stego-image's DCT coefficient probability plots, thus making the presence of hidden data statistically undetectable. In addition, the proposed method does not compromise on hiding capacity. When compared to the generic block-DCT-based data-hiding scheme, our method is found to be more robust against a variety of image manipulation attacks such as filtering, blurring, JPEG compression, etc.
Keywords: Information Security, Robust Steganography, Steganalysis, Pareto Probability Distribution function.
462 The Performance of Predictive Classification Using Empirical Bayes
Authors: N. Deetae, S. Sukparungsee, Y. Areepong, K. Jampachaisri
Abstract:
This research aims to compare the percentage of correct classification of the Empirical Bayes (EB) method with that of the Classical method when data are constructed as near normal, short-tailed and long-tailed symmetric, and short-tailed and long-tailed asymmetric. The study is performed using a conjugate prior and a normal distribution with known mean and unknown variance. The hyper-parameters estimated with the EB method are substituted into the posterior predictive probability, which is used to predict new observations. Data are generated, consisting of a training set and a test set with sample sizes of 100, 200, and 500, for binary classification. The results show that the EB method exhibits an improved performance over the Classical method in all situations under study.
Keywords: Classification, Empirical Bayes, Posterior predictive probability.
461 On Simple Confidence Intervals for the Normal Mean with Known Coefficient of Variation
Authors: Suparat Niwitpong, Sa-aat Niwitpong
Abstract:
In this paper we propose a new confidence interval for the normal population mean with known coefficient of variation. In practice, this situation occurs in the environmental and agricultural sciences, where the standard deviation is known to be proportional to the mean and, as a result, the coefficient of variation is known. We propose a new confidence interval based on the recent work of Khan [3] and compare it with our previous work, see, e.g., Niwitpong [5]. We derive analytic expressions for the coverage probability and the expected length of each confidence interval. A numerical method is used to assess the performance of these intervals based on their expected lengths.
Keywords: confidence interval, coverage probability, expected length, known coefficient of variation.
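As a baseline illustration of how coverage probability and expected length can be assessed, a simulation sketch using a simple plug-in interval that exploits the known coefficient of variation; this is only an assumed baseline, not the interval proposed in the paper:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
mu, cv, n, reps = 10.0, 0.3, 30, 20_000   # known coefficient of variation cv = sigma / mu
sigma = cv * mu
z = norm.ppf(0.975)

cover, length = 0, 0.0
for _ in range(reps):
    x = rng.normal(mu, sigma, n)
    xbar = x.mean()
    half = z * cv * xbar / np.sqrt(n)     # plug-in sigma = cv * xbar, since cv is known
    cover += (xbar - half <= mu <= xbar + half)
    length += 2 * half

print(f"coverage = {cover / reps:.3f}, expected length = {length / reps:.3f}")
```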
460 Steering Velocity Bounded Mobile Robots in Environments with Partially Known Obstacles
Authors: Reza Hossseynie, Amir Jafari
Abstract:
This paper presents a method for steering velocity-bounded mobile robots in environments with partially known stationary obstacles. The exact locations of the obstacles are unknown, and only a probability distribution associated with each obstacle's location is known. The kinematic model of a two-wheeled differential drive robot is used as the model of the mobile robot. The presented control strategy uses the Artificial Potential Field (APF) method to devise a desired direction of movement for the robot at each instant of time, while the Constrained Directions Control (CDC) uses the generated direction to produce the control signals required for steering the robot. The location of each obstacle is taken as the mean value of its 2D probability distribution and, similarly, the magnitude of the electric charge in the APF is set to the trace of the covariance matrix of the location probability distribution. The method not only captures the challenges of planning the path (i.e. the probabilistic nature of the locations of unknown obstacles), but it also addresses output saturation, which is considered an important issue from the control perspective. Moreover, the velocity of the robot can be controlled during steering; for example, it can be reduced in the close vicinity of obstacles and the target to ensure safety. Finally, the control strategy is simulated for different scenarios to show how the method can be put into practice.
Keywords: Steering, obstacle avoidance, mobile robots, constrained directions control, artificial potential field.
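A minimal sketch of the APF direction computation described above, with each obstacle's repulsive charge set to the trace of its location covariance as in the abstract; the gain values are illustrative and the CDC layer is omitted:

```python
import numpy as np

def apf_direction(robot, goal, obstacles, k_att=1.0, k_rep=1.0):
    """Return a unit vector for the desired heading at the current instant.

    obstacles: list of (mean_xy, covariance_2x2); the repulsive 'charge' of each
    obstacle is the trace of its location covariance.
    """
    robot, goal = np.asarray(robot, float), np.asarray(goal, float)
    force = k_att * (goal - robot)                     # attractive term toward the goal
    for mean, cov in obstacles:
        diff = robot - np.asarray(mean, float)
        dist = np.linalg.norm(diff) + 1e-9
        charge = np.trace(np.asarray(cov, float))
        force += k_rep * charge * diff / dist**3       # repulsive term, decays with distance
    norm = np.linalg.norm(force)
    return force / norm if norm > 0 else force

heading = apf_direction(robot=(0, 0), goal=(5, 5),
                        obstacles=[((2.5, 2.0), np.diag([0.3, 0.3]))])
print(heading)
```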
459 Towards Modeling for Crashes: A Low-Cost Adaptive Methodology for Karachi
Authors: Mohammad Ahmed Rehmatullah
Abstract:
The aim of this paper is to discuss a low-cost methodology that can predict traffic flow conflicts and quantitatively rank crash expectancies (based on relative probability) for various traffic facilities. This paper focuses on the application of statistical distributions to model traffic flow and of Monte Carlo techniques to simulate traffic, and discusses how to create a tool to predict the possibility of a traffic crash. A low-cost data collection methodology is discussed for the prevailing heterogeneous traffic flow, and a GIS platform is proposed to thematically represent traffic flow from simulations and the probability of a crash. Furthermore, the dynamism of the model is discussed with reference to its adaptability, adequacy, economy, and efficiency to ensure adoption.
Keywords: Heterogeneous traffic data collection, Monte Carlo Simulation, Traffic Flow Modeling, GIS.
458 Developing of Fragility Curve for Two-Span Simply Supported Concrete Bridge in Near-Fault Area
Authors: S. Shirazian, M.R. Ghayamghamian, G.R. Nouri
Abstract:
Bridges are one of the main components of transportation networks. They should remain functional before and after an earthquake for emergency services. Therefore, we need to assess the seismic performance of bridges under different seismic loadings. The fragility curve is one of the popular tools in seismic evaluations. Fragility curves are conditional probability statements, which give the probability of a bridge reaching or exceeding a particular damage level for a given intensity level. In this study, the seismic performance of a two-span simply supported concrete bridge is assessed. Due to the usual lack of empirical data, the analytical fragility curve was developed from the results of dynamic analyses of the bridge subjected to different time histories in a near-fault area.
Keywords: Fragility curve, Seismic behavior, Time history analysis, Transportation network.
457 Democratic Political Socialization of the 5th and 6th Graders under the Authority of Dusit District Office, Bangkok
Authors: Mathinee Khongsatid, Phusit Phukamchanoad, Sakapas Sangchai
Abstract:
This research aims to study the democratic political socialization of the 5th and 6th graders under the authority of Dusit District Office, Bangkok, using stratified sampling for probability sampling and purposive sampling for non-probability sampling, with data collected through questionnaires distributed to 300 respondents. This covers all of the schools under the authority of Dusit District Office. The researcher analyzed the data using descriptive statistics, including the arithmetic mean and standard deviation. The results show that 5th and 6th graders under the authority of Dusit District Office, Bangkok, display some characteristics of democratic political socialization both inside and outside the classroom as well as outside school. However, democratic political socialization in the classroom through grouping and class participation is much more emphasized.
Keywords: Democratic, Political Socialization
456 Contingency Screening Using Risk Factor Considering Transmission Line Outage
Authors: M. Marsadek, A. Mohamed
Abstract:
Power system security analysis is the most time-demanding process due to the large number of possible contingencies that need to be analyzed. In a power system, any contingency resulting in a security violation, such as a line overload or low voltage, may occur for a number of reasons at any time. To efficiently rank a contingency, both the probability and the extent of the security violation must be considered, so as not to underestimate the risk associated with the contingency. This paper proposes a contingency ranking method that takes into account the probabilistic nature of the power system and the severity of the contingency by using a newly developed method based on a risk factor. The proposed technique is implemented on the IEEE 24-bus system.
Keywords: Line overload, low voltage, probability, risk factor, severity.
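A small sketch of risk-based contingency ranking in the spirit described above, where risk is the product of outage probability and severity of the resulting violation; the contingency data below are hypothetical, not the IEEE 24-bus results:

```python
# Hypothetical contingencies: (name, outage probability, severity of violation).
contingencies = [
    ("line 3-24 outage",  0.012, 8.5),   # severity e.g. MW of overload or p.u. voltage dip
    ("line 15-21 outage", 0.030, 2.1),
    ("line 7-8 outage",   0.005, 14.0),
]

ranked = sorted(contingencies, key=lambda c: c[1] * c[2], reverse=True)
for name, prob, severity in ranked:
    print(f"{name:20s} risk factor = {prob * severity:.4f}")
```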
455 Remarks Regarding Queuing Model and Packet Loss Probability for the Traffic with Self-Similar Characteristics
Authors: Mihails Kulikovs, Ernests Petersons
Abstract:
Network management techniques have long been of interest to the networking research community. The queue size plays a critical role in network performance. An adequate queue size maintains Quality of Service (QoS) requirements within limited network capacity for as many users as possible. The appropriate estimation of the queuing model parameters is crucial both for the initial size estimation and during the process of resource allocation. An accurate resource allocation model for the management system increases network utilization. The present paper demonstrates the results of empirical observation of memory allocation for packet-based services.
Keywords: Queuing System, Packet Loss Probability, Measurement-Based Admission Control (MBAC), Performance evaluation, Quality of Service (QoS).
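For intuition on how queue size drives loss, a classical M/M/1/K baseline is shown below; note that the abstract's point is precisely that self-similar traffic departs from such Markovian models, so this is only a reference formula:

```python
def mm1k_loss_probability(rho: float, K: int) -> float:
    """Packet loss (blocking) probability of an M/M/1/K queue with offered load rho."""
    if rho == 1.0:
        return 1.0 / (K + 1)
    return (1.0 - rho) * rho**K / (1.0 - rho**(K + 1))

for K in (10, 50, 100):
    print(K, f"{mm1k_loss_probability(rho=0.9, K=K):.2e}")
```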
454 Channels Splitting Strategy for Optical Local Area Networks of Passive Star Topology
Authors: Peristera Baziana
Abstract:
In this paper, we present a network configuration for WDM LANs of passive star topology in which the set of data WDM channels is split into two separate sets of channels with different access rights over them. In particular, a synchronous transmission WDMA access algorithm is adopted in order to increase the probability of successful transmission over the data channels and consequently to reduce the probability of data packet transmission cancellation, so as to avoid data channel collisions. To this end, a pre-transmission control access scheme is followed over a separate control channel. An analytical Markovian model is studied and the average throughput is mathematically derived. The performance is studied for several numbers of data channels and various values of the control phase duration.
Keywords: Access algorithm, channels division, collisions avoidance, wavelength division multiplexing.
453 Discrete Time Optimal Solution for the Connection Admission Control Problem
Authors: C. Bruni, F. Delli Priscoli, G. Koch, I. Marchetti
Abstract:
The Connection Admission Control (CAC) problem is formulated in this paper as a discrete time optimal control problem. The control variables account for the acceptance/rejection of new connections and the forced dropping of in-progress connections. These variables are constrained to meet suitable conditions which account for the QoS requirements (Link Availability, Blocking Probability, Dropping Probability). The performance index evaluates the total throughput. At each discrete time, the problem is solved as an integer-valued linear programming problem. The proposed procedure was successfully tested against suitably simulated data.
Keywords: Connection Admission Control, Optimal Control, Integer valued Linear Programming, Quality of Service Requirements, Robust Control.
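As a toy illustration of the per-step decision (omitting the dropping variables and QoS constraints of the full formulation), a brute-force 0/1 search maximizing throughput under a capacity limit; the request data are hypothetical:

```python
from itertools import product

# Hypothetical pending connection requests: (throughput contribution, bandwidth demand).
requests = [(4.0, 3), (2.5, 2), (3.0, 2), (1.0, 1)]
capacity = 5   # residual link capacity

best_value, best_choice = -1.0, None
for choice in product((0, 1), repeat=len(requests)):          # enumerate 0/1 acceptance vectors
    used = sum(c * bw for c, (_, bw) in zip(choice, requests))
    value = sum(c * thr for c, (thr, _) in zip(choice, requests))
    if used <= capacity and value > best_value:
        best_value, best_choice = value, choice

print("accept mask:", best_choice, "throughput:", best_value)
```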
452 Outage Capacity Analysis for Next Generation Wireless Communication Using Non-Orthogonal Multiple Access
Authors: Md. Sohidul Islam, Ahmad Fartheen Khan
Abstract:
In recent times, Non-Orthogonal Multiple Access (NOMA) has received significant attention as an upcoming candidate for 5G systems. The main reason for adopting NOMA in 5G is its capacity to serve many users with the same time and frequency resources. It combines well with "multiple-input, multiple-output" (MIMO) technology. In this paper, we investigate the outage probability as a function of the signal-to-noise ratio (SNR) and of the users' target rates, using cooperative communication and fair power allocation, respectively.
Keywords: Non-orthogonal Multiple Access, Fair Power Allocation, Outage Probability, Target Rate User, Cooperative Communication, massive multiple input multiple output, MIMO, Successive Interference Cancellation.
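A hedged Monte Carlo sketch of two-user downlink NOMA outage over Rayleigh fading with a fixed (fair-leaning) power split; the power coefficients, target rates, and channel statistics are assumptions for illustration, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200_000
a_far, a_near = 0.8, 0.2          # power allocation coefficients (a_far + a_near = 1)
R_far, R_near = 0.5, 1.0          # target rates (bits/s/Hz)

for snr_db in range(0, 31, 5):
    snr = 10 ** (snr_db / 10)
    h_far = rng.exponential(0.3, N)      # |h|^2 of the weaker (far) user, Rayleigh fading
    h_near = rng.exponential(1.0, N)     # |h|^2 of the stronger (near) user
    # Far user decodes its own signal, treating the near user's signal as interference.
    rate_far = np.log2(1 + a_far * snr * h_far / (a_near * snr * h_far + 1))
    # Near user first removes the far user's signal via SIC, then decodes its own.
    sic_ok = np.log2(1 + a_far * snr * h_near / (a_near * snr * h_near + 1)) >= R_far
    rate_near = np.log2(1 + a_near * snr * h_near)
    p_out_far = np.mean(rate_far < R_far)
    p_out_near = np.mean(~sic_ok | (rate_near < R_near))
    print(f"SNR {snr_db:2d} dB  P_out(far) = {p_out_far:.3f}  P_out(near) = {p_out_near:.3f}")
```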
451 Wind Fragility of Window Glass in 10-Story Apartment with Two Different Window Models
Authors: Viriyavudh Sim, WooYoung Jung
Abstract:
Damage due to high wind is not limited to load-resisting components such as beams and columns. The majority of damage is due to breaches in the building envelope, such as broken roofs, windows, and doors. In this paper, the wind fragility of window glass in a residential apartment was determined to compare two window configuration models. The Monte Carlo Simulation method was used to derive damage data, and analytical fragilities were constructed. The fragility of the window system showed that windows located in the leeward wall had a higher probability of failure, especially those close to the edge of the structure. Between the two window models, Model 2 had the higher probability of failure, due to the number of panels in this configuration.
Keywords: Wind fragility, glass window, high rise apartment, Monte Carlo Simulation method.
450 Performance of Dual MRC Receiver for M-ary Modulations over Correlated Nakagami-m Fading Channels with Non-identical and Arbitrary Fading Parameter
Authors: Rupaban Subadar
Abstract:
The performance of a dual maximal ratio combining (MRC) receiver has been analyzed for M-ary coherent and non-coherent modulations over correlated Nakagami-m fading channels with non-identical and arbitrary fading parameters. The classical probability density function (PDF) based approach is used for the analysis. Expressions for the outage probability and the average symbol error performance for M-ary coherent and non-coherent modulations have been obtained. The obtained results are verified against published special-case results and found to match. The effect of unequal fading parameters, branch correlation, and unequal input average SNR on the receiver performance has been studied.
Keywords: MRC, correlated Nakagami-m fading, non-identical fading statistics, average symbol error rate.
449 Performance Evaluation of Cooperative Diversity in Flat Fading Channel with Error Control Coding
Authors: Oluseye Adeniyi Adeleke, Mohd Fadzli Salleh
Abstract:
Cooperative communication provides transmit diversity even when, due to size constraints, mobile units cannot accommodate multiple antennas. A versatile cooperation method called coded cooperation has been developed, in which cooperation is implemented through channel coding with a view to controlling the errors inherent in wireless communication. In this work we evaluate the performance of coded cooperation in a flat Rayleigh fading environment using the pairwise error probability (PEP). We derive the PEP for a flat fading scenario in coded cooperation and then compare it with the signal-to-noise ratio of the users in the network. Results show that an increase in the SNR leads to a decrease in the PEP. We also carried out simulations to validate the result.
Keywords: Channel state information, coded cooperation, cooperative systems, pairwise-error-probability, Reed-Solomon codes.
448 Dynamic Economic Dispatch Constrained by Wind Power Weibull Distribution: A Here-and-Now Strategy
Authors: Mostafa A. Elshahed, Magdy M. Elmarsfawy, Hussain M. Zain Eldain
Abstract:
In this paper, a Dynamic Economic Dispatch (DED) model is developed for a system consisting of both thermal generators and wind turbines. The inclusion of a significant amount of wind energy into power systems has resulted in additional constraints on DED to accommodate the intermittent nature of the output. The probability of stochastic wind power, based on the Weibull probability density function, is included in the model as a constraint; a here-and-now approach. The Environmental Protection Agency's hourly emission target, which gives the maximum emission during the day, is used as a constraint to reduce atmospheric pollution. A 69-bus test system with a non-smooth cost function is used to illustrate the effectiveness of the proposed model compared with a static economic dispatch model that includes wind power.
Keywords: Dynamic Economic Dispatch, Stochastic Optimization, Weibull Distribution, Wind Power.
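A short sketch of the stochastic wind-power term: sample wind speed from a Weibull distribution and map it to turbine output through a standard cut-in/rated/cut-out power curve; the Weibull and turbine parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
k, c = 2.0, 8.0                               # Weibull shape and scale (m/s)
v_ci, v_r, v_co, P_r = 3.0, 12.0, 25.0, 2.0   # cut-in, rated, cut-out speeds (m/s); rated MW

def wind_power(v):
    """Piecewise-linear power curve of a single turbine."""
    p = np.where((v >= v_ci) & (v < v_r), P_r * (v - v_ci) / (v_r - v_ci), 0.0)
    p = np.where((v >= v_r) & (v <= v_co), P_r, p)
    return p

v = c * rng.weibull(k, size=100_000)          # Weibull-distributed wind speeds
p = wind_power(v)
print(f"E[P] = {p.mean():.2f} MW,  P(P = 0) = {np.mean(p == 0):.3f}")
```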
447 Optimization of Flexible Job Shop Scheduling Problem with Sequence Dependent Setup Times Using Genetic Algorithm Approach
Authors: Sanjay Kumar Parjapati, Ajai Jain
Abstract:
This paper presents optimization of the makespan for an 'n' jobs and 'm' machines flexible job shop scheduling problem with sequence dependent setup times using a genetic algorithm (GA) approach. A restart scheme has also been applied to prevent premature convergence. Two case studies are taken into consideration. Results are obtained with a crossover probability of pc = 0.85 and a mutation probability of pm = 0.15. Five simulation runs are performed for each case study and the minimum value among them is taken as the optimal makespan. Results indicate that the optimal makespan can be achieved with more than one sequence of jobs in a production order.
Keywords: Flexible Job Shop, Genetic Algorithm, Makespan, Sequence Dependent Setup Times.
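A generic GA skeleton using the crossover and mutation probabilities quoted above (pc = 0.85, pm = 0.15), applied here to a toy permutation objective rather than the paper's FJSP chromosome encoding:

```python
import random

random.seed(4)
PC, PM, POP, GENS, N = 0.85, 0.15, 40, 200, 12

def fitness(perm):                       # toy objective: how close the permutation is to sorted
    return -sum(abs(i - g) for i, g in enumerate(perm))

def crossover(a, b):                     # order crossover (OX) for permutations
    i, j = sorted(random.sample(range(N), 2))
    child = [None] * N
    child[i:j] = a[i:j]
    rest = [g for g in b if g not in child]
    child[:i], child[j:] = rest[:i], rest[i:]
    return child

def mutate(p):                           # swap mutation
    i, j = random.sample(range(N), 2)
    p[i], p[j] = p[j], p[i]

pop = [random.sample(range(N), N) for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    nxt = pop[:2]                        # elitism: keep the two best individuals
    while len(nxt) < POP:
        a, b = random.sample(pop[:POP // 2], 2)
        child = crossover(a, b) if random.random() < PC else a[:]
        if random.random() < PM:
            mutate(child)
        nxt.append(child)
    pop = nxt

best = max(pop, key=fitness)
print("best:", best, "fitness:", fitness(best))
```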
446 Entropic Measures of a Probability Sample Space and Exponential Type (α, β) Entropy
Authors: Rajkumar Verma, Bhu Dev Sharma
Abstract:
Entropy is a key measure in studies related to information theory and its many applications. Campbell first recognized that the exponential of Shannon's entropy is just the size of the sample space when the distribution is uniform. Here, the idea is to study exponentials of Shannon's entropy and of those other entropy generalizations that involve a logarithmic function, for a probability distribution in general. In this paper, we introduce a measure of a sample space, called the 'entropic measure of a sample space', with respect to the underlying distribution. It is shown in both the discrete and continuous cases that this new measure depends on the parameters of the distribution on the sample space - the same sample space having different 'entropic measures' depending on the distributions defined on it. It is noted that Campbell's idea also applies to Rényi's parametric entropy of a given order. Knowing that parameters play a role in providing suitable choices and extended applications, the paper also studies parametric entropic measures of sample spaces. Exponential entropies related to Shannon's entropy and to those generalizations that involve logarithmic functions, i.e. that are additive, have been studied for wider understanding and applications. We propose and study exponential entropies corresponding to non-additive entropies of type (α, β), which include the Havrda and Charvát entropy as a special case.
Keywords: Sample space, Probability distributions, Shannon's entropy, Rényi's entropy, Non-additive entropies.
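A numerical illustration of Campbell's observation cited above: the exponential of Shannon's entropy equals the sample-space size exactly when the distribution is uniform, and is smaller otherwise:

```python
import numpy as np

def exp_shannon_entropy(p):
    """exp(H(p)) with natural logarithm: the 'entropic measure' of the sample space."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float(np.exp(-np.sum(p * np.log(p))))

print(exp_shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 4.0 - uniform on 4 points
print(exp_shannon_entropy([0.7, 0.1, 0.1, 0.1]))       # about 2.56 - non-uniform, < 4
```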
445 A Novel Estimation Method for Integer Frequency Offset in Wireless OFDM Systems
Authors: Taeung Yoon, Youngpo Lee, Chonghan Song, Na Young Ha, Seokho Yoon
Abstract:
Ren et al. presented an efficient carrier frequency offset (CFO) estimation method for orthogonal frequency division multiplexing (OFDM), which has an estimation range as large as the bandwidth of the OFDM signal and achieves high accuracy without any constraint on the structure of the training sequence. However, its detection probability of the integer frequency offset (IFO) varies rapidly with changes in the fractional frequency offset (FFO). In this paper, we first analyze Ren's method and define two criteria suitable for the detection of the IFO. Then, we propose a novel method for IFO estimation based on the maximum-likelihood (ML) principle and the detection criteria defined in this paper. The simulation results demonstrate that the proposed method outperforms Ren's method in terms of the IFO detection probability, irrespective of the value of the FFO.
Keywords: Orthogonal frequency division multiplexing, integer frequency offset, estimation, training symbol.
444 Performance Analysis of Multiuser Diversity in Multiuser Two-Hop Decode-and-Forward Cooperative Multi-Relay Wireless Networks
Authors: Mamoun F. Al-Mistarihi, Rami Mohaisen
Abstract:
Cooperative diversity (CD) has been adopted in many communication systems because it helps improve the performance of wireless communication systems with the help of relays that emulate multiple antenna terminals. This work provides the derivation of performance analysis expressions for multiuser diversity (MUD) in two-hop cooperative multi-relay wireless networks (TCMRNs). We analytically derive closed-form expressions for the two most commonly used performance metrics, namely the outage probability and the symbol error probability (SEP), for the fixed decode-and-forward (FDF) protocol with MUD.
Keywords: Cooperative diversity (CD), fixed decode-and-forward (FDF), multiuser diversity (MUD), two-hop cooperative multi-relay wireless networks (TCMRN).
443 A K-Means Based Clustering Approach for Finding Faulty Modules in Open Source Software Systems
Authors: Parvinder S. Sandhu, Jagdeep Singh, Vikas Gupta, Mandeep Kaur, Sonia Manhas, Ramandeep Sidhu
Abstract:
Prediction of fault-prone modules provides one way to support software quality engineering. Clustering is used to determine the intrinsic grouping in a set of unlabeled data. Among the various clustering techniques available in the literature, the K-Means clustering approach is the most widely used. This paper introduces a K-Means based clustering approach for finding the fault proneness of Object-Oriented systems. The contribution of this paper is that it uses the metric values of the JEdit open source software to generate rules for categorizing software modules as faulty or non-faulty, and thereafter empirical validation is performed. The results are measured in terms of accuracy of prediction, probability of detection, and probability of false alarms.
Keywords: K-Means, Software Fault, Classification, Object-Oriented Metrics.
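A hedged sketch of the clustering step: run K-Means on module metric vectors, call the cluster with higher mean metric values "faulty", and score against known labels; the data here are synthetic, not the JEdit metrics:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
# Synthetic OO metric vectors (e.g. WMC, CBO, LOC) for non-faulty and faulty modules.
X = np.vstack([rng.normal([5, 3, 100], [2, 1, 30], (80, 3)),
               rng.normal([15, 8, 400], [4, 2, 80], (20, 3))])
y_true = np.array([0] * 80 + [1] * 20)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
faulty = int(X[labels == 1].mean() > X[labels == 0].mean())   # higher-metric cluster is "faulty"
y_pred = (labels == faulty).astype(int)

pd_ = np.mean(y_pred[y_true == 1] == 1)     # probability of detection
pf_ = np.mean(y_pred[y_true == 0] == 1)     # probability of false alarm
acc = np.mean(y_pred == y_true)
print(f"accuracy = {acc:.2f}  PD = {pd_:.2f}  PF = {pf_:.2f}")
```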
442 Modeling Peer-to-Peer Networks with Interest-Based Clusters
Authors: Bertalan Forstner, Dr. Hassan Charaf
Abstract:
In the world of Peer-to-Peer (P2P) networking, different protocols have been developed to make resource sharing and information retrieval more efficient. The SemPeer protocol is a new layer on Gnutella that transforms the connections of the nodes based on semantic information to make information retrieval more efficient. However, this transformation causes high clustering in the network, which decreases the number of nodes reached and therefore also decreases the probability of finding a document. In this paper we describe a mathematical model for the Gnutella and SemPeer protocols that captures clustering-related issues, followed by a proposal to modify the SemPeer protocol to achieve moderate clustering. This modification is a form of link management for the individual nodes that allows the SemPeer protocol to be more efficient, because the probability of a successful query in the P2P network is reasonably increased. To validate the models, we evaluated a series of simulations that supported our results.
Keywords: Peer-to-Peer, model, performance, network management.
441 Impact of Metallic Furniture on UWB Channel Statistical Characteristics by BER
Authors: Yu-Shuai Chen, Chien-Ching Chiu, Chung-Hsin Huang, Chien-Hung Chen
Abstract:
The bit error rate (BER) performance of ultra-wideband (UWB) indoor communication under the impact of metallic furniture is investigated. The impulse responses of different indoor environments for any transmitter and receiver location are computed by shooting and bouncing ray/image and inverse Fourier transform techniques. Using the impulse responses of these multipath channels, the BER performance of a binary pulse amplitude modulation (BPAM) impulse radio UWB communication system is calculated. Numerical results show that the multipath effect caused by the metallic cabinets is an important factor for BER performance. Also, the outage probability for the UWB multipath environment with metallic cabinets is more serious (about 18%) than with wooden cabinets. Finally, it is worth noting that in these cases the present work provides not only comparative but also quantitative information on the performance reduction.
Keywords: UWB, multipath, outage probability.
440 Model-Based Software Regression Test Suite Reduction
Authors: Shiwei Deng, Yang Bao
Abstract:
In this paper, we present a model-based regression test suite reduction approach that uses EFSM model dependence analysis and a probability-driven greedy algorithm to reduce software regression test suites. The approach automatically identifies the difference between the original model and the modified model as a set of elementary model modifications. The EFSM dependence analysis is performed for each elementary modification to reduce the regression test suite, and then the probability-driven greedy algorithm is adopted to select the minimum set of test cases from the reduced regression test suite that covers all interaction patterns. Our initial experience shows that the approach may significantly reduce the size of regression test suites.
Keywords: Dependence analysis, EFSM model, greedy algorithm, regression test.
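A small sketch of the greedy selection stage: repeatedly pick the test case covering the most yet-uncovered interaction patterns, here weighted by an assumed per-test probability to echo the "probability-driven" idea; the patterns and weights are hypothetical:

```python
# Hypothetical reduced regression suite: test -> (covered interaction patterns, probability weight).
suite = {
    "t1": ({"p1", "p2", "p3"}, 0.9),
    "t2": ({"p2", "p4"}, 0.6),
    "t3": ({"p3", "p4", "p5"}, 0.8),
    "t4": ({"p5"}, 0.99),
}
required = {"p1", "p2", "p3", "p4", "p5"}

selected, uncovered = [], set(required)
while uncovered:
    # Greedy step: maximize probability-weighted count of newly covered patterns.
    best = max(suite, key=lambda t: len(suite[t][0] & uncovered) * suite[t][1])
    if not suite[best][0] & uncovered:
        break                      # remaining patterns cannot be covered
    selected.append(best)
    uncovered -= suite[best][0]

print("selected:", selected)       # e.g. ['t1', 't3'] covering all five patterns
```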
439 Probability-Based Damage Detection of Structures Using Model Updating with Enhanced Ideal Gas Molecular Movement Algorithm
Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee
Abstract:
The model updating method has received increasing attention in damage detection of structures based on measured modal parameters. Therefore, a probability-based damage detection (PBDD) procedure based on a model updating procedure is presented in this paper, in which a one-stage model-based damage identification technique based on the dynamic features of a structure is investigated. The presented framework uses a finite element model updating method with a Monte Carlo simulation that considers the uncertainty caused by measurement noise. Enhanced ideal gas molecular movement (EIGMM) is used as the main algorithm for model updating. Ideal gas molecular movement (IGMM) is a multiagent algorithm based on the movement of ideal gas molecules: the molecules disperse rapidly in different directions and cover all the available space, owing to their high speed and to collisions among themselves and with the surrounding barriers. To reach optimal solutions, the IGMM algorithm randomly generates the initial population of gas molecules and utilizes the governing equations for the velocity of the gas molecules and the collisions between them. In this paper, an enhanced version of IGMM, which removes unchanged variables after a specified number of iterations, is developed. The proposed method is applied to two numerical examples in the field of structural damage detection. The results show that the proposed method performs well and is competitive in the PBDD of structures.
Keywords: Enhanced ideal gas molecular movement, ideal gas molecular movement, model updating method, probability-based damage detection, uncertainty quantification.