Search results for: Probability of false alarm
591 Wind Fragility for Honeycomb Roof Cladding Panels Using Screw Pull-Out Capacity
Authors: Viriyavudh Sim, Woo Young Jung
Abstract:
The failure of roof cladding mostly occurs at the connection between claddings and purlins, i.e., pull-out of the screw connecting the two parts when the pull-out load induced by wind, such as during a typhoon, exceeds the resistance of the connection screw. As typhoon disasters in Korea are constantly on the rise, probabilistic risk assessment (PRA) has become a vital tool to evaluate the performance of civil structures. In this study, we determined the fragility of roof cladding with screw connections. An experimental study was performed to evaluate the pull-out resistance of screw joints between honeycomb panels and back frames. Subsequently, by means of the Monte Carlo simulation method, the probability of failure for this type of roof cladding was determined. The results show that the failure of roof cladding depends on the panel's location on the roof; for example, the edgemost panel has the highest probability of failure.
Keywords: Monte Carlo Simulation, roof cladding, screw pull-out strength, wind fragility
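The fragility computation described above reduces to sampling a screw capacity and a wind-induced demand and counting how often demand exceeds capacity. The following minimal sketch shows the idea; the distributions, the 40 m/s wind speed, and the pressure-to-load coefficient are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo trials

# Hypothetical screw pull-out capacity (kN) from test statistics.
capacity = rng.normal(loc=2.0, scale=0.3, size=n)

# Wind-induced demand: dynamic pressure 0.5*rho*V^2 (Pa) times an
# illustrative tributary-area coefficient (kN per Pa per screw),
# with lognormal scatter.
wind_speed = 40.0  # m/s
demand = 0.5 * 1.225 * wind_speed**2 * 1.2e-3 * rng.lognormal(0.0, 0.1, n)

p_failure = np.mean(demand > capacity)
print(f"P(failure) at {wind_speed} m/s: {p_failure:.4f}")
```

Repeating this over a grid of wind speeds yields the fragility curve, and panel location would enter through the pressure coefficient.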
590 Signature Recognition and Verification using Hybrid Features and Clustered Artificial Neural Networks (ANNs)
Authors: Manasjyoti Bhuyan, Kandarpa Kumar Sarma, Hirendra Das
Abstract:
A signature represents an individual characteristic of a person and can be used for his/her validation. Proper modeling is essential for such an application. Here we propose an offline signature recognition and verification scheme based on the extraction of several features, including one hybrid set, from the input signature, which are compared with the already trained forms. Feature points are classified using statistical parameters such as mean and variance. The scanned signature is slant-normalized using a very simple algorithm, with the intention of making the system robust, which is found to be very helpful. The slant correction is further aided by the use of an Artificial Neural Network (ANN). The suggested scheme discriminates originals from forged signatures for both simple and random forgeries. The primary objective is to reduce the two crucial parameters, False Acceptance Rate (FAR) and False Rejection Rate (FRR), with less training time, with the intention of making the system dynamic using a cluster of ANNs forming a multiple classifier system.
Keywords: offline, algorithm, FAR, FRR, ANN.
589 Influence of Maximum Fatigue Load on Probabilistic Aspect of Fatigue Crack Propagation Life at Specified Grown Crack in Magnesium Alloys
Authors: Seon Soon Choi
Abstract:
The principal purpose of this paper is to find the influence of the maximum fatigue load on the probabilistic aspect of fatigue crack propagation life at a specified grown crack in magnesium alloys. Fatigue crack propagation experiments are carried out in laboratory air under different maximum fatigue loads to obtain data for the statistical analysis. To analyze the probabilistic aspect of fatigue crack propagation life, a goodness-of-fit test for the probability distribution of the fatigue crack propagation life at a specified grown crack is implemented through the Anderson-Darling test. The best-fitting probability distribution of the fatigue crack propagation life is also verified under the different maximum fatigue loads.
Keywords: Fatigue crack propagation life, magnesium alloys, maximum fatigue load, probability.
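As a sketch of the goodness-of-fit step, scipy's Anderson-Darling test can check, for example, a lognormal hypothesis by testing the log-transformed lives for normality. The sample values below are hypothetical placeholders, not the paper's data.

```python
import numpy as np
from scipy import stats

# Hypothetical fatigue crack propagation lives (cycles) at a specified
# grown crack; replace with measured data.
lives = np.array([8.1e4, 9.3e4, 7.7e4, 1.02e5, 8.8e4, 9.9e4, 8.5e4, 9.0e4])

# Lognormal check: apply the Anderson-Darling normality test to log-lives.
result = stats.anderson(np.log(lives), dist='norm')
print("A^2 statistic:", result.statistic)
for cv, sl in zip(result.critical_values, result.significance_level):
    print(f"  reject at {sl}% significance if A^2 > {cv}")
```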
588 Evolutionary Dynamics on Small-World Networks
Authors: Jan Rychtar, Brian Stadler
Abstract:
We study how the outcome of evolutionary dynamics on graphs depends on the randomness of the graph structure. We gradually change the underlying graph from completely regular (e.g., a square lattice) to completely random. We find that the fixation probability increases as the randomness increases; nevertheless, the increase is not significant, and thus the fixation probability can be estimated by the known formulas for the underlying regular graphs.
Keywords: evolutionary dynamics, small-world networks
587 A Unified Robust Algorithm for Detection of Human and Non-Human Object in Intelligent Safety Application
Authors: M A Hannan, A. Hussain, S. A. Samad, K. A. Ishak, A. Mohamed
Abstract:
This paper presents a general trainable framework for fast and robust upright human face and non-human object detection and verification in static images. To enhance the performance of the detection process, the technique we develop combines a fast neural network (FNN) and a classical neural network (CNN). In the FNN, a useful correlation between the input image and the weights of the hidden neurons is exploited to sustain a high level of detection accuracy. This enables the use of the Fourier transform, which significantly speeds up detection. The CNN is responsible for verifying the face region. A bootstrap algorithm is used to collect non-human objects, adding the false detections to the training process for human and non-human objects. Experimental results on test images with both simple and complex backgrounds demonstrate that the proposed method obtains a high detection rate and a low false positive rate in detecting both human faces and non-human objects.
Keywords: Algorithm, detection of human and non-human object, FNN, CNN, image training.
586 A 3-Year Evaluation Study on Fine Needle Aspiration Cytology and Corresponding Histology
Authors: Amjad Al Shammari, Ashraf Ibrahim, Laila Seada
Abstract:
Background and Objectives: The incidence of thyroid carcinoma has been increasing worldwide. In the present study, we evaluated the diagnostic accuracy of fine needle aspiration (FNA) and its efficiency in the early detection of neoplastic lesions of the thyroid gland over a 3-year period. Methods: Data were retrieved from pathology files in King Khalid Hospital. For each patient, age, gender, FNA result, site and size of nodule, and final histopathologic diagnosis were recorded. Results: The study included 490 cases, of which 419 were female and 71 male, a male-to-female ratio of 1:6. Mean age was 43 years for males and 38 for females. Cases with confirmed histopathology numbered 131. In 101/131 (77.1%), concordance was found between FNA and histology; in 30/131 (22.9%), there was a discrepancy in diagnosis. Total malignant cases were 43, of which 14 (32.6%) were true positive and 29 (67.4%) were false negative. No false positive cases were found in our series. Conclusion: FNA diagnosed benign nodules in all cases; however, in malignant cases, ultrasound findings have to be taken into consideration to avoid missing a microcarcinoma in the contralateral lobe.
Keywords: FNA, hail, histopathology, thyroid.
585 SIMGraph: Simplifying Contig Graph to Improve de Novo Genome Assembly Using Next-Generation Sequencing Data
Authors: Chien-Ju Li, Chun-Hui Yu, Chi-Chuan Hwang, Tsunglin Liu, Darby Tien-Hao Chang
Abstract:
De novo genome assembly is always fragmented. Assembly fragmentation is more serious with the popular next-generation sequencing (NGS) data because NGS sequences are shorter than traditional Sanger sequences. As the data throughput of NGS is high, the fragmentations in assemblies are usually not the result of missing data. On the contrary, the assembled sequences, called contigs, are often connected to more than one other contig in a complicated manner, leading to the fragmentations. In such complicated connections between contigs, termed a contig graph, false connections are inevitable because of repeats and sequencing/assembly errors. Simplifying a contig graph by removing false connections directly improves genome assembly. In this work, we developed a tool, SIMGraph, to resolve ambiguous connections between contigs using NGS data. Applying SIMGraph to the assembly of a fungus and a fish genome, we resolved 27.6% and 60.3% of ambiguous contig connections, respectively. These results can reduce the experimental effort in resolving contig connections.
Keywords: Contig graph, NGS, de novo assembly, scaffold.
584 A Simplified Distribution for Nonlinear Seas
Authors: M. A. Tayfun, M. A. Alkhalidi
Abstract:
The exact theoretical expression describing the probability distribution of nonlinear sea-surface elevations derived from the second-order narrowband model has a cumbersome form that requires numerical computations and is not well suited to theoretical or practical applications. Here, the same narrowband model is reexamined to develop a simpler closed-form approximation suitable for theoretical and practical applications. The salient features of the approximate form are explored, and its relative validity is verified through comparisons with other readily available approximations and with oceanic data.
Keywords: Ocean waves, probability distributions, second-order nonlinearities, skewness coefficient, wave steepness.
583 Direct Sequence Spread Spectrum Technique with Residue Number System
Authors: M. I. Youssef, A. E. Emam, M. Abd Elghany
Abstract:
In this paper, residue number arithmetic is used in a direct sequence spread spectrum system. The system is evaluated, and its bit error probability is compared to that of a non-residue number system. The effects of channel bandwidth, PN sequences, multipath, and modulation scheme are studied. A Matlab program is developed to measure the signal-to-noise ratio (SNR) and the bit error probability for the various schemes.
Keywords: Spread spectrum, direct sequence, bit error probability, residue number system.
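For reference, the baseline against which such a system is usually judged is the textbook BPSK bit error probability over an AWGN channel, Pb = Q(sqrt(2·Eb/N0)). The snippet below computes this standard curve; it is not the paper's Matlab program and says nothing about the residue-number variant.

```python
import numpy as np
from scipy.special import erfc

def bpsk_ber(ebn0_db):
    """Theoretical BPSK bit error probability over AWGN:
    Pb = Q(sqrt(2*Eb/N0)) = 0.5 * erfc(sqrt(Eb/N0))."""
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * erfc(np.sqrt(ebn0))

for snr in [0, 4, 8, 12]:
    print(f"Eb/N0 = {snr:2d} dB -> Pb = {bpsk_ber(snr):.2e}")
```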
582 Feature Extraction of Dorsal Hand Vein Pattern Using a Fast Modified PCA Algorithm Based on Cholesky Decomposition and Lanczos Technique
Authors: Maleika Heenaye-Mamode Khan, Naushad Mamode Khan, Raja K. Subramanian
Abstract:
The dorsal hand vein pattern is an emerging biometric that has lately been attracting the attention of researchers. Research is being carried out on existing techniques in the hope of improving them or finding more efficient ones. In this work, Principal Component Analysis (PCA), a successful method originally applied to face biometrics, is modified using Cholesky decomposition and the Lanczos algorithm to extract dorsal hand vein features. This modified technique decreases the number of computations and hence the processing time. The eigenveins were successfully computed and projected onto the vein space. The system was tested on a database of 200 images, using a threshold value of 0.9 to obtain the False Acceptance Rate (FAR) and False Rejection Rate (FRR). This modified algorithm is desirable when developing a biometric security system since it significantly decreases the matching time.
Keywords: Dorsal hand vein pattern, PCA, Cholesky decomposition, Lanczos algorithm.
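Given genuine and impostor matching scores, FAR and FRR at a fixed threshold such as the 0.9 quoted above follow directly from counting. This sketch assumes similarity scores in [0, 1]; the score distributions are invented for illustration.

```python
import numpy as np

def far_frr(genuine, impostor, threshold):
    """FAR: fraction of impostor scores accepted; FRR: fraction of
    genuine scores rejected. Higher score means closer match."""
    far = np.mean(np.asarray(impostor) >= threshold)
    frr = np.mean(np.asarray(genuine) < threshold)
    return far, frr

rng = np.random.default_rng(0)
genuine = rng.normal(0.95, 0.04, 200)   # illustrative score distributions
impostor = rng.normal(0.70, 0.10, 200)
print(far_frr(genuine, impostor, threshold=0.9))
```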
581 An Intelligent System for Phish Detection, using Dynamic Analysis and Template Matching
Authors: Chinmay Soman, Hrishikesh Pathak, Vishal Shah, Aniket Padhye, Amey Inamdar
Abstract:
Phishing, or the stealing of sensitive information on the web, has dealt a major blow to Internet security in recent times. Most existing anti-phishing solutions fail to handle the fuzziness involved in phish detection, leading to a large number of false positives. This fuzziness is attributed to the use of the highly flexible and, at the same time, highly ambiguous HTML language. We introduce a new perspective on phishing that tries to systematically prove whether a given page is phished, using the corresponding original page as the basis of comparison. It analyzes the layout of the pages under consideration to determine the percentage distortion between them, indicative of any form of malicious alteration. The system design represents an intelligent system employing dynamic assessment that accurately identifies brand-new phishing attacks and proves effective in reducing the number of false positives. This framework could potentially be used as a knowledge base for educating Internet users about phishing.
Keywords: World Wide Web, phishing, Internet security, data mining.
580 Data Oriented Modeling of Uniform Random Variable: Applied Approach
Authors: Ahmad Habibizad Navin, Mehdi Naghian Fesharaki, Mirkamal Mirnia, Mohamad Teshnelab, Ehsan Shahamatnia
Abstract:
In this paper, we introduce a new data-oriented modeling of the uniform random variable, well matched with computing systems. Due to this conformity with current computer structure, the modeling can be used efficiently in statistical inference.
Keywords: Uniform random variable, data oriented modeling, statistical inference, prodigraph, statistically complete tree, uniform digital probability digraph, uniform n-complete probability tree.
579 Optimization for Reducing Handoff Latency and Utilization of Bandwidth in ATM Networks
Authors: Pooja, Megha Kulshrestha, V. K. Banga, Parvinder S. Sandhu
Abstract:
To support mobility in ATM networks, a number of technical challenges need to be resolved. The impact of handoff schemes in terms of service disruption, handoff latency, cost implications, and excess resources required during handoffs needs to be addressed. In this paper, a one-phase handoff and route optimization solution using reserved PVCs between adjacent ATM switches to reroute connections during inter-switch handoff is studied. In the second phase, a distributed optimization process is initiated to optimally reroute handoff connections. The main objective is to find the optimal operating point at which to perform optimization, subject to a cost constraint, with the purpose of reducing the blocking probability of inter-switch handoff calls for delay-tolerant traffic. We examine the relation between the required bandwidth resources and the optimization rate, and we calculate and study the handoff blocking probability due to lack of bandwidth for the resources reserved to facilitate rapid rerouting.
Keywords: Wireless ATM, mobility, latency, optimization rate, blocking probability.
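A common first-order model for call blocking on a fixed pool of reserved circuits is the classical Erlang-B formula; the paper's handoff model may well differ, so treat this only as a baseline sketch under that assumption.

```python
def erlang_b(traffic_erlangs, channels):
    """Classical Erlang-B blocking probability, computed with the
    numerically stable recursion:
    B(0) = 1;  B(k) = A*B(k-1) / (k + A*B(k-1))."""
    b = 1.0
    for k in range(1, channels + 1):
        b = traffic_erlangs * b / (k + traffic_erlangs * b)
    return b

# e.g. 10 Erlangs of handoff traffic offered to 15 reserved circuits
print(f"Blocking probability: {erlang_b(10.0, 15):.4f}")
```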
578 Improving Flash Flood Forecasting with a Bayesian Probabilistic Approach: A Case Study on the Posina Basin in Italy
Authors: Zviad Ghadua, Biswa Bhattacharya
Abstract:
The Flash Flood Guidance (FFG) provides the rainfall amount of a given duration necessary to cause flooding. The approach is based on the development of rainfall-runoff curves, which help find the rainfall amount that would cause flooding. An alternative approach, mostly tested on Italian Alpine catchments, is based on determining threshold discharges from past events and on finding whether or not the magnitude of an oncoming flood exceeds critical discharge thresholds determined beforehand. Both approaches suffer from large uncertainties in forecasting flash floods because, due to the simplistic approach followed, the same rainfall amount may or may not cause flooding. This uncertainty leads to the question of whether a probabilistic model is preferable to a deterministic one in forecasting flash floods. We propose the use of a Bayesian probabilistic approach in flash flood forecasting. A prior probability of flooding is derived from historical data. Additional information, such as the antecedent moisture condition (AMC) and the rainfall amount over given rainfall thresholds, is used to compute the likelihood of observing these conditions given that a flash flood has occurred. Finally, the posterior probability of flooding is computed from the prior probability and the likelihood. The variation of the computed posterior probability with rainfall amount and AMC demonstrates the suitability of the approach for decision making in an uncertain environment. The methodology has been applied to the Posina basin in Italy. From the promising results obtained, we conclude that the Bayesian approach provides more realistic flash flood forecasting than the FFG.
Keywords: Flash flood, Bayesian, flash flood guidance, FFG, forecasting, Posina.
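The posterior computation is plain Bayes' rule. The sketch below uses invented prior and likelihood values purely to show the mechanics; the paper derives these quantities from Posina basin data.

```python
# Illustrative numbers only; the paper estimates these from history.
p_flood = 0.05                 # prior P(flood) from historical record
p_evidence_given_flood = 0.80  # P(rain > threshold, wet AMC | flood)
p_evidence_given_no = 0.10     # P(same evidence | no flood)

# Total probability of the evidence, then the posterior.
p_evidence = (p_evidence_given_flood * p_flood
              + p_evidence_given_no * (1 - p_flood))
p_flood_given_evidence = p_evidence_given_flood * p_flood / p_evidence
print(f"Posterior P(flood | evidence) = {p_flood_given_evidence:.3f}")
```

With these numbers the posterior rises from 5% to about 30%, which is exactly the kind of evidence-driven update the abstract describes.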
577 Stochastic Risk Analysis Framework for Building Construction Projects
Authors: Abdulkadir Abu Lawal
Abstract:
The study was carried out to establish the probability density functions of selected building construction projects of similar complexity delivered using the Bill of Quantities (BQ) and Lump Sum (LS) forms of contract, and to draw a reliability scenario for each form of contract. Thirty such delivered projects were analyzed for each contract form using Weibull analysis, and their Weibull parameters (α and β) were determined based on completion times. For projects delivered under the BQ form of contract, α is calculated as 1.6737E20 and β as +0.0115; for the LS form, α is found to be 5.6556E03 and β is determined as +0.4535. Using these values, the respective probability density functions are calculated and plotted as a handy tool for risk analysis of future projects of similar characteristics. By inputting variables from other projects, decisions can be made for a whole project or its components using EVM analysis in project evaluation and review techniques. This framework, as a quantitative approach, depends on the assumption of normality in project completion times; it can help greatly in determining the completion time probability for comparable projects using either of the contract forms under consideration. Project aspects that are not amenable to measurement, on the other hand, can be analyzed using fuzzy sets and fuzzy logic. This scenario can be drawn for different types of building construction projects, using different suitable forms of contract in project delivery.
Keywords: Building construction, Projects, Forms of contract, Probability density function, Reliability scenario.
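The Weibull density and reliability (survival) function used in such an analysis have simple closed forms. The sketch below uses generic (α, β) values rather than the fitted parameters quoted above, since the paper's exact parameterization is not spelled out; substitute the fitted values for the BQ or LS samples.

```python
import numpy as np

def weibull_pdf(t, alpha, beta):
    """Two-parameter Weibull density with scale alpha and shape beta."""
    return (beta / alpha) * (t / alpha) ** (beta - 1) * np.exp(-(t / alpha) ** beta)

def weibull_reliability(t, alpha, beta):
    """Probability that completion time exceeds t: R(t) = exp(-(t/alpha)^beta)."""
    return np.exp(-(t / alpha) ** beta)

# Illustrative completion times (days) with generic parameters.
t = np.array([100.0, 200.0, 300.0])
print(weibull_reliability(t, alpha=250.0, beta=1.5))
```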
576 Performance of Soft Handover Algorithm in Varied Propagation Environments
Authors: N. P. Singh, Brahmjit Singh
Abstract:
CDMA cellular networks support soft handover, which guarantees the continuity of wireless services and enhanced communication quality. Cellular networks support multimedia services under varied propagation environmental conditions. In this paper, we show the effect of the characteristic parameters of the cellular environment on soft handover performance. We consider the path loss exponent, the standard deviation of shadow fading, and the correlation coefficient of shadow fading as the characteristic parameters of the radio propagation environment. A very useful statistical measure for characterizing the performance of a mobile radio system is the probability of outage. It is shown through numerical results that the above parameters have a decisive effect on the probability of outage and hence on the overall performance of the soft handover algorithm.
Keywords: CDMA, correlation coefficient, path loss exponent, probability of outage, soft handover.
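One standard way the probability of outage enters such analyses is through lognormal shadow fading, where P_out = Q(fade margin / shadowing standard deviation). A minimal sketch with illustrative power levels, not the paper's scenario:

```python
from scipy.stats import norm

def outage_probability(p_rx_mean_dbm, p_min_dbm, sigma_db):
    """P(received power < receiver threshold) under lognormal shadowing:
    P_out = Q((P_mean - P_min) / sigma)."""
    margin = p_rx_mean_dbm - p_min_dbm
    return norm.sf(margin / sigma_db)   # survival function = Q-function

# e.g. a 10 dB fade margin with an 8 dB shadowing standard deviation
print(f"P_out = {outage_probability(-90.0, -100.0, 8.0):.4f}")
```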
575 Parametric Modeling Approach for Call Holding Times for IP Based Public Safety Networks via EM Algorithm
Authors: Badarch Tuyatsetseg
Abstract:
This paper presents parametric probability density models for call holding times (CHTs) into an emergency call center, based on actual data collected for over a week in the public Emergency Information Network (EIN) in Mongolia. When the set of chosen candidates from the Gamma distribution family is fitted to the call holding time data, the whole area of the CHT empirical histogram is underestimated due to spikes of higher probability and long tails of lower probability in the histogram. Therefore, we provide a parametric model based on a mixture of lognormal distributions, with explicit analytical expressions, for modeling the CHTs of public safety networks (PSNs). Finally, we show that the CHTs for PSNs are fitted reasonably by a mixture of lognormal distributions via the expectation-maximization (EM) algorithm. This result is significant as it provides a useful mathematical tool in the explicit form of a mixture of lognormal distributions.
Keywords: A mixture of lognormal distributions, modeling call holding times, public safety network.
574 Cognitive Relaying in Interference Limited Spectrum Sharing Environment: Outage Probability and Outage Capacity
Authors: Md Fazlul Kader, Soo Young Shin
Abstract:
In this paper, we consider a cognitive relay network (CRN) in which the primary receiver (PR) is protected by a peak transmit power constraint P̄_ST and/or a peak interference power constraint Q. In addition, the interference effect from the primary transmitter (PT) is considered to show its impact on the performance of the CRN. We investigate the outage probability (OP) and outage capacity (OC) of the CRN by deriving closed-form expressions over the Rayleigh fading channel. Results show that both the OP and OC improve as the number of cooperative relay nodes increases, as well as when the PT is far away from the secondary receiver (SR).
Keywords: Cognitive relay, outage, interference limited, decode-and-forward (DF).
573 Modeling Default Probabilities of the Chosen Czech Banks in the Time of the Financial Crisis
Authors: Petr Gurný
Abstract:
One of the most important tasks in risk management is the correct determination of the probability of default (PD) of particular financial subjects. In this paper, the determination of a financial institution's PD according to credit-scoring models is discussed. The paper is divided into two parts. The first part is devoted to the estimation of three different models (based on linear discriminant analysis, logit regression, and probit regression) from a sample of almost three hundred US commercial banks. These models are then compared and verified on a control sample with a view to choosing the best one. The second part of the paper applies the chosen model to a portfolio of three key Czech banks to estimate their present financial stability. However, it is no less important to be able to estimate the evolution of PD in the future. For this reason, the second task in this paper is to estimate the probability distribution of the future PD for the Czech banks. The values of particular indicators are sampled randomly and the PDs' distribution estimated, under the assumption that the indicators are distributed according to a multidimensional subordinated Lévy model (the Variance Gamma model and the Normal Inverse Gaussian model, in particular). Although the obtained results show that all the banks are relatively healthy, there is still a high chance that "a financial crisis" will occur, at least in terms of probability. This is indicated by the estimation of various quantiles in the estimated distributions. Finally, it should be noted that the applicability of the estimated model (with respect to the data used) is limited to the recessionary phase of the financial market.
Keywords: Credit-scoring Models, Multidimensional Subordinated Lévy Model, Probability of Default.
572 Fusion Classifier for Open-Set Face Recognition with Pose Variations
Authors: Gee-Sern Jison Hsu
Abstract:
A fusion classifier composed of two modules, one made by a hidden Markov model (HMM) and the other by a support vector machine (SVM), is proposed to recognize faces with pose variations in open-set recognition settings. The HMM module captures the evolution of facial features across a subject's face using the subject's facial images only, without reference to the faces of others. Because of this captured evolutionary process of facial features, the HMM module retains a certain robustness against pose variations, yielding low false rejection rates (FRR) when recognizing faces across poses. This is, however, at the price of poor false acceptance rates (FAR) when recognizing other faces, because it is built upon within-class samples only. The SVM module in the proposed model follows a special design able to substantially diminish the FAR and further lower the FRR. The proposed fusion classifier has been evaluated on the CMU PIE database and proven effective for open-set face recognition with pose variations. Experiments have also shown that it outperforms a face classifier made by an HMM or SVM alone.
Keywords: Face recognition, open-set identification, hidden Markov model, support vector machines.
571 Application of Biometrics to Obtain High Entropy Cryptographic Keys
Authors: Sanjay Kanade, Danielle Camara, Dijana Petrovska-Delacretaz, Bernadette Dorizzi
Abstract:
In this paper, a two-factor scheme is proposed to generate cryptographic keys directly from biometric data, which, unlike passwords, are strongly bound to the user. The hash value of the reference iris code is used as a cryptographic key, and its length depends only on the hash function, being independent of any other parameter. The entropy of such keys is 94 bits, which is much higher than in any other comparable system. The most important and distinctive feature of this scheme is that it regenerates the reference iris code when provided with a genuine iris sample and the correct user password. Since iris codes obtained from two images of the same eye are not exactly the same, error correcting codes (a Hadamard code and a Reed-Solomon code) are used to deal with the variability. The scheme proposed here can be used to provide keys for a cryptographic system and/or for user authentication. The performance of this system is evaluated on two publicly available iris biometric databases, the CBS and ICE databases. The operating point of the system (the values of the False Acceptance Rate (FAR) and False Rejection Rate (FRR)) can be set by properly selecting the error correction capacity (t_s) of the Reed-Solomon codes; e.g., on the ICE database, at t_s = 15, the FAR is 0.096% and the FRR is 0.76%.
570 Probability Distribution of Rainfall Depth at Hourly Time-Scale
Authors: S. Dan'azumi, S. Shamsudin, A. A. Rahman
Abstract:
Rainfall data at fine resolution, and knowledge of its characteristics, play a major role in the efficient design and operation of agricultural, telecommunication, runoff and erosion control, and water quality control systems. This paper studies the statistical distribution of hourly rainfall depth for 12 representative stations spread across Peninsular Malaysia. Hourly rainfall data covering periods of 10 to 22 years were collected, and their statistical characteristics were estimated. Three probability distributions, namely the Generalized Pareto, Exponential, and Gamma distributions, were proposed to model the hourly rainfall depth, and three goodness-of-fit tests, namely the Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared tests, were used to evaluate their fitness. Results indicate that the east coast of the Peninsula receives a higher depth of rainfall than the west coast; however, the rainfall frequency is found to be irregular. Results from the goodness-of-fit tests show that all three models fit the rainfall data at the 1% level of significance; however, the Generalized Pareto fits better than the Exponential and Gamma distributions and is therefore recommended as the best fit.
Keywords: Goodness-of-fit test, hourly rainfall, Malaysia, probability distribution.
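A sketch of the fitting-and-testing loop, using scipy's Generalized Pareto, Exponential, and Gamma families with a Kolmogorov-Smirnov check (the paper also used Anderson-Darling and Chi-Squared). The synthetic sample merely stands in for the Malaysian hourly records.

```python
from scipy import stats

# Synthetic positive hourly rainfall depths (mm); replace with records.
rain = stats.gamma.rvs(a=0.8, scale=5.0, size=500, random_state=1)

candidates = {
    "genpareto": stats.genpareto,
    "expon": stats.expon,
    "gamma": stats.gamma,
}
for name, dist in candidates.items():
    params = dist.fit(rain)                      # maximum likelihood fit
    ks_stat, p_value = stats.kstest(rain, name, args=params)
    print(f"{name:10s} KS = {ks_stat:.4f}, p = {p_value:.3f}")
```

The candidate with the smallest test statistic (largest p-value) would be the recommended fit.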
569 Establishing a Probabilistic Model of Extrapolated Wind Speed Data for Wind Energy Prediction
Authors: Mussa I. Mgwatu, Reuben R. M. Kainkwa
Abstract:
Wind is among the potential energy resources which can be harnessed to generate wind energy for conversion into electrical power. Due to the variability of wind speed with time and height, it is difficult to predict the generated wind energy optimally. In this paper, an attempt is made to establish a probabilistic model fitting the wind speed data recorded at the Makambako site in Tanzania. Wind speed and direction were measured using an anemometer (type AN1) and a wind vane (type WD1), respectively, both supplied by Delta-T Devices, at a measurement height of 2 m. Wind speeds were then extrapolated to a height of 10 m using the power law equation with an exponent of 0.47. Data were analyzed using MINITAB statistical software to show the variability of wind speed with time and height and to determine the underlying probability model of the extrapolated wind speed data. The results show that wind speeds at the Makambako site vary cyclically over time and conform to the Weibull probability distribution. From these results, the Weibull probability density function can be used to predict the wind energy.
Keywords: Probabilistic models, wind speed, wind energy
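The extrapolation and fitting steps are straightforward to reproduce. Below is a sketch using the stated power-law exponent of 0.47 and a two-parameter Weibull fit; the 2 m sample is synthetic, standing in for the Makambako record (the paper itself used MINITAB).

```python
from scipy import stats

def extrapolate_wind(v_ref, z_ref, z, alpha=0.47):
    """Power-law wind profile: v(z) = v_ref * (z / z_ref) ** alpha."""
    return v_ref * (z / z_ref) ** alpha

# Synthetic 2 m wind speeds (m/s); replace with the site record.
v2 = stats.weibull_min.rvs(c=2.0, scale=4.0, size=1000, random_state=7)
v10 = extrapolate_wind(v2, z_ref=2.0, z=10.0)

# Fit a two-parameter Weibull (location fixed at zero).
shape, loc, scale = stats.weibull_min.fit(v10, floc=0)
print(f"Weibull shape k = {shape:.2f}, scale c = {scale:.2f} m/s")
```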
568 Daemon-Based Distributed Deadlock Detection and Resolution
Authors: Z. RahimAlipour, A. T. Haghighat
Abstract:
Detecting deadlock is one of the important problems in distributed systems, and different solutions have been proposed for it. Among the many deadlock detection algorithms, edge-chasing has been the most widely used. In an edge-chasing algorithm, a special message called a probe is created and sent along dependency edges. When the initiator of a probe receives the probe back, the existence of a deadlock is revealed. But these algorithms are not problem-free. One problem associated with them is that they cannot detect some deadlocks and may even identify false deadlocks. A key point not mentioned in the literature is how a process that is blocked waiting to obtain required resources can actually respond to probe messages in the system. Also, the question of which process should be victimized in order to achieve better performance when multiple cycles exist within one single process in the system has received little attention. In this paper, one of the basic concepts of the operating system, the daemon, is used to solve these problems. The proposed algorithm sends probe messages to the mandatory daemons and collects enough information to effectively identify and resolve multi-cycle deadlocks in distributed systems.
Keywords: Distributed system, distributed deadlock detection and resolution, daemon, false deadlock.
567 The Research and Application of M/M/1/N Queuing Model with Variable Input Rates, Variable Service Rates and Impatient Customers
Authors: Quanru Pan
Abstract:
How a business should set its service speeds to make the biggest profit is a problem worthy of study, and it is discussed in this paper using queuing theory. An M/M/1/N queuing model with variable input rates, variable service rates, and impatient customers is established, and the following results are derived: the stationary distribution of the model; the relationship between the stationary distribution and the probability that n customers remain in the system when a customer leaves (not including the departing customer); the busy period of the system; the average operating cycle; the loss probability for customers not entering the system upon arrival; the mean number of customers who leave the system out of impatience; the loss probability for customers not joining the queue due to the limited capacity of the system; and many other indicators. This paper also shows that the following claim is not correct: the more customers the business serves, the more profit it will make. Finally, this paper points out the appropriate service speeds the business should maintain to maximize profit.
Keywords: Variable input rates, impatient customer, variable service rates, profit maximization.
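For orientation, the stationary distribution of the plain constant-rate M/M/1/N queue is the truncated geometric p_n ∝ (λ/μ)^n; the paper's variable-rate, impatient-customer model generalizes this. A baseline sketch:

```python
import numpy as np

def mm1n_stationary(lam, mu, N):
    """Stationary distribution of a constant-rate M/M/1/N queue:
    p_n proportional to (lam/mu)^n, n = 0..N. The paper's model adds
    state-dependent rates and reneging, so this is only the baseline."""
    rho = lam / mu
    p = rho ** np.arange(N + 1)
    return p / p.sum()

p = mm1n_stationary(lam=4.0, mu=5.0, N=10)
print("P(system full, arrival lost):", p[-1])
print("Mean number in system:", np.dot(np.arange(len(p)), p))
```

The last entry, p_N, is the loss probability for arrivals finding the system at capacity, one of the indicators listed above.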
566 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test
Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman
Abstract:
At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can relatively quickly compare a number of the most commonly adopted probability distributions and parameter estimation methods using a Windows interface. The new version of FLIKE incorporates the multiple Grubbs-Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs-Beck test and the multiple Grubbs-Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson Type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs-Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution with the original Grubbs-Beck test. Between these two methods, differences in flood quantile estimates of up to 61% have been found for the six study catchments. It has also been found that the GEV distribution (with L moments) and the LP3 distribution with the multiple Grubbs-Beck test provide quite similar results in most cases; however, a difference of up to 38% has been noted in flood quantiles at the annual exceedance probability (AEP) of 1 in 100 for one catchment. This finding needs to be confirmed with a greater number of stations across other Australian states.
Keywords: Floods, FLIKE, probability distributions, flood frequency, outlier.
565 Forecast Based on an Empirical Probability Function with an Adjusted Error Using Propagation of Error
Authors: Oscar Javier Herrera, Manuel Ángel Camacho
Abstract:
This paper addresses a cutting-edge method of business demand forecasting, based on an empirical probability function, for cases when the historical behavior of the data is random. Additionally, it presents error determination based on the numerical technique of propagation of errors. The methodology involved characterizing and diagnosing the demand planning process as part of production management, and then investigating new ways to predict demand through probability techniques and to calculate the associated error using numerical methods, all based on the behavior of the data. This analysis was carried out considering the specific business circumstances of a company in the communications sector located in the city of Bogota, Colombia. In conclusion, using this application it was possible to obtain the adequate stock of the products required by the company to provide its services, helping the company reduce its service time, increase the client satisfaction rate, reduce stock which had not been in rotation for a long time, code its inventory, and plan reorder points for the replenishment of stock.
Keywords: Demand forecasting, empirical distribution, propagation of error.
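The propagation-of-errors step combines input uncertainties through first-order partial derivatives: for independent inputs, σ_f² = Σ_i (∂f/∂x_i)² σ_i². A generic numeric sketch follows; the forecast function and all values are invented for illustration, not taken from the paper.

```python
import numpy as np

def propagate_error(f, x, sigmas, h=1e-6):
    """First-order propagation of error for independent inputs:
    sigma_f^2 = sum_i (df/dx_i)^2 * sigma_i^2,
    with central-difference partial derivatives."""
    x = np.asarray(x, dtype=float)
    var = 0.0
    for i, s in enumerate(sigmas):
        dx = np.zeros_like(x)
        dx[i] = h
        dfdx = (f(x + dx) - f(x - dx)) / (2 * h)
        var += (dfdx * s) ** 2
    return np.sqrt(var)

# e.g. forecast demand = base demand * seasonal factor, both uncertain
f = lambda v: v[0] * v[1]
print(propagate_error(f, x=[120.0, 1.15], sigmas=[8.0, 0.05]))
```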
564 An Extension of the Krätzel Function and Associated Inverse Gaussian Probability Distribution Occurring in Reliability Theory
Authors: R. K. Saxena, Ravi Saxena
Abstract:
In view of their importance and usefulness in reliability theory and probability distributions, several generalizations of the inverse Gaussian distribution and the Krätzel function have been investigated in recent years. This has motivated the authors to introduce and study a new generalization of the inverse Gaussian distribution and the Krätzel function associated with a product of a Bessel function of the third kind K_Q(z) and a Fox-Wright generalized hypergeometric function introduced in this paper. The introduced function turns out to be a unified gamma-type function. Its incomplete forms are also discussed. Several properties of this gamma-type function are obtained. By means of this generalized function, we introduce a generalization of the inverse Gaussian distribution, which is useful in reliability analysis, diffusion processes, radio techniques, etc. The inverse Gaussian distribution thus introduced also provides a generalization of the Krätzel function. Some basic statistical functions associated with this probability density function, such as the moments, the Mellin transform, the moment generating function, the hazard rate function, and the mean residual life function, are also obtained.
Keywords: Fox-Wright function, inverse Gaussian distribution, Krätzel function, Bessel function of the third kind.
563 Investigation and Calculation of Seismic Reliability of Structures
Authors: Panam. Zarfam, Mohsen. Javan Pour
Abstract:
Recently, analysis and design of structures based on reliability theory have been the center of attention. The reason for this attention is the existence of natural and random structural parameters such as material specifications, external loads, geometric dimensions, etc. By means of reliability theory, uncertainties resulting from the statistical nature of the structural parameters can be expressed as mathematical equations, and safety and operational considerations can be taken into account in the design process. According to this theory, it is possible to study the failure probability of not only a specific element but also the entire system. Therefore, after being assured of the safety of every element, their reciprocal effects on the safety of the entire system can be investigated.
Keywords: Probability, reliability, statistics, uncertainty
562 Confidence Intervals for the Normal Mean with Known Coefficient of Variation
Authors: Suparat Niwitpong
Abstract:
In this paper, we propose two new confidence intervals for the normal population mean with known coefficient of variation. This situation occurs normally in environmental and agricultural experiments, where the scientist knows the coefficient of variation of the experiment. We propose two new confidence intervals for this problem, one based on the recent work of Searls [5] and one based on a method proposed here for the first time. We derive analytic expressions for the coverage probability and the expected length of each confidence interval. Monte Carlo simulation is used to assess the performance of these intervals based on their expected lengths.
Keywords: confidence interval, coverage probability, expected length, known coefficient of variation.
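A quick way to see what coverage probability measures is to simulate it. The sketch below evaluates a simple known-CV interval that plugs σ = CV·x̄ into the usual z-interval, in the spirit of Searls-type estimators; it is not necessarily either of the paper's proposed intervals.

```python
import numpy as np

rng = np.random.default_rng(3)
mu, cv, n, z = 10.0, 0.2, 30, 1.96   # known coefficient of variation
sigma = cv * mu

hits = 0
trials = 20_000
for _ in range(trials):
    x = rng.normal(mu, sigma, n)
    xbar = x.mean()
    # Known-CV interval: estimate sigma as cv * xbar in the z-interval.
    half = z * cv * xbar / np.sqrt(n)
    hits += (xbar - half <= mu <= xbar + half)

print(f"Estimated coverage probability: {hits / trials:.4f}")
```

The estimated coverage should sit near the nominal 95%, and the same loop with interval widths recorded gives the expected-length comparison the abstract describes.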