Search results for: exceedance probability
228 A New blaVIM Gene in a Pseudomonas putida Isolated from ENT Units in Sulaimani Hospitals
Authors: Dalanya Asaad Mohammed, Dara Abdul Razaq
Abstract:
A total of twenty tonsil biopsies were collected from children undergoing tonsillectomy at the teaching hospital ENT department and Kurdistan private hospital in Sulaimani city. All biopsies were homogenized and cultured; the obtained bacterial isolates were purified and identified by biochemical tests and the VITEK 2 compact system. Among the twenty studied samples, only one Pseudomonas putida, identified with a probability of 99%, was isolated. Antimicrobial susceptibility testing was carried out by the disk diffusion method; the Pseudomonas putida isolate showed resistance to all antibiotics used except vancomycin. The isolate was further subjected to PCR and DNA sequence analysis of the blaVIM gene using different sets of primers for different regions of the VIM gene. The isolate was PCR positive for the blaVIM gene, and DNA sequencing was performed to determine its sequence. Sequence alignment with blaVIM genes previously recorded in the NCBI database showed that the P. putida isolate carries a different blaVIM gene.
Keywords: Clinical isolates, Putida, Sulaimani, Vim gene.
227 A Comparative Study on Survival and Growth of Larvivorous Fish, Rasbora daniconius, Puntius ticto, and Puntius Conchonius
Authors: Lavkush Kumar Brahman, Ramesh Chandra
Abstract:
Experiments were carried out on the survival and growth of Rasbora daniconius, Puntius ticto and Puntius conchonius. The motivation of the study was to obtain information for growing the fish on a commercial scale for their use as biological control agents against mosquito larvae. The effects of temperature, total hardness, DO, pH and feed on the growth of the fish were also investigated. An excessive value of total hardness was found because the water in the Chitrakoot area is very rich in calcium ions. There were significant increases in the growth rates of the fish as the temperature was increased from 28°C to 30°C; a further increase in temperature up to 32°C did not further affect growth. Positive and highly significant correlations of 0.991488, 0.9581 and 0.9935 were found between the length and weight of P. ticto, P. conchonius and R. daniconius, respectively. The regression was significant at the 5% level of probability.
Keywords: Indigenous fish, DO, larvae, mosquito, pH, Temperature, total hardness.
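The length–weight correlations and the regression significance reported above follow from a standard least-squares analysis. The sketch below, using entirely hypothetical length and weight values rather than the study's measurements, shows how such a correlation coefficient and its significance at the 5% level would be computed.

```python
# Hypothetical length (cm) and weight (g) measurements; not data from the study.
from scipy import stats

length = [2.1, 2.6, 3.0, 3.4, 3.9, 4.3, 4.8]
weight = [0.12, 0.22, 0.35, 0.50, 0.74, 0.98, 1.30]

# Pearson correlation between length and weight.
r, p_corr = stats.pearsonr(length, weight)

# Simple linear regression: weight = slope * length + intercept.
fit = stats.linregress(length, weight)

print(f"correlation r = {r:.4f}, p = {p_corr:.4g}")
print(f"slope = {fit.slope:.4f}, p = {fit.pvalue:.4g}, "
      f"significant at 5%: {fit.pvalue < 0.05}")
```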
226 Optimization of the Nutrient Supplements for Cellulase Production with the Basal Medium Palm Oil Mill Effluent
Authors: Rashid S S, Alam M Z, Karim M I A, Salleh M H
Abstract:
A statistical optimization study was carried out to design a medium composition that produces the optimum cellulolytic enzyme, where palm oil mill effluent (POME) was used as the basal medium and the filamentous fungus Trichoderma reesei RUT-C30 was used in liquid state bioconversion (LSB). POME at 2% (w/v) total suspended solids (TSS), supplemented with 1% (w/v) cellulose, 0.5% (w/v) peptone and 0.02% (v/v) Tween 80, was estimated to produce the optimum CMCase activity of 18.53 U/ml through statistical analysis followed by a face centered central composite design (FCCCD). The probability values of cellulose (<0.0011) and peptone (0.0021) indicated their significant effect on cellulase production, with a coefficient of determination (R2) of 0.995.
Keywords: Face centered central composite design (FCCCD), Liquid state bioconversion (LSB), Palm oil mill effluent, Trichoderma reesei RUT C-30.
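A face centered central composite design is typically analyzed by fitting a second-order (quadratic) response surface to the measured responses and inspecting the term p-values and R². The sketch below is a simplified illustration with only two coded factors and hypothetical CMCase responses, not the three-factor experimental data of the study.

```python
import numpy as np

# Hypothetical coded levels (-1, 0, +1) for cellulose (x1) and peptone (x2),
# and hypothetical CMCase responses (U/ml); not the experimental data.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]], dtype=float)
y = np.array([12.1, 15.3, 14.0, 18.2, 13.5, 17.0, 13.9, 16.4, 17.8])

x1, x2 = X[:, 0], X[:, 1]
# Full quadratic model: b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

y_hat = A @ coef
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(coef, 3))
print(f"R^2 = {r2:.3f}")
```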
225 Assessment of Vulnerability Curves Using Vulnerability Index Method for Reinforced Concrete Structures
Authors: F. I. Belheouane, M. Bensaibi
Abstract:
The seismic feedback experience in Algeria has shown a higher percentage of damage for non-code-conforming reinforced concrete (RC) buildings. Furthermore, the vulnerability of these buildings is aggravated by many factors (e.g. weak seismic capacity, short columns, pounding effect, etc.). Consequently, seismic risk assessments were carried out on populations of buildings to identify the buildings most likely to undergo losses during an earthquake. The results of such studies are important in the mitigation of losses under future seismic events, as they allow strengthening interventions and disaster management plans to be drawn up. Within this paper, the state of existing structures is assessed using the vulnerability index method. This method allows the classification of RC constructions taking into account both structural and non-structural parameters, which are considered to be among the main parameters governing the vulnerability of the structure. Based on seismic feedback from past earthquakes, damage probability matrices (DPM) were also developed.
Keywords: Seismic vulnerability, Reinforced concrete buildings, Earthquake, DPM, Algeria.
224 Behavioral Signature Generation using Shadow Honeypot
Authors: Maros Barabas, Michal Drozd, Petr Hanacek
Abstract:
A novel behavioral detection framework is proposed to detect zero-day buffer overflow vulnerabilities (based on network behavioral signatures) using zero-day exploits, instead of the signature-based or anomaly-based detection solutions currently available for IDPS techniques. First, we present the detection model, which uses a shadow honeypot. Our system is used for online processing of network attacks and for generating a behavioral detection profile. The detection profile represents a dataset of 112 types of metrics describing the exact behavior of malware in the network. In this paper we present examples of generating behavioral signatures for two attacks: a buffer overflow exploit on an FTP server and the well-known Conficker worm. We demonstrate the visualization of important aspects by showing the differences between valid behavior and the attacks. Based on these metrics we can detect attacks with a very high probability of success; the process of detection is, however, very expensive.
Keywords: behavioral signatures, metrics, network, security design
223 A Thought on Exotic Statistical Distributions
Authors: R K Sinha
Abstract:
Statistical distributions are used to model the nature of various types of data sets. Although these distributions are mostly uni-modal, it is quite common to observe multiple modes in the distribution of the underlying variables, which makes precise modeling unrealistic. Observed data may lack smoothness not necessarily because of randomness; the lack of smoothness could also be due to non-randomness, resulting in zigzag curves, oscillations, humps, etc. The present paper argues that trigonometric functions, which have not been used in the probability functions of distributions so far, have the potential to capture this behavior if incorporated in the distribution appropriately. A simple distribution involving trigonometric functions (named the Sinoform Distribution) is illustrated in the paper with a data set. The paper demonstrates the importance of trigonometric functions, which have the characteristics to make statistical distributions exotic: it is possible to have multiple modes, oscillations and zigzag curves in the density, which could be suitable for explaining the underlying nature of a given data set.
Keywords: Exotic Statistical Distributions, Kurtosis, Mixture Distributions, Multi-modal
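As a toy illustration of how a trigonometric term can introduce oscillations and multiple modes into a density, the sketch below modulates a normal kernel with a sine factor and normalises numerically. It is not the Sinoform Distribution defined in the paper, only a hypothetical example of the general idea.

```python
import numpy as np

def sinoform_like_pdf(x, a=0.8, k=6.0, mu=0.0, sigma=1.0):
    """Toy density: a normal kernel modulated by a sine term.

    With 0 <= a < 1 the factor (1 + a*sin(k*x)) stays positive, so the
    numerically normalised result is a valid density exhibiting
    oscillations and several local modes.
    """
    raw = (1.0 + a * np.sin(k * x)) * np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    grid = np.linspace(mu - 8 * sigma, mu + 8 * sigma, 4001)
    raw_grid = (1.0 + a * np.sin(k * grid)) * np.exp(-0.5 * ((grid - mu) / sigma) ** 2)
    norm = raw_grid.sum() * (grid[1] - grid[0])   # Riemann-sum normalisation
    return raw / norm

x = np.linspace(-4, 4, 9)
print(np.round(sinoform_like_pdf(x), 4))          # oscillating, multi-modal values
```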
222 A Probability based Pair Extension Method in Protein 2-DE Gel Image Analysis
Authors: Yanhua Jin, Won Suk Lee
Abstract:
Two-dimensional gel electrophoresis (2-DE) is widely used in proteomics to separate thousands of proteins in a sample. By comparing the protein expression levels in a normal sample with those in a diseased one, it is possible to identify a meaningful set of marker proteins for the targeted disease. The major shortcomings of this approach are the inherent noise and the irregular geometric distortions of spots observed in 2-DE images. Varying experimental conditions are the major causes of these problems, which eventually lead to incorrect conclusions in the protein analysis of samples. In order to minimize the influence of these problems, this paper proposes a partition based pair extension method that performs spot-matching on a set of gel images multiple times and segregates the more reliable mapping results, which can improve the accuracy of gel image analysis. The improved accuracy of the proposed method is analyzed through various experiments on real 2-DE images of human liver tissues.
Keywords: Proteomics, spot-matching, two-dimensional electrophoresis.
221 An Improved Variable Tolerance RSM with a Proportion Threshold
Authors: Chen Wu, Youquan Xu, Dandan Li, Ronghua Yang, Lijuan Wang
Abstract:
In rough set models, the tolerance relation, similarity relation and limited tolerance relation solve different kinds of problems for incomplete information systems in which missing values occur. If two objects share the same few known attributes and have many unknown attributes, these relations cannot distinguish them well. In order to solve this problem, we previously presented two improved limited and variable precision rough set models, one symmetric and the other non-symmetric. Both use a more stringent condition to separate two small-probability equivalent objects into different classes. These two models need further detailed study. In the present paper, we form object classes from a new perspective compared with the first suggested model, and we overcome the non-symmetry disadvantage of the second suggested model. We discuss the relationships between and among several models and also address rule generation. The results obtained by applying the second model are more accurate and reasonable.
Keywords: Incomplete information system, rough set, symmetry, variable precision.
220 An Improved Ant Colony Algorithm for Genome Rearrangements
Authors: Essam Al Daoud
Abstract:
Genome rearrangement is an important area in computational biology and bioinformatics. The basic problem in genome rearrangements is to compute the edit distance, i.e., the minimum number of operations needed to transform one genome into another. Unfortunately, the unsigned genome rearrangement problem is NP-hard. In this study, an improved ant colony optimization algorithm to approximate the edit distance is proposed. The main idea is to convert the unsigned permutation to a signed permutation and to evaluate the ants by using the Kaplan algorithm. Two new operations are added to the standard ant colony algorithm: replacing the worst ants by re-sampling ants from a new probability distribution, and applying crossover operations to the best ants. The proposed algorithm is tested and compared with the improved breakpoint reversal sort algorithm using three datasets. The results indicate that the proposed algorithm achieves a better accuracy ratio than the previous methods.
Keywords: Ant colony algorithm, Edit distance, Genome breakpoint, Genome rearrangement, Reversal sort.
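Both heuristics compared above (ant colony search and breakpoint reversal sort) are driven by the number of breakpoints in the permutation, i.e., adjacent positions whose elements are not consecutive in either order. A minimal breakpoint counter for an unsigned permutation is sketched below; the example permutation is purely illustrative, and the full ant colony search is not reproduced.

```python
def breakpoints(perm):
    """Count breakpoints of an unsigned permutation of 1..n.

    The permutation is framed with 0 at the front and n+1 at the end;
    a breakpoint is any adjacent pair whose values differ by more than 1.
    The identity permutation has zero breakpoints.
    """
    n = len(perm)
    framed = [0] + list(perm) + [n + 1]
    return sum(1 for a, b in zip(framed, framed[1:]) if abs(a - b) != 1)

print(breakpoints([3, 1, 2, 5, 4]))   # example permutation -> 4 breakpoints
print(breakpoints([1, 2, 3, 4, 5]))   # identity -> 0
```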
219 Compton Scattering of Annihilation Photons as a Short Range Quantum Key Distribution Mechanism
Authors: Roman Novak, Matjaz Vencelj
Abstract:
The angular distribution of Compton scattering of two quanta originating in the annihilation of a positron with an electron is investigated as a quantum key distribution (QKD) mechanism in the gamma spectral range. The geometry of coincident Compton scattering is observed on the two sides as a way to obtain partially correlated readings on the quantum channel. We derive the noise probability density function of a conceptually equivalent prepare and measure quantum channel in order to evaluate the limits of the concept in terms of the device secrecy capacity and estimate it at roughly 1.9 bits per 1,000 annihilation events. The high error rate is well above the tolerable error rates of the common reconciliation protocols; therefore, the proposed key agreement protocol by public discussion requires key reconciliation using classical error-correcting codes. We constructed a prototype device based on the readily available monolithic detectors in the least complex setup.
Keywords: Compton scattering, gamma-ray polarization, quantum cryptography, quantum key distribution
218 Lane Detection Using Labeling Based RANSAC Algorithm
Authors: Yeongyu Choi, Ju H. Park, Ho-Youl Jung
Abstract:
In this paper, we propose a labeling based RANSAC algorithm for lane detection. Advanced driver assistance systems (ADAS) have been widely researched to avoid unexpected accidents, and lane detection is a necessary component for lane keeping assistance and lane departure prevention. The proposed vision based lane detection method applies Canny edge detection, inverse perspective mapping (IPM), the K-means algorithm, mathematical morphology operations and 8-connected component labeling. Next, random samples are selected from each labeled region for RANSAC. This sampling method selects points belonging to a lane with high probability. Finally, lane parameters of straight line or curve equations are estimated. Through simulations on video recorded during the daytime and at night, we show that the proposed method performs better than the existing RANSAC algorithm in various environments.
Keywords: Canny edge detection, k-means algorithm, RANSAC, inverse perspective mapping.
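The core RANSAC step described above, fitting a lane line to sampled points and keeping the largest consensus set, can be illustrated with a minimal line-fitting sketch. The points, inlier tolerance and iteration count below are hypothetical stand-ins for the samples drawn from a labeled lane region.

```python
import random
import numpy as np

def ransac_line(points, n_iter=200, inlier_tol=2.0, seed=0):
    """Fit y = a*x + b by RANSAC: repeatedly fit a line to two random
    points and keep the model with the largest inlier consensus set."""
    rng = random.Random(seed)
    pts = np.asarray(points, dtype=float)
    best_model, best_inliers = None, 0
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(list(map(tuple, pts)), 2)
        if x1 == x2:
            continue
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        residual = np.abs(pts[:, 1] - (a * pts[:, 0] + b))
        inliers = int(np.sum(residual < inlier_tol))
        if inliers > best_inliers:
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

# Hypothetical lane-region pixel coordinates plus a few outliers.
pts = [(x, 0.5 * x + 10 + np.random.normal(0, 0.5)) for x in range(0, 100, 5)]
pts += [(20, 80), (60, 5), (90, 120)]
print(ransac_line(pts))
```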
217 Performance Analysis of the First-Order Characteristics of Polling Systems Based on Parallel Limited (k = 1) Services Mode
Authors: Liu Yi, Bao Liyong
Abstract:
Aiming at the low efficiency of pipelined scheduling in periodic limited-service polling, this paper proposes a system service resource scheduling strategy with parallel optimized limited-service polling control. The paper constructs the polling queueing system and its mathematical model. First, the first-order and second-order characteristic parameter equations are obtained by partial differentiation of the probability generating function of the system state variables, and complete analytical expressions for each system parameter are deduced after a joint solution. The simulation results are consistent with the theoretically calculated values. The system performance analysis shows that the mean queue length and mean cycle time of the system are greatly improved, which allows the system to better adapt to the service demand of delay-sensitive data in dense data environments.
Keywords: Polling, parallel scheduling, mean queue length, average cycle time.
216 A Post Processing Method for Quantum Prime Factorization Algorithm based on Randomized Approach
Authors: Mir Shahriar Emami, Mohammad Reza Meybodi
Abstract:
Prime factorization based on a quantum approach is performed in two phases. The first phase is carried out on a quantum computer and the second phase on a classical computer (post processing). In the second phase the goal is to estimate the period r of the equation x^r ≡ 1 (mod N) and to find the prime factors of the composite integer N on the classical computer. In this paper we present a method based on a randomized approach for estimating the period r with a satisfactory probability, so that the composite integer N can be factorized; with the randomized approach, even if the guess of the period is not exactly the real period, at least one of the prime factors of the composite N can be found. Finally we present some important points for designing an emulator for quantum computer simulation.
Keywords: Quantum Prime Factorization, Randomized Algorithms, Quantum Computer Simulation, Quantum Computation.
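Once a candidate period r of x^r ≡ 1 (mod N) is available, the standard classical post-processing step recovers a factor of N from gcd(x^(r/2) ± 1, N), provided r is even and x^(r/2) ≢ -1 (mod N). The sketch below shows that standard step for a small illustrative N; it does not reproduce the paper's randomized period estimator.

```python
from math import gcd

def factor_from_period(x, r, N):
    """Classical Shor post-processing: try to extract a nontrivial factor
    of N from a (possibly guessed) period r satisfying x**r mod N == 1."""
    if r % 2 != 0:
        return None                      # odd period gives no factor this way
    y = pow(x, r // 2, N)
    if y == N - 1:
        return None                      # x^(r/2) == -1 (mod N): unlucky case
    for candidate in (gcd(y - 1, N), gcd(y + 1, N)):
        if 1 < candidate < N:
            return candidate
    return None

# Small illustrative composite, not taken from the paper.
N, x = 15, 7                             # the order of 7 modulo 15 is 4
print(factor_from_period(x, 4, N))       # -> 3 (or 5)
```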
215 Reliability-based Selection of Wind Turbines for Large-Scale Wind Farms
Authors: M. Fotuhi-Firuzabad, A. Salehi Dobakhshari
Abstract:
This paper presents a reliability-based approach to select appropriate wind turbine types for a wind farm considering site-specific wind speed patterns. An actual wind farm in the northern region of Iran, with one year of registered wind speed data, is studied. An analytic approach based on the total probability theorem is utilized to model the probabilistic behavior of both turbine availability and wind speed. Well-known probabilistic reliability indices such as loss of load expectation (LOLE), expected energy not supplied (EENS) and incremental peak load carrying capability (IPLCC) for wind power integration in the Roy Billinton Test System (RBTS) are examined. The turbine type achieving the highest reliability level is chosen as the most appropriate for the studied wind farm.
Keywords: Wind Turbine Generator, Wind Farm, Power System Reliability, Wind Turbine Type Selection
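For context, a basic loss of load expectation (LOLE) calculation combines the probability distribution of available generation (built from unit availabilities) with the hourly load. The sketch below uses a hypothetical two-state capacity model and a tiny load series; it is an illustration of the index, not the RBTS study itself.

```python
from itertools import product

def lole(units, load_hours):
    """LOLE (hours/period) for independent two-state generating units.

    units: list of (capacity_MW, availability) pairs.
    load_hours: list of hourly load values in MW.
    """
    total = 0.0
    for states in product([0, 1], repeat=len(units)):  # 0 = down, 1 = up
        prob, cap = 1.0, 0.0
        for up, (c, a) in zip(states, units):
            prob *= a if up else (1.0 - a)
            cap += c if up else 0.0
        # expected hours in which this capacity state cannot cover the load
        total += prob * sum(1 for load in load_hours if cap < load)
    return total

# Hypothetical units (conventional units plus a wind block) and hourly loads.
units = [(40, 0.97), (40, 0.97), (20, 0.90), (15, 0.35)]
load_hours = [70, 80, 90, 95, 85, 75]
print(f"LOLE = {lole(units, load_hours):.3f} hours over the period")
```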
214 Comparison of Neural Network and Logistic Regression Methods to Predict Xerostomia after Radiotherapy
Authors: Hui-Min Ting, Tsair-Fwu Lee, Ming-Yuan Cho, Pei-Ju Chao, Chun-Ming Chang, Long-Chang Chen, Fu-Min Fang
Abstract:
To evaluate the ability to predict xerostomia after radiotherapy, we constructed and compared neural network and logistic regression models. In this study, 61 patients who completed a questionnaire about their quality of life (QoL) before and after a full course of radiation therapy were included. Based on this questionnaire, statistical data about the condition of the patients’ salivary glands were obtained and used as inputs to the neural network and logistic regression models in order to predict the probability of xerostomia. Seven variables were then selected from the statistical data according to Cramer’s V and point-biserial correlation values and were used to train each model; the resulting performances were 0.88 and 0.89 for AUC, 9.20 and 7.65 for SSE, and 13.7% and 19.0% for MAPE, respectively. These results demonstrate that both the neural network and the logistic regression method are effective for predicting the condition of the parotid glands.
Keywords: NPC, ANN, logistic regression, xerostomia.
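The performance figures quoted above (AUC, SSE, MAPE) correspond to standard formulas. The short sketch below, with hypothetical predicted probabilities and observed values rather than the study's data, shows how each metric would be computed for either model.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical observed xerostomia outcomes (1 = xerostomia) and the
# probabilities predicted by a model; not the study's data.
y_true = np.array([0, 1, 0, 1, 1, 0, 1, 0, 1, 0])
y_prob = np.array([0.2, 0.8, 0.3, 0.6, 0.9, 0.4, 0.7, 0.1, 0.65, 0.35])

auc = roc_auc_score(y_true, y_prob)              # area under the ROC curve
sse = float(np.sum((y_true - y_prob) ** 2))      # sum of squared errors

# MAPE is shown on a hypothetical continuous target (e.g., a QoL score).
qol_true = np.array([55.0, 70.0, 62.0, 48.0, 75.0])
qol_pred = np.array([50.0, 74.0, 60.0, 55.0, 70.0])
mape = float(np.mean(np.abs((qol_true - qol_pred) / qol_true))) * 100

print(f"AUC = {auc:.2f}, SSE = {sse:.2f}, MAPE = {mape:.1f}%")
```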
213 Tracking Objects in Color Image Sequences: Application to Football Images
Authors: Mourad Moussa, Ali Douik, Hassani Messaoud
Abstract:
In this paper, we present a comparative study between two computer vision systems for object recognition and tracking. These algorithms describe two different approaches based on regions, constituted by sets of pixels, which parameterize objects in shot sequences. For image segmentation and object detection, the FCM technique is used, and the overlap between cluster distributions is minimized by the use of a suitable color space (other than RGB). The first technique takes into account a priori probabilities governing the computation of the various clusters to track objects. A Parzen kernel method is described that allows identifying the players in each frame, and we also show the importance of selecting the standard deviation of the Gaussian probability density function. Region matching is carried out by an algorithm that operates on the Mahalanobis distance between region descriptors in two subsequent frames and uses singular value decomposition to compute a set of correspondences satisfying both the principle of proximity and the principle of exclusion.
Keywords: Image segmentation, objects tracking, Parzen window, singular value decomposition, target recognition.
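The region-matching step relies on the Mahalanobis distance between region descriptors in consecutive frames. A minimal sketch of that distance computation is given below, with hypothetical descriptor vectors and a covariance estimated from them; the SVD-based correspondence selection itself is not reproduced.

```python
import numpy as np

def mahalanobis(u, v, cov_inv):
    """Mahalanobis distance between two region descriptor vectors."""
    d = np.asarray(u, dtype=float) - np.asarray(v, dtype=float)
    return float(np.sqrt(d @ cov_inv @ d))

# Hypothetical region descriptors (e.g., color means and area) in two frames.
frame_t = np.array([[120.0, 60.0, 30.0], [200.0, 40.0, 25.0]])
frame_t1 = np.array([[123.0, 58.0, 31.0], [60.0, 150.0, 40.0]])

cov = np.cov(np.vstack([frame_t, frame_t1]).T)   # descriptor covariance
cov_inv = np.linalg.pinv(cov)                    # pseudo-inverse for robustness

for i, u in enumerate(frame_t):
    dists = [mahalanobis(u, v, cov_inv) for v in frame_t1]
    print(f"region {i} in frame t -> closest region in t+1: {int(np.argmin(dists))}")
```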
212 Calculation of the Ceramics Weibull Parameters
Abstract:
The paper deals with the calculation of the parameters of a ceramic material from a set of destruction tests of ceramic heads of total hip joint endoprostheses. The standard way of calculating the material parameters consists in carrying out a set of 3- or 4-point bending tests on specimens cut out from parts of the ceramic material to be analysed. In the case of ceramic heads, it is not possible to cut out specimens of the required dimensions because the heads are too small (if the cut out specimens were smaller than the normalised ones, the material parameters derived from them would exhibit higher strength values than those the given ceramic material really has). For that reason, a special testing jig was made, in which 40 heads were destructed. From the measured values of circumferential strains of the head's external spherical surface at destruction, the state of stress in the head at destruction was established using the finite element method (FEM). From the values obtained, the sought parameters of the ceramic material were calculated using Weibull's weakest-link theory.
Keywords: Hip joint endoprosthesis, ceramic head, FEM analysis, Weibull's weakest-link theory, failure probability, material parameters
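Once the fracture stresses of the destructed heads are known from the FEM back-calculation, the two Weibull parameters (modulus m and characteristic strength σ0) are commonly estimated by median-rank linear regression of ln(-ln(1-F)) against ln(σ). The sketch below applies that standard procedure to hypothetical fracture stresses, not to the measured data of the paper.

```python
import numpy as np

def weibull_fit(stresses):
    """Estimate Weibull modulus m and characteristic strength sigma0
    by median-rank regression: ln(-ln(1-F)) = m*ln(sigma) - m*ln(sigma0)."""
    s = np.sort(np.asarray(stresses, dtype=float))
    n = len(s)
    ranks = np.arange(1, n + 1)
    F = (ranks - 0.3) / (n + 0.4)            # Bernard's median-rank estimator
    y = np.log(-np.log(1.0 - F))
    x = np.log(s)
    m, c = np.polyfit(x, y, 1)               # slope = m, intercept = -m*ln(sigma0)
    sigma0 = np.exp(-c / m)
    return m, sigma0

# Hypothetical fracture stresses (MPa), for illustration only.
stresses = [410, 455, 470, 495, 510, 530, 545, 560, 590, 620]
m, sigma0 = weibull_fit(stresses)
print(f"Weibull modulus m = {m:.2f}, characteristic strength = {sigma0:.1f} MPa")
```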
211 Decision Tree for Competing Risks Survival Probability in Breast Cancer Study
Authors: N. A. Ibrahim, A. Kudus, I. Daud, M. R. Abu Bakar
Abstract:
Competing risks survival data, which comprise more than one type of event, have been used in many applications, one of them being clinical studies (e.g. breast cancer studies). The decision tree method can be extended to competing risks survival data by modifying the split function so as to accommodate two or more risks which might be dependent on each other. Recently, researchers have constructed decision trees for recurrent survival time data using frailty and marginal modelling. We further extend the method to the case of competing risks. In this paper, we develop a decision tree method for competing risks survival time data based on proportional hazards for the subdistribution of competing risks. In particular, we grow a tree by using the deviance statistic. An application to breast cancer data is presented. Finally, to investigate the performance of the proposed method, simulation studies on the identification of the true group of observations were executed.
Keywords: Competing risks, Decision tree, Simulation, Subdistribution Proportional Hazard.
210 An Algorithm for Determining the Arrival Behavior of a Secondary User to a Base Station in Cognitive Radio Networks
Authors: Danilo López, Edwin Rivas, Leyla López
Abstract:
This paper presents the development of an algorithm that predicts the arrival of a secondary user (SU) to a base station (BS) in an infrastructure-based cognitive network, requesting a Best Effort (BE) or Real Time (RT) type of service with a determined bandwidth (BW), implemented with neural networks. The algorithm dynamically uses a neural network construction technique based on the geometric pyramid topology and trains a Multilayer Perceptron Neural Network (MLPNN) on the historical arrivals of an SU to estimate future requests. This allows the information in the BS to be managed efficiently, since the arrival of the SUs is anticipated ahead of the stage of selecting the best channel in the CRN. As a result, the software application determines the probability of arrival at a future point in time and calculates the performance metrics to measure the effectiveness of the predictions made.
Keywords: Cognitive radio, MLPNN, base station, prediction, best effort, real time.
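The geometric pyramid topology mentioned above sizes the hidden layers so that layer widths form a geometric progression between the input and output dimensions. A minimal sketch of that sizing rule is shown below with hypothetical input/output sizes; the actual MLPNN training is not reproduced.

```python
def pyramid_layer_sizes(n_inputs, n_outputs, n_hidden_layers=2):
    """Hidden layer sizes following the geometric pyramid rule:
    widths form a geometric progression from n_inputs down to n_outputs."""
    ratio = (n_outputs / n_inputs) ** (1.0 / (n_hidden_layers + 1))
    return [max(1, round(n_inputs * ratio ** (i + 1)))
            for i in range(n_hidden_layers)]

# Hypothetical dimensions: e.g., 8 arrival-history features, 2 outputs (BE, RT).
print(pyramid_layer_sizes(8, 2, n_hidden_layers=2))   # -> [5, 3]
```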
209 Bin Bloom Filter Using Heuristic Optimization Techniques for Spam Detection
Authors: N. Arulanand, K. Premalatha
Abstract:
A Bloom filter is a probabilistic and memory efficient data structure designed to answer rapidly whether an element is present in a set. It can report that an element is definitely not in the set, whereas a reported presence holds only with a certain probability. The trade-off of using a Bloom filter is a certain configurable risk of false positives. The odds of a false positive can be made very low if the number of hash functions is sufficiently large. For spam detection, a weight is attached to each set of elements. The spam weight for a word is a measure used to rate the e-mail, and each word is assigned to a Bloom filter based on its weight. The proposed work introduces an enhanced concept of the Bloom filter called the Bin Bloom Filter (BBF). The performance of the BBF over the conventional Bloom filter is evaluated under various optimization techniques. Real and synthetic data sets are used for experimental analysis, and results are reported for bin sizes 4, 5, 6 and 7. Analysis of the results shows that the BBF using heuristic techniques performs better than the traditional Bloom filter in spam detection.
Keywords: Cuckoo search algorithm, levy’s flight, metaheuristic, optimal weight.
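As background for the BBF, a plain Bloom filter can be sketched in a few lines: k hash functions map each word to bit positions, and membership queries may yield false positives but never false negatives. The word list and parameters below are hypothetical, and the bin/weight layer of the BBF is not reproduced.

```python
import hashlib

class BloomFilter:
    def __init__(self, m_bits=1024, k_hashes=4):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits)          # one byte per bit, for simplicity

    def _positions(self, item):
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def __contains__(self, item):
        # False means "definitely not present"; True means "probably present".
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
for word in ["viagra", "lottery", "winner"]:   # hypothetical spam-weighted words
    bf.add(word)
print("lottery" in bf, "meeting" in bf)        # True, (almost certainly) False
```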
208 Networked Implementation of Milling Stability Optimization with Bayesian Learning
Authors: C. Ramsauer, J. Karandikar, D. Leitner, T. Schmitz, F. Bleicher
Abstract:
Machining instability, or chatter, can impose an important limitation on discrete part machining. In this work, a networked implementation of milling stability optimization with Bayesian learning is presented. The milling process was monitored with a wireless sensory tool holder instrumented with an accelerometer at TU Wien, Vienna, Austria. The recorded data from a milling test cut were used to classify the cut as stable or unstable based on a frequency analysis. The test cut result was used in a Bayesian stability learning algorithm at the University of Tennessee, Knoxville, Tennessee, USA. The algorithm calculated the probability of stability as a function of axial depth of cut and spindle speed based on the test result and recommended parameters for the next test cut. This iterative process between the two transatlantic locations was repeated until convergence to a stable optimal process parameter set was achieved.
Keywords: Bayesian learning, instrumented tool holder, machining stability, optimization strategy.
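One simple way to realize the Bayesian update described above is to place a prior over the unknown critical (limiting) axial depth at a given spindle speed and update it after each stable/unstable test cut. The grid, prior and small misclassification probability below are assumptions for illustration and not the algorithm actually run at the two sites.

```python
import numpy as np

# Candidate critical depths (mm) at one spindle speed, with a uniform prior.
depth_grid = np.linspace(0.5, 10.0, 96)
prior = np.ones_like(depth_grid) / depth_grid.size
eps = 0.05                                    # assumed misclassification probability

def update(prior, tested_depth, was_stable):
    """Bayes update: a stable cut implies the critical depth likely exceeds
    the tested depth; an unstable cut implies the opposite."""
    consistent = depth_grid >= tested_depth if was_stable else depth_grid < tested_depth
    likelihood = np.where(consistent, 1.0 - eps, eps)
    posterior = prior * likelihood
    return posterior / posterior.sum()

posterior = update(prior, tested_depth=3.0, was_stable=True)
posterior = update(posterior, tested_depth=6.0, was_stable=False)
p_stable_at_4mm = posterior[depth_grid >= 4.0].sum()  # P(critical depth >= 4 mm)
print(f"P(stable cut at 4 mm) ~ {p_stable_at_4mm:.2f}")
```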
207 Real-Time Testing of Steel Strip Welds based on Bayesian Decision Theory
Authors: Julio Molleda, Daniel F. García, Juan C. Granda, Francisco J. Suárez
Abstract:
One of the main problems in a steel strip manufacturing line is the breakage of any of the welds carried out between steel coils, which are used to produce the continuous strip to be processed. A weld breakage results in a stop of the manufacturing line of several hours, during which the damage caused by the breakage must be repaired; after the repair, a restarting process of the line is necessary before production can continue. To minimize this problem, a human operator must visually and manually inspect each weld in order to avoid its breakage during the manufacturing process. The work presented in this paper is based on Bayesian decision theory and presents an approach to detect defective steel strip welds in real time. This approach quantifies the trade-offs between the various classification decisions using probability and the costs that accompany such decisions.
Keywords: Classification, Pattern Recognition, Probabilistic Reasoning, Statistical Data Analysis.
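The Bayes decision rule with asymmetric costs that this approach builds on can be written down directly: classify a weld as defective when the expected cost (risk) of accepting it exceeds the cost of rejecting it. The posterior probabilities and cost values in the sketch below are hypothetical.

```python
def classify_weld(p_defect_given_x, cost_accept_defective=100.0, cost_reject_good=5.0):
    """Minimum-risk Bayes decision for one weld.

    p_defect_given_x: posterior probability that the weld is defective,
    given its measured features x.
    """
    risk_accept = p_defect_given_x * cost_accept_defective       # missed breakage
    risk_reject = (1.0 - p_defect_given_x) * cost_reject_good    # unnecessary stop
    return "reject" if risk_accept > risk_reject else "accept"

# With a 100:5 cost ratio, even a modest defect probability triggers rejection.
for p in (0.01, 0.05, 0.20):
    print(p, classify_weld(p))
```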
206 Application of Artificial Neural Network in Assessing Fill Slope Stability
Authors: An-Jui. Li, Kelvin Lim, Chien-Kuo Chiu, Benson Hsiung
Abstract:
This paper details the utilization of artificial intelligence (AI) in the field of slope stability, whereby quick and convenient solutions can be obtained using the developed tool. The AI technique used in this study is the artificial neural network (ANN), while the slope stability analyses are carried out with finite element limit analysis methods. The developed tool allows the prompt prediction of the safety factors of fill slopes and their corresponding probability of failure (depending on the degree of variation of the soil parameters), which gives the practicing engineer a reasonable basis for decision making. In fact, the successful use of the Extreme Learning Machine (ELM) algorithm shows that slope stability analysis is no longer confined to the conventional methods of modeling, which at times may be tedious and repetitive during the preliminary design stage, where the focus is more on cost-saving options than on detailed design. Therefore, similar ANN-based tools can be further developed to assist engineers in this aspect.
Keywords: Landslide, limit analysis, ANN, soil properties.
205 Statistical Modeling of Mobile Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes
Authors: Jihad S. Daba, J. P. Dubois
Abstract:
Understanding the statistics of non-isotropic scattering multipath channels that fade randomly with respect to time, frequency, and space in a mobile environment is crucial for the accurate detection of received signals in wireless and cellular communication systems. In this paper, we derive stochastic models for the probability density function (PDF) of the shift in the carrier frequency caused by the Doppler effect on the received illuminating signal in the presence of a dominant line of sight. Our derivation is based on a generalized Clarke's model and a two-wave partially developed scattering model, where the statistical distribution of the frequency shift is shown to be consistent with the power spectral density of the Doppler shifted signal.
Keywords: Doppler shift, filtered Poisson process, generalized Clarke's model, non-isotropic scattering, partially developed scattering, Rician distribution.
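For reference, under the classical isotropic Clarke model (no dominant line of sight), the PDF of the Doppler shift f has the well-known U-shaped form p(f) = 1/(π f_d sqrt(1 - (f/f_d)^2)) for |f| < f_d, where f_d is the maximum Doppler frequency. The sketch below evaluates this baseline density with a hypothetical f_d; the paper's non-isotropic, line-of-sight generalization is not reproduced.

```python
import numpy as np

def clarke_doppler_pdf(f, f_d):
    """PDF of the Doppler frequency shift for isotropic (Clarke) scattering,
    valid for |f| < f_d; zero outside that range."""
    f = np.asarray(f, dtype=float)
    pdf = np.zeros_like(f)
    inside = np.abs(f) < f_d
    pdf[inside] = 1.0 / (np.pi * f_d * np.sqrt(1.0 - (f[inside] / f_d) ** 2))
    return pdf

f_d = 100.0                                     # hypothetical max Doppler shift (Hz)
f = np.linspace(-99.0, 99.0, 7)
print(np.round(clarke_doppler_pdf(f, f_d), 5))  # U-shaped: largest near +/- f_d
```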
204 A Balanced Cost Cluster-Heads Selection Algorithm for Wireless Sensor Networks
Authors: Ouadoudi Zytoune, Youssef Fakhri, Driss Aboutajdine
Abstract:
This paper focuses on reducing the power consumption of wireless sensor networks. To that end, the communication protocol LEACH (Low-Energy Adaptive Clustering Hierarchy) is modified: we extend LEACH's stochastic cluster-head selection algorithm by modifying the probability of each node to become a cluster-head based on the energy it requires to transmit to the sink. We present an efficient energy aware routing algorithm for wireless sensor networks. Our contribution consists in a rotating selection of cluster-heads that considers the remoteness of the nodes from the sink and then the nodes' residual energy. This choice allows a better distribution of the transmission energy in the network. The cluster-head selection algorithm is completely decentralized. Simulation results show that the energy consumption is significantly reduced compared with the previous clustering based routing algorithm for sensor networks.
Keywords: Wireless Sensor Networks, Energy efficiency, Wireless Communications, Clustering-based algorithm.
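The modification described above can be sketched as a weighting of the usual LEACH election probability by the energy each node would need to reach the sink (for example, a d^2 free-space path-loss cost). The node positions, energy model and scaling in the sketch below are assumptions for illustration, not the paper's exact formulation.

```python
import math
import random

def election_probability(p_base, dist_to_sink, dists_all):
    """Scale the base cluster-head probability so that nodes with a smaller
    d^2 transmit cost to the sink are favoured as cluster-heads."""
    costs = [d ** 2 for d in dists_all]          # free-space d^2 energy cost
    mean_cost = sum(costs) / len(costs)
    return min(1.0, p_base * mean_cost / (dist_to_sink ** 2))

random.seed(1)
sink = (50.0, 175.0)
nodes = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(20)]
dists = [math.dist(n, sink) for n in nodes]

p_base = 0.05                                    # desired fraction of cluster-heads
for i, d in enumerate(dists[:5]):
    p = election_probability(p_base, d, dists)
    print(f"node {i}: distance {d:5.1f} m -> election probability {p:.3f}")
```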
203 Seismic Fragility of Weir Structure Considering Aging Degradation of Concrete Material
Authors: HoYoung Son, DongHoon Shin, WooYoung Jung
Abstract:
This study presents a seismic fragility framework for a concrete weir structure subjected to strong seismic ground motions; in particular, the concrete aging condition of the weir structure is taken into account. In order to understand the influence of concrete aging on the weir structure, the analytical seismic fragility of the structure was derived for pre- and post-deterioration concrete conditions using probabilistic risk assessment. The condition of the concrete weir after five years was assumed to represent concrete aging or deterioration; for this condition, the elastic modulus was simply reduced by about one-tenth compared with the initial condition of the weir structure. A 2D nonlinear finite element analysis considering the deterioration of the concrete was performed using ABAQUS, a commercial structural analysis program. The seismic fragility analysis showed that the simplified concrete degradation resulted in an increase of almost 45% in the probability of failure at Limit State 3, compared with the initial construction stage.
Keywords: Weir, FEM, concrete, fragility, aging
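Seismic fragility of this kind is conventionally summarized as a lognormal fragility curve, P(failure | IM = x) = Φ((ln x − ln m) / β), with median capacity m and dispersion β. The sketch below evaluates such curves for assumed pre- and post-deterioration medians to illustrate how aging shifts the probability of exceeding a limit state; the parameter values are hypothetical, not the study's results.

```python
from math import log, erf, sqrt

def fragility(im, median, beta):
    """Lognormal fragility: P(limit state exceeded | intensity measure im)."""
    z = (log(im) - log(median)) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))      # standard normal CDF

# Hypothetical Limit State 3 parameters (PGA in g) before and after aging.
median_initial, median_aged, beta = 0.60, 0.48, 0.45

for pga in (0.2, 0.4, 0.6):
    p0 = fragility(pga, median_initial, beta)
    p1 = fragility(pga, median_aged, beta)
    print(f"PGA {pga:.1f} g: P_fail initial {p0:.3f}, aged {p1:.3f}")
```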
202 Factor Resistance Comparison of a Long Shaft in 955 and 1055 John Deere Grain Combine
Authors: M. Azadbakht, M. E. Shayan, H. Jafari, E. Ghajarjazi, A. Kiapei
Abstract:
Transmission shafts are affected by various forces, for example during acceleration or sudden braking, by bending during transportation, and by vertical forces that lead to cuts. One of the main failures in combines reported by repairmen is shaft breakage. The structural resistance of the shaft against torque is very important at the beginning of the movement. For the stress analysis, a typical sample from one type of combine, the JD955, was selected. The long shaft in this combine was analyzed with the finite element method using the ANSYS 13 general-purpose package under static load. The analysis showed maximum stresses at the contact surfaces of the indentations and also at the locations where the diameter changes. The safety factor value is low in parts of the shaft, which increases the probability of failure at these points. To improve the conditions at the least cost, as a product improvement approach, the use of an alternative alloy is important.
Keywords: John Deere, Ansys, Shaft, Stress, Grain Combine harvester, Finite element, Failure.
201 An Intelligent System Framework for Generating Activity List of a Project Using WBS Mind map and Semantic Network
Authors: H. Iranmanesh, M. Madadi
Abstract:
The Work Breakdown Structure (WBS) is one of the most vital planning processes of project management, since it is considered to be the foundation of other processes such as scheduling, controlling and assigning responsibilities. In fact, the WBS or activity list is the heart of a project, and the omission of a single task can lead to an irrecoverable result. There are several tools for generating a project WBS. One of the most powerful is mind mapping, which is the basis of this article. A mind map is a method for thinking together and helps a project manager stimulate the minds of project team members to generate the project WBS. Here we generate the WBS of a sample building construction project with the aid of a mind map and an artificial intelligence (AI) programming language. Since the mind map structure cannot represent data in a computerized way, we convert it to a semantic network that can be used by the computer, and then extract the final WBS from the semantic network using the Prolog programming language. This method results in a comprehensive WBS and decreases the probability of omitting project tasks.
Keywords: Expert System, Mind map, Semantic network, Work breakdown structure
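The last step, extracting the activity list from the semantic network, is done in the paper with Prolog; purely for illustration, and to keep one language across the examples in this listing, the sketch below performs the same kind of traversal in Python over a hypothetical "part-of" network for a building project.

```python
# Hypothetical 'part-of' semantic network for a building construction project.
part_of = {
    "Building project": ["Foundation", "Structure", "Finishing"],
    "Foundation": ["Excavation", "Concrete pouring"],
    "Structure": ["Columns", "Beams", "Slabs"],
    "Finishing": ["Plastering", "Painting"],
}

def extract_wbs(node, code="1", depth=0):
    """Depth-first traversal that prints a numbered, indented activity list."""
    print("  " * depth + f"{code} {node}")
    for i, child in enumerate(part_of.get(node, []), start=1):
        extract_wbs(child, f"{code}.{i}", depth + 1)

extract_wbs("Building project")
```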
200 On Four Models of a Three Server Queue with Optional Server Vacations
Authors: Kailash C. Madan
Abstract:
We study four models of a three server queueing system with Bernoulli schedule optional server vacations. Customers arriving at the system one by one in a Poisson process are provided identical exponential service by three parallel servers according to a first-come, first-served queue discipline. In model A, all three servers may be allowed a vacation at one time; in model B, at most two of the three servers may be allowed a vacation at one time; in model C, at most one server is allowed a vacation; and in model D, no server is allowed a vacation. We study the steady state behavior of the four models and obtain steady state probability generating functions for the queue size at a random point of time for all states of the system. In model D, a known result for a three server queueing system without server vacations is derived.
Keywords: A three server queue, Bernoulli schedule server vacations, queue size distribution at a random epoch, steady state.
199 A Proposed Technique for Software Development Risks Identification by using FTA Model
Authors: Hatem A. Khater, A. Baith Mohamed, Sara M. Kamel
Abstract:
Software Development Risks Identification (SDRI) using Fault Tree Analysis (FTA) is a proposed technique to identify not only the risk factors but also the causes of the appearance of those risk factors in the software development life cycle. The method is based on analyzing the probable causes of software development failures before they become problems and adversely affect a project. It uses fault tree analysis to determine the probability of particular system-level failures, defined by a taxonomy for sources of software development risk, and to deduce a failure analysis in which an undesired state of the system is expressed by using Boolean logic to combine a series of lower-level events. The major purpose of this paper is to use the probabilistic calculations of the fault tree analysis approach to determine all possible causes that lead to the occurrence of software development risk.
Keywords: Software Development Risks Identification (SDRI), Fault Tree Analysis (FTA), Taxonomy for Software Development Risks (TSDR), Probabilistic Risk Assessment (PRA).
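The probabilistic core of FTA is the combination of basic-event probabilities through AND/OR gates (assuming independent events: the product for AND, and 1 − Π(1 − p) for OR). The sketch below evaluates a small hypothetical fault tree for a top-level "project failure" event; the event names and probabilities are illustrative and not taken from the taxonomy.

```python
from functools import reduce

def and_gate(probabilities):
    """All inputs must occur (independent events)."""
    return reduce(lambda acc, p: acc * p, probabilities, 1.0)

def or_gate(probabilities):
    """At least one input occurs (independent events)."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probabilities, 1.0)

# Hypothetical basic-event probabilities, for illustration only.
p_unstable_requirements = 0.30
p_poor_change_control = 0.20
p_inexperienced_team = 0.15
p_unrealistic_schedule = 0.25

p_requirements_risk = and_gate([p_unstable_requirements, p_poor_change_control])
p_execution_risk = or_gate([p_inexperienced_team, p_unrealistic_schedule])
p_top_event = or_gate([p_requirements_risk, p_execution_risk])

print(f"P(requirements risk) = {p_requirements_risk:.3f}")
print(f"P(execution risk)    = {p_execution_risk:.3f}")
print(f"P(project failure)   = {p_top_event:.3f}")
```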