Search results for: probability distribution.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2308

1288 The Cardiac Diagnostic Prediction Applied to a Designed Holter

Authors: Leonardo Juan Ramírez López, Javier Oswaldo Rodriguez Velasquez

Abstract:

We have designed a Holter monitor that records the heart's activity for over 24 hours, implemented a prediction methodology, and generated alarms as well as indicators for patients and treating physicians. Various diagnostic advances have been developed in clinical cardiology thanks to Holter implementation; however, their interpretation has largely been conditioned on clinical analysis and on measurements adjusted to diverse population characteristics, turning the Holter into a subjective examination. Validating such an examination requires vast population studies which, in turn, have not achieved the ultimate goal: mortality prediction. Given this context, our Insight Research Group developed a mathematical methodology that assesses cardiac dynamics through entropy and probability, creating a numerical and geometrical attractor which allows quantifying the normalcy of chronic and acute disease as well as the evolution between such states, and our Tigum Research Group developed a Holter device with 12 channels and advanced computer software. This has been shown in different contexts with 100% sensitivity and specificity results.
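
A minimal Python sketch (illustrative only, not the authors' implementation) of the general entropy-and-probability idea described above: map consecutive RR-intervals onto a two-dimensional numerical attractor, estimate the probability of each occupied region, and compute the Shannon entropy.

import numpy as np

def attractor_entropy(rr_ms, bin_ms=50):
    """Bin consecutive RR-interval pairs (a numerical attractor) and
    return the occupied regions' probabilities and Shannon entropy.
    Hypothetical illustration of the abstract's idea, not the authors' code."""
    pairs = np.column_stack((rr_ms[:-1], rr_ms[1:]))   # delay map: (RR_t, RR_t+1)
    cells = (pairs // bin_ms).astype(int)              # assign each pair to a grid cell
    _, counts = np.unique(cells, axis=0, return_counts=True)
    p = counts / counts.sum()                          # probability of each occupied cell
    return p, -np.sum(p * np.log(p))                   # Shannon entropy (nats)

rr = np.random.normal(800, 50, 100_000)                # stand-in for 24 h of RR intervals
p, S = attractor_entropy(rr)
print(f"occupied cells: {len(p)}, entropy: {S:.3f} nats")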

Keywords: Entropy, mathematical, prediction, cardiac, holter, attractor.

1287 Identification of Aircraft Gas Turbine Engine's Temperature Condition

Authors: Pashayev A., Askerov D., C. Ardil, Sadiqov R., Abdullayev P.

Abstract:

The groundlessness of applying probabilistic-statistical methods is especially apparent at the early stages of diagnosing the technical condition of an aviation gas turbine engine (GTE), when the available information is fuzzy, limited and uncertain; at these stages, soft computing technologies based on fuzzy logic and neural network methods are more effective. Multiple linear and nonlinear models (regression equations), derived from a statistical fuzzy data basis, are trained with high accuracy. When sufficient information is available, a recurrent algorithm is proposed for identifying the GTE technical condition from measurements of the input and output parameters of the multiple linear and nonlinear generalized models in the presence of measurement noise (a new recursive least squares method (LSM)). As an application of the technique, the technical condition of a new operating aviation engine, the D30KU-154, was estimated at an altitude of H = 10600 m.
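
A minimal sketch of a standard recursive least squares (RLS) identifier of the general kind the abstract invokes (textbook RLS on synthetic data, not the authors' new variant):

import numpy as np

def rls_identify(X, y, lam=0.99, delta=1e3):
    """Standard recursive least squares: update parameter estimates
    theta as each noisy measurement arrives (lam = forgetting factor)."""
    n = X.shape[1]
    theta = np.zeros(n)
    P = delta * np.eye(n)                    # initial covariance (large = uninformative)
    for x, yt in zip(X, y):
        k = P @ x / (lam + x @ P @ x)        # gain vector
        theta = theta + k * (yt - x @ theta) # correct by the prediction error
        P = (P - np.outer(k, x @ P)) / lam   # covariance update
    return theta

# synthetic linear engine model: y = 2*x1 - 0.5*x2 + noise
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = X @ np.array([2.0, -0.5]) + 0.05 * rng.normal(size=500)
print(rls_identify(X, y))                    # approaches [2.0, -0.5]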

Keywords: Identification of a technical condition, aviation gas turbine engine, fuzzy logic and neural networks.

1286 Performance Comparison of ADTree and Naive Bayes Algorithms for Spam Filtering

Authors: Thanh Nguyen, Andrei Doncescu, Pierre Siegel

Abstract:

Classification is an important data mining technique that can be used for data filtering in artificial intelligence. Its broad applicability to all kinds of data means it is used in nearly every field of modern life. Classification groups items according to the features judged interesting and useful. In this paper, we compare two classification methods, Naïve Bayes and ADTree, used to detect spam e-mail. This choice is motivated by the fact that the Naive Bayes algorithm is based on probability calculus, while the ADTree algorithm is based on decision trees. The classifiers' parameters are set to maximize the true positive rate and minimize the false positive rate. The experimental results present classification accuracy and a cost analysis with a view to choosing the optimal classifier for spam detection. We also point out the number of attributes needed to obtain a trade-off between attribute count and classification accuracy.
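
A minimal sketch of such a comparison (hypothetical toy corpus; scikit-learn has no ADTree, so a standard decision tree stands in for the tree-based side):

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.tree import DecisionTreeClassifier

# toy corpus, 1 = spam, 0 = ham (made-up data for illustration)
mails = ["win money now", "cheap pills offer", "meeting at noon",
         "project report attached", "win a free offer", "lunch tomorrow?"]
labels = [1, 1, 0, 0, 1, 0]

X = CountVectorizer().fit_transform(mails)   # bag-of-words attributes
for clf in (MultinomialNB(), DecisionTreeClassifier(random_state=0)):
    clf.fit(X, labels)
    print(type(clf).__name__, clf.score(X, labels))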

Keywords: Classification, data mining, spam filtering, naive Bayes, decision tree.

1285 Identification of Aircraft Gas Turbine Engine's Temperature Condition

Authors: Pashayev A., Askerov D., C. Ardil, Sadiqov R., Abdullayev P.

Abstract:

The groundlessness of applying probabilistic-statistical methods is especially apparent at the early stages of diagnosing the technical condition of an aviation gas turbine engine (GTE), when the available information is fuzzy, limited and uncertain; at these stages, soft computing technologies based on fuzzy logic and neural network methods are more effective. Multiple linear and nonlinear models (regression equations), derived from a statistical fuzzy data basis, are trained with high accuracy. When sufficient information is available, a recurrent algorithm is proposed for identifying the GTE technical condition from measurements of the input and output parameters of the multiple linear and nonlinear generalized models in the presence of measurement noise (a new recursive least squares method (LSM)). As an application of the technique, the technical condition of a new operating aviation engine, the D30KU-154, was estimated at an altitude of H = 10600 m.

Keywords: Identification of a technical condition, aviation gas turbine engine, fuzzy logic and neural networks.

1284 Video-Based Face Recognition Based On State-Space Model

Authors: Cheng-Chieh Chiang, Yi-Chia Chan, Greg C. Lee

Abstract:

This paper proposes a video-based framework for face recognition that identifies which faces appear in a video sequence. Our basic idea is akin to a tracking task: to track a selection of person candidates over time according to the observed visual features of face images in video frames. Hence, we employ the state-space model to formulate video-based face recognition, dividing the problem into two parts: the likelihood and the transition measures. The likelihood measure recognizes whose face is currently being observed in video frames, for which two-dimensional linear discriminant analysis (2DLDA) is employed. The transition measure estimates the probability of changing from an incorrect recognition at the previous stage to the correct person at the current stage. Moreover, extra nodes associated with head nodes are incorporated into our proposed state-space model. Experimental results are provided to demonstrate the robustness and efficiency of the proposed approach.
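
A minimal sketch of the underlying state-space recursion (illustrative numbers; in the paper the likelihoods come from 2DLDA rather than being fixed):

import numpy as np

def filter_identities(likelihoods, T, prior):
    """Forward recursion: belief over person identities per frame.
    likelihoods[t, i] = p(frame t observation | identity i)."""
    belief = prior.copy()
    for like in likelihoods:
        belief = like * (T.T @ belief)       # predict via transition, weight by likelihood
        belief /= belief.sum()               # normalize to a probability distribution
    return belief

n = 3                                        # three enrolled persons (toy example)
T = np.full((n, n), 0.1) + 0.7 * np.eye(n)   # mostly stay on the same identity
T /= T.sum(axis=1, keepdims=True)
likes = np.array([[0.5, 0.3, 0.2], [0.6, 0.2, 0.2], [0.7, 0.2, 0.1]])
print(filter_identities(likes, T, np.full(n, 1 / n)))   # person 0 most probable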

Keywords: 2DLDA, face recognition, state-space model, likelihood measure, transition measure.

1283 Socio-Demographic Characteristics and Psychosocial Consequences of Sickle Cell Disease: The Case of Patients in a Public Hospital in Ghana

Authors: Vincent A. Adzika, Franklin N. Glozah, Collins S. K. Ahorlu

Abstract:

Background: Sickle Cell Disease (SCD) is of major public-health concern globally, with the majority of patients living in Africa. Despite its relevance, there is a dearth of research on the socio-demographic distribution and psychosocial impact of SCD in Africa. The objective of this study was therefore to examine the socio-demographic distribution and psychosocial consequences of SCD among patients in Ghana, and to assess their quality of life and coping mechanisms. Methods: A cross-sectional design was used, involving the completion of questionnaires on socio-demographic characteristics, quality of life, anxiety and depression. Participants were 387 male and female patients attending a sickle cell clinic in a public hospital. Results: Results showed no gender or marital-status differences in anxiety and depression. However, there were age and educational-level differences in depression, but not in anxiety. In terms of quality of life, patients drew most satisfaction from the presence of love, friends and relatives, as well as from their home, community and neighbourhood environment. While pains of varied nature and severity were the major reasons for attending hospital, going to the hospital and having faith in God were the most frequently reported mechanisms for coping with unbearable SCD attacks. Multiple regression analysis showed that some socio-demographic and quality-of-life indicators had strong associations with anxiety and/or depression. Conclusion: It is recommended that a multi-dimensional intervention strategy incorporating psychosocial dimensions be considered in the treatment and management of SCD.

Keywords: Sickle cell disease, quality of life, anxiety, depression, socio-demographic characteristics, Ghana.

1282 A Framework of Monte Carlo Simulation for Examining the Uncertainty-Investment Relationship

Authors: George Yungchih Wang

Abstract:

This paper argues that increased uncertainty may, in certain situations, actually encourage investment. Since earlier studies mostly base their arguments on the assumption of geometric Brownian motion, this study extends the assumption to alternative stochastic processes, such as the mixed diffusion-jump, mean-reverting, and jump amplitude processes. A general Monte Carlo simulation approach is developed to derive the optimal investment trigger in situations where a closed-form solution cannot be readily obtained under an alternative process assumption. The main finding is that the overall effect of uncertainty on investment is captured by the probability of investing, and the relationship between uncertainty and investment appears to be an inverted U-shaped curve. The implication is that uncertainty does not always discourage investment, even under several sources of uncertainty. Furthermore, high-risk projects are not always dominated by low-risk projects, because high-risk projects may have a positive realization effect that encourages investment.
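
A minimal sketch of the simulation idea under the baseline geometric Brownian motion assumption (hypothetical parameters): estimate the probability that the project value reaches an investment trigger as volatility grows.

import numpy as np

def prob_invest(V0=1.0, trigger=1.5, mu=0.05, sigma=0.2,
                T=5.0, steps=250, paths=20_000, seed=0):
    """Estimate P(project value hits the investment trigger within T years)
    under geometric Brownian motion (illustrative parameters only)."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    z = rng.normal(size=(paths, steps))
    logV = np.log(V0) + np.cumsum((mu - 0.5 * sigma**2) * dt
                                  + sigma * np.sqrt(dt) * z, axis=1)
    return (logV.max(axis=1) >= np.log(trigger)).mean()

for sigma in (0.1, 0.2, 0.4):
    print(f"sigma={sigma}: P(invest) ~ {prob_invest(sigma=sigma):.3f}")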

Keywords: Real options, geometric Brownian motion, mixed diffusion-jump process, mean-reverting process, jump amplitude process.

1281 Integrating Decision Tree and Spatial Cluster Analysis for Landslide Susceptibility Zonation

Authors: Chien-Min Chu, Bor-Wen Tsai, Kang-Tsung Chang

Abstract:

A landslide susceptibility map delineates the potential zones for landslide occurrence. Previous works have applied multivariate methods and neural networks to landslide susceptibility mapping. This study proposes a new approach that integrates a decision tree model and a spatial cluster statistic for assessing landslide susceptibility spatially. A total of 2057 landslide cells were digitized to develop the landslide decision tree model, in which the relationships between landslides and instability factors are explicitly represented by tree graphs. The local Getis-Ord statistic was used to cluster cells with high landslide probability, and the resulting values were classified to create a map of landslide susceptibility zones. The map was validated using new landslide data comprising 482 cells. Validation shows an accuracy rate of 86.1% in predicting new landslide occurrences, indicating that the proposed approach is useful for improving landslide susceptibility mapping.
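
A minimal sketch of the local Getis-Ord Gi* statistic on a grid of landslide probabilities (binary 3x3 neighbourhood weights, approximate treatment of borders; illustrative, not the study's code):

import numpy as np
from scipy.ndimage import uniform_filter

def getis_ord_gi_star(grid):
    """Local Getis-Ord Gi* with binary 3x3 neighbourhood weights.
    High Gi* marks clusters of cells with high landslide probability."""
    n = grid.size
    xbar = grid.mean()
    s = np.sqrt((grid**2).mean() - xbar**2)
    w_sum = 9.0                                   # 9 weights of 1 per cell (incl. itself)
    local_sum = uniform_filter(grid, size=3, mode="nearest") * 9.0
    num = local_sum - xbar * w_sum
    den = s * np.sqrt((n * w_sum - w_sum**2) / (n - 1))
    return num / den

rng = np.random.default_rng(1)
prob = rng.random((50, 50))
prob[10:15, 10:15] += 0.8                         # implant a high-probability cluster
gi = getis_ord_gi_star(np.clip(prob, 0, 1))
print("hotspot cells (Gi* > 1.96):", int((gi > 1.96).sum()))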

Keywords: Landslide susceptibility zonation, decision tree model, spatial cluster, local Getis-Ord statistics.

1280 Estimating the Effect of Fluid in Pressing Process

Authors: A. Movaghar, R. A. Mahdavinejad

Abstract:

To analyze the effect of various fluid parameters on material properties such as surface and depth defects and/or cracks, the influence of the pressure field on these characteristics can be determined. Stress tensor analysis can also identify the points at which the probability of defect creation is highest. Furthermore, from the pressure field it is possible to analyze the effect of fluid properties such as viscosity and density on defects created in the material. In this research, the relevant boundary conditions are analyzed first, and the solution grid and stencil used are described. The governing equations for the fluid flow between notch and matrix are then derived, discretized according to the boundary conditions, and solved. Finally, by varying fluid parameters such as density and viscosity, their effect on the pressure field is determined. The flowchart and solution algorithm, together with results in the form of vorticity and stream-function contours, are presented and discussed for the two conditions most commonly encountered in the pressing process.

Keywords: Pressing, notch, matrix, flow function, vortex.

1279 Effective Relay Communication for Scalable Video Transmission

Authors: Jung Ah Park, Zhijie Zhao, Doug Young Suh, Joern Ostermann

Abstract:

In this paper, we propose effective relay communication for layered video transmission as an alternative that makes the most of limited resources in a wireless communication network where loss often occurs. Relaying brings stable multimedia services to end clients, compared to multiple description coding (MDC). Retransmitting only parity data about one or more video layers, produced by a channel coder, from the relay device to the end client is paramount to robustness in loss situations. Using these methods in resource-constrained environments, such as real-time user-created content (UCC) with layered video transmission, can provide high-quality services even in a poor communication environment; minimal services also remain possible. Mathematical analysis shows that the proposed method reduces the GOP loss rate compared to MDC and to a raptor code without relay: the proposed method's GOP loss rate is about zero, while MDC and the raptor code without relay have GOP loss rates of 36% and 70%, respectively, at a 10% frame loss rate.
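
A minimal sketch of the kind of GOP-loss arithmetic behind such numbers, under a simplified assumption (not the paper's model) that a GOP of n frames survives up to k repairable frame losses, e.g. thanks to relayed parity data:

from math import comb

def gop_loss(p_frame, n_frames=30, repairable=0):
    """P(GOP lost) when a GOP of n_frames tolerates up to
    `repairable` lost frames (binomial frame-loss model)."""
    p_ok = sum(comb(n_frames, k) * p_frame**k * (1 - p_frame)**(n_frames - k)
               for k in range(repairable + 1))
    return 1 - p_ok

print(gop_loss(0.10, repairable=0))   # no protection: ~0.96
print(gop_loss(0.10, repairable=6))   # parity repairs up to 6 losses: ~0.03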

Keywords: Relay communication, Multiple Description Coding, Scalable Video Coding

1278 A New Heuristic Algorithm for the Dynamic Facility Layout Problem with Budget Constraint

Authors: Parham Azimi, Hamid Reza Charmchi

Abstract:

In this research, we have developed a new efficient heuristic algorithm for the dynamic facility layout problem with budget constraint (DFLPB). The heuristic combines two methods, discrete-event simulation and integer linear programming (IP), to obtain a near-optimum solution. In the proposed algorithm, the non-linear model of the DFLP is first transformed into a pure integer programming (PIP) model. The optimal solution of the PIP model is then used in a simulation model, designed in a similar manner to the DFLP, to determine the probability of assigning a facility to a location. After a sufficient number of runs, the simulation model yields near-optimum solutions. Finally, several test problems have been solved to verify the performance of the algorithm. The results show that the proposed algorithm is more efficient, in terms of speed and accuracy, than the heuristic algorithms presented in previous works in the literature.

Keywords: Budget constraint, Dynamic facility layout problem, Integer programming, Simulation

1277 A Bayesian Network Reliability Modeling for FlexRay Systems

Authors: Kuen-Long Leu, Yung-Yuan Chen, Chin-Long Wey, Jwu-E Chen, Chung-Hsien Hsu

Abstract:

The increasing importance of FlexRay systems in the automotive domain has inspired a steady stream of related research. One primary issue is verifying the reliability of FlexRay systems, either from the protocol aspect or from the system design aspect. However, research rarely discusses the effect of network topology on system reliability. In this paper, we illustrate how to model the reliability of FlexRay systems with various network topologies using a well-known probabilistic reasoning technology, the Bayesian network. In this illustration, we especially investigate the effectiveness of the error containment built into the star topology and of the fault-tolerant midpoint synchronization algorithm adopted in the FlexRay communication protocol. Through a FlexRay steer-by-wire case study, the influence of different topologies on the failure probability of the steer-by-wire system is demonstrated. The notable value of this research is to show that Bayesian network inference is a powerful and feasible method for the reliability assessment of FlexRay systems.
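
A minimal sketch of reliability inference by enumeration over a toy network (hypothetical failure probabilities, far simpler than the paper's FlexRay model): the system fails if the star coupler fails, or if both end nodes fail.

from itertools import product

# toy network: star coupler C, end nodes A and B (made-up numbers)
p_fail = {"C": 1e-4, "A": 1e-3, "B": 1e-3}

def system_fails(state):
    return state["C"] or (state["A"] and state["B"])

p_sys = 0.0
for bits in product([False, True], repeat=3):        # enumerate all joint states
    state = dict(zip(p_fail, bits))
    p_state = 1.0
    for comp, failed in state.items():
        p_state *= p_fail[comp] if failed else 1 - p_fail[comp]
    if system_fails(state):
        p_sys += p_state
print(f"P(system failure) = {p_sys:.3e}")            # ~1e-4 + 1e-6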

Keywords: Bayesian Network, FlexRay, fault tolerance, network topology, reliability.

1276 Improving the Exploitation of Fluid in Elastomeric Polymeric Isolator

Authors: Haithem Elderrat, Huw Davies, Emmanuel Brousseau

Abstract:

Elastomeric polymer foam has been used widely in the automotive industry, especially for isolating unwanted vibrations. Such material is able to absorb unwanted vibration due to its combination of elastic and viscous properties. However, the ‘creep effect’, poor stress distribution and susceptibility to high temperatures are the main disadvantages of such a system. In this study, improvements in the performance of elastomeric foam as a vibration isolator were investigated using the concept of Foam Filled Fluid (FFFluid). In FFFluid devices, the foam takes the form of capsule shapes and is mixed with viscous fluid, while the mixture is contained in a closed vessel. When the FFFluid isolator is subjected to vibration, energy is absorbed due to the elastic strain of the foam; as the foam is compressed, the fluid also moves, contributing further energy absorption as it shears. Depending on the design adopted, the packaging may also attenuate vibration through energy absorption via friction and/or elastic strain. The present study focuses on the advantages of the FFFluid concept over dry polymeric foam for vibration isolation, based on an experimental comparison of the two. The paper concludes by evaluating the performance of the FFFluid isolator in the suspension system of a light vehicle. One outcome of this research is that the FFFluid may be preferable to elastomer isolators in certain applications, as it reduces the effects of high temperatures and of ‘creep effects’, thereby increasing reliability and improving load distribution; the stiffness coefficient of the system increased by about 60% with an FFFluid sample. The technology represented by the FFFluid is therefore considered suitable for application in the suspension system of a light vehicle.

Keywords: Anti-vibration devices, dry foam, FFFluid.

1275 CT-based Monte Carlo Dose Calculations for Proton Therapy Using a New Interface Program

Authors: A. Esmaili Torshabi, A. Terakawa, K. Ishii, H. Yamazaki, S. Matsuyama, Y. Kikuchi, M. Nakhostin, H. Sabet, A. Ishizaki, W. Yamashita, T. Togashi, J. Arikawa, H. Akiyama, K. Koyata

Abstract:

The purpose of this study is to introduce a new interface program for calculating dose distributions with the Monte Carlo method in complex heterogeneous systems, such as organs or tissues, in proton therapy. The interface program was developed in MATLAB and includes a friendly graphical user interface with several tools, such as image property adjustment and result display. The quadtree decomposition technique was used as an image segmentation algorithm to create optimal geometries from Computed Tomography (CT) images for proton-beam dose calculations. The technique yields a set of non-overlapping squares of different sizes in each image; in this way, the segmentation resolution is high enough in and near heterogeneous areas to preserve the precision of the dose calculation, and low enough in homogeneous areas to reduce the number of cells directly. Furthermore, a cell reduction algorithm can be used to combine neighbouring cells of the same material. The method has been validated in two ways: first, against experimental data obtained with an 80 MeV proton beam at the Cyclotron and Radioisotope Center (CYRIC) at Tohoku University, and second, against data based on the polybinary tissue calibration method, also performed at CYRIC. These results are presented in this paper. The program can read the output file of the Monte Carlo code while a region of interest is selected manually, and plots the dose distribution of the proton beam superimposed onto the CT images.
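
A minimal sketch of quadtree decomposition as described (recursively split a CT-like image until each square block is homogeneous; illustrative, not the authors' MATLAB interface):

import numpy as np

def quadtree(img, x=0, y=0, size=None, tol=10.0, min_size=2, out=None):
    """Recursively split a square image region until the intensity range
    within each block is below tol (homogeneous) or min_size is reached."""
    if out is None:
        size = img.shape[0]
        out = []
    block = img[y:y + size, x:x + size]
    if size <= min_size or block.max() - block.min() <= tol:
        out.append((x, y, size))                 # homogeneous cell: one material
    else:
        half = size // 2
        for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
            quadtree(img, x + dx, y + dy, half, tol, min_size, out)
    return out

img = np.zeros((64, 64))
img[20:40, 20:40] = 100.0                         # a dense "organ" in a uniform background
cells = quadtree(img)
print(f"{len(cells)} cells instead of {64 * 64} voxels")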

Keywords: Monte Carlo, CT images, Quadtree decomposition, Interface program, Proton beam

1274 Internal Surface Measurement of Nanoparticle with Polarization-interferometric Nonlinear Confocal Microscope

Authors: Chikara Egami, Kazuhiro Kuwahara

Abstract:

Polarization-interferometric nonlinear confocal microscopy is proposed for measuring a nano-sized particle with optical anisotropy. The anisotropy in the particle was spectroscopically imaged through the three-dimensional distribution of the photoinduced third-order nonlinear dielectric polarization.

Keywords: nanoparticle, optical storage, microscope

1273 Monte Carlo Analysis and Fuzzy Sets for Uncertainty Propagation in SIS Performance Assessment

Authors: Fares Innal, Yves Dutuit, Mourad Chebila

Abstract:

The object of this work is the probabilistic performance evaluation of safety instrumented systems (SIS), i.e. the average probability of dangerous failure on demand (PFDavg) and the average frequency of dangerous failure (PFH), taking into account the uncertainties in the various parameters involved: failure rate (λ), common cause failure proportion (β), diagnostic coverage (DC), etc. This leads to an accurate and safe assessment of the safety integrity level (SIL) inherent in the safety function performed by such systems, in keeping with the requirements of the IEC 61508 standard with respect to handling uncertainty. To this end, we propose an approach that combines (1) Monte Carlo simulation and (2) fuzzy sets: the first method is appropriate where representative statistical data are available (using the pdfs of the relevant parameters), while the latter applies in cases characterized by vague and subjective information (using membership functions). The proposed approach is fully supported by a suitable computer code.
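
A minimal sketch of the Monte Carlo half of the approach, under a simplified 1oo1 architecture where PFDavg ≈ λ_DU·TI/2 (hypothetical parameter values, not the paper's case study):

import numpy as np

rng = np.random.default_rng(0)
TI = 8760.0                                  # proof-test interval, hours (1 year)
# uncertain dangerous-undetected failure rate: lognormal around 1e-7 /h
lam = rng.lognormal(mean=np.log(1e-7), sigma=0.5, size=100_000)
pfd_avg = lam * TI / 2                       # simple 1oo1 approximation
for q in (50, 90, 95):
    print(f"PFDavg {q}th percentile: {np.percentile(pfd_avg, q):.2e}")
# compare the percentiles against the IEC 61508 SIL bands
# (e.g. SIL 3: 1e-4 to 1e-3, SIL 2: 1e-3 to 1e-2)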

Keywords: Fuzzy sets, Monte Carlo simulation, Safety instrumented system, Safety integrity level.

1272 Performance Analysis of an Adaptive Threshold Hybrid Double-Dwell System with Antenna Diversity for Acquisition in DS-CDMA Systems

Authors: H. Krouma, M. Barkat, K. Kemih, M. Benslama, Y. Yacine

Abstract:

In this paper, we analyze the acquisition process for a hybrid double-dwell system with antenna diversity for DS-CDMA (direct sequence code division multiple access) using an adaptive threshold. Acquisition systems with a fixed threshold value are unable to adapt to fast-varying mobile communication environments and may result in a high false alarm rate and/or low detection probability. We therefore propose an adaptively varying threshold scheme based on the cell-averaging constant false alarm rate (CA-CFAR) algorithm, which is well known in the field of radar detection. We derive exact expressions for the probabilities of detection and false alarm in Rayleigh fading channels, as well as the mean acquisition time of the system under consideration. The performance of the system is analyzed and compared to that of a hybrid single-dwell system.
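
A minimal one-dimensional sketch of the cell-averaging CFAR rule borrowed from radar detection (illustrative parameters): each cell's threshold is a scaled average of its reference cells, so it tracks the local noise level.

import numpy as np

def ca_cfar(x, n_ref=16, n_guard=2, pfa=1e-3):
    """Cell-averaging CFAR: flag cells exceeding alpha * local noise mean.
    alpha is set from the desired false-alarm probability (exponential noise)."""
    n = n_ref // 2
    alpha = n_ref * (pfa ** (-1 / n_ref) - 1)    # classic CA-CFAR scaling
    hits = []
    for i in range(n + n_guard, len(x) - n - n_guard):
        ref = np.r_[x[i - n_guard - n:i - n_guard],
                    x[i + n_guard + 1:i + n_guard + n + 1]]
        if x[i] > alpha * ref.mean():
            hits.append(i)
    return hits

rng = np.random.default_rng(2)
noise = rng.exponential(1.0, 500)                # square-law detected noise power
noise[250] += 40.0                               # a genuine correlation peak
print("detections at:", ca_cfar(noise))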

Keywords: Adaptive threshold, hybrid double-dwell system, CA-CFAR algorithm, DS-CDMA.

1271 Mecano-Reliability Approach Applied to a Water Storage Tank Placed on Ground

Authors: Amar Aliche, Hocine Hammoum, Karima Bouzelha, Arezki Ben Abderrahmane

Abstract:

Traditionally, the dimensioning of storage tanks is conducted with a deterministic approach based on partial safety coefficients, which are applied to account for uncertainties in the properties of the materials used and in the applied loads. However, the use of these safety factors in the design process does not assure an optimal and reliable solution, and can sometimes lead to a lack of robustness of the structure. Reliability theory, based on a probabilistic formulation of structural safety, can respond in a better-adapted manner: it allows the construction of a model in which uncertain data are represented by random variables, and therefore permits a better appreciation of safety margins through confidence indicators. The work presented in this paper is a mecano-reliability analysis of a concrete storage tank placed on the ground. The classical Monte Carlo simulation method is used to evaluate the failure probability of the concrete tank, with the seismic acceleration treated as a random variable.

Keywords: Reliability approach, storage tanks, Monte Carlo simulation, seismic acceleration.

1270 Ordinal Regression with Fenton-Wilkinson Order Statistics: A Case Study of an Orienteering Race

Authors: Joonas Pääkkönen

Abstract:

In sports, individuals and teams are typically interested in final rankings. Final results, such as times or distances, dictate these rankings, also known as places. Places can be further associated with ordered random variables, commonly referred to as order statistics. In this work, we introduce a simple yet accurate order-statistical ordinal regression function that predicts relay race places from changeover times. We call this function the Fenton-Wilkinson Order Statistics model. The model is built on the following educated assumption: individual leg times follow log-normal distributions. Moreover, our key idea is to utilize Fenton-Wilkinson approximations of changeover times alongside an estimator for the total number of teams, as in the well-known German tank problem. This original place regression function is sigmoidal and thus correctly predicts the existence of a small number of elite teams that significantly outperform the rest. Our model also describes how place increases linearly with changeover time at the inflection point of the log-normal distribution function. With real-world data from Jukola 2019, a massive orienteering relay race, the model is shown to be highly accurate even when the size of the training set is only 5% of the whole data set. Numerical results also show that our model exhibits smaller place-prediction root-mean-square errors than linear regression, mord regression and Gaussian process regression.
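
A minimal sketch of the Fenton-Wilkinson step the model relies on (made-up leg-time parameters): a sum of independent log-normal leg times is approximated by a single log-normal whose first two moments match.

import numpy as np

def fenton_wilkinson(mus, sigmas):
    """Moment-matched log-normal parameters for a sum of independent
    log-normal variables exp(N(mu_i, sigma_i^2))."""
    means = np.exp(mus + sigmas**2 / 2)
    varis = (np.exp(sigmas**2) - 1) * np.exp(2 * mus + sigmas**2)
    m, v = means.sum(), varis.sum()
    sigma2 = np.log(1 + v / m**2)
    mu = np.log(m) - sigma2 / 2
    return mu, np.sqrt(sigma2)

# changeover time after 3 legs with log-normal leg times (made-up values)
mus, sigmas = np.array([4.0, 4.2, 3.9]), np.array([0.15, 0.2, 0.1])
mu, sigma = fenton_wilkinson(mus, sigmas)
samples = np.exp(np.random.default_rng(0).normal(mus, sigmas, (100_000, 3))).sum(axis=1)
print(f"FW mean {np.exp(mu + sigma**2 / 2):.1f} vs simulated {samples.mean():.1f}")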

Keywords: Fenton-Wilkinson approximation, German tank problem, log-normal distribution, order statistics, ordinal regression, orienteering, sports analytics, sports modeling.

1269 Computer Proven Correctness of the Rabin Public-Key Scheme

Authors: Johannes Buchmann, Markus Kaiser

Abstract:

We describe a formal specification and verification of the Rabin public-key scheme in the formal proof system Isabelle/HOL. The idea is to use the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. The analysis presented uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness as well as security properties is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. This yields the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Consequently, we obtain reliable proofs with a minimal error rate, augmenting the used database. This provides a formal basis for further computer proof constructions in this area.
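
A minimal sketch of the Rabin scheme itself (toy primes, no padding; the paper's contribution is the machine-checked proof, not this code): encryption squares the message modulo n = pq, and decryption recovers the four square roots via the Chinese remainder theorem, using p ≡ q ≡ 3 (mod 4).

p, q = 7559, 7547            # toy primes with p % 4 == q % 4 == 3 (insecure sizes)
n = p * q

def encrypt(m):
    return m * m % n

def decrypt(c):
    """Return the four square roots of c mod n via the CRT."""
    mp = pow(c, (p + 1) // 4, p)          # square root mod p (valid since p % 4 == 3)
    mq = pow(c, (q + 1) // 4, q)          # square root mod q
    a = q * pow(q, -1, p)                 # CRT basis: a = 1 mod p, 0 mod q
    b = p * pow(p, -1, q)                 # CRT basis: b = 0 mod p, 1 mod q
    r = (a * mp + b * mq) % n
    s = (a * mp - b * mq) % n
    return {r, n - r, s, n - s}

m = 123456
assert m in decrypt(encrypt(m))           # the plaintext is among the four roots
print(sorted(decrypt(encrypt(m))))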

Keywords: Public-key encryption, Rabin public-key scheme, formal proof system, higher-order logic, formal verification.

1268 An Adaptive Opportunistic Transmission for Unlicensed Spectrum Sharing in Heterogeneous Networks

Authors: Daehyoung Kim, Pervez Khan, Hoon Kim

Abstract:

Efficient utilization of spectrum resources is a fundamental issue in wireless communications due to spectrum scarcity. To improve the efficiency of spectrum utilization, spectrum sharing in unlicensed bands is regarded as one of the key technologies for next-generation wireless networks. A number of schemes, such as Listen-Before-Talk (LBT) and Carrier Sense Adaptive Transmission (CSAT), have been suggested from this perspective, but more efficient sharing schemes are required to improve spectrum utilization further. This work considers an opportunistic transmission approach and a dynamic contention window (CW) adjustment scheme for LTE-U users sharing the unlicensed spectrum with Wi-Fi, in order to enhance the overall system throughput. The decision criteria for the dynamic adjustment of the CW are based on a collision evaluation derived from the collision probability of the system. Overall performance improves thanks to the adaptive adjustment of the CW. Simulation results show that the proposed scheme outperforms the Distributed Coordination Function (DCF) mechanism of the IEEE 802.11 MAC.

Keywords: Spectrum sharing, adaptive opportunistic transmission, unlicensed bands, heterogeneous networks.

1267 A New Hybrid RMN Image Segmentation Algorithm

Authors: Abdelouahab Moussaoui, Nabila Ferahta, Victor Chen

Abstract:

Developing systems to aid medical diagnosis is not easy, because of inhomogeneities in MRI, variability of the data from one sequence to another, and other source distortions that accentuate the difficulty. A new automatic, contextual, adaptive and robust segmentation procedure based on MRI brain tissue classification is described in this article. A first phase estimates the probability density of the data by the Parzen-Rosenblatt method. The classification procedure is completely automatic and makes no assumptions about either the number of clusters or their prototypes, since the latter are detected automatically by a mathematical morphology operator called skeleton by influence zones (SKIZ). The problem of initializing the prototypes, as well as determining their number, is thus transformed into an optimization problem. Moreover, the procedure is adaptive, since it takes into consideration the contextual information present at every voxel through an adaptive and robust nonparametric Markov field (MF) model. The number of misclassifications is reduced by using the Maximum Posterior Marginal (MPM) criterion.
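
A minimal one-dimensional sketch of the Parzen-Rosenblatt density estimate used in the first phase (Gaussian kernel on made-up intensity data, whereas the paper works on MRI voxels):

import numpy as np

def parzen_density(samples, grid, h):
    """Parzen-Rosenblatt estimate: average of Gaussian kernels of width h
    centred on each sample, evaluated on `grid`."""
    diffs = (grid[:, None] - samples[None, :]) / h
    kern = np.exp(-0.5 * diffs**2) / np.sqrt(2 * np.pi)
    return kern.mean(axis=1) / h

rng = np.random.default_rng(3)
intensities = np.r_[rng.normal(60, 5, 400), rng.normal(110, 8, 600)]  # two tissue classes
grid = np.linspace(0, 160, 161)
density = parzen_density(intensities, grid, h=4.0)
peaks = grid[1:-1][(density[1:-1] > density[:-2]) & (density[1:-1] > density[2:])]
print("modes near:", peaks)                       # close to the two class means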

Keywords: Clustering, automatic classification, SKIZ, Markov fields, image segmentation, Maximum Posterior Marginal (MPM).

1266 An Improved Greedy Routing Algorithm for Grid using Pheromone-Based Landmarks

Authors: Lada-On Lertsuwanakul, Herwig Unger

Abstract:

This paper aims to extend Jon Kleinberg's research. He introduced a small-world structure on a grid and showed that a greedy algorithm using only local information can find a route between source and target in delivery time O(log^2 n). His fundamental model for distributed systems uses a two-dimensional grid with long-range random links added between any two nodes u and v with probability proportional to d(u,v)^-2. We propose that, with additional information about nearby long-range links, a shorter path can be found. We apply an ant colony system as a messenger, distributing pheromone (the long-link details) in the surrounding area. The subsequent forwarding decision then has more options: move to a local neighbour, or send to a node whose long link is closer to the target. Our experimental results support the approach: the average routing time with Color Pheromone is faster than with the greedy method.
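
A minimal sketch of the Kleinberg model that the paper extends (a grid with one long-range link per node drawn with probability proportional to d(u,v)^-2, then purely greedy forwarding; the pheromone layer itself is not reproduced here):

import random

N = 40                                           # N x N grid
random.seed(0)

def d(u, v):                                     # Manhattan (lattice) distance
    return abs(u[0] - v[0]) + abs(u[1] - v[1])

nodes = [(i, j) for i in range(N) for j in range(N)]
long_link = {}
for u in nodes:                                  # one long-range contact per node, P ~ d^-2
    others = [v for v in nodes if v != u]
    weights = [d(u, v) ** -2 for v in others]
    long_link[u] = random.choices(others, weights)[0]

def greedy_route(s, t):
    hops = 0
    while s != t:
        nbrs = [(s[0] + dx, s[1] + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= s[0] + dx < N and 0 <= s[1] + dy < N]
        s = min(nbrs + [long_link[s]], key=lambda v: d(v, t))  # local info only
        hops += 1
    return hops

print("hops:", greedy_route((0, 0), (N - 1, N - 1)))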

Keywords: Routing algorithm, Small-World network, Ant Colony Optimization, and Peer-to-peer System.

1265 Analysis of MAC Protocols with Correlation Receiver for OCDMA Networks - Part II

Authors: Shivaleela E. S., Shrikant S. Tangade

Abstract:

In this paper, an optical code-division multiple-access (O-CDMA) packet network, which offers inherent security in access networks, is considered. Two types of random access protocols are proposed for packet transmission. In protocol 1, all distinct codes are used; in protocol 2, distinct codes as well as shifted versions of all these codes are used. O-CDMA network performance is analyzed for one-dimensional (1-D) optical orthogonal codes (OOCs) and two-dimensional (2-D) wavelength/time single-pulse-per-row (W/T SPR) codes. The main advantage of 2-D codes over 1-D codes is the reduction of errors due to multiple-access interference among different users. A correlation receiver is considered in the analysis. Using an analytical model, we compute and compare the packet-success probability for 1-D and 2-D codes in an O-CDMA network; the analysis shows improved performance with 2-D codes compared to 1-D codes.

Keywords: Optical code-division multiple-access, optical CDMA correlation receiver, wavelength/time optical CDMA codes.

1264 Application of Single Tuned Passive Filters in Distribution Networks at the Point of Common Coupling

Authors: M. Almutairi, S. Hadjiloucas

Abstract:

The harmonic distortion of voltage is important in relation to power quality, due to the interaction between the large diffusion of non-linear and time-varying single-phase and three-phase loads and power supply systems. Harmonic distortion levels can, however, be reduced by improving the design of polluting loads or by applying mitigating arrangements and filters. The application of passive filters is an effective solution for harmonic mitigation, mainly because such filters offer high efficiency and simplicity and are economical; moreover, their different possible frequency-response characteristics can achieve specific harmonic-filtering targets. With these ideas in mind, the objective of this paper is to determine the size of single-tuned passive filter that works best in a distribution network, in order to economically limit violations at a given point of common coupling (PCC). The paper suggests that a single-tuned passive filter can be employed in typical industrial power systems, and that constrained optimization can be used to find its optimal sizing, reducing both harmonic voltages and harmonic currents in the power system to an acceptable level and thus improving the load power factor. The optimization technique minimizes the voltage total harmonic distortion (VTHD) and current total harmonic distortion (ITHD) while maintaining a given power factor within a specified range. According to the IEEE Standard 519, both indices are viewed as constraints in the optimal passive filter design problem. The performance of this technique is discussed using numerical examples taken from previous publications.
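
A minimal worked example of sizing a single-tuned passive filter from standard textbook relations (hypothetical network data; the paper's constrained optimization is not reproduced): choose C from the fundamental-frequency reactive-power requirement, then L and R from the tuned harmonic and quality factor.

import math

V = 13.8e3        # line-to-line voltage at the PCC, V (hypothetical)
Qc = 5e6          # reactive power the filter supplies at 60 Hz, var
h = 5             # tuned harmonic (5th)
Qf = 30           # quality factor

w1 = 2 * math.pi * 60
Xc = V**2 / Qc                      # capacitive reactance at fundamental
C = 1 / (w1 * Xc)
L = 1 / ((h * w1) ** 2 * C)         # series resonance at the 5th harmonic
Xn = math.sqrt(L / C)               # characteristic reactance at tuning
R = Xn / Qf
print(f"C = {C*1e6:.1f} uF, L = {L*1e3:.2f} mH, R = {R:.3f} ohm")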

Keywords: Harmonics, passive filter, power factor, power quality.

1263 Forecast of the Small Wind Turbines Sales with Replacement Purchases and with or without Account of Price Changes

Authors: V. Churkin, M. Lopatin

Abstract:

The purpose of the paper is to estimate the US small wind turbines market potential and to forecast small wind turbine sales in the US. The forecasting method is based on the Bass model and the generalized Bass model of innovation diffusion under replacement purchases. An exponential distribution is used to model replacement purchases; its single parameter is determined by the average lifetime of small wind turbines. The identification of the model parameters is based on nonlinear regression analysis of the annual sales statistics published by the American Wind Energy Association (AWEA) from 2001 to 2012. The estimated US average market potential of small wind turbines (for adoption purchases), without account of price changes, is 57080 (confidence interval from 49294 to 64866 at P = 0.95) for an average turbine lifetime of 15 years, and 62402 (confidence interval from 54154 to 70648 at P = 0.95) for an average lifetime of 20 years; the explained variance is 90.7% in the first case and 91.8% in the second. The effect of wind turbine price changes on sales was estimated using the generalized Bass model, which required a price forecast; for this, a polynomial regression function based on the Berkeley Lab statistics was used. In that case, the estimated US average market potential (for adoption purchases) is 42542 (confidence interval from 32863 to 52221 at P = 0.95) for an average lifetime of 15 years, and 47426 (confidence interval from 36092 to 58760 at P = 0.95) for an average lifetime of 20 years; the explained variance is 95.3% in both cases.
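
A minimal sketch of the plain Bass model that underlies the forecast (illustrative parameter values, not the AWEA fit):

import numpy as np

def bass_cumulative(t, m, p, q):
    """Bass diffusion: cumulative adopters at time t.
    m = market potential, p = innovation, q = imitation coefficient."""
    e = np.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

m, p, q = 57080, 0.003, 0.25            # illustrative parameter values
years = np.arange(0, 13)                # a 2001..2012-like horizon
sales = np.diff(bass_cumulative(years, m, p, q))
print(np.round(sales).astype(int))      # annual sales implied by the model
# scipy.optimize.curve_fit(bass_cumulative, years, observed_cumulative)
# would perform the nonlinear regression step on real sales statistics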

Keywords: Bass model, generalized Bass model, replacement purchases, sales forecasting of innovations, statistics of sales of small wind turbines in the United States.

1262 Efficient Real-time Remote Data Propagation Mechanism for a Component-Based Approach to Distributed Manufacturing

Authors: V. Barot, S. McLeod, R. Harrison, A. A. West

Abstract:

Manufacturing industries face a crucial change as products and processes are required to be easily and efficiently reconfigurable and reusable. To stay competitive and flexible, situations also demand the global distribution of enterprises, which requires efficient communication strategies. A prototype system called the "Broadcaster" has been developed under the assumption that the control environment description has been engineered using the component-based system paradigm. This prototype distributes information to a number of globally distributed partners via a circular-buffer-based data processing mechanism. The work highlighted in this paper covers the implementation of this mechanism in the manufacturing domain. The proposed solution enables real-time remote propagation of machine information to a number of distributed supply-chain client resources, such as an HMI, VRML-based 3D views, and remote client instances, regardless of their distribution nature and/or mechanisms. This approach is presented together with a set of evaluation results. The authors' main concern is the reliability and performance of the adopted approach. Performance is evaluated in terms of the response times taken to process the data in this domain, compared with an alternative implementation such as a linear queue mechanism. Based on the evaluation results obtained, the authors justify the benefits achieved by the proposed implementation and highlight further research work to be carried out.
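
A minimal sketch of a circular (ring) buffer of the general kind the Broadcaster adopts (generic illustration, not the prototype's code): the producer never blocks on slow readers because the oldest entry is overwritten.

class RingBuffer:
    """Fixed-size circular buffer: push overwrites the oldest item
    once full, so a producer is never blocked by slow consumers."""
    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.head = 0                    # next write position
        self.count = 0

    def push(self, item):
        self.buf[self.head] = item
        self.head = (self.head + 1) % len(self.buf)
        self.count = min(self.count + 1, len(self.buf))

    def snapshot(self):
        """Items from oldest to newest, e.g. for an HMI or 3D-view client."""
        start = (self.head - self.count) % len(self.buf)
        return [self.buf[(start + i) % len(self.buf)] for i in range(self.count)]

rb = RingBuffer(4)
for reading in range(7):                 # 7 machine readings into 4 slots
    rb.push(reading)
print(rb.snapshot())                     # [3, 4, 5, 6]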

Keywords: Broadcaster, circular buffer, Component-based, distributed manufacturing, remote data propagation.

1261 Probabilistic Electrical Power Generation Modeling Using Decimal to Binary Conversion

Authors: Ahmed S. Al-Abdulwahab

Abstract:

Generation system reliability assessment is an important task that can be performed using deterministic or probabilistic techniques. The probabilistic approaches have significant advantages over the deterministic methods, but require more complicated modeling. A power generation model is a basic requirement for this assessment, and one common form is the well-known capacity outage probability table (COPT). Different analytical techniques have been used to construct the COPT; these approaches require considerable mathematical modeling of the generating units, whose individual models are then combined to build the COPT, adding further burden to the process. The decimal-to-binary conversion (DBC) technique is widely applied in electronic systems and computing. This paper proposes a novel use of DBC to create the COPT without analytical modeling or time-consuming simulations: the simple binary digits "0" and "1" are used to model the states of the generating units. The proposed technique is shown to be an effective approach to building the generation model.
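
A minimal sketch of the DBC idea (hypothetical unit data): each integer from 0 to 2^N - 1, written in binary, is one joint state of N units, with bit '1' meaning the unit is on outage; its probability and lost capacity fill the COPT directly.

from collections import defaultdict

units = [(50, 0.02), (50, 0.02), (100, 0.04)]   # (capacity MW, forced outage rate)

copt = defaultdict(float)                        # outage MW -> probability
for state in range(2 ** len(units)):
    prob, out = 1.0, 0
    for i, (cap, forced) in enumerate(units):
        if (state >> i) & 1:                     # bit i of the binary representation
            prob *= forced                       # unit i on outage
            out += cap
        else:
            prob *= 1 - forced
    copt[out] += prob

for out in sorted(copt):
    print(f"{out:>4} MW out  P = {copt[out]:.6f}")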

Keywords: Decimal to Binary, generation, reliability.

1260 Normalizing Flow to Augmented Posterior: Conditional Density Estimation with Interpretable Dimension Reduction for High Dimensional Data

Authors: Cheng Zeng, George Michailidis, Hitoshi Iyatomi, Leo L Duan

Abstract:

The conditional density characterizes the distribution of a response variable y given a predictor x, and plays a key role in many statistical tasks, including classification and outlier detection. Although there has been abundant work on Conditional Density Estimation (CDE) for a low-dimensional response in the presence of a high-dimensional predictor, little work has been done for a high-dimensional response such as images. The promising performance of normalizing flow (NF) neural networks in unconditional density estimation is a motivating starting point. In this work, we extend NF neural networks to the case where an external x is present. Specifically, we use the NF to parameterize a one-to-one transform between a high-dimensional y and a latent z that comprises two components [zP, zN]. The zP component is a low-dimensional subvector obtained from the posterior distribution of an elementary predictive model for x, such as logistic/linear regression, while the zN component is a high-dimensional independent Gaussian vector, which explains the variations in y not (or less) related to x. Unlike existing CDE methods, the proposed approach, coined Augmented Posterior CDE (AP-CDE), requires only a simple modification of the common normalizing flow framework, while significantly improving the interpretability of the latent component, since zP represents a supervised dimension reduction. In image analytics applications, AP-CDE shows good separation of x-related variations, due to factors such as lighting condition and subject id, from other random variations. Further, experiments show that an unconditional NF neural network based on an unsupervised model of z, such as a Gaussian mixture, fails to generate interpretable results.

Keywords: Conditional density estimation, image generation, normalizing flow, supervised dimension reduction.

1259 Data Rate Based Grouping Scheme for Cooperative Communications in Wireless LANs

Authors: Sunmyeng Kim

Abstract:

IEEE 802.11a/b/g standards provide multiple transmission rates, which can be changed dynamically according to the channel condition. Cooperative communications were introduced to improve the overall performance of wireless LANs with the help of relay nodes having higher transmission rates. Cooperative communications exploit the fact that sending data packets to a destination node through a high-rate relay node is much faster than sending them directly to the destination at a low transmission rate. Several MAC protocols have been proposed to apply cooperative communications in wireless LANs, but some of them can result in collisions among relay nodes in a dense network. To solve this problem, we propose a new protocol in which relay nodes are grouped by their transmission rates, and only the relay nodes in the highest-rate group contend for channel access. Performance evaluation by simulation shows that the proposed protocol significantly outperforms the previous protocol in terms of throughput and collision probability.

Keywords: Cooperative communications, MAC protocol, relay node, WLAN.
