Search results for: Blocking probability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 591

381 Biorecognizable Nanoparticles Based On Hyaluronic Acid/Poly(ε-Caprolactone) Block Copolymer

Authors: Jong Ho Hwang, Dae Hwan Kang, Young-IL Jeong

Abstract:

Since hyaluronic acid (HA) receptors such as CD44 are over-expressed on cancer cells, HA can be used as a targeting vehicle for anti-cancer drugs. The aim of this study was to synthesize a block copolymer composed of hyaluronic acid and poly(ε-caprolactone) (HAPCL) and to fabricate polymeric micelles for anticancer drug targeting against the CD44 receptor of tumor cells. The chemical composition of HAPCL was confirmed using 1H NMR spectroscopy. Doxorubicin (DOX) was incorporated into the HAPCL polymeric micelles. The diameters of the HAPCL polymeric micelles were around 80 nm, with spherical shapes. Targeting potential was investigated using CD44-overexpressing KB cells. When DOX-incorporated polymeric micelles were added to KB cells, the cells showed strong red fluorescence, whereas blocking the CD44 receptor by pretreatment with free HA reduced the intensity, indicating that HAPCL polymeric micelles have targetability toward the CD44 receptor.

Keywords: Hyaluronic acid, CD44 receptor, biorecognizable nanoparticles, block copolymer.

380 Evaluating Probable Bending of Frames for Near-Field and Far-Field Records

Authors: Majid Saaly, Shahriar Tavousi Tafreshi, Mehdi Nazari Afshar

Abstract:

Most reinforced concrete structures designed only for heavy loads have large transverse reinforcement spacing values and therefore suffer severe failure after intense ground motions. The main goal of this paper is to compare the shear and axial failure of existing concrete bending frames in Tehran using Incremental Dynamic Analysis (IDA) under near- and far-field records. For this purpose, IDA of 5-, 10-, and 15-story concrete structures was carried out under seven far-fault records and five near-fault records. The results show that in two-dimensional models of low-rise, mid-rise and high-rise reinforced concrete frames located on Type-3 soil, increasing the spacing of the transverse reinforcement can increase the maximum inter-story drift ratio by up to 37%. According to the results for the 5-, 10-, and 15-story reinforced concrete models located on Type-3 soil, records with characteristics such as fling step and directivity produce larger maximum inter-story drift values than far-fault earthquakes. The results also indicate that for seismic excitations containing directivity or fling-step effects, the failure probabilities and their rates of increase are much smaller than the corresponding values for far-fault earthquakes. However, under near-fault records the probability of exceedance is reached at lower seismic intensities than under far-fault records.
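
As a purely illustrative aside (not taken from the paper), the probability of exceedance mentioned above is often summarized by fitting a lognormal fragility curve to the intensity measures at which each IDA record first exceeds the limit state; the intensity values below are hypothetical.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical IDA results: spectral acceleration (g) at which each ground-motion
# record first exceeds the drift limit (one value per record).
im_exceed = np.array([0.42, 0.55, 0.61, 0.48, 0.73, 0.39, 0.66, 0.51, 0.58, 0.45, 0.69, 0.62])

# Method-of-moments lognormal fit: median theta and dispersion beta.
ln_im = np.log(im_exceed)
theta = np.exp(ln_im.mean())          # median capacity
beta = ln_im.std(ddof=1)              # logarithmic standard deviation

def p_exceed(im):
    """Fragility: probability of exceeding the limit state at intensity im."""
    return norm.cdf(np.log(im / theta) / beta)

for im in (0.3, 0.5, 0.7):
    print(f"P(exceedance | Sa = {im:.1f} g) = {p_exceed(im):.2f}")
```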

Keywords: Directivity, fling-step, fragility curve, IDA, inter-story drift ratio.

379 Considerations for Effectively Using Probability of Failure as a Means of Slope Design Appraisal for Homogeneous and Heterogeneous Rock Masses

Authors: Neil Bar, Andrew Heweston

Abstract:

Probability of failure (PF) often appears alongside factor of safety (FS) in design acceptance criteria for rock slope, underground excavation and open pit mine designs. However, the design acceptance criteria generally provide no guidance on how PF should be calculated for homogeneous and heterogeneous rock masses, or on what qualifies as a ‘reasonable’ PF assessment for a given slope design. Observational and kinematic methods were widely used in the 1990s until advances in computing permitted the routine use of numerical modelling. In the 2000s and early 2010s, PF in numerical models was generally calculated using the point estimate method. More recently, some limit equilibrium analysis software offers statistical parameter inputs along with Monte-Carlo or Latin-Hypercube sampling methods to automatically calculate PF. Factors including rock type and density, weathering and alteration, intact rock strength, rock mass quality and shear strength, the location and orientation of geologic structure, shear strength of geologic structure, and groundwater pore pressure influence the stability of rock slopes. Significant engineering and geological judgement, interpretation and data interpolation are usually applied in determining these factors and amalgamating them into a geotechnical model which can then be analysed. Most factors are estimated ‘approximately’ or with allowances for some variability rather than ‘exactly’. When it comes to numerical modelling, some of these factors are then treated deterministically (i.e. as exact values), while others have probabilistic inputs based on the user’s discretion and understanding of the problem being analysed. This paper discusses the importance of understanding the key aspects of slope design for homogeneous and heterogeneous rock masses and how they can be translated into reasonable PF assessments where the data permit. A case study from a large open pit gold mine in a complex geological setting in Western Australia is presented to illustrate how PF can be calculated using different methods that yield markedly different results. Ultimately, sound engineering judgement and logic are often required to decipher the true meaning and significance (if any) of some PF results.
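
To make the contrast between deterministic and probabilistic inputs concrete, the sketch below computes PF by Monte Carlo sampling for a simple infinite-slope factor-of-safety model; the model and all parameter values are illustrative assumptions, not data from the case study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                      # number of Monte Carlo realisations

# Illustrative infinite-slope model (not the mine geometry from the paper).
beta = np.radians(40.0)          # slope angle, treated deterministically
z = 10.0                         # depth of the sliding surface (m), deterministic
gamma = rng.normal(26.0, 1.0, n)             # unit weight (kN/m3), probabilistic
c = rng.lognormal(np.log(30.0), 0.3, n)      # cohesion (kPa), probabilistic
phi = np.radians(rng.normal(35.0, 3.0, n))   # friction angle (deg), probabilistic

driving = gamma * z * np.sin(beta) * np.cos(beta)
resisting = c + gamma * z * np.cos(beta) ** 2 * np.tan(phi)
fs = resisting / driving

pf = np.mean(fs < 1.0)           # probability of failure
print(f"mean FS = {fs.mean():.2f}, PF = {pf:.3%}")
```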

Keywords: Probability of failure, point estimate method, Monte-Carlo simulations, sensitivity analysis, slope stability.

378 On Mobile Checkpointing using Index and Time Together

Authors: Awadhesh Kumar Singh

Abstract:

Checkpointing is one of the commonly used techniques for providing fault tolerance in distributed systems, so that the system can operate even if one or more components have failed. However, mobile computing systems are constrained by low bandwidth, mobility, lack of stable storage, frequent disconnections and limited battery life. Hence, checkpointing protocols with fewer synchronization messages and fewer checkpoints are preferred in mobile environments. There are two different, although not orthogonal, approaches to checkpointing mobile computing systems, namely time-based and index-based. Our protocol is a fusion of these two approaches, though not the first of its kind. In the present exposition, an index-based checkpointing protocol has been developed which uses time to indirectly coordinate the creation of consistent global checkpoints for mobile computing systems. The proposed algorithm is non-blocking, adaptive, and does not use any control messages. Compared to other contemporary checkpointing algorithms, it is computationally more efficient because it takes fewer checkpoints and does not need to compute dependency relationships. A brief account of important and relevant work in both fields, time-based and index-based, is also included.
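
A minimal, purely illustrative skeleton of the general idea (local timers plus a piggybacked checkpoint index, no control messages) is sketched below; the class and field names are hypothetical and do not reproduce the paper's protocol.

```python
import time

class MobileHost:
    """Illustrative host combining a local timer with a piggybacked checkpoint index."""

    def __init__(self, checkpoint_interval_s: float):
        self.interval = checkpoint_interval_s
        self.index = 0                     # index of the latest local checkpoint
        self.next_deadline = time.monotonic() + checkpoint_interval_s

    def _take_checkpoint(self, new_index: int) -> None:
        self.index = new_index
        self.next_deadline = time.monotonic() + self.interval
        # ... save application state to stable storage here ...

    def on_timer(self) -> None:
        """Basic checkpoint when the local timer expires (no coordination messages)."""
        if time.monotonic() >= self.next_deadline:
            self._take_checkpoint(self.index + 1)

    def on_receive(self, payload, sender_index: int) -> None:
        """Forced checkpoint if the sender is ahead, keeping global states consistent."""
        if sender_index > self.index:
            self._take_checkpoint(sender_index)
        # ... deliver payload to the application ...

    def on_send(self, payload):
        """Every application message piggybacks the current checkpoint index."""
        return payload, self.index
```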

Keywords: Checkpointing, forced checkpoint, mobile computing, recovery, time-coordinated.

377 The Effect of Ultrasound on Permeation Flux and Changes in Blocking Mechanisms during Dead-End Microfiltration of Carrot Juice

Authors: A. Hemmati, H. Mirsaeedghazi, M. Aboonajmi

Abstract:

Carrot juice is one of the most nutritious foods consumed around the world. Large particles in carrot juice, which cause its turbid appearance, create problems in the concentration process, such as off-flavors due to large particles burning on the walls of evaporators. Microfiltration (MF) is a pressure-driven membrane separation method that can clarify fruit juices without enzymatic treatment. Fouling is the main problem in membrane processes, causing a reduction in permeate flux. Ultrasound at 20 kHz was applied as a cleaning technique to reduce fouling during membrane clarification of carrot juice in a dead-end MF system with a polyvinylidene fluoride (PVDF) membrane. Results showed that the application of ultrasound waves reduced the diphasic character of carrot juice and increased the permeate flux. Evaluation of the different membrane fouling mechanisms showed that the application of ultrasound waves changed the onset time of each fouling mechanism, and that this behavior also changed with varying transmembrane pressure.
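
One common way to identify which fouling mechanism dominates (assumed here for illustration; the abstract does not name the exact model used) is to fit the linearized Hermia blocking laws for constant-pressure dead-end filtration to the flux-decline data and compare goodness of fit, as in the hypothetical sketch below.

```python
import numpy as np

# Hypothetical flux-decline data: time (s) and permeate flux J (L m-2 h-1).
t = np.array([0, 300, 600, 900, 1200, 1500, 1800], dtype=float)
J = np.array([120, 95, 80, 70, 63, 58, 54], dtype=float)

# Linearized Hermia forms (constant pressure, dead-end): y = intercept + k * t
models = {
    "complete blocking":     np.log(J),       # ln J    vs t
    "standard blocking":     J ** -0.5,       # J^-1/2  vs t
    "intermediate blocking": 1.0 / J,         # 1/J     vs t
    "cake filtration":       J ** -2.0,       # 1/J^2   vs t
}

for name, y in models.items():
    slope, intercept = np.polyfit(t, y, 1)
    y_fit = intercept + slope * t
    ss_res = np.sum((y - y_fit) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    print(f"{name:22s} R^2 = {r2:.4f}")       # best-fitting form suggests the mechanism
```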

Keywords: Carrot juice, dead end, microfiltration, ultrasound.

376 A Novel Neighborhood Defined Feature Selection on Phase Congruency Images for Recognition of Faces with Extreme Variations

Authors: Satyanadh Gundimada, Vijayan K Asari

Abstract:

A novel feature selection strategy to improve recognition accuracy on faces affected by non-uniform illumination, partial occlusions and varying expressions is proposed in this paper. This technique is applicable especially in scenarios where the possibility of obtaining a reliable intra-class probability distribution is minimal due to a small number of training samples. Phase congruency features in an image are defined as the points where the Fourier components of that image are maximally in phase. These features are invariant to the brightness and contrast of the image under consideration. This property makes lighting-invariant face recognition achievable. Phase congruency maps of the training samples are generated and a novel modular feature selection strategy is implemented. Smaller sub-regions from a predefined neighborhood within the phase congruency images of the training samples are merged to obtain a large set of features. These features are arranged in order of increasing distance between the sub-regions involved in merging. The assumption behind the proposed region merging and arrangement strategy is that local dependencies among the pixels are more important than global dependencies. The obtained feature sets are then arranged in decreasing order of discriminating capability using a criterion function, namely the ratio of the between-class variance to the within-class variance of the sample set, in the PCA domain. The results indicate a substantial improvement in classification performance compared to baseline algorithms.
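
The criterion function described above (the ratio of between-class to within-class variance in the PCA domain) can be sketched as follows; the feature matrix, labels and function name are placeholders, not the authors' code.

```python
import numpy as np

def variance_ratio_scores(features: np.ndarray, labels: np.ndarray, n_components: int = 20):
    """Rank feature dimensions by between-class / within-class variance in the PCA domain.

    features: (n_samples, n_features) array of merged sub-region features (placeholder).
    labels:   (n_samples,) class labels.
    """
    # Project into the PCA domain via SVD of the mean-centred data.
    centred = features - features.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    proj = centred @ vt[:n_components].T          # (n_samples, n_components)

    overall_mean = proj.mean(axis=0)
    between = np.zeros(proj.shape[1])
    within = np.zeros(proj.shape[1])
    for c in np.unique(labels):
        cls = proj[labels == c]
        between += len(cls) * (cls.mean(axis=0) - overall_mean) ** 2
        within += ((cls - cls.mean(axis=0)) ** 2).sum(axis=0)
    scores = between / np.maximum(within, 1e-12)  # discriminating capability per dimension
    return np.argsort(scores)[::-1], scores       # indices in decreasing order of capability

# Example with random placeholder data: 40 samples, 4 classes, 256 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 256))
y = np.repeat(np.arange(4), 10)
order, scores = variance_ratio_scores(X, y)
print("top-5 dimensions:", order[:5])
```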

Keywords: Discriminant analysis, intra-class probability distribution, principal component analysis, phase congruency.

375 Signing the First Packet in Amortization Scheme for Multicast Stream Authentication

Authors: Mohammed Shatnawi, Qusai Abuein, Susumu Shibusawa

Abstract:

Signature amortization schemes have been introduced for authenticating multicast streams, in which a single signature is amortized over several packets. The hash value of each packet is computed, and some hash values are appended to other packets, forming what is known as a hash chain. These schemes divide the stream into blocks, each block being a number of packets; the signature packet in these schemes is either the first or the last packet of the block. Amortization schemes are efficient solutions in terms of computation and communication overhead, especially in real-time environments. The main factor determining the effectiveness of an amortization scheme is its hash chain construction. Some studies show that signing the first packet of each block reduces the receiver's delay and prevents DoS attacks, while other studies show that signing the last packet reduces the sender's delay. To our knowledge, there are no studies showing which is better in terms of authentication probability and resistance to packet loss. In this paper we introduce another scheme for authenticating multicast streams that is robust against packet loss, reduces the overhead, and at the same time prevents the DoS attacks experienced by the receiver. Our scheme, Multiple Connected Chain signing the First packet (MCF), appends the hash values of specific packets to other packets, then appends some hashes to the signature packet, which is sent as the first packet in the block. This scheme is especially efficient in terms of receiver's delay. We discuss and evaluate the performance of our proposed scheme against schemes that sign the last packet of the block.
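
A simplified single-chain version of the sign-first idea (not the exact MCF construction, which appends multiple hashes for loss resistance) is sketched below using SHA-256 and a placeholder signing function.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def sign(data: bytes) -> bytes:
    # Placeholder: in practice use a real signature scheme (e.g. RSA or ECDSA).
    return sha256(b"private-key" + data)

def build_block(payloads):
    """Build one block: the signature packet is sent FIRST, then the data packets.

    Each data packet carries the hash of the following packet, so hashes are computed
    from the last packet backwards and the first packet's hash goes into the
    signature packet.
    """
    packets = []
    next_hash = b""
    for payload in reversed(payloads):
        packet = payload + next_hash            # append hash of the following packet
        packets.append(packet)
        next_hash = sha256(packet)
    packets.reverse()
    signature_packet = next_hash + sign(next_hash)   # hash of first data packet, signed
    return [signature_packet] + packets

def verify_stream(packets):
    """Receiver side: verify the signature packet first, then each packet as it arrives."""
    expected = packets[0][:32]
    if sign(expected) != packets[0][32:]:
        return False
    for packet in packets[1:]:
        if sha256(packet) != expected:
            return False
        expected = packet[-32:]                 # hash of the next packet, carried here
    return True

block = build_block([b"chunk-%d" % i for i in range(5)])
print("authenticated:", verify_stream(block))
```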

Keywords: multicast stream authentication, hash chain construction, signature amortization, authentication probability.

374 Determination of Some Physical and Mechanical Properties of Pofaki Variety of Pea

Authors: M. Azadbakht, E. Ghajarjazi, E. Amiri, F. Abdigaol

Abstract:

In this research, the effect of moisture at three levels (47, 57, and 67% w.b.) on the physical properties of the Pofaki pea variety, including dimensions, geometric mean diameter, volume, sphericity index and surface area, was determined. The influence of the different moisture levels (47, 57 and 67% w.b.), two loading orientations (longitudinal and transverse) and three loading speeds (4, 6 and 8 mm min-1) on the mechanical properties of the pea, such as maximum deformation, rupture force, rupture energy, toughness and the power required to break the pea, was investigated. For the physical properties, moisture changes were significant at the 1% level for dimensions, geometric mean diameter, volume, sphericity index and surface area. For the mechanical properties, moisture changes were significant at the 1% level for maximum deformation, rupture force, rupture energy, toughness and breaking power. Loading speed was significant for maximum deformation, rupture force and rupture energy at the 1% level and for toughness at the 5% level. Loading orientation was significant for maximum deformation, rupture force, rupture energy and toughness at the 1% level and for power at the 5% level. The interaction of speed and orientation was significant for rupture energy at the 1% level and for toughness at the 5% probability level. The interaction of moisture and speed was significant for rupture force and rupture energy at the 1% level and for toughness at the 5% probability level. The interaction of orientation and moisture was significant for rupture energy and toughness at the 1% level.

Keywords: Mechanical properties, Pea, Physical properties.

373 Maximizer of the Posterior Marginal Estimate for Noise Reduction of JPEG-compressed Image

Authors: Yohei Saika, Yuji Haraguchi

Abstract:

We constructed a method of noise reduction for JPEG-compressed images based on Bayesian inference using the maximizer of the posterior marginal (MPM) estimate. In this method, we applied the MPM estimate using two kinds of likelihood, both of which act on grayscale images degraded by lossy JPEG compression. One is a deterministic model of the likelihood and the other is a probabilistic one expressed by the Gaussian distribution. Then, using Monte Carlo simulation on grayscale images, such as the 256-grayscale standard image "Lena" with 256 × 256 pixels, we examined the performance of the MPM estimate using the mean square error as the performance measure. We found that the MPM estimate via the Gaussian probabilistic model of the likelihood is effective for reducing noise, such as blocking artifacts and mosquito noise, if the parameters are set appropriately. On the other hand, we found that the MPM estimate via the deterministic model of the likelihood is not effective for noise reduction due to the low acceptance ratio of the Metropolis algorithm.

Keywords: Noise reduction, JPEG-compressed image, Bayesian inference, maximizer of the posterior marginal estimate.

372 Evaluation of the Mechanical Behavior of a Retaining Wall Structure on a Weathered Soil through Probabilistic Methods

Authors: P. V. S. Mascarenhas, B. C. P. Albuquerque, D. J. F. Campos, L. L. Almeida, V. R. Domingues, L. C. S. M. Ozelim

Abstract:

Slope retaining structures are increasingly required in geotechnical engineering projects due to extensive urban growth. These kinds of engineering constructions may develop instabilities over time and may require reinforcement or even rebuilding. In this context, statistical analysis is an important tool for decision making regarding retaining structures. This study addresses the failure probability of constructing a retaining wall over the debris of an old, collapsed one. The new structure will be approximately 350 m long and located on the margins of Lake Paranoá, Brasília, the capital of Brazil. The building process must also account for the utilization of the ruins as a caisson. A series of in situ and laboratory experiments defined the local soil strength parameters. A Standard Penetration Test (SPT) campaign defined the in situ soil stratigraphy. The parameters obtained were also verified against soil data from a collection of masters and doctoral works from the University of Brasília, which concern soils similar to the local soil. Initial studies show that a concrete wall is the proper solution for this case, taking into account technical, economic and deterministic analyses. On the other hand, in order to better analyze the statistical significance of the factors of safety obtained, a Monte Carlo analysis was performed for the concrete wall and two other initial solutions. A comparison between the statistical and risk results generated for the different solutions indicated that a gabion solution would better fit the financial and technical feasibility of the project.

Keywords: Economical analysis, probability of failure, retaining walls, statistical analysis.

371 Super Resolution Blind Reconstruction of Low Resolution Images using Wavelets based Fusion

Authors: Liyakathunisa, V. K. Ananthashayana

Abstract:

Crucial information barely visible to the human eye is often embedded in a series of low-resolution images taken of the same scene. Super-resolution reconstruction is the process of combining several low-resolution images into a single higher-resolution image. The ideal algorithm should be fast, and should add sharpness and detail, both at edges and in regions, without adding artifacts. In this paper we propose a super-resolution blind reconstruction technique for linearly degraded images. The proposed algorithm is divided into three parts: image registration, wavelet-based fusion and image restoration. Three low-resolution images are considered, which may be sub-pixel shifted, rotated, blurred or noisy. The sub-pixel-shifted images are registered using an affine transformation model, a wavelet-based fusion is performed, and the noise is removed using soft thresholding. The proposed technique reduces blocking artifacts, smooths the edges, and is also able to restore high-frequency details in an image. The technique is efficient and computationally fast, with a clear prospect of real-time implementation.
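
The fusion and denoising steps can be illustrated with PyWavelets as below; registration and the restoration step are omitted, and the max-absolute fusion rule, wavelet choice and threshold value are assumptions for illustration, not the paper's settings.

```python
import numpy as np
import pywt

def fuse_and_denoise(images, wavelet="db4", level=3, threshold=10.0):
    """Fuse registered low-resolution images in the wavelet domain and soft-threshold noise."""
    decomps = [pywt.wavedec2(img, wavelet, level=level) for img in images]

    # Approximation band: average across images.
    fused = [np.mean([d[0] for d in decomps], axis=0)]

    # Detail bands: keep the coefficient with the largest magnitude (max-abs rule),
    # then apply soft thresholding to suppress noise.
    for lvl in range(1, level + 1):
        fused_level = []
        for band in range(3):                      # horizontal, vertical, diagonal
            stack = np.stack([d[lvl][band] for d in decomps])
            idx = np.argmax(np.abs(stack), axis=0)
            coeffs = np.take_along_axis(stack, idx[None, ...], axis=0)[0]
            fused_level.append(pywt.threshold(coeffs, threshold, mode="soft"))
        fused.append(tuple(fused_level))

    return pywt.waverec2(fused, wavelet)

# Example with three noisy copies of a synthetic image (stand-ins for registered frames).
rng = np.random.default_rng(1)
base = np.kron(np.eye(8), np.ones((16, 16))) * 100.0
frames = [base + rng.normal(0, 15, base.shape) for _ in range(3)]
result = fuse_and_denoise(frames)
print(result.shape)
```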

Keywords: Affine Transforms, Denoising, DWT, Fusion, Image registration.

370 Asymptotic Analysis of Instant Messaging Service with Relay Nodes

Authors: Muhammad T. Alam, Zheng Da Wu

Abstract:

In this paper, we provide a complete end-to-end delay analysis, including the relay nodes, for instant messages. The Message Session Relay Protocol (MSRP) is used to provide congestion control for large messages in the Instant Messaging (IM) service. Large messages are broken into several chunks. These chunks may traverse a maximum of two relay nodes before reaching the destination, according to the IETF specification of the MSRP relay extensions. We discuss the current solutions for sending large instant messages and introduce a proposal to reduce message flows in the IM service. We consider a virtual traffic parameter, i.e., the relay nodes are stateless and non-blocking for scalability purposes. This type of relay node is also assumed to receive input at a constant bit rate. We provide a new scheduling policy that schedules chunks according to the delivery timestamp tags of their previous node. Validation and analysis are shown for this scheduling policy. The performance analysis with the model introduced in this paper is simple and straightforward, and leads to reduced message flows in the IM service.
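
The scheduling policy itself reduces to a priority queue keyed on the previous node's delivery timestamp; a minimal sketch with hypothetical chunk fields follows (not the paper's analytical model).

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Chunk:
    prev_delivery_ts: float                 # timestamp tag set by the previous relay node
    seq: int = field(compare=False)         # chunk sequence number within the message
    data: bytes = field(compare=False, default=b"")

class RelayScheduler:
    """Stateless, non-blocking relay: forward chunks in order of the previous node's timestamp."""

    def __init__(self):
        self._queue = []

    def enqueue(self, chunk: Chunk) -> None:
        heapq.heappush(self._queue, chunk)

    def next_chunk(self):
        return heapq.heappop(self._queue) if self._queue else None

scheduler = RelayScheduler()
scheduler.enqueue(Chunk(prev_delivery_ts=12.7, seq=2, data=b"..."))
scheduler.enqueue(Chunk(prev_delivery_ts=11.3, seq=1, data=b"..."))
print(scheduler.next_chunk().seq)           # chunk tagged with the earliest delivery time first
```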

Keywords: Instant messaging, stateless, chunking, MSRP.

369 Speaker Identification by Joint Statistical Characterization in the Log Gabor Wavelet Domain

Authors: Suman Senapati, Goutam Saha

Abstract:

Real-world Speaker Identification (SI) applications differ from ideal or laboratory conditions, causing perturbations that lead to a mismatch between the training and testing environments and degrade performance drastically. Many strategies have been adopted to cope with acoustical degradation; the wavelet-based Bayesian marginal model is one of them. However, Bayesian marginal models cannot capture the inter-scale statistical dependencies between different wavelet scales. Simple nonlinear estimators for wavelet-based denoising assume that the wavelet coefficients in different scales are independent in nature, yet wavelet coefficients have significant inter-scale dependency. This paper exploits this inter-scale dependency property through a Circularly Symmetric Probability Density Function (CS-PDF) related to the family of Spherically Invariant Random Processes (SIRPs) in the Log Gabor Wavelet (LGW) domain, and the corresponding joint shrinkage estimator is derived by Maximum a Posteriori (MAP) estimation. A framework based on these is proposed to denoise speech signals for automatic speaker identification problems. The robustness of the proposed framework is tested for text-independent speaker identification on 100 speakers of the POLYCOST and 100 speakers of the YOHO speech databases in three different noise environments. Experimental results show that the proposed estimator yields a higher improvement in identification accuracy than other estimators on a popular Gaussian Mixture Model (GMM) based speaker model with Mel-Frequency Cepstral Coefficient (MFCC) features.

Keywords: Speaker Identification, Log Gabor Wavelet, Bayesian Bivariate Estimator, Circularly Symmetric Probability Density Function, SIRP.

368 Comparative Studies on Interactions of Synthetic and Natural Compounds with Hen Egg-White Lysozyme

Authors: Seifollah Bahramikia

Abstract:

Amyloid aggregation of polypeptides is related to a growing number of pathologic states known as amyloid disorders. In recent years, blocking and reversing amyloid aggregation via the use of small compounds have been considered two useful approaches to hampering the development of these diseases. In this research, we compared the ability of several manganese-salen derivatives, as synthetic compounds, and apigenin, as a natural flavonoid, to inhibit aggregation of hen egg-white lysozyme (HEWL), used as an in vitro model system. Different spectroscopic analyses, such as Thioflavin T (ThT) and Anilinonaphthalene-8-sulfonic acid (ANS) fluorescence and Congo red (CR) absorbance, along with transmission electron microscopy, were used to monitor HEWL aggregation kinetics and their inhibition. Our results demonstrated that both types of compounds were capable of preventing the formation of lysozyme amyloid aggregates in vitro. In addition, our data indicated that the synthetic compounds inhibited β-sheet structures more strongly than the natural compound. Given the higher antioxidant activities of the salen derivatives, it can be concluded that, in addition to the aromatic rings of each compound, the potent antioxidant properties of the salen derivatives contribute to lower lysozyme fibril accumulation.

Keywords: Aggregation, anti-amyloidogenic, apigenin, hen egg white lysozyme, salen derivatives.

367 Association of the p53 Codon 72 Polymorphism with Colorectal Cancer in South West of Iran

Authors: A. Doosti, P. Ghasemi Dehkordi, M. Zamani, S. Taheri, M. Banitalebi, M. Mahmoudzadeh

Abstract:

The p53 tumor suppressor gene plays two important roles in genomic stability: blocking cell proliferation after DNA damage until it has been repaired, and initiating apoptosis if the damage is too severe. The codon 72 (exon 4) polymorphism (Arg72Pro) of the p53 gene has been implicated in cancer risk. Various studies have investigated the status of p53 codon 72 arginine (Arg) and proline (Pro) alleles in different populations, as well as the association of this polymorphism with various tumors. Our objective was to investigate the possible association between the p53 Arg72Pro polymorphism and susceptibility to colorectal cancer in the Isfahan and Chaharmahal va Bakhtiari populations (part of the southwest of Iran). We determined the status of the p53 codon 72 Arg/Arg, Arg/Pro and Pro/Pro genotypes in blood samples from 145 colorectal cancer patients and 140 controls by nested PCR of p53 exon 4, digestion with the BstUI restriction enzyme, and resolution of the DNA fragments by electrophoresis in a 2% agarose gel. The Pro allele appeared as a 279 bp fragment, while the Arg allele was cut into two fragments of 160 and 119 bp. Among the 145 colorectal cancer cases, 49 (33.79%) were homozygous for the Arg72 allele (Arg/Arg), 18 (12.41%) were homozygous for the Pro72 allele (Pro/Pro) and 78 (53.8%) were heterozygous (Arg/Pro). In conclusion, the p53 Arg/Arg genotype may be correlated with a possible increased risk of this kind of cancer in the southwest of Iran.
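
As a small worked example using only the case genotype counts reported above (49 Arg/Arg, 78 Arg/Pro, 18 Pro/Pro), allele frequencies and a Hardy-Weinberg check can be computed as follows; the control counts are not given in the abstract, so no case-control test is attempted.

```python
from scipy.stats import chi2

arg_arg, arg_pro, pro_pro = 49, 78, 18
n = arg_arg + arg_pro + pro_pro                     # 145 cases

# Allele frequencies.
p_arg = (2 * arg_arg + arg_pro) / (2 * n)           # ~0.607
p_pro = 1.0 - p_arg                                 # ~0.393
print(f"Arg: {p_arg:.3f}, Pro: {p_pro:.3f}")

# Hardy-Weinberg expected counts and chi-square goodness of fit (1 degree of freedom).
expected = [n * p_arg**2, n * 2 * p_arg * p_pro, n * p_pro**2]
observed = [arg_arg, arg_pro, pro_pro]
chi_sq = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
p_value = chi2.sf(chi_sq, df=1)
print(f"chi-square = {chi_sq:.2f}, p = {p_value:.3f}")
```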

Keywords: TP53, Polymorphism, Colorectal Cancer, Iran

366 Preliminary Study on Analysis of Pinching Motion Actuated by Electro-Active Polymers

Authors: Doo W. Lee, Soo J. Lee, Bye R. Yoon, Jae Y. Jho, Kyehan Rhee

Abstract:

Hand exoskeletons have been developed in order to assist disabled and elderly people with daily activities. A finger exoskeleton was developed using ionic polymer metal composite (IPMC) actuators, and its performance was evaluated in this study. In order to study the dynamic performance of a finger dummy performing a pinching motion, the force-generating characteristics of an IPMC actuator and the pinching motion of a thumb and index finger dummy actuated by IPMC actuators were analyzed. A blocking force of 1.54 N was achieved under 4 V DC. A thumb and index finger dummy, which has one degree of freedom at the proximal joint of each finger, was manufactured by three-dimensional rapid prototyping. Each finger was actuated by an IPMC actuator, and the maximum fingertip force was 1.18 N. The pinching motion of the dummy was analyzed by two video cameras in the vertical top and horizontal left end view planes. The finger dummy powered by IPMC actuators could perform flexion and extension of the index finger and thumb.

Keywords: Finger exoskeleton, ionic polymer metal composite, flexion and extension, motion analysis.

365 Robust ANOVA: An Illustrative Study in Horticultural Crop Research

Authors: Dinesh Inamadar, R. Venugopalan, K. Padmini

Abstract:

An attempt has been made in the present communication to elucidate the efficacy of robust ANOVA methods for analysing horticultural field experimental data in the presence of outliers. The results obtained support the use of robust ANOVA methods, as there was a substantial reduction in the error mean square, and hence in the probability of committing a Type I error, compared to the regular approach.
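
The abstract does not name the specific robust procedure; purely for illustration, the sketch below contrasts classical one-way ANOVA with a rank-based (Kruskal-Wallis) alternative on simulated yield data containing one outlier.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Three treatments with identical true means; one gross outlier in treatment C.
a = rng.normal(50, 5, 12)
b = rng.normal(50, 5, 12)
c = rng.normal(50, 5, 12)
c[0] = 120.0                                   # outlier (e.g. a recording error)

f_stat, p_classical = stats.f_oneway(a, b, c)  # classical ANOVA, sensitive to the outlier
h_stat, p_robust = stats.kruskal(a, b, c)      # rank-based alternative, resistant to it

print(f"classical ANOVA  p = {p_classical:.3f}")
print(f"rank-based test  p = {p_robust:.3f}")
```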

Keywords: Outliers, robust ANOVA, horticulture, Cook distance, Type I error.

364 The Impact of Upgrades on ERP System Reliability

Authors: F. Urem, K. Fertalj, I. Livaja

Abstract:

Constant upgrading of Enterprise Resource Planning (ERP) systems is necessary, but can cause new defects. This paper attempts to model the likelihood of defects after completed upgrades with a Weibull defect probability density function (PDF). A case study is presented analyzing recorded defect data for one ERP subsystem. Trends in the values of the parameters of the proposed Weibull distribution are observed over a one-year period. As a result, the ability to predict the appearance of defects after the next upgrade is described.
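
A minimal sketch of the kind of modelling described (fitting a Weibull distribution to times from an upgrade until defects are recorded) is shown below with hypothetical data; scipy's weibull_min is used for the fit.

```python
import numpy as np
from scipy.stats import weibull_min

# Hypothetical times (days) from a completed upgrade to each recorded defect.
days_to_defect = np.array([2, 3, 3, 5, 7, 8, 11, 14, 15, 19, 24, 30, 41, 55, 70], dtype=float)

# Fit the two-parameter Weibull (location fixed at zero).
shape, loc, scale = weibull_min.fit(days_to_defect, floc=0)
print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.1f} days")

# Probability that a defect appears within 30 days of the next upgrade,
# assuming the fitted parameters carry over.
p_30 = weibull_min.cdf(30, shape, loc=0, scale=scale)
print(f"P(defect within 30 days) = {p_30:.2f}")
```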

Keywords: ERP, upgrade, reliability, Weibull model

363 Solitons in Nonlinear Optical Lattices

Authors: Tapas Kumar Sinha, Joseph Mathew

Abstract:

Based on the Lagrangian for the Gross-Pitaevskii equation as derived by H. Sakaguchi and B. A. Malomed [5], we have derived a double-well model for the nonlinear optical lattice. This model explains the various features of nonlinear optical lattices. Further, from this model we obtain and simulate the probability of tunneling from one well to the other, which agrees with experimental results [4].

Keywords: Double well model, nonlinear optical lattice, Solitons, tunneling.

362 Characterization of the Microbial Induced Carbonate Precipitation Technique as a Biological Cementing Agent for Sand Deposits

Authors: Sameh Abu El-Soud, Zahra Zayed, Safwan Khedr, Adel M. Belal

Abstract:

Population growth in Egypt is creating pressure for horizontal land development, which has become a necessity in order to benefit from different natural resources and expand beyond the narrow Nile valley. However, this development faces challenges that hinder land and agricultural development. Desertification and moving sand dunes in the western sector of Egypt are considered the major obstacles blocking ideal land use and development. In the proposed research, sandy soil is treated biologically using Bacillus pasteurii bacteria, as these bacteria have the ability to bond the sand particles, changing loose sand into cemented sand and thereby reducing the mobility of sand dunes. The procedure for implementing the Microbial Induced Carbonate Precipitation (MICP) technique is examined, and the different factors affecting this process, such as the medium of bacteria sample preparation, the optical density (OD600), the reactant concentration, and the injection rates and intervals, are highlighted. Based on the findings of the MICP treatment of sandy soil, conclusions and future recommendations are presented.

Keywords: Soil stabilization, biological treatment, MICP, sand cementation.

361 Categorical Clustering By Converting Associated Information

Authors: Dongmin Cai, Stephen S-T Yau

Abstract:

The lack of an inherent “natural” dissimilarity measure between objects in categorical datasets presents special difficulties for clustering analysis. However, each categorical attribute of a given dataset provides natural probabilities, and thus information in the sense of Shannon. In this paper, we propose a novel method which heuristically converts categorical attributes to numerical values by exploiting this associated information. We conduct an experimental study with a real-life categorical dataset. The experiment demonstrates the effectiveness of our approach.
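
One simple reading of the conversion idea (assumed here for illustration, not necessarily the authors' exact mapping) is to replace each categorical value by its Shannon information content, -log2 of its empirical probability, and then cluster the resulting numeric matrix.

```python
import numpy as np
from collections import Counter
from sklearn.cluster import KMeans

def information_encode(column):
    """Replace each categorical value by -log2 of its empirical probability."""
    counts = Counter(column)
    total = len(column)
    return np.array([-np.log2(counts[v] / total) for v in column])

# Toy categorical dataset: rows are objects, columns are attributes.
data = [
    ["red",   "small", "yes"],
    ["red",   "small", "yes"],
    ["blue",  "large", "no"],
    ["blue",  "large", "no"],
    ["green", "small", "yes"],
    ["blue",  "small", "no"],
]
columns = list(zip(*data))
numeric = np.column_stack([information_encode(col) for col in columns])

# Cluster the converted numeric representation with ordinary k-means.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(numeric)
print(labels)
```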

Keywords: Categorical, Clustering, Converting, Information

360 Exponentially Weighted Simultaneous Estimation of Several Quantiles

Authors: Valeriy Naumov, Olli Martikainen

Abstract:

In this paper we propose a new method for simultaneously generating multiple quantiles corresponding to given probability levels from data streams and massive data sets. This method provides a basis for the development of single-pass, low-storage quantile estimation algorithms, which differ in complexity, storage requirements and accuracy. We demonstrate that such algorithms may perform well even for heavy-tailed data.
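
The paper's algorithm is not reproduced here; as a rough illustration of the single-pass, low-storage family it belongs to, the sketch below tracks several quantiles simultaneously with an exponentially weighted stochastic-approximation update.

```python
import numpy as np

class StreamingQuantiles:
    """Track several quantiles of a data stream in a single pass with O(k) storage."""

    def __init__(self, probs, step=0.01):
        self.probs = np.asarray(probs, dtype=float)   # target probability levels
        self.step = step                              # constant step => exponential weighting
        self.q = None                                 # current quantile estimates

    def update(self, x: float) -> None:
        if self.q is None:
            self.q = np.full(self.probs.shape, x, dtype=float)
        # Move each estimate up if the observation is above it, down otherwise.
        self.q += self.step * (self.probs - (x <= self.q))
        self.q.sort()                                 # keep estimates monotone in p

# Heavy-tailed example stream.
rng = np.random.default_rng(0)
est = StreamingQuantiles(probs=[0.5, 0.9, 0.99])
stream = rng.pareto(2.5, size=100_000)
for x in stream:
    est.update(x)
print("estimates :", np.round(est.q, 3))
print("empirical :", np.round(np.quantile(stream, [0.5, 0.9, 0.99]), 3))
```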

Keywords: Quantile estimation, data stream, heavy-tailed distribution, tail index.

359 Absolute Cross Sections of Multi-Photon Ionization of Xenon by the Comparison with Process of its Electron-Impact Ionization

Authors: A. A. Mityureva, A. A. Pastor, P. Yu. Serdobintsev, N. A. Timofeev

Abstract:

Comparison of electron- and photon-impact processes as a method for determination of photo-ionization cross sections is described, discussed and shown to have many attractive features.

Keywords: Transition probability, cross section, photo-ionization, electron-ionization, multi-photon process.

358 Optimal Image Compression Based on Sign and Magnitude Coding of Wavelet Coefficients

Authors: Mbainaibeye Jérôme, Noureddine Ellouze

Abstract:

The wavelet transform is a very powerful tool for image compression. One of its advantages is that it provides both spatial and frequency localization of image energy. However, wavelet transform coefficients are defined by both a magnitude and a sign. While algorithms exist for efficiently coding the magnitude of the transform coefficients, they are not efficient for coding their sign. It is generally assumed that there is no compression gain to be obtained from coding the sign. Only recently have some authors begun to investigate the sign of wavelet coefficients in image coding. Some authors have assumed that the sign information bit of wavelet coefficients may be encoded with an estimated probability of 0.5; the same assumption concerns the refinement information bit. In this paper, we propose a new method for Separate Sign Coding (SSC) of wavelet image coefficients. The sign and the magnitude of wavelet image coefficients are examined to obtain their online probabilities. We use scalar quantization, in which the information on whether the wavelet coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also examined. We show that the sign information and the refinement information may be encoded with a probability of approximately 0.5 only after about five bit planes. Two maps are separately entropy encoded: the sign map and the magnitude map. The refinement information indicating whether the wavelet coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also entropy encoded. An algorithm is developed and simulations are performed on three standard grayscale images: Lena, Barbara and Cameraman. A five-scale decomposition is performed using the biorthogonal 9/7 wavelet filter bank. The results obtained are compared to the JPEG2000 standard in terms of peak signal-to-noise ratio (PSNR) for the three images and in terms of subjective (visual) quality. It is shown that the proposed method outperforms JPEG2000. The proposed method is also compared to other codecs in the literature, and it is shown to be very successful in terms of PSNR.
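
To give a feel for why sign bits are usually treated as incompressible, the sketch below measures the empirical probability and entropy of the sign of wavelet detail coefficients for a test image, using PyWavelets; the image here is synthetic, not one of the three standard images used in the paper.

```python
import numpy as np
import pywt

# Synthetic grayscale test image (a stand-in for Lena / Barbara / Cameraman).
x = np.linspace(0, 4 * np.pi, 256)
image = 128 + 60 * np.outer(np.sin(x), np.cos(1.5 * x))

coeffs = pywt.wavedec2(image, "bior4.4", level=5)     # biorthogonal 9/7-type filter bank
details = np.concatenate([band.ravel() for lvl in coeffs[1:] for band in lvl])
nonzero = details[np.abs(details) > 1e-6]

p_positive = np.mean(nonzero > 0)
entropy = -(p_positive * np.log2(p_positive) + (1 - p_positive) * np.log2(1 - p_positive))
print(f"P(sign = +) = {p_positive:.3f}, sign entropy = {entropy:.3f} bits/coefficient")
```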

Keywords: Image compression, wavelet transform, sign coding, magnitude coding.

357 Scenario and Decision Analysis for Solar Energy in Egypt by 2035 Using Dynamic Bayesian Network

Authors: Rawaa H. El-Bidweihy, Hisham M. Abdelsalam, Ihab A. El-Khodary

Abstract:

Bayesian networks are now considered a promising tool in the field of energy, with various applications. In this study, the aim was to specify the states of a previously constructed Bayesian network related to solar energy in Egypt and the factors affecting its market share, depending on the type of data distribution followed by each factor, and using either the Z-distribution approach or Chebyshev's inequality theorem. The separate and conditional probabilities of the states of each factor in the Bayesian network were then derived, either from collected and scraped historical data or from estimates and past studies. The results showed that the constructed model can be used for scenario and decision analysis concerning the forecast of the total market share of solar energy in Egypt by 2035 and its use as a stable renewable source for generating any type of energy needed. They also showed that whenever the use of solar energy increases, total costs decrease. Furthermore, we identified different scenarios, such as the best, worst, 50/50, and most likely ones, in terms of the expected changes in the solar energy market share. The best scenario showed an 85% probability that the market share of solar energy in Egypt will exceed 10% of the total energy market, while the worst scenario showed only a 24% probability of exceeding that share. Furthermore, we applied policy analysis to check the effect of changing the states of the controllable (decision) variable, acting as different scenarios, to show how this would affect the target nodes in the model. Additionally, the best environmental and economic scenarios were developed to show how other factors would need to behave in order to affect the model positively. Additional evidence and derived probabilities were added for the dynamic weather nodes, whose states depend on time, during the process of converting the Bayesian network into a dynamic Bayesian network.
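
As a small numeric aside on the Chebyshev approach mentioned above (the exact discretization used in the paper is not reproduced), Chebyshev's inequality gives distribution-free bounds that can delimit a factor's states when its distribution type is unknown; the factor and values below are hypothetical.

```python
import math

def chebyshev_interval(mean: float, std: float, coverage: float):
    """Interval guaranteed by Chebyshev to contain at least `coverage` of any distribution."""
    k = math.sqrt(1.0 / (1.0 - coverage))      # P(|X - mu| >= k*sigma) <= 1/k^2
    return mean - k * std, mean + k * std

# Hypothetical factor: annual electricity demand growth (%), mean 4.0, std 1.5.
for coverage in (0.75, 0.90, 0.95):
    lo, hi = chebyshev_interval(4.0, 1.5, coverage)
    print(f">= {coverage:.0%} of mass within [{lo:.2f}, {hi:.2f}]")
```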

Keywords: Bayesian network, Chebyshev, decision variable, dynamic Bayesian network, Z-distribution

356 Internal Migration and Poverty Dynamic Analysis Using a Bayesian Approach: The Tunisian Case

Authors: Amal Jmaii, Damien Rousseliere, Besma Belhadj

Abstract:

We explore the relationship between internal migration and poverty in Tunisia. We present a methodology combining the potential outcomes approach with multiple imputation to highlight the effect of internal migration on poverty states. We find that the probability of being poor decreases when migrants leave the poorest regions (the western areas) for the richer regions (Greater Tunis and the eastern regions).
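
Multiple-imputation results are typically pooled with Rubin's rules; a generic sketch (not the authors' Bayesian estimation itself, with hypothetical numbers) is given below for an effect estimated on m completed datasets.

```python
import numpy as np

# Hypothetical per-imputation estimates of the migration effect on the probability
# of being poor, and their squared standard errors (m = 5 imputed datasets).
estimates = np.array([-0.062, -0.055, -0.071, -0.049, -0.066])
variances = np.array([0.00041, 0.00038, 0.00044, 0.00040, 0.00039])

m = len(estimates)
q_bar = estimates.mean()                       # pooled point estimate
w = variances.mean()                           # within-imputation variance
b = estimates.var(ddof=1)                      # between-imputation variance
total_var = w + (1 + 1 / m) * b                # Rubin's total variance
print(f"pooled effect = {q_bar:.3f} (SE = {np.sqrt(total_var):.4f})")
```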

Keywords: Internal migration, Bayesian approach, poverty dynamics, Tunisia.

355 ISTER (Immune System - Tumor Efficiency Rate): An Important Key for Planning in Radiotherapic Facilities

Authors: O. Sotolongo-Grau, D. Rodriguez-Perez, J. A. Santos-Miranda, M. M. Desco, O. Sotolongo-Costa, J. C. Antoranz

Abstract:

The use of the oncologic index ISTER allows more effective planning of radiotherapy facilities in hospitals. Any change in the radiotherapy treatment due to unexpected stops may be accommodated by recalculating the doses for the new treatment duration while keeping the optimal prognosis. The results obtained in a simulation model on millions of patients allow the definition of optimal success probability algorithms.

Keywords: Mathematical model, radiation oncology, dynamical systems applications.

354 Relation of Optimal Pilot Offsets in the Shifted Constellation-Based Method for the Detection of Pilot Contamination Attacks

Authors: Dimitriya A. Mihaylova, Zlatka V. Valkova-Jarvis, Georgi L. Iliev

Abstract:

One possible approach to maintaining the security of communication systems relies on Physical Layer Security mechanisms. However, in wireless time division duplex systems, where the uplink and downlink channels are reciprocal, the channel estimation procedure is exposed to attacks known as pilot contamination, which aim to have an enhanced data signal sent to the malicious user. The Shifted 2-N-PSK method involves two random legitimate pilots in the training phase, each belonging to a constellation shifted from the original N-PSK symbols by a certain number of degrees. In this paper, the legitimate pilots' offset values and their influence on the detection capabilities of the Shifted 2-N-PSK method are investigated. As the implementation of the technique depends on the relation between the shift angles rather than on their specific values, the optimal interconnection between the two legitimate constellations is investigated. The results show that no regularity exists in the relation between the pilot contamination attack (PCA) detection probability and the choice of offset values. Therefore, an adversary who aims to obtain the exact offset values can only employ a brute-force attack, but the large number of possible combinations for the shifted constellations makes such an attack difficult to mount successfully. For this reason, the number of optimal shift value pairs is also studied for both 100% and 98% probabilities of detecting pilot contamination attacks. Although the Shifted 2-N-PSK method has been broadly studied in different signal-to-noise ratio scenarios, in multi-cell systems the interference from the signals in other cells should also be taken into account. Therefore, the impact of inter-cell interference on the performance of the method is investigated by means of a large number of simulations. The results show that the detection probability of the Shifted 2-N-PSK decreases inversely with the signal-to-interference-plus-noise ratio.

Keywords: Channel estimation, inter-cell interference, pilot contamination attacks, wireless communications.

353 Treatment of Spin-1/2 Particle in Interaction with a Time-Dependent Magnetic Field by the Fermionic Coherent-State Path-Integral Formalism

Authors: Aouachria Mekki

Abstract:

We consider a spin-1/2 particle interacting with a time-dependent magnetic field using the path-integral formalism. The propagator is first written in the standard form, replacing the spin by two fermionic oscillators via the Schwinger model. The propagator is then determined exactly, thanks to a simple transformation, and the transition probability is deduced.

Keywords: Path integral formalism, propagator.

352 A Comparison of Fuzzy Clustering Algorithms to Cluster Web Messages

Authors: Sara El Manar El Bouanani, Ismail Kassou

Abstract:

Our objective in this paper is to propose an approach capable of clustering web messages. The clustering is carried out by assigning, with a certain probability, texts written by the same web user to the same cluster, based on stylometric features and using fuzzy clustering algorithms. The focus of the present work is on comparing the most popular algorithms in fuzzy clustering theory, namely Fuzzy C-Means, Possibilistic C-Means and Fuzzy Possibilistic C-Means.
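
For reference, a compact NumPy implementation of the standard Fuzzy C-Means updates (the first of the three compared algorithms; the possibilistic variants differ in their membership update) might look like this, with random stand-in data in place of real stylometric feature vectors.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Standard Fuzzy C-Means: returns (centers, membership matrix U of shape (n, c))."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)            # membership rows sum to 1

    for _ in range(max_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]        # fuzzy-weighted means
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        dist = np.fmax(dist, 1e-10)
        # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        U_new = 1.0 / ((dist[:, :, None] / dist[:, None, :]) ** (2.0 / (m - 1))).sum(axis=2)
        if np.linalg.norm(U_new - U) < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Toy stand-in for stylometric feature vectors of web messages (two clear groups).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (30, 4)), rng.normal(5, 1, (30, 4))])
centers, U = fuzzy_c_means(X, c=2)
print("hard assignments per cluster:", np.bincount(U.argmax(axis=1)))
```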

Keywords: Authorship detection, fuzzy clustering, profiling, stylometric features.
