Search results for: T2)*-semi star generalized locally closed sets
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2082

1722 Statistical Modeling of Local Area Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes

Authors: Jihad S. Daba, J. P. Dubois

Abstract:

Fading noise degrades the performance of cellular communication, most notably in femto- and pico-cells in 3G and 4G systems. When the wireless channel consists of a small number of scattering paths, the statistics of the fading noise are not analytically tractable and pose a serious challenge to developing closed canonical forms that can be analysed and used in the design of efficient and optimal receivers. In this context, the noise is multiplicative and is referred to as stochastically local fading. In many analytical investigations of multiplicative noise, exponential or Gamma statistics are invoked. More recent advances by the author of this paper utilized Poisson-modulated weighted generalized Laguerre polynomials with controlling parameters and uncorrelated-noise assumptions. In this paper, we investigate the statistics of the multidiversity stochastically local area fading channel when the channel consists of randomly distributed Rayleigh and Rician scattering centers with a coherent Nakagami-distributed line-of-sight component and an underlying doubly stochastic Poisson process driven by a lognormal intensity. These combined statistics form a unifying triply stochastic filtered marked Poisson point process model.
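
As a rough illustration of the kind of model described above, the following Python sketch (an assumption for orientation, not the author's code) simulates a doubly stochastic (Cox) Poisson number of scatterers driven by a lognormal intensity and forms a faded envelope from Rayleigh-distributed marks; the Rician/Nakagami line-of-sight component and the full triply stochastic filtering are omitted for brevity.

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_envelope(mu=2.0, sigma=0.5):
        """One draw of a local-area fading envelope: a lognormal intensity drives a
        Poisson number of scatterers, each carrying a Rayleigh-distributed mark."""
        intensity = rng.lognormal(mean=mu, sigma=sigma)   # doubly stochastic (Cox) intensity
        n_paths = rng.poisson(lam=intensity)              # random number of scattering paths
        if n_paths == 0:
            return 0.0
        amps = rng.rayleigh(scale=1.0, size=n_paths)      # Rayleigh marks
        phases = rng.uniform(0.0, 2.0 * np.pi, size=n_paths)
        return abs(np.sum(amps * np.exp(1j * phases)))    # coherent sum of scattered paths

    envelopes = np.array([sample_envelope() for _ in range(10_000)])
    print(f"mean={envelopes.mean():.3f}  std={envelopes.std():.3f}")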

Keywords: Cellular communication, femto- and pico-cells, stochastically local area fading channel, triply stochastic filtered marked Poisson point process.

1721 Application of Pulse Doubling in Star-Connected Autotransformer Based 12-Pulse AC-DC Converter for Power Quality Improvement

Authors: Rohollah Abdollahi, Alireza Jalilian

Abstract:

This paper presents a pulse doubling technique in a 12-pulse ac-dc converter which supplies direct torque controlled induction motor drives (DTCIMDs) in order to obtain better power quality conditions at the point of common coupling. The proposed technique increases the number of rectification pulses without significant changes in the installation and yields harmonic reduction on both the ac and dc sides. The 12-pulse rectified output voltage is obtained via two paralleled six-pulse ac-dc converters, each consisting of a three-phase diode bridge rectifier. An autotransformer is designed to supply the rectifiers. The magnetics are designed in such a way as to make the converter suitable for retrofit applications where a six-pulse diode bridge rectifier is already utilized. Independent operation of the paralleled diode-bridge rectifiers, i.e. the dc-ripple re-injection methodology, requires a Zero Sequence Blocking Transformer (ZSBT). Finally, a tapped interphase reactor is connected at the output of the ZSBT to double the number of output voltage pulses up to 24. The aforementioned structure improves power quality criteria at the ac mains and makes them consistent with the IEEE-519 standard requirements for varying loads. Furthermore, near-unity power factor is obtained for a wide range of DTCIMD operation. A comparison is made between the 6-pulse, 12-pulse, and proposed converters from the viewpoint of power quality indices. Results show that the input current total harmonic distortion (THD) is less than 5% for the proposed topology at various loads.
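
For reference, the total harmonic distortion quoted as the figure of merit above is THD = sqrt(sum of squared harmonic magnitudes) / fundamental magnitude. The short sketch below (illustrative only, not the authors' simulation) estimates it from a sampled current waveform via an FFT.

    import numpy as np

    def current_thd(samples, fs, f1=50.0):
        """Estimate THD of a periodic current waveform sampled at fs Hz,
        with fundamental frequency f1 (assumes an integer number of cycles)."""
        spectrum = np.abs(np.fft.rfft(samples)) / len(samples)
        freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
        fund = spectrum[np.argmin(np.abs(freqs - f1))]
        harmonics = [spectrum[np.argmin(np.abs(freqs - k * f1))] for k in range(2, 50)]
        return np.sqrt(np.sum(np.square(harmonics))) / fund

    fs, f1 = 10_000.0, 50.0
    t = np.arange(0, 0.2, 1.0 / fs)                                          # 10 cycles
    i = np.sin(2 * np.pi * f1 * t) + 0.04 * np.sin(2 * np.pi * 5 * f1 * t)   # 4% 5th harmonic
    print(f"THD = {100 * current_thd(i, fs, f1):.2f} %")                     # ~4 %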

Keywords: AC–DC converter, star-connected autotransformer, power quality, 24-pulse rectifier, pulse doubling, direct torque controlled induction motor drive (DTCIMD).

1720 Discovering Liouville-Type Problems for p-Energy Minimizing Maps in Closed Half-Ellipsoids by Calculus Variation Method

Authors: Lina Wu, Jia Liu, Ye Li

Abstract:

The goal of this project is to investigate constant properties (the Liouville-type problem) for a p-stable map as a local or global minimum of a p-energy functional, where the domain is a Euclidean space and the target space is a closed half-ellipsoid. The first and second variation formulas for a p-energy functional have been applied in the calculus of variations method as computation techniques. Stokes’ Theorem, the Cauchy-Schwarz Inequality, Hardy-Sobolev type inequalities, and the Bochner Formula have been used as estimation techniques to bound the derived p-harmonic stability inequality from below and above. One challenging point in this project is to construct a family of variation maps such that the images of the variation maps are guaranteed to lie in a closed half-ellipsoid. The other challenging point is to find a contradiction between the lower bound and the upper bound in an analysis of the p-harmonic stability inequality when a p-energy minimizing map is not constant. In this way, the possibility of a non-constant p-energy minimizing map is ruled out and the constant property for a p-energy minimizing map is obtained. Our research finding establishes the constant property for a p-stable map from a Euclidean space into a closed half-ellipsoid in a certain range of p. This range of p is determined by the dimension values of the Euclidean space (the domain) and the ellipsoid (the target space), and it is also bounded by the curvature values on the ellipsoid (that is, the ratio of the longest axis to the shortest axis). Regarding Liouville-type results for a p-stable map, our finding on an ellipsoid is a generalization of mathematicians’ results on a sphere. Our result is also an extension of mathematicians’ Liouville-type results from a special ellipsoid with only one parameter to any ellipsoid with (n+1) parameters in the general setting.
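
For orientation, the p-energy functional presumably underlying the analysis is the standard one from harmonic map theory; the statement below uses assumed notation rather than the paper's own.

    % Standard p-energy of a map u from a Euclidean domain \Omega \subset \mathbb{R}^n
    % into a target manifold N (here a closed half-ellipsoid); |du| is the
    % Hilbert-Schmidt norm of the differential.
    E_p(u) \;=\; \frac{1}{p} \int_{\Omega} |du|^{p} \, dx , \qquad p \ge 2 .
    % A p-energy minimizing (p-stable) map is a critical point of E_p at which the
    % second variation is non-negative for all admissible variations.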

Keywords: Bochner Formula, Stokes’ Theorem, Cauchy-Schwarz Inequality, first and second variation formulas, Hardy-Sobolev type inequalities, Liouville-type problem, p-harmonic map.

1719 A Materialized Approach to the Integration of XML Documents: the OSIX System

Authors: H. Ahmad, S. Kermanshahani, A. Simonet, M. Simonet

Abstract:

The data exchanged on the Web are of a different nature from those treated by classical database management systems; these data are called semi-structured data since they do not have a regular and static structure like data found in a relational database: their schema is dynamic and may contain missing data or types. This raises the need to develop further techniques and algorithms to exploit and integrate such data and to extract information relevant to the user. In this paper we present the OSIX system (Osiris-based System for Integration of XML Sources). This system has a data warehouse model designed for the integration of semi-structured data, and more precisely for the integration of XML documents. The architecture of OSIX relies on the Osiris system, a DL-based model designed for the representation and management of databases and knowledge bases. Osiris is a view-based data model whose indexing system supports semantic query optimization. We show that the problem of query processing on an XML source is optimized by the indexing approach proposed by Osiris.

Keywords: Data integration, semi-structured data, views, XML.

1718 Properties and Approximation Distribution Reductions in Multigranulation Rough Set Model

Authors:

Abstract:

Some properties of approximation sets are studied in the multi-granulation optimistic model in rough set theory using maximal compatible classes. The relationships among lower and upper approximations in single and multiple granulation are compared and discussed. Through designing Boolean functions and discernibility matrices in incomplete information systems, the lower and upper approximation sets and reduction in multi-granulation environments can be found. Examples are used to confirm the correctness of the computation approach. The conclusions obtained are suitable for further investigation of the multiple-granulation rough set model (RSM).
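
As a concrete reminder of the basic objects involved, the sketch below (a generic single-granulation example, not the multi-granulation construction of the paper) computes lower and upper approximations of a target set from a family of classes such as maximal compatible classes.

    def approximations(classes, target):
        """Lower and upper approximation of `target` w.r.t. a family of classes."""
        target = set(target)
        lower = set().union(*[c for c in classes if set(c) <= target])   # classes inside the target
        upper = set().union(*[c for c in classes if set(c) & target])    # classes meeting the target
        return lower, upper

    classes = [{1, 2}, {3, 4}, {5, 6}]          # e.g. maximal compatible classes of the universe
    X = {1, 2, 3}                               # target concept
    low, up = approximations(classes, X)
    print("lower:", low)                        # {1, 2}
    print("upper:", up)                         # {1, 2, 3, 4}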

Keywords: Incomplete information system, maximal compatible class, multi-granulation rough set model, reduction.

1717 Forecast of the Small Wind Turbines Sales with Replacement Purchases and with or without Account of Price Changes

Authors: V. Churkin, M. Lopatin

Abstract:

The purpose of the paper is to estimate the US small wind turbines market potential and to forecast small wind turbine sales in the US. The forecasting method is based on the application of the Bass model and the generalized Bass model of innovation diffusion under replacement purchases. In this work, an exponential distribution is used for modeling replacement purchases; its single parameter is determined by the average lifetime of small wind turbines. The identification of the model parameters is based on nonlinear regression analysis of the annual sales statistics published by the American Wind Energy Association (AWEA) from 2001 to 2012. The estimate of the US average market potential of small wind turbines (for adoption purchases) without account of price changes is 57080 (confidence interval from 49294 to 64866 at P = 0.95) under an average wind turbine lifetime of 15 years, and 62402 (confidence interval from 54154 to 70648 at P = 0.95) under an average lifetime of 20 years. In the first case the explained variance is 90.7%, and in the second 91.8%. The effect of wind turbine price changes on sales was estimated using the generalized Bass model. This required a price forecast, for which a polynomial regression function based on the Berkeley Lab statistics was used. The estimate of the US average market potential of small wind turbines (for adoption purchases) in that case is 42542 (confidence interval from 32863 to 52221 at P = 0.95) under an average lifetime of 15 years, and 47426 (confidence interval from 36092 to 58760 at P = 0.95) under an average lifetime of 20 years. In the first case the explained variance is 95.3%, and in the second 95.3%.
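
For orientation, the closed-form Bass adoption curve and a nonlinear least-squares fit of its three parameters (m, p, q) to annual sales can be set up as below; this is a generic illustration with made-up numbers, not the AWEA data or the authors' replacement-purchase extension.

    import numpy as np
    from scipy.optimize import curve_fit

    def bass_sales(t, m, p, q):
        """Annual adoption sales implied by the Bass diffusion model:
        S(t) = m * f(t), with F(t) = (1 - exp(-(p+q)t)) / (1 + (q/p) exp(-(p+q)t))."""
        decay = np.exp(-(p + q) * t)
        f = ((p + q) ** 2 / p) * decay / (1.0 + (q / p) * decay) ** 2
        return m * f

    years = np.arange(1, 13)                    # e.g. 2001..2012 mapped to t = 1..12
    sales = np.array([400, 600, 900, 1300, 1900, 2600, 3400, 4200, 4900, 5300, 5200, 4800.0])
    (m, p, q), _ = curve_fit(bass_sales, years, sales, p0=(60_000, 0.01, 0.3), maxfev=10_000)
    print(f"market potential m={m:.0f}, innovation p={p:.4f}, imitation q={q:.4f}")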

Keywords: Bass model, generalized Bass model, replacement purchases, sales forecasting of innovations, statistics of sales of small wind turbines in the United States.

1716 Comparison of Imputation Techniques for Efficient Prediction of Software Fault Proneness in Classes

Authors: Geeta Sikka, Arvinder Kaur Takkar, Moin Uddin

Abstract:

Missing data is a persistent problem in almost all areas of empirical research. Missing data must be treated very carefully, as data plays a fundamental role in every analysis; improper treatment can distort the analysis or generate biased results. In this paper, we compare and contrast various imputation techniques on missing data sets and make an empirical evaluation of these methods so as to construct quality software models. Our empirical study is based on two public NASA data sets, KC4 and KC1, of 125 and 2107 cases respectively and without any missing values. These data sets were used to create Missing at Random (MAR) data. Listwise Deletion (LD), Mean Substitution (MS), Interpolation, Regression with an error term, and Expectation-Maximization (EM) approaches were used to compare the effects of the various techniques.
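
A minimal illustration of two of the listed treatments, listwise deletion and mean substitution, on an artificially MAR-corrupted table follows (using pandas/scikit-learn; the dataset and column names are placeholders, not the NASA KC1/KC4 metrics).

    import numpy as np
    import pandas as pd
    from sklearn.impute import SimpleImputer

    rng = np.random.default_rng(1)
    full = pd.DataFrame({"loc": rng.integers(10, 500, 200),
                         "complexity": rng.normal(10, 3, 200),
                         "faults": rng.poisson(2, 200)})

    # Make ~15% of the 'complexity' values Missing At Random
    mar = full.copy()
    mar.loc[rng.random(len(mar)) < 0.15, "complexity"] = np.nan

    listwise = mar.dropna()                                   # Listwise Deletion (LD)
    mean_sub = mar.copy()
    mean_sub[["complexity"]] = SimpleImputer(strategy="mean").fit_transform(mar[["complexity"]])

    print("original mean :", full["complexity"].mean())
    print("LD mean       :", listwise["complexity"].mean())
    print("MS mean       :", mean_sub["complexity"].mean())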

Keywords: Missing data, Imputation, Missing Data Techniques.

1715 A Generalized Sparse Bayesian Learning Algorithm for Near-Field Synthetic Aperture Radar Imaging: By Exploiting Impropriety and Noncircularity

Authors: Pan Long, Bi Dongjie, Li Xifeng, Xie Yongle

Abstract:

The near-field synthetic aperture radar (SAR) imaging is an advanced nondestructive testing and evaluation (NDT&E) technique. This paper investigates the complex-valued signal processing related to the near-field SAR imaging system, where the measurement data turn out to be noncircular and improper, meaning that the complex-valued data are correlated with their complex conjugate. Furthermore, we discover that the degree of impropriety of the measurement data and that of the target image can be highly correlated in near-field SAR imaging. Based on these observations, a modified generalized sparse Bayesian learning algorithm is proposed, taking impropriety and noncircularity into account. Numerical results show that the proposed algorithm provides a performance gain with the help of the noncircularity assumption on the signals.
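
The notion of impropriety used above can be made concrete in a few lines: a complex signal z is improper when its complementary covariance E[z^2] is non-zero, and the ratio |E[z^2]| / E[|z|^2] quantifies the degree. The sketch below is a generic illustration, not the proposed sparse Bayesian learning algorithm.

    import numpy as np

    rng = np.random.default_rng(0)

    def impropriety_coefficient(z):
        """|E[z^2]| / E[|z|^2]: 0 for proper (circular) signals, close to 1 for highly improper ones."""
        z = z - z.mean()
        return abs(np.mean(z * z)) / np.mean(np.abs(z) ** 2)

    n = 100_000
    proper = rng.normal(size=n) + 1j * rng.normal(size=n)            # circular complex Gaussian
    x = rng.normal(size=n)
    improper = x + 1j * (0.8 * x + 0.2 * rng.normal(size=n))          # real/imag parts correlated

    print("proper  :", round(impropriety_coefficient(proper), 3))     # ~0.0
    print("improper:", round(impropriety_coefficient(improper), 3))   # clearly > 0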

Keywords: Complex-valued signal processing, synthetic aperture radar (SAR), 2-D radar imaging, compressive sensing, Sparse Bayesian learning.

1714 Comanche – A Compiler-Driven I/O Management System

Authors: Wendy Zhang, Ernst L. Leiss, Huilin Ye

Abstract:

Most scientific programs have large input and output data sets that require out-of-core programming or the use of virtual memory management (VMM). Out-of-core programming is very error-prone and tedious; as a result, it is generally avoided. However, in many instances, VMM is not an effective approach because it often results in a substantial performance reduction. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can be used to replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system does not require any special services from the operating system and does not require modification of the operating system kernel.
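
The core idea, retrieving a large data set in blocks or tiles instead of relying on VMM paging, can be sketched in a few lines; this is a generic illustration with numpy memory-mapped files (file name is a placeholder), not the Comanche runtime itself.

    import numpy as np

    # Create a large on-disk array once (stands in for an out-of-core data set).
    shape = (10_000, 2_000)
    data = np.lib.format.open_memmap("big_matrix.npy", mode="w+", dtype=np.float32, shape=shape)
    data[:] = 1.0
    data.flush()

    # Process the array tile by tile so only one block is resident at a time.
    tile_rows = 1_000
    data = np.load("big_matrix.npy", mmap_mode="r")
    total = 0.0
    for start in range(0, shape[0], tile_rows):
        tile = np.asarray(data[start:start + tile_rows])   # explicitly fetch one tile
        total += float(tile.sum())                         # compute on the in-core tile
    print("sum over all tiles:", total)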

Keywords: I/O Management, Out-of-core, Compiler, Tile mapping.

1713 A Sociocybernetics Data Analysis Using Causality in Tourism Networks

Authors: M. Lloret-Climent, J. Nescolarde-Selva

Abstract:

The aim of this paper is to propose a mathematical model to determine invariant sets, set covering, orbits and, in particular, attractors in the set of tourism variables. Analysis was carried out based on a pre-designed algorithm and by applying our interpretation of chaos theory developed in the context of General Systems Theory. This article sets out the causal relationships associated with tourist flows in order to enable the formulation of appropriate strategies. Our results can be applied to numerous cases. For example, in the analysis of tourist flows, these findings can be used to determine whether the behaviour of certain groups affects that of other groups and to analyse tourist behaviour in terms of the most relevant variables. Unlike statistical analyses that merely provide information on current data, our method uses orbit analysis to forecast, if attractors are found, the behaviour of tourist variables in the immediate future.

Keywords: Attractor, invariant set, orbits, tourist variables.

1712 Hybrid Prefix Adder Architecture for Minimizing the Power Delay Product

Authors: P. Ramanathan, P. T. Vanathi

Abstract:

Parallel prefix addition is a technique for improving the speed of binary addition. Due to continually increasing integration density and the growing needs of portable devices, low-power and high-performance designs are of prime importance. The classical parallel prefix adder structures presented in the literature over the years optimize for logic depth, area, fan-out and interconnect count of logic circuits. In this paper, a new architecture for performing 8-bit, 16-bit and 32-bit parallel prefix addition is proposed. The proposed prefix adder structures are compared with several classical adders of the same bit width in terms of power, delay and number of computational nodes. The results reveal that the proposed structures have the least power-delay product when compared with existing peer prefix adder structures. The Tanner EDA tool was used for simulating the adder designs in the TSMC 180 nm and TSMC 130 nm technologies.
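
The building block of any parallel prefix adder is the associative "dot" operator on (generate, propagate) pairs, (g, p) dot (g', p') = (g OR (p AND g'), p AND p'). The sketch below runs a Kogge-Stone-style prefix computation in software purely to illustrate that logic; it is not the proposed hybrid architecture or a transistor-level model.

    def dot(a, b):
        """Associative prefix operator on (generate, propagate) pairs; `a` is the more significant group."""
        g1, p1 = a
        g2, p2 = b
        return (g1 | (p1 & g2), p1 & p2)

    def prefix_add(x, y, width=8):
        g = [((x >> i) & 1) & ((y >> i) & 1) for i in range(width)]      # bitwise generate
        p = [((x >> i) ^ (y >> i)) & 1 for i in range(width)]            # bitwise propagate
        gp = list(zip(g, p))
        d = 1
        while d < width:                        # Kogge-Stone: log2(width) prefix levels
            gp = [dot(gp[i], gp[i - d]) if i >= d else gp[i] for i in range(width)]
            d *= 2
        carries = [0] + [gp[i][0] for i in range(width - 1)]             # carry into bit i
        return sum(((p[i] ^ carries[i]) << i) for i in range(width))

    assert all(prefix_add(a, b) == (a + b) % 256 for a in range(256) for b in range(256))
    print("8-bit prefix adder matches integer addition")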

Keywords: Parallel Prefix Adder (PPA), Dot operator, Semi-Dot operator, Complementary Metal Oxide Semiconductor (CMOS), Odd-dot operator, Even-dot operator, Odd-semi-dot operator and Even-semi-dot operator.

1711 A Hybrid Neural Network and Traditional Approach for Forecasting Lumpy Demand

Authors: A. Nasiri Pour, B. Rostami Tabar, A. Rahimzadeh

Abstract:

Accurate demand forecasting is one of the key issues in inventory management of spare parts. The problem of modeling future consumption becomes especially difficult for lumpy patterns, which are characterized by intervals in which there is no demand and periods with actual demand occurrences with large variation in demand levels. Many forecasting methods may perform poorly when demand for an item is lumpy. In this study, based on the characteristics of lumpy demand patterns of spare parts, a hybrid forecasting approach has been developed which uses a multi-layered perceptron neural network and a traditional recursive method for forecasting future demands. In the described approach, the multi-layered perceptron is adapted to forecast occurrences of non-zero demands, and then a conventional recursive method is used to estimate the quantity of non-zero demands. In order to evaluate the performance of the proposed approach, its forecasts were compared to those obtained by using the Syntetos & Boylan approximation, a recently employed multi-layered perceptron neural network, a generalized regression neural network, and an Elman recurrent neural network. The models were applied to forecast future demand for spare parts of the Arak Petrochemical Company in Iran, using 30 types of real data sets. The results indicate that the forecasts obtained by using our proposed model are superior to those obtained by using the other methods.
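
In spirit, the hybrid scheme separates the two questions a lumpy series poses, namely whether demand will occur and how large it will be. The sketch below pairs an MLP occurrence classifier with a simple recursive (exponential-smoothing) estimate of non-zero demand size; it is an illustrative reconstruction under stated assumptions, not the authors' exact configuration.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(3)
    demand = rng.poisson(5, 200) * (rng.random(200) < 0.25)      # synthetic lumpy series

    # (1) MLP forecasts the probability of a non-zero demand from the last k periods.
    k = 6
    X = np.array([demand[i - k:i] > 0 for i in range(k, len(demand))], dtype=float)
    y = (demand[k:] > 0).astype(int)
    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
    p_occurrence = clf.predict_proba((demand[-k:] > 0).astype(float)[None, :])[0, 1]

    # (2) A recursive (exponential smoothing) estimate of the size of non-zero demands.
    alpha, size = 0.2, 1.0
    for d in demand[demand > 0]:
        size = alpha * d + (1 - alpha) * size

    print(f"next-period forecast = {p_occurrence:.2f} * {size:.2f} = {p_occurrence * size:.2f}")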

Keywords: Lumpy Demand, Neural Network, Forecasting, Hybrid Approach.

1710 Learning Algorithms for Fuzzy Inference Systems Composed of Double- and Single-Input Rule Modules

Authors: Hirofumi Miyajima, Kazuya Kishida, Noritaka Shigei, Hiromi Miyajima

Abstract:

Most self-tuning fuzzy systems, which are automatically constructed from learning data, are based on the steepest descent method (SDM). However, this approach often requires a long convergence time and gets stuck in shallow local minima. One solution is to use fuzzy rule modules with a small number of inputs, such as DIRMs (Double-Input Rule Modules) and SIRMs (Single-Input Rule Modules). In this paper, we consider a (generalized) DIRMs model composed of double- and single-input rule modules. Further, in order to reduce redundant modules in the (generalized) DIRMs model, pruning and generative learning algorithms for the model are suggested. To show their effectiveness, numerical simulations for function approximation, Box-Jenkins, and obstacle avoidance problems are performed.

Keywords: Box-Jenkins’s problem, Double-input rule module, Fuzzy inference model, Obstacle avoidance, Single-input rule module.

1709 Homomorphic Conceptual Framework for Effective Supply Chain Strategy (HCEFSC) within Operational Research (OR) with Sustainability and Phenomenology

Authors: Al-Salamin Hussain, Elias O. Tembe

Abstract:

Supply chain (SC) is an operational research (OR) approach and technique which acts as a catalyst within the central nervous system of business today. Without SC, any type of business is in the doldrums and hence tends toward entropy. SC is the lifeblood of business today because it is the pivotal hub that provides an imperative competitive advantage. The paper presents a conceptual framework dubbed the Homomorphic Conceptual Framework for Effective Supply Chain Strategy (HCEFSC). The term homomorphic is derived from the abstract-algebraic term homomorphism (same shape), which also embeds the following mathematical notions: monomorphism, isomorphism, automorphism, and endomorphism. The HCEFSC is intertwined and integrated with wide and broad sets of elements.

Keywords: Automorphisms, Homomorphism, Monomorphisms, Supply Chain.

1708 A Neutral Set Approach for Applying TOPSIS in Maintenance Strategy Selection

Authors: C. Ardil

Abstract:

This paper introduces the concept of neutral sets (NSs) and explores various operations on NSs, along with their associated properties. The foundation of the Neutral Set framework lies in ontological neutrality and the principles of logic, including the Law of Non-Contradiction. By encompassing components for possibility, indeterminacy, and necessity, the NS framework provides a flexible representation of truth, uncertainty, and necessity, accommodating diverse ontological perspectives without presupposing specific existential commitments. The inclusion of Possibility acknowledges the spectrum of potential states or propositions, promoting neutrality by accommodating various viewpoints. Indeterminacy reflects the inherent uncertainty in understanding reality, refraining from making definitive ontological commitments in uncertain situations. Necessity captures propositions that must hold true under all circumstances, aligning with the principle of logical consistency and implicitly supporting the Law of Non-Contradiction. Subsequently, a neutral set-TOPSIS approach is applied in the maintenance strategy selection problem, demonstrating the practical applicability of the NS framework. The paper further explores uncertainty relations and presents the fundamental preliminaries of NS theory, emphasizing its role in fostering ontological neutrality and logical coherence in reasoning.
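
As a point of reference for the crisp backbone of the method, classical TOPSIS ranks alternatives by relative closeness to the positive- and negative-ideal solutions; the sketch below shows that backbone only, with made-up numbers, and omits the neutral-set (possibility/indeterminacy/necessity) machinery introduced in the paper.

    import numpy as np

    def topsis(decision_matrix, weights, benefit):
        """Rank alternatives (rows) under criteria (columns). `benefit[j]` is True
        if larger values of criterion j are better."""
        X = np.asarray(decision_matrix, dtype=float)
        R = X / np.linalg.norm(X, axis=0)                  # vector normalization
        V = R * np.asarray(weights, dtype=float)           # weighted normalized matrix
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
        d_plus = np.linalg.norm(V - ideal, axis=1)
        d_minus = np.linalg.norm(V - anti, axis=1)
        closeness = d_minus / (d_plus + d_minus)
        return closeness, np.argsort(-closeness)           # higher closeness = better

    strategies = [[0.70, 120, 0.80],      # hypothetical maintenance strategies x criteria
                  [0.90, 200, 0.60],
                  [0.85, 260, 0.90]]
    scores, ranking = topsis(strategies, weights=[0.4, 0.3, 0.3], benefit=[True, False, True])
    print("closeness:", np.round(scores, 3), "ranking (best first):", ranking)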

Keywords: Uncertainty sets, neutral sets, maintenance strategy selection, multiple criteria decision-making analysis, MCDM, uncertainty decision analysis, distance function, multiple attribute decision making, selection method, uncertainty, TOPSIS.

1707 Review of the Software Used for 3D Volumetric Reconstruction of the Liver

Authors: P. Strakos, M. Jaros, T. Karasek, T. Kozubek, P. Vavra, T. Jonszta

Abstract:

In medical imaging, segmentation of different areas of the human body, such as bones, organs, and tissues, is an important issue. Image segmentation allows isolating the object of interest for further processing that can lead, for example, to 3D model reconstruction of whole organs. The difficulty of this procedure varies from trivial for bones to quite difficult for organs like the liver. The liver is considered one of the most difficult human organs to segment, mainly because of its complexity, shape variability, and proximity to other organs and tissues. Due to these facts, substantial user effort usually has to be applied to obtain satisfactory image segmentation results. The image segmentation process then deteriorates from an automatic or semi-automatic one to a fairly manual one. In this paper, an overview of selected available software applications that can handle semi-automatic image segmentation with subsequent 3D volume reconstruction of the human liver is presented. The applications are evaluated based on the segmentation results for several consecutive DICOM images covering the abdominal area of the human body.

Keywords: Image segmentation, semi-automatic, software, 3D volumetric reconstruction.

1706 Study of Natural Convection Heat Transfer of Plate-Fin Heat Sink in a Closed Enclosure

Authors: Han-Taw Chen, Tzu-Hsiang Lin, Chung-Hou Lai

Abstract:

The present study applies the inverse method and three-dimensional CFD commercial software in conjunction with experimental temperature data to investigate the heat transfer and fluid flow characteristics of a plate-fin heat sink in a rectangular closed enclosure. The inverse method with the finite difference method and the experimental temperature data is applied to determine the approximate heat transfer coefficient. Later, based on the obtained results, the zero-equation turbulence model is used to obtain the heat transfer and fluid flow characteristics between two fins. To validate the accuracy of the results obtained, a comparison of the heat transfer coefficients is made. The temperature obtained at selected measurement locations on the fin is also compared with experimental data. The effect of the height of the rectangular enclosure on the obtained results is discussed.

Keywords: Inverse method, FLUENT, Plate-fin heat sink, Heat transfer characteristics.

1705 Experimental Results about the Dynamics of the Generalized Belief Propagation Used on LDPC Codes

Authors: Jean-Christophe Sibel, Sylvain Reynal, David Declercq

Abstract:

In the context of channel coding, Generalized Belief Propagation (GBP) is an iterative algorithm used to recover the transmitted bits sent through a noisy channel. To ensure reliable transmission, we apply a map to the bits, called a code. This code induces artificial correlations between the bits to send, and it can be modeled by a graph whose nodes are the bits and whose edges are the correlations. This graph, called the Tanner graph, is used by most decoding algorithms, such as Belief Propagation or Gallager-B. The GBP is based on a non-unique transformation of the Tanner graph into a so-called region graph. A clear advantage of GBP over the other algorithms is the freedom in the construction of this graph. In this article, we explain a particular construction for specific graph topologies that yields relevant GBP performance. Moreover, we investigate the behavior of GBP considered as a dynamical system in order to understand how it evolves in time and with the noise power of the channel. To this end we make use of classical measures and introduce a new measure, called the hyperspheres method, that makes it possible to estimate the size of the attractors.
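
For context, the Tanner graph of a code is simply the bipartite graph of its parity-check matrix H, and the simplest decoder one can run on it is Gallager-style bit flipping (mentioned above alongside Belief Propagation). The sketch below shows that baseline on the Hamming(7,4) parity-check matrix; it is not the region-graph GBP studied in the paper.

    import numpy as np

    H = np.array([[1, 0, 1, 0, 1, 0, 1],     # Hamming(7,4) parity-check matrix: rows are
                  [0, 1, 1, 0, 0, 1, 1],     # check nodes, columns are bit nodes (the two
                  [0, 0, 0, 1, 1, 1, 1]])    # node types of the Tanner graph)

    def bit_flip_decode(r, H, max_iter=20):
        """Gallager-style bit flipping: flip the bit involved in the most unsatisfied
        checks until the syndrome H.c = 0 (mod 2) or iterations run out."""
        c = r.copy()
        for _ in range(max_iter):
            syndrome = H.dot(c) % 2
            if not syndrome.any():
                return c, True
            unsatisfied = H.T.dot(syndrome)          # per-bit count of failing checks
            c[np.argmax(unsatisfied)] ^= 1
        return c, False

    codeword = np.zeros(7, dtype=int)                # the all-zero word is always a codeword
    received = codeword.copy()
    received[6] ^= 1                                  # one bit corrupted by the channel
    decoded, ok = bit_flip_decode(received, H)
    print("decoded:", decoded, "success:", ok)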

Keywords: iterative decoder, LDPC, region-graph, chaos.

1704 Computational and Experimental Investigation of Supersonic Flow and their Controls

Authors: Vasana M. Don, Eldad J. Avital, Fariborz Motallebi

Abstract:

Supersonic open and closed cavity flows are investigated experimentally and computationally. A free-stream Mach number of two is set. Schlieren imaging is used to visualise the flow behaviour, showing stark differences between the open and closed cases. Computational Fluid Dynamics (CFD) is used to simulate an open cavity flow with an aspect ratio of 4. A rear-wall treatment is implemented in order to pursue a simple passive control approach. Good qualitative agreement is achieved between the experimental flow visualisation and the CFD in terms of the expansion-shock wave system. The cavity oscillations are shown to be dominated by the first and third Rossiter modes, combining into strong fluctuations of a non-linear nature above the cavity rear edge. A simple rear-wall treatment in the form of a hole shows a mixed effect on the flow oscillations; RMS contours and time-history density fluctuations are given and analysed.
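
For the oscillation modes mentioned above, Rossiter's semi-empirical formula St_m = f_m L / U_inf = (m - alpha) / (M_inf + 1/kappa) gives quick frequency estimates; the sketch below evaluates it with the commonly quoted constants alpha ~ 0.25 and kappa ~ 0.57 (values assumed here, not taken from the paper).

    def rossiter_strouhal(mode, mach, alpha=0.25, kappa=0.57):
        """Rossiter's semi-empirical Strouhal number St_m = (m - alpha) / (M + 1/kappa)."""
        return (mode - alpha) / (mach + 1.0 / kappa)

    M = 2.0                       # free-stream Mach number used in the study
    for m in (1, 2, 3, 4):
        print(f"mode {m}: St = {rossiter_strouhal(m, M):.3f}")
    # The dimensional frequency follows as f_m = St_m * U_inf / L for cavity length L.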

Keywords: Supersonic, Schlieren, open-cavity, flow simulation, passive control.

1703 Transient Population Dynamics of Phase Singularities in 2D Beeler-Reuter Model

Authors: Hidetoshi Konno, Akio Suzuki

Abstract:

This paper presents the transient population dynamics of phase singularities in the 2D Beeler-Reuter model. Two stochastic modeling approaches are examined: (i) the Master equation approach with the transition rates λ(n, t) = λ(t)n and μ(n, t) = μ(t)n, and (ii) the nonlinear Langevin equation approach with a multiplicative noise. The exact general solution of the Master equation with arbitrary time-dependent transition rates is given. Then, the exact solution of the mean field equation for the nonlinear Langevin equation is also given. It is demonstrated that the transient population dynamics is successfully identified by the generalized Logistic equation with a fractional higher-order nonlinear term. The necessity of introducing a time-dependent transition rate in the Master equation approach to incorporate the effect of nonlinearity is also demonstrated.
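
A concrete way to see the birth-death setting: with the linear rates λ(n,t) = λ(t)n and μ(n,t) = μ(t)n, the mean obeys d⟨n⟩/dt = (λ(t) − μ(t))⟨n⟩, and sample paths can be generated with a simple fixed-step stochastic simulation, as in the rough sketch below (a generic illustration, not the Beeler-Reuter phase-singularity simulation itself).

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_birth_death(n0, lam, mu, t_end, dt=1e-3):
        """Fixed-step approximation of a birth-death process with time-dependent
        per-individual rates lam(t), mu(t): P(birth) ~ lam(t)*n*dt, P(death) ~ mu(t)*n*dt."""
        t, n = 0.0, n0
        ts, ns = [0.0], [n0]
        while t < t_end and n > 0:
            p_birth, p_death = lam(t) * n * dt, mu(t) * n * dt
            u = rng.random()
            if u < p_birth:
                n += 1
            elif u < p_birth + p_death:
                n -= 1
            t += dt
            ts.append(t)
            ns.append(n)
        return np.array(ts), np.array(ns)

    # Decaying birth rate vs. constant death rate: the population rises, then relaxes.
    ts, ns = simulate_birth_death(n0=20, lam=lambda t: 2.0 * np.exp(-t), mu=lambda t: 1.0, t_end=5.0)
    print("final population:", ns[-1])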

Keywords: Transient population dynamics, Phase singularity, Birth-death process, Non-stationary Master equation, nonlinear Langevin equation, generalized Logistic equation.

1702 Effect of Domestic Treated Wastewater use on Three Varieties of Amaranth (Amaranthus spp.) under Semi Arid Conditions

Authors: El Youssfi L., Choukr-Allah R., Zaafrani M., Mediouni T., Sarr F., Hirich A.

Abstract:

An experiment was implemented in a field in the south of Morocco to evaluate the effects of domestic treated wastewater use for irrigation of an amaranth crop under semi-arid conditions. Three varieties (A0020, A0057 and A211) were tested and irrigated using domestic treated wastewater EC1 (0.92 dS/m) as control, and EC3 (3 dS/m) and EC6 (6 dS/m) obtained by adding sea water. In terms of growth, an increase in the EC level of the applied irrigation water significantly reduced the plant's height, leaf area, and fresh and dry weight measured at the vegetative, flowering and maturity stages for all varieties. Even with the application of EC6, yields were relatively higher in comparison with the ones obtained under normal cultivation conditions. A significant accumulation of nitrate, chloride and sodium in soil layers during the crop cycle was noted. The use of treated wastewater for irrigation of this crop proved to be possible. The variety A211 showed less sensitivity to salinity stress, and its introduction to the study area could be promising.

Keywords: Amaranth, salinity, semi-arid, treated waste water.

1701 On the Noise Distance in Robust Fuzzy C-Means

Authors: M. G. C. A. Cimino, G. Frosini, B. Lazzerini, F. Marcelloni

Abstract:

In the last decades, a number of robust fuzzy clustering algorithms have been proposed to partition data sets affected by noise and outliers. Robust fuzzy C-means (robust-FCM) is certainly one of the best known among these algorithms. In robust-FCM, noise is modeled as a separate cluster and is characterized by a prototype that has a constant distance δ from all data points. The distance δ determines the boundary of the noise cluster and is therefore a critical parameter of the algorithm. Though some approaches have been proposed to automatically determine the most suitable δ for a specific application, to date an efficient and fully satisfactory solution does not exist. The aim of this paper is to propose a novel method to compute the optimal δ based on the analysis of the distribution of the percentage of objects assigned to the noise cluster in repeated executions of robust-FCM with decreasing values of δ. The extremely encouraging results obtained on some data sets found in the literature are shown and discussed.
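
To make the role of δ explicit: with the noise prototype sitting at constant distance δ from every point, the usual noise-clustering membership of point k in regular cluster i is u_ik = 1 / [ sum_j (d_ik/d_jk)^(2/(m-1)) + (d_ik/δ)^(2/(m-1)) ], and whatever is left over goes to the noise cluster. The snippet below computes one such membership update for fixed prototypes; it is a generic illustration, not the paper's δ-selection procedure.

    import numpy as np

    def robust_fcm_memberships(X, centers, delta, m=2.0):
        """Membership of each point in each regular cluster plus the noise cluster,
        given fixed prototypes and a constant noise distance delta."""
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)     # (n_points, n_clusters)
        d = np.maximum(d, 1e-12)
        expo = 2.0 / (m - 1.0)
        denom = ((d[:, :, None] / d[:, None, :]) ** expo).sum(axis=2)       # sum_j (d_ik/d_jk)^expo
        denom = denom + (d / delta) ** expo
        u = 1.0 / denom
        u_noise = 1.0 - u.sum(axis=1)                                       # leftover goes to noise
        return u, u_noise

    X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0], [20.0, -20.0]])   # last point: outlier
    centers = np.array([[0.05, 0.0], [5.05, 5.0]])
    u, u_noise = robust_fcm_memberships(X, centers, delta=3.0)
    print(np.round(u, 3))
    print("noise memberships:", np.round(u_noise, 3))                        # outlier close to 1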

Keywords: noise prototype, robust fuzzy clustering, robust fuzzy C-means

1700 Optimal Data Compression and Filtering: The Case of Infinite Signal Sets

Authors: Anatoli Torokhti, Phil Howlett

Abstract:

We present a theory for optimal filtering of infinite sets of random signals. There are several new distinctive features of the proposed approach. First, we provide a single optimal filter for processing any signal from a given infinite signal set. Second, the filter is presented in the special form of a sum with p terms where each term is represented as a combination of three operations. Each operation is a special stage of the filtering aimed at facilitating the associated numerical work. Third, an iterative scheme is implemented into the filter structure to provide an improvement in the filter performance at each step of the scheme. The final step of the scheme concerns signal compression and decompression. This step is based on the solution of a new rank-constrained matrix approximation problem. The solution to the matrix problem is described in this paper. A rigorous error analysis is given for the new filter.
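
The compression/decompression step rests on rank-constrained matrix approximation; in the unconstrained Frobenius-norm case the classical answer is the truncated SVD (Eckart-Young), sketched below as a reference point. The paper's actual problem adds further constraints beyond this.

    import numpy as np

    def best_rank_r(A, r):
        """Best rank-r approximation of A in the Frobenius norm (Eckart-Young):
        keep the r largest singular triplets of the SVD."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        return (U[:, :r] * s[:r]) @ Vt[:r, :]

    rng = np.random.default_rng(0)
    A = rng.normal(size=(50, 8)) @ rng.normal(size=(8, 40)) + 0.01 * rng.normal(size=(50, 40))
    for r in (2, 4, 8):
        err = np.linalg.norm(A - best_rank_r(A, r)) / np.linalg.norm(A)
        print(f"rank {r}: relative error {err:.4f}")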

Keywords: stochastic signals, optimization problems in signal processing.

1699 Generic Filtering of Infinite Sets of Stochastic Signals

Authors: Anatoli Torokhti, Phil Howlett

Abstract:

A theory for optimal filtering of infinite sets of random signals is presented. There are several new distinctive features of the proposed approach. First, a single optimal filter for processing any signal from a given infinite signal set is provided. Second, the filter is presented in the special form of a sum with p terms where each term is represented as a combination of three operations. Each operation is a special stage of the filtering aimed at facilitating the associated numerical work. Third, an iterative scheme is implemented into the filter structure to provide an improvement in the filter performance at each step of the scheme. The final step of the scheme concerns signal compression and decompression. This step is based on the solution of a new rank-constrained matrix approximation problem. The solution to the matrix problem is described in this paper. A rigorous error analysis is given for the new filter.

Keywords: Optimal filtering, data compression, stochastic signals.

1698 Semi-automatic Construction of Ontology-based CBR System for Knowledge Integration

Authors: Junjie Gao, Guishi Deng

Abstract:

In order to integrate knowledge in heterogeneous case-based reasoning (CBR) systems, ontology-based CBR systems have become a hot topic. To solve the problems facing ontology-based CBR systems, for example nonstandard architectures, deficient reuse of knowledge in legacy CBR, and difficult ontology construction, we propose a novel approach for semi-automatically constructing an ontology-based CBR system whose architecture is based on a two-layer ontology. Domain knowledge implied in legacy case bases can be mapped automatically from relational database schemas and knowledge items to a relevant OWL local ontology by a mapping algorithm with low time complexity. By concept clustering based on formal concept analysis and by computing concept equation and concept inclusion measures, suggestions about enriching or amending the concept hierarchy of the OWL local ontologies are made automatically, which can aid designers in achieving semi-automatic construction of the OWL domain ontology. Validation of the approach is done by an application example.

Keywords: OWL ontology, Case-based Reasoning, Formal Concept Analysis, Knowledge Integration

1697 Modeling and Investigation of Volume Strain at Large Deformation under Uniaxial Cyclic Loading in Semi Crystalline Polymer

Authors: Rida B. Arieby

Abstract:

This study deals with the experimental investigation and theoretical modeling of semi-crystalline polymeric materials with a rubbery amorphous phase (HDPE) subjected to uniaxial cyclic tests with various maximum strain levels, even at large deformation. Each cycle is loaded in tension up to a certain maximum strain and then unloaded down to zero stress, for N cycles. This work focuses on measuring the volume strain due to damage phenomena during this kind of test. On the basis of the thermodynamics of relaxation processes, a constitutive model for large strain deformation has been developed, taking into account the damage effect, to predict the complex elasto-viscoelastic-viscoplastic behavior of the material. A direct comparison between the model predictions and the experimental data shows that the model accurately captures the material response. The model is also capable of predicting the influence of damage causing volume variation.

Keywords: Cyclic test, large strain, semi-crystalline polymers, Volume strain, Thermodynamics of Irreversible Processes.

1696 A Study of Two Disease Models: With and Without Incubation Period

Authors: H. C. Chinwenyi, H. D. Ibrahim, J. O. Adekunle

Abstract:

The incubation period is defined as the time from infection with a microorganism to the development of symptoms. In this research, two disease models, one with an incubation period and another without, were studied. The study involves the use of a mathematical model with a single incubation period. Tests for the existence and stability of the disease-free and endemic equilibrium states of both models were carried out. The fourth-order Runge-Kutta method was used to solve both models numerically. Finally, a computer program in MATLAB was developed to run the numerical experiments. From the results, we are able to show that the endemic equilibrium state of the model with an incubation period is locally asymptotically stable, whereas the endemic equilibrium state of the model without an incubation period is unstable under certain conditions on the given model parameters. It was also established that the disease-free equilibrium states of the models with and without an incubation period are locally asymptotically stable. Furthermore, results from numerical experiments using empirical data obtained from the Nigeria Centre for Disease Control (NCDC) showed that the overall population of infected people for the model with an incubation period is higher than that without an incubation period. We also established from the results obtained that as the transmission rate from the susceptible to the infected population increases, the peak values of the infected population for the model with an incubation period decrease and are always less than those for the model without an incubation period.
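
For readers who want to reproduce the setup, both model classes can be integrated with the same fourth-order Runge-Kutta step; the sketch below compares a model without an incubation period (SIR) against one with a single incubation period (SEIR), using illustrative parameter values rather than the NCDC-fitted ones.

    import numpy as np

    def rk4(f, y, t, h):
        """One classical fourth-order Runge-Kutta step."""
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    beta, gamma, sigma = 0.4, 0.1, 0.2          # transmission, recovery, incubation rates (illustrative)

    def sir(t, y):                               # no incubation period
        S, I, R = y
        return np.array([-beta * S * I, beta * S * I - gamma * I, gamma * I])

    def seir(t, y):                              # single incubation period (E compartment)
        S, E, I, R = y
        return np.array([-beta * S * I, beta * S * I - sigma * E, sigma * E - gamma * I, gamma * I])

    h, days = 0.1, 200
    y1, y2 = np.array([0.99, 0.01, 0.0]), np.array([0.99, 0.0, 0.01, 0.0])
    peak1 = peak2 = 0.0
    for step in range(int(days / h)):
        y1, y2 = rk4(sir, y1, step * h, h), rk4(seir, y2, step * h, h)
        peak1, peak2 = max(peak1, y1[1]), max(peak2, y2[2])
    print(f"peak infected fraction, SIR: {peak1:.3f}   SEIR: {peak2:.3f}")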

Keywords: Asymptotic stability, incubation period, Routh-Hurwitz criterion, Runge-Kutta method.

1695 A Fault Tolerant Token-based Algorithm for Group Mutual Exclusion in Distributed Systems

Authors: Abhishek Swaroop, Awadhesh Kumar Singh

Abstract:

The group mutual exclusion (GME) problem is a variant of the mutual exclusion problem. In the present paper a token-based group mutual exclusion algorithm, capable of handling transient faults, is proposed. The algorithm uses the concept of dynamic request sets. A timeout mechanism is used to detect token loss; also, a distributed scheme is used to regenerate the token. The worst-case message complexity of the algorithm is n+1. The maximum concurrency and forum switch complexity of the algorithm are n and min(n, m) respectively, where n is the number of processes and m is the number of groups. The algorithm also satisfies another desirable property called smooth admission. The scheme can also be adapted to handle the extended group mutual exclusion problem.

Keywords: Dynamic request sets, Fault tolerance, Smooth admission, Transient faults.

1694 Aircraft Supplier Selection using Multiple Criteria Group Decision Making Process with Proximity Measure Method for Determinate Fuzzy Set Ranking Analysis

Authors: C. Ardil

Abstract:

Aircraft supplier selection, which is considered a fundamental supply chain problem, is a multi-criteria group decision problem that has a significant impact on the performance of the entire supply chain. Practical situations frequently involve incomplete and uncertain information, making it difficult for decision-makers to express their opinions on candidates with precise and definite values. To solve the aircraft supplier selection problem in an environment of incomplete and uncertain information, the proximity measure method is proposed. It uses determinate fuzzy numbers. The weights of the decision makers are equally predetermined, and the entropic criteria weights are calculated using each decision maker's decision matrix. Additionally, for determinate fuzzy numbers, it is proposed to use the weighted normalized Minkowski and Hausdorff distance functions to determine the ranking order of the alternatives. A numerical example for aircraft supplier selection is provided to further demonstrate the applicability, effectiveness, validity and rationality of the proposed method.
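
Two of the ingredients named above are standard and easy to state in code: Shannon-entropy criteria weights derived from a column-normalized decision matrix, and the weighted Minkowski distance between an alternative and a reference point. The sketch below shows just these crisp building blocks with made-up numbers, not the determinate-fuzzy proximity measure method itself.

    import numpy as np

    def entropy_weights(X):
        """Criteria weights from the Shannon entropy of the column-normalized decision matrix."""
        P = X / X.sum(axis=0)
        P = np.where(P == 0, 1e-12, P)                        # avoid log(0)
        e = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
        d = 1.0 - e                                            # degree of divergence
        return d / d.sum()

    def weighted_minkowski(a, b, w, p=2):
        """Weighted Minkowski distance of order p between vectors a and b."""
        return (np.sum(w * np.abs(np.asarray(a) - np.asarray(b)) ** p)) ** (1.0 / p)

    X = np.array([[0.80, 120.0, 0.70],                         # candidate suppliers x criteria
                  [0.90, 150.0, 0.65],
                  [0.70, 100.0, 0.90]])
    w = entropy_weights(X)
    ideal = X.max(axis=0)                                      # simple benefit-only reference point
    dists = [weighted_minkowski(row, ideal, w, p=2) for row in X]
    print("weights:", np.round(w, 3), "distances to ideal:", np.round(dists, 3))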

Keywords: Aircraft supplier selection, multiple criteria decision making, fuzzy sets, determinate fuzzy sets, intuitionistic fuzzy sets, proximity measure method, Minkowski distance function, Hausdorff distance function, PMM, MCDM

1693 Semi-Automatic Approach for Semantic Annotation

Authors: Mohammad Yasrebi, Mehran Mohsenzadeh

Abstract:

The third phase of the web, the semantic web, requires many web pages annotated with metadata. Thus, a crucial question is where to acquire these metadata. In this paper we propose our approach, a semi-automatic method that annotates the texts of documents and web pages and employs a quite comprehensive knowledge base to categorize instances with regard to an ontology. The approach is evaluated against manual annotations and against one of the most popular annotation tools, which works in the same way as our tool. The approach is implemented in the .NET framework and uses WordNet as the knowledge base, providing an annotation tool for the Semantic Web.
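
As a small taste of how a WordNet-backed knowledge base can categorize instances, the snippet below looks up a term's first sense and its hypernym (is-a) chain with NLTK; NLTK is assumed here only as a convenient stand-in, since the paper's system is implemented on the .NET framework and is not reproduced by this.

    # Requires: pip install nltk, plus a one-time download of the WordNet corpus.
    import nltk
    from nltk.corpus import wordnet as wn

    nltk.download("wordnet", quiet=True)

    def categorize(term):
        """Return (sense definition, hypernym chain) for the first WordNet sense of `term`."""
        synsets = wn.synsets(term)
        if not synsets:
            return None
        sense = synsets[0]
        chain, node = [], sense
        while node.hypernyms():                      # walk up the is-a hierarchy
            node = node.hypernyms()[0]
            chain.append(node.name())
        return sense.definition(), chain

    definition, categories = categorize("annotation")
    print("definition:", definition)
    print("hypernym chain:", " -> ".join(categories))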

Keywords: Semantic Annotation, Metadata, Information Extraction, Semantic Web, knowledge base.
