Search results for: Laguerre polynomial.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 241

151 Automatic Removal of Ocular Artifacts using JADE Algorithm and Neural Network

Authors: V Krishnaveni, S Jayaraman, A Gunasekaran, K Ramadoss

Abstract:

The ElectroEncephaloGram (EEG) is useful for clinical diagnosis and biomedical research. EEG signals often contain strong ElectroOculoGram (EOG) artifacts produced by eye movements and eye blinks, especially in EEG recorded from frontal channels. These artifacts obscure the underlying brain activity, making its visual or automated inspection difficult. The goal of ocular artifact removal is to remove ocular artifacts from the recorded EEG, leaving the underlying background signals due to brain activity. In recent times, Independent Component Analysis (ICA) algorithms have demonstrated superior potential in obtaining the least dependent source components. In this paper, the independent components are obtained by using the JADE algorithm (the best separating algorithm) and are classified as either artifact components or neural components. A neural network is used for the classification of the obtained independent components. The neural network requires input features that faithfully represent the true character of the input signals, so that it can classify the signals based on the key characteristics that differentiate them. In this work, Auto Regressive (AR) coefficients are used as the input features for classification. Two neural network approaches are used to learn classification rules from EEG data: first, a Polynomial Neural Network (PNN) trained by the GMDH (Group Method of Data Handling) algorithm, and second, a feed-forward neural network (FNN) classifier trained by a standard back-propagation algorithm. The results show that JADE-FNN performs better than JADE-PNN.
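
Since AR coefficients are the classifier's input features here, a minimal sketch of how such coefficients might be estimated from one independent component, via the Yule-Walker equations, is given below; the segment length, model order and the random test signal are illustrative assumptions, not the authors' actual settings.

```python
# Hedged sketch: estimating AR coefficients of an EEG-like signal segment via the
# Yule-Walker equations, to use as classifier input features. Segment length,
# AR order and the random test signal are illustrative assumptions.
import numpy as np
from scipy.linalg import solve_toeplitz

def ar_features(x, order=6):
    """Return Yule-Walker AR coefficients of a 1-D signal segment."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(order + 1)])
    # Solve the symmetric Toeplitz system R a = r[1:] for the AR coefficients
    return solve_toeplitz(r[:-1], r[1:])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    segment = rng.standard_normal(512)        # stand-in for one independent component
    print(ar_features(segment, order=6))      # feature vector fed to the classifier
```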

Keywords: Auto Regressive (AR) Coefficients, Feed Forward Neural Network (FNN), Joint Approximation Diagonalisation of Eigen matrices (JADE) Algorithm, Polynomial Neural Network (PNN).

150 A Class of Formal Operators for Combinatorial Identities and its Application

Authors: Ruigang Zhang, Wuyungaowa, Xingchen Ma

Abstract:

In this paper, we present some formulas of symbolic operator summation involving generalizations of well-known number sequences or polynomial sequences, and meanwhile we obtain some identities about these sequences by employing M-R's substitution rule.

Keywords: Generating functions, operators sequence group, Riordan arrays, R. G operator group, combinatorial identities.

149 Accurate And Efficient Global Approximation using Adaptive Polynomial RSM for Complex Mechanical and Vehicular Performance Models

Authors: Y. Z. Wu, Z. Dong, S. K. You

Abstract:

Global approximation using a metamodel for a complex mathematical function or computer model over a large variable domain is often needed in sensitivity analysis, computer simulation, optimal control, and global design optimization of complex, multiphysics systems. To overcome the limitations of existing response surface (RS), surrogate or metamodeling methods for complex models over a large variable domain, a new adaptive and regressive RS modeling method using quadratic functions and local area model improvement schemes is introduced. The method applies an iterative, Latin hypercube sampling based RS update process, divides the entire domain of design variables into multiple cells, identifies rougher cells with large modeling error, and further divides these cells along the roughest dimension direction. A small number of additional sampling points from the original, expensive model are added over the small and isolated rough cells to improve the RS model locally until the model accuracy criteria are satisfied. The method then combines local RS cells to regenerate the global RS model with satisfactory accuracy. An effective RS cell sorting algorithm is also introduced to improve the efficiency of model evaluation. Benchmark tests are presented, and the use of the new metamodeling method to replace a complex hybrid electric vehicle powertrain performance model in vehicle design optimization and optimal control is discussed.
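
As a rough illustration of the building blocks named in the abstract, the sketch below fits a quadratic response surface to Latin-hypercube samples of a cheap stand-in function and measures its error at fresh points; the test function, bounds and sample sizes are assumptions, and the cell-division and local refinement logic of the paper is not reproduced.

```python
# Hedged sketch: quadratic response surface fitted to Latin-hypercube samples of
# an "expensive" model. Test function and sample sizes are illustrative assumptions.
import numpy as np
from scipy.stats import qmc

def true_model(x):
    # stand-in for the expensive simulation
    return np.sin(3 * x[:, 0]) + x[:, 1] ** 2

def quadratic_design_matrix(x):
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones(len(x)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

sampler = qmc.LatinHypercube(d=2, seed=0)
X = qmc.scale(sampler.random(n=30), [-1, -1], [1, 1])   # 30 LHS points in [-1, 1]^2
y = true_model(X)
coef, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)

# Check the surrogate at fresh points (in the paper, rough cells would be refined)
X_test = qmc.scale(sampler.random(n=200), [-1, -1], [1, 1])
err = np.abs(quadratic_design_matrix(X_test) @ coef - true_model(X_test))
print("max abs error of quadratic RS:", err.max())
```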

Keywords: Global approximation, polynomial response surface, domain decomposition, domain combination, multiphysics modeling, hybrid powertrain optimization

148 Determination of Cd, Zn, K, pH, TNV, Organic Material and Electrical Conductivity (EC) Distribution in Agricultural Soils using Geostatistics and GIS (Case Study: South- Western of Natanz- Iran)

Authors: Abbas Hani, Seyed Ali Hoseini Abari

Abstract:

Soil chemical and physical properties play important roles in components of the environment, agricultural sustainability and human health. The objective of this research is to determine the spatial distribution patterns of Cd, Zn, K, pH, TNV, organic material and electrical conductivity (EC) in agricultural soils of the Natanz region in Esfehan province. In this study, geostatistical and non-geostatistical methods were used for prediction of the spatial distribution of these parameters. Sixty-four composite soil samples were taken at 0-20 cm depth. The study area is located in the south of the Natanz agricultural lands, with an area of 21660 hectares. The spatial distribution of Cd, Zn, K, pH, TNV, organic material and electrical conductivity (EC) was determined using geostatistics and a geographic information system. Results showed that the Cd, pH, TNV and K data were normally distributed, while the Zn, OC and EC data were not. Kriging, Inverse Distance Weighting (IDW), Local Polynomial Interpolation (LPI) and Radial Basis Function (RBF) methods were used for interpolation. Trend analysis showed that organic carbon had no trend in the north-south and east-west directions, while K and TNV had second-degree trends. Several error measures were used, including the mean absolute error (MAE), mean squared error (MSE) and mean biased error (MBE). Ordinary kriging (exponential model), LPI, RBF and IDW were chosen as the best methods for interpolating the soil parameters. Prediction maps produced by disjunctive kriging showed that the whole study area suffers an intense shortage of organic matter and that more than 63.4 percent of the study area is deficient in K.
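
Inverse Distance Weighting, one of the interpolators compared above, is simple enough to sketch; the sample coordinates, the K values and the power parameter p = 2 below are illustrative assumptions, not the study's data or settings.

```python
# Hedged sketch: Inverse Distance Weighting (IDW) interpolation. Coordinates,
# values and the power parameter are made up for illustration.
import numpy as np

def idw(xy_known, z_known, xy_query, p=2.0, eps=1e-12):
    """Interpolate z at query points as a distance-weighted mean of known samples."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d ** p + eps)
    return (w @ z_known) / w.sum(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    pts = rng.uniform(0, 1000, size=(64, 2))    # 64 fictitious soil sample locations
    k_vals = rng.uniform(100, 400, size=64)     # fictitious exchangeable K values
    grid = np.array([[250.0, 250.0], [500.0, 750.0]])
    print(idw(pts, k_vals, grid))
```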

Keywords: Electrical conductivity, Geostatistics, Geographical Information System, TNV

147 Certain Estimates of Oscillatory Integrals and Extrapolation

Authors: Hussain Al-Qassem

Abstract:

In this paper, we study the boundedness properties of certain oscillatory integrals with polynomial phase. We obtain sharp estimates for these oscillatory integrals. By virtue of these estimates and extrapolation, we obtain the Lp boundedness of these oscillatory integrals under rather weak size conditions on the kernel function.

Keywords: Fourier transform, oscillatory integrals, Orlicz spaces, Block spaces, Extrapolation, Lp boundedness.

146 Validation on 3D Surface Roughness Algorithm for Measuring Roughness of Psoriasis Lesion

Authors: M.H. Ahmad Fadzil, Esa Prakasa, Hurriyatul Fitriyah, Hermawan Nugroho, Azura Mohd Affandi, S.H. Hussein

Abstract:

Psoriasis is a widespread skin disease affecting up to 2% of the population, with plaque psoriasis accounting for about 80% of cases. It can be identified as a red lesion, and at higher severity the lesion is usually covered with rough scale. Psoriasis Area Severity Index (PASI) scoring is the gold standard method for measuring psoriasis severity. Scaliness is one of the PASI parameters that needs to be quantified in PASI scoring. The surface roughness of a lesion can be used as a scaliness feature, since scale on the lesion surface makes the lesion rougher. Dermatologists usually assess severity through their tactile sense, so direct contact between doctor and patient is required, and the assessment may not be objective. In this paper, a digital image analysis technique is developed to objectively determine the scaliness of the psoriasis lesion and provide the PASI scaliness score. The psoriasis lesion is modelled by a rough surface. The rough surface is created by superimposing a smooth average (curved) surface with a triangular waveform. For roughness determination, a polynomial surface fitting is used to estimate the average surface, followed by a subtraction between the rough and average surfaces to give the elevation surface (surface deviations). The roughness index is calculated by applying the average roughness equation to the height map matrix. The roughness algorithm has been tested on 444 lesion models. From the roughness validation results, only 6 models cannot be accepted (percentage error greater than 10%). These errors occur due to the quality of the scanned images. The roughness algorithm was also validated by roughness measurements on flat abrasive papers. The Pearson's correlation coefficient between the grade value (G) of the abrasive paper and Ra is -0.9488, which shows a strong relation between G and Ra. The algorithm needs to be improved by surface filtering, especially to overcome problems with noisy data.
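
The roughness computation described above, fitting a polynomial average surface and applying the average roughness equation to the deviations, can be sketched as follows; the surface order and the synthetic height map are assumptions for illustration, not the paper's lesion models.

```python
# Hedged sketch: average roughness Ra from a height map, via a low-order
# polynomial "average" surface fit and the deviations from it. The surface order
# and the synthetic triangular-waveform height map are illustrative assumptions.
import numpy as np

def average_roughness(height, order=2):
    """Fit a 2-D polynomial surface of given order and return Ra of the residual."""
    ny, nx = height.shape
    y, x = np.mgrid[0:ny, 0:nx]
    x, y, z = x.ravel(), y.ravel(), height.ravel()
    terms = [x ** i * y ** j for i in range(order + 1) for j in range(order + 1 - i)]
    A = np.column_stack(terms).astype(float)
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    deviation = z - A @ coef              # elevation surface (surface deviations)
    return np.mean(np.abs(deviation))     # Ra: mean absolute deviation

if __name__ == "__main__":
    ny, nx = 64, 64
    yy, xx = np.mgrid[0:ny, 0:nx]
    smooth = 0.002 * (xx - 32) ** 2       # gentle curvature (the "average" surface)
    scale = 1.5 * np.abs((xx % 8) - 4)    # triangular waveform mimicking scale
    print("Ra =", average_roughness(smooth + scale))
```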

Keywords: psoriasis, roughness algorithm, polynomial surface fitting.

145 Interpolation of Geofield Parameters

Authors: A. Pashayev, C. Ardil, R. Sadiqov

Abstract:

Various methods of geofield parameter restoration (algebraic polynomials; filters; rational fractions; interpolation splines; geostatistical methods such as kriging; nearest-point search methods such as inverse distance, minimum curvature and local polynomial interpolation; neural networks) have been analyzed, and some possible mistakes arising during geofield surface modeling have been presented.

Keywords: interpolation methods, geofield parameters, neural networks.

144 Statistical Modeling of Local Area Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes

Authors: Jihad S. Daba, J. P. Dubois

Abstract:

Fading noise degrades the performance of cellular communication, most notably in femto- and pico-cells in 3G and 4G systems. When the wireless channel consists of a small number of scattering paths, the statistics of the fading noise are not analytically tractable and pose a serious challenge to developing closed canonical forms that can be analysed and used in the design of efficient and optimal receivers. In this context, the noise is multiplicative and is referred to as stochastically local fading. In many analytical investigations of multiplicative noise, exponential or Gamma statistics are invoked. More recent advances by the author of this paper utilized Poisson-modulated weighted generalized Laguerre polynomials with controlling parameters and uncorrelated noise assumptions. In this paper, we investigate the statistics of a multidiversity stochastically local area fading channel when the channel consists of randomly distributed Rayleigh and Rician scattering centers with a coherent Nakagami-distributed line-of-sight component and an underlying doubly stochastic Poisson process driven by a lognormal intensity. These combined statistics form a unifying triply stochastic filtered marked Poisson point process model.

Keywords: Cellular communication, femto- and pico-cells, stochastically local area fading channel, triply stochastic filtered marked Poisson point process.

143 Approximation of Sturm-Liouville Problems by Exponentially Weighted Legendre-Gauss Tau Method

Authors: Mohamed K. El Daou

Abstract:

We construct an exponentially weighted Legendre-Gauss Tau method for solving differential equations with oscillatory solutions. The proposed method is applied to Sturm-Liouville problems. Numerical examples illustrating the efficiency and the high accuracy of our results are presented.

Keywords: Oscillatory functions, Sturm-Liouville problems, Legendre polynomials, Gauss points.

142 A Logic Approach to Database Dynamic Updating

Authors: Daniel Stamate

Abstract:

We introduce a logic-based framework for database updating under constraints. In our framework, the constraints are represented as an instantiated extended logic program. When performing an update, database consistency may be violated. We provide an approach of maintaining database consistency, and study the conditions under which the maintenance process is deterministic. We show that the complexity of the computations and decision problems presented in our framework is in each case polynomial time.

Keywords: Databases, knowledge bases, constraints, updates, minimal change, consistency.

141 Advanced Stochastic Models for Partially Developed Speckle

Authors: Jihad S. Daba (Jean-Pierre Dubois), Philip Jreije

Abstract:

Speckled images arise when coherent microwave, optical, and acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar systems, and medical ultrasound systems. Speckle noise is a form of object or target induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted by speckle noise is complicated by the nature of the noise and is not as straightforward as detection and estimation in additive noise. In this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this context involves partially developed speckle model where an underlying Poisson point process modulates a Gram-Charlier series of Laguerre weighted exponential functions, resulting in a doubly stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in a closed canonical form. It is observed that as the mean number of scatterers in a resolution cell is increased, the probability density function approaches an exponential distribution. This is consistent with fully developed speckle noise as demonstrated by the Central Limit theorem.
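
The closing observation, that the intensity distribution tends to an exponential law as the mean number of scatterers per resolution cell grows, can be checked with a simple random-phasor simulation; the scatterer counts and cell numbers below are illustrative assumptions, and the Gram-Charlier/Laguerre machinery of the paper is not reproduced.

```python
# Hedged sketch: partially developed speckle as a random phasor sum with a
# Poisson number of scatterers per resolution cell. For an exponential law the
# normalized intensity has std/mean = 1, which the simulation approaches as the
# mean scatterer count grows. Parameters are illustrative assumptions.
import numpy as np

def speckle_intensity(mean_scatterers, n_cells, rng):
    """Intensity of each cell: |sum of unit phasors with uniform random phase|^2."""
    counts = rng.poisson(mean_scatterers, size=n_cells)
    out = np.empty(n_cells)
    for i, n in enumerate(counts):
        phases = rng.uniform(0.0, 2.0 * np.pi, size=n)
        out[i] = np.abs(np.exp(1j * phases).sum()) ** 2
    return out

rng = np.random.default_rng(42)
for mu in (2, 10, 100):                      # mean scatterers per resolution cell
    I = speckle_intensity(mu, 20000, rng)
    I = I / I.mean()
    print(f"mu={mu:4d}  std/mean of normalized intensity = {I.std():.3f}")
```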

Keywords: Doubly stochastic filtered process, Poisson point process, segmentation, speckle, ultrasound

140 Order Reduction using Modified Pole Clustering and Pade Approximations

Authors: C.B. Vishwakarma

Abstract:

The authors present a mixed method for reducing the order of large-scale dynamic systems. In this method, the denominator polynomial of the reduced-order model is obtained by using the modified pole clustering technique, while the coefficients of the numerator are obtained by Pade approximations. This method is conceptually simple and always generates stable reduced models if the original high-order system is stable. The proposed method is illustrated with the help of numerical examples taken from the literature.
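
A minimal sketch of a mixed reduction step of this kind is given below: the reduced denominator is formed from pole cluster centers (here the plain inverse-distance mean over groups of dominant real poles, a simplification of the modified clustering used in the paper) and the numerator is obtained by Pade-type matching of the leading power-series coefficients; the example transfer function is made up.

```python
# Hedged sketch of a mixed order-reduction step: reduced denominator from pole
# cluster centers (plain inverse-distance mean, NOT the paper's modified
# clustering), numerator from Pade-type matching of the leading power-series
# coefficients. Assumes real, distinct, stable poles; the example system is made up.
import numpy as np

def series_coeffs(num, den, k):
    """First k power-series coefficients of N(s)/D(s) about s = 0."""
    n = np.zeros(k); n[:len(num)] = num[::-1]   # ascending powers
    d = np.zeros(k); d[:len(den)] = den[::-1]
    c = np.zeros(k)
    for i in range(k):
        c[i] = (n[i] - np.dot(c[:i], d[i:0:-1])) / d[0]
    return c

def cluster_center(group):
    """Inverse-distance mean of a group of real poles."""
    return len(group) / np.sum(1.0 / np.asarray(group))

def reduce_order(num, den, r):
    poles = np.sort(np.real(np.roots(den)))[::-1]            # most dominant first
    groups = [[p] for p in poles[:r - 1]] + [list(poles[r - 1:])]
    den_r = np.poly([cluster_center(g) for g in groups])     # reduced denominator
    c = series_coeffs(num, den, r)
    d_asc = den_r[::-1]
    num_r_asc = [sum(c[j] * d_asc[i - j] for j in range(i + 1) if i - j < len(d_asc))
                 for i in range(r)]                          # Pade matching
    return np.array(num_r_asc[::-1]), den_r

num, den = [1, 4], [1, 6, 11, 6]        # G(s) = (s+4) / ((s+1)(s+2)(s+3))
print(reduce_order(num, den, 2))
```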

Keywords: Modified pole clustering, order reduction, Pade approximation, stability, transfer function.

139 On the Efficiency of Five Step Approximation Method for the Solution of General Third Order Ordinary Differential Equations

Authors: N. M. Kamoh, M. C. Soomiyol

Abstract:

In this work, a five step continuous method for the solution of third order ordinary differential equations was developed in block form using collocation and interpolation techniques of the shifted Legendre polynomial basis function. The method was found to be zero-stable, consistent and convergent. The application of the method in solving third order initial value problem of ordinary differential equations revealed that the method compared favorably with existing methods.

Keywords: Shifted Legendre polynomials, third order block method, discrete method, convergent.

138 Evolutionary Algorithms for the Multiobjective Shortest Path Problem

Authors: José Maria A. Pangilinan, Gerrit K. Janssens

Abstract:

This paper presents an overview of the multiobjective shortest path problem (MSPP) and a review of essential and recent issues regarding the methods to its solution. The paper further explores a multiobjective evolutionary algorithm as applied to the MSPP and describes its behavior in terms of diversity of solutions, computational complexity, and optimality of solutions. Results show that the evolutionary algorithm can find diverse solutions to the MSPP in polynomial time (based on several network instances) and can be an alternative when other methods are trapped by the tractability problem.
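
Since the evolutionary algorithm's notion of optimality is Pareto dominance, a minimal sketch of the non-dominated filter that would sit at its core is shown below; the candidate paths and their (cost, delay) values are made up for illustration and the full MOEA loop is not reproduced.

```python
# Hedged sketch: the non-dominated (Pareto) filter used by multiobjective EAs --
# keep a path only if no other path is at least as good in every objective and
# strictly better in one. The candidate (cost, delay) vectors are made up.
from typing import List, Tuple

def dominates(a: Tuple[float, ...], b: Tuple[float, ...]) -> bool:
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(objectives: List[Tuple[float, ...]]) -> List[int]:
    """Indices of non-dominated objective vectors."""
    return [i for i, a in enumerate(objectives)
            if not any(dominates(b, a) for j, b in enumerate(objectives) if j != i)]

candidates = [(10, 5), (8, 7), (12, 4), (9, 9), (8, 6)]   # (cost, delay) per path
print(pareto_front(candidates))   # -> indices of the Pareto-optimal paths
```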

Keywords: Multiobjective evolutionary optimization, genetic algorithms, shortest paths.

137 Extended Deductive Databases with Uncertain Information

Authors: Daniel Stamate

Abstract:

The paper presents an approach for handling uncertain information in deductive databases using multivalued logics. Uncertainty means that database facts may be assigned logical values other than the conventional ones - true and false. The logical values represent various degrees of truth, which may be combined and propagated by applying the database rules. A corresponding multivalued database semantics is defined. We show that it extends successful conventional semantics such as the well-founded semantics, and has a polynomial time data complexity.

Keywords: Reasoning under uncertainty, multivalued logics, deductive databases, logic programs, multivalued semantics.

136 Identification of Printed Punjabi Words and English Numerals Using Gabor Features

Authors: Rajneesh Rani, Renu Dhir, G. S. Lehal

Abstract:

Script identification is one of the challenging steps in the development of optical character recognition systems for bilingual or multilingual documents. In this paper, an attempt is made to identify English numerals at the word level in Punjabi documents by using Gabor features. A support vector machine (SVM) classifier with five-fold cross validation is used to classify the word images. The results obtained are quite encouraging: the average accuracy with the RBF, polynomial and linear kernel functions comes out to be greater than 99%.
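
A minimal sketch of the feature-plus-classifier plumbing described above, Gabor filter responses summarized into features and fed to an SVM with five-fold cross validation, is given below; the filter frequencies and orientations, the synthetic word images and the two-class labels are illustrative assumptions, so the printed accuracies are meaningless.

```python
# Hedged sketch: Gabor-energy features + SVM with 5-fold cross validation.
# The "word images" are random patches and the labels are random, so only the
# plumbing is illustrated; frequencies/orientations are arbitrary choices.
import numpy as np
from skimage.filters import gabor
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def gabor_features(img, frequencies=(0.1, 0.25),
                   thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    feats = []
    for f in frequencies:
        for t in thetas:
            real, imag = gabor(img, frequency=f, theta=t)
            mag = np.hypot(real, imag)
            feats += [mag.mean(), mag.std()]    # energy summary per filter
    return np.array(feats)

rng = np.random.default_rng(0)
images = rng.random((40, 32, 32))               # stand-ins for word images
labels = rng.integers(0, 2, size=40)            # 0 = Punjabi word, 1 = numeral (fictitious)
X = np.array([gabor_features(im) for im in images])

for kernel in ("rbf", "poly", "linear"):
    scores = cross_val_score(SVC(kernel=kernel), X, labels, cv=5)
    print(kernel, scores.mean())
```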

Keywords: Script identification, Gabor features, support vector machines.

135 Analysis of Self Excited Induction Generator using Particle Swarm Optimization

Authors: Hassan E. A. Ibrahim, Mohamed F. Serag

Abstract:

In this paper, a novel technique based on the Particle Swarm Optimization (PSO) algorithm is proposed to estimate and analyze the steady-state performance of a self-excited induction generator (SEIG). With this method, the tedious job of deriving the complex coefficients of a polynomial equation and solving it, as in previous methods, is not required. By comparing the simulation results obtained by the proposed method with those obtained by well-known mathematical methods, a good agreement between these results is obtained. The comparison validates the effectiveness of the proposed technique.
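
A generic Particle Swarm Optimization loop of the kind applied here can be sketched as follows; the objective is a simple stand-in for the SEIG steady-state error function, and the swarm parameters are common textbook values rather than the authors' settings.

```python
# Hedged sketch: a generic PSO loop. The quadratic objective stands in for the
# SEIG steady-state error function; swarm parameters are textbook defaults.
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.apply_along_axis(objective, 1, x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Example: minimize a 2-D quadratic bowl standing in for the SEIG error function
best_x, best_f = pso(lambda z: (z[0] - 1.0) ** 2 + (z[1] + 2.0) ** 2,
                     bounds=([-5, -5], [5, 5]))
print(best_x, best_f)
```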

Keywords: Evolution theory, MATLAB, optimization, PSO, SEIG.

134 On Reversal and Transposition Medians

Authors: Martin Bader

Abstract:

During the last years, the genomes of more and more species have been sequenced, providing data for phylogenetic reconstruction based on genome rearrangement measures. A main task in all phylogenetic reconstruction algorithms is to solve the median of three problem. Although this problem is NP-hard even for the simplest distance measures, there are exact algorithms for the breakpoint median and the reversal median that are fast enough for practical use. In this paper, this approach is extended to the transposition median as well as to the weighted reversal and transposition median. Although there is no exact polynomial algorithm known even for the pairwise distances, we will show that it is in most cases possible to solve these problems exactly within reasonable time by using a branch and bound algorithm.

Keywords: Comparative genomics, genome rearrangements, median, reversals, transpositions.

133 Edge Detection in Low Contrast Images

Authors: Koushlendra Kumar Singh, Manish Kumar Bajpai, Rajesh K. Pandey

Abstract:

The edges of low contrast images are not clearly distinguishable to the human eye, and it is difficult to find the edges and boundaries in them. The present work encompasses a new approach for low contrast images. A Chebyshev polynomial based fractional order filter has been used for the filtering operation on an image. The preprocessing has been performed by this filter on the input image. The Laplacian of Gaussian method has been applied to the preprocessed image for edge detection. The algorithm has been tested on two test images.
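
The second stage of the pipeline, Laplacian of Gaussian edge detection on a preprocessed low-contrast image, can be sketched as below; a plain Gaussian blur stands in for the Chebyshev-based fractional order filter (whose design is not reproduced), and the synthetic test image and thresholds are illustrative assumptions.

```python
# Hedged sketch: Laplacian-of-Gaussian edge detection on a preprocessed image.
# A plain Gaussian blur stands in for the Chebyshev-based fractional order
# filter; the low-contrast test image and thresholds are illustrative assumptions.
import numpy as np
from scipy import ndimage

# Low-contrast synthetic image: a dim square on a dim background plus noise
img = np.full((128, 128), 100.0)
img[40:90, 40:90] = 110.0
img += np.random.default_rng(0).normal(0, 2.0, img.shape)

preprocessed = ndimage.gaussian_filter(img, sigma=1.0)       # stand-in preprocessing
log_response = ndimage.gaussian_laplace(preprocessed, sigma=2.0)

# Edges: zero crossings of the LoG response with sufficient local contrast
sign_change = np.sign(log_response[:-1, :]) != np.sign(log_response[1:, :])
strong = np.abs(np.diff(log_response, axis=0)) > 0.5
edges = sign_change & strong
print("edge pixels found:", int(edges.sum()))
```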

Keywords: Chebyshev polynomials, Fractional order differentiator, Laplacian of Gaussian (LoG) method, Low contrast image.

132 Non-Rigid Registration of Medical Images Using an Automated Method

Authors: Panos Kotsas

Abstract:

This paper presents the application of a signal intensity independent registration criterion for non-rigid body registration of medical images. The criterion is defined as the weighted ratio image of two images. The ratio is computed on a voxel per voxel basis and weighting is performed by setting the ratios between signal and background voxels to a standard high value. The mean squared value of the weighted ratio is computed over the union of the signal areas of the two images and it is minimized using the Chebyshev polynomial approximation. The geometric transformation model adopted is a local cubic B-splines based model.
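
A minimal sketch of evaluating the weighted-ratio criterion described above for a pair of images is given below; the signal threshold, the "standard high value" and the test images are illustrative assumptions, and the Chebyshev-polynomial minimization over the B-spline transformation parameters is not reproduced.

```python
# Hedged sketch: evaluating the weighted-ratio similarity criterion on two images.
# The signal threshold, the "standard high value" and the synthetic images are
# illustrative assumptions; the minimization over transformation parameters is
# not reproduced here.
import numpy as np

def weighted_ratio_criterion(img_a, img_b, signal_thresh=50.0, high_value=10.0):
    sig_a, sig_b = img_a > signal_thresh, img_b > signal_thresh
    union = sig_a | sig_b                       # union of the two signal areas
    safe_b = np.where(np.abs(img_b) > 1e-9, img_b, 1.0)
    ratio = img_a / safe_b
    # Signal/background mismatches are weighted with the standard high value
    ratio = np.where(sig_a ^ sig_b, high_value, ratio)
    return np.mean(ratio[union] ** 2)           # quantity minimized during registration

rng = np.random.default_rng(3)
fixed = np.zeros((64, 64)); fixed[20:40, 20:40] = 100.0
moving = np.roll(fixed, shift=3, axis=1) + rng.normal(0, 1.0, fixed.shape)
print("misaligned:", weighted_ratio_criterion(fixed, moving))
print("aligned:   ", weighted_ratio_criterion(fixed, fixed + rng.normal(0, 1.0, fixed.shape)))
```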

Keywords: Medical image, non-rigid, registration.

131 A 7DOF Manipulator Control in an Unknown Environment based on an Exact Algorithm

Authors: Pavel K. Lopatin, Artyom S. Yegorov

Abstract:

An exact algorithm for an n-link manipulator movement amidst arbitrary unknown static obstacles is presented. The algorithm guarantees reaching a target configuration of the manipulator in a finite number of steps. The algorithm is reduced to a finite number of calls of a subroutine for planning a trajectory in the presence of known forbidden states. The polynomial approximation algorithm which is used as the subroutine is presented. The results of the exact algorithm implementation for the control of a seven-link (7 degrees of freedom, 7DOF) manipulator are given.

Keywords: Manipulator, trajectory planning, unknown obstacles

130 Rigid and Non-rigid Registration of Binary Objects using the Weighted Ratio Image

Authors: Panos Kotsas, Tony Dodd

Abstract:

This paper presents the application of a signal intensity independent similarity criterion for rigid and non-rigid body registration of binary objects. The criterion is defined as the weighted ratio image of two images. The ratio is computed on a voxel per voxel basis and weighting is performed by setting the ratios between signal and background voxels to a standard high value. The mean squared value of the weighted ratio is computed over the union of the signal areas of the two images and it is minimized using the Chebyshev polynomial approximation.

Keywords: rigid and non-rigid body registration, binary objects

129 Evaluating Sinusoidal Functions by a Low Complexity Cubic Spline Interpolator with Error Optimization

Authors: Abhijit Mitra, Harpreet Singh Dhillon

Abstract:

We present a novel scheme to evaluate sinusoidal functions with low complexity and high precision using cubic spline interpolation. To this end, two different approaches are proposed to find the interpolating polynomial of sin(x) within the range [-π, π]. The first deals with only a single data point while the other uses two, to keep the realization cost as low as possible. An approximation error optimization technique for cubic spline interpolation is introduced next and is shown to increase the interpolator accuracy without increasing the complexity of the associated hardware. The architectures for the proposed approaches are also developed, which exhibit flexibility of implementation with low power requirement.
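
A minimal sketch of the basic idea, a cubic-spline approximation of sin(x) on [-π, π] and its maximum error, is given below; the knot count is an arbitrary choice and the error-optimization and hardware aspects of the paper are not reproduced.

```python
# Hedged sketch: cubic-spline approximation of sin(x) on [-pi, pi] and its
# maximum error. The knot count is an arbitrary illustrative choice.
import numpy as np
from scipy.interpolate import CubicSpline

knots = np.linspace(-np.pi, np.pi, 9)          # a handful of knots
spline = CubicSpline(knots, np.sin(knots))

x = np.linspace(-np.pi, np.pi, 10001)
max_err = np.max(np.abs(spline(x) - np.sin(x)))
print(f"max absolute error with {len(knots)} knots: {max_err:.2e}")
```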

Keywords: Arithmetic, spline interpolator, hardware design, error analysis, optimization methods.

128 Approximations to the Distribution of the Sample Correlation Coefficient

Authors: John N. Haddad, Serge B. Provost

Abstract:

Given a bivariate normal sample of correlated variables, (Xi, Yi), i = 1, . . . , n, an alternative estimator of Pearson’s correlation coefficient is obtained in terms of the ranges, |Xi − Yi|. An approximate confidence interval for ρX,Y is then derived, and a simulation study reveals that the resulting coverage probabilities are in close agreement with the set confidence levels. As well, a new approximant is provided for the density function of R, the sample correlation coefficient. A mixture involving the proposed approximate density of R, denoted by hR(r), and a density function determined from a known approximation due to R. A. Fisher is shown to accurately approximate the distribution of R. Finally, nearly exact density approximants are obtained on adjusting hR(r) by a 7th degree polynomial.

Keywords: Sample correlation coefficient, density approximation, confidence intervals.

127 Graphs with Metric Dimension Two-A Characterization

Authors: Sudhakara G, Hemanth Kumar A.R

Abstract:

In this paper, we define the distance partition of the vertex set of a graph G with reference to a vertex in it and, with the help of the same, a graph with metric dimension two (i.e. β(G) = 2) is characterized. In the process, we develop a polynomial time algorithm that verifies if the metric dimension of a given graph G is two. The same algorithm explores all metric bases of the graph G whenever β(G) = 2. We also find a bound for the cardinality of any distance partite set with reference to a given vertex, whenever β(G) = 2. Also, in a graph G with β(G) = 2, a bound for the cardinality of any distance partite set as well as a bound for the number of vertices in any subgraph H of G is obtained in terms of diam H.
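
The polynomial-time verification mentioned above can be illustrated with a brute-force check that tries every vertex pair as a candidate metric basis; the cycle graph used below is an arbitrary test case, and the paper's distance-partition machinery is more refined than this sketch.

```python
# Hedged sketch: brute-force polynomial-time check of whether a graph has a
# metric basis of size two -- try every vertex pair and test whether the pair of
# distances separates all vertices. The test graph is an arbitrary example.
import itertools
import networkx as nx

def metric_bases_of_size_two(G):
    dist = dict(nx.all_pairs_shortest_path_length(G))
    bases = []
    for u, v in itertools.combinations(G.nodes, 2):
        codes = {(dist[u][w], dist[v][w]) for w in G.nodes}
        if len(codes) == G.number_of_nodes():    # all distance vectors distinct
            bases.append((u, v))
    return bases

G = nx.cycle_graph(6)                            # cycles are known to have beta = 2
print(metric_bases_of_size_two(G))
```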

Keywords: Metric basis, Distance partition, Metric dimension.

126 A Feasible Path Selection QoS Routing Algorithm with two Constraints in Packet Switched Networks

Authors: P.S.Prakash, S.Selvan

Abstract:

Over the past several years, there has been a considerable amount of research within the field of Quality of Service (QoS) support for distributed multimedia systems. One of the key issues in providing end-to-end QoS guarantees in packet networks is determining a feasible path that satisfies a number of QoS constraints. The problem of finding a feasible path is NP-Complete if the number of constraints is more than two, and it cannot be exactly solved in polynomial time. We propose a Feasible Path Selection Algorithm (FPSA) that addresses issues pertaining to finding a feasible path subject to delay and cost constraints, and it offers a higher success rate in finding feasible paths.

Keywords: feasible path, multiple constraints, path selection, QoS routing

125 The Multi-scenario Knapsack Problem: An Adaptive Search Algorithm

Authors: Mhand Hifi, Hedi Mhalla, Mustapha Michaphy

Abstract:

In this paper, we study the multi-scenario knapsack problem, a variant of the well-known NP-Hard single knapsack problem. We investigate the use of an adaptive algorithm for solving the problem heuristically. The method combines two complementary phases: a size reduction phase and a dynamic 2-opt procedure. First, the reduction phase applies a polynomial reduction strategy that is used for reducing the problem size. Second, the adaptive search procedure is applied in order to attain a feasible solution. Finally, the performances of two versions of the proposed algorithm are evaluated on a set of randomly generated instances.

Keywords: combinatorial optimization, max-min optimization, knapsack, heuristics, problem reduction

124 Exploring Counting Methods for the Vertices of Certain Polyhedra with Uncertainties

Authors: Sammani Danwawu Abdullahi

Abstract:

Vertex enumeration algorithms explore the methods and procedures of generating the vertices of general polyhedra formed by systems of equations or inequalities. These problems of enumerating the extreme points (vertices) of general polyhedra are shown to be NP-Hard. This leads to exploring how to count the vertices of general polyhedra without listing them, which is also shown to be #P-Complete. Some fully polynomial randomized approximation schemes (fpras) for counting the vertices of some special classes of polyhedra associated with Down-Sets, Independent Sets, 2-Knapsack problems and 2 x n transportation problems are presented, together with some discovered open problems.

Keywords: Approximation, counting with uncertainties, mathematical programming, optimization, vertex enumeration.

123 Zero-knowledge-like Proof of Cryptanalysis of Bluetooth Encryption

Authors: Eric Filiol

Abstract:

This paper presents a protocol aiming at proving that an encryption system contains structural weaknesses without disclosing any information on those weaknesses. A verifier can check in polynomial time that a given property of the cipher system output has been effectively realized. This property has been chosen by the prover in such a way that it cannot be achieved by known attacks or exhaustive search, but only if the prover indeed knows some undisclosed weaknesses that may effectively endanger the cryptosystem security. This protocol has been denoted zero-knowledge-like proof of cryptanalysis. In this paper, we apply this protocol to the Bluetooth core encryption algorithm E0, used in many mobile environments, and thus we suggest that its security can seriously be put into question.

Keywords: Bluetooth encryption, Bluetooth security, Bluetooth protocol, Stream cipher, Zero-knowledge, Cryptanalysis

122 Analytical Study and Modeling of Free Vibrations of Functionally Graded Plates Using a Higher Shear Deformation Theory

Authors: A. Meftah, D. Zarga, M. Yahiaoui

Abstract:

In this paper, we have used an analytical method to analyze the vibratory behavior of simply supported plates made of functionally graded materials, proposing a refined non-polynomial theory. The number of unknown functions involved in this theory is only four, as compared to five in the case of other higher shear deformation theories. The transverse shearing effects are studied according to the thickness of the plate. The motion equations for the FGM plates are obtained by applying the Hamilton principle, the solutions are obtained using the Navier method, and the fundamental frequencies are then found by solving an eigenvalue equation system. The results of this analysis are presented and compared to those available in the literature.

Keywords: FGM plates, Navier method, vibratory behavior.
