Search results for: Legendre polynomials.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 78

18 An Approach to Control Design for Nonlinear Systems via Two-stage Formal Linearization and Two-type LQ Controls

Authors: Kazuo Komatsu, Hitoshi Takata

Abstract:

In this paper we consider a nonlinear control design for nonlinear systems using two-stage formal linearization and two types of LQ control. An ordinary LQ control is designed on the almost linear region around the steady-state point. On the remaining region, another control is derived as follows. The system is transformed twice with respect to linearization functions defined by polynomials, and the linearized systems are constructed using Taylor expansions that retain higher-order terms. LQ control theory is then applied to the resulting formal linear system to obtain a second LQ control. Finally, the two LQ controls are smoothly united to form a single nonlinear control. Numerical experiments indicate that this control shows remarkable performance for a nonlinear system.
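
As a rough illustration of the LQ step only (not the authors' two-stage formal linearization), the sketch below linearizes a hypothetical nonlinear system by a first-order Taylor expansion about its steady state and computes an LQ state-feedback gain with SciPy; the system and weighting matrices are assumptions.

```python
# Minimal LQ regulator sketch: Taylor-linearize a hypothetical nonlinear
# system about its steady state and solve the algebraic Riccati equation.
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical pendulum-like system: x1' = x2, x2' = -sin(x1) - 0.5*x2 + u
def jacobian_at_origin():
    A = np.array([[0.0, 1.0],
                  [-1.0, -0.5]])   # df/dx at x = 0 (since cos(0) = 1)
    B = np.array([[0.0],
                  [1.0]])
    return A, B

A, B = jacobian_at_origin()
Q = np.eye(2)          # state weighting (assumed)
R = np.array([[1.0]])  # input weighting (assumed)

P = solve_continuous_are(A, B, Q, R)   # Riccati solution
K = np.linalg.solve(R, B.T @ P)        # LQ gain: u = -K x
print("LQ gain K =", K)
```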

Keywords: Formal Linearization, LQ Control, Nonlinear Control, Taylor Expansion, Zero Function.

17 Arabic Character Recognition Using Regression Curves with the Expectation Maximization Algorithm

Authors: Abdullah A. AlShaher

Abstract:

In this paper, we demonstrate how regression curves can be used to recognize 2D non-rigid handwritten shapes. Each shape is represented by a set of non-overlapping, uniformly distributed landmarks. The underlying models use second-order polynomials to model shapes within a training set. To estimate the regression models, we extract the coefficients that describe the variations of each shape class; a least-squares method is used to estimate these models. We then train the coefficients using the Expectation Maximization algorithm. Recognition is carried out by finding the smallest landmark displacement error with respect to the model curves. Handwritten isolated Arabic characters are used to evaluate our approach.
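
A minimal sketch of the regression idea under stated assumptions: second-order polynomial curves are fitted to landmark coordinates by least squares, and recognition picks the class with the smallest displacement error; the synthetic classes and the omission of the EM training step are simplifications.

```python
# Fit a quadratic regression curve per shape class; recognise by least error.
import numpy as np

def fit_class_curve(landmarks):
    """landmarks: (n, 2) array of (x, y); returns quadratic coefficients."""
    x, y = landmarks[:, 0], landmarks[:, 1]
    return np.polyfit(x, y, deg=2)          # least-squares quadratic fit

def displacement_error(landmarks, coeffs):
    x, y = landmarks[:, 0], landmarks[:, 1]
    return np.sum((y - np.polyval(coeffs, x)) ** 2)

def recognise(test_landmarks, class_models):
    errors = {c: displacement_error(test_landmarks, coeffs)
              for c, coeffs in class_models.items()}
    return min(errors, key=errors.get)

# Toy usage with two synthetic "classes"
rng = np.random.default_rng(0)
xs = np.linspace(-1, 1, 30)
train_a = np.c_[xs, xs**2 + 0.01 * rng.standard_normal(30)]
train_b = np.c_[xs, -xs**2 + 0.01 * rng.standard_normal(30)]
models = {"A": fit_class_curve(train_a), "B": fit_class_curve(train_b)}
print(recognise(np.c_[xs, xs**2], models))   # -> "A"
```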

Keywords: Shape recognition, Arabic handwritten characters, regression curves, expectation maximization algorithm.

16 Rigid Registration of Reduced Dimension Images using 1D Binary Projections

Authors: Panos D. Kotsas, Tony Dodd

Abstract:

The purpose of this work is to present a method for rigid registration of medical images using 1D binary projections when part of one of the two images is missing. We use 1D binary projections and adjust the projection limits according to the reduced image in order to perform accurate registration. We use the variance of the weighted ratio as a registration function, which we have shown registers 2D and 3D images more accurately and robustly than mutual information methods. The function is computed explicitly for n = 5 Chebyshev points in a [-9, +9] interval and approximated using Chebyshev polynomials for all other points. The images used are MR scans of the head. We find that the method is able to register the two images with an average accuracy of 0.3 degrees for rotations and 0.2 pixels for translations for a y dimension of 156 with an initial dimension of 256. For a y dimension of 128/256 the accuracy decreases to 0.7 degrees for rotations and 0.6 pixels for translations.
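
A minimal sketch of the approximation step described above: evaluate a registration function only at five Chebyshev points of [-9, +9] and approximate it elsewhere with a Chebyshev polynomial; the registration function here is a hypothetical stand-in for the variance of the weighted ratio.

```python
# Evaluate an expensive function at 5 Chebyshev points and interpolate it.
import numpy as np
from numpy.polynomial import chebyshev as C

def registration_function(angle_deg):
    # placeholder for the costly similarity measure at one rotation angle
    return np.cos(np.deg2rad(angle_deg)) + 0.01 * angle_deg**2

a, b = -9.0, 9.0
k = np.arange(5)
nodes = np.cos((2 * k + 1) * np.pi / (2 * 5))    # Chebyshev points on [-1, 1]
nodes = 0.5 * (b - a) * nodes + 0.5 * (b + a)    # mapped to [-9, 9]

values = np.array([registration_function(t) for t in nodes])
cheb = C.Chebyshev.fit(nodes, values, deg=4, domain=[a, b])  # exact through 5 points

print(cheb(3.7), registration_function(3.7))     # approximation vs. exact value
```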

Keywords: binary projections, image registration, reduced-dimension images.

15 3D Object Model Reconstruction Based on Polywogs Wavelet Network Parametrization

Authors: Mohamed Othmani, Yassine Khlifi

Abstract:

This paper presents a technique for compact three-dimensional (3D) object model reconstruction using wavelet networks. It transforms input surface vertices into signals and uses wavelet network parameters for signal approximation. To this end, we use a wavelet network architecture founded on several mother wavelet families. POLYnomials WindOwed with Gaussians (POLYWOG) wavelet families are used to maximize the probability of selecting the best wavelets, which ensures good generalization of the network. To achieve a better reconstruction, the network is trained for several iterations to optimize the wavelet network parameters until the error criterion is small enough. Experimental results show that our proposed technique can effectively reconstruct irregular 3D object models when using the optimized wavelet network parameters. We also show that an accurate reconstruction depends on the best choice of mother wavelets.
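
A minimal wavelet-network sketch under stated assumptions: the mother wavelet psi(t) = t*exp(-t^2/2) is used here as a POLYWOG-style example, and a fixed grid of translations and dilations with a linear least-squares fit of the output weights replaces the iterative training described in the abstract.

```python
# Approximate a 1D signal with a small wavelet network on a fixed node grid.
import numpy as np

def psi(t):
    return t * np.exp(-t**2 / 2.0)     # assumed POLYWOG-style mother wavelet

def wavelet_design_matrix(x, centers, scales):
    # one column per wavelet node psi((x - b) / a)
    return np.column_stack([psi((x - b) / a) for b, a in zip(centers, scales)])

# Toy "surface signal" to approximate
x = np.linspace(-3, 3, 200)
signal = np.sin(2 * x) * np.exp(-0.2 * x**2)

centers = np.linspace(-3, 3, 15)
scales = np.full_like(centers, 0.5)
Phi = wavelet_design_matrix(x, centers, scales)

weights, *_ = np.linalg.lstsq(Phi, signal, rcond=None)   # network output weights
reconstruction = Phi @ weights
print("max abs error:", np.max(np.abs(reconstruction - signal)))
```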

Keywords: 3D object, optimization, parametrization, Polywog wavelets, reconstruction, wavelet networks.

14 A Technique for Improving the Performance of Median Smoothers at the Corners Characterized by Low Order Polynomials

Authors: E. Srinivasan, D. Ebenezer

Abstract:

Median filters with larger windows offer greater smoothing and are more robust than median filters with smaller windows. However, the larger median smoothers (median filters with larger windows) fail to track low order polynomial trends in the signals. Due to this, constant regions are produced at the signal corners, leading to the loss of fine details. In this paper, an algorithm that combines the ability of the 3-point median smoother to preserve low order polynomial trends with the superior noise filtering characteristics of the larger median smoother is introduced. The proposed algorithm (called the combiner algorithm in this paper) is evaluated for its performance on a test image corrupted with different types of noise, and the results obtained are included.
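
A 1D sketch of one plausible combiner rule (not necessarily the authors' algorithm): keep the large-window median output except where it deviates from the 3-point median, which flags the corner regions the large window tends to flatten.

```python
# Combine a 3-point median smoother with a larger-window median smoother.
import numpy as np
from scipy.signal import medfilt

def combined_median(x, large_window=9, threshold=0.005):
    small = medfilt(x, kernel_size=3)
    large = medfilt(x, kernel_size=large_window)
    out = large.copy()
    corners = np.abs(large - small) > threshold   # where the big window over-smooths
    out[corners] = small[corners]                 # trust the trend-preserving smoother there
    return out

# Toy signal: a piecewise-linear corner plus impulsive noise
t = np.linspace(0, 1, 200)
clean = np.minimum(t, 0.5)
noisy = clean.copy()
noisy[::17] += 1.0                                # isolated impulses
print(np.abs(combined_median(noisy) - clean).mean())
```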

Keywords: Image filtering, detail preservation, median filters, nonlinear filters, order statistics filtering, Rank order filtering.

13 A Contribution to the Polynomial Eigen Problem

Authors: Malika Yaici, Kamel Hariche, Tim Clarke

Abstract:

The relationship between eigenstructure (eigenvalues and eigenvectors) and latent structure (latent roots and latent vectors) is established. In control theory eigenstructure is associated with the state space description of a dynamic multi-variable system and a latent structure is associated with its matrix fraction description. Beginning with block controller and block observer state space forms and moving on to any general state space form, we develop the identities that relate eigenvectors and latent vectors in either direction. Numerical examples illustrate this result. A brief discussion of the potential of these identities in linear control system design follows. Additionally, we present a consequent result: a quick and easy method to solve the polynomial eigenvalue problem for regular matrix polynomials.
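
A minimal sketch of the generic companion-linearization route to the polynomial eigenvalue problem for a regular quadratic matrix polynomial; the matrices are arbitrary examples, and this is the standard method rather than the identity-based shortcut derived in the paper.

```python
# Solve (A0 + s*A1 + s^2*A2) v = 0 via a block companion linearization.
import numpy as np
from scipy.linalg import eig

def polyeig(A0, A1, A2):
    n = A0.shape[0]
    # Generalized eigenproblem  L x = s * M x  with x = [v; s*v]
    L = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-A0,              -A1      ]])
    M = np.block([[np.eye(n),        np.zeros((n, n))],
                  [np.zeros((n, n)), A2              ]])
    vals, vecs = eig(L, M)
    return vals, vecs[:n, :]          # top block carries the latent vectors

A0 = np.array([[2.0, 0.0], [0.0, 3.0]])
A1 = np.array([[0.1, 0.0], [0.0, 0.2]])
A2 = np.eye(2)
latent_roots, latent_vectors = polyeig(A0, A1, A2)
print(latent_roots)
```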

Keywords: Eigenvalues/Eigenvectors, Latent values/vectors, Matrix fraction description, State space description.

12 Identification of LTI Autonomous All Pole System Using Eigenvector Algorithm

Authors: Sudipta Majumdar

Abstract:

This paper presents a method for identification of a linear time invariant (LTI) autonomous all-pole system using singular value decomposition. The novelty of this paper is twofold: first, a MUSIC algorithm for estimating complex frequencies from real measurements is proposed. Secondly, using the proposed algorithm, we can identify the coefficients of the differential equation that determines the LTI system by switching off the input signal. For this purpose, we need only switch off the input, apply the complex MUSIC algorithm, and determine the coefficients as symmetric polynomials in the complex frequencies. The method can be applied to unstable systems and has higher resolution than the time series solution when noisy data are used. The classical performance bound, the Cramer-Rao bound (CRB), is used as a basis for performance comparison of the proposed method for multiple-pole estimation in noisy exponential signals.
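
A minimal textbook-style MUSIC sketch for real measurements of undamped sinusoids; it omits the authors' complex-frequency extension and the recovery of the differential-equation coefficients as symmetric polynomials.

```python
# MUSIC pseudospectrum from the noise subspace of a sample correlation matrix.
import numpy as np
from scipy.signal import find_peaks

def music_spectrum(x, n_real_sinusoids, m=20, n_grid=2048):
    snaps = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
    R = snaps.T @ snaps / snaps.shape[0]           # sample correlation matrix
    _, V = np.linalg.eigh(R)                       # eigenvalues in ascending order
    noise = V[:, :m - 2 * n_real_sinusoids]        # noise subspace (2 dims per real sinusoid)
    grid = np.linspace(0.0, np.pi, n_grid)
    steering = np.exp(1j * np.outer(grid, np.arange(m)))
    proj = np.abs(steering @ noise.conj()) ** 2    # squared projection onto noise subspace
    return grid, 1.0 / proj.sum(axis=1)            # MUSIC pseudospectrum

rng = np.random.default_rng(1)
n = np.arange(400)
x = np.cos(0.6 * n) + 0.5 * np.cos(1.9 * n) + 0.1 * rng.standard_normal(n.size)
grid, pseudo = music_spectrum(x, n_real_sinusoids=2)
peaks, _ = find_peaks(pseudo)
print(grid[peaks[np.argsort(pseudo[peaks])[-2:]]])   # close to 0.6 and 1.9 rad/sample
```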

Keywords: MUSIC algorithm, Cramer Rao bound, frequency estimation.

11 Secure Data Aggregation Using Clusters in Sensor Networks

Authors: Prakash G L, Thejaswini M, S H Manjula, K R Venugopal, L M Patnaik

Abstract:

Wireless sensor networks can be applied in both hostile and military environments. A primary goal in the design of wireless sensor networks is lifetime maximization, constrained by the energy capacity of batteries. One well-known method to reduce energy consumption in such networks is data aggregation. Providing efficient data aggregation while preserving data privacy is a challenging problem in wireless sensor networks research. In this paper, we present a privacy-preserving data aggregation scheme for additive aggregation functions. The Cluster-based Private Data Aggregation (CPDA) scheme leverages a clustering protocol and algebraic properties of polynomials, and has the advantage of incurring less communication overhead. The goal of our work is to bridge the gap between collaborative data collection by wireless sensor networks and data privacy. We present simulation results of our schemes and compare their performance to a typical data aggregation scheme, TAG, where no data privacy protection is provided. Results show the efficacy and efficiency of our schemes.
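
A toy sketch of the algebraic idea behind cluster-based private aggregation, namely additive sharing through polynomial evaluation; it is a Shamir-style simplification for illustration, not the CPDA message flow itself.

```python
# Each node hides its reading as the constant term of a random polynomial;
# only the cluster-wide sum can be recovered by interpolating summed shares at 0.
import random

PRIME = 2_147_483_647                  # arithmetic over a prime field

def make_shares(value, degree, points):
    coeffs = [value] + [random.randrange(PRIME) for _ in range(degree)]
    return [sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
            for x in points]

def interpolate_at_zero(points, shares):
    total = 0
    for j, (xj, yj) in enumerate(zip(points, shares)):
        num, den = 1, 1
        for m, xm in enumerate(points):
            if m != j:
                num = num * (-xm) % PRIME
                den = den * (xj - xm) % PRIME
        total = (total + yj * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total

readings = [17, 42, 5]                 # private sensor values in one cluster
points = [1, 2, 3]                     # one evaluation point per node
shares = [make_shares(v, degree=2, points=points) for v in readings]
summed = [sum(col) % PRIME for col in zip(*shares)]   # each node sums received shares
print(interpolate_at_zero(points, summed))            # -> 64, without revealing readings
```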

Keywords: Aggregation, Clustering, Query Processing.

10 Generalized π-Armendariz Authentication Cryptosystem

Authors: Areej M. Abduldaim, Nadia M. G. Al-Saidi

Abstract:

Algebra is one of the important fields of mathematics. It concerns the study and manipulation of mathematical symbols and of abstractions such as groups, rings, and fields. Due to the development of these abstractions, it has been extended to other structures, such as vectors, matrices, and polynomials, which are non-numerical objects. Computer algebra is the implementation of algebraic methods as algorithms and computer programs. Recently, many algebraic cryptosystem protocols based on non-commutative algebraic structures have been adopted for authentication, key exchange, and encryption-decryption processes. Cryptography is the science aimed at sending information through public channels in such a way that only an authorized recipient can read it. Ring theory is the most attractive category of algebra in the area of cryptography. In this paper, we employ the algebraic structure called skew π-Armendariz rings to design a neoteric algorithm for zero-knowledge proof. The proposed protocol is established and illustrated through a numerical example, and its soundness and completeness are proved.

Keywords: Cryptosystem, identification, skew π-Armendariz rings, skew polynomial rings, zero knowledge protocol.

9 Method of Moments for Analysis of Multiple Crack Interaction in an Isotropic Elastic Solid

Authors: Weifeng Wang, Xianwei Zeng, Jianping Ding

Abstract:

The problem of N interacting cracks in an isotropic elastic solid is decomposed into a subproblem of a homogeneous solid without a crack and N subproblems, each with a single crack subjected to unknown tractions on the two crack faces. The unknown tractions, namely pseudo-tractions, on each crack are expanded into polynomials with unknown coefficients, which are determined by the consistency condition, i.e., by the equivalence of the original multiple-crack interaction problem and the superposition of the N+1 subproblems. In this paper, Kachanov's approach of average tractions is extended into a method of moments to approximately impose the consistency condition; hence Kachanov's method can be viewed as the zero-order method of moments. Numerical results of the stress intensity factors are presented for interactions of two collinear cracks, three collinear cracks, two parallel cracks, and three parallel cracks. As the order of moments increases, the accuracy of the method of moments improves.

Keywords: Crack interaction, stress intensity factor, multiple cracks, method of moments.

8 Flow Modeling and Runner Design Optimization in Turgo Water Turbines

Authors: John S. Anagnostopoulos, Dimitrios E. Papantonis

Abstract:

The incorporation of computational fluid dynamics in the design of modern hydraulic turbines appears to be necessary in order to improve their efficiency and cost-effectiveness beyond the traditional design practices. A numerical optimization methodology is developed and applied in the present work to a Turgo water turbine. The fluid is simulated by a Lagrangian mesh-free approach that can provide detailed information on the energy transfer and enhance the understanding of the complex, unsteady flow field, at very small computing cost. The runner blades are initially shaped according to hydrodynamics theory, and parameterized using Bezier polynomials and interpolation techniques. The use of a limited number of free design variables allows for various modifications of the standard blade shape, while stochastic optimization using evolutionary algorithms is implemented to find the best blade that maximizes the attainable hydraulic efficiency of the runner. The obtained optimal runner design achieves considerably higher efficiency than the standard one, and its numerically predicted performance is comparable to a real Turgo turbine, verifying the reliability and the prospects of the new methodology.
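
A minimal sketch of the blade parameterization idea: a planar Bezier curve evaluated by de Casteljau's algorithm from a handful of control points, which would serve as the free design variables; the control points are arbitrary, and this is generic Bezier evaluation rather than the paper's full runner-surface parameterization.

```python
# Evaluate a planar Bezier curve from its control points (de Casteljau).
import numpy as np

def bezier(control_points, t):
    """Evaluate a Bezier curve at parameter values t."""
    pts = np.asarray(control_points, dtype=float)
    t = np.atleast_1d(t)[:, None, None]                       # shape (nt, 1, 1)
    layers = np.broadcast_to(pts, (t.shape[0],) + pts.shape).copy()
    for _ in range(len(pts) - 1):                             # repeated linear interpolation
        layers = (1 - t) * layers[:, :-1, :] + t * layers[:, 1:, :]
    return layers[:, 0, :]

# Hypothetical blade camber line controlled by four design points
ctrl = [(0.0, 0.0), (0.3, 0.4), (0.7, 0.5), (1.0, 0.1)]
curve = bezier(ctrl, np.linspace(0, 1, 50))
print(curve[0], curve[-1])        # endpoints coincide with first/last control points
```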

Keywords: Turgo turbine, Lagrangian flow modeling, Surface parameterization, Design optimization, Evolutionary algorithms.

7 An Approach to Polynomial Curve Comparison in Geometric Object Database

Authors: Chanon Aphirukmatakun, Natasha Dejdumrong

Abstract:

In image processing and visualization, two bitmapped images are compared by matching them pixel by pixel, which takes a lot of computational time, while the comparison of two vector-based images is significantly faster. These raster graphics images can sometimes be approximately converted into vector-based images by various techniques. After conversion, the problem of comparing two raster graphics images is reduced to the problem of comparing vector graphics images, and hence pixel-by-pixel comparison reduces to polynomial comparison. In computer aided geometric design (CAGD), vector graphics images are compositions of curves and surfaces. Curves are defined by a sequence of control points and their polynomials. In this paper, the control points are used to compare curves. Curves that have been translated or rotated are treated as equivalent, while curves that have been scaled differently are considered similar. This paper proposes an algorithm for comparing polynomial curves for equivalence and similarity using their control points. In addition, the geometric object-oriented database used to keep the curve information is defined in XML format for further use in curve comparisons.
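
One plausible realisation (an assumption, not the paper's exact algorithm) of the equivalence/similarity criterion: compare control polygons after removing translation and rotation with a Kabsch alignment, and additionally normalise scale for the similarity test.

```python
# Align two control polygons (Kabsch) and measure the residual.
import numpy as np

def aligned_residual(P, Q, allow_scale=False):
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)     # remove translation
    if allow_scale:                                     # similarity: normalise scale too
        Pc = Pc / np.linalg.norm(Pc)
        Qc = Qc / np.linalg.norm(Qc)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)                 # optimal rotation (Kabsch)
    R = (U @ Vt).T
    if np.linalg.det(R) < 0:                            # guard against reflections
        Vt[-1] *= -1
        R = (U @ Vt).T
    return np.linalg.norm(Pc @ R.T - Qc)

A = np.array([(0, 0), (1, 2), (3, 1), (4, 3)], float)
theta = 0.7
rot = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
B = A @ rot.T + np.array([5.0, -2.0])                   # rotated + translated copy
C = 2.5 * B                                             # additionally scaled copy

print(aligned_residual(A, B) < 1e-9)                    # equivalent
print(aligned_residual(A, C) < 1e-9,                    # not equivalent...
      aligned_residual(A, C, allow_scale=True) < 1e-9)  # ...but similar
```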

Keywords: Bezier curve, Said-Ball curve, Wang-Ball curve, DP curve, CAGD, comparison, geometric object database.

6 A Combined Conventional and Differential Evolution Method for Model Order Reduction

Authors: J. S. Yadav, N. P. Patidar, J. Singhai, S. Panda, C. Ardil

Abstract:

In this paper a mixed method combining an evolutionary and a conventional technique is proposed for the reduction of Single Input Single Output (SISO) continuous systems into a Reduced Order Model (ROM). In the conventional technique, the combined advantages of the Mihailov stability criterion and the Continued Fraction Expansion (CFE) technique are employed: the reduced denominator polynomial is derived using the Mihailov stability criterion, and the numerator is obtained by matching the quotients of the Cauer second form of the continued fraction expansion. Then, retaining the numerator polynomial, the denominator polynomial is recalculated by an evolutionary technique, the recently proposed Differential Evolution (DE) optimization method. The DE method is based on the minimization of the Integral Squared Error (ISE) between the transient responses of the original higher order model and the reduced order model for a unit step input. The proposed method is illustrated through a numerical example and compared with a ROM whose numerator and denominator polynomials are both obtained by the conventional method, to show its superiority.
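
A minimal sketch of the evolutionary step only: with the reduced numerator assumed fixed by the conventional stage, Differential Evolution searches for the second-order denominator that minimises the ISE between unit-step responses; the transfer functions are hypothetical stand-ins, not the paper's numerical example.

```python
# Differential Evolution over denominator coefficients, minimising step-response ISE.
import numpy as np
from scipy.signal import lti, step
from scipy.optimize import differential_evolution

t = np.linspace(0.0, 10.0, 500)
original = lti([8.0, 24.0], [1.0, 6.0, 11.0, 6.0])   # hypothetical 3rd-order plant
_, y_full = step(original, T=t)

num_reduced = [4.0]     # numerator assumed fixed by the conventional (CFE) step

def ise(den_coeffs):
    a1, a0 = den_coeffs                              # candidate s^2 + a1*s + a0 (positive => stable)
    _, y_red = step(lti(num_reduced, [1.0, a1, a0]), T=t)
    return np.sum((y_full - y_red) ** 2) * (t[1] - t[0])   # discrete ISE for a unit step

result = differential_evolution(ise, bounds=[(0.01, 10.0), (0.01, 10.0)], seed=0)
a1, a0 = result.x
print(f"reduced model: {num_reduced[0]:.2f} / (s^2 + {a1:.3f} s + {a0:.3f})")
```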

Keywords: Reduced Order Modeling, Stability, Mihailov Stability Criterion, Continued Fraction Expansions, Differential Evolution, Integral Squared Error.

5 Non-Local Behavior of a Mixed-Mode Crack in a Functionally Graded Piezoelectric Medium

Authors: Nidhal Jamia, Sami El-Borgi

Abstract:

In this paper, the problem of a mixed-mode crack embedded in an infinite medium made of a functionally graded piezoelectric material (FGPM), with crack surfaces subjected to electro-mechanical loadings, is investigated. Eringen's non-local theory of elasticity is adopted to formulate the governing electro-elastic equations. The properties of the piezoelectric material are assumed to vary exponentially along a plane perpendicular to the crack. Using the Fourier transform, three integral equations are obtained in which the unknown variables are the jumps of mechanical displacements and electric potentials across the crack surfaces. To solve the integral equations, the unknowns are expanded directly as series of Jacobi polynomials, and the resulting equations are solved using the Schmidt method. In contrast to the classical solutions based on the local theory, it is found that no mechanical stress or electric displacement singularities are present at the crack tips when the non-local theory is employed. A direct benefit is the ability to use the calculated maximum stress as a fracture criterion. The primary objective of this study is to investigate the effects of crack length, the material gradient parameter describing FGPMs, and the lattice parameter on the mechanical stress and electric displacement fields near the crack tips.
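
A minimal sketch of the expansion step only: an unknown crack-face density on [-1, 1] is represented as a truncated Jacobi polynomial series whose coefficients are recovered by least squares; the weight exponents and the sample function are placeholders, not the paper's electro-elastic kernels.

```python
# Expand an unknown function in a truncated Jacobi polynomial basis.
import numpy as np
from scipy.special import eval_jacobi

alpha = beta = 0.5                                   # assumed Jacobi weight exponents
x = np.linspace(-0.99, 0.99, 400)
unknown = np.sqrt(1 - x**2) * np.sin(2 * x)          # stand-in for the crack-face jump

degrees = np.arange(8)
basis = np.column_stack([eval_jacobi(n, alpha, beta, x) for n in degrees])
coeffs, *_ = np.linalg.lstsq(basis, unknown, rcond=None)

approx = basis @ coeffs
print("max abs expansion error:", np.max(np.abs(approx - unknown)))
```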

Keywords: Functionally graded piezoelectric material, mixed-mode crack, non-local theory, Schmidt method.

4 Statistical Modeling of Local Area Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes

Authors: Jihad S. Daba, J. P. Dubois

Abstract:

Fading noise degrades the performance of cellular communication, most notably in femto- and pico-cells in 3G and 4G systems. When the wireless channel consists of a small number of scattering paths, the statistics of the fading noise are not analytically tractable and pose a serious challenge to developing closed canonical forms that can be analysed and used in the design of efficient and optimal receivers. In this context, the noise is multiplicative and is referred to as stochastically local fading. In many analytical investigations of multiplicative noise, exponential or Gamma statistics are invoked. More recent advances by the author of this paper utilized Poisson-modulated, weighted generalized Laguerre polynomials with controlling parameters and uncorrelated noise assumptions. In this paper, we investigate the statistics of a multidiversity, stochastically local area fading channel in which the channel consists of randomly distributed Rayleigh and Rician scattering centers with a coherent Nakagami-distributed line-of-sight component and an underlying doubly stochastic Poisson process driven by a lognormal intensity. These combined statistics form a unifying triply stochastic filtered marked Poisson point process model.
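
A toy Monte Carlo sketch of the channel construction described above, under simplifying assumptions: a Poisson number of scattering paths driven by a lognormal intensity, Rayleigh amplitude marks with uniform phases, and a fixed line-of-sight term standing in for the Nakagami-distributed component. Parameter values are illustrative only.

```python
# Simulate one fading sample from a doubly stochastic (Cox) marked point process.
import numpy as np

rng = np.random.default_rng(7)

def channel_sample(mean_log_intensity=1.0, sigma_log=0.5, los=1.0):
    lam = rng.lognormal(mean_log_intensity, sigma_log)   # stochastic intensity
    n_paths = rng.poisson(lam)                           # marked point count
    amps = rng.rayleigh(scale=0.3, size=n_paths)         # amplitude marks
    phases = rng.uniform(0, 2 * np.pi, size=n_paths)
    scattered = np.sum(amps * np.exp(1j * phases))       # diffuse multipath sum
    return los + scattered                               # coherent LOS + diffuse part

samples = np.array([channel_sample() for _ in range(20000)])
print("mean received power:", np.mean(np.abs(samples) ** 2))
```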

Keywords: Cellular communication, femto- and pico-cells, stochastically local area fading channel, triply stochastic filtered marked Poisson point process.

3 Conventional and PSO Based Approaches for Model Reduction of SISO Discrete Systems

Authors: S. K. Tomar, R. Prasad, S. Panda, C. Ardil

Abstract:

Reduction of Single Input Single Output (SISO) discrete systems into a lower order model, using a conventional and an evolutionary technique, is presented in this paper. In the conventional technique, the combined advantages of the Modified Cauer Form (MCF) and differentiation are used. In this method the original discrete system is first converted into an equivalent continuous system by applying the bilinear transformation. The denominator of the equivalent continuous system and its reciprocal are differentiated successively, and the reduced denominator of the desired order is obtained by combining the differentiated polynomials. The numerator is obtained by matching the quotients of the MCF. The reduced continuous system is converted back into a discrete system using the inverse bilinear transformation. In the evolutionary technique, Particle Swarm Optimization (PSO) is employed to reduce the higher order model. The PSO method is based on the minimization of the Integral Squared Error (ISE) between the transient responses of the original higher order model and the reduced order model for a unit step input. Both methods are illustrated through a numerical example.

Keywords: Discrete System, Single Input Single Output (SISO), Bilinear Transformation, Reduced Order Model, Modified Cauer Form, Polynomial Differentiation, Particle Swarm Optimization, Integral Squared Error.

2 Application of Rapidly Exploring Random Tree Star-Smart and G2 Quintic Pythagorean Hodograph Curves to the UAV Path Planning Problem

Authors: Luiz G. Véras, Felipe L. Medeiros, Lamartine F. Guimarães

Abstract:

This work approaches the automatic planning of paths for Unmanned Aerial Vehicles (UAVs) through the application of the Rapidly Exploring Random Tree Star-Smart (RRT*-Smart) algorithm. RRT*-Smart samples positions of a navigation environment through a tree-type graph. The algorithm randomly expands a tree from an initial position (root node) until one of its branches reaches the final position of the path to be planned, and it ensures the planning of the shortest path as the number of iterations tends to infinity. When a new node is inserted into the tree, each neighbor node of the new node is connected to it if and only if the extension of the path between the root node and that neighbor node, with this new connection, is shorter than the current path between those two nodes. RRT*-Smart uses an intelligent sampling strategy to plan shorter routes in fewer iterations, based on creating samples/nodes near the convex vertices of the obstacles in the navigation environment. The planned paths are smoothed through the application of G2 quintic Pythagorean hodograph curves. The smoothing process converts a route into a dynamically viable one based on the kinematic constraints of the vehicle. This smoothing method models the hodograph components of a curve with polynomials that obey the Pythagorean Theorem. Its advantage is that the obtained structure allows the curve length to be computed exactly, without the need for quadrature techniques to evaluate integrals.
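
A minimal sketch of the Pythagorean-hodograph property that enables the exact length computation: with quadratic u(t) and v(t), the hodograph x' = u^2 - v^2, y' = 2uv has parametric speed u^2 + v^2, a polynomial whose antiderivative gives the arc length exactly; the coefficients below are arbitrary illustrative choices.

```python
# Exact arc length of a planar PH quintic via polynomial antidifferentiation.
import numpy as np
from numpy.polynomial import polynomial as P

u = np.array([1.0, 0.5, -0.2])      # u(t), quadratic (lowest degree first)
v = np.array([0.3, -0.4, 0.6])      # v(t), quadratic -> quintic PH curve

sigma = P.polyadd(P.polymul(u, u), P.polymul(v, v))   # parametric speed |r'(t)| = u^2 + v^2
length_poly = P.polyint(sigma)                        # exact antiderivative of sigma
exact_length = P.polyval(1.0, length_poly) - P.polyval(0.0, length_poly)

# brute-force numerical check of the same arc length
t = np.linspace(0.0, 1.0, 200001)
xp = P.polyval(t, P.polysub(P.polymul(u, u), P.polymul(v, v)))   # x'(t) = u^2 - v^2
yp = 2.0 * P.polyval(t, P.polymul(u, v))                         # y'(t) = 2 u v
numeric_length = np.sum(np.sqrt(xp**2 + yp**2)) * (t[1] - t[0])
print(exact_length, numeric_length)
```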

Keywords: Path planning, path smoothing, Pythagorean hodograph curve, RRT*-Smart.

1 GridNtru: High Performance PKCS

Authors: Narasimham Challa, Jayaram Pradhan

Abstract:

Cryptographic algorithms play a crucial role in the information society by providing protection from unauthorized access to sensitive data. It is clear that information technology will become increasingly pervasive; hence we can expect the emergence of ubiquitous or pervasive computing and ambient intelligence. These new environments and applications will present new security challenges, and there is no doubt that cryptographic algorithms and protocols will form part of the solution. The efficiency of a public key cryptosystem is mainly measured in computational overheads, key size and bandwidth. In particular, the RSA algorithm is used in many applications for providing security. Although the security of RSA is beyond doubt, the evolution in computing power has caused a growth in the necessary key length. The fact that most chips on smart cards cannot process keys exceeding 1024 bits shows that there is a need for an alternative. NTRU is such an alternative: a collection of mathematical algorithms based on manipulating lists of very small integers and polynomials, which allows NTRU to achieve high speeds with minimal computing power. NTRU (Nth degree Truncated Polynomial Ring Unit) is the first secure public key cryptosystem not based on the factorization or discrete logarithm problem; this means that, given sufficient computational resources and time, an adversary should not be able to break the key. Multi-party communication and the requirement of optimal resource utilization create a demand for applications that need security enforcement and can be enhanced with high-end computing. This has prompted us to develop high-performance NTRU schemes using approaches such as high-end computing hardware. Peer-to-peer (P2P) and enterprise grids are proven approaches for developing high-end computing systems, and by utilizing them one can improve the performance of NTRU through parallel execution. In this paper we propose and develop an application for NTRU using the enterprise grid middleware Alchemi. An analysis and comparison of its performance for various text files is presented.
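
A minimal sketch of the core NTRU arithmetic the abstract refers to, multiplication in the truncated polynomial ring Z_q[x]/(x^N - 1) as a cyclic convolution of small-integer coefficient lists; it shows only the ring operation, not key generation or the GridNtru/Alchemi deployment.

```python
# Cyclic convolution of coefficient lists in the NTRU ring Z_q[x]/(x^N - 1).
import numpy as np

def ring_multiply(a, b, N, q):
    """Multiply polynomials a, b (coefficients, lowest degree first) in Z_q[x]/(x^N - 1)."""
    result = np.zeros(N, dtype=np.int64)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            result[(i + j) % N] += ai * bj     # x^N wraps around to 1
    return result % q

N, q = 11, 32
f = [1, -1, 0, 1, 0, 0, 1, 0, -1, 0, 1]        # small ternary coefficient lists
g = [0, 1, 1, 0, -1, 0, 0, 1, 0, -1, 1]
print(ring_multiply(f, g, N, q))
```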

Keywords: Alchemi, GridNtru, Ntru, PKCS.
