Search results for: Original KNN
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 646

616 An Estimation of Variance Components in Linear Mixed Model

Authors: Shuimiao Wan, Chao Yuan, Baoguang Tian

Abstract:

In this paper, a linear mixed model with two random effects is decomposed into two sub-models. Parameter estimates for the original model, together with their statistical properties, are derived from these two sub-models, and several important properties are established by comparing the resulting estimator with other common estimators. It is also proved that the analysis-of-variance estimate (ANOVAE) of σ² in the original model equals the least-squares estimate (LSE) of σ² obtained from the two sub-models. Finally, it is shown that, under the Stein loss function and certain conditions, the proposed estimator outperforms the ANOVAE to some degree.
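
For orientation, here is a minimal sketch of the standard two-random-effect specification (the abstract does not give the exact model, so take this form as an assumption), together with the usual ANOVA-type estimator of σ² based on the residual after projecting onto the combined design matrix:

\[
y = X\beta + Z_1 u_1 + Z_2 u_2 + \varepsilon,\qquad u_i \sim N(0,\sigma_i^2 I),\quad \varepsilon \sim N(0,\sigma^2 I_n),
\]
\[
\hat{\sigma}^2_{\mathrm{ANOVA}} = \frac{y^{\top}\bigl(I - P_{[X\ Z_1\ Z_2]}\bigr)\,y}{\,n - \operatorname{rank}[X\ Z_1\ Z_2]\,},\qquad P_A = A(A^{\top}A)^{-}A^{\top}.
\]

Because (I - P)X = (I - P)Z_i = 0, this quadratic form has expectation σ²(n - rank), which is the link exploited when it is compared with the least-squares estimate of σ² from the two sub-models.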

Keywords: Linear mixed model, Random effects, Parameter estimation, Stein function.

615 Controller Design of Discrete Systems by Order Reduction Technique Employing Differential Evolution Optimization Algorithm

Authors: J. S. Yadav, N. P. Patidar, J. Singhai

Abstract:

One of the main objectives of order reduction is to design a lower-order controller that can effectively control the original high-order system, so that the overall system is of lower order and easier to understand. In this paper, a simple method is presented for controller design of a higher-order discrete system. First, the original higher-order discrete system is reduced to a lower-order model. Then a Proportional Integral Derivative (PID) controller is designed for the lower-order model. An error-minimization technique is employed for both order reduction and controller design; for this purpose, the Differential Evolution (DE) optimization algorithm is used. The DE method minimizes the Integral Squared Error (ISE) between the desired response and the actual response to a unit step input. Finally, the designed PID controller is connected to the original higher-order discrete system to meet the desired specification. The validity of the proposed method is illustrated through a numerical example.
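
As an illustration of the ISE-based fitting step, the hedged Python sketch below uses SciPy's differential_evolution to fit a first-order discrete model to the step response of a hypothetical third-order plant (the plant coefficients and the first-order structure are illustrative assumptions, not taken from the paper); the same loop could then be reused to tune PID gains against the reduced model.

import numpy as np
from scipy.optimize import differential_evolution
from scipy.signal import dlti, dstep

# Hypothetical stable third-order discrete plant (illustrative coefficients, not from the paper).
plant = dlti([0.1, 0.05], [1.0, -1.3, 0.5, -0.1], dt=1.0)
_, y_full = dstep(plant, n=60)
y_full = np.squeeze(y_full)

def ise(params):
    # ISE between the full-order and the candidate first-order step responses.
    b0, a0 = params
    _, y_red = dstep(dlti([b0], [1.0, a0], dt=1.0), n=60)
    return float(np.sum((y_full - np.squeeze(y_red)) ** 2))

result = differential_evolution(ise, bounds=[(0.0, 2.0), (-0.999, 0.999)], seed=0)
print("reduced-order model parameters:", result.x, " ISE:", result.fun)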

Keywords: Discrete System, Model Order Reduction, PID Controller, Integral Squared Error, Differential Evolution.

614 System Reduction by Eigen Permutation Algorithm and Improved Pade Approximations

Authors: Jay Singh, Kalyan Chatterjee, C. B. Vishwakarma

Abstract:

A mixed method combining an eigen permutation algorithm with improved Padé approximations is proposed for reducing the order of large-scale dynamic systems. In this method, the most dominant eigenvalue of the original system is preserved in the reduced-order system. The proposed method guarantees stability of the reduced model if the original high-order system is stable, and it is comparable in quality with other well-known order reduction methods. The superiority of the proposed method is shown through examples taken from the literature.
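
The Padé part of such mixed methods typically matches the leading power-series (time-moment) coefficients of the original transfer function once the reduced denominator has been fixed from the dominant eigenvalues. A minimal Python sketch of that matching step follows, with an illustrative transfer function not taken from the paper; the authors' specific improvement is not reproduced here.

def power_series(num, den, k):
    # num and den are polynomial coefficients in ascending powers of s;
    # returns c_0 .. c_{k-1} of the expansion of num(s)/den(s) about s = 0.
    num = list(num) + [0.0] * k
    c = []
    for j in range(k):
        acc = num[j] - sum(den[i] * c[j - i] for i in range(1, min(j, len(den) - 1) + 1))
        c.append(acc / den[0])
    return c

def pade_numerator(red_den, c):
    # b_k = sum_i a_i * c_{k-i}: reduced numerator matching the first len(c) series coefficients.
    return [sum(red_den[i] * c[k - i] for i in range(0, min(k, len(red_den) - 1) + 1))
            for k in range(len(c))]

# Illustrative example (not from the paper): G(s) = (s + 4) / (s^2 + 5s + 4); keep the dominant
# pole s = -1, so the reduced denominator is (s + 1), i.e. [1, 1] in ascending powers.
num, den, red_den = [4.0, 1.0], [4.0, 5.0, 1.0], [1.0, 1.0]
c = power_series(num, den, k=1)                    # one coefficient for a first-order model
print(pade_numerator(red_den, c), "/", red_den)    # expect [1.0] / [1.0, 1.0], i.e. R(s) = 1/(s + 1)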

Keywords: Eigen algorithm, Order reduction, Improved Padé approximations, Stability, Transfer function.

613 Role of Director's Philosophical Approach in Cinematographic Expression

Authors: Sedat Cereci

Abstract:

The original idea for a feature film may come from a writer, a director or a producer. The director is the person responsible for the creative aspects of a motion picture production, both interpretive and technical. The director may be shown discussing the project with co-writers, members of the production staff and the producer, or selecting locations and constructing sets; all of these activities are ways of externalizing the director's ideas about the film. A director sometimes pushes both the film image and the techniques of narration to new artistic limits, but the director's main responsibility is to lead the spectator to an original opinion through his or her philosophical approach. The director tries to find an artistic angle in every scene, turns the screenplay into an effective story, and sets the film on a spiritual and philosophical base.

Keywords: Director, role, film, approach, opinion.

612 Delay Preserving Substructures in Wireless Networks Using Edge Difference between a Graph and its Square Graph

Authors: T. N. Janakiraman, J. Janet Lourds Rani

Abstract:

In practice, wireless networks have the property that signal strength attenuates with distance from the base station, so quality of service can be improved by also considering nodes that are two hops away. In this paper, we propose a procedure to identify delay-preserving substructures in a given wireless ad-hoc network using a new graph operation, G² − E(G) = G* (the edge difference between the square of a graph and the original graph). This operation helps to analyze induced substructures that preserve delay in communication among their nodes: it induces a graph G* in which the 1-hop neighbors of any node are at 2-hop distance in the original network. We also identify several delay-preserving substructures in G*: (i) a set of nodes that are mutually at 2-hop distance in G forms a clique in G*; (ii) a set of nodes forming an odd cycle C_{2k+1} in G forms an odd cycle in G*, while a set of nodes forming an even cycle C_{2k} in G forms two disjoint companion cycles of length k (of the same parity, odd or even) in G*; (iii) every path of length 2k+1 or 2k in G induces two disjoint paths of length k in G*; and (iv) a set of nodes in G* that induces a maximal connected subgraph of radius 1 identifies a substructure with radius 2 and diameter at most 4 in G. These delay-preserving substructures behave as good clusters in the original network.
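
A small Python/NetworkX sketch of the G² − E(G) = G* operation described above; the 5-cycle check is our own illustration (an odd cycle in G yields an odd cycle in G*), not an example from the paper.

import networkx as nx

def edge_difference_of_square(G):
    # G* = G^2 - E(G): nodes are adjacent in G* iff they are at distance exactly 2 in G.
    Gstar = nx.power(G, 2)              # square graph: joins nodes at distance <= 2
    Gstar.remove_edges_from(G.edges())  # drop the original edges, keeping only distance-2 pairs
    return Gstar

G = nx.cycle_graph(5)                   # odd cycle C5
Gstar = edge_difference_of_square(G)
print(sorted(Gstar.edges()))            # again a 5-cycle: [(0, 2), (0, 3), (1, 3), (1, 4), (2, 4)]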

Keywords: Clique, cycles, delay preserving substructures, maximal connected subgraph.

611 Encryption Image via Mutual Singular Value Decomposition

Authors: Adil Al-Rammahi

Abstract:

Image and document encryption is needed in e-government databases. In this paper we introduce two image matrices: one public, and the second secret (the original). Each matrix is analyzed using the singular value decomposition, i.e. it is factored into three matrices: a row orthogonal basis, a column orthogonal basis, and a diagonal matrix of singular values. The product of the two row bases is calculated, and similarly the product of the two column bases. Finally, the public matrix, the row product and the column product are saved as files. In the decryption stage, the original image is recovered from these three public files by the mutual method.
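
A hedged NumPy sketch of the SVD-based idea: the abstract does not fully specify how the products are formed, so here we assume, for illustration only, that the secret image's singular values are folded into the row-basis product; under that assumption the original can be recovered exactly from the public image and the two products.

import numpy as np

rng = np.random.default_rng(0)
P = rng.random((8, 8))      # public image (stand-in data)
S = rng.random((8, 8))      # secret (original) image (stand-in data)

Up, sp, Vpt = np.linalg.svd(P)     # P = Up @ diag(sp) @ Vpt
Us, ss, Vst = np.linalg.svd(S)     # S = Us @ diag(ss) @ Vst

# Assumed reading of the "mutual" products: fold the secret's singular values into the row product.
R = Up.T @ (Us * ss)               # row-basis product
C = Vpt @ Vst.T                    # column-basis product

# Decryption from the three public files (P, R, C): recover the original image exactly.
Up2, _, Vpt2 = np.linalg.svd(P)
S_rec = (Up2 @ R) @ (C.T @ Vpt2)
print(np.allclose(S_rec, S))       # True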

Keywords: Image cryptography, Singular value decomposition.

610 Watermarking Scheme for Color Images using Wavelet Transform based Texture Properties and Secret Sharing

Authors: Nagaraj V. Dharwadkar, B.B.Amberker

Abstract:

In this paper, a new secure watermarking scheme for color images is proposed. It splits the watermark into two shares using a (2, 2)-threshold Visual Cryptography Scheme (VCS) with an adaptive order-dithering technique, and embeds one share into a highly textured subband of the luminance channel of the color image. The other share is used as the key and is available only to the super-user or the author of the image; only the super-user can reveal the original watermark. The scheme is dynamic in the sense that, to maintain perceptual similarity between the original and the watermarked image, the selected subband coefficients are modified by varying the watermark scaling factor. Experimental results demonstrate the effectiveness of the proposed scheme, which is able to resist all common attacks even at strong amplitudes.

Keywords: VCS, Dithering, HVS, DWT.

609 Optimal One Bit Time Reversal For UWB Impulse Radio In Multi-User Wireless Communications

Authors: Hung Tuan Nguyen

Abstract:

In this paper, with the purpose of further reducing system complexity while keeping its temporal and spatial focusing performance, we investigate the possibility of using an optimal one-bit time reversal (TR) system for impulse radio ultra-wideband multi-user wireless communications. The results show that, by optimally selecting the number of taps used in the pre-filter, the optimal one-bit TR system can outperform the full one-bit TR system. In some cases, the temporal and spatial focusing performance of the optimal one-bit TR system is comparable with that of the original TR system. This is a significant result, as the overhead cost is much lower than that required by the original TR system.

Keywords: Time reversal, optimal one bit, UWB, multi-user interference, inter-symbol interference.

608 An Efficient Proxy Signature Scheme Over a Secure Communications Network

Authors: H. El-Kamchouchi, Heba Gaber, Fatma Ahmed, Dalia H. El-Kamchouchi

Abstract:

A proxy signature scheme permits an original signer to delegate his or her signing capability to a proxy signer, who then generates a signature on a message on behalf of the original signer. The two parties must be able to authenticate one another and agree on a secret encryption key in order to communicate securely over an unreliable public network. Authenticated key agreement protocols play an important role in building a secure communications channel between the two parties. In this paper, we present a secure proxy signature scheme built over an efficient and secure authenticated key agreement protocol based on the discrete logarithm problem.

Keywords: Proxy signature, warrant partial delegation, key agreement, discrete logarithm.

607 A Visual Cryptography and Statistics Based Method for Ownership Identification of Digital Images

Authors: Ching-Sheng Hsu, Young-Chang Hou

Abstract:

In this paper, a novel copyright protection scheme for digital images based on Visual Cryptography and Statistics is proposed. In our scheme, the theories and properties of sampling distribution of means and visual cryptography are employed to achieve the requirements of robustness and security. Our method does not need to alter the original image and can identify the ownership without resorting to the original image. Besides, our method allows multiple watermarks to be registered for a single host image without causing any damage to other hidden watermarks. Moreover, it is also possible for our scheme to cast a larger watermark into a smaller host image. Finally, experimental results will show the robustness of our scheme against several common attacks.

Keywords: Copyright protection, digital watermarking, sampling distribution, visual cryptography.

606 Principal Component Analysis for the Characterization in the Application of Some Soil Properties

Authors: Kamolchanok Panishkan, Kanokporn Swangjang, Natdhera Sanmanee, Daoroong Sungthong

Abstract:

The objective of this research is to study principal component analysis for the classification of 67 soil samples collected from different agricultural areas in the western part of Thailand. Six soil properties were measured on the samples and used as the original variables, and principal component analysis was applied to reduce their number. A model based on the first two principal components accounts for 72.24% of the total variance. Score plots of the first two principal components were mapped to agricultural areas divided into horticulture, field crops and wetland. The results showed some relationships between soil properties and agricultural areas, and PCA proved to be a useful tool for classifying agricultural areas on the basis of soil properties.
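
A minimal scikit-learn sketch of this PCA step; the random matrix stands in for the 67 x 6 table of measured soil properties, which is only available in the paper.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.random((67, 6))                      # placeholder for 67 samples x 6 soil properties

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print("variance explained by PC1+PC2:", pca.explained_variance_ratio_.sum())
# A score plot of scores[:, 0] vs scores[:, 1], labelled by land use (horticulture, field crop,
# wetland), is what the authors use to relate soil properties to agricultural areas.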

Keywords: soil organic matter, soil properties, classification, principal components

605 Principal Component Analysis-Ranking as a Variable Selection Method for the Simultaneous Spectrophotometric Determination of Phenol, Resorcinol and Catechol in Real Samples

Authors: Nahid Ghasemi, Mohammad Goodarzi, Morteza Khosravi

Abstract:

The simultaneous determination of phenol, resorcinol and catechol using a chemometric technique, a PC-ranking artificial neural network (PC-ranking-ANN) algorithm, is reported in this study. Based on the data correlation coefficient method, 3 representative PCs are selected from the scores of the original UV spectral data (35 PCs) as the input patterns for an ANN model. Results were obtained after 8000 training iterations. The RMSEP values for phenol, resorcinol and catechol with PC-ranking-ANN were 0.6680, 0.0766 and 0.1033, respectively, and the calibration ranges were 0.50-21.0, 0.50-15.1 and 0.50-20.0 μg/mL, respectively. The proposed method was successfully applied to the determination of phenol, resorcinol and catechol in synthetic and water samples.
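
One common reading of PC-ranking is sketched below in Python: rank the principal component scores by their correlation with the analyte concentration and keep the most correlated ones as ANN inputs. The data, the number of retained PCs and the network size are placeholders, not the paper's settings.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
spectra = rng.random((40, 35))      # stand-in for 40 mixture spectra at 35 wavelengths
conc = rng.random(40)               # stand-in concentrations of one analyte

scores = PCA().fit_transform(spectra)                                   # all PC scores
corr = [abs(np.corrcoef(scores[:, i], conc)[0, 1]) for i in range(scores.shape[1])]
top3 = np.argsort(corr)[::-1][:3]                                       # keep the 3 most correlated PCs
ann = MLPRegressor(hidden_layer_sizes=(5,), max_iter=8000, random_state=0)
ann.fit(scores[:, top3], conc)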

Keywords: Phenol, Resorcinol, Catechol, Principal component ranking, Artificial Neural Network, Chemometrics.

604 Image Restoration in Non-Linear Filtering Domain using MDB approach

Authors: S. K. Satpathy, S. Panda, K. K. Nagwanshi, C. Ardil

Abstract:

This paper proposes a new technique based on a nonlinear Minmax Detector Based (MDB) filter for image restoration. The aim of image enhancement is to reconstruct the true image from the corrupted image. The process of image acquisition frequently leads to degradation, and the quality of the digitized image becomes inferior to the original. Image degradation can be due to the addition of different types of noise, and impulse noise is one of them: it generates pixels with gray values inconsistent with their local neighborhood, appearing as a sprinkle of light and dark (or only light) spots in the image. Filtering is a technique for enhancing the image. In linear filtering, the value of an output pixel is a linear combination of neighborhood values, which can blur the image, so a variety of nonlinear smoothing techniques have been developed. The median filter is one of the most popular nonlinear filters; it is highly efficient for small neighborhoods, but for large windows and high noise densities it introduces more blurring. The Centre Weighted Mean (CWM) filter has a better average performance than the median filter; however, under high-noise conditions the corruption of the original (centre) pixel becomes significant and noise reduction deteriorates, so this technique also has a blurring effect on the image. To illustrate the superiority of the proposed approach, the new scheme has been simulated alongside the standard ones and various restoration performance measures have been compared.
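
For reference, here is a minimal NumPy sketch of a centre-weighted median filter, the usual form of the CWM filter in the impulse-noise literature (the abstract expands CWM as "mean"); a plain median filter is the special case centre_weight = 1. The proposed MDB filter itself is not reproduced here.

import numpy as np

def cwm_filter(image, window=3, centre_weight=3):
    # Centre-weighted median: the centre pixel is repeated `centre_weight` times before taking
    # the median of the window, so it is better preserved than in a plain median filter.
    pad = window // 2
    padded = np.pad(image, pad, mode='edge')
    out = np.empty(image.shape, dtype=float)
    for r in range(image.shape[0]):
        for c in range(image.shape[1]):
            win = padded[r:r + window, c:c + window].ravel().tolist()
            win += [image[r, c]] * (centre_weight - 1)
            out[r, c] = np.median(win)
    return out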

Keywords: Filtering, Minmax Detector Based (MDB), noise, centre weighted mean filter, PSNR, restoration.

603 Removal of Copper and Zinc Ions onto Biomodified Palm Shell Activated Carbon

Authors: Gulnaziya Issabayeva, Mohamed Kheireddine Aroua

Abstract:

Commercially produced Malaysian granular palm shell activated carbon (PSAC) was biomodified with bacterial biomass (Bacillus subtilis) to produce a hybrid biosorbent of higher efficiency. The obtained biosorbent was evaluated in terms of its capacity to adsorb copper and zinc ions from aqueous solutions. The adsorption capacity was evaluated in batch experiments with metal ion concentrations from 20 to 350 mg/L and solution pH values from 3 to 6, and the Langmuir adsorption model was used to interpret the experimental data. Comparison of the adsorption data of the biomodified and original palm shell activated carbon showed higher metal ion uptake by the hybrid biosorbent, and uptake increased with increasing solution pH. The surface characterization data indicated a decrease in total surface area for the hybrid biosorbent; nevertheless, its uptake of copper and zinc was at least equal to that of the original PSAC at pH 4 and 5. The highest capacity of the hybrid biosorbent was observed at pH 5, at 22 mg/g for copper and 19 mg/g for zinc, while the adsorption capacity at the lowest pH of 3 was significantly low. The experimental results facilitated identification of potential factors influencing the adsorption of copper and zinc onto biomodified and original palm shell activated carbon.
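
The Langmuir interpretation step can be sketched as a simple nonlinear fit in Python; the equilibrium data points below are placeholders, not the paper's measurements.

import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, b):
    # Langmuir isotherm: uptake q (mg/g) as a function of equilibrium concentration Ce (mg/L).
    return qmax * b * Ce / (1.0 + b * Ce)

Ce = np.array([20.0, 50.0, 100.0, 200.0, 350.0])    # illustrative equilibrium concentrations
q = np.array([9.0, 14.0, 18.0, 20.5, 21.5])         # illustrative measured uptakes

(qmax, b), _ = curve_fit(langmuir, Ce, q, p0=[20.0, 0.05])
print(f"qmax = {qmax:.1f} mg/g, b = {b:.3f} L/mg")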

Keywords: Adsorption, biomodification, copper, zinc, palm shell carbon.

602 Measurement Scheme Improving for State Estimation Using Stochastic Tabu Search

Authors: T. Kerdchuen

Abstract:

This paper proposes stochastic tabu search (STS) for improving the measurement scheme for power system state estimation. If the original measurement scheme is not observable, a minimum number of additional measurements is added to the system by STS so that no critical measurement pair remains. Random bit-flipping and bit-exchanging perturbations are used to generate the neighborhood solutions in STS, and the Pδ observability concept is used to determine network observability. Test results for the 10-bus, IEEE 14-bus and 30-bus systems show that STS can improve the original measurement scheme to be observable without critical measurement pairs. Moreover, the results of STS are superior to deterministic tabu search (DTS) in terms of best-solution hits.
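
A hedged Python sketch of the two perturbation operators named above, acting on a 0/1 vector over candidate measurement locations; the tabu list, the Pδ observability test and the critical-pair check are not reproduced here.

import random

def flip_neighbour(x):
    # Random bit flipping: toggle whether one candidate measurement is installed.
    y = x[:]
    i = random.randrange(len(y))
    y[i] = 1 - y[i]
    return y

def exchange_neighbour(x):
    # Random bit exchanging: move one installed measurement to an empty candidate location,
    # so the total number of added measurements is preserved.
    ones = [i for i, v in enumerate(x) if v == 1]
    zeros = [i for i, v in enumerate(x) if v == 0]
    y = x[:]
    if ones and zeros:
        y[random.choice(ones)], y[random.choice(zeros)] = 0, 1
    return y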

Keywords: Measurement Scheme, Power System State Estimation, Network Observability, Stochastic Tabu Search (STS).

601 Factoring a Polynomial with Multiple-Roots

Authors: Feng Cheng Chang

Abstract:

A given polynomial, possibly with multiple roots, is factored into several lower-degree distinct-root polynomials raised to natural-number powers. All the roots of the original polynomial, including their multiplicities, may then be obtained by solving these lower-degree distinct-root polynomials instead of the original high-degree multiple-root polynomial directly. The approach requires polynomial Greatest Common Divisor (GCD) computation. A very simple and effective process, "monic polynomial subtraction", adapted from the longhand polynomial division of the Euclidean algorithm, is employed; it requires only elementary arithmetic operations without any advanced mathematics. The derived routine gives the expected results even for test polynomials of very high degree, such as p(x) = (x+1)^1000.
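
The GCD-based splitting can be reproduced compactly with exact polynomial arithmetic; the hedged SymPy sketch below uses the standard square-free decomposition by repeated GCDs rather than the authors' own monic-subtraction GCD routine.

import sympy as sp

x = sp.symbols('x')

def distinct_root_factors(p):
    # Splits p into f1^1 * f2^2 * ... where every fi has only simple (distinct) roots.
    p = sp.Poly(p, x)
    g = p.gcd(p.diff(x))        # carries each root with multiplicity reduced by one
    w = p.quo(g)                # product of the distinct roots of p
    factors = []
    while w.degree() > 0:
        y = w.gcd(g)
        factors.append(w.quo(y).as_expr())   # roots whose multiplicity equals the current level
        w, g = y, g.quo(y)
    return factors

p = sp.expand((x + 1)**3 * (x - 2)**2 * (x - 5))
print(distinct_root_factors(p))              # [x - 5, x - 2, x + 1]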

Keywords: Polynomial roots, greatest common divisor, Longhand polynomial division, Euclidean GCD Algorithm.

600 Efficient CT Image Volume Rendering for Diagnosis

Authors: HaeNa Lee, Sun K. Yoo

Abstract:

Volume rendering is widely used in medical CT image visualization, and applying 3D visualization to diagnostic applications can require accurate volume rendering at high resolution. Interpolation is important in medical image processing applications such as image compression or volume resampling; however, it can distort the original image data through edge blurring or blocking effects when image enhancement procedures are applied. In this paper, we propose an adaptive tension control method that exploits gradient information to achieve high-resolution medical image enhancement in volume visualization, so that the restored images remain as similar as possible to the originals. The experimental results show that the proposed adaptive tension control improves image quality.

Keywords: Tension control, Interpolation, Ray-casting, Medical imaging analysis.

599 A Modified Spiral Search Algorithm and Its Embedded System Architecture Design

Authors: Nikolaos Kroupis, Minas Dasygenis, Dimitrios Soudris, Antonios Thanailakis

Abstract:

Multimedia devices are one of the fastest-growing areas in the embedded community. Such devices incorporate a number of complicated functions, such as motion estimation, and a multitude of implementations have been proposed to reduce motion estimation complexity, spiral search among them. We have studied implementations of spiral search and identified areas of improvement. We propose a modified spiral search algorithm with lower computational complexity than the original spiral search, and we have implemented it on an embedded ARM-based architecture with a custom memory hierarchy. The resulting system yields an energy consumption reduction of up to 64% and a performance increase of up to 77%, with a small average penalty of 2.3 dB in video quality compared with the original spiral search algorithm.
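
A minimal Python sketch of spiral-ordered block matching with SAD is given below; the early-stop threshold is only one illustrative way to cut computations and is not the authors' specific modification.

import numpy as np

def spiral_offsets(radius):
    # Candidate motion vectors ordered by distance from the search centre: the essence of spiral
    # search is that small, likely displacements are evaluated first.
    offsets = [(dx, dy) for dx in range(-radius, radius + 1)
                        for dy in range(-radius, radius + 1)]
    return sorted(offsets, key=lambda v: (abs(v[0]) + abs(v[1]), v))

def spiral_search(ref, cur, bx, by, block=16, radius=7, early_stop=64):
    target = cur[by:by + block, bx:bx + block].astype(np.int32)
    best, best_mv = None, (0, 0)
    for dx, dy in spiral_offsets(radius):
        x, y = bx + dx, by + dy
        if 0 <= x <= ref.shape[1] - block and 0 <= y <= ref.shape[0] - block:
            sad = int(np.abs(ref[y:y + block, x:x + block].astype(np.int32) - target).sum())
            if best is None or sad < best:
                best, best_mv = sad, (dx, dy)
                if sad < early_stop:      # illustrative early termination
                    break
    return best_mv, best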

Keywords: Spiral Search, Motion Estimation, Embedded Systems, Low Power

598 Robust Features for Impulsive Noisy Speech Recognition Using Relative Spectral Analysis

Authors: Hajer Rahali, Zied Hajaiej, Noureddine Ellouze

Abstract:

The goal of speech parameterization is to extract the relevant information about what is being spoken from the audio signal. In speech recognition systems, Mel-Frequency Cepstral Coefficients (MFCC) and Relative Spectral Mel-Frequency Cepstral Coefficients (RASTA-MFCC) are the two main techniques used. This paper presents some modifications to the original MFCC method. The effectiveness of the proposed changes to MFCC, called Modified Function Cepstral Coefficients (MODFCC), was tested and compared against the original MFCC and RASTA-MFCC features. Prosodic features such as jitter and shimmer are added to the baseline spectral features. The above-mentioned techniques were tested on impulsive signals under various noisy conditions from the AURORA databases.
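
A librosa-based Python sketch of appending crude prosodic estimates to baseline MFCCs is shown below; the bundled example audio, the yin-based jitter and the RMS-based shimmer are rough illustrative stand-ins, not the paper's MODFCC features.

import numpy as np
import librosa

y, sr = librosa.load(librosa.ex('trumpet'), sr=None)   # a speech file would be used in practice

# Baseline spectral features: 13 MFCCs per frame.
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

# Crude frame-level prosodic features (rough approximations for illustration only).
f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr)
periods = 1.0 / f0[np.isfinite(f0) & (f0 > 0)]
jitter = np.mean(np.abs(np.diff(periods))) / np.mean(periods)    # relative period perturbation
rms = librosa.feature.rms(y=y)[0]
shimmer = np.mean(np.abs(np.diff(rms))) / np.mean(rms)           # relative amplitude perturbation

features = np.vstack([mfcc,
                      np.full((1, mfcc.shape[1]), jitter),
                      np.full((1, mfcc.shape[1]), shimmer)])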

Keywords: Auditory filter, impulsive noise, MFCC, prosodic features, RASTA filter.

597 Hybrid Method Using Wavelets and Predictive Method for Compression of Speech Signal

Authors: Karima Siham Aoubid, Mohamed Boulemden

Abstract:

The development of signal compression algorithms is making steady progress: these algorithms are continuously improved by new tools and aim to reduce, on average, the number of bits necessary to represent the signal while minimizing the reconstruction error. This article proposes the compression of Arabic speech signals by a hybrid method combining the wavelet transform and linear prediction. The adopted approach rests, on one hand, on decomposing the original signal with analysis filters, followed by the compression stage, and, on the other hand, on applying linear prediction of order 5 to the compressed signal coefficients. The aim of this approach is the estimation of the prediction error, which is coded and transmitted; the decoding operation is then used to reconstruct the original signal. An adequate choice of the filter bank used for the transform is therefore necessary to increase the compression rate while keeping the distortion imperceptible from an auditory point of view.

Keywords: Compression, linear prediction analysis, multiresolution analysis, speech signal.

596 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test

Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman

Abstract:

At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a windows interface. The new version of FLIKE incorporates the multiple Grubbs and Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs and Beck test and the multiple Grubbs and Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using FLIKE. It has been found that the multiple Grubbs and Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution used with the original Grubbs and Beck test; between these two methods, differences in flood quantile estimates of up to 61% were found for the six study catchments. It has also been found that the GEV distribution (with L moments) and the LP3 distribution with the multiple Grubbs and Beck test provide quite similar results in most cases; however, a difference of up to 38% was noted in flood quantiles for an annual exceedance probability (AEP) of 1 in 100 for one catchment. This finding needs to be confirmed with a greater number of stations across other Australian states.
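
For context, a hedged Python sketch of the original (single) Grubbs-Beck low-outlier criterion in log space follows; the K_N constants are the commonly quoted 10%-significance approximation and should be checked against Bulletin 17B before use.

import numpy as np

def grubbs_beck_low_outliers(flows):
    # Single Grubbs-Beck test: annual peaks below the threshold are flagged as potentially
    # influential low flows.  Treat the K_N constants as an approximation.
    x = np.log10(np.asarray(flows, dtype=float))
    n = len(x)
    kn = -0.9043 + 3.345 * np.sqrt(np.log10(n)) - 0.4046 * np.log10(n)
    threshold = 10 ** (x.mean() - kn * x.std(ddof=1))
    return threshold, [q for q in flows if q < threshold]

# The multiple Grubbs-Beck test used in FLIKE sweeps a similar criterion over the k smallest
# observations, so that more than one low outlier can be detected.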

Keywords: Floods, FLIKE, probability distributions, flood frequency, outlier.

595 Distortion Estimation in Digital Image Watermarking using Genetic Programming

Authors: Labiba Gilani, Asifullah Khan, Anwar M. Mirza

Abstract:

This paper introduces a technique for distortion estimation in image watermarking using Genetic Programming (GP). The distortion is estimated by treating the problem of obtaining a distorted watermarked signal from the original watermarked signal as a function regression problem, which is solved using GP with the original watermarked signal as the independent variable. The GP-based distortion estimation scheme is evaluated for Gaussian and JPEG compression attacks: Gaussian attacks of different strengths are obtained by changing the standard deviation, and the JPEG compression attack is varied by introducing different levels of distortion. Experimental results demonstrate that the proposed technique is able to detect the watermark even under strong distortions and is more robust against attacks.

Keywords: Blind Watermarking, Genetic Programming (GP), Fitness Function, Discrete Cosine Transform (DCT).

594 Acausal and Causal Model Construction with FEM Approach Using Modelica

Authors: Oke Oktavianty, Tadayuki Kyoutani, Shigeyuki Haruyama, Junji Kaneko, Ken Kaminishi

Abstract:

Modelica has many advantages and is very useful for modeling and simulation, especially for multi-domain modeling of complex technical systems. However, the big obstacle for a beginner is to understand the basic concepts and to build a new model of a real system. In order to explain how a simple circuit model is solved by hand translation and to give a better understanding of how Modelica works, we provide a detailed explanation of the solver ordering system (horizontal and vertical sorting) and make some proposals for improvement. In this study, some difficulties in using the Modelica software with its original concept are discussed and compared with a Finite Element Method (FEM) approach. We also present our textual modeling approach using the FEM concept for acausal and causal model construction. Furthermore, simulation results are provided that compare textual modeling with the original Modelica coding and with the FEM concept.

Keywords: FEM, acausal model, Modelica, horizontal and vertical sorting.

593 Investigation of New Gait Representations for Improving Gait Recognition

Authors: Chirawat Wattanapanich, Hong Wei

Abstract:

This study presents new gait representations for improving gait recognition accuracy across gait appearances such as normal walking, wearing a coat and carrying a bag. Based on the Gait Energy Image (GEI), two ideas are implemented to generate new gait representations: one is to append lower-knee regions to the original GEI, and the other is to apply convolutional operations to the GEI and its variants. A set of new gait representations is created and used to train multi-class Support Vector Machines (SVMs). Tests are conducted on CASIA Dataset B, examining various combinations of the gait representations with different convolutional kernel sizes and different numbers of kernels. Both the entire images as features and features reduced in dimensionality by Principal Component Analysis (PCA) are tested. Interestingly, both new techniques, appending the lower-knee regions to the original GEI and the convolutional GEI, contribute significantly to the performance improvement: the experimental results show that the average recognition rate can be improved from 75.65% to 87.50%.
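
A hedged Python sketch of the GEI, a convolutional GEI variant and the PCA+SVM step is given below; the kernel size, the random placeholder silhouettes and the binary labels are illustrative only, whereas the paper works on CASIA Dataset B with many kernel configurations and the appended lower-knee regions.

import numpy as np
from scipy.ndimage import convolve
from sklearn.decomposition import PCA
from sklearn.svm import SVC

def gait_energy_image(silhouettes):
    # GEI: pixel-wise average of aligned binary silhouettes over a gait cycle.
    return np.mean(np.stack(silhouettes).astype(float), axis=0)

def convolutional_gei(gei, kernel_size=5):
    # One new representation: the GEI passed through an averaging convolution kernel.
    kernel = np.ones((kernel_size, kernel_size)) / kernel_size ** 2
    return convolve(gei, kernel, mode='nearest')

rng = np.random.default_rng(0)
gei_features = []
for subject in range(4):                                     # placeholder silhouettes, 64 x 44 each
    silh = [rng.integers(0, 2, (64, 44)) for _ in range(20)]
    gei_features.append(convolutional_gei(gait_energy_image(silh)).ravel())
X, y = np.array(gei_features), np.array([0, 0, 1, 1])

# Dimensionality reduction by PCA followed by a (here binary) SVM classifier.
clf = SVC(kernel='linear').fit(PCA(n_components=2).fit_transform(X), y)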

Keywords: Convolutional image, lower knee, gait.

592 Unsupervised Feature Selection Using Feature Density Functions

Authors: Mina Alibeigi, Sattar Hashemi, Ali Hamzeh

Abstract:

Since dealing with high-dimensional data is computationally complex and sometimes even intractable, several feature reduction methods have recently been developed to reduce the dimensionality of the data and simplify analysis in applications such as text categorization, signal processing, image retrieval and gene expression analysis. Among feature reduction techniques, feature selection is one of the most popular methods because it preserves the original features. In this paper, we propose a new unsupervised feature selection method that removes redundant features from the original feature space by using the probability density functions of the individual features. To show the effectiveness of the proposed method, popular feature selection methods have been implemented and compared. Experimental results on several datasets from the UCI repository illustrate the effectiveness of the proposed method in comparison with the other methods in terms of both classification accuracy and the number of selected features.

Keywords: Feature, Feature Selection, Filter, Probability Density Function

591 The Effect of Granule Size on the Digestibility of Wheat Starch Using an in vitro Model

Authors: Mee-Lin Lim Chai Teo, Darryl M. Small

Abstract:

Wheat has a bimodal starch granule population, and the dependence of the rate of enzymatic hydrolysis on particle size has been investigated. Ungelatinised wheat starch granules were separated into two populations by sedimentation and decantation. Particle size was analysed by laser diffraction and morphological characteristics were viewed using SEM. The sedimentation technique, though lengthy, gave satisfactory separation of the granules. Samples (<10 μm, >10 μm and original) were digested with α-amylase using a dialysis model. Granules of <10 μm showed a significantly higher rate of reducing sugar release than those >10 μm (p<0.05), whereas the rate was not significantly different between the original sample and granules >10 μm. The digestion rate was therefore dependent on particle size, with smaller granules producing a higher rate of release. The methodology and results reported here can be used as a basis for further evaluations designed to delay the release of glucose during the digestion of native starches.

Keywords: in vitro digestion, α-amylase, wheat starch, granule size.

590 Damage Assessment and Repair for Older Brick Buildings

Authors: Tim D. Sass

Abstract:

The experience of engineers and architects practicing today is typically limited to current building code requirements and modern construction methods and materials. However, many cities have a mix of new and old buildings, with many buildings constructed over one hundred years ago when building codes and construction methods were very different. When a brick building sustains damage, a structural engineer is often hired to determine the cause of the damage as well as the necessary repairs. Forensic studies of dozens of brick buildings show that an appreciation of historical building methods and materials is needed to correctly identify the cause of damage and design an appropriate repair. Damage to an older brick building can be mistakenly attributed to storms or seismic events when the real source of the damage is deficient original construction. Assessing and remediating damaged brickwork on older brick buildings therefore requires an understanding of the original construction, of older repair methods, and of current building code requirements.

Keywords: Brick, damage, deterioration, facade.

589 Mathematical Programming on Multivariate Calibration Estimation in Stratified Sampling

Authors: Dinesh Rao, M.G.M. Khan, Sabiha Khan

Abstract:

Calibration estimation is a method of adjusting the original design weights to improve the survey estimates by using auxiliary information such as the known population total (or mean) of the auxiliary variables. A calibration estimator uses calibrated weights that are determined to minimize a given distance measure to the original design weights while satisfying a set of constraints related to the auxiliary information. In this paper, we propose a new multivariate calibration estimator for the population mean in the stratified sampling design, which incorporates information available for more than one auxiliary variable. The problem of determining the optimum calibrated weights is formulated as a Mathematical Programming Problem (MPP) that is solved using the Lagrange multiplier technique.
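
For orientation, the classical single-sample calibration problem with the chi-square distance and its Lagrange-multiplier solution (Deville-Särndal form) is sketched below; the paper's multivariate, stratified formulation generalizes this MPP and is not reproduced here.

\[
\min_{w}\ \sum_{i\in s}\frac{(w_i-d_i)^2}{d_i q_i}
\quad\text{subject to}\quad
\sum_{i\in s} w_i \mathbf{x}_i = \mathbf{X},
\]
\[
w_i = d_i\left(1+q_i\,\mathbf{x}_i^{\top}\boldsymbol{\lambda}\right),\qquad
\boldsymbol{\lambda}=\Big(\sum_{i\in s} d_i q_i\,\mathbf{x}_i\mathbf{x}_i^{\top}\Big)^{-1}
\Big(\mathbf{X}-\sum_{i\in s} d_i\mathbf{x}_i\Big),
\]

where d_i are the design weights, q_i are user-specified scale factors, x_i is the auxiliary vector for unit i and X is its known population total.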

Keywords: Calibration estimation, Stratified sampling, Multivariate auxiliary information, Mathematical programming problem, Lagrange multiplier technique.

588 A Novel Reversible Watermarking Method based on Adaptive Thresholding and Companding Technique

Authors: Nisar Ahmed Memon

Abstract:

Embedding and extraction of secret information, as well as restoration of the original un-watermarked image, is highly desirable in sensitive applications such as military, medical, and law enforcement imaging. This paper presents a novel reversible data-hiding method for digital images using the integer-to-integer wavelet transform and a companding technique, which can embed and recover the secret information and restore the image to its pristine state. The method takes advantage of block-based watermarking and iterative optimization of the companding threshold, which avoids histogram pre- and post-processing and consequently reduces the overhead usually required by most reversible watermarking techniques. As a result, the distortion between the marked and original images remains small. Experimental results show that the proposed method outperforms the existing reversible data-hiding schemes reported in the literature.

Keywords: Adaptive Thresholding, Companding Technique, Integer Wavelet Transform, Reversible Watermarking

587 Automatic Rearrangement of Localized Graphical User Interface

Authors: Ágoston Winkler, Sándor Juhász

Abstract:

The localization of software products is essential for reaching users in the international market, and an important task for this is the translation of the user interface into the local national language. As graphical interfaces are usually optimized for the size of the texts in the original language, after translation certain user controls (e.g. text labels and buttons in dialogs) may grow in such a manner that they overlap each other. This not only causes an unpleasant appearance but also makes the program more difficult (or even impossible) to use, which means that the arrangement of the controls must be corrected afterwards. The correction should preserve the original structure of the interface (e.g. the relationship between logically coherent controls); furthermore, it is important to keep the nicely proportioned design by avoiding the formation of large empty areas. This paper describes an algorithm that automatically rearranges the controls of a graphical user interface based on these principles. The algorithm has been implemented and integrated into a translation support system and produced results pleasing to the human eye in most test cases.

Keywords: Graphical user interface, GUI, natural languages, software localization, translation support systems.
