Search results for: Voice activation detection algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4836


546 Modified Fuzzy ARTMAP and Supervised Fuzzy ART: Comparative Study with Multispectral Classification

Authors: F.Alilat, S.Loumi, H.Merrad, B.Sansal

Abstract:

This article presents a modification of the fuzzy ART network algorithm that makes it supervised. It consists of searching for the comparison, training and vigilance parameters that minimize the quadratic distances between the outputs of the training base and those produced by the network. The same process is applied to determine the parameters of the fuzzy ARTMAP that yield the most powerful network. The modification consists in having the fuzzy ARTMAP learn a base of examples not just once, as is customary, but as many times as its architecture keeps evolving or the objective error has not been reached. In this way, we need not worry about the values to impose on the eight parameters of the network. To evaluate each of these three modified networks, their performances are compared. As an application, we classified an image of the bay of Algiers taken by SPOT XS. The evaluation criteria are the training duration, the mean square error (MSE) on the control set and the rate of correct classification per class. The results of this study, presented as curves, tables and images, show that the modified fuzzy ARTMAP offers the best compromise between quality and computing time.
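
As a point of reference for the modification described above, the following is a minimal sketch of the standard unsupervised fuzzy ART loop (complement coding, choice function, vigilance test and learning rule), written in Python with NumPy. The parameter names alpha, beta and rho are the usual textbook ones, and the vigilance test is folded into the category search for brevity; this is not the authors' supervised, eight-parameter variant.

```python
import numpy as np

def fuzzy_art(patterns, alpha=0.001, beta=1.0, rho=0.75):
    """Simplified unsupervised fuzzy ART with complement coding.

    patterns: array of shape (n_samples, d) with values in [0, 1].
    Returns the category assigned to each pattern and the category weights.
    """
    X = np.hstack([patterns, 1.0 - patterns])      # complement coding -> length 2d
    weights, labels = [], []                       # one weight vector per category
    for I in X:
        best, best_T = None, -np.inf
        for j, w in enumerate(weights):
            m = np.minimum(I, w)                   # fuzzy AND
            T = m.sum() / (alpha + w.sum())        # choice function
            if m.sum() / I.sum() >= rho and T > best_T:   # vigilance test + best choice
                best, best_T = j, T
        if best is None:                           # no category passes: create a new one
            weights.append(I.copy())
            best = len(weights) - 1
        else:                                      # learning update (beta = 1 -> fast learning)
            w = weights[best]
            weights[best] = beta * np.minimum(I, w) + (1 - beta) * w
        labels.append(best)
    return np.array(labels), np.array(weights)

# toy multispectral-like pixels (3 bands scaled to [0, 1])
pixels = np.random.default_rng(0).random((200, 3))
labels, W = fuzzy_art(pixels, rho=0.8)
print(len(W), "categories found")
```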

Keywords: Neural Networks, fuzzy ART, fuzzy ARTMAP, Remote sensing, multispectral Classification.

545 Adaptive Kernel Principal Analysis for Online Feature Extraction

Authors: Mingtao Ding, Zheng Tian, Haixia Xu

Abstract:

The batch nature limits standard kernel principal component analysis (KPCA) methods in numerous applications, especially for dynamic or large-scale data. In this paper, an efficient adaptive approach is presented for online extraction of the kernel principal components (KPC). The contribution of this paper may be divided into two parts. First, the kernel covariance matrix is correctly updated to adapt to the changing characteristics of the data. Second, the KPC are recursively formulated to overcome the batch nature of standard KPCA. This formulation is derived from the recursive eigen-decomposition of the kernel covariance matrix and indicates the KPC variation caused by the new data. The proposed method not only alleviates the sub-optimality of the KPCA method for non-stationary data, but also maintains constant update speed and memory usage as the data size increases. Experiments on simulated data and real applications demonstrate that our approach yields improvements in terms of both computational speed and approximation accuracy.
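
For contrast with the recursive formulation, here is a sketch of the batch KPCA baseline the paper improves on: an eigendecomposition of the centered Gram matrix, whose O(n^2) memory and O(n^3) time per update are exactly what an online recursion avoids. The RBF kernel and function names are illustrative.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def batch_kpca(X, n_components=2, gamma=1.0):
    """Standard batch KPCA: eigendecompose the doubly centered Gram matrix."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one      # double centering
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]     # largest eigenvalues first
    vals, vecs = vals[idx], vecs[:, idx]
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))  # normalized dual coefficients
    return Kc @ alphas                              # projections of the training data

X = np.random.default_rng(1).normal(size=(300, 5))
Z = batch_kpca(X, n_components=3, gamma=0.5)
print(Z.shape)   # (300, 3) kernel principal components
```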

Keywords: adaptive method, kernel principal component analysis, online extraction, recursive algorithm

544 ANN Based Currency Recognition System using Compressed Gray Scale and Application for Sri Lankan Currency Notes - SLCRec

Authors: D. A. K. S. Gunaratna, N. D. Kodikara, H. L. Premaratne

Abstract:

Automatic currency note recognition invariably depends on the currency note characteristics of a particular country, and the extraction of features directly affects the recognition ability. Sri Lanka has not been involved in any research or implementation of this kind. The proposed system, "SLCRec", offers a solution focused on minimizing the false rejection of notes. Sri Lankan currency notes undergo severe changes in image quality with use. Hence, a special linear transformation function is adopted to wipe out noise patterns from the backgrounds without affecting the notes' characteristic images, and to restore the images of interest. The transformation maps the original gray scale range into the smaller range of 0 to 125. Applying edge detection after the transformation provides better robustness to noise and a fair representation of edges for both new and old damaged notes. A three-layer back-propagation neural network is fed with the number of edges detected in row order of the notes, and classification is performed over four classes of interest: the 100, 500, 1000 and 2000 rupee notes. The experiments showed good classification results and proved that the proposed methodology is capable of separating the classes properly under varying image conditions.
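
A minimal sketch of the two preprocessing steps described above: linearly compressing the gray range into 0-125 and counting detected edges per image row as the feature vector fed to the three-layer network. The edge threshold and the synthetic image are illustrative assumptions, not values from the paper.

```python
import numpy as np

def compress_gray(img, out_max=125):
    """Linearly map an 8-bit grayscale image into the smaller range [0, out_max]."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    return ((img - lo) / max(hi - lo, 1e-9) * out_max).astype(np.uint8)

def edges_per_row(img, thresh=20):
    """Count horizontal intensity jumps per row as a crude edge-count feature vector."""
    diff = np.abs(np.diff(img.astype(np.int16), axis=1))
    return (diff > thresh).sum(axis=1)

# a synthetic stand-in for a scanned banknote image
note = np.random.default_rng(2).integers(0, 256, size=(120, 260), dtype=np.uint8)
feat = edges_per_row(compress_gray(note))
print(feat.shape)   # one edge count per image row -> input vector for the 3-layer MLP
```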

Keywords: Artificial intelligence, linear transformation and pattern recognition.

543 Oral Examination: An Important Adjunct to the Diagnosis of Dermatological Disorders

Authors: Sanjay Saraf

Abstract:

The oral cavity can be the site of early manifestations of mucocutaneous disorders (MD) or the only site of occurrence of these disorders. It can also exhibit oral lesions with simultaneous associated skin lesions. MD involving the oral mucosa commonly present with signs such as ulcers, vesicles and bullae. The unique environment of the oral cavity may modify these signs of disease, making clinical diagnosis an arduous task. In addition to the unique environment of the oral cavity, the overlapping of the signs of various mucocutaneous disorders makes clinical diagnosis even more intricate. The aim of this review is to present the oral signs of dermatological disorders with common oral involvement and to emphasize their importance in the early detection of systemic disorders. The aim is also to highlight the necessity of an oral examination by a dermatologist when examining skin lesions. Prior to the oral examination, it is imperative that dermatologists and dental clinicians have knowledge of oral anatomy. It is also important to know the impact of various diseases on the oral mucosa and the characteristic features of the various oral mucocutaneous lesions. An initial clinical oral examination may help in the early diagnosis of MD. Failure to identify the oral manifestations may reduce the likelihood of early treatment and lead to more serious problems. This paper reviews the oral manifestations of immune-mediated dermatological disorders with common oral manifestations.

Keywords: Vesiculobullous lesions, Desquamative gingivitis, Nikolsky’s sign, Erythema.

542 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods are efficient for solving these models because of their non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, we solve the American option under jump diffusion models by using efficient time-dependent numerical methods. Several techniques are integrated to overcome the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction decomposition technique is applied to the rational approximation schemes to overcome the complexity of inverting polynomials of matrices. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to demonstrate the accuracy and efficiency of the proposed method.
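
The O(M log M) matrix-vector product mentioned above relies on the standard circulant-embedding trick for Toeplitz matrices (such as those arising from discretized jump integrals). A hedged sketch, with an arbitrary Toeplitz kernel rather than a jump-diffusion one:

```python
import numpy as np

def toeplitz_matvec_fft(first_col, first_row, x):
    """Compute T @ x for a Toeplitz matrix T in O(M log M) using the FFT.

    T is defined by its first column and first row; it is embedded in a
    2M x 2M circulant matrix, whose action is a circular convolution.
    """
    M = len(x)
    c = np.concatenate([first_col, [0.0], first_row[1:][::-1]])   # circulant first column
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(np.concatenate([x, np.zeros(M)])))
    return y[:M].real

# check against the dense O(M^2) product
rng = np.random.default_rng(3)
M = 256
col, row = rng.normal(size=M), rng.normal(size=M)
row[0] = col[0]
x = rng.normal(size=M)
T = np.array([[col[i - j] if i >= j else row[j - i] for j in range(M)] for i in range(M)])
print(np.allclose(T @ x, toeplitz_matvec_fft(col, row, x)))   # True
```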

Keywords: Integral differential equations, American options, jump–diffusion model, rational approximation.

541 Application of Exact String Matching Algorithms towards SMILES Representation of Chemical Structure

Authors: Ahmad Fadel Klaib, Zurinahni Zainol, Nurul Hashimah Ahamed, Rosma Ahmad, Wahidah Hussin

Abstract:

Bioinformatics and cheminformatics are computer-based disciplines providing tools for the acquisition, storage, processing, analysis and integration of biological and chemical data, and for the development of potential applications of these data. A chemical database is a database designed exclusively to store chemical information. NMRShiftDB is one of the main databases used to represent chemical structures in 2D or 3D form. The SMILES format is one of many ways to write a chemical structure in a linear format. In this study, we extracted antimicrobial structures in SMILES format from NMRShiftDB and stored them, together with their corresponding information, in our Local Data Warehouse. Additionally, we developed a searching tool that responds to a user's query using the JME Editor tool, which allows the user to draw or edit molecules and converts the drawn structure into SMILES format. We applied the Quick Search algorithm to search for antimicrobial structures in our Local Data Warehouse.
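
A sketch of the Quick Search (Sunday) exact string-matching algorithm applied to a SMILES string; the shift is driven by the text character just after the current window. The SMILES examples are illustrative and not taken from NMRShiftDB.

```python
def quick_search(text, pattern):
    """Quick Search exact string matching (Sunday's algorithm)."""
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    # bad-character shift: distance from the rightmost occurrence to the end, plus 1
    shift = {c: m - i for i, c in enumerate(pattern)}
    hits, pos = [], 0
    while pos <= n - m:
        if text[pos:pos + m] == pattern:
            hits.append(pos)
        if pos + m >= n:
            break
        pos += shift.get(text[pos + m], m + 1)   # char after the window drives the jump
    return hits

# illustrative SMILES string (aspirin), searched for a benzene-ring substring
smiles = "CC(=O)Oc1ccccc1C(=O)O"
print(quick_search(smiles, "c1ccccc1"))   # -> [7]
```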

Keywords: Exact String-matching Algorithms, NMRShiftDB, SMILES Format, Antimicrobial Structures.

540 An Inter-banking Auditing Security Solution for Detecting Unauthorised Financial Transactions entered by Authorised Insiders

Authors: C. A. Corzo, N. Zhang, F. Corzo

Abstract:

Insider abuse has recently been reported as one of the more frequently occurring security incidents, suggesting that more security is required for detecting and preventing unauthorised financial transactions entered by authorised users. To address the problem, and based on the observation that all authorised inter-banking financial transactions trigger or are triggered by other transactions in a workflow, we have developed a security solution based on a redefined understanding of the audit workflow: one audit workflow in which a log file contains the complete workflow activity of the financial transactions directly related to one financial transaction (an electronic deal recorded in an e-trading system). The new security solution contemplates any two parties interacting on the basis of financial transactions recorded by their users in related but distinct automated financial systems. Under the new definition, inter-organizational and intra-organizational interactions can be described in one unique audit trail. This concept expands current ideas of audit trails by adapting them to actual e-trading workflow activity, i.e. intra-organizational and inter-organizational activity. With the above, a security auditing service is designed to detect integrity drifts within and between organizations in order to detect unauthorised financial transactions entered by authorised users.

Keywords: Intrusion detection and prevention, authentication and identification.

539 Designing a Football Team of Robots from Beginning to End

Authors: Maziar A. Sharbafi, Caro Lucas, Aida Mohammadinejad, Mostafa Yaghobi

Abstract:

The combination of path planning and path following is the main purpose of this paper, which describes a practical approach developed for motion control of the MRL small-size robots. An intelligent controller is applied to control the motion of the omni-directional robots in both simulation and real environments. The Brain Emotional Learning Based Intelligent Controller (BELBIC), based on LQR control, is adopted for the omni-directional robots. The contribution of BELBIC to improving control system performance is shown as an application of emotional learning to a real-world problem. The control effort is also optimized with this method. Next, an implicit communication method is used to determine the high-level strategies and the coordination of the robots. The robots' decision-making system is built from a few simple rules combined with the use of the environment as a memory to improve coordination between agents. With this simple algorithm, our team exhibits desirable cooperation.
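
The abstract builds BELBIC on top of an LQR baseline. Below is a hedged sketch of the LQR part only, for a double-integrator model of one translational axis of an omni-directional robot; the model matrices and weights are illustrative, not the MRL team's.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# double integrator: position and velocity of one axis of an omni-directional robot
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])     # penalize position error more than velocity
R = np.array([[0.1]])        # control-effort weight

P = solve_continuous_are(A, B, Q, R)      # solves A'P + PA - PB R^-1 B'P + Q = 0
K = np.linalg.solve(R, B.T @ P)           # optimal state-feedback gain: u = -K x
print("LQR gain:", K)

# the closed-loop poles should all lie in the left half plane
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```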

Keywords: multi-agent systems (MAS), Emotional learning, MIMO system, BELBIC, LQR, Communication via environment

538 Automated Thickness Measurement of Retinal Blood Vessels for Implementation of Clinical Decision Support Systems in Diagnostic Diabetic Retinopathy

Authors: S.Jerald Jeba Kumar, M.Madheswaran

Abstract:

The structure of the retinal vessels is a prominent feature that reveals information on the state of disease, reflected in the form of measurable abnormalities in thickness and colour. This paper presents the analysis of the retinal vascular structure for the implementation of a clinical diabetic retinopathy decision-making system. The retinal vascular structure consists of thin blood vessels, so measurement accuracy is highly dependent on vessel segmentation. In this paper, the blood vessel thickness is automatically detected using preprocessing techniques and a vessel segmentation algorithm. First, the captured image is binarized to bring out the blood vessel structure clearly; it is then skeletonised to obtain the overall structure of all the terminal and branching nodes of the blood vessels. By identifying the terminal nodes and branching points automatically, the thickness of the main and branching blood vessels is estimated. Results are presented and compared with those provided by clinical classification on 50 vessels collected from Bejan Singh Eye Hospital.
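
A minimal sketch of the binarize-then-skeletonize pipeline described above, using scikit-image (assumed available); Otsu thresholding and the neighbour-count rule for terminal and branching nodes are illustrative choices rather than the authors' exact procedure.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def vessel_skeleton(gray):
    """Binarize a vessel-enhanced grayscale image and reduce vessels to 1-pixel skeletons."""
    binary = gray > threshold_otsu(gray)        # vessels assumed brighter than background
    return skeletonize(binary)

def branch_and_end_points(skel):
    """Classify skeleton pixels by the number of 8-connected skeleton neighbours."""
    s = skel.astype(np.uint8)
    nb = sum(np.roll(np.roll(s, di, 0), dj, 1)
             for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0))
    ends = skel & (nb == 1)                     # terminal nodes
    branches = skel & (nb >= 3)                 # branching nodes
    return ends, branches

gray = np.random.default_rng(4).random((64, 64))   # stand-in for a fundus image patch
skel = vessel_skeleton(gray)
ends, branches = branch_and_end_points(skel)
print(int(ends.sum()), "end points,", int(branches.sum()), "branch points")
```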

Keywords: Diabetic retinopathy, binarization, segmentation, clinical decision support systems.

537 Spectral Entropy Employment in Speech Enhancement based on Wavelet Packet

Authors: Talbi Mourad, Salhi Lotfi, Chérif Adnen

Abstract:

In this work, we are interested in developing a speech denoising tool using the discrete wavelet packet transform (DWPT). This speech denoising tool will be employed in recognition, coding and synthesis applications. For noise reduction, instead of applying the classical thresholding technique, some wavelet packet nodes are set to zero and the others are thresholded. To estimate the non-stationary noise level, we employ the spectral entropy. A comparison of our proposed technique with classical denoising methods based on thresholding and spectral subtraction is made in order to evaluate our approach. The experimental implementation uses speech signals corrupted by two sorts of noise, white and Volvo noise. The results obtained from listening tests show that our proposed technique is better than spectral subtraction. The results obtained from SNR computation show the superiority of our technique over the classical thresholding method using the modified hard thresholding function based on the u-law algorithm.
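
A hedged sketch of the node-selection idea with PyWavelets (assumed available): decompose a frame into wavelet packet nodes, zero the nodes whose spectrum looks flat (noise-like, high spectral entropy) and soft-threshold the rest. The entropy cut-off, threshold rule and test signal are illustrative, not the authors' settings.

```python
import numpy as np
import pywt

def spectral_entropy(x, eps=1e-12):
    """Normalized Shannon entropy of the power spectrum (close to 1.0 = flat, noise-like)."""
    p = np.abs(np.fft.rfft(x)) ** 2
    p = p / (p.sum() + eps)
    return float(-(p * np.log(p + eps)).sum() / np.log(len(p)))

def denoise_frame(frame, wavelet="db8", level=4, entropy_cut=0.9):
    wp = pywt.WaveletPacket(data=frame, wavelet=wavelet, mode="symmetric", maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    sigma = np.median(np.abs(nodes[-1].data)) / 0.6745      # rough noise-level estimate
    thr = sigma * np.sqrt(2.0 * np.log(len(frame)))
    out = pywt.WaveletPacket(data=None, wavelet=wavelet, mode="symmetric", maxlevel=level)
    for node in nodes:
        if spectral_entropy(node.data) > entropy_cut:
            out[node.path] = np.zeros_like(node.data)       # noise-like node: set to zero
        else:
            out[node.path] = pywt.threshold(node.data, thr, mode="soft")
    return out.reconstruct(update=False)[: len(frame)]

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 2048)
clean = np.sin(2 * np.pi * 200 * t) * np.hanning(t.size)
noisy = clean + 0.3 * rng.normal(size=t.size)
print("MSE noisy   :", float(np.mean((noisy - clean) ** 2)))
print("MSE denoised:", float(np.mean((denoise_frame(noisy) - clean) ** 2)))
```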

Keywords: Enhancement, spectral subtraction, SNR, discrete wavelet packet transform, spectral entropy, histogram.

536 Optimization of Shear Frame Structures Applying Various Forms of Wavelet Transforms

Authors: Seyed Sadegh Naseralavi, Sohrab Nemati, Ehsan Khojastehfar, Sadegh Balaghi

Abstract:

In the present research, various formulations of the wavelet transform are applied to the acceleration time history of an earthquake. The transforms decompose the strong ground motion into low- and high-frequency parts. Since the high-frequency portion of strong ground motion has a minor effect on the dynamic response of structures, the structure is excited by the low-frequency part only. Consequently, the seismic response of the structure is predicted in about half the computational time of a conventional time history analysis. To reduce the computational effort needed in the seismic optimization of structures, the seismic optimization of a shear frame structure is conducted by applying the various forms of the transform within a genetic algorithm.
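
A sketch, using PyWavelets (assumed available), of the basic split: a one-level discrete wavelet transform separates the record into low- and high-frequency parts, and the approximation alone, with half as many samples, drives the time history analysis. The wavelet and the synthetic record are illustrative.

```python
import numpy as np
import pywt

def low_frequency_part(accel, wavelet="db4"):
    """One-level DWT split of a ground-motion record; keep only the approximation."""
    cA, cD = pywt.dwt(accel, wavelet)           # low- and high-frequency coefficients
    low = pywt.idwt(cA, None, wavelet)          # reconstruct from the approximation only
    return cA, low[: len(accel)]

# synthetic "earthquake" record: a low-frequency pulse plus high-frequency noise
rng = np.random.default_rng(6)
t = np.linspace(0, 20, 2000)
accel = np.exp(-0.3 * (t - 6) ** 2) * np.sin(2 * np.pi * 1.2 * t) + 0.2 * rng.normal(size=t.size)

cA, low = low_frequency_part(accel)
print(len(cA), "approximation samples vs", len(accel), "original samples")
# time-history analysis driven by cA (or low) roughly halves the excitation length
```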

Keywords: Time history analysis, wavelet transform, optimization, earthquake.

535 Predicting Bankruptcy using Tabu Search in the Mauritian Context

Authors: J. Cheeneebash, K. B. Lallmamode, A. Gopaul

Abstract:

Throughout this paper, a relatively new technique, the Tabu search variable selection model, is elaborated, showing how it can be efficiently applied within the financial world whenever researchers face the selection of a subset of variables from a whole set of descriptive variables under analysis. In the field of financial prediction, researchers often have to select a subset of variables from a larger set to solve different types of problems such as corporate bankruptcy prediction, personal bankruptcy prediction, mortgage and credit scoring, and the Arbitrage Pricing Model (APM). Consequently, to demonstrate how the method operates and to illustrate its usefulness as well as its superiority over other commonly used methods, the Tabu search algorithm for variable selection is compared with two main alternative search procedures, namely stepwise regression and the maximum R² improvement method. The Tabu search is then implemented in finance, where it attempts to predict corporate bankruptcy by selecting the most appropriate financial ratios, thus creating its own prediction score equation. In comparison with other methods, most notably the Altman Z-Score model, the Tabu search model produces a higher success rate in correctly predicting the failure of firms or the continued running of existing entities.
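
A toy tabu search over fixed-size subsets of candidate ratios, maximizing the in-sample R² of a linear model, to illustrate the swap-move, tabu-tenure and aspiration mechanics; the synthetic data, subset size and tenure are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def r_squared(X, y, subset):
    """In-sample R^2 of an OLS fit using only the columns in `subset`."""
    A = np.column_stack([np.ones(len(y)), X[:, list(subset)]])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

def tabu_select(X, y, k=3, iters=50, tenure=5, seed=0):
    """Tabu search over k-variable subsets: swap one variable per move,
    forbid recently removed variables for `tenure` iterations, keep the best subset seen."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    current = set(rng.choice(p, size=k, replace=False))
    best, best_score = set(current), r_squared(X, y, current)
    tabu = {}                                    # variable -> iteration until which it is tabu
    for it in range(iters):
        best_move = None
        for out in current:
            for into in set(range(p)) - current:
                cand = (current - {out}) | {into}
                score = r_squared(X, y, cand)
                # tabu move allowed only if it beats the best so far (aspiration criterion)
                if tabu.get(into, -1) >= it and score <= best_score:
                    continue
                if best_move is None or score > best_move[0]:
                    best_move = (score, out, cand)
        if best_move is None:
            break
        score, out, current = best_move
        tabu[out] = it + tenure                  # the removed variable becomes tabu
        if score > best_score:
            best, best_score = set(current), score
    return sorted(best), best_score

# synthetic "financial ratios": only columns 0, 3 and 7 actually drive the outcome
rng = np.random.default_rng(7)
X = rng.normal(size=(200, 10))
y = 2 * X[:, 0] - 3 * X[:, 3] + X[:, 7] + 0.5 * rng.normal(size=200)
print(tabu_select(X, y))    # expected to recover {0, 3, 7} (or close to it)
```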

Keywords: Predicting Bankruptcy, Tabu Search

534 Enhancement of Environmental Security by the Application of Wireless Sensor Network in Nigeria

Authors: Ahmadu Girgiri, Lawan Gana Ali, Mamman M. Baba

Abstract:

Environmental security clearly underpins the development and well-being of communities around the world, irrespective of region, culture, religion or social inclination. However, the present state of insecurity has become a serious issue devastating the peace, unity, stability and progress of man and his physical environment, particularly in developing countries. Recently, security measures and their management in Nigeria have been a bottleneck to the effectiveness and advancement of various sectors, including business, education, social relations, politics and, above all, the economy. Several measures for mitigating environmental insecurity have been considered, such as surveillance, demarcation, the empowerment of security personnel and the like, but the issue remains disturbing. In this paper, we present the application of a new technology that contributes to the improvement of security surveillance, known as the Wireless Sensor Network (WSN). The WSN is a smart, emerging technology that provides monitoring, detection and aggregation of information using sensor nodes and a wireless network. A WSN detects, monitors and stores information on activities in the deployed area, such as schools, the environment, business centres, public squares, industries and outskirts, and transmits it to end users. This will reduce the cost of security funding and ease security surveillance, depending on the nature and requirements of the deployment.

Keywords: Wireless sensor network, node, application, monitoring, insecurity, environment.

533 Key Frame Based Video Summarization via Dependency Optimization

Authors: Janya Sainui

Abstract:

With the rapid growth of digital video and data communications, video summarization, which provides a shorter version of a video for fast browsing and retrieval, has become necessary. Key frame extraction is one mechanism for generating a video summary. In general, the extracted key frames should both represent the entire video content and contain minimum redundancy. However, most existing approaches select key frames heuristically; hence, the selected key frames may not be the most distinct frames and/or may not cover the entire content of the video. In this paper, we propose a video summarization method that provides reasonable objective functions for selecting key frames. In particular, we apply a statistical dependency measure called quadratic mutual information in objective functions that maximize the coverage of the entire video content while minimizing the redundancy among the selected key frames. The proposed key frame extraction algorithm finds key frames by solving an optimization problem. Through experiments, we demonstrate the success of the proposed video summarization approach, which produces a video summary with better coverage of the entire video content and less redundancy among key frames compared with state-of-the-art approaches.
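
A sketch of the selection idea with a simple stand-in objective: greedily pick frames that maximize coverage of all frames minus redundancy among the picks, with a Gaussian kernel on grayscale histograms used as a proxy for the quadratic-mutual-information dependency measure (so this is not the paper's estimator or optimizer).

```python
import numpy as np

def frame_features(frames, bins=16):
    """Normalized grayscale-histogram feature per frame (rows are flattened images)."""
    return np.stack([np.histogram(f, bins=bins, range=(0, 255))[0] / f.size for f in frames])

def select_key_frames(feats, k, sigma=0.1, lam=1.0):
    """Greedily pick k frames maximizing coverage(all frames) - lam * redundancy(picks)."""
    d2 = ((feats[:, None, :] - feats[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma**2))          # kernel similarity, stand-in for a QMI-based measure
    selected = []
    for _ in range(k):
        best, best_val = None, -np.inf
        for i in range(len(feats)):
            if i in selected:
                continue
            cand = selected + [i]
            coverage = K[cand].max(axis=0).mean()        # how well the picks cover every frame
            redundancy = K[np.ix_(cand, cand)].mean()    # how similar the picks are to each other
            if coverage - lam * redundancy > best_val:
                best, best_val = i, coverage - lam * redundancy
        selected.append(best)
    return sorted(selected)

# synthetic "video": three visually distinct segments of 40 frames each
rng = np.random.default_rng(11)
frames = np.concatenate([rng.normal(m, 10, size=(40, 32 * 32)) for m in (60, 120, 200)]).clip(0, 255)
print(select_key_frames(frame_features(frames), k=3))   # expected: one frame per segment
```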

Keywords: Video summarization, key frame extraction, dependency measure, quadratic mutual information, optimization.

532 Gas Detection via Machine Learning

Authors: Walaa Khalaf, Calogero Pace, Manlio Gaudioso

Abstract:

We present an Electronic Nose (ENose) aimed at identifying which of two gases is present, possibly detecting the presence of a mixture of the two. Estimation of the concentrations of the components is also performed for a volatile organic compound (VOC) constituted by methanol and acetone, over the ranges 40-400 and 22-220 ppm (parts per million), respectively. Our system contains 8 sensors, 5 of them being gas sensors (of the TGS class from FIGARO USA, INC., whose sensing element is a tin dioxide (SnO2) semiconductor), the remaining being a temperature sensor (LM35 from National Semiconductor Corporation), a humidity sensor (HIH-3610 from Honeywell), and a pressure sensor (XFAM from Fujikura Ltd.). Our integrated hardware-software system uses machine learning principles and the least squares regression principle to first identify a new gas sample, or a mixture, and then to estimate the concentrations. In particular, we adopt a training model using the Support Vector Machine (SVM) approach with a linear kernel to teach the system how to discriminate among different gases. We then apply another training model, using least squares regression, to predict the concentrations. The experimental results demonstrate that the proposed multiclassification and regression scheme is effective in the identification of the tested VOCs of methanol and acetone with 96.61% correctness. The concentration prediction is obtained with correlation coefficients of 0.979 and 0.964 for the predicted versus real concentrations of methanol and acetone, respectively.
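
A sketch of the two-stage scheme with scikit-learn (assumed available): a linear-kernel SVM identifies the gas or mixture, then a least-squares regressor estimates the concentration. The eight "sensor" readings are synthetic, and the regression shown is only for the methanol class.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(8)

# synthetic training data: 8 sensor readings per sample, 3 classes
# (0 = methanol, 1 = acetone, 2 = mixture), plus a concentration target in ppm
n = 300
X = rng.normal(size=(n, 8))
gas = rng.integers(0, 3, size=n)
conc = 40 + 360 * rng.random(size=n)
X[:, 0] += 2.0 * gas                    # make the classes separable
X[:, 1] += 0.01 * conc                  # make the concentration recoverable

clf = SVC(kernel="linear").fit(X, gas)                        # stage 1: identify the gas/mixture
reg = LinearRegression().fit(X[gas == 0], conc[gas == 0])     # stage 2: per-gas concentration model

sample = X[:5]
print("predicted gas class:", clf.predict(sample))
print("predicted methanol concentration (ppm):", reg.predict(sample).round(1))
```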

Keywords: Electronic nose, least squares regression, mixture of gases, support vector machine.

531 Effects of Data Correlation in a Sparse-View Compressive Sensing Based Image Reconstruction

Authors: Sajid Abbas, Joon Pyo Hong, Jung-Ryun Lee, Seungryong Cho

Abstract:

Computed tomography and laminography are heavily investigated in a compressive sensing based image reconstruction framework to reduce the dose to patients as well as to radiosensitive devices such as multilayer microelectronic circuit boards. Researchers are actively working on optimizing compressive sensing based iterative image reconstruction algorithms to obtain better-quality images. However, the effects of the sampled data's properties on the reconstructed image quality, particularly under insufficiently sampled data conditions, have not been explored in computed laminography. In this paper, we investigated the effects of two data properties, i.e. sampling density and data incoherence, on the image reconstructed by conventional computed laminography and by a recently proposed method called the spherical sinusoidal scanning scheme. We have found that in a compressive sensing based image reconstruction framework, the image quality mainly depends upon the data incoherence when the data are uniformly sampled.

Keywords: Computed tomography, computed laminography, compressive sensing, low-dose.

530 On the Algorithmic Iterative Solutions of Conjugate Gradient, Gauss-Seidel and Jacobi Methods for Solving Systems of Linear Equations

Authors: H. D. Ibrahim, H. C. Chinwenyi, H. N. Ude

Abstract:

In this paper, efforts were made to examine and compare the algorithmic iterative solutions of the conjugate gradient method against other methods such as the Gauss-Seidel and Jacobi approaches for solving systems of linear equations of the form Ax = b, where A is a real n x n symmetric and positive definite matrix. We performed the algorithmic iterative steps and obtained analytical solutions of a typical 3 x 3 symmetric and positive definite matrix using the three methods described in this paper (the Gauss-Seidel, Jacobi and conjugate gradient methods). From the results obtained, we discovered that the conjugate gradient method converges to the exact solution in fewer iterative steps than the other two methods, which required many more iterations and much more time while only tending towards the exact solution.
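
Minimal NumPy implementations of the three iterations on an illustrative 3 x 3 symmetric positive definite system (not the paper's example); on such a system, conjugate gradient terminates within three iterations, while Jacobi and Gauss-Seidel need many more sweeps to reach the same tolerance.

```python
import numpy as np

def jacobi(A, b, x0, tol=1e-10, maxit=500):
    D = np.diag(A)
    R = A - np.diagflat(D)
    x = x0.copy()
    for k in range(1, maxit + 1):
        x_new = (b - R @ x) / D                 # all components updated from the old iterate
        if np.linalg.norm(x_new - x) < tol:
            return x_new, k
        x = x_new
    return x, maxit

def gauss_seidel(A, b, x0, tol=1e-10, maxit=500):
    n = len(b)
    x = x0.copy()
    for k in range(1, maxit + 1):
        x_old = x.copy()
        for i in range(n):                      # use already-updated components within the sweep
            x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
        if np.linalg.norm(x - x_old) < tol:
            return x, k
    return x, maxit

def conjugate_gradient(A, b, x0, tol=1e-10, maxit=500):
    x = x0.copy()
    r = b - A @ x
    p = r.copy()
    for k in range(1, maxit + 1):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            return x, k
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return x, maxit

# illustrative 3 x 3 symmetric positive definite system
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 3.0, 0.0],
              [1.0, 0.0, 2.0]])
b = np.array([6.0, 4.0, 3.0])
x0 = np.zeros(3)
for name, method in [("Jacobi", jacobi), ("Gauss-Seidel", gauss_seidel), ("CG", conjugate_gradient)]:
    x, k = method(A, b, x0)
    print(f"{name:12s} {k:3d} iterations, residual {np.linalg.norm(b - A @ x):.2e}")
```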

Keywords: conjugate gradient, linear equations, symmetric and positive definite matrix, Gauss-Seidel, Jacobi, algorithm

529 Temporal Signal Processing by Inference Bayesian Approach for Detection of Abrupt Variation of Statistical Characteristics of Noisy Signals

Authors: Farhad Asadi, Hossein Sadati

Abstract:

In fields such as neuroscience, and especially in the cognitive modeling of mental processes, handling uncertainty in the temporal structure of a signal is vital. In this paper, Bayesian online inference for estimating the location of change points in a signal is constructed. The method separates the observed signal into independent series and studies the change and variation of the data regime locally, together with the related statistical characteristics. We give conditions on simulations of the method when the data characteristics of the signals vary, and provide empirical evidence of the method's performance. It is verified that the correlation between the series around the change point location and characteristics such as the signal-to-noise ratio and the mean value of the signal strongly influence how reliably the change point location is found. One of the main contributions of this study is the characterization of these influences of the signal's statistical characteristics on finding abrupt variation in the signal. Two simulation structures are considered: in the first, one abrupt change in a temporal section of the signal is considered with variable position; in the second, multiple variations are considered. Finally, the influence of the statistical characteristics on the location of the change point is explained in detail in the simulation results with different artificial signals.
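
A sketch of the single-change-point case: the posterior over the change location for a piecewise-constant mean in Gaussian noise of known level, with the segment means integrated out under a flat prior. Running it at two shift sizes shows how the signal-to-noise ratio controls how sharply the location is identified; the model and parameters are illustrative, not the paper's.

```python
import numpy as np

def changepoint_posterior(y, sigma):
    """Posterior P(tau | y) for a single change in mean, Gaussian noise of known sigma."""
    n = len(y)
    log_post = np.full(n, -np.inf)
    for tau in range(1, n - 1):                  # change occurs between samples tau-1 and tau
        lp = 0.0
        for seg in (y[:tau], y[tau:]):
            m = len(seg)
            rss = ((seg - seg.mean()) ** 2).sum()
            lp += -rss / (2 * sigma**2) - 0.5 * np.log(m)   # segment mean integrated out
        log_post[tau] = lp
    log_post -= log_post.max()
    post = np.exp(log_post)
    return post / post.sum()

rng = np.random.default_rng(9)
for shift in (0.5, 2.0):                         # small vs large mean shift relative to the noise
    y = np.concatenate([rng.normal(0, 1, 100), rng.normal(shift, 1, 100)])
    post = changepoint_posterior(y, sigma=1.0)
    i = int(post.argmax())
    print(f"shift={shift}: MAP change point at {i}, "
          f"posterior mass within +-5 samples = {post[max(0, i - 5): i + 6].sum():.2f}")
```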

Keywords: Time series, fluctuation in statistical characteristics, optimal learning.

528 Syntactic Recognition of Distorted Patterns

Authors: Marek Skomorowski

Abstract:

In syntactic pattern recognition a pattern can be represented by a graph. Given an unknown pattern represented by a graph g, the problem of recognition is to determine whether the graph g belongs to a language L(G) generated by a graph grammar G. The so-called IE graphs have been defined in [1] for the description of patterns. The IE graphs are generated by so-called ETPL(k) graph grammars, also defined in [1]. An efficient parsing algorithm for ETPL(k) graph grammars for the syntactic recognition of patterns represented by IE graphs has been presented in [1]. In practice, structural descriptions may contain pattern distortions, so that the assignment of a graph g, representing an unknown pattern, to a graph language L(G) generated by an ETPL(k) graph grammar G is rejected by ETPL(k)-type parsing. Therefore, there is a need to construct effective parsing algorithms for the recognition of distorted patterns. The purpose of this paper is to present a new approach to the syntactic recognition of distorted patterns. To take into account all variations of a distorted pattern under study, a probabilistic description of the pattern is needed. A random IE graph approach is proposed here for such a description ([2]).

Keywords: Syntactic pattern recognition, Distorted patterns, Random graphs, Graph grammars.

527 Periodic Control of a Wastewater Treatment Process to Improve Productivity

Authors: Muhammad Rizwan Azhar, Emadadeen Ali

Abstract:

In this paper, periodic forced operation of a wastewater treatment process is studied for improved process performance. A previously developed dynamic model of the process is used to conduct the performance analysis. The static version of the model was utilized first to determine the optimal productivity conditions for the process. Then, the feed flow rate, expressed as the dilution rate D, is transformed into a sinusoidal function. A nonlinear model predictive control algorithm is utilized to regulate the amplitude and period of the sinusoidal function. The parameters of the cyclic feed function are determined, resulting in productivity higher than the optimal productivity under steady-state conditions. The improvement in productivity is found to be marginal, while the substrate conversion is satisfactory compared with the optimal condition and with the steady-state condition that corresponds to the average value of the periodic function. Successful results were also obtained in the presence of modeling errors and external disturbances.
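
A hedged sketch of the forcing idea on a generic Monod chemostat (not the authors' wastewater model), integrated with SciPy: the average productivity D·X under a constant dilution rate is compared with that under a sinusoidal dilution rate with the same mean. All parameters are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# generic Monod chemostat: dX/dt = (mu - D) X,  dS/dt = D (Sf - S) - mu X / Y
mu_max, Ks, Y, Sf = 0.4, 1.0, 0.5, 10.0

def chemostat(t, z, D_of_t):
    X, S = z
    mu = mu_max * S / (Ks + S)
    D = D_of_t(t)
    return [(mu - D) * X, D * (Sf - S) - mu * X / Y]

def average_productivity(D_of_t, t_end=400.0):
    sol = solve_ivp(chemostat, (0.0, t_end), [1.0, 5.0], args=(D_of_t,),
                    dense_output=True, max_step=0.5)
    t = np.linspace(0.5 * t_end, t_end, 2000)        # average over the settled second half
    X = sol.sol(t)[0]
    return np.trapz(np.array([D_of_t(ti) for ti in t]) * X, t) / (t[-1] - t[0])

D0, amp, period = 0.2, 0.1, 20.0
steady = average_productivity(lambda t: D0)
forced = average_productivity(lambda t: D0 + amp * np.sin(2 * np.pi * t / period))
print(f"constant dilution rate:  average D*X = {steady:.4f}")
print(f"sinusoidal (same mean):  average D*X = {forced:.4f}")
```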

Keywords: Dilution rate, nonlinear model predictive control, sinusoidal function, wastewater treatment.

526 Implementation of an Improved Secure System Detection for E-passport by using EPC RFID Tags

Authors: A. Baith Mohamed, Ayman Abdel-Hamid, Kareem Youssri Mohamed

Abstract:

Current proposals for the e-passport or ID card are similar to a regular passport with the addition of a tiny contactless integrated circuit (computer chip) inserted in the back cover, which acts as a secure storage device for the same data visually displayed on the photo page of the passport. In addition, it includes a digital photograph that enables biometric comparison through the use of facial recognition technology at international borders. Moreover, the e-passport has a new interface incorporating additional anti-fraud and security features. However, its problems are reliability, security and privacy. Privacy is a serious issue, since there is no encryption between the readers and the e-passport. Furthermore, security issues such as authentication, data protection and control techniques cannot be embedded in one process. In this paper, the design and prototype implementation of an improved e-passport reader is presented. The passport holder is authenticated online by using the GSM network, which is the main interface between the identification center and the e-passport reader. The communication data between the server and the e-passport reader are protected by using AES encryption while being transferred through the GSM network. Performance measurements indicate a 19% improvement in encryption cycles versus previously reported results.
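
A minimal sketch of the encryption step with the Python `cryptography` package (assumed available), using AES-GCM so the record also carries an integrity tag and the reader identity is bound as associated data. Key distribution, nonce handling and the message format here are illustrative, not the authors' protocol.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_record(key: bytes, passport_record: bytes, reader_id: bytes) -> bytes:
    """Encrypt an e-passport record for transfer over the GSM link.

    AES-GCM provides confidentiality plus an integrity tag; the reader ID is bound
    to the ciphertext as associated data so it cannot be swapped in transit.
    """
    nonce = os.urandom(12)                       # 96-bit nonce, never reused with the same key
    ct = AESGCM(key).encrypt(nonce, passport_record, reader_id)
    return nonce + ct                            # prepend the nonce for the receiver

def decrypt_record(key: bytes, blob: bytes, reader_id: bytes) -> bytes:
    nonce, ct = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ct, reader_id)   # raises if the data were tampered with

key = AESGCM.generate_key(bit_length=256)        # shared between server and reader (hypothetical)
blob = encrypt_record(key, b"illustrative passport record payload", b"reader-042")
print(decrypt_record(key, blob, b"reader-042"))
```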

Keywords: RFID (Radio Frequency Identification), EPC (Electronic Product Code), ICAO (International Civil Aviation Organization), IFF (Identify Friend or Foe).

525 Virtual Routing Function Allocation Method for Minimizing Total Network Power Consumption

Authors: Kenichiro Hida, Shin-Ichi Kuribayashi

Abstract:

In a conventional network, most network devices, such as routers, are dedicated devices that do not have much variation in capacity. In recent years, a new concept of network functions virtualisation (NFV) has come into use. The intention is to implement a variety of network functions with software on general-purpose servers, which allows the network operator to select their capacities and locations without any constraints. This paper focuses on the allocation of NFV-based routing functions, which are among the critical network functions, and presents a virtual routing function allocation algorithm that minimizes the total power consumption. In addition, this study presents a useful allocation policy for virtual routing functions, based on an evaluation with a ladder-shaped network model. This policy takes into consideration the ratio of the power consumption of a routing function to that of a circuit, and the traffic distribution between areas. Furthermore, the present paper shows that there are cases where the use of NFV-based routing functions makes it possible to reduce the total power consumption dramatically, in comparison with a conventional network in which it is not economically viable to distribute small-capacity routing functions.

Keywords: Virtual routing function, NFV, resource allocation, minimum power consumption.

524 Maximizer of the Posterior Marginal Estimate for Noise Reduction of JPEG-compressed Image

Authors: Yohei Saika, Yuji Haraguchi

Abstract:

We constructed a noise reduction method for JPEG-compressed images based on Bayesian inference using the maximizer of the posterior marginal (MPM) estimate. In this method, we tried the MPM estimate using two kinds of likelihood, both of which model grayscale images degraded by lossy JPEG image compression. One is a deterministic model of the likelihood and the other a probabilistic one expressed by the Gaussian distribution. Then, using Monte Carlo simulation for grayscale images, such as the 256-grayscale standard image "Lena" with 256 × 256 pixels, we examined the performance of the MPM estimate with the mean square error as the performance measure. We clarified that the MPM estimate via the Gaussian probabilistic model of the likelihood is effective for reducing noise, such as blocking artifacts and mosquito noise, if the parameters are set appropriately. On the other hand, we found that the MPM estimate via the deterministic model of the likelihood is not effective for noise reduction, due to the low acceptance ratio of the Metropolis algorithm.

Keywords: Noise reduction, JPEG-compressed image, Bayesian inference, the maximizer of the posterior marginal estimate

523 Ontology-based Concept Weighting for Text Documents

Authors: Hmway Hmway Tar, Thi Thi Soe Nyaunt

Abstract:

Document clustering has become an essential technology with the popularity of the Internet, which means that fast and high-quality document clustering techniques play a central role. Text clustering, or simply clustering, is about discovering semantically related groups in an unstructured collection of documents. Clustering has been popular for a long time because it provides unique ways of digesting and generalizing large amounts of information. One of the issues in clustering is extracting the proper features (concepts) of a problem domain. Existing clustering technology mainly focuses on term weight calculation. To achieve more accurate document clustering, more informative features, including concept weights, are important. Feature selection is important for the clustering process because irrelevant or redundant features may misguide the clustering results. To counteract this issue, the proposed system introduces concept weights for a text clustering system developed on the basis of the k-means algorithm and the principles of ontology, so that the important words of a cluster can be identified by their weight values. To a certain extent, this has resolved the semantic problem in specific areas.
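
A sketch of the term-weight baseline the abstract contrasts with, using scikit-learn (assumed available): TF-IDF weighting followed by k-means. An ontology-based concept-weighting scheme would replace the vectorization step, mapping terms to concepts and weighting those instead. The documents are illustrative.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = [
    "the bank approved the loan and mortgage application",
    "interest rates and credit scoring at the bank",
    "the river bank was flooded after heavy rain",
    "rainfall and river levels rose during the storm",
]

# term-weight baseline: TF-IDF vectors over raw terms
vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("cluster labels:", km.labels_)     # expected: finance documents vs. weather documents

# the highest-weighted terms of each cluster centre hint at that cluster's "concepts"
terms = vec.get_feature_names_out()
for c in range(2):
    top = km.cluster_centers_[c].argsort()[::-1][:3]
    print(f"cluster {c} top terms:", [terms[i] for i in top])
```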

Keywords: Clustering, Concept Weight, Document clustering, Feature Selection, Ontology

522 State Estimation Solution with Optimal Allocation of Phasor Measurement Units Considering Zero Injection Bus Modeling

Authors: M. Ravindra, R. Srinivasa Rao, V. Shanmukha Naga Raju

Abstract:

This paper presents state estimation with Phasor Measurement Unit (PMU) allocation to obtain complete observability of the network. A matrix is designed with modeling of the zero injection constraints to minimize the number of PMU allocations. A state estimation algorithm is developed with optimal allocation of PMUs to find accurate states of the network. The incorporation of PMUs into the traditional state estimation process improves accuracy and computational performance for large power systems. The nonlinearity introduced by the zero injection (ZI) constraints is remodeled into a linear frame to optimize the number of PMUs. The problem of optimal PMU allocation considers the modeling of ZI constraints, PMU loss or line outage, cost factors and redundant measurements. The proposed state estimation with optimal PMU allocation has been compared with the traditional state estimation process to show its importance. MATLAB programming on the IEEE 14, 30, 57, and 118 bus networks is carried out using the Binary Integer Programming (BIP) method and compared with other methods to show its effectiveness.
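
A sketch of the core binary integer program for observability-only PMU placement (minimize the PMU count such that every bus is observed by itself or a neighbour), solved with scipy.optimize.milp on a small illustrative network; the paper's ZI constraints, outage cases and cost factors are omitted.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# illustrative 7-bus network given by its edge list (not an IEEE test case)
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (1, 5)]
n = 7
A = np.eye(n)
for i, j in edges:                 # a PMU at bus i observes bus i and all of its neighbours
    A[i, j] = A[j, i] = 1

# minimize sum(x) subject to A x >= 1 (every bus observed), x binary
res = milp(c=np.ones(n),
           constraints=LinearConstraint(A, lb=np.ones(n), ub=np.full(n, np.inf)),
           integrality=np.ones(n),
           bounds=Bounds(0, 1))
placement = np.flatnonzero(res.x.round() == 1)
print("PMUs at buses:", placement, "-> total", len(placement))
```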

Keywords: Observability, phasor measurement units, synchrophasors, SCADA measurements, zero injection bus.

521 A Sparse Representation Speech Denoising Method Based on Adapted Stopping Residue Error

Authors: Qianhua He, Weili Zhou, Aiwu Chen

Abstract:

A sparse representation speech denoising method based on an adapted stopping residue error is presented in this paper. Firstly, the cross-correlation between the clean speech spectrum and the noise spectrum is analyzed, and an estimation method is proposed. In the denoising method, an over-complete dictionary of the clean speech power spectrum is learned with the K-singular value decomposition (K-SVD) algorithm. In the sparse representation stage, the stopping residue error is adaptively set according to the estimated cross-correlation and the adjusted noise spectrum, and the orthogonal matching pursuit (OMP) approach is applied to reconstruct the clean speech spectrum from the noisy speech. Finally, the clean speech is re-synthesised via the inverse Fourier transform using the reconstructed speech spectrum and the noisy speech phase. The experimental results show that the proposed method outperforms the conventional methods in terms of both subjective and objective measures.
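
A sketch of the dictionary-learning-plus-OMP stage with scikit-learn (assumed available), using mini-batch dictionary learning in place of K-SVD and a fixed sparsity in place of the adapted stopping residue error; the spectrum frames are synthetic stand-ins.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.linear_model import orthogonal_mp

rng = np.random.default_rng(10)

# synthetic stand-ins for clean speech power-spectrum frames: low-rank nonnegative data
basis = rng.random((12, 64))
clean = rng.random((500, 12)) @ basis
noisy = clean + 0.1 * rng.random(size=clean.shape)

# learn an over-complete dictionary (128 atoms for 64-dimensional frames)
dico = MiniBatchDictionaryLearning(n_components=128, alpha=0.1, batch_size=32,
                                   random_state=0).fit(clean)
D = dico.components_.T                      # columns are dictionary atoms, shape (64, 128)

# sparse-code each noisy frame with OMP; the fixed sparsity below plays the role of a
# fixed stopping rule, which the paper replaces with an adapted stopping residue error
codes = orthogonal_mp(D, noisy.T, n_nonzero_coefs=12)
denoised = (D @ codes).T

print("MSE of noisy frames      :", float(np.mean((noisy - clean) ** 2)))
print("MSE of OMP reconstruction:", float(np.mean((denoised - clean) ** 2)))
```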

Keywords: Speech denoising, sparse representation, K-singular value decomposition, orthogonal matching pursuit.

520 Content and Resources based Mobile and Wireless Video Transcoding

Authors: Ashraf M. A. Ahmad

Abstract:

Delivering streaming video over wireless is an important component of many interactive multimedia applications running on personal wireless handset devices. Such personal devices have to be inexpensive, compact, and lightweight, but wireless channels have a high bit error rate and limited bandwidth. Delay variation of packets due to network congestion and the high bit error rate greatly degrade the quality of video at the handheld device. Therefore, mobile access to multimedia content requires video transcoding functionality at the edge of the mobile network for interworking with heterogeneous networks and services, and to guarantee the quality of service (QoS) delivered to the mobile user, a robust and efficient transcoding scheme should be deployed in the mobile multimedia transport network. Hence, this paper examines the challenges and limitations that video transcoding schemes face in mobile multimedia transport networks. A mobile and wireless video transcoding scheme based on handheld resources, network conditions and content is then proposed to provide high-QoS applications. Exceptional performance is demonstrated in the experimental results, which were designed to verify and prove the robustness of the proposed approach. Extensive experiments have been conducted, and results for various video clips with different bit rates and frame rates are provided.

Keywords: Content, Object detection, Transcoding, Texture, Temporal, Video.

519 Specialized Reduced Models of Dynamic Flows in 2-Stroke Engines

Authors: S. Cagin, X. Fischer, E. Delacourt, N. Bourabaa, C. Morin, D. Coutellier, B. Carré, S. Loumé

Abstract:

The complexity of scavenging by ports and its impact on engine efficiency create the need to understand and model it as realistically as possible. However, there are few empirical scavenging models and these are highly specialized. In a design optimization process, they appear very restricted and their field of use is limited. This paper presents a comparison of two methods to establish and reduce a model of the scavenging process in 2-stroke diesel engines. To address the lack of scavenging models, a CFD model has been developed and is used as the reference case. However, its large size requires a reduction. Two techniques have been tested according to their fields of application: the NTF method and neural networks. Both appear highly appropriate, drastically reducing the model's size (over 90% reduction) with a low relative error (under 10%). Furthermore, each method produces a reduced model which can be used in a distinct specialized field of application: the distribution of a quantity (mass fraction, for example) in the cylinder at each time step (pseudo-dynamic model), or the qualification of scavenging at the end of the process (pseudo-static model).

Keywords: Diesel engine, Design optimization, Model reduction, Neural network, NTF algorithm, Scavenging.

518 Optimal Path Planning under Priori Information in Stochastic, Time-varying Networks

Authors: Siliang Wang, Minghui Wang, Jun Hu

Abstract:

A novel path planning approach is presented to find optimal paths in stochastic, time-varying networks under a priori traffic information. Most existing studies make use of dynamic programming to find the optimal path. However, those methods have been shown to be unable to obtain the global optimum; moreover, designing efficient algorithms is another challenge. This paper employs a decision-theoretic framework for defining the optimal path: for a given source S and destination D in an urban transit network, we seek an S-D path of lowest expected travel time, where the link travel times are discrete random variables. To overcome the deficiencies of dynamic programming methods, such as the curse of dimensionality and violation of the optimality principle, an integer programming model is built to realize the assignment of discrete travel time variables to arcs. Simultaneously, pruning techniques are applied to reduce the computational complexity of the algorithm. The final experiments show the feasibility of the novel approach.

Keywords: pruning method, stochastic, time-varying networks, optimal path planning.

517 Improvement of Central Composite Design in Modeling and Optimization of Simulation Experiments

Authors: A. Nuchitprasittichai, N. Lerdritsirikoon, T. Khamsing

Abstract:

Simulation modeling can be used to solve real-world problems, as it provides an understanding of a complex system. To develop a simplified model of a process simulation, a suitable experimental design is required to capture the surface characteristics. This paper presents the experimental design and algorithm used to model a process simulation for an optimization problem. CO2 liquefaction based on external refrigeration with two refrigeration circuits was used as the simulation case study. Latin Hypercube Sampling (LHS) was proposed to be combined with existing Central Composite Design (CCD) samples to improve the performance of CCD in generating the second-order model of the system. The second-order model was then used as the objective function of the optimization problem. The results showed that adding LHS samples to CCD samples can help capture the surface curvature characteristics. A suitable number of LHS sample points should be considered in order to obtain an accurate nonlinear model with a minimum number of simulation experiments.
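
A sketch of the augmentation step: a two-factor face-centered CCD is combined with Latin Hypercube samples (scipy.stats.qmc, assumed available) over the same coded range, and a second-order response surface is fitted by least squares. The cheap quadratic test function stands in for the CO2 liquefaction simulator.

```python
import numpy as np
from scipy.stats import qmc

def simulator(x1, x2):
    """Cheap stand-in for the process simulator (e.g. specific energy of liquefaction)."""
    return 3 + 2 * x1 - x2 + 1.5 * x1 * x2 + 4 * (x1 - 0.3) ** 2 + 2 * (x2 + 0.2) ** 2

def quadratic_design_matrix(X):
    """Second-order model terms: intercept, linear, interaction and pure quadratic."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

# face-centered CCD for 2 coded factors: factorial, axial and centre points
ccd = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]], dtype=float)

# augment with Latin Hypercube samples over the same coded range [-1, 1]
lhs = qmc.scale(qmc.LatinHypercube(d=2, seed=0).random(n=8), [-1, -1], [1, 1])
X = np.vstack([ccd, lhs])

y = simulator(X[:, 0], X[:, 1])
coef, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
print("fitted second-order coefficients:", coef.round(3))
# with a quadratic true response the fit recovers it exactly; the added LHS points mainly
# help when the true surface has curvature the 9 CCD points alone cannot capture
```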

Keywords: Central composite design, CO2 liquefaction, Latin Hypercube Sampling, simulation-based optimization.
