Search results for: image search.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2220


360 Evolutionary Eigenspace Learning using CCIPCA and IPCA for Face Recognition

Authors: Ghazy M.R. Assassa, Mona F. M. Mursi, Hatim A. Aboalsamh

Abstract:

Traditional principal components analysis (PCA) techniques for face recognition are based on batch-mode training using a pre-available image set. Real-world applications require the training set to be dynamic and of an evolving nature: within a framework of continuous learning, new training images are continuously added to the original set, which would trigger a costly re-computation of the eigenspace representation by repeating the entire batch-based training on both the old and new images. Incremental PCA methods allow new images to be added and the PCA representation to be updated. In this paper, two incremental PCA approaches, CCIPCA and IPCA, are examined and compared. In addition, different learning and testing strategies are proposed and applied to the two algorithms. The results suggest that batch PCA is inferior to both incremental approaches, and that the CCIPCA variants are practically equivalent.
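
As a rough illustration of the incremental idea, the standard CCIPCA update of Weng et al. can be sketched as follows (a minimal sketch, not the authors' implementation; the amnesic parameter and the learning/testing strategies studied in the paper are omitted):

```python
import numpy as np

def ccipca_update(V, counts, x):
    """One CCIPCA step on a new mean-centred image vector x.
    V: (k, d) array whose rows are current (unnormalized) eigenvector
    estimates; counts[i] is the number of samples eigenvector i has seen."""
    u = x.astype(float).copy()
    for i in range(V.shape[0]):
        counts[i] += 1
        n = counts[i]
        if n == 1:
            V[i] = u                                   # first sample initialises the direction
        else:
            v_norm = np.linalg.norm(V[i])
            V[i] = ((n - 1) / n) * V[i] + (1 / n) * u * (u @ V[i]) / v_norm
        vi_unit = V[i] / np.linalg.norm(V[i])
        u = u - (u @ vi_unit) * vi_unit                # deflate before the next component
    return V, counts

# stream images one by one instead of re-running a batch eigen-decomposition
rng = np.random.default_rng(0)
V = np.zeros((5, 100))                                 # 5 eigenvectors, 100-dimensional images
counts = np.zeros(5, dtype=int)
for _ in range(200):
    V, counts = ccipca_update(V, counts, rng.standard_normal(100))
```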

Keywords: Candid covariance-free incremental principal components analysis (CCIPCA), face recognition, incremental principal components analysis (IPCA).

359 Image Analysis of Fine Structures of Supercavitation in the Symmetric Wake of a Cylinder

Authors: Y. Obikane, M. Kaneko, K. Kakioka, K. Ogura

Abstract:

The fine structure of supercavitation in the wake of a symmetrical cylinder is studied with high-speed video cameras. The flow is observed in a cavitation tunnel at a speed of 8 m/s when the sidewall and the wake are partially filled with massive cavitation bubbles. The experiment observed that a two-dimensional ripple wave with a wavelength of 0.3 mm propagates in the downstream direction and then abruptly thickens into a three-dimensional layer. IR photography recorded that the wakes originated from the horseshoe vortices alongside the cylinder. The wake developed into the dead-water zone, which absorbed the bubbly wake propelled from the vortices separating at the center of the cylinder. A remote sensing classification technique (maximum likelihood) determined that the surface porosity was 0.2 and that the mean speed in the mixed wake was 7 m/s. To confirm the existence of two-dimensional wave motions at the interface, experiments were also conducted at a very low frequency and showed similar gravity waves in both the upper and lower interfaces.

Keywords: Supercavitation, density gradient correlation

358 Feature Based Dense Stereo Matching using Dynamic Programming and Color

Authors: Hajar Sadeghi, Payman Moallem, S. Amirhassn Monadjemi

Abstract:

This paper presents a new feature-based dense stereo matching algorithm that obtains the dense disparity map via dynamic programming. After extracting suitable features, we use matching constraints such as the epipolar line, a disparity limit, the ordering constraint, and a limit on the directional derivative of disparity. A coarse-to-fine multiresolution strategy is also used to decrease the search space and thereby increase accuracy and processing speed. The proposed method links the detected feature points into chains and compares some of the feature points from different chains to increase matching speed. We also employ color stereo matching to increase the accuracy of the algorithm. After feature matching, dynamic programming is used to obtain the dense disparity map. The approach differs from classical DP methods in stereo vision since it employs the sparse disparity map obtained from the feature-based matching stage, and the DP is performed along each scan line only between pairs of matched feature points on that line; the algorithm is thus a true optimization method. It offers a good trade-off between accuracy and computational efficiency: in our experiments, the proposed algorithm increases accuracy by 20% to 70% and reduces the running time by almost 70%.
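
For reference, a bare scanline dynamic-programming disparity search looks roughly like the sketch below (a generic formulation; the chain-based, feature-guided DP and the color matching described here are not reproduced, and max_disp/smooth are illustrative parameters):

```python
import numpy as np

def scanline_dp(left_row, right_row, max_disp=16, smooth=0.1):
    """Disparity along one scan line by dynamic programming over a
    matching cost plus a smoothness penalty between neighbouring pixels."""
    n = len(left_row)
    cost = np.full((n, max_disp + 1), np.inf)
    for d in range(max_disp + 1):
        cost[d:, d] = np.abs(left_row[d:].astype(float) - right_row[:n - d].astype(float))  # |I_L(x) - I_R(x - d)|
    acc = cost.copy()
    back = np.zeros((n, max_disp + 1), dtype=int)
    for x in range(1, n):
        for d in range(max_disp + 1):
            trans = acc[x - 1] + smooth * np.abs(np.arange(max_disp + 1) - d)
            back[x, d] = int(np.argmin(trans))
            acc[x, d] = cost[x, d] + trans[back[x, d]]
    disp = np.zeros(n, dtype=int)
    disp[-1] = int(np.argmin(acc[-1]))
    for x in range(n - 2, -1, -1):                    # backtrack the cheapest path
        disp[x] = back[x + 1, disp[x + 1]]
    return disp
```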

Keywords: Chain Correspondence, Color Stereo Matching, Dynamic Programming, Epipolar Line, Stereo Vision.

357 Automatic Facial Skin Segmentation Using Possibilistic C-Means Algorithm for Evaluation of Facial Surgeries

Authors: Elham Alaee, Mousa Shamsi, Hossein Ahmadi, Soroosh Nazem, Mohammadhossein Sedaaghi

Abstract:

The human face plays a fundamental role in the appearance of individuals, so the importance of facial surgeries is undeniable, and appropriate, accurate facial skin segmentation is needed in order to extract different features. Since the Fuzzy C-Means (FCM) clustering algorithm does not work well on noisy images and outliers, in this paper we exploit the Possibilistic C-Means (PCM) algorithm to segment the facial skin. For this purpose, facial images are first converted from the RGB to the YCbCr color space. To evaluate the performance of the proposed algorithm, the database of Sahand University of Technology, Tabriz, Iran was used. For comparison, the FCM and Expectation-Maximization (EM) algorithms are also used for facial skin segmentation. The proposed method shows better results than the other segmentation methods, with a misclassification error of 0.032 and a region area error of 0.045.
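
A minimal sketch of the PCM update on chrominance (CbCr) pixels is given below, assuming hypothetical initial prototypes and scale parameters eta (for instance taken from a preliminary FCM run); the preprocessing and evaluation steps of the paper are omitted:

```python
import numpy as np

def pcm_segment(pixels, centers, eta, m=2.0, n_iter=20):
    """Possibilistic C-Means on (N, 2) CbCr pixel values.
    centers: (C, 2) initial prototypes; eta: (C,) scale parameters."""
    pixels = np.asarray(pixels, dtype=float)
    centers = np.asarray(centers, dtype=float)
    eta = np.asarray(eta, dtype=float)
    for _ in range(n_iter):
        d2 = ((pixels[:, None, :] - centers[None, :, :]) ** 2).sum(-1)  # squared distances, (N, C)
        t = 1.0 / (1.0 + (d2 / eta) ** (1.0 / (m - 1.0)))               # typicality degrees
        w = t ** m
        centers = (w.T @ pixels) / w.sum(axis=0)[:, None]               # prototype update
    return t.argmax(axis=1), centers                                    # hard skin/non-skin labels

# e.g. labels, protos = pcm_segment(cbcr_pixels, centers=[[110, 150], [128, 128]], eta=[80.0, 80.0])
```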

Keywords: Facial image, segmentation, PCM, FCM, skin error, facial surgery.

356 Evolutionary Techniques for Model Order Reduction of Large Scale Linear Systems

Authors: S. Panda, J. S. Yadav, N. P. Patidar, C. Ardil

Abstract:

Recently, genetic algorithm (GA) and particle swarm optimization (PSO) techniques have attracted considerable attention among modern heuristic optimization techniques. The GA has been popular in academia and industry mainly because of its intuitiveness, ease of implementation, and ability to effectively solve the highly non-linear, mixed-integer optimization problems that are typical of complex engineering systems. PSO is a relatively recent heuristic search method whose mechanics are inspired by the swarming or collaborative behavior of biological populations. In this paper, both PSO and GA optimization are employed to find stable reduced-order models of single-input single-output large-scale linear systems. Both techniques guarantee stability of the reduced-order model if the original high-order model is stable. The PSO method is based on minimizing the Integral Squared Error (ISE) between the transient responses of the original higher-order model and the reduced-order model for a unit step input. Both methods are illustrated through a numerical example from the literature, and the results are compared with a recently published conventional model reduction technique.
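
The ISE objective being minimized can be evaluated numerically, for example as in the sketch below (the transfer-function coefficients are hypothetical placeholders, not the example system from the paper):

```python
import numpy as np
from scipy import signal

def ise(num_full, den_full, num_red, den_red, t_end=10.0, n=2000):
    """Integral Squared Error between the unit-step responses of an original
    and a reduced-order transfer function, approximated on a uniform grid."""
    t = np.linspace(0.0, t_end, n)
    _, y_full = signal.step(signal.TransferFunction(num_full, den_full), T=t)
    _, y_red = signal.step(signal.TransferFunction(num_red, den_red), T=t)
    return float(np.sum((y_full - y_red) ** 2) * (t[1] - t[0]))

# hypothetical 4th-order system vs. a 2nd-order candidate produced by GA/PSO
print(ise([1, 5], [1, 6, 11, 6, 1], [4.0], [1, 1.2, 0.8]))
```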

Keywords: Genetic Algorithm, Particle Swarm Optimization, Order Reduction, Stability, Transfer Function, Integral Squared Error.

355 Comparison of Particle Swarm Optimization and Genetic Algorithm for TCSC-based Controller Design

Authors: Sidhartha Panda, N. P. Padhy

Abstract:

Recently, genetic algorithm (GA) and particle swarm optimization (PSO) techniques have attracted considerable attention among modern heuristic optimization techniques. Since the two approaches are supposed to find a solution to a given objective function but employ different strategies and computational effort, it is appropriate to compare their performance. This paper presents the application and performance comparison of PSO and GA optimization techniques for Thyristor Controlled Series Compensator (TCSC)-based controller design. The design objective is to enhance power system stability. The design problem of the FACTS-based controller is formulated as an optimization problem, and both the PSO and GA techniques are employed to search for optimal controller parameters. The performance of the two optimization techniques is compared in terms of computational time and convergence rate. Further, the optimized controllers are tested on a weakly connected power system subjected to different disturbances, and their performance is compared with a conventional power system stabilizer (CPSS). Eigenvalue analysis and non-linear simulation results are presented and compared to show the effectiveness of both techniques in designing a TCSC-based controller to enhance power system stability.

Keywords: Thyristor Controlled Series Compensator, genetic algorithm, particle swarm optimization, Phillips-Heffron model, power system stability.

354 Validation of Automotive Centrals Using Hardware in the Loop-Body Control Unit and Lights

Authors: Marley Rosa Luciano, Rodney Rezende Saldanha

Abstract:

The race for electrification and the need for innovation to attract customers have led the automotive industry to do something different with vehicles. New emissions-control challenges and the availability of efficient technology are the pillars of this development. The growing demand to upgrade industrial manufacturing systems leads to actions that directly impact vehicle production, and with it comes the search for new prototyping methods and virtual tools for the testing and validation of components and vehicle systems. Demand for Electronic Control Units (ECU) is increasing due to the intelligence and safety expected in today's vehicles, directly affecting their development, performance, and functional testing. To keep up with global changes, the automotive industry uses different virtual environments to produce, verify, and validate vehicles and to test prototypes used during development. In this paper, integration and validation were therefore performed on a Hardware-in-the-Loop (HIL) test platform, focusing on the Body Control Module (BCM) ECU. A brief commentary then reviews other test platforms, such as the Plywood Buck (PWB), and examines the reliability, flexibility, installation time, and cost of the three test platforms: Software-in-the-Loop (SIL), Model-in-the-Loop (MIL), and HIL. Their benefits, challenges, and issues in use are reviewed in order to optimize the use of each platform and test medium.

Keywords: Automotive, Electronic Control Unit, xIL, Hardware-in-the-Loop.

353 A Weighted Approach to Unconstrained Iris Recognition

Authors: Yao-Hong Tsai

Abstract:

This paper presents a weighted approach to unconstrained iris recognition. Nowadays, commercial systems are usually characterized by strong acquisition constraints that rely on the subject's cooperation; however, such cooperation is not always achievable in real-life scenarios. Researchers have therefore focused on reducing these constraints while maintaining system performance through new techniques. To cope with large environmental variation, two main improvements are developed for the proposed iris recognition system. First, to handle extremely uneven lighting conditions, statistics-based illumination normalization is applied to the eye region to increase the accuracy of the iris features; the iris image itself is detected using the AdaBoost algorithm. Second, a weighting scheme is designed using Gaussian functions of the distance to the center of the iris, and a local binary pattern (LBP) histogram is then applied to texture classification with these weights. Experiments showed that the proposed system provides users with a more flexible and feasible way to interact with the verification system through iris recognition.
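
The radial Gaussian weighting and the weighted LBP histogram can be sketched as follows (a minimal illustration; the iris centre, sigma, and the integer LBP code image are assumed to come from earlier stages of the pipeline):

```python
import numpy as np

def radial_gaussian_weights(shape, center, sigma):
    """Weight map that emphasises pixels near the iris centre."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    r2 = (yy - center[0]) ** 2 + (xx - center[1]) ** 2
    return np.exp(-r2 / (2.0 * sigma ** 2))

def weighted_lbp_histogram(lbp_codes, weights, n_bins=256):
    """Accumulate each pixel's LBP code into a histogram using its Gaussian
    weight instead of a unit count, then normalise."""
    hist = np.zeros(n_bins)
    np.add.at(hist, lbp_codes.ravel(), weights.ravel())
    return hist / (hist.sum() + 1e-12)
```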

Keywords: Authentication, iris recognition, Adaboost, local binary pattern.

352 Analysis of Feature Space for a 2d/3d Vision based Emotion Recognition Method

Authors: Robert Niese, Ayoub Al-Hamadi, Bernd Michaelis

Abstract:

In modern human computer interaction (HCI) systems, emotion recognition is becoming an imperative characteristic. The quest for effective and reliable emotion recognition in HCI has resulted in a need for better face detection, feature extraction, and classification. In this paper we present the results of a feature space analysis after briefly explaining our fully automatic vision-based emotion recognition method. We demonstrate the compactness of the feature space and show how the 2D/3D-based method achieves superior features for the purpose of emotion classification. It is also shown that feature normalization creates a largely person-independent feature space; as a consequence, the classifier architecture has only a minor influence on the classification result. This is particularly elucidated with the help of confusion matrices. For this purpose, advanced classification algorithms such as Support Vector Machines and Artificial Neural Networks are employed, as well as the simple k-Nearest Neighbor classifier.

Keywords: Facial expression analysis, Feature extraction, Image processing, Pattern Recognition, Application.

351 Fingerprint Image Encryption Using a 2D Chaotic Map and Elliptic Curve Cryptography

Authors: D. M. S. Bandara, Yunqi Lei, Ye Luo

Abstract:

Fingerprints are suitable as long-term markers of human identity since they provide detailed and unique individual features that are difficult to alter and durable over a lifetime. In this paper, we propose an algorithm to encrypt and decrypt fingerprint images using a specially designed Elliptic Curve Cryptography (ECC) procedure based on block ciphers. In addition, to increase the confusion effect of the fingerprint encryption, we also utilize a chaotic method, the Arnold Cat Map (ACM), for 2D scrambling of pixel locations. Experiments are carried out with various efficiency and security analyses. The results demonstrate that the proposed fingerprint encryption/decryption algorithm is advantageous in several respects, including efficiency, security, and flexibility. In particular, using this algorithm, we achieve a margin of about 0.1% in the Number of Pixel Changing Rate (NPCR) test compared to state-of-the-art performance.
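
The ACM scrambling stage alone can be sketched as below (a sketch of the pixel-position scrambling only; the ECC block cipher and key handling described in the paper are not shown, and the iteration count is a free parameter):

```python
import numpy as np

def arnold_cat_map(img, iterations=1):
    """Scramble the pixel positions of a square image with the Arnold Cat Map
    (x, y) -> (x + y, x + 2y) mod N, applied 'iterations' times."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "ACM needs a square image"
    out = img.copy()
    for _ in range(iterations):
        xx, yy = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        scrambled = np.empty_like(out)
        scrambled[(xx + yy) % n, (xx + 2 * yy) % n] = out[xx, yy]
        out = scrambled
    return out
```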

Keywords: Arnold cat map, biometric encryption, block cipher, elliptic curve cryptography, fingerprint encryption, Koblitz’s Encoding.

350 Video Shot Detection and Key Frame Extraction Using Faber Shauder DWT and SVD

Authors: Assma Azeroual, Karim Afdel, Mohamed El Hajji, Hassan Douzi

Abstract:

Key frame extraction methods select the most representative frames of a video, which can be used in different areas of video processing such as video retrieval, video summarization, and video indexing. In this paper we present a novel approach for extracting key frames from video sequences. Each frame is characterized uniquely by its contours, which are represented by dominant blocks located on the contours and their nearby textures. When the video frames change noticeably, their dominant blocks change as well, and a key frame can be extracted. The dominant blocks of every frame are computed, feature vectors are extracted from the dominant-block image of each frame and arranged in a feature matrix, and Singular Value Decomposition (SVD) is used to calculate the ranks of sliding windows of those matrices. Finally, the computed ranks are traced to extract the key frames of the video. Experimental results show that the proposed approach is robust against a large range of digital effects used during shot transitions.
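
The sliding-window rank computation can be sketched as follows (assuming a per-frame feature matrix has already been built from the dominant blocks; the window size and tolerance are illustrative):

```python
import numpy as np

def sliding_window_ranks(features, window=10, tol=1e-6):
    """Numerical rank of each sliding window of per-frame feature vectors
    (features: (n_frames, d) matrix); a jump in rank between consecutive
    windows marks a shot boundary / key-frame candidate."""
    ranks = []
    for start in range(features.shape[0] - window + 1):
        s = np.linalg.svd(features[start:start + window], compute_uv=False)
        ranks.append(int(np.sum(s > tol * (s[0] + 1e-12))))
    return np.array(ranks)
```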

Keywords: Key Frame Extraction, Shot detection, FSDWT, Singular Value Decomposition.

349 Frame Texture Classification Method (FTCM) Applied on Mammograms for Detection of Abnormalities

Authors: Kjersti Engan, Karl Skretting, Jostein Herredsvela, Thor Ole Gulsrud

Abstract:

Texture classification is an important image processing task with a broad application range. Many different techniques for texture classification have been explored. Using sparse approximation as a feature extraction method for texture classification is a relatively new approach, and Skretting et al. recently presented the Frame Texture Classification Method (FTCM), showing very good results on classical texture images. As an extension of that work, the FTCM is here tested on a real-world application: detection of abnormalities in mammograms. Some extensions to the original FTCM that are useful in certain applications are implemented: two different smoothing techniques and a vector augmentation technique. Both the detection of microcalcifications (as a primary detection technique and as the last stage of a detection scheme) and the detection of soft-tissue lesions in mammograms are explored. All the results are interesting, and especially the results using FTCM on regions of interest as the last stage of a detection scheme for microcalcifications are promising.

Keywords: Detection, mammogram, texture classification, dictionary learning, FTCM.

348 Real Time Speed Estimation of Vehicles

Authors: Azhar Hussain, Kashif Shahzad, Chunming Tang

Abstract:

This paper gives a novel approach towards real-time speed estimation of multiple traffic vehicles using fuzzy logic and image processing techniques with a proper arrangement of camera parameters. The described algorithm consists of several important steps. First, the background is estimated by computing the median over a time window of specific frames. Second, the foreground is extracted using a fuzzy similarity approach (FSA) between the estimated background pixels and the pixels of the current frame, which contains both foreground and background. Third, the traffic lanes are divided into two parts, one for each travel direction, for parallel processing. Finally, the speeds of the vehicles are estimated with a maximum a posteriori (MAP) estimator. True ground speed was determined using infrared sensors for three different vehicles, and the comparison shows that the proposed algorithm achieves an accuracy of ±0.74 km/h.
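
The first two steps can be sketched as follows (a plain absolute-difference threshold stands in for the paper's fuzzy similarity approach, and the threshold value is illustrative):

```python
import numpy as np

def estimate_background(frames):
    """Per-pixel temporal median over a window of grayscale frames
    (frames: array of shape (T, H, W)), as in the first step above."""
    return np.median(frames, axis=0)

def foreground_mask(frame, background, threshold=25):
    """Absolute-difference foreground mask; a plain threshold stands in here
    for the fuzzy similarity approach used in the paper."""
    return np.abs(frame.astype(float) - background) > threshold
```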

Keywords: Defuzzification, fuzzy similarity approach, lane cropping, maximum a posteriori (MAP) estimator, speed estimation.

347 Split-Pipe Design of Water Distribution Network Using Simulated Annealing

Authors: J. Tospornsampan, I. Kita, M. Ishii, Y. Kitamura

Abstract:

In this paper, a procedure for the split-pipe design of looped water distribution networks based on simulated annealing is proposed. Simulated annealing is a heuristic search algorithm, motivated by an analogy with physical annealing in solids, that is capable of solving combinatorial optimization problems. In contrast to split-pipe designs derived from a continuous diameter design, as implemented in conventional optimization techniques, the split-pipe design proposed in this paper is derived from a discrete diameter design in which pipe diameters are chosen directly from a specified set of commercial pipes. The optimality and feasibility of the solutions are guaranteed by the proposed method. The performance of the procedure is demonstrated by solving three well-known water distribution network problems taken from the literature. Simulated annealing provides very promising solutions, and the lowest-cost solutions are found for all of these test problems. The results show that simulated annealing is able to handle the combinatorial optimization problem of least-cost design of water distribution networks, and the technique can be considered an alternative tool for similar areas of research; further applications and improvements of the technique are expected as well.
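
A generic simulated-annealing skeleton over a discrete diameter assignment is sketched below (one diameter index per link for brevity, whereas the split-pipe design assigns two commercial diameters per link; the cost function, which must include a penalty for hydraulic infeasibility, is left as a user-supplied placeholder):

```python
import math
import random

def anneal(diam_options, n_pipes, cost_fn, t0=1000.0, cooling=0.999, iters=20000):
    """Simulated annealing over a discrete diameter assignment (one commercial
    diameter index per pipe); cost_fn returns pipe cost plus an infeasibility
    penalty and is problem specific."""
    current = [random.randrange(len(diam_options)) for _ in range(n_pipes)]
    cur_cost = cost_fn(current)
    best, best_cost = list(current), cur_cost
    temp = t0
    for _ in range(iters):
        cand = list(current)
        cand[random.randrange(n_pipes)] = random.randrange(len(diam_options))  # perturb one pipe
        c = cost_fn(cand)
        # always accept improvements; accept worse moves with Boltzmann probability
        if c < cur_cost or random.random() < math.exp((cur_cost - c) / temp):
            current, cur_cost = cand, c
            if c < best_cost:
                best, best_cost = list(cand), c
        temp = max(temp * cooling, 1e-9)  # geometric cooling schedule
    return best, best_cost
```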

Keywords: Combinatorial problem, Heuristics, Least-cost design, Looped network, Pipe network, Optimization

346 The Temperature Effects on the Microstructure and Profile in Laser Cladding

Authors: P. C. Chiu, Jehnming Lin

Abstract:

In this study, a 50-W CO2 laser was used to clad 304L powder on a stainless steel substrate, with a temperature sensor and an image monitoring system. The laser power, cladding speed, and focal position were adjusted to meet the requirements for workpiece flatness and mechanical properties. A numerical calculation based on ANSYS was used to analyze the temperature change of the moving heat source at different surface positions while coating the workpiece, and the effect of the process parameters on the melt bath size was discussed. The temperature of the stainless steel powder at the nozzle outlet reacting with the laser was simulated as a process parameter. In the experiment, the difference in heat dissipation in three-dimensional space was compared between single-layer and multi-layer cladding: in single-layer cladding heat dissipates through the steel plate, whereas in multi-layer cladding it dissipates through the workpiece itself. The relationship between the multi-layer cladding temperature and the profile was analyzed using the temperature signal from an IR pyrometer.

Keywords: Laser cladding, temperature, profile, microstructure.

345 In Search of a Suitable Neural Network Capable of Fast Monitoring of Congestion Level in Electric Power Systems

Authors: Pradyumna Kumar Sahoo, Prasanta Kumar Satpathy

Abstract:

This paper aims at finding a suitable neural network for monitoring the congestion level in electrical power systems. The input data are framed properly to meet the target objective through a supervised learning mechanism by defining normal and abnormal operating conditions for the system under study. The congestion level, expressed as a line congestion index (LCI), is evaluated for each operating condition and is presented to the NN along with the bus voltages as the input and target data. Once training is successful, the NN learns how to deal with newly presented data through the validation and testing mechanism. The crux of the results presented in this paper rests on a performance comparison of a multi-layered feed-forward neural network with eleven back-propagation training techniques, so as to identify the best training criterion. The proposed methodology has been tested on the standard IEEE 14-bus test system with the support of the MATLAB NN toolbox. The results signify that the Levenberg-Marquardt backpropagation algorithm gives the best training performance of the eleven cases considered, thus validating the proposed methodology.

Keywords: Line congestion index, critical bus, contingency, neural network.

344 A Cheating Model for Cellular Automata-Based Secret Sharing Schemes

Authors: Borna Jafarpour, Azadeh Nematzadeh, Vahid Kazempour, Babak Sadeghian

Abstract:

Cellular automata have been used for the design of cryptosystems. Recently, some secret sharing schemes based on linear memory cellular automata have been introduced, which are used for both text and images. In this paper, we illustrate that these secret sharing schemes are vulnerable to collusion by dishonest participants. We propose a cheating model for secret sharing schemes based on linear memory cellular automata. For this purpose we present a novel uniform model for the representation of all secret sharing schemes based on cellular automata. Participants can cheat by sending bogus shares or bogus transition rules. Cheaters can cooperate to corrupt a shared secret and compute a cheating value added to it. Honest participants are not aware of the cheating and accept the incorrect secret as the valid one. We prove that cheaters can recover the valid secret by removing the cheating value from the corrupted secret, and we provide methods for calculating the cheating value.

Keywords: Cellular automata, cheating model, secret sharing, threshold scheme.

343 Neural Network based Texture Analysis of Liver Tumor from Computed Tomography Images

Authors: K. Mala, V. Sadasivam, S. Alagappan

Abstract:

Advances in clinical medical imaging have brought about the routine production of vast numbers of medical images that need to be analyzed. As a result, an enormous amount of computer vision research effort has been targeted at achieving automated medical image analysis. Computed Tomography (CT) is highly accurate for diagnosing liver tumors. This study aimed to evaluate the potential role of the wavelet transform and the neural network in the differential diagnosis of liver tumors in CT images. The tumors considered in this study are hepatocellular carcinoma, cholangiocarcinoma, hemangioma, and hepatic adenoma. Each suspicious tumor region was automatically extracted from the CT abdominal images, and the textural information obtained was used to train a Probabilistic Neural Network (PNN) to classify the tumors. The results were evaluated with the help of radiologists. The system differentiates the tumors with relatively high accuracy and is therefore clinically useful.

Keywords: Fuzzy c means clustering, texture analysis, probabilistic neural network, LVQ neural network.

342 Aliveness Detection of Fingerprints using Multiple Static Features

Authors: Heeseung Choi, Raechoong Kang, Kyungtaek Choi, Jaihie Kim

Abstract:

Fake finger submission attack is a major problem in fingerprint recognition systems. In this paper, we introduce an aliveness detection method based on multiple static features derived from a single fingerprint image. The static features comprise individual pore spacing, residual noise, and several first-order statistics; specifically, a correlation filter is adopted to measure individual pore spacing. The multiple static features reflect the physiological and statistical characteristics of live and fake fingerprints. Classification is performed by calculating a liveness score from each feature and fusing the scores through a classifier. On our dataset, we compare nine classifiers, and the best classification rate of 85% is attained by a Reduced Multivariate Polynomial classifier. Our approach is faster and more convenient for aliveness checking in field applications.

Keywords: Aliveness detection, Fingerprint recognition, individual pore spacing, multiple static features, residual noise.

341 Scenarios for a Sustainable Energy Supply: Results of a Case Study for Austria

Authors: Petra Wächter

Abstract:

A comprehensive discussion of feasible strategies for a sustainable energy supply is urgently needed to achieve a turnaround of the current energy situation. The fundamentals required for the development of a long-term energy vision are lacking to a great extent, due to the absence of reasonable long-term scenarios that fulfill the requirements of climate protection and sustainable energy use. The contribution of this study is a search for sustainable long-run energy paths for Austria. The analysis predominantly makes use of secondary data. The measures developed to avoid CO2 emissions and other ecological risk factors vary greatly among the economic sectors, as shown by the calculation of CO2 abatement cost curves. The study demonstrates that the most effective technical measures with the lowest CO2 abatement costs yield solutions to the current energy problems. Various scenarios are presented concerning what the technological and environmental options for a sustainable energy system for Austria could look like in the long run, and it is shown how sustainable energy can be supplied even with today's technological knowledge and available options. The scenarios developed include an evaluation of the economic costs and ecological impacts. The results are not only applicable to Austria but also demonstrate feasible and cost-efficient ways towards a sustainable future.

Keywords: Cost of CO2 Abatement, Energy Economics, Energy Efficiency, Renewable Energy Technologies, Sustainable Energy and Development.

340 Face Recognition Based On Vector Quantization Using Fuzzy Neuro Clustering

Authors: Elizabeth B. Varghese, M. Wilscy

Abstract:

A face recognition system is a computer application for automatically identifying or verifying a person from a digital image or a video frame. Many algorithms have been proposed for face recognition, and Vector Quantization (VQ) based face recognition is a novel approach among them. Here, a new codebook generation method for VQ-based face recognition using Integrated Adaptive Fuzzy Clustering (IAFC) is proposed. IAFC is a fuzzy neural network that incorporates a fuzzy learning rule into a competitive neural network. The performance of the proposed algorithm is demonstrated using the publicly available AT&T database, the Yale database, the Indian Face database, and a small face database, the DCSKU database, created in our lab. On all the databases the proposed approach achieves a higher recognition rate than most of the existing methods, and in terms of Equal Error Rate (EER) the proposed codebook is also better than the existing methods.
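
The VQ matching stage itself can be sketched as below (a generic codeword-usage histogram over image blocks; the IAFC codebook training and the databases listed above are not part of the sketch, and the block size is illustrative):

```python
import numpy as np

def vq_histogram(image, codebook, block=4):
    """Quantize non-overlapping block x block patches of a grayscale face image
    against a codebook (rows are codewords) and return the normalised
    codeword-usage histogram used as the face feature."""
    codebook = np.asarray(codebook, dtype=float)
    h, w = image.shape
    patches = (image[:h - h % block, :w - w % block]
               .reshape(h // block, block, w // block, block)
               .transpose(0, 2, 1, 3)
               .reshape(-1, block * block)).astype(float)
    d = ((patches[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)  # distance to every codeword
    hist = np.bincount(d.argmin(axis=1), minlength=len(codebook)).astype(float)
    return hist / hist.sum()

# two faces can then be compared by a distance between their histograms,
# e.g. np.linalg.norm(vq_histogram(img1, cb) - vq_histogram(img2, cb))
```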

Keywords: Face Recognition, Vector Quantization, Integrated Adaptive Fuzzy Clustering, Self Organization Map.

339 Automated Detection of Alzheimer Disease Using Region Growing technique and Artificial Neural Network

Authors: B. Al-Naami, N. Gharaibeh, A. AlRazzaq Kheshman

Abstract:

Alzheimer's disease is known as the loss of mental functions such as thinking, memory, and reasoning that is severe enough to interfere with a person's daily functioning. The symptoms of Alzheimer's disease (AD) depend on which part of the brain is infected or damaged. MRI is the best biomedical instrument that can be used to discover the existence of AD. Therefore, this paper proposes a fusion method to distinguish between normal and AD MRIs. In this combined method, 27 MRIs collected from Jordanian hospitals are analyzed using low-pass and morphological filters; statistical outputs are extracted through the intensity histogram and employed in a descriptive box plot. The artificial neural network (ANN) is also applied to test the performance of this approach. Finally, the obtained t-test result (95% confidence) is compared with the classification accuracy of the ANN (100%). The robustness of the developed method makes it effective for diagnosing AD and determining the type of AD image.

Keywords: Alzheimer disease, Brain MRI analysis, Morphological filter, Box plot, Intensity histogram, ANN.

338 Automatic Segmentation of Thigh Magnetic Resonance Images

Authors: Lorena Urricelqui, Armando Malanda, Arantxa Villanueva

Abstract:

Purpose: To develop a method for the automatic segmentation of adipose and muscular tissue in thighs from magnetic resonance images. Materials and methods: Thirty obese women were scanned on a Siemens Impact Expert 1T resonance machine, and 1500 images were finally used in the tests. The developed segmentation method is a recursive, multilevel process that makes use of several concepts such as shaped histograms, adaptive thresholding, and connectivity. The segmentation process was implemented in Matlab and operates without the need for any user interaction. The whole set of images was segmented with the developed method, and an expert radiologist segmented the same set of images following a manual procedure with the aid of the SliceOmatic software (Tomovision); these manual segmentations constituted our gold standard. Results: The number of coincidental pixels between the automatic and manual segmentation procedures was measured; the average results were above 90% success in most of the images. Conclusions: The proposed approach allows effective automatic segmentation of MRIs of thighs, comparable to expert manual performance.

Keywords: Segmentation, thigh, magnetic resonance image, fat, muscle.

337 Real-Time Visual Simulation and Interactive Animation of Shadow Play Puppets Using OpenGL

Authors: Tan Kian Lam, Abdullah Zawawi bin Haji Talib, Mohd. Azam Osman

Abstract:

This paper describes a method for modeling shadow play puppets using sophisticated computer graphics techniques available in OpenGL, in order to allow interactive play in a real-time environment as well as to produce realistic animation. A novel real-time method is proposed for modeling the puppet and its shadow image that allows interactive play of virtual shadow play using texture mapping and blending techniques. Special effects such as lighting and blurring for the virtual shadow play environment are also developed. Moreover, the use of geometric transformations and hierarchical modeling facilitates interaction among the different parts of the puppet during animation. Based on the experiments and the survey that were carried out, the respondents involved are very satisfied with the outcomes of these techniques.

Keywords: Animation, blending, hierarchical modeling, interactive play, real-time, shadow play, visual simulation.

336 Solving Part Type Selection and Loading Problem in Flexible Manufacturing System Using Real Coded Genetic Algorithms – Part II: Optimization

Authors: Wayan F. Mahmudy, Romeo M. Marian, Lee H. S. Luong

Abstract:

This paper presents the modeling and optimization of two NP-hard problems in flexible manufacturing systems (FMS): the part type selection problem and the loading problem. Due to the complexity and extent of the problems, the paper was split into two parts. The first part discussed the modeling of the problems and showed how real coded genetic algorithms (RCGA) can be applied to solve them. This second part discusses the effectiveness of the RCGA, which uses an array of real numbers as its chromosome representation. The proposed chromosome representation produces only feasible solutions, which minimizes the computational time the GA would otherwise need to push its population toward the feasible search space or to repair infeasible chromosomes. The proposed RCGA improves FMS performance by considering two objectives: maximizing system throughput and maintaining the balance of the system (minimizing system unbalance). The resulting objective values are compared to the optimum values produced by the branch-and-bound method. The experiments show that the proposed RCGA can reach near-optimum solutions in a reasonable amount of time.

Keywords: Flexible manufacturing system, production planning, part type selection problem, loading problem, real-coded genetic algorithm

335 Adaptive Digital Watermarking Integrating Fuzzy Inference HVS Perceptual Model

Authors: Sherin M. Youssef, Ahmed Abouelfarag, Noha M. Ghatwary

Abstract:

An adaptive fuzzy inference perceptual model is proposed for the watermarking of digital images. The model depends on the human visual characteristics of image sub-regions in the frequency multi-resolution wavelet domain. In the proposed model, a multi-variable fuzzy architecture is designed to produce a perceptual membership degree for both the candidate embedding sub-regions and the watermark embedding strength factor. Benchmark images of different sizes with watermarks of different sizes have been applied to the model. Several experimental attacks, such as JPEG compression, noise, and rotation, have been applied to ensure the robustness of the scheme. In addition, the model has been compared with different watermarking schemes. The proposed model showed robustness to attacks while at the same time achieving a high level of imperceptibility.
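
For orientation, a bare additive DWT-domain embedding step looks roughly like the sketch below (a single global strength factor alpha stands in for the fuzzy-inference perceptual weighting described here, and the Haar wavelet choice is an assumption):

```python
import numpy as np
import pywt

def embed_watermark(image, watermark, alpha=0.05):
    """Embed a watermark additively in the horizontal detail band of a
    one-level DWT; the marked image is returned after the inverse transform."""
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), "haar")
    wm = np.resize(watermark.astype(float), cH.shape)   # tile/crop watermark to the band size
    cH_marked = cH + alpha * np.abs(cH) * wm             # strength scaled by local coefficient energy
    return pywt.idwt2((cA, (cH_marked, cV, cD)), "haar")
```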

Keywords: Watermarking, The human visual system (HVS), Fuzzy Inference System (FIS), Local Binary Pattern (LBP), Discrete Wavelet Transform (DWT).

334 Low Cost Technique for Measuring Luminance in Biological Systems

Authors: N. Chetty, K. Singh

Abstract:

In this work, the relationship between the melanin content in a tissue and the subsequent absorption of light through that tissue was determined using a digital camera. This technique proved to be simple, cost-effective, efficient, and reliable. Tissue phantom samples were created using milk and soy sauce to simulate the optical properties of melanin in human tissue; increasing the concentration of soy sauce in the milk corresponds to an increase in the melanin content of an individual. Two methods were employed to measure the light transmitted through the sample. The first was direct measurement of the transmitted intensity using a conventional lux meter. The second method involved correctly calibrating an ordinary digital camera and using image analysis software to calculate the intensity transmitted through the phantom. The results from these methods were then graphically compared to the theoretical relationship between the intensity of transmitted light and the concentration of absorbers in the sample, and conclusions were drawn about the effectiveness and efficiency of these low-cost methods.
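
The theoretical transmitted-intensity relationship referred to is presumably of Beer-Lambert form; a minimal numerical sketch for comparing the two measurement routes might look like this (the exponential attenuation model and the use of a mean grayscale value as relative luminance are assumptions, not the authors' stated procedure):

```python
import numpy as np

def transmitted_intensity(i0, mu_a, path_cm):
    """Beer-Lambert style attenuation I = I0 * exp(-mu_a * d) through a sample
    of thickness d (cm) with absorption coefficient mu_a (1/cm)."""
    return i0 * np.exp(-mu_a * path_cm)

def relative_luminance(gray_patch):
    """Mean pixel value of a calibrated camera's grayscale patch, used as a
    relative measure of the transmitted intensity."""
    return float(np.mean(gray_patch))
```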

Keywords: Tissue phantoms, scattering coefficient, albedo, low-cost method.

333 An Investigation into the Impact of Techno-Entrepreneurship Education on Self-Employment

Authors: F. Farzin

Abstract:

Research has shown that techno-entrepreneurship is economically significant. Therefore, it is suggested that teaching techno-entrepreneurship may be important, because such programmes would prepare current and future generations of learners to recognise and act on high-technology opportunities. Education in techno-entrepreneurship may increase knowledge of how to start one's own enterprise and how to recognise technological opportunities for commercialisation, improve decision-making about starting a new venture, and influence decisions about capturing business opportunities and turning them into successful ventures. Universities can play a main role in connecting and networking techno-entrepreneurship students towards a cooperative attitude with real business practice and industry knowledge. To investigate whether education for techno-entrepreneurs really helps, this paper chooses a comparison of literature reviews as its method of research. After reviewing the literature related to the impact of techno-entrepreneurship education on self-employment, six studies with aims and objectives similar to those of this paper were selected. These papers were chosen based on a keyword search and because their aims, objectives, and gaps were close to the current research; in addition, they all addressed the influence of techno-entrepreneurship education on self-employment and on students' intention to start new ventures. The findings showed that techno-entrepreneurship education may have an influence on students' intention and their future self-employment, but which courses should be covered and the duration of the programmes need further investigation.

Keywords: Techno-entrepreneurship education, training, higher education, intention, self-employment.

332 Infrared Camera-Based Hand Gesture Space Touch System Implementation of Smart Device Environment

Authors: Yang-Keun Ahn, Kwang-Soon Choi, Young-Choong Park, Kwang-Mo Jung

Abstract:

This paper proposes a method for recognizing the tip of a finger and a space-touch hand gesture using an infrared camera in a smart device environment. The proposed method estimates the fingertip with a curvature-based ellipse fitting algorithm and verifies that the estimated object is indeed a finger using the rectangular area of the fitted ellipse. The features extracted from the verified fingertip are used to implement mouse movement and a clicking gesture. The proposed algorithm was implemented on an actual smart device to test the method. Empirical parameters were obtained from the keypad software and an image analysis tool for performance optimization, and a comparative analysis with previous research showed improved performance with the proposed method.

Keywords: Infrared camera, Hand gesture, Smart device, Space touch.

331 Surrogate based Evolutionary Algorithm for Design Optimization

Authors: Maumita Bhattacharya

Abstract:

Optimization is often a critical issue for most system design problems. Evolutionary Algorithms (EAs) are population-based, stochastic search techniques widely used as efficient global optimizers. However, finding optimal solutions to complex, high-dimensional, multimodal problems often requires highly computationally expensive function evaluations and is hence practically prohibitive. The Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model presented in our earlier work [14] reduced computation time by the controlled use of meta-models to partially replace actual function evaluations with approximate ones. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model; situations such as model formation involving variable input dimensions and noisy data certainly cannot be covered by this assumption. In this paper we present an enhanced version of DAFHEA that incorporates a multiple-model based learning approach for the SVM approximator. DAFHEA-II (the enhanced version of the DAFHEA framework) also overcomes the high computational expense involved with the additional clustering requirements of the original DAFHEA framework. The proposed framework has been tested on several benchmark functions, and the empirical results illustrate the advantages of the proposed technique.
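
A toy surrogate-assisted evolutionary loop with a single global SVM regression meta-model is sketched below (this follows the general DAFHEA idea of mixing approximate and exact evaluations; the multiple-model learning of DAFHEA-II and all parameter values here are assumptions):

```python
import numpy as np
from sklearn.svm import SVR

def surrogate_ea(f, dim, pop=30, gens=50, exact_per_gen=5, seed=0):
    """Evolutionary loop where most offspring are scored by an SVR meta-model
    and only the most promising few are evaluated with the expensive f."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, size=(pop, dim))
    archive_X = X.copy()
    archive_y = np.array([f(x) for x in X])           # exact evaluations of the initial population
    for _ in range(gens):
        model = SVR(kernel="rbf", C=10.0).fit(archive_X, archive_y)
        parents = X[rng.integers(0, pop, size=2 * pop)]
        children = parents + rng.normal(scale=0.3, size=parents.shape)   # Gaussian mutation
        pred = model.predict(children)                # cheap approximate fitness
        exact_idx = np.argsort(pred)[:exact_per_gen]  # re-evaluate the best-predicted exactly
        exact_y = np.array([f(children[i]) for i in exact_idx])
        archive_X = np.vstack([archive_X, children[exact_idx]])
        archive_y = np.concatenate([archive_y, exact_y])
        fitness = pred.copy()
        fitness[exact_idx] = exact_y
        X = children[np.argsort(fitness)[:pop]]       # survivor selection
    best = int(np.argmin(archive_y))
    return archive_X[best], archive_y[best]

# usage with a cheap stand-in for an expensive objective
print(surrogate_ea(lambda x: float(np.sum(x ** 2)), dim=5))
```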

Keywords: Evolutionary algorithm, Fitness function, Optimization, Meta-model, Stochastic method.
