Search results for: Core transform
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1282

922 A Normalization-based Robust Image Watermarking Scheme Using SVD and DCT

Authors: Say Wei Foo, Qi Dong

Abstract:

Digital watermarking is one of the techniques for copyright protection. In this paper, a normalization-based robust image watermarking scheme which encompasses singular value decomposition (SVD) and discrete cosine transform (DCT) techniques is proposed. For the proposed scheme, the host image is first normalized to a standard form and divided into non-overlapping image blocks. SVD is applied to each block. By concatenating the first singular values (SV) of adjacent blocks of the normalized image, an SV block is obtained. DCT is then carried out on the SV blocks to produce SVD-DCT blocks. A watermark bit is embedded in the high-frequency band of an SVD-DCT block by imposing a particular relationship between two pseudo-randomly selected DCT coefficients. An adaptive frequency mask is used to adjust the local watermark embedding strength. Watermark extraction mainly involves the inverse process. The watermark extraction method is blind and efficient. Experimental results show that the quality degradation of the watermarked image caused by the embedded watermark is visually transparent. Results also show that the proposed scheme is robust against various image processing operations and geometric attacks.
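As a rough illustration of the embedding rule above, the sketch below (a minimal simplification of the idea, not the authors' code; the coefficient indices i, j and the margin are assumed values) forces an order relation between two DCT coefficients of a block of singular values and recovers the bit blindly. It requires NumPy and SciPy.

```python
# Hypothetical sketch: embed one watermark bit in a block of singular values by
# forcing an order relation between two chosen DCT coefficients.
import numpy as np
from scipy.fft import dct, idct

def embed_bit(block, bit, i, j, margin=2.0):
    """bit=1 enforces C[i] >= C[j] + margin; bit=0 enforces C[j] >= C[i] + margin.
    Indices i, j and margin are illustrative choices, not values from the paper."""
    c = dct(block, norm='ortho')
    hi, lo = (i, j) if bit else (j, i)
    if c[hi] < c[lo] + margin:               # adjust only when the relation is violated
        mid = (c[hi] + c[lo]) / 2.0
        c[hi], c[lo] = mid + margin / 2.0, mid - margin / 2.0
    return idct(c, norm='ortho')

def extract_bit(block, i, j):
    c = dct(block, norm='ortho')
    return int(c[i] >= c[j])                 # blind extraction: no original image needed

sv_block = np.linspace(10.0, 1.0, 16)        # stand-in for concatenated singular values
marked = embed_bit(sv_block, 1, 5, 9)
assert extract_bit(marked, 5, 9) == 1
```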

Keywords: Image watermarking, Image normalization, Singular value decomposition, Discrete cosine transform, Robustness.

921 Codebook Generation for Vector Quantization on Orthogonal Polynomials based Transform Coding

Authors: R. Krishnamoorthi, N. Kannan

Abstract:

In this paper, a new algorithm for generating a codebook is proposed for vector quantization (VQ) in image coding. The significant features of the training image vectors are extracted by using the proposed Orthogonal Polynomials based transformation. We propose to generate the codebook by partitioning these feature vectors into a binary tree. Each feature vector at a non-terminal node of the binary tree is directed to one of the two descendants by comparing a single feature associated with that node to a threshold. The binary tree codebook is used for encoding and decoding the feature vectors. In the decoding process the feature vectors are subjected to inverse transformation with the help of the basis functions of the proposed Orthogonal Polynomials based transformation to get back the approximated input image training vectors. The results of the proposed coding are compared with VQ using the Discrete Cosine Transform (DCT) and the Pairwise Nearest Neighbor (PNN) algorithm. The new algorithm results in a considerable reduction in computation time and provides better reconstructed picture quality.
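The node rule described in the abstract (a single feature compared against a threshold at each non-terminal node) can be sketched as follows. The split criterion, feature choice, and leaf count here are assumptions for illustration, not the paper's orthogonal-polynomial construction.

```python
# Illustrative tree-structured codebook: split on one feature vs. a threshold per node.
import numpy as np

def build_tree_codebook(vectors, max_leaves=64):
    leaves = [vectors]
    while len(leaves) < max_leaves:
        # split the leaf with the largest total variance (assumed criterion)
        idx = max(range(len(leaves)), key=lambda k: leaves[k].var(axis=0).sum())
        node = leaves.pop(idx)
        feat = node.var(axis=0).argmax()       # single feature used at this node
        thr = np.median(node[:, feat])         # threshold directing vectors left/right
        left, right = node[node[:, feat] <= thr], node[node[:, feat] > thr]
        if len(left) == 0 or len(right) == 0:  # cannot split further
            leaves.append(node)
            break
        leaves.extend([left, right])
    return np.array([leaf.mean(axis=0) for leaf in leaves])   # one codevector per leaf

codebook = build_tree_codebook(np.random.rand(1000, 8), max_leaves=16)
print(codebook.shape)   # (16, 8)
```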

Keywords: Orthogonal Polynomials, Image Coding, Vector Quantization, TSVQ, Binary Tree Classifier

920 Visual Thing Recognition with Binary Scale-Invariant Feature Transform and Support Vector Machine Classifiers Using Color Information

Authors: Wei-Jong Yang, Wei-Hau Du, Pau-Choo Chang, Jar-Ferr Yang, Pi-Hsia Hung

Abstract:

The demand for smart visual thing recognition in various devices has increased rapidly for daily smart production, living, and learning systems in recent years. This paper proposes a visual thing recognition system which combines the binary scale-invariant feature transform (SIFT), the bag-of-words model (BoW), and support vector machine (SVM) classifiers by using color information. Since traditional SIFT features and SVM classifiers use only grayscale information, color remains an important feature for visual thing recognition. With color-based SIFT features and SVM, we can discard unreliable matching pairs and increase the robustness of matching tasks. The experimental results show that the proposed object recognition system with the color-assisted SIFT SVM classifier achieves a higher recognition rate than the traditional gray SIFT and SVM classification in various situations.
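A minimal sketch of the color-assisted SIFT idea, assuming per-channel descriptors pooled into a bag-of-words histogram and a linear SVM. This is my own simplified pipeline (requiring opencv-python and scikit-learn), not the authors' implementation.

```python
# Hedged sketch: SIFT descriptors from each color channel instead of grayscale only,
# quantized into a BoW histogram that would feed an SVM classifier.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

sift = cv2.SIFT_create()

def color_sift(image_bgr):
    """Stack SIFT descriptors computed separately on each color channel."""
    descs = []
    for ch in cv2.split(image_bgr):
        _, d = sift.detectAndCompute(ch, None)
        if d is not None:
            descs.append(d)
    return np.vstack(descs) if descs else np.zeros((0, 128), np.float32)

def bow_histogram(descs, kmeans):
    words = kmeans.predict(descs)
    hist, _ = np.histogram(words, bins=np.arange(kmeans.n_clusters + 1))
    return hist / max(hist.sum(), 1)

img = np.random.randint(0, 256, (128, 128, 3), np.uint8)   # stand-in image
print(color_sift(img).shape)

# training outline (hypothetical train_images / labels):
# all_descs = np.vstack([color_sift(im) for im in train_images])
# kmeans = KMeans(n_clusters=200).fit(all_descs)
# X = [bow_histogram(color_sift(im), kmeans) for im in train_images]
# clf = SVC(kernel='linear').fit(X, labels)
```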

Keywords: Color moments, visual thing recognition system, SIFT, color SIFT.

919 Novel Rao-Blackwellized Particle Filter for Mobile Robot SLAM Using Monocular Vision

Authors: Maohai Li, Bingrong Hong, Zesu Cai, Ronghua Luo

Abstract:

This paper presents a novel Rao-Blackwellised particle filter (RBPF) for mobile robot simultaneous localization and mapping (SLAM) using monocular vision. The particle filter is combined with an unscented Kalman filter (UKF) to extend the path posterior by sampling new poses that integrate the current observation, which drastically reduces the uncertainty about the robot pose. The landmark position estimation and update are also implemented through the UKF. Furthermore, the number of resampling steps is determined adaptively, which significantly reduces the particle depletion problem, and evolution strategies (ES) are introduced to avoid particle impoverishment. The 3D natural point landmarks are structured with matching Scale Invariant Feature Transform (SIFT) feature pairs. The matching for multi-dimensional SIFT features is implemented with a KD-tree at a time cost of O(log₂ N). Experimental results with a real robot in our indoor environment show the advantages of our methods over previous approaches.
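The SIFT matching step with logarithmic query cost can be sketched with a k-d tree as below; the descriptors are random stand-ins and the 0.8 ratio-test threshold is an assumption, not a value from the paper.

```python
# Sketch of nearest-neighbour matching of 128-D SIFT descriptors via a k-d tree.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
map_descriptors = rng.random((500, 128)).astype(np.float32)    # stored landmark descriptors
frame_descriptors = rng.random((80, 128)).astype(np.float32)   # descriptors from current frame

tree = cKDTree(map_descriptors)
dists, idx = tree.query(frame_descriptors, k=2)                # two nearest neighbours each

ratio = dists[:, 0] / dists[:, 1]                              # Lowe-style ratio test
matches = [(q, idx[q, 0]) for q in range(len(frame_descriptors)) if ratio[q] < 0.8]
print(f"{len(matches)} accepted matches out of {len(frame_descriptors)}")
```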

Keywords: Mobile robot, simultaneous localization and mapping, Rao-Blackwellised particle filter, evolution strategies, scale invariant feature transform.

918 Quick Sequential Search Algorithm Used to Decode High-Frequency Matrices

Authors: Mohammed M. Siddeq, Mohammed H. Rasheed, Omar M. Salih, Marcos A. Rodrigues

Abstract:

This research proposes a data encoding and decoding method based on the Matrix Minimization algorithm, which is applied to high-frequency coefficients for compression/encoding. The algorithm starts by converting every three coefficients to a single value; this is accomplished using three different keys. The decoding/decompression stage uses a search method called the Quick Sequential Search (QSS) decoding algorithm, presented in this research and based on sequential search, to recover the exact coefficients. In the next step, the decoded data are saved in an auxiliary array. The basic idea behind the auxiliary array is to save all possible decoded coefficients, so that another algorithm, such as a conventional sequential search, could retrieve the encoded/compressed data independently of the proposed algorithm. The experimental results showed that our proposed decoding algorithm retrieves the original data faster than conventional sequential search algorithms.
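A hedged sketch of the idea described above: three coefficients are folded into one value with three keys, and decoding searches sequentially over the known coefficient alphabet until the encoded value is reproduced. The key values and coefficient range are assumptions for the example, not the paper's.

```python
# Illustrative encode/decode: not the authors' Matrix Minimization implementation.
from itertools import product

KEYS = (1.0, 0.137, 0.0191)                  # three distinct keys (assumed values)
ALPHABET = range(-5, 6)                      # assumed range of high-frequency coefficients

def encode(triplet):
    """Fold three coefficients into one value using the three keys."""
    return sum(k * c for k, c in zip(KEYS, triplet))

def sequential_search_decode(value, tol=1e-9):
    """Sequential search over candidate triplets until the encoded value matches."""
    for cand in product(ALPHABET, repeat=3):
        if abs(encode(cand) - value) < tol:
            return cand
    return None

original = (3, -2, 1)
print(sequential_search_decode(encode(original)))   # (3, -2, 1)
```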

Keywords: Matrix Minimization Algorithm, Decoding Sequential Search Algorithm, image compression, Discrete Cosine Transform, Discrete Wavelet Transform.

917 Sustainable Renovation and Restoration of the Rural Based on the View Point of Psychology

Authors: Luo Jin, Jin Fang

Abstract:

The countryside has long been recognized as a characteristic symbol that persists in human memory. With the change of times, vast rural areas have begun to decline because they fail to meet the growing needs of modern life. Yet the historical imagery accumulated through ancient tradition provides people with spiritual origins such as "identity" and "belonging", brings them closer to others through a shared experience of the past, and thus weakens the sense of cultural loss caused by the disappearance of memory symbols. In the modernization process, how to restore rural vitality and transform and plan the countryside in a sustainable way has therefore become a hot topic in architecture and urban planning. This paper aims to break disciplinary constraints and, from an interdisciplinary perspective, uses the research methods of systems science to analyze and discuss the theories and methods of rural form factors, based on the viewpoint of memory in psychology, so that a suitable way can be found to transform the countryside and give full play to its role in practical use and in shaping the historical spirit.

Keywords: The rural, sustainable renovation, restoration, psychology, memory.

916 Hospitality Program Postgraduate Theses: What Hinders Their Accomplishment?

Authors: Mohd Salehuddin Mohd Zahari, Hamizad Abdul Hadi, Nik Mohd Shahril Nik Mohd Nor, Syuhirdy Mat Noor

Abstract:

Postgraduate education is generally aimed at providing in-depth knowledge and understanding that includes general philosophy in the world of sciences, management, technologies, applications, and other elements closely related to specific areas. In most universities, besides core and non-core subjects, a thesis is one of the requirements that postgraduate students must accomplish before graduating. This paper reports on an empirical investigation into the attributes associated with obstacles to thesis accomplishment among postgraduate students. Using a quantitative approach, the experiences of postgraduate students were examined. Findings clearly revealed that information seeking, writing skills, and other factors, in particular the supervisor and time management, are recognized as contributory factors which positively or negatively influence postgraduates' thesis accomplishment. Among these, the writing skills dimension was found to be the most difficult part of thesis accomplishment compared to information seeking and the other factors. This pessimistic indication has implications not only for the students but also for supervisors and institutions as a whole.

Keywords: Hospitality, Program, Postgraduate, thesis.

915 Neutronic Study of Two Reactor Cores Cooled with Light and Heavy Water Using Computation Method

Authors: Z. Gholamzadeh, A. Zali, S. A. H. Feghhi, C. Tenreiro, Y. Kadi, M. Rezazadeh, M. Aref

Abstract:

Most HWRs currently use natural uranium fuel. Using enriched uranium fuel results in a significant improvement in fuel cycle costs and uranium utilization. On the other hand, reactivity changes of HWRs over the full range of operating conditions, from cold shutdown to full power, are small. This reduces the required reactivity worth of control devices and minimizes local flux distribution perturbations, minimizing potential problems due to transient local overheating of fuel. Analyzing the effectiveness of heavy water with respect to neutronic parameters such as enrichment requirements, peaking factor, and reactivity is important and should be considered a primary concern in HWR core design. Two reactor cores, a CANDU-type core of 33 fuel assemblies and a hexagonal-type core of 19 assemblies at a pitch-to-diameter ratio of 1.04, have been simulated using the MCNP-4C code. Heavy water and light water as moderators have been compared with respect to reactivity insertion and enrichment requirements. Two fuel matrices, (232Th/235U)O2 and (238U/235U)O2, have been compared to achieve a more economical and safe design. Heavy water not only decreased enrichment needs but also resulted in negative reactivity insertions during moderator density variations. Thorium oxide fuel assemblies of 2.3% enrichment loaded into the core with the heavy water moderator resulted in a fission-to-absorption ratio of 0.751 and a peaking factor of 1.7. Heavy water not only provides negative reactivity insertion during temperature rises which change the moderator density, but also reduced the enrichment requirements by 2 to 10 kg, depending on the geometry type.

Keywords: MCNP-4C, Reactor core, Multiplication factor, Reactivity, Peaking factor.

914 Efficient Secured Lossless Coding of Medical Images – Using Modified Runlength Coding for Character Representation

Authors: S. Annadurai, P. Geetha

Abstract:

Lossless compression schemes with secure transmission play a key role in telemedicine applications that help in accurate diagnosis and research. Traditional cryptographic algorithms for data security are not fast enough to process vast amounts of data. Hence, the novel secured lossless compression approach proposed in this paper is based on a reversible integer wavelet transform, the EZW algorithm, a new modified runlength coding for character representation, and selective bit scrambling. The use of the lifting scheme allows generating truly lossless integer-to-integer wavelet transforms. Images are compressed/decompressed by the well-known EZW algorithm. The proposed modified runlength coding greatly improves the compression performance and also increases the security level. This work employs a scrambling method which is fast, simple to implement, and provides security. The lossless compression ratios and distortion performance of this proposed method are found to be better than those of other lossless techniques.
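A minimal sketch of a reversible integer-to-integer wavelet step via lifting (the S-transform form of the Haar wavelet), illustrating why the lifting scheme gives truly lossless transforms; this is not necessarily the exact transform used in the paper.

```python
# One lifting level: integer approximation and detail, with exact reconstruction.
import numpy as np

def forward_s_transform(x):
    """Predict (detail) and update (integer approximation) steps of the S-transform."""
    x = np.asarray(x, dtype=np.int64)
    even, odd = x[0::2], x[1::2]
    d = odd - even                      # predict step: detail coefficients
    s = even + (d >> 1)                 # update step: integer approximation
    return s, d

def inverse_s_transform(s, d):
    """Exact integer reconstruction: no rounding error, hence lossless."""
    even = s - (d >> 1)
    odd = d + even
    out = np.empty(even.size + odd.size, dtype=np.int64)
    out[0::2], out[1::2] = even, odd
    return out

row = np.array([12, 15, 14, 200, 201, 199, 50, 52])
s, d = forward_s_transform(row)
assert np.array_equal(inverse_s_transform(s, d), row)   # perfect reconstruction
```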

Keywords: EZW algorithm, lifting scheme, lossless compression, reversible integer wavelet transform, secure transmission, selective bit scrambling, modified runlength coding.

913 Improved Neutron Leakage Treatment on Nodal Expansion Method for PWR Reactors

Authors: Antonio Carlos Marques Alvim, Fernando Carvalho da Silva, Aquilino Senra Martinez

Abstract:

For a quick and accurate calculation of the spatial neutron distribution in nuclear power reactors, 3D nodal codes are usually used, aiming at solving the neutron diffusion equation for a given reactor core geometry and material composition. These codes use a second order polynomial to represent the transverse leakage term. In this work, a nodal method based on the well known nodal expansion method (NEM), developed at COPPE, making use of this polynomial expansion, was modified to treat the transverse leakage term for the external surfaces of peripheral reflector nodes. The proposed method was implemented into a computational system which, besides solving the diffusion equation, also solves the burnup equations governing the gradual changes in material compositions of the core due to fuel depletion. Results confirm the effectiveness of this modified treatment of peripheral nodes for practical purposes in PWR reactors.

Keywords: Transverse leakage, nodal expansion method, power density, PWR reactors

912 Usage of Channel Coding Techniques for Peak-to-Average Power Ratio Reduction in Visible Light Communications Systems

Authors: P.L.D.N.M. de Silva, S.G. Edirisinghe, R. Weerasuriya

Abstract:

High Peak-to-Average Power Ratio (PAPR) is a concern in Orthogonal Frequency Division Multiplexing (OFDM) based Visible Light Communication (VLC) systems. Discrete Fourier Transform spread (DFT-s) OFDM is an alternative single carrier modulation scheme which addresses this concern. Employing channel coding techniques is another mechanism to reduce the PAPR. In this work, the improvement that can be harnessed by hybridizing these two techniques for a VLC system is studied; specifically, efficient techniques such as Hamming coding and convolutional coding are considered. Thus, we present the impact of the hybrid of DFT-s OFDM and channel coding (Hamming coding and convolutional coding) on PAPR in VLC systems, using MATLAB simulations.
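A small sketch comparing the PAPR of plain OFDM and DFT-spread OFDM for random QPSK symbols; the subcarrier counts and the complex-baseband simplification are assumptions, not the paper's MATLAB setup.

```python
# Hedged PAPR comparison: conventional OFDM vs. DFT-s OFDM (localized mapping).
import numpy as np

rng = np.random.default_rng(0)
M, N = 48, 64                                    # data symbols per block, total subcarriers

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def qpsk(n):
    return (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)

papr_ofdm, papr_dfts = [], []
for _ in range(2000):
    d = qpsk(M)
    X = np.zeros(N, complex); X[:M] = d                          # plain OFDM mapping
    papr_ofdm.append(papr_db(np.fft.ifft(X)))
    Xs = np.zeros(N, complex); Xs[:M] = np.fft.fft(d) / np.sqrt(M)   # DFT spreading first
    papr_dfts.append(papr_db(np.fft.ifft(Xs)))

print(f"mean PAPR  OFDM: {np.mean(papr_ofdm):.2f} dB   DFT-s OFDM: {np.mean(papr_dfts):.2f} dB")
```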

Keywords: Convolutional Coding, Discrete Fourier Transform spread Orthogonal Frequency Division Multiplexing (DFT-s OFDM), Hamming Coding, Peak-to-Average Power Ratio (PAPR), Visible Light Communications (VLC).

911 A Robust Extrapolation Method for Curtailed Aperture Reconstruction in Acoustic Imaging

Authors: R. Bremananth

Abstract:

Sound localization based on acoustic imaging with a microphone array is a challenging task in digital signal processing. Discrete Fourier transform (DFT) based near-field acoustical holography (NAH) is an important acoustical technique for sound source localization and provides an efficient solution to the ill-posed problem. However, in practice, due to the use of a small curtailed aperture and the consequent significant spectral leakage, the DFT cannot reconstruct the active region of sound (AROS) effectively, especially near the edges of the aperture. In this paper, we emphasize the fundamental problems of DFT-based NAH and provide a solution to the spectral leakage effect by extrapolation based on linear predictive coding and 2D Tukey windowing. This approach has been tested to localize single and multi-point sound sources. We observe that incorporating the extrapolation technique increases the spatial resolution and localization accuracy and reduces spectral leakage when a small curtailed aperture with a lower number of sensors is used.
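The 2D Tukey aperture taper mentioned above can be sketched as a separable window; the taper fraction of 0.5 and the aperture size are assumed values, not taken from the paper.

```python
# Build a 2-D Tukey (tapered cosine) window to reduce leakage at the aperture edges.
import numpy as np
from scipy.signal.windows import tukey

def tukey2d(rows, cols, alpha=0.5):
    """Outer product of two 1-D Tukey windows gives a separable 2-D aperture taper."""
    return np.outer(tukey(rows, alpha), tukey(cols, alpha))

aperture = np.ones((32, 32))                                      # stand-in hologram pressure
tapered = aperture * tukey2d(32, 32)
spectrum = np.fft.fftshift(np.fft.fft2(tapered, s=(128, 128)))    # zero-padded k-space
print(tapered.shape, spectrum.shape)
```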

Keywords: Acoustic Imaging, Discrete Fourier Transform (DFT), k-space wavenumber, Near-Field Acoustical Holography (NAH), Source Localization, Spectral Leakage.

910 Efficient HAAR Wavelet Transform with Embedded Zerotrees of Wavelet Compression for Color Images

Authors: S. Piramu Kailasam

Abstract:

This study compresses true color images with compression algorithms in different color spaces to provide high compression rates. A high compression ratio is needed to reduce storage space. A further aim is to rank the compression algorithms in a suitable color space. The dataset is a sequence of true color images of size 128 x 128. The Haar wavelet is one of the best known wavelet transforms; it has great potential and maintains the image quality of color images. The Haar wavelet transform using the Set Partitioning in Hierarchical Trees (SPIHT) algorithm within a framework of different color spaces is applied to compress the sequence of images taken at different angles. Embedded Zerotree Wavelet (EZW) coding is a powerful standard method for such sequence data. Hence, the proposed compression framework of Haar wavelet, XYZ color space, morphological gradient, and EZW compression obtained an improvement over other methods in terms of the compression ratio, mean square error, peak signal-to-noise ratio, and bits per pixel quality measures.

Keywords: Color Spaces, HAAR Wavelet, Morphological Gradient, Embedded Zerotrees Wavelet Compression.

909 Method of Intelligent Fault Diagnosis of Preload Loss for Single Nut Ball Screws through the Sensed Vibration Signals

Authors: Yi-Cheng Huang, Yan-Chen Shin

Abstract:

This paper proposes a method of diagnosing ball screw preload loss through the Hilbert-Huang transform (HHT) and multiscale entropy (MSE). The proposed method can diagnose ball screw preload loss from vibration signals while the machine tool is in operation. Ball screws with maximum dynamic preloads of 2%, 4%, and 6% were predesigned, manufactured, and tested experimentally. Signal patterns are discussed and revealed using Empirical Mode Decomposition (EMD) with the Hilbert spectrum. Different preload features are extracted and discriminated using the HHT. The irregularity development of a ball screw with preload loss is determined and abstracted using MSE based on complexity perception. Experimental results show that the proposed method can predict the status of ball screw preload loss. Smart sensing of ball screw health is also possible based on a comparative evaluation of MSE through the signal processing and pattern matching of EMD/HHT. This diagnostic method serves prognostic purposes, revealing preload loss conveniently.
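An illustrative sketch of multiscale entropy: coarse-grain the signal at several scales and compute the sample entropy at each scale. The parameters m = 2 and r = 0.2·std are common defaults assumed here, not values from the paper.

```python
# Multiscale entropy (MSE) sketch: coarse-graining plus sample entropy per scale.
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    x = np.asarray(x, float)
    r = r_factor * x.std()
    def count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.abs(templates[:, None, :] - templates[None, :, :]).max(axis=2)
        return (d <= r).sum() - len(templates)          # exclude self-matches
    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=5):
    mse = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = np.asarray(x[:n * tau]).reshape(n, tau).mean(axis=1)   # coarse-graining
        mse.append(sample_entropy(coarse))
    return mse

rng = np.random.default_rng(1)
sig = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.3 * rng.standard_normal(1000)
print(multiscale_entropy(sig))
```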

Keywords: Empirical Mode Decomposition, Hilbert-Huang Transform, Multi-scale Entropy, Preload Loss, Single-nut Ball Screw

908 Fluorescent-Core Microcavities Based On Silicon Quantum Dots for Oil Sensing Applications

Authors: V. Zamora, Z. Zhang, A. Meldrum

Abstract:

The compatibility of optical resonators with microfluidic systems may be relevant for chemical and biological applications. Here, a fluorescent-core microcavity (FCM) is investigated as a refractometric sensor for heavy oils. A high-index film of silicon quantum dots (QDs) was formed inside a capillary, supporting cylindrical fluorescence whispering gallery modes (WGMs). A set of standard refractive index oils was injected into the capillary, causing a shift of the WGM resonances toward longer wavelengths. A maximum sensitivity of 240 nm/RIU (refractive index unit) was found for a nominal oil index of 1.74. As well, a sensitivity of 22 nm/RIU was obtained for a lower index of 1.48, more typical of fuel hydrocarbons. Furthermore, the observed spectra and sensitivities were compared to theoretical predictions and reproduced via FDTD simulations, showing in general an excellent agreement. This work demonstrates the potential use of FCMs for oil sensing applications and, more generally, for detecting liquid solutions with a high refractive index or high viscosity.

Keywords: Oils, optical resonators, sensing applications, whispering gallery modes.

907 Study on Performance of Wigner Ville Distribution for Linear FM and Transient Signal Analysis

Authors: Azeemsha Thacham Poyil, Nasimudeen KM

Abstract:

This research paper presents methods to assess the performance of the Wigner-Ville distribution (WVD) for the time-frequency representation of non-stationary signals, in comparison with other representations such as the STFT and the spectrogram. The simultaneous time-frequency resolution of the WVD is one of the important properties which makes it preferable for the analysis and detection of linear FM and transient signals. Two algorithms are proposed here to assess the resolution and to compare signal detection performance. The first method is based on measuring the area under the time-frequency plot, for the analysis of a linear FM signal. The second method is based on calculating the instantaneous power and is used for transient, non-stationary signals. The implementation of both methods is explained briefly with suitable diagrams. The accuracy of the measurements is validated to show the better performance of the WVD representation in comparison with the STFT and spectrograms.

Keywords: WVD: Wigner Ville Distribution, STFT: Short Time Fourier Transform, FT: Fourier Transform, TFR: Time-Frequency Representation, FM: Frequency Modulation, LFM Signal: Linear FM Signal, JTFA: Joint time frequency analysis.

906 Local Curvelet Based Classification Using Linear Discriminant Analysis for Face Recognition

Authors: Mohammed Rziza, Mohamed El Aroussi, Mohammed El Hassouni, Sanaa Ghouzali, Driss Aboutajdine

Abstract:

In this paper, an efficient local appearance feature extraction method based on the multi-resolution Curvelet transform is proposed in order to further enhance the performance of the well known Linear Discriminant Analysis (LDA) method when applied to face recognition. Each face is described by a subset of band-filtered images containing block-based Curvelet coefficients. These coefficients characterize the face texture, and a set of simple statistical measures allows us to form compact and meaningful feature vectors. The proposed method is compared with related feature extraction methods such as Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Independent Component Analysis (ICA). Two other multi-resolution transforms, the Wavelet (DWT) and the Contourlet, were also compared against the block-based Curvelet-LDA algorithm. Experimental results on the ORL, YALE, and FERET face databases convince us that the proposed method provides a better representation of the class information and obtains much higher recognition accuracies.

Keywords: Curvelet, Linear Discriminant Analysis (LDA), Contourlet, Discrete Wavelet Transform (DWT), Block-based analysis, face recognition (FR).

905 Speaker Identification Using Admissible Wavelet Packet Based Decomposition

Authors: Mangesh S. Deshpande, Raghunath S. Holambe

Abstract:

Mel Frequency Cepstral Coefficient (MFCC) features are widely used as acoustic features for speech recognition as well as speaker recognition. In the MFCC feature representation, the Mel frequency scale is used to obtain a high resolution in the low frequency region and a low resolution in the high frequency region. This kind of processing is good for obtaining stable phonetic information, but it is not suitable for speaker features that are located in the high frequency regions. The speaker-specific information, which is non-uniformly distributed in the high frequencies, is equally important for speaker recognition. Based on this fact, we propose an admissible wavelet packet based filter structure for speaker identification. The multiresolution capabilities of the wavelet packet transform are used to derive the new features. The proposed scheme differs from previous wavelet based works mainly in the design of the filter structure; unlike the others, the proposed filter structure does not follow the Mel scale. The closed-set speaker identification experiments performed on the TIMIT database show improved identification performance compared to other commonly used Mel scale based filter structures using wavelets.
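A hedged sketch of wavelet packet sub-band features using PyWavelets; the complete three-level tree and the log sub-band energies below are a simplified stand-in for the paper's admissible wavelet packet filter structure, not its exact design.

```python
# Wavelet packet decomposition of one frame and log sub-band energy features.
import numpy as np
import pywt

def wp_features(frame, wavelet='db4', level=3):
    wp = pywt.WaveletPacket(data=frame, wavelet=wavelet, mode='symmetric', maxlevel=level)
    nodes = wp.get_level(level, order='freq')            # 2**level frequency-ordered sub-bands
    energies = np.array([np.sum(node.data ** 2) for node in nodes])
    return np.log(energies + 1e-12)                      # log sub-band energies as features

frame = np.random.default_rng(0).standard_normal(512)    # stand-in for a speech frame
print(wp_features(frame).shape)                          # (8,)
```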

Keywords: Speaker identification, Wavelet transform, Feature extraction, MFCC, GMM.

904 On Developing a Core Guideline for English Language Training Programs in Business Settings

Authors: T. Ito, K. Kawaguchi, R. Ohta

Abstract:

The purpose of this study is to provide a guideline to assist globally-minded companies in developing task-based English-language programs for their employees. After conducting an online self-assessment questionnaire comprising 45 job-related tasks, we analyzed responses received from 3,000 Japanese company employees and developed a checklist that considered three areas: i) the percentage of those who need to accomplish English-language tasks in their workplace (need for English), ii) a five-point self-assessment score (task performance level), and iii) the impact of previous task experience on perceived performance (experience factor). The 45 tasks were graded according to five proficiency levels. Our results helped us to create a core guideline that may assist companies in two ways: first, in helping to determine which tasks employees with a certain English proficiency should be able to carry out satisfactorily, and second, in quickly prioritizing which business-related English skills they would need in future English language programs.

Keywords: Business settings, Can-do statements, English language training programs, Self-assessment, Task experience.

903 Thermal Hydraulic Analysis of the IAEA 10MW Benchmark Reactor under Normal Operating Condition

Authors: Hamed Djalal

Abstract:

The aim of this paper is to perform a thermal-hydraulic analysis of the IAEA 10 MW benchmark reactor, solving analytically and numerically, by means of the finite volume method, the steady state and transient forced convection, respectively, in a narrow rectangular channel between two parallel MTR-type fuel plates subjected to a cosine-shaped heat flux. A comparison between both solutions is presented to determine the minimal coolant velocity which can ensure safe reactor core cooling, where the cladding temperature should not reach a specified safety limit of 90 °C. For this purpose, a computer program is developed to determine the principal parameters related to nuclear core safety, such as the temperature distribution in the fuel plate and in the coolant (light water), as a function of the inlet coolant velocity. Finally, good agreement is noticed between the analytical and numerical solutions, and the obtained results are displayed graphically.
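A simplified sketch of the underlying energy balance (my own one-dimensional marching model, not the paper's finite volume code): the coolant temperature is integrated along the channel under a cosine-shaped heat flux and the cladding temperature is estimated from a constant heat-transfer coefficient. All numerical values below are assumed for illustration only.

```python
# March the coolant temperature along a heated channel; compare cladding vs. 90 C limit.
import numpy as np

L, n = 0.6, 100                       # heated length [m], number of finite volumes (assumed)
q0 = 2.0e5                            # peak heat flux [W/m^2] (assumed)
Ph, A = 0.14, 2.2e-4                  # heated perimeter [m], flow area [m^2] (assumed)
rho, cp, h = 992.0, 4180.0, 2.0e4     # water density, specific heat, HTC (assumed)
T_in, v = 38.0, 0.9                   # inlet temperature [C], coolant velocity [m/s] (assumed)

z = (np.arange(n) + 0.5) * L / n
q = q0 * np.cos(np.pi * (z - L / 2) / L)                      # cosine axial heat-flux shape
m_dot = rho * v * A                                           # mass flow rate [kg/s]

T_cool = T_in + np.cumsum(q * Ph * (L / n)) / (m_dot * cp)    # energy balance per volume
T_clad = T_cool + q / h                                       # Newton's law of cooling
print(f"max cladding temperature: {T_clad.max():.1f} C")      # check against the 90 C limit
```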

Keywords: Forced convection, friction factor, pressure drop, thermal hydraulic analysis, vertical heated rectangular channel.

902 Implementation of Neural Network Based Electricity Load Forecasting

Authors: Myint Myint Yi, Khin Sandar Linn, Marlar Kyaw

Abstract:

This paper proposes a novel model for short term load forecasting (STLF) in the electricity market. The prior electricity demand data are treated as time series. The model is composed of several neural networks whose input data are processed using a wavelet technique, and it is created in the form of a simulation program written in MATLAB. The load data are decomposed into several wavelet coefficient series using the wavelet transform technique known as the Non-decimated Wavelet Transform (NWT); the reason for using this technique is the possibility of extracting hidden patterns from the time series data. The wavelet coefficient series are used to train the neural networks (NNs) and serve as the inputs to the NNs for electricity load prediction. The Scaled Conjugate Gradient (SCG) algorithm is used as the learning algorithm for the NNs. To obtain the final forecast data, the outputs from the NNs are recombined using the same wavelet technique. The model was evaluated with the electricity load data of the Electronic Engineering Department at Mandalay Technological University in Myanmar. The simulation results showed that the model was capable of producing a reasonable forecasting accuracy in STLF.
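A hedged sketch of the wavelet-plus-neural-network idea: decompose the load series with a stationary (non-decimated) wavelet transform, then train one small regressor per coefficient series on lagged values. The wavelet choice, lag count, and network size are my assumptions (requires PyWavelets and scikit-learn), and the final recombination via the inverse transform is only indicated.

```python
# Per-band forecasting on stationary wavelet coefficients of a synthetic load series.
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
load = 100 + 20 * np.sin(np.arange(512) * 2 * np.pi / 24) + rng.normal(0, 2, 512)

level, lags = 2, 24
coeffs = pywt.swt(load, 'db4', level=level)      # (approximation, detail) pair per level

def fit_and_forecast(series):
    X = np.array([series[t - lags:t] for t in range(lags, len(series))])
    y = series[lags:]
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
    return model.predict(series[-lags:][None, :])[0]     # one-step-ahead forecast

forecast_coeffs = [(fit_and_forecast(cA), fit_and_forecast(cD)) for cA, cD in coeffs]
print(forecast_coeffs)   # per-band forecasts; recombination would use the inverse SWT
```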

Keywords: Neural network, Load forecast, Time series, wavelet transform.

901 A Novel Approach to Iris Localization for Iris Biometric Processing

Authors: Somnath Dey, Debasis Samanta

Abstract:

Iris-based biometric systems are gaining importance in several applications. However, processing of iris biometrics is a challenging and time consuming task. Detection of the iris part in an eye image poses a number of challenges such as inferior image quality and occlusion by eyelids and eyelashes. Due to these problems it is not possible to achieve a 100% accuracy rate in any iris-based biometric authentication system. Further, iris detection is a computationally intensive task in the overall iris biometric processing. In this paper, we address these two problems and propose a technique to localize the iris part efficiently and accurately. We propose scaling and a color level transform followed by thresholding and the finding of pupil boundary points for pupil boundary detection, and dilation, thresholding, vertical edge detection, and removal of unnecessary edges present in the eye images for iris boundary detection. Scaling reduces the search space significantly, and the intensity level transform is helpful for image thresholding. Experimental results show that our approach is comparable with existing approaches. Following our approach it is possible to detect the iris part with 95-99% accuracy, as substantiated by our experiments on the CASIA Ver-3.0, ICE 2005, UBIRIS, Bath, and MMU iris image databases.

Keywords: Iris recognition, iris localization, biometrics, image processing.

900 Numerical Investigation of Poling Vector Angle on Adaptive Sandwich Plate Deflection

Authors: Alireza Pouladkhan, Mohammad Yavari Foroushani, Ali Mortazavi

Abstract:

This paper presents a finite element model for a sandwich plate containing a piezoelectric core. A sandwich plate with a piezoelectric core is constructed using the shear mode of piezoelectric materials. The orientation of the poling vector has a significant effect on the deflection and stress induced in the piezo-actuated adaptive sandwich plate. In the present study, the influence of this factor for a clamped-clamped-free-free and a simple-simple-free-free square sandwich plate is investigated using the Finite Element Method. The study uses the ABAQUS (v.6.7) software to derive the finite element model of the sandwich plate. By using this model, the study gives the influence of the poling vector angle on the response of the smart structure and determines the maximum transverse displacement and maximum stress induced.

Keywords: Finite element method, Sandwich plate, Poling vector, Piezoelectric materials, Smart structure, Electric enthalpy.

899 A Method for Iris Recognition Based on 1D Coiflet Wavelet

Authors: Agus Harjoko, Sri Hartati, Henry Dwiyasa

Abstract:

There have been numerous implementations of security systems using biometrics, especially for identification and verification cases. An example of a pattern used in biometrics is the iris pattern of the human eye, which is considered unique for each person. The use of the iris pattern poses problems in encoding the human iris. In this research, an efficient iris recognition method is proposed. In the proposed method the iris segmentation is based on the observation that the pupil has lower intensity than the iris, and the iris has lower intensity than the sclera. By detecting the boundary between the pupil and the iris and the boundary between the iris and the sclera, the iris area can be separated from the pupil and sclera. A step is taken to reduce the effect of eyelashes and the specular reflection of the pupil. Then a four-level Coiflet wavelet transform is applied to the extracted iris image. A modified Hamming distance is employed to measure the similarity between two irises. This research yields an identification success rate of 84.25% for the CASIA version 1.0 database. The method gives an accuracy of 77.78% for the left eyes of the MMU 1 database and 86.67% for the right eyes. The time required for the encoding process, from segmentation until the iris code is generated, is 0.7096 seconds. These results show that the accuracy and speed of the method are better than those of many other methods.
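A tiny sketch of the matching step, comparing two binary iris codes with a normalized Hamming distance; the exact modification of the Hamming distance used in the paper is not reproduced here.

```python
# Normalized Hamming distance between binary iris codes.
import numpy as np

def hamming_distance(code_a, code_b):
    """Fraction of disagreeing bits between two equal-length binary iris codes."""
    a, b = np.asarray(code_a, bool), np.asarray(code_b, bool)
    return np.count_nonzero(a ^ b) / a.size

rng = np.random.default_rng(2)
enrolled = rng.integers(0, 2, 2048)
probe_same = enrolled.copy()
probe_same[rng.choice(2048, 100, replace=False)] ^= 1    # flip ~5% of bits (same iris, noisy)
probe_other = rng.integers(0, 2, 2048)                   # unrelated iris

print(hamming_distance(enrolled, probe_same))    # ~0.05: likely the same iris
print(hamming_distance(enrolled, probe_other))   # ~0.5: different irises
```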

Keywords: Biometric, iris recognition, wavelet transform.

898 Palmprint Recognition by Wavelet Transform with Competitive Index and PCA

Authors: Deepti Tamrakar, Pritee Khanna

Abstract:

This manuscript presents palmprint recognition combining different texture extraction approaches with high accuracy. The Region of Interest (ROI) is decomposed into different frequency-time sub-bands by the wavelet transform up to two levels, and only the approximation image of the second level is selected, which is known as the Approximate Image ROI (AIROI). This AIROI carries the information of the principal lines of the palm. The Competitive Index is used as the palmprint feature, in which six Gabor filters of different orientations are convolved with the palmprint image to extract orientation information. A winner-take-all strategy is used to select the dominant orientation for each pixel, which is known as the Competitive Index. Further, PCA is applied to select highly uncorrelated Competitive Index features, to reduce the dimensions of the feature vector, and to project the features onto the eigenspace. The similarity of two palmprints is measured by the Euclidean distance metric. The algorithm is tested on the Hong Kong PolyU palmprint database. AIROIs from different wavelet filter families are also tested with the Competitive Index and PCA. The AIROI of the db7 wavelet filter achieves an Equal Error Rate (EER) of 0.0152% and a Genuine Acceptance Rate (GAR) of 99.67% on the Hong Kong PolyU palm database.

Keywords: DWT, EER, Euclidean Distance, Gabor filter, PCA, ROI.

897 A Speeded up Robust Scale-Invariant Feature Transform Currency Recognition Algorithm

Authors: Daliyah S. Aljutaili, Redna A. Almutlaq, Suha A. Alharbi, Dina M. Ibrahim

Abstract:

All currencies around the world look very different from each other; for instance, the size, color, and pattern of the paper are different. With the development of modern banking services, automatic methods for paper currency recognition have become important in many applications such as vending machines. One of the phases of a currency recognition architecture is feature detection and description. There are many algorithms used for this phase, but they still have some disadvantages. This paper proposes a feature detection algorithm which merges the advantages of the current SIFT and SURF algorithms, which we call the Speeded up Robust Scale-Invariant Feature Transform (SR-SIFT) algorithm. Our proposed SR-SIFT algorithm overcomes the problems of both the SIFT and SURF algorithms. The proposed algorithm aims to speed up the SIFT feature detection algorithm while keeping it robust. Simulation results demonstrate that the proposed SR-SIFT algorithm decreases the average response time, especially for small and minimum numbers of best key points, and increases the distribution of the number of best key points on the surface of the currency. Furthermore, the proposed algorithm increases the accuracy of the true best point distribution inside the currency edge compared to the other two algorithms.

Keywords: Currency recognition, feature detection and description, SIFT algorithm, SURF algorithm, speeded up and robust features.

896 Optimized and Secured Digital Watermarking Using Entropy, Chaotic Grid Map and Its Performance Analysis

Authors: R. Rama Kishore, Sunesh

Abstract:

This paper presents an optimized, robust, and secured watermarking technique. The methodology used in this work is the combination of entropy and a chaotic grid map. The proposed methodology applies the Discrete Cosine Transform (DCT) to the host image. To improve the imperceptibility of the method, the host image DCT blocks where the watermark is to be embedded are further optimized by considering the entropy of the blocks. A chaotic grid is used as a key to reorder the DCT blocks, which further increases security in selecting the watermark embedding locations and their sequence. Without the key, one cannot reveal the exact watermark from the watermarked image. The proposed method is implemented on four different images. It is concluded that the proposed method gives better results in terms of imperceptibility, with PSNR measured to be above 50 dB. In order to prove the effectiveness of the method, a performance analysis is done after applying different attacks to the watermarked images. It is found that the methodology is very strong against the JPEG compression attack, even with the quality parameter reduced to 15. The experimental results confirm that the combination of entropy and chaotic grid map is strong and secure against different image processing attacks.
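A hedged sketch of the two ingredients named above: Shannon entropy of 8x8 blocks (to choose embedding blocks) and a chaotic map used as a key to reorder the selected block indices. The logistic map below stands in for the paper's chaotic grid map, whose exact definition is not given in the abstract.

```python
# Entropy-based block selection plus key-dependent chaotic reordering of block indices.
import numpy as np

def block_entropy(block, bins=32):
    hist, _ = np.histogram(block, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def chaotic_order(n, x0=0.7, r=3.99):
    """Key-dependent permutation of n indices via the logistic map (assumed stand-in)."""
    x, seq = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        seq.append(x)
    return np.argsort(seq)                      # ranking the chaotic values gives the order

rng = np.random.default_rng(3)
image = rng.integers(0, 256, (64, 64)).astype(float)
blocks = [image[i:i + 8, j:j + 8] for i in range(0, 64, 8) for j in range(0, 64, 8)]
entropies = np.array([block_entropy(b) for b in blocks])
candidates = np.argsort(entropies)[-16:]        # e.g. keep the 16 highest-entropy blocks
embedding_sequence = candidates[chaotic_order(len(candidates))]
print(embedding_sequence)
```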

Keywords: Digital watermarking, discrete cosine transform, chaotic grid map, entropy.

895 A Methodology for Creating Energy Sustainability in an Enterprise

Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala

Abstract:

As we enter the new era of Artificial Intelligence (AI) and cloud computing, we mostly rely on the machine and natural language processing capabilities of AI and on energy-efficient hardware and software devices in almost every industry sector. In these industry sectors, much emphasis is placed on developing new and innovative methods for producing and conserving energy and on addressing the depletion of natural resources. The core pillars of sustainability are Economic, Environmental, and Social, which are also informally referred to as the 3 P's (People, Planet and Profits). The 3 P's play a vital role in creating a core sustainability model in the enterprise. Natural resources are continually being depleted, so there is more focus on and growing demand for renewable energy. With this growing demand there is also a growing concern in many industries about how to reduce carbon emissions and conserve natural resources while adopting sustainability in corporate business models and policies. In this paper, we discuss the driving forces, such as climate change, natural disasters, pandemics, disruptive technologies, corporate policies, scaled business models, and emerging social media and AI platforms, that influence the three main pillars of sustainability (3 P's). Through this paper, we bring an overall perspective on enterprise strategies, with the primary focus on bringing about cultural shifts toward adopting energy-efficient operational models. Overall, many industries across the globe are incorporating core sustainability principles such as reducing energy costs, reducing greenhouse gas (GHG) emissions, reducing waste and increasing recycling, adopting advanced monitoring and metering infrastructure, and reducing server footprint and compute resources (shared IT services, cloud computing, and application modernization), with the vision of a sustainable environment.

Keywords: AI, cloud computing, machine learning, social media platform.

894 Thermal Treatment Influence on the Quality of Rye Bread Packaged in Different Polymer Films

Authors: Tatjana Rakcejeva, Lija Dukalska, Olga Petrova, Dace Klava, Emils Kozlinskis, Martins Sabovics

Abstract:

This study was carried out to investigate the changes in quality parameters of rye bread packaged in different polymer films during a convection air-flow thermal treatment process. Whole loaves of bread were placed in polymer pouches, which were sealed under reduced-pressure air ambiance, and the bread was thermally treated at temperatures of +(130, 140, and 150) ± 5 ºC for up to 40 min, until the core temperature of the samples reached +80 ± 1 ºC. For bread packaging, pouches of anti-fog Mylar®OL12AF and of a thermo-resistant combined polymer material were used. The main quality parameters were analysed using standard methods: temperature in the bread core, bread crumb and crust firmness value, starch granule volume, and microflora. In the current research it was proved that polymer films significantly influence the changes in rye bread quality parameters during thermal treatment. The thermo-resistant combined polymer material film can be recommended for the pasteurization of packaged rye bread, to best preserve bread quality parameters.

Keywords: bread, thermal treatment, bread crumb, bread crust, starch granule volume.

893 Hardware Implementation of Local Binary Pattern Based Two-Bit Transform Motion Estimation

Authors: Seda Yavuz, Anıl Çelebi, Aysun Taşyapı Çelebi, Oğuzhan Urhan

Abstract:

Nowadays, the demand for devices capable of real-time video transmission is ever increasing, so high-resolution videos have made efficient video compression techniques an essential component for capturing and transmitting video data. Motion estimation has a critical role in encoding raw video; hence, various motion estimation methods have been introduced to compress video efficiently. Motion estimation methods based on low bit-depth representations facilitate the computation of the matching criterion and thus provide a small hardware footprint. In this paper, a hardware implementation of a two-bit transform based low-complexity motion estimation method using the local binary pattern approach is proposed. Image frames are represented in two-bit depth instead of full depth by making use of the local binary pattern as a binarization approach, and the binarization part of the hardware architecture is explained in detail. Experimental results demonstrate the difference between the proposed hardware architecture and the architectures of well-known low-complexity motion estimation methods in terms of important aspects such as resource utilization and energy and power consumption.
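A hedged sketch of the binarization idea: each pixel is reduced to two bit planes, one from a comparison with the local mean and one from a local-binary-pattern-style comparison with a neighbor, after which matching uses cheap XOR/popcount instead of SAD. The exact bit-plane definitions in the paper may differ from this simplification.

```python
# Two-bit-plane binarization of a frame and a number-of-mismatching-bits matching cost.
import numpy as np
from scipy.ndimage import uniform_filter

def two_bit_planes(frame, win=8):
    f = frame.astype(np.float32)
    local_mean = uniform_filter(f, size=win)        # neighborhood mean (assumed window size)
    plane0 = f >= local_mean                        # bit 0: above local mean
    plane1 = f >= np.roll(f, 1, axis=1)             # bit 1: LBP-like comparison with left neighbor
    return plane0, plane1

def matching_cost(planes_a, planes_b):
    """Count mismatching bits; replaces SAD in low bit-depth motion estimation."""
    return sum(np.count_nonzero(a ^ b) for a, b in zip(planes_a, planes_b))

rng = np.random.default_rng(4)
cur = rng.integers(0, 256, (16, 16))
ref = np.roll(cur, 2, axis=1)                       # reference block shifted by 2 pixels
print(matching_cost(two_bit_planes(cur), two_bit_planes(ref)))
```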

Keywords: Binarization, hardware architecture, local binary pattern, motion estimation, two-bit transform.
