Search results for: wavelet collocation
143 An Efficient Encryption Scheme Using DWT and Arnold Transforms
Authors: Ali Abdrhman M. Ukasha
Abstract:
Data security is needed in data transmission, storage, and communication. The color image is decomposed into red, green, and blue channels. The blue and green channels are compressed using a 3-level discrete wavelet transform. The Arnold transform is used to change the locations of the red-channel pixels as an image scrambling process. All these channels are then encrypted separately using a key image that has the same size as the original and is generated using private keys and modulo operations. X-OR and modulo operations are performed between the encrypted channel images in order to change the image pixel values. The contours of the recovered color image can be extracted with an accepted level of distortion using the Canny edge detector. Experiments have demonstrated that the proposed algorithm can fully encrypt a 2D color image and completely reconstruct it without any distortion. It has been shown that the color image can be protected with a higher security level. The presented method has an easy hardware implementation and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
Keywords: color image, wavelet transform, edge detector, Arnold transform, lossy image encryption
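A minimal sketch (not the authors' implementation) of the Arnold cat map scrambling step applied to one square image channel; the map form and the iteration count are assumptions made for illustration.

```python
# Hedged sketch: Arnold cat map scrambling of a square image channel.
import numpy as np

def arnold_scramble(channel: np.ndarray, iterations: int = 1) -> np.ndarray:
    """Apply the Arnold cat map (x, y) -> ((x + y) mod N, (x + 2y) mod N)."""
    n = channel.shape[0]
    assert channel.shape[0] == channel.shape[1], "Arnold map needs a square image"
    out = channel.copy()
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

# Example: scramble a random 8-bit "red channel"
red = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
red_scrambled = arnold_scramble(red, iterations=5)
```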
Procedia PDF Downloads 482
142 Cost and Benefits of Collocation in the Use of Biogas to Reduce Vulnerabilities and Risks
Authors: Janaina Camile Pasqual Lofhagen, David Savarese, Veronika Vazhnik
Abstract:
The urgency of the climate crisis requires both innovation and practicality. The energy transition framework allows industry to deliver resilient cities, enhance adaptability to change, pursue energy objectives such as growth or efficiencies, and increase renewable energy. This paper investigates a real-world application perspective for the use of biogas in Brazil and the U.S. It examines interventions to provide a foundation of infrastructure, as well as the tangible benefits for policy-makers crafting law and providing incentives.
Keywords: resilience, vulnerability, risks, biogas, sustainability
Procedia PDF Downloads 105
141 Forecasting the Sea Level Change in Strait of Hormuz
Authors: Hamid Goharnejad, Amir Hossein Eghbali
Abstract:
Recent investigations have demonstrated global sea level rise due to climate change impacts. This study investigates the effects of climate change on the increasing water level in the Strait of Hormuz. The probable changes of sea level rise should be investigated in order to employ adaptation strategies. The climatic output data of a GCM (General Circulation Model) named CGCM3 under the climate change scenarios A1B and A2 were used. Among the different variables simulated by this model, those with maximum correlation with sea level changes in the study region and least redundancy among themselves were selected for sea level rise prediction by using stepwise regression. A Discrete Wavelet artificial Neural Network (DWNN) model was developed to explore the relationship between climatic variables and sea level changes. In this model, the wavelet transform was used to disaggregate the time series of input and output data into different components, and an ANN was then used to relate the disaggregated components of predictors and predictands to each other. The results showed that at the Shahid Rajae Station, sea level rise is between 64 and 75 cm for scenario A1B and between 90 and 105 cm for scenario A2. Furthermore, the results showed a significant increase of sea level at the study region under climate change impacts, which should be incorporated in coastal area management.
Keywords: climate change scenarios, sea-level rise, strait of Hormuz, forecasting
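A minimal sketch, under assumed data shapes, of the DWNN idea described above: each predictor series is disaggregated into wavelet sub-series (here with PyWavelets) and an ANN regressor relates them to the predictand. The wavelet family, decomposition level, and network size are illustrative choices, not the authors'.

```python
# Hedged sketch of a wavelet-ANN (DWNN-style) regression; data are synthetic.
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

def wavelet_subseries(series: np.ndarray, wavelet: str = "db4", level: int = 3) -> np.ndarray:
    """Reconstruct one sub-series per wavelet level (approximation + details)."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    subs = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        subs.append(pywt.waverec(kept, wavelet)[: len(series)])
    return np.stack(subs)                          # shape: (level + 1, n)

rng = np.random.default_rng(42)
n = 240                                            # e.g. 20 years of monthly data
predictors = rng.normal(size=(3, n))               # 3 hypothetical climatic variables
sea_level = 0.5 * predictors[0] + 0.2 * predictors[1] + rng.normal(scale=0.1, size=n)

# Features: the disaggregated components of every predictor at each time step
X = np.concatenate([wavelet_subseries(p) for p in predictors]).T   # (n, 12)
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:200], sea_level[:200])                # train on the first part
print("test R^2:", model.score(X[200:], sea_level[200:]))
```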
Procedia PDF Downloads 271
140 Gear Fault Diagnosis Based on Optimal Morlet Wavelet Filter and Autocorrelation Enhancement
Authors: Mohamed El Morsy, Gabriela Achtenová
Abstract:
Condition monitoring is used to increase machinery availability and machinery performance, whilst reducing consequential damage, increasing machine life, reducing spare parts inventories, and reducing breakdown maintenance. An efficient condition monitoring system provides early warning of faults by predicting them at an early stage. When a localized fault occurs in gears, the vibration signals always exhibit non-stationary behavior. The periodic impulsive feature of the vibration signal appears in the time domain and the corresponding gear mesh frequency (GMF) emerges in the frequency domain. However, one limitation of frequency-domain analysis is its inability to handle non-stationary waveform signals, which are very common when machinery faults occur. Particularly at the early stage of gear failure, the GMF contains very little energy and is often overwhelmed by noise and higher-level macro-structural vibrations. An effective signal processing method would be necessary to remove such corrupting noise and interference. In this paper, a new hybrid method based on optimal Morlet wavelet filter and autocorrelation enhancement is presented. First, to eliminate the frequency associated with interferential vibrations, the vibration signal is filtered with a band-pass filter determined by a Morlet wavelet whose parameters are selected or optimized based on maximum kurtosis. Then, to further reduce the residual in-band noise and highlight the periodic impulsive feature, an autocorrelation enhancement algorithm is applied to the filtered signal. The test stand is equipped with three dynamometers; the input dynamometer serves as the internal combustion engine, the output dynamometers induce a load on the output joint shaft flanges. The pitting defect is manufactured on the tooth side of a gear of the fifth speed on the secondary shaft. The gearbox used for experimental measurements is of the type most commonly used in modern small to mid-sized passenger cars with transversely mounted powertrain and front wheel drive: a five-speed gearbox with final drive gear and front wheel differential. The results obtained from practical experiments prove that the proposed method is very effective for gear fault diagnosis.
Keywords: wavelet analysis, pitted gear, autocorrelation, gear fault diagnosis
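A minimal sketch of the filtering idea described above: band-pass the vibration signal by convolving it with a real Morlet-type wavelet, keep the candidate (centre frequency, bandwidth) pair that maximizes kurtosis, then autocorrelate the filtered signal. The candidate grid and the wavelet form are assumptions, not the authors' optimization procedure.

```python
# Hedged sketch: kurtosis-guided Morlet band-pass filtering + autocorrelation.
import numpy as np
from scipy.signal import fftconvolve
from scipy.stats import kurtosis

def morlet_kernel(fc: float, bw: float, fs: float) -> np.ndarray:
    """Real Morlet-like kernel with centre frequency fc (Hz) and bandwidth bw (s)."""
    t = np.arange(-0.05, 0.05, 1.0 / fs)
    return np.exp(-(t ** 2) / (2 * bw ** 2)) * np.cos(2 * np.pi * fc * t)

def best_filtered(signal: np.ndarray, fs: float) -> np.ndarray:
    """Try a small grid of (fc, bw) pairs and keep the max-kurtosis output."""
    best, best_k = None, -np.inf
    for fc in (500.0, 1000.0, 2000.0, 4000.0):     # candidate centre frequencies
        for bw in (0.5e-3, 1e-3, 2e-3):            # candidate bandwidths
            filtered = fftconvolve(signal, morlet_kernel(fc, bw, fs), mode="same")
            k = kurtosis(filtered)
            if k > best_k:
                best, best_k = filtered, k
    return best

fs = 20000.0
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.3 * np.random.randn(t.size)   # toy vibration
x[::400] += 2.0                                                   # periodic impulses
y = best_filtered(x, fs)
acf = np.correlate(y - y.mean(), y - y.mean(), mode="full")[y.size - 1:]
acf /= acf[0]                    # normalized autocorrelation highlights periodicity
```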
Procedia PDF Downloads 388
139 A New Approach of Preprocessing with SVM Optimization Based on PSO for Bearing Fault Diagnosis
Authors: Tawfik Thelaidjia, Salah Chenikher
Abstract:
Bearing fault diagnosis has attracted significant attention over the past few decades. It consists of two major parts: vibration signal feature extraction and condition classification for the extracted features. In this paper, feature extraction from faulty bearing vibration signals is performed by a combination of the signal's kurtosis and features obtained through the preprocessing of the vibration signal samples using the db2 discrete wavelet transform at the fifth level of decomposition. In this way, a 7-dimensional vibration signal feature vector is obtained. After feature extraction from the vibration signal, a support vector machine (SVM) was applied to automate the fault diagnosis procedure. To improve the classification accuracy for bearing fault prediction, particle swarm optimization (PSO) is employed to simultaneously optimize the SVM kernel function parameter and the penalty parameter. The results have shown the feasibility and effectiveness of the proposed approach.
Keywords: condition monitoring, discrete wavelet transform, fault diagnosis, kurtosis, machine learning, particle swarm optimization, roller bearing, rotating machines, support vector machine, vibration measurement
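A minimal sketch of the 7-dimensional feature vector (six db2 sub-band energies at level 5 plus the signal's kurtosis) fed to an SVM; the PSO tuning of C and the kernel parameter is replaced here by fixed illustrative values, and the data are synthetic.

```python
# Hedged sketch of the DWT + kurtosis feature vector and SVM classifier.
import numpy as np
import pywt
from scipy.stats import kurtosis
from sklearn.svm import SVC

def bearing_features(signal: np.ndarray) -> np.ndarray:
    """7-D vector: 6 sub-band energies of a level-5 db2 DWT + signal kurtosis."""
    coeffs = pywt.wavedec(signal, "db2", level=5)   # [cA5, cD5, ..., cD1]
    energies = [float(np.sum(c ** 2)) for c in coeffs]
    return np.array(energies + [kurtosis(signal)])

rng = np.random.default_rng(1)
healthy = [rng.normal(size=2048) for _ in range(20)]
faulty = [rng.normal(size=2048) + (rng.random(2048) < 0.01) * 5.0 for _ in range(20)]
X = np.array([bearing_features(s) for s in healthy + faulty])
y = np.array([0] * 20 + [1] * 20)

# In the paper the RBF kernel parameter and penalty C are tuned by PSO;
# fixed values stand in for that search in this sketch.
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X, y)
print("training accuracy:", clf.score(X, y))
```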
Procedia PDF Downloads 437
138 Device Control Using Brain Computer Interface
Authors: P. Neeraj, Anurag Sharma, Harsukhpreet Singh
Abstract:
In recent years, Brain-Computer Interface (BCI) schemes based on the steady-state Visual Evoked Potential (SSVEP) have earned much consideration. This study tries to develop an SSVEP-based BCI scheme that can switch any device mock-up between two unique states, ON and OFF. In this paper, two distinct flicker frequencies in the low-frequency range were used to evoke the SSVEPs and were shown on a Liquid Crystal Display (LCD) screen using LabVIEW. Two stimulus colors, yellow and blue, were used to train the system on SSVEPs. The Electroencephalogram (EEG) signals were recorded from the occipital region. Features were extracted by using the discrete wavelet transform. A prominent multilayer Neural Network Algorithm (NNA) is used to classify the SSVEP signals. During training of the network with different algorithms, regression plot results demonstrated that when the Levenberg-Marquardt training algorithm was used, the accuracy reached 93.9%, which is superior to the other training algorithms.
Keywords: brain computer interface, electroencephalography, steady-state visual evoked potential, wavelet transform, neural network
Procedia PDF Downloads 334
137 CFD simulation of Near Wall Turbulence and Heat Transfer of Molten Salts
Authors: C. S. Sona, Makrand A. Khanwale, Channamallikarjun S. Mathpati
Abstract:
New generation nuclear power plants are currently being developed to be highly economical, passively safe, and able to produce hydrogen. An important feature of these reactors will be the use of coolants at temperatures higher than those used in current nuclear reactors. The molten fluoride salt with a eutectic composition of 46.5% LiF - 11.5% NaF - 42% KF (mol %), commonly known as FLiNaK, is a leading candidate heat transfer coolant for these nuclear reactors. CFD simulations were carried out using large eddy simulation to investigate the flow characteristics of molten FLiNaK at 850°C at a Reynolds number of 10,500 in a cylindrical pipe. Simulation results have been validated against the mean velocity profile obtained from direct numerical simulation data. Transient velocity information was used to identify and characterise turbulent structures, which are important for the transfer of heat across the solid-fluid interface. A wavelet transform based methodology called wavelet transform modulus maxima was used to identify and characterise the singularities. This analysis was also used for flow visualisation and to calculate the heat transfer coefficient using the small eddy model. The predicted Nusselt number showed good agreement with the available experimental data.
Keywords: FLiNaK, heat transfer, molten salt, turbulent structures
Procedia PDF Downloads 449
136 A Corpus Study of English Verbs in Chinese EFL Learners’ Academic Writing Abstracts
Authors: Shuaili Ji
Abstract:
The correct use of verbs is an important element of high-quality research articles, and thus for Chinese EFL learners it is significant to master the characteristics of verbs and to use verbs precisely. However, some studies have shown that there are differences in verb use between learners and native speakers and that learners have difficulty using English verbs. This corpus-based quantitative research can enhance learners' knowledge of English verbs and improve the quality of research article abstracts and even of academic writing as a whole. The aim of this study is to find the differences between learners' and native speakers' use of verbs and to study the factors that contribute to those differences. To this end, the research question is as follows: What are the differences between the verbs most frequently used by learners and those most frequently used by native speakers? The research question is answered through a study that uses a corpus-based, data-driven approach to analyze the verbs used by learners in their abstract writing in terms of collocation, colligation and semantic prosody. The results show that: (1) EFL learners obviously overused 'be, can, find, make' and underused 'investigate, examine, may'. As to modal verbs, learners obviously overused 'can' while underusing 'may'. (2) Learners obviously overused 'we find + object clauses' while underusing 'nouns (results, findings, data) + suggest/indicate/reveal + object clauses' when expressing research results. (3) Learners tended to transfer the collocation, colligation and semantic prosody of shǐ and zuò to make. (4) Learners obviously overused 'BE + V-ed' and used BE as the main verb. They also obviously overused the basic forms of BE such as be, is, are, while obviously underusing its inflections (was, were). These results manifest learners' lack of accuracy and idiomaticity in verb usage. Due to the influence of concept transfer from Chinese, the verbs in learners' abstracts showed obvious transfer from the mother language. In addition, learners have not fully mastered the use of verbs, avoiding complex colligations to prevent errors. Based on these findings, the present study has implications for English teaching and for English academic abstract writing in China. Further research could be undertaken to study the use of verbs in whole dissertations to find out whether the characteristics of the verbs in abstracts also apply to the whole dissertation.
Keywords: academic writing abstracts, Chinese EFL learners, corpus-based, data-driven, verbs
Procedia PDF Downloads 335
135 Design of a Real Time Heart Sounds Recognition System
Authors: Omer Abdalla Ishag, Magdi Baker Amien
Abstract:
Physicians use the stethoscope to listen to patient heart sounds in order to make a diagnosis. However, the determination of heart conditions with an acoustic stethoscope is a difficult task, so it requires special training of medical staff. This study developed an accurate model for analyzing the phonocardiograph signal based on a PC and a DSP processor. The system has been realized in two phases: an offline and a real-time phase. In the offline phase, 30 cases of heart sound files were collected from medical students and the doctor's world website. For the experimental (real-time) phase, an electronic stethoscope was designed and implemented, and signals were recorded from 30 volunteers, of which 17 were normal cases and 13 were various pathological cases; these 30 acquired signals were preprocessed using an adaptive filter to remove lung sounds. The background noise was removed from both offline and real data using the wavelet transform, then graphical and statistical feature vector elements were extracted, and finally a look-up table was used for classification of heart sound cases. The obtained results of the implemented system showed accuracies of 90% and 80% and sensitivities of 87.5% and 82.4% for offline data and real data, respectively. The whole system has been designed on a TMS320VC5509a DSP platform.
Keywords: code composer studio, heart sounds, phonocardiograph, wavelet transform
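A minimal sketch, under assumed parameters, of the wavelet denoising step applied to a phonocardiogram segment before feature extraction; the wavelet family, level, and threshold rule are illustrative, not the authors'.

```python
# Hedged sketch: wavelet-threshold denoising of a phonocardiogram segment.
import numpy as np
import pywt

def wavelet_denoise(x: np.ndarray, wavelet: str = "db6", level: int = 5) -> np.ndarray:
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # Universal threshold estimated from the finest detail coefficients
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(x)))
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(x)]

fs = 2000
t = np.arange(0, 3, 1 / fs)
heart = np.sin(2 * np.pi * 1.2 * t) * (np.sin(2 * np.pi * 40 * t) ** 2)  # toy PCG
noisy = heart + 0.2 * np.random.randn(t.size)
clean = wavelet_denoise(noisy)
```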
Procedia PDF Downloads 445
134 Perceptual Image Coding by Exploiting Internal Generative Mechanism
Authors: Kuo-Cheng Liu
Abstract:
In perceptual image coding, the objective is to shape the coding distortion such that the amplitude of distortion does not exceed the error visibility threshold, or to remove perceptually redundant signals from the image. While most research focuses on color image coding, the perceptual-based quantizers developed for luminance signals are often directly applied to chrominance signals, which makes such color image compression methods inefficient. In this paper, the internal generative mechanism is integrated into the design of a color image compression method. The internal generative mechanism working model based on structure-based spatial masking is used to assess subjective distortion visibility thresholds that are more visually consistent with human eyes. The estimation method of structure-based distortion visibility thresholds for color components is further presented in a locally adaptive way to design the quantization process in the wavelet color image compression scheme. Since the lowest subband coefficient matrix of images in the wavelet domain preserves the local property of images in the spatial domain, the error visibility threshold inherent in each coefficient of the lowest subband for each color component is estimated by using the proposed spatial error visibility threshold assessment. The threshold inherent in each coefficient of the other subbands for each color component is then estimated in a locally adaptive fashion based on the distortion energy allocation. Because the error visibility thresholds are estimated using predicted and reconstructed signals of the color image, the coding scheme incorporating the locally adaptive perceptual color quantizer does not require side information. Experimental results show that the entropies of the three color components obtained by using the proposed IGM-based color image compression scheme are lower than those obtained by using the existing color image compression method at perceptually lossless visual quality.
Keywords: internal generative mechanism, structure-based spatial masking, visibility threshold, wavelet domain
Procedia PDF Downloads 248
133 Real Interest Rates and Real Returns of Agricultural Commodities in the Context of Quantitative Easing
Authors: Wei Yao, Constantinos Alexiou
Abstract:
In the existing literature, many studies have focused on the implementation and effectiveness of quantitative easing (QE) since 2008, but only a few have evaluated QE's effect on commodity prices. In this context, by following Frankel's (1986) commodity price overshooting model, we study the dynamic covariation between the expected real interest rates and six agricultural commodities' real returns over the period from 2000:1 to 2018 for the US economy. We use wavelet analysis to investigate the causal relationship and co-movement of time series data by calculating the coefficient of determination in different frequencies. We find that a) US unconventional monetary policy may cause more positive and significant covariation between the expected real interest rates and agricultural commodities' real returns over the short horizons; b) a lead-lag relationship that runs from agricultural commodities' real returns to the expected real short-term interest rates over the long horizons; and c) a lead-lag relationship from agricultural commodities' real returns to the expected real long-term interest rates over short horizons. In the realm of monetary policy, we argue that QE may shift the negative relationship between most commodities' real returns and the expected real interest rates to a positive one over a short horizon.
Keywords: QE, commodity price, interest rate, wavelet coherence
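A minimal sketch of the lead-lag idea mentioned above: the sample cross-correlation between an expected real interest rate series and a commodity's real return series at different lags. The data here are synthetic, and wavelet coherence itself would normally be computed with a dedicated toolbox rather than this plain correlation.

```python
# Hedged sketch: lag cross-correlation between two monthly series.
import numpy as np

def lag_crosscorr(x: np.ndarray, y: np.ndarray, max_lag: int = 12):
    """Correlation of x(t) with y(t + lag) for lag in [-max_lag, max_lag]."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    lags = range(-max_lag, max_lag + 1)
    out = []
    for lag in lags:
        if lag < 0:
            out.append(np.corrcoef(x[-lag:], y[:lag])[0, 1])
        elif lag > 0:
            out.append(np.corrcoef(x[:-lag], y[lag:])[0, 1])
        else:
            out.append(np.corrcoef(x, y)[0, 1])
    return list(lags), out

rng = np.random.default_rng(7)
real_rate = rng.normal(size=228)                 # hypothetical monthly series
returns = np.roll(-0.6 * real_rate, 3) + 0.5 * rng.normal(size=228)
lags, cc = lag_crosscorr(real_rate, returns)
print("peak correlation at lag", lags[int(np.argmax(np.abs(cc)))])
```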
Procedia PDF Downloads 89
132 A Normalized Non-Stationary Wavelet Based Analysis Approach for a Computer Assisted Classification of Laryngoscopic High-Speed Video Recordings
Authors: Mona K. Fehling, Jakob Unger, Dietmar J. Hecker, Bernhard Schick, Joerg Lohscheller
Abstract:
Voice disorders originate from disturbances of the vibration patterns of the two vocal folds located within the human larynx. Consequently, the visual examination of vocal fold vibrations is an integral part of the clinical diagnostic process. For an objective analysis of the vocal fold vibration patterns, the two-dimensional vocal fold dynamics are captured during sustained phonation using an endoscopic high-speed camera. In this work, we present an approach allowing a fully automatic analysis of the high-speed video data, including a computerized classification of healthy and pathological voices. The approach is based on a wavelet-based analysis of so-called phonovibrograms (PVG), which are extracted from the high-speed videos and comprise the entire two-dimensional vibration pattern of each vocal fold individually. Using a principal component analysis (PCA) strategy, a low-dimensional feature set is computed from each phonovibrogram. From the PCA space, clinically relevant measures can be derived that objectively quantify vibration abnormalities. In the first part of the work it is shown that, using a machine learning approach, the derived measures are suitable to distinguish automatically between healthy and pathological voices. Within the approach, the formation of the PCA space and consequently the extracted quantitative measures depend on the clinical data that were used to compute the principal components. Therefore, in the second part of the work we propose a strategy to achieve a normalization of the PCA space by registering it to a coordinate system using a set of synthetically generated vibration patterns. The results show that owing to the normalization step, potential ambiguity of the parameter space can be eliminated. The normalization further allows a direct comparison of research results that are based on PCA spaces obtained from different clinical subjects.
Keywords: wavelet-based analysis, multiscale product, normalization, computer assisted classification, high-speed laryngoscopy, vocal fold analysis, phonovibrogram
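A minimal sketch of the PCA step: vectorized phonovibrogram-like patterns are projected to a low-dimensional space and a classifier separates healthy from pathological cases. The data shapes, the number of components, and the classifier are assumptions; the paper's registration to synthetic reference patterns is only noted in a comment.

```python
# Hedged sketch: PCA feature reduction followed by a machine-learning classifier.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(3)
# Hypothetical vectorized phonovibrograms: 60 recordings x 5000 pixels each
healthy = rng.normal(0.0, 1.0, size=(30, 5000))
pathological = rng.normal(0.4, 1.2, size=(30, 5000))
X = np.vstack([healthy, pathological])
y = np.array([0] * 30 + [1] * 30)

model = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
model.fit(X, y)
print("training accuracy:", model.score(X, y))
# Registering the PCA space to synthetically generated vibration patterns (the
# paper's normalization step) would replace the purely data-driven components above.
```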
Procedia PDF Downloads 265
131 Block Implicit Adams Type Algorithms for Solution of First Order Differential Equation
Authors: Asabe Ahmad Tijani, Y. A. Yahaya
Abstract:
The paper considers the derivation of implicit Adams-Moulton type methods with k=4 and 5. We adopted the method of interpolation and collocation of a power series approximation to generate the continuous formula, which was evaluated at off-grid and some grid points within the step length to generate the proposed block schemes; the schemes were investigated and found to be consistent and zero stable. Finally, the methods were tested with numerical experiments to ascertain their level of accuracy.
Keywords: Adam-Moulton Type (AMT), off-grid, block method, consistent and zero stable
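For orientation, a minimal sketch of the classical (non-block) four-step Adams-Bashforth-Moulton predictor-corrector that block Adams-type schemes generalize; the coefficients below are the standard textbook ones, not those of the derived block method.

```python
# Hedged sketch: classical 4-step Adams-Bashforth-Moulton predictor-corrector.
import numpy as np

def abm4(f, t0: float, y0: float, h: float, n_steps: int):
    """Solve y' = f(t, y); the first 3 steps are bootstrapped with RK4."""
    t = t0 + h * np.arange(n_steps + 1)
    y = np.zeros(n_steps + 1)
    y[0] = y0
    for i in range(3):                           # RK4 start-up values
        k1 = f(t[i], y[i]); k2 = f(t[i] + h / 2, y[i] + h / 2 * k1)
        k3 = f(t[i] + h / 2, y[i] + h / 2 * k2); k4 = f(t[i] + h, y[i] + h * k3)
        y[i + 1] = y[i] + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    for i in range(3, n_steps):
        fm = [f(t[i - j], y[i - j]) for j in range(4)]       # f_i ... f_{i-3}
        # Adams-Bashforth 4 predictor
        yp = y[i] + h / 24 * (55 * fm[0] - 59 * fm[1] + 37 * fm[2] - 9 * fm[3])
        # Adams-Moulton 4 (implicit) corrector, evaluated with the predicted value
        y[i + 1] = y[i] + h / 720 * (251 * f(t[i + 1], yp) + 646 * fm[0]
                                     - 264 * fm[1] + 106 * fm[2] - 19 * fm[3])
    return t, y

t, y = abm4(lambda t, y: -2 * y, 0.0, 1.0, 0.05, 100)   # test: y' = -2y, y(0) = 1
print("max error vs exp(-2t):", np.max(np.abs(y - np.exp(-2 * t))))
```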
Procedia PDF Downloads 482
130 Collocation Errors in English as Second Language (ESL) Essay Writing
Authors: Fatima Muhammad Shitu
Abstract:
In language learning, second language learners, like their native speaker counterparts, commit errors in their attempt to achieve competence in the target language. The realm of collocation has to do with the meaning relation between lexical items. In all human languages, there is a kind of 'natural order' in which words are arranged or relate to one another in sentences, so much so that when a word occurs in a given context, the related or naturally co-occurring word will automatically come to mind. It becomes an error, therefore, if students inappropriately pair or arrange such 'naturally' co-occurring lexical items in a text. It has been observed that most of the second language learners in this research group commit collocational errors. A study of this kind is very significant as it gives insight into the kinds of errors committed by learners. This will help the language teacher to identify the sources and causes of such errors as well as correct them, thereby guiding, helping and leading the learners towards achieving some level of competence in the language. The aim of the study is to understand the nature of these errors as stumbling blocks to effective essay writing. The objective of the study is to identify the errors, analyse their structural compositions so as to determine whether there are similarities between students in this regard, and to find out whether there are patterns to these kinds of errors which will enable the researcher to understand their sources and causes. As a descriptive study, the researcher sampled some nine hundred essays collected from three hundred undergraduate learners of English as a second language in the Federal College of Education, Kano, North-West Nigeria, i.e. three essays per student. The essays, which were given at three different lecture times, were of similar thematic preoccupations (i.e. same topics) and length (i.e. same number of words). The essays were written during the lecture hour on three different lecture occasions. The errors were identified in a systematic manner whereby errors so identified were recorded only once even if they occurred several times in students' essays. The data were collated using percentages, in which the identified numbers of occurrences were converted accordingly into percentages. The findings from the study indicate that there are similarities as well as regular and repeated errors which provide a pattern. Based on the pattern identified, the conclusion is that students' collocational errors are attributable to poor teaching and learning, which results in wrong generalisation of rules.
Keywords: collocations, errors, second language learning, ESL students
Procedia PDF Downloads 330
129 River Stage-Discharge Forecasting Based on Multiple-Gauge Strategy Using EEMD-DWT-LSSVM Approach
Authors: Farhad Alizadeh, Alireza Faregh Gharamaleki, Mojtaba Jalilzadeh, Houshang Gholami, Ali Akhoundzadeh
Abstract:
This study presents a hybrid pre-processing approach along with a conceptual model to enhance the accuracy of river discharge prediction. In order to achieve this goal, the Ensemble Empirical Mode Decomposition algorithm (EEMD), the Discrete Wavelet Transform (DWT) and Mutual Information (MI) were employed as a hybrid pre-processing approach coupled to a Least Square Support Vector Machine (LSSVM). A conceptual strategy, namely the multi-station model, was developed to forecast the Souris River discharge more accurately. The strategy used herein was capable of covering the uncertainties and complexities of river discharge modeling. DWT and EEMD were coupled, and feature selection was performed on the decomposed sub-series using MI so that they could be employed in the multi-station model. In the proposed feature selection method, some useless sub-series were omitted to achieve better performance. The results confirmed the efficiency of the proposed DWT-EEMD-MI approach in improving the accuracy of multi-station modeling strategies.
Keywords: river stage-discharge process, LSSVM, discrete wavelet transform, Ensemble Empirical Mode Decomposition, multi-station modeling
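A minimal sketch of the feature-selection idea: decomposed sub-series of an upstream gauge record are ranked by mutual information with the downstream target and only the informative ones are kept. The EEMD stage is omitted here (it would typically come from a dedicated EMD package), kernel ridge regression stands in for LSSVM, and the data are synthetic.

```python
# Hedged sketch: DWT sub-series + mutual-information selection + kernel regression.
import numpy as np
import pywt
from sklearn.feature_selection import mutual_info_regression
from sklearn.kernel_ridge import KernelRidge

def dwt_components(series, wavelet="db4", level=3):
    """One reconstructed sub-series per decomposition level, aligned in time."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    comps = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        comps.append(pywt.waverec(kept, wavelet)[: len(series)])
    return np.stack(comps, axis=1)               # (n, level + 1)

rng = np.random.default_rng(11)
upstream = np.sin(np.linspace(0, 20 * np.pi, 500)) + 0.3 * rng.normal(size=500)
target = np.roll(upstream, 2) + 0.2 * rng.normal(size=500)   # downstream gauge

X = dwt_components(upstream)
mi = mutual_info_regression(X, target)
keep = mi > 0.1 * mi.max()                        # drop nearly useless sub-series
model = KernelRidge(kernel="rbf", alpha=1.0).fit(X[:, keep][:400], target[:400])
print("test R^2:", model.score(X[:, keep][400:], target[400:]))
```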
Procedia PDF Downloads 175
128 Signal Processing of Barkhausen Noise Signal for Assessment of Increasing Down Feed in Surface Ground Components with Poor Micro-Magnetic Response
Authors: Tanmaya Kumar Dash, Tarun Karamshetty, Soumitra Paul
Abstract:
The Barkhausen Noise Analysis (BNA) technique has been utilized to assess the surface integrity of steels. However, the BNA technique is not very successful in evaluating the surface integrity of ground steels that exhibit a poor micro-magnetic response. A new approach has been proposed for processing the BN signal with the Fast Fourier transform, while the wavelet transform has been used to remove noise from the BN signal, with a judicious choice of the 'threshold' value, when the micro-magnetic response of the work material is poor. In the present study, the effect of the down feed induced by conventional plunge surface grinding of hardened bearing steel has been investigated, along with an ultrasonically cleaned, wet-polished sample and a sample ground with the spark-out technique for benchmarking. Moreover, the FFT analysis has been carried out at different sets of applied voltages and applied frequencies, and the pattern of the BN signal in the frequency domain is analyzed. The study also employs the wavelet transform technique with different levels of decomposition and different mother wavelets, which has been used to reduce the noise in the BN signal of materials with a poor micro-magnetic response, in order to standardize the procedure for all BN signals depending on the frequency of the applied voltage.
Keywords: barkhausen noise analysis, grinding, magnetic properties, signal processing, micro-magnetic response
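A minimal sketch of the two processing steps named above: the FFT magnitude spectrum of a BN burst and wavelet-threshold denoising with a selectable mother wavelet and decomposition level. The threshold rule and all parameters are assumptions.

```python
# Hedged sketch: FFT spectrum and wavelet-threshold denoising of a BN signal.
import numpy as np
import pywt

def bn_spectrum(x: np.ndarray, fs: float):
    """One-sided FFT magnitude spectrum of the Barkhausen noise burst."""
    spectrum = np.abs(np.fft.rfft(x)) / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs, spectrum

def bn_denoise(x: np.ndarray, wavelet: str = "db8", level: int = 4,
               thr_scale: float = 1.0) -> np.ndarray:
    """Soft-threshold the detail coefficients; the 'threshold' choice is adjustable."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = thr_scale * sigma * np.sqrt(2 * np.log(len(x)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

fs = 200_000.0
burst = np.random.randn(8192) * np.hanning(8192)    # toy BN burst
freqs, mag = bn_spectrum(burst, fs)
clean = bn_denoise(burst)
```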
Procedia PDF Downloads 667
127 Interpersonal Body-Synchronization in Young Children When Watching Video Together
Authors: Saeko Takahashi, Kazuo Hiraki
Abstract:
Is it more fun to watch videos together than to watch alone? Previous studies have shown that synchronizing with others enhances subsequent prosocial behavior and affiliation, and conversely, prosocial individuals tend to coordinate with a partner to a greater extent. However, compared to adults, less is known about the interpersonal coordination of young children in real-life situations because most studies have focused on children's particular movements using specific tools or tasks in a laboratory setting. It has also been unclear whether the prosociality of young children affects the extent of interpersonal coordination within dyads. The present study examined data from motion capture of five body parts of 4-year-old dyads watching the same stimuli together or alone. A questionnaire survey including the participants' prosocial trait was also conducted. The wavelet coherence of each body part within dyads was calculated as a measure of the extent of interpersonal coordination. Results showed that the dyads became significantly more coordinated in a social situation compared to a non-social situation. Moreover, dyads with higher average prosociality were more coordinated. These results shed some light on the development of interpersonal coordination in terms of social ability in young children. This study also offers a useful method for the study of spontaneous coordination in young children and infants without instructions or verbal responses.
Keywords: child development, interpersonal coordination, prosociality, synchrony, wavelet transform
Procedia PDF Downloads 139
126 Multi-Layer Perceptron and Radial Basis Function Neural Network Models for Classification of Diabetic Retinopathy Disease Using Video-Oculography Signals
Authors: Ceren Kaya, Okan Erkaymaz, Orhan Ayar, Mahmut Özer
Abstract:
Diabetes Mellitus (diabetes) is a disease based on insulin hormone disorders and causes high blood glucose. Clinical findings indicate that diabetes can be diagnosed from electrophysiological signals obtained from the vital organs. 'Diabetic Retinopathy' is one of the most common eye diseases resulting from diabetes, and it is the leading cause of vision loss due to structural alteration of the retinal layer vessels. In this study, features of horizontal and vertical Video-Oculography (VOG) signals have been used to classify non-proliferative and proliferative diabetic retinopathy disease. Twenty-five features are acquired by using the discrete wavelet transform on VOG signals taken from 21 subjects. Two models, based on the multi-layer perceptron and the radial basis function, are recommended for the diagnosis of diabetic retinopathy. The proposed models can also detect the level of the disease. We show the comparative classification performance of the proposed models. Our results show that the proposed RBF model (100%) results in better classification performance than the MLP model (94%).
Keywords: diabetic retinopathy, discrete wavelet transform, multi-layer perceptron, radial basis function, video-oculography (VOG)
Procedia PDF Downloads 259
125 Linear Stability of Convection in an Inclined Channel with Nanofluid Saturated Porous Medium
Authors: D. Srinivasacharya, Nidhi Humnekar
Abstract:
The goal of this research is to numerically investigate the convection of nanofluid flow in an inclined porous channel. Brownian motion and thermophoresis effects are accounted for in the nanofluid model. In addition, the flow in the porous region is governed by Brinkman's equation. The generalized eigenvalue problem for the perturbed state is obtained using normal mode analysis, and Chebyshev spectral collocation is used to solve it. For various values of the governing parameters, the critical wavenumber and critical Rayleigh number are calculated, and preferred modes are identified.
Keywords: Brinkman model, inclined channel, nanofluid, linear stability, porous media
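A minimal sketch of the Chebyshev spectral collocation machinery used to discretize perturbation equations: the standard Chebyshev-Gauss-Lobatto points and differentiation matrix (Trefethen's construction). Assembling the actual generalized eigenvalue problem for the inclined porous channel is problem-specific and omitted here.

```python
# Hedged sketch: Chebyshev differentiation matrix for spectral collocation.
import numpy as np

def cheb(N: int):
    """Chebyshev-Gauss-Lobatto points and first-derivative matrix on [-1, 1]."""
    if N == 0:
        return np.array([[0.0]]), np.array([1.0])
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))                  # diagonal via negative row sums
    return D, x

D, x = cheb(32)
D2 = D @ D                                       # second-derivative operator
# A generalized eigenvalue problem A q = Ra B q would then be assembled from
# D and D2 and solved, e.g. with scipy.linalg.eig(A, B), for the critical
# Rayleigh number at each wavenumber.
u = np.sin(np.pi * x)
print("max error of D2 on sin(pi x):", np.max(np.abs(D2 @ u + np.pi ** 2 * u)))
```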
Procedia PDF Downloads 110
124 A Combination of Anisotropic Diffusion and Sobel Operator to Enhance the Performance of the Morphological Component Analysis for Automatic Crack Detection
Authors: Ankur Dixit, Hiroaki Wagatsuma
Abstract:
Crack detection on concrete bridges is an important and constant task in civil engineering. Traditionally, humans inspect the bridge for cracks to maintain the quality and reliability of the bridge, but this process is very long and costly. To overcome such limitations, we have used a drone with a digital camera, which took images of the bridge deck, and these images are processed by morphological component analysis (MCA). The MCA technique is a very strong application of sparse coding, and it explores the possibility of separating images. In this paper, MCA has been used to decompose the image into coarse and fine components with the effectiveness of two dictionaries, namely anisotropic diffusion and the wavelet transform. Anisotropic diffusion is an adaptive smoothing process that adjusts the diffusion coefficient using the gray level and gradient as features. The cracks in the image are enhanced by subtracting the diffused coarse image from the original image, and the results are treated by the Sobel edge detector and binary filtering to exhibit the cracks clearly. Our results demonstrate that the proposed MCA framework using anisotropic diffusion followed by the Sobel operator and binary filtering may contribute to the automation of crack detection even in severe open-field conditions such as bridge decks.
Keywords: anisotropic diffusion, coarse component, fine component, MCA, Sobel edge detector and wavelet transform
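A minimal sketch of the two building blocks named above: a Perona-Malik-style anisotropic diffusion pass and a Sobel edge map of the residual (original minus diffused) image; iteration count, conduction constant, and the toy image are assumptions, and the full MCA dictionary decomposition is omitted.

```python
# Hedged sketch: anisotropic diffusion + Sobel on the residual to enhance cracks.
import numpy as np
from scipy import ndimage

def anisotropic_diffusion(img: np.ndarray, n_iter: int = 20, kappa: float = 30.0,
                          gamma: float = 0.2) -> np.ndarray:
    """Perona-Malik diffusion: smooth flat regions, preserve strong gradients."""
    out = img.astype(float).copy()
    for _ in range(n_iter):
        dn = np.roll(out, -1, axis=0) - out      # differences in 4 directions
        ds = np.roll(out, 1, axis=0) - out
        de = np.roll(out, -1, axis=1) - out
        dw = np.roll(out, 1, axis=1) - out
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        out += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return out

rng = np.random.default_rng(0)
deck = rng.normal(128, 10, size=(256, 256))
deck[120:123, :] -= 60                            # toy dark crack
coarse = anisotropic_diffusion(deck)
fine = deck - coarse                              # crack-enhanced residual
edges = np.hypot(ndimage.sobel(fine, axis=0), ndimage.sobel(fine, axis=1))
crack_mask = edges > edges.mean() + 3 * edges.std()   # simple binary filtering
```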
Procedia PDF Downloads 173
123 Natural Factors of Interannual Variability of Winter Precipitation over the Altai Krai
Authors: Sukovatov K.Yu., Bezuglova N.N.
Abstract:
Winter precipitation variability over the Altai Krai was investigated by retrieving temporal patterns. Singular spectrum analysis was used to describe the variance distribution and to reduce the precipitation data into a few components (modes). The associated time series were related to large-scale atmospheric and oceanic circulation indices by using lag cross-correlation and wavelet-coherence analysis. GPCC monthly precipitation data for a rectangular field limited by 50-55°N, 77-88°E and monthly climatological circulation index data for the cold season were used to perform the SSA decomposition and retrieve statistics for the analyzed parameters over the time period 1951-2017. The interannual variability of winter precipitation over the Altai Krai is mostly caused by three natural factors: intensity variations of momentum exchange between mid and polar latitudes over the North Atlantic (explained variance 11.4%); wind speed variations in the equatorial stratosphere (quasi-biennial oscillation, explained variance 15.3%); and surface temperature variations of the equatorial Pacific sea (ENSO, explained variance 2.8%). It is concluded that under the current climate conditions (Arctic amplification and increasing frequency of meridional processes in mid-latitudes) the second and third factors make a more significant contribution to the explained variance of the interannual variability of cold season atmospheric precipitation over the Altai Krai than the first factor.
Keywords: interannual variability, winter precipitation, Altai Krai, wavelet-coherence
Procedia PDF Downloads 188
122 Finite Element Method for Solving the Generalized RLW Equation
Authors: Abdel-Maksoud Abdel-Kader Soliman
Abstract:
The General Regularized Long Wave (GRLW) equation is solved numerically by means of a new algorithm based on a collocation method using quartic B-splines at the mid-knot points as the element shape functions. In addition, we use the fourth-order Runge-Kutta method for solving the resulting system of first order ordinary differential equations instead of the finite difference method. Our test problems, including the migration and interaction of solitary waves, are used to validate the algorithm, which is found to be accurate and efficient. The three invariants of the motion are evaluated to determine the conservation properties of the algorithm.
Keywords: generalized RLW equation, solitons, quartic B-spline, nonlinear partial differential equations, difference equations
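A minimal sketch of the time-integration part: the classical fourth-order Runge-Kutta step applied to a generic semi-discrete system dU/dt = F(U), such as the one produced by a B-spline collocation. The right-hand side below is a simple periodic advection placeholder, not the GRLW discretization.

```python
# Hedged sketch: classical RK4 for a method-of-lines system dU/dt = F(U).
import numpy as np

def rk4_step(F, U: np.ndarray, dt: float) -> np.ndarray:
    """One fourth-order Runge-Kutta step for the ODE system dU/dt = F(U)."""
    k1 = F(U)
    k2 = F(U + 0.5 * dt * k1)
    k3 = F(U + 0.5 * dt * k2)
    k4 = F(U + dt * k3)
    return U + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Placeholder right-hand side: periodic linear advection u_t = -u_x
N, L = 200, 100.0
x = np.linspace(0, L, N, endpoint=False)
dx = L / N
def F(U):
    return -(np.roll(U, -1) - np.roll(U, 1)) / (2 * dx)

U = np.exp(-0.1 * (x - 40.0) ** 2)               # solitary-wave-like initial profile
dt = 0.1
for _ in range(500):
    U = rk4_step(F, U, dt)
# Invariants such as the mass sum(U) * dx can be monitored to check conservation.
```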
Procedia PDF Downloads 489
121 Denoising Transient Electromagnetic Data
Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen
Abstract:
Transient electromagnetic (TEM) data plays a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates in different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize the wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibits a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform
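A minimal sketch of the two denoising stages described above: stacking (averaging) repeated transient decays recorded in one acquisition mode, then wavelet soft-thresholding of the stacked decay. The decay model, wavelet, and threshold rule are assumptions.

```python
# Hedged sketch: signal averaging of repeated TEM transients + wavelet denoising.
import numpy as np
import pywt

rng = np.random.default_rng(5)
t = np.logspace(-5, -2, 256)                     # decay time gates (s), hypothetical
true_decay = 1e-3 * t ** -1.5                    # toy power-law transient
repeats = true_decay + rng.normal(scale=2e3, size=(64, t.size))  # 64 noisy records

stacked = repeats.mean(axis=0)                   # averaging raises SNR by sqrt(64)

coeffs = pywt.wavedec(stacked, "sym8", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745
thr = sigma * np.sqrt(2 * np.log(stacked.size))
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "sym8")[: stacked.size]
```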
Procedia PDF Downloads 84
120 Hybridization of Mathematical Transforms for Robust Video Watermarking Technique
Authors: Harpal Singh, Sakshi Batra
Abstract:
The widespread and easy access to multimedia content and the possibility of making numerous copies without significant loss of fidelity have raised the need for digital rights management. This problem can be effectively addressed by digital watermarking technology. This is the concept of embedding some sort of data or special pattern (watermark) in the multimedia content; this information will later prove ownership in case of a dispute, trace the marked document's dissemination, identify a misappropriating person, or simply inform users about the rights-holder. The primary motive of digital watermarking is to embed the data imperceptibly and robustly in the host information. A large number of watermarking techniques have been developed to embed copyright marks or data in digital images, video, audio and other multimedia objects. With the development of digital video-based innovations, the copyright dilemma for the multimedia industry increases. Video watermarking has been proposed in recent years to address the issue of illicit copying and distribution of videos. It is the process of embedding copyright information in video bit streams. In practice, video watermarking schemes have to address some serious challenges compared to image watermarking schemes, such as real-time requirements in video broadcasting, the large volume of inherently redundant data between frames, and the imbalance between motion and motionless regions, and they are particularly vulnerable to attacks, for example, frame swapping, statistical analysis, rotation, noise, median and crop attacks. In this paper, an effective, robust and imperceptible video watermarking algorithm is proposed based on the hybridization of powerful mathematical transforms: the Fractional Fourier Transform (FrFT), the Discrete Wavelet Transform (DWT) and Singular Value Decomposition (SVD) using a redundant wavelet. This scheme utilizes various transforms for embedding watermarks on different layers by using hybrid systems. For this purpose, the video frames are partitioned into layers (RGB) and the watermark is embedded in two forms in the video frames, using SVD partitioning of the watermark and DWT sub-band decomposition of the host video, to facilitate copyright safeguarding as well as reliability. The FrFT orders are used as the encryption key, which allows the watermarking method to be more robust against various attacks. The fidelity of the scheme is enhanced by introducing a key generation and wavelet-based key embedding watermarking scheme. Thus, the same key is required for watermark embedding and extraction, and therefore the key must be shared between the owner and the verifier via some safe network. This paper demonstrates the performance by considering different qualitative metrics, namely Peak Signal to Noise Ratio, Structural Similarity Index and correlation values, and also applies some attacks to prove the robustness. The experimental results are presented to demonstrate that the proposed scheme can withstand a variety of video processing attacks while preserving imperceptibility.
Keywords: discrete wavelet transform, robustness, video watermarking, watermark
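A minimal sketch of the DWT-SVD embedding idea for a single frame channel: the watermark's singular values are blended into those of the frame's LL sub-band. The FrFT encryption and key-embedding steps of the full scheme are omitted, and the scaling factor is an assumption.

```python
# Hedged sketch: embed a watermark into the singular values of a DWT sub-band.
import numpy as np
import pywt

def embed(frame: np.ndarray, watermark: np.ndarray, alpha: float = 0.05):
    """Return the watermarked frame channel plus data needed for extraction."""
    LL, (LH, HL, HH) = pywt.dwt2(frame.astype(float), "haar")
    Uf, Sf, Vtf = np.linalg.svd(LL, full_matrices=False)
    Uw, Sw, Vtw = np.linalg.svd(watermark.astype(float), full_matrices=False)
    S_marked = Sf + alpha * Sw[: Sf.size]        # blend watermark singular values
    LL_marked = Uf @ np.diag(S_marked) @ Vtf
    marked = pywt.idwt2((LL_marked, (LH, HL, HH)), "haar")
    return marked, Sf

rng = np.random.default_rng(9)
frame = rng.integers(0, 256, size=(256, 256)).astype(float)   # one frame channel
wm = rng.integers(0, 2, size=(128, 128)).astype(float) * 255  # binary logo
marked, S_original = embed(frame, wm)
# Extraction would recompute the LL SVD of the received frame and recover
# (S_received - S_original) / alpha as the watermark singular values.
```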
Procedia PDF Downloads 224
119 An Adaptive Decomposition for the Variability Analysis of Observation Time Series in Geophysics
Authors: Olivier Delage, Thierry Portafaix, Hassan Bencherif, Guillaume Guimbretiere
Abstract:
Most observation data sequences in geophysics can be interpreted as resulting from the interaction of several physical processes at several time and space scales. As a consequence, measurement time series in geophysics often have characteristics of non-linearity and non-stationarity, thereby exhibiting strong fluctuations at all time scales, and require a time-frequency representation to analyze their variability. Empirical Mode Decomposition (EMD) is a relatively new technique that is part of a more general signal processing method called the Hilbert-Huang transform. This analysis method turns out to be particularly suitable for non-linear and non-stationary signals and consists in decomposing a signal in an auto-adaptive way into a sum of oscillating components named IMFs (Intrinsic Mode Functions), thereby acting as a bank of bandpass filters. The advantages of the EMD technique are that it is entirely data driven and that it provides the principal variability modes of the dynamics represented by the original time series. However, the main limiting factor is the frequency resolution, which may give rise to the mode-mixing phenomenon, where the spectral contents of some IMFs overlap each other. To overcome this problem, J. Gilles proposed an alternative entitled "Empirical Wavelet Transform" (EWT), which consists in building a bank of filters from the segmentation of the original signal's Fourier spectrum. The method used is based on the idea utilized in the construction of both Littlewood-Paley and Meyer's wavelets. The heart of the method lies in the segmentation of the Fourier spectrum based on local maxima detection in order to obtain a set of non-overlapping segments. Because it is linked to the Fourier spectrum, the frequency resolution provided by EWT is higher than that provided by EMD and therefore allows the mode-mixing problem to be overcome. On the other hand, while the EWT technique is able to detect the frequencies involved in the original time series fluctuations, it does not allow one to associate the detected frequencies with a specific mode of variability as in the EMD technique. Because EMD is closer to the observation of physical phenomena than EWT, we propose here a new technique called EAWD (Empirical Adaptive Wavelet Decomposition) based on the coupling of the EMD and EWT techniques, using the spectral density content of the IMFs to optimize the segmentation of the Fourier spectrum required by EWT. In this study, the EMD and EWT techniques are described, then the EAWD technique is presented. A comparison of the results obtained respectively by the EMD, EWT and EAWD techniques on time series of ozone total columns recorded at Reunion Island over the 1978-2019 period is discussed. This study was carried out as part of the SOLSTYCE project, dedicated to the characterization and modeling of the underlying dynamics of time series issued from complex systems in atmospheric sciences.
Keywords: adaptive filtering, empirical mode decomposition, empirical wavelet transform, filter banks, mode-mixing, non-linear and non-stationary time series, wavelet
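A minimal sketch of the spectrum-segmentation step at the heart of EWT: local maxima of the Fourier magnitude spectrum are detected and band boundaries are placed midway between consecutive maxima. Building the Meyer-type filter bank itself, and the EMD/EAWD coupling, are omitted; the test signal and band count are assumptions.

```python
# Hedged sketch: EWT-style segmentation of a Fourier spectrum into bands.
import numpy as np
from scipy.signal import find_peaks

def ewt_boundaries(x: np.ndarray, n_bands: int = 3) -> np.ndarray:
    """Band boundaries (as fractions of the Nyquist frequency) from spectrum maxima."""
    mag = np.abs(np.fft.rfft(x))
    peaks, props = find_peaks(mag, height=0)
    # keep the n_bands largest maxima, then place boundaries midway between them
    top = np.sort(peaks[np.argsort(props["peak_heights"])[-n_bands:]])
    bounds = (top[:-1] + top[1:]) / 2.0
    return bounds / len(mag)                     # fraction of the Nyquist frequency

rng = np.random.default_rng(2)
t = np.arange(4096) / 4096.0
signal = (np.sin(2 * np.pi * 30 * t) + np.sin(2 * np.pi * 120 * t)
          + 0.5 * np.sin(2 * np.pi * 300 * t) + 0.2 * rng.normal(size=t.size))
print("normalized band boundaries:", ewt_boundaries(signal, n_bands=3))
```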
Procedia PDF Downloads 137
118 Diagnosis of Induction Machine Faults by DWT
Authors: Hamidreza Akbari
Abstract:
In this paper, for the detection of inclined eccentricity in an induction motor, a time-frequency analysis of the stator startup current is carried out. For this purpose, the discrete wavelet transform is used. Data are obtained from simulations using the winding function approach. The results show the validity of the approach for detecting the fault and discriminating it from other faults.
Keywords: induction machine, fault, DWT, electric
Procedia PDF Downloads 350
117 DWT-SATS Based Detection of Image Region Cloning
Authors: Michael Zimba
Abstract:
A duplicated image region may be subjected to a number of attacks such as noise addition, compression, reflection, rotation, and scaling with the intention of either merely mating it to its targeted neighborhood or preventing its detection. In this paper, we present an effective and robust method of detecting duplicated regions, inclusive of those affected by the various attacks. In order to reduce the dimension of the image, the proposed algorithm firstly performs the discrete wavelet transform, DWT, of a suspicious image. However, unlike most existing copy-move image forgery (CMIF) detection algorithms operating in the DWT domain, which extract only the low-frequency sub-band of the DWT of the suspicious image and thereby leave valuable information in the other three sub-bands, the proposed algorithm simultaneously extracts features from all four sub-bands. The extracted features are not only a more accurate representation of image regions but are also robust to additive noise, JPEG compression, and affine transformation. Furthermore, principal component analysis-eigenvalue decomposition, PCA-EVD, is applied to reduce the dimension of the features. The extracted features are then sorted using the more computationally efficient radix sort algorithm. Finally, same affine transformation selection, SATS, a duplication verification method, is applied to detect duplicated regions. The proposed algorithm is not only fast but also more robust to attacks compared to related CMIF detection algorithms. The experimental results show high detection rates.
Keywords: affine transformation, discrete wavelet transform, radix sort, SATS
Procedia PDF Downloads 230
116 The Extent of Virgin Olive-Oil Prices' Distribution Revealing the Behavior of Market Speculators
Authors: Fathi Abid, Bilel Kaffel
Abstract:
The olive tree, the olive harvest during the winter season, and the production of olive oil (better known to professionals as the crushing operation) have long interested institutional traders such as olive-oil offices and private companies such as food industry firms refining and extracting pomace olive oil, as well as public and private export-import companies specializing in olive oil. The major problem facing producers of olive oil each winter campaign, contrary to what might be expected, is not whether the harvest will be good or not, but whether the sale price will allow them to cover production costs and achieve a reasonable margin of profit. These questions are entirely legitimate judging by the importance of the issue and the heavy complexity of the uncertainty and competition, made tougher by a high level of indebtedness and by the experience and expertise of speculators and producers whose objectives are sometimes conflicting. The aim of this paper is to study the formation mechanism of olive oil prices in order to learn about speculators' behavior and expectations in the market, how they contribute through their industry knowledge and their financial alliances, and the size of the financial challenge that may be involved for them in building private information hoses globally to take advantage. The methodology used in this paper is based on two stages: in the first stage we study econometrically the formation mechanisms of the olive oil price in order to understand market participant behavior by implementing ARMA, SARMA, GARCH and stochastic diffusion process models; the second stage is devoted to prediction purposes, for which we use a combined wavelet-ANN approach. Our main findings indicate that olive oil market participants interact with each other in a way that promotes the formation of stylized facts. The unstable behavior of participants creates the volatility clustering, non-linear dependence and cyclicity phenomena. By imitating each other in some periods of the campaign, different participants contribute to the fat tails observed in the olive oil price distribution. The best prediction model for the olive oil price is based on a back-propagation artificial neural network approach with input information based on wavelet decomposition and recent past history.
Keywords: olive oil price, stylized facts, ARMA model, SARMA model, GARCH model, combined wavelet-artificial neural network, continuous-time stochastic volatility model
Procedia PDF Downloads 339
115 Numerical Solution of Space Fractional Order Solute Transport System
Authors: Shubham Jaiswal
Abstract:
In the present article, an effort is made to compute the solution of the spatial fractional order advection-dispersion equation with a source/sink term, with given initial and boundary conditions. The equation is converted to a system of ordinary differential equations using second-kind shifted Chebyshev polynomials, which is finally solved using the finite difference method. The striking feature of the article is the fast transport of solute concentration as the system approaches fractional order from the standard order, for specified values of the parameters of the system.
Keywords: spatial fractional order advection-dispersion equation, second-kind shifted Chebyshev polynomial, collocation method, conservative system, non-conservative system
Procedia PDF Downloads 261
114 A Neurofeedback Learning Model Using Time-Frequency Analysis for Volleyball Performance Enhancement
Authors: Hamed Yousefi, Farnaz Mohammadi, Niloufar Mirian, Navid Amini
Abstract:
Investigating the possible capacities of visual functions, where adapted mechanisms can enhance the capability of sports trainees, is a promising area of research, not only from the cognitive viewpoint but also in terms of unlimited applications in sports training. In this paper, the visual evoked potential (VEP) and event-related potential (ERP) signals of amateur and trained volleyball players in a pilot study were processed. Two groups of amateur and trained subjects were asked to imagine themselves in the state of receiving a ball while being shown a simulated volleyball field. The proposed method is based on a set of time-frequency features extracted from the VEP signals using algorithms such as the Gabor filter, the continuous wavelet transform, and a multi-stage wavelet decomposition, which can be indicative of being amateur or trained. The linear discriminant classifier achieves accuracy, sensitivity, and specificity of 100% when the average of the repetitions of the signal corresponding to the task is used. The main purpose of this study is to investigate the feasibility of a fast, robust, and reliable feature/model determination as a neurofeedback parameter to be utilized for improving volleyball players' performance. The proposed measure has potential applications in brain-computer interface technology where a real-time biomarker is needed.
Keywords: visual evoked potential, time-frequency feature extraction, short-time Fourier transform, event-related spectrum potential classification, linear discriminant analysis
Procedia PDF Downloads 138