Search results for: empirical wavelet transform

3777 Reliability Assessment of Various Empirical Formulas for Prediction of Scour Hole Depth (Plunge Pool) Using a Comprehensive Physical Model

Authors: Majid Galoie, Khodadad Safavi, Abdolreza Karami Nejad, Reza Roshan

Abstract:

In this study, a comprehensive scouring model was developed in order to evaluate the accuracy of various empirical relationships suggested by Martins, Mason, Chian and Veronese for the prediction of scour hole depth in plunge pools. For this purpose, the scour hole depths caused by free-falling jets from a flip bucket into a plunge pool were investigated under various discharges, jet angles, scouring times, etc. The final results demonstrated that all the mentioned empirical formulas, except the Mason formula, were in reasonable agreement with the experimental data.

Keywords: scour hole depth, plunge pool, physical model, reliability assessment

Procedia PDF Downloads 508
3776 A Problem in Microstretch Thermoelastic Diffusive Medium

Authors: Devinder Singh, Arvind Kumar, Rajneesh Kumar

Abstract:

The general solution of the equations for a homogeneous isotropic microstretch thermoelastic medium with mass diffusion is obtained for two-dimensional problems due to normal and tangential forces. The integral transform technique is used to obtain the components of displacement, microrotation, stress, temperature change, and mass concentration. A particular case of interest is deduced from the present investigation.

Keywords: normal force, tangential force, microstretch, thermoelastic, the integral transform technique, deforming force, microstress force, boundary value problem

Procedia PDF Downloads 593
3775 On Modeling Data Sets by Means of a Modified Saddlepoint Approximation

Authors: Serge B. Provost, Yishan Zhang

Abstract:

A moment-based adjustment to the saddlepoint approximation is introduced in the context of density estimation. First applied to univariate distributions, this methodology is extended to the bivariate case. It then entails estimating the density function associated with each marginal distribution by means of the saddlepoint approximation and applying a bivariate adjustment to the product of the resulting density estimates. The connection to the distribution of empirical copulas will be pointed out. In addition, a novel approach is proposed for estimating the support of a distribution. As these results solely rely on sample moments and empirical cumulant-generating functions, they are particularly well suited for modeling massive data sets. Several illustrative applications will be presented.
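
The abstract gives no formulas, but the classical (unadjusted) saddlepoint density it builds on can be sketched from the sample alone: with K the empirical cumulant-generating function, solve K'(s) = x and evaluate exp(K(s) − s·x) / sqrt(2π·K''(s)). A minimal sketch is shown below; the gamma test sample and the root-bracketing interval are illustrative choices, not part of the paper's moment-based adjustment.

```python
import numpy as np
from scipy.optimize import brentq

def empirical_cgf(sample):
    """Empirical CGF K(t) = log mean(exp(t*x)) and its first two derivatives."""
    x = np.asarray(sample, dtype=float)

    def K(t):
        return np.log(np.mean(np.exp(t * x)))

    def K1(t):  # first derivative: exponentially tilted mean
        w = np.exp(t * x)
        return np.sum(x * w) / np.sum(w)

    def K2(t):  # second derivative: exponentially tilted variance
        w = np.exp(t * x)
        m = np.sum(x * w) / np.sum(w)
        return np.sum((x - m) ** 2 * w) / np.sum(w)

    return K, K1, K2

def saddlepoint_density(sample, grid):
    """Classical saddlepoint density estimate on a grid of evaluation points."""
    K, K1, K2 = empirical_cgf(sample)
    dens = np.empty_like(grid, dtype=float)
    for i, x0 in enumerate(grid):
        # Solve the saddlepoint equation K'(s) = x0; the bracket suits this
        # example's data, a robust implementation would adapt it.
        s = brentq(lambda t: K1(t) - x0, -10.0, 0.9)
        dens[i] = np.exp(K(s) - s * x0) / np.sqrt(2 * np.pi * K2(s))
    return dens

rng = np.random.default_rng(0)
sample = rng.gamma(shape=3.0, scale=1.0, size=5000)
grid = np.linspace(1.0, 8.0, 40)
print(saddlepoint_density(sample, grid)[:5])
```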

Keywords: empirical cumulant-generating function, endpoints identification, saddlepoint approximation, sample moments, density estimation

Procedia PDF Downloads 135
3774 Forecasting Amman Stock Market Data Using a Hybrid Method

Authors: Ahmad Awajan, Sadam Al Wadi

Abstract:

In this study, a hybrid method based on Empirical Mode Decomposition and Holt-Winters (EMD-HW) is used to forecast Amman stock market data. First, the data are decomposed by the EMD method into Intrinsic Mode Functions (IMFs) and a residual component. Then, all components are forecasted by the HW technique. Finally, the forecasts are aggregated to obtain the forecast of the stock market data. Empirical results showed that EMD-HW outperforms the individual forecasting models. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW has relatively high accuracy compared with eight existing forecasting methods, based on five forecast error measures.
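
The decompose-forecast-aggregate procedure is simple enough to sketch. The outline below assumes the PyEMD package (pip name EMD-signal) for the decomposition and statsmodels' ExponentialSmoothing as the Holt-Winters forecaster; it illustrates the idea under those assumptions, not the authors' exact configuration, and a synthetic series stands in for the Amman stock data.

```python
import numpy as np
from PyEMD import EMD                       # pip install EMD-signal (assumed)
from statsmodels.tsa.holtwinters import ExponentialSmoothing

def emd_hw_forecast(series, horizon):
    """Decompose with EMD, forecast each component with Holt-Winters, sum forecasts."""
    imfs = EMD()(series)            # rows: IMFs; the last row acts as the residual trend
    total = np.zeros(horizon)
    for comp in imfs:
        # Additive-trend Holt-Winters per component; no seasonal term in this sketch.
        model = ExponentialSmoothing(comp, trend="add", seasonal=None).fit()
        total += np.asarray(model.forecast(horizon))
    return total

# Synthetic stand-in for a stock index: trend + cycle + noise.
t = np.arange(500, dtype=float)
series = 0.05 * t + 5 * np.sin(2 * np.pi * t / 60) \
         + np.random.default_rng(1).normal(0, 1, 500)
print(emd_hw_forecast(series, horizon=10))
```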

Keywords: Holt-Winter method, empirical mode decomposition, forecasting, time series

Procedia PDF Downloads 102
3773 Perceptual Image Coding by Exploiting Internal Generative Mechanism

Authors: Kuo-Cheng Liu

Abstract:

In perceptual image coding, the objective is to shape the coding distortion such that its amplitude does not exceed the error visibility threshold, or to remove perceptually redundant signals from the image. While most research focuses on color image coding, perceptual quantizers developed for luminance signals are often applied directly to chrominance signals, which makes such color image compression methods inefficient. In this paper, the internal generative mechanism is integrated into the design of a color image compression method. The internal generative mechanism working model, based on structure-based spatial masking, is used to assess subjective distortion visibility thresholds that are more visually consistent with human perception. The estimation method of structure-based distortion visibility thresholds for color components is further presented in a locally adaptive way to design the quantization process in the wavelet color image compression scheme. Since the lowest subband coefficient matrix of images in the wavelet domain preserves the local property of images in the spatial domain, the error visibility threshold inherent in each coefficient of the lowest subband for each color component is estimated by using the proposed spatial error visibility threshold assessment. The threshold inherent in each coefficient of the other subbands for each color component is then estimated in a locally adaptive fashion based on the distortion energy allocation. Because the error visibility thresholds are estimated using predicted and reconstructed signals of the color image, the coding scheme incorporating the locally adaptive perceptual color quantizer does not require side information. Experimental results show that the entropies of the three color components obtained by using the proposed IGM-based color image compression scheme are lower than those obtained by using the existing color image compression method at perceptually lossless visual quality.

Keywords: internal generative mechanism, structure-based spatial masking, visibility threshold, wavelet domain

Procedia PDF Downloads 221
3772 Transforming Data into Knowledge: Mathematical and Statistical Innovations in Data Analytics

Authors: Zahid Ullah, Atlas Khan

Abstract:

The rapid growth of data in various domains has created a pressing need for effective methods to transform this data into meaningful knowledge. In this era of big data, mathematical and statistical innovations play a crucial role in unlocking insights and facilitating informed decision-making in data analytics. This abstract aims to explore the transformative potential of these innovations and their impact on converting raw data into actionable knowledge. Drawing upon a comprehensive review of existing literature, this research investigates the cutting-edge mathematical and statistical techniques that enable the conversion of data into knowledge. By evaluating their underlying principles, strengths, and limitations, we aim to identify the most promising innovations in data analytics. To demonstrate the practical applications of these innovations, real-world datasets will be utilized through case studies or simulations. This empirical approach will showcase how mathematical and statistical innovations can extract patterns, trends, and insights from complex data, enabling evidence-based decision-making across diverse domains. Furthermore, a comparative analysis will be conducted to assess the performance, scalability, interpretability, and adaptability of different innovations. By benchmarking against established techniques, we aim to validate the effectiveness and superiority of the proposed mathematical and statistical innovations in data analytics. Ethical considerations surrounding data analytics, such as privacy, security, bias, and fairness, will be addressed throughout the research. Guidelines and best practices will be developed to ensure the responsible and ethical use of mathematical and statistical innovations in data analytics. The expected contributions of this research include advancements in mathematical and statistical sciences, improved data analysis techniques, enhanced decision-making processes, and practical implications for industries and policymakers. The outcomes will guide the adoption and implementation of mathematical and statistical innovations, empowering stakeholders to transform data into actionable knowledge and drive meaningful outcomes.

Keywords: data analytics, mathematical innovations, knowledge extraction, decision-making

Procedia PDF Downloads 49
3771 Real Interest Rates and Real Returns of Agricultural Commodities in the Context of Quantitative Easing

Authors: Wei Yao, Constantinos Alexiou

Abstract:

In the existing literature, many studies have focused on the implementation and effectiveness of quantitative easing (QE) since 2008, but only a few have evaluated QE’s effect on commodity prices. In this context, by following Frankel’s (1986) commodity price overshooting model, we study the dynamic covariation between the expected real interest rates and six agricultural commodities’ real returns over the period from 2000:1 to 2018 for the US economy. We use wavelet analysis to investigate the causal relationship and co-movement of time series data by calculating the coefficient of determination in different frequencies. We find that a) US unconventional monetary policy may cause more positive and significant covariation between the expected real interest rates and agricultural commodities’ real returns over the short horizons; b) a lead-lag relationship that runs from agricultural commodities’ real returns to the expected real short-term interest rates over the long horizons; and c) a lead-lag relationship from agricultural commodities’ real returns to the expected real long-term interest rates over short horizons. In the realm of monetary policy, we argue that QE may shift the negative relationship between most commodities’ real returns and the expected real interest rates to a positive one over a short horizon.
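
Wavelet coherence itself needs a dedicated package, but the underlying idea, measuring co-movement between two series frequency by frequency, can be illustrated with ordinary spectral coherence, which scipy provides out of the box. This is a simpler stand-in, not the wavelet coherence used in the paper, and the two synthetic series are placeholders for the real-rate and commodity-return data.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(2)
n = 1024
common = np.sin(2 * np.pi * 0.05 * np.arange(n))      # shared low-frequency driver
real_rate = common + rng.normal(0, 0.5, n)            # stand-in: expected real rate
commodity = 0.8 * common + rng.normal(0, 0.5, n)      # stand-in: commodity real return

# Magnitude-squared coherence in [0, 1] per frequency: a frequency-wise R^2 analogue.
freqs, coh = coherence(real_rate, commodity, fs=1.0, nperseg=256)
for f, c in zip(freqs[:6], coh[:6]):
    print(f"freq={f:.4f}  coherence={c:.2f}")
```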

Keywords: QE, commodity price, interest rate, wavelet coherence

Procedia PDF Downloads 58
3770 Generalized Model Estimating Strength of Bauxite Residue-Lime Mix

Authors: Sujeet Kumar, Arun Prasad

Abstract:

The present work investigates the effect of multiple parameters on the unconfined compressive strength of the bauxite residue-lime mix. A number of unconfined compressive strength tests considering various curing times, lime contents, dry densities and moisture contents were carried out. The results show that an empirical correlation may be successfully developed using volumetric lime content, porosity, moisture content and curing time to estimate unconfined compressive strength for the range of the bauxite residue-lime mixes studied. The proposed empirical correlation efficiently predicts the strength of the bauxite residue-lime mix, and it can be used as a generalized empirical equation to estimate unconfined compressive strength.
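
The abstract does not give the fitted equation, so the sketch below only shows how such a correlation is typically fitted, assuming a hypothetical power-law form UCS = a·(porosity/volumetric lime)^b·t^c and made-up data points; the functional form, coefficients and data are illustrative, not the paper's results.

```python
import numpy as np
from scipy.optimize import curve_fit

def ucs_model(X, a, b, c):
    """Hypothetical power law: UCS = a * (porosity / volumetric lime)^b * time^c."""
    ratio, t = X
    return a * ratio ** b * t ** c

# Made-up points: (porosity/volumetric-lime ratio, curing days) -> UCS in kPa.
ratio = np.array([10.0, 12.0, 15.0, 10.0, 12.0, 15.0, 10.0, 12.0, 15.0])
days = np.array([7.0, 7.0, 7.0, 28.0, 28.0, 28.0, 90.0, 90.0, 90.0])
ucs = np.array([850, 610, 420, 1400, 1000, 690, 2100, 1500, 1050], dtype=float)

params, _ = curve_fit(ucs_model, (ratio, days), ucs, p0=(1e4, -1.0, 0.3))
a, b, c = params
print(f"UCS ~ {a:.1f} * ratio^{b:.2f} * t^{c:.2f}")
```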

Keywords: bauxite residue, curing time, porosity/volumetric lime ratio, unconfined compressive strength

Procedia PDF Downloads 210
3769 A Normalized Non-Stationary Wavelet Based Analysis Approach for a Computer Assisted Classification of Laryngoscopic High-Speed Video Recordings

Authors: Mona K. Fehling, Jakob Unger, Dietmar J. Hecker, Bernhard Schick, Joerg Lohscheller

Abstract:

Voice disorders originate from disturbances of the vibration patterns of the two vocal folds located within the human larynx. Consequently, the visual examination of vocal fold vibrations is an integral part of the clinical diagnostic process. For an objective analysis of the vocal fold vibration patterns, the two-dimensional vocal fold dynamics are captured during sustained phonation using an endoscopic high-speed camera. In this work, we present an approach allowing a fully automatic analysis of the high-speed video data, including a computerized classification of healthy and pathological voices. The approach is based on a wavelet-based analysis of so-called phonovibrograms (PVG), which are extracted from the high-speed videos and comprise the entire two-dimensional vibration pattern of each vocal fold individually. Using a principal component analysis (PCA) strategy, a low-dimensional feature set is computed from each phonovibrogram. From the PCA space, clinically relevant measures can be derived that objectively quantify vibration abnormalities. In the first part of the work it is shown that, using a machine learning approach, the derived measures are suitable for distinguishing automatically between healthy and pathological voices. Within the approach, the formation of the PCA space, and consequently the extracted quantitative measures, depend on the clinical data that were used to compute the principal components. Therefore, in the second part of the work we propose a strategy to achieve a normalization of the PCA space by registering it to a coordinate system using a set of synthetically generated vibration patterns. The results show that, owing to the normalization step, potential ambiguity of the parameter space can be eliminated. The normalization further allows a direct comparison of research results that are based on PCA spaces obtained from different clinical subjects.
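
The classification stage described here, compressing each wavelet-analyzed phonovibrogram into a low-dimensional PCA feature vector and training a classifier on healthy vs. pathological labels, follows a standard pattern. A minimal sketch with scikit-learn on random placeholder feature matrices (real PVG features would replace them) might look like this:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(3)
# Placeholder: 120 recordings x 400 wavelet-based PVG features; 0 = healthy, 1 = pathological.
X = rng.normal(size=(120, 400))
y = np.repeat([0, 1], 60)
X[y == 1] += 0.4          # inject a weak class difference so the example is non-trivial

# PCA to a low-dimensional space, then a classifier on the PCA scores.
clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```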

Keywords: wavelet-based analysis, multiscale product, normalization, computer assisted classification, high-speed laryngoscopy, vocal fold analysis, phonovibrogram

Procedia PDF Downloads 238
3768 Taleghan Dam Break Numerical Modeling

Authors: Hamid Goharnejad, Milad Sadeghpoor Moalem, Mahmood Zakeri Niri, Leili Sadeghi Khalegh Abadi

Abstract:

While there are many benefits to using reservoir dams, their break leads to destructive effects. From the viewpoint of the International Commission on Large Dams (ICOLD), dam break means the collapse of the whole or some parts of a dam, such that the dam becomes unable to hold water. Therefore, studying the dam break phenomenon and predicting its behavior and effects reduces the losses and damages of the mentioned phenomenon. One of the most common types of reservoir dams is the embankment dam. Overtopping in embankment dams occurs when the flood discharge system is unable to release inflows to the reservoir. An important issue for managers and engineers is to evaluate the performance of the dam when the reservoir rim slides into the storage, creating large and long waves. In this study, the effects of floods which caused the overtopping of the dam have been investigated, assuming that the spillway is unable to release the inflow. To determine the outflow hydrograph resulting from the dam break, a numerical model built with Flow-3D software and empirical equations were used. The results of the numerical model and their comparison with the empirical equations show that both can be used to study the flood resulting from a dam break.
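
The abstract does not say which empirical equations were compared with the Flow-3D results. For context only, one widely cited empirical breach relation, Froehlich's peak-outflow formula, assumed here in its common SI form Qp = 0.607·Vw^0.295·Hw^1.24, can be evaluated in a few lines; the reservoir numbers are illustrative, not Taleghan data.

```python
def froehlich_peak_outflow(volume_m3: float, head_m: float) -> float:
    """Empirical peak breach outflow (m^3/s), Froehlich-type relation (assumed SI form)."""
    return 0.607 * volume_m3 ** 0.295 * head_m ** 1.24

# Illustrative reservoir: 4.2e8 m^3 stored behind 100 m of water above the breach invert.
qp = froehlich_peak_outflow(4.2e8, 100.0)
print(f"estimated peak outflow ~ {qp:,.0f} m^3/s")
```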

Keywords: embankment dam break, empirical equations, Taleghan dam, Flow-3D numerical model

Procedia PDF Downloads 299
3767 An Accurate Computation of 2D Zernike Moments via Fast Fourier Transform

Authors: Mohammed S. Al-Rawi, J. Bastos, J. Rodriguez

Abstract:

Object detection and object recognition are essential components of every computer vision system. Despite the high computational complexity and other problems related to numerical stability and accuracy, Zernike moments of 2D images (ZMs) have shown resilience when used in object recognition and have been used in various image analysis applications. In this work, we propose a novel method for computing ZMs via Fast Fourier Transform (FFT). Notably, this is the first algorithm that can generate ZMs up to extremely high orders accurately, e.g., it can be used to generate ZMs for orders up to 1000 or even higher. Furthermore, the proposed method is also simpler and faster than the other methods due to the availability of FFT software and/or hardware. The accuracies and numerical stability of ZMs computed via FFT have been confirmed using the orthogonality property. We also introduce normalizing ZMs with Neumann factor when the image is embedded in a larger grid, and color image reconstruction based on RGB normalization of the reconstructed images. Astonishingly, higher-order image reconstruction experiments show that the proposed methods are superior, both quantitatively and subjectively, compared to the q-recursive method.
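
The FFT-based algorithm itself is not detailed in the abstract, so the sketch below gives only the textbook definition it accelerates: direct numerical evaluation of a low-order Zernike moment on the unit disk, A_nm = (n+1)/π · Σ f·R_n^|m|(ρ)·e^(−imθ)·ΔA, useful as a ground-truth check. Grid size and the test image are arbitrary choices.

```python
import numpy as np
from math import factorial, pi

def radial_poly(n, m, rho):
    """Zernike radial polynomial R_n^|m|(rho)."""
    m = abs(m)
    R = np.zeros_like(rho)
    for k in range((n - m) // 2 + 1):
        coef = ((-1) ** k * factorial(n - k)
                / (factorial(k) * factorial((n + m) // 2 - k)
                   * factorial((n - m) // 2 - k)))
        R += coef * rho ** (n - 2 * k)
    return R

def zernike_moment(img, n, m):
    """Direct A_nm over the unit disk inscribed in a square image."""
    N = img.shape[0]
    xs = np.linspace(-1, 1, N)
    X, Y = np.meshgrid(xs, xs)
    rho, theta = np.hypot(X, Y), np.arctan2(Y, X)
    mask = rho <= 1.0                          # keep only pixels inside the disk
    dA = (2.0 / N) ** 2                        # pixel area in unit-disk coordinates
    integrand = img * radial_poly(n, m, rho) * np.exp(-1j * m * theta)
    return (n + 1) / pi * np.sum(integrand[mask]) * dA

# Arbitrary smooth test image on a 256x256 grid.
N = 256
xs = np.linspace(-1, 1, N)
X, Y = np.meshgrid(xs, xs)
img = np.cos(2 * pi * X) * np.exp(-(X**2 + Y**2))
print(zernike_moment(img, n=4, m=2))
```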

Keywords: Chebyshev polynomial, Fourier transform, fast algorithms, image recognition, pseudo Zernike moments, Zernike moments

Procedia PDF Downloads 240
3766 The Relationship between Resource Sharing and Economic Resilience: An Empirical Analysis of Firms’ Resilience from the Perspective of Resource Dependence Theory

Authors: Alfredo R. Roa-Henriquez

Abstract:

This paper is about organizational-level resilience and decision-making in the face of natural hazards. Research on resilience emerged to explain systems' ability to absorb and recover in the midst of the adversity and uncertainty caused by natural disasters, crises, and other disruptive events. While interest in resilience has accelerated, research has multiplied, and the number of policies and implementations of resilience to natural hazards has increased over the last several years, mainly at the level of communities and regions, there has been a dearth of empirical work on resilience at the level of the firm. This paper uses empirical data and a sample selection model to test hypotheses relating the firm's dependence on critical resources and its sharing of resources to its economic resilience. The objective is to understand how the sharing of resources among organizations is related to economic resilience. Empirical results obtained from a sample of firms affected by Superstorm Sandy and Hurricane Harvey indicate that there is unobserved heterogeneity explaining the strategic behavior of firms in the post-disaster period, and that the firms that are more likely to share resources are also the ones that exhibit higher economic resilience. The impact of property damage on the sharing of resources and economic resilience is also explored.

Keywords: economic resilience, resource sharing, critical resources, strategic management

Procedia PDF Downloads 121
3765 The Use of Psychological Tests in Polish Organizations - Empirical Evidence

Authors: Milena Gojny-Zbierowska

Abstract:

In recent decades, psychological tests have been gaining popularity as a method for evaluating personnel, and they bring consulting companies solid profits rising by up to 10% each year. The market offers a growing range of tools for the assessment of personality. In organizations, tests are used mainly in the recruitment and selection of staff. This paper is an attempt to provide an initial diagnosis of the state of the use of psychological tests in Polish companies on the basis of empirical research.

Keywords: psychological tests, personality, content analysis, NEO FFI, big five personality model

Procedia PDF Downloads 333
3764 Enhanced Face Recognition with Daisy Descriptors Using 1BT Based Registration

Authors: Sevil Igit, Merve Meric, Sarp Erturk

Abstract:

In this paper, it is proposed to improve DAISY descriptor based face recognition using a novel One-Bit Transform (1BT) based pre-registration approach. The 1BT based pre-registration procedure is fast and has low computational complexity. It is shown that the face recognition accuracy is improved with the proposed approach. The proposed approach can facilitate highly accurate face recognition using the DAISY descriptor with simple matching, thereby providing a low-complexity solution.

Keywords: face recognition, Daisy descriptor, One-Bit Transform, image registration

Procedia PDF Downloads 342
3763 Deepnic, A Method to Transform Each Variable into Image for Deep Learning

Authors: Nguyen J. M., Lucas G., Brunner M., Ruan S., Antonioli D.

Abstract:

Deep learning based on convolutional neural networks (CNN) is a very powerful technique for classifying information from an image. We propose a new method, DeepNic, to transform each variable of a tabular dataset into an image where each pixel represents a set of conditions that allow the variable to make an error-free prediction. The contrast of each pixel is proportional to its prediction performance, and the color of each pixel corresponds to a sub-family of NICs. NICs are probabilities that depend on the number of inputs to each neuron and the range of coefficients of the inputs. Each variable can therefore be expressed as a function of a matrix of two vectors corresponding to an image whose pixels express predictive capabilities. Our objective is to transform each variable of tabular data into an image that can be analysed by CNNs, unlike other methods which use all the variables to construct an image. We analyse the NIC information of each variable and express it as a function of the number of neurons and the range of coefficients used. The predictive value and the category of the NIC are expressed by the contrast and the color of the pixel. We have developed a pipeline to implement this technology and have successfully applied it to genomic expressions on an Affymetrix chip.

Keywords: tabular data, deep learning, perfect trees, NICs

Procedia PDF Downloads 60
3762 Diagnosis of the Heart Rhythm Disorders by Using Hybrid Classifiers

Authors: Sule Yucelbas, Gulay Tezel, Cuneyt Yucelbas, Seral Ozsen

Abstract:

In this study, heart rhythm disorders were identified from electrocardiography (ECG) data taken from the MIT-BIH arrhythmia database by extracting the required features and presenting them to artificial neural network (ANN), artificial immune system (AIS), artificial neural network based on artificial immune system (AIS-ANN) and particle swarm optimization based artificial neural network (PSO-ANN) classifier systems. The main purpose of this study is to evaluate the performance of the hybrid AIS-ANN and PSO-ANN classifiers with regard to ANN and AIS. For this purpose, the normal sinus rhythm (NSR), atrial premature contraction (APC), sinus arrhythmia (SA), ventricular trigeminy (VTI), ventricular tachycardia (VTK) and atrial fibrillation (AF) data for each of the RR intervals were found. These data were then arranged in pairs (NSR-APC, NSR-SA, NSR-VTI, NSR-VTK and NSR-AF), discrete wavelet transform was applied to each of the two groups of data in a pair, and two different data sets with 9 and 27 features were obtained from each of them after data reduction. Afterwards, the data were first randomly shuffled, and then the 4-fold cross-validation method was applied to create the training and testing data. The training and testing accuracy rates and training times are compared with each other. As a result, the performances of the hybrid classification systems, AIS-ANN and PSO-ANN, were seen to be close to the performance of the ANN system, and the results of the hybrid systems were much better than those of AIS. However, ANN had a much shorter training time than the other systems. In terms of training times, ANN was followed by PSO-ANN, AIS-ANN and AIS, respectively. Also, the features extracted from the data affected the classification results significantly.
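
The feature pipeline, a discrete wavelet transform of RR-interval data followed by a neural-network classifier under 4-fold cross-validation, can be sketched compactly. The snippet below uses PyWavelets and scikit-learn's MLP as a stand-in for the ANN (the AIS and PSO hybrids are not reproduced), with random data in place of the MIT-BIH records and an ad-hoc choice of summary features.

```python
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

def dwt_features(rr_segment, wavelet="db4", level=3):
    """Summary statistics of DWT subband coefficients as a small feature vector."""
    coeffs = pywt.wavedec(rr_segment, wavelet, level=level)
    feats = []
    for c in coeffs:                       # approximation + detail bands
        feats += [c.mean(), c.std(), np.abs(c).max()]
    return np.array(feats)

rng = np.random.default_rng(4)
# Placeholder RR-interval segments: class 0 ~ NSR-like, class 1 ~ arrhythmia-like.
segments = np.vstack([rng.normal(0.8, 0.02, (100, 64)),
                      rng.normal(0.8, 0.10, (100, 64))])
labels = np.repeat([0, 1], 100)
X = np.array([dwt_features(s) for s in segments])

clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
print("4-fold CV accuracy:", cross_val_score(clf, X, labels, cv=4).mean().round(3))
```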

Keywords: AIS, ANN, ECG, hybrid classifiers, PSO

Procedia PDF Downloads 414
3761 Signal Processing of Barkhausen Noise Signal for Assessment of Increasing Down Feed in Surface Ground Components with Poor Micro-Magnetic Response

Authors: Tanmaya Kumar Dash, Tarun Karamshetty, Soumitra Paul

Abstract:

The Barkhausen Noise Analysis (BNA) technique has been utilized to assess the surface integrity of steels. However, the BNA technique is not very successful in evaluating the surface integrity of ground steels that exhibit a poor micro-magnetic response. A new approach is proposed for processing the BN signal with the Fast Fourier Transform, while the wavelet transform, with a judicious choice of the 'threshold' value, is used to remove noise from the BN signal when the micro-magnetic response of the work material is poor. In the present study, the effect of down feed induced by conventional plunge surface grinding of hardened bearing steel has been investigated, along with an ultrasonically cleaned, wet-polished sample and a sample ground with the spark-out technique for benchmarking. Moreover, FFT analysis is carried out at different sets of applied voltages and applied frequencies, and the pattern of the BN signal in the frequency domain is analyzed. The study also applies the wavelet transform technique with different levels of decomposition and different mother wavelets to reduce the noise in the BN signal of materials with poor micro-magnetic response, in order to standardize the procedure for all BN signals depending on the frequency of the applied voltage.
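
The wavelet denoising step described, decomposing the BN signal, thresholding the detail coefficients and reconstructing, is the standard soft-thresholding recipe. A minimal sketch with PyWavelets follows, using the common universal threshold σ·sqrt(2·ln N) as one possible "judicious choice" (the paper's actual rule is not specified), on a synthetic noisy burst:

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db6", level=4):
    """Soft-threshold detail coefficients with the universal threshold, then reconstruct."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise scale estimated from the finest detail band (median absolute deviation).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(signal)]

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 2048)
burst = np.exp(-((t - 0.5) ** 2) / 0.002) * np.sin(2 * np.pi * 120 * t)  # BN-like burst
noisy = burst + rng.normal(0, 0.3, t.size)
clean = wavelet_denoise(noisy)
print("residual std before/after:", (noisy - burst).std().round(3),
      (clean - burst).std().round(3))
```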

Keywords: Barkhausen noise analysis, grinding, magnetic properties, signal processing, micro-magnetic response

Procedia PDF Downloads 648
3760 Denoising Convolutional Neural Network Assisted Electrocardiogram Signal Watermarking for Secure Transmission in E-Healthcare Applications

Authors: Jyoti Rani, Ashima Anand, Shivendra Shivani

Abstract:

In recent years, physiological signals obtained in telemedicine have been stored independently from patient information. In addition, people have increasingly turned to mobile devices for information on health-related topics. Major authentication and security issues may arise from this storage, degrading the reliability of diagnostics. This study introduces an approach to reversible watermarking which ensures security by utilizing the electrocardiogram (ECG) signal as a carrier for embedding patient information. In the proposed work, Pan-Tompkins++ is employed to convert the 1D ECG signal into a 2D signal. The frequency subbands of the signal are extracted using RDWT (redundant discrete wavelet transform), and then one of the subbands is subjected to MSVD (multiresolution singular value decomposition) for masking. Finally, the encrypted watermark is embedded within the signal. The experimental results show that the watermarked signal is indistinguishable from the original signal, ensuring the preservation of all diagnostic information. In addition, the DnCNN (denoising convolutional neural network) concept is used to denoise the retrieved watermark for improved accuracy. The proposed ECG signal-based watermarking method is supported by experimental results and evaluations of its effectiveness. The results of the robustness tests demonstrate that the watermark withstands the most prevalent watermarking attacks.

Keywords: ECG, VMD, watermarking, PanTompkins++, RDWT, DnCNN, MSVD, chaotic encryption, attacks

Procedia PDF Downloads 65
3759 Potentials and Influencing Factors of Dynamic Pricing in Business: Empirical Insights of European Experts

Authors: Christopher Reichstein, Ralf-Christian Härting, Martina Häußler

Abstract:

With the continuously increasing speed of information exchange on the World Wide Web, retailers in the e-commerce sector are faced with immense possibilities regarding different online purchase processes, such as dynamic price setting. By using Dynamic Pricing, retailers are able to make short-term price changes in order to optimize producer surplus. This empirical research illustrates the basics of Dynamic Pricing and identifies six influencing factors of Dynamic Pricing. The results of a structural equation modeling approach show five main drivers increasing the potential of dynamic price settings in e-commerce: the knowledge of customers' individual willingness to pay, rising sales, the possibility of customization, the data volume, and information about competitors' pricing strategy.

Keywords: e-commerce, empirical research, experts, dynamic pricing (DP), influencing factors, potentials

Procedia PDF Downloads 234
3758 Optimized Brain Computer Interface System for Unspoken Speech Recognition: Role of Wernicke Area

Authors: Nassib Abdallah, Pierre Chauvet, Abd El Salam Hajjar, Bassam Daya

Abstract:

In this paper, we propose an optimized brain computer interface (BCI) system for unspoken speech recognition, based on the fact that the construction of unspoken words relies strongly on Wernicke's area, situated in the temporal lobe. Our BCI system has four modules: (i) the EEG acquisition module, based on a non-invasive headset with 14 electrodes; (ii) the preprocessing module, which removes noise and artifacts using the Common Average Reference method; (iii) the feature extraction module, using the Wavelet Packet Transform (WPT); and (iv) the classification module, based on a one-hidden-layer artificial neural network. The present study consists of comparing the recognition accuracy of 5 Arabic words when using all the headset electrodes or only the 4 electrodes situated near Wernicke's area, as well as the selection effect of the subbands produced by the WPT module. After applying the artificial neural network to the produced database, we obtain, on the test dataset, an accuracy of 83.4% with all the electrodes and all the subbands of the 8-level WPT decomposition. However, by using only the 4 electrodes near Wernicke's area and the 6 middle subbands of the WPT, we obtain a large reduction of the dataset size, to approximately 19% of the total dataset, with an accuracy rate of 67.5%. This reduction appears particularly important for improving the design of a low-cost and simple-to-use BCI trained for several words.
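
The feature-extraction module can be sketched with PyWavelets' WaveletPacket: decompose each EEG channel to a given level and take subband energies as features. This is a minimal illustration of the WPT step under assumed parameters (level 4 rather than the paper's 8, random data in place of the 14-channel headset recordings, hypothetical channel indices), not the authors' pipeline.

```python
import numpy as np
import pywt

def wpt_band_energies(channel, wavelet="db4", level=4):
    """Energy of each terminal wavelet-packet subband at the given level."""
    wp = pywt.WaveletPacket(data=channel, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="freq")      # subbands ordered by frequency
    return np.array([np.sum(node.data ** 2) for node in nodes])

rng = np.random.default_rng(6)
eeg = rng.normal(size=(14, 1024))                  # placeholder: 14 electrodes x 1024 samples

# Feature vector: concatenated subband energies of the 4 electrodes nearest Wernicke's
# area (indices are hypothetical; they depend on the headset montage).
wernicke_channels = [2, 3, 4, 5]
features = np.concatenate([wpt_band_energies(eeg[i]) for i in wernicke_channels])
print(features.shape)                              # 4 channels x 16 subbands = (64,)
```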

Keywords: brain-computer interface, speech recognition, artificial neural network, electroencephalography, EEG, Wernicke area

Procedia PDF Downloads 249
3757 Empirical Orthogonal Functions Analysis of Hydrophysical Characteristics in the Shira Lake in Southern Siberia

Authors: Olga S. Volodko, Lidiya A. Kompaniets, Ludmila V. Gavrilova

Abstract:

The method of empirical orthogonal functions is a method for analyzing data with a complex spatial-temporal structure. It allows us to decompose the data into a finite number of modes determined by empirically finding the eigenfunctions of the data correlation matrix. The modes have different scales and can be associated with various physical processes. The empirical orthogonal function method has been widely used for the analysis of hydrophysical characteristics, for example, the analysis of sea surface temperatures in the western North Atlantic, ocean surface currents off North Carolina, the study of tropical wave disturbances, etc. The method used in this study has been applied to the analysis of temperature and velocity measurements in the saline Lake Shira (Southern Siberia, Russia). Shira is a shallow lake with a maximum depth of 25 m. Lake Shira can be considered a closed water site because it has one small river providing inflow but no outflow. The main factor that causes the motion of the fluid is variable wind flows. In summer, the lake is strongly stratified by temperature and salinity. Long-term measurements of temperatures and currents were conducted at several points during the summers of 2014-2015. The temperature was measured with an accuracy of 0.1 ºC. The data were analyzed using the empirical orthogonal function method in its real-valued version. The first empirical eigenmode accounts for 70-80% of the energy and can be interpreted as a temperature distribution with a thermocline. A thermocline is a thermal layer where the temperature decreases rapidly from the mixed upper layer of the lake to much colder deep water. The higher-order modes can be interpreted as oscillations induced by internal waves. The current measurements were recorded using 600 kHz and 1200 kHz Acoustic Doppler Current Profilers. These data were analyzed using the empirical orthogonal function method in its complex-valued version. The first empirical eigenmode accounts for about 40% of the energy and corresponds to the Ekman spiral occurring in the case of a stationary homogeneous fluid. The other modes describe effects associated with the stratification of the fluid. The second and subsequent empirical eigenmodes were associated with dynamical modes. These modes were obtained for a simplified model of an inhomogeneous three-level fluid at a water site with a flat bottom.
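
The EOF decomposition itself is compact: form an anomaly matrix (time × location), take its SVD, and read off spatial modes, temporal amplitudes, and explained variance. A generic numpy sketch, with random data standing in for the Shira temperature profiles:

```python
import numpy as np

def eof_decompose(data):
    """EOF analysis of a (time x space) field via SVD of the anomaly matrix."""
    anomalies = data - data.mean(axis=0)            # remove the time mean at each point
    U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
    variance_fraction = s**2 / np.sum(s**2)         # energy captured by each mode
    modes = Vt                                      # rows: spatial EOF patterns
    amplitudes = U * s                              # columns: PC time series
    return modes, amplitudes, variance_fraction

rng = np.random.default_rng(7)
# Placeholder field: 200 time steps x 25 depth levels with one dominant vertical structure.
depth_pattern = np.exp(-np.arange(25) / 5.0)
field = np.outer(np.sin(np.linspace(0, 20, 200)), depth_pattern) \
        + rng.normal(0, 0.1, (200, 25))

modes, amps, var = eof_decompose(field)
print("variance explained by mode 1: %.1f%%" % (100 * var[0]))
```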

Keywords: Ekman spiral, empirical orthogonal functions, data analysis, stratified fluid, thermocline

Procedia PDF Downloads 118
3756 Investigating the Energy Gap and Wavelength of (AlₓGa₁₋ₓAs)ₘ/(GaAs)ₙ Superlattices in Terms of Material Thickness and Al Mole Fraction Using Empirical Tight-Binding Method

Authors: Matineh Sadat Hosseini Gheidari, Vahid Reza Yazdanpanah

Abstract:

In this paper, we used the empirical tight-binding method (ETBM) with the sp3s* approximation, considering first-nearest-neighbor and spin-orbit interactions, in order to model the superlattice structure (SLS) of (AlₓGa₁₋ₓAs)ₘ/(GaAs)ₙ grown on a GaAs (100) substrate at 300 K. In the next step, we investigated the behavior of the energy gap and wavelength of this superlattice for different thicknesses of the core materials and Al mole fractions. As a result of this survey, we found that as the Al composition increases, the energy gap of this superlattice shows an upward trend, ranging from 1.42 to 1.63 eV. Also, according to the wavelength range obtained from this superlattice at different Al mole fractions and various thicknesses, a suitable semiconductor can be selected for a specific light-emitting diode (LED) application.

Keywords: energy gap, empirical tight-binding method, light-emitting diode, superlattice, wavelength

Procedia PDF Downloads 168
3755 Optimization Based Extreme Learning Machine for Watermarking of an Image in DWT Domain

Authors: Ram Pal Singh, Vikash Chaudhary, Monika Verma

Abstract:

In this paper, we propose the implementation of an optimization based Extreme Learning Machine (ELM) for watermarking the B-channel of a color image in the discrete wavelet transform (DWT) domain. ELM, a regularization algorithm, works on generalized single-hidden-layer feed-forward neural networks (SLFNs); however, the hidden layer parameters, generally called the feature mapping in the context of ELM, need not be tuned every time. This paper shows the embedding and extraction processes of the watermark with the help of ELM, and the results are compared with machine learning models already used for watermarking. Here, a cover image is divided into a suitable number of non-overlapping blocks of the required size, and DWT is applied to each block to transform it into the low-frequency sub-band domain. Basically, ELM gives a unified learning platform in which a feature mapping, that is, the mapping between the hidden layer and the output layer of the SLFNs, is tried for watermark embedding and extraction in a cover image. ELM has widespread applications, from binary and multiclass classification to regression and function estimation. Unlike SVM-based algorithms, which achieve suboptimal solutions with high computational complexity, ELM can provide better generalization performance with very small complexity. The efficacy of the optimization based ELM algorithm is measured using quantitative and qualitative parameters on the watermarked image, even when the image is subjected to different types of geometrical and conventional attacks.
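
The core ELM computation referenced here, random untuned hidden-layer parameters plus a regularized least-squares solve for the output weights, beta = (HᵀH + I/C)⁻¹Hᵀy, fits in a few lines. A generic regression sketch follows (the watermark embedding and extraction around it is not reproduced):

```python
import numpy as np

class ELMRegressor:
    """Minimal regularized extreme learning machine for regression."""

    def __init__(self, n_hidden=100, C=1e3, seed=0):
        self.n_hidden, self.C, self.seed = n_hidden, C, seed

    def _hidden(self, X):
        # Random feature mapping: fixed after initialization, never tuned.
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        rng = np.random.default_rng(self.seed)
        self.W = rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # Regularized least squares: beta = (H^T H + I/C)^-1 H^T y
        A = H.T @ H + np.eye(self.n_hidden) / self.C
        self.beta = np.linalg.solve(A, H.T @ y)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Toy check: learn a smooth 1-D function.
rng = np.random.default_rng(8)
X = rng.uniform(-3, 3, (500, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=500)
model = ELMRegressor().fit(X, y)
print("train RMSE:", np.sqrt(np.mean((model.predict(X) - y) ** 2)).round(3))
```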

Keywords: BER, DWT, extreme learning machine (ELM), PSNR

Procedia PDF Downloads 285
3754 Development of Management System of the Experience of Defensive Modeling and Simulation by Data Mining Approach

Authors: D. Nam Kim, D. Jin Kim, Jeonghwan Jeon

Abstract:

Defensive Modeling and Simulation (M&S) is a system that enables otherwise impracticable training by reducing the constraints of time, space and financial resources. The necessity of defensive M&S has been increasing not only for education and training but also for virtual combat. Soldiers who use defensive M&S for education and training obtain empirical knowledge and know-how. However, the knowledge obtained by individual soldiers has not yet been managed and utilized, owing to the nature of military organizations: confidentiality and frequent change of members. Therefore, this study aims to develop a management system for the experience of defensive M&S based on a data mining approach. Since the individual empirical knowledge gained through using defensive M&S is both quantitative and qualitative, a data mining approach is appropriate for dealing with it. This research is expected to be helpful for soldiers and military policy makers.

Keywords: data mining, defensive M&S, management system, knowledge management

Procedia PDF Downloads 228
3753 An Analysis of the Influence of Employee Readiness for Change on TQM Implementation

Authors: Mohamed Haffar, Khalil Al-Hyari, Mohammed Khair Abu Zaid, Ramadane Djbarni, Mohammed Hamdan

Abstract:

While employee readiness for change (ERFC) is recognised as critical for total quality management (TQM) implementation, there is a lack of systematic and empirical studies regarding the relationship between ERFC dimensions and TQM. Therefore, this study proposes to fill this gap by providing empirical evidence leading to an advancement in the understanding of the influences of ERFC components on TQM implementation. The empirical data for this study were drawn from a survey of 400 middle and senior managers of Jordanian firms. The analysis of the collected data, conducted using the Structural Equation Modeling technique, revealed that three of the ERFC components, namely personally beneficial, change self-efficacy and management support, are the most supportive ERFC dimensions for TQM implementation. Therefore, this paper makes a novel contribution by providing a refined and deeper comprehension of the relationships between ERFC dimensions and TQM implementation.

Keywords: total quality management, employee readiness for change, manufacturing organisations, Jordan

Procedia PDF Downloads 530
3752 A Numerical Investigation of Lamb Wave Damage Diagnosis for Composite Delamination Using Instantaneous Phase

Authors: Haode Huo, Jingjing He, Rui Kang, Xuefei Guan

Abstract:

This paper presents a study of Lamb wave damage diagnosis of composite delamination using instantaneous phase data. Numerical experiments are performed using the finite element method. Different sizes of delamination damage are modeled using the finite element package ABAQUS. Lamb wave excitation and response data are obtained using a pitch-catch configuration. Empirical mode decomposition is employed to extract the intrinsic mode functions (IMFs). The Hilbert-Huang Transform is applied to each of the resulting IMFs to obtain the instantaneous phase information. The baseline data for healthy plates are also generated using the same procedure. The size of the delamination is correlated with the instantaneous phase change for damage diagnosis. It is observed that the unwrapped instantaneous phase shows a consistent behavior with increasing delamination size.
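
The signal-processing chain, EMD into IMFs followed by a Hilbert transform of an IMF to get its unwrapped instantaneous phase, can be sketched as follows, assuming the PyEMD package for the decomposition and scipy's analytic-signal Hilbert transform; a synthetic tone burst replaces the simulated Lamb-wave responses, and picking the highest-energy IMF is one simple selection rule, not necessarily the authors'.

```python
import numpy as np
from PyEMD import EMD                     # pip install EMD-signal (assumed)
from scipy.signal import hilbert

def unwrapped_instantaneous_phase(signal):
    """Decompose with EMD and return the unwrapped phase of the dominant IMF."""
    imfs = EMD()(signal)
    dominant = imfs[np.argmax([np.sum(imf**2) for imf in imfs])]   # highest-energy IMF
    analytic = hilbert(dominant)          # analytic signal via Hilbert transform
    return np.unwrap(np.angle(analytic))

fs = 1e6                                  # 1 MHz sampling, plausible for Lamb-wave tests
t = np.arange(0, 2e-4, 1 / fs)
wave = np.sin(2 * np.pi * 100e3 * t) * np.exp(-((t - 1e-4) ** 2) / 1e-9)  # tone burst
phase = unwrapped_instantaneous_phase(wave)
print(phase[:5])
```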

Keywords: delamination, lamb wave, finite element method, EMD, instantaneous phase

Procedia PDF Downloads 298
3751 Peer-To-Peer Lending and Macroeconomics: Searching for a Link

Authors: Asror Nigmonov Asqar Ogli, Sitora Inoyatova Amonovna

Abstract:

It has been a decade since crowdfunding and P2P lending opportunities were created. Today, the market for these modern alternative investments is becoming increasingly complex to navigate. There is an overwhelming number of peer-to-peer lending platforms in both developed and emerging economies. This study looks into this market via a cross-country empirical study. In this respect, it tests the effect of various macroeconomic factors on P2P lending. Based on the existing literature, which largely lacks empirical investigations, it builds a regression model that aims to explore the relationship between the economy and P2P lending. Though the authors found it extremely difficult to compare the findings with earlier studies, the paper identifies certain tendencies in the data that have certain policy implications. However, the paper could not find any significant effect of the economic variables on P2P lending. The paper can be considered a starting point for empirical investigation of P2P lending and highlights room for further research based on the limitations of the study.

Keywords: peer-to-peer lending, crowdfunding, marketplace lending, alternative finance, fintech

Procedia PDF Downloads 171
3750 Infinite Impulse Response Digital Filters Design

Authors: Phuoc Si Nguyen

Abstract:

Infinite impulse response (IIR) filters can be designed from an analogue low pass prototype by using frequency transformation in the s-domain and bilinear z-transformation with pre-warping frequency; this method is known as frequency transformation from the s-domain to the z-domain. This paper will introduce a new method to transform an IIR digital filter to another type of IIR digital filter (low pass, high pass, band pass, band stop or narrow band) using a technique based on inverse bilinear z-transformation and inverse matrices. First, a matrix equation is derived from inverse bilinear z-transformation and Pascal’s triangle. This Low Pass Digital to Digital Filter Pascal Matrix Equation is used to transform a low pass digital filter to other digital filter types. From this equation and the inverse matrix, a Digital to Digital Filter Pascal Matrix Equation can be derived that is able to transform any IIR digital filter. This paper will also introduce some specific matrices to replace the inverse matrix, which is difficult to determine due to the larger size of the matrix in the current method. This will make computing and hand calculation easier when transforming from one IIR digital filter to another in the digital domain.
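
The paper's Pascal-matrix formulation is not reproduced in the abstract, but the baseline it builds on, designing an IIR digital filter from an analog prototype via the bilinear z-transformation with frequency pre-warping, is available directly in scipy, shown here as a quick point of reference:

```python
import numpy as np
from scipy.signal import bilinear, butter, freqz

fs = 8000.0                         # sampling rate, Hz
fc = 1000.0                         # desired digital cutoff, Hz

# Pre-warp the cutoff so the bilinear transform lands it at fc in the z-domain.
warped = 2 * fs * np.tan(np.pi * fc / fs)

# 4th-order analog Butterworth low-pass prototype, then bilinear z-transformation.
b_analog, a_analog = butter(4, warped, btype="low", analog=True)
b_digital, a_digital = bilinear(b_analog, a_analog, fs=fs)

# Verify: magnitude response should be about -3 dB at fc.
w, h = freqz(b_digital, a_digital, worN=[2 * np.pi * fc / fs])
print("gain at fc: %.2f dB" % (20 * np.log10(abs(h[0]))))
```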

Keywords: bilinear z-transformation, frequency transformation, inverse bilinear z-transformation, IIR digital filters

Procedia PDF Downloads 394
3749 Natural Factors of Interannual Variability of Winter Precipitation over the Altai Krai

Authors: Sukovatov K.Yu., Bezuglova N.N.

Abstract:

Winter precipitation variability over the Altai Krai was investigated by retrieving temporal patterns. Singular spectrum analysis was used to describe the variance distribution and to reduce the precipitation data to a few components (modes). The associated time series were related to large-scale atmospheric and oceanic circulation indices by using lag cross-correlation and wavelet-coherence analysis. GPCC monthly precipitation data for a rectangular field limited by 50-55°N, 77-88°E and monthly climatological circulation index data for the cold season were used to perform the SSA decomposition and retrieve statistics for the analyzed parameters over the period 1951-2017. Interannual variability of winter precipitation over the Altai Krai is mostly caused by three natural factors: intensity variations of momentum exchange between mid and polar latitudes over the North Atlantic (explained variance 11.4%); wind speed variations in the equatorial stratosphere (quasi-biennial oscillation, explained variance 15.3%); and surface temperature variations of the equatorial Pacific (ENSO, explained variance 2.8%). It is concluded that under current climate conditions (Arctic amplification and increasing frequency of meridional processes in mid-latitudes), the second and third factors make a more significant contribution to the explained variance of interannual variability of cold-season atmospheric precipitation over the Altai Krai than the first factor.

Keywords: interannual variability, winter precipitation, Altai Krai, wavelet-coherence

Procedia PDF Downloads 148
3748 A Modelling Analysis of Monetary Policy Rule

Authors: Wael Bakhit, Salma Bakhit

Abstract:

This paper employs a quarterly time series to determine the timing of structural breaks for interest rates in the USA over the last 60 years. The Chow test is used to investigate non-stationarity, where the date of the potential break is assumed to be known. Moreover, an empirical examination of the financial sector was made to check whether it is positively related to deviations from an assumed interest rate as given in a standard Taylor rule. The empirical analysis is strengthened by analysing the rule from a historical perspective and by a look at the effect of the central bank's interest rate setting on financial imbalances. The empirical evidence indicates that deviation in monetary policy is a potential causal factor in the build-up of financial imbalances and the subsequent crisis, where macroprudential intervention could have a beneficial effect. Thus, our findings tend to support the view that the policy stance of central banks has been a source of the global financial crisis of the past decade.
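
The benchmark against which such deviations are measured is the standard Taylor rule, i = r* + π + 0.5(π − π*) + 0.5·gap. A one-quarter illustration, assuming the classic Taylor (1993) coefficients (r* = 2, π* = 2, weights 0.5) and made-up inputs:

```python
def taylor_rate(inflation, output_gap, r_star=2.0, pi_star=2.0):
    """Standard Taylor (1993) rule: prescribed nominal policy rate in percent."""
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap

# Illustrative quarter: 3% inflation, output 1% above potential, actual rate 2.5%.
prescribed = taylor_rate(inflation=3.0, output_gap=1.0)
deviation = 2.5 - prescribed
print(f"prescribed {prescribed:.2f}%, deviation {deviation:.2f} pp")
```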

Keywords: Taylor rule, financial imbalances, central banks, econometrics

Procedia PDF Downloads 367