Search results for: Electro Myo Graphic (EMG) signals
1114 A Novel RLS Based Adaptive Filtering Method for Speech Enhancement
Authors: Pogula Rakesh, T. Kishore Kumar
Abstract:
Speech enhancement is a long-standing problem with numerous applications such as teleconferencing, VoIP, hearing aids, and speech recognition. The motivation behind this work is to obtain a clean speech signal of higher quality by applying an optimal noise cancellation technique. Real-time adaptive filtering algorithms are among the best candidates of all categories of speech enhancement methods. In this paper, we propose a speech enhancement method based on a Recursive Least Squares (RLS) adaptive filter for speech signals. Experiments were performed on noisy data prepared by adding AWGN, babble, and pink noise to clean speech samples at -5 dB, 0 dB, 5 dB, and 10 dB SNR levels. We then compare the noise cancellation performance of the proposed RLS algorithm with the existing NLMS algorithm in terms of Mean Squared Error (MSE), Signal-to-Noise Ratio (SNR), and SNR loss. Based on the performance evaluation, the proposed RLS algorithm was found to be the better noise cancellation technique for speech signals.
Keywords: adaptive filter, adaptive noise canceller, mean squared error, noise reduction, NLMS, RLS, SNR, SNR loss
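To make the approach concrete, a minimal Python/NumPy sketch of an RLS adaptive noise canceller of the kind described above is given below; the filter order, forgetting factor, regularization constant, and test signals are illustrative assumptions, not values taken from the paper.
```python
import numpy as np

def rls_noise_canceller(noisy_speech, noise_ref, order=16, lam=0.999, delta=0.01):
    """Adaptive noise cancellation with the RLS algorithm.

    noisy_speech : clean speech + noise picked up by the primary channel
    noise_ref    : correlated noise measured by the reference channel
    Returns the error signal e[n], which approximates the enhanced speech.
    """
    n_samples = len(noisy_speech)
    w = np.zeros(order)                 # adaptive filter weights
    P = np.eye(order) / delta           # inverse correlation matrix estimate
    enhanced = np.zeros(n_samples)

    for n in range(order, n_samples):
        x = noise_ref[n - order:n][::-1]        # most recent reference samples
        y = w @ x                                # noise estimate
        e = noisy_speech[n] - y                  # error = enhanced speech sample
        k = (P @ x) / (lam + x @ P @ x)          # gain vector
        w = w + k * e                            # weight update
        P = (P - np.outer(k, x @ P)) / lam       # update inverse correlation matrix
        enhanced[n] = e
    return enhanced

# Illustrative usage with synthetic data (not the paper's corpus):
fs = 8000
t = np.arange(0, 1, 1 / fs)
clean = np.sin(2 * np.pi * 440 * t)
noise = np.random.randn(t.size)
out = rls_noise_canceller(clean + 0.5 * noise, noise)
```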
Procedia PDF Downloads 481
1113 Automatic Content Curation of Visual Heritage
Authors: Delphine Ribes Lemay, Valentine Bernasconi, André Andrade, Lara DéFayes, Mathieu Salzmann, FréDéRic Kaplan, Nicolas Henchoz
Abstract:
Digitization and preservation of large heritage collections induce high maintenance costs to keep up with technical standards and ensure sustainable access. Creating impactful usage is instrumental to justify the resources required for long-term preservation. The Museum für Gestaltung of Zurich holds one of the biggest poster collections in the world, from which 52'000 posters were digitised. In the process of building a digital installation to valorise the collection, one objective was to develop an algorithm capable of predicting the next poster to show according to the ones already displayed. The work presented here describes the steps to build an algorithm able to automatically create sequences of posters reflecting associations performed by curators and professional designers. The challenge has similarities with the domain of song playlist algorithms. Recently, artificial intelligence techniques, and more specifically deep-learning algorithms, have been used to facilitate their generation. Promising results were found with Recurrent Neural Networks (RNN) trained on manually generated playlists and paired with clusters of features extracted from songs. We applied the same principles to a more challenging medium, posters. First, a convolutional autoencoder was trained to extract features of the posters, using the 52'000 digital posters as the training set. Poster features were then clustered. Next, an RNN learned to predict the next cluster according to the previous ones. The RNN training set was composed of poster sequences extracted from a collection of books from the Museum für Gestaltung of Zurich dedicated to displaying posters. Finally, within the predicted cluster, the poster with the best proximity to the previous poster is selected, where proximity is computed as the mean square distance between poster features. To validate the predictive model, we compared sequences of 15 posters produced by our model to randomly and manually generated sequences. Manual sequences were created by a professional graphic designer. We asked 21 participants working as professional graphic designers to sort the sequences from the one with the strongest graphic line to the one with the weakest and to motivate their answer with a short description. The sequences produced by the designer were ranked first 60%, second 25% and third 15% of the time. The sequences produced by our predictive model were ranked first 25%, second 45% and third 30% of the time. The sequences produced randomly were ranked first 15%, second 29%, and third 55% of the time. Compared to designer sequences, and as reported by participants, model and random sequences lacked thematic continuity. According to the results, the proposed model is able to generate better poster sequencing than random sampling, and our algorithm is sometimes able to outperform a professional designer. As a next step, the proposed algorithm should include the possibility to create sequences according to a selected theme. To conclude, this work shows the potential of artificial intelligence techniques to learn from existing content and to provide a tool for curating large sets of data, with a permanent renewal of the presented content.
Keywords: Artificial Intelligence, Digital Humanities, serendipity, design research
Procedia PDF Downloads 184
1112 An Ultrasonic Signal Processing System for Tomographic Imaging of Reinforced Concrete Structures
Authors: Edwin Forero-Garcia, Jaime Vitola, Brayan Cardenas, Johan Casagua
Abstract:
This research article presents the integration of electronic and computer systems into an ultrasonic signal processing system that performs signal capture, conditioning, and analog-to-digital conversion, and then carries out processing and visualization. Signal capture and conditioning were achieved through the design and implementation of an analog electronic system organized in stages: 1. impedance coupling; 2. analog filtering; 3. signal amplification. After signal conditioning, the ultrasonic information was digitized using a microcontroller for subsequent processing. The digital processing of the signals was carried out in MATLAB to produce A-Scan, B-Scan, and D-Scan ultrasonic images. Advanced processing was then performed using the SAFT technique to improve the resolution of the B-Scan images. The information from the ultrasonic images was displayed in a user interface developed in .NET with Visual Studio. To validate the system, ultrasonic signals were acquired, enabling non-invasive inspection of reinforced concrete structures and identification of the existing pathologies in them.
Keywords: acquisition, signal processing, ultrasound, SAFT, HMI
Procedia PDF Downloads 107
1111 Coils and Antennas Fabricated with Sewing Litz Wire for Wireless Power Transfer
Authors: Hikari Ryu, Yuki Fukuda, Kento Oishi, Chiharu Igarashi, Shogo Kiryu
Abstract:
Recently, wireless power transfer has been developed in various fields. Magnetic coupling is popular for feeding power over relatively short distances and at lower frequencies, while electromagnetic wave coupling at high frequency is used for long-distance power transfer. Wireless power transfer has also attracted attention in the e-textile field. At present, rigid batteries are required for many body-worn electric systems; this technology enables such batteries to be removed from the systems. Flexible coils have been studied for such applications. Coils with a high Q factor are required for magnetic-coupling power transfer, and antennas with low return loss are needed for electromagnetic coupling. Litz wire is flexible enough to fabricate coils and antennas sewn on fabric and has low resistivity. In this study, the electric characteristics of coils and antennas fabricated with Litz wire using two sewing techniques are investigated. As examples, a coil and an antenna are described, both fabricated with 330/0.04 mm Litz wire. The coil was a planar coil with a square shape; the outer side was 150 mm, the number of turns was 15, and the pitch between turns was 5 mm. The Litz wire of the coil was overstitched with a sewing machine. The coil was fabricated as a receiver coil for magnetically coupled wireless power transfer. Its Q factor was 200 at a frequency of 800 kHz. A wireless power system was constructed using the coil, driven by a power oscillator. The resonant frequency of the circuit was set to 123 kHz, where the switching loss of the power FETs is small. The power efficiencies were 0.44 – 0.99, depending on the distance between the transmitter and receiver coils. As an example of a sewn antenna, a fractal-pattern antenna was stitched on a 500 mm x 500 mm fabric using a needle punch method. The pattern was the 2nd-order Vicsek fractal. The return loss of the antenna was -28 dB at a frequency of 144 MHz.
Keywords: e-textile, flexible coils and antennas, Litz wire, wireless power transfer
Procedia PDF Downloads 133
1110 Electromyography Pattern Classification with Laplacian Eigenmaps in Human Running
Authors: Elnaz Lashgari, Emel Demircan
Abstract:
Electromyography (EMG) is one of the most important interfaces between humans and robots for rehabilitation. Decoding this signal helps to recognize muscle activation and converts it into smooth motion for the robots. Detecting each muscle’s pattern during walking and running is vital for improving the quality of a patient’s life. In this study, EMG data from 10 muscles in 10 subjects at 4 different speeds were analyzed. EMG signals are nonlinear with high dimensionality. To deal with this challenge, we extracted some features in time-frequency domain and used manifold learning and Laplacian Eigenmaps algorithm to find the intrinsic features that represent data in low-dimensional space. We then used the Bayesian classifier to identify various patterns of EMG signals for different muscles across a range of running speeds. The best result for vastus medialis muscle corresponds to 97.87±0.69 for sensitivity and 88.37±0.79 for specificity with 97.07±0.29 accuracy using Bayesian classifier. The results of this study provide important insight into human movement and its application for robotics research.Keywords: electromyography, manifold learning, ISOMAP, Laplacian Eigenmaps, locally linear embedding
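As an illustration of the processing chain described above (manifold learning followed by a Bayesian classifier), a hedged Python sketch using scikit-learn's Laplacian Eigenmaps implementation (SpectralEmbedding) and a Gaussian naive Bayes classifier is given; the feature matrix, labels, and dimensions are placeholders rather than the study's actual EMG data.
```python
import numpy as np
from sklearn.manifold import SpectralEmbedding   # Laplacian Eigenmaps
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

# Placeholder EMG feature matrix: rows = analysis windows, columns = time-frequency
# features. In the study these would come from 10 muscles at 4 running speeds.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 60))          # 400 windows x 60 features (illustrative)
y = rng.integers(0, 4, size=400)        # 4 speed classes (illustrative)

# Nonlinear dimensionality reduction onto a low-dimensional manifold.
embedding = SpectralEmbedding(n_components=5, n_neighbors=10)
X_low = embedding.fit_transform(X)

# Bayesian classification of the embedded features.
X_tr, X_te, y_tr, y_te = train_test_split(X_low, y, test_size=0.3, random_state=0)
clf = GaussianNB().fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```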
Procedia PDF Downloads 361
1109 Analysis of a IncResU-Net Model for R-Peak Detection in ECG Signals
Authors: Beatriz Lafuente Alcázar, Yash Wani, Amit J. Nimunkar
Abstract:
Cardiovascular diseases (CVDs) are the leading cause of death globally, and around 80% of sudden cardiac deaths are due to arrhythmias or irregular heartbeats. The majority of these pathologies are revealed by either short-term or long-term alterations in the electrocardiogram (ECG) morphology. The ECG is the main diagnostic tool in cardiology. It is a non-invasive, pain-free procedure that measures the heart's electrical activity and allows the detection of abnormal rhythms and underlying conditions. A cardiologist can diagnose a wide range of pathologies based on alterations of the ECG waveform, but human interpretation is subjective and prone to error. Moreover, ECG records can be quite prolonged, which further complicates visual diagnosis and delays disease detection. In this context, deep learning methods have emerged as a promising strategy to extract relevant features and eliminate individual subjectivity in ECG analysis. They facilitate the computation of large sets of data and can provide early and precise diagnoses. Therefore, cardiology is one of the fields that can benefit most from the implementation of deep learning algorithms. In the present study, a deep learning algorithm is trained following a novel approach, using a combination of different databases as the training set. The goal of the algorithm is the detection of R-peaks in ECG signals. Its performance is further evaluated on ECG signals with different origins and features to test the model's ability to generalize. Performance of the model for the detection of R-peaks in clean and noisy ECGs is presented. The model is able to detect R-peaks in the presence of various types of noise and when presented with data it has not been trained on. It is expected that this approach will increase the effectiveness and capacity of cardiologists to detect divergences from the normal cardiac activity of their patients.
Keywords: arrhythmia, deep learning, electrocardiogram, machine learning, R-peaks
Procedia PDF Downloads 186
1108 Monitoring Saltwater Corrosion on Steel Samples Using Coda Wave Interferometry in MHZ Frequencies
Authors: Maxime Farin, Emmanuel Moulin, Lynda Chehami, Farouk Benmeddour, Pierre Campistron
Abstract:
Assessing corrosion is crucial in the petrochemical and marine industries. Usual ultrasonic methods based on guided waves can inspect large areas for corrosion but lack precision. We propose a complementary and sensitive ultrasonic method (~10 MHz) based on coda wave interferometry to detect and quantify corrosion at the surface of a steel sample. The method relies on a single piezoelectric transducer, exciting the sample and measuring the scattered coda signals at different instants in time. A laboratory experiment is conducted with a steel sample immersed in salt water for 60 h, with parallel coda and temperature measurements to correct the coda's dependence on temperature variations. Micrometric changes to the sample surface caused by corrosion are detected in the late coda signals, allowing precise corrosion detection. Moreover, a good correlation is found between a parameter quantifying the temperature-corrected stretching of the coda over time with respect to a corrosion-free reference and the corroded surface area of the sample recorded with a camera.
Keywords: coda wave interferometry, nondestructive evaluation, corrosion, ultrasonics
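A possible Python sketch of the stretching estimate underlying coda wave interferometry is shown below: the current coda is compared against time-dilated versions of a corrosion-free reference, and the dilation maximizing the correlation coefficient quantifies the change. The sampling rate, window, and synthetic traces are assumptions for illustration only.
```python
import numpy as np

def stretching_coefficient(reference, current, t, window,
                           eps_range=np.linspace(-5e-3, 5e-3, 201)):
    """Estimate the relative stretching between two coda signals.

    The reference coda is evaluated on dilated time axes t*(1+eps) inside a
    late-coda window; the eps giving the highest correlation with the current
    coda quantifies the change at the sample surface.
    """
    i0, i1 = window
    best_eps, best_cc = 0.0, -np.inf
    for eps in eps_range:
        stretched = np.interp(t * (1.0 + eps), t, reference)
        cc = np.corrcoef(stretched[i0:i1], current[i0:i1])[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps, best_cc

# Illustrative use with synthetic coda traces (not the experimental data):
fs = 50e6                                    # MHz-range sampling, assumed
t = np.arange(0, 200e-6, 1 / fs)
ref = np.random.randn(t.size) * np.exp(-t / 50e-6)
cur = np.interp(t * 1.0005, t, ref)          # 0.05% stretch mimicking corrosion
eps, cc = stretching_coefficient(ref, cur, t, window=(t.size // 2, t.size))
print("estimated stretching:", eps, "correlation:", cc)
```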
Procedia PDF Downloads 234
1107 Transverse Momentum Dependent Factorization and Evolution for Spin Physics
Authors: Bipin Popat Sonawane
Abstract:
After the 1988 European Muon Collaboration (EMC) announcement of the measurement of the spin-dependent structure function, it became clear that the spin structure of the hadron needs to be understood. In the study of the three-dimensional spin structure of the proton, we need to understand the foundations of quantum field theory in terms of the electroweak and strong theories using rigorous mathematical theories and models. In the process of understanding the inner dynamical structure of the proton, we need to understand the mathematical formalism of perturbative quantum chromodynamics (pQCD). In QCD processes such as proton-proton collisions at high energy, we calculate cross sections using conventional collinear factorization schemes. In these calculations, parton distribution functions (PDFs) and fragmentation functions (FFs) are used, which provide information about the probability density of finding quarks and gluons (partons) inside the proton and the probability density of finding the final hadronic state from the initial partons. Transverse momentum dependent (TMD) PDFs and FFs, collectively called TMDs, take into account the intrinsic transverse motion of partons. TMD factorization in the calculation of cross sections provides a scheme of hadronic and partonic states in a given QCD process. In this study we review the TMD factorization scheme using the Collins-Soper-Sterman (CSS) formalism. The CSS formalism considers the transverse momentum dependence of the partons; in this formalism the cross section is written as a Fourier transform over a transverse position variable, which has a physical interpretation as an impact parameter. Along with this, we compare the formalism with the improved CSS formalism. In this work we study TMD evolution schemes and their comparison with other schemes. This would provide a description of the measurement of the transverse single spin asymmetry (TSSA) in hadro-production and electro-production of the J/psi meson at RHIC, LHC, and ILC energy scales, and would help us understand the J/psi production mechanism, which is an appropriate test of QCD.
Procedia PDF Downloads 69
1106 Selection of Optimal Reduced Feature Sets of Brain Signal Analysis Using Heuristically Optimized Deep Autoencoder
Authors: Souvik Phadikar, Nidul Sinha, Rajdeep Ghosh
Abstract:
In brainwave research using electroencephalogram (EEG) signals, finding the most relevant and effective feature set for the identification of activities in the human brain remains a major challenge because of the random nature of the signals. The feature extraction method is a key issue in solving this problem. Finding features that give distinctive pictures for different activities and similar pictures for the same activities is very difficult, especially as the number of activities grows. The classification accuracy depends on the quality of this feature set. Furthermore, a larger number of features results in high computational complexity, while fewer features compromise performance. In this paper, a novel idea for the selection of an optimal feature set using a heuristically optimized deep autoencoder is presented. Using various feature extraction methods, a vast number of features are extracted from the EEG signals and fed to the autoencoder deep neural network. The autoencoder encodes the input features into a small set of codes. To avoid the vanishing gradient problem and to normalize the dataset, a meta-heuristic search algorithm is used to minimize the mean square error (MSE) between the encoder input and the decoder output. To reduce the feature set to a smaller one, 4 hidden layers are considered in the autoencoder network; hence it is called the Heuristically Optimized Deep Autoencoder (HO-DAE). In this method, no features are rejected; all the features are combined into the responses of the hidden layer. The results reveal that higher accuracy can be achieved using the optimal reduced features. The proposed HO-DAE is also compared with a regular autoencoder to test the performance of both. The performance of the proposed method is validated and compared with two other methods recently reported in the literature, which reveals that the proposed method is far better than the other two in terms of classification accuracy.
Keywords: autoencoder, brainwave signal analysis, electroencephalogram, feature extraction, feature selection, optimization
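A minimal sketch of a four-hidden-layer autoencoder in the spirit of the HO-DAE is shown below, written in Python with PyTorch; here a standard gradient-based optimizer (Adam) stands in for the paper's meta-heuristic search, and the feature dimension and code size are assumed values.
```python
import torch
import torch.nn as nn

# Illustrative dimensions: n_feat extracted EEG features compressed to a small code.
n_feat, code_dim = 128, 16

class DeepAutoencoder(nn.Module):
    """Four hidden layers: two in the encoder, two in the decoder."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_feat, 64), nn.ReLU(),
            nn.Linear(64, code_dim), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(code_dim, 64), nn.ReLU(),
            nn.Linear(64, n_feat),
        )
    def forward(self, x):
        return self.decoder(self.encoder(x))

X = torch.randn(1000, n_feat)        # placeholder EEG feature vectors
model = DeepAutoencoder()
# Adam is used here as a stand-in for the paper's meta-heuristic search.
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimiser.zero_grad()
    # MSE between encoder input and decoder output, as in the abstract.
    loss = loss_fn(model(X), X)
    loss.backward()
    optimiser.step()

codes = model.encoder(X).detach()    # reduced feature set fed to a classifier
```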
Procedia PDF Downloads 114
1105 Application of Local Mean Decomposition for Rolling Bearing Fault Diagnosis Based On Vibration Signals
Authors: Toufik Bensana, Slimane Mekhilef, Kamel Tadjine
Abstract:
Vibration analysis has been frequently applied in the condition monitoring and fault diagnosis of rolling element bearings. Unfortunately, the vibration signals collected from a faulty bearing are generally non-stationary and nonlinear, with strong noise interference, so it is essential to extract the fault features correctly. In this paper, a novel numerical analysis method based on local mean decomposition (LMD) is proposed. LMD decomposes the signal into a series of product functions (PFs), each of which is the product of an envelope signal and a purely frequency-modulated (FM) signal. The envelope of a PF is the instantaneous amplitude (IA), and the instantaneous frequency (IF) is the derivative of the unwrapped phase of the purely frequency-modulated signal. After that, the fault characteristic frequency of the roller bearing can be extracted by performing spectrum analysis on the instantaneous amplitude of the PF component containing the dominant fault information. The results show the effectiveness of the proposed technique in fault detection and diagnosis of rolling element bearings.
Keywords: fault diagnosis, condition monitoring, local mean decomposition, rolling element bearing, vibration analysis
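Full LMD is beyond a short example, but the final step described above, spectrum analysis of an instantaneous amplitude, can be illustrated in Python with a Hilbert-transform envelope of a synthetic faulty-bearing signal as a simplified stand-in for a PF envelope; the sampling rate and fault frequency are assumptions.
```python
import numpy as np
from scipy.signal import hilbert

fs = 12_000                                   # sampling rate (assumed)
t = np.arange(0, 1.0, 1 / fs)

# Synthetic faulty-bearing signal: a 3 kHz resonance amplitude-modulated at a
# 105 Hz fault characteristic frequency, plus noise (illustrative only).
fault_f = 105.0
x = (1 + 0.8 * np.cos(2 * np.pi * fault_f * t)) * np.sin(2 * np.pi * 3000 * t)
x += 0.3 * np.random.randn(t.size)

# Instantaneous amplitude (envelope), analogous to the IA of a PF component.
envelope = np.abs(hilbert(x))

# Envelope spectrum: the fault characteristic frequency appears as a peak.
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, d=1 / fs)
print("dominant envelope frequency:", freqs[np.argmax(spectrum)], "Hz")
```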
Procedia PDF Downloads 397
1104 An International Curriculum Development for Languages and Technology
Authors: Miguel Nino
Abstract:
When considering the challenges of a changing and demanding globalizing world, it is important to reflect on how university students will be prepared for the realities of internationalization, marketization and intercultural conversation. The present study is an interdisciplinary program designed to respond to the needs of the global community. The proposal bridges the humanities and the sciences through three different fields: languages, graphic design and computer science, specifically fundamentals of programming such as Python, JavaScript and software animation. The goal of the four-year program is therefore twofold: first, to prepare students for intercultural communication between English and other languages such as Spanish, Mandarin, French or German; second, to provide students with knowledge of practical software and relevant employable skills to collaborate on computer-assisted projects that will most probably require an essential programming background in interpreted or compiled languages. In order to be inclusive and constructivist, the cognitive linguistics approach is suggested for the three fields, particularly for languages that rely on the traditional method of repetition. This methodology will help students develop their creativity and encourage them to become independent problem-solving individuals, as languages enhance their common ground of interaction for culture and technology. Participants in this course of study will be evaluated on their second language acquisition at the Intermediate-High level. For graphic design and computer science, students will apply their creative digital skills, as well as the critical thinking skills learned from the cognitive linguistics approach, to collaborate on a group project designed to find solutions for web media design problems or marketing experimentation for a company or the community. It is understood that applying programming knowledge and skills will be necessary to deliver the final product. In conclusion, the program equips students with linguistic knowledge and skills to be competent in intercultural communication, where English, the lingua franca, remains the medium for marketing and product delivery. In addition to their employability, students can expand their knowledge and skills in digital humanities and computational linguistics, or increase their portfolio in advertising and marketing. These students will be the global human capital for the competitive globalizing community.
Keywords: curriculum, international, languages, technology
Procedia PDF Downloads 443
1103 Field-Programmable Gate Array-Based Baseband Signals Generator of X-Band Transmitter for Micro Satellite/CubeSat
Authors: Shih-Ming Wang, Chun-Kai Yeh, Ming-Hwang Shie, Tai-Wei Lin, Chieh-Fu Chang
Abstract:
This paper introduces an FPGA-based baseband signals generator (BSG) for the X-band transmitter developed by the National Space Organization (NSPO), Taiwan, for earth observation. In order to gain more flexibility for various applications, a number of modulation schemes, QPSK, DeQPSK and 8PSK 4D-TCM, are included. For the micro satellite scenario, the maximum symbol rate is up to 150 Msps and the EVM is as low as 1.9%. For the CubeSat scenario, the maximum symbol rate is up to 60 Msps and the EVM is less than 1.7%. The maximum data rates are 412.5 Mbps and 165 Mbps, respectively. In addition, a triple modular redundancy (TMR) scheme is implemented in order to reduce single event effects (SEE) induced by radiation. Finally, the theoretical error performance is provided based on a comprehensive analysis, especially for BER at and well below 10⁻⁶, owing to the low bit-error requirement of modern high-resolution earth remote-sensing instruments.
Keywords: X-band transmitter, FPGA (Field-Programmable Gate Array), CubeSat, micro satellite
Procedia PDF Downloads 295
1102 A Quality Index Optimization Method for Non-Invasive Fetal ECG Extraction
Authors: Lucia Billeci, Gennaro Tartarisco, Maurizio Varanini
Abstract:
Fetal cardiac monitoring by the fetal electrocardiogram (fECG) can provide significant clinical information about the health condition of the fetus. Despite this potential, the use of the fECG in clinical practice has so far been quite limited due to the difficulties in measuring it. The recovery of the fECG from signals acquired non-invasively by electrodes placed on the maternal abdomen is a challenging task because abdominal signals are a mixture of several components and the fetal one is very weak. This paper presents an approach for fECG extraction from abdominal maternal recordings, which exploits the pseudo-periodicity of the fetal ECG. It consists of devising a quality index (fQI) for the fECG and of finding the linear combinations of preprocessed abdominal signals which maximize this fQI (quality index optimization, QIO). It aims at improving the performance of the most commonly adopted methods for fECG extraction, which are usually based on estimating and canceling the maternal ECG (mECG). The procedure for fECG extraction and fetal QRS (fQRS) detection is completely unsupervised and based on the following steps: signal pre-processing; maternal ECG (mECG) extraction and maternal QRS detection; mECG component approximation and canceling by weighted principal component analysis; fECG extraction by fQI maximization and fetal QRS detection. The proposed method was compared with our previously developed procedure, which obtained the highest score at the PhysioNet/Computing in Cardiology Challenge 2013. That procedure was based on removing the mECG estimated from the abdominal signals by principal component analysis (PCA) and applying Independent Component Analysis (ICA) on the residual signals. Both methods were developed and tuned using 69 one-minute-long abdominal measurements with fetal QRS annotations from dataset A provided by the PhysioNet/Computing in Cardiology Challenge 2013. The QIO-based and the ICA-based methods were compared by analyzing two databases of abdominal maternal ECG available on the PhysioNet site. The first is the Abdominal and Direct Fetal Electrocardiogram Database (ADdb), which contains the fetal QRS annotations, thus allowing a quantitative performance comparison; the second is the Non-Invasive Fetal Electrocardiogram Database (NIdb), which does not contain the fetal QRS annotations, so that the comparison between the two methods can only be qualitative. In particular, the comparison on NIdb was performed by defining an index of quality for the fetal RR series. On the annotated database ADdb, the QIO method provided the performance indexes Sens=0.9988, PPA=0.9991, F1=0.9989, overcoming the ICA-based one, which provided Sens=0.9966, PPA=0.9972, F1=0.9969. The index of quality was higher for the QIO-based method than for the ICA-based one in 35 records out of 55 in the NIdb. The QIO-based method gave very high performance with both databases. The results of this study foresee the application of the algorithm in a fully unsupervised way for implementation in wearable devices for self-monitoring of fetal health.
Keywords: fetal electrocardiography, fetal QRS detection, independent component analysis (ICA), optimization, wearable
Procedia PDF Downloads 280
1101 Identification of Damage Mechanisms in Interlock Reinforced Composites Using a Pattern Recognition Approach of Acoustic Emission Data
Authors: M. Kharrat, G. Moreau, Z. Aboura
Abstract:
The latest advances in the weaving industry, combined with increasingly sophisticated means of materials processing, have made it possible to produce complex 3D composite structures. Mainly used in aeronautics, composite materials with 3D architecture offer better mechanical properties than 2D reinforced composites. Nevertheless, these materials require a good understanding of their behavior. Because of the complexity of such materials, the damage mechanisms are multiple, and the scenario of their appearance and evolution depends on the nature of the exerted solicitations. The AE technique is a well-established tool for discriminating between the damage mechanisms. Suitable sensors are used during the mechanical test to monitor the structural health of the material. Relevant AE-features are then extracted from the recorded signals, followed by a data analysis using pattern recognition techniques. In order to better understand the damage scenarios of interlock composite materials, a multi-instrumentation was set-up in this work for tracking damage initiation and development, especially in the vicinity of the first significant damage, called macro-damage. The deployed instrumentation includes video-microscopy, Digital Image Correlation, Acoustic Emission (AE) and micro-tomography. In this study, a multi-variable AE data analysis approach was developed for the discrimination between the different signal classes representing the different emission sources during testing. An unsupervised classification technique was adopted to perform AE data clustering without a priori knowledge. The multi-instrumentation and the clustered data served to label the different signal families and to build a learning database. This latter is useful to construct a supervised classifier that can be used for automatic recognition of the AE signals. Several materials with different ingredients were tested under various solicitations in order to feed and enrich the learning database. The methodology presented in this work was useful to refine the damage threshold for the new generation materials. The damage mechanisms around this threshold were highlighted. The obtained signal classes were assigned to the different mechanisms. The isolation of a 'noise' class makes it possible to discriminate between the signals emitted by damages without resorting to spatial filtering or increasing the AE detection threshold. The approach was validated on different material configurations. For the same material and the same type of solicitation, the identified classes are reproducible and little disturbed. The supervised classifier constructed based on the learning database was able to predict the labels of the classified signals.Keywords: acoustic emission, classifier, damage mechanisms, first damage threshold, interlock composite materials, pattern recognition
Procedia PDF Downloads 155
1100 Vibration-Based Data-Driven Model for Road Health Monitoring
Authors: Guru Prakash, Revanth Dugalam
Abstract:
A road's condition often deteriorates due to harsh loading, such as overloading by trucks, and severe environmental conditions, such as heavy rain, snow load, and cyclic loading. In the absence of proper maintenance planning, this results in potholes, wide cracks, bumps, and increased road roughness. In this paper, a data-driven model will be developed to detect these damages using vibration and image signals. The key idea of the proposed methodology is that road anomalies manifest in these signals and can be detected by training a machine learning algorithm. The use of various machine learning techniques, such as the support vector machine and the Random Forest method, will be investigated. The proposed model will first be trained and tested with artificially simulated data, and the model architecture will be finalized by comparing the accuracies of the various models. Once a model is fixed, a field study will be performed and data will be collected. The field data will be used to validate the proposed model and to predict the road's future health condition. The proposed model will help to automate the road condition monitoring process, repair cost estimation, and maintenance planning.
Keywords: SVM, data-driven, road health monitoring, pot-hole
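A hedged Python sketch of the model-comparison step described above (SVM versus Random Forest on artificially simulated vibration features) follows; the feature definitions, class labels, and data are illustrative assumptions, not the study's field data.
```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Artificially simulated vibration features (e.g., RMS, peak, kurtosis per road
# segment); feature meanings, sizes, and labels are illustrative assumptions.
rng = np.random.default_rng(1)
smooth = rng.normal(loc=[0.2, 1.0, 3.0], scale=0.1, size=(300, 3))
pothole = rng.normal(loc=[0.6, 2.5, 6.0], scale=0.3, size=(300, 3))
X = np.vstack([smooth, pothole])
y = np.array([0] * 300 + [1] * 300)        # 0 = healthy, 1 = damaged segment

# Compare candidate models by cross-validated accuracy before field deployment.
for name, model in [("SVM", SVC(kernel="rbf", C=1.0)),
                    ("Random Forest", RandomForestClassifier(n_estimators=100))]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```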
Procedia PDF Downloads 86
1099 Hierarchical Scheme for Detection of Rotating Mimo Visible Light Communication Systems Using Mobile Phone Camera
Authors: Shih-Hao Chen, Chi-Wai Chow
Abstract:
A multiple-input multiple-output (MIMO) scheme can extend the transmission capacity of a light-emitting-diode (LED) visible light communication (VLC) system. A MIMO VLC system using the popular mobile-phone camera as the optical receiver (Rx) to receive the MIMO signal from an n x n Red-Green-Blue (RGB) LED array is desirable. The key step in decoding the received RGB LED array signals is detecting the orientation of the received array. If the LED transmitter (Tx) is rotated, the signal may not be received correctly, causing errors in the received data. In this work, we propose and demonstrate a novel hierarchical transmission scheme which can reduce the computational complexity of rotation detection in an LED array VLC system. We use the n x n RGB LED array as the MIMO Tx. A novel two-dimensional Hadamard coding scheme is proposed and demonstrated. The detection correction rate is above 95% at indoor usage distances. Experimental results confirm the feasibility of the proposed scheme.
Keywords: Visible Light Communication (VLC), Multiple-input and multiple-output (MIMO), Red-Green-Blue (RGB), Hadamard coding scheme
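The abstract does not detail the coding scheme, so the Python sketch below only illustrates the general idea of a Hadamard-derived 2-D pattern whose rotations are mutually distinguishable, allowing the receiver to test the four candidate orientations; the pattern size, row choice, and detection rule are assumptions, not the paper's scheme.
```python
import numpy as np
from scipy.linalg import hadamard

# Rows 1 and 3 of an 8x8 Hadamard matrix are orthogonal, and their outer
# product flips sign under a 180-degree rotation, so the four rotated versions
# of the pattern are mutually distinguishable by correlation (illustrative choice).
H = hadamard(8)                        # entries +1 / -1
pattern = np.outer(H[1], H[3])         # separable 2-D Hadamard pattern

def detect_rotation(received, reference=pattern):
    """Return the rotation (in multiples of 90 degrees) best explaining 'received'."""
    scores = [np.sum(received * np.rot90(reference, k)) for k in range(4)]
    return int(np.argmax(scores))

# Simulate a transmitter rotated by 180 degrees plus camera noise.
received = np.rot90(pattern, 2) + 0.2 * np.random.randn(8, 8)
print("estimated rotation:", detect_rotation(received) * 90, "degrees")
```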
Procedia PDF Downloads 419
1098 Electroencephalography-Based Intention Recognition and Consensus Assessment during Emergency Response
Abstract:
After natural and man-made disasters, robots can bypass the danger, expedite the search, and acquire unprecedented situational awareness to design rescue plans. The hands-free requirement from the first responders excludes the use of tedious manual control and operation. In unknown, unstructured, and obstructed environments, natural-language-based supervision is not amenable for first responders to formulate, and is difficult for robots to understand. Brain-computer interface is a promising option to overcome the limitations. This study aims to test the feasibility of using electroencephalography (EEG) signals to decode human intentions and detect the level of consensus on robot-provided information. EEG signals were classified using machine-learning and deep-learning methods to discriminate search intentions and agreement perceptions. The results show that the average classification accuracy for intention recognition and consensus assessment is 67% and 72%, respectively, proving the potential of incorporating recognizable users’ bioelectrical responses into advanced robot-assisted systems for emergency response.Keywords: consensus assessment, electroencephalogram, emergency response, human-robot collaboration, intention recognition, search and rescue
Procedia PDF Downloads 93
1097 Theory and Practice of Wavelets in Signal Processing
Authors: Jalal Karam
Abstract:
The methods of the Fourier, Laplace, and Wavelet Transforms provide transfer functions and relationships between the input and output signals in linear time-invariant systems. This paper shows the equivalence among these three methods and in each case presents an application of the appropriate transform (Fourier, Laplace or Wavelet) to the convolution theorem. In addition, it is shown that the same holds for a direct integration method. The biorthogonal wavelets Bior3.5 and Bior3.9 are examined, and the zero distributions of their associated filter polynomials are located. This paper also presents the significance of utilizing wavelets as effective tools in processing speech signals for common multimedia applications in general, and for recognition and compression in particular. Theoretically and practically, wavelets have proved to be effective and competitive. The practical use of the Continuous Wavelet Transform (CWT) in the processing and analysis of speech is then presented, along with explanations of how the human ear can be thought of as a natural wavelet transformer of speech. This generates a variety of approaches for applying the CWT to many paradigms for analysing speech, sound and music. For perception, the flexibility of implementation of this transform allows the construction of numerous scales, and we include two of them. Results for speech recognition and speech compression are then included.
Keywords: continuous wavelet transform, biorthogonal wavelets, speech perception, recognition and compression
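As a small illustration of the practical use of the CWT mentioned above, a Python sketch using PyWavelets with a Morlet wavelet on a chirp standing in for a speech segment is given; the sampling rate and scale range are assumptions.
```python
import numpy as np
import pywt

fs = 16_000                                     # assumed sampling rate
t = np.arange(0, 0.5, 1 / fs)
# A chirp standing in for a short speech segment (illustrative signal only).
signal = np.sin(2 * np.pi * (200 + 600 * t) * t)

# Continuous Wavelet Transform with a Morlet wavelet; the scale range is an
# assumption chosen to roughly cover the band of interest.
scales = np.arange(1, 128)
coeffs, freqs = pywt.cwt(signal, scales, 'morl', sampling_period=1 / fs)

# |coeffs| is a scalogram (scale x time), a multi-scale analysis loosely
# analogous to the ear's processing described above; 'freqs' gives the
# pseudo-frequency associated with each scale.
print(coeffs.shape, freqs.min(), freqs.max())
```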
Procedia PDF Downloads 416
1096 The Effect of Parameters on Production of NiO/Al2O3/B2O3/SiO2 Composite Nanofibers by Using Sol-Gel Processing and Electrospinning Technique
Authors: F. Sevim, E. Sevimli, F. Demir, T. Çalban
Abstract:
For the first time, nanofibers of a PVA/nickel nitrate/silica/aluminium isopropoxide/boric acid composite were prepared by using sol-gel processing and the electrospinning technique. By high-temperature calcination of the above precursor fibers, nanofibers of the NiO/Al2O3/B2O3/SiO2 composite with diameters of 500 nm could be successfully obtained. The fibers were characterized by TG/DTA, FT-IR, XRD and SEM analyses.
Keywords: nanofibers, NiO/Al2O3/B2O3/SiO2 composite, sol-gel processing, electrospinning
Procedia PDF Downloads 337
1095 Portable System for the Acquisition and Processing of Electrocardiographic Signals to Obtain Different Metrics of Heart Rate Variability
Authors: Daniel F. Bohorquez, Luis M. Agudelo, Henry H. León
Abstract:
Heart rate variability (HRV) is defined as the temporal variation between heartbeats or RR intervals (the distance between R waves in an electrocardiographic signal). This distance is currently a recognized biomarker. With the analysis of this distance, it is possible to assess the sympathetic and parasympathetic nervous systems, which are responsible for the regulation of the cardiac muscle. The analysis allows health specialists and researchers to diagnose various pathologies based on this variation. For the acquisition and analysis of HRV taken from a cardiac electrical signal, electronic equipment and analysis software that work independently are currently used. This complicates and delays the process of interpretation and diagnosis. With this delay, the health condition of patients can be put at greater risk, which can lead to untimely treatment. This document presents a single portable device capable of acquiring electrocardiographic signals and calculating a total of 19 HRV metrics, reducing the time required and resulting in timelier intervention. The device has an electrocardiographic signal acquisition card attached to a microcontroller capable of transmitting the cardiac signal wirelessly to a mobile device. In addition, a mobile application was designed to analyze the cardiac waveform. The device calculates the RR intervals and the different metrics. The application allows a user to visualize the cardiac signal and the 19 metrics in real time. The information is exported to a cloud database for remote analysis. The study was performed under controlled conditions in the simulated hospital of the Universidad de la Sabana, Colombia. A total of 60 signals were acquired and analyzed. The device was compared against two reference systems. The results show a strong level of correlation (r > 0.95, p < 0.05) between the 19 metrics compared. Therefore, the use of the portable system, evaluated in clinical scenarios controlled by medical specialists and researchers, is recommended for the evaluation of the condition of the cardiac system.
Keywords: biological signal analysis, heart rate variability (HRV), HRV metrics, mobile app, portable device
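A few of the classic time-domain HRV metrics can be computed directly from the RR intervals, as in the Python sketch below; the abstract does not list the 19 metrics, so only a representative, commonly used subset is shown with an illustrative RR series.
```python
import numpy as np

def hrv_time_domain(rr_ms):
    """A handful of standard time-domain HRV metrics from RR intervals (ms).

    Only a representative subset of the 19 metrics mentioned above is sketched.
    """
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    return {
        "mean_RR": rr.mean(),
        "SDNN": rr.std(ddof=1),                        # overall variability
        "RMSSD": np.sqrt(np.mean(diff ** 2)),          # short-term variability
        "pNN50": 100.0 * np.mean(np.abs(diff) > 50),   # % successive diffs > 50 ms
        "mean_HR": 60_000.0 / rr.mean(),               # beats per minute
    }

# Illustrative RR series (ms) around a 75 bpm rhythm (synthetic, not patient data):
rr_example = 800 + 40 * np.random.randn(300)
print(hrv_time_domain(rr_example))
```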
Procedia PDF Downloads 184
1094 Intrinsic Motivational Factor of Students in Learning Mathematics and Science Based on Electroencephalogram Signals
Authors: Norzaliza Md. Nor, Sh-Hussain Salleh, Mahyar Hamedi, Hadrina Hussain, Wahab Abdul Rahman
Abstract:
Motivation is mainly the students' desire to be involved in the learning process. However, it also depends on the goal behind their involvement or non-involvement in academic activity. Even though students' motivation might be at the same level, the basis of their motivation may differ. This study focuses on the intrinsic motivational factor, in which students enjoy learning or feel a sense of accomplishment in the activity or study for its own sake. The intrinsic motivational factor of students in learning mathematics and science has been found difficult to achieve because it depends on students' interest. In the Programme for International Student Assessment (PISA) for mathematics and science, Malaysia is ranked third lowest. The main problem in the Malaysian educational system is that students tend to have extrinsic motivation: they have to score in exams in order to achieve good results and enroll as university students. The use of electroencephalogram (EEG) signals has been found to be scarce, especially for identifying the students' intrinsic motivational factor in learning science and mathematics. In this research study, we identify the correlation between a precursor emotion and its dynamic emotion to verify the intrinsic motivational factor of students in learning mathematics and science. The 2-D Affective Space Model (ASM) was used in order to identify the relationship between a precursor emotion and its dynamic emotion based on the four basic emotions, happy, calm, fear and sad. These four basic emotions are required as reference stimuli. An EEG device was then used to capture the brain waves, while Mel Frequency Cepstral Coefficients (MFCC) were adopted for feature extraction before the features were fed to a Multilayer Perceptron (MLP) to classify the valence and arousal axes of the ASM. The results show that the precursor emotion had an influence on the dynamic emotions and that most students have no interest in mathematics and science, according to the negative emotions (sad and fear) appearing in the EEG signals. We hope that these results can help us further relate the behavior and intrinsic motivational factor of students towards the learning of mathematics and science.
Keywords: EEG, MLP, MFCC, intrinsic motivational factor
Procedia PDF Downloads 366
1093 Microfluidic Device for Real-Time Electrical Impedance Measurements of Biological Cells
Authors: Anil Koklu, Amin Mansoorifar, Ali Beskok
Abstract:
Dielectric spectroscopy (DS) is a noninvasive, label-free technique for long-term, real-time measurements of the impedance spectra of biological cells. DS enables the characterization of cellular dielectric properties such as membrane capacitance and cytoplasmic conductivity. We have developed a lab-on-a-chip device that uses an electro-activated microwell array for loading, DS measurement, and unloading of biological cells. We utilized dielectrophoresis (DEP) to capture target cells inside the wells and release them after the DS measurement. DEP is a label-free technique that exploits differences among the dielectric properties of particles. In detail, DEP is the motion of polarizable particles suspended in an ionic solution and subjected to a spatially non-uniform external electric field. To the best of our knowledge, this is the first microfluidic chip that combines DEP and DS to analyze biological cells using electro-activated wells. Device performance was tested using two prostate cancer cell lines (RV122, PC-3). Impedance measurements were conducted at 0.2 V in the 10 kHz to 40 MHz range with 6 s time resolution. An equivalent circuit model was developed to extract the cell membrane capacitance and cell cytoplasmic conductivity from the impedance spectra. We report the time course of the variations in dielectric properties of PC-3 and RV122 cells suspended in a low conductivity buffer (LCB), which enhances dielectrophoretic and impedance responses, and their response to a sudden pH change from pH 7.3 to pH 5.8. It is shown that the microfluidic chip allowed online measurements of the dielectric properties of prostate cancer cells and the assessment of cellular-level variations under external stimuli such as different buffer conductivities and pH. Based on these data, we intend to deploy the current device for single-cell measurements by fabricating separately addressable N × N electrode platforms. Such a device will allow time-dependent dielectric response measurements for individual cells with the ability to selectively release them using negative DEP and pressure-driven flow.
Keywords: microfluidic, microfabrication, lab on a chip, AC electrokinetics, dielectric spectroscopy
Procedia PDF Downloads 151
1092 Wavelet-Based Classification of Myocardial Ischemia, Arrhythmia, Congestive Heart Failure and Sleep Apnea
Authors: Santanu Chattopadhyay, Gautam Sarkar, Arabinda Das
Abstract:
This paper presents a wavelet-based classification of various heart diseases. Electrocardiogram signals of different heart patients have been studied, and the statistical nature of electrocardiogram signals for different heart diseases has been compared with that of electrocardiograms for normal persons. In this study, four heart diseases have been considered: Myocardial Ischemia (MI), Congestive Heart Failure (CHF), Arrhythmia and Sleep Apnea. The statistical nature of the electrocardiograms for each case has been characterized in terms of the kurtosis values of two types of wavelet coefficients: approximation and detail. Nine wavelet decomposition levels have been considered in each case, and the kurtosis corresponding to both approximation and detail coefficients has been computed for decomposition levels one to nine. Based on significant differences, a few decomposition levels have been chosen and then used for classification.
Keywords: arrhythmia, congestive heart failure, discrete wavelet transform, electrocardiogram, myocardial ischemia, sleep apnea
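A hedged Python sketch of the feature-extraction step described above, kurtosis of wavelet coefficients over nine decomposition levels, is given below using PyWavelets; the wavelet family and the synthetic ECG-like signal are assumptions, and for brevity only the final-level approximation coefficients are included alongside the nine detail bands.
```python
import numpy as np
import pywt
from scipy.stats import kurtosis

def wavelet_kurtosis_features(ecg, wavelet="db4", levels=9):
    """Kurtosis of DWT coefficients over nine decomposition levels.

    The wavelet family is an assumption; the abstract does not name one.
    Only the final-level approximation is used here (a simplification of the
    per-level approximation coefficients described in the abstract).
    """
    coeffs = pywt.wavedec(ecg, wavelet, level=levels)   # [cA9, cD9, ..., cD1]
    approx, details = coeffs[0], coeffs[1:]
    features = [kurtosis(approx)]
    features.extend(kurtosis(d) for d in details)
    return np.array(features)

# Illustrative ECG-like signal (synthetic, not patient data):
fs = 360
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 63 + 0.05 * np.random.randn(t.size)
print(wavelet_kurtosis_features(ecg))
```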
Procedia PDF Downloads 134
1091 Combined Optical Coherence Microscopy and Spectrally Resolved Multiphoton Microscopy
Authors: Bjorn-Ole Meyer, Dominik Marti, Peter E. Andersen
Abstract:
A multimodal imaging system, combining spectrally resolved multiphoton microscopy (MPM) and optical coherence microscopy (OCM) is demonstrated. MPM and OCM are commonly integrated into multimodal imaging platforms to combine functional and morphological information. The MPM signals, such as two-photon fluorescence emission (TPFE) and signals created by second harmonic generation (SHG) are biomarkers which exhibit information on functional biological features such as the ratio of pyridine nucleotide (NAD(P)H) and flavin adenine dinucleotide (FAD) in the classification of cancerous tissue. While the spectrally resolved imaging allows for the study of biomarkers, using a spectrometer as a detector limits the imaging speed of the system significantly. To overcome those limitations, an OCM setup was added to the system, which allows for fast acquisition of structural information. Thus, after rapid imaging of larger specimens, navigation within the sample is possible. Subsequently, distinct features can be selected for further investigation using MPM. Additionally, by probing a different contrast, complementary information is obtained, and different biomarkers can be investigated. OCM images of tissue and cell samples are obtained, and distinctive features are evaluated using MPM to illustrate the benefits of the system.Keywords: optical coherence microscopy, multiphoton microscopy, multimodal imaging, two-photon fluorescence emission
Procedia PDF Downloads 511
1090 Electro-Discharge Drilling in Residual Stress Measurement of Annealed St.37 Steel
Authors: H. Gholami, M. Jalali Azizpour
Abstract:
For materials such as hard coatings, the stress state is difficult to obtain by the widely used high-speed hole-drilling method (ASTM Standard E837), since the drilling process itself imposes additional stresses; it is therefore important to develop a non-contact method. In this work, the through-thickness residual stress of annealed St.37 steel was investigated using electro-discharge drilling. The strain gauge and dynamic strain indicator used in all cases were the FRS-2-11 rosette type and the TML 221, respectively. The average residual stress at a depth of 320 µm was -6.47 MPa.
Keywords: HVOF, residual stress, thermal spray, WC-Co
Procedia PDF Downloads 311
1089 Device Control Using Brain Computer Interface
Authors: P. Neeraj, Anurag Sharma, Harsukhpreet Singh
Abstract:
In recent years, Brain-Computer Interface (BCI) schemes based on the steady-state Visual Evoked Potential (SSVEP) have received much attention. This study aims to develop an SSVEP-based BCI scheme that can switch a device mock-up between two distinct states, ON and OFF. In this paper, two distinct flicker frequencies in the low-frequency range were used to evoke the SSVEPs and were shown on a Liquid Crystal Display (LCD) screen using LabVIEW. Two stimulus colors, yellow and blue, were used to train the system on SSVEPs. The Electroencephalogram (EEG) signals were recorded from the occipital region. Features were extracted using the discrete wavelet transform. A well-known multilayer Neural Network Algorithm (NNA) is used to classify the SSVEP signals. When the network was trained with different algorithms, regression plot results demonstrated that with the Levenberg-Marquardt training algorithm the accuracy reached 93.9%, which is superior to the other training algorithms.
Keywords: brain computer interface, electroencephalography, steady-state visual evoked potential, wavelet transform, neural network
Procedia PDF Downloads 334
1088 Wavelets Contribution on Textual Data Analysis
Authors: Habiba Ben Abdessalem
Abstract:
The emergence of giant sets of textual data has encouraged researchers to invest in this field. The purpose of textual data analysis methods is to facilitate access to such data by providing various graphic visualizations. Applying these methods requires a corpus pretreatment step, whose standards are set according to the objective of the problem studied. This step determines the list of forms contained in the contingency table, keeping only those that carry information. It may, however, lead to noisy contingency tables, hence the use of a wavelet denoising function. The validity of the proposed approach is tested on a text database covering economic and political events in Tunisia over a well-defined period.
Keywords: textual data, wavelet, denoising, contingency table
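As a generic illustration of the denoising step described above, the Python sketch below soft-thresholds the 2-D wavelet coefficients of a noisy contingency table; the wavelet, decomposition level, and threshold rule are assumptions, since the abstract does not specify them.
```python
import numpy as np
import pywt

def denoise_contingency_table(table, wavelet="db2", level=2):
    """Soft-threshold the 2-D wavelet coefficients of a noisy contingency table.

    A generic wavelet-denoising sketch; wavelet, level, and the universal
    threshold rule are assumptions, not the paper's exact settings.
    """
    coeffs = pywt.wavedec2(table, wavelet, level=level)
    # Noise level estimated from the finest detail coefficients (MAD estimator).
    finest = np.concatenate([c.ravel() for c in coeffs[-1]])
    sigma = np.median(np.abs(finest)) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(table.size))
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(c, thr, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)

# Illustrative term-by-document count table with added noise (synthetic data):
rng = np.random.default_rng(2)
table = rng.poisson(3.0, size=(64, 64)).astype(float)
clean = denoise_contingency_table(table + rng.normal(0, 0.5, size=(64, 64)))
```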
Procedia PDF Downloads 277
1087 Undocumented Migrants on the Northern Border of Mexico: Social Imaginary, and Social Representations
Authors: César Enrique Jiménez Yañez, Yessica Martinez Soto
Abstract:
In the present work, the phenomenon of undocumented migration on the northern border of Mexico is analyzed through the graphic representation of the experience of people who migrate without documents to the United States. Thirty-three of them drew what it meant for them to migrate. Our objective is to analyze the social phenomenon of migration through the drawings of migrants, using the concepts of social imaginary and social representations and identifying the different significant elements with which they symbolically construct their experience. Drawing, as a methodological tool, will help us to understand the migratory experience beyond words.
Keywords: Mexico, social imaginary, social representations, undocumented migrants
Procedia PDF Downloads 401
1086 Genome-Wide Association Study Identify COL2A1 as a Susceptibility Gene for the Hand Development Failure of Kashin-Beck Disease
Authors: Feng Zhang
Abstract:
Kashin-Beck disease (KBD) is a chronic osteochondropathy. The mechanism of the hand growth and development failure in KBD remains elusive. In this study, we conducted a two-stage genome-wide association study (GWAS) of the palmar length-width ratio (LWR) in KBD, involving a total of 493 Chinese Han KBD patients. The Affymetrix Genome-Wide Human SNP Array 6.0 was applied for SNP genotyping. Association analysis was conducted with the PLINK software. Imputation analysis was performed with IMPUTE against the reference panel of the 1000 Genomes Project. In the GWAS, the most significant association was observed between palmar LWR and rs2071358 of the COL2A1 gene (P value = 4.68×10⁻⁸). Imputation analysis identified 3 SNPs surrounding rs2071358 with significant or suggestive association signals. The replication study observed additional significant association signals at both rs2071358 (P value = 0.017) and rs4760608 (P value = 0.002) of the COL2A1 gene after Bonferroni correction. Our results suggest that COL2A1 is a novel susceptibility gene involved in the growth and development failure of the hand in KBD.
Keywords: Kashin-Beck disease, genome-wide association study, COL2A1, hand
Procedia PDF Downloads 219
1085 Analysis and Modeling of Vibratory Signals Based on LMD for Rolling Bearing Fault Diagnosis
Authors: Toufik Bensana, Slimane Mekhilef, Kamel Tadjine
Abstract:
Vibration analysis has been established as the most common and reliable method of analysis in the field of condition monitoring and diagnostics of rotating machinery. Rolling bearings are used in a broad range of rotary machines and play a crucial role in the modern manufacturing industry. Unfortunately, the vibration signals collected from a faulty bearing are generally non-stationary and nonlinear, with strong noise interference, so it is essential to obtain the fault features correctly. In this paper, a novel numerical analysis method based on local mean decomposition (LMD) is proposed. LMD decomposes the signal into a series of product functions (PFs), each of which is the product of an envelope signal and a purely frequency-modulated (FM) signal. The envelope of a PF is the instantaneous amplitude (IA), and the instantaneous frequency (IF) is the derivative of the unwrapped phase of the purely frequency-modulated signal. After that, the fault characteristic frequency of the roller bearing can be extracted by performing spectrum analysis on the instantaneous amplitude of the PF component containing the dominant fault information. The results show the effectiveness of the proposed technique in fault detection and diagnosis of rolling element bearings.
Keywords: fault diagnosis, local mean decomposition, rolling element bearing, vibration analysis
Procedia PDF Downloads 407