Search results for: algebraic signal processing
3216 3D Object Model Reconstruction Based on Polywogs Wavelet Network Parametrization
Authors: Mohamed Othmani, Yassine Khlifi
Abstract:
This paper presents a technique for compact three-dimensional (3D) object model reconstruction using wavelet networks. It consists in transforming the input surface vertices into signals and using wavelet network parameters for signal approximation. To this end, we use a wavelet network architecture founded on several mother wavelet families. POLYnomials WindOwed with Gaussians (POLYWOG) wavelet families are used to maximize the probability of selecting the best wavelets, which ensures good generalization of the network. To achieve a better reconstruction, the network is trained over several iterations to optimize the wavelet network parameters until the error criterion is small enough. Experimental results show that our proposed technique can effectively reconstruct irregular 3D object models when using the optimized wavelet network parameters. We also show that the accuracy of the reconstruction depends on the best choice of the mother wavelets.
Keywords: 3D object, optimization, parametrization, POLYWOG wavelets, reconstruction, wavelet networks
Procedia PDF Downloads 286
3215 Decoding Kinematic Characteristics of Finger Movement from Electrocorticography Using Classical Methods and Deep Convolutional Neural Networks
Authors: Ksenia Volkova, Artur Petrosyan, Ignatii Dubyshkin, Alexei Ossadtchi
Abstract:
Brain-computer interfaces are a growing research field producing many implementations that find use in different fields for research and practical purposes. Despite the popularity of implementations using non-invasive neuroimaging methods, a radical improvement in channel bandwidth and, thus, decoding accuracy is only possible by using invasive techniques. Electrocorticography (ECoG) is a minimally invasive neuroimaging method that provides highly informative brain activity signals, effective analysis of which requires the use of machine learning methods that are able to learn representations of complex patterns. Deep learning is a family of machine learning algorithms that allow learning representations of data with multiple levels of abstraction. This study explores the potential of deep learning approaches for ECoG processing, decoding movement intentions and the perception of proprioceptive information. To obtain synchronous recording of kinematic movement characteristics and corresponding electrical brain activity, a series of experiments were carried out, during which subjects performed finger movements at their own pace. Finger movements were recorded with a three-axis accelerometer, while ECoG was synchronously registered from the electrode strips that were implanted over the contralateral sensorimotor cortex. Then, multichannel ECoG signals were used to track the finger movement trajectory characterized by the accelerometer signal. This process was carried out both causally and non-causally, using different positions of the ECoG data segment with respect to the accelerometer data stream. The recorded data were split into training and testing sets, containing continuous non-overlapping fragments of the multichannel ECoG. A deep convolutional neural network was implemented and trained, using 1-second segments of ECoG data from the training dataset as input. To assess the decoding accuracy, the correlation coefficient r between the output of the model and the accelerometer readings was computed. After optimization of hyperparameters and training, the deep learning model allowed reasonably accurate causal decoding of finger movement with a correlation coefficient of r = 0.8. In contrast, the classical Wiener-filter-like approach was able to achieve only 0.56 in the causal decoding mode. In the non-causal case, the traditional approach reached an accuracy of r = 0.69, which may be due to the presence of additional proprioceptive information. This result demonstrates that the deep neural network was able to effectively find a representation of the complex top-down information related to the actual movement rather than proprioception. The sensitivity analysis shows physiologically plausible pictures of the extent to which individual features (channel, wavelet subband) are utilized during the decoding procedure. In conclusion, the results of this study have demonstrated that a combination of a minimally invasive neuroimaging technique such as ECoG and advanced machine learning approaches allows decoding motion with high accuracy. Such a setup provides the means for control of devices with a large number of degrees of freedom as well as for exploratory studies of the complex neural processes underlying movement execution.
Keywords: brain-computer interface, deep learning, ECoG, movement decoding, sensorimotor cortex
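The abstract does not give implementation details, so the following is a minimal, hypothetical sketch of the classical (Wiener-filter-like) decoding baseline it compares against: lagged multichannel ECoG features regressed onto an accelerometer trace and scored with the correlation coefficient r. All shapes, lag counts, and regularization values are assumptions for illustration.

```python
# Hypothetical linear decoding baseline (not the paper's deep network):
# regress lagged ECoG samples onto the accelerometer signal, score with Pearson r.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import Ridge

def build_lagged_features(ecog, n_lags):
    """Stack past samples of each channel as regressors (causal decoding)."""
    n_samples, n_channels = ecog.shape
    X = np.zeros((n_samples, n_channels * n_lags))
    for lag in range(n_lags):
        X[lag:, lag * n_channels:(lag + 1) * n_channels] = ecog[:n_samples - lag]
    return X

rng = np.random.default_rng(0)
ecog = rng.standard_normal((5000, 32))        # assumed: 32 ECoG channels
finger = rng.standard_normal(5000)            # assumed: accelerometer magnitude

X = build_lagged_features(ecog, n_lags=50)
split = 4000
model = Ridge(alpha=1.0).fit(X[:split], finger[:split])
pred = model.predict(X[split:])
r, _ = pearsonr(pred, finger[split:])
print(f"correlation coefficient r = {r:.2f}")
```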
Procedia PDF Downloads 179
3214 Mean Square Responses of a Cantilever Beam with Various Damping Mechanisms
Authors: Yaping Zhao, Yimin Zhang
Abstract:
In the present paper, the stationary random vibration of a uniform cantilever beam is investigated. Two types of damping mechanism, i.e. external and internal viscous damping, are taken into account simultaneously. The excitation is support motion in the form of ideal white noise. Because the two damping mechanisms are considered concurrently, the product of the modal damping ratio and the natural frequency is no longer a constant. As a result, the infinite definite integrals encountered in computing the mean square response are more complex than those in the existing literature. One significant contribution of this work is the accurate evaluation of these definite integrals. The precise solution of the mean square response is thus finally obtained in infinite series form. Numerical examples are supplied, and the numerical outcomes confirm the validity of the theoretical analyses.
Keywords: random vibration, cantilever beam, mean square response, white noise
Procedia PDF Downloads 386
3213 Efficient Relay Selection Scheme Utilizing OVSF Code in Cooperative Communication System
Authors: Yeong-Seop Ahn, Myoung-Jin Kim, Young-Min Ko, Hyoung-Kyu Song
Abstract:
This paper proposes a relay selection scheme utilizing an orthogonal variable spreading factor (OVSF) code in cooperative communication. The relay selection scheme strongly influences the communication performance of a cooperative system. Conventional relay selection schemes, such as the best harmonic mean relay selection scheme or the threshold-based relay selection scheme, require prior knowledge such as channel state information (CSI). The proposed relay selection scheme does not require such prior information because it uses a reference signal built on the OVSF code. The simulation results show that the bit error rate (BER) performance of the proposed relay selection scheme is similar to that of the best harmonic mean relay selection scheme, which is known as one of the optimal relay selection schemes.
Keywords: cooperative communication, relay selection, OFDM, OVSF code
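The abstract does not detail how the OVSF reference signal is built; as background, the standard recursive OVSF code construction can be sketched as follows (the relay-selection logic itself is not reproduced).

```python
# Standard OVSF code-tree construction: each code c of length n spawns [c, c] and [c, -c].
import numpy as np

def ovsf_codes(spreading_factor):
    """Return all OVSF codes of the given (power-of-two) spreading factor."""
    codes = np.array([[1]])
    while codes.shape[1] < spreading_factor:
        upper = np.hstack([codes, codes])    # child 1: [c, c]
        lower = np.hstack([codes, -codes])   # child 2: [c, -c]
        codes = np.vstack([upper, lower])
    return codes

codes = ovsf_codes(8)
# Codes of the same length are mutually orthogonal:
print(codes @ codes.T)   # 8 * identity matrix
```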
Procedia PDF Downloads 640
3212 Vascular Targeted Photodynamic Therapy Monitored by Real-Time Laser Speckle Imaging
Authors: Ruth Goldschmidt, Vyacheslav Kalchenko, Lilah Agemy, Rachel Elmoalem, Avigdor Scherz
Abstract:
Vascular Targeted Photodynamic therapy (VTP) is a new modality for selective cancer treatment that leads to complete tumor ablation. A photosensitizer, a bacteriochlorophyll derivative in our case, is first administered to the patient, followed by illumination of the tumor area by a near-IR laser for its photoactivation. The photoactivated drug releases reactive oxygen species (ROS) in the circulation, which react with blood cells and the endothelium, leading to occlusion of the blood vasculature. If the blood vessels are only partially closed, the tumor may recover, and cancer cells could survive. On the other hand, excessive treatment may lead to toxicity of nearby healthy tissues. Simultaneous VTP monitoring and image processing independent of the photoexcitation laser has not yet been reported, to our knowledge. Here we present a method for blood flow monitoring using real-time laser speckle imaging (RTLSI) in the tumor during VTP. We have synthesized over the years a library of bacteriochlorophyll derivatives, among them WST11 and STL-6014. Both are water-soluble derivatives that are retained in the blood vasculature through their partial binding to HSA. WST11 has been approved in Mexico for VTP treatment of prostate cancer at a certain drug dose and time/intensity of illumination. Application to other bacteriochlorophyll derivatives or other cancers may require different treatment parameters (such as light/drug administration). VTP parameters for STL-6014 are still under study. This new derivative mainly differs from WST11 by its lack of the central palladium and its conjugation to an Arg-Gly-Asp (RGD) sequence. RGD is a tumor-specific ligand that is used for targeting the necrotic tumor domains through its affinity to αVβ3 integrin receptors. This enables the study of cell-targeted VTP. We developed a special RTLSI module, based on the LabVIEW software environment, for data processing. The new module enables the acquisition of raw laser speckle images and the calculation of the laser temporal statistics of time-integrated speckles in real time, without additional off-line processing. Using RTLSI, we could monitor the tumor’s blood flow following VTP in a CT26 colon carcinoma ear model. VTP with WST11 induced an immediate slowdown of the blood flow within the tumor and a complete final flow arrest, after some sporadic reperfusions. If the irradiation continued further, the blood flow also stopped in the blood vessels of the surrounding healthy tissue. This emphasizes the significance of light dose control. Using our RTLSI system, we could prevent additional healthy tissue damage by controlling the illumination time and restricting blood flow arrest to within the tumor only. In addition, we found that VTP with STL-6014 was most effective when the photoactivation was conducted 4 h post-injection, in terms of in-vivo tumor ablation success and blood vessel flow arrest. In conclusion, RTLSI application should allow optimization of VTP efficacy vs. toxicity in both the preclinical and clinical arenas.
Keywords: blood vessel occlusion, cancer treatment, photodynamic therapy, real time imaging
Procedia PDF Downloads 228
3211 Dys-Regulation of Immune and Inflammatory Response in in vitro Fertilization Implantation Failure Patients under Ovarian Stimulation
Authors: Amruta D. S. Pathare, Indira Hinduja, Kusum Zaveri
Abstract:
Implantation failure (IF), even after good-quality embryo transfer (ET) into a physiologically normal endometrium, is the main obstacle in in vitro fertilization (IVF). Various microarray studies have been performed worldwide to elucidate the genes requisite for endometrial receptivity. These studies have included populations based on different phases of the menstrual cycle during the natural cycle and the stimulated cycle in normal fertile women. Additionally, literature is also available on recurrent implantation failure patients versus oocyte donors in the natural cycle. However, for the first time, we aim to study the genomics of endometrial receptivity in IF patients under controlled ovarian stimulation (COS), during which ET is generally practised in IVF. Endometrial gene expression profiles of IF patients (n=10) and oocyte donors (n=8) were compared during the window of implantation under COS by whole-genome microarray (using the Illumina platform). Enrichment analysis of the microarray data was performed to determine dysregulated biological functions and pathways using the Database for Annotation, Visualization and Integrated Discovery, v6.8 (DAVID). The enrichment mapping was performed with the help of Cytoscape software. Microarray results were validated by real-time PCR. Localization of genes related to the immune response (Progestagen-Associated Endometrial Protein (PAEP), Leukaemia Inhibitory Factor (LIF), Interleukin-6 Signal Transducer (IL6ST)) was detected by immunohistochemistry. The study revealed 418 genes downregulated and 519 genes upregulated in IF patients compared to healthy fertile controls. The gene ontology, pathway analysis and enrichment mapping revealed significant downregulation of the activation and regulation of the immune and inflammatory response in IF patients under COS. The lower expression of Progestagen Associated Endometrial Protein (PAEP), Leukemia Inhibitory Factor (LIF) and Interleukin 6 Signal Transducer (IL6ST) in cases compared to controls by real-time PCR and immunohistochemistry suggests the functional importance of these genes. The study proved useful in uncovering imbalance of immune and inflammatory regulation as a probable reason for implantation failure in our group of subjects. Based on the present study findings, a panel of significantly dysregulated genes related to immune and inflammatory pathways needs to be further substantiated in a larger cohort in natural as well as stimulated cycles. These genes could then be screened in IF patients during the window of implantation (WOI) before proceeding to embryo transfer or any other immunological treatment. This would help to estimate the regulation of the specific immune response during the WOI in a patient. The appropriate treatment, either activation or suppression of the immune response, can then be attempted in IF patients to enhance the receptivity of the endometrium.
Keywords: endometrial receptivity, immune and inflammatory response, gene expression microarray, window of implantation
Procedia PDF Downloads 158
3210 Signaling Using Phase Shifting in Wi-Fi Backscatter System
Authors: Chang-Bin Ha, Young-Min Ko, Seongjoo Lee, Hyoung-Kyu Song
Abstract:
In this paper, a signaling scheme using phase shifting is proposed to improve the performance of the Wi-Fi backscatter system. Because communication in the Wi-Fi backscatter system is based on on-off modulation and impedance modulation on a per-packet basis, the data rate is very low compared to conventional wireless systems. Also, because the Wi-Fi backscatter system is based on an RF-powered device, achieving high reliability is difficult. In order to increase the low data rate, the proposed scheme transmits multiple bits of information during one packet period. Also, in order to increase the reliability, the proposed scheme shifts the phase of the signal according to the transmitted information. The simulation results show that the proposed scheme has improved throughput performance.
Keywords: phase shifting, RF-powered device, Wi-Fi backscatter system, IoT
Procedia PDF Downloads 444
3209 A Novel Image Steganography Method Based on Mandelbrot Fractal
Authors: Adnan H. M. Al-Helali, Hamza A. Ali
Abstract:
With the growth of censorship and pervasive monitoring on the Internet, steganography arises as a new means of achieving secret communication. Steganography is the art and science of embedding information within electronic media used by common applications and systems. Generally, hiding multimedia information within images will change some of their properties, which may introduce some degradation or unusual characteristics. This paper presents a new image steganography approach for hiding multimedia information (images, text, and audio) using a generated Mandelbrot fractal image as a cover. The proposed technique has been extensively tested with different images. The results show that the method is a very secure means of hiding and retrieving steganographic information. Experimental results demonstrate an effective improvement in the values of the Peak Signal to Noise Ratio (PSNR), Mean Square Error (MSE), Normalized Cross Correlation (NCC), and Image Fidelity (IF) over previous techniques.
Keywords: fractal image, information hiding, Mandelbrot set fractal, steganography
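The paper does not disclose its embedding rule, so the sketch below only illustrates the general idea under an assumed scheme: generate a Mandelbrot-set cover image, hide a payload in its least-significant bits, and report MSE/PSNR.

```python
# Illustrative only: Mandelbrot cover + plain LSB substitution (an assumption,
# not the paper's embedding method), with MSE/PSNR as quality measures.
import numpy as np

def mandelbrot_cover(h=256, w=256, max_iter=64):
    y, x = np.mgrid[-1.25:1.25:h*1j, -2.0:0.75:w*1j]
    c = x + 1j * y
    z = np.zeros_like(c)
    count = np.zeros(c.shape, dtype=np.uint16)
    for _ in range(max_iter):
        mask = np.abs(z) <= 2          # points still inside the escape radius
        z[mask] = z[mask] ** 2 + c[mask]
        count += mask
    return (255 * count / max_iter).astype(np.uint8)

cover = mandelbrot_cover()
payload_bits = np.random.randint(0, 2, size=cover.size, dtype=np.uint8)
stego = (cover & 0xFE) | payload_bits.reshape(cover.shape)   # overwrite LSB plane

mse = np.mean((cover.astype(float) - stego.astype(float)) ** 2)
psnr = 10 * np.log10(255 ** 2 / mse) if mse > 0 else float("inf")
print(f"MSE = {mse:.4f}, PSNR = {psnr:.2f} dB")
```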
Procedia PDF Downloads 622
3208 Investigating the Editing's Effect of Advertising Photos on the Virtual Purchase Decision Based on the Quantitative Electroencephalogram (EEG) Parameters
Authors: Parya Tabei, Maryam Habibifar
Abstract:
Decision-making is an important cognitive function that can be defined as the process of choosing an option among available options to achieve a specific goal. Consumer ‘need’ is the main reason for purchasing decisions. Human decision-making while buying products online is subject to various factors, one of which is the quality and effect of advertising photos. Advertising photo editing can have a significant impact on people's virtual purchase decisions. This technique helps improve the quality and overall appearance of photos by adjusting various aspects such as brightness, contrast, colors, cropping, resizing, and adding filters. By examining the effect of editing advertising photos on the virtual purchase decision using EEG data, this study investigates the effect of edited images on the decision-making of customers. A group of 30 participants were asked to react to 24 edited and unedited images while their EEG was recorded. Analysis of the EEG data revealed increased alpha wave activity in the occipital regions (O1, O2) for both edited and unedited images, which is related to visual processing and attention. Additionally, there was an increase in beta wave activity in the frontal regions (FP1, FP2, F4, F8) when participants viewed edited images, suggesting involvement in cognitive processes such as decision-making and evaluating advertising content. Gamma wave activity also increased in various regions, especially the frontal and parietal regions, which are associated with higher cognitive functions such as attention, memory, and perception, when viewing the edited images. While the visual processing reflected by alpha waves remained consistent across different visual conditions, editing advertising photos appeared to boost neural activity in frontal and parietal regions associated with decision-making processes. These findings suggest that photo editing could potentially influence consumer perceptions during virtual shopping experiences by modulating brain activity related to product assessment and purchase decisions.
Keywords: virtual purchase decision, advertising photo, EEG parameters, decision making
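As a rough illustration of how the reported band activity can be quantified from EEG, the sketch below computes Welch power spectral densities per channel and integrates them over assumed alpha, beta, and gamma bands; the sampling rate, channel count, and band edges are placeholders, not values from the study.

```python
# Per-band EEG power from Welch PSD estimates (illustrative values only).
import numpy as np
from scipy.signal import welch

fs = 250                                   # assumed sampling rate (Hz)
bands = {"alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
eeg = np.random.randn(8, 60 * fs)          # assumed: 8 channels x 60 s of data

freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)       # psd shape: (channels, freqs)
df = freqs[1] - freqs[0]
for name, (lo, hi) in bands.items():
    idx = (freqs >= lo) & (freqs < hi)
    band_power = psd[:, idx].sum(axis=1) * df        # approximate band power
    print(name, np.round(band_power, 3))
```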
Procedia PDF Downloads 56
3207 Mg AZ31B Alloy Processed through ECASD
Authors: P. Fernández-Morales, D. Peláez, C. Isaza, J. M. Meza, E. Mendoza
Abstract:
Mg AZ31B alloy sheets were processed through the equal-channel angular sheet drawing (ECASD) process, following routes A and C at room temperature and varying the processing speed. SEM was used to analyze the microstructure. The grain size was refined and the presence of twins was observed. Vickers microhardness and tensile testing were carried out to evaluate the mechanical properties, showing, in general, a remarkable increase in the first pass and slight increases during subsequent passes, and that route C produces a more uniform property distribution through the thickness of the samples.
Keywords: ECASD, Mg alloy, mechanical properties, microstructure
Procedia PDF Downloads 368
3206 Mechanical Properties of Poly(Propylene)-Based Graphene Nanocomposites
Authors: Luiza Melo De Lima, Tito Trindade, Jose M. Oliveira
Abstract:
The development of thermoplastic-based graphene nanocomposites has been of great interest not only to the scientific community but also to different industrial sectors. Due to the possible improvement of performance and weight reduction, thermoplastic nanocomposites hold great promise as a new class of materials. These nanocomposites are of relevance for the automotive industry, namely because the CO2 emission limits imposed by European Commission (EC) regulations can be met without compromising the car’s performance, by reducing its weight. Thermoplastic polymers have some advantages over thermosetting polymers, such as higher productivity, lower density, and recyclability. In the automotive industry, for example, poly(propylene) (PP) is a common thermoplastic polymer, which represents more than half of the polymeric raw material used in automotive parts. Graphene-based materials (GBM) are potential nanofillers that can improve the properties of polymer matrices at very low loading. In comparison to other composites, such as fiber-based composites, the weight reduction can positively affect their processing and future applications. However, the properties and performance of GBM/polymer nanocomposites depend on the type of GBM and polymer matrix, the degree of dispersion, and especially the type of interactions between the fillers and the polymer matrix. In order to take advantage of the superior mechanical strength of GBM, strong interfacial strength between GBM and the polymer matrix is required for efficient stress transfer from GBM to the polymer. Thus, chemical compatibilizers and physicochemical modifications have been reported as important tools during the processing of these nanocomposites. In this study, PP-based nanocomposites were obtained by a simple melt blending technique, using a Brabender-type mixer machine. Graphene nanoplatelets (GnPs) were applied as structural reinforcement. Two compatibilizers were used to improve the interaction between the PP matrix and GnPs: PP grafted with maleic anhydride (PPgMA) and PPgMA modified with tertiary amine alcohol (PPgDM). The samples for tensile and Charpy impact tests were obtained by injection molding. The results suggested that the presence of GnPs can increase the mechanical strength of the polymer. However, it was verified that the presence of GnPs can promote a decrease in impact resistance, making the nanocomposites more brittle than neat PP. The incorporation of the compatibilizers increases the impact resistance, suggesting that the compatibilizers can enhance the adhesion between PP and GnPs. Compared to neat PP, the increase in Young’s modulus of the non-compatibilized nanocomposite demonstrated that GnP incorporation can improve the stiffness of the polymer. This trend can be related to the several physical crosslinking points between the PP matrix and the GnPs. Furthermore, the decrease of strain at yield of PP/GnPs, together with the enhancement of Young’s modulus, confirms that the GnP incorporation led to an increase in stiffness but a decrease in toughness. Moreover, the results demonstrated that the incorporation of the compatibilizers did not affect Young’s modulus and strain at yield compared to the non-compatibilized nanocomposite. The incorporation of these compatibilizers showed an improvement of the nanocomposites’ mechanical properties compared both to the non-compatibilized nanocomposite and to a PP sample used as reference.
Keywords: graphene nanoplatelets, mechanical properties, melt blending processing, poly(propylene)-based nanocomposites
Procedia PDF Downloads 188
3205 Password Cracking on Graphics Processing Unit Based Systems
Authors: N. Gopalakrishna Kini, Ranjana Paleppady, Akshata K. Naik
Abstract:
Password authentication is one of the widely used methods to authenticate legitimate users of computers and to defend against attackers. There are many different ways to authenticate users of a system, and many password cracking methods have also been developed. This paper proposes how password cracking can best be performed on a CPU-GPGPU based system. The main objective of this work is to show how quickly a password can be cracked, given some knowledge of computer security and password cracking, if sufficient security is not incorporated into the system.
Keywords: GPGPU, password cracking, secret key, user authentication
Procedia PDF Downloads 293
3204 Robust and Transparent Spread Spectrum Audio Watermarking
Authors: Ali Akbar Attari, Ali Asghar Beheshti Shirazi
Abstract:
In this paper, we propose a blind and robust audio watermarking scheme based on spread spectrum in the Discrete Wavelet Transform (DWT) domain. Watermarks are embedded in the low-frequency coefficients, which are less audible. The key idea is to divide the audio signal into small frames and to modify the magnitude of the 6th-level DWT approximation coefficients based upon the Direct Sequence Spread Spectrum (DSSS) technique. In addition, a psychoacoustic model is used to enhance imperceptibility, and a Savitzky-Golay filter is used to increase extraction accuracy. The experimental results illustrate high robustness against the most common attacks, i.e. Gaussian noise addition, low-pass filtering, resampling, requantization, and MP3 compression, without significant perceptual distortion (ODG higher than -1). The proposed scheme has a data payload of about 83 bps.
Keywords: audio watermarking, spread spectrum, discrete wavelet transform, psychoacoustic, Savitzky-Golay filter
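A hedged sketch of the core embedding idea follows: each watermark bit is spread with a pseudo-random chip sequence and added to the level-6 DWT approximation coefficients of a frame, then recovered blindly by correlation. The frame length, wavelet, and embedding strength are illustrative assumptions; the psychoacoustic model and Savitzky-Golay filtering stages are omitted.

```python
# DSSS embedding in the level-6 DWT approximation band (illustrative parameters).
import numpy as np
import pywt

frame = np.random.randn(8192)                       # stand-in audio frame
coeffs = pywt.wavedec(frame, "db4", level=6)        # [cA6, cD6, ..., cD1]
cA6 = coeffs[0]

rng = np.random.default_rng(42)
chips = rng.choice([-1.0, 1.0], size=cA6.shape)     # chip sequence (shared key)
bit = 1                                             # watermark bit mapped to +/-1
alpha = 0.5 * np.mean(np.abs(cA6))                  # strength (far larger than a
                                                    # perceptually tuned value)
coeffs[0] = cA6 + alpha * (1 if bit else -1) * chips
watermarked = pywt.waverec(coeffs, "db4")

# Blind extraction: correlate the received approximation band with the chips.
cA6_rx = pywt.wavedec(watermarked[:len(frame)], "db4", level=6)[0]
decoded_bit = int(np.dot(cA6_rx, chips) > 0)
print("decoded bit:", decoded_bit)
```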
Procedia PDF Downloads 200
3203 Influence of the Local External Pressure on Measured Parameters of Cutaneous Microcirculation
Authors: Irina Mizeva, Elena Potapova, Viktor Dremin, Mikhail Mezentsev, Valeri Shupletsov
Abstract:
The local tissue perfusion is regulated by the microvascular tone, which is under the control of a number of physiological mechanisms. Laser Doppler flowmetry (LDF) together with wavelet analysis is the most commonly used technique to study the regulatory mechanisms of cutaneous microcirculation. External factors such as temperature, local pressure of the probe on the skin, etc., influence the blood flow characteristics and are used as physiological tests to evaluate microvascular regulatory mechanisms. Local probe pressure influences the microcirculation parameters measured by optical methods: diffuse reflectance spectroscopy, fluorescence spectroscopy, and LDF. Therefore, further study of probe pressure effects can be useful to improve the reliability of optical measurement. During pressure tests, the variation of the mean perfusion measured by means of LDF is usually estimated. Additional information concerning the physiological mechanisms of the vascular tone regulation system in response to local pressure can be obtained using spectral analysis of LDF samples. The aim of the present work was to develop a protocol and a data processing algorithm appropriate for studying the physiological response to the local pressure test. Involving 6 subjects (20±2 years) and providing 5 measurements for every subject, we estimated the inter-subject and inter-group variability of the response of both the averaged and oscillating parts of the LDF sample to external surface pressure. The final purpose of the work was to find special features which can further be used in wider clinical studies. The cutaneous perfusion measurements were carried out by LAKK-02 (SPE LAZMA Ltd., Russia); the skin loading was provided by an originally designed device which allows one to distribute the pressure around the LDF probe. The probe was installed on the dorsal part of the distal index finger. We collected measurements continuously for one hour and varied the loading from 0 to 180 mmHg stepwise with a step duration of 10 minutes. Further, we post-processed the samples using the wavelet transform and traced the energy of oscillations in five frequency bands over time. Weak loading leads to pressure-induced vasodilation, so one should take into account that the perfusion measured under pressure conditions will be overestimated. On the other hand, we revealed a decrease in endothelial-associated fluctuations. Further loading (88 mmHg) induces amplification of pulsations in all frequency bands. We assume that such loading leads to a higher number of closed capillaries, a higher input of arterioles in the LDF signal and, as a consequence, more vivid oscillations, which are mainly formed in arterioles. External pressure higher than 144 mmHg leads to a decrease of the oscillating components; after removing the loading, a very rapid restoration of the tissue perfusion takes place. In this work, we have demonstrated that local skin loading influences the microcirculation parameters measured by optical techniques; this should be taken into account while developing portable electronic devices. The proposed protocol of local loading allows one to evaluate pressure-induced vasodilation as well as to trace the dynamics of blood flow oscillations. This study was supported by the Russian Science Foundation under project N 18-15-00201.
Keywords: blood microcirculation, laser Doppler flowmetry, pressure-induced vasodilation, wavelet analyses blood
Procedia PDF Downloads 152
3202 A Novel Machine Learning Approach to Aid Agrammatism in Non-fluent Aphasia
Authors: Rohan Bhasin
Abstract:
Agrammatism in non-fluent aphasia cases can be defined as a language disorder wherein a patient can only use content words (nouns, verbs and adjectives) for communication, and their speech is devoid of functional word types like conjunctions and articles, generating speech with extremely rudimentary grammar. Past approaches involve speech therapy of some order, with conversation analysis used to analyse pre-therapy speech patterns and qualitative changes in conversational behaviour after therapy. We describe a novel method to generate functional words (prepositions, articles) around content words (nouns, verbs and adjectives) using a combination of Natural Language Processing and Deep Learning algorithms. The approach the paper investigates is a sequence-to-sequence (seq2seq) or LSTM model, which takes in a sequence of inputs and outputs a sequence. This approach needs a significant amount of training data, with each training example containing pairs such as (content words, complete sentence). We generate such data by starting with complete sentences from a text source and removing functional words to get just the content words. However, this approach requires a lot of training data to produce coherent output. The assumption of this approach is that the content words received in the inputs are preserved, i.e., they are not altered after the functional grammar is slotted in. This is a potential limitation for cases of severe agrammatism where such order might not be inherently correct. The applications of this approach can be used to assist communication in mild agrammatism in non-fluent aphasia cases. Thus, by generating these function words around the content words, we can provide meaningful sentence options to the patient for articulate conversations. Our project therefore translates the use case of generating sentences from content-specific words into an assistive technology for non-fluent aphasia patients.
Keywords: aphasia, expressive aphasia, assistive algorithms, neurology, machine learning, natural language processing, language disorder, behaviour disorder, sequence to sequence, LSTM
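A small sketch of the data-generation step described above: complete sentences are stripped of function words to form (content words, complete sentence) training pairs. The stop-word list here is a tiny illustrative subset, not the one used in the project.

```python
# Build seq2seq training pairs by removing function words from full sentences.
FUNCTION_WORDS = {
    "a", "an", "the", "and", "or", "but", "of", "in", "on", "at",
    "to", "for", "with", "is", "are", "was", "were", "be", "been",
}

def make_training_pair(sentence: str):
    tokens = sentence.lower().split()
    content_words = [t for t in tokens if t not in FUNCTION_WORDS]
    return " ".join(content_words), sentence   # (model input, model target)

source, target = make_training_pair("The dog is running in the park")
print(source)   # "dog running park"
print(target)   # "The dog is running in the park"
```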
Procedia PDF Downloads 166
3201 An Approach for Modeling CMOS Gates
Authors: Spyridon Nikolaidis
Abstract:
A modeling approach for CMOS gates is presented based on the use of the equivalent inverter. A new model for the inverter has been developed using a simplified transistor current model which incorporates the nanoscale effects of planar technology. Parametric expressions for the output voltage are provided, as well as the values of the output and supply currents, to be compatible with CCS technology. The model is parametric with respect to the input signal slew, output load, transistor widths, supply voltage, temperature and process. The transistor widths of the equivalent inverter are determined by HSPICE simulations, and parametric expressions are developed for them using a fitting procedure. Results for the NAND gate show that the proposed approach offers sufficient accuracy, with an average error in propagation delay of about 5%.
Keywords: CMOS gate modeling, inverter modeling, transistor current mode, timing model
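The fitting procedure mentioned above can be sketched as follows, assuming a simple bilinear delay expression in input slew and output load fitted to HSPICE-like characterization points; the actual parametric expressions of the model are not reproduced here.

```python
# Fit an assumed parametric delay expression to synthetic characterization data.
import numpy as np
from scipy.optimize import curve_fit

def delay_model(X, d0, a, b, c):
    """Assumed form: delay = d0 + a*slew + b*load + c*slew*load."""
    slew, load = X
    return d0 + a * slew + b * load + c * slew * load

# Synthetic characterization table (slew in ps, load in fF, delay in ps)
slew = np.array([10, 10, 50, 50, 100, 100, 200, 200], dtype=float)
load = np.array([1, 10, 1, 10, 1, 10, 1, 10], dtype=float)
delay = 5 + 0.08 * slew + 2.0 * load + 0.01 * slew * load \
        + np.random.normal(0, 0.2, slew.size)

params, _ = curve_fit(delay_model, (slew, load), delay)
print("fitted coefficients:", np.round(params, 3))
```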
Procedia PDF Downloads 424
3200 Optimizing Approach for Sifting Process to Solve a Common Type of Empirical Mode Decomposition Mode Mixing
Authors: Saad Al-Baddai, Karema Al-Subari, Elmar Lang, Bernd Ludwig
Abstract:
Empirical mode decomposition (EMD), a new data-driven technique for time-series decomposition, has the advantage of not assuming that a time series is linear or stationary, as is implicitly assumed in Fourier decomposition. However, EMD suffers from the mode mixing problem in some cases. The aim of this paper is to present a solution for a common type of signal causing the EMD mode mixing problem, namely a signal suffering from intermittency. On an artificial example, the solution shows superior performance in terms of coping with the EMD mode mixing problem compared with conventional EMD and Ensemble Empirical Mode Decomposition (EEMD). Furthermore, the over-sifting problem is also completely avoided, and the computational load is reduced roughly six times compared with EEMD at an ensemble number of 50.
Keywords: empirical mode decomposition (EMD), mode mixing, sifting process, over-sifting
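For reference, one sifting iteration of conventional EMD can be sketched as below: cubic-spline envelopes through the local maxima and minima are averaged and subtracted from the signal. Stopping criteria, boundary handling, and the intermittency fix proposed in the paper are deliberately omitted.

```python
# One conventional sifting iteration (envelope mean removal); not the paper's variant.
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_once(x, t):
    max_idx = argrelextrema(x, np.greater)[0]
    min_idx = argrelextrema(x, np.less)[0]
    if len(max_idx) < 2 or len(min_idx) < 2:
        return x                                   # too few extrema to sift
    upper = CubicSpline(t[max_idx], x[max_idx])(t)
    lower = CubicSpline(t[min_idx], x[min_idx])(t)
    return x - (upper + lower) / 2.0               # candidate IMF component

t = np.linspace(0, 1, 1000)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
h = sift_once(x, t)
for _ in range(9):                                 # a few more sifting passes
    h = sift_once(h, t)
```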
Procedia PDF Downloads 399
3199 From Industry 4.0 to Agriculture 4.0: A Framework to Manage Product Data in Agri-Food Supply Chain for Voluntary Traceability
Authors: Angelo Corallo, Maria Elena Latino, Marta Menegoli
Abstract:
The agri-food value chain involves various stakeholders with different roles. All of them abide by national and international rules and leverage marketing strategies to advance their products. Food products and their related processing phases carry with them a large amount of data that are often not used to inform the final customer. Some data, if suitably identified and used, can enhance the single company and/or the whole supply chain, creating a match between marketing techniques and voluntary traceability strategies. Moreover, as of late, buying models have changed: the customer is attentive to wellbeing and food quality. Food citizenship and food democracy were born, leveraging transparency, sustainability and food information needs. The Internet of Things (IoT) and Analytics, some of the innovative technologies of Industry 4.0, have a significant impact on the market and will act as a main thrust towards a genuine ‘4.0 change’ for agriculture. However, realizing a traceability system is not simple because of the complexity of the agri-food supply chain, the many actors involved, different business models, environmental variations impacting products and/or processes, and extraordinary climate changes. In order to support companies involved in a traceability path, a Framework to Manage Product Data in the Agri-Food Supply Chain for Voluntary Traceability was conceived, starting from business model analysis and the related business processes. Studying each process task and leveraging modeling techniques made it possible to identify the information held by different actors along the agri-food supply chain. IoT technologies for data collection and Analytics techniques for data processing supply information useful for increasing intra-company efficiency and competitiveness in the market. All the recovered information can be presented through IT solutions and mobile applications, making it accessible to the company, the entire supply chain and the consumer, with a view to guaranteeing transparency and quality.
Keywords: agriculture 4.0, agri-food supply chain, industry 4.0, voluntary traceability
Procedia PDF Downloads 148
3198 Robust Features for Impulsive Noisy Speech Recognition Using Relative Spectral Analysis
Authors: Hajer Rahali, Zied Hajaiej, Noureddine Ellouze
Abstract:
The goal of speech parameterization is to extract the relevant information about what is being spoken from the audio signal. In speech recognition systems, Mel-Frequency Cepstral Coefficients (MFCC) and Relative Spectral Mel-Frequency Cepstral Coefficients (RASTA-MFCC) are the two main techniques used. This paper presents some modifications to the original MFCC method. In our work, the effectiveness of the proposed changes to MFCC, called Modified Function Cepstral Coefficients (MODFCC), was tested and compared against the original MFCC and RASTA-MFCC features. Prosodic features such as jitter and shimmer are added to the baseline spectral features. The above-mentioned techniques were tested with impulsive signals under various noisy conditions within the AURORA databases.
Keywords: auditory filter, impulsive noise, MFCC, prosodic features, RASTA filter
Procedia PDF Downloads 427
3197 Technical and Practical Aspects of Sizing an Autonomous PV System
Authors: Abdelhak Bouchakour, Mustafa Brahami, Layachi Zaghba
Abstract:
The use of photovoltaic energy offers an inexhaustible, clean and non-polluting supply of energy, which is a definite advantage. The geographical location of Algeria favors the use of this energy, given the intensity of the radiation received and the duration of sunshine. For this reason, the objective of our work is to develop a software tool for the calculation and optimization of the sizing of photovoltaic installations. Our optimization approach is based on mathematical models which, amongst other things, describe the operation of each part of the installation, the energy production, the storage and the consumption of energy.
Keywords: solar panel, solar radiation, inverter, optimization
Procedia PDF Downloads 612
3196 Evaluating the Feasibility of Magnetic Induction to Cross an Air-Water Boundary
Authors: Mark Watson, J.-F. Bousquet, Adam Forget
Abstract:
A magnetic induction based underwater communication link is evaluated using an analytical model and a custom Finite-Difference Time-Domain (FDTD) simulation tool. The analytical model is based on the Sommerfeld integral, and a full-wave simulation tool evaluates Maxwell’s equations using the FDTD method in cylindrical coordinates. The analytical model and FDTD simulation tool are then compared and used to predict the system performance for various transmitter depths and optimum frequencies of operation. To this end, the system bandwidth, signal-to-noise ratio, and the magnitude of the induced voltage are used to estimate the expected channel capacity. The models show that in seawater, relatively low-power and small coils may be capable of obtaining a throughput of 40 to 300 kbps for the case where the transmitter is at depths of 1 to 3 m and the receiver is at a height of 1 m.
Keywords: magnetic induction, FDTD, underwater communication, Sommerfeld
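As a back-of-the-envelope check of the reported throughput range, the Shannon capacity formula can be evaluated for assumed bandwidth and SNR values; the actual figures in the study come from the Sommerfeld and FDTD channel models.

```python
# Shannon capacity estimate for assumed (illustrative) bandwidth and SNR values.
import math

def shannon_capacity(bandwidth_hz, snr_db):
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

print(f"{shannon_capacity(10e3, 12) / 1e3:.1f} kbps")   # e.g. 10 kHz bandwidth, 12 dB SNR
```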
Procedia PDF Downloads 126
3195 Creating Energy Sustainability in an Enterprise
Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala
Abstract:
As we enter the new era of Artificial Intelligence (AI) and Cloud Computing, we mostly rely on the machine learning and natural language processing capabilities of AI, and on energy-efficient hardware and software devices, in almost every industry sector. In these industry sectors, much emphasis is on developing new and innovative methods for producing and conserving energy and limiting the depletion of natural resources. The core pillars of sustainability are economic, environmental, and social, informally referred to as the 3 P's (People, Planet and Profits). The 3 P's play a vital role in creating a core sustainability model in the enterprise. Natural resources are continually being depleted, so there is more focus on and growing demand for renewable energy. With this growing demand, there is also a growing concern in many industries about how to reduce carbon emissions and conserve natural resources while adopting sustainability in corporate business models and policies. In our paper, we discuss the driving forces such as climate change, natural disasters, pandemics, disruptive technologies, corporate policies, scaled business models, and emerging social media and AI platforms that influence the three main pillars of sustainability (3 P's). Through this paper, we would like to bring an overall perspective on enterprise strategies, with a primary focus on bringing about cultural shifts in adopting energy-efficient operational models. Overall, many industries across the globe are incorporating core sustainability principles such as reducing energy costs, reducing greenhouse gas (GHG) emissions, reducing waste and increasing recycling, adopting advanced monitoring and metering infrastructure, and reducing server footprint and compute resources (shared IT services, cloud computing, and application modernization) with the vision of a sustainable environment.
Keywords: climate change, pandemic, disruptive technology, government policies, business model, machine learning and natural language processing, AI, social media platform, cloud computing, advanced monitoring, metering infrastructure
Procedia PDF Downloads 114
3194 A New Dual Forward Affine Projection Adaptive Algorithm for Speech Enhancement in Airplane Cockpits
Authors: Djendi Mohmaed
Abstract:
In this paper, we propose a dual adaptive algorithm based on the combination of the forward blind source separation (FBSS) structure and the affine projection algorithm (APA). The proposed algorithm combines the advantages of the source separation properties of the FBSS structure and the fast convergence characteristics of the APA algorithm. The proposed algorithm needs two noisy observations to provide an enhanced speech signal. This process is done in a blind manner, without the need for any a priori information about the source signals. The proposed dual forward blind source separation affine projection algorithm is denoted DFAPA and is used for the first time in an airplane cockpit context to enhance communication from and to the airplane. Intensive experiments were carried out to evaluate the performance of the proposed DFAPA algorithm.
Keywords: adaptive algorithm, speech enhancement, system mismatch, SNR
Procedia PDF Downloads 139
3193 Automatic Identification of Pectoral Muscle
Authors: Ana L. M. Pavan, Guilherme Giacomini, Allan F. F. Alves, Marcela De Oliveira, Fernando A. B. Neto, Maria E. D. Rosa, Andre P. Trindade, Diana R. De Pina
Abstract:
Mammography is a worldwide imaging modality used to diagnose breast cancer, even in asymptomatic women. Due to its wide availability, mammograms can be used to measure breast density and to predict cancer development. Women with increased mammographic density have a four- to sixfold increase in their risk of developing breast cancer. Therefore, studies have been made to accurately quantify mammographic breast density. In clinical routine, radiologists perform image evaluations through the BIRADS (Breast Imaging Reporting and Data System) assessment. However, this method has inter- and intra-individual variability. An automatic objective method to measure breast density could relieve the radiologist’s workload by providing a preliminary opinion. However, the pectoral muscle is a high-density tissue with characteristics similar to those of fibroglandular tissues. It is consequently hard to automatically quantify mammographic breast density. Therefore, a pre-processing step is needed to segment the pectoral muscle, which may otherwise be erroneously quantified as fibroglandular tissue. The aim of this work was to develop an automatic algorithm to segment and extract the pectoral muscle in digital mammograms. The database consisted of thirty medio-lateral oblique incidence digital mammograms from São Paulo Medical School. This study was developed with ethical approval from the authors’ institutions and national review panels under protocol number 3720-2010. An algorithm was developed, on the Matlab® platform, for the pre-processing of images. The algorithm uses image processing tools to automatically segment and extract the pectoral muscle from mammograms. Firstly, a thresholding technique was applied to remove non-biological information from the image. Then, the Hough transform is applied to find the boundary of the pectoral muscle, followed by the active contour method, whose seed is placed on the pectoral muscle boundary found by the Hough transform. An experienced radiologist also manually performed the pectoral muscle segmentation. Both methods, manual and automatic, were compared using the Jaccard index and Bland-Altman statistics. The comparison between the manual and the developed automatic method presented a Jaccard similarity coefficient greater than 90% for all analyzed images, showing the efficiency and accuracy of segmentation of the proposed method. The Bland-Altman statistics compared both methods in relation to the area (mm²) of the segmented pectoral muscle. The statistics showed data within the 95% confidence interval, enhancing the accuracy of segmentation compared to the manual method. Thus, the method proved to be accurate and robust, segmenting rapidly and free from intra- and inter-observer variability. It is concluded that the proposed method may be used reliably to segment the pectoral muscle in digital mammography in clinical routine. The segmentation of the pectoral muscle is very important for further quantification of the fibroglandular tissue volume present in the breast.
Keywords: active contour, fibroglandular tissue, Hough transform, pectoral muscle
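A hedged sketch of the described pipeline follows, using scikit-image rather than the Matlab® implementation of the paper: Otsu thresholding, a Hough-line estimate of the pectoral boundary, and an active contour seeded on that line. The input is a synthetic placeholder array and all parameter values are illustrative.

```python
# Threshold -> Hough line -> active contour, on a synthetic stand-in "mammogram".
import numpy as np
from skimage.filters import threshold_otsu
from skimage.feature import canny
from skimage.transform import hough_line, hough_line_peaks
from skimage.segmentation import active_contour

# Synthetic stand-in for an MLO mammogram: a bright triangular "pectoral" corner
rows, cols = 512, 512
rr, cc = np.mgrid[0:rows, 0:cols]
mammogram = np.where(cc < 0.6 * (rows - rr), 0.9, 0.2) + 0.05 * np.random.rand(rows, cols)

# 1) Thresholding to separate high-density tissue from the background
mask = mammogram > threshold_otsu(mammogram)
print("high-density pixels:", mask.sum())

# 2) Rough pectoral boundary as the strongest Hough line of the edge map
edges = canny(mammogram, sigma=3)
h, theta, d = hough_line(edges)
_, angles, dists = hough_line_peaks(h, theta, d, num_peaks=1)

# 3) Active contour (snake) seeded along the detected line, refined on the image
snake_rows = np.linspace(0, rows - 1, 200)
snake_cols = (dists[0] - snake_rows * np.sin(angles[0])) / np.cos(angles[0])
init = np.stack([snake_rows, np.clip(snake_cols, 0, cols - 1)], axis=1)   # (row, col)
snake = active_contour(mammogram, init, alpha=0.01, beta=1.0, gamma=0.01)
print("refined boundary points:", snake.shape)
```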
Procedia PDF Downloads 352
3192 Emotional State and Cognitive Workload during a Flight Simulation: Heart Rate Study
Authors: Damien Mouratille, Antonio R. Hidalgo-Muñoz, Nadine Matton, Yves Rouillard, Mickael Causse, Radouane El Yagoubi
Abstract:
Background: The monitoring of the physiological activity related to mental workload (MW) in pilots will be useful to improve aviation safety by anticipating human performance degradation. The electrocardiogram (ECG) can reveal MW fluctuations due to either cognitive workload and/or emotional state, since this measure exhibits autonomic nervous system modulations. Arguably, heart rate (HR) is one of its most intuitive and reliable parameters. It would be particularly interesting to analyze the interaction between cognitive requirements and emotion in ecological settings such as a flight simulator. This study aims to explore, by means of HR, the relation between cognitive demands and emotional activation. Presumably, the effects of cognition and emotion overloads are not necessarily cumulative. Methodology: Eight healthy volunteers in possession of the Private Pilot License were recruited (male; 20.8±3.2 years). The ECG signal was recorded throughout the whole experiment by placing two electrodes on the clavicle and left pectoral of the participants. The HR was computed within 4-minute segments. NASA-TLX and Big Five inventories were used to assess subjective workload and to consider the influence of individual personality differences. The experiment consisted of completing two dual-tasks of approximately 30 minutes each in an AL50 flight simulator. Each dual-task required the simultaneous accomplishment of both a pre-established flight plan and an additional task based on target stimulus discrimination inserted between Air Traffic Control instructions. This secondary task allowed us to vary the cognitive workload from low (LC) to high (HC) levels by combining auditory and visual numerical stimuli to be responded to according to specific criteria. Regarding the emotional condition, the two dual-tasks were designed to assure analogous difficulty in terms of solicited cognitive demands. The former was realized by the pilot alone, i.e. the low arousal (LA) condition. In contrast, the latter generated high arousal (HA), since the pilot was supervised by two evaluators, filmed and involved in a mock competition with the rest of the participants. Results: Performance on the secondary task showed significantly faster reaction times (RT) for the HA compared to the LA condition (p=.003). Moreover, faster RT was found for LC compared to HC (p < .001). No interaction was found. Concerning the HR measure, despite the lack of main effects, an interaction between emotion and cognition was evidenced (p=.028). Post hoc analysis showed a smaller HR for the HA compared to the LA condition only for LC (p=.049). Conclusion: The control of an aircraft is a very complex task involving strong cognitive demands and depends on the emotional state of pilots. According to the behavioral data, the experimental setup permitted the generation of satisfactorily different emotional and cognitive levels. As suggested by the interaction found in the HR measure, these two factors do not seem to have a cumulative impact on the sympathetic nervous system. Apparently, a low cognitive workload makes pilots more sensitive to emotional variations. These results hint at the independence of data processing and emotional regulation. Further physiological data are necessary to confirm and disentangle this relation. This procedure may be useful for objectively monitoring pilots’ mental workload.
Keywords: cognitive demands, emotion, flight simulator, heart rate, mental workload
Procedia PDF Downloads 275
3191 Occupational Heat Stress Condition According to Wet Bulb Globe Temperature Index in Textile Processing Unit: A Case Study of Surat, Gujarat, India
Authors: Dharmendra Jariwala, Robin Christian
Abstract:
Thermal exposure is a common problem in every manufacturing industry where heat is used in the manufacturing process. In developing countries like India, a lack of awareness regarding proper work environment conditions is observed among workers. Improper planning of the factory building, the arrangement of machinery, the ventilation system, etc., plays a vital role in the rise of temperature within the manufacturing areas. Due to uncontrolled thermal stress, workers may be subjected to various heat illnesses, from mild disorders to heat stroke. Heat stress is responsible for health risks and a reduction in production. The Wet Bulb Globe Temperature (WBGT) index and relative humidity are used to evaluate heat stress conditions. The WBGT index is a weighted average of the natural wet bulb temperature, globe temperature and dry bulb temperature, which are measured with the standard instrument QuestTemp 36 area stress monitor. In this study, textile processing units were selected in an industrial estate in Surat city. Based on the manufacturing process, six locations were identified within each plant at which processing was undertaken at 120°C to 180°C. These locations were the jet dyeing machine area, stenter machine area, printing machine area, looping machine area and washing area, which generate process heat; the office area was also selected as a sixth location for comparison purposes. The present study was conducted in the winter and summer seasons for day and night shifts. The results show that the average WBGT index was above the Threshold Limit Value (TLV) during the summer season for day and night shifts in all three industries except the office area. During the summer season, the highest WBGT index of 32.8°C was found during the day shift and 31.5°C during the night shift, both at the printing machine area. During the winter season, the highest WBGT indices of 30°C and 29.5°C were found at the printing machine area during the day and night shifts, respectively.
Keywords: relative humidity, textile industry, thermal stress, WBGT
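For reference, the standard WBGT weightings (ISO 7243) are sketched below; the abstract states only that WBGT is a weighted average of the three temperatures, so which variant was applied at the indoor locations is an assumption here.

```python
# Standard WBGT weightings (ISO 7243); temperatures in degrees Celsius.
def wbgt_indoor(t_nwb, t_g):
    """Indoor / no solar load: WBGT = 0.7*Tnwb + 0.3*Tg."""
    return 0.7 * t_nwb + 0.3 * t_g

def wbgt_outdoor(t_nwb, t_g, t_db):
    """Outdoor with solar load: WBGT = 0.7*Tnwb + 0.2*Tg + 0.1*Tdb."""
    return 0.7 * t_nwb + 0.2 * t_g + 0.1 * t_db

print(wbgt_indoor(29.0, 37.0))   # 31.4 degC, above a typical TLV for heavy work
```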
Procedia PDF Downloads 176
3190 Emotional Processing Difficulties in Recovered Anorexia Nervosa Patients: State or Trait
Authors: Telma Fontao de Castro, Kylee Miller, Maria Xavier Araújo, Isabel Brandao, Sandra Torres
Abstract:
Objective: There is a dearth of research investigating the long-term emotional functioning of individuals recovered from anorexia nervosa (AN). This 15-year longitudinal study aimed to examine whether difficulties in the cognitive processing of emotions persisted after long-term AN recovery and their link to anxiety and depression. Method: Twenty-four females, who were tested longitudinally during their acute and recovered AN phases, and 24 healthy control (HC) women were screened for anxiety, depression, alexithymia, and emotion regulation difficulties (ER; only assessed in the recovery phase). Results: Anxiety, depression, and alexithymia levels decreased significantly with AN recovery. However, scores on anxiety and difficulty in identifying feelings (an alexithymia factor) remained high when compared to the HC group. Scores on emotion regulation difficulties were also lower in the HC group. The abovementioned differences between the AN recovered group and the HC group in difficulties in identifying and accepting feelings and lack of emotional clarity were no longer present when the effect of anxiety and depression was controlled. Conclusions: Findings suggest that emotional dysfunction tends to decrease in the AN recovered phase. However, using an HC group as a reference, we conclude that several emotional difficulties are still increased after long-term AN recovery, in particular limited access to emotion regulation strategies and difficulty controlling impulses and engaging in goal-directed behavior, thus suggesting a trait vulnerability. In turn, competencies related to emotional clarity and acceptance of emotional responses seem to be state-dependent phenomena linked to anxiety and depression. In sum, managing emotions remains a challenge for individuals recovered from AN. Under this circumstance, maladaptive eating behavior can serve an affect-regulatory function, increasing the risk of relapse. Emotional education and stabilization of depressive and anxious symptomatology after recovery emerge as important avenues to protect against long-term AN relapse.
Keywords: alexithymia, anorexia nervosa, emotion recognition, emotion regulation
Procedia PDF Downloads 124
3189 Monitoring the Effect of Doxorubicin Liposomal in VX2 Tumor Using Magnetic Resonance Imaging
Authors: Ren-Jy Ben, Jo-Chi Jao, Chiu-Ya Liao, Ya-Ru Tsai, Lain-Chyr Hwang, Po-Chou Chen
Abstract:
Cancer is still one of the serious diseases threatening the lives of human beings. How to achieve early diagnosis and effective treatment of tumors is a very important issue. The animal carcinoma model can provide a simulation tool for the study of pathogenesis, biological characteristics and therapeutic effects. Recently, drug delivery systems have been rapidly developed to effectively improve therapeutic effects. The liposome plays an increasingly important role in clinical diagnosis and therapy by delivering a pharmaceutical or contrast agent to the targeted sites. Liposomes can be absorbed and excreted by the human body and are well known to cause it no harm. This study aimed to compare the therapeutic effects between an encapsulated (doxorubicin liposomal, LipoDox) and an un-encapsulated (doxorubicin, Dox) anti-tumor drug using Magnetic Resonance Imaging (MRI). Twenty-four New Zealand rabbits implanted with VX2 carcinoma in the left thigh were classified into three groups: a control group (untreated), a Dox-treated group and a LipoDox-treated group, with 8 rabbits in each group. MRI scans were performed three days after tumor implantation. A 1.5T GE Signa HDxt whole-body MRI scanner with a high-resolution knee coil was used in this study. After a 3-plane localizer scan was performed, three-dimensional (3D) fast spin echo (FSE) T2-weighted imaging (T2WI) was used for tumor volumetric quantification, and two-dimensional (2D) spoiled gradient recalled echo (SPGR) dynamic contrast-enhanced (DCE) MRI was used for tumor perfusion evaluation. DCE-MRI was designed to acquire four baseline images, followed by injection of the contrast agent Gd-DOTA through the ear vein of the rabbits. Afterwards, a series of 32 images were acquired to observe the signal change over time in the tumor and muscle. The MRI scanning was scheduled on a weekly basis for a period of four weeks to observe the tumor progression longitudinally. The Dox and LipoDox treatments were prescribed 3 times in the first week immediately after VX2 tumor implantation. ImageJ was used to quantitate the tumor volume and the time-course signal enhancement on DCE images. The changes in tumor size showed that the growth of VX2 tumors was effectively inhibited for both the LipoDox-treated and Dox-treated groups. Furthermore, the tumor volume of the LipoDox-treated group was significantly lower than that of the Dox-treated group, which implies that LipoDox has a better therapeutic effect than Dox. The signal intensity of the LipoDox-treated group was significantly lower than that of the other two groups, which implies that the targeted therapeutic drug remained in the tumor tissue. This study provides a radiation-free and non-invasive MRI method for therapeutic monitoring of targeted liposomes in an animal tumor model.
Keywords: doxorubicin, dynamic contrast-enhanced MRI, lipodox, magnetic resonance imaging, VX2 tumor model
Procedia PDF Downloads 460
3188 Joint Discrete Hartley Transform-Clipping for Peak to Average Power Ratio Reduction in Orthogonal Frequency Division Multiplexing System
Authors: Selcuk Comlekci, Mohammed Aboajmaa
Abstract:
Orthogonal frequency division multiplexing (OFDM) is a promising technique for modern wireless communication systems due to its robustness against multipath environments. The high peak-to-average power ratio (PAPR) of the transmitted signal is one of the major drawbacks of the OFDM system; a high PAPR degrades the bit error rate (BER) performance and affects the linear characteristics of the high power amplifier (HPA). In this paper, we propose the DHT-Clipping reduction technique, which reduces the high PAPR by combining the discrete Hartley transform (DHT) and clipping techniques. The simulation results show that the DHT-Clipping technique offers better PAPR reduction than either DHT or clipping alone, and that DHT-Clipping also provides better BER performance than clipping.
Keywords: ISI, cyclic prefix, BER, PAPR, HPA, DHT, subcarrier
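A minimal sketch of PAPR measurement and amplitude clipping for a single OFDM symbol is given below; the DHT precoding stage of the proposed DHT-Clipping scheme is not shown, and the subcarrier count and clipping ratio are illustrative assumptions.

```python
# PAPR of an OFDM symbol before and after amplitude clipping (illustrative parameters).
import numpy as np

def papr_db(x):
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

n_subcarriers = 256
symbols = (np.random.choice([-1, 1], n_subcarriers)
           + 1j * np.random.choice([-1, 1], n_subcarriers)) / np.sqrt(2)   # QPSK
ofdm_time = np.fft.ifft(symbols) * np.sqrt(n_subcarriers)

clip_ratio = 1.4                                   # clipping level / RMS amplitude
rms = np.sqrt(np.mean(np.abs(ofdm_time) ** 2))
threshold = clip_ratio * rms
mag = np.abs(ofdm_time)
# Limit the magnitude while preserving the phase of each sample
clipped = ofdm_time * np.minimum(1.0, threshold / np.maximum(mag, 1e-12))

print(f"PAPR before: {papr_db(ofdm_time):.2f} dB, after: {papr_db(clipped):.2f} dB")
```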
Procedia PDF Downloads 440
3187 Music Genre Classification Based on Non-Negative Matrix Factorization Features
Authors: Soyon Kim, Edward Kim
Abstract:
In order to retrieve information from the massive stream of songs in the music industry, music search by title, lyrics, artist, mood, and genre has become more important. Despite the subjectivity and controversy over the definition of music genres across different nations and cultures, automatic genre classification systems that facilitate the process of music categorization have been developed. Manual genre selection by music producers is provided as statistical data for designing automatic genre classification systems. In this paper, an automatic music genre classification system utilizing non-negative matrix factorization (NMF) is proposed. Short-term characteristics of the music signal can be captured based on timbre features such as the mel-frequency cepstral coefficient (MFCC), decorrelated filter bank (DFB), octave-based spectral contrast (OSC), and octave band sum (OBS). Long-term time-varying characteristics of the music signal can be summarized with (1) statistical features such as the mean, variance, minimum, and maximum of the timbre features and (2) modulation spectrum features such as the spectral flatness measure, spectral crest measure, spectral peak, spectral valley, and spectral contrast of the timbre features. In this work, not only these conventional basic long-term feature vectors but also NMF-based feature vectors are proposed to be used together for genre classification. In the training stage, NMF basis vectors were extracted for each genre class. The NMF features were calculated in the log spectral magnitude domain (NMF-LSM) as well as in the basic feature vector domain (NMF-BFV). For NMF-LSM, the entire full-band spectrum was used. However, for NMF-BFV, only the low-band spectrum was used, since the high-frequency modulation spectrum of the basic feature vectors did not contain important information for genre classification. In the test stage, using the set of pre-trained NMF basis vectors, the genre classification system extracted the NMF weighting values of each genre as the NMF feature vectors. A support vector machine (SVM) was used as a classifier. The GTZAN multi-genre music database was used for training and testing. It is composed of 10 genres and 100 songs for each genre. To increase the reliability of the experiments, 10-fold cross-validation was used. For a given input song, an extracted NMF-LSM feature vector was composed of 10 weighting values that corresponded to the classification probabilities for the 10 genres. An NMF-BFV feature vector also had a dimensionality of 10. Combined with the basic long-term features such as the statistical features and modulation spectrum features, the NMF features provided increased accuracy with a slight increase in feature dimensionality. The conventional basic features by themselves yielded 84.0% accuracy, but the basic features with NMF-LSM and NMF-BFV provided 85.1% and 84.2% accuracy, respectively. The basic features required a dimensionality of 460, whereas NMF-LSM and NMF-BFV each required a dimensionality of only 10. Combining the basic features, NMF-LSM and NMF-BFV together with an SVM with a radial basis function (RBF) kernel produced a significantly higher classification accuracy of 88.3% with a feature dimensionality of 480.
Keywords: mel-frequency cepstral coefficient (MFCC), music genre classification, non-negative matrix factorization (NMF), support vector machine (SVM)
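A hedged sketch of the NMF feature idea described above: learn a small non-negative basis per genre, represent each song by its non-negative weights over the stacked bases, and classify the weight vectors with an RBF-kernel SVM. Data, dimensions, and component counts are placeholders rather than the paper's settings.

```python
# Per-genre NMF bases -> non-negative weight features -> RBF SVM (illustrative data).
import numpy as np
from scipy.optimize import nnls
from sklearn.decomposition import NMF
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_genres, songs_per_genre, dim = 10, 100, 60
X = rng.random((n_genres * songs_per_genre, dim))            # non-negative features
y = np.repeat(np.arange(n_genres), songs_per_genre)

# 1) One NMF basis vector per genre, stacked as columns (gives 10-D weight features)
bases = []
for g in range(n_genres):
    model = NMF(n_components=1, init="nndsvda", max_iter=500)
    model.fit(X[y == g])
    bases.append(model.components_)                          # shape (1, dim)
B = np.vstack(bases).T                                       # shape (dim, n_genres)

# 2) NMF weight vector per song via non-negative least squares against the bases
W = np.array([nnls(B, x)[0] for x in X])                     # shape (songs, n_genres)

# 3) SVM with RBF kernel on the weight vectors
clf = SVC(kernel="rbf").fit(W[::2], y[::2])                  # even rows as "train"
print("held-out accuracy:", clf.score(W[1::2], y[1::2]))
```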
Procedia PDF Downloads 303