Search results for: noise filters
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1417

727 Assessment of the Occupancy’s Effect on Speech Intelligibility in Al-Madinah Holy Mosque

Authors: Wasim Orfali, Hesham Tolba

Abstract:

This research investigates the acoustical characteristics of Al-Madinah Holy Mosque. Extensive field measurements were conducted at different locations in the mosque to characterize its acoustics. Acoustical characteristics are usually evaluated using objective parameters in unoccupied rooms for practical reasons. Under normal conditions, however, room occupancy can alter these characteristics through the additional sound absorption present in the room or through a change in signal-to-noise ratio. Based on acoustic measurements carried out in Al-Madinah Holy Mosque with and without occupancy, and on the analysis of these measurements, the existence of acoustical deficiencies has been confirmed.

Keywords: Al-Madinah Holy Mosque, mosque acoustics, speech intelligibility, worship sound

Procedia PDF Downloads 157
726 Experimental Device for Fluorescence Measurement by Optical Fiber Combined with Dielectrophoretic Sorting in Microfluidic Chips

Authors: Jan Jezek, Zdenek Pilat, Filip Smatlo, Pavel Zemanek

Abstract:

We present a device that combines fluorescence spectroscopy with fiber optics and dielectrophoretic micromanipulation in PDMS (poly(dimethylsiloxane)) microfluidic chips. The device allows high-speed detection (on the order of kHz) of the fluorescence signal collected from the sample by an inserted optical fiber, e.g. from a micro-droplet flow in a microfluidic chip or from liquid flowing in a transparent capillary. The device uses a laser diode at a wavelength suitable for fluorescence excitation, excitation and emission filters, optics for focusing the laser radiation into the optical fiber, and a highly sensitive fast photodiode for fluorescence detection. It is combined with on-chip dielectrophoretic sorting of micro-droplets according to their fluorescence intensity. The electrodes are created by lift-off technology on a glass substrate, or by using channels filled with a soft metal alloy or an electrolyte. This device has found use in the screening of enzymatic reactions and the sorting of individual fluorescently labelled microorganisms. The authors acknowledge the support of the Grant Agency of the Czech Republic (GA16-07965S) and the Ministry of Education, Youth and Sports of the Czech Republic (LO1212), together with the European Commission (ALISI No. CZ.1.05/2.1.00/01.0017).

Keywords: dielectrophoretic sorting, fiber optics, laser, microfluidic chips, microdroplets, spectroscopy

Procedia PDF Downloads 700
725 Lifting Wavelet Transform and Singular Values Decomposition for Secure Image Watermarking

Authors: Siraa Ben Ftima, Mourad Talbi, Tahar Ezzedine

Abstract:

In this paper, we present a technique for the secure watermarking of grayscale and color images. The technique consists of applying the Singular Value Decomposition (SVD) in the Lifting Wavelet Transform (LWT) domain in order to insert a grayscale watermark image into the host image (grayscale or color). It also uses a signature in the embedding and extraction steps. The technique is applied to a number of grayscale and color images, and its performance is evaluated through PSNR (Peak Signal-to-Noise Ratio), MSE (Mean Square Error) and SSIM (Structural Similarity) computations.
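The SVD embedding step described above can be sketched in a few lines. This is a minimal illustration using a plain NumPy array in place of the LWT approximation subband; the embedding strength `alpha` and the toy matrix sizes are assumptions for the sketch, not settings from the paper:

```python
import numpy as np

def embed_svd(A, W, alpha=0.05):
    """Embed watermark W into the singular values of host block A."""
    U, S, Vt = np.linalg.svd(A)
    D = np.diag(S) + alpha * W          # perturb the singular-value matrix
    Uw, Sw, Vwt = np.linalg.svd(D)      # re-factor the perturbed matrix
    Aw = U @ np.diag(Sw) @ Vt           # watermarked block
    return Aw, (Uw, Vwt, S)             # (Uw, Vwt, S) acts as the side key

def extract_svd(Aw, key, alpha=0.05):
    """Recover the watermark from a (possibly attacked) block."""
    Uw, Vwt, S = key
    _, Sw, _ = np.linalg.svd(Aw)
    D = Uw @ np.diag(Sw) @ Vwt
    return (D - np.diag(S)) / alpha

rng = np.random.default_rng(0)
host = rng.random((16, 16))             # stand-in for the LWT LL subband
wm = rng.random((16, 16))
marked, key = embed_svd(host, wm)
recovered = extract_svd(marked, key)
```

With no attack applied, the watermark is recovered to floating-point precision; robustness under compression or noise would then be scored with the PSNR/SSIM metrics the abstract cites.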

Keywords: lifting wavelet transform (LWT), sub-space vectorial decomposition, secure, image watermarking, watermark

Procedia PDF Downloads 261
724 Radar-Based Classification of Pedestrian and Dog Using High-Resolution Raw Range-Doppler Signatures

Authors: C. Mayr, J. Periya, A. Kariminezhad

Abstract:

In this paper, we develop a learning framework for the classification of vulnerable road users (VRUs) by their range-Doppler signatures. The frequency-modulated continuous-wave (FMCW) radar raw data is first pre-processed to obtain robust object range-Doppler maps per coherent time interval. The complex-valued range-Doppler maps captured from our outdoor measurements are then fed into a convolutional neural network (CNN) to learn the classification, with the CNN hyperparameters optimized to improve learning. By learning VRU range-Doppler signatures, the three classes 'pedestrian', 'dog', and 'noise' are classified with an average accuracy of almost 95%. Interestingly, this classification accuracy holds for combined longitudinal and lateral object trajectories.
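The pre-processing stage, from raw FMCW beat signals to a range-Doppler map, can be sketched as follows. The synthetic single-target signal and the array sizes are illustrative assumptions, and the CNN classification stage is omitted:

```python
import numpy as np

def range_doppler_map(raw):
    """raw: (n_chirps, n_samples) complex FMCW beat signal.
    FFT over fast time (samples) resolves range; a second FFT over
    slow time (chirps) resolves Doppler. The magnitude map is what
    would be fed to the classifier."""
    range_profiles = np.fft.fft(raw, axis=1)
    rd = np.fft.fftshift(np.fft.fft(range_profiles, axis=0), axes=0)
    return np.abs(rd)

# synthetic single target: beat tone at range bin 20, 0.25 cycles/chirp Doppler
n_chirps, n_samples = 64, 128
t = np.arange(n_samples) / n_samples
c = np.arange(n_chirps)[:, None]
raw = np.exp(2j * np.pi * (20 * t[None, :] + 0.25 * c))
rd = range_doppler_map(raw)
doppler_bin, range_bin = np.unravel_index(rd.argmax(), rd.shape)
```

The peak lands at range bin 20 and, after the fftshift, at Doppler bin 16 + 32 = 48, matching the injected target.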

Keywords: machine learning, radar, signal processing, autonomous driving

Procedia PDF Downloads 220
723 Times2D: A Time-Frequency Method for Time Series Forecasting

Authors: Reza Nematirad, Anil Pahwa, Balasubramaniam Natarajan

Abstract:

Time series data consist of successive data points collected over a period of time. Accurate prediction of future values is essential for informed decision-making in several real-world applications, including electricity load demand forecasting, lifetime estimation of industrial machinery, traffic planning, weather prediction, and the stock market. Due to their critical relevance and wide application, there has been considerable interest in time series forecasting in recent years. However, the proliferation of sensors and IoT devices, real-time monitoring systems, and high-frequency trading data introduce significant intricate temporal variations, rapid changes, noise, and non-linearities, making time series forecasting more challenging. Classical methods such as the autoregressive integrated moving average (ARIMA) and exponential smoothing aim to extract pre-defined temporal variations, such as trends and seasonality. While these methods are effective for capturing well-defined seasonal patterns and trends, they often struggle with the more complex, non-linear patterns present in real-world time series data. In recent years, deep learning has made significant contributions to time series forecasting. Recurrent Neural Networks (RNNs) and their variants, such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks, have been widely adopted for modeling sequential data; however, they often suffer from locality, making it difficult to capture local trends and rapid fluctuations. Convolutional Neural Networks (CNNs), particularly Temporal Convolutional Networks (TCNs), leverage convolutional layers to capture temporal dependencies by applying convolutional filters along the temporal dimension. Despite their advantages, TCNs struggle to capture relationships between distant time points due to the locality of one-dimensional convolution kernels.
Transformers have revolutionized time series forecasting with their powerful attention mechanisms, effectively capturing long-term dependencies and relationships between distant time points. However, the attention mechanism may struggle to discern dependencies directly from scattered time points due to intricate temporal patterns. Lastly, Multi-Layer Perceptrons (MLPs) have also been employed, with models like N-BEATS and LightTS demonstrating success; despite this, MLPs often face high volatility and computational complexity challenges in long-horizon forecasting. To address intricate temporal variations in time series data, this study introduces Times2D, a novel framework that integrates 2D spectrogram and derivative heatmap techniques in parallel. The spectrogram focuses on the frequency domain, capturing periodicity, while the derivative patterns emphasize the time domain, highlighting sharp fluctuations and turning points. This 2D transformation enables the use of powerful computer vision techniques to capture a variety of intricate temporal variations. To evaluate the performance of Times2D, extensive experiments were conducted on standard time series datasets, comparing it under the same modeling conditions with various state-of-the-art algorithms, including DLinear (2023), TimesNet (2023), Non-stationary Transformer (2022), PatchTST (2023), N-HiTS (2023), Crossformer (2023), MICN (2023), LightTS (2022), FEDformer (2022), FiLM (2022), SCINet (2022a), Autoformer (2021), and Informer (2021). Initial results demonstrate that Times2D achieves consistent state-of-the-art performance in both short-term and long-term forecasting tasks. Furthermore, the generality of the Times2D framework allows it to be applied to tasks such as time series imputation, clustering, classification, and anomaly detection, offering potential benefits in any domain that involves sequential data analysis.
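The two parallel 2D views that Times2D builds from a 1D series can be sketched roughly as below. The window length, hop size, and the reduction of the derivative heatmap to per-frame mean absolute first and second differences are illustrative assumptions, not the authors' exact construction:

```python
import numpy as np

def spectrogram(x, win=32, hop=16):
    """Magnitude STFT: the frequency-domain 2D view, capturing periodicity."""
    frames = [x[i:i + win] * np.hanning(win)
              for i in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(np.asarray(frames), axis=1)).T   # (freq, time)

def derivative_heatmap(x, win=32, hop=16):
    """Time-domain 2D view: per-frame first/second differences highlight
    sharp fluctuations and turning points."""
    d1, d2 = np.diff(x, 1), np.diff(x, 2)
    rows = [[np.abs(d1[i:i + win - 1]).mean(), np.abs(d2[i:i + win - 2]).mean()]
            for i in range(0, len(x) - win + 1, hop)]
    return np.asarray(rows).T                                  # (2, time)

t = np.arange(256)
x = np.sin(2 * np.pi * t / 16) + 0.1 * np.sign(np.sin(2 * np.pi * t / 64))
S = spectrogram(x)          # dominant period 16 -> bin 2 of a 32-sample window
H = derivative_heatmap(x)
```

Both arrays are image-like, which is what lets standard computer-vision backbones consume them.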

Keywords: derivative patterns, spectrogram, time series forecasting, times2D, 2D representation

Procedia PDF Downloads 24
722 Notched Bands in Ultra-Wideband UWB Filter Design for Advanced Wireless Applications

Authors: Abdul Basit, Amil Daraz, Guoqiang Zhang

Abstract:

With the increasing demand for wireless communication systems for unlicensed indoor applications, the FCC in February 2002 allocated an unlicensed band ranging from 3.1 GHz to 10.6 GHz, with a fractional bandwidth of about 109%. This band plays a key role in radio-frequency (RF) front-end devices and has been widely applied in many other microwave circuits. Targeting this band defined by the FCC for the UWB system, this article presents a UWB bandpass filter with three stop bands for the mitigation of wireless bands that may interfere with the UWB range. For this purpose, two resonators are utilized to implement the triple notched bands. A C-shaped resonator is used to create the first notch band at 3.4 GHz to suppress the WiMAX signal, while an H-shaped resonator is employed in the initial UWB design to introduce the dual notched characteristic at 4.5 GHz and 8.1 GHz to reject WLAN and satellite communication signals. The overall circuit area of the proposed design is 30.6 mm × 20 mm, or, in terms of the guided wavelength at the first stopband, 0.06 λg × 0.02 λg. The presented structure shows a return loss below -10 dB over most of the passband and above -15 dB in the notched frequency bands. The filter is simulated and analyzed in HFSS 15.0. All the bands for the rejection of wireless signals are independently controlled, which makes this work superior to the other UWB filters presented in the literature.
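The 109% figure follows directly from the FCC band edges, since fractional bandwidth is the absolute bandwidth divided by the centre frequency:

```python
# Fractional bandwidth of the FCC UWB allocation: BW / f_center
f_low, f_high = 3.1e9, 10.6e9          # band edges in Hz
f_center = (f_low + f_high) / 2        # 6.85 GHz
fbw = (f_high - f_low) / f_center      # 7.5 GHz / 6.85 GHz ~ 1.095, i.e. ~109%
```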

Keywords: bandpass filter (BPF), ultra-wideband (UWB), wireless communication, C-shaped resonator, triple notch

Procedia PDF Downloads 63
721 A Blind Three-Dimensional Meshes Watermarking Using the Interquartile Range

Authors: Emad E. Abdallah, Alaa E. Abdallah, Bajes Y. Alskarnah

Abstract:

We introduce a robust three-dimensional watermarking algorithm for copyright protection and indexing. The basic idea behind our technique is to measure the interquartile range, or spread, of the 3D model vertices. The algorithm starts by converting all the vertices to spherical coordinates and partitioning them into small groups. It then slightly alters the interquartile range distribution of the small groups according to a predefined watermark. Experimental results on several 3D meshes demonstrate the perceptual invisibility and the robustness of the proposed technique against the most common attacks, including compression, noise, smoothing, scaling and rotation, as well as combinations of these attacks.
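The measurement step at the heart of the scheme, radial spherical coordinates, partitioning, and the per-group interquartile range, can be sketched as below. The group size and the synthetic vertices are illustrative; the embedding itself would then nudge each group's IQR according to the watermark bits:

```python
import numpy as np

def group_iqr(vertices, group_size=8):
    """Convert vertices to their radial spherical coordinate, partition them
    into small groups, and measure each group's interquartile range, the
    statistic the watermark slightly perturbs."""
    r = np.linalg.norm(vertices, axis=1)          # radius of each vertex
    n = (len(r) // group_size) * group_size       # drop the ragged tail
    groups = r[:n].reshape(-1, group_size)
    q1 = np.percentile(groups, 25, axis=1)
    q3 = np.percentile(groups, 75, axis=1)
    return q3 - q1

rng = np.random.default_rng(1)
verts = rng.normal(size=(40, 3))                  # toy 3D mesh vertices
iqr = group_iqr(verts)                            # one IQR per group of 8
```

Because the radius is unchanged by rotation and scales uniformly with the mesh, statistics built on it are a natural fit for the robustness claims above.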

Keywords: watermarking, three-dimensional models, perceptual invisibility, interquartile range, 3D attacks

Procedia PDF Downloads 458
720 Characteristics and Durability Evaluation of Air Spring

Authors: Chang Su Woo, Hyun Sung Park

Abstract:

Air spring systems are widely used in railway vehicle secondary suspensions to reduce and absorb vibration and noise; their low natural frequency ensures a comfortable ride and consistently good stiffness. In this paper, characteristic and durability tests were conducted in the laboratory using a servo-hydraulic fatigue testing system to evaluate the reliability of an air spring for electric railway vehicles. The experimental results show that the characteristics and durability of the domestically developed products are excellent. Moreover, to verify the suitability of the air spring, ride comfort and air pressure variation were measured in train tests on a subway line. The air spring developed in this study for railway vehicles can guarantee a service life of one million cycles at a 90% confidence level.

Keywords: air spring, reliability, railway, service lifetime

Procedia PDF Downloads 457
719 Inverse Heat Transfer Analysis of a Melting Furnace Using Levenberg-Marquardt Method

Authors: Mohamed Hafid, Marcel Lacroix

Abstract:

This study presents a simple inverse heat transfer procedure for predicting the wall erosion and the time-varying thickness of the protective bank that covers the inside surface of the refractory brick wall of a melting furnace. The direct problem is solved using a finite-volume model, with the melting/solidification process modeled by the enthalpy method. The inverse procedure rests on the Levenberg-Marquardt method combined with the Broyden method. The effects of the temperature sensor locations and of the measurement noise on the inverse predictions are investigated, and recommendations are made concerning the sensor location.
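A minimal Levenberg-Marquardt loop of the kind the inverse procedure relies on can be sketched on a toy parameter-estimation problem. The finite-difference Jacobian here stands in for the Broyden updates, and the exponential model is purely illustrative, not the furnace model:

```python
import numpy as np

def levenberg_marquardt(f, p0, y, lam=1e-3, iters=50, eps=1e-7):
    """Minimal LM loop: damped Gauss-Newton steps with an adaptive
    damping factor lam and a central-difference Jacobian."""
    p = np.asarray(p0, float)
    for _ in range(iters):
        r = f(p) - y
        J = np.empty((len(r), len(p)))
        for j in range(len(p)):
            dp = np.zeros_like(p)
            dp[j] = eps
            J[:, j] = (f(p + dp) - f(p - dp)) / (2 * eps)
        A = J.T @ J + lam * np.eye(len(p))        # damped normal equations
        step = np.linalg.solve(A, -J.T @ r)
        if np.sum((f(p + step) - y) ** 2) < np.sum(r ** 2):
            p, lam = p + step, lam * 0.5          # accept: trust the model more
        else:
            lam *= 2.0                            # reject: increase damping
    return p

t = np.linspace(0, 2, 30)
model = lambda p: p[0] * np.exp(-p[1] * t)        # toy "direct problem"
y = model([2.0, 1.3])                             # synthetic measurements
p_est = levenberg_marquardt(model, [1.0, 0.5], y)
```

In the paper's setting, `f` would be the finite-volume direct solver and `p` the unknown bank thickness parameters, which is why the Jacobian cost motivates pairing LM with Broyden updates.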

Keywords: melting furnace, inverse heat transfer, enthalpy method, Levenberg-Marquardt method

Procedia PDF Downloads 306
718 Stochastic Pi Calculus in Financial Markets: An Alternate Approach to High Frequency Trading

Authors: Jerome Joshi

Abstract:

The paper presents the modelling of financial markets using the Stochastic Pi Calculus model. The Stochastic Pi Calculus model is mainly used for biological applications; however, its features promote its use in financial markets, most prominently in high frequency trading. The trading system can be broadly classified into the exchange, market makers or intermediary traders, and fundamental traders. The exchange is where the trade is executed, and the two types of traders act as market participants in the exchange. High frequency trading, with its complex networks and numerous market participants (intermediary and fundamental traders), poses a difficulty for modelling. It requires participants to exploit complex trading algorithms and high execution speeds to carry out large volumes of trades. To earn profits from each trade, the trader must be at the top of the order book quite frequently, executing or processing multiple trades simultaneously. This requires highly automated systems as well as the right sentiment to outperform other traders. However, always being at the top of the book is not best for the trader either, since it was the cause of the 'Hot-Potato Effect', which in turn demands a better and more efficient model. The characteristics of the model should be such that it is flexible and has diverse applications. Therefore, a model with applications in a similar field characterized by such difficulty should be chosen. It should also be flexible in its simulation so that it can be extended and adapted for future research, and equipped with tools that allow it to be used in the field of finance. In this case, the Stochastic Pi Calculus model seems an ideal fit for financial applications, owing to its track record in the field of biology.
It is an extension of the original Pi Calculus model and acts as a solution and an alternative to the previously flawed algorithm, provided its application is further extended. This model focuses on solving the problem which led to the 'Flash Crash', namely the 'Hot-Potato Effect'. The model consists of small sub-systems which can be integrated to form a large system. It is designed in such a way that the behaviour of 'noise traders' is treated as a random process, or noise, in the system. While modelling, to get a better understanding of the problem, a broader picture is taken into consideration, including the trader, the system, and the market participants. The paper goes on to explain trading in exchanges, types of traders, high frequency trading, the 'Flash Crash', the 'Hot-Potato Effect', evaluation of orders, and time delay in further detail. Future work will focus on the calibration of the modules so that they interact correctly with one another. This model, with its application extended, would provide a basis for further research in the fields of finance and computing.

Keywords: concurrent computing, high frequency trading, financial markets, stochastic pi calculus

Procedia PDF Downloads 59
717 Robust Noisy Speech Identification Using Frame Classifier Derived Features

Authors: Punnoose A. K.

Abstract:

This paper presents an approach for identifying noisy speech recordings using a multi-layer perceptron (MLP) trained to predict phonemes from acoustic features. Characteristics of the MLP posteriors are explored for clean and noisy speech at the frame level. Appropriate density functions are fitted to the softmax probabilities of clean and noisy speech, and a function based on the ratio of the softmax probability density of noisy speech to that of clean speech is formulated. This phoneme-independent scoring is combined with phoneme-specific weighting to make the scoring more robust. Simple thresholding is then used to separate noisy speech recordings from clean ones. The approach is benchmarked on standard databases, with a focus on precision.
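The scoring idea can be sketched as follows, with Gaussians standing in for the unspecified density family and uniform phoneme weights as placeholder assumptions; the MLP itself is replaced by synthetic per-frame softmax peaks:

```python
import numpy as np

def frame_scores(softmax_max, clean_stats, noisy_stats):
    """Log-ratio of the noisy vs clean density of the per-frame softmax peak."""
    def logpdf(x, mu, sd):
        return -0.5 * ((x - mu) / sd) ** 2 - np.log(sd * np.sqrt(2 * np.pi))
    return logpdf(softmax_max, *noisy_stats) - logpdf(softmax_max, *clean_stats)

def recording_score(softmax_max, phoneme_ids, weights, clean_stats, noisy_stats):
    """Weight the frame scores by predicted phoneme, then average."""
    s = frame_scores(softmax_max, clean_stats, noisy_stats)
    return float(np.mean(weights[phoneme_ids] * s))

rng = np.random.default_rng(2)
clean_stats, noisy_stats = (0.9, 0.05), (0.6, 0.15)   # (mean, std), illustrative
weights = np.ones(40)                                 # one weight per phoneme
ids = rng.integers(0, 40, 200)                        # fake phoneme predictions
frames_clean = rng.normal(*clean_stats, 200).clip(0, 1)
frames_noisy = rng.normal(*noisy_stats, 200).clip(0, 1)
s_clean = recording_score(frames_clean, ids, weights, clean_stats, noisy_stats)
s_noisy = recording_score(frames_noisy, ids, weights, clean_stats, noisy_stats)
# simple thresholding at 0: noisy recordings score higher than clean ones
```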

Keywords: noisy speech identification, speech pre-processing, noise robustness, feature engineering

Procedia PDF Downloads 108
716 Omni-Relay (OR) Scheme-Aided LTE-A Communication Systems

Authors: Hassan Mahasneh, Abu Sesay

Abstract:

We propose the use of relay terminals at the cell edge of an LTE-based cellular system. Each relay terminal is equipped with an omni-directional antenna; we refer to this scheme as the Omni-Relay (OR) scheme. The OR scheme coordinates the inter-cell interference (ICI) stemming from adjacent cells and increases the desired signal level in cell-edge regions. To validate the performance of the OR scheme, we derive the average signal-to-interference-plus-noise ratio (SINR) and the average capacity and compare them with those of the conventional universal frequency reuse factor (UFRF) scheme. The results show that the proposed OR scheme provides higher average SINR and average capacity than the UFRF scheme, thanks to the assistance of the distributed relay nodes.
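The SINR comparison reduces to simple link-budget arithmetic: the relay raises the desired signal power at the cell edge while the interference and noise terms stay fixed. The power values below are illustrative, not taken from the paper's model:

```python
import math

def sinr_db(p_desired, p_interferers, noise):
    """SINR = S / (sum(I) + N), returned in dB."""
    return 10 * math.log10(p_desired / (sum(p_interferers) + noise))

# illustrative cell-edge received powers in mW
noise = 1e-9
interferers = [4e-7, 3e-7]                  # ICI from two adjacent cells
direct = sinr_db(1e-6, interferers, noise)  # base station only: ~1.5 dB
relayed = sinr_db(5e-6, interferers, noise) # relay boosts desired signal: ~8.5 dB
```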

Keywords: the UFRF scheme, the OR scheme, ICI, relay terminals, SINR, spectral efficiency

Procedia PDF Downloads 318
715 New Features for Copy-Move Image Forgery Detection

Authors: Michael Zimba

Abstract:

A novel set of features for copy-move image forgery (CMIF) detection is proposed. The proposed set presents a new approach which relies on electrostatic field theory (EFT). Solely for the purpose of reducing the dimension of the suspicious image, the method first performs a discrete wavelet transform (DWT) of the image and extracts only the approximation subband. The extracted subband is then bijectively mapped onto a virtual electrostatic field, where concepts of EFT are utilised to extract robust features. The extracted features are shown to be invariant to additive noise, JPEG compression, and affine transformation. The proposed features can also be used in general object matching.

Keywords: virtual electrostatic field, features, affine transformation, copy-move image forgery

Procedia PDF Downloads 529
714 Image Steganography Using Predictive Coding for Secure Transmission

Authors: Baljit Singh Khehra, Jagreeti Kaur

Abstract:

Security is the most important issue in communication today. In this paper, steganography, the process of hiding important data inside other data such as text, audio, video, or images, is used to hide a text file inside an image. The aim is to provide availability, confidentiality, integrity, and authenticity of data. A steganographic technique embeds hidden content in unremarkable cover media so as not to arouse the suspicion of eavesdroppers, third parties, or hackers. Compression, encryption, decryption, and embedding methods are all used in digital image steganography. To increase the storage limit, predictive coding is utilized to embed the information: one can exchange secure information by means of a predictive coding methodology that produces a high-quality stego-image, with the pixels used to embed the secret information. Since compression introduces noise into the image, the scheme is designed to tolerate this noise. The proposed information-hiding scheme is powerful compared with existing methodologies and helps users conceal information efficiently. Entropy, standard deviation, mean square error and peak signal-to-noise ratio are the parameters used to evaluate the proposed methodology, and the results are quite promising.
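A minimal sketch of prediction-error embedding, hiding one bit per pixel in the parity of a left-neighbour prediction error, is shown below. The paper does not specify its predictor, so this predictor and the one-bit-per-pixel rate are illustrative stand-ins:

```python
def embed_pe(pixels, bits):
    """Hide one bit in the parity of each prediction error (left-neighbour
    predictor); adjusting a pixel by at most 1 keeps distortion low."""
    out = list(pixels)
    for i, bit in enumerate(bits, start=1):
        err = out[i] - out[i - 1]          # prediction error
        if (err & 1) != bit:               # wrong parity: nudge the pixel
            out[i] += 1 if out[i] < 255 else -1
    return out

def extract_pe(pixels, n_bits):
    """Recover the bits from the prediction-error parities."""
    return [(pixels[i] - pixels[i - 1]) & 1 for i in range(1, n_bits + 1)]

cover = [120, 121, 119, 200, 201, 50, 51, 52, 53]
bits = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed_pe(cover, bits)
```

Each pixel moves by at most one grey level, which is what keeps the MSE low and the PSNR of the stego-image high.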

Keywords: cryptography, steganography, reversible image, predictive coding

Procedia PDF Downloads 399
713 Supported Gold Nanocatalysts for CO Oxidation in Mainstream Cigarette Smoke

Authors: Krasimir Ivanov, Dimitar Dimitrov, Tatyana Tabakova, Stefka Kirkova, Anna Stoilova, Violina Angelova

Abstract:

It has been suggested that nicotine, CO and tar are the most important substances in mainstream smoke and the compounds most responsible for the health hazards of smoking. As nicotine is extremely important for the smoking qualities of cigarettes, and the tar yield in tobacco smoke is already significantly reduced by filters of various contents and designs, the main efforts of cigarette researchers and manufacturers are directed at opportunities for reducing the CO content. A highly active ceria-supported gold catalyst was prepared by the deposition-precipitation method, and the possibilities for CO oxidation in a synthetic gaseous mixture were evaluated using continuous-flow equipment with a fixed-bed glass reactor at atmospheric pressure. The efficiency of the catalyst for CO oxidation in real cigarette smoke was examined with a single-port, puff-by-puff smoking machine, and a quality assessment of smoking using a cigarette holder containing the catalyst was carried out. It was established that the catalytic activity toward CO oxidation in cigarette smoke rapidly decreases, from 70% for the first cigarette to nearly zero for the twentieth. The present study shows that two critical factors prevent the successful use of catalysts to reduce the CO content in mainstream cigarette smoke: (i) the significant influence of the adsorption and oxidation processes on the main characteristics of the tobacco products, and (ii) the rapid deactivation of the catalyst as its grains become covered with condensate.

Keywords: cigarette smoke, CO oxidation, gold catalyst, mainstream

Procedia PDF Downloads 199
712 Analysis and Performance of Handover in Universal Mobile Telecommunications System (UMTS) Network Using OPNET Modeller

Authors: Latif Adnane, Benaatou Wafa, Pla Vicent

Abstract:

Handover is of great significance for achieving seamless connectivity in wireless networks. This paper gives an overview of the main factors affected by the soft and hard handover techniques. To understand the handover process in the Universal Mobile Telecommunications System (UMTS) network, different statistics are calculated. The paper focuses on the quality of service (QoS) of soft and hard handover in a UMTS network, including the analysis of received power, signal-to-noise ratio, throughput, traffic received, delay, total transmit load, end-to-end delay and upload response time using the OPNET simulator.

Keywords: handover, UMTS, mobility, simulation, OPNET modeler

Procedia PDF Downloads 300
711 A Bayesian Model with Improved Prior in Extreme Value Problems

Authors: Eva L. Sanjuán, Jacinto Martín, M. Isabel Parra, Mario M. Pizarro

Abstract:

In Extreme Value Theory, inference on the parameters of the distribution is made using only a small part of the observed values: when block maxima are taken, many data are discarded. We developed a new Bayesian inference model that exploits all the information provided by the data, introducing informative priors and using the relations between baseline and limit parameters. Firstly, we studied the accuracy of the new model for three baseline distributions that lead to a Gumbel extreme-value distribution: Exponential, Normal and Gumbel. Secondly, we considered mixtures of Normal variables, to simulate practical situations in which the data do not fit pure distributions because of perturbations (noise).
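For contrast with the Bayesian model, the block-maxima baseline it improves on can be sketched with a quick method-of-moments Gumbel fit. The Exponential baseline is one of the three cases the abstract mentions; the block size and sample size here are illustrative:

```python
import math
import random
import statistics

def block_maxima(data, block):
    """Keep only the maximum of each block; everything else is discarded."""
    return [max(data[i:i + block]) for i in range(0, len(data) - block + 1, block)]

def gumbel_moments(maxima):
    """Method-of-moments Gumbel fit: scale = s*sqrt(6)/pi,
    loc = mean - gamma*scale (gamma = Euler-Mascheroni constant)."""
    s = statistics.stdev(maxima)
    scale = s * math.sqrt(6) / math.pi
    loc = statistics.mean(maxima) - 0.5772156649 * scale
    return loc, scale

random.seed(3)
data = [random.expovariate(1.0) for _ in range(20000)]
# Exponential(1) block maxima converge to Gumbel(loc = ln(block), scale = 1)
loc, scale = gumbel_moments(block_maxima(data, 100))
```

Note how 20,000 observations shrink to 200 maxima before fitting, which is exactly the information loss the informative-prior Bayesian model is designed to recover.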

Keywords: Bayesian inference, extreme value theory, Gumbel distribution, highly informative prior

Procedia PDF Downloads 176
710 Perceptual Organization within Temporal Displacement

Authors: Michele Sinico

Abstract:

The psychological present has an actual extension. When a sequence of instantaneous stimuli falls within this short interval of time, observers perceive a compresence of events in succession, and the temporal order depends on the qualitative relationships between the perceptual properties of the events. Two experiments were carried out to study the influence of perceptual grouping, with and without temporal displacement, on the duration of auditory sequences, using the psychophysical method of adjustment. The first experiment investigated the effect of the temporal displacement of a white noise on sequence duration. The second investigated the effect of temporal displacement along the pitch dimension on the temporal shortening of the sequence. The results suggest that the temporal order of sounds, in the case of temporal displacement, is organized along the pitch dimension.

Keywords: time perception, perceptual present, temporal displacement, Gestalt laws of perceptual organization

Procedia PDF Downloads 234
709 Developing an AI-Driven Application for Real-Time Emotion Recognition from Human Vocal Patterns

Authors: Sayor Ajfar Aaron, Mushfiqur Rahman, Sajjat Hossain Abir, Ashif Newaz

Abstract:

This study delves into the development of an artificial intelligence application designed for real-time emotion recognition from human vocal patterns. Utilizing advanced machine learning algorithms, including deep learning and neural networks, the paper highlights both the technical challenges and potential opportunities in accurately interpreting emotional cues from speech. Key findings demonstrate the critical role of diverse training datasets and the impact of ambient noise on recognition accuracy, offering insights into future directions for improving robustness and applicability in real-world scenarios.

Keywords: artificial intelligence, convolutional neural network, emotion recognition, vocal patterns

Procedia PDF Downloads 27
708 Daily Variations of Polycyclic Aromatic Hydrocarbons (PAHs) in Industrial Sites in a Suburban Area of Sour El Ghozlane, Algeria

Authors: Sidali Khedidji, Noureddine Yassaa, Riad Ladji

Abstract:

In this study, n-alkanes, which are hazardous to the environment and human health, were investigated in the suburban atmosphere of Sour El Ghozlane at a single sampling point from April to May 2013. Ambient concentration measurements of n-alkanes were carried out as part of a regional study of the cement industry in Sour El Ghozlane. During sampling, airborne particulate matter was collected on PTFE filters using two medium-volume samplers, with and without a size-selective inlet, for PM10 and TSP respectively; each sampling period lasted approximately 24 h. The organic compounds were characterized using gas chromatography coupled with mass spectrometric detection (GC-MS). Total concentrations of n-alkanes recorded in suburban Sour El Ghozlane ranged from 42 to 69 ng/m3. The gravimetric method was applied to the black smoke concentration data for the spring season. The 24 h average concentrations of n-alkanes in the PM10 and TSP fractions were found in the ranges 0.50-7.06 ng/m3 and 0.29-6.97 ng/m3, respectively, over the sampling period. Meteorological factors such as relative humidity and temperature were found to affect particulate matter levels, especially PM10, although air temperature did not appear to significantly affect TSP and PM10 mass concentrations. The guide value of 40 μg/m3 fixed by the European Community, which is not to be exceeded on more than 35 days per year, was exceeded in some samples; however, it should be noted that the limit value of 80 μg/m3 fixed by Algerian regulations was exceeded in only one sample during the study period.

Keywords: n-alkanes, PM10, TSP, particulate matter, cement industry

Procedia PDF Downloads 377
707 Enhancing Cellulose Acetate Films: Impact of Glycerol and Ionic Liquid Plasticizers

Authors: Rezzouq Asiya, Bouftou Abderrahim, Belfadil Doha, Taoufyk Azzeddine, El Bouchti Mehdi, Zyade Souad, Cherkaoui Omar, Majid Sanaa

Abstract:

Plastic packaging is widely used, but the pollution it causes is a major environmental problem. Solutions require new sustainable technologies, environmental management, and the use of bio-based polymers as sustainable packaging. Cellulose acetate (CA) is a bio-based polymer used in a variety of applications such as plastic films, textiles, and filters. However, it has limitations in terms of thermal stability and rigidity, which necessitates the addition of plasticizers to optimize its use in packaging. Plasticizers are molecules that increase the flexibility of polymers, but their influence on the chemical and physical properties of CA films has not been studied in detail; some studies have focused on mechanical and thermal properties, but an in-depth analysis is needed to understand the interactions between the additives and the polymer matrix. In this study, the aim is to examine the effect of two types of plasticizers, glycerol (a conventional plasticizer) and an ionic liquid, on the transparency and the mechanical, thermal and barrier properties of cellulose acetate films prepared by the solution-casting method. Various analytical techniques were used to characterize these films, including infrared spectroscopy (FT-IR), X-ray diffraction (XRD), thermogravimetric analysis (TGA), water vapor permeability (WVP), oxygen permeability, scanning electron microscopy (SEM), opacity and transmission analysis, and mechanical tests.

Keywords: cellulose acetate, plasticizers, biopolymers, ionic liquid, glycerol

Procedia PDF Downloads 27
706 In-situ Acoustic Emission Analysis of a Polymer Electrolyte Membrane Water Electrolyser

Authors: M. Maier, I. Dedigama, J. Majasan, Y. Wu, Q. Meyer, L. Castanheira, G. Hinds, P. R. Shearing, D. J. L. Brett

Abstract:

Increasing the efficiency of electrolyser technology is commonly seen as one of the main challenges on the way to the Hydrogen Economy. There is a significant lack of understanding of the different states of operation of polymer electrolyte membrane water electrolysers (PEMWE) and how these influence the overall efficiency; this concerns in particular the two-phase flow through the membrane, gas diffusion layers (GDL) and flow channels. In order to increase the efficiency of PEMWE and facilitate their spread as a commercial hydrogen production technology, new analytical approaches have to be found. Acoustic emission (AE) offers the possibility of analysing the processes within a PEMWE in a non-destructive, fast and cheap in-situ way. This work describes the generation and analysis of AE data from a PEM water electrolyser for, to the best of our knowledge, the first time in the literature. Several experiments are carried out, each designed so that only specific physical processes occur and AE related solely to one process can be measured. A range of experimental conditions is therefore used to induce different flow regimes within the flow channels and GDL. The resulting AE data is first separated into events, each defined as beginning when the signal exceeds the noise threshold, consisting of a number of consecutive peaks, and ending when the wave falls back below the noise threshold. For each acoustic event the following key attributes are extracted: maximum peak amplitude, duration, number of peaks, peaks before the maximum, average peak intensity, and time until the maximum is reached. Each event is then expressed as a vector containing the normalized values of all criteria. Principal Component Analysis (PCA) is performed on the resulting data, which orders the criteria by the eigenvalues of their covariance matrix; this is an easy way of determining which criteria convey the most information about the acoustic data.
The data is then arranged in the two- or three-dimensional space formed by the most relevant criteria axes. By finding regions of this space occupied only by acoustic events originating from one of the three experiments, it is possible to relate physical processes to certain acoustic patterns. Due to the complex nature of the AE data, modern machine learning techniques are needed to recognize these patterns in-situ. The AE data produced in this way is used to train a self-learning algorithm and develop an analytical tool to diagnose different operational states in a PEMWE. Combining this technique with the measurement of polarization curves and electrochemical impedance spectroscopy allows for in-situ optimization and recognition of suboptimal states of operation.
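As a rough sketch of the event-extraction and PCA step described above (illustrative only; the abstract does not give the authors' exact feature definitions or thresholding, so the features below are proxies and the function names are invented for this example):

```python
import numpy as np

def event_features(waveform):
    """Extract the key attributes listed above from one acoustic event.

    `waveform` is assumed to be the rectified samples of a single event,
    already segmented at the noise threshold.
    """
    # Local maxima serve as the "peaks" of the event.
    peaks = np.flatnonzero((waveform[1:-1] > waveform[:-2]) &
                           (waveform[1:-1] > waveform[2:])) + 1
    t_max = int(np.argmax(waveform))
    return np.array([
        waveform.max(),                                            # maximum peak amplitude
        len(waveform),                                             # duration (samples)
        len(peaks),                                                # number of peaks
        int((peaks < t_max).sum()),                                # peaks before the maximum
        waveform[peaks].mean() if len(peaks) else waveform.max(),  # average peak intensity
        t_max,                                                     # time until the maximum
    ], dtype=float)

def pca_rank(X):
    """Normalise each criterion, then rank the criteria axes by the
    eigenvalues of their covariance matrix (largest variance first)."""
    std = X.std(axis=0)
    Xn = (X - X.mean(axis=0)) / np.where(std == 0, 1.0, std)
    cov = np.cov(Xn, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]
    return eigvals[order], eigvecs[:, order], Xn @ eigvecs[:, order]
```

The projected events (third return value) are what would then be plotted in the two- or three-dimensional space of the most relevant axes.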

Keywords: acoustic emission, gas diffusion layers, in-situ diagnosis, PEM water electrolyser

Procedia PDF Downloads 136
705 Image Steganography Using Least Significant Bit Technique

Authors: Preeti Kumari, Ridhi Kapoor

Abstract:

In today’s world, security is the most important issue in any communication. Steganography is the process of hiding important data within other data, such as text, audio, video, and images. The interest in this topic lies in providing availability, confidentiality, integrity, and authenticity of data. A steganographic technique embeds hidden content within unremarkable cover media so as not to arouse the suspicion of eavesdroppers, third parties, or hackers. Many compression, encryption, decryption, and embedding methods are used in digital image steganography. Compression introduces noise into the image; to withstand this noise, the LSB insertion technique is used. The performance of the proposed embedding system with respect to the security of the secret message and its robustness is discussed. We also demonstrate the maximum steganographic capacity and visual distortion.
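The LSB insertion technique mentioned above can be sketched as follows (a minimal illustration, not the paper's implementation): message bits replace the least significant bit of successive pixel values, so each carrier value changes by at most 1 and the visual distortion stays negligible.

```python
import numpy as np

def embed_lsb(cover, bits):
    """Embed message bits into the least significant bit of each pixel."""
    flat = cover.flatten()          # flatten() copies, so the cover is untouched
    if len(bits) > flat.size:
        raise ValueError("message too long for cover image")
    bits = np.asarray(bits, dtype=np.uint8)
    # Clear the LSB (& 0xFE) and OR in the message bit.
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_bits):
    """Recover the first n_bits message bits from the stego image."""
    return stego.flatten()[:n_bits] & 1
```

With an 8-bit grayscale cover, the maximum capacity of this scheme is one message bit per pixel.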

Keywords: steganography, LSB, encoding, information hiding, color image

Procedia PDF Downloads 458
704 Investigating the Demand of Short-Shelf Life Food Products for SME Wholesalers

Authors: Yamini Raju, Parminder S. Kang, Adam Moroz, Ross Clement, Alistair Duffy, Ashley Hopwell

Abstract:

Accurate prediction of fresh produce demand is one of the challenges faced by Small and Medium Enterprise (SME) wholesalers. Current research in this area has focused on a limited number of factors specific to a single product or business type. This paper gives an overview of the current literature on the variability factors used to predict demand and on existing forecasting techniques for short-shelf-life products. It then extends this work by adding new factors and investigating whether there is a time lag, and possible noise, in the orders. It also identifies the most important factors using correlation and Principal Component Analysis (PCA).
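The time-lag investigation mentioned above could be illustrated with a simple cross-correlation scan (a hypothetical sketch; `best_lag`, the lag range, and the correlation criterion are assumptions for this example, not the paper's method):

```python
import numpy as np

def best_lag(factor, orders, max_lag=7):
    """Return the lag (e.g. in days) at which a variability factor
    correlates most strongly with subsequent order volumes."""
    best, best_r = 0, 0.0
    for lag in range(max_lag + 1):
        x = factor[:len(factor) - lag] if lag else factor
        y = orders[lag:]
        r = np.corrcoef(x, y)[0, 1]     # Pearson correlation at this lag
        if abs(r) > abs(best_r):
            best, best_r = lag, r
    return best, best_r
```

A factor whose best correlation occurs at a non-zero lag is a leading indicator and can be shifted accordingly before being fed into a forecasting model.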

Keywords: demand forecasting, deteriorating products, food wholesalers, principal component analysis, variability factors

Procedia PDF Downloads 499
703 Fractal-Wavelet Based Techniques for Improving the Artificial Neural Network Models

Authors: Reza Bazargan lari, Mohammad H. Fattahi

Abstract:

Natural resources management, including water resources, requires reliable estimation of time-variant environmental parameters. Small improvements in the estimation of environmental parameters would have substantial effects on management decisions. Noise reduction using wavelet techniques is an effective approach for pre-processing practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based pre-processing. Time series correlation and persistency, the minimum sufficient length for training the prediction model, and the maximum valid length of predictions were also investigated through a fractal assessment.
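As a minimal illustration of wavelet-based noise reduction (a one-level Haar transform with soft thresholding; the abstract does not state the wavelet family or thresholding rule actually used, so this is only a sketch of the general technique):

```python
import numpy as np

def haar_denoise(x, threshold):
    """One-level Haar wavelet soft-threshold denoising.

    The signal length must be even. Detail coefficients below the
    threshold (mostly noise for a smooth signal) are shrunk to zero.
    """
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)  # soft threshold
    out = np.empty_like(x, dtype=float)    # inverse Haar transform
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out
```

In practice a multi-level decomposition with a smoother wavelet (e.g. Daubechies) is typical for river flow series; the one-level Haar case keeps the mechanics visible.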

Keywords: wavelet, de-noising, predictability, time series fractal analysis, valid length, ANN

Procedia PDF Downloads 349
702 DeClEx-Processing Pipeline for Tumor Classification

Authors: Gaurav Shinde, Sai Charan Gongiguntla, Prajwal Shirur, Ahmed Hambaba

Abstract:

Health issues are significantly increasing, putting a substantial strain on healthcare services. This has accelerated the integration of machine learning in healthcare, particularly following the COVID-19 pandemic. We introduce DeClEx, a pipeline that ensures that data mirrors real-world settings by incorporating Gaussian noise and blur, and that employs autoencoders to learn intermediate feature representations. Our convolutional neural network, paired with spatial attention, then provides accuracy comparable to state-of-the-art pre-trained models while achieving a threefold improvement in training speed. Furthermore, we provide interpretable results using explainable AI techniques. Denoising and deblurring, classification, and explainability are thus integrated in a single pipeline, DeClEx.
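The Gaussian-noise-and-blur degradation step can be sketched as follows (illustrative only; the noise level, kernel size, and the choice of a box blur are assumptions for this example, not the paper's parameters):

```python
import numpy as np

def degrade(img, noise_sigma=0.05, blur_kernel=3, seed=0):
    """Add Gaussian noise and a box blur so training images mirror
    real-world acquisition conditions. `img` is assumed to be a 2-D
    float array with values in [0, 1]."""
    rng = np.random.default_rng(seed)
    noisy = img + rng.normal(0.0, noise_sigma, img.shape)
    k = np.ones((blur_kernel, blur_kernel)) / blur_kernel ** 2
    pad = blur_kernel // 2
    padded = np.pad(noisy, pad, mode='edge')
    blurred = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):            # naive 2-D convolution
        for j in range(img.shape[1]):
            blurred[i, j] = (padded[i:i + blur_kernel, j:j + blur_kernel] * k).sum()
    return np.clip(blurred, 0.0, 1.0)
```

In a pipeline like the one described, the degraded images would be fed to the autoencoder so that it learns representations robust to exactly these corruptions.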

Keywords: machine learning, healthcare, classification, explainability

Procedia PDF Downloads 17
701 Uplink Throughput Prediction in Cellular Mobile Networks

Authors: Engin Eyceyurt, Josko Zec

Abstract:

Current and future cellular mobile communication networks generate enormous amounts of data. Networks have become extremely complex, with an extensive space of parameters, features, and counters. These networks are unmanageable with legacy methods, and an enhanced design and optimization approach, increasingly reliant on machine learning, is necessary. This paper proposes machine learning as a viable approach for uplink throughput prediction. LTE radio metrics such as Reference Signal Received Power (RSRP), Reference Signal Received Quality (RSRQ), and Signal to Noise Ratio (SNR) are used to train models to estimate the expected uplink throughput. A prediction accuracy with a high determination coefficient of 91.2% is obtained from measurements collected with a simple smartphone application.
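As a hedged sketch of predicting throughput from radio metrics and reporting a determination coefficient (ordinary least squares is used here for simplicity; the abstract does not specify which models the authors trained):

```python
import numpy as np

def fit_predict(X, y):
    """Least-squares fit of uplink throughput on radio metrics
    (e.g. columns RSRP, RSRQ, SNR); returns predictions and the
    coefficient of determination R^2."""
    A = np.column_stack([X, np.ones(len(X))])      # add an intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    pred = A @ coef
    ss_res = ((y - pred) ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    return pred, 1.0 - ss_res / ss_tot             # R^2
```

The reported 91.2% corresponds to an R² of 0.912 computed in this way on held-out drive-test measurements.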

Keywords: drive test, LTE, machine learning, uplink throughput prediction

Procedia PDF Downloads 138
700 An Experimental Analysis of Squeeze Casting Parameters for 2017 a Wrought Al Alloy

Authors: Mohamed Ben Amar, Najib Souissi, Chedly Bradai

Abstract:

A Taguchi design investigation has been made into the relationship between the ductility and process variables in a squeeze cast 2017A wrought aluminium alloy. The considered process parameters were: squeeze pressure, melt temperature and die preheating temperature. An orthogonal array (OA), main effect, signal-to-noise (S/N) ratio, and the analysis of variance (ANOVA) are employed to analyze the effect of casting parameters. The results have shown that the selected parameters significantly affect the ductility of 2017A wrought Al alloy castings. Optimal squeeze cast process parameters were provided to illustrate the proposed approach and the results were proven to be trustworthy through practical experiments.
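The larger-the-better S/N ratio used in such a Taguchi analysis (the appropriate form when maximising a response like ductility) can be computed as follows; the `main_effects` helper is an illustrative addition for reading off factor effects from the orthogonal array, not a detail taken from the paper:

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi larger-the-better S/N ratio over replicate responses y_i:
    S/N = -10 * log10( mean(1 / y_i^2) ), in decibels."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

def main_effects(levels, sn):
    """Mean S/N ratio at each level of one factor (one column of the
    orthogonal array): the basis of a main-effect plot."""
    levels, sn = np.asarray(levels), np.asarray(sn)
    return {int(l): float(sn[levels == l].mean()) for l in np.unique(levels)}
```

The optimal setting of each factor is the level with the highest mean S/N ratio, and ANOVA then apportions the variance among the factors.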

Keywords: Taguchi method, squeeze casting, process parameters, ductility, microstructure

Procedia PDF Downloads 380
699 Fabrication of Textile-Based Radio Frequency Metasurfaces

Authors: Adria Kajenski, Guinevere Strack, Edward Kingsley, Shahriar Khushrushahi, Alkim Akyurtlu

Abstract:

Radio Frequency (RF) metasurfaces are arrangements of subwavelength elements interacting with electromagnetic radiation. These arrangements affect polarization state, amplitude, and phase of impinged radio waves; for example, metasurface designs are used to produce functional passband and stopband filters. Recent advances in additive manufacturing techniques have enabled the low-cost, rapid fabrication of ultra-thin metasurface elements on flexible substrates such as plastic films, paper, and textiles. Furthermore, scalable manufacturing processes promote the integration of fabric-based RF metasurfaces into the market of sensors and devices within the Internet of Things (IoT). The design and fabrication of metasurfaces on textiles require a multidisciplinary team with expertise in i) textile and materials science, ii) metasurface design and simulation, and iii) metasurface fabrication and testing. In this presentation, we will discuss RF metasurfaces on fabric with an emphasis on how the materials, including fabric and inks, along with fabrication techniques, affect the RF performance. We printed metasurfaces using a direct-write approach onto various woven and non-woven fabrics, as well as on fabrics coated with either thermoplastic or thermoset coatings. Our team also performed a range of tests on the printed structures, including different inks and their curing parameters, wash durability, abrasion resistance, and RF performance over time.

Keywords: electronic textiles, metasurface, printed electronics, flexible

Procedia PDF Downloads 179
698 The Design Process of an Interactive Seat for Improving Workplace Productivity

Authors: Carlos Ferreira, Paulo Freitas, Valentim Freitas

Abstract:

Creative industries’ workers are becoming more prominent as countries move towards intellectual-based economies. Consequently, the nature and essence of the workplace needs to be reconfigured so that creativity and productivity can be better promoted at these spaces. Using a multidisciplinary approach and a user-centered methodology, combining product design, electronic engineering, software and human-computer interaction, we have designed and developed a new seat that uses embedded sensors and actuators to increase the overall well-being of its users, their productivity and their creativity. Our contribution focuses on the parameters that most affect the user’s work on these kinds of spaces, which are, according to our study, noise and temperature. We describe the design process for a new interactive seat targeted at improving workspace productivity.

Keywords: human-computer interaction, usability, user interface, creativity, ergonomics

Procedia PDF Downloads 206