Search results for: anomalous noise
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1238

668 Angle of Arrival Estimation Using Maximum Likelihood Method

Authors: Olomon Wu, Hung Lu, Nick Wilkins, Daniel Kerr, Zekeriya Aliyazicioglu, H. K. Hwang

Abstract:

Multiple Input Multiple Output (MIMO) radar has received increasing attention in recent years. MIMO radar has many advantages over conventional phased array radar, such as improved target detection, resolution enhancement, and interference suppression. In this paper, results are presented from a simulation study of MIMO Uniformly-Spaced Linear Array (ULA) antennas. The performance is investigated under varied parameters, including array size, Pseudo Random (PN) sequence length, number of snapshots, and Signal to Noise Ratio (SNR). The results for the MIMO array are compared with those of a traditional phased array antenna.
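
To make the estimation step concrete, the following is a minimal sketch (not the authors' simulation code) of maximum-likelihood angle-of-arrival estimation for a single source received by a uniformly spaced linear array; the array size, snapshot count and SNR below are illustrative values, not those studied in the paper.

```python
# Minimal ML angle-of-arrival sketch for a ULA; parameter values are illustrative.
import numpy as np

def steering_vector(theta_deg, n_elements, spacing_wl=0.5):
    """ULA steering vector for an element spacing given in wavelengths."""
    m = np.arange(n_elements)
    return np.exp(-2j * np.pi * spacing_wl * m * np.sin(np.radians(theta_deg)))

def simulate_snapshots(theta_deg, n_elements, n_snapshots, snr_db):
    """Single far-field source in white Gaussian noise."""
    a = steering_vector(theta_deg, n_elements)
    s = (np.random.randn(n_snapshots) + 1j * np.random.randn(n_snapshots)) / np.sqrt(2)
    noise_pow = 10 ** (-snr_db / 10)
    n = np.sqrt(noise_pow / 2) * (np.random.randn(n_elements, n_snapshots)
                                  + 1j * np.random.randn(n_elements, n_snapshots))
    return np.outer(a, s) + n

def ml_aoa(x, n_elements, grid=np.arange(-90, 90.1, 0.1)):
    """For a single source, the ML estimate maximises the beamformer output power."""
    R = x @ x.conj().T / x.shape[1]               # sample covariance matrix
    powers = [np.real(steering_vector(t, n_elements).conj() @ R
                      @ steering_vector(t, n_elements)) for t in grid]
    return grid[int(np.argmax(powers))]

x = simulate_snapshots(theta_deg=20.0, n_elements=8, n_snapshots=64, snr_db=0.0)
print("estimated AoA:", ml_aoa(x, n_elements=8), "deg")
```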

Keywords: MIMO radar, phased array antenna, target detection, radar signal processing

Procedia PDF Downloads 541
667 Nondecoupling Signatures of Supersymmetry and an Lμ-Lτ Gauge Boson at Belle-II

Authors: Heerak Banerjee, Sourov Roy

Abstract:

Supersymmetry, one of the most celebrated fields of study for explaining experimental observations where the standard model (SM) falls short, is reeling from the lack of experimental vindication. At the same time, the idea of additional gauge symmetry, in particular the gauged Lμ-Lτ symmetric models, has also generated significant interest. These models have been extensively proposed to explain the tantalizing discrepancy between the predicted and measured values of the muon anomalous magnetic moment, alongside several other issues plaguing the SM. While very little parameter space within these models remains unconstrained, this work finds that the γ + Missing Energy (ME) signal at the Belle-II detector will be a smoking gun for supersymmetry (SUSY) in the presence of a gauged U(1)Lμ-Lτ symmetry. A remarkable consequence of breaking the enhanced symmetry appearing in the limit of degenerate (s)leptons is the nondecoupling of the radiative contribution of heavy charged sleptons to the γ-Z′ kinetic mixing. The signal process, e⁺e⁻ → γZ′ → γ + ME, is an outcome of this ubiquitous feature. Taking into account the severe constraints placed on gauged Lμ-Lτ models by several low-energy observables, it is shown that any significant excess in all but the highest photon energy bin would be an undeniable signature of such heavy scalar fields in SUSY coupling to the additional gauge boson Z′. The number of signal events depends crucially on the logarithm of the ratio of the stau to smuon mass in the presence of SUSY. In addition, the number is inversely proportional to the e⁺e⁻ collision energy, making a low-energy, high-luminosity collider like Belle-II an ideal testing ground for this channel. This process can probe large swathes of the hitherto free slepton mass ratio vs. additional gauge coupling (gₓ) parameter space. More importantly, it can explore the narrow slice of Z′ mass (MZ′) vs. gₓ parameter space still allowed in gauged U(1)Lμ-Lτ models for superheavy sparticles. The finding that the signal significance is independent of the individual slepton masses is an exciting prospect, as is the prospect that signatures of even superheavy SUSY particles that may have escaped detection at the LHC may show up at the Belle-II detector.

Keywords: additional gauge symmetry, electron-positron collider, kinetic mixing, nondecoupling radiative effect, supersymmetry

Procedia PDF Downloads 127
666 Linear MIMO Model Identification Using an Extended Kalman Filter

Authors: Matthew C. Best

Abstract:

Linear Multi-Input Multi-Output (MIMO) dynamic models can be identified, with no a priori knowledge of model structure or order, using a new Generalised Identifying Filter (GIF). Based on an Extended Kalman Filter, the new filter identifies the model iteratively, in a continuous modal canonical form, using only input and output time histories. The filter’s self-propagating state error covariance matrix allows easy determination of convergence and conditioning, and by progressively increasing model order, the best fitting reduced-order model can be identified. The method is shown to be resistant to noise and can easily be extended to identification of smoothly nonlinear systems.
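
For readers unfamiliar with filter-based identification, the sketch below illustrates the underlying idea on a toy first-order model: the state is augmented with the unknown parameters and an extended Kalman filter estimates both from input/output histories. It is a simplified stand-in, not the paper's Generalised Identifying Filter (which works on MIMO models in modal canonical form); all names and tuning values are illustrative.

```python
# Toy joint state/parameter estimation with an EKF; not the paper's GIF algorithm.
import numpy as np

true_a, true_b = 0.9, 0.5                       # unknown first-order model x+ = a*x + b*u
N = 500
u = np.random.randn(N)
x_true = np.zeros(N)
for k in range(N - 1):
    x_true[k + 1] = true_a * x_true[k] + true_b * u[k]
y = x_true + 0.05 * np.random.randn(N)          # noisy output measurements

# augmented state z = [x, a, b]; the parameters follow a slow random walk
z = np.array([0.0, 0.0, 0.0])
P = np.eye(3)
Q = np.diag([1e-4, 1e-6, 1e-6])                 # process noise (drives parameter learning)
R = 0.05 ** 2                                   # measurement noise variance
H = np.array([[1.0, 0.0, 0.0]])                 # we measure x only

for k in range(N - 1):
    x, a, b = z
    z_pred = np.array([a * x + b * u[k], a, b]) # predict through the augmented model
    F = np.array([[a, x, u[k]],                 # Jacobian of the augmented dynamics
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R                          # measurement update
    K = (P @ H.T) / S
    z = z_pred + (K * (y[k + 1] - z_pred[0])).ravel()
    P = (np.eye(3) - K @ H) @ P

print("identified a, b:", z[1], z[2])
```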

Keywords: system identification, Kalman filter, linear model, MIMO, model order reduction

Procedia PDF Downloads 594
665 Assessment of the Occupancy’s Effect on Speech Intelligibility in Al-Madinah Holy Mosque

Authors: Wasim Orfali, Hesham Tolba

Abstract:

This research investigates the acoustical characteristics of Al-Madinah Holy Mosque. Extensive field measurements were conducted in different locations of Al-Madinah Holy Mosque to characterize its acoustic characteristics. The acoustical characteristics are usually evaluated by the use of objective parameters in unoccupied rooms due to practical considerations. However, under normal conditions, the room occupancy can vary such characteristics due to the effect of the additional sound absorption present in the room or by the change in signal-to-noise ratio. Based on the acoustic measurements carried out in Al-Madinah Holy Mosque with and without occupancy, and the analysis of such measurements, the existence of acoustical deficiencies has been confirmed.

Keywords: Al-Madinah Holy Mosque, mosque acoustics, speech intelligibility, worship sound

Procedia PDF Downloads 177
664 Lifting Wavelet Transform and Singular Values Decomposition for Secure Image Watermarking

Authors: Siraa Ben Ftima, Mourad Talbi, Tahar Ezzedine

Abstract:

In this paper, we present a technique for the secure watermarking of grayscale and color images. The technique consists of applying the Singular Value Decomposition (SVD) in the LWT (Lifting Wavelet Transform) domain in order to insert the watermark image (grayscale) into the host image (grayscale or color). It also uses a signature in the embedding and extraction steps. The technique is applied to a number of grayscale and color images. Its performance is demonstrated by PSNR (Peak Signal to Noise Ratio), MSE (Mean Square Error) and SSIM (structural similarity) computations.
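
A minimal sketch of the general LWT-SVD embedding idea is given below, assuming one Haar lifting step as the LWT and embedding the watermark's singular values into the approximation band; the embedding strength and the omission of the signature step are simplifications rather than the authors' exact scheme.

```python
# Sketch of LWT (one Haar lifting step) + SVD watermark embedding; illustrative only.
import numpy as np

def haar_lift_2d(img):
    """One Haar lifting step along rows then columns; returns (LL, detail bands)."""
    def lift(a):
        even, odd = a[0::2], a[1::2]
        d = odd - even                # predict step
        s = even + d / 2.0            # update step (approximation)
        return s, d
    s_r, d_r = lift(img)
    LL, LH = lift(s_r.T)
    HL, HH = lift(d_r.T)
    return LL.T, (LH.T, HL.T, HH.T)

def embed(host, watermark, alpha=0.05):
    LL, details = haar_lift_2d(host.astype(float))
    U, S, Vt = np.linalg.svd(LL, full_matrices=False)
    Sw = np.linalg.svd(watermark.astype(float), compute_uv=False)
    S_mod = S + alpha * np.pad(Sw, (0, len(S) - len(Sw)))[:len(S)]
    LL_w = U @ np.diag(S_mod) @ Vt
    return LL_w, details              # inverse lifting would rebuild the stego image

def psnr(a, b, peak=255.0):
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return 10 * np.log10(peak ** 2 / mse)

host = np.random.randint(0, 256, (128, 128))
wm = np.random.randint(0, 256, (64, 64))
LL, _ = haar_lift_2d(host.astype(float))
LL_w, _ = embed(host, wm)
print("LL-band PSNR after embedding:", round(psnr(LL, LL_w), 2), "dB")
```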

Keywords: lifting wavelet transform (LWT), sub-space vectorial decomposition, secure image watermarking, watermark

Procedia PDF Downloads 276
663 Radar-Based Classification of Pedestrian and Dog Using High-Resolution Raw Range-Doppler Signatures

Authors: C. Mayr, J. Periya, A. Kariminezhad

Abstract:

In this paper, we developed a learning framework for the classification of vulnerable road users (VRU) by their range-Doppler signatures. The frequency-modulated continuous-wave (FMCW) radar raw data is first pre-processed to obtain robust object range-Doppler maps per coherent time interval. The complex-valued range-Doppler maps captured from our outdoor measurements are then fed into a convolutional neural network (CNN) to learn the classification. This CNN has gone through a hyperparameter optimization process for improved learning. By learning VRU range-Doppler signatures, the three classes 'pedestrian', 'dog', and 'noise' are classified with an average accuracy of almost 95%. Interestingly, this classification accuracy holds for combined longitudinal and lateral object trajectories.
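
A hedged PyTorch sketch of this kind of classifier is shown below: a small CNN with a CBAM-style spatial attention block operating on two-channel (real/imaginary) range-Doppler maps and three output classes. The layer sizes and the attention design are illustrative assumptions, not the hyperparameter-optimised network used in the study.

```python
# Illustrative CNN with spatial attention for range-Doppler maps; not the paper's network.
import torch
import torch.nn as nn

class SpatialAttention(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        # pool across channels, learn where in the range-Doppler map to focus
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        attn = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * attn

class RangeDopplerCNN(nn.Module):
    def __init__(self, n_classes=3):                   # 'pedestrian', 'dog', 'noise'
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            SpatialAttention(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# complex-valued maps are fed as two real channels (real and imaginary parts)
maps = torch.randn(4, 2, 128, 64)          # batch of range-Doppler maps
logits = RangeDopplerCNN()(maps)
print(logits.shape)                        # torch.Size([4, 3])
```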

Keywords: machine learning, radar, signal processing, autonomous driving

Procedia PDF Downloads 244
662 A Blind Three-Dimensional Meshes Watermarking Using the Interquartile Range

Authors: Emad E. Abdallah, Alaa E. Abdallah, Bajes Y. Alskarnah

Abstract:

We introduce a robust three-dimensional watermarking algorithm for copyright protection and indexing. The basic idea behind our technique is to measure the interquartile range, or spread, of the 3D model vertices. The algorithm starts by converting all the vertices to spherical coordinates and then partitioning them into small groups. The proposed algorithm slightly alters the interquartile range distribution of the small groups based on a predefined watermark. Experimental results on several 3D meshes prove the perceptual invisibility and the robustness of the proposed technique against the most common attacks, including compression, noise, smoothing, scaling, and rotation, as well as combinations of these attacks.

Keywords: watermarking, three-dimensional models, perceptual invisibility, interquartile range, 3D attacks

Procedia PDF Downloads 474
661 Characteristics and Durability Evaluation of Air Spring

Authors: Chang Su Woo, Hyun Sung Park

Abstract:

The air spring system is widely accepted for railway vehicle secondary suspension to reduce and absorb vibration and noise. Its low natural frequency ensures a comfortable ride and consistently good stiffness. In this paper, characterization and durability tests were conducted in the laboratory using a servo-hydraulic fatigue testing system for the reliability evaluation of an air spring for an electric railway vehicle. The experimental results show that the characteristics and durability of the domestically developed products are excellent. Moreover, to verify the suitability of the air spring, the ride comfort and air pressure variation were measured in train tests on a subway line. The air spring developed in this study for railway vehicles can guarantee reliability for an average usage of 1 million cycles at a 90% confidence level.

Keywords: air spring, reliability, railway, service lifetime

Procedia PDF Downloads 474
660 Inverse Heat Transfer Analysis of a Melting Furnace Using Levenberg-Marquardt Method

Authors: Mohamed Hafid, Marcel Lacroix

Abstract:

This study presents a simple inverse heat transfer procedure for predicting the wall erosion and the time-varying thickness of the protective bank that covers the inside surface of the refractory brick wall of a melting furnace. The direct problem is solved by using the Finite-Volume model. The melting/solidification process is modeled using the enthalpy method. The inverse procedure rests on the Levenberg-Marquardt method combined with the Broyden method. The effect of the location of the temperature sensors and of the measurement noise on the inverse predictions is investigated. Recommendations are made concerning the location of the temperature sensor.
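The following sketch illustrates the core of such an inverse procedure with SciPy's Levenberg-Marquardt solver: unknown quantities (here a bank thickness and a heat flux) are recovered by least-squares matching of simulated sensor temperatures. The steady one-dimensional conduction forward model, material values and sensor layout are illustrative assumptions; the paper's direct problem is a transient finite-volume enthalpy model.

```python
# Toy inverse heat-conduction fit with Levenberg-Marquardt; assumptions noted above.
import numpy as np
from scipy.optimize import least_squares

k_bank, k_brick = 1.5, 3.0                  # W/m.K, assumed conductivities
T_melt = 1200.0                             # degC at the bank/melt interface
x_sensors = np.array([0.05, 0.10, 0.15])    # sensor depths inside the brick wall (m)

def forward(params, x):
    """Steady 1-D conduction through bank + brick: temperatures at sensor depths."""
    L_bank, q = params
    return T_melt - q * (L_bank / k_bank + x / k_brick)

true_params = np.array([0.04, 6000.0])      # true bank thickness (m) and heat flux (W/m2)
measured = forward(true_params, x_sensors) + np.random.normal(0, 2.0, x_sensors.size)

def residuals(params):
    return forward(params, x_sensors) - measured

fit = least_squares(residuals, x0=[0.01, 3000.0], method='lm')
print("estimated bank thickness and heat flux:", fit.x)
```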

Keywords: melting furnace, inverse heat transfer, enthalpy method, Levenberg-Marquardt method

Procedia PDF Downloads 324
659 Experimental Study of the Sound Absorption of a Geopolymer Panel with a Textile Component Designed for a Railway Corridor

Authors: Ludmila Fridrichová, Roman Knížek, Pavel Němeček, Katarzyna Ewa Buczkowska

Abstract:

The design of a sound absorption panel consisting of three layers is presented in this study. The first layer of the panel is perforated and allows sound to pass into the inner part of the panel. The second layer is composed of a bulk material whose purpose is to absorb as much noise as possible. The third layer of the panel has two functions: the first is to ensure the strength of the panel, and the second is to reflect the sound back into the bulk layer. Experimental results have shown that the size of the holes in the perforated panel affects the sound absorption at the required frequency, and that the filling percentage of the perforated area affects the quantity of sound absorbed.

Keywords: sound absorption, railway corridor, health, textile waste, natural fibres, concrete

Procedia PDF Downloads 14
658 Stochastic Pi Calculus in Financial Markets: An Alternate Approach to High Frequency Trading

Authors: Jerome Joshi

Abstract:

The paper presents the modelling of financial markets using the Stochastic Pi Calculus model. The Stochastic Pi Calculus model is mainly used for biological applications; however, its features promote its use in financial markets, most prominently in high frequency trading. The trading system can be broadly classified into the exchange, market makers or intermediary traders, and fundamental traders. The exchange is where the trade is executed, and the two types of traders act as market participants in the exchange. High frequency trading, with its complex networks and numerous market participants (intermediary and fundamental traders), poses a difficulty for modelling. It requires participants to exploit complex trading algorithms and high execution speeds to carry out large volumes of trades. To earn profits from each trade, the trader must frequently be at the top of the order book, executing or processing multiple trades simultaneously. This requires highly automated systems as well as the right sentiment to outperform other traders. However, always being at the top of the book is not best for the trader either, since it was the cause of the 'Hot-Potato Effect', which in turn demands a better and more efficient model. The characteristics of the model should be such that it is flexible and has diverse applications. Therefore, a model which has been applied in a similar field characterized by such difficulty should be chosen. It should also be flexible in its simulation so that it can be extended and adapted for future research, and equipped with tools that allow it to be used effectively in the field of finance. In this case, the Stochastic Pi Calculus model seems to be an ideal fit for financial applications, owing to its established use in the field of biology. It is an extension of the original Pi Calculus model and, provided its application is further extended, acts as a solution and an alternative to the previously flawed approach. This model focuses on solving the problem which led to the 'Flash Crash', namely the 'Hot-Potato Effect'. The model consists of small sub-systems which can be integrated to form a large system. It is designed in such a way that the behavior of 'noise traders' is treated as a random process, or noise, in the system. While modelling, to get a better understanding of the problem, a broader picture is taken into consideration, including the trader, the system, and the market participants. The paper goes on to explain trading in exchanges, types of traders, high frequency trading, the 'Flash Crash', the 'Hot-Potato Effect', evaluation of orders, and time delay in further detail. In the future, there is a need to focus on the calibration of the modules so that they interact correctly with one another. This model, with its application extended, would provide a basis for further research in the fields of finance and computing.

Keywords: concurrent computing, high frequency trading, financial markets, stochastic pi calculus

Procedia PDF Downloads 77
657 Robust Noisy Speech Identification Using Frame Classifier Derived Features

Authors: Punnoose A. K.

Abstract:

This paper presents an approach for identifying noisy speech recordings using a multi-layer perceptron (MLP) trained to predict phonemes from acoustic features. Characteristics of the MLP posteriors are explored for clean speech and noisy speech at the frame level. Appropriate density functions are used to fit the softmax probability of the clean and noisy speech. A function that takes into account the ratio of the softmax probability density of noisy speech to that of clean speech is formulated. This phoneme-independent score is weighted using phoneme-specific weights to make the scoring more robust. Simple thresholding is used to distinguish the noisy speech recordings from the clean ones. The approach is benchmarked on standard databases, with a focus on precision.
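
A minimal sketch of this scoring idea follows: densities are fitted to the per-frame softmax maxima of clean and noisy training speech, a recording is scored by the phoneme-weighted log-ratio of the noisy to the clean density, and a simple threshold is applied. The Beta density choice, the weights and the threshold are illustrative assumptions rather than the paper's exact functions.

```python
# Sketch of density-ratio scoring of MLP posteriors; densities and weights are assumed.
import numpy as np
from scipy import stats

def fit_density(softmax_max):
    """Fit a Beta density to per-frame maximum softmax probabilities (values in (0,1))."""
    a, b, loc, scale = stats.beta.fit(np.clip(softmax_max, 1e-4, 1 - 1e-4),
                                      floc=0.0, fscale=1.0)
    return stats.beta(a, b, loc=loc, scale=scale)

def recording_score(frame_probs, frame_phonemes, pdf_noisy, pdf_clean, phoneme_weight):
    """Weighted average log density ratio over the frames of one recording."""
    ratio = (np.log(pdf_noisy.pdf(frame_probs) + 1e-12)
             - np.log(pdf_clean.pdf(frame_probs) + 1e-12))
    w = np.array([phoneme_weight.get(p, 1.0) for p in frame_phonemes])
    return float(np.sum(w * ratio) / np.sum(w))

# toy data standing in for MLP posteriors: clean frames peak near 1, noisy frames are flatter
clean_train = np.random.beta(8, 2, 5000)
noisy_train = np.random.beta(3, 3, 5000)
pdf_clean, pdf_noisy = fit_density(clean_train), fit_density(noisy_train)

test_probs = np.random.beta(3, 3, 400)                 # frames of one test recording
test_phonemes = np.random.choice(['aa', 'iy', 's'], 400)
weights = {'aa': 1.2, 'iy': 1.0, 's': 0.8}             # hypothetical phoneme weights
score = recording_score(test_probs, test_phonemes, pdf_noisy, pdf_clean, weights)
print("noisy" if score > 0.0 else "clean", score)
```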

Keywords: noisy speech identification, speech pre-processing, noise robustness, feature engineering

Procedia PDF Downloads 127
656 Omni-Relay (OR) Scheme-Aided LTE-A Communication Systems

Authors: Hassan Mahasneh, Abu Sesay

Abstract:

We propose the use of relay terminals at the cell edge of an LTE-based cellular system. Each relay terminal is equipped with an omni-directional antenna. We refer to this scheme as the Omni-Relay (OR) scheme. The OR scheme coordinates the inter-cell interference (ICI) stemming from adjacent cells and increases the desired signal level in cell-edge regions. To validate the performance of the OR scheme, we derive the average signal-to-interference plus noise ratio (SINR) and the average capacity and compare them with those of the conventional universal frequency reuse factor (UFRF) scheme. The results show that the proposed OR scheme provides higher average SINR and average capacity than the UFRF scheme, due to the assistance of the distributed relay nodes.
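
For reference, the average SINR and average capacity compared above are conventionally defined as follows (a standard textbook formulation, not the authors' full derivation), with P_k|h_k|² the desired received power, the sum running over interfering cells, and σ_n² the noise power:

```latex
\mathrm{SINR}_k \;=\; \frac{P_k\,|h_k|^2}{\sum_{j \neq k} P_j\,|h_j|^2 \;+\; \sigma_n^2},
\qquad
\bar{C} \;=\; \mathbb{E}\!\left[\log_2\!\left(1 + \mathrm{SINR}_k\right)\right]\ \text{bit/s/Hz}.
```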

Keywords: the UFRF scheme, the OR scheme, ICI, relay terminals, SINR, spectral efficiency

Procedia PDF Downloads 341
655 New Features for Copy-Move Image Forgery Detection

Authors: Michael Zimba

Abstract:

A novel set of features for a copy-move image forgery (CMIF) detection method is proposed. The proposed set presents a new approach which relies on electrostatic field theory (EFT). Solely to reduce the dimension of a suspicious image, the method first performs a discrete wavelet transform (DWT) of the image and extracts only the approximation subband. The extracted subband is then bijectively mapped onto a virtual electrostatic field where concepts of EFT are utilised to extract robust features. The extracted features are shown to be invariant to additive noise, JPEG compression, and affine transformation. The proposed features can also be used in general object matching.
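
The dimension-reduction step can be sketched as follows with PyWavelets: the DWT of the suspicious image is taken and only the approximation subband is kept before block partitioning. The wavelet and block size are illustrative choices, and the EFT-based feature mapping itself is not reproduced here.

```python
# Sketch of the DWT dimension-reduction step only; wavelet and block size are assumptions.
import numpy as np
import pywt

image = np.random.randint(0, 256, (256, 256)).astype(float)   # stand-in suspicious image
cA, (cH, cV, cD) = pywt.dwt2(image, 'haar')                    # keep only cA (approximation)

block = 8
blocks = [cA[i:i + block, j:j + block]
          for i in range(0, cA.shape[0] - block + 1)
          for j in range(0, cA.shape[1] - block + 1)]
print(len(blocks), "overlapping blocks of size", block, "x", block)
```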

Keywords: virtual electrostatic field, features, affine transformation, copy-move image forgery

Procedia PDF Downloads 543
654 Image Steganography Using Predictive Coding for Secure Transmission

Authors: Baljit Singh Khehra, Jagreeti Kaur

Abstract:

In this paper, a steganographic strategy is used to hide a text file inside an image. To increase the storage limit, predictive coding is utilized to embed the information. In the proposed scheme, one can exchange secure information by means of a predictive coding methodology. The predictive coding produces a high-quality stego-image, whose pixels are utilized to embed the secret information. The proposed information-hiding scheme is robust compared with existing methodologies, and helps users conceal information efficiently. Entropy, standard deviation, mean square error and peak signal-to-noise ratio are the parameters used to evaluate the proposed methodology. The results of the proposed approach are quite promising.

Keywords: cryptography, steganography, reversible image, predictive coding

Procedia PDF Downloads 417
653 Sedimentary, Diagenesis and Evaluation of High Quality Reservoir of Coarse Clastic Rocks in Nearshore Deep Waters in the Dongying Sag, Bohai Bay Basin

Authors: Kouassi Louis Kra

Abstract:

The nearshore deep-water gravity flow deposits on the northern steep slope of the Dongying depression, Bohai Bay basin, have been acknowledged as important reservoirs in the rift lacustrine basin. These deep strata, termed coarse clastic sediments and deposited at the root of the slope, have complex depositional processes and involve a wide range of diagenetic events, which makes high-quality reservoir prediction complex. Based on an integrated study of seismic interpretation, sedimentary analysis, petrography, core samples, wireline logging data, 3D seismic and lithological data, the reservoir formation mechanism was deciphered. The Geoframe software was used to analyze 3D seismic data to interpret the stratigraphy and build a sequence stratigraphic framework. Thin section identification and point counts were performed to assess the reservoir characteristics. The software PetroMod 1D of Schlumberger was utilized for the simulation of burial history. CL and SEM analyses were performed to reveal diagenesis sequences. Backscattered electron (BSE) images were recorded to define the textural relationships between diagenetic phases. The results show that the nearshore steep slope deposits mainly consist of conglomerate, gravel sandstone, pebbly sandstone and fine sandstone interbedded with mudstone. The reservoir is characterized by low porosity and ultra-low permeability. The diagenetic reactions include compaction, precipitation of calcite, dolomite, kaolinite and quartz cement, and dissolution of feldspars and rock fragments. The main types of reservoir space are primary intergranular pores, residual intergranular pores, intergranular dissolved pores, and fractures. There are three obvious anomalously high-porosity zones in the reservoir. Overpressure and early hydrocarbon filling are the main reasons for abnormal secondary pore development. Sedimentary facies control the formation of the high-quality reservoir, and oil and gas filling preserves secondary pores from late carbonate cementation.

Keywords: Bohai Bay, Dongying Sag, deep strata, formation mechanism, high-quality reservoir

Procedia PDF Downloads 135
652 Analysis and Performance of Handover in Universal Mobile Telecommunications System (UMTS) Network Using OPNET Modeller

Authors: Latif Adnane, Benaatou Wafa, Pla Vicent

Abstract:

Handover is of great significance for achieving seamless connectivity in wireless networks. This paper gives an overview of the main factors affected by the soft and hard handover techniques. To understand the handover process in the Universal Mobile Telecommunications System (UMTS) network, different statistics are calculated. The paper focuses on the quality of service (QoS) of soft and hard handover in the UMTS network, which includes the analysis of received power, signal-to-noise ratio, throughput, traffic delay, traffic received, delay, total transmit load, end-to-end delay and upload response time using the OPNET simulator.

Keywords: handover, UMTS, mobility, simulation, OPNET modeler

Procedia PDF Downloads 321
651 A Bayesian Model with Improved Prior in Extreme Value Problems

Authors: Eva L. Sanjuán, Jacinto Martín, M. Isabel Parra, Mario M. Pizarro

Abstract:

In Extreme Value Theory, inference for the parameters of the distribution is made employing a small part of the observed values. When block maxima are taken, many data are discarded. We developed a new Bayesian inference model to exploit all the information provided by the data, introducing informative priors and using the relations between baseline and limit parameters. Firstly, we studied the accuracy of the new model for three baseline distributions that lead to a Gumbel extreme distribution: Exponential, Normal and Gumbel. Secondly, we considered mixtures of Normal variables, to simulate practical situations in which data do not follow pure distributions because of perturbations (noise).
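
A minimal sketch of this kind of Bayesian fit is shown below: block maxima from an assumed Exponential baseline are fitted with a Gumbel likelihood and informative Normal priors via random-walk Metropolis-Hastings. The prior means, proposal widths and block size are illustrative assumptions; the paper's model additionally links the priors to the baseline-distribution parameters.

```python
# Sketch of Bayesian Gumbel inference with informative priors; not the authors' model.
import numpy as np

def gumbel_loglik(x, mu, beta):
    z = (x - mu) / beta
    return np.sum(-np.log(beta) - z - np.exp(-z))

def log_post(x, mu, log_beta, prior_mu=(10.0, 5.0), prior_logb=(0.0, 1.0)):
    lp = -0.5 * ((mu - prior_mu[0]) / prior_mu[1]) ** 2        # informative Normal priors
    lp += -0.5 * ((log_beta - prior_logb[0]) / prior_logb[1]) ** 2
    return lp + gumbel_loglik(x, mu, np.exp(log_beta))

# block maxima of an assumed Exponential baseline (blocks of 50 observations)
maxima = np.random.exponential(2.0, (200, 50)).max(axis=1)

theta = np.array([maxima.mean(), np.log(maxima.std())])        # start at moment estimates
samples = []
for _ in range(20000):
    prop = theta + np.random.normal(0, [0.05, 0.05])
    if np.log(np.random.rand()) < log_post(maxima, *prop) - log_post(maxima, *theta):
        theta = prop
    samples.append(theta.copy())

post = np.array(samples[5000:])                                # discard burn-in
print("posterior mean location / scale:", post[:, 0].mean(), np.exp(post[:, 1]).mean())
```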

Keywords: bayesian inference, extreme value theory, Gumbel distribution, highly informative prior

Procedia PDF Downloads 198
650 Perceptual Organization within Temporal Displacement

Authors: Michele Sinico

Abstract:

The psychological present has an actual extension. When a sequence of instantaneous stimuli falls in this short interval of time, observers perceive a compresence of events in succession and the temporal order depends on the qualitative relationships between the perceptual properties of the events. Two experiments were carried out to study the influence of perceptual grouping, with and without temporal displacement, on the duration of auditory sequences. The psychophysical method of adjustment was adopted. The first experiment investigated the effect of temporal displacement of a white noise on sequence duration. The second experiment investigated the effect of temporal displacement, along the pitch dimension, on temporal shortening of sequence. The results suggest that the temporal order of sounds, in the case of temporal displacement, is organized along the pitch dimension.

Keywords: time perception, perceptual present, temporal displacement, Gestalt laws of perceptual organization

Procedia PDF Downloads 250
649 Developing an AI-Driven Application for Real-Time Emotion Recognition from Human Vocal Patterns

Authors: Sayor Ajfar Aaron, Mushfiqur Rahman, Sajjat Hossain Abir, Ashif Newaz

Abstract:

This study delves into the development of an artificial intelligence application designed for real-time emotion recognition from human vocal patterns. Utilizing advanced machine learning algorithms, including deep learning and neural networks, the paper highlights both the technical challenges and potential opportunities in accurately interpreting emotional cues from speech. Key findings demonstrate the critical role of diverse training datasets and the impact of ambient noise on recognition accuracy, offering insights into future directions for improving robustness and applicability in real-world scenarios.

Keywords: artificial intelligence, convolutional neural network, emotion recognition, vocal patterns

Procedia PDF Downloads 52
648 In-situ Acoustic Emission Analysis of a Polymer Electrolyte Membrane Water Electrolyser

Authors: M. Maier, I. Dedigama, J. Majasan, Y. Wu, Q. Meyer, L. Castanheira, G. Hinds, P. R. Shearing, D. J. L. Brett

Abstract:

Increasing the efficiency of electrolyser technology is commonly seen as one of the main challenges on the way to the Hydrogen Economy. There is a significant lack of understanding of the different states of operation of polymer electrolyte membrane water electrolysers (PEMWE) and how these influence the overall efficiency. This applies in particular to the two-phase flow through the membrane, gas diffusion layers (GDL) and flow channels. In order to increase the efficiency of PEMWE and facilitate their spread as a commercial hydrogen production technology, new analytic approaches have to be found. Acoustic emission (AE) offers the possibility to analyse the processes within a PEMWE in a non-destructive, fast and cheap in-situ way. This work describes the generation and analysis of AE data coming from a PEM water electrolyser, for, to the best of our knowledge, the first time in the literature. Different experiments are carried out. Each experiment is designed so that only specific physical processes occur and AE solely related to one process can be measured. Therefore, a range of experimental conditions is used to induce different flow regimes within the flow channels and GDL. The resulting AE data is first separated into different events, which are defined by exceeding the noise threshold. Each acoustic event consists of a number of consecutive peaks and ends when the wave diminishes below the noise threshold. For all these acoustic events the following key attributes are extracted: maximum peak amplitude, duration, number of peaks, number of peaks before the maximum, average peak intensity and time until the maximum is reached. Each event is then expressed as a vector containing the normalized values for all criteria. Principal Component Analysis is performed on the resulting data, which orders the criteria by the eigenvalues of their covariance matrix. This can be used as an easy way of determining which criteria convey the most information about the acoustic data. The data is then ordered in the two- or three-dimensional space formed by the most relevant criteria axes. By finding regions of this space occupied only by acoustic events originating from one of the three experiments, it is possible to relate physical processes to certain acoustic patterns. Due to the complex nature of the AE data, modern machine learning techniques are needed to recognize these patterns in-situ. The AE data produced in this way can be used to train a self-learning algorithm and develop an analytical tool to diagnose different operational states in a PEMWE. Combining this technique with the measurement of polarization curves and electrochemical impedance spectroscopy allows for in-situ optimization and recognition of suboptimal states of operation.
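
A minimal sketch of this processing chain is given below: a synthetic AE signal is thresholded into events, the listed per-event attributes are computed, normalised and passed to PCA. The synthetic bursts, threshold rule and sampling rate are illustrative assumptions standing in for the measured electrolyser data.

```python
# Sketch of AE event extraction, feature vectors and PCA; synthetic data stands in for AE.
import numpy as np
from scipy.signal import find_peaks
from sklearn.decomposition import PCA

fs = 1_000_000                                          # assumed sampling rate (Hz)
n = 200_000
signal = 0.01 * np.random.randn(n)                      # background noise
for start in np.random.randint(0, n - 2000, 30):        # inject 30 decaying bursts as "events"
    k = np.arange(2000)
    signal[start:start + 2000] += (np.random.uniform(0.1, 0.5)
                                   * np.exp(-k / 400.0) * np.sin(2 * np.pi * 150e3 * k / fs))

raw_thr = 5 * np.median(np.abs(signal)) / 0.6745        # robust (MAD-based) noise threshold
env = np.convolve(np.abs(signal), np.ones(64) / 64, mode='same')
above = env > raw_thr

def event_features(seg):
    amp = np.abs(seg)
    peaks, _ = find_peaks(amp, height=raw_thr)
    i_max = int(np.argmax(amp))
    return [amp.max(),                                  # maximum peak amplitude
            len(seg) / fs,                              # duration
            len(peaks),                                 # number of peaks
            int(np.sum(peaks < i_max)),                 # peaks before the maximum
            amp[peaks].mean() if len(peaks) else 0.0,   # average intensity of a peak
            i_max / fs]                                 # time until the maximum is reached

edges = np.flatnonzero(np.diff(above.astype(int))) + 1  # event start/end indices
events = [signal[a:b] for a, b in zip(edges[::2], edges[1::2]) if b - a > 64]

X = np.array([event_features(e) for e in events])
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)      # normalise the criteria
scores = PCA(n_components=3).fit_transform(X)           # rank criteria by explained variance
print(len(events), "events ->", scores.shape)
```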

Keywords: acoustic emission, gas diffusion layers, in-situ diagnosis, PEM water electrolyser

Procedia PDF Downloads 155
647 Image Steganography Using Least Significant Bit Technique

Authors: Preeti Kumari, Ridhi Kapoor

Abstract:

In any communication, security is the most important issue in today's world. In this paper, steganography is the process of hiding important data inside other data, such as text, audio, video, and images. The interest in this topic is to provide availability, confidentiality, integrity, and authenticity of data. The steganographic technique embeds the hidden content in unremarkable cover media so as not to arouse the suspicion of eavesdroppers, third parties, or hackers. Many applications of compression, encryption, decryption, and embedding methods are used in digital image steganography. Due to compression, noise is produced in the image. To keep this noise imperceptible, the LSB insertion technique is used. The performance of the proposed embedding system with respect to providing security to the secret message and robustness is discussed. We also demonstrate the maximum steganography capacity and visual distortion.
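
The LSB insertion step itself can be sketched as follows: the bits of a short text message replace the least significant bit of the cover pixels and are read back from the stego image. Real systems add encryption and capacity checks; this is only an illustration of the embedding idea.

```python
# Illustrative LSB embedding/extraction for a grayscale cover image.
import numpy as np

def embed_lsb(cover, message):
    bits = np.unpackbits(np.frombuffer(message.encode(), dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("message too long for this cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits      # overwrite the LSBs only
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_chars):
    bits = (stego.flatten()[:n_chars * 8] & 1).astype(np.uint8)
    return np.packbits(bits).tobytes().decode()

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
stego = embed_lsb(cover.copy(), "hidden text")
print(extract_lsb(stego, len("hidden text")))                 # -> "hidden text"
print("max pixel change:", int(np.max(np.abs(stego.astype(int) - cover.astype(int)))))
```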

Keywords: steganography, LSB, encoding, information hiding, color image

Procedia PDF Downloads 474
646 Investigating the Demand of Short-Shelf Life Food Products for SME Wholesalers

Authors: Yamini Raju, Parminder S. Kang, Adam Moroz, Ross Clement, Alistair Duffy, Ashley Hopwell

Abstract:

Accurate prediction of fresh produce demand is one of the challenges faced by Small and Medium Enterprise (SME) wholesalers. Current research in this area has focused on a limited number of factors specific to a single product or business type. This paper gives an overview of the current literature on the variability factors used to predict demand and the existing forecasting techniques for short shelf life products. It then extends this work by adding new factors and investigating whether there is a time lag and a possibility of noise in the orders. It also identifies the most important factors using correlation and Principal Component Analysis (PCA).

Keywords: demand forecasting, deteriorating products, food wholesalers, principal component analysis, variability factors

Procedia PDF Downloads 520
645 Fractal-Wavelet Based Techniques for Improving the Artificial Neural Network Models

Authors: Reza Bazargan lari, Mohammad H. Fattahi

Abstract:

Natural resources management, including water resources, requires reliable estimations of time-variant environmental parameters. Small improvements in the estimation of environmental parameters would have great effects on management decisions. Noise reduction using wavelet techniques is an effective approach for the pre-processing of practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based pre-processing. Time series correlation and persistency, the minimum sufficient length for training the prediction model, and the maximum valid length of predictions were also investigated through a fractal assessment.
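
The wavelet-based noise-reduction step can be sketched as follows for a river-flow-like series: decompose with a discrete wavelet, soft-threshold the detail coefficients and reconstruct. The wavelet, decomposition level and universal threshold are common defaults, not necessarily the authors' choices.

```python
# Sketch of wavelet de-noising pre-processing; wavelet and threshold rule are common defaults.
import numpy as np
import pywt

t = np.linspace(0, 10, 1024)
flow = 50 + 20 * np.sin(2 * np.pi * t / 5) + 5 * np.random.randn(t.size)   # noisy series

coeffs = pywt.wavedec(flow, 'db4', level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745               # noise estimate from finest scale
thr = sigma * np.sqrt(2 * np.log(flow.size))                 # universal threshold
denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
denoised = pywt.waverec(denoised_coeffs, 'db4')[:flow.size]

clean = 50 + 20 * np.sin(2 * np.pi * t / 5)
print("residual std before/after de-noising:",
      round(np.std(flow - clean), 2), round(np.std(denoised - clean), 2))
```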

Keywords: wavelet, de-noising, predictability, time series fractal analysis, valid length, ANN

Procedia PDF Downloads 368
644 DeClEx-Processing Pipeline for Tumor Classification

Authors: Gaurav Shinde, Sai Charan Gongiguntla, Prajwal Shirur, Ahmed Hambaba

Abstract:

Health issues are increasing significantly, putting a substantial strain on healthcare services and accelerating the integration of machine learning in healthcare, particularly following the COVID-19 pandemic. We introduce DeClEx, a pipeline that ensures that data mirrors real-world settings by incorporating Gaussian noise and blur, and that employs autoencoders to learn intermediate feature representations. Subsequently, our convolutional neural network, paired with spatial attention, provides accuracy comparable to state-of-the-art pre-trained models while achieving a threefold improvement in training speed. Furthermore, we provide interpretable results using explainable AI techniques. Denoising and deblurring, classification, and explainability are thus integrated in a single pipeline called DeClEx.
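
The data-degradation step of such a pipeline can be sketched as follows: clean scans are blurred and corrupted with Gaussian noise so that training data mirrors real-world settings. The noise level and blur radius are illustrative; the autoencoder, CNN and attention stages are not reproduced here.

```python
# Sketch of the Gaussian noise + blur degradation step; parameter values are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

def degrade(image, noise_sigma=0.05, blur_sigma=1.5, seed=None):
    """Blur then add zero-mean Gaussian noise; input and output are floats in [0, 1]."""
    rng = np.random.default_rng(seed)
    blurred = gaussian_filter(image, sigma=blur_sigma)
    noisy = blurred + rng.normal(0.0, noise_sigma, image.shape)
    return np.clip(noisy, 0.0, 1.0)

clean = np.random.rand(128, 128)                          # stand-in for a clean scan
degraded = degrade(clean, noise_sigma=0.05, blur_sigma=1.5, seed=0)
print("pixel-wise RMS change:", round(float(np.sqrt(np.mean((degraded - clean) ** 2))), 4))
```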

Keywords: machine learning, healthcare, classification, explainability

Procedia PDF Downloads 55
643 Uplink Throughput Prediction in Cellular Mobile Networks

Authors: Engin Eyceyurt, Josko Zec

Abstract:

Current and future cellular mobile communication networks generate enormous amounts of data. Networks have become extremely complex, with an extensive space of parameters, features and counters. These networks are unmanageable with legacy methods, and an enhanced design and optimization approach that is increasingly reliant on machine learning is necessary. This paper proposes machine learning as a viable approach for uplink throughput prediction. LTE radio metrics, such as Reference Signal Received Power (RSRP), Reference Signal Received Quality (RSRQ), and Signal to Noise Ratio (SNR), are used to train models to estimate the expected uplink throughput. A prediction accuracy with a high determination coefficient of 91.2% is obtained from measurements collected with a simple smartphone application.
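
A minimal sketch of the prediction task is given below: a regressor is trained on RSRP, RSRQ and SNR to estimate uplink throughput and scored by the coefficient of determination. The synthetic drive-test data and the random-forest model are illustrative stand-ins for the measured data and the models compared in the paper.

```python
# Sketch of throughput regression on LTE radio metrics; data and model are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 2000
rsrp = rng.uniform(-120, -70, n)          # dBm
rsrq = rng.uniform(-20, -5, n)            # dB
snr = rng.uniform(-5, 30, n)              # dB
# synthetic ground truth: throughput grows with SNR and RSRP, plus measurement noise
throughput = 2.0 * snr + 0.3 * (rsrp + 120) + 0.5 * (rsrq + 20) + rng.normal(0, 5, n)

X = np.column_stack([rsrp, rsrq, snr])
X_tr, X_te, y_tr, y_te = train_test_split(X, throughput, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out drive-test samples:", round(r2_score(y_te, model.predict(X_te)), 3))
```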

Keywords: drive test, LTE, machine learning, uplink throughput prediction

Procedia PDF Downloads 156
642 An Experimental Analysis of Squeeze Casting Parameters for 2017A Wrought Al Alloy

Authors: Mohamed Ben Amar, Najib Souissi, Chedly Bradai

Abstract:

A Taguchi design investigation has been made into the relationship between the ductility and process variables in a squeeze cast 2017A wrought aluminium alloy. The considered process parameters were: squeeze pressure, melt temperature and die preheating temperature. An orthogonal array (OA), main effect, signal-to-noise (S/N) ratio, and the analysis of variance (ANOVA) are employed to analyze the effect of casting parameters. The results have shown that the selected parameters significantly affect the ductility of 2017A wrought Al alloy castings. Optimal squeeze cast process parameters were provided to illustrate the proposed approach and the results were proven to be trustworthy through practical experiments.
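
The Taguchi analysis step can be sketched as follows: the larger-the-better signal-to-noise ratio is computed for the ductility of each run of an L9 orthogonal array and averaged per factor level to obtain the main effects. The run results below are made-up placeholders, not the paper's measurements.

```python
# Sketch of Taguchi larger-the-better S/N analysis; run results are hypothetical placeholders.
import numpy as np

def sn_larger_is_better(y):
    """Taguchi S/N ratio (dB) when a larger response (here ductility) is desired."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# L9 array: columns = squeeze pressure, melt temperature, die preheating temperature (levels 1-3)
L9 = np.array([[1, 1, 1], [1, 2, 2], [1, 3, 3],
               [2, 1, 2], [2, 2, 3], [2, 3, 1],
               [3, 1, 3], [3, 2, 1], [3, 3, 2]])
ductility = [[8.1, 8.3], [9.0, 8.8], [9.5, 9.7],      # hypothetical % elongation, 2 repeats/run
             [9.2, 9.4], [10.1, 9.9], [9.6, 9.8],
             [10.4, 10.6], [10.0, 10.2], [9.3, 9.5]]

sn = np.array([sn_larger_is_better(y) for y in ductility])
for col, name in enumerate(["squeeze pressure", "melt temperature", "die preheating"]):
    effects = [sn[L9[:, col] == level].mean() for level in (1, 2, 3)]
    print(f"{name}: mean S/N per level = {np.round(effects, 2)}")
```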

Keywords: Taguchi method, squeeze casting, process parameters, ductility, microstructure

Procedia PDF Downloads 400
641 The Design Process of an Interactive Seat for Improving Workplace Productivity

Authors: Carlos Ferreira, Paulo Freitas, Valentim Freitas

Abstract:

Creative industries’ workers are becoming more prominent as countries move towards intellectual-based economies. Consequently, the nature and essence of the workplace needs to be reconfigured so that creativity and productivity can be better promoted in these spaces. Using a multidisciplinary approach and a user-centered methodology, combining product design, electronic engineering, software and human-computer interaction, we have designed and developed a new seat that uses embedded sensors and actuators to increase the overall well-being of its users, their productivity and their creativity. Our contribution focuses on the parameters that most affect the user’s work in these kinds of spaces, which are, according to our study, noise and temperature. We describe the design process for a new interactive seat targeted at improving workspace productivity.

Keywords: human-computer interaction, usability, user interface, creativity, ergonomics

Procedia PDF Downloads 221
640 OFDM Radar for High Accuracy Target Tracking

Authors: Mahbube Eghtesad

Abstract:

For a number of years, the problem of simultaneous detection and tracking of a target has been one of the most relevant and challenging issues in a wide variety of military and civilian systems. We develop methods for detecting and tracking a target using an orthogonal frequency division multiplexing (OFDM) based radar. As a preliminary step, we introduce the target trajectory and Gaussian noise model in discrete-time form. Then, resorting to matched filtering and Kalman filtering, we derive a detector and a target tracker. After that, we propose an OFDM radar in order to achieve further improvement in tracking performance. The motivation for employing multiple frequencies is that the different scattering centers of a target resonate differently at each frequency. Numerical examples illustrate our analytical results, demonstrating the performance improvement achieved by the OFDM signaling method.
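
The matched-filter detection step can be sketched as follows: an OFDM-like pulse built from random QPSK subcarrier symbols is correlated against a delayed, noisy echo and the delay is read from the correlation peak. Subcarrier count, delay and noise level are illustrative values, and the Kalman tracking stage is not reproduced here.

```python
# Sketch of matched-filter delay estimation for an OFDM-like pulse; values are illustrative.
import numpy as np

n_sub = 64
symbols = np.exp(1j * np.pi / 2 * np.random.randint(0, 4, n_sub))   # QPSK subcarrier symbols
pulse = np.fft.ifft(symbols)                                        # OFDM pulse (time domain)

true_delay = 37
rx = np.zeros(256, dtype=complex)
rx[true_delay:true_delay + n_sub] = pulse                            # target echo
rx += 0.05 * (np.random.randn(256) + 1j * np.random.randn(256))      # receiver noise

# matched filter: cross-correlate the received signal with the known pulse
mf_out = np.abs(np.correlate(rx, pulse, mode='full'))
estimated_delay = int(np.argmax(mf_out)) - (len(pulse) - 1)
print("true vs estimated delay:", true_delay, estimated_delay)
```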

Keywords: matched filter, target tracking, OFDM radar, Kalman filter

Procedia PDF Downloads 398
639 Periodically Forced Oscillator with Noisy Chaotic Dynamics

Authors: Adedayo Oke Adelakun

Abstract:

The chaotic dynamics of periodically forced oscillators with smooth potentials have been extensively investigated via theoretical, numerical and experimental simulations. With the advent of the study of chaotic dynamics by means of multiple time scale analysis, Melnikov theory, bifurcation diagrams, Poincaré maps and Lyapunov exponents, it has become necessary to seek a better understanding of nonlinear oscillators with noisy terms. In this paper, we examine the influence of noise on the complex dynamical behaviour of the periodically forced F6-Duffing oscillator for specific choices of the noise parameters. The inclusion of the noise term improves the dynamical behaviour of the oscillator, which may have wider application in secure communication than the smooth potential.
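
A simulation of such an oscillator can be sketched with an Euler-Maruyama integration of a Duffing-type equation, assuming a sextic (F6-type) potential, harmonic forcing and additive Gaussian noise; the potential coefficients, forcing and noise strength below are illustrative, not the specific parameter values studied in the paper.

```python
# Euler-Maruyama sketch of a noisy, periodically forced Duffing-type oscillator (assumed F6 potential).
import numpy as np

def dU_dx(x, a=-1.0, b=-0.5, c=0.3):
    """Gradient of an assumed sextic potential U(x) = a x^2/2 + b x^4/4 + c x^6/6."""
    return a * x + b * x ** 3 + c * x ** 5

delta, f0, omega, D = 0.25, 0.35, 1.0, 0.02   # damping, forcing amplitude/frequency, noise
dt, n_steps = 1e-3, 200_000
x, v = 0.1, 0.0
rng = np.random.default_rng(0)
traj = np.empty(n_steps)

for i in range(n_steps):
    t = i * dt
    # x' = v ;  v' = -delta v - dU/dx + f0 cos(omega t) + sqrt(2 D) xi(t)
    v += dt * (-delta * v - dU_dx(x) + f0 * np.cos(omega * t)) + np.sqrt(2 * D * dt) * rng.normal()
    x += dt * v
    traj[i] = x

print("displacement mean / std over the run:", round(traj.mean(), 3), round(traj.std(), 3))
```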

Keywords: hierarchical structure, periodically forced oscillator, noisy parameters, dynamical behaviour, F6-Duffing oscillator

Procedia PDF Downloads 325