Search results for: noise filters
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1417

877 On the Equalization of Nonminimum Phase Electroacoustic Systems Using Digital Inverse Filters

Authors: Avelino Marques, Diamantino Freitas

Abstract:

Some important electroacoustic systems, such as loudspeaker systems, exhibit a nonminimum phase behavior that makes the application of advanced digital signal processing techniques, such as linear equalization, considerably more demanding. In this paper, the position and the number of zeros and poles of the inverse filter, whether of FIR or IIR type, designed using time-domain techniques, are studied, compared, and related to the nonminimum phase zeros of the system to be equalized. Conclusions about the impact of the position of the system's nonminimum phase zeros on the length/order of the inverse filter and on the delay of the equalized system are outlined as a guide for deciding in advance which type of filter is more adequate.
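As an illustration of the time-domain inverse-filter design discussed above, a minimal least-squares FIR inverse can be sketched as below; the example loudspeaker response, filter length, and modelling delay are assumed values for illustration, not those used by the authors.

```python
import numpy as np
from scipy.linalg import toeplitz

def fir_inverse(h, n_taps, delay):
    """Least-squares FIR inverse: find g so that (h * g)[n] ~ delta[n - delay]."""
    n_out = len(h) + n_taps - 1
    # Convolution matrix: each column is a shifted copy of h
    col = np.concatenate([h, np.zeros(n_taps - 1)])
    row = np.zeros(n_taps)
    row[0] = h[0]
    H = toeplitz(col, row)
    d = np.zeros(n_out)
    d[delay] = 1.0                      # desired response: a delayed impulse
    g, *_ = np.linalg.lstsq(H, d, rcond=None)
    return g

# Example: a nonminimum phase system (zero at z = 1.5, outside the unit circle)
h = np.array([1.0, -1.5])
g = fir_inverse(h, n_taps=64, delay=32)   # a longer delay improves equalization
print(np.round(np.convolve(h, g)[:5], 3))
```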

Keywords: loudspeaker systems, nonminimum phase system, FIR and IIR filter, delay

Procedia PDF Downloads 54
876 Green Building Risks: Limits on Environmental and Health Quality Metrics for Contractors

Authors: Erica Cochran Hameen, Bobuchi Ken-Opurum, Mounica Guturu

Abstract:

The United States (U.S.) populace spends the majority of its time indoors, in spaces where building codes and voluntary sustainability standards provide clear Indoor Environmental Quality (IEQ) metrics. The existing sustainable building standards and codes are aimed at improving IEQ and occupant health and at reducing the negative impacts of buildings on the environment. While they address the post-occupancy stage of buildings, there are fewer standards for the pre-occupancy stage, thereby placing a large labor population in environments that are much less regulated. Construction personnel are often exposed to a variety of uncomfortable and unhealthy elements while on construction sites, primarily thermal, visual, acoustic, and air quality related. Construction site power generators, equipment, and machinery generate noise on average 9 decibels (dBA) above the U.S. OSHA limits, creating uncomfortable noise levels. Research has shown that frequent exposure to high noise levels leads to chronic physiological issues and increases noise-induced stress, yet beyond OSHA no other metric focuses directly on the impacts of noise on contractors' well-being. Research has also associated natural light with higher productivity, longer attention span, and fewer cases of fatigue in construction workers. However, daylight is not always available, as construction workers often perform tasks in cramped spaces, dark areas, or at nighttime. In these instances, the use of artificial light is necessary, yet lighting standards for lengthy tasks and arduous activities are not specified. Additionally, ambient air, contaminants, and material off-gassing expelled at construction sites are among the causes of serious health effects in construction workers. Coupled with extreme hot and cold temperatures across different climate zones, health and productivity can be seriously compromised. This research evaluates the impact of existing green building metrics on construction and risk management by analyzing two codes and nine standards, including LEED, WELL, and BREEAM. These metrics were chosen based on their relevance to the U.S. construction industry. This research determined that less than 20% of the sustainability content within the standards and codes (texts) relates to the pre-occupancy building sector. The research also investigated the impact of construction personnel's health and well-being on construction management through two surveys of project managers and of on-site contractors' perceptions of how their work environment affects productivity. To fully understand the risks of limited Environmental and Health Quality metrics for contractors (EHQ), this research evaluated the connection between EHQ factors, such as inefficient lighting, and construction workers, and investigated the correlation between various on-site coping strategies for comfort and productivity. Outcomes from this research are three-pronged. The first is fostering a discussion about the existing conditions of EHQ elements, i.e., thermal, lighting, ergonomic, acoustic, and air quality, for the construction labor force. The second identifies gaps in sustainability standards and codes during the pre-occupancy stage of building construction, from ground-breaking to substantial completion. The third identifies opportunities for improvements and mitigation strategies to improve EHQ, such as increased monitoring of effects on contractors' productivity and health and increased inclusion of the pre-occupancy stage in green building standards.

Keywords: construction contractors, health and well-being, environmental quality, risk management

Procedia PDF Downloads 119
875 Sensor Registration in Multi-Static Sonar Fusion Detection

Authors: Longxiang Guo, Haoyan Hao, Xueli Sheng, Hanjun Yu, Jingwei Yin

Abstract:

In order to prevent target splitting and ensure the accuracy of fusion, system error registration is an important step in a multi-static sonar fusion detection system. To eliminate the inherent system errors of each sonar, namely the distance error and the angle error, this paper uses an offline estimation method for error registration. Suppose several sonars on different platforms work together to detect a target; the target position detected by each sonar is expressed in that sonar's own reference coordinate system. Based on the two-dimensional stereo projection method, this paper uses the real-time quality control (RTQC) method and the least squares (LS) method to estimate the sensor biases. The RTQC method takes the average value of each sonar's data as the observation value, while the LS method applies least-squares processing to each sonar's data to obtain the observation value. MATLAB simulations in an underwater acoustic environment show that both algorithms can estimate the distance and angle errors of the sonar system. The performance of the two algorithms is also compared through the root mean square error, and the influence of measurement noise on registration accuracy is explored by simulation. The system error convergence of the RTQC method is rapid, but the distribution of targets has a serious impact on its performance. The LS method is not affected by the target distribution, but an increase in random noise slows down its convergence rate. The LS method is an improvement on the RTQC method and is widely used in two-dimensional registration. The improved method can be used for registration in underwater multi-target detection.
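A simplified sketch of the least-squares idea behind offline bias registration is given below; the single-sonar additive-bias model and the simulated values are illustrative assumptions and do not reproduce the paper's two-sonar stereo-projection geometry.

```python
import numpy as np

def estimate_biases(ranges_meas, bearings_meas, target_xy, sonar_xy):
    """Least-squares estimate of constant range/bearing biases for one sonar,
    given measurements of targets whose true positions are known (a simplified
    offline-registration illustration)."""
    dx = target_xy[:, 0] - sonar_xy[0]
    dy = target_xy[:, 1] - sonar_xy[1]
    r_true = np.hypot(dx, dy)
    b_true = np.arctan2(dy, dx)
    # With an additive-bias model the LS solution is the mean residual
    dr = np.mean(ranges_meas - r_true)
    db = np.mean(np.angle(np.exp(1j * (bearings_meas - b_true))))  # wrap-safe
    return dr, db

rng = np.random.default_rng(0)
targets = rng.uniform(-500, 500, size=(200, 2))
r = np.hypot(targets[:, 0], targets[:, 1]) + 5.0 + rng.normal(0, 1, 200)        # 5 m range bias
b = np.arctan2(targets[:, 1], targets[:, 0]) + 0.02 + rng.normal(0, 0.005, 200)  # 0.02 rad bias
print(estimate_biases(r, b, targets, np.array([0.0, 0.0])))
```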

Keywords: data fusion, multi-static sonar detection, offline estimation, sensor registration problem

Procedia PDF Downloads 150
874 Wearable Monitoring and Treatment System for Parkinson’s Disease

Authors: Bulcha Belay Etana, Benny Malengier, Janarthanan Krishnamoorthy, Timothy Kwa, Lieva Vanlangenhove

Abstract:

Electromyography measures the electrical activity of muscles using surface or needle electrodes to monitor various disease conditions. Recent developments in the acquisition of electromyogram signals using textile electrodes facilitate wearable devices, enabling patients to monitor and control their health status outside of healthcare facilities. Here, we have developed and tested wearable textile electrodes to acquire electromyography signals from patients suffering from Parkinson's disease and incorporated a feedback-control system to relieve muscle cramping through a thermal stimulus. In brief, textile electrodes made of stainless-steel yarn were knitted into a fabric sleeve, and their electrical characteristics, such as the signal-to-noise ratio, were compared with those of traditional electrodes. To relieve muscle cramping, a heating element made of stainless-steel conductive yarn sewn onto a cotton fabric, coupled with a vibration system, was developed. The system integrates a microcontroller and a Myoware muscle sensor to activate the heating element as well as the vibration motor when cramping occurs, and to deactivate them when the muscle cramping subsides. An optimum therapeutic temperature of 35.5 °C is regulated and maintained continuously by the heating device. The textile electrode exhibited a signal-to-noise ratio of 6.38 dB, comparable to the traditional electrode's value of 7.05 dB. For a given 9 V power supply, the rise time of the developed heating element was about 6 minutes to reach the optimum temperature.
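A minimal sketch of the feedback-control logic described above is shown below; the EMG threshold, the Actuator stand-in class, and the function names are hypothetical, and on the actual device this logic would run on the microcontroller rather than in Python.

```python
EMG_CRAMP_THRESHOLD = 0.6      # normalised Myoware envelope level (assumed value)
TARGET_TEMP_C = 35.5           # therapeutic set point from the abstract

class Actuator:
    """Stand-in for a GPIO-driven heater or vibration motor (hypothetical)."""
    def __init__(self, name):
        self.name, self.on = name, False
    def set(self, on):
        self.on = on

def control_step(emg_level, skin_temp_c, heater, vibrator):
    """One iteration of the cramp-relief loop: activate heat and vibration while
    cramping is detected, deactivate when it subsides or the set point is reached."""
    cramping = emg_level > EMG_CRAMP_THRESHOLD
    heater.set(on=cramping and skin_temp_c < TARGET_TEMP_C)
    vibrator.set(on=cramping)

heater, vibrator = Actuator("heater"), Actuator("vibration motor")
control_step(emg_level=0.8, skin_temp_c=30.0, heater=heater, vibrator=vibrator)
print(heater.on, vibrator.on)   # True True -> both active during a cramp
```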

Keywords: smart textile system, wearable electronic textile, electromyography, heating textile, vibration therapy, Parkinson’s disease

Procedia PDF Downloads 53
873 MRI Quality Control Using Texture Analysis and Spatial Metrics

Authors: Kumar Kanudkuri, A. Sandhya

Abstract:

Typically, in an MRI clinical setting, several protocols are run, each indicated for a specific anatomy and disease condition. However, these protocols, or parameters within them, can change over time due to changes in the recommendations of physician groups, updates to the software, or the availability of new technologies. Most of the time, the changes are made by the MRI technologist to account for time, coverage, physiological, or Specific Absorption Rate (SAR) reasons. However, giving proper guidelines to MRI technologists is important so that they do not change parameters that negatively impact image quality. Typically, a standard American College of Radiology (ACR) MRI phantom is used for Quality Control (QC) in order to guarantee that the primary objectives of MRI are met. Visual evaluation of quality depends on the operator/reviewer and may vary among operators, as well as for the same operator at different times. Overcoming these constraints is therefore essential for a more impartial evaluation of quality, which makes the quantitative estimation of image quality (IQ) metrics for MRI quality control very important. To address this problem, we propose a robust, open-source, and automated MRI image quality control tool. The designed and developed automatic analysis tool, which measures MRI image quality (IQ) metrics such as Signal-to-Noise Ratio (SNR), Signal-to-Noise Ratio Uniformity (SNRU), Visual Information Fidelity (VIF), Feature Similarity (FSIM), Gray-Level Co-occurrence Matrix (GLCM), slice thickness accuracy, slice position accuracy, and high-contrast spatial resolution, provided a good accuracy assessment. A standardized quality report is generated that incorporates the metrics that impact diagnostic quality.
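A minimal sketch of how two of the listed metrics could be computed from phantom regions of interest follows; the 0.66 Rayleigh correction and the uniformity definition are common conventions used here for illustration and are not necessarily the exact definitions implemented in the tool.

```python
import numpy as np

def snr_background(signal_roi, noise_roi):
    """Phantom SNR estimate: mean of a signal ROI over the standard deviation of
    a background (air) ROI, with the 0.66 Rayleigh correction often applied to
    magnitude images (illustrative convention)."""
    return 0.66 * np.mean(signal_roi) / np.std(noise_roi)

def snr_uniformity(signal_roi_1, signal_roi_2):
    """SNR uniformity as the relative agreement between two in-phantom ROIs
    (simplified illustration)."""
    m1, m2 = np.mean(signal_roi_1), np.mean(signal_roi_2)
    return 100.0 * (1.0 - abs(m1 - m2) / (m1 + m2))

rng = np.random.default_rng(1)
phantom = rng.normal(1000, 20, size=(64, 64))          # simulated phantom signal
background = np.abs(rng.normal(0, 15, size=(32, 32)))  # simulated air region
print(round(snr_background(phantom, background), 1))
print(round(snr_uniformity(phantom[:32], phantom[32:]), 2))
```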

Keywords: ACR MRI phantom, MRI image quality metrics, SNRU, VIF, FSIM, GLCM, slice thickness accuracy, slice position accuracy

Procedia PDF Downloads 141
872 A New Microstrip Diplexer Using Coupled Stepped Impedance Resonators

Authors: A. Chinig, J. Zbitou, A. Errkik, L. Elabdellaoui, A. Tajmouati, A. Tribak, M. Latrach

Abstract:

This paper presents a new structure of microstrip band-pass filter (BPF) based on coupled stepped impedance resonators. Each filter consists of two coupled stepped impedance resonators connected to microstrip feed lines. A coupled junction is utilized to connect the two BPFs to the antenna. The two band-pass filters are designed and simulated to operate in the Digital Communication System (DCS) and Industrial, Scientific and Medical (ISM) bands at 1.8 GHz and 2.45 GHz, respectively. The proposed circuit presents good performance, with an insertion loss lower than 2.3 dB and isolation between the two channels greater than 21 dB. The prototype of the optimized diplexer has been investigated numerically using Agilent ADS and verified with CST microwave software.

Keywords: band pass filter, coupled junction, coupled stepped impedance resonators, diplexer, insertion loss, isolation

Procedia PDF Downloads 420
871 UniFi: Universal Filter Model for Image Enhancement

Authors: Aleksei Samarin, Artyom Nazarenko, Valentin Malykh

Abstract:

Image enhancement is becoming more and more popular, especially on mobile devices. Nowadays, a common approach is to enhance an image using a convolutional neural network (CNN). Such a network has to be of significant size; otherwise, the likelihood of artifacts occurring grows rapidly. The existing large CNNs are computationally expensive, which can be crucial for mobile devices. Another important flaw of such models is that they are poorly interpretable. There is another approach to image enhancement, namely the use of predefined filters in combination with a prediction of their applicability. We present an approach following this paradigm, which outperforms both existing CNN-based and filter-based approaches in the image enhancement task. It is easily adaptable for mobile devices since it has only 47 thousand parameters. It achieves the best SSIM of 0.919 on RANDOM250 (MIT Adobe FiveK) among small models and is three times faster than previous models.
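A toy sketch of the predefined-filter paradigm follows: a small filter bank is applied with predicted applicability weights. The filter bank, the weights, and their combination below are illustrative assumptions, not the UniFi architecture.

```python
import numpy as np

def gamma(img, g):     return np.clip(img, 0, 1) ** g
def contrast(img, c):  return np.clip(0.5 + c * (img - 0.5), 0, 1)
def unsharp(img, a):
    # crude sharpening with a 3x3 box blur
    pad = np.pad(img, 1, mode="edge")
    blur = sum(pad[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0
    return np.clip(img + a * (img - blur), 0, 1)

def enhance(img, w):
    """Apply the fixed filter bank with predicted weights w = (gamma, contrast, sharpen);
    in a learned system w would come from a small predictor network."""
    out = gamma(img, 1.0 + w[0])
    out = contrast(out, 1.0 + w[1])
    return unsharp(out, w[2])

img = np.random.default_rng(2).uniform(0, 1, size=(16, 16))
print(enhance(img, w=(-0.2, 0.3, 0.5)).shape)
```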

Keywords: universal filter, image enhancement, neural networks, computer vision

Procedia PDF Downloads 85
870 Sleep Quality as Perceived by Critically Ill Patients at El Manial University Hospitals

Authors: Mohamed Adel Ahmed, Warda Youssef Morsy , Hanaa Ali El Feky

Abstract:

Background: The literature indicates that sleep is absolutely essential for survival and for restoring quality of life. Critically ill patients often have poor sleep quality, with prolonged sleep latency, sleep fragmentation, decreased sleep efficiency, and frequent arousals. Nurses have a unique role in the early diagnosis of sleep disorders, in decreasing stressor levels, and in providing the necessary environmental regulation to create a therapeutic ambiance. The aim of the study: to assess perceived sleep quality and identify factors affecting sleep quality among adult critically ill patients at El Manial University Hospital. Research design: A descriptive exploratory design was utilized. Research questions: a) How do adult critically ill patients perceive sleep quality in the critical care department of El Manial University Hospital? b) What are the factors affecting sleep quality among adult critically ill patients at El Manial University Hospital? Setting: selected critical and cardiac care units at El Manial University Hospital. Sample: a convenience sample of 100 adult male and female patients was included in the study. Tools of data collection: tool 1, Socio-demographic and Medical Data Sheet; tool 2, Modified St Mary's Hospital Sleep Questionnaire; tool 3, Factors Affecting Sleep Quality Questionnaire among ICU Patients. Results: The current study revealed that 76.0% of the studied sample had no sleep disturbance before hospitalization. However, 84% had sleep disturbances during their ICU stay, and of these, more than two-thirds (67%) had moderate sleep disturbance. The presence of strange and bad odors, noise, pain, fear of death, and loud voices of ICU personnel had the most significant negative impact on patients' sleep (52.4%, 50.0%, 61.9%, 45.2%, and 52.4%, respectively). Conclusion: Sleep disturbances in the ICU are multifactorial, and ICU patients perceived their degree of sleep disturbance as moderate. Recommendations: Based on the findings of the present study, the following are recommended for ICU nurses: create a healing ICU environment that incorporates noise, light, and temperature controls; decrease stimuli during nighttime hours to promote regulation of the circadian rhythm; allow the use of sleep aids such as relaxing music, eye patches, and earplugs in daily nursing practice; cluster nursing activities and eliminate non-essential treatments during nighttime hours to allow uninterrupted sleep periods of at least 90 minutes to complete one sleep cycle; and minimize staff conversation, alarm noise, and light during the quiet nighttime hours.

Keywords: sleep quality, critically ill, patients, perception

Procedia PDF Downloads 426
869 Parameters Identification and Sensitivity Study for Abrasive WaterJet Milling Model

Authors: Didier Auroux, Vladimir Groza

Abstract:

This work is part of the STEEP Marie Curie ITN project, and it focuses on the identification of the unknown parameters of the proposed generic Abrasive WaterJet Milling (AWJM) PDE model, which appears as an ill-posed inverse problem. The need to study this problem comes from industrial milling applications, where the ability to predict and model the final surface with high accuracy is one of the primary tasks even in the absence of any knowledge of the model parameters that should be used. In this framework, we propose the identification of the model parameters by minimizing a cost function measuring the difference between experimental and numerical solutions. The adjoint approach, based on the corresponding Lagrangian, makes it possible to determine the unknowns of the AWJM model and the optimal values that could be used to reproduce the required trench profile. Due to the complexity of the nonlinear problem and the large number of model parameters, we use an automatic differentiation software tool (TAPENADE) for the adjoint computations. By adding noise to the artificial data, we show that the parameter identification problem is in fact highly unstable and strongly dependent on the input measurements. Regularization terms can be used effectively to deal with the presence of data noise and to improve the correctness of the identification. Based on this approach, we present 2D and 3D results for the identification of the model parameters and for the surface prediction, both with self-generated data and with measurements obtained from real production. Considering different types of model and measurement errors allows us to obtain results acceptable for manufacturing and to expect proper identification of the unknowns. This approach also allows us to extend the research to more complex cases, such as a 3D time-dependent model with variations of the jet feed speed.
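A minimal sketch of regularized parameter identification by cost-function minimization is given below; the Gaussian trench surrogate, the regularization weight, and the use of a generic least-squares solver are assumptions for illustration, whereas the paper relies on the full AWJM PDE model with adjoint-based gradients.

```python
import numpy as np
from scipy.optimize import least_squares

def trench_model(x, depth, width):
    """Simple Gaussian surrogate for a milled trench profile (illustrative)."""
    return -depth * np.exp(-(x / width) ** 2)

def residuals(p, x, data, lam, p_prior):
    model_misfit = trench_model(x, *p) - data
    regulariser = np.sqrt(lam) * (p - p_prior)   # Tikhonov term appended as residuals
    return np.concatenate([model_misfit, regulariser])

x = np.linspace(-2, 2, 200)
true = trench_model(x, depth=1.0, width=0.5)
data = true + np.random.default_rng(3).normal(0, 0.05, x.size)   # noisy "measurements"
fit = least_squares(residuals, x0=[0.5, 1.0],
                    args=(x, data, 1e-3, np.array([0.5, 1.0])))
print(np.round(fit.x, 3))    # estimated (depth, width)
```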

Keywords: Abrasive Waterjet Milling, inverse problem, model parameters identification, regularization

Procedia PDF Downloads 300
868 Filter for the Measurement of Supraharmonics in Distribution Networks

Authors: Sivaraman Karthikeyan

Abstract:

Due to rapidly developing power electronic devices and technologies, such as power line communication and self-commutating converters, voltage and current distortion, as well as interference, have increased in the frequency range of 2 kHz to 150 kHz; there is an urgent need for electromagnetic compatibility (EMC) standards regulating this frequency range. Measuring or testing compliance with emission and immunity limits necessitates precise, repeatable measurement methods. Appropriate filters that minimize the fundamental component and its harmonics below 2 kHz in the measured signal would improve measurement accuracy in this frequency range, leading to better analysis. This paper discusses the filter suggestions in the current measurement standard and proposes an infinite impulse response (IIR) filter design that is optimized for a low number of poles, strong damping of the fundamental, and high accuracy above 2 kHz. The transfer function of the new filter is provided as a result, and an analog implementation is derived from the overall design.
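A minimal sketch of the kind of high-pass filtering discussed above follows; the filter order, cutoff, and sampling rate are assumptions for illustration, not the design proposed in the paper.

```python
import numpy as np
from scipy import signal

fs = 1_000_000                       # 1 MS/s acquisition rate, assumed
# Low-order IIR high-pass: strong damping of the 50/60 Hz fundamental and its
# harmonics, essentially flat response above 2 kHz.
sos = signal.butter(N=6, Wn=1000, btype="highpass", fs=fs, output="sos")

# Inspect the attenuation at the fundamental and the gain in the 2-150 kHz range
w, h = signal.sosfreqz(sos, worN=2**16, fs=fs)
for f in (50, 2_000, 50_000):
    idx = np.argmin(np.abs(w - f))
    print(f"{f:>6} Hz: {20 * np.log10(abs(h[idx])):8.2f} dB")
```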

Keywords: supraharmonics, 2 kHz, 150 kHz, filter, analog filter

Procedia PDF Downloads 122
867 Optimum Design of Attenuator of Spun-Bond Production System

Authors: Nasser Ghassembaglou, Abdullah Bolek, Oktay Yilmaz, Ertan Oznergiz, Hikmet Kocabas, Safak Yilmaz

Abstract:

Nanofibers are an effective material that has frequently been investigated for producing high-quality air filters. As an environmentally friendly approach, our aim is to obtain nanofibers from the melt. In spun-bond systems, an extruder, a spin pump, a nozzle package, and an attenuator are used. The molten polymer flowing from the extruder is made steady by the spin pump. The regulated melt passes through the nozzle holes and forms fibers under high pressure. The fibers drawn from the nozzle are attenuated to micron size by the attenuator and, after solidification, are collected on a conveyor. In this research, different designs of the attenuator system have been studied and analyzed with CFD. Afterwards, one of these designs was tested, and finally some optimizations were made to reduce pressure loss and increase air velocity.

Keywords: attenuator, nanofiber, spun-bond, extruder

Procedia PDF Downloads 394
866 The Direct Deconvolution Model for the Large Eddy Simulation of Turbulence

Authors: Ning Chang, Zelong Yuan, Yunpeng Wang, Jianchun Wang

Abstract:

Large eddy simulation (LES) has been used extensively in the investigation of turbulence. LES computes the grid-resolved large-scale motions and leaves the small scales to be modeled by subfilter-scale (SFS) models. Among the existing SFS models, the deconvolution model has been used successfully in LES of engineering and geophysical flows. Despite the wide application of deconvolution models, the effects of subfilter-scale dynamics and filter anisotropy on the accuracy of SFS modeling have not been investigated in depth. The results of LES are highly sensitive to the selection of filters and to the anisotropy of the grid, which has been overlooked in previous research. In the current study, two critical aspects of LES are investigated. First, we analyze the influence of subfilter-scale (SFS) dynamics on the accuracy of direct deconvolution models (DDM) at varying filter-to-grid ratios (FGR) in isotropic turbulence. An array of invertible filters is employed, encompassing Gaussian, Helmholtz I and II, Butterworth, Chebyshev I and II, Cauchy, Pao, and rapidly decaying filters. The significance of the FGR becomes evident, as it acts as a pivotal factor in error control for precise SFS stress prediction. When the FGR is set to 1, the DDM models cannot accurately reconstruct the SFS stress, due to the insufficient resolution of the SFS dynamics. Notably, prediction capabilities are enhanced at an FGR of 2, resulting in accurate SFS stress reconstruction, except for cases involving the Helmholtz I and II filters. A remarkable precision close to 100% is achieved at an FGR of 4 for all DDM models. Additionally, the exploration extends to filter anisotropy to address its impact on the SFS dynamics and LES accuracy. Employing the dynamic Smagorinsky model (DSM), the dynamic mixed model (DMM), and the direct deconvolution model (DDM) with anisotropic filters, aspect ratios (AR) ranging from 1 to 16 in the LES filters are evaluated. The findings highlight the DDM's proficiency in accurately predicting SFS stresses under highly anisotropic filtering conditions. High correlation coefficients exceeding 90% are observed in the a priori study for the DDM's reconstructed SFS stresses, surpassing those of the DSM and DMM models. However, these correlations tend to decrease as filter anisotropy increases. In the a posteriori studies, the DDM model consistently outperforms the DSM and DMM models across various turbulence statistics, encompassing velocity spectra, probability density functions related to vorticity, SFS energy flux, velocity increments, strain-rate tensors, and SFS stress. It is observed that as filter anisotropy intensifies, the results of the DSM and DMM deteriorate, while the DDM continues to deliver satisfactory results across all filter-anisotropy scenarios. The findings emphasize the potential of the DDM framework as a valuable tool for advancing the development of sophisticated SFS models for LES of turbulence.
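The deconvolution idea at the core of the DDM can be illustrated in one dimension, where the filtered field is divided by the invertible Gaussian filter transfer function in spectral space; the field, filter width, and grid below are illustrative and do not reproduce the paper's 3-D turbulence setup.

```python
import numpy as np

n, L = 256, 2 * np.pi
x = np.linspace(0, L, n, endpoint=False)
k = np.fft.fftfreq(n, d=L / n) * 2 * np.pi           # angular wavenumbers
u = np.sin(x) + 0.5 * np.sin(7 * x) + 0.2 * np.sin(15 * x)

delta = 4 * (L / n)                                   # filter width of 4 grid spacings (FGR = 4)
G = np.exp(-(k * delta) ** 2 / 24.0)                  # Gaussian filter transfer function
u_filtered = np.fft.ifft(G * np.fft.fft(u)).real
u_recovered = np.fft.ifft(np.fft.fft(u_filtered) / G).real   # direct deconvolution

print(np.max(np.abs(u_recovered - u)))                # small error in this noise-free example
```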

Keywords: deconvolution model, large eddy simulation, subfilter scale modeling, turbulence

Procedia PDF Downloads 57
865 Teaching Light Polarization by Putting Art and Physics Together

Authors: Fabrizio Logiurato

Abstract:

Light polarization has many technological applications, and its discovery was crucial in revealing the transverse nature of electromagnetic waves. However, despite its fundamental and practical importance, this property of light is often neglected in high school. This is a pity, not only because of its conceptual relevance, but also because polarization makes it possible to perform many brilliant experiments with low-cost materials. Moreover, the treatment of this topic lends itself very well to an interdisciplinary approach between art, biology, and technology, which usually makes things more interesting to students. For these reasons, we have developed, and introduce in this work, a laboratory on light polarization for high school and undergraduate students. They can see beautiful pictures when birefringent materials are placed between two crossed polarizing filters. Pupils are fascinated and drawn in by what they observe. The colourful images remind them of abstract paintings or alien landscapes. With this multidisciplinary teaching method, students are more engaged and participative, and the learning of the related physics concepts is also more effective.

Keywords: light polarization, optical activity, multidisciplinary education, science and art

Procedia PDF Downloads 194
864 Road Systems as Environmental Barriers: An Overview of Roadways in Their Function as Fences for Wildlife Movement

Authors: Rachael Bentley, Callahan Gergen, Brodie Thiede

Abstract:

Roadways have a significant impact on the environment insofar as they function as barriers to wildlife movement, both through road mortality and through the resulting road avoidance. Roads have an immense presence worldwide, and that presence is predicted to increase substantially in the next thirty years. As roadways become even more common, it is important to consider their environmental impact and to mitigate their negative effects on wildlife and wildlife mobility. In a thorough analysis of several related studies, a common conclusion was that roads cause habitat fragmentation, which can lead split populations to evolve differently, for better or for worse. Though some populations adapted positively to roadways, becoming more resistant to road mortality and more tolerant of noise and chemical contamination, many others experienced maladaptation, either due to chemical contamination in and around their environment or because of genetic mutations from inbreeding when their population was fragmented too severely to support a group large enough for healthy genetic exchange. Large mammals were especially susceptible to maladaptation from inbreeding, as they require larger areas to roam and therefore need even more space to sustain a healthy population. Regardless of whether a species evolved positively or negatively as a result of its proximity to a road, animals tended to avoid roads, making reduced genetic diversity from habitat fragmentation an exceedingly prevalent issue in the larger discussion of road ecology. Additionally, the consideration of solutions, such as overpasses and underpasses, is crucial to ensuring the long-term survival of many wildlife populations. In studies addressing the effectiveness of overpasses and underpasses, animals seemed to adjust well to these solutions, but strategic placement, as well as proper sizing, proper height, shelter from road noise, and other considerations, was important in their construction. When an underpass or overpass was well built and well shielded from human activity, animals' usage of the structure increased significantly throughout its first five years, thus reconnecting previously divided populations. Still, these structures are costly, and they are often unable to fully address certain issues such as light, noise, and contaminants from vehicles. Therefore, the need for further discussion of new, creative solutions remains paramount. Roads are one of the most consistent and prominent features of today's landscape, but their environmental impacts are largely overlooked. While roads are useful for connecting people, they divide landscapes and animal habitats. Therefore, further research and investment in possible solutions is necessary to mitigate the negative effects that roads have on wildlife mobility and to prevent issues arising from the resulting habitat fragmentation.

Keywords: fences, habitat fragmentation, roadways, wildlife mobility

Procedia PDF Downloads 151
863 [Keynote Talk]: Photocatalytic Cleaning Performance of Air Filters for a Binary Mixture

Authors: Lexuan Zhong, Chang-Seo Lee, Fariborz Haghighat, Stuart Batterman, John C. Little

Abstract:

Ultraviolet photocatalytic oxidation (UV-PCO) technology has been recommended as a green approach to healthy indoor environments when integrated into mechanical ventilation systems, both for the removal of inorganic and organic compounds and for energy savings due to lower outdoor air intake. Although much research has been devoted to UV-PCO, limited information is available in the literature on UV-PCO behavior tested with mixtures. This project investigated UV-PCO performance and by-product generation using acetone and MEK, singly and as a mixture, at 100 ppb each in a single-pass duct system, in an effort to gain knowledge of the competitive photochemical reactions involved. The experiments were performed at 20% RH, 22 °C, and a gas flow rate of 128 m3/h (75 cfm). Results show that acetone and MEK mutually reduced each other's PCO removal efficiency, with the removal efficiency for acetone even becoming negative. These findings differ from previous observations of facilitatory effects on the adsorption of acetone and MEK on photocatalyst surfaces.

Keywords: by-products, inhibitory effect, mixture, photocatalytic oxidation

Procedia PDF Downloads 485
862 Musical Tesla Coil Controlled by an Audio Signal Processed in Matlab

Authors: Sandra Cuenca, Danilo Santana, Anderson Reyes

Abstract:

The following project is based on the manipulation of audio signals through MATLAB software. An audio signal is modified in MATLAB, and the resulting signal, obtained through the auxiliary port of the computer, is passed through a signal amplifier; the amplified signal is connected to a Tesla coil that behaves like a VU meter, so the flashes at the output of the Tesla coil increase and decrease in intensity depending on the audio signal in the computer and on the voltage source from which it is sent. The amplified signal then drives the Tesla coil, and the result is shown in the plasma sphere with the corresponding flashes. This activation is governed by the parameters specified in the MATLAB algorithm, which contains the digital filters used to manipulate the audio signal sent to the Tesla coil; the output is displayed in a plasma sphere with flashes in a combination of colours, commonly pink and purple, that varies according to the tone of the song.

Keywords: auxiliary port, tesla coil, vumeter, plasma sphere

Procedia PDF Downloads 70
861 Improved Wearable Monitoring and Treatment System for Parkinson’s Disease

Authors: Bulcha Belay Etana, Benny Malengier, Janarthanan Krishnamoorthy, Timothy Kwa, Lieva VanLangenhove

Abstract:

Electromyography measures the electrical activity of muscles using surface or needle electrodes to monitor various disease conditions. Recent developments in the acquisition of electromyogram signals using textile electrodes facilitate wearable devices, enabling patients to monitor and control their health status outside of healthcare facilities. Here, we have developed and tested wearable textile electrodes to acquire electromyography signals from patients suffering from Parkinson's disease and incorporated a feedback-control system to relieve muscle cramping through a thermal stimulus. In brief, textile electrodes made of stainless-steel yarn were knitted into a fabric sleeve, and their electrical characteristics, such as the signal-to-noise ratio, were compared with those of traditional electrodes. To relieve muscle cramping, a heating element made of stainless-steel conductive yarn sewn onto cotton fabric, coupled with a vibration system, was developed. The system integrates a microcontroller and a Myoware muscle sensor to activate the heating element as well as the vibration motor when cramping occurs, and to deactivate them when the muscle cramping subsides. An optimum therapeutic temperature of 35.5 °C is regulated by continuous temperature monitoring, which deactivates the heating system when this threshold value is reached. The textile electrode exhibited a signal-to-noise ratio of 6.38 dB, comparable to the traditional electrode's value of 7.05 dB. For a given 9 V power supply, the rise time of the developed heating element was about 6 minutes to reach the optimum temperature.

Keywords: smart textile system, wearable electronic textile, electromyography, heating textile, vibration therapy, Parkinson’s disease

Procedia PDF Downloads 83
860 Equity Risk Premiums and Risk Free Rates in Modelling and Prediction of Financial Markets

Authors: Mohammad Ghavami, Reza S. Dilmaghani

Abstract:

This paper presents an adaptive framework for modelling financial markets using equity risk premiums, risk-free rates, and volatilities. The recorded economic factors are initially used to train four adaptive filters for a certain limited period of time in the past. Once the systems are trained, the adjusted coefficients are used for modelling and prediction of an important financial market index. Two different approaches, based on the least mean squares (LMS) and recursive least squares (RLS) algorithms, are investigated. A performance analysis of each method in terms of the mean squared error (MSE) is presented, and the results are discussed. Computer simulations carried out using recorded data show MSEs of 4% and 3.4% for next-month prediction using the LMS and RLS adaptive algorithms, respectively. For twelve-month prediction, the RLS method shows better trend estimation than the LMS algorithm.
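A minimal sketch of one-step-ahead prediction with an LMS adaptive filter follows; the autoregressive setup and the synthetic series are illustrative assumptions, whereas the paper trains on recorded equity risk premiums, risk-free rates, and volatilities.

```python
import numpy as np

def lms_predict(x, order=4, mu=0.05):
    """One-step-ahead LMS prediction: the weights are updated from the error
    between the actual sample and the prediction from the previous `order` samples."""
    w = np.zeros(order)
    preds = np.zeros(len(x))
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]          # most recent samples first
        preds[n] = w @ u
        e = x[n] - preds[n]
        w += 2 * mu * e * u               # LMS weight update
    return preds, w

t = np.arange(500)
series = np.sin(0.05 * t) + 0.05 * np.random.default_rng(4).normal(size=t.size)
preds, w = lms_predict(series)
mse = np.mean((series[100:] - preds[100:]) ** 2) / np.var(series)
print(f"normalised MSE: {mse:.3f}")
```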

Keywords: adaptive methods, LSE, MSE, prediction of financial markets

Procedia PDF Downloads 317
859 X-Ray Detector Technology Optimization In CT Imaging

Authors: Aziz Ikhlef

Abstract:

Most multi-slice CT scanners are built with detectors composed of scintillator–photodiode arrays. The photodiode arrays are mainly based on front-illuminated technology for detectors under 64 slices and on back-illuminated photodiodes for systems of 64 slices or more. Designs based on back-illuminated photodiodes were investigated for CT machines to overcome the challenge of the higher number of runs and connections required by front-illuminated diodes. In backlit diodes, the electronic noise has already been improved because of the reduction in load capacitance due to the reduced routing. This translates into better image quality in low-signal applications, improving low-dose imaging across a large patient population. With the fast development of multi-detector-row CT (MDCT) scanners and the increasing number of examinations, both the medical and regulatory communities have raised significant concerns about the radiation dose received by the patient. In order to reduce individual exposure, and in response to the recommendations of the International Commission on Radiological Protection (ICRP), which suggest that all exposures should be kept as low as reasonably achievable (ALARA), every manufacturer is trying to implement strategies and solutions to optimize dose efficiency and image quality based on x-ray emission and scanning parameters. Added demands on CT detector performance also come from the increased utilization of spectral CT, or dual-energy CT, in which projection data at two different tube potentials are collected. One approach utilizes a technology called fast-kVp switching, in which the tube voltage is switched between 80 kVp and 140 kVp in a fraction of a millisecond. To reduce the cross-contamination of signals, the temporal response of the scintillator-based detector has to be extremely fast to minimize the residual signal from previous samples. In addition, this paper presents an overview of detector technologies and image chain improvements that have been investigated in the last few years to improve the signal-to-noise ratio and the dose efficiency of CT scanners in regular examinations and in energy discrimination techniques. Several parameters of the image chain in general, and of the detector technology in particular, contribute to the optimization of the final image quality. We go through the properties of the post-patient collimation to improve the scatter-to-primary ratio; the scintillator material properties such as light output, afterglow, primary speed, and crosstalk to improve spectral imaging; the photodiode design characteristics; and the data acquisition system (DAS), optimized for crosstalk, noise, and temporal/spatial resolution.

Keywords: computed tomography, X-ray detector, medical imaging, image quality, artifacts

Procedia PDF Downloads 246
858 Performance Evaluation of a Very High-Resolution Satellite Telescope

Authors: Walid A. Attia, Taher M. Bazan, Fawzy Eltohamy, Mahmoud Fathy

Abstract:

System performance evaluation is an essential stage in the design of high-resolution satellite telescopes, prior to the development process. In this paper, the system performance of a very high-resolution satellite telescope is evaluated. The evaluated system has a Korsch optical design. This design has been discussed in another paper in comparison with a three-mirror anastigmat (TMA) design, and the former configuration showed better results. The investigated system is based on the Korsch optical design integrated with a time-delay and integration charge-coupled device (TDI-CCD) sensor to achieve a ground sampling distance (GSD) of 25 cm. The key performance metrics considered are the spatial resolution, the signal-to-noise ratio (SNR), and the total modulation transfer function (MTF) of the system. In addition, the National Image Interpretability Rating Scale (NIIRS) metric is assessed to predict the image quality according to the modified general image quality equation (GIQE). Based on the orbital, optical, and detector parameters, the estimated GSD is found to be 25 cm. The SNR has been analyzed for different illumination conditions, target albedos, and sun and sensor angles. The system MTF has been computed with diffraction, aberration, optical manufacturing, smear, and detector sampling as the main contributors to the MTF evaluation. Finally, the system performance evaluation results show that the computed MTF value is around 0.08 at the Nyquist frequency, the SNR is 130 for an albedo of 0.2 at a nadir viewing angle, and the predicted NIIRS is on the order of 6.5, which implies very good system image quality.

Keywords: modulation transfer function, national image interpretability rating scale, signal to noise ratio, satellite telescope performance evaluation

Procedia PDF Downloads 372
857 Use of a Business Intelligence Software for Interactive Visualization of Data on the Swiss Elite Sports System

Authors: Corinne Zurmuehle, Andreas Christoph Weber

Abstract:

In 2019, the Swiss Federal Institute of Sport Magglingen (SFISM) conducted a mixed-methods study on the Swiss elite sports system, which yielded a large quantity of research data. In a quantitative online survey, 1151 elite athletes, 542 coaches, and 102 Performance Directors of national sports federations (NF) submitted their perceptions of the national support measures of the Swiss elite sports system. These data provide an essential database for the further development of the Swiss elite sports system. The results were published in a report presenting them divided into 40 Olympic summer and 14 winter sports (Olympic classification). The authors of this paper assume that, in practice, this division is too unspecific to assess where further measures are needed. The aim of this paper is to find appropriate parameters for data visualization in order to identify disparities in sports promotion, allowing an assessment of where further interventions by Swiss Olympic (the NF umbrella organization) are required. Method: First, the variable 'salary earned from sport' was defined as the variable for measuring the impact of elite sports promotion. This variable was chosen because it is an important indicator of the professionalization of elite athletes and therefore reflects the national-level sports promotion measures applied by Swiss Olympic. Afterwards, the correlation between salary and the Olympic classification was tested by calculating the eta coefficient. To identify appropriate parameters for data visualization, the correlation between salary and four further parameters was analyzed, again using the eta coefficient: [a] sport; [b] prioritization (from 1 to 5) of the sports by Swiss Olympic; [c] gender; [d] employment level in sports. Results & Discussion: The analyses reveal a very small correlation between salary and the Olympic classification (η² = .011, p = .005). Gender demonstrates an even smaller correlation (η² = .006, p = .014). The parameter prioritization correlated with a small effect (η² = .017, p = .001), as did employment level (η² = .028, p < .001). The highest correlation was found for the parameter sport, with a moderate effect (η² = .075, p = .047). The analyses show that the disparities in sports promotion cannot be attributed to a single parameter but are presumably explained by a combination of several parameters. We argue that the possibility of combining parameters for data visualization should be provided when the analysis is delivered to Swiss Olympic for further strategic decision-making. However, including multiple parameters massively multiplies the number of graphs and is therefore not suitable for practical use. We therefore suggest applying interactive dashboards for data visualization using business intelligence software. Practical & Theoretical Contribution: This contribution provides a first attempt to use business intelligence software for strategic decision-making in national-level sports regarding the prioritization of national resources for sports and athletes. It allows parameters with a significant effect to be set as filters. By using filters, parameters can be combined, compared against each other, and set individually for each strategic decision.
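A minimal sketch of the eta-squared (correlation ratio) computation used above follows, with illustrative data rather than the survey responses.

```python
import numpy as np

def eta_squared(values, groups):
    """Correlation ratio (eta squared): between-group sum of squares over the
    total sum of squares, relating a continuous variable (e.g. salary) to a
    categorical parameter (e.g. sport or prioritization)."""
    grand_mean = np.mean(values)
    ss_total = np.sum((values - grand_mean) ** 2)
    ss_between = sum(len(v) * (np.mean(v) - grand_mean) ** 2
                     for g in np.unique(groups)
                     for v in [values[groups == g]])
    return ss_between / ss_total

rng = np.random.default_rng(5)
salary = np.concatenate([rng.normal(40, 10, 50), rng.normal(55, 10, 50)])  # illustrative
sport = np.array(["A"] * 50 + ["B"] * 50)
print(round(eta_squared(salary, sport), 3))
```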

Keywords: data visualization, business intelligence, Swiss elite sports system, strategic decision-making

Procedia PDF Downloads 76
856 X-Ray Detector Technology Optimization in Computed Tomography

Authors: Aziz Ikhlef

Abstract:

Most multi-slice Computed Tomography (CT) scanners are built with detectors composed of scintillator–photodiode arrays. The photodiode arrays are mainly based on front-illuminated technology for detectors under 64 slices and on back-illuminated photodiodes for systems of 64 slices or more. Designs based on back-illuminated photodiodes were investigated for CT machines to overcome the challenge of the higher number of runs and connections required by front-illuminated diodes. In backlit diodes, the electronic noise has already been improved because of the reduction in load capacitance due to the reduced routing. This translates into better image quality in low-signal applications, improving low-dose imaging across a large patient population. With the fast development of multi-detector-row CT (MDCT) scanners and the increasing number of examinations, both the medical and regulatory communities have raised significant concerns about the radiation dose received by the patient. In order to reduce individual exposure, and in response to the recommendations of the International Commission on Radiological Protection (ICRP), which suggest that all exposures should be kept as low as reasonably achievable (ALARA), every manufacturer is trying to implement strategies and solutions to optimize dose efficiency and image quality based on x-ray emission and scanning parameters. Added demands on CT detector performance also come from the increased utilization of spectral CT, or dual-energy CT, in which projection data at two different tube potentials are collected. One approach utilizes a technology called fast-kVp switching, in which the tube voltage is switched between 80 kVp and 140 kVp in a fraction of a millisecond. To reduce the cross-contamination of signals, the temporal response of the scintillator-based detector has to be extremely fast to minimize the residual signal from previous samples. In addition, this paper presents an overview of detector technologies and image chain improvements that have been investigated in the last few years to improve the signal-to-noise ratio and the dose efficiency of CT scanners in regular examinations and in energy discrimination techniques. Several parameters of the image chain in general, and of the detector technology in particular, contribute to the optimization of the final image quality. We go through the properties of the post-patient collimation to improve the scatter-to-primary ratio; the scintillator material properties such as light output, afterglow, primary speed, and crosstalk to improve spectral imaging; the photodiode design characteristics; and the data acquisition system (DAS), optimized for crosstalk, noise, and temporal/spatial resolution.

Keywords: computed tomography, X-ray detector, medical imaging, image quality, artifacts

Procedia PDF Downloads 182
855 LTE Performance Analysis in the City of Bogota Northern Zone for Two Different Mobile Broadband Operators over Qualipoc

Authors: Víctor D. Rodríguez, Edith P. Estupiñán, Juan C. Martínez

Abstract:

The evolution of mobile broadband technologies has made it possible to increase users' download rates for current services. The evaluation of technical parameters at the link level is of vital importance to validate the quality and veracity of the connection, thus avoiding large losses of data, time, and productivity. Some of these failures may occur between the eNodeB (Evolved Node B) and the user equipment (UE), so the link between the end device and the base station can be observed. LTE (Long Term Evolution) is considered one of the IP-oriented mobile broadband technologies that work stably for data and for VoIP (Voice over IP) on devices that support it. This research presents a technical analysis of the connection and channeling processes between the UE and the eNodeB using the TAC (Tracking Area Code) variables, together with an analysis of performance variables (throughput, Signal to Interference and Noise Ratio (SINR)). Three measurement scenarios were proposed in the city of Bogotá using QualiPoc, in which two operators were evaluated (Operator 1 and Operator 2). Once the data were obtained, an analysis of the variables was performed, determining that the data obtained in the different transmission modes vary depending on the BLER (Block Error Rate), throughput, and SNR (Signal-to-Noise Ratio) parameters. For both operators, differences in transmission modes are detected, and this is reflected in the quality of the signal. In addition, because the operators work at different frequencies, it can be seen that Operator 1, despite having spectrum in Band 7 (2600 MHz) like Operator 2, is reassigning traffic to a lower band, AWS (1700 MHz); the difference in signal quality with respect to the data connection established by Operator 2, and the difference found in the transmission modes determined by the eNodeB for Operator 1, are remarkable.

Keywords: BLER, LTE, network, QualiPoc, SNR

Procedia PDF Downloads 95
854 Improved Distance Estimation in Dynamic Environments through Multi-Sensor Fusion with Extended Kalman Filter

Authors: Iffat Ara Ebu, Fahmida Islam, Mohammad Abdus Shahid Rafi, Mahfuzur Rahman, Umar Iqbal, John Ball

Abstract:

The application of multi-sensor fusion for enhanced distance estimation accuracy in dynamic environments is crucial for advanced driver assistance systems (ADAS) and autonomous vehicles. The limitations of single sensors, such as cameras or radar, in adverse conditions motivate the use of combined camera and radar data to improve reliability, adaptability, and object recognition. A multi-sensor fusion approach using an extended Kalman filter (EKF) is proposed to combine the sensor measurements with a dynamic system model, achieving robust and accurate distance estimation. The research utilizes the Mississippi State University Autonomous Vehicular Simulator (MAVS) to create a controlled environment for data collection, and data analysis is performed using MATLAB. Qualitative metrics (visualization of fused data versus ground truth) and quantitative metrics (RMSE, MAE) are employed for performance assessment. Initial results with simulated data demonstrate more accurate distance estimation than with the individual sensors. The optimal sensor measurement noise variance and plant noise variance parameters within the EKF are identified, and the algorithm is validated with real-world data from a Chevrolet Blazer. In summary, this research demonstrates that multi-sensor fusion with an EKF significantly improves distance estimation accuracy in dynamic environments. This is supported by comprehensive evaluation metrics, with validation transitioning from simulated to real-world data, paving the way for safer and more reliable autonomous vehicle control.
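A minimal sketch of sequential camera-radar fusion with a Kalman-type update follows; with the linear range model used here the EKF update reduces to the standard Kalman form, and the noise variances are illustrative rather than the tuned values from the paper.

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])        # constant-velocity state transition
Q = np.diag([0.01, 0.1])                     # plant noise covariance (assumed)
H = np.array([[1.0, 0.0]])                   # both sensors measure distance
R = {"camera": 1.5, "radar": 0.3}            # measurement noise variances (assumed)

x = np.array([20.0, 0.0])                    # initial [distance m, rate m/s]
P = np.eye(2)

def step(x, P, measurements):
    """One predict step followed by sequential updates, one per sensor."""
    x, P = F @ x, F @ P @ F.T + Q
    for sensor, z in measurements.items():
        S = H @ P @ H.T + R[sensor]
        K = P @ H.T / S
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = step(x, P, {"camera": 19.4, "radar": 19.0})
print(np.round(x, 2))    # fused [distance, rate] estimate
```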

Keywords: sensor fusion, EKF, MATLAB, MAVS, autonomous vehicle, ADAS

Procedia PDF Downloads 10
853 Online Prediction of Nonlinear Signal Processing Problems Based Kernel Adaptive Filtering

Authors: Hamza Nejib, Okba Taouali

Abstract:

This paper presents two of the best-known kernel adaptive filtering (KAF) approaches, the kernel least mean squares (KLMS) and the kernel recursive least squares (KRLS) algorithms, in order to predict a new output of a nonlinear signal processing problem. Both of these methods implement a nonlinear transfer function using kernel methods in a particular space named the reproducing kernel Hilbert space (RKHS), where the model is a linear combination of kernel functions applied to transform the observed data from the input space to a high-dimensional feature space; this idea is known as the kernel trick. KAF thus develops adaptive filters in the RKHS. We use two nonlinear signal processing problems, Mackey-Glass chaotic time series prediction and nonlinear channel equalization, to assess the performance of the presented approaches and, finally, to determine which of them is better suited.
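A minimal sketch of the KLMS idea follows: the predictor is a growing sum of Gaussian kernels centred on past inputs with coefficients proportional to the prediction errors; the step size, kernel width, and the logistic-map series used instead of Mackey-Glass are illustrative assumptions.

```python
import numpy as np

def klms_predict(x, order=5, eta=0.2, sigma=1.0):
    """One-step-ahead KLMS prediction with a Gaussian kernel: each new input
    vector becomes a kernel centre with coefficient eta * prediction error."""
    centres, alphas, preds = [], [], np.zeros(len(x))
    for n in range(order, len(x)):
        u = x[n - order:n]
        if centres:
            k = np.exp(-np.sum((np.array(centres) - u) ** 2, axis=1) / (2 * sigma ** 2))
            preds[n] = np.dot(alphas, k)
        e = x[n] - preds[n]
        centres.append(u)
        alphas.append(eta * e)
    return preds

# Simple chaotic benchmark series (logistic map) standing in for Mackey-Glass
s = np.empty(400); s[0] = 0.4
for i in range(1, 400):
    s[i] = 3.9 * s[i - 1] * (1 - s[i - 1])
preds = klms_predict(s)
print(round(np.mean((s[50:] - preds[50:]) ** 2), 4))
```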

Keywords: online prediction, KAF, signal processing, RKHS, Kernel methods, KRLS, KLMS

Procedia PDF Downloads 381
852 Application of Argumentation for Improving the Classification Accuracy in Inductive Concept Formation

Authors: Vadim Vagin, Marina Fomina, Oleg Morosin

Abstract:

This paper describes an argumentation approach to the problem of inductive concept formation. It is proposed to use argumentation, based on defeasible reasoning with justification degrees, to improve the quality of classification models obtained by generalization algorithms. Experimental results on both clean and noisy data are also presented.

Keywords: argumentation, justification degrees, inductive concept formation, noise, generalization

Procedia PDF Downloads 419
851 Permeodynamic Particulate Matter Filtration for Improved Air Quality

Authors: Hamad M. Alnagran, Mohammed S. Imbabi

Abstract:

Particulate matter (PM) in the air we breathe is detrimental to health. Overcoming this problem has attracted interest and prompted research on the use of PM filtration in commercial buildings and homes. The consensus is that tangible health benefits can result from the use of PM filters in most urban environments to clean up a building's fresh air supply and thereby reduce residents' exposure to airborne PM. The authors have investigated and are developing a new large-scale Permeodynamic Filtration Technology (PFT) capable of permanently filtering and removing airborne PM from outdoor spaces, thus also benefiting internal spaces such as building interiors. Theoretical models were developed, and laboratory trials were carried out to determine, and validate through measurement, the permeodynamic filtration efficiency and pressure drop as functions of the PM particle size distribution. The conclusion is that PFT offers a potentially viable, cost-effective, end-of-pipe solution to the problem of airborne PM.

Keywords: air filtration, particulate matter, particle size distribution, permeodynamic

Procedia PDF Downloads 185
850 Gaussian Particle Flow Bernoulli Filter for Single Target Tracking

Authors: Hyeongbok Kim, Lingling Zhao, Xiaohong Su, Junjie Wang

Abstract:

The Bernoulli filter is a precise Bayesian filter for single target tracking based on the random finite set theory. The standard Bernoulli filter often underestimates the number of targets. This study proposes a Gaussian particle flow (GPF) Bernoulli filter employing particle flow to migrate particles from prior to posterior positions to improve the performance of the standard Bernoulli filter. By employing the particle flow filter, the computational speed of the Bernoulli filters is significantly improved. In addition, the GPF Bernoulli filter provides a more accurate estimation compared with that of the standard Bernoulli filter. Simulation results confirm the improved tracking performance and computational speed in two- and three-dimensional scenarios compared with other algorithms.

Keywords: Bernoulli filter, particle filter, particle flow filter, random finite sets, target tracking

Procedia PDF Downloads 72
849 Taguchi Robust Design for Optimal Setting of Process Wastes Parameters in an Automotive Parts Manufacturing Company

Authors: Charles Chikwendu Okpala, Christopher Chukwutoo Ihueze

Abstract:

As a technique that reduces variation in a product by lessening the sensitivity of the design to sources of variation, rather than by controlling those sources, Taguchi Robust Design entails designing ideal goods by developing a product that has minimal variance in its characteristics while also meeting the desired performance exactly. This paper examines the concept of this manufacturing approach and its application to the brake pad product of an automotive parts manufacturing company. Although the firm claimed that defects, excess inventory, and over-production were the only wastes that grossly affect its productivity and profitability, a careful study and analysis of its manufacturing processes with the application of the Single Minute Exchange of Dies (SMED) tool showed that the waste of waiting is a fourth waste that bedevils the firm. The selection of the Taguchi L9 orthogonal array, which is based on four parameters and three levels of variation for each parameter, revealed that, with a range of 2.17, waiting is the major waste that the company must reduce in order to remain viable. Also, to enhance the company's throughput and profitability, the wastes of over-production, excess inventory, and defects, with ranges of 2.01, 1.46, and 0.82, ranking second, third, and fourth respectively, must also be reduced to the barest minimum. After proposing -33.84 as the highest optimum signal-to-noise ratio to be maintained for the waste of waiting, the paper advocates the adoption of all the tools and techniques of the Lean Production System (LPS) and Continuous Improvement (CI), and concludes by recommending SMED in order to drastically reduce setup time, which leads to unnecessary waiting.
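A minimal sketch of the smaller-is-better Taguchi signal-to-noise ratio follows; treating waste measurements as smaller-is-better is an assumption here, and the replicate values are illustrative, not the company's data.

```python
import numpy as np

def sn_smaller_is_better(y):
    """Taguchi signal-to-noise ratio for a smaller-is-better response:
    SN = -10 * log10( mean(y^2) ). Undesirable quantities such as waiting
    time are typically treated as smaller-is-better."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Three replicate waiting-time observations (minutes) for one L9 trial run (illustrative)
print(round(sn_smaller_is_better([52.0, 47.5, 49.3]), 2))
```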

Keywords: lean production system, single minute exchange of dies, signal to noise ratio, Taguchi robust design, waste

Procedia PDF Downloads 108
848 Enhance Power Quality by HVDC System, Comparison Technique between HVDC and HVAC Transmission Systems

Authors: Smko Zangana, Ergun Ercelebi

Abstract:

Alternating current is the main form of power in all industries and other applications, especially over short and medium distances, but for long distances exceeding 500 km, using alternating current faces many technical difficulties and higher costs because it is difficult to control the current, among other restrictions. For these reasons, HVDC transmission lines have recently been built to transmit power over long distances. This document presents a technical comparison and assessment of power transmission systems over different distances, studies the stability of the system with regard to the proportion of losses between the actual power sent and received on both sides in the different systems, and categorizes the filters used in the HVDC system and their impact on reducing harmonics in power transmission. MATLAB/Simulink simulation software is used to simulate both HVAC and HVDC power transmission system topologies.

Keywords: HVAC power system, HVDC power system, power system simulation (MATLAB), the alternating current, voltage stability

Procedia PDF Downloads 349