Search results for: removing noise
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1568

1028 Generation of High-Quality Synthetic CT Images from Cone Beam CT Images Using A.I. Based Generative Networks

Authors: Heeba A. Gurku

Abstract:

Introduction: Cone Beam CT (CBCT) images play an integral part in proper patient positioning for cancer patients undergoing radiation therapy treatment. However, these images are of low quality. The purpose of this study is to generate high-quality synthetic CT images from CBCT using generative models. Material and Methods: This study utilized two datasets from The Cancer Imaging Archive (TCIA): 1) a lung cancer dataset of 20 patients (with full view CBCT images) and 2) a pancreatic cancer dataset of 40 patients (only the 27 patients having limited view images were included in the study). Cycle Generative Adversarial Networks (GAN) and its variant Attention Guided Generative Adversarial Networks (AGGAN) were used to generate the synthetic CTs. Models were evaluated visually and on four metrics, Structural Similarity Index Measure (SSIM), Peak Signal-to-Noise Ratio (PSNR), Mean Absolute Error (MAE) and Root Mean Square Error (RMSE), to compare the synthetic CT and original CT images. Results: For the pancreatic dataset with limited view CBCT images, our study showed that with the Cycle GAN model, MAE, RMSE and PSNR improved from 12.57 to 8.49, 20.94 to 15.29 and 21.85 to 24.63, respectively, but structural similarity only marginally increased from 0.78 to 0.79. Similar results were achieved with AGGAN, with no improvement over Cycle GAN. However, for the lung dataset with full view CBCT images, Cycle GAN was able to reduce MAE significantly from 89.44 to 15.11, and AGGAN was able to reduce it to 19.77. Similarly, RMSE decreased from 92.68 to 23.50 with Cycle GAN and to 29.02 with AGGAN. SSIM and PSNR also improved significantly, from 0.17 to 0.59 and from 8.81 to 21.06 respectively with Cycle GAN, while with AGGAN SSIM increased to 0.52 and PSNR to 19.31. In both datasets, the GAN models were able to reduce artifacts, reduce noise, and provide better resolution and better contrast enhancement. Conclusion and Recommendation: Both Cycle GAN and AGGAN were able to significantly reduce MAE and RMSE and improve PSNR in both datasets. However, the full view lung dataset showed more improvement in SSIM and image quality than the limited view pancreatic dataset.
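
For readers who want to reproduce the comparison, the sketch below shows how the four reported metrics can be computed for a pair of aligned CT/synthetic-CT slices. It is an illustrative Python sketch, not the authors' code; the array names and the data_range default are assumptions.

```python
# Illustrative computation of MAE, RMSE, PSNR and SSIM for two same-shaped image arrays.
import numpy as np
from skimage.metrics import structural_similarity

def evaluate(ct_ref, ct_syn, data_range=None):
    """Return MAE, RMSE, PSNR and SSIM comparing a synthetic CT against the reference CT."""
    ct_ref = ct_ref.astype(np.float64)
    ct_syn = ct_syn.astype(np.float64)
    if data_range is None:
        data_range = ct_ref.max() - ct_ref.min()   # assumed intensity range
    diff = ct_ref - ct_syn
    mae = np.mean(np.abs(diff))
    rmse = np.sqrt(np.mean(diff ** 2))
    psnr = 20 * np.log10(data_range) - 10 * np.log10(np.mean(diff ** 2))
    ssim = structural_similarity(ct_ref, ct_syn, data_range=data_range)
    return mae, rmse, psnr, ssim
```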

Keywords: CT images, CBCT images, cycle GAN, AGGAN

Procedia PDF Downloads 75
1027 Towards Resilient Cloud Computing through Cyber Risk Assessment

Authors: Hilalah Alturkistani, Alaa AlFaadhel, Nora AlJahani, Fatiha Djebbar

Abstract:

Cloud computing is one of the most widely used technologies, providing opportunities and services to government entities, large companies, and standard users. However, cybersecurity risk management studies of cloud computing and resiliency approaches are lacking. This paper proposes resilient cloud cybersecurity risk assessment and management tailored specifically to Dropbox, with two approaches: 1) a technical solution motivated by a cybersecurity risk assessment of cloud services, and 2) a personnel-targeted solution guided by a cybersecurity-related survey among employees to assess whether their knowledge qualifies them to withstand a cyberattack. The proposed work attempts to identify cloud vulnerabilities, assess threats and detect high-risk components, and finally to propose appropriate safeguards such as failure prediction and removal, redundancy, or load-balancing techniques for quick recovery and a return to the pre-attack state if a failure occurs.

Keywords: cybersecurity risk management plan, resilient cloud computing, cyberattacks, cybersecurity risk assessment

Procedia PDF Downloads 125
1026 Effect of Concrete Strength on the Bond Between Carbon Fiber Reinforced Polymer and Concrete in Hot Weather

Authors: Usama Mohamed Ahamed

Abstract:

This research deals with the bond behavior of carbon FRP composite wraps adhered/bonded to the surface of concrete. Four concrete mixes were designed to achieve concrete compressive strengths of 18, 22.5, 25 and 30 MPa after 28 days of curing. The focus of the study is on bond degradation when the hybrid structure is exposed to hot weather conditions. Specimens were exposed to a temperature of 50 °C for 6 months, while other specimens were kept at laboratory temperature (20-24 °C). Upon removing the specimens from their conditioning environment, tension tests were performed using a specially manufactured concrete cube holder. A lightweight mortar layer was used to protect the bonded carbon FRP layer on the concrete surface. The results show that the higher the concrete's compressive strength, the higher the bond strength. High temperature decreases the bond strength between concrete and carbon fiber-reinforced polymer. The use of a protection layer is essential for concrete exposed to hot weather.

Keywords: concrete, bond, hot weather, carbon fiber reinforced polymers

Procedia PDF Downloads 94
1025 Experiment and Analytical Study on Fire Resistance Performance of Slot Type Concrete-Filled Tube

Authors: Bum Yean Cho, Heung-Youl Kim, Ki-Seok Kwon, Kang-Su Kim

Abstract:

In this study, a full-scale test and numerical analysis of the fire resistance performance of a bare CFT column were conducted, in which a slot was used instead of the existing welding method to connect the steel pipe of the concrete-filled tube. The welded CFT column is known to be vulnerable to high or low temperatures because of the brittleness of the welded part. In the fire resistance performance test of the slot-type CFT column, in which the welded part was removed and the connection fixed by a slot folded into the tube, the slot-type CFT column showed fire resistance performance improved by 28% or more over the welded CFT column. Finite element analysis of the slot-type column using ABAQUS confirmed the reliability of the test results in predicting the fire behavior and fire resistance time.

Keywords: CFT (concrete-filled tube) column, fire resistance performance, slot, weld

Procedia PDF Downloads 174
1024 Acceleration Techniques of DEM Simulation for Dynamics of Particle Damping

Authors: Masato Saeki

Abstract:

Presented herein is a novel algorithm for calculating the damping performance of particle dampers. The particle damper is a passive vibration control technique and has many practical applications due to its simple design. It consists of granular materials constrained to move between two ends in the cavity of a primary vibrating system. The damping effect results from the exchange of momentum during the impact of granular materials against the wall of the cavity. This damping has the advantage of being independent of the environment. Therefore, particle damping can be applied in extreme temperature environments, where most conventional dampers would fail. It has been shown experimentally in many papers that the efficiency of particle dampers is high in the case of resonant vibration. In order to use particle dampers effectively, it is necessary to solve the equations of motion for each particle, considering the granularity. The discrete element method (DEM) has been found to be effective for revealing the dynamics of particle damping. In this method, individual particles are assumed to be rigid bodies and interparticle collisions are modeled by mechanical elements such as springs and dashpots. However, the computational cost is significant since the equation of motion for each particle must be solved at each time step. In order to improve the computational efficiency of the DEM, new algorithms are needed. In this study, new algorithms are proposed for implementing a high performance DEM. On the assumption that the behavior of the granular particles in each divided area of the damper container is the same, the contact force of the primary system with all particles can be considered to be equal to the product of the number of divided damper areas and the contact force of the primary system with the granular materials per divided area. This simplification makes it possible to considerably reduce the calculation time. The validity of this calculation method was investigated and the calculated results were compared with experimental ones. This paper also presents the results of experimental studies of the performance of particle dampers. It is shown that the particle radius affects the noise level. It is also shown that the particle size and the particle material influence the damper performance.
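
The acceleration idea described in the abstract can be sketched as follows. This is an illustrative Python sketch, not the paper's implementation: a linear spring-dashpot contact force is assumed, and the stiffness k, damping c and n_divisions values are placeholders.

```python
# Linear spring-dashpot contact force for one particle-wall impact, and the
# scaling step: if every divided area of the container behaves identically,
# the total force on the primary system is the per-area force times n_divisions.
import numpy as np

def contact_force(overlap, overlap_rate, k=1.0e5, c=50.0):
    """Normal contact force of a single particle-wall impact (linear spring-dashpot)."""
    if overlap <= 0.0:            # no contact
        return 0.0
    return k * overlap + c * overlap_rate

def total_force_on_primary(overlaps, overlap_rates, n_divisions):
    """Force from one representative divided area, scaled up to the whole damper."""
    per_area = sum(contact_force(d, dr) for d, dr in zip(overlaps, overlap_rates))
    return n_divisions * per_area
```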

Keywords: particle damping, discrete element method (DEM), granular materials, numerical analysis, equivalent noise level

Procedia PDF Downloads 450
1023 Smooth Second Order Nonsingular Terminal Sliding Mode Control for a 6 DOF Quadrotor UAV

Authors: V. Tabrizi, A. Vali, R. Ghasemi, V. Behnamgol

Abstract:

In this article, a nonlinear model of an underactuated six degrees of freedom (6 DOF) quadrotor UAV is derived on the basis of the Newton-Euler formulation. The derivation comprises determining the equations of motion of the quadrotor in three dimensions and approximating the actuation forces through the modeling of aerodynamic coefficients and electric motor dynamics. The robust nonlinear control strategy includes a smooth second order nonsingular terminal sliding mode control, which is applied to stabilize this model. The control method is based on the super-twisting algorithm, which removes chattering and produces a smooth control signal. In addition, the nonsingular terminal sliding mode idea is used to introduce a nonlinear sliding variable that guarantees finite-time convergence in the sliding phase. Simulation results show that the proposed algorithm is robust against uncertainty and disturbance and guarantees a fast and precise control signal.
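
For reference, the standard textbook forms of the nonsingular terminal sliding variable and the super-twisting law are given below; the paper's exact sliding variable, states and gains are not stated in the abstract, so the symbols here (e, β, p, q, k₁, k₂) are generic.

```latex
% Nonsingular terminal sliding variable s and super-twisting control law
% (standard forms; gains and states are generic placeholders).
\[
s = e + \frac{1}{\beta}\,\dot{e}^{\,p/q}, \qquad \beta > 0,\; p,q \text{ odd},\; 1 < p/q < 2,
\]
\[
u = -k_1\,|s|^{1/2}\operatorname{sign}(s) + v, \qquad
\dot{v} = -k_2\,\operatorname{sign}(s), \qquad k_1, k_2 > 0 .
\]
```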

Keywords: quadrotor UAV, nonsingular terminal sliding mode, second order sliding mode, electronics, control, signal processing

Procedia PDF Downloads 429
1022 Quantitative Changes in Biofilms of a Seawater Tubular Heat Exchanger Subjected to Electromagnetic Fields Treatment

Authors: Sergio Garcia, Alfredo Trueba, Luis M. Vega, Ernesto Madariaga

Abstract:

Biofilm adhesion is one of the most important costs for industrial plants worldwide that use water for cooling heat exchangers or are otherwise in contact with water. This study evaluated the effect of electromagnetic fields on biofilms in tubular heat exchangers cooled with seawater. The results showed an up to 40% reduction of the biofilm thickness compared to the untreated control tubes. The presence of organic matter was reduced by 75%, the inorganic matter was reduced by 87%, and 53% of the dissolved solids were eliminated. The biofilm thermal conductivity in the treated tube was reduced by 53% as compared to the control tube. The hardness of the effluent during the experimental period decreased by 18% in the treated tubes compared with the control tubes. Our results show that electromagnetic field treatment has great potential in the process of removing biofilms in heat exchangers.

Keywords: biofilm, heat exchanger, electromagnetic fields, seawater

Procedia PDF Downloads 184
1021 An Indoor Positioning System in Wireless Sensor Networks with Measurement Delay

Authors: Pyung Soo Kim, Eung Hyuk Lee, Mun Suck Jang

Abstract:

In the current paper, an indoor positioning system is proposed with consideration of measurement delay. Firstly, an estimation filter with a measurement delay is designed for the indoor positioning mechanism under a weighted least squares criterion, which utilizes only a finite number of measurements on the most recent window. The proposed estimation filtering based scheme gives the filtered estimates of the position, velocity and acceleration of the moving target in real time, while removing undesired noise effects and preserving the desired moving positions. Secondly, the proposed scheme is shown to have good inherent properties such as unbiasedness, efficiency, time-invariance, deadbeat, and robustness due to the finite memory structure. Finally, computer simulations show that the proposed estimation filtering based scheme can outperform the existing infinite memory filtering based mechanism.
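
A minimal sketch of a finite-memory (moving-window) weighted least-squares estimator of position, velocity and acceleration is shown below, assuming a known constant measurement delay. It is illustrative only; the window size, weighting and delay value are assumptions, not the paper's design.

```python
# Moving-window weighted least-squares fit of a constant-acceleration model,
# using only the most recent N measurements and compensating a known delay.
import numpy as np

def fmwls_estimate(times, positions, delay=0.05, window=20):
    t = np.asarray(times[-window:]) - delay        # compensate known measurement delay
    z = np.asarray(positions[-window:])
    t0 = t[-1]                                     # estimate at the most recent instant
    # model: z = p + v*(t-t0) + 0.5*a*(t-t0)^2
    H = np.column_stack([np.ones_like(t), t - t0, 0.5 * (t - t0) ** 2])
    W = np.diag(np.linspace(0.5, 1.0, t.size))     # heavier weight on newer samples
    theta, *_ = np.linalg.lstsq(W @ H, W @ z, rcond=None)
    return theta                                   # [position, velocity, acceleration] at t0
```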

Keywords: indoor positioning system, wireless sensor networks, measurement delay

Procedia PDF Downloads 474
1020 Interruption Overload in an Office Environment: Hungarian Survey Focusing on the Factors that Affect Job Satisfaction and Work Efficiency

Authors: Fruzsina Pataki-Bittó, Edit Németh

Abstract:

On the one hand, new technologies and communication tools improve employee productivity and accelerate information and knowledge transfer, while on the other hand, information overload and continuous interruptions make it even harder to concentrate at work. It is a great challenge for companies to find the right balance, while there is also an ongoing demand to recruit and retain talented employees who are able to adopt the modern work style and effectively use modern communication tools. For this reason, this research does not focus on objective measures of office interruptions, but aims to find those disruption factors which influence the comfort and job satisfaction of employees, and the way they feel generally at work. The focus of this research is on how employees feel about the different types of interruptions: which ones they identify as hindering factors and which they experience as stress factors. By identifying and then reducing these destructive factors, job satisfaction can reach a higher level and employee turnover can be reduced. During the research, we collected information from in-depth interviews and questionnaires asking about the work environment, communication channels used in the workplace, individual communication preferences, factors considered as disruptions, and individual steps taken to avoid interruptions. The questionnaire was completed by 141 office workers from several types of workplaces based in Hungary. Even though 66 respondents work at Hungarian offices of multinational companies, the research is about the characteristics of the Hungarian labor force. The most important result of the research shows that while more than one third of the respondents consider office noise a disturbing factor, personal inquiries are welcome and considered useful, even if in such cases the work environment will not be convenient for solving tasks requiring concentration. Analyzing the sizes of the offices, in an open-space environment the rate of those who consider office noise a disturbing factor is surprisingly lower than in smaller office rooms. Opinions are more diverse regarding information communication technologies. In addition to the interruption factors affecting employees' job satisfaction, the research also focuses on the role of offices in the 21st century.

Keywords: information overload, interruption, job satisfaction, office environment, work efficiency

Procedia PDF Downloads 224
1019 Biosorption of Ni (II) Using Alkaline-Treated Rice Husk

Authors: Khanom Simarani

Abstract:

Rice husk has been widely reported as a good sorbent for heavy metals. Pretreatment of rice husk reduces cellulose crystallinity and increases the surface area, thus ensuring better adsorption capacity. Commercial base and natural base-treated rice husk were used to investigate the potential for Ni(II) adsorption from synthetic solutions and wastewater in batch systems. The effects of process variables such as pH, contact time, adsorbent dose, and initial Ni(II) concentration were studied. Optimum Ni(II) adsorption was observed at pH 6 within 60 min of contact time. Experimental data showed an increased amount of adsorbed Ni(II) with increasing adsorbent dose and a decreased percentage of adsorption with increasing initial Ni(II) concentration. Adsorption isotherms (Langmuir, Freundlich) were also applied. The biosorption mechanism of rice husk was analyzed using SEM/EDS, FT-IR, and XRD. The results revealed that a natural base produced from agro-industrial waste could be used as efficiently as commercial bases for pretreating rice husk for removing Ni(II) from wastewater within 15 min.
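
For reference, the standard forms of the two equilibrium isotherm models named in the abstract are given below; q_e is the amount adsorbed at equilibrium, C_e the equilibrium Ni(II) concentration, and q_max, K_L, K_F and n are fitted constants.

```latex
% Langmuir and Freundlich isotherms (standard forms).
\[
\text{Langmuir:}\quad q_e = \frac{q_{\max} K_L C_e}{1 + K_L C_e},
\qquad
\text{Freundlich:}\quad q_e = K_F\, C_e^{1/n}.
\]
```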

Keywords: nickel removal, adsorbent, heavy metal, biomass

Procedia PDF Downloads 283
1018 Analysis of Process for Solution of Fiber-Ends after Biopolishing on the Surface of Cotton Knit Fabric

Authors: P. Altay, G. Kartal, B. Kizilkaya, S. Kahraman, N. C. Gursoy

Abstract:

Biopolishing is applied to remove the fuzz or pills on the fiber or fabric surface, which reduces its tendency to pill or fuzz after repetitive launderings. After the biopolishing process, the fuzzes weakened by cellulase enzymes cannot be thoroughly removed from the fabric surface; they remain on the fabric or fiber surface, which disturbs the user and decreases the productivity of the drying process. The main objective of this study is to develop a method for removing weakened fuzz fibers and surface pills from the biofinished fabric surface before the drying process. Fuzzes in the lattice structure of the fabric were completely removed from the internal structure of the fabric by air blowing. The presence of fuzzes leads to problems with pilling formation and a faded appearance; the removal of fuzzes from the fabric results in a reduced tendency to pilling, a cleaner, smoother and softer surface, and improved fabric handling properties, while maintaining the original color.

Keywords: biopolishing, fuzz fiber, weakened fiber, biofinished cotton fabric

Procedia PDF Downloads 371
1017 Moving Beyond the Limits of Disability Inclusion: Using the Concept of Belonging Through Friendship to Improve the Outcome of the Social Model of Disability

Authors: Luke S. Carlos A. Thompson

Abstract:

The medical model of disability, though beneficial for the medical professional, is often exclusionary, restrictive and dehumanizing when applied to the lived experience of disability. As a result, a critique of this model was constructed, called the social model of disability. Much of the language used to articulate the purpose behind the social model of disability can be summed up in the word inclusion. However, this essay asserts that inclusiveness is an incomplete aspiration. The social model, as it currently stands, does not aid in creating a society where those with impairments actually belong. Rather, the social model aids in lessening the visibility, or negative consequences, of difference. Therefore, the social model does not invite society to welcome those with physical and intellectual impairments. It simply aids society in ignoring the existence of impairment by removing explicit forms of exclusion. Rather than simple inclusion, then, this essay uses John Swinton's concept of friendship and Jean Vanier's understanding of belonging to better articulate the intended outcome of the social model: a society where everyone can belong.

Keywords: belong, community, differently-able, disability, exclusion, friendship, inclusion, normality

Procedia PDF Downloads 440
1016 Computation of ΔV Requirements for Space Debris Removal Using Orbital Transfer

Authors: Sadhvi Gupta, Charulatha S.

Abstract:

Since the early 1950s, humans have launched numerous vehicles into space. From rockets to rovers, humans have achieved tremendous growth in the technology sector. While this is mostly an upside for humans, the one major downside that can no longer be ignored is the amount of junk produced in space as a result, i.e., space debris. All this space junk comes from objects we launch from Earth, which remain in orbit until they re-enter the atmosphere. Space debris comes in various sizes: the large pieces are mainly dead satellites floating in space, while the small ones include things like paint flecks, screwdrivers, bolts, etc. Tracking small space debris less than 10 cm in size is impossible, and such debris can have vast implications. As the amount of space debris increases, the chance of it hitting a functional satellite also increases, and it is extremely costly to repair or recover a satellite once it is hit by orbiting debris. The proposed solution is to actively remove space debris while keeping space sustainability in mind. For this solution, a total of 8 modules will be launched into LEO and GEO; these modules will be placed in their desired orbits through Hohmann transfers, for which calculating ΔV values is crucial. The modules are then placed in their designated positions in STK software and a thorough analysis is conducted.
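
The two-impulse Hohmann transfer ΔV calculation referred to above is standard; a minimal Python sketch is given below. The example altitudes are illustrative and are not the orbits used in the paper.

```python
# Two-impulse Hohmann transfer between coplanar circular orbits of radii r1 and r2.
import math

MU_EARTH = 398600.4418      # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6378.137          # km

def hohmann_delta_v(r1, r2, mu=MU_EARTH):
    """Return (dv1, dv2, total) in km/s for a circular-to-circular transfer r1 -> r2."""
    dv1 = math.sqrt(mu / r1) * (math.sqrt(2 * r2 / (r1 + r2)) - 1)   # burn at departure orbit
    dv2 = math.sqrt(mu / r2) * (1 - math.sqrt(2 * r1 / (r1 + r2)))   # burn at arrival orbit
    return dv1, dv2, abs(dv1) + abs(dv2)

# Example (illustrative): 700 km LEO to GEO
dv1, dv2, total = hohmann_delta_v(R_EARTH + 700.0, 42164.0)
print(f"dv1 = {dv1:.3f} km/s, dv2 = {dv2:.3f} km/s, total = {total:.3f} km/s")
```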

Keywords: space debris, Hohmann transfer, STK, delta-V

Procedia PDF Downloads 79
1015 An Investigation on the Removal of Synthetic Dyes from Aqueous Solution by a Functional Polymer

Authors: Ali Kara, Asim Olgun, Sevgi Sozugecer, Sahin Ozel, Kubra Nur Yildiz, P. Sevinç, Abdurrahman Kuresh, Guliz Turhan, Duygu Gulgun

Abstract:

Synthetic dyes, one of the most hazardous classes of chemical compounds, are important potential water pollutants since their presence in water bodies reduces light penetration, precluding the photosynthesis of aquatic flora and causing various diseases. Some synthetic dyes are highly toxic and/or carcinogenic, and their biodegradation can produce even more toxic aromatic amines. The adsorption procedure is one of the most effective means of removing synthetic dye pollutants and has been described in a number of previous studies using functional polymers. In this study, we investigated the removal of synthetic dyes from aqueous solution by using a functional polymer as an adsorbent material. The effects of initial solution concentration, pH, and contact time on the adsorption capacity of the adsorbent were studied in detail. The results showed that the functional polymer has the potential to be used as a cost-effective and efficient adsorbent for the treatment of aqueous solutions from textile industries.

Keywords: functional polymers, synthetic dyes, adsorption, physicochemical parameters

Procedia PDF Downloads 173
1014 Construction of the Large Scale Biological Networks from Microarrays

Authors: Fadhl Alakwaa

Abstract:

One of the central goals of systems biology is understanding gene-gene interactions. Hence, gene regulatory networks (GRN) need to be constructed for understanding disease ontology and to reduce the cost of drug development. To construct gene regulatory networks from gene expression data, many challenges must be overcome, such as data denoising and high dimensionality. In this paper, we develop an integrated system to reduce the data dimension and remove the noise. The network generated by our system was validated against available interaction databases and compared to previous methods. The results demonstrate the performance of the proposed method.

Keywords: gene regulatory network, biclustering, denoising, system biology

Procedia PDF Downloads 226
1013 Automatic Target Recognition in SAR Images Based on Sparse Representation Technique

Authors: Ahmet Karagoz, Irfan Karagoz

Abstract:

Synthetic Aperture Radar (SAR) is a radar mechanism that can be integrated into manned and unmanned aerial vehicles to create high-resolution images in all weather conditions, regardless of day and night. In this study, SAR images of military vehicles with different azimuth and descent angles are pre-processed in the first stage. The main purpose here is to reduce the high speckle noise found in SAR images. For this, the Wiener adaptive filter, the mean filter, and the median filter are used to reduce the amount of speckle noise in the images without causing loss of data. During the image segmentation phase, pixel values are thresholded so that the target vehicle region is separated from other regions containing unnecessary information. The target image is obtained by setting the brightest 20% of pixels to a value of 255 and the other pixels to 0. In addition, a segmentation comparison is performed using appropriate parameters of the statistical region merging algorithm. In the feature extraction step, the feature vectors belonging to the vehicles are obtained using Gabor filters with different orientation, frequency and angle values. A number of Gabor filters are created by varying the orientation, frequency and angle parameters in order to extract the important, distinctive features of the images. Finally, the images are classified by the sparse representation method. In this study, the l₁ norm analysis of sparse representation is used. A joint database of the feature vectors generated from the target images of the military vehicle types is assembled column by column and transformed into matrix form. To classify the vehicles in a similar way, the test image of each vehicle is converted to vector form and the l₁ norm analysis of the sparse representation method is applied using the existing database matrix. As a result, correct recognition is performed by matching the target images of the military vehicles with the test images by means of the sparse representation method. A classification success of 97% is obtained for SAR images of different military vehicle types.
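
The classification step can be sketched as follows: the test feature vector is coded over the training dictionary with an l₁-regularized fit and assigned to the class with the smallest reconstruction residual. This is an illustrative Python sketch, not the authors' implementation; the Lasso penalty value is an assumption, since the abstract does not state which l₁ solver was used.

```python
# Sparse-representation classification: l1-regularized coding followed by
# class-wise residual comparison.
import numpy as np
from sklearn.linear_model import Lasso

def src_classify(D, labels, y, alpha=0.01):
    """D: (n_features, n_train) dictionary of training vectors, labels: (n_train,), y: (n_features,)."""
    lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
    lasso.fit(D, y)                                  # solve y ~ D x with an l1 penalty on x
    x = lasso.coef_
    residuals = {}
    for c in np.unique(labels):
        xc = np.where(labels == c, x, 0.0)           # keep only coefficients of class c
        residuals[c] = np.linalg.norm(y - D @ xc)
    return min(residuals, key=residuals.get)         # class with smallest residual
```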

Keywords: automatic target recognition, sparse representation, image classification, SAR images

Procedia PDF Downloads 352
1012 Microwave Assisted Extractive Desulfurization of Gas Oil Feedstock

Authors: Hamida Y. Mostafa, Ghada E. Khedr, Dina M. Abd El-Aty

Abstract:

The removal of sulfur compounds from petroleum fractions is a critical component of environmental protection demands. Solvent extraction, oxidative desulfurization, or hydro-treatment techniques have traditionally been used as the removal processes. While all of these methods are capable of eliminating sulfur compounds at moderate rates, they have some limitations. A major problem with these routes is their high running expenses, caused by their prolonged operation times and high energy consumption. Therefore, new methods for removing sulfur are still necessary. In the current study, a simple assisted desulfurization system for a gas oil fraction has been successfully developed using acetonitrile and methanol as solvents under microwave irradiation. The key variables affecting sulfur removal have been studied, including microwave power, irradiation time, and solvent to gas oil volume ratio. At the conclusion of the presented research, promising results were found: the microwave-assisted extractive desulfurization method removed sulfur with a high degree of efficiency under suitable conditions.

Keywords: extractive desulfurization, microwave assisted extraction, petroleum fractions, acetonitrile and methanol

Procedia PDF Downloads 89
1011 Global Stability Of Nonlinear Itô Equations And N. V. Azbelev's W-method

Authors: Arcady Ponosov, Ramazan Kadiev

Abstract:

This work studies the global moment stability of solutions of systems of nonlinear differential Itô equations with delays. A modified regularization method (W-method) for the analysis of various types of stability of such systems, based on the choice of auxiliary equations and applications of the theory of positive invertible matrices, is proposed and justified. The development of this method for deterministic functional differential equations is due to N. V. Azbelev and his students. Sufficient conditions for the moment stability of solutions are given in terms of the coefficients, for sufficiently general as well as specific classes of Itô equations.
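
A generic form of the delayed Itô systems studied is shown below for orientation; the paper's precise assumptions on the drift, diffusion and delays are not given in the abstract.

```latex
% Generic nonlinear Ito equation with delays; B(t) denotes Brownian motion.
\[
dx(t) = f\bigl(t,\, x(t),\, x(t-\tau_1)\bigr)\,dt
      + g\bigl(t,\, x(t),\, x(t-\tau_2)\bigr)\,dB(t), \qquad t \ge 0 .
\]
```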

Keywords: asymptotic stability, delay equations, operator methods, stochastic noise

Procedia PDF Downloads 213
1010 Backward-Facing Step Measurements at Different Reynolds Numbers Using Acoustic Doppler Velocimetry

Authors: Maria Amelia V. C. Araujo, Billy J. Araujo, Brian Greenwood

Abstract:

The flow over a backward-facing step is characterized by the presence of flow separation, recirculation and reattachment, for a simple geometry. This type of fluid behaviour takes place in many practical engineering applications, hence the reason for it being investigated. Historically, fluid flows over a backward-facing step have been examined in many experiments using a variety of measuring techniques such as laser Doppler velocimetry (LDV), hot-wire anemometry, particle image velocimetry or hot-film sensors. However, some of these techniques cannot conveniently be used in separated flows or are too complicated and expensive. In this work, the applicability of the acoustic Doppler velocimetry (ADV) technique to this type of flow is investigated, at various Reynolds numbers corresponding to different flow regimes. Reports of the use of this measuring technique in separated flows are very difficult to find in the literature. Besides, most of the situations where the Reynolds number effect is evaluated in separated flows involve numerical modelling. The ADV technique has the advantage of providing nearly non-invasive measurements, which is important in resolving turbulence. The ADV Nortek Vectrino+ was used to characterize the flow, in a recirculating laboratory flume, at various Reynolds numbers (Reh = 3738, 5452, 7908 and 17388) based on the step height (h), in order to capture different flow regimes, and the results were compared to those obtained using other measuring techniques. To compare results with other researchers, the step height, expansion ratio and the positions upstream and downstream of the step were reproduced. The post-processing of the ADV records was performed using a customized numerical code, which implements several filtering techniques. Subsequently, the Vectrino noise level was evaluated by computing the power spectral density for the stream-wise horizontal velocity component. The normalized mean stream-wise velocity profiles, skin-friction coefficients and reattachment lengths were obtained for each Reh. Turbulent kinetic energy, Reynolds shear stresses and normal Reynolds stresses were determined for Reh = 7908. An uncertainty analysis was carried out for the measured variables using the moving block bootstrap technique. Low noise levels were obtained after implementing the post-processing techniques, showing their effectiveness. Besides, the errors obtained in the uncertainty analysis were, in general, relatively low. For Reh = 7908, the normalized mean stream-wise velocity and turbulence profiles were compared directly with those acquired by other researchers using the LDV technique, and a good agreement was found. The ADV technique proved able to characterize the flow properly over a backward-facing step, although additional caution should be taken for measurements very close to the bottom. The ADV measurements showed reliable results regarding: a) the stream-wise velocity profiles; b) the turbulent shear stress; c) the reattachment length; d) the identification of the transition from transitional to turbulent flows. Despite being a relatively inexpensive technique, acoustic Doppler velocimetry can be used with confidence in separated flows and is thus very useful for numerical model validation. However, it is very important to perform adequate post-processing of the acquired data to obtain low noise levels, thus decreasing the uncertainty.
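
The noise-level check described above can be sketched as follows: the power spectral density of the streamwise velocity record is estimated with Welch's method so that the flat high-frequency noise floor of the Vectrino can be inspected. The sampling frequency and segment length below are assumed values, not those of the experiment.

```python
# Welch PSD of the streamwise velocity component, used to inspect the noise floor.
import numpy as np
from scipy.signal import welch

def velocity_psd(u, fs=200.0, nperseg=1024):
    """Return frequencies (Hz) and PSD of the streamwise velocity record u."""
    u = np.asarray(u) - np.mean(u)            # remove the mean before spectral estimation
    freqs, psd = welch(u, fs=fs, nperseg=nperseg, window="hann")
    return freqs, psd
```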

Keywords: ADV, experimental data, multiple Reynolds number, post-processing

Procedia PDF Downloads 132
1009 An Intelligent Text Independent Speaker Identification Using VQ-GMM Model Based Multiple Classifier System

Authors: Ben Soltane Cheima, Ittansa Yonas Kelbesa

Abstract:

Speaker Identification (SI) is the task of establishing the identity of an individual based on his/her voice characteristics. The SI task is typically achieved by two-stage signal processing: training and testing. The training process calculates speaker-specific feature parameters from the speech and generates speaker models accordingly. In the testing phase, speech samples from unknown speakers are compared with the models and classified. Even though the performance of speaker identification systems has improved due to recent advances in speech processing techniques, there is still a need for improvement. In this paper, a Closed-Set Text-Independent Speaker Identification System (CISI) based on a Multiple Classifier System (MCS) is proposed, using the Mel Frequency Cepstrum Coefficient (MFCC) for feature extraction and a suitable combination of vector quantization (VQ) and the Gaussian Mixture Model (GMM), together with the Expectation Maximization (EM) algorithm, for speaker modeling. The use of a Voice Activity Detector (VAD) with a hybrid approach based on Short Time Energy (STE) and statistical modeling of background noise in the pre-processing step of feature extraction yields a better and more robust automatic speaker identification system. Investigation of the Linde-Buzo-Gray (LBG) clustering algorithm for initialization of the GMM, used to estimate the underlying parameters in the EM step, also improved the convergence rate and system performance. The system also uses a relative index as a confidence measure in case of contradiction between the GMM and VQ identification results. Simulation results carried out on the voxforge.org speech database using MATLAB highlight the efficacy of the proposed method compared to earlier work.
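
The GMM branch of the system can be sketched as follows. The original work was implemented in MATLAB; the Python sketch below is illustrative only, and the n_mfcc and n_components values are assumptions.

```python
# MFCC feature extraction plus per-speaker GMM modeling; an unknown utterance is
# assigned to the speaker whose model gives the highest average log-likelihood.
import librosa
from sklearn.mixture import GaussianMixture

def train_speaker_model(wav_path, n_mfcc=13, n_components=16):
    signal, sr = librosa.load(wav_path, sr=None)
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc).T   # (frames, n_mfcc)
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag")
    return gmm.fit(mfcc)

def identify(wav_path, models, n_mfcc=13):
    """models: dict mapping speaker name -> fitted GaussianMixture."""
    signal, sr = librosa.load(wav_path, sr=None)
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc).T
    scores = {name: gmm.score(mfcc) for name, gmm in models.items()}
    return max(scores, key=scores.get)
```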

Keywords: feature extraction, speaker modeling, feature matching, Mel frequency cepstrum coefficient (MFCC), Gaussian mixture model (GMM), vector quantization (VQ), Linde-Buzo-Gray (LBG), expectation maximization (EM), pre-processing, voice activity detection (VAD), short time energy (STE), background noise statistical modeling, closed-set text-independent speaker identification system (CISI)

Procedia PDF Downloads 299
1008 Removal of Heavy Metals from Water in the Presence of Organic Wastes: Fruit Peels

Authors: Özge Yılmaz Gel, Berk Kılıç, Derin Dalgıç, Ela Mia Sevilla Levi, Ömer Aydın

Abstract:

In this experiment, our goal was to remove heavy metals from water. In recent studies, the removal of toxic heavy metal ions such as Cu²⁺, Cr³⁺ and Fe³⁺ from aqueous solutions has been investigated using different kinds of plant materials, such as kiwi and tangerine. In this study, however, three different peels were used: we tested banana, peach, and potato peels to remove heavy metal ions from solution. The first step of the experiment was to wash the peels with distilled water and then dry them in an oven for 48 hrs at 80 °C. Once the peels were washed and dried, 0.2 g of each was weighed and added to 200 mL of 0.1% (by mass) heavy metal solution. The mixing process was done via a magnetic stirrer. Samples were taken at 15-minute intervals, and absorbance changes of the solutions were detected using a UV-Vis spectrophotometer. Among the waste products used, banana peel was the most efficient. Moreover, the amount of fruit peel, the pH value of the initial heavy metal solution, and the initial concentration of the heavy metal solutions were investigated to determine the effect of the fruit peels.

Keywords: absorbance, heavy metal, removal of heavy metals, fruit peels

Procedia PDF Downloads 69
1007 On the Influence of Sleep Habits for Predicting Preterm Births: A Machine Learning Approach

Authors: C. Fernandez-Plaza, I. Abad, E. Diaz, I. Diaz

Abstract:

Births occurring before the 37th week of gestation are considered preterm births. A threat of preterm labor is defined as the beginning of regular uterine contractions, dilation and cervical effacement between 23 and 36 gestation weeks. To the authors' best knowledge, the factors that determine the beginning of birth are not yet completely defined. In particular, the influence of sleep habits on preterm births is weakly studied. The aim of this study is to develop a model to predict the factors affecting premature delivery, based on the above potential risk factors, including those derived from sleep habits and light exposure at night (introduced as 12 variables obtained by a telephone survey using two questionnaires previously used by other authors). Thus, three groups of variables were included in the study (maternal, fetal and sleep habits). The study was approved by the Research Ethics Committee of the Principado de Asturias (Spain). An observational, retrospective and descriptive study was performed with 481 births between January 1, 2015 and May 10, 2016 in the University Central Hospital of Asturias (Spain). A statistical analysis using SPSS was carried out to compare qualitative and quantitative variables between preterm and term delivery. The chi-square test for qualitative variables and the t-test for quantitative variables were applied. Statistically significant differences (p < 0.05) between preterm vs. term births were found for primiparity, multiparity, kind of conception, place of residence, premature rupture of membranes and interruption during nights. In addition to the statistical analysis, machine learning methods were tested in order to build a prediction model. In particular, tree-based models were applied, as their trade-off between performance and interpretability is especially suitable for this study. C5.0, recursive partitioning, random forest and tree bag models were analysed using the caret R-package. Ten-fold cross validation and parameter tuning to optimize the methods were applied. In addition, different noise reduction methods were applied to the initial data using the NoiseFiltersR package. The best performance was obtained by the C5.0 method, with Accuracy 0.91, Sensitivity 0.93, Specificity 0.89 and Precision 0.91. Some well known preterm birth factors were identified: cervix dilation, maternal BMI, premature rupture of membranes, and nuchal translucency analysis in the first trimester. The model also identifies other new factors related to sleep habits, such as light through the window, bedtime on working days, usage of electronic devices before sleeping from Mondays to Fridays, or a change of sleeping habits reflected in the number of hours, in the depth of sleep or in the lighting of the room. "IF dilation <= 2.95 AND usage of electronic devices before sleeping from Mondays to Fridays = YES AND change of sleeping habits = YES, THEN preterm" is one of the predictive rules obtained by C5.0. In this work, a model for predicting preterm births is developed, based on machine learning together with noise reduction techniques. The method maximizing the performance is the one selected. This model shows the influence of variables related to sleep habits on preterm prediction.
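
The tree-based modelling workflow was carried out with C5.0 and related models in the caret R package; the sketch below reproduces the general pipeline (tree model, parameter tuning, 10-fold cross-validation) in Python purely for illustration. The file name and column names are hypothetical.

```python
# Illustrative tree-based pipeline with 10-fold cross-validation and a small grid search.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import GridSearchCV

df = pd.read_csv("preterm_survey.csv")             # hypothetical file of survey + clinical variables
X = df.drop(columns=["preterm"])                   # hypothetical target column name
y = df["preterm"]

tree = DecisionTreeClassifier(random_state=0)
grid = GridSearchCV(tree, {"max_depth": [3, 5, 7, None]}, cv=10, scoring="accuracy")
grid.fit(X, y)
print("best params:", grid.best_params_)
print("10-fold CV accuracy:", grid.best_score_)
```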

Keywords: machine learning, noise reduction, preterm birth, sleep habit

Procedia PDF Downloads 135
1006 Speaker Identification by Atomic Decomposition of Learned Features Using Computational Auditory Scene Analysis Principals in Noisy Environments

Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic

Abstract:

Speaker recognition is performed in high Additive White Gaussian Noise (AWGN) environments using principles of Computational Auditory Scene Analysis (CASA). CASA methods often classify sounds from images in the time-frequency (T-F) plane using spectrograms or cochleagrams as the image. In this paper, atomic decomposition implemented by matching pursuit performs a transform from time series speech signals to the T-F plane. The atomic decomposition creates a sparsely populated T-F vector in "weight space" where each populated T-F position contains an amplitude weight. The weight space vector, along with the atomic dictionary, represents a denoised, compressed version of the original signal. The arrangement of the atomic indices in the T-F vector is used for classification. Unsupervised feature learning, implemented by a sparse autoencoder, learns a single dictionary of basis features from a collection of envelope samples from all speakers. The approach is demonstrated using pairs of speakers from the TIMIT data set. Pairs of speakers are selected randomly from a single district. Each speaker has 10 sentences; two are used for training and 8 for testing. Atomic index probabilities are created for each training sentence and also for each test sentence. Classification is performed by finding the lowest Euclidean distance between the probabilities from the training sentences and the test sentences. Training is done at a 30 dB Signal-to-Noise Ratio (SNR). Testing is performed at SNRs of 0 dB, 5 dB, 10 dB and 30 dB. The algorithm has a baseline classification accuracy of ~93%, averaged over 10 pairs of speakers from the TIMIT data set. The baseline accuracy is attributable to the short sequences of training and test data as well as the overall simplicity of the classification algorithm. The accuracy is not affected by AWGN and remains ~93% at 0 dB SNR.
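
The matching-pursuit atomic decomposition can be sketched as follows: at each step the dictionary atom most correlated with the residual is selected, its weight recorded, and its contribution subtracted; the resulting (index, weight) pairs form the sparse weight-space vector described above. This is an illustrative Python sketch; the dictionary and the number of atoms are assumptions.

```python
# Greedy matching pursuit over a dictionary of unit-norm atoms.
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms=50):
    """dictionary: (signal_len, n_dict) matrix whose columns are unit-norm atoms."""
    residual = signal.astype(np.float64)
    indices, weights = [], []
    for _ in range(n_atoms):
        correlations = dictionary.T @ residual
        k = int(np.argmax(np.abs(correlations)))    # best-matching atom
        w = correlations[k]
        residual = residual - w * dictionary[:, k]  # remove its contribution
        indices.append(k)
        weights.append(w)
    return indices, weights, residual
```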

Keywords: time-frequency plane, atomic decomposition, envelope sampling, Gabor atoms, matching pursuit, sparse dictionary learning, sparse autoencoder

Procedia PDF Downloads 283
1005 A General Iterative Nonlinear Programming Method to Synthesize Heat Exchanger Network

Authors: Rupu Yang, Cong Toan Tran, Assaad Zoughaib

Abstract:

This work provides an iterative nonlinear programming method to synthesize a heat exchanger network by manipulating the trade-offs between the heat load of process heat exchangers (HEs) and utilities. We consider two cases for the synthesis problem: the first one without a fixed cost for HEs, and the second one with a fixed cost. For the no-fixed-cost problem, the nonlinear programming (NLP) model with all the potential HEs is optimized to obtain the global optimum. For the case with fixed cost, the NLP model is iterated by adding/removing HEs. The method was applied to five case studies and demonstrated its effectiveness quite well. Among these, the approach reaches the lowest TAC (2,904,026 $/year) compared with the best recorded result for the well-known aromatics plant problem. It also locates a slightly better design than reported in the literature for a 10-stream case without fixed cost, with only 1/9 of the computational time. Moreover, compared to the traditional mixed-integer nonlinear programming approach, the iterative NLP method opens the possibility to consider constraints (such as controllability or dynamic performance) that require the structure of the network to be known in order to be calculated.

Keywords: heat exchanger network, synthesis, NLP, optimization

Procedia PDF Downloads 154
1004 Moment-Curvature Relation for Nonlinear Analysis of Slender Structural Walls

Authors: E. Dehghan, R. Dehghan

Abstract:

Generally, slender structural walls exhibit flexural behavior. Since the behavior of bending members can be explained by the moment-curvature relation, an analytical model based on the moment-curvature relation is proposed for slender structural walls. The moment-curvature relationships of RC sections are constructed through section analysis. Governing equations describing the bond-slip behavior in walls are derived and applied to the moment-curvature relations. For the purpose of removing imprecision in the analytical results, the plastic hinge length is included in the finite element modeling. Finally, correlation studies between analytical and experimental results are conducted with the objective of establishing the validity of the proposed algorithms. The results show that the bond-slip effect is more significant in walls subjected to larger axial compression loads. Moreover, preferable results are obtained when the ultimate strain of concrete is assumed conservatively.

Keywords: nonlinear analysis, slender structural walls, moment-curvature relation, bond-slip, plastic hinge length

Procedia PDF Downloads 305
1003 Mitigation of High Voltage Equipment Design Deficiencies for Improved Operation and Maintenance

Authors: Riyad Awad, Abdulmohsen Alghadeer, Meshari Otaibi

Abstract:

Proper operation and maintenance (O&M) activities for high voltage equipment can extend the asset lifecycle and maintain its integrity and reliability. Such a vital process should be proactively considered during the equipment design and manufacturing phases by removing and eliminating any features of the equipment which adversely affect O&M activities. This paper presents a gap analysis pertaining to difficulties in performing operation and maintenance (O&M) of high voltage electrical equipment, including power transformers, switchgear, motor control centers, disconnect switches and circuit breakers. The difficulties are gathered from field personnel, equipment design review comments, the quality management system, and a lessons-learned database. The purpose of the gap analysis is to mitigate and prevent the O&M difficulties as early as possible in the design stage of the equipment lifecycle. The paper concludes with several recommendations and corrective actions for all identified gaps in order to reduce the cost of O&M difficulties and improve the equipment lifecycle.

Keywords: operation and maintenance, high voltage equipment, equipment lifecycle, reduce the cost of maintenance

Procedia PDF Downloads 152
1002 Analysis of Generated Biogas from Anaerobic Digestion of Piggery Dung

Authors: Babatope Alabadan, Adeyinka Adesanya, I. E. Afangideh

Abstract:

The use of energy is paramount to human existence; every activity globally revolves around it. Over the years, different sources of energy (predominantly petroleum fuels) have been utilized. Animal waste treatment on the farm is a topic that has attracted considerable research attention, as wastes generated on farms pollute the environment in diverse ways. Waste-to-bioenergy treatments can provide livestock operators with multiple value-added, renewable energy products. The objective of this work is to generate methane (CH4) gas from the anaerobic digestion of piggery dung. Retention times of 15 and 30 days and a mesophilic temperature range were selected. The composition of the generated biogas, determined by gas chromatography, was methane (CH4), carbon dioxide (CO2), hydrogen sulphide (H2S) and ammonia (NH3). At a 15-day retention time, 60% CH4 was collected, while CO2 and traces of H2S and NH3 accounted for 40%. At a 30-day retention time, 75% CH4 and 20% CO2 were collected, while traces of H2S and NH3 amounted to 5%. For on- and off-farm uses, the biogas can be upgraded to biomethane by removing the CO2, NH3 and H2S. This product (CH4) can meet heating and power needs or serve as a transportation fuel.

Keywords: anaerobic digestion, biogas, methane, piggery dung

Procedia PDF Downloads 333
1001 Sigma-Delta ADCs Converter a Study Case

Authors: Thiago Brito Bezerra, Mauro Lopes de Freitas, Waldir Sabino da Silva Júnior

Abstract:

Sigma-Delta A/D converters have been proposed as a practical solution for A/D conversion at high rates because of their simplicity and robustness to imperfections in the circuit, and also because traditional converters are more difficult to implement in VLSI technology. Conventional conversion methods require precise analog components in their filters and conversion circuits and are more vulnerable to noise and interference. This paper aims to analyze the architecture, function and application of Sigma-Delta analog-to-digital (A/D) converters that overcome these difficulties, showing some simulations using the Simulink software and Multisim.
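
A first-order Sigma-Delta modulator can be sketched in a few lines: the input minus the fed-back 1-bit output is integrated and quantized, producing a high-rate bitstream whose average tracks the input while quantization noise is shaped towards high frequencies. The Python sketch below is illustrative; the oversampling rate and input signal are assumed values, and it does not reproduce the paper's Simulink/Multisim models.

```python
# First-order Sigma-Delta modulator: integrator + 1-bit quantizer + feedback.
import numpy as np

def sigma_delta_first_order(x):
    """Return the +/-1 bitstream for an input sequence x with values in [-1, 1]."""
    integrator = 0.0
    feedback = 0.0
    bits = np.empty_like(x)
    for i, sample in enumerate(x):
        integrator += sample - feedback                 # accumulate the error
        feedback = 1.0 if integrator >= 0 else -1.0     # 1-bit quantizer
        bits[i] = feedback
    return bits

# Example: oversampled sine input; a low-pass/decimation stage would recover the signal.
fs, f_in = 64_000, 100
t = np.arange(0, 0.02, 1 / fs)
bitstream = sigma_delta_first_order(0.5 * np.sin(2 * np.pi * f_in * t))
```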

Keywords: analysis, oversampling modulator, A/D converters, sigma-delta

Procedia PDF Downloads 320
1000 Study of Adaptive Filtering Algorithms and the Equalization of Radio Mobile Channel

Authors: Said Elkassimi, Said Safi, B. Manaut

Abstract:

This paper presents a study of three algorithms: the equalization algorithm used to equalize the transmission channel with the ZF and MMSE criteria, applied to the BRAN A channel, and the adaptive filtering algorithms LMS and RLS used to estimate the parameters of the equalizer filter, i.e., to track the channel estimate and therefore reflect the temporal variations of the channel and reduce the error in the transmitted signal. The performance of the equalizer with the ZF and MMSE criteria is assessed in the noiseless case, together with a comparison of the performance of the LMS and RLS algorithms.
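
The LMS update used to adapt the equalizer taps can be sketched as follows, assuming a known training (desired) sequence. The filter length and step size mu are assumptions, not the values used in the paper.

```python
# LMS adaptive equalizer trained against a known reference sequence.
import numpy as np

def lms_equalizer(received, desired, n_taps=11, mu=0.01):
    """Adapt taps w so that the filtered received signal approximates the desired symbols."""
    w = np.zeros(n_taps)
    output = np.zeros(len(received))
    for n in range(n_taps, len(received)):
        x = received[n - n_taps:n][::-1]     # most recent samples first
        output[n] = np.dot(w, x)
        error = desired[n] - output[n]
        w += mu * error * x                  # LMS weight update
    return w, output
```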

Keywords: adaptive filtering, equalizer, LMS, RLS, BRAN A, Proakis (B), MMSE, ZF

Procedia PDF Downloads 305
999 Development of a Turbulent Boundary Layer Wall-pressure Fluctuations Power Spectrum Model Using a Stepwise Regression Algorithm

Authors: Zachary Huffman, Joana Rocha

Abstract:

Wall-pressure fluctuations induced by the turbulent boundary layer (TBL) developed over aircraft are a significant source of aircraft cabin noise. Since the power spectral density (PSD) of these pressure fluctuations is directly correlated with the amount of sound radiated into the cabin, the development of accurate empirical models that predict the PSD has been an important ongoing research topic. The sound emitted can be represented from the pressure fluctuation term in the Reynolds-averaged Navier-Stokes (RANS) equations. Therefore, early TBL empirical models (including those from Lowson, Robertson, Chase, and Howe) were primarily derived by simplifying and solving the RANS for the pressure fluctuation and adding appropriate scales. Most subsequent models (including the Goody, Efimtsov, Laganelli, Smol'yakov, and Rackl and Weston models) were derived by making modifications to these early models or from physical principles. Overall, these models have had varying levels of accuracy, but, in general, they are most accurate under the specific Reynolds and Mach numbers they were developed for, while being less accurate under other flow conditions. Despite this, recent research into the possibility of using alternative methods for deriving the models has been rather limited. More recent studies have demonstrated that an artificial neural network model was more accurate than traditional models and could be applied more generally, but the accuracy of other machine learning techniques has not been explored. In the current study, an original model is derived using a stepwise regression algorithm in the statistical programming language R, and TBL wall-pressure fluctuation PSD data gathered at the Carleton University wind tunnel. The theoretical advantage of a stepwise regression approach is that it automatically filters out redundant or uncorrelated input variables (through the process of feature selection), and it is computationally faster than machine learning. The main disadvantage is the potential risk of overfitting. The accuracy of the developed model is assessed by comparing it to independently sourced datasets.
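
The forward stepwise-selection idea can be sketched as follows: starting from an empty model, the candidate variable that most improves the cross-validated R² is added at each step, and selection stops when no candidate improves the score. The original model was built in R; the Python sketch below is illustrative only, and the variable names are placeholders for the TBL parameters.

```python
# Forward stepwise feature selection for a linear regression model.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

def forward_stepwise(X, y, candidate_names):
    """X: pandas DataFrame of candidate predictors; y: PSD values; returns (selected, score)."""
    selected, best_score = [], -np.inf
    improved = True
    while improved:
        improved = False
        for name in [c for c in candidate_names if c not in selected]:
            cols = selected + [name]
            score = cross_val_score(LinearRegression(), X[cols], y, cv=5, scoring="r2").mean()
            if score > best_score:
                best_score, best_name, improved = score, name, True
        if improved:
            selected.append(best_name)   # keep the variable that helped most this round
    return selected, best_score
```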

Keywords: aircraft noise, machine learning, power spectral density models, regression models, turbulent boundary layer wall-pressure fluctuations

Procedia PDF Downloads 130