Search results for: gaussian mixture model (GMM)
105 A Simple Adaptive Atomic Decomposition Voice Activity Detector Implemented by Matching Pursuit
Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic
Abstract:
A simple adaptive voice activity detector (VAD) is implemented using Gabor and gammatone atomic decomposition of speech for high Gaussian noise environments. Matching pursuit is used for atomic decomposition and is shown to achieve optimal speech detection capability at high data compression rates for low signal-to-noise ratios. The most active dictionary elements found by matching pursuit are used for the signal reconstruction, so that the algorithm adapts to the individual speaker's dominant time-frequency characteristics. Speech has a high peak-to-average ratio, enabling matching pursuit's greedy heuristic of highest inner products to isolate high-energy speech components in high noise environments. Gabor and gammatone atoms are both investigated with identical logarithmically spaced center frequencies and similar bandwidths. The algorithm performs equally well for both Gabor and gammatone atoms, with no significant statistical differences. The algorithm achieves 70% accuracy at a 0 dB SNR, 90% accuracy at a 5 dB SNR, and 98% accuracy at a 20 dB SNR, using a 30 dB SNR as the reference for voice activity.
Keywords: atomic decomposition, Gabor, gammatone, matching pursuit, voice activity detection
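The core of such a detector is the matching pursuit loop itself: repeatedly pick the dictionary atom with the largest inner product against the residual and subtract it. Below is a minimal, illustrative sketch, not the authors' implementation; the toy Gabor-style dictionary and all parameter values are assumptions for demonstration.

```python
import numpy as np

def matching_pursuit(x, D, n_atoms):
    """Greedy matching pursuit: x (N,), D (N, K) dictionary with
    unit-norm columns. Returns coefficients and the reconstruction."""
    residual = x.astype(float).copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_atoms):
        inner = D.T @ residual              # correlations with every atom
        k = np.argmax(np.abs(inner))        # most active atom (largest |inner product|)
        coeffs[k] += inner[k]
        residual -= inner[k] * D[:, k]      # subtract the matched component
    return coeffs, x - residual

# Toy Gabor-like dictionary: Gaussian-windowed cosines at log-spaced center frequencies.
N, fs = 256, 8000.0
t = np.arange(N) / fs
freqs = np.logspace(np.log10(100), np.log10(3400), 64)
D = np.stack([np.exp(-0.5 * ((t - t.mean()) / 0.004) ** 2) * np.cos(2 * np.pi * f * t)
              for f in freqs], axis=1)
D /= np.linalg.norm(D, axis=0)
noisy = D[:, 10] * 3.0 + 0.5 * np.random.randn(N)   # "speech" atom buried in noise
coeffs, recon = matching_pursuit(noisy, D, n_atoms=5)
# A simple VAD decision could threshold the energy captured by the top atoms:
print("captured energy fraction:", np.linalg.norm(recon) ** 2 / np.linalg.norm(noisy) ** 2)
```

Thresholding the fraction of frame energy captured by the few most active atoms then yields a simple speech/non-speech decision per frame.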
Procedia PDF Downloads 289
104 Forecasting of COVID-19 Cases, Hospitalization Admissions, and Death Cases Based on Wastewater SARS-CoV-2 Surveillance Using Copula Time Series Model
Authors: Hueiwang Anna Jeng, Norou Diawara, Nancy Welch, Cynthia Jackson, Rekha Singh, Kyle Curtis, Raul Gonzalez, David Jurgens, Sasanka Adikari
Abstract:
Modeling effort is needed to predict COVID-19 trends for developing management strategies and adaptation measures. The objective of this study was to assess whether the SARS-CoV-2 viral load in wastewater could serve as a predictor for forecasting COVID-19 cases, hospitalization cases, and death cases using copula-based time series modeling. The SARS-CoV-2 RNA load in raw wastewater in Chesapeake, VA was measured using the RT-qPCR method. A Gaussian copula time series marginal regression model, incorporating an autoregressive moving average model and the copula function, served as the forecasting model. COVID-19 cases were correlated with wastewater viral load, hospitalization cases, and death cases. The forecasted trend of COVID-19 cases closely paralleled that of the reported cases, with over 90% of the forecasted COVID-19 cases falling within the 99% confidence interval of the reported cases. Wastewater SARS-CoV-2 viral load could serve as a predictor for COVID-19 cases and hospitalization cases.
Keywords: COVID-19, modeling, time series, copula function
Procedia PDF Downloads 66
103 Non-Parametric Changepoint Approximation for Road Devices
Authors: Loïc Warscotte, Jehan Boreux
Abstract:
The scientific literature on changepoint detection is vast. Today, many methods are available to detect abrupt changes or slight drift in a signal, based on CUSUM or EWMA charts, for example. However, these methods rely on strong assumptions, such as the stationarity of the underlying stochastic process, or even independent, Gaussian-distributed noise at each time step. Recently, breakthrough research on locally stationary processes has widened the class of studied stochastic processes, with almost no assumptions on the signals and the nature of the changepoint. Despite the accurate description of the mathematical aspects, this methodology quickly suffers from impractical time and space complexity for signals with high-rate data collection if the characteristics of the process are completely unknown. In this paper, we address the problem of making this theory usable for our purpose, which is monitoring a high-speed weigh-in-motion (HS-WIM) system towards direct enforcement without supervision. To this end, we first compute bounded approximations of the initial detection theory. Secondly, these approximating bounds are empirically validated by generating many independent long-run stochastic processes. Both abrupt changes and drift are tested. Finally, this relaxed methodology is tested on real signals coming from a HS-WIM device in Belgium, collected over several months.
Keywords: changepoint, weigh-in-motion, process, non-parametric
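As a point of reference for the classical charts the abstract mentions, here is a minimal two-sided CUSUM detector; the target mean, allowance k, and threshold h below are illustrative assumptions, and a real deployment would calibrate them to the WIM signal's noise level.

```python
import numpy as np

def cusum(signal, target, k=0.5, h=5.0):
    """Two-sided CUSUM chart: returns the first index where either the
    upper or lower cumulative sum exceeds the decision threshold h.
    k is the allowance (slack); k and h are in units of the signal's std dev."""
    s_hi = s_lo = 0.0
    for i, x in enumerate(signal):
        s_hi = max(0.0, s_hi + (x - target) - k)
        s_lo = max(0.0, s_lo + (target - x) - k)
        if s_hi > h or s_lo > h:
            return i
    return None

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(1.0, 1, 500)])  # mean shift at t=500
print("alarm at index:", cusum(x, target=0.0))
```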
Procedia PDF Downloads 78
102 Medical Image Augmentation Using Spatial Transformations for Convolutional Neural Network
Authors: Trupti Chavan, Ramachandra Guda, Kameshwar Rao
Abstract:
The lack of data is a pressing problem in medical image analysis using a convolutional neural network (CNN). This work uses various spatial transformation techniques to address the medical image augmentation issue for knee detection and localization using an enhanced single shot detector (SSD) network. Spatial transforms like negatives, histogram equalization, power law, sharpening, averaging, Gaussian blurring, etc., help to generate more samples, serve as pre-processing methods, and highlight the features of interest. The experimentation is done on the OpenKnee dataset, a collection of knee images from openly available online sources. The CNN called enhanced single shot detector (SSD) is utilized for the detection and localization of the knee joint from a given X-ray image. It is an enhanced version of the well-known SSD network, modified in such a way that it reduces the number of prediction boxes at the output side. It consists of a classification network (VGGNet) and an auxiliary detection network. The performance is measured in mean average precision (mAP), and 99.96% mAP is achieved using the proposed enhanced SSD with spatial transformations. It is also seen that the localization boundary is comparatively more refined and closer to the ground truth with spatial augmentation, giving better detection and localization of knee joints.
Keywords: data augmentation, enhanced SSD, knee detection and localization, medical image analysis, OpenKnee, spatial transformations
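The listed transforms are each one-liners on a grayscale image; the sketch below shows plausible OpenCV/NumPy versions. The file name and the gamma, kernel, and sigma values are assumptions, not the paper's settings.

```python
import numpy as np
import cv2

def spatial_augmentations(img):
    """Generate augmented variants of a grayscale X-ray image (uint8)."""
    return {
        "negative":  255 - img,
        "hist_eq":   cv2.equalizeHist(img),
        # power-law (gamma) transform; gamma < 1 brightens dark regions
        "power_law": np.uint8(255 * (img / 255.0) ** 0.6),
        "sharpen":   cv2.filter2D(img, -1, np.array([[0, -1, 0],
                                                     [-1, 5, -1],
                                                     [0, -1, 0]], dtype=float)),
        "average":   cv2.blur(img, (5, 5)),
        "gaussian":  cv2.GaussianBlur(img, (5, 5), sigmaX=1.5),
    }

img = cv2.imread("knee_xray.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file name
augmented = spatial_augmentations(img)
```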
Procedia PDF Downloads 152
101 Comparisons of Co-Seismic Gravity Changes between GRACE Observations and the Predictions from the Finite-Fault Models for the 2012 Mw = 8.6 Indian Ocean Earthquake Off-Sumatra
Authors: Armin Rahimi
Abstract:
The Gravity Recovery and Climate Experiment (GRACE) has been a very successful project in determining mass redistribution within the Earth system. Large deformations caused by earthquakes are in the high frequency band. Unfortunately, GRACE is only capable of providing reliable estimates of gravitational changes in the low-to-medium frequency band. In this study, we computed the gravity changes after the 2012 Mw 8.6 Indian Ocean earthquake off-Sumatra using the GRACE Level-2 monthly spherical harmonic (SH) solutions released by the University of Texas Center for Space Research (UTCSR). Moreover, we calculated gravity changes using different fault models derived from teleseismic data. The model predictions showed non-negligible discrepancies in gravity changes. However, after removing high-frequency signals using Gaussian filtering with a 350 km radius, commensurate with the GRACE spatial resolution, the discrepancies vanished: the spatial patterns of total gravity changes predicted from all slip models became similar at the spatial resolution attainable by GRACE observations, and the predicted gravity changes were consistent with the GRACE-detected gravity changes. Nevertheless, the fault models, which give different slip amplitudes, proportionally lead to different amplitudes in the predicted gravity changes.
Keywords: undersea earthquake, GRACE observation, gravity change, dislocation model, slip distribution
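Gaussian smoothing of Level-2 SH solutions is conventionally done with degree-dependent weights of Jekeli's averaging kernel (Wahr et al., 1998). The recursion below is a sketch under that assumption; it is not taken from this paper, and the clipping guard is a pragmatic choice against round-off instability at high degrees.

```python
import numpy as np

def gaussian_sh_weights(radius_km, l_max, earth_radius_km=6371.0):
    """Degree-dependent weights W_l of a Gaussian averaging kernel,
    applied multiplicatively to spherical-harmonic coefficients of degree l."""
    b = np.log(2.0) / (1.0 - np.cos(radius_km / earth_radius_km))
    W = np.zeros(l_max + 1)
    W[0] = 1.0
    W[1] = (1.0 + np.exp(-2.0 * b)) / (1.0 - np.exp(-2.0 * b)) - 1.0 / b
    for l in range(1, l_max):
        W[l + 1] = -(2.0 * l + 1.0) / b * W[l] + W[l - 1]  # three-term recursion
    return np.clip(W, 0.0, None)  # guard against round-off at high degree

W = gaussian_sh_weights(350.0, l_max=60)
# smoothed_clm = W[l] * clm  for every coefficient of degree l
```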
Procedia PDF Downloads 354
100 Bayesian Variable Selection in Quantile Regression with Application to the Health and Retirement Study
Authors: Priya Kedia, Kiranmoy Das
Abstract:
There is a rich literature on variable selection in the regression setting. However, most of these methods assume normality of the response variable for implementing the methodology and establishing the statistical properties of the estimates. In many real applications, the distribution of the response variable may be non-Gaussian, and one might be interested in finding the best subset of covariates at some predetermined quantile level. We develop a dynamic Bayesian approach for variable selection in the quantile regression framework. We use a zero-inflated mixture prior for the regression coefficients and consider the asymmetric Laplace distribution for the response variable for modeling different quantiles of its distribution. An efficient Gibbs sampler is developed for our computation. Our proposed approach is assessed through extensive simulation studies, and a real application of the proposed approach is also illustrated. We consider the data from the Health and Retirement Study conducted by the University of Michigan and select the important predictors when the outcome of interest is out-of-pocket medical cost, which is considered an important measure of financial risk. Our analysis finds important predictors at different quantiles of the outcome and thus enhances our understanding of the effects of different predictors on out-of-pocket medical cost.
Keywords: variable selection, quantile regression, Gibbs sampler, asymmetric Laplace distribution
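The asymmetric Laplace working likelihood is what ties the Bayesian computation to quantile regression: maximizing it in the location parameter is equivalent to minimizing the usual check loss at quantile tau. A small sketch of that equivalence (the numbers are toy values, not the study's data):

```python
import numpy as np

def check_loss(u, tau):
    """Quantile (check) loss: rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def ald_logpdf(y, mu, sigma, tau):
    """Log-density of the asymmetric Laplace distribution; maximizing it
    in mu is equivalent to minimizing the check loss at quantile tau."""
    return (np.log(tau * (1 - tau)) - np.log(sigma)
            - check_loss(y - mu, tau) / sigma)

# Working likelihood for the 0.9 quantile of out-of-pocket cost (toy numbers):
y = np.array([120.0, 300.0, 80.0, 950.0])
print(ald_logpdf(y, mu=250.0, sigma=100.0, tau=0.9).sum())
```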
Procedia PDF Downloads 155
99 Thermal Radiation Effect on Mixed Convection Boundary Layer Flow over a Vertical Plate with Varying Density and Volumetric Expansion Coefficient
Authors: Sadia Siddiqa, Z. Khan, M. A. Hossain
Abstract:
In this article, the effect of thermal radiation on mixed convection boundary layer flow of a viscous fluid along a highly heated vertical flat plate is considered with varying density and volumetric expansion coefficient. The density of the fluid is assumed to vary exponentially with temperature, while the volumetric expansion coefficient depends linearly on temperature. The boundary layer equations are transformed into convenient form by introducing primitive variable formulations. Solutions of the transformed system of equations are obtained numerically through an implicit finite difference method along with the Gaussian elimination technique. Results are discussed in view of various parameters, like the thermal radiation parameter, volumetric expansion parameter, and density variation parameter, on the wall shear stress and heat transfer rate. It is concluded from the present investigation that an increase in the volumetric expansion parameter decreases wall shear stress and enhances the heat transfer rate.
Keywords: thermal radiation, mixed convection, variable density, variable volumetric expansion coefficient
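Implicit finite-difference discretizations of boundary-layer equations reduce, at each marching step, to tridiagonal linear systems, for which Gaussian elimination specializes to the Thomas algorithm. A generic sketch follows; the implicit diffusion example at the bottom is illustrative, not this paper's system.

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system (Gaussian elimination without pivoting):
    a = sub-diagonal (n-1), b = diagonal (n), c = super-diagonal (n-1), d = rhs."""
    n = len(b)
    cp, dp = np.zeros(n - 1), np.zeros(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / m
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):   # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Implicit 1D diffusion step: tridiag(-r, 1+2r, -r) T_new = T_old
n, r = 5, 0.5
T_old = np.linspace(1.0, 0.0, n)
print(thomas(np.full(n - 1, -r), np.full(n, 1 + 2 * r), np.full(n - 1, -r), T_old))
```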
Procedia PDF Downloads 367
98 Distribution of Maximum Loss of Fractional Brownian Motion with Drift
Authors: Ceren Vardar Acar, Mine Caglar
Abstract:
In finance, the price of a volatile asset can be modeled using fractional Brownian motion (fBm) with Hurst parameter H > 1/2. The Black-Scholes model for the values of returns of an asset using fBm is given as Y_t = Y_0 exp((r + μ)t + σB_t^H), 0 ≤ t ≤ T, where Y_0 is the initial value, r is the constant interest rate, μ is the constant drift, and σ is the constant diffusion coefficient of the fBm, which is denoted by B_t^H, t ≥ 0. The Black-Scholes model can be constructed with some Markov processes, such as Brownian motion. The advantage of modeling with fBm over Markov processes is its capability of exposing the dependence between returns. Real-life data for a volatile asset display the long-range dependence property. For this reason, using fBm is a more realistic model compared to Markov processes. Investors would be interested in any kind of information on the risk in order to manage it or hedge it. The maximum possible loss is one way to measure the highest possible risk; therefore, it is an important variable for investors. In our study, we give some theoretical bounds on the distribution of the maximum possible loss of fBm. We provide both asymptotical and strong estimates for the tail probability of the maximum loss of standard fBm and of fBm with drift and diffusion coefficients. From the investment point of view, these results explain how large values of possible loss behave and give their bounds.
Keywords: maximum drawdown, maximum loss, fractional brownian motion, large deviation, Gaussian process
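The maximum loss (drawdown) of such a path is straightforward to estimate by simulation. The sketch below generates exact fBm samples from the Cholesky factor of the fBm covariance Cov(B_s^H, B_t^H) = (s^{2H} + t^{2H} - |t - s|^{2H})/2 and is illustrative only; the H, drift, and volatility values are assumptions, not the paper's bounds.

```python
import numpy as np

def fbm_cholesky(n, H, T=1.0, rng=None):
    """Exact (O(n^3)) simulation of fractional Brownian motion on [0, T]
    via Cholesky factorization of its covariance matrix."""
    rng = rng or np.random.default_rng()
    t = np.linspace(T / n, T, n)
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s ** (2 * H) + u ** (2 * H) - np.abs(s - u) ** (2 * H))
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))
    return np.concatenate([[0.0], L @ rng.standard_normal(n)])

def max_drawdown(path):
    """Maximum loss: largest drop from a running maximum."""
    running_max = np.maximum.accumulate(path)
    return np.max(running_max - path)

H, n = 0.7, 500
B = fbm_cholesky(n, H, rng=np.random.default_rng(1))
Y = 100 * np.exp(0.05 * np.linspace(0, 1, n + 1) + 0.2 * B)  # fBm Black-Scholes path
print("max drawdown:", max_drawdown(Y))
```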
Procedia PDF Downloads 482
97 Spatiotemporal Analysis of Land Surface Temperature and Urban Heat Island Evaluation of Four Metropolitan Areas of Texas, USA
Authors: Chunhong Zhao
Abstract:
Remotely sensed land surface temperature (LST) is vital to understanding the land-atmosphere energy balance and the hydrological cycle, and it is thus widely used to describe the urban heat island (UHI) phenomenon. However, due to technical constraints, satellite thermal sensors are unable to provide LST measurements with both high spatial and high temporal resolution, despite the various downscaling techniques and algorithms developed to generate high spatiotemporal resolution LST. Four major metropolitan areas in Texas, USA: Dallas-Fort Worth, Houston, San Antonio, and Austin, all demonstrate UHI effects. Different cities are expected to have varying surface UHI (SUHI) effects along their urban development trajectories. With the help of the Landsat, ASTER, and MODIS archives, this study focuses on the spatial patterns of UHIs and the seasonal and annual variation of these metropolitan areas. With a Gaussian model and Local Indicators of Spatial Association (LISA), as well as data fusion methods, this study identifies the hotspots and the trajectory of the UHI phenomenon in the four cities. The comparative analysis can help to alleviate the adverse effects of UHI and formulate rational urban planning in the long run.
Keywords: spatiotemporal analysis, land surface temperature, urban heat island evaluation, metropolitan areas of Texas, USA
Procedia PDF Downloads 416
96 Superficial Metrology of Organometallic Chemical Vapour Deposited Undoped ZnO Thin Films on Stainless Steel and Soda-Lime Glass Substrates
Authors: Uchenna Sydney Mbamara, Bolu Olofinjana, Ezekiel Oladele B. Ajayi
Abstract:
Elaborate surface metrology of undoped ZnO thin films, deposited by the organometallic chemical vapour deposition (OMCVD) technique at different precursor flow rates, was carried out. A dicarbomethyl-zinc precursor was used. The films were deposited on AISI 304L steel and soda-lime glass substrates. Ultraviolet-visible-near-infrared (UV-Vis-NIR) spectroscopy showed that all the thin films were over 80% transparent, with an average bandgap of 3.39 eV. X-ray diffraction (XRD) results showed that the thin films were crystalline with a hexagonal structure, while Rutherford backscattering spectroscopy (RBS) results identified the elements present in each thin film as zinc and oxygen in the ratio of 1:1. Microscope and contactless profilometer results gave images with characteristic colours. The profilometer also gave the surface roughness data in both 2D and 3D. The asperity distribution of the thin film surfaces was Gaussian, while the average fractal dimension Da satisfied 2.5 ≤ Da. The metrology proved the surfaces good for 'touch electronics' and for coating mechanical parts for low friction.
Keywords: undoped ZnO, precursor flow rate, OMCVD, thin films, surface texture, tribology
Procedia PDF Downloads 60
95 An Improved Data Aided Channel Estimation Technique Using Genetic Algorithm for Massive Multi-Input Multiple-Output
Authors: M. Kislu Noman, Syed Mohammed Shamsul Islam, Shahriar Hassan, Raihana Pervin
Abstract:
With the increasing number of wireless devices and high-bandwidth operations, wireless networking and communications are becoming overcrowded. To cope with such a crowded and congested situation, massive MIMO is designed to work with hundreds of low-cost serving antennas at a time while also improving spectral efficiency. TDD is used to enable beamforming, a major part of massive MIMO, and to transmit and receive pilot sequences to best effect. All of these benefits are only possible if the channel state information, i.e., the channel estimate, is obtained properly. The common methods used so far to estimate the channel matrix are LS, MMSE, and a linear version of MMSE, also proposed in many research works. We have optimized these methods using a genetic algorithm to minimize the mean squared error and find the best channel matrix among the existing algorithms with less computational complexity. Our simulation results show that the GA performed well on the existing algorithms in a Rayleigh slow-fading channel in the presence of additive white Gaussian noise. We found that the GA-optimized LS is better than the existing algorithms, as the GA provides an optimal result within a few iterations in terms of MSE with respect to SNR and computational complexity.
Keywords: channel estimation, LMMSE, LS, MIMO, MMSE
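For reference, the baseline LS estimate that a GA would refine has a closed form from the known pilot block: H_LS = Y Xᴴ (X Xᴴ)⁻¹. The sketch below sets up a toy Rayleigh flat-fading channel with AWGN and computes that estimate; the dimensions and SNR are assumptions, not the paper's simulation settings.

```python
import numpy as np

rng = np.random.default_rng(0)
M, K, P = 8, 4, 16          # receive antennas, users, pilot length
H = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)
X = (rng.standard_normal((K, P)) + 1j * rng.standard_normal((K, P))) / np.sqrt(2)  # pilots
snr_db = 10.0
noise_var = 10 ** (-snr_db / 10)
N = np.sqrt(noise_var / 2) * (rng.standard_normal((M, P)) + 1j * rng.standard_normal((M, P)))
Y = H @ X + N               # Rayleigh flat-fading channel with AWGN

# Least-squares channel estimate: H_ls = Y X^H (X X^H)^{-1}
H_ls = Y @ X.conj().T @ np.linalg.inv(X @ X.conj().T)
print("LS estimation MSE:", np.mean(np.abs(H_ls - H) ** 2))
# A GA would treat candidate channel matrices as chromosomes and use this
# MSE (or a pilot-reconstruction error) as the fitness to minimize.
```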
Procedia PDF Downloads 190
94 Predicting Relative Performance of Sector Exchange Traded Funds Using Machine Learning
Abstract:
Machine learning has been used in many areas today. It thrives at reviewing large volumes of data and identifying patterns and trends that might not be apparent to a human. Given the huge potential benefit and the amount of data available in the financial market, it is not surprising to see machine learning applied to various financial products. While future prices of financial securities are extremely difficult to forecast, we study them from a different angle. Instead of trying to forecast future prices, we apply machine learning algorithms to predict the direction of future price movement, in particular, whether a sector Exchange Traded Fund (ETF) would outperform or underperform the market in the next week or in the next month. We apply several machine learning algorithms for this prediction. The algorithms are Linear Discriminant Analysis (LDA), k-Nearest Neighbors (KNN), Decision Tree (DT), Gaussian Naive Bayes (GNB), and Neural Networks (NN). We show that these machine learning algorithms, most notably GNB and NN, have some predictive power in forecasting out-performance and under-performance out of sample. We also try to explore whether it is possible to utilize the predictions from these algorithms to outperform the buy-and-hold strategy of the S&P 500 index. The trading strategy to explore out-performance predictions does not perform very well, but the trading strategy to explore under-performance predictions can earn higher returns than simply holding the S&P 500 index out of sample.
Keywords: machine learning, ETF prediction, dynamic trading, asset allocation
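All five classifiers are available off the shelf in scikit-learn, so the whole comparison fits in a few lines; the synthetic features and chronological split below are stand-ins for the paper's actual ETF feature set.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 12))           # e.g. lagged returns / technical features
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.standard_normal(1000)) > 0  # outperforms market?
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, shuffle=False)  # time order

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "KNN": KNeighborsClassifier(n_neighbors=15),
    "DT":  DecisionTreeClassifier(max_depth=5),
    "GNB": GaussianNB(),
    "NN":  MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
}
for name, model in models.items():
    print(name, model.fit(X_tr, y_tr).score(X_te, y_te))  # out-of-sample accuracy
```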
Procedia PDF Downloads 97
93 Image Segmentation Using Active Contours Based on Anisotropic Diffusion
Authors: Shafiullah Soomro
Abstract:
Active contours are one of the image segmentation techniques whose goal is to capture the required object boundaries within an image. In this paper, we propose a novel image segmentation method using an active contour method based on an anisotropic diffusion feature enhancement technique. Traditional active contour methods use only pixel information to perform segmentation, which produces inaccurate results when an image has noise or a complex background. We use the Perona and Malik diffusion scheme for feature enhancement, which sharpens the object boundaries and blurs the background variations. Our main contribution is the formulation of a new SPF (signed pressure force) function, which uses global intensity information across the regions. By minimizing an energy function in a partial differential equation framework, the proposed method captures semantically meaningful boundaries instead of uninteresting regions. Finally, we use a Gaussian kernel, which eliminates the problem of reinitialization of the level set function. We use several synthetic and real images from different modalities to validate the performance of the proposed method. In the experimental section, we found that the proposed method performs better both qualitatively and quantitatively and yields results with higher accuracy compared to other state-of-the-art methods.
Keywords: active contours, anisotropic diffusion, level-set, partial differential equations
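The Perona-Malik pre-processing step is simple to state: diffuse with an edge-stopping conductance so that boundaries stay sharp while noise flattens. A minimal sketch follows; the iteration count, kappa, and the exponential conductance are common defaults, not necessarily the paper's settings (and np.roll implies periodic boundaries, acceptable for a demo).

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=15.0, lam=0.2):
    """Anisotropic diffusion (Perona & Malik): smooths homogeneous regions
    while preserving edges, using conductance g = exp(-(|grad|/kappa)^2)."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # forward differences to the four neighbors
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u += lam * (np.exp(-(dn / kappa) ** 2) * dn +
                    np.exp(-(ds / kappa) ** 2) * ds +
                    np.exp(-(de / kappa) ** 2) * de +
                    np.exp(-(dw / kappa) ** 2) * dw)
    return u

noisy = np.zeros((64, 64)); noisy[16:48, 16:48] = 100.0
noisy += 10 * np.random.default_rng(0).standard_normal((64, 64))
enhanced = perona_malik(noisy)   # input for the SPF-driven contour evolution
```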
Procedia PDF Downloads 157
92 Recognizing an Individual, Their Topic of Conversation and Cultural Background from 3D Body Movement
Authors: Gheida J. Shahrour, Martin J. Russell
Abstract:
The 3D body movement signals captured during human-human conversation include clues not only to the content of people's communication but also to their culture and personality. This paper is concerned with the automatic extraction of this information from body movement signals. For the purpose of this research, we collected a novel corpus from 27 subjects and arranged them into groups according to their culture. We arranged each group into pairs, and each pair communicated with each other about different topics. A state-of-the-art recognition system is applied to the problems of person, culture, and topic recognition. We borrowed modeling, classification, and normalization techniques from speech recognition. We used Gaussian Mixture Modeling (GMM) as the main technique for building our three systems, obtaining 77.78%, 55.47%, and 39.06% accuracy from the person, culture, and topic recognition systems, respectively. In addition, we combined the above GMM systems with Support Vector Machines (SVM) to obtain 85.42%, 62.50%, and 40.63% accuracy for person, culture, and topic recognition, respectively. Although direct comparison among these three recognition systems is difficult, it seems that our person recognition system performs best for both GMM and GMM-SVM, suggesting that inter-subject differences (i.e., subjects' personality traits) are a major source of variation. When removing these traits from the culture and topic recognition systems using the Nuisance Attribute Projection (NAP) and Intersession Variability Compensation (ISVC) techniques, we obtained 73.44% and 46.09% accuracy from the culture and topic recognition systems, respectively.
Keywords: person recognition, topic recognition, culture recognition, 3D body movement signals, variability compensation
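In the GMM recipe borrowed from speech recognition, one mixture model is trained per class and a test clip is assigned to the class whose model gives the highest average log-likelihood over its frames. A minimal sketch with synthetic stand-ins for the body-movement feature vectors (the component count and feature dimension are assumptions):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Toy stand-in for per-frame body-movement feature vectors of two subjects
train = {"subj_A": rng.normal(0.0, 1.0, (500, 6)),
         "subj_B": rng.normal(0.8, 1.2, (500, 6))}

# One GMM per class; classification by maximum average log-likelihood
gmms = {label: GaussianMixture(n_components=8, covariance_type="diag",
                               random_state=0).fit(feats)
        for label, feats in train.items()}

def classify(frames):
    scores = {label: g.score(frames) for label, g in gmms.items()}  # avg log-likelihood
    return max(scores, key=scores.get)

test_clip = rng.normal(0.8, 1.2, (200, 6))   # unseen clip drawn like subject B
print(classify(test_clip))                    # expected: "subj_B"
```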
Procedia PDF Downloads 539
91 Design and Implementation of a Counting and Differentiation System for Vehicles through Video Processing
Authors: Derlis Gregor, Kevin Cikel, Mario Arzamendia, Raúl Gregor
Abstract:
This paper presents a self-sustaining mobile system for counting and classifying vehicles through video processing. It proposes a counting and classification algorithm divided into four steps that can be executed multiple times in parallel on an SBC (Single Board Computer), like the Raspberry Pi 2, in such a way that it can be implemented in real time. The first step of the proposed algorithm limits the zone of the image that will be processed. The second step performs the detection of the mobile objects using a BGS (Background Subtraction) algorithm based on the GMM (Gaussian Mixture Model), as well as a shadow removal algorithm using physical-based features, followed by morphological operations. In the third step, vehicle detection is performed using edge detection algorithms, and vehicle tracking through Kalman filters. The last step of the proposed algorithm registers the vehicle passing and performs their classification according to their areas. A self-sustaining system is proposed, powered by batteries and photovoltaic solar panels, with data transmission done through GPRS (General Packet Radio Service), eliminating the need for external cables, which facilitates its deployment and relocation to any location where it could operate. The self-sustaining trailer will allow the counting and classification of vehicles in specific zones with difficult access.
Keywords: intelligent transportation system, object detection, vehicle counting, vehicle classification, video processing
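OpenCV ships a GMM-based background subtractor (MOG2) with built-in shadow marking, so the detection step of such a pipeline can be prototyped in a few lines; the video file name, thresholds, and area cutoff below are assumptions, not the paper's values.

```python
import cv2

cap = cv2.VideoCapture("traffic.mp4")          # hypothetical input video
# GMM-based background subtractor; detectShadows marks shadow pixels with value 127
bgs = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16, detectShadows=True)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = bgs.apply(frame)
    mask[mask == 127] = 0                      # drop pixels labeled as shadow
    # morphological opening/closing to clean the foreground blobs
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4
    areas = [cv2.contourArea(c) for c in contours if cv2.contourArea(c) > 400]
    # classification by area thresholds would assign car/bus/truck per blob
cap.release()
```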
Procedia PDF Downloads 319
90 Conventional and Computational Investigation of the Synthesized Organotin(IV) Complexes Derived from o-Vanillin and 3-Nitro-o-Phenylenediamine
Authors: Harminder Kaur, Manpreet Kaur, Akanksha Kapila, Reenu
Abstract:
A Schiff base with the general formula H₂L was derived from the condensation of o-vanillin and 3-nitro-o-phenylenediamine. This Schiff base was used for the synthesis of organotin(IV) complexes with the general formula R₂SnL [R = phenyl or n-octyl] using equimolar quantities. Elemental analysis, UV-Vis, FTIR, and multinuclear NMR spectroscopy (¹H, ¹³C, and ¹¹⁹Sn) were carried out for the characterization of the synthesized complexes. These complexes were coloured and soluble in polar solvents. Computational studies have been performed to obtain the details of the geometry and electronic structures of the ligand as well as the complexes. The geometries of the ligands and complexes have been optimized at the level of Density Functional Theory with B3LYP/6-311G(d,p) and B3LYP/MPW1PW91, respectively, followed by vibrational frequency analysis using Gaussian 09. The observed ¹¹⁹Sn NMR chemical shifts of one of the synthesized complexes showed tetrahedral geometry around the tin atom, which is also confirmed by DFT. The HOMO-LUMO energy distribution was calculated. FTIR, ¹H NMR, and ¹³C NMR spectra were also obtained theoretically using DFT. Further, IRC calculations were employed to determine the transition state of the reaction and to obtain theoretical information about the reaction pathway. Moreover, molecular docking studies can be explored to assess the anticancer activity of the newly synthesized organotin(IV) complexes.
Keywords: DFT, molecular docking, organotin(IV) complexes, o-vanillin, 3-nitro-o-phenylenediamine
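An optimization-plus-frequency job of the kind described is driven by a short Gaussian 09 input deck; the sketch below generates a minimal one. The route section reflects the stated B3LYP/6-311G(d,p) level, but the Link 0 resource lines, file names, and placeholder geometry are illustrative assumptions, not the authors' actual files.

```python
# Write a minimal Gaussian 09 input for a B3LYP/6-311G(d,p) optimization +
# frequency job; the molecule specification below is a placeholder only.
route = "#P B3LYP/6-311G(d,p) Opt Freq"
with open("schiff_base.gjf", "w") as f:
    f.write("%nprocshared=4\n%mem=4GB\n")
    f.write(route + "\n\n")
    f.write("H2L Schiff base optimization (hypothetical example)\n\n")
    f.write("0 1\n")                                  # charge and spin multiplicity
    f.write("O   0.000000   0.000000   0.000000\n")   # ... full geometry goes here
    f.write("\n")                                     # blank line terminates the deck
```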
Procedia PDF Downloads 159
89 Modeling and Simulation of Organic Solar Cells Based on P3HT:PCBM Using SCAPS 1-D (Influence of Defects and Temperature on the Performance of the Solar Cell)
Authors: Souhila Boukli Hacene, Djamila Kherbouche, Abdelhak Chikhaoui
Abstract:
In this work, we theoretically elucidate the effect of defects and temperature on the performance of the organic bulk heterojunction (BHJ) solar cell P3HT:PCBM. We have studied the influence of these parameters on the cell characteristics. For this purpose, we used the effective medium model and the solar cell simulator SCAPS to model the characteristics of the solar cell. We also explore the transport of charge carriers in the device. It was assumed that the mixture is lightly p-type doped and that the band gap contains acceptor defects near the HOMO level, with a Gaussian distribution of energy states at 100 and 50 meV. We varied the defect density between 10¹² and 10¹⁷ cm⁻³; from 10¹⁶ cm⁻³ onward, a total decrease of the photovoltaic characteristics due to the increase of non-radiative recombination can be noticed. We then studied the effect of varying the electron and hole capture cross-sections on the cell's performance and noticed that the cell achieves a better efficiency of about 3.6% for an electron capture cross-section ≤ 10⁻¹⁵ cm² and a hole capture cross-section ≤ 10⁻¹⁹ cm². On the other hand, we also varied the temperature between 120 K and 400 K. We observed that the temperature of the solar cell induces a noticeable effect on its voltage, while the effect of temperature on the solar cell current is negligible.
Keywords: organic solar cell, P3HT:PCBM, defects, temperature, SCAPS
Procedia PDF Downloads 89
88 Dynamic Distribution Calibration for Improved Few-Shot Image Classification
Authors: Majid Habib Khan, Jinwei Zhao, Xinhong Hei, Liu Jiedong, Rana Shahzad Noor, Muhammad Imran
Abstract:
Deep learning is increasingly employed in image classification, yet the scarcity and high cost of labeled data for training remain a challenge. Limited samples often lead to overfitting due to biased sample distribution. This paper introduces a dynamic distribution calibration method for few-shot learning. Initially, base and new class samples undergo normalization to mitigate disparate feature magnitudes. A pre-trained model then extracts feature vectors from both classes. The method dynamically selects distribution characteristics from base classes (both adjacent and remote) in the embedding space, using a threshold value approach for new class samples. Given the propensity of similar classes to share feature distributions like mean and variance, this research assumes a Gaussian distribution for feature vectors. Subsequently, distributional features of new class samples are calibrated using a corrected hyperparameter, derived from the distribution features of both adjacent and distant base classes. This calibration augments the new class sample set. The technique demonstrates significant improvements, with up to 4% accuracy gains in few-shot classification challenges, as evidenced by tests on miniImagenet and CUB datasets.
Keywords: deep learning, computer vision, image classification, few-shot learning, threshold
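Under the stated Gaussian assumption, calibration boils down to borrowing mean and covariance statistics from nearby base classes and then sampling synthetic features for the new class. The sketch below follows that recipe in a simplified, static form (nearest-k selection instead of the paper's dynamic threshold; all sizes and the alpha correction are illustrative assumptions).

```python
import numpy as np

def calibrate(support, base_means, base_covs, k=2, alpha=0.2):
    """Calibrate a new class's Gaussian using the k nearest base classes
    (by mean distance), then sample extra features to augment the support set."""
    mu_s = support.mean(axis=0)
    d = np.linalg.norm(base_means - mu_s, axis=1)
    nearest = np.argsort(d)[:k]                       # adjacent base classes
    mu_c = (base_means[nearest].sum(0) + mu_s) / (k + 1)
    cov_c = base_covs[nearest].mean(0) + alpha * np.eye(support.shape[1])
    return np.random.default_rng(0).multivariate_normal(mu_c, cov_c, size=100)

dim, n_base = 16, 20
rng = np.random.default_rng(1)
base_means = rng.standard_normal((n_base, dim))
base_covs = np.stack([np.eye(dim)] * n_base)
support = rng.standard_normal((5, dim)) * 0.5          # 5-shot new class
augmented = calibrate(support, base_means, base_covs)  # train a simple classifier on this
```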
Procedia PDF Downloads 64
87 Theoretical Study of Structural Parameters, Chemical Reactivity and Spectral and Thermodynamical Properties of Organometallic Complexes Containing Zinc, Nickel and Cadmium with Nitrilotriacetic Acid and Tea Ligands: Density Functional Theory Investigation
Authors: Nour El Houda Bensiradj, Nafila Zouaghi, Taha Bensiradj
Abstract:
The pollution of water resources is characterized by the presence of microorganisms, chemicals, or industrial waste. Generally, this waste generates effluents containing large quantities of heavy metals, making the water unsuitable for consumption and causing the death of aquatic life and associated biodiversity. Currently, it is very important to assess the impact of heavy metals in water pollution as well as the processes for treating and reducing them. Among the methods of water treatment and disinfection, we mention the complexation of metal ions using ligands, which serve to precipitate and subsequently eliminate these ions. In this context, we are interested in the study of complexes containing heavy metals such as zinc, nickel, and cadmium, which are present in several industrial discharges and are released into water sources. We use the triethanolamine (TEA) and nitrilotriacetic acid (NTA) ligands. The theoretical study is based on molecular modeling, using density functional theory (DFT) as implemented in the Gaussian 09 program. The geometric and energetic properties of the above complexes will be calculated. Spectral properties such as infrared, as well as reactivity descriptors and thermodynamic properties such as enthalpy and free enthalpy, will also be determined.
Keywords: heavy metals, NTA, TEA, DFT, IR, reactivity descriptors
Procedia PDF Downloads 99
86 Diagnosis and Analysis of Automated Liver and Tumor Segmentation on CT
Authors: R. R. Ramsheeja, R. Sreeraj
Abstract:
A wide range of medical imaging modalities is available nowadays for viewing the internal structures of the human body, such as the liver, brain, and kidneys. Computed Tomography is one of the most significant medical image modalities. In this paper, CT liver images are used to study automatic computer-aided techniques for calculating the volume of a liver tumor. A segmentation method for detecting the tumor from the CT scan is proposed. A Gaussian filter is used for denoising the liver image, and an adaptive thresholding algorithm is used for segmentation. A multiple Region of Interest (ROI) based method may help to characterize the different features, and it has a significant impact on classification performance. Due to the characteristics of liver tumor lesions, inherent difficulties appear in feature selection. For better performance, a novel system is introduced in which multiple ROI-based feature selection and classification are performed. Obtaining relevant features for the Support Vector Machine (SVM) classifier is important for better generalization performance. The proposed system helps to improve classification performance, with a significant reduction in the number of features used. The diagnosis of liver cancer from computed tomography images is very difficult in nature, and early detection of liver tumors is very helpful for saving human lives.
Keywords: computed tomography (CT), multiple region of interest (ROI), feature values, segmentation, SVM classification
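The denoise-then-threshold front end maps directly onto two OpenCV calls; the sketch below is a plausible reading of that pipeline, with the file name, kernel sizes, block size, and area cutoff all being assumptions rather than the paper's settings.

```python
import cv2

img = cv2.imread("liver_ct_slice.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
den = cv2.GaussianBlur(img, (5, 5), sigmaX=1.0)               # Gaussian denoising
# Adaptive thresholding: a per-neighborhood threshold copes with uneven CT intensity
mask = cv2.adaptiveThreshold(den, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                             cv2.THRESH_BINARY, blockSize=31, C=-5)
# Candidate lesion ROIs: connected components filtered by size
n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
rois = [stats[i] for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] > 100]
# Features extracted from each ROI would then feed the SVM classifier
```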
Procedia PDF Downloads 508
85 Brain Tumor Segmentation Based on Minimum Spanning Tree
Authors: Simeon Mayala, Ida Herdlevær, Jonas Bull Haugsøen, Shamundeeswari Anandan, Sonia Gavasso, Morten Brun
Abstract:
In this paper, we propose a minimum spanning tree-based method for segmenting brain tumors. The proposed method performs interactive segmentation based on the minimum spanning tree without tuning parameters. The steps involve preprocessing, making a graph, constructing a minimum spanning tree, and a newly implemented way of interactively segmenting the region of interest. In the preprocessing step, a Gaussian filter is applied to the 2D images to remove noise. Then, the pixel neighbor graph is weighted by intensity differences, and the corresponding minimum spanning tree is constructed. The image is loaded in an interactive window for segmenting the tumor. The region of interest and the background are selected by clicking to split the minimum spanning tree into two trees. One of these trees represents the region of interest, and the other represents the background. Finally, the segmentation given by the two trees is visualized. The proposed method was tested by segmenting two different 2D brain T1-weighted magnetic resonance image data sets. The comparison between our results and the gold standard segmentation confirmed the validity of the minimum spanning tree approach. The proposed method is simple to implement, and the results indicate that it is accurate and efficient.
Keywords: brain tumor, brain tumor segmentation, minimum spanning tree, segmentation, image processing
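The graph construction and MST step take only a few lines with SciPy's sparse routines; splitting into foreground/background trees then amounts to cutting one edge on the MST path between the two user clicks. A sketch of the graph/MST part (4-neighbor grid, intensity-difference weights; the small epsilon keeps zero-difference edges from being dropped by the sparse format):

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

def pixel_mst(img):
    """4-neighbor pixel graph weighted by intensity differences, and its MST."""
    h, w = img.shape
    idx = np.arange(h * w).reshape(h, w)
    f = img.astype(float)
    # horizontal edges (pixel -> right neighbor) and vertical edges (pixel -> below)
    r1 = np.concatenate([idx[:, :-1].ravel(), idx[:-1, :].ravel()])
    r2 = np.concatenate([idx[:, 1:].ravel(), idx[1:, :].ravel()])
    wts = np.concatenate([np.abs(f[:, :-1] - f[:, 1:]).ravel(),
                          np.abs(f[:-1, :] - f[1:, :]).ravel()]) + 1e-6
    g = coo_matrix((wts, (r1, r2)), shape=(h * w, h * w))
    return minimum_spanning_tree(g)

img = np.zeros((32, 32)); img[10:20, 10:20] = 150.0
mst = pixel_mst(img)
print("MST edges:", mst.nnz)   # h*w - 1 for a connected image
# Interactive splitting: remove the heaviest edge on the unique MST path
# between a click inside the tumor and a click in the background.
```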
Procedia PDF Downloads 119
84 Estimation of Thermal Conductivity of Nanofluids Using MD-Stochastic Simulation-Based Approach
Authors: Sujoy Das, M. M. Ghosh
Abstract:
The thermal conductivity of a fluid can be significantly enhanced by dispersing nano-sized particles in it, and the resultant fluid is termed a "nanofluid". A theoretical model for estimating the thermal conductivity of a nanofluid has been proposed here. It is based on the mechanism that evenly dispersed nanoparticles within a nanofluid undergo Brownian motion, in the course of which the nanoparticles repeatedly collide with the heat source. During each collision, a rapid heat transfer occurs owing to the solid-solid contact. Molecular dynamics (MD) simulation of the collision of nanoparticles with the heat source has shown that there is a pulse-like pick-up of heat by the nanoparticles within 20-100 ps, the extent of which depends not only on the thermal conductivity of the nanoparticles but also on the elastic and other physical properties of the nanoparticle. After the collision, the nanoparticles undergo Brownian motion in the base fluid and release the excess heat to the surrounding base fluid within 2-10 ms. The Brownian motion and associated temperature variation of the nanoparticles have been modeled by stochastic analysis. Repeated occurrence of these events by the suspended nanoparticles significantly contributes to the characteristic thermal conductivity of the nanofluids, which has been estimated by the present model for an ethylene glycol-based nanofluid containing Cu nanoparticles of size ranging from 8 to 20 nm, with a Gaussian size distribution. The prediction of the present model has shown reasonable agreement with the experimental data available in the literature.
Keywords: brownian dynamics, molecular dynamics, nanofluid, thermal conductivity
Procedia PDF Downloads 369
83 Electronic Structure Calculation of AsSiTeB/SiAsBTe Nanostructures Using Density Functional Theory
Authors: Ankit Kargeti, Ravikant Shrivastav, Tabish Rasheed
Abstract:
The electronic structure calculation for nanoclusters of the AsSiTeB/SiAsBTe quaternary semiconductor alloy belonging to the Group III-V elements was performed. The motivation for this research work was to obtain accurate electronic and geometric data for small nanoclusters of AsSiTeB/SiAsBTe in the gaseous form. The two clusters, one in the linear form and the other in the bent form, were studied under the framework of Density Functional Theory (DFT) using the B3LYP functional and the LANL2DZ basis set with the software package Gaussian 16. We have discussed the optimized energy, frontier orbital energy gap in terms of HOMO-LUMO, dipole moment, ionization potential, electron affinity, binding energy, embedding energy, and density of states (DoS) spectrum for both structures. The important findings for the predicted nanostructures are that they have wide band gap energies: the linear structure has a band gap energy (Eg) of 2.375 eV, and the bent structure one of 2.778 eV. Therefore, these structures can be utilized as wide band gap semiconductors. The structures also have high electron affinity values: 4.259 eV for the linear form and 3.387 eV for the bent form, showing that the electron acceptor capability is high for both forms. A widely known application of such compounds is in light-emitting diodes due to their wide band gap nature.
Keywords: density functional theory, DFT, nanostructures, HOMO-LUMO, density of states
Procedia PDF Downloads 113
82 A Posteriori Trading-Inspired Model-Free Time Series Segmentation
Authors: Plessen Mogens Graf
Abstract:
Within the context of multivariate time series segmentation, this paper proposes a method inspired by a posteriori optimal trading. After a normalization step, time series are treated channelwise as surrogate stock prices that can be traded optimally a posteriori in a virtual portfolio holding either stock or cash. Linear transaction costs are interpreted as hyperparameters for noise filtering. Trading signals, as well as trading signals obtained on the reversed time series, are used for unsupervised channelwise labeling before a consensus over all channels is reached that determines the final segmentation time instants. The method is model-free in that no model prescriptions for segments are made. Benefits of the proposed approach include simplicity, computational efficiency, and adaptability to a wide range of different shapes of time series. Performance is demonstrated on synthetic and real-world data, including a large-scale dataset comprising a multivariate time series of dimension 1000 and length 2709. The proposed method is compared to a popular model-based bottom-up approach fitting piecewise affine models and to a recent model-based top-down approach fitting Gaussian models, and it is found to be consistently faster while producing more intuitive results in the sense of segmenting time series at peaks and valleys.
Keywords: time series segmentation, model-free, trading-inspired, multivariate data
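The a posteriori optimal trade sequence for one channel is computable by a simple two-state dynamic program: intervals where the optimal portfolio holds stock label rising segments, cash intervals label falling ones, and the transaction cost acts as the noise filter. The sketch below is a simplified single-channel reading of that idea; the state-labeling rule is a heuristic, not the paper's exact signal extraction.

```python
import numpy as np

def optimal_trades(prices, cost=0.01):
    """A posteriori optimal trading with proportional transaction costs.
    Dynamic program over two states (cash / invested); per-sample state
    labels mark upward (invested) versus downward (cash) segments."""
    n = len(prices)
    cash = np.zeros(n)            # best wealth if in cash at time t
    shares = np.zeros(n)          # best share count if invested at time t
    cash[0], shares[0] = 1.0, (1.0 - cost) / prices[0]
    state = np.zeros(n, dtype=int)
    for t in range(1, n):
        cash[t] = max(cash[t - 1], shares[t - 1] * prices[t] * (1.0 - cost))   # hold or sell
        shares[t] = max(shares[t - 1], cash[t - 1] * (1.0 - cost) / prices[t]) # hold or buy
        state[t] = int(shares[t] * prices[t] > cash[t])   # 1 = invested
    return state                   # segmentation instants sit at state transitions

p = np.array([1.0, 1.1, 1.3, 1.2, 0.9, 0.8, 1.0, 1.4])
print(optimal_trades(p))           # -> [0 1 1 0 0 0 1 1]: up, down, up segments
```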
Procedia PDF Downloads 133
81 Patented Free-Space Optical System for Auto Aligned Optical Beam Allowing to Compensate Mechanical Misalignments
Authors: Aurelien Boutin
Abstract:
In optical systems such as variable optical delay lines, where a collimated beam has to go back and forth, corner cubes are used in order to keep the reflected beam parallel to the incoming beam. However, the reflected beam can be laterally shifted, which leads to losses. In this paper, we report on a patented optical design that keeps the reflected beam at the exact same position and direction whatever the displacement of the corner cube, leading to zero losses. After explaining how the optical design works and how it theoretically compensates for any defects in the translation of the corner cube, we present the results of experimental comparisons between a standard layout (i.e., only corner cubes) and our optical layout. To compare both optical layouts, we used a fiber-to-fiber coupling setup, which consists of coupling light from one fiber to the other by means of two lenses. The assembly [fiber + lens] is fixed and called a collimator, so that the light is coupled from one collimator to another. Each collimator was precisely made in order to have a precise working distance. In the experiment, we measured and compared the insertion loss (IL) variations between both collimators as a function of the distance between them (i.e., natural Gaussian beam coupling losses), and between both collimators in the different optical layouts tested, with the same optical propagation length. We will show that the IL variations of our setup are less than 0.05 dB with respect to the IL variations of the collimators alone.
Keywords: free-space optics, variable optical delay lines, optical cavity, auto-alignment
Procedia PDF Downloads 97
80 Estimation of the Road Traffic Emissions and Dispersion in the Developing Countries Conditions
Authors: Hicham Gourgue, Ahmed Aharoune, Ahmed Ihlal
Abstract:
We present in this work our model of road traffic emissions (line sources) and dispersion of these emissions, named DISPOLSPEM (Dispersion of Poly Sources and Pollutants Emission Model). In its emission part, this model was designed to keep the bottom-up and top-down approaches consistent. It also allows the generation of emission inventories from reduced input parameters, adapted to the existing conditions in Morocco and in other developing countries. While several simplifications are made, the full performance of the model results is kept. A further important advantage of the model is that it allows the calculation of the emission rate uncertainty with respect to each of the input parameters. In the dispersion part of the model, an improved line source model has been developed, implemented, and tested against a reference solution. It provides an improvement in accuracy over previous formulas of the line source Gaussian plume model without being too demanding in terms of computational resources. In the case study presented here, the biggest errors were associated with the ends of line source sections; these errors will be canceled by adjacent sections of line sources during the simulation of a road network. In cases where the wind is parallel to the source line, the use of a combination of discretized source and analytical line source formulas reduces the error remarkably. Because this combination is applied only for a small number of wind directions, it should not excessively increase the calculation time.
Keywords: air pollution, dispersion, emissions, line sources, road traffic, urban transport
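The discretized-source baseline such line-source formulas improve upon is easy to sketch: approximate the road by many point sources, each contributing a ground-reflected Gaussian plume at the receptor. The dispersion-coefficient curves below are Briggs-style rural class-D fits and, like the emission rate and geometry, are illustrative assumptions, not DISPOLSPEM's internals.

```python
import numpy as np

def plume_point(q, u, x, y, z, h, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume concentration for one point source
    of strength q (g/s), wind speed u (m/s) along +x, effective height h (m)."""
    if x <= 0:
        return 0.0
    sy, sz = sigma_y(x), sigma_z(x)
    return (q / (2 * np.pi * u * sy * sz)
            * np.exp(-y ** 2 / (2 * sy ** 2))
            * (np.exp(-(z - h) ** 2 / (2 * sz ** 2))
               + np.exp(-(z + h) ** 2 / (2 * sz ** 2))))

# A line source (road) approximated by discretizing it into point sources:
sigma_y = lambda x: 0.08 * x * (1 + 0.0001 * x) ** -0.5   # Briggs-style rural curves
sigma_z = lambda x: 0.06 * x * (1 + 0.0015 * x) ** -0.5
road_y = np.linspace(-100, 100, 201)                      # road along the y axis
q_per_segment = 0.5 / len(road_y)                         # 0.5 g/s spread over segments
receptor = (50.0, 0.0, 1.5)                               # 50 m downwind, breathing height
c = sum(plume_point(q_per_segment, u=3.0, x=receptor[0], y=receptor[1] - ys,
                    z=receptor[2], h=0.5, sigma_y=sigma_y, sigma_z=sigma_z)
        for ys in road_y)
print("concentration (g/m^3):", c)
```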
Procedia PDF Downloads 439
79 Simulation of Laser Structuring by Three Dimensional Heat Transfer Model
Authors: Bassim Shaheen Bachy, Jörg Franke
Abstract:
In this study, a three-dimensional numerical heat transfer model has been used to simulate the laser structuring of polymer substrate material for Three-Dimensional Molded Interconnect Devices (3D MID), which are used in advanced multi-functional applications. A finite element method (FEM) transient thermal analysis is performed using APDL (ANSYS Parametric Design Language) provided by ANSYS. In this model, the surface heat source was modeled with a Gaussian distribution, and the effect of mixed boundary conditions, consisting of convection and radiation heat transfer, has also been considered in the analysis. The model provides a full description of the temperature distribution and calculates the depth and the width of the groove upon material removal for different sets of laser parameters, such as laser power and laser speed. This study also includes an experimental procedure to study the effect of laser parameters on the depth and width of the removed groove, as verification of the modeled results. Good agreement between the experimental and the model results is achieved for a wide range of laser powers. It is found that the quality of the laser structuring process is affected by the laser scan speed and laser power. For high laser structuring quality, it is suggested to use a laser with high speed and moderate to high laser power.
Keywords: laser structuring, simulation, finite element analysis, thermal modeling
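A Gaussian surface heat source is conventionally written so that it integrates to the beam power over the plane. A small sketch of that flux profile, with assumed beam parameters, shows the quantity such a model applies per element at each load step.

```python
import numpy as np

def gaussian_flux(r, power, w):
    """Surface heat flux (W/m^2) of a Gaussian laser spot: total power
    `power` (W), 1/e^2 radius w (m); integrates to `power` over the plane."""
    return 2.0 * power / (np.pi * w ** 2) * np.exp(-2.0 * r ** 2 / w ** 2)

# Per-element loads for a transient FEM step: evaluate at element centroids
# located at radial distance r from the (moving) beam center.
P, w = 10.0, 25e-6                     # assumed 10 W beam, 25 um spot radius
r = np.linspace(0, 3 * w, 7)
print(gaussian_flux(r, P, w))
# In an APDL workflow, such values would be applied as element heat-flux loads
# re-centered every load step according to the laser scan speed.
```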
Procedia PDF Downloads 346
78 Localization of Buried People Using Received Signal Strength Indication Measurement of Wireless Sensor
Authors: Feng Tao, Han Ye, Shaoyi Liao
Abstract:
Buildings collapse after earthquakes, and people may be buried under the ruins. Search and rescue should be conducted as soon as possible to save them. Therefore, considering the complicated environment, irregular aftershocks, and the fact that rescue allows no delay, a target localization method based on RSSI (Received Signal Strength Indication) is proposed in this article. Target localization technology based on RSSI, with its features of low cost and low complexity, has been widely applied to node localization in WSNs (Wireless Sensor Networks). Based on the theory of RSSI transmission and the environmental impact on RSSI, this article conducts experiments in five scenes, and multiple filtering algorithms are applied to the original RSSI values in order to establish the signal propagation model with minimum test error for each scene. The target location can then be calculated, through an improved centroid algorithm, from the distances estimated from the signal propagation model. Results show that localization technology based on RSSI is suitable for large-scale node localization. Among the filtering algorithms, the mixed filtering algorithm (average of average, median, and Gaussian filtering) performs better than any single filtering algorithm, and by using the signal propagation model, the minimum error of the distance between the known nodes and the target node in the five scenes is about 3.06 m.
Keywords: signal propagation model, centroid algorithm, localization, mixed filtering, RSSI
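A minimal end-to-end sketch of this recipe: filter the raw RSSI samples, invert a log-distance path-loss model to get distances, then combine anchors with an inverse-distance-weighted centroid. The path-loss constants and the Gaussian-weighted-mean reading of "mixed filtering" are assumptions for illustration, not the paper's fitted models.

```python
import numpy as np

def rssi_to_distance(rssi, a=-45.0, n=2.5):
    """Log-distance path-loss model: RSSI(d) = A - 10*n*log10(d),
    where A is the RSSI at 1 m and n the path-loss exponent (fit per scene)."""
    return 10 ** ((a - rssi) / (10 * n))

def weighted_centroid(anchors, rssi):
    """Improved centroid: anchors weighted by inverse estimated distance."""
    d = rssi_to_distance(np.asarray(rssi))
    w = 1.0 / d
    return (anchors * w[:, None]).sum(0) / w.sum()

def mixed_filter(samples):
    """Average of mean, median, and a Gaussian-weighted mean of raw RSSI."""
    samples = np.asarray(samples, dtype=float)
    g = np.exp(-0.5 * ((samples - samples.mean()) / (samples.std() + 1e-9)) ** 2)
    return (samples.mean() + np.median(samples) + (g * samples).sum() / g.sum()) / 3.0

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
raw = [[-60, -62, -58, -61], [-70, -69, -72, -71], [-66, -68, -65, -67], [-75, -74, -76, -73]]
rssi = [mixed_filter(s) for s in raw]
print("estimated position:", weighted_centroid(anchors, rssi))
```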
Procedia PDF Downloads 299
77 Reducing Hazardous Materials Releases from Railroad Freights through Dynamic Trip Plan Policy
Authors: Omar A. Abuobidalla, Mingyuan Chen, Satyaveer S. Chauhan
Abstract:
Railroad transportation of hazardous materials freight is important to the North American economy, supporting the nation's supply chain. This paper introduces various extensions of the dynamic hazardous materials trip plan problem. The problem captures most of the operational features of a real-world railroad transportation system that dynamically initiates a set of blocks and assigns each shipment to a single block path or multiple block paths. The dynamic hazardous materials trip plan policies have distinguishing features: they integrate the blocking plan and the block activation decisions. We also present a non-linear mixed integer programming formulation for each variant and present managerial insights based on a hypothetical railroad network. The computational results reveal that the dynamic car scheduling policies are not only able to take advantage of the capacity of the network but also capable of diminishing the population and environmental risks by rerouting the active blocks along the least risky train services, without sacrificing the cost advantage of the railroad. The empirical results of this research illustrate that the issue of integrating the blocking plan and the train makeup of hazardous materials freight must receive closer attention.
Keywords: dynamic car scheduling, planning and scheduling hazardous materials freights, airborne hazardous materials, gaussian plume model, integrated blocking and routing plans, box model
Procedia PDF Downloads 204
76 Urban Park Green Space Planning and Construction under the Theory of Environmental Justice
Authors: Ma Chaoyang
Abstract:
This article starts from the perspective of environmental justice theory and analyzes the accessibility and regional equity of park green spaces in the central urban area of Chengdu in 2022, based on an improved Gaussian two-step floating catchment area (2SFCA) method and the Gini coefficient method. Then, using the relevant analysis models, it further explores the correlation between the spatial distribution of park green spaces and the socio-economic conditions of residents, in order to provide a reference for the construction of and research on Chengdu's park city under the guidance of fairness and justice. The results show that: (1) Overall, the spatial distribution of parks and green spaces in Chengdu shows a markedly uneven core-edge pattern, with a certain degree of unfairness; that is, there is an environmental injustice pattern. (2) The spatial layout of urban parks and green spaces is subject to strong guiding interference from socio-economic factors; that is, there is a high correlation between housing prices and the distribution of parks. (3) The Gini coefficient analysis of green space resources shows that residents using the three modes of transportation in the study area have unequal opportunities to enjoy park and green space services, and the degree of unfairness for walking is much greater than for the other two modes.
Keywords: parks and green spaces, environmental justice, two-step floating catchment area method, Gini coefficient, spatial distribution
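A Gaussian-weighted 2SFCA accessibility score reduces to two matrix operations once the travel-distance matrix is known; the sketch below (toy populations, park capacities, and a 2,500 m catchment, all assumed) shows both steps.

```python
import numpy as np

def gaussian_decay(d, d0):
    """Gaussian distance-decay weight commonly used in improved 2SFCA:
    equal to 1 at d = 0 and 0 at the catchment boundary d0."""
    g = (np.exp(-0.5 * (d / d0) ** 2) - np.exp(-0.5)) / (1 - np.exp(-0.5))
    return np.where(d <= d0, g, 0.0)

def two_sfca(pop, supply, dist, d0):
    """pop: demand at residential points (n,); supply: park capacity (m,);
    dist: n x m travel-distance matrix. Returns per-resident accessibility."""
    w = gaussian_decay(dist, d0)                   # (n, m) decay weights
    ratio = supply / (w.T @ pop + 1e-12)           # step 1: supply-to-demand ratio
    return w @ ratio                               # step 2: sum ratios within reach

pop = np.array([1000.0, 2500.0, 800.0])
supply = np.array([50.0, 120.0])                   # e.g. park area in hectares
dist = np.array([[500.0, 2000.0], [1500.0, 800.0], [3000.0, 400.0]])
print(two_sfca(pop, supply, dist, d0=2500.0))
```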
Procedia PDF Downloads 48