Search results for: domain decomposition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2272

2062 Testing the Change in Correlation Structure across Markets: High-Dimensional Data

Authors: Malay Bhattacharyya, Saparya Suresh

Abstract:

The correlation structure associated with a portfolio is subject to variation across time. Studying the structural breaks in the time-dependent correlation matrix of a portfolio has been a subject of interest for a better understanding of market movements, portfolio selection, etc. The current paper proposes a methodology for testing the change in the time-dependent correlation structure of a portfolio in high-dimensional data using the techniques of the generalized inverse, singular value decomposition, and multivariate distribution theory, which has not been addressed so far. The asymptotic properties of the proposed test are derived. Also, the performance and the validity of the method are tested on a real data set. The proposed test performs well for detecting the change in the dependence of global markets in the context of high-dimensional data.
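The generalized-inverse step at the core of such high-dimensional tests can be sketched as follows. This is a minimal numpy illustration on synthetic data, not the authors' test statistic: when the dimension exceeds the sample size, the sample covariance matrix is singular, and a Moore-Penrose inverse is obtained from its singular value decomposition.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 20, 10                        # dimension p exceeds sample size n
X = rng.normal(size=(n, p))
S = np.cov(X, rowvar=False)          # p x p sample covariance, rank <= n-1

# Moore-Penrose generalized inverse via singular value decomposition:
# invert only singular values above a numerical-rank tolerance.
U, s, Vt = np.linalg.svd(S)
tol = max(S.shape) * np.finfo(float).eps * s[0]
s_inv = np.where(s > tol, 1.0 / np.maximum(s, tol), 0.0)
S_pinv = Vt.T @ np.diag(s_inv) @ U.T
```

The resulting `S_pinv` satisfies the Penrose condition S S⁺ S = S even though S itself is not invertible.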

Keywords: correlation structure, high dimensional data, multivariate distribution theory, singular value decomposition

Procedia PDF Downloads 100
2061 Time-Domain Analysis of Pulse Parameters Effects on Crosstalk in High-Speed Circuits

Authors: Loubna Tani, Nabih Elouzzani

Abstract:

Crosstalk among interconnects and printed-circuit board (PCB) traces is a major limiting factor of signal quality in high-speed digital and communication equipment, especially when fast data buses are involved. Such a bus is considered as a planar multiconductor transmission line. This paper demonstrates how the finite difference time domain (FDTD) method provides an accurate solution of the transmission-line equations to analyze the near-end and far-end crosstalk. In addition, this study makes it possible to analyze the effect of rise time on the near-end and far-end voltages of the victim conductor. The paper also discusses a statistical analysis based upon a set of several simulations. Such analysis leads to a better understanding of the phenomenon and yields useful information.
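The FDTD time-stepping idea can be sketched in its simplest form: a single lossless line rather than the coupled multiconductor PCB model of the paper, with assumed per-metre inductance and capacitance, advanced by a staggered leapfrog scheme.

```python
import numpy as np

# 1D leapfrog FDTD for the lossless telegrapher's equations
#   dV/dt = -(1/C) dI/dx,   dI/dt = -(1/L) dV/dx
# (single line, not the full multiconductor crosstalk model; L_m, C_m assumed)
nx, nt = 200, 150
L_m, C_m = 2.5e-7, 1.0e-10                 # per-metre inductance / capacitance
dx = 1e-3
v = 1.0 / np.sqrt(L_m * C_m)               # phase velocity of the line
dt = 0.9 * dx / v                          # CFL-stable time step (Courant < 1)

V = np.zeros(nx)
I = np.zeros(nx - 1)                       # currents live on the staggered half-grid
V[:40] = np.exp(-((np.arange(40) - 20.0) ** 2) / 20.0)   # injected Gaussian pulse

for _ in range(nt):
    I -= dt / (L_m * dx) * (V[1:] - V[:-1])      # update currents from voltage gradient
    V[1:-1] -= dt / (C_m * dx) * (I[1:] - I[:-1])  # update interior voltages

peak = int(np.argmax(V))                   # right-going half of the split pulse
```

Rise-time and coupling studies like the one above then amount to repeating such runs while sweeping the source pulse parameters.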

Keywords: multiconductor transmission line, crosstalk, finite difference time domain (FDTD), printed-circuit board (PCB), rise time, statistical analysis

Procedia PDF Downloads 402
2060 Epistemic Stance in Chinese Medicine Translation: A Systemic Functional Perspective

Authors: Yan Yue

Abstract:

Epistemic stance refers to the writer's judgement about the certainty of a proposition, which demonstrates the writer's degree of commitment to and confidence in the status of the information. Epistemic stance can have great consequences for the validity or reliability of a statement; however, to date, it has received little attention in translation studies, especially from the perspective of systemic functional linguistics (SFL) and in relation to the translator's domain knowledge. This corpus-based study, carried out from an SFL perspective, investigates the epistemic stance patterns in Chinese medicine discourse translations by translators with and without medical domain knowledge. Overall, our findings show that all translators tend to be neither too assertive nor too doubtful about Chinese medicine statements, and they all tend to express their epistemic stance in a subjective rather than an objective way. Individually, there is a clear pattern of epistemic stance marked off by the translators' medical expertise, which further consolidates the previous finding that epistemic asymmetry is most salient between lay people and professionals. However, contrary to our hypothesis, translators who are clinicians and have more medical knowledge are found to be more tentative about TCM statements than translators who are not clinicians. This finding could serve to refine the statements about the relation between a writer's domain knowledge and epistemic stance-taking, as well as the current debate on whether Chinese medicine texts should only be translated by clinicians.

Keywords: epistemic stance, domain knowledge, SFL, medical translation

Procedia PDF Downloads 116
2059 Subsurface Structures Related to the Hydrocarbon Migration and Accumulation in the Afghan Tajik Basin, Northern Afghanistan: Insights from Seismic Attribute Analysis

Authors: Samim Khair Mohammad, Takeshi Tsuji, Chanmaly Chhun

Abstract:

The Afghan Tajik (foreland) basin, located in the depression zone between mountain axes, has been under compression and deformation during the collision of India with the Eurasian plate. The southern part of the Afghan Tajik basin in northern Afghanistan has not been well studied and explored but is considered to have significant potential for oil and gas resources. The Afghan Tajik basin depositional environments (< 8 km) resulted from mixed terrestrial and marine systems, which give potential prospects of Jurassic (deep) and Tertiary (shallow) petroleum systems. We used 2D regional seismic profiles with a total length of 674.8 km (over an area of 2500 km²) in the southern part of the basin. To characterize hydrocarbon systems and structures in this study area, we applied advanced seismic attributes such as spectral decomposition (10–60 Hz) based on time-frequency analysis with the continuous wavelet transform. The spectral decomposition results yield a spectral amplitude anomaly (averaged over the 20–30 Hz group). Based on this anomaly result and on seismic and structural interpretation, potential hydrocarbon accumulations were inferred around the main thrust folds in the Tertiary (Paleogene + Neogene) petroleum systems, which appear to be concentrated around the central study area. Furthermore, it seems that hydrocarbons dominantly migrated along the main thrusts and then accumulated around anticline fold systems, which could be sealed by mudstone/carbonate rocks.
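The continuous-wavelet-transform spectral decomposition over a 10–60 Hz band can be sketched as follows. This is a minimal numpy Morlet implementation on a synthetic single-frequency trace standing in for a seismic trace; the wavelet centre frequency and trace parameters are assumptions for illustration.

```python
import numpy as np

fs = 500.0
t = np.arange(0, 2.0, 1.0 / fs)
sig = np.sin(2 * np.pi * 25.0 * t)          # synthetic "trace" dominated by 25 Hz

freqs = np.arange(10.0, 61.0, 1.0)          # 10-60 Hz band, as in the abstract
w0 = 6.0                                    # Morlet centre frequency (cycles)
scalogram = np.empty((freqs.size, t.size))
for i, f in enumerate(freqs):
    s = w0 * fs / (2 * np.pi * f)           # scale matched to centre frequency f
    n = int(10 * s) | 1                     # odd-length support (~10 std devs)
    tt = (np.arange(n) - n // 2) / s
    wavelet = np.exp(1j * w0 * tt) * np.exp(-tt**2 / 2) / np.sqrt(s)
    scalogram[i] = np.abs(np.convolve(sig, np.conj(wavelet), mode="same"))

dominant = freqs[np.argmax(scalogram.mean(axis=1))]   # dominant spectral component
```

Averaging the scalogram over a frequency group (e.g. 20–30 Hz) then gives the kind of spectral-amplitude attribute map described above, trace by trace.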

Keywords: The Afghan Tajik basin, seismic lines, spectral decomposition, thrust folds, hydrocarbon reservoirs

Procedia PDF Downloads 64
2058 Quality Evaluation of Backfill Grout in Tunnel Boring Machine Tail Void Using Impact-Echo (IE): Short-Time Fourier Transform (STFT) Numerical Analysis

Authors: Ju-Young Choi, Ki-Il Song, Kyoung-Yul Kim

Abstract:

During Tunnel Boring Machine (TBM) tunnel excavation, backfill grout should be injected after the installation of the segment lining to ensure the stability of the tunnel and to minimize ground deformation. If grouting is not sufficient to fill the gap between the segments and the rock mass, hydraulic pressures occur in the void, which can negatively influence the stability of the tunnel. Recently, the tendency to use the TBM tunnelling method in place of the drill-and-blast (NATM) method has been increasing. However, there are only a few studies on the evaluation of backfill grout. This study evaluates the TBM tunnel backfill state using the Impact-Echo (IE) method. A three-layer system, segment–grout–rock mass, is simulated with FLAC 2D, an FDM-based software. The signals obtained from the numerical analysis and the IE test are analyzed by the Short-Time Fourier Transform (STFT) in the time domain, frequency domain, and time-frequency domain. The results of this study can be used to evaluate the quality of backfill grouting in the tail void.
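The STFT analysis step can be sketched with plain numpy. The signal below is a synthetic tone standing in for an impact-echo response; window length, hop size, and the 1 kHz resonance are illustrative assumptions, not values from the study.

```python
import numpy as np

fs = 8000
t = np.arange(0, 0.5, 1 / fs)
x = np.sin(2 * np.pi * 1000 * t)             # synthetic "impact-echo" tone at 1 kHz

# Short-Time Fourier Transform: windowed, overlapping FFT frames
win, hop = 256, 128
frames = [x[i:i + win] * np.hanning(win)
          for i in range(0, len(x) - win + 1, hop)]
S = np.abs(np.fft.rfft(frames, axis=1))      # |STFT|: rows = time, cols = frequency
freqs = np.fft.rfftfreq(win, d=1 / fs)
peak_hz = freqs[np.argmax(S.mean(axis=0))]   # dominant resonance frequency
```

In an impact-echo test, a shift or broadening of this dominant peak over time frames is what signals a poorly grouted void behind the lining.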

Keywords: tunnel boring machine, backfill grout, impact-echo method, time-frequency domain analysis, finite difference method

Procedia PDF Downloads 243
2057 Unsupervised Domain Adaptive Text Retrieval with Query Generation

Authors: Rui Yin, Haojie Wang, Xun Li

Abstract:

Recently, mainstream dense retrieval methods have obtained state-of-the-art results on some datasets and tasks. However, they require large amounts of training data, which is not available in most domains. The severe performance degradation of dense retrievers on new data domains has limited the use of dense retrieval methods to only a few domains with large training datasets. In this paper, we propose an unsupervised domain-adaptive approach based on query generation. First, a generative model is used to generate relevant queries for each passage in the target corpus, and the generated queries are then used for mining negative passages. Finally, the query-passage pairs are labeled with a cross-encoder and used to train a domain-adapted dense retriever. Experiments show that our approach is more robust than previous methods in target domains while requiring less unlabeled data.
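The three-stage pipeline (generate queries, mine negatives, label with a cross-encoder) can be sketched as a toy loop. Everything below is a stand-in: the "generator" and "cross-encoder" are crude token-overlap heuristics, not trained models, and the corpus is invented for illustration.

```python
# Toy sketch of the adaptation pipeline; generator and cross-encoder are
# token-overlap stand-ins, NOT real models.
corpus = {
    "p1": "domain decomposition splits a mesh into subdomains",
    "p2": "dense retrieval maps queries and passages to vectors",
    "p3": "wavelets decompose a signal across scales",
}

def generate_query(passage):                 # stand-in for a query generator
    return " ".join(passage.split()[:3])

def cross_encoder_score(query, passage):     # stand-in for a cross-encoder
    q, p = set(query.split()), set(passage.split())
    return len(q & p) / len(q)

training_pairs = []
for pid, passage in corpus.items():
    query = generate_query(passage)
    # mine the hardest negative: the highest-scoring *other* passage
    negatives = sorted((o for o in corpus if o != pid),
                       key=lambda o: -cross_encoder_score(query, corpus[o]))
    label = cross_encoder_score(query, passage)   # pseudo-label for training
    training_pairs.append((query, pid, negatives[0], label))
```

A real implementation would replace the two stubs with a seq2seq generator and a trained cross-encoder, and train the dense retriever on `training_pairs`.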

Keywords: dense retrieval, query generation, unsupervised training, text retrieval

Procedia PDF Downloads 38
2056 Algebras over an Integral Domain and Immediate Neighbors

Authors: Shai Sarussi

Abstract:

Let S be an integral domain with field of fractions F and let A be an F-algebra. An S-subalgebra R of A is called S-nice if R∩F = S and the localization of R with respect to S \{0} is A. Denoting by W the set of all S-nice subalgebras of A, and defining a notion of open sets on W, one can view W as a T0-Alexandroff space. A characterization of the property of immediate neighbors in an Alexandroff topological space is given, in terms of closed and open subsets of appropriate subspaces. Moreover, two special subspaces of W are introduced, and a way in which their closed and open subsets induce W is presented.

Keywords: integral domains, Alexandroff topology, immediate neighbors, valuation domains

Procedia PDF Downloads 146
2055 Video Shot Detection and Key Frame Extraction Using Faber-Shauder DWT and SVD

Authors: Assma Azeroual, Karim Afdel, Mohamed El Hajji, Hassan Douzi

Abstract:

Key frame extraction methods select the most representative frames of a video, which can be used in different areas of video processing such as video retrieval, video summarization, and video indexing. In this paper, we present a novel approach for extracting key frames from video sequences. Each frame is characterized uniquely by its contours, which are represented by the dominant blocks. These dominant blocks are located on the contours and their nearby textures. When the video frames show a noticeable change, their dominant blocks change, and we can then extract a key frame. The dominant blocks of every frame are computed, and then feature vectors are extracted from the dominant-block image of each frame and arranged in a feature matrix. Singular value decomposition is used to calculate the sliding-window ranks of those matrices. Finally, the computed ranks are traced, and the key frames of the video are extracted. Experimental results show that the proposed approach is robust against a large range of digital effects used during shot transitions.
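The sliding-window SVD-rank idea can be sketched on synthetic data: frames within one shot give a low-rank window, and the rank jumps when a window straddles a shot boundary. The feature vectors below are random stand-ins for the dominant-block features of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
v1, v2 = rng.random(16), rng.random(16)
frames = np.array([v1] * 10 + [v2] * 10)     # two synthetic "shots" of 10 frames

win = 4
ranks = []
for i in range(len(frames) - win + 1):
    s = np.linalg.svd(frames[i:i + win], compute_uv=False)
    ranks.append(int(np.sum(s > 1e-8 * s[0])))   # numerical rank of the window

# a rank jump marks a shot transition entering the window
boundary = next(i for i, r in enumerate(ranks) if r > ranks[0])
```

Here the first rank-2 window starts at index 7, i.e. it is the first window to include frame 10, the start of the second shot.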

Keywords: FSDWT, key frame extraction, shot detection, singular value decomposition

Procedia PDF Downloads 359
2054 Catalytic Degradation of Tetracycline in Aqueous Solution by Magnetic Ore Pyrite Nanoparticles

Authors: Allah Bakhsh Javid, Ali Mashayekh-Salehi, Fatemeh Davardoost

Abstract:

This study presents the preparation, characterization, and catalytic activity of a novel natural mineral-based catalyst for the destructive adsorption of tetracycline (TTC), a water-emerging compound. The degradation potential of the raw and calcined magnetite catalyst was evaluated under different experimental conditions, such as pH, catalyst dose, reaction time, and pollutant concentration. Calcined magnetite attained greater catalytic potential than the raw ore in the degradation of tetracycline, around 69% versus 3% at a reaction time of 30 min and a TTC aqueous solution of 50 mg/L, respectively. Complete removal of TTC could be obtained using 2 g/L of calcined nanoparticles at a reaction time of 60 min. The removal of TTC increased with increasing solution temperature. Accordingly, considering its abundance in nature together with its very high catalytic potential, calcined pyrite is a promising and reliable catalytic material for the destructive decomposition and mineralization of pharmaceutical compounds such as TTC in water and wastewater.

Keywords: catalytic degradation, tetracycline, pyrite, emerging pollutants

Procedia PDF Downloads 147
2053 Measurement of Fatty Acid Changes in Post-Mortem Belowground Carcass (Sus-scrofa) Decomposition: A Semi-Quantitative Methodology for Determining the Post-Mortem Interval

Authors: Nada R. Abuknesha, John P. Morgan, Andrew J. Searle

Abstract:

Information regarding the post-mortem interval (PMI) in criminal investigations is vital to establish a time frame when reconstructing events. PMI is defined as the time period that has elapsed between the occurrence of death and the discovery of the corpse. Adipocere, commonly referred to as 'grave wax', is formed when post-mortem adipose tissue is converted into a solid material that is heavily comprised of fatty acids. Adipocere is of interest to forensic anthropologists, as its formation is able to slow down the decomposition process. Therefore, analysing the changes in the patterns of fatty acids during the early decomposition process may make it possible to estimate the period of burial, and hence the PMI. The current study investigated the fatty acid composition and patterns in buried pig fat tissue, in an attempt to determine whether particular patterns of fatty acid composition are associated with the duration of burial and hence may be used to estimate PMI. Adipose tissue from the abdominal region of domestic pigs (Sus scrofa) was used to model the human decomposition process. A 17 × 20 cm piece of pork belly was buried in a shallow artificial grave, and weekly samples (n=3) of the buried pig fat tissue were collected over an 11-week period. The marker fatty acids palmitic (C16:0), oleic (C18:1n-9), and linoleic (C18:2n-6) acid were extracted from the buried pig fat tissue and analysed as fatty acid methyl esters using a gas chromatography system. Levels of the marker fatty acids were quantified from their respective standards. The concentrations of C16:0 (69.2 mg/mL) and C18:1n-9 (44.3 mg/mL) from time zero exhibited significant fluctuations during the burial period. Levels rose (116 and 60.2 mg/mL, respectively) and fell starting from the second week to reach 19.3 and 18.3 mg/mL, respectively, at week 6. Levels showed another increase at week 9 (66.3 and 44.1 mg/mL, respectively), followed by a gradual decrease at week 10 (20.4 and 18.5 mg/mL, respectively). A sharp increase was observed in the final week (131.2 and 61.1 mg/mL, respectively). Conversely, the levels of C18:2n-6 remained more or less constant throughout the study. In addition to the fluctuations in concentration, several new fatty acids appeared in the later weeks, while other fatty acids that were detectable in the time-zero sample were lost. There are several probable opportunities to utilise fatty acid analysis as a basic technique for approximating PMI: the quantification of marker fatty acids and the detection of selected fatty acids that either disappear or appear during the burial period. This pilot study indicates that this may be a potential semi-quantitative methodology for determining the PMI. Ideally, the analysis of particular fatty acid patterns in the early stages of decomposition could be an additional tool to the techniques already available for estimating the PMI of a corpse.

Keywords: adipocere, fatty acids, gas chromatography, post-mortem interval

Procedia PDF Downloads 98
2052 Atom Probe Study of Early Stage of Precipitation on Binary Al-Li, Al-Cu Alloys and Ternary Al-Li-Cu Alloys

Authors: Muna Khushaim

Abstract:

Aluminum-based alloys play a key role in modern engineering, especially in the aerospace industry. The introduction of solute atoms such as Li and Cu is the main approach to improving the strength of age-hardenable Al alloys via the precipitation-hardening phenomenon. Knowledge of the decomposition process of the microstructure during the precipitation reaction is particularly important for future technical developments. The objective of this study is to investigate the nanoscale chemical composition of Al-Cu, Al-Li, and Al-Li-Cu alloys during the early stage of the precipitation sequence and to describe whether this compositional difference correlates with variations in the observed precipitation kinetics. Comparing the random binomial frequency distribution with the experimental frequency distribution of concentrations in atom probe tomography data was used to investigate the early stage of decomposition in the different binary and ternary alloys, which underwent different heat treatments. The results show that an Al-1.7 at.% Cu alloy requires a long ageing time of approximately 8 h at 160 °C to allow the diffusion of Cu atoms into the Al matrix. For the Al-8.2 at.% Li alloy, a combination of natural ageing (48 h at room temperature) and short artificial ageing (5 min at 160 °C) increases the number density of Li clusters and hence the number of precipitated δ' particles. Applying this combination of natural ageing and short artificial ageing to the ternary Al-4 at.% Li-1.7 at.% Cu alloy induces the formation of a Cu-rich phase. Increasing the Li content of the ternary alloy up to 8 at.% and increasing the ageing time to 30 min resulted in precipitation processes ending with δ' particles. Thus, the results contribute to the understanding of Al-alloy design.
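The frequency-distribution comparison can be sketched as follows: atom-probe data are binned into fixed-size blocks, the solute count per block is tallied, and the observed histogram is compared against the binomial distribution expected for a random solid solution. The data below are simulated; block size, composition, and the clustering model are illustrative assumptions.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(1)
n_blocks, block_size, c = 2000, 100, 0.05   # blocks of 100 atoms, 5 at.% solute

# random solid solution vs. a "decomposed" alloy with 10% solute-rich regions
random_counts = rng.binomial(block_size, c, n_blocks)
clustered_p = np.where(rng.random(n_blocks) < 0.1, 0.3, 0.022)
clustered_counts = rng.binomial(block_size, clustered_p)

def chi2_vs_binomial(counts):
    """Chi-square distance between observed block counts and the binomial model."""
    ks = np.arange(block_size + 1)
    expected = n_blocks * np.array(
        [comb(block_size, int(k)) * c**k * (1 - c)**(block_size - k) for k in ks])
    observed = np.bincount(counts, minlength=block_size + 1)
    mask = expected > 5                      # standard chi-square validity cut
    return float(np.sum((observed[mask] - expected[mask]) ** 2 / expected[mask]))
```

A clustered (decomposing) microstructure departs far more strongly from the binomial reference than a random one, which is the signature used to detect early-stage decomposition.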

Keywords: aluminum alloy, atom probe tomography, early stage, decomposition

Procedia PDF Downloads 319
2051 Dynamical Analysis of the Fractional-Order Mathematical Model of Hashimoto’s Thyroiditis

Authors: Neelam Singha

Abstract:

The present work intends to analyze the system dynamics of Hashimoto's thyroiditis with the assistance of fractional calculus. Hashimoto's thyroiditis, or chronic lymphocytic thyroiditis, is an autoimmune disorder in which the immune system attacks the thyroid gland, gradually interrupting normal thyroid operation. Consequently, the feedback control of the system is disrupted due to thyroid follicle cell lysis, and the patient experiences serious clinical conditions such as goiter, hyperactivity, euthyroidism, and hyperthyroidism. In this work, we aim to obtain the approximate solution to the posed fractional-order problem describing Hashimoto's thyroiditis. We employ the Adomian decomposition method to solve the system of fractional-order differential equations, and the solutions obtained should be useful for providing information about the effect of medical care. The numerical technique is executed in an organized manner to furnish the associated details of the progression of the disease and to visualize it graphically with suitable plots.
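The Adomian recursion can be sketched on the simplest possible test problem, the integer-order linear equation dy/dt = -y with y(0) = 1, whose Adomian terms yₙ = (-t)ⁿ/n! sum to exp(-t). This is a stand-in: in the fractional-order setting of the paper, the plain integral below is replaced by a Riemann-Liouville fractional integral.

```python
# Adomian decomposition for dy/dt = -y, y(0) = 1:
#   y_0 = y(0),   y_{n+1} = -∫_0^t y_n dt,   y = sum of terms.
# Each term is kept as a list of polynomial coefficients [c_0, c_1, ...].

def integrate_poly(coeffs):
    """Antiderivative of sum c_k t^k with zero constant of integration."""
    return [0.0] + [c / (k + 1) for k, c in enumerate(coeffs)]

terms = [[1.0]]                                   # y_0 = y(0) = 1
for _ in range(10):
    terms.append([-c for c in integrate_poly(terms[-1])])   # y_{n+1} = -∫ y_n dt

def evaluate(t):
    """Partial Adomian sum at time t."""
    return sum(c * t**k for term in terms for k, c in enumerate(term))
```

With ten terms the partial sum already matches exp(-t) to high accuracy near t = 1, which is the kind of convergence the method relies on for the disease model.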

Keywords: Adomian decomposition method, fractional derivatives, Hashimoto's thyroiditis, mathematical modeling

Procedia PDF Downloads 182
2050 Purification and Pre-Crystallization of Recombinant PhoR Cytoplasmic Domain Protein from Mycobacterium Tuberculosis H37Rv

Authors: Oktira Roka Aji, Maelita R. Moeis, Ihsanawati, Ernawati A. Giri-Rachman

Abstract:

Globally, tuberculosis (TB) remains a leading cause of death. The emergence of multidrug-resistant and extensively drug-resistant strains has become a major public concern. One of the potential candidates for a drug target is the cytoplasmic domain of the PhoR histidine kinase, part of the two-component system (TCS) PhoR-PhoP in Mycobacterium tuberculosis (Mtb). TCS PhoR-PhoP relays extracellular signals to control the expression of 114 virulence-associated genes in Mtb. The 3D structure of the PhoR cytoplasmic domain is needed to screen novel drugs using structure-based drug discovery. The PhoR cytoplasmic domain from Mtb H37Rv was overexpressed in E. coli BL21(DE3), then purified using an IMAC Ni-NTA agarose his-tag affinity column and DEAE ion-exchange column chromatography. The molecular weight of the purified protein was estimated to be 37 kDa by SDS-PAGE analysis. This sample was used for pre-crystallization screening by applying the sitting-drop vapor diffusion method using the Natrix (HR2-116) 48-solution crystal screen kit at 25 °C. Needle-like crystals were observed after the seventh day of incubation in test solution No. 47 (0.1 M KCl, 0.01 M MgCl2·6H2O, 0.05 M Tris-Cl pH 8.5, 30% v/v PEG 4000). Further testing is required to confirm the crystal.

Keywords: tuberculosis, two component system, histidine kinase, needle-like crystals

Procedia PDF Downloads 408
2049 Localization of Pyrolysis and Burning of Ground Forest Fires

Authors: Pavel A. Strizhak, Geniy V. Kuznetsov, Ivan S. Voytkov, Dmitri V. Antonov

Abstract:

This paper presents the results of experiments carried out at a specialized test site to establish macroscopic patterns of heat and mass transfer processes when localizing model combustion sources of ground forest fires with the use of barrier lines in the form of a wetted layer of material in front of the zone of flame burning and thermal decomposition. The experiments were performed using needles, leaves, twigs, and mixtures thereof. The dimensions of the model combustion source and the ranges of heat release correspond well to the real conditions of ground forest fires. The main attention is paid to a complex analysis of the effect of the dispersion of the water aerosol (concentration and size of droplets) used to form the barrier line. It is shown that effective conditions for localization and subsequent suppression of flame combustion and thermal decomposition of forest fuel can be achieved by creating a group of barrier lines with different wetting widths and depths of the material. Relative indicators of the effectiveness of single and combined barrier lines were established, taking into account all the main characteristics of the processes of suppressing the burning and thermal decomposition of forest combustible materials. We predicted the necessary and sufficient parameters of barrier lines (water volume, width and depth of the wetted layer of material, specific irrigation density) for combustion sources of different dimensions, corresponding to real fire-extinguishing practice.

Keywords: forest fire, barrier water lines, pyrolysis front, flame front

Procedia PDF Downloads 102
2048 3D Modeling for Frequency and Time-Domain Airborne EM Systems with Topography

Authors: C. Yin, B. Zhang, Y. Liu, J. Cai

Abstract:

Airborne EM (AEM) is an effective geophysical exploration tool, especially suitable for rugged mountain areas. In these areas, topography has serious effects on AEM system responses. However, until now, few studies have been reported on the topographic effect on airborne EM systems. In this paper, an edge-based unstructured finite-element (FE) method is developed for 3D topographic modeling of both frequency- and time-domain airborne EM systems. Starting from the frequency-domain Maxwell equations, a vector Helmholtz equation is derived to obtain a stable and accurate solution. Considering that the AEM transmitter and receiver are both located in the air, the scattered-field method is used in our modeling. The Galerkin method is applied to discretize the Helmholtz equation for the final FE equations. Solving the FE equations, the frequency-domain AEM responses are obtained. To accelerate the calculation, the response of the source in free space is used as the primary field, and the PARDISO direct solver is used to deal with the problem of multiple transmitting sources. After calculating the frequency-domain AEM responses, a Hankel transform is applied to obtain the time-domain AEM responses. To check the accuracy of the present algorithm and to analyze the characteristics of the topographic effect on airborne EM systems, both the frequency- and time-domain AEM responses for three model groups are simulated: 1) a flat half-space model that has a semi-analytical solution of the EM response; 2) a valley or hill earth model; 3) a valley or hill earth with an anomalous body embedded. Numerical experiments show that close to the node points of the topography, AEM responses demonstrate sharp changes. Special attention needs to be paid to topographic effects when interpreting AEM survey data over rugged topographic areas. Besides, the profile of the AEM responses presents a mirror relation with the topographic earth surface. In comparison to the topographic effect, which mainly occurs at the high-frequency end and in early time channels, the EM responses of underground conductors mainly occur at low frequencies and in later time channels. For the signal of the same time channel, the dB/dt field reflects the change of conductivity better than the B-field. The research of this paper will serve airborne EM in the identification and correction of topographic effects.
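The Galerkin assembly step can be sketched in its simplest scalar 1D form. This is only an analogue of the paper's 3D edge-based vector FE method: a linear-element Galerkin solve of -u'' = f on [0, 1] with homogeneous Dirichlet boundaries, where f is chosen so the exact solution is sin(πx).

```python
import numpy as np

# Galerkin FE for -u'' = pi^2 sin(pi x), u(0) = u(1) = 0; exact u = sin(pi x).
n = 50
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
K = np.zeros((n + 1, n + 1))
F = np.zeros(n + 1)
for e in range(n):                            # loop over elements
    K[e:e + 2, e:e + 2] += np.array([[1, -1], [-1, 1]]) / h   # element stiffness
    xm = 0.5 * (x[e] + x[e + 1])
    F[e:e + 2] += np.pi**2 * np.sin(np.pi * xm) * h / 2       # midpoint-rule load

K[0, :], K[-1, :] = 0.0, 0.0                  # impose Dirichlet conditions
K[0, 0] = K[-1, -1] = 1.0
F[0] = F[-1] = 0.0

u = np.linalg.solve(K, F)
err = np.max(np.abs(u - np.sin(np.pi * x)))
```

The 3D electromagnetic case replaces this scalar stiffness matrix with curl-curl element matrices on tetrahedral edges and a complex-valued right-hand side from the scattered-field source term, but the assemble-then-solve structure is the same.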

Keywords: 3D, Airborne EM, forward modeling, topographic effect

Procedia PDF Downloads 290
2047 Structural Molecular Dynamics Modelling of FH2 Domain of Formin DAAM

Authors: Rauan Sakenov, Peter Bukovics, Peter Gaszler, Veronika Tokacs-Kollar, Beata Bugyi

Abstract:

FH2 (formin homology 2) domains of several proteins, collectively known as formins, including DAAM, DAAM1, and mDia1, promote G-actin nucleation and elongation. The FH2 domains of these formins exist as oligomers. Chain dimerization by ring-structure formation serves as a structural basis for the actin polymerization function of the FH2 domain. A proper single-chain configuration and specific interactions between its various regions are necessary for individual chains to form a dimer functional in G-actin nucleation and elongation. FH1 and WH2 domain-containing formins were shown to behave as intrinsically disordered proteins. Thus, the aim of this research was to study the structural dynamics of the FH2 domain of DAAM. To investigate its structural features, molecular dynamics simulations of chain A of the FH2 domain of DAAM, solvated in a water box in 50 mM NaCl, were conducted at temperatures from 293.15 to 353.15 K with VMD 1.9.2, NAMD 2.14, and AmberTools 21, using the 2z6e and 1v9d PDB structures of DAAM obtained from the I-TASSER web server. The calcium- and ATP-bound G-actin 3hbt PDB structure was used as a reference protein with well-described structural dynamics of denaturation. Topology and parameter information of the CHARMM 2012 additive all-atom force fields for proteins, carbohydrate derivatives, water, and ions was used in NAMD 2.14, and the ff19SB force field for proteins in AmberTools 21. The systems were energy-minimized for the first 1000 steps, equilibrated, and produced in the NPT ensemble for 1 ns using stochastic Langevin dynamics and the particle mesh Ewald method. Our root-mean-square deviation (RMSD) analysis of the molecular dynamics of chain A of the FH2 domain of DAAM revealed insignificant changes in the total molecular average RMSD values at temperatures from 293.15 to 353.15 K. In contrast, the total molecular average RMSD values of G-actin showed a considerable increase at 328 K, which corresponds to the denaturation of the G-actin molecule at this temperature and its transition from the native, ordered state to the denatured, disordered state, which is well described in the literature. The RMSD values of the lasso and tail regions of chain A of the FH2 domain of DAAM were higher than the total molecular average RMSD at temperatures from 293.15 to 353.15 K. These regions are functional in intra- and interchain interactions and contain the highly conserved tryptophan residues of the lasso region, the highly conserved GNYMN sequence of the post region, and the amino acids of the shell of the hydrophobic pocket of the salt bridge between Arg171 and Asp321, which are important for the structural stability and ordered state of the FH2 domain of DAAM and its functions in FH2 domain dimerization. In conclusion, the higher-than-average RMSD values of the lasso and post regions of chain A may reflect a disordered state of the FH2 domain of DAAM at temperatures from 293.15 to 353.15 K. Finally, the absence of a marked transition, in terms of significant changes in average molecular RMSD values between the native and denatured states of the FH2 domain of DAAM at these temperatures, may make it possible to attribute these formins to the group of intrinsically disordered proteins rather than to the group of intrinsically ordered proteins such as G-actin.
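The RMSD metric used throughout this analysis can be sketched in numpy: coordinates of each trajectory frame are superposed onto the reference by the optimal (Kabsch) rotation before the deviation is averaged. The coordinates below are synthetic; real analyses would feed in trajectory frames.

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between (N, 3) coordinate sets after optimal superposition."""
    P = P - P.mean(axis=0)                    # remove translations
    Q = Q - Q.mean(axis=0)
    V, S, Wt = np.linalg.svd(P.T @ Q)         # Kabsch: SVD of the covariance
    d = np.sign(np.linalg.det(V @ Wt))        # avoid improper rotations
    R = V @ np.diag([1.0, 1.0, d]) @ Wt       # optimal rotation, P @ R aligns to Q
    return float(np.sqrt(np.mean(np.sum((P @ R - Q) ** 2, axis=1))))

rng = np.random.default_rng(0)
P = rng.random((30, 3))                       # toy "reference structure"
theta = 0.7                                   # rotate + translate a copy of it
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
Q = P @ Rz.T + np.array([1.0, -2.0, 0.5])
```

A rigidly moved copy gives RMSD ≈ 0 after superposition; genuine conformational change (as in the lasso and post regions above) is what produces elevated RMSD.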

Keywords: FH2 domain, DAAM, formins, molecular modelling, computational biophysics

Procedia PDF Downloads 106
2046 Speech Intelligibility Improvement Using Variable Level Decomposition DWT

Authors: Samba Raju, Chiluveru, Manoj Tripathy

Abstract:

Intelligibility is an essential characteristic of a speech signal, which helps in understanding the information in the signal. Background noise in the environment can deteriorate the intelligibility of recorded speech. In this paper, we present a simple variance-subtracted, variable-level discrete wavelet transform that improves the intelligibility of speech. The proposed algorithm does not require an explicit estimation of the noise, i.e., prior knowledge of the noise; hence, it is easy to implement and reduces the computational burden. The proposed algorithm decides a separate decomposition level for each frame based on signal-dominant and noise-dominant criteria. The performance of the proposed algorithm is evaluated with the short-time objective intelligibility measure (STOI), and the results obtained are compared with universal discrete wavelet transform (DWT) thresholding and minimum mean square error (MMSE) methods. The experimental results revealed that the proposed scheme outperformed the competing methods.
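The variable-level DWT idea can be sketched with a hand-rolled Haar transform: decompose each frame to a per-frame depth, then reconstruct. The per-frame level rule below is a crude stand-in for the paper's signal-dominant/noise-dominant criteria, and the Haar wavelet is chosen only for simplicity.

```python
import numpy as np

def haar_dwt(x, level):
    """Multi-level Haar analysis: returns (approximation, [details per level])."""
    coeffs, a = [], np.asarray(x, dtype=float)
    for _ in range(level):
        even, odd = a[0::2], a[1::2]
        coeffs.append((even - odd) / np.sqrt(2))   # detail band
        a = (even + odd) / np.sqrt(2)              # approximation band
    return a, coeffs

def haar_idwt(a, coeffs):
    """Exact inverse of haar_dwt."""
    for d in reversed(coeffs):
        out = np.empty(2 * a.size)
        out[0::2] = (a + d) / np.sqrt(2)
        out[1::2] = (a - d) / np.sqrt(2)
        a = out
    return a

def pick_level(frame, noise_std, max_level=4):
    """Crude per-frame rule: decompose deeper when the frame is noise-dominant."""
    snr = np.var(frame) / max(noise_std**2, 1e-12)
    return max_level if snr < 2.0 else 2
```

Denoising would threshold the detail bands between analysis and synthesis; the key property exercised here is perfect reconstruction, so any intelligibility change comes only from the thresholding.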

Keywords: discrete wavelet transform, speech intelligibility, STOI, standard deviation

Procedia PDF Downloads 114
2045 Tracking the Effect of Ibutilide on Amplitude and Frequency of Fibrillatory Intracardiac Electrograms Using the Regression Analysis

Authors: H. Hajimolahoseini, J. Hashemi, D. Redfearn

Abstract:

Background: Catheter ablation is an effective therapy for symptomatic atrial fibrillation (AF). The intracardiac electrogram (IEGM) collected during this procedure contains valuable information that has not been explored to its full capacity. Novel processing techniques allow looking at these recordings from different perspectives, which can lead to improved therapeutic approaches. In our previous study, we showed that variation in amplitude measured through Shannon entropy could be used as an AF recurrence risk stratification factor in patients who received Ibutilide before the electrograms were recorded. The aim of this study is to further investigate the effect of Ibutilide on the characteristics of the signals recorded from the left atrium (LA) of patients with persistent AF before and after administration of the drug. Methods: The IEGMs collected from different intra-atrial sites of 12 patients were studied and compared before and after Ibutilide administration. First, the before- and after-Ibutilide IEGMs that were recorded within a Euclidean distance of 3 mm in the LA were selected as pairs for comparison. For every selected pair of IEGMs, the probability distribution function (PDF) of the amplitude in the time domain and of the magnitude in the frequency domain was estimated using regression analysis. The PDF represents the relative likelihood of a variable falling within a specific range of values. Results: Our observations showed that in the time domain, the PDF of the amplitudes fitted a Gaussian distribution, while in the frequency domain, it fitted a Rayleigh distribution. Our observations also revealed that after Ibutilide administration, the IEGMs have significantly narrower, short-tailed PDFs in both the time and frequency domains. Conclusion: This study shows that the PDFs of the IEGMs before and after administration of Ibutilide exhibit significantly different properties in both the time and frequency domains. Hence, by fitting the PDF of the IEGMs in the time domain to a Gaussian distribution, or in the frequency domain to a Rayleigh distribution, the effect of Ibutilide can easily be tracked using the statistics of the PDF (e.g., the standard deviation), while this is difficult from the waveform of the IEGMs itself.
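The two fits can be sketched on a synthetic signal standing in for an IEGM: a Gaussian fit to the time-domain amplitudes via sample moments, and a Rayleigh fit to the FFT magnitudes via its maximum-likelihood scale estimate. The signal and its 0.4 mV scale are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
iegm = rng.normal(0.0, 0.4, 5000)              # toy "IEGM" amplitude samples
mag = np.abs(np.fft.rfft(iegm))                # frequency-domain magnitudes

# Gaussian fit in the time domain (moment estimates)
mu_hat, sigma_hat = iegm.mean(), iegm.std()

# Rayleigh fit in the frequency domain (MLE: sigma^2 = mean(|X|^2) / 2)
rayleigh_sigma = float(np.sqrt(np.mean(mag**2) / 2.0))
```

Tracking the drug effect then reduces to comparing `sigma_hat` (and `rayleigh_sigma`) before versus after administration: a narrower post-drug PDF shows up directly as a smaller fitted scale parameter.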

Keywords: atrial fibrillation, catheter ablation, probability distribution function, time-frequency characteristics

Procedia PDF Downloads 141
2044 Speaker Identification by Atomic Decomposition of Learned Features Using Computational Auditory Scene Analysis Principles in Noisy Environments

Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic

Abstract:

Speaker recognition is performed in high Additive White Gaussian Noise (AWGN) environments using principles of Computational Auditory Scene Analysis (CASA). CASA methods often classify sounds from images in the time-frequency (T-F) plane, using spectrograms or cochleagrams as the image. In this paper, atomic decomposition implemented by matching pursuit performs a transform from time-series speech signals to the T-F plane. The atomic decomposition creates a sparsely populated T-F vector in "weight space", where each populated T-F position contains an amplitude weight. The weight-space vector, along with the atomic dictionary, represents a denoised, compressed version of the original signal. The arrangement of the atomic indices in the T-F vector is used for classification. Unsupervised feature learning implemented by a sparse autoencoder learns a single dictionary of basis features from a collection of envelope samples from all speakers. The approach is demonstrated using pairs of speakers from the TIMIT data set. Pairs of speakers are selected randomly from a single district. Each speaker has 10 sentences: two are used for training and eight for testing. Atomic index probabilities are created for each training sentence and also for each test sentence. Classification is performed by finding the lowest Euclidean distance between the probabilities from the training sentences and the test sentences. Training is done at a 30 dB Signal-to-Noise Ratio (SNR). Testing is performed at SNRs of 0 dB, 5 dB, 10 dB and 30 dB. The algorithm has a baseline classification accuracy of ~93% averaged over 10 pairs of speakers from the TIMIT data set. The baseline accuracy is attributable to the short sequences of training and test data as well as the overall simplicity of the classification algorithm. The accuracy is not affected by AWGN and remains ~93% at 0 dB SNR.
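The matching-pursuit step can be sketched compactly. The random Gaussian-windowed sinusoids below are an illustrative stand-in for the paper's learned sparse-autoencoder dictionary, and the signal is built synthetically from two atoms; only the greedy decomposition itself follows the standard algorithm.

```python
import numpy as np

def gabor_dictionary(n, n_atoms=64):
    """Random Gabor-like atoms (Gaussian-windowed sinusoids, unit norm)
    as a stand-in for a learned dictionary."""
    rng = np.random.default_rng(1)
    t = np.arange(n)
    atoms = []
    for _ in range(n_atoms):
        center = rng.uniform(0, n)
        width = rng.uniform(n / 16, n / 4)
        freq = rng.uniform(0.01, 0.45)
        phase = rng.uniform(0, 2 * np.pi)
        g = np.exp(-0.5 * ((t - center) / width) ** 2) \
            * np.cos(2 * np.pi * freq * t + phase)
        atoms.append(g / np.linalg.norm(g))
    return np.array(atoms)                   # shape (n_atoms, n)

def matching_pursuit(x, D, n_iter=20):
    """Greedy matching pursuit: repeatedly pick the atom most correlated
    with the residual, record its index and weight, and subtract it."""
    residual = x.astype(float).copy()
    indices, weights = [], []
    for _ in range(n_iter):
        corr = D @ residual                  # inner products with all atoms
        k = int(np.argmax(np.abs(corr)))
        indices.append(k)
        weights.append(corr[k])
        residual -= corr[k] * D[k]           # remove that atom's contribution
    return indices, weights, residual

n = 256
D = gabor_dictionary(n)
x = 3.0 * D[5] + 1.5 * D[40]                 # signal built from two atoms
idx, w, r = matching_pursuit(x, D, n_iter=10)
print(np.linalg.norm(r) < np.linalg.norm(x)) # residual energy decreases
```

The `(index, weight)` pairs are the sparse "weight space" representation the abstract describes; the classifier then works on statistics of the indices rather than on the waveform.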

Keywords: time-frequency plane, atomic decomposition, envelope sampling, Gabor atoms, matching pursuit, sparse dictionary learning, sparse autoencoder

Procedia PDF Downloads 261
2043 CAG Repeat Polymorphism of Androgen Receptor and Female Sexual Functions in Egyptian Female Population

Authors: Azza Gaber Farag, Yasser Atta Shehata, Sara Elsayed Elghazouly, Mustafa Elsayed Elshaib, Nesreen Gamal Elden Elhelbawy

Abstract:

Background: Androgen receptor (AR) polymorphism in the cytosine-adenine-guanine (CAG) repeat affects the functional capacity of the AR in males. However, little research is available in this field regarding female sexual function. Aim: To investigate the possible link between polymorphism in the CAG repeat of the AR gene and female sexual function in a sample of the Egyptian population. Materials and methods: 500 Egyptian married females completed a questionnaire covering sociodemographic, reproductive, and sexual data. AR CAG repeat length was analyzed in those having female sexual dysfunction (FSD) using real-time PCR. Results: The domain most sensitive to AR CAG repeat length was the orgasm domain, which showed significant positive correlations with the short allele (p = 0.001), long allele (p = 0.015), biallelic mean (p = 0.000), and X-weighted biallelic mean (p = 0.000). The satisfaction domain had significant positive correlations with the biallelic mean (p = 0.035) and the X-weighted biallelic mean (p = 0.032). However, the pain domain showed significant negative correlations with the short allele (p = 0.002), biallelic mean (p = 0.013), and X-weighted biallelic mean (p = 0.011). Conclusions: AR polymorphism could represent a non-negligible aspect of female sexual function. Shorter AR CAG repeat length had a significant impact on FSD, affecting mainly female orgasm, followed by pain disorders, which is ultimately reflected in sexual satisfaction.

Keywords: female sexual dysfunction, androgen receptor, CAG repeat polymorphism, androgen

Procedia PDF Downloads 144
2042 A Deviation Analysis of Career Anchors and Domain Specialization in Management Education

Authors: Santosh Kumar Sharma, Imran Ahmed Khan

Abstract:

Context: In the field of management education, it has been observed that students often have discrepancies between their career anchors and their chosen domain of specialization. This misalignment creates challenges for students during their summer internships and job placements in the corporate sector. The outcome is that some students opt to change their career track or even leave the management profession altogether. This situation poses a significant concern in terms of the overall human capital in the industry. However, there is a notable lack of substantial literature addressing this specific context. Therefore, this current study aims to contribute to the global discourse on management education and its impact on human resource management. Research Aim: The objective of this study is to analyze the deviation between career anchors and domain specialization in the context of management education in India. Methodology: This study adopts an exploratory approach. Data is collected from a substantial sample of post-graduate students who are currently pursuing management education from a renowned business school in India. The data collection process is followed by a descriptive analysis. Findings: The findings of this research contribute to the professional development of management students by highlighting the significance of aligning career anchors with their chosen domain of specialization. This alignment is crucial for enhancing human capital, which in turn impacts various factors within the Indian economy. Theoretical Importance: This study addresses the gap in the existing literature by exploring the relationship between career anchors and domain specialization in management education. By shedding light on this issue, it contributes to theoretical knowledge in the field and provides insights into the importance of career alignment within the management profession.

Keywords: management education, specialization, human resource management, India

Procedia PDF Downloads 34
2041 Body Farming in India and Asia

Authors: Yogesh Kumar, Adarsh Kumar

Abstract:

A body farm is a research facility where research is done on forensic investigation and medico-legal disciplines such as forensic entomology, forensic pathology, forensic anthropology, forensic archaeology, and related areas of forensic veterinary science. The research is done to collect data on the rate of decomposition (animal and human) and on forensically important insects, to assist in crime detection. The data collected are used by forensic pathologists, forensic experts, and other specialists for the investigation of crime cases and for further research. The research work covers the different conditions of a dead body (fresh, bloated, decayed, dry, and skeletonised) and data on local insects, which depend on the climatic conditions of the local areas of each country. It is therefore timely to collect appropriate data under managed conditions with a proper set-up in every country, and it is the duty of the scientific community of every country to establish or propose such facilities for justice and social management. Body farms are also used for the training of police, military, investigative dogs, and other agencies. At present, only four countries, viz. the U.S., Australia, Canada, and the Netherlands, have body farms and related facilities in an organised manner; there is no body farm in Asia. In India, we have been trying to establish a body farm in the A&N Islands, which are near Singapore, Malaysia, and some other Asian countries. In view of the above, it becomes imperative to discuss the matter with Asian countries so that decomposition data can be collected properly by establishing body farms. We can also share data, knowledge, and expertise, collaborating with one another to improve such facilities and to build good scientific relations that promote science and explore ways of investigation at the world level.

Keywords: body farm, rate of decomposition, forensically important flies, time since death

Procedia PDF Downloads 50
2040 A Spatial Information Network Traffic Prediction Method Based on Hybrid Model

Authors: Jingling Li, Yi Zhang, Wei Liang, Tao Cui, Jun Li

Abstract:

Compared with a terrestrial network, the traffic of a spatial information network exhibits both self-similarity and short-range correlation. By studying its traffic prediction methods, the resource utilization of a spatial information network can be improved, and such methods can provide an important basis for its traffic planning. In this paper, considering the accuracy and complexity of the algorithm, the spatial information network traffic is decomposed into an approximate component with long-range correlation and detail components with short-range correlation, and a time-series hybrid prediction model based on wavelet decomposition is proposed to predict the traffic. First, the original traffic data are decomposed into approximate and detail components using a wavelet decomposition algorithm. According to the tailing and truncation characteristics of the autocorrelation and partial autocorrelation functions of each component, the corresponding model (AR/MA/ARMA) for each detail component can be established directly, while the approximate component is modelled with an ARIMA model after smoothing. Finally, the prediction results of the individual models are combined to obtain the prediction for the original data. The method not only considers the self-similarity of a spatial information network but also takes into account the short-range correlation caused by bursty network traffic. It is verified using measured data from a backbone network released by the MAWI working group in 2018. Compared with typical time-series models, the data predicted by the hybrid model are closer to the real traffic and have a smaller relative root-mean-square error, making the model more suitable for a spatial information network.
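The decompose-predict-recombine scheme can be sketched in miniature. The snippet below substitutes a single-level Haar transform for the paper's wavelet decomposition and a plain least-squares AR fit for the full AR/MA/ARMA/ARIMA model selection, and the traffic trace is synthetic (smooth trend plus noise); it is a structural sketch, not the paper's method.

```python
import numpy as np

def haar_decompose(x):
    """One-level Haar decomposition: approximation (long-range trend)
    and detail (short-range fluctuation) components."""
    x = x[: len(x) // 2 * 2]
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def ar_predict(series, order=2):
    """Fit an AR(order) model by least squares, predict one step ahead."""
    X = np.column_stack([series[i: len(series) - order + i]
                         for i in range(order)])
    y = series[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(series[-order:] @ coef)

# Hypothetical traffic trace: smooth periodic trend plus bursty noise
rng = np.random.default_rng(2)
t = np.arange(512)
traffic = 10 + 2 * np.sin(2 * np.pi * t / 64) + rng.normal(0, 0.3, 512)

approx, detail = haar_decompose(traffic)
# Predict each component separately, then recombine (inverse Haar step)
a_next = ar_predict(approx)
d_next = ar_predict(detail)
pred_pair = ((a_next + d_next) / np.sqrt(2),
             (a_next - d_next) / np.sqrt(2))
print(pred_pair)
```

Each component is modelled on its own, matching its correlation structure, and the component forecasts are mapped back through the inverse transform, which is the essence of the hybrid model.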

Keywords: spatial information network, traffic prediction, wavelet decomposition, time series model

Procedia PDF Downloads 110
2039 Fast Fourier Transform-Based Steganalysis of Covert Communications over Streaming Media

Authors: Jinghui Peng, Shanyu Tang, Jia Li

Abstract:

Steganalysis seeks to detect the presence of secret data embedded in cover objects, and there is a pressing demand to detect hidden messages in streaming media. This paper shows how a steganalysis algorithm based on the Fast Fourier Transform (FFT) can be used to detect secret data embedded in streaming media. The proposed algorithm uses machine parameter characteristics and a network sniffer to determine whether the Internet traffic contains streaming channels. The detected streaming data are then transferred from the time domain to the frequency domain through the FFT. The distributions of power spectra in the frequency domain of original VoIP streams and stego VoIP streams are compared in turn using a t-test, yielding a p-value of 7.5686E-176, far below the significance threshold. The results indicate that the proposed FFT-based steganalysis algorithm is effective in detecting secret data embedded in VoIP streaming media.
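The spectral-comparison step can be illustrated as follows. Synthetic white-noise frames stand in for VoIP payloads, and embedding is modeled as additive noise; both are assumptions for the sketch, not the paper's actual embedding or feature set.

```python
import numpy as np
from scipy import stats

def power_spectrum_db(frames):
    """Per-frame power spectra (dB) of Hann-windowed signal frames."""
    win = np.hanning(frames.shape[1])
    spec = np.abs(np.fft.rfft(frames * win, axis=1)) ** 2
    return 10 * np.log10(spec + 1e-12)

rng = np.random.default_rng(3)
frame_len, n_frames = 256, 200

# Hypothetical cover stream vs. stego stream: embedding is modeled
# here as additive noise that perturbs the power spectrum.
cover = rng.normal(0, 1, (n_frames, frame_len))
stego = cover + rng.uniform(-1, 1, (n_frames, frame_len))

# Compare mean per-frame spectral power with a two-sample t-test
p_cover = power_spectrum_db(cover).mean(axis=1)
p_stego = power_spectrum_db(stego).mean(axis=1)
t_stat, p_value = stats.ttest_ind(p_cover, p_stego)
print(p_value)
```

A tiny p-value indicates the two spectral distributions differ, which is the decision criterion the abstract reports (its 7.5686E-176 figure comes from real VoIP data, not from this toy setup).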

Keywords: steganalysis, security, Fast Fourier Transform, streaming media

Procedia PDF Downloads 118
2038 Comparison of Soils of Hungarian Dry and Humid Oak Forests Based on Changes in Nutrient Content

Authors: István Fekete, Imre Berki, Áron Béni, Katalin Juhos, Marianna Makádi, Zsolt Kotroczó

Abstract:

The average annual precipitation significantly influences the moisture content of soils and, through this, the decomposition of organic substances in the soils, the leaching of nutrients, and the soil pH. Climate change, together with the lengthening of the vegetation period and increasing CO₂ levels, can increase the amount of biomass formed. Degradation processes, which accelerate as the temperature increases and slow down under a drying climate, and changes in the degree of leaching can cancel out or reinforce each other's effects. In the course of our research, we looked for oak forests with climate-zonal soils where the geological, geographical and ecological background conditions are as similar as possible, apart from the different annual precipitation averages and the differences that can arise from them. We examined the soils of 5 dry and 5 humid Hungarian oak forests. Climate change affects the soils of drier and wetter forests differently. The aim of our research was to compare the carbon, nitrogen and other nutrient contents, as well as the pH, of the soils of humid and dry forests, thereby showing the effects of the drier climate on the tested soil parameters.
In the case of the examined forest soils, we found significant differences between the soils of dry and humid forests: in annual average precipitation (p < 0.0001; dry forest soils: 564 ± 5.2 mm, humid forest soils: 716 ± 3.8 mm), pH (p = 0.0004; dry: 5.49 ± 0.16, humid: 5.36 ± 0.21), C content (p = 0.0054; dry: 6.92% ± 0.59, humid: 3.09% ± 0.24), N content (p = 0.0022; dry: 0.44% ± 0.047, humid: 0.23% ± 0.013), K content (p = 0.0017; dry: 5684 ± 732 mg/kg, humid: 2169 ± 196 mg/kg), and Ca content (p = 0.0096; dry: 8207 ± 2118 mg/kg, humid: 957 ± 320 mg/kg). No significant difference was found for Mg. In a wetter environment, especially if the soil moisture content is also optimal for decomposer organisms during the growing season, the decomposition of organic residues accelerates and leaching from the soil intensifies. The different intensities of the leaching processes are well reflected in the quantitative differences in Ca and K and, in connection with these, in the difference in pH values. The differences in C and N content can be explained by differences in the intensity of the decomposition processes. In addition to warming, drying is expected in a significant part of Hungary due to climate change. Thus, the comparison of the soils of dry and humid forests allows us to predict subsequent changes in the examined parameters.
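The group comparisons above can be reproduced in form with a standard two-sample test. The samples below are synthetic, generated from the reported values for soil C content under the assumptions that the ± figures are standard errors and that n = 5 sites per group; they illustrate the test, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Synthetic stand-ins from the reported soil C content
# (dry: 6.92% ± 0.59; humid: 3.09% ± 0.24), assuming the ± values
# are standard errors and n = 5 sites per group.
n = 5
dry_c = rng.normal(6.92, 0.59 * np.sqrt(n), n)
humid_c = rng.normal(3.09, 0.24 * np.sqrt(n), n)

# Welch's t-test: no equal-variance assumption between groups
t_stat, p_value = stats.ttest_ind(dry_c, humid_c, equal_var=False)
print(dry_c.mean() > humid_c.mean(), p_value)
```

With group means this far apart relative to their spread, the test flags the difference, mirroring the small p-values reported for C, N, K and Ca.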

Keywords: soil nutrients, precipitation difference, climate change, organic matter decomposition, leaching

Procedia PDF Downloads 48
2037 Using Textual Pre-Processing and Text Mining to Create Semantic Links

Authors: Ricardo Avila, Gabriel Lopes, Vania Vidal, Jose Macedo

Abstract:

This article offers an approach to the automatic discovery of semantic concepts and links in the domain of Oil Exploration and Production (E&P). Machine learning methods combined with textual pre-processing techniques were used to detect local patterns in texts and thus generate new concepts and new semantic links. Even with the more specific vocabularies of the oil domain, our approach achieved satisfactory results, suggesting that the proposal can be applied to other domains and languages, requiring only minor adjustments.
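One common way to surface candidate semantic links from pre-processed text is to compare TF-IDF vectors; the sketch below is a generic, stdlib-plus-NumPy illustration of that idea with invented E&P-flavoured snippets, not the authors' actual pipeline.

```python
import re
from collections import Counter
import numpy as np

def tfidf_vectors(docs):
    """Minimal TF-IDF matrix: rows are documents, columns are terms."""
    tokenized = [re.findall(r"[a-z]+", d.lower()) for d in docs]
    vocab = sorted(set(w for toks in tokenized for w in toks))
    idx = {w: i for i, w in enumerate(vocab)}
    df = Counter(w for toks in tokenized for w in set(toks))
    mat = np.zeros((len(docs), len(vocab)))
    for r, toks in enumerate(tokenized):
        for w, c in Counter(toks).items():
            # term frequency weighted by inverse document frequency
            mat[r, idx[w]] = (c / len(toks)) * np.log(len(docs) / df[w])
    return mat

docs = [
    "drilling fluid pressure in the exploration well",
    "well pressure measured while drilling",
    "seismic survey of the offshore basin",
]
m = tfidf_vectors(docs)
norms = np.linalg.norm(m, axis=1, keepdims=True)
sim = (m / norms) @ (m / norms).T    # cosine similarity between documents
# Documents 0 and 1 share drilling/pressure vocabulary -> stronger link
print(sim[0, 1] > sim[0, 2])
```

Pairs whose similarity exceeds a threshold become candidate semantic links; in a SKOS setting such pairs would then be reviewed and encoded as `skos:related` or narrower/broader relations.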

Keywords: semantic links, data mining, linked data, SKOS

Procedia PDF Downloads 142
2036 Measuring the Resilience of e-Governments Using an Ontology

Authors: Onyekachi Onwudike, Russell Lock, Iain Phillips

Abstract:

The variability that exists across governments, their departments and the provisioning of services has been an area of concern in the E-Government domain. There is a need for reuse and integration across government departments, which is accompanied by varying degrees of risks and threats. There is also the need for assessment, prevention, preparation, response and recovery when dealing with these risks or threats. The ability of a government to cope with the emerging changes that occur within it is known as resilience. In order to forge ahead with concerted efforts to manage the risks and threats induced by reuse and integration, the ambiguities contained within resilience must be addressed. Enhancing resilience in the E-Government domain is synonymous with reducing the risks governments face in provisioning services as well as in reusing components across departments. Therefore, it can be said that resilience is responsible for reducing a government's vulnerability to change. In this paper, we present the use of an ontology to measure the resilience of governments. This ontology is built on a well-defined construct for a taxonomy of resilience. A specific class, 'Resilience Requirements', is added to the ontology; this class incorporates the concept of resilience into the E-Government domain ontology. Considering that the E-Government domain is a highly complex one, made up of different departments offering different services, its reliability and resilience have become more complex and critical to understand. We present questions that can help a government assess how prepared it is in the face of risks and what steps can be taken to recover from them. These questions can be asked with the use of queries. The ontology includes a case study section that is used to explore ways in which government departments can become resilient to the different kinds of risks and threats they may face.
A collection of resilience tools and resources has been developed in our ontology to encourage governments to prepare for the emergencies and risks they may face with the integration of departments and the reuse of components across government departments. To achieve this, the ontology has been extended with rules. We present two tools for understanding resilience in the E-Government domain as a risk analysis target, together with the output of these tools when applied to resilience in the E-Government domain. We introduce the classification of resilience using the defined taxonomy and the modelling of existing relationships based on that taxonomy. The ontology is constructed on formal theory and provides a semantic reference framework for the concept of resilience. Key terms that fall under the purview of resilience with respect to E-Governments are defined. Terms are made explicit, as are the relationships that exist between risks and resilience. The overall aim of the ontology is to use it within standards that would be followed by all governments for government-based resilience measures.
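The kind of preparedness query the abstract describes can be mocked up without any ontology tooling. The sketch below is stdlib-only; the department names and registered measures are invented for the example, and only the five life-cycle phases (assessment, prevention, preparation, response, recovery) come from the abstract itself.

```python
# Resilience life-cycle phases named in the abstract
RESILIENCE_PHASES = ["assessment", "prevention", "preparation",
                     "response", "recovery"]

# Hypothetical departments and their registered resilience measures
departments = {
    "Tax": {"assessment": ["annual risk audit"],
            "response": ["incident hotline"]},
    "Health": {phase: ["documented measure"]
               for phase in RESILIENCE_PHASES},
}

def uncovered_phases(dept):
    """Phases of the resilience life cycle with no registered measure,
    i.e. unmet 'Resilience Requirements' for that department."""
    measures = departments.get(dept, {})
    return [p for p in RESILIENCE_PHASES if not measures.get(p)]

for dept in departments:
    print(dept, uncovered_phases(dept))
```

In the actual ontology the same question would be posed as a query over the 'Resilience Requirements' class rather than over a Python dictionary, but the shape of the answer, a list of uncovered phases per department, is the same.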

Keywords: e-government, ontology, relationships, resilience, risks, threats

Procedia PDF Downloads 316
2035 Numerical Simulation of Supersonic Gas Jet Flows and Acoustics Fields

Authors: Lei Zhang, Wen-jun Ruan, Hao Wang, Peng-Xin Wang

Abstract:

The jet noise source is generated by the rocket exhaust plume during rocket engine testing. A domain decomposition approach is applied to jet noise prediction in this paper. The aerodynamic noise computation is based on splitting the problem into acoustic source generation and sound propagation in separate physical domains. Large Eddy Simulation (LES) is used to simulate the supersonic jet flow. Based on the simulated flow fields, the distribution of the jet noise sound pressure level is obtained by applying the Ffowcs Williams-Hawkings (FW-H) acoustic equation and a Fourier transform. The calculation results show that complex structures of expansion waves, compression waves and the turbulent boundary layer can occur due to the strong interaction between the gas jet and the ambient air. In addition, the jet core region, the shock cells and the sound pressure level of the gas jet grow as the nozzle size increases. Importantly, the numerical simulation results of the far-field sound are in good agreement with the experimental measurements in directivity.

Keywords: supersonic gas jet, Large Eddy Simulation (LES), acoustic noise, Ffowcs Williams-Hawkings (FW-H) equations, nozzle size

Procedia PDF Downloads 382
2034 Fine Grained Action Recognition of Skateboarding Tricks

Authors: Frederik Calsius, Mirela Popa, Alexia Briassouli

Abstract:

In the field of machine learning, it is common practice to use benchmark datasets to demonstrate the effectiveness of a method. The domain of action recognition in videos often uses datasets like Kinetics, Something-Something, UCF-101 and HMDB-51 to report results. Considering the properties of these datasets, none focus solely on very short clips (2 to 3 seconds) and on highly similar, fine-grained actions within one specific domain. This paper researches how current state-of-the-art action recognition methods perform on a dataset that consists of highly similar, fine-grained actions. To do so, a dataset of skateboarding tricks was created. The performed analysis highlights both benefits and limitations of state-of-the-art methods, while proposing future research directions in the activity recognition domain. The conducted research shows that the best results are obtained by fusing RGB data with OpenPose data for the Temporal Shift Module.

Keywords: activity recognition, fused deep representations, fine-grained dataset, temporal modeling

Procedia PDF Downloads 200
2033 Implementation in Python of a Method to Transform One-Dimensional Signals in Graphs

Authors: Luis Andrey Fajardo Fajardo

Abstract:

We are immersed in complex systems. The human brain, galaxies and snowflakes are examples of complex systems. An area of interest in complex systems is chaos theory. This revolutionary field of science presents ways of study different from determinism and reductionism. Here, in conjunction with nonlinear DSP, chaos theory offers valuable techniques that establish a link between time series and complexity theory in terms of complex networks, so that signals can be studied from the perspective of graph theory. Recently, a method was proposed to transform time series into graphs, but no one had developed a suitable implementation in Python for signals extracted from chaotic or complex systems. We therefore propose a Python implementation of an existing method for transforming one-dimensional chaotic signals from the time domain to the graph domain, together with measures that may reveal information not extracted in the time domain.
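The abstract does not name the specific time-series-to-graph mapping; the natural visibility graph is a commonly used one and is shown here as a plausible instance, applied to a chaotic signal from the logistic map.

```python
from itertools import combinations
import numpy as np

def natural_visibility_graph(series):
    """Natural visibility graph: nodes are samples; i and j are linked if
    every intermediate sample lies strictly below the straight line
    joining (i, series[i]) and (j, series[j])."""
    n = len(series)
    edges = set()
    for i, j in combinations(range(n), 2):
        visible = all(
            series[k] < series[i]
            + (series[j] - series[i]) * (k - i) / (j - i)
            for k in range(i + 1, j)
        )
        if visible:
            edges.add((i, j))
    return edges

# Chaotic test signal: the logistic map in its chaotic regime (r = 4)
x = [0.4]
for _ in range(99):
    x.append(4 * x[-1] * (1 - x[-1]))

edges = natural_visibility_graph(np.array(x))
degrees = np.zeros(len(x), int)
for i, j in edges:
    degrees[i] += 1
    degrees[j] += 1
print(len(edges), degrees.max())
```

Adjacent samples are always mutually visible, so the graph is connected by construction; graph measures such as the degree distribution can then be computed on `edges` to characterise the dynamics, which is the kind of graph-domain information the abstract refers to.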

Keywords: Python, complex systems, graph theory, dynamical systems

Procedia PDF Downloads 481