Search results for: background noise statistical modeling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12785

10175 Oral Hygiene Behaviors among Pregnant Women with Diabetes Who Attend Primary Health Care Centers at Baghdad City

Authors: Zena F. Mushtaq, Iqbal M. Abbas

Abstract:

Background: Diabetes mellitus during pregnancy is one of the major medical and social problems, with increasing prevalence in recent decades, and may make women more vulnerable to dental problems and at increased risk of periodontal diseases. Objectives: To assess oral hygiene behaviors among pregnant women with diabetes who attended primary health care centers and to find out the relationship between oral hygiene behaviors and the studied variables. Methodology: A cross-sectional study was conducted from 7 July to 30 September 2014 on a non-probability (convenience) sample of 150 pregnant women with diabetes selected from twelve primary health care centers in Baghdad city. Data were collected with a questionnaire designed in three main parts: socio-demographic characteristics, reproductive characteristics, and items on oral hygiene behaviors among pregnant women with diabetes. Reliability of the questionnaire was determined through internal consistency (correlation coefficient R = 0.940), and content validity was determined through review by 12 experts in different specialties and through a pilot study. Descriptive and inferential statistics were used to analyze the collected data. Results: 35.3% of the study sample was 35-39 years old (mean ± SD = 33.57 ± 5.54 years), 34.7% of the sample had primary school education or less, half of the sample was in government employment or self-employed, 42.7% had moderate socioeconomic status, and the highest percentage (70.0%) were nonsmokers. Oral hygiene behaviors had moderate mean scores on all items, and there was no statistically significant association between the oral hygiene domain and the studied variables. Conclusions: All items of health behavior concerning oral hygiene had moderate mean scores, which may expose pregnant women with diabetes to a high risk of periodontal diseases. Recommendations: Dental care providers should perform a dental examination at least every three months for each pregnant woman with diabetes, explain the effect of DM on periodontal health, give oral hygiene instruction, and provide oral prophylaxis, professional cleaning, and treatment of periodontal diseases (scaling and root planing) when needed.

Keywords: diabetes, health behavior, pregnant women, oral hygiene

Procedia PDF Downloads 290
10174 Periodically Forced Oscillator with Noisy Chaotic Dynamics

Authors: Adedayo Oke Adelakun

Abstract:

The chaotic dynamics of periodically forced oscillators with smooth potentials have been extensively investigated via theoretical, numerical, and experimental simulations. With the study of chaotic dynamics by means of multiple time scale analysis, Melnikov theory, bifurcation diagrams, Poincaré maps, and Lyapunov exponents, it has become necessary to seek a better understanding of nonlinear oscillators with a noisy term. In this paper, we examine the influence of noise on the complex dynamical behaviour of the periodically forced F6-Duffing oscillator for a specific choice of noisy parameters. The inclusion of the noisy term enriches the dynamical behaviour of the oscillator, which may give it wider application in secure communication than the smooth potential.
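
As a rough illustration of the kind of system studied here, the sketch below integrates a periodically forced Duffing-type oscillator with additive Gaussian white noise using the Euler-Maruyama scheme. The sixth-order (F6-type) potential and all coefficient values are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np

# Euler-Maruyama integration of a noisy, periodically forced oscillator:
#   x'' + delta*x' + dV/dx = f*cos(omega*t) + sigma*xi(t)
# with an assumed sixth-order potential V(x) = a*x^2/2 + b*x^4/4 + c*x^6/6.

def simulate(a=-1.0, b=-0.5, c=1.0, delta=0.2, f=0.3, omega=1.0,
             sigma=0.1, dt=1e-3, n_steps=100_000, seed=0):
    rng = np.random.default_rng(seed)
    x, v = 0.1, 0.0
    xs = np.empty(n_steps)
    for i in range(n_steps):
        t = i * dt
        dV_dx = a * x + b * x**3 + c * x**5
        # deterministic drift plus stochastic forcing on the velocity
        v += (-delta * v - dV_dx + f * np.cos(omega * t)) * dt \
             + sigma * np.sqrt(dt) * rng.standard_normal()
        x += v * dt
        xs[i] = x
    return xs

trajectory = simulate()
print(trajectory[-5:])  # late-time samples of the noisy trajectory
```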

Keywords: hierarchical structure, periodically forced oscillator, noisy parameters, dynamical behaviour, F6-Duffing oscillator

Procedia PDF Downloads 329
10173 Automatic Method for Classification of Informative and Noninformative Images in Colonoscopy Video

Authors: Nidhal K. Azawi, John M. Gauch

Abstract:

Colorectal cancer is one of the leading causes of cancer death in the US and the world, which is why millions of colonoscopy examinations are performed annually. Unfortunately, noise, specular highlights, and motion artifacts corrupt many images in a typical colonoscopy exam. The goal of our research is to produce automated techniques to detect and correct or remove these noninformative images from colonoscopy videos, so physicians can focus their attention on informative images. In this research, we first automatically extract features from images. Then we use machine learning and deep neural networks to classify colonoscopy images as either informative or noninformative. Our results show that we achieve image classification accuracy between 92% and 98%. We also show how the removal of noninformative images, together with image alignment, can aid in the creation of image panoramas and other visualizations of colonoscopy images.
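
A minimal sketch of the informative/noninformative classification step, assuming simple hand-crafted features (Laplacian variance as a sharpness measure and the fraction of near-saturated pixels as a specular-highlight measure) and a random forest; the features, the classifier, and the synthetic frames below are illustrative stand-ins for the authors' actual feature extraction and deep-network pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def frame_features(img):
    # simple 4-neighbour Laplacian response; low variance suggests blur
    lap = (-4 * img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    specular = np.mean(img > 0.95)  # fraction of near-saturated pixels
    return [lap.var(), specular]

rng = np.random.default_rng(0)
# synthetic stand-ins: sharp frames vs. blurred, low-contrast frames
sharp = [rng.random((64, 64)) for _ in range(50)]
blurred = [np.full((64, 64), 0.5) + 0.02 * rng.random((64, 64))
           for _ in range(50)]
X = np.array([frame_features(f) for f in sharp + blurred])
y = np.array([1] * 50 + [0] * 50)  # 1 = informative, 0 = noninformative

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```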

Keywords: colonoscopy classification, feature extraction, image alignment, machine learning

Procedia PDF Downloads 256
10172 Electroforming of 3D Digital Light Processing Printed Sculptures Used as a Low Cost Option for Microcasting

Authors: Cecile Meier, Drago Diaz Aleman, Itahisa Perez Conesa, Jose Luis Saorin Perez, Jorge De La Torre Cantero

Abstract:

In this work, two ways of creating small-sized metal sculptures are proposed: the first by means of microcasting and the second by electroforming, from models printed in 3D using an FDM (Fused Deposition Modeling) printer or a DLP (Digital Light Processing) printer. It is viable to replace the wax in artistic foundry processes with 3D printed objects. In this technique, the digital models are manufactured in polylactic acid (PLA) using a low-cost FDM printer. This material is used because its properties make it a viable substitute for wax within the processes of artistic casting with the lost-wax technique of ceramic shell casting. This technique consists of covering a sculpture of wax, or in this case PLA, with several layers of thermoresistant material. This material is heated to melt the PLA, leaving an empty mold that is later filled with the molten metal. It is verified that PLA models reduce the cost and time compared with hand modeling in wax. In addition, parts can be manufactured with 3D printing that are not possible to create with manual techniques. However, the sculptures created with this technique have a size limit. The problem is that when pieces printed in PLA are very small, they lose detail, and the laminar texture hides the shape of the piece. A DLP-type printer allows obtaining more detailed and smaller pieces than FDM. Such small models are quite difficult and complex to melt using the lost-wax technique of ceramic shell casting. As alternatives, there are microcasting and electroforming, which specialize in creating small metal pieces such as jewelry. Microcasting is a variant of lost-wax casting that consists of introducing the model into a cylinder into which the refractory material is also poured. The molds are heated in an oven to melt the model and cure them. Finally, the metal is poured into the still-hot cylinders, which rotate in a machine at high speed to distribute the metal properly. Because microcasting requires expensive materials and machinery to melt a piece of metal, electroforming is an alternative to this process. Electroforming can use models in different materials; for this study, micro-sculptures printed in 3D are used. These are subjected to an electroforming bath that covers the pieces with a very thin layer of metal. This work investigates the recommended sizes for 3D printing, both in PLA and in resin, and first tests are being carried out to validate the use of the electroforming process on micro-sculptures printed in resin using a DLP printer.

Keywords: sculptures, DLP 3D printer, microcasting, electroforming, fused deposition modeling

Procedia PDF Downloads 139
10171 Comparative Study of Gonadotropin Hormones and Sperm Parameters in Two Age Groups

Authors: G. Murtaza, H. Faiza, M. Rafiq, S. Gul, F. Raza, Sarwat Anjum

Abstract:

Our objective was to investigate whether, and how extensively, there is a correlation between aging in men, gonadotropin hormone regulation, and a decline in sperm parameters, and whether it is possible to identify an age limit beyond which the decrease in sperm features and hormonal regulation reaches statistical significance. A total of one hundred and twenty men (age: 20-50 years) who visited the Center for Reproductive Medicine (CRM) at Peshawar General Hospital (PGH), Peshawar, Pakistan, were divided into two groups of 60 males each (Group A, young, aged 20-35 years, and Group B, older, aged 36-50 years). Clinical assessment and sperm analysis were performed. Hormone testing and semen analysis were carried out in accordance with World Health Organization (WHO) guidelines, and hormone levels, sperm morphology, and the total motile spermatozoa count (TMS) were computed. SPSS 20.0 (SPSS Inc., Chicago, IL, USA) was used for the statistical analysis. It was observed that the testosterone levels in Group A (mean = 3.770) and Group B (mean = 3.995) were comparable, with a significant P-value < 0.005 in both age groups. Furthermore, similar levels were shown by follicle-stimulating hormone (FSH) (Group A mean = 19.73, Group B mean = 15.64) and luteinizing hormone (LH) (Group A mean = 12.25, Group B mean = 11.93) in both groups, with a significant P < 0.005. Sperm concentrations were most similar, with a mean of 4.44 in Group A and 4.42 in Group B and a significant P-value of 0.005 in both groups. Additionally, sperm motility was higher in Group A, with a mean of 22.40 and a P-value of 0.052, which was non-significant when compared to Group B. Morphological differences were also observed between the two age groups. This research found that advancing male age does not affect sex hormone regulation; in contrast, the fraction of motile and morphologically normal spermatozoa decreases as male age increases, with the strongest evidence when age exceeds 40 years. To clarify the causes and clinical implications of these correlations, more research is necessary.

Keywords: gonadotropins, motility, spermatozoa, testosterone

Procedia PDF Downloads 44
10170 Classification of ECG Signal Based on Mixture of Linear and Non-Linear Features

Authors: Mohammad Karimi Moridani, Mohammad Abdi Zadeh, Zahra Shahiazar Mazraeh

Abstract:

In recent years, the use of intelligent systems in biomedical engineering has increased dramatically, especially in the diagnosis of various diseases. Also, owing to the relatively simple recording of the electrocardiogram (ECG) signal, this signal is a good tool to show the function of the heart and the diseases associated with it. The aim of this paper is to design an intelligent system for automatically distinguishing a normal electrocardiogram signal from an abnormal one. Using this diagnostic system, it is possible to identify a person's heart condition in a very short time and with high accuracy. The data used in this article are from the PhysioNet database, made available in 2016 for researchers seeking the best method for detecting normal signals from abnormal ones. The data come from both genders, the recording time varies between several seconds and several minutes, and all records are labeled normal or abnormal. Due to the limited positional accuracy and duration of the ECG signal, and the similarity of the signal in some diseases to the normal signal, the heart rate variability (HRV) signal was used. Measuring and analyzing heart rate variability over time to evaluate the activity of the heart and to differentiate types of heart failure from one another is of interest to experts. In the preprocessing stage, after noise cancellation with an adaptive Kalman filter and extraction of the R wave with the Pan-Tompkins algorithm, R-R intervals were extracted and the HRV signal was generated. In the processing stage, a new idea is presented: in addition to using the statistical characteristics of the signal, a return map is created and nonlinear characteristics of the HRV signal are extracted, owing to the nonlinear nature of the signal. Finally, artificial neural networks, widely used in the field of ECG signal processing, together with the distinctive features, were used to classify the normal signals from the abnormal ones. To evaluate the efficiency of the proposed classifiers, the area under the ROC curve (AUC) was used. The results of the simulation in the MATLAB environment showed that the AUC of the MLP neural network and the SVM was 0.893 and 0.947, respectively. The results of the proposed algorithm also indicated that greater use of nonlinear characteristics in classifying patient signals from normal ones gave better performance. Today, research aims at quantitatively analyzing the linear and nonlinear, or deterministic and random, nature of the heart rate variability signal, because it has been shown that these properties can indicate the health status of an individual's heart. The study of the nonlinear behavior and dynamics of the heart's neural control system in the short and long term provides new information on how the cardiovascular system functions and has driven the development of research in this field. Given that the ECG signal contains important information and is one of the common tools used by physicians to diagnose heart disease, but has limited temporal accuracy and hides some of its information from the physician's view, the intelligent system proposed in this paper can help physicians diagnose normal and patient individuals with greater speed and accuracy and can be used as a complementary system in treatment centers.
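
The return-map idea mentioned above can be made concrete with a Poincaré plot of successive R-R intervals. The sketch below, assuming a synthetic R-R series, computes two standard time-domain features (SDNN, RMSSD) and the nonlinear Poincaré descriptors SD1/SD2 that could feed such a classifier; it illustrates the feature types, not the paper's exact feature set.

```python
import numpy as np

def hrv_features(rr):
    rr = np.asarray(rr, dtype=float)           # R-R intervals in seconds
    diff = np.diff(rr)
    sdnn = rr.std(ddof=1)                      # overall variability
    rmssd = np.sqrt(np.mean(diff ** 2))        # short-term variability
    # Poincare plot (return map of rr[n] vs rr[n+1]) descriptors
    sd1 = np.sqrt(0.5) * diff.std(ddof=1)      # width: beat-to-beat changes
    sd2 = np.sqrt(max(2 * sdnn**2 - sd1**2, 0.0))  # length: long-term changes
    return {"SDNN": sdnn, "RMSSD": rmssd, "SD1": sd1, "SD2": sd2}

# synthetic example: 0.8 s mean R-R interval with mild variability
rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * rng.standard_normal(300)
print(hrv_features(rr))
```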

Keywords: heart rate variability, signal processing, linear and non-linear features, classification methods, ROC curve

Procedia PDF Downloads 266
10169 Trajectories of Depression Anxiety and Stress among Breast Cancer Patients: Assessment at First Year of Diagnosis

Authors: Jyoti Srivastava, Sandhya S. Kaushik, Mallika Tewari, Hari S. Shukla

Abstract:

Little information is available about the development of psychological well-being over time among women undergoing treatment for breast cancer. The aim of this study was to identify the trajectories of depression, anxiety, and stress among women with early-stage breast cancer. Of the 48 Indian women with newly diagnosed early-stage breast cancer recruited from the surgical oncology unit, 39 completed an interview and were assessed for depression, anxiety, and stress (Depression Anxiety Stress Scale, DASS-21) before their first course of chemotherapy (baseline) and at follow-up interviews 3, 6, and 9 months thereafter. Growth mixture modeling was used to identify distinct trajectories of depression, anxiety, and stress symptoms, and logistic regression analysis was used to evaluate the characteristics of women in the distinct groups. Most women showed mild to moderate levels of depression and anxiety (68%) and normal to mild levels of stress (71%), but one in eleven women was chronically anxious (9%) and depressed (9%). Young age, having a partner, shorter education, and receiving chemotherapy but not radiotherapy may characterize women whose psychological symptoms remain strong nine months after diagnosis. By looking beyond the mean, it was found that several socio-demographic and treatment factors characterized the women whose depression, anxiety, and stress levels remained severe even nine months after diagnosis. The results suggest that support provided to cancer patients should place a special focus on the relatively small group of patients most in need.

Keywords: psychological well being, growth mixture modeling, logistic regression analysis, socio-demographic factors

Procedia PDF Downloads 154
10168 Modeling Core Flooding Experiments for Co₂ Geological Storage Applications

Authors: Avinoam Rabinovich

Abstract:

CO₂ geological storage is a proven technology for reducing anthropogenic carbon emissions, which is paramount for achieving the ambitious net zero emissions goal. Core flooding experiments are an important step in any CO₂ storage project, allowing us to gain information on the flow of CO₂ and brine in the porous rock extracted from the reservoir. This information is important for understanding basic mechanisms related to CO₂ geological storage as well as for reservoir modeling, which is an integral part of a field project. In this work, a different method for constructing accurate models of CO₂-brine core flooding will be presented. Results for synthetic cases and real experiments will be shown and compared with numerical models to exhibit their predictive capabilities. Furthermore, the various mechanisms which impact the CO₂ distribution and trapping in the rock samples will be discussed, and examples from models and experiments will be provided. The new method entails solving an inverse problem to obtain a three-dimensional permeability distribution which, along with the relative permeability and capillary pressure functions, constitutes a model of the flow experiments. The model is more accurate when data from a number of experiments are combined to solve the inverse problem. This model can then be used to test various other injection flow rates and fluid fractions which have not been tested in experiments. The models can also be used to bridge the gap between small-scale capillary heterogeneity effects (sub-core and core scale) and large-scale (reservoir scale) effects, known as the upscaling problem.

Keywords: CO₂ geological storage, residual trapping, capillary heterogeneity, core flooding, CO₂-brine flow

Procedia PDF Downloads 77
10167 Mathematical Modeling of Avascular Tumor Growth and Invasion

Authors: Meitham Amereh, Mohsen Akbari, Ben Nadler

Abstract:

Cancer has been recognized as one of the most challenging problems in biology and medicine. Aggressive tumors are a lethal type of cancer characterized by high genomic instability, rapid progression, invasiveness, and therapeutic resistance. Their behavior involves complicated molecular biology and consequential dynamics. Although tremendous effort has been devoted to developing therapeutic approaches, there is still a huge need for new insights into the dark aspects of tumors. As one of the key requirements for better understanding the complex behavior of tumors, mathematical modeling, and continuum physics in particular, plays a pivotal role. Mathematical modeling can provide quantitative predictions of biological processes and help interpret complicated physiological interactions in the tumor microenvironment. The pathophysiology of aggressive tumors is strongly affected by extracellular cues such as the stresses produced by mechanical forces between the tumor and the host tissue. During tumor progression, the growing mass displaces the surrounding extracellular matrix (ECM), and, depending on the tissue stiffness, stress accumulates inside the tumor. The produced stress can influence the tumor by breaking adherens junctions. During this process, the tumor stops rapid proliferation and begins to remodel its shape to preserve the homeostatic equilibrium state. To achieve this, the tumor in turn upregulates epithelial-to-mesenchymal transition-inducing transcription factors (EMT-TFs). These EMT-TFs are involved in various signaling cascades, which are often associated with tumor invasiveness and malignancy. In this work, we modeled the tumor as a growing hyperplastic mass and investigated the effects of mechanical stress from the surrounding ECM on tumor invasion. The invasion is modeled as a volume-preserving inelastic evolution. In this framework, principal balance laws are considered for tumor mass, linear momentum, and diffusion of nutrients. Mechanical interactions between the tumor and the ECM are modeled using the Ciarlet constitutive strain energy function, and the dissipation inequality is utilized to model the volumetric growth rate. System parameters, such as the rate of nutrient uptake and cell proliferation, were obtained experimentally. To validate the model, human glioblastoma multiforme (hGBM) tumor spheroids were embedded in a Matrigel/alginate composite hydrogel and injected into a microfluidic chip to mimic the tumor's natural microenvironment. The invasion structure was analyzed by imaging the spheroid over time, and the expression of transcription factors involved in invasion was measured by immunostaining the tumor. The volumetric growth, stress distribution, and inelastic evolution of the tumors were predicted by the model. Results showed that the level of invasion is in direct correlation with the level of predicted stress within the tumor. Moreover, the invasion length measured by fluorescence imaging was shown to be related to the inelastic evolution of the tumors obtained by the model.

Keywords: cancer, invasion, mathematical modeling, microfluidic chip, tumor spheroids

Procedia PDF Downloads 115
10166 Semantic Indexing Improvement for Textual Documents: Contribution of Classification by Fuzzy Association Rules

Authors: Mohsen Maraoui

Abstract:

With the aim of improving natural language processing applications such as information retrieval, machine translation, and lexical disambiguation, we focus on a statistical approach to semantic indexing for multilingual text documents based on conceptual network formalism. We propose to use this formalism as an indexing language to represent the descriptive concepts and their weighting; these concepts represent the content of the document. Our contribution is based on two steps. In the first step, we propose the extraction of index terms using the multilingual lexical resource EuroWordNet (EWN). In the second step, we pass from the representation of index terms to the representation of index concepts through the conceptual network formalism. This network is generated using the EWN resource and passes through a classification step based on the association rules model, in an attempt to discover the non-taxonomic (contextual) relations between the concepts of a document. These are latent relations buried in the text and carried by the semantic context of the co-occurrence of concepts in the document. Our proposed indexing approach can be applied to text documents in various languages because it is based on a linguistic method adapted to the language through a multilingual thesaurus. Next, we apply the same statistical process regardless of the language in order to extract the significant concepts and their associated weights. We show that the proposed indexing approach provides encouraging results.
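
A minimal sketch of the first step (index-term extraction and concept mapping), using Princeton WordNet via NLTK as a freely available stand-in for EuroWordNet, which is not freely distributed; the tokenization, noun-only lookup, first-synset mapping, and frequency weighting are simplifying assumptions, not the paper's exact procedure.

```python
import re
from collections import Counter

import nltk
from nltk.corpus import wordnet as wn

nltk.download("wordnet", quiet=True)  # one-time corpus fetch

def index_concepts(text):
    # crude tokenization and frequency weighting of candidate index terms
    tokens = re.findall(r"[a-z]+", text.lower())
    weights = Counter(tokens)
    concepts = Counter()
    for term, freq in weights.items():
        synsets = wn.synsets(term, pos=wn.NOUN)
        if synsets:  # map the term to its most common noun concept
            concepts[synsets[0].name()] += freq
    return dict(concepts)

doc = "The retrieval system indexes documents and ranks documents by concept."
print(index_concepts(doc))
```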

Keywords: concept extraction, conceptual network formalism, fuzzy association rules, multilingual thesaurus, semantic indexing

Procedia PDF Downloads 143
10165 Stochastic Optimization of a Vendor-Managed Inventory Problem in a Two-Echelon Supply Chain

Authors: Bita Payami-Shabestari, Dariush Eslami

Abstract:

The purpose of this paper is to develop a multi-product economic production quantity model under a vendor-managed inventory policy with restrictions including limited warehouse space, budget, number of orders, average shortage time, and maximum permissible shortage. Since the costs cannot be predicted with certainty, it is assumed that the data behave under an uncertain environment. The problem is first formulated in the framework of a bi-objective multi-product economic production quantity model. The problem is then solved with three multi-objective decision-making (MODM) methods, which are compared on the optimal values of the two objective functions and on central processing unit (CPU) time, using statistical analysis and multi-attribute decision-making (MADM). The results of the study demonstrate that the augmented epsilon-constraint method performs better than global criteria and goal programming in terms of the optimal values of the two objective functions and CPU time. A sensitivity analysis is done to illustrate the effect of parameter variations on the optimal solution. The contribution of this research is the use of random cost data in developing a multi-product economic production quantity model under a vendor-managed inventory policy with several constraints.

Keywords: economic production quantity, random cost, supply chain management, vendor-managed inventory

Procedia PDF Downloads 132
10164 Carrier Communication through Power Lines

Authors: Pavuluri Gopikrishna, B. Neelima

Abstract:

Power line carrier communication means audio transmission via the power line and reception of the amplified audio at the receiver as a speaker output signal, using the power line as the channel medium. The main objective of this work is to transmit a message signal after frequency modulation with the help of the FM modulator IC LM565, which gives an output proportional to the input voltage of the message signal. This audio is received from the power line with the help of an isolation circuit and demodulated by the IC LM565, which uses the concept of the phase-locked loop (PLL) and produces the FM-demodulated signal for the listener. The message signal is transmitted over the carrier signal generated by the FM modulator IC LM565. The message signal will not be damaged, because it has no direct contact with the power line, but noise can disturb the information.
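
The modulation/demodulation chain described above can be illustrated at baseband in software. The sketch below, with illustrative parameter values rather than the LM565 circuit constants, frequency-modulates a test tone and recovers it from the instantaneous phase of the analytic signal (a software stand-in for the PLL demodulator).

```python
import numpy as np
from scipy.signal import hilbert

fs = 48_000                                 # sample rate (Hz)
t = np.arange(0, 0.05, 1 / fs)
message = np.sin(2 * np.pi * 300 * t)       # 300 Hz test tone
fc, kf = 8_000, 2_000                       # carrier freq, freq deviation (Hz)

# FM modulation: phase = 2*pi*fc*t + 2*pi*kf * integral(message)
phase = 2 * np.pi * fc * t + 2 * np.pi * kf * np.cumsum(message) / fs
tx = np.cos(phase)

# FM demodulation: differentiate the unwrapped phase of the analytic signal
inst_phase = np.unwrap(np.angle(hilbert(tx)))
inst_freq = np.diff(inst_phase) * fs / (2 * np.pi)
recovered = (inst_freq - fc) / kf           # rescale back to the message

print(np.corrcoef(message[1:], recovered)[0, 1])  # close to 1
```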

Keywords: amplification, FM demodulator IC 565, FM modulator IC 565, phase-locked loop, power isolation

Procedia PDF Downloads 555
10163 Cleaner Technology for Stone Crushers

Authors: S. M. Ahuja

Abstract:

There are about 12,000 stone crusher units in India, located in clusters around urban areas close to the stone quarries. These crushers create a lot of fugitive dust emissions and noise pollution, which is a major health hazard for the people working in the crushers and living in their vicinity. Ambient air monitoring was carried out near various stone crushers, and it was observed that fugitive emissions varied from 300 to 8000 mg/Nm3. A number of stone crushers were thoroughly studied, their existing pollution control devices were examined, and the limitations of the existing technology were also studied. A technology has been conceived consisting of a minimal set of effective spray nozzles to reduce the emissions at the source, followed by a containment-cum-control system with modular cyclones as the air pollution control device. Besides this, a preliminary energy audit has been carried out in some of the stone crushers, which indicates substantial potential for energy saving.

Keywords: stone crushers, spray nozzles, energy audit

Procedia PDF Downloads 338
10162 Enhancement of Pulsed Eddy Current Response Based on Power Spectral Density after Continuous Wavelet Transform Decomposition

Authors: A. Benyahia, M. Zergoug, M. Amir, M. Fodil

Abstract:

The main objective of this work is to enhance the Pulsed Eddy Current (PEC) response from aluminum structures using signal processing. Cracks and metal loss in different structures cause changes in PEC response measurements. In this paper, time-frequency analysis is used to represent the PEC response, which generates a large quantity of data, and to reduce the noise due to measurement. Power Spectral Density after Wavelet Decomposition (PSD-WD) is proposed for defect detection. The experimental results demonstrate that surface cracks can be extracted satisfactorily by the proposed method, and the validity of the proposed method is discussed.
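
A rough sketch of the PSD-WD chain on a simulated PEC transient, assuming PyWavelets for the Mexican hat continuous wavelet transform and Welch's method for the PSD; the synthetic decay signal, the injected defect bump, and the choice of scale are illustrative assumptions, not the experimental setup.

```python
import numpy as np
import pywt
from scipy.signal import welch

fs = 10_000                               # sampling rate (Hz)
t = np.arange(0, 0.1, 1 / fs)
rng = np.random.default_rng(0)
pec = np.exp(-t / 0.01) + 0.05 * rng.standard_normal(t.size)  # noisy decay
pec[300:320] += 0.2 * np.hanning(20)      # small bump standing in for a defect

# continuous wavelet transform with the Mexican hat mother wavelet
scales = np.arange(1, 64)
coeffs, _ = pywt.cwt(pec, scales, "mexh", sampling_period=1 / fs)

# PSD of one mid-range scale, where the defect signature is assumed to live
f, psd = welch(coeffs[20], fs=fs, nperseg=256)
print("peak PSD frequency:", f[np.argmax(psd)], "Hz")
```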

Keywords: NDT, pulsed eddy current, continuous wavelet transform, Mexican hat mother wavelet, defect detection, power spectral density

Procedia PDF Downloads 244
10161 Polyclonal IgG Glycosylation in Patients with Pediatric Appendicitis

Authors: Dalma Dojcsák, Csaba Váradi, Flóra Farkas, Tamás Farkas, János Papp, Béla Viskolcz

Abstract:

Background: Appendicitis is a common acute inflammatory condition in both children and adults, but current laboratory markers such as C-reactive protein (CRP), white blood cell count (WBC), absolute neutrophil count (ANC), and red blood cell count (RBC) lack specificity in detecting appendicitis-related inflammation. N-glycosylation, an asparagine-linked glycosylation process, plays a vital role in cellular interactions, angiogenesis, immune response, and effector functions, and altered N-glycosylation affects tumor growth and both acute and chronic inflammatory processes. IgG, the second most abundant glycoprotein in serum, shows altered glycosylation patterns during inflammation, suggesting that IgG glycan modifications may serve as potential biomarkers for appendicitis. Specifically, increased levels of agalactosylated IgG glycans are a known feature of various inflammatory conditions, potentially including appendicitis. Identifying pediatric appendicitis remains challenging due to the absence of specific biomarkers, which makes diagnosis reliant on clinical symptoms, imaging such as ultrasound, and nonspecific laboratory indicators (e.g., CRP, WBC, ANC). In this study, we analyzed the IgG-derived N-glycome in pediatric patients with appendicitis compared with healthy controls. Methodology: The N-glycome was analyzed by high-performance liquid chromatography combined with mass spectrometry. IgG was isolated from serum samples on a Protein G column. The IgG-derived glycans were released by enzymatic deglycosylation, and fluorescent tags were attached to each glycan moiety, which necessitates sample clean-up for reliable quantitation. Overall, 38 control samples and 40 serum samples from patients diagnosed with pediatric appendicitis were analyzed by HILIC-MS methods. Multivariate statistical tests were performed on area-percentage data derived from the integrated chromatogram peaks. Conclusions: Our results show that the altered N-glycome of IgG in pediatric appendicitis is similar to other observations: the glycosylation pattern reported so far for IgG is characterized by decreased galactosylation and sialylation and an increase in fucosylation.

Keywords: N-glycosylation, liquid chromatography, mass spectrometry, inflammation, appendicitis, immunoglobulin G

Procedia PDF Downloads 18
10160 Assessing the Theoretical Suitability of Sentinel-2 and Worldview-3 Data for Hydrocarbon Mapping of Spill Events, Using Hydrocarbon Spectral Slope Model

Authors: K. Tunde Olagunju, C. Scott Allen, Freek Van Der Meer

Abstract:

Identification of hydrocarbon oil in remote sensing images is often the first step in monitoring oil during spill events. Most remote sensing methods adopt techniques for hydrocarbon identification to achieve detection in order to plan an appropriate cleanup program. Identification with optical sensors allows not only detection but also characterization and quantification. Until recently, in optical remote sensing, quantification and characterization were only potentially possible using high-resolution laboratory and airborne imaging spectrometers (hyperspectral data). Unlike multispectral data, hyperspectral data are not freely available, as this data category is at present mainly obtained via airborne surveys. In this research, two operational high-resolution multispectral satellites (WorldView-3 and Sentinel-2) are theoretically assessed for their suitability for hydrocarbon characterization, using the hydrocarbon spectral slope model (HYSS). This method utilizes the two most persistent hydrocarbon diagnostic/absorption features, at 1.73 µm and 2.30 µm, for hydrocarbon mapping on multispectral data. Spectral measurements of seven different hydrocarbon oils (crude and refined) taken on ten different substrates with a laboratory ASD FieldSpec were convolved to Sentinel-2 and WorldView-3 resolution using the sensors' full width at half maximum (FWHM) parameters. The resulting hydrocarbon slope values obtained from the studied samples enable clear qualitative discrimination of most hydrocarbons, despite the presence of different background substrates, particularly on WorldView-3. Due to the close conformity of the central wavelengths and narrow bandwidths to the key hydrocarbon bands used in HYSS, the statistical significance of the qualitative analysis on WorldView-3 returned a 95% confidence level (P-value < 0.01) for all studied hydrocarbon oils except diesel. Using multifactor analysis of variance (MANOVA), the discriminating power of HYSS is statistically significant for most hydrocarbon-substrate combinations on Sentinel-2 and WorldView-3 FWHM, revealing the potential of these two operational multispectral sensors as rapid response tools for hydrocarbon mapping. One notable exception is highly transmissive hydrocarbons on Sentinel-2 data, due to the non-conformity of spectral bands with key hydrocarbon absorptions and the relatively coarse bandwidth (> 100 nm).
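
The convolution step can be sketched as follows: a laboratory spectrum is resampled to a broad sensor band with a Gaussian spectral response function built from the band centre and FWHM, after which a slope can be formed across a hydrocarbon absorption feature. The band centres, FWHMs, synthetic spectrum, and the simple slope proxy below are illustrative assumptions; the actual HYSS formulation follows the cited method and the published sensor band definitions.

```python
import numpy as np

def band_reflectance(wl_nm, refl, center_nm, fwhm_nm):
    # Gaussian spectral response function from band centre and FWHM
    sigma = fwhm_nm / (2 * np.sqrt(2 * np.log(2)))
    srf = np.exp(-0.5 * ((wl_nm - center_nm) / sigma) ** 2)
    return np.sum(refl * srf) / np.sum(srf)  # SRF-weighted mean reflectance

wl = np.arange(350, 2500, 1.0)               # ASD FieldSpec range (nm)
rng = np.random.default_rng(1)
# synthetic oil spectrum with absorption features near 1.73 um and 2.30 um
refl = (0.4
        - 0.05 * np.exp(-0.5 * ((wl - 1730) / 15) ** 2)
        - 0.04 * np.exp(-0.5 * ((wl - 2300) / 20) ** 2)
        + 0.005 * rng.standard_normal(wl.size))

r_173 = band_reflectance(wl, refl, 1730, 40)     # hypothetical SWIR band
r_230 = band_reflectance(wl, refl, 2300, 50)
shoulder = band_reflectance(wl, refl, 2200, 40)  # reference shoulder band
slope_proxy = (shoulder - r_230) / (2300 - 2200) # simple slope across 2.30 um
print(r_173, slope_proxy)
```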

Keywords: hydrocarbon, oil spill, remote sensing, hyperspectral, multispectral, hydrocarbon-substrate combination, Sentinel-2, WorldView-3

Procedia PDF Downloads 220
10159 Synergistic Effect of Eugenol Acetate with Betalactam Antibiotic on Betalactamase and Its Bioinformatics Analysis

Authors: Vinod Nair, C. Sadasivan

Abstract:

Beta-lactam antibiotics are among the most frequently prescribed medications in modern medicine. Antibiotic resistance through the production of the enzyme beta-lactamase is an important resistance mechanism in microorganisms. Resistance to beta-lactams mediated by beta-lactamases can be overcome successfully with the use of beta-lactamase inhibitors. New generations of antibiotics contain mostly synthetic compounds, and many side effects have been reported for them. Combinations of beta-lactams and beta-lactamase inhibitors have become one of the most successful antimicrobial strategies in the current scenario of bacterial infections. Plant-based drugs are very cheap and have fewer adverse effects than synthetic compounds. The synergistic effect of eugenol acetate with beta-lactams restores the activity of beta-lactams, allowing their continued clinical use. We report here the enhanced inhibitory effect on beta-lactamase of the phytochemical eugenol acetate, isolated from the plant Syzygium aromaticum, combined with beta-lactams. The compound was found to have a synergistic effect with the antibiotic amoxicillin against an antibiotic-resistant strain of S. aureus. The enzyme was purified from the organism and incubated with the compound, and the assay showed that the compound could inhibit the enzymatic activity of beta-lactamase. Modeling and molecular docking studies indicated that the compound can fit into the active site of beta-lactamase and can mask the residue important for the hydrolysis of beta-lactams. The synergistic effect of eugenol acetate with beta-lactam antibiotics may justify the use of these plant compounds in the preparation of β-lactamase inhibitors against β-lactam-resistant S. aureus.

Keywords: betalactamase, eugenol acetate, synergistic effect, molecular modeling

Procedia PDF Downloads 253
10158 Analysis of the Level of Production Failures by Implementing New Assembly Line

Authors: Joanna Kochanska, Dagmara Gornicka, Anna Burduk

Abstract:

The article examines the process of implementing a new assembly line in a manufacturing enterprise in the household appliances industry. At the initial stages of the project, a decision was made that one of its foundations should be the concept of lean management; because of that, eliminating as many errors as possible in the first phases of its functioning was emphasized. During the start-up of the line, all production losses were identified and documented (from serious machine failures, through any unplanned downtime, to micro-stops and quality defects). During the 6-week line start-up period, all errors resulting from problems in various areas were analyzed. These areas included, among others, production, logistics, quality, and organization. The aim of the work was to analyze the occurrence of production failures during the initial phase of starting up the line and to propose a method for determining their critical level during full operation. The repeatability of the production losses in the various areas and at different levels at such an early stage of implementation was examined using the methods of statistical process control. Based on a Pareto analysis, the weakest points were identified in order to focus improvement actions on them (see the sketch below). The next step was to examine the effectiveness of the actions undertaken to reduce the level of recorded losses. Based on the obtained results, a method for determining the critical failure level in the studied areas was proposed. The developed coefficient can be used as an alarm in case of an imbalance in production caused by an increased failure level in production and production-support processes during the standardized functioning of the line.
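
The Pareto step can be condensed to a few lines: rank the loss categories and mark the "vital few" that account for roughly 80% of recorded losses. The categories and counts below are illustrative placeholders, not the study's data.

```python
import numpy as np

causes = {"machine failure": 42, "micro-stops": 18, "quality defects": 55,
          "logistics delays": 12, "organization": 7, "unplanned downtime": 26}

# sort causes by recorded losses and compute the cumulative share
items = sorted(causes.items(), key=lambda kv: kv[1], reverse=True)
counts = np.array([v for _, v in items], dtype=float)
cum_share = np.cumsum(counts) / counts.sum()

for (name, n), share in zip(items, cum_share):
    marker = " <= 80% cut-off" if share <= 0.8 else ""
    print(f"{name:20s} {n:4.0f}  cumulative {share:5.1%}{marker}")
```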

Keywords: production failures, level of production losses, new production line implementation, assembly line, statistical process control

Procedia PDF Downloads 137
10157 A Review on Predictive Sound Recognition System

Authors: Ajay Kadam, Ramesh Kagalkar

Abstract:

The objective of the proposed research is to contribute a framework for the automatic recognition of sound. In this framework, the real task is to identify any input sound stream, analyze it, and predict the likelihood of the different sounds appearing in it, and to create a commercially deployable, flexible audio search engine. The algorithm is noise- and distortion-resistant, computationally efficient, and massively scalable, capable of quickly identifying a short segment of a sound stream captured through a phone microphone, in the presence of foreground voices and other dominant noise and through voice codec compression, out of a database of available tracks. The algorithm uses a combinatorial hashed time-frequency constellation analysis of the audio, yielding unusual properties such as transparency, in which multiple tracks mixed together may each be identified.
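
A toy version of combinatorial hashed time-frequency constellation analysis: pick spectrogram peaks as landmarks, then hash (f1, f2, dt) pairs of nearby peaks. The strongest-bins peak picker, small fan-out, and hash packing below are simplified illustrative choices, not a deployed system's parameters; real systems also use local peak detection and database lookup.

```python
import numpy as np
from scipy.signal import spectrogram

def landmarks(audio, fs, n_peaks=30):
    f, t, sxx = spectrogram(audio, fs=fs, nperseg=512)
    flat = np.argsort(sxx.ravel())[-n_peaks:]          # strongest bins
    fi, ti = np.unravel_index(flat, sxx.shape)
    return sorted(zip(t[ti], f[fi]))                   # (time, freq) pairs

def hashes(marks, fan_out=3):
    out = []
    for i, (t1, f1) in enumerate(marks):
        for t2, f2 in marks[i + 1 : i + 1 + fan_out]:  # pair nearby peaks
            dt = t2 - t1
            out.append((hash((round(f1), round(f2), round(dt, 2))), t1))
    return out

fs = 8_000
t = np.arange(0, 2.0, 1 / fs)
tone = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)
print(len(hashes(landmarks(tone, fs))), "fingerprint hashes")
```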

Keywords: fingerprinting, pure tone, white noise, hash function

Procedia PDF Downloads 327
10156 Performance Tests of Wood Glues on Different Wood Species Used in Wood Workshops: Morogoro Tanzania

Authors: Japhet N. Mwambusi

Abstract:

High deforestation of tropical forests for the solid wood furniture industry is among the agents contributing to climate change. This pressure is indirectly caused by furniture joint failures due to poor gluing technology, based on the improper matching of glues to wood species, which leads to low-quality, weak wood-glue joints. This study was carried out in order to run performance tests of wood glues on different wood species used in wood workshops in Morogoro, Tanzania, whereby three popular wood species, C. lusitanica, T. grandis, and E. maidenii, were tested against five glues found on the market: Woodfix, Bullbond, Ponal, Fevicol, and Coral. The findings were needed for developing a guideline for proper glue selection for joining a particular wood species. Random sampling was employed to interview carpenters while conducting a survey on their backgrounds, such as education level, and to determine the factors that influence their choice of glue. A Monsanto tensometer was used to determine the bonding strength of the identified wood glues to the different wood species in use, following the British Standard procedure for testing wood shear strength (BS EN 205). Data obtained from interviewing carpenters were analyzed with the Statistical Package for the Social Sciences (SPSS) to allow the comparison of different data, while laboratory data were compiled, related, and compared using MS Excel worksheets as well as analysis of variance (ANOVA). Results revealed that, among the five wood glues tested in the laboratory on the three wood species, Coral performed much better, with average shear strengths of 4.18 N/mm2, 3.23 N/mm2, and 5.42 N/mm2 for cypress, teak, and eucalyptus, respectively. This shows that, for a strong joint to be formed in all three wood species, softwood or hardwood, Coral should have first priority in use. The guideline table developed from this research can be useful to carpenters for proper glue selection for a particular wood species so as to meet glue-bond strength requirements. This will secure the furniture market as well as reduce pressure on the forests for furniture production, because existing furniture will last longer thanks to its strong joints. Indeed, this can be a good strategy for slowing climate change in the tropics, which results from the high deforestation of trees for furniture production.
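
The glue-to-glue comparison described above can be reproduced in outline with a one-way ANOVA. The shear-strength samples below are synthetic placeholders (centred near the reported Coral mean for cypress), not the study's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# hypothetical shear-strength samples (N/mm^2) for one wood species
coral = rng.normal(4.18, 0.3, 10)
ponal = rng.normal(3.60, 0.3, 10)
woodfix = rng.normal(3.20, 0.3, 10)

# one-way ANOVA: do the glues differ in mean bond strength?
f_stat, p_value = stats.f_oneway(coral, ponal, woodfix)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```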

Keywords: climate change, deforestation, gluing technology, joint failure, wood-glue, wood species

Procedia PDF Downloads 245
10155 Development of a Roadmap for Assessment the Sustainability of Buildings in Saudi Arabia Using Building Information Modeling

Authors: Ibrahim A. Al-Sulaihi, Khalid S. Al-Gahtani, Abdullah M. Al-Sugair, Aref A. Abadel

Abstract:

Achieving environmental sustainability is one of the important issues in many countries' visions. Green/sustainable building is a widely used term for describing environmentally friendly construction. Applying sustainable practices is of significant importance in various fields, including the construction field, which consumes an enormous amount of resources and produces a considerable amount of waste. The need for sustainability is greater in regions that suffer from limited natural resources and extreme weather conditions, such as Saudi Arabia. Since building designs are becoming more sophisticated, the need for tools that support decision-making on sustainability issues is increasing, especially in the design and preconstruction stages. In this context, Building Information Modeling (BIM) can aid in performing complex building performance analyses to ensure an optimized sustainable building design. Accordingly, this paper introduces a roadmap towards developing a systematic approach for assessing the sustainability of buildings using BIM. The approach includes a set of main processes: identifying the sustainability parameters that can be used for sustainability assessment in Saudi Arabia, developing a sustainability assessment method that fits the special circumstances of the Kingdom, identifying the sustainability requirements and the BIM functions that can be used to satisfy these requirements, and integrating these requirements with the identified functions. As a result, a sustainability-BIM approach can be developed that helps designers assess sustainability and explore different design alternatives at the early stage of the construction project.

Keywords: green buildings, sustainability, BIM, rating systems, environment, Saudi Arabia

Procedia PDF Downloads 381
10154 Commercial Automobile Insurance: A Practical Approach of the Generalized Additive Model

Authors: Nicolas Plamondon, Stuart Atkinson, Shuzi Zhou

Abstract:

The insurance industry is usually not the first topic one has in mind when thinking about applications of data science. However, the use of data science in the finance and insurance industry is growing quickly for several reasons, including an abundance of reliable customer data and ferocious competition requiring more accurate pricing. Among the top use cases of data science are pricing optimization, customer segmentation, customer risk assessment, fraud detection, marketing, and triage analytics. The objective of this paper is to present an application of the generalized additive model (GAM) to a commercial automobile insurance product: an individually rated commercial automobile. These are vehicles used for commercial purposes, but for which there is not enough volume to price several vehicles at the same time. The GAM was selected as an improvement over the GLM for its ease of use and its wide range of applications. The model was trained on the largest split of the data to determine the model parameters, and the remaining data were used as testing data to verify the quality of the modeling. We used the Gini coefficient to evaluate the performance of the model; for long-term monitoring, commonly used metrics such as RMSE and MAE will be used. Another topic of interest in the insurance industry is the process of producing the model. We will discuss at a high level the interactions between the different teams within an insurance company that need to work together to produce a model and then monitor its performance over time. Moreover, we will discuss the regulations in place in the insurance industry. Finally, we will discuss the maintenance of the model, the fact that new data do not arrive constantly, and that some metrics can take a long time to become meaningful.
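
A sketch of the Gini-style evaluation mentioned above: sort actual losses by predicted risk and measure how far the resulting cumulative-loss curve departs from random ordering. The data generator and the simple area-based formula are illustrative assumptions, not insurance records or the paper's exact metric definition.

```python
import numpy as np

def gini(actual, predicted):
    order = np.argsort(predicted)[::-1]          # highest predicted risk first
    cum_loss = np.cumsum(actual[order]) / actual.sum()
    n = len(actual)
    # twice the area between the cumulative-loss curve and the diagonal
    return (np.sum(cum_loss) / n - 0.5) * 2

rng = np.random.default_rng(0)
risk = rng.gamma(2.0, 1.0, 1000)                 # true underlying risk
losses = rng.poisson(risk).astype(float)         # observed claim counts
noisy_pred = risk + rng.normal(0, 0.5, 1000)     # imperfect model predictions

print("model Gini:", round(gini(losses, noisy_pred), 3))
print("perfect-order Gini:", round(gini(losses, losses), 3))
```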

Keywords: insurance, data science, modeling, monitoring, regulation, processes

Procedia PDF Downloads 79
10153 The ‘Quartered Head Technique’: A Simple, Reliable Way of Maintaining Leg Length and Offset during Total Hip Arthroplasty

Authors: M. Haruna, O. O. Onafowokan, G. Holt, K. Anderson, R. G. Middleton

Abstract:

Background: Requirements for satisfactory outcomes following total hip arthroplasty (THA) include restoration of femoral offset, version, and leg length. Various techniques have been described for restoring these biomechanical parameters, with leg length restoration being the most commonly described. We describe a "quartered head technique" (QHT) which uses a stepwise series of femoral head osteotomies to identify and preserve the centre of rotation of the femoral head during THA, in order to ensure reconstruction of leg length, offset, and stem version such that hip biomechanics are restored as near to normal as possible. This study aims to identify whether using the QHT during hip arthroplasty effectively restores leg length and femoral offset to within acceptable parameters. Methods: A retrospective review of 206 hips was carried out, leaving 124 hips in the final analysis. Power analysis indicated a minimum of 37 patients was required. All operations were performed using an anterolateral approach by a single surgeon. All femoral implants were cemented, collarless, polished double-taper CPT® stems (Zimmer, Swindon, UK). Both cemented and uncemented acetabular components were used (Zimmer, Swindon, UK). Leg length, version, and offset were assessed intra-operatively and reproduced using the QHT. Post-operative leg length and femoral offset were determined and compared with the contralateral native hip, and the difference was then calculated. For the determination of leg length discrepancy (LLD), we used the method described by Williamson and Reckling, which has been shown to be reproducible with a measurement error of ±1 mm. As references, the inferior margin of the acetabular teardrop and the most prominent point of the lesser trochanter were used. A discrepancy of less than 6 mm LLD was chosen as acceptable. All peri-operative radiographs were assessed by two independent observers. Results: The mean absolute post-operative difference in leg length from the contralateral leg was +3.58 mm, and 84% of patients (104/124) had an LLD within ±6 mm of the contralateral limb. The mean post-operative difference in offset from the contralateral leg was +3.88 mm (range -15 to +9 mm, median 3 mm), and 90% of patients (112/124) were within ±6 mm of the offset of the contralateral limb. There was no statistical difference noted between observer measurements. Conclusion: The QHT provides a simple, inexpensive, yet effective method of maintaining femoral leg length and offset during total hip arthroplasty. Combining this technique with pre-operative templating or other described techniques may enable surgeons to reduce even further the discrepancies between the pre-operative state and the post-operative outcome.

Keywords: leg length discrepancy, technical tip, total hip arthroplasty, operative technique

Procedia PDF Downloads 89
10152 Modeling of Strong Motion Generation Areas of the 2011 Tohoku, Japan Earthquake Using Modified Semi-Empirical Technique Incorporating Frequency Dependent Radiation Pattern Model

Authors: Sandeep, A. Joshi, Kamal, Piu Dhibar, Parveen Kumar

Abstract:

In the present work, strong ground motion has been simulated using a modified semi-empirical technique (MSET) with a frequency-dependent radiation pattern model. Joshi et al. (2014) modified the semi-empirical technique to incorporate the modeling of strong motion generation areas (SMGAs). A frequency-dependent radiation pattern model is applied to simulate high-frequency ground motion more precisely. The identified SMGAs (Kurahashi and Irikura, 2012) of the 2011 Tohoku earthquake (Mw 9.0) were modeled using this modified technique. Records were simulated for both the frequency-dependent and the constant radiation pattern function. Simulated records for both cases are compared with observed records in terms of peak ground acceleration and pseudo-acceleration response spectra at different stations. Comparison of simulated and observed records in terms of root mean square error suggests that the method is capable of simulating records that match over a wide frequency range for this earthquake and bear a realistic appearance in terms of shape and strong motion parameters. The results confirm the efficacy and suitability of the rupture model defined by five SMGAs for the developed modified technique.

Keywords: strong ground motion, semi-empirical, strong motion generation area, frequency dependent radiation pattern, 2011 Tohoku Earthquake

Procedia PDF Downloads 542
10151 Evaluation of the Weight-Based and Fat-Based Indices in Relation to Basal Metabolic Rate-to-Weight Ratio

Authors: Orkide Donma, Mustafa M. Donma

Abstract:

Basal metabolic rate is questioned as a risk factor for weight gain. The relations between basal metabolic rate and body composition have not yet been clarified, and the impact of fat mass on basal metabolic rate is also uncertain. Within this context, indices based upon total body mass as well as total body fat mass are available. In this study, the aim is to investigate the potential clinical utility of these indices in the adult population. 287 individuals, aged from 18 to 79 years, were included in the scope of the study. Based upon body mass index values, 10 underweight, 88 normal-weight, 88 overweight, 81 obese, and 20 morbidly obese individuals participated. Anthropometric measurements, including height (m) and weight (kg), were performed. Body mass index, diagnostic obesity notation model assessment index I, diagnostic obesity notation model assessment index II, and the basal metabolic rate-to-weight ratio were calculated. Total body fat mass (kg), fat percentage (%), basal metabolic rate, metabolic age, visceral adiposity, fat mass of the upper and lower extremities and trunk, and obesity degree were measured with a TANITA body composition monitor using bioelectrical impedance analysis technology. Statistical evaluations were performed with the SPSS statistical package for Windows, version 16.0. Scatterplots of individual measurements were drawn for the correlated parameters, and linear regression lines were displayed. The statistical significance level was accepted as p < 0.05. Strong correlations were obtained between body mass index and diagnostic obesity notation model assessment index I as well as diagnostic obesity notation model assessment index II (p < 0.001). A much stronger correlation was detected between basal metabolic rate and diagnostic obesity notation model assessment index I than between basal metabolic rate and body mass index (p < 0.001). Upon consideration of the associations between the basal metabolic rate-to-weight ratio and these three indices, the best association was observed with diagnostic obesity notation model assessment index II. In a similar manner, this index was highly correlated with fat percentage (p < 0.001). Independently of the indices, a strong correlation was found between fat percentage and the basal metabolic rate-to-weight ratio (p < 0.001). Visceral adiposity was much more strongly correlated with metabolic age than with chronological age (p < 0.001). In conclusion, all three indices were associated with metabolic age, but not with chronological age. Diagnostic obesity notation model assessment index II values were highly correlated with body mass index values throughout all ranges, from underweight to morbid obesity. This index is the best in terms of its association with the basal metabolic rate-to-weight ratio, which can be interpreted as a basal metabolic rate unit.

Keywords: basal metabolic rate, body mass index, children, diagnostic obesity notation model assessment index, obesity

Procedia PDF Downloads 153
10150 Numerical Simulation of Free Surface Water Wave for the Flow Around NACA 0012 Hydrofoil and Wigley Hull Using VOF Method

Authors: Omar Imine, Mohammed Aounallah, Mustapha Belkadi

Abstract:

Steady three-dimensional and two-dimensional free surface waves generated by moving bodies are presented. The flow problem to be simulated is rich in complexity and poses many modeling challenges because of the existence of breaking waves around the ship hull and because of the interaction of the two-phase flow with the turbulent boundary layer. The results of several simulations are reported. The first study was performed for a NACA0012 hydrofoil with different meshes; this section is analyzed at h/c = 1.0345 in 2D. In the second simulation, a mathematically defined Wigley hull form is used to investigate the application of a commercial CFD code to the prediction of the total resistance and its components from the tangential and normal forces on the hull's wetted surface. The computed resistance and wave profiles are used to estimate the total resistance coefficient for the Wigley hull advancing in calm water under steady conditions. The commercial CFD software FLUENT version 12 is used for the computations in the present study, and the computational grid is generated using GAMBIT 2.3.26. The k-ω SST (shear stress transport) model is used for turbulence modeling, and the volume of fluid (VOF) technique is employed to simulate the free-surface motion. The second-order upwind scheme is used for discretizing the convection terms in the momentum transport equations, and the modified HRIC scheme is used for the VOF discretization. The results obtained compare well with the experimental data.

Keywords: free surface flows, breaking waves, boundary layer, Wigley hull, volume of fluid

Procedia PDF Downloads 379
10149 Air Quality Assessment for a Hot-Spot Station by Neural Network Modelling of the near-Traffic Emission-Immission Interaction

Authors: Tim Steinhaus, Christian Beidl

Abstract:

Urban air quality and climate protection are two major challenges for future mobility systems. Despite the steady reduction of pollutant emissions from vehicles over past decades, the local immission load within cities still partially reaches levels that are considered hazardous to human health. Although traffic-related emissions account for a major part of the overall urban pollution, modeling the exact emission-immission interaction remains challenging. In this paper, a novel approach for determining this interaction on the basis of neural network modeling is presented for the traffic-induced NO2 immission load within a near-traffic hot-spot scenario. In a detailed sensitivity analysis, the significance of the relevant influencing variables on the prevailing NO2 concentration is first analyzed. Based on this, the generation process of the model is described, in which not only environmental influences but also the vehicle fleet composition, including its associated segment- and certification-specific real driving emission factors, are derived and used as input quantities. The validity of this approach, which has been presented in the past, is re-examined in this paper using updated data on vehicle emissions and recent immission measurement data. Within the framework of a final scenario analysis, the future development of the immission load is forecast for different developments in the vehicle fleet composition. It is shown that immission levels of less than half of today's yearly average limit values are technically feasible in hot-spot situations.
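
A minimal sketch of a neural-network emission-immission model of this kind, assuming a small feed-forward regressor over synthetic traffic, meteorology, and fleet emission-factor inputs; the feature set and the data generator are illustrative assumptions, not the measured hot-spot data or the paper's model architecture.

```python
import numpy as np
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 2000
traffic = rng.uniform(0, 2000, n)        # traffic volume (vehicles/hour)
wind = rng.uniform(0.1, 10, n)           # wind speed (m/s)
temp = rng.uniform(-5, 35, n)            # air temperature (degC)
ef = rng.uniform(0.1, 1.0, n)            # fleet-average NO2 emission factor

# synthetic immission: dilution-like dependence plus measurement noise
no2 = 20 + 0.02 * traffic * ef / wind + 0.1 * temp + rng.normal(0, 3, n)

X = np.column_stack([traffic, wind, temp, ef])
X_tr, X_te, y_tr, y_te = train_test_split(X, no2, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 16),
                                   max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(r2_score(y_te, model.predict(X_te)), 3))
```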

Keywords: air quality, emission, emission-immission-interaction, immission, NO2, zero impact

Procedia PDF Downloads 130
10148 Attitudes of University of Tabuk Students toward Study at the Deanship of the Preparatory Year According to Academic Stream and Gender Variables

Authors: Awad Alhwiti

Abstract:

The purpose of this study was to investigate the attitudes of students at Tabuk University towards study at the deanship of the preparatory year according to study stream (scientific, literary) and gender (male, female). The sample of the study consisted of 219 males, 120 of them in the scientific stream and 99 in the literary stream, and 238 females, 172 of them in the scientific stream and 66 in the literary stream. The researcher developed a valid and reliable instrument to measure their attitudes towards study at the deanship of the preparatory year. The scale consisted of a group of positively phrased items, numbered (1) to (13), and a group of negatively phrased items, numbered (14) to (34). The findings of the study showed that 13 items of the scale received a high evaluation degree, two items received an average evaluation degree, and 19 items received a low evaluation degree; overall, 19 items reflected negative attitudes and 14 items reflected positive ones. The total mean of Tabuk students' attitudes towards study at the deanship of the preparatory year was 1.92 with a standard deviation of 0.64, an average evaluation degree. The findings showed a statistically significant difference at the level of (α = 0.05) in the sample's attitudes towards study in the preparatory year attributed to study stream (scientific, literary), in favor of the scientific stream, while there was no statistically significant difference at the level of (α = 0.05) attributed to gender (male, female).

Keywords: students attitudes, preparation year deanship, Tabuk University, education technology

Procedia PDF Downloads 259
10147 The Methodology of Hand-Gesture Based Form Design in Digital Modeling

Authors: Sanghoon Shim, Jaehwan Jung, Sung-Ah Kim

Abstract:

As digital technology develops, studies on the Tangible User Interface (TUI), which links the physical environment, through the human senses, with the virtual environment through the computer, are actively being conducted. In addition, there has been a tremendous advance in computational design through the use of computer-aided design techniques, which enable optimized decision-making through machine learning and the parallel comparison of alternatives. However, while a complex design that responds to user requirements or performance can emerge through the intuition of the designer, it is difficult to actualize the emerged design by the designer's ability alone. Ancillary tools such as Gaudí's sandbag can be instruments to reinforce and evolve ideas that emerge from designers. With the advent of many commercial tools that support 3D objects, designers' intentions are easily reflected in their designs, but the degree of reflection depends on the designer's proficiency with the design tools. This study embodies an environment in which form can be shaped with the fingers of the designer in the initial design phase of a complex building design. A Leap Motion device is used as a sensor to recognize the hand motions of the designer, which are converted into digital information to realize an environment that can be linked in real time in virtual reality (VR). In addition, the implemented design can be linked with Rhino™, a 3D authoring tool, and its plug-in Grasshopper™ in real time. As a result, it is possible to design intuitively using the TUI, and the environment can serve as a tool for assisting designer intuition.

Keywords: design environment, digital modeling, hand gesture, TUI, virtual reality

Procedia PDF Downloads 369
10146 Analysis of Various Copy Move Image Forgery Techniques for Better Detection Accuracy

Authors: Grishma D. Solanki, Karshan Kandoriya

Abstract:

In the modern information age, digitalization has revolutionized like never before. Powerful computers, advanced photo-editing software packages, and high-resolution capturing devices have made the manipulation of digital images incredibly easy. As far as image forensics is concerned, one of the most actively researched areas is the detection of copy-move forgeries. High computational complexity is a major shortcoming of existing techniques for detecting such tampering. Copy-move forgery is usually performed in three steps: first, copying a region of an image; then pasting it into the same image; and finally applying some post-processing such as rotation, scaling, shifting, or noise. Consequently, pseudo-Zernike moments are used as a feature extraction method for matching image blocks and are a primary factor on which the performance of detection algorithms depends.

Keywords: copy-move image forgery, digital forensics, image forensics, image forgery

Procedia PDF Downloads 292