Search results for: Anomaly Detection Model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19610


18110 Toward a Characteristic Optimal Power Flow Model for Temporal Constraints

Authors: Zongjie Wang, Zhizhong Guo

Abstract:

While the regular optimal power flow model focuses on a single time scan, the optimization of power systems is typically intended for a time duration with respect to a desired objective function. In this paper, a temporal optimal power flow model for a time period is proposed. To reduce the computation burden needed for calculating temporal optimal power flow, a characteristic optimal power flow model is proposed, which employs different characteristic load patterns to represent the objective function and security constraints. A numerical method based on the interior point method is also proposed for solving the characteristic optimal power flow model. Both the temporal optimal power flow model and characteristic optimal power flow model can improve the systems’ desired objective function for the entire time period. Numerical studies are conducted on the IEEE 14 and 118-bus test systems to demonstrate the effectiveness of the proposed characteristic optimal power flow model.

Keywords: optimal power flow, time period, security, economy

Procedia PDF Downloads 449
18109 The Evaluation Model for the Quality of Software Based on Open Source Code

Authors: Li Donghong, Peng Fuyang, Yang Guanghua, Su Xiaoyan

Abstract:

Using open source code is a popular method of software development, so evaluating the quality of such software becomes increasingly important. This paper introduces an evaluation model that assesses quality along four dimensions: technology, production, management, and development. Each dimension includes many indicators, and the weight of each indicator can be modified according to the purpose of the evaluation. The paper also introduces a method of using the model. The evaluation result can provide useful guidance for evaluating or purchasing software.
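As a rough illustration of the weighted-indicator idea described above, the sketch below scores the four dimensions with adjustable weights. The dimension names, scores, and weights are invented for illustration; the paper's actual indicators are not reproduced.

```python
# Hypothetical weighted-sum scoring across the four dimensions named in the
# abstract; names, scores, and weights are illustrative, not from the paper.

def evaluate(scores, weights):
    """Weighted average of indicator scores (0-10 scale assumed)."""
    total_w = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total_w

scores = {"technology": 8, "production": 6, "management": 7, "development": 9}
weights = {"technology": 0.4, "production": 0.2,
           "management": 0.2, "development": 0.2}
print(round(evaluate(scores, weights), 2))
```

Changing the weights to suit the purpose of the evaluation (e.g., emphasizing management for a purchasing decision) only requires editing the `weights` mapping.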

Keywords: evaluation model, software quality, open source code, evaluation indicator

Procedia PDF Downloads 386
18108 Applying the Crystal Model to Different Nuclear Systems

Authors: A. Amar

Abstract:

The angular distributions of the nuclear systems under consideration have been analyzed in the framework of the optical model (OM), where the real part was taken in the crystal model form. A crystal model (CM) has been applied to deuterons elastically scattered by ⁶,⁷Li and ⁹Be, both alone and combined with the distorted-wave Born approximation (DWBA) and a dynamic polarization potential (DPP). The crystal model has also been applied to ⁶Li elastically scattered by ¹⁶O and ²⁸Sn, to the ⁷Li+⁷Li system, and to the ¹²C(α,⁸Be)⁸Be reaction. The continuum-discretized coupled-channels (CDCC) method has been applied to the ⁷Li+⁷Li system, and agreement between the crystal model and the CDCC method has been observed. In general, the models succeeded in reproducing the differential cross sections over the full angular range and for all the energies under consideration.

Keywords: optical model (OM), crystal model (CM), distorted-wave Born approximation (DWBA), dynamic polarization potential (DPP), continuum-discretized coupled-channels (CDCC) method, deuteron elastic scattering by ⁶,⁷Li and ⁹Be

Procedia PDF Downloads 77
18107 Graphene-Based Nanocomposites for Glucose and Ethanol Enzymatic Biosensor Fabrication

Authors: Tesfaye Alamirew, Delele Worku, Solomon W. Fanta, Nigus Gabbiye

Abstract:

Recently, graphene-based nanocomposites have become an emerging research area for the fabrication of enzymatic biosensors due to their large surface area, conductivity, and biocompatibility. This review summarizes recent research reports on graphene-based nanocomposites for the fabrication of glucose and ethanol enzymatic biosensors. The newly fabricated enzyme-free microwave-treated nitrogen-doped graphene (MN-d-GR) provided the highest sensitivity towards glucose, and the GCE/rGO/AuNPs/ADH composite provided by far the highest sensitivity towards ethanol, compared to other reported graphene-based nanocomposites. The MWCNT/GO/GOx and GCE/ErGO/PTH/ADH nanocomposites also exhibited wide linear ranges for glucose and ethanol detection, respectively. In general, graphene-based nanocomposite enzymatic biosensors showed fast direct electron transfer, high sensitivity, and wide linear detection ranges in glucose and ethanol sensing.

Keywords: glucose, ethanol, enzymatic biosensor, graphene, nanocomposite

Procedia PDF Downloads 124
18106 Mathematical Model of Cancer Growth under the Influence of Radiation Therapy

Authors: Beata Jackowska-Zduniak

Abstract:

We formulate and analyze a mathematical model describing the dynamics of cancer growth under the influence of radiation therapy. The effect of this type of therapy is considered as an additional equation of the discussed model. Numerical simulations show that the delay, which is added to the ordinary differential equations and represents the time needed for transformation from one cell type to another, affects the behavior of the system. Validation and verification of the proposed model are based on medical data. Analytical results are illustrated by numerical examples of the model dynamics. The model is able to reconstruct the dynamics of cancer treatment and may be used to determine the most effective treatment regimen based on the study of the behavior of individual treatment protocols.
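The delay mechanism described above can be sketched numerically. The following forward-Euler integration of a generic delayed logistic growth equation with a constant therapy kill term is an illustration only; the paper's actual equations and parameters are not reproduced.

```python
# Illustrative delay differential equation: the growth term depends on the
# state tau time units ago, mimicking a transformation delay between cell
# types. Equation and parameters are a generic sketch, not the paper's model.

def simulate(r=0.3, K=1.0, kill=0.1, tau=5.0, dt=0.01, T=100.0, n0=0.1):
    """Forward-Euler for N'(t) = r*N(t-tau)*(1 - N(t-tau)/K) - kill*N(t)."""
    lag = int(tau / dt)
    n = [n0] * (lag + 1)              # constant history on [-tau, 0]
    for _ in range(int(T / dt)):
        n_tau = n[-lag - 1]           # delayed state N(t - tau)
        dn = r * n_tau * (1 - n_tau / K) - kill * n[-1]
        n.append(max(n[-1] + dt * dn, 0.0))
    return n

traj = simulate()
print(round(traj[-1], 3))
```

Varying `tau` in such a scheme changes whether the trajectory settles monotonically or oscillates, which is the qualitative effect of delay the abstract refers to.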

Keywords: mathematical modeling, numerical simulation, ordinary differential equations, radiation therapy

Procedia PDF Downloads 406
18105 An As-Is Analysis and Approach for Updating Building Information Models and Laser Scans

Authors: Rene Hellmuth

Abstract:

Factory planning has the task of designing products, plants, processes, organization, areas, and the construction of a factory. The requirements for factory planning and the building of a factory have changed in recent years. Regular restructuring of the factory building is becoming more important in order to maintain the competitiveness of a factory. Restrictions in new areas, shorter life cycles of product and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity & Ambiguity) lead to more frequent restructuring measures within a factory. A building information model (BIM) is the planning basis for rebuilding measures and becomes an indispensable data repository for reacting quickly to changes. Use as a planning basis for restructuring measures in factories only succeeds if the BIM model has adequate data quality. Under this aspect and the industrial requirement, three data quality factors are particularly important for this paper regarding the BIM model: up-to-dateness, completeness, and correctness. The research question is: how can a BIM model be kept up to date with the required data quality, and which visualization techniques can be applied in a short period of time on the construction site during conversion measures? An as-is analysis is made of how BIM models and digital factory models (including laser scans) are currently kept up to date. Industrial companies are interviewed, and expert interviews are conducted. Subsequently, the results are evaluated, and a procedure is conceived for carrying out cost-effective and time-saving updating processes. The availability of low-cost hardware and the simplicity of the process are important to enable service personnel from facility management to keep digital factory models (BIM models and laser scans) up to date. The approach includes the detection of changes to the building, the recording of the changed area, and the insertion into the overall digital twin.
Finally, an overview of the possibilities for visualizations suitable for construction sites is compiled. An augmented reality application is created based on an updated BIM model of a factory and installed on a tablet. Conversion scenarios with costs and time expenditure are displayed. A user interface is designed in such a way that all relevant conversion information is available at a glance for the respective conversion scenario. A total of three essential research results are achieved: an as-is analysis of current update processes for BIM models and laser scans, the development of a time-saving and cost-effective update process, and the conception and implementation of an augmented reality solution for BIM models suitable for construction sites.

Keywords: building information modeling, digital factory model, factory planning, restructuring

Procedia PDF Downloads 112
18104 Bifurcation and Stability Analysis of the Dynamics of Cholera Model with Controls

Authors: C. E. Madubueze, S. C. Madubueze, S. Ajama

Abstract:

Cholera is a disease that is predominantly found in developing countries due to poor sanitation and overcrowding. In this paper, a deterministic model for the dynamics of cholera is developed, and control measures such as health education messages, therapeutic treatment, and vaccination are incorporated into the model. The effective reproduction number is computed in terms of the model parameters. The existence and stability of the equilibrium states, the disease-free and endemic equilibrium states, are established and shown to be locally and globally asymptotically stable when R0 < 1 and R0 > 1, respectively. The existence of backward bifurcation of the model is investigated. Furthermore, numerical simulation of the developed model is carried out to show the impact of the control measures, and the results indicate that combined control measures will help to reduce the spread of cholera in the population.
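A minimal sketch of how a control measure shifts the effective reproduction number can be written with a toy SIR-type model; the compartments, parameters, and vaccination mechanism below are generic placeholders, not the paper's cholera model.

```python
# Toy SIR simulation: vaccination removes susceptibles, lowering the
# effective reproduction number beta*s/gamma below 1 so the outbreak dies out.
# All parameters are illustrative placeholders.

def peak_infected(beta, gamma, vacc, days=200, dt=0.1, i0=0.01):
    """Euler-integrate a toy SIR model; vaccinated individuals start immune."""
    s, i = 1.0 - i0 - vacc, i0
    peak = i
    for _ in range(int(days / dt)):
        new_inf = beta * s * i
        s -= dt * new_inf
        i += dt * (new_inf - gamma * i)
        peak = max(peak, i)
    return peak

beta, gamma = 0.4, 0.2                        # basic reproduction number R0 = 2
print(peak_infected(beta, gamma, vacc=0.0))   # epidemic takes off
print(peak_infected(beta, gamma, vacc=0.8))   # effective R < 1: dies out
```

With 80% coverage the effective reproduction number is 0.4 × 0.19 / 0.2 ≈ 0.38 < 1, so infections decline from the start, which is the threshold behavior the stability analysis formalizes.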

Keywords: backward bifurcation, cholera, equilibrium, dynamics, stability

Procedia PDF Downloads 430
18103 Forensic Challenges in Source Device Identification for Digital Videos

Authors: Mustapha Aminu Bagiwa, Ainuddin Wahid Abdul Wahab, Mohd Yamani Idna Idris, Suleman Khan

Abstract:

Video source device identification has become a problem of concern in numerous domains, especially multimedia security and digital investigation, because videos are now used as evidence in legal proceedings. Source device identification aims at identifying the source of digital devices using the content they produce. However, due to affordable processing tools and the influx of digital content generating devices, source device identification remains a major problem within the digital forensic community. In this paper, we discuss source device identification for digital videos by identifying techniques that were proposed in the literature for model or specific device identification. This is aimed at identifying salient open challenges for future research.

Keywords: video forgery, source camcorder, device identification, forgery detection

Procedia PDF Downloads 629
18102 Flood Predicting in Karkheh River Basin Using Stochastic ARIMA Model

Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh

Abstract:

Floods have huge environmental and economic impacts; therefore, flood prediction receives a great deal of attention. This study analysed the annual maximum streamflow (discharge) (AMS or AMD) of the Karkheh River in the Karkheh River Basin for flood prediction using an ARIMA model. For this purpose, we used the Box-Jenkins approach, which comprises a four-stage method: model identification, parameter estimation, diagnostic checking, and forecasting (prediction). The main tools used in ARIMA modelling were the SAS and SPSS software packages. Model identification was done by visual inspection of the ACF and PACF. SAS computed the model parameters using the ML, CLS, and ULS methods. The diagnostic checking tests, the AIC criterion and the RACF and RPACF graphs, were used for verification of the selected model. In this study, the best ARIMA model for the Annual Maximum Discharge (AMD) time series was ARIMA(4,1,1), with an AIC value of 88.87. The RACF and RPACF showed the residuals' independence. The model was used to forecast AMD for 10 future years, demonstrating its ability to predict floods of the river under study in the Karkheh River Basin. Model accuracy was checked by comparing the predicted and observed series using the coefficient of determination (R²).
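The Box-Jenkins cycle described above can be illustrated on synthetic data: difference the series (the "I" in ARIMA), fit an autoregressive term by ordinary least squares, and forecast one step ahead. This pure-Python stand-in for the SAS/SPSS tooling fits only an AR(1) on first differences, not the ARIMA(4,1,1) model reported in the study.

```python
# Minimal Box-Jenkins sketch on a synthetic annual-maximum series:
# first-difference, fit AR(1) by OLS, forecast one step ahead.
import random

random.seed(0)
series = [1000.0]                    # synthetic annual-maximum discharges
for _ in range(49):
    series.append(series[-1] + 10 + random.gauss(0, 50))

diff = [b - a for a, b in zip(series, series[1:])]   # d = 1 differencing

# OLS fit of diff[t] = c + phi * diff[t-1]
x, y = diff[:-1], diff[1:]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
phi = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
      sum((a - mx) ** 2 for a in x)
c = my - phi * mx

forecast = series[-1] + c + phi * diff[-1]   # one-step-ahead level forecast
print(round(forecast, 1))
```

A full ARIMA(4,1,1) fit would add three more AR lags and a moving-average term, estimated by ML/CLS/ULS as in the study; the differencing-then-fit structure is the same.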

Keywords: time series modelling, stochastic processes, ARIMA model, Karkheh river

Procedia PDF Downloads 286
18101 Automatic Censoring in K-Distribution for Multiple Targets Situations

Authors: Naime Boudemagh, Zoheir Hammoudi

Abstract:

Parameter estimation for the K-distribution is an essential part of radar detection. The presence of interfering targets in reference cells causes a decrease in detection performance: in such situations, the estimates of the shape and scale parameters are far from the actual values. In order to avoid interfering targets, we propose an Automatic Censoring (AC) algorithm for radar interfering targets in the K-distribution. The censoring technique used in this work offers good discrimination between homogeneous and non-homogeneous environments. The homogeneous population is then used to estimate the unknown parameters by the classical Method of Moments (MOM). The AC algorithm needs no prior information about the clutter parameters, nor does it require the number or positions of the interfering targets. The accuracy of the parameter estimates obtained by this algorithm is validated and compared against various actual values of the shape parameter using Monte Carlo simulations; these show that the probabilities of censoring in multiple-target situations are in good agreement.
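The censoring idea can be sketched with ordered statistics: grow the homogeneous set from the smallest reference cells and stop once a cell exceeds an adaptive threshold tied to the current clutter-level estimate. The threshold factor below is illustrative; the paper's K-distribution-specific thresholds and MOM estimation are not reproduced.

```python
# Hedged sketch of ordered-statistics automatic censoring: cells far above
# the running clutter-level estimate are treated as interfering targets and
# excluded. The factor 3.0 and start size are illustrative choices.

def censor(cells, start=8, factor=3.0):
    ordered = sorted(cells)
    homog = ordered[:start]               # assume smallest cells are clutter
    for v in ordered[start:]:
        if v > factor * (sum(homog) / len(homog)):
            break                         # interfering targets: censor the rest
        homog.append(v)
    return homog

clutter = [1.1, 0.9, 1.3, 0.8, 1.2, 1.0, 0.7, 1.4, 1.05, 0.95]
cells = clutter + [9.0, 12.0]             # two interfering targets
kept = censor(cells)
print(len(kept))                          # targets excluded from the estimate
```

The moments of `kept` can then feed a MOM estimator without contamination from the censored cells, which is the point of the AC step.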

Keywords: parameters estimation, method of moments, automatic censoring, K distribution

Procedia PDF Downloads 371
18100 Detecting Heartbeat Architectural Tactic in Source Code Using Program Analysis

Authors: Ananta Kumar Das, Sujit Kumar Chakrabarti

Abstract:

Architectural tactics, such as heartbeat, ping-echo, encapsulation, and data encryption, are techniques used to achieve quality attributes of a system. Detecting architectural tactics has several benefits: it can aid system comprehension (e.g., of legacy systems) and the estimation of quality attributes such as safety, security, and maintainability. Architectural tactics are typically spread over the source code and are implicit. For large codebases, manual detection is often not feasible; therefore, there is a need for automated methods for detecting architectural tactics. This paper presents a formalization of the heartbeat architectural tactic and a program-analytic approach to detect this tactic in source code. The proposed method is evaluated on a set of Java applications. The outcome of the experiment strongly suggests that the method compares well with a manual approach in terms of its sensitivity and specificity, and far surpasses a manual exercise in terms of its scalability.
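As a toy illustration of tactic detection by program analysis, the sketch below walks a Python AST and flags a loop that both sleeps and emits a message. This is only a crude stand-in: the paper formalizes the tactic for Java and uses alias analysis, neither of which is reproduced here.

```python
# Toy heartbeat-pattern detector: a loop containing both a sleep call and a
# send-like call is flagged. Purely illustrative; not the paper's method.
import ast

SRC = """
import time
def heartbeat(sock):
    while True:
        sock.send(b"alive")
        time.sleep(5)
"""

def calls_in(node):
    return [n for n in ast.walk(node) if isinstance(n, ast.Call)]

def has_heartbeat(tree):
    for loop in ast.walk(tree):
        if isinstance(loop, (ast.While, ast.For)):
            names = {getattr(c.func, "attr", getattr(c.func, "id", ""))
                     for c in calls_in(loop)}
            if "sleep" in names and "send" in names:
                return True
    return False

print(has_heartbeat(ast.parse(SRC)))
```

A real detector must also prove that the send target is the monitoring channel (hence the paper's alias analysis) rather than matching call names syntactically.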

Keywords: software architecture, architectural tactics, detecting architectural tactics, program analysis, AST, alias analysis

Procedia PDF Downloads 157
18099 An Inquiry on 2-Mass and Wheeled Mobile Robot Dynamics

Authors: Boguslaw Schreyer

Abstract:

In this paper, a general dynamical model is derived using the Lagrange formalism. The two masses, sprung and unsprung, are included in a six-degree-of-freedom model for the sprung mass. The unsprung mass is included and shown only in a simplified model, although its equations have also been derived by the author. The simplified equations, more suitable for a computer model of the robot's dynamics, are also shown.

Keywords: dynamics, mobile, robot, wheeled mobile robots

Procedia PDF Downloads 332
18098 Pharmacokinetic Monitoring of Glimepiride and Ilaprazole in Rat Plasma by High Performance Liquid Chromatography with Diode Array Detection

Authors: Anil P. Dewani, Alok S. Tripathi, Anil V. Chandewar

Abstract:

This manuscript reports the development and validation of a quantitative high performance liquid chromatography method for the pharmacokinetic evaluation of Glimepiride (GLM) and Ilaprazole (ILA) in rat plasma. The plasma samples were processed by solid-phase extraction (SPE). The analytes were resolved on a Phenomenex C18 column (4.6 mm × 250 mm; 5 µm particle size) using an isocratic elution mode comprising methanol:water (80:20 % v/v), with the pH of the water adjusted to 3 using formic acid; the total run time was 10 min at 225 nm as the common wavelength, and the flow rate throughout was 1 mL/min. The method was validated over the concentration range from 10 to 600 ng/mL for GLM and ILA in rat plasma, with Metformin (MET) used as the internal standard. Validation data demonstrated the method to be selective, sensitive, accurate, and precise. The limits of detection were 1.54 and 4.08, and the limits of quantification were 5.15 and 13.62, for GLM and ILA, respectively; the method demonstrated excellent linearity with correlation coefficients (r²) of 0.999. The intra- and inter-day precision (RSD%) values were < 2.0% for both ILA and GLM. The method was successfully applied in pharmacokinetic studies following oral administration in rats.
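Validation figures such as the limits of detection and quantification are conventionally derived from the calibration line (ICH-style: LOD = 3.3σ/S and LOQ = 10σ/S, with σ the residual standard deviation and S the slope). The sketch below applies these formulas to made-up calibration points, not the paper's data.

```python
# Illustrative calibration-based LOD/LOQ calculation; concentrations and
# responses are invented for the example.

conc = [10, 50, 100, 200, 400, 600]            # ng/mL (made up)
area = [0.9, 4.8, 10.2, 19.8, 40.5, 59.9]      # detector response (made up)

n = len(conc)
mx, my = sum(conc) / n, sum(area) / n
slope = sum((x - mx) * (y - my) for x, y in zip(conc, area)) / \
        sum((x - mx) ** 2 for x in conc)
intercept = my - slope * mx
resid = [y - (slope * x + intercept) for x, y in zip(conc, area)]
sigma = (sum(r * r for r in resid) / (n - 2)) ** 0.5   # residual std dev

lod = 3.3 * sigma / slope
loq = 10 * sigma / slope
print(round(slope, 4), round(lod, 2), round(loq, 2))
```

By construction LOQ/LOD = 10/3.3 ≈ 3, which matches the rough ratio between the quantification and detection limits quoted in the abstract.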

Keywords: pharmacokinetics, glimepiride, ilaprazole, HPLC, SPE

Procedia PDF Downloads 367
18097 Visual Detection of Escherichia coli (E. coli) through Formation of Beads Aggregation in Capillary Tube by Rolling Circle Amplification

Authors: Bo Ram Choi, Ji Su Kim, Juyeon Cho, Hyukjin Lee

Abstract:

Food contaminated by bacteria such as E. coli causes food poisoning, which affects many patients worldwide annually. We have introduced an application of rolling circle amplification (RCA) as a versatile biosensor and developed a diagnostic platform composed of a capillary tube and microbeads for rapid and easy detection of Escherichia coli (E. coli). When specific mRNA of E. coli is extracted after cell lysis, rolling circle amplification of a DNA template can be achieved and visualized by bead aggregation in the capillary tube. In contrast, if there is no bacterial pathogen in the sample, no bead aggregation is seen. This assay makes it possible to detect the target gene visually without specialized equipment, and it may lead to the development of a genetic kit for point-of-care testing (POCT) that can detect a target gene using microbeads.

Keywords: rolling circle amplification (RCA), Escherichia coli (E. coli), point of care testing (POCT), beads aggregation, capillary tube

Procedia PDF Downloads 362
18096 Unsupervised Detection of Burned Area from Remote Sensing Images Using Spatial Correlation and Fuzzy Clustering

Authors: Tauqir A. Moughal, Fusheng Yu, Abeer Mazher

Abstract:

Land-cover and land-use change information is important because of its practical uses in various applications, including deforestation, damage assessment, disaster monitoring, urban expansion, planning, and land management. Therefore, developing change detection methods for remote sensing images is an important ongoing research agenda. However, detection of change through optical remote sensing images is not a trivial task due to many factors, including the vagueness of the boundaries between changed and unchanged regions and the spatial dependence of pixels on their neighborhood. In this paper, we propose a binary change detection technique for bi-temporal optical remote sensing images. As in most optical remote sensing images, the transition between the two clusters (change and no change) is overlapping, and existing methods are incapable of providing accurate cluster boundaries. In this regard, a methodology is proposed which uses fuzzy c-means clustering to tackle the problem of vagueness between the changed and unchanged classes by formulating soft boundaries between them. Furthermore, in order to exploit the neighborhood information of the pixels, input patterns are generated for each pixel from the bi-temporal images using 3×3, 5×5 and 7×7 windows. The between-image and within-image spatial dependence of the pixels on their neighborhood is quantified by using the Pearson product-moment correlation and Moran's I statistic, respectively. The proposed technique consists of two phases. First, between-image and within-image spatial correlation is calculated to utilize the information that pixels at different locations may not be independent. Second, the fuzzy c-means technique is used to produce two clusters from the input features, not only taking care of the vagueness between the changed and unchanged classes but also exploiting the spatial correlation of the pixels.
To show the effectiveness of the proposed technique, experiments are conducted on multispectral, bi-temporal remote sensing images. A subset (2100×1212 pixels) of a pan-sharpened, bi-temporal Landsat 5 Thematic Mapper optical image of Los Angeles, California, is used in this study; it covers a long-lasting forest fire that continued from July until October 2009. The early and late forest fire optical remote sensing images were acquired on July 5, 2009 and October 25, 2009, respectively. The proposed technique is used to detect the fire (which causes change on the earth's surface) and is compared with the existing K-means clustering technique. Experimental results showed that the proposed technique performs better than the existing one. The proposed technique is easily extendable to optical hyperspectral images and is suitable for many practical applications.
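The core clustering step can be sketched with a minimal fuzzy c-means (c = 2, fuzzifier m = 2) on a one-dimensional "difference image", where each pixel receives soft memberships to the change and no-change clusters. The spatial-correlation features described in the abstract are omitted from this sketch.

```python
# Minimal fuzzy c-means on a 1-D difference image: soft memberships model
# the vague boundary between change and no-change. Pure-Python sketch.

def fcm(data, c=2, m=2.0, iters=50):
    centers = [min(data), max(data)]              # simple initialization
    u = [[0.0] * len(data) for _ in range(c)]
    for _ in range(iters):
        for j, x in enumerate(data):              # update memberships
            d = [abs(x - v) + 1e-12 for v in centers]
            for i in range(c):
                u[i][j] = 1.0 / sum((d[i] / dk) ** (2 / (m - 1)) for dk in d)
        for i in range(c):                        # update cluster centers
            w = [u[i][j] ** m for j in range(len(data))]
            centers[i] = sum(wj * x for wj, x in zip(w, data)) / sum(w)
    return centers, u

diff = [0.1, 0.2, 0.15, 0.05, 3.0, 3.2, 2.9]     # last three pixels changed
centers, u = fcm(diff)
print([round(v, 2) for v in centers])
```

Unlike K-means, each pixel keeps a membership in both clusters, so borderline difference values are not forced to a hard label until a final decision step.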

Keywords: burned area, change detection, correlation, fuzzy clustering, optical remote sensing

Procedia PDF Downloads 168
18095 Yaw Angle Effect on the Aerodynamic Performance of Rear-Roof Spoiler of Hatchback Vehicle

Authors: See-Yuan Cheng, Kwang-Yhee Chin, Shuhaimi Mansor

Abstract:

A rear-roof spoiler is commonly used to improve the aerodynamic performance of road vehicles. This study aims to investigate the effect of yaw angle on the effectiveness of a strip-type rear-roof spoiler in lowering the drag and lift coefficients of a hatchback model. A computational fluid dynamics (CFD) method was used, and the numerically obtained results were compared to experimental data for validation of the CFD method. With increasing yaw angle, both the drag and lift coefficients of the model increased, and the effectiveness of the spoiler deteriorated. These unfavorable effects were due to the formation of longitudinal vortices around the side edges of the model that caused the surface pressure of the model to drop. Furthermore, significant crossflow structures developed behind the model at larger yaw angles, which were associated with the drop in surface pressure over the rear section of the model and caused the drag coefficient to rise.

Keywords: Ahmed model, aerodynamics, spoiler, yaw angle

Procedia PDF Downloads 356
18094 Literature Review: Adversarial Machine Learning Defense in Malware Detection

Authors: Leidy M. Aldana, Jorge E. Camargo

Abstract:

Adversarial machine learning has gained importance in recent years alongside cybersecurity, especially with respect to malware, which has affected many entities and people. This paper presents a literature review of defense methods created to prevent adversarial machine learning attacks. It first introduces the context and defines some terms; in the results section, some of the attacks are described, focusing on detecting adversarial examples before they reach the machine learning algorithm, and the other defense categories that exist are presented. A five-step method is proposed in the methods section to define the way the literature review was conducted. In addition, this paper summarizes the contributions in this research field over the last seven years to identify research directions in this area. Regarding the findings, the defense category with the fewest open challenges is the detection of adversarial examples, making it a viable research route under an adaptive approach to attack and defense.

Keywords: malware, adversarial machine learning, defense, attack

Procedia PDF Downloads 60
18093 Molecular Detection of mRNA bcr-abl and Circulating Leukemic Stem Cells CD34+ in Patients with Acute Lymphoblastic Leukemia and Chronic Myeloid Leukemia and Its Association with Clinical Parameters

Authors: B. Gonzalez-Yebra, H. Barajas, P. Palomares, M. Hernandez, O. Torres, M. Ayala, A. L. González, G. Vazquez-Ortiz, M. L. Guzman

Abstract:

Leukemia arises by molecular alterations of the normal hematopoietic stem cell (HSC), transforming it into a leukemic stem cell (LSC) with high cell proliferation, self-renewal, and cell differentiation. Chronic myeloid leukemia (CML) originates from an LSC, leading to elevated proliferation of myeloid cells, and acute lymphoblastic leukemia (ALL) originates from an LSC, leading to elevated proliferation of lymphoid cells. In both cases, the LSC can be identified by multicolor flow cytometry using several antibodies. However, to date, LSC levels in peripheral blood (PB) are not well established in ALL and CML patients. On the other hand, the detection of minimal residual disease (MRD) in leukemia is mainly based on the identification of the mRNA bcr-abl gene in CML patients and some other genes in ALL patients; there is no proper biomarker to detect MRD in both types of leukemia. The objective of this study was to determine mRNA bcr-abl and the percentage of LSC in the peripheral blood of patients with CML and ALL and to identify a possible association between the amount of LSC in PB and clinical data. We included 19 patients with leukemia in this study. A PB sample was collected per patient, and leukocytes were obtained by Ficoll gradient. Immunophenotyping for LSC CD34+ was done by flow cytometry analysis with CD33, CD2, CD14, CD16, CD64, HLA-DR, CD13, CD15, CD19, CD10, CD20, CD34, CD38, CD71, CD90, CD117, and CD123 monoclonal antibodies. In addition, to identify the presence of the mRNA bcr-abl by RT-PCR, RNA was isolated using TRIZOL reagent. Molecular results (presence of mRNA bcr-abl and LSC CD34+) and clinical results were analyzed with descriptive statistics, and a multiple regression analysis was performed to determine statistically significant associations. In total, 19 patients (8 with ALL and 11 with CML) were analyzed: 9 patients with de novo leukemia (ALL = 6 and CML = 3) and 10 under treatment (ALL = 5 and CML = 5).
The overall frequency of mRNA bcr-abl was 31% (6/19); it was negative in ALL patients and positive in 80% of CML patients. On the other hand, LSC were determined in 16/19 leukemia patients (%LSC = 0.02-17.3). De novo patients had a higher percentage of LSC (0.26 to 17.3%) than patients under treatment (0 to 5.93%). The variables significantly associated with the amount of LSC were absence of treatment, absence of splenomegaly, and a lower leukocyte count; no association was found for the clinical variables age, sex, blasts, and mRNA bcr-abl. In conclusion, patients with de novo leukemia had a higher percentage of circulating LSC than patients under treatment, and this was associated with clinical parameters such as lack of treatment, absence of splenomegaly, and a lower leukocyte count. mRNA bcr-abl detection was only possible in the series of patients with CML, whereas circulating LSC could be identified in the peripheral blood of all leukemia patients; we believe the identification of circulating LSC may be used as a biomarker for the detection of MRD in leukemia patients.

Keywords: stem cells, leukemia, biomarkers, flow cytometry

Procedia PDF Downloads 355
18092 Investigated Optimization of Davidson Path Loss Model for Digital Terrestrial Television (DTTV) Propagation in Urban Area

Authors: Pitak Keawbunsong, Sathaporn Promwong

Abstract:

This paper presents an investigation of the efficiency of the optimized Davidson path loss model in order to identify a suitable path loss model for designing and planning DTTV propagation for small and medium urban areas in southern Thailand. Hadyai City in Songkla Province is chosen as the case study to collect the analytical data on electric field strength. The optimization is conducted through the least squares method, while efficiency is assessed through the statistical value of relative error (RE). The result of the least squares method is the offset and slope, as functions of frequency, to be used in the optimization process. The statistical result shows that the RE of the original Davidson model is the lowest when compared with the optimized Davidson and the Hata models. Thus, the original Davidson path loss model is the most accurate and the most suitable for planning the propagation network design.
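The least-squares calibration step can be sketched generically: solve for an offset and a slope correction against measured field-strength samples, then score the model by relative error. The base model below is a plain log-distance law with invented drive-test points, not the actual Davidson formulation or the Hadyai measurements.

```python
# Illustrative least-squares calibration of a path loss model plus relative
# error (RE) scoring; distances and measurements are made up.
import math

dist = [0.5, 1, 2, 4, 8]                    # km (invented drive-test points)
meas = [95, 102, 109, 115, 122]             # measured path loss, dB (invented)

def base(d):                                # placeholder log-distance model
    return 20 * math.log10(d * 1000)

# least squares for meas ~ base(d) + a + b*log10(d)
xs = [math.log10(d) for d in dist]
rs = [m - base(d) for m, d in zip(meas, dist)]
n = len(xs)
mx, mr = sum(xs) / n, sum(rs) / n
b = sum((x - mx) * (r - mr) for x, r in zip(xs, rs)) / \
    sum((x - mx) ** 2 for x in xs)
a = mr - b * mx

pred = [base(d) + a + b * math.log10(d) for d in dist]
re = sum(abs(p - m) / m for p, m in zip(pred, meas)) / n
print(round(a, 2), round(b, 2), round(re, 4))
```

Comparing the RE of the base, calibrated, and alternative models over the same measurement set is the selection criterion the study applies.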

Keywords: DTTV propagation, path loss model, Davidson model, least square method

Procedia PDF Downloads 337
18091 Research on Online Consumption of College Students in China with Stimulate-Organism-Reaction Driven Model

Authors: Wei Lu

Abstract:

With the development of information technology in China, online consumption is becoming more and more popular. As a special group with a high degree of education and distinct opinions and personalities, college students have gradually become a key focus group of online consumption. Studying college students' online consumption behavior therefore has important theoretical significance and practical value. Based on the Stimulus-Organism-Response (SOR) driving model and the structural equation model, this paper establishes a model of the factors influencing college students' online consumption behavior, evaluates and amends the model using SPSS and AMOS software, analyses and determines the positive factors for marketing to college student consumers, and provides an effective basis for guiding and promoting college student consumption.

Keywords: college students, online consumption, stimulate-organism-reaction driving model, structural equation model

Procedia PDF Downloads 151
18090 An Online Adaptive Thresholding Method to Classify Google Trends Data Anomalies for Investor Sentiment Analysis

Authors: Duygu Dere, Mert Ergeneci, Kaan Gokcesu

Abstract:

Google Trends data has gained increasing popularity in applications of behavioral finance, decision science, and risk management. Because of Google's wide range of use, the Trends statistics provide significant information about investor sentiment and intention, which can be used as decisive factors in corporate and risk management. However, an anomaly, i.e., a significant increase or decrease in a certain query, cannot be detected by state-of-the-art computational methods due to the random baseline noise of the Trends data, which is modelled as additive white Gaussian noise (AWGN). Since the baseline noise power changes gradually through time, an adaptive thresholding method is required to track and learn the baseline noise for correct classification. To this end, we introduce an online method to classify meaningful deviations in Google Trends data. Through extensive experiments, we demonstrate that our method can successfully classify various anomalies for plenty of different data.
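A simplified stand-in for the adaptive baseline tracking described above is an exponential moving average of the noise statistics with a k-sigma decision rule; the paper's learned soft-minimum threshold is not reproduced here.

```python
# Online anomaly flagging sketch: track baseline mean/variance with EWMAs
# and flag samples deviating more than k estimated sigmas. Illustrative only.
import random

def detect(stream, alpha=0.05, k=4.0, warmup=20):
    """Return a per-sample anomaly flag list for a 1-D stream."""
    mean, var, flags = stream[0], 1.0, []
    for t, x in enumerate(stream):
        sigma = var ** 0.5
        is_anom = t >= warmup and abs(x - mean) > k * sigma
        flags.append(is_anom)
        if not is_anom:                      # learn only from baseline samples
            d = x - mean
            mean += alpha * d
            var = (1 - alpha) * (var + alpha * d * d)
    return flags

random.seed(1)
stream = [50 + random.gauss(0, 2) for _ in range(100)]
stream[60] += 30                             # injected query spike
flags = detect(stream)
print(flags[60])
```

Because flagged samples are excluded from the EWMA updates, a spike does not corrupt the baseline estimate, which lets the threshold keep tracking the slowly drifting noise power.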

Keywords: adaptive data processing, behavioral finance, convex optimization, online learning, soft minimum thresholding

Procedia PDF Downloads 166
18089 Relation between Electrical Properties and Application of Chitosan Nanocomposites

Authors: Evgen Prokhorov, Gabriel Luna-Barcenas

Abstract:

The polysaccharide chitosan (CS) is an attractive biopolymer for the stabilization of several nanoparticles in acidic aqueous media, due in part to the presence of abundant primary NH2 and OH groups, which may provide steric or chemical stabilization. Applications of most CS nanocomposites are based upon the interaction of high-surface-area nanoparticles (NPs) with different substances; agglomeration of NPs therefore decreases the effective surface area and may reduce the efficiency of the nanocomposite. The aim of this work is to measure the electrical conductivity of nanocomposites in order to formulate optimal concentrations of conductive NPs in CS-based nanocomposites. Additionally, by comparing the efficiency of such nanocomposites, one can guide applications in the biomedical field (antibacterial properties and tissue regeneration) and the sensor field (detection of copper and nitrate ions in aqueous solutions). It was shown that the best antibacterial (CS-AgNPs, CS-AgNPs-carbon nanotubes) and wound healing (CS-AuNPs) properties are observed in nanocomposites with NP concentrations near the percolation threshold. Likewise, the best detection limits of potentiometric and impedimetric sensors for copper ions (using a CS-AuNPs membrane) and nitrate ions (using a CS-clay membrane) in aqueous solutions have been observed for membranes with NP concentrations near the percolation threshold. It is well known that at the percolation concentration an abrupt increase in conductivity is observed due to the formation of physical contacts between NPs; above this concentration, agglomeration of NPs takes place, such that the effective surface area and the performance of the nanocomposite decrease.
The obtained relationship between the electrical percolation threshold and the performance of polymer nanocomposites with conductive NPs is important for the design and optimization of polymer-based nanocomposites for different applications.
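The abrupt conductivity rise at the percolation concentration can be sketched with the classical power law σ = σ₀(p − p_c)^t above the threshold. This is an illustrative sketch only: the threshold p_c = 0.16 and exponent t = 2.0 are typical literature values for 3D random composites, not values from this study.

```python
# Illustrative sketch (not from the paper): effective conductivity of a
# composite near the percolation threshold, using the classical power law
# sigma = sigma0 * (p - pc)**t for filler volume fractions p above pc.
# pc = 0.16 and t = 2.0 are assumed, typical 3D literature values.

def composite_conductivity(p, sigma0=1.0, pc=0.16, t=2.0):
    """Effective conductivity (relative units) vs. filler volume fraction p."""
    if p <= pc:
        return 0.0  # below threshold: no percolating network of contacts
    return sigma0 * (p - pc) ** t

# Conductivity is zero below pc and rises sharply just above it:
print(composite_conductivity(0.15))  # 0.0
print(composite_conductivity(0.20))  # 0.0016...
```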

Keywords: chitosan, conductivity nanoparticles, percolation threshold, polymer nanocomposites

Procedia PDF Downloads 211
18088 Detection of Curvilinear Structure via Recursive Anisotropic Diffusion

Authors: Sardorbek Numonov, Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Dongeun Choi, Byung-Woo Hong

Abstract:

The detection of curvilinear structures often plays an important role in the analysis of images. In particular, it is considered a crucial step in the diagnosis of chronic respiratory diseases to localize the fissures in chest CT imagery, where the lung is divided into five lobes by fissures that appear as linear features. However, the characteristic linear features of the fissures are often subtle due to high intensity variability, pathological deformation, or image noise involved in the imaging procedure, which leads to uncertainty in the quantification of anatomical or functional properties of the lung. It is therefore desirable to enhance the linear features present in chest CT images so that the delineation of the lobes becomes more distinctive. We propose a recursive diffusion process that prefers coherent features based on the analysis of the structure tensor in an anisotropic manner. Local image features associated with certain scales and directions can be characterized by the eigenanalysis of the structure tensor, which is often regularized via isotropic diffusion filters. However, the isotropic diffusion filters involved in the computation of the structure tensor generally blur geometrically significant structure, degrading the discriminative power of the features. It is therefore necessary to take the local scale and direction of the features into consideration when computing the structure tensor. We apply an anisotropic diffusion, informed by the scale and direction of the features, in the computation of the structure tensor; the eigenanalysis of the resulting tensor then determines the shape of the anisotropic diffusion kernel.
The recursive application of this anisotropic diffusion, with a kernel whose shape is derived from the structure tensor, leads to an anisotropic scale-space in which geometrical features are preserved via the eigenanalysis of the structure tensor computed from the diffused image. The recursive interaction between the diffusion based on the geometry-driven kernels and the computation of the structure tensor that determines the shape of those kernels yields a scale-space in which the geometrical properties of the image structure are effectively characterized. We apply our recursive anisotropic diffusion algorithm to the detection of curvilinear structure in chest CT imagery, where the fissures present curvilinear features and define the boundaries of the lobes. It is shown that our algorithm yields precise detection of the fissures while overcoming the subtlety of the characteristic linear features. The quantitative evaluation demonstrates the robustness and effectiveness of the proposed algorithm for the detection of fissures in chest CT in terms of false positive and true positive measures. The receiver operating characteristic curves indicate the potential of our algorithm as a segmentation tool in the clinical environment. This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by the IITP (Institute for Information and Communications Technology Promotion).
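The structure tensor eigenanalysis at the core of the method can be illustrated with a minimal numerical sketch (ours, not the authors' implementation). Without the spatial regularization discussed above, the per-pixel tensor is rank-one, so this sketch only shows the eigenvalue-based coherence measure that separates line-like regions from flat ones.

```python
import numpy as np

def structure_tensor_coherence(img):
    """Per-pixel 2x2 structure tensor J = [[Jxx, Jxy], [Jxy, Jyy]] and a
    coherence measure from its eigenvalues (~1 = strongly oriented, 0 = flat)."""
    Iy, Ix = np.gradient(img.astype(float))  # gradients along rows, columns
    Jxx, Jxy, Jyy = Ix * Ix, Ix * Iy, Iy * Iy
    # Closed-form eigenvalues of a symmetric 2x2 matrix.
    half_tr = (Jxx + Jyy) / 2.0
    disc = np.sqrt(np.maximum(half_tr ** 2 - (Jxx * Jyy - Jxy ** 2), 0.0))
    lam1, lam2 = half_tr + disc, half_tr - disc
    # Coherence is high along curvilinear structures, zero in flat regions.
    total = lam1 + lam2
    return np.where(total > 0, (lam1 - lam2) / (total + 1e-12), 0.0)

# A vertical step edge: coherence is ~1 on the edge, 0 in flat areas.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
coh = structure_tensor_coherence(img)
```

In the full method, the tensor components would additionally be smoothed by the geometry-driven anisotropic kernel before the eigenanalysis, which is what preserves the fissures' linear structure across scales.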

Keywords: anisotropic diffusion, chest CT imagery, chronic respiratory disease, curvilinear structure, fissure detection, structure tensor

Procedia PDF Downloads 231
18087 Detection of Extrusion Blow Molding Defects by Airflow Analysis

Authors: Eva Savy, Anthony Ruiz

Abstract:

In extrusion blow molding, there is great variability in product quality due to the sensitivity of the process to machine settings. These variations lead to unnecessary rejects and loss of time, yet production control is a major challenge for companies in this sector seeking to remain competitive in their market. Current quality control methods apply only to finished products (vision control, leak testing, etc.). It has been shown that melt temperature, blowing pressure, and ambient temperature have a significant impact on the variability of product quality. Since blowing is a key step in the process, we study this parameter in this paper. The objective is to determine whether airflow analysis allows the identification of quality problems before the manufacturing process is complete. We conducted tests to determine whether it was possible to identify a leakage defect and an obstruction defect, two common product defects. The results showed that it was possible to identify a leakage defect by airflow analysis.
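The abstract does not give the detection algorithm; as a hypothetical sketch, a leakage defect could be flagged by comparing the measured airflow profile during blowing against a reference profile and detecting relative deviations beyond a tolerance. The profiles and the 10% tolerance below are invented for illustration.

```python
# Hypothetical sketch (our illustration; the paper does not specify its
# algorithm): flag a defect when the airflow during blowing deviates
# from a known-good reference profile by more than a relative tolerance.

def airflow_defect(measured, reference, tolerance=0.1):
    """True if any sample deviates from the reference by more than
    `tolerance` (relative), suggesting a leak or an obstruction."""
    for m, r in zip(measured, reference):
        if r != 0 and abs(m - r) / abs(r) > tolerance:
            return True
    return False

reference  = [1.0, 1.20, 1.40, 1.40, 1.30]
good_part  = [1.0, 1.19, 1.41, 1.39, 1.31]
leaky_part = [1.0, 1.35, 1.60, 1.60, 1.50]  # higher flow: air escaping

print(airflow_defect(good_part, reference))   # False
print(airflow_defect(leaky_part, reference))  # True
```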

Keywords: extrusion blow molding, signal, sensor, defects, detection

Procedia PDF Downloads 149
18086 Effect of Weathering on the Mineralogy and Geochemistry of Sediments of the Hyper Saline Urmia Salt Lake, Iran

Authors: Samad Alipour, Khadije Mosavi Onlaghi

Abstract:

Urmia Salt Lake (USL) is a hypersaline lake in the northwest of Iran. Halite is its main dissolved and precipitated mineral and the major mineral mixed with the lake bed sediments. Other detrital minerals, such as calcite, aragonite, dolomite, quartz, feldspars, and augite, also form the lake sediments. This study examined the impact of weathering on these sediments, collected from 1.5 meters depth, and on augite placers. The study indicated that weathering of the tephritic and adakitic rocks of Islamic Island, at the immediate boundary of the lake, exerts the main control on the lake bed sediments and has produced a large volume of augite placer along the lake bank. Weathering increases from south to north with increasing distance from Islamic Island. The geochemistry of the lake sediments demonstrated enrichment in MgO, CaO, and Sr, with an elevated Eu anomaly, possibly due to surface adsorption of Mn and Fe; the associated Sr elevation originates from the adakitic volcanic rocks in the vicinity of the lake basin. The study shows that the local geology is a more important factor in the origin of the lake sediments than minerals produced chemically and biochemically during diagenetic processes.

Keywords: Urmia Lake, weathering, mineralogy, augite, Iran

Procedia PDF Downloads 228
18085 Development of an Image-Based Biomechanical Model for Assessment of Hip Fracture Risk

Authors: Masoud Nasiri Sarvi, Yunhua Luo

Abstract:

Low-trauma hip fracture, usually caused by a fall from standing height, has become a major source of morbidity and mortality for the elderly. Factors affecting hip fracture include sex, race, age, body weight, height, body mass distribution, etc., and thus hip fracture risk in a fall differs widely from subject to subject. It is therefore necessary to develop a subject-specific biomechanical model to predict hip fracture risk. The objective of this study is to develop a two-level, image-based, subject-specific biomechanical model, consisting of a whole-body dynamics model and a proximal-femur finite element (FE) model, for more accurately assessing the risk of hip fracture in lateral falls. The information required for constructing the model is extracted from a whole-body and a hip DXA (dual-energy X-ray absorptiometry) image of the subject. The proposed model considers all parameters subject-specifically, providing a fast, accurate, and inexpensive method for predicting hip fracture risk.

Keywords: bone mineral density, hip fracture risk, impact force, sideways falls

Procedia PDF Downloads 534
18084 Development of a Folding-Based Aptasensor for Ochratoxin A Using Differential Pulse Voltammetry

Authors: Rupesh K. Mishra, Gaëlle Catanante, Akhtar Hayat, Jean-Louis Marty

Abstract:

Ochratoxins (OTA) are secondary metabolites present in a wide variety of foodstuffs. They are dangerous by-products mainly produced by several species of storage fungi, including the Aspergillus and Penicillium genera. OTA is known to have nephrotoxic, immunotoxic, teratogenic, and carcinogenic effects and thus demands a highly sensitive and selective detection system that can quantify these organic toxins in various matrices, such as cocoa beans. This work presents a folding-based aptasensor employing an aptamer conjugated to a redox probe (methylene blue) specifically designed for OTA. The aptamers were covalently attached to screen-printed carbon electrodes using diazonium grafting. Upon sensing, OTA binds to the immobilized aptamer on the electrode surface, which induces a conformational change of the aptamer and, consequently, an increase in the signal. This conformational change of the aptamer before and after biosensing of the target OTA produces a distinguishable electrochemical signal. The obtained limit of detection was 0.01 ng/ml for OTA samples, with recovery of up to 88% in contaminated cocoa samples.

Keywords: ochratoxin A, cocoa, DNA aptamer, labelled probe

Procedia PDF Downloads 283
18083 Physical Education Teacher's Interpretation toward Teaching Games for Understanding Model

Authors: Soni Nopembri

Abstract:

The objective of this research is to evaluate the implementation of the Teaching Games for Understanding (TGfU) model through action research with physical education teachers who have long teaching experience. The research applied Participatory Action Research. The subjects of this research were 19 physical education teachers who had received training in Teaching Games for Understanding. Data collection was conducted intensively through a questionnaire, in-depth interviews, Focus Group Discussion (FGD), observation, and documentation. The collected data were analyzed qualitatively and quantitatively. The results showed that physical education teachers had an appropriate interpretation of the TGfU model. The indicators that were the focus of this research showed the following points: (1) physical education teachers had a good understanding of the TGfU model; (2) PE teachers' competence in applying the TGfU model in physical education at school was adequate, though some improvement was needed; (3) the influential factors in the implementation of the TGfU model were, in order, the teacher, facilities, environment, and student factors; (4) PE teachers' perspective on the TGfU model was positive, although some teachers were less optimistic about the development of the TGfU model in the future.

Keywords: TGfU, physical education teacher, teaching games, FGD

Procedia PDF Downloads 544
18082 Geomechanical Numerical Modeling of Well Wall in Drilling with Finite Difference Method

Authors: Marzieh Zarei

Abstract:

Well instability is one of the most fundamental challenges faced by the oil and gas industry, and well wall stability analysis remains a gap to be filled. The collection of static data, such as well logs, leads to the construction of a geomechanical numerical model, which helps in assessing probable risks in future drilling. In this paper, a geomechanical model was designed, and the mechanical properties of the rock were determined at all points of the model. The safe mud window was determined, with the minimum and maximum mud pressures found in the ranges of 60-70 MPa and 100-110 MPa, respectively.
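The safe-mud-window check that such a model supports can be sketched as a simple interval test. Taking the upper end of the reported minimum range (70 MPa) and the lower end of the reported maximum range (100 MPa) as a conservative window is our assumption for illustration, not a recommendation from the paper.

```python
# Minimal sketch (our illustration) of a safe-mud-window check: the mud
# pressure must stay above the collapse limit and below the fracture
# limit. The 70-100 MPa window is a conservative reading of the ranges
# reported above, assumed here for illustration.

def mud_pressure_safe(p_mud, p_min=70.0, p_max=100.0):
    """True if the mud pressure (MPa) lies inside the safe window."""
    return p_min <= p_mud <= p_max

print(mud_pressure_safe(85.0))   # True: within the window
print(mud_pressure_safe(55.0))   # False: risk of wellbore collapse
print(mud_pressure_safe(115.0))  # False: risk of formation fracture
```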

Keywords: geomechanics, numerical model, well stability, in-situ stress, underbalanced drilling

Procedia PDF Downloads 127
18081 Extraction of Forest Plantation Resources in Selected Forest of San Manuel, Pangasinan, Philippines Using LiDAR Data for Forest Status Assessment

Authors: Mark Joseph Quinto, Roan Beronilla, Guiller Damian, Eliza Camaso, Ronaldo Alberto

Abstract:

Forest inventories are essential to assess the composition, structure, and distribution of forest vegetation, which can be used as baseline information for management decisions. Classical forest inventory is labor-intensive, time-consuming, and sometimes even dangerous. The use of Light Detection and Ranging (LiDAR) in forest inventory can improve on and overcome these restrictions. This study was conducted to determine the possibility of using LiDAR-derived data to extract high-accuracy forest biophysical parameters and as a non-destructive method for forest status analysis of San Manuel, Pangasinan. Forest resource extraction was carried out using LAStools, GIS, ENVI, and .bat scripts with the available LiDAR data. The process includes the generation of derivatives such as the Digital Terrain Model (DTM), Canopy Height Model (CHM), and Canopy Cover Model (CCM) in .bat scripts, followed by the generation of 17 composite bands used to extract forest cover classes with ENVI 4.8 and GIS software. The diameter at breast height (DBH), above-ground biomass (AGB), and carbon stock (CS) were estimated for each classified forest cover, and tree count extraction was carried out using GIS. Subsequently, field validation was conducted for accuracy assessment. Results showed that the forest of San Manuel has 73% forest cover, which is much higher than the 10% canopy cover requirement. For the extracted canopy height, 80% of the trees range from 12 m to 17 m in height. The CS of the three forest covers based on AGB were: 20819.59 kg/20x20 m for closed broadleaf, 8609.82 kg/20x20 m for broadleaf plantation, and 15545.57 kg/20x20 m for open broadleaf. The average tree count for the forest plantation was 413 trees/ha. As such, the forest of San Manuel has high percent forest cover and high CS.
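Two of the derivatives mentioned above can be sketched numerically (our illustration, not the study's scripts): the canopy height model as the difference between a surface model and the terrain model, and percent canopy cover as the fraction of cells above a height cutoff. The 5 m cutoff and the tiny grids below are assumptions for illustration only.

```python
import numpy as np

# Sketch of two LiDAR derivatives (our illustration, not the study's
# .bat scripts): CHM = DSM - DTM, and percent canopy cover as the share
# of grid cells at or above an assumed 5 m height cutoff.

def canopy_height_model(dsm, dtm):
    """Canopy height (m) per cell; negative noise is clamped to ground."""
    return np.maximum(dsm - dtm, 0.0)

def percent_canopy_cover(chm, height_cutoff=5.0):
    """Percent of cells whose canopy height reaches the cutoff."""
    return 100.0 * np.mean(chm >= height_cutoff)

dsm = np.array([[210.0, 215.0], [203.0, 218.0]])  # first-return surface (m)
dtm = np.array([[200.0, 200.0], [200.0, 200.0]])  # bare-earth terrain (m)
chm = canopy_height_model(dsm, dtm)               # heights: 10, 15, 3, 18 m
print(percent_canopy_cover(chm))                  # 75.0
```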

Keywords: carbon stock, forest inventory, LiDAR, tree count

Procedia PDF Downloads 387