Search results for: forecasting accuracy
2515 Unified Structured Process for Health Analytics
Authors: Supunmali Ahangama, Danny Chiang Choon Poo
Abstract:
Health analytics (HA) is used in healthcare systems for effective decision-making, management, and planning of healthcare and related activities. However, user resistance, the unique content and structure of medical data (including heterogeneous and unstructured data), and impromptu HA projects have held up progress in HA applications. Notably, the accuracy of outcomes depends on the skills and domain knowledge of the data analyst working on the healthcare data. The success of HA depends on having a sound process model, effective project management, and the availability of supporting tools. Thus, to overcome these challenges through an effective process model, we propose an HA process model with features from the rational unified process (RUP) model and agile methodology.
Keywords: agile methodology, health analytics, unified process model, UML
Procedia PDF Downloads 506
2514 A Systematic Review Investigating the Use of EEG Measures in Neuromarketing
Authors: A. M. Byrne, E. Bonfiglio, C. Rigby, N. Edelstyn
Abstract:
Introduction: Neuromarketing employs numerous methodologies when investigating products and advertisement effectiveness. Electroencephalography (EEG), a non-invasive measure of electrical activity from the brain, is commonly used in neuromarketing. EEG data can be considered using time-frequency (TF) analysis, where changes in the frequency of brainwaves are calculated to infer participants' mental states, or event-related potential (ERP) analysis, where changes in amplitude are observed in direct response to a stimulus. This presentation discusses the findings of a systematic review of EEG measures in neuromarketing. A systematic review summarises evidence on a research question, using explicit measures to identify, select, and critically appraise relevant research papers. This systematic review identifies which EEG measures are the most robust predictors of customer preference and purchase intention. Methods: Search terms identified 174 papers that used EEG in combination with marketing-related stimuli. Publications were excluded if they were written in a language other than English or were not published as journal articles (e.g., book chapters). The review investigated which TF effect (e.g., theta-band power) and ERP component (e.g., N400) most consistently reflected preference and purchase intention. Machine-learning prediction was also investigated, along with the use of EEG combined with physiological measures such as eye-tracking. Results: Frontal alpha asymmetry was the most reliable TF signal, where an increase in activity over the left side of the frontal lobe indexed a positive response to marketing stimuli, while an increase in activity over the right side indexed a negative response. The late positive potential, a positive amplitude increase around 600 ms after stimulus presentation, was the most reliable ERP component, reflecting the conscious emotional evaluation of marketing stimuli. However, each measure showed mixed results when related to preference and purchase behaviour. Predictive accuracy was greatly improved through machine-learning algorithms such as deep neural networks, especially when combined with eye-tracking or facial expression analyses. Discussion: This systematic review provides a novel catalogue of the most effective use of each EEG measure commonly used in neuromarketing. Exciting findings to emerge are the identification of frontal alpha asymmetry and the late positive potential as markers of preferential responses to marketing stimuli. Machine-learning algorithms achieved predictive accuracies as high as 97%, and future research should therefore focus on machine-learning prediction when using EEG measures in neuromarketing.
Keywords: EEG, ERP, neuromarketing, machine-learning, systematic review, time-frequency
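The frontal alpha asymmetry highlighted above is conventionally computed as the difference of log alpha-band powers over homologous right and left frontal electrodes. A minimal Python sketch follows; the F3/F4 channel pairing, sampling rate, and Welch settings are common conventions assumed here, not details taken from the review.

```python
import numpy as np
from scipy.signal import welch

def alpha_power(eeg, fs, band=(8.0, 13.0)):
    """Mean power spectral density in the alpha band via Welch's method."""
    freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Frontal alpha asymmetry: ln(right) - ln(left). Because alpha power is
# inversely related to cortical activation, a positive score indexes greater
# relative left-frontal activity (the positive/approach response above).
fs = 256                                  # sampling rate in Hz (illustrative)
rng = np.random.default_rng(0)
left_f3 = rng.standard_normal(10 * fs)    # stand-ins for 10 s of F3/F4 EEG
right_f4 = rng.standard_normal(10 * fs)
faa = np.log(alpha_power(right_f4, fs)) - np.log(alpha_power(left_f3, fs))
print(f"frontal alpha asymmetry: {faa:+.3f}")
```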
Procedia PDF Downloads 112
2513 Prediction of Coronary Artery Stenosis Severity Based on Machine Learning Algorithms
Authors: Yu-Jia Jian, Emily Chia-Yu Su, Hui-Ling Hsu, Jian-Jhih Chen
Abstract:
The coronary artery is the major supplier of myocardial blood flow. When fat and cholesterol are deposited in the coronary arterial wall, narrowing and stenosis of the artery occur, which may lead to myocardial ischemia and eventually infarction. According to the World Health Organization (WHO), an estimated 7.4 million people died of coronary heart disease in 2015. According to statistics from the Ministry of Health and Welfare in Taiwan, heart disease (except for hypertensive diseases) ranked second among the top 10 causes of death from 2013 to 2016, and it still shows a growing trend. According to the American Heart Association (AHA), the risk factors for coronary heart disease include age (> 65 years), sex (men to women at a 2:1 ratio), obesity, diabetes, hypertension, hyperlipidemia, smoking, family history, lack of exercise, and more. We collected a dataset of 421 patients from a hospital located in northern Taiwan who received coronary computed tomography (CT) angiography. There were 300 males (71.26%) and 121 females (28.74%), with ages ranging from 24 to 92 years and a mean age of 56.3 years. Prior to coronary CT angiography, basic data of the patients, including age, gender, obesity index (BMI), diastolic blood pressure, systolic blood pressure, diabetes, hypertension, hyperlipidemia, smoking, family history of coronary heart disease, and exercise habits, were collected and used as input variables. The output variable of the prediction module is the degree of coronary artery stenosis. In this study, the dataset was randomly divided into 80% as the training set and 20% as the test set. Four machine learning algorithms, including logistic regression, stepwise logistic regression, neural network, and decision tree, were incorporated to generate prediction results. We used the area under the curve (AUC) and accuracy (Acc.) to compare the four models; the best model was the neural network, followed by stepwise logistic regression, decision tree, and logistic regression, with 0.68 / 79%, 0.68 / 74%, 0.65 / 78%, and 0.65 / 74%, respectively. The sensitivity of the neural network was 27.3% and its specificity was 90.8%; stepwise logistic regression had a sensitivity of 18.2% and a specificity of 92.3%; the decision tree had a sensitivity of 13.6% and a specificity of 100%; logistic regression had a sensitivity of 27.3% and a specificity of 89.2%. Based on these results, we hope to improve the accuracy by tuning the model parameters or other methods in the future, and to address the low sensitivity by adjusting the imbalanced proportion of positive and negative data.
Keywords: decision support, computed tomography, coronary artery, machine learning
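The evaluation protocol described (80/20 random split, AUC and accuracy per model) can be sketched with scikit-learn. The synthetic data below stands in for the 421-patient hospital dataset, and stepwise regression is omitted since scikit-learn has no built-in for it; everything else is illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score, accuracy_score

# Synthetic stand-in for the 421-patient dataset (11 clinical input variables,
# imbalanced outcome as in the study).
X, y = make_classification(n_samples=421, n_features=11, weights=[0.7, 0.3],
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)   # 80% training / 20% test

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "decision tree": DecisionTreeClassifier(max_depth=4),
    "neural network": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    proba = model.predict_proba(X_test)[:, 1]
    print(f"{name}: AUC={roc_auc_score(y_test, proba):.2f}, "
          f"Acc={accuracy_score(y_test, model.predict(X_test)):.2f}")
```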
Procedia PDF Downloads 229
2512 A Novel PSO Based Decision Tree Classification
Authors: Ali Farzan
Abstract:
Classification of data objects or patterns is a major part of most decision-making systems. One of the popular and commonly used classification methods is the Decision Tree (DT). It is a hierarchical decision-making system in which a binary tree is constructed; starting from the root, some of the classes are rejected at each node until a leaf node is reached. Each leaf node represents one specific class. Finding the splitting criterion at each node for constructing or training the tree is a major problem. Particle Swarm Optimization (PSO) has been adopted as a metaheuristic search method for finding the best splitting criterion. Results of evaluating the proposed method on benchmark datasets indicate the higher accuracy of the new PSO-based decision tree.
Keywords: decision tree, particle swarm optimization, splitting criteria, metaheuristic
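A minimal sketch of the idea, under the assumption that PSO searches for a split threshold on a single feature by minimizing the weighted Gini impurity; the paper's actual encoding of splitting criteria may differ.

```python
import numpy as np

def gini(labels):
    """Gini impurity of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def split_quality(x, y, thr):
    """Weighted Gini impurity after splitting feature x at thr (lower is better)."""
    left, right = y[x <= thr], y[x > thr]
    if len(left) == 0 or len(right) == 0:
        return np.inf
    n = len(y)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

def pso_threshold(x, y, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    """Search for the best split threshold of one feature with a basic PSO."""
    rng = np.random.default_rng(0)
    pos = rng.uniform(x.min(), x.max(), n_particles)
    vel = np.zeros(n_particles)
    pbest = pos.copy()
    pbest_val = np.array([split_quality(x, y, p) for p in pos])
    gbest = pbest[pbest_val.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, x.min(), x.max())
        vals = np.array([split_quality(x, y, p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[pbest_val.argmin()]
    return gbest

# Usage on a toy one-feature, two-class problem: the threshold should land
# near the class boundary (about 2).
rng = np.random.default_rng(1)
x = np.r_[rng.normal(0, 1, 50), rng.normal(4, 1, 50)]
y = np.r_[np.zeros(50, int), np.ones(50, int)]
print(pso_threshold(x, y))
```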
Procedia PDF Downloads 406
2511 Selection of Variogram Model for Environmental Variables
Authors: Sheikh Samsuzzhan Alam
Abstract:
The present study investigates the selection of a variogram model in analyzing spatial variations of environmental variables with trend. Sometimes, the auto-fitted theoretical variogram does not really capture the true nature of the empirical semivariogram, so proper exploration and analysis are needed to select the best variogram model. For this study, open-source data collected from the California Soil Resource Lab are used to explain the problems encountered when fitting a theoretical variogram. The five most commonly used variogram models (Linear, Gaussian, Exponential, Matern, and Spherical) were fitted to the experimental semivariogram. Ordinary kriging was used to evaluate the accuracy of the selected variograms through cross-validation. This study is beneficial for selecting an appropriate theoretical variogram model for environmental variables.
Keywords: anisotropy, cross-validation, environmental variables, kriging, variogram models
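For orientation, the empirical semivariogram to which such theoretical models are fitted can be computed with the classical Matheron estimator. A minimal NumPy sketch, with illustrative binning choices:

```python
import numpy as np

def empirical_semivariogram(coords, values, n_lags=15, max_dist=None):
    """Classical (Matheron) estimator, binned by separation distance:
    gamma(h) = sum over pairs at lag h of (z_i - z_j)^2 / (2 N(h))."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    iu = np.triu_indices(len(values), k=1)          # count each pair once
    dists, sqdiff = d[iu], (values[:, None] - values[None, :])[iu] ** 2
    max_dist = max_dist or dists.max() / 2          # usual rule of thumb
    edges = np.linspace(0, max_dist, n_lags + 1)
    lags, gamma = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (dists > lo) & (dists <= hi)
        if m.any():
            lags.append(dists[m].mean())
            gamma.append(sqdiff[m].mean() / 2.0)
    return np.array(lags), np.array(gamma)

# Example: 100 random 2-D points with a smooth trend plus noise.
rng = np.random.default_rng(0)
pts = rng.uniform(0, 100, (100, 2))
z = 0.05 * pts[:, 0] + rng.normal(0, 1, 100)
lags, gamma = empirical_semivariogram(pts, z)
```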
Procedia PDF Downloads 334
2510 Feedforward Neural Network with Backpropagation for Epilepsy Seizure Detection
Authors: Natalia Espinosa, Arthur Amorim, Rudolf Huebner
Abstract:
Epilepsy is a chronic neural disease, and around 50 million people in the world suffer from it. In many cases, however, the individual acquires resistance to the medication, which is known as drug-resistant epilepsy, and a detection system becomes necessary. This paper presents the development of an automatic system for seizure detection based on artificial neural networks (ANN), a common machine learning technique. The Discrete Wavelet Transform (DWT) is used to decompose the electroencephalogram (EEG) signal into the main brain waves; from these frequency bands, features are extracted to train a feedforward neural network with backpropagation, which finally performs pattern classification into seizure or non-seizure, obtaining 95% accuracy on epileptic EEG and 100% on normal EEG.
Keywords: artificial neural network (ANN), discrete wavelet transform (DWT), epilepsy detection, seizure
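A minimal sketch of the described pipeline, assuming a db4 wavelet and a 5-level decomposition (common choices in the EEG seizure literature, not confirmed by the abstract) and simple per-band energy statistics as features:

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def dwt_band_features(signal, wavelet="db4", level=5):
    """Decompose an EEG epoch into sub-bands via the DWT and return simple
    energy/statistics features per band (approximation + details)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    feats = []
    for c in coeffs:  # [cA5, cD5, cD4, cD3, cD2, cD1] span the delta..gamma ranges
        feats += [np.sum(c ** 2), np.mean(np.abs(c)), np.std(c)]
    return np.array(feats)

# Train a backpropagation-based feedforward network on the band features.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((200, 1024))    # stand-in EEG epochs
labels = rng.integers(0, 2, 200)             # 1 = seizure, 0 = non-seizure
X = np.array([dwt_band_features(e) for e in epochs])
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000).fit(X, labels)
```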
Procedia PDF Downloads 223
2509 Transient Heat Conduction in Nonuniform Hollow Cylinders with Time Dependent Boundary Condition at One Surface
Authors: Sen Yung Lee, Chih Cheng Huang, Te Wen Tu
Abstract:
A solution methodology without integral transformation is proposed to develop analytical solutions for transient heat conduction in nonuniform hollow cylinders with a time-dependent boundary condition at the outer surface. It is shown that if the thermal conductivity and the specific heat of the medium are in arbitrary polynomial function forms, closed-form solutions of the system can be developed. The influence of the physical properties on the temperature distribution of the system is studied. A numerical example is given to illustrate the efficiency and accuracy of the solution methodology.
Keywords: analytical solution, nonuniform hollow cylinder, time-dependent boundary condition, transient heat conduction
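For reference, one-dimensional radial transient conduction in a hollow cylinder with position-dependent properties takes the standard form below; the polynomial property forms mentioned in the abstract would enter through k(r) and c(r). This is the textbook statement of the problem class, not the authors' derivation.

```latex
% Radial transient conduction in a hollow cylinder (R_i <= r <= R_o), with
% position-dependent conductivity k(r), specific heat c(r), and density rho:
\rho\, c(r)\, \frac{\partial T}{\partial t}
  = \frac{1}{r}\, \frac{\partial}{\partial r}\!\left( r\, k(r)\, \frac{\partial T}{\partial r} \right),
\qquad T(R_o, t) = f(t) \ \text{(time-dependent outer-surface condition)}.
```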
Procedia PDF Downloads 505
2508 A Novel Spectral Index for Automatic Shadow Detection in Urban Mapping Based on WorldView-2 Satellite Imagery
Authors: Kaveh Shahi, Helmi Z. M. Shafri, Ebrahim Taherzadeh
Abstract:
In remote sensing, shadow causes problems in many applications, such as change detection and classification. Shadows are cast by elevated objects and can directly affect the accuracy of extracted information. For these reasons, it is very important to detect shadows, particularly in urban high-spatial-resolution imagery, where they pose a significant problem. This paper focuses on automatic shadow detection based on a new spectral index for multispectral imagery, known as the Shadow Detection Index (SDI). The new spectral index was tested on different areas of WorldView-2 images, and the results demonstrated that it has great potential to extract shadows effectively and automatically.
Keywords: spectral index, shadow detection, remote sensing images, WorldView-2
Procedia PDF Downloads 538
2507 Evaluating of Turkish Earthquake Code (2007) for FRP Wrapped Circular Concrete Cylinders
Authors: Guler S., Guzel E., Gulen M.
Abstract:
Fiber Reinforced Polymer (FRP) materials are commonly used in the construction sector to enhance the strength and ductility capacities of structural elements. The equations for the confined compressive strength of FRP-wrapped concrete cylinders are described in the 7th chapter of the Turkish Earthquake Code (TEC-07), which entered into force in 2007. This study aims to evaluate the applicability of TEC-07 to the confined compressive strengths of circular FRP-wrapped concrete cylinders. To this end, a large number of data on circular FRP-wrapped concrete cylinders are collected from the literature. It is clearly seen that the predictions of TEC-07 for circular FRP-wrapped columns are not of the same accuracy across different ranges of concrete strength.
Keywords: Fiber Reinforced Polymer (FRP), concrete cylinders, Turkish Earthquake Code, earthquake
Procedia PDF Downloads 518
2506 Simplified Linearized Layering Method for Stress Intensity Factor Determination
Authors: Jeries J. Abou-Hanna, Bradley Storm
Abstract:
This paper seeks to reduce the complexity of determining stress intensity factors while maintaining high levels of accuracy by the use of a linearized layering approach. Many techniques for stress intensity factor determination exist, but they can be limited by conservative results, too many user parameters, or excessive computational cost. Multiple notch geometries with various crack lengths were investigated in this study to better understand the effectiveness of the proposed method. By linearizing the average stresses in radial layers around the crack tip, stress intensity factors were found to have errors ranging from -10.03% to 8.94% when compared to analytically exact solutions. This approach proved to be a robust and efficient method of accurately determining stress intensity factors.
Keywords: fracture mechanics, finite element method, stress intensity factor, stress linearization
Procedia PDF Downloads 143
2505 Biologically Inspired Small Infrared Target Detection Using Local Contrast Mechanisms
Authors: Tian Xia, Yuan Yan Tang
Abstract:
In order to obtain higher small-target detection accuracy, this paper presents an effective algorithm inspired by the local contrast mechanism. The proposed method can enhance the target signal and suppress background clutter simultaneously. In the first stage, an enhanced image is obtained using the proposed Weighted Laplacian of Gaussian. In the second stage, an adaptive threshold is adopted to segment the target. Experimental results on two challenging image sequences show that the proposed method can detect bright and dark targets simultaneously and is not sensitive to the sea-sky line of the infrared image, so it is fit for small infrared target detection.
Keywords: small target detection, local contrast, human vision system, Laplacian of Gaussian
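A minimal sketch of the two-stage structure (enhancement filter, then adaptive threshold). The paper's Weighted Laplacian of Gaussian is not publicly specified here, so a plain LoG from SciPy stands in, and the mean-plus-k-sigma threshold is an illustrative choice; detecting dark targets would use the opposite sign of the response.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def detect_small_targets(image, sigma=2.0, k=4.0):
    """Enhance blob-like small targets with a Laplacian-of-Gaussian filter,
    then segment with an adaptive (mean + k*std) threshold."""
    img = image.astype(float)
    # The LoG response is strongly negative at bright blobs; negate it so
    # bright targets become peaks.
    response = -gaussian_laplace(img, sigma=sigma)
    response[response < 0] = 0.0                  # suppress residual clutter
    thr = response.mean() + k * response.std()    # adaptive global threshold
    return response > thr                         # boolean target mask
```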
Procedia PDF Downloads 469
2504 A Bayesian Model with Improved Prior in Extreme Value Problems
Authors: Eva L. Sanjuán, Jacinto Martín, M. Isabel Parra, Mario M. Pizarro
Abstract:
In Extreme Value Theory, inference on the parameters of the distribution employs only a small part of the observed values: when block maxima are taken, many data are discarded. We developed a new Bayesian inference model to exploit all the information provided by the data, introducing informative priors and using the relations between baseline and limit parameters. Firstly, we studied the accuracy of the new model for three baseline distributions that lead to a Gumbel extreme distribution: Exponential, Normal, and Gumbel. Secondly, we considered mixtures of Normal variables, to simulate practical situations when data do not adjust to pure distributions because of perturbations (noise).
Keywords: Bayesian inference, extreme value theory, Gumbel distribution, highly informative prior
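For orientation, the classical (non-Bayesian) block-maxima workflow the paper improves upon can be sketched in a few lines: reduce the data to block maxima, then fit a Gumbel distribution by maximum likelihood. The Exponential baseline and block size are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Baseline Exponential data: 50 yearly blocks of 365 daily observations.
data = rng.exponential(scale=1.0, size=(50, 365))
block_maxima = data.max(axis=1)        # classical block-maxima reduction

# Maximum-likelihood Gumbel fit to the block maxima (location mu, scale beta).
mu, beta = stats.gumbel_r.fit(block_maxima)
print(f"Gumbel location={mu:.3f}, scale={beta:.3f}")
# For an Exponential(1) baseline with n=365, theory gives mu ~ ln(365), beta ~ 1.
```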
Procedia PDF Downloads 198
2503 Dynamic Modeling of Orthotropic Cracked Materials by X-FEM
Authors: S. Houcine Habib, B. Elkhalil Hachi, Mohamed Guesmi, Mohamed Haboussi
Abstract:
In this paper, the dynamic fracture behavior of cracked orthotropic structures is modeled using the extended finite element method (X-FEM). In this approach, the finite element model is first created and then enriched with special orthotropic crack-tip enrichments and Heaviside functions in the framework of the partition of unity. The mixed-mode stress intensity factor (SIF) is computed using the interaction integral technique based on the J-integral in order to predict the cracking behavior of the structure. These procedures are implemented in an in-house software platform. To assess the accuracy of the developed code, results obtained by the proposed method are compared with those in the literature.
Keywords: X-FEM, composites, stress intensity factor, crack, dynamic orthotropic behavior
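The enrichment referred to is the standard X-FEM displacement approximation under the partition of unity, shown below in its textbook form; the orthotropic crack-tip functions F_alpha differ from the isotropic ones and are not reproduced here.

```latex
% Standard X-FEM displacement approximation (partition of unity):
\mathbf{u}^{h}(\mathbf{x}) = \sum_{i \in I} N_i(\mathbf{x})\,\mathbf{u}_i
  + \sum_{j \in J} N_j(\mathbf{x})\, H(\mathbf{x})\,\mathbf{a}_j
  + \sum_{k \in K} N_k(\mathbf{x}) \sum_{\alpha = 1}^{4} F_\alpha(\mathbf{x})\,\mathbf{b}_{k\alpha}
% H: Heaviside function across the crack faces (nodes J whose supports are cut
% by the crack); F_alpha: crack-tip enrichment functions (orthotropic variants
% in this paper); K: nodes whose supports contain the crack tip.
```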
Procedia PDF Downloads 570
2502 Impact of Meteorological Factors on Influenza Activity in Pakistan; A Tale of Two Cities
Authors: Nadia Nisar
Abstract:
Background: In temperate regions, influenza activity occurs sporadically all year round, with peaks coinciding with the cold months. Meteorological and environmental conditions play a significant role in the transmission of influenza globally. In this study, we assessed the relationship between meteorological parameters and influenza activity in two geographical areas of Pakistan. Methods: Influenza data were collected from the Islamabad (north) and Multan (south) regions of the national influenza surveillance system during 2010-2015. The meteorological database was obtained from the National Climatic Data Center (Pakistan). A logistic regression model with a stepwise approach was used to explore the relationship between meteorological parameters and influenza peaks. In the statistical model, we used the weekly proportion of laboratory-confirmed influenza-positive samples to represent influenza activity, with meteorological parameters as the covariates (temperature, humidity, and precipitation). We also evaluated the link between environmental conditions associated with seasonal influenza epidemics: 'cold-dry' and 'humid-rainy'. Results: Temperature and humidity were positively associated with influenza in both the north and south locations (OR = 0.927 (0.88-0.97)) & (OR = 1.078 (1.027-1.132)) and (OR = 1.023 (1.008-1.037)) & (OR = 0.978 (0.964-0.992)), respectively, whilst precipitation was negatively associated with influenza (OR = 1.054 (1.039-1.070)) & (OR = 0.949 (0.935-0.963)). In both regions, temperature and humidity contributed more to the model than precipitation. Independent-sample t-tests gave p < 0.05 for all climate parameters. These results demonstrate significant relationships between climate factors and influenza infection, with correlation coefficients of 0.52-0.90. The total contribution of these three climatic variables accounted for 89.04%. The reported number of influenza cases increased sharply during the cold-dry season (i.e., winter), when humidity and temperature are at minimal levels. Conclusion: Our findings showed that measures of temperature, humidity, and the cold-dry season (winter) can be used as indicators to forecast influenza infections. Therefore, integrating meteorological parameters into influenza forecasting in the surveillance system may benefit public health efforts to reduce the burden of seasonal influenza. More studies are necessary to understand the role of these parameters in viral transmission and host susceptibility.
Keywords: influenza, climate, meteorological, environmental
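A minimal sketch of the kind of model described: a binomial (logistic) regression of weekly influenza positivity on temperature, humidity, and precipitation. All data below are synthetic, and the statsmodels formulation is an assumption about the analysis, not the authors' code.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
weeks = 260                                  # ~5 years of weekly surveillance
temp = rng.normal(20, 8, weeks)              # mean weekly temperature (C)
humidity = rng.normal(60, 15, weeks)         # relative humidity (%)
precip = rng.exponential(5, weeks)           # precipitation (mm)
tested = rng.integers(20, 80, weeks)         # samples tested per week
# Synthetic positives: more flu when cold and dry (illustrative only).
logit = -1.0 - 0.08 * (temp - 20) - 0.02 * (humidity - 60) - 0.01 * precip
positives = rng.binomial(tested, 1 / (1 + np.exp(-logit)))

# Binomial GLM on weekly proportions: endog = [successes, failures].
X = sm.add_constant(np.column_stack([temp, humidity, precip]))
endog = np.column_stack([positives, tested - positives])
result = sm.GLM(endog, X, family=sm.families.Binomial()).fit()
print(np.exp(result.params[1:]))             # odds ratios for the covariates
```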
Procedia PDF Downloads 200
2501 Tongue Image Retrieval Using Machine Learning
Authors: Ahmad FAROOQ, Xinfeng Zhang, Fahad Sabah, Raheem Sarwar
Abstract:
In Traditional Chinese Medicine (TCM), tongue diagnosis is a vital inspection tool. In this study, we explore the potential of machine learning in tongue diagnosis. The study begins with a cataloguing of the various classifications and characteristics of the human tongue. We infer 24 kinds of tongues from the material and coating of the tongue, and we identify 21 attributes of the tongue. The next step is to apply machine learning methods to the tongue dataset. We use the Weka machine learning platform to conduct the experiments for performance analysis. The 457 instances of the tongue dataset are used to test the performance of five different machine learning methods, including SVM, Random Forests, Decision Trees, and Naive Bayes. Based on accuracy and area under the ROC curve (AUC), the Support Vector Machine algorithm was shown to be the most effective for tongue diagnosis.
Keywords: medical imaging, image retrieval, machine learning, tongue
Procedia PDF Downloads 81
2500 Warning about the Risk of Blood Flow Stagnation after Transcatheter Aortic Valve Implantation
Authors: Aymen Laadhari, Gábor Székely
Abstract:
In this work, the hemodynamics in the sinuses of Valsalva after Transcatheter Aortic Valve Implantation is numerically examined. We focus on the physical results in the two-dimensional case. We use a finite element methodology based on a Lagrange multiplier technique that couples the dynamics of the blood flow and the leaflets' movement. A massively parallel implementation of a monolithic and fully implicit solver allows greater accuracy and significant computational savings. The elastic properties of the aortic valve are disregarded, and the numerical computations are performed under physiologically correct pressure loads. The computational results show that blood flow may be subject to stagnation in the lower domain of the sinuses of Valsalva after Transcatheter Aortic Valve Implantation.
Keywords: hemodynamics, simulations, stagnation, valve
Procedia PDF Downloads 291
2499 Integration of EEG and Motion Tracking Sensors for Objective Measure of Attention-Deficit Hyperactivity Disorder in Pre-Schoolers
Authors: Neha Bhattacharyya, Soumendra Singh, Amrita Banerjee, Ria Ghosh, Oindrila Sinha, Nairit Das, Rajkumar Gayen, Somya Subhra Pal, Sahely Ganguly, Tanmoy Dasgupta, Tanusree Dasgupta, Pulak Mondal, Aniruddha Adhikari, Sharmila Sarkar, Debasish Bhattacharyya, Asim Kumar Mallick, Om Prakash Singh, Samir Kumar Pal
Abstract:
Background: We aim to develop an integrated device comprising a single-probe EEG and CCD-based motion sensors for a more objective measure of Attention-Deficit Hyperactivity Disorder (ADHD). While the integrated device (MAHD) relies on the EEG signal (spectral density of the beta wave) for the assessment of attention during a given structured task (painting three segments of a circle using three different colors, namely red, green, and blue), the CCD sensor captures the movement pattern of the subjects engaged in a continuous performance task (CPT). A statistical analysis of the attention and movement patterns was performed, and the accuracy of the completed tasks was analysed using indigenously developed software. The device with the embedded software, called MAHD, is intended to improve certainty with criterion E (i.e., whether symptoms are better explained by another condition). Methods: We used the EEG signal from a single-channel dry sensor placed on the frontal lobe of the head of the subjects (3-5-year-old pre-schoolers). During the painting of three segments of a circle using three distinct colors (red, green, and blue), the absolute power of the delta and beta EEG waves from the subjects is found to be correlated with relaxation and attention/cognitive-load conditions. While the relaxation condition of the subject hints at hyperactivity, a more direct CCD-based motion sensor is used to track the physical movement of the subject engaged in a continuous performance task (CPT), i.e., separation of the various colored balls from one table to another. We used our indigenously developed software for the statistical analysis to derive a scale for the objective assessment of ADHD. We also compared our scale with the clinical ADHD evaluation. Results: In a limited clinical trial with preliminary statistical analysis, we found a significant correlation between the objective assessment of the ADHD subjects and the clinician's conventional evaluation. Conclusion: MAHD, the integrated device, is intended to be an auxiliary tool to improve the accuracy of ADHD diagnosis by supporting greater criterion E certainty.
Keywords: ADHD, CPT, EEG signal, motion sensor, psychometric test
Procedia PDF Downloads 99
2498 Size-Reduction Strategies for Iris Codes
Authors: Jutta Hämmerle-Uhl, Georg Penn, Gerhard Pötzelsberger, Andreas Uhl
Abstract:
Iris codes contain bits with different entropy. This work investigates different strategies to reduce the size of iris code templates, with the aim of reducing storage requirements and computational demand in the matching process. Besides simple sub-sampling schemes, a binary multi-resolution representation as used in the JBIG hierarchical coding mode is also assessed. We find that iris code template size can be reduced significantly while maintaining recognition accuracy. In addition, we propose a two-stage identification approach, using small-sized iris code templates in a pre-selection stage and full-resolution templates for final identification, which shows promising recognition behaviour.
Keywords: iris recognition, compact iris code, fast matching, best bits, pre-selection identification, two-stage identification
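A minimal sketch of the two-stage matching idea: fractional Hamming distance on binary iris codes, with naive sub-sampling as the size-reduction scheme for the pre-selection stage. Template dimensions, the keep-every-fourth-bit rule, and the shortlist size are illustrative assumptions, not the paper's entropy-aware bit selection.

```python
import numpy as np

def hamming_distance(code_a, code_b, mask_a=None, mask_b=None):
    """Fractional Hamming distance between two binary iris codes, counting
    only bits valid in both masks (eyelid/eyelash occlusion)."""
    valid = np.ones(code_a.shape, dtype=bool)
    if mask_a is not None and mask_b is not None:
        valid = mask_a & mask_b
    return np.count_nonzero((code_a ^ code_b) & valid) / np.count_nonzero(valid)

def subsample(code, factor=4):
    """Keep every factor-th bit: a simple size-reduction scheme for fast
    pre-selection before full-resolution matching."""
    return code[..., ::factor]

rng = np.random.default_rng(0)
probe = rng.integers(0, 2, (20, 512), dtype=np.uint8)   # 20 x 512 bit template
gallery = [rng.integers(0, 2, (20, 512), dtype=np.uint8) for _ in range(100)]

# Stage 1: rank the gallery with quarter-size templates, keep the best 10.
small = subsample(probe)
scores = [hamming_distance(small, subsample(g)) for g in gallery]
shortlist = np.argsort(scores)[:10]
# Stage 2: final identification on the shortlist at full resolution.
best = min(shortlist, key=lambda i: hamming_distance(probe, gallery[i]))
```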
Procedia PDF Downloads 440
2497 Comparison of Wet and Microwave Digestion Methods for the Al, Cu, Fe, Mn, Ni, Pb and Zn Determination in Some Honey Samples by ICP-OES in Turkey
Authors: Huseyin Altundag, Emel Bina, Esra Altıntıg
Abstract:
The aim of this study is to determine the amounts of Al, Cu, Fe, Mn, Ni, Pb, and Zn in honey samples gathered from the Sakarya and Istanbul regions of Turkey. The sample preparation phase is performed via a wet decomposition method and a microwave digestion system. The accuracy of the method was verified using the standard reference materials Tea Leaves (INCT-TL-1) and NIST SRM 1515 Apple Leaves. The gathered data were compared with literature values, and possible sources of contamination of the honey samples were considered. The obtained results will be presented at ICCIS 2015: XIII International Conference on Chemical Industry and Science.
Keywords: wet decomposition, microwave digestion, trace element, honey, ICP-OES
Procedia PDF Downloads 462
2496 Kannudi - A Reference Editor for Kannada (Based on OPOK! and OHOK! Principles, and Domain Knowledge)
Authors: Vishweshwar V. Dixit
Abstract:
Kannudi is a reference editor introducing a method of input for Kannada called OHOK!, that is, Ottu Hāku Ottu Koḍu!. This is especially suited for pressure-sensitive input devices, though the current online implementation uses the regular mechanical keyboard. OHOK! has three possible modes, namely sva-ottu (self-conjunct), kandante (as you see), and andante (as you say). It may be noted that the kandante mode does not follow the phonetic order; however, this model may work well for those who are inclined to visualize as they type rather than vocalize the sounds. Kannudi also demonstrates how domain knowledge can be effectively used to potentially increase speed, accuracy, and user-friendliness: for example, selection of a default vowel, automatic shunyification, and arkification. Also implemented are four types of Deletes that are necessary for phono-syllabic languages like Kannada.
Keywords: Kannada, conjunct, reference editor, pressure input
Procedia PDF Downloads 93
2495 Metal-Oxide-Semiconductor-Only Process Corner Monitoring Circuit
Authors: Davit Mirzoyan, Ararat Khachatryan
Abstract:
A process corner monitoring circuit (PCMC) is presented in this work. The circuit generates a signal whose logical value depends on the process corner only. The signal can be used in both digital and analog circuits for testing and compensation of process variations (PV). The presented circuit uses only metal-oxide-semiconductor (MOS) transistors, which allows increased detection accuracy and decreased power consumption and area. Due to its simplicity, the presented circuit can be easily modified to monitor parametric variations of only n-type or p-type MOS (NMOS and PMOS, respectively) transistors, resistors, as well as their combinations. Post-layout simulation results prove the correct functionality of the proposed circuit, i.e., its ability to monitor the process corner (equivalently, die-to-die variations) even in the presence of within-die variations.
Keywords: detection, monitoring, process corner, process variation
Procedia PDF Downloads 525
2494 A Comprehensive Finite Element Model for Incremental Launching of Bridges: Optimizing Construction and Design
Authors: Mohammad Bagher Anvari, Arman Shojaei
Abstract:
Incremental launching, a widely adopted bridge erection technique, offers numerous advantages for bridge designers. However, accurately simulating and modeling the dynamic behavior of the bridge during each step of the launching process proves to be tedious and time-consuming. The perpetual variation of internal forces within the deck during construction stages adds complexity, exacerbated further by considerations of other load cases, such as support settlements and temperature effects. As a result, there is an urgent need for a reliable, simple, economical, and fast algorithmic solution to model bridge construction stages effectively. This paper presents a novel Finite Element (FE) model that focuses on studying the static behavior of bridges during the launching process. Additionally, a simple method is introduced to normalize all quantities in the problem. The new FE model overcomes the limitations of previous models, enabling the simulation of all stages of launching, which conventional models fail to achieve due to their underlying assumptions. By leveraging the results obtained from the new FE model, this study proposes solutions to improve the accuracy of conventional models, particularly for the initial stages of bridge construction, which have been neglected in previous research. The research highlights the critical role played by the first span of the bridge during the initial stages, a factor often overlooked in existing studies. Furthermore, a new and simplified model, termed the "semi-infinite beam" model, is developed to address this oversight. By utilizing this model alongside a simple optimization approach, optimal values for launching nose specifications are derived. The practical applications of this study extend to optimizing the nose-deck system of incrementally launched bridges, providing valuable insights for practical usage. In conclusion, this paper introduces a comprehensive Finite Element model for studying the static behavior of bridges during incremental launching. The proposed model addresses limitations found in previous approaches and offers practical solutions to enhance accuracy. The study emphasizes the importance of considering the initial stages and introduces the "semi-infinite beam" model. Through the developed model and optimization approach, optimal specifications for launching nose configurations are determined. This research holds significant practical implications and contributes to the optimization of incrementally launched bridges, benefiting both the construction industry and bridge designers.
Keywords: incremental launching, bridge construction, finite element model, optimization
Procedia PDF Downloads 103
2493 Tool for Maxillary Sinus Quantification in Computed Tomography Exams
Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina
Abstract:
The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, to heat or humidify inspired air, for thermoregulation, and to impart resonance to the voice, among other roles. Thus, the real function of the MS is still uncertain. Furthermore, MS anatomy is complex and varies from person to person, and many diseases may affect the development process of the sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, which enables a quantitative analysis. However, this is not always possible in the clinical routine, and if possible, it involves much effort and/or time. Therefore, it is necessary to have a convenient, robust, and practical tool correlated with the MS volume, allowing clinical applicability. Nowadays, the available methods for MS segmentation are manual or semi-automatic, and manual methods present inter- and intra-individual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of the paranasal sinuses. This study was developed with ethical approval from the authors' institutions and national review panels. The research involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM), based on features such as pixel value, spatial distribution, and shape. The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied to all slices of the CT exam, obtaining the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist. For comparison, we used Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. In the statistical comparison between both methods, the linear regression showed a strong association and low dispersion between variables, the Bland-Altman analyses showed no significant differences between the methods, and the Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify MS volume proved to be robust, fast, and efficient when compared with manual segmentation. Furthermore, it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice. Thus, it may be useful in the diagnosis and treatment determination of MS diseases.
Keywords: maxillary sinus, support vector machine, region growing, volume quantification
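A minimal sketch of the region-growing step at the core of the described pipeline, assuming an intensity-tolerance growing rule from a seed voxel followed by a morphological opening; the SVM seed detection, the tolerance value, and the synthetic slice are illustrative stand-ins.

```python
import numpy as np
from scipy import ndimage

def region_grow(slice_hu, seed, tol=200.0):
    """Grow a region from a seed voxel: keep voxels whose intensity lies
    within tol of the seed value and that connect back to the seed."""
    within = np.abs(slice_hu - slice_hu[seed]) <= tol
    labels, _ = ndimage.label(within)              # connected components
    return labels == labels[seed]                  # component containing the seed

# Air-filled sinus on a synthetic CT slice (about -1000 HU inside).
slice_hu = np.full((128, 128), 40.0)               # soft-tissue background
slice_hu[40:80, 50:90] = -1000.0                   # sinus cavity
mask = region_grow(slice_hu, seed=(60, 70))
mask = ndimage.binary_opening(mask, iterations=2)  # morphological cleanup
area_voxels = int(mask.sum())   # per-slice area; sum over slices * voxel volume
```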
Procedia PDF Downloads 504
2492 Getting to Know the Enemy: Utilization of Phone Record Analysis Simulations to Uncover a Target’s Personal Life Attributes
Authors: David S. Byrne
Abstract:
The purpose of this paper is to understand how phone record analysis can enable the identification of subjects in communication with the target of a terrorist plot. This study also sought to understand the advantages of implementing simulations to develop the skills of future intelligence analysts and enhance national security. Through the examination of phone reports, which in essence consist of the call traffic of incoming and outgoing numbers (and not listening to calls or reading the content of text messages), patterns can be uncovered that point toward members of a criminal group and planned activities. Through temporal and frequency analysis, conclusions were drawn to offer insights into the identity of participants and the potential scheme being undertaken. The challenge lies in the accurate identification of the users of the phones in contact with the target. Often investigators rely on proprietary databases and open sources to accomplish this task; however, it is difficult to ascertain the accuracy of the information found. Thus, this paper poses two research questions: how effective are freely available web sources of information at determining the actual identity of callers? Secondly, does the identity of the callers enable an understanding of the lifestyle and habits of the target? The methodology for this research consisted of the analysis of the call detail records of the author's personal phone activity spanning the period of a year, combined with the hypothetical premise that the owner of said phone was the leader of a terrorist cell. The goal was to reveal the identity of his accomplices and understand how his personal attributes can further paint a picture of the target's intentions. The results of the study were interesting: nearly 80% of the calls were identified, with over a 75% accuracy rating, via data mining of open sources. The suspected terrorist's inner circle was recognized, including relatives and potential collaborators as well as financial institutions [money laundering], restaurants [meetings], a sporting goods store [purchase of supplies], and airlines and hotels [travel itinerary]. The outcome of this research showed the benefits of cellphone analysis without more intrusive and time-consuming methodologies, though it may be instrumental for potential surveillance, interviews, and developing probable cause for wiretaps. Furthermore, this research highlights the importance of building the skills of future intelligence analysts through phone record analysis via simulations; hands-on learning in this case study emphasizes the development of the competencies necessary to improve investigations overall.
Keywords: hands-on learning, intelligence analysis, intelligence education, phone record analysis, simulations
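A minimal sketch of the temporal and frequency analysis described, using pandas on illustrative call detail records (column names and values are made up): contact frequency reveals the inner circle, and the hour-of-day profile hints at routines.

```python
import pandas as pd

# Illustrative call detail records: counterpart number, timestamp, duration.
cdr = pd.DataFrame({
    "number": ["555-0101", "555-0101", "555-0202", "555-0101", "555-0303"],
    "timestamp": pd.to_datetime(["2024-01-02 08:15", "2024-01-02 21:40",
                                 "2024-01-03 12:05", "2024-01-04 08:20",
                                 "2024-01-05 23:55"]),
    "duration_s": [120, 600, 45, 90, 300],
})

# Frequency analysis: who does the target talk to most, and for how long?
by_contact = (cdr.groupby("number")
                 .agg(calls=("number", "size"), total_s=("duration_s", "sum"))
                 .sort_values("calls", ascending=False))

# Temporal analysis: the distribution of calls by hour of day hints at routines.
by_hour = cdr["timestamp"].dt.hour.value_counts().sort_index()
print(by_contact, by_hour, sep="\n")
```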
Procedia PDF Downloads 14
2491 A Firefly Based Optimization Technique for Optimal Planning of Voltage Controlled Distributed Generators
Authors: M. M. Othman, Walid El-Khattam, Y. G. Hegazy, A. Y. Abdelaziz
Abstract:
This paper presents a method for finding the optimal location and capacity of dispatchable DGs connected to distribution feeders, planning for a specified power loss without violating the system's practical constraints. The distributed generation units in the proposed algorithm are modeled as voltage-controlled nodes, with the flexibility to be converted to constant-power nodes in case of reactive power limit violation. The proposed algorithm is implemented in MATLAB and tested on the IEEE 37-node feeder. The results, validated by comparison with results obtained from other competing methods, show the effectiveness, accuracy, and speed of the proposed method.
Keywords: distributed generators, firefly technique, optimization, power loss
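For orientation, a generic firefly algorithm minimizer is sketched below; the toy objective stands in for the paper's power-loss calculation, and constraint handling via load-flow feasibility is not reproduced.

```python
import numpy as np

def firefly_minimize(loss, bounds, n_fireflies=25, iters=100,
                     beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    """Basic firefly algorithm: dimmer fireflies move toward brighter
    (lower-loss) ones with attractiveness beta0 * exp(-gamma * r^2)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_fireflies, len(lo)))
    f = np.apply_along_axis(loss, 1, x)
    for _ in range(iters):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if f[j] < f[i]:                       # j is brighter than i
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    x[i] += beta * (x[j] - x[i]) + alpha * rng.uniform(-0.5, 0.5, len(lo))
                    x[i] = np.clip(x[i], lo, hi)
                    f[i] = loss(x[i])
        alpha *= 0.97                                 # slowly reduce randomness
    best = f.argmin()
    return x[best], f[best]

# Toy stand-in for a power-loss objective over (location index, capacity).
loss = lambda v: (v[0] - 0.3) ** 2 + (v[1] - 0.7) ** 2
x_best, f_best = firefly_minimize(loss, (np.array([0.0, 0.0]), np.array([1.0, 1.0])))
```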
Procedia PDF Downloads 533
2490 Wind Speed Prediction Using Passive Aggregation Artificial Intelligence Model
Authors: Tarek Aboueldahab, Amin Mohamed Nassar
Abstract:
Unlike conventional power plants, wind energy is a fluctuating energy source; thus, it is necessary to accurately predict short-term wind speed to integrate wind energy into the electricity supply structure. To do so, we present a hybrid artificial intelligence model for short-term wind speed prediction based on passive aggregation of particle swarm optimization and neural networks. As a result, a clear improvement in prediction accuracy is obtained compared to the standard artificial intelligence method.
Keywords: artificial intelligence, neural networks, particle swarm optimization, passive aggregation, wind speed prediction
Procedia PDF Downloads 450
2489 Open Jet Testing for Buoyant and Hybrid Buoyant Aerial Vehicles
Authors: A. U. Haque, W. Asrar, A. A. Omar, E. Sulaeman, J. S Mohamed Ali
Abstract:
Open jet testing is a valuable testing technique which provides the desired results with reasonable accuracy. It has been used in the past for airships and has recently been applied to hybrid ones, which derive more non-buoyant force from the wings, empennage, and fuselage. In the present review work, an effort has been made to review the challenges involved in open jet testing. In order to shed light on the application of this technique, the experimental results of two different configurations are presented. Although the aerodynamic results of such vehicles are unique to each design, they provide a starting point for planning any future testing. A few important testing areas which need more attention are also highlighted. Most hybrid buoyant aerial vehicles are unconventional in shape, and the experimental data generated are unique to each design.
Keywords: open jet testing, aerodynamics, hybrid buoyant aerial vehicles, airships
Procedia PDF Downloads 573
2488 Myanmar Consonants Recognition System Based on Lip Movements Using Active Contour Model
Authors: T. Thein, S. Kalyar Myo
Abstract:
Humans use visual information for understanding speech content in noisy conditions or in situations where the audio signal is not available. The primary advantage of visual information is that it is not affected by acoustic noise or cross-talk among speakers. Using visual information from lip movements can improve the accuracy and robustness of automatic speech recognition. However, a major challenge for most automatic lip reading systems is to find a robust and efficient method for extracting the linguistically relevant speech information from a lip image sequence. This is a difficult task due to variation caused by different speakers, illumination, camera settings, and the inherently low luminance and chrominance contrast between the lip and non-lip regions. Several researchers have been developing methods to overcome these problems. Moreover, it is well known that visual information about speech obtained through lip reading is very useful for human speech recognition. Lip reading is the technique of comprehensively understanding underlying speech by processing the movement of the lips; lip reading systems are therefore among the supportive technologies for hearing-impaired or elderly people, and lip reading is an active research area. The need for lip reading systems is ever increasing for every language. This research aims to develop a visual teaching system for hearing-impaired persons in Myanmar that shows how to pronounce words precisely by identifying the features of lip movement. The proposed research develops a lip reading system for Myanmar consonants: one-syllable consonants (င (Nga)၊ ည (Nya)၊ မ (Ma)၊ လ (La)၊ ၀ (Wa)၊ သ (Tha)၊ ဟ (Ha)၊ အ (Ah)) and two-syllable consonants (က (Ka Gyi)၊ ခ (Kha Gway)၊ ဂ (Ga Nge)၊ ဃ (Ga Gyi)၊ စ (Sa Lone)၊ ဆ (Sa Lain)၊ ဇ (Za Gwe)၊ ဒ (Da Dway)၊ ဏ (Na Gyi)၊ န (Na Nge)၊ ပ (Pa Saug)၊ ဘ (Ba Gone)၊ ရ (Ya Gaug)၊ ဠ (La Gyi)). The proposed system has three subsystems: the first is the lip localization system, which localizes the lips in the digital inputs; the next is the feature extraction system, which extracts features of lip movement suitable for visual speech recognition; and the final one is the classification system. In the proposed research, the Two-Dimensional Discrete Cosine Transform (2D-DCT) and Linear Discriminant Analysis (LDA) with an Active Contour Model (ACM) will be used for lip movement feature extraction, and a Support Vector Machine (SVM) classifier is used for finding the class parameters and class number in the training and testing sets. Then, experiments will be carried out on the recognition accuracy of Myanmar consonants using only visual information on lip movements, which is useful for visual speech of the Myanmar language. The results will show the effectiveness of lip movement recognition for Myanmar consonants. This system will help hearing-impaired persons use it as a language-learning application, and it can also be useful for normal-hearing persons in noisy environments or in conditions where they need to find out what was said by other people without hearing the voice.
Keywords: feature extraction, lip reading, lip localization, Active Contour Model (ACM), Linear Discriminant Analysis (LDA), Support Vector Machine (SVM), Two-Dimensional Discrete Cosine Transform (2D-DCT)
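A minimal sketch of the 2D-DCT + LDA + SVM feature pipeline named above, assuming a low-frequency block of DCT coefficients as the lip-shape descriptor (a common simplification of zig-zag selection); the ROI sizes, class count, and data are illustrative.

```python
import numpy as np
from scipy.fft import dctn
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC

def dct_features(roi, keep=8):
    """2D-DCT of a mouth ROI; keep the low-frequency top-left keep x keep
    block, which carries most of the lip-shape information."""
    coeffs = dctn(roi.astype(float), norm="ortho")
    return coeffs[:keep, :keep].ravel()

rng = np.random.default_rng(0)
rois = rng.random((120, 32, 64))               # stand-in mouth ROIs per frame
labels = rng.integers(0, 8, 120)               # 8 one-syllable consonant classes
X = np.array([dct_features(r) for r in rois])

lda = LinearDiscriminantAnalysis(n_components=7).fit(X, labels)  # <= classes - 1
clf = SVC(kernel="rbf").fit(lda.transform(X), labels)
```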
Procedia PDF Downloads 286
2487 A Variable Structural Control for a Flexible Lamina
Authors: Xuezhang Hou
Abstract:
A control problem for a flexible lamina, formulated by partial differential equations with viscoelastic boundary conditions, is studied in this paper. The problem is written in the standard form of a linear infinite-dimensional system in an appropriate energy Hilbert space. The semigroup approach of linear operators is adopted in investigating the well-posedness of the closed-loop system. A variable structural control for the system is proposed, and meanwhile an equivalent control method is applied to the thin plate system. A significant control-theoretic result is obtained: in terms of the semigroup approach, the thin plate can be approximated by an ideal sliding mode to any accuracy.
Keywords: partial differential equations, flexible lamina, variable structural control, semigroup of linear operators
Procedia PDF Downloads 85
2486 An Ergonomic Handle Design for Instruments in Laparoscopic Surgery
Authors: Ramon Sancibrian, Carlos Redondo-Figuero, Maria C. Gutierrez-Diez, Esther G. Sarabia, Maria A. Benito-Gonzalez, Jose C. Manuel-Palazuelos
Abstract:
In this paper, the design and evaluation of a handle for laparoscopic surgery are presented. The design of the handle is based on ergonomic principles and tries to avoid awkward postures for surgeons. The handle combines the so-called power grip and accurate grip in order to provide strength and accuracy in the performance of surgery. The handle is tested using both objective and subjective approaches. The objective approach uses motion capture techniques to obtain the angles of the forearm, arm, wrist, and hand; the muscular effort is obtained with electromyography electrodes. On the other hand, a subjective survey has been carried out using questionnaires. Results confirm that the handle is preferred by the majority of surgeons.
Keywords: laparoscopic surgery, ergonomics, mechanical design, biomechanics
Procedia PDF Downloads 502