Search results for: Turbulence models
1900 Multidimensional Sports Spectators Segmentation and Social Media Marketing
Authors: B. Schmid, C. Kexel, E. Djafarova
Abstract:
Understanding consumers is fundamental for practitioners in marketing. Consumers of sports events, the sports spectators, are a particularly complex consumer crowd. In order to identify and define their profiles, different segmentation approaches can be found in the literature, one of them being multidimensional segmentation. Unlike earlier models, multidimensional segmentation models capture the broad range of attitudes, behaviours, motivations and beliefs of sports spectators. Moreover, in sports there are some well-researched disciplines (e.g. football or North American sports) where consumer profiles and marketing strategies are elaborate, and others where no research at all can be found. For example, there is almost no research on athletics spectators. This paper explores the current state of research on sports spectator segmentation. An in-depth literature review provides the framework for a spectator segmentation in athletics. On this basis, additional potential consumer groups and implications for social media marketing will be explored. The findings are the basis for further research.
Keywords: Multidimensional segmentation, social media, sports marketing, sports spectator segmentation.
1899 An Interactive Web-based Simulation Tool for Surgical Thread
Authors: A. Ruimi, S. Goyal, B. M. Nour
Abstract:
Interactive web-based computer simulations are needed by the medical community to replicate the experience of surgical procedures as closely and realistically as possible without the need to practice on corpses, animals and/or plastic models. In this paper, we offer a review of the current state of research on simulations of surgical threads, identify future needs and present our proposed plans to meet them. Our goal is to create a physics-based simulator which will predict the behavior of surgical thread when subjected to conditions commonly encountered during surgery. To that end, we will i) develop three-dimensional finite element models based on the Cosserat theory of elasticity, ii) test the results and gather feedback from the medical community, and iii) develop a web-based user interface to run/command our simulator and visualize the results. The impacts of our research are that i) it will contribute to the development of a new generation of training for medical school students and ii) the simulator will be useful to expert surgeons in developing new, better and less risky procedures.
Keywords: Cosserat rod theory, FEM simulations, modeling, surgical thread.
1898 Comparison of Neural Network and Logistic Regression Methods to Predict Xerostomia after Radiotherapy
Authors: Hui-Min Ting, Tsair-Fwu Lee, Ming-Yuan Cho, Pei-Ju Chao, Chun-Ming Chang, Long-Chang Chen, Fu-Min Fang
Abstract:
To evaluate the ability to predict xerostomia after radiotherapy, we constructed and compared neural network and logistic regression models. In this study, 61 patients who completed a questionnaire about their quality of life (QoL) before and after a full course of radiation therapy were included. Based on this questionnaire, statistical data about the condition of the patients’ salivary glands were obtained, and these data were used as the inputs of the neural network and logistic regression models in order to predict the probability of xerostomia. Seven variables were then selected from the statistical data according to Cramer’s V and point-biserial correlation values and were used to train each model, yielding an area under the curve (AUC) of 0.88 and 0.89, a sum of squared errors (SSE) of 9.20 and 7.65, and a mean absolute percentage error (MAPE) of 13.7% and 19.0%, respectively. These results demonstrate that both neural network and logistic regression methods are effective for predicting the condition of the parotid glands.
Keywords: NPC, ANN, logistic regression, xerostomia.
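A minimal sketch of the kind of comparison described in this abstract, on synthetic data rather than the study's 61 patients (the feature values, sample split and network size are placeholders), reporting the AUC metric the authors quote:

```python
# Minimal sketch (not the authors' code or data): compare logistic regression
# and a small feed-forward neural network on a binary xerostomia-style label,
# reporting AUC on a held-out set. All data here are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(61, 7))                       # 61 patients, 7 selected variables (placeholders)
y = (X @ rng.normal(size=7) + rng.normal(scale=0.5, size=61) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("neural network", MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))]:
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```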
1897 Spatial Econometric Approaches for Count Data: An Overview and New Directions
Authors: Paula Simões, Isabel Natário
Abstract:
This paper reviews a number of theoretical aspects of implementing an explicit spatial perspective in econometrics for modelling non-continuous data in general, and count data in particular. It provides an overview of the several spatial econometric approaches that are available to model data collected with reference to location in space, from the classical spatial econometrics approaches to the recent developments on spatial econometrics for count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures for different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial econometrics literature as well as from hierarchical modeling and analysis of spatial data, in order to look for possible new directions for the processing of count data in a spatial hierarchical Bayesian econometric context.
Keywords: Spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data.
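As an illustration of the kind of specification reviewed here (one common hierarchical form, not necessarily the exact model discussed in the paper), a Bayesian spatial count model with a spatial lag in the log-mean can be written as

$$
y_i \mid \mu_i \sim \mathrm{Poisson}(\mu_i), \qquad
\log \mu_i = \rho \sum_{j} w_{ij} \log \mu_j + \mathbf{x}_i^{\top}\boldsymbol{\beta} + \phi_i, \qquad
\phi_i \sim \mathcal{N}(0, \sigma^2),
$$

where $w_{ij}$ are row-standardised spatial weights, $\rho$ is the spatial lag autocorrelation parameter, and priors on $(\boldsymbol{\beta}, \rho, \sigma^2)$ complete the hierarchy.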
1896 The Use of Thermal Infrared Wavelengths to Determine the Volcanic Soils
Authors: Levent Basayigit, Mert Dedeoglu, Fadime Ozogul
Abstract:
In this study, an application was carried out to determine volcanic soils using remote sensing. The study area was located on the Golcuk formation in Isparta, Turkey. The thermal bands of a Landsat 7 image were used for processing. A climate model based on the water index was implemented in ERDAS Imagine software together with pixel-based image classification. The Soil Moisture Index (SMI) was modeled using the surface temperature (Ts) obtained from the thermal bands and the vegetation index (NDVI) derived from Landsat 7. Surface moisture values were grouped and classified using a scoring system. The thematic layers were compared with the field studies. Consequently, different moisture levels proved to be an indicator for determining and separating volcanic soils. These thermal wavelengths are therefore preferable bands for the separation of volcanic soils using moisture and temperature models.
Keywords: Landsat 7, soil moisture index, temperature models, volcanic soils.
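A minimal sketch of the processing chain this abstract describes; the band values are random placeholders and the moisture proxy is a simple normalized-temperature stand-in, since the paper's exact SMI formula is not given here:

```python
# Minimal sketch (illustrative only, not the study's SMI formulation): derive NDVI
# from red/NIR reflectance and a simple normalized soil-moisture proxy from the
# thermal-band surface temperature Ts, then group the values with a scoring scheme.
import numpy as np

red = np.random.rand(100, 100)             # placeholder Landsat 7 red-band reflectance
nir = np.random.rand(100, 100)             # placeholder near-infrared reflectance
ts = 290 + 20 * np.random.rand(100, 100)   # placeholder surface temperature from the thermal band (K)

ndvi = (nir - red) / (nir + red + 1e-9)

# Placeholder moisture proxy: pixels that are hot relative to the scene range score as drier.
smi = (ts.max() - ts) / (ts.max() - ts.min() + 1e-9)

# Simple scoring: group moisture values into classes for a thematic layer.
classes = np.digitize(smi, bins=[0.25, 0.5, 0.75])   # 0 = driest ... 3 = wettest
print(ndvi.mean(), smi.mean(), np.bincount(classes.ravel()))
```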
1895 Adsorption of Cadmium onto Activated and Non-Activated Date Pits
Authors: Munther I. Kandah, Fahmi A. Abu Al-Rub, Lucy Bawarish, Mira Bawarish, Hiba Al-Tamimi, Reem Khalil, Raja'a Sa'ada
Abstract:
In this project, cadmium ions were adsorbed from aqueous solutions onto either date pits, a cheap, nontoxic agricultural material, or chemically activated carbon prepared from date pits using phosphoric acid. A series of experiments was conducted using a batch adsorption technique to assess the feasibility of using the prepared adsorbents. The effects of process variables such as initial cadmium ion concentration, contact time, solution pH and adsorbent dose on the adsorption capacity of both adsorbents were studied. The experimental data were tested against different isotherm models such as Langmuir, Freundlich, Tempkin and Dubinin-Radushkevich. The results showed that although the equilibrium data could be described by all models used, the Langmuir model gave slightly better results for the activated carbon, while the Freundlich model gave better results for the date pits.
Keywords: Adsorption, cadmium, chemical activation, date pits.
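A minimal sketch of the isotherm-fitting step this abstract describes, using the standard Langmuir and Freundlich forms on synthetic equilibrium data (the concentrations and uptakes below are placeholders, not the study's measurements):

```python
# Minimal sketch (synthetic data): fit the Langmuir and Freundlich isotherms to
# equilibrium adsorption data and compare their goodness of fit.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, k_l):
    return q_max * k_l * c / (1.0 + k_l * c)

def freundlich(c, k_f, n):
    return k_f * c ** (1.0 / n)

c_eq = np.array([5., 10., 20., 40., 80., 160.])      # equilibrium Cd concentration (mg/L), placeholder
q_eq = np.array([3.1, 5.4, 8.6, 11.9, 14.2, 15.8])   # uptake at equilibrium (mg/g), placeholder

for name, model, p0 in [("Langmuir", langmuir, (20., 0.01)),
                        ("Freundlich", freundlich, (1.0, 2.0))]:
    popt, _ = curve_fit(model, c_eq, q_eq, p0=p0, maxfev=10000)
    sse = np.sum((q_eq - model(c_eq, *popt)) ** 2)
    print(f"{name}: parameters = {popt}, SSE = {sse:.3f}")
```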
1894 CFD Simulations to Validate Two and Three Phase Up-flow in Bubble Columns
Authors: Shyam Kumar, Nannuri Srinivasulu, Ashok Khanna
Abstract:
Bubble columns have a variety of applications in absorption, bio-reactions, catalytic slurry reactions, and coal liquefaction, because they are simple to operate, provide good heat and mass transfer, and have low operational cost. The use of Computational Fluid Dynamics (CFD) for bubble columns is important, since it can describe the fluid hydrodynamics on both local and global scales. An Euler-Euler two-phase fluid model has been used to simulate two-phase (air and water) transient up-flow in a bubble column (15 cm diameter) using FLUENT 6.3. These simulations and experiments were operated over a range of superficial gas velocities in the bubbly flow and churn-turbulent regimes (1 to 16 cm/s) at ambient conditions. The liquid velocity was varied from 0 to 16 cm/s. The turbulence in the liquid phase is described using the standard k-ε model. The interactions between the two phases are described through a drag coefficient formulation (Schiller-Naumann). The objectives are to validate the CFD simulations against experimental data and to obtain grid-independent numerical solutions. Quantitatively good agreement is obtained between the experimental hold-up data and the simulated values. Axial liquid velocity profiles and gas holdup profiles were also obtained from the simulation.
Keywords: Bubble column, computational fluid dynamics, gas holdup profile, k-ε model.
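For reference, the Schiller-Naumann correlation named in this abstract gives the drag coefficient of a sphere as a function of the particle Reynolds number; a small sketch of it (not taken from the paper's FLUENT setup):

```python
# Minimal sketch: the Schiller-Naumann drag coefficient correlation commonly used
# for the interphase drag term in Euler-Euler bubble-column simulations.
def schiller_naumann_cd(re: float) -> float:
    """Drag coefficient of a sphere as a function of particle Reynolds number."""
    if re <= 0.0:
        return 0.0
    if re <= 1000.0:
        return 24.0 / re * (1.0 + 0.15 * re ** 0.687)
    return 0.44   # Newton regime

print(schiller_naumann_cd(100.0))   # ~1.09 in the intermediate regime
```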
1893 A Context-Aware Supplier Selection Model
Authors: Mohammadreza Razzazi, Maryam Bayat
Abstract:
Selection of the best possible set of suppliers has a significant impact on the overall profitability and success of any business. For this reason, it is usually necessary to optimize all business processes and to make use of cost-effective alternatives for additional savings. This paper proposes a new efficient context-aware supplier selection model that takes into account possible changes of the environment while significantly reducing selection costs. The proposed model is based on data clustering techniques while borrowing certain principles from online algorithms for an optimal selection of suppliers. Unlike common selection models, which re-run the selection algorithm from scratch on the whole environment for every decision-making sub-period, our model considers only the changes and superimposes them on the previously defined best set of suppliers to obtain a new best set. Therefore, any recomputation of unchanged elements of the environment is avoided and selection costs are consequently reduced significantly. A numerical evaluation confirms the applicability of this model and shows that it outperforms common static selection models in this field.
Keywords: Supplier selection, context-awareness, online algorithms, data clustering.
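A minimal sketch of the incremental idea described above (an illustration only, not the authors' clustering-based algorithm): only the suppliers whose context changed are re-scored and superimposed on the previous best set.

```python
# Minimal sketch (illustration of the incremental idea, not the paper's algorithm):
# instead of re-running selection from scratch each sub-period, re-score only the
# suppliers whose context changed and merge them with the previously selected set.
def select_top(scores: dict, k: int) -> set:
    """Pick the k best suppliers by score."""
    return set(sorted(scores, key=scores.get, reverse=True)[:k])

def incremental_select(prev_best: set, scores: dict, changed: dict, k: int):
    """Superimpose only the changed scores on the previous environment."""
    scores = {**scores, **changed}                   # apply the changes only
    candidates = prev_best | set(changed)            # unchanged, unselected suppliers stay untouched
    return select_top({s: scores[s] for s in candidates}, k), scores

scores = {"A": 0.9, "B": 0.7, "C": 0.6, "D": 0.4}
best = select_top(scores, k=2)                       # initial selection: {"A", "B"}
best, scores = incremental_select(best, scores, changed={"D": 0.95}, k=2)
print(best)                                          # {"A", "D"} without re-scoring "C"
```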
1892 Modeling and Control of Direct Driven PMSG for Ultra Large Wind Turbines
Authors: Ahmed M. Hemeida, Wael A. Farag, Osama A. Mahgoub
Abstract:
This paper focuses on developing an integrated, reliable and sophisticated model for ultra large wind turbines and on studying the performance and analysis of vector control on large wind turbines. With the advance of power electronics technology, the direct-driven multi-pole radial flux PMSG (Permanent Magnet Synchronous Generator) has proven to be a good choice for wind turbine manufacturers. To study wind energy conversion systems, it is important to develop a wind turbine simulator that is able to produce realistic and validated conditions that occur in real ultra-MW wind turbines. Three different packages are used to simulate this model, namely TurbSim, FAST and Simulink. TurbSim is a full-field wind simulator developed by the National Renewable Energy Laboratory (NREL). The wind turbine mechanical parts are modeled by the FAST (Fatigue, Aerodynamics, Structures and Turbulence) code, which is also developed by NREL. Simulink is used to model the PMSG, the full-scale back-to-back IGBT converters, and the grid.
Keywords: FAST, permanent magnet synchronous generator (PMSG), TurbSim, vector control, pitch control.
1891 A Framework for Product Development Process including HW and SW Components
Authors: Namchul Do, Gyeongseok Chae
Abstract:
This paper proposes a framework for product development including hardware and software components. It provides separation of hardware-dependent software, modifications of the current product development process, and integration of software modules with existing product configuration models and assembly product structures. In order to identify the dependent software, the framework considers product configuration modules and engineering changes of the associated software and hardware components. In order to support efficient integration of the two different hardware and software development processes, a modified product development process is proposed. The process integrates the dependent software development into product development through the interchange of specific product information. By using existing product data models in Product Data Management (PDM), the framework represents software as modules for product configurations and as software parts for the product structure. The framework is applied to the development of a robot system in order to show its effectiveness.
Keywords: HW and SW development integration, product development with software.
1890 Application of Feed-Forward Neural Networks Autoregressive Models with Genetic Algorithm in Gross Domestic Product Prediction
Authors: E. Giovanis
Abstract:
In this paper we present a Feed-Forward Neural Network Autoregressive (FFNN-AR) model with genetic algorithm training optimization in order to predict the gross domestic product growth of six countries. Specifically, we propose a kind of weighted regression, which can be used for econometric purposes, where the initial inputs are multiplied by the neural network's final optimum weights from the input-hidden layer of the training process. The forecasts are compared with those of the ordinary autoregressive model and we conclude that the proposed regression's forecasting results significantly outperform those of the autoregressive model. Moreover, this technique can be used in autoregressive-moving average models, with and without exogenous inputs, and the training process with genetic algorithm optimization can be replaced by the error back-propagation algorithm.
Keywords: Autoregressive model, feed-forward neural networks, genetic algorithms, gross domestic product.
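A minimal sketch of a feed-forward neural network autoregression of the kind described above, on a synthetic growth series and with standard back-propagation in place of the genetic algorithm training (a substitution the abstract itself notes is possible):

```python
# Minimal sketch (synthetic series, back-propagation instead of GA training):
# a feed-forward neural network autoregression on lagged GDP growth values.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
growth = np.cumsum(rng.normal(scale=0.5, size=120)) * 0.05 + 2.0   # placeholder GDP growth series

p = 4                                                    # autoregressive order
X = np.column_stack([growth[i:len(growth) - p + i] for i in range(p)])   # lagged inputs
y = growth[p:]

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X[:-12], y[:-12])                              # hold out the last 12 observations
forecast = model.predict(X[-12:])
print(np.mean((forecast - y[-12:]) ** 2))                # out-of-sample MSE
```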
1889 Optimizing usage of ICTs and Outsourcing Strategic in Business Models and Customer Satisfaction
Authors: Saeed Rahmani Bagha, Mohammad Mirzahosseinian, Sonatkhatoon Kashanimotlagh
Abstract:
Nowadays, developing countries, in order to progress in science and technology and to decrease the technological gap with developed countries, are increasing their capacities and technology transfer from developed countries. To remain competitive, industry is continually searching for new methods to evolve its products. The business model is one of the latest buzzwords in the Internet and electronic business world. To be successful, organizations must look into the needs and wants of their customers. This research attempts to identify a specific feature of the company with a strong competitive advantage by analyzing the causes of customer satisfaction. Due to the rapid development of knowledge and information technology, business environments have become much more complicated. Information technology can help a firm aiming to gain a competitive advantage. This study explores the role and effect of information and communication technology (ICT) on business models and customer satisfaction in firms, as well as the relationship between ICTs and outsourcing strategies.
Keywords: Information communication technology, outsourcing, customer satisfaction, business plan.
1888 An Empirical Model of Correlated Traffics in LTE-Advanced System through an Innovative Simulation Tool
Authors: Ghassan A. Abed, Mahamod Ismail, Samir I. Badrawi, Bayan M. Sabbar
Abstract:
Long Term Evolution Advanced (LTE-Advanced) is not new as a radio access technology, but is an evolution of LTE to enhance its performance. This generation is the continuation of 3GPP-LTE (3GPP: 3rd Generation Partnership Project) and is targeted at the advanced development of the LTE requirements in terms of throughput and coverage. The performance evaluation process of any network should be based on many models and simulations to investigate the network layers and functions and to monitor the employment of the new technologies, especially when the network includes large-bandwidth, low-latency links such as LTE and LTE-Advanced networks. Therefore, it is necessary to enhance the proposed models of high-speed and highly congested link networks so that these links and their traffic can meet the needs of the huge volumes of data transferred over the congested links. This article offers an innovative model of the most correlated links of an LTE-Advanced system using Network Simulator 2 (NS-2), with an investigation of the link parameters.
Keywords: 3GPP, LTE, LTE-Advanced, NS-2.
1887 Metrology-Inspired Methods to Assess the Biases of Artificial Intelligence Systems
Authors: Belkacem Laimouche
Abstract:
With the field of Artificial Intelligence (AI) experiencing exponential growth, fueled by technological advancements that pave the way for increasingly innovative and promising applications, there is an escalating need to develop rigorous methods for assessing their performance in pursuit of transparency and equity. This article proposes a metrology-inspired statistical framework for evaluating bias and explainability in AI systems. Drawing from the principles of metrology, we propose a pioneering approach, using a concrete example, to evaluate the accuracy and precision of AI models, as well as to quantify the sources of measurement uncertainty that can lead to bias in their predictions. Furthermore, we explore a statistical approach for evaluating the explainability of AI systems based on their ability to provide interpretable and transparent explanations of their predictions.
Keywords: Artificial intelligence, metrology, measurement uncertainty, prediction error, bias, machine learning algorithms, probabilistic models, inter-laboratory comparison, data analysis, data reliability, bias impact assessment, bias measurement.
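A minimal sketch of the metrological flavour of this evaluation (an illustration only, not the article's protocol): prediction error is treated like a measurement error, with a per-group bias (systematic component) and a standard uncertainty of that bias estimate.

```python
# Minimal sketch (placeholder data and groups, not the article's procedure): report
# bias (mean signed error) and a Type-A-style standard uncertainty per subgroup.
import numpy as np

y_true = np.array([1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0])
y_prob = np.array([0.9, 0.2, 0.6, 0.8, 0.4, 0.7, 0.1, 0.3])   # placeholder model scores
group = np.array(["a", "a", "a", "a", "b", "b", "b", "b"])     # placeholder subgroup attribute

for g in np.unique(group):
    err = y_prob[group == g] - y_true[group == g]
    bias = err.mean()                                  # systematic component of the error
    u = err.std(ddof=1) / np.sqrt(err.size)            # standard uncertainty of the bias estimate
    print(f"group {g}: bias = {bias:+.3f} ± {u:.3f}")
```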
1886 Modeling Football Penalty Shootouts: How Improving Individual Performance Affects Team Performance and the Fairness of the ABAB Sequence
Authors: Pablo Enrique Sartor Del Giudice
Abstract:
Penalty shootouts often decide the outcome of important soccer matches. Although usually referred to as "lotteries", there is evidence that some national teams and clubs consistently perform better than others. The outcomes are therefore not explained by mere luck alone, and there are ways to improve the average performance of players, naturally at the expense of some sort of effort. In this article we study the payoff of player performance improvements in terms of the performance of the team as a whole. To do so, we develop an analytical model with static individual performances, as well as Monte Carlo models that take into account the known influence of the partial score and the round number on individual performances. We find that, within a range of usual values, team performance improves more than 70% faster than individual performances do. Using these models, we also estimate that the new ABBA penalty shootout ordering under test removes almost all of the known bias in favor of the first-shooting team under the current ABAB system.
Keywords: Football, penalty shootouts, Monte Carlo simulation, ABBA.
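A minimal Monte Carlo sketch of the ABAB-versus-ABBA comparison (a deliberately simplified model, not the authors' calibrated one): each kick has a base success probability that drops by a "pressure" penalty when the shooter's team is currently behind, which is what makes the ABAB order favour the first-shooting team.

```python
# Minimal sketch (simplified assumptions, hypothetical parameter values): Monte Carlo
# comparison of the ABAB and ABBA shooting orders when kicking while behind in the
# partial score costs a fixed amount of success probability.
import random

def shootout(order: str, p: float = 0.76, pressure: float = 0.08, n: int = 200_000) -> float:
    """Win rate of team A (shoots first) under a given kicking order."""
    wins = 0
    for _ in range(n):
        score = {"A": 0, "B": 0}
        seq = (order * 5)[:10]                       # best-of-five phase, 5 kicks per team
        for shooter in seq:
            other = "B" if shooter == "A" else "A"
            prob = p - pressure if score[shooter] < score[other] else p
            score[shooter] += random.random() < prob
        while score["A"] == score["B"]:              # sudden death, one kick each
            score["A"] += random.random() < p
            score["B"] += random.random() < p
        wins += score["A"] > score["B"]
    return wins / n

print("ABAB:", shootout("AB"), "ABBA:", shootout("ABBA"))
```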
1885 WEMax: Virtual Manned Assembly Line Generation
Authors: Won Kyung Ham, Kang Hoon Cho, Yongho Chung, Sang C. Park
Abstract:
Presented in this paper is the framework of a software tool, 'WEMax'. WEMax was created for the analysis and simulation of manned assembly lines in order to sustain and improve the performance of manufacturing systems. In a manufacturing system, performance, such as productivity, is a key to the competitiveness of the output products. However, the performance of manned assembly lines is difficult to forecast, because human labor is not a factor that can readily be captured by computer simulation or mathematical models. Existing approaches to performance forecasting of manned assembly lines are limited to matters of the human itself, such as ergonomic and workload design, and to simulations that do not consider human factors. Consequently, an approach to forecasting and improving manned assembly line performance needs to be researched. As a solution to the current problem, this study proposes a framework for the generation and simulation of virtual manned assembly lines, and the framework has been implemented as software.
Keywords: Performance Forecasting, Simulation, Virtual Manned Assembly Line.
1884 A Method to Improve Test Process in Federal Enterprise Architecture Framework Using ISTQB Framework
Authors: Hamideh Mahdavifar, Ramin Nassiri, Alireza Bagheri
Abstract:
Enterprise Architecture (EA) is a framework for the description, coordination and alignment of all activities across an organization in order to achieve strategic goals using ICT enablers. A number of EA-compatible frameworks have been developed. In this paper, we mainly focus on the Federal Enterprise Architecture Framework (FEAF), since its reference models are plentiful. Among these models we are interested here in its business reference model (BRM). The test process is one important subject of an EA project which is somewhat overlooked. This lack of attention may cause drawbacks or even failure of an enterprise architecture project. To address this issue, we use the International Software Testing Qualifications Board (ISTQB) framework and standard test suites to present a method to improve the EA test process. The main challenge is how to communicate between the concepts of EA and ISTQB. In this paper, we propose a method for integrating these concepts.
Keywords: Business Reference Model (BRM), Federal Enterprise Architecture (FEA), ISTQB, Test Techniques.
1883 Incorporating Lexical-Semantic Knowledge into Convolutional Neural Network Framework for Pediatric Disease Diagnosis
Authors: Xiaocong Liu, Huazhen Wang, Ting He, Xiaozheng Li, Weihan Zhang, Jian Chen
Abstract:
The utilization of electronic medical record (EMR) data to establish the disease diagnosis model has become an important research content of biomedical informatics. Deep learning can automatically extract features from the massive data, which brings about breakthroughs in the study of EMR data. The challenge is that deep learning lacks semantic knowledge, which leads to impracticability in medical science. This research proposes a method of incorporating lexical-semantic knowledge from abundant entities into a convolutional neural network (CNN) framework for pediatric disease diagnosis. Firstly, medical terms are vectorized into Lexical Semantic Vectors (LSV), which are concatenated with the embedded word vectors of word2vec to enrich the feature representation. Secondly, the semantic distribution of medical terms serves as Semantic Decision Guide (SDG) for the optimization of deep learning models. The study evaluates the performance of LSV-SDG-CNN model on four kinds of Chinese EMR datasets. Additionally, CNN, LSV-CNN, and SDG-CNN are designed as baseline models for comparison. The experimental results show that LSV-SDG-CNN model outperforms baseline models on four kinds of Chinese EMR datasets. The best configuration of the model yielded an F1 score of 86.20%. The results clearly demonstrate that CNN has been effectively guided and optimized by lexical-semantic knowledge, and LSV-SDG-CNN model improves the disease classification accuracy with a clear margin.
Keywords: lexical semantics, feature representation, semantic decision, convolutional neural network, electronic medical record
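A minimal sketch of the feature-enrichment step this abstract describes, with hypothetical dimensions (the LSV construction, embedding sizes and class count are placeholders, not the authors' configuration): the Lexical Semantic Vector of each token is concatenated with its word2vec embedding before a 1D convolution.

```python
# Minimal sketch (hypothetical sizes, random tensors in place of real EMR text):
# concatenate LSV features with word embeddings per token, then apply a Conv1d.
import torch
import torch.nn as nn

batch, seq_len, d_word, d_lsv = 2, 50, 100, 20        # placeholder sizes
word_emb = torch.randn(batch, seq_len, d_word)        # word2vec embeddings of the tokens
lsv = torch.randn(batch, seq_len, d_lsv)              # LSV of matched medical terms (zeros elsewhere)

x = torch.cat([word_emb, lsv], dim=-1)                # enriched token representation
conv = nn.Conv1d(in_channels=d_word + d_lsv, out_channels=64, kernel_size=3, padding=1)
features = torch.relu(conv(x.transpose(1, 2)))        # (batch, 64, seq_len)
logits = nn.Linear(64, 10)(features.max(dim=-1).values)  # max-pool over time, 10 placeholder classes
print(logits.shape)                                   # torch.Size([2, 10])
```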
1882 Piezoelectric Transducer Modeling: with System Identification (SI) Method
Authors: Nora Taghavi, Ali Sadr
Abstract:
System identification is the process of creating models of a dynamic process from input-output signals. The aim of system identification can be stated as "to find a model with adjustable parameters and then to adjust them so that the predicted output matches the measured output". This paper presents a method of modeling and simulation with system identification to achieve the maximum fitness of the transfer function. First, using an optimized KLM equivalent circuit for a PVDF piezoelectric transducer and assuming different inputs, including a sinusoid, a step, and a sum of sinusoids, we obtain the outputs; then, using the System Identification Toolbox in MATLAB, we estimate the transfer function from the inputs and outputs produced in the previous step. We then compare the fitness of the transfer functions obtained from the ARX, OE (Output-Error) and BJ (Box-Jenkins) models in the System Identification Toolbox with the original transfer function from the KLM equivalent circuit.
Keywords: PVDF modeling, ARX, BJ (Box-Jenkins), OE (Output-Error), system identification.
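As an illustration of the simplest of the model structures named above (the ARX structure; this is a plain least-squares sketch on a synthetic system, not the MATLAB toolbox the paper uses):

```python
# Minimal sketch: least-squares fit of an ARX(2, 2) model
# y(t) = -a1*y(t-1) - a2*y(t-2) + b1*u(t-1) + b2*u(t-2) + e(t)
# to input-output data from a placeholder "true" system.
import numpy as np

rng = np.random.default_rng(0)
u = rng.normal(size=500)                               # excitation input (e.g. a noise-like signal)
y = np.zeros_like(u)
for t in range(2, len(u)):                             # placeholder stable second-order system
    y[t] = 1.2 * y[t - 1] - 0.5 * y[t - 2] + 0.8 * u[t - 1] + 0.3 * u[t - 2] + 0.01 * rng.normal()

# Build the regression matrix and solve for the ARX parameters.
phi = np.column_stack([-y[1:-1], -y[:-2], u[1:-1], u[:-2]])
theta, *_ = np.linalg.lstsq(phi, y[2:], rcond=None)
a1, a2, b1, b2 = theta
print(f"recovered: {-a1:.2f}, {-a2:.2f}, {b1:.2f}, {b2:.2f}")   # should be close to 1.2, -0.5, 0.8, 0.3
```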
1881 Cirrhosis Mortality Prediction as Classification Using Frequent Subgraph Mining
Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride
Abstract:
In this work, we use machine learning and data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models are applied to predict one-year mortality. A comprehensive feature space including demographic information, comorbidity, clinical procedures and laboratory tests is analyzed. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) is used. The Model for End-stage Liver Disease (MELD) prediction of mortality is used as a comparator. All of our models statistically significantly outperform the MELD-score model and show an average 10% improvement of the area under the curve (AUC). The FSM technique by itself does not improve the model significantly, but FSM, together with a machine learning technique called an ensemble, further improves the model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. However, due to the sparsity of the temporal information needed by FSM, the FSM model does not yield significant improvements. Our work applies modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients and builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning
1880 Assessment of the Influence of External Earth Terrain on the Construction of Physical-Mathematical Models for Finding the Dynamics of Pollutants' Distribution in the Urban Atmosphere
Authors: Stanislav Aryeh V. Fradkin, Sharif E. Guseynov
Abstract:
There is a complex situation in the transport environment of the cities of the world. For the analysis and prevention of environmental problems, an accurate calculation of hazardous substance concentrations at each point of the investigated area is required. In the turbulent atmosphere of the city, the well-known methods of mathematical statistics cannot be applied to these tasks with a satisfactory level of accuracy. Therefore, to solve this class of problems, the apparatus of mathematical physics is more appropriate. In such models, because of the difficulty involved, the influence of an uneven land surface on the streams of air masses in the turbulent atmosphere of the city is, as a rule, not taken into account. In this paper, the influence of the surface roughness, which can be quite large, is shown mathematically. The analysis of this problem under certain conditions identified the possibility of areas appearing in the atmosphere with pressure tending to infinity, i.e. the so-called "wall effect".
Keywords: Air pollution, concentration of harmful substances, physical-mathematical model, urban area.
1879 Annotations of Gene Pathways Images in Biomedical Publications Using Siamese Network
Authors: Micheal Olaolu Arowolo, Muhammad Azam, Fei He, Mihail Popescu, Dong Xu
Abstract:
As the quantity of biological articles rises, so does the number of biological pathway figures. Each pathway figure shows gene names and relationships. Manually annotating pathway diagrams is time-consuming. Advanced image understanding models could speed up curation, but they must become more precise. There is rich information in biological pathway figures. The first step in performing image understanding of these figures is to recognize gene names automatically. Classical optical character recognition methods have been employed for gene name recognition, but they are not optimized for literature mining data. This study devised a method to recognize the image bounding box of a gene name using deep Siamese neural network models to outperform the existing methods; using ResNet, DenseNet and Inception architectures, the results obtained were about 84% accuracy.
Keywords: Biological pathway, gene identification, object detection, Siamese network, ResNet.
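A minimal sketch of the general Siamese idea named in this abstract, with a shared ResNet-18 backbone and hypothetical input sizes (this is not the authors' trained model or configuration):

```python
# Minimal sketch (hypothetical configuration): a Siamese network with a shared
# ResNet-18 backbone that embeds two image crops and scores their similarity by
# Euclidean distance, as used for recognition by matching.
import torch
import torch.nn as nn
from torchvision import models

class Siamese(nn.Module):
    def __init__(self, embed_dim: int = 128):
        super().__init__()
        backbone = models.resnet18(weights=None)                # one backbone shared by both branches
        backbone.fc = nn.Linear(backbone.fc.in_features, embed_dim)
        self.backbone = backbone

    def forward(self, x1: torch.Tensor, x2: torch.Tensor) -> torch.Tensor:
        e1, e2 = self.backbone(x1), self.backbone(x2)
        return nn.functional.pairwise_distance(e1, e2)          # small distance = same gene name

model = Siamese()
crop_a = torch.randn(4, 3, 64, 64)    # placeholder bounding-box crops from pathway figures
crop_b = torch.randn(4, 3, 64, 64)
print(model(crop_a, crop_b).shape)    # torch.Size([4])
```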
1878 Gravitational Frequency Shifts for Photons and Particles
Authors: Jing-Gang Xie
Abstract:
This research considers the integration of quantum field theory and general relativity. Although both are successful models in explaining the behavior of particles, they are incompatible since they work at different mass and energy scales, as evidenced by the description of black holes and the formation of the universe. This remains so despite previous efforts to merge the two theories, including string theory, quantum gravity models, and others. In a bid to arrive at an actionable experiment, the paper's approach starts with derivations from the existing theories. It goes on to test the derivations by applying the same initial assumptions, coupled with several deviations. The resulting equations give results similar to those of the classical Newtonian model, quantum mechanics, and general relativity as long as conditions are normal. However, the outcomes differ when conditions are extreme, specifically with no breakdowns even below the Schwarzschild radius or at the Planck length. Even so, it demonstrates the possibility of integrating the two theories.
Keywords: General relativity theory, particles, photons, quantum gravity model, gravitational frequency shift.
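For orientation (the standard textbook result that any such model should reproduce under normal conditions, not the paper's own derivation), the gravitational frequency shift of a photon emitted at radius $r$ from a mass $M$ and observed far away is, in the Schwarzschild geometry and in its weak-field limit,

$$
\frac{f_{\infty}}{f_{r}} = \sqrt{1 - \frac{2GM}{rc^{2}}}, \qquad
\frac{\Delta f}{f} \simeq -\frac{GM}{rc^{2}} \quad \left(\frac{GM}{rc^{2}} \ll 1\right).
$$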
1877 Context for Simplicity: A Basis for Context-aware Systems Based on the 3GPP Generic User Profile
Authors: Enrico Rukzio, George N. Prezerakos, Giovanni Cortese, Eleftherios Koutsoloukas, Sofia Kapellaki
Abstract:
The paper focuses on the area of context modeling with respect to the specification of context-aware systems supporting ubiquitous applications. The proposed approach, followed within the SIMPLICITY IST project, uses a high-level system ontology to derive context models for system components, which are consequently mapped to the system's physical entities. For the definition of user- and device-related context models in particular, the paper suggests a standards-based process consisting of an analysis phase using the Common Information Model (CIM) methodology, followed by an implementation phase that defines 3GPP-based components. The benefits of this approach are further depicted by preliminary examples of XML grammars defining profiles and components, and of component instances, coupled with descriptions of the respective ubiquitous applications.
Keywords: 3GPP, context, context-awareness, context model, information model, user model, XML
1876 3D CAD Models and its Feature Similarity
Authors: Elmi Abu Bakar, Tetsuo Miyake, Zhong Zhang, Takashi Imamura
Abstract:
Knowing the geometrical pose of products on a manufacturing line before robot manipulation is required and makes overall shape measurement less time-consuming. In order to perform this, information on shape representation and object matching is required. Objects are compared through their descriptors, which are conceptually subtracted from each other to form a scalar metric. The smaller the metric value, the closer the objects are considered to be to each other. Rotating the object from its static pose in some direction changes the scalar metric value of the boundary information after feature extraction of the related object. In this paper, an indexing technique for the retrieval of 3D geometrical models based on the similarity between boundary shapes is proposed, in order to measure 3D CAD object pose using shape feature matching for a Computer Aided Testing (CAT) system on a production line. Experimental results show the effectiveness of the proposed method.
Keywords: CAD, rendering, feature extraction, feature classification.
1875 Statistical Modeling of Mobile Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes
Authors: Jihad S. Daba, J. P. Dubois
Abstract:
Understanding the statistics of non-isotropic scattering multipath channels that fade randomly with respect to time, frequency, and space in a mobile environment is very crucial for the accurate detection of received signals in wireless and cellular communication systems. In this paper, we derive stochastic models for the probability density function (PDF) of the shift in the carrier frequency caused by the Doppler Effect on the received illuminating signal in the presence of a dominant line of sight. Our derivation is based on a generalized Clarke’s and a two-wave partially developed scattering models, where the statistical distribution of the frequency shift is shown to be consistent with the power spectral density of the Doppler shifted signal.
Keywords: Doppler shift, filtered Poisson process, generalized Clarke’s model, non-isotropic scattering, partially developed scattering, Rician distribution.
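For reference, under the classical isotropic Clarke model without a line-of-sight component (the baseline that the generalized models above extend), the probability density function of the Doppler shift $\nu$ is

$$
f_{\nu}(\nu) = \frac{1}{\pi f_{m}\sqrt{1-\left(\nu/f_{m}\right)^{2}}}, \qquad |\nu| < f_{m},
$$

where $f_{m} = v f_{c}/c$ is the maximum Doppler shift for mobile speed $v$ and carrier frequency $f_{c}$; the corresponding power spectral density is the familiar U-shaped Jakes spectrum.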
1874 Scientific Workflow Interoperability Evaluation
Authors: Ahmed Alqaoud
Abstract:
There is a wide range of scientific workflow systems today, each one designed to resolve problems at a specific level. In large collaborative projects, it is often necessary to recognize the heterogeneous workflow systems already in use by various partners, and any potential collaboration between these systems requires workflow interoperability. The Publish/Subscribe Scientific Workflow Interoperability Framework (PS-SWIF) approach was proposed to achieve workflow interoperability among workflow systems. This paper evaluates the PS-SWIF approach and its system to achieve workflow interoperability using Web Services with asynchronous notification messages represented by the WS-Eventing standard. This experiment covers different types of communication models provided by the Workflow Management Coalition (WfMC). These models are: chained processes, nested synchronous sub-processes, event synchronous sub-processes, and nested sub-processes (polling/deferred synchronous). The experiment also shows the flexibility and simplicity of the PS-SWIF approach when applied to a variety of workflow systems (Triana, Taverna, Kepler) in local and remote environments.
Keywords: Publish/subscribe, scientific workflow, web services, workflow interoperability.
1873 Monetary Evaluation of Dispatching Decisions in Consideration of Mode Choice Models
Authors: Marcel Schneider, Nils Nießen
Abstract:
Microscopic simulation toolkits allow for consideration of the two processes of railway operations and the preceding timetable production. Block occupation conflicts on both process levels are often solved by using defined train priorities. These conflict resolutions (dispatching decisions) generate reactionary delays for the involved trains. The sum of reactionary delays is commonly used to evaluate the quality of railway operations, which describes the timetable robustness. It is either compared to an acceptable train performance or the delays are appraised economically by linear monetary functions. It is impossible to adequately evaluate dispatching decisions without a well-founded objective function. This paper presents a new approach for the evaluation of dispatching decisions. The approach uses mode choice models and considers the behaviour of the end customers. These models evaluate the reactionary delays in more detail and consider other competing modes of transport. The new approach pursues the coupling of a microscopic model of railway operations with the macroscopic mode choice model. At first, it will be implemented for the railway operations process, but it can also be used for timetable production. The evaluation considers the possibility for the customer to change to other transport modes. The new approach starts by looking at rail and road, but it can also be extended to air travel. The result of mode choice models is the modal split. The reactions of the end customers have an impact on the revenue of the train operating companies. Different purposes of travel have different payment reserves and tolerances towards late running. Aside from changes to revenues, longer journey times can also generate additional costs. The costs are either time- or track-specific and arise from required changes to rolling stock or train crew cycles. Only the variable values are summarised in the contribution margin, which is the basis for the monetary evaluation of delays. The contribution margin is calculated for different possible solutions to the same conflict. The conflict resolution is optimised until the monetary loss becomes minimal. The iterative process therefore determines an optimum conflict resolution by monitoring the change in the contribution margin. Furthermore, a monetary value of each dispatching decision can also be derived.
Keywords: Choice of mode, monetary evaluation, railway operations, reactionary delays.
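A minimal sketch of the evaluation chain described above (a binary rail/road logit with illustrative coefficients, passenger numbers and margins, none of them taken from the study): reactionary delay lowers the rail utility, the change in modal split is converted into a contribution-margin loss, and competing conflict resolutions are compared on that loss.

```python
# Minimal sketch (hypothetical coefficients and values, not the study's calibrated model):
# delay enters a binary logit mode choice, and the lost rail demand is priced via the
# contribution margin to compare two dispatching options for the same conflict.
import math

def rail_share(delay_min: float, beta_time: float = -0.04, asc_rail: float = 0.3) -> float:
    """Probability of choosing rail for a given reactionary delay (binary logit)."""
    v_rail = asc_rail + beta_time * delay_min     # rail utility falls as delay grows
    v_road = 0.0                                  # road utility as the reference alternative
    return math.exp(v_rail) / (math.exp(v_rail) + math.exp(v_road))

def margin_loss(delay_min: float, passengers: int = 400, margin_per_pax: float = 12.0) -> float:
    """Monetary loss of a dispatching decision via the change in modal split."""
    lost_share = rail_share(0.0) - rail_share(delay_min)
    return lost_share * passengers * margin_per_pax

# Compare two resolutions of the same conflict and keep the cheaper one.
for option, delay in [("give priority to train 1", 3.0), ("give priority to train 2", 9.0)]:
    print(f"{option}: contribution-margin loss = {margin_loss(delay):.2f}")
```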
1872 Validity Domains of Beams Behavioural Models: Efficiency and Reduction with Artificial Neural Networks
Authors: Keny Ordaz-Hernandez, Xavier Fischer, Fouad Bennis
Abstract:
In a particular case of behavioural model reduction by ANNs, a shortening of the validity domain has been found. In mechanics, as in other domains, the notion of validity domain allows the engineer to choose a valid model for a particular analysis or simulation. In the study of the mechanical behaviour of a cantilever beam (using linear and non-linear models), Multi-Layer Perceptron (MLP) Backpropagation (BP) networks have been applied as a model reduction technique. This reduced model is constructed to be more efficient than the non-reduced model. Within a less extended domain, the ANN-reduced model correctly estimates the non-linear response at a lower computational cost. It has been found that the neural network model is not able to approximate the linear behaviour, while it does approximate the non-linear behaviour very well. The details of the case are provided with an example of cantilever beam behaviour modelling.
Keywords: Artificial neural network, validity domain, cantilever beam, non-linear behaviour, model reduction.
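A minimal sketch of the reduction idea (a placeholder nonlinear load-deflection curve stands in for the full non-linear beam model, and the network size is arbitrary): an MLP is trained as a reduced model and is only trusted inside the domain covered by its training data.

```python
# Minimal sketch (placeholder response function, not a validated beam formula):
# train an MLP as a reduced model of a non-linear cantilever response and compare
# predictions inside and outside the training (validity) domain.
import numpy as np
from sklearn.neural_network import MLPRegressor

def nonlinear_response(load):
    """Placeholder stiffening load-deflection curve."""
    return 0.02 * load / (1.0 + 0.15 * load)

load_train = np.linspace(0.0, 10.0, 200).reshape(-1, 1)     # validity domain of the reduced model
deflection = nonlinear_response(load_train).ravel()

surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=20000, random_state=0)
surrogate.fit(load_train, deflection)

inside = np.array([[5.0]])       # inside the training domain
outside = np.array([[25.0]])     # outside it -- the reduced model is not trusted here
print(surrogate.predict(inside), nonlinear_response(inside).ravel())
print(surrogate.predict(outside), nonlinear_response(outside).ravel())
```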
1871 Assessment of Landslide Volume for Alishan Highway Based on Database of Rainfall-Induced Slope Failure
Authors: Yun-Yao Chi, Ya-Fen Lee
Abstract:
In this paper, a study of slope failures along the Alishan Highway is carried out. An innovative empirical model is developed based on 15 years of records of rainfall-induced slope failures. The statistical models are intended for assessing the volume of landslides from slope failures along the Alishan Highway in the future. The rainfall data considered in the proposed models include the effective cumulative rainfall and the critical rainfall intensity. The effective cumulative rainfall is defined at the point where the curve of cumulative rainfall goes from steep to flat. Then, rainfall thresholds for landslides are established for assessing the volume of landslides and issuing warnings and/or closures for the Alishan Highway during a future extreme rainfall event. Slope failures during Typhoon Saola in 2012 demonstrate that the new empirical model is effective and applicable to other cases with similar rainfall conditions.
Keywords: Slope failure, landslide, volume, model, rainfall thresholds.
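A minimal sketch of the "steep to flat" criterion mentioned in this abstract (the hyetograph and the flattening threshold below are assumed values, not the study's definition): the effective cumulative rainfall is read off where the hourly intensity drops below a chosen rate.

```python
# Minimal sketch (placeholder hyetograph and an assumed flattening threshold): locate
# the effective cumulative rainfall where the cumulative curve turns from steep to flat,
# and report the critical rainfall intensity up to that point.
import numpy as np

rain = np.array([2, 5, 12, 30, 45, 38, 22, 9, 4, 2, 1, 1], dtype=float)  # mm per hour, placeholder
cumulative = np.cumsum(rain)

flat_rate = 5.0                                       # mm/h threshold for "flat" (assumed value)
steep = rain >= flat_rate
last_steep = np.where(steep)[0].max() if steep.any() else 0

effective_cumulative = cumulative[last_steep]
critical_intensity = rain[:last_steep + 1].max()
print(f"effective cumulative rainfall = {effective_cumulative:.0f} mm, "
      f"critical intensity = {critical_intensity:.0f} mm/h")
```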