Search results for: Hidden Markov Models (HMM)
1921 Adsorption of Cadmium onto Activated and Non-Activated Date Pits
Authors: Munther I. Kandah, Fahmi A. Abu Al-Rub, Lucy Bawarish, Mira Bawarish, Hiba Al-Tamimi, Reem Khalil, Raja'a Sa'ada
Abstract:
In this project, cadmium ions were adsorbed from aqueous solutions onto either date pits, a cheap, nontoxic agricultural material, or activated carbon prepared chemically from date pits using phosphoric acid. A series of batch adsorption experiments was conducted to assess the feasibility of using the prepared adsorbents. The effects of process variables such as initial cadmium ion concentration, contact time, solution pH and adsorbent dose on the adsorption capacity of both adsorbents were studied. The experimental data were tested against different isotherm models, namely Langmuir, Freundlich, Tempkin and Dubinin-Radushkevich. The results showed that although the equilibrium data could be described by all models used, the Langmuir model gave slightly better results for activated carbon, while the Freundlich model gave better results for date pits.
Keywords: Adsorption, Cadmium, Chemical Activation, Date Pits.
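The Langmuir and Freundlich comparison summarised above can be reproduced on any equilibrium data set by nonlinear least-squares fitting. The sketch below is a minimal illustration of that procedure, not the authors' calculation; the Ce/qe arrays are hypothetical placeholders for measured equilibrium data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical equilibrium data: Ce = equilibrium concentration (mg/L),
# qe = amount adsorbed at equilibrium (mg/g). Replace with measured values.
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
qe = np.array([3.1, 5.4, 8.6, 11.9, 14.2, 15.5])

def langmuir(Ce, qmax, KL):
    """Langmuir isotherm: qe = qmax*KL*Ce / (1 + KL*Ce)."""
    return qmax * KL * Ce / (1.0 + KL * Ce)

def freundlich(Ce, KF, n):
    """Freundlich isotherm: qe = KF * Ce**(1/n)."""
    return KF * Ce ** (1.0 / n)

for name, model, p0 in [("Langmuir", langmuir, (20.0, 0.05)),
                        ("Freundlich", freundlich, (1.0, 2.0))]:
    popt, _ = curve_fit(model, Ce, qe, p0=p0, maxfev=10000)
    residuals = qe - model(Ce, *popt)
    r2 = 1.0 - np.sum(residuals**2) / np.sum((qe - qe.mean())**2)
    print(f"{name}: parameters={popt.round(3)}, R^2={r2:.4f}")
```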
1920 A Context-Aware Supplier Selection Model
Authors: Mohammadreza Razzazi, Maryam Bayat
Abstract:
Selection of the best possible set of suppliers has a significant impact on the overall profitability and success of any business. For this reason, it is usually necessary to optimize all business processes and to make use of cost-effective alternatives for additional savings. This paper proposes a new, efficient context-aware supplier selection model that takes into account possible changes in the environment while significantly reducing selection costs. The proposed model is based on data clustering techniques and borrows certain principles from online algorithms for the optimal selection of suppliers. Unlike common selection models, which re-run the selection algorithm from scratch over the whole environment for every decision-making sub-period, our model considers only the changes and superimposes them on the previously determined best set of suppliers to obtain a new best set. Recomputation over unchanged elements of the environment is thereby avoided and selection costs are reduced significantly. A numerical evaluation confirms the applicability of this model and shows that it outperforms common static selection models in this field.
Keywords: Supplier Selection, Context-Awareness, Online Algorithms, Data Clustering.
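The abstract does not disclose the specific clustering algorithm used, so the sketch below only illustrates the general idea of updating a clustering incrementally for changed suppliers instead of re-running it from scratch, using scikit-learn's MiniBatchKMeans as a stand-in; the supplier feature matrix and the cheapest-per-cluster selection rule are hypothetical.

```python
import numpy as np
from sklearn.cluster import MiniBatchKMeans

rng = np.random.default_rng(0)

# Hypothetical supplier features: [unit cost, lead time, quality score].
suppliers = rng.normal(size=(200, 3))

# Initial clustering of the full supplier environment.
km = MiniBatchKMeans(n_clusters=5, random_state=0)
km.fit(suppliers)

def best_per_cluster(X, labels, cost_col=0):
    """Pick the cheapest supplier in each cluster as the illustrative 'best set'."""
    best = {}
    for idx, lab in enumerate(labels):
        if lab not in best or X[idx, cost_col] < X[best[lab], cost_col]:
            best[lab] = idx
    return sorted(best.values())

print("initial best set:", best_per_cluster(suppliers, km.labels_))

# Later sub-period: only a few suppliers changed; update incrementally
# with partial_fit instead of re-clustering the whole environment.
changed = rng.normal(size=(10, 3))
km.partial_fit(changed)
print("labels of changed suppliers:", km.predict(changed))
```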
1919 A Framework for Product Development Process including HW and SW Components
Authors: Namchul Do, Gyeongseok Chae
Abstract:
This paper proposes a framework for product development that includes hardware and software components. It provides separation of hardware-dependent software, modifications of the current product development process, and integration of software modules with existing product configuration models and assembly product structures. In order to identify the dependent software, the framework considers product configuration modules and engineering changes of associated software and hardware components. To support efficient integration of the two different hardware and software development processes, a modified product development process is proposed. The process integrates the dependent software development into product development through the interchange of specific product information. By using existing product data models in Product Data Management (PDM), the framework represents software as modules for product configurations and as software parts in the product structure. The framework is applied to the development of a robot system in order to show its effectiveness.
Keywords: HW and SW Development Integration, Product Development with Software.
1918 Optimizing Usage of ICTs and Outsourcing Strategy in Business Models and Customer Satisfaction
Authors: Saeed Rahmani Bagha, Mohammad Mirzahosseinian, Sonatkhatoon Kashanimotlagh
Abstract:
Nowadays, developing countries strive for progress in science and technology and seek to narrow the technological gap with developed countries by increasing their capacities and through technology transfer from developed countries. To remain competitive, industry is continually searching for new methods to evolve its products. The business model is one of the latest buzzwords in the Internet and electronic business world. To be successful, organizations must look into the needs and wants of their customers. This research attempts to identify a specific feature of a company with a strong competitive advantage by analyzing the causes of customer satisfaction. Due to the rapid development of knowledge and information technology, business environments have become much more complicated. Information technology can help a firm aiming to gain a competitive advantage. This study explores the role and effect of Information and Communication Technology in business models and customer satisfaction of firms, as well as the relationship between ICTs and outsourcing strategy.
Keywords: Information Communication Technology, Outsourcing, Customer Satisfaction, Business Plan
1917 An Empirical Model of Correlated Traffics in LTE-Advanced System through an Innovative Simulation Tool
Authors: Ghassan A. Abed, Mahamod Ismail, Samir I. Badrawi, Bayan M. Sabbar
Abstract:
Long Term Evolution Advanced (LTE-Advanced) is not new as a radio access technology; it is an evolution of LTE that enhances its performance. This generation is the continuation of 3GPP LTE (3GPP: 3rd Generation Partnership Project) and is targeted at advancing the requirements of LTE in terms of throughput and coverage. The performance evaluation process of any network should be based on many models and simulations to investigate the network layers and functions and to monitor the deployment of new technologies, especially when the network includes large-bandwidth, low-latency links such as those of LTE and LTE-Advanced networks. It is therefore necessary to enhance the proposed models of high-speed, highly congested links so that these links and their traffic can handle the huge volumes of data transferred over them. This article offers an innovative model of the most correlated links of an LTE-Advanced system using Network Simulator 2 (NS-2), with an investigation of the link parameters.
Keywords: 3GPP, LTE, LTE-Advanced, NS-2.
1916 Metrology-Inspired Methods to Assess the Biases of Artificial Intelligence Systems
Authors: Belkacem Laimouche
Abstract:
With the field of Artificial Intelligence (AI) experiencing exponential growth, fueled by technological advancements that pave the way for increasingly innovative and promising applications, there is an escalating need to develop rigorous methods for assessing their performance in pursuit of transparency and equity. This article proposes a metrology-inspired statistical framework for evaluating bias and explainability in AI systems. Drawing from the principles of metrology, we propose a pioneering approach, using a concrete example, to evaluate the accuracy and precision of AI models, as well as to quantify the sources of measurement uncertainty that can lead to bias in their predictions. Furthermore, we explore a statistical approach for evaluating the explainability of AI systems based on their ability to provide interpretable and transparent explanations of their predictions.
Keywords: Artificial intelligence, metrology, measurement uncertainty, prediction error, bias, machine learning algorithms, probabilistic models, inter-laboratory comparison, data analysis, data reliability, bias impact assessment, bias measurement.
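The abstract's concrete example is not reproduced here; the short sketch below only illustrates the metrological notions it borrows (bias as systematic error, precision as dispersion, a simple uncertainty budget on predictions) using hypothetical predictions for two groups.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical regression predictions for two groups; group 1 carries a systematic offset.
y_true = rng.normal(50.0, 10.0, size=400)
group = rng.integers(0, 2, size=400)
y_pred = y_true + rng.normal(0.0, 3.0, size=400) + 2.5 * group

for g in (0, 1):
    err = y_pred[group == g] - y_true[group == g]
    bias = err.mean()                           # systematic error (trueness)
    precision = err.std(ddof=1)                 # dispersion of the prediction error
    u_bias = precision / np.sqrt(len(err))      # standard uncertainty of the bias estimate
    # Combined standard uncertainty, in the spirit of a metrology uncertainty budget.
    u_combined = np.sqrt(u_bias**2 + precision**2)
    print(f"group {g}: bias={bias:.2f}, precision={precision:.2f}, "
          f"u(bias)={u_bias:.2f}, combined u={u_combined:.2f}")
```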
1915 Modeling Football Penalty Shootouts: How Improving Individual Performance Affects Team Performance and the Fairness of the ABAB Sequence
Authors: Pablo Enrique Sartor Del Giudice
Abstract:
Penalty shootouts often decide the outcome of important soccer matches. Although usually referred to as "lotteries", there is evidence that some national teams and clubs consistently perform better than others. The outcomes are therefore not explained by mere luck, and there are ways to improve the average performance of players, naturally at the expense of some sort of effort. In this article we study the payoff of player performance improvements in terms of the performance of the team as a whole. To do so, we develop an analytical model with static individual performances, as well as Monte Carlo models that take into account the known influence of partial score and round number on individual performances. We find that, within a range of usual values, team performance improves over 70% faster than individual performances do. Using these models, we also estimate that the new ABBA penalty shootout ordering under test removes almost all of the known bias in favor of the first-shooting team under the current ABAB system.
Keywords: Football, penalty shootouts, Monte Carlo simulation, ABBA.
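A Monte Carlo model of the kind described can be sketched in a few lines. The version below is a simplified illustration, not the authors' model: scoring probabilities are static apart from a single hypothetical "trailing team" pressure penalty, and sudden death simply continues the kick pattern.

```python
import itertools
import random

P_BASE = 0.76      # hypothetical baseline scoring probability per kick
PRESSURE = 0.08    # hypothetical drop in probability when the kicker's team trails
N_SIM = 100_000

def shootout(pattern):
    """One shootout under a repeating kick pattern ('AB' or 'ABBA'):
    five kicks per team, then sudden death until the score differs."""
    score = {"A": 0, "B": 0}
    kicks = {"A": 0, "B": 0}
    for team in itertools.cycle(pattern):
        other = "B" if team == "A" else "A"
        p = P_BASE - PRESSURE if score[team] < score[other] else P_BASE
        score[team] += random.random() < p
        kicks[team] += 1
        # Decide once both teams have taken the same number of kicks (at least five each).
        if kicks["A"] == kicks["B"] >= 5 and score["A"] != score["B"]:
            return "A" if score["A"] > score["B"] else "B"

for name, pattern in [("ABAB", "AB"), ("ABBA", "ABBA")]:
    wins = sum(shootout(pattern) == "A" for _ in range(N_SIM))
    print(f"{name}: first-kicking team wins {wins / N_SIM:.3f} of shootouts")
```

Under these assumed parameters the first-kicking team's advantage shrinks markedly when moving from ABAB to ABBA, which is the qualitative effect the abstract reports.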
1914 WEMax: Virtual Manned Assembly Line Generation
Authors: Won Kyung Ham, Kang Hoon Cho, Yongho Chung, Sang C. Park
Abstract:
Presented in this paper is the framework of a software tool, WEMax, developed for the analysis and simulation of manned assembly lines in order to sustain and improve the performance of manufacturing systems. In a manufacturing system, performance, such as productivity, is a key factor in the competitiveness of the output products. However, the performance of manned assembly lines is difficult to forecast, because human labor is not a factor that computer simulation models or mathematical models can predict well. Existing approaches to performance forecasting of manned assembly lines are limited either to matters of the human worker, such as ergonomic and workload design, or to simulations that do not consider human factors. Consequently, an approach for forecasting and improving manned assembly line performance needs to be researched. As a solution to this problem, this study proposes a framework for the generation and simulation of virtual manned assembly lines, and the framework has been implemented as software.
Keywords: Performance Forecasting, Simulation, Virtual Manned Assembly Line.
1913 Massively-Parallel Bit-Serial Neural Networks for Fast Epilepsy Diagnosis: A Feasibility Study
Authors: Si Mon Kueh, Tom J. Kazmierski
Abstract:
About 1% of the world's population suffers from the hidden disability known as epilepsy, and major developing countries are not fully equipped to counter this problem. In order to reduce the inconvenience and danger of epilepsy, different methods have been researched that use artificial neural network (ANN) classification to distinguish epileptic waveforms from normal brain waveforms. This paper outlines an approach to achieving massive ANN parallelization through dedicated hardware using bit-serial processing. The design of a bit-serial Neural Processing Element (NPE) is presented which implements the functionality of a complete neuron with variable accuracy. The proposed design has been tested taking into consideration the non-idealities of a hardware ANN. The NPE consists of a bit-serial multiplier, which uses only 16 logic elements on an Altera Cyclone IV FPGA, a bit-serial ALU, and a look-up table. Arrays of NPEs can be driven by a single controller which executes the neural processing algorithm. In conclusion, the proposed compact NPE design allows the construction of complex hardware ANNs that can be implemented in portable equipment suited to the needs of a single epileptic patient in his or her daily activities, to predict the occurrence of impending tonic-clonic seizures.
Keywords: Artificial Neural Networks, bit-serial neural processor, FPGA, Neural Processing Element.
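As a behavioural illustration of the bit-serial principle behind the NPE's multiplier (one adder reused over as many clock cycles as there are operand bits), here is a minimal software sketch; it is not the authors' FPGA design, and the 8-bit width is an assumption.

```python
def bit_serial_multiply(a: int, b: int, width: int = 8) -> int:
    """Shift-and-add multiplication consuming one bit of `b` per 'clock cycle',
    the principle behind bit-serial multipliers: a single adder reused `width` times."""
    assert 0 <= a < (1 << width) and 0 <= b < (1 << width)
    acc = 0
    for cycle in range(width):
        lsb = (b >> cycle) & 1          # serial input: one bit of b per cycle
        if lsb:
            acc += a << cycle           # add the correspondingly shifted multiplicand
    return acc

# Spot-check against Python's native multiplication for 8-bit operands.
assert all(bit_serial_multiply(x, y) == x * y
           for x in range(0, 256, 17) for y in range(0, 256, 13))
print(bit_serial_multiply(23, 11))      # -> 253
```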
1912 A Method to Improve Test Process in Federal Enterprise Architecture Framework Using ISTQB Framework
Authors: Hamideh Mahdavifar, Ramin Nassiri, Alireza Bagheri
Abstract:
Enterprise Architecture (EA) is a framework for the description, coordination and alignment of all activities across an organization in order to achieve strategic goals using ICT enablers. A number of EA-compatible frameworks have been developed. In this paper we mainly focus on the Federal Enterprise Architecture Framework (FEAF), since its reference models are plentiful. Among these models we are interested here in its Business Reference Model (BRM). The test process is one important subject of an EA project that is somewhat overlooked. This lack of attention may cause drawbacks or even the failure of an enterprise architecture project. To address this issue, we use the International Software Testing Qualifications Board (ISTQB) framework and standard test suites to present a method to improve the EA testing process. The main challenge is how to map between the concepts of EA and ISTQB. In this paper, we propose a method for integrating these concepts.
Keywords: Business Reference Model (BRM), Federal Enterprise Architecture (FEA), ISTQB, Test Techniques.
1911 Incorporating Lexical-Semantic Knowledge into Convolutional Neural Network Framework for Pediatric Disease Diagnosis
Authors: Xiaocong Liu, Huazhen Wang, Ting He, Xiaozheng Li, Weihan Zhang, Jian Chen
Abstract:
The utilization of electronic medical record (EMR) data to establish disease diagnosis models has become an important research topic in biomedical informatics. Deep learning can automatically extract features from massive data, which has brought about breakthroughs in the study of EMR data. The challenge is that deep learning lacks semantic knowledge, which limits its practicability in medical science. This research proposes a method of incorporating lexical-semantic knowledge from abundant entities into a convolutional neural network (CNN) framework for pediatric disease diagnosis. Firstly, medical terms are vectorized into Lexical Semantic Vectors (LSV), which are concatenated with the embedded word vectors of word2vec to enrich the feature representation. Secondly, the semantic distribution of medical terms serves as a Semantic Decision Guide (SDG) for the optimization of deep learning models. The study evaluates the performance of the LSV-SDG-CNN model on four Chinese EMR datasets. Additionally, CNN, LSV-CNN, and SDG-CNN are designed as baseline models for comparison. The experimental results show that the LSV-SDG-CNN model outperforms the baseline models on all four Chinese EMR datasets. The best configuration of the model yielded an F1 score of 86.20%. The results clearly demonstrate that the CNN has been effectively guided and optimized by lexical-semantic knowledge, and that the LSV-SDG-CNN model improves disease classification accuracy by a clear margin.
Keywords: lexical semantics, feature representation, semantic decision, convolutional neural network, electronic medical record
1910 Piezoelectric Transducer Modeling with System Identification (SI) Method
Authors: Nora Taghavi, Ali Sadr
Abstract:
System identification is the process of creating models of a dynamic process from input-output signals. The aim of system identification can be stated as "to find a model with adjustable parameters and then to adjust them so that the predicted output matches the measured output". This paper presents a method of modeling and simulation with system identification to achieve the maximum fitness of the transformation function. First, using an optimized KLM equivalent circuit for a PVDF piezoelectric transducer and assuming different inputs, including a sinusoid, a step and a sum of sinusoids, the outputs are obtained; then, using the System Identification Toolbox in MATLAB, the transformation function is estimated from these inputs and outputs. The fitness of the transformation functions obtained with the ARX, OE (Output-Error) and BJ (Box-Jenkins) models in the System Identification Toolbox is then compared with that of the primary transformation function derived from the KLM equivalent circuit.
Keywords: PVDF modeling, ARX, BJ (Box-Jenkins), OE (Output-Error), System Identification.
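The abstract relies on MATLAB's System Identification Toolbox; as a toolbox-independent illustration, an ARX model can also be estimated directly by least squares. The sketch below fits ARX(2,1) to a synthetic second-order system rather than to the PVDF/KLM circuit data, which are not available here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "true" system: y[k] = 1.5 y[k-1] - 0.7 y[k-2] + 0.5 u[k-1] + noise.
N = 2000
u = rng.normal(size=N)                 # excitation (a sum of sinusoids would also work)
y = np.zeros(N)
for k in range(2, N):
    y[k] = 1.5 * y[k-1] - 0.7 * y[k-2] + 0.5 * u[k-1] + 0.01 * rng.normal()

# ARX(2,1) estimation: stack the regressors and solve in the least-squares sense.
Phi = np.column_stack([y[1:N-1], y[0:N-2], u[1:N-1]])   # [y[k-1], y[k-2], u[k-1]]
theta, *_ = np.linalg.lstsq(Phi, y[2:N], rcond=None)
a1, a2, b1 = theta
print(f"estimated: a1={a1:.3f}, a2={a2:.3f}, b1={b1:.3f}   (true: 1.5, -0.7, 0.5)")

# Fitness metric in the spirit of MATLAB's 'fit' percentage (normalized RMSE).
y_hat = Phi @ theta
fit = 100 * (1 - np.linalg.norm(y[2:N] - y_hat) / np.linalg.norm(y[2:N] - y[2:N].mean()))
print(f"fit = {fit:.1f}%")
```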
1909 Cirrhosis Mortality Prediction as Classification Using Frequent Subgraph Mining
Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride
Abstract:
In this work, we use machine learning and data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models are applied to predict one-year mortality. A comprehensive feature space including demographic information, comorbidities, clinical procedures and laboratory tests is analyzed. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) is used. The Model for End-stage Liver Disease (MELD) prediction of mortality is used as a comparator. All of our models statistically significantly outperform the MELD-score model and show an average 10% improvement in the area under the curve (AUC). The FSM technique by itself does not improve the model significantly, but FSM together with a machine learning technique called ensembling further improves model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. However, due to the sparsity of the temporal information needed by FSM, the FSM model does not yield significant improvements. Our work applies modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients and builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning
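The patient data are of course not available, so the sketch below only mirrors the reported comparison on synthetic data: a tree ensemble trained on a multi-feature space versus a single-score baseline, both evaluated by AUC. All features and the "MELD-like" score are hypothetical stand-ins.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Synthetic stand-in for the EHR feature space (labs, comorbidities, procedures).
n = 2322
X = rng.normal(size=(n, 20))
risk = X[:, 0] + 0.5 * X[:, 1] - 0.8 * X[:, 2] + 0.3 * X[:, 3] * X[:, 4]
y = (risk + rng.normal(scale=1.0, size=n) > 1.0).astype(int)   # one-year mortality label
meld_like = X[:, 0] + rng.normal(scale=0.8, size=n)            # crude single-score baseline

X_tr, X_te, y_tr, y_te, m_tr, m_te = train_test_split(
    X, y, meld_like, test_size=0.3, random_state=0, stratify=y)

baseline = LogisticRegression().fit(m_tr.reshape(-1, 1), y_tr)
ensemble = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

auc_base = roc_auc_score(y_te, baseline.predict_proba(m_te.reshape(-1, 1))[:, 1])
auc_ens = roc_auc_score(y_te, ensemble.predict_proba(X_te)[:, 1])
print(f"baseline score AUC = {auc_base:.3f}, ensemble AUC = {auc_ens:.3f}")
```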
1908 Assessment of the Influence of External Earth Terrain at Construction of the Physical-Mathematical Models for Finding the Dynamics of Pollutants' Distribution in Urban Atmosphere
Authors: Stanislav Aryeh V. Fradkin, Sharif E. Guseynov
Abstract:
The transport environment in cities around the world presents a complex situation. For the analysis and prevention of environmental problems, an accurate calculation of the concentrations of hazardous substances at each point of the investigated area is required. In the turbulent atmosphere of a city, the well-known methods of mathematical statistics cannot be applied to these tasks with a satisfactory level of accuracy. Therefore, the apparatus of mathematical physics is more appropriate for solving this class of problems. In such models, because of the difficulty involved, the influence of an uneven land surface on the streams of air masses in the turbulent urban atmosphere is, as a rule, not taken into account. In this paper the influence of the surface roughness, which can be quite large, is shown mathematically. The analysis of this problem identified the possibility, under certain conditions, of areas appearing in the atmosphere with pressure tending to infinity, i.e. the so-called "wall effect".
Keywords: Air pollution, concentration of harmful substances, physical-mathematical model, urban area.
1907 Annotations of Gene Pathways Images in Biomedical Publications Using Siamese Network
Authors: Micheal Olaolu Arowolo, Muhammad Azam, Fei He, Mihail Popescu, Dong Xu
Abstract:
As the quantity of biological articles rises, so does the number of biological pathway figures. Each pathway figure shows gene names and relationships. Manually annotating pathway diagrams is time-consuming. Advanced image understanding models could speed up curation, but they must be more precise. Biological pathway figures contain rich information, and the first step in image understanding of these figures is to recognize gene names automatically. Classical optical character recognition methods have been employed for gene name recognition, but they are not optimized for literature mining data. This study devised a method to recognize the image bounding box of a gene name as a photo using deep Siamese neural network models to outperform the existing methods, using ResNet, DenseNet and Inception architectures; the results obtained show about 84% accuracy.
Keywords: Biological pathway, gene identification, object detection, Siamese network, ResNet.
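A minimal Siamese set-up of the kind named in the abstract can be sketched as twin embedding networks trained with a contrastive loss, assuming PyTorch; the tiny backbone, input sizes and the random "gene-name crop" tensors below are placeholders, not the paper's architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmbeddingNet(nn.Module):
    """Tiny CNN mapping an image crop to an L2-normalized embedding (placeholder backbone)."""
    def __init__(self, dim=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
        )
        self.fc = nn.Linear(32 * 4 * 4, dim)

    def forward(self, x):
        return F.normalize(self.fc(self.features(x).flatten(1)), dim=1)

def contrastive_loss(z1, z2, same, margin=1.0):
    """same=1 for pairs showing the same gene name, 0 otherwise."""
    d = F.pairwise_distance(z1, z2)
    return (same * d.pow(2) + (1 - same) * F.relu(margin - d).pow(2)).mean()

# One training step on a dummy batch of 32x32 grayscale crops (hypothetical shapes).
net = EmbeddingNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x1, x2 = torch.randn(8, 1, 32, 32), torch.randn(8, 1, 32, 32)
same = torch.randint(0, 2, (8,)).float()
loss = contrastive_loss(net(x1), net(x2), same)
opt.zero_grad(); loss.backward(); opt.step()
print("contrastive loss:", float(loss))
```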
1906 Gravitational Frequency Shifts for Photons and Particles
Authors: Jing-Gang Xie
Abstract:
This research considers the integration of Quantum Field Theory and the General Theory of Relativity. Although both are successful models for explaining the behavior of particles, they are incompatible, since they work at different masses and scales of energy, as evidenced by the description of black holes and the formation of the universe. This is despite previous efforts to merge the two theories, including the likes of String Theory, Quantum Gravity models, and others. In a bid to arrive at an actionable experiment, the paper's approach starts from derivations within the existing theories. It goes on to test the derivations by applying the same initial assumptions, coupled with several deviations. The resulting equations give results similar to those of the classical Newtonian model, quantum mechanics, and general relativity as long as conditions are normal. However, the outcomes differ when conditions are extreme; specifically, there are no breakdowns even below the Schwarzschild radius or at the Planck length. Even so, this demonstrates the possibility of integrating the two theories.
Keywords: General relativity theory, particles, photons, quantum gravity model, gravitational frequency shift.
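The abstract does not reproduce its derivations; for reference, the standard general-relativistic frequency shift for a static (Schwarzschild) field, which any such model is expected to recover under normal conditions, reads:

```latex
\frac{\nu_{\mathrm{obs}}}{\nu_{\mathrm{emit}}}
  = \sqrt{\frac{1 - 2GM/(r_{\mathrm{emit}}\,c^{2})}{1 - 2GM/(r_{\mathrm{obs}}\,c^{2})}}
  \;\approx\; 1 - \frac{g\,\Delta h}{c^{2}} \quad \text{(weak field)},
```

where G is the gravitational constant, M the central mass, r_emit and r_obs the emission and observation radii, and the weak-field limit is the familiar laboratory expression for a height difference Δh in local gravity g.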
1905 Spread Spectrum Image Watermarking for Secured Multimedia Data Communication
Authors: Tirtha S. Das, Ayan K. Sau, Subir K. Sarkar
Abstract:
Digital watermarking is a way to provide secure multimedia data communication, in addition to its copyright protection role. The spread spectrum modulation principle is widely used in digital watermarking to make multimedia signals robust against various signal-processing operations. Several SS watermarking algorithms have been proposed for multimedia signals, but very few works have discussed the issues responsible for secure data communication and the improvement of its robustness. The current paper critically analyzes a few such factors, namely the properties of spreading codes, proper signal decomposition suitable for data embedding, the security provided by the key, the successive bit cancellation method applied at the decoder, which has a strong impact on detection reliability, and the secure communication of a significant signal under the camouflage of insignificant signals. Based on the analysis, a robust SS watermarking scheme for secure data communication is proposed in the wavelet domain, and improvements in secure communication and robustness performance are reported through experimental results. The reported results also show improvement in the visual and statistical invisibility of the hidden data.
Keywords: Spread spectrum modulation, spreading code, signal decomposition, security, successive bit cancellation.
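A minimal additive spread-spectrum embedding in the wavelet domain can be sketched as follows, assuming the PyWavelets package; the proposed scheme's key handling, decomposition choices and successive bit cancellation are not reproduced, and the host image, key and strength values are placeholders.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)

image = rng.integers(0, 256, size=(128, 128)).astype(float)   # placeholder host image
key = 1234                                                     # secret key seeds the PN code
alpha = 2.0                                                    # embedding strength

# 2-level DWT; embed a pseudo-noise sequence into the level-2 diagonal detail band.
cA2, (cH2, cV2, cD2), details1 = pywt.wavedec2(image, "haar", level=2)
pn = np.random.default_rng(key).choice([-1.0, 1.0], size=cD2.shape)

bit = 1                                                        # one watermark bit: +1 / -1
cD2_marked = cD2 + alpha * bit * pn
watermarked = pywt.waverec2([cA2, (cH2, cV2, cD2_marked), details1], "haar")

# Blind detection: correlate the keyed PN sequence with the received detail band.
rec = pywt.wavedec2(watermarked, "haar", level=2)
corr = np.sum(rec[1][2] * pn) / pn.size
print("detected bit:", 1 if corr > 0 else -1, f"(correlation {corr:.3f})")
```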
1904 Context for Simplicity: A Basis for Context-aware Systems Based on the 3GPP Generic User Profile
Authors: Enrico Rukzio, George N. Prezerakos, Giovanni Cortese, Eleftherios Koutsoloukas, Sofia Kapellaki
Abstract:
The paper focuses on the area of context modeling with respect to the specification of context-aware systems supporting ubiquitous applications. The proposed approach, followed within the SIMPLICITY IST project, uses a high-level system ontology to derive context models for system components, which are consequently mapped to the system's physical entities. For the definition of user- and device-related context models in particular, the paper suggests a standards-based process consisting of an analysis phase using the Common Information Model (CIM) methodology, followed by an implementation phase that defines 3GPP-based components. The benefits of this approach are further illustrated by preliminary examples of XML grammars defining profiles and components, and of component instances, coupled with descriptions of the respective ubiquitous applications.
Keywords: 3GPP, context, context-awareness, context model, information model, user model, XML
1903 3D CAD Models and its Feature Similarity
Authors: Elmi Abu Bakar, Tetsuo Miyake, Zhong Zhang, Takashi Imamura
Abstract:
Knowing the geometrical pose of product objects in a manufacturing line before robot manipulation is required, and it is less time-consuming than overall shape measurement. In order to perform this, information about shape representation and the matching of objects is required. Objects are compared through their descriptors, which are conceptually subtracted from each other to form a scalar metric. When the metric value is smaller, the objects are considered closer to each other. Rotating an object from its static pose in some direction changes the value of the scalar metric computed from boundary information after feature extraction of the related object. In this paper, an indexing technique is proposed for the retrieval of 3D geometrical models based on the similarity between boundary shapes, in order to measure the 3D CAD object pose using object shape feature matching for a Computer Aided Testing (CAT) system in a production line. Experimental results show the effectiveness of the proposed method.
Keywords: CAD, rendering, feature extraction, feature classification.
1902 Statistical Modeling of Mobile Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes
Authors: Jihad S. Daba, J. P. Dubois
Abstract:
Understanding the statistics of non-isotropic scattering multipath channels that fade randomly with respect to time, frequency, and space in a mobile environment is crucial for the accurate detection of received signals in wireless and cellular communication systems. In this paper, we derive stochastic models for the probability density function (PDF) of the shift in the carrier frequency caused by the Doppler effect on the received illuminating signal in the presence of a dominant line of sight. Our derivation is based on a generalized Clarke's model and a two-wave partially developed scattering model, where the statistical distribution of the frequency shift is shown to be consistent with the power spectral density of the Doppler-shifted signal.
Keywords: Doppler shift, filtered Poisson process, generalized Clarke's model, non-isotropic scattering, partially developed scattering, Rician distribution.
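The derived non-isotropic, line-of-sight densities are not given in the abstract; for orientation, the classical isotropic Clarke model, to which such generalizations reduce, yields the well-known Doppler-shift density

```latex
p_{\nu}(\nu) \;=\; \frac{1}{\pi f_{d}\sqrt{1-\left(\nu/f_{d}\right)^{2}}},
\qquad |\nu| < f_{d},
```

where f_d = v f_c / c is the maximum Doppler shift for mobile speed v and carrier frequency f_c, obtained from ν = f_d cos θ with a uniformly distributed angle of arrival θ.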
1901 Scientific Workflow Interoperability Evaluation
Authors: Ahmed Alqaoud
Abstract:
There is a wide range of scientific workflow systems today, each one designed to resolve problems at a specific level. In large collaborative projects, it is often necessary to recognize the heterogeneous workflow systems already in use by various partners, and any potential collaboration between these systems requires workflow interoperability. The Publish/Subscribe Scientific Workflow Interoperability Framework (PS-SWIF) approach was proposed to achieve workflow interoperability among workflow systems. This paper evaluates the PS-SWIF approach and its system for achieving workflow interoperability using Web Services with asynchronous notification messages represented by the WS-Eventing standard. The experiment covers the different types of communication models provided by the Workflow Management Coalition (WfMC): chained processes, nested synchronous sub-processes, event synchronous sub-processes, and nested sub-processes (polling/deferred synchronous). The experiment also shows the flexibility and simplicity of the PS-SWIF approach when applied to a variety of workflow systems (Triana, Taverna, Kepler) in local and remote environments.
Keywords: Publish/subscribe, scientific workflow, web services, workflow interoperability.
1900 Monetary Evaluation of Dispatching Decisions in Consideration of Mode Choice Models
Authors: Marcel Schneider, Nils Nießen
Abstract:
Microscopic simulation tool kits allow for consideration of the two processes of railway operations and the preceding timetable production. Block occupation conflicts on both process levels are often solved by using defined train priorities. These conflict resolutions (dispatching decisions) generate reactionary delays for the involved trains. The sum of reactionary delays is commonly used to evaluate the quality of railway operations, which describes the timetable robustness. It is either compared to an acceptable level of train performance, or the delays are appraised economically by linear monetary functions. It is impossible to adequately evaluate dispatching decisions without a well-founded objective function. This paper presents a new approach for the evaluation of dispatching decisions. The approach uses mode choice models and considers the behaviour of the end-customers. These models evaluate the reactionary delays in more detail and consider other competing modes of transport. The new approach pursues the coupling of a microscopic model of railway operations with a macroscopic mode choice model. At first, it will be implemented for the railway operations process, but it can also be used for timetable production. The evaluation considers the possibility for the customer to switch to other transport modes. The new approach starts by looking at rail and road, but it can also be extended to air travel. The result of mode choice models is the modal split. The reactions of the end-customers have an impact on the revenue of the train operating companies. Different purposes of travel have different payment reserves and tolerances towards late running. Aside from changes to revenues, longer journey times can also generate additional costs. The costs are either time- or track-specific and arise from required changes to rolling stock or train crew cycles. Only the variable values are summarised in the contribution margin, which is the basis for the monetary evaluation of delays. The contribution margin is calculated for the different possible solutions to the same conflict. The conflict resolution is optimised until the monetary loss becomes minimal. The iterative process therefore determines an optimal conflict resolution by monitoring the change to the contribution margin. Furthermore, a monetary value of each dispatching decision can also be derived.
Keywords: Choice of mode, monetary evaluation, railway operations, reactionary delays.
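The coupling idea can be illustrated with a toy sketch: a binomial logit mode choice model turns reactionary delay into a rail/road modal split and hence revenue, and competing conflict resolutions are compared by contribution margin. All utility coefficients, passenger numbers and costs below are hypothetical.

```python
import numpy as np

def rail_share(delay_min, beta_time=-0.03, asc_rail=0.6):
    """Binomial logit modal split rail vs. road; rail utility falls with reactionary delay."""
    u_rail = asc_rail + beta_time * delay_min
    u_road = 0.0
    return np.exp(u_rail) / (np.exp(u_rail) + np.exp(u_road))

def contribution_margin(delays_min, passengers=400, fare=25.0, delay_cost_per_min=8.0):
    """Revenue retained after demand reaction minus time-dependent operating costs."""
    revenue = sum(passengers * rail_share(d) * fare for d in delays_min)
    extra_cost = delay_cost_per_min * sum(delays_min)
    return revenue - extra_cost

# Two resolutions of the same block occupation conflict: which train absorbs the delay?
resolution_a = [0.0, 12.0]    # train 2 takes all of the reactionary delay
resolution_b = [5.0, 4.0]     # delay is split between both trains
for name, delays in [("A", resolution_a), ("B", resolution_b)]:
    print(f"resolution {name}: contribution margin = {contribution_margin(delays):.0f}")
```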
1899 Validity Domains of Beams Behavioural Models: Efficiency and Reduction with Artificial Neural Networks
Authors: Keny Ordaz-Hernandez, Xavier Fischer, Fouad Bennis
Abstract:
In a particular case of behavioural model reduction by ANNs, a shortening of the validity domain has been found. In mechanics, as in other domains, the notion of validity domain allows the engineer to choose a valid model for a particular analysis or simulation. In the study of the mechanical behaviour of a cantilever beam (using linear and non-linear models), Multi-Layer Perceptron (MLP) Backpropagation (BP) networks have been applied as a model reduction technique. The reduced model is constructed to be more efficient than the non-reduced model. Within a less extended domain, the ANN reduced model estimates the non-linear response correctly, at a lower computational cost. It has been found that the neural network model is not able to approximate the linear behaviour, while it approximates the non-linear behaviour very well. The details of the case are provided with an example of modelling the cantilever beam behaviour.
Keywords: artificial neural network, validity domain, cantilever beam, non-linear behaviour, model reduction.
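As an illustration of an ANN used as a reduced model of a non-linear response, the sketch below trains a small MLP on a synthetic load-deflection relation and shows how accuracy degrades outside the training (validity) domain; it uses scikit-learn and is not the paper's beam model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Synthetic non-linear tip deflection vs. load (hypothetical stand-in for the beam model).
P = rng.uniform(0.0, 10.0, size=(2000, 1))                      # applied tip load
w = 0.8 * P[:, 0] - 0.02 * P[:, 0] ** 2 + 0.01 * rng.normal(size=2000)

P_tr, P_te, w_tr, w_te = train_test_split(P, w, test_size=0.25, random_state=0)

# MLP (backpropagation) used as the reduced model.
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
surrogate.fit(P_tr, w_tr)
print("R^2 on unseen loads inside the training domain:", round(surrogate.score(P_te, w_te), 3))

# Validity-domain check: predictions outside the training range are not trustworthy.
P_out = np.array([[12.0], [15.0]])
print("extrapolated predictions:", surrogate.predict(P_out).round(2))
```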
1898 Assessment of Landslide Volume for Alishan Highway Based On Database of Rainfall-Induced Slope Failure
Authors: Yun-Yao Chi, Ya-Fen Lee
Abstract:
In this paper, a study of slope failures along the Alishan Highway is carried out. An innovative empirical model is developed based on 15 years of records of rainfall-induced slope failures. The statistical models are intended for assessing the volume of landslides from slope failures along the Alishan Highway in the future. The rainfall data considered in the proposed models include the effective cumulative rainfall and the critical rainfall intensity. The effective cumulative rainfall is defined at the point where the curve of cumulative rainfall goes from steep to flat. Rainfall thresholds for landslides are then established for assessing landslide volume and for issuing warnings and/or closures of the Alishan Highway during future extreme rainfall events. Slope failures during Typhoon Saola in 2012 demonstrate that the new empirical model is effective and applicable to other cases with similar rainfall conditions.
Keywords: Slope failure, landslide, volume, model, rainfall thresholds.
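The paper's empirical model and its coefficients are not given in the abstract; the sketch below merely fits a generic power-law relation between landslide volume and the two rainfall variables named above, on hypothetical records.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical failure records: effective cumulative rainfall R (mm),
# critical rainfall intensity I (mm/h), observed landslide volume V (m^3).
R = rng.uniform(100, 800, size=60)
I = rng.uniform(5, 80, size=60)
V = 0.05 * R ** 1.4 * I ** 0.8 * np.exp(rng.normal(scale=0.3, size=60))

# Power-law model V = a * R^b * I^c, fitted as a linear regression in log space.
X = np.column_stack([np.ones_like(R), np.log(R), np.log(I)])
coef, *_ = np.linalg.lstsq(X, np.log(V), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"V ~ {a:.3f} * R^{b:.2f} * I^{c:.2f}")

# Example use: predicted volume for a forecast storm exceeding the thresholds.
print("predicted volume for R=600 mm, I=50 mm/h:", round(a * 600**b * 50**c), "m^3")
```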
1897 Bayesian Meta-Analysis to Account for Heterogeneity in Studies Relating Life Events to Disease
Authors: Elizabeth Stojanovski
Abstract:
Associations between life events and various forms of cancer have been identified. The purpose of a recent random-effects meta-analysis was to identify studies that examined the association between breast cancer risk and adverse events associated with changes to financial status, including decreased income. The same association was studied in four separate studies whose characteristics, such as study design, location, and time frame, were not consistent. It was of interest to pool information from the various studies to help identify the characteristics that differentiated the study results. Two random-effects Bayesian meta-analysis models are proposed to combine the reported estimates of the described studies. The proposed models allow major sources of variation to be taken into account, including study-level characteristics, between-study variance and within-study variance, and they illustrate the ease with which uncertainty can be incorporated using a hierarchical Bayesian modelling approach.
Keywords: Random-effects, meta-analysis, Bayesian, variation.
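A standard hierarchical formulation of the random-effects meta-analysis described above, stated in general form rather than with the paper's specific priors, is:

```latex
y_i \mid \theta_i \sim \mathcal{N}(\theta_i,\, s_i^{2}), \qquad
\theta_i \mid \mu, \tau \sim \mathcal{N}(\mu,\, \tau^{2}), \qquad i = 1,\dots,4,
```

where y_i and s_i^2 are the reported effect estimate and within-study variance of study i, θ_i is the study-specific true effect, μ the pooled effect and τ^2 the between-study variance; priors on μ and τ (and, optionally, a regression of θ_i on study-level covariates) complete the Bayesian specification.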
1896 A Non-Parametric Based Mapping Algorithm for Use in Audio Fingerprinting
Authors: Analise Borg, Paul Micallef
Abstract:
Over the past few years, online multimedia collections have grown at a fast pace. Several companies have shown interest in studying different ways to organise this amount of audio information without the need for human intervention to generate metadata. In the past few years, many applications have emerged on the market which are capable of identifying a piece of music in a short time. Different audio effects and degradations make it much harder to identify the unknown piece. In this paper, an audio fingerprinting system which makes use of a non-parametric based algorithm is presented. Parametric analysis is also performed using Gaussian Mixture Models (GMMs). The feature extraction methods employed are the Mel Spectrum Coefficients and the MPEG-7 basic descriptors. Bin numbers replaced the extracted feature coefficients during the non-parametric modelling. The results show that non-parametric analysis offers results comparable to those mentioned in the literature.
Keywords: Audio fingerprinting, mapping algorithm, Gaussian Mixture Models, MFCC, MPEG-7.
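The parametric branch mentioned above can be sketched as MFCC features modelled by a GMM, assuming the librosa and scikit-learn packages; the non-parametric bin mapping and the MPEG-7 descriptors are not reproduced, and the synthetic tone stands in for real audio.

```python
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

# Placeholder audio: a synthetic tone; replace with librosa.load("track.wav").
sr = 22050
y = 0.5 * np.sin(2 * np.pi * 440.0 * np.arange(sr * 3) / sr)

# Mel-frequency cepstral coefficients, one row per analysis frame.
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T

# Fit a GMM as a compact parametric fingerprint of the track.
gmm = GaussianMixture(n_components=4, covariance_type="diag", random_state=0)
gmm.fit(mfcc)

# Matching: score a query excerpt against the stored model (higher = better match).
query = mfcc[:100]
print("average log-likelihood of query:", round(float(gmm.score(query)), 2))
```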
1895 A Survey of Business Component Identification Methods and Related Techniques
Authors: Zhongjie Wang, Xiaofei Xu, Dechen Zhan
Abstract:
With the deep development of software reuse, component-related technologies have been widely applied in the development of large-scale complex applications. Component identification (CI) is one of the primary research problems in software reuse: domain business models are analyzed to obtain a set of business components with high reuse value and good reuse performance, in order to support effective reuse. Based on the concept and classification of CI, its technical stack is briefly discussed from four views, i.e., the form of the input business models, identification goals, identification strategies, and the identification process. Various CI methods presented in the literature are then classified into four types, i.e., domain analysis based methods, cohesion-coupling based clustering methods, CRUD matrix based methods, and other methods, with comparisons of their advantages and disadvantages. Additionally, some insufficiencies in the study of CI are discussed, and their causes are explained. Finally, the paper concludes with some promising trends in research on this problem.
Keywords: Business component, component granularity, component identification, reuse performance.
1894 A Cuckoo Search with Differential Evolution for Clustering Microarray Gene Expression Data
Authors: M. Pandi, K. Premalatha
Abstract:
DNA microarray technology is based on a collection of microscopic DNA spots attached to a solid surface. Scientists use DNA microarrays to measure the expression levels of large numbers of genes simultaneously or to genotype multiple regions of a genome. Elucidating the patterns hidden in gene expression data offers a tremendous opportunity for an enhanced understanding of functional genomics. However, the large number of genes and the complexity of biological networks greatly increase the challenges of comprehending and interpreting the resulting mass of data, which often consists of millions of measurements. This is handled by clustering, which reveals the natural structures and identifies interesting patterns in the underlying data. In this paper, gene-based clustering of gene expression data is proposed using Cuckoo Search with Differential Evolution (CS-DE). The experimental results are analyzed on gene expression benchmark datasets. The results show that CS-DE outperforms CS on the benchmark datasets. To validate the clustering results, this work is tested with one internal and one external cluster validation index.
Keywords: DNA, Microarray, genomics, Cuckoo Search, Differential Evolution, Gene expression data, Clustering.
1893 Promoting Biofuels in India: Assessing Land Use Shifts Using Econometric Acreage Response Models
Authors: Y. Bhatt, N. Ghosh, N. Tiwari
Abstract:
Acreage response functions are modeled taking account of expected harvest prices, weather-related variables and other non-price variables, allowing for the possibility of partial adjustment. At the outset, based on the literature on price expectation formation, we explored suitable formulations for estimating the farmers' expected prices. Assuming that farmers form expectations rationally, the prices of food and biofuel crops are modeled using time-series methods, testing for possible ARCH/GARCH effects to account for volatility. The prices projected on the basis of these models are then inserted as proxies for the expected prices in the acreage response functions. Food crop acreages in different growing states are found to be sensitive to their prices relative to those of one or more of the biofuel crops considered. The percentage improvement in food crop yields required to offset the acreage loss is worked out.
Keywords: Acreage response function, biofuel, food security, sustainable development.
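A generic Nerlovian partial-adjustment acreage response regression with an expected relative-price term can be sketched as below, estimated by OLS with statsmodels on synthetic data; it is not the paper's specification, and all coefficients are illustrative.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
T = 40                                            # years of (synthetic) data

rel_price = rng.normal(1.0, 0.15, size=T)         # expected food/biofuel crop price ratio
rainfall = rng.normal(800, 120, size=T)           # weather variable (mm)

# Generate acreage with partial adjustment: A_t = c + 0.5*A_{t-1} + 30*p_t + 0.02*rain + e.
A = np.zeros(T)
A[0] = 100.0
for t in range(1, T):
    A[t] = 20 + 0.5 * A[t-1] + 30 * rel_price[t] + 0.02 * rainfall[t] + rng.normal(scale=2.0)

# Nerlovian acreage response: regress A_t on its lag, the expected relative price, and weather.
y = A[1:]
X = sm.add_constant(np.column_stack([A[:-1], rel_price[1:], rainfall[1:]]))
fit = sm.OLS(y, X).fit()
print(fit.params.round(3))                        # [const, lagged acreage, price, rainfall]

# Long-run price response follows from the partial-adjustment structure.
beta_lag, beta_price = fit.params[1], fit.params[2]
print("long-run price response:", round(beta_price / (1 - beta_lag), 2))
```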
1892 The Effects of System Change on Buildings Equipped with Structural Systems with the Sandwich Composite Wall with J-Hook Connectors and Reinforced Concrete Shear Walls
Authors: Majid Saaly, Shahriar Tavousi Tafreshi, Mehdi Nazari Afshar
Abstract:
Sandwich composite walls (SCSSC) have more ductility and energy dissipation capacity than conventional reinforced concrete shear walls. SCSSCs have acceptable compressive, shear, in-plane bending, and out-of-plane bending capacities. The use of sandwich composite walls with J-hook connectors has a significant effect on energy dissipation and on the reduction of the dynamic responses of mid-rise and high-rise structural models. In this paper, incremental dynamic analyses of 10- and 15-story steel structures were performed under seven far-fault records using OpenSees. The demand values of the 10- and 15-story models are reduced by up to 32% and 45%, respectively, when the structural system changes from shear walls (SW) to SCSSC.
Keywords: Sandwich composite wall, SCSSC, fling step, fragility curve, IDA, inter story drift ratio.