Search results for: Process Models.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7381


6961 The Impact of Process Parameters on the Output Characteristics of an LDMOS Device

Authors: M. A. Malakoutian, V. Fathipour, M. Fathipour, A. Mojab, M. M. Allame, M. Moradinasab

Abstract:

In this paper, we have examined the effect of process parameter variation on the electrical characteristics of an LDMOS device. The rate of change in electrical parameters such as the cut-off frequency, breakdown voltage and drain saturation current as a function of the process parameters is investigated.

Keywords: LDMOS, Process Parameters, characteristics, parameter variation.

6960 Operating System Based Virtualization Models in Cloud Computing

Authors: Dev Ras Pandey, Bharat Mishra, S. K. Tripathi

Abstract:

Cloud computing is poised to transform the structure of businesses and learning by supplying real-time applications and providing immediate help for small to medium sized businesses. The ability to run a hypervisor inside a virtual machine is an important feature of virtualization and is called nested virtualization. In today's growing field of information technology, many virtualization models are available and provide a convenient approach to implementation, but selecting a single model is difficult. This paper explains the applications of operating system based virtualization in cloud computing and identifies a suitable model given the models' different specifications and users' requirements. The most popular models are selected, with the selection based on container and hypervisor based virtualization. The selected models are compared against a wide range of user requirements, such as the number of CPUs, memory size, nested virtualization support, live migration and commercial support, and the most suitable virtualization model is identified.

Keywords: Virtualization, OS based virtualization, container and hypervisor based virtualization.

6959 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods are efficient for solving these models because of non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, we solve the American option pricing problem under jump diffusion models by using efficient time-dependent numerical methods. Several techniques are integrated to overcome the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction decomposition technique is applied to rational approximation schemes to overcome the complexity of inverting polynomials of matrices. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to demonstrate the accuracy and efficiency of the proposed method.
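
The complexity reduction mentioned above relies on the fact that the dense matrix arising from the jump (integral) term has Toeplitz structure, so it can be multiplied by a vector with the FFT. The sketch below is a minimal, self-contained illustration of that generic trick; the kernel values and the cross-check are hypothetical and do not reproduce the authors' discretization.

```python
import numpy as np
from scipy.linalg import toeplitz

def toeplitz_matvec_fft(c, r, x):
    """Multiply the Toeplitz matrix with first column c and first row r by x
    in O(M log M) by embedding it in a 2M x 2M circulant matrix and using the FFT."""
    M = len(x)
    col = np.concatenate([c, [0.0], r[:0:-1]])   # first column of the circulant embedding
    x_pad = np.concatenate([x, np.zeros(M)])
    y = np.fft.ifft(np.fft.fft(col) * np.fft.fft(x_pad))
    return y[:M].real

# Cross-check against a dense O(M^2) product on a small, hypothetical kernel
c = np.array([4.0, 1.0, 0.3, 0.1])
r = np.array([4.0, 0.8, 0.2, 0.05])
x = np.random.rand(4)
assert np.allclose(toeplitz(c, r) @ x, toeplitz_matvec_fft(c, r, x))
```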

Keywords: Integral differential equations, American options, jump–diffusion model, rational approximation.

6958 Fixed Points of Contractive-Like Operators by a Faster Iterative Process

Authors: Safeer Hussain Khan

Abstract:

In this paper, we prove a strong convergence result using a recently introduced iterative process with contractive-like operators. This improves and generalizes corresponding results in the literature in two ways: the iterative process is faster and the operators are more general. At the end, we indicate that the results can also be proved with the iterative process with error terms.

Keywords: Contractive-like operator, iterative process, fixed point, strong convergence.

6957 Big Bang – Big Crunch Learning Method for Fuzzy Cognitive Maps

Authors: Engin Yesil, Leon Urbas

Abstract:

Modeling complex dynamic systems, for which it is very complicated to establish mathematical models, requires new and modern methodologies that exploit existing expert knowledge, human experience and historical data. Fuzzy cognitive maps are very suitable, simple and powerful tools for the simulation and analysis of these kinds of dynamic systems. However, human experts are subjective and can handle only relatively simple fuzzy cognitive maps; therefore, there is a need to develop new approaches for the automated generation of fuzzy cognitive maps from historical data. In this study, a new learning algorithm called Big Bang-Big Crunch is proposed, for the first time in the literature, for the automated generation of fuzzy cognitive maps from data. Two real-world examples, namely a process control system and a radiation therapy process, and one synthetic model are used to demonstrate the effectiveness and usefulness of the proposed methodology.
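
For readers unfamiliar with the Big Bang-Big Crunch heuristic, the sketch below shows its generic minimization loop (a random "big bang" scatter followed by a fitness-weighted "big crunch" to a centre of mass). The cost function, bounds and all settings are placeholders; the paper's actual FCM weight-learning objective is not reproduced here.

```python
import numpy as np

def big_bang_big_crunch(cost, dim, bounds, pop=50, iters=100, alpha=1.0, seed=0):
    """Generic Big Bang-Big Crunch optimizer for minimizing `cost` over a box."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    # Initial Big Bang: candidates scattered uniformly over the search space
    X = rng.uniform(lo, hi, size=(pop, dim))
    for k in range(1, iters + 1):
        f = np.array([cost(x) for x in X])
        # Big Crunch: centre of mass weighted by inverse cost (better points weigh more)
        w = 1.0 / (f + 1e-12)
        center = (w[:, None] * X).sum(axis=0) / w.sum()
        # Next Big Bang: normal scatter around the centre, shrinking with iteration k
        X = np.clip(center + rng.standard_normal((pop, dim)) * alpha * (hi - lo) / k, lo, hi)
    return center

# Toy usage: recover FCM-like weights in [-1, 1] that minimize a placeholder error
target = np.array([0.5, -0.3, 0.8])
best = big_bang_big_crunch(lambda w: np.sum((w - target) ** 2), dim=3, bounds=(-1.0, 1.0))
```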

Keywords: Big Bang-Big Crunch optimization, Dynamic Systems, Fuzzy Cognitive Maps, Learning.

6956 Analysis of Textual Data Based On Multiple 2-Class Classification Models

Authors: Shigeaki Sakurai, Ryohei Orihara

Abstract:

This paper proposes a new method for analyzing textual data. The method deals with items of textual data, where each item is described from various viewpoints. The method acquires 2-class classification models of the viewpoints by applying an inductive learning method to items with multiple viewpoints. Using the models, the method infers whether or not the viewpoints are assigned to new items. It extracts expressions from the new items classified into the viewpoints and extracts characteristic expressions corresponding to the viewpoints by comparing the frequency of expressions among the viewpoints. This paper also applies the method to questionnaire data given by guests at a hotel and verifies its effect through numerical experiments.
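
As a rough illustration of the pipeline described above (one 2-class model per viewpoint, followed by a frequency comparison of expressions), here is a small sklearn-based sketch. The items, viewpoint labels and the choice of naive Bayes are hypothetical and only stand in for the paper's unspecified inductive learner.

```python
from collections import Counter
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Hypothetical training items, each labelled with the viewpoints it expresses
items = ["room was clean and quiet", "staff were friendly", "breakfast was cold",
         "friendly front desk staff", "quiet room but slow check-in"]
labels = {"room":    [1, 0, 0, 0, 1],
          "service": [0, 1, 0, 1, 1]}

vec = CountVectorizer()
X = vec.fit_transform(items)

# One 2-class model per viewpoint
models = {v: MultinomialNB().fit(X, y) for v, y in labels.items()}

# Assign viewpoints to new items, then count expression frequencies per viewpoint
new_items = ["very clean room", "rude staff at check-in"]
Xn = vec.transform(new_items)
freq = {v: Counter() for v in models}
for i, text in enumerate(new_items):
    for v, m in models.items():
        if m.predict(Xn[i])[0] == 1:
            freq[v].update(text.split())
# Expressions frequent under one viewpoint but rare under the others are "characteristic"
```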

Keywords: Text mining, Multiple viewpoints, Differential analysis, Questionnaire data

6955 Process Oriented Architecture for Emergency Scenarios in the Czech Republic

Authors: Tomáš Ludík, Josef Navrátil, Alena Langerová

Abstract:

Tackling emergency situations is performed on the basis of emergency scenarios. These scenarios do not have a uniform form in the Czech Republic; they are unstructured and developed primarily in text form. This does not allow emergency situations to be solved efficiently. For this reason, the paper aims at defining a Process Oriented Architecture to support, and thus improve, the tackling of emergency situations in the Czech Republic. The innovative Process Oriented Architecture is based on the Workflow Reference Model while taking into account the capabilities of Business Process Management Suites for the implementation of process oriented emergency scenarios. To verify the proposed architecture, a Proof of Concept has been used which covers the reception of an emergency event at the district emergency operations centre. For the particular implementation of the proposed architecture, the Bonita Open Solution has been used. The architecture created in this way is suitable not only for emergency management, but also for educational purposes.

Keywords: Business Process Management Suite, Czech Republic, Emergency Scenarios, Process Execution, Process Oriented Architecture.

6954 Automated Process Quality Monitoring with Prediction of Fault Condition Using Measurement Data

Authors: Hyun-Woo Cho

Abstract:

Detection of incipient abnormal events is important to improve the safety and reliability of machine operations and to reduce losses caused by failures. Improper set-up or alignment of parts often leads to severe problems in many machines. The construction of prediction models for predicting faulty conditions is essential in making decisions on when to perform machine maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of machine measurement data. The calibration model is used to predict two faulty conditions from historical reference data. The approach utilizes genetic algorithm (GA) based variable selection, and we evaluate the predictive performance of several prediction methods using real data. The results show that the calibration model based on supervised probabilistic principal component analysis (SPPCA) yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, prediction performance can be improved by excluding non-informative variables from the model building steps.
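
The GA-based variable selection mentioned above can be sketched generically as a search over 0/1 column masks scored by cross-validation. In the sketch below, plain linear regression stands in for the paper's calibration models (e.g. SPPCA), the data are synthetic, and all GA settings are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

def ga_variable_selection(X, y, pop=20, gens=30, p_mut=0.05, seed=0):
    """Toy GA searching for a 0/1 mask over columns of X that maximizes CV R^2."""
    rng = np.random.default_rng(seed)
    n_var = X.shape[1]
    P = rng.integers(0, 2, size=(pop, n_var))

    def fitness(mask):
        if mask.sum() == 0:
            return -np.inf
        return cross_val_score(LinearRegression(), X[:, mask == 1], y, cv=5).mean()

    for _ in range(gens):
        scores = np.array([fitness(m) for m in P])
        parents = P[np.argsort(scores)[::-1][: pop // 2]]   # selection: keep best half
        children = []
        for _ in range(pop - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_var)                     # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_var) < p_mut                 # mutation
            child[flip] = 1 - child[flip]
            children.append(child)
        P = np.vstack([parents, children])
    return P[np.argmax([fitness(m) for m in P])]

# Toy usage on synthetic data with 10 candidate variables, 4 of them informative
X = np.random.default_rng(1).normal(size=(120, 10))
y = X[:, [0, 2, 5, 7]].sum(axis=1) + 0.1 * np.random.default_rng(2).normal(size=120)
mask = ga_variable_selection(X, y)
```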

Keywords: Prediction, operation monitoring, on-line data, nonlinear statistical methods, empirical model.

6953 Comparison of Different Channel Modeling Techniques Used in BPLC Systems

Authors: Justinian Anatory, Nelson Theethayi

Abstract:

This paper compares different channel models used for modeling a Broadband Power-Line Communication (BPLC) system. The models compared are the Zimmermann and Dostert model, the Philipps model, the Anatory et al. model and the Anatory et al. generalized Transmission Line (TL) model. The validity of each model was compared in the time domain with ATP-EMTP software, which uses a transmission line approach. It is found that for a power-line network with a minimum number of branches all the models give similar signal/pulse time responses compared with ATP-EMTP software; however, the Zimmermann and Dostert model indicates the same amplitude but a different time delay. It is observed that when the number of branches is increased, only the generalized TL theory approach gives results comparable with the ATP-EMTP results. The Multi-Carrier Spread Spectrum (MC-SS) system was also applied to check the implication of such behavior for the modulation schemes. It is observed that using the Philipps model on the underground cable can predict performance up to 25 dB better than the other channel models, which can misrepresent the actual performance of the system. Also, the modified Zimmermann and Dostert model under multipath can predict a performance about 5 dB better than that predicted by the generalized TL theory. It is therefore suggested that, for realistic BPLC system design and analysis, the model based on generalized TL theory be used.
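
For context, the Zimmermann and Dostert model referred to above is usually written as a multipath sum of weighted, attenuated and delayed echoes. The sketch below implements that commonly cited form; the path gains, path lengths and attenuation parameters are purely illustrative and are not taken from the paper.

```python
import numpy as np

def zimmermann_dostert_H(f, paths, a0=0.0, a1=1e-9, k=1.0, v_p=1.5e8):
    """Commonly cited multipath form of the Zimmermann-Dostert PLC channel model:
    H(f) = sum_i g_i * exp(-(a0 + a1 * f**k) * d_i) * exp(-1j * 2*pi*f * d_i / v_p),
    where `paths` is a list of (g_i, d_i) pairs (path weight, path length in metres).
    All parameter values here are placeholders, not fitted to any real network."""
    f = np.asarray(f, dtype=float)
    H = np.zeros_like(f, dtype=complex)
    for g, d in paths:
        attenuation = np.exp(-(a0 + a1 * f**k) * d)
        delay = np.exp(-1j * 2 * np.pi * f * d / v_p)
        H += g * attenuation * delay
    return H

# Hypothetical 3-path network, 1-30 MHz band
freqs = np.linspace(1e6, 30e6, 512)
H = zimmermann_dostert_H(freqs, paths=[(0.64, 200.0), (0.38, 222.4), (-0.15, 244.8)])
```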

Keywords: Broadband power-line channel models, load impedance, branched network.

6952 Web-Based Cognitive Writing Instruction (WeCWI): A Theoretical-and-Pedagogical e-Framework for Language Development

Authors: Boon Yih Mah

Abstract:

Web-based Cognitive Writing Instruction (WeCWI)'s contribution towards language development can be divided into linguistic and non-linguistic perspectives. In the linguistic perspective, WeCWI focuses on literacy and language discoveries, while cognitive and psychological discoveries are the hubs of the non-linguistic perspective. In the linguistic perspective, WeCWI draws attention to free reading and enterprises, which are supported by language acquisition theories. Besides, the adoption of the process genre approach as a hybrid guided writing approach fosters literacy development. Literacy and language development are interconnected in the communication process; hence, WeCWI encourages meaningful discussion based on the interactionist theory that involves input, negotiation, output, and interactional feedback. Rooted in the e-learning interaction-based model, WeCWI promotes online discussion via synchronous and asynchronous communication, which allows interactions to happen among the learners, instructor, and digital content. In the non-linguistic perspective, WeCWI highlights the contribution of reading, discussion, and writing towards cognitive development. Based on the inquiry models, learners' critical thinking is fostered during the information exploration process through interaction and questioning. Lastly, to lower writing anxiety, WeCWI develops the instructional tool with supportive features to facilitate the writing process. To bring a positive user experience to the learner, WeCWI aims to create the instructional tool with different interface designs based on two different types of perceptual learning style.

Keywords: WeCWI, literacy discovery, language discovery, cognitive discovery, psychological discovery.

6951 Analysis of Pressure Drop in a Concentrated Solar Collector with Direct Steam Production

Authors: Sara Sallam, Mohamed Taqi, Naoual Belouaggadia

Abstract:

Solar thermal power plants using parabolic trough collectors (PTC) are currently a powerful technology for generating electricity. Most of these solar power plants use thermal oils as the heat transfer fluid. The oil is heated in the solar field and transfers the absorbed heat in an oil-water heat exchanger to produce the steam driving the turbines of the power plant. Currently, efforts are being made to develop PTCs with direct steam generation (DSG). This process consists of circulating pressurized water in the receiver tube to generate steam directly in the solar loop. This makes it possible to reduce the investment and maintenance costs of the PTCs (the oil-water exchangers are removed) and to avoid the environmental risks associated with the use of thermal oils. The pressure drops in these systems are an important parameter to ensure their proper operation. The determination of these losses is complex because of the presence of the two phases, and most often they are described by models using empirical correlations. A comparison of these models with experimental data was performed. Our calculations focused on the evolution of the pressure of the liquid-vapor mixture along the receiver tube of a PTC-DSG for pressure values and inlet flow rates ranging from 3 to 10 MPa and from 0.4 to 0.6 kg/s, respectively. The comparison of the numerical results with experiment demonstrates the validity of some models, depending on the pressures and inlet flow rates in the PTC-DSG receiver tube. The analysis of the effects of these two parameters on the evolution of the pressure along the receiver tube shows that increasing the inlet pressure and decreasing the flow rate lead to minimal pressure losses.

Keywords: Direct steam generation, parabolic trough collectors, pressure drop.

6950 Vision Based Hand Gesture Recognition Using Generative and Discriminative Stochastic Models

Authors: Mahmoud Elmezain, Samar El-shinawy

Abstract:

Many approaches to pattern recognition are founded on probability theory and can be broadly characterized as either generative or discriminative according to whether or not they model the distribution of the image features. Generative and discriminative models have very different characteristics, as well as complementary strengths and weaknesses. In this paper, we study these models to recognize the patterns of alphabet characters (A-Z) and numbers (0-9). To handle isolated patterns, a generative model, the Hidden Markov Model (HMM), and discriminative models, namely the Conditional Random Field (CRF), Hidden Conditional Random Field (HCRF) and Latent-Dynamic Conditional Random Field (LDCRF), with different window sizes are applied to the extracted pattern features. The gesture recognition rate improves initially as the window size increases, but degrades as the window size increases further. Experimental results show that the LDCRF gives the best results compared to CRF, HCRF and HMM at a window size of 4. Additionally, our results show that the overall recognition rates are 91.52%, 95.28%, 96.94% and 98.05% for CRF, HCRF, HMM and LDCRF, respectively.

Keywords: Statistical Pattern Recognition, Generative Model, Discriminative Model, Human Computer Interaction.

6949 Dynamic Load Modeling for KHUZESTAN Power System Voltage Stability Studies

Authors: M. Sedighizadeh, A. Rezazadeh

Abstract:

Based on the component approach, three kinds of dynamic load models, including a single-motor model, a two-motor model and a composite load model, have been developed for stability studies of the Khuzestan power system. The study results are presented in this paper. Voltage instability is a dynamic phenomenon and therefore requires dynamic representation of the power system components. Industrial loads contain a large fraction of induction machines, and several models of different complexity are available for describing them in such investigations. This study evaluates the dynamic performance of several dynamic load models in combination with the dynamics of a load changing transformer. The case study is a steel industry substation in the Khuzestan power system.

Keywords: Dynamic load, modeling, Voltage Stability.

6948 A New Vector Quantization Front-End Process for Discrete HMM Speech Recognition System

Authors: M. Debyeche, J.P Haton, A. Houacine

Abstract:

The paper presents a complete discrete statistical framework based on a novel vector quantization (VQ) front-end process. This new VQ approach performs an optimal distribution of VQ codebook components over HMM states. This technique, which we call distributed vector quantization (DVQ) of hidden Markov models, succeeds in unifying acoustic micro-structure and phonetic macro-structure when the HMM parameters are estimated. The DVQ technique is implemented in two variants. The first variant uses the K-means algorithm (K-means-DVQ) to optimize the VQ, while the second variant exploits the classification behavior of neural networks (NN-DVQ) for the same purpose. The proposed variants are compared with an HMM-based baseline system through experiments on the recognition of specific Arabic consonants. The results show that the distributed vector quantization technique increases the performance of the discrete HMM system.
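
For orientation, the sketch below shows a plain (non-distributed) VQ front end: a single global K-means codebook that maps continuous feature frames to discrete symbols for a discrete HMM. The paper's DVQ, which distributes codebook components over HMM states, is not reproduced here, and the feature values and codebook size are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical acoustic feature frames (e.g., MFCC vectors), one row per frame
rng = np.random.default_rng(0)
train_frames = rng.normal(size=(2000, 13))

# Baseline VQ front end: one global codebook of 64 codewords
codebook = KMeans(n_clusters=64, n_init=10, random_state=0).fit(train_frames)

def to_discrete_symbols(frames):
    """Map continuous feature frames to codeword indices for a discrete HMM."""
    return codebook.predict(frames)

utterance = rng.normal(size=(120, 13))      # frames of one utterance
symbols = to_discrete_symbols(utterance)    # observation sequence fed to the HMM
```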

Keywords: Hidden Markov Model, Vector Quantization, Neural Network, Speech Recognition, Arabic Language

6947 Stochastic Learning Algorithms for Modeling Human Category Learning

Authors: Toshihiko Matsuka, James E. Corter

Abstract:

Most neural network (NN) models of human category learning use a gradient-based learning method, which assumes that locally optimal changes are made to model parameters on each learning trial. This method tends to underpredict variability in individual-level cognitive processes. In addition, many recent models of human category learning have been criticized for not being able to replicate the rapid changes in categorization accuracy and attention processes observed in empirical studies. In this paper we introduce stochastic learning algorithms for NN models of human category learning and show that use of the algorithms can result in (a) rapid changes in accuracy and attention allocation, and (b) different learning trajectories and more realistic variability at the individual level.
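
The abstract does not spell out the stochastic algorithms themselves, so the sketch below only illustrates the general idea in its simplest form: injecting trial-by-trial noise into an otherwise gradient-based update, which is enough to produce distinct individual learning trajectories. The objective, the learning rate and the noise level are placeholders, not the paper's RBF category-learning model.

```python
import numpy as np

def stochastic_update(w, grad, lr=0.1, noise_sd=0.05, rng=None):
    """One weight update: a gradient step plus a trial-by-trial Gaussian perturbation.
    The noise term is what produces variability across simulated individuals."""
    rng = rng or np.random.default_rng()
    return w - lr * grad(w) + rng.normal(0.0, noise_sd, size=w.shape)

# Toy usage: two simulated "participants" learning the same 1-D association
grad = lambda w: 2 * (w - 1.0)          # gradient of (w - target)^2 with target = 1.0
for seed in (1, 2):
    rng, w = np.random.default_rng(seed), np.zeros(1)
    trajectory = [w.copy()]
    for _ in range(50):
        w = stochastic_update(w, grad, rng=rng)
        trajectory.append(w.copy())
    # Each seed yields a different learning trajectory around the same end state
```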

Keywords: category learning, cognitive modeling, radial basis function, stochastic optimization.

6946 Personal Information Classification Based on Deep Learning in Automatic Form Filling System

Authors: Shunzuo Wu, Xudong Luo, Yuanxiu Liao

Abstract:

Recently, the rapid development of deep learning has enabled artificial intelligence (AI) to penetrate many fields, replacing manual work there. In particular, AI systems have also become a research focus in the field of office automation. To meet real needs in office automation, in this paper we develop an automatic form filling system. Specifically, it uses two classical neural network models and several word embedding models to classify various relevant information elicited from the Internet. When training the neural network models, we use less noisy and balanced data. We conduct a series of experiments to test our system, and the results show that it can achieve better classification results.

Keywords: Personal information, deep learning, auto fill, NLP, document analysis.

6945 Elastic and Plastic Collision Comparison Using Finite Element Method

Authors: Gustavo Rodrigues, Hans Weber, Larissa Driemeier

Abstract:

The prediction of post-impact conditions and the behavior of bodies during impact have been the object of several collision models. The formulation from Hertz's theory, dating from the 19th century, is generally used. These models consider the repulsive force as proportional to the deformation of the bodies under contact and may also consider it proportional to the rate of deformation. The objective of the present work is to analyze the behavior of the bodies during impact using the Finite Element Method (FEM) with elastic and plastic material models. The main parameters to evaluate are the contact force, the time of contact and the deformation of the bodies. An advantage of the FEM approach is the possibility of applying a plastic deformation to the model according to the material definition: the Johnson-Cook plasticity model is used, whose parameters are obtained through empirical tests of real materials. This model allows analyzing the permanent deformation caused by impact, a phenomenon observed in the real world depending on the forces applied to the body. These results are compared with each other and with the Hertz-theory-based model.
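
The Hertz model the abstract refers to has a closed form for two elastic spheres, which gives the FEM results a concrete analytical baseline. The sketch below states that elastic, frictionless sphere-on-sphere case explicitly; the material and geometry values are hypothetical, not the ones used in the paper.

```python
import numpy as np

def hertz_contact_force(delta, R1, R2, E1, E2, nu1, nu2):
    """Hertzian normal contact force between two elastic spheres:
    F = (4/3) * E_eff * sqrt(R_eff) * delta**1.5, where
    1/E_eff = (1 - nu1^2)/E1 + (1 - nu2^2)/E2 and 1/R_eff = 1/R1 + 1/R2.
    `delta` is the mutual approach (indentation) in metres."""
    E_eff = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)
    R_eff = 1.0 / (1.0 / R1 + 1.0 / R2)
    return (4.0 / 3.0) * E_eff * np.sqrt(R_eff) * np.power(delta, 1.5)

# Hypothetical steel-on-steel example: 10 mm spheres, 0-20 micrometre indentation
delta = np.linspace(0.0, 20e-6, 100)
F = hertz_contact_force(delta, R1=0.01, R2=0.01, E1=210e9, E2=210e9, nu1=0.3, nu2=0.3)
```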

Keywords: Collision, finite element method, Hertz’s Theory, impact models.

6944 A Simplified and Effective Algorithm Used to Mine Similar Processes: An Illustrated Example

Authors: Min-Hsun Kuo, Yun-Shiow Chen

Abstract:

The running logs of a process hold valuable information about its executed activity behavior and the generated activity logic structure. These informative logs can be extracted, analyzed and utilized to improve the efficiency of the process's execution and conduct. One of the techniques used to accomplish such process improvement is called process mining, and mining similar processes is one such improvement mission in process mining. Rather than directly mining similar processes using a single comparison coefficient or a complicated fitness function, this paper presents a simplified heuristic process mining algorithm with two similarity comparisons that relatively conform the activity logic sequences (traces) of the mined processes with those of a normalized (regularized) one. The relative process conformance finds which of the mined processes match the required activity sequences and relationships, so that the mined processes can then be applied, as necessary and sufficient, to process improvements. One similarity is defined by the relationships in terms of the number of similar activity sequences existing in different processes; the other similarity expresses the degree of similar (identical) activity sequences among the conforming processes. Since these two similarities are with respect to certain typical behavior (activity sequences) occurring in an entire process, the common problems, such as the inappropriateness of an absolute comparison and the incapability of intrinsic information elicitation, which often appear in other process conformance techniques, can be solved by the relative process comparison presented in this paper. To demonstrate the potential of the proposed algorithm, a numerical example is illustrated.
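
The abstract does not define the two similarity measures precisely, so the sketch below only illustrates the general flavor of trace comparison: scoring how many consecutive-activity pairs a mined trace shares with a normalized reference trace. The traces and the Dice-like overlap measure are purely illustrative, not the authors' coefficients.

```python
from collections import Counter

def ngrams(trace, n=2):
    """All length-n activity subsequences (here: pairs of consecutive activities)."""
    return Counter(tuple(trace[i:i + n]) for i in range(len(trace) - n + 1))

def trace_similarity(trace_a, trace_b, n=2):
    """Dice-like overlap of consecutive-activity pairs between two traces."""
    a, b = ngrams(trace_a, n), ngrams(trace_b, n)
    shared = sum((a & b).values())
    total = sum(a.values()) + sum(b.values())
    return 2.0 * shared / total if total else 0.0

# Hypothetical traces: a normalized reference process and two mined candidates
reference = ["receive", "check", "approve", "archive"]
mined_1   = ["receive", "check", "approve", "notify", "archive"]
mined_2   = ["receive", "approve", "check", "archive"]
print(trace_similarity(reference, mined_1), trace_similarity(reference, mined_2))
```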

Keywords: process mining, process similarity, artificial intelligence, process conformance.

6943 Roles and Responsibilities to Success of IT Project in an Organization

Authors: Vahhab Attar Olyaee, Fouad Attar Olyaee

Abstract:

Many IT projects fail because of a purely technical approach, a focus on the final product and a lack of proper attention to strategic alignment. Project management models quite often take a technical management view [4], [8], [13], [14]. These models focus greatly on the finalization of the project product and its delivery to the customer. However, many project problems are due to a lack of attention to the needs and capabilities of the organization, or to disregarding how the product will be deployed and used in the organization. In this regard, in the current research we try to present a solution with the purpose of raising the value of the project in an organization, so that the project outputs are properly deployed in the organization. Therefore, a comprehensive model is presented which takes into account the whole process, from the initial step of project definition to the deployment of the final outputs in the organization, and which then defines all roles and responsibilities needed to put the model into practice. Taking into account the opinions of experts and project managers, and to prove the performance of the model, project problems were recognized and, based on the model, categorized and analyzed. In the end it becomes clear that ignoring the proper definition of the project, not having a proper understanding of the expected value, and not supervising the value that emerges during production and installation are among the most important factors that bring a project to failure.

Keywords: IT Governance, Project Model, Roles and Responsibilities of Project

6942 Analytical and Experimental Study on the Effect of Air-Core Coil Parameters on Magnetic Force Used in a Linear Optical Scanner

Authors: Loke Kean Koay, Horizon Gitano-Briggs, Mani Maran Ratnam

Abstract:

Today air-core coils (ACCs) are a viable alternative to ferrite-core coils in a range of applications due to their low induction effect. An analytical study was carried out and the results were used as a guide to understand the relationship between the magnet-coil distance and the resulting attractive magnetic force. Four different ACC models were fabricated for experimental study. The variations among the models included the dimensions, the number of coil turns and the current supplied to the coil. Comparison between the analytical and experimental results for all the models shows an average discrepancy of less than 10%. An optimized ACC design was selected for the scanner, which can provide maximum magnetic force.
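
One textbook starting point for the analytical part of such a study is the on-axis field of a thin circular coil combined with a point-dipole approximation of the magnet. The sketch below uses that idealization (it ignores the coil's axial length and the finite magnet geometry, which the paper's fabricated models would have to account for), and all parameter values are hypothetical.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (H/m)

def coil_axial_field(x, N, I, R):
    """On-axis flux density of a thin N-turn circular coil of radius R carrying
    current I, at axial distance x: B(x) = mu0*N*I*R^2 / (2*(R^2 + x^2)^(3/2))."""
    return MU0 * N * I * R**2 / (2.0 * (R**2 + x**2) ** 1.5)

def axial_force_on_dipole(x, N, I, R, m):
    """Axial force on a small magnet treated as a point dipole of moment m (A*m^2):
    F = m * dB/dx, with the on-axis field derivative taken analytically."""
    dB_dx = -3.0 * MU0 * N * I * R**2 * x / (2.0 * (R**2 + x**2) ** 2.5)
    return m * dB_dx

# Hypothetical coil: 200 turns, 0.5 A, 10 mm radius; magnet moment 0.1 A*m^2
x = np.linspace(1e-3, 20e-3, 100)
F = axial_force_on_dipole(x, N=200, I=0.5, R=0.01, m=0.1)
```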

Keywords: Air-Core Coils, Electromagnetic, Linear Optical Scanner

6941 Conceptual Method for Flexible Business Process Modeling

Authors: Adla Bentellis, Zizette Boufaïda

Abstract:

Nowadays, the pace of business change is such that, increasingly, new functionality has to be realized and reliably installed in a matter of days, or even hours. Consequently, more and more business processes are prone to continuous change. The objective of the research in progress is to use the MAP model in a conceptual modeling method for flexible and adaptive business processes. This method can be used to capture the flexibility dimensions of a business process; it takes inspiration from the modularity concept in the object oriented paradigm to establish a hierarchical construction of the BP model. Its intent is to provide flexible modeling that allows companies to quickly adapt their business processes.

Keywords: Business Process, Business process modeling, flexibility, MAP Model.

6940 Analysis of Hard Turning Process of AISI D3 – Thermal Aspects

Authors: B. Varaprasad, C. Srinivasa Rao

Abstract:

In the manufacturing sector, hard turning has emerged as a vital machining process for cutting hardened steels. Besides its many advantages, the hard turning operation has to be implemented so as to achieve close tolerances in terms of surface finish, high product quality, reduced machining time, low operating cost and environmentally friendly characteristics. In the present study, a three-dimensional CAE (Computer Aided Engineering) based simulation of hard turning using the commercial software DEFORM 3D has been compared to experimental results for stresses, temperatures and tool forces in the machining of AISI D3 steel using mixed ceramic inserts (CC6050). In the present analysis, orthogonal cutting models are proposed, considering several processing parameters such as cutting speed, feed, and depth of cut. Exhaustive friction modeling at the tool-work interface is carried out. Work material flow around the cutting edge is carefully modeled with an adaptive re-meshing simulation capability. In the process simulations, the feed rate and cutting speed are constant (i.e., 0.075 mm/rev and 155 m/min), and the analysis is focused on stresses, forces, and temperatures during machining. Close agreement is observed between the CAE simulation and the experimental values.

Keywords: Hard-turning, computer-aided engineering, computational machining, finite element method.

6939 Recognition of Gene Names from Gene Pathway Figures Using Siamese Network

Authors: Muhammad Azam, Micheal Olaolu Arowolo, Fei He, Mihail Popescu, Dong Xu

Abstract:

The number of biological papers is growing quickly, which means that the number of biological pathway figures in those papers is also increasing quickly. Each pathway figure shows extensive biological information, such as the names of genes and how the genes are related. However, manually annotating pathway figures takes a lot of time and work. Even though advanced image understanding models could speed up the curation process, these models still need to be made more accurate. To improve gene name recognition from pathway figures, we applied a Siamese network to map image segments to a library of pictures containing known genes, in a similar way to person recognition from photos in many photo applications. We used a triplet loss function and a triplet spatial pyramid pooling network, combining a triplet convolutional neural network with spatial pyramid pooling (TSPP-Net). We compared VGG19 and VGG16 as the Siamese network model. VGG16 achieved better performance, with an accuracy of 93%, which is much higher than Optical Character Recognition (OCR) results.
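
The triplet loss mentioned above has a standard form; the sketch below states it on plain embedding vectors so the training objective is concrete. The embeddings are randomly generated placeholders rather than outputs of the authors' TSPP-Net backbone.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Standard triplet loss on embedding vectors:
    max(0, ||a - p||^2 - ||a - n||^2 + margin).
    Training pushes same-gene image segments together and different genes apart."""
    d_pos = np.sum((anchor - positive) ** 2, axis=-1)
    d_neg = np.sum((anchor - negative) ** 2, axis=-1)
    return np.maximum(0.0, d_pos - d_neg + margin)

# Hypothetical 128-d embeddings produced by the shared (Siamese) backbone
rng = np.random.default_rng(0)
a, p, n = (rng.normal(size=(4, 128)) for _ in range(3))
loss = triplet_loss(a, p, n).mean()
```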

Keywords: Biological pathway, image understanding, gene name recognition, object detection, Siamese network, Visual Geometry Group.

6938 Artificial Neural Network Prediction for Coke Strength after Reaction and Data Analysis

Authors: Sulata Maharana, B Biswas, Adity Ganguly, Ashok Kumar

Abstract:

In this paper, the requirement for coke quality prediction, its role in blast furnaces, and the model output are explained. By applying an Artificial Neural Network (ANN) with the back-propagation (BP) algorithm, a prediction model has been developed to predict the Coke Strength after Reaction (CSR). Important blast furnace functions such as permeability, heat exchange, melting, and reducing capacity are closely connected to coke quality. Coke quality is in turn dependent upon coal characterization and coke making process parameters. The ANN model developed is a useful tool for process experts to adjust the control parameters in case of coke quality deviations. The model also makes it possible to predict CSR for new coal blends which are yet to be used in the coke plant. Input data to the model were structured into 3 modules covering the past 2 years, and the incremental models thus developed assist in identifying the group causing the deviation of CSR.

Keywords: Artificial Neural Networks, back-propagation, Coke Strength after Reaction, Multilayer Perceptron.

6937 Grid Coordination with Marketmaker Agents

Authors: Xin Bai, Kresimir Sivoncik, Damla Turgut, Ladislau Bölöni

Abstract:

Market based models are frequently used in resource allocation on the computational grid. However, as the size of the grid grows, it becomes difficult for the customer to negotiate directly with all the providers. Middle agents are introduced to mediate between the providers and customers and facilitate the resource allocation process. The most frequently deployed middle agents are matchmakers and brokers. The matchmaking agent finds possible candidate providers who can satisfy the requirements of the consumers, after which the customer negotiates directly with the candidates. The broker agents mediate the negotiation with the providers in real time. In this paper we present a new type of middle agent, the marketmaker. Its operation is based on two parallel processes: through the investment process the marketmaker acquires resources and resource reservations in large quantities, while through the resale process it sells them to the customers. The operation of the marketmaker is based on the fact that, through its global view of the grid, it can perform a more efficient resource allocation than is possible in one-to-one negotiations between customers and providers. We present the operation and algorithms governing the marketmaker agent, contrasting it with the matchmaker and broker agents. Through a series of simulations in the task oriented domain we compare the operation of the three agent types. We find that the use of the marketmaker agent leads to better performance in the allocation of large tasks and a significant reduction of the messaging overhead.

Keywords: grid computing, autonomous agents, market-based grid

6936 Microscopic Emission and Fuel Consumption Modeling for Light-duty Vehicles Using Portable Emission Measurement System Data

Authors: Wei Lei, Hui Chen, Lin Lu

Abstract:

Microscopic emission and fuel consumption models have been widely recognized as an effective method to quantify real traffic emissions and energy consumption when they are applied with microscopic traffic simulation models. This paper presents a framework for developing Microscopic Emission (HC, CO, NOx, and CO2) and Fuel consumption (MEF) models for light-duty vehicles. The variable of composite acceleration is introduced into the MEF model to capture the effects of historical accelerations, interacting with the current speed, on emission and fuel consumption. The MEF model is calibrated by the multivariate least-squares method for two types of light-duty vehicle using on-board data collected in Beijing, China by a Portable Emission Measurement System (PEMS). The instantaneous validation results show that the MEF model performs better, with lower Mean Absolute Percentage Error (MAPE), than the two other models. Moreover, the aggregate validation results show that the MEF model produces reasonable estimations compared to actual measurements, with prediction errors within 12%, 10%, 19%, and 9% for HC, CO, NOx emissions and fuel consumption, respectively.
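
To make the calibration step concrete, the sketch below fits a speed/acceleration polynomial to synthetic second-by-second records by ordinary least squares and reports MAPE. The "composite acceleration" below is a hypothetical exponentially weighted history of acceleration, since the abstract does not give the paper's exact definition; the data and coefficients are synthetic placeholders, not the PEMS measurements.

```python
import numpy as np

def composite_acceleration(a, window=5, decay=0.6):
    """Hypothetical 'composite acceleration': an exponentially weighted average of
    the current and previous accelerations (placeholder for the paper's definition)."""
    w = decay ** np.arange(window)
    w /= w.sum()
    padded = np.concatenate([np.full(window - 1, a[0]), a])
    return np.convolve(padded, w, mode="valid")

# Synthetic second-by-second records standing in for PEMS data:
# speed v (m/s), acceleration a (m/s^2), instantaneous fuel rate (g/s)
rng = np.random.default_rng(0)
v = np.abs(rng.normal(12.0, 4.0, 600))
a = rng.normal(0.0, 0.8, 600)
fuel = 0.4 + 0.05 * v + 0.02 * v * a.clip(min=0) + rng.normal(0.0, 0.05, 600)

# Multivariate least-squares calibration of a speed/acceleration polynomial model
ac = composite_acceleration(a)
X = np.column_stack([np.ones_like(v), v, v**2, a, ac, v * ac])
beta, *_ = np.linalg.lstsq(X, fuel, rcond=None)
mape = np.mean(np.abs(X @ beta - fuel) / fuel) * 100.0   # Mean Absolute Percentage Error
```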

Keywords: Emission, Fuel consumption, Light-duty vehicle, Microscopic, Modeling.

6935 Competitors’ Influence Analysis of a Retailer by Using Customer Value and Huff’s Gravity Model

Authors: Yepeng Cheng, Yasuhiko Morimoto

Abstract:

Customer relationship analysis is vital for retail stores, especially supermarkets. Point of sale (POS) systems make it possible to record the daily purchasing behavior of customers in an identification point of sale (ID-POS) database, which can be used to analyze the customer behavior of a supermarket. The customer value is an indicator, based on the ID-POS database, for detecting the customer loyalty of a store. In general, there are many supermarkets in a city, and nearby competitor supermarkets significantly affect the customer value of a supermarket's customers. However, it is impossible to obtain detailed ID-POS databases of competitor supermarkets. This study first focused on the customer value and the distance between a customer's home and the supermarkets in a city, and then constructed models based on logistic regression analysis to analyze correlations between distance and purchasing behavior using only the POS database of one supermarket chain. Three primary problems arose during the modeling process: the incomparability of customer values, multicollinearity among the customer value and distance data, and the number of valid partial regression coefficients. The improved customer value, Huff's gravity model, and the inverse attractiveness frequency are used to solve these problems. This paper presents three types of models based on these three methods for loyal customer classification and competitors' influence analysis. In numerical experiments, all types of models are useful for loyal customer classification. The model that incorporates all three methods is the most suitable one for evaluating the influence of other nearby supermarkets on customers' purchasing at a supermarket chain, from the viewpoint of valid partial regression coefficients and accuracy.
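
For reference, Huff's gravity model mentioned above assigns each customer a store-choice probability proportional to store attractiveness divided by a power of distance, normalized over all stores. The sketch below implements that standard form; the choice of floor area as the attractiveness measure, and all values and exponents, are hypothetical rather than taken from the paper.

```python
import numpy as np

def huff_probabilities(distances, attractiveness, alpha=1.0, beta=2.0):
    """Huff's gravity model: the probability that a customer at a given location
    shops at store j is P_j = S_j**alpha / d_j**beta, normalized over all stores."""
    utility = (attractiveness ** alpha) / (distances ** beta)
    return utility / utility.sum()

# Hypothetical customer: distances (km) to one chain store and two competitors,
# with floor area (m^2) used as the attractiveness measure
distances = np.array([0.8, 1.5, 2.2])
floor_area = np.array([1200.0, 2500.0, 1800.0])
p = huff_probabilities(distances, floor_area)   # e.g., p[0] = chain-store share
```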

Keywords: Customer value, Huff's Gravity Model, POS, retailer.

6934 The Reconstruction of New Agegraphic and Gauss-Bonnet Dark Energy Models with a Special Power Law Expansion

Authors: V. Fayaz , F. Felegary

Abstract:

Here, in this work we study the correspondence between the energy density of the new agegraphic model and the energy density of the Gauss-Bonnet model in a flat universe. We reconstruct Λ and ω_Λ for them with the power-law expansion a(t) = a_0 t^{h_0}.
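
If the scrambled expression in the original text is read as the power-law scale factor announced in the title, a(t) = a_0 t^{h_0} (this reading is an assumption), then the background expansion quantities that such a reconstruction typically uses follow immediately:

```latex
a(t) = a_0\, t^{h_0}, \qquad
H(t) \equiv \frac{\dot a}{a} = \frac{h_0}{t}, \qquad
\dot H = -\frac{h_0}{t^{2}} = -\frac{H^{2}}{h_0}.
```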

Keywords: dark energy, new agegraphic, Gauss-Bonnet, late-time universe

6933 Modelling of Soil Structure Interaction of Integral Abutment Bridges

Authors: Thevaneyan K. David, John P. Forth

Abstract:

Integral Abutment Bridges (IABs) are defined as simple or multiple span bridges in which the bridge deck is cast monolithically with the abutment walls. This kind of bridge is becoming very popular due to different aspects such as good response under seismic loading, low initial costs, elimination of bearings, and less maintenance. However, the main issue in the analysis of this type of structure is dealing with the soil-structure interaction of the abutment walls and the supporting piles. Various soil constitutive models have been used by researchers in studies of soil-structure interaction in this kind of structure. This paper is an effort to review the implementation of various finite element models which explicitly incorporate the nonlinear soil and linear structural response, considering various soil constitutive models and finite element meshes.

Keywords: Constitutive Models, FEM, Integral Abutment Bridges, Soil-structure Interactions

6932 Unified Structured Process for Health Analytics

Authors: Supunmali Ahangama, Danny Chiang Choon Poo

Abstract:

Health analytics (HA) is used in healthcare systems for effective decision making, management and planning of healthcare and related activities. However, user resistance, the unique nature of medical data content and structure (including heterogeneous and unstructured data) and impromptu HA projects have held up progress in HA applications. Notably, the accuracy of outcomes depends on the skills and domain knowledge of the data analyst working on the healthcare data. The success of HA depends on having a sound process model, effective project management and the availability of supporting tools. Thus, to overcome these challenges through an effective process model, we propose a HA process model with features from the rational unified process (RUP) model and agile methodology.

Keywords: Agile methodology, health analytics, unified process model, UML.
