Search results for: stochastic process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5665


5425 Application of Feed Forward Neural Networks in Modeling and Control of a Fed-Batch Crystallization Process

Authors: Petia Georgieva, Sebastião Feyo de Azevedo

Abstract:

This paper focuses on issues of nonlinear dynamic process modeling and model-based predictive control of a fed-batch sugar crystallization process, applying the concept of artificial neural networks as computational tools. The control objective is to force the operation to follow an optimal supersaturation trajectory. This is achieved by manipulating the feed flow rate of sugar liquor/syrup, considered as the control input. A feed forward neural network (FFNN) model of the process is first built as part of the controller structure to predict the process response over a specified (prediction) horizon. The predictions are supplied to an optimization procedure that determines the values of the control action over a specified (control) horizon which minimize a predefined performance index. The control task is rather challenging due to the strong nonlinearity of the process dynamics and variations in the crystallization kinetics. However, the simulation results demonstrated smooth behavior of the control actions and satisfactory reference tracking.
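As a rough illustration of the control scheme just described, the sketch below wires a pre-trained feed forward network into a receding-horizon optimizer. The network weights, horizon length, feed flow bounds and the quadratic performance index are hypothetical placeholders, not the settings used by the authors.

```python
# Minimal sketch of neural-network-based MPC, assuming a pre-trained
# one-step FFNN predictor of the supersaturation state (hypothetical).
import numpy as np
from scipy.optimize import minimize

def nn_predict(state, u, W1, b1, W2, b2):
    """One-step FFNN prediction: x_{k+1} = W2*tanh(W1*[x;u] + b1) + b2."""
    z = np.tanh(W1 @ np.concatenate([state, [u]]) + b1)
    return W2 @ z + b2

def mpc_cost(u_seq, state, ref_traj, weights):
    """Sum of squared tracking errors over the prediction horizon
    plus a small penalty on the control effort (performance index)."""
    W1, b1, W2, b2 = weights
    cost, x = 0.0, state.copy()
    for k, u in enumerate(u_seq):
        x = nn_predict(x, u, W1, b1, W2, b2)
        cost += (x[0] - ref_traj[k]) ** 2 + 0.01 * u ** 2
    return cost

def mpc_step(state, ref_traj, weights, horizon=10, u_max=1.0):
    """Optimize the feed flow sequence over the control horizon and
    return only the first move (receding-horizon principle)."""
    u0 = np.zeros(horizon)
    res = minimize(mpc_cost, u0, args=(state, ref_traj, weights),
                   bounds=[(0.0, u_max)] * horizon)
    return res.x[0]
```

In use, `mpc_step` would be called once per sampling instant with the current state and the next segment of the optimal supersaturation reference, and only the first optimized feed flow value would be applied.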

Keywords: Feed forward neural network, process modelling, model predictive control, crystallization process.

5424 A New Framework to Model a Secure E-Commerce System

Authors: A. Youseef, F. Liu

Abstract:

Existing information system (IS) development methods do not meet the requirements for resolving security-related IS problems, and they fail to integrate security and systems engineering successfully across all stages of the development process. Hence, security should be considered throughout the whole software development process and identified together with the requirements specification. This paper proposes an integrated security and IS engineering approach covering all software development stages, using the i* language. The proposed framework is divided into three parts: modelling the business environment, modelling the information technology system, and modelling IS security. The results show that considering IS security goals throughout the whole system development process can have a positive influence on system implementation and better meet business expectations.

Keywords: Business Process Modelling (BPM), Information System Security, Software Development Process, Requirement Engineering.

5423 Carrying Out the Steps of Decision Making Process in Concrete Organization

Authors: Eva Štěpánková

Abstract:

The decision-making process is clearly defined in theory. Generally, it includes problem identification and analysis, data gathering, setting of goals and criteria, development of alternatives, choice of the optimal alternative, and its implementation. In practice, however, various modifications of the theoretical decision-making process occur. Managers may consider some of the phases too complicated or unfeasible and therefore do not carry them out, while, conversely, some of the steps may be overestimated. The aim of the paper is to reveal and characterize how managers perceive the individual phases of the decision-making process. The research concerns managers in the military environment, namely commanders. A quantitative survey was conducted cross-sectionally across the individual levels of management of the Ministry of Defence of the Czech Republic. Based on a total of 135 respondents, the analysis focuses on which phases of the decision-making process are problematic or not carried out in practice and which, in turn, are perceived as the easiest. The reasons behind these findings are then examined.

Keywords: Decision making, decision making process, decision problems.

5422 Parametric Studies of Ethylene Dichloride Purification Process

Authors: Sh. Arzani, H. Kazemi Esfeh, Y. Galeh Zadeh, V. Akbari

Abstract:

Ethylene dichloride (EDC) is a colorless liquid with a chloroform-like smell. EDC is a simple chlorinated hydrocarbon obtained by chlorinating ethylene gas. Its chemical formula is C2H4Cl2, and it is used as the main intermediate in VCM (vinyl chloride monomer) production. Therefore, the purification of EDC is an important step in the petrochemical process. In this study, the EDC purification unit was simulated, and the simulation was then validated. Finally, the impact of the process parameters on the degree of EDC purity was studied. The results showed that increasing the feed flow increases the impure components in the reflux and results in a decrease in EDC purity.

Keywords: Ethylene dichloride, purification, EDC, simulation.

5421 Effect of Impurities in the Chlorination Process of TiO2

Authors: Seok Hong Min, Tae Kwon Ha

Abstract:

With the increasing interest in Ti alloys, the extraction of Ti from its typical ore, TiO2, has long been, and will remain, an important issue. As an intermediate product for the production of pigment or titanium metal sponge, titanium tetrachloride (TiCl4) is produced in a fluidized bed from high-grade TiO2 feedstock. The purity of TiCl4 after chlorination depends on the quality of the titanium feedstock. Since the impurities in the TiCl4 product are carried over into the final products, purification of the crude TiCl4 is required. The purification process includes fractional distillation and chemical treatment, depending on the nature of the impurities present and the required quality of the final product. In this study, a thermodynamic analysis of the impurity effect in the chlorination process, which is the first step in the extraction of Ti from TiO2, has been conducted. All thermodynamic calculations were performed using the FactSage thermodynamic software.
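For orientation, the chlorination step referred to here is usually carried out as carbochlorination of the TiO2 feedstock in the fluidized bed; a commonly quoted overall reaction (a textbook form, not taken from the paper) is

\[
\mathrm{TiO_2} + 2\,\mathrm{Cl_2} + 2\,\mathrm{C} \;\longrightarrow\; \mathrm{TiCl_4} + 2\,\mathrm{CO},
\]

and it is the impurity oxides in the feedstock that form the additional chlorides which later have to be separated from the crude TiCl4.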

Keywords: Rutile, titanium, chlorination process, impurities, thermodynamic calculation, FactSage.

5420 Electricity Load Modeling: An Application to Italian Market

Authors: Giovanni Masala, Stefania Marica

Abstract:

Forecasting electricity load plays a crucial role in decision making and planning for economic purposes. Moreover, in the light of the recent privatization and deregulation of the power industry, forecasting future electricity load has become a very challenging problem. Empirical electricity load data highlight a clear seasonal behavior (higher load during the winter season), which is partly due to climatic effects. We also emphasize the presence of load periodicity on a weekly basis (electricity load is usually lower on weekends and holidays) and on a daily basis (electricity load is clearly influenced by the hour of the day). Finally, a long-term trend may depend on the general economic situation (for example, industrial production affects electricity load). All these features must be captured by the model. The purpose of this paper is therefore to build an hourly electricity load model. The deterministic component of the model is obtained through non-linear regression and Fourier series, while the stochastic component is investigated through econometric tools. The model parameters are calibrated using data from the Italian market over a six-year period (2007-2012). We then perform a Monte Carlo simulation in order to compare the simulated data with the real data (both in-sample and out-of-sample inspection). The reliability of the model is confirmed by standard tests, which highlight a good fit of the simulated values.
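As a rough sketch of how the deterministic component described above could be fitted, the snippet below regresses the hourly load on a linear trend plus Fourier terms for the daily and weekly cycles by ordinary least squares; the residuals are what would then be handed to the econometric (e.g. ARMA-GARCH) stage. The number of harmonics and the data layout are illustrative assumptions, not the authors' calibration.

```python
# Minimal sketch: deterministic load component = trend + Fourier terms
# for the daily (24 h) and weekly (168 h) cycles, fitted by least squares.
import numpy as np

def fourier_design(t, periods=(24.0, 168.0), harmonics=2):
    """Design matrix with intercept, linear trend and sine/cosine
    pairs for each period and harmonic."""
    cols = [np.ones_like(t), t]
    for P in periods:
        for k in range(1, harmonics + 1):
            cols.append(np.sin(2 * np.pi * k * t / P))
            cols.append(np.cos(2 * np.pi * k * t / P))
    return np.column_stack(cols)

def fit_deterministic(load):
    """Fit the deterministic component; the residuals are the input
    to the stochastic (econometric) part of the model."""
    t = np.arange(len(load), dtype=float)
    X = fourier_design(t)
    beta, *_ = np.linalg.lstsq(X, load, rcond=None)
    fitted = X @ beta
    residuals = load - fitted
    return beta, fitted, residuals
```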

Keywords: ARMA-GARCH process, electricity load, fitting tests, Fourier series, Monte Carlo simulation, non-linear regression.

5419 Determining the Width and Depths of Cut in Milling on the Basis of a Multi-Dexel Model

Authors: Jens Friedrich, Matthias A. Gebele, Armin Lechler, Alexander Verl

Abstract:

Chatter vibrations and process instabilities are the most important factors limiting the productivity of the milling process. Chatter can lead to damage of the tool, the part, or the machine tool. Therefore, the estimation and prediction of process stability are very important. The process stability depends on the spindle speed, the depth of cut, and the width of cut. In milling, the process conditions are defined in the NC program. While the spindle speed is directly coded in the NC program, the depth and width of cut are unknown. This paper presents a new simulation-based approach for the prediction of the depth and width of cut of a milling process. The prediction is based on a material removal simulation with an analytically represented tool shape and a multi-dexel representation of the workpiece. The new calculation method allows the direct estimation of the depth and width of cut, which are the parameters influencing process stability, instead of the removed volume as in existing approaches. This knowledge can be used to predict the stability of new, unknown parts. Moreover, with an additional vibration sensor, the stability lobe diagram of a milling process can be estimated and improved based on the estimated depth and width of cut.
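The following toy example illustrates the underlying idea of reading the cutter engagement directly from a dexel model: the workpiece is stored as a grid of material heights (one dexel per cell), a flat end mill removes material along its path, and the axial depth of cut follows from the material actually removed; the radial width of cut could be read off the engaged footprint in the same way. Grid resolution, tool radius and the single-direction (height-field) simplification are assumptions for illustration only; the paper itself uses a multi-dexel representation.

```python
# Toy height-field (single-direction dexel) material removal sketch.
import numpy as np

def mill_step(heights, dx, center, radius, z_tool):
    """Lower a flat end mill to z_tool at 'center'; return the updated
    dexel heights and the axial depth of cut for this engagement."""
    ny, nx = heights.shape
    y, x = np.mgrid[0:ny, 0:nx]
    inside = (x * dx - center[0]) ** 2 + (y * dx - center[1]) ** 2 <= radius ** 2
    engaged = inside & (heights > z_tool)
    depth_of_cut = float((heights[engaged] - z_tool).max()) if engaged.any() else 0.0
    new_heights = heights.copy()
    new_heights[engaged] = z_tool            # material removal
    return new_heights, depth_of_cut

# Example: 50 mm x 50 mm block, 1 mm grid, 10 mm high stock.
heights = np.full((50, 50), 10.0)
heights, ap = mill_step(heights, dx=1.0, center=(25.0, 25.0),
                        radius=5.0, z_tool=8.0)
print(ap)   # expected axial depth of cut: 2.0 mm
```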

Keywords: Dexel, process stability, material removal, milling.

5418 New Product Development Process on High-Tech Innovation Life Cycle

Authors: Gonçalo G. Aleixo, Alexandra B. Tenera

Abstract:

This work will provide a new perspective on exploring the innovation theme. It will reveal that radical and incremental innovations are complementary during the innovation life cycle and are accomplished through distinct ways of developing new products. Each new product development process will be constructed according to the nature of each innovation and the state of product development. This paper proposes the inclusion of the organizational functional areas that influence new product development in the new product development process.

Keywords: Cross-functional, Incremental Innovation, New Product development Process, Radical Innovation

5417 Phenomenological and Theoretical Analysis of Relativistic Temperature Transformation and Relativistic Entropy

Authors: Marko Popovic

Abstract:

There are three possible interpretations of the effect of the Special Theory of Relativity (STR) on a thermodynamic system. Planck and Einstein regarded the process as isobaric; Ott, on the other hand, saw it as an adiabatic process. However, many logical reasons indicate that the process is isothermal. Our phenomenological consideration demonstrates that the temperature is invariant under the Lorentz transformation. In that case the process is isothermal, so volume and pressure are Lorentz covariant. If the process is isothermal, Boyle's law is Lorentz invariant. The equilibrium constant, Gibbs energy, activation energy, enthalpy, entropy, and extent of reaction also become Lorentz invariant.
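For reference, the three competing transformations discussed above can be written compactly (with the Lorentz factor defined as usual); the first two are the historical Planck-Einstein and Ott results, the third is the invariance advocated in this paper:

\[
T' = \frac{T}{\gamma} \;\;\text{(Planck--Einstein)}, \qquad
T' = \gamma\,T \;\;\text{(Ott)}, \qquad
T' = T \;\;\text{(isothermal, this work)}, \qquad
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}.
\]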

Keywords: STR, relativistic temperature transformation, Boyle's law, equilibrium constant, Gibbs energy.

5416 The Use of Process-Oriented Methods of Calculation to Determine the Costs of Logistics Processes

Authors: Tomas Cechura, Michal Simon

Abstract:

The aim of this paper is to create a proposal for determining the costs of logistics processes by using process-oriented calculation methods. The traditional approach treats logistics costs as part of manufacturing overhead, which is usually calculated as a percentage surcharge. In the traditional approach it is therefore not obvious where and in which activities the costs were incurred, so it is impossible to trace logistics costs to products. Our approach attempts to fix, or at least improve, this issue. Another benefit of applying the process approach is the identification of logistics processes which are otherwise hidden in manufacturing overhead. The first part of this paper describes the development of process-oriented methods over time. The next part shows the possibility of applying the process-oriented method called Prozesskostenrechnung (activity-based costing) to logistics processes. The conclusion summarizes the advantages and disadvantages of using this method in logistics.
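A minimal numerical sketch of the process-oriented (activity-based) allocation idea follows, with invented figures: each logistics process receives a cost driver rate, and products are charged according to how much of the driver they consume rather than via a flat overhead surcharge. The process names, costs and driver quantities are purely illustrative.

```python
# Hypothetical activity-based allocation of logistics costs to a product batch.
processes = {
    # process: (total period cost in EUR, cost driver quantity)
    "incoming goods handling": (40_000.0, 2_000),   # driver: pallets received
    "internal transport":      (25_000.0, 5_000),   # driver: transport orders
    "order picking":           (60_000.0, 12_000),  # driver: picks
}

# Cost driver rate = process cost / driver quantity.
rates = {p: cost / qty for p, (cost, qty) in processes.items()}

# Driver consumption of one (hypothetical) product batch.
consumption = {"incoming goods handling": 12,
               "internal transport": 30,
               "order picking": 150}

logistics_cost = sum(rates[p] * consumption[p] for p in consumption)
print(rates)            # EUR per driver unit
print(logistics_cost)   # logistics cost traced to the batch
```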

Keywords: Cost, logistics, calculation, process-oriented method.

5415 Mathematical Models of Flow Shop and Job Shop Scheduling Problems

Authors: Miloš Šeda

Abstract:

In this paper, mathematical models for the permutation flow shop scheduling and job shop scheduling problems are proposed. The first problem is formulated as a mixed integer programming model. As the problem is NP-complete, this model can only be used for smaller instances, where an optimal solution can be computed. For large instances, another model is proposed which is suitable for solving the problem by stochastic heuristic methods. For the job shop scheduling problem, a mathematical model and its main representation schemes are presented.
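To make the kind of formulation concrete, a standard disjunctive ("big-M") mixed integer model for the job shop problem can be sketched as follows; this is the textbook form, quoted for illustration rather than reproduced from the paper. With \(s_o\) the start time of operation \(o\), \(p_o\) its processing time, and binary \(y_{oo'}\) deciding the order of two operations that share a machine:

\[
\begin{aligned}
\min\; & C_{\max} \\
\text{s.t.}\; & s_{o'} \ge s_o + p_o && \text{for consecutive operations } o \to o' \text{ of the same job},\\
& s_{o'} \ge s_o + p_o - M\,(1 - y_{oo'}) && \text{for } o,\,o' \text{ on the same machine},\\
& s_o \ge s_{o'} + p_{o'} - M\,y_{oo'} && \text{for } o,\,o' \text{ on the same machine},\\
& C_{\max} \ge s_o + p_o, \quad s_o \ge 0, \quad y_{oo'} \in \{0,1\}.
\end{aligned}
\]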

Keywords: Flow shop, job shop, mixed integer model, representation scheme.

5414 Prediction of Solidification Behavior of Al Alloy in a Cube Mold Cavity

Authors: N. P. Yadav, Deepti Verma

Abstract:

This paper focuses on the mathematical modeling of the solidification of an Al alloy in a cube mold cavity in order to study the solidification behavior of the casting process. A parametric investigation of the solidification process inside the cavity was performed using a computational solidification/melting model coupled with a volume of fluid (VOF) model. An implicit filling algorithm is used in this study to understand the overall process, from the filling stage to solidification, in a model metal casting process. The model is validated against past studies under the same conditions. The solidification process is analyzed by including the effect of pouring velocity as well as natural convection from the wall and the geometry of the cavity. These studies show the possibility of various defects occurring during the solidification process.
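For orientation, the two coupled ingredients mentioned above are commonly written as the VOF advection equation for the metal volume fraction and an enthalpy-porosity momentum sink acting in the mushy zone (standard textbook forms, quoted here for context rather than taken from the paper):

\[
\frac{\partial \alpha}{\partial t} + \nabla \cdot (\alpha\,\mathbf{u}) = 0, \qquad
\mathbf{S} = -\,C\,\frac{(1 - f_l)^2}{f_l^3 + \epsilon}\,\mathbf{u},
\]

where \(\alpha\) is the metal volume fraction, \(f_l\) the liquid fraction, \(C\) a mushy-zone constant, and \(\epsilon\) a small number preventing division by zero.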

Keywords: Buoyancy driven flow, natural convection driven flow, residual flow, secondary flow, volume of fluid.

5413 Transmission Lines Loading Enhancement Using ADPSO Approach

Authors: M. Mahdavi, H. Monsef, A. Bagheri

Abstract:

Discrete particle swarm optimization (DPSO) is a powerful stochastic evolutionary algorithm used to solve large-scale, discrete, and nonlinear optimization problems. However, it has been observed that the standard DPSO algorithm suffers from premature convergence when solving a complex optimization problem such as transmission expansion planning (TEP). To resolve this problem, an advanced discrete particle swarm optimization (ADPSO) is proposed in this paper. The simulation results show that the optimization of line loading in transmission expansion planning with ADPSO is more precise than with DPSO.
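For orientation, the sketch below shows the classical binary/discrete PSO update (the sigmoid rule of Kennedy and Eberhart) on which such planners are typically built; the fitness function is a placeholder standing in for the TEP objective, and the ADPSO modifications proposed in the paper to mitigate premature convergence are not reproduced here.

```python
# Minimal binary (discrete) PSO sketch; 'fitness' is a placeholder
# for the TEP objective (e.g. investment plus line loading cost).
import numpy as np

rng = np.random.default_rng(0)

def binary_pso(fitness, n_bits, n_particles=20, iters=100,
               w=0.7, c1=1.5, c2=1.5, v_max=4.0):
    x = rng.integers(0, 2, size=(n_particles, n_bits))   # bit positions
    v = np.zeros((n_particles, n_bits))                   # velocities
    pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        v = np.clip(v, -v_max, v_max)
        prob = 1.0 / (1.0 + np.exp(-v))                   # sigmoid of velocity
        x = (rng.random(x.shape) < prob).astype(int)      # resample bits
        f = np.array([fitness(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Toy usage: minimize the number of selected candidate lines.
best, best_f = binary_pso(lambda bits: bits.sum(), n_bits=12)
```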

Keywords: ADPSO, TEP problem, Lines loading optimization.

5412 Optimization of Surface Roughness in Additive Manufacturing Processes via Taguchi Methodology

Authors: Anjian Chen, Joseph C. Chen

Abstract:

This paper studies a case in which the targeted surface roughness of a fused deposition modeling (FDM) additive manufacturing process is improved. The aim is to reduce or eliminate defects and to improve the process capability indices Cp and Cpk of the FDM additive manufacturing process. The baseline Cp is 0.274 and the baseline Cpk is 0.654. This research utilizes the Taguchi methodology to eliminate defects and improve the process. The Taguchi method is used to optimize the additive manufacturing process and the printing parameters that affect the targeted surface roughness of FDM additive manufacturing. A Taguchi L9 orthogonal array is used to organize the effects of the parameters (four controllable parameters and one non-controllable parameter) on the FDM additive manufacturing process. The four controllable parameters are nozzle temperature [°C], layer thickness [mm], nozzle speed [mm/s], and extruder speed [%]. The non-controllable parameter is the environmental temperature [°C]. After optimization of the parameters, a confirmation print was produced to prove that the results reduce the number of defects and improve the process capability index Cp from 0.274 to 1.605 and the Cpk from 0.654 to 1.233 for the FDM additive manufacturing process. The final results confirmed that the Taguchi methodology is sufficient to improve the surface roughness of the FDM additive manufacturing process.
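For readers unfamiliar with the capability indices quoted above, they are defined in the usual way from the specification limits and the process mean and standard deviation (standard definitions, not specific to this paper):

\[
C_p = \frac{USL - LSL}{6\sigma}, \qquad
C_{pk} = \min\!\left(\frac{USL - \mu}{3\sigma},\; \frac{\mu - LSL}{3\sigma}\right),
\]

so raising Cp and Cpk, as reported here, corresponds to tightening the roughness spread and centring it within the specification limits.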

Keywords: Additive manufacturing, fused deposition modeling, surface roughness, Six-Sigma, Taguchi method, 3D printing.

5411 Line Balancing in the Hard Disk Drive Process Using Simulation Techniques

Authors: Teerapun Saeheaw, Nivit Charoenchai, Wichai Chattinnawat

Abstract:

A simulation model is an easy way to build up a representation of real-life scenarios, to identify bottlenecks, and to enhance system performance. Using a valid simulation model may give several advantages in creating a better manufacturing design in order to improve system performance. This paper presents the results of implementing a simulation model to design the hard disk drive manufacturing process, applying line balancing to improve both the productivity and the quality of the hard disk drive process. The improved line balance resulted in an 86% decrease in work in process, an average 80% increase in output, an 86% decrease in average time in the system, and a 90% decrease in waiting time.
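As background, the line balance efficiency referred to above is conventionally computed from the task times, the number of workstations, and the cycle time (a standard definition, stated here for orientation):

\[
\text{Balance efficiency} = \frac{\sum_{i=1}^{m} t_i}{n \cdot c} \times 100\%,
\]

where \(t_i\) are the task times, \(n\) is the number of workstations, and \(c\) is the cycle time.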

Keywords: Line balancing, Arena, hard disk drive process, simulation, work in process (WIP).

5410 Integration Process of Industrial Design and Engineering Design

Authors: Kazuhide Sugiyama, Hiroshi Osada

Abstract:

Lately, a management strategy that puts Industrial Design (ID) at its core has been recognized as more important, as technology and price alone cannot differentiate a product. The need to shorten the time to develop a product also shortens the development period for ID, and this necessitates ID process management. This research analyzes the status of the integration process of ID and Engineering Design (ED) for office equipment, which requires the collaboration of ID and ED, in order to clarify the issues affecting development efficiency and to propose solutions.

Keywords: Industrial Design (ID), Engineering Design (ED), Integration process, Office equipment

5409 Using Perspective Schemata to Model the ETL Process

Authors: Valeria M. Pequeno, Joao Carlos G. M. Pires

Abstract:

Data Warehouses (DWs) are repositories which contain the unified history of an enterprise for decision support. The data must be Extracted from information sources, Transformed, and integrated to be Loaded (ETL) into the DW, using ETL tools. These tools focus on data movement, and the models are only used as a means to this end. From a conceptual viewpoint, the authors want to innovate the ETL process in two ways: 1) to make the compatibility between models clear in a declarative fashion, using correspondence assertions, and 2) to identify the instances from different sources that represent the same real-world entity. This paper presents an overview of the proposed framework for modelling the ETL process, which is based on the use of a reference model and perspective schemata. This approach provides the designer with a better understanding of the semantics associated with the ETL process.

Keywords: conceptual data model, correspondence assertions, data warehouse, data integration, ETL process, object relational database.

5408 Assessing Community Participation in Decision-Making Process under Co-Management: A Case Study on Hail Haor, Bangladesh

Authors: R. Ferdous

Abstract:

Power and responsibility sharing and democratic decision-making are the central ethos of co-management. It is assumed that involving the local community in the decision-making process can create a sense of ownership and responsibility in that community and motivate it towards collective action. However, this paper demonstrates that the process of involving the local community is not simple and straightforward, as it is influenced by structural aspects, power relations among the actors, and socially embedded institutions. These factors shape who will participate, how they will participate, and how the local community maneuvers its agency in the decision-making process. To grasp the complexities that materialize in the process of participation, and to understand the inclusionary and exclusionary nature of participation, this paper examines the subjective understanding of different stakeholders concerning participation and, furthermore, observes the enabling or constraining factors that affect the community's exercise of its agency.

Keywords: Participation, social embeddedness, power, structure.

5407 Quality Based Approach for Efficient Biologics Manufacturing

Authors: Takashi Kaminagayoshi, Shigeyuki Haruyama

Abstract:

To improve the manufacturing efficiency of biologics, such as antibody drugs, a quality engineering framework was designed. Within this framework, the critical steps and parameters in the manufacturing process were studied. Identification of these critical steps and critical parameters allows a deeper understanding of manufacturing capabilities and makes it possible to suggest to the process development department process control standards based on actual manufacturing capabilities, as part of a PDCA (plan-do-check-act) cycle. This cycle can be applied to each manufacturing process so that it can be standardized, reducing the time needed to establish each new process.

Keywords: Antibody drugs, biologics, manufacturing efficiency, PDCA cycle, quality engineering.

5406 Comprehensive Assessment of Energy Efficiency within the Production Process

Authors: S. Kreitlein, N. Eder, A. Syed-Khaja, J. Franke

Abstract:

The importance of energy efficiency within production processes is increasing steadily. Unfortunately, no tools yet exist for a comprehensive assessment of energy efficiency within the production process. Therefore, the Institute for Factory Automation and Production Systems at the Friedrich-Alexander-University Erlangen-Nuremberg has developed two methods with the goal of achieving transparency and a quantitative assessment of energy efficiency, namely the EEV (Energy Efficiency Value) and the EPE (Energetic Process Efficiency). This paper describes the basics and the state of the art as well as the developed approaches.

Keywords: Energy efficiency, energy efficiency value, energetic process efficiency, production.

5405 Comparative Analysis of Transient-Fault Tolerant Schemes for Network on Chips

Authors: Muhammad Ali, Awais Adnan

Abstract:

The network on a chip (NoC) has been proposed as a viable solution to counter the inefficiency of buses in current VLSI on-chip interconnects. However, as silicon chips accommodate more transistors, the probability of transient faults increases, making fault tolerance a key concern in scaling chips. In packet-based communication on a chip, transient failures can corrupt the data packet and hence undermine the accuracy of data communication. In this paper, we present a comparative analysis of transient fault-tolerant techniques, including end-to-end, node-by-node, and stochastic communication based on the flooding principle.

Keywords: NoC, fault-tolerance, transient faults.

5404 Using Combination of Optimized Recurrent Neural Network with Design of Experiments and Regression for Control Chart Forecasting

Authors: R. Behmanesh, I. Rahimi

Abstract:

A recurrent neural network (RNN) is an efficient tool for modeling production control processes as well as services. In this paper, an RNN was combined with a regression model and employed in order to check whether the data obtained by the model, in comparison with the actual data, are valid for a variable process control chart. A maintenance process in a workshop of the Esfahan Oil Refining Co. (EORC) was taken for illustration of the models. First, a regression model was fitted to predict the response time of the process based upon the determined factors, and then the error between the actual and predicted response times, as output, together with the same factors, as input, were used in the RNN. Finally, the predicted data from the combined model are checked against the test values in statistical process control to determine whether the forecasting efficiency is acceptable. Meanwhile, in the training process of the RNN, a design of experiments was set up so as to optimize the RNN.

Keywords: RNN, DOE, regression, control chart.

5403 Experimental Investigation on Freeze-Concentration Process Desalting for Highly Saline Brines

Authors: H. Al-Jabli

Abstract:

The aim of this paper was to evaluate the use of the freeze-melting process for the disposal of highly saline brines by confirming the performance of the treatment system. A laboratory bench-scale freezing test unit was designed, constructed, and tested at the Doha Research Plant (DRP) in Kuwait. The principal unit operations considered for the laboratory study are ice crystallization, separation, washing, and melting. The applied process is characterized as "secondary-refrigerant indirect freezing", which utilizes the normal freezing concept. High saline brine, with an average TDS of 250,000 ppm, was used as the feed water. Brine from Kuwait desalination plants was used in the experimental study to measure the performance of the proposed treatment system. The experimental analysis shows that the freeze-melting process is capable of reducing the TDS of the feed water from 249,482 ppm to 56,880 ppm over the two phases of the process, with an overall recovery, salt passage, and salt rejection of 31.11%, 19.05%, and 80.95%, respectively. Therefore, the freeze-melting process is promising for the proposed application, as the results show that the process is capable of removing a major amount of the dissolved salts from the high saline brine with a reasonable recovery. This process may be competitive with other brine disposal processes.

Keywords: High saline brine, freeze-melting process, ice crystallization, brine disposal process.

5402 Overriding Moral Intuitions – Does It Make Us Immoral? Dual-Process Theory of Higher Cognition Account for Moral Reasoning

Authors: Michał Białek, Simon J. Handley

Abstract:

Moral decisions are considered an intuitive process, while conscious reasoning is mostly used only to justify those intuitions. This problem is described in several different dual-process theories of mind being developed, e.g., by Frederick and Kahneman, Stanovich, and Evans. These theories have recently evolved into tri-process theories, with a proposed process that makes the ultimate decision or allows paraformal processing with a focal bias. The presented experiment compares the observed decision patterns with the implications of those models. In the presented study, participants (n=179) considered different aspects of the trolley dilemma or its footbridge version and then made their decision. The results show that in the control group 70% of people decided to use the lever to change the tracks for the running trolley, and 20% chose to push the fat man off the bridge. In contrast, after the experimental manipulation almost no one decided to act. The difference in decision times between the dilemmas also disappeared after the experimental manipulation. The results support the idea of three co-working processes: the intuitive (TASS), the paraformal (reflective mind), and the algorithmic process.

Keywords: Moral reasoning, moral decision, reflection, trolley problem, dual-process theory of reasoning, tri-process theory of cognition.

5401 Delay-Distribution-Dependent Stability Criteria for BAM Neural Networks with Time-Varying Delays

Authors: J.H. Park, S. Lakshmanan, H.Y. Jung, S.M. Lee

Abstract:

This paper is concerned with delay-distribution-dependent stability criteria for bidirectional associative memory (BAM) neural networks with time-varying delays. Based on a Lyapunov-Krasovskii functional and a stochastic analysis approach, a delay-probability-distribution-dependent sufficient condition is derived to achieve global asymptotic mean-square stability of the considered BAM neural networks. The criteria are formulated in terms of a set of linear matrix inequalities (LMIs), which can be checked efficiently by standard numerical packages. Finally, a numerical example and its simulation are given to demonstrate the usefulness and effectiveness of the proposed results.
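For context, the class of BAM networks with time-varying delays analysed here is typically written in the following standard form (a generic statement of the model, not copied from the paper):

\[
\begin{aligned}
\dot{x}_i(t) &= -a_i\,x_i(t) + \sum_{j=1}^{m} w_{ji}\, f_j\!\big(y_j(t - \tau(t))\big) + I_i, \\
\dot{y}_j(t) &= -b_j\,y_j(t) + \sum_{i=1}^{n} v_{ij}\, g_i\!\big(x_i(t - \sigma(t))\big) + J_j,
\end{aligned}
\]

where \(\tau(t)\) and \(\sigma(t)\) are the time-varying delays whose probability distribution enters the derived stability condition.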

Keywords: BAM neural networks, Probabilistic time-varying delays, Stability criteria.

5400 Rapid Data Acquisition System for Complex Algorithm Testing in Plastic Molding Industry

Authors: A. Tellaeche, R. Arana

Abstract:

Injection molding is a very complicated process to monitor and control. With its high complexity and many process parameters, the optimization of these systems is a very challenging problem. To meet the requirements and costs demanded by the market, there has been intense development and research aimed at keeping the process under control. This paper outlines the latest advances in the algorithms necessary for the plastic injection process and its monitoring, as well as a flexible data acquisition system that allows rapid implementation of complex algorithms to assess their correct performance and that can be integrated into the quality control process. The latter is the main topic of this paper. Finally, to demonstrate the performance achieved by this combination, a real use case is presented.

Keywords: Plastic injection, machine learning, rapid complex algorithm prototyping.

5399 Parameters Extraction for Pseudomorphic HEMTs Using Genetic Algorithms

Authors: Mazhar B. Tayel, Amr H. Yassin

Abstract:

A small-signal model parameter set for a pseudomorphic high electron mobility transistor (PHEMT) is proposed. Both the extrinsic and intrinsic circuit elements of the small-signal model are determined using a genetic algorithm (GA) as a stochastic global search and optimization tool. The parameter extraction of the small-signal model is performed on a 200-μm gate width AlGaAs/InGaAs PHEMT. The equivalent circuit elements of the proposed 18-element model are determined directly from the measured S-parameters. The GA is used to extract the parameters of the proposed small-signal model from 0.5 GHz up to 18 GHz.
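A bare-bones sketch of the kind of extraction loop described above is given below: a real-coded GA searches for the element values that minimize the error between modelled and measured S-parameters. The model evaluation, population size, and mutation scheme are placeholders; the actual 18-element PHEMT equivalent circuit and the measured data are not reproduced here.

```python
# Minimal real-coded GA sketch for small-signal parameter extraction.
import numpy as np

rng = np.random.default_rng(1)

def s_param_error(params, freqs, s_measured, model):
    """Sum of squared magnitude errors between modelled and measured
    S-parameters over all frequencies ('model' is user supplied)."""
    s_model = np.array([model(params, f) for f in freqs])
    return float(np.sum(np.abs(s_model - s_measured) ** 2))

def ga_extract(bounds, cost, pop_size=40, generations=200,
               mutation=0.1, elite=2):
    """bounds: list of (low, high) per circuit element; cost: objective."""
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    for _ in range(generations):
        fit = np.array([cost(ind) for ind in pop])
        pop = pop[fit.argsort()]                          # best first
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - elite:
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(len(a)) < 0.5, a, b)   # uniform crossover
            mask = rng.random(len(a)) < mutation               # random-reset mutation
            child = np.where(mask, rng.uniform(lo, hi), child)
            children.append(np.clip(child, lo, hi))
        pop = np.vstack([pop[:elite], children])
    fit = np.array([cost(ind) for ind in pop])
    return pop[fit.argmin()]
```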

Keywords: PHEMT, Genetic Algorithms, small signal modeling, optimization.

5398 Losses Analysis in TEP Considering Uncertainity in Demand by DPSO

Authors: S. Jalilzadeh, A. Kimiyaghalam, A. Ashouri

Abstract:

This paper presents a mathematical model and a methodology to analyze the losses in transmission expansion planning (TEP) under uncertainty in demand. The methodology is based on discrete particle swarm optimization (DPSO). DPSO is a useful and powerful stochastic evolutionary algorithm for solving large-scale, discrete, and nonlinear optimization problems such as TEP. The effectiveness of the proposed idea is tested on an actual transmission network of the Azerbaijan regional electric company, Iran. The simulation results show that considering the losses, even for transmission expansion planning of a network with low load growth, decreases the operational costs considerably, and the network satisfies the requirement of delivering electric power more reliably to the load centers.

Keywords: DPSO, TEP, Uncertainty

5397 An Agent-Based Approach to Immune Modelling: Priming Individual Response

Authors: Dimitri Perrin, Heather J. Ruskin, Martin Crane

Abstract:

This study focuses on examining why the range of experience with respect to HIV infection is so diverse, especially in regard to the latency period. An agent-based approach to modelling the infection is used to extract high-level behaviour which cannot be obtained analytically from the set of interaction rules at the cellular level. A prototype model encompasses local variation in baseline properties, contributing to the individual disease experience, and is included in a network which mimics the chain of lymph nodes. The model also accounts for stochastic events such as viral mutations. The size and complexity of the model require major computational effort, and parallelisation methods are used.

Keywords: HIV, Immune modelling, Agent-based system, individual response.

5396 The Development of the Quality Management Processes for the Building and Environment of the Basic Education Schools

Authors: Suppara Charoenpoom

Abstract:

The objective of this research was to design and develop a quality management process for school buildings and environment. A mixed quantitative and qualitative research methodology was used. The population sample included 14 directors of primary schools. Two research tools were used: the first consisted of an in-depth interview and a questionnaire; the second consisted of the Quality Business Process, the Quality Work Procedure, and a Key Performance Indicator for each activity. The statistics used included the mean and standard deviation. The findings on the development of a quality management process for the administration of buildings and environment of basic schools consist of one quality business process (QBP) and seven quality work processes (QWP). The result of the experts' evaluation revealed that the process and implementation of quality management of the school buildings and environment passed the inspection with consensus. This implies that the process of quality management of school buildings and environment is suitable for implementation. Moreover, the level of agreement on the feasibility of implementing this plan had a mean in the range of 0.64-1.00, which suggests that the design of the new plan is acceptable.

Keywords: Process, Building, Environment.
