Search results for: first order plus dead time process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13809

13779 Lagrange and Multilevel Wavelet-Galerkin with Polynomial Time Basis for Heat Equation

Authors: Watcharakorn Thongchuay, Puntip Toghaw, Montri Maleewong

Abstract:

The Wavelet-Galerkin finite element method for solving the one-dimensional heat equation is presented in this work. Two types of basis functions, the Lagrange and the multilevel wavelet bases, are employed to derive the full matrix system. We consider both linear and quadratic bases in the Galerkin method. The time derivative is approximated by a polynomial time basis, which makes it easy to extend the order of approximation in time. Our numerical results show that the rates of convergence for the linear Lagrange and the linear wavelet bases are the same, of order 2, while the rates of convergence for the quadratic Lagrange and the quadratic wavelet bases are approximately of order 4. The results also reveal that the wavelet basis provides an easy way to improve numerical resolution, simply by increasing the desired level in the multilevel construction process.
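
As an illustration of the Galerkin approach above, the sketch below assembles the linear Lagrange mass and stiffness matrices for the 1D heat equation and advances it with backward Euler; the time stepping and all numerical values are illustrative stand-ins, not the paper's polynomial time basis or wavelet construction.

```python
import numpy as np

def heat_fem_linear(nx=50, nt=200, length=1.0, T=0.1, kappa=1.0):
    """Galerkin FEM with linear Lagrange elements for u_t = kappa*u_xx on (0, length),
    u(0,t) = u(length,t) = 0, advanced with backward Euler (illustrative only)."""
    h, dt = length / nx, T / nt
    x = np.linspace(0.0, length, nx + 1)
    # Element mass and stiffness matrices for linear Lagrange elements.
    Me = h / 6.0 * np.array([[2.0, 1.0], [1.0, 2.0]])
    Ke = kappa / h * np.array([[1.0, -1.0], [-1.0, 1.0]])
    M = np.zeros((nx + 1, nx + 1))
    K = np.zeros_like(M)
    for e in range(nx):                      # assemble global matrices
        idx = np.array([e, e + 1])
        M[np.ix_(idx, idx)] += Me
        K[np.ix_(idx, idx)] += Ke
    u = np.sin(np.pi * x / length)           # initial condition
    A = (M + dt * K)[1:nx, 1:nx]             # homogeneous Dirichlet: interior nodes only
    for _ in range(nt):
        u[1:nx] = np.linalg.solve(A, M[1:nx, 1:nx] @ u[1:nx])
    return x, u

x, u = heat_fem_linear()
u_exact = np.exp(-np.pi ** 2 * 0.1) * np.sin(np.pi * x)
print("max nodal error:", np.abs(u - u_exact).max())
```

Refining the mesh (or switching to quadratic elements) is the kind of experiment behind the order-2 versus order-4 convergence comparison reported above.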

Keywords: Galerkin finite element method, Heat equation, Lagrange basis function, Wavelet basis function.

13778 A Review in Advanced Digital Signal Processing Systems

Authors: Roza Dastres, Mohsen Soori

Abstract:

Digital Signal Processing (DSP) is the use of digital processing systems by computers in order to perform a variety of signal processing operations. It is the mathematical manipulation of a digital signal's numerical values in order to increase the quality as well as the effects of signals. DSP can include linear or nonlinear operators in order to process and analyze the input signals. Nonlinear DSP is closely related to nonlinear system detection and can be implemented in the time, frequency, and space-time domains. Applications of DSP include control systems, digital image processing, biomedical engineering, speech recognition systems, industrial engineering, health care systems, radar signal processing, and telecommunication systems. In this study, advanced methods and different applications of DSP are reviewed in order to advance this interesting research field.

Keywords: Digital signal processing, advanced telecommunication, nonlinear signal processing, speech recognition systems.

13777 Analysis of Acoustic Emission Signal for the Detection of Defective Manufactures in Press Process

Authors: Dong Hun Kim, Won Kyu Lee, Sok Won Kim

Abstract:

Small cracks or chips on a product appear very frequently in the course of continuous production in an automatic press process system. These phenomena cause not only defective products but also damage to the press mold. In order to solve this problem, an acoustic emission (AE) system was introduced, which was expected to be very effective for real-time detection of defective products and for preventing damage to the press molds. In this study, AE sensors, a pre-amplifier, and an analysis and processing board were used to pick up and analyze the AE signals generated in the press process, as frequently found in other similar cases. Specialized software called cdm8 was used to analyze and process the AE signals picked up in real time from good and bad products. As a result of this work, it was confirmed that the intensity and shape of the various AE signals differ depending on the weight and thickness of the metal sheet and on the process type.

Keywords: press, acoustic emission, signal processing

13776 A Distributed Cryptographically Generated Address Computing Algorithm for Secure Neighbor Discovery Protocol in IPv6

Authors: M. Moslehpour, S. Khorsandi

Abstract:

Due to the shortage of IPv4 addresses, the transition to IPv6 has gained significant momentum in recent years. Like the Address Resolution Protocol (ARP) in IPv4, the Neighbor Discovery Protocol (NDP) provides functions such as address resolution in IPv6. Despite its functionality, NDP is vulnerable to several attacks. To mitigate these attacks, Internet Protocol Security (IPsec) was introduced, but it was not efficient due to its limitations. Therefore, the SEND protocol was proposed to automatically protect the auto-configuration process; it secures the neighbor discovery and address resolution processes. To defend against threats to NDP’s integrity and identity, SEND uses Cryptographically Generated Addresses (CGA) and asymmetric cryptography. Besides the advantages of SEND, its disadvantages, such as the computational cost of the CGA algorithm and the sequential nature of the CGA generation algorithm, are considerable. In this paper, we parallelize this process among network resources in order to improve it. In addition, we compare the CGA generation time in self-computing and distributed-computing processes, and we focus on the impact of malicious nodes on the CGA generation time in the network. According to the results, although malicious nodes participate in the generation process, the CGA generation time is less than when it is computed in a stand-alone way. With a trust management system, detecting and isolating malicious nodes becomes easier.
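
The costly step being parallelized is essentially a brute-force search for a modifier whose hash has enough leading zero bits. The sketch below distributes that search over worker processes; it is a simplified stand-in for RFC 3972 CGA generation, and the hash-input layout, key format and Sec value are illustrative assumptions rather than the authors' implementation.

```python
import hashlib
import os
from multiprocessing import Pool

SEC = 1  # security parameter: require 16*SEC leading zero bits (RFC 3972 style)

def try_range(args):
    """Scan one chunk of the modifier space for a value whose SHA-1 digest
    starts with the required number of zero bytes (simplified hash2 check)."""
    start, count, pubkey = args
    zero_bytes = 2 * SEC
    for m in range(start, start + count):
        modifier = m.to_bytes(16, "big")
        digest = hashlib.sha1(modifier + b"\x00" * 9 + pubkey).digest()
        if digest[:zero_bytes] == b"\x00" * zero_bytes:
            return modifier
    return None

def find_modifier_parallel(pubkey, workers=4, chunk=200_000):
    """Distribute the modifier search over several processes, mimicking the
    distributed CGA computation discussed in the abstract."""
    start = 0
    with Pool(workers) as pool:
        while True:
            jobs = [(start + i * chunk, chunk, pubkey) for i in range(workers)]
            for result in pool.map(try_range, jobs):
                if result is not None:
                    return result
            start += workers * chunk

if __name__ == "__main__":
    pubkey = os.urandom(64)  # stand-in for an encoded public key
    print("modifier found:", find_modifier_parallel(pubkey).hex())
```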

Keywords: NDP, IPsec, SEND, CGA, Modifier, Malicious node, Self-Computing, Distributed-Computing.

13775 Expert System for Sintering Process Control Based on the Information about Solid-Fuel Flow Composition

Authors: Yendiyarov Sergei, Zobnin Boris, Petrushenko Sergei

Abstract:

Usually, the solid-fuel flow of an iron ore sinter plant consists of several types of solid fuel, which differ from each other. Information about the composition of the solid-fuel flow usually arrives only every 8-24 hours, so it clearly cannot be used to control the sintering process in real time. For this reason, we propose an expert system which uses indirect measurements from the process in order to obtain the composition of the solid-fuel flow by solving an optimization task. This information can then be used to control the sintering process. The proposed technique can be successfully used to improve sinter quality and reduce the amount of solid fuel used by the process.
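
One way to recover the blend composition from indirect measurements is a non-negative least-squares fit of measured blend properties against the known properties of each fuel type. The sketch below illustrates that idea; the property matrix, the measurement vector, and the use of plain NNLS instead of the particle swarm optimizer named in the keywords are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import nnls

# Columns: assumed fixed-carbon / ash / volatiles fractions of three solid-fuel
# types (invented numbers); b holds the same quantities measured indirectly for
# the blended flow. The blend fractions x solve A x ~ b with x >= 0.
A = np.array([[0.80, 0.60, 0.70],    # fixed carbon
              [0.12, 0.30, 0.18],    # ash
              [0.08, 0.10, 0.12]])   # volatiles
b = np.array([0.72, 0.18, 0.09])

x, residual = nnls(A, b)
x = x / x.sum()                      # normalise to mass fractions
print("estimated blend composition:", np.round(x, 3), "residual:", round(residual, 4))
```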

Keywords: sintering process, particle swarm optimization, optimal control, expert system, solid-fuel

13774 Object-Centric Process Mining Using Process Cubes

Authors: Anahita Farhang Ghahfarokhi, Alessandro Berti, Wil M.P. van der Aalst

Abstract:

Process mining provides ways to analyze business processes. Common process mining techniques consider the process as a whole. However, in real-life business processes, different behaviors exist that make the overall process too complex to interpret. Process comparison is a branch of process mining that isolates different behaviors of the process from each other by using process cubes. Process cubes organize event data using different dimensions. Each cell contains a set of events that can be used as an input to apply process mining techniques. Existing work on process cubes assumes a single case notion. However, in real processes, several case notions (e.g., order, item, package) are intertwined. Object-centric process mining is a new branch of process mining addressing multiple case notions in a process. To build a bridge between object-centric process mining and process comparison, we propose a process cube framework which supports process cube operations such as slice and dice on object-centric event logs. To facilitate the comparison, the framework is integrated with several object-centric process discovery approaches.
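
A minimal illustration of the slice and dice operations on an object-centric event log, using a pandas data frame as the cube; the columns, values and the chosen case notion are invented for demonstration and are not part of the paper's framework.

```python
import pandas as pd

# A toy object-centric event log; column names and values are invented.
log = pd.DataFrame({
    "event":    ["e1", "e2", "e3", "e4", "e5", "e6"],
    "activity": ["create order", "pick item", "pick item", "pack", "ship", "pack"],
    "order":    ["o1", "o1", "o1", "o1", "o1", "o2"],
    "item":     [None, "i1", "i2", "i1", None, "i3"],
    "month":    ["Jan", "Jan", "Feb", "Feb", "Feb", "Feb"],
})

# "Dice": keep only the cells belonging to February.
feb = log[log["month"] == "Feb"]

# "Slice" along the case-notion dimension: one sub-log per order object,
# each of which could be handed to a discovery algorithm.
for order_id, cell in feb.groupby("order"):
    print(order_id, "->", list(cell["activity"]))
```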

Keywords: Process mining, multidimensional process mining, multi-perspective business processes, OLAP, process cubes, process discovery.

13773 Evaluation of a PSO Approach for Optimum Design of First-Order Controllers for TCP/AQM Systems

Authors: Sana Testouri, Karim Saadaoui, Mohamed Benrejeb

Abstract:

This paper presents a Particle Swarm Optimization (PSO) method for determining the optimal parameters of a first-order controller for a TCP/AQM system. The TCP/AQM model is described by a second-order system with time delay. First, an analytical approach, based on the D-decomposition method and Kharitonov's lemma, is used to determine the stabilizing regions of a first-order controller. Second, the optimal parameters of the controller are obtained by the PSO algorithm. Finally, the proposed method is implemented in the network simulator NS-2 and compared with a PI controller.

Keywords: AQM, first-order controller, time delay, stability, PSO.

13772 Reduction of Linear Time-Invariant Systems Using Routh-Approximation and PSO

Authors: S. Panda, S. K. Tomar, R. Prasad, C. Ardil

Abstract:

Order reduction of linear time-invariant systems using two methods, one based on Routh approximation and the other on an evolutionary technique, is presented in this paper. In the Routh approximation method, the denominator of the reduced-order model is obtained using Routh approximation, while the numerator is determined using the indirect approach of retaining the time moments and/or Markov parameters of the original system. With this method, the reduced-order model is guaranteed to be stable if the original high-order model is stable. In the second method, Particle Swarm Optimization (PSO) is employed to reduce the higher-order model. The PSO method is based on minimizing the Integral Squared Error (ISE) between the transient responses of the original higher-order model and the reduced-order model for a unit step input. Both methods are illustrated through numerical examples.
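
A minimal sketch of the second (PSO) branch: a global-best particle swarm searches for second-order model parameters that minimize the ISE between the unit-step responses of the full and reduced models. The example transfer function, parameter bounds and swarm settings are illustrative assumptions, not the paper's test systems.

```python
import numpy as np
from scipy import signal

# An illustrative stable 4th-order system to be reduced (not from the paper).
num_full, den_full = [1.0, 8.0, 24.0, 24.0], [1.0, 10.0, 35.0, 50.0, 24.0]
t = np.linspace(0.0, 10.0, 500)
_, y_full = signal.step(signal.lti(num_full, den_full), T=t)

def ise(params):
    """ISE between step responses of the full model and a 2nd-order candidate
    G_r(s) = (b1*s + b0) / (s^2 + a1*s + a0)."""
    b1, b0, a1, a0 = params
    if a1 <= 0.0 or a0 <= 0.0:                      # crude stability guard
        return 1e6
    _, y_red = signal.step(signal.lti([b1, b0], [1.0, a1, a0]), T=t)
    return np.sum((y_full - y_red) ** 2) * (t[1] - t[0])

def pso(obj, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Plain global-best PSO (a sketch, not the authors' implementation)."""
    rng = np.random.default_rng(0)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, (n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([obj(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([obj(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, obj(gbest)

bounds = np.array([[0.0, 5.0], [0.0, 5.0], [0.1, 20.0], [0.1, 20.0]])
best, best_ise = pso(ise, bounds)
print("reduced-model parameters:", np.round(best, 3), " ISE:", best_ise)
```

The Routh-approximation branch would instead build the reduced denominator from the Routh table, which preserves stability without any search.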

Keywords: Model Order Reduction, Markov Parameters, Routh Approximation, Particle Swarm Optimization, Integral Squared Error, Steady State Stability.

13771 Faults Forecasting System

Authors: Hanaa E.Sayed, Hossam A. Gabbar, Shigeji Miyazaki

Abstract:

This paper presents a Faults Forecasting System (FFS) that utilizes statistical forecasting techniques to analyze process variable data in order to forecast fault occurrences. FFS proposes a new idea for detecting faults. Current fault detection techniques are based on analyzing the current status of the system variables in order to check whether the current status is faulty or not. FFS instead uses forecasting techniques to predict the future timing of faults before they happen. The proposed model applies a subset modeling strategy and a Bayesian approach in order to decrease the dimensionality of the process variables and improve fault forecasting accuracy. A practical experiment, designed and implemented at Okayama University, Japan, shows that the proposed model achieves high forecasting accuracy ahead of time.

Keywords: Bayesian Techniques, Faults Detection, Forecasting techniques, Multivariate Analysis.

13770 Time Comparative Simulator for Distributed Process Scheduling Algorithms

Authors: Nazleeni Samiha Haron, Anang Hudaya Muhamad Amin, Mohd Hilmi Hasan, Izzatdin Abdul Aziz, Wirdhayu Mohd Wahid

Abstract:

In any distributed system, process scheduling plays a vital role in determining the efficiency of the system. Process scheduling algorithms are used to ensure that the components of the system maximize their utilization and complete all assigned processes in a specified period of time. This paper focuses on the development of a comparative simulator for distributed process scheduling algorithms. The objectives of the work carried out include the development of the comparative simulator, as well as a comparative study between three distributed process scheduling algorithms: sender-initiated, receiver-initiated, and hybrid sender-receiver-initiated algorithms. The comparative study was done based on the Average Waiting Time (AWT) and Average Turnaround Time (ATT) of the processes involved. The simulation results show that the performance of the algorithms depends on the number of nodes in the system.
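
For reference, AWT and ATT are just averages of each job's waiting delay and completion delay. The toy simulation below dispatches arriving jobs to the least-loaded of several nodes and reports both metrics; the dispatch rule and the arrival and service distributions are placeholders, not the sender-initiated, receiver-initiated or hybrid algorithms compared in the paper.

```python
import random

def simulate_dispatch(n_jobs=1000, n_nodes=4, seed=1):
    """Dispatch each arriving job to the node that becomes free earliest and
    accumulate Average Waiting Time (AWT) and Average Turnaround Time (ATT)."""
    rng = random.Random(seed)
    node_free_at = [0.0] * n_nodes
    clock = total_wait = total_turnaround = 0.0
    for _ in range(n_jobs):
        clock += rng.expovariate(1.0)                  # next job arrival
        service = rng.expovariate(0.5)                 # job service demand
        node = min(range(n_nodes), key=lambda i: node_free_at[i])
        start = max(clock, node_free_at[node])
        node_free_at[node] = start + service
        total_wait += start - clock                    # time spent queued
        total_turnaround += (start + service) - clock  # queueing + service
    return total_wait / n_jobs, total_turnaround / n_jobs

awt, att = simulate_dispatch()
print(f"AWT = {awt:.2f}, ATT = {att:.2f}")
```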

Keywords: Distributed Systems, Load Sharing, Process Scheduling, AWT and ATT

13769 Inventory Control for a Joint Replenishment Problem with Stochastic Demand

Authors: Bassem Roushdy, Nahed Sobhy, Abdelrhim Abdelhamid, Ahmed Mahmoud

Abstract:

Most papers model the Joint Replenishment Problem (JRP) as a (kT, S) policy, where kT is a multiple of a common review period T and S is a predefined order-up-to level. In general, the (T, S) policy is characterized by a long out-of-control period, which requires a large amount of safety stock compared to the (R, Q) policy. In this paper a probabilistic model is built in which an item, call it item (i), with the shortest order time between intervals (T), is modeled under an (R, Q) policy and its inventory is continuously reviewed, while the remaining items (j) are periodically reviewed at a definite time corresponding to item
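
To make the policy ingredients concrete, the sketch below simulates a single item under a continuous-review (R, Q) policy with Poisson daily demand and a fixed lead time and reports the resulting fill rate; the parameter values are illustrative and the joint-replenishment coupling between items studied in the paper is ignored.

```python
import numpy as np

def simulate_rq(R=20, Q=50, days=10_000, lead_time=5, demand_mean=4.0, seed=0):
    """Continuous-review (R, Q) policy: whenever the inventory position falls
    to R or below, an order of size Q is placed and arrives lead_time days
    later (a single-item sketch, not the paper's joint-replenishment model)."""
    rng = np.random.default_rng(seed)
    on_hand = R + Q
    pipeline = []                                            # (arrival_day, quantity)
    lost = total = 0
    for day in range(days):
        on_hand += sum(q for d, q in pipeline if d == day)   # receive due orders
        pipeline = [(d, q) for d, q in pipeline if d > day]
        demand = rng.poisson(demand_mean)
        total += demand
        sold = min(on_hand, demand)
        lost += demand - sold
        on_hand -= sold
        position = on_hand + sum(q for _, q in pipeline)     # on hand + on order
        if position <= R:
            pipeline.append((day + lead_time, Q))
    return 1.0 - lost / total                                # fill rate

print(f"fill rate under (R, Q): {simulate_rq():.3f}")
```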

Keywords: Inventory management, Joint replenishment, policy evaluation, stochastic process

13768 Forecasting the Volatility of Geophysical Time Series with Stochastic Volatility Models

Authors: Maria C. Mariani, Md Al Masum Bhuiyan, Osei K. Tweneboah, Hector G. Huizar

Abstract:

This work is devoted to the study of modeling geophysical time series. A stochastic technique with time-varying parameters is used to forecast the volatility of data arising in geophysics. In this study, the volatility is defined as a logarithmic first-order autoregressive process. We observe that the inclusion of log-volatility in the time-varying parameter estimation significantly improves forecasting, which is facilitated via maximum likelihood estimation. This allows us to conclude that the estimation algorithm for the corresponding one-step-ahead suggested volatility (with ±2 standard prediction errors) is very feasible since it possesses good convergence properties.
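
In the model class named here, the log-volatility follows a first-order autoregressive process. The short simulation below generates returns from such a model so the structure is explicit; the parameter values are arbitrary illustrations and no estimation step is shown.

```python
import numpy as np

def simulate_sv(n=2000, mu=-1.0, phi=0.95, sigma_eta=0.2, seed=42):
    """Stochastic volatility with AR(1) log-volatility:
       h_t = mu + phi*(h_{t-1} - mu) + eta_t,   y_t = exp(h_t / 2) * eps_t."""
    rng = np.random.default_rng(seed)
    h = np.empty(n)
    h[0] = mu
    for t in range(1, n):
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.standard_normal()
    y = np.exp(h / 2.0) * rng.standard_normal(n)
    return y, h

y, h = simulate_sv()
print("sample standard deviation of the simulated series:", y.std())
```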

Keywords: Augmented Dickey Fuller Test, geophysical time series, maximum likelihood estimation, stochastic volatility model.

13767 Dynamic Load Balancing in PVM Using Intelligent Application

Authors: Kashif Bilal, Tassawar Iqbal, Asad Ali Safi, Nadeem Daudpota

Abstract:

This paper deals with dynamic load balancing using PVM. In a distributed environment, load balancing and heterogeneity are very critical issues that need to be examined in depth in order to achieve optimal results and efficiency. Various techniques are used to distribute the load dynamically among different nodes and to deal with heterogeneity. These techniques follow different approaches in which process migration is the basic concept, with different optimal flavors. But process migration is not an easy job; it imposes a heavy burden and processing effort in order to track each process on the nodes. We propose a dynamic load balancing technique in which the application intelligently balances the load among different nodes, resulting in efficient use of the system without the overheads of process migration. It also provides a simple solution to the problem of load balancing in a heterogeneous environment.

Keywords: PVM, load balancing, task allocation, intelligent application.

13766 Statistical Analysis of First Order Plus Dead-time System using Operational Matrix

Authors: Pham Luu Trung Duong, Moonyong Lee

Abstract:

To increase the precision and reliability of automatic control systems, we have to take into account the random factors affecting the control system. Thus, an operational matrix technique is used for the statistical analysis of a first-order plus time delay system with a uniformly distributed random parameter. Examples with deterministic and stochastic disturbances are considered to demonstrate the validity of the method. A comparison with the Monte Carlo method is made to show the computational effectiveness of the method.
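
The Monte Carlo reference against which the operational-matrix result is compared can be sketched directly: sample the first order plus dead-time parameters from uniform distributions and average the resulting step responses. The nominal ranges below are illustrative assumptions, not the values used in the paper.

```python
import numpy as np

def fopdt_step(t, K, tau, theta):
    """Unit-step response of G(s) = K * exp(-theta*s) / (tau*s + 1)."""
    y = np.zeros_like(t)
    active = t >= theta
    y[active] = K * (1.0 - np.exp(-(t[active] - theta) / tau))
    return y

rng = np.random.default_rng(0)
t = np.linspace(0.0, 20.0, 400)
# Uniformly distributed gain, time constant and dead time (illustrative ranges).
runs = np.array([fopdt_step(t,
                            K=rng.uniform(0.9, 1.1),
                            tau=rng.uniform(1.8, 2.2),
                            theta=rng.uniform(0.4, 0.6))
                 for _ in range(5000)])
mean, std = runs.mean(axis=0), runs.std(axis=0)
i = np.searchsorted(t, 5.0)
print(f"output at t = 5: mean = {mean[i]:.3f}, std = {std[i]:.4f}")
```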

Keywords: First order plus dead-time, Operational matrix, Statistical analysis, Walsh function.

13765 Methods for Material and Process Monitoring by Characterization of (Second and Third Order) Elastic Properties with Lamb Waves

Authors: R. Meier, M. Pander

Abstract:

In accordance with the industry 4.0 concept, manufacturing process steps as well as the materials themselves are going to be more and more digitalized within the next years. The "digital twin", representing the simulated and measured dataset of the (semi-finished) product, can be used to control and optimize the individual processing steps and helps to reduce costs and expenditure of time in product development, manufacturing, and recycling. In the present work, two material characterization methods based on Lamb waves were evaluated and compared. For demonstration purposes, both methods were applied to a standard industrial product - copper ribbons, often used in photovoltaic modules as well as in high-current microelectronic devices. By numerical approximation of the Rayleigh-Lamb dispersion model on measured phase velocities, the second order elastic constants (Young’s modulus, Poisson’s ratio) were determined. Furthermore, the effective third order elastic constants were evaluated by applying elastic, "non-destructive", mechanical stress on the samples. In this way, small microstructural variations due to mechanical preconditioning could be detected for the first time. Both methods were compared with respect to precision and inline application capabilities. The microstructure of the samples was systematically varied by mechanical loading and annealing. Changes in the elastic ultrasound transport properties were correlated with results from microstructural analysis and mechanical testing. In summary, monitoring the elastic material properties of plate-like structures using Lamb waves is valuable for inline and non-destructive material characterization and manufacturing process control. Second order elastic constants analysis is robust over wide environmental and sample conditions, whereas the effective third order elastic constants highly increase the sensitivity with respect to small microstructural changes. Both Lamb wave based characterization methods fit perfectly into the industry 4.0 concept.

Keywords: Lamb waves, industry 4.0, process control, elasticity, acoustoelasticity.

13764 The Evaluation of Production Line Performance by Using ARENA – A Case Study

Authors: Muhammad Marsudi, Hani Shafeek

Abstract:

The purpose of this paper is to simulate the production process of a metal stamping industry and to evaluate the utilization of the production line by using the ARENA simulation software. The process time and the standard time for each process on the production line are obtained from data given by the company management. Other data are collected through direct observation of the line. There are three workstations performing ten different types of processes in order to produce a single product type. An Arena simulation model is then developed based on the collected data. Verification and validation are performed on the Arena model, and finally the results of the Arena simulation are analyzed. It is found that the utilization at each workstation increases if the batch size is increased while the throughput rate is kept constant. This study is very useful for the company because the company needs to improve the efficiency and utilization of its production lines.

Keywords: Arena software, case study, production line, utilization.

13763 Numerical Studies of Galerkin-type Time-discretizations Applied to Transient Convection-diffusion-reaction Equations

Authors: Naveed Ahmed, Gunar Matthies

Abstract:

We deal with the numerical solution of time-dependent convection-diffusion-reaction equations. We combine the local projection stabilization method for the space discretization with two different time discretization schemes: the continuous Galerkin-Petrov (cGP) method and the discontinuous Galerkin (dG) method with polynomials of degree k. We establish optimal error estimates and present numerical results which show that the cGP(k) and dG(k) methods are accurate of order k+1 in the whole time interval. Moreover, the cGP(k) method is superconvergent of order 2k and the dG(k) method of order 2k+1 at the discrete time points. Furthermore, the dependence of the results on the choice of the stabilization parameter is discussed and compared.

Keywords: Convection-diffusion-reaction equations, stabilized finite elements, discontinuous Galerkin, continuous Galerkin-Petrov.

13762 Genetic Algorithm and Padé-Moment Matching for Model Order Reduction

Authors: Shilpi Lavania, Deepak Nagaria

Abstract:

A mixed method for model order reduction is presented in this paper. The denominator polynomial is derived by matching both Markov parameters and time moments, whereas the numerator polynomial derivation and error minimization are done using a Genetic Algorithm. The efficiency of the proposed method is investigated in terms of the closeness of the response of the reduced-order model to that of the higher-order original model, as well as a comparison of the integral square error.
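
The quantities matched when forming the reduced denominator can be computed from a state-space realization: time moments come from the expansion of G(s) about s = 0 and Markov parameters from the expansion about s = infinity. The sketch below computes both for an example transfer function; the example system is an assumption, and the GA-based numerator fit is not shown.

```python
import numpy as np
from scipy import signal

def moments_and_markov(num, den, k=4):
    """Time moments and Markov parameters of a strictly proper G(s) = num/den.
    About s = 0:        G(s) = -sum_i C A^-(i+1) B s^i     (time moments)
    About s = infinity: G(s) =  sum_i C A^i B s^-(i+1)     (Markov parameters)"""
    A, B, C, _ = signal.tf2ss(num, den)
    Ainv = np.linalg.inv(A)
    moments = [(-C @ np.linalg.matrix_power(Ainv, i + 1) @ B).item() for i in range(k)]
    markov = [(C @ np.linalg.matrix_power(A, i) @ B).item() for i in range(k)]
    return moments, markov

num, den = [1.0, 8.0, 24.0, 24.0], [1.0, 10.0, 35.0, 50.0, 24.0]
moments, markov = moments_and_markov(num, den)
print("time moments:", np.round(moments, 4))
print("Markov parameters:", np.round(markov, 4))
```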

Keywords: Model Order Reduction (MOR), control theory, Markov parameters, time moments, genetic algorithm, Single Input Single Output (SISO).

13761 Redesigning Business Processes: A Method Based on Simulation and Process Mining Techniques

Authors: Zahra Mohammadnazari, Fateme Rostambeygi, Fatemeh Dehrouyeh, Hwang Ki-Soon, Amir Aghsami

Abstract:

Corporations have always prioritized efforts to examine and improve processes. Various metrics, such as the cost and time required to execute the process, can be specified in this regard. Process improvement can be defined as an improvement of these indicators, accomplished by looking at prospective adjustments to the current executive process model or the resources allotted to it. The research in this paper aims to improve the procurement process and to explore assessment prospects in the project using a combination of process mining and simulation (benefiting from the Play-In and Play-Out methodologies). To run the simulation, we need to complete the control flow diagram, the institution settings, the resource settings, and the activity settings. Mining the event logs yields the process control flow; however, both the entry of institutions and the distribution of resources must be modeled. The rate of admission of institutions and the distribution of times for implementing the activities are determined in the next step.

Keywords: Business reengineering, Petri net, process-based simulation, process mining.

13760 Order Optimization of a Telecommunication Distribution Center through Service Lead Time

Authors: Tamás Hartványi, Ferenc Tóth

Abstract:

The performance of a European telecommunication distribution center is measured by service lead time and quality. The operation model is CTO (customized to order), namely a high-mix customization of telecommunication network equipment and parts. The CTO operation consists of material receiving, warehousing, and network and server assembly to order, configured based on customer specifications. The variety of products and orders does not support a mass production structure. One of the success factors in satisfying the customer is to have a proper aggregated planning method for the operation in order to have optimized human resources and highly efficient asset utilization. The research investigates several methods and finds a proper way to build an order book simulation, where the practical optimization problem may contain thousands of variables; the simulation running times of the developed algorithms were taken into account with high importance. Two operations research models were developed: in the first, customer demand is given in orders and there is no changeover time; in the second, customer demands are given for product types and the changeover time is constant.

Keywords: CTO, aggregated planning, demand simulation, changeover time.

13759 Gaze Patterns of Skilled and Unskilled Sight Readers Focusing on the Cognitive Processes Involved in Reading Key and Time Signatures

Authors: J. F. Viljoen, Catherine Foxcroft

Abstract:

Expert sight readers rely on their ability to recognize patterns in scores, their inner hearing, and their prediction skills in order to perform complex sight reading exercises. They also have the ability to observe deviations from expected patterns in musical scores. This increases the "eye-hand span" (reading ahead of the point of playing) in order to process the elements in the score. The study aims to investigate the gaze patterns of expert and non-expert sight readers focusing on key and time signatures. Twenty musicians were tasked with playing 12 sight reading examples composed for one hand and five examples composed for two hands on a piano keyboard. These examples were composed in different keys and time signatures and included accidentals and changes of time signature to test this theory. Results showed that the experts fixate more often and for longer on key and time signatures, as well as on deviations, in the examples for two hands than the non-expert group. The inverse was true for the examples for one hand, where expert sight readers showed fewer and shorter fixations on key and time signatures as well as deviations. This seems to suggest that experts focus more on the key and time signatures as well as deviations in complex scores to facilitate sight reading. The examples written for one hand appeared to be too easy for the expert sight readers, compromising gaze patterns.

Keywords: Cognition, eye tracking, musical notation, sight reading.

13758 Methods for Business Process Simulation Based on Petri Nets

Authors: K. Shoylekova, K. Grigorova

Abstract:

Petri nets are the first standard for business process modeling. This is most probably one of the core reasons why all new standards created afterwards have to be reformulated to the point where the new standard can be mapped onto Petri nets. The paper presents a business process repository based on a universal database. The repository provides the possibility to store data about a given process in three different ways. The business process repository is developed with regard to transforming a given model into a Petri net so that it can be easily simulated. Two different techniques for business process simulation based on Petri nets, Yasper and Woflan, are discussed, and their advantages and drawbacks are outlined. The way of simulating business process models stored in the business process repository is shown.

Keywords: Business process repository, Petri nets, Simulation, Woflan, Yasper.

13757 Identifying Significant Factors of Brick Laying Process through Design of Experiment and Computer Simulation: A Case Study

Authors: M. H. Zarei, A. Nikakhtar, A. H. Roudsari, N. Madadi, K. Y. Wong

Abstract:

Improving performance measures in construction processes has been a major concern for managers and decision makers in the industry. They seek ways to recognize the key factors which have the largest effect on the process. Identifying such factors can guide them to focus on the right parts of the process in order to obtain the best possible result. In the present study, design of experiments (DOE) has been applied to a computer simulation model of the brick laying process to determine significant factors, with productivity chosen as the response of the experiment. To this end, four controllable factors and their interactions were examined, and the best level was calculated for each factor. The results indicate that three factors, namely, the labor of brick, the labor of mortar, and the inter-arrival time of mortar, along with the interaction of the labor of brick and the labor of mortar, are significant.
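
A stripped-down version of the analysis: run a two-level full factorial over coded factors and estimate each main effect, and the brick-mortar interaction, as the difference between mean responses at the high and low levels. The response function and factor coding below are invented stand-ins for the simulation model used in the study.

```python
import numpy as np
from itertools import product

factors = ["brick_labor", "mortar_labor", "mortar_interarrival"]
rng = np.random.default_rng(1)

def productivity(x):
    """Synthetic productivity response over coded factor levels (-1 / +1)."""
    b, m, a = x
    return 50 + 4*b + 3*m - 2*a + 1.5*b*m + rng.normal(0, 0.5)

design = np.array(list(product([-1, 1], repeat=3)), dtype=float)  # 2^3 runs
y = np.array([productivity(row) for row in design])

def effect(column):
    """Mean response at the high level minus mean response at the low level."""
    return y[column > 0].mean() - y[column < 0].mean()

for name, col in zip(factors, design.T):
    print(f"main effect of {name}: {effect(col):+.2f}")
print(f"brick x mortar interaction: {effect(design[:, 0] * design[:, 1]):+.2f}")
```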

Keywords: Brick laying process, computer simulation, design of experiment, significant factors.

13756 Production of Biodiesel from Roasted Chicken Fat and Methanol: Free Catalyst

Authors: Jorge Ramírez-Ortiz, Merced Martínez Rosales, Horacio Flores Zúñiga

Abstract:

Catalyst-free transesterification reactions between roasted chicken fat and methanol were carried out in a batch reactor at temperatures from 120°C to 140°C in order to produce biodiesel. Parameters related to the transesterification reactions, including temperature, time, and the molar ratio of chicken fat to methanol, were also investigated. The maximum yield of the reaction was 98% under conditions of 140°C, 4 h of reaction time, and a molar ratio of chicken fat to methanol of 1:31. The biodiesel thus obtained exhibited a viscosity of 6.3 mm2/s and a density of 895.9 kg/m3. The results showed that this process can be a suitable choice for producing biodiesel since it does not use any catalyst; therefore, the neutralization and washing steps, which are indispensable in alkaline catalysis, are avoided.

Keywords: Biodiesel, non-catalyst, roasted chicken fat, transesterification.

13755 Introducing Fast Robot Roller Hemming Process in Automotive Industry

Authors: Babak Saboori, Behzad Saboori, Johan S. Carlson, Rikard Söderberg

Abstract:

As product life cycles become shorter every day, flexible manufacturing processes seem more in demand for any company. In the assembly of closures, i.e., the opening parts of the car body, the hemming process is the one which needs the most attention. This paper focuses on the robot roller hemming process and how to reduce its cycle time by introducing a fast roller hemming process. A robot roller hemming process for the tailgate of the Saab 9-3 SportCombi model is investigated as a case study in this paper. By applying task separation, robot coordination, and robot cell configuration principles to the roller hemming process, three alternatives are proposed and developed, and a remarkable reduction in cycle times is achieved [1].

Keywords: Cell configuration, cycle time, robot coordination, roller hemming.

13754 Additive Friction Stir Manufacturing Process: Interest in Understanding Thermal Phenomena and Numerical Modeling of the Temperature Rise Phase

Authors: A. Lauvray, F. Poulhaon, P. Michaud, P. Joyot, E. Duc

Abstract:

Additive Friction Stir Manufacturing, or AFSM, is a new industrial process that follows the emergence of friction-based processes. The AFSM process is a solid-state additive process using the energy produced by the friction at the interface between a rotating non-consumable tool and a substrate. Friction depends on various parameters like axial force, rotation speed or friction coefficient. The feeder material is a metallic rod that flows through a hole in the tool. There is still a lack of understanding of the physical phenomena taking place during the process. This research aims at a better understanding and implementation of the AFSM process, thanks to numerical simulation and experimental validation performed on a prototype effector. Such an approach is considered a promising way to study the influence of the process parameters and to finally identify a process window that seems relevant. The deposition of material through the AFSM process takes place in several phases; in chronological order these phases are the docking phase, the dwell time phase, the deposition phase, and the removal phase. The present work focuses on the dwell time phase, which brings about the temperature rise of the system due to pure friction. An analytical model of the frictional heat generation considers the rotational speed and the contact pressure as its main parameters. Another influential parameter is the friction coefficient, which is assumed to be variable due to the self-lubrication of the system as the temperature rises, or to the smoothing of the roughness of the materials in contact over time. Through numerical modeling followed by experimental validation, this study questions the influence of the various input parameters on the dwell time phase. Rotation speed, temperature, spindle torque and axial force are the main parameters monitored during the experiments and serve as reference data for the calibration of the numerical model. This research shows that the geometry of the tool as well as fluctuations of the input parameters like axial force and rotational speed are very influential on the temperature reached and/or the time required to reach the targeted temperature. The main outcome is the prediction of a process window, which is a key result for a more efficient process implementation.
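
For orientation, the heat generated by pure friction during the dwell phase scales with the friction coefficient, the axial force, the spindle speed and an effective contact radius. The back-of-the-envelope estimate below uses that relation with invented numbers; it is not calibrated against the paper's measurements and it ignores conduction into the substrate and tool.

```python
import math

# All values are illustrative assumptions, not parameters reported in the paper.
mu = 0.3        # friction coefficient, assumed constant here
F = 2000.0      # axial force, N
rpm = 400.0     # spindle speed, rev/min
r_eff = 0.006   # effective contact radius, m (about 2/3 of a flat tool radius)
m = 0.05        # heated mass under the tool, kg
c_p = 900.0     # specific heat of that mass, J/(kg*K)

omega = 2.0 * math.pi * rpm / 60.0   # angular speed, rad/s
q = mu * F * omega * r_eff           # frictional heat generation rate, W
dT_rate = q / (m * c_p)              # adiabatic temperature-rise rate, K/s
print(f"heat input ~ {q:.0f} W, about {dT_rate:.1f} K/s if all heat stays local")
```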

Keywords: numerical model, additive manufacturing, frictional heat generation, process

13753 Application of Rapid Prototyping to Create Additive Prototype Using Computer System

Authors: Meftah O. Bashir, Fatma A. Karkory

Abstract:

Rapid prototyping is a new group of manufacturing processes which allows the fabrication of physical parts of any complexity using a layer-by-layer deposition technique directly from a computer system. The rapid prototyping process greatly reduces the time and cost necessary to bring a new product to market. The prototypes made by these systems are used in a range of industrial applications including design evaluation, verification, testing, and as patterns for casting processes. These processes employ a variety of materials and mechanisms to build up the layers of the part. The present work was to build an FDM prototyping machine that could control the X-Y motion and material deposition in order to generate two-dimensional and three-dimensional complex shapes. This study focused on the deposition of wax material and aimed to find out the properties of the wax materials used in order to enable better control of the FDM process. The study looks at the integration of a computer-controlled electro-mechanical system with the traditional FDM additive prototyping process. The characteristics of the wax were also analysed in order to optimise the model production process; these included the wax phase change temperature, wax viscosity, and wax droplet shape during processing.

Keywords: Rapid prototyping, wax, manufacturing processes, additive prototyping.

13752 Manufacturing Process and Cost Estimation through Process Detection by Applying Image Processing Technique

Authors: Chalakorn Chitsaart, Suchada Rianmora, Noppawat Vongpiyasatit

Abstract:

In order to reduce the transportation time and cost for a direct interface between customer and manufacturer, an image processing technique has been introduced in this research, in which designing a part and defining its manufacturing process can be performed quickly. A 3D virtual model is directly generated from a series of multi-view images of an object, and it can be modified, analyzed, and improved in structure or function for further implementations, such as computer-aided manufacturing (CAM). To estimate and quote the production cost, a user-friendly platform has been developed in this research, in which the appropriate manufacturing parameters and process detections have been identified and planned by CAM simulation.

Keywords: Image processing technique, Feature detections, Surface registrations, Capturing multi-view images, Production costs, and Manufacturing processes.

13751 Improving Patients Discharge Process in Hospitals by using Six Sigma Approach

Authors: Mahmoud A. El-Banna

Abstract:

The need to increase the efficiency of health care systems is becoming an obligation, and one area of improvement is the discharge process. The objective of this work is to reduce the patient discharge time (for insured patients) to less than 50 minutes by using the Six Sigma approach. This improvement will also lead to an increase in customer satisfaction, increase the number of admissions and the turnover of the rooms, and increase hospital profitability. Three different departments were considered in this study: female, male, and pediatrics. The Six Sigma approach coupled with simulation has been applied to reduce the patient discharge time for the pediatrics, female, and male departments at the hospital. Upon applying the resulting recommendations at the hospital, 60%, 80%, and 22% of insured female, male, and pediatric patients respectively will have a discharge time less than the upper specification limit, i.e., 50 minutes.
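
The reported percentages are simply the share of patients whose discharge time falls below the 50-minute upper specification limit. The short sketch below computes that share, plus a normal-approximation estimate, for a synthetic sample; the data are generated for illustration and are not the hospital's.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical discharge times in minutes for one department (synthetic data).
discharge_minutes = rng.lognormal(mean=3.6, sigma=0.35, size=500)
usl = 50.0                               # upper specification limit from the study

within_spec = np.mean(discharge_minutes <= usl)
z_usl = (usl - discharge_minutes.mean()) / discharge_minutes.std(ddof=1)
print(f"share discharged within {usl:.0f} min: {within_spec:.1%}")
print(f"normal-approximation estimate: {stats.norm.cdf(z_usl):.1%} (Z = {z_usl:.2f})")
```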

Keywords: Discharge Time, Healthcare, Hospitals, Patients, Process Improvement, Six Sigma, Simulation

13750 Integrating Process Planning and Scheduling for Prismatic Parts Regard to Due Date

Authors: M. Haddadzade, M. R. Razfar, M. Farahnakian

Abstract:

Integration of the process planning and scheduling functions is necessary to achieve superior overall system performance. This paper proposes a methodology for the integration of process planning and scheduling for prismatic components that can be implemented in a company with existing departments. The developed model considers technological constraints, whereas the available machining time on the shop floor is the limiting factor for producing multiple process plans (MPP). It takes advantage of MPP while guaranteeing the fulfillment of the due dates by using overtime. The study determines the machining parameters, tools, machines, and amount of overtime under a minimum-cost objective. Finally, the illustration shows that the system performance, as measured by cost, is improved while remaining compatible with the due dates.

Keywords: Due date, Integration, Multiple process plan, Process planning, Scheduling.
