Search results for: first order plus dead time process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13820


13490 High Performance in Parallel Data Integration: An Empirical Evaluation of the Ratio Between Processing Time and Number of Physical Nodes

Authors: Caspar von Seckendorff, Eldar Sultanow

Abstract:

Many studies have shown that parallelization decreases efficiency [1], [2]. There are many reasons for this decrease. This paper investigates those which appear in the context of parallel data integration. Integration processes generally cannot be allocated in packages of identical size (i.e., tasks of identical complexity), because unknown, heterogeneous input data results in variable task lengths. Process delay is determined by the slowest processing node and has a detrimental effect on the total processing time. Using a real-world example, this study shows that while process delay initially increases as more nodes are introduced, it ultimately decreases again beyond a certain point. The example makes use of the cloud computing platform Hadoop and is run inside Amazon's EC2 compute cloud. A stochastic model is set up which can explain this effect.
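As an illustration of why the slowest node sets the process delay, the following Python sketch (not the authors' stochastic model; the task-size distribution, node counts and round-robin allocation are illustrative assumptions) assigns tasks of heterogeneous length to nodes and reports the resulting makespan and efficiency:

```python
import random

def simulate_makespan(num_tasks=1000, num_nodes=8, seed=0):
    """Assign tasks of random (heterogeneous) length to nodes round-robin and
    return the makespan (slowest node) and the resulting parallel efficiency."""
    rng = random.Random(seed)
    task_lengths = [rng.lognormvariate(0, 1) for _ in range(num_tasks)]  # skewed, unknown task sizes
    node_load = [0.0] * num_nodes
    for i, t in enumerate(task_lengths):
        node_load[i % num_nodes] += t          # packages of equal task *count*, not equal work
    total_work = sum(task_lengths)
    makespan = max(node_load)                  # process delay is set by the slowest node
    efficiency = total_work / (num_nodes * makespan)
    return makespan, efficiency

for n in (1, 2, 4, 8, 16, 32):
    m, e = simulate_makespan(num_nodes=n)
    print(f"nodes={n:3d}  makespan={m:8.2f}  efficiency={e:.2f}")
```

Running the sketch for increasing node counts shows efficiency falling as heterogeneous work is spread over more nodes, consistent with the opening observation of the abstract.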

Keywords: Process delay, speedup, efficiency, parallel computing, data integration, E-Commerce, Amazon Elastic Compute Cloud (EC2), Hadoop, Nutch.

PDF Downloads: 1590
13489 Dye Removal from Aqueous Solution by Regenerated Spent Bleaching Earth

Authors: Ahmed I. Shehab, Sabah M. Abdel Basir, M. A. Abdel Khalek, M. H. Soliman, G. Elgemeie

Abstract:

Spent bleaching earth (SBE) recycling and its utilization as an adsorbent to remove dyes from aqueous solution were studied. Organic-solvent washing and subsequent thermal treatment were carried out to recover and reactivate the SBE. The effects of pH, temperature, initial dye concentration, and contact time on dye removal using recycled spent bleaching earth (RSBE) were investigated. Recycled SBE showed better removal affinity for cationic than for anionic dyes. The maximum removal was achieved at pH 2 and 8 for anionic and cationic dyes, respectively. Kinetic data matched the pseudo-second-order model. The adsorption of the anionic dye was described by both the Langmuir and Freundlich isotherms, while the Freundlich model represented the sorption of the cationic dye. The changes in Gibbs free energy (ΔG°), enthalpy (ΔH°), and entropy (ΔS°) were computed and compared for both dyes in a thermodynamic study.
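For reference, the models named in this abstract have the following standard forms (a notational sketch only; qt, qe, Ce, k2, KL, KF, n and Kc are the usual textbook quantities, not values fitted in the study):

```latex
% Pseudo-second-order kinetics (linearized form):
\frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}

% Langmuir and Freundlich isotherms:
q_e = \frac{q_{\max} K_L C_e}{1 + K_L C_e}, \qquad q_e = K_F C_e^{1/n}

% Thermodynamic relations used to obtain \Delta G^\circ, \Delta H^\circ, \Delta S^\circ:
\Delta G^\circ = -RT \ln K_c, \qquad \ln K_c = \frac{\Delta S^\circ}{R} - \frac{\Delta H^\circ}{RT}
```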

Keywords: Spent bleaching earth, Regeneration, Dye removal, Thermodynamics.

PDF Downloads: 888
13488 Computational Study on Cardiac-Coronary Interaction in Terms of Coronary Flow-Pressure Waveforms in Presence of Drugs: Comparison Between Simulated and In Vivo Data

Authors: C. De Lazzari, E. Del Prete, I. Genuini, F. Fedele

Abstract:

A cardiovascular human simulator can be a useful tool for understanding complex pathophysiological processes in the cardiocirculatory system. It can also be a useful tool for investigating the effects of different drugs on hemodynamic parameters. The aim of this work is to test the ability of our cardiovascular numerical simulator CARDIOSIM© to reproduce coronary flow/pressure waveforms in the presence of two different drugs: Amlodipine (AMLO) and Adenosine (ADO). In particular, a time-varying intramyocardial compression, assumed to be proportional to the left ventricular pressure, was related to the venous coronary compliances in order to study its effects on the coronary blood flow and the flow/pressure loop. Since coronary circulation dynamics is strongly interrelated with the mechanics of left ventricular contraction, relaxation, and filling, the numerical model made it possible to analyze the effects induced by the left ventricular pressure on the coronary flow.

Keywords: Cardiovascular system, Coronary blood flow, Hemodynamic, Numerical simulation.

PDF Downloads: 1692
13487 A Design of Fractional-Order PI Controller with Error Compensation

Authors: Mazidah Tajjudin, Norhashim Mohd Arshad, Ramli Adnan

Abstract:

Fractional-order controllers have been shown to perform better than integer-order controllers. However, the absence of a pole at the origin produces a marginal steady-state error in a fractional-order control system. This study demonstrates the enhancement of a fractional-order PI over an integer-order PI in steam temperature control. The fractional-order controller was cascaded with an error compensator comprising a very small zero and a pole at the origin to produce zero steady-state error for the closed-loop system. Modifications to the error compensator are suggested for fractional integrators of different orders to improve the overall phase margin.
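A minimal sketch of the controller structure described here, assuming the common parallel fractional PI form and writing the compensator's small zero as ε (both the form and the symbol are assumptions, not taken from the paper):

```latex
% Fractional-order PI controller (0 < \alpha < 1), realized in practice via a
% band-limited approximation such as Oustaloup's recursive method:
C(s) = K_p + \frac{K_i}{s^{\alpha}}

% Error compensator cascaded with C(s): a very small zero \epsilon and a pole at
% the origin restore the free integrator needed for zero steady-state error:
G_c(s) = \frac{s + \epsilon}{s}, \qquad 0 < \epsilon \ll 1
```

Cascading G_c(s) with C(s) reinstates a pole at the origin, which is what removes the steady-state error that the fractional integrator alone leaves behind.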

Keywords: Fractional-order PI, Ziegler-Nichols tuning, Oustaloup's Recursive Approximation, steam temperature control.

PDF Downloads: 2250
13486 Power System Contingency Analysis Using Multiagent Systems

Authors: Anant Oonsivilai, Kenedy A. Greyson

Abstract:

Modern power systems demand fast energy management systems (EMS). Contingency analysis is one of the most time-consuming functions in an EMS. To address this limitation, this paper introduces agent-based technology into contingency analysis; the main function of the agents is to speed up the computation. The negotiation process used in decision making is explained, with the objective being the minimization of operating costs. The IEEE 14-bus system and its line outages were used in the research, and simulation results are presented.
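The negotiation protocol itself is not reproduced here, but the speed-up idea, distributing independent line-outage studies across worker agents, can be sketched in Python as below (the power-flow evaluation is a placeholder function, and the case count and worker count are illustrative):

```python
from multiprocessing import Pool
import time

def evaluate_contingency(line_id):
    """Placeholder for one line-outage study (e.g. a power flow with the line removed).
    Here it just burns a little CPU and returns the case id and its runtime."""
    t0 = time.perf_counter()
    _ = sum(i * i for i in range(200_000))      # stand-in for the power-flow computation
    return line_id, time.perf_counter() - t0

if __name__ == "__main__":
    outage_cases = list(range(20))              # one case per monitored branch (IEEE 14-bus has 20 branches)
    start = time.perf_counter()
    with Pool(processes=4) as pool:             # four worker "agents" screen cases in parallel
        results = pool.map(evaluate_contingency, outage_cases)
    print(f"screened {len(results)} contingencies in {time.perf_counter() - start:.2f} s")
```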

Keywords: Agents, model, negotiation, optimal dispatch, power systems.

PDF Downloads: 2101
13485 Process Parameter Optimization in Resistance Spot Welding of Dissimilar Thickness Materials

Authors: Pradeep M., N. S. Mahesh, Raja Hussain

Abstract:

Resistance spot welding (RSW) is widely used to join sheet metals, but achieving the required weld quality when spot welding materials of dissimilar thickness remains a challenge, and weld parameters are generally not available in standards for thicknesses beyond 4 mm. This paper presents the welding process design and parameter optimization of RSW for joining a low-carbon steel sheet of thickness 0.8 mm to metal strips of cross section 10 × 5 mm for electrical motor applications. Taguchi quality design was adopted for weld current and time optimization using an L9 orthogonal array. Optimum process parameters (current 3.5 kA and time 10 cycles) were obtained from the Taguchi analysis and shear test results. The confirmation experiment revealed that the weld quality was within the acceptable interval. Further, a numerical simulation of the RSW process was carried out with the selected weld parameters to quantify the temperature at the faying surface and check for the formation of an appropriate nugget. The nugget geometry measured after the peel test and that predicted by the numerical model were similar and in accordance with the standards.
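A small Python sketch of the Taguchi machinery used here, assuming a larger-the-better shear-strength response; the L9 array is the standard one, but the response values and the factor assignment are hypothetical:

```python
import numpy as np

# Standard L9(3^4) orthogonal array: 9 runs, 4 columns, levels coded 0/1/2.
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])

def sn_larger_is_better(y):
    """Signal-to-noise ratio for a larger-the-better response such as shear strength."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Hypothetical shear-test results (kN) for two replications of each of the 9 runs.
shear = np.random.default_rng(1).uniform(3.0, 6.0, size=(9, 2))
sn = np.array([sn_larger_is_better(row) for row in shear])

# Mean S/N per level of the first column (e.g. weld current): the level with the
# highest mean S/N is the preferred setting for that factor.
for level in range(3):
    print(f"current level {level}: mean S/N = {sn[L9[:, 0] == level].mean():.2f}")
```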

Keywords: Resistance spot welding, dissimilar thickness, weld parameters, Taguchi method, numerical modeling.

PDF Downloads: 5144
13484 Decision Tree-based Feature Ranking using Manhattan Hierarchical Cluster Criterion

Authors: Yasmin Mohd Yacob, Harsa A. Mat Sakim, Nor Ashidi Mat Isa

Abstract:

Feature selection is gaining importance due to its contribution to saving classification cost in terms of time and computational load. One method of searching for essential features is via the decision tree, which acts as an intermediate feature-space inducer for choosing essential features. In decision tree-based feature selection, some studies use the decision tree as a feature ranker with a direct threshold measure, while others retain the decision tree but utilize a pruning condition that acts as a threshold mechanism for choosing features. This paper proposes a threshold measure based on Manhattan hierarchical cluster distance to be used in feature ranking in order to choose relevant features as part of the feature selection process. The results are promising, and the method can be improved in the future by including test cases with a larger number of attributes.
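One possible reading of this procedure, sketched in Python with scikit-learn and SciPy (the dataset, the two-cluster cut and the rule of keeping the top-ranked feature's cluster are assumptions, not the paper's exact criterion):

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier
from scipy.cluster.hierarchy import linkage, fcluster

X, y = load_breast_cancer(return_X_y=True)

# 1. Rank features with a decision tree (the "intermediate feature space inducer").
tree = DecisionTreeClassifier(random_state=0).fit(X, y)
importance = tree.feature_importances_

# 2. Hierarchically cluster the importance scores with Manhattan (cityblock) distance
#    and keep the cluster containing the top-ranked feature as the selected subset.
Z = linkage(importance.reshape(-1, 1), method="average", metric="cityblock")
labels = fcluster(Z, t=2, criterion="maxclust")
selected = np.where(labels == labels[np.argmax(importance)])[0]
print("selected feature indices:", selected)
```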

Keywords: Feature ranking, decision tree, hierarchical cluster, Manhattan distance.

PDF Downloads: 1929
13483 Using High Performance Computing for Online Flood Monitoring and Prediction

Authors: Stepan Kuchar, Martin Golasowski, Radim Vavrik, Michal Podhoranyi, Boris Sir, Jan Martinovic

Abstract:

The main goal of this article is to describe the online flood monitoring and prediction system Floreon+, developed primarily for the Moravian-Silesian region in the Czech Republic, and the basic process it uses for running automatic rainfall-runoff and hydrodynamic simulations along with their calibration and uncertainty modeling. Executing such a process sequentially takes a long time, which is not acceptable in the online scenario, so the use of a high performance computing environment is proposed for all parts of the process to shorten their duration. Finally, a case study on the Ostravice River catchment is presented that shows actual durations and the gains from the parallel implementation.

Keywords: Flood prediction process, High performance computing, Online flood prediction system, Parallelization.

PDF Downloads: 2290
13482 An Automatic Tool for Checking Consistency between Data Flow Diagrams (DFDs)

Authors: Rosziati Ibrahim, Siow Yen Yen

Abstract:

The system development life cycle (SDLC) is a process used during the development of any system. The SDLC consists of four main phases: analysis, design, implementation and testing. During the analysis phase, a context diagram and data flow diagrams are used to produce the process model of a system. Consistency between the context diagram and the lower-level data flow diagrams is very important for a smooth system development process. However, manually checking the consistency from the context diagram to the lower-level data flow diagrams using a checklist is a time-consuming process. At the same time, the limits of human ability to validate the errors are one of the factors that influence the correctness and balancing of the diagrams. This paper presents a tool that automates the consistency check between Data Flow Diagrams (DFDs) based on the rules of DFDs. The tool serves two purposes: as an editor to draw the diagrams and as a checker to check the correctness of the diagrams drawn. The consistency check from the context diagram to the lower-level data flow diagrams is embedded inside the tool to overcome the manual checking problem.
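A minimal sketch of the balancing rule such a checker enforces, in Python (the representation of flows as (entity, data-flow) pairs is an assumption made for illustration):

```python
def check_balancing(context_flows, child_flows):
    """DFD balancing rule: every data flow crossing the boundary of the context
    diagram must also appear in the level-1 (child) diagram, and vice versa."""
    context, child = set(context_flows), set(child_flows)
    return {
        "missing_in_child": context - child,
        "extra_in_child": child - context,
        "consistent": context == child,
    }

# Hypothetical flows between external entities and the system.
context = {("Customer", "order"), ("System", "invoice")}
child = {("Customer", "order"), ("System", "invoice"), ("System", "report")}
print(check_balancing(context, child))   # flags "report" as an unbalanced flow
```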

Keywords: Data Flow Diagram, Context Diagram, Consistency Check, Syntax and Semantic Rules

PDF Downloads: 3405
13481 The Simulation and Experimental Investigation to Study the Strain Distribution Pattern during the Closed Die Forging Process

Authors: D. B. Gohil

Abstract:

Closed die forging is a very complex process, and measuring the actual forces for a real material is difficult and time consuming. Hence, the modelling technique takes advantage of carrying out the experimentation with a suitable model material which needs lower forces and relatively low temperature. The results of experiments on the model material may then be correlated with the actual material by using the theory of similarity. Several methods are available to resolve the complexity involved in the closed die forging process. The Finite Element Method (FEM) and the Finite Difference Method (FDM) are relatively difficult compared to the slab method. The slab method is very popular and widely used on the shop floor because it is relatively easy to apply and reasonably accurate for most common forging load computations.
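For orientation, the kind of result the slab method produces can be written down for the textbook case of plane-strain upsetting between flat dies with Coulomb friction μ (a standard formula, not one derived in the paper; x is measured from the centerline of a slab of height h and half-width a, and 2k is the plane-strain yield stress):

```latex
p(x) = 2k \, \exp\!\left(\frac{2\mu\,(a - x)}{h}\right), \qquad 0 \le x \le a
```

Integrating p(x) over the die face gives the forging load; closed-form estimates of this kind are what make the slab method convenient on the shop floor.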

Keywords: Experimentation, forging, process modeling, strain distribution.

PDF Downloads: 1311
13480 A Hybrid Scheme for on-Line Diagnostic Decision Making Using Optimal Data Representation and Filtering Technique

Authors: Hyun-Woo Cho

Abstract:

Early diagnostic decision making in industrial processes is absolutely necessary to produce high quality final products. It helps to provide early warning of a special event in a process so that its assignable cause can be found. This work presents a hybrid diagnostic scheme for batch processes in which a nonlinear representation of the raw process data is combined with classification tree techniques. Nonlinear kernel-based dimension reduction is executed to obtain nonlinear classification decision boundaries for the fault classes. In order to enhance diagnosis performance for batch processes, the data are filtered to remove irrelevant information. To compare the diagnostic performance of several representation, filtering, and future-observation estimation methods, four diagnostic schemes are evaluated. In this work, the performance of the presented diagnosis schemes is demonstrated using batch process data.
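A compact sketch of this kind of hybrid pipeline using scikit-learn (synthetic data stands in for the batch measurements, StandardScaler stands in for the paper's filtering step, and all hyperparameters are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

# Stand-in for unfolded batch-process data with several fault classes.
X, y = make_classification(n_samples=400, n_features=30, n_informative=10,
                           n_classes=4, n_clusters_per_class=1, random_state=0)

pipe = make_pipeline(
    StandardScaler(),                                      # placeholder for the filtering step
    KernelPCA(n_components=5, kernel="rbf", gamma=0.05),   # nonlinear kernel-based dimension reduction
    DecisionTreeClassifier(random_state=0),                # classification tree for fault diagnosis
)
print("cross-validated accuracy:", cross_val_score(pipe, X, y, cv=5).mean())
```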

Keywords: Diagnostics, batch process, nonlinear representation, data filtering, multivariate statistical approach

PDF Downloads: 1281
13479 Using the Monte Carlo Simulation to Predict the Assembly Yield

Authors: C. Chahin, M. C. Hsu, Y. H. Lin, C. Y. Huang

Abstract:

Electronic products that achieve high levels of integrated communications, computing, entertainment and multimedia features in small, stylish and robust new form factors are winning in the marketplace. Because manufacturing costs are high and yield is directly proportional to profit, IC (Integrated Circuit) manufacturers struggle to maximize yield, while today's customers demand miniaturization, low cost, high performance and excellent reliability, making yield maximization a never-ending search for an enhanced assembly process. With factors such as minimal tolerances and tighter parameter variations, a systematic approach is needed to predict the assembly process. In order to evaluate the quality of upcoming circuits, yield models are used which not only predict manufacturing costs but also provide vital information that eases the process of correction when yields fall below expectations. For an IC manufacturer to obtain higher assembly yields, all factors such as boards, placement, components, the materials from which the components are made, and processes must be taken into consideration. Effective placement yield depends heavily on machine accuracy and on the vision system, which needs the ability to recognize the features on the board and component in order to place the device accurately on the pads and bumps of the PCB. There are currently two methods for accurate positioning: using the edge of the package and using solder ball locations, also called footprints. The only assumption that a yield model makes is that all boards and devices are completely functional. This paper focuses on the Monte Carlo method, a class of computational algorithms that relies on repeated random sampling to compute results. This method is used to simulate the placement and assembly processes within a production line.
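A toy Monte Carlo placement-yield estimate in Python, under the simplifying assumptions that x/y placement offsets are independent, zero-mean and normal; the sigmas and tolerance are invented numbers, not data from the study:

```python
import numpy as np

def placement_yield(n_trials=100_000, sigma_x=0.03, sigma_y=0.03, tol=0.075, seed=0):
    """Monte Carlo estimate of placement yield: draw random x/y placement offsets (mm)
    and count the fraction that stays within the allowed pad tolerance."""
    rng = np.random.default_rng(seed)
    dx = rng.normal(0.0, sigma_x, n_trials)   # machine accuracy in x
    dy = rng.normal(0.0, sigma_y, n_trials)   # machine accuracy in y
    ok = (np.abs(dx) < tol) & (np.abs(dy) < tol)
    return ok.mean()

print(f"estimated placement yield: {placement_yield():.4f}")
```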

Keywords: Monte Carlo simulation, placement yield, PCB characterization, electronics assembly

PDF Downloads: 2137
13478 Spatial Analysis of Park and Ride Users’ Dynamic Accessibility to Train Station: A Case Study in Perth

Authors: Ting (Grace) Lin, Jianhong (Cecilia) Xia, Todd Robinson

Abstract:

Accessibility analysis, which examines people's ability to reach facilities and destinations, is a fundamental assessment for transport planning, policy making, and social exclusion research. Dynamic accessibility, which measures accessibility in a real-time traffic environment, has become an advanced accessibility indicator in transport research. It is also a useful indicator that helps travelers understand daily travel time variability and assists traffic engineers in monitoring traffic congestion and developing effective strategies to mitigate it. This research incorporated real-time traffic information by collecting travel time data at 15-minute intervals via the TomTom® API. A framework for measuring dynamic accessibility was then developed based on gravity theory and accessibility dichotomy theory through space and time interpolation. Finally, dynamic accessibility can be derived at any given time and location under the dynamic accessibility spatial analysis framework.
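A minimal sketch of a gravity-type dynamic accessibility computation in Python; the exponential impedance function, the decay parameter and all travel times and opportunity counts are assumptions, and the framework's space-time interpolation is not shown:

```python
import numpy as np

def gravity_accessibility(travel_time, opportunities, beta=0.1):
    """Gravity-type accessibility A_i = sum_j O_j * exp(-beta * t_ij) for one time slice.
    travel_time: (n_origins, n_stations) matrix of minutes at a given departure time."""
    return (opportunities * np.exp(-beta * travel_time)).sum(axis=1)

# Hypothetical 15-minute slices of travel time from 3 origins to 2 train stations.
slices = {
    "07:00": np.array([[12.0, 25.0], [18.0, 9.0], [30.0, 22.0]]),
    "07:15": np.array([[15.0, 28.0], [22.0, 11.0], [34.0, 25.0]]),
}
opportunities = np.array([800, 450])        # e.g. park-and-ride bays at each station

for t, tt in slices.items():
    print(t, gravity_accessibility(tt, opportunities).round(1))
```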

Keywords: Dynamic accessibility, space-time continuum, transport research, TomTom® API.

PDF Downloads: 1043
13477 Real-time Laser Monitoring based on Pipe Detective Operation

Authors: Mongkorn Klingajay, Tawatchai Jitson

Abstract:

Pipe inspection is a difficult detection task. Most applications rely mainly on manual recognition of defective areas, with detection carried out by an engineer. An automated process therefore becomes necessary in order to avoid the cost incurred by such a manual process. An automated monitoring method to obtain a complete picture of the sewer condition is proposed in this work. The focus of the research is the automated identification and classification of discontinuities in the internal surface of the pipe. The methodology consists of several processing stages, including image segmentation into potential defect regions and extraction of geometrical characteristic features. Automatic recognition and classification of pipe defects are carried out by means of an artificial neural network (ANN) based on radial basis functions (RBF). Experiments in a realistic environment have been conducted and results are presented.
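A from-scratch sketch of the RBF classification stage in Python (toy two-dimensional features and random center selection stand in for the real segmented-image features and the training procedure used in the paper):

```python
import numpy as np

def rbf_hidden(X, centers, width):
    """Gaussian radial-basis activations of fixed centers for each sample."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-d**2 / (2 * width**2))

def train_rbf(X, y, centers, width):
    """Least-squares training of the linear output weights of an RBF classifier."""
    H = rbf_hidden(X, centers, width)
    W, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W

def predict_rbf(X, centers, width, W):
    return (rbf_hidden(X, centers, width) @ W > 0.5).astype(int)

# Toy stand-in for image-segment features (e.g. area, elongation) labelled defect / no defect.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.r_[np.zeros(50), np.ones(50)]
centers = X[rng.choice(len(X), 8, replace=False)]   # simple center selection; k-means is common
W = train_rbf(X, y, centers, width=1.5)
print("training accuracy:", (predict_rbf(X, centers, 1.5, W) == y).mean())
```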

Keywords: Artificial neural network, Radial basis function, Curve fitting, CCTV, Image segmentation, Data acquisition.

PDF Downloads: 1775
13476 Cryogenic Freezing Process Optimization Based On Desirability Function on the Path of Steepest Ascent

Authors: R. Uporn, P. Luangpaiboon

Abstract:

This paper presents a comparative study of statistical methods for the multi-response surface optimization of a cryogenic freezing process. Taguchi design and analysis and steepest ascent methods based on the desirability function were conducted to ascertain the influential factors of a cryogenic freezing process and their optimal levels. The preferred levels of the set point, exhaust fan speed, retention time and flow direction are -90 °C, 20 Hz, 18 minutes and counter-current, respectively. The overall desirability level is 0.7044.
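The desirability machinery behind the reported overall value can be illustrated as follows (a sketch with invented response values and ranges; the study's actual responses, targets and weights are not shown here):

```python
import numpy as np

def desirability_larger_is_better(y, low, high, weight=1.0):
    """Derringer-Suich desirability for a larger-the-better response, scaled to [0, 1]."""
    d = np.clip((y - low) / (high - low), 0.0, 1.0)
    return d ** weight

# Hypothetical responses at one candidate setting of the freezing process.
d1 = desirability_larger_is_better(y=0.82, low=0.5, high=1.0)   # e.g. product quality score
d2 = desirability_larger_is_better(y=0.65, low=0.4, high=0.9)   # e.g. throughput score

overall = (d1 * d2) ** (1 / 2)            # overall desirability = geometric mean of the k responses
print(round(float(overall), 4))
```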

Keywords: Cryogenic Freezing Process, Taguchi Design and Analysis, Response Surface Method, Steepest Ascent Method and Desirability Function Approach.

PDF Downloads: 1783
13475 Removal of Malachite Green from Aqueous Solution using Hydrilla verticillata – Optimization, Equilibrium and Kinetic Studies

Authors: R. Rajeshkannan, M. Rajasimman, N. Rajamohan

Abstract:

In this study, the sorption of malachite green (MG) on Hydrilla verticillata biomass, a submerged aquatic plant, was investigated in a batch system. The effects of operating parameters such as temperature, adsorbent dosage, contact time, adsorbent size, and agitation speed on the sorption of malachite green were analyzed using response surface methodology (RSM). The proposed quadratic model for the central composite design (CCD) fitted the experimental data well enough that, according to the ANOVA results, it could be used to navigate the design space. The optimum sorption conditions were determined as temperature 43.5 °C, adsorbent dosage 0.26 g, contact time 200 min, adsorbent size 0.205 mm (65 mesh), and agitation speed 230 rpm. The Langmuir and Freundlich isotherm models were applied to the equilibrium data. The maximum monolayer coverage capacity of Hydrilla verticillata biomass for MG was found to be 91.97 mg/g at an initial pH of 8.0, indicating that this is the optimum initial pH for sorption. The external and intra-particle diffusion models were also applied to the sorption data of Hydrilla verticillata biomass with MG, and it was found that both external diffusion and intra-particle diffusion contribute to the actual sorption process. The pseudo-second-order kinetic model described the MG sorption process with a good fit.

Keywords: Response surface methodology, Hydrilla verticillata, malachite green, adsorption, central composite design

PDF Downloads: 1956
13474 Knowledge Sharing based on Semantic Nets and Mereology to Avoid Risks in Manufacturing

Authors: Ulrich Berger, Yuliya Lebedynska, Veronica Vargas

Abstract:

The right information at the right time influences enterprise and technical success. Sharing knowledge among the members of a large organization may be a complex activity, and as long as knowledge is not shared, it cannot be exploited by the organization. There are mechanisms which can initiate knowledge sharing, and this paper intends to trigger these mechanisms by using semantic nets. Moreover, the intersection and overlapping of terms and sub-terms, as well as their relationships, are described through mereology for the whole knowledge sharing system. A knowledge system is proposed to supply operators with the right information about a specific process and its possible risks, e.g. in the assembly process, at the right time in an automated manufacturing environment such as the automotive industry.

Keywords: Automated manufacturing, knowledge sharing, mereology, risk management, semantic net.

PDF Downloads: 1443
13473 Simulation of Sample Paths of Non Gaussian Stationary Random Fields

Authors: Fabrice Poirion, Benedicte Puig

Abstract:

Mathematical justifications are given for a simulation technique for multivariate non-Gaussian random processes and fields based on Rosenblatt's transformation of Gaussian processes. Different types of convergence are given for the approximating sequence. Moreover, an original numerical method is proposed in order to solve the functional equation yielding the underlying Gaussian process autocorrelation function.
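A minimal memoryless-translation sketch of the idea in Python: a Gaussian sequence is generated and pushed through the normal CDF and a target inverse CDF (the AR(1) generator, the correlation value and the exponential target marginal are illustrative; the paper's contribution, solving for the underlying Gaussian autocorrelation, is noted in the comments but not implemented):

```python
import numpy as np
from scipy.stats import norm, expon

def translation_process(n=10_000, rho=0.9, target=expon, seed=0):
    """Generate a stationary Gaussian AR(1) sequence, map it through the standard
    normal CDF, then through the target inverse CDF to obtain non-Gaussian marginals.
    The paper solves a functional equation so that the *transformed* process matches
    a prescribed autocorrelation; here the Gaussian correlation rho is simply given."""
    rng = np.random.default_rng(seed)
    g = np.empty(n)
    g[0] = rng.standard_normal()
    for k in range(1, n):
        g[k] = rho * g[k - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()
    u = norm.cdf(g)              # uniform marginals
    return target.ppf(u)         # non-Gaussian marginals, e.g. exponential

x = translation_process()
print(x.mean(), x.std())         # for an Exp(1) target, both are close to 1
```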

Keywords: Simulation, non-Gaussian, random field, multivariate, stochastic process.

PDF Downloads: 1787
13472 New Perceptual Organization within Temporal Displacement

Authors: Michele Sinico

Abstract:

The psychological present has an actual extension. When a sequence of instantaneous stimuli falls within this short interval of time, observers perceive a compresence of events in succession, and the temporal order depends on the qualitative relationships between the perceptual properties of the events. Two experiments were carried out to study the influence of perceptual grouping, with and without temporal displacement, on the duration of auditory sequences. The psychophysical method of adjustment was adopted. The first experiment investigated the effect of the temporal displacement of a white noise on sequence duration. The second experiment investigated the effect of temporal displacement along the pitch dimension on the temporal shortening of the sequence. The results suggest that the temporal order of sounds, in the case of temporal displacement, is organized along the pitch dimension.

Keywords: Time perception, perceptual present, temporal displacement, gestalt laws of perceptual organization

PDF Downloads: 768
13471 Computational Simulations on Stability of Model Predictive Control for Linear Discrete-time Stochastic Systems

Authors: Tomoaki Hashimoto

Abstract:

Model predictive control is a kind of optimal feedback control in which control performance over a finite future is optimized with a performance index that has a moving initial time and a moving terminal time. This paper examines the stability of model predictive control for linear discrete-time systems with additive stochastic disturbances. A sufficient condition for the stability of the closed-loop system with model predictive control is derived by means of a linear matrix inequality. The objective of this paper is to show the results of computational simulations in order to verify the effectiveness of the obtained stability condition.
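The flavor of such an LMI test can be sketched with CVXPY for a plain discrete-time Lyapunov inequality (this generic condition and the example matrix are stand-ins, not the paper's MPC-specific stability condition):

```python
import numpy as np
import cvxpy as cp

# Closed-loop system matrix of a stable discrete-time system (illustrative values).
A = np.array([[0.8, 0.2],
              [0.0, 0.7]])

n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               A.T @ P @ A - P << -eps * np.eye(n)]   # discrete-time Lyapunov inequality
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print(prob.status)   # a feasible ("optimal") result certifies asymptotic stability of x+ = A x
```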

Keywords: Computational simulations, optimal control, predictive control, stochastic systems, discrete-time systems.

PDF Downloads: 1813
13470 A New Fast Skin Color Detection Technique

Authors: Tarek M. Mahmoud

Abstract:

Skin color can provide a useful and robust cue for human-related image analysis, such as face detection, pornographic image filtering, hand detection and tracking, people retrieval in databases and on the Internet, etc. The major problem with such skin color detection algorithms is that they are time consuming and hence cannot be applied in a real-time system. To overcome this problem, we introduce a new fast technique for skin detection which can be applied in a real-time system. In this technique, instead of testing each image pixel to label it as skin or non-skin (as in classic techniques), we skip a set of pixels. The rationale for the skipping process is the high probability that neighbors of skin color pixels are also skin pixels, especially in adult images, and vice versa. The proposed method can rapidly detect skin and non-skin color pixels, which in turn dramatically reduces the CPU time required for the protection process. Since many fast detection techniques are based on image resizing, we apply our proposed pixel skipping technique together with image resizing to obtain better results. The performance evaluation of the proposed skipping and hybrid techniques in terms of measured CPU time is presented. Experimental results demonstrate that the proposed methods achieve better results than the relevant classic method.
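A small Python sketch of the pixel-skipping idea on YCbCr data (the Cb/Cr bounds are commonly cited literature values and the skip step is illustrative; neither is necessarily what the paper uses):

```python
import numpy as np

def fast_skin_mask(ycbcr, step=2):
    """Label pixels as skin using fixed Cb/Cr bounds, but only test every `step`-th
    pixel in each direction; skipped pixels inherit the label of the tested neighbor."""
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    cbs, crs = cb[::step, ::step], cr[::step, ::step]
    coarse = (cbs >= 77) & (cbs <= 127) & (crs >= 133) & (crs <= 173)
    # Propagate each tested pixel's label to its skipped neighbors.
    fine = np.kron(coarse.astype(np.uint8), np.ones((step, step), dtype=np.uint8))
    return fine[:ycbcr.shape[0], :ycbcr.shape[1]].astype(bool)

# Hypothetical 4x4 YCbCr image.
img = np.random.default_rng(0).integers(0, 256, size=(4, 4, 3))
print(fast_skin_mask(img, step=2))
```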

Keywords: Adult images filtering, image resizing, skin color detection, YCbCr color space.

PDF Downloads: 3932
13469 Multinomial Dirichlet Gaussian Process Model for Classification of Multidimensional Data

Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park

Abstract:

We present a probabilistic multinomial Dirichlet classification model for multidimensional data with Gaussian process priors. We consider an efficient computational method that can be used to obtain the approximate posteriors for the latent variables and parameters needed to define the multiclass Gaussian process classification model. We first investigate the process of inducing a posterior distribution for the various parameters and the latent function by using variational Bayesian approximations and an importance sampling method, and next we derive a predictive distribution of the latent function needed to classify new samples. The proposed model is applied to classify a synthetic multivariate dataset in order to verify its performance. Experimental results show that our model is more accurate than the other approximation methods.

Keywords: Multinomial Dirichlet classification model, Gaussian process priors, variational Bayesian approximation, importance sampling, approximate posterior distribution, marginal likelihood evidence.

PDF Downloads: 1571
13468 Evaluation of Produced Water Treatment Using Advanced Oxidation Processes and Sodium Ferrate(VI)

Authors: Erica T. R. Mendonça, Caroline M. B. de Araujo, Osvaldo Chiavone Filho, Maurício A. da Motta Sobrinho

Abstract:

Oil and gas exploration is an essential activity for modern society, although meeting global demand has caused considerable damage to the environment, mainly due to the generation of produced water, an effluent associated with the oil and gas obtained during extraction. The aim of this study is to evaluate the treatment of produced water in order to reduce its oil and grease content (O&G), using flotation as a pre-treatment combined with oxidation for degradation of the remaining organic load. Advanced Oxidation Processes (AOP) using both Fenton and photo-Fenton reactions were tested, as well as a chemical oxidation treatment using sodium ferrate(VI), Na2[FeO4], as a strong oxidant. All the studies were carried out using real samples of produced water from the petroleum industry. The oxidation process using the ferrate(VI) ion was studied based on factorial experimental designs. The factorial design was used to study how the variables pH, temperature and concentration of Na2[FeO4] influence the O&G level. For the treatment using the ferrate(VI) ion, the results showed that the best operating point is obtained at a temperature of 28 °C, pH 3, and a 2000 mg.L-1 solution of Na2[FeO4]. This experiment achieved a final O&G level of 4.7 mg.L-1, corresponding to a 94% removal efficiency of oils and greases. Comparing the Fenton and photo-Fenton processes, the Fenton reaction did not provide a good reduction of O&G (around 20% only). On the other hand, a degradation of approximately 80.5% of oil and grease was obtained after seven hours of treatment using the photo-Fenton process, which indicates that the best process combination is flotation followed by the photo-Fenton reaction using solar radiation, with an overall O&G removal efficiency of approximately 89%.

Keywords: Advanced oxidation process, ferrate(VI) ion, oils and greases removal, produced water treatment.

PDF Downloads: 1740
13467 Identification of Author and Reviewer from Single and Double Blind Paper

Authors: Jatinderkumar R. Saini, Nikita R. Sonthalia, Khushbu A. Dodiya

Abstract:

Research leads to the development of science and technology and hence to the betterment of humankind. Journals and conferences provide a platform to receive a large number of research papers for publication and presentation before the expert and peer-level scientific community. In order to assure the quality of such papers, they are also sent to reviewers for their comments. To maintain good ethical standards, the research papers are sent to reviewers in such a way that authors and reviewers do not know each other's identity. This technique is called the double-blind review process. It is called a single-blind review process if the identity of only one party, generally the authors', is disclosed to the other. This paper presents techniques by which the identity of the author as well as the reviewer can be found even under a double-blind review process. It is proposed that the characteristics and techniques presented here will help journals and conferences in detecting intentional or unintentional disclosure of identity-revealing information by either party.

Keywords: Author, Conference, Double Blind Paper, Journal, Reviewer, Single Blind Paper.

PDF Downloads: 2403
13466 Implementation of Conceptual Real-Time Embedded Functional Design via Drive-by-Wire ECU Development

Authors: A. Ukaew, C. Chauypen

Abstract:

Design concepts of a real-time embedded system can be realized initially by introducing novel design approaches. In this work, a model-based design approach and in-the-loop testing were employed early in the conceptual and preliminary phases to formulate design requirements and perform quick real-time verification. The design and analysis methodology includes simulation analysis, model-based testing, and in-the-loop testing. The design of a conceptual drive-by-wire (DBW) algorithm for an electronic control unit (ECU) is presented to demonstrate the conceptual design process, analysis, and functionality evaluation. The DBW ECU functions can be implemented in the vehicle system to improve the drivability of an electric vehicle (EV) conversion. However, within a new development process, conceptual ECU functions and parameters need to be evaluated. As a result, a testing system was employed to support the evaluation of the conceptual DBW ECU functions. For the current setup, the system components consisted of actual DBW ECU hardware, electric vehicle models, and the controller area network (CAN) protocol. The vehicle models and the CAN bus interface were both implemented as real-time applications, where the ECU and CAN protocol functionality were verified according to the design requirements. The proposed system could potentially benefit rapid real-time analysis of design parameters for conceptual system or software algorithm development.

Keywords: Drive-by-wire ECU, in-the-loop testing, model-based design, real-time embedded system.

PDF Downloads: 2138
13465 Defect Management Life Cycle Process for Software Quality Improvement

Authors: Aedah A. Rahman, Nurdatillah Hasim

Abstract:

Software quality issues require special attention, especially in view of the demand for quality software products that meet customer satisfaction. Software development projects in most organisations need a proper defect management process in order to produce high quality software products and reduce the number of defects. The research question of this study is how to produce high quality software while reducing the number of defects. Therefore, the objective of this paper is to provide a framework for managing software defects by following defined life cycle processes. The methodology starts by reviewing defects, defect models, best practices, and standards. A framework for the defect management life cycle is proposed. The major contribution of this study is to define a defect management roadmap for software development. The adoption of an effective defect management process helps to achieve the ultimate goal of producing high quality software products and contributes towards continuous software process improvement.

Keywords: Defects, defect management, life cycle process, software quality.

PDF Downloads: 2539
13464 A Goal-Oriented Social Business Process Management Framework

Authors: Mohammad Ehson Rangiha, Bill Karakostas

Abstract:

Social Business Process Management (SBPM) promises to overcome limitations of traditional BPM by allowing flexible process design and enactment through the involvement of users from a social community. This paper proposes a meta-model and architecture for socially driven business process management systems. It discusses the main facets of the architecture, such as goal-based role assignment that combines social recommendations with user profiles, and process recommendation, through a real example of a charity organization.

Keywords: Business Process Management, Goal-Based Modelling, Process Recommendation, Social Collaboration, Social BPM.

PDF Downloads: 2530
13463 Simulation Modeling and Analysis of In-Plant Logistics at a Cement Manufacturing Plant in India

Authors: Sachin Kamble, Shradha Gawankar

Abstract:

This paper presents the findings of a successful implementation of Business Process Reengineering (BPR) of the cement dispatch activities in a cement manufacturing plant located in India. A simulation model was developed for the purpose of identifying and analyzing the areas for improvement. The company was facing a problem of low throughput rate and subsequent forced stoppages of the plant, leading to a high production loss of 15,000 MT per month. It was found from the study that the present systems and procedures related to in-plant logistics required significant changes. The major recommendations included process improvement at the entry gate, reducing the cycle time at the security gate, and installation of an additional weigh bridge. This paper demonstrates how BPR can be implemented to improve the in-plant logistics process. The various recommendations helped the plant to increase its throughput by 14%.

Keywords: Business process reengineering, simulation modeling, in-plant logistics, distribution process, cement industry.

PDF Downloads: 2246
13462 A Practical Approach for Testing the Process Quality

Authors: Mou-Yuan Liao, Chien-Wei Wu, Chien-Hua Lin

Abstract:

The process capability index Cpk is the most widely used index in making managerial decisions, since it provides bounds on the process yield for normally distributed processes. However, existing methods for assessing process performance that are constructed by statistical inference may unfortunately lead to unreliable conclusions, because uncertainties exist in most real-world applications. Thus, this study adopts fuzzy inference to deal with the testing of Cpk. A graded score is obtained for assessing a supplier's process instead of a severe pass/fail evaluation.
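For reference, the crisp index being tested has the standard definition (USL and LSL are the specification limits, μ and σ the process mean and standard deviation):

```latex
C_{pk} = \min\!\left(\frac{USL - \mu}{3\sigma},\; \frac{\mu - LSL}{3\sigma}\right)
```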

Keywords: Process capability analysis, quality control.

PDF Downloads: 1387
13461 The Application of an Experimental Design for the Defect Reduction of Electrodeposition Painting on Stainless Steel Washers

Authors: Chansiri Singhtaun, Nattaporn Prasartthong

Abstract:

The purpose of this research is to reduce the amount of incomplete coating of stainless steel washers in the electrodeposition painting process by using an experimental design technique. Surface preparation was found to be a major determinant of painted surface quality. The influence of the pretreatment and painting process parameters, namely cleaning time, chemical concentration and hanger shape, was studied. A 2³ factorial design with two replications was performed. The analysis of variance for the designed experiment showed the strong influence of cleaning time and hanger shape. From this study, the optimal cleaning time was determined, and a newly designed electrically conductive hanger proved to be superior to the original one. The experimental verification results showed that the amount of incomplete coating defects decreased from 4% to 1.02% and operating cost decreased by 10.5%.
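A compact sketch of how main effects are read off a 2³ design in Python (the defect-rate numbers are invented for illustration; only the coded design matrix and the effect calculation mirror the standard procedure):

```python
import numpy as np

# Coded design matrix for a 2^3 factorial: cleaning time, chemical concentration, hanger shape.
levels = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])

# Hypothetical incomplete-coating rates (%) for the two replications of each of the 8 runs.
y = np.array([[4.1, 3.9], [3.2, 3.4], [3.8, 4.0], [2.9, 3.1],
              [2.5, 2.7], [1.6, 1.8], [2.2, 2.4], [1.0, 1.2]]).mean(axis=1)

# Main effect of each factor = mean response at the +1 level minus mean at the -1 level.
for name, col in zip(["cleaning time", "concentration", "hanger shape"], levels.T):
    effect = y[col == 1].mean() - y[col == -1].mean()
    print(f"{name:>15s}: {effect:+.3f}")
```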

Keywords: Defect reduction, design of experiments, electrodeposition painting, stainless steel.

PDF Downloads: 2229