Search results for: decision making process
5783 Multiple Model and Neural based Adaptive Multi-loop PID Controller for a CSTR Process
Authors: R. Vinodha, S. Abraham Lincoln, J. Prakash
Abstract:
Multi-loop (de-centralized) Proportional-Integral-Derivative (PID) controllers have been used extensively in process industries, due to their simple structure, for the control of multivariable processes. The objective of this work is to design a multiple-model adaptive multi-loop PID strategy (Multiple Model Adaptive-PID) and a neural-network-based multi-loop PID strategy (Neural Net Adaptive-PID) for the control of a multivariable system. The first method combines the outputs of multiple linear PID controllers, each describing the process dynamics at a specific level of operation. The global output is an interpolation of the individual multi-loop PID controller outputs, weighted based on the current value of the measured process variable. In the second method, a neural network is used to calculate the PID controller parameters based on the scheduling variable that corresponds to the major shift in the process dynamics. The proposed control schemes are simple in structure with low computational complexity. The effectiveness of the proposed control schemes has been demonstrated on the CSTR process, which exhibits dynamic non-linearity.
Keywords: Multiple-model adaptive PID controller, multivariable process, CSTR process.
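To make the interpolation idea above concrete, the following sketch blends the outputs of three local PID controllers using weights driven by the current measured value; the operating points, gains and triangular weighting width are illustrative placeholders, not values from the paper.

```python
import numpy as np

# Hypothetical operating points of the measured process variable and
# local PID gains tuned around each point (illustrative values only).
OP_POINTS = np.array([0.2, 0.5, 0.8])
LOCAL_GAINS = [dict(kp=2.0, ki=0.5, kd=0.10),
               dict(kp=1.2, ki=0.3, kd=0.05),
               dict(kp=0.8, ki=0.2, kd=0.02)]

def weights(y):
    """Triangular membership weights based on the current measurement y."""
    w = np.maximum(0.0, 1.0 - np.abs(y - OP_POINTS) / 0.3)
    return w / w.sum() if w.sum() > 0 else np.ones_like(w) / len(w)

def pid_output(gains, error, integral, derivative):
    return gains["kp"] * error + gains["ki"] * integral + gains["kd"] * derivative

def adaptive_output(y, error, integral, derivative):
    """Global control action: weighted interpolation of the local PID outputs."""
    w = weights(y)
    return sum(wi * pid_output(g, error, integral, derivative)
               for wi, g in zip(w, LOCAL_GAINS))
```

A gain-scheduled variant along the lines of the second method would instead feed the scheduling variable to a trained neural network that returns the PID parameters directly.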
5782 The Mechanistic and Oxidative Study of Methomyl and Parathion Degradation by Fenton Process
Authors: Chihhao Fan, Ming-Chu Liao
Abstract:
The purpose of this study is to investigate the chemical degradation of the organophosphorus pesticide parathion and the carbamate insecticide methomyl in the aqueous phase through the Fenton process. Using a batch Fenton process, the degradation of the two selected pesticides at different pH, initial concentration, humic acid concentration, and Fenton reagent dosages was explored. The Fenton process was found effective in degrading parathion and methomyl. The optimal dosage of Fenton reagents (i.e., the molar concentration ratio of H2O2 to Fe2+) at pH 7 for parathion degradation was equal to 3, which resulted in 50% removal of parathion. Similarly, the optimal dosage for methomyl degradation was 1, resulting in 80% removal of methomyl. This study also found that the presence of humic substances significantly enhanced pesticide degradation by the Fenton process. The mass spectroscopy results showed that the hydroxyl free radical may attack the single bonds of the investigated pesticides with the least bond energy to form smaller molecules, which are more easily degraded through either physico-chemical or biological processes.
Keywords: Fenton process, humic acid, methomyl, parathion, pesticides.
5781 A Bathtub Curve from Nonparametric Model
Authors: Eduardo C. Guardia, Jose W. M. Lima, Afonso H. M. Santos
Abstract:
This paper presents a nonparametric method to obtain the hazard rate “bathtub curve” for power system components. The model is a mixture of the three known phases of a component's life, the decreasing failure rate (DFR), the constant failure rate (CFR) and the increasing failure rate (IFR), each represented by a parametric Weibull model. The parameters are obtained from a simultaneous fit of the model to the kernel nonparametric hazard rate curve. From the Weibull parameters and failure rate curves, the useful lifetime and the characteristic lifetime were defined. To demonstrate the model, historic time-to-failure data of distribution transformers were used as an example. The resulting “bathtub curve” shows the failure rate over the equipment lifetime, which can be applied in economic and replacement decision models.
Keywords: Bathtub curve, failure analysis, lifetime estimation, parameter estimation, Weibull distribution.
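As a rough illustration of the three-phase mixture described above, the sketch below sums a DFR, a CFR and an IFR Weibull hazard term; the shape and scale values are invented for illustration and would in practice come from fitting the model to the kernel hazard estimate as the paper describes.

```python
import numpy as np

def weibull_hazard(t, shape, scale):
    """Weibull hazard rate h(t) = (k/lambda) * (t/lambda)**(k-1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def bathtub_hazard(t):
    # Illustrative parameters: shape < 1 (infant mortality, DFR),
    # shape = 1 (useful life, CFR), shape > 1 (wear-out, IFR).
    return (weibull_hazard(t, shape=0.5, scale=2.0) +
            weibull_hazard(t, shape=1.0, scale=30.0) +
            weibull_hazard(t, shape=5.0, scale=40.0))

t = np.linspace(0.1, 50, 500)
h = bathtub_hazard(t)  # decreasing, then roughly constant, then increasing
```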
5780 Drop Impact on a Vibrated, Heated Surface: Towards a Potential New Way of Elaborating Nuclear Fuel from Gel Microspheres
Authors: Méryl Brothier, Dominique Moulinier, Christophe Bertaux
Abstract:
The gel-supported precipitation (GSP) process can be used to make spherical particles (spherules) of nuclear fuel, particularly for very high temperature reactors (VHTR) and even for implementing the process called SPHEREPAC. In these different cases, the main characteristics are the sphericity of the particles to be manufactured and the control over their grain size. Nonetheless, depending on the specifications defined for these spherical particles, the GSP process has intrinsic limits, particularly when fabricating very small particles. This paper describes the use of secondary fragmentation of drops (water, water/PVA and uranyl nitrate) on solid surfaces under varying temperature and vibration conditions to assess the relevance of this new technique for manufacturing very small spherical particles by means of a modified GSP process. The fragmentation mechanisms are monitored and analysed, before the trends for its subsequent optimised application are described.
Keywords: Microsphere elaboration, nuclear fuel, droplet impact, gel-supported precipitation process.
5779 Quality-Driven Business Process Refactoring
Authors: María Fernández-Ropero, Ricardo Pérez-Castillo, Ismael Caballero, Mario Piattini
Abstract:
Appropriate description of business processes through standard notations has become one of the most important assets for organizations. Organizations must therefore deal with quality faults in business process models such as a lack of understandability and modifiability. These quality faults may be exacerbated if business process models are mined by reverse engineering, e.g., from existing information systems that support those business processes. Hence, business process refactoring is often used, which changes the internal structure of business processes whilst their external behavior is preserved. This paper aims to choose the most appropriate set of refactoring operators through quality assessment concerning understandability and modifiability. These quality features are assessed through well-proven measures proposed in the literature. Additionally, a set of measure thresholds is heuristically established for applying the most promising refactoring operators, i.e., those that achieve the highest quality improvement according to the selected measures in each case.
Keywords: Business process model, modifiability, refactoring, understandability.
5778 A New Group Key Management Protocol for Wireless Ad-Hoc Networks
Authors: Rony H. Rahman, Lutfar Rahman
Abstract:
Ad hoc networks are characterized by multi-hop wireless connectivity and frequently changing network topology. Forming a security association among a group of nodes in ad-hoc networks is more challenging than in conventional networks due to the lack of a central authority, i.e. a fixed infrastructure. With that view in mind, group key management is an important building block of any secure group communication. The main contribution of this paper is a low-complexity key management scheme that is suitable for fully self-organized ad-hoc networks. The protocol is also password authenticated, making it resilient against active attacks. Unlike other existing key agreement protocols, ours makes no assumption about the structure of the underlying wireless network, making it suitable for “truly ad-hoc” networks. Finally, we analyze our protocol to show the computation and communication burden on individual nodes for key establishment.
Keywords: Ad-hoc networks, group key management, key management protocols, password authentication.
5777 On the Differentiation of Strategic Spatial Planning Making Mechanisms in New Era: Between Melbourne and Tianjin
Abstract:
Strategic spatial planning, regarded as an effective and competitive way for city governments to improve the development and management of a city, has been blooming in recent years all over the world. In the context of globalization and informatization, strategic spatial planning must shift its focus across three different levels: global, regional and urban. Internal and external changes in environmental conditions lead to new advances in strategic planning, both theoretically and practically. However, such advances or changes affect cities differently on account of different dynamic mechanisms. This article focuses on two cities, Tianjin in China and Melbourne in Australia, and, through a comparative study of strategic planning, explores the differentiation of their plan-making mechanisms. By comparison and exploration, the purpose of this article is to exhibit the two different planning worlds of Western and Chinese practice today.
Keywords: Differentiation, Tianjin China, Melbourne Australia, strategic planning.
5776 On Speeding Up Support Vector Machines: Proximity Graphs Versus Random Sampling for Pre-Selection Condensation
Authors: Xiaohua Liu, Juan F. Beltran, Nishant Mohanchandra, Godfried T. Toussaint
Abstract:
Support vector machines (SVMs) are considered to be among the best machine learning algorithms for minimizing the predictive probability of misclassification. However, their drawback is that for large data sets the computation of the optimal decision boundary is a time-consuming function of the size of the training set. Hence several methods have been proposed to speed up the SVM algorithm. Here, three methods used to speed up the computation of the SVM classifiers are compared experimentally using a musical genre classification problem. The simplest method pre-selects a random sample of the data before the application of the SVM algorithm. Two additional methods use proximity graphs to pre-select data that are near the decision boundary; one uses k-Nearest Neighbor graphs and the other Relative Neighborhood Graphs to accomplish the task.
Keywords: Machine learning, data mining, support vector machines, proximity graphs, relative neighborhood graphs, k-nearest neighbor graphs, random sampling, training data condensation.
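A minimal sketch of the simplest condensation method (random-sample pre-selection) is shown below, with synthetic data standing in for the musical-genre features; the proximity-graph variants would instead keep only training points whose graph neighborhoods contain mixed class labels, which is not reproduced here.

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic stand-in for the feature vectors and genre labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 20))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# Pre-selection by random sampling: train on a small random subset only.
idx = rng.choice(len(X), size=1_000, replace=False)
svm_small = SVC(kernel="rbf").fit(X[idx], y[idx])

# Baseline trained on the full set (much slower for large data sets).
svm_full = SVC(kernel="rbf").fit(X, y)

print(svm_small.score(X, y), svm_full.score(X, y))
```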
5775 A Generic Approach to Reuse Unified Modeling Language Components Following an Agile Process
Authors: Rim Bouhaouel, Naoufel Kraïem, Zuhoor Al Khanjari
Abstract:
The Unified Modeling Language (UML) is considered one of the most widespread modeling languages standardized by the Object Management Group (OMG). Therefore, the model-driven engineering (MDE) community attempts to provide reuse of UML diagrams rather than constructing them from scratch. A UML model appears according to a specific software development process. Existing model generation methods focus on different transformation techniques without considering the development process. Our work aims to construct a UML component from fragments of UML diagrams based on an agile method. We define a UML fragment as a portion of a UML diagram which expresses a business target. To guide the generation of fragments of UML models using an agile process, we need a flexible approach which adapts to agile changes and covers all its activities. We use the software product line (SPL) approach to derive a fragment of the agile process method. This paper explains our approach, named RECUP, to generate UML fragments following an agile process, and gives an overview of its different aspects. In this paper, we present the approach and define the different phases and artifacts.
Keywords: UML, component, fragment, agile, SPL.
5774 A New Approach for Predicting and Optimizing Weld Bead Geometry in GMAW
Authors: Farhad Kolahan, Mehdi Heidari
Abstract:
Gas Metal Arc Welding (GMAW) is an important joining process widely used in metal fabrication industries. This paper addresses modeling and optimization of this technique using a set of experimental data and regression analysis. The set of experimental data has been used to assess the influence of GMAW process parameters on weld bead geometry. The process variables considered here include voltage (V), wire feed rate (F), torch angle (A), welding speed (S) and nozzle-to-plate distance (D). The process output characteristics include weld bead height, width and penetration. The Taguchi method and regression modeling are used in order to establish the relationships between input and output parameters. The adequacy of the model is evaluated using the analysis of variance (ANOVA) technique. In the next stage, the proposed model is embedded into a Simulated Annealing (SA) algorithm to optimize the GMAW process parameters. The objective is to determine a suitable set of process parameters that can produce the desired bead geometry, considering the ranges of the process parameters. Computational results prove the effectiveness of the proposed model and optimization procedure.
Keywords: Weld bead geometry, GMAW welding, process parameters optimization, modeling, SA algorithm.
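The sketch below illustrates the second stage, embedding a regression model in a simulated annealing search; the regression coefficients, parameter bounds and target penetration are hypothetical placeholders, since the paper's fitted equations are not reproduced in the abstract.

```python
import math
import random

# Hypothetical regression model for one response (e.g. penetration) as a
# function of the process parameters; the real coefficients would come from
# the paper's fitted regression, which is not shown here.
def predicted_penetration(v, f, a, s, d):
    return 1.2 + 0.05 * v + 0.03 * f - 0.01 * a - 0.04 * s - 0.02 * d

BOUNDS = {"v": (20, 32), "f": (2, 10), "a": (50, 70), "s": (30, 60), "d": (10, 22)}
TARGET = 3.5  # desired penetration (illustrative)

def cost(x):
    return (predicted_penetration(**x) - TARGET) ** 2

def neighbour(x):
    # Perturb one randomly chosen parameter, respecting its bounds.
    k = random.choice(list(x))
    lo, hi = BOUNDS[k]
    y = dict(x)
    y[k] = min(hi, max(lo, x[k] + random.uniform(-1, 1)))
    return y

def simulated_annealing(iters=5000, t0=1.0, alpha=0.999):
    x = {k: random.uniform(*b) for k, b in BOUNDS.items()}
    best, t = x, t0
    for _ in range(iters):
        cand = neighbour(x)
        delta = cost(cand) - cost(x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = cand
        if cost(x) < cost(best):
            best = x
        t *= alpha
    return best
```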
5773 Simulation of the Extensional Flow Mixing of Molten Aluminium and Fly Ash Nanoparticles
Authors: O. Ualibek, C. Spitas, V. Inglezakis, G. Itskos
Abstract:
This study presents simulations of an aluminium melt containing an initially non-dispersed fly ash nanoparticle phase. Mixing is effected predominantly by means of forced extensional flow via either straight or slanted orifices. The sensitivity to various process parameters is determined. The simulated process is used for the production of cast fly ash-aluminium nanocomposites. The possibilities for rod and plate stock grading in the context of a continuous casting process implementation are discussed.
Keywords: Metal matrix composites, fly ash nanoparticles, aluminium 2024, agglomeration.
5772 Contractor Selection in Saudi Arabia
Authors: M. A. Bajaber, M. A. Taha
Abstract:
Contractor selection in Saudi Arabia is very important due to the large construction boom and the contractor's role in overcoming construction risks. The need to investigate contractor selection stems from the following reasons: the large number of defaulted or failed projects (18%), the large number of disputes attributed to the contractor during the project execution stage (almost twofold), the extension of the General Agreement on Tariffs and Trade (GATT) into the construction industry, and finally the small number of existing studies. The selection strategy is not perfect and is considered the reason behind irresponsible contractors. In response, this research was conducted to review contractor selection strategies as an integral part of a longer, advanced research effort to develop a good selection model. Many techniques can be used to form a selection strategy: multi-criteria methods for optimizing the decision, prequalification to discover the contractor's responsibility, a bidding process for competition, third-party guarantees to enhance the selection, and fuzzy techniques for ambiguities and incomplete information.
Keywords: Bidding, Construction industry, Contractor selection, Saudi Arabia.
5771 Additional Considerations on a Sequential Life Testing Approach using a Weibull Model
Authors: D. I. De Souza, D. R. Fonseca, R. Rocha
Abstract:
In this paper we develop further the sequential life test approach presented in a previous article [1], using an underlying two-parameter Weibull sampling distribution. The minimum life will be considered equal to zero. We again provide rules for making one of the three possible decisions as each observation becomes available, that is: accept the null hypothesis H0; reject the null hypothesis H0; or obtain additional information by making another observation. The product being analyzed is a new type of low-alloy, high-strength steel product. To estimate the shape and scale parameters of the underlying Weibull model we use a maximum likelihood approach for censored failure data. A new example further develops the proposed sequential life testing approach.
Keywords: Sequential life testing, underlying Weibull model, maximum likelihood approach, hypothesis testing.
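A small sketch of the maximum likelihood step for censored Weibull data is given below; the failure times and censoring flags are invented for illustration, and the minimum life is taken as zero, as stated above.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical failure times (hours) and censoring flags: 1 = observed failure,
# 0 = right-censored (test ended before failure). Values are illustrative.
t = np.array([120., 340., 560., 800., 950., 1000., 1000.])
failed = np.array([1, 1, 1, 1, 1, 0, 0])

def neg_log_lik(params):
    """Two-parameter Weibull log-likelihood with right censoring."""
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    z = t / scale
    log_f = np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape  # density term
    log_s = -z ** shape                                                   # survival term
    return -np.sum(failed * log_f + (1 - failed) * log_s)

res = minimize(neg_log_lik, x0=[1.0, 500.0], method="Nelder-Mead")
shape_hat, scale_hat = res.x
```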
5770 A Probabilistic Optimization Approach for a Gas Processing Plant under Uncertain Feed Conditions and Product Requirements
Authors: G. Mesfin, M. Shuhaimi
Abstract:
This paper proposes a new optimization technique for a gas processing plant with uncertain feed and product flows. The problem is first formulated using a continuous linear deterministic approach. Subsequently, single and joint chance constraint models for a steady-state process with time-dependent uncertainties have been developed. The solution approach is based on converting the probabilistic problems into their equivalent deterministic form, solved at different confidence levels. A case study of a real plant operation has been used to effectively implement the proposed model. The optimization results indicate that prior decisions have to be made for the operating plant under uncertain feed and product flows by satisfying all the constraints at a 95% confidence level for the single chance-constrained case and an 85% confidence level for the joint chance-constrained case.
Keywords: Butane, feed composition, LPG, product specification, propane.
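For reference, the usual way a single linear chance constraint is converted to its deterministic equivalent (a standard textbook result; the paper's exact constraint structure is not shown in the abstract) is, for a normally distributed right-hand side b with mean mu_b and standard deviation sigma_b:

```latex
\Pr\{\, a^{\top}x \le b \,\} \ge \alpha
\quad\Longleftrightarrow\quad
a^{\top}x \;\le\; \mu_b - \Phi^{-1}(\alpha)\,\sigma_b ,
\qquad \tfrac{1}{2} < \alpha < 1 ,
```

where Phi^{-1} is the standard normal quantile; a joint chance constraint over several rows is commonly handled by splitting the confidence level across the individual constraints (a Bonferroni-type bound) or by a joint distributional argument.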
5769 Worth of Sick Building Syndrome and Enhance the Quality of Life in Green Building
Authors: Kamyar Kabirifar, Majid Azarniush, Behbood Maashkar
Abstract:
A proper house is a suitable residential area which provides comfort, proper accessibility, security, stability and permanence of structure, enough lighting, proper initial infrastructure and ventilation for its inhabitants and, most important of all, should be proportional to the family’s financial power.
Saving energy, making optimal use of it, and taking advantage of sustainable energies are the bases of green buildings. Making a building green will help the health of the people living in it and in its surroundings, support them and promote their satisfaction. Not only will it raise the quality of life of the building's inhabitants, but it will also promote the quality of life of the people living in the surrounding area and of society in general.
Keywords: Quality of Life, Green Building, environment pollution, Sick Building.
5768 Performance Comparison of ADTree and Naive Bayes Algorithms for Spam Filtering
Authors: Thanh Nguyen, Andrei Doncescu, Pierre Siegel
Abstract:
Classification is an important data mining technique and can be used for data filtering in artificial intelligence. The broad applicability of classification to all kinds of data means it is used in nearly every field of modern life. Classification helps us group different items according to the features judged interesting and useful. In this paper, we compare two classification methods, Naïve Bayes and ADTree, used to detect spam e-mail. This choice is motivated by the fact that the Naive Bayes algorithm is based on probability calculus while the ADTree algorithm is based on decision trees. The parameter settings of the above classifiers aim at maximizing the true positive rate and minimizing the false positive rate. The experimental results present classification accuracy and cost analysis in view of the optimal classifier choice for spam detection. The number of attributes is also examined to obtain a trade-off between the number of attributes and classification accuracy.
Keywords: Classification, data mining, spam filtering, naive Bayes, decision tree.
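A minimal sketch of the Naive Bayes side of the comparison is given below, using a toy corpus and reporting the true/false positive rates mentioned above; ADTree has no scikit-learn implementation and is usually run in tools such as Weka, so it is not shown.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import confusion_matrix

# Tiny illustrative corpus; a real evaluation would use a labelled spam corpus.
emails = ["win money now", "cheap meds offer", "meeting at noon",
          "project report attached", "free prize claim now", "lunch tomorrow?"]
labels = [1, 1, 0, 0, 1, 0]  # 1 = spam, 0 = ham

X = CountVectorizer().fit_transform(emails)
clf = MultinomialNB().fit(X, labels)

pred = clf.predict(X)
tn, fp, fn, tp = confusion_matrix(labels, pred).ravel()
tpr = tp / (tp + fn)   # true positive rate to be maximized
fpr = fp / (fp + tn)   # false positive rate to be minimized
print(tpr, fpr)
```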
5767 A Hybrid Approach to Fault Detection and Diagnosis in a Diesel Fuel Hydrotreatment Process
Authors: Salvatore L., Pires B., Campos M. C. M., De Souza Jr M. B.
Abstract:
It is estimated that the total cost of abnormal conditions to US process industries is around $20 billion in annual losses. The hydrotreatment (HDT) of diesel fuel in petroleum refineries is a conversion process that leads to highly profitable economic returns. However, this is a difficult process to control because it is operated continuously, with high hydrogen pressures, and it is also subject to disturbances in feed properties and catalyst performance. So, automatic fault detection and diagnosis plays an important role in this context. In this work, a hybrid approach based on neural networks together with a post-processing classification algorithm is used to detect faults in a simulated HDT unit. Nine classes (8 faults and the normal operation) were correctly classified using the proposed approach in a maximum time of 5 minutes, based on on-line process data measurements.
Keywords: Fault detection, hydrotreatment, hybrid systems, neural networks.
5766 The System Identification and PID Lead-lag Control for Two Poles Unstable SOPDT Process by Improved Relay Method
Authors: V. K. Singh, P. K. Padhy
Abstract:
This paper describes the identification of the two-pole unstable SOPDT process, especially with large time delay. A new modified relay feedback identification method for the two-pole unstable SOPDT process is proposed. Furthermore, for this process, an additional derivative controller is incorporated in parallel with the relay to relax the constraint on the ratio of the delay to the unstable time constant, so that the exact model parameters of unstable processes can be identified. To cope with measurement noise in practice, a low-pass filter is suggested to obtain a denoised output signal and improve the accuracy of the model parameters of the unstable process. PID lead-lag tuning formulas are derived for two-pole unstable SOPDT processes based on the IMC principle. A simulation example illustrates the effectiveness and simplicity of the proposed identification and control method.
Keywords: IMC structure, PID lead-lag controller, relay feedback, SOPDT.
5765 A Framework for Identifying the Critical Factors Affecting the Decision to Adopt and Use Inter-Organizational Information Systems
Authors: K. Bouchbout, Z. Alimazighi
Abstract:
The importance of inter-organizational systems (IOS) has been increasingly recognized by organizations. However, IOS adoption has proved to be difficult and, at this stage, why this is so has not been fully uncovered. In practice, benefits have often remained concentrated, primarily accruing to the dominant party, resulting in low rates of adoption and usage, and often culminating in the failure of the IOS. The main research question is why organizations initiate or join IOS and what factors influence their adoption and use levels. This paper reviews the literature on IOS adoption and proposes a theoretical framework in order to identify the critical factors and capture a complete picture of IOS adoption. With the proposed critical factors, we are able to investigate their relative contributions to IOS adoption decisions. Our findings suggest that there are five groups of factors that significantly affect the adoption and use decision of IOS in the Supply Chain Management (SCM) context: 1) inter-organizational context, 2) organizational context, 3) technological context, 4) perceived costs, and 5) perceived benefits.
Keywords: Business-to-business relationships, buyer-supplier relationships, critical factors, inter-organizational information systems, IOS adoption and use.
5764 Automation of Heat Exchanger using Neural Network
Authors: Sudhir Agashe, Ashok Ghatol, Sujata Agashe
Abstract:
In this paper the development of a heat exchanger as a pilot plant for educational purposes is discussed, and the use of a neural network for controlling the process is presented. The aim of the study is to highlight the need for a specific Pseudo Random Binary Sequence (PRBS) to excite a process under control. As the neural network is a data-driven technique, the method of data generation plays an important role; in light of this, a careful experimentation procedure for data generation was a crucial task. Heat exchange is a complex process, which has a capacity and a time lag as process elements. The proposed system is a typical pipe-in-pipe type heat exchanger. The complexity of the system demands careful selection, proper installation and commissioning. The temperature, flow, and pressure sensors play a vital role in the control performance. The final control element used is a pneumatically operated control valve. While carrying out the experimentation on the heat exchanger, a well-drafted procedure was followed, giving utmost attention to the safety of the system. The results obtained are encouraging and reveal that if the process parameters are completely known and the utilities are well stabilized, then feedback systems are suitable, whereas the neural network control paradigm is useful for processes with nonlinearity and limited process knowledge. The implementation of NN control reinforces the concepts of process control and the NN control paradigm. The results also underline the importance of the excitation signal chosen for that particular process. Data acquisition, processing, and presentation in a typical format are the most important aspects when validating the results.
Keywords: Process identification, neural network, heat exchanger.
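Since the abstract stresses the role of a suitable PRBS excitation, the sketch below shows one common way such a signal is generated (a maximal-length sequence from a linear feedback shift register); the register length, taps and amplitude are illustrative and would in practice be chosen to match the heat exchanger's dominant time constant.

```python
import numpy as np

def prbs(n_bits, taps=(7, 6), length=200):
    """Generate a +/-1 pseudo-random binary sequence from an n-bit LFSR.

    taps are 1-indexed feedback positions; (7, 6) gives a maximal-length
    PRBS7 sequence of period 2**7 - 1 = 127.
    """
    state = [1] * n_bits
    out = []
    for _ in range(length):
        out.append(1.0 if state[-1] else -1.0)
        feedback = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [feedback] + state[:-1]
    return np.array(out)

excitation = prbs(7)  # could be scaled and applied to the control valve
```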
5763 Understanding and Predicting Foam in Anaerobic Digester
Authors: I. R. Kanu, T. J. Aspray, A. J. Adeloye
Abstract:
As a result of the ambiguity and complexity surrounding anaerobic digester foaming, efforts have been made by various researchers to understand the process of anaerobic digester foaming so as to proffer a solution that can be universally applied rather than being site-specific. All attempts, ranging from experimental analysis to comparative reviews of other processes, have not fully explained the conditions and process of foaming in anaerobic digesters. Studying the currently available knowledge on foam formation and relating it to the anaerobic digester process and operating conditions, this work presents a succinct and enhanced understanding of foaming in anaerobic digesters, as well as introducing a simple method to identify the onset of anaerobic digester foaming based on analysis of historical data from a field-scale system.
Keywords: Anaerobic digester, foam, biogas, surfactants, wastewater sludge.
5762 Post Pandemic Mobility Analysis through Indexing and Sharding in MongoDB: Performance Optimization and Insights
Authors: Karan Vishavjit, Aakash Lakra, Shafaq Khan
Abstract:
The COVID-19 pandemic has pushed healthcare professionals to use big data analytics as a vital tool for tracking and evaluating the effects of contagious viruses. To effectively analyse huge datasets, efficient NoSQL databases are needed. The analysis of post-COVID-19 health and well-being outcomes and the evaluation of the effectiveness of government efforts during the pandemic are made possible by this research’s integration of several datasets, which cuts down on query processing time and creates predictive visual artifacts. We recommend applying sharding and indexing technologies to improve query effectiveness and scalability as the dataset expands. Effective data retrieval and analysis are made possible by spreading the datasets into a sharded database and indexing the individual shards. Analysis of the connections between governmental activities, poverty levels, and post-pandemic well-being is the key goal. We aim to evaluate the effectiveness of governmental initiatives to improve health and lower poverty levels, and we do this by utilising advanced data analysis and visualisations. The findings provide relevant data that support the advancement of the UN sustainable development goals, future pandemic preparation, and evidence-based decision-making. This study shows how Big Data and NoSQL databases may be used to address problems in global health.
Keywords: COVID-19, big data, data analysis, indexing, NoSQL, sharding, scalability, poverty.
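A minimal pymongo sketch of the indexing and sharding steps described above is given below; the connection string, database, collection and field names are hypothetical, and the sharding commands assume a connection to a mongos router of an already configured sharded cluster.

```python
from pymongo import MongoClient, ASCENDING

# Hypothetical connection string, database, collection and field names.
client = MongoClient("mongodb://localhost:27017")
db = client["covid_db"]
mobility = db["mobility"]

# Index the fields used by the analytical queries to cut query processing time.
mobility.create_index([("country", ASCENDING), ("date", ASCENDING)])

# Distribute the collection across shards on a chosen shard key.
client.admin.command("enableSharding", "covid_db")
client.admin.command("shardCollection", "covid_db.mobility",
                     key={"country": 1, "date": 1})
```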
5761 Selection of Solid Waste Landfill Site Using Geographical Information System (GIS)
Abstract:
Rapid population growth, urbanization and industrialization are known to be among the most important causes of environmental problems. The elimination and management of solid waste are also among the most important environmental problems. One of the main problems in solid waste management is the selection of the best site for the disposal of solid waste. Lately, Geographical Information Systems (GIS) have been used to ease the selection of landfill sites. GIS has the ability to represent the necessary economic, environmental and political constraints, and it plays an important role as a decision support tool in the site selection of landfill areas. In this study, map layers are examined so as to minimize the effect of environmental, social and cultural factors and to maximize the effect of engineering/economic factors in the site selection of landfill areas, and the use of GIS as a decision support mechanism in solid waste landfill site selection is presented for the Güzelyurt district of the city of Aksaray, Turkey.
Keywords: GIS, landfill, solid waste, spatial analysis.
5760 Detergent Removal from Rinsing Water by Peroxi Electrocoagulation Process
Authors: A. Benhadji, M. Taleb Ahmed
Abstract:
Among the various methods of treatment, advanced oxidation processes (AOP) are the most promising ones. In this study, the Peroxi Electrocoagulation Process (PEP) was investigated for the treatment of detergent wastewater. The process was compared with electrooxidation treatment. The results showed that the chemical oxygen demand (COD) was high (7584 mg O2.L-1), while the biochemical oxygen demand was low (250 mg O2.L-1); this wastewater was therefore hardly biodegradable. The electrochemical process was carried out for the removal of detergent using a glass reactor with a volume of 1 L fitted with three electrodes. A direct current (DC) supply was used. Samples were taken at various current densities (0.0227 A/cm2 to 0.0378 A/cm2) and reaction times (1, 2, 3, 4 and 5 hours), and the COD was then determined. The results indicated that the COD removal efficiency of PEP increased with current density and reached 77% after 5 h of treatment, the highest removal efficiency observed.
Keywords: Advanced oxidation processes, chemical oxygen demand, COD, detergent, peroxi electrocoagulation process, PEP, wastewater
5759 Topological Properties of an Exponential Random Geometric Graph Process
Authors: Yilun Shang
Abstract:
In this paper we consider a one-dimensional random geometric graph process with the inter-nodal gaps evolving according to an exponential AR(1) process. The transition probability matrix and stationary distribution are derived for the Markov chains concerning connectivity and the number of components. We analyze the algorithm for the hitting time regarding disconnectivity. In addition to dynamical properties, we also study topological properties of static snapshots. We obtain the degree distributions as well as precise asymptotic bounds and a strong law of large numbers for the connectivity threshold distance and the largest nearest-neighbor distance, amongst others. Both exact results and limit theorems are provided in this paper.
Keywords: Random geometric graph, autoregressive process, degree, connectivity, Markovian, wireless network.
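For intuition, the sketch below simulates one common exponential AR(1) construction for the inter-nodal gaps (the Gaver-Lewis EAR(1), which keeps exponential marginals) and checks connectivity of the resulting one-dimensional geometric graph; the paper's exact gap model may differ, so the construction and parameters here are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def ear1_gaps(n, rho=0.5, rate=1.0):
    """Gaver-Lewis EAR(1) gap sequence: G_{k+1} = rho*G_k + eps_k, where
    eps_k = 0 with probability rho and Exp(rate) otherwise; each gap then
    has an exponential marginal distribution with the given rate."""
    g = np.empty(n)
    g[0] = rng.exponential(1 / rate)
    for k in range(1, n):
        eps = 0.0 if rng.random() < rho else rng.exponential(1 / rate)
        g[k] = rho * g[k - 1] + eps
    return g

def is_connected(gaps, r):
    """A 1-D geometric graph with connection radius r is connected
    iff every consecutive inter-nodal gap is at most r."""
    return bool(np.all(gaps <= r))

gaps = ear1_gaps(100)
print(is_connected(gaps, r=2.0))
```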
5758 Modeling And Analysis of Simple Open Cycle Gas Turbine Using Graph Networks
Authors: Naresh Yadav, I.A. Khan, Sandeep Grover
Abstract:
This paper presents a unified approach based on graph theory and system theory postulates for the modeling and analysis of a simple open cycle gas turbine system. The simple open cycle gas turbine system has been modeled up to its subsystem level, and the system variables have been identified to develop the process subgraphs. The theorems and algorithms of graph theory have been used to represent behavioural properties of the system such as heat and work transfer rates, pressure drops and temperature drops in the processes involved. The processes have been represented as edges of the process subgraphs and their limits as the vertices of the process subgraphs. The system's across variables and through variables have been used to develop the terminal equations of the process subgraphs of the system. The set of equations developed for the vertices and edges of the network graph is used to solve the system for its process variables.
Keywords: Simple open cycle gas turbine, graph theoretic approach, process subgraphs, gas turbine system modeling, system theory.
5757 Using the Monte Carlo Simulation to Predict the Assembly Yield
Authors: C. Chahin, M. C. Hsu, Y. H. Lin, C. Y. Huang
Abstract:
Electronic products that achieve high levels of integration of communications, computing, entertainment and multimedia features in small, stylish and robust new form factors are winning in the marketplace. Because of the high costs the industry may incur, and because high yield is directly proportional to high profits, IC (Integrated Circuit) manufacturers struggle to maximize yield; yet today's customers demand miniaturization, low cost, high performance and excellent reliability, making yield maximization a never-ending search for an enhanced assembly process. With factors such as minimum tolerances and tighter parameter variations, a systematic approach is needed in order to predict the assembly process. In order to evaluate the quality of upcoming circuits, yield models are used which not only predict manufacturing costs but also provide vital information to ease the process of correction when yields fall below expectations. For an IC manufacturer to obtain higher assembly yields, all factors such as boards, placement, components, the materials from which the components are made, and processes must be taken into consideration. Effective placement yield depends heavily on machine accuracy and on the vision system, which needs the ability to recognize the features on the board and component in order to place the device accurately on the pads and bumps of the PCB. There are currently two methods for accurate positioning: using the edge of the package, and using solder ball locations, also called footprints. The only assumption that a yield model makes is that all boards and devices are completely functional. This paper focuses on the Monte Carlo method, a class of computational algorithms that depends on repeated random sampling to compute results. This method is used here to simulate the placement and assembly processes within a production line.
Keywords: Monte Carlo simulation, placement yield, PCB characterization, electronics assembly.
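A minimal Monte Carlo sketch of the placement-yield idea is shown below: random placement offsets are drawn for every joint on every simulated board, and a board passes only if all offsets stay within tolerance. The error model, standard deviation and tolerance are illustrative assumptions, not the paper's characterized values.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical process parameters: placement offsets (mm) are modelled as
# Gaussian machine errors; a joint is good if its offset stays within tolerance.
N_BOARDS = 20_000
JOINTS_PER_BOARD = 200
SIGMA_XY = 0.03      # placement standard deviation per axis (mm), illustrative
TOLERANCE = 0.10     # maximum acceptable radial offset (mm), illustrative

dx = rng.normal(0.0, SIGMA_XY, size=(N_BOARDS, JOINTS_PER_BOARD))
dy = rng.normal(0.0, SIGMA_XY, size=(N_BOARDS, JOINTS_PER_BOARD))
offset = np.hypot(dx, dy)

board_ok = np.all(offset <= TOLERANCE, axis=1)  # a board passes if every joint is in spec
assembly_yield = board_ok.mean()
print(f"Estimated assembly yield: {assembly_yield:.2%}")
```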
5756 A Data Mining Model for Detecting Financial and Operational Risk Indicators of SMEs
Authors: Ali Serhan Koyuncugil, Nermin Ozgulbas
Abstract:
In this paper, a data mining model for detecting financial and operational risk indicators of SMEs is presented. The identification of the risk factors, by clarifying the relationships between the variables, defines the discovery of knowledge from the financial and operational variables. This automatic, estimation-oriented information discovery process coincides with the definition of data mining. During the formation of the model, an easy-to-understand, easy-to-interpret and easy-to-apply utilitarian model that does not require a theoretical background is targeted, through the discovery of the implicit relationships between the data and the identification of the effect level of every factor. In addition, this paper is based on a project funded by The Scientific and Technological Research Council of Turkey (TUBITAK).
Keywords: Risk Management, Financial Risk, Operational Risk, Financial Early Warning System, Data Mining, CHAID Decision Tree Algorithm, SMEs.
5755 Change Management in Business Process Modeling Based on Object Oriented Petri Net
Authors: Bassam Atieh Rajabi, Sai Peck Lee
Abstract:
Business Process Modeling (BPM) is the first and most important step in the business process management lifecycle. Graph-based formalism and rule-based formalism are the two predominant formalisms on which process modeling languages are developed. BPM technology continues to face challenges in coping with dynamic business environments where requirements and goals are constantly changing at execution time. Graph-based formalisms have problems reacting to dynamic changes in a Business Process (BP) at the level of runtime instances. In this research, an adaptive and flexible framework based on the integration of the Object Oriented diagramming technique and the Petri Net modeling language is proposed in order to support change management techniques for BPM and to increase the representation capability of Object Oriented modeling for dynamic changes in runtime instances. The proposed framework is applied in a higher education environment to achieve flexible, updatable and dynamic BPs.
Keywords: Business process modeling, change management, graph-based modeling, rule-based modeling, Object Oriented Petri Net.
5754 Frame and Burst Acquisition in TDMA Satellite Communication Networks with Transponder Hopping
Authors: Vitalice K. Oduol, C. Ardil
Abstract:
The paper presents frame and burst acquisition in a satellite communication network based on time division multiple access (TDMA), in which the transmissions may be carried on different transponders. A unique word pattern is used for the acquisition process. The search for the frame is aided by soft-decision processing of QPSK-modulated signals in an additive white Gaussian noise channel. Results show that when the false alarm rate is low, the probability of detection is also low and the acquisition time is long. Conversely, when the false alarm rate is high, the probability of detection is also high and the acquisition time is short. Thus the system operators can trade high false alarm rates for high detection probabilities and shorter acquisition times.
Keywords: burst acquisition, burst time plan, frame acquisition, satellite access, satellite TDMA, unique word detection
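The sketch below illustrates the threshold trade-off described above with a simple correlation detector for a hypothetical 20-symbol unique word in additive Gaussian noise; the unique word, noise level and thresholds are illustrative, and the soft-decision QPSK details of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)

UW = rng.choice([-1.0, 1.0], size=20)   # hypothetical 20-symbol unique word
SIGMA = 0.7                             # illustrative noise standard deviation

def detect(received, threshold):
    """Soft-decision correlation of the received symbols against the unique
    word; acquisition is declared when the normalized correlation exceeds
    the threshold."""
    return np.dot(received, UW) / len(UW) > threshold

def rates(threshold, trials=20_000):
    hits = sum(detect(UW + SIGMA * rng.normal(size=len(UW)), threshold)
               for _ in range(trials))
    false = sum(detect(SIGMA * rng.normal(size=len(UW)), threshold)
                for _ in range(trials))
    return hits / trials, false / trials  # (detection, false alarm) probabilities

for th in (0.2, 0.4, 0.6):
    pd, pfa = rates(th)
    print(f"threshold={th:.1f}  Pd={pd:.3f}  Pfa={pfa:.3f}")
```

Raising the threshold lowers both the false alarm and detection probabilities, which is the trade-off the abstract describes.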