Search results for: measure valued process

6129 Modeling and Optimization of Process Parameters in PMEDM by Genetic Algorithm

Authors: Farhad Kolahan, Mohammad Bironro

Abstract:

This paper addresses modeling and optimization of process parameters in powder mixed electrical discharge machining (PMEDM). The process output characteristics include metal removal rate (MRR) and electrode wear rate (EWR). Grain size of aluminum powder (S), concentration of the powder (C), discharge current (I), and pulse-on time (T) are chosen as control variables to study the process performance. The experimental results are used to develop regression models based on second-order polynomial equations for the different process characteristics. A genetic algorithm (GA) is then employed to determine optimal process parameters for any desired output values of the machining characteristics.
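As a hedged illustration of the modeling-plus-optimization pipeline described in this abstract, the sketch below fits a second-order polynomial regression to synthetic data and searches it with a small genetic algorithm. All data, variable ranges, and GA settings are invented assumptions, not the authors' actual models or values.

```python
# Hypothetical sketch: second-order regression model + genetic algorithm search.
# Data, bounds, and GA settings are illustrative, not the paper's actual values.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Columns: grain size S, concentration C, current I, pulse-on time T (assumed units).
X = rng.uniform([10, 2, 5, 50], [40, 8, 20, 200], size=(30, 4))
mrr = 0.1 * X[:, 2] ** 2 + 0.5 * X[:, 1] + rng.normal(0, 0.5, 30)  # synthetic MRR

poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), mrr)

def fitness(x):
    """Predicted MRR for a parameter vector x = [S, C, I, T]."""
    return model.predict(poly.transform(x.reshape(1, -1)))[0]

# Tiny real-coded GA: tournament selection, blend crossover, Gaussian mutation.
lo, hi = np.array([10, 2, 5, 50]), np.array([40, 8, 20, 200])
pop = rng.uniform(lo, hi, size=(40, 4))
for _ in range(100):
    fit = np.array([fitness(p) for p in pop])
    parents = pop[[max(rng.choice(40, 3), key=lambda i: fit[i]) for _ in range(40)]]
    w = rng.uniform(size=(40, 1))
    children = w * parents + (1 - w) * parents[rng.permutation(40)]
    children += rng.normal(0, 0.02, children.shape) * (hi - lo)
    pop = np.clip(children, lo, hi)

best = max(pop, key=fitness)
print("best parameters:", best, "predicted MRR:", fitness(best))
```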

Keywords: Regression modeling, PMEDM, Genetic Algorithm, Optimization.

6128 Fixed Points of Contractive-Like Operators by a Faster Iterative Process

Authors: Safeer Hussain Khan

Abstract:

In this paper, we prove a strong convergence result using a recently introduced iterative process with contractive-like operators. This improves and generalizes corresponding results in the literature in two ways: the iterative process is faster and the operators are more general. At the end, we indicate that the results can also be proved with the iterative process with error terms.
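The abstract does not state the iterative process or the contractive-like condition explicitly; as a hedged, representative illustration only, a typical setting in this literature looks as follows.

```latex
% Assumed, representative definitions -- not necessarily the paper's exact scheme.
% Contractive-like operator T on a normed space E: there exist \delta \in (0,1) and a
% monotone increasing \varphi: [0,\infty) \to [0,\infty) with \varphi(0) = 0 such that
\[ \|Tx - Ty\| \le \delta \|x - y\| + \varphi(\|x - Tx\|) \quad \text{for all } x, y \in E. \]
% A typical two-step iterative process with parameters \alpha_n \in (0,1):
\[ y_n = (1 - \alpha_n) x_n + \alpha_n T x_n, \qquad x_{n+1} = T y_n . \]
```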

Keywords: Contractive-like operator, iterative process, fixed point, strong convergence.

6127 Process Oriented Architecture for Emergency Scenarios in the Czech Republic

Authors: Tomáš Ludík, Josef Navrátil, Alena Langerová

Abstract:

Emergency situations are handled on the basis of emergency scenarios. In the Czech Republic these scenarios do not have a uniform form: they are unstructured and written mostly as plain text, which prevents emergency situations from being handled efficiently. For this reason, the paper defines a Process Oriented Architecture to support, and thus improve, the handling of emergency situations in the Czech Republic. The innovative Process Oriented Architecture is based on the Workflow Reference Model while taking into account the capabilities of Business Process Management Suites for implementing process-oriented emergency scenarios. To verify the proposed architecture, a Proof of Concept covering the reception of an emergency event at a district emergency operations centre has been used. The Bonita Open Solution was used for the particular implementation of the proposed architecture. The resulting architecture is suitable not only for emergency management but also for educational purposes.

Keywords: Business Process Management Suite, Czech Republic, Emergency Scenarios, Process Execution, Process Oriented Architecture.

6126 An Empirical Analysis of the Influence of Application Experience on Working Methods of Process Modelers

Authors: A. Nielen, S. Mütze-Niewöhner, C. M. Schlick

Abstract:

In view of growing competition in the service sector, services are as much in need of modeling, analysis and improvement as business or working processes. Graphical process models are an important means of capturing process-related know-how for effective management of the service process. In this contribution, a human performance analysis of process model development was conducted, paying special attention to model development time and working method. It was found that modelers with higher application experience needed significantly less time for mental activities than modelers with lower application experience, spent more time on labeling graphical elements, and achieved higher process model quality in terms of activity label quality.

Keywords: Model quality, predetermined motion time system, process modeling, working method.

6125 Assessing Complexity of Neuronal Multiunit Activity by Information Theoretic Measure

Authors: Young-Seok Choi

Abstract:

This paper provides a quantitative measure of time-varying multiunit neuronal spiking activity using an entropy-based approach. To characterize the state embedded in the activity of a population of neurons, the discrete wavelet transform (DWT) is used to isolate the inherent spiking activity of the multiunit activity (MUA). Owing to the de-correlating property of the DWT, the spiking activity is preserved while the non-spiking component is reduced. By evaluating the entropy of the wavelet coefficients of the de-noised MUA, a multiresolution Shannon entropy (MRSE) of the MUA signal is developed. The proposed entropy was tested on both simulated noisy MUA and actual MUA recorded from the cortex of a rodent model. Simulation and experimental results demonstrate that the dynamics of a neuronal population can be quantified using the proposed entropy.
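A minimal sketch of the entropy computation described above, assuming the PyWavelets package for the DWT; the wavelet, decomposition depth, and histogram-based entropy estimate are illustrative assumptions rather than the paper's settings.

```python
# Hypothetical sketch: multiresolution Shannon entropy of a multiunit-activity signal.
# Wavelet, level count, and histogram binning are assumptions, not the paper's settings.
import numpy as np
import pywt

def shannon_entropy(x, bins=64):
    """Shannon entropy (bits) of the empirical distribution of coefficients x."""
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def multiresolution_entropy(signal, wavelet="db4", level=5):
    """Entropy of the detail coefficients at each DWT scale."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # coeffs[0] is the approximation; coeffs[1:] are detail bands, coarse to fine.
    return [shannon_entropy(c) for c in coeffs[1:]]

rng = np.random.default_rng(1)
mua = rng.normal(size=10_000)          # stand-in for a recorded MUA trace
print(multiresolution_entropy(mua))
```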

Keywords: Discrete wavelet transform, Entropy, Multiresolution, Multiunit activity.

6124 Improved Feature Processing for Iris Biometric Authentication System

Authors: Somnath Dey, Debasis Samanta

Abstract:

Iris-based biometric authentication is gaining importance in recent times. Iris biometric processing, however, is a complex and computationally expensive process. Within an iris-based biometric authentication system, feature processing is an important task: iris features are extracted and ultimately used in matching. Since the number of iris features is large and the computational time grows with the number of features, it is a challenge to develop an iris processing system with as few features as possible without compromising correctness. In this paper, we address this issue and present an approach to the feature extraction and feature matching process. We apply the Daubechies D4 wavelet with 4 decomposition levels to extract features from iris images. These features are encoded with 2 bits each by quantizing them into 4 quantization levels. With our proposed approach it is possible to represent an iris template with only 304 bits, whereas existing approaches require as many as 1024 bits. In addition, we assign different weights to different iris regions when comparing two iris templates, which significantly increases the accuracy, and we match the templates based on a weighted similarity measure. Experimental results on several iris databases substantiate the efficacy of our approach.
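A hedged sketch of the encoding and matching steps described above: a 2-D Daubechies-4 decomposition, 2-bit quantization of 152 selected coefficients (152 x 2 bits = 304 bits), and a weighted similarity between templates. The coefficient selection, quantile thresholds, and uniform weights are illustrative assumptions; the actual template layout and region weights are defined in the paper.

```python
# Hypothetical sketch of the feature pipeline: db4 wavelet, 2-bit quantization,
# weighted template matching. Thresholds, selection, and weights are assumptions.
import numpy as np
import pywt

def encode_iris(img, levels=4, n_coeffs=152):
    """Quantize low-frequency wavelet coefficients into 2-bit codes."""
    coeffs = pywt.wavedec2(img, "db4", level=levels)
    feats = coeffs[0].ravel()[:n_coeffs]                 # 152 coeffs x 2 bits = 304 bits
    edges = np.quantile(feats, [0.25, 0.5, 0.75])        # 4 quantization levels
    return np.digitize(feats, edges).astype(np.uint8)    # values in {0, 1, 2, 3}

def weighted_similarity(code_a, code_b, weights):
    """Weighted fraction of matching 2-bit symbols between two templates."""
    match = (code_a == code_b).astype(float)
    return float(np.sum(weights * match) / np.sum(weights))

rng = np.random.default_rng(2)
img1 = rng.random((64, 256))                              # stand-in normalized iris images
img2 = img1 + rng.normal(0, 0.05, img1.shape)
w = np.ones(152)                                          # per-region weights (assumed uniform)
print(weighted_similarity(encode_iris(img1), encode_iris(img2), w))
```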

Keywords: Iris recognition, biometric, feature processing, pattern recognition, pattern matching.

6123 Simultaneous Optimization of Machining Parameters and Tool Geometry Specifications in Turning Operation of AISI1045 Steel

Authors: Farhad Kolahan, Mohsen Manoochehri, Abbas Hosseini

Abstract:

Machining is an important manufacturing process used to produce a wide variety of metallic parts. Among the various machining processes, turning is one of the most important and is employed to shape cylindrical parts. In turning, the quality of the finished product is measured in terms of surface roughness, which in turn is determined by the machining parameters and tool geometry specifications. The main objective of this study is to simultaneously model and optimize machining parameters and tool geometry in order to improve the surface roughness of AISI 1045 steel. Several levels of machining parameters and tool geometry specifications are considered as input parameters, and surface roughness is selected as the output measure of process performance. A Taguchi approach is employed to gather experimental data. Then, based on the signal-to-noise (S/N) ratio, the best sets of cutting parameters and tool geometry specifications are determined. Using these parameter values, the surface roughness of AISI 1045 steel parts can be minimized. Experimental results are provided to illustrate the effectiveness of the proposed approach.
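For reference, the smaller-the-better signal-to-noise ratio commonly used when minimizing surface roughness is recalled below; that the paper uses exactly this form is an assumption, since the abstract does not give the formula.

```latex
% Standard smaller-the-better S/N ratio (assumed form, not quoted from the paper),
% where y_i are the n measured surface-roughness values at a given factor setting;
% the factor level with the largest S/N ratio is preferred.
\[ S/N = -10 \log_{10}\!\left( \frac{1}{n} \sum_{i=1}^{n} y_i^{2} \right) \]
```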

Keywords: Taguchi method, turning parameters, tool geometry specifications, S/N ratio, statistical analysis.

6122 Extending the Quantum Entropy to Multidimensional Signal Processing

Authors: Youssef Khmou, Said Safi, Miloud Frikel

Abstract:

This paper treats different aspects of entropy measures in classical information theory and statistical quantum mechanics, and presents the possibility of extending the definition of Von Neumann entropy to image and array processing. In the first part, we generalize the quantum entropy using the singular values of arbitrary rectangular matrices to measure randomness and the quality of a denoising operation; this new definition of entropy can be used to compare the performance of filtering methods. In the second part, we apply the concept of a pure state from the quantum formalism to generalize the maximum entropy method for the narrowband, far-field source localization problem. Several computer simulation results are presented to demonstrate the effectiveness of the proposed techniques.
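A minimal numerical sketch of the generalized entropy described above: normalize the singular values of an arbitrary rectangular matrix so they sum to one, then apply the Von Neumann / Shannon form. The normalization convention is an assumption taken from the abstract's description.

```python
# Hypothetical sketch: entropy of a rectangular matrix via its singular values.
# Normalizing the singular values to a probability vector is the assumed convention.
import numpy as np

def singular_value_entropy(A, eps=1e-12):
    """Shannon/Von Neumann-style entropy of the normalized singular values of A."""
    s = np.linalg.svd(A, compute_uv=False)
    p = s / s.sum()
    p = p[p > eps]
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(3)
noisy = rng.normal(size=(128, 96))                               # random patch: high entropy
low_rank = np.outer(rng.normal(size=128), rng.normal(size=96))   # rank-1 patch: near-zero entropy
print(singular_value_entropy(noisy), singular_value_entropy(low_rank))
```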

Keywords: Von Neumann entropy, Filtering, array, DoA, Maximum Entropy Method.

6121 Efficient Program Slicing Algorithms for Measuring Functional Cohesion and Parallelism

Authors: Jehad Al Dallal

Abstract:

Program slicing is the task of finding all statements in a program that directly or indirectly influence the value of a variable occurrence. The set of statements that can affect the value of a variable at some point in a program is called a program slice. In several software engineering applications, such as program debugging and measuring program cohesion and parallelism, several slices are computed at different program points. In this paper, algorithms are introduced to compute all backward and forward static slices of a computer program by traversing the program representation graph once. The program representation graph used in this paper is the Program Dependence Graph (PDG). We conducted an experimental comparison study using 25 software modules to show the effectiveness of the introduced algorithm for computing all backward static slices over single-point slicing approaches when computing the parallelism and functional cohesion of program modules. The effectiveness of the algorithm is measured in terms of execution time and the number of traversed PDG edges. The comparison results indicate that the introduced algorithm considerably reduces the slicing time and effort required to measure module parallelism and functional cohesion.
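The sketch below illustrates the underlying idea on a toy program dependence graph: a backward slice is a reverse-reachability set, and all slices can be accumulated while walking the graph. The graph encoding and traversal here are illustrative; the paper's single-pass algorithm and its bookkeeping are more involved.

```python
# Hypothetical sketch: backward slices on a toy PDG as reverse reachability.
# Edge direction: u -> v means statement v depends on u (data or control).
from collections import defaultdict

pdg_edges = [(1, 2), (1, 3), (2, 4), (3, 4), (4, 5)]   # toy dependence edges

preds = defaultdict(set)
for u, v in pdg_edges:
    preds[v].add(u)

def backward_slice(node, preds):
    """All statements that directly or indirectly influence `node`."""
    slice_, stack = {node}, [node]
    while stack:
        for p in preds[stack.pop()]:
            if p not in slice_:
                slice_.add(p)
                stack.append(p)
    return slice_

# Slices for every statement; a single-pass algorithm would share work between these.
all_slices = {n: backward_slice(n, preds) for n in range(1, 6)}
print(all_slices[5])   # {1, 2, 3, 4, 5}
```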

Keywords: Backward slicing, cohesion measure, forward slicing, parallelism measure, program dependence graph, program slicing, static slicing.

6120 A Simplified and Effective Algorithm Used to Mine Similar Processes: An Illustrated Example

Authors: Min-Hsun Kuo, Yun-Shiow Chen

Abstract:

The running logs of a process hold valuable information about its executed activity behavior and the activity logic structure it generates. These informative logs can be extracted, analyzed and utilized to improve the efficiency of the process's execution and conduct. One technique used to accomplish such process improvement is called process mining, and mining similar processes is one of its tasks. Rather than mining similar processes directly with a single comparison coefficient or a complicated fitness function, this paper presents a simplified heuristic process mining algorithm with two similarity comparisons that relatively conform the activity logic sequences (traces) of the mined processes to those of a normalized (regularized) one. The purpose of this relative conformance is to find which of the mined processes match the required activity sequences and relationships, so that the mined processes can then be applied, as necessary and sufficient, to process improvements. The first similarity is defined in terms of the number of similar activity sequences existing in different processes; the second expresses the degree of similar (identical) activity sequences among the conforming processes. Since both similarities refer to typical behavior (activity sequences) occurring in an entire process, the common problems of other process conformance techniques, such as the inappropriateness of an absolute comparison and the inability to elicit intrinsic information, can be avoided by the relative comparison presented in this paper. To demonstrate the potential of the proposed algorithm, a numerical example is illustrated.
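As one hedged reading of the two similarities, the toy sketch below counts how many fixed-length activity subsequences a mined process log shares with a normalized reference log, reporting the overlap both as a count and as a ratio. The subsequence length, the example traces, and the two measures as coded are assumptions; the paper's definitions are richer.

```python
# Hypothetical sketch: relative conformance via shared activity subsequences (n-grams).
# The subsequence length and the two ratios are assumptions, not the paper's exact measures.
def subsequences(trace, n=2):
    """Set of length-n activity subsequences occurring in one trace."""
    return {tuple(trace[i:i + n]) for i in range(len(trace) - n + 1)}

def log_subsequences(log, n=2):
    return set().union(*(subsequences(t, n) for t in log))

reference = [["A", "B", "C", "D"], ["A", "C", "D"]]          # normalized process traces
candidate = [["A", "B", "C"], ["A", "B", "D"]]               # mined process traces

ref, cand = log_subsequences(reference), log_subsequences(candidate)
shared = ref & cand
print("shared sequences:", len(shared))                       # first similarity (count)
print("degree of similarity:", len(shared) / len(ref))        # second similarity (ratio)
```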

Keywords: process mining, process similarity, artificial intelligence, process conformance.

6119 Efficiency Evaluation of E-Commerce Websites

Authors: A. K. Abd El-Aleem, W. F. Abd El-wahed, N. A. Ismail, F. A. Torkey

Abstract:

This study suggests a model with a new set of evaluation criteria for measuring the efficiency of real-world E-commerce websites. The evaluation criteria include the design, usability and performance of the websites, and the Data Envelopment Analysis (DEA) technique is used to measure website efficiency. An efficient website is defined as one that generates the most outputs using the smallest amount of inputs. Inputs are measurements representing the amount of effort required to build, maintain and operate the site; output is the amount of traffic the site generates, measured as the average number of daily hits and the average number of daily unique visitors.
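A compact sketch of an input-oriented CCR DEA model solved per website as a linear program with SciPy. The CCR formulation is a standard choice rather than necessarily the exact DEA variant used in the study, and the websites' input/output data below are invented.

```python
# Hypothetical sketch: input-oriented CCR DEA efficiency scores via linear programming.
# The websites' input/output data are invented for illustration.
import numpy as np
from scipy.optimize import linprog

# Rows = websites (DMUs); inputs: build effort, maintenance effort; outputs: hits, visitors.
X = np.array([[5.0, 3.0], [8.0, 4.0], [6.0, 6.0]])                # inputs
Y = np.array([[900.0, 300.0], [1000.0, 250.0], [800.0, 400.0]])   # outputs

def ccr_efficiency(o):
    """Efficiency of DMU o: max u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0,  u, v >= 0."""
    s, m = Y.shape[1], X.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])             # minimize -(u . y_o)
    A_ub = np.hstack([Y, -X])                            # u.y_j - v.x_j <= 0 for all j
    b_ub = np.zeros(len(X))
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun

print([round(ccr_efficiency(o), 3) for o in range(len(X))])
```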

Keywords: Data Envelopment Analysis, E-commerce, Efficiency.

6118 Conceptual Method for Flexible Business Process Modeling

Authors: Adla Bentellis, Zizette Boufaïda

Abstract:

Nowadays, the pace of business change is such that new functionality increasingly has to be realized and reliably installed in a matter of days, or even hours. Consequently, more and more business processes are prone to continuous change. The objective of the research in progress is to use the MAP model in a conceptual modeling method for flexible and adaptive business processes. The method can be used to capture the flexibility dimensions of a business process; it takes inspiration from the modularity concept of the object-oriented paradigm to establish a hierarchical construction of the business process model. Its intent is to provide flexible modeling that allows companies to quickly adapt their business processes.

Keywords: Business Process, Business process modeling, flexibility, MAP Model.

6117 Linking Business Process Models and System Models Based on Business Process Modelling

Authors: Faisal A. Aburub

Abstract:

Organizations today need to invest in software in order to run their businesses, and to meet the organizations' objectives the software should be in line with the business process. This research presents an approach for linking process models and system models. In particular, the approach aims to synthesize a sequence diagram based on a role activity diagram (RAD) model. The approach includes four steps: create the business process model using RAD, identify computerized activities, identify entities in the sequence diagram, and identify messages in the sequence diagram. The new approach has been validated using the student registration process at the University of Petra as a case study. Further research is required to validate the approach in different domains.

Keywords: Business process modelling, system models, role activity diagrams, sequence diagrams.

6116 Actionable Rules: Issues and New Directions

Authors: Harleen Kaur

Abstract:

Knowledge Discovery in Databases (KDD) is the process of extracting previously unknown, hidden and interesting patterns from a huge amount of data stored in databases. Data mining is the stage of the KDD process that selects and applies a particular data mining algorithm to extract interesting and useful knowledge. Data mining methods are expected to find interesting patterns in databases according to some measures, so it is of vital importance to define good measures of interestingness that allow the system to discover only the useful patterns. Measures of interestingness are divided into objective and subjective measures. Objective measures depend only on the structure of a pattern and can be quantified using statistical methods, while subjective measures depend on the subjectivity and understandability of the user who examines the patterns; subjective measures are further divided into actionable, unexpected and novel. A key issue facing the data mining community is how to take actions on the basis of discovered knowledge. For a pattern to be actionable, the user's subjectivity is captured by providing his or her background knowledge about the domain. Here, we consider the actionability of discovered knowledge as a measure of interestingness and raise important issues that need to be addressed to discover actionable knowledge.

Keywords: Data Mining Community, Knowledge Discovery in Databases (KDD), Interestingness, Subjective Measures, Actionability.

6115 An Interval-Based Multi-Attribute Decision Making Approach for Electric Utility Resource Planning

Authors: M. Sedighizadeh, A. Rezazadeh

Abstract:

This paper presents an interval-based multi-attribute decision making (MADM) approach to support the decision process with imprecise information. The proposed decision methodology is based on a linear additive utility function model but extends the problem formulation with a measure of composite utility variance. A sample study concerning the evaluation of electric generation expansion strategies is provided, showing how imprecise data may affect the choice of the best solution and how a set of alternatives acceptable to the decision maker (DM) may be identified with a certain level of confidence.
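For context, a linear additive utility model and the composite utility variance induced by imprecise (interval-valued) attribute data can be written generically as below; the notation and the independence assumption behind the variance expression are ours, not the paper's.

```latex
% Generic formulation (assumed notation); the variance expression assumes
% independent attribute utilities.
\[ U(a) = \sum_{i=1}^{n} w_i \, u_i\bigl(x_i(a)\bigr), \qquad
   \operatorname{Var}\bigl[U(a)\bigr] = \sum_{i=1}^{n} w_i^{2} \, \operatorname{Var}\bigl[u_i\bigl(x_i(a)\bigr)\bigr]. \]
```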

Keywords: Decision Making, Power Generation, Electric Utilities, Resource Planning.

6114 Identification of Disease Causing DNA Motifs in Human DNA Using Clustering Approach

Authors: G. Tamilpavai, C. Vishnuppriya

Abstract:

Studying DNA (deoxyribonucleic acid) sequences is useful in understanding biological processes and is applied in fields such as diagnostic and forensic research. DNA carries the hereditary information in humans and almost all other organisms and is passed on to their generations. Early-stage detection of defective DNA sequences may lead to many developments in the field of bioinformatics. Nowadays, various tedious techniques are used to identify defective DNA. The proposed work analyzes and identifies the cancer-causing DNA motif in a given sequence. Initially, the human DNA sequence is separated into k-mers using a k-mer separation rule. The separated k-mers are clustered using a Self-Organizing Map (SOM). Using the Levenshtein distance measure, the cancer-associated DNA motif is identified from the k-mer clusters. Experimental results indicate the presence or absence of the cancer-causing DNA motif: if the cancer-associated motif is found, the input is declared a cancer-disease-causing DNA sequence; otherwise the input human DNA is declared a normal sequence. Finally, the elapsed time for finding the cancer-causing motif with cluster formation is calculated and compared with the normal process of finding the motif; locating the cancer-associated motif is easier with the clustering approach. The proposed work is intended as an initial aid for research into genetic diseases.
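A minimal sketch of two of the sequence-handling steps described above: splitting a DNA string into k-mers and screening them against a known motif with Levenshtein distance. The motif, k, and distance threshold are invented for illustration, and the SOM clustering stage is omitted.

```python
# Hypothetical sketch: k-mer extraction and Levenshtein screening against a motif.
# The motif, k, and threshold are illustrative; SOM clustering of the k-mers is omitted.
def kmers(seq, k=8):
    """All overlapping k-mers of a DNA sequence."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def levenshtein(a, b):
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = curr
    return prev[-1]

motif = "GATTACAG"                       # hypothetical disease-associated motif
dna = "TTGACGATTACAGTTCGA"
hits = [(km, levenshtein(km, motif)) for km in kmers(dna, k=len(motif))]
print([h for h in hits if h[1] <= 1])    # k-mers within edit distance 1 of the motif
```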

Keywords: Bioinformatics, cancer motif, DNA, k-mers, Levenshtein distance, SOM.

6113 Unified Structured Process for Health Analytics

Authors: Supunmali Ahangama, Danny Chiang Choon Poo

Abstract:

Health analytics (HA) is used in healthcare systems for effective decision making, management and planning of healthcare and related activities. However, user resistance, the unique position of medical data content and structure (including heterogeneous and unstructured data) and impromptu HA projects have held up progress in HA applications. Notably, the accuracy of the outcomes depends on the skills and domain knowledge of the data analyst working on the healthcare data. The success of HA depends on having a sound process model, effective project management and the availability of supporting tools. Thus, to overcome these challenges through an effective process model, we propose an HA process model that combines features of the Rational Unified Process (RUP) model and agile methodology.

Keywords: Agile methodology, health analytics, unified process model, UML.

6112 Optimizing Machine Vision System Setup Accuracy by Six-Sigma DMAIC Approach

Authors: Joseph C. Chen

Abstract:

Machine vision systems provide automatic inspection that can reduce manufacturing costs considerably. However, only a few principles have been established for optimizing a machine vision system so that it functions more accurately in industrial practice, and most existing design techniques for improving its accuracy are complicated and impractical. This paper discusses implementing the Six Sigma Define, Measure, Analyze, Improve, and Control (DMAIC) approach to optimize the setup parameters of a machine vision system when it is used as a direct measurement technique. The research follows a case study showing how the Six Sigma DMAIC methodology has been put into use.
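Since the keywords point to process capability, the standard capability indices used in the Measure and Control phases are recalled below; that the study uses exactly these indices is an assumption, as the abstract gives no formulas.

```latex
% Standard process-capability indices (assumed relevant; not quoted from the paper),
% with USL/LSL the specification limits and \mu, \sigma the mean and standard
% deviation of the vision-system measurements.
\[ C_p = \frac{USL - LSL}{6\sigma}, \qquad
   C_{pk} = \min\!\left( \frac{USL - \mu}{3\sigma}, \; \frac{\mu - LSL}{3\sigma} \right) \]
```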

Keywords: DMAIC, machine vision system, process capability, Taguchi parameter design.

6111 Measuring of Urban Sustainability in Town Planners Practice

Authors: J. Zagorskas, I. Veteikyte

Abstract:

Physical urban form is recognized as the medium for human transactions. It directly influences the travel demand of people in a specific urban area and the amount of energy used for transportation, and a distorted, sprawling form often creates sustainability problems in urban areas. EU strategic planning documents declare that compact urban form and a mixed land-use pattern must be given the main focus to achieve better sustainability in urban areas, but the methods to measure and compare these characteristics are still not clear. This paper presents simple methods to measure the spatial characteristics of urban form by analyzing the location and distribution of objects in an urban environment. An extended CA (cellular automata) model is used to simulate urban development scenarios.
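A toy sketch of the kind of cellular-automaton growth step the abstract refers to: an empty cell becomes urbanized when enough of its neighbours already are. The neighbourhood rule and threshold are invented; the paper's extended CA model is more elaborate.

```python
# Hypothetical sketch: one growth step of a simple urban cellular automaton.
# Rule and threshold are invented; the paper's extended CA is more elaborate.
import numpy as np

def ca_step(grid, threshold=3):
    """Empty cells (0) become urban (1) if >= threshold of their 8 neighbours are urban."""
    padded = np.pad(grid, 1)
    neighbours = sum(
        padded[1 + dy : 1 + dy + grid.shape[0], 1 + dx : 1 + dx + grid.shape[1]]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)
    )
    return np.where((grid == 0) & (neighbours >= threshold), 1, grid)

rng = np.random.default_rng(4)
city = (rng.random((20, 20)) < 0.2).astype(int)   # initial scattered development
print(city.sum(), ca_step(city).sum())            # urban cells before/after one step
```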

Keywords: Cellular automata (CA), Mixed used planning, Spatial analysis, Urban compactness, Geographic information systems (GIS).

6110 Introducing Fast Robot Roller Hemming Process in Automotive Industry

Authors: Babak Saboori, Behzad Saboori, Johan S. Carlson, Rikard Söderberg

Abstract:

As product life cycles become shorter every day, flexible manufacturing processes are increasingly in demand for any company. In the assembly of closures, i.e. the opening parts of a car body, the hemming process is the one that needs the most attention. This paper focuses on the robot roller hemming process and how to reduce its cycle time by introducing a fast roller hemming process. The robot roller hemming of the tailgate of the Saab 93 SportCombi model is investigated as a case study. By applying task separation, robot coordination, and robot cell configuration principles to the roller hemming process, three alternatives are proposed and developed, and a remarkable reduction in cycle time is achieved [1].

Keywords: Cell configuration, cycle time, robot coordination, roller hemming.

6109 Using Copulas to Measure Association between Air Pollution and Respiratory Diseases

Authors: Snezhana P. Kostova, Krassi V. Rumchev, Todor Vlaev, Silviya B. Popova

Abstract:

Air pollution is still considered one of the major environmental and health issues. There is enough research evidence to show a strong relationship between exposure to air contaminants and respiratory illnesses among children and adults. In this paper we used the copula approach to study a potential relationship between selected air pollutants (PM10 and NO2) and hospital admissions for respiratory diseases. Kendall's tau and Spearman's rho rank correlation coefficients are calculated and used in the copula method. The paper demonstrates that copulas can provide additional information as a measure of association when compared to the standard correlation coefficients. The results show a significant correlation between the selected air pollutants and hospital admissions for most of the selected respiratory illnesses.
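A hedged numerical sketch of the rank-correlation step: Kendall's tau and Spearman's rho computed with SciPy, and tau converted to the correlation parameter of a Gaussian copula via the standard relation rho = sin(pi*tau/2). The data are synthetic, and the Gaussian-copula choice is an assumption since the abstract does not name the copula family.

```python
# Hypothetical sketch: rank correlations and a Gaussian-copula parameter from Kendall's tau.
# Synthetic data; the copula family used in the paper is not specified in the abstract.
import numpy as np
from scipy.stats import kendalltau, spearmanr

rng = np.random.default_rng(5)
pm10 = rng.gamma(shape=3.0, scale=10.0, size=365)     # daily PM10 levels (synthetic)
admissions = rng.poisson(2 + 0.05 * pm10)             # daily respiratory admissions (synthetic)

tau, _ = kendalltau(pm10, admissions)
rho_s, _ = spearmanr(pm10, admissions)
rho_gauss = np.sin(np.pi * tau / 2)                   # Gaussian-copula correlation implied by tau

print(f"Kendall tau={tau:.3f}, Spearman rho={rho_s:.3f}, copula rho={rho_gauss:.3f}")
```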

Keywords: Air pollution, Copula, Respiratory Health.

6108 Nullity of t-Tupple Graphs

Authors: Khidir R. Sharaf, Didar A. Ali

Abstract:

The nullity η(G) of a graph G is the multiplicity of zero as an eigenvalue in its spectrum. A zero-sum weighting of a graph G is a real-valued function, say f, from the vertices of G to the set of real numbers such that, for each vertex v of G, the sum of the weights f(w) over all neighbors w of v is zero. A high zero-sum weighting of G is one that uses the maximum number of non-zero independent variables. If G is a graph with an end vertex, and if H is the induced subgraph of G obtained by deleting this vertex together with the vertex adjacent to it, then η(G) = η(H). In this paper, the high zero-sum weighting technique and the end-vertex procedure are applied to evaluate the nullity of t-tupple and generalized t-tupple graphs, which is derived and determined for some special types of graphs. We also introduce and prove some important results about the t-tupple coalescence, and the Cartesian and Kronecker products, of nut graphs.
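As a small numerical illustration of the quantity studied here, the nullity of a graph equals the dimension of the null space of its adjacency matrix; the sketch below checks two textbook cases. The rank-based computation is for illustration only, since the paper derives nullities through zero-sum weightings.

```python
# Hypothetical sketch: nullity of a graph as the nullity of its adjacency matrix.
# The example graphs are arbitrary; the paper derives nullities by zero-sum weightings.
import numpy as np

def nullity(edges, n):
    """eta(G) = n - rank(A), the multiplicity of eigenvalue 0 of the adjacency matrix."""
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] = A[v, u] = 1
    return n - np.linalg.matrix_rank(A)

# Path P4 (nullity 0) and star K_{1,3} (nullity 2), as quick sanity checks.
print(nullity([(0, 1), (1, 2), (2, 3)], 4))          # 0
print(nullity([(0, 1), (0, 2), (0, 3)], 4))          # 2
```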

Keywords: Graph theory, Graph spectra, Nullity of graphs.

6107 Unsupervised Texture Segmentation via Applying Geodesic Active Regions to Gaborian Feature Space

Authors: Yuan He, Yupin Luo, Dongcheng Hu

Abstract:

In this paper, we propose a novel variational method for unsupervised texture segmentation. We use a Gabor filter bank to extract texture features, and some of the filtered channels form a multidimensional Gaborian feature space. To avoid deforming contours directly in a vector-valued space, we use a Gaussian mixture model to describe the statistical distribution of this space and obtain the boundary and region probabilities; a geodesic active regions framework is then applied based on them. Finally, experimental results are presented, showing that this method obtains satisfactory boundaries between different texture regions.
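A minimal sketch of the feature side of the pipeline: a small Gabor filter bank built with OpenCV and a Gaussian mixture model fitted to the per-pixel responses to obtain region probabilities. The filter parameters and number of mixture components are assumptions, and the geodesic-active-region contour evolution itself is not shown.

```python
# Hypothetical sketch: Gaborian feature space + Gaussian mixture region probabilities.
# Filter parameters and component count are assumptions; the contour evolution is omitted.
import numpy as np
import cv2
from sklearn.mixture import GaussianMixture

def gabor_features(img, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Stack responses of a small Gabor filter bank into per-pixel feature vectors."""
    responses = []
    for theta in thetas:
        # Arguments: ksize, sigma, theta, lambd, gamma, psi
        kern = cv2.getGaborKernel((21, 21), 4.0, theta, 10.0, 0.5, 0)
        responses.append(cv2.filter2D(img, cv2.CV_32F, kern))
    return np.stack(responses, axis=-1).reshape(-1, len(thetas))

rng = np.random.default_rng(6)
texture = rng.random((128, 128)).astype(np.float32)      # stand-in textured image

features = gabor_features(texture)
gmm = GaussianMixture(n_components=2, random_state=0).fit(features)
region_probs = gmm.predict_proba(features)               # per-pixel region probabilities
print(region_probs.shape)                                 # (128*128, 2)
```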

Keywords: Texture segmentation, Gabor filter, snakes, Geodesic active regions.

6106 Predicting Extrusion Process Parameters Using Neural Networks

Authors: Sachin Man Bajimaya, SangChul Park, Gi-Nam Wang

Abstract:

The objective of this paper is to estimate realistic principal extrusion process parameters by means of an artificial neural network. Conventionally, finite element analysis is used to derive process parameters; however, finite element analysis of the extrusion model does not consider the manufacturing process constraints in its modeling, and therefore the process parameters obtained through such an analysis remain highly theoretical. Alternatively, process development in industrial extrusion is to a great extent based on trial and error and often involves full-size experiments, which are both expensive and time-consuming. Artificial neural network-based estimation of the extrusion process parameters prior to plant execution helps make the actual extrusion operation more efficient, because more realistic parameters can be obtained, and thus bridges the gap between simulation and the real manufacturing execution system. In this work, a suitable neural network is designed and trained using an appropriate learning algorithm; the trained network is then used to predict the manufacturing process parameters.
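A hedged sketch of the surrogate-modeling idea: a small multilayer perceptron trained on condition-to-parameter pairs (which would normally come from experiments or finite element runs) and then queried for new conditions. The feature names, data, and network size are invented; the paper's network design and learning algorithm are its own.

```python
# Hypothetical sketch: an MLP surrogate that maps extrusion conditions to a process
# parameter (e.g., a required force). Data and architecture are invented.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)

# Inputs: billet temperature [C], extrusion ratio [-], ram speed [mm/s] (assumed features).
X = rng.uniform([400, 10, 1], [500, 60, 10], size=(200, 3))
y = 0.8 * X[:, 1] + 0.05 * (520 - X[:, 0]) + 2.0 * X[:, 2] + rng.normal(0, 0.5, 200)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                                   random_state=0))
model.fit(X, y)

new_condition = np.array([[450.0, 30.0, 5.0]])
print("predicted process parameter:", model.predict(new_condition)[0])
```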

Keywords: Artificial Neural Network (ANN), Indirect Extrusion, Finite Element Analysis, MES.

6105 Treatment of Cutting Oily-Wastewater by Sono Fenton Process: Experimental Approach and Combined Process

Authors: P. Painmanakul, T. Chintateerachai, S. Lertlapwasin, N. Rojvilavan, T. Chalermsinsuwan, N. Chawaloesphonsiya, O. Larpparisudthi

Abstract:

Conventional coagulation, advanced oxidation processes (AOPs), and a combined process were evaluated and compared for their suitability to treat stabilized cutting-oil wastewater. A 90% efficiency was obtained from coagulation at an Al2(SO4)3 dosage of 150 mg/L and pH 7. On the other hand, the efficiencies of the AOPs for a 30-minute oxidation time were 10% for acoustic oxidation, 12% for acoustic oxidation with hydrogen peroxide, 76% for the Fenton process, and 92% for the sono-Fenton process. Reaching the highest AOP efficiency for effective oil removal required a large amount of chemicals. Therefore, the AOPs were studied as a post-treatment after a conventional separation process. The resulting efficiency was considerable, as the effluent COD could meet the standard required for industrial wastewater discharge with less chemical and energy consumption.

Keywords: Cutting oily-wastewater, Advance oxidation process, Sono-Fenton, Combined process.

6104 Reasoning With Non-Binary Logics

Authors: Sylvia Encheva

Abstract:

Students in higher education are presented with new terms and concepts in nearly every lecture they attend. Many of them prefer Web-based self-tests for evaluating their understanding of concepts, since such tests can be used independently of tutors' working hours and thus avoid the necessity of being in a particular place at a particular time. There is a large number of multiple-choice tests in almost every subject designed to contribute to higher-level learning or to discover misconceptions. Every single test provides immediate feedback to a student about its outcome, and in some cases a supporting system displays an overall score when a test is taken several times. What we still find missing is how to secure the delivery of personalized feedback to a user while taking the user's progress into consideration. The present work is motivated by the aim of throwing some light on that question.

Keywords: Clustering, rough sets, many-valued logic, predictions.

6103 Variables for Measuring the Impact of the Social Enterprises in the Field of Community Development

Authors: A. Irudaya Veni Mary, M. Victor Louis Anthuvan, P. Christie, A. Indira

Abstract:

In India, social enterprises are working to create social value in various fields, including education, health, women and child development, environmental protection and community development. Although social enterprises have brought about tremendous changes in the lives of beneficiaries, the importance of their work is not thoroughly understood. One way for them to prove themselves is to measure their impact, which has received much attention in recent times. This paper focuses on the social value created by social enterprises in the field of community development and aims to put forth a research tool for measuring it. A close-ended interview schedule was prepared to measure social value creation and was administered among 60 beneficiaries of two social enterprises working in the field of community development. The results show that the social enterprises have brought four types of impact to the lives of their beneficiaries: economic, social, political and cultural. The study is limited to social enterprises that work towards community development. These empirical findings will enable the reader to understand the various types of social value created by social enterprises working in community development, and the study can also serve as a guide for such enterprises to measure their impact and thereby improve their operations for the betterment of society. This paper is derived from empirical research carried out to describe the different types of social value created by social enterprises in India.

Keywords: Social enterprise, social entrepreneurs, social impact, social value, tool for social impact measurement.

6102 A Multi-Attribute Utility Model for Performance Evaluation of Sustainable Banking

Authors: Sonia Rebai, Mohamed Naceur Azaiez, Dhafer Saidane

Abstract:

In this study, we develop a performance evaluation model based on a multi-attribute utility approach aimed at reaching sustainable banking (SB) status. The model is built by accounting for the various stakeholders of banks in a win-win paradigm. In addition, it offers the opportunity to adopt a global measure of performance as an indication of a bank's degree of sustainability. This measure is referred to as the banking sustainability performance index (BSPI). The index may constitute a basis for ranking banks and may build a bridge between the assessment types of financial and extra-financial rating agencies. A real application is performed on three French banks.

Keywords: Multi-attribute utility theory, Performance, Sustainable banking.

6101 Process Simulation of Ethyl tert-Butyl Ether (ETBE) Production from Naphtha Cracking Wastes

Authors: Pakorn Traiprasertpong, Apichit Svang-Ariyaskul

Abstract:

The production of ethyl tert-butyl ether (ETBE) was simulated in Aspen Plus. The objective of this work was to use the simulation results as an alternative platform for ETBE production from naphtha cracking wastes that industry can develop. ETBE is produced from isobutylene, which is one of the wastes of the naphtha cracking process; the isobutylene content of the waste is less than 30% by weight. The main part of this work was to propose a process that protects the environment and increases product value by converting the great majority of the waste into ETBE. Various process configurations were considered to determine the optimal production of ETBE. The proposed process increased the ETBE production yield by 100% compared with the conventional process, with a purity of 96% by weight. The results show great promise for developing the proposed process at industrial scale.
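For reference, the etherification that converts the isobutylene in the cracking waste into ETBE reacts it with ethanol (industrially this is typically carried out over an acidic ion-exchange resin catalyst, a general fact not stated in the abstract):

```latex
% Overall etherification of isobutylene with ethanol to ETBE
\[ \mathrm{(CH_3)_2C{=}CH_2 \; + \; C_2H_5OH \;\longrightarrow\; (CH_3)_3C\text{-}O\text{-}C_2H_5} \quad \text{(ETBE)} \]
```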

Keywords: ETBE, process simulation, naphtha cracking, Aspen Plus

6100 Integrating Process Planning and Scheduling for Prismatic Parts Regard to Due Date

Authors: M. Haddadzade, M. R. Razfar, M. Farahnakian

Abstract:

Integration of the process planning and scheduling functions is necessary to achieve superior overall system performance. This paper proposes a methodology for integrating process planning and scheduling for prismatic components that can be implemented in a company with its existing departments. The developed model considers technological constraints, while the time available for machining on the shop floor is the limiting factor for producing multiple process plans (MPP). It takes advantage of MPP while guaranteeing fulfillment of the due dates through the use of overtime. The study determines the machining parameters, tools, machines and amount of overtime under a minimum-cost objective. Finally, an illustration shows that system performance, as measured by cost, is improved while remaining compatible with the due dates.

Keywords: Due date, Integration, Multiple process plan, Process planning, Scheduling.
