Search results for: process developed data warehouse.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14005

13585 Decision-Making Strategies on Smart Dairy Farms: A Review

Authors: L. Krpalkova, N. O' Mahony, A. Carvalho, S. Campbell, G. Corkery, E. Broderick, J. Walsh

Abstract:

Farm management and operations will change drastically with access to real-time data, real-time forecasting, and tracking of physical items, in combination with Internet of Things (IoT) developments that further automate farm operations. Dairy farms have embraced technological innovations and procured vast amounts of permanent data streams during the past decade; however, the integration of this information to improve the whole-farm decision-making process has yet to be achieved. It is now imperative to develop a system that can collect, integrate, manage, and analyze on-farm and off-farm data in real time for practical and relevant environmental and economic actions. The developed systems, based on machine learning and artificial intelligence, need to be connected to produce useful output and a better understanding of the whole farming operation and its environmental impact. Evolutionary Computing (EC) can be very effective in finding the optimal combination of sets of objects and, finally, in strategy determination. The system of the future should be able to manage the dairy farm as well as an experienced dairy farm manager with a team of the best agricultural advisors. All these changes should bring resilience and sustainability to dairy farming, as well as improve and maintain good animal welfare and the quality of dairy products. This review aims to provide insight into the state of the art of big data applications and EC in relation to smart dairy farming and to identify the most important research and development challenges to be addressed in the future. Smart dairy farming influences every area of management, and its uptake has become a continuing trend.

Keywords: Big data, evolutionary computing, cloud, precision technologies

13584 Revisiting the Concept of Risk Analysis within the Context of Geospatial Database Design: A Collaborative Framework

Authors: J. Grira, Y. Bédard, S. Roche

Abstract:

The aim of this research is to design a collaborative framework that integrates risk analysis activities into the geospatial database design (GDD) process. Risk analysis is rarely undertaken iteratively as part of present GDD methods, in conformance with requirement engineering (RE) guidelines and risk standards. Accordingly, when risk analysis is performed during GDD, some foreseeable risks may be overlooked and never reach the output specifications, especially when user intentions are not systematically collected. This may lead to ill-defined requirements and, ultimately, to higher risks of geospatial data misuse. The adopted approach consists of 1) reviewing the risk analysis process within the scope of RE and GDD, 2) analyzing the challenges of risk analysis within the context of GDD, and 3) presenting the components of a risk-based collaborative framework that improves the collection of the intended/forbidden usages of the data and helps geo-IT experts discover implicit requirements and risks.

Keywords: Collaborative risk analysis, intention of use, geospatial database design, geospatial data misuse.

13583 Big Data: Big Challenges to Privacy and Data Protection

Authors: Abu Bakar Munir, Siti Hajar Mohd Yasin, Firdaus Muhammad-Sukki

Abstract:

This paper seeks to analyse the benefits of big data and, more importantly, the challenges it poses to privacy and data protection. First, the nature of big data is briefly described before presenting its potential in the present day. Afterwards, the issue of privacy and data protection is highlighted, before discussing the challenges of implementing privacy and data protection in big data. In conclusion, the paper puts forward the debate on the adequacy of the existing legal framework in protecting personal data in the era of big data.

Keywords: Big data, data protection, information, privacy.

13582 An Event Based Approach to Extract the Run Time Execution Path of BPEL Process for Monitoring QoS in the Cloud

Authors: Rima Grati, Khouloud Boukadi, Hanene Ben-Abdallah

Abstract:

Due to the dynamic nature of the Cloud, continuous monitoring of QoS requirements is necessary to manage the Cloud computing environment. The process of QoS monitoring and SLA violation detection consists of: collecting low- and high-level information pertinent to the service, analyzing the collected information, and taking corrective actions when SLA violations are detected. In this paper, we detail the architecture and implementation of the first step of this process. More specifically, we propose an event-based approach to obtain run-time information about services developed as BPEL processes. By catching particular events (i.e., the low-level information), our approach recognizes the run-time execution path of a monitored service and uses BPEL execution patterns to compute the QoS of the composite service (i.e., the high-level information).
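
The aggregation step can be pictured with a minimal sketch (not the authors' implementation): given hypothetical activity events caught at run time, the composite response time follows from the usual BPEL patterns, where a sequence sums activity durations and a flow (parallel section) is dominated by its slowest branch. Activity names and timings below are invented for illustration.

```python
# Minimal sketch of deriving composite-service QoS from low-level events,
# assuming a simple (activity, start, end) event log.

events = [  # hypothetical records caught at run time
    ("receiveOrder", 0.0, 0.1),
    ("checkStock",   0.1, 0.6),   # these two run inside a <flow>
    ("checkCredit",  0.1, 0.9),
    ("reply",        0.9, 1.0),
]

def duration(name):
    s, e = next((s, e) for a, s, e in events if a == name)
    return e - s

def seq(*parts):   # <sequence> pattern: durations add up
    return sum(parts)

def flow(*parts):  # <flow> pattern: parallel branches, the slowest dominates
    return max(parts)

qos = seq(duration("receiveOrder"),
          flow(duration("checkStock"), duration("checkCredit")),
          duration("reply"))
print(f"composite response time: {qos:.2f} s")  # 0.1 + 0.8 + 0.1 = 1.0
```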

Keywords: Monitoring of Web service composition, Cloud environment, Run-time extraction of execution path of BPEL.

13581 Problems and Possible Solutions with the Development of a Computer Model of Quantum Theory

Authors: Hans H. Diel

Abstract:

The author has developed a computer model of Quantum Theory (QT). The major goal of the computer model was to support and demonstrate as large a scope of QT as possible. This includes simulations of the major QT (Gedanken-) experiments such as, for example, the famous double-slit experiment. Besides the anticipated difficulties with (1) transforming exact mathematics into a computer program, two further types of problems showed up, namely (2) areas where QT provides a complete mathematical formalism, but when it comes to concrete applications the equations are not solvable at all, or only with extremely high effort; and (3) QT rules which are formulated in natural language and which do not seem to be translatable to precise mathematical expressions, nor to a computer program. The paper lists problems in all three categories and also describes the possible solutions or circumventions developed for the computer model.
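
As a toy illustration (not Diel's model) of why the double-slit experiment is a natural test case for a computational QT model: the screen intensity follows from summing complex amplitudes over the two paths. The geometry values below are arbitrary.

```python
# Two-slit interference from amplitude superposition: I(x) = |A1 + A2|^2.

import cmath

wavelength = 1.0
k = 2 * cmath.pi / wavelength
slit_sep, screen_dist = 5.0, 100.0

def intensity(x):
    # path lengths from each slit to screen position x
    r1 = ((x - slit_sep / 2) ** 2 + screen_dist ** 2) ** 0.5
    r2 = ((x + slit_sep / 2) ** 2 + screen_dist ** 2) ** 0.5
    amp = cmath.exp(1j * k * r1) + cmath.exp(1j * k * r2)  # superposition
    return abs(amp) ** 2

for x in range(-10, 11, 2):
    print(f"x={x:+3d}  I={intensity(x):.3f}")  # alternating fringes
```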

Keywords: Computability, Foundation of Quantum Mechanics, Measurement Process, Modeling.

13580 An Algebra for Protein Structure Data

Authors: Yanchao Wang, Rajshekhar Sunderraman

Abstract:

This paper presents an algebraic approach to optimizing queries in a domain-specific database management system for protein structure data. The approach involves the introduction of several protein-structure-specific algebraic operators to query the complex data stored in an object-oriented database system. The Protein Algebra provides an extensible set of high-level Genomic Data Types and Protein Data Types, along with a comprehensive collection of appropriate genomic and protein functions. The paper also presents a query translator that converts high-level query specifications in the algebra into low-level query specifications in Protein-QL, a query language designed to query protein structure data. The query transformation process uses a Protein Ontology that serves the purpose of a dictionary.
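
A toy sketch of the general idea (the types and the operator below are invented stand-ins, not the Protein Algebra itself): domain-specific operators work directly over protein objects, and a translator could rewrite such expressions into a lower-level query language.

```python
# Hypothetical domain-specific operator over protein structure objects.

from dataclasses import dataclass

@dataclass
class Helix:
    chain: str
    start: int
    end: int

@dataclass
class Protein:
    pid: str
    helices: list

def select_helices(proteins, min_length):
    # an algebra-style operator: filter secondary structures by length
    for p in proteins:
        for h in p.helices:
            if h.end - h.start + 1 >= min_length:
                yield p.pid, h

db = [Protein("1ABC", [Helix("A", 5, 20), Helix("A", 30, 34)]),
      Protein("2XYZ", [Helix("B", 2, 25)])]
for pid, h in select_helices(db, min_length=10):
    print(pid, h)
```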

Keywords: Domain-Specific Data Management, Protein Algebra, Protein Ontology, Protein Structure Data.

13579 Preparation of Computer Model of the Aircraft for Numerical Aeroelasticity Tests – Flutter

Authors: M. Rychlik, R. Roszak, M. Morzynski, M. Nowak, H. Hausa, K. Kotecki

Abstract:

This article presents the geometry and structure reconstruction procedure of an aircraft model for flutter research (based on the I22-IRYDA aircraft). For the reconstruction, Reverse Engineering techniques and advanced surface-modeling CAD tools are used. The authors discuss all stages of the data acquisition process and the computation and analysis of the measured data. For acquisition, a three-dimensional structured-light scanner was used. In the further sections, details of the reconstruction process are presented. The geometry reconstruction procedure transforms the measured input data (point cloud) into a three-dimensional parametric computer model (NURBS solid model) compatible with CAD systems. In parallel to the geometry of the aircraft, the internal structure (structural model) is extracted and modeled. In the last chapter, the evaluation of the obtained models is discussed.

Keywords: computer modeling, numerical simulation, Reverse Engineering, structural model

13578 Cascade Kalman Filter Configuration for Low Cost IMU/GPS Integration in Car Navigation Like Robot

Authors: Othman Maklouf, Abdurazag Ghila, Ahmed Abdulla

Abstract:

This paper introduces a low-cost INS/GPS algorithm for land vehicle navigation applications. The data fusion process is done with an extended Kalman filter in a cascade configuration mode. In order to perform numerical simulations, the algorithm was implemented in MATLAB. A loosely coupled configuration is considered. The results obtained in this work demonstrate that a low-cost INS/GPS navigation system is partially capable of meeting the performance requirements for land vehicle navigation. The relative effectiveness of the Kalman filter implementation in the integrated GPS/INS navigation algorithm is highlighted. The paper also provides experimental results: a field test using a car was carried out.
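
A hedged sketch of the loosely coupled idea (not the paper's MATLAB code, and reduced to one dimension): a Kalman filter propagates position and velocity with IMU acceleration at a high rate, and occasional GPS position fixes correct the accumulated drift. All noise values are illustrative.

```python
# 1-D loosely coupled IMU/GPS fusion with a linear Kalman filter.

import numpy as np

dt = 0.1
F = np.array([[1, dt], [0, 1]])          # state transition: [pos, vel]
B = np.array([[0.5 * dt**2], [dt]])      # how acceleration enters the state
H = np.array([[1.0, 0.0]])               # GPS measures position only
Q = np.eye(2) * 1e-3                     # process noise (IMU errors), assumed
R = np.array([[4.0]])                    # GPS position noise, m^2, assumed

x = np.zeros((2, 1))                     # initial state
P = np.eye(2)

def predict(x, P, accel):                # IMU-driven prediction
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, gps_pos):               # GPS correction
    y = gps_pos - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(2) - K @ H) @ P

for step in range(100):
    x, P = predict(x, P, accel=1.0)      # constant 1 m/s^2, for illustration
    if step % 10 == 9:                   # GPS arrives at 1 Hz vs. 10 Hz IMU
        x, P = update(x, P, gps_pos=np.array([[0.5 * ((step + 1) * dt)**2]]))
print("estimated pos/vel:", x.ravel())
```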

Keywords: GPS, INS, IMU, Kalman filter.

13577 Software Engineering Interoperable Environment for University Process Workflow and Document Management

Authors: Bekim Fetaji, Majlinda Fetaji, Mirlinda Ebibi

Abstract:

The research focused on the design, development and evaluation of a sustainable web-based network system to be used as an interoperable environment for university process workflow and document management. In this manner, most of the process workflows in universities can be realized entirely electronically, promoting an integrated university. Defining the most used university process workflows enabled creating electronic workflows and executing them on standard workflow execution engines. Definition or reengineering of workflows increased work efficiency and helped standardize processes across different faculties. The concept and the process definition, as well as the solution applied as a case study, are evaluated and the findings reported.

Keywords: Design process workflows, workflow and document management, business process, software engineering

13576 SATA: A Web Based Scheduling Support System

Authors: Rajeswari Raju, Saiful Nizam Warris, Hazlifah Mohd Rusli

Abstract:

Developing a university course schedule is difficult. This is due to the limitations in the resources available. The process is made even harder by different faculties or departments having different ways of stating their schedule requirements. The person in charge of taking the schedule requirements and turning them into a proper course schedule is not only burdened with the task of allocating the appropriate classes and times to lecturers and students; they also need to understand the schedule requirements. Therefore, a scheduling support system named SATA was developed to assist ICRESS in the course scheduling process. SATA has been put to use for several semesters and the results have been encouraging. It won a bronze medal in the 2008 Invention, Innovation and Design competition (IID-08) and was submitted for patenting in October 2008.

Keywords: Course Scheduling, Scheduling Tool Aid.

13575 Hierarchical Checkpoint Protocol in Data Grids

Authors: Rahma Souli-Jbali, Minyar Sassi Hidri, Rahma Ben Ayed

Abstract:

A grid of computing nodes has emerged as a representative means of connecting distributed computers or resources scattered all over the world for the purposes of computing and distributed storage. Since fault tolerance becomes complex due to the dynamic availability of resources in a decentralized grid environment, checkpointing can be used in connection with replication in data grids. The objective of our work is to present fault tolerance in data grids with a data replication-driven model based on clustering. The performance of the protocol is evaluated with the OMNeT++ simulator. The computational results show the efficiency of our protocol in terms of recovery time and the number of processes involved in rollbacks.

Keywords: Data grids, fault tolerance, Chandy-Lamport, clustering.

13574 Capability Prediction of Machining Processes Based on Uncertainty Analysis

Authors: Hamed Afrasiab, Saeed Khodaygan

Abstract:

Prediction of machining process capability in the design stage plays a key role in achieving precision design and manufacturing of mechanical products. Inaccuracies in the machining process lead to errors in the position and orientation of machined features on the part, and strongly affect the process capability in the final quality of the product. In this paper, an efficient systematic approach is given to investigate machining errors, predict the manufacturing errors of the parts, and predict the capability of the corresponding machining processes. A mathematical formulation of fixture locator modeling is presented to establish the relationship between the part errors and the related sources. Based on this method, the final machining errors of the part can be accurately estimated by relating them to the combined dimensional and geometric tolerances of the workpiece-fixture system. The method is developed for uncertainty analysis based on worst-case and statistical approaches. The application of the presented method is illustrated through an example, and the computational results are compared with Monte Carlo simulation results.
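
The Monte Carlo side of such a comparison can be sketched briefly. The linear error model below is a made-up stand-in for the paper's fixture-locator equations: a feature's position error is taken as a function of random locator errors, and capability is estimated from the simulated error distribution.

```python
# Illustrative Monte Carlo tolerance analysis (sensitivities and tolerances
# are hypothetical, not the paper's data).

import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# hypothetical locator errors, each ~ N(0, sigma), in mm
locators = rng.normal(0.0, [0.01, 0.01, 0.02], size=(n, 3))
# assumed sensitivity of the feature position to each locator error
sensitivity = np.array([0.8, -0.5, 0.3])
feature_error = locators @ sensitivity

tol = 0.05  # +/- tolerance on the feature position, mm
sigma = feature_error.std()
cp = 2 * tol / (6 * sigma)              # capability index, centered process
out_of_tol = np.mean(np.abs(feature_error) > tol)
print(f"Cp ~ {cp:.2f}, fraction out of tolerance ~ {out_of_tol:.4f}")
```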

Keywords: Process capability, machining error, dimensional and geometrical tolerances, uncertainty analysis.

13573 Neural Network Based Approach of Software Maintenance Prediction for Laboratory Information System

Authors: Vuk M. Popovic, Dunja D. Popovic

Abstract:

The software maintenance phase starts once a software project has been developed and delivered; after that, any modification to it corresponds to maintenance. Software maintenance involves modifications to keep a software project usable in a changed or changing environment, to correct discovered faults, and to improve performance or maintainability. Software maintenance and the management of software maintenance are recognized as the two most important and most expensive processes in the life of a software product. This research bases the prediction of maintenance on risk and time evaluations, using them as data sets for working with neural networks. The aim of this paper is to provide support to project maintenance managers. They will be able to pass the issues planned for the next software service patch to the experts for risk and working-time evaluation, and afterwards feed all data to neural networks in order to get a software maintenance prediction. This process will lead to a more accurate prediction of the working hours needed for the software service patch, which will eventually lead to better budget planning for software maintenance projects.
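
A minimal sketch of this idea, with synthetic data rather than the paper's data set: per-issue expert risk scores and time estimates are fed to a small neural network that predicts the actual working hours.

```python
# Toy working-hours regressor; the data-generating rule is an assumption
# made purely to have something to train on.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n = 200
risk = rng.uniform(0, 1, n)              # expert risk score per issue
est_hours = rng.uniform(1, 40, n)        # expert time evaluation
# assumed ground truth: riskier issues overrun their estimates
actual = est_hours * (1 + 0.6 * risk) + rng.normal(0, 2, n)

X = np.column_stack([risk, est_hours])
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, actual)

print("predicted hours for risk=0.8, estimate=20h:",
      round(float(model.predict([[0.8, 20.0]])[0]), 1))
```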

Keywords: Laboratory information system, maintenance engineering, neural networks, software maintenance, software maintenance costs.

13572 Influence of Artificial Roughness on Heat Transfer in the Rotating Flow

Authors: T. Magrakvelidze, N. Bantsadze, N. Lekveishvili, Kh. Lomidze

Abstract:

The results of an experimental study of the process of convective and boiling heat transfer in a vessel with a stirrer, for smooth and rough ring-shaped pipes, are presented. It is established that creating two-dimensional artificial roughness on the heated surface causes substantial (~100%) intensification of convective heat transfer. In the case of boiling, the influence of roughness appears in the initial stage of boiling, while in the case of fully developed nucleate boiling there was no intensification of heat transfer. A similitude equation for calculating the convective heat transfer coefficient, which generalizes the experimental data well for both smooth and rough surfaces, is proposed.

Keywords: boiling, heat transfer, roughness.

13571 Oscillation Effect of the Multi-stage Learning for the Layered Neural Networks and Its Analysis

Authors: Isao Taguchi, Yasuo Sugai

Abstract:

This paper proposes an efficient learning method for layered neural networks based on the selection of training data and the input characteristics of an output layer unit. Compared to more recent neural networks (pulse neural networks, quantum neuro-computation, etc.), the multilayer network is widely used due to its simple structure. When learning objects are complicated, problems such as unsuccessful learning or the significant time required for learning remain unsolved. Focusing on the input data during the learning stage, we undertook an experiment to identify the data that produce large errors and interfere with the learning process. Our method divides the learning process into several stages. In general, the input characteristics of an output layer unit oscillate during the learning process for complicated problems. The multi-stage learning method proposed by the authors for function approximation problems classifies the learning data in a phased manner, focusing on their learnability prior to learning in the multilayered neural network, and the validity of the multi-stage learning method is demonstrated. Specifically, this paper verifies by computer experiments that the multi-stage learning method, with BP as its learning rule, improves both learning accuracy and learning time. In learning, the oscillatory phenomena of the learning curve play an important role in learning performance. The authors also discuss the mechanisms by which oscillatory phenomena occur in learning. Furthermore, by observing behaviors during learning, the authors discuss the reasons why the errors of some data remain large even after learning.

Keywords: data selection, function approximation problem, multi-stage learning, neural network, voluntary oscillation.

13570 Role of Association Rule Mining in Numerical Data Analysis

Authors: Sudhir Jagtap, Kodge B. G., Shinde G. N., Devshette P. M

Abstract:

Numerical analysis naturally finds applications in all fields of engineering and the physical sciences, but in the 21st century, the life sciences and even the arts have adopted elements of scientific computation. Numerical data analysis has become a key process in the research and development of all these fields [6]. In this paper we analyze specified numerical patterns using association rule mining techniques with minimum-confidence and minimum-support mining criteria. The extracted rules and analyzed results are demonstrated graphically. Association rules are a simple but very useful form of data mining that describes the probabilistic co-occurrence of certain events within a database [7]. They were originally designed to analyze market-basket data, in which the likelihood of items being purchased together within the same transaction is analyzed.
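
The support/confidence criteria are easy to state concretely. A small illustration (the transactions are invented, not the paper's data): a rule a -> b is reported only when its support and confidence clear the minimum thresholds.

```python
# Support/confidence computation over toy market-basket transactions.

from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
    {"bread", "butter", "milk"},
]

def support(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

min_support, min_confidence = 0.4, 0.7
items = sorted(set().union(*transactions))

for a, b in combinations(items, 2):
    s = support({a, b})
    if s >= min_support:
        conf = s / support({a})          # confidence of rule a -> b
        if conf >= min_confidence:
            print(f"{a} -> {b}: support={s:.2f}, confidence={conf:.2f}")
```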

Keywords: Numerical data analysis, Data Mining, Association Rule Mining

13569 Principles of Municipal Sewage Sludge Bioconversion into Biomineral Fertilizer

Authors: K. V. Kalinichenko, G. N. Nikovskaya

Abstract:

The efficiency of heavy metals removal from sewage sludge in bioleaching processes with heterotrophic and chemoautotrophic (sulphur-oxidizing) sludge cenoses and in chemical leaching (in distilled water, weakly acidic or alkaline medium) was compared. The efficiency of heavy metals removal from sewage sludge varies from 83% (Zn) down to 14% (Cr) and follows the order: Zn > Mn > Cu > Ni > Co > Pb > Cr. The advantages of the metals bioleaching process under heterotrophic metabolism were shown. A new process for the bioconversion of sewage sludge into fertilizer at moderate temperatures, after partial heavy metals removal, was developed. This process is based on enhancing the vitality of heterotrophic microorganisms by adding easily metabolized nutrients, and on the synthesis of metabolites by the growing sludge cenoses. These metabolites possess the properties of heavy metal extractants and flocculants, which enhance the sedimentation of sludge flocs. The process results in a biomineral fertilizer of prolonged action with immobilized sludge bioelements. The fertilizer satisfies the EU limits for sewage sludge of agricultural utilization. The high efficiency of the biomineral fertilizer obtained has been demonstrated in vegetation experiments.

Keywords: Fertilizer, heavy metals, leaching, sewage sludge.

13568 Features for Measuring Credibility on Facebook Information

Authors: Kanda Runapongsa Saikaew, Chaluemwut Noyunsan

Abstract:

Nowadays, social media information such as news, links, images, or videos is shared extensively. However, information disseminated through social media often lacks quality: little fact checking, more bias, and frequent rumors. Many researchers have investigated credibility on Twitter, but there has been no research report about information credibility on Facebook. This paper proposes features for measuring the credibility of Facebook information, and we developed a system around them. First, we developed an FB credibility evaluator for measuring the credibility of each post via manual human labelling. We then collected the training data for creating a model using a Support Vector Machine (SVM). Secondly, we developed a Chrome extension of FB credibility for Facebook users to evaluate the credibility of each post. Based on the usage analysis of our FB credibility Chrome extension, about 81% of users' responses agree with the credibility suggested automatically by the proposed system.
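
A hedged sketch of the classification step: the feature names and values below are invented stand-ins, not the paper's actual feature set; the point is training an SVM on manually labelled posts and scoring new ones.

```python
# Credibility classifier skeleton with hypothetical per-post features.

from sklearn import svm

# hypothetical features: [likes, shares, has_link, account_age_years]
X_train = [
    [120, 30, 1, 5.0],
    [3,   0,  0, 0.1],
    [450, 90, 1, 8.0],
    [7,   1,  0, 0.3],
]
y_train = [1, 0, 1, 0]          # manual labels: 1 = credible, 0 = not

clf = svm.SVC(kernel="rbf", gamma="scale")
clf.fit(X_train, y_train)

new_post = [[60, 10, 1, 2.0]]
print("credible" if clf.predict(new_post)[0] == 1 else "not credible")
```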

Keywords: Facebook, social media, credibility measurement.

13567 Hybrid of Hunting Search and Modified Simplex Methods for Grease Position Parameter Design Optimisation

Authors: P. Luangpaiboon, S. Boonhao

Abstract:

This study formulates a multi-response surface optimization problem (MRSOP) for determining the proper choices in a process parameter design (PPD) decision problem in the noisy environment of a grease position process in the electronics industry. The proposed model attempts to maximize dual process responses: the mean of parts between failure on the left and right processes. The conventional modified simplex method and its hybridization with the stochastic operator from the hunting search algorithm are applied to determine the proper levels of the controllable design parameters affecting the quality performances. A numerical example demonstrates the feasibility of applying the proposed model to the PPD problem via the two iterative methods. Its advantages are also discussed. Numerical results demonstrate that the hybridization is superior to the conventional method. In this study, the mean of parts between failure on the left and right lines improves by approximately 39.51%. All experimental data presented in this research have been normalized to disguise actual performance measures, as the raw data are considered confidential.
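
A hedged sketch linking the pieces named above: the two responses are combined into a single desirability score, and a simplex-type search optimizes the parameter levels. SciPy's Nelder-Mead stands in for the authors' modified simplex and its hunting-search hybrid, and the response surfaces are invented.

```python
# Dual-response desirability optimization with a simplex search.

import numpy as np
from scipy.optimize import minimize

def mpbf_left(x):    # invented response surface: mean parts between failure
    return 100 - (x[0] - 2) ** 2 - (x[1] - 1) ** 2

def mpbf_right(x):
    return 90 - (x[0] - 3) ** 2 - 2 * (x[1] - 1.5) ** 2

def desirability(x):
    # geometric mean of normalized larger-the-better responses
    d1 = max(mpbf_left(x), 0) / 100
    d2 = max(mpbf_right(x), 0) / 90
    return (d1 * d2) ** 0.5

res = minimize(lambda x: -desirability(x), x0=[0.0, 0.0], method="Nelder-Mead")
print("best parameter levels:", np.round(res.x, 3), "D =", round(-res.fun, 3))
```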

Keywords: Grease Position Process, Multi-response Surfaces, Modified Simplex Method, Hunting Search Method, Desirability Function Approach.

13566 Robust Digital Cinema Watermarking

Authors: Sadi Vural, Hiromi Tomii, Hironori Yamauchi

Abstract:

With the advent of digital cinema and digital broadcasting, copyright protection of video data has become one of the most important issues. We present a novel method of watermarking for video image data based on hardware and digital wavelet transform techniques and name it "traceable watermarking", because the watermarked data is constructed before the transmission process and traced after it has been received by an authorized user. In our method, we embed the watermark in the lowest part of each image frame in decoded video by using a hardware LSI. Digital cinema is an important application for traceable watermarking, since digital cinema systems make use of watermarking technology during content encoding, encryption, transmission, decoding, and all intermediate processes. The watermark is embedded into randomly selected movie frames using hash functions. Embedded watermark information can be extracted from the decoded video data without access to the original movie data. Our experimental results show that the proposed traceable watermarking method for digital cinema systems is much better than conventional watermarking techniques in terms of robustness, image quality, speed, and simplicity.
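
Two of the ingredients described above can be sketched in software, with the caveat that the actual system uses a hardware LSI and wavelet-domain embedding: a hash function selects the carrier frames, and watermark bits are hidden in the least significant bits of the frame data. Key and ratio below are assumptions.

```python
# Hash-based frame selection plus LSB embedding/extraction sketch.

import hashlib
import numpy as np

def selected(frame_index, key=b"movie-key", ratio=8):
    # hash-based frame selection: roughly 1 in `ratio` frames carries data
    h = hashlib.sha256(key + frame_index.to_bytes(4, "big")).digest()
    return h[0] % ratio == 0

def embed(frame, bits):
    flat = frame.ravel()
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits   # overwrite LSBs
    return frame

def extract(frame, n):
    return frame.ravel()[:n] & 1          # no original frame needed

carriers = [i for i in range(32) if selected(i)]
print("frames carrying watermark:", carriers)

frame = np.random.default_rng(0).integers(0, 256, (8, 8), dtype=np.uint8)
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
frame = embed(frame, bits)
print("recovered:", extract(frame, len(bits)))
```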

Keywords: Decoder, Digital content, JPEG2000 Frame, System-On-Chip, traceable watermark, Hash Function, CRC-32.

13565 Hybrid Approach for Memory Analysis in Windows System

Authors: Khairul Akram Zainol Ariffin, Ahmad Kamil Mahmood, Jafreezal Jaafar, Solahuddin Shamsuddin

Abstract:

Random Access Memory (RAM) is an important device in a computer system. It can represent a snapshot of how the computer has been used by the user. With the growth of its importance, computer memory has become a much-discussed topic in digital forensics. A number of tools have been developed to retrieve information from memory. However, most of the tools are limited in their ability to retrieve the important information from computer memory. Hence, this paper discusses the limitations and setbacks of two main techniques: process signature search and process enumeration. Then, a new hybrid approach is presented to minimize the setbacks of both individual techniques. This new approach combines both techniques with the purpose of retrieving information from the process block and other objects in computer memory. In addition, the basic theory of address translation for x86 platforms is demonstrated in this paper.
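
A simplified sketch of the signature-search half of such a hybrid: sweep a raw memory dump for a byte signature associated with process objects (the "Proc" pool tag is used here as the canonical Windows example) and report candidate offsets. A real tool would then validate each hit against the _EPROCESS layout, which is where enumeration-style checks come in.

```python
# Naive signature scan over a raw memory dump (toy data).

SIGNATURE = b"Proc"

def signature_scan(dump: bytes, signature: bytes):
    hits, pos = [], dump.find(signature)
    while pos != -1:
        hits.append(pos)
        pos = dump.find(signature, pos + 1)
    return hits

# toy "dump": two planted signatures in a sea of zero bytes
dump = bytes(100) + SIGNATURE + bytes(500) + SIGNATURE + bytes(100)
print("candidate process blocks at offsets:", signature_scan(dump, SIGNATURE))
```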

Keywords: Algorithms, Digital Forensics, Memory Analysis, Signature Search.

13564 Comparative Study of Transformed and Concealed Data in Experimental Designs and Analyses

Authors: K. Chinda, P. Luangpaiboon

Abstract:

This paper presents a comparative study of coded-data methods for assessing the benefit of concealing natural data that constitute trade secrets. The influence of the number of replicates (rep), treatment effects (τ), and standard deviation (σ) on the efficiency of each transformation method is investigated. The experimental data are generated via computer simulations under the specified process conditions with a completely randomized design (CRD). Three data transformations are considered: the Box-Cox, arcsine, and logit methods. The differences in the F statistic between coded and natural data (Fc-Fn) and the hypothesis-testing results were determined. The experimental results indicate that the Box-Cox results differ significantly from the natural data for smaller numbers of replicates, and the method seems improper when a negative lambda parameter is assigned. On the other hand, the arcsine and logit transformations are more robust and, evidently, provide more precise numerical results. In addition, alternative ways to select the lambda in the power transformation are offered to achieve more appropriate outcomes.
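
For concreteness, here is a hedged illustration of the three codings applied to a simulated response; the distribution and parameter choices are arbitrary examples, not the paper's simulation setup.

```python
# Box-Cox, arcsine and logit codings of a simulated proportion response.

import numpy as np
from scipy import stats
from scipy.special import logit

rng = np.random.default_rng(2)
y = rng.beta(2, 5, size=30)              # simulated proportions in (0, 1)

y_boxcox, lam = stats.boxcox(y)          # lambda estimated from the data
y_arcsine = np.arcsin(np.sqrt(y))        # classic arcsine square-root coding
y_logit = logit(y)                       # log(y / (1 - y))

print(f"Box-Cox lambda = {lam:.3f}")
for name, v in [("Box-Cox", y_boxcox), ("arcsine", y_arcsine), ("logit", y_logit)]:
    print(f"{name:8s} mean={v.mean():+.3f} sd={v.std():.3f}")
```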

Keywords: Experimental Designs, Box-Cox, Arcsine, Logit Transformations.

13563 Big Bang – Big Crunch Learning Method for Fuzzy Cognitive Maps

Authors: Engin Yesil, Leon Urbas

Abstract:

Modeling complex dynamic systems, for which mathematical models are very complicated to establish, requires new and modern methodologies that exploit existing expert knowledge, human experience, and historical data. Fuzzy cognitive maps are very suitable, simple, and powerful tools for the simulation and analysis of such dynamic systems. However, human experts are subjective and can handle only relatively simple fuzzy cognitive maps; therefore, there is a need to develop new approaches for the automated generation of fuzzy cognitive maps from historical data. In this study, a new learning algorithm, called Big Bang-Big Crunch, is proposed for the first time in the literature for the automated generation of fuzzy cognitive maps from data. Two real-world examples, namely a process control system and a radiation therapy process, and one synthetic model are used to emphasize the effectiveness and usefulness of the proposed methodology.
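
A compact sketch of what the learning target looks like: a fuzzy cognitive map updates concept activations through a weighted squashing rule, and a learner such as Big Bang-Big Crunch searches for the weight matrix W that reproduces historical concept sequences. The weights and initial state below are examples, and the BB-BC search itself is not reproduced.

```python
# FCM inference step and the fitness a population-based learner minimizes.

import numpy as np

def fcm_step(x, W):
    # standard FCM update: sigmoid of each concept's weighted inputs
    return 1.0 / (1.0 + np.exp(-(W @ x)))

def fitness(W, history):
    # one-step prediction error over the historical concept sequence
    return sum(np.sum((fcm_step(x, W) - x_next) ** 2)
               for x, x_next in zip(history, history[1:]))

W = np.array([[0.0, 0.6, -0.3],
              [0.4, 0.0,  0.5],
              [-0.2, 0.7, 0.0]])
history = [np.array([0.8, 0.2, 0.5])]
for _ in range(4):
    history.append(fcm_step(history[-1], W))
print("trajectory error for the true W:", round(fitness(W, history), 6))  # ~0
```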

Keywords: Big Bang-Big Crunch optimization, Dynamic Systems, Fuzzy Cognitive Maps, Learning.

13562 Control-flow Complexity Measurement of Processes and Weyuker's Properties

Authors: Jorge Cardoso

Abstract:

Process measurement is the task of empirically and objectively assigning numbers to the properties of business processes in such a way as to describe them. Desirable attributes to study and measure include complexity, cost, maintainability, and reliability. In our work we focus on investigating process complexity. We define process complexity as the degree to which a business process is difficult to analyze, understand, or explain. One way to analyze a process's complexity is to use a process control-flow complexity measure. In this paper, an attempt has been made to evaluate the control-flow complexity measure in terms of Weyuker's properties. Weyuker's properties must be satisfied by any complexity measure to qualify as a good and comprehensive one.
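
As commonly summarized, Cardoso's control-flow complexity (CFC) measure counts, for each split, the number of mental states a designer must consider: an XOR-split adds its fan-out, an OR-split adds 2^(fan-out) - 1, and an AND-split adds 1. A minimal sketch under that reading:

```python
# Control-flow complexity of a process given its splits.

def cfc(splits):
    total = 0
    for kind, fan_out in splits:
        if kind == "xor":
            total += fan_out             # one state per exclusive branch
        elif kind == "or":
            total += 2 ** fan_out - 1    # any non-empty subset of branches
        elif kind == "and":
            total += 1                   # all branches together, one state
    return total

# example process: one XOR-split with 3 branches, one OR-split with 2,
# one AND-split with 4
print("CFC =", cfc([("xor", 3), ("or", 2), ("and", 4)]))  # 3 + 3 + 1 = 7
```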

Keywords: Business process measurement, workflow, complexity.

13561 Post ERP Feral System and Use of 'Feral System' as Coping Mechanism

Authors: Tajul Urus, S., Molla, A., Teoh, S.Y.

Abstract:

A number of studies have highlighted problems related to ERP systems; yet most of these studies focus on the problems during the project and implementation stages, not during the post-implementation use process. Problems encountered in the process of using ERP hinder the effective exploitation, and the extended and continued use, of ERP systems and their value to organisations. This paper investigates the different types of problems users (operational, supervisory and managerial) faced in using ERP, and how the 'feral system' is used as the coping mechanism. The paper adopts a qualitative method and uses data collected from two cases and 26 interviews to inductively develop a causal network model of ERP usage problems and their coping mechanisms. This model classifies post-ERP usage problems as data quality, system quality, interface, and infrastructure problems. The model also categorises the different coping mechanisms through the use of the 'feral system', inclusive of feral information systems, feral data, and feral use of technology.

Keywords: Case Studies, Coping Mechanism, Post Implementation ERP system, Usage Problem

13560 Multiphase Flow Regime Detection Algorithm for Gas-Liquid Interface Using Ultrasonic Pulse-Echo Technique

Authors: Serkan Solmaz, Jean-Baptiste Gouriet, Nicolas Van de Wyer, Christophe Schram

Abstract:

The efficiency of the cooling process for cryogenic propellant boiling in engine cooling channels in space applications is strongly affected by the phase change that occurs during boiling. The effectiveness of the cooling process depends strongly on the type of boiling regime, such as nucleate or film boiling. Geometric constraints, such as a non-transparent cooling channel, rule out the use of visualization methods. The ultrasonic (US) technique, as a non-destructive testing (NDT) method, has therefore been applied in almost every engineering field for different purposes. Basically, discontinuities emerge between media, such as boundaries between different phases. The sound wave emitted by the US transducer is both transmitted and reflected at a gas-liquid interface, which makes it possible to detect different phases. Due to thermal and structural concerns, it is impractical to sustain direct contact between the US transducer and the working fluid. Hence, the transducer should be located outside the cooling channel, which results in additional interfaces and creates ambiguities about the applicability of the present method. In this work, an exploratory study is conducted to determine the detection ability and applicability of the US technique to the cryogenic boiling process for a cooling cycle where the US transducer is placed outside the channel. Boiling of cryogenics is a complex phenomenon whose thermal properties pose several hindrances to the experimental protocol. Thus, substitute materials are purposefully selected based on these parameters to simplify the experiments. Aside from that, the nucleate and film boiling regimes emerging during the boiling process are simulated using non-deformable stainless steel balls, air-bubble injection apparatuses, and air clearances instead of conducting a real-time boiling process. A versatile detection algorithm is then developed based on the exploratory studies. According to the developed algorithm, the phases can be distinguished with 99% accuracy as no-phase, air-bubble, and air-film presences. The results show the detection ability and applicability of the US technique for an exploratory purpose.
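
Purely as an illustration of the three-class output (the feature and thresholds below are invented, not the paper's algorithm): the interface state can be inferred from how much of the emitted wave is reflected at the wall-fluid boundary, since a gas film reflects far more than a liquid-filled channel.

```python
# Toy echo classifier for the three detected states.

def classify_echo(reflection_coefficient):
    # a gas film reflects almost everything; a liquid-filled channel least;
    # passing bubbles sit in between (threshold values are made up)
    if reflection_coefficient > 0.8:
        return "air-film"
    if reflection_coefficient > 0.4:
        return "air-bubble"
    return "no-phase (liquid)"

for r in (0.15, 0.55, 0.92):
    print(f"R={r:.2f} -> {classify_echo(r)}")
```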

Keywords: Ultrasound, ultrasonic, multiphase flow, boiling, cryogenics, detection algorithm.

13559 Reduce, Reuse and Recycle: Grand Challenges in Construction Recovery Process

Authors: Abioye A. Oyenuga, Rao Bhamidimarri

Abstract:

Running a successful Construction and Demolition Waste (C&DW) recycling operation anywhere around the globe is a challenge today, predominantly because secondary materials markets are yet to be integrated. Reducing, reusing and recycling of C&DW have been employed over the years, and various techniques have been investigated. However, the economic and environmental viability of their application seems limited. This paper discusses the costs and benefits of using secondary materials and focuses on investigating the reuse and recycling process for five major types of construction materials: concrete, metal, wood, cardboard/paper and plasterboard. Data obtained from demolition specialists and contractors are considered and evaluated. The paper found that a construction material recovery process fully incorporating the 3R principle contributes to saving energy and natural resources. This scrutiny leads to the identification of grand challenges in the construction material recovery process. Recommendations to deepen the material recovery process are also discussed.

Keywords: Construction & Demolition Waste (C&DW), 3R concept, Recycling, Reuse, Life-Cycle Assessment (LCA), Waste Management.

13558 Optimal Performance of Plastic Extrusion Process Using Fuzzy Goal Programming

Authors: Abbas Al-Refaie

Abstract:

This study optimized the performance of the plastic extrusion process for drip irrigation pipes using fuzzy goal programming. Two responses were of main interest: roll thickness and hardness. Four main process factors were studied. The L18 array was then used for the experimental design. Individual-moving range control charts were used to assess the stability of the process, while the process capability index was used to assess process performance. Confirmation experiments were conducted at the combination of optimal factor settings obtained by fuzzy goal programming. The results revealed that process capability improved significantly, from -1.129 to 0.8148 for roll thickness and from 0.0965 to 0.714 for hardness. Such improvement results in considerable savings in production and quality costs.
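
The capability figures quoted above are indices of this general form. Here is the textbook Cpk computation on illustrative numbers (not the paper's measurements): the distance from the process mean to the nearest specification limit, in units of three standard deviations.

```python
# Process capability index Cpk on hypothetical roll-thickness data.

def cpk(mean, sd, lsl, usl):
    # distance of the mean to the nearest spec limit, in 3-sigma units
    return min(usl - mean, mean - lsl) / (3 * sd)

# hypothetical samples before and after optimization
print("before:", round(cpk(mean=2.10, sd=0.10, lsl=1.90, usl=2.20), 3))
print("after: ", round(cpk(mean=2.05, sd=0.05, lsl=1.90, usl=2.20), 3))
```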

Keywords: Fuzzy goal programming, extrusion process, process capability, irrigation plastic pipes.

13557 A Monte Carlo Method to Data Stream Analysis

Authors: Kittisak Kerdprasop, Nittaya Kerdprasop, Pairote Sattayatham

Abstract:

Data stream analysis is the process of computing various summaries and derived values from large amounts of data which are continuously generated at a rapid rate. The nature of a stream does not allow revisiting each data element. Furthermore, data processing must be fast to produce timely analysis results. These requirements impose constraints on the design of the algorithms, balancing correctness against timely responses. Several techniques have been proposed over the past few years to address these challenges. These techniques can be categorized as either data-oriented or task-oriented. The data-oriented approach analyzes a subset of data or a smaller transformed representation, whereas the task-oriented scheme solves the problem directly via approximation techniques. We propose a hybrid approach to tackle the data stream analysis problem: the data stream is both statistically transformed to a smaller size and its characteristics computationally approximated. We adopt a Monte Carlo method in the approximation step. The data reduction is performed horizontally and vertically through our EMR sampling method. The proposed method is analyzed by a series of experiments. We apply our algorithm to clustering and classification tasks to evaluate the utility of our approach.
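
A hedged sketch of the flavor of such methods (EMR sampling is the paper's own technique and is not reproduced here): maintain a fixed-size uniform random sample over an unbounded stream with reservoir sampling, then answer queries, such as a mean estimate, from the sample alone.

```python
# Reservoir sampling (Algorithm R) over a simulated stream.

import random

def stream():                            # stands in for an unbounded source
    rng = random.Random(3)
    for _ in range(1_000_000):
        yield rng.gauss(10.0, 2.0)

k = 1000                                 # memory budget: sample size
reservoir, rng = [], random.Random(4)
for n, x in enumerate(stream(), start=1):
    if n <= k:
        reservoir.append(x)
    else:
        j = rng.randrange(n)             # keep each new item with prob k/n
        if j < k:
            reservoir[j] = x

print("estimated mean:", round(sum(reservoir) / k, 3))  # ~10.0
```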

Keywords: Data Stream, Monte Carlo, Sampling, Density Estimation.

13556 Benefits from a SMED Application in a Punching Machine

Authors: Eric Costa, Sara Bragança, Rui Sousa, Anabela Alves

Abstract:

This paper presents an application of the Single-Minute Exchange of Die (SMED) methodology to a turret punching machine in an elevator company in Portugal. The work was developed over five months, within the scope of a master's thesis in Industrial Engineering and Management. The Lean Production tool SMED was applied to reduce setup times in order to improve the production flexibility of the machine. The main results obtained were a reduction of 64% in setup time (from 15.1 to 5.4 min), 50% in work-in-process amount (from 12.8 to 6.4 days) and 99% in the distance traveled by the operator during the internal period (from 136.7 to 1.7 m). These improvements correspond to gains of about €7,315.38 per year.

Keywords: Lean production, setup process, SMED.
