Search results for: workflow applications
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6556

6556 Towards Resources Provisioning for Dynamic Workflows in the Cloud

Authors: Fairouz Fakhfakh, Hatem Hadj Kacem, Ahmed Hadj Kacem

Abstract:

Cloud computing offers a new model of service provisioning for workflow applications, thanks to its elasticity and its pay-per-use pricing model. However, it presents various challenges that need to be addressed in order to be utilized efficiently. The resource provisioning problem for workflow applications has been widely studied. Nevertheless, existing works have not considered changes to workflow instances while they are being executed. This capability has become a major requirement for dealing with unusual situations and evolution. This paper presents a first step towards resource provisioning for dynamic workflows. We propose a provisioning algorithm that minimizes the overall workflow execution cost while meeting a deadline constraint, and we then extend it to support the dynamic addition of tasks. Experimental results show that our proposed heuristic achieves a significant reduction in resource cost by using a consolidation process.
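
As a concrete illustration of the kind of heuristic described above, the sketch below picks, for each task, the cheapest VM type whose runtime still meets the deadline and reuses already-leased instances when they have spare capacity (consolidation). The VM types, prices, task sizes, and the simplified cost accounting are illustrative assumptions, not the paper's actual model.

```python
# A sketch of a deadline-constrained provisioning heuristic with consolidation
# (hypothetical VM types, prices, and cost accounting; not the paper's model).
vm_types = {"small": (1.0, 0.05), "medium": (2.0, 0.12), "large": (4.0, 0.30)}  # speed, $/hour
tasks = [("t1", 8.0), ("t2", 4.0), ("t3", 6.0)]   # (name, work in compute-hours)
deadline = 4.0                                     # hours allowed per task

leases, total_cost = [], 0.0                       # each lease: [vm_type, hours_used]
for name, work in tasks:
    # consolidation: reuse a leased VM if the task still finishes before the deadline
    reuse = next((l for l in leases if l[1] + work / vm_types[l[0]][0] <= deadline), None)
    if reuse is not None:
        reuse[1] += work / vm_types[reuse[0]][0]
        continue
    # otherwise lease the cheapest VM type whose runtime meets the deadline
    vm = min((v for v, (speed, _) in vm_types.items() if work / speed <= deadline),
             key=lambda v: vm_types[v][1])
    leases.append([vm, work / vm_types[vm][0]])
    total_cost += vm_types[vm][1] * deadline       # pay for the whole leased period
print("leases:", leases, "cost:", round(total_cost, 2))
```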

Keywords: cloud computing, resources provisioning, dynamic workflow, workflow applications

Procedia PDF Downloads 293
6555 Preserving Privacy in Workflow Delegation Models

Authors: Noha Nagy, Hoda Mokhtar, Mohamed El Sherkawi

Abstract:

The popularity of workflow delegation models and the increasing number of workflow provenance-aware systems motivate the need for stricter delegation models. Such models combine different approaches for enhanced security while respecting workflow privacy. Although modern enterprises seek conformance to workflow constraints to ensure the correctness of their work, these constraints pose a threat to security, because they can be good seeds for attacking privacy even in secure models. This paper introduces a comprehensive Workflow Delegation Model (WFDM) that utilizes provenance and workflow constraints to prevent a malicious delegate from attacking workflow privacy, while also extending the delegation functionality. In addition, we argue for exploiting workflow constraints to improve workflow security models.

Keywords: workflow delegation models, secure workflow, workflow privacy, workflow provenance

Procedia PDF Downloads 330
6554 Publish/Subscribe Scientific Workflow Interoperability Framework (PS-SWIF) Architecture and Design

Authors: Ahmed Alqaoud

Abstract:

This paper describes the Publish/Subscribe Scientific Workflow Interoperability Framework (PS-SWIF) architecture and its components, which collectively provide interoperability between heterogeneous scientific workflow systems. Requirements to achieve interoperability are identified. The paper also provides a detailed investigation and design of models and solutions for the system requirements, and considers how the workflow interoperability models defined by the Workflow Management Coalition (WfMC) can be achieved using the PS-SWIF system.

Keywords: publish/subscribe, scientific workflow, web services, workflow interoperability

Procedia PDF Downloads 306
6553 An Integrated Web-Based Workflow System for Design of Computational Pipelines in the Cloud

Authors: Shuen-Tai Wang, Yu-Ching Lin

Abstract:

As more and more workflow systems adopt the cloud as their execution environment, various challenges must be addressed for it to be utilized efficiently. This paper introduces a method for resource provisioning based on our previous research on dynamic allocation and its pipeline processes. We present an abstraction for workload scheduling in which independent tasks are scheduled among the various available processors of a distributed computing environment for optimization. We also propose an integrated web-based workflow designer that takes advantage of HTML5 technology and chains together multiple tools. To allow multiple pipelines to be combined and executed on the cloud in parallel, we developed a script translator and an execution engine for workflow management in the cloud. All information is known in advance by the workflow engine, and tasks are allocated according to the prior knowledge in the repository. This proposed effort has the potential to provide support for process definition, workflow enactment, and monitoring of workflow processes. Users benefit from a web-based system that allows the creation and execution of pipelines without scripting knowledge.
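
A minimal sketch of the independent-task scheduling abstraction mentioned above: each task is assigned to the processor that becomes free earliest, longest tasks first. The task set and runtimes are illustrative, not taken from the authors' engine.

```python
# A sketch of scheduling independent tasks onto the earliest-free processor.
import heapq

tasks = {"align": 30, "filter": 12, "sort": 20, "report": 8}   # runtimes (minutes)
processors = ["p0", "p1"]

ready = [(0, p) for p in processors]       # (time the processor becomes free, name)
heapq.heapify(ready)
for name, runtime in sorted(tasks.items(), key=lambda kv: -kv[1]):   # longest first
    free_at, proc = heapq.heappop(ready)
    finish = free_at + runtime
    print(f"{name} -> {proc} (starts {free_at}, ends {finish})")
    heapq.heappush(ready, (finish, proc))

print("pipeline finishes at", max(t for t, _ in ready))
```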

Keywords: workflow systems, resources provisioning, workload scheduling, web-based, workflow engine

Procedia PDF Downloads 159
6552 A Holistic Workflow Modeling Method for Business Process Redesign

Authors: Heejung Lee

Abstract:

In a highly competitive environment, it becomes more important to shorten the whole business process while delivering, or even enhancing, the business value to customers and suppliers. Although workflow management systems receive much attention for their capacity to practically support business process enactment, effective workflow modeling methods remain challenging, and a high degree of process complexity makes it more difficult to achieve short lead times. This paper presents a holistic workflow structuring method that can reduce process complexity using activity needs and formal concept analysis, which eventually enhances key performance measures such as quality, delivery, and cost in the business process.

Keywords: workflow management, re-engineering, formal concept analysis, business process

Procedia PDF Downloads 407
6551 Using an SMT Solver to Minimize Latency and to Optimize the Number of Cores in NoC-DSP Architectures

Authors: Imen Amari, Kaouther Gasmi, Asma Rebaya, Salem Hasnaoui

Abstract:

The problem of scheduling and mapping dataflow applications onto multi-core architectures is notoriously difficult. This difficulty is related to the rapid evolution of telecommunication and multimedia systems, accompanied by a rapid increase in user requirements in terms of latency, execution time, consumption, energy, etc. Obtaining an optimal schedule on multi-core DSP (Digital Signal Processor) platforms is a challenging task. In this context, we present a novel technique and algorithm for finding a valid schedule that optimizes the key performance metrics, particularly latency. Our contribution is based on Satisfiability Modulo Theories (SMT) solving technologies, which are strongly driven by industrial applications and needs. This paper describes a scheduling module integrated into our proposed workflow, which is intended as a practical approach for programming applications on NoC-DSP platforms. The workflow automatically transforms a Simulink model into a synchronous dataflow (SDF) model. The automatic transformation, followed by SMT-based scheduling, aims to minimize the final latency and other software/hardware metrics through an optimal schedule, and also to find the optimal number of cores to be used. Our proposed workflow takes as its entry point a Simulink file (.mdl or .slx) derived from embedded MATLAB functions, and relies on the synchronous and hierarchical behavior of both Simulink and SDF. Running the scheduler contained in the workflow with our proposed SMT solver algorithm refinements produces the best possible schedule in terms of latency and number of cores.
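
As a small illustration of SMT-based latency minimization, the sketch below encodes start times, core assignments, precedence, and non-overlap constraints for a toy task set and asks the Z3 optimizer to minimize the makespan. The task set, durations, and core count are hypothetical and are not the paper's NoC-DSP model or its refinements.

```python
# A minimal SMT scheduling sketch with Z3's Optimize engine (toy task graph).
from z3 import Int, Optimize, Or, sat

durations = {"A": 4, "B": 3, "C": 2}           # task execution times
precedence = [("A", "B"), ("A", "C")]           # B and C depend on A
num_cores = 2

opt = Optimize()
start = {t: Int(f"start_{t}") for t in durations}
core = {t: Int(f"core_{t}") for t in durations}
makespan = Int("makespan")

for t, d in durations.items():
    opt.add(start[t] >= 0, core[t] >= 0, core[t] < num_cores)
    opt.add(makespan >= start[t] + d)

for a, b in precedence:                         # respect data dependencies
    opt.add(start[b] >= start[a] + durations[a])

names = list(durations)
for i in range(len(names)):                     # tasks on the same core must not overlap
    for j in range(i + 1, len(names)):
        a, b = names[i], names[j]
        opt.add(Or(core[a] != core[b],
                   start[a] + durations[a] <= start[b],
                   start[b] + durations[b] <= start[a]))

opt.minimize(makespan)
if opt.check() == sat:
    m = opt.model()
    print("latency:", m[makespan])
    for t in names:
        print(t, "on core", m[core[t]], "at t =", m[start[t]])
```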

Keywords: multi-cores DSP, scheduling, SMT solver, workflow

Procedia PDF Downloads 285
6550 Execution Time Optimization of Workflow Network with Activity Lead-Time

Authors: Xiaoping Qiu, Binci You, Yue Hu

Abstract:

The execution time of a workflow network has an important effect on the efficiency of the business process. In this paper, the activity execution time is divided into service time and waiting time, and the lead time is then extracted from the waiting time. Execution time formulas for the three basic structures in the workflow network are derived based on the activity lead time. Taking an e-commerce logistics process as an example, appropriate lead times are inserted for key activities using a Petri net, and an execution time optimization model is built to minimize the waiting time under time-cost constraints. A solution program written in VC++ 6.0 is then compiled to obtain the optimal solution, which reduces the waiting time of key activities in the workflow and verifies the role of lead time in the timeliness of e-commerce logistics.
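
A minimal sketch of how execution time might be composed for the three basic workflow structures (sequence, AND-parallel, XOR-choice), assuming each activity's time is its service time plus whatever waiting time is not covered by its lead time. The formulas and numbers below are illustrative assumptions, not the ones derived in the paper.

```python
# Toy execution-time composition for sequence, parallel, and choice structures.
def activity_time(service, waiting, lead=0.0):
    # lead time absorbs part of the waiting time (assumed model)
    return service + max(0.0, waiting - lead)

def sequence_time(activities):
    return sum(activity_time(*a) for a in activities)

def parallel_time(activities):            # AND-split/join: wait for the slowest branch
    return max(activity_time(*a) for a in activities)

def choice_time(activities, probs):       # XOR-split: expected time over branches
    return sum(p * activity_time(*a) for a, p in zip(activities, probs))

acts = [(2.0, 1.5, 1.0), (3.0, 2.0, 0.5)]  # (service, waiting, lead) per activity
print(sequence_time(acts), parallel_time(acts), choice_time(acts, [0.6, 0.4]))
```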

Keywords: electronic business, execution time, lead time, optimization model, petri net, time workflow network

Procedia PDF Downloads 174
6549 Cloud Support for Scientific Workflow Execution: Prototyping Solutions for Remote Sensing Applications

Authors: Sofiane Bendoukha, Daniel Moldt, Hayat Bendoukha

Abstract:

Workflow concepts are essential for the development of remote sensing applications. They can help users manage and process satellite data and execute scientific experiments on distributed resources. The objective of this paper is to introduce an approach for the specification and execution of complex scientific workflows in Cloud-like environments. The approach strives to support scientists during the modeling, deployment, and monitoring of their workflows. This work takes advantage of Petri nets, and more specifically the so-called reference nets formalism, which provides a robust modeling/implementation technique. RENEWGRASS is a tool that we implemented and integrated into the Petri nets editor and simulator RENEW. It provides an easy way to support inexperienced scientists during the specification of their workflows, and it allows both modeling and enactment of image processing workflows from the remote sensing domain. Our case study is related to the implementation of vegetation indices; we have implemented the Normalized Difference Vegetation Index (NDVI) workflow. Additionally, we explore the integration possibilities of Cloud technology as a supplementary layer for the deployment of the current implementation. For this purpose, we discuss migration patterns of data and applications and propose an architecture.
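
For reference, the NDVI itself is a simple band ratio, NDVI = (NIR - Red) / (NIR + Red). The sketch below computes it with NumPy on a tiny pair of made-up reflectance arrays; the actual workflow in the paper is enacted through the RENEWGRASS tool rather than plain NumPy.

```python
# A minimal NDVI computation on illustrative red and near-infrared reflectance bands.
import numpy as np

red = np.array([[0.20, 0.30], [0.25, 0.40]])   # red reflectance
nir = np.array([[0.60, 0.55], [0.70, 0.45]])   # near-infrared reflectance

ndvi = (nir - red) / (nir + red + 1e-9)         # NDVI = (NIR - Red) / (NIR + Red)
print(ndvi)
```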

Keywords: cloud computing, scientific workflows, petri nets, RENEWGRASS

Procedia PDF Downloads 446
6548 Developing a Web-Based Workflow Management System in Cloud Computing Platforms

Authors: Wang Shuen-Tai, Lin Yu-Ching, Chang Hsi-Ya

Abstract:

Cloud computing is an innovative and leading information technology model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort. In this paper, we aim at the development of a workflow management system for cloud computing platforms, based on our previous research on the dynamic allocation of cloud computing resources and its workflow process. We took advantage of HTML5 technology and developed a web-based workflow interface. To enable many tasks running on the cloud platform to be combined in sequence, we designed a mechanism and developed an execution engine for workflow management on clouds. We also established a prediction model, integrated with the job queuing system, to estimate the waiting time and cost of individual tasks on different computing nodes, thereby helping users achieve maximum performance at the lowest cost. This proposed effort has the potential to provide an efficient, resilient, and elastic environment for cloud computing platforms. This development also helps boost user productivity by providing a flexible workflow interface that lets users design and control their tasks' flow from anywhere.
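
A minimal sketch of the kind of waiting-time and cost estimate described above: each node's queue length and mean service time give an expected start delay, and the node's hourly price gives the task's cost. The node parameters and pricing rule are illustrative assumptions, not the authors' prediction model.

```python
# Toy per-node wait and cost estimate for a 2-hour task.
nodes = {
    # name: (jobs waiting, mean service time per job in hours, price per hour)
    "node-a": (4, 0.5, 0.12),
    "node-b": (1, 0.8, 0.20),
}

def estimate(task_hours, queued, mean_service, price):
    wait = queued * mean_service      # expected delay before the task starts
    cost = task_hours * price         # billed for the task's own runtime
    return wait, cost

for name, params in nodes.items():
    wait, cost = estimate(2.0, *params)
    print(f"{name}: wait ~{wait:.1f} h, cost ~${cost:.2f}")
```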

Keywords: web-based, workflow, HTML5, cloud computing, queuing system

Procedia PDF Downloads 308
6547 Using Knowledge Management and Visualisation Concepts to Improve Patient and Hospital Staff Workflow

Authors: A. A. AlRasheed, A. Atkins, R. Campion

Abstract:

This paper focuses on using knowledge management and visualisation concepts to improve patient and hospital staff workflow. Hospital workflow is a complex and complicated process, and poor patient flow can put both patients and a hospital's reputation at risk and can threaten the facility's financial sustainability. Healthcare leaders are under increased pressure to reduce costs while maintaining or increasing patient care standards. In this paper, a framework is proposed to help improve patient experience, staff satisfaction, and operational efficiency across hospitals by using knowledge-management-based visualisation concepts. The framework uses real-time visibility to track and monitor the location and status of patients, staff, rooms, and medical equipment.

Keywords: knowledge management, improvements, visualisation, workflow

Procedia PDF Downloads 267
6546 Resource-Constrained Heterogeneous Workflow Scheduling Algorithms in Heterogeneous Computing Clusters

Authors: Lei Wang, Jiahao Zhou

Abstract:

The development of heterogeneous computing clusters provides a strong computing-capacity guarantee for large-scale workflows (e.g., scientific computing, artificial intelligence (AI), etc.). However, the tasks within large-scale workflows have also gradually become heterogeneous due to their different demands on computing resources, which adds a task resource-restriction constraint to the workflow scheduling problem on heterogeneous computing platforms. In this paper, we propose a heterogeneous constrained minimum-makespan scheduling algorithm based on a greedy strategy, which provides an efficient solution to the heterogeneous workflow scheduling problem on a heterogeneous platform. We test the effectiveness of the proposed scheduling algorithm by randomly generating heterogeneous workflows on a heterogeneous computing platform, and the experiments show that our method improves on state-of-the-art methods by 15.2%.
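
A minimal sketch of greedy, resource-constrained makespan scheduling in the spirit of the abstract: each task may only run on nodes that provide its required resource, and among the eligible nodes the one that becomes free earliest is chosen. The tasks, node capabilities, and tie-breaking rule are illustrative, not the paper's algorithm.

```python
# Greedy makespan scheduling with a simple resource-eligibility constraint.
tasks = [("t1", 5, "gpu"), ("t2", 3, "cpu"), ("t3", 4, "gpu"), ("t4", 2, "cpu")]
nodes = {"n1": {"cpu", "gpu"}, "n2": {"cpu"}}
finish = {n: 0 for n in nodes}                     # time each node becomes free

for name, duration, need in sorted(tasks, key=lambda t: -t[1]):   # longest first
    eligible = [n for n, caps in nodes.items() if need in caps]
    target = min(eligible, key=lambda n: finish[n])                # earliest-free node
    finish[target] += duration
    print(f"{name} -> {target} (finishes at {finish[target]})")

print("makespan:", max(finish.values()))
```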

Keywords: heterogeneous computing, workflow scheduling, constrained resources, minimal makespan

Procedia PDF Downloads 32
6545 An Efficient Hardware/Software Workflow for Multi-Core Simulink Applications

Authors: Asma Rebaya, Kaouther Gasmi, Imen Amari, Salem Hasnaoui

Abstract:

Over recent years, applications such as telecommunications, signal processing, and digital communication with advanced features (multi-antenna, equalization, etc.) have witnessed a rapid evolution, accompanied by an increase in user requirements in terms of latency, computational power, and so on. To satisfy these requirements, hardware/software systems are a common solution, where the hardware is composed of multiple cores and the software is represented by a model of computation, for instance a synchronous dataflow (SDF) graph. Moreover, most embedded system designers use Simulink for modeling. The issue is how to simplify the generation of C code, for a multi-core platform, from an application modeled in Simulink. To overcome this problem, we propose a workflow that automatically transforms the Simulink model into an SDF graph and provides an efficient schedule that optimizes the number of cores and minimizes latency. The workflow starts from a Simulink application and a hardware architecture described in the IP-XACT language. Based on the synchronous and hierarchical behavior of both models, the Simulink block diagram is automatically transformed into an SDF graph. Once this process is successfully achieved, the scheduler calculates the optimal number of cores needed by minimizing the maximum density of the whole application. A core is then chosen to execute each graph task in a specific order, and compatible C code is subsequently generated. To realize this proposal, we extend Preesm, a rapid prototyping tool, to take the Simulink model as input and to support the optimal schedule. We then compared our results to this tool's results using a simple illustrative application. The comparison shows that our results strictly dominate the Preesm results in terms of number of cores and latency: if Preesm needs m processors and latency L, our workflow needs m' <= m processors and latency L' < L.
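
One concrete step in such a Simulink-to-SDF flow is computing the SDF graph's repetition vector (how many times each actor must fire per iteration so that token production and consumption balance). The sketch below does this with rational balance equations for a toy graph; the actors and rates are illustrative, not derived from a Simulink model.

```python
# Repetition vector of a small SDF graph via balance equations.
import math
from fractions import Fraction

# edges: (producer, consumer, tokens produced per firing, tokens consumed per firing)
edges = [("A", "B", 2, 3), ("B", "C", 1, 2)]
actors = ["A", "B", "C"]

rate = {"A": Fraction(1)}             # fix one actor and propagate the balance equations
changed = True
while changed:
    changed = False
    for src, dst, prod, cons in edges:
        if src in rate and dst not in rate:
            rate[dst] = rate[src] * prod / cons    # rate[src]*prod == rate[dst]*cons
            changed = True
        elif dst in rate and src not in rate:
            rate[src] = rate[dst] * cons / prod
            changed = True

lcm_den = math.lcm(*(r.denominator for r in rate.values()))
repetitions = {a: int(rate[a] * lcm_den) for a in actors}
print(repetitions)    # {'A': 3, 'B': 2, 'C': 1}
```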

Keywords: hardware/software system, latency, modeling, multi-cores platform, scheduler, SDF graph, Simulink model, workflow

Procedia PDF Downloads 268
6544 Spatial Integrity of Seismic Data for Oil and Gas Exploration

Authors: Afiq Juazer Rizal, Siti Zaleha Misnan, M. Zairi M. Yusof

Abstract:

Seismic data is the fundamental tool utilized by exploration companies to identify potential hydrocarbons. However, the value of seismic trace data will be undermined unless the geospatial component of the data is understood. Deriving a proposed well location from data that has positional ambiguity will jeopardize business decisions and the millions of dollars of investment that every oil and gas company would like to protect. A spatial integrity QC workflow has been introduced in PETRONAS to ensure positional errors within the seismic data are recognized throughout the exploration lifecycle, from acquisition and processing to seismic interpretation. This includes, amongst other tests, verifying that the data is referenced to the appropriate coordinate reference system, validating the survey configuration, and verifying geometry loading. The direct outcome of the workflow implementation helps improve the reliability and integrity of the sub-surface geological models produced by geoscientists and provides important input to potential hazard assessments where positional accuracy is crucial. This workflow's development initiative is part of a bigger geospatial integrity management effort, whereby nearly eighty percent of oil and gas data are location-dependent.
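
Two of the checks mentioned above can be illustrated with a small sketch: confirming the declared coordinate reference system (by EPSG code) and flagging trace coordinates that fall outside the survey's expected bounding box. The EPSG code and bounds below are made-up examples, not PETRONAS values.

```python
# Toy spatial-integrity checks: declared CRS and survey bounding box.
EXPECTED_EPSG = 32648                                # e.g. a UTM zone (illustrative)
SURVEY_BOUNDS = (500000, 550000, 600000, 660000)     # min_x, max_x, min_y, max_y

def check_crs(declared_epsg):
    return declared_epsg == EXPECTED_EPSG

def flag_out_of_bounds(coords):
    min_x, max_x, min_y, max_y = SURVEY_BOUNDS
    return [(x, y) for x, y in coords
            if not (min_x <= x <= max_x and min_y <= y <= max_y)]

print(check_crs(32648))                               # True
print(flag_out_of_bounds([(510000, 640000), (999999, 0)]))   # second trace flagged
```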

Keywords: oil and gas exploration, PETRONAS, seismic data, spatial integrity QC workflow

Procedia PDF Downloads 219
6543 Behavior Consistency Analysis for Workflow Nets Based on Branching Processes

Authors: Wang Mimi, Jiang Changjun, Liu Guanjun, Fang Xianwen

Abstract:

Loop structures often appear in business process modeling, and analyzing the consistency of the corresponding workflow net models that contain loop structures is a problem: existing behavior consistency methods cannot effectively analyze process models with loop structures. In this paper, by analyzing five kinds of behavior relations among transitions, a three-dimensional figure and a two-dimensional behavior relation matrix are proposed. Based on these, an analysis method for the behavior consistency of business processes based on Petri net branching processes is proposed. Finally, an example is given, which shows that the method is effective.

Keywords: workflow net, behavior consistency measures, loop, branching process

Procedia PDF Downloads 387
6542 Employing KNIME-Based and Open-Source Tools to Identify AMI and VER Metabolites from UPLC-MS Data

Authors: Nouf Alourfi

Abstract:

This study examines the metabolism of amitriptyline (AMI) and verapamil (VER) using a KNIME-based method. KNIME is an open-source data-analytics platform; the improved workflow integrates a number of open-source metabolomics tools, such as CFMID and MetFrag, to provide standard data visualisations, predict candidate metabolites, assess them against experimental data, and produce reports on identified metabolites. The use of this workflow is demonstrated by employing three types of liver microsomes (human, rat, and guinea pig) to study the in vitro metabolism of the two drugs (AMI and VER). The workflow is used to create and process UPLC-MS (Orbitrap) data, and the formulas and structures of these drugs' metabolites can be assigned automatically. The key metabolic routes for amitriptyline are hydroxylation, N-dealkylation, N-oxidation, and conjugation, while N-demethylation, O-demethylation, N-dealkylation, and conjugation are the primary metabolic routes for verapamil. The identified metabolites are consistent with those published, confirming the soundness of the workflow technique and the usefulness of computational tools like KNIME in supporting the integration and interoperability of emerging software packages in the metabolomics area.

Keywords: KNIME, CFMID, MetFrag, data analysis, metabolomics

Procedia PDF Downloads 118
6541 Application of Systems Engineering Tools and Methods to Improve Healthcare Delivery Inside the Emergency Department of a Mid-Size Hospital

Authors: Mohamed Elshal, Hazim El-Mounayri, Omar El-Mounayri

Abstract:

The emergency department (ED) can be considered a complex system of interacting entities: patients, human resources, software and hardware systems, interfaces, and other systems. This paper presents research on implementing a detailed Systems Engineering (SE) approach in a mid-size hospital in central Indiana. The methodology will be applied by The Initiative for Product Lifecycle Innovation (IPLI) at Indiana University to study and solve the crowding problem, with the aim of increasing patient throughput and enhancing the treatment experience; therefore, the nature of the crowding problem needs to be investigated along with all other problems that lead to it. The SE methods presented are workflow analysis and systems modeling, where SE tools such as Microsoft Visio are used to construct a group of system-level diagrams that capture the patient workflow, documentation and communication flow, data systems, human resources workflow and requirements, the leadership involved, and the integration between the ED's different systems. Finally, the ultimate goal is to manage the process through the implementation of an executable model using commercial software tools, which will identify bottlenecks, improve documentation flow, and help make the process faster.

Keywords: systems modeling, ED operation, workflow modeling, systems analysis

Procedia PDF Downloads 180
6540 Task Scheduling and Resource Allocation in the Cloud Based on the AHP Method

Authors: Zahra Ahmadi, Fazlollah Adibnia

Abstract:

Scheduling of tasks and the optimal allocation of resources in the cloud must account for the dynamic nature of tasks and the heterogeneity of resources. Applications based on scientific workflows are among the most widely used in this field and are characterized by high processing power and storage demands. To increase their efficiency, it is necessary to plan the tasks properly and select the best virtual machine in the cloud. The goals of the system are effective factors in scheduling tasks and selecting resources, and they depend on various criteria such as time, cost, current workload, and processing power. Multi-criteria decision-making methods are a good choice in this field. In this research, a new method for task planning and resource allocation in a heterogeneous environment based on a modified AHP algorithm is proposed. In this method, the scheduling of input tasks is based on two criteria: execution time and size. Resource allocation combines the AHP algorithm with a first-come, first-served policy. Resource prioritization is done with the criteria of main memory size, processor speed, and bandwidth. To modify the AHP algorithm, Linear Max-Min and Linear Max normalization methods, which have a great impact on the ranking, are considered the best choice for the algorithm. The simulation results show a decrease in the average response time, return time, and execution time of input tasks in the proposed method compared to similar (basic) methods.
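
A minimal sketch of the AHP step for resource prioritization: a pairwise comparison matrix over the three criteria (memory size, processor speed, bandwidth) is turned into priority weights using the common normalized-column-mean approximation of the principal eigenvector. The comparison values are illustrative, not taken from the paper.

```python
# AHP priority weights from a 3x3 pairwise comparison matrix.
import numpy as np

criteria = ["memory size", "processor speed", "bandwidth"]
# A[i][j] = how much more important criterion i is than criterion j
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

normalized = A / A.sum(axis=0)          # normalize each column
weights = normalized.mean(axis=1)       # average across rows -> priority vector
for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")           # larger weight = higher priority in ranking
```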

Keywords: hierarchical analytical process, work prioritization, normalization, heterogeneous resource allocation, scientific workflow

Procedia PDF Downloads 141
6539 The Development of an Automated Computational Workflow to Prioritize Potential Resistance Variants in HIV Integrase Subtype C

Authors: Keaghan Brown

Abstract:

The prioritization of drug resistance mutations that impact protein folding or protein-drug and protein-DNA interactions within macromolecular systems is critical to the success of treatment regimens. With a continual increase in computational tools to assess these impacts, scalability and reproducibility have become essential components of computational analysis and experimental research. Here we introduce a bioinformatics pipeline that combines several structural analysis tools in a simplified workflow, optimizing the available computational hardware and software to automatically ease the flow of data transformations. Utilizing pre-established software tools, it was possible to develop a pipeline with a set of pre-defined functions that automate mutation introduction into the HIV-1 integrase protein structure, calculate the gain and loss of polar interactions, and calculate the change in protein folding energy. Additionally, an automated molecular dynamics analysis was implemented, which reduces the constant need for user input and output management. The resulting pipeline, Automated Mutation Introduction and Analysis (AMIA), is an open-source set of scripts designed to introduce mutations and analyse their effects on the static protein structure as well as on the multi-conformational states obtained from molecular dynamics simulations. The workflow allows the user to visualize all outputs in a user-friendly manner, thereby enabling the prioritization of variant systems for experimental validation.

Keywords: automated workflow, variant prioritization, drug resistance, HIV Integrase

Procedia PDF Downloads 75
6538 Bridging the Gap between Different Interfaces for Business Process Modeling

Authors: Katalina Grigorova, Kaloyan Mironov

Abstract:

The paper focuses on the benefits of business process modeling. Although this discipline has been developing for many years, there is still a need to create new opportunities to meet ever-increasing user needs. Because one of these needs is the conversion of business process models from one standard to another, the authors have developed a converter between the BPMN and EPC standards using workflow patterns as an intermediate representation. Nowadays there are many systems for business process modeling, and the variety of output formats is almost as large as the number of systems themselves. This diversity further hampers the conversion of models. The presented study is aimed at discussing the problems caused by differences in the output formats of various modeling environments.

Keywords: business process modeling, business process modeling standards, workflow patterns, converting models

Procedia PDF Downloads 583
6537 Spatially Distributed Rainfall Prediction Based on Automated Kriging for Landslide Early Warning Systems

Authors: Ekrem Canli, Thomas Glade

Abstract:

The precise prediction of rainfall in space and time is a key element of most landslide early warning systems. Unfortunately, the spatial variability of rainfall is often disregarded in many early warning applications; a common simplification is to use uniformly distributed rainfall to characterize areal rainfall intensity. With spatially differentiated rainfall information, real-time comparison with rainfall thresholds or use in process-based approaches might form the basis for improved landslide warnings. This study suggests an automated workflow from the hourly, web-based collection of rain gauge data to the generation of spatially differentiated rainfall predictions based on kriging. Because the application of kriging is usually a labor-intensive task, a simplified and consequently automated variogram modeling procedure was applied to up-to-date rainfall data. The entire workflow was carried out purely with open-source technology. Validation results, albeit promising, pointed out the challenges involved in purely distance-based, automated geostatistical interpolation techniques for ever-changing environmental phenomena over short temporal and spatial extents.
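
A minimal ordinary kriging sketch in the spirit of the workflow above, assuming the open-source PyKrige package is available. The gauge coordinates, rainfall values, and variogram model choice are illustrative and are not the study's data or its automated variogram fitting procedure.

```python
# Ordinary kriging of toy hourly rain gauge readings onto a small grid.
import numpy as np
from pykrige.ok import OrdinaryKriging

x = np.array([0.5, 2.0, 3.5, 1.0])        # gauge easting (km)
y = np.array([0.5, 1.5, 3.0, 3.5])        # gauge northing (km)
rain = np.array([4.2, 7.8, 3.1, 6.5])     # hourly rainfall (mm)

ok = OrdinaryKriging(x, y, rain, variogram_model="spherical")
gridx = np.linspace(0.0, 4.0, 20)
gridy = np.linspace(0.0, 4.0, 20)
z, ss = ok.execute("grid", gridx, gridy)  # interpolated field and kriging variance
print(z.shape, float(z.max()))
```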

Keywords: kriging, landslide early warning system, spatial rainfall prediction, variogram modelling, web scraping

Procedia PDF Downloads 280
6536 Analyzing Medical Workflows Using Market Basket Analysis

Authors: Mohit Kumar, Mayur Betharia

Abstract:

With the emergence of the Electronic Medical Record (EMR), the healthcare domain collects a lot of data, which has been attracting data mining experts' interest. In the past, doctors relied on their intuition when making critical clinical decisions. This paper presents a means to analyze medical workflows to obtain business insights from huge medical databases. Market Basket Analysis (MBA), a special data mining technique, has been widely used in the marketing and e-commerce fields to discover associations between products bought together by customers. It helps businesses increase their sales by analyzing the purchasing behavior of customers and pitching the right product to the right customer. This paper is an attempt to demonstrate Market Basket Analysis applications in healthcare. In particular, it discusses applications of the Market Basket Analysis algorithm 'Apriori' within healthcare in major areas such as analyzing the workflow of diagnostic procedures, up-selling and cross-selling of healthcare systems, and designing more user-friendly healthcare systems. In the paper, we demonstrate the MBA applications using angiography systems, but the approach can be extrapolated to other modalities as well.
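
A minimal Apriori sketch on a toy "basket" of procedure steps per exam, assuming the open-source mlxtend package is available. The step names, support threshold, and the single confidence calculation are illustrative only and are not the paper's angiography data.

```python
# Frequent procedure-step itemsets via Apriori, plus one rule's confidence.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori

exams = [                                       # each exam is one "basket" of steps
    ["contrast_injection", "fluoroscopy", "stent_placement"],
    ["contrast_injection", "fluoroscopy"],
    ["fluoroscopy", "stent_placement"],
    ["contrast_injection", "fluoroscopy", "stent_placement"],
]

te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(exams).transform(exams), columns=te.columns_)
frequent = apriori(onehot, min_support=0.5, use_colnames=True)
print(frequent.sort_values("support", ascending=False))

# confidence of the rule contrast_injection -> fluoroscopy
support = {items: s for items, s in zip(frequent["itemsets"], frequent["support"])}
conf = (support[frozenset({"contrast_injection", "fluoroscopy"})]
        / support[frozenset({"contrast_injection"})])
print("confidence(contrast_injection -> fluoroscopy) =", conf)
```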

Keywords: data mining, market basket analysis, healthcare applications, knowledge discovery in healthcare databases, customer relationship management, healthcare systems

Procedia PDF Downloads 172
6535 A Survey of Crowdsourcing Technology

Authors: Qianjia Cheng, Hongquan Jiang

Abstract:

Crowdsourcing solves problems that computers cannot handle on their own by integrating computers and the Internet. Its extensive knowledge sources, high efficiency, and high quality have made crowdsourcing attract wide attention in industry and academia in recent years. Online crowdsourcing platforms such as Clickworker and Amazon Mechanical Turk (MTurk) have gradually matured. This paper clarifies the concept of crowdsourcing, outlines the workflow of competitive crowdsourcing, summarizes the related crowdsourcing technologies for workflow, quality control, cost control, and delay control, and introduces typical crowdsourcing platforms. Finally, we highlight some open problems of current crowdsourcing and present some future research directions in this area.

Keywords: application, crowdsourcing, crowdsourcing platform, system architecture

Procedia PDF Downloads 69
6534 A Survey of Crowdsourcing Technology and Application

Authors: Qianjia Cheng, Hongquan Jiang

Abstract:

Crowdsourcing solves problems that computers cannot handle on their own by integrating computers and the Internet. Its extensive knowledge sources, high efficiency, and high quality have made crowdsourcing attract wide attention in industry and academia in recent years. Online crowdsourcing platforms such as Clickworker and Amazon Mechanical Turk (MTurk) have gradually matured. This paper clarifies the concept of crowdsourcing, outlines the workflow of competitive crowdsourcing, summarizes the related crowdsourcing technologies for workflow, quality control, cost control, and delay control, and introduces typical crowdsourcing platforms. Finally, we highlight some open problems of current crowdsourcing and present some future research directions in this area.

Keywords: application, crowdsourcing, crowdsourcing platform, system architecture

Procedia PDF Downloads 88
6533 Process Modeling and Problem Solving: Connecting Two Worlds by BPMN

Authors: Gionata Carmignani, Mario G. C. A. Cimino, Franco Failli

Abstract:

Business Processes (BPs) are the key instrument for understanding how companies operate at an organizational level, taking an as-is view of the workflow and addressing issues by identifying a to-be model. In recent years, the Business Process Model and Notation (BPMN) has become a de facto standard for modeling processes. However, this standard does not explicitly incorporate Problem-Solving (PS) knowledge in the Process Modeling (PM) results, so such knowledge cannot be shared or reused. Narrowing this gap is today a challenging research area. In this paper we present a framework able to capture PS knowledge and to improve a workflow. The framework extends the BPMN specification by incorporating new general-purpose elements. A pilot scenario is also presented and discussed.

Keywords: business process management, BPMN, problem solving, process mapping

Procedia PDF Downloads 412
6532 Multi-Level Priority Based Task Scheduling Algorithm for Workflows in Cloud Environment

Authors: Anju Bala, Inderveer Chana

Abstract:

Task scheduling is the key concern for the execution of performance-driven workflow applications. As efficient scheduling can have a major impact on system performance, task scheduling is often used to assign requests to resources efficiently based on cloud resource characteristics. In this paper, a priority-based task scheduling algorithm is proposed that prioritizes tasks based on the length of their instructions. The proposed approach prioritizes the tasks of cloud applications according to the limits set by six sigma control charts with dynamic threshold values. The proposed algorithm has been validated through the CloudSim toolkit, and the experimental results demonstrate that it is effective for handling multiple task lists from workflows and considerably reduces makespan and execution time.
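
A minimal sketch of deriving task priorities from control-chart limits on instruction length: the mean and standard deviation of the task lengths give dynamic upper and lower limits, and tasks are binned into priority levels accordingly. The task lengths, the one-sigma limits, and the three-level mapping are illustrative assumptions, not the paper's exact rule.

```python
# Priority binning of tasks by instruction length against control-chart limits.
import statistics

lengths = [12000, 18000, 9000, 45000, 22000, 7000, 31000]   # task lengths (instructions)
mean = statistics.mean(lengths)
sigma = statistics.pstdev(lengths)
ucl, lcl = mean + sigma, mean - sigma                        # dynamic thresholds

def priority(length):
    if length > ucl:
        return "high"        # longest tasks get scheduled first
    if length < lcl:
        return "low"
    return "medium"

for length in lengths:
    print(length, priority(length))
```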

Keywords: cloud computing, priority based scheduling, task scheduling, VM allocation

Procedia PDF Downloads 516
6531 To Handle Data-Driven Software Development Projects Effectively

Authors: Shahnewaz Khan

Abstract:

Machine learning (ML) techniques are often used in projects for creating data-driven applications. These tasks typically demand additional research and analysis, and the proper technique and strategy must be chosen to ensure the success of data-driven projects; otherwise, even with a lot of effort, the necessary development might not always be possible. This paper makes an effort to examine the workflow of data-driven software development projects and its implementation process in order to describe how to manage such a project successfully, which will assist in minimizing the added workload.

Keywords: data, data-driven projects, data science, NLP, software project

Procedia PDF Downloads 81
6530 Getting Out of the Box: Tangible Music Production in the Age of Virtual Technological Abundance

Authors: Tim Nikolsky

Abstract:

This paper seeks to explore the different ways in which music producers choose to embrace various levels of technology based on musical values, objectives, affordability, access, and workflow benefits. The current digital audio production workflow is questioned. Engineers and music producers today are increasingly divorced from the tangibility of music production. Making music no longer requires you to reach over and turn a knob. Ideas of authenticity in music production are being redefined. Calculations from the mathematical algorithm with the pretty pictures are increasingly being chosen over hardware containing transformers and tubes. Are mouse clicks and movements equivalent or inferior to the master brush strokes we are seeking to conjure? We are making audio production decisions visually, by constantly looking at a screen, rather than by listening. Have we compromised our musical objectives and values by removing the 'hands-on' nature of music making? DAW interfaces are making our musical decisions for us, not necessarily in our best interests. Technological innovation has presented opportunities as well as challenges for education. What do music production students actually need to learn in a formalised education environment, and to what extent do they need to know it? In this brave new world of omnipresent music creation tools, do we still need tangibility in music production? Interviews with prominent Australian music producers who work in a variety of fields are featured in this paper; they provide insight into these questions and move towards an understanding of how tangibility can be rediscovered in the next generation of music production.

Keywords: analogue, digital, digital audio workstation, music production, plugins, tangibility, technology, workflow

Procedia PDF Downloads 271
6529 General Architecture for Automation of Machine Learning Practices

Authors: U. Borasi, Amit Kr. Jain, Rakesh, Piyush Jain

Abstract:

Data collection, data preparation, model training, model evaluation, and deployment are all processes in a typical machine learning workflow. Training data needs to be gathered and organised; this often entails collecting a sizable dataset and cleaning it to remove or correct any inaccurate or missing information. Preparing the data for use in the machine learning model requires pre-processing it after it has been acquired, which often entails actions like scaling or normalising the data, handling outliers, selecting appropriate features, reducing dimensionality, etc. This pre-processed data is then used to train a model with some machine learning algorithm. After the model has been trained, it needs to be assessed by determining metrics like accuracy, precision, and recall on a test dataset. Every time a new model is built, both data pre-processing and model training, two crucial processes in the machine learning (ML) workflow, must be carried out. Thus, various machine learning algorithms can be employed with every single approach to data pre-processing, generating a large set of combinations to choose from. For example, for every method of handling missing values (dropping records, replacing with the mean, etc.), for every scaling technique, and for every combination of selected features, a different algorithm can be used. As a result, in order to get the optimum outcomes, these tasks are frequently repeated in different combinations. This paper suggests a simple architecture for organizing this large "combination set of pre-processing steps and algorithms" into an automated workflow, which simplifies the task of carrying out all possibilities.
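
A minimal sketch of sweeping such a combination set with scikit-learn pipelines: a small operator pool of imputers, scalers, and models is enumerated exhaustively and each combination is cross-validated. The toy dataset and operator pool are illustrative and are not the architecture (operator pool, configuration, scheduler) proposed in the paper.

```python
# Exhaustive sweep over pre-processing x algorithm combinations with pipelines.
from itertools import product
from sklearn.datasets import load_iris
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
imputers = [SimpleImputer(strategy="mean"), SimpleImputer(strategy="median")]
scalers = [StandardScaler(), MinMaxScaler()]
models = [LogisticRegression(max_iter=500), DecisionTreeClassifier()]

results = []
for imp, sc, model in product(imputers, scalers, models):
    pipe = Pipeline([("impute", imp), ("scale", sc), ("model", model)])
    score = cross_val_score(pipe, X, y, cv=5).mean()    # evaluate this combination
    results.append((score, imp, sc, model))

best = max(results, key=lambda r: r[0])
print(f"best accuracy {best[0]:.3f} with {best[1]}, {best[2]}, {best[3]}")
```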

Keywords: machine learning, automation, AUTOML, architecture, operator pool, configuration, scheduler

Procedia PDF Downloads 56
6528 Geospatial Data Complexity in Electronic Airport Layout Plan

Authors: Shyam Parhi

Abstract:

The Airports GIS program collects airport data, validates and verifies it, and stores it in a specific database. Airports GIS allows authorized users to submit changes to airport data, and the verified data is used to develop several engineering applications. One of these applications is the electronic Airport Layout Plan (eALP), whose primary aim is to move the ALP from paper to digital form. The first phase of development of the eALP was completed recently, and it was tested for a few pilot-program airports across different regions. We conducted a gap analysis and noticed that a lot of development work is needed to fine-tune at least six mandatory sheets of the eALP. It is important to note that a significant amount of programming is needed to move from out-of-the-box ArcGIS to a highly customized ArcGIS, which will be discussed. The ArcGIS viewer's capability to display essential features such as runways, taxiways, and the perpendicular distance between them will also be discussed. An enterprise-level workflow that incorporates a coordination process among different lines of business will be highlighted.

Keywords: geospatial data, geology, geographic information systems, aviation

Procedia PDF Downloads 415
6527 Multi-Objective Rationality Optimisation for Robotic-Fabrication-Oriented Free-Form Timber Structure Morphology Design

Authors: Yiping Meng, Yiming Sun

Abstract:

The traditional construction industry is unable to meet the requirements of novel fabrication and construction. Automated construction and digital design have emerged as industry development trends that compensate for this shortcoming against the backdrop of Industry 4.0. Benefitting from a more flexible working space and a greater variety of end-effector tools compared to CNC methods, robotic fabrication and construction techniques have been used in irregular architectural design. However, there is a lack of a systematic and comprehensive design and optimisation workflow that considers geometric form, material, and fabrication methods. This paper aims to propose a design optimisation workflow for improving the rationality of a free-form timber structure fabricated by a robotic arm. Firstly, the free-form surface is described by NURBS, while its structure is calculated using the finite element analysis method. Then, by considering the characteristics and limiting factors of robotic timber fabrication, strain energy and robustness are set as optimisation objectives, and the structural morphology is optimised by a gradient descent method. As a result, an optimised structure with axial force as the main force and a uniform stress distribution is generated after the structural morphology optimisation process. With the decreased strain energy and the improved robustness, the generated structure's bearing capacity and mechanical properties are enhanced. The results demonstrate the feasibility and effectiveness of the proposed optimisation workflow for free-form timber structure morphology design.
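
A minimal sketch of the optimisation step described above: a weighted sum of two stand-in objectives (a strain-energy-like term and a robustness penalty), plus a soft constraint that keeps the total "material" fixed, is minimised over a few shape parameters by numerical gradient descent. The objective functions, weights, and parameters are illustrative assumptions, not the paper's NURBS/FEA formulation.

```python
# Weighted multi-objective gradient descent on toy shape parameters.
import numpy as np

def objective(z, w=(1.0, 0.5)):
    energy = np.sum(z ** 2)                   # stand-in for strain energy
    robustness = 10.0 * np.var(z)             # stand-in penalty for uneven shapes
    volume = 5.0 * (np.sum(z) - 3.5) ** 2     # soft constraint: keep total "material" fixed
    return w[0] * energy + w[1] * robustness + volume

def num_grad(f, z, eps=1e-6):                 # central-difference gradient
    g = np.zeros_like(z)
    for i in range(z.size):
        d = np.zeros_like(z)
        d[i] = eps
        g[i] = (f(z + d) - f(z - d)) / (2 * eps)
    return g

z = np.array([0.8, 1.6, 0.4, 1.2])            # toy shape parameters (e.g. node heights)
for _ in range(400):
    z = z - 0.01 * num_grad(objective, z)
print("optimised parameters:", np.round(z, 3), "objective:", round(float(objective(z)), 4))
```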

Keywords: robotic fabrication, free-form timber structure, multi-objective optimisation, structural morphology, rational design

Procedia PDF Downloads 192