Search results for: software process engineering
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20511

19551 Optimal Bayesian Chart for Controlling Expected Number of Defects in Production Processes

Authors: V. Makis, L. Jafari

Abstract:

In this paper, we develop an optimal Bayesian chart to control the expected number of defects per inspection unit in production processes with long production runs. We formulate this control problem in the optimal stopping framework. The objective is to determine the optimal stopping rule minimizing the long-run expected average cost per unit time considering partial information obtained from the process sampling at regular epochs. We prove the optimality of the control limit policy, i.e., the process is stopped and the search for assignable causes is initiated when the posterior probability that the process is out of control exceeds a control limit. An algorithm in the semi-Markov decision process framework is developed to calculate the optimal control limit and the corresponding average cost. Numerical examples are presented to illustrate the developed optimal control chart and to compare it with the traditional u-chart.
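
To make the control-limit policy concrete, here is a minimal Python sketch of the posterior update and stopping rule for Poisson defect counts; the rates, shift probability, and limit value below are illustrative placeholders, not the optimal values the paper computes via its semi-Markov decision process algorithm.

```python
import numpy as np
from scipy.stats import poisson

# Placeholder parameters: the paper obtains the optimal limit from
# a semi-Markov decision process algorithm, not by guessing.
LAM0, LAM1 = 2.0, 5.0   # expected defects/unit: in control vs. out of control
P_SHIFT = 0.05          # prob. the process deteriorates between samples
LIMIT = 0.80            # assumed control limit on the posterior probability

def posterior_update(pi, x):
    """One sampling epoch: predict a possible shift, then apply Bayes' rule."""
    pi_pred = pi + (1.0 - pi) * P_SHIFT           # chance of shift before sampling
    l1 = poisson.pmf(x, LAM1)                     # likelihood if out of control
    l0 = poisson.pmf(x, LAM0)                     # likelihood if in control
    return pi_pred * l1 / (pi_pred * l1 + (1.0 - pi_pred) * l0)

rng = np.random.default_rng(1)
pi = 0.0                                          # start in control
for epoch in range(1, 100):
    x = rng.poisson(LAM0 if epoch < 40 else LAM1) # process shifts at epoch 40
    pi = posterior_update(pi, x)
    if pi > LIMIT:                                # control-limit policy: stop
        print(f"stop at epoch {epoch}, posterior {pi:.3f}")
        break
```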

Keywords: Bayesian u-chart, economic design, optimal stopping, semi-Markov decision process, statistical process control

Procedia PDF Downloads 553
19550 Hyperchaos-Based Video Encryption for Device-To-Device Communications

Authors: Samir Benzegane, Said Sadoudi, Mustapha Djeddou

Abstract:

In this paper, we present a software implementation of video streaming encryption for Device-to-Device (D2D) communications using a Hyperchaos-based Random Number Generator (HRNG) implemented in C#. The software implements the proposed HRNG and uses it to generate the keystream for encrypting and decrypting real-time video data. The HRNG is built on a hyperchaotic Lorenz system that produces four signal outputs taken as encryption keys. The generated keys exhibit high-quality randomness, confirmed by passing the standard NIST statistical tests. Security analysis of the proposed encryption scheme confirms its robustness against different attacks.
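
As a rough illustration of the keystream construction (the paper's implementation is in C#; the system equations and parameters below are the commonly cited ones for a 4-D hyperchaotic Lorenz-type system and are assumptions, not values taken from the paper), one can integrate the four state equations and quantize the states into key bytes:

```python
import numpy as np

def hyperchaotic_lorenz_keystream(n_bytes, x0=(1.0, 2.0, 3.0, 4.0),
                                  a=10.0, b=8.0/3.0, c=28.0, r=-1.0,
                                  dt=0.001, skip=5000):
    """Euler-integrate a 4-D hyperchaotic Lorenz-type system and
    quantize its four outputs into a byte keystream. A production
    generator would need NIST validation, as in the paper."""
    x, y, z, w = x0
    out = bytearray()
    step = 0
    while len(out) < n_bytes:
        dx = a * (y - x) + w
        dy = c * x - y - x * z
        dz = x * y - b * z
        dw = -y * z + r * w
        x, y, z, w = x + dt * dx, y + dt * dy, z + dt * dz, w + dt * dw
        step += 1
        if step > skip:                           # discard the transient
            for s in (x, y, z, w):
                out.append(int(abs(s) * 1e6) % 256)   # crude quantizer
    return bytes(out[:n_bytes])

frame = b"raw video frame bytes..."
key = hyperchaotic_lorenz_keystream(len(frame))
cipher = bytes(f ^ k for f, k in zip(frame, key))     # XOR encryption
plain = bytes(c ^ k for c, k in zip(cipher, key))     # XOR decryption
assert plain == frame
```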

Keywords: hyperchaos Lorenz system, hyperchaos-based random number generator, D2D communications, C#

Procedia PDF Downloads 347
19549 Wait-Optimized Scheduler Algorithm for Efficient Process Scheduling in Computer Systems

Authors: Md Habibur Rahman, Jaeho Kim

Abstract:

Efficient process scheduling is a crucial factor in ensuring optimal system performance and resource utilization in computer systems. While various algorithms have been proposed over the years, their effectiveness is still limited. This paper introduces a new Wait-Optimized Scheduler (WOS) algorithm that aims to minimize process waiting time by dividing processes into two layers and considering both process time and waiting time. The WOS algorithm is non-preemptive and prioritizes processes with the shortest WOS. In the first layer, each process runs for a predetermined duration, and any unfinished process is subsequently moved to the second layer, resulting in a decrease in response time. Whenever the first layer is free or the number of processes in the second layer is twice that of the first layer, the algorithm sorts all the processes in the second layer by their remaining time minus waiting time and sends one process to the first layer to run. This ensures that all processes eventually run, optimizing waiting time. To evaluate the WOS algorithm, we conducted experiments comparing its performance with traditional scheduling algorithms such as First-Come-First-Serve (FCFS) and Shortest-Job-First (SJF). The results showed that the WOS algorithm outperformed the traditional algorithms in reducing process waiting time, particularly in scenarios with a large number of short tasks with long wait times. Our study highlights the effectiveness of the WOS algorithm in improving process scheduling efficiency in computer systems. By reducing process waiting time, the WOS algorithm can improve system performance and resource utilization. The findings provide valuable insights for researchers and practitioners developing and implementing efficient process scheduling algorithms.
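
The abstract leaves some of the two-layer mechanics open to interpretation; the Python sketch below is one plausible reading, with the quantum and the burst times invented for illustration.

```python
# One plausible reading of the WOS mechanics: layer 1 runs each process
# for a fixed quantum; unfinished work drops to layer 2, which is drained
# by 'remaining time minus waiting time' whenever layer 1 is free or
# layer 2 grows to twice layer 1's size.
QUANTUM = 3

def wos_schedule(bursts):
    layer1 = [[i, b, 0] for i, b in enumerate(bursts)]  # pid, remaining, waited
    layer2, clock, done = [], 0, {}
    while layer1 or layer2:
        if not layer1 or len(layer2) >= 2 * max(len(layer1), 1):
            if layer2:  # promote the process with least (remaining - waited)
                layer2.sort(key=lambda p: p[1] - p[2])
                layer1.append(layer2.pop(0))
        pid, rem, waited = layer1.pop(0)
        run = min(QUANTUM, rem)
        clock += run
        for p in layer1 + layer2:           # everyone else accumulates wait
            p[2] += run
        if rem - run > 0:
            layer2.append([pid, rem - run, waited])
        else:
            done[pid] = clock               # completion time
    return done

print(wos_schedule([2, 9, 4, 1, 7]))        # toy burst times
```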

Keywords: process scheduling, wait-optimized scheduler, response time, non-preemptive, waiting time, traditional scheduling algorithms, first-come-first-serve, shortest-job-first, system performance, resource utilization

Procedia PDF Downloads 73
19548 The Impact of the Business Process Reengineering on the Practices of the Human Resources Management in the Franco Tunisian Company-Network

Authors: Nesrine Bougarech, Habib Affes

Abstract:

This research places the emphasis on business process reengineering (BPR), which consists in radically altering organizational processes through the optimal use of information technology (IT) to attain major enhancements in quality, performance, and productivity. A survey of BPR practice was carried out in three French groups and their subsidiaries in Tunisia. The data collected were qualitatively analyzed to test the main indicators of the success of a BPR project and to compare the importance of these indicators in the French versus the Tunisian context. The study corroborates that respecting the principles inherent to BPR, together with the diversity of the human resources involved in the project, can lead to better productivity, higher quality of goods or services, and lower cost. Additionally, our results show that respect of these principles and diversity of resources matter more in the French companies than in their Tunisian subsidiaries.

Keywords: business process reengineering (BPR), human resources management (HRM), information technology (IT), management

Procedia PDF Downloads 388
19547 Locating the Best Place for Earthquake Refugee Camps by OpenSource Software: A Case Study for Tehran, Iran

Authors: Reyhaneh Saeedi

Abstract:

Iran is one of the regions most prone to earthquakes, suffering large financial and human losses every year. Around the world, many people lose their homes and lives to natural disasters such as earthquakes. Specifying suitable places for resettling homeless people before an earthquake occurs is therefore one of the most important tasks in crisis planning and management. Natural disasters can be modelled and visualized with a Geospatial Information System (GIS); GIS makes it possible to manage spatial data and to reach several goals through the analyses it provides, and it plays a determining role in disaster management because it can identify the best places for temporary resettlement after such a disaster. In this research, the QuantumGIS software is used: it is open source, so its code is easy to access, and it is free. The AHP method is used as the decision model, and the best places for temporary resettlement are located based on the criteria of the relevant organizations, together with their weights and buffers. Buffer layers of the criteria are created and converted to raster layers; the raster layers are then multiplied by the desired weights, and the results are added together. Eventually, suitable places for resettling victims according to the desired criteria are displayed in different colors, with their suitability ratings, on the QuantumGIS platform.
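
The weighted-overlay step can be illustrated compactly. The NumPy sketch below mimics what the raster calculation does here; the three criterion rasters and AHP weights are made-up stand-ins for the organizational criteria and weights used in the study.

```python
import numpy as np

# Hypothetical 0-1 suitability rasters derived from buffered criteria
# (e.g., distance to roads, hospitals, fault lines), already rasterized.
rng = np.random.default_rng(0)
criteria = [rng.random((100, 100)) for _ in range(3)]

weights = np.array([0.5, 0.3, 0.2])      # AHP weights, must sum to 1
assert np.isclose(weights.sum(), 1.0)

# Weighted linear combination, as in the QGIS raster calculator:
suitability = sum(w * r for w, r in zip(weights, criteria))

# The ten most suitable candidate cells for refugee camps:
rows, cols = np.unravel_index(np.argsort(suitability, axis=None)[-10:],
                              suitability.shape)
print(list(zip(rows, cols)))
```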

Keywords: disaster management, temporary resettlement, earthquake, QuantumGIS

Procedia PDF Downloads 381
19546 Seismic Performance Evaluation of Existing Building Using Structural Information Modeling

Authors: Byungmin Cho, Dongchul Lee, Taejin Kim, Minhee Lee

Abstract:

The procedure for the seismic retrofit of existing buildings includes a seismic evaluation step, in which it is assessed whether the buildings perform satisfactorily under seismic load; based on the results, the buildings are upgraded. Evaluating the seismic performance of a building usually requires a model transformation from elastic analysis to inelastic analysis. However, when the data is not transferred automatically between the two models, engineers must input it manually, which leads to inaccuracy and loss of information and makes the analysis results less accurate. Therefore, in this study, a process for the seismic evaluation of existing buildings using structural information modeling is suggested. Structural information modeling makes the work economical and accurate. To this end, the seismic evaluation process based on ASCE 41 was investigated to determine which parts of it can be computerized. The structural information modeling process is developed for seismic evaluation using the Perform 3D program, commonly used for nonlinear response history analysis. To validate this process, the seismic performance of an existing building is investigated.

Keywords: existing building, nonlinear analysis, seismic performance, structural information modeling

Procedia PDF Downloads 364
19545 The Use of Software and Internet Search Engines to Develop the Encoding and Decoding Skills of a Dyslexic Learner: A Case Study

Authors: Rabih Joseph Nabhan

Abstract:

This case study explores the impact of two major computer software programs, Learn to Speak English and Learn English Spelling and Pronunciation, and of Internet search engines such as Google, on mending the decoding and spelling deficiency of Simon X, a dyslexic student. The improvement in decoding and spelling may result in better reading comprehension and composition writing. Some computer programs and Internet materials can help regain the missing awareness and consequently restore his self-confidence and self-esteem. In addition, this study provides a systematic plan comprising a set of activities (four computer programs and Internet materials) which address the problem from the lowest to the highest levels of phoneme and phonological awareness. Four methods of data collection (accounts, observations, published tests, and interviews) create the triangulation needed to collect data validly and reliably before, during, and after the plan. The data collected are analyzed quantitatively, qualitatively, or through a combination of both, and tables and figures are utilized to provide a clear and uncomplicated illustration of some of the data. The improvement in decoding, spelling, reading comprehension, and composition writing skills is demonstrated through authentic materials produced by the student under study: a comparison of two sample passages written by the learner before and after the plan, a genuine computer chat conversation, and the scores of the academic year that followed the execution of the plan. Based on these results, the researcher recommends further studies on other Lebanese dyslexic learners using the computer to mend their language problems, in order to design a most reliable software program that can address this disability more efficiently and successfully.

Keywords: analysis, awareness, dyslexic, software

Procedia PDF Downloads 203
19544 Understanding Tacit Knowledge and DIKW

Authors: Bahadir Aydin

Abstract:

Today it is difficult to reach accurate knowledge because of the mass of data, which makes the environment more and more chaotic. Data is a main pillar of intelligence, and there is a close tie between knowledge and intelligence. Information gathered from different sources can be modified, interpreted, and classified through a knowledge development process, which is applied in order to attain intelligence. Within this process, the effect of knowledge is crucial. Knowledge is classified as explicit or tacit; tacit knowledge can be seen as "only the tip of the iceberg", and it accounts for much more than we guess throughout the intelligence cycle. If the concept of intelligence is scrutinized, it can be seen to contain risks and threats as well as success. The main purpose of any organization is to be successful by eliminating risks and threats. Therefore, there is a need to connect, or fuse, existing information with the processes that can be used to develop it. With the help of such a process, the decision-maker can be presented with a clear, holistic understanding as early as possible in the decision-making process. Planning, execution, and assessment are the key functions that connect information to knowledge. Shifting from the current traditional reactive approach to a proactive knowledge development approach would reduce extensive duplication of work in the organization, and knowledge could be used more effectively.

Keywords: knowledge, intelligence cycle, tacit knowledge, DIKW

Procedia PDF Downloads 500
19543 Understanding Team Member Autonomy and Team Collaboration: A Qualitative Study

Authors: Ayşen Bakioğlu, Gökçen Seyra Çakır

Abstract:

This study aims to explore how research assistants who work in project teams experience team member autonomy and how they reconcile it with team collaboration. The study utilized snowball sampling: 20 research assistants who work in the faculties of education at Marmara University and Yıldız Technical University were interviewed. The data were examined through content analysis, with MAXQDA Plus 11, a qualitative data analysis package, used as the analysis tool. According to the findings, the emerging themes include team norm formation, team coordination management, the role of individual tasks in team collaboration, and leadership distribution. Interviewees experience team norm formation in terms of processes pertaining to task fulfillment and processes pertaining to the regulation of team dynamics. The team norm formation process instills a sense of responsibility among individual team members, and the interviewees' responses indicate that realizing the obligation to work in a team contributes to this process. Participants indicate that individual expectations are taken into consideration during the coordination of the team, and that the supervisor of the project team has a crucial role in maintaining team collaboration. Coordination problems arise when an individual team member does not relate his or her academic field to the research topic of the project team. The findings indicate that leadership distribution in the project teams involves two leadership processes: distribution based on processes that focus on individual team members, and distribution based on processes that focus on team interaction. Finally, individual tasks serve as a facilitator of collaboration among team members and also facilitate the expression of individuality.

Keywords: project teams in higher education, research assistant teams, team collaboration, team member autonomy

Procedia PDF Downloads 345
19542 The Analysis of Different Classes of Weighted Fuzzy Petri Nets and Their Features

Authors: Yurii Bloshko, Oksana Olar

Abstract:

This paper presents an analysis of six different classes of Petri nets: fuzzy Petri nets (FPN), generalized fuzzy Petri nets (GFPN), parameterized fuzzy Petri nets (PFPN), T2GFPN, flexible generalized fuzzy Petri nets (FGFPN), and binary Petri nets (BPN). These classes were simulated in the dedicated software PNeS® to analyze their pros and cons, using example models devoted to the decision-making process of passenger transport logistics. The paper analyzes two approaches: one in which the input values are filled in from the experts' knowledge, and one in which fuzzy expectations, represented by output values, are added as well. Both approaches exploit triples of functions that can be replaced with different combinations of t-/s-norms.
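
To make the role of the replaceable function triples concrete, here is a minimal sketch of a single fuzzy Petri net transition firing; the rule, token degrees, and certainty factor are invented examples, and the actual classes compared in PNeS® carry much richer semantics.

```python
# A single FPN production rule: IF p1 AND p2 THEN p3 (certainty cf).
# The three replaceable functions of the 'triple' are modelled here as
# the input t-norm, the propagation operator, and the output s-norm.
t_norm = min                        # conjunction of input places
s_norm = max                        # aggregation on the output place
prop   = lambda d, cf: d * cf       # propagation through the transition

marking = {"p1": 0.8, "p2": 0.6, "p3": 0.0}

def fire(marking, inputs, output, cf):
    degree = t_norm(marking[p] for p in inputs)        # truth of antecedent
    new = prop(degree, cf)                             # transported degree
    marking[output] = s_norm(marking[output], new)     # merge with old token
    return marking

print(fire(marking, ("p1", "p2"), "p3", cf=0.9))       # p3 -> 0.54
```

Swapping `t_norm`/`s_norm` for other pairs (product/probabilistic sum, Łukasiewicz, etc.) is exactly the kind of variation the compared net classes parameterize.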

Keywords: fuzzy Petri net, intelligent computational techniques, knowledge representation, triangular norms

Procedia PDF Downloads 126
19541 The Extended Skew Gaussian Process for Regression

Authors: M. T. Alodat

Abstract:

In this paper, we propose a generalization of the Gaussian process regression (GPR) model called the extended skew Gaussian process for regression (ESGPR) model. The ESGPR model works better than the GPR model when the errors are skewed. We derive the predictive distribution of the ESGPR model at a new input. We also apply the ESGPR model to FOREX data and find that it fits the data better than the GPR model.
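
For reference, the baseline GPR predictive distribution that the ESGPR generalizes is the standard one: given training inputs X with noisy targets y, kernel k, noise variance sigma_n^2, and a new input x*,

```latex
\begin{aligned}
f_* \mid X, \mathbf{y}, \mathbf{x}_* &\sim \mathcal{N}\!\left(\bar{f}_*,\; \mathbb{V}[f_*]\right),\\
\bar{f}_* &= \mathbf{k}_*^{\top} \left(K + \sigma_n^2 I\right)^{-1} \mathbf{y},\\
\mathbb{V}[f_*] &= k(\mathbf{x}_*, \mathbf{x}_*) - \mathbf{k}_*^{\top} \left(K + \sigma_n^2 I\right)^{-1} \mathbf{k}_*,
\end{aligned}
```

where K_ij = k(x_i, x_j) and (k*)_i = k(x_i, x*). The ESGPR replaces the Gaussian error assumption with an extended skew normal one; its predictive distribution is the paper's own derivation and is not reproduced here.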

Keywords: extended skew normal distribution, Gaussian process for regression, predictive distribution, ESGPR model

Procedia PDF Downloads 531
19540 Information Communication Technology Based Road Traffic Accidents’ Identification, and Related Smart Solution Utilizing Big Data

Authors: Ghulam Haider Haidaree, Nsenda Lukumwena

Abstract:

Today the world of research enjoys abundant data, available in virtually every field: technology, science, business, politics, etc. This is commonly referred to as big data. It offers a great deal of precision and accuracy, supporting an in-depth look at any decision-making process. When well used, big data affords its users the opportunity to produce substantially well-supported results. This paper leans extensively on big data to investigate possible smart solutions to urban mobility and related issues, namely road traffic accidents and their casualties and fatalities, based on multiple factors including age, gender, and the locations at which accidents occur. Multiple technologies were used in combination to produce an Information Communication Technology (ICT) based solution with embedded technology, principally Geographic Information Systems (GIS), the Orange data mining software, and Bayesian statistics, to name a few. The study uses the 2016 Leeds accident data to illustrate the thinking process and extracts from it a model that can be tested, evaluated, and replicated. The authors optimistically believe that the proposed model will significantly and smartly help to flatten the curve of road traffic accidents in fast-growing population centers, where motor-based mobility is increasing considerably.

Keywords: accident factors, geographic information system, information communication technology, mobility

Procedia PDF Downloads 196
19539 The Comparative Study of Binary Artifact Repository Managers

Authors: Evgeny Chugunnyy, Alena Gerasimova, Kirill Chernyavskiy, Alexander Krasnov

Abstract:

One of the primary components of continuous deployment (CD) is the binary artifact repository: the place where artifacts are stored, with metadata, in a structured way. A binary artifact repository manager (BARM) is software that implements this repository logic and exposes a public application programming interface (API) for managing the artifacts. Almost every programming language ecosystem has its own kind of artifact repository. While creating Artipie, a BARM constructor and server, we analyzed and implemented many different artifact repositories. In this paper we present criteria for comparing artifact repositories and analyze the most popular ones using these metrics. We also describe some notable features of different repositories. This paper aims to help people who are creating, maintaining, or optimizing software repositories and CI tools.
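
As a toy illustration of the core logic every BARM implements (this is not Artipie's actual API; all names here are hypothetical), consider an in-memory store keyed by artifact coordinates with attached metadata:

```python
from dataclasses import dataclass, field

@dataclass
class Repository:
    """Toy binary artifact repository: coordinates -> (bytes, metadata)."""
    _store: dict = field(default_factory=dict)

    def put(self, group: str, name: str, version: str,
            data: bytes, **metadata) -> None:
        key = (group, name, version)
        if key in self._store:
            raise ValueError("releases are immutable")  # a common BARM rule
        self._store[key] = (data, metadata)

    def get(self, group: str, name: str, version: str):
        return self._store[(group, name, version)]

repo = Repository()
repo.put("com.example", "app", "1.0.0", b"\x7fELF...",
         sha256="(checksum)", license="MIT")
data, meta = repo.get("com.example", "app", "1.0.0")
print(meta)
```

Real BARMs layer the ecosystem-specific upload protocol, index files, and access control on top of this kernel, which is where the compared repositories differ most.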

Keywords: artifact, repository, continuous deployment, build automation, artifacts management

Procedia PDF Downloads 123
19538 A Tool to Represent People Approach to the Use of Pharmaceuticals and Related Criticality and Needs: A Territory Experience

Authors: Barbara Pittau, Piergiorgio Palla, Antonio Mastino

Abstract:

Communication is fundamental to health education. The proper use of medicinal products is a crucial aspect of citizens' health that affects both safety and health care spending. Therefore, encouraging and promoting communication about the importance of the proper use of pharmaceuticals has substantial implications for individual health, health care, and health care system sustainability. In view of these considerations, in the context of two projects, one of which is still in progress, a relational database-backed web application named COLLABORAFARMACISOLA has been designed and developed as a tool to analyze and visualize how people approach the use of medicinal products, with the aim of enhancing communication efficacy. The application is being used to collect information, anonymously and voluntarily, from the citizens of Sardinia, an Italian region, regarding their knowledge, experiences, and opinions towards pharmaceuticals. The study, which to date has covered thousands of interviewees, has focused on aspects such as treatment interruption and "self-prescription" without medical consultation, the attention paid to reading the leaflets, awareness of the economic value of pharmaceuticals, the importance of avoiding the waste of medicinal products, and attitudes towards the use of generics. To this purpose, our software application provides a set of ad hoc parsing routines to store the information in a relational database and to process and visualize it through a set of interactive tools aimed at emphasizing the findings and insights obtained. The results of our preliminary analysis show the efficacy of the awareness plan and, at the same time, the criticality and the needs of the territory under examination. The ultimate goal of our study is to contribute to the community by improving communication, which can benefit public health in a context strictly connected to the reality of the territory.
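
A hypothetical miniature of the database side, sketched with Python's built-in sqlite3 (the abstract does not describe COLLABORAFARMACISOLA's actual schema, so tables and fields here are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE respondent (
    id       INTEGER PRIMARY KEY,
    age_band TEXT,
    district TEXT                 -- anonymous: no identifying data stored
);
CREATE TABLE response (
    respondent_id INTEGER REFERENCES respondent(id),
    topic  TEXT,   -- e.g. 'treatment interruption', 'generics attitude'
    answer TEXT
);
""")
conn.execute("INSERT INTO respondent VALUES (1, '35-44', 'Cagliari')")
conn.execute("INSERT INTO response VALUES (1, 'generics attitude', 'favourable')")

# The interactive tools would aggregate over such tables:
for row in conn.execute("""SELECT topic, answer, COUNT(*) FROM response
                           GROUP BY topic, answer"""):
    print(row)
```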

Keywords: communication, pharmaceuticals, public health, relational database, tool, web application

Procedia PDF Downloads 120
19537 Simulation of the Reactive Rotational Molding Using Smoothed Particle Hydrodynamics

Authors: A. Hamidi, S. Khelladi, L. Illoul, A. Tcharkhtchi

Abstract:

Reactive rotational molding (RRM) is a process for manufacturing hollow plastic parts from reactive material. It has several advantages over conventional rotomolding of thermoplastic powders: the process cycle time is shorter, the raw material is less expensive because polymerization occurs during processing, and high-performance polymers such as thermosets, thermoplastics, or blends may be used. However, several phenomena occur during this process, which makes its optimization quite complex. In this study, we used a mixture of isocyanate and polyol as the reactive system. The chemical transformation of this system to polyurethane was studied by thermal analysis and rheology tests. From these curing and rheological measurements, the kinetics and rheokinetics of the polyurethane were identified. Smoothed particle hydrodynamics (SPH), a Lagrangian meshless method, was chosen to simulate the reactive fluid flow of the polyurethane in 2D and 3D configurations of the process, taking into account the chemical and chemorheological results obtained experimentally in this study.
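
The basic SPH building block underneath such a simulation is the kernel-weighted summation. A minimal 2D density evaluation with the standard cubic-spline kernel is sketched below; the particle data are toy values, not the study's polyurethane setup.

```python
import numpy as np

def cubic_spline_2d(r, h):
    """Standard 2-D cubic spline SPH kernel W(r, h)."""
    sigma = 10.0 / (7.0 * np.pi * h**2)      # 2-D normalization constant
    q = r / h
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def densities(pos, mass, h):
    """SPH density at each particle: rho_i = sum_j m_j W(|r_i - r_j|, h)."""
    diff = pos[:, None, :] - pos[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    return (mass[None, :] * cubic_spline_2d(r, h)).sum(axis=1)

rng = np.random.default_rng(0)
pos = rng.random((200, 2)) * 0.1             # particles in a 0.1 m box
mass = np.full(200, 1.2e-3 / 200)            # total mass split evenly (made up)
print(densities(pos, mass, h=0.01)[:5])
```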

Keywords: reactive rotational molding, simulation, smoothed particle hydrodynamics, surface tension, rheology, free surface flows, viscoelastic, interpolation

Procedia PDF Downloads 272
19536 Efficiency of Membrane Distillation to Produce Fresh Water

Authors: Sabri Mrayed, David Maccioni, Greg Leslie

Abstract:

Seawater desalination has been accepted as one of the most effective solutions to the growing problem of a diminishing clean drinking water supply. Currently, two desalination technologies dominate the market: the thermally driven multi-stage flash distillation (MSF) and the membrane-based reverse osmosis (RO). In recent years, however, membrane distillation (MD) has emerged as a potential alternative to the established means of desalination. This research project set out to determine the viability of MD as an alternative to MSF and RO for seawater desalination. Specifically, the project involved a thermodynamic analysis of the process, based on the second law of thermodynamics, to determine the efficiency of MD. Data were obtained from experiments carried out on a laboratory rig. To determine the exergy values required for the exergy analysis, two separate models were built in Engineering Equation Solver: the 'Minimum Separation Work Model' and the 'Stream Exergy Model'. The second-law efficiency of the MD process was found to be 17.3%, and the energy consumption was determined to be 4.5 kWh per cubic meter of fresh water produced. The results indicate MD has potential as a technique for seawater desalination compared to RO and MSF, but only if an alternative energy source such as green or waste energy is available to provide the thermal energy input to the process. If the process was required to power itself, it was shown to be highly inefficient and in no way thermodynamically viable as a commercial desalination process.
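
The reported figures are mutually consistent. The second-law efficiency compares the least work of separation with the actual energy input:

```latex
\eta_{II} = \frac{\dot{W}_{\min}}{\dot{W}_{\mathrm{actual}}}
\qquad\Rightarrow\qquad
\dot{W}_{\min} \approx 0.173 \times 4.5\ \mathrm{kWh/m^3} \approx 0.78\ \mathrm{kWh/m^3},
```

which agrees with the commonly cited least work of separation for standard seawater at low recovery (roughly 0.76-0.8 kWh/m³).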

Keywords: desalination, exergy, membrane distillation, second law efficiency

Procedia PDF Downloads 341
19535 Performance Comparison and Visualization of COMSOL Multiphysics, Matlab, and Fortran for Predicting the Reservoir Pressure on Oil Production in a Multiple Leases Reservoir with Boundary Element Method

Authors: N. Alias, W. Z. W. Muhammad, M. N. M. Ibrahim, M. Mohamed, H. F. S. Saipol, U. N. Z. Ariffin, N. A. Zakaria, M. S. Z. Suardi

Abstract:

This paper presents a performance comparison of computation software for solving the boundary element method (BEM). The BEM formulation is a numerical technique with high potential for solving the advanced mathematical models used to predict the production of oil wells in an arbitrarily shaped, multiple-lease reservoir. The limited validation of data for ensuring that a program meets the accuracy of the mathematical model is the research motivation of this paper. Based on this limitation, three steps are involved in validating the accuracy of the oil production simulation process. In the first step, the mathematical model, a partial differential equation (PDE) of Poisson-elliptic type, is identified in order to perform the BEM discretization. In the second step, the 2D BEM discretization is implemented in the COMSOL Multiphysics and MATLAB programming environments. In the last step, the numerical performance indicators of both environments are analyzed, using a Fortran implementation for validation. The performance comparison of the numerical analysis is investigated in terms of percentage error, comparison graphs, and 2D visualization of the pressure on oil production of the multiple-lease reservoir. According to the performance comparison, structured programming in Fortran is the preferred alternative for implementing an accurate numerical simulation of the BEM. In conclusion, as a high-level language for numerical computation, and on the evidence of the numerical performance evaluation, Fortran is well suited for capturing the visualization of oil well production in arbitrarily shaped reservoirs.

Keywords: performance comparison, 2D visualization, COMSOL Multiphysics, MATLAB, Fortran, modelling and simulation, boundary element method, reservoir pressure

Procedia PDF Downloads 474
19534 Buoyancy Effects in Pressure Retarded Osmosis with Extremely High Draw Solution Concentration

Authors: Ivonne Tshuma, Ralf Cord-Ruwisch, Wendell Ela

Abstract:

Water crisis is a worldwide problem because of population growth and climate change; hence, desalination is a solution to the water scarcity that threatens the world. Reverse osmosis (RO) is the most used technique for desalination; unfortunately, this process usually has high pressure requirements and hence consumes a lot of energy, about 3-5.5 kWh of electrical energy per cubic meter. The pressure requirements of RO can be alleviated by using pressure retarded osmosis (PRO) to drive the RO process. This paper proposes a process that utilizes the energy from PRO directly to drive an RO process. The paper mostly analyses PRO process parameters such as cross-flow velocity, density, and buoyancy, and how these affect PRO and hence, ultimately, the RO process. An experimental study of PRO with various feed solution concentrations and cross-flow velocities, at fixed applied pressure and with different orientations of the PRO cell, was performed. The study revealed that buoyancy effects were observed without cross-flow velocity but not with it.

Keywords: cross-flow velocity, pressure retarded osmosis, density, buoyancy

Procedia PDF Downloads 127
19533 Automatic Algorithm for Processing and Analysis of Images from the Comet Assay

Authors: Yeimy L. Quintana, Juan G. Zuluaga, Sandra S. Arango

Abstract:

The comet assay is a method based on electrophoresis that is used to measure DNA damage in cells; it has shown important results in identifying substances that pose a potential risk to the human population, including innumerable physical, chemical, and biological agents. With this technique it is possible to obtain comet-like images, in which the tail corresponds to damaged fragments of the DNA. One of the main problems is that the image has unequal luminosity caused by the fluorescence microscope; it therefore requires preprocessing to condition it, as well as methods to determine how many usable comets there are per sample and, finally, to perform the measurements and determine the percentage of DNA damage. In this paper, we propose the design and implementation of software using the MATLAB Image Processing Toolbox that automates the image processing. The software chooses the optimal comets and measures the parameters necessary to quantify the damage.
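
A rough Python analogue of the pipeline is sketched below (the authors work in MATLAB's Image Processing Toolbox; the background sigma, size filter, and head/tail split are illustrative choices, and the demo image is synthetic):

```python
import numpy as np
from skimage import filters, measure, morphology

def analyze_comet(img):
    """img: 2-D grayscale comet image, float in [0, 1]."""
    background = filters.gaussian(img, sigma=50)      # uneven illumination
    flat = np.clip(img - background, 0.0, None)       # flat-field correction
    mask = flat > filters.threshold_otsu(flat)        # segment fluorescence
    mask = morphology.remove_small_objects(mask, 64)  # drop debris
    labels = measure.label(mask)
    results = []
    for region in measure.regionprops(labels, intensity_image=flat):
        sub = region.intensity_image
        head = sub[:, : sub.shape[1] // 2].sum()      # assume head on the left
        tail = sub[:, sub.shape[1] // 2 :].sum()
        results.append(100.0 * tail / (head + tail))  # % DNA in tail
    return results

# Synthetic demo: gradient illumination plus a head blob and a faint tail.
yy, xx = np.mgrid[:256, :256]
img = 0.3 * (xx / 256)
img += np.exp(-((yy - 128) ** 2 + (xx - 100) ** 2) / 400)
img += 0.5 * np.exp(-((yy - 128) ** 2 / 400 + (xx - 150) ** 2 / 3000))
print(analyze_comet(np.clip(img, 0, 1)))
```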

Keywords: artificial vision, comet assay, DNA damage, image processing

Procedia PDF Downloads 283
19532 A Model for Analyzing the Startup Dynamics of a Belt Transmission Driven by a DC Motor

Authors: Giovanni Incerti

Abstract:

In this paper, the dynamic behavior of a synchronous belt drive during start-up is analyzed and discussed. Besides the belt elasticity, the mathematical model proposed here also takes into consideration the electrical behavior of the DC motor. The solution of the motion equations is obtained by means of modal analysis in state space, which makes it possible to decouple all the equations of the mathematical model without introducing the hypothesis of proportional damping. The mathematical model of the transmission and the solution algorithms have been implemented in computing software that allows the user to simulate the dynamics of the system and to evaluate the effects due to the elasticity of the belt branches and to the electromagnetic behavior of the DC motor. To show the details of the calculation procedure, the paper presents a case study developed with the aid of this software.
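
A stripped-down version of such a model, with the belt branches lumped into a single driven inertia (the paper retains the belt elasticity and decouples the equations by modal analysis; all parameter values below are invented), can be integrated directly:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Invented parameters for a small DC motor driving a lumped inertia.
R, L, KT, KE = 1.0, 0.5e-3, 0.05, 0.05   # ohm, H, Nm/A, Vs/rad
J, C = 2e-4, 1e-4                        # kg m^2, Nms/rad
V = 24.0                                 # step supply voltage at start-up

def rhs(t, s):
    i, omega = s
    di = (V - R * i - KE * omega) / L    # armature circuit equation
    domega = (KT * i - C * omega) / J    # mechanical torque balance
    return [di, domega]

sol = solve_ivp(rhs, (0.0, 0.5), [0.0, 0.0], max_step=1e-4)
print(f"steady-state speed ~ {sol.y[1, -1]:.1f} rad/s")
```

The full model replaces the single inertia with the elastic belt branch equations, which is exactly what makes the modal, state-space decoupling worthwhile.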

Keywords: belt drive, vibrations, startup, DC motor

Procedia PDF Downloads 554
19531 A Regression Model for Predicting Sugar Crystal Size in a Fed-Batch Vacuum Evaporative Crystallizer

Authors: Sunday B. Alabi, Edikan P. Felix, Aniediong M. Umo

Abstract:

Crystal size distribution is of great importance in sugar factories: it determines the market value of granulated sugar and also influences the cost of producing sugar crystals. Typically, sugar is produced in a fed-batch vacuum evaporative crystallizer. Crystallization quality is examined through the crystal size distribution at the end of the process, which is quantified by two parameters: the average crystal size of the distribution, the mean aperture (MA), and the width of the distribution, the coefficient of variation (CV). The lack of real-time measurement of sugar crystal size hinders its feedback control and the eventual optimisation of the crystallization process. An attractive alternative is to use a soft sensor (model-based method) for online estimation of the sugar crystal size. Unfortunately, the available models of the sugar crystallization process are not suitable, as they do not contain variables that can be measured easily online. The main contribution of this paper is the development of a regression model for estimating the sugar crystal size as a function of input variables that are easy to measure online, which has the potential to provide real-time estimates of crystal size for effective feedback control. Using seven input variables, namely initial crystal size (L0), temperature (T), vacuum pressure (P), feed flowrate (Ff), steam flowrate (Fs), initial supersaturation (S0), and crystallization time (t), preliminary studies were carried out using the Minitab 14 statistical software. Based on the existing sugar crystallizer models and the typical ranges of these seven input variables, 128 datasets were obtained from a 2-level factorial experimental design. These datasets were used to obtain a simple but online-implementable six-input crystal size model; the initial crystal size (L0) turned out not to play a significant role. The goodness of the resulting regression model was evaluated: the coefficient of determination R² was 0.994, and the maximum absolute relative error (MARE) was 4.6%. The high R² (~1.0) and the reasonably low MARE indicate that the model can predict sugar crystal size accurately as a function of the six easy-to-measure online variables. Thus, the model can be used as a soft sensor to provide real-time estimates of sugar crystal size during crystallization in a fed-batch vacuum evaporative crystallizer.
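
The two goodness-of-fit figures are straightforward to compute for any fitted model; below is a small NumPy sketch with synthetic data standing in for the 128 factorial runs.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((128, 6))                     # stand-ins for T, P, Ff, Fs, S0, t
beta_true = np.array([0.3, -0.1, 0.2, 0.15, 0.4, 0.25])
y = 1.0 + X @ beta_true + rng.normal(0, 0.01, 128)   # synthetic crystal size

A = np.column_stack([np.ones(128), X])       # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, y, rcond=None) # ordinary least squares fit
y_hat = A @ coef

ss_res = ((y - y_hat) ** 2).sum()
ss_tot = ((y - y.mean()) ** 2).sum()
r2 = 1.0 - ss_res / ss_tot                      # coefficient of determination
mare = 100.0 * np.max(np.abs((y - y_hat) / y))  # max abs relative error, %
print(f"R^2 = {r2:.3f}, MARE = {mare:.1f}%")
```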

Keywords: crystal size, regression model, soft sensor, sugar, vacuum evaporative crystallizer

Procedia PDF Downloads 192
19530 Stereo Camera Based Speed-Hump Detection Process for Real Time Driving Assistance System in the Daytime

Authors: Hyun-Koo Kim, Yong-Hun Kim, Soo-Young Suk, Ju H. Park, Ho-Youl Jung

Abstract:

This paper presents an effective speed hump detection process for the daytime. We focus only on round types of speed humps in the dynamic daytime road environment. The proposed scheme consists mainly of two processes, stereo matching and speed hump detection, and our work focuses on the latter. The speed hump detection process consists of a noise reduction step, a data fusion step, and a speed hump detection step. The proposed system was tested on an Intel Core CPU with 2.80 GHz and 4 GB RAM in urban road environments. The frame rate of the test videos is 30 frames per second, and the size of each frame of the grabbed image sequences is 1280 by 670 pixels. Using object-marked sequences acquired with an on-vehicle camera, we recorded speed hump and non-speed-hump samples. In the tests, the computation time of the proposed method is 13 ms, so it can be applied in real-time systems, and it reaches a detection rate of 96.1%.

Keywords: data fusion, round-type speed humps, speed hump detection, surface filter

Procedia PDF Downloads 495
19529 Optimizing the Passenger Throughput at an Airport Security Checkpoint

Authors: Kun Li, Yuzheng Liu, Xiuqi Fan

Abstract:

High security standards and high screening efficiency seem to contradict each other in the airport security check process, so improving efficiency as far as possible while maintaining the same security standard is significantly meaningful. This paper utilizes knowledge from operations research and stochastic processes to establish mathematical models for exploring this problem. We analyze the current airport security check process and use the M/G/1 and M/G/k models from queuing theory to describe it. We find that the least efficient part, the bottleneck of the queuing system, is the pre-check lane. To improve passenger throughput and reduce the variance of passengers' waiting time, we adjust our models, apply the Monte Carlo method, and put forward three modifications: adjust the ratio of Pre-Check lanes to regular lanes flexibly, determine the optimal number of security screening lines based on cost analysis, and adjust the distribution of arrival and service times based on Monte Carlo simulation results. We also analyze the impact of cultural differences as a sensitivity analysis. Finally, we give recommendations for the current airport security check process.
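
The M/G/1 building block of the analysis is easy to check with a short Monte Carlo simulation against the Pollaczek-Khinchine formula; the arrival rate and service distribution below are illustrative, not the study's calibrated airport data.

```python
import numpy as np

rng = np.random.default_rng(42)
LAM = 0.8                # Poisson arrival rate (passengers per unit time)
N = 200_000

# General service time: lognormal with E[S] = 1.0, so rho = 0.8.
service = rng.lognormal(-0.125, 0.5, N)
arrivals = np.cumsum(rng.exponential(1.0 / LAM, N))

start = np.empty(N)
t_free = 0.0
for k in range(N):                      # single FIFO server (M/G/1)
    start[k] = max(arrivals[k], t_free)
    t_free = start[k] + service[k]
wait_sim = (start - arrivals).mean()

ES, ES2 = service.mean(), (service ** 2).mean()
rho = LAM * ES
wait_pk = LAM * ES2 / (2.0 * (1.0 - rho))   # Pollaczek-Khinchine mean wait
print(f"simulated Wq = {wait_sim:.3f}, P-K Wq = {wait_pk:.3f}")
```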

Keywords: queuing theory, security check, stochastic process, Monte Carlo simulation

Procedia PDF Downloads 185
19528 A Multi-Agent System for Accelerating the Delivery Process of Clinical Diagnostic Laboratory Results Using GSM Technology

Authors: Ayman M. Mansour, Bilal Hawashin, Hesham Alsalem

Abstract:

Faster delivery of laboratory test results is one of the most noticeable signs of good laboratory service and is often used as a key performance indicator of laboratory performance. Despite the availability of technology, the delivery time of clinical laboratory test results continues to be a cause of customer dissatisfaction: patients feel frustrated and become careless about collecting their results. Clinical laboratory test results are highly sensitive, and delivering them at the wrong time can harm patients, especially in severe cases, since the results affect the treatment given by physicians only if they arrive at the correct time. Efforts should therefore be made to ensure faster delivery of lab test results through a new trusted, robust, and fast system. In this paper, we propose a distributed multi-agent system to speed up the delivery of laboratory test results using SMS. The developed system relies on SMS messages because of the wide availability of the GSM network compared with other networks. The software provides knowledge sharing between different units and different laboratory medical centers. The system was built using Java. Several techniques were possible for implementing the proposed system. One is the peer-to-peer (P2P) model, in which all peers are treated equally and the service is distributed among all peers of the network. However, in the pure P2P model it is difficult to maintain the coherence of the network, discover new peers, and ensure security; security is a quite important issue, since each node is allowed to join the network without any control mechanism. We therefore adopt a hybrid P2P model, between the client/server model and the pure P2P model, using GSM technology through SMS messages. This model satisfies our needs. A GUI has been developed to give the laboratory staff a simple and easy way to interact with the system. The system provides a quick response rate, and decisions are made faster than with manual methods. This will save patients' lives.

Keywords: multi-agent system, delivery process, GSM technology, clinical laboratory results

Procedia PDF Downloads 234
19527 A Novel Microcontroller Based Islanding Protection of Distributed Generation Systems

Authors: Saeid Jalilzadeh, Majid Pakdel

Abstract:

Customer demand for better power quality and higher reliability has forced the power industry to use distributed generation (DG) such as wind power and photovoltaic arrays. Islanding is a phenomenon that occurs when part of a power grid becomes electrically isolated from the power system and the distribution system is energized by distributed generators. It is necessary to disconnect all distributed generators immediately after islanding occurs; a DG system should therefore have the capability to detect the islanding phenomenon. In this paper, a novel microcontroller-based relay for anti-islanding protection of a typical DG system is proposed. Simulation results obtained with the Proteus software verify the proper operation and effectiveness of the proposed protective relay.
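
The abstract does not spell out the detection criterion, so the sketch below shows a common passive anti-islanding scheme such a microcontroller relay could implement: an over/under-voltage and over/under-frequency window with a trip delay (all thresholds are illustrative).

```python
# Illustrative passive anti-islanding logic: trip when voltage or
# frequency leaves its window for more than a few consecutive cycles.
V_MIN, V_MAX = 0.88, 1.10        # per-unit voltage window
F_MIN, F_MAX = 49.5, 50.5        # Hz window (50 Hz system)
TRIP_CYCLES = 5                  # debounce against short transients

def islanding_relay(samples):
    """samples: iterable of (v_pu, f_hz) measured once per cycle."""
    bad = 0
    for cycle, (v, f) in enumerate(samples):
        if not (V_MIN <= v <= V_MAX and F_MIN <= f <= F_MAX):
            bad += 1
            if bad >= TRIP_CYCLES:
                return cycle     # open the DG breaker here
        else:
            bad = 0
    return None

# Grid lost at cycle 10: voltage sags and frequency drifts.
trace = [(1.0, 50.0)] * 10 + [(0.80, 48.8)] * 20
print(islanding_relay(trace))    # trips at cycle 14
```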

Keywords: islanding, distributed generation (DG), protective relay, microcontroller, Proteus software

Procedia PDF Downloads 555
19526 A Case Study on Theme-Based Approach in Health Technology Engineering Education: Customer Oriented Software Applications

Authors: Mikael Soini, Kari Björn

Abstract:

Metropolia University of Applied Sciences (MUAS) Information and Communication Technology (ICT) Degree Programme provides full-time Bachelor-level undergraduate studies. The ICT Degree Programme has seven different major options; this paper focuses on Health Technology. In Health Technology, a significant curriculum change in 2014 enabled the transition from a fragmented curriculum of dozens of courses to a new integrated curriculum built around three 30 ECTS themes. This paper focuses especially on the second theme, Customer Oriented Software Applications. From the students' point of view, the goal of this theme is to get familiar with existing health-related ICT solutions and systems, understand the business around health technology, recognize social and healthcare operating principles and services, and identify customers and users and their special needs and perspectives. This also acts as a background for health-related web application development. The web application built is tested, developed, and evaluated with real users utilizing versatile user-centred development methods. This paper presents experiences from the first implementation of the Customer Oriented Software Applications theme. Student feedback was gathered with two questionnaires, one in the middle of the theme and the other at its end; both had qualitative and quantitative parts. A similar questionnaire was implemented in the first theme, so this paper evaluates how the theme-based integrated curriculum has progressed in the Health Technology major by comparing results between themes 1 and 2. In general, students were satisfied with the implementation, the timing and synchronization of the courses, and the amount of work, though there is still room for development. Student feedback and teachers' observations have been, and will be, used to develop the content and operating principles of the themes and the whole curriculum.

Keywords: engineering education, integrated curriculum, learning and teaching methods, learning experience

Procedia PDF Downloads 302
19525 Automated CNC Part Programming and Process Planning for Turned Components

Authors: Radhey Sham Rajoria

Abstract:

Pressure to increase competitiveness in the manufacturing sector and to survive in the market has led to the development of machining centres, which enhance productivity, improve quality, shorten the lead time, and reduce the manufacturing cost. With this innovation, production lines have been replaced by machining centres, which can carry out various machining processes on the same part with multiple tooling and an automatic tool changer (ATC). Process plans can also be easily generated for complex components. Some means are nevertheless required to utilize the machining centre to its best. The present work concentrates on automated part program generation, and in turn automated process plan generation, for turned components on a Denford "MIRAC" 8-station ATC lathe machining centre. A package in C++ on the DOS platform is developed that generates the complete CNC part program, process plan, and process sequence for turned components. The input to this system is a blueprint in graphical format with machining parameters and variables, and the output is the CNC part program, which is stored in a .mir file ready for execution on the machining centre.
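
The heart of such a package is the translation from blueprint geometry and machining parameters to CNC blocks. Below is a toy Python generator for straight-turning roughing passes (the original package is C++ on DOS and writes a .mir file; the G-codes, dimensions, feed, and speed here are generic illustrative choices):

```python
def rough_turn(stock_dia, final_dia, length, depth_of_cut=1.0,
               feed=0.2, clearance=1.0):
    """Emit G-code blocks for straight roughing passes on a lathe.
    Diameters and lengths in mm, feed in mm/rev (diameter programming)."""
    lines = ["G21 (metric)", "G95 (feed per rev)", "M03 S1200 (spindle on)"]
    dia = stock_dia
    while dia > final_dia:
        dia = max(dia - 2.0 * depth_of_cut, final_dia)   # cut on diameter
        lines += [
            f"G00 X{dia:.3f} Z{clearance:.3f}",          # rapid to start
            f"G01 Z{-length:.3f} F{feed:.3f}",           # turning pass
            f"G00 X{dia + 2.0:.3f}",                     # retract clear of part
            f"G00 Z{clearance:.3f}",                     # back to start plane
        ]
    lines.append("M05 (spindle stop)")
    return "\n".join(lines)

print(rough_turn(stock_dia=40.0, final_dia=32.0, length=50.0))
```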

Keywords: CNC, MIRAC, ATC, process planning

Procedia PDF Downloads 250
19524 Application of Lean Six Sigma Tools to Minimize Time and Cost in Furniture Packaging

Authors: Suleiman Obeidat, Nabeel Mandahawi

Abstract:

In this work, the packaging process for a household move is improved. The customers of such a move need their household goods moved from their current house to the new one with minimum damage, in an organized manner, on time, and at minimum cost. Our goal was to improve time efficiency by 10% to 20%, achieve a 90% reduction in damaged parts, and attain an acceptable improvement in the cost of the total move process; the expected ROI was 833%. Many improvement techniques have been applied to the way the boxes are prepared, their preparation cost, the packing of the goods, their labeling, and their staging for the move. The DMAIC technique is used in this work: a SIPOC diagram, a value stream map of the "as-is" process, root cause analysis, maps of the "future state" and "ideal state", and an improvement plan. An ROI of 624% is obtained, which is lower than the expected 833%. The work explains the improvement techniques and the deficiencies of the old process.

Keywords: packaging, lean tools, six sigma, DMAIC methodology, SIPOC

Procedia PDF Downloads 411
19523 An Overview of Technology Availability to Support Remote Decentralized Clinical Trials

Authors: Simone Huber, Bianca Schnalzer, Baptiste Alcalde, Sten Hanke, Lampros Mpaltadoros, Thanos G. Stavropoulos, Spiros Nikolopoulos, Ioannis Kompatsiaris, Lina Pérez- Breva, Vallivana Rodrigo-Casares, Jaime Fons-Martínez, Jeroen de Bruin

Abstract:

Developing new medicines and health solutions and improving patient health currently rely on the successful execution of clinical trials, which generate relevant safety and efficacy data. Recruitment and retention of participants are among the most challenging aspects of protocol adherence and thus of trial success. The main barriers include: i) lack of awareness of clinical trials; ii) long distance from the clinical site; iii) the burden on participants, including the duration and number of clinical visits; and iv) high dropout rates. Most of these aspects could be addressed with a new paradigm, namely remote decentralized clinical trials (RDCTs). Furthermore, the COVID-19 pandemic has highlighted additional advantages and challenges for RDCTs in practice, such as allowing participants to join trials from home without depending on site visits. Nevertheless, RDCTs should follow the processes and quality assurance of conventional clinical trials, which involve several steps. For each part of the trial, the building blocks and the existing software and technologies were assessed through a systematic search. The technology needed to perform RDCTs is widely available and validated but is still segmented and developed in silos, as different software solutions address different parts of the trial and at various levels. The current paper analyzes the availability of technology to perform RDCTs, identifies gaps, and provides an overview of the basic building blocks and functionalities that need to be covered to support the described processes.

Keywords: architectures and frameworks for health informatics systems, clinical trials, information and communications technology, remote decentralized clinical trials, technology availability

Procedia PDF Downloads 193
19522 EMI Radiation Prediction and Final Measurement Process Optimization by Neural Network

Authors: Hussam Elias, Ninovic Perez, Holger Hirsch

Abstract:

EMC regulations worldwide are steadily becoming more comprehensive as the use of electronics in our daily lives increases more than ever. In this paper, we introduce a novel method to perform the final phase of electromagnetic compatibility (EMC) measurement and to reduce the test time required by the EN 55032 standard, using a purpose-built tool and a conventional neural network (CNN). The neural network was trained using real EMC measurements performed in the Semi Anechoic Chamber (SAC) by CETECOM GmbH in Essen, Germany. To implement the proposed method, we wrote software that performs the radiated electromagnetic interference (EMI) measurements and uses the CNN to predict the turntable position that yields the maximum radiation value.
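
The prediction step can be illustrated in miniature: fit a regressor on field strength measured over a coarse turntable sweep, then report the angle the model predicts as the maximum. The data and model below are stand-ins, not CETECOM's measurements:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
angles = np.linspace(0.0, 360.0, 24, endpoint=False)   # coarse sweep, deg
field = 40 + 8 * np.sin(np.radians(angles - 70)) + rng.normal(0, 0.3, 24)

# Encode the angle periodically so 0 deg and 360 deg coincide.
def feats(a):
    return np.column_stack([np.sin(np.radians(a)), np.cos(np.radians(a))])

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                     random_state=0).fit(feats(angles), field)

fine = np.linspace(0.0, 360.0, 3600)                   # dense candidate angles
pred = model.predict(feats(fine))
print(f"predicted worst-case turntable angle: {fine[np.argmax(pred)]:.1f} deg")
```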

Keywords: conventional neural network, electromagnetic compatibility measurement, mean absolute error, position error

Procedia PDF Downloads 180