Search results for: VTSA process modelling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16471

15541 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data

Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer

Abstract:

This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average process of order one (BINARMA(1,1)) with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) process are derived iteratively from some initial stationary moments. As regards the estimation of the parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties. The forecasting equations of the BINARMA(1,1) model are also derived. In a simulation study, BINARMA(1,1) count data are generated using a multivariate Poisson R code for the innovation terms, and the performance of the model is assessed: the mean estimates of the model parameters are all efficient, based on their standard errors. The proposed model is then used to analyse real-life accident data from the motorway in Mauritius, based on some covariates: policemen, daily patrol, speed cameras, traffic lights and roundabouts. The CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius, and the forecasting equations provide reliable one-step-ahead forecasts.
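The generating mechanism described above (binomial thinning plus innovations whose cross-correlation enters only through a shared term) can be sketched in a few lines. This is an illustrative simulation under assumed parameter values with a common-shock construction for the correlated Poisson innovations, not the authors' R code:

```python
import random

def poisson(rng, lam):
    # Knuth's inversion method; adequate for small lam
    limit, k, p = 2.718281828459045 ** -lam, 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def thin(alpha, x, rng):
    # binomial thinning: alpha ∘ x = number of successes in x Bernoulli(alpha) trials
    return sum(1 for _ in range(x) if rng.random() < alpha)

def simulate_binar(n, a1, a2, lam1, lam2, lam0, seed=1):
    """Bivariate INAR(1) sketch with common-shock Poisson innovations:
    the cross-correlation between the two series enters only through z0."""
    rng = random.Random(seed)
    x1 = x2 = 0
    series = []
    for _ in range(n):
        z0 = poisson(rng, lam0)  # shared innovation component
        x1 = thin(a1, x1, rng) + poisson(rng, lam1) + z0
        x2 = thin(a2, x2, rng) + poisson(rng, lam2) + z0
        series.append((x1, x2))
    return series
```

The sample mean of the first series should settle near the stationary value (lam1 + lam0) / (1 - a1), which offers a quick sanity check on the generator.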

Keywords: non-stationary, BINARMA(1, 1) model, Poisson innovations, conditional maximum likelihood, CML

Procedia PDF Downloads 125
15540 A Process FMEA in Aero Fuel Pump Manufacturing and Conduct the Corrective Actions

Authors: Zohre Soleymani, Meisam Amirzadeh

Abstract:

Many products are safety critical, so proactive analysis techniques are vital for them, because these techniques try to identify potential failures before the products are produced. Failure Mode and Effects Analysis (FMEA) is an effective tool for identifying probable problems of a product or process, prioritizing them, and planning for their elimination. The paper shows the implementation of the FMEA process to identify and remove potential troubles in the aero fuel pump manufacturing process and improve the reliability of subsystems. The different possible causes of failure and their effects, along with the recommended actions, are discussed. FMEA uses the Risk Priority Number (RPN) to determine the risk level. The RPN value depends on the Severity (S), Occurrence (O) and Detection (D) parameters, so these parameters need to be determined. After calculating the RPN for the identified potential failure modes, corrective actions are defined to reduce the risk level according to the assessment strategy and the determined acceptable risk level. The FMEA process is then performed again and the revised RPN is calculated. The results are presented in the format of a case study. They show the improvement in the manufacturing process and a considerable reduction in the risk level of aero fuel pump production.
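The RPN prioritization described above reduces to a product of three ratings. A minimal sketch follows; the failure modes and ratings are hypothetical, for illustration only:

```python
def rpn(severity, occurrence, detection):
    # Risk Priority Number: each factor is typically rated on a 1-10 scale
    for v in (severity, occurrence, detection):
        if not 1 <= v <= 10:
            raise ValueError("ratings must be in 1..10")
    return severity * occurrence * detection

# hypothetical failure modes for an aero fuel pump process
failure_modes = [
    ("impeller wear",        {"S": 7, "O": 5, "D": 4}),   # RPN 140
    ("seal leakage",         {"S": 9, "O": 3, "D": 6}),   # RPN 162
    ("bearing misalignment", {"S": 6, "O": 4, "D": 3}),   # RPN 72
]

# prioritize corrective actions by descending RPN
ranked = sorted(failure_modes,
                key=lambda fm: rpn(fm[1]["S"], fm[1]["O"], fm[1]["D"]),
                reverse=True)
```

After corrective actions are applied, re-rating O and D and recomputing the RPN reproduces the "revised RPN" step the abstract describes.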

Keywords: FMEA, risk priority number, aero pump, corrective action

Procedia PDF Downloads 280
15539 Jointly Optimal Statistical Process Control and Maintenance Policy for Deteriorating Processes

Authors: Lucas Paganin, Viliam Makis

Abstract:

With the advent of globalization, the market competition has become a major issue for most companies. One of the main strategies to overcome this situation is the quality improvement of the product at a lower cost to meet customers’ expectations. In order to achieve the desired quality of products, it is important to control the process to meet the specifications, and to implement the optimal maintenance policy for the machines and the production lines. Thus, the overall objective is to reduce process variation and the production and maintenance costs. In this paper, an integrated model involving Statistical Process Control (SPC) and maintenance is developed to achieve this goal. Therefore, the main focus of this paper is to develop the jointly optimal maintenance and statistical process control policy minimizing the total long run expected average cost per unit time. In our model, the production process can go out of control due to either the deterioration of equipment or other assignable causes. The equipment is also subject to failures in any of the operating states due to deterioration and aging. Hence, the process mean is controlled by an Xbar control chart using equidistant sampling epochs. We assume that the machine inspection epochs are the times when the control chart signals an out-of-control condition, considering both true and false alarms. At these times, the production process will be stopped, and an investigation will be conducted not only to determine whether it is a true or false alarm, but also to identify the causes of the true alarm, whether it was caused by the change in the machine setting, by other assignable causes, or by both. If the system is out of control, the proper actions will be taken to bring it back to the in-control state. At these epochs, a maintenance action can be taken, which can be no action, or preventive replacement of the unit. 
When the equipment is in the failure state, a corrective maintenance action is performed, which can be minimal repair or replacement of the machine, and the process is brought back to the in-control state. A semi-Markov decision process (SMDP) framework is used to formulate and solve the joint control problem. A numerical example is developed to demonstrate the effectiveness of the control policy.
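The Xbar chart with equidistant sampling epochs used in the model above signals when a subgroup mean falls outside control limits. A minimal sketch using the standard range-based (A2) limits, with illustrative data:

```python
A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}  # standard Shewhart chart constants

def xbar_limits(phase1_subgroups):
    """Estimate control limits from in-control (phase I) subgroups:
    grand mean +/- A2 * mean range."""
    n = len(phase1_subgroups[0])
    xbars = [sum(s) / n for s in phase1_subgroups]
    rbar = sum(max(s) - min(s) for s in phase1_subgroups) / len(phase1_subgroups)
    grand = sum(xbars) / len(xbars)
    return grand - A2[n] * rbar, grand, grand + A2[n] * rbar

def out_of_control(subgroup, limits):
    # chart signal: the subgroup mean falls outside [LCL, UCL]
    lcl, _, ucl = limits
    xbar = sum(subgroup) / len(subgroup)
    return not lcl <= xbar <= ucl
```

In the paper's setting, a signal would trigger the machine inspection (and possibly a maintenance action); the false-alarm case corresponds to a signal while the process mean is in fact unchanged.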

Keywords: maintenance, semi-Markov decision process, statistical process control, Xbar control chart

Procedia PDF Downloads 85
15538 Optimization Technique for the Contractor’s Portfolio in the Bidding Process

Authors: Taha Anjamrooz, Sareh Rajabi, Salwa Bheiry

Abstract:

Selection among the available projects in the bidding process is one of the essential areas for a contractor to concentrate on. It is important for the contractor to choose the right projects within its portfolio during the tendering stage, based on certain criteria. First, it should align the bidding process with its organization strategies and goals, as a screening process, so as to start with the right portfolio pool. Second, it should set a proper framework and use a suitable technique to optimize its selection process, so that concentration and higher effort during the tender stage are directed toward success and winning. In this research paper, a two-step framework is proposed to increase the efficiency of the contractor's bidding process and the chance of winning new project awards. In this framework, all the projects initially pass through a first-stage screening process, in which the portfolio basket is evaluated and adjusted in accordance with the organization strategies into a reduced version of the portfolio pool that is in line with organization activities. In the second stage, the contractor uses linear programming to optimize the portfolio pool based on available resources such as manpower, light equipment, heavy equipment, financial capability, return on investment, and the success rate of winning the bid. This optimization model will therefore assist the contractor in utilizing its internal resources to the maximum and increase its chance of winning new projects, considering past experience with clients, the relations built between the two parties, and the complexity of executing the projects. The objective of this research is to increase the contractor's winning chance in the bidding process based on the success rate and the expected return on investment.
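The second-stage selection can be illustrated with a small binary version of the optimization. The paper uses linear programming; here a brute-force search over a handful of hypothetical projects and resource caps stands in for the solver, maximizing win-probability-weighted return subject to resource constraints:

```python
from itertools import combinations

# hypothetical candidates: (name, manpower, equipment_hours, capital, expected_return, win_rate)
projects = [
    ("P1", 40, 120, 2.0, 0.6, 0.30),
    ("P2", 25,  60, 1.2, 0.9, 0.45),
    ("P3", 60, 150, 3.5, 1.4, 0.20),
    ("P4", 15,  40, 0.8, 0.5, 0.50),
]
CAPS = {"manpower": 80, "equipment": 200, "capital": 4.0}

def feasible(sel):
    # every internal resource must stay within its cap
    return (sum(p[1] for p in sel) <= CAPS["manpower"]
            and sum(p[2] for p in sel) <= CAPS["equipment"]
            and sum(p[3] for p in sel) <= CAPS["capital"])

def score(sel):
    # expected value = return on investment weighted by the chance of winning the bid
    return sum(p[4] * p[5] for p in sel)

def best_portfolio():
    best, best_score = (), 0.0
    for r in range(1, len(projects) + 1):
        for sel in combinations(projects, r):
            if feasible(sel) and score(sel) > best_score:
                best, best_score = sel, score(sel)
    return [p[0] for p in best], best_score
```

For realistic portfolio sizes, the brute-force loop would be replaced by the paper's linear (or integer) program over the same objective and constraints.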

Keywords: bidding process, internal resources, optimization, contracting portfolio management

Procedia PDF Downloads 140
15537 Sustainable Dyeing of Cotton and Polyester Blend Fabric without Reduction Clearing

Authors: Mohammad Tofayel Ahmed, Seung Kook An

Abstract:

In the contemporary research world, the focus is increasingly set on sustainable products and innovative processes. The global textile industries are putting tremendous effort into achieving a balance between economic development and ecological protection concurrently. The conservation of water sources and the environment has become an immensely significant issue in textile dyeing production. Accordingly, an attempt has been made in this study to develop a process to dye a polyester blend cotton fabric without the reduction clearing process or any extra wash-off chemical, by a simple modification aiming at cost reduction and sustainability. A widely used combination of 60/40 cotton/polyester (c/p) single jersey knitted fabric of 30's, 180 g/m² was considered for the study. Traditionally, c/p blend dyeing proceeds by pretreatment, followed by polyester part dyeing, reduction clearing, and cotton part dyeing. In this study, however, the polyester part is dyed right away, followed by the pretreatment process and cotton part dyeing, skipping the reduction clearing process entirely. The dyed samples of both the traditional and modified processes were scrutinized through various color fastness tests, dyeing parameters, and the consumption of water, steam, power, process time, and total batch cost. The modified process in this study showed no necessity for a reduction clearing process in polyester blend cotton dyeing. The key factor allowing the reduction clearing step after polyester part dyeing to be avoided is the multifunctional effect of NaOH and H₂O₂ during the pretreatment of cotton after polyester part dyeing. The results also revealed that the modified process could reduce the consumption of water, steam, power, time, and cost remarkably. The bulk trial of the modified process demonstrated that it can be exploited to dye polyester blend cotton substrates while ensuring all fastness and dyeing properties, regardless of dye category, blend ratio, color, and shade percentage, thus making the process sustainable, eco-friendly and economical. Furthermore, the proposed method could be applicable to any cellulosic blend with polyester.

Keywords: cotton, dyeing, economical, polyester

Procedia PDF Downloads 184
15536 Impact of Tablet Based Learning on Continuous Assessment (ESPRIT Smart School Framework)

Authors: Mehdi Attia, Sana Ben Fadhel, Lamjed Bettaieb

Abstract:

Mobile technology has become a part of our daily lives and assists learners, whatever their level and age, in their learning process through various mobile devices (laptops, tablets, etc.). This paper presents a new learning framework based on tablets. The solution has been developed and tested in ESPRIT (Ecole Supérieure Privée d'Ingénierie et de Technologies), a Tunisian school of engineering. The application is named ESSF: Esprit Smart School Framework. In this work, the main features of the proposed solution are listed, particularly its impact on the learners' evaluation process. Learner assessment has always been a critical component of the learning process, as it measures students' knowledge. However, traditional evaluation methods, in which the learner is evaluated once or twice a year, cannot reflect their real level. This is why a continuous assessment (CA) process becomes necessary. In this context, we have shown that ESSF offers many important features that enhance and facilitate the implementation of the CA process.

Keywords: continuous assessment, mobile learning, tablet based learning, smart school, ESSF

Procedia PDF Downloads 332
15535 Audit Is a Production Performance Tool

Authors: Lattari Samir

Abstract:

The performance of a production process is the result of proper operation, where management tools appear as the key to success through process management, which consists of managing and implementing a quality policy, organizing and planning the manufacturing, and thus defining an efficient logic for the main areas covered by production management. To carry out this delicate mission, which requires reconciling often contradictory objectives, the auditor is called upon, and must be able to express an opinion on the effectiveness of the operation of the 'production' function. To do this, the auditor must structure the mission in three phases: the preparation phase, to assimilate the particularities of this function; the implementation phase; and the conclusion phase. The audit is a systematic and independent examination of all the stages of a manufacturing process, intended to determine whether the pre-established arrangements for the combination of production factors are respected, whether their implementation is effective, and whether they are relevant to the goals.

Keywords: audit, performance of process, independent examination, management tools, audit of accounts

Procedia PDF Downloads 68
15534 End To End Process to Automate Batch Application

Authors: Nagmani Lnu

Abstract:

Often, quality engineering refers to testing applications that have either a User Interface (UI) or an Application Programming Interface (API), and mature test practices, standards, and automation exist for UI and API testing. However, another kind of application is present in almost all industries that deal with data in bulk, and it is often handled through what is called a batch application. This is primarily an offline application that companies develop to process large data sets, often involving multiple business rules. The challenge becomes more prominent when we try to automate batch testing. This paper describes the approaches taken to test a batch application from the financial industry covering the payment settlement process (a critical use case in all kinds of FinTech companies), resulting in 100% test automation in test creation and test execution. One can follow this approach for other batch use cases to achieve higher efficiency in the testing process.
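A batch test in this style asserts on the transformed output data set rather than on a UI or API response. A minimal, hypothetical settlement-reconciliation check follows; the actual settlement rules and record layout are not given in the abstract, so both are invented here for illustration:

```python
def settle(payments):
    """Toy settlement rule (illustrative only): total each merchant's captured
    payments, skipping records that fail basic validation."""
    totals = {}
    for rec in payments:
        if rec["status"] != "captured" or rec["amount"] <= 0:
            continue
        totals[rec["merchant"]] = totals.get(rec["merchant"], 0.0) + rec["amount"]
    return totals

def reconcile(input_records, settled_totals):
    # batch assertion: recompute expected totals and compare with the batch output
    return settle(input_records) == settled_totals
```

Automating test creation then amounts to generating input record sets covering each business rule, while test execution runs the batch and applies `reconcile` to its output.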

Keywords: batch testing, batch test automation, batch test strategy, payments testing, payments settlement testing

Procedia PDF Downloads 57
15533 A Deterministic Approach for Solving the Hull and White Interest Rate Model with Jump Process

Authors: Hong-Ming Chen

Abstract:

This work considers the resolution of the Hull and White interest rate model with a jump process. A deterministic process is adopted to model the random behaviour of interest rate variation as deterministic perturbations, which depend on the time t. The Brownian motion and jump uncertainties are represented by a piecewise constant function w(t) and a point function θ(t), respectively. It is shown that the interest rate function and the yield function of the Hull and White interest rate model with jump process can be obtained by solving a nonlinear semi-infinite programming problem. A relaxed cutting plane algorithm is then proposed for solving the resulting optimization problem. The method is calibrated to 3-month U.S. Treasury securities data and is used to analyse several effects on interest rate prices, including interest rate variability and the negative correlation between stock returns and interest rates. The numerical results illustrate that the approach generates yield functions with minimal fitting errors and small oscillation.
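For intuition about the dynamics being fitted, the jump-augmented short rate dr = (θ(t) − a·r) dt + σ dW + J dN can be discretized by Euler–Maruyama. This stochastic sketch is only a companion to the paper's deterministic formulation, with illustrative parameters and a Gaussian jump-size assumption:

```python
import math
import random

def simulate_hull_white_jump(r0, a, sigma, theta, T, steps, jump_rate, jump_mean, seed=0):
    """Euler discretization of dr = (theta(t) - a*r) dt + sigma dW + J dN.
    Jumps arrive at Poisson rate jump_rate; sizes J ~ N(jump_mean, |jump_mean|)
    (an arbitrary illustrative choice)."""
    rng = random.Random(seed)
    dt = T / steps
    r, path = r0, [r0]
    for i in range(steps):
        t = i * dt
        dw = rng.gauss(0.0, math.sqrt(dt))          # Brownian increment
        jump = (rng.gauss(jump_mean, abs(jump_mean))
                if rng.random() < jump_rate * dt else 0.0)
        r += (theta(t) - a * r) * dt + sigma * dw + jump
        path.append(r)
    return path
```

In the paper these two sources of randomness are replaced by the deterministic perturbations w(t) and θ(t), turning calibration into the semi-infinite program described above.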

Keywords: optimization, interest rate model, jump process, deterministic

Procedia PDF Downloads 159
15532 How Envisioning Process Is Constructed: An Exploratory Research Comparing Three International Public Televisions

Authors: Alexandre Bedard, Johane Brunet, Wendellyn Reid

Abstract:

Public television is constantly trying to maintain and develop its audience, and to achieve those goals it needs a strong and clear vision. Envisioning is a multidimensional process: it is simultaneously a conduit that orients and fixes the future, an idea that precedes strategy, and a means by which action is accomplished, from a business perspective. Yet vision is often studied in a prescriptive and instrumental manner. Based on our understanding of the literature, we were able to explain how envisioning, as a process, is a creative one; it takes place in the mind and uses wisdom and intelligence through a process of evaluation, analysis and creation. Through an aggregation of the literature, we built a model of the envisioning process, grounded in past experiences, perceptions and knowledge and influenced by the context: the individual, the organization and the environment. In an exploratory study in which vision was deciphered through discourse, using a qualitative and abductive approach and a grounded theory perspective, we explored three extreme cases, with eighteen interviews with experts, leaders, politicians, and actors of the industry, and more than twenty hours of interviews in three different countries. We compared the strategies, the business models, and the political and legal forces, and examined the history of each industry from an inertial point of view. Our analysis of the data revealed that a legitimacy effect due to the audience, together with the innovation and creativity of the institutions, was the cornerstone of what influences the envisioning process. This allowed us to identify how different the process is for the Canadian, French and UK public broadcasters, although we concluded that all three had a socially constructed vision for their future, based on stakeholder management and an emerging role for their managers: ideas brokers.

Keywords: envisioning process, international comparison, television, vision

Procedia PDF Downloads 129
15531 Biomimetic Paradigms in Architectural Conceptualization: Science, Technology, Engineering, Arts and Mathematics in Higher Education

Authors: Maryam Kalkatechi

Abstract:

The application of algorithms in architecture has been realized in geometric forms that are increasingly being used by architecture firms. The full abstraction of design ideas into a formulated algorithm, however, is not possible; there is still a gap between design innovation and the final build produced by prescribed formulas, even in the most aesthetic realizations. This paper presents the application of an erudite design process to conceptualize biomimetic paradigms in architecture, with the process customized to material and tectonics. The first part of the paper outlines the design process elements within four biomimetic pre-concepts. The pre-concepts are chosen from the plant family: the pine leaf, the dandelion flower, the cactus flower and the sunflower. The choice of these is related to material qualities and the natural pattern of the tectonics of these plants. The paper then focuses on four versions of the tectonic comprehension of one of the biomimetic pre-concepts. The next part discusses the implementation of STEAM in higher education in architecture, shown through the relations within the design process and the manifestation of the thinking processes. The A in STEAM is, in this case, only achieved by the design process, an engaging event akin to the performing arts, in which the conceptualization and development are realized in the final build.

Keywords: biomimetic paradigm, erudite design process, tectonic, STEAM (Science, Technology, Engineering, Arts, Mathematics)

Procedia PDF Downloads 206
15530 Bleeding-Heart Altruists and Calculating Utilitarians: Applying Process Dissociation to Self-sacrificial Dilemmas

Authors: David Simpson, Kyle Nash

Abstract:

There is considerable evidence linking slow, deliberative reasoning (system 2) with utilitarian judgments in dilemmas that involve sacrificing another person for the greater good (other-sacrificial dilemmas). Joshua Greene has argued, based on this kind of evidence, that system 2 drives utilitarian judgments. However, the evidence on whether system 2 is associated with utilitarian judgments in self-sacrificial dilemmas is more mixed. We employed process dissociation to measure a self-sacrificial utilitarian (SU) parameter and an other-sacrificial utilitarian (OU) parameter. It was initially predicted that, contra Greene, the cognitive reflection test (CRT) would be positively correlated only with the OU parameter and not the SU parameter. However, Greene's hypothesis was corroborated: the CRT positively correlated with both the OU and SU parameters. By contrast, the CRT did not correlate with the other two moral parameters we extracted (altruism and deontology).
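Process dissociation derives its parameters from response proportions in congruent dilemmas (where utilitarian and deontological responses agree) and incongruent dilemmas (where they conflict). A sketch of the classic two-parameter equations follows; the study's actual four-parameter model (SU, OU, altruism, deontology) would extend this scheme, and the labelling here is illustrative:

```python
def pd_parameters(p_unacc_congruent, p_unacc_incongruent):
    """Two-parameter process dissociation as used in moral dilemma research:
      P(unacceptable | congruent)   = U + (1 - U) * D
      P(unacceptable | incongruent) =     (1 - U) * D
    Solving the pair gives the utilitarian (U) and deontological (D) parameters."""
    u = p_unacc_congruent - p_unacc_incongruent
    d = p_unacc_incongruent / (1.0 - u) if u < 1.0 else float("nan")
    return u, d
```

Generating proportions from known parameters and recovering them is a quick consistency check on the algebra.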

Keywords: dual-process model, utilitarianism, altruism, reason, emotion, process dissociation

Procedia PDF Downloads 149
15529 Polymer Mixing in the Cavity Transfer Mixer

Authors: Giovanna Grosso, Martien A. Hulsen, Arash Sarhangi Fard, Andrew Overend, Patrick. D. Anderson

Abstract:

In many industrial applications and, in particular, in the polymer industry, the quality of mixing between different materials is fundamental to guarantee the desired properties of finished products. However, properly modelling and understanding polymer mixing often presents considerable difficulties, because of the variety and complexity of the physical phenomena involved. This is the case for the Cavity Transfer Mixer (CTM), for which a clear understanding of the mixing mechanisms is still missing, as are clear guidelines for system optimization. This device, invented and patented by Gale at Rapra Technology Limited, is an add-on to be mounted downstream of existing extruders in order to improve distributive mixing. It consists of two concentric cylinders, the rotor and stator, both provided with staggered rows of hemispherical cavities. The inner cylinder (rotor) rotates, while the outer (stator) remains still. At the same time, the pressure load imposed upstream pushes the fluid through the CTM. Mixing processes are driven by the flow field generated by the complex interaction between the moving geometry, the imposed pressure load and the rheology of the fluid. In such a context, the present work proposes a complete and accurate three-dimensional model of the CTM and the results of a broad range of simulations assessing the impact on mixing of several geometrical and operating parameters. Among them are: the number of cavities per row, the number of rows, the size of the mixer, the rheology of the fluid and the ratio between the rotation speed and the fluid throughput. The model is composed of a flow part and a mixing part: a finite element solver computes the transient velocity field, which is used in the mapping method implementation in order to simulate the evolution of the concentration field. The results of the simulations are summarized in guidelines for the optimization of the device.

Keywords: mixing, non-Newtonian fluids, polymers, rheology

Procedia PDF Downloads 375
15528 Using Hierarchical Modelling to Understand the Role of Plantations in the Abundance of Koalas, Phascolarctos cinereus

Authors: Kita R. Ashman, Anthony R. Rendall, Matthew R. E. Symonds, Desley A. Whisson

Abstract:

Forest cover is decreasing globally, chiefly due to the conversion of forest to agricultural landscapes. In contrast, the area under plantation forestry is increasing significantly. For wildlife occupying landscapes where native forest is the dominant land cover, plantations generally represent a lower-value habitat; however, plantations established on land formerly used for pasture may benefit wildlife by providing temporary forest habitat and increasing connectivity. This study investigates the influence of landscape, site, and climatic factors on koala population density in far south-west Victoria, where there has been extensive plantation establishment. We conducted koala surveys and habitat characteristic assessments at 72 sites across three habitat types: plantations, native vegetation blocks, and native vegetation strips. We employed a hierarchical modelling framework for estimating abundance and constructed candidate multinomial N-mixture models to identify factors influencing the abundance of koalas. We detected a higher mean koala density in plantation sites (0.85 per ha) than in either native block (0.68 per ha) or native strip sites (0.66 per ha). We found five covariates of koala density; using these variables, we spatially modelled koala abundance and discuss factors that are key in determining the large-scale distribution and density of koala populations. We provide a distribution map that can be used to identify high-priority areas for population management as well as habitat of high conservation significance for koalas. This information facilitates the linkage of ecological theory with the on-ground implementation of management actions and may guide conservation planning and resource management actions to consider the overall landscape configuration as well as the spatial arrangement of plantations adjacent to remnant forest.
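The N-mixture framework referenced above builds on Royle's binomial N-mixture likelihood: repeated counts at a site are binomial draws from a latent abundance N, itself Poisson with mean lambda. A minimal single-site version, computed in log space to avoid overflow and with illustrative parameters:

```python
import math

def site_likelihood(counts, lam, p, n_max=200):
    """Binomial N-mixture likelihood for one site with repeated counts:
    sum over latent abundance N of Pois(N | lam) * prod_j Binom(y_j | N, p)."""
    total = 0.0
    for n in range(max(counts), n_max + 1):
        # log Poisson pmf for the latent abundance
        log_pois = -lam + n * math.log(lam) - math.lgamma(n + 1)
        # log binomial pmf for each repeated count given N = n
        log_binom = sum(
            math.lgamma(n + 1) - math.lgamma(y + 1) - math.lgamma(n - y + 1)
            + y * math.log(p) + (n - y) * math.log(1 - p)
            for y in counts)
        total += math.exp(log_pois + log_binom)
    return total
```

Maximizing the product of such terms over sites (with lambda and p linked to covariates) is the estimation step; the multinomial variant used in the study generalizes the binomial observation model.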

Keywords: abundance modelling, arboreal mammals, plantations, wildlife conservation

Procedia PDF Downloads 114
15527 Spatial Point Process Analysis of Dengue Fever in Tainan, Taiwan

Authors: Ya-Mei Chang

Abstract:

This research applies spatio-temporal point process methods to dengue fever data from Tainan. The spatio-temporal intensity function of the dataset is assumed to be separable. Kernel estimation, a widely used approach to estimating intensity functions, is employed; the intensity function is very helpful in studying the relation between the spatio-temporal point process and some covariates. As the covariate effects might be nonlinear, a nonparametric smoothing estimator is used to detect the nonlinearity of the covariate effects, while a fitted parametric model describes the influence of the covariates on dengue fever. The correlation between the data points is detected by the K-function. The results of this research could provide useful information to help the government and stakeholders in making decisions.
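The kernel intensity estimate mentioned above can be sketched in isotropic Gaussian form; this version omits edge correction, and the bandwidth is illustrative:

```python
import math

def kernel_intensity(point, events, h):
    """Gaussian kernel estimate of a spatial intensity function:
    lambda_hat(s) = sum_i (2*pi*h^2)^-1 * exp(-||s - x_i||^2 / (2*h^2)).
    No edge correction, for illustration only."""
    sx, sy = point
    norm = 1.0 / (2.0 * math.pi * h * h)
    return sum(norm * math.exp(-((sx - x) ** 2 + (sy - y) ** 2) / (2 * h * h))
               for x, y in events)
```

Evaluating the estimate on a grid of locations gives the intensity surface that is then related to the covariates; under separability, a temporal kernel estimate would be computed analogously and multiplied in.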

Keywords: dengue fever, spatial point process, kernel estimation, covariate effect

Procedia PDF Downloads 345
15526 Impact of Interface Soil Layer on Groundwater Aquifer Behaviour

Authors: Hayder H. Kareem, Shunqi Pan

Abstract:

The geological environment from which groundwater is collected represents the most important element affecting the behaviour of a groundwater aquifer. As groundwater is a vital resource worldwide, the parameters that affect this source must be known accurately so that the conceptualized mathematical models are acceptable over the broadest range of conditions. Groundwater models have therefore recently become an effective and efficient tool for investigating groundwater aquifer behaviour. A groundwater aquifer may contain aquitards, aquicludes, or interfaces within its geological formations. Aquitards and aquicludes have geological formations that force modellers to include them within conceptualized groundwater models, while interfaces are commonly neglected from the conceptualization process because modellers believe that an interface has no effect on aquifer behaviour. The current research highlights the impact of an interface existing in a real unconfined groundwater aquifer called Dibdibba, located in Al-Najaf City, Iraq, where the Euphrates River passes through the eastern part of the city. The Dibdibba groundwater aquifer consists of two types of soil layers separated by an interface soil layer. A groundwater model is built for Al-Najaf City to explore the impact of this interface. The calibration process is carried out using the PEST ('Parameter ESTimation') approach, and the best Dibdibba groundwater model is obtained. When the soil interface is conceptualized, the results show that the groundwater tables are significantly affected by the interface, with dry areas of 56.24 km² and 6.16 km² appearing in the upper and lower layers of the aquifer, respectively, and the Euphrates River leaking 7359 m³/day into the groundwater aquifer. When the soil interface is neglected, these results change: the dry area becomes 0.16 km² and the Euphrates River leakage becomes 6334 m³/day.
In addition, the conceptualized models (with and without the interface) reveal different responses to changes in the recharge rates applied to the aquifer in the uncertainty analysis test. The Dibdibba aquifer in Al-Najaf City shows a slight deficit in the amount of water supplied under the current pumping scheme, and the Euphrates River suffers from the stresses applied to the aquifer. Ultimately, this study shows a crucial need to represent the interface soil layer in model conceptualization so that the simulated and predicted behaviours are more reliable.

Keywords: Al-Najaf City, groundwater aquifer behaviour, groundwater modelling, interface soil layer, Visual MODFLOW

Procedia PDF Downloads 179
15525 Lean Manufacturing Implementation in Fused Plastic Bags Industry

Authors: Tareq Issa

Abstract:

Lean manufacturing is concerned with the implementation of several tools and methodologies that aim for the continuous elimination of waste throughout the manufacturing process flow in the production system. This research addresses the implementation of lean principles and tools in a small-medium industry, focusing on a 'fused' plastic bags production company in Amman, Jordan. In this production operation, the major types of waste to eliminate include material, waiting-transportation, and setup wastes. The primary goal is to identify and implement selected lean strategies to eliminate waste in the manufacturing process flow. A systematic approach was used for the implementation of lean principles and techniques, through the application of Value Stream Mapping (VSM) analysis. The current-state value stream map was constructed to improve the plastic bags manufacturing process by identifying opportunities to eliminate waste and its sources. The future-state value stream map was then developed, describing the improvements in the overall manufacturing process resulting from eliminating wastes. The implementation of VSM, 5S, Kanban, Kaizen, and reduced lot size methods has provided significant benefits and results: productivity increased to 95.4%, delivery schedule attainment reached 99-100%, total inventory was reduced to 1.4 days, and the setup time for the melting process was reduced to about 30 minutes.
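The value-stream metrics behind such improvements reduce to simple arithmetic over the mapped steps. A sketch with hypothetical cycle and waiting times in minutes (the step names and figures are illustrative, not the company's data):

```python
# hypothetical value-stream data: (step, cycle_time_min, wait_before_min)
steps = [
    ("extrusion", 12, 240),
    ("printing",   8, 120),
    ("fusing",    15, 360),
    ("packing",    5,  60),
]

def lead_time(steps):
    # total lead time = value-added (cycle) time + non-value-added (waiting) time
    return sum(c + w for _, c, w in steps)

def cycle_efficiency(steps):
    # process cycle efficiency = value-added time / total lead time
    value_added = sum(c for _, c, _ in steps)
    return value_added / lead_time(steps)
```

Comparing these figures between the current-state and future-state maps quantifies the waste removed, which is exactly what the inventory and setup-time reductions in the abstract express.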

Keywords: lean implementation, plastic bags industry, value stream map, process flow

Procedia PDF Downloads 171
15524 The Using of Smart Power Concepts in Military Targeting Process

Authors: Serdal AKYUZ

Abstract:

Smart power is the use of soft and hard power together in consideration of existing circumstances. Soft power can be defined as the capability of changing the perception of any target mass by employing policies based on legality. Hard power generally uses military and economic instruments, which are the concrete indicators of the general comprehension of power. More than providing a balance between soft and hard power, smart power creates a proactive combination by assessing existing resources. The military targeting process (MTP), as stated in smart power methodology, benefits from a wide scope of lethal and non-lethal weapons to reach the intended end state. Smart power components can be used in the military targeting process in a way similar to the use of lethal or non-lethal weapons. This paper investigates the current use of the smart power concept and the MTP, and presents a new approach to the MTP from a smart power point of view.

Keywords: future security environment, hard power, military targeting process, soft power, smart power

Procedia PDF Downloads 469
15523 Enhancement of MIMO H₂S Gas Sweetening Separator Tower Using Fuzzy Logic Controller Array

Authors: Muhammad M. A. S. Mahmoud

Abstract:

The natural gas sweetening process is a controlled process that must be run at maximum efficiency and with the highest quality. In this work, owing to the complexity and non-linearity of the process, the H₂S gas separation and the intelligent fuzzy controller used to enhance it are simulated in MATLAB/Simulink. A new fuzzy control design for the gas separator is discussed in this paper. The design is based on the use of linear state estimation to generate the internal knowledge base that stores input-output pairs. The obtained input/output pairs are then used to design a feedback fuzzy controller. The proposed closed-loop fuzzy control system maintains the asymptotic stability of the system while improving its time response, achieving better control of the concentration of the output gas from the tower. Simulation studies are carried out to illustrate the performance of the gas separator system.
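To make the fuzzy-controller idea concrete, here is a minimal Mamdani-style sketch mapping a concentration error to a valve adjustment. The membership functions, the rule base, and the crisp consequents are illustrative assumptions, not the paper's actual design (which derives its knowledge base from linear state estimation):

```python
# Minimal fuzzy controller sketch for a separator-tower-like loop.
# Membership functions and rules are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_valve_adjust(error):
    """Map an H2S-concentration error (ppm) to a valve adjustment (%)."""
    # Fuzzify the input into three linguistic terms.
    neg  = tri(error, -10, -5, 0)
    zero = tri(error, -5, 0, 5)
    pos  = tri(error, 0, 5, 10)
    # Rule base: negative error -> close the valve, positive -> open it.
    # Firing strengths weight crisp consequents (a Sugeno-style
    # simplification of centroid defuzzification).
    strengths = [neg, zero, pos]
    consequents = [-20.0, 0.0, 20.0]
    total = sum(strengths)
    if total == 0.0:
        return 0.0
    return sum(s * c for s, c in zip(strengths, consequents)) / total

print(fuzzy_valve_adjust(2.5))  # → 10.0 (half "zero", half "positive")
```

In the paper's design the rule base is populated from recorded input/output pairs rather than hand-written as here.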

Keywords: gas separator, gas sweetening, intelligent controller, fuzzy control

Procedia PDF Downloads 466
15522 A Tool for Assessing Performance and Structural Quality of Business Process

Authors: Mariem Kchaou, Wiem Khlif, Faiez Gargouri

Abstract:

Modeling business processes is an essential task when evaluating, improving, or documenting existing business processes. To be effective in such tasks, a business process model (BPM) must have high structural quality and high performance. Evaluating the performance of a business process model is a necessary step in reducing time and cost, while assessing the structural quality aims to improve the understandability and modifiability of the BPMN model. To achieve these objectives, a set of structural and performance measures has been proposed. Given the diversity of these measures, we propose a framework that integrates both structural and performance aspects in order to classify them. Our measure classification is based on business process model perspectives (e.g., informational, functional, organizational, behavioral, and temporal) and on the elements (activity, event, actor, etc.) involved in computing the measures. We then implement this framework in a tool for assessing the structural quality and performance of a business process. The tool helps designers select an appropriate subset of measures associated with the corresponding perspective, and to calculate and interpret their values in order to improve the structural quality and performance of the model.
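A hedged sketch of the classification idea: each measure is tagged with the perspective it belongs to, so a tool can surface only the subset relevant to the perspective a designer selects. The toy model and the specific measures are illustrative assumptions, not the paper's actual measure catalogue:

```python
# Sketch: structural measures of a business process model grouped by
# perspective. The toy model and measure choices are assumptions.

model = {
    "activities": ["receive order", "check stock", "ship goods"],
    "events": ["order received", "order shipped"],
    "gateways": ["stock available?"],
    "actors": ["sales", "warehouse"],
}

# Each measure is keyed "perspective/name" so the relevant subset can
# be selected per perspective.
measures = {
    "functional/num_activities": len(model["activities"]),
    "behavioral/num_gateways": len(model["gateways"]),
    "organizational/num_actors": len(model["actors"]),
    "informational/num_events": len(model["events"]),
}

def measures_for(perspective):
    """Select the subset of measures for one perspective."""
    return {k: v for k, v in measures.items()
            if k.startswith(perspective + "/")}

print(measures_for("organizational"))  # → {'organizational/num_actors': 2}
```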

Keywords: performance, structural quality, perspectives, tool, classification framework, measures

Procedia PDF Downloads 152
15521 The Use of Artificial Intelligence for Harmonization in the Lawmaking Process

Authors: Supriyadi, Andi Intan Purnamasari, Aminuddin Kasim, Sulbadana, Mohammad Reza

Abstract:

The development of the Industrial Revolution 4.0 era has had a significant influence on the administration of countries in all parts of the world, including Indonesia, and not only in the administrative and economic sectors: the ways and methods of forming laws should also be adjusted. Until now, the law-making process carried out by the Parliament together with the Government has used the classical method. The process still relies on manual work, such as typing out the harmonization of regulations, so errors such as typographical mistakes and mis-copied articles are not uncommon; these tasks require a high level of accuracy, yet the inventory and harmonization of regulations are carried out manually by humans. This method often creates problems due to errors and inaccuracies on the part of the officers who harmonize laws after discussion and approval, which has a very serious impact on the system of law formation in Indonesia. The use of artificial intelligence in the law-making process therefore seems justified and may be the answer to minimizing the disharmony of various laws and regulations. This research is normative research using the legislative approach and the conceptual approach, and it focuses on the question of how to use artificial intelligence for harmonization in the lawmaking process.

Keywords: artificial intelligence, harmonization, laws, intelligence

Procedia PDF Downloads 152
15520 Inadequate Requirements Engineering Process: A Key Factor for Poor Software Development in Developing Nations: A Case Study

Authors: K. Adu Michael, K. Alese Boniface

Abstract:

Developing reliable and sustainable software products is today a big challenge among up-and-coming software developers in Nigeria. What is missing is the ability to develop the comprehensive problem statement needed to execute a proper requirements engineering process. The need to describe the 'what' of a system in one document, written in a natural language, is a major step in the overall process of software engineering. Requirements engineering is the process used to discover, analyze, and validate system requirements, and it is needed to reduce software errors at the early stages of software development. The importance of each step in requirements engineering is explained in the context of using a detailed problem statement from the client/customer to obtain an overview of an existing system along with the expectations for the new system. This paper identifies an inadequate requirements engineering process as the major cause of poor software development in developing nations, using a case study of final-year computer science students at a tertiary-education institution in Nigeria.

Keywords: client/customer, problem statement, requirements engineering, software developers

Procedia PDF Downloads 400
15519 Bioclimatic Niches of Endangered Garcinia indica Species on the Western Ghats: Predicting Habitat Suitability under Current and Future Climate

Authors: Malay K. Pramanik

Abstract:

In recent years, climate change has become a major threat, and its effects have been widely documented in the geographic distribution of many plant species. However, the impacts of climate change on the distribution of ecologically vulnerable medicinal species remain largely unknown. The identification of suitable habitat for a species under climate change scenarios is a significant step towards mitigating biodiversity decline. This study therefore aims to predict the impact of current and future climatic scenarios on the distribution of the threatened Garcinia indica across the northern Western Ghats using Maximum Entropy (MaxEnt) modelling. Future projections were made for the years 2050 and 2070 under all Representative Concentration Pathway (RCP) scenarios (2.6, 4.5, 6.0, and 8.5), using 56 species occurrence records and 19 bioclimatic predictors from the BCC-CSM1.1 model of the Intergovernmental Panel on Climate Change's (IPCC) 5th assessment. The bioclimatic variables were reduced to a smaller set after a multicollinearity test, and their contributions were assessed using a jackknife test. The AUC value of 0.956 ± 0.023 indicates that the model performs with excellent accuracy. The study identified temperature seasonality (39.5 ± 3.1%), isothermality (19.2 ± 1.6%), and annual precipitation (12.7 ± 1.7%) as the major influencing variables in the current and future distribution. The model predicted 10.5% (19,318.7 sq. km) of the study area as moderately to very highly suitable, while 82.6% (151,904 sq. km) was identified as unsuitable or very marginally suitable. Our predictions of climate change impact on habitat suitability suggest a drastic reduction in suitability of 5.29% and 5.69% under RCP 8.5 for 2050 and 2070, respectively.
Finally, the results signify that the model can be an effective tool for biodiversity protection, ecosystem management, and species rehabilitation planning under future climate change scenarios.
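The multicollinearity screening step mentioned above can be sketched as a greedy pairwise-correlation filter over the bioclimatic predictors. The synthetic data, the variable names, and the 0.8 cutoff are illustrative assumptions, not the study's actual data or threshold:

```python
import numpy as np

# Sketch: drop bioclimatic predictors whose pairwise |r| with an
# already-kept predictor exceeds a threshold, before fitting MaxEnt.
# Data and the 0.8 cutoff are illustrative assumptions.

rng = np.random.default_rng(0)
n = 200
bio1 = rng.normal(size=n)                            # e.g. annual mean temperature
bio4 = bio1 * 0.95 + rng.normal(scale=0.1, size=n)   # deliberately collinear
bio12 = rng.normal(size=n)                           # e.g. annual precipitation
X = np.column_stack([bio1, bio4, bio12])
names = ["bio1", "bio4", "bio12"]

def drop_collinear(X, names, threshold=0.8):
    """Greedily keep predictors whose |correlation| with every
    already-kept predictor stays below the threshold."""
    kept = []
    for j in range(X.shape[1]):
        if all(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) < threshold
               for k in kept):
            kept.append(j)
    return [names[k] for k in kept]

print(drop_collinear(X, names))  # bio4 is dropped as collinear with bio1
```

The surviving predictors would then be passed to MaxEnt, with the jackknife test ranking their contributions.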

Keywords: Garcinia indica, maximum entropy modelling (MaxEnt), climate change, Western Ghats, medicinal plants

Procedia PDF Downloads 153
15518 Probing Multiple Relaxation Process in Zr-Cu Base Alloy Using Mechanical Spectroscopy

Authors: A. P. Srivastava, D. Srivastava, D. J. Browne

Abstract:

The relaxation dynamics of Zr44Cu40Al8Ag8 bulk metallic glass (BMG) has been probed using a dynamic mechanical analyzer (DMA). The BMG sample was cast in the form of a plate of dimensions 55 mm × 40 mm × 3 mm using the tilt casting technique. X-ray diffraction and transmission electron microscopy were used for the microstructural characterization of the as-cast BMG. For the mechanical spectroscopy study, samples in the form of bars of size 55 mm × 2 mm × 3 mm were machined from the BMG plate. The mechanical spectroscopy was performed on the DMA by the 50 mm 3-point bending method in a nitrogen atmosphere. It was observed that two glass transition processes compete in the supercooled liquid region, around temperatures of 390°C and 430°C. The supercooled liquid state was fully characterized using the DMA and a differential scanning calorimeter (DSC). In addition to the main α-relaxation process, a β-relaxation process around 360°C, below the glass transition temperature, was also observed. The β-relaxation process could be described by the Arrhenius law with an activation energy of 160 kJ/mol. The volume of the flow unit associated with this relaxation process has been estimated. The results from the DMA study have been used to characterize the shear transformation zone in terms of activation volume and size. The high fragility parameter value of 34 and the higher activation volume indicate that this alloy could show good plasticity in the supercooled liquid region. The possible mechanisms for the relaxation processes are discussed.
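The Arrhenius analysis behind the 160 kJ/mol figure can be sketched as a linear fit of ln f against 1/T over the β-relaxation peak temperatures measured at several DMA frequencies. The (frequency, temperature) pairs below are synthetic, generated with Ea = 160 kJ/mol to mirror the reported value; they are not the measured DMA data:

```python
import math

# Sketch: extracting an activation energy from β-relaxation peak
# temperatures via the Arrhenius law f = f0 * exp(-Ea / (R * T)).
# The peak data are synthetic (generated with Ea = 160 kJ/mol).

R = 8.314      # gas constant, J/(mol K)
Ea_true = 160e3
f0 = 1e15      # assumed attempt frequency, Hz

# Synthetic (frequency, peak temperature) pairs.
freqs = [0.5, 1.0, 2.0, 5.0, 10.0]
temps = [Ea_true / (R * (math.log(f0) - math.log(f))) for f in freqs]

# Linear fit of ln f versus 1/T: the slope is -Ea/R.
xs = [1.0 / T for T in temps]
ys = [math.log(f) for f in freqs]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
Ea_fit = -slope * R
print(f"Ea ≈ {Ea_fit / 1e3:.0f} kJ/mol")  # → Ea ≈ 160 kJ/mol
```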

Keywords: DMA, glass transition, metallic glass, thermoplastic forming

Procedia PDF Downloads 292
15517 Tool Condition Monitoring of Ceramic Inserted Tools in High Speed Machining through Image Processing

Authors: Javier A. Dominguez Caballero, Graeme A. Manson, Matthew B. Marshall

Abstract:

Cutting tools with ceramic inserts are often used in the machining of many types of superalloy, mainly due to their high strength and thermal resistance. Nevertheless, during the cutting process, the plastic flow wear generated in these inserts enhances and propagates cracks due to high temperature and high mechanical stress, leading to highly variable failure of the cutting tool. This article explores the relationship between the continuous wear that ceramic SiAlON inserts (solid solutions based on the Si₃N₄ structure) experience during a high-speed machining process and the evolution of the sparks created during the same process. The sparks were analysed through pictures of the cutting process recorded with an SLR camera. Features relating to the intensity and area of the cutting sparks were extracted from the individual pictures using image processing techniques, and these features were then related to the ceramic insert's crater wear area.
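A hedged sketch of the spark-feature step: threshold a grayscale frame and extract the total spark area (bright-pixel count) and mean spark intensity. The synthetic frame and the threshold value are assumptions for illustration; the study worked on real SLR photographs of the cutting zone:

```python
import numpy as np

# Sketch: threshold a grayscale frame and compute spark features.
# The synthetic frame and the 200/255 threshold are assumptions.

frame = np.zeros((100, 100), dtype=np.uint8)
frame[40:45, 50:70] = 230  # a synthetic bright "spark" streak

threshold = 200
mask = frame >= threshold

spark_area = int(mask.sum())  # bright-pixel count
spark_intensity = float(frame[mask].mean()) if mask.any() else 0.0

print(spark_area, spark_intensity)  # → 100 230.0
```

Tracking these two features frame by frame is the kind of signal that can then be correlated with measured crater wear area.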

Keywords: ceramic cutting tools, high speed machining, image processing, tool condition monitoring, tool wear

Procedia PDF Downloads 291
15516 Rounded-off Measurements and Their Implication on Control Charts

Authors: Ran Etgar

Abstract:

The process of rounding off measurements of continuous variables is commonly encountered. Although it usually has minor effects, it can sometimes lead to poor outcomes in statistical process control using the X̄-chart. The traditional control limits can lead to incorrect conclusions if applied carelessly. This study looks into the limitations of the classical control limits, particularly the impact of asymmetry. An approach to determining the distribution function of the measured parameter (Ȳ) is presented, resulting in a more precise method of establishing the upper and lower control limits. The proposed method, while slightly more complex than Shewhart's original idea, is still user-friendly and accurate, and only requires the use of two straightforward tables.
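To see the problem the paper addresses, the sketch below computes classical Shewhart X̄-chart limits and shows how rounding to a coarse resolution shifts the subgroup mean the chart actually observes. All numbers are illustrative assumptions, not the paper's examples:

```python
import math

# Sketch: classical X̄-chart limits versus rounded-off subgroup means.
# All numbers are illustrative.

mu = 10.0           # process mean
sigma = 0.5         # process standard deviation
n = 5               # subgroup size
resolution = 1.0    # measurements rounded to the nearest integer

# Classical (Shewhart) three-sigma limits for the subgroup mean.
ucl = mu + 3 * sigma / math.sqrt(n)
lcl = mu - 3 * sigma / math.sqrt(n)

# A subgroup of true values and its rounded counterpart.
subgroup = [9.6, 10.4, 10.4, 9.6, 10.4]
rounded = [round(x / resolution) * resolution for x in subgroup]

xbar_true = sum(subgroup) / n
xbar_rounded = sum(rounded) / n
print(ucl, lcl, xbar_true, xbar_rounded)
```

The rounded subgroup mean can only take a coarse grid of values, so its distribution is discrete and, in general, not symmetric about μ; this is the asymmetry that makes the classical limits misleading and that the paper's corrected limits account for.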

Keywords: inaccurate measurement, SPC, statistical process control, rounded-off, control chart

Procedia PDF Downloads 31
15515 A Distributed Cryptographically Generated Address Computing Algorithm for Secure Neighbor Discovery Protocol in IPv6

Authors: M. Moslehpour, S. Khorsandi

Abstract:

Due to the shortage of IPv4 addresses, the transition to IPv6 has gained significant momentum in recent years. Like the Address Resolution Protocol (ARP) in IPv4, the Neighbor Discovery Protocol (NDP) provides functions such as address resolution in IPv6. Despite its functionality, NDP is vulnerable to several attacks. To mitigate these attacks, Internet Protocol Security (IPsec) was introduced, but it proved inefficient due to its limitations. The SEND protocol was therefore proposed for automatic protection of the auto-configuration process; it secures the neighbor discovery and address resolution process. To defend against threats to NDP's integrity and identity, SEND uses Cryptographically Generated Addresses (CGA) and asymmetric cryptography. Despite the advantages of SEND, its disadvantages, namely the computational cost of the CGA algorithm and the sequential nature of the CGA generation algorithm, are considerable. In this paper, we parallelize this process across network resources in order to improve it. In addition, we compare the CGA generation time between self-computing and distributed computing, focusing on the impact of malicious nodes on the CGA generation time in the network. According to the results, even though malicious nodes participate in the generation process, the CGA generation time is lower than when the address is computed on a single node. With a trust management system, detecting and isolating malicious nodes becomes easier.
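The costly, parallelizable part of CGA generation is the brute-force search for a modifier whose hash has a required number of leading zero bits. The toy sketch below illustrates that search; note it is a simplification of RFC 3972 (real CGA's Hash2 covers the modifier, nine zero bytes, and the public key, and requires 16 × Sec zero bits), and the 4-bit-per-Sec difficulty and key bytes are assumptions made so the example runs quickly:

```python
import hashlib
import os

# Simplified sketch of the expensive step of CGA generation:
# brute-force a modifier until the hash has enough leading zero bits.
# Toy difficulty (4 bits per Sec); RFC 3972 uses 16 * Sec bits over
# a different input layout.

def find_modifier(public_key: bytes, sec: int) -> bytes:
    zero_bits = 4 * sec  # toy difficulty parameter
    while True:
        modifier = os.urandom(16)
        digest = hashlib.sha1(modifier + public_key).digest()
        # Test the leading zero bits of the digest.
        value = int.from_bytes(digest, "big")
        if value >> (len(digest) * 8 - zero_bits) == 0:
            return modifier

mod = find_modifier(b"example-public-key", sec=2)
print(len(mod))  # → 16
```

Because each candidate modifier is tested independently, the search distributes naturally across nodes, which is the property the paper's distributed-computing scheme exploits; the expected work grows exponentially in Sec.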

Keywords: NDP, IPsec, SEND, CGA, modifier, malicious node, self-computing, distributed-computing

Procedia PDF Downloads 276
15514 Systemic Functional Grammar Analysis of Barack Obama's Second Term Inaugural Speech

Authors: Sadiq Aminu, Ahmed Lamido

Abstract:

This research studies Barack Obama’s second inaugural speech using Halliday’s Systemic Functional Grammar (SFG). SFG is a text grammar that describes how language is used, so that the meaning of a text can be better understood. The primary source of data in this research is Barack Obama’s second inaugural speech, which was obtained from the internet. The analysis of the speech is based on the ideational and textual metafunctions of Systemic Functional Grammar. Specifically, the researchers analyse the process types and participants (ideational) and the Theme/Rheme structure (textual). It was found that the material process (the process of doing) was the most frequently used process type, and ‘We’, which refers to the people of America, was the most frequently used Theme. Applying the SFG theory therefore gives a better understanding of the meaning of Barack Obama’s speech.
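The quantitative side of this kind of analysis is a frequency tally over manually tagged clauses. As a hedged sketch, the clauses and tags below are illustrative stand-ins, not the study's annotated corpus:

```python
from collections import Counter

# Sketch: tallying SFG process types over hand-tagged clauses.
# The tagged clauses are illustrative stand-ins.

tagged_clauses = [
    ("We must act", "material"),
    ("We believe in opportunity", "mental"),
    ("America is exceptional", "relational"),
    ("We will respond", "material"),
    ("He said", "verbal"),
]

process_counts = Counter(ptype for _, ptype in tagged_clauses)
most_common_type, count = process_counts.most_common(1)[0]
print(most_common_type, count)  # material dominates this toy set
```

The same Counter pattern applies to Theme tokens, which is how a finding like "'We' is the most frequent Theme" is quantified.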

Keywords: ideational, metafunction, rheme, textual, theme

Procedia PDF Downloads 151
15513 How to Enhance the Performance of Universities by Implementing the Balanced Scorecard Using FDM and ANP

Authors: Neda Jalaliyoon, Nooh Abu Bakar, Hamed Taherdoost

Abstract:

The present research recommends a balanced scorecard (BSC) framework to appraise the performance of universities. As the original balanced scorecard model has four perspectives, the same model, with the 'financial', 'customer', 'internal process', and 'learning and growth' perspectives, is used in the present research as well. By applying the fuzzy Delphi method (FDM) and a questionnaire, sixteen performance measures were identified. The weights of the selected indicators were then determined using the analytic network process (ANP). Results indicated that the most important BSC aspects were internal process (0.3149), customer (0.2769), learning and growth (0.2049), and financial (0.2033), respectively. The proposed BSC framework can help universities enhance their efficiency in a competitive environment.
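The ANP weighting step ultimately reduces to extracting a priority vector from pairwise-comparison judgments, which can be sketched with power iteration (the eigenvector method underlying ANP supermatrix computations). The comparison values below are illustrative assumptions chosen only to reproduce the ordering reported above, not the study's actual judgments:

```python
# Sketch: priority weights from a pairwise-comparison matrix via
# power iteration (the eigenvector method used in AHP/ANP).
# Comparison values are illustrative assumptions.

def priority_weights(matrix, iters=100):
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        # One power-iteration step followed by normalization.
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]
    return w

# Order: internal process, customer, learning & growth, financial.
pairwise = [
    [1.0,     1.2,     1.5,     1.6],
    [1 / 1.2, 1.0,     1.3,     1.4],
    [1 / 1.5, 1 / 1.3, 1.0,     1.1],
    [1 / 1.6, 1 / 1.4, 1 / 1.1, 1.0],
]

weights = priority_weights(pairwise)
print([round(w, 3) for w in weights])  # descending: internal process first
```

A full ANP also models dependencies between clusters through a supermatrix; this sketch shows only the local priority extraction.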

Keywords: balanced scorecard, higher education, fuzzy delphi method, analytic network process (ANP)

Procedia PDF Downloads 422
15512 Design and Implementation of LabVIEW Based Relay Autotuning Controller for Level Setup

Authors: Manoj M. Sarode, Sharad P. Jadhav, Mukesh D. Patil, Pushparaj S. Suryawanshi

Abstract:

Even though the PID controller is widely used in industrial processes, tuning the PID parameters is not easy: it is time-consuming and requires expert personnel. Another drawback of the PID controller is that the process dynamics may change over time, for example due to variation in the process load or normal wear and tear. To compensate for such changes in process behavior, expert users are required to recalibrate the PID gains. Implementing a model-based controller usually requires a process model, but identifying a process model is a time-consuming job with no guarantee of model accuracy, and if the identified model is not accurate, the performance of the controller may degrade. Model-based controllers are also quite expensive, and the whole implementation procedure can be tedious. To eliminate such issues, an autotuning PID controller becomes a vital element. A software-based relay feedback autotuning controller proves to be an efficient, upgradable, and maintenance-free solution: with relay feedback autotuning, the PID parameters can be obtained within a very short span of time. This paper presents the real-time implementation of a LabVIEW-based relay feedback autotuning PID controller. It has been successfully developed and implemented to control the level of a laboratory setup, and its performance, analyzed for different setpoints, was found satisfactory.
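The core of relay feedback autotuning can be sketched in a few lines: the relay induces a limit cycle, the describing-function approximation gives the ultimate gain from the relay amplitude and the measured oscillation amplitude, and Ziegler–Nichols closed-loop rules yield the PID gains. The measured amplitude and period below are illustrative values, not the lab setup's data:

```python
import math

# Sketch of the relay-feedback tuning rule. The measured oscillation
# amplitude and period are illustrative values.

d = 2.0    # relay output amplitude
a = 0.5    # measured oscillation amplitude of the level signal
Pu = 12.0  # measured oscillation period, s (the ultimate period)

# Describing-function estimate of the ultimate gain.
Ku = 4 * d / (math.pi * a)

# Ziegler–Nichols closed-loop PID rules.
Kp = 0.6 * Ku
Ti = Pu / 2   # integral time, s
Td = Pu / 8   # derivative time, s

print(round(Ku, 3), round(Kp, 3), Ti, Td)
```

This is why the relay experiment is so fast: one sustained oscillation is enough to read off a and Pu, after which the PID gains follow directly from the formulas.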

Keywords: autotuning, PID, liquid level control, recalibrate, LabVIEW, controller

Procedia PDF Downloads 390