Search results for: robust process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15834

15564 Evaluation of Mito-Uncoupler Induced Hyper Metabolic and Aggressive Phenotype in Glioma Cells

Authors: Yogesh Rai, Saurabh Singh, Sanjay Pandey, Dhananjay K. Sah, B. G. Roy, B. S. Dwarakanath, Anant N. Bhatt

Abstract:

One of the most common signatures of highly malignant gliomas is their capacity to metabolize more glucose to lactic acid than normal brain tissues, even under normoxic conditions (Warburg effect), indicating that aerobic glycolysis is constitutively upregulated through stable genetic or epigenetic changes. However, oxidative phosphorylation (OxPhos) is also required to maintain the mitochondrial membrane potential for tumor cell survival. In the process of tumorigenesis, tumor cells at their fastest growth rate exhibit both high glycolysis and high OxPhos. Therefore, metabolically reprogrammed cancer cells with a combination of both aerobic glycolysis and altered OxPhos develop a robust metabolic phenotype, which confers a selective growth advantage. In our study, we grew the highly glycolytic BMG-1 (glioma) cells with continuous exposure to the mitochondrial uncoupler 2,4-dinitrophenol (DNP) for 10 passages to obtain a phenotype of high glycolysis with enhanced altered OxPhos. We found that the OxPhos-modified BMG (OPM-BMG) cells have a similar growth rate and cell cycle distribution but higher mitochondrial mass and functional enzymatic activity than the parental cells. In in-vitro studies, OPM-BMG cells showed enhanced invasion, proliferation, and migration properties. Moreover, they also showed enhanced angiogenesis in the Matrigel plug assay. Xenografted tumors from OPM-BMG cells showed a reduced latent period, a faster growth rate, and a nearly five-fold reduction in the tumor take in nude mice compared to BMG-1 cells, suggesting that the robust metabolic phenotype facilitates tumor formation and growth. OPM-BMG cells, which were found to be radio-resistant, showed enhanced radio-sensitization by 2-DG compared to the parental BMG-1 cells. This study suggests that metabolic reprogramming in cancer cells enhances their potential for migration, invasion, and proliferation. It also strengthens the ability of cancer cells to escape death processes, conferring resistance to therapeutic modalities. Our data also suggest that combining metabolic inhibitors like 2-DG with conventional therapeutic modalities can sensitize such metabolically aggressive cancer cells more than the therapies alone.

Keywords: 2-DG, BMG, DNP, OPM-BMG

Procedia PDF Downloads 187
15563 Investigation on Machine Tools Energy Consumptions

Authors: Shiva Abdoli, Daniel T.Semere

Abstract:

Several studies have been conducted on energy consumption in the cutting process. Most of these studies focus on measuring the consumption and proposing methods to reduce it. In this work, the relation between the cutting parameters and the consumption is investigated in order to establish a generalized energy consumption model that can be used for process and production planning in real production lines. Using the generalized model, process planning can be carried out by taking into account the energy as a function of the selected process parameters. Similarly, the generalized model can be used in production planning to select the right operational parameters, such as batch sizes, routing, and buffer sizes, in a production line. The description and derivation of the model, as well as a case study, are given in this paper to illustrate the applicability and validity of the model.
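
As an illustration of what such a generalized cutting-energy model can look like, the sketch below uses the common two-term form P = P0 + k * MRR (idle power plus specific cutting energy times the material removal rate). The coefficient values and parameter names are hypothetical placeholders, not the model derived in the paper.

```python
# Hedged sketch of a cutting-energy model of the form P = P0 + k * MRR.
# P0 (baseline machine power) and k (specific cutting energy) are made-up values.

def cutting_energy(depth_mm, width_mm, feed_mm_per_s, volume_mm3,
                   p0_watts=1200.0, k_j_per_mm3=2.5):
    """Estimate the energy (J) needed to remove a given volume of material."""
    mrr = depth_mm * width_mm * feed_mm_per_s      # material removal rate, mm^3/s
    cutting_time = volume_mm3 / mrr                # s
    power = p0_watts + k_j_per_mm3 * mrr           # W
    return power * cutting_time                    # J

# Example: compare two parameter sets for the same removed volume.
for ap, ae, f in [(2.0, 10.0, 5.0), (4.0, 10.0, 10.0)]:
    e = cutting_energy(ap, ae, f, volume_mm3=50_000)
    print(f"ap={ap} ae={ae} f={f}: E = {e / 1000:.1f} kJ")
```

Higher MRR shortens the cutting time and therefore reduces the share of energy spent on idle power, which is the kind of trade-off such a model makes visible for process planning.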

Keywords: process parameters, cutting process, energy efficiency, Material Removal Rate (MRR)

Procedia PDF Downloads 468
15562 Stock Price Informativeness and Profit Warnings: Empirical Analysis

Authors: Adel Almasarwah

Abstract:

This study investigates the nature of the association between profit warnings and stock price informativeness in the context of Jordan as an emerging country. The analysis is based on the response of stock price synchronicity to the percentages of profit warnings published by Jordanian firms listed on the Amman Stock Exchange throughout the period 2005–2016. The profit warning indicators relate negatively to stock price synchronicity in Jordanian firms, meaning that firms with a high proportion of profit warnings incorporate more firm-specific information into their stock prices. Robust regression was used rather than OLS as a parametric test to overcome the variance inflation factor (VIF) and heteroscedasticity issues identified when running the OLS regression; this enabled us to obtain stronger results that fall in line with our prediction that higher profit warnings encourage firm investors to collect and process more firm-specific information than common market information.
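
A minimal sketch of the estimation idea, assuming "robust regression" here means an M-estimator in the Huber sense: the slope from a robust fit is compared with the OLS slope on heavy-tailed simulated data. The variable names and the simulated series are illustrative only, not the study's dataset.

```python
# Sketch: OLS vs. Huber robust regression on simulated, heavy-tailed data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
profit_warnings = rng.uniform(0, 1, n)                      # toy regressor
synchronicity = 0.5 - 0.8 * profit_warnings + 0.2 * rng.standard_t(3, n)

X = sm.add_constant(profit_warnings)
ols_fit = sm.OLS(synchronicity, X).fit()
rlm_fit = sm.RLM(synchronicity, X, M=sm.robust.norms.HuberT()).fit()

print("OLS slope:   ", ols_fit.params[1])
print("Robust slope:", rlm_fit.params[1])
```

With outlier-prone noise the robust slope is typically closer to the true coefficient, which mirrors the paper's reason for preferring robust regression over OLS.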

Keywords: profit warnings, Jordanian firms, stock price informativeness, synchronicity

Procedia PDF Downloads 121
15561 Information Technology Service Management System Measurement Using ISO20000-1 and ISO15504-8

Authors: Imam Asrowardi, Septafiansyah Dwi Putra, Eko Subyantoro

Abstract:

Process assessments can improve IT service management system (IT SMS) processes, but the assessment method is not always transparent. This paper outlines a project to develop a solution-mediated process assessment tool that enables transparent and objective SMS process assessment. Using the international standards for SMS (ISO 20000-1) and process assessment (ISO 15504-8), the tool is being developed following the international standard approach, in collaboration with, and evaluated by expert judgment from, committee members and ITSM practitioners.

Keywords: SMS, tools evaluation, ITIL, ISO service

Procedia PDF Downloads 449
15560 Architecture - Performance Relationship in GPU Computing - Composite Process Flow Modeling and Simulations

Authors: Ram Mohan, Richard Haney, Ajit Kelkar

Abstract:

Current developments in computing have shown the advantage of using one or more Graphic Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPU(s) for High Performance Computing (HPC) can be categorized as hardware and software oriented in nature. Understanding how these factors affect performance is essential to develop efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as the computational core, the characteristics and results of which reflect many other HPC applications via the sparse matrix system used for the solution of the linear system of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.

Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance

Procedia PDF Downloads 331
15559 The Effect of Law on Politics

Authors: Boukrida Rafiq

Abstract:

Democracy is based on the notion that all citizens have the right to participate in the managing of political affairs and that every citizen's input is of equal importance. This basic assumption clearly places emphasis on public participation in maintaining a stable democracy. The level of public participation, however, is highly contested, with many theorists arguing that too much public participation would overwhelm and ultimately cripple democratic systems. On the other hand, others who favor high levels of participation argue that more citizen involvement leads to greater representation. Regardless of these disagreements over the ideal level of participation, there is widespread agreement amongst scholars that, at the very least, some participation is necessary to maintain democratic systems. The ways in which citizens participate vary greatly and, depending on the method used, influence political decision making at varying levels. The method of political participation is key in controlling public influence over political affairs and is therefore also an integral part of maintaining democracy, whether it be "thin" (low levels of participation) or "robust" (high levels of participation). High levels of participation, or "robust" democracy, are argued by some theorists to enhance democracy by providing the opportunity for more issues to be represented during decision making. The notion of widespread participation was first advanced by classical theorists.

Keywords: assumption clearly places emphasis, ultimately cripple, influence political decision making at varying, classical theorists

Procedia PDF Downloads 429
15558 A Stepwise Approach to Automate the Search for Optimal Parameters in Seasonal ARIMA Models

Authors: Manisha Mukherjee, Diptarka Saha

Abstract:

Reliable forecasts of univariate time series data are often necessary in several contexts. ARIMA models are quite popular among practitioners in this regard. Choosing correct parameter values for ARIMA is therefore a challenging yet imperative task. Thus, a stepwise algorithm is introduced to provide automatic and robust estimates of the parameters (p, d, q)(P, D, Q) used in seasonal ARIMA models. This process is focused on improving the overall quality of the estimates, and it alleviates the problems induced by the unidimensional nature of currently used methods such as auto.arima. The fast and automated search of the parameter space also ensures reliable estimates of the parameters that possess several desirable qualities, consequently resulting in higher test accuracy, especially in the case of noisy data. After rigorous testing on real as well as simulated data, the algorithm not only performs better than current state-of-the-art methods, it also completely obviates the need for human intervention due to its automated nature.
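
For orientation only, the sketch below shows a naive exhaustive search over seasonal ARIMA orders scored by AIC, using Python's statsmodels. The paper's algorithm is a stepwise procedure built around R's auto.arima, so the search strategy, order ranges, and selection criterion here are assumptions made for illustration, not the authors' method.

```python
# Naive grid search over (p, d, q)(P, D, Q) for a seasonal ARIMA, scored by AIC.
import itertools
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

def search_sarima(y, seasonal_period=12, max_order=1):
    """Return (best AIC, best (p, d, q, P, D, Q)) over a small grid of orders."""
    best = (np.inf, None)
    orders = range(max_order + 1)
    for p, d, q, P, D, Q in itertools.product(orders, (0, 1), orders,
                                              orders, (0, 1), orders):
        try:
            fit = SARIMAX(y, order=(p, d, q),
                          seasonal_order=(P, D, Q, seasonal_period)).fit(disp=False)
            if fit.aic < best[0]:
                best = (fit.aic, (p, d, q, P, D, Q))
        except Exception:
            continue  # skip combinations that fail to converge
    return best

rng = np.random.default_rng(1)
t = np.arange(120)
y = np.cumsum(rng.normal(size=120)) + 5 * np.sin(2 * np.pi * t / 12)
print(search_sarima(y))
```

A stepwise search, as in the paper, visits far fewer candidates than this grid by moving one order at a time from a starting model, which is what makes it practical on larger parameter spaces.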

Keywords: time series, ARIMA, auto.arima, ARIMA parameters, forecast, R function

Procedia PDF Downloads 129
15557 A Development of Holonomic Mobile Robot Using Fuzzy Multi-Layered Controller

Authors: Seungwoo Kim, Yeongcheol Cho

Abstract:

In this paper, a holonomic mobile robot with omnidirectional wheels is designed, and an adaptive fuzzy controller is presented for precise trajectory tracking. An adaptive controller based on a fuzzy multi-layered algorithm is used to handle the large parametric uncertainty of the motor-controlled dynamic system of the 3-wheel omnidirectional mobile robot. System parameters such as the tracking force are highly time-varying due to the kinematic structure of omnidirectional wheels. The fuzzy adaptive control method is able to solve the problems of classical adaptive controllers and conventional fuzzy adaptive controllers. The basic idea of the new adaptive control scheme is that an adaptive controller can be constructed from a parallel combination of robust controllers. This new adaptive controller uses a fuzzy multi-layered architecture which has several independent fuzzy controllers in parallel, each with a different robust stability area. Out of these independent fuzzy controllers, the most suitable one is selected by a system identifier which observes variations in the controlled system parameters. This paper proposes a design procedure which can be carried out mathematically and systematically from the model of the controlled system. Finally, the good performance of the holonomic mobile robot is confirmed through live tests of the tracking control task.

Keywords: fuzzy adaptive control, fuzzy multi-layered controller, holonomic mobile robot, omnidirectional wheels, robustness and stability

Procedia PDF Downloads 322
15556 Optimizing Performance of Tablet's Direct Compression Process Using Fuzzy Goal Programming

Authors: Abbas Al-Refaie

Abstract:

This paper aims at improving the performance of the tableting process using statistical quality control and fuzzy goal programming. The tableting process was studied, and statistical control tools were used to characterize the existing process for three critical responses: the averages of a tablet's weight, hardness, and thickness. At the initial process factor settings, the estimated process capability index values for the tablet's average weight, hardness, and thickness were 0.58, 3.36, and 0.88, respectively. The L9 orthogonal array was utilized to design the experiments. Fuzzy goal programming was then employed to find the combination of optimal factor settings. Optimization results showed that the process capability index values for the tablet's average weight, hardness, and thickness were improved to 1.03, 4.42, and 1.42, respectively. Such improvements resulted in significant savings in quality and production costs.
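
As a reminder of what the quoted capability figures measure, here is a minimal sketch of estimating Cp and Cpk from sample data; the specification limits and the simulated tablet weights are made-up values, not the study's measurements.

```python
# Minimal process capability sketch: Cp and Cpk from a sample and spec limits.
import numpy as np

def capability(samples, lsl, usl):
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    cp = (usl - lsl) / (6 * sigma)                       # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)          # accounts for centering
    return cp, cpk

rng = np.random.default_rng(2)
weights = rng.normal(loc=250.0, scale=1.2, size=100)     # toy tablet weights, mg
print(capability(weights, lsl=245.0, usl=255.0))
```

Raising Cpk from 0.58 to 1.03, as reported for tablet weight, corresponds to pulling the process mean and spread well inside the specification limits.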

Keywords: fuzzy goal programming, control charts, process capability, tablet optimization

Procedia PDF Downloads 238
15555 Linearization and Process Standardization of Construction Design Engineering Workflows

Authors: T. R. Sreeram, S. Natarajan, C. Jena

Abstract:

Civil engineering construction is a network of tasks involving varying degrees of complexity, and streamlining and standardization are the only way to establish a systematic approach to design. While there are off-the-shelf tools such as AutoCAD that play a role in the realization of design, the repeatable process in which these tools are deployed is often ignored. The present paper addresses this challenge through a sustainable design process and effective standardization at all stages of the design workflow. This is demonstrated through a case study in the context of construction, and further improvement points are highlighted.

Keywords: system, lean, value stream, process improvement

Procedia PDF Downloads 92
15554 Geospatial Analysis for Predicting Sinkhole Susceptibility in Greene County, Missouri

Authors: Shishay Kidanu, Abdullah Alhaj

Abstract:

Sinkholes in the karst terrain of Greene County, Missouri, pose significant geohazards, imposing challenges on construction and infrastructure development, with potential threats to lives and property. To address these issues, understanding the influencing factors and modeling sinkhole susceptibility is crucial for effective mitigation through strategic changes in land use planning and practices. This study utilizes geographic information system (GIS) software to collect and process diverse data, including topographic, geologic, hydrogeologic, and anthropogenic information. Nine key sinkhole influencing factors, ranging from slope characteristics to proximity to geological structures, were carefully analyzed. The Frequency Ratio method establishes relationships between attribute classes of these factors and sinkhole events, deriving class weights that indicate their relative importance. Weighted integration of these factors is accomplished using the Analytic Hierarchy Process (AHP) and the Weighted Linear Combination (WLC) method in a GIS environment, resulting in a comprehensive sinkhole susceptibility index (SSI) model for the study area. Employing the Jenks natural breaks classification method, the SSI values are categorized into five distinct sinkhole susceptibility zones: very low, low, moderate, high, and very high. Validation of the model, conducted through the Area Under Curve (AUC) and Sinkhole Density Index (SDI) methods, demonstrates a robust correlation with sinkhole inventory data. The prediction rate curve yields an AUC value of 74%, indicating a 74% validation accuracy. The SDI result further supports the success of the sinkhole susceptibility model. This model offers reliable predictions for the future distribution of sinkholes, providing valuable insights for planners and engineers in the formulation of development plans and land-use strategies. Its application extends to enhancing preparedness and minimizing the impact of sinkhole-related geohazards on both infrastructure and the community.
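
An illustrative sketch of the Frequency Ratio weighting and Weighted Linear Combination steps described above, with only two factors for brevity. The raster arrays, class boundaries, and AHP weights are made-up placeholders, not the study's data or its nine factors.

```python
# Toy Frequency Ratio (FR) + Weighted Linear Combination (WLC) susceptibility index.
import numpy as np

def frequency_ratio(factor_classes, sinkhole_mask):
    """FR per class = (share of sinkhole cells in class) / (share of all cells in class)."""
    fr = {}
    total_cells = factor_classes.size
    total_sinks = sinkhole_mask.sum()
    for c in np.unique(factor_classes):
        in_class = factor_classes == c
        fr[c] = (sinkhole_mask[in_class].sum() / total_sinks) / (in_class.sum() / total_cells)
    return fr

def apply_weights(factor_classes, fr):
    out = np.zeros(factor_classes.shape)
    for c, w in fr.items():
        out[factor_classes == c] = w
    return out

rng = np.random.default_rng(3)
slope_class = rng.integers(0, 3, size=(100, 100))     # e.g. three slope classes
depth_class = rng.integers(0, 3, size=(100, 100))     # e.g. depth-to-bedrock classes
sinks = rng.random((100, 100)) < 0.01                 # toy sinkhole inventory mask

fr_slope = frequency_ratio(slope_class, sinks)
fr_depth = frequency_ratio(depth_class, sinks)

# WLC: susceptibility index = sum over factors of (AHP factor weight * FR class weight)
ahp = {"slope": 0.6, "depth": 0.4}                    # hypothetical AHP weights
ssi = ahp["slope"] * apply_weights(slope_class, fr_slope) \
    + ahp["depth"] * apply_weights(depth_class, fr_depth)
print("SSI range:", float(ssi.min()), "to", float(ssi.max()))
```

The resulting SSI raster would then be sliced into the five susceptibility zones with a natural-breaks classifier, as in the study.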

Keywords: sinkhole, GIS, analytical hierarchy process, frequency ratio, susceptibility, Missouri

Procedia PDF Downloads 34
15553 Inversion of the Spectral Analysis of Surface Waves Dispersion Curves through the Particle Swarm Optimization Algorithm

Authors: A. Cerrato Casado, C. Guigou, P. Jean

Abstract:

In this investigation, the particle swarm optimization (PSO) algorithm is used to perform the inversion of the dispersion curves in the spectral analysis of surface waves (SASW) method. This inverse problem usually presents complicated solution spaces with many local minima that make convergence to the correct solution difficult. PSO is a metaheuristic method that was originally designed to simulate social behavior but has demonstrated powerful capabilities for solving inverse problems with complex solution spaces and a high number of variables. The dispersion curve of the synthetic soils is constructed by the vertical flexibility coefficient method, which is especially convenient for soils where the stiffness does not increase gradually with depth. The reason is that these types of soil profiles are not normally dispersive, since the dominant mode of Rayleigh waves is usually not coincident with the fundamental mode. Multiple synthetic soil profiles have been tested to show the characteristics of the convergence process and assess the accuracy of the final soil profile. In addition, the inversion procedure is applied to multiple real soils and the final profiles are compared with the available information. The combination of the vertical flexibility coefficient method to obtain the dispersion curve and the PSO algorithm to carry out the inversion process proves to be a robust procedure that is able to provide good solutions for complex soil profiles even with scarce prior information.
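
A minimal PSO sketch of the inversion idea: find layer shear-wave velocities that minimize the misfit between observed and predicted dispersion curves. The forward model below is a stand-in toy blend of layer velocities, not the vertical flexibility coefficient method used in the paper, and all bounds and PSO coefficients are illustrative.

```python
# Toy bearing of the SASW inversion: PSO searching layer velocities to match a curve.
import numpy as np

rng = np.random.default_rng(4)
depths = np.array([0.0, 5.0, 15.0])                   # representative layer depths, m

def forward_dispersion(vs_layers, freqs):
    # toy forward model: frequency-dependent weighted average of layer velocities
    weights = np.exp(-np.outer(freqs, depths) / 100.0)
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ vs_layers

freqs = np.linspace(5, 50, 20)
true_vs = np.array([180.0, 320.0, 550.0])
observed = forward_dispersion(true_vs, freqs)

def misfit(vs):
    return np.sum((forward_dispersion(vs, freqs) - observed) ** 2)

n_particles, n_iter, dim = 30, 200, 3
pos = rng.uniform(100, 800, (n_particles, dim))       # Vs search range, m/s
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([misfit(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 100, 800)
    vals = np.array([misfit(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("recovered Vs profile:", np.round(gbest, 1))
```

Because the swarm keeps both personal and global bests, it can escape the local minima that make gradient-based inversion of dispersion curves unreliable.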

Keywords: dispersion, inverse problem, particle swarm optimization, SASW, soil profile

Procedia PDF Downloads 154
15552 Greyscale: A Tree-Based Taxonomy for Grey Literature Published by Fisheries Agencies

Authors: Tatiana Tunon, Gottfried Pestal

Abstract:

Government agencies responsible for the management of fisheries resources publish many types of grey literature, and these materials are increasingly accessible to the public on agency websites. However, scope and quality vary considerably, and end-users need metadata about a report series when deciding whether to use the information (e.g. apply the methods, include the results in a systematic review) or when prioritizing materials for archiving (e.g. library holdings, reference databases). A taxonomy for these report series was developed based on a review of 41 report series from 6 government agencies in 4 countries (Canada, New Zealand, Scotland, and United States). Each report series was categorized according to multiple criteria describing the peer-review process, content, and purpose. A robust classification tree was then fitted to these descriptions, and the resulting taxonomic groups were used to compare agency output from the 4 countries using reports available in their online repositories.
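
An illustrative sketch of fitting a classification tree to report-series descriptors, in the spirit of the taxonomy described above. The feature names, categories, and toy data are hypothetical, not the authors' criteria or dataset.

```python
# Toy classification tree over invented report-series descriptors.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(5)
n = 41
# hypothetical descriptors: peer_review (0/1), has_methods (0/1), pages, advisory_purpose (0/1)
X = np.column_stack([
    rng.integers(0, 2, n),
    rng.integers(0, 2, n),
    rng.integers(4, 200, n),
    rng.integers(0, 2, n),
])
# toy taxonomic group driven mainly by peer review and purpose
y = np.where(X[:, 0] == 1, "technical report",
             np.where(X[:, 3] == 1, "advisory document", "data summary"))

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["peer_review", "has_methods",
                                       "pages", "advisory_purpose"]))
```

The printed tree makes the grouping rules explicit, which is what lets the resulting taxonomic groups be applied consistently across agencies and countries.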

Keywords: classification tree, fisheries, government, grey literature

Procedia PDF Downloads 244
15551 Design and Control of a Knee Rehabilitation Device Using an MR-Fluid Brake

Authors: Mina Beheshti, Vida Shams, Mojtaba Esfandiari, Farzaneh Abdollahi, Abdolreza Ohadi

Abstract:

Most of the people who survive a stroke need rehabilitation tools to regain their mobility. The core component of these devices is a brake actuator. The goal of this study is to design and control a magnetorheological brake which can be used as a rehabilitation tool. The fluid used in this brake is a magnetorheological (MR) fluid, whose properties change with variation of the magnetic field. This feature of the fluid allows the braking properties to be controlled. In this research, different MR brake designs are first introduced, and for each design the dimensions of the brake are determined based on the torque required for foot movement. To calculate the brake dimensions, it is assumed that the shear stress distribution in the fluid is uniform and the fluid is in its saturated state. After designing the rehabilitation brake, the mathematical model of the movement of a healthy person is extracted. Due to the nonlinear nature of the system and its variability, various adaptive, neural network, and robust controllers have been implemented to estimate the parameters and control the system. After calculating the torque and control current, the best type of controller in terms of error and control current has been selected. Finally, this controller is applied to the experimental data of the patient's movements, and the control current is calculated to achieve the desired torque and motion.

Keywords: rehabilitation, magnetorheological fluid, knee, brake, adaptive control, robust control, neural network control, torque control

Procedia PDF Downloads 119
15550 Crude Distillation Process Simulation Using Unisim Design Simulator

Authors: C. Patrascioiu, M. Jamali

Abstract:

The paper deals with the simulation of the crude distillation process using the Unisim Design simulator. The necessity of simulating this process is argued both by considerations related to the design of the crude distillation column and by considerations related to the design of advanced control systems. In order to use the Unisim Design simulator to simulate the crude distillation process, the simulators used in Romania were identified and an analysis of the PRO/II, HYSYS, and Aspen HYSYS simulators was carried out. This analysis of the simulators for the crude distillation process has allowed the authors to draw conclusions about the success of crude oil modelling. A first aspect developed by the authors is the implementation of specific problems of petroleum liquid-vapor equilibrium using the Unisim Design simulator. The second major element of the article is the development of the methodology and the elaboration of the simulation program for the crude distillation process using Unisim Design resources. The obtained results validate the proposed methodology and will allow dynamic simulation of the process.

Keywords: crude oil, distillation, simulation, Unisim Design, simulators

Procedia PDF Downloads 213
15549 Design and Test a Robust Bearing-Only Target Motion Analysis Algorithm Based on Modified Gain Extended Kalman Filter

Authors: Mohammad Tarek Al Muallim, Ozhan Duzenli, Ceyhun Ilguy

Abstract:

Passive sonar is a method for detecting acoustic signals in the ocean; it detects the acoustic signals emanating from external sources. With passive sonar, we can determine only the bearing of the target, with no information about its range. Target Motion Analysis (TMA) is a process to estimate the position and speed of a target using passive sonar information. Since bearing is the only available information, the technique is called bearing-only TMA. Many TMA techniques have been developed. However, until now there has been no single method that can always track an unknown target and extract its moving trace effectively. In this work, an effective bearing-only TMA algorithm is designed. The measured bearing angles are very noisy; moreover, for multi-beam sonar the measurements are quantized due to the sonar beam width. To deal with this, a modified gain extended Kalman filter algorithm is used. The algorithm is fine-tuned, and many modules are added to improve the performance. A special validation gate module is used to ensure stability of the algorithm. Many indicators of the performance and confidence level measurement are designed and tested. A new method to detect whether the target is maneuvering is proposed. Moreover, a reactive optimal observer maneuver based on bearing measurements is proposed, which ensures convergence to the right solution at all times. To test the performance of the proposed TMA algorithm, a simulation is done with a MATLAB program. The simulator models a discrete scenario for an observer and a target, and takes into consideration all the practical aspects of the problem, such as smooth transitions in speed, a circular turn of the ship, noisy measurements, and quantized bearing measurements coming from multi-beam sonar. The tests were done for many test scenarios. For all the tests, full tracking is achieved within 10 minutes with very little error: the range estimation error was less than 5%, the speed error less than 5%, and the heading error less than 2 degrees. The online performance estimator is mostly aligned with the real performance. The range estimation confidence level gives a value equal to 90% when the range error is less than 10%. However, the convergence time of the algorithm still needs to be improved.
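
To make the filtering step concrete, here is a minimal sketch of a single bearing-only measurement update for a standard extended Kalman filter on a constant-velocity target state [px, py, vx, vy]. It is not the modified gain EKF of the paper, and the geometry, noise level, and initial covariance are invented for illustration. (The paper's simulation is in MATLAB; Python is used here only to keep the examples in this document consistent.)

```python
# Plain EKF bearing update (bearing measured clockwise from north, beta = atan2(dx, dy)).
import numpy as np

def ekf_bearing_update(x, P, bearing_meas, observer_pos, r_var):
    dx, dy = x[0] - observer_pos[0], x[1] - observer_pos[1]
    pred = np.arctan2(dx, dy)                        # predicted bearing
    rho2 = dx * dx + dy * dy
    H = np.array([[dy / rho2, -dx / rho2, 0.0, 0.0]])
    # wrap the innovation into (-pi, pi]
    innov = np.arctan2(np.sin(bearing_meas - pred), np.cos(bearing_meas - pred))
    S = H @ P @ H.T + r_var
    K = P @ H.T / S                                  # 4x1 Kalman gain
    x_new = x + (K * innov).ravel()
    P_new = (np.eye(4) - K @ H) @ P
    return x_new, P_new

x = np.array([2000.0, 3000.0, -3.0, 1.0])            # initial state guess (m, m/s)
P = np.diag([500.0**2, 500.0**2, 5.0**2, 5.0**2])
obs = np.array([0.0, 0.0])                           # observer position
z = np.deg2rad(34.0)                                 # one noisy bearing, radians
x, P = ekf_bearing_update(x, P, z, obs, r_var=np.deg2rad(1.0) ** 2)
print(x)
```

In a full bearing-only tracker this update alternates with a constant-velocity prediction step, and the observer must maneuver to make the range observable, which is the role of the reactive observer maneuver proposed in the paper.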

Keywords: target motion analysis, Kalman filter, passive sonar, bearing-only tracking

Procedia PDF Downloads 366
15548 Invasive Ranges of Gorse (Ulex europaeus) in South Australia and Sri Lanka Using Species Distribution Modelling

Authors: Champika S. Kariyawasam

Abstract:

The distribution of gorse (Ulex europaeus) plants in South Australia has been modelled using 126 presence-only location records as a function of seven climate parameters. The predicted range of U. europaeus is mainly along the Mount Lofty Ranges in the Adelaide Hills and on Kangaroo Island. Annual precipitation and yearly average aridity index appeared to be the highest contributing variables to the final model formulation. The jackknife procedure was employed to identify the contribution of different variables to the gorse model outputs, and response curves were used to predict changes with changing environmental variables. Based on this analysis, it was revealed that the combined effect of two or more variables can have a completely different impact on the model prediction than the original variables have on their own. This work also demonstrates the need for a careful approach when selecting environmental variables for projecting correlative models to climatically distinct areas. Maxent acts as a robust model when projecting the fitted species distribution model to another area with changing climatic conditions, whereas the generalized linear, bioclim, and domain models were found to be less robust in this regard. These findings are important not only for predicting and managing invasive alien gorse in South Australia and Sri Lanka but also in other countries of its invasive range.

Keywords: invasive species, Maxent, species distribution modelling, Ulex europaeus

Procedia PDF Downloads 104
15547 Unveiling Special Policy Regime, Judgment, and Taylor Rules in Tunisia

Authors: Yosra Baaziz, Moez Labidi

Abstract:

Given the limited research on monetary policy rules in revolutionary countries, this paper challenges the suitability of the Taylor rule for characterizing the monetary policy behavior of the Tunisian Central Bank (BCT), especially in turbulent times. More specifically, we investigate the possibility that the Taylor rule should be formulated as a threshold process and examine the validity of such a nonlinear Taylor rule as a robust rule for conducting monetary policy in Tunisia. Using quarterly data from 1998:Q4 to 2013:Q4 to analyze the movement of the nominal short-term interest rate of the BCT, we find that the nonlinear Taylor rule improves its performance with the advent of special events, thus providing a better description of the Tunisian interest rate setting. In particular, our results show that the adoption of an appropriate nonlinear approach leads to a reduction in the errors of 150 basis points in 1999 and 2009, and 60 basis points in 2011, relative to the linear approach.
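
A toy sketch of the comparison described above: a linear Taylor rule, i_t = c + a*pi_t + b*gap_t, estimated by OLS, versus a two-regime (threshold) version with separate coefficients above and below an inflation threshold. All series, the 5% threshold, and the coefficient values are simulated placeholders, not BCT data or the paper's specification.

```python
# Linear vs. threshold Taylor rule on simulated quarterly data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
T = 80
inflation = rng.normal(4.0, 1.5, T)
output_gap = rng.normal(0.0, 1.0, T)
high = inflation > 5.0                         # hypothetical regime threshold
# simulated policy rate that reacts more aggressively in the high-inflation regime
rate = 2.0 + inflation + np.where(high, 1.0, 0.3) * (inflation - 3.0) \
       + 0.5 * output_gap + rng.normal(0, 0.2, T)

X = sm.add_constant(np.column_stack([inflation, output_gap]))
linear = sm.OLS(rate, X).fit()

# threshold rule: separate coefficients in the low- and high-inflation regimes
low_fit = sm.OLS(rate[~high], X[~high]).fit()
high_fit = sm.OLS(rate[high], X[high]).fit()

print("linear RMSE:   ", np.sqrt(np.mean(linear.resid ** 2)))
resid = np.concatenate([low_fit.resid, high_fit.resid])
print("threshold RMSE:", np.sqrt(np.mean(resid ** 2)))
```

When the true policy reaction changes across regimes, the threshold specification fits with smaller errors, mirroring the basis-point reductions reported for the turbulent years.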

Keywords: policy rule, central bank, exchange rate, Taylor rule, nonlinearity

Procedia PDF Downloads 271
15546 Performance Evaluation of Various Segmentation Techniques on MRI of Brain Tissue

Authors: U.V. Suryawanshi, S.S. Chowhan, U.V Kulkarni

Abstract:

Accuracy of segmentation methods is of great importance in brain image analysis. Tissue classification in magnetic resonance imaging (MRI) of the brain is an important issue in the analysis of several brain dementias. This paper portrays the performance of segmentation techniques used on brain MRI. A large variety of algorithms for segmentation of brain MRI have been developed. The objective of this paper is to perform a segmentation process on MR images of the human brain using Fuzzy c-means (FCM), Kernel-based Fuzzy c-means (KFCM), Spatial Fuzzy c-means (SFCM), and Improved Fuzzy c-means (IFCM). The review covers imaging modalities, MRI, methods for noise reduction, and segmentation approaches. All methods are applied to MRI brain images degraded by salt-and-pepper noise, and the results demonstrate that the IFCM algorithm is more robust to noise than the standard FCM algorithm. We conclude with a discussion on the trend of future research in brain segmentation and changes to IFCM for better results.
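
For reference, the baseline FCM algorithm that the compared variants build on can be sketched in a few lines. The example below clusters simulated 1-D intensity values; the cluster count, fuzzifier m, and toy data are illustrative choices rather than the paper's settings.

```python
# Minimal fuzzy c-means on 1-D intensities (the baseline of FCM/KFCM/SFCM/IFCM).
import numpy as np

def fcm(x, n_clusters=3, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                 # fuzzy membership matrix
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)          # membership-weighted centers
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))                  # standard FCM membership update
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

rng = np.random.default_rng(7)
intensities = np.concatenate([rng.normal(mu, 5, 300) for mu in (40, 110, 180)])
centers, memberships = fcm(intensities)
print(np.sort(centers))
```

The spatial and improved variants modify the membership update with neighbourhood information, which is what gives them their reported robustness to salt-and-pepper noise.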

Keywords: image segmentation, preprocessing, MRI, FCM, KFCM, SFCM, IFCM

Procedia PDF Downloads 295
15545 Simulation-Based Optimization Approach for an Electro-Plating Production Process Based on Theory of Constraints and Data Envelopment Analysis

Authors: Mayada Attia Ibrahim

Abstract:

Evaluating and developing the electroplating production process is a key challenge for this type of process. The process is influenced by several factors, such as process parameters, process costs, and production environments. Analyzing and optimizing all these factors together requires extensive analytical techniques that are not available in real industrial settings. This paper presents a practice-based framework for the evaluation and optimization of some of the crucial factors that affect the costs and production times associated with this type of process: energy costs, material costs, and product flow times. The proposed approach uses Design of Experiments, Discrete-Event Simulation, and the Theory of Constraints, respectively, to identify the most significant factors affecting the production process, to simulate a real production line and recognize the effect of these factors, and to locate possible bottlenecks. Several scenarios are generated as corrective strategies for improving the production line. Following that, the input-oriented CCR data envelopment analysis (DEA) model is used to evaluate and optimize the suggested scenarios.
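
To show the shape of the final evaluation step, the sketch below computes input-oriented CCR DEA efficiency scores with one linear program per decision-making unit (here, one per scenario). The inputs and outputs (energy cost, material cost, throughput) and the three toy scenarios are hypothetical, not the study's data.

```python
# Input-oriented CCR DEA: minimize theta s.t. a convex combination of peers uses
# at most theta * inputs of the evaluated unit while producing at least its outputs.
import numpy as np
from scipy.optimize import linprog

def ccr_input_oriented(X, Y):
    """X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs). Returns efficiency score per DMU."""
    n = X.shape[0]
    scores = []
    for o in range(n):
        c = np.r_[1.0, np.zeros(n)]                    # variables: [theta, lambda_1..n]
        A_in = np.hstack([-X[o][:, None], X.T])        # sum_j lam_j x_ij <= theta x_io
        b_in = np.zeros(X.shape[1])
        A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
        b_out = -Y[o]                                  # sum_j lam_j y_rj >= y_ro
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[b_in, b_out],
                      bounds=[(None, None)] + [(0, None)] * n)
        scores.append(res.x[0])
    return np.array(scores)

# three scenarios: inputs = [energy cost, material cost], outputs = [throughput]
X = np.array([[120.0, 300.0], [100.0, 280.0], [90.0, 350.0]])
Y = np.array([[50.0], [48.0], [55.0]])
print(ccr_input_oriented(X, Y).round(3))
```

A score of 1 marks a scenario on the efficient frontier; lower scores show how much a scenario could shrink its inputs while keeping its outputs, which is how the candidate improvement strategies can be ranked.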

Keywords: electroplating process, simulation, design of experiment, performance optimization, theory of constraints, data envelopment analysis

Procedia PDF Downloads 62
15544 Treatment of Cutting Oily-Wastewater by Sono-Fenton Process: Experimental Approach and Combined Process

Authors: Pisut Painmanakul, Thawatchai Chintateerachai, Supanid Lertlapwasin, Nusara Rojvilavan, Tanun Chalermsinsuwan, Nattawin Chawaloesphonsiya, Onanong Larpparisudthi

Abstract:

Conventional coagulation, advanced oxidation processes (AOPs), and the combined process were evaluated and compared for their suitability to treat stabilized cutting-oil wastewater. A 90% efficiency was obtained from coagulation at an Al2(SO4)3 dosage of 150 mg/L and pH 7. On the other hand, the efficiencies of the AOPs for a 30-minute oxidation time were 10% for acoustic oxidation, 12% for acoustic oxidation with hydrogen peroxide, 76% for the Fenton process, and 92% for the sono-Fenton process. Achieving the highest oil removal efficiency with AOPs required a large amount of chemicals. Therefore, AOPs were studied as a post-treatment after the conventional separation process. The efficiency was considerable, as the effluent COD met the standard required for industrial wastewater discharge with lower chemical and energy consumption.

Keywords: cutting oily-wastewater, advance oxidation process, sono-fenton, combined process

Procedia PDF Downloads 316
15543 Parametric Studies of Ethylene Dichloride Purification Process

Authors: Sh. Arzani, H. Kazemi Esfeh, Y. Galeh Zadeh, V. Akbari

Abstract:

Ethylene dichloride (EDC) is a colorless liquid with a smell like chloroform. EDC is a chlorinated hydrocarbon obtained by chlorinating ethylene gas. Its chemical formula is C2H4Cl2, and it is used as the main intermediate in VCM production. Therefore, the purification of EDC is an important process in the petrochemical industry. In this study, the EDC purification unit was simulated and then validated. Finally, the impact of the process parameters on the degree of EDC purity was studied. The results showed that increasing the feed flow increases the impurities in the reflux, resulting in a decrease in EDC purity.

Keywords: ethylene dichloride, purification, edc, simulation

Procedia PDF Downloads 281
15542 Optimised Path Recommendation for a Real Time Process

Authors: Likewin Thomas, M. V. Manoj Kumar, B. Annappa

Abstract:

A traditional execution process follows the path of execution drawn by the process analyst without observing the behaviour of resources and other real-time constraints. Identifying the process model, predicting the behaviour of resources, and recommending the optimal path of execution for a real-time process is challenging. The proposed AlfyMiner (αyMiner) gives a new dimension to process execution with two novel techniques, Process Model Analyser (PMAMiner) and Resource Behaviour Analyser (RBAMiner), for recommending the probable path of execution. PMAMiner discovers the next probable activity for the currently executing activity in an online process, using a variant matching technique to identify the set of candidate next activities, among which the next probable activity is selected using a decision tree model. RBAMiner identifies the resource suitable for performing the discovered next probable activity and observes its behaviour: load and performance using a polynomial regression model, and waiting time using queueing theory. Based on the observed behaviour, αyMiner recommends the probable path of execution: the next probable activity and the most suitable resource for performing it. Experiments were conducted on process logs of the CoSeLoG Project; 72% accuracy was obtained in identifying and recommending the next probable activity, and the efficiency of resource performance was improved by 59% by decreasing resource load.
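
As a small illustration of the resource-behaviour estimates named above (waiting time from queueing theory and performance as a function of load from polynomial regression), here is a toy sketch. The M/M/1 queue assumption, the arrival and service rates, and the load-performance data are invented for illustration and are not the paper's models or logs.

```python
# Toy resource-behaviour estimates: M/M/1 waiting time and a load-performance polynomial fit.
import numpy as np

def mm1_waiting_time(arrival_rate, service_rate):
    """Mean time spent waiting in queue (excluding service) for an M/M/1 queue."""
    if arrival_rate >= service_rate:
        return np.inf                       # unstable queue: work arrives faster than it is served
    rho = arrival_rate / service_rate
    return rho / (service_rate - arrival_rate)

print("expected wait:", mm1_waiting_time(arrival_rate=4.0, service_rate=5.0), "hours per case")

# performance (cases/hour) vs. load, fitted with a 2nd-degree polynomial
load = np.array([1, 2, 3, 4, 5, 6, 7, 8])
perf = np.array([5.0, 5.4, 5.6, 5.5, 5.1, 4.4, 3.6, 2.5])
coeffs = np.polyfit(load, perf, deg=2)
print("predicted performance at load 5.5:", np.polyval(coeffs, 5.5))
```

Combining these two estimates per resource is what allows the recommender to prefer a resource whose predicted waiting time and performance keep the overall path efficient.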

Keywords: cross-organization process mining, process behaviour, path of execution, polynomial regression model

Procedia PDF Downloads 305
15541 Robust and Dedicated Hybrid Cloud Approach for Secure Authorized Deduplication

Authors: Aishwarya Shekhar, Himanshu Sharma

Abstract:

Data deduplication is one of the important data compression techniques for eliminating duplicate copies of repeating data, and it has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. In this process, duplicate data is expunged, leaving only a single instance of the data to be stored, though an index of every data item is still maintained. Data deduplication is an approach for minimizing the amount of storage space an organization requires to retain its data. In most companies, the storage systems carry identical copies of numerous pieces of data. Deduplication eliminates these additional copies by saving just one copy of the data and replacing the other copies with pointers that refer back to the primary copy. To eliminate this duplication of data and to preserve confidentiality in the cloud, we apply the concept of a hybrid cloud. A hybrid cloud is a fusion of at least one public and one private cloud. As a proof of concept, we implement Java code which provides security as well as removes duplicated data from the cloud.
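
A simplified sketch of the single-instance storage idea described above: each unique block is stored once, and later uploads of identical content become pointers to that block. The paper's prototype is in Java and adds authorization and the hybrid-cloud separation; none of that is modelled here, and the class and its methods are hypothetical.

```python
# Content-hash deduplication: store each unique block once, keep per-file pointers.
import hashlib

class DedupStore:
    def __init__(self):
        self.blocks = {}      # fingerprint -> data (single stored instance)
        self.index = {}       # file name -> fingerprint (pointer to the primary copy)

    def put(self, name, data: bytes):
        fp = hashlib.sha256(data).hexdigest()
        if fp not in self.blocks:          # store only the first copy
            self.blocks[fp] = data
        self.index[name] = fp              # every file keeps a pointer

    def get(self, name) -> bytes:
        return self.blocks[self.index[name]]

store = DedupStore()
store.put("report_a.txt", b"quarterly figures")
store.put("report_b.txt", b"quarterly figures")   # duplicate content, no new block stored
print(len(store.index), "files,", len(store.blocks), "stored block(s)")
```

In an authorized-deduplication setting, the fingerprint check is additionally gated by the user's privileges so that only entitled users can learn whether a given block already exists.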

Keywords: confidentiality, deduplication, data compression, hybridity of cloud

Procedia PDF Downloads 353
15540 Enhance Concurrent Design Approach through a Design Methodology Based on an Artificial Intelligence Framework: Guiding Group Decision Making to Balanced Preliminary Design Solution

Authors: Loris Franchi, Daniele Calvi, Sabrina Corpino

Abstract:

This paper presents a design methodology in which stakeholders are assisted with the exploration of a so-called negotiation space, aiming at the maximization of both the group social welfare and each stakeholder's perceived utility. The outcome is fewer design iterations needed for design convergence while obtaining higher solution effectiveness. During the early stage of a space project, not only the knowledge about the system but also the decision outcomes are often unknown. The scenario is exacerbated by the fact that decisions taken at this stage carry delayed costs associated with them. Hence, it is necessary to have a clear definition of the problem under analysis, especially in its initial definition. This can be obtained through a robust generation and exploration of design alternatives. This process must consider that design usually involves various individuals who take decisions affecting one another, so effective coordination among these decision-makers is critical. Finding a mutually agreed solution will reduce the iterations involved in the design process. To handle this scenario, the paper proposes a design methodology which aims to speed up the process of raising the mission's concept maturity level. This is obtained through a guided exploration of the negotiation space, which involves autonomous exploration and optimization of trade opportunities among stakeholders via artificial intelligence algorithms. The negotiation space is generated via a multidisciplinary collaborative optimization method, infused with game theory and multi-attribute utility theory. In particular, game theory is able to model the negotiation process to reach the equilibria among stakeholder needs. Because of the huge dimension of the negotiation space, a collaborative optimization framework with an evolutionary algorithm has been integrated in order to guide the game process to search efficiently and rapidly for the Pareto equilibria among stakeholders. Finally, the concept of utility constitutes the mechanism to bridge the language barrier between experts of different backgrounds and differing needs, using the elicited and modeled needs to evaluate a multitude of alternatives. To highlight the benefits of the proposed methodology, the paper presents the design of a CubeSat mission for the observation of the lunar radiation environment. The derived solution is able to balance all stakeholders' needs and guarantees the effectiveness of the selected mission concept thanks to its robustness to change. The benefits provided by the proposed design methodology are highlighted, and further developments are proposed.

Keywords: concurrent engineering, artificial intelligence, negotiation in engineering design, multidisciplinary optimization

Procedia PDF Downloads 103
15539 Implementation of ALD in Product Development: Study of ROPS to Improve Energy Absorption Performance Using Absorption Part

Authors: Zefry Darmawan, Shigeyuki Haruyama, Ken Kaminishi

Abstract:

Product development is a major issue in industrial competition and plays a serious part in the development of technology. The product development process should be able to adapt to rapid changes in market needs and transform them into engineering concepts in order to produce high-quality products. One of the latest methods in product development is Analysis-Led Design (ALD). It utilizes digital engineering design tools with finite element analysis to perform robust product analysis and is valuable for product reliability assurance. Heavy machinery that operates under severe conditions should maintain the safety of the customer when faced with potential hazards. The cab frame should be able to absorb the energy during a collision. Through ALD, a series of improvements of the cab frame to increase energy absorption was made and analyzed. Improvements were made by modifying the shape of the frame and/or installing an absorption device in certain areas. The simulation results showed that installing an absorption device increases the absorbed energy more than modifying the shape.

Keywords: ALD, ROPS, energy absorption, cab frame

Procedia PDF Downloads 343
15538 Effect of Impurities in the Chlorination Process of TiO2

Authors: Seok Hong Min, Tae Kwon Ha

Abstract:

With the increasing interest in Ti alloys, the process of extracting Ti from its typical ore, TiO2, has long been and will remain an important issue. As an intermediate product for the production of pigment or titanium metal sponge, titanium tetrachloride (TiCl4) is produced in a fluidized bed using high-grade TiO2 feedstock. The purity of TiCl4 after chlorination is subject to the quality of the titanium feedstock. Since impurities in the TiCl4 product are carried through to the final products, purification of the crude TiCl4 is required. The purification process includes fractional distillation and chemical treatment, which depend on the nature of the impurities present and the required quality of the final product. In this study, a thermodynamic analysis of the impurity effect in the chlorination process, which is the first step of the extraction of Ti from TiO2, has been conducted. All thermodynamic calculations were performed using the FactSage thermochemical software.

Keywords: rutile, titanium, chlorination process, impurities, thermodynamic calculation, FactSage

Procedia PDF Downloads 275
15537 Controlling the Process of a Chicken Dressing Plant through Statistical Process Control

Authors: Jasper Kevin C. Dionisio, Denise Mae M. Unsay

Abstract:

In a manufacturing firm, controlling the process ensures that optimum efficiency, productivity, and quality are achieved in the organization. An operation with no standardized procedure yields poor productivity, inefficiency, and an out-of-control process. This study focuses on controlling the small intestine processing of a chicken dressing plant through the use of Statistical Process Control (SPC). Since the operation does not employ a standard procedure and does not have an established standard time, the process, assessed through the observed time of the overall small intestine processing operation using an X-bar R control chart, was found to be out of control. To solve this problem, the researchers conducted a motion and time study aiming to establish a standard procedure for the operation. The normal operator was selected using the Westinghouse Rating System. Instead of utilizing the traditional motion and time study alone, the researchers used the X-bar R control chart to determine the process average that is used for establishing the standard time. The observed times of the normal operator were noted and plotted on the X-bar R control chart. Out-of-control points due to assignable causes were removed, and the process average (the average time the normal operator took to perform the process), now in control and free from outliers, was obtained. The process average was then used to determine the standard time of small intestine processing. As a recommendation, the researchers suggest implementing the established standard time, which is consistent with the standard procedure adopted from the normal operator. With that recommendation, the whole operation is expected to achieve a 45.54% increase in productivity.
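
A minimal sketch of the X-bar R chart computation used above: control limits are derived from subgroup means and ranges using the standard constants for subgroups of size 5, and subgroup means outside the limits are flagged. The cycle-time data and the injected assignable cause are simulated, not the plant's observations.

```python
# X-bar / R control chart on rational subgroups of observed cycle times.
import numpy as np

A2, D3, D4 = 0.577, 0.0, 2.114            # standard control chart constants for n = 5

rng = np.random.default_rng(8)
subgroups = rng.normal(60.0, 4.0, size=(20, 5))   # 20 subgroups of 5 cycle times (s)
subgroups[7] += 15.0                              # simulate an assignable cause

xbar = subgroups.mean(axis=1)
ranges = subgroups.max(axis=1) - subgroups.min(axis=1)
xbar_bar, r_bar = xbar.mean(), ranges.mean()

ucl_x, lcl_x = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar

out_of_control = np.where((xbar > ucl_x) | (xbar < lcl_x))[0]
print("X-bar limits:", round(lcl_x, 2), round(ucl_x, 2))
print("out-of-control subgroups:", out_of_control)
```

Removing the flagged subgroups and recomputing the grand mean gives the in-control process average from which the standard time is then derived.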

Keywords: motion and time study, process controlling, statistical process control, X-Bar R Control chart

Procedia PDF Downloads 178
15536 Self-Tuning Dead-Beat PD Controller for Pitch Angle Control of a Bench-Top Helicopter

Authors: H. Mansor, S.B. Mohd-Noor, N. I. Othman, N. Tazali, R. I. Boby

Abstract:

This paper presents an improved robust Proportional Derivative (PD) controller for a 3-Degree-of-Freedom (3-DOF) bench-top helicopter using an adaptive methodology. A bench-top helicopter is a laboratory-scale helicopter used for experimental purposes, widely used in teaching laboratories and research. A PD controller has been developed for the 3-DOF bench-top helicopter by Quanser. Experiments showed that the transient response of the designed PD controller has a very large steady-state error, i.e., 50%, which is very serious. The objective of this research is to improve the performance of the existing PD pitch angle control on the bench-top helicopter by integrating the PD controller with an adaptive controller. Usually a standard adaptive controller will produce zero steady-state error; however, the response time to reach the desired set point is large. Therefore, this paper proposes an adaptive controller with a deadbeat algorithm to overcome these limitations. An output response that is fast, robust, and updated online is expected. Performance comparisons have been performed between the proposed self-tuning deadbeat PD controller and the standard PD controller. The efficiency of the self-tuning deadbeat controller has been proven from the test results in terms of faster settling time, zero steady-state error, and the capability of the controller to be updated online.

Keywords: adaptive control, deadbeat control, bench-top helicopter, self-tuning control

Procedia PDF Downloads 293
15535 Determining the Width and Depths of Cut in Milling on the Basis of a Multi-Dexel Model

Authors: Jens Friedrich, Matthias A. Gebele, Armin Lechler, Alexander Verl

Abstract:

Chatter vibrations and process instabilities are the most important factors limiting the productivity of the milling process. Chatter can lead to damage of the tool, the part, or the machine tool. Therefore, the estimation and prediction of process stability are very important. The process stability depends on the spindle speed, the depth of cut, and the width of cut. In milling, the process conditions are defined in the NC program. While the spindle speed is directly coded in the NC program, the depth and width of cut are unknown. This paper presents a new simulation-based approach for the prediction of the depth and width of cut of a milling process. The prediction is based on a material removal simulation with an analytically represented tool shape and a multi-dexel approach for the workpiece. The new calculation method allows the direct estimation of the depth and width of cut, which are the influencing parameters of the process stability, instead of the removed volume as existing approaches do. This knowledge can be used to predict the stability of new, unknown parts. Moreover, with an additional vibration sensor, the stability lobe diagram of a milling process can be estimated and improved based on the estimated depth and width of cut.
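
To make the multi-dexel idea concrete, the toy sketch below represents the workpiece as a grid of height values (one dexel per cell), intersects it with a flat end mill at one tool position, and reads off the engaged depth and width of cut before updating the heights. The tool path, radius, grid resolution, and the assumption that the feed is along x are all illustrative choices, not the paper's implementation.

```python
# Toy single-direction dexel model: depth and width of cut at one tool position.
import numpy as np

cell = 0.5                                     # grid resolution, mm
x = np.arange(0, 60, cell)
y = np.arange(0, 40, cell)
X, Y = np.meshgrid(x, y)
heights = np.full(X.shape, 20.0)               # dexel heights of the stock, mm

tool_radius, tool_bottom = 5.0, 17.0           # flat end mill bottom at z = 17 mm
tool_center = (30.0, 20.0)                     # one position along the tool path

in_tool = (X - tool_center[0]) ** 2 + (Y - tool_center[1]) ** 2 <= tool_radius ** 2
engaged = in_tool & (heights > tool_bottom)    # dexels actually being cut

depth_of_cut = (heights[engaged] - tool_bottom).max() if engaged.any() else 0.0
width_of_cut = engaged.any(axis=1).sum() * cell    # extent in y, assuming feed along x
heights[engaged] = tool_bottom                     # update dexels: material removed

print(f"a_p = {depth_of_cut:.1f} mm, a_e = {width_of_cut:.1f} mm")
```

Repeating this intersection along the NC tool path yields depth and width of cut as functions of position, which are exactly the quantities needed to look up the stability lobe diagram.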

Keywords: dexel, process stability, material removal, milling

Procedia PDF Downloads 494