Search results for: aims of dramatherapy process

20041 Aerogel Fabrication Via Modified Rapid Supercritical Extraction (RSCE) Process - Needle Valve Pressure Release

Authors: Haibo Zhao, Thomas Andre, Katherine Avery, Alper Kiziltas, Deborah Mielewski

Abstract:

Silica aerogels were fabricated through a modified rapid supercritical extraction (RSCE) process. The silica aerogels were made using a tetramethyl orthosilicate precursor, then placed in a hot press and brought to the supercritical point of the solvent, ethanol. In order to control the pressure release without a pressure controller, a needle valve was used. The resulting aerogels were then characterized for their physical and chemical properties and compared to silica aerogels created using similar methods. The aerogels fabricated using this modified RSCE method were found to have properties similar to those reported in other papers using the unmodified RSCE method. A silica aerogel-infused glass blanket composite and a graphene-reinforced silica aerogel composite were also successfully fabricated by this new method. The modified RSCE process and system is a prototype for better gas outflow control with a lower cost of equipment setup. Potentially, this process could be evolved into a continuous, low-cost, high-volume production process to meet automotive requirements.

Keywords: aerogel, automotive, rapid supercritical extraction process, low cost production

Procedia PDF Downloads 166
20040 A Survey of 2nd Year Students' Frequent Writing Error and the Effects of Participatory Error Correction Process

Authors: Chaiwat Tantarangsee

Abstract:

The purposes of this study are 1) to study the effects of the participatory error correction process and 2) to find out the students’ satisfaction with such an error correction process. This study is quasi-experimental research with a single group, in which data was collected 5 times preceding and following 4 experimental rounds of the participatory error correction process, including providing coded indirect corrective feedback on the students’ texts together with error treatment activities. The sample comprised 28 second-year English major students of the Faculty of Humanities and Social Sciences, Suan Sunandha Rajabhat University. The tool for the experimental study was the lesson plan of the course Reading and Writing English for Academic Purposes II, and the tools for data collection were 5 writing tests of short texts and a questionnaire. Based on formative evaluation of the students’ writing ability prior to and after each of the 4 experiments, the research findings disclose that the students obtained higher scores, with a statistically significant difference at the 0.05 level. Moreover, in terms of the effect size of such a process, it is found that for the means of the students’ scores prior to and after the 4 experiments, d equals 1.0046, 1.1374, 1.297, and 1.0065, respectively. It can be concluded that the participatory error correction process enables all of the students to learn equally well, and there is improvement in their ability to write short texts. Finally, the students’ overall satisfaction with the participatory error correction process is at a high level (mean = 4.32, S.D. = 0.92).
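
The effect sizes reported above are of the pre/post kind; a minimal sketch of computing such an effect size is given below. The scores and the choice of standardizer (the standard deviation of the paired differences) are illustrative assumptions, since the abstract does not state which variant of d was used.

```python
import numpy as np

def cohens_d_paired(pre, post):
    """Effect size for paired pre/post scores: mean difference / SD of the differences."""
    diff = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
    return diff.mean() / diff.std(ddof=1)

# Illustrative scores for a small group (not the study's data)
pre_scores = [12, 14, 10, 15, 13, 11]
post_scores = [16, 17, 14, 18, 15, 14]
print(round(cohens_d_paired(pre_scores, post_scores), 4))
```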

Keywords: coded indirect corrective feedback, participatory error correction process, error treatment, humanities and social sciences

Procedia PDF Downloads 501
20039 Performance Evaluation of Production Schedules Based on Process Mining

Authors: Kwan Hee Han

Abstract:

The external environment of enterprises is changing rapidly, driven mainly by global competition, cost-reduction pressures, and new technology. In this situation, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies, this task is difficult because it must efficiently utilize resource capacity under the careful consideration of many interacting constraints. At present, many computerized software solutions are utilized in enterprises to generate realistic production schedules and overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining has only recently emerged as a sub-discipline of both data mining and business process management. Process mining techniques enable the useful analysis of a wide variety of processes, such as process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of the production scheduling software system using process mining techniques, since every software system generates event logs for further uses such as security investigation, auditing, and debugging. An application of the process mining approach is proposed for validating the goodness of production schedules generated by scheduling software systems. Using process mining techniques, major evaluation criteria such as workstation utilization, the existence of bottleneck workstations, critical process route patterns, and the work load balance of each machine over time are measured, and finally, the goodness of the production schedule is evaluated. By using the proposed process mining approach for evaluating the performance of a generated production schedule, the quality of production schedules in manufacturing enterprises can be improved.
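
To illustrate the kind of event-log analysis described here, the sketch below computes workstation utilization and flags the busiest workstation from a toy event log. The field names, records, and time horizon are assumptions for illustration; the study's actual evaluation mines the logs of a production scheduling system.

```python
from collections import defaultdict

# Toy event log: one record per executed operation (times in minutes)
event_log = [
    {"case": "order1", "workstation": "WS1", "start": 0,  "end": 30},
    {"case": "order1", "workstation": "WS2", "start": 35, "end": 80},
    {"case": "order2", "workstation": "WS1", "start": 30, "end": 70},
    {"case": "order2", "workstation": "WS2", "start": 80, "end": 110},
]
horizon = 120  # scheduling horizon in minutes (assumed)

busy = defaultdict(float)
for e in event_log:
    busy[e["workstation"]] += e["end"] - e["start"]

utilization = {ws: t / horizon for ws, t in busy.items()}
bottleneck = max(utilization, key=utilization.get)
print(utilization)            # e.g. {'WS1': 0.58, 'WS2': 0.63}
print("bottleneck:", bottleneck)
```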

Keywords: data mining, event log, process mining, production scheduling

Procedia PDF Downloads 261
20038 Multinomial Dirichlet Gaussian Process Model for Classification of Multidimensional Data

Authors: Wanhyun Cho, Soonja Kang, Sanggoon Kim, Soonyoung Park

Abstract:

We present a probabilistic multinomial Dirichlet classification model for multidimensional data with Gaussian process priors. We consider an efficient computational method that can be used to obtain the approximate posteriors for the latent variables and parameters needed to define the multiclass Gaussian process classification model. We first investigate the process of inducing a posterior distribution for the various parameters and the latent function by using variational Bayesian approximations and an importance sampling method, and we then derive the predictive distribution of the latent function needed to classify new samples. The proposed model is applied to classify a synthetic multivariate dataset in order to verify its performance. Experimental results show that our model is more accurate than the other approximation methods.
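
A minimal sketch of the generative side of such a model is shown below: one latent function per class is drawn from a Gaussian process prior with an RBF kernel and mapped to class probabilities through a softmax link. The kernel hyperparameters and inputs are illustrative assumptions, and the variational Bayesian / importance sampling inference described in the abstract is not reproduced here.

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between all pairs of rows of X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))            # multidimensional inputs (illustrative)
K = rbf_kernel(X) + 1e-6 * np.eye(50)   # GP prior covariance (jitter for stability)

n_classes = 3
# One latent function per class, drawn from the GP prior
F = rng.multivariate_normal(np.zeros(50), K, size=n_classes).T  # shape (50, 3)

# Softmax link: latent functions -> class probabilities
P = np.exp(F - F.max(axis=1, keepdims=True))
P /= P.sum(axis=1, keepdims=True)
labels = P.argmax(axis=1)
print(labels[:10])
```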

Keywords: multinomial Dirichlet classification model, Gaussian process priors, variational Bayesian approximation, importance sampling, approximate posterior distribution, marginal likelihood evidence

Procedia PDF Downloads 418
20037 National Plans for Recovery and Resilience between National Recovery and EU Cohesion Objectives: Insights from European Countries

Authors: Arbolino Roberta, Boffardi Raffaele

Abstract:

Achieving the highest effectiveness for the National Plans for Recovery and Resilience (NPRR) while strengthening the objectives of cohesion and the reduction of intra-EU imbalances is only possible by means of strategic, coordinated, and coherent policy planning. Therefore, the present research aims at assessing and quantifying the potential impact of the NPRRs across the twenty-seven European Member States in terms of economic convergence, considering disaggregated data on the industrial, construction, and service sectors. The first step of the research involves a performance analysis of the main macroeconomic indicators describing the trends of the twenty-seven EU economies before the pandemic outbreak. Subsequently, in order to define the potential effect of the resources allocated, we perform an impact analysis of previous, similar EU investment policies, estimating national-level sectoral elasticities associated with the expenditure of the 2007-2013 and 2014-2020 Cohesion programme funds. These coefficients are then exploited to construct adjustment scenarios. Finally, convergence analysis is performed on the data used for constructing the scenarios in order to understand whether the expenditure of funds might be useful to foster economic convergence besides driving recovery. The results of our analysis show that the allocation of resources largely mirrors the aims of the policy framework underlying the NPRR, thus reporting the largest investments in both the sectors most affected by the economic shock (services) and those considered fundamental for the digital and green transition. Notwithstanding an overall positive effect, large differences exist among European countries, while no convergence process seems to be activated or fostered by these interventions.
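
One simple check for the kind of convergence discussed here is sigma-convergence: a decline over time in the dispersion of per-capita output across countries. The sketch below computes the coefficient of variation per year for an invented panel; the figures are purely illustrative and are not the study's data.

```python
import numpy as np

# Illustrative per-capita output panel: rows = years, columns = countries
years = [2019, 2022, 2026]
panel = np.array([
    [28.0, 41.0, 19.0, 33.0],   # pre-shock
    [26.5, 39.0, 18.5, 31.0],   # pandemic shock
    [30.0, 43.5, 22.5, 35.5],   # post-NPRR scenario
])

# Coefficient of variation per year; a falling value indicates sigma-convergence
cv = panel.std(axis=1, ddof=1) / panel.mean(axis=1)
for y, c in zip(years, cv):
    print(y, round(c, 3))
```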

Keywords: NPRR, policy evaluation, cohesion policy, scenario analysis

Procedia PDF Downloads 61
20036 Simulation Study of Asphaltene Deposition and Solubility of CO2 in the Brine during Cyclic CO2 Injection Process in Unconventional Tight Reservoirs

Authors: Rashid S. Mohammad, Shicheng Zhang, Sun Lu, Syed Jamal-Ud-Din, Xinzhe Zhao

Abstract:

A compositional reservoir simulation model (CMG-GEM) was used to model the cyclic CO2 injection process in an unconventional tight reservoir. Cyclic CO2 injection is an enhanced oil recovery process consisting of injection, shut-in, and production. The study of cyclic CO2 injection and hydrocarbon recovery in ultra-low permeability reservoirs is mainly a function of rock, fluid, and operational parameters. CMG-GEM was used to study several design parameters of the cyclic CO2 injection process in order to distinguish the parameters with the maximum effect on oil recovery and to comprehend the behavior of cyclic CO2 injection in tight reservoirs. On the other hand, permeability reduction induced by asphaltene precipitation is one of the major issues in the oil industry due to the plugging of the porous media, which reduces oil productivity. In addition to asphaltene deposition, solubility of CO2 in the aquifer is one of the safest and most permanent trapping techniques when considering CO2 storage mechanisms in geological formations. However, the effects of the above uncertain parameters on the process of CO2 enhanced oil recovery have not been understood systematically. Hence, it is absolutely necessary to study the most significant parameters which dominate the process. The main objective of this study is to improve techniques for designing the cyclic CO2 injection process while considering the effects of asphaltene deposition and the solubility of CO2 in the brine, in order to prevent asphaltene precipitation, minimize CO2 emission, optimize cyclic CO2 injection, and maximize oil production.

Keywords: tight reservoirs, cyclic CO₂ injection, asphaltene, solubility, reservoir simulation

Procedia PDF Downloads 364
20035 An Integrated Approach for Risk Management of Transportation of HAZMAT: Use of Quality Function Deployment and Risk Assessment

Authors: Guldana Zhigerbayeva, Ming Yang

Abstract:

Transportation of hazardous materials (HAZMAT) is inevitable in the process industries. Statistics show that a significant number of accidents have occurred during the transportation of HAZMAT. This makes risk management of HAZMAT transportation an important topic. Tree-based methods, including fault trees, event trees, and cause-consequence analysis, as well as Bayesian networks, have been applied to risk management of HAZMAT transportation. However, there is limited work on the development of a systematic approach. The existing approaches fail to build linkages between the regulatory requirements and the development of safety measures. Analysis of historical data from past accident report databases would limit our focus to specific incidents and their specific causes. Thus, we may overlook some essential elements in risk management, including regulatory compliance, field expert opinions, and suggestions. A systematic approach is needed to translate the regulatory requirements of HAZMAT transportation into specified safety measures (both technical and administrative) to support the risk management process. This study first adapts the House of Quality (HoQ) into a House of Safety (HoS) and proposes a new approach, Safety Function Deployment (SFD). The results of SFD will be used in a multi-criteria decision-support system to find an optimal route for HAZMAT transportation. The proposed approach will be demonstrated through a hypothetical transportation case in Kazakhstan.
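
As a rough illustration of how HoS-derived weights could feed the multi-criteria route selection, the sketch below scores candidate routes with a simple weighted sum. The criteria, weights, and route values are hypothetical; the actual SFD matrices and decision-support system are not reproduced.

```python
# Hypothetical criteria weights (e.g., derived from the House of Safety), normalized to 1
weights = {"population_exposure": 0.4, "accident_rate": 0.35, "travel_time": 0.25}

# Candidate routes with criteria already normalized to [0, 1]; lower is better
routes = {
    "route_A": {"population_exposure": 0.2, "accident_rate": 0.5, "travel_time": 0.8},
    "route_B": {"population_exposure": 0.6, "accident_rate": 0.3, "travel_time": 0.4},
}

scores = {
    name: sum(weights[c] * value for c, value in criteria.items())
    for name, criteria in routes.items()
}
best = min(scores, key=scores.get)  # lowest weighted risk score wins
print(scores, "->", best)
```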

Keywords: hazardous materials, risk assessment, risk management, quality function deployment

Procedia PDF Downloads 120
20034 Analysis of Collision Avoidance System

Authors: N. Gayathri Devi, K. Batri

Abstract:

The advent of technology has increased traffic hazards, and road accidents take place frequently. A collision detection system in an automobile aims at reducing or mitigating the severity of an accident. This project aims at avoiding vehicle head-on collisions by means of a collision detection algorithm. The algorithm predicts the collision, and the avoidance or minimization has to be carried out within a few seconds of confirmation. Under critical situations, collision minimization is made possible by turning the vehicle through the desired turn radius so that the collision impact can be reduced. In order to avoid the collision completely, the turn should be executed at reduced speed so as to maintain stability.
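
The time-to-collision quantity referred to in the keywords can be sketched as the current gap divided by the closing speed. The threshold and vehicle states below are assumptions for illustration, not the project's actual parameters.

```python
def time_to_collision(gap_m, own_speed_mps, other_speed_mps):
    """Gap divided by closing speed; returns None when the vehicles are not closing."""
    closing_speed = own_speed_mps + other_speed_mps  # head-on case: speeds add
    if closing_speed <= 0:
        return None
    return gap_m / closing_speed

ttc = time_to_collision(gap_m=80.0, own_speed_mps=16.7, other_speed_mps=13.9)
print(round(ttc, 2), "s")            # about 2.6 s
if ttc is not None and ttc < 3.0:    # assumed decision threshold
    print("initiate avoidance: slow down and steer to the planned turn radius")
```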

Keywords: collision avoidance system, time to collision, time to turn, turn radius

Procedia PDF Downloads 530
20033 Defect Management Life Cycle Process for Software Quality Improvement

Authors: Aedah Abd Rahman, Nurdatillah Hasim

Abstract:

Software quality issues require special attention, especially in view of the demand for quality software products that meet customer satisfaction. Software development projects in most organisations need a proper defect management process in order to produce high quality software products and reduce the number of defects. The research question of this study is how to produce high quality software while reducing the number of defects. Therefore, the objective of this paper is to provide a framework for managing software defects by following defined life cycle processes. The methodology starts by reviewing defects, defect models, best practices, and standards. A framework for the defect management life cycle is proposed. The major contribution of this study is to define a defect management road map in software development. The adoption of an effective defect management process helps to achieve the ultimate goal of producing high quality software products and contributes towards continuous software process improvement.

Keywords: defects, defect management, life cycle process, software quality

Procedia PDF Downloads 285
20032 Carbon Nanocomposites: Structure, Characterization and Environmental Application

Authors: Bensacia Nabila, Hadj-Ziane Amel, Sefah Karima

Abstract:

Carbon nanocomposites have received increasing attention in recent years in view of their special properties such as low density, high specific surface area, and thermal and mechanical stability. Taking into account the importance of these materials, many studies aimed at improving the synthesis process have been conducted. However, the presence of impurities can significantly affect the properties of these materials, and the characterization of these compounds is an important challenge to assure the quality of the new carbon nanocomposites. The present study aims to develop a new recyclable decontaminating material for dye removal. This new material consists of an active element based on carbon nanotubes wrapped in a microcapsule of iron oxide. The adsorbent was characterized by transmission electron microscopy and X-ray diffraction, and the surface area was measured by the BET method.

Keywords: carbon nanocomposite, chitosan, elimination, dyes

Procedia PDF Downloads 297
20031 Treadmill Negotiation: The Stagnation of the Israeli – Palestinian Peace Process

Authors: Itai Kohavi, Wojciech Nowiak

Abstract:

This article explores the stagnation of the Israeli-Palestinian peace negotiation process and the reasons behind the failure of more than 12 international initiatives to resolve the conflict. Twenty-seven top members of the Israeli national security elite (INSE) were interviewed, including heads of the negotiation teams, the National Security Council, the Mossad, and other intelligence and planning arms. The interviewees provided their insights on the Israeli challenges in reaching a sustainable and stable peace agreement and in dealing with the international pressure on Israel to negotiate a peace agreement while preventing anti-Israeli UN decisions and sanctions. The findings revealed a decision tree, with red herring deception strategies implemented to postpone the negotiation process and to delay major decisions during the negotiation process. Beyond the possible applications to the Israeli-Palestinian conflict, the findings shed more light on the phenomenon of rational deception of allies in a negotiation process, a subject less frequently researched compared with deception of rivals.

Keywords: deception, Israeli-Palestinian conflict, negotiation, red herring, terrorist state, treadmill negotiation

Procedia PDF Downloads 284
20030 The Liberal Tension of the Adversarial Criminal Procedure

Authors: Benjamin Newman

Abstract:

The picture of an adverse contest between two parties has often been used as an archetypal description of the Anglo-American adversarial criminal trial. In actuality, however, guilty pleas and plea bargains have dominated the procedure for over the last half-century. Characterised by two adverse parties, the court adjudicative system in the Anglo-American world adheres to the adversarial procedure, and while further features have been attributed to it and the values embedded within the procedure vary, it is a system for which we have no adequate theory. Damaska argued that the adversarial conflict-resolution mode of administration of justice stems from a liberal laissez-faire concept of a value-neutral liberal state. That said, the court’s neutrality has additionally been rationalised in light of its liberal end as a safeguard from the state’s coercive force. Both conceptions of the court’s neutrality conflict in cases where the by-standing role disposes of its liberal duty of safeguarding the individual. Such is noticeable in plea bargains, where the defendant has the liberty to plead guilty, despite concerns over wrongful convictions and deprivation of liberty. It is an inner liberal tension within the notion of criminal adversarialism, between the laissez-faire mode which grants autonomy to the parties and the safeguarding liberal end of the trial. Langbein asserted that the adversarial system is a criminal procedure for which we have no adequate theory, and it is by reference to political and moral theories that this research aims to articulate a normative account. The paper reflects on the above liberal tension and, by reference to Duff’s ‘calling-to-account’ theory, argues that autonomy is of inherent value to the criminal process, being considered a constitutive element in the process of being called to account. While the aspiration is that the defendant’s guilty plea should be genuine, the guilty-plea decision must be voluntary if it is to be considered a performative act of accountability. Thus, by valuing procedural autonomy as a necessary element within the criminal adjudicative process, the process assimilates a liberal procedure whilst maintaining the liberal end by holding the defendant to account.

Keywords: liberal theory, adversarial criminal procedure, criminal law theory, liberal perfectionism, political liberalism

Procedia PDF Downloads 72
20029 Distribution-Free Exponentially Weighted Moving Average Control Charts for Monitoring Process Variability

Authors: Chen-Fang Tsai, Shin-Li Lu

Abstract:

Distribution-free control charts have been an emerging area of statistical process control in recent years. Several researchers have developed various nonparametric control charts and investigated the detection capability of these charts. The major advantage of nonparametric control charts is that the underlying process is not required to satisfy the assumption of normality or any specific parametric distribution. In this paper, two nonparametric exponentially weighted moving average (EWMA) control charts based on nonparametric tests, namely the NE-S and NE-M control charts, are proposed for monitoring process variability. They are further extended to generally weighted moving average (GWMA) control charts by utilizing design and adjustment parameters for monitoring changes in process variability, namely the NG-S and NG-M control charts. Statistical performance is also investigated for the NG-S and NG-M control charts with run rules. Moreover, sensitivity analysis is performed to show the effects of the design parameters on the nonparametric NG-S and NG-M control charts.
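
For reference, all of these charts build on the classical EWMA recursion z_t = λx_t + (1 - λ)z_(t-1) with time-varying control limits around the target. The sketch below implements that parametric baseline; the smoothing constant, target, and data are illustrative, and the nonparametric NE-S/NE-M and GWMA variants proposed in the paper are not reproduced.

```python
import numpy as np

def ewma_chart(x, mu0, sigma0, lam=0.2, L=3.0):
    """Classical EWMA statistic with time-varying control limits."""
    x = np.asarray(x, dtype=float)
    z = np.empty_like(x)
    prev = mu0
    for t, xt in enumerate(x):
        prev = lam * xt + (1.0 - lam) * prev
        z[t] = prev
    t = np.arange(1, len(x) + 1)
    half_width = L * sigma0 * np.sqrt(lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * t)))
    return z, mu0 + half_width, mu0 - half_width

data = [10.1, 9.8, 10.3, 10.0, 11.2, 11.5, 11.9]   # illustrative observations
z, ucl, lcl = ewma_chart(data, mu0=10.0, sigma0=0.5)
for zi, u, l in zip(z, ucl, lcl):
    print(f"z={zi:.2f}  in-control={l <= zi <= u}")
```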

Keywords: distribution-free control chart, EWMA control charts, GWMA control charts

Procedia PDF Downloads 249
20028 A System for Visual Management of Research Resources Focusing on Accumulation of Polish Processes

Authors: H. Anzai, H. Nakayama, H. Kaminaga, Y. Morimoto, Y. Miyadera, S. Nakamura

Abstract:

Various research resources, such as papers and presentation slides, are handled in the course of research activities. It is extremely important for the smooth progress of research to skillfully manage those research resources and utilize them for further investigations. However, the number of research resources keeps increasing. Moreover, each kind of research resource differs in its usage and accumulation style. So, it is actually difficult to satisfactorily manage and use the accumulated research resources. The resulting lack of tidiness of the resources causes problems such as overlooking an issue that should be polished. Although there have been research projects on support for the management of research resources and for the sharing of know-how, most existing systems have not been effective enough, since these systems have not sufficiently considered the polish process. This paper mainly describes a system that enables the strategic management of research resources together with the polish process and their practical use.

Keywords: research resource, polish process, information sharing, knowledge management, information visualization

Procedia PDF Downloads 371
20027 Role of Process Parameters on Pocket Milling with Abrasive Water Jet Machining Technique

Authors: T. V. K. Gupta, J. Ramkumar, Puneet Tandon, N. S. Vyas

Abstract:

Abrasive Water Jet Machining (AWJM) is an unconventional machining process well known for machining hard-to-cut materials. The primary research focus for the process has been on through-cutting, and very limited literature is available on pocket milling using AWJM. The present work is an attempt to use this process for milling applications considering a set of various process parameters. Four different input parameters, which were considered by researchers for part separation, are selected for the above application, i.e., abrasive size, flow rate, standoff distance, and traverse speed. Pockets of definite size are machined to investigate surface roughness, material removal rate, and pocket depth. Based on the data available through experiments on SS304 material, it is observed that higher traverse speeds give a better finish because of the reduction in particle energy density, and a lower depth is also observed. Increases in standoff distance and abrasive flow rate reduce the rate of material removal, as the jet loses its focus and collisions occur among the particles. ANOVA for each individual output parameter has been studied to identify the significant process parameters.

Keywords: abrasive flow rate, surface finish, abrasive size, standoff distance, traverse speed

Procedia PDF Downloads 281
20026 Influence of Ligature Tightening on Bone Fracture Risk in Interspinous Process Surgery

Authors: Dae Kyung Choi, Won Man Park, Kyungsoo Kim, Yoon Hyuk Kim

Abstract:

Interspinous process devices have recently been used due to their advantages such as minimal invasiveness and less subsidence of the implant into osteoporotic bone. In this paper, we have analyzed the influence of ligature tightening of several interspinous process devices using finite element analysis. Four types of interspinous process implants were inserted into the L3-4 spinal motion segment based on their surgical protocols. The inferior plane of the L4 vertebra was fixed, and an extension moment of 7.5 Nm was applied on the superior plane of the L3 vertebra with a 400 N compressive load along the follower load direction and pretension of the ligature. The stability of the spinal unit was higher than that of the intact model. A higher pretension value in the ligature led to a decrease of the dynamic stabilization effect in the cases of the WallisTM, DiamTM, Viking, and Spear®. The results of the present study could be used to evaluate surgical options and validate the biomechanical characteristics of spinal implants.

Keywords: interspinous process device, bone fracture risk, lumbar spine, finite element analysis

Procedia PDF Downloads 386
20025 An Innovative Use of Flow Columns in Electrocoagulation Reactor to Control Water Temperature

Authors: Khalid S. Hashim, Andy Shaw, Rafid Alkhaddar, David Phipps, Ortoneda Pedrola

Abstract:

Temperature is an essential parameter in the electrocoagulation (EC) process, as it governs the solubility of the electrodes and the precipitates and the collision rate of particles in the water being treated. Although it has been about 100 years since EC technology was invented and applied in water and wastewater treatment, the effects of temperature on its performance have been insufficiently investigated. Thus, the present project aims to fill this gap by an innovative use of perforated flow columns in the design of a new EC reactor (ECR1). The new reactor (ECR1) consisted of a Perspex cylindrical container fitted with a flow column composed of perforated discoid electrodes made from aluminium. The flow column was installed vertically, half submerged in the water being treated, inside a plastic cylinder. The unsubmerged part of the flow column works as a radiator for the water being treated. In order to investigate the performance of ECR1, water samples with different initial temperatures (15, 20, 25, 30, and 35 °C) were fed to the ECR1 for 20 min. The temperature of the effluent water samples was measured using a Hanna meter (Model: HI 98130). The obtained results demonstrated that the ECR1 reduced the water temperature from 35, 30, and 25 °C to 24.6, 23.8, and 21.8 °C, respectively. Meanwhile, the low water temperature of 15 °C increased slowly to reach 19.1 °C after 15 minutes and kept the same level until the end of the treatment period. At the same time, the water sample with an initial temperature of 20 °C showed an almost steady temperature throughout the treatment process, increasing negligibly from 20 to 20.1 °C after 20 minutes of treatment. In conclusion, ECR1 is able to keep the temperature of the water being treated around room temperature even when the initial temperature is high (35 °C) or low (15 °C).

Keywords: electrocoagulation, flow column, treatment, water temperature

Procedia PDF Downloads 407
20024 Argentine Immigrant Policy: A Qualitative Analysis of Changes and Trends from 2016 on

Authors: Romeu Bonk Mesquita

Abstract:

Argentina is the number one destination country in South America for intraregional migration flows. This research aims to shed light on the main trends of Argentine immigrant policy from 2016 on, when Mauricio Macri was elected President, taking the approval of the current Ley de Migraciones (2003), which is fairly protective of human rights, as an analytical starting point. Foreign Policy Analysis (FPA) serves as the theoretical background, highlighting decision-making processes and institutional designs that encourage or constrain political and social actors. The analysis goes through the domestic and international levels, observing how immigration policy is formulated as a public policy and is simultaneously connected to Mercosur and other international organizations, such as the International Organization for Migration (IOM) and the United Nations High Commissioner for Refugees (UNHCR). Thus, the study revolves around the Direccion Nacional de Migraciones, the state agency in charge of executing the country’s immigrant policy, so as to comprehend how its internal processes and its connections with both domestic and international institutions shape Argentina’s immigrant policy formulation and execution. It also aims to locate the migration agenda within the country’s contemporary social and political context. The methodology is qualitative, case-based, and oriented by process-tracing techniques. Empirical evidence gathered includes official documents and data, media coverage, and interviews with key informants. Recent events, such as the Decreto de Necesidad y Urgencia 70/2017 issued by President Macri and the return of a discursive association between migration and criminality, indicate a trend toward nationalization and securitization of immigration policy in contemporary Argentina.

Keywords: Argentine foreign policy, human rights, immigrant policy, Mercosur

Procedia PDF Downloads 144
20023 Intelligent Ambulance with Advance Features of Traffic Management and Telecommunication

Authors: Mamatha M. N.

Abstract:

Traffic problems, congested traffic, and flow management are recognized as major problems in almost all areas, and they cause difficulties for ambulances carrying emergency patients. The proposed paper aims at the development of an ambulance that reaches the nearby hospital faster even in heavy traffic scenarios. This is achieved by implementing hardware in the ambulance as well as at the traffic posts, thus allowing a smooth flow for the ambulance to reach the hospital in time. The system comprises: 1) the design of the vehicle to provide communication between the ambulance and the traffic posts; 2) an electronic health record with a data acquisition system; and 3) telemetry of the acquired biological parameters to the nearest hospital. Thus, by interfacing these three different modules and integrating them in the ambulance, the vehicle can reach the hospital earlier than the present ambulance. The system has an accuracy and efficiency of 99.8%.

Keywords: bio-telemetry, data acquisition, patient database, automatic traffic control

Procedia PDF Downloads 294
20022 Study of Acoustic Resonance of Model Liquid Rocket Combustion Chamber and Its Suppression

Authors: Vimal O. Kumar, C. K. Muthukumaran, P. Rakesh

Abstract:

The liquid rocket engine (LRE) combustion chamber is subjected to pressure oscillation during the combustion process. Combustion noise (acoustic noise) is a broadband, small-amplitude, high-frequency pressure oscillation. It constitutes only a minor fraction (< 1%) of the entire combustion process. However, this high-frequency oscillation is a huge concern during the design phase of the LRE combustion chamber, as it could cause catastrophic failure of the chamber. Depending on the chamber geometry, certain frequencies form standing wave patterns, and they resonate with high amplitude and are known as eigenmodes. These eigenmodes could cause failures unless they are suppressed to within safe limits. These modes are categorized into radial, tangential, and azimuthal modes, and their structure inside the combustion chamber is of interest to researchers. In the present proposal, experimental as well as numerical simulation will be performed to obtain the frequency-amplitude characteristics of the model combustion chamber for different baffle configurations. The main objective of this study is to find the baffle configuration that provides better suppression of the acoustic modes. The experimental study aims at measuring the frequency-amplitude characteristics at certain points on the chamber wall. The experimental measurements will also be used for the scheme used in the numerical simulation. In addition to the experiments, the numerical simulation will provide the detailed structure of the eigenmodes exhibited and their level of suppression with the aid of different baffle configurations.
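
For an idealized closed cylindrical chamber, the transverse (tangential and radial) eigenfrequencies follow f = c·α'_mn/(2πR), where α'_mn is a zero of the derivative of the Bessel function J_m, and the longitudinal modes follow f = q·c/(2L). The sketch below evaluates the first few of these for assumed chamber dimensions and sound speed; it is a textbook estimate, not the model chamber studied in the paper.

```python
import numpy as np
from scipy.special import jnp_zeros

c = 1000.0   # speed of sound in the chamber gas, m/s (assumed)
R = 0.10     # chamber radius, m (assumed)
L = 0.40     # chamber length, m (assumed)

# First/second tangential (m = 1, 2) and first radial (m = 0) transverse modes
for m, label in [(1, "1st tangential"), (2, "2nd tangential"), (0, "1st radial")]:
    alpha = jnp_zeros(m, 1)[0]            # first positive zero of J_m'
    print(f"{label}: {c * alpha / (2 * np.pi * R):.0f} Hz")

# First two longitudinal modes
for q in (1, 2):
    print(f"longitudinal mode q={q}: {q * c / (2 * L):.0f} Hz")
```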

Keywords: baffle, instability, liquid rocket engine, pressure response of chamber

Procedia PDF Downloads 108
20021 Utilizing Reflection as a Tool for Experiential Learning through a Simulated Activity

Authors: Nadira Zaidi

Abstract:

The aim of this study is to gain direct feedback from interviewees in a simulated interview process. Reflection based on qualitative data analysis was utilized through the Gibbs Reflective Cycle, with 30 undergraduate students as respondents. The respondents reflected on the positive and negative aspects of this active learning process in order to improve their performance in actual job interviews. Results indicate that students engaged in the process successfully imbibed the feedback that they received from the interviewers and also identified the areas that needed improvement.

Keywords: experiential learning, positive and negative impact, reflection, simulated

Procedia PDF Downloads 123
20020 Optimization of End Milling Process Parameters for Minimization of Surface Roughness of AISI D2 Steel

Authors: Pankaj Chandna, Dinesh Kumar

Abstract:

The present work analyses different parameters of end milling to minimize the surface roughness of AISI D2 steel. D2 steel is generally used for stamping or forming dies, punches, forming rolls, knives, slitters, shear blades, tools, scrap choppers, tyre shredders, etc. Surface roughness is one of the main indices that determine the quality of machined products and is influenced by various cutting parameters. In machining operations, achieving the desired surface quality by optimization of machining parameters is a challenging job. In the case of mating components, the surface roughness becomes more essential and is influenced by the cutting parameters, because these quality characteristics are highly correlated and are expected to be influenced directly or indirectly by the process parameters or their interactive effects (i.e., the process environment). In this work, the effects of the selected process parameters on surface roughness, and the subsequent setting of the parameters at their levels, have been accomplished by Taguchi’s parameter design approach. The experiments have been performed as per the combinations of levels of the different process parameters suggested by the L9 orthogonal array. The end milling of AISI D2 steel with a carbide tool has been investigated experimentally by varying feed, speed, and depth of cut, and the surface roughness has been measured using a surface roughness tester. Analysis of variance has been performed for the mean and the signal-to-noise ratio to estimate the contribution of the different process parameters to the process.
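
Because surface roughness is a smaller-the-better response, the signal-to-noise ratio used in the Taguchi analysis takes the form S/N = -10·log10((1/n)·Σ y_i²). A minimal sketch for part of an L9-style layout is shown below; the roughness readings and factor levels are invented for illustration, not the experimental data.

```python
import math

def sn_smaller_the_better(measurements):
    """Taguchi S/N ratio for a smaller-the-better response such as surface roughness."""
    n = len(measurements)
    return -10.0 * math.log10(sum(y * y for y in measurements) / n)

# Illustrative: (speed level, feed level, depth level) -> repeated Ra readings in micrometres
runs = {
    (1, 1, 1): [1.42, 1.38],
    (1, 2, 2): [1.10, 1.15],
    (2, 1, 2): [0.96, 0.99],
}
for levels, ra in runs.items():
    print(levels, round(sn_smaller_the_better(ra), 2), "dB")
```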

Keywords: D2 steel, orthogonal array, optimization, surface roughness, Taguchi methodology

Procedia PDF Downloads 526
20019 Process Assessment Model for Process Capability Determination Based on ISO/IEC 20000-1:2011

Authors: Harvard Najoan, Sarwono Sutikno, Yusep Rosmansyah

Abstract:

Most enterprises now use information technology services as assets to support their business objectives. These kinds of services are provided by an internal service provider (inside the enterprise) or an external service provider (outside the enterprise). To deliver quality information technology services, the service provider (which from now on will be called the ‘organization’), whether internal or external, must have a standard for its service management system. At present, the standard recognized as best practice for an organization’s service management system is the international standard ISO/IEC 20000:2011. The most important part of this international standard is the first part, ISO/IEC 20000-1:2011 Service Management System Requirements, because it contains 22 organizational processes required to be implemented in an organizational environment in order to build, manage, and deliver quality services to the customer. Assessing the organization’s management processes is the first step in implementing ISO/IEC 20000:2011 into the organization’s management processes. This assessment needs a Process Assessment Model (PAM) as an assessment instrument. PAM comprises two parts: a Process Reference Model (PRM) and a Measurement Framework (MF). The PRM is built by transforming the 22 processes of ISO/IEC 20000-1:2011, and the MF is based on ISO/IEC 33020. This assessment instrument was designed to assess the capability of the service management processes in Divisi Teknologi dan Sistem Informasi (Information Systems and Technology Division), an internal organization of PT Pos Indonesia. The results of this assessment model can be used to propose improvements to the capability of the service management system.

Keywords: ISO/IEC 20000-1:2011, ISO/IEC 33020:2015, process assessment, process capability, service management system

Procedia PDF Downloads 445
20018 Enhance Concurrent Design Approach through a Design Methodology Based on an Artificial Intelligence Framework: Guiding Group Decision Making to Balanced Preliminary Design Solution

Authors: Loris Franchi, Daniele Calvi, Sabrina Corpino

Abstract:

This paper presents a design methodology in which stakeholders are assisted with the exploration of a so-called negotiation space, aiming at the maximization of both group social welfare and each stakeholder’s perceived utility. The outcome is fewer design iterations needed for design convergence while obtaining a higher solution effectiveness. During the early stage of a space project, not only the knowledge about the system but also the decision outcomes are often unknown. The scenario is exacerbated by the fact that decisions taken in this stage imply delayed costs associated with them. Hence, it is necessary to have a clear definition of the problem under analysis, especially in the initial definition. This can be obtained thanks to a robust generation and exploration of design alternatives. This process must consider that design usually involves various individuals, who take decisions affecting one another. Effective coordination among these decision-makers is critical. Finding a mutually agreed solution will reduce the iterations involved in the design process. To handle this scenario, the paper proposes a design methodology which aims to speed up the process of raising the mission’s concept maturity level. This push is obtained thanks to a guided negotiation space exploration, which involves autonomous exploration and optimization of trade opportunities among stakeholders via artificial intelligence algorithms. The negotiation space is generated via a multidisciplinary collaborative optimization method, infused with game theory and multi-attribute utility theory. In particular, game theory is able to model the negotiation process to reach the equilibria among stakeholder needs. Because of the huge dimension of the negotiation space, a collaborative optimization framework with an evolutionary algorithm has been integrated in order to guide the game process to search efficiently and rapidly for the Pareto equilibria among stakeholders. Finally, the concept of utility constitutes the mechanism to bridge the language barrier between experts of different backgrounds and differing needs, using the elicited and modeled needs to evaluate a multitude of alternatives. To highlight the benefits of the proposed methodology, the paper presents the design of a CubeSat mission for the observation of the lunar radiation environment. The derived solution is able to balance all stakeholders’ needs and to guarantee the effectiveness of the selected mission concept thanks to its robustness to change. The benefits provided by the proposed design methodology are highlighted, and further developments are proposed.
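
The core of the guided exploration, keeping only non-dominated trades among stakeholder utilities, can be sketched with a simple Pareto filter. The candidate designs and utility values below are invented for illustration; the paper's evolutionary search, game-theoretic equilibrium model, and utility elicitation are not reproduced.

```python
# Each candidate design scored by the (normalized) utility of three hypothetical stakeholders
candidates = {
    "design_A": (0.9, 0.4, 0.6),
    "design_B": (0.7, 0.7, 0.7),
    "design_C": (0.5, 0.6, 0.6),   # dominated by design_B
    "design_D": (0.6, 0.9, 0.5),
}

def dominates(u, v):
    """u dominates v if it is at least as good everywhere and strictly better somewhere."""
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

pareto_front = [
    name for name, u in candidates.items()
    if not any(dominates(v, u) for other, v in candidates.items() if other != name)
]
print(pareto_front)   # ['design_A', 'design_B', 'design_D']
```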

Keywords: concurrent engineering, artificial intelligence, negotiation in engineering design, multidisciplinary optimization

Procedia PDF Downloads 112
20017 Adjustment and Scale-Up Strategy of Pilot Liquid Fermentation Process of Azotobacter sp.

Authors: G. Quiroga-Cubides, A. Díaz, M. Gómez

Abstract:

The genus Azotobacter has been widely used as a bio-fertilizer due to its significant effects on the stimulation and promotion of plant growth in various agricultural species of commercial interest. In order to obtain a significantly viable cellular concentration, a scale-up strategy for a liquid fermentation process (SmF) with two strains of A. chroococcum (named Ac1 and Ac10) was validated and adjusted at laboratory and pilot scale. A batch fermentation process under previously defined conditions was carried out in a 3.5 L Infors® Minifors bioreactor, which served as the baseline for this research. For the purpose of increasing process efficiency, the effect of reducing the stirring speed was evaluated in combination with a fed-batch-type fermentation at laboratory scale. To reproduce the efficiency parameters obtained, a scale-up strategy with geometric and fluid dynamic similarities was evaluated. According to the analysis of variance, this scale-up strategy did not have a significant effect on cellular concentration in the laboratory and pilot fermentations (Tukey, p > 0.05). Regarding air consumption, the fermentation process at pilot scale showed a reduction of 23% versus the baseline. The reduction in energy consumption under laboratory and pilot scale conditions was 96.9% compared with the baseline.

Keywords: Azotobacter chroococcum, scale-up, liquid fermentation, fed-batch process

Procedia PDF Downloads 422
20016 Application of Fuzzy Analytical Hierarchical Process in Evaluation Supply Chain Performance Measurement

Authors: Riyadh Jamegh, AllaEldin Kassam, Sawsan Sabih

Abstract:

In modern markets, organizations face a high-pressure environment characterized by globalization, high competition, and customer orientation, so it is crucial to control and know the weak and strong points of the supply chain in order to improve performance. Performance measurement is therefore presented as an important tool of supply chain management because it enables organizations to control, understand, and improve their efficiency. This paper aims to identify supply chain performance measurement (SCPM) by using the Fuzzy Analytical Hierarchical Process (FAHP). In our real application, the performance of organizations is estimated based on four parameters: the cost parameter indicator (CPI), the inventory turnover parameter indicator (INPI), the raw material parameter indicator (RMPI), and the safety stock level parameter indicator (SSPI); these indicators vary in their impact on performance depending upon the policies and strategies of the organization. In this research, the FAHP technique has been used to identify the importance of these parameters; a first fuzzy inference (FIR1) is then applied to identify the performance indicator of each factor depending on the importance of the factor and its value. A second fuzzy inference (FIR2) is then applied to integrate the effect of these indicators and identify the SCPM, which represents the required output. The developed approach provides an effective tool for the evaluation of supply chain performance measurement.
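
One common way to obtain FAHP weights is Buckley's geometric-mean method: each criterion's fuzzy weight is the geometric mean of its row of triangular fuzzy comparisons, normalized and then defuzzified. The sketch below applies this to the four indicators named above with an invented comparison matrix, so the resulting weights are purely illustrative and do not reflect the paper's expert judgments.

```python
import numpy as np

criteria = ["CPI", "INPI", "RMPI", "SSPI"]
# Hypothetical pairwise comparisons as triangular fuzzy numbers (l, m, u)
tfn = np.array([
    [[1, 1, 1],       [2, 3, 4],       [1, 2, 3],       [2, 3, 4]],
    [[1/4, 1/3, 1/2], [1, 1, 1],       [1, 2, 3],       [1, 2, 3]],
    [[1/3, 1/2, 1],   [1/3, 1/2, 1],   [1, 1, 1],       [1, 2, 3]],
    [[1/4, 1/3, 1/2], [1/3, 1/2, 1],   [1/3, 1/2, 1],   [1, 1, 1]],
])

geo = tfn.prod(axis=1) ** (1.0 / len(criteria))   # fuzzy geometric mean of each row
total = geo.sum(axis=0)                           # (sum_l, sum_m, sum_u)
fuzzy_w = geo / total[::-1]                       # (l/sum_u, m/sum_m, u/sum_l)
crisp = fuzzy_w.mean(axis=1)                      # defuzzify by simple averaging
weights = crisp / crisp.sum()
print(dict(zip(criteria, np.round(weights, 3))))
```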

Keywords: fuzzy performance measurements, supply chain, fuzzy logic, key performance indicator

Procedia PDF Downloads 121
20015 Determination of Non-CO2 Greenhouse Gas Emission in Electronics Industry

Authors: Bong Jae Lee, Jeong Il Lee, Hyo Su Kim

Abstract:

Both developed and developing countries adopted the decision to join the Paris Agreement to reduce greenhouse gas (GHG) emissions at the 21st Conference of the Parties (COP 21) meeting in Paris. As a result, developed and developing countries have to submit their Intended Nationally Determined Contributions (INDC) by 2020, and each country will be assessed on its performance in reducing GHG. After that, they shall propose a reduction target higher than the previous target every five years. Therefore, an accurate method for calculating greenhouse gas emissions is essential to be presented as a rationale for implementing GHG reduction measures based on the reduction targets. Non-CO2 GHGs (CF4, NF3, N2O, SF6, and so on) are widely used in the fabrication processes of semiconductor manufacturing and the etching/deposition processes of display manufacturing. The Global Warming Potential (GWP) values of these non-CO2 gases are much higher than that of CO2, which means they have a greater effect on global warming. Therefore, GHG calculation methods for the electronics industry are provided by the Intergovernmental Panel on Climate Change (IPCC) and the U.S. Environmental Protection Agency (EPA), and they will be discussed at the ISO/TC 146 meeting. As discussed earlier, being precise and accurate in calculating non-CO2 GHG is becoming more important. Thus, this study aims to discuss the implications of the calculation methods by comparing the methods of the IPCC and the EPA. In conclusion, after analyzing the methods of the IPCC and the EPA, the EPA method is more detailed and also provides a calculation for N2O. In the case of the default emission factors (from the IPCC and the EPA), the IPCC provides more conservative results than the EPA; the IPCC factor was developed for calculating national GHG emissions, while the EPA factor was developed specifically for the U.S., which means it was developed to address the environmental issues of the U.S. The semiconductor factory ‘A’ measured F-gas according to the EPA Destruction and Removal Efficiency (DRE) protocol and estimated its own DRE, and it was observed that its emission factor shows a higher DRE compared to the default DRE factors of the IPCC and the EPA. Therefore, each country can improve its GHG emission calculation by developing its own emission factors (if possible) at the time of reporting Nationally Determined Contributions (NDC). Acknowledgements: This work was supported by the Korea Evaluation Institute of Industrial Technology (No. 10053589).
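
The calculation methods being compared share a common structure: gas-level emissions scale the amount consumed by the fraction not utilized in the process and not destroyed by abatement, and are then converted to CO2-equivalents with the GWP. The sketch below shows that structure with invented consumption, utilization, and DRE figures; it follows this generic form rather than reproducing either the IPCC or the EPA default factors.

```python
# Hypothetical per-gas parameters: consumption (kg), process utilization fraction,
# fraction of exhaust routed to abatement, destruction/removal efficiency, 100-yr GWP
gases = {
    #        kg      util  abated  DRE   GWP (illustrative values)
    "CF4": (120.0,  0.20,  0.90,  0.90,  7390),
    "NF3": (300.0,  0.80,  0.90,  0.95, 17200),
    "SF6": ( 50.0,  0.30,  0.90,  0.90, 22800),
}

total_co2e_t = 0.0
for gas, (kg, util, abated, dre, gwp) in gases.items():
    emitted_kg = kg * (1.0 - util) * (1.0 - abated * dre)   # mass released to atmosphere
    co2e_t = emitted_kg * gwp / 1000.0                      # tonnes CO2-equivalent
    total_co2e_t += co2e_t
    print(f"{gas}: {co2e_t:,.1f} t CO2e")
print(f"total: {total_co2e_t:,.1f} t CO2e")
```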

Keywords: non-CO2 GHG, GHG emission, electronics industry, measuring method

Procedia PDF Downloads 270
20014 Application of Tocopherol as Antioxidant to Reduce Decomposition Process on Palm Oil Biodiesel

Authors: Supriyono, Sumardiyono, Rendy J. Pramono

Abstract:

Biodiesel is one of the promising alternative fuels for substituting petrodiesel as an energy source, with the advantage of being sustainable and eco-friendly. Like its raw material, biodiesel tends to decompose during storage. Biodiesel decomposition raises the acid value as a result of oxidation of the double bonds of the fatty acid compounds in biodiesel. Thus, the free fatty acid value can be used to evaluate the degradation of biodiesel due to the oxidation process. A high free fatty acid content in biodiesel can impact engine performance. Decomposition of biodiesel due to the oxidation reaction can be prevented by introducing a small amount of antioxidant. The origin of the raw materials and the process for producing the biodiesel determine the effectiveness of the antioxidant. Biodiesel made from high free fatty acid (FFA) crude palm oil (CPO) by two-step esterification is vulnerable to oxidation, which results in an increase in the FFA value. Tocopherol, also known as vitamin E, is one of the antioxidants that can improve the stability of biodiesel against decomposition by the oxidation process. A tocopherol concentration of 0.5% in palm oil biodiesel reduced the increase in FFA by 13% at a temperature of 80 °C and an exposure time of 180 minutes.

Keywords: antioxidant, palm oil biodiesel, decomposition, oxidation, tocopherol

Procedia PDF Downloads 333
20013 Clinch Process Simulation Using Diffuse Elements

Authors: Benzegaou Ali, Brani Benabderrahmane

Abstract:

This work describes a numerical study of the TOX-clinching process using diffuse elements. A computer code named SEMA (Static Explicit Method Analysis) was developed to simulate the clinch joining process. The FE code is based on an updated Lagrangian scheme. The resolution method used is based on a static explicit approach. The integration of the elasto-plastic behavior law is carried out with the algorithm of Simo and Taylor. The tools are represented by plane facets.

Keywords: diffuse elements, numerical simulation, clinching, contact, large deformation

Procedia PDF Downloads 345
20012 The Application of Dynamic Network Process to Environment Planning Support Systems

Authors: Wann-Ming Wey

Abstract:

In recent years, in addition to facing external threats such as energy shortages and climate change, many cities are troubled by traffic congestion and environmental pollution. Considering that private automobile-oriented urban development has produced many negative environmental and social impacts, transit-oriented development (TOD) has been considered a sustainable urban model. TOD encourages public transport combined with friendly walking and cycling environment designs; moreover, non-motorized modes help improve human health, save energy, and reduce carbon emissions. Because environmental changes often affect planners’ decision-making, this research applies the dynamic network process (DNP), which incorporates a time-dependent concept, to promote friendly walking and cycling environment designs as an advanced planning support system for environmental improvement. This research aims to discuss what kinds of design strategies can improve a friendly walking and cycling environment under TOD. First, we collate and analyze environment design factors by reviewing the relevant literature and group fifteen factors into the three aspects of "safety", "convenience", and "amenity". Furthermore, we utilize a fuzzy Delphi technique (FDT) expert questionnaire to screen the more important design criteria for the study case. Finally, we use a DNP expert questionnaire to obtain the weight changes at different time points for each design criterion. Based on the changing trends of each criterion weight, we are able to develop appropriate design strategies as a reference for planners to allocate resources in a dynamic environment. To illustrate the proposed approach, Taipei City has been used as an empirical case study, and the results are analyzed in depth to explain the application of our approach.

Keywords: environment planning support systems, walking and cycling, transit-oriented development (TOD), dynamic network process (DNP)

Procedia PDF Downloads 326