Search results for: new process model
27131 Non-Methane Hydrocarbons Emission during the Photocopying Process
Authors: Kiurski S. Jelena, Aksentijević M. Snežana, Kecić S. Vesna, Oros B. Ivana
Abstract:
The proliferation of electronic equipment in photocopying environments has not only improved work efficiency but has also changed indoor air quality. Given the number of photocopiers employed, indoor air quality may be worse than in general office environments. Determining the contribution of any single type of equipment to indoor air pollution is a complex matter. Non-methane hydrocarbons are known to play an important role in air quality due to their high reactivity. The presence of hazardous pollutants in indoor air was detected in a photocopying shop in Novi Sad, Serbia. Air samples were collected and analyzed over five days, during the 8-hr working time, in three time intervals at three different sampling points. Using a multiple linear regression model and the software package STATISTICA 10, the concentrations of occupational hazards and microclimate parameters were mutually correlated. Based on the obtained multiple coefficients of determination (0.3751, 0.2389, and 0.1975), a weak positive correlation between the observed variables was determined. Small values of the F statistic indicated that there was no statistically significant difference between the concentration levels of non-methane hydrocarbons and the microclimate parameters. The results showed that the variables could be represented by the general regression model: y = b0 + b1xi1 + b2xi2. The obtained regression equations make it possible to measure the quantitative agreement between the variations of the variables and thus gain more accurate knowledge of their mutual relations.
Keywords: non-methane hydrocarbons, photocopying process, multiple regression analysis, indoor air quality, pollutant emission
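The general regression model quoted above, y = b0 + b1xi1 + b2xi2, is an ordinary least-squares fit with two predictors. A minimal sketch of such a fit, using invented microclimate and concentration values rather than the paper's measurements:

```python
import numpy as np

# Hypothetical illustration of the general model y = b0 + b1*x1 + b2*x2,
# fitted by ordinary least squares. All data values below are invented.
x1 = np.array([20.1, 21.3, 22.0, 23.4, 24.8, 25.5])  # e.g. temperature (degC)
x2 = np.array([40.0, 42.5, 45.1, 47.9, 50.2, 52.8])  # e.g. relative humidity (%)
y = np.array([0.61, 0.66, 0.70, 0.78, 0.83, 0.88])   # e.g. NMHC concentration

X = np.column_stack([np.ones_like(x1), x1, x2])      # design matrix [1, x1, x2]
b, *_ = np.linalg.lstsq(X, y, rcond=None)            # b = [b0, b1, b2]

y_hat = X @ b
# Multiple coefficient of determination, as reported in the abstract
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

The same fit can equally be obtained from any statistics package; the point is only the shape of the model.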
Procedia PDF Downloads 376
27130 Physical Education Teacher's Interpretation toward Teaching Games for Understanding Model
Authors: Soni Nopembri
Abstract:
The objective of this research is to evaluate the implementation of the Teaching Games for Understanding (TGfU) model by conducting action research with physical education teachers who have long teaching experience. The research applied Participatory Action Research. The subjects were 19 physical education teachers who had received training in Teaching Games for Understanding. Data were collected intensively through a questionnaire, in-depth interviews, Focus Group Discussion (FGD), observation, and documentation. The collected data were analyzed qualitatively and quantitatively. The results showed that the physical education teachers had an appropriate interpretation of the TGfU model. The indicators that were the focus of this research support this conclusion: (1) physical education teachers had a good understanding of the TGfU model, (2) PE teachers' competence in applying the TGfU model in physical education at school was adequate, though some improvements were needed, (3) the factors influencing the implementation of the TGfU model were, in order, teacher, facilities, environment, and student factors, (4) PE teachers' perspective on the TGfU model was positive, although some teachers were less optimistic about the future development of the TGfU model.
Keywords: TGfU, physical education teacher, teaching games, FGD
Procedia PDF Downloads 544
27129 A Comparative Study of Force Prediction Models during Static Bending Stage for 3-Roller Cone Frustum Bending
Authors: Mahesh Chudasama, Harit Raval
Abstract:
Conical sections and shells of metal plates manufactured by the 3-roller conical bending process are widely used in industry. The process is completed by first bending the metal plate statically and then roller bending it dynamically in sequence. An analytical model of the maximum bending force during the static bending stage is required for optimum design of the machine. Analytical models assuming various stress conditions are considered, and these models are compared with respect to various parameters and reported in this paper. It is concluded from the study that for higher bottom roller inclination the shear stress greatly affects the static bending force, whereas for lower bottom roller inclination it can be neglected.
Keywords: roller-bending, static-bending, stress-conditions, analytical-modeling
Procedia PDF Downloads 250
27128 Optimal Production and Maintenance Policy for a Partially Observable Production System with Stochastic Demand
Authors: Leila Jafari, Viliam Makis
Abstract:
In this paper, the joint optimization of the economic manufacturing quantity (EMQ), safety stock level, and condition-based maintenance (CBM) is presented for a partially observable, deteriorating system subject to random failure. The demand is stochastic and it is described by a Poisson process. The stochastic model is developed and the optimization problem is formulated in the semi-Markov decision process framework. A modification of the policy iteration algorithm is developed to find the optimal policy. A numerical example is presented to compare the optimal policy with the policy considering zero safety stock.
Keywords: condition-based maintenance, economic manufacturing quantity, safety stock, stochastic demand
Procedia PDF Downloads 461
27127 Calibration of the Discrete Element Method Using a Large Shear Box
Authors: C. J. Coetzee, E. Horn
Abstract:
One of the main challenges in using the Discrete Element Method (DEM) is to specify the correct input parameter values. In general, the models are sensitive to the input parameter values, and accurate results can only be achieved if the correct values are specified. For the linear contact model, micro-parameters such as the particle density, stiffness, and coefficient of friction, as well as the particle size and shape distributions, are required. A procedure is needed to accurately calibrate these parameters before any attempt can be made to model a complete bulk materials handling system. Since DEM is often used to model applications in the mining and quarrying industries, a calibration procedure was developed for materials that consist of relatively large (up to 40 mm in size) particles. A coarse crushed aggregate was used as the test material. Using a specially designed large shear box with a diameter of 590 mm, the confined Young's modulus (bulk stiffness) and internal friction angle of the material were measured by means of the confined compression test and the direct shear test, respectively. DEM models of the experimental setup were developed, and the input parameter values were varied iteratively until a close correlation between the experimental and numerical results was achieved. The calibration process was validated by modelling the pull-out of an anchor from a bed of material. The model results compared well with experimental measurements.
Keywords: Discrete Element Method (DEM), calibration, shear box, anchor pull-out
Procedia PDF Downloads 290
27126 Habitat Model Review and a Proposed Methodology to Value Economic Trade-Off between Cage Culture and Habitat of an Endemic Species in Lake Maninjau, Indonesia
Authors: Ivana Yuniarti, Iwan Ridwansyah
Abstract:
This paper presents a review of various methodologies for habitat assessment and proposes a methodology to assess the habitat of an endemic fish species in Lake Maninjau, Indonesia, as part of a Ph.D. project. The application is mainly aimed at assessing the trade-off between the economic value of aquaculture and that of the fisheries. The proposed methodology is a generalized linear model (GLM) combined with GIS to assess presence-absence data, or a habitat suitability index (HSI) combined with the analytic hierarchy process (AHP). Further, a habitat replacement cost approach is planned to be used to calculate the habitat value as well as its trade-off with the economic value of aquaculture. The results of the study are expected to serve as a scientific consideration in local decision making and to provide a reference for other areas in the country.
Keywords: AHP, habitat, GLM, HSI, Maninjau
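A presence-absence GLM of the kind proposed here is typically a binomial GLM with a logit link, i.e. logistic regression on habitat covariates. A hedged sketch with invented covariates (water depth and chlorophyll-a are illustrative assumptions, not the study's actual variables):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical presence-absence GLM: habitat covariates predicting whether
# the endemic species was observed at a site. All data below are invented.
rng = np.random.default_rng(5)
depth = rng.uniform(0, 30, 200)          # e.g. water depth (m), assumed covariate
chlorophyll = rng.uniform(0, 10, 200)    # e.g. chlorophyll-a, assumed covariate
X = np.column_stack([depth, chlorophyll])

logit = -0.2 * depth + 0.5 * chlorophyll           # invented "true" relationship
presence = (rng.random(200) < 1 / (1 + np.exp(-logit))).astype(int)

glm = LogisticRegression().fit(X, presence)
suitability = glm.predict_proba(X)[:, 1]           # per-site suitability score
```

The fitted probabilities can then be mapped in GIS as a habitat suitability surface.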
Procedia PDF Downloads 150
27125 A Machine Learning Approach for Performance Prediction Based on User Behavioral Factors in E-Learning Environments
Authors: Naduni Ranasinghe
Abstract:
E-learning environments have become more popular than ever due to the impact of COVID-19. Even though e-learning is one of the best solutions for the teaching-learning process, it is not without major challenges. Nowadays, machine learning approaches are utilized to analyze how behavioral factors lead to better adoption and how they relate to better student performance in e-learning environments. During the pandemic, a major issue with the academic process in e-learning became apparent, especially regarding student performance. Therefore, an approach that investigates student behaviors in e-learning environments using a data-intensive machine learning approach is warranted. A hybrid approach was used to understand how the variables relate to one another. A quantitative approach, informed by the literature, was used to understand the weight of each factor for adoption and for performance. The data set was collected from previous research to support the training and testing process in ML. Special attention was paid to incorporating different dimensionalities of the data to understand the dependency levels of each variable. Five independent variables out of twelve were chosen based on their impact on the dependent variable and on the descriptive statistics. Of the three models developed (random forest classifier, SVM, and decision tree classifier), the random forest classifier gave the highest accuracy (0.8542). Overall, this work met its goal of improving student performance by identifying students who are at risk of dropout, emphasizing the necessity of using both static and dynamic data.
Keywords: academic performance prediction, e-learning, learning analytics, machine learning, predictive model
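The modelling setup described (a train-test split and a random forest classifier on a handful of behavioural features) can be sketched as follows with synthetic data; the features, the at-risk rule, and all values are invented, and scikit-learn is assumed available:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical setup: five behavioural features predicting an at-risk label.
# The data and the labelling rule are synthetic stand-ins.
rng = np.random.default_rng(42)
X = rng.random((200, 5))                   # e.g. login frequency, time on task, ...
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)  # invented rule standing in for "at-risk"

# 60/40 train-test split, as used in the study
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.4, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)       # held-out classification accuracy
```

`clf.feature_importances_` then gives a rough ranking of which behavioural factors drive the prediction.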
Procedia PDF Downloads 155
27124 Geomechanical Numerical Modeling of Well Wall in Drilling with Finite Difference Method
Authors: Marzieh Zarei
Abstract:
Wellbore instability is one of the most fundamental challenges faced by the oil and gas industry, and well wall stability analysis is a gap to be filled. The collection of static data, such as well logs, leads to the construction of a geomechanical numerical model, which helps in assessing the probable risks in future drilling. In this paper, a geomechanical model was designed, and the mechanical properties of the rock were determined at all points of the model. The safe mud window was determined, with the minimum and maximum mud pressures found to be in the ranges of 60-70 MPa and 100-110 MPa, respectively.
Keywords: geomechanics, numerical model, well stability, in-situ stress, underbalanced drilling
Procedia PDF Downloads 127
27123 Study of the Protection of Induction Motors
Authors: Bencheikh Abdellah
Abstract:
In this paper, we present a mathematical model dedicated to the simulation of broken bars in a three-phase squirrel-cage induction motor. The model is based on a mesh circuit representing the rotor cage. The simulations carried out demonstrate the effectiveness of this model in describing the behavior of the machine in both healthy and faulty states.
Keywords: AC motors, squirrel cage, diagnostics, MATLAB, SIMULINK
Procedia PDF Downloads 435
27122 Reservoir Fluids: Occurrence, Classification, and Modeling
Authors: Ahmed El-Banbi
Abstract:
Several PVT models exist to represent how PVT properties are handled in sub-surface and surface engineering calculations for oil and gas production. The most commonly used models include black oil, modified black oil (MBO), and compositional models. These models are used in calculations that allow engineers to optimize and forecast well and reservoir performance (e.g., reservoir simulation, material balance, nodal analysis, surface facilities, etc.). The choice of model depends on the fluid type and the production process (e.g., depletion, water injection, gas injection, etc.). Based on close to 2,000 reservoir fluid samples collected from different basins and locations, this paper presents some conclusions on the occurrence of reservoir fluids. It also reviews the common methods used to classify reservoir fluid types. Based on new criteria related to the production behavior of different fluids and economic considerations, an updated classification of reservoir fluid types is presented. Recommendations on the use of different PVT models to simulate the behavior of different reservoir fluid types are discussed, and the requirements of each PVT model are highlighted. Available methods for the calculation of PVT properties from each model are also discussed, along with practical recommendations and tips on how to control the calculations to achieve the most accurate results.
Keywords: PVT models, fluid types, PVT properties, fluids classification
Procedia PDF Downloads 70
27121 Dynamic Model of Heterogeneous Markets with Imperfect Information for the Optimization of Company's Long-Time Strategy
Authors: Oleg Oborin
Abstract:
This paper is dedicated to the development of a model that can be used to evaluate the effectiveness of long-term corporate strategies and identify the best ones. A theoretical model of a relatively homogeneous product market (such as the iron and steel industry, mobile services, or road transport) has been developed. In the model, the market consists of a large number of companies with different internal characteristics and objectives. The companies can perform mergers and acquisitions in order to increase their market share. The model allows the simulation of the long-term dynamics of the market (for a period longer than 20 years). A large number of simulations on random input data were therefore conducted in the framework of the model. The results of the model were then compared with the dynamics of real markets, such as the US steel industry from the beginning of the 20th century to the present day and the German mobile services market between 1990 and 2015.
Keywords: economic modelling, long-time strategy, mergers and acquisitions, simulation
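The core mechanism of the model, a market of heterogeneous firms whose number shrinks through mergers over a multi-decade horizon, can be illustrated with a deliberately simplified toy simulation; the merger rule and all parameters below are invented, and the paper's actual model is far richer:

```python
import random

# Toy sketch only: heterogeneous firms, occasional mergers, 25-year horizon.
random.seed(0)
shares = [random.uniform(0.5, 2.0) for _ in range(100)]  # initial firm sizes
initial_total = sum(shares)

for year in range(25):
    # In some years, the two smallest firms merge into one larger firm
    # (an invented rule standing in for the model's M&A decisions).
    if random.random() < 0.6 and len(shares) > 1:
        shares.sort()
        merged = shares.pop(0) + shares.pop(0)
        shares.append(merged)

concentration = max(shares) / sum(shares)  # crude market-concentration measure
```

Running many such simulations on random inputs and comparing concentration trajectories against real market data mirrors the validation strategy the abstract describes.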
Procedia PDF Downloads 366
27120 Working with Interpreters: Using Role Play to Teach Social Work Students
Authors: Yuet Wah Echo Yeung
Abstract:
Working with people from minority ethnic groups and refugee and asylum-seeking communities who have limited proficiency in the language of the host country often presents a major challenge for social workers. Because of language differences, social workers need to work with interpreters to ensure accurate information is collected for their assessment and intervention. Drawing on social learning theory, this paper discusses how role play was used as an experiential learning exercise in a training session to help social work students develop skills in working with interpreters. Social learning theory posits that learning is a cognitive process that takes place in a social context when people observe, imitate, and model others' behaviours. The role play also helped students understand the role of the interpreter and the challenges social workers may face when relying on interpreters to communicate with service users and their families. In the first part of the session, a tutor played the role of a social worker and deliberately behaved in an unprofessional manner and used inappropriate body language when working alongside the interpreter during a home visit. The purpose of the role play was not to provide a positive role model for students to imitate. Rather, it aimed to activate and provoke students' internal thinking processes and encourage them to critically consider the impacts of poor practice on relationship building and the intervention process. Having critically reflected on the implications of poor practice, students were then asked to play the role of the social worker and demonstrate what good practice should look like. At the end of the session, students remarked that they had learnt a lot by observing good and bad examples, which showed them what not to do. The exercise served to remind students how easily practitioners can slip into bad habits, and of the importance of respect for cultural difference when working with people from different cultural backgrounds.
Keywords: role play, social learning theory, social work practice, working with interpreters
Procedia PDF Downloads 179
27119 Response Surface Methodology to Supercritical Carbon Dioxide Extraction of Microalgal Lipids
Authors: Yen-Hui Chen, Terry Walker
Abstract:
As the world experiences an energy crisis, investing in sustainable energy resources is a pressing mission for many countries. Microalgae-derived biodiesel has attracted intensive attention as an important biofuel, and the lipid of the microalga Chlorella protothecoides is recognized as a renewable source for its production. Supercritical carbon dioxide (SC-CO₂) is a promising green solvent that may substitute for organic solvents in lipid extraction; however, the efficiency of SC-CO₂ extraction may be affected by many variables, including temperature, pressure, and extraction time, individually or in combination. In this study, response surface methodology (RSM) was used to optimize the process parameters, including temperature, pressure, and extraction time, for C. protothecoides lipid yield by SC-CO₂ extraction. A second-order polynomial model provided a good fit (R-squared value of 0.94) for the lipid yield. The linear and quadratic terms of temperature, pressure, and extraction time, as well as the interaction between temperature and pressure, showed significant effects on lipid yield during extraction. The optimal conditions predicted from the model were a temperature of 59 °C, a pressure of 350.7 bar, and an extraction time of 2.8 hours. Under these conditions, the experimental lipid yield (25%) was close to the predicted value. The principal fatty acid methyl esters (FAME) of the C. protothecoides lipid-derived biodiesel were oleic acid methyl ester (60.1%), linoleic acid methyl ester (18.6%), and palmitic acid methyl ester (11.4%), which together made up more than 90% of the total FAMEs. In summary, this study indicated that RSM was useful for characterizing and optimizing the SC-CO₂ extraction of C. protothecoides lipid, and the second-order polynomial model described and predicted the lipid yield very well. In addition, C. protothecoides lipid extracted by SC-CO₂ is suggested as a potential candidate for microalgae-derived biodiesel production.
Keywords: Chlorella protothecoides, microalgal lipids, response surface methodology, supercritical carbon dioxide extraction
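The second-order polynomial model used in RSM contains intercept, linear, quadratic, and interaction terms in the three factors. A hedged sketch of fitting such a surface by least squares on synthetic coded data (the response values below are invented, not the extraction measurements):

```python
import numpy as np

# Synthetic RSM example in coded variables T, P, t (temperature, pressure, time).
rng = np.random.default_rng(1)
T, P, t = (rng.uniform(-1, 1, 60) for _ in range(3))

# Invented "true" surface with an interior maximum, plus measurement noise
y = (25 - 3 * (T - 0.2) ** 2 - 2 * (P - 0.5) ** 2 - 1.5 * (t - 0.1) ** 2
     + 0.8 * T * P + rng.normal(0, 0.05, 60))

# Full second-order design matrix: intercept, linear, quadratic, interactions
X = np.column_stack([np.ones_like(T), T, P, t,
                     T**2, P**2, t**2, T*P, T*t, P*t])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

y_hat = X @ b
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

The fitted coefficients can then be differentiated (or searched numerically) to locate the predicted optimum, analogous to the 59 °C / 350.7 bar / 2.8 h optimum reported above.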
Procedia PDF Downloads 441
27118 Design, Implementation, and Evaluation of ALS-PBL Model in the EMI Classroom
Authors: Yen-Hui Lu
Abstract:
In the past two decades, in order to increase university visibility and internationalization, English as a medium of instruction (EMI) has become one of the main language policies in higher education institutions where English is not a dominant language. However, given the complex, discipline-embedded nature of academic communication, academic literacy does not come with students’ everyday language experience, and it is a challenge for all students. Particularly, to engage students in the effective learning process of discipline concepts in the EMI classrooms, teachers need to provide explicit academic language instruction to assist students in deep understanding of discipline concepts. To bridge the gap between academic language development and discipline learning in the EMI classrooms, the researcher incorporates academic language strategies and key elements of project-based learning (PBL) into an Academic Language Strategy driven PBL (ALS-PBL) model. With clear steps and strategies, the model helps EMI teachers to scaffold students’ academic language development in the EMI classrooms. ALS-PBL model includes three major stages: preparation, implementation, and assessment. First, in the preparation stage, ALS-PBL teachers need to identify learning goals for both content and language learning and to design PBL topics for investigation. Second, during the implementation stage, ALS-PBL teachers use the model as a guideline to create a lesson structure and class routine. There are five important elements in the implementation stage: (1) academic language preparation, (2) connecting background knowledge, (3) comprehensible input, (4) academic language reinforcement, and (5) sustained inquiry and project presentation. Finally, ALS-PBL teachers use formative assessments such as student learning logs, teachers’ feedback, and peer evaluation to collect detailed information that demonstrates students’ academic language development in the learning process. 
In this study, the ALS-PBL model was implemented in an interdisciplinary course entitled “Science is Everywhere”, which was co-taught by five professors from different disciplinary backgrounds: English education, civil engineering, business administration, international business, and chemical engineering. The purpose of the course was to cultivate students' interdisciplinary knowledge as well as English competency in disciplinary areas. This study used a case-study design to systematically investigate students' learning experiences in the class using the ALS-PBL model. The participants were 22 college students with different majors. The course was one of the elective EMI courses at the focal university; the students enrolled in it to fulfill the school language policy, which requires students to complete two EMI courses before graduation. For credibility, this study used multiple methods to collect data, including classroom observation, teachers' feedback, peer assessment, student learning logs, and student focus-group interviews. The research findings show four major successes of implementing the ALS-PBL model in the EMI classroom: (1) a clear focus on both content and language learning, (2) meaningful practice in authentic communication, (3) reflective learning in academic language strategies, and (4) collaborative support in content knowledge. This study will be of value to teachers involved in delivering English as well as content lessons to language learners by providing a theoretically sound, practical model for application in the classroom.
Keywords: academic language development, content and language integrated learning, English as a medium of instruction, project-based learning
Procedia PDF Downloads 81
27117 Life Prediction Method of Lithium-Ion Battery Based on Grey Support Vector Machines
Authors: Xiaogang Li, Jieqiong Miao
Abstract:
To address the low prediction accuracy of the grey forecasting model, an improved grey prediction model is put forward. Firstly, a trigonometric function is used to transform the original data sequence in order to improve the smoothness of the data; this model is called SGM (smoothed grey prediction model). The improved grey model is then combined with a support vector machine to give the grey support vector machine model (SGM-SVM). Before establishing the model, the data are preprocessed with trigonometric functions and the accumulated generating operation in order to enhance their smoothness and weaken their randomness. A support vector machine (SVM) prediction model is then built for the preprocessed data, with the model parameters selected by a genetic algorithm to obtain the global optimum. Finally, the forecast data are restored through the "regressive generate" (inverse accumulation) operation. To show that the SGM-SVM model is superior to other models, battery life data from CALCE were selected. The presented model was used to predict battery life, and the predicted results were compared with those of the grey model and the support vector machine. For a more intuitive comparison, this paper presents the root mean square error of the three models. The results show that the grey support vector machine (SGM-SVM) predicts life best, with a root mean square error of only 3.18%.
Keywords: grey prediction model, trigonometric functions, support vector machines, genetic algorithms, root mean square error
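The two preprocessing steps named in the abstract, a trigonometric smoothing transform and the accumulated generating operation with a "regressive generate" step to restore forecasts, can be sketched as follows; the sine-based transform and the capacity values are assumptions for illustration, not the paper's exact transform or data:

```python
import numpy as np

# Invented capacity-fade-like sequence standing in for battery life data
x0 = np.array([0.95, 0.93, 0.90, 0.88, 0.85])

# (1) A hypothetical trigonometric smoothing transform of the raw sequence
xs = np.sin(0.5 * x0)

# (2) 1-AGO: first-order accumulated generating operation (cumulative sums
#     weaken the randomness of the sequence before modelling)
x1 = np.cumsum(xs)

# "Regressive generate" (1-IAGO) undoes the accumulation after forecasting,
# and the trigonometric transform is inverted to recover the original scale
restored = np.diff(x1, prepend=0.0)
back = np.arcsin(restored) / 0.5
```

In the full method, the SVM (tuned by a genetic algorithm) is fitted between steps (2) and the restoration; the round trip above only shows that the transforms are invertible.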
Procedia PDF Downloads 459
27116 Comparative Study of Non-Identical Firearms with Priority to Repair Subject to Inspection
Authors: A. S. Grewal, R. S. Sangwan, Dharambir, Vikas Dhanda
Abstract:
The purpose of this paper is to develop and analyze two reliability models for a system of non-identical firearms: a standard firearm (called the original unit) and a country-made firearm (called the duplicate/substandard unit). A single server comes immediately to carry out inspection and repair whenever needed. On the failure of the standard firearm, the server inspects the operative country-made firearm to see whether it is capable of performing the desired function well or not. If the country-made firearm is not capable of doing so, the operation of the system is stopped and the server starts repairing the standard firearm immediately. However, no inspection is done at the failure of the country-made firearm, as the standard firearm alone is capable of performing the given task well. In model I, priority to repair the standard firearm is given in case the system fails completely while the country-made firearm is already under repair, whereas in model II there is no such priority. The failure and repair times of each unit are assumed to be independent and uncorrelated random variables. The distributions of the failure times of the units are taken as negative exponential, while those of the repair and inspection times are general. Using the semi-Markov process and the regenerative point technique, some econo-reliability measures are obtained. Graphs are plotted to compare the MTSF (mean time to system failure), availability, and profit of the models for a particular case.
Keywords: non-identical firearms, inspection, priority to repair, semi-Markov process, regenerative point
Procedia PDF Downloads 424
27115 Free Fatty Acid Assessment of Crude Palm Oil Using a Non-Destructive Approach
Authors: Siti Nurhidayah Naqiah Abdull Rani, Herlina Abdul Rahim, Rashidah Ghazali, Noramli Abdul Razak
Abstract:
Near infrared (NIR) spectroscopy has always been of great interest in the food and agriculture industries. The development of prediction models has facilitated the estimation process in recent years. In this study, 110 crude palm oil (CPO) samples were used to build a free fatty acid (FFA) prediction model. 60% of the collected data were used for training purposes and the remaining 40% used for testing. The visible peaks on the NIR spectrum were at 1725 nm and 1760 nm, indicating the existence of the first overtone of C-H bands. Principal component regression (PCR) was applied to the data in order to build this mathematical prediction model. The optimal number of principal components was 10. The results showed R2=0.7147 for the training set and R2=0.6404 for the testing set.
Keywords: palm oil, fatty acid, NIRS, regression
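Principal component regression as described (project the spectra onto a small number of principal components, then regress the FFA value on the scores) can be sketched with synthetic spectra; the latent structure and noise levels are invented, and scikit-learn is assumed available:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Synthetic NIR-like data: 110 samples, 300 wavelengths, 10 latent components,
# matching the study's sample count and component number but not its data.
rng = np.random.default_rng(0)
n_samples, n_wavelengths, n_components = 110, 300, 10
scores = rng.normal(size=(n_samples, n_components))
loadings = rng.normal(size=(n_components, n_wavelengths))
spectra = scores @ loadings + rng.normal(0, 0.1, size=(n_samples, n_wavelengths))
ffa = scores @ rng.normal(size=n_components) + rng.normal(0, 0.05, n_samples)

split = int(0.6 * n_samples)   # 60 % for training, 40 % for testing
pcr = make_pipeline(PCA(n_components=n_components), LinearRegression())
pcr.fit(spectra[:split], ffa[:split])
r2_test = pcr.score(spectra[split:], ffa[split:])   # R^2 on the held-out set
```

Because the synthetic data have a clean 10-component structure, this toy R² is far higher than the 0.64 reported for real CPO spectra.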
Procedia PDF Downloads 504
27114 A Particle Filter-Based Data Assimilation Method for Discrete Event Simulation
Authors: Zhi Zhu, Boquan Zhang, Tian Jing, Jingjing Li, Tao Wang
Abstract:
Data assimilation is a hybrid model- and data-driven method that dynamically fuses new observation data with a numerical model to iteratively approach the real system state. It is widely used in state prediction and parameter inference for continuous systems. Because of the non-linearity and non-Gaussianity of discrete event systems, the traditional Kalman filter, which is based on linear and Gaussian assumptions, cannot perform data assimilation for such systems, so the particle filter has gradually become a technical approach for discrete event simulation data assimilation. Hence, we proposed a particle filter-based discrete event simulation data assimilation method and took an unmanned aerial vehicle (UAV) maintenance service system as a proof of concept for simulation experiments. The experimental results showed that the filtered state data are closer to the real state of the system, which verifies the effectiveness of the proposed method. This research can provide a reference framework for the data assimilation process of other complex nonlinear systems, such as discrete-time and agent-based simulations.
Keywords: discrete event simulation, data assimilation, particle filter, model and data-driven
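A bootstrap particle filter of the kind used here propagates particles through the system model, weights them by the observation likelihood, and resamples. A minimal sketch on a toy state-space model (not the UAV maintenance model, whose details the abstract does not give):

```python
import numpy as np

rng = np.random.default_rng(7)
n_steps, n_particles = 50, 1000

# Toy "true" system and noisy observations of it
true_x = np.zeros(n_steps)
obs = np.zeros(n_steps)
for t in range(1, n_steps):
    true_x[t] = 0.9 * true_x[t - 1] + rng.normal(0, 0.5)  # state transition
    obs[t] = true_x[t] + rng.normal(0, 0.3)               # observation

# Bootstrap particle filter: propagate, weight, resample
particles = rng.normal(0, 1, n_particles)
estimates = np.zeros(n_steps)
for t in range(1, n_steps):
    particles = 0.9 * particles + rng.normal(0, 0.5, n_particles)
    weights = np.exp(-0.5 * ((obs[t] - particles) / 0.3) ** 2)
    weights /= weights.sum()
    idx = rng.choice(n_particles, n_particles, p=weights)
    particles = particles[idx]
    estimates[t] = particles.mean()   # filtered state estimate

rmse = np.sqrt(np.mean((estimates - true_x) ** 2))
```

For a discrete event system, the propagation step would run the simulation model forward per particle instead of this closed-form transition, but the weight-resample loop is the same.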
Procedia PDF Downloads 11
27113 Logistic Model Tree and Expectation-Maximization for Pollen Recognition and Grouping
Authors: Endrick Barnacin, Jean-Luc Henry, Jack Molinié, Jimmy Nagau, Hélène Delatte, Gérard Lebreton
Abstract:
Palynology is a field of interest for many disciplines. It has multiple applications, such as chronological dating, climatology, allergy treatment, and even honey characterization. Unfortunately, the analysis of a pollen slide is a complicated and time-consuming task that requires the intervention of experts in the field, who are becoming increasingly rare due to economic and social conditions, so the automation of this task is a necessity. Pollen slide analysis is mainly a visual process, as it is carried out with the naked eye; that is why a primary method of automating palynology is digital image processing, which has the lowest cost and relatively good accuracy in pollen retrieval. In this work, we propose a system combining recognition and grouping of pollen. It uses a logistic model tree to classify pollen species already known to the proposed system while detecting any unknown species; the unknown pollen grains are then grouped using a cluster-based approach. Good success rates were achieved for the recognition of known species, and automated clustering appears to be a promising approach.
Keywords: pollen recognition, logistic model tree, expectation-maximization, local binary pattern
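The grouping step for unknown species, expectation-maximization clustering of feature vectors, can be sketched with a Gaussian mixture model on synthetic two-dimensional features; the features and species are invented, and scikit-learn is assumed available:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic feature vectors (e.g. texture descriptors) for pollen grains the
# classifier rejected as unknown, drawn from two well-separated species.
rng = np.random.default_rng(3)
species_a = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(50, 2))
species_b = rng.normal(loc=[3.0, 3.0], scale=0.3, size=(50, 2))
unknown = np.vstack([species_a, species_b])

# GaussianMixture fits the mixture by expectation-maximization
gmm = GaussianMixture(n_components=2, random_state=0).fit(unknown)
labels = gmm.predict(unknown)   # cluster assignment per unknown grain
```

In practice the number of components would itself be selected, e.g. by BIC, since the number of unknown species is not given in advance.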
Procedia PDF Downloads 180
27112 Determination of Stress-Strain Curve of Duplex Stainless Steel Welds
Authors: Carolina Payares-Asprino
Abstract:
Dual-phase duplex stainless steel comprised of ferrite and austenite has shown high strength and corrosion resistance in many aggressive environments. Joining duplex alloys is challenging due to several embrittling precipitates and metallurgical changes during the welding process. The welding parameters strongly influence the quality of a weld joint. Therefore, it is necessary to quantify the weld bead’s integral properties as a function of welding parameters, especially when part of the weld bead is removed through a machining process due to aesthetic reasons or to couple the elements in the in-service structure. The present study uses the existing stress-strain model to predict the stress-strain curves for duplex stainless-steel welds under different welding conditions. Having mathematical expressions that predict the shape of the stress-strain curve is advantageous since it reduces the experimental work in obtaining the tensile test. In analysis and design, such stress-strain modeling simplifies the time of operations by being integrated into calculation tools, such as the finite element program codes. The elastic zone and the plastic zone of the curve can be defined by specific parameters, generating expressions that simulate the curve with great precision. There are empirical equations that describe the stress-strain curves. However, they only refer to the stress-strain curve for the stainless steel, but not when the material is under the welding process. It is a significant contribution to the applications of duplex stainless steel welds. For this study, a 3x3 matrix with a low, medium, and high level for each of the welding parameters were applied, giving a total of 27 weld bead plates. Two tensile specimens were manufactured from each welded plate, resulting in 54 tensile specimens for testing. 
When evaluating the four models used to predict the stress-strain curve of the welded specimens, only one model (Rasmussen) showed a good correlation in predicting the stress-strain curve.
Keywords: duplex stainless steels, modeling, stress-strain curve, tensile test, welding
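For readers unfamiliar with the Rasmussen-type models evaluated here: they build on the Ramberg-Osgood form, in which total strain is the elastic strain plus a 0.2%-offset plastic term. A minimal sketch follows; the material constants are illustrative placeholders, not the calibrated weld values from this study:

```python
def ramberg_osgood_strain(stress, E, sigma_02, n):
    """Total strain (elastic + plastic) for stress up to the 0.2% proof stress:
    eps = sigma/E + 0.002 * (sigma/sigma_0.2)**n."""
    return stress / E + 0.002 * (stress / sigma_02) ** n

# Illustrative duplex-stainless values (assumed, not from the paper):
E = 200_000.0      # Young's modulus, MPa
sigma_02 = 550.0   # 0.2% proof stress, MPa
n = 7.0            # strain-hardening exponent

# At sigma = sigma_0.2 the plastic term equals exactly 0.002 by construction.
strain = ramberg_osgood_strain(550.0, E, sigma_02, n)
```

Rasmussen's full-range model extends this expression beyond the 0.2% proof stress with a second branch up to the ultimate stress, which is what allows the entire weld curve to be reproduced from a few parameters.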
Procedia PDF Downloads 166
27111 Modelling of Exothermic Reactions during Carbon Fibre Manufacturing and Coupling to Surrounding Airflow
Authors: Musa Akdere, Gunnar Seide, Thomas Gries
Abstract:
Carbon fibres are fibrous materials with a carbon content of more than 90%. They combine excellent mechanical properties with a very low density, and thus carbon fibre reinforced plastics (CFRP) are very often used in lightweight design and construction. The precursor material is usually polyacrylonitrile (PAN) based and wet-spun. During production, the precursor has to be stabilized thermally to withstand the high temperatures of up to 1500 °C that occur during carbonization. Even though carbon fibre has been used since the late 1970s in aerospace applications, there is still no general method for finding the optimal production parameters, and trial and error is most often the only recourse. To gain better insight into the process, the chemical reactions during stabilization have to be analyzed in detail. Therefore, a model of the chemical reactions (cyclization, dehydration, and oxidation) based on the research of Dunham and Edie has been developed. With the presented model, it is possible to perform a complete simulation of the fibre passing through all zones of stabilization. The fibre bundle is modeled as several circular fibres with a layer of air in between. Two thermal mechanisms are considered the most important: the exothermic reactions inside the fibre and the convective heat transfer between the fibre and the air. The exothermic reactions inside the fibres are modeled as a heat source, and differential scanning calorimetry measurements have been performed to estimate the heat of the reactions. To shorten the required simulation time, the number of fibres is reduced by similitude theory. Experiments were conducted on a pilot-scale stabilization oven to validate the simulated fibre temperature during stabilization, and a new method was developed to measure the fibre bundle temperature.
The comparison of the results shows that the developed simulation model gives good approximations of the temperature profile of the fibre bundle during the stabilization process.
Keywords: carbon fibre, coupled simulation, exothermic reactions, fibre-air-interface
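The two thermal mechanisms described above — an internal exothermic heat source and convective exchange with the surrounding air — can be sketched as a lumped energy balance integrated explicitly. All coefficients here are placeholders, not the calibrated values from the DSC measurements:

```python
def simulate_fibre_temperature(T0, T_air, h_A, m_cp, q_reaction, dt, steps):
    """Explicit-Euler integration of a lumped fibre energy balance:
    m*cp * dT/dt = h*A * (T_air - T) + Q_exo.
    h_A is the product of convection coefficient and surface area,
    m_cp the product of fibre mass and specific heat."""
    T = T0
    history = [T]
    for _ in range(steps):
        dT = (h_A * (T_air - T) + q_reaction) / m_cp * dt
        T += dT
        history.append(T)
    return history

# Placeholder values: the fibre relaxes toward the oven-air temperature,
# and a nonzero exothermic source lifts the steady state above it.
hist = simulate_fibre_temperature(T0=293.0, T_air=500.0, h_A=1.0,
                                  m_cp=100.0, q_reaction=0.0,
                                  dt=1.0, steps=500)
```

The real model couples many such fibre nodes to the surrounding airflow, but the steady-state offset T_air + Q_exo/(h*A) already shows why the exothermic reactions can overheat the bundle if convection is too weak.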
Procedia PDF Downloads 272
27110 Modelling Export Dynamics in the CSEE Countries Using GVAR Model
Abstract:
The paper investigates the key factors of export dynamics for a set of Central and Southeast European (CSEE) countries in the context of the current economic and financial crisis. To model the export dynamics, a Global Vector Autoregressive (GVAR) model is defined. As opposed to approaches that model each country separately, the GVAR combines all country models into one global model, which makes it possible to capture spill-over effects in the context of globalization and rising international linkages. The results of the study indicate that for most of the CSEE countries, exports are mainly driven by domestic shocks, both in the short run and in the long run. This study is the first application of the GVAR model to export dynamics in the CSEE countries, and its results therefore present an important empirical contribution.
Keywords: export, GFEVD, global VAR, international trade, weak exogeneity
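The spill-over idea — one country's shock propagating through another country's dynamics — can be illustrated with a toy two-country VAR(1), where the off-diagonal coefficients carry the cross-country linkages that a GVAR captures at global scale. This is a deliberately simplified sketch with invented coefficients, not the paper's estimated GVAR:

```python
import numpy as np

# Toy two-country VAR(1): x_t = A @ x_{t-1} + eps_t.
# The off-diagonal entries of A are the spill-over channels
# (coefficients are illustrative; spectral radius 0.7 < 1, so stable).
A = np.array([[0.5, 0.2],
              [0.1, 0.6]])

def simulate_var1(A, x0, steps, rng=None):
    """Simulate a stable VAR(1) path with small Gaussian innovations."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    path = [x.copy()]
    for _ in range(steps):
        x = A @ x + rng.normal(scale=0.01, size=x.shape)
        path.append(x.copy())
    return np.array(path)

# A unit shock to country 1 (x0 = [1, 0]) leaks into country 2
# through A[1, 0] and then decays.
path = simulate_var1(A, [1.0, 0.0], steps=50)
```

The paper's finding that exports are mainly driven by domestic shocks corresponds, in this toy picture, to the diagonal entries of A dominating the off-diagonal ones.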
Procedia PDF Downloads 300
27109 The Ductile Fracture of Armor Steel Targets Subjected to Ballistic Impact and Perforation: Calibration of Four Damage Criteria
Authors: Imen Asma Mbarek, Alexis Rusinek, Etienne Petit, Guy Sutter, Gautier List
Abstract:
Over the past two decades, the automotive, aerospace, and defense industries have paid increasing attention to finite element (FE) numerical simulation of the fracture of their structures. Thanks to numerical simulation, it is nowadays possible to analyze problems involving costly and dangerous extreme loadings, such as blast or ballistic impact, safely and at reduced cost. The present paper is concerned with ballistic impact and perforation problems involving ductile fracture of thin armor steel targets. The target fracture process usually depends on various parameters: the projectile nose shape, the target thickness and its mechanical properties, and the impact conditions (friction, oblique/normal impact...). In this work, the investigations concern the normal impact of a conical-head projectile on thin armor steel targets. The main aim is to establish a comparative study of four fracture criteria that are commonly used in simulating the fracture of structures subjected to extreme loadings such as ballistic impact and perforation. Damage initiation usually results from a complex physical process occurring at the micromechanical scale. On the macro scale, and according to the following fracture models, the variables on which fracture depends are mainly the stress triaxiality η, the strain rate, the temperature T, and possibly the Lode angle parameter θ. The four failure criteria are: the critical strain-to-failure model, the Johnson-Cook model, the Wierzbicki model, and the Modified Hosford-Coulomb (MHC) model. SEM observations of the fracture surfaces of tension specimens and of armor steel targets impacted at low and high incident velocities show that the fracture is ductile. The failure mode of the targets is petalling with crack propagation, and the fracture surfaces are covered with micro-cavities.
The parameters of each ductile fracture model have been identified for three armor steels, and the applicability of each criterion was evaluated using experimental investigations coupled with numerical simulations. Two loading paths were investigated under a wide range of strain rates: quasi-static and intermediate uniaxial tension, and quasi-static and dynamic double shear, covering various values of the stress triaxiality η and of the Lode angle parameter θ. All experiments were conducted on three different armor steel specimens under quasi-static strain rates ranging from 10⁻⁴ to 10⁻¹ s⁻¹ and at three temperatures ranging from 297 K to 500 K, allowing the influence of temperature on the fracture process to be characterized. Intermediate tension testing was coupled with dynamic double shear experiments conducted on a Hopkinson tube device, making it possible to capture the effect of high strain rate on damage evolution and crack propagation. The fracture criteria were implemented into the FE code ABAQUS via a VUMAT subroutine and coupled with suitable constitutive relations to obtain reliable simulations of ballistic impact problems. The calibration of the four damage criteria, as well as a concise evaluation of the applicability of each criterion, is detailed in this work.
Keywords: armor steels, ballistic impact, damage criteria, ductile fracture, SEM
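Of the four criteria, the Johnson-Cook failure model has the simplest closed form: the failure strain is a product of a triaxiality term, a strain-rate term, and a temperature term. A sketch follows; the D-parameters are illustrative placeholders, not the calibrated values identified in this work:

```python
import math

def johnson_cook_failure_strain(triaxiality, strain_rate_ratio, T_star,
                                D1, D2, D3, D4, D5):
    """Johnson-Cook equivalent plastic strain at failure:
    eps_f = [D1 + D2*exp(D3*eta)] * [1 + D4*ln(rate_ratio)] * [1 + D5*T*],
    where eta is the stress triaxiality, rate_ratio the normalized strain
    rate, and T* the homologous temperature."""
    return ((D1 + D2 * math.exp(D3 * triaxiality))
            * (1.0 + D4 * math.log(strain_rate_ratio))
            * (1.0 + D5 * T_star))

# Illustrative parameters (assumed, not this study's calibration); D3 < 0
# encodes the usual loss of ductility as triaxiality increases.
eps_f = johnson_cook_failure_strain(triaxiality=1 / 3, strain_rate_ratio=1.0,
                                    T_star=0.0, D1=0.05, D2=3.44,
                                    D3=-2.12, D4=0.002, D5=0.61)
```

In a VUMAT, this failure strain feeds a damage accumulation rule (typically D = Σ Δε_p / ε_f, with element deletion at D = 1), which is how the criterion drives the petalling simulation.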
Procedia PDF Downloads 312
27108 A Propose of Personnel Assessment Method Including a Two-Way Assessment for Evaluating Evaluators and Employees
Authors: Shunsuke Saito, Kazuho Yoshimoto, Shunichi Ohmori, Sirawadee Arunyanart
Abstract:
In this paper, we propose an assessment mechanism that both raters and ratees (employees) can find convincing. Many problems exist in personnel assessment; we focus on three of them: (1) raters do not sufficiently recognize the assessment points; (2) ratees are not convinced by the assessment mechanism; (3) empathy between raters and ratees can bias the result. To solve these problems, we propose: 1) a setting for "understanding of the assessment points"; 2) a setting for "relative assessment ability"; and 3) a two-way assessment mechanism. As a prerequisite, it is assumed that there are multiple raters, reflecting the growing importance of multi-faceted assessment. In this model, the weight of each assessment point is determined from the raters' degree of understanding and assessment ability, as judged by both raters and ratees. We used the Analytic Network Process (ANP), an extension of the decision-making technique AHP (Analytic Hierarchy Process). ANP can address problems structured as a network, which makes two-way assessment possible. Applying this technique to personnel assessment, the weight of each rater at each assessment point can be determined reasonably. We suggest an absolute assessment for the two-way assessment by ANP. We have verified that acceptance of the proposed approach is higher than that of the conventional mechanism, and a human resources consultant commented on its application in practice.
Keywords: personnel evaluation, pairwise comparison, analytic network process (ANP), two-ways
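The AHP/ANP machinery underlying the proposal derives weights from pairwise comparisons: the priority vector is the principal eigenvector of the comparison matrix, here computed by power iteration. A minimal sketch with an invented 3-criterion matrix, not the paper's actual assessment points:

```python
import numpy as np

def priority_weights(pairwise, iters=100):
    """Priority vector of a pairwise-comparison matrix:
    the principal eigenvector, found by power iteration and
    normalized so the weights sum to 1."""
    n = pairwise.shape[0]
    w = np.ones(n) / n
    for _ in range(iters):
        w = pairwise @ w
        w /= w.sum()
    return w

# Hypothetical comparisons among three assessment points (values illustrative):
# point 1 is 3x as important as point 2 and 5x as important as point 3, etc.
M = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])
w = priority_weights(M)
```

In the two-way scheme, the same computation is run on comparison matrices filled in by ratees as well as raters, and the resulting vectors are combined through the ANP supermatrix.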
Procedia PDF Downloads 378
27107 Simplified 3R2C Building Thermal Network Model: A Case Study
Authors: S. M. Mahbobur Rahman
Abstract:
Whole-building energy simulation models are widely used for predicting future energy consumption, diagnosing performance, and optimizing control. The black-box approach to building energy modeling has been heavily studied in the past decade. The thermal response of a building can also be modeled using a network of interconnected resistors (R) and capacitors (C), called an R-C network. In this study, a model building, Case 600, as described in the “Standard Method of Test for the Evaluation of Building Energy Analysis Computer Program” (ASHRAE Standard 140), is studied with a 3R2C thermal network model and the ASHRAE clear-sky solar radiation model. Although a building energy model involves two important components, the envelope and the internal mass, the effect of internal mass is not considered in this study. All the characteristic parameters of the building envelope are evaluated as for Case 600. Finally, monthly building energy consumption from the thermal network model is compared with a simple-box energy model and agrees within reasonable accuracy. A variation of 0.6-9.4% in monthly energy consumption is observed, attributable to the south-facing windows.
Keywords: ASHRAE case study, clear sky solar radiation model, energy modeling, thermal network model
Procedia PDF Downloads 143
27106 Like Making an Ancient Urn: Metaphor Conceptualization of L2 Writing
Authors: Muhalim Muhalim
Abstract:
Drawing on Lakoff’s theory of metaphor conceptualization, this article explores the conceptualization of second language writing (L2W) by ten student-teachers in Indonesia via metaphors. The participants, postgraduate English language teaching students who are also (former) English teachers, received seven days of intervention in L2 teaching and learning. Based on introspective logs and a focus group discussion, the results show that the participants unanimously perceive L2W as a process-oriented rather than a product-oriented activity. Specifically, the metaphor conceptualizations exhibit three categories of process-oriented L2W: deliberate process, learning process, and problem-solving process. It must be clarified from the outset, however, that this categorization is not rigid, because some properties of the metaphors might belong to other categories. Results of the study and implications for English language teaching are discussed.
Keywords: metaphor conceptualisation, second language, learning writing, teaching writing
Procedia PDF Downloads 411
27105 Effects of Changes in LULC on Hydrological Response in Upper Indus Basin
Authors: Ahmad Ammar, Umar Khan Khattak, Muhammad Majid
Abstract:
Empirically based lumped hydrologic models have an extensive track record of use in watershed management and flood-related studies. This study focuses on the impact of LULC change over a 10-year period on discharge in a watershed, using the lumped model HEC-HMS. The Indus above Tarbela region acts as a source of the main flood events in the middle and lower portions of the Indus because of the amount of rainfall and the topographic setting of the region, and the discharge pattern of the region is influenced by the LULC associated with it. In this study, Landsat TM images were used for the LULC analysis of the watershed, and daily TRMM satellite precipitation data were used as the rainfall input. The input variables for model building in HEC-HMS were then calculated from the GIS data collected and pre-processed in HEC-GeoHMS. SCS-CN was used as the loss model, the SCS unit hydrograph method as the transform model, and Muskingum as the routing model. The years 2000 and 2010 were taken for discharge simulation: HEC-HMS was calibrated for the year 2000 and then validated for 2010. The performance of the model was assessed through the calibration and validation process, giving R² = 0.92 in both. The relative bias was -9% for the year 2000 and -14% for 2010. The results show that over the 10 years the impact of LULC change on discharge has been negligible in the study area overall. One reason is that the proportion of built-up area in the watershed, which is the main causative factor of change in discharge, is less than 1% of the total area. However, locally, the impact of development was found to be significant in the built-up area of Mansehra city, so the analysis was also done on the Mansehra city sub-watershed, which has an area of about 16 km² and more than 13% built-up area in 2010.
The results showed that with a 40% increase in built-up area in the city from 2000 to 2010, the discharge values increased by about 33 percent, indicating the impact of LULC change on discharge.
Keywords: LULC change, HEC-HMS, Indus Above Tarbela, SCS-CN
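The SCS-CN loss model at the heart of the HEC-HMS setup turns a storm depth and a curve number into direct runoff, which is how a LULC-driven increase in CN (more built-up, less pervious area) raises discharge. A sketch in metric units; the CN values are illustrative, not the study's calibrated ones:

```python
def scs_runoff_mm(P, CN):
    """SCS Curve Number direct runoff, depths in mm.
    Metric potential retention S = 25400/CN - 254; initial abstraction
    Ia = 0.2*S; runoff Q = (P - Ia)^2 / (P - Ia + S) for P > Ia, else 0."""
    S = 25400.0 / CN - 254.0
    Ia = 0.2 * S
    if P <= Ia:
        return 0.0
    return (P - Ia) ** 2 / (P - Ia + S)

# Same 50 mm storm, two land covers: a higher CN (urbanized surface)
# produces substantially more runoff than a lower CN (vegetated surface).
q_rural = scs_runoff_mm(50.0, 70)
q_urban = scs_runoff_mm(50.0, 90)
```

In HEC-HMS this runoff depth is then distributed in time by the SCS unit hydrograph and routed downstream with Muskingum, producing the simulated discharge compared between 2000 and 2010.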
Procedia PDF Downloads 510
27104 Exploring the Applications of Neural Networks in the Adaptive Learning Environment
Authors: Baladitya Swaika, Rahul Khatry
Abstract:
Computer Adaptive Tests (CATs) are one of the most efficient ways of testing the cognitive abilities of students. CATs are based on Item Response Theory (IRT), which rests on item selection and ability estimation: items are selected by maximum information (or by selection from the posterior), and ability is estimated with maximum-likelihood (ML) or maximum a posteriori (MAP) estimators. This study aims at combining the classical and Bayesian approaches to IRT to create a dataset, which is then fed to a neural network that automates ability estimation, and at comparing the result to traditional CAT models designed using IRT. The study uses Python as the base coding language, pymc for statistical modelling of the IRT, and scikit-learn for the neural network implementation. On comparison, the neural-network-based model is found to perform 7-10% worse than the IRT model for score estimation. Although it performs worse than the IRT model, the neural network model can still be beneficially used in back-ends to reduce time complexity: the IRT model has to re-calculate the ability every time it receives a request, whereas a trained regressor can produce a prediction in a single step. This study also proposes a new kind of framework whereby the neural network model incorporates feature sets beyond the normal IRT feature set, using a neural network’s capacity to learn unknown functions to give rise to better CAT models. Categorical features such as test type could be learnt and incorporated into IRT functions with the help of techniques like logistic regression, yielding models that are not trivial to express via equations. Such a framework, when implemented, would be highly advantageous in psychometrics and cognitive assessment.
This study gives a brief overview of how neural networks can be used in adaptive testing, not only to reduce time complexity but also to incorporate newer and richer datasets, which would eventually lead to higher-quality testing.
Keywords: computer adaptive tests, item response theory, machine learning, neural networks
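The IRT machinery being emulated can be made concrete: under the two-parameter logistic (2PL) model, the response probability is a logistic function of ability, and the ML ability estimate is the root of the score function (the derivative of the log-likelihood). A self-contained sketch with invented item parameters:

```python
import math

def irt_2pl_prob(theta, a, b):
    """2PL IRT: probability of a correct response for ability theta,
    discrimination a, and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def ml_ability(responses, items, lo=-4.0, hi=4.0, tol=1e-6):
    """Maximum-likelihood ability estimate via bisection on the score
    function sum(a * (u - P(theta))), which decreases in theta."""
    def score(theta):
        return sum(a * (u - irt_2pl_prob(theta, a, b))
                   for u, (a, b) in zip(responses, items))
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if score(mid) > 0:
            lo = mid        # estimate lies above mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical 3-item test, (discrimination, difficulty) pairs invented:
items = [(1.2, -0.5), (1.0, 0.0), (0.8, 0.5)]
theta_hat = ml_ability([1, 1, 0], items)
```

This per-request root-finding is exactly the cost the study's trained regressor amortizes into a single forward pass.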
Procedia PDF Downloads 173
27103 Techno-Economic Optimization and Evaluation of an Integrated Industrial Scale NMC811 Cathode Active Material Manufacturing Process
Authors: Usama Mohamed, Sam Booth, Aliysn J. Nedoma
Abstract:
As part of the transition to electric vehicles, there has been a recent increase in demand for battery manufacturing. Cathodes typically account for approximately 50% of the total lithium-ion battery cell cost and are a pivotal factor in determining the viability of new industrial infrastructure. Cathodes that offer lower costs whilst maintaining or increasing performance, such as nickel-rich layered cathodes, have a significant competitive advantage when scaling up manufacturing. This project evaluates the techno-economic value proposition of an integrated industrial-scale cathode active material (CAM) production process, closing the mass and energy balances and optimizing the operating conditions using a sensitivity analysis. A process model of a co-precipitation synthesis route is developed in Aspen Plus and validated against experimental data, with the reaction mechanisms and equilibrium conditions established from previous literature and HSC-Chemistry software. The energy streams are then integrated, waste recovery and treatment processes are added, and the effect of key parameters (temperature, pH, reaction time, etc.) on CAM production yield and emissions is tested. Finally, an economic analysis estimates the fixed and variable costs (including capital expenditure, labor costs, raw materials, etc.) to calculate the cost of CAM ($/kg and $/kWh), the total plant cost ($), and the net present value (NPV). This work sets a foundational blueprint for future research into sustainable industrial-scale processes for CAM manufacturing.
Keywords: cathodes, industrial production, nickel-rich layered cathodes, process modelling, techno-economic analysis
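The NPV calculation that closes the economic analysis discounts each year's net cash flow back to time zero. A minimal sketch with invented plant figures, not the study's cost estimates:

```python
def npv(rate, cash_flows):
    """Net present value of a series of annual cash flows,
    with cash_flows[0] at time zero (typically the capital outlay):
    NPV = sum(CF_t / (1 + r)**t)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical CAM plant: $100M capex up front, $25M net cash flow for
# six years, 8% discount rate (all figures are illustrative assumptions).
value = npv(0.08, [-100e6] + [25e6] * 6)
```

A positive NPV at the chosen discount rate means the hypothetical plant clears its hurdle rate; in the sensitivity analysis, the same function would be re-evaluated as yield, energy cost, and raw-material prices vary.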
Procedia PDF Downloads 99
27102 Using Analytic Hierarchy Process as a Decision-Making Tool in Project Portfolio Management
Authors: Darius Danesh, Michael J. Ryan, Alireza Abbasi
Abstract:
Project Portfolio Management (PPM) is an essential component of an organisation’s strategic procedures and requires attention to several factors in order to envisage the range of long-term outcomes that support strategic portfolio decisions. To evaluate overall efficiency at the portfolio level, it is essential to identify the functionality of specific projects and to aggregate those findings in a mathematically meaningful manner that indicates the strategic significance of the associated projects at several levels of abstraction. PPM success is directly associated with the quality of the decisions made, and poor judgment increases portfolio costs. Hence, various Multi-Criteria Decision Making (MCDM) techniques have been designed and employed to support the decision-making functions. This paper reviews options for improving decision-making outcomes in organisational portfolio management processes using the Analytic Hierarchy Process (AHP), from both academic and practical perspectives, and examines the usability, certainty, and quality of the technique. The results of the study also provide insight into the technical risk associated with current decision-making models, to underpin initiative tracking and strategic portfolio management.
Keywords: analytic hierarchy process, decision support systems, multi-criteria decision making, project portfolio management
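One practical check AHP offers for the decision-quality concern above is the consistency ratio of the pairwise judgments. A sketch using Saaty's random-index values; the comparison matrix below is illustrative, not a real portfolio:

```python
import numpy as np

# Saaty's random consistency index for matrix sizes n = 1..9
# (the ratio is meaningful for n >= 3):
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def consistency_ratio(pairwise):
    """AHP consistency ratio CR = CI / RI, with CI = (lambda_max - n)/(n - 1).
    Judgments are conventionally accepted when CR < 0.1."""
    n = pairwise.shape[0]
    lam_max = np.max(np.linalg.eigvals(pairwise).real)
    ci = (lam_max - n) / (n - 1)
    return ci / RI[n]

# A matrix built from exact ratios (weights 4:2:1) is perfectly consistent,
# so its largest eigenvalue equals n and the ratio is ~0.
M = np.array([[1.0, 2.0, 4.0],
              [0.5, 1.0, 2.0],
              [0.25, 0.5, 1.0]])
cr = consistency_ratio(M)
```

In a PPM setting, a portfolio board whose project comparisons fail the CR < 0.1 threshold would be asked to revisit its judgments before the derived weights are used.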
Procedia PDF Downloads 320