Search results for: analysis hierarchy process (AHP)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 37485

36075 Flame Kernel Growth and Related Effects of Spark Plug Electrodes: Fluid Motion Interaction in an Optically Accessible DISI Engine

Authors: A. Schirru, A. Irimescu, S. Merola, A. d’Adamo, S. Fontanesi

Abstract:

One of the aspects that is usually neglected during the design phase of an engine is the effect of the spark plug on the flow field inside the combustion chamber. Because of the difficulties in experimentally investigating the mutual interaction between flow alteration and the early flame kernel convection effect inside the engine combustion chamber, CFD-3D simulation is usually exploited in such cases. Experimentally, a particular type of engine has to be used in order to directly observe the flame propagation process. In this study, a double-electrode spark plug was fitted into an optically accessible engine, and a high-speed camera was used to capture the initial stages of the combustion process. Both the arc and the kernel phases were observed. A morphologic analysis was then carried out, and the position of the center of mass of the flame, relative to the spark plug position, was calculated. The crossflow orientation was chosen for the spark plug, and the kernel growth process was observed for different air-fuel ratios. It was observed that during a normal cycle the flow field between the electrodes tends to transport the arc and deform it. Because of that, the kernel growth phase takes place away from the electrodes, and the flame propagates with a preferential direction dictated by the flow field.
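
As a rough illustration of the morphologic analysis step, the sketch below shows one way the center of mass of a thresholded high-speed frame could be computed relative to the spark plug gap. It is not the authors' code; the function name, threshold value, and synthetic frame are hypothetical.

```python
import numpy as np

def flame_centroid_offset(frame, plug_xy, threshold=40):
    """Binarize one high-speed camera frame and return the flame centroid
    position relative to the spark plug gap (in pixels).

    frame     : 2D numpy array of grayscale intensities (hypothetical input)
    plug_xy   : (x, y) pixel coordinates of the electrode gap
    threshold : intensity cut separating flame luminosity from background
    """
    mask = frame > threshold                     # luminous kernel pixels
    ys, xs = np.nonzero(mask)
    if xs.size == 0:                             # no kernel detected yet
        return None
    cx, cy = xs.mean(), ys.mean()                # center of mass of the flame
    return cx - plug_xy[0], cy - plug_xy[1]      # displacement from the plug

# Example with a synthetic frame: a bright blob displaced away from the plug
frame = np.zeros((480, 640))
frame[200:240, 360:420] = 255
print(flame_centroid_offset(frame, plug_xy=(320, 240)))
```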

Keywords: Combustion, Optically Accessible Engine, Spark-Ignition Engine, Spark Orientation, Kernel Growth

Procedia PDF Downloads 134
36074 A Simulated Scenario of WikiGIS to Support the Iteration and Traceability Management of the Geodesign Process

Authors: Wided Batita, Stéphane Roche, Claude Caron

Abstract:

Geodesign is an emergent term related to a new and complex process. Hence, it requires rethinking tools, technologies and platforms in order to efficiently achieve its goals. A few tools have emerged since 2010, such as CommunityViz, GeoPlanner, etc. In the era of Web 2.0 and collaboration, WikiGIS has been proposed as a new category of tools. In this paper, we present WikiGIS functionalities dealing mainly with iteration and traceability management to support collaboration in the geodesign process. WikiGIS is built on GeoWeb 2.0 technologies —and primarily on wiki— and aims at managing the tracking of participants’ editing. This paper focuses on a simplified simulation to illustrate the strength of WikiGIS in the management of traceability and in the access to history in a geodesign process. Indeed, a cartographic user interface has been implemented, and then a hypothetical use case has been imagined as a proof of concept.

Keywords: geodesign, history, traceability, tracking of participants’ editing, WikiGIS

Procedia PDF Downloads 235
36073 Re-Engineering Management Process in Iran’s Smart Schools

Authors: M. R. Babaei, S. M. Hosseini, S. Rahmani, L. Moradi

Abstract:

Today, the quality and effectiveness of education and training systems are of major concern to stakeholders and decision-makers in every country. In Iran, this concern is twofold for numerous reasons; over the past decade, governments have hardly even covered the running costs of education. ICT is claimed to have the power to change the structure of training programs, reduce costs, increase quality, make education systems and products consistent with the needs of the community, and bring education closer to practice. One of the areas that the introduction of information technology has fundamentally changed is the field of education. The aim of this research is to re-engineer the management process in smart schools; field studies were used to collect data in the form of interviews and a questionnaire survey. The statistical population of this research comprised the smart schools operating under the Iranian education system, and sampling was purposive. The data collection tool was a questionnaire of 36 questions, each designating one of the factors affecting the management of smart schools. Each question consists of two parts: the first designates the position of the factor in the management process, i.e., the management function it belongs to (planning, organizing, leading, controlling) according to the classification of Dabryn; the second examines, on a Likert scale, how strongly the factor affects the process of managing smart schools. The validity of the questions was approved by a group of experts and prominent university professors in the fields of information technology, management and re-engineering, and the reliability was evaluated and approved using Cronbach's alpha. To analyse the data, descriptive statistics (frequency tables, mean, median, mode) were used for the Likert-scale ratings, and inferential statistics, including analysis of variance and the nonparametric Friedman test, were used to evaluate the hypotheses. The research conclusions show that the identified factors influence the re-engineering of the management process in smart schools and, in turn, school performance.
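
Since the abstract reports a Cronbach's alpha reliability check of the questionnaire, a minimal sketch of that calculation is given below. The response matrix is hypothetical; this is not the authors' analysis script.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a respondents x items matrix of Likert scores."""
    X = np.asarray(item_scores, dtype=float)
    k = X.shape[1]                               # number of questionnaire items
    item_vars = X.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)        # variance of respondents' totals
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical responses of 5 participants to 4 Likert-scale items
scores = [[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4], [2, 3, 2, 3], [4, 4, 5, 5]]
print(round(cronbach_alpha(scores), 3))
```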

Keywords: re-engineering, management process, smart school, Iran's school

Procedia PDF Downloads 234
36072 Meta Model for Optimum Design Objective Function of Steel Frames Subjected to Seismic Loads

Authors: Salah R. Al Zaidee, Ali S. Mahdi

Abstract:

Except for simple problems of statically determinate structures, optimum design problems in structural engineering have implicit objective functions, where structural analysis and design are essential within each searching loop. With these implicit functions, the structural engineer is usually forced to write his/her own computer code for analysis, design, and searching for the optimum design among many feasible candidates, and cannot take advantage of available software for structural analysis, design, and searching for the optimum solution. The meta-model is a regression model used to transform an implicit objective function into an explicit one, which in turn decouples the structural analysis and design processes from the optimum searching process. With the meta-model, well-known software for structural analysis and design can be used in sequence with optimum searching software. In this paper, the meta-model has been used to develop an explicit objective function for plane steel frames subjected to dead, live, and seismic forces. Frame topology is assumed to be predefined based on architectural and functional requirements. Column and beam sections and different connection details are the main design variables in this study. Columns and beams are grouped to reduce the number of design variables and to make the problem similar to that adopted in engineering practice. Data for the implicit objective function have been generated based on analysis and assessment of many design proposals with CSI SAP software. These data have been used later in SPSS software to develop a pure quadratic nonlinear regression model for the explicit objective function. Good correlations, with a coefficient of determination, R², in the range from 0.88 to 0.99, have been noted between the original implicit functions and the corresponding explicit functions generated with the meta-model.
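
To make the meta-model idea concrete, here is a minimal sketch of fitting a pure quadratic regression (intercept, linear, and squared terms, with no interaction terms) to analysis data and reporting R². The design variables, data, and function names are hypothetical assumptions; the study itself used SPSS rather than this code.

```python
import numpy as np

def fit_pure_quadratic(X, y):
    """Fit y ≈ b0 + Σ bi·xi + Σ ci·xi² (pure quadratic, no cross terms)
    by ordinary least squares; return coefficients and R²."""
    X = np.asarray(X, float)
    A = np.column_stack([np.ones(len(X)), X, X**2])   # design matrix
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    y_hat = A @ beta
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return beta, 1.0 - ss_res / ss_tot

# Hypothetical data: grouped section sizes vs. an objective such as frame weight
rng = np.random.default_rng(0)
X = rng.uniform(0.5, 2.0, size=(40, 3))
y = (5 + X @ np.array([2.0, 1.5, 0.8])
     + (X**2) @ np.array([0.6, 0.3, 0.2])
     + rng.normal(0, 0.1, 40))
beta, r2 = fit_pure_quadratic(X, y)
print(round(r2, 3))
```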

Keywords: meta-model, objective function, steel frames, seismic analysis, design

Procedia PDF Downloads 230
36071 Analytical Modelling of Surface Roughness during Compacted Graphite Iron Milling Using Ceramic Inserts

Authors: Ş. Karabulut, A. Güllü, A. Güldaş, R. Gürbüz

Abstract:

This study investigates the effects of the lead angle and chip thickness variation on surface roughness during the machining of compacted graphite iron using ceramic cutting tools under dry cutting conditions. Analytical models were developed for predicting the surface roughness values of the specimens after the face milling process. Experimental data was collected and imported to the artificial neural network model. A multilayer perceptron model was used with the back propagation algorithm employing the input parameters of lead angle, cutting speed and feed rate in connection with chip thickness. Furthermore, analysis of variance was employed to determine the effects of the cutting parameters on surface roughness. Artificial neural network and regression analysis were used to predict surface roughness. The values thus predicted were compared with the collected experimental data, and the corresponding percentage error was computed. Analysis results revealed that the lead angle is the dominant factor affecting surface roughness. Experimental results indicated an improvement in the surface roughness value with decreasing lead angle value from 88° to 45°.
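
As a rough illustration of the multilayer perceptron approach described above, the sketch below trains a small scikit-learn MLP regressor (backpropagation) on cutting parameters to predict Ra and reports the percentage error against the measurements. The dataset, network size, and parameter values are illustrative assumptions, not the study's data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical milling records: [lead angle (deg), cutting speed (m/min),
# feed rate (mm/rev)] -> measured surface roughness Ra (µm)
X = np.array([[45, 200, 0.10], [45, 250, 0.15], [60, 200, 0.10],
              [60, 250, 0.15], [75, 300, 0.20], [88, 300, 0.20],
              [88, 200, 0.10], [75, 250, 0.15]])
y = np.array([0.9, 1.1, 1.2, 1.4, 1.9, 2.4, 2.0, 1.6])

# Multilayer perceptron regressor trained with backpropagation
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=1),
)
model.fit(X, y)

predicted = model.predict(X)
error_pct = 100 * np.abs(predicted - y) / y   # percentage error vs. experiment
print(error_pct.round(1))
```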

Keywords: CGI, milling, surface roughness, ANN, regression, modeling, analysis

Procedia PDF Downloads 440
36070 Employing Motivation, Enjoyment and Self-Regulation to Predict Aural Vocabulary Knowledge

Authors: Seyed Mohammad Reza Amirian, Seyedeh Khadije Amirian, Maryam Sabouri

Abstract:

The present study aimed to investigate second language (L2) motivation, enjoyment, and self-regulation as the main variables for explaining variance in the process and outcome of L2 Aural Vocabulary Knowledge (AVK) development, focusing on Iranian EFL students at Hakim Sabzevari University. To this end, 122 EFL students (86 females and 36 males) participated in this study. The students filled out the Motivation Questionnaire, the Foreign Language Enjoyment Questionnaire, and the Self-Regulation Questionnaire, and also took an Aural Vocabulary Knowledge (AVK) test. Using SPSS software, the data were analyzed through multiple regression and path analysis. A preliminary Pearson correlation analysis revealed that 2 out of 3 independent variables were significantly linked to AVK. According to the obtained regression model, self-regulation was a significant predictor of aural vocabulary knowledge. Finally, the results of the mediation analysis showed that the indirect effect of enjoyment on AVK through self-regulation was significant. These findings are discussed, and implications are offered.
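
For readers unfamiliar with the mediation step, here is a minimal regression-based sketch of estimating an indirect effect (predictor → mediator → outcome). The simulated scores and effect sizes are hypothetical; the study's analysis was run in SPSS, and significance of the indirect effect would normally be assessed with a bootstrap or Sobel test rather than this bare estimate.

```python
import numpy as np

def indirect_effect(x, m, y):
    """Regression-based mediation x -> m -> y.
    Returns (a, b, a*b): a is the x->m slope, b is the m->y slope
    controlling for x, and a*b is the indirect (mediated) effect."""
    X1 = np.column_stack([np.ones_like(x), x])
    a = np.linalg.lstsq(X1, m, rcond=None)[0][1]
    X2 = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X2, y, rcond=None)[0][2]
    return a, b, a * b

# Hypothetical standardized scores: enjoyment (x), self-regulation (m), AVK (y)
rng = np.random.default_rng(2)
enjoyment = rng.normal(size=122)
self_reg = 0.5 * enjoyment + rng.normal(scale=0.8, size=122)
avk = 0.6 * self_reg + rng.normal(scale=0.7, size=122)
print(indirect_effect(enjoyment, self_reg, avk))
```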

Keywords: aural vocabulary knowledge, enjoyment, motivation, self-regulation

Procedia PDF Downloads 140
36069 Review of Friction Stir Welding of Dissimilar 5000 and 6000 Series Aluminum Alloy Plates

Authors: K. Subbaiah

Abstract:

Friction stir welding is a solid-state welding process. The friction stir welding process eliminates the defects found in fusion welding processes. It is an environmentally friendly process. 5000 and 6000 series aluminum alloys are widely used in the transportation industries. The Al-Mg-Mn (5000) and Al-Mg-Si (6000) alloys offer the best combination of properties for use in marine construction. The medium-strength, highly corrosion-resistant 5000 series alloys are among the most widely used aluminum alloys in the world. In this review, the tool pin profile, the process parameters, the resulting mechanical properties such as hardness, yield strength and tensile strength, and the microstructural evolution of friction stir welding of 5000 and 6000 series Al-Mg alloys are discussed.

Keywords: 5000 series and 6000 series Al alloys, friction stir welding, tool pin profile, microstructure and properties

Procedia PDF Downloads 446
36068 Automation of Kitchen Chemical in the Textile Industry

Authors: José Luiz da Silva Neto, Renato Sipelli Silva, Érick Aragão Ribeiro

Abstract:

The automation of industrial processes plays a vital role in industries today, having become an integral and important part of the industrial process and modern production. Process control systems are designed to maximize production, reduce costs and minimize risks in production. However, these systems are generally deployed without methodologies and planning. This article therefore describes the development of an automation system for a chemical preparation kitchen in the textile industry, based on a retrofitting methodology that provides more quality in the process at a lower cost.

Keywords: automation, textile industry, kitchen chemical, information integration

Procedia PDF Downloads 415
36067 Computational Team Dynamics in Student New Product Development Teams

Authors: Shankaran Sitarama

Abstract:

Teamwork is an extremely effective pedagogical tool in engineering education. New Product Development (NPD) has been an effective strategy for companies to streamline and bring innovative products and solutions to customers. Thus, the engineering curriculum in many schools, some collaboratively with business schools, has brought NPD into the curriculum at the graduate level. Teamwork is invariably used during instruction, where students work in teams to come up with new products and solutions. A significant portion of the grade is placed on the semester-long teamwork so that it is taken seriously by students. As the students work in teams and go through this process to develop new product prototypes, their effectiveness and learning depend to a great extent on how they function as a team, go through the creative process, come together, and work towards the common goal. A core attribute of a successful NPD team is its creativity and innovation. The team needs to be creative as a group, generating a breadth of ideas and innovative solutions that solve or address the problem they are targeting and meet the user’s needs. They also need to be very efficient in their teamwork as they work through the various stages of the development of these ideas, resulting in a proof-of-concept (POC) implementation or a prototype of the product. The simultaneous requirement for teams to be creative and at the same time converge and work together imposes different types of tensions on their interactions. These ideational and sometimes relational tensions and conflicts are inevitable. Effective teams have to deal with the team dynamics and manage them so as to be resilient enough and yet creative. This research paper provides a computational analysis of the teams’ communication that is reflective of the team dynamics and, through a superimposition of latent semantic analysis with social network analysis, provides a computational methodology for arriving at visual patterns of team interaction. These team interaction patterns have clear correlations to the team dynamics and provide insights into the functioning, and thus the effectiveness, of the teams. 23 student NPD teams over 2 years of a course on managing NPD, with a blend of engineering and business school students, are considered, and the results are presented. The analysis is also correlated with the teams’ detailed and tailored individual and group feedback and with a self-reflection and evaluation questionnaire.
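
A minimal sketch of the kind of superimposition described, latent semantic analysis of team messages combined with a social-network view of the resulting similarities, is given below. The messages, similarity threshold, and library choices are hypothetical and do not reproduce the paper's actual pipeline.

```python
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical messages exchanged by members of one NPD team
messages = {
    "alice": "prototype sensor housing needs a smaller enclosure",
    "bob":   "I can redesign the enclosure around the sensor this week",
    "carol": "marketing wants the user survey results before the prototype review",
    "dave":  "survey results are in, users prefer the smaller housing",
}

# Latent semantic analysis: TF-IDF followed by truncated SVD
tfidf = TfidfVectorizer().fit_transform(messages.values())
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Interaction graph: edge weight = semantic similarity of members' communication
sim = cosine_similarity(lsa)
G = nx.Graph()
names = list(messages)
for i, a in enumerate(names):
    for j in range(i + 1, len(names)):
        if sim[i, j] > 0.3:                  # keep only meaningful overlaps
            G.add_edge(a, names[j], weight=round(float(sim[i, j]), 2))

print(G.edges(data=True))                    # one view of the team's interaction pattern
```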

Keywords: team dynamics, social network analysis, team interaction patterns, new product development teamwork, NPD teams

Procedia PDF Downloads 100
36066 Development of the Independent Building Permit System to Improve Productivity and Quality Service

Authors: Hartomo Soewardi, Bachtiar Jouhari

Abstract:

The ineffectiveness and inefficiency of the building permit process in Indonesia remain a major problem for applicants. Long service times, a complicated administrative process, and expensive fees cause dissatisfaction and discomfort for applicants. Therefore, it is critical to improve the quality of service of the building permit system. The objective of this research is to develop a better process for the system to improve productivity and service quality. The Lean Six Sigma concept, using the DMAIC procedure, was applied to analyze the existing system. Moreover, improvement of the system was carried out using the Axiomatic Design method. A verification test was done to test the hypothesis of the proposed system design. The results of this research show that the proposed system can increase service-time efficiency by 61.8% and is more effective and easier to use.

Keywords: axiomatic design, building permit system, DMAIC, Lean Six Sigma

Procedia PDF Downloads 322
36065 Effect of Electromagnetic Field on Capacitive Deionization Performance

Authors: Alibi Kilybay, Emad Alhseinat, Ibrahim Mustafa, Abdulfahim Arangadi, Pei Shui, Faisal Almarzooqi

Abstract:

In this work, an electromagnetic field has been used to improve the performance of the capacitive deionization process. The effect of electromagnetic fields on the efficiency of the capacitive deionization (CDI) process was investigated experimentally. The results showed that treating the feed stream of the CDI process using an electromagnetic field can enhance the electrosorption capacity from 20% up to 70%. The effects of exposure time, concentration, and type of ions have been examined. The electromagnetic field enhanced the salt adsorption capacity (SAC) of the Ca²⁺ ions by 70%, while the SAC of the Na⁺ ions was enhanced by 20%. It is hypothesized that the electromagnetic field affects the hydration shell around the ions and thus reduces their effective size and enhances the mass transfer. This reduction in ion effective size and increase in mass transfer enhanced the electrosorption capacity and kinetics of the CDI process.

Keywords: capacitive deionization, desalination, electromagnetic treatment, water treatment

Procedia PDF Downloads 247
36064 Automatic Lead Qualification with Opinion Mining in Customer Relationship Management Projects

Authors: Victor Radich, Tania Basso, Regina Moraes

Abstract:

Lead qualification is one of the main procedures in Customer Relationship Management (CRM) projects. Its main goal is to identify potential consumers who have the ideal characteristics to establish a profitable and long-term relationship with a certain organization. Social networks can be an important source of data for identifying and qualifying leads since interest in specific products or services can be identified from the users’ expressed feelings of (dis)satisfaction. In this context, this work proposes the use of machine learning techniques and sentiment analysis as an extra step in the lead qualification process in order to improve it. In addition to machine learning models, sentiment analysis or opinion mining can be used to understand the evaluation that the user makes of a particular service, product, or brand. The results obtained so far have shown that it is possible to extract data from social networks and combine the techniques for a more complete classification.
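
As an illustration of using opinion mining as an extra lead-qualification step, the sketch below trains a tiny sentiment classifier and scores new posts. The posts, labels, threshold, and model choice are hypothetical and stand in for whichever machine learning techniques the authors actually combined.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled posts (1 = satisfaction / purchase interest, 0 = not)
train_posts = [
    "loved the demo, when can I buy this?",
    "terrible support, never again",
    "looking for exactly this kind of service",
    "too expensive and slow",
]
train_labels = [1, 0, 1, 0]

# Sentiment / opinion-mining model: TF-IDF features + logistic regression
sentiment = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
sentiment.fit(train_posts, train_labels)

# New social-network posts collected for lead qualification
new_posts = {
    "user_a": "really impressed, need pricing for my team",
    "user_b": "the product crashed twice, disappointing",
}
for user, post in new_posts.items():
    score = sentiment.predict_proba([post])[0, 1]     # positive-opinion score
    print(user, "qualified lead" if score > 0.5 else "not a lead", round(score, 2))
```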

Keywords: lead qualification, sentiment analysis, opinion mining, machine learning, CRM, lead scoring

Procedia PDF Downloads 66
36063 TiO₂ Deactivation Process during Photocatalytic Ethanol Degradation in the Gas Phase

Authors: W. El-Alami, J. Araña, O. González Díaz, J. M. Doña Rodríguez

Abstract:

The efficiency of the semiconductor TiO₂ needs to be improved for it to be an effective tool for pollutant removal. To improve the efficiency of this semiconductor, it is necessary to deepen the knowledge of the processes that take place on its surface. In this sense, the deactivation of the catalyst is one of the aspects considered relevant. To study this point, the deactivation processes of TiO₂ during the gas-phase degradation of ethanol have been studied. For this, catalysts with only the anatase phase (SA and PC100) and catalysts with anatase and rutile phases (P25 and P90) have been selected. In order to force the deactivation processes, different cycles have been performed, adding ethanol gas but avoiding the degradation of acetates, to determine their effect on the process. The surface concentration of fluorine on the catalysts was semi-quantitatively determined by EDAX analysis. The photocatalytic experiments were done with four commercial catalysts (P25, SA, P90, and PC100) and the two fluorinated catalysts (F-P25 and F-SA). The interaction and photocatalytic degradation of ethanol were followed by Fourier transform infrared spectroscopy (FTIR). EDAX analysis revealed the presence of sodium on the surface of the fluorinated catalysts. In FTIR studies, it has been observed that the acetates adsorbed on the anatase phase in P25 and P90 give rise to electron transfer to surface traps that modify the electronic states of the semiconductor. These deactivation studies have also been carried out with the fluorinated P25 and SA catalysts (F-P25 and F-SA), in which similar electron transfers, but in the opposite direction, have been observed during illumination. In these materials, it has been observed that the electrons present in the surface traps, as a consequence of the Ti-F interaction, react with the holes, causing a change in the electronic states of the semiconductor. In this way, deactivated states of these materials have been detected via different electron transfer routes. It has been identified that the acetates produced from the degradation of ethanol in P25 and P90 are probably hydrated on the surface of the rutile phase. In the catalysts with only the anatase phase (SA and PC100), deactivation is immediate if the acetates are not removed before ethanol is adsorbed again. In F-P25 and F-SA, it has been observed that the acetates formed react with the sodium ions present on the surface and not with the Ti atoms, because the latter are interacting with the fluorine.

Keywords: photocatalytic degradation, ethanol, TiO₂, deactivation process, F-P25

Procedia PDF Downloads 61
36062 Taguchi-Based Optimization of Surface Roughness and Dimensional Accuracy in Wire EDM Process with S7 Heat Treated Steel

Authors: Joseph C. Chen, Joshua Cox

Abstract:

This research focuses on the use of the Taguchi method to reduce the surface roughness and improve the dimensional accuracy of parts machined by Wire Electrical Discharge Machining (EDM) with S7 heat-treated steel material. Due to its high impact toughness, the material is a candidate for a wide variety of tooling applications that require high precision in dimension and desired surface roughness. This paper demonstrates that the Taguchi Parameter Design methodology is able to optimize both dimensional accuracy and surface roughness successfully by investigating seven wire-EDM controllable parameters: pulse on time (ON), pulse off time (OFF), servo voltage (SV), voltage (V), servo feed (SF), wire tension (WT), and wire speed (WS). The temperature of the water in the wire EDM process is investigated as the noise factor in this research. Experimental design and analysis based on L18 Taguchi orthogonal arrays are conducted. This paper demonstrates that the Taguchi-based system enables the wire EDM process to produce (1) high-precision parts with an average dimension of 0.6601 inches, where the desired dimension is 0.6600 inches; and (2) a surface roughness of 1.7322 microns, which is significantly improved from 2.8160 microns.
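
For context on how runs in such an L18 design are typically scored, the sketch below computes the Taguchi signal-to-noise ratios commonly used for these two responses (smaller-is-better for roughness, nominal-the-best for a dimension). The replicate values are hypothetical; the study's actual computations are not shown in the abstract.

```python
import numpy as np

def sn_smaller_is_better(measurements):
    """Taguchi S/N ratio for a smaller-is-better response (e.g. roughness):
    S/N = -10 * log10(mean of squared values)."""
    y = np.asarray(measurements, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

def sn_nominal_the_best(measurements):
    """Taguchi S/N ratio for a nominal-the-best response (e.g. a dimension):
    S/N = 10 * log10(mean^2 / variance)."""
    y = np.asarray(measurements, dtype=float)
    return 10.0 * np.log10(y.mean() ** 2 / y.var(ddof=1))

# Hypothetical roughness readings (µm) for one L18 run, replicated under the
# two noise-factor levels (water temperature)
print(round(sn_smaller_is_better([2.10, 2.25, 1.95]), 2))
# Hypothetical dimension readings (inches) for the same run
print(round(sn_nominal_the_best([0.6602, 0.6599, 0.6601]), 2))
```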

Keywords: Taguchi Parameter Design, surface roughness, Wire EDM, dimensional accuracy

Procedia PDF Downloads 363
36061 Steady Conjugate Heat Transfer of Two Connected Thermal Systems

Authors: Mohamed El-Sayed Mosaad

Abstract:

An analytical approach is developed for the steady heat transfer problem of two fluid systems in thermal communication via heat conduction across a solid wall separating them. The two free convection layers created on the wall sides are assumed to be in parallel flow. The fluid-solid interface temperatures on the wall sides are not prescribed in advance in the analysis; rather, they are determined from the conjugate solution along with other unknown parameters. The analysis highlights the main conjugation parameters controlling the thermal interaction process of the involved heat transfer modes. Heat transfer results of engineering importance are obtained.

Keywords: conjugate heat transfer, boundary layer, convection, thermal systems

Procedia PDF Downloads 370
36060 Optimization of Fused Deposition Modeling 3D Printing Process via Preprocess Calibration Routine Using Low-Cost Thermal Sensing

Authors: Raz Flieshman, Adam Michael Altenbuchner, Jörg Krüger

Abstract:

This paper presents an approach to optimizing the Fused Deposition Modeling (FDM) 3D printing process through a preprocess calibration routine of printing parameters. The core of this method involves the use of a low-cost thermal sensor capable of measuring temperatures within the range of -20 to 500 degrees Celsius for detailed process observation. The calibration process is conducted by printing a predetermined path while varying the process parameters through machine instructions (g-code). This enables the extraction of critical thermal, dimensional, and surface properties along the printed path. The calibration routine utilizes computer vision models to extract features and metrics from the thermal images, including temperature distribution, layer adhesion quality, surface roughness, and dimensional accuracy and consistency. These extracted properties are then analyzed to optimize the process parameters to achieve the desired qualities of the printed material. A significant benefit of this calibration method is its potential to create printing parameter profiles for new polymer and composite materials, thereby enhancing the versatility and application range of FDM 3D printing. The proposed method demonstrates significant potential in enhancing the precision and reliability of FDM 3D printing, making it a valuable contribution to the field of additive manufacturing.

Keywords: FDM 3D printing, preprocess calibration, thermal sensor, process optimization, additive manufacturing, computer vision, material profiles

Procedia PDF Downloads 21
36059 Design Optimization of Miniature Mechanical Drive Systems Using Tolerance Analysis Approach

Authors: Eric Mxolisi Mkhondo

Abstract:

Geometrical deviations and the interaction of mechanical parts influence the performance of miniature systems. These deviations tend to cause costly problems during assembly due to imperfections of components, which are invisible to the naked eye. They also tend to cause unsatisfactory performance during operation due to deformation caused by environmental conditions. One of the effective tools to manage the deviations and interaction of parts in the system is tolerance analysis. This is a quantitative tool for predicting the tolerance variations which are defined during the design process. Traditional tolerance analysis assumes that the assembly is static and that the deviations come from manufacturing discrepancies, overlooking the functionality of the whole system and the deformation of parts due to the effect of environmental conditions. This paper presents an integrated tolerance analysis approach for a miniature system in operation. In this approach, a computer-aided design (CAD) model is developed from the system’s specification. The CAD model is then used to specify the geometrical and dimensional tolerance limits (upper and lower limits) that vary the components’ geometries and sizes while conforming to functional requirements. Worst-case tolerances are analyzed to determine the influence of dimensional changes due to the effects of operating temperatures. The method is used to evaluate the nominal condition and the worst-case conditions at the maximum and minimum dimensions of the assembled components. These three conditions are evaluated under specific operating temperatures (-40°C, -18°C, 4°C, 26°C, 48°C, and 70°C). A case study on the mechanism of a zoom lens system is used to illustrate the effectiveness of the methodology.
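
A minimal sketch of the worst-case evaluation described, stacking each part at its nominal, minimum, and maximum size and expanding it thermally at each stated temperature, is shown below. The part dimensions, tolerances, and expansion coefficients are hypothetical, chosen only to illustrate the procedure.

```python
# Worst-case tolerance stack-up of a simple two-part assembly, evaluated at
# several operating temperatures.

ALPHA_AL = 23e-6      # aluminium thermal expansion coefficient (1/°C)
ALPHA_STEEL = 12e-6   # steel thermal expansion coefficient (1/°C)
T_REF = 20.0          # reference (assembly) temperature, °C

# (nominal length mm, ± tolerance mm, expansion coefficient) - hypothetical parts
parts = [(12.000, 0.010, ALPHA_AL),      # e.g. a lens barrel segment
         (8.000, 0.008, ALPHA_STEEL)]    # e.g. a spacer

def stack_length(temp_c, condition):
    """Total stack length for 'min', 'nominal' or 'max' component dimensions,
    each expanded from the reference temperature to temp_c."""
    sign = {"min": -1.0, "nominal": 0.0, "max": +1.0}[condition]
    total = 0.0
    for nominal, tol, alpha in parts:
        size = nominal + sign * tol                       # worst-case size
        total += size * (1.0 + alpha * (temp_c - T_REF))  # thermal expansion
    return total

for t in (-40, -18, 4, 26, 48, 70):
    print(f"{t:>4} °C  min={stack_length(t, 'min'):.4f}  "
          f"nom={stack_length(t, 'nominal'):.4f}  max={stack_length(t, 'max'):.4f}")
```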

Keywords: geometric dimensioning, tolerance analysis, worst-case analysis, zoom lens mechanism

Procedia PDF Downloads 157
36058 The Application of Line Balancing Technique and Simulation Program to Increase Productivity in Hard Disk Drive Components

Authors: Alonggot Limcharoen, Jintana Wannarat, Vorawat Panich

Abstract:

This study aims to investigate the balancing of the number of operators (line balancing technique) in the production line of hard disk drive components in order to increase efficiency. At present, the use of hard disk drives has continuously declined, limiting the company’s revenue potential. It is important to improve and develop the production process to create market share and to be able to compete with competitors on value and quality. Therefore, an effective tool is needed to support such matters. In this research, the Arena program was applied to analyze the results both before and after the improvement, and the proposed configuration was verified in the model before proceeding with the real process. There were 14 work stations with 35 operators altogether in the RA production process where this study was conducted. In the actual process, the average production time was 84.03 seconds per product piece (timed 30 times at each work station), along with a performance rating assessment applying the Westinghouse principles. The rating was 123%, with an assumed allowance of 5%. Consequently, the standard time was 108.53 seconds per piece. The takt time, calculated by dividing the available working time in one day by customer demand, was 3.66 seconds per piece. From these figures, the proper number of operators was 30, which meant five operators should be removed from the line to improve the production process. After that, a production model was created from the actual process using the Arena program to confirm model reliability; the outputs of the simulation were compared with those of the actual process, and the comparison indicated that the model was reliable. Then, the number of workers and their job responsibilities were remodeled in the Arena program. Lastly, the efficiency of the production process was enhanced from 70.82% to 82.63%, meeting the target.
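
The operator-count arithmetic in the abstract can be reproduced directly, as in the short sketch below. The per-station times in the last line are hypothetical, and the study's 70.82% to 82.63% efficiency figures come from the full Arena simulation rather than from this simple ratio.

```python
import math

# Figures reported in the abstract (seconds per piece)
standard_time = 108.53   # standard time after Westinghouse rating and 5% allowance
takt_time = 3.66         # available working time per day / customer demand

# Minimum number of operators needed for the line to keep pace with demand
required_operators = math.ceil(standard_time / takt_time)
print(required_operators)            # 108.53 / 3.66 = 29.66 -> 30 operators

def balance_efficiency(station_times, cycle_time):
    """Line balance efficiency = total work content / (stations × cycle time)."""
    return sum(station_times) / (len(station_times) * cycle_time)

# Hypothetical per-station times (s) for a small 4-station illustration
print(round(100 * balance_efficiency([3.2, 3.6, 2.9, 3.4], 3.66), 1))
```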

Keywords: hard disk drive, line balancing, ECRS, simulation, arena program

Procedia PDF Downloads 217
36057 The Effect of Ingredients Mixing Sequence in Rubber Compounding on the Formation of Bound Rubber and Cross-Link Density of Natural Rubber

Authors: Abu Hasan, Rochmadi, Hary Sulistyo, Suharto Honggokusumo

Abstract:

The purpose of this research is to study the effect of the ingredient mixing sequence in rubber compounding on the formation of bound rubber and the cross-link density of natural rubber, as well as the relationship between bound rubber and cross-link density. Analyses of bound rubber formation in the rubber compound and cross-link density in the rubber vulcanizates were carried out on a natural rubber formula that was masticated and mixed, followed by curing. There were four mixing methods, each distinguished by the sequence in which carbon black was added to the rubber. In the first method, rubber was masticated for 5 min and then rubber chemicals and carbon black N 330 were added simultaneously. In the second, rubber was masticated for 1 min, followed by the simultaneous addition of rubber chemicals and carbon black N 330, using a different mixing method from the first. In the third, carbon black N 660 was used with the same mixing procedure as the second, and in the last, rubber was masticated for 3 min, and carbon black N 330 and rubber chemicals were added subsequently. The addition of rubber chemicals and carbon black into the masticated rubber was distinguished by the sequence and the time allocated for each mixing step. Carbon black was added in two stages. In the first method, 10 phr was added first and the remaining 40 phr was added later along with oil. In the second to the fourth methods, the carbon black added in the first and second stages was in phr ratios of 20:30, 30:20, and 40:10, respectively. The results showed that the ingredient mixing process influenced bound rubber formation and cross-link density. In the first three mixing methods, bound rubber formation was proportional to cross-link density. In contrast, in the fourth, bound rubber formation and cross-link density had an inverse relation. Regardless of the mixing method used, bound rubber had a nonlinear relationship with cross-link density: high cross-link density was formed when bound rubber formation was low, and the cross-link density became constant at high bound rubber content.

Keywords: bound-rubber, cross-link density, natural rubber, rubber mixing process

Procedia PDF Downloads 403
36056 Modeling and Computational Validation of Dispersion Curves of Guided Waves in a Pipe Using ANSYS

Authors: A. Perdomo, J. R. Bacca, Q. E. Jabid

Abstract:

In recent years, technological and investigative progress has been achieved in the area of monitoring of equipment and installations as a result of a deeper understanding of the physical phenomena associated with non-destructive testing (NDT). Modal analysis offers an efficient way to determine the dispersion curves of a waveguide with an arbitrary cross-section. Dispersion curves are essential for discontinuity localization based on guided waves. In this work, an isotropic hollow cylinder is dynamically analyzed in ANSYS to obtain resonant frequencies and mode shapes, all of which are associated with the dispersion curves. The numerical results provide the relation between frequency and wavelength, which is the foundation of the dispersion curves. The results of the simulation process are validated with the software GUIGW.
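
The frequency-wavelength relation mentioned above maps directly to dispersion-curve points through cp = f·λ. The sketch below illustrates that post-processing step with hypothetical modal results and an assumed λ = 2L/n relation between mode order and axial wavelength; the actual relation depends on the modelled length and boundary conditions, and this is not the authors' procedure.

```python
import numpy as np

# Hypothetical output of a modal analysis of the hollow cylinder:
# each row is (mode order n, resonant frequency in kHz) for one wave family.
modal_results = np.array([
    [1, 12.4],
    [2, 24.1],
    [3, 35.0],
    [4, 45.2],
])

L = 1.0                                  # modelled pipe length in metres (assumed)
n = modal_results[:, 0]
freq_hz = modal_results[:, 1] * 1e3

wavelength = 2.0 * L / n                 # assumed axial wavelength per mode order
phase_velocity = freq_hz * wavelength    # cp = f * lambda -> one dispersion point each

for f, cp in zip(freq_hz, phase_velocity):
    print(f"{f / 1e3:6.1f} kHz   cp = {cp:8.1f} m/s")
```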

Keywords: ANSYS APDL, dispersion curves, guided waves, modal analysis

Procedia PDF Downloads 234
36055 Developing Interactive Media for Piston Engine Lectures to Improve Cadets Learning Outcomes: Literature Study

Authors: Jamaludin Jamaludin, Suparji Suparji, Lilik Anifah, I. Gusti Putu Asto Buditjahjanto, Eppy Yundra

Abstract:

Learning media is an important and central component in the learning process. With the currently available media, cadets still have difficulty understanding how the piston engine works, so they are not able to apply these concepts appropriately. This study aims to examine the development of interactive media for piston engine courses in order to improve student learning outcomes. The research method used is a literature study of several articles, journals, and proceedings on interactive media development published from 2010 to 2020. The results showed that the development of interactive media is needed to support the learning process and influences the cognitive abilities of students. With such interactive media, learning outcomes can be improved and the learning process can be made effective.

Keywords: interactive media, learning outcomes, learning process, literature study

Procedia PDF Downloads 137
36054 Strategies for Synchronizing Chocolate Conching Data Using Dynamic Time Warping

Authors: Fernanda A. P. Peres, Thiago N. Peres, Flavio S. Fogliatto, Michel J. Anzanello

Abstract:

Batch processes are widely used in the food industry and have an important role in the production of high added value products, such as chocolate. Process performance is usually described by variables that are monitored as the batch progresses. Data arising from these processes are likely to display a strong correlation-autocorrelation structure, and are usually monitored using control charts based on multiway principal components analysis (MPCA). Process control of a new batch is carried out by comparing the trajectories of its relevant process variables with those in a reference set of batches that yielded products within specifications; it is clear that proper determination of the reference set is key to correctly signaling non-conforming batches in such quality control schemes. In chocolate manufacturing, misclassification of non-conforming batches in the conching phase may lead to significant financial losses. In such a context, the accuracy of process control grows in relevance. In addition to that, the main assumption in MPCA-based monitoring strategies is that all batches are synchronized in duration, both the new batch being monitored and those in the reference set. Such an assumption is often not satisfied in the chocolate manufacturing process. As a consequence, traditional techniques such as MPCA-based charts are not suitable for process control and monitoring. To address that issue, the objective of this work is to compare the performance of three dynamic time warping (DTW) methods in the alignment and synchronization of chocolate conching process variables’ trajectories, aimed at properly determining the reference distribution for multivariate statistical process control. The power of classification of batches into two categories (conforming and non-conforming) was evaluated using the k-nearest neighbor (KNN) algorithm. Real data from a milk chocolate conching process were collected and the following variables were monitored over time: frequency of soybean lecithin dosage, rotation speed of the shovels, current of the main motor of the conche, and chocolate temperature. A set of 62 batches with durations between 495 and 1,170 minutes was considered; 53% of the batches were known to be conforming based on lab test results and experts’ evaluations. Results showed that all three DTW methods tested were able to align and synchronize the conching dataset. However, the synchronized datasets obtained from these methods performed differently when input into the KNN classification algorithm. Kassidas, MacGregor and Taylor’s method (named KMT) was deemed the best DTW method for aligning and synchronizing a milk chocolate conching dataset, presenting 93.7% accuracy, 97.2% sensitivity and 90.3% specificity in batch classification, and was considered the best option to determine the reference set for the milk chocolate dataset. This method was recommended due to the lowest number of iterations required to achieve convergence and the highest average accuracy in the testing portion using the KNN classification technique.
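
For readers unfamiliar with the alignment step, here is a minimal dynamic-programming DTW sketch that warps one batch trajectory onto a reference of different duration. The temperature profiles are hypothetical, and this plain DTW is only an illustration, not the KMT method or the other specific variants compared in the paper.

```python
import numpy as np

def dtw_path(reference, query):
    """Classic dynamic time warping between two 1-D trajectories.
    Returns the accumulated cost and the warping path used to
    synchronize the query batch onto the reference batch."""
    n, m = len(reference), len(query)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(reference[i - 1] - query[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack the optimal alignment path
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return D[n, m], path[::-1]

# Hypothetical conche temperature profiles of two batches of unequal duration
reference = np.array([30, 32, 35, 40, 45, 50, 52, 52])
query = np.array([30, 31, 34, 39, 44, 49, 52])
cost, path = dtw_path(reference, query)
print(cost, path[:5])
```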

Keywords: batch process monitoring, chocolate conching, dynamic time warping, reference set distribution, variable duration

Procedia PDF Downloads 157
36053 A Comparative Study of the Modeling and Quality Control of the Propylene-Propane Classical Distillation and Distillation Column with Heat Pump

Authors: C. Patrascioiu, Cao Minh Ahn

Abstract:

The paper presents the evolution of research on the propylene-propane distillation process, especially for distillation columns equipped with a heat pump. The paper is structured in three parts: separation of the propylene-propane mixture, steady-state process modeling, and quality control systems. The first part is dedicated to the state of the art of the two distillation processes. The second part continues the authors’ research on steady-state process modeling. A software simulation instrument has been developed that may be used for dynamic simulation of the process and for designing the quality control systems. The last part presents the research on control systems, especially quality control systems.

Keywords: absorption, distillation, heat pump, Unisim design

Procedia PDF Downloads 325
36052 A Construct to Perform in Situ Deformation Measurement of Material Extrusion-Fabricated Structures

Authors: Daniel Nelson, Valeria La Saponara

Abstract:

Material extrusion is an additive manufacturing modality that continues to show great promise in the ability to create low-cost, highly intricate, and exceedingly useful structural elements. As more capable and versatile filament materials are devised, and the resolution of manufacturing systems continues to increase, the need to understand and predict manufacturing-induced warping will gain ever greater importance. The following study presents an in situ remote sensing and data analysis construct that allows for the in situ mapping and quantification of surface displacements induced by residual stresses on a specified test structure. This proof-of-concept experimental process shows that it is possible to provide designers and manufacturers with insight into the manufacturing parameters that lead to the manifestation of these deformations and a greater understanding of the behavior of these warping events over the course of the manufacturing process.

Keywords: additive manufacturing, deformation, digital image correlation, fused filament fabrication, residual stress, warping

Procedia PDF Downloads 71
36051 Cultural Cognition and Voting: Understanding Values and Perceived Risks in the Colombian Population

Authors: Andrea N. Alarcon, Julian D. Castro, Gloria C. Rojas, Paola A. Vaca, Santiago Ortiz, Gustavo Martinez, Pablo D. Lemoine

Abstract:

Recently, electoral results across many countries have shown to be inconsistent with rational decision theory, which states that individuals make decisions based on maximizing benefits and reducing risks. An alternative explanation has emerged: fear- and rage-driven voting has proved to be highly effective for political persuasion and mobilization. This phenomenon has been evident in the 2016 elections in the United States, the 2006 elections in Mexico, the 1998 elections in Venezuela, and the 2004 elections in Bolivia. In Colombia, it has occurred recently in the 2016 plebiscite for peace and the 2018 presidential elections. The aim of this study is to explain this phenomenon using cultural cognition theory, which refers to the psychological predisposition individuals have to believe that their own and their peers’ behavior is correct and, therefore, beneficial to the entire society. Cultural cognition refers to the tendency of individuals to fit perceived risks and factual beliefs into group-shared values; the Cultural Cognition Worldview Scales (CCWS) measure cultural perceptions through two different dimensions: hierarchy-egalitarianism and individualism-communitarianism. The former refers to attitudes towards social dominance based on conspicuous and static characteristics (sex, ethnicity or social class), while the latter refers to attitudes towards a social ordering in which individuals are expected to guarantee their own wellbeing without society’s or the government’s intervention. A probabilistic national sample was obtained from different polls from the consulting and public opinion company Centro Nacional de Consultoría. Sociodemographic data were obtained along with CCWS scores; a subjective measure of left-right ideological placement and vote intention for the 2019 mayoral elections were also included in the questionnaires. Finally, the question “In your opinion, what is the greatest risk Colombia is facing right now?” was included to identify perceived risk in the population. Preliminary results show that Colombians are highly distributed among hierarchical communitarians and egalitarian individualists (30.9% and 31.7%, respectively), and to a lesser extent among hierarchical individualists and egalitarian communitarians (19% and 18.4%, respectively). Males tended to be more hierarchical (p < .000) and communitarian (p = .009) than females. ANOVAs revealed statistically significant differences between groups (quadrants) for the level of schooling, left-right ideological orientation, and stratum (p < .000 for all), and proportion differences revealed statistically significant differences for age groups (p < .001). Differences and distributions for vote intention and perceived risks are still being processed, and the results are yet to be analyzed. The results show that Colombians are differentially distributed among quadrants with regard to sociodemographic data and left-right ideological orientation. These preliminary results indicate that this study may shed some light on why Colombians vote the way they do, and future qualitative data will show the fears emerging from the identified values in the CCWS and the relation these have with vote intention.

Keywords: communitarianism, cultural cognition, egalitarianism, hierarchy, individualism, perceived risks

Procedia PDF Downloads 134
36050 Influence of Environmental Conditions on a Solar Assisted Mashing Process

Authors: Ana Fonseca, Stefany Villacis

Abstract:

In this paper, the influence of several scenarios on a model of a solar-assisted mashing process in a brewery was analyzed, applying the model to different locations and therefore changing the environmental conditions. Beer producer locations in different countries around the globe, with contrasting climatic zones, such as Guayaquil (Ecuador), Bangkok (Thailand), Mumbai (India), Veracruz (Mexico) and Brisbane (Australia), were evaluated and compared with a base case study in Oldenburg (Germany), and conclusions were drawn. The evaluation was restricted to the results obtained using TRNSYS 16 as the simulation tool. In the base case, an annual solar fraction (SF) of 0.50 was found; the results showed high sensitivity to modifications of the pump control of the primary circuit and to increases in the collector area. A sensitivity analysis of the system for the selected locations was performed, with Guayaquil showing the highest annual SF, 2.5 times the value of the base case. In contrast, Brisbane presented the lowest ratio, half of the expected value, due to its lower irradiance. In conclusion, cities in Sunbelt countries have the technical potential to apply solar heat to their low-temperature industrial processes, in this case implementing a green brewery in Guayaquil.

Keywords: evacuated tubular solar collector, irradiance, mashing process, solar fraction, solar thermal

Procedia PDF Downloads 127
36049 A Geo DataBase to Investigate the Maximum Distance Error in Quality of Life Studies

Authors: Paolino Di Felice

Abstract:

The background and significance of this study come from papers that have already appeared in the literature, which measured the impact of public services (e.g., hospitals, schools, ...) on citizens’ needs satisfaction (one of the dimensions of QOL studies) by calculating the distance between the place where the citizens live and the location of the services on the territory. Those studies assume that a citizen's dwelling coincides with the centroid of the polygon that expresses the boundary of the administrative district, within the city, they belong to. Such an assumption “introduces a maximum measurement error equal to the greatest distance between the centroid and the border of the administrative district.” The case study reported in this abstract investigates the implications of adopting such an approach at geographical scales greater than the urban one, namely at the three levels of nesting of the Italian administrative units: the (20) regions, the (110) provinces, and the 8,094 municipalities. To carry out this study, it must be decided: a) how to store the huge amount of (spatial and descriptive) input data and b) how to process them. The latter aspect involves: b.1) the design of algorithms to investigate the geometry of the boundary of the Italian administrative units; b.2) their coding in a programming language; b.3) their execution and, eventually, b.4) archiving the results in permanent storage. The IT solution we implemented is centered around a (PostgreSQL/PostGIS) Geo DataBase structured in terms of three tables that fit the hierarchy of nesting of the Italian administrative units well: municipality(id, name, provinceId, istatCode, regionId, geometry); province(id, name, regionId, geometry); region(id, name, geometry). The adoption of DBMS technology allows us to implement steps "a)" and "b)" easily. In particular, step "b)" is simplified dramatically by calling spatial operators and spatial built-in User Defined Functions within SQL queries against the Geo DB. The major findings from our experiments can be summarized as follows. The error that, on average, results from assimilating the residence of the citizens to the centroid of the administrative unit of reference is a few kilometers (4.9) at the municipal level, while it becomes conspicuous at the other two levels (28.9 and 36.1, respectively). Therefore, studies such as those mentioned above can be extended up to the municipal level without affecting the correctness of the interpretation of the results, but not further. The IT framework implemented to carry out the experiments can be replicated for studies referring to the territory of other countries all over the world.
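
The per-unit quantity behind these figures, the greatest distance between a unit's centroid and its border, can be computed as in the minimal sketch below (inside the Geo DB, PostGIS's ST_MaxDistance between the unit's centroid and its geometry would give the same figure). The boundary coordinates here are hypothetical and assumed to be in a metric projection; this is not the authors' code.

```python
from shapely.geometry import Point, Polygon

def max_centroid_error(boundary_coords):
    """Greatest distance between an administrative unit's centroid and its
    border: the maximum measurement error introduced by assimilating every
    residence in the unit to the centroid. For a polygon, this maximum is
    always attained at a boundary vertex, so checking vertices is exact."""
    polygon = Polygon(boundary_coords)
    centroid = polygon.centroid
    return max(centroid.distance(Point(xy)) for xy in polygon.exterior.coords)

# Hypothetical municipality boundary (metres, metric projection)
boundary = [(0, 0), (6000, 0), (6000, 4000), (2000, 5000), (0, 4000)]
print(round(max_centroid_error(boundary) / 1000, 2), "km")
```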

Keywords: quality of life, distance measurement error, Italian administrative units, spatial database

Procedia PDF Downloads 363
36048 Examination of the Satisfaction Levels of Pre-Service Teachers Concerning E-Learning Process in Terms of Different Variables

Authors: Agah Tugrul Korucu

Abstract:

Significant changes for the better have taken place in the body of information and in the use of technology available in the field of education, induced by the technological changes of the 21st century. It is mainly the job of teachers and pre-service teachers to integrate information and communication technologies into education by conveying the use of technology to individuals. While pre-service teachers are conducting lessons using technology, the methods they have developed are important factors for the requirements of the lesson and for the satisfaction levels of the students. The aim of this study is to examine the satisfaction levels of pre-service teachers regarding e-learning in a technological environment, in which lesson activities are conducted through an online learning environment, in terms of various variables. The study group of the research is composed of 156 pre-service teachers who were students in the departments of Computer and Teaching Technologies, Art Teaching and Pre-school Teaching in the academic year of 2014-2015. The quantitative research method was adopted for this study; the survey (scanning) model was employed in collecting the data. “The Satisfaction Scale regarding the E-learning Process”, developed by Gülbahar, and the personal information form, which was developed by the researcher, were used as means of collecting the data. The Cronbach α reliability coefficient, which is the internal consistency coefficient of the scale, is 0.91. The SPSS statistical package program and the techniques of mean, standard deviation, percentage, correlation, t-test and variance analysis were used in the analysis of the data.

Keywords: online learning environment, integration of information technologies, e-learning, e-learning satisfaction, pre-service teachers

Procedia PDF Downloads 342
36047 Learning to Teach in Large Classrooms: Training Faculty Members from Milano Bicocca University, from Didactic Transposition to Communication Skills

Authors: E. Nigris, F. Passalacqua

Abstract:

Relating to recent research in the field of faculty development, this paper aims to present a pilot training programme realized at the University of Milano-Bicocca to improve the teaching skills of faculty members. A total of 57 professors (both full professors and associate professors) were trained during the pilot programme in three editions of the workshop, focused on promoting skills for teaching large classes. The study takes into account: 1) the theoretical framework of the programme, which combines the recent tradition of professional development with research on the in-service training of school teachers; 2) the structure and the content of the training programme, organized as a 12-hour full-immersion workshop and individual consultations; 3) the educational specificity of the training programme, which is based on the relation between 'general didactics' (active learning methodologies; didactic communication) and 'disciplinary didactics' (didactic transposition and reconstruction); 4) results about the impact of the training programme, related to both the workshop and the individual consultations. This study aims to provide insights mainly on two levels of the training programme’s impact ('behaviour change' and 'transfer'), and for this reason learning outcomes are evaluated by different instruments: a questionnaire filled out by all 57 participants; 12 in-depth interviews; 3 focus groups; and conversation transcriptions of workshop activities. Data analysis is based on a descriptive qualitative approach and is conducted through thematic analysis of the transcripts, using analytical categories derived principally from didactic transposition theory. The results show that the training programme effectively developed three major skills regarding different stages of the 'didactic transposition' process: a) content selection, i.e., a more accurate selection and reduction of the 'scholarly knowledge', conforming to the first stage of the didactic transposition process; b) the consideration of students’ prior knowledge and misconceptions within the lesson design, in order to connect the 'scholarly knowledge' effectively to the 'knowledge to be taught' (second stage of the didactic transposition process); c) the way of asking questions and managing discussion in large classrooms, in line with the transformation of the 'knowledge to be taught' into 'taught knowledge' (third stage of the didactic transposition process).

Keywords: didactic communication, didactic transposition, instructional development, teaching large classroom

Procedia PDF Downloads 128
36046 The Effect of Material Properties and Volumetric Changes in Phase Transformation on the Final Residual Stress of the Welding Process

Authors: Djarot B. Darmadi

Abstract:

The increasingly wide application of the Finite Element Method (FEM) is driven by its benefits of cost saving and environmental friendliness. Also, by using FEM, a deep understanding of certain phenomena can be achieved. This paper observes the role of material property changes and volumetric change when Solid State Phase Transformation (SSPT) takes place in residual stress formation due to the welding of ferritic steels, through coupled Thermo-Metallurgy-Mechanical (TMM) analysis. The correctness of the FEM residual stress prediction was validated by experiment. From a parametric study of the FEM model, it can be concluded that the material property change tends to over-predict residual stress in the weld center, whilst the volumetric change tends to underestimate it. The best final result is a compromise of both, incorporating them in the model, which gives a better result compared to a model without SSPT.

Keywords: residual stress, ferritic steels, SSPT, coupled-TMM

Procedia PDF Downloads 262