Search results for: software process
18286 Design and Performance Evaluation of Plasma Spouted Bed Reactor for Converting Waste Plastic into Green Hydrogen
Authors: Palash Kumar Mollick, Leire Olazar, Laura Santamaria, Pablo Comendador, Gartzen Lopez, Martin Olazar
Abstract:
The average calorific value of a mixture of waste plastics is approximately 38 MJ/kg. The present work aims to extract the maximum possible energy from a mixture of waste plastics using a DC thermal plasma in a spouted bed reactor. The combined plasma pyrolysis and steam reforming process has shown the potential to generate hydrogen from plastic while producing carcinogenic dioxins and furans at levels far below the legal limits. A spouted bed pyrolysis reactor can continuously process plastic beads to produce organic volatiles, which subsequently react with steam in the presence of a catalyst to yield syngas. Plasma, being the fourth state of matter, can carry high-impact electrons that help overcome the activation energy of chemical reactions. Computational Fluid Dynamics (CFD) simulations using the COMSOL Multiphysics software have been performed to evaluate the performance of a plasma spouted bed reactor in producing contamination-free hydrogen as green energy from waste plastic beads. The simulation results will showcase a design of a plasma spouted bed reactor for converting plastic waste into green hydrogen in a single-step process. The high-temperature hydrodynamics of the spouted bed with plastic beads and the corresponding temperature distribution inside the reaction chamber will be critically examined in view of the near-future installation of a demonstration plant.
Keywords: green hydrogen, plastic waste, synthetic gas, pyrolysis, steam reforming, spouted bed, reactor design, plasma, DC plasma, CFD simulation
Procedia PDF Downloads 115
18285 Simulation-Based Optimization Approach for an Electro-Plating Production Process Based on Theory of Constraints and Data Envelopment Analysis
Authors: Mayada Attia Ibrahim
Abstract:
Evaluating and improving the electroplating production process is a key challenge for this type of operation. The process is influenced by several factors, such as process parameters, process costs, and the production environment. Analyzing and optimizing all these factors together requires extensive analytical techniques that are not available in real industrial settings. This paper presents a practice-based framework for evaluating and optimizing some of the crucial factors that affect the costs and production times associated with this type of process: energy costs, material costs, and product flow times. Design of Experiments, Discrete-Event Simulation, and the Theory of Constraints were used, respectively, to identify the most significant factors affecting the production process and to simulate a real production line in order to quantify the effect of these factors and locate possible bottlenecks. Several scenarios are generated as corrective strategies for improving the production line. Following that, the CCR input-oriented data envelopment analysis (DEA) model is used to evaluate and rank the suggested scenarios.
Keywords: electroplating process, simulation, design of experiment, performance optimization, theory of constraints, data envelopment analysis
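The scenario-ranking step relies on the CCR input-oriented DEA model. As an illustration of the idea (not the authors' implementation), the sketch below computes CCR efficiency scores for the special case of one input and one output per scenario, where the score reduces to each scenario's output/input ratio normalised by the best observed ratio; the cost and throughput figures are invented for the example.

```python
def ccr_efficiency(inputs, outputs):
    """CCR input-oriented efficiency scores for the special case of a
    single input and a single output per DMU (here, per scenario).

    With one input x_j and one output y_j, the CCR score reduces to the
    output/input ratio of each DMU divided by the best ratio observed;
    the general multi-input/multi-output case requires solving one
    linear program per DMU."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical scenarios: input = energy cost per batch, output = throughput.
inputs = [120.0, 100.0, 150.0, 90.0]
outputs = [60.0, 55.0, 66.0, 54.0]
scores = ccr_efficiency(inputs, outputs)
# The scenario(s) with the best output/input ratio receive a score of 1.0.
```

Scenarios scoring 1.0 lie on the efficient frontier; the rest are ranked by how far they fall below it.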
Procedia PDF Downloads 100
18284 Treatment of Cutting Oily-Wastewater by Sono-Fenton Process: Experimental Approach and Combined Process
Authors: Pisut Painmanakul, Thawatchai Chintateerachai, Supanid Lertlapwasin, Nusara Rojvilavan, Tanun Chalermsinsuwan, Nattawin Chawaloesphonsiya, Onanong Larpparisudthi
Abstract:
Conventional coagulation, advanced oxidation processes (AOPs), and a combined process were evaluated and compared for their suitability to treat stabilized cutting-oil wastewater. An efficiency of 90% was obtained from coagulation at an Al2(SO4)3 dosage of 150 mg/L and pH 7. On the other hand, the efficiencies of the AOPs for a 30-minute oxidation time were 10% for acoustic oxidation, 12% for acoustic oxidation with hydrogen peroxide, 76% for the Fenton process, and 92% for the sono-Fenton process. Achieving the highest oil-removal efficiency with AOPs required a large amount of chemicals. Therefore, the AOPs were studied as a post-treatment step after the conventional separation process. The efficiency of this combination was considerable, as the effluent COD met the standard required for industrial wastewater discharge with lower chemical and energy consumption.
Keywords: cutting oily-wastewater, advanced oxidation process, sono-Fenton, combined process
Procedia PDF Downloads 356
18283 Parametric Studies of Ethylene Dichloride Purification Process
Authors: Sh. Arzani, H. Kazemi Esfeh, Y. Galeh Zadeh, V. Akbari
Abstract:
Ethylene dichloride (EDC) is a colorless liquid with a chloroform-like smell. EDC belongs to the simple chlorinated hydrocarbons and is obtained by chlorinating ethylene gas. Its chemical formula is C2H4Cl2, and it is used as the main intermediate in vinyl chloride monomer (VCM) production. The purification of EDC is therefore an important step in the petrochemical process. In this study, the EDC purification unit was simulated and then validated. Finally, the impact of process parameters on the degree of EDC purity was studied. The results showed that increasing the feed flow increases the impurities in the reflux, resulting in a decrease in EDC purity.
Keywords: ethylene dichloride, purification, EDC, simulation
Procedia PDF Downloads 316
18282 Optimised Path Recommendation for a Real Time Process
Authors: Likewin Thomas, M. V. Manoj Kumar, B. Annappa
Abstract:
A traditional execution process follows the path of execution drawn by the process analyst without observing the behaviour of resources and other real-time constraints. Identifying the process model, predicting resource behaviour, and recommending the optimal path of execution for a real-time process are challenging. The proposed αyMiner adds a new dimension to process execution with two novel components, the Process Model Analyser (PMAMiner) and the Resource Behaviour Analyser (RBAMiner), for recommending the probable path of execution. PMAMiner discovers the next probable activity for the currently executing activity in an online process: a variant matching technique identifies the set of candidate next activities, among which the next probable activity is selected using a decision tree model. RBAMiner identifies the resource best suited to perform the discovered next activity and observes its behaviour: load and performance are modelled using polynomial regression, and waiting time using queueing theory. Based on the observed behaviour, αyMiner recommends the probable path of execution, comprising the next probable activity and the most suitable resource for performing it. Experiments were conducted on process logs of the CoSeLoG project; an accuracy of 72% was obtained in identifying and recommending the next probable activity, and resource performance was improved by 59% by decreasing resource load.
Keywords: cross-organization process mining, process behaviour, path of execution, polynomial regression model
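The two behaviour models named above, polynomial regression for load/performance and queueing theory for waiting time, can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the log data, the arrival and service rates, and the degree-2 fit are assumptions, and the queueing example uses the standard M/M/1 formula.

```python
import numpy as np

def mm1_waiting_time(arrival_rate, service_rate):
    """Expected time in queue (W_q) for an M/M/1 queue,
    W_q = lambda / (mu * (mu - lambda)); requires lambda < mu."""
    lam, mu = arrival_rate, service_rate
    if lam >= mu:
        raise ValueError("queue is unstable: arrival rate >= service rate")
    return lam / (mu * (mu - lam))

# Hypothetical log data: resource load (concurrent tasks) vs. observed
# processing time; a degree-2 polynomial captures the slowdown curve.
load = np.array([1, 2, 3, 4, 5, 6], dtype=float)
proc_time = np.array([2.1, 2.4, 3.0, 3.9, 5.2, 6.8])
coeffs = np.polyfit(load, proc_time, deg=2)  # performance model
predict = np.poly1d(coeffs)

# Hypothetical waiting-time estimate: 4 cases/hour arriving at a
# resource that completes 5 cases/hour.
wq = mm1_waiting_time(arrival_rate=4.0, service_rate=5.0)
```

With these numbers the fitted curve predicts the processing time at any load, and `wq` gives the expected queueing delay used when scoring candidate resources.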
Procedia PDF Downloads 335
18281 Importance of Developing a Decision Support System for Diagnosis of Glaucoma
Authors: Murat Durucu
Abstract:
Glaucoma causes irreversible blindness; early diagnosis and appropriate interventions allow patients to retain their vision longer. This study addresses the importance of developing a decision support system for glaucoma diagnosis. Glaucoma occurs when elevated pressure inside the eye damages the optic nerve and causes deterioration of vision. The disease progresses through different levels of severity, up to blindness. Diagnosis at an early stage allows a chance for therapies that slow the progression of the disease. In recent years, imaging technologies such as Heidelberg Retinal Tomography (HRT), Stereoscopic Disc Photography (SDP), and Optical Coherence Tomography (OCT) have been used for the diagnosis of glaucoma. Owing to its better accuracy and faster imaging, OCT has become the method most commonly used by experts. Despite the precision and speed of OCT and HRT images, difficulties and mistakes still occur in the diagnosis of glaucoma, especially in the early stages, and it is difficult for doctors to obtain objective results during diagnosis and staging. It therefore seems very important to develop an objective decision support system that diagnoses and grades glaucoma for patients. By using OCT images and pattern recognition systems, it is possible to develop a support system that helps doctors make their decisions on glaucoma. Thus, in this study, we develop an evaluation and support system for doctors. Pattern-recognition-based computer software would help doctors make an objective evaluation of their patients. After the development and evaluation phases, the system is planned to serve doctors in different hospitals.
Keywords: decision support system, glaucoma, image processing, pattern recognition
Procedia PDF Downloads 302
18280 Numerical Study for the Estimation of Hydrodynamic Current Drag Coefficients for the Colombian Navy Frigates Using Computational Fluid Dynamics
Authors: Mauricio Gracia, Luis Leal, Bharat Verma
Abstract:
Computational fluid dynamics (CFD) has nowadays become an important tool in the hydrodynamic design of modern ships. CFD is used to model any phenomenon related to fluid flow in a control volume, such as a ship or an offshore structure at sea. In the present study, the current force drag coefficients for a Colombian Navy frigate in deep and shallow water are estimated through the application of CFD. The study describes the process of simulating the ship's current drag coefficients using the STAR-CCM+ software package. A scale model of the Almirante Padilla class frigate is investigated. The results show the ship's current drag coefficient calculated for a current speed of 1 knot at a 90° drift angle for the full-scale ship. The predicted results were compared against the current drag coefficients published in the Lloyd's Register OCIMF report. The simulation results agree fairly well with the published values, showing that the STAR-CCM+ code can predict current drag coefficients.
Keywords: CFD, current drag coefficient, STAR-CCM+, OCIMF, bollard pull
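The current drag coefficient reported in such studies is the standard non-dimensionalisation of the computed current force. The sketch below shows the defining formula; the force, density, and hull dimensions are invented for illustration, and the use of L_pp × T as the lateral reference area is an assumption following common OCIMF-style practice, not a value taken from this paper.

```python
def current_drag_coefficient(force_n, rho, speed_ms, ref_area_m2):
    """Non-dimensional current drag coefficient
    Cc = F / (0.5 * rho * U^2 * A),
    where F is the measured (or CFD-predicted) current force, rho the
    water density, U the current speed, and A a reference area."""
    return force_n / (0.5 * rho * speed_ms ** 2 * ref_area_m2)

KNOT = 0.5144  # m/s per knot

# Hypothetical CFD result: 9.5 kN lateral force at 1 knot, 90 deg drift,
# on a hull with assumed L_pp = 103 m and draught T = 5.4 m.
cc = current_drag_coefficient(force_n=9500.0, rho=1025.0,
                              speed_ms=1.0 * KNOT,
                              ref_area_m2=103.0 * 5.4)
```

The resulting coefficient is what gets compared against the tabulated values in the published report.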
Procedia PDF Downloads 176
18279 Acoustic Induced Vibration Response Analysis of Honeycomb Panel
Authors: Po-Yuan Tung, Jen-Chueh Kuo, Chia-Ray Chen, Chien-Hsing Li, Kuo-Liang Pan
Abstract:
The main body structure of a satellite is mainly constructed from lightweight materials, yet it must withstand significant vibration loads during launch. Given the many sources of variability in space missions, studying the random vibration response of the satellite structure is extremely important. Based on the reciprocity relationship between the acoustic excitation and the structural response, this paper evaluates the dynamic response of a satellite main body under random acoustic load excitation. The technical process is studied and the feasibility of acoustic-induced vibration analysis is verified. A simple plate exposed to a uniform acoustic field is used to extract the important parameters and to validate the acoustic field model of the reverberation chamber. Both the structural and acoustic chamber models are then imported into vibro-acoustic coupling analysis software to predict the structural response. During the modelling process, experimental verification is performed to ensure the quality of the numerical models. Finally, the surface vibration level is calculated through the modal participation factors, and the analysis results are presented as PSD spectra.
Keywords: vibration, acoustic, modal, honeycomb panel
Procedia PDF Downloads 556
18278 Implementation of an Open Source ERP for SMEs in the Automotive Sector in Peru: A Case Study
Authors: Gerson E. Cornejo, Luis A. Gamarra, David S. Mauricio
Abstract:
Enterprise Resource Planning (ERP) systems allow the integration of all the business processes of a company's functional areas in order to automate and standardize processes, obtain accurate information, and improve decision-making in real time. In Peru, 79% of small and medium-sized enterprises (SMEs) do not use any management software, because ERPs are believed to be expensive, complex, and difficult to implement. However, open source ERPs have existed for more than 20 years; they are more accessible and offer the same benefits as proprietary ERPs, but there is little information on their implementation process. This work presents a case study showing the implementation process of an open source ERP, Odoo, based on the ASAP (Accelerated SAP) methodology and applied to a company providing corrective and preventive vehicle maintenance services. The ERP allowed the SME to standardize its business processes and increase its productivity, shortening certain processes by up to 40%. This case study shows that it is feasible and profitable to implement an open source ERP in SMEs in the automotive sector in Peru. In addition, it shows that the ASAP methodology is adequate for carrying out open source ERP implementation projects.
Keywords: ASAP, automotive sector, ERP implementation, open source
Procedia PDF Downloads 337
18277 Development of a Tesla Music Coil from Signal Processing
Authors: Samaniego Campoverde José Enrique, Rosero Muñoz Jorge Enrique, Luzcando Narea Lorena Elizabeth
Abstract:
This paper presents a practical and theoretical model for operating a Tesla coil using digital signal processing. The research is based on the analysis of ten scientific papers exploring the development and operation of the Tesla coil. Several modifications were carried out on the coil with the aim of amplifying the digital signal by means of digital signal processing. To achieve this, a transistor amplifier and digital filters provided by the MATLAB software were used, chosen according to the characteristics of the signals in question.
Keywords: Tesla coil, digital signal processing, equalizer, graphical environment
Procedia PDF Downloads 118
18276 Numerical Modelling of Immiscible Fluids Flow in Oil Reservoir Rocks during Enhanced Oil Recovery Processes
Authors: Zahreddine Hafsi, Manoranjan Mishra , Sami Elaoud
Abstract:
Ensuring the maximum recovery rate of oil from reservoir rocks is a challenging task that requires preliminary numerical analysis of the different techniques used to enhance the recovery process. After conventional oil recovery, and in order to retrieve the oil left behind after the primary recovery phase, water flooding is one of several techniques used for enhanced oil recovery (EOR). In this work, EOR via water flooding is numerically modelled, and the hydrodynamic instabilities resulting from immiscible oil-water flow in reservoir rocks are investigated. An oil reservoir is a porous medium consisting of many fractures of tiny dimensions. For modelling purposes, the reservoir is treated as a collection of capillary tubes, which provides useful insight into how fluids behave in the reservoir pore spaces. The equations governing oil-water flow in reservoir rocks are developed and numerically solved using a finite element scheme. Numerical results are obtained with the COMSOL Multiphysics software, whose two-phase Darcy module allows modelling of the imbibition process through the injection of water (as the wetting phase) into an oil reservoir. The van Genuchten, Brooks-Corey, and Leverett retention models were considered; the resulting flow configurations are compared and the governing parameters discussed. For the considered retention models, it was found that the onset of instabilities, viz. the fingering phenomenon, is highly dependent on the capillary pressure as well as on the boundary conditions, i.e., the inlet pressure and the injection velocity.
Keywords: capillary pressure, EOR process, immiscible flow, numerical modelling
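Of the retention models compared above, the van Genuchten model relates effective wetting-phase saturation to capillary pressure head as Se = [1 + (αh)^n]^(−m) with m = 1 − 1/n. A minimal sketch, with parameter values invented for illustration rather than taken from the study:

```python
def van_genuchten_saturation(h, alpha, n):
    """Effective wetting-phase saturation Se as a function of the
    capillary pressure head h (van Genuchten retention model):
    Se = [1 + (alpha*h)^n]^(-m), with m = 1 - 1/n."""
    m = 1.0 - 1.0 / n
    return (1.0 + (alpha * h) ** n) ** (-m)

# Hypothetical parameters for a reservoir rock sample.
alpha, n = 0.5, 2.0   # alpha in 1/m, n dimensionless

se_low = van_genuchten_saturation(h=0.1, alpha=alpha, n=n)   # near 1: nearly saturated
se_high = van_genuchten_saturation(h=20.0, alpha=alpha, n=n)  # near 0: mostly drained
```

The steepness of this curve (set by α and n) is what controls how sharply capillary pressure responds to saturation changes, and hence how prone the front is to fingering.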
Procedia PDF Downloads 132
18275 Controlling the Process of a Chicken Dressing Plant through Statistical Process Control
Authors: Jasper Kevin C. Dionisio, Denise Mae M. Unsay
Abstract:
In a manufacturing firm, controlling the process ensures that optimum efficiency, productivity, and quality are achieved. An operation with no standardized procedure yields poor productivity, inefficiency, and an out-of-control process. This study focuses on controlling the small-intestine processing line of a chicken dressing plant through the use of Statistical Process Control (SPC). Since the operation employs no standard procedure and has no established standard time, the process, assessed via the observed times of the overall small-intestine processing operation on an X-Bar R control chart, was found to be out of control. To solve this problem, the researchers conducted a motion and time study aiming to establish a standard procedure for the operation. The normal operator was selected using the Westinghouse rating system. Instead of relying on the traditional motion and time study alone, the researchers used the X-Bar R control chart to determine the process average used for establishing the standard time. The observed times of the normal operator were recorded and plotted on the chart. Out-of-control points due to assignable causes were removed, and the process average, i.e., the average time in which the normal operator completed the process, now in control and free from outliers, was obtained. This process average was then used to determine the standard time for small-intestine processing. The researchers recommend implementing the established standard time together with the standard procedure adopted from the normal operator, which is expected to increase productivity by 45.54%.
Keywords: motion and time study, process controlling, statistical process control, X-Bar R control chart
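The X-Bar R limits used to screen out-of-control points are computed from tabulated SPC constants (A2, D3, D4 for the chosen subgroup size). The sketch below uses subgroup size 5 and invented cycle times; it illustrates the chart mechanics, not the plant's actual data.

```python
# Control chart constants for subgroup size n = 5 (standard SPC tables).
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """Centre lines and control limits for X-Bar and R charts, computed
    from rational subgroups of observed cycle times (size 5 each)."""
    means = [sum(g) / len(g) for g in subgroups]
    ranges = [max(g) - min(g) for g in subgroups]
    xbarbar = sum(means) / len(means)   # grand average
    rbar = sum(ranges) / len(ranges)    # average range
    return {
        "xbar_cl": xbarbar,
        "xbar_ucl": xbarbar + A2 * rbar,
        "xbar_lcl": xbarbar - A2 * rbar,
        "r_ucl": D4 * rbar,
        "r_lcl": D3 * rbar,
    }

# Hypothetical observed times (seconds) for the processing task.
subgroups = [
    [32, 35, 33, 34, 31],
    [36, 33, 34, 35, 32],
    [33, 34, 32, 33, 35],
]
limits = xbar_r_limits(subgroups)
in_control = all(limits["xbar_lcl"] <= m <= limits["xbar_ucl"]
                 for m in (sum(g) / len(g) for g in subgroups))
```

Subgroup means falling outside the limits are the assignable-cause points that get removed before the process average is taken as the basis for the standard time.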
Procedia PDF Downloads 217
18274 Analysis of Iran-Turkey Relations Based on Environmental Geopolitics
Authors: Farid Abbasi
Abstract:
Geographical spaces maintain different relations with each other, and neighboring spaces in particular interact more than distant ones due to their proximity. Various parameters affect the relationships between these spaces, among them environmental parameters, which have become important in recent decades and now influence the political relations of actors in neighboring spaces. The political relations of the Islamic Republic of Iran and the Republic of Turkey, as two actors in the region, appear to have been affected to some extent by environmental issues. Accordingly, the present study examines and analyzes the political relations between the two countries from an environmental-geopolitical perspective. The research method is descriptive-analytical. Data analysis is based on library and field information (a questionnaire) in the form of content analysis and statistics, processed with the MICMAC software and Scenario Wizard. The results show that 35 indicators, directly and indirectly, affect Iran-Turkey relations from an environmental-geopolitical perspective, grouped into five dimensions (water resources, soil resources, vegetation, climate, and living species). Using the MICMAC method, 9 factors were extracted as key factors affecting Iran-Turkey relations, and in the scenario analysis, 10100 possible situations were generated by the Scenario Wizard software. Nine strong scenarios are presented: 3 with favorable or very favorable situations, 3 with moderate situations, and 3 with critical or catastrophic situations for Iran-Turkey relations from the environmental perspective.
Keywords: geopolitics, relations, Iran, Turkey, environment
Procedia PDF Downloads 150
18273 Determining the Width and Depths of Cut in Milling on the Basis of a Multi-Dexel Model
Authors: Jens Friedrich, Matthias A. Gebele, Armin Lechler, Alexander Verl
Abstract:
Chatter vibrations and process instabilities are the most important factors limiting the productivity of the milling process. Chatter can lead to damage of the tool, the part, or the machine tool. Therefore, the estimation and prediction of process stability are very important. Process stability depends on the spindle speed, the depth of cut, and the width of cut. In milling, the process conditions are defined in the NC program. While the spindle speed is directly coded in the NC program, the depth and width of cut are unknown. This paper presents a new simulation-based approach for predicting the depth and width of cut of a milling process. The prediction is based on a material removal simulation with an analytically represented tool shape and a multi-dexel representation of the workpiece. The new calculation method allows the direct estimation of the depth and width of cut, which are the parameters governing process stability, instead of the removed volume, as existing approaches do. This knowledge can be used to predict the stability of new, unknown parts. Moreover, with an additional vibration sensor, the stability lobe diagram of a milling process can be estimated and improved based on the estimated depth and width of cut.
Keywords: dexel, process stability, material removal, milling
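The dexel-based estimation of depth and width of cut can be illustrated in one dimension: the workpiece is a row of height values (dexels), and at a given tool position the set of engaged dexels determines both quantities. This is a simplified sketch under assumed geometry (a flat end mill, a single dexel row), not the paper's multi-dexel implementation.

```python
def cutting_parameters(dexel_heights, dx, tool_center, tool_radius, tool_bottom):
    """Estimate depth and width of cut for one tool position from a 1-D
    dexel field: dexel_heights[i] is the material height at x = i*dx.

    Depth of cut = largest material height above the tool bottom inside
    the tool footprint; width of cut = length of the engaged footprint."""
    engaged = [
        h for i, h in enumerate(dexel_heights)
        if abs(i * dx - tool_center) <= tool_radius and h > tool_bottom
    ]
    if not engaged:
        return 0.0, 0.0
    depth = max(engaged) - tool_bottom
    width = len(engaged) * dx
    return depth, width

# Hypothetical workpiece: flat stock 10 mm high, dexel spacing 0.5 mm.
heights = [10.0] * 40
depth, width = cutting_parameters(heights, dx=0.5, tool_center=5.0,
                                  tool_radius=4.0, tool_bottom=8.0)
```

Evaluating this at each NC position yields the depth/width pairs that can then be checked against a stability lobe diagram.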
Procedia PDF Downloads 525
18272 Adding a Few Language-Level Constructs to Improve OOP Verifiability of Semantic Correctness
Authors: Lian Yang
Abstract:
Object-oriented programming (OOP) is the dominant programming paradigm in today's software industry, and over the past three decades of the Internet revolution it has enabled average software developers to build millions of commercial-strength applications. On the other hand, the lack of a strict mathematical model and of domain constraint features at the language level has long perplexed the computer science academia and the OOP engineering community. This situation has resulted in inconsistent system quality and hard-to-understand designs in some OOP projects. The difficulties of fixing the current situation are also well known. Although the power of OOP lies in its unbridled flexibility and enormously rich data modeling capability, we argue that the ambiguity and the implicit facade surrounding the conceptual model of a class and an object should be eliminated as much as possible. We list five major usages of the class construct and propose to separate them through new language constructs. Drawing on the well-established theories of sets and finite state machines (FSMs), we propose applying simple, generic, and yet effective constraints at the OOP language level in an attempt to address the above-mentioned issues. The goal is to make OOP more theoretically sound, as well as to help programmers uncover warning signs of irregularities and domain-specific issues early in the development stage and catch semantic mistakes at runtime, improving the correctness verifiability of software programs. The aim of this paper is thus more practical than theoretical.
Keywords: new language constructs, set theory, FSM theory, user-defined value type, function groups, membership qualification attribute (MQA), check-constraint (CC)
Procedia PDF Downloads 241
18271 Application of Principle Component Analysis for Classification of Random Doppler-Radar Targets during the Surveillance Operations
Authors: G. C. Tikkiwal, Mukesh Upadhyay
Abstract:
During surveillance operations in war or peacetime, the radar operator sees a scatter of targets on the screen. A target may be a tracked vehicle, such as a T-72 or BMP, a wheeled vehicle, such as an ALS, TATRA, 2.5-tonne truck, or Shaktiman, or moving troops and convoys. The radar operator selects one of the promising targets in Single Target Tracking (STT) mode. Once the target is locked, the operator hears a characteristic audible signal in his headphones. Drawing on experience and training gained over time, the operator then identifies the target. But this process is cumbersome and depends solely on the skills of the operator, and may therefore lead to misclassification of the object. In this paper, we present a technique using mathematical and statistical methods, namely the Fast Fourier Transform (FFT) and Principal Component Analysis (PCA), to identify such targets. The classification is based on transforming the audible signature of the target into musical octave notes. The whole methodology is then automated in software. This automation increases the efficiency of identification by reducing the chances of misclassification. The study is based entirely on live data.
Keywords: radar target, FFT, principal component analysis, eigenvector, octave notes, DSP
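The FFT-plus-PCA pipeline can be sketched as follows: magnitude-spectrum features are extracted from each audible Doppler signature, then projected onto the principal components, where the two vehicle classes separate. The tone frequencies, sampling rate, and noise level below are invented for illustration and stand in for real Doppler signatures.

```python
import numpy as np

rng = np.random.default_rng(0)

def fft_features(signal, n_bins=40):
    """Compact feature vector: the first n_bins magnitudes of the
    signal's FFT (covers both example tones at 1 kHz sampling)."""
    return np.abs(np.fft.rfft(signal))[:n_bins]

def pca_project(features, k=2):
    """Project mean-centred feature vectors onto the top-k principal
    components (eigenvectors of the feature covariance matrix)."""
    centered = features - features.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    top = eigvecs[:, np.argsort(eigvals)[::-1][:k]]
    return centered @ top

# Hypothetical audible signatures sampled at 1 kHz: "tracked" class
# dominated by a 50 Hz line, "wheeled" class by a 120 Hz line.
t = np.arange(256) / 1000.0
sigs = np.array(
    [np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(256) for _ in range(5)]
    + [np.sin(2 * np.pi * 120 * t) + 0.1 * rng.standard_normal(256) for _ in range(5)]
)
feats = np.array([fft_features(s) for s in sigs])
proj = pca_project(feats, k=2)
# The two classes separate along the first principal component.
```

A simple classifier (e.g., nearest centroid in the projected space) then replaces the operator's by-ear judgment.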
Procedia PDF Downloads 346
18270 Developing a Process and Cost Model for Xanthan Biosynthesis from Bioethanol Production Waste Effluents
Authors: Bojana Ž. Bajić, Damjan G. Vučurović, Siniša N. Dodić, Jovana A. Grahovac, Jelena M. Dodić
Abstract:
The biosynthesis of xanthan, a microbial polysaccharide produced by Xanthomonas campestris, is characterized by the possibility of using non-specific carbohydrate substrates, which means different waste effluents can be used as the basis for the production media. Potential raw materials for xanthan production come from industries that generate large amounts of waste effluents rich in the compounds necessary for microbial growth and multiplication. Considering the amount of waste effluent generated by the bioethanol industry and its high inorganic and organic load, such effluents represent potential environmental pollutants if not properly treated. For this reason, it is necessary to develop new technologies in which the wastes and wastewaters of one industry serve as raw materials for another. The result is not only a new product but also reduced pollution and environmental protection. The biotechnological production of xanthan, in which biocatalysts convert bioethanol waste effluents into a high-value product, offers a path toward sustainable development. This research uses scientific software developed for modelling biotechnological processes to design a xanthan production plant that uses bioethanol waste effluents as the raw material. The model was developed in SuperPro Designer® by entering input data such as the composition of raw materials and products and by defining unit operations, utility consumption, etc., yielding the capital and operating costs and product revenues of a baseline production plant model. Results from this baseline model can support the development of novel biopolymer production technologies. Additionally, a detailed economic analysis showed that this process for converting waste effluents into a high-value product is economically viable. The proposed model therefore represents a useful tool for scaling up the process from the laboratory or pilot plant to a working industrial-scale plant.
Keywords: biotechnology, process model, xanthan, waste effluents
Procedia PDF Downloads 350
18269 Development of an Automatic Calibration Framework for Hydrologic Modelling Using Approximate Bayesian Computation
Authors: A. Chowdhury, P. Egodawatta, J. M. McGree, A. Goonetilleke
Abstract:
Hydrologic models are increasingly used as tools to predict stormwater quantity and quality from urban catchments. However, due to a range of practical issues, most models produce gross errors in simulating complex hydraulic and hydrologic systems. The difficulty of finding a robust approach to model calibration is one of the main issues. Though automatic calibration techniques are available, they are rarely offered in common commercial hydraulic and hydrologic modelling software, e.g., MIKE URBAN, partly because of the large number of parameters and large datasets needed in the calibration process. To overcome this practical issue, a framework for the automatic calibration of a hydrologic model was developed on the R platform and is presented in this paper. The model was developed based on the time-area conceptualization. Four calibration parameters were considered as the primary set: initial loss, reduction factor, time of concentration, and time lag. Using these parameters, automatic calibration was performed using Approximate Bayesian Computation (ABC), a simulation-based technique for performing Bayesian inference when the likelihood is intractable or computationally expensive to compute. To test its performance and usefulness, the technique was used to simulate three small catchments on the Gold Coast. For comparison, simulation outcomes for the same three catchments obtained with the commercial modelling software MIKE URBAN were used. The graphical comparison shows strong agreement of the MIKE URBAN results with the upper and lower 95% credible intervals of the posterior predictions obtained via ABC. Statistical validation of the posterior runoff predictions using the coefficient of determination (CD), root mean square error (RMSE), and maximum error (ME) was found to be reasonable for the three study catchments. The main benefit of using ABC over MIKE URBAN is that ABC provides a posterior distribution for the runoff prediction, so the associated uncertainty in the predictions can be obtained, whereas MIKE URBAN provides only a point estimate. Based on the results of the analysis, the developed ABC framework performs well for automatic calibration.
Keywords: automatic calibration framework, approximate Bayesian computation, hydrologic and hydraulic modelling, MIKE URBAN software, R platform
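ABC rejection sampling, the simplest variant of the technique, can be sketched on a toy rainfall-runoff model: draw parameters from the priors, simulate, and keep only draws whose simulated runoff lies within a tolerance of the observations. The two-parameter model below (initial loss and reduction factor, echoing two of the paper's four calibration parameters), the priors, and all numbers are assumptions for illustration, not the paper's framework.

```python
import random

random.seed(42)

def runoff_model(rainfall, initial_loss, reduction_factor):
    """Toy rainfall-runoff model: effective rainfall after an initial
    loss, scaled by a reduction (runoff) factor."""
    return [max(p - initial_loss, 0.0) * reduction_factor for p in rainfall]

def rmse(a, b):
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

# Synthetic "observed" runoff generated with known parameters
# (IL = 2 mm, C = 0.6) so we can check that ABC recovers them.
rain = [5.0, 12.0, 8.0, 20.0, 3.0]
observed = runoff_model(rain, initial_loss=2.0, reduction_factor=0.6)

# ABC rejection: sample from uniform priors, keep draws whose simulated
# runoff is within a tolerance (RMSE < 0.5) of the observations.
posterior = []
for _ in range(20000):
    il = random.uniform(0.0, 5.0)
    c = random.uniform(0.1, 1.0)
    if rmse(runoff_model(rain, il, c), observed) < 0.5:
        posterior.append((il, c))

il_mean = sum(p[0] for p in posterior) / len(posterior)
c_mean = sum(p[1] for p in posterior) / len(posterior)
# Posterior means should sit near the true values IL = 2.0, C = 0.6,
# and the spread of `posterior` quantifies the calibration uncertainty.
```

The accepted sample is exactly the posterior distribution whose credible intervals the paper compares against the MIKE URBAN point estimates.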
Procedia PDF Downloads 309
18268 Empowering Learners: From Augmented Reality to Shared Leadership
Authors: Vilma Zydziunaite, Monika Kelpsiene
Abstract:
In early childhood and preschool education, play has an important role in learning and cognitive processes. In the context of a changing world, personal autonomy and the use of technology are becoming increasingly important for the development of a wide range of learner competencies. By integrating technology into learning environments, the educational reality is changed, promoting unusual learning experiences for children through play-based activities. Alongside this, teachers are challenged to develop encouragement and motivation strategies that empower children to act independently. The aim of the study was to reveal the changes in the roles and experiences of teachers in the application of AR technology for the enrichment of the learning process. A quantitative research approach was used to conduct the study. The data was collected through an electronic questionnaire. Participants: 319 teachers of 5-6-year-old children using AR technology tools in their educational process. Methods of data analysis: Cronbach alpha, descriptive statistical analysis, normal distribution analysis, correlation analysis, regression analysis (SPSS software). Results. The results of the study show a significant relationship between children's learning and the educational process modeled by the teacher. The strongest predictor of child learning was found to be related to the role of the educator. Other predictors, such as pedagogical strategies, the concept of AR technology, and areas of children's education, have no significant relationship with child learning. The role of the educator was found to be a strong determinant of the child's learning process. Conclusions. The greatest potential for integrating AR technology into the teaching-learning process is revealed in collaborative learning. 
Teachers identified that when integrating AR technology into the educational process, they encourage children to learn from each other, develop problem-solving skills, and create inclusive learning contexts. A significant relationship also emerged between the changing role of the teacher and the child's learning style, including the aspiration for personal leadership and responsibility for one's own learning. Teachers identified the following key roles: observer of the learning process, proactive moderator, and creator of the educational context. All these roles enable the learner to become an autonomous and active participant in the learning process. This helps explain why it is crucial to empower the learner to experiment, explore, discover, actively create, and engage in collaborative learning in the design and implementation of educational content, and why teachers should integrate AR technologies and apply the principles of shared leadership. No statistically significant relationship was found between teachers' understanding of the definition of AR technology and their choice of role in the learning process. However, teachers reported that their understanding of the definition influences their choice of role, which in turn affects children's learning.
Keywords: teacher, learner, augmented reality, collaboration, shared leadership, preschool education
Procedia PDF Downloads 43
18267 Research Opportunities in Business Process Management and Performance Measurement from a Constructivist View
Authors: R.T.O. Lacerda, L. Ensslin., S.R. Ensslin, L. Knoff
Abstract:
This paper aims to identify research opportunities in business process management and performance measurement from a constructivist view. The research is exploratory and descriptive in nature, and the method was qualitative. The process narrowed down 2142 articles, gathered through a search of scientific databases, to 16 articles that were relevant to the research and highly cited. The analysis found that most of the articles adopt a realist approach, and that there is a need to analyze the decision-making process in a singular manner. In most cases, the measurement criteria are identified through scientific literature searches, using ordinal scales without any integration process to present the results to the decision maker. Regarding management aspects, most of the articles lack a structured process to measure the current situation and generate improvement opportunities.
Keywords: performance measurement, BPM, decision, research opportunities
Procedia PDF Downloads 313
18266 Industrial Process Mining Based on Data Pattern Modeling and Nonlinear Analysis
Authors: Hyun-Woo Cho
Abstract:
Unexpected events may occur with serious impacts on industrial processes. This work utilizes a data representation technique to model and analyze process data patterns for the purpose of diagnosis. The use of a triangular representation of process data is evaluated on a simulated process. Furthermore, the effect of different pre-treatment techniques, based on linear or nonlinear reduced spaces, is compared. The fault pattern is extracted in the reduced space, not in the original data space. The results show that the diagnosis method based on the nonlinear technique produces more reliable results and outperforms the linear method.
Keywords: process monitoring, data analysis, pattern modeling, fault, nonlinear techniques
Procedia PDF Downloads 388
18265 The Search of Possibility of Running Six Sigma Process in IT Education Center
Authors: Mohammad Amini, Aliakbar Alijarahi
Abstract:
This research, titled 'The search of possibility of running six sigma process in IT education center', aims to test the feasibility of running the Six Sigma process in an IT education center. Six Sigma is a well-established method for reducing process errors. To evaluate the feasibility of running Six Sigma in the IT education center, several variables relevant to the process were selected: - the amount of support from the organization's top management for the process; - the current level of specialty; - the ability of the training system to compensate for shortfalls; - the degree of match between the current culture and the Six Sigma culture; - the current level of quality compared with the quality gained from running Six Sigma. To evaluate these variables, questions were formulated, and a questionnaire form with 28 questions was prepared and distributed in the target population. Since the working environment is highly competitive and the organization needs to reduce errors to a minimum, it would otherwise lose its customers. The questionnaire form was given to 55 persons; 50 forms were filled in and returned. After analyzing the forms, the following results were obtained: - the IT education center needs to adopt and run Six Sigma to improve its process quality; - most of the factors needed to run Six Sigma exist in the IT education center, but additional support is needed.
Keywords: education, customer, self-action, quality, continuous improvement process
Procedia PDF Downloads 340
18264 The Feasibility of Online, Interactive Workshops to Facilitate Anatomy Education during the UK COVID-19 Lockdowns
Authors: Prabhvir Singh Marway, Kai Lok Chan, Maria-Ruxandra Jinga, Rachel Bok Ying Lee, Matthew Bok Kit Lee, Krishan Nandapalan, Sze Yi Beh, Harry Carr, Christopher Kui
Abstract:
We piloted a structured series of online workshops on the 3D segmentation of anatomical structures from CT scans. 33 participants were recruited from four UK universities for two-day workshops between 2020 and 2021. Open-source software (3D Slicer) was used. We hypothesized that active participation via real-time screen sharing and voice communication over Discord would enable improved engagement and learning, despite national lockdowns. Written feedback indicated positive learning experiences, with subjective measures of anatomical understanding and software confidence improving.
Keywords: medical education, workshop, segmentation, anatomy
Procedia PDF Downloads 201
18263 An Introduction to E-Content Producing Algorithm for Screen-Recorded Videos
Authors: Jamileh Darsareh, Mohammad Nikafrooz
Abstract:
Some teachers and e-content producers, based on their experience, try to produce educational videos using screen recording software. They may encounter many challenges while producing screen-recorded videos, both technical and pedagogical: designing the roadmap, preparing the screen, setting up the recording software, recording the screen, editing, etc. This descriptive study presents procedures for producing acceptable, well-made videos. These procedures are presented in the form of an algorithm for producing screen-recorded video. The algorithm comprises the main production phases: design, pre-production, production, post-production, and distribution. These phases consist of steps supported by several technical and pedagogical considerations. Following these phases and steps in the suggested order helps producers create their intended video while saving time and encountering fewer technical problems. It is expected that by using this algorithm, e-content producers and teachers will perform better in producing educational videos.
Keywords: e-content producing algorithm, screen-recorded videos, screen recording software, technical and pedagogical considerations
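The five phases named in the abstract can be encoded as an ordered checklist. The step names and the helper function below are paraphrased illustrations, not the paper's actual algorithm:

```python
# Ordered phases of the producing algorithm; step names are paraphrased
# from the abstract and are illustrative assumptions.
PHASES = [
    ("design", ["define objectives", "design the roadmap"]),
    ("pre-production", ["prepare the screen", "set up the recording software"]),
    ("production", ["record the screen"]),
    ("post-production", ["edit the footage"]),
    ("distribution", ["publish the video"]),
]

def next_step(completed):
    """Return the first (phase, step) not yet completed, in prescribed order."""
    for phase, steps in PHASES:
        for step in steps:
            if step not in completed:
                return phase, step
    return None  # all phases finished

print(next_step({"define objectives"}))
```

Enforcing the order in code mirrors the abstract's claim that following the phases in sequence avoids rework and technical problems.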
Procedia PDF Downloads 197
18262 A Real-World Roadmap and Exploration of Quantum Computers Capacity to Trivialise Internet Security
Authors: James Andrew Fitzjohn
Abstract:
This paper discusses and explores the practical aspects of cracking encrypted messages with quantum computers. The theory of this process has been shown and well described both in academic papers and in headline-grabbing news articles, but amid the theory and hyperbole, we must carefully assess the practicality of these claims. We therefore use real-world devices and proof-of-concept code to prove or disprove the notion that quantum computers will render the encryption technologies used by many websites unfit for purpose. It is time to discuss and implement the practical aspects of the process, as many advances in quantum computing hardware and software have recently been made. This paper sets expectations regarding the useful lifespan of RSA and cipher lengths and proposes alternative encryption technologies. We set out comprehensive roadmaps describing when and how encryption schemes can be used, including when they can no longer be trusted. Cost is also factored into the investigation; for example, it would make little financial sense to spend millions of dollars on a quantum computer to factor a private key in seconds when a commodity GPU could perform the same task in hours. It is hoped that the real-world results depicted in this paper will help influence the owners of websites, who can take appropriate actions to improve the security of their provisions.
Keywords: quantum computing, encryption, RSA, roadmap, real world
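The gap between theory and practice can be illustrated at toy scale: classical trial division recovers the private exponent of a tiny RSA modulus instantly, but its cost grows with the square root of the modulus, which is exactly what real key sizes exploit. The modulus below is an invented toy example, not one of the paper's test cases:

```python
import math

def factor(n):
    # Naive trial division; cost grows with sqrt(n), hopeless for real key sizes.
    for p in range(2, math.isqrt(n) + 1):
        if n % p == 0:
            return p, n // p
    return None

def recover_private_exponent(p, q, e=65537):
    # Knowing the factors gives phi(n), and hence the RSA private exponent d.
    phi = (p - 1) * (q - 1)
    return pow(e, -1, phi)  # modular inverse (Python 3.8+)

n = 100160063  # toy modulus: 10007 * 10009, both prime
p, q = factor(n)
d = recover_private_exponent(p, q)
print(p, q, d)
```

A message encrypted with the public key then decrypts with the recovered `d`, confirming the break; scaling this classical attack to a 2048-bit modulus is infeasible, which is why Shor's algorithm on sufficiently large quantum hardware is the scenario worth costing out.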
Procedia PDF Downloads 133
18261 Development of a Process to Manufacture High Quality Refined Salt from Crude Solar Salt
Authors: Rathnayaka D. D. T. , Vidanage P. W. , Wasalathilake K. C. , Wickramasingha H. W. , Wijayarathne U. P. L. , Perera S. A. S.
Abstract:
This paper describes research carried out to develop a process to increase the NaCl percentage of crude salt obtained from the conventional solar evaporation process. In this study, refined salt was produced from crude solar salt by a chemico-physical method consisting of coagulation, precipitation, and filtration. Initially, crude salt crystals were crushed and dissolved in water. Optimum amounts of calcium hydroxide, sodium carbonate, and poly aluminium chloride (PAC) were then added to the solution. The refined NaCl solution was separated out by filtration. The solution was tested for total suspended solids, SO₄²⁻, Mg²⁺, and Ca²⁺. With optimum dosages of the reagents, the results showed that a purity of 99.60% NaCl could be achieved. The paper further discusses the economic viability of the proposed process: an 83% profit margin can be achieved, an increase of 112.3% compared with the traditional process.
Keywords: chemico-physical, economic, optimum, refined, solar salt
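The two quoted figures can be cross-checked with a back-of-envelope calculation, assuming the 112.3% figure is a relative increase in profit margin over the traditional process (an assumption; the paper's cost basis is not given here):

```python
# Back-of-envelope check of the quoted economics; the interpretation of
# 112.3% as a relative margin increase is an assumption.
refined_margin = 83.0                         # % margin of the proposed process
relative_increase = 112.3 / 100.0             # quoted improvement over tradition
traditional_margin = refined_margin / (1.0 + relative_increase)
print(round(traditional_margin, 1))           # implied margin of the old process
```

Under that reading, the traditional process would carry roughly a 39% margin, which makes the two quoted numbers mutually consistent.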
Procedia PDF Downloads 253
18260 Optimization of Laser Doping Selective Emitter for Silicon Solar Cells
Authors: Meziani Samir, Moussi Abderrahmane, Chaouchi Sofiane, Guendouzi Awatif, Djema Oussama
Abstract:
Laser doping has large potential for integration into silicon solar cell technologies. The ability to process local, heavily diffused regions in a self-aligned manner can greatly simplify processing sequences for the fabrication of a selective emitter. The choice of laser parameters for a laser doping process at 532 nm is investigated. Solid-state lasers with different powers and scan speeds were used for laser doping. In this work, the aim is the formation of selective emitter solar cells with a reduced number of technological steps. In order to obtain a highly doped, localized emitter region, we used 532 nm laser doping. Note that this region subsequently receives the Ag grid metallization by screen printing. For this, we used SOLIDWORKS software to design a single type of pattern for square silicon cells. Sheet resistances, phosphorus doping concentrations, and silicon bulk lifetimes of the irradiated samples are presented. Additionally, secondary ion mass spectrometry (SIMS) profiles of the laser-processed samples were acquired. Scanning electron microscope and optical microscope images of surfaces laser-processed at different parameters are shown and compared.
Keywords: laser doping, selective emitter, silicon, solar cells
Procedia PDF Downloads 102
18259 Variability Management of Contextual Feature Model in Multi-Software Product Line
Authors: Muhammad Fezan Afzal, Asad Abbas, Imran Khan, Salma Imtiaz
Abstract:
The Software Product Line (SPL) paradigm is used for the development of families of software products that share common and variable features. A feature model is a domain artifact of SPL that consists of common and variable features with predefined relationships and constraints. Multiple SPLs may share a number of similar common and variable features, such as mobile phones and tablets. Reusing common and variable features across different SPL domains is a complex task due to the external relationships and constraints of features in the feature model. To increase the reusability of feature model resources from domain engineering, the commonality of features must be managed at the level of SPL application development. In this research, we propose an approach that combines multiple SPLs into a single domain and converts them into a common feature model. Extracting the common features from different feature models is more effective and reduces cost and time to market for application development. For extracting features from multiple SPLs, the proposed framework consists of three steps: 1) find the variation points, 2) find the constraints, and 3) combine the feature models into a single feature model on the basis of the variation points and constraints. By using this approach, the reusability of features across multiple feature models can be increased. The impact of this research is to reduce development cost and time to market and to increase the number of SPL products.
Keywords: software product line, feature model, variability management, multi-SPLs
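The three merge steps can be sketched as set operations over toy feature models. The dict-based representation and the phone/tablet feature names below are illustrative assumptions, not the paper's notation:

```python
def merge_feature_models(models):
    """Combine several SPL feature models into one common feature model.

    Each model is a dict with a "features" set and a "constraints" set of
    (feature, relation, feature) triples (an assumed representation).
    """
    common = set.intersection(*(m["features"] for m in models))
    all_features = set.union(*(m["features"] for m in models))
    # Step 1: variation points are features not shared by every model.
    variation_points = all_features - common
    # Step 2: collect the constraints from all source models.
    constraints = set.union(*(m["constraints"] for m in models))
    # Step 3: combine everything into a single feature model.
    return {"common": common, "variation_points": variation_points,
            "constraints": constraints}

phone = {"features": {"call", "camera", "touchscreen"},
         "constraints": {("camera", "requires", "touchscreen")}}
tab = {"features": {"camera", "touchscreen", "stylus"},
       "constraints": {("stylus", "requires", "touchscreen")}}

merged = merge_feature_models([phone, tab])
print(sorted(merged["common"]), sorted(merged["variation_points"]))
```

The merged model exposes `camera` and `touchscreen` as reusable commonality while `call` and `stylus` remain variation points, which is the reuse the abstract argues for.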
Procedia PDF Downloads 70
18258 Carrying Out the Steps of Decision Making Process in Concrete Organization
Authors: Eva Štěpánková
Abstract:
The decision-making process is clearly defined in theory. Generally, it includes problem identification and analysis, data gathering, setting goals and criteria, developing alternatives, choosing the optimal alternative, and implementing it. In practice, however, various modifications of the theoretical decision-making process can occur. Managers may consider some of the phases too complicated or unfeasible and thus do not carry them out, while conversely some steps can be overestimated. The aim of the paper is to reveal and characterize how managers perceive the individual phases of the decision-making process. The research concerns managers in the military environment, namely commanders. A quantitative cross-sectional survey covers the individual levels of management of the Ministry of Defence of the Czech Republic. With a total of 135 respondents, the analysis focuses on which phases of the decision-making process are problematic or not carried out in practice, and which are perceived to be the easiest. The reasons for these findings are then examined.
Keywords: decision making, decision making process, decision problems, concrete organization
Procedia PDF Downloads 475
18257 Research on Straightening Process Model Based on Iteration and Self-Learning
Authors: Hong Lu, Xiong Xiao
Abstract:
Shaft parts are widely used in the machinery industry; however, bending deformation often occurs when such parts are heat treated, so they need to be straightened to meet straightness requirements. For the pressure straightening process, a good straightening stroke algorithm determines the precision and efficiency of the process. In this paper, the relationship between straightening load and deflection during the straightening process is analyzed, and a mathematical model of the straightening process is established. Based on this model, an iterative method is used to solve for the straightening stroke. Compared with the traditional straightening stroke algorithm, the stroke calculated by this method is much more precise, because it can adapt to changes in material performance parameters. Considering that straightening is widely used in the mass production of shaft parts, a knowledge base is used to store the data of the straightening process, and a straightening stroke algorithm based on empirical data is set up. This paper establishes a straightening process control model that combines the iteration-based straightening stroke method with the empirical-data-based algorithm. Finally, an experiment is designed to verify the straightening process control model.
Keywords: straightness, straightening stroke, deflection, shaft parts
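The iterative stroke calculation and the knowledge-base idea can be sketched numerically. The linear-hardening springback model, its coefficients, and the cache below are illustrative assumptions rather than the paper's fitted load-deflection model:

```python
def residual_correction(stroke, k_elastic=0.8, yield_stroke=1.0):
    # Toy springback model (assumed): only the plastic part of the stroke
    # survives springback; below the yield stroke, the part springs back fully.
    if stroke <= yield_stroke:
        return 0.0
    return (stroke - yield_stroke) * k_elastic

def solve_stroke(target, tol=1e-6, max_iter=100):
    """Iteratively find the press stroke that removes the target deflection."""
    stroke = target  # initial guess
    for _ in range(max_iter):
        err = target - residual_correction(stroke)
        if abs(err) < tol:
            break
        stroke += err / 0.8  # update using the known plastic slope
    return stroke

# Self-learning store: solved strokes are kept so repeated corrections in
# mass production reuse empirical results instead of iterating again.
KNOWLEDGE_BASE = {}

def solve_stroke_cached(target):
    if target not in KNOWLEDGE_BASE:
        KNOWLEDGE_BASE[target] = solve_stroke(target)
    return KNOWLEDGE_BASE[target]

print(round(solve_stroke_cached(0.5), 4))  # stroke for a 0.5 mm correction
```

With this toy model, a 0.5 mm correction requires a stroke of 1.625 mm (1.0 mm elastic plus 0.5/0.8 mm plastic), and the cached value is returned directly on the next identical part.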
Procedia PDF Downloads 330