Search results for: right-based approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13431

11151 Cleaning of Polycyclic Aromatic Hydrocarbons (PAH) Obtained from Ferroalloys Plant

Authors: Stefan Andersson, Balram Panjwani, Bernd Wittgens, Jan Erik Olsen

Abstract:

Polycyclic aromatic hydrocarbons (PAH) are organic compounds consisting only of carbon and hydrogen arranged in aromatic rings. PAH are neutral, non-polar molecules produced by the incomplete combustion of organic matter. These compounds are carcinogenic and interact with biological nucleophiles to inhibit the normal metabolic functions of cells. In Norway, the most important sources of PAH pollution are considered to be aluminum plants, the metallurgical industry, offshore oil activity, transport, and wood burning. Stricter governmental regulations regarding emissions to the outer and internal environment, combined with increased awareness of the potential health effects, have motivated Norwegian metal industries to increase their efforts to reduce emissions considerably. One of the objectives of the ongoing "SCORE" project, supported by industry and the Norwegian Research Council, is to reduce potential PAH emissions from the off-gas stream of a ferroalloy furnace through controlled combustion in a dedicated combustion chamber. The sizing and configuration of the combustion chamber depend on the combined properties of the bulk gas stream and the properties of the PAH itself. In order to achieve efficient and complete combustion, the residence time and minimum temperature need to be optimized. For this design approach, reliable kinetic data on the individual PAH species and/or groups thereof are necessary. However, kinetic data on the combustion of PAH are difficult to obtain, and only a limited number of studies exist. The paper presents an evaluation of the kinetic data for some of the PAH obtained from the literature. In the present study, the oxidation is modelled both for pure PAH and for PAH mixed with process gas. Using a perfectly stirred reactor (PSR) modelling approach, the oxidation is modelled with advanced reaction kinetics to study the influence of residence time and temperature on the conversion of PAH to CO2 and water. A Chemical Reactor Network (CRN) approach is developed to understand the oxidation of PAH inside the combustion chamber. Chemical reactor network modelling has been found to be a valuable tool in the evaluation of the oxidation behaviour of PAH under various conditions.
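
The PSR/CRN strategy described above can be prototyped with open-source tools. The sketch below is an illustration rather than the authors' SCORE model: it uses Cantera to advance a single well-stirred gas element to a given residence time and reports fuel conversion. Because the bundled GRI-Mech 3.0 mechanism contains no PAH species, methane stands in for the organic load; a PAH-capable mechanism and the real off-gas composition would be substituted in practice.

```python
# Minimal sketch (not the SCORE model): estimate fuel conversion after a given
# residence time and temperature with Cantera. GRI-Mech 3.0 has no PAH species,
# so methane stands in for the organic load.
import cantera as ct

def conversion(T_K, tau_s, fuel='CH4'):
    gas = ct.Solution('gri30.yaml')
    gas.TPX = T_K, ct.one_atm, f'{fuel}:1.0, O2:4.0, N2:15.0'
    x0 = gas[fuel].X[0]
    # A batch reactor advanced to the residence time approximates one perfectly
    # stirred reactor element of a chemical reactor network (CRN).
    reactor = ct.IdealGasConstPressureReactor(gas)
    net = ct.ReactorNet([reactor])
    net.advance(tau_s)
    x1 = reactor.thermo[fuel].X[0]
    return 1.0 - x1 / x0

for T in (1000.0, 1100.0, 1200.0):
    print(f'T = {T:.0f} K, tau = 0.5 s -> conversion = {conversion(T, 0.5):.3f}')
```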

Keywords: PAH, PSR, energy recovery, ferro alloy furnace

Procedia PDF Downloads 257
11150 Barriers and Facilitators to Inclusive Programming for Children with Mental and/or Developmental Challenges: A Participatory Action Research of Perspectives from Families and Professionals

Authors: Minnie Y. Teng, Kathy Xie, Jarus Tal

Abstract:

Rationale: The traditional approach to community programs for children with mental and/or developmental challenges often involves segregation from typically developing peers. However, studies show that inclusive education improves children’s quality of life, self-concept, and long-term health outcomes. Investigating factors that influence inclusion can thus have important implications for the design and facilitation of community programs such that all children - across a spectrum of needs and abilities - may benefit. Objectives: This study explores barriers and facilitators to inclusive community programming for children aged 0 to 12 with developmental/mental challenges. Methods: Using a participatory-action research methodology, semi-structured focus groups and interviews will be used to explore the perspectives of students, instructors, and staff. Data will be transcribed and coded thematically. Practice Implications or Results: By having a deeper understanding of the barriers and facilitators to inclusive programming in the community, researchers can work with the broader community to facilitate inclusion in children’s community programs. Conclusions: Expanding inclusive practices may improve the health and wellbeing of the pediatric population with disabilities, which consistently reports lower levels of participation. These findings may help to identify gaps in existing practices and ways to address them.

Keywords: aquatic programs, children, disabilities, inclusion, community programs

Procedia PDF Downloads 105
11149 From Shallow Semantic Representation to a Deeper One: Verb Decomposition Approach

Authors: Aliaksandr Huminski

Abstract:

Semantic Role Labeling (SRL), as a shallow semantic parsing approach, includes recognizing and labeling the arguments of a verb in a sentence. Verb participants are linked with specific semantic roles (Agent, Patient, Instrument, Location, etc.). Thus, SRL can answer key questions such as ‘Who’, ‘When’, ‘What’, ‘Where’ in a text, and it is widely applied in dialog systems, question answering, named entity recognition, information retrieval, and other fields of NLP. However, SRL has the following flaw: two sentences with identical (or almost identical) meaning can have different semantic role structures. Let us consider two sentences: (1) John put butter on the bread. (2) John buttered the bread. SRL for (1) and (2) will be significantly different. For the verb put in (1) it is [Agent + Patient + Goal], but for the verb butter in (2) it is [Agent + Goal]. This happens because of one of the most interesting and intriguing features of a verb: its ability to capture participants, as in the case of the verb butter, or their features, as, say, in the case of the verb drink, where the participant’s feature of being liquid is shared with the verb. This capture looks like a total fusion of meaning and cannot be decomposed in a direct way (in comparison with compound verbs like babysit or breastfeed). From this perspective, SRL looks really shallow as a representation of semantic structure. If the key point of semantic representation is the opportunity to use it for making inferences and finding hidden reasons, it assumes by default that two different but semantically identical sentences must have the same semantic structure. Otherwise we will have different inferences from the same meaning. To overcome the above-mentioned flaw, the following approach is suggested. Assume that: P is a participant of a relation; F is a feature of a participant; Vcp is a verb that captures a participant; Vcf is a verb that captures a feature of a participant; Vpr is a primitive verb, i.e. a verb that does not capture any participant and represents only a relation. In other words, a primitive verb is a verb whose meaning does not include meanings from its surroundings. Then Vcp and Vcf can be decomposed as: Vcp = Vpr + P; Vcf = Vpr + F. If all Vcp and Vcf are represented this way, then primitive verbs Vpr can be considered a canonical form for SRL. As a result, there will be no hidden participants caught by a verb, since all participants will be explicitly unfolded. An obvious example of Vpr is the verb go, which represents pure movement. In this case the verb drink can be represented as a man-made movement of liquid in a specific direction. Extracting and using primitive verbs for SRL creates a canonical representation that is unique for semantically identical sentences. This leads to the unification of semantic representation. In this case, the critical flaw of SRL described above will be resolved.
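
A minimal illustration of the proposed canonicalisation is sketched below. The tiny lexicon mapping capturing verbs onto a primitive verb plus the captured participant is hypothetical and serves only to show how a role structure over Vcp is rewritten over Vpr.

```python
# Illustrative sketch of the decomposition Vcp = Vpr + P / Vcf = Vpr + F.
# The lexicon below is a hand-made, hypothetical example of the idea.
LEXICON = {
    # capturing verb: (primitive verb, captured participant/feature)
    'butter': ('put', {'Patient': 'butter'}),
    'breastfeed': ('feed', {'Instrument': 'breast'}),
}

def canonicalize(verb, roles):
    """Rewrite an SRL frame over a capturing verb into one over a primitive verb."""
    if verb in LEXICON:
        primitive, captured = LEXICON[verb]
        merged = dict(roles)
        merged.update(captured)   # unfold the participant hidden inside the verb
        return primitive, merged
    return verb, roles

# "John buttered the bread."  -> [Agent + Goal] over 'butter'
print(canonicalize('butter', {'Agent': 'John', 'Goal': 'bread'}))
# -> ('put', {'Agent': 'John', 'Goal': 'bread', 'Patient': 'butter'})
# i.e. the same frame as "John put butter on the bread."
```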

Keywords: decomposition, labeling, primitive verbs, semantic roles

Procedia PDF Downloads 351
11148 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis

Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García

Abstract:

Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and meaning plurality modelling are yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis and new IT-analysis approaches for the use of context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi)automated processes and their corresponding epistemological reflection. Among discourse analysis approaches, the sociology of knowledge approach to discourse analysis is characterised by the reconstructive and accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg. As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building up a uniform working environment, which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods according to grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect on the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus, in which the relevant actors and discourse positions are analysed using conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps.
In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automatically by programs that propose coding paradigms based on the calculated entities and their relationships. Simultaneously, these can be specifically trained by manual coding in a closed reading process and specified according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
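
As an illustration of the kind of (semi-)automated annotation step described above, and not the actual D-WISE toolchain, the following sketch runs dependency parsing and named entity recognition with spaCy over a small text and collects entity mentions and subject-verb-object triples as candidate coding units. The example sentence and the chosen model are assumptions made for the sketch.

```python
# Illustrative annotation step (not the D-WISE environment): dependency parsing
# and named entity recognition with spaCy, producing candidate coding units for
# a later closed-reading pass. Assumes `python -m spacy download en_core_web_sm`.
import spacy
from collections import Counter

nlp = spacy.load("en_core_web_sm")
text = ("The health ministry announced a new digital patient record, "
        "while privacy advocates raised data protection concerns.")
doc = nlp(text)

# Named entities as candidate actors in the discourse.
actors = Counter((ent.text, ent.label_) for ent in doc.ents)

# Rough subject-verb-object triples as candidate discourse positions.
triples = [(tok.text, tok.head.text,
            next((c.text for c in tok.head.children if c.dep_ in ("dobj", "obj")), None))
           for tok in doc if tok.dep_ in ("nsubj", "nsubjpass")]

print(actors)
print(triples)
```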

Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis

Procedia PDF Downloads 212
11147 Runoff Estimates of Rapidly Urbanizing Indian Cities: An Integrated Modeling Approach

Authors: Rupesh S. Gundewar, Kanchan C. Khare

Abstract:

Runoff contribution from urban areas comes mainly from manmade structures and a few natural contributors. The manmade structures are buildings, roads and other paved areas, whereas the natural contributors are groundwater, overland flows, etc. Runoff alleviation is achieved by manmade as well as natural storages. Manmade storages are storage tanks or other storage structures such as soakaways or soak pits, which are more common in western and European countries. Natural storages are catchment slope, infiltration, catchment length, channel rerouting, drainage density, depression storage, etc. A literature survey on the manmade and natural storages/inflows has presented the percentage contribution of each individually. Sanders et al. have reported that a vegetation canopy reduces runoff by 7% to 12%. Nassif et al. have reported that catchment slope has an impact on rainfall runoff of 16% on bare standard soil and 24% on grassed soil. Infiltration, being a parameter dependent on the pervious/impervious ratio, is catchment specific, but a literature survey has presented a range of 15% to 30% loss of rainfall runoff in various catchment study areas. Catchment length and channel rerouting also play a considerable role in the reduction of rainfall runoff. Groundwater infiltration inflow adds to the runoff where the groundwater table is very shallow and the soil saturates even in a lower-intensity storm; together with surface inflow, it contributes approximately 2% of the total runoff volume. Considering the various contributing factors in runoff, it has been observed during the literature survey that an integrated modelling approach needs to be considered. The traditional storm water network models are able to predict to a fair/acceptable degree of accuracy provided no interactions with receiving waters (river, sea, canal, etc.), ground infiltration, treatment works, etc. are assumed. When such interactions are significant, it becomes difficult to reproduce the actual flood extent using the traditional discrete modelling approach. As a result, the correct flooding situation is very rarely addressed accurately. Since the development of spatially distributed hydrologic models, predictions have become more accurate at the cost of requiring more accurate spatial information. The integrated approach provides a greater understanding of the performance of the entire catchment. It enables the identification of the source of flow in the system, an understanding of how it is conveyed, and also its impact on the receiving body. It also confirms important pain points, hydraulic controls, and the source of flooding, which could not be easily understood with a discrete modelling approach. This also enables decision makers to identify solutions that can be spread throughout the catchment rather than being concentrated at the single point where the problem exists. Thus it can be concluded from the literature survey that the representation of urban details can be a key differentiator in the successful understanding of the flooding issue. The intent of this study is to accurately predict the runoff from impermeable areas in an urban area in India. A representative area has been selected for which data was available, and predictions have been made which are corroborated with the actual measured data.
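
As a simple, self-contained illustration of one runoff component that an integrated model would include (not the calibrated model used in this study), the SCS curve-number method estimates direct runoff from rainfall depth and a composite curve number reflecting the impervious fraction. The curve-number values and storm depth below are assumed example values.

```python
# Illustrative runoff component (SCS curve-number method), not the integrated
# model of the abstract. Units are millimetres of rainfall/runoff.
def scs_runoff_mm(rain_mm: float, curve_number: float) -> float:
    s = 25400.0 / curve_number - 254.0      # potential maximum retention
    ia = 0.2 * s                            # initial abstraction
    if rain_mm <= ia:
        return 0.0
    return (rain_mm - ia) ** 2 / (rain_mm - ia + s)

# Composite curve number weighted by impervious fraction (assumed values).
def composite_cn(impervious_fraction, cn_impervious=98.0, cn_pervious=70.0):
    return impervious_fraction * cn_impervious + (1.0 - impervious_fraction) * cn_pervious

cn = composite_cn(0.65)                     # rapidly urbanizing catchment, 65% paved
print(f"Runoff for a 75 mm storm: {scs_runoff_mm(75.0, cn):.1f} mm (CN = {cn:.1f})")
```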

Keywords: runoff, urbanization, impermeable response, flooding

Procedia PDF Downloads 238
11146 D3Advert: Data-Driven Decision Making for Ad Personalization through Personality Analysis Using BiLSTM Network

Authors: Sandesh Achar

Abstract:

Personalized advertising holds greater potential for higher conversion rates compared to generic advertisements. However, its widespread application in the retail industry faces challenges due to complex implementation processes. These complexities impede the swift adoption of personalized advertisement on a large scale. Personalized advertisement, being a data-driven approach, necessitates consumer-related data, adding to its complexity. This paper introduces an innovative data-driven decision-making framework, D3Advert, which personalizes advertisements by analyzing personalities using a BiLSTM network. The framework utilizes the Myers–Briggs Type Indicator (MBTI) dataset for development. The employed BiLSTM network, specifically designed and optimized for D3Advert, classifies user personalities into one of the sixteen MBTI categories based on their social media posts. The classification accuracy is 86.42%, with precision, recall, and F1-Score values of 85.11%, 84.14%, and 83.89%, respectively. The D3Advert framework personalizes advertisements based on these personality classifications. Experimental implementation and performance analysis of D3Advert demonstrate a 40% improvement in impressions. D3Advert’s innovative and straightforward approach has the potential to transform personalized advertising and foster widespread personalized advertisement adoption in marketing.
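
A minimal sketch of the kind of BiLSTM personality classifier described above is given below. The real D3Advert network, its preprocessing and its layer sizes are not published here, so the vocabulary size, sequence length and hyperparameters are assumptions made for illustration.

```python
# Hedged sketch of a BiLSTM text classifier over the 16 MBTI classes. Layer
# sizes, vocabulary size and sequence length are illustrative assumptions, not
# the published D3Advert configuration; x_train/y_train are placeholders.
import tensorflow as tf

VOCAB_SIZE, MAX_LEN, NUM_CLASSES = 20000, 200, 16

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 128),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.build(input_shape=(None, MAX_LEN))
model.summary()
# Training would use integer-encoded social media posts:
# model.fit(x_train, y_train, validation_split=0.1, epochs=5, batch_size=64)
```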

Keywords: personalized advertisement, deep learning, MBTI dataset, BiLSTM network, NLP

Procedia PDF Downloads 26
11145 Analysis of Minimizing Investment Risks in Power and Energy Business Development by Combining Total Quality Management and International Financing Institutions Project Management Tools

Authors: M. Radunovic

Abstract:

The region of Southeastern Europe has a substantial energy resource potential and is witnessing an increasing rate of power and energy project investments. This comes as a result of countries harmonizing their legal frameworks and market regulations to conform to those of the European Union, enabling direct private investments. Funding in the power and energy market in this region originates from various resources and investment entities, including commercial and institutional ones. Risk anticipation and assessment are crucial to project success, especially given the long exploitation period of projects in the power and energy domain, as well as the wide range of stakeholders involved. This paper analyzes the possibility of the combined application of tools used in total quality management and by international financing institutions for project planning, execution and evaluation, with the goal of anticipating, assessing and minimizing the risks that might occur in the development and execution phases of a power and energy project in the market of southeastern Europe. The history of successful project management and investments, both in industry and in the institutional sector, provides sufficient experience, guidance and internationally adopted tools for proper project assessment of investments in power and energy. The business environment of southeastern Europe provides immense potential for developing power and engineering projects of various magnitudes, depending on stakeholders’ interest. Diversification of investment sources provides assurance that there is interest and commitment to invest in this market. Global economic and political developments will intensify the pace of investments in the upcoming period. The proposed approach accounts for key parameters that contribute to the sustainability and profitability of a project, which include the technological, educational, social and economic gaps between the southeastern European region and western Europe, market trends in equipment design and production on a global level, an environmentally friendly approach to renewable energy sources as well as conventional power generation systems, and finally the long-term effect of the One Belt One Road Initiative, led by the People’s Republic of China, on the power and energy market of this region in the upcoming period. The analysis will outline the key benefits of the approach as well as the accompanying constraints. In parallel, it will provide an overview of dominant threats and opportunities in the present and future business environment and their influence on the proposed application. Through concrete examples, the full potential of this approach will be presented along with necessary improvements that need to be implemented. The number of power and engineering projects being developed in southeastern Europe will increase in the upcoming period. Proper risk analysis will lead to minimizing project failures. The proposed combination of reliable project planning tools from different investment areas can prove to be beneficial in future power and engineering investments and guarantee their sustainability and profitability.

Keywords: capital investments, lean six sigma, logical framework approach, logical framework matrix, one belt one road initiative, project management tools, quality function deployment, Southeastern Europe, total quality management

Procedia PDF Downloads 100
11144 Movie Genre Preference Prediction Using Machine Learning for Customer-Based Information

Authors: Haifeng Wang, Haili Zhang

Abstract:

Most movie recommendation systems have been developed for customers to find items of interest. This work introduces a predictive model usable by small and medium-sized enterprises (SMEs) that need a data-based, analytical approach to stock proper movies for local audiences and retain more customers. We used classification models to extract features from thousands of customers’ demographic, behavioral and social information to predict their movie genre preferences. In the implementation, a Gaussian kernel support vector machine (SVM) classification model and a logistic regression model were established to extract features from sample data, and their in-sample errors were compared. A comparison of out-of-sample errors was also made under different Vapnik–Chervonenkis (VC) dimensions in the machine learning algorithm to find and prevent overfitting. The Gaussian kernel SVM prediction model can correctly predict movie genre preferences in 85% of positive cases. The accuracy of the algorithm increased to 93% with a smaller VC dimension and less overfitting. These findings advance our understanding of how to use a machine learning approach to predict customers’ preferences with a small data set and design prediction tools for these enterprises.
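
A compact sketch of the comparison described above, using scikit-learn: an RBF-kernel SVM versus logistic regression, with training accuracy as the in-sample measure and cross-validation as a proxy for out-of-sample error. The synthetic features stand in for the customer data, which is not reproduced here.

```python
# Illustrative comparison of an RBF-kernel SVM and logistic regression on
# synthetic customer features; the study's demographic/behavioral/social
# features are placeholders here.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2000, n_features=20, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
logreg = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

for name, clf in [("RBF SVM", svm), ("Logistic regression", logreg)]:
    clf.fit(X_tr, y_tr)
    in_sample = clf.score(X_tr, y_tr)                      # in-sample accuracy
    cv_acc = cross_val_score(clf, X, y, cv=5).mean()       # proxy for out-of-sample accuracy
    print(f"{name}: train acc = {in_sample:.3f}, 5-fold CV acc = {cv_acc:.3f}")
```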

Keywords: computational social science, movie preference, machine learning, SVM

Procedia PDF Downloads 247
11143 Reconfigurable Intelligent Surfaces (RIS)-Assisted Integrated LEO Satellite and UAV for Non-Terrestrial Networks Using a Deep Reinforcement Learning Approach

Authors: Tesfaw Belayneh Abebe

Abstract:

We investigate how to enhance throughput by integrating low Earth orbit (LEO) satellites and unmanned aerial vehicles (UAVs) within a non-terrestrial network (NTN) with the assistance of reconfigurable intelligent surfaces (RIS). We propose a method to jointly optimize the association with the LEO satellite, the 3D trajectory of the UAV, and the phase shifts of the RIS to maximize communication throughput for RIS-assisted integrated LEO satellite and UAV-enabled wireless communications. This is challenging due to the time-varying position of the LEO satellite, the high mobility of UAVs, the enormous number of possible control actions, and the large number of RIS elements. Utilizing a multi-agent double deep Q-network (MADDQN), our approach dynamically adjusts LEO satellite association, UAV positioning, and RIS phase shifts. Simulation results demonstrate that our method significantly outperforms baseline strategies in maximizing throughput. Thanks to the integrated network and the RIS, the proposed scheme achieves up to 65.66x higher peak throughput and 25.09x higher worst-case throughput.
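
The core of the (multi-agent) double deep Q-network update is the decoupling of action selection (online network) from action evaluation (target network). A single-agent sketch of that target computation in PyTorch is given below; the paper's actual multi-agent state/action encodings for satellite association, UAV trajectory and RIS phase shifts are not reproduced, and the network sizes are assumptions.

```python
# Double DQN target computation (single-agent sketch, illustrative sizes).
import torch
import torch.nn as nn

class QNet(nn.Module):
    def __init__(self, state_dim, n_actions):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(state_dim, 128), nn.ReLU(),
                                 nn.Linear(128, n_actions))
    def forward(self, s):
        return self.net(s)

def double_dqn_loss(online, target, batch, gamma=0.99):
    s, a, r, s_next, done = batch                   # tensors from a replay buffer
    q_sa = online(s).gather(1, a.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        best_a = online(s_next).argmax(dim=1, keepdim=True)   # select with online net
        q_next = target(s_next).gather(1, best_a).squeeze(1)  # evaluate with target net
        y = r + gamma * (1.0 - done) * q_next
    return nn.functional.mse_loss(q_sa, y)
```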

Keywords: low Earth orbit (LEO) satellites, unmanned aerial vehicles (UAVs), non-terrestrial networks (NTN), reconfigurable intelligent surfaces (RIS), multi-agent double deep Q-network (MADDQN)

Procedia PDF Downloads 24
11142 Design of a Cooperative Neural Network, Particle Swarm Optimization (PSO) and Fuzzy Based Tracking Control for a Tilt Rotor Unmanned Aerial Vehicle

Authors: Mostafa Mjahed

Abstract:

Tilt Rotor UAVs (Unmanned Aerial Vehicles) are naturally unstable and difficult to maneuver. The purpose of this paper is to design controllers for the stabilization and trajectory tracking of this type of UAV. To this end, artificial intelligence methods have been exploited. First, the dynamics of this UAV were modeled using the Lagrange-Euler method. The conventional method based on Proportional, Integral and Derivative (PID) control was applied by decoupling the different flight modes. To improve the stability and trajectory tracking of the Tilt Rotor, the fuzzy approach and the technique of multilayer neural networks (NN) have been used. Thus, Fuzzy Proportional Integral and Derivative (FPID) and Neural Network-based Proportional Integral and Derivative (NNPID) controllers have been developed. A meta-heuristic approach based on the Particle Swarm Optimization (PSO) method was used to adjust the tuning parameters of the NNPID controller, giving an improved NNPID-PSO controller. Simulation results in the Matlab environment show the efficiency of the adopted approaches. Moreover, the Tilt Rotor UAV becomes stable and follows different types of trajectories with acceptable precision. The Fuzzy, NN and NN-PSO-based approaches demonstrated their robustness, as the presence of disturbances did not alter the stability or the trajectory tracking of the Tilt Rotor UAV.
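
A minimal sketch of using PSO to tune controller gains against a scalar tracking cost is given below. The cost function is a placeholder; in the paper the cost would come from simulating the full Tilt Rotor closed loop in Matlab, and the swarm parameters here are assumed example values.

```python
# Minimal PSO sketch for tuning three controller gains (e.g. Kp, Ki, Kd).
# tracking_cost() is a placeholder for a closed-loop simulation returning
# a tracking-error metric such as ITAE.
import numpy as np

def tracking_cost(gains):
    target = np.array([4.0, 0.8, 1.5])          # stand-in "ideal" gains
    return float(np.sum((gains - target) ** 2))

def pso(cost, dim=3, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, bounds=(0.0, 10.0)):
    rng = np.random.default_rng(0)
    x = rng.uniform(*bounds, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, *bounds)
        vals = np.array([cost(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

print(pso(tracking_cost))
```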

Keywords: neural network, fuzzy logic, PSO, PID, trajectory tracking, tilt-rotor UAV

Procedia PDF Downloads 105
11141 Ionic Liquids as Substrates for Metal-Organic Framework Synthesis

Authors: Julian Mehler, Marcus Fischer, Martin Hartmann, Peter S. Schulz

Abstract:

During the last two decades, the synthesis of metal-organic frameworks (MOFs) has gained ever increasing attention. Based on their pore size and shape as well as host-guest interactions, they are of interest for numerous fields related to porous materials, such as catalysis and gas separation. Usually, MOF synthesis takes place in an organic solvent between room temperature and approximately 220 °C, with mixtures of polyfunctional organic linker molecules and metal precursors as substrates. Reactions at temperatures above the boiling point of the solvent, i.e. solvothermal reactions, are run in autoclaves or sealed glass vessels under autogenous pressure. A relatively new approach to the synthesis of MOFs is the so-called ionothermal synthesis route. It applies an ionic liquid as a solvent, which can serve as a structure-directing template and/or a charge-compensating agent in the final coordination polymer structure. Furthermore, this method often allows for less harsh reaction conditions than the solvothermal route. Here a variation of the ionothermal approach is reported, in which the ionic liquid also serves as the organic linker source. By using 1-ethyl-3-methylimidazolium terephthalates ([EMIM][Hbdc] and [EMIM]₂[bdc]), the one-step synthesis of MIL-53(Al)/boehmite composites with interesting features is possible. The resulting material is already formed at moderate temperatures (90-130 °C) and is stabilized in the usually unfavored ht-phase. Additionally, in contrast to already published procedures for MIL-53(Al) synthesis, no further activation at high temperatures is mandatory. A full characterization of this novel composite material is provided, including XRD, SS-NMR, elemental analysis and SEM as well as sorption measurements, and its interesting features are compared to MIL-53(Al) samples produced by the classical solvothermal route. Furthermore, the syntheses of the applied ionic liquids and salts are discussed. The influence of the degree of ionicity of the linker source [EMIM]x[H(2-x)bdc] on the crystal structure and the achievable synthesis temperature is investigated and gives insight into the role of the IL during synthesis. Aside from the synthesis of MIL-53 from EMIM terephthalates, the use of the phosphonium cation in this approach is discussed as well. Additionally, the employment of ILs in the preparation of other MOFs is presented briefly. This includes the ZIF-4 framework from the respective imidazolate ILs and chiral camphorate-based frameworks from their imidazolium precursors.

Keywords: ionic liquids, ionothermal synthesis, material synthesis, MIL-53, MOFs

Procedia PDF Downloads 192
11140 Review of Assessment of Integrated Information System (IIS) in Organisation

Authors: Mariya Salihu Ingawa, Sani Suleiman Isah

Abstract:

The assessment of an Integrated Information System (IIS) in an organisation is an important initiative to enable Information System (IS) managers, as well as top management, to understand the success status of their investment in IS integration efforts. However, without a proper assessment, an organisation will not know its IIS status, which may affect its judgment on what action should be taken onwards. Current research on IIS assessment is lacking, and the related literature on IIS assessment focuses more on assessing the technical aspect of IIS. It is argued that assessing the technical aspect alone is inadequate, since organisational and strategic aspects of IIS should also be considered. Current methods, techniques and tools used by vendors for IIS assessment also lack comprehensive measures to fully assess an Integrated Information System in terms of the technical, organisational and strategic domains. The purpose of this study is to establish critical success factors for measuring the success of an Integrated Information System. These factors are used as the basis for constructing an approach to comprehensively assess IIS in an organisation. A comprehensive list of success factors for IIS assessment, established from the literature, was initially presented. Expert surveys using both manual and online methods were conducted to verify the factors. Based on the factors, an instrument for IIS assessment was constructed. The results from a case study indicate that, through a comprehensive assessment approach, not only is the level of success known, but the contributing factors are also revealed. This research contributes to the field of Information Systems, specifically in the area of Integrated Information System assessment.

Keywords: integrated information system, expert surveys, organisation, assessment

Procedia PDF Downloads 374
11139 An Ensemble Learning Method for Applying Particle Swarm Optimization Algorithms to Systems Engineering Problems

Authors: Ken Hampshire, Thomas Mazzuchi, Shahram Sarkani

Abstract:

As a subset of metaheuristics, nature-inspired optimization algorithms such as particle swarm optimization (PSO) have shown promise both in solving intractable problems and in their extensibility to novel problem formulations due to their general approach requiring few assumptions. Unfortunately, single instantiations of algorithms require detailed tuning of parameters and cannot be proven to be best suited to a particular illustrative problem on account of the “no free lunch” (NFL) theorem. Using these algorithms in real-world problems requires exquisite knowledge of the many techniques and is not conducive to reconciling the various approaches to given classes of problems. This research aims to present a unified view of PSO-based approaches from the perspective of relevant systems engineering problems, with the express purpose of then eliciting the best solution for any problem formulation in an ensemble learning bucket of models approach. The central hypothesis of the research is that extending the PSO algorithms found in the literature to real-world optimization problems requires a general ensemble-based method for all problem formulations but a specific implementation and solution for any instance. The main results are a problem-based literature survey and a general method to find more globally optimal solutions for any systems engineering optimization problem.
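
A hedged sketch of the "bucket of models" idea follows: several differently parameterized PSO instances are run on the same objective and the best solution found is kept, sidestepping the per-problem tuning of a single instantiation. The configurations and the stand-in objective below are illustrative, not a prescription from the paper.

```python
# Illustrative ensemble of PSO parameterizations; the best result is retained.
import numpy as np

def pso(cost, dim, w, c1, c2, n=30, iters=200, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(*bounds, size=(n, dim)); v = np.zeros_like(x)
    pb, pbv = x.copy(), np.apply_along_axis(cost, 1, x)
    gb = pb[pbv.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        v = w * v + c1 * r1 * (pb - x) + c2 * r2 * (gb - x)
        x = np.clip(x + v, *bounds)
        vals = np.apply_along_axis(cost, 1, x)
        better = vals < pbv
        pb[better], pbv[better] = x[better], vals[better]
        gb = pb[pbv.argmin()].copy()
    return gb, pbv.min()

sphere = lambda z: float(np.sum(z ** 2))     # stand-in objective function
configs = [dict(w=0.9, c1=2.0, c2=2.0), dict(w=0.5, c1=1.5, c2=1.5), dict(w=0.7, c1=1.0, c2=2.5)]
results = [pso(sphere, dim=10, seed=i, **cfg) for i, cfg in enumerate(configs)]
best_solution, best_value = min(results, key=lambda r: r[1])
print("best ensemble value:", best_value)
```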

Keywords: particle swarm optimization, nature-inspired optimization, metaheuristics, systems engineering, ensemble learning

Procedia PDF Downloads 75
11138 Lecturers in Higher Education Using Teaching Strategies and Digital Tools to Overcome Challenges Faced in South Africa by Implementing Blended Learning

Authors: Thaiurie Govender, Shannon Verne

Abstract:

The Fourth Industrial Revolution has ushered in an era where technology significantly impacts various aspects of life, including higher education. Blended learning, which combines synchronous and asynchronous learning, has gained popularity as a pedagogical approach. However, its effective implementation is a challenge, particularly in the context of the COVID-19 pandemic and the technological obstacles faced in South Africa. This study focused on lecturers' teaching and learning practices in implementing blended learning, aiming to understand the teaching and learning strategies used, together with the integration of digital tools, to facilitate the blended learning approach within a private higher education institution in South Africa. Using heutagogy and constructivism as theoretical frameworks, the study aimed to uncover insights into lecturers' teaching and learning practices for overcoming challenges in designing and facilitating blended learning modules. Through a qualitative analysis, the themes of student engagement, teaching and learning strategies, digital tools, and feedback emerged, highlighting the complexities and opportunities in a blended learning classroom. The findings emphasize the importance of tailoring methods to students' needs and subject matter, aligning with constructivist principles. Recommendations include promoting professional development opportunities, addressing infrastructure issues, and fostering a supportive learning environment.

Keywords: blended learning, digital tools, higher education, teaching strategies

Procedia PDF Downloads 22
11137 Qualitative and Quantitative Research Methodology Theoretical Framework and Descriptive Theory: PhD Construction Management

Authors: Samuel Quashie

Abstract:

PhDs in Construction Management often design their methods based on those established in the social sciences, using theoretical models to collect, gather and analyse data in order to answer research questions. The aim of this work is to apply qualitative and quantitative methods as the data analysis method and, as part of the theoretical framework, descriptive theory, in order to improve the ability to replicate the research's contribution to knowledge. A practical triangulation approach is used, which covers interviews and observations, literature review and (archival) document studies, project-based case studies, questionnaire surveys and a review of integrated systems used in construction and construction-related industries. Clarification of the organisational context and management delivery that influence organisational performance, product quality and measures is achieved. The results illustrate improved reliability of this research approach when interpreting real-world phenomena; cumulative results of the research can be applied with confidence under similar environments. This assisted the validity of the PhD research outcomes and strengthens the confidence with which cumulative research results can be applied under similar conditions in Built Environment research systems, which have been criticised for a lack of reliability in their approaches when interpreting real-world phenomena.

Keywords: case studies, descriptive theory, theoretical framework, qualitative and quantitative research

Procedia PDF Downloads 362
11136 Off-Policy Q-learning Technique for Intrusion Response in Network Security

Authors: Zheni S. Stefanova, Kandethody M. Ramachandran

Abstract:

With the increasing dependency on our computer devices, we face the necessity of adequate, efficient and effective mechanisms for protecting our networks. There are two main problems that Intrusion Detection Systems (IDS) attempt to solve: 1) to detect the attack, by analyzing the incoming traffic and inspecting the network (intrusion detection); and 2) to produce a prompt response when the attack occurs (intrusion prevention). It is critical to create an intrusion detection model that will detect a breach in the system on time, and it is also challenging to make it provide an automatic response, with an acceptable delay, at every single stage of the monitoring process. We cannot afford to adopt security measures that demand high computational power, and we cannot accept a mechanism that reacts with a delay. In this paper, we propose an intrusion response mechanism that is based on artificial intelligence, and more precisely, reinforcement learning techniques (RLT). The RLT will help us to create a decision agent that controls the process of interacting with the undetermined environment. The goal is to find an optimal policy, which will represent the intrusion response; therefore, we solve the reinforcement learning problem using a Q-learning approach. Our agent will produce an optimal immediate response in the process of evaluating the network traffic. This Q-learning approach will establish the balance between exploration and exploitation and provide a unique, self-learning and strategic artificial intelligence response mechanism for IDS.
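
A bare-bones tabular Q-learning loop with epsilon-greedy exploration illustrates the exploration/exploitation balance mentioned above. The states, actions and reward below are placeholders (e.g. coarse traffic classes and candidate responses), not the paper's actual IDS encoding.

```python
# Tabular Q-learning sketch for an intrusion-response agent; the environment
# and reward are toy placeholders.
import numpy as np

n_states, n_actions = 10, 4          # e.g. traffic profiles x {allow, rate-limit, block, alert}
alpha, gamma, eps = 0.1, 0.95, 0.1
Q = np.zeros((n_states, n_actions))
rng = np.random.default_rng(0)

def env_step(state, action):
    """Placeholder environment: returns (next_state, reward)."""
    reward = 1.0 if action == state % n_actions else -0.1
    return rng.integers(n_states), reward

state = rng.integers(n_states)
for _ in range(5000):
    # epsilon-greedy: explore with probability eps, otherwise exploit
    action = rng.integers(n_actions) if rng.random() < eps else int(Q[state].argmax())
    next_state, reward = env_step(state, action)
    # off-policy Q-learning update
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state

print("Greedy response per traffic state:", Q.argmax(axis=1))
```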

Keywords: cyber security, intrusion prevention, optimal policy, Q-learning

Procedia PDF Downloads 222
11135 Agriculture, Food Security and Poverty Reduction in Nigeria: Cointegration and Granger Causality Approach

Authors: Ogunwole Cecilia Oluwakemi, Timothy Ayomitunde Aderemi

Abstract:

The provision of sufficient food and the elimination of abject poverty have usually been the conventional benefits of agriculture in any society. Meanwhile, despite the fact that Nigeria is an agrarian society, food insecurity and poverty have become issues of concern among both scholars and policymakers in recent times. Against this backdrop, this study examined the nexus among agriculture, food security, and poverty reduction in Nigeria from 1990 to 2019 within the framework of the Cointegration and Granger Causality approach. Data was collected from the Central Bank of Nigeria Statistical Bulletin and the World Development Indicators, respectively. The following are the major results that emanated from the study. A long-run equilibrium relationship exists among agricultural value added, the food production index, and GDP per capita in Nigeria. Similarly, there is a unidirectional causality which flows from the food production index to poverty reduction in Nigeria. In the same vein, one-way causality flows from poverty reduction to agricultural value added in Nigeria. Consequently, this study makes the following recommendation for policymakers in Nigeria, and other African countries by extension: agricultural value added and food production are important variables that cannot be overlooked when poverty reduction occupies the central focus of policymakers. Therefore, any time these policymakers want to reduce poverty, policies that drive agricultural value added and food production should be embarked upon. This study thereby contributes to the literature by establishing the type of linkage that exists between agriculture, food security, and poverty reduction in Nigeria.
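
The two tests named in the title can be run with statsmodels as sketched below. The synthetic series stand in for the CBN/WDI data on agricultural value added, the food production index and GDP per capita, which are not reproduced here.

```python
# Illustrative Engle-Granger cointegration and Granger-causality tests with
# statsmodels on synthetic annual series (1990-2019 has 30 observations).
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import coint, grangercausalitytests

rng = np.random.default_rng(1)
n = 30
common = np.cumsum(rng.normal(size=n))        # shared stochastic trend
food_production = common + rng.normal(scale=0.3, size=n)
gdp_per_capita = 0.8 * common + rng.normal(scale=0.3, size=n)

# Cointegration: a small p-value suggests a long-run equilibrium relationship.
t_stat, p_value, _ = coint(gdp_per_capita, food_production)
print(f"Engle-Granger cointegration p-value: {p_value:.3f}")

# Granger causality: does the food production index help predict GDP per capita?
data = pd.DataFrame({"gdp": gdp_per_capita, "food": food_production})
grangercausalitytests(data[["gdp", "food"]], maxlag=2)
```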

Keywords: agriculture, value added, food production, GDP per capita, Nigeria

Procedia PDF Downloads 160
11134 In the Valley of the Shadow of Death: Gossip, God, and Scapegoating in Susannah, an American Opera by Carlisle Floyd

Authors: Shirl H. Terrell

Abstract:

In the telling of mythologies, stories of cultural and religious histories, the creative arts provide an archetypal lens through which the personal and collective unconscious are viewed, thus revealing mysteries of the unknown psyche. To that end, the author of this paper, using the hermeneutic approach, argues that Carlisle Floyd’s (1955) English-language opera Susannah illuminates humanity’s instinctual nature and behaviors through music, libretto, and drama. While impressive musical works such as Wagner’s Ring Cycle and Webber’s Phantom of the Opera have received extensive Jungian analyses, critics and scholars often ignore less esteemed works, such as Susannah, notwithstanding the fact that they have been consistently performed on the theater circuit. Such pieces, when given notice, allow viewers to grasp the soul-making depth and timeless quality of productions which may otherwise go unrecognized as culturally or psychologically significant. Although Susannah has sometimes been described as unsophisticated and simple in scope, the author demonstrates why Floyd’s 'little' opera, set in New Hope Valley, Appalachia, a cultural region in the Eastern United States known for its prevailing myths and distortions of isolation, temperament, and the judgmentally conservative behavior of its inhabitants, belongs among opera’s hallmark works. Its approach to powerful underlying archetypal themes, which give rise to the poignant and haunting depictions of the darker and destructive side of the human soul, the Shadow, lends crucial significance to the work. The Shadow’s manifestation in the form of the scapegoating complex is central to the plot of Susannah; the church’s meting out of rules, judgment, and reparation for sins points to the foreboding aspects of human behavior that evoke their intrinsic nature. The scapegoating complex is highlighted in an eight-step process gleaned from the works of Kenneth Burke and Rene Girard. In summary, through depth-psychological terms and mythological motifs, the author provides an insightful approach to perceiving instinctual behaviors as they play out in an American opera that has been staged over eight hundred times, yet, unfortunately, remains in the shadows. Susannah’s timelessness is now.

Keywords: archetypes, mythology, opera, scapegoating, Shadow, Susannah

Procedia PDF Downloads 141
11133 Development of a Data-Driven Method for Diagnosing the State of Health of Battery Cells, Based on the Use of an Electrochemical Aging Model, with a View to Their Use in Second Life

Authors: Desplanches Maxime

Abstract:

Accurate estimation of the remaining useful life of lithium-ion batteries for electronic devices is crucial. Data-driven methodologies encounter challenges related to data volume and acquisition protocols, particularly in capturing a comprehensive range of aging indicators. To address these limitations, we propose a hybrid approach that integrates an electrochemical model with state-of-the-art data analysis techniques, yielding a comprehensive database. Our methodology involves infusing an aging phenomenon into a Newman model, leading to the creation of an extensive database capturing various aging states based on non-destructive parameters. This database serves as a robust foundation for subsequent analysis. Leveraging advanced data analysis techniques, notably principal component analysis and t-Distributed Stochastic Neighbor Embedding, we extract pivotal information from the data. This information is harnessed to construct a regression function using either random forest or support vector machine algorithms. The resulting predictor demonstrates a 5% error margin in estimating remaining battery life, providing actionable insights for optimizing usage. Furthermore, the database was built from the Newman model calibrated for aging and performance using data from a European project called Teesmat. The model was then initialized numerous times with different aging values, for instance, with varying thicknesses of SEI (Solid Electrolyte Interphase). This comprehensive approach ensures a thorough exploration of battery aging dynamics, enhancing the accuracy and reliability of our predictive model. Of particular importance is our reliance on the database generated through the integration of the electrochemical model. This database serves as a crucial asset in advancing our understanding of aging states. Beyond its capability for precise remaining life predictions, this database-driven approach offers valuable insights for optimizing battery usage and adapting the predictor to various scenarios. This underscores the practical significance of our method in facilitating better decision-making regarding lithium-ion battery management.
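
A sketch of the regression step described above follows: aging indicators are reduced with PCA and a random forest predicts remaining useful life. The synthetic features stand in for the database generated from the aged Newman model (e.g. SEI thickness and other non-destructive indicators), which is not reproduced here, and the error metric is only illustrative of the reported ~5% margin.

```python
# Illustrative PCA + random forest pipeline for remaining-useful-life regression.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 12))                                  # synthetic aging indicators
rul = 2000 - 150 * X[:, 0] + 80 * X[:, 1] + rng.normal(scale=30, size=1000)  # cycles

X_tr, X_te, y_tr, y_te = train_test_split(X, rul, test_size=0.2, random_state=0)
model = make_pipeline(PCA(n_components=5),
                      RandomForestRegressor(n_estimators=200, random_state=0))
model.fit(X_tr, y_tr)
print(f"MAPE on held-out cells: {mean_absolute_percentage_error(y_te, model.predict(X_te)):.3f}")
```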

Keywords: Li-ion battery, aging, diagnostics, data analysis, prediction, machine learning, electrochemical model, regression

Procedia PDF Downloads 54
11132 Identifying Autism Spectrum Disorder Using Optimization-Based Clustering

Authors: Sharifah Mousli, Sona Taheri, Jiayuan He

Abstract:

Autism spectrum disorder (ASD) is a complex developmental condition involving persistent difficulties with social communication, restricted interests, and repetitive behavior. The challenges associated with ASD can interfere with an affected individual’s ability to function in social, academic, and employment settings. Although, to the best of our knowledge, there is no effective medication to treat ASD, early intervention can significantly improve an affected individual’s overall development. Hence, an accurate diagnosis of ASD at an early phase is essential. The use of machine learning approaches improves and speeds up the diagnosis of ASD. In this paper, we focus on the application of unsupervised clustering methods to ASD, as the large volume of ASD data generated through hospitals, therapy centers, and mobile applications has no pre-existing labels. We conduct a comparative analysis using seven clustering approaches - K-means, agglomerative hierarchical, model-based, fuzzy C-means, affinity propagation, self-organizing maps, and linear vector quantisation - as well as the recently developed optimization-based clustering (COMSEP-Clust) approach. We evaluate the performance of the clustering methods extensively on real-world ASD datasets encompassing different age groups: toddlers, children, adolescents, and adults. Our experimental results suggest that the COMSEP-Clust approach outperforms the other seven methods in recognizing ASD with well-separated clusters.
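
A sketch of the comparative clustering setup follows, using a few of the standard methods named above (COMSEP-Clust itself is not publicly reproduced here). The synthetic features stand in for the ASD screening attributes, and the silhouette score is used only as an example separation measure.

```python
# Illustrative comparison of standard clustering methods on synthetic data.
from sklearn.datasets import make_blobs
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans, AgglomerativeClustering, AffinityPropagation
from sklearn.mixture import GaussianMixture
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=600, centers=2, n_features=10, random_state=0)
X = StandardScaler().fit_transform(X)

methods = {
    "K-means": KMeans(n_clusters=2, n_init=10, random_state=0),
    "Agglomerative hierarchical": AgglomerativeClustering(n_clusters=2),
    "Gaussian mixture (model-based)": GaussianMixture(n_components=2, random_state=0),
    "Affinity propagation": AffinityPropagation(random_state=0),
}
for name, model in methods.items():
    labels = model.fit_predict(X)
    print(f"{name}: silhouette = {silhouette_score(X, labels):.3f}")
```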

Keywords: autism spectrum disorder, clustering, optimization, unsupervised machine learning

Procedia PDF Downloads 97
11131 Object-Based Flow Physics for Aerodynamic Modelling in Real-Time Environments

Authors: William J. Crowther, Conor Marsh

Abstract:

Object-based flow simulation allows fast computation of arbitrarily complex aerodynamic models made up of simple objects with limited flow interactions. The proposed approach is universally applicable to objects made from arbitrarily scaled ellipsoid primitives at arbitrary aerodynamic attitude and angular rate. The use of a component-based aerodynamic modelling approach increases efficiency by allowing selective inclusion of different physics models at run-time and allows extensibility through the development of new models. Insight into the numerical stability of the model under first order fixed-time step integration schemes is provided by stability analysis of the drag component. The compute cost of model components and functions is evaluated and compared against numerical benchmarks. Model static outputs are verified against theoretical expectations and dynamic behaviour using falling plate data from the literature. The model is applied to a range of case studies to demonstrate the efficacy of its application in extensibility, ease of use, and low computational cost. Dynamically complex multi-body systems can be implemented in a transparent and efficient manner, and we successfully demonstrate large scenes with hundreds of objects interacting with diverse flow fields.
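
To make the stability remark concrete, consider the simplified case of a linear drag force F = -c·v integrated with a first-order fixed-time-step (explicit Euler) scheme: the update is stable only for Δt < 2m/c. The short sketch below is an illustration of that bound, not the paper's ellipsoid-primitive model, and the mass and drag coefficient are arbitrary example values.

```python
# Illustration of the fixed-time-step stability bound for a (linearized) drag
# component. For F = -c*v, explicit Euler gives v_{k+1} = (1 - c*dt/m) * v_k,
# which decays only if dt < 2*m/c.
m, c = 2.0, 5.0                    # mass [kg], linear drag coefficient [kg/s]
print(f"Explicit Euler stable for dt < {2.0 * m / c:.2f} s")

def simulate(dt, steps=20, v0=10.0):
    v = v0
    for _ in range(steps):
        v += dt * (-c * v / m)     # first-order fixed-time-step update
    return v

print("dt = 0.5 s ->", simulate(0.5))   # inside the bound: velocity decays
print("dt = 1.0 s ->", simulate(1.0))   # beyond the bound: solution diverges
```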

Keywords: aerodynamics, real-time simulation, low-order model, flight dynamics

Procedia PDF Downloads 82
11130 Partnership Brokering as a Driver of Social Business

Authors: Lani Fraizer, Faiz Shah

Abstract:

Extreme poverty continues to plague the world. Forty-seven million people live well below the poverty line in Bangladesh, enduring poor quality of life, often with no access to basic human needs like shelter and healthcare. It is not surprising that poverty eradication is central to the mission of social change makers, such as Muhammad Yunus, who have demonstrated how enterprise-led development initiatives empower individuals at the grassroots, and can galvanize entire communities to emerge out of poverty. Such strategies call for system-wide change, and like a number of systems leaders, social business champions have typically challenged the status quo, and broken out of silos to catalyze vibrant multi-stakeholder partnerships across sectors. Apart from individual charisma, social change makers succeed because they garner collaborative impact through socially beneficial partnerships. So while enterprise-led social development evolves in scope and complexity, in step with the need to create and sustain partnerships, Partnership Brokering is emerging as an approach to facilitate collaborative processes. As such, it may now be possible for anyone motivated by the idea of social business to acquire the skills and sophistication necessary for building enriching partnerships that harness the power of the market to address poverty. This paper examines dimensions of partnership brokering in the context of social business, and explores the implications of this emerging approach on fostering poverty eradication.

Keywords: poverty, social business, partnership brokering, social entrepreneurship, systems change, enterprise-led development, change making

Procedia PDF Downloads 238
11129 New Variational Approach for Contrast Enhancement of Color Image

Authors: Wanhyun Cho, Seongchae Seo, Soonja Kang

Abstract:

In this work, we propose a variational technique for image contrast enhancement which utilizes global and local information around each pixel. The energy functional is defined by a weighted linear combination of three terms: a local contrast term, a global contrast term, and a dispersion term. The first is a local contrast term that improves the contrast of an input image by increasing the grey-level differences between each pixel and its neighbors, thereby utilizing contextual information around each pixel. The second is a global contrast term, which enhances the contrast of the image by minimizing the difference between its empirical distribution function and a target cumulative distribution function, so that the probability distribution of pixel values becomes symmetric about the median. The third is a dispersion term that controls the departure between the new pixel values and the pixel values of the original image, preserving the original image characteristics as far as possible. Second, we derive the Euler-Lagrange equation for the true image that achieves the minimum of the proposed functional, using the fundamental lemma of the calculus of variations. We then consider how this equation can be solved using a gradient descent method, which is one of the dynamic approximation techniques. Finally, by conducting various experiments, we demonstrate that the proposed method can enhance the contrast of colour images better than existing techniques.
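
A simplified two-term illustration of the variational idea is sketched below: gradient descent increases the difference between each pixel and its local mean (local contrast) while penalising departure from the original image (dispersion). The paper's full three-term functional, its exact Euler-Lagrange equation and the colour handling are not reproduced, and treating the local mean as fixed at each iteration is a simplification made for the sketch.

```python
# Simplified two-term gradient-descent illustration (local contrast + dispersion).
import numpy as np
from scipy.ndimage import uniform_filter

def enhance(f, alpha=1.0, gamma=2.0, tau=0.1, iters=100):
    u = f.astype(float).copy()
    for _ in range(iters):
        local_mean = uniform_filter(u, size=5)            # treated as fixed per step
        # gradient of E(u) = -alpha*sum((u - mean)^2) + gamma*sum((u - f)^2)
        grad = -2.0 * alpha * (u - local_mean) + 2.0 * gamma * (u - f)
        u = np.clip(u - tau * grad, 0.0, 1.0)              # gradient descent step
    return u

f = np.random.default_rng(0).random((64, 64)) * 0.4 + 0.3  # low-contrast test image
u = enhance(f)
print("std before:", round(float(f.std()), 3), "std after:", round(float(u.std()), 3))
```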

Keywords: color image, contrast enhancement technique, variational approach, Euler-Lagrange equation, dynamic approximation method, EME measure

Procedia PDF Downloads 435
11128 Simple Infrastructure in Measuring Countries e-Government

Authors: Sukhbaatar Dorj, Erdenebaatar Altangerel

Abstract:

As an alternative to existing e-government measuring models, we propose a new customer-centric, service-oriented, simple approach for measuring countries' e-governments. If successfully implemented, the built infrastructure will provide a single e-government index number for each country. The main schema is as follows. At the beginning of each year, the country CIO, or a government official of equivalent position, will provide four numbers on behalf of their country to a dedicated United Nations web site: 1) the ratio of available online public services to the total number of public services; 2) the ratio of interagency, inter-ministry online public services to the total number of available online public services; 3) the ratio of the total number of citizens and business entities served online annually to the total number of citizens and business entities served annually, online and physically, on those services; 4) a simple index for the geographical spread of citizens and business entities served online. The four numbers are then combined into one index number by a simple average. In addition to the four numbers, a fifth number can be introduced as a service quality indicator for online public services. If countries' index numbers are equal in the ordering, the fifth criterion will be used. Note: this approach is for assessing a country's current e-government achievement, not its e-government readiness.
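
The proposed schema is simple enough to transcribe directly: four ratios averaged into a single index, with the optional fifth service-quality number used only as a tie-breaker. The country values below are illustrative inputs, not real data.

```python
# Direct transcription of the proposed schema with illustrative inputs.
def egov_index(online_ratio, interagency_ratio, online_usage_ratio, geo_spread_index,
               service_quality=None):
    numbers = [online_ratio, interagency_ratio, online_usage_ratio, geo_spread_index]
    index = sum(numbers) / len(numbers)          # simple average of the four numbers
    return index, service_quality                # quality kept separately for tie-breaks

countries = {
    "A": egov_index(0.80, 0.40, 0.55, 0.70, service_quality=0.6),
    "B": egov_index(0.75, 0.50, 0.55, 0.65, service_quality=0.8),
}
# Rank by index first, then by the fifth (quality) criterion when indices tie.
ranking = sorted(countries.items(), key=lambda kv: (kv[1][0], kv[1][1] or 0.0), reverse=True)
print(ranking)
```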

Keywords: countries e-government index, e-government, infrastructure for measuring e-government, measuring e-government

Procedia PDF Downloads 315
11127 An Evaluation of Drivers in Implementing Sustainable Manufacturing in India: Using DEMATEL Approach

Authors: D. Garg, S. Luthra, A. Haleem

Abstract:

Due to growing concern about environmental and social consequences throughout the world, a need has been felt to incorporate sustainability concepts into conventional manufacturing. This paper is an attempt to identify and evaluate drivers for implementing sustainable manufacturing in the Indian context. Nine possible drivers for the successful implementation of sustainable manufacturing have been identified from an extensive review. Further, the Decision Making Trial and Evaluation Laboratory (DEMATEL) approach has been utilized to evaluate and categorize these identified drivers for implementing sustainable manufacturing into cause and effect groups. Five drivers (Societal Pressure and Public Concerns; Regulations and Government Policies; Top Management Involvement, Commitment and Support; Effective Strategies and Activities towards Socially Responsible Manufacturing; and Market Trends) have been categorized into the cause group, and four drivers (Holistic View in Manufacturing Systems; Supplier Participation; Building a Sustainable Culture in the Organization; and Corporate Image and Benefits) have been categorized into the effect group. “Societal Pressure and Public Concerns” has been found to be the most critical driver and “Corporate Image and Benefits” the least critical, or most easily influenced, driver for implementing sustainable manufacturing in the Indian context. This paper may help practitioners better understand these drivers and their priorities towards the effective implementation of sustainable manufacturing.
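
The standard DEMATEL computation behind this cause/effect split is short: normalize the expert direct-influence matrix, form the total-relation matrix T = X(I - X)^-1, and classify each driver by the sign of D - R. The 4x4 matrix below is an illustrative example, not the paper's 9x9 matrix for the identified drivers.

```python
# Standard DEMATEL computation on an illustrative direct-influence matrix
# (expert scores 0-4).
import numpy as np

A = np.array([[0, 3, 2, 1],
              [2, 0, 3, 2],
              [1, 2, 0, 3],
              [1, 1, 2, 0]], dtype=float)

# Normalize the direct-relation matrix.
X = A / max(A.sum(axis=1).max(), A.sum(axis=0).max())
# Total-relation matrix T = X (I - X)^-1
T = X @ np.linalg.inv(np.eye(len(A)) - X)

D = T.sum(axis=1)            # influence dispatched by each driver
R = T.sum(axis=0)            # influence received by each driver
for i, (prominence, relation) in enumerate(zip(D + R, D - R)):
    group = "cause" if relation > 0 else "effect"
    print(f"driver {i+1}: D+R = {prominence:.2f}, D-R = {relation:+.2f} -> {group} group")
```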

Keywords: drivers, decision making trial and evaluation laboratory (DEMATEL), India, sustainable manufacturing

Procedia PDF Downloads 371
11126 Teachers' and Learners' Experiences of Learners' Writing in English First Additional Language

Authors: Jane-Francis A. Abongdia, Thandiswa Mpiti

Abstract:

There is an international concern to develop children’s literacy skills. In many parts of the world, fluency in a second language is essential for gaining meaningful access to education, the labour market and broader social functioning. In spite of these efforts, the problem still continues. The level of English language proficiency is far from satisfactory, and these goals remain unattainable for many. The issue is more complex in South Africa, as learners are immersed in a second language (L2) curriculum. South Africa is a prime example of a country facing the dilemma of how to effectively equip a majority of its population with English as a second language or first additional language (FAL). Given the multilingual nature of South Africa, with eleven official languages, and the position and power of English, the study investigates teachers’ and learners’ experiences of isiXhosa- and Afrikaans-background learners’ writing in English First Additional Language (EFAL). Moreover, possible causes of writing difficulties and teachers’ practices for writing are explored. The theoretical and conceptual framework for the study is provided by studies on constructivist theories and sociocultural theories. In exploring these issues, a qualitative approach through semi-structured interviews, classroom observations, and document analysis was adopted. The data are analysed through critical discourse analysis (CDA). The study identified a weak correlation between teachers’ beliefs and their actual teaching practices. Although the teachers believe that writing is as important as listening, speaking, reading, grammar and vocabulary, and that it needs regular practice, the data reveal that they fail to put their beliefs into practice. Moreover, the data revealed that learners were influenced by their home language: when they did not know a word, they would write either the isiXhosa or the Afrikaans equivalent. Code-switching seems to have instilled a sense of “dependence on translations”, where some learners would not even try to answer English questions but would wait for the teacher to translate the questions into isiXhosa or Afrikaans before they could attempt to give answers. The findings of the study show a marked improvement in the writing performance of learners who used the process approach in writing. These findings demonstrate the need to assist teachers to shift away from focusing only on learners’ performance (testing and grading) towards a stronger emphasis on the process of writing. The study concludes that the process approach to writing could enable teachers to focus on the various parts of the writing process, which can give learners more freedom to experiment with their language proficiency. It would require that teachers develop a deeper understanding of the process/genre approaches to teaching writing advocated by CAPS. All in all, the study shows that both learners and teachers face numerous challenges relating to writing. This means that more work still needs to be done in this area. The present study argues that teachers teaching EFAL learners should approach writing as a critical and core aspect of learners’ education. Learners should be exposed to intensive writing activities throughout their school years.

Keywords: constructivism, English second language, language of learning and teaching, writing

Procedia PDF Downloads 206
11125 Optimal Energy Management and Environmental Index Optimization of a Microgrid Operating by Renewable and Sustainable Generation Systems

Authors: Nabil Mezhoud

Abstract:

The economic operation of electric energy generating systems is one of the predominant problems in energy systems. Due to the need for better reliability, high energy quality, lower losses, lower cost and a clean environment, the application of renewable and sustainable energy sources, such as wind and solar energy, has become more widespread in recent years. In this work, a bio-inspired meta-heuristic algorithm modelled on the flashing behaviour of fireflies at night, the Firefly Algorithm (FFA), is applied to solve the Optimal Energy Management (OEM) and Environmental Index (EI) problems of a micro-grid (MG) supplied by Renewable and Sustainable Generation Systems (RSGS). Our main goal is to minimize the nonlinear objective function of an electrical microgrid while taking equality and inequality constraints into account. The FFA approach was examined and tested on a standard MG system composed of different types of RSGS, such as wind turbines (WT) and photovoltaic systems (PV), non-renewable sources, such as fuel cells (FC), micro-turbines (MT) and diesel generators (DEG), and loads with energy storage systems (ESS). The results are promising and show the effectiveness and robustness of the proposed approach in solving the OEM and EI problems. The results of the proposed method have been compared with, and validated against, those of recently published references.
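
To illustrate the kind of solver described above, the following is a minimal sketch of a Firefly Algorithm applied to a toy single-period dispatch problem. The cost coefficients, generator bounds, demand level and penalty weight are hypothetical placeholders rather than values from the paper, and the equality constraint is handled with a simple penalty term rather than the authors' exact formulation.

```python
import numpy as np

def firefly_minimize(cost, lb, ub, n_fireflies=25, n_iter=200,
                     alpha=0.25, beta0=1.0, gamma=1.0, seed=0):
    """Minimize `cost` over the box [lb, ub] with a basic Firefly Algorithm."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    x = rng.uniform(lb, ub, size=(n_fireflies, dim))    # initial firefly positions
    f = np.array([cost(xi) for xi in x])                # lower cost = brighter firefly

    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if f[j] < f[i]:                          # move i towards the brighter firefly j
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)   # attractiveness decays with distance
                    step = alpha * (rng.random(dim) - 0.5)
                    x[i] = np.clip(x[i] + beta * (x[j] - x[i]) + step, lb, ub)
                    f[i] = cost(x[i])
        alpha *= 0.97                                    # gradually reduce the random walk
    best = int(np.argmin(f))
    return x[best], f[best]

# Toy dispatch problem: share a 100 kW load among four units (e.g. PV, WT, MT, DEG)
# with made-up quadratic operating-cost curves; the power-balance equality
# constraint is enforced through a penalty term.
def dispatch_cost(p, demand=100.0):
    a = np.array([0.01, 0.02, 0.08, 0.10])   # hypothetical quadratic coefficients
    b = np.array([0.50, 0.60, 2.00, 2.50])   # hypothetical linear coefficients
    fuel = np.sum(a * p ** 2 + b * p)
    penalty = 1e3 * abs(np.sum(p) - demand)
    return fuel + penalty

p_opt, c_opt = firefly_minimize(dispatch_cost, lb=[0, 0, 0, 0], ub=[60, 60, 40, 40])
print(p_opt.round(2), round(c_opt, 2))
```

In practice the objective would also carry the emission-related terms of the environmental index and the inequality limits of each generator, but the attraction-and-move loop above is the core of the FFA.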

Keywords: renewable energy sources, energy management, distributed generator, micro-grids, firefly algorithm

Procedia PDF Downloads 50
11124 Semantic-Based Collaborative Filtering to Improve Visitor Cold Start in Recommender Systems

Authors: Baba Mbaye

Abstract:

In collaborative filtering recommendation systems, a user receives suggested items based on the opinions and evaluations of a community of users. This type of recommendation system uses only the information (ratings as numerical values) contained in a usage matrix as input data. This matrix can be constructed from users' behaviour or by inviting users to declare their opinions on the items they know. The cold start problem leads to very poor performance for new users: it occurs at the beginning of use, when the system lacks the data needed to make recommendations. There are three types of cold start problem: cold start for a new item, for a new system, and for a new user. In this article, we are interested in the cold start for a new user. When the system welcomes a new user, the profile exists but does not hold enough data, and its links to other user profiles are still unknown; this leads to recommendations that are not adapted to the new user's profile. In this paper, we propose an approach that mitigates cold start by exploiting similarity and semantic proximity between user profiles. We use the available cold metadata (metadata extracted from the new user's data), which is useful for positioning the new user within a community; the aim is to look for similarities and semantic proximities with the existing user profiles of the system. Proximity is represented by close concepts considered to belong to the same group, while similarity groups together elements that appear alike; similarity and proximity are therefore related but distinct notions. We construct the similarity measure on the basis of: a) concepts (properties, terms, instances) independent of the ontology structure, and b) the simultaneous representation of the two concepts (relations, presence of terms in a document, simultaneous presence of authorities). We propose an ontology, OIVCSRS (Ontology of Improvement Visitor Cold Start in Recommender Systems), to structure the terms and concepts representing the meaning of an information field, whether through the metadata of a namespace or the elements of a knowledge domain. This approach allows us to attach the new user automatically to a user community, partially compensate for the data that was not initially provided, and ultimately associate a better first profile with the cold start. The aim of this paper is thus to propose an approach to improving cold start using semantic technologies.
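
As a rough illustration of the matching step described above, the sketch below attaches a new visitor to the most semantically similar existing profiles using a simple cosine similarity over bags of concept terms. The profiles, concept terms and the choice of cosine similarity are illustrative assumptions; the paper's actual measure combines ontology-independent concepts with co-occurrence relations through the OIVCSRS ontology, which is not reproduced here.

```python
from collections import Counter
import math

def cosine_sim(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bags of concept terms."""
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

# Hypothetical concept bags extracted from profile metadata (stand-ins for the
# ontology concepts that OIVCSRS would supply).
existing_profiles = {
    "u1": Counter({"jazz": 3, "vinyl": 1, "concert": 2}),
    "u2": Counter({"sci-fi": 4, "cinema": 2}),
    "u3": Counter({"jazz": 2, "blues": 3}),
}
new_visitor = Counter({"jazz": 2, "blues": 1})   # cold metadata of the new user

# Attach the visitor to the most semantically similar existing profiles and use
# that neighbourhood to bootstrap a first profile for recommendations.
ranked = sorted(existing_profiles,
                key=lambda u: cosine_sim(new_visitor, existing_profiles[u]),
                reverse=True)
neighbourhood = ranked[:2]
print(neighbourhood)   # ['u3', 'u1']
```

Once the neighbourhood is known, the usual collaborative filtering machinery can predict ratings for the new user from the ratings of those neighbours.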

Keywords: visitor cold start, recommender systems, collaborative filtering, semantic filtering

Procedia PDF Downloads 206
11123 Evaluating the Nexus between Energy Demand and Economic Growth Using the VECM Approach: Case Study of Nigeria, China, and the United States

Authors: Rita U. Onolemhemhen, Saheed L. Bello, Akin P. Iwayemi

Abstract:

The effectiveness of energy demand policy depends on identifying the key drivers of energy demand in both the short run and the long run. This paper examines the influence of regional differences on the link between energy demand and other explanatory variables for Nigeria, China and the USA using the Vector Error Correction Model (VECM) approach. The study employs annual time-series data on energy consumption (ED), real gross domestic product per capita (RGDP), real energy prices (P) and urbanization (N) over a thirty-six-year sample period, sourced from the World Bank's World Development Indicators (WDI, 2016) and the US Energy Information Administration (EIA). The results show that all the independent variables (income, urbanization and price) substantially affect long-run energy consumption in Nigeria, the USA and China, whereas income has no significant effect on short-run energy demand in the USA and Nigeria. In addition, the long-run effect of urbanization is relatively stronger in China. Since urbanization is a key factor in energy demand, it is recommended that more attention be given to the development of rural communities, so as to reduce the inflow of migrants into urban areas that drives up energy demand, and that excessive energy use be penalized while sound energy management is incentivized.
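
A minimal sketch of the estimation step, assuming annual series already in logs and using the VECM implementation in statsmodels, is shown below. The random-walk data are placeholders so the snippet runs; in the study the WDI/EIA series for each country would be loaded in their place.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank, select_order

# Placeholder frame: 36 annual observations of log energy demand, log real GDP
# per capita, log real energy prices and urbanization for one country.
rng = np.random.default_rng(1)
n = 36
df = pd.DataFrame({
    "ln_ED":   np.cumsum(rng.normal(0.02, 0.05, n)),
    "ln_RGDP": np.cumsum(rng.normal(0.03, 0.04, n)),
    "ln_P":    np.cumsum(rng.normal(0.00, 0.06, n)),
    "ln_N":    np.cumsum(rng.normal(0.01, 0.01, n)),
}, index=pd.period_range("1980", periods=n, freq="Y"))

lag = select_order(df, maxlags=2, deterministic="ci").aic        # lag length by AIC
rank = select_coint_rank(df, det_order=0, k_ar_diff=lag).rank    # Johansen trace test

res = VECM(df, k_ar_diff=lag, coint_rank=max(rank, 1), deterministic="ci").fit()
print(res.summary())   # long-run (beta) and adjustment (alpha) coefficients
```

The long-run elasticities reported in the paper correspond to the cointegrating (beta) vector, while the short-run effects come from the lagged-difference and adjustment (alpha) coefficients.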

Keywords: economic growth, energy demand, income, real GDP, urbanization, VECM

Procedia PDF Downloads 294
11122 Six Sigma-Based Optimization of Shrinkage Accuracy in Injection Molding Processes

Authors: Sky Chou, Joseph C. Chen

Abstract:

This paper focuses on using Six Sigma methodologies to reach the desired shrinkage of a high-density polyethylene (HDPE) part produced on an injection molding machine. It presents a case study where the correct shrinkage is required to reduce or eliminate defects and to improve the process capability indices Cp and Cpk of an injection molding process. To improve this process and keep the product within specifications, the Six Sigma define, measure, analyze, improve, and control (DMAIC) approach was implemented in this study. The Six Sigma approach was paired with the Taguchi methodology to identify the optimized processing parameters that keep the shrinkage rate within the specifications set by our customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors are cooling time, melt temperature, holding time, and metering stroke; the noise factor is the difference between material brand 1 and material brand 2. After the confirmation run was completed, measurements verified that the new parameter settings are optimal, and with these settings the process capability indices improved dramatically. The purpose of this study is to show that the Six Sigma and Taguchi methodologies can be used efficiently to determine the important factors that improve the process capability of the injection molding process.
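
As an illustration of the analysis stage, the sketch below computes nominal-the-best signal-to-noise ratios for a standard L9(3^4) array and averages them by factor level. The shrinkage readings and the assignment of factors to array columns are hypothetical placeholders, not the study's data, and nominal-the-best is only one common choice of S/N ratio when a target shrinkage is specified.

```python
import numpy as np
import pandas as pd

# Standard L9(3^4) orthogonal array: 9 runs, four three-level factors
# (A: cooling time, B: melt temperature, C: holding time, D: metering stroke).
L9 = np.array([[1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
               [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
               [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1]])

# Hypothetical shrinkage readings (%) per run, replicated under the two material
# brands that act as the noise factor (values are illustrative only).
y = np.array([[2.10, 2.18], [1.95, 2.02], [1.88, 1.97],
              [2.05, 2.11], [1.92, 1.99], [1.85, 1.90],
              [2.00, 2.08], [1.90, 1.96], [1.83, 1.89]])

# Nominal-the-best S/N ratio: eta = 10*log10(mean^2 / variance); a higher eta
# means shrinkage is on target and insensitive to the brand-to-brand noise.
mean = y.mean(axis=1)
var = y.var(axis=1, ddof=1)
sn = 10 * np.log10(mean ** 2 / var)

# Average S/N per factor level; the level with the highest mean S/N is preferred.
factors = ["cooling_time", "melt_temp", "holding_time", "metering_stroke"]
response = pd.DataFrame({f: [sn[L9[:, i] == lvl].mean() for lvl in (1, 2, 3)]
                         for i, f in enumerate(factors)},
                        index=["level 1", "level 2", "level 3"])
print(response.round(2))
```

The level combination with the highest mean S/N per factor would then be verified in a confirmation run and used to recompute Cp and Cpk.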

Keywords: injection molding, shrinkage, six sigma, Taguchi parameter design

Procedia PDF Downloads 159