Search results for: solving Sudoku puzzles
421 Exploring Psychosocial Factors That Enable Teachers to Cope with Workplace Adversity at a Rural District School Setting
Authors: K. R. Mukuna
Abstract:
Teachers face many challenges in South African rural schools, such as stress, depression, lack of resources, poor working relationships, and an inflexible curriculum. These can affect their wellbeing and effectiveness at the workplace. The study is significant for teachers' lives and the community, because teachers work under conditions that are unfavourable for performing their jobs effectively; despite these conditions, they still manage to do their jobs, and the community is uplifted. This study aimed to explore the factors that enable teachers to cope with workplace adversities at a rural school district in the Free State Province. It adopted a qualitative case study as a research design. Semi-structured interviews and collages were employed as data collection tools. Ten participants (n=10; 5 males and 5 females) were selected through purposive and convenience sampling. All participants were selected from a South African rural school; Sesotho was their home language, and most of them had 5 years of teaching experience. The thematic findings revealed that they developed abilities to cope with and adjust to the social and cultural environment. These included self-efficacy, developing problem-solving skills, awareness of strengths and assets, self-management of emotions, and self-confidence. This study concluded that these psychosocial factors contributed to coping with teachers' adversities and effectively stabilized their wellbeing in the schools.
Keywords: psychosocial factors, teacher counselling, teacher stress, workplace adversity, rural school, teachers' wellbeing, teachers' resilience, teachers' self-efficacy, social interaction
Procedia PDF Downloads 127
420 Supplier Selection Using Sustainable Criteria in Sustainable Supply Chain Management
Authors: Richa Grover, Rahul Grover, V. Balaji Rao, Kavish Kejriwal
Abstract:
Selection of suppliers is a crucial problem in supply chain management, and sustainable supplier selection is an even bigger challenge for organizations. Environmental protection and social problems have been of growing concern to society in recent years, yet traditional supplier selection does not consider these factors; this research work therefore focuses on introducing sustainable criteria into the structure of supplier selection. Sustainable Supply Chain Management (SSCM) is the management and administration of material, information, and money flows, as well as coordination among businesses along the supply chain. All three dimensions of sustainable development - economic, environmental, and social - need to be taken care of. The purpose of this research is to maximize supply chain profitability, maximize the social wellbeing of the supply chain, and minimize environmental impacts. The problem statement is the selection of suppliers in a sustainable supply chain network by ranking the suppliers against the sustainable criteria identified. The aim of this research is twofold: to find out which sustainable parameters can be applied to the supply chain, and to determine how these parameters can effectively be used in supplier selection. Multi-criteria decision making (MCDM) tools are used to rank both criteria and suppliers. AHP analysis is used to find ratings for the criteria identified; it is a technique for efficient decision making. TOPSIS is used to rate and then rank the suppliers. TOPSIS is an MCDM method based on the principle that the chosen option should have the maximum distance from the negative ideal solution (NIS) and the minimum distance from the positive ideal solution.
Keywords: sustainable supply chain management, sustainable criteria, MCDM tools, AHP analysis, TOPSIS method
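The TOPSIS ranking step described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the decision matrix, weights, and the assumption that all criteria are benefit-type (higher is better) are hypothetical.

```python
import numpy as np

def topsis(decision, weights):
    """Rank alternatives by closeness to the positive ideal solution."""
    # Vector-normalize each criterion column, then apply the criterion weights.
    v = decision / np.linalg.norm(decision, axis=0) * weights
    # Assuming all criteria are benefit-type: the positive ideal solution (PIS)
    # takes the column maxima, the negative ideal solution (NIS) the minima.
    pis, nis = v.max(axis=0), v.min(axis=0)
    d_pos = np.linalg.norm(v - pis, axis=1)
    d_neg = np.linalg.norm(v - nis, axis=1)
    # Closeness coefficient: larger = farther from the NIS, closer to the PIS.
    return d_neg / (d_pos + d_neg)

# Hypothetical scores of four suppliers on three sustainable criteria.
scores = topsis(np.array([[7., 9., 9.],
                          [8., 7., 8.],
                          [9., 6., 8.],
                          [6., 7., 8.]]),
                np.array([0.4, 0.3, 0.3]))
ranking = np.argsort(-scores)  # supplier indices, best first
```

The criterion weights fed in here would come from the AHP step mentioned in the abstract.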
Procedia PDF Downloads 325
419 A New Multi-Target, Multi-Agent Search and Rescue Path Planning Approach
Authors: Jean Berger, Nassirou Lo, Martin Noel
Abstract:
Perfectly suited for natural or man-made emergency and disaster management situations such as floods, earthquakes, tornadoes, or tsunamis, multi-target search path planning for a team of rescue agents is known to be computationally hard, and most techniques developed so far fall short of successfully estimating the optimality gap. A novel mixed-integer linear programming (MIP) formulation is proposed to optimally solve the multi-target multi-agent discrete search and rescue (SAR) path planning problem. Aimed at maximizing the cumulative probability of successful target detection, it captures anticipated feedback information associated with possible observation outcomes resulting from projected path execution, while modeling agent discrete actions over all possible moving directions. Problem modeling further takes advantage of a network representation to encompass decision variables, expedite compact constraint specification, and lead to substantial problem-solving speed-up. The proposed MIP approach uses the CPLEX optimization machinery, efficiently computing near-optimal solutions for practical-size problems, while giving a robust upper bound obtained from Lagrangian relaxation of the integrality constraints. Should a target eventually be positively detected during plan execution, a new problem instance would simply be reformulated from the current state and then solved over the next decision cycle. A computational experiment shows the feasibility and the value of the proposed approach.
Keywords: search path planning, search and rescue, multi-agent, mixed-integer linear programming, optimization
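The objective the MIP maximizes, the cumulative probability of successful target detection, can be illustrated with a tiny single-agent example. This brute-force enumeration is only a stand-in for the paper's MIP/CPLEX machinery; the grid, the prior, and the glimpse probability are all hypothetical.

```python
import itertools

# Hypothetical prior on the target's location over a 3x3 grid.
PRIOR = {(r, c): p for (r, c), p in zip(
    itertools.product(range(3), range(3)),
    [0.05, 0.05, 0.10, 0.05, 0.10, 0.20, 0.10, 0.15, 0.20])}
GLIMPSE = 0.6  # probability one glimpse in the target's cell detects it
MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0)]

def success_probability(path):
    """Sum over cells of the prior mass times the chance that at least
    one of the glimpses taken there succeeds."""
    visits = {}
    for cell in path:
        visits[cell] = visits.get(cell, 0) + 1
    return sum(p * (1 - (1 - GLIMPSE) ** visits.get(cell, 0))
               for cell, p in PRIOR.items())

def best_path(start, horizon):
    """Enumerate every move sequence and keep the most promising path."""
    best = ([start], success_probability([start]))
    for seq in itertools.product(MOVES, repeat=horizon):
        path, (r, c) = [start], start
        for dr, dc in seq:
            r, c = r + dr, c + dc
            if not (0 <= r < 3 and 0 <= c < 3):
                break  # move sequence leaves the grid: discard it
            path.append((r, c))
        else:
            prob = success_probability(path)
            if prob > best[1]:
                best = (path, prob)
    return best

path, prob = best_path((0, 0), horizon=4)
```

The MIP formulation scales this idea to multiple agents and longer horizons, where enumeration is infeasible.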
Procedia PDF Downloads 371
418 Interdisciplinary Collaborative Innovation Mechanism for Sustainability Challenges
Authors: C. Park, H. Lee, Y-J. Lee
Abstract:
Aim: This study presents an Interdisciplinary Collaborative Innovation Mechanism as a medium to enable the effective generation of innovations for the sustainability challenges facing humanity. Background: An interdisciplinary approach of fusing disparate knowledge and perspectives from diverse expertise and subject areas is one of the key requirements for addressing the intricate nature of sustainability issues. To date, there has been a lack of rigorous empirical study of the systematic structure of interdisciplinary collaborative innovation for sustainability. Method: To address this research gap, an action research approach is adopted to develop the Interdisciplinary Collaborative Innovation Mechanism (ICIM) framework, based on an empirical study of a total of 28 open innovation competitions in the format of MAKEathons between 2016 and 2023. First, the conceptual framework was formulated based on the literature findings, and the framework was subsequently tested and iterated. Outcomes: The findings provide the ICIM framework, composed of five elements: Discipline Diversity Quadruple; Systematic Structure; Inspirational Stimuli; Supportive Collaboration Environment; and Hardware and Intellectual Support. The framework offers a discussion of the key elements when attempting to facilitate interdisciplinary collaboration for sustainability innovation. Contributions: This study contributes to the two burgeoning areas of sustainable development and open innovation studies by articulating a concrete structure that bridges the gap between them. In practice, the framework helps facilitate effective innovation processes and positive social and environmental impact for real-world sustainability challenges.
Keywords: action research, interdisciplinary collaboration, open innovation, problem-solving, sustainable development, sustainability challenges
Procedia PDF Downloads 247
417 Challenges in Curriculum Development in Eastern European Countries: A Case Study of Georgia and Ukraine
Authors: Revaz Tabatadze
Abstract:
This research aims to describe and analyze the intricacies of curriculum development within the broader context of general education reforms undertaken in Eastern European Countries. Importantly, this study is the first of its kind, examining Georgian and Ukrainian National Curriculum documents locally and internationally. The significance of this research lies in its potential to guide the Ministry of Education and Science of the mentioned countries in revising existing curriculum documents to address contemporary challenges in general education. The findings will not only benefit post-Soviet countries but also offer insights for nations facing curriculum development and effectiveness issues. By examining the peculiarities of curriculum development amid globalization, this research aims to contribute to overcoming educational challenges at both local and international levels. This study defines key concepts related to curriculum, distinguishing between intended, implemented, and attained curricula. It also explores the historical context of curriculum development in Georgia and Ukraine from 1991 to 2021, highlighting changes in teacher standards and teacher certification examinations. The literature review section emphasizes the importance of curriculum development as a complex and evolving process, especially in the context of globalization. It underscores the need for a curriculum that fosters critical thinking, problem-solving, and collaboration skills in students. In summary, this research offers a comprehensive examination of curriculum development in Georgia and Ukraine, shedding light on the challenges and opportunities in the age of globalization, with potential implications for educational systems worldwide.
Keywords: curriculum development, general education reforms, eastern European countries, globalization in education
Procedia PDF Downloads 64
416 The Bayesian Premium Under Entropy Loss
Authors: Farouk Metiri, Halim Zeghdoudi, Mohamed Riad Remita
Abstract:
Credibility theory is an experience rating technique in actuarial science and one of the quantitative tools that allow insurers to perform experience rating, that is, to adjust future premiums based on past experience. It is usually used in automobile insurance, workers' compensation premiums, and IBNR (incurred but not reported) claims, where credibility theory can be used to estimate the claim size amount. In this study, we focus on a popular tool in credibility theory, the Bayesian premium estimator, considering the Lindley distribution as the claim distribution. We derive this estimator under entropy loss, which is asymmetric, and under squared error loss, which is a symmetric loss function, with informative and non-informative priors. In a purely Bayesian setting, the prior distribution represents the insurer's belief about the insured's risk level before the insured's data are collected. However, the explicit form of the Bayesian premium when the prior is not a member of the exponential family can be quite difficult to obtain, as it involves a number of integrations that are not analytically solvable. The paper solves this problem by deriving the estimator using a numerical approximation (the Lindley approximation), one of the suitable approximation methods for such problems: it approximates the ratio of the integrals as a whole and produces a single numerical result. A simulation study using the Monte Carlo method is then performed to evaluate this estimator, and the mean squared error criterion is used to compare the Bayesian premium estimator under the above loss functions.
Keywords: Bayesian estimator, credibility theory, entropy loss, Monte Carlo simulation
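The contrast between the two loss functions can be shown numerically. Under squared error loss the Bayes estimator is the posterior mean, while under the entropy loss L(d, t) = d/t - ln(d/t) - 1 it is the reciprocal of the posterior mean of 1/theta. The sketch below uses a Gamma-shaped posterior handled by plain numerical integration; the Gamma choice and the parameters a, b are illustrative assumptions, not the paper's Lindley-based derivation.

```python
import numpy as np

# Illustrative posterior density for the risk parameter theta,
# proportional to theta^(a-1) * exp(-b*theta) (a Gamma(a, b) shape).
a, b = 5.0, 2.0
theta = np.linspace(1e-6, 40.0, 400001)
dx = theta[1] - theta[0]
post = theta ** (a - 1) * np.exp(-b * theta)
post /= post.sum() * dx  # normalize the density numerically

# Squared error loss -> Bayes estimator is the posterior mean E[theta].
prem_se = (theta * post).sum() * dx

# Entropy loss -> Bayes estimator is 1 / E[1/theta].
prem_ent = 1.0 / ((post / theta).sum() * dx)
```

For this posterior the two premiums are a/b = 2.5 and (a-1)/b = 2.0 in closed form, so the numerical answers can be checked directly; the entropy-loss premium is the smaller of the two, reflecting the asymmetry of the loss.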
Procedia PDF Downloads 334
415 Realizing the Full Potential of Islamic Banking System: Proposed Suitable Legal Framework for Islamic Banking System in Tanzania
Authors: Maulana Ayoub Ali, Pradeep Kulshrestha
Abstract:
The laws of any given secular state contribute greatly to the growth of the Islamic banking system, because the system uses conventional laws to govern its activities. Therefore, the former should be ready to accommodate the latter in order to make the Islamic banking system work properly without affecting the current conventional banking system. Islamic financial rules have been practiced since the birth of Islam. Following the recent world economic challenges in the financial sector, a quick rebirth of the contemporary Islamic ethical banking system took place. The emergence of the Islamic banking system is due to various reasons, including but not limited to the failure of the interest-based economy to solve financial problems around the globe. The Islamic banking system has therefore been adopted as an alternative banking system in order to recover the highly damaged global financial sector. But the Islamic banking system has been facing a number of challenges which hinder its smooth operation in different parts of the world. This paper does not aim to discuss challenges other than legal ones, although others are touched on where it is proper to do so. Many things were discovered in the course of writing this paper; the most important is that the regulatory and supervisory framework for the Islamic banking system, in Tanzania and in other nations, is crucial for the development of the Islamic banking industry.
This paper analyses what has been observed in the study of that area and recommends necessary actions to be taken in a bid to make the Islamic banking system reach its climax of serving the larger community by providing an ethical, equitable, affordable, interest-free and society-centred banking system around the globe.
Keywords: Islamic banking, interest free banking, ethical banking, legal framework
Procedia PDF Downloads 149
414 A Finite Element/Finite Volume Method for Dam-Break Flows over Deformable Beds
Authors: Alia Alghosoun, Ashraf Osman, Mohammed Seaid
Abstract:
A coupled two-layer finite volume/finite element method is proposed for solving the dam-break flow problem over deformable beds. The governing equations consist of the well-balanced two-layer shallow water equations for the water flow and a linear elastic model for the bed deformations. Deformations in the topography can be caused by a sudden localized force or simply by a class of sliding displacements of the bathymetry. This deformation of the bed is a source of perturbations on the water surface, generating water waves which propagate with different amplitudes and frequencies. Coupling conditions at the interface are also investigated in the current study, and a two-mesh procedure is proposed for the transfer of information through the interface. In the present work, a new procedure is implemented at the soil-water interface using the finite element and two-layer finite volume meshes with a conservative distribution of the forces at their intersections. The finite element method employs quadratic elements on an unstructured triangular mesh, and the finite volume method uses the Rusanov scheme to reconstruct the numerical fluxes. The coupled numerical method is highly efficient, accurate, and well balanced, and it can handle complex geometries as well as rapidly varying flows. Numerical results are presented for several test examples of dam-break flows over deformable beds. A mesh convergence study is performed for both methods; the overall model provides new insight into the problem at minimal computational cost.
Keywords: dam-break flows, deformable beds, finite element method, finite volume method, hybrid techniques, linear elasticity, shallow water equations
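The Rusanov flux mentioned above is a standard local Lax-Friedrichs-type numerical flux. A minimal sketch for the single-layer 1D shallow water equations follows; it is a drastic simplification of the paper's well-balanced two-layer scheme, shown only to make the flux construction concrete.

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def swe_flux(h, hu):
    """Physical flux of the 1D shallow water equations: (hu, hu^2 + g h^2/2)."""
    u = hu / h
    return (hu, hu * u + 0.5 * G * h * h)

def rusanov_flux(hl, hul, hr, hur):
    """Rusanov numerical flux at a cell interface: average of the left and
    right physical fluxes minus a dissipation scaled by the max wave speed."""
    fl, fr = swe_flux(hl, hul), swe_flux(hr, hur)
    # Maximum characteristic speed |u| + sqrt(g h) over the two states.
    s = max(abs(hul / hl) + math.sqrt(G * hl),
            abs(hur / hr) + math.sqrt(G * hr))
    return (0.5 * (fl[0] + fr[0]) - 0.5 * s * (hr - hl),
            0.5 * (fl[1] + fr[1]) - 0.5 * s * (hur - hul))
```

When the left and right states are equal the dissipation term vanishes and the numerical flux reduces to the physical one, which is the consistency property a finite volume scheme requires.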
Procedia PDF Downloads 181
413 The Role of Physical Education and Fitness for Active Ageing
Authors: A. Lakshya
Abstract:
The main aim of this paper is to interpret physical education for children from 5 to 18 years. Schools have the ability to promote positive mental health through physical education, which helps to build individual growth, goal setting, decision making, muscular development, self-discipline, stress relief, leadership qualities that can arise with new skills, prosocial behavior, and problem-solving skills. Without it, children at these early ages are at risk of disorders such as heart disease, diabetes, and obesity, which may increase in large numbers; yet physical education is given very little place in schools, leaving children in a state of inactivity. Globally, 81% of adolescents aged 11-18 years were insufficiently physically active in 2016. Adolescent girls were less active than boys (85% vs. 78%). A recent study of California schools found that students are sedentary most of the time during PE classes, with just four minutes of every half-hour spent in vigorous physical activity. Additionally, active PE time decreases with larger class sizes: students in classes with more than forty-five students are half as active as students in smaller classes. Children in adolescence acquire more creative ideas, creating new hairstyles, cooking styles, and dressing styles; instead, many children are devoting themselves to television (TV) and video games. The development of physical qualities not only improves students' physical fitness but is also conducive to their psychological development. Physical education teaching should pay more attention to the training of physical qualities in the future.
Keywords: physical education, prosocial behavior, leadership, goal setting
Procedia PDF Downloads 137
412 Efficient Implementation of Finite Volume Multi-Resolution Weno Scheme on Adaptive Cartesian Grids
Authors: Yuchen Yang, Zhenming Wang, Jun Zhu, Ning Zhao
Abstract:
An easy-to-implement and robust finite volume multi-resolution Weighted Essentially Non-Oscillatory (WENO) scheme is proposed on adaptive Cartesian grids in this paper. This multi-resolution WENO scheme is combined with the ghost-cell immersed boundary method (IBM) and a wall-function technique to solve the Navier-Stokes equations. Unlike the k-exact finite volume WENO schemes, which involve large amounts of extra storage, repeatedly solving the matrix generated in a least-squares method, or calculating optimal linear weights on adaptive Cartesian grids, the present methodology adds very little overhead and can be easily implemented in existing edge-based computational fluid dynamics (CFD) codes with minor modifications. Also, the linear weights of this adaptive finite volume multi-resolution WENO scheme can be any positive numbers on the condition that their sum is one. This bypasses the calculation of the optimal linear weights, and such a multi-resolution WENO scheme avoids dealing with negative linear weights on adaptive Cartesian grids. Some benchmark viscous problems are numerically solved to show the efficiency and good performance of this adaptive multi-resolution WENO scheme. Compared with a second-order edge-based method, the presented method can be implemented in an adaptive Cartesian grid with slight modification for high-Reynolds-number problems.
Keywords: adaptive mesh refinement method, finite volume multi-resolution WENO scheme, immersed boundary method, wall-function technique
Procedia PDF Downloads 148
411 Synthesis and Properties of Nanosized Mixed Oxide Systems for Environmental Protection
Authors: I. Yordanova, H. Kolev, S. Todorova, Z. Cherkezova-Zheleva
Abstract:
Catalysis plays a key role in solving many environmental problems by establishing efficient catalytic systems for environmental protection and reducing emissions of greenhouse gases from industry. Volatile organic compounds (VOCs) are major air pollutants. There are several ways to dispose of such emissions: adsorption, condensation, absorption, bio-filtration, and thermal, catalytic, plasma, and ultraviolet oxidation. Catalytic oxidation has several advantages over the other methods, for example, lower energy consumption, and the fact that the concentration of the organic contaminant may be low or may vary within wide limits. Catalysts for the complete oxidation of VOCs can be classified into three categories: noble metals, metal oxides or supported metal oxides, and mixtures of noble metals and metal oxides. Most catalysts for complete catalytic oxidation are based on Pt, Pd, Rh, or a combination thereof. Transition metal oxides are one of the alternatives to noble metal catalysts for these reactions: they are less active at low temperatures, but at higher temperatures their activity is similar. The properties of a catalyst depend on the distribution of the active phase, the medium and type of pre-treatment, the interaction between the active phase and the support, and the interaction between the active phase and the reaction medium. Supported mono-component Mn and bi-component Mn-Co systems are examined in the present study. The samples are prepared using the co-precipitation method, with SiO2 (Aerosil) used as a support and NH4OH as the precipitating agent. The synthesized samples were characterized by XRD, XPS, and TPR and tested in the catalytic reaction of complete oxidation of n-hexane, propane, methanol, ethanol, and propanol.
Keywords: catalytic oxidation, Co-Mn oxide, oxidation of hydrocarbons and alcohols, environmental protection
Procedia PDF Downloads 386
410 Hydrodynamics Study on Planing Hull with and without Step Using Numerical Solution
Authors: Koe Han Beng, Khoo Boo Cheong
Abstract:
Rising interest in stepped hull design has been driven by the demand for more efficient high-speed boats. At the same time, the need for an accurate prediction method for stepped planing hulls is becoming more important. Although understanding the flow at high Froude number is the key to designing a practical stepped hull, studies of stepped hulls have been conducted mainly in the towing tank, which is time-consuming and costly for the initial design phase. Here the feasibility of predicting the hydrodynamics of high-speed planing hulls, both with and without a step, using computational fluid dynamics (CFD) with the volume of fluid (VOF) methodology is studied. First, the flow around a prismatic body is analyzed; the force generated and its center of pressure are compared with available experimental and empirical data from the literature. The wake behind the transom on the keel line as well as on the quarter-beam buttock line is then compared with the available data; this is important since the afterbody flow of a stepped hull is subjected to the wake of the forebody. Finally, the calm-water performance prediction of a conventional planing hull and its stepped version is analyzed. An overset mesh methodology is employed in solving the dynamic equilibrium of the hull. The resistance, trim, and heave are then compared with the experimental data. The resistance is found to be predicted well, and the dynamic equilibrium solved by the numerical method is deemed acceptable. This means that computational fluid dynamics will be very useful in further study of the complex flow around stepped hulls and has potential usage in the design phase.
Keywords: planing hulls, stepped hulls, wake shape, numerical simulation, hydrodynamics
Procedia PDF Downloads 282
409 Empowering Transformers for Evidence-Based Medicine
Authors: Jinan Fiaidhi, Hashmath Shaik
Abstract:
Breaking the barrier to practicing evidence-based medicine relies on effective methods for rapidly identifying relevant evidence from the body of biomedical literature. An important challenge confronting medical practitioners is the long time needed to browse, filter, summarize, and compile information from different medical resources. Deep learning can help solve this through automatic question answering (Q&A) and transformers. However, Q&A and transformer technologies are not trained to answer clinical queries that can be used for evidence-based practice, nor can they respond to structured clinical questioning protocols like PICO (Patient/Problem, Intervention, Comparison, and Outcome). This article describes the use of deep learning techniques for Q&A, based on transformer models like BERT and GPT, to answer PICO clinical questions that can be used for evidence-based practice, with evidence extracted from sound medical research resources like PubMed. We report acceptable clinical answers that are supported by findings from PubMed. Our transformer methods reach an acceptable state-of-the-art performance based on a two-stage bootstrapping process: filtering relevant articles, followed by identifying articles that support the requested outcome expressed by the PICO question. Moreover, we also report experiments that empower our bootstrapping techniques with patch attention to the most important keywords in the clinical case and the PICO question. Our bootstrapping patched with attention shows the relevance of the evidence collected, based on entropy metrics.
Keywords: automatic question answering, PICO questions, evidence-based medicine, generative models, LLM transformers
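The two-stage bootstrapping structure (filter relevant articles, then rank them by support for the requested outcome) can be sketched with plain keyword matching. This is only a structural stand-in for the transformer models the abstract describes; the mini-corpus and the PICO terms are invented for illustration.

```python
# Hypothetical mini-corpus standing in for retrieved PubMed abstracts.
CORPUS = [
    "Metformin reduced HbA1c versus placebo in type 2 diabetes patients.",
    "Exercise programs improved mobility in elderly hip-fracture patients.",
    "Metformin showed no weight gain compared with sulfonylureas in diabetes.",
]

# A structured PICO question (all terms hypothetical).
PICO = {"patient": "diabetes", "intervention": "metformin",
        "comparison": "placebo", "outcome": "hba1c"}

def stage_one(corpus, pico):
    """Stage 1: filter to articles mentioning the patient and intervention."""
    return [doc for doc in corpus
            if pico["patient"] in doc.lower()
            and pico["intervention"] in doc.lower()]

def stage_two(articles, pico):
    """Stage 2: rank filtered articles by support for outcome/comparison."""
    def score(doc):
        text = doc.lower()
        return (pico["outcome"] in text) + (pico["comparison"] in text)
    return sorted(articles, key=score, reverse=True)

evidence = stage_two(stage_one(CORPUS, PICO), PICO)
```

In the paper, each stage would instead be driven by a transformer model, with patch attention replacing the flat keyword scoring used here.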
Procedia PDF Downloads 43
408 Engineering of Filtration Systems in Egyptian Cement Plants: Industrial Case Study
Authors: Mohamed. A. Saad
Abstract:
The paper presents a case study regarding the conversion of electrostatic precipitators (ESPs) into fabric filters (FFs). Seven cement production companies were established in Egypt during the period 1927 to 1980, and 6 new companies were established to cope with the increasing cement demand in the 1980s. The cement production market shares in Egypt indicate that there are six multinational companies in the local market; they are interested in improving environmental conditions and therefore decided to carry out an emission reduction project. The experimental work in the present study is divided into two main parts: (I) measuring the efficiency of filter fabrics, with a detailed description of a designed apparatus. The paper also reveals the factors that should be optimized in order to assist problem diagnosis and solving and to increase the life of filter bags. (II) Methods to mitigate dust emissions in Egyptian cement plants, with a special focus on converting the electrostatic precipitators (ESPs) into fabric filters (FFs) using the same ESP casing, bottom hoppers, dust transportation system, and ESP ductwork. Only the fan system was replaced, to handle the higher pressure drop of the fabric filter. The proper selection of bag material was a prime factor with regard to gas composition, temperature, and particle size. Fiberglass bags with a PTFE membrane coating were selected; this fabric is rated for a continuous temperature of 250 °C and a surge temperature of 280 °C. The dust emission recorded was less than 20 mg/m3 from the production line fitted with fabric filters, which compares very favourably with the stacks of the lines still operating with ESPs.
Keywords: engineering, electrostatic precipitator, filtration, dust collectors, cement
Procedia PDF Downloads 253
407 A Multi-Objective Reliable Location-Inventory Capacitated Disruption Facility Problem with Penalty Cost Solved with Efficient Metaheuristic Algorithms
Authors: Elham Taghizadeh, Mostafa Abedzadeh, Mostafa Setak
Abstract:
In a logistics network, opened facilities are expected to work continuously over a long time horizon without any failure, but in real-world problems facilities may face disruptions. This paper studies a reliable joint inventory-location problem to optimize the cost of facility locations, customer assignment, and inventory management decisions when facilities face failure risks and stop working. In our model, we assume that when a facility is out of work, its customers may be reassigned to other operational facilities; otherwise, they must endure high penalty costs associated with losing service. To bring the model closer to real-world problems, it is formulated on the basis of the p-median problem, and the facilities are considered to have limited capacities. We define a new binary variable (Z_is) to indicate that a customer is not assigned to any facility. Our problem involves a bi-objective model: the first objective minimizes the sum of facility construction costs and expected inventory holding costs, and the second minimizes the maximum expected customer costs under normal and failure scenarios. To solve this model, the NSGA-II and MOSS algorithms are applied to find the Pareto-archive solutions, and Response Surface Methodology (RSM) is applied to optimize the NSGA-II algorithm parameters. We compare the performance of the two algorithms on three metrics, and the results show that NSGA-II is more suitable for our model.
Keywords: joint inventory-location problem, facility location, NSGAII, MOSS
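The Pareto-archive idea used by both NSGA-II and MOSS rests on the dominance relation between solutions. A minimal sketch for a bi-objective minimization follows; the cost pairs are hypothetical, standing in for (total facility and inventory cost, maximum expected customer cost).

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (both objectives minimized)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_archive(solutions):
    """Keep only the non-dominated solutions."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Hypothetical (total cost, max expected customer cost) pairs.
front = pareto_archive([(10, 8), (12, 5), (11, 9), (9, 12), (13, 4)])
```

Here (11, 9) is dominated by (10, 8) and is dropped; the remaining four points form the trade-off front that a decision maker would choose from.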
Procedia PDF Downloads 525
406 Importance of Risk Assessment in Managers' Decision-Making Process
Authors: Mária Hudáková, Vladimír Míka, Katarína Hollá
Abstract:
Making decisions is the core of management and the result of conscious activity which takes place in a particular environment and under concrete conditions. Managers decide about goals, procedures, and the methods of responding to changes and to the problems that develop. Their decisions affect the effectiveness, quality, economy, and overall success of every organisation. In spite of this fact, managers do not pay sufficient attention to the individual steps of the decision-making process. They emphasise how to cope with the individual methods and techniques of making decisions, and forget how to cope with analysing the problem or assessing the individual solution variants. In many cases, underestimating the analytical phase can lead to an incorrect assessment of the problem, and this can then negatively influence its further solution. Based on our analysis of the theoretical solutions by individual authors dealing with this area, and on research carried out in Slovakia and abroad, we can recognise insufficient interest on the part of managers in assessing the risks in the decision-making process. The goal of this paper is to assess the risks in the managers' decision-making process relating to the conditions of the environment, to the subject's activity (the manager's personality), to the insufficient assessment of individual variants for solving the problems, and also to situations in which the arisen problem is not solved. The benefit of this paper is the effort to increase the managers' need to deal with risks during the decision-making process.
It is important for every manager to assess the risks in his or her decision-making process and to strive to take decisions which reflect the basic conditions, states, and development of the environment in the best way, and especially for the managers' decisions to contribute to achieving the determined goals of the organisation as effectively as possible.
Keywords: risk, decision-making, manager, process, analysis, source of risk
Procedia PDF Downloads 264
405 Scheduling in a Single-Stage, Multi-Item Compatible Process Using Multiple Arc Network Model
Authors: Bokkasam Sasidhar, Ibrahim Aljasser
Abstract:
The problem of finding optimal schedules for each piece of equipment in a production process is considered. The process consists of a single stage of manufacturing and can handle different types of products, where a changeover from handling one type of product to another incurs certain costs. The machine capacity is determined by the upper limit on the quantity that can be processed for each of the products in a set-up. The changeover costs increase with the number of set-ups, and hence, to minimize the costs associated with product changeovers, the planning should be such that similar types of products are processed successively, so that the total number of changeovers, and in turn the associated set-up costs, are minimized. The problem of cost minimization is equivalent to the problem of minimizing the number of set-ups, or equivalently, maximizing the capacity utilization between every set-up, or maximizing the total capacity utilization. Further, production is usually planned against customers' orders, and generally different customers' orders are assigned one of two priorities: "normal" or "priority". The production planning problem in such a situation can be formulated as a Multiple Arc Network (MAN) model and solved sequentially using the algorithm for maximizing flow along a MAN and the algorithm for maximizing flow along a MAN with priority arcs. The model aims to provide an optimal production schedule with the objective of maximizing capacity utilization, so that the customer-wise delivery schedules are fulfilled, keeping in view the customer priorities. Algorithms are presented for solving the MAN formulation of production planning with customer priorities, and the application of the model is demonstrated through numerical examples.
Keywords: scheduling, maximal flow problem, multiple arc network model, optimization
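The core subroutine behind the approach above is a maximum-flow computation on a network. A generic Edmonds-Karp sketch is given below; it omits the paper's multiple-arc and priority-arc extensions, and the tiny capacity matrix at the end is an invented example (e.g. a source feeding two set-up nodes that drain into a sink).

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp: repeatedly push flow along BFS-shortest augmenting paths."""
    n = len(capacity)
    residual = [row[:] for row in capacity]
    flow = 0
    while True:
        # Breadth-first search for an augmenting path in the residual graph.
        parent = [-1] * n
        parent[source] = source
        queue = deque([source])
        while queue and parent[sink] == -1:
            u = queue.popleft()
            for v in range(n):
                if parent[v] == -1 and residual[u][v] > 0:
                    parent[v] = u
                    queue.append(v)
        if parent[sink] == -1:
            return flow  # no augmenting path left: flow is maximal
        # Find the bottleneck capacity along the path, then update residuals.
        path_flow, v = float("inf"), sink
        while v != source:
            path_flow = min(path_flow, residual[parent[v]][v])
            v = parent[v]
        v = sink
        while v != source:
            residual[parent[v]][v] -= path_flow
            residual[v][parent[v]] += path_flow
            v = parent[v]
        flow += path_flow

# Hypothetical 4-node network: node 0 = source, node 3 = sink.
CAP = [[0, 3, 2, 0],
       [0, 0, 1, 2],
       [0, 0, 0, 3],
       [0, 0, 0, 0]]
total = max_flow(CAP, source=0, sink=3)
```

In the MAN formulation, arcs would carry order quantities and set-up capacities, and a second pass would handle the priority arcs.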
Procedia PDF Downloads 402
404 Using the SMT Solver to Minimize the Latency and to Optimize the Number of Cores in an NoC-DSP Architectures
Authors: Imen Amari, Kaouther Gasmi, Asma Rebaya, Salem Hasnaoui
Abstract:
The problem of scheduling and mapping data-flow applications on multi-core architectures is notoriously difficult. This difficulty stems from the rapid evolution of telecommunication and multimedia systems, accompanied by rapidly increasing user requirements in terms of latency, execution time, power consumption, energy, etc. Obtaining an optimal schedule on multi-core DSP (Digital Signal Processor) platforms is therefore a challenging task. In this context, we present a novel technique and algorithm for finding a valid schedule that optimizes the key performance metrics, particularly latency. Our contribution is based on Satisfiability Modulo Theories (SMT) solving technologies, which are strongly driven by industrial applications and needs. This paper describes a scheduling module integrated into our proposed workflow, intended as an effective approach for programming applications on NoC-DSP platforms. The workflow automatically transforms a Simulink model into a synchronous dataflow (SDF) model; the automatic transformation, followed by SMT-solver scheduling, aims to minimize the final latency and other software/hardware metrics through an optimal schedule, and to find the optimal number of cores to be used. The proposed workflow takes as its entry point a Simulink file (.mdl or .slx) derived from embedded Matlab functions, exploiting the synchronous and hierarchical behavior shared by Simulink and SDF. Running the scheduler in this workflow with our SMT-solver algorithm refinements produces the best possible schedule in terms of latency and number of cores.
Keywords: multi-core DSP, scheduling, SMT solver, workflow
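At its core, the query handed to the SMT solver is: assign each SDF actor a core and a start time subject to precedence and non-overlap constraints, minimizing latency. The paper's toolchain is not reproduced here, so the sketch below solves a toy instance of that same constraint problem by exhaustive search rather than SMT; the actors, durations, and core count are invented:

```python
from itertools import product

# Toy SDF-like instance: three actors, one precedence edge A -> C,
# two identical cores. All numbers are illustrative.
durations = {"A": 2, "B": 3, "C": 1}
precedes = [("A", "C")]      # C may start only after A finishes
cores, horizon = 2, 8
actors = list(durations)

def feasible(assign):
    """assign maps actor -> (core, start). Check precedence and that
    no two actors overlap in time on the same core."""
    for a, b in precedes:
        if assign[b][1] < assign[a][1] + durations[a]:
            return False
    for i, x in enumerate(actors):
        for y in actors[i + 1:]:
            (c1, s1), (c2, s2) = assign[x], assign[y]
            if c1 == c2 and s1 < s2 + durations[y] and s2 < s1 + durations[x]:
                return False
    return True

def min_latency():
    slots = list(product(range(cores), range(horizon)))
    best = None
    for combo in product(slots, repeat=len(actors)):
        assign = dict(zip(actors, combo))
        if feasible(assign):
            latency = max(s + durations[a] for a, (_, s) in assign.items())
            best = latency if best is None else min(best, latency)
    return best

print(min_latency())  # minimal makespan over all valid schedules
```

An SMT solver such as Z3 would encode the same constraints symbolically and scale far beyond what enumeration can handle.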
Procedia PDF Downloads 286
403 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem
Authors: Ouafa Amira, Jiangshe Zhang
Abstract:
Clustering is an unsupervised machine learning technique whose aim is to extract data structures in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely used in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means (FCM) clustering is one of the best-known fuzzy clustering methods. It is based on solving an optimization problem that minimizes a given cost function, thereby decreasing the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function taking values in the interval [0, 1]. In FCM clustering, the membership degrees are constrained so that the sum of a data object's memberships over all clusters equals one. This constraint can cause several problems, especially when the data lie in a noisy space. Regularization introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where the optimization problem aims to minimize the dissimilarity inside clusters. Our objective is to find an appropriate membership degree for each data object, because appropriate membership degrees lead to accurate clustering results. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy.
Keywords: clustering, fuzzy c-means, regularization, relative entropy
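Under entropy-style regularization, the membership update takes a closed softmax form, u_ij proportional to exp(-d_ij / gamma), alternated with a membership-weighted mean update of the centers. The sketch below shows that alternation on a toy 2-D data set with deterministic initialization; the value of gamma and the data are invented, and this is a generic entropy-regularized variant, not the authors' exact formulation:

```python
import math

def entropy_fcm(points, k, gamma=1.0, iters=50):
    """Alternate softmax memberships (from squared distances) with
    membership-weighted mean center updates."""
    centers = [points[0], points[-1]]  # deterministic init (assumes k == 2)
    dim = len(points[0])
    for _ in range(iters):
        U = []
        for x in points:
            w = [math.exp(-sum((xi - ci) ** 2 for xi, ci in zip(x, c)) / gamma)
                 for c in centers]
            s = sum(w)
            U.append([wi / s for wi in w])
        centers = []
        for j in range(k):
            tot = sum(row[j] for row in U)
            centers.append(tuple(
                sum(U[i][j] * points[i][d] for i in range(len(points))) / tot
                for d in range(dim)))
    return centers, U

# Two well-separated toy clusters around (0, 0) and (5, 5).
pts = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5),
       (5.0, 5.0), (5.5, 5.0), (5.0, 5.5)]
centers, U = entropy_fcm(pts, 2)
print(centers)
```

Each membership row still sums to one, but the softmax keeps memberships smooth, which is what the regularization buys in noisy settings.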
Procedia PDF Downloads 259
402 Aerodynamic Modeling Using Flight Data at High Angle of Attack
Authors: Rakesh Kumar, A. K. Ghosh
Abstract:
The paper presents the modeling of linear and nonlinear longitudinal aerodynamics using real flight data of the Hansa-3 aircraft gathered at low and high angles of attack. The Neural-Gauss-Newton (NGN) method has been applied to model the linear and nonlinear longitudinal dynamics and to estimate parameters from flight data. Unsteady aerodynamics due to flow separation at high angles of attack near stall has been included in the aerodynamic model using Kirchhoff's quasi-steady stall model. NGN is an algorithm that combines a Feed-Forward Neural Network (FFNN) with Gauss-Newton optimization to estimate the parameters; it requires neither an a priori postulation of the mathematical model nor the solving of equations of motion. The NGN method was validated on real flight data generated at moderate angles of attack before being applied to the data at high angles of attack. The estimates obtained from compatible flight data using the NGN method were validated by comparison with wind tunnel values and with the maximum likelihood estimates. Validation was also carried out by comparing the measured motion variables with the responses generated from the estimates under a different control input. Next, the NGN method was applied to real flight data generated by executing a well-designed quasi-steady stall maneuver. The results obtained in terms of stall characteristics and aerodynamic parameters were encouraging and sufficiently accurate to establish NGN as a method for modeling nonlinear aerodynamics from real flight data at high angles of attack.
Keywords: parameter estimation, NGN method, linear and nonlinear, aerodynamic modeling
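Kirchhoff's quasi-steady stall model ties the lift coefficient to a flow-separation point X that moves aft as the angle of attack grows: C_L = C_La * ((1 + sqrt(X)) / 2)^2 * alpha, with X following a tanh law around a break angle. A sketch with illustrative parameter values (the lift-curve slope, stall-sharpness a1, and break angle below are invented, not the Hansa-3 estimates from the study):

```python
import math

def kirchhoff_cl(alpha_deg, cl_alpha=5.0, a1=20.0, alpha_star_deg=15.0):
    """Quasi-steady Kirchhoff stall model: X = 1 means fully attached
    flow, X = 0 fully separated; X scales the effective lift-curve slope."""
    alpha = math.radians(alpha_deg)
    alpha_star = math.radians(alpha_star_deg)
    x = 0.5 * (1.0 - math.tanh(a1 * (alpha - alpha_star)))  # separation point
    return cl_alpha * ((1.0 + math.sqrt(x)) / 2.0) ** 2 * alpha

# Pre-stall the model tracks the linear slope; post-stall lift collapses.
for a in (5, 10, 15, 18, 25):
    print(a, round(kirchhoff_cl(a), 3))
```

In the NGN framework, cl_alpha, a1, and the break angle are among the parameters recovered from the stall-maneuver flight data.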
Procedia PDF Downloads 445
401 Don't Just Guess and Slip: Estimating Bayesian Knowledge Tracing Parameters When Observations Are Scant
Authors: Michael Smalenberger
Abstract:
Intelligent tutoring systems (ITS) are computer-based platforms which can incorporate artificial intelligence to provide step-by-step guidance as students practice problem-solving skills. ITS can replicate and even exceed some benefits of one-on-one tutoring, foster transactivity in collaborative environments, and lead to substantial learning gains when used to supplement the instruction of a teacher or when used as the sole method of instruction. A common facet of many ITS is their use of Bayesian Knowledge Tracing (BKT) to estimate parameters necessary for the implementation of the artificial intelligence component, and for the probability of mastery of a knowledge component relevant to the ITS. While various techniques exist to estimate these parameters and probability of mastery, none directly and reliably ask the user to self-assess these. In this study, 111 undergraduate students used an ITS in a college-level introductory statistics course for which detailed transaction-level observations were recorded, and users were also routinely asked direct questions that would lead to such a self-assessment. Comparisons were made between these self-assessed values and those obtained using commonly used estimation techniques. Our findings show that such self-assessments are particularly relevant at the early stages of ITS usage while transaction level data are scant. Once a user’s transaction level data become available after sufficient ITS usage, these can replace the self-assessments in order to eliminate the identifiability problem in BKT. We discuss how these findings are relevant to the number of exercises necessary to lead to mastery of a knowledge component, the associated implications on learning curves, and its relevance to instruction time.
Keywords: Bayesian Knowledge Tracing, Intelligent Tutoring System, in vivo study, parameter estimation
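For reference, the BKT parameters named in the title enter a two-step update: a Bayes posterior on mastery given the observed response (with slip and guess probabilities), followed by the learning transition. A minimal sketch with placeholder parameter values (the 0.1/0.1/0.2 settings below are illustrative, not estimates from the study):

```python
def bkt_update(p_mastery, correct, p_transit=0.1, p_slip=0.1, p_guess=0.2):
    """One Bayesian Knowledge Tracing step: condition P(L) on the
    response, then apply the probability of learning p_transit."""
    if correct:
        num = p_mastery * (1 - p_slip)
        cond = num / (num + (1 - p_mastery) * p_guess)
    else:
        num = p_mastery * p_slip
        cond = num / (num + (1 - p_mastery) * (1 - p_guess))
    return cond + (1 - cond) * p_transit

p = 0.3  # prior P(L0), e.g. seeded from a self-assessment while data are scant
for obs in (True, True, False, True):
    p = bkt_update(p, obs)
    print(round(p, 3))
```

The paper's point is about where the four parameters come from when few such observations exist; a self-assessed prior can stand in for the fitted P(L0) until the transaction log is long enough to estimate it.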
Procedia PDF Downloads 172
400 Evidence Theory Based Emergency Multi-Attribute Group Decision-Making: Application in Facility Location Problem
Authors: Bidzina Matsaberidze
Abstract:
It is known that, in emergency situations, multi-attribute group decision-making (MAGDM) models are characterized by insufficient objective data and a lack of time to respond to the task. Evidence theory is an effective tool for describing such incomplete information in decision-making models when the expert and his knowledge are involved in estimating the MAGDM parameters. We consider an emergency decision-making model in which expert assessments on humanitarian aid from distribution centers (HADC) are represented as q-rung orthopair fuzzy numbers, and the data structure is described within the theory of evidence bodies. Based on focal probability construction and the experts' evaluations, an objective function, a ranking index for selecting distribution centers, is constructed. Our approach to solving the constructed bicriteria partitioning problem consists of two phases. In the first phase, based on the covering matrix, we generate a matrix whose columns allow us to find all possible partitionings of the HADCs among the service centers; some constraints are also taken into consideration while generating the matrix. In the second phase, based on this matrix and using our exact algorithm, we find the partitionings (allocations of the HADCs to the centers) that correspond to the Pareto-optimal solutions. A numerical example for the facility location-selection problem illustrates the obtained results.
Keywords: emergency MAGDM, q-rung orthopair fuzzy sets, evidence theory, HADC, facility location problem, multi-objective combinatorial optimization problem, Pareto-optimal solutions
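The second phase keeps only the non-dominated partitionings. The authors' exact algorithm is not reproduced here, but the Pareto filter itself can be sketched for a minimization problem with invented objective vectors (say, cost and response time per candidate partitioning):

```python
def dominates(a, b):
    """a dominates b (minimization): no worse on every objective,
    strictly better on at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(solutions):
    """Keep the solutions not dominated by any other candidate."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions)]

# Hypothetical (cost, response-time) vectors for five partitionings.
candidates = [(3, 5), (4, 4), (5, 3), (4, 6), (6, 6)]
print(pareto_front(candidates))
```

For the bicriteria problem in the paper, each tuple would be the pair of objective values attached to one feasible allocation of HADCs to centers.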
Procedia PDF Downloads 92
399 A Three-Dimensional Investigation of Stabilized Turbulent Diffusion Flames Using Different Type of Fuel
Authors: Moataz Medhat, Essam E. Khalil, Hatem Haridy
Abstract:
In the present study, a numerical simulation is used to model, in three dimensions, the steady-state combustion of a staged natural gas flame in a 300 kW swirl-stabilized burner, using the ANSYS solver to find the highest combustion efficiency by changing the inlet air swirl number and burner quarl angle in a furnace, and to show the effects of flue gas recirculation, fuel type, and staging. The combustion chamber of the gas turbine is a cylinder with a diameter of 1006.8 mm and a height of 1651 mm, ending with a hood leading to the exhaust cylinder through which the combustion products exit; the exhaust has a diameter of 300 mm and a height of 751 mm. Owing to the axisymmetry of the geometry, a 15-degree sector of the circumference was modeled and divided into a mesh of about 1.1 million cells. The numerical simulations were performed by solving the governing equations in a three-dimensional model, using the realizable k-epsilon equations to express the turbulence and a non-premixed flamelet combustion model, taking radiation effects into consideration. The results were validated by comparison with experimental data to ensure their agreement. The study revealed two recirculation zones: the primary one at the center of the furnace, and a secondary one whose location varies with the quarl angle of the burner. It is found that the temperature increase in the external recirculation zone results from increasing the swirl number of the inlet air stream. It was also found that recirculating part of the combustion products back into the combustion zone decreases pollutant formation, especially nitrogen monoxide.
Keywords: burner selection, natural gas, analysis, recirculation
Procedia PDF Downloads 161
398 Qualitative Profiling Model and Competencies Evaluation to Fighting Unemployment
Authors: Francesca Carta, Giovanna Linfante, Laura Agneni, Debora Radicchia, Camilla Micheletta, Angelo Del Cimmuto
Abstract:
Overcoming competence mismatches and fostering career pathways congruent with individual skills profiles would contribute significantly to fighting unemployment. The aim of this paper is to examine the usefulness and efficiency of qualitative tools in supporting and improving the quality of caseworkers' activities during jobseekers' profile analysis and the career guidance process. The selected target groups are the long-term and medium-term unemployed, jobseekers, young people at the end of a vocational training pathway, and unemployed women with social disadvantages. The experimentation was conducted at public employment services in Italy in 2017. In the framework of the Italian labour market reform, the experimentation represents the first step toward developing a customized qualitative profiling model; the overall objective is to improve the quality of the public employment services. The experimentation tests the transferability of an OECD self-assessment competences tool to the Italian public employment services. On the one hand, the first analysis results will indicate users' perceptions of the tool's application and their different competence levels (literacy, numeracy, problem solving, career interest, subjective well-being and health, behavioural competencies) with reference to the specific target group. On the other hand, the experimentation outcomes will show caseworkers' understanding of the tool's usability and efficiency for career guidance and for reskilling and upskilling programs.
Keywords: career guidance, evaluation competences, reskilling pathway, unemployment
Procedia PDF Downloads 318
397 Thermal Transport Properties of Common Transition Single Metal Atom Catalysts
Authors: Yuxi Zhu, Zhenqian Chen
Abstract:
It is of great interest to investigate the thermal properties of non-precious-metal catalysts for proton exchange membrane fuel cells (PEMFC) in light of their thermal management requirements. Because of the low symmetry of these materials, accurately obtaining their thermal conductivity requires the second- and third-order force constants, obtained by combining density functional theory with a machine-learning interatomic potential. Specifically, the interatomic force constants are obtained with a moment tensor potential (MTP) trained on ab initio molecular dynamics (AIMD) trajectories computed at 50, 300, 600, and 900 K for 1 ps each, with a time step of 1 fs. The thermal conductivity can then be obtained by solving the Boltzmann transport equation. In this paper, the thermal transport properties of single-metal-atom catalysts are studied, to the best of our knowledge for the first time, using a machine-learning interatomic potential (MLIP). Results show that the single-metal-atom catalysts exhibit anisotropic thermal conductivities, some of them favorable. The average lattice thermal conductivities of G-FeN₄, G-CoN₄, and G-NiN₄ at 300 K are 88.61 W/mK, 205.32 W/mK, and 210.57 W/mK, respectively, while the other single-metal-atom catalysts show low thermal conductivity due to their short phonon lifetimes. The results also show that low-frequency phonons (0-10 THz) dominate the thermal transport. These results provide theoretical insights into the application of single-metal-atom catalysts in thermal management.
Keywords: proton exchange membrane fuel cell, single metal atom catalysts, density functional theory, thermal conductivity, machine-learning interatomic potential
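In the relaxation-time approximation, solving the Boltzmann transport equation reduces the lattice thermal conductivity along one direction to a sum over phonon modes: kappa = (1/V) * sum over modes of C * v^2 * tau. The mode data below are invented placeholders, not the computed G-MN₄ spectra from the study:

```python
def lattice_kappa(modes, volume):
    """kappa = (1/V) * sum over modes of heat capacity * group
    velocity^2 * phonon lifetime (SI units throughout)."""
    return sum(c * v * v * tau for c, v, tau in modes) / volume

# (C_mode [J/K], v_mode [m/s], tau_mode [s]) per mode, invented values.
modes = [(1.0e-23, 5000.0, 2.0e-11),
         (0.8e-23, 3000.0, 1.0e-11)]
kappa = lattice_kappa(modes, volume=1.0e-29)  # W/(m K)
print(kappa)
```

The formula makes the paper's observation concrete: short lifetimes tau suppress kappa, and slow low-frequency modes dominate only when their lifetimes are long.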
Procedia PDF Downloads 23
396 Evaluation Study of Easily Identification of Tactile Symbol on Body Soap Bottle
Authors: K. Doi, T. Nishimura, H. Fujimoto, Y. Hoshikawa, T. Wada
Abstract:
The Japanese Industrial Standards (JIS) association established a standard (JIS S 0021) on accessible packaging design for people with visual impairments and elderly people in 2000. The tactile symbol on shampoo bottles has since become well known as an example of accessible package design and is used effectively. However, people with visual impairments have reported difficulty identifying the body soap bottle among the three bottles of body soap, shampoo, and conditioner. The Japanese low vision association asked the JIS association to solve this problem, and the JIS association and the Japan cosmetic industry association constituted a review team to address it. The review team asked our research team to propose a new tactile symbol for body soap bottles. We conducted a user survey and a maker survey on tactile symbols for body soap bottles that can be identified easily, and seven test tactile symbol marks were selected from our proposed tactile symbols. In this study, we evaluate how easily the tactile symbols on body soap bottles can be identified. Six subjects with visual impairments participated in our experiment and were asked to identify the body soap bottle among the three bottles of body soap, shampoo, and conditioner. The test tactile symbols on body soap, produced with our originally developed 3D raising equipment, were presented in random order. From this study, the test tactile symbol marks that were easy to identify were short-listed from our proposed tactile symbols. This knowledge will be helpful in the revision of ISO 11156.
Keywords: tactile symbol, easily identification, body soap, people with visual impairments
Procedia PDF Downloads 313
395 Experimental Monitoring of the Parameters of the Ionosphere in the Local Area Using the Results of Multifrequency GNSS-Measurements
Authors: Andrey Kupriyanov
Abstract:
In recent years, much attention has been paid worldwide to the problems of ionospheric disturbances and their influence on the signals of global navigation satellite systems (GNSS). This is due to the increase in solar activity, the expansion of the scope of GNSS, the emergence of new satellite systems, the introduction of new frequencies, and many other factors. The influence of the Earth's ionosphere on the propagation of radio signals is an important factor in many applied fields of science and technology. The paper considers the application of the transionospheric sounding method, using measurements of signals from global navigation satellite systems, to determine the TEC distribution and the scintillations of the ionospheric layers. To calculate these parameters, the International Reference Ionosphere (IRI) model, refined in the local area, is used. The organization of operational monitoring of ionospheric parameters is analyzed using several NovAtel GPStation6 base stations, which allow the primary processing of GNSS measurement data, the calculation of TEC and the detection of scintillation events, the modeling of the ionosphere using the obtained data, data storage, and ionospheric correction of the measurements. The study shows that using the transionospheric sounding method to reconstruct the altitude distribution of electron concentration over different altitude ranges provides operational information about the ionosphere that is necessary for solving a number of practical problems in many application areas. Moreover, the use of multi-frequency, multi-system GNSS equipment and special software makes it possible to achieve the specified accuracy and volume of measurements.
Keywords: global navigation satellite systems (GNSS), GPStation6, international reference ionosphere (IRI), ionosphere, scintillations, total electron content (TEC)
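The geometry-free core of dual-frequency transionospheric sounding is that the ionospheric group delay scales as 40.3 * TEC / f^2, so the pseudorange difference between two frequencies isolates TEC. A sketch for the GPS L1/L2 pair (inter-frequency biases, which a real processing chain must calibrate out, are ignored here):

```python
def slant_tec(p1, p2, f1=1575.42e6, f2=1227.60e6):
    """Slant TEC in TECU (1 TECU = 1e16 el/m^2) from pseudoranges
    p1, p2 in meters measured on frequencies f1, f2 in Hz."""
    k = 40.3  # ionospheric constant, m^3 s^-2 per (el/m^2)
    tec = (f1 ** 2 * f2 ** 2) / (k * (f1 ** 2 - f2 ** 2)) * (p2 - p1)
    return tec / 1e16

# Round trip: a 20-TECU ionosphere delays L2 more than L1 by about 2.1 m.
f1, f2 = 1575.42e6, 1227.60e6
delta = 40.3 * 20e16 * (f1 ** 2 - f2 ** 2) / (f1 ** 2 * f2 ** 2)
print(round(slant_tec(0.0, delta), 3))  # -> 20.0
```

Multi-frequency, multi-system receivers repeat this estimate over many satellite links, which is what makes the local refinement of the IRI model possible.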
Procedia PDF Downloads 181
394 The Quality of Business Relationships in the Tourism System: An Imaginary Organisation Approach
Authors: Armando Luis Vieira, Carlos Costa, Arthur Araújo
Abstract:
The tourism system can be viewed as a network of relationships among business partners in which the success of each actor is ultimately determined by the success of the whole network. Especially since the publication of Gummesson's (1996) 'theory of imaginary organisations', which suggests that organisational effectiveness largely depends on managing relationships and sharing resources and activities, relationship quality (RQ) has been increasingly recognised as a main source of value creation and competitive advantage. However, there is still ambiguity around this topic, and managers and researchers have recurrently reported the need to better understand and capitalise on the quality of interactions with business partners. This research aims at testing an RQ model from a relational, imaginary-organisation approach. Two mail surveys provide the perceptions of 725 hotel representatives about their business relationships with tour operators, and of 1,224 corporate client representatives about their business relationships with hotels (21.9% and 38.8% response rates, respectively). The analysis contributes to enhancing our understanding of the linkages between RQ and its determinants, and identifies the role of their dimensions. Structural equation modelling results highlight trust as the dominant dimension, show the crucial role of commitment and satisfaction, and suggest customer orientation as a complementary building block. Findings also emphasise problem-solving behaviour and selling orientation as the most relevant dimensions of customer orientation. The comparison of the two dyads deepens the discussion and enriches the suggested theoretical and managerial guidelines concerning the contribution of quality relationships to business performance.
Keywords: corporate clients, destination competitiveness, hotels, relationship quality, structural equations modelling, tour operators
Procedia PDF Downloads 393
393 The Effects of Ellagic Acid on Rat Liver Induced Tobacco Smoke
Authors: Nalan Kaya, Elif Erdem, Mehmet Ali Kisacam, Gonca Ozan, Enver Ozan
Abstract:
Tobacco smokers continuously inhale thousands of carcinogens and free radicals; it is estimated that about 10¹⁷ oxidant molecules are present in each puff of tobacco smoke. Smoking is known to have adverse effects on the structure and functions of the liver. Ellagic acid (EA) has antioxidant, antiapoptotic, anticarcinogenic, antibacterial, and anti-inflammatory effects. The aim of our study was to investigate the possible protective effect of ellagic acid against tobacco smoke-mediated oxidative stress in the rat liver. Twenty-four adult male (8 weeks old) Sprague-Dawley rats were divided randomly into 4 equal groups: group I (control), group II (tobacco smoke), group III (tobacco smoke + corn oil), and group IV (tobacco smoke + ellagic acid). The rats in groups II, III, and IV were exposed to tobacco smoke for 1 hour twice a day for 12 weeks. In addition to tobacco smoke exposure, 12 mg/kg ellagic acid (dissolved in corn oil) was administered by oral gavage to the rats in group IV, while an equal amount of the corn oil used to dissolve the ellagic acid was administered by oral gavage to the rats in group III. At the end of the experimental period, the rats were decapitated, and liver tissues were removed for histological and biochemical analyses. Sinusoidal dilatation, inflammatory cell infiltration in the portal area, and increased Kupffer cells were observed in the tobacco smoke and tobacco smoke + corn oil groups; these findings were significantly reduced in the tobacco smoke + EA group. MDA levels in groups II and III were significantly higher than in group I, while their GSH activities did not differ from group I. Compared to group II, the MDA level in group IV was decreased and the GSH activity was increased significantly. The results indicate that ellagic acid could protect the liver tissue from the harmful effects of tobacco smoke.
Keywords: ellagic acid, liver, rat, tobacco smoke
Procedia PDF Downloads 300
392 Evaluating the Validity of CFD Model of Dispersion in a Complex Urban Geometry Using Two Sets of Experimental Measurements
Authors: Mohammad R. Kavian Nezhad, Carlos F. Lange, Brian A. Fleck
Abstract:
This research presents a validation study of a computational fluid dynamics (CFD) model developed to simulate the scalar dispersion emitted from rooftop sources around the buildings of the University of Alberta North Campus. The ANSYS CFX code was used to perform the numerical simulation of the wind regime and pollutant dispersion by solving the 3D steady Reynolds-averaged Navier-Stokes (RANS) equations on a building-scale, high-resolution grid. The validation study was performed in two steps. First, the CFD model's performance in 24 cases (eight wind directions and three wind speeds) was evaluated by comparing the predicted flow fields with the available data from a previous measurement campaign designed at the North Campus, using the standard deviation method (SDM). The numerical model showed maximum average percent errors of approximately 53% and 37% for wind incident from the north and northwest, respectively; good agreement with the measurements was observed for the other six directions, with an average error of less than 30%. In the second step, the reliability of the implemented turbulence model, numerical algorithm, modeling techniques, and grid generation scheme was further evaluated using the Mock Urban Setting Test (MUST) dispersion dataset. Different statistical measures, including the fractional bias (FB), the geometric mean bias (MG), and the normalized mean square error (NMSE), were used to assess the accuracy of the predicted dispersion field. Our CFD results are in very good agreement with the field measurements.
Keywords: CFD, plume dispersion, complex urban geometry, validation study, wind flow
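The three acceptance metrics named above have standard definitions: FB = 2(Co_mean - Cp_mean)/(Co_mean + Cp_mean), MG = exp(mean(ln Co) - mean(ln Cp)), and NMSE = mean((Co - Cp)^2)/(Co_mean * Cp_mean). A sketch on invented observed/predicted concentration pairs (not the MUST data):

```python
import math

def validation_metrics(obs, pred):
    """Fractional bias, geometric mean bias, and normalized mean
    square error for paired observed/predicted concentrations."""
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    fb = 2.0 * (mo - mp) / (mo + mp)
    mg = math.exp(sum(map(math.log, obs)) / n - sum(map(math.log, pred)) / n)
    nmse = sum((o - p) ** 2 for o, p in zip(obs, pred)) / (n * mo * mp)
    return fb, mg, nmse

# Invented concentration pairs; a perfect model gives FB=0, MG=1, NMSE=0.
obs = [1.2, 0.8, 2.5, 0.4]
pred = [1.0, 0.9, 2.0, 0.5]
fb, mg, nmse = validation_metrics(obs, pred)
print(round(fb, 3), round(mg, 3), round(nmse, 3))
```

FB and MG expose systematic over- or under-prediction (in linear and log space, respectively), while NMSE captures overall scatter, which is why dispersion validations report all three together.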
Procedia PDF Downloads 135