Search results for: robot programming
115 Evaluating the Success of an Intervention Course in a South African Engineering Programme
Authors: Alessandra Chiara Maraschin, Estelle Trengove
Abstract:
In South Africa, only 23% of engineering students attain their degrees in the minimum time of 4 years. This begs the question: Why is the 4-year throughput rate so low? Improving the throughput rate is crucial in assisting students to the shortest possible path to completion. The Electrical Engineering programme has a fixed curriculum and students must pass all courses in order to graduate. In South Africa, as is the case in several other countries, many students rely on external funding such as bursaries from companies in industry. If students fail a course, they often lose their bursaries, and most might not be able to fund their 'repeating year' fees. It is thus important to improve the throughput rate, since for many students, graduating from university is a way out of poverty for an entire family. In Electrical Engineering, it has been found that the Software Development I course (an introduction to C++ programming) is a significant hurdle course for students and has been found to have a low pass rate. It has been well-documented that students struggle with this type of course as it introduces a number of new threshold concepts that can be challenging to grasp in a short time frame. In an attempt to mitigate this situation, a part-time night-school for Software Development I was introduced in 2015 as an intervention measure. The course includes all the course material from the Software Development I module and allows students who failed the course in first semester a second chance by repeating the course through taking the night-school course. The purpose of this study is to determine whether the introduction of this intervention course could be considered a success. The success of the intervention is assessed in two ways. The study will first look at whether the night-school course contributed to improving the pass rate of the Software Development I course. Secondly, the study will examine whether the intervention contributed to improving the overall throughput from the 2nd year to the 3rd year of study at a South African University. Second year academic results for a sample of 1216 students have been collected from 2010-2017. Preliminary results show that the lowest pass rate for Software Development I was found to be in 2017 with a pass rate of 34.9%. Since the intervention course's inception, the pass rate for Software Development I has increased each year from 2015-2017 by 13.75%, 25.53% and 25.81% respectively. To conclude, the preliminary results show that the intervention course is a success in improving the pass rate of Software Development I.Keywords: academic performance, electrical engineering, engineering education, intervention course, low pass rate, software development course, throughput
Procedia PDF Downloads 164
114 Advancements in Mathematical Modeling and Optimization for Control, Signal Processing, and Energy Systems
Authors: Zahid Ullah, Atlas Khan
Abstract:
This abstract focuses on the advancements in mathematical modeling and optimization techniques that play a crucial role in enhancing the efficiency, reliability, and performance of these systems. In this era of rapidly evolving technology, mathematical modeling and optimization offer powerful tools to tackle the complex challenges faced by control, signal processing, and energy systems. This abstract presents the latest research and developments in mathematical methodologies, encompassing areas such as control theory, system identification, signal processing algorithms, and energy optimization. The abstract highlights the interdisciplinary nature of mathematical modeling and optimization, showcasing their applications in a wide range of domains, including power systems, communication networks, industrial automation, and renewable energy. It explores key mathematical techniques, such as linear and nonlinear programming, convex optimization, stochastic modeling, and numerical algorithms, that enable the design, analysis, and optimization of complex control and signal processing systems. Furthermore, the abstract emphasizes the importance of addressing real-world challenges in control, signal processing, and energy systems through innovative mathematical approaches. It discusses the integration of mathematical models with data-driven approaches, machine learning, and artificial intelligence to enhance system performance, adaptability, and decision-making capabilities. The abstract also underscores the significance of bridging the gap between theoretical advancements and practical applications. It recognizes the need for practical implementation of mathematical models and optimization algorithms in real-world systems, considering factors such as scalability, computational efficiency, and robustness. In summary, this abstract showcases the advancements in mathematical modeling and optimization techniques for control, signal processing, and energy systems. It highlights the interdisciplinary nature of these techniques, their applications across various domains, and their potential to address real-world challenges. The abstract emphasizes the importance of practical implementation and integration with emerging technologies to drive innovation and improve the performance of control, signal processing, and energy.Keywords: mathematical modeling, optimization, control systems, signal processing, energy systems, interdisciplinary applications, system identification, numerical algorithms
Procedia PDF Downloads 112
113 Critical Success Factors Influencing Construction Project Performance for Different Objectives: Procurement Phase
Authors: Samart Homthong, Wutthipong Moungnoi
Abstract:
Critical success factors (CSFs) and the criteria to measure project success have received much attention over the decades and are among the most widely researched topics in the context of project management. However, although there have been extensive studies on the subject by different researchers, to date, there has been little agreement on the CSFs. The aim of this study is to identify the CSFs that influence the performance of construction projects, and determine their relative importance for different objectives across five stages in the project life cycle. A considerable literature review was conducted that resulted in the identification of 179 individual factors. These factors were then grouped into nine major categories. A questionnaire survey was used to collect data from three groups of respondents: client representatives, consultants, and contractors. Out of 164 questionnaires distributed, 93 were returned, yielding a response rate of 56.7%. Using the mean score, relative importance index, and weighted average method, the top 10 critical factors for each category were identified. The agreement of survey respondents on those categorised factors were analysed using Spearman’s rank correlation. A one-way analysis of variance was then performed to determine whether the mean scores among the various groups of respondents were statistically significant. The findings indicate the most CSFs in each category in procurement phase are: proper procurement programming of materials (time), stability in the price of materials (cost), and determining quality in the construction (quality). They are then followed by safety equipment acquisition and maintenance (health and safety), budgeting allowed in a contractual arrangement for implementing environmental management activities (environment), completeness of drawing documents (productivity), accurate measurement and pricing of bill of quantities (risk management), adequate communication among the project team (human resource), and adequate cost control measures (client satisfaction). An understanding of CSFs would help all interested parties in the construction industry to improve project performance. Furthermore, the results of this study would help construction professionals and practitioners take proactive measures for effective project management.Keywords: critical success factors, procurement phase, project life cycle, project performance
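To make the analysis steps above concrete, the following minimal Python sketch shows how a relative importance index per factor, Spearman's rank correlation between respondent groups, and a one-way ANOVA might be computed; the ratings, factor counts, and group sizes are illustrative assumptions, not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical 5-point Likert ratings (rows = respondents, columns = factors)
clients = np.array([[5, 4, 3], [4, 4, 2], [5, 3, 3]])
contractors = np.array([[4, 5, 2], [3, 5, 3], [4, 4, 2]])

def relative_importance_index(ratings, max_scale=5):
    # RII = sum of ratings / (highest scale value * number of respondents), per factor
    return ratings.sum(axis=0) / (max_scale * ratings.shape[0])

rii_clients = relative_importance_index(clients)
rii_contractors = relative_importance_index(contractors)

# Agreement between the two groups on factor rankings (Spearman's rho)
rho, p_value = stats.spearmanr(rii_clients, rii_contractors)

# One-way ANOVA on a single factor's scores across respondent groups
f_stat, anova_p = stats.f_oneway(clients[:, 0], contractors[:, 0])
print(rii_clients, rii_contractors, rho, p_value, f_stat, anova_p)
```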
Procedia PDF Downloads 183
112 Using the Smith-Waterman Algorithm to Extract Features in the Classification of Obesity Status
Authors: Rosa Figueroa, Christopher Flores
Abstract:
Text categorization is the problem of assigning a new document to a set of predetermined categories, on the basis of a training set of free-text data that contains documents whose category membership is known. To train a classification model, it is necessary to extract characteristics in the form of tokens that facilitate the learning and classification process. In text categorization, the feature extraction process involves the use of word sequences also known as N-grams. In general, it is expected that documents belonging to the same category share similar features. The Smith-Waterman (SW) algorithm is a dynamic programming algorithm that performs a local sequence alignment in order to determine similar regions between two strings or protein sequences. This work explores the use of SW algorithm as an alternative to feature extraction in text categorization. The dataset used for this purpose, contains 2,610 annotated documents with the classes Obese/Non-Obese. This dataset was represented in a matrix form using the Bag of Word approach. The score selected to represent the occurrence of the tokens in each document was the term frequency-inverse document frequency (TF-IDF). In order to extract features for classification, four experiments were conducted: the first experiment used SW to extract features, the second one used unigrams (single word), the third one used bigrams (two word sequence) and the last experiment used a combination of unigrams and bigrams to extract features for classification. To test the effectiveness of the extracted feature set for the four experiments, a Support Vector Machine (SVM) classifier was tuned using 20% of the dataset. The remaining 80% of the dataset together with 5-Fold Cross Validation were used to evaluate and compare the performance of the four experiments of feature extraction. Results from the tuning process suggest that SW performs better than the N-gram based feature extraction. These results were confirmed by using the remaining 80% of the dataset, where SW performed the best (accuracy = 97.10%, weighted average F-measure = 97.07%). The second best was obtained by the combination of unigrams-bigrams (accuracy = 96.04, weighted average F-measure = 95.97) closely followed by the bigrams (accuracy = 94.56%, weighted average F-measure = 94.46%) and finally unigrams (accuracy = 92.96%, weighted average F-measure = 92.90%).Keywords: comorbidities, machine learning, obesity, Smith-Waterman algorithm
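A minimal Python sketch of the local-alignment scoring idea described above is given below; the match/mismatch/gap parameters and the token sequences are illustrative assumptions, not the values used in the study:

```python
import numpy as np

def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-1):
    """Return the best local-alignment score between token sequences a and b."""
    h = np.zeros((len(a) + 1, len(b) + 1))
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            diag = h[i - 1, j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # Local alignment: scores are floored at zero
            h[i, j] = max(0, diag, h[i - 1, j] + gap, h[i, j - 1] + gap)
    return h.max()

# Token sequences from two documents; a high score suggests a shared local pattern
doc_a = "patient reports obesity and hypertension".split()
doc_b = "history of obesity and hypertension noted".split()
print(smith_waterman_score(doc_a, doc_b))
```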
Procedia PDF Downloads 297
111 Comfort Sensor Using Fuzzy Logic and Arduino
Authors: Samuel John, S. Sharanya
Abstract:
Automation has become an important part of our lives. It has been used to control home entertainment systems, change the ambience of rooms for different events, etc. One of the main parameters to control in a smart home is atmospheric comfort, which mainly includes temperature and relative humidity. In homes, the desired temperature of different rooms varies from 20 °C to 25 °C and relative humidity is around 50%; however, these values vary widely. Hence, automated measurement of these parameters to ensure comfort assumes significance. To achieve this, a fuzzy logic controller was developed in MATLAB and deployed on an Arduino. Arduino is an open-source hardware platform built around an ATmega328 microcontroller, with 14 digital input/output pins and an inbuilt ADC. It runs on 5 V and 3.3 V power supported by an on-board voltage regulator. Some of the digital pins on the Arduino provide PWM (pulse width modulation) signals, which can be used in different applications. The Arduino platform provides an integrated development environment, which includes support for the C, C++ and Java programming languages. In the present work, a soft sensor was introduced into this system that can indirectly measure temperature and humidity and can process several measurements to ensure comfort. The Sugeno method (in which output variables are functions or singletons/constants, making it more suitable for implementation on microcontrollers) was used for the soft sensor in MATLAB and then interfaced to the Arduino, which in turn is interfaced to the DHT11 temperature and humidity sensor. The DHT11 acts as the sensing element in this system; a capacitive humidity sensor and a thermistor support the measurement of the temperature and relative humidity of the surroundings and provide a digital signal on the data pin. The comfort sensor developed was able to measure temperature and relative humidity correctly. The comfort percentage was calculated, and the temperature in the room was controlled accordingly. The system was placed in different rooms of the house to ensure that it adjusts the comfort values depending on the temperature and relative humidity of the environment. Compared to existing comfort control sensors, this system was found to provide an accurate comfort percentage. Depending on the comfort percentage, the air conditioners and the coolers in the room were controlled. The main highlight of the project is its cost efficiency.
Keywords: Arduino, DHT11, soft sensor, Sugeno
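A minimal Python sketch of a zero-order Sugeno-style inference, computing a comfort percentage from temperature and relative humidity, is shown below; the membership functions, rule outputs, and comfort ranges are illustrative assumptions (the study itself used MATLAB's fuzzy toolbox with an Arduino):

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def comfort_percentage(temp_c, rh):
    # Membership degrees for temperature and relative humidity (assumed ranges)
    temp_ok = tri(temp_c, 18, 22.5, 27)   # comfortable around 20-25 degC
    rh_ok = tri(rh, 30, 50, 70)           # comfortable around 50% RH
    temp_off = 1.0 - temp_ok
    rh_off = 1.0 - rh_ok

    # Zero-order Sugeno rules: each rule fires with a weight and outputs a constant
    rules = [
        (min(temp_ok, rh_ok), 100.0),   # both comfortable
        (min(temp_ok, rh_off), 60.0),   # humidity off
        (min(temp_off, rh_ok), 50.0),   # temperature off
        (min(temp_off, rh_off), 10.0),  # both off
    ]
    num = sum(w * z for w, z in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(comfort_percentage(23.0, 55.0))
```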
Procedia PDF Downloads 310
110 Innovation Management in E-Health Care: The Implementation of New Technologies for Health Care in Europe and the USA
Authors: Dariusz M. Trzmielak, William Bradley Zehner, Elin Oftedal, Ilona Lipka-Matusiak
Abstract:
The use of new technologies should create new value for all stakeholders in the healthcare system. The article focuses on demonstrating that technologies or products typically enable new functionality, a higher standard of service, or a higher level of knowledge and competence for clinicians. It also highlights the key benefits that can be achieved through the use of artificial intelligence, such as relieving clinicians of many tasks and enabling the expansion and greater specialisation of healthcare services. The comparative analysis allowed the authors to create a classification of new technologies in e-health according to health needs and benefits for patients, doctors, and healthcare systems, i.e., the main stakeholders in the implementation of new technologies and products in healthcare. The added value of the development of new technologies in healthcare is diagnosed. The work is both theoretical and practical in nature. The primary research methods are bibliographic analysis and analysis of research data and market potential of new solutions for healthcare organisations. The bibliographic analysis is complemented by the author's case studies of implemented technologies, mostly based on artificial intelligence or telemedicine. In the past, patients were often passive recipients, the end point of the service delivery system, rather than stakeholders in the system. One of the dangers of powerful new technologies is that patients may become even more marginalised. Healthcare will be provided and delivered in an increasingly administrative, programmed way. The doctor may also become a robot, carrying out programmed activities - using 'non-human services'. An alternative approach is to put the patient at the centre, using technologies, products, and services that allow them to design and control technologies based on their own needs. An important contribution to the discussion is to open up the different dimensions of the user (carer and patient) and to make them aware of healthcare units implementing new technologies. The authors of this article outline the importance of three types of patients in the successful implementation of new medical solutions. The impact of implemented technologies is analysed based on: 1) "Informed users", who are able to use the technology based on a better understanding of it; 2) "Engaged users" who play an active role in the broader healthcare system as a result of the technology; 3) "Innovative users" who bring their own ideas to the table based on a deeper understanding of healthcare issues. The authors' research hypothesis is that the distinction between informed, engaged, and innovative users has an impact on the perceived and actual quality of healthcare services. The analysis is based on case studies of new solutions implemented in different medical centres. In addition, based on the observations of the Polish author, who is a manager at the largest medical research institute in Poland, with analytical input from American and Norwegian partners, the added value of the implementations for patients, clinicians, and the healthcare system will be demonstrated.Keywords: innovation, management, medicine, e-health, artificial intelligence
Procedia PDF Downloads 20
109 Fostering Organizational Learning across the Canadian Sport System through Leadership and Mentorship Development of Sport Science Leaders
Authors: Jennifer Walinga, Samantha Heron
Abstract:
The goal of the study was to inform the design of effective leadership and mentorship development programming for sport science leaders within the network of Canadian sport institutes and centers. The LEAD (Learn, Engage, Accelerate, Develop) program was implemented to equip sport science leaders with the leadership knowledge, skills, and practice to foster a high - performance culture, enhance the daily training environment, and contribute to optimal performance in sport. After two years of delivery, this analysis of LEAD’s effect on individual and organizational health and performance factors informs the quality of future deliveries and identifies best practice for leadership development across the Canadian sport system and beyond. A larger goal for this project was to inform the public sector more broadly and position sport as a source of best practice for human and social health, development, and performance. The objectives of this study were to review and refine the LEAD program in collaboration with Canadian Sport Institute and Centre leaders, 40-50 participants from three cohorts, and the LEAD program advisory committee, and to trace the effects of the LEAD leadership development program on key leadership mentorship and organizational health indicators across the Canadian sport institutes and centers so as to capture best practice. The study followed a participatory action research framework (PAR) using semi structured interviews with sport scientist participants, program and institute leaders inquiring into impact on specific individual and organizational health and performance factors. Findings included a strong increase in self-reported leadership knowledge, skill, language and confidence, enhancement of human and organizational health factors, and the opportunity to explore more deeply issues of diversity and inclusion, psychological safety, team dynamics, and performance management. The study was significant in building sport leadership and mentorship development strategies for managing change efforts, addressing inequalities, and building personal and operational resilience amidst challenges of uncertainty, pressure, and constraint in real time.Keywords: sport leadership, sport science leader, leadership development, professional development, sport education, mentorship
Procedia PDF Downloads 22
108 Multilevel Modelling of Modern Contraceptive Use in Nigeria: Analysis of the 2013 NDHS
Authors: Akiode Ayobami, Akiode Akinsewa, Odeku Mojisola, Salako Busola, Odutolu Omobola, Nuhu Khadija
Abstract:
Purpose: Evidence exists that family planning use can contribute to reduction in infant and maternal mortality in any country. Despite these benefits, contraceptive use in Nigeria still remains very low, only 10% among married women. Understanding factors that predict contraceptive use is very important in order to improve the situation. In this paper, we analysed data from the 2013 Nigerian Demographic and Health Survey (NDHS) to better understand predictors of contraceptive use in Nigeria. The use of logistics regression and other traditional models in this type of situation is not appropriate as they do not account for social structure influence brought about by the hierarchical nature of the data on response variable. We therefore used multilevel modelling to explore the determinants of contraceptive use in order to account for the significant variation in modern contraceptive use by socio-demographic, and other proximate variables across the different Nigerian states. Method: This data has a two-level hierarchical structure. We considered the data of 26, 403 married women of reproductive age at level 1 and nested them within the 36 states and the Federal Capital Territory, Abuja at level 2. We modelled use of modern contraceptive against demographic variables, being told about FP at health facility, heard of FP on TV, Magazine or radio, husband desire for more children nested within the state. Results: Our results showed that the independent variables in the model were significant predictors of modern contraceptive use. The estimated variance component for the null model, random intercept, and random slope models were significant (p=0.00), indicating that the variation in contraceptive use across the Nigerian states is significant, and needs to be accounted for in order to accurately determine the predictors of contraceptive use, hence the data is best fitted by the multilevel model. Only being told about family planning at the health facility and religion have a significant random effect, implying that their predictability of contraceptive use varies across the states. Conclusion and Recommendation: Results showed that providing FP information at the health facility and religion needs to be considered when programming to improve contraceptive use at the state levels.Keywords: multilevel modelling, family planning, predictors, Nigeria
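A sketch of how a random-intercept logistic (multilevel) model of this kind might be specified in Python with statsmodels is shown below; the file name and column names are assumptions, and the original analysis may have used different software and estimation settings:

```python
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Hypothetical column names for an NDHS extract: one row per married woman,
# 'uses_modern' is 1/0, 'state' identifies the level-2 unit (36 states + FCT).
df = pd.read_csv("ndhs_2013_women.csv")

# Random-intercept logistic model: fixed effects for individual covariates,
# plus a variance component for state to capture between-state variation.
model = BinomialBayesMixedGLM.from_formula(
    "uses_modern ~ age + education + told_fp_at_facility + heard_fp_media + husband_wants_more",
    {"state": "0 + C(state)"},
    data=df,
)
result = model.fit_vb()   # variational Bayes fit
print(result.summary())
```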
Procedia PDF Downloads 418
107 Robotic Process Automation in Accounting and Finance Processes: An Impact Assessment of Benefits
Authors: Rafał Szmajser, Katarzyna Świetla, Mariusz Andrzejewski
Abstract:
Robotic process automation (RPA) is a technology in which repeatable business processes are performed by computer programs, robots that simulate the work of a human being. This approach assumes replacing an existing employee with dedicated software (software robots) to support activities that are primarily repetitive and uncomplicated and are characterized by a low number of exceptions. RPA application is widespread in modern business services, particularly in the areas of Finance, Accounting and Human Resources Management. By utilizing this technology, the effectiveness of operations increases while reducing workload, minimizing possible errors in the process and, as a result, bringing a measurable decrease in the cost of providing services. Regardless of how the use of modern information technology is assessed, there are also some doubts as to whether human activities should be replaced when automating business processes. After the initial awe for the new technological concept, a reflection arises: to what extent does the implementation of RPA increase the efficiency of operations, and is there a business case for implementing it? If the business case is beneficial, in which business processes does RPA hold the greatest potential? A closer look at these issues was provided in this research, during which the respondents' views of the perceived advantages resulting from the use of robotization and automation in financial and accounting processes were examined. As a result of an online survey addressed to over 500 respondents from international companies, 162 complete answers were returned from the most important types of organizations in the modern business services industry, i.e., Business or IT Process Outsourcing (BPO/ITO), Shared Service Centers (SSC), Consulting/Advisory and their customers. Answers were provided by representatives holding the following positions in their organizations: Members of the Board, Directors, Managers and Experts/Specialists. The structure of the survey allowed the respondents to add comments and observations. The results formed the basis for the creation of a business case calculating tangible benefits associated with the implementation of automation in the selected financial processes. The results of the statistical analyses carried out with regard to revenue growth confirmed the hypothesis that there is a correlation between job position and the perception of the impact of RPA implementation on individual benefits. The second hypothesis (H2), that there is a relationship between the kind of company in the business services industry and the perception of the impact of RPA on individual benefits, was not confirmed. Based on the survey results, the authors performed a simulation of the business case for implementing RPA in selected finance and accounting processes. The calculated payback period differed dramatically, ranging from 2 months for the Accounts Payable process (with 75% savings) to the extreme case of the Taxes process, in which implementation and maintenance costs exceeded the savings resulting from the use of the robot.
Keywords: automation, outsourcing, business process automation, process automation, robotic process automation, RPA, RPA business case, RPA benefits
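An illustrative payback-period calculation of the kind used in such a business case is sketched below in Python; all figures are hypothetical and are not taken from the survey:

```python
def payback_period_months(implementation_cost, monthly_maintenance,
                          monthly_baseline_cost, savings_rate):
    """Months until cumulative net savings cover the implementation cost."""
    monthly_savings = monthly_baseline_cost * savings_rate - monthly_maintenance
    if monthly_savings <= 0:
        return None  # costs exceed savings; the robot never pays back
    return implementation_cost / monthly_savings

# Illustrative figures only (not from the study): a process costing 20,000/month,
# 75% of which is automatable, with a 25,000 implementation cost.
print(payback_period_months(25_000, 2_000, 20_000, 0.75))  # ~1.9 months
print(payback_period_months(25_000, 6_000, 5_000, 0.50))   # None: no payback
```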
Procedia PDF Downloads 137
106 Smart Defect Detection in XLPE Cables Using Convolutional Neural Networks
Authors: Tesfaye Mengistu
Abstract:
Power cables play a crucial role in the transmission and distribution of electrical energy. As the electricity generation, transmission, distribution, and storage systems become smarter, there is a growing emphasis on incorporating intelligent approaches to ensure the reliability of power cables. Various types of electrical cables are employed for transmitting and distributing electrical energy, with cross-linked polyethylene (XLPE) cables being widely utilized due to their exceptional electrical and mechanical properties. However, insulation defects can occur in XLPE cables due to subpar manufacturing techniques during production and cable joint installation. To address this issue, experts have proposed different methods for monitoring XLPE cables. Some suggest the use of interdigital capacitive (IDC) technology for online monitoring, while others propose employing continuous wave (CW) terahertz (THz) imaging systems to detect internal defects in XLPE plates used for power cable insulation. In this study, we have developed models that employ a custom dataset collected locally to classify the physical safety status of individual power cables. Our models aim to replace physical inspections with computer vision and image processing techniques to classify defective power cables from non-defective ones. The implementation of our project utilized the Python programming language along with the TensorFlow package and a convolutional neural network (CNN). The CNN-based algorithm was specifically chosen for power cable defect classification. The results of our project demonstrate the effectiveness of CNNs in accurately classifying power cable defects. We recommend the utilization of similar or additional datasets to further enhance and refine our models. Additionally, we believe that our models could be used to develop methodologies for detecting power cable defects from live video feeds. We firmly believe that our work makes a significant contribution to the field of power cable inspection and maintenance. Our models offer a more efficient and cost-effective approach to detecting power cable defects, thereby improving the reliability and safety of power grids.Keywords: artificial intelligence, computer vision, defect detection, convolutional neural net
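A minimal TensorFlow/Keras sketch of a binary CNN classifier for defective versus non-defective cable images is shown below; the directory layout, image size, and network depth are assumptions rather than the study's actual configuration:

```python
import tensorflow as tf

IMG_SIZE = (128, 128)  # assumed input resolution

# Assumed directory layout: cable_images/{defective,non_defective}/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "cable_images", validation_split=0.2, subset="training", seed=42,
    image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "cable_images", validation_split=0.2, subset="validation", seed=42,
    image_size=IMG_SIZE, batch_size=32)

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # defective vs. non-defective
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
```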
Procedia PDF Downloads 112
105 Family Planning and HIV Integration: A One-stop Shop Model at Spilhaus Clinic, Harare Zimbabwe
Authors: Mercy Marimirofa, Farai Machinga, Alfred Zvoushe, Tsitsidzaishe Musvosvi
Abstract:
The Government of Zimbabwe embarked on integrating family planning with Sexually Transmitted Infection (STI) and Human Immunodeficiency Virus (HIV) services in May 2020 with support from the World Health Organization (WHO). There was high HIV prevalence, incidence rates and STI infections among women attending FP clinics. Spilhaus is a specialized center of excellence clinic which offers a range of sexual reproductive health services. HIV services were limited to testing only, and clients were referred to other facilities for further management. Integration of services requires that all the services be available at one point so that clients will access them during their visit to the facility. Objectives: The study was conducted to assess the impact the one-stop-shop model has made in accessing integrated Family Planning services and sexual reproductive health services compared to the supermarket approach. It also assessed the relationship family planning services have with other sexual reproductive health services. Methods: A secondary data analysis was conducted at Spilhaus clinic in Harare using family planning registers and HIV services registers comparing years 2019 and 2021. A 2 sample t-test was used to determine the difference in clients accessing the services under the two models. A Spearman’s rank correlation was used to determine if accessing family planning services has a relationship with other sexual reproductive health services. Results: In 2019, 7,548 clients visited the Spilhaus clinic compared to 8,265 during the period January to December 2021. The median age for all clients accessing services was 32 years. An increase of 69% in the number of services accessed was recorded from 2019 to 2021. More services were accessed in 2021. There was no difference in the number of clients accessing family planning services cervical cancer, and HIV services. A difference was found in the number of clients who were offered STI screening services. There was also a relationship between accessing family planning services and STI screening services (ρ = 0.729, p-value=0.006). Conclusion: Programming towards SRH services was a great achievement, the use of an integrated approach proved to be cost-effective as it minimised the required resources for separate programs. Clients accessed important health needs at once. The integration of these services provided an opportunity to offer comprehensive information which addressed an individual’s sexual reproductive health needs.Keywords: intergration, one stop shop, family planning, reproductive health
Procedia PDF Downloads 68
104 A Web-Based Systems Immunology Toolkit Allowing the Visualization and Comparative Analysis of Publically Available Collective Data to Decipher Immune Regulation in Early Life
Authors: Mahbuba Rahman, Sabri Boughorbel, Scott Presnell, Charlie Quinn, Darawan Rinchai, Damien Chaussabel, Nico Marr
Abstract:
Collections of large-scale datasets made available in public repositories can be used to identify and fill gaps in biomedical knowledge. But first, these data need to be made readily accessible to researchers for analysis and interpretation. Here a collection of transcriptome datasets was made available to investigate the functional programming of human hematopoietic cells in early life. Thirty two datasets were retrieved from the NCBI Gene Expression Omnibus (GEO) and loaded in a custom, interactive web application called the Gene Expression browser (GXB), designed for visualization and query of integrated large-scale data. Multiple sample groupings and gene rank lists were created based on the study design and variables in each dataset. Web links to customized graphical views can be generated by users and subsequently be used to graphically present data in manuscripts for publication. The GXB tool also enables browsing of a single gene across datasets, which can provide information on the role of a given molecule across biological systems. The dataset collection is available online. As a proof-of-principle, one of the datasets (GSE25087) was re-analyzed to identify genes that are differentially expressed by regulatory T cells in early life. Re-analysis of this dataset and a cross-study comparison using multiple other datasets in the above mentioned collection revealed that PMCH, a gene encoding a precursor of melanin-concentrating hormone (MCH), a cyclic neuropeptide, is highly expressed in a variety of other hematopoietic cell types, including neonatal erythroid cells as well as plasmacytoid dendritic cells upon viral infection. Our findings suggest an as yet unrecognized role of MCH in immune regulation, thereby highlighting the unique potential of the curated dataset collection and systems biology approach to generate new hypotheses which can be tested in future mechanistic studies.Keywords: early-life, GEO datasets, PMCH, interactive query, systems biology
Procedia PDF Downloads 296
103 Improving the Constructability of Highway Design Plans
Authors: R. Edward Minchin Jr.
Abstract:
The U.S. Federal Highway Administration (FHWA) Every Day Counts Program (EDC) has resulted in state DOTs putting evermore emphasis on speeding up the delivery of highway and bridge construction projects for use by the driving public. This has resulted in an increase in the use of alternative construction delivery systems such as design-build (D-B), construction manager at-risk (CMR) or construction manager/general contractor (CM/GC), and adding alternative technical concepts (ATCs) to traditional design-bid-build (DBB) contracts. ATCs have exhibited great potential for delivering substantial benefits like cost savings, increased constructability, and quicker project delivery. Previous research has found that knowledge of project constructability was lacking in state Department of Transportation (DOT) planning, programming, and environmental staffs. Many agencies have therefore relied on a set of ‘acceptable’ design solutions over the years of working with their local resource agencies. The result is that the permitting process for several government agencies has become increasingly restrictive with the result that the DOTs and their industry partners lose the ability to innovate after a permit is approved. The intent of this paper is to report on the research team’s progress in this ongoing effort furnish the United States government with a uniform set of guidelines for the application of constructability reviews during all phases of project development and delivery. The research uses surveys and interviews to determine which states have implemented formal programs to ensure that the constructor is furnished with a set of contract documents that affords said constructor with the best possible opportunity to successfully construct the project with the highest quality standards, within the contract duration and without exceeding the construction budget. Once these states are identified, workshops are held all over the nation, resulting in the team learning the best current practices and giving the team the ability to recommend new practices that will improve the process. The plan is for the FHWA to encourage or require state DOTs to use these practices on all federally funded highway and bridge construction projects. The project deliverable is a Guidebook for FHWA to use in disseminating the recommended practices to the states.Keywords: alternative construction delivery, alternative technical concepts, constructability, construction design plans
Procedia PDF Downloads 216
102 Portable and Parallel Accelerated Development Method for Field-Programmable Gate Array (FPGA)-Central Processing Unit (CPU)-Graphics Processing Unit (GPU) Heterogeneous Computing
Authors: Nan Hu, Chao Wang, Xi Li, Xuehai Zhou
Abstract:
The field-programmable gate array (FPGA) has been widely adopted in the high-performance computing domain. In recent years, the embedded system-on-a-chip (SoC) contains coarse granularity multi-core CPU (central processing unit) and mobile GPU (graphics processing unit) that can be used as general-purpose accelerators. The motivation is that algorithms of various parallel characteristics can be efficiently mapped to the heterogeneous architecture coupled with these three processors. The CPU and GPU offload partial computationally intensive tasks from the FPGA to reduce the resource consumption and lower the overall cost of the system. However, in present common scenarios, the applications always utilize only one type of accelerator because the development approach supporting the collaboration of the heterogeneous processors faces challenges. Therefore, a systematic approach takes advantage of write-once-run-anywhere portability, high execution performance of the modules mapped to various architectures and facilitates the exploration of design space. In this paper, A servant-execution-flow model is proposed for the abstraction of the cooperation of the heterogeneous processors, which supports task partition, communication and synchronization. At its first run, the intermediate language represented by the data flow diagram can generate the executable code of the target processor or can be converted into high-level programming languages. The instantiation parameters efficiently control the relationship between the modules and computational units, including two hierarchical processing units mapping and adjustment of data-level parallelism. An embedded system of a three-dimensional waveform oscilloscope is selected as a case study. The performance of algorithms such as contrast stretching, etc., are analyzed with implementations on various combinations of these processors. The experimental results show that the heterogeneous computing system with less than 35% resources achieves similar performance to the pure FPGA and approximate energy efficiency.Keywords: FPGA-CPU-GPU collaboration, design space exploration, heterogeneous computing, intermediate language, parameterized instantiation
Procedia PDF Downloads 118
101 Development of a Turbulent Boundary Layer Wall-pressure Fluctuations Power Spectrum Model Using a Stepwise Regression Algorithm
Authors: Zachary Huffman, Joana Rocha
Abstract:
Wall-pressure fluctuations induced by the turbulent boundary layer (TBL) that develops over aircraft are a significant source of aircraft cabin noise. Since the power spectral density (PSD) of these pressure fluctuations is directly correlated with the amount of sound radiated into the cabin, the development of accurate empirical models that predict the PSD has been an important ongoing research topic. The sound emitted can be derived from the pressure-fluctuation term in the Reynolds-averaged Navier-Stokes (RANS) equations. Therefore, early TBL empirical models (including those from Lowson, Robertson, Chase, and Howe) were primarily derived by simplifying and solving the RANS equations for the pressure fluctuations and adding appropriate scales. Most subsequent models (including the Goody, Efimtsov, Laganelli, Smol’yakov, and Rackl and Weston models) were derived by modifying these early models or from physical principles. Overall, these models have had varying levels of accuracy; in general, they are most accurate at the specific Reynolds and Mach numbers they were developed for, while being less accurate under other flow conditions. Despite this, recent research into the possibility of using alternative methods for deriving the models has been rather limited. More recent studies have demonstrated that an artificial neural network model was more accurate than traditional models and could be applied more generally, but the accuracy of other machine learning techniques has not been explored. In the current study, an original model is derived using a stepwise regression algorithm in the statistical programming language R and TBL wall-pressure fluctuation PSD data gathered at the Carleton University wind tunnel. The theoretical advantage of a stepwise regression approach is that it automatically filters out redundant or uncorrelated input variables (through the process of feature selection), and it is computationally faster than machine learning. The main disadvantage is the potential risk of overfitting. The accuracy of the developed model is assessed by comparing it to independently sourced datasets.
Keywords: aircraft noise, machine learning, power spectral density models, regression models, turbulent boundary layer wall-pressure fluctuations
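A sketch of an analogous forward stepwise (sequential) feature-selection procedure in Python is shown below; the study itself used R, and the candidate variable names, dataset file, and number of retained features here are assumptions:

```python
import numpy as np
import pandas as pd
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

# Hypothetical wind-tunnel dataset: candidate flow variables and the measured
# wall-pressure PSD at a given frequency (file and column names are assumptions).
df = pd.read_csv("tbl_psd_data.csv")
candidates = ["reynolds_number", "mach_number", "boundary_layer_thickness",
              "friction_velocity", "dynamic_pressure", "strouhal_number"]
X = df[candidates].to_numpy()
y = np.log10(df["psd"].to_numpy())  # model the log of the PSD

# Forward selection: add regressors one at a time, keeping those that improve
# cross-validated fit, then refit a linear model on the retained set.
selector = SequentialFeatureSelector(LinearRegression(), n_features_to_select=3,
                                     direction="forward", cv=5)
selector.fit(X, y)
kept = [c for c, keep in zip(candidates, selector.get_support()) if keep]
model = LinearRegression().fit(df[kept].to_numpy(), y)
print(kept, model.coef_, model.intercept_)
```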
Procedia PDF Downloads 135
100 The Effect of Green Power Trading Mechanism on Interregional Power Generation and Transmission in China
Authors: Yan-Shen Yang, Bai-Chen Xie
Abstract:
Background and significance of the study: Both green power trading schemes and interregional power transmission are effective ways to increase green power absorption and achieve renewable power development goals. China is accelerating the construction of interregional power transmission lines and the green power market. A critical issue arises from the close interaction between these two approaches, which can heavily affect green power quota allocation and renewable power development. Existing studies have not discussed this issue adequately, so it is urgent to figure out their relationship in order to achieve a suitable power market design and more reasonable power grid construction. Basic methodologies: We develop an equilibrium model of the power market in China to analyze the coupling effect of these two approaches as well as their influence on power generation and interregional transmission in China. Our model considers both the tradable green certificate (TGC) market and the green power market, and consists of producers, consumers in the market, and an independent system operator (ISO) minimizing the total system cost. Our equilibrium model includes the decision optimization process of each participant. To reformulate the models as a single-level problem, we replace the producer, consumer, ISO, and market equilibrium problems with their Karush-Kuhn-Tucker (KKT) conditions; the result is further reformulated as a mixed-integer linear program (MILP) and solved with the Gurobi solver. Major findings: The results show that: (1) the green power market can significantly promote renewable power absorption, while the TGC market provides a more flexible way for green power trading. (2) The phenomena of inefficient occupation and unavailable transmission lines appear simultaneously: the existing interregional transmission lines cannot fully meet the demand for wind and solar PV power trading in some areas, while the situation is reversed in other areas. (3) Synchronous implementation of the green power and TGC trading mechanisms can benefit the development of green power as well as interregional power transmission. (4) Green power transactions exacerbate the unfair distribution of carbon emissions; the carbon Gini coefficient reaches 0.323 under the green power market, which indicates high carbon inequality. The eastern coastal region will benefit the most due to its huge demand for external power.
Keywords: green power market, tradable green certificate, interregional power transmission, power market equilibrium model
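To illustrate the MILP modelling pattern (continuous generation variables, a binary line-investment decision, and a cost-minimizing objective), a toy two-region example in gurobipy is sketched below; all capacities, costs, and demands are illustrative and unrelated to the paper's actual formulation:

```python
import gurobipy as gp
from gurobipy import GRB

# Toy example: region A has surplus renewable capacity, region B has demand.
# A binary variable decides whether an interregional line is built; the objective
# trades off generation cost against the line's fixed cost (all figures illustrative).
m = gp.Model("green_power_toy")

gen_a = m.addVar(lb=0, ub=100, name="renewable_gen_A")   # MW of green power in A
gen_b = m.addVar(lb=0, ub=80, name="thermal_gen_B")      # MW of local thermal power in B
flow = m.addVar(lb=0, ub=60, name="flow_A_to_B")         # MW transmitted A -> B
build_line = m.addVar(vtype=GRB.BINARY, name="build_line")

m.addConstr(flow <= 60 * build_line, name="line_capacity")  # no flow without the line
m.addConstr(gen_a >= flow, name="export_balance")           # A must generate what it exports
m.addConstr(gen_b + flow >= 70, name="demand_B")            # B's demand of 70 MW

# Costs: renewable 10/MW, thermal 50/MW, fixed line cost 1500
m.setObjective(10 * gen_a + 50 * gen_b + 1500 * build_line, GRB.MINIMIZE)
m.optimize()
for v in m.getVars():
    print(v.VarName, v.X)
```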
Procedia PDF Downloads 147
99 Inverterless Grid Compatible Micro Turbine Generator
Authors: S. Ozeri, D. Shmilovitz
Abstract:
Micro‐Turbine Generators (MTG) are small size power plants that consist of a high speed, gas turbine driving an electrical generator. MTGs may be fueled by either natural gas or kerosene and may also use sustainable and recycled green fuels such as biomass, landfill or digester gas. The typical ratings of MTGs start from 20 kW up to 200 kW. The primary use of MTGs is for backup for sensitive load sites such as hospitals, and they are also considered a feasible power source for Distributed Generation (DG) providing on-site generation in proximity to remote loads. The MTGs have the compressor, the turbine, and the electrical generator mounted on a single shaft. For this reason, the electrical energy is generated at high frequency and is incompatible with the power grid. Therefore, MTGs must contain, in addition, a power conditioning unit to generate an AC voltage at the grid frequency. Presently, this power conditioning unit consists of a rectifier followed by a DC/AC inverter, both rated at the full MTG’s power. The losses of the power conditioning unit account to some 3-5%. Moreover, the full-power processing stage is a bulky and costly piece of equipment that also lowers the overall system reliability. In this study, we propose a new type of power conditioning stage in which only a small fraction of the power is processed. A low power converter is used only to program the rotor current (i.e. the excitation current which is substantially lower). Thus, the MTG's output voltage is shaped to the desired amplitude and frequency by proper programming of the excitation current. The control is realized by causing the rotor current to track the electrical frequency (which is related to the shaft frequency) with a difference that is exactly equal to the line frequency. Since the phasor of the rotation speed and the phasor of the rotor magnetic field are multiplied, the spectrum of the MTG generator voltage contains the sum and the difference components. The desired difference component is at the line frequency (50/60 Hz), whereas the unwanted sum component is at about twice the electrical frequency of the stator. The unwanted high frequency component can be filtered out by a low-pass filter leaving only the low-frequency output. This approach allows elimination of the large power conditioning unit incorporated in conventional MTGs. Instead, a much smaller and cheaper fractional power stage can be used. The proposed technology is also applicable to other high rotation generator sets such as aircraft power units.Keywords: gas turbine, inverter, power multiplier, distributed generation
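A short numerical illustration of the sum- and difference-frequency argument is sketched below in Python; the shaft electrical frequency, filter order, and cut-off are arbitrary illustrative choices:

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 20_000                       # sampling rate, Hz
t = np.arange(0, 0.2, 1 / fs)
f_shaft = 1_000                   # illustrative electrical frequency of the stator, Hz
f_rotor = f_shaft - 50            # rotor current programmed 50 Hz below it

# The product of the two sinusoids contains components at the difference
# frequency (f_shaft - f_rotor = 50 Hz) and the sum frequency (1950 Hz).
v = np.sin(2 * np.pi * f_shaft * t) * np.sin(2 * np.pi * f_rotor * t)

# Low-pass filter to keep only the 50 Hz difference component
b, a = butter(4, 200 / (fs / 2))  # 200 Hz cut-off
v_grid = filtfilt(b, a, v)

# The dominant remaining frequency should be ~50 Hz
spectrum = np.abs(np.fft.rfft(v_grid))
freqs = np.fft.rfftfreq(len(v_grid), 1 / fs)
print(freqs[np.argmax(spectrum)])
```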
Procedia PDF Downloads 238
98 Pathway to Sustainable Shipping: Electric Ships
Authors: Wei Wang, Yannick Liu, Lu Zhen, H. Wang
Abstract:
Maritime transport plays an important role in global economic development but also inevitably faces increasing pressures from all sides, such as ship operating cost reduction and environmental protection. An ideal innovation to address these pressures is electric ships. The electric ship is in the early stage. Considering the special characteristics of electric ships, i.e., travel range limit, to guarantee the efficient operation of electric ships, the service network needs to be re-designed carefully. This research designs a cost-efficient and environmentally friendly service network for electric ships, including the location of charging stations, charging plan, route planning, ship scheduling, and ship deployment. The problem is formulated as a mixed-integer linear programming model with the objective of minimizing total cost comprised of charging cost, the construction cost of charging stations, and fixed cost of ships. A case study using data of the shipping network along the Yangtze River is conducted to evaluate the performance of the model. Two operating scenarios are used: an electric ship scenario where all the transportation tasks are fulfilled by electric ships and a conventional ship scenario where all the transportation tasks are fulfilled by fuel oil ships. Results unveil that the total cost of using electric ships is only 42.8% of using conventional ships. Using electric ships can reduce 80% SOx, 93.47% NOx, 89.47% PM, and 42.62% CO2, but will consume 2.78% more time to fulfill all the transportation tasks. Extensive sensitivity analyses are also conducted for key operating factors, including battery capacity, charging speed, volume capacity, and a service time limit of transportation task. Implications from the results are as follows: 1) it is necessary to equip the ship with a large capacity battery when the number of charging stations is low; 2) battery capacity will influence the number of ships deployed on each route; 3) increasing battery capacity will make the electric ship more cost-effective; 4) charging speed does not affect charging amount and location of charging station, but will influence the schedule of ships on each route; 5) there exists an optimal volume capacity, at which all costs and total delivery time are lowest; 6) service time limit will influence ship schedule and ship cost.Keywords: cost reduction, electric ship, environmental protection, sustainable shipping
Procedia PDF Downloads 77
97 Logistics and Supply Chain Management Using Smart Contracts on Blockchain
Authors: Armen Grigoryan, Milena Arakelyan
Abstract:
The idea of smart logistics is still quite a complicated one. It can be used to market products to a large number of customers or to acquire raw materials of the highest quality at the lowest cost in geographically dispersed areas. The use of smart contracts in logistics and supply chain management has the potential to revolutionize the way that goods are tracked, transported, and managed. Smart contracts are simply computer programs written in one of the blockchain programming languages (Solidity, Rust, Vyper), which are capable of self-execution once predetermined conditions are met. They can be used to automate and streamline many of the traditional manual processes that are currently used in logistics and supply chain management, including the tracking and movement of goods, the management of inventory, and the facilitation of payments and settlements between different parties in the supply chain. Currently, logistics, which is concerned with transporting products between parties, is a core area for companies. Still, the problem in this sector is that its scale may lead to delays and defaults in the delivery of goods, as well as other issues. Moreover, large distributors require a large number of workers to meet all the needs of their stores. All this may contribute to significant delays in order processing and increases the likelihood of losing orders. In an attempt to address this problem, companies have automated their procedures, contributing to a significant increase in the number of businesses and distributors in the logistics sector. Hence, blockchain technology and smart-contract-based legal agreements seem to be suitable concepts for redesigning and optimizing collaborative business processes and supply chains. The main purpose of this paper is to examine the scope of blockchain technology and smart contracts in the field of logistics and supply chain management. This study discusses the research question of how, and to what extent, smart contracts and blockchain technology can facilitate and improve the implementation of collaborative business structures for sustainable entrepreneurial activities in smart supply chains. The intention is to provide a comprehensive overview of the existing research on the use of smart contracts in logistics and supply chain management and to identify any gaps or limitations in the current knowledge on this topic. This review aims to provide a summary and evaluation of the key findings and themes that emerge from the research, as well as to suggest potential directions for future research on the use of smart contracts in logistics and supply chain management.
Keywords: smart contracts, smart logistics, smart supply chain management, blockchain and smart contracts in logistics, smart contracts for controlling supply chain management
Procedia PDF Downloads 95
96 A Geo DataBase to Investigate the Maximum Distance Error in Quality of Life Studies
Authors: Paolino Di Felice
Abstract:
The background and significance of this study come from papers that have already appeared in the literature, which measured the impact of public services (e.g., hospitals, schools, ...) on citizens' needs satisfaction (one of the dimensions of QOL studies) by calculating the distance between the place where citizens live and the location of the services on the territory. Those studies assume that a citizen's dwelling coincides with the centroid of the polygon that expresses the boundary of the administrative district within the city they belong to. Such an assumption "introduces a maximum measurement error equal to the greatest distance between the centroid and the border of the administrative district." The case study this abstract reports on investigates the implications of adopting such an approach at geographical scales greater than the urban one, namely at the three levels of nesting of the Italian administrative units: the (20) regions, the (110) provinces, and the 8,094 municipalities. To carry out this study, two things need to be decided: a) how to store the huge amount of (spatial and descriptive) input data, and b) how to process it. The latter aspect involves: b.1) the design of algorithms to investigate the geometry of the boundaries of the Italian administrative units; b.2) their coding in a programming language; b.3) their execution and, eventually, b.4) archiving of the results on permanent storage. The IT solution we implemented is centered around a (PostgreSQL/PostGIS) Geo DataBase structured in terms of three tables that fit the hierarchy of nesting of the Italian administrative units: municipality(id, name, provinceId, istatCode, regionId, geometry), province(id, name, regionId, geometry), and region(id, name, geometry). The adoption of DBMS technology allows us to implement steps "a)" and "b)" easily. In particular, step "b)" is simplified dramatically by calling spatial operators and built-in spatial User Defined Functions within SQL queries against the Geo DB. The major findings coming from our experiments can be summarized as follows. The approximation that, on average, results from assimilating the residence of the citizens with the centroid of the administrative unit of reference is of a few kilometers (4.9) at the municipality level, while it becomes conspicuous at the other two levels (28.9 and 36.1, respectively). Therefore, studies such as those mentioned above can be extended up to the municipal level without affecting the correctness of the interpretation of the results, but not further. The IT framework implemented to carry out the experiments can be replicated for studies referring to the territory of other countries all over the world.
Keywords: quality of life, distance measurement error, Italian administrative units, spatial database
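A sketch of how the maximum centroid-to-border distance per municipality might be queried from such a Geo DataBase using Python and PostGIS is given below; the connection parameters are placeholders, and the geometries are assumed to be stored in a projected (metric) coordinate system:

```python
import psycopg2

# Connection parameters are placeholders, not real credentials
conn = psycopg2.connect(dbname="italy_admin", user="postgres",
                        password="secret", host="localhost")
cur = conn.cursor()

# For each municipality, the greatest distance between the centroid and the
# polygon boundary bounds the error made by collapsing residents onto the centroid.
cur.execute("""
    SELECT name,
           ST_MaxDistance(ST_Centroid(geometry), ST_Boundary(geometry)) AS max_error
    FROM municipality
    ORDER BY max_error DESC;
""")
for name, max_error in cur.fetchmany(10):
    print(name, max_error)

cur.close()
conn.close()
```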
Procedia PDF Downloads 371
95 Mural Exhibition as a Promotive Strategy to Proper Hygiene and Sanitation Practices among Children: A Case Study from Urban Slum Schools in Nairobi, Kenya
Authors: Abdulaziz Kikanga, Kellen Muchira, Styvers Kathuni, Paul Saitoti
Abstract:
Background: Provision of adequate levels of water, sanitation, and hygiene in schools is a strategic objective in achieving universal primary education among children in low- and middle-income countries. However, the lack of proper sanitation and hygiene practices in schools, especially those in informal settlements, has resulted in an increased rate of school absenteeism, thereby affecting the education and health outcomes of the children in those settings. Intervention or Response: Catholic Relief Services in Kenya supports five schools in informal settlements of Nairobi by painting key hygiene messages on school walls to promote proper hygiene and sanitation practices among the school children. The mural exhibitions depict the essence of proper hygiene practices, proper latrine use, and hand washing after visiting the latrine. The artwork is context-specific and is aimed at improving the uptake of proper hygiene and sanitation practices among the school children. A review of project-related documents was conducted, including interviews with the school children. Thematic analysis was used to interpret the qualitative information generated. Results and Lessons Learnt: Twelve school children were interviewed on proper hygiene and sanitation practices, and the exercise revealed that painted murals were the best communication platforms for creating awareness of issues relating to water, sanitation, and hygiene in schools. The painted murals provided a strong knowledge base for the formation of healthy habits in both the schools and the informal settlement. In addition, these sanitation messages on the school walls empower the children to share these practices with their siblings, parents, and other family members, thereby acting as agents of change for proper hygiene and sanitation in those informal settlements. The findings revealed that, by adopting proper sanitation and hygiene practices, there has been a reduction in school absenteeism due to a decrease in diseases related to inadequate sanitation and hygiene in schools. Conclusion: The adoption of proper sanitation in schools entails more than just a painted mural wall. Insights revealed that to have a lasting sanitation and hygiene intervention, there is a need to invest in effective hygiene educational programming that encourages the formation of proper hygiene habits and promotes changes in behavior.
Keywords: education outcomes, informal settlement, mural exhibition, school hygiene and sanitation
Procedia PDF Downloads 25494 Leveraging xAPI in a Corporate e-Learning Environment to Facilitate the Tracking, Modelling, and Predictive Analysis of Learner Behaviour
Authors: Libor Zachoval, Daire O Broin, Oisin Cawley
Abstract:
E-learning platforms, such as Blackboard, have two major shortcomings: limited data capture as a result of the limitations of SCORM (Shareable Content Object Reference Model), and the lack of incorporation of Artificial Intelligence (AI) and machine learning algorithms which could lead to better course adaptations. With the recent development of the Experience Application Programming Interface (xAPI), many additional types of data can be captured, and that opens a window of possibilities from which online education can benefit. In a corporate setting, where companies invest billions in the learning and development of their employees, some learner behaviours can be troublesome, for they can hinder the knowledge development of a learner. Behaviours that hinder knowledge development also raise ambiguity about a learner’s knowledge mastery, specifically those related to gaming the system. Furthermore, a company receives little benefit from its investment if employees are passing courses without possessing the required knowledge, and potential compliance risks may arise. Using xAPI and rules derived from a state-of-the-art review, we identified three learner behaviours, primarily related to guessing, in a corporate compliance course. The identified behaviours are: trying each option for a question, specifically for multiple-choice questions; selecting a single option for all the questions on the test; and continuously repeating tests upon failing, as opposed to going over the learning material. These behaviours were detected in learners who repeated the test at least 4 times before passing the course. These findings suggest that gauging the mastery of a learner from multiple-choice test scores alone is a naive approach. Thus, next steps will consider the incorporation of additional data points, knowledge estimation models to model the knowledge mastery of a learner more accurately, and analysis of the data for correlations between knowledge development and the identified learner behaviours. Additional work could explore how learner behaviours could be utilised to make changes to a course. For example, course content may require modifications (certain sections of the learning material may be shown not to be helpful to many learners in mastering the intended learning outcomes) or course design (such as the type and duration of feedback).Keywords: artificial intelligence, corporate e-learning environment, knowledge maintenance, xAPI
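To make the three behaviour rules concrete, the sketch below shows how such patterns might be flagged from simplified xAPI-like "answered"/"failed" statements. The record structure, field names, and thresholds are illustrative assumptions, not the production rule set derived in the study.

```python
from collections import defaultdict

# Simplified xAPI-like records: one dict per statement.
# Real xAPI statements carry actor/verb/object IRIs and result objects.
statements = [
    {"learner": "u1", "verb": "answered", "question": "q1", "response": "a"},
    {"learner": "u1", "verb": "answered", "question": "q1", "response": "b"},
    {"learner": "u1", "verb": "failed", "test": "compliance-101"},
]

def flag_guessing(statements, options_per_question=4, repeat_threshold=4):
    """Flag the three guessing-related behaviours for each learner."""
    tried = defaultdict(set)       # (learner, question) -> distinct responses tried
    chosen = defaultdict(list)     # learner -> responses across all questions
    failures = defaultdict(int)    # learner -> failed test attempts

    for s in statements:
        if s["verb"] == "answered":
            tried[(s["learner"], s["question"])].add(s["response"])
            chosen[s["learner"]].append(s["response"])
        elif s["verb"] == "failed":
            failures[s["learner"]] += 1

    flags = defaultdict(list)
    for (learner, question), responses in tried.items():
        if len(responses) >= options_per_question:           # tried every option
            flags[learner].append(f"tried all options on {question}")
    for learner, responses in chosen.items():
        if len(responses) > 1 and len(set(responses)) == 1:   # same option everywhere
            flags[learner].append("selected a single option for all questions")
    for learner, fails in failures.items():
        if fails >= repeat_threshold:                         # retried without review
            flags[learner].append("repeatedly failed and retried the test")
    return dict(flags)

print(flag_guessing(statements))
```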
Procedia PDF Downloads 12193 Development of a Multi-User Country Specific Food Composition Table for Malawi
Authors: Averalda van Graan, Joelaine Chetty, Malory Links, Agness Mwangwela, Sitilitha Masangwi, Dalitso Chimwala, Shiban Ghosh, Elizabeth Marino-Costello
Abstract:
Food composition data is becoming increasingly important, as dealing with food insecurity and malnutrition in its persistent form of under-nutrition is now coupled with increasing over-nutrition and its related ailments in the developing world, of which Malawi is not spared. In the absence of a food composition database (FCDB) inherent to our dietary patterns, efforts were made to develop a country-specific FCDB for nutrition practice, research, and programming. The main objective was to develop a multi-user, country-specific food composition database and table from existing published and unpublished scientific literature. A multi-phased approach guided by the project framework was employed. Phase 1 comprised a scoping mission to assess the nutrition landscape for compilation activities. Phase 2 involved training of a compiler and data collection from various sources, primarily institutional libraries, online databases, and food industry nutrient data. Phase 3 comprised the evaluation and compilation of data using FAO and INFOODS standards and guidelines. Phase 4 concluded the process with quality assurance. A total of 316 Malawian food items, categorized into eight food groups for 42 components, were captured. The majority were from the baby food group (27%), followed by the staple (22%) and animal (22%) food groups. Fats and oils contained the fewest food items (2%), followed by fruits (6%). Proximate values are well represented; however, the percentage of missing data is huge for some components, including Se 68%, I 75%, vitamin A 42%, and the lipid profile: saturated fat 53%, monounsaturated fat 59%, polyunsaturated fat 59%, and cholesterol 56%. The multi-phased approach following the project framework led to the development of the first Malawian FCDB and table. The table reflects inherent Malawian dietary patterns and nutritional concerns. The FCDB can be used by various professionals in nutrition and health. Rising over-nutrition, NCDs, and changing diets challenge us to provide nutrient profiles of processed foods and complete lipid profiles.Keywords: analytical data, dietary pattern, food composition data, multi-phased approach
Procedia PDF Downloads 9392 Mathematics as the Foundation for the STEM Disciplines: Different Pedagogical Strategies Addressed
Authors: Marion G. Ben-Jacob, David Wang
Abstract:
There is a mathematics requirement for entry-level college and university students, especially those who plan to study STEM (Science, Technology, Engineering and Mathematics). Most of them take College Algebra, and to continue their studies, they need to succeed in this course. Different pedagogical strategies are employed to promote the success of our students. There is, of course, the Traditional Method of teaching: lecture, examples, and problems for students to solve. The Emporium Model, another pedagogical approach, replaces traditional lectures with a learning resource center model featuring interactive software and on-demand personalized assistance. This presentation will compare these two methods of pedagogy and report on a study, with its results, of this comparison. Math is the foundation for science, technology, and engineering. It is generally used in STEM to find patterns in data. These patterns can be used to test relationships, draw general conclusions about data, and model the real world. In STEM, solutions to problems are analyzed, reasoned, and interpreted using math abilities in an assortment of real-world scenarios. This presentation will examine specific examples of how math is used in the different STEM disciplines. Math becomes practical in science when it is used to model natural and artificial experiments to identify a problem and develop a solution for it. As we analyze data, we are using math to find the statistical correlation between a cause and its effect. Scientists who use math include data scientists, biologists, and geologists. Without math, most technology would not be possible. Math is the basis of binary, and without programming, you just have the hardware. Addition, subtraction, multiplication, and division are also used in almost every program written. Mathematical algorithms are inherent in software as well. Mechanical engineers analyze scientific data to design robots by applying math and using software. Electrical engineers use math to help design and test electrical equipment. They also use math when creating computer simulations and designing new products. Chemical engineers often use mathematics in the lab. Advanced computer software is used to aid their research and production processes, modeling theoretical synthesis techniques and properties of chemical compounds. Mathematics mastery is crucial for success in the STEM disciplines. Pedagogical research on formative strategies and the necessary topics to be covered is essential.Keywords: emporium model, mathematics, pedagogy, STEM
Procedia PDF Downloads 7591 KPI and Tool for the Evaluation of Competency in Warehouse Management for Furniture Business
Authors: Kritchakhris Na-Wattanaprasert
Abstract:
The objective of this research is to design and develop a prototype of a key performance indicator system that is suitable for warehouse management in a case study and its user requirements. In this study, we design a prototype key performance indicator (KPI) system for a warehouse case study of a furniture business. The methodology follows these steps: identify the scope of the research and study related papers, gather the necessary data and user requirements, develop key performance indicators based on the balanced scorecard, design the program and database for the key performance indicators, code the program and set the relationships of the database, and finally test and debug each module. This study uses the Balanced Scorecard (BSC) for selecting and grouping key performance indicators. Microsoft SQL Server 2010 is used to create the system database. In regard to the visual programming language, Microsoft Visual C# 2010 is chosen as the graphical user interface development tool. This system consists of six main menus: menu login, menu main data, menu financial perspective, menu customer perspective, menu internal perspective, and menu learning and growth perspective. Each menu consists of key performance indicator forms. Each form contains a data import section, a data input section, a data search and edit section, and a report section. The system generates outputs in five main reports: the KPI detail report, KPI summary report, KPI graph report, benchmarking summary report, and benchmarking graph report. The user selects the report conditions and time period. As the system has been developed and tested, it proves to be one way of judging the extent to which warehouse objectives have been achieved. Moreover, it encourages warehouse functions to proceed with more efficiency. In order to be useful for other industries, this system can be adjusted appropriately. To increase the usefulness of the key performance indicator system, the recommendations for further development are as follows: -The warehouse should review the target values and set more suitable targets periodically as the situation fluctuates in the future. -The warehouse should review the key performance indicators and set more suitable key performance indicators periodically as the situation fluctuates in the future, in order to increase competitiveness and take advantage of new opportunities.Keywords: key performance indicator, warehouse management, warehouse operation, logistics management
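As a rough illustration of how a KPI record could be represented and compared against its target in such a system, a minimal sketch is given below. The KPI names, target values, and achievement formula are hypothetical; the actual prototype described above is implemented in C# on SQL Server, so this is only a language-neutral view of the data model behind the summary report.

```python
from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    perspective: str        # BSC perspective: financial, customer, internal, learning & growth
    actual: float
    target: float
    higher_is_better: bool = True

    def achievement_pct(self) -> float:
        """Percentage of target achieved (hypothetical formula)."""
        if self.higher_is_better:
            return 100.0 * self.actual / self.target
        return 100.0 * self.target / self.actual

kpis = [
    KPI("Order picking accuracy", "internal", actual=97.2, target=99.0),
    KPI("Inventory holding cost", "financial", actual=1.08e6, target=1.0e6,
        higher_is_better=False),
]

# A tiny "KPI summary report": indicators grouped by BSC perspective.
for k in sorted(kpis, key=lambda k: k.perspective):
    print(f"{k.perspective:12s} {k.name:25s} {k.achievement_pct():6.1f}% of target")
```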
Procedia PDF Downloads 43190 Fraud in the Higher Educational Institutions in Assam, India: Issues and Challenges
Authors: Kalidas Sarma
Abstract:
Fraud is a social problem that changes with social change, and it has a regional and global impact. The introduction of a private domain in higher education alongside public institutions has led to the commercialization of higher education, which encourages an unprecedented mushrooming of private institutions, resulting in fraudulent activities in higher educational institutions in Assam, India. Presently, fraud has been noticed in in-service promotion and in fake entry qualifications, with teachers at different levels of the workplace using fake master's, Master of Philosophy, and Doctor of Philosophy degree certificates. The aim and objective of the study are to identify grey areas in the maintenance of quality in higher educational institutions in Assam and also to draw the contours for planning and implementation. This study is based on both primary and secondary data collected through questionnaires and through information sought under the Right to Information Act, 2005. In Assam, there are 301 undergraduate and graduate colleges distributed over 27 (twenty-seven) administrative districts, with 11,000 (eleven thousand) college teachers. A total of 421 (four hundred twenty-one) college teachers from the 14 respondent colleges have been taken for analysis. The data collected have been analyzed by using a 'Hypertext Preprocessor' (PHP) application with MySQL and the Google Maps Application Programming Interface (API). Graphs have been generated using the open-source tool Chart.js. Spatial distribution maps have been generated with the help of the geo-references of the colleges. The results show: (i) violation of the University Grants Commission's (UGC's) regulations for the award of M.Phil./Ph.D. degrees is clearly exhibited; (ii) there is a gap between the apex regulatory bodies of higher education at the national as well as the state level in checking fraud; (iii) mala fide 'No Objection Certificates' (NOCs) issued by the Government of Assam have played a pivotal role in the occurrence of fraudulent practices in higher educational institutions of Assam; (iv) violation of the verdict of the Hon'ble Supreme Court of India regarding the territorial jurisdiction of universities for the award of Ph.D. and M.Phil. degrees in distance mode/study centres is also a responsible factor for the spread of these academic frauds in Assam and other states. The challenges and mitigation of these issues have been discussed.Keywords: Assam, fraud, higher education, mitigation
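The analysis itself was performed with PHP, MySQL, the Google Maps API, and Chart.js; purely to illustrate the kind of district-level aggregation that sits behind the graphs and spatial distribution maps, a small language-neutral sketch follows. The record fields and district names are hypothetical and do not reproduce the study's actual dataset.

```python
from collections import Counter

# Hypothetical flattened records: one dict per surveyed college teacher,
# with the district and a flag for a detected fake-degree case.
records = [
    {"district": "Kamrup", "college": "C-01", "fake_degree_flag": True},
    {"district": "Kamrup", "college": "C-02", "fake_degree_flag": False},
    {"district": "Jorhat", "college": "C-07", "fake_degree_flag": True},
]

# District-wise counts of flagged cases, e.g. feeding a Chart.js bar chart
# or joined with college geo-references to build a distribution map.
flagged_by_district = Counter(
    r["district"] for r in records if r["fake_degree_flag"]
)
for district, count in flagged_by_district.most_common():
    print(district, count)
```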
Procedia PDF Downloads 16789 A Multi-Criteria Decision Making Approach for Disassembly-To-Order Systems under Uncertainty
Authors: Ammar Y. Alqahtani
Abstract:
In order to minimize the negative impact on the environment, it is essential to properly manage the waste generated from the premature disposal of end-of-life (EOL) products. Consequently, governments and international organizations have introduced new policies and regulations to minimize the amount of waste being sent to landfills. Moreover, consumers' awareness regarding the environment has forced original equipment manufacturers to consider being more environmentally conscious. Therefore, manufacturers have thought of different ways to deal with waste generated from EOL products, viz., remanufacturing, reusing, recycling, or disposing of EOL products. The rate of depletion of virgin natural resources, and manufacturers' dependency on those resources, can be reduced when EOL products are remanufactured, reused, or recycled; this also cuts the amount of harmful waste sent to landfills. However, disposal of EOL products contributes to the problem and is therefore used as a last option. The number of EOL products needed has to be estimated in order to fulfill the demand for components. Then, a disassembly process needs to be performed to extract individual components and subassemblies. Smart products, built with embedded sensors and network connectivity to enable the collection and exchange of data, utilize sensors that are implanted into products during production. These sensors are used by remanufacturers to predict an optimal warranty policy and the time period that should be offered to customers who purchase remanufactured components and products. Sensor-provided data can help to evaluate the overall condition of a product, as well as the remaining lives of product components, prior to performing a disassembly process. In this paper, a multi-period disassembly-to-order (DTO) model is developed that takes into consideration the different system uncertainties. The DTO model is solved using Nonlinear Programming (NLP) over multiple periods. A DTO system is considered where a variety of EOL products are purchased for disassembly. The model's main objective is to determine the best combination of EOL products to be purchased from every supplier in each period, which maximizes the total profit of the system while satisfying the demand. This paper also addresses the impact of sensor-embedded products on the cost of warranties. Lastly, this paper presents and analyzes a case study involving various simulation conditions to illustrate the applicability of the model.Keywords: closed-loop supply chains, environmentally conscious manufacturing, product recovery, reverse logistics
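To make the decision structure concrete, here is a deliberately simplified, single-period, linear sketch of the purchase-planning problem: how many EOL units to buy from each supplier so that the components recovered by disassembly cover demand at minimum cost. The paper's actual model is a multi-period nonlinear program under uncertainty with sensor-informed yields; the yields, costs, demand, and supply caps below are illustrative only.

```python
import numpy as np
from scipy.optimize import linprog

# Columns = EOL product/supplier options; rows = components recovered.
# yield_matrix[i, j] = expected units of component i recovered per EOL unit j.
yield_matrix = np.array([[1.0, 0.8, 0.0],
                         [0.5, 0.9, 1.0],
                         [0.0, 0.7, 0.6]])
unit_cost = np.array([12.0, 15.0, 9.0])       # acquisition + disassembly cost per EOL unit
demand = np.array([120.0, 150.0, 90.0])       # component demand for the period
supply_cap = np.array([200.0, 180.0, 160.0])  # units each supplier can provide

# Minimise total cost subject to recovered components meeting demand
# (linprog form: A_ub @ x <= b_ub, so demand coverage becomes -Y x <= -d).
res = linprog(c=unit_cost,
              A_ub=-yield_matrix, b_ub=-demand,
              bounds=list(zip([0.0] * 3, supply_cap)),
              method="highs")
print("EOL units to buy per supplier:", res.x)
print("Total acquisition cost:", res.fun)
```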
Procedia PDF Downloads 13788 Review of the Model-Based Supply Chain Management Research in the Construction Industry
Authors: Aspasia Koutsokosta, Stefanos Katsavounis
Abstract:
This paper reviews the model-based qualitative and quantitative Operations Management research in the context of Construction Supply Chain Management (CSCM). The construction industry has been traditionally blamed for low productivity, cost and time overruns, waste, high fragmentation, and adversarial relationships. It has been slower than other industries to employ the Supply Chain Management (SCM) concept and to develop models that support decision-making and planning. However, over the last decade there has been a distinct shift from a project-based to a supply-based approach to construction management. CSCM has emerged as a new and promising management tool for construction operations, improving the performance of construction projects in terms of cost, time, and quality. Modeling the Construction Supply Chain (CSC) offers the means to reap the benefits of SCM, make informed decisions, and gain competitive advantage. Different modeling approaches and methodologies have been applied in the multi-disciplinary and heterogeneous research field of CSCM. The literature review reveals that a considerable percentage of CSC modeling accommodates conceptual or process models which discuss general management frameworks and do not relate to acknowledged soft OR methods. We particularly focus on the model-based quantitative research and categorize the CSCM models depending on their scope, mathematical formulation, structure, objectives, solution approach, software used, and decision level. Although over the last few years there has clearly been an increase in research papers on quantitative CSC models, we find that the relevant literature is very fragmented, with limited applications of simulation, mathematical programming, and simulation-based optimization. Most applications are project-specific or study only parts of the supply system. Thus, some complex interdependencies within construction are neglected, and the implementation of integrated supply chain management is hindered. We conclude this paper by giving future research directions and emphasizing the need to develop robust mathematical optimization models for the CSC. We stress that CSC modeling needs a multi-dimensional, system-wide, and long-term perspective. Finally, prior applications of SCM to other industries have to be taken into account in order to model CSCs, but not without the consequent reform of generic concepts to match the unique characteristics of the construction industry.Keywords: construction supply chain management, modeling, operations research, optimization, simulation
Procedia PDF Downloads 50387 6-Degree-Of-Freedom Spacecraft Motion Planning via Model Predictive Control and Dual Quaternions
Authors: Omer Burak Iskender, Keck Voon Ling, Vincent Dubanchet, Luca Simonini
Abstract:
This paper presents a Guidance and Control (G&C) strategy to approach and synchronize with potentially rotating targets. The proposed strategy generates and tracks a safe trajectory for space servicing missions, including tasks like approaching, inspecting, and capturing. The main objective of this paper is to validate the G&C laws using a Hardware-In-the-Loop (HIL) setup with realistic rendezvous and docking equipment. Throughout this work, the assumption of full relative state feedback is relaxed by using onboard sensors that bring realistic errors and delays, while the proposed closed-loop approach demonstrates robustness to this challenge. Moreover, the G&C blocks are unified via the Model Predictive Control (MPC) paradigm, and the coupling between translational motion and rotational motion is addressed via a dual-quaternion-based kinematic description. In this work, G&C is formulated as a convex optimization problem where constraints such as thruster limits and output constraints are explicitly handled. Furthermore, the Monte Carlo method is used to evaluate the robustness of the proposed method to initial condition errors, the uncertainty of the target's motion and attitude, and actuator errors. A capture scenario is tested on the robotic test bench, whose onboard sensors estimate the position and orientation of a drifting satellite through camera imagery. Finally, the approach is compared with currently used robust H-infinity controllers and the guidance profile provided by the industrial partner. The HIL experiments demonstrate that the proposed strategy is a potential candidate for future space servicing missions because 1) the algorithm is real-time implementable, as convex programming offers deterministic convergence properties and guarantees a finite-time solution, 2) critical physical and output constraints are respected, 3) robustness to sensor errors and uncertainties in the system is proven, and 4) translational motion is coupled with rotational motion.Keywords: dual quaternion, model predictive control, real-time experimental test, rendezvous and docking, spacecraft autonomy, space servicing
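For intuition on the MPC formulation, a minimal translational-only sketch is shown below: double-integrator relative dynamics, a thrust-acceleration limit, and a terminal rendezvous constraint, solved as a convex program with CVXPY. This omits the dual-quaternion attitude coupling, the safety corridor, and all other mission constraints of the actual G&C design, and every numerical value is illustrative.

```python
import numpy as np
import cvxpy as cp

dt, N = 2.0, 30                      # step [s], horizon length
# Double-integrator relative translational dynamics, state x = [position; velocity].
A = np.block([[np.eye(3), dt * np.eye(3)],
              [np.zeros((3, 3)), np.eye(3)]])
B = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])

x0 = np.array([50.0, -20.0, 10.0, 0.0, 0.0, 0.0])  # initial relative state [m, m/s]
u_max = 0.1                                        # thrust acceleration limit [m/s^2]

x = cp.Variable((6, N + 1))
u = cp.Variable((3, N))

cost = 0
constraints = [x[:, 0] == x0]
for k in range(N):
    cost += cp.sum_squares(x[:3, k]) + 10 * cp.sum_squares(u[:, k])
    constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                    cp.norm(u[:, k], "inf") <= u_max]        # thruster limits
constraints += [x[:, N] == np.zeros(6)]                      # reach the target at rest

cp.Problem(cp.Minimize(cost), constraints).solve()
print("First control move:", u.value[:, 0])
```

In a receding-horizon implementation, only the first control move is applied, the relative state is re-estimated from the onboard sensors, and the problem is re-solved at the next step.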
Procedia PDF Downloads 14686 Mental Health Impacts of COVID-19 on Diverse Youth and Families in Canada
Authors: Lucksini Raveendran
Abstract:
Introduction: This mixed-methods study focuses on the experiences of ethnocultural youth and families in Canada, identifying key barriers and opportunities to inform service programming and policies that can better meet their mental health needs during the COVID-19 pandemic and beyond. Methods: The Mental Health Commission of Canada's Headstrong initiative administered the youth survey (April – June 2020) and the family survey (June – August 2020), with total sample sizes of 137 and 481 respondents, respectively. Thematic analysis was conducted to identify key challenges faced, coping strategies used, and help-seeking behaviours. A similar approach was also applied to the family survey data, but instead, a representative sample was collated to analyze geographically variable and ethnically diverse subgroups. Results and analysis: Multiple challenges have impacted families, including increased feelings of loneliness and distress from border travel restrictions, especially among those navigating pregnancy alone or managing children with developmental needs, which is often understudied. Also, marginalized groups were disproportionately affected by inequitable access to communication technologies, further deepening the digital divide. Some reported living in congregated homes with regular conflicts, thus leading to increased anxiety and exposure to violence. For many families, urbanicity and ethnicity played a key role in how families reported coping with feelings of uncertainty while managing work commitments, navigating community resources, fulfilling care responsibilities, and homeschooling children of all ages. Despite these challenges, there was evidence of post-traumatic growth and of building community resiliency. Conclusions and implications for policy, practice, or additional research: There is a need to foster opportunities to promote and sustain mental health, wellness, and resilience for families through social connections. Also, intersectionality must be embedded in the collection, analysis, and application of data to improve equitable access to evidence-based and recovery-oriented mental health supports among diverse families in Canada. Lastly, future research should address the long-term impacts of COVID-19 travel border restrictions on family wellness.Keywords: mental health, youth mental health, family wellness, health equity
Procedia PDF Downloads 94