Search results for: automated checklists
197 Reliability Analysis of Variable Stiffness Composite Laminate Structures
Authors: A. Sohouli, A. Suleman
Abstract:
This study focuses on reliability analysis of variable stiffness composite laminate structures to investigate the potential structural improvement compared to conventional (straight-fiber) composite laminate structures. A computational framework was developed that consists of a deterministic design step and a reliability analysis step. The deterministic design uses Discrete Material Optimization (DMO), and the reliability of the structure is computed by Monte Carlo Simulation (MCS) after applying the Stochastic Response Surface Method (SRSM). The design driver in the deterministic optimization is maximum stiffness, while the optimization method incorporates certain manufacturing constraints to attain industrial relevance: the change of fiber orientation between adjacent patches cannot be too large, and the number of successive plies with a particular fiber orientation should not be too high. Variable stiffness composites may be manufactured by Automated Fiber Placement (AFP) machines, which provide consistent quality at good production rates. However, laps and gaps are the most important challenges in steering fibers and affect the performance of the structures. In this study, the optimal curved fiber paths at each layer of the composite are first designed by DMO, and then reliability analysis is applied to investigate the sensitivity of the structure to different standard deviations compared to straight-fiber-angle composites. The random variables are the material properties and the loads on the structure. The results show that variable stiffness composite laminate structures are much more reliable than conventional composite laminate structures, even for high standard deviations of the material properties.
The reason is that variable stiffness composite laminates allow the stiffness to be tailored, making it possible to adjust the stress and strain distributions favorably in the structure.
Keywords: material optimization, Monte Carlo simulation, reliability analysis, response surface method, variable stiffness composite structures
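The SRSM-plus-MCS pipeline described above can be sketched as follows. This is an illustrative stand-in, not the authors' framework: the structural response, distributions, and failure threshold are all assumed here, whereas a real analysis would query a finite element model of the laminate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical limit state: failure when a response (e.g., a deflection)
# exceeds a threshold. A cheap analytic stand-in keeps the sketch runnable.
def response(e, l):          # e, l: stiffness and load normalized by their means
    return l / e             # placeholder structural response (normalized)

cv_E, cv_L = 0.10, 0.15      # coefficients of variation (assumed)
limit = 1.26                 # allowable normalized response (assumed)

# Step 1 (SRSM idea): fit a low-order polynomial response surface to a small
# number of deterministic model runs, so the full model never enters the loop.
e_pts, l_pts = np.meshgrid(np.linspace(0.7, 1.3, 5), np.linspace(0.7, 1.3, 5))
z = response(e_pts, l_pts).ravel()
A = np.column_stack([np.ones(z.size), e_pts.ravel(), l_pts.ravel(),
                     e_pts.ravel()**2, l_pts.ravel()**2, (e_pts * l_pts).ravel()])
coef, *_ = np.linalg.lstsq(A, z, rcond=None)

def surrogate(e, l):
    return (coef[0] + coef[1]*e + coef[2]*l
            + coef[3]*e**2 + coef[4]*l**2 + coef[5]*e*l)

# Step 2: Monte Carlo simulation on the cheap surrogate.
n = 100_000
e_s = rng.normal(1.0, cv_E, n)
l_s = rng.normal(1.0, cv_L, n)
p_fail = np.mean(surrogate(e_s, l_s) > limit)
print(f"estimated failure probability: {p_fail:.4f}")
```

Varying `cv_E` then shows the sensitivity of the failure probability to the standard deviation of the material properties, which is the comparison the study performs between variable stiffness and straight-fiber laminates.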
Procedia PDF Downloads 520
196 A Bicycle-Based Model of Prehospital Care Implemented in the Northeast of Brazil: Initial Experience
Authors: Odaleia de O. Farias, Suzelene C. Marinho, Ecleidson B. Fragoso, Daniel S. Lima, Francisco R. S. Lira, Lara S. Araújo, Gabriel dos S. D. Soares
Abstract:
In populous cities, prehospital care services that use vehicles as alternatives to ambulances are needed in order to reduce costs and improve response times in areas with large concentrations of people, such as leisure and tourism spaces. In this context, a program called BIKE VIDA, an innovative quick-access assistance program, was implemented. The aim of this study is to describe the implementation and initial profile of occurrences attended by an urgency/emergency prehospital care service delivered by paramedics on bicycles. It is a cross-sectional, descriptive study carried out in the city of Fortaleza, Ceara, Brazil. The data included service records from July to August 2017. Ethical aspects were respected. The service covers a perimeter of 4.5 km, divided into three areas with a perimeter of 1.5 km for each paramedic, attending from 5 am to 9 pm. Materials transported by bicycle include an automated external defibrillator (AED), portable oxygen, an oximeter, a cervical collar, a stethoscope, a sphygmomanometer, dressing and immobilization materials, and personal protective equipment. Occurrences are requested directly by calling the emergency number 192 or by approaching the professional directly. In the first month of the program, there were 93 emergencies/urgencies, mainly in the daytime period (71.0%), in males (59.7%), and in the age range of 26 to 45 years (46.2%). The main nature was traumatic incidents (53.3%). Most of the cases (88.2%) did not require ambulance transport to the hospital, and there were two deaths. Prehospital service by bicycle is an innovative strategy in Brazil and has shown promise in reducing costs and improving the quality of the services offered.
Keywords: emergency, response time, prehospital care, urgency
Procedia PDF Downloads 197
195 Curvature-Based Methods for Automatic Coarse and Fine Registration in Dimensional Metrology
Authors: Rindra Rantoson, Hichem Nouira, Nabil Anwer, Charyar Mehdi-Souzani
Abstract:
Multiple measurements by means of various data acquisition systems are generally required to measure the shape of freeform workpieces for accuracy, reliability, and completeness. The obtained data are aligned and fused into a common coordinate system using a registration technique involving coarse and fine registration. Standardized iterative methods have been established for fine registration, such as Iterative Closest Point (ICP) and its variants. For coarse registration, no conventional method has been adopted yet, despite the significant number of techniques developed in the literature to supply an automatic rough matching between data sets. Two main issues are addressed in this paper: coarse registration and fine registration. For coarse registration, two novel automated methods based on the exploitation of discrete curvatures are presented: an enhanced Hough Transformation (HT) and an improved RANSAC transformation. The use of curvature features in both methods aims to reduce computational cost. For fine registration, a new variant of the ICP method is proposed in order to reduce registration error using curvature parameters. A specific distance accounting for curvature similarity has been combined with the Euclidean distance to define the distance criterion used in the correspondence search. Additionally, the objective function has been improved by combining point-to-point (P-P) minimization and point-to-plane (P-Pl) minimization with automatic weights. These weights are determined from the curvature features calculated beforehand at each point of the workpiece surface. The algorithms are applied to simulated data and to real data acquired by a computed tomography (CT) system.
The obtained results reveal the benefit of the proposed novel curvature-based registration methods.
Keywords: discrete curvature, RANSAC transformation, Hough transformation, coarse registration, ICP variant, point-to-point and point-to-plane minimization combination, computed tomography
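The combined distance criterion used in the correspondence search can be sketched as below. The weighting and normalization scheme here is illustrative, not the paper's exact formula; `w_curv` is a hypothetical parameter.

```python
import numpy as np

def combined_correspondences(src_pts, src_curv, dst_pts, dst_curv, w_curv=0.5):
    """For each source point, find the destination point minimizing a
    combined distance: the Euclidean distance plus a curvature-similarity
    term. The weighting scheme is illustrative only."""
    d_euc = np.linalg.norm(src_pts[:, None, :] - dst_pts[None, :, :], axis=2)
    d_cur = np.abs(src_curv[:, None] - dst_curv[None, :])
    # Normalize each term so the weight is dimensionless.
    d = d_euc / (d_euc.max() + 1e-12) + w_curv * d_cur / (d_cur.max() + 1e-12)
    return np.argmin(d, axis=1)

# Tiny example: two low-curvature points and one high-curvature point
# in each cloud; the second cloud is a slightly shifted copy.
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
src_k = np.array([0.1, 0.1, 5.0])        # discrete curvature estimates
dst = src + 0.05
dst_k = src_k.copy()
print(combined_correspondences(src, src_k, dst, dst_k))  # → [0 1 2]
```

Inside an ICP loop, these correspondences would feed the weighted P-P/P-Pl objective at each iteration; the curvature term helps reject geometrically implausible matches that are merely close in space.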
Procedia PDF Downloads 424
194 Effects of Different Processing Methods of Typha Grass on Feed Intake, Milk Yield/Composition and Blood Parameters of Dairy Cows
Authors: Alhaji Musa Abdullahi, Usman Abdullahi, Adamu Lawan, Aminu Maidala
Abstract:
Sixteen healthy lactating cows will be randomly selected for the trial and randomly divided into 4 groups of 4 cows each. They will be kept under similar management conditions (conventional management system), using animals of relatively similar weight and age. After 11 days of adaptation, the feed intake and performance of the experimental animals will be determined. Milk samples will be collected at each milking, morning and afternoon, to determine milk yield, milk fat percentage, solids-not-fat percentage, and total solids percentage. Cow dung will be observed and scored as follows: score 1, very loose watery stool; score 2, semi-solid with undigested raw material; score 3, semi-solid with less undigested raw material; score 4, solid with very little undigested raw material; score 5, good dung with no undigested raw material. At the end of the experiment, blood samples will be analyzed for full blood counts and differentials {White Blood Cells (WBC), Red Blood Cells (RBC), Hemoglobin (Hb), Packed Cell Volume (PCV), Mean Corpuscular Volume (MCV), Mean Corpuscular Hemoglobin (MCH), Mean Corpuscular Hemoglobin Concentration (MCHC), Platelets (PLT), Lymphocytes (LYM), Basophils, Eosinophils and Monocytes Proportion (MXD), and Neutrophils (NEUT)} using an automated hematology analyzer. Serum samples will be analyzed for heat shock transcription factors, heat shock proteins, and hormones (serum glucocorticoid, prolactin, and cortisol). Moreover, biochemical analyses will also be conducted for total protein (TP), albumin (ALB), globulin (GBL), total cholesterol (TCH), glucose (G), sodium (Na+), potassium (K+), chloride (Cl-), and pH.
Keywords: lactating cows, milk yield, milk composition, dung score, blood parameters
Procedia PDF Downloads 184
193 Evaluating Generative Neural Attention Weights-Based Chatbot on Customer Support Twitter Dataset
Authors: Sinarwati Mohamad Suhaili, Naomie Salim, Mohamad Nazim Jambli
Abstract:
Sequence-to-sequence (seq2seq) models augmented with attention mechanisms are playing an increasingly important role in automated customer service. These models, which are able to recognize complex relationships between input and output sequences, are crucial for optimizing chatbot responses. Central to these mechanisms are neural attention weights that determine the focus of the model during sequence generation. Despite their widespread use, there remains a gap in the comparative analysis of different attention weighting functions within seq2seq models, particularly in the domain of chatbots using the Customer Support Twitter (CST) dataset. This study addresses this gap by evaluating four distinct attention-scoring functions (dot, multiplicative/general, additive, and an extended multiplicative function with a tanh activation parameter) in neural generative seq2seq models. Utilizing the CST dataset, these models were trained and evaluated over 10 epochs with the AdamW optimizer. Evaluation criteria included validation loss and BLEU scores implemented under both greedy and beam search strategies with a beam size of k=3. Results indicate that the model with the tanh-augmented multiplicative function significantly outperforms its counterparts, achieving the lowest validation loss (1.136484) and the highest BLEU scores (0.438926 under greedy search, 0.443000 under beam search, k=3). These results emphasize the crucial influence of selecting an appropriate attention-scoring function in improving the performance of seq2seq models for chatbots. Particularly, the model that integrates tanh activation proves to be a promising approach to improve the quality of chatbots in the customer support context.
Keywords: attention weight, chatbot, encoder-decoder, neural generative attention, score function, sequence-to-sequence
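The four attention-scoring functions compared in the study can be sketched in NumPy as follows. The hidden size, the random weight matrices, and the rendering of the "extended multiplicative function with a tanh activation" as tanh applied to the general score are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 8                                  # hidden size (illustrative)
h_enc = rng.standard_normal((5, d))    # 5 encoder states
h_dec = rng.standard_normal(d)         # current decoder state

W  = rng.standard_normal((d, d))       # general / multiplicative weight
Wa = rng.standard_normal((d, d))       # additive-attention weights
Ua = rng.standard_normal((d, d))
v  = rng.standard_normal(d)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

scores = {
    "dot":          h_enc @ h_dec,
    "general":      h_enc @ (W @ h_dec),
    "additive":     np.tanh(h_enc @ Wa.T + h_dec @ Ua.T) @ v,
    # our reading of the tanh-augmented multiplicative variant:
    "general_tanh": np.tanh(h_enc @ (W @ h_dec)),
}
weights = {name: softmax(s) for name, s in scores.items()}
for name, w in weights.items():
    print(name, w.round(3))   # each attention distribution sums to 1
```

During decoding, each weight vector would form the context vector as a weighted sum of the encoder states; only the scoring function changes between the four variants.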
Procedia PDF Downloads 78
192 Work-Related and Psychosocial Risk Factors for Musculoskeletal Disorders among Workers in an Automated Flexible Assembly Line in India
Authors: Rohin Rameswarapu, Sameer Valsangkar
Abstract:
Background: Globally, musculoskeletal disorders (MSDs) are the largest single cause of work-related illness, accounting for over 33% of all newly reported occupational illnesses. The risk factors for MSD need to be delineated to suggest means for amelioration. Material and methods: In this cross-sectional study, the prevalence of MSDs among workers on an electrical company assembly line and the socio-demographic and job characteristics associated with MSD were obtained through a semi-structured questionnaire. A quantitative assessment of the physical risk factors was obtained through the Rapid Upper Limb Assessment (RULA) tool, and psychosocial risk factors were measured on a Likert scale. Statistical analysis was conducted using Epi Info software, and descriptive and inferential statistics, including chi-square and unpaired t-tests, were obtained. Results: A total of 263 workers consented and participated in the study. Among these workers, 200 (76%) suffered from MSD. Most workers were aged between 18 and 27 years, and the majority were women (198 of 263, 75.2%). A chi-square test was significant for the association between male gender and MSD, with a P value of 0.007. Among the MSD-positive group, 4 (2%) had a grand score of 5, 10 (5%) a grand score of 6, and 186 (93%) a grand score of 7 on RULA. There were significant differences between the non-MSD and MSD groups on five of the seven psychosocial domains, namely job demand, job monotony, co-worker support, decision control, and family and environment.
Discussion: This cross-sectional study demonstrates a high prevalence of MSD among assembly line workers, with inherent physical and psychosocial risk factors, and recommends that, beyond the physical risk factors, psychosocial risk factors also be addressed through proper ergonomic means, as this is essential to the well-being of the employee.
Keywords: musculoskeletal disorders, India, occupational health, Rapid Upper Limb Assessment (RULA)
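The kind of chi-square test of association reported above can be reproduced with a few lines of standard-library Python. The gender-by-MSD split in the table below is illustrative (the abstract reports the marginals, 65 men, 198 women, 200 with MSD, but not the full contingency table), so the resulting p-value will not match the reported P = 0.007 exactly.

```python
import numpy as np
from math import erfc, sqrt

def chi2_2x2(table):
    """Pearson chi-square test of association for a 2x2 table (no continuity
    correction); p-value via the chi-square(1 df) survival function, which
    for 1 df equals erfc(sqrt(stat/2))."""
    table = np.asarray(table, dtype=float)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row @ col / table.sum()
    stat = ((table - expected) ** 2 / expected).sum()
    p = erfc(sqrt(stat / 2.0))
    return stat, p

# Rows = male/female, columns = MSD present/absent. The marginals match the
# abstract; the cell split is assumed for illustration.
stat, p = chi2_2x2([[58, 7], [142, 56]])
print(f"chi2 = {stat:.3f}, p = {p:.4f}")
```

With these illustrative counts the association is significant at the 5% level, mirroring the direction of the study's finding.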
Procedia PDF Downloads 349
191 Experimental and Numerical Analysis of Wood Pellet Breakage during Pneumatic Transport
Authors: Julian Jaegers, Siegmar Wirtz, Viktor Scherer
Abstract:
Wood pellets are among the most established trade formats of wood-based fuels. Because of their transportability and storage properties, but also due to their low moisture content, high energy density, and homogeneous particle size and shape, wood pellets are well suited for power generation in power plants and for use in automated domestic firing systems. Before they are thermally converted, wood pellets pass through various transport and storage procedures, where they undergo different mechanical impacts that lead to pellet breakage and abrasion and to an increase in fines. The fines cause operational problems during storage, charging, and discharging of pellets, can increase the risk of dust explosions, and can lead to pollutant emissions during combustion. In the current work, the formation of fines caused by breakage during pneumatic transport is analyzed experimentally and numerically. The focus lies on the influence of conveying velocity, pellet loading, pipe diameter, and the shape of pipe components such as bends or couplings. A test rig has been built that allows the experimental evaluation of pneumatic transport while varying the above-mentioned parameters. Two high-speed cameras are installed for quantitative optical access to the particle-particle and particle-wall contacts. The particle size distribution of the bulk before and after a transport process is measured, as well as the amount of fines produced. The experiments will be compared with the results of corresponding DEM/CFD simulations to provide information on contact frequencies and forces. The contribution will present experimental results and report on the status of the DEM/CFD simulations.
The final goal of the project is to provide better insight into pellet breakage during pneumatic transport and to develop guidelines ensuring gentler transport.
Keywords: DEM/CFD simulation of pneumatic conveying, mechanical impact on wood pellets during transportation, pellet breakage, pneumatic transport of wood pellets
Procedia PDF Downloads 150
190 VeriFy: A Solution to Implement Autonomy Safely and According to the Rules
Authors: Michael Naderhirn, Marco Pavone
Abstract:
Problem statement, motivation, and aim of work: So far, control engineers have developed control algorithms so that the controller fits a specification, verified by testing. When it comes to the certification of an autonomous car in highly complex scenarios, the challenge is much greater, since such a controller must mathematically guarantee that it implements the rules of the road while also guaranteeing properties such as safety and real-time executability. This demanding problem can be addressed by combining formal verification and system theory, and the aim of this work is to present a workflow that solves it. Summary of the presented results / main outcomes: We show the use of an English-like language to transform the rules of the road into a system specification for an autonomous car. The language-based specifications are used to define system functions and interfaces. Based on these, a formal model is developed that correctly formalizes the specifications. In parallel, a mathematical model describing the system dynamics is used to calculate the system's reachable set, which is in turn used to determine the system input boundaries. A motion planning algorithm is then applied inside the system boundaries to find an optimized trajectory, in combination with the formal specification model, while satisfying the specifications. The result is a control strategy that can be applied in real time, independent of the scenario, with a mathematical guarantee of satisfying a predefined specification. We demonstrate the applicability of the method in simulated driving scenarios and discuss a potential certification.
Originality, significance, and benefit: To the authors' best knowledge, this is the first automated workflow that combines a specification in an English-like language with a mathematical model, in a formally verified way, to synthesize a controller for potential real-time applications such as autonomous driving.
Keywords: formal system verification, reachability, real time controller, hybrid system
Procedia PDF Downloads 241
189 An Approach to Automate the Modeling of Life Cycle Inventory Data: Case Study on Electrical and Electronic Equipment Products
Authors: Axelle Bertrand, Tom Bauer, Carole Charbuillet, Martin Bonte, Marie Voyer, Nicolas Perry
Abstract:
The complexity of Life Cycle Assessment (LCA) can be identified as the ultimate obstacle to its massification. Because of this obstacle, the diffusion of eco-design and LCA methods in the manufacturing sectors may stall. This article addresses the research question: how can the LCA method be adapted so that it can be generalized massively while improving its performance? The paper aims to develop an approach for automating LCA in order to carry out assessments on a massive scale. To answer this, we proceeded in three steps. First, we analyzed the literature to identify existing automation methods; given the constraints of large-scale manual processing, it was necessary to define a new approach, drawing inspiration from certain existing methods and combining them with new ideas and improvements. Second, our development of automated LCA construction is presented (data reconciliation and implementation). Finally, an LCA case study of a conduit is presented to demonstrate the feature-based approach offered by the developed tool. A computerized environment supports effective and efficient decision-making related to materials and processes, facilitating data mapping and hence product modeling. The method is also able to complete the LCA process on its own within minutes: the calculations and the LCA report are generated automatically. The tool developed has shown that automation by code is a viable solution for meeting LCA's massification objectives. It has major advantages over the traditional LCA method and overcomes the complexity of LCA. Indeed, the case study demonstrated the time savings associated with this methodology and, therefore, the opportunity to increase the number of LCA reports generated and meet regulatory requirements. Moreover, this approach also shows the potential of the proposed method for a wide range of applications.
Keywords: automation, EEE, life cycle assessment, life cycle inventory, massively
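The data-mapping step at the heart of such automation can be sketched as follows: reconcile loose material names from a bill of materials against an inventory, then total one impact category. The emission factors, aliases, and product data below are entirely hypothetical; a real tool would draw on an inventory database rather than a hard-coded dictionary.

```python
# Hypothetical emission factors (kg CO2-eq per kg of material). Real LCA
# work would pull these from a life cycle inventory database.
FACTORS = {"copper": 4.1, "pvc": 2.6, "steel": 1.9}

# Aliases handle the data-reconciliation step: CAD/PLM exports rarely
# match inventory names exactly.
ALIASES = {"cu": "copper", "polyvinyl chloride": "pvc"}

def climate_impact(bill_of_materials):
    """Sum a single impact category over a bill of materials,
    mapping loose material names onto inventory entries."""
    total = 0.0
    for name, mass_kg in bill_of_materials:
        key = ALIASES.get(name.lower(), name.lower())
        if key not in FACTORS:
            raise KeyError(f"no inventory match for material {name!r}")
        total += FACTORS[key] * mass_kg
    return total

# A conduit-like product: 1.2 kg PVC body, 0.3 kg copper.
bom = [("Polyvinyl chloride", 1.2), ("Cu", 0.3)]
print(f"{climate_impact(bom):.2f} kg CO2-eq")  # 2.6*1.2 + 4.1*0.3 = 4.35
```

Once this mapping is automated, generating a report per product is a loop over bills of materials, which is where the minutes-per-assessment speedup described above comes from.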
Procedia PDF Downloads 90
188 E-Learning Platform for School Kids
Authors: Gihan Thilakarathna, Fernando Ishara, Rathnayake Yasith, Bandara A. M. R. Y.
Abstract:
E-learning is a crucial component of intelligent education and, even in the midst of a pandemic, is becoming increasingly important in the educational system. Several e-learning programs are accessible to students; here, we decided to create an e-learning framework for children. We have identified a few issues that teachers face with their online classes. When there are numerous students in an online classroom, how does a teacher recognize a student's focus on academics and below-the-surface behaviors? Some kids do not pay attention in class, and others nap; the teacher is unable to keep track of each and every student. A key challenge in e-learning is online examinations, because students can cheat easily during online exams; hence, exam proctoring is needed. Here we propose an automated online exam cheating detection method using a web camera. The project also presents an e-learning platform for math education that includes games for kids as an alternative teaching method for math students. The game will be accessible via a web browser, with imagery drawn in a cartoonish style, helping students learn math through play. Everything in this day and age is moving towards automation; however, automatic answer evaluation is only available for MCQ-based questions, so checkers have a difficult time evaluating theory answers. The current system requires more manpower and takes a long time to evaluate responses; it is also possible for two identical responses to be marked differently and receive different grades. As a result, this application employs machine learning techniques to provide automatic evaluation of subjective responses based on keywords provided to the computer as student input, resulting in a fair distribution of marks. In addition, it will save time and manpower.
We used deep learning, machine learning, image processing, and natural language technologies to develop these research components.
Keywords: math, education games, e-learning platform, artificial intelligence
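The keyword-based evaluation idea can be sketched minimally as below. This is a deliberately simple stand-in for the ML-based evaluator described above: it awards marks in proportion to the expected keywords found, which by construction gives identical answers identical marks. A real system would add stemming, synonyms, and semantic matching.

```python
import re

def score_answer(answer, keywords, max_marks=10):
    """Award marks in proportion to the fraction of expected keywords
    present in the answer (case-insensitive, whole words only)."""
    words = set(re.findall(r"[a-z]+", answer.lower()))
    hits = sum(1 for kw in keywords if kw.lower() in words)
    return round(max_marks * hits / len(keywords), 1)

expected = ["photosynthesis", "chlorophyll", "sunlight", "glucose"]
answer = "Plants use sunlight and chlorophyll to make glucose."
print(score_answer(answer, expected))  # 3 of 4 keywords found → 7.5
```

Because scoring is a pure function of the answer text, the unfairness noted above (two identical responses receiving different grades) cannot occur.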
Procedia PDF Downloads 156
187 A Web Service-Based Framework for Mining E-Learning Data
Authors: Felermino D. M. A. Ali, S. C. Ng
Abstract:
E-learning is an evolutionary form of distance learning that has improved over time as new technologies have emerged. Today, efforts are still being made to embrace E-learning systems with emerging technologies in order to make them better. Among these advancements, Educational Data Mining (EDM) is gaining huge and increasing popularity due to its wide application in improving the teaching-learning process in online practices. However, even though EDM promises many benefits to the educational industry in general and E-learning environments in particular, its principal drawback is the lack of easy-to-use tools: current EDM tools usually require users to have additional technical expertise to perform EDM tasks effectively. In response to these limitations, this study designs and implements an EDM application framework that aims at automating and simplifying the development of EDM in E-learning environments. The framework introduces a Service-Oriented Architecture (SOA) that hides the complexity of the technical details and enables users to perform EDM in an automated fashion. The framework was designed based on abstraction, extensibility, and interoperability principles, and its implementation is made up of three major modules. The first module provides an abstraction for data gathering, achieved by extending the Moodle LMS (Learning Management System) source code. The second module provides data mining methods and techniques as services, achieved by converting the Weka API into a set of Web services. The third module acts as an intermediary between the first two: it contains a user-friendly interface that allows data provider services to be located dynamically and knowledge discovery tasks to be run on the data mining services. An experiment was conducted to evaluate the overhead of the proposed framework through a combination of simulation and implementation.
The experiments have shown that the overhead introduced by the SOA mechanism is relatively small; it was therefore concluded that a service-oriented architecture can be used effectively to facilitate educational data mining in E-learning environments.
Keywords: educational data mining, e-learning, distributed data mining, Moodle, service-oriented architecture, Weka
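The three-module idea (data provider, mining service, intermediary) can be sketched in-process as a small service registry. This is purely illustrative: the real framework exposes Moodle data and Weka algorithms as actual Web services, whereas here the services, their names, and the toy "mining" rule are all invented for the sketch.

```python
# In-process sketch of the framework's SOA idea: data-provider and mining
# services register under names, and the intermediary locates them
# dynamically instead of hard-wiring the tools together.
SERVICES = {}

def service(name):
    def register(fn):
        SERVICES[name] = fn
        return fn
    return register

@service("data/grades")
def grades_provider():
    # Stand-in for data extracted from the LMS: (absences, failed-course flag).
    return [(5, 0), (10, 1), (12, 1), (3, 0), (8, 0)]

@service("mine/threshold-rule")
def threshold_rule(rows):
    # Toy "mining" service: smallest absence count observed among failures.
    return min(a for a, f in rows if f)

def run_task(data_service, mining_service):
    """The intermediary module: locate both services by name and chain them."""
    data = SERVICES[data_service]()
    return SERVICES[mining_service](data)

print(run_task("data/grades", "mine/threshold-rule"))  # → 10
```

Swapping either service for another registered name changes the knowledge discovery task without touching the intermediary, which is the interoperability property the framework's design principles target.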
Procedia PDF Downloads 236
186 Statistical Approach to Identify Stress and Biases Impairing Decision-Making in High-Risk Industry
Authors: Ph. Fauquet-Alekhine
Abstract:
Decision-making occurs several times an hour when working in high-risk industry, and an erroneous choice might have undesirable outcomes for people and the environment surrounding the industrial plant. Industrial decisions are very often made in a context of acute stress. Time pressure is a crucial stressor, sometimes leading decision makers to speed up the decision-making process and, when that is not possible, to shift to the simplest strategy. We thus found it interesting to update the characterization of the stress factors impairing decision-making at the Chinon Nuclear Power Plant (France) in order to optimize decision-making contexts and/or the associated processes. The investigation was based on the analysis of reports addressing safety events over the last 3 years. Among 93 reports, those explicitly addressing decision-making issues were identified. Each event was characterized in terms of three criteria: stressors, biases impairing decision-making, and weaknesses of the decision-making process. The statistical analysis showed that the biases were distributed over 10 categories, among which the hypothesis confirmation bias was clearly salient. No significant correlation was found between the criteria. The analysis indicated that the main stressor was time pressure and highlighted an unexpected form of stressor: the trust asymmetry principle of the expert. The analysis led to the conclusion that this stressor impairs decision-making from a psychological angle rather than a physiological one: it induces a defensive bias of self-esteem and self-protection associated with a confirmation bias. This leads to the hypothesis that this stressor can intervene in some cases without being detected, and that other stressors of the same kind might also occur undetected. Further investigations addressing these hypotheses are being considered.
The analysis also led to the conclusion that dealing with these issues requires i) decision-making methods that are well known to the workers and automated, and ii) decision-making tools that are well known and strictly applied. Training was adjusted accordingly.
Keywords: bias, expert, high risk industry, stress
Procedia PDF Downloads 112
185 Automation of Savitsky's Method for Power Calculation of High-Speed Vessels and Generating an Empirical Formula
Authors: M. Towhidur Rahman, Nasim Zaman Piyas, M. Sadiqul Baree, Shahnewaz Ahmed
Abstract:
The design of high-speed craft has recently become one of the most active areas of naval architecture. Increased speed makes these vehicles more efficient and useful for military, economic, or leisure purposes. The planing hull is designed specifically to achieve relatively high speed on the surface of the water, and speed on the water surface is closely related to the size of the vessel and the installed power. The Savitsky method, first presented in 1964 for prismatic planing hulls and later extended to non-monohedric and stepped hulls, is well known as a reliable alternative to CFD analysis of hull resistance. A computer program based on Savitsky's method has been developed using MATLAB, and the power of high-speed vessels has been computed in this research. The program first reads the principal parameters, such as displacement, LCG, speed, deadrise angle, and the inclination of the thrust line with respect to the keel line, and calculates the resistance of the hull using Savitsky's empirical planing equations. However, some functions used in the empirical equations are available only in graphical form, which is not suitable for automatic computation; we use a digital plotting system to extract data from the nomograms. As a result, the wetted length-beam ratio and trim angle can be determined directly from the initial input variables, which makes the power calculation automated without manual plotting of secondary variables such as p/b and other coefficients; the regression equations of those functions are derived using data from different charts. Finally, the trim angle, mean wetted length-beam ratio, frictional coefficient, resistance, and power are computed and compared with Savitsky's results, and good agreement has been observed.
Keywords: nomogram, planing hull, principal parameters, regression
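The automated replacement of chart lookups can be illustrated with Savitsky's zero-deadrise lift coefficient relation, one of the core empirical planing equations, inverted numerically for the mean wetted length-beam ratio instead of being read off a nomogram. The input values below are illustrative, not a worked case from the paper.

```python
from math import sqrt

def cl_zero(tau_deg, lam, cv):
    """Savitsky's lift coefficient for a zero-deadrise planing surface:
    CL0 = tau^1.1 * (0.0120*sqrt(lam) + 0.0055*lam^2.5 / Cv^2),
    with tau the trim angle in degrees, lam the mean wetted
    length-beam ratio, and Cv the beam speed coefficient."""
    return tau_deg**1.1 * (0.0120 * sqrt(lam) + 0.0055 * lam**2.5 / cv**2)

def solve_lambda(cl_target, tau_deg, cv, lo=0.1, hi=8.0):
    """Invert CL0(lam) by bisection -- the step that otherwise requires
    reading a chart. CL0 is monotonic in lam over the bracket."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if cl_zero(tau_deg, mid, cv) < cl_target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative numbers only:
cv, tau = 3.0, 4.0            # beam speed coefficient and trim angle (deg)
lam = solve_lambda(0.06, tau, cv)
print(f"mean wetted length-beam ratio: {lam:.3f}")
# Round trip: plugging lam back in reproduces the target lift coefficient.
```

The same pattern (regression fit or root-finding in place of a chart) is what makes the full resistance-and-power chain computable directly from the principal parameters.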
Procedia PDF Downloads 405
184 Optimizing the Residential Design Process Using Automated Technologies
Authors: Martin Georgiev, Milena Nanova, Damyan Damov
Abstract:
Architects, engineers, and developers need to analyse and implement a wide spectrum of data in different formats if they want to produce viable residential developments. Usually, this data comes from a number of different sources and is not well structured. The main objective of this research project is to provide parametric tools, working with real geodesic data, that can generate residential solutions. Various codes, regulations, and design constraints are described by variables and prioritized. In this way, we establish a common workflow for architects, geodesists, and other professionals involved in the building and investment process. This collaborative medium ensures that the generated design variants conform to the various requirements, contributing to a more streamlined and informed decision-making process. Quantifying the distinctive characteristics inherent to typical residential structures allows a systematic evaluation of the generated variants, focusing on factors crucial to designers, such as daylight simulation, circulation analysis, space utilization, and view orientation. Integrating real geodesic data offers a holistic view of the built environment, enhancing the accuracy and relevance of the design solutions. The use of generative algorithms and parametric models offers high productivity and flexibility of the design variants and can be implemented in more conventional CAD and BIM workflows. Experts from different specialties can join their efforts, sharing a common digital workspace. In conclusion, our research demonstrates that a generative parametric approach based on real geodesic data and collaborative decision-making can be introduced in the early phases of the design process.
This gives designers powerful tools to explore diverse design possibilities, significantly improving the quality of the building investment over its entire lifecycle.
Keywords: architectural design, residential buildings, urban development, geodesic data, generative design, parametric models, workflow optimization
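The systematic evaluation of generated variants described above can be sketched as a weighted multi-criteria score. The criteria names, weights, and variant values below are illustrative stand-ins for the normalized outputs of the daylight, circulation, space-utilization, and view analyses.

```python
# Each generated variant is scored on normalized criteria (0-1, higher is
# better); the weights encode the prioritization of constraints described
# in the text. All values here are illustrative.
WEIGHTS = {"daylight": 0.4, "circulation": 0.2, "space_use": 0.25, "view": 0.15}

variants = {
    "A": {"daylight": 0.82, "circulation": 0.60, "space_use": 0.75, "view": 0.5},
    "B": {"daylight": 0.70, "circulation": 0.85, "space_use": 0.80, "view": 0.6},
    "C": {"daylight": 0.90, "circulation": 0.55, "space_use": 0.60, "view": 0.7},
}

def score(criteria):
    return sum(WEIGHTS[k] * v for k, v in criteria.items())

ranked = sorted(variants, key=lambda name: score(variants[name]), reverse=True)
for name in ranked:
    print(name, round(score(variants[name]), 3))
```

In a real workflow the variant dictionaries would be produced by the generative algorithms, and adjusting the weights lets the collaborating experts re-prioritize without regenerating the variants.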
Procedia PDF Downloads 52
183 Design, Analysis and Obstacle Avoidance Control of an Electric Wheelchair with Sit-Sleep-Seat Elevation Functions
Authors: Waleed Ahmed, Huang Xiaohua, Wilayat Ali
Abstract:
Wheelchair users are generally exposed to physical and psychological health problems, e.g., pressure sores and pain in the hip joint, associated with seating posture or being inactive in a wheelchair for a long time. A reclining wheelchair with back, thigh, and leg adjustment helps in daily life activities and health preservation, and the seat-elevating function of an electric wheelchair allows a user with lower-limb amputation to reach different heights. An electric wheelchair is expected to ease the lives of elderly and disabled people by giving them mobility support and decreasing the percentage of accidents caused by users' narrow sight or joystick operation errors. Thus, this paper proposes the design, analysis, and obstacle avoidance control of an electric wheelchair with sit-sleep-seat elevation functions. A 3D model of the wheelchair was designed in SolidWorks and later used for multi-body dynamics (MBD) analysis and to verify the driving control system. The control system uses a fuzzy algorithm to avoid obstacles, taking as inputs the distance from an ultrasonic sensor and the user-specified direction from the joystick's operation. The proposed fuzzy driving control system governs the direction and velocity of the wheelchair. The wheelchair model has been examined and proven in MSC Adams (Automated Dynamic Analysis of Mechanical Systems), and the designed fuzzy control algorithm has been implemented in the Gazebo robotic 3D simulator using the Robot Operating System (ROS) middleware. The proposed wheelchair design enhances mobility and quality of life by improving the user's functional capabilities. Simulation results verify the non-accidental behavior of the electric wheelchair.
Keywords: fuzzy logic control, joystick, multi body dynamics, obstacle avoidance, scissor mechanism, sensor
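A minimal version of the fuzzy velocity rule can be sketched as below: the ultrasonic distance is fuzzified into near/medium/far sets, and the rule outputs are combined by centroid defuzzification to scale the joystick speed command. The membership breakpoints and rule outputs are illustrative, not the paper's tuning.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_speed(distance_m, joystick_speed):
    """Scale the user's joystick speed command by the fuzzified obstacle
    distance from the ultrasonic sensor (centroid defuzzification)."""
    near = max(0.0, min(1.0, 1.0 - distance_m))    # full at 0 m, gone at 1 m
    medium = tri(distance_m, 0.5, 1.5, 2.5)
    far = max(0.0, min(1.0, distance_m - 1.5))     # starts at 1.5 m, full at 2.5 m
    # Rules: stop when near, half speed when medium, full speed when far.
    num = near * 0.0 + medium * 0.5 + far * 1.0
    den = near + medium + far
    return joystick_speed * (num / den if den else 0.0)

for d in (0.2, 1.0, 3.0):
    print(d, "->", round(fuzzy_speed(d, 1.0), 3))  # 0.0, 0.5, 1.0
```

The direction channel would use the same machinery with the joystick direction and sensor bearing as inputs; the smooth membership overlap is what avoids the abrupt stop-go behavior of a hard distance threshold.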
Procedia PDF Downloads 129
182 Improving Lane Detection for Autonomous Vehicles Using Deep Transfer Learning
Authors: Richard O’Riordan, Saritha Unnikrishnan
Abstract:
Autonomous Vehicles (AVs) are incorporating an increasing number of ADAS features, including automated lane-keeping systems. In recent years, many research papers on lane detection algorithms have been published, ranging from computer vision techniques to deep learning methods. The progression from the lower levels of autonomy defined in the SAE framework to higher levels requires increasingly complex models and algorithms that must be highly reliable in their operation and functionality. Furthermore, these algorithms have no room for error when operating at high levels of autonomy. Current research details existing computer vision and deep learning algorithms, their methodologies and individual results, but also the challenges the algorithms face, the resources needed to operate them, and the shortcomings experienced during lane detection in certain weather and lighting conditions. This paper explores these shortcomings and attempts to implement a lane detection algorithm that could be used to improve AV lane detection systems. It uses a pre-trained LaneNet model to detect lane or non-lane pixels by binary segmentation as the base detection method, first on the existing BDD100K dataset and then on a custom dataset generated locally. The first set of selected roads are modern, well-laid roads with up-to-date infrastructure and lane markings, while the second road network is an older one with infrastructure and lane markings reflecting its age. The performance of the proposed method is evaluated on the custom dataset and compared to its performance on the BDD100K dataset. In summary, this paper uses transfer learning to provide a fast and robust lane detection algorithm that can handle various road conditions and provide accurate lane detection.
Keywords: ADAS, autonomous vehicles, deep learning, LaneNet, lane detection
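The paper's evaluation code is not given here; lane segmentation quality is commonly scored with pixel-wise intersection-over-union, which a minimal sketch (pure Python, binary masks as nested 0/1 lists, not the authors' pipeline) could compute as:

```python
def lane_iou(pred_mask, truth_mask):
    """Pixel-wise intersection-over-union for binary lane masks."""
    inter = union = 0
    for pred_row, truth_row in zip(pred_mask, truth_mask):
        for p, t in zip(pred_row, truth_row):
            inter += p & t   # pixel marked as lane by both masks
            union += p | t   # pixel marked as lane by either mask
    return inter / union if union else 1.0
```

An IoU of 1.0 means the predicted lane pixels match the ground truth exactly; comparing mean IoU on the BDD100K split and the custom split gives the kind of cross-dataset comparison the abstract describes.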
Procedia PDF Downloads 104
181 Economic Development Impacts of Connected and Automated Vehicles (CAV)
Authors: Rimon Rafiah
Abstract:
This paper presents a combination of two seemingly unrelated models: one for estimating the economic development impacts of transportation investment, and another for increasing CAV penetration in order to reduce congestion. Measuring the economic development impacts of transportation investments is becoming more widely recognized around the world. Examples include the UK’s Wider Economic Benefits (WEB) model, Economic Impact Assessments in the USA, various input-output models, and other models around the world. The economic impact model here is based on WEB and rests on the following premise: investments in transportation reduce the cost of personal travel, enabling firms to be more competitive, creating additional throughput (the same road allows more people to travel), and reducing workers’ cost of travel to a new workplace. This reduction in travel costs is estimated in out-of-pocket terms for a given localized area and then translated into additional employment based on regional labor supply elasticity. The additional employment is conservatively assumed to be at minimum wage levels, translated into GDP terms, and from there into direct taxation (i.e., an increase in tax taken by the government). The CAV model is based on economic principles such as CAV usage, supply, and demand. CAV usage can increase capacity by a variety of means: increased automation (Level I through Level IV) and increased penetration, which several forecasts predict will reach 50% by 2030, with possible full conversion by 2045-2050. Several countries have passed policies and/or legislation ending sales of gasoline-powered vehicles from 2030 onward. Supply is measured via the increased capacity of given infrastructure as a function of both CAV penetration and the implemented technologies.
The CAV model, as implemented in the USA, has shown significant savings in travel time and in vehicle operating costs, which can be translated into economic development impacts in terms of job creation, GDP growth, and salaries. The models have policy implications and can be adapted for use in Japan.
Keywords: CAV, economic development, WEB, transport economics
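As a hedged numerical sketch of the WEB-style chain described above (all figures hypothetical, not taken from the paper), a travel-cost saving can be pushed through the labor supply elasticity, a minimum-wage GDP valuation, and a tax rate:

```python
def economic_impact(cost_saving_frac, labor_elasticity, baseline_jobs,
                    annual_min_wage, tax_rate):
    """Translate a travel-cost reduction into jobs, GDP and tax (WEB-style chain)."""
    # Employment response to cheaper commuting, via regional labor supply elasticity
    extra_jobs = baseline_jobs * labor_elasticity * cost_saving_frac
    gdp = extra_jobs * annual_min_wage   # conservative minimum-wage valuation
    tax = gdp * tax_rate                 # direct taxation taken by the government
    return extra_jobs, gdp, tax
```

For example, a 5% travel-cost saving in a region of 100,000 workers with an elasticity of 0.2 would add 1,000 minimum-wage jobs under these illustrative assumptions.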
Procedia PDF Downloads 74
180 Integrated Mass Rapid Transit System for Smart City Project in Western India
Authors: Debasis Sarkar, Jatan Talati
Abstract:
This paper is an attempt to develop an Integrated Mass Rapid Transit System (MRTS) for a smart city project in Western India. Integrated transportation is one of the enablers of smart transportation, providing a seamless intercity as well as regional-level travel experience. At the city level, the success of a smart city project in transportation lies in properly integrating the different mass rapid transit modes by integrating information, physical infrastructure, route networks, fares, etc. The methodology adopted for this study was primary data research through a questionnaire survey. Respondents gave their perceptions of the ways and means to improve public transport services in urban cities and identified the factors and attributes which might motivate more people to shift towards the public mode. They were also questioned about the factors which they feel might restrain the integration of various modes of MRTS. Furthermore, this study develops a utility equation for respondents with the help of multiple linear regression analysis and estimates the probability of shifting to public transport for the factors listed in the questionnaire. It has been observed that for shifting to public transport, the most important factors to consider were travel time saving and comfort rating. An Integrated MRTS can be obtained by combining metro rail with BRTS, metro rail with monorail, monorail with BRTS, and metro rail with Indian Railways. Providing a common smart card to transport users for accessing all the available modes would be a pragmatic step towards integration of the available modes of MRTS.
Keywords: mass rapid transit systems, smart city, metro rail, bus rapid transit system, multiple linear regression, smart card, automated fare collection system
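The survey's utility equation is not reproduced in the abstract; a sketch of fitting one by ordinary least squares, with made-up predictor values standing in for travel time saving and comfort rating, could look like this:

```python
import numpy as np

def fit_utility(predictors, utilities):
    """OLS coefficients [intercept, b1, b2, ...] of a linear utility equation."""
    X = np.column_stack([np.ones(len(predictors)), predictors])
    coef, *_ = np.linalg.lstsq(X, utilities, rcond=None)
    return coef

def shift_probability(utility):
    """Logistic transform of utility into a probability of shifting modes."""
    return 1.0 / (1.0 + np.exp(-utility))
```

On data generated exactly from utility = 1 + 2*(time saving) + 3*(comfort), the fit recovers those coefficients.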
Procedia PDF Downloads 271
179 Effectiveness of Gamified Virtual Physiotherapy for Patients with Shoulder Problems
Authors: A. Barratt, M. H. Granat, S. Buttress, B. Roy
Abstract:
Introduction: Physiotherapy is an essential part of the treatment of patients with shoulder problems. Treatment usually centres on addressing specific physiotherapy goals, ultimately resulting in improvements in pain and function. This study investigates whether computerised physiotherapy using gamification principles is as effective as standard physiotherapy. Methods: Physiotherapy exergames were created using a combination of commercially available hardware, the Microsoft Kinect, and bespoke software. The exergames were validated by mapping them to physiotherapy goals, which included strength, range of movement, control, speed, and activation of the kinetic chain. A multicentre, randomised, prospective controlled trial investigated the use of exergames on patients with shoulder impingement syndrome who had undergone arthroscopic subacromial decompression surgery. The intervention group was provided with the automated sensor-based technology, allowing them to perform exergames and track their rehabilitation progress. The control group was treated with standard physiotherapy protocols. Outcomes from different domains were used to compare the groups. An important metric was the assessment of shoulder range of movement pre- and post-operatively. The range-of-movement data included abduction, forward flexion and external rotation, measured by the software pre-operatively and at 6 and 12 weeks post-operatively. Results: Both groups showed significant improvement from pre-operative to 12 weeks in elevation in the forward flexion and abduction planes. Abduction improved in the intervention group (p < 0.015) as well as the control group (p < 0.003); forward flexion improved in the intervention group (p < 0.0201) and the control group (p < 0.004).
There was, however, no significant difference between the groups at 12 weeks for abduction (p = 0.118), forward flexion (p = 0.190) or external rotation (p = 0.347). Conclusion: Exergames may be used as an alternative to standard physiotherapy regimes; however, further analysis is required focusing on patient engagement.
Keywords: shoulder, physiotherapy, exergames, gamification
Procedia PDF Downloads 194
178 Arabic Lexicon Learning to Analyze Sentiment in Microblogs
Authors: Mahmoud B. Rokaya
Abstract:
The study of opinion mining and sentiment analysis includes the analysis of opinions, sentiments, evaluations, attitudes, and emotions. The rapid growth of social media, including social networks, reviews, forum discussions, microblogs, and Twitter, has led to a parallel growth in the field of sentiment analysis, which tries to develop effective tools for capturing the trends of people. There are two approaches in the field: lexicon-based and corpus-based methods. A lexicon-based method uses a sentiment lexicon that includes sentiment words and phrases with assigned numeric scores. These scores reveal whether sentiment phrases are positive or negative, their intensity, and/or their emotional orientations. Creating lexicons manually is hard, which brings the need for adaptive automated methods of lexicon generation. The proposed method generates dynamic lexicons from the corpus and then classifies text using these lexicons, combining different approaches to generate the lexicons from text. It classifies tweets into five classes instead of binary positive/negative classes. The sentiment classification problem is written as an optimization problem whose goal is to find optimal sentiment lexicons; the solution is produced using mathematical programming approaches to find the best lexicon for classifying texts. A genetic algorithm was written to find the optimal lexicon, and a meta-level feature was then extracted based on it. The experiments were conducted on several datasets, and the results, in terms of accuracy, recall, and F-measure, outperformed the state-of-the-art methods proposed in the literature on some of the datasets.
A better understanding of the Arabic language and culture of Arab Twitter users, and of the sentiment orientation of words in different contexts, can be achieved based on the sentiment lexicons proposed by the algorithm.
Keywords: social media, Twitter sentiment, sentiment analysis, lexicon, genetic algorithm, evolutionary computation
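The paper's genetic algorithm and Arabic corpus are not available here; the idea of evolving lexicon scores against labeled text can nevertheless be sketched on a tiny English toy corpus (two classes rather than five, and a mutation-only variant, purely for brevity):

```python
import random

# Toy labeled corpus: (tokens, sentiment label); illustrative only
CORPUS = [(["happy"], 1), (["win", "happy"], 1),
          (["sad"], -1), (["sad", "lose"], -1)]
VOCAB = ["happy", "win", "sad", "lose"]

def classify(lexicon, tokens):
    """Label a document by the sign of its summed lexicon scores."""
    score = sum(lexicon.get(t, 0.0) for t in tokens)
    return 1 if score >= 0 else -1

def fitness(lexicon):
    """Fraction of corpus documents the lexicon labels correctly."""
    hits = sum(classify(lexicon, toks) == label for toks, label in CORPUS)
    return hits / len(CORPUS)

def evolve_lexicon(pop_size=20, generations=30, seed=1):
    """Genetic search over word-score lexicons (selection + Gaussian mutation)."""
    rng = random.Random(seed)
    pop = [{w: rng.uniform(-1, 1) for w in VOCAB} for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                 # keep the fitter half
        children = [{w: v + rng.gauss(0, 0.3) for w, v in p.items()}
                    for p in parents]                  # mutate each parent
        pop = parents + children
    return max(pop, key=fitness)
```

On this separable toy corpus the evolved lexicon ends up with positive scores for "happy"/"win" and negative scores for "sad"/"lose"; the real method additionally extracts a meta-level feature from the optimal lexicon.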
Procedia PDF Downloads 189
177 Revalidation and Harmonization of Existing IFCC Standardized Hepatic, Cardiac, and Thyroid Function Tests by Precision Optimization and External Quality Assurance Programs
Authors: Junaid Mahmood Alam
Abstract:
Revalidating and harmonizing clinical chemistry analytical principles, and optimizing methods through quality control programs and assessments, is the preeminent means of attaining optimal outcomes in clinical laboratory services. The present study reports the revalidation of our existing IFCC-standardized analytical methods, particularly hepatic and thyroid function tests, by optimizing precision analyses and processing them through external and internal quality assessments and regression determination. Parametric components of hepatic (bilirubin, ALT, γGT, ALP), cardiac (LDH, AST, Trop I) and thyroid/pituitary (T3, T4, TSH, FT3, FT4) function tests were used to validate analytical techniques on automated chemistry and immunology analyzers, namely the Hitachi 912, Cobas 6000 e601, Cobas c501 and Cobas e411, using UV kinetic and colorimetric dry chemistry principles and electrochemiluminescence immunoassay (ECLi) techniques. Validation and revalidation were completed by evaluating and assessing the precision-analyzed control data of the various instruments, plotted against each other for regression analysis (R²). Results showed that revalidation and optimization of the respective parameters, accredited through CAP, CLSI and NEQAPP assessments, depicted 99.0% to 99.8% optimization of the methodology and instruments used for the analyses. The regression R² for total bilirubin was 0.996, whereas ALT, ALP, γGT, LDH, AST, Trop I, T3, T4, TSH, FT3, and FT4 exhibited R² values of 0.998, 0.997, 0.993, 0.967, 0.970, 0.980, 0.976, 0.996, 0.997, 0.997, and 0.990, respectively. This confirmed marked harmonization of the analytical methods and instrumentation, revalidating optimized precision standardization as per IFCC-recommended guidelines. It is concluded that the practice of revalidating and harmonizing existing or new services should be followed by all clinical laboratories, especially those associated with tertiary care hospitals.
This will ensure the delivery of standardized, proficiency-tested, optimized services for prompt and better patient care, guaranteeing maximum patient confidence.
Keywords: revalidation, standardized, IFCC, CAP, harmonized
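The regression computation behind reported R² values of this kind is standard; a small sketch for paired measurements from two analyzers (a generic helper, not the laboratory's software) is:

```python
def r_squared(x, y):
    """Coefficient of determination of the least-squares line through (x, y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return (sxy * sxy) / (sxx * syy)   # square of Pearson's r
```

Perfectly proportional results from two instruments give R² = 1.0; values near 1, like those reported above, indicate close agreement between analyzers.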
Procedia PDF Downloads 269
176 Reinforcement Learning for Agile CNC Manufacturing: Optimizing Configurations and Sequencing
Authors: Huan Ting Liao
Abstract:
In a typical manufacturing environment, computer numerical control (CNC) machining is essential for automating production through precise computer-controlled tool operations, significantly enhancing efficiency and ensuring consistent product quality. However, traditional CNC production lines often rely on manual loading and unloading, limiting operational efficiency and scalability. Although automated loading systems have been developed, they frequently lack sufficient intelligence and configuration efficiency, requiring extensive setup adjustments for different products and impacting overall productivity. This research addresses the job shop scheduling problem (JSSP) in CNC machining environments, aiming to minimize total completion time (makespan) and maximize CNC machine utilization. We propose a novel approach using reinforcement learning (RL), specifically the Q-learning algorithm, to optimize scheduling decisions. The study simulates the JSSP, incorporating robotic arm operations, machine processing times, and work order demand allocation to determine optimal processing sequences. The Q-learning algorithm enhances machine utilization by dynamically balancing workloads across CNC machines, adapting to varying job demands and machine states. This approach offers robust solutions for complex manufacturing environments by automating decision-making processes for job assignments. Additionally, we evaluate various layout configurations to identify the most efficient setup. By integrating RL-based scheduling optimization with layout analysis, this research aims to provide a comprehensive solution for improving manufacturing efficiency and productivity in CNC-based job shops. The proposed method's adaptability and automation potential promise significant advancements in tackling dynamic manufacturing challenges.
Keywords: job shop scheduling problem, reinforcement learning, operations sequence, layout optimization, q-learning
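The paper's environment and reward design are not detailed in the abstract, but the core Q-learning update it relies on can be sketched generically; here states would be machine-load configurations and actions machine assignments (the state/action names below are hypothetical):

```python
def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """One Q-learning step: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    # Value of the best action available from the successor state
    best_next = max(Q[next_state].values()) if Q.get(next_state) else 0.0
    Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])
    return Q[state][action]
```

In a JSSP setting the reward would typically be the negative increase in makespan caused by an assignment, so schedules that keep machine loads balanced accumulate higher Q-values over training episodes.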
Procedia PDF Downloads 24
175 A Real Time Set Up for Retrieval of Emotional States from Human Neural Responses
Authors: Rashima Mahajan, Dipali Bansal, Shweta Singh
Abstract:
Real-time non-invasive Brain Computer Interfaces have a significant progressive role in restoring or maintaining quality of life for medically challenged people. This manuscript provides a comprehensive review of emerging research in the field of cognitive/affective computing in the context of human neural responses. The perspectives of different emotion assessment modalities, such as facial expressions, speech, text, gestures, and human physiological responses, are also discussed. Focus has been placed on exploring the ability of EEG (electroencephalogram) signals to portray thoughts, feelings, and unspoken words. An automated workflow-based protocol is proposed for designing an EEG-based real-time Brain Computer Interface system for the analysis and classification of human emotions elicited by external audio/visual stimuli. The front-end hardware includes a cost-effective and portable Emotiv EEG neuroheadset unit, a personal computer and a set of external stimulators. Primary signal analysis and processing of EEG acquired in real time shall be performed using the MATLAB-based advanced brain mapping toolboxes EEGLab/BCILab. This shall be followed by the development of a self-defined MATLAB algorithm to capture and characterize temporal and spectral variations in EEG under emotional stimulation. The extracted hybrid feature set shall be used to classify emotional states using artificial intelligence tools such as an Artificial Neural Network. The final system would result in an inexpensive, portable and more intuitive real-time Brain Computer Interface to control prosthetic devices by translating different brain states into operative control signals.
Keywords: brain computer interface, electroencephalogram, EEGLab, BCILab, Emotiv, emotions, interval features, spectral features, artificial neural network, control applications
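As a hedged illustration of the spectral features mentioned above (not the authors' MATLAB algorithm), the power of an EEG epoch in a given frequency band can be estimated from the FFT:

```python
import numpy as np

def band_power(signal, fs, f_low, f_high):
    """Mean spectral power of a signal in the band [f_low, f_high) Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    band = (freqs >= f_low) & (freqs < f_high)
    return psd[band].mean()
```

A 10 Hz oscillation, for example, should dominate the alpha band (8-13 Hz) relative to beta (13-30 Hz); band powers like these form part of the hybrid feature set fed to a classifier such as an artificial neural network.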
Procedia PDF Downloads 317
174 Species Distribution and Incidence of Inducible Clindamycin Resistance in Coagulase-Negative Staphylococci Isolated from Blood Cultures of Patients with True Bacteremia in Turkey
Authors: Fatma Koksal Cakirlar, Murat Gunaydin, Nevri̇ye Gonullu, Nuri Kiraz
Abstract:
During the last few decades, the increasing prevalence of methicillin-resistant CoNS isolates has become a common problem worldwide. Macrolide-lincosamide-streptogramin B (MLSB) antibiotics are effectively used for the treatment of CoNS infections; however, resistance to MLSB antibiotics is prevalent among staphylococci. The aim of this study is to determine the species distribution and the incidence of inducible clindamycin resistance in CoNS isolates that caused nosocomial bacteremia in our hospital. Between January 2014 and October 2015, a total of 484 CoNS isolates were recovered from blood samples of patients with true bacteremia who were hospitalized in intensive care units and other departments of Istanbul University Cerrahpasa Medical Hospital. Blood cultures were analyzed with the BACTEC 9120 system (Becton Dickinson, USA). The identification and antimicrobial resistance of isolates were determined by the Phoenix automated system (BD Diagnostic Systems, Sparks, MD). Inducible clindamycin resistance was detected using the D-test. The species distribution was as follows: Staphylococcus epidermidis 211 (43%), S. hominis 154 (32%), S. haemolyticus 69 (14%), S. capitis 28 (6%), S. saprophyticus 11 (2%), S. warneri 7 (1%), S. schleiferi 5 (1%) and S. lugdunensis 1 (0.2%). Resistance to methicillin was detected in 74.6% of CoNS isolates and was highest in S. haemolyticus isolates (89%). Resistance rates of CoNS strains to the antibacterial agents were as follows: ampicillin 77%, gentamicin 20%, erythromycin 71%, clindamycin 22%, trimethoprim-sulfamethoxazole 45%, ciprofloxacin 52%, tetracycline 34%, rifampicin 20%, daptomycin 0.2% and linezolid 0.2%. None of the strains were resistant to vancomycin or teicoplanin.
Fifteen (3%) CoNS isolates were D-test positive, showing the inducible MLSB resistance type (iMLSB phenotype); 94 (19%) were constitutively resistant (cMLSB phenotype); and 237 (46.76%) isolates were D-test negative, indicating the truly clindamycin-susceptible MS phenotype (M-phenotype resistance). The incidence of the iMLSB phenotype was higher in S. epidermidis isolates (4.7%) than in other CoNS isolates.
Keywords: bacteremia, inducible MLSB resistance phenotype, methicillin-resistant, staphylococci
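The phenotype assignment described above follows a simple decision rule over the erythromycin result, the clindamycin result, and the D-test; a sketch of it as a helper function (the argument names are mine, not from the study) could be:

```python
def mlsb_phenotype(ery_resistant, clin_resistant, d_test_positive):
    """Assign the MLSB resistance phenotype from disk-diffusion results."""
    if ery_resistant and clin_resistant:
        return "cMLSB"        # constitutive resistance to both agents
    if ery_resistant and d_test_positive:
        return "iMLSB"        # inducible resistance, revealed by the D-test
    if ery_resistant:
        return "MS"           # truly clindamycin-susceptible (efflux) phenotype
    return "susceptible"
```

This mirrors the three resistant categories reported in the study: cMLSB, iMLSB (D-test positive), and the D-test-negative MS phenotype.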
Procedia PDF Downloads 239
173 A Concept for Flexible Battery Cell Manufacturing from Low to Medium Volumes
Authors: Tim Giesen, Raphael Adamietz, Pablo Mayer, Philipp Stiefel, Patrick Alle, Dirk Schlenker
Abstract:
The competitiveness and success of new electrical energy storage devices such as battery cells depend significantly on a short time-to-market. Producers who decide to supply new battery cells to the market need to adapt their manufacturing easily to early customers’ needs in terms of cell size, materials, delivery time and quantity. In the initial state, the required output rates do not yet justify a fully automated manufacturing line, nor do they allow supplying handmade battery cells. Until now, there was no solution for manufacturing battery cells in low to medium volumes in a reproducible way. Thus, a concept for the flexible assembly of battery cells, in terms of cell format and output quantity, was developed by the Fraunhofer Institute for Manufacturing Engineering and Automation. Based on clustered processes, the modular system platform can be modified, enlarged or retrofitted in a short time frame according to the ordered product. The paper shows the analysis of the production steps of a conventional battery cell assembly line. Process solutions were found using I/O analysis, functional structures, and morphological boxes. The identified elementary functions were subsequently clustered by functional coherences for automation solutions, and thus the individual process clusters were generated. The result presented in this paper enables the manufacture of different cell products on the same production system using seven process clusters. The paper shows the solution for a batch-wise flexible battery cell production using advanced process control. Further, the performed tests and the benefits of using the process clusters as cyber-physical systems for an integrated production and value chain are discussed. The solution lowers the hurdles for SMEs to launch innovative cell products on the global market.
Keywords: automation, battery production, carrier, advanced process control, cyber-physical system
Procedia PDF Downloads 338
172 Segmenting 3D Optical Coherence Tomography Images Using a Kalman Filter
Authors: Deniz Guven, Wil Ward, Jinming Duan, Li Bai
Abstract:
Over the past two decades or so, Optical Coherence Tomography (OCT) has been used to diagnose retina and optic nerve diseases. The retinal nerve fibre layer, for example, is a powerful diagnostic marker for detecting and staging glaucoma. With the advances in optical imaging hardware, the adoption of OCT is now commonplace in clinics. More and more OCT images are being generated, and for these OCT images to have clinical applicability, accurate automated OCT image segmentation software is needed. OCT image segmentation is still an active research area, as OCT images are inherently noisy, with multiplicative speckle noise. Simple edge detection algorithms are unsuitable for detecting retinal layer boundaries in OCT images. Intensity fluctuation, motion artefacts, and the presence of blood vessels further decrease OCT image quality. In this paper, we introduce a new method for segmenting three-dimensional (3D) OCT images. This involves the use of a Kalman filter, which is commonly used in computer vision for object tracking. The Kalman filter is applied to the 3D OCT image volume to track the retinal layer boundaries through the slices within the volume, and thus segment the 3D image. Specifically, after some pre-processing of the OCT images, points on the retinal layer boundaries in the first image are identified, and curve fitting is applied so that the layer boundaries can be represented by the coefficients of the curve equations. These coefficients then form the state space for the Kalman filter. The filter then produces an optimal estimate of the current state of the system by updating its previous state using the measurements available, in the form of a feedback control loop. The results show that the algorithm can be used to segment the retinal layers in OCT images.
One limitation of the current algorithm is that the curve representation of a retinal layer boundary does not work well when the boundary splits into two, e.g., at the optic nerve head. This may be resolved by using a different representation of the boundaries, such as B-splines or level sets. The use of a Kalman filter shows promise for developing accurate and effective 3D OCT segmentation methods.
Keywords: optical coherence tomography, image segmentation, Kalman filter, object tracking
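The filter equations themselves are standard; a minimal one-dimensional sketch tracking a single curve coefficient from slice to slice (the process and measurement noise values are hypothetical, and the paper's state vector would hold all boundary coefficients rather than one scalar) is:

```python
def kalman_track(measurements, q=1e-3, r=1e-1):
    """Smooth a boundary-curve coefficient across OCT slices with a 1-D Kalman filter."""
    x, p = measurements[0], 1.0          # state estimate and its variance
    estimates = [x]
    for z in measurements[1:]:
        p += q                           # predict: variance grows between slices
        k = p / (p + r)                  # Kalman gain
        x += k * (z - x)                 # update with this slice's measurement
        p *= 1.0 - k                     # variance shrinks after the update
        estimates.append(x)
    return estimates
```

Because each estimate blends the prediction with the new measurement, noisy per-slice boundary fits are smoothed while consistent measurements pass through unchanged.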
Procedia PDF Downloads 482
171 Transient Freshwater-Saltwater Transition-Zone Dynamics in Heterogeneous Coastal Aquifers
Authors: Antoifi Abdoulhalik, Ashraf Ahmed
Abstract:
The ever-growing threat of saltwater intrusion (SWI) has prompted the need to further advance the understanding of its underlying processes for effective water resource management. While research efforts have mainly focused on steady-state analysis, studies on the transience of the saltwater intrusion mechanism remain very scarce, and studies considering transient SWI in heterogeneous media are, to our knowledge, nonexistent. This study provides, for the first time, a quantitative analysis of the effect of both inland and coastal water level changes on the transition zone under transient conditions in layered coastal aquifers. In all, two sets of four experiments were completed, including a homogeneous case and four layered cases: cases LH and HL were two bi-layered scenarios in which a low-K layer was set at the top and the bottom, respectively; cases HLH and LHL were two stratified aquifers with a High K–Low K–High K and a Low K–High K–Low K pattern, respectively. An automated experimental image analysis technique was used to quantify the main SWI parameters at high spatial and temporal resolution. The findings of this study provide invaluable insight into the underlying processes responsible for transition-zone dynamics in coastal aquifers. The results show that in all the investigated cases, the width of the transition zone remains almost unchanged throughout the saltwater intrusion process, regardless of where the boundary change occurs. However, the width of the transition zone considerably increases during the retreat, with the largest amplitude observed in cases LH and LHL, where a low-K layer was set at the top of the system. In all the scenarios, the amplitude of widening was slightly smaller when the retreat was prompted by an instantaneous drop of the saltwater level than when caused by an inland freshwater rise, despite equivalent absolute head-change magnitudes.
Larger head changes caused significantly greater widening during the saltwater wedge retreat, while having no impact during the intrusion phase.
Keywords: freshwater-saltwater transition-zone dynamics, heterogeneous coastal aquifers, laboratory experiments, transient seawater intrusion
Procedia PDF Downloads 241
170 Legal Issues of Collecting and Processing Big Health Data in the Light of European Regulation 679/2016
Authors: Ioannis Iglezakis, Theodoros D. Trokanas, Panagiota Kiortsi
Abstract:
This paper aims to explore major legal issues arising from the collection and processing of Health Big Data in the light of the new European secondary legislation for the protection of personal data of natural persons, placing emphasis on the General Data Protection Regulation (GDPR) 679/2016. Whether Big Health Data can be characterised as ‘personal data’ or not is really the crux of the matter. The legal ambiguity is compounded by the fact that, even though the processing of Big Health Data is premised on the de-identification of the data subject, the possibility of a combination of Big Health Data with other data circulating freely on the web or from other data files cannot be excluded. Another key point is that the application of some provisions of the GDPR to Big Health Data may both absolve the data controller of his legal obligations and deprive the data subject of his rights (e.g., the right to be informed), ultimately undermining the fundamental right to the protection of personal data of natural persons. Moreover, data subjects’ rights (e.g., the right not to be subject to a decision based solely on automated processing) are heavily impacted by the use of AI, algorithms, and technologies that reclaim health data for further use, sometimes producing ambiguous results that have a substantial impact on individuals. On the other hand, as the COVID-19 pandemic has revealed, Big Data analytics can offer crucial sources of information. In this respect, this paper identifies and systematises the legal provisions concerned, offering interpretative solutions that tackle dangers concerning data subjects’ rights while embracing the opportunities that Big Health Data has to offer. In addition, particular attention is attached to the scope of ‘consent’ as a legal basis for the collection and processing of Big Health Data, as the application of data analytics to Big Health Data signals the construction of new data and subject profiles.
Finally, the paper addresses the knotty problem of role assignment (i.e., distinguishing between controller and processor, or joint controllers and joint processors) in an era of extensive Big Health Data sharing. The findings are the fruit of a current research project conducted by a three-member research team at the Faculty of Law of the Aristotle University of Thessaloniki and funded by the Greek Ministry of Education and Religious Affairs.
Keywords: big health data, data subject rights, GDPR, pandemic
Procedia PDF Downloads 129
169 Career Guidance System Using Machine Learning
Authors: Mane Darbinyan, Lusine Hayrapetyan, Elen Matevosyan
Abstract:
Artificial Intelligence in Education (AIED) has been created to help students get ready for the workforce, and over the past 25 years, it has grown significantly, offering a variety of technologies to support academic, institutional, and administrative services. However, this is still challenging, especially considering the labor market's rapid change. While choosing a career, people face various obstacles because they do not take into consideration their own preferences, which might lead to many other problems like shifting jobs, work stress, occupational infirmity, reduced productivity, and manual error. Besides preferences, people should properly evaluate their technical and non-technical skills, as well as their personalities. Professional counseling has become a difficult undertaking for counselors due to the wide range of career choices brought on by changing technological trends. It is necessary to close this gap by utilizing technology that makes sophisticated predictions about a person's career goals based on their personality. Hence, there is a need to create an automated model that would help in decision-making based on user inputs. Improving career guidance can be achieved by embedding machine learning into the career consulting ecosystem. There are various systems of career guidance that work based on the same logic, such as the classification of applicants, matching applications with appropriate departments or jobs, making predictions, and providing suitable recommendations. Methodologies like KNN, Neural Networks, K-means clustering, D-Tree, and many other advanced algorithms are applied in the fields of data and compute some data, which is helpful to predict the right careers. Besides helping users with their career choice, these systems provide numerous opportunities which are very useful while making this hard decision. 
They help candidates recognize where they specifically lack sufficient skills, so that they can improve them. They can also offer an e-learning platform tailored to the user's knowledge gaps. Furthermore, users can be provided with details on a particular job, such as the abilities required to excel in that industry.
Keywords: career guidance system, machine learning, career prediction, predictive decision, data mining, technical and non-technical skills
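The classification logic described in the abstract can be illustrated with a minimal k-nearest-neighbors sketch. The feature set (math, coding, and communication scores), the labels, and all profile values below are hypothetical illustrations, not data from the authors' system:

```python
import math
from collections import Counter

# Hypothetical candidate profiles: (math_skill, coding_skill, communication),
# each scored 0-10, labeled with an illustrative career track.
TRAINING = [
    ((9, 8, 4), "data_science"),
    ((8, 9, 3), "data_science"),
    ((3, 2, 9), "counseling"),
    ((2, 3, 8), "counseling"),
    ((6, 9, 5), "software_engineering"),
    ((5, 8, 6), "software_engineering"),
]

def predict_career(profile, k=3):
    """Return the majority label among the k nearest training profiles."""
    ranked = sorted(
        (math.dist(profile, feats), label) for feats, label in TRAINING
    )
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]
```

A production system would of course use many more features (personality traits, preferences) and a tuned model, but the nearest-neighbor vote captures the "match the applicant to similar past profiles" idea the abstract describes.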
Procedia PDF Downloads 80
168 150 kVA Multifunction Laboratory Test Unit Based on Power-Frequency Converter
Authors: Bartosz Kedra, Robert Malkowski
Abstract:
This paper describes a laboratory test unit built around a 150 kVA power-frequency converter and the Simulink Real-Time platform. The criteria determining which load and generator types may be simulated with the device are presented, along with the structure of the control algorithm. Since the laboratory setup contains a transformer with a thyristor-controlled tap changer, a wider scope of capabilities is presented. Information about the communication interface, the data maintenance and storage solution, and the Simulink Real-Time features used is provided. All measurements are listed and described, and the potential for modifying the laboratory setup is evaluated. For Rapid Control Prototyping, a dedicated environment, Simulink Real-Time, was used; the load-model Functional Unit Controller is therefore based on a PC with I/O cards and Simulink Real-Time software. Simulink Real-Time creates real-time applications directly from Simulink models. These applications were then loaded onto a target computer connected to physical devices, enabling Hardware-in-the-Loop (HIL) tests as well as the aforementioned Rapid Control Prototyping. Simulink models were extended with I/O card driver blocks, making it possible to generate real-time applications automatically and to perform interactive or automated runs on a dedicated target computer equipped with a real-time kernel, a multicore CPU, and I/O cards. Results of the laboratory tests are presented for different load configurations.
These include simulation of under-frequency load shedding, frequency- and voltage-dependent characteristics of groups of load units, time characteristics of groups of different load units in a chosen area, and arbitrary active and reactive power regulation based on a defined schedule.
Keywords: MATLAB, power converter, Simulink Real-Time, thyristor-controlled tap changer
Procedia PDF Downloads 323
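The under-frequency load shedding mentioned in the abstract above follows a staged-trip pattern that can be sketched in a few lines. The stage thresholds and shed fractions below are generic illustrative values, not the settings used in the authors' test unit:

```python
# Illustrative UFLS stages as (frequency threshold in Hz, fraction of
# initial load to shed). Values are assumptions for the sketch only.
UFLS_STAGES = [(49.0, 0.10), (48.7, 0.10), (48.4, 0.15)]

def load_after_shedding(frequency_hz, load_kw):
    """Return the load remaining after every tripped UFLS stage acts.

    Each stage whose threshold lies above the measured frequency sheds
    its fraction of the *initial* load, mimicking staged relay trips.
    """
    remaining = load_kw
    for threshold, fraction in UFLS_STAGES:
        if frequency_hz < threshold:
            remaining -= load_kw * fraction
    return remaining
```

For example, at 48.5 Hz the first two stages trip (48.5 Hz is below 49.0 Hz and 48.7 Hz but not 48.4 Hz), shedding 20% of the initial load. A real-time implementation in the described setup would add per-stage time delays; this sketch only shows the threshold logic.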