Search results for: five-phase asynchronous machine
596 Neural Network Supervisory Proportional-Integral-Derivative Control of the Pressurized Water Reactor Core Power Load Following Operation
Authors: Derjew Ayele Ejigu, Houde Song, Xiaojing Liu
Abstract:
This work presents a particle swarm optimization trained neural network (PSO-NN) supervisory proportional-integral-derivative (PID) control method to monitor the pressurized water reactor (PWR) core power for safe operation. The proposed control approach is implemented on the transfer function of the PWR core, which is computed from the state-space model. The PWR core state-space model is derived from the neutronics, thermal-hydraulics, and reactivity models using perturbation around the equilibrium values. The proposed control approach computes the control rod speed to maneuver the core power to track the reference in a closed-loop scheme. The particle swarm optimization (PSO) algorithm is used to train the neural network (NN) and to tune the PID simultaneously. The controller performance is examined using the integral absolute error, integral time absolute error, integral square error, and integral time square error functions, and the stability of the system is analyzed using the Bode diagram. The simulation results indicate that the controller controls and tracks the load power more effectively and smoothly than the PSO-PID control technique. This study will benefit the design of supervisory controllers for control applications in nuclear engineering research.
Keywords: machine learning, neural network, pressurized water reactor, supervisory controller
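As an illustration of the PSO-based PID tuning described in this abstract, the sketch below uses PSO to minimize the integral absolute error (IAE) of a PID loop around an assumed first-order plant; the plant, gain bounds, and swarm settings are illustrative assumptions, not the PWR core transfer function or the NN supervisory layer used in the paper.

```python
import numpy as np

def iae_of_pid(gains, dt=0.01, t_end=5.0, tau=10.0):
    """Integral absolute error of a unit-step response for an assumed
    first-order plant tau*dy/dt + y = u under PID control."""
    kp, ki, kd = gains
    y, integral, err_prev, iae = 0.0, 0.0, 1.0, 0.0
    for _ in range(int(t_end / dt)):
        err = 1.0 - y
        integral += err * dt
        derivative = (err - err_prev) / dt
        u = kp * err + ki * integral + kd * derivative
        y += dt * (u - y) / tau          # Euler step of the first-order plant
        err_prev = err
        iae += abs(err) * dt
    return iae

def pso(objective, bounds, n_particles=20, iterations=50,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain particle swarm optimization over box-constrained parameters."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, size=(n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iterations):
        r1, r2 = rng.random((2, *x.shape))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

gains, iae = pso(iae_of_pid, bounds=[(0, 20), (0, 5), (0, 5)])  # assumed bounds
print("Tuned (Kp, Ki, Kd):", gains, "IAE:", iae)
```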
Procedia PDF Downloads 155
595 An Optimal Path for Virtual Reality Education Using Association Rules
Authors: Adam Patterson
Abstract:
This study analyzes the self-reported experiences of virtual reality users to develop insight into an optimal learning path for education within virtual reality. The research uses a sample of 1000 observations to statistically define the factors influencing (i) the immersion level and (ii) the motion sickness rating of college-age virtual reality respondents. This paper recommends an efficient duration for each virtual reality session, to minimize sickness and maximize engagement, utilizing modern machine learning methods such as association rules. The goal of this research, in combination with previous literature, is to inform logistical decisions relating to the implementation of pilot instruction for virtual reality at the collegiate level. Future research will include a randomized controlled trial (RCT) to quantify the effect of virtual reality education on student learning outcomes and engagement measures. The current research aims to maximize the treatment effect within the RCT by optimizing the learning benefits of virtual reality. Results suggest significant gender heterogeneity in the likelihood of reporting motion sickness: females are 1.7 times more likely than males to report high levels of motion sickness resulting from a virtual reality experience. Regarding duration, respondents were 1.29 times more likely to select the lowest level of motion sickness after an engagement lasting between 24.3 and 42 minutes. Conversely, respondents whose sessions lasted 42 to 60 minutes were 1.2 times more likely to select the higher levels of motion sickness.
Keywords: applications and integration of e-education, practices and cases in e-education, systems and technologies in e-education, technology adoption and diffusion of e-learning
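A minimal sketch of the association-rule metrics (support, confidence, lift) behind such an analysis is shown below; the session-duration bins follow the abstract, but the data are randomly generated placeholders rather than the study's 1000 survey observations, so the printed lifts will sit close to 1.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 1000
# Hypothetical survey-style records: session duration bin and reported sickness level
df = pd.DataFrame({
    "duration": rng.choice(["<24.3 min", "24.3-42 min", "42-60 min"], size=n),
    "sickness": rng.choice(["low", "medium", "high"], size=n),
})

def rule_metrics(df, antecedent, consequent):
    """Support, confidence and lift of the rule: duration==antecedent -> sickness==consequent."""
    a = df["duration"] == antecedent
    c = df["sickness"] == consequent
    support = (a & c).mean()
    confidence = support / a.mean()
    lift = confidence / c.mean()
    return support, confidence, lift

for bin_ in ["<24.3 min", "24.3-42 min", "42-60 min"]:
    s, conf, lift = rule_metrics(df, bin_, "low")
    print(f"{bin_} -> low sickness: support={s:.3f} confidence={conf:.3f} lift={lift:.2f}")
```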
Procedia PDF Downloads 67
594 The Analysis of Emergency Shutdown Valves Torque Data in Terms of Its Use as a Health Indicator for System Prognostics
Authors: Ewa M. Laskowska, Jorn Vatn
Abstract:
Industry 4.0 focuses on the digital optimization of industrial processes. The idea is to use extracted data to build a decision support model enabling the use of those data for real-time decision making. In terms of predictive maintenance, the desired decision support tool would be a model enabling prognostics of the system's health based on the current condition of the considered equipment. Within the area of system prognostics and health management, a commonly used health indicator is the Remaining Useful Lifetime (RUL) of a system. Because the RUL is a random variable, it has to be estimated based on available health indicators. Health indicators can be of different types and come from different sources: they can be process variables, equipment performance variables, data related to the number of experienced failures, etc. The aim of this study is the analysis of performance variables of emergency shutdown valves (ESV) used in the oil and gas industry. ESVs are inspected periodically, and at each inspection the torque and operating time of the valve are registered. The data will be analyzed by means of machine learning or statistical analysis. The purpose is to investigate whether the available data could be used as a health indicator for prognostic purposes. The second objective is to examine the most efficient way to incorporate the data into a predictive model. The idea is to check whether the data can be applied as explanatory variables in a Markov process or whether other stochastic processes would be more convenient for building an RUL model based on the information coming from the registered data.
Keywords: emergency shutdown valves, health indicator, prognostics, remaining useful lifetime, RUL
Procedia PDF Downloads 91
593 Study of the Performances of an Environmental Concrete Based on Recycled Aggregates and Marble Waste Fillers Addition
Authors: Larbi Belagraa, Miloud Beddar, Abderrazak Bouzid
Abstract:
The construction sector's need for concrete is still increasing. However, the shortage of natural aggregate resources could become a problem for the concrete industry, in addition to the negative environmental impact of demolition waste. Recycling aggregate from construction and demolition (C&D) waste is of major interest to users and researchers of concrete, since this constituent can occupy more than 70% of the concrete volume. The aim of the study herein is to assess the effect of sulfate-resistant cement combined with a local mineral addition of marble waste fillers on the mechanical behavior of a recycled aggregate concrete (RAC). Physical and mechanical properties of RAC, including the density and the flexural and compressive strengths, were studied. Non-destructive test methods (pulse velocity, rebound hammer) were also performed. The results obtained were compared to a crushed aggregate concrete (CAC) using the standard compressive testing machine method. The optimal content of 5% marble fillers showed an improvement for all test methods used (compression, flexure, and NDT). Non-destructive methods (ultrasonic and rebound hammer tests) can be used to assess the strength of RAC, but a correction coefficient is required to obtain values similar to the compressive strength given by the compression tests. The study emphasizes that these waste materials can be successfully and economically utilized as additional inert filler in RAC formulations, with performance similar to that of a conventional concrete.
Keywords: marble waste fillers, mechanical strength, natural aggregate, non-destructive testing (NDT), recycled aggregate concrete
Procedia PDF Downloads 312
592 Image Ranking to Assist Object Labeling for Training Detection Models
Authors: Tonislav Ivanov, Oleksii Nedashkivskyi, Denis Babeshko, Vadim Pinskiy, Matthew Putman
Abstract:
Training a machine learning model for object detection that generalizes well is known to benefit from a training dataset with diverse examples. However, training datasets usually contain many repeats of common examples of a class and lack rarely seen examples. This is due to the process commonly used during human annotation, where a person proceeds sequentially through a list of images, labeling a sufficiently high total number of examples. Instead, the method presented involves an active process where, after the initial labeling of several images is completed, the next subset of images for labeling is selected by an algorithm. This process of algorithmic image selection and manual labeling continues in an iterative fashion. The algorithm used for the image selection is a deep learning algorithm, based on the U-shaped architecture, which quantifies the presence of unseen data in each image in order to find the images that contain the most novel examples. Moreover, the location of the unseen data in each image is highlighted, aiding the labeler in spotting these examples. Experiments performed using semiconductor wafer data show that labeling a subset of the data curated by this algorithm resulted in a model with better performance than a model produced by sequentially labeling the same amount of data. Also, similar performance is achieved compared to a model trained on exhaustive labeling of the whole dataset. Overall, the proposed approach results in a dataset that has a diverse set of examples per class as well as more balanced classes, which proves beneficial when training a deep learning model.
Keywords: computer vision, deep learning, object detection, semiconductor
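The core of the iterative selection loop can be sketched as below; here a simple nearest-neighbor distance in an embedding space stands in for the paper's U-shaped network novelty score, and the random feature vectors are placeholders for per-image embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)
# Placeholder embeddings standing in for per-image feature vectors
unlabeled = rng.normal(size=(500, 64))   # images not yet labeled
labeled = rng.normal(size=(40, 64))      # images already labeled

def novelty_scores(unlabeled, labeled):
    """Distance of each unlabeled embedding to its nearest labeled embedding."""
    dists = np.linalg.norm(unlabeled[:, None, :] - labeled[None, :, :], axis=-1)
    return dists.min(axis=1)

def select_batch(unlabeled, labeled, batch_size=10):
    """Return the indices of the most novel images for the next labeling round."""
    scores = novelty_scores(unlabeled, labeled)
    return np.argsort(scores)[::-1][:batch_size]

batch = select_batch(unlabeled, labeled)
print("Indices proposed for the next labeling round:", batch)
```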
Procedia PDF Downloads 136
591 Biaxial Fatigue Specimen Design and Testing Rig Development
Authors: Ahmed H. Elkholy
Abstract:
An elastic analysis is developed to obtain the distribution of stresses, strains, bending moments, and deformations for a thin, hollow, variable-thickness cylindrical specimen subjected to different biaxial loadings. The specimen was subjected to a combination of internal pressure, axial tensile loading, and external pressure. Several axial-to-circumferential stress ratios were investigated in detail. The analytical model was then validated using experimental results obtained from a test rig under several biaxial loadings. Based on the preliminary results obtained, the specimen was then modified geometrically to ensure a uniform strain distribution through its wall thickness and along its gauge length. The new design of the specimen has a higher buckling strength and a maximum value of equivalent stress according to the maximum distortion energy theory. A cyclic function generator of the standard servo-controlled, electro-hydraulic testing machine is used to generate a specific signal shape (sine, square, …) at a certain frequency. The two independent controllers of the electronic circuit cause independent movements of each servo-valve piston. The movement of each piston pressurizes the upper and lower sides of the actuators alternately, so the specimen is subjected to axial and diametral loads independently of each other. The hydraulic system has two different pressures: one is responsible for the axial stress produced in the specimen and the other for the tangential stress. Changing the ratio of the two pressures changes the stress ratio accordingly. The only restrictions on the maximum stress obtained are the capacity of the testing system and specimen instability due to buckling.
Keywords: biaxial, fatigue, stress, testing
Procedia PDF Downloads 128
590 Multiscale Entropy Analysis of Electroencephalogram (EEG) of Alcoholic and Control Subjects
Authors: Lal Hussain, Wajid Aziz, Imtiaz Ahmed Awan, Sharjeel Saeed
Abstract:
Multiscale entropy (MSE) analysis is a useful technique recently developed to quantify the dynamics of physiological signals at different time scales. This study is aimed at investigating electroencephalogram (EEG) signals to analyze the background activity of alcoholic and control subjects by inspecting the coarse-grained sequences formed at different time scales. EEG recordings of alcoholic and control subjects, acquired using 64 electrodes, were taken from the publicly available machine learning repository of the University of California, Irvine (UCI). The MSE analysis was performed on the EEG data acquired from all the electrodes of the alcoholic and control subjects. The Mann-Whitney rank test was used to find significant differences between the groups, and results were considered statistically significant for p-values < 0.05. The area under the receiver operating characteristic curve was computed to find the degree of separation between the groups. The mean ranks of the MSE values at all time scales for all electrodes were higher for control subjects than for alcoholic subjects; higher mean ranks represent higher complexity and vice versa. The findings indicated that EEG signals acquired through electrodes C3, C4, F3, F7, F8, O1, O2, P3, and T7 showed significant differences between alcoholic and control subjects at time scales 1 to 5. Moreover, all electrodes exhibited significance at different time scales. Likewise, the highest accuracy and separation were obtained at the central region (C3 and C4) and the fronto-polar regions (P3, O1, F3, F7, F8, and T8), while other electrodes such as Fp1, Fp2, P4, and F4 showed no significant results.
Keywords: electroencephalogram (EEG), multiscale sample entropy (MSE), Mann-Whitney test (MMT), receiver operating characteristic curve (ROC), complexity analysis
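A compact sketch of the coarse-graining and sample entropy computation that underlies MSE is given below; the random signal is a placeholder for one EEG channel, and the tolerance r = 0.2·SD and embedding dimension m = 2 are commonly used defaults assumed here rather than taken from the paper.

```python
import numpy as np

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return np.asarray(x[:n * scale], dtype=float).reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D signal (r in absolute units)."""
    x = np.asarray(x, dtype=float)
    def matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=-1)
        return (dist <= r).sum() - len(templates)   # exclude self-matches
    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=range(1, 6), m=2):
    r = 0.2 * np.std(x)                  # tolerance fixed from the original signal
    return [sample_entropy(coarse_grain(x, s), m=m, r=r) for s in scales]

rng = np.random.default_rng(0)
eeg_channel = rng.normal(size=1000)      # placeholder for one EEG channel
print(multiscale_entropy(eeg_channel))
```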
Procedia PDF Downloads 376
589 Tensile Behavior of Oil Palm Fiber Concrete (OPFC) with Different Fiber Volume
Authors: Khairul Zahreen Mohd Arof, Rahimah Muhamad
Abstract:
Oil palm fiber (OPF) is a fibrous material produced from the waste of the palm oil industry which is suitable for use in the construction industry. The application of OPF in concrete can reduce material costs and enhance concrete behavior. The dog-bone test provides significant results for investigating the behavior of fiber reinforced concrete under tensile loading: it is able to provide the stress-strain profile, modulus of elasticity, stress at the cracking point, and total crack width. In this research, dog-bone tests have been conducted to analyze the total crack width, stress-strain profile, and modulus of elasticity of OPFC. Specimens are dog-bone shaped, with a long notch in the middle that is narrower than the ends, to ensure that cracks occur only within the notch. Tests were instrumented using a Shimadzu 300 kN universal testing machine, a linear variable differential transformer, and two strain gauges. A total of nine specimens with fiber volume fractions of 0.75%, 1.00%, and 1.25% were tested to analyze the behavior under tensile loading. In addition, three plain concrete specimens were tested as controls. The tensile tests of all specimens were carried out at a concrete age exceeding 28 days. The results show that OPFC is able to reduce the total crack width. In addition, OPFC has a higher cracking stress than plain concrete. The study shows that plain concrete can be improved with the addition of OPF.
Keywords: cracks, crack width, dog-bone test, oil palm fiber concrete
Procedia PDF Downloads 344
588 A Human Centered Design of an Exoskeleton Using Multibody Simulation
Authors: Sebastian Kölbl, Thomas Reitmaier, Mathias Hartmann
Abstract:
Trial-and-error approaches to adapting wearable support structures to human physiology are time-consuming and elaborate. During preliminary design, however, the focus lies on understanding the interaction between the exoskeleton and the human body in terms of forces and moments, namely body mechanics. For the study at hand, a multibody simulation approach has been enhanced to evaluate the actual forces and moments in a human dummy model with and without a digital mock-up of an active exoskeleton. Therefore, different motion data have been gathered and processed to perform a musculoskeletal analysis. The motion data are ground reaction forces, electromyography (EMG) data, and human motion data recorded with a marker-based motion capture system. Based on the experimental data, the response of the human dummy model has been calibrated. Subsequently, the scalable human dummy model, in conjunction with the motion data, is connected with the exoskeleton structure. The results of the human-machine interaction (HMI) simulation platform are, in particular, the resulting contact forces and human joint forces, which are compared with admissible values with regard to human physiology. Furthermore, it provides feedback for the sizing of the exoskeleton structure in terms of the resulting interface forces (stress justification) and the effect of its compliance. A stepwise approach for the setup and validation of the modeling strategy is presented, and the potential for a more time- and cost-effective development of wearable support structures is outlined.
Keywords: assistive devices, ergonomic design, inverse dynamics, inverse kinematics, multibody simulation
Procedia PDF Downloads 162
587 Artificial Intelligence Based Abnormality Detection System and Real Valuᵀᴹ Product Design
Authors: Junbeom Lee, Jaehyuck Cho, Wookyeong Jeong, Jonghan Won, Jungmin Hwang, Youngseok Song, Taikyeong Jeong
Abstract:
This paper investigates and analyzes meta-learning technologies that use multiple cameras to monitor and check abnormal behavior in people in real time in the healthcare field. Advances in artificial intelligence and computer vision technologies have confirmed that cameras can be useful for individual health monitoring and abnormal behavior detection. Through this, it is possible to establish a system that can respond early by automatically detecting abnormal behavior in people such as patients and the elderly. In this paper, we use a technique called meta-learning to analyze image data collected from cameras and develop a commercial product that determines abnormal behavior. Meta-learning applies machine learning algorithms to help systems learn and adapt quickly to new real data. Through this, the accuracy and reliability of the abnormal behavior discrimination system can be improved. In addition, this study proposes a meta-learning-based abnormal behavior detection system that includes steps such as data collection and preprocessing, feature extraction and selection, and classification model development. The performance of the proposed system is analyzed in various healthcare scenarios and experiments, demonstrating its superiority over existing methods. Through this study, we present the possibility that camera-based meta-learning technology can be useful for monitoring and detecting abnormal behavior in the healthcare area.
Keywords: artificial intelligence, abnormal behavior, early detection, health monitoring
Procedia PDF Downloads 86
586 Integrated Free Space Optical Communication and Optical Sensor Network System with Artificial Intelligence Techniques
Authors: Yibeltal Chanie Manie, Zebider Asire Munyelet
Abstract:
5G and 6G technology offers enhanced quality of service with high data transmission rates, which necessitates the implementation of the Internet of Things (IoT) in the 5G/6G architecture. In this paper, we propose the integration of free-space optical communication (FSO) with fiber sensor networks for IoT applications. Recently, FSO has been gaining popularity as an effective alternative to the limited availability of radio frequency (RF) spectrum, owing to its flexibility, high achievable optical bandwidth, and low power consumption in several communication applications, such as disaster recovery, last-mile connectivity, drones, surveillance, backhaul, and satellite communications. Hence, high-speed FSO is an optimal choice for wireless networks to satisfy the full potential of 5G/6G technology, offering speeds of 100 Gbit/s or more in IoT applications. Moreover, machine learning must be integrated into the design, planning, and optimization of future optical wireless communication networks in order to actualize this vision of intelligent processing and operation. In addition, fiber sensors are important for achieving real-time, accurate, and smart monitoring in IoT applications. We therefore propose deep learning techniques to estimate the strain changes and peak wavelengths of multiple fiber Bragg grating (FBG) sensors using only the FBG spectra obtained from a real experiment.
Keywords: optical sensor, artificial intelligence, Internet of Things, free-space optics
Procedia PDF Downloads 63
585 AI-Powered Models for Real-Time Fraud Detection in Financial Transactions to Improve Financial Security
Authors: Shanshan Zhu, Mohammad Nasim
Abstract:
Financial fraud continues to be a major threat to financial institutions across the world, causing colossal monetary losses and undermining public trust. Fraud prevention techniques based on hard rules have become ineffective due to the evolving patterns of fraud in recent times. Against this background, the present study probes distinct methodologies that exploit emergent AI-driven techniques to further strengthen fraud detection. We compare the performance of generative adversarial networks and graph neural networks with other popular techniques, such as gradient boosting, random forests, and neural networks. To this end, we recommend integrating all these state-of-the-art models into one robust, flexible, and smart system for real-time anomaly and fraud detection. To overcome the challenge, we designed synthetic data and then conducted pattern recognition as well as unsupervised and supervised learning analyses on the transaction data to identify which activities were suspicious. Using actual financial statistics, we compare the performance of our model in accuracy, speed, and adaptability against conventional models. The results of this study illustrate a strong need to integrate state-of-the-art, AI-driven fraud detection solutions into frameworks that are highly relevant to the financial domain. They highlight the urgency with which banks and related financial institutions must implement these advanced technologies to maintain a high level of security.
Keywords: AI-driven fraud detection, financial security, machine learning, anomaly detection, real-time fraud detection
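A minimal sketch of the kind of model comparison described here is shown below, using scikit-learn baselines (random forest, gradient boosting, a feed-forward neural network) on a synthetic, imbalanced transaction dataset; the GAN and graph neural network components mentioned in the abstract are not reproduced, and all data and settings are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic, highly imbalanced "transactions": about 2% labeled as fraud
X, y = make_classification(n_samples=20000, n_features=20,
                           weights=[0.98, 0.02], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "gradient boosting": GradientBoostingClassifier(random_state=0),
    "neural network": MLPClassifier(hidden_layer_sizes=(64, 32),
                                    max_iter=300, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: ROC-AUC = {auc:.3f}")
```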
Procedia PDF Downloads 41
584 The Methodology of Hand-Gesture Based Form Design in Digital Modeling
Authors: Sanghoon Shim, Jaehwan Jung, Sung-Ah Kim
Abstract:
As digital technology develops, studies on tangible user interfaces (TUI), which link the physical environment perceived through the human senses with the virtual environment through the computer, are actively being conducted. In addition, there has been tremendous progress in computational design through the use of computer-aided design techniques, which enable optimized decision-making through machine learning and the parallel comparison of alternatives. A complex design that responds to user requirements or performance can emerge through the intuition of the designer, but it is difficult to actualize the emerged design by the designer's ability alone. Ancillary tools, such as Gaudí's sandbag models, can be instruments to reinforce and evolve the ideas emerging from designers. With the advent of many commercial tools that support 3D objects, designers' intentions are easily reflected in their designs, but the degree to which they are reflected depends on the designer's proficiency with the design tools. This study implements an environment in which form can be shaped by the designer's fingers in the initial design phase of complex building design. A Leap Motion controller is used as a sensor to recognize the hand motions of the designer, which are converted into digital information to realize an environment that can be linked in real time in virtual reality (VR). In addition, the implemented design can be linked with Rhino™, a 3D authoring tool, and its plug-in Grasshopper™ in real time. As a result, it is possible to design intuitively using the TUI, and it can serve as a tool for assisting designer intuition.
Keywords: design environment, digital modeling, hand gesture, TUI, virtual reality
Procedia PDF Downloads 366
583 The Research of the Relationship between Triathlon Competition Results with Physical Fitness Performance
Authors: Chen Chan Wei
Abstract:
The purpose of this study was to investigate the impact of a 1500 m swim, a 10000 m run, VO2 max, and body fat on Olympic-distance triathlon competition performance. The subjects were thirteen college triathletes with endurance training, with an average age, height, and weight of 20.61±1.04 years (mean ± SD), 171.76±8.54 cm, and 65.32±8.14 kg, respectively. All subjects were required to take the 1500 m swim test, the 10000 m run test, the VO2 max and body fat measurements, and to participate in the Olympic-distance triathlon competition. First, the 1500 m swim test was conducted in a standard 50 m pool with a depth of 2 m, and the 10000 m run test on a standard 400 m track. After three days, VO2 max was tested with the MetaMax 3B and body fat was measured with the DEXA machine. After two weeks, all 13 subjects joined the Olympic-distance triathlon competition at the 2016 New Taipei City Asian Cup. The relationships between the 1500 m swim, the 10000 m run, VO2 max, body fat, and Olympic-distance triathlon competition performance were evaluated using Pearson's product-moment correlation. The results show that the 10000 m run and body fat had a significant positive correlation with Olympic-distance triathlon performance (r = .830, .768), while VO2 max had a significant negative correlation with Olympic-distance triathlon performance (r = -.735). In conclusion, to improve non-drafting Olympic-distance triathlon performance, triathletes should focus more on running than on swimming training, and VO2 max can be measured to predict triathlon performance. Also, managing body fat can improve Olympic-distance triathlon performance. In addition, swimming performance was not significantly correlated with Olympic-distance triathlon performance, possibly because the 2016 New Taipei City Asian Cup age-group race was not a drafting competition. The swim is the shortest component of Olympic-distance triathlons; therefore, in a non-drafting competition, swimming ability is not significantly correlated with overall performance.
Keywords: triathletes, olympic, non-drafting, correlation
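The correlation analysis reported here can be reproduced in a few lines; in the sketch below the thirteen athletes' measurements are replaced by randomly generated stand-in values, so the printed coefficients are illustrative only.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 13  # same sample size as the study, but placeholder values
run_10k_min = rng.normal(38, 3, size=n)       # 10000 m run time (min)
vo2_max = rng.normal(60, 5, size=n)           # VO2 max (ml/kg/min)
body_fat_pct = rng.normal(12, 3, size=n)      # body fat (%)
# Hypothetical overall triathlon time driven mainly by running and VO2 max
triathlon_min = 130 + 0.8 * run_10k_min - 0.5 * vo2_max + rng.normal(0, 2, size=n)

for name, x in [("10000 m run", run_10k_min),
                ("VO2 max", vo2_max),
                ("body fat", body_fat_pct)]:
    r, p = pearsonr(x, triathlon_min)
    print(f"{name} vs. triathlon time: r = {r:.2f}, p = {p:.3f}")
```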
Procedia PDF Downloads 250
582 Moderation in Temperature Dependence on Counter Frictional Coefficient and Prevention of Wear of C/C Composites by Synthesizing SiC around Surface and Internal Vacancies
Authors: Noboru Wakamoto, Kiyotaka Obunai, Kazuya Okubo, Toru Fujii
Abstract:
The aim of this study is to moderate the temperature dependence of the frictional coefficient between counter surfaces and to reduce the wear of C/C composites at low temperature. To modify the C/C composites, silica (SiO2) powders were added into the phenolic resin used as the carbon precursor. The preform plates of the C/C composite precursor were prepared by the conventional filament winding method. The C/C composite plates were obtained by carbonizing the preform plates at 2200 °C under an argon atmosphere. During carbonization, silicon carbide (SiC) was synthesized around the surfaces and the internal vacancies of the C/C composites. The frictional coefficient between the counter surfaces and the specific wear volumes of the C/C composites were measured with a pin-on-disk type friction test machine developed by the authors. XRD indicated that SiC was synthesized in the body of the C/C composites fabricated by the current method. The friction tests showed that the friction coefficient of the unmodified C/C composites depends on temperature when the test conditions are changed. In contrast, the frictional coefficient of the C/C composites modified with SiO2 powders remained almost constant at about 0.27 when the temperature was changed from room temperature (RT) to 300 °C. The specific wear rate decreased from 25×10⁻⁶ mm²/N to 0.1×10⁻⁶ mm²/N. Observation of the surfaces after the friction tests showed that the frictional surface of the modified C/C composites was covered with a film produced by the friction. This study found that synthesizing SiC around the surface and internal vacancies of C/C composites is effective in moderating the temperature dependence of the frictional coefficient and reducing the abrasion of C/C composites.
Keywords: C/C composites, friction coefficient, wear, SiC
Procedia PDF Downloads 344
581 TimeTune: Personalized Study Plans Generation with Google Calendar Integration
Authors: Chevon Fernando, Banuka Athuraliya
Abstract:
The purpose of this research is to provide a solution to students' time management, which usually becomes an issue because students must balance their studies with personal commitments. "TimeTune," an AI-based study planner that makes it possible to maneuver study timeframes by combining modern machine learning algorithms with calendar applications, is presented as the proposed solution. The research focuses on the development of LSTM models that connect to the Google Calendar API to build learning paths fitted to an individual student's daily life and study history. A key finding of this research is the success in building an LSTM model to predict optimal study times, which, integrated with real-time Google Calendar data, generates timetables automatically in a personalized and customized manner. The methodology encompasses Agile development practices and Object-Oriented Analysis and Design (OOAD) principles, focusing on user-centric design and iterative development. By adopting this method, students can significantly reduce the tension associated with poor study habits and time management. In conclusion, "TimeTune" represents an advanced step in personalized education technology. Its application of ML algorithms and calendar integration is innovative and is steadily changing how students manage their studies. By helping students maintain a balanced academic and personal life, the application promises to reduce the stress of managing their studies.
Keywords: personalized learning, study planner, time management, calendar integration
Procedia PDF Downloads 48
580 Image Recognition Performance Benchmarking for Edge Computing Using Small Visual Processing Unit
Authors: Kasidis Chomrat, Nopasit Chakpitak, Anukul Tamprasirt, Annop Thananchana
Abstract:
Internet of Things (IoT) devices and edge computing have become some of the most discussed innovations, with the potential to improve and disrupt traditional business and industry alike. New challenges, such as the COVID-19 pandemic, have posed dangers to workforces and business processes. The business landscape has changed drastically in the aftermath of the global pandemic, with the looming threats of a global energy crisis, global warming, and increasingly heated global politics that risk becoming a new Cold War. Emerging technologies such as edge computing and specially designed visual processing units therefore present great opportunities for business. The literature is reviewed on how the Internet of Things and this disruptive wave will affect business, how businesses need to adapt to changes in the market and the world, and how benchmark tests of consumer-marketed devices, such as IoT devices equipped with new edge computing hardware, can increase efficiency and reduce the risks posed by current and looming crises. Throughout the paper, we explain the technologies that have led to the present situation and why they will change traditional practice, through brief introductions to cloud computing, edge computing, and the Internet of Things, and how they will lead into the future.
Keywords: internet of things, edge computing, machine learning, pattern recognition, image classification
Procedia PDF Downloads 155
579 Predictive Analytics Algorithms: Mitigating Elementary School Drop Out Rates
Authors: Bongs Lainjo
Abstract:
Educational institutions and authorities that are mandated to run education systems in various countries need to implement curricula that take into account the possibility and existence of elementary school dropouts. This research focuses on elementary school dropout rates and the ability to replicate various predictive models carried out globally on selected elementary schools. The study was carried out by comparing classical case studies in Africa, North America, South America, Asia, and Europe. Some of the reasons put forward for children dropping out include the notion of being successful in life without necessarily going through the education process. Such a mentality is compounded by a demanding curriculum that does not cater to all students. This has led to poor school attendance and truancy, which continuously lead to dropouts. In this study, the focus is on developing a model that can be systematically implemented by school administrations to prevent possible dropout scenarios. At the elementary level, especially in the lower grades, a child's perception of education can be easily changed so that they focus on the better future that their parents desire. To deal effectively with the elementary school dropout problem, the strategies that are put in place need to be studied, and predictive models installed in every education system with a view to helping prevent an imminent school dropout just before it happens. In the competency-based curricula that most advanced nations are trying to implement, education systems have wholesome ideas of learning that reduce the dropout rate.
Keywords: elementary school, predictive models, machine learning, risk factors, data mining, classifiers, dropout rates, education system, competency-based curriculum
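As an illustration of the kind of predictive model discussed, the sketch below trains a logistic regression classifier on synthetic student records; the risk factors (attendance, grades, a socio-economic index) and the label-generating rule are assumptions made for the example, not data from the cited case studies.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 2000
# Hypothetical risk factors for each student
X = np.column_stack([
    rng.uniform(0.5, 1.0, n),   # attendance rate
    rng.uniform(0, 100, n),     # grade average
    rng.normal(0, 1, n),        # socio-economic index
])
# Synthetic dropout label: low attendance and low grades raise dropout probability
logit = -6 * (X[:, 0] - 0.75) - 0.05 * (X[:, 1] - 50) - 0.5 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```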
Procedia PDF Downloads 175
578 Numerical Simulation of the Flowing of Ice Slurry in Seawater Pipe of Polar Ships
Authors: Li Xu, Huanbao Jiang, Zhenfei Huang, Lailai Zhang
Abstract:
In recent years, with global warming, the sea-ice extent of the Arctic has undergone an evident decrease, and the Arctic channel has attracted the attention of the shipping industry. Ice crystals present in the seawater of the Arctic channel, which enter the ship's seawater system together with the seawater, have been found to block the seawater pipes. In serious cases, cooler failure, auxiliary machine errors, and even paralysis of the ship's power system may occur. In order to reduce the effect of high temperature on the auxiliary equipment, the seawater system uses external ice-water in the cooling cycle and keeps it flowing, so that the distribution of ice crystals in the seawater pipe can be obtained. As ice slurry is a solid-liquid two-phase system, the flow of the ice-water mixture is very complex and diverse. In this paper, the flow of ice slurry in a seawater pipe is simulated with fluid dynamics simulation software based on the k-ε turbulence model. As the ice packing fraction is a key factor affecting the distribution of ice crystals, its influence on the flow of the ice slurry is analyzed. The simulation results show that when the ice packing fraction is relatively large, the distribution of ice crystals in the flowing seawater is uneven, which increases the possibility of blocking; this provides a scientific forecasting method for the formation of ice blockages in seawater piping systems. It is of great significance for the operational reliability of polar ships in the future.
Keywords: ice slurry, seawater pipe, ice packing fraction, numerical simulation
Procedia PDF Downloads 367
577 Electricity Price Forecasting: A Comparative Analysis with Shallow-ANN and DNN
Authors: Fazıl Gökgöz, Fahrettin Filiz
Abstract:
Electricity prices have sophisticated features such as high volatility, nonlinearity, and high frequency that make forecasting quite difficult. Electricity prices have a volatile yet non-random character, so it is possible to identify patterns based on historical data. Intelligent decision-making requires accurate price forecasting for market traders, retailers, and generation companies. So far, many shallow-ANN (artificial neural network) models have been published in the literature and have shown adequate forecasting results. In recent years, neural networks with many hidden layers, referred to as DNNs (deep neural networks), have been used in the machine learning community. The goal of this study is to investigate the electricity price forecasting performance of shallow-ANN and DNN models for the Turkish day-ahead electricity market. The forecasting accuracy of the models has been evaluated with publicly available data from the Turkish day-ahead electricity market. Both the shallow-ANN and DNN approaches give successful results in forecasting problems. Historical load, price, and weather temperature data are used as the input variables for the models. The data set includes power consumption measurements gathered between January 2016 and December 2017 at one-hour resolution. In this regard, forecasting studies have been carried out comparatively with shallow-ANN and DNN models for the Turkish electricity market in the related time period. The main contribution of this study is the investigation of different shallow-ANN and DNN models in the field of electricity price forecasting. All models are compared regarding their MAE (mean absolute error) and MSE (mean square error) results. DNN models give better forecasting performance compared to shallow-ANNs. The best five MAE results for the DNN models are 0.346, 0.372, 0.392, 0.402, and 0.409.
Keywords: deep learning, artificial neural networks, energy price forecasting, Turkey
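A minimal sketch of such a shallow-versus-deep comparison is given below using scikit-learn's MLPRegressor; the synthetic hourly load, temperature, and price series and the layer sizes are assumptions for illustration, not the Turkish market data or the architectures evaluated in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(0)
n = 5000
# Hypothetical hourly inputs: load, temperature, hour of day; target: day-ahead price
load = rng.uniform(20, 40, n)
temp = rng.uniform(-5, 35, n)
hour = rng.integers(0, 24, n)
price = 2 * load - 0.5 * temp + 3 * np.sin(2 * np.pi * hour / 24) + rng.normal(0, 2, n)
X = np.column_stack([load, temp, hour])
X_tr, X_te, y_tr, y_te = train_test_split(X, price, random_state=0)

models = {
    "shallow ANN (1 hidden layer)": MLPRegressor(hidden_layer_sizes=(32,),
                                                 max_iter=1000, random_state=0),
    "DNN (3 hidden layers)": MLPRegressor(hidden_layer_sizes=(64, 64, 32),
                                          max_iter=1000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(f"{name}: MAE={mean_absolute_error(y_te, pred):.3f} "
          f"MSE={mean_squared_error(y_te, pred):.3f}")
```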
Procedia PDF Downloads 292
576 Vibration Transmission across Junctions of Walls and Floors in an Apartment Building: An Experimental Investigation
Authors: Hugo Sampaio Libero, Max de Castro Magalhaes
Abstract:
The perception of sound radiated from a building floor is greatly influenced by the rooms in which it is immersed and by the positions of both listener and source. The main question that remains unanswered is related to the influence of the source position on the sound power radiated by a complex wall-floor system in buildings. This research is concerned with the investigation of vibration transmission across walls and floors in buildings. It is primarily based on the determination of the vibration reduction index via experimental tests. Knowledge of this parameter may help in predicting noise and vibration propagation in building components. First, the physical mechanisms involved in vibration transmission across structural junctions are described, and an experimental setup is built to aid this investigation. The experimental tests have shown that the vibration generated in the walls and floors is directly related to their size and boundary conditions. It is also shown that the vibration source position can affect the overall vibration spectrum significantly. Second, the characteristics of the noise spectra inside the rooms due to an impact source (tapping machine) are also presented. Conclusions are drawn regarding the general trends of the vibration and noise spectra of the structural components and rooms, respectively. In summary, the aim of this paper is to investigate the vibro-acoustic behavior of building floors and walls under floor impact excitation, with the impact excitation applied at distinct positions on the slab. The analysis highlights the main physical characteristics of the vibration transmission mechanism.
Keywords: vibration transmission, vibration reduction index, impact excitation, experimental tests
Procedia PDF Downloads 93
575 Time Series Simulation by Conditional Generative Adversarial Net
Authors: Rao Fu, Jie Chen, Shutian Zeng, Yiping Zhuang, Agus Sudjianto
Abstract:
The Generative Adversarial Net (GAN) has proved to be a powerful machine learning tool in image data analysis and generation. In this paper, we propose the use of the Conditional Generative Adversarial Net (CGAN) to learn and simulate time series data. The conditions include both categorical and continuous variables with different auxiliary information. Our simulation studies show that CGAN has the capability to learn different types of normal and heavy-tailed distributions, as well as the dependence structures of different time series. It also has the capability to generate conditional predictive distributions consistent with the training data distributions. We also provide an in-depth discussion of the rationale behind GAN and of neural networks as hierarchical splines, to establish a clear connection with existing statistical methods of distribution generation. In practice, CGAN has a wide range of applications in market risk and counterparty risk analysis: it can be applied to learn historical data and generate scenarios for the calculation of Value-at-Risk (VaR) and Expected Shortfall (ES), and it can also predict the movement of market risk factors. We present a real data analysis, including backtesting, to demonstrate that CGAN can outperform Historical Simulation (HS), a popular method in market risk analysis for calculating VaR. CGAN can also be applied in economic time series modeling and forecasting. In this regard, we include an example of hypothetical shock analysis for economic models and the generation of potential CCAR scenarios by CGAN at the end of the paper.
Keywords: conditional generative adversarial net, market and credit risk management, neural network, time series
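Once scenarios have been generated (by CGAN, Historical Simulation, or otherwise), the VaR and ES calculations mentioned in the abstract reduce to a quantile and a tail-average estimate; the sketch below shows that step on a heavy-tailed placeholder P&L sample rather than on CGAN output.

```python
import numpy as np

rng = np.random.default_rng(0)
# Placeholder for P&L scenarios produced by a generator (heavy-tailed daily P&L)
pnl = rng.standard_t(df=4, size=10000) * 1e4

def var_es(pnl, alpha=0.99):
    """Value-at-Risk and Expected Shortfall at confidence level alpha (losses positive)."""
    losses = -np.asarray(pnl)
    var = np.quantile(losses, alpha)          # loss exceeded with probability 1 - alpha
    es = losses[losses >= var].mean()         # average loss beyond the VaR threshold
    return var, es

var, es = var_es(pnl)
print(f"99% VaR = {var:,.0f}, 99% ES = {es:,.0f}")
```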
Procedia PDF Downloads 143
574 Semantic Indexing Improvement for Textual Documents: Contribution of Classification by Fuzzy Association Rules
Authors: Mohsen Maraoui
Abstract:
With the aim of improving natural language processing applications such as information retrieval, machine translation, and lexical disambiguation, we focus on a statistical approach to semantic indexing of multilingual text documents based on a conceptual network formalism. We propose to use this formalism as an indexing language to represent the descriptive concepts and their weights; these concepts represent the content of the document. Our contribution is based on two steps. In the first step, we propose the extraction of index terms using the multilingual lexical resource EuroWordNet (EWN). In the second step, we pass from the representation of index terms to the representation of index concepts through the conceptual network formalism. This network is generated using the EWN resource and passes through a classification step based on an association rules model (in an attempt to discover the non-taxonomic or contextual relations between the concepts of a document). These relations are latent relations buried in the text and carried by the semantic context of the co-occurrence of concepts in the document. Our proposed indexing approach can be applied to text documents in various languages because it is based on a linguistic method adapted to each language through a multilingual thesaurus. Next, we apply the same statistical process regardless of the language in order to extract the significant concepts and their associated weights. We show that the proposed indexing approach provides encouraging results.
Keywords: concept extraction, conceptual network formalism, fuzzy association rules, multilingual thesaurus, semantic indexing
Procedia PDF Downloads 141
573 Effect of Nitrogen-Based Cryotherapy on the Calf Muscle Spasticity in Stroke Patients
Authors: Engi E. I. Sarhan, Usama M. Rashad, Ibrahim M. I. Hamoda, Mohammed K. Mohamed
Abstract:
Background: This study aimed to determine the effect of nitrogen-based cryotherapy on the spasticity of the calf muscle in stroke patients. Patients were selected from the outpatient neurology clinic of Al-Mansoura General Hospital, Al-Mansoura University. Subjects and methods: Thirty stroke patients of both sexes, aged 45 to 60 years, were divided randomly into two equal groups: the study group (A) received nitrogen-based cryotherapy, a selective physical therapy program, and an ankle-foot orthosis (AFO), while patients in the control group (B) received the same program and AFO only. The treatment was applied three times per week for four weeks in both groups. We assessed the spasticity of the calf muscle before and after treatment, subjectively using the modified Ashworth scale (MAS) and objectively by measuring the H/M ratio on an electromyography machine. We also assessed ankle dorsiflexion ROM objectively using two-dimensional (2D) motion analysis. Results: After treatment, there was a highly significant improvement in the study group compared to the control group in the MAS score, no significant difference between the groups in the H/M ratio readings, and a highly significant improvement in the study group compared to the control group in the 2D motion analysis findings. Conclusion: This modality is considered effective in reducing spasticity in the calf muscle and improving ankle dorsiflexion of the affected limb.
Keywords: ankle foot orthosis, nitrogen-based cryotherapy, stroke, spasticity
Procedia PDF Downloads 202
572 Defining Death and Dying in Relation to Information Technology and Advances in Biomedicine
Authors: Evangelos Koumparoudis
Abstract:
The definition of death is a deep philosophical question, and no single meaning can be ascribed to it. This essay focuses on the ontological, epistemological, and ethical aspects of death and dying in view of technological progress in information technology and biomedicine. It starts with the ad hoc 1968 Harvard committee, which proposed that the criterion for the definition of death be irreversible coma, and then turns to the debate over the whole-brain death formula, which emphasizes the integrated function of the organism, and the higher-brain formula, which takes consciousness and personality as essential human characteristics. It continues with the contribution of information technology to personalized and precision medicine and to anti-aging measures aimed at life prolongation. It also touches on the possibility of the creation of human-machine hybrids and how this raises ontological and ethical issues concerning the "cyborgization" of human beings and a conception of the organism and personhood based on a post/transhumanist essence, and, furthermore, whether sentient AI capable of autonomous decision-making that might even surpass human intelligence (singularity, superintelligence) deserves moral or legal personhood. There is also the question as to whether death and dying should be redefined at a transcendent level, a question reinforced by already-existing technologies of "virtual afterlife" and the possibility of uploading human minds. In the last section, I refer to the current (and future) applications of nanomedicine in diagnostics, therapeutics, implants, and tissue engineering, as well as the aspiration to "immortality" through cryonics. The definition of death is reformulated, since the elimination of aging and disease may be realized and the criterion of irreversibility may be challenged.
Keywords: death, posthumanism, infomedicine, nanomedicine, cryonics
Procedia PDF Downloads 70
571 Effect of Fiber Orientation on the Mechanical Properties of Fabricated Plate Using Basalt Fiber
Authors: Sharmili Routray, Kishor Chandra Biswal
Abstract:
The use of corrosion-resistant fiber reinforced polymer (FRP) reinforcement is beneficial in structures, particularly those exposed to deicing salts and/or located in highly corrosive environments. Generally, glass, carbon, and aramid fibers are used for strengthening structures. Given the need for low-weight, high-strength materials, a suitable low-cost substitute is required. Recent developments in fiber production technology allow the strengthening of structures using basalt fiber, which is made from basalt rock. Basalt fiber has a good range of thermal performance, high tensile strength, resistance to acids, good electromagnetic properties, an inert nature, and resistance to corrosion, radiation, UV light, vibration, and impact loading. This investigation focuses on the effect of the fiber content and fiber orientation of basalt fiber on the mechanical properties of the fabricated composites. Specimens were prepared with unidirectional basalt fabric as the reinforcing material and epoxy resin as the matrix of the polymer composite. Different fiber orientations were investigated, and fabrication was done by the hand lay-up process. The variation of the properties with an increasing number of plies of fiber in the composites was also studied. Specimens were subjected to tensile strength tests, and the failure of the composites was examined with the help of an INSTRON (SATEC) universal testing machine of 600 kN capacity. The average tensile strength and modulus of elasticity of the BFRP plates were determined from the test program.
Keywords: BFRP, fabrication, Fiber Reinforced Polymer (FRP), strengthening
Procedia PDF Downloads 292
570 Local Interpretable Model-agnostic Explanations (LIME) Approach to Email Spam Detection
Authors: Rohini Hariharan, Yazhini R., Blessy Maria Mathew
Abstract:
Detecting email spam is a very important task in the era of digital technology, which needs effective ways of curbing unwanted messages. This paper presents an approach aimed at making email spam categorization algorithms transparent, reliable, and more trustworthy by incorporating Local Interpretable Model-agnostic Explanations (LIME). Our technique provides interpretable explanations for specific email classifications to help users understand the model's decision-making process. In this study, we developed a complete pipeline that incorporates LIME into the spam classification framework and allows the creation of simplified, interpretable models tailored to individual emails. LIME identifies influential terms, pointing out the key elements that drive classification results, thus reducing the opacity inherent in conventional machine learning models. Additionally, we suggest a visualization scheme for displaying keywords that improves users' understanding of categorization decisions. We test our method on a diverse email dataset and compare its performance with various baseline models, such as Gaussian Naive Bayes, Multinomial Naive Bayes, Bernoulli Naive Bayes, Support Vector Classifier, K-Nearest Neighbors, Decision Tree, and Logistic Regression. Our testing results show that our model surpasses all other models, achieving an accuracy of 96.59% and a precision of 99.12%.
Keywords: text classification, LIME (local interpretable model-agnostic explanations), stemming, tokenization, logistic regression
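A minimal sketch of applying LIME to a text spam classifier is shown below; the tiny hard-coded corpus and the TF-IDF + Multinomial Naive Bayes pipeline are illustrative stand-ins for the paper's dataset and models, and the `lime` package is assumed to be installed.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from lime.lime_text import LimeTextExplainer

# Tiny illustrative corpus (stand-in for a real labeled e-mail dataset)
texts = [
    "win a free prize now click here", "limited offer claim your reward",
    "cheap loans approved instantly", "free money win lottery now",
    "meeting rescheduled to monday morning", "please review the attached report",
    "lunch tomorrow with the project team", "minutes from yesterday's meeting attached",
]
labels = [1, 1, 1, 1, 0, 0, 0, 0]   # 1 = spam, 0 = ham

pipeline = make_pipeline(TfidfVectorizer(), MultinomialNB())
pipeline.fit(texts, labels)

explainer = LimeTextExplainer(class_names=["ham", "spam"])
explanation = explainer.explain_instance(
    "click here to claim your free prize",
    pipeline.predict_proba,          # LIME perturbs the text and queries this function
    num_features=5,
)
print(explanation.as_list())         # (word, weight) pairs driving the spam decision
```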
Procedia PDF Downloads 47
569 Analysis of the Level of Production Failures by Implementing New Assembly Line
Authors: Joanna Kochanska, Dagmara Gornicka, Anna Burduk
Abstract:
The article examines the process of implementing a new assembly line in a manufacturing enterprise in the household appliances industry. At the initial stages of the project, it was decided that one of its foundations should be the concept of lean management. Because of that, eliminating as many errors as possible in the first phases of its functioning was emphasized. During the start-up of the line, all production losses were identified and documented (from serious machine failures, through any unplanned downtime, to micro-stops and quality defects). Over 6 weeks (the line start-up period), all errors resulting from problems in various areas were analyzed. These areas were, among others, production, logistics, quality, and organization. The aim of the work was to analyze the occurrence of production failures during the initial phase of starting up the line and to propose a method for determining their critical level during its full operation. The repeatability of production losses in various areas and at different levels at such an early stage of implementation was examined using statistical process control methods. Based on a Pareto analysis, the weakest points were identified in order to focus improvement actions on them. The next step was to examine the effectiveness of the actions undertaken to reduce the level of recorded losses. Based on the obtained results, a method was proposed for determining the critical failure level in the studied areas. The developed coefficient can be used as an alarm in the case of a production imbalance caused by an increased failure level in production and production-support processes during the period of standardized functioning of the line.
Keywords: production failures, level of production losses, new production line implementation, assembly line, statistical process control
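The Pareto step described here is straightforward to reproduce; the sketch below ranks hypothetical loss categories (the counts are invented for illustration, not taken from the case study) and flags the categories accounting for roughly 80% of all recorded losses.

```python
import pandas as pd

# Hypothetical counts of production losses recorded during the 6-week start-up
failures = pd.Series({
    "quality defects": 120, "micro-stops": 95, "logistics delays": 60,
    "machine failures": 35, "organizational issues": 20, "other": 10,
})
pareto = failures.sort_values(ascending=False).to_frame("count")
pareto["cumulative %"] = 100 * pareto["count"].cumsum() / pareto["count"].sum()
print(pareto)

# Improvement actions focus on the categories covering roughly 80% of all losses
critical = pareto[pareto["cumulative %"] <= 80].index.tolist()
print("Focus areas:", critical)
```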
Procedia PDF Downloads 128
568 Quantification Model for Capability Evaluation of Optical-Based in-Situ Monitoring System for Laser Powder Bed Fusion (LPBF) Process
Authors: Song Zhang, Hui Wang, Johannes Henrich Schleifenbaum
Abstract:
Due to the increasing demand for quality assurance and reliability in additive manufacturing, the development of an advanced in-situ monitoring system is required to monitor process anomalies as input for further process control. Optical-based monitoring systems, such as CMOS cameras and NIR cameras, have proved to be effective for monitoring geometrical distortion and abnormal thermal distribution. Therefore, many studies and applications focus on the suitability of optical-based monitoring systems for detecting various types of defects. However, the capability of the monitoring setup is usually not quantified. In this study, a quantification model is presented to evaluate the capability of monitoring setups for the LPBF machine based on monitoring data acquired from a designed test artifact, and the design of the relevant test artifacts is discussed. The monitoring setup is evaluated based on its hardware properties, the location of its integration, and the light conditions. The methodology of data processing to quantify the capability for each aspect is discussed. The minimal detectable feature size of the monitoring setup in the application is estimated by quantifying its resolution and accuracy. The quantification model is validated using a CCD camera-based monitoring system for LPBF machines in the laboratory with different setups. The results show that the model quantifies the monitoring system's performance, which makes it possible to evaluate monitoring systems with the same concept but different setups for the LPBF process and provides direction for improving the setups.
Keywords: data processing, in-situ monitoring, LPBF process, optical system, quantization model, test artifact
Procedia PDF Downloads 197
567 Effect of B2O3 Addition on Sol-gel Synthesized 45S5 Bioglass
Abstract:
Ceramics or glass-ceramics that bond to the surrounding bone tissue and enable possible bone ingrowth are known to be bioactive. The most extensively used glass in this context is 45S5, a silica-based bioglass mostly explored in the field of tissue engineering as a scaffold material for bone repair. Nowadays, borate-based bioglasses are being utilized in the orthopedic area, largely due to their superior bioactivity and bone-bonding ability. In the present study, an attempt has been made to observe the effect of B2O3 addition to 45S5 glass and its consequences on the thermal, mechanical, and biological properties. B2O3 was added at 1, 2.5, and 5 wt% with a simultaneous reduction of the silica content of the 45S5 composition. The borate-based bioglasses were synthesized by the sol-gel route. The synthesized powders were thermally analyzed by DSC-TG and then calcined at 600 °C for 2 h. The calcined powders were pressed into pellets and sintered at 850 °C with a holding time of 2 h. Phase and microstructural analyses of the as-synthesized and calcined powders and of the sintered samples were carried out using XRD and FESEM, respectively. The formation of a hydroxyapatite layer was examined by immersing the sintered samples in simulated body fluid (SBF), and the mechanical properties of the sintered samples were tested with a universal testing machine (UTM). The sintered samples showed the presence of a sodium calcium silicate phase, while hydroxyapatite formed on the SBF-immersed samples. The formation of hydroxyapatite was more pronounced in the borate-based glass samples than in 45S5.
Keywords: 45S5 bioglass, bioactive, borate, hydroxyapatite, sol-gel synthesis
Procedia PDF Downloads 256