Search results for: command and data handling (CDH)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24678

24618 Process Data-Driven Representation of Abnormalities for Efficient Process Control

Authors: Hyun-Woo Cho

Abstract:

Unexpected operational events or abnormalities of industrial processes have a serious impact on the quality of the final product of interest. In terms of statistical process control, fault detection and diagnosis of processes is one of the essential tasks needed to run a process safely. In this work, a nonlinear representation of process measurement data is presented and evaluated using a simulated process. The effect of using different representation methods on diagnosis performance is tested in terms of computational efficiency and data handling. The results show that the nonlinear representation technique produced more reliable diagnosis results and outperformed linear methods. The use of a data filtering step improved computational speed and diagnosis performance on test data sets. The presented scheme differs from existing ones in that it attempts to extract the fault pattern in the reduced space, not in the original process variable space. This scheme thus helps to reduce the sensitivity of empirical models to noise.

Keywords: fault diagnosis, nonlinear technique, process data, reduced spaces

Procedia PDF Downloads 221
24617 The Psychometric Properties of an Instrument to Estimate Performance in Ball Tasks Objectively

Authors: Kougioumtzis Konstantin, Rylander Pär, Karlsteen Magnus

Abstract:

Ball skills, as a subset of fundamental motor skills, are predictors of performance in sports. Currently, most tools evaluate ball skills using subjective ratings. The aim of this study was to examine the psychometric properties of a newly developed instrument to objectively measure ball-handling skills (BHS-test) utilizing digital instruments. Participants were a convenience sample of 213 adolescents (age M = 17.1 years, SD = 3.6; 55% females, 45% males) recruited from upper secondary schools and invited to a sports hall for the assessment. The 8-item instrument incorporated both accuracy-based ball skill tests and repetitive-performance tests with a ball. Testers counted performance manually in four of the tests (one throwing and three juggling tasks). In the other four tests (one balancing and three rolling tasks), assessment was technologically enhanced using a ball machine, a Kinect camera, and balls with motion sensors. 3D printing technology was used to construct equipment, while all results were administered digitally with smartphones/tablets, computers, and a specially constructed application that sent data to a server. The instrument was deemed reliable (α = .77), and principal component analysis was used in a random subset (53 of the participants). Furthermore, latent variable modeling was employed to confirm the structure with the remaining subset (160 of the participants). The analysis showed good factorial validity, with one factor explaining 57.90% of the total variance. Four loadings were larger than .80, two more exceeded .76, and the other two were .65 and .49. The one-factor solution was confirmed by a first-order model with one general factor and an excellent fit between model and data (χ² = 16.12, DF = 20; RMSEA = .00, CI90 .00–.05; CFI = 1.00; SRMR = .02). The loadings on the general factor ranged between .65 and .83. Our findings indicate good reliability and construct validity for the BHS-test. To develop the instrument further, more studies are needed with various age groups, e.g., children. We suggest using the BHS-test for diagnostic or assessment purposes in talent development and sports participation interventions that focus on ball games.
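The reliability coefficient reported above (α = .77) is Cronbach's alpha, which can be computed directly from item-score columns. The sketch below uses invented toy scores, not the study's data:

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns of equal length."""
    k = len(items)                                        # number of test items
    item_vars = [statistics.pvariance(col) for col in items]
    totals = [sum(scores) for scores in zip(*items)]      # per-person total score
    total_var = statistics.pvariance(totals)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# toy data: 3 items scored by 4 participants (illustrative only)
items = [
    [2, 4, 3, 5],
    [3, 5, 2, 5],
    [2, 5, 3, 4],
]
alpha = cronbach_alpha(items)
print(round(alpha, 2))
```

With real data, each inner list would hold one test item's scores across all 213 participants.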

Keywords: ball-handling skills, ball-handling ability, technologically-enhanced measurements, assessment

Procedia PDF Downloads 61
24616 Application of the Material Point Method as a New Fast Simulation Technique for Textile Composites Forming and Material Handling

Authors: Amir Nazemi, Milad Ramezankhani, Marian Kӧrber, Abbas S. Milani

Abstract:

The excellent strength-to-weight ratio of woven fabric composites, along with their high formability, is one of the primary design parameters driving their increased use in modern manufacturing processes, including those in aerospace and automotive. However, for emerging automated preform processes under the smart manufacturing paradigm, the complex geometries of finished components continue to pose several challenges for designers coping with manufacturing defects on site. Wrinkling, e.g., is a common defect occurring during the forming process and handling of semi-finished textile composites. One of the main reasons for this defect is the weak bending stiffness of fibers in the unconsolidated state, causing excessive relative motion between them. Further challenges arise from the automated handling of large-area fiber blanks with specialized gripper systems. For fabric composite forming simulations, the finite element (FE) method is a longstanding tool used for the prediction and mitigation of manufacturing defects. Such simulations are predominately meant not only to predict the onset, growth, and shape of wrinkles but also to determine the best processing condition that can yield optimized positioning of the fibers upon forming (or robot handling, in the case of automated processes). However, the need for small time steps in explicit FE codes, numerical instabilities, and long computational times are among the notable drawbacks of current FE tools, hindering their extensive use as fast yet efficient digital twins in industry. This paper presents a novel woven fabric simulation technique based on the material point method (MPM), which enables the use of much larger time steps with fewer numerical instabilities, and hence the ability to run significantly faster and more efficient simulations of fabric material handling and forming processes. This method can therefore enhance the development of automated fiber handling and preform processes by calculating the physical interactions between the MPM fiber models and rigid tool components. This enables designers to virtually develop, test, and optimize their processes based on either algorithmic or machine learning applications. As a preliminary case study, forming of a hemispherical plain weave is shown, and the results are compared to FE simulations as well as experiments.

Keywords: material point method, woven fabric composites, forming, material handling

Procedia PDF Downloads 148
24615 Impact of COVID-19 Pandemic in the European Air Transport Command during 2020-2021

Authors: Martin Gascón Hove, Ralph Vermeltfoort, Alessandro Fiorini, Erwan Dulaurent, Henning von Perbandt

Abstract:

Introduction: The outbreak of the COVID-19 pandemic has completely changed the global health situation, with more than 400 million cases reported and over 5 million deaths. The European Air Transport Command (EATC) is integrated by seven nations, and among its capabilities is aeromedical evacuation (AE). Material and methods: The impact of the novel coronavirus was analysed based on the number and characteristics of patients and executed missions within EATC, and particularly by Spain, during the biennium 2020-2021. Results: One thousand and sixty patients were transported in 186 missions. Neither death nor disease contagion was reported during AE performances. Military cases transferred numbered 986, mostly of routine priority (91.4%), and 74 were civilians, who were transported in 17 missions, 81.1% of which were categorized as urgent. Niger led the list of countries of origin, with 191 evacuated patients. 76.1% of requests came from Italy and Germany. The Airbus A310 was the most used aircraft (32.2%). Germany transported 222 patients of another nationality, while Spain executed eight missions and repatriated 68 cases, 58 of which were from Mali. Conclusions: COVID-19 has led to a surge in the number of evacuated patients within EATC, which has proven to be a safe and effective means of transportation, even for critical cases. Spain has gained prominence since it joined in 2015.

Keywords: COVID-19, SARS-CoV-2, pandemic, aviation, Spain

Procedia PDF Downloads 107
24614 Spatially Random Sampling for Retail Food Risk Factors Study

Authors: Guilan Huang

Abstract:

In 2013 and 2014, the U.S. Food and Drug Administration (FDA) collected data from selected fast food restaurants and full-service restaurants to track changes in the occurrence of foodborne illness risk factors. This paper discusses how we customized a spatial random sampling method by considering the financial position and availability of FDA resources, and how we enriched the restaurant data with location. Location information on restaurants provides an opportunity for quantitatively determining random samples within non-government units (e.g., 240 kilometers around each data collector). Spatial analysis can also optimize data collectors' work plans and resource allocation. A spatial analytic and processing platform helped us handle the spatial random sampling challenges. Our method fits the FDA's ability to pinpoint features of foodservice establishments and reduced both the time and expense of data collection.
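A minimal sketch of distance-constrained random sampling of the kind described above. The restaurant coordinates, collector location, and use of the 240 km radius are all illustrative, not FDA data:

```python
import math
import random

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def sample_restaurants(restaurants, collector, radius_km, n, seed=0):
    """Randomly sample n restaurants within radius_km of a data collector."""
    eligible = [rest for rest in restaurants
                if haversine_km(rest["lat"], rest["lon"],
                                collector["lat"], collector["lon"]) <= radius_km]
    random.seed(seed)
    return random.sample(eligible, min(n, len(eligible)))

restaurants = [
    {"name": "A", "lat": 38.90, "lon": -77.04},   # near the collector
    {"name": "B", "lat": 39.95, "lon": -75.17},   # ~196 km away, eligible
    {"name": "C", "lat": 40.71, "lon": -74.01},   # ~325 km away, excluded
]
collector = {"lat": 38.91, "lon": -77.02}
picked = sample_restaurants(restaurants, collector, radius_km=240, n=2)
print([r["name"] for r in picked])
```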

Keywords: geospatial technology, restaurant, retail food risk factor study, spatially random sampling

Procedia PDF Downloads 321
24613 Managing the Effects of Wet Coal on Generation in Thermal Power Station: A Case Study

Authors: Ravindra Gohane, S. V. Deshmukh

Abstract:

Coal acts as a fuel on a very large scale and forms the basis of any thermal power plant. Different types of coal are available for utilization; the moisture content, volatile matter, and ash content determine the type of coal. Of these, moisture plays a very important part, as it is present naturally within the coal and is also added while handling it, yielding what is termed wet coal. The problems of wet coal are many, particularly during the rainy season, and include generation loss, jamming of the crusher, reduction in calorific value, and difficulties in coal transportation. Efforts are made worldwide to resolve the problems arising out of wet coal. This paper highlights how the problem of wet coal was resolved with the help of a case study involving the installation of a V-type wiper on the conveyor belt.

Keywords: coal handling plant, wet coal, v-type, generation

Procedia PDF Downloads 323
24612 Long- and Short-Term Impacts of COVID-19 and Gold Price on Price Volatility: A Comparative Study of MIDAS and GARCH-MIDAS Models for USA Crude Oil

Authors: Samir K. Safi

Abstract:

The purpose of this study was to compare the performance of two types of models, namely MIDAS and GARCH-MIDAS, in predicting the volatility of crude oil returns based on gold price returns and the COVID-19 pandemic. The study aimed to identify which model would provide more accurate short-term and long-term predictions and which would perform better in handling the increased volatility caused by the pandemic. The findings revealed that the MIDAS model performed better in predicting short-term and long-term volatility before the pandemic, while the GARCH-MIDAS model performed significantly better in handling the increased volatility caused by the pandemic. The study highlights the importance of selecting appropriate models to handle the complexities of real-world data and shows that the choice of model can significantly impact the accuracy of predictions. The practical implications of model selection and potential methodological adjustments for future research are highlighted and discussed.

Keywords: GARCH-MIDAS, MIDAS, crude oil, gold, COVID-19, volatility

Procedia PDF Downloads 19
24611 Programming Language Extension Using Structured Query Language for Database Access

Authors: Chapman Eze Nnadozie

Abstract:

Relational databases constitute a very vital tool for the effective management and administration of both personal and organizational data. Data access ranges from single-user database management software to more complex distributed server systems. This paper appraises the use of a programming language extension, structured query language (SQL), to establish links to a relational database (Microsoft Access 2013) from the Visual C++ 9 programming environment. The methodology involves the creation of tables to form a database using Microsoft Access 2013, which is Object Linking and Embedding (OLE) database compliant. SQL commands are used to query the tables in the database for easy extraction of the expected records inside the Visual C++ environment. The findings reveal that records can easily be accessed and manipulated to filter exactly what the user wants, such as retrieval of records with specified criteria, updating of records, and deletion of part or all of the records in a table.
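The paper pairs SQL with Visual C++ and Access; the same query, update, and delete pattern can be sketched self-containedly with Python's built-in sqlite3 module. The table and column names below are invented for illustration:

```python
import sqlite3

# In-memory database standing in for the Access tables described in the paper.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT, score REAL)")
conn.executemany("INSERT INTO students VALUES (?, ?, ?)",
                 [(1, "Ada", 92.0), (2, "Ben", 67.5), (3, "Cara", 81.0)])

# Retrieval with criteria, updating, and deletion: the three operations the abstract lists.
rows = conn.execute("SELECT name FROM students WHERE score >= 80 ORDER BY name").fetchall()
print([r[0] for r in rows])                       # records matching a criterion

conn.execute("UPDATE students SET score = 70.0 WHERE name = ?", ("Ben",))
conn.execute("DELETE FROM students WHERE score < 75")
count = conn.execute("SELECT COUNT(*) FROM students").fetchone()[0]
print(count)                                      # rows remaining after the delete
```

The `?` placeholders mirror the parameterised queries one would issue from the C++ OLE interface.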

Keywords: data access, database, database management system, OLE, programming language, records, relational database, software, SQL, table

Procedia PDF Downloads 160
24610 Utilizing Minecraft Java Edition for the Application of Fire Disaster Procedures to Establish Fire Disaster Readiness for Grade 12 STEM students of DLSU-IS

Authors: Aravella Flores, Jose Rafael E. Sotelo, Luis Romulus Phillippe R. Javier, Josh Christian V. Nunez

Abstract:

This study analyzes the performance of Grade 12 STEM students of De La Salle University - Integrated School who have completed the Disaster Readiness and Risk Reduction (DRRR) course in handling fire hazards through Minecraft Java Edition. This platform is suitable because fire DRRR is challenging to learn in a practical setting, and the successful translation of textbook knowledge into actual practice is questionable. The purpose of this study is to determine whether Minecraft can be a suitable environment for familiarizing oneself with fire DRRR. The objectives are achieved by utilizing Minecraft to simulate fire scenarios, which allows the participants to act freely and practice fire DRRR. The experiment was divided into a grounding phase and a validation phase, during which the researchers observed the performance of the participants in the simulation. Pre-simulation and post-simulation surveys were given to measure the change in participants' perception of being able to utilize fire DRRR procedures and of their vulnerabilities. A paired t-test showed significant differences between the pre-simulation and post-simulation survey scores, suggesting improved judgment of DRRR and lessened vulnerability should participants encounter a fire hazard. This research poses a model for future research, which can gather more participants and dwell on more complex code beyond command blocks, into the code of Minecraft itself.

Keywords: minecraft, DRRR, fire, disaster, simulation

Procedia PDF Downloads 96
24609 Air Handling Units Power Consumption Using Generalized Additive Model for Anomaly Detection: A Case Study in a Singapore Campus

Authors: Ju Peng Poh, Jun Yu Charles Lee, Jonathan Chew Hoe Khoo

Abstract:

The emergence of digital twin technology, a digital replica of the physical world, has improved real-time access to sensor data about the performance of buildings. This digital transformation has opened up many opportunities to improve the management of a building by using the collected data to monitor consumption patterns and energy leakages. One example is the integration of predictive models for anomaly detection. In this paper, we use the generalised additive model (GAM) for anomaly detection in the power consumption pattern of air handling units (AHU). There is ample research on the use of GAM for predicting power consumption at the office-building and nationwide level. However, there is limited illustration of its anomaly detection capabilities, prescriptive analytics case studies, and its integration with the latest developments in digital twin technology. In this paper, we applied the general GAM modelling framework to the historical AHU power consumption and cooling load data of a building on an education campus in Singapore, collected between January 2018 and August 2019, to train prediction models that, in turn, yield predicted values and ranges. The historical data are seamlessly extracted from the digital twin for modelling purposes. We enhanced the utility of the GAM model by using it to power a real-time anomaly detection system based on the forward predicted ranges. The magnitude of deviation from the upper and lower bounds of the uncertainty intervals is used to identify anomalous data points, all based on historical data, without explicit intervention from domain experts. Notwithstanding, the domain expert fits in through an optional feedback loop through which iterative data cleansing is performed. After an anomalously high or low level of power consumption is detected, a set of rule-based conditions is evaluated in real-time to help determine the next course of action for the facilities manager. 
The performance of GAM is then compared with other approaches to evaluate its effectiveness. Lastly, we discuss the successful deployment of this approach for the detection of anomalous power consumption patterns, illustrated with real-world use cases.
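The core detection rule, flagging points whose deviation exceeds the upper or lower bound of an uncertainty interval, can be sketched with a constant mean ± z·SD band built from history. This is a deliberate simplification: a fitted GAM would produce time-varying bounds, and the kW figures below are invented:

```python
import statistics

def flag_anomalies(history, observations, z=3.0):
    """Flag points outside [mean - z*sd, mean + z*sd] bands built from history.
    A stand-in for the GAM's uncertainty interval: the band here is a constant
    mean +/- z standard deviations rather than a fitted additive model."""
    mu = statistics.fmean(history)
    sd = statistics.stdev(history)
    lo_b, hi_b = mu - z * sd, mu + z * sd
    alerts = []
    for x in observations:
        if x > hi_b:
            alerts.append((x, "high", x - hi_b))   # magnitude beyond the band
        elif x < lo_b:
            alerts.append((x, "low", lo_b - x))
    return alerts

history = [50, 52, 49, 51, 50, 48, 52, 50]         # kW, normal AHU consumption
alerts = flag_anomalies(history, [51, 80, 20])
print(alerts)
```

The reported deviation magnitude is what a rule-based layer, like the one described above, would use to choose the next action.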

Keywords: anomaly detection, digital twin, generalised additive model, GAM, power consumption, supervised learning

Procedia PDF Downloads 125
24608 Awareness about Work-Related Hazards Causing Musculoskeletal Disorders

Authors: Bintou Jobe

Abstract:

Musculoskeletal disorders (MSDs) are injuries or disorders such as spinal disc injuries, muscle strains, and low back injuries, and they remain a major cause of occupational illness. Findings: Due to poor grips during handling, the neck, shoulders, arms, knees, ankles, fingers, waist, lower back, and other muscle joints can be affected. Pregnant women are more prone to physical and hormonal changes, which lead to the relaxation of supporting ligaments. MSDs continue to pose a global concern due to their impact on workers worldwide. The prevalence of these disorders is high, according to research into the workforce in Europe and developing countries. The causes are characterized by long working hours, insufficient rest breaks, poor posture, repetitive motion, poor manual handling techniques, psychological stress, and poor nutrition. Preventing MSDs mainly involves avoiding and assessing the risk by design; however, clinical solutions, policy governance, and minimizing manual labour are alternatives. In addition, eating a balanced diet and teamwork are key elements in minimising the risk. This review aims to raise awareness, promote cost-effective prevention and understanding of MSDs through research, and identify proposed solutions that recognise the underlying causes of MSDs in the construction sector. The methodology involves a literature review approach, engaging with the policy landscape of MSDs and synthesising publications on MSDs alongside a wider range of academic publications. In conclusion, training on effective manual handling techniques should be considered, and personal protective equipment should be a last resort. The implementation of training guidelines has yielded significant benefits.

Keywords: musculoskeletal disorder work related, MSD, manual handling, work hazards

Procedia PDF Downloads 34
24607 A Weighted Sum Particle Swarm Approach (WPSO) Combined with a Novel Feasibility-Based Ranking Strategy for Constrained Multi-Objective Optimization of Compact Heat Exchangers

Authors: Milad Yousefi, Moslem Yousefi, Ricarpo Poley, Amer Nordin Darus

Abstract:

Design optimization of heat exchangers is a very complicated task that has traditionally been carried out through a trial-and-error procedure. To overcome the difficulties of conventional design approaches, especially when a large number of variables, constraints, and objectives are involved, a new method based on a well-established evolutionary algorithm, particle swarm optimization (PSO), a weighted sum approach, and a novel constraint handling strategy is presented in this study. Since conventional constraint handling strategies are neither effective nor easy to implement in multi-objective algorithms, a novel feasibility-based ranking strategy is introduced, which is both extremely user-friendly and effective. A case study from industry is investigated to illustrate the performance of the presented approach. The results show that the proposed algorithm can find the near-Pareto-optimal front with higher accuracy compared to the conventional non-dominated sorting genetic algorithm II (NSGA-II). Moreover, the difficulty of a trial-and-error process for setting the penalty parameters is eliminated in this algorithm.
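The abstract does not spell out the novel ranking strategy; a common feasibility-based comparison of this family (Deb-style rules, used here only as an illustrative stand-in) can be sketched as follows, with an invented weighted-sum objective and constraint:

```python
def total_violation(constraints, x):
    """Sum of constraint violations; each g(x) <= 0 means the constraint holds."""
    return sum(max(0.0, g(x)) for g in constraints)

def better(a, b, objective, constraints):
    """Feasibility-based comparison: a feasible point beats an infeasible one,
    smaller violation beats larger, and only between two feasible points does
    the (minimised) weighted-sum objective decide. No penalty parameter needed."""
    va, vb = total_violation(constraints, a), total_violation(constraints, b)
    if va == 0 and vb == 0:
        return objective(a) < objective(b)
    if (va == 0) != (vb == 0):
        return va == 0
    return va < vb

# toy weighted-sum objective with one constraint x0 + x1 >= 1
f = lambda x: 0.7 * x[0] ** 2 + 0.3 * x[1] ** 2
g = [lambda x: 1.0 - x[0] - x[1]]                 # g(x) <= 0 iff x0 + x1 >= 1

print(better([0.5, 0.5], [2.0, 2.0], f, g))       # both feasible: lower f wins
print(better([0.5, 0.5], [0.1, 0.1], f, g))       # feasible beats infeasible
```

Inside a PSO loop, `better` would replace the plain objective comparison when updating personal and global bests.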

Keywords: heat exchanger, multi-objective optimization, particle swarm optimization, NSGA-II, constraint handling

Procedia PDF Downloads 531
24606 Analysis of Cyber Activities of Potential Business Customers Using Neo4j Graph Databases

Authors: Suglo Tohari Luri

Abstract:

Data analysis is an important aspect of business performance. With the application of artificial intelligence within databases, selecting a suitable database engine for an application design is also very crucial for business data analysis. The application of business intelligence (BI) software to graph databases such as Neo4j has proved highly effective for customer data analysis. Yet what remains of great concern is the fact that not all business organizations have Neo4j business intelligence software applications to implement for customer data analysis. Further, those with the BI software lack personnel with the requisite expertise to use it effectively with the Neo4j database. The purpose of this research is to demonstrate how Neo4j program code alone can be applied for the analysis of e-commerce website customer visits. As the Neo4j database engine is optimized for handling and managing data relationships, with the capability of building high-performance and scalable systems to handle connected data nodes, it ensures that business owners who advertise their products on websites backed by Neo4j are able to determine the number of visitors and thus know which products are visited at routine intervals, supporting the necessary decision making. It also helps in identifying the best customer segments in relation to specific goods, so that more emphasis can be placed on their advertisement on the said websites.
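The core aggregation, counting customer-to-product visit relationships to find the most-visited products, can be sketched in plain Python over an edge list standing in for the graph. The names are illustrative; a real deployment would run the equivalent Cypher aggregation in Neo4j:

```python
from collections import Counter

# (customer, product) edges standing in for CUSTOMER-[:VISITED]->PRODUCT
# relationships in the graph; all names are invented.
visits = [
    ("alice", "laptop"), ("bob", "laptop"), ("alice", "phone"),
    ("cara", "laptop"), ("bob", "phone"), ("alice", "laptop"),
]
per_product = Counter(product for _, product in visits)
print(per_product.most_common(2))          # most-visited products first
```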

Keywords: data, engine, intelligence, customer, neo4j, database

Procedia PDF Downloads 165
24605 The Analysis of Fleet Operational Performance as an Indicator of Load and Haul Productivity

Authors: Linet Melisa Daubanes, Nhleko Monique Chiloane

Abstract:

The shovel-truck system is the most prevalent material handling system used in surface mining operations. Material handling entails the loading and hauling of material from production areas to dumping areas. The material handling process has operational delays that have a negative impact on the productivity of the load and haul fleet. Factors that may contribute to operational delays include shovel-truck mismatch, haul routes, machine breakdowns, and extreme weather conditions. The aim of this paper is to investigate factors that contribute to operational delays affecting the productivity of the load and haul fleet at the mine. Productivity is the measure of the effectiveness of producing products from a given quantity of inputs, i.e., the ratio of outputs to inputs. Productivity can be improved by producing more output with the same or fewer inputs and/or by introducing better working methods. Several key performance indicators (KPIs) for the evaluation of productivity are discussed in this study, including but not limited to hauling conditions, bucket fill factor, cycle time, and utilization. The research methodology of this study is a combination of on-site time studies and observations. Productivity can be optimized by managing the factors that affect the operational performance of the haulage fleet.
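The KPIs listed above combine into an hourly production figure in a standard way: payload per cycle times effective cycles per hour. The figures below are invented for illustration, not the mine's data:

```python
def truck_productivity(bucket_capacity_m3, fill_factor, passes, density_t_m3,
                       cycle_time_min, utilisation):
    """Hourly production of one truck-shovel pairing: payload per cycle
    (bucket capacity x fill factor x loading passes x material density)
    times the number of cycles completed per effectively utilised hour."""
    payload_t = bucket_capacity_m3 * fill_factor * passes * density_t_m3
    cycles_per_hour = 60.0 / cycle_time_min * utilisation
    return payload_t * cycles_per_hour

# illustrative figures: 15 m^3 bucket, 90% fill, 4 passes, 2.5 t/m^3 ore,
# 30 min truck cycle, 80% utilisation
rate = truck_productivity(bucket_capacity_m3=15, fill_factor=0.9, passes=4,
                          density_t_m3=2.5, cycle_time_min=30, utilisation=0.8)
print(round(rate, 1))        # approx. 216 t/h
```

A poor fill factor or a longer cycle time feeds directly into this figure, which is why the paper treats them as productivity KPIs.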

Keywords: cycle time, fleet performance, load and haul, surface mining

Procedia PDF Downloads 161
24604 Exhaust Gas Cleaning Systems on Board Ships and Impact on Crews’ Health: A Feasibility Study Protocol

Authors: Despoina Andrioti Bygvraa, Ida-Maja Hassellöv, George Charalambous

Abstract:

Exhaust gas cleaning systems, also known as scrubbers, are today widely used to allow the use of high-sulphur heavy fuel oil while still complying with the regulations limiting sulphur content in marine fuels. There are extensive concerns about the environmental consequences, especially in the Baltic Sea, of the wide-scale use of scrubbers, as the wash water is acidic (ca. pH 3) and contains high concentrations of toxic, carcinogenic, and mutagenic substances. The aim of this feasibility study is to investigate the potential adverse effects on seafarers' health, with the ultimate goal of raising awareness of chemical-related health and safety issues in the shipping environment. The project received funding from the Swedish Foundation. The team will extend previously compiled data on scrubber wash water concentrations of hazardous substances and pH to include the use of strong base in closed-loop scrubbers, along with a scoping assessment of handling and disposal practices. Based on the findings (a), a systematic review of risk assessment will follow to show the risk of exposures, the establishment of hazardous levels for human health, as well as the respective prevention practices. In addition, the researchers will perform (b) a systematic review to identify facilitators of, and barriers to, the crew's compliance with the safe handling of chemicals. The study will run for 12 months, delivering (a) a risk assessment inventory of risk exposures and (b) a course description of safe handling practices. This feasibility study could provide valuable knowledge on how pollutants found in scrubbers should be considered from a human health perspective to facilitate evidence-based, informed decisions in future technology and policy development to make shipping a safer, healthier, and more attractive workplace.

Keywords: health and safety, seafarers, scrubbers, chemicals, risk exposures

Procedia PDF Downloads 17
24603 An Empirical Evaluation of Performance of Machine Learning Techniques on Imbalanced Software Quality Data

Authors: Ruchika Malhotra, Megha Khanna

Abstract:

The development of change prediction models can help software practitioners in planning testing and inspection resources at early phases of software development. However, a major challenge faced during the training process of any classification model is the imbalanced nature of software quality data. Data with very few minority outcome categories lead to an inefficient learning process, and a classification model developed from imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of classes may be change-prone whereas a majority of classes may be non-change-prone. This study explores various alternatives for adeptly handling imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics when dealing with imbalanced data. In order to empirically validate the different alternatives, the study uses change data from three application packages of an open-source Android data set and evaluates the performance of six different machine learning techniques. The results of the study indicate extensive improvement in the performance of the classification models when using resampling methods and robust performance measures.
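One of the simplest resampling methods in the family evaluated here is random oversampling of the minority (change-prone) class. A stdlib-only sketch with invented rows of (metric value, label):

```python
import random

def random_oversample(rows, label_idx=-1, seed=42):
    """Duplicate minority-class rows at random until every class has as many
    rows as the largest class; a basic stand-in for the study's sampling step."""
    by_class = {}
    for row in rows:
        by_class.setdefault(row[label_idx], []).append(row)
    target = max(len(members) for members in by_class.values())
    rng = random.Random(seed)
    balanced = []
    for members in by_class.values():
        balanced.extend(members)
        balanced.extend(rng.choice(members) for _ in range(target - len(members)))
    return balanced

# 6 non-change-prone classes vs. 2 change-prone ones (toy metric values)
data = [(0.1, "no"), (0.3, "no"), (0.2, "no"), (0.5, "no"),
        (0.4, "no"), (0.6, "no"), (0.9, "yes"), (0.8, "yes")]
balanced = random_oversample(data)
print(len(balanced))         # 12 rows: 6 of each class
```

A model trained on `balanced` sees change-prone examples as often as non-change-prone ones, countering the bias the abstract describes.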

Keywords: change proneness, empirical validation, imbalanced learning, machine learning techniques, object-oriented metrics

Procedia PDF Downloads 392
24602 Managing Incomplete PSA Observations in Prostate Cancer Data: Key Strategies and Best Practices for Handling Loss to Follow-Up and Missing Data

Authors: Madiha Liaqat, Rehan Ahmed Khan, Shahid Kamal

Abstract:

Multiple imputation with delta adjustment is a versatile and transparent technique for addressing univariate missing data in the presence of various missing mechanisms. This approach allows for the exploration of sensitivity to the missing-at-random (MAR) assumption. In this review, we outline the delta-adjustment procedure and illustrate its application for assessing the sensitivity to deviations from the MAR assumption. By examining diverse missingness scenarios and conducting sensitivity analyses, we gain valuable insights into the implications of missing data on our analyses, enhancing the reliability of our study's conclusions. In our study, we focused on assessing logPSA, a continuous biomarker in incomplete prostate cancer data, to examine the robustness of conclusions against plausible departures from the MAR assumption. We introduced several approaches for conducting sensitivity analyses, illustrating their application within the pattern mixture model (PMM) under the delta adjustment framework. This proposed approach effectively handles missing data, particularly loss to follow-up.
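A single-variable sketch of the delta-adjustment idea: impute under a MAR-style draw, then shift the imputed values by delta to probe departures from MAR. The logPSA values and delta are invented, and a real analysis would impute from a fitted model and pool full inferences by Rubin's rules:

```python
import random
import statistics

def impute_with_delta(values, delta, n_imputations=5, seed=1):
    """Delta-adjusted imputation sketch: draw each missing value from the
    observed distribution (the MAR-style draw), then shift it by delta to
    model a systematic departure from MAR. delta = 0 recovers the MAR
    analysis; a negative delta says dropouts had lower values than observed."""
    obs = [v for v in values if v is not None]
    mu, sd = statistics.fmean(obs), statistics.stdev(obs)
    rng = random.Random(seed)
    completed = []
    for _ in range(n_imputations):
        completed.append([v if v is not None else rng.gauss(mu, sd) + delta
                          for v in values])
    return completed

log_psa = [1.2, None, 0.8, 1.5, None, 1.1]     # hypothetical logPSA values
means = [statistics.fmean(ds) for ds in impute_with_delta(log_psa, delta=-0.5)]
pooled = statistics.fmean(means)               # pooled point estimate of the mean
print(round(pooled, 2))
```

Repeating this over a grid of delta values and watching how the pooled estimate moves is the sensitivity analysis the abstract describes.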

Keywords: loss to follow-up, incomplete response, multiple imputation, sensitivity analysis, prostate cancer

Procedia PDF Downloads 56
24601 Modern Imputation Technique for Missing Data in Linear Functional Relationship Model

Authors: Adilah Abdul Ghapor, Yong Zulina Zubairi, Rahmatullah Imon

Abstract:

The missing value problem is common in statistics and has been of interest for years. This article considers two modern techniques for handling missing data in the linear functional relationship model (LFRM), namely the expectation-maximization (EM) algorithm and the expectation-maximization with bootstrapping (EMB) algorithm, using three performance indicators: the mean absolute error (MAE), root mean square error (RMSE), and estimated bias (EB). In this study, we applied these methods to impute missing values in the LFRM. Results of the simulation study suggest that the EMB algorithm performs much better than the EM algorithm. We also illustrate the applicability of the approach on a real data set.
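An EM-style imputation loop for a simple linear relationship can be sketched as below. Ordinary least squares stands in for the LFRM's estimation, and the data are toy values:

```python
import statistics

def em_impute(x, y, iters=50):
    """EM-style imputation sketch for a linear relationship y ~ a + b*x:
    the E-step fills missing y values with their expectation under the
    current line; the M-step refits the line by least squares on the
    completed data. Iterate until the fit stabilises."""
    y = list(y)
    missing = [i for i, v in enumerate(y) if v is None]
    obs = [i for i in range(len(y)) if y[i] is not None]
    y_mean = statistics.fmean(y[i] for i in obs)
    for i in missing:                     # initialise with the observed mean
        y[i] = y_mean
    for _ in range(iters):
        mx, my = statistics.fmean(x), statistics.fmean(y)
        b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
        a = my - b * mx
        for i in missing:                 # E-step: conditional expectation
            y[i] = a + b * x[i]
    return y

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, None, 8.1, 9.9]            # roughly y = 2x, one value missing
print([round(v, 1) for v in em_impute(x, y)])
```

The EMB variant would repeat this on bootstrap resamples of the data to reflect imputation uncertainty.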

Keywords: expectation-maximization, expectation-maximization with bootstrapping, linear functional relationship model, performance indicators

Procedia PDF Downloads 363
24600 Using Electrical Impedance Tomography to Control a Robot

Authors: Shayan Rezvanigilkolaei, Shayesteh Vefaghnematollahi

Abstract:

Electrical impedance tomography (EIT) is a non-invasive medical imaging technique suitable for medical applications. This paper describes an electrical impedance tomography device with the ability to navigate a robotic arm to manipulate a target object. The design of the device includes various hardware and software sections to perform medical imaging and control the robotic arm. In the hardware section, an image is formed by 16 electrodes located around a container. This image is used to navigate a 3-DOF robotic arm to reach the exact location of the target object. The data set used to form the impedance image is obtained through repeated current injections and voltage measurements between all electrode pairs. After performing the necessary calculations to obtain the impedance, the information is transmitted to the computer. This data is then processed in MATLAB, which is interfaced with EIDORS (Electrical Impedance Tomography Reconstruction Software) to reconstruct the image from the acquired data. In the next step, the coordinates of the center of the target object are calculated using MATLAB's Image Processing Toolbox (IPT). Finally, these coordinates are used to calculate the angles of each joint of the robotic arm, and the arm moves to the desired tissue upon the user's command.
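The final step, turning target coordinates into joint angles, has a closed form for a planar two-link arm. The sketch below (link lengths and target invented, and only two of the three DOF handled) illustrates that computation:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar 2-link arm, an illustrative
    stand-in for the 3-DOF arm in the paper: returns the shoulder and elbow
    angles (radians) that place the end effector at (x, y)."""
    d = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(d) > 1:
        raise ValueError("target out of reach")
    theta2 = math.acos(d)                                  # elbow-down solution
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

# target at the centroid coordinates produced by the image-processing step
t1, t2 = two_link_ik(1.0, 1.0, l1=1.0, l2=1.0)
print(round(math.degrees(t1), 1), round(math.degrees(t2), 1))
```

Forward kinematics (cos/sin of the cumulative angles times the link lengths) confirms the end effector lands on the target.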

Keywords: electrical impedance tomography, EIT, surgeon robot, image processing of electrical impedance tomography

Procedia PDF Downloads 239
24599 Incidence of Orphaned Neonatal Puppies Attended at a Veterinary Hospital – Causes, Consequences and Mortality

Authors: Maria L. G. Lourenço, Keylla H. N. P. Pereira, Viviane Y. Hibaru, Fabiana F. Souza, João C. P. Ferreira, Simone B. Chiacchio, Luiz H. A. Machado

Abstract:

Orphanhood is a risk factor for mortality in newborns, since it is a condition of total or partial absence of the maternal care essential for neonatal survival, including nursing (nutrition, transference of passive immunity and hydration), warmth, urination and defecation stimuli, and protection. The most common causes of mortality in orphans are related to lack of assistance, handling mistakes and infections. This study aims to describe the orphanhood rates in neonatal puppies, their main causes, and the mortality rates. The study included 735 neonates admitted to the Sao Paulo State University (UNESP) Veterinary Hospital, Botucatu, Sao Paulo, Brazil, between January 2018 and November 2019. The orphanhood rate was 43.4% (319/735) of all neonates included, and the main causes were maternal agalactia/hypogalactia (23.5%, 75/319), numerous litters (15.7%, 50/319), toxic milk syndrome due to maternal mastitis (14.4%, 46/319), absence of suction/weak neonate (12.2%, 39/319), maternal disease (9.4%, 30/319), cleft palate/lip (6.3%, 20/319), maternal death (5.9%, 19/319), prematurity (5.3%, 17/319), rejection/failure of maternal instinct (3.8%, 12/319) and abandonment by the owner/separation of mother and neonate (3.5%, 11/319). The main consequences of orphanhood observed in the admitted neonates were hypoglycemia, hypothermia, dehydration, aspiration pneumonia, wasting syndrome, failure in the transference of passive immunity, infections and sepsis, which occurred due to failure to identify the problem early, lack of adequate assistance, negligence and handling mistakes by the owner. The total neonatal mortality rate was 8% (59/735), and the neonatal mortality rate among orphans was 18.5% (59/319). The orphanhood and mortality rates were considered high, but even higher rates may be observed in locations without adequate neonatal assistance and owner orientation. The survival of these patients depends on constant monitoring of the litter, early diagnosis and assistance, and the implementation of effective handling for orphans. Understanding correct neonatal handling and instructing owners in it are essential to minimize the consequences of orphanhood and the mortality rates.

Keywords: orphans, neonatal care, puppies, newborn dogs

Procedia PDF Downloads 222
24598 Optimizing Pick and Place Operations in a Simulated Work Cell for Deformable 3D Objects

Authors: Troels Bo Jørgensen, Preben Hagh Strunge Holm, Henrik Gordon Petersen, Norbert Kruger

Abstract:

This paper presents a simulation framework for using machine learning techniques to determine robust robotic motions for handling deformable objects. The main focus is on applications in the meat sector, which mainly handles three-dimensional objects. In order to optimize the robotic handling, the robot motions have been parameterized in terms of grasp points, robot trajectory and robot speed. The motions are evaluated in a dynamic simulation environment for robotic control of deformable objects. The evaluation identifies certain parameter setups that produce robust motions in the simulated environment and that, based on a visual analysis, indicate satisfactory solutions for a real-world system.
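The abstract does not detail the optimization loop, but the parameterization it describes (grasp point, speed, scored in simulation) can be sketched as a simple random search. The cost function below is purely illustrative, not the authors' simulation model:

```python
import random

def simulated_cost(grasp_offset, speed):
    """Stand-in for the dynamic simulation: penalise grasping far from the
    object's centre of mass (more deformation, worse at high speed) and
    penalise slow motions via cycle time. Entirely invented for illustration."""
    deformation = abs(grasp_offset) * (1.0 + speed)
    cycle_time = 1.0 / speed
    return deformation + 0.2 * cycle_time

def search_parameters(trials=500, seed=0):
    """Random search over the motion parameters, keeping the cheapest setup."""
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        g = rng.uniform(-0.5, 0.5)  # grasp point offset from the object centre
        s = rng.uniform(0.1, 2.0)   # robot speed scale
        c = simulated_cost(g, s)
        if best is None or c < best[0]:
            best = (c, g, s)
    return best
```

In the paper's setting, each candidate would be scored by a full dynamic simulation rather than a closed-form cost, and the trajectory shape would be a third searched parameter.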

Keywords: deformable objects, robotic manipulation, simulation, real world system

Procedia PDF Downloads 255
24597 Training a Neural Network Using Input Dropout with Aggressive Reweighting (IDAR) on Datasets with Many Useless Features

Authors: Stylianos Kampakis

Abstract:

This paper presents a new algorithm for neural networks called “Input Dropout with Aggressive Re-weighting” (IDAR), aimed specifically at datasets with many useless features. IDAR combines two techniques (dropout of input neurons and aggressive re-weighting) in order to eliminate the influence of noisy features; the technique can be seen as a generalization of dropout. The algorithm is tested on two benchmark data sets: a noisy version of the iris dataset and the MADELON data set. Its performance is compared against three other popular techniques for dealing with useless features: L2 regularization, LASSO and random forests. The results demonstrate that IDAR can be an effective technique for handling data sets with many useless features.
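The abstract does not specify IDAR's exact weighting rule, so the following is only a sketch of the general idea — score features, re-weight the scores aggressively, and drop input neurons in proportion. The correlation-based scoring and power-law re-weighting are guesses for illustration:

```python
import random

def feature_scores(X, y):
    """Toy per-feature usefulness score: absolute Pearson correlation with the
    target. (An assumption; the paper's actual scoring is not given.)"""
    n, d = len(X), len(X[0])
    my = sum(y) / n
    scores = []
    for j in range(d):
        col = [row[j] for row in X]
        mx = sum(col) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(col, y))
        vx = sum((a - mx) ** 2 for a in col) ** 0.5
        vy = sum((b - my) ** 2 for b in y) ** 0.5
        scores.append(abs(cov / (vx * vy)) if vx and vy else 0.0)
    return scores

def dropout_mask(scores, power=3.0, rng=random.Random(0)):
    """'Aggressive' re-weighting: raise scores to a power so weak features are
    dropped far more often, then sample a keep/drop mask for one training pass."""
    w = [s ** power for s in scores]
    top = max(w) or 1.0
    keep_prob = [x / top for x in w]  # the best-scoring feature is always kept
    return [1 if rng.random() < p else 0 for p in keep_prob]
```

During training, a fresh mask would be sampled each pass, so useless inputs contribute to almost no gradient updates.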

Keywords: neural networks, feature selection, regularization, aggressive reweighting

Procedia PDF Downloads 425
24596 Pre-Analytical Laboratory Performance Evaluation Utilizing Quality Indicators between Private and Government-Owned Hospitals Affiliated to University of Santo Tomas

Authors: A. J. Francisco, K. C. Gallosa, R. J. Gasacao, J. R. Ros, B. J. Viado

Abstract:

The study focuses on the use of quality indicators (QIs) based on the standards of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC), which can effectively identify and minimize errors occurring throughout the total testing process (TTP) in order to improve patient safety. The study was conducted through a survey questionnaire given to a random sample of 19 respondents from eight privately-owned and eleven government-owned hospitals, mainly CMTs, MTs and supervisors from UST-affiliated hospitals. The pre-analytical laboratory errors, which include misidentification errors, transcription errors, sample collection errors, and sample handling and transportation errors, were considered as variables according to the IFCC WG-LEPS. Data gathered were analyzed using the Mann-Whitney U test, percentiles, linear regression, percentages and frequencies. The laboratory performance of both hospital groups is at a high level, and there is no significant difference in laboratory performance between the two groups. Moreover, among the four QIs, sample handling and transportation errors contributed most to the difference between the two groups. The outcomes indicate satisfactory performance for both groups; however, to ensure high-quality and efficient laboratory operation, constant vigilance and improvement of pre-analytical QIs are still needed. Expanding the coverage of the study, including the other phases of the TTP, and utilizing parametric tests are recommended.
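The Mann-Whitney U statistic used to compare the two hospital groups can be computed directly for samples this small. A minimal textbook implementation (no tie correction); the inputs would be, say, per-hospital error rates, which are hypothetical here:

```python
def mann_whitney_u(sample_a, sample_b):
    """U statistic for two independent samples: count, over all cross-group
    pairs, how often sample_a wins (ties count half), then take the smaller
    of U_a and U_b as the test statistic."""
    u_a = 0.0
    for a in sample_a:
        for b in sample_b:
            if a > b:
                u_a += 1.0
            elif a == b:
                u_a += 0.5
    u_b = len(sample_a) * len(sample_b) - u_a
    return min(u_a, u_b)

# e.g. hypothetical pre-analytical error rates for two small hospital groups
u = mann_whitney_u([0.8, 1.2, 0.5], [1.5, 2.0, 1.1, 0.9])
```

The statistic is then compared against the critical value for the two group sizes (8 and 11 in the study) to decide significance.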

Keywords: pre-analytical phase, quality indicators, laboratory performance, pre-analytical error

Procedia PDF Downloads 115
24595 Automated End-to-End Pipeline Processing Solution for Autonomous Driving

Authors: Ashish Kumar, Munesh Raghuraj Varma, Nisarg Joshi, Gujjula Vishwa Teja, Srikanth Sambi, Arpit Awasthi

Abstract:

Autonomous driving vehicles are revolutionizing the transportation system of the 21st century. This has been possible due to intensive research into making robust, reliable, and intelligent programs that can perceive and understand their environment and make decisions based on that understanding. It is a very data-intensive task, with data coming from multiple sensors, and the amount of data directly affects the performance of the system. Researchers have to design the preprocessing pipeline for different datasets, with different sensor orientations and alignments, before a dataset can be fed to the model. This paper proposes a solution that unifies all the data from different sources into a uniform format using the intrinsic and extrinsic parameters of the sensors used to capture the data, allowing the same pipeline to use data from multiple sources at a time. This also means easy adoption of new or in-house generated datasets. The solution also automates the complete deep learning pipeline, from preprocessing to post-processing, for various tasks, allowing researchers to design multiple custom end-to-end pipelines. Thus, the solution takes care of input and output data handling, saving the time and effort spent on it and allowing more time for model improvement.
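The unification step the paper describes rests on the standard pinhole projection with intrinsic and extrinsic parameters. A minimal sketch; the matrices and parameter values here are made-up examples, not from any particular dataset:

```python
def project_point(point_world, extrinsic_r, extrinsic_t, fx, fy, cx, cy):
    """Project a 3D world point (e.g. a lidar return) into pixel coordinates
    using a camera's extrinsic rotation R and translation t, then its
    intrinsic parameters (focal lengths fx, fy and principal point cx, cy)."""
    # world -> camera frame: p_c = R * p_w + t
    pc = [sum(extrinsic_r[i][j] * point_world[j] for j in range(3)) + extrinsic_t[i]
          for i in range(3)]
    x, y, z = pc
    if z <= 0:
        raise ValueError("point is behind the camera")
    # camera frame -> pixels via the intrinsics
    return fx * x / z + cx, fy * y / z + cy

# Identity extrinsics: camera at the world origin looking along +z
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
u, v = project_point([0.5, -0.25, 2.0], R, [0.0, 0.0, 0.0],
                     fx=800, fy=800, cx=640, cy=360)  # -> (840.0, 260.0)
```

With every sensor's R, t and intrinsics stored alongside its data, the same projection code serves any dataset, which is the essence of the proposed unification.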

Keywords: augmentation, autonomous driving, camera, custom end-to-end pipeline, data unification, lidar, post-processing, preprocessing

Procedia PDF Downloads 74
24594 Information in Public Domain: How Far It Measures Government's Accountability

Authors: Sandip Mitra

Abstract:

Studies on governance and accountability have often stressed the need to release data in the public domain to increase transparency, since such data serve as evidence of performance. However, inefficient handling, lack of capacity and the dynamics of transfers (especially fund transfers) are important issues which need appropriate attention. E-governance alone cannot serve as a measure of transparency unless comprehensive planning is instituted. Studies on governance and public exposure have often swayed public opinion in favour of or against a government. The root of the problem (especially in local governments) lies in the management of governance. The participation of the people in local government functioning, the networks within and outside the locality, and the synergy between various layers of government are crucial to understanding the activities of any government. Unfortunately, data on such issues are not released in the public domain; if they are released at all, the extraction of information is often hindered by complicated designs. A study has been undertaken with a few local governments in India, and the data have been analysed to substantiate these views.

Keywords: accountability, e-governance, transparency, local government

Procedia PDF Downloads 404
24593 Interactive Shadow Play Animation System

Authors: Bo Wan, Xiu Wen, Lingling An, Xiaoling Ding

Abstract:

The paper describes a Chinese shadow play animation system based on Kinect. Users, without any professional training, can personally manipulate the shadow characters through their body actions to give a shadow play performance, and can obtain a video of the performance by giving the record command to the system. In our system, Kinect is responsible for capturing human movement and voice command data. A gesture recognition module is used to control the change of the shadow play scenes. After packaging the data from Kinect and the recognition results from the gesture recognition module, VRPN transmits them to the server side. Finally, the server side uses this information to control the motion of the shadow characters and the video recording. The system not only achieves human-computer interaction but also realizes interaction between people. It offers an entertaining experience and is easy to operate for users of all ages. More importantly, the application helps protect the art of Chinese shadow play animation.

Keywords: shadow play animation, Kinect, gesture recognition, VRPN, HCI

Procedia PDF Downloads 370
24592 An Adaptive Oversampling Technique for Imbalanced Datasets

Authors: Shaukat Ali Shahee, Usha Ananthakumar

Abstract:

A data set exhibits the class imbalance problem when one class has very few examples compared to the other class; this is also referred to as between-class imbalance. Traditional classifiers fail to classify the minority class examples correctly due to their bias towards the majority class. Apart from between-class imbalance, imbalance within classes, where classes are composed of different numbers of sub-clusters containing different numbers of examples, also deteriorates the performance of the classifier. Many methods have previously been proposed for handling the imbalanced dataset problem. These methods can be classified into four categories: data preprocessing, algorithmic methods, cost-based methods and ensembles of classifiers. Data preprocessing techniques have shown great potential, as they attempt to improve the data distribution rather than the classifier. A data preprocessing technique handles class imbalance either by increasing the minority class examples or by decreasing the majority class examples. Decreasing the majority class examples leads to loss of information, and when the minority class has an absolute rarity, removing majority class examples is generally not recommended. Existing methods for handling class imbalance do not address both between-class imbalance and within-class imbalance simultaneously. In this paper, we propose a method that handles both simultaneously for the binary classification problem. Removing between-class and within-class imbalance simultaneously eliminates the bias of the classifier towards bigger sub-clusters by minimizing the error domination of bigger sub-clusters in the total error. The proposed method uses model-based clustering to find sub-clusters or sub-concepts in the dataset. The number of examples oversampled among the sub-clusters is determined based on the complexity of the sub-clusters. The method also takes into consideration the scatter of the data in the feature space and adaptively copes with unseen test data using the Löwner-John ellipsoid, increasing the accuracy of the classifier. In this study, a neural network is used as the classifier, since it minimizes the total error, and removing between-class and within-class imbalance simultaneously helps it give equal weight to all sub-clusters irrespective of class. The proposed method is validated on 9 publicly available data sets and compared with three existing oversampling techniques that rely on the spatial location of minority class examples in the Euclidean feature space. The experimental results show the proposed method to be statistically significantly superior to the other methods in terms of various accuracy measures. Thus the proposed method can serve as a good alternative for handling problem domains such as credit scoring, customer churn prediction and financial distress prediction, which typically involve imbalanced data sets.
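As one illustration of the cluster-aware oversampling idea (not the authors' exact allocation rule, which uses model-based clustering and sub-cluster complexity), synthetic minority examples can be generated by interpolating within each sub-cluster only, so synthetic points never bridge two sub-concepts:

```python
import random

def oversample_subclusters(subclusters, target_total, rng=random.Random(42)):
    """Grow each minority sub-cluster to roughly target_total / k examples by
    interpolating between randomly chosen members of that sub-cluster only.
    The equal-share allocation is a simple stand-in for the complexity-based
    allocation described in the abstract."""
    k = len(subclusters)
    share = target_total // k
    out = []
    for cluster in subclusters:
        synthetic = list(cluster)
        while len(synthetic) < share:
            a, b = rng.choice(cluster), rng.choice(cluster)
            lam = rng.random()
            # convex combination stays inside the sub-cluster's convex hull
            synthetic.append(tuple(lam * ai + (1 - lam) * bi
                                   for ai, bi in zip(a, b)))
        out.extend(synthetic)
    return out
```

Because small sub-clusters need more synthetic points to reach their share, this already reduces within-class imbalance as well as between-class imbalance.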

Keywords: classification, imbalanced dataset, Löwner-John ellipsoid, model-based clustering, oversampling

Procedia PDF Downloads 390
24591 Development of Quasi Real-Time Comprehensive System for Earthquake Disaster

Authors: Zhi Liu, Hui Jiang, Jin Li, Kunhao Chen, Langfang Zhang

Abstract:

Fast acquisition of seismic information and accurate assessment of earthquake damage are the key problems for emergency rescue after a destructive earthquake. In order to meet the requirements of earthquake emergency response and rescue in cities and counties, a quasi real-time comprehensive evaluation system for earthquake disaster has been developed. The system combines a fragility analysis method and a dynamic correction algorithm, based on monitoring data from a Micro-Electro-Mechanical Systems (MEMS) strong-motion network, a structure database of a county area, and real-time disaster information reported from mobile terminals after an earthquake. Real-time evaluation of the seismic disaster in the county region is thereby realized, providing a scientific basis for seismic emergency command, rescue and decision support.
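Fragility analysis is commonly implemented with lognormal fragility curves. A minimal sketch with illustrative median and dispersion values; the system's actual parameters would come from its structure database:

```python
import math

def damage_probability(pga, median, beta):
    """Standard lognormal fragility curve: probability that a structure reaches
    a given damage state at peak ground acceleration `pga` (in g), where
    `median` is the median capacity and `beta` the log-standard deviation.
    The normal CDF is evaluated via the error function."""
    z = math.log(pga / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative only: a structure type with median capacity 0.3 g, beta = 0.5
p = damage_probability(0.3, 0.3, 0.5)  # at the median PGA the probability is 0.5
```

In a system like the one described, each recorded MEMS PGA would be run through the fragility curves of the nearby building classes, and the resulting damage probabilities corrected dynamically as field reports arrive.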

Keywords: quasi real-time, earthquake disaster data collection, MEMS accelerometer, dynamic correction, comprehensive evaluation

Procedia PDF Downloads 184
24590 A Reasoning Method of Cyber-Attack Attribution Based on Threat Intelligence

Authors: Li Qiang, Yang Ze-Ming, Liu Bao-Xu, Jiang Zheng-Wei

Abstract:

With the increasing complexity of cyberspace security, cyber-attack attribution has become an important challenge for security protection systems. The main difficulties of cyber-attack attribution lie in handling huge volumes of data and in missing key data. To address this situation, this paper presents a reasoning method for cyber-attack attribution based on threat intelligence. The method utilizes the intrusion kill chain model and Bayesian networks to build the attack chain and evidence chain of a cyber-attack on a threat intelligence platform through data calculation, analysis and reasoning. We then used a number of cyber-attack events that we have observed and analyzed to test the reasoning method and a demo system; the test results indicate that the reasoning method can provide some help in cyber-attack attribution.
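The Bayesian reasoning step can be illustrated with a toy posterior computation over candidate actors. The actor names, evidence labels and probabilities below are invented for illustration, not taken from the paper:

```python
def attribute(priors, likelihoods, observed):
    """Toy naive-Bayes attribution: multiply each candidate actor's prior by
    the likelihood of every observed indicator (tool, infrastructure, TTP),
    then normalize to get a posterior over actors. A real system would derive
    these numbers from a threat intelligence platform."""
    posts = {}
    for actor, prior in priors.items():
        p = prior
        for ev in observed:
            p *= likelihoods[actor].get(ev, 1e-6)  # tiny floor for unseen evidence
        posts[actor] = p
    total = sum(posts.values()) or 1.0
    return {a: p / total for a, p in posts.items()}

# Hypothetical example: two candidate actors, two observed indicators
priors = {"APT-A": 0.5, "APT-B": 0.5}
likelihoods = {
    "APT-A": {"tool_x": 0.8, "c2_domain_y": 0.6},
    "APT-B": {"tool_x": 0.1, "c2_domain_y": 0.2},
}
post = attribute(priors, likelihoods, ["tool_x", "c2_domain_y"])
```

A full Bayesian network additionally models dependencies between indicators along the kill chain rather than treating them as independent, as this sketch does.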

Keywords: reasoning, Bayesian networks, cyber-attack attribution, Kill Chain, threat intelligence

Procedia PDF Downloads 412
24589 Low Back Pain and Patient Lifting Behaviors among Nurses Working in Al Sadairy Hospital, Aljouf

Authors: Fatma Abdel Moneim Al Tawil

Abstract:

Low back pain (LBP) among nurses has been the subject of research studies worldwide; however, evidence of the influence of patient lifting behaviors on LBP among nurses in Saudi Arabia remains scarce. The purpose of this study was to investigate the relationship between LBP and nurses' lifting behaviors. An LBP questionnaire was distributed to 100 nurses working in Alsadairy Hospital, distributed as follows: emergency unit (9), coronary care unit (9), intensive care unit (7), dialysis unit (30), burn unit (5), surgical unit (11), medical unit (14) and X-ray unit (15). The questionnaire included demographic data, an attitude scale, a teamwork scale, back pain history and a knowledge scale. In the emergency unit, there is a positive significant relation between the teamwork scale and knowledge (r = 0.807, p = 0.05). In the ICU, there is a positive significant relation between the teamwork scale and the attitude scale (r = 0.781, p = 0.05). In the dialysis unit, there is a positive significant relation between the attitude scale and the teamwork scale (r = 0.443, p = 0.05). The findings suggest that awareness of occupational safety and safe patient handling practices must be emphasized among nursing students and integrated into their educational curriculum. Moreover, a back pain prevention program should incorporate the promotion of an active lifestyle, fitness training, and the implementation of institutional patient handling policies.
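The r values reported above are Pearson correlation coefficients between pairs of questionnaire scales. For reference, Pearson's r is computed as:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient: covariance of the two score lists
    divided by the product of their standard deviations, giving a value in
    [-1, 1]. The inputs would be paired per-nurse scale scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired teamwork and knowledge scores for a few nurses
r = pearson_r([12, 15, 18, 20, 22], [30, 33, 39, 41, 45])
```

An r of 0.807, as reported for the emergency unit, indicates a strong positive association; its significance is then assessed against the sample size.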

Keywords: low back pain, lifting behaviors, nurses, team work

Procedia PDF Downloads 390