Search results for: missing data estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26524

25114 Burnout and Salivary Cortisol Among Laboratory Personnel in Klang Valley, Malaysia During COVID-19 Pandemic

Authors: Maznieda Mahjom, Rohaida Ismail, Masita Arip, Mohd Shaiful Azlan, Nor’Ashikin Othman, Hafizah Abdullah, Nor Zahrin Hasran, Joshita Jothimanickam, Syaqilah Shawaluddin, Nadia Mohamad, Raheel Nazakat, Tuan Mohd Amin, Mizanurfakhri Ghazali, Rosmanajihah Mat Lazim

Abstract:

The COVID-19 outbreak has been particularly detrimental to everyone's mental health, leaving a long and devastating crisis in the healthcare sector. The daily increase in COVID-19 cases and close contacts necessitated the testing of large numbers of samples, increasing the workload and burden on laboratory personnel. This study aims to determine the prevalence of personal-, work- and client-related burnout, as well as to measure the concentration of salivary cortisol, among laboratory personnel in the main laboratories in Klang Valley, Malaysia. This cross-sectional study was conducted in late 2021 and recruited a total of 404 respondents from three laboratories in Klang Valley, Malaysia. The level of burnout was assessed using the Copenhagen Burnout Inventory (CBI), comprising three sub-dimensions of personal-, work- and client-related burnout; a cut-off score of 50% and above indicated possible burnout. Meanwhile, salivary cortisol was measured using a competitive enzyme immunoassay kit (Salimetrics, State College, PA, USA). Normal salivary cortisol concentrations in adults are within 0.094 to 1.551 μg/dl in the morning and range from non-detectable to 0.359 μg/dl in the evening. The prevalence of personal-, work- and client-related burnout among laboratory personnel was 36.1%, 17.8% and 7.2%, respectively. Meanwhile, abnormal morning and evening cortisol concentrations were recorded in 29.5% and 21.8% of respondents, excluding 6.9%-7.4% missing data. The IgA level was normal for most of the respondents (95.53%). Laboratory personnel were at risk of suffering burnout during the COVID-19 pandemic. Thus, mental health programs need to be addressed at the department and hospital level by regularly screening healthcare workers and designing intervention programs. It is also vital to improve the coping skills of laboratory personnel by raising awareness of good coping-skill techniques. Training must be delivered in an innovative way to ensure that laboratory personnel can internalise the techniques and practise them in real life.

Keywords: burnout, COVID-19, laboratory personnel, salivary cortisol

Procedia PDF Downloads 71
25113 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests

Authors: Julius Onyancha, Valentina Plekhanova

Abstract:

One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding useful information in relation to their dynamic interests. Current research works consider noise to be any data that does not form part of the main web page, and they propose noise web data reduction tools which mainly focus on eliminating noise in relation to the content and layout of web data. This paper argues that not all data that forms part of the main web page is of user interest, and not all noise data is actually noise to a given user. Therefore, learning the noise web data allocated to user requests ensures not only a reduction in the noisiness level of a web user profile, but also a decrease in the loss of useful information, hence improving the quality of the profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate the performance of the proposed work, an experimental design setup is presented, and the results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.
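
As a minimal illustration of the idea, the sketch below scores candidate web-data blocks against a decaying user-interest vector and flags low-similarity blocks as noise; the feature vectors, decay rate, and threshold are illustrative assumptions, not the authors' NWDL implementation.

```python
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

class DynamicInterestFilter:
    """Keeps a decaying user-interest vector; flags low-interest blocks as noise."""
    def __init__(self, dim, decay=0.9, threshold=0.2):
        self.profile = np.zeros(dim)
        self.decay = decay          # forget old interests gradually
        self.threshold = threshold  # below this similarity, a block is noise

    def update(self, clicked_block_vec):
        # blend new evidence of interest into the decayed profile
        self.profile = self.decay * self.profile + (1 - self.decay) * clicked_block_vec

    def is_noise(self, block_vec):
        return cosine(self.profile, block_vec) < self.threshold

# toy usage: 4-term vocabulary, user clicks a "sports" block
f = DynamicInterestFilter(dim=4)
f.update(np.array([1.0, 0.0, 0.0, 0.0]))       # sports interest observed
print(f.is_noise(np.array([0.9, 0.1, 0.0, 0.0])))  # False: matches current interest
print(f.is_noise(np.array([0.0, 0.0, 1.0, 0.0])))  # True: unrelated block is noise
```

Because the profile decays, a block that was useful last month can become noise today, which is the dynamic behaviour the paper targets.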

Keywords: web log data, web user profile, user interest, noise web data learning, machine learning

Procedia PDF Downloads 265
25112 A Review of Literature on Theories of Construction Accident Causation Models

Authors: Samuel Opeyemi Williams, Razali Bin Adul Hamid, M. S. Misnan, Taki Eddine Seghier, D. I. Ajayi

Abstract:

Construction sites are characterized by occupational risks. A review of the literature on construction accidents reveals that many theories have been propounded over the years by different theorists, coupled with numerous models developed by different proponents at different times. Accidents are unplanned events that are prominent on construction sites, involving materials, objects, and people, with attendant damage, losses, and injuries. Models were developed to investigate the causation of accidents with the aim of preventing their occurrence. However, some of these theories have been criticized; most notably, the Heinrich domino theory has been faulted for placing much of the blame on operatives rather than on management. The purpose of this paper is to unravel the significant construction accident causation theories and models in order to promote understanding of the theories and consequently enable construction stakeholders to identify possible potential hazards on construction sites, as all stakeholders have significant roles to play in preventing accidents. Accidents are preventable; hence, understanding the risk factors of accidents and the causation theories paves the way for their prevention. However, the findings reveal that gaps still remain in the existing models, and it is recommended that further research be carried out to develop more models in order to maintain zero accidents on construction sites.

Keywords: domino theory, construction site, site safety, accident causation model

Procedia PDF Downloads 304
25111 Estimation of Thermal Conductivity of Nanofluids Using MD-Stochastic Simulation-Based Approach

Authors: Sujoy Das, M. M. Ghosh

Abstract:

The thermal conductivity of a fluid can be significantly enhanced by dispersing nano-sized particles in it; the resultant fluid is termed a "nanofluid". A theoretical model for estimating the thermal conductivity of a nanofluid is proposed here. It is based on the mechanism that evenly dispersed nanoparticles within a nanofluid undergo Brownian motion, in the course of which the nanoparticles repeatedly collide with the heat source. During each collision a rapid heat transfer occurs owing to the solid-solid contact. Molecular dynamics (MD) simulation of the collision of nanoparticles with the heat source has shown that there is a pulse-like pick-up of heat by the nanoparticles within 20-100 ps, the extent of which depends not only on the thermal conductivity of the nanoparticles but also on their elastic and other physical properties. After the collision, the nanoparticles undergo Brownian motion in the base fluid and release the excess heat to the surrounding base fluid within 2-10 ms. The Brownian motion and the associated temperature variation of the nanoparticles have been modeled by stochastic analysis. The repeated occurrence of these events across the suspended nanoparticles contributes significantly to the characteristic thermal conductivity of the nanofluid, which has been estimated by the present model for an ethylene glycol-based nanofluid containing Cu nanoparticles of sizes ranging from 8 to 20 nm with a Gaussian size distribution. The prediction of the present model shows reasonable agreement with the experimental data available in the literature.
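
A heavily simplified sketch of the two-stage mechanism (pulse-like heat pick-up at the source, slow exponential release during Brownian flight) is given below; all parameter values are illustrative placeholders, not those fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 2e-7        # distance between heat source and far wall (m), illustrative
D = 2e-11       # Brownian diffusivity of a ~10 nm Cu particle (m^2/s), illustrative
dt = 1e-6       # time step (s)
tau = 5e-3      # heat-release time constant to the base fluid (s), in the 2-10 ms range
q_pulse = 1.0   # heat picked up per wall collision (arbitrary units)

x, q, carried = L / 2, 0.0, 0.0
for _ in range(500_000):                       # 0.5 s of simulated time
    x += np.sqrt(2 * D * dt) * rng.standard_normal()  # Brownian step
    if x <= 0.0:                               # collision with the heat source:
        x, q = 0.0, q + q_pulse                # pulse-like heat pick-up
    elif x >= L:
        x = L                                  # reflecting boundary in the bulk fluid
    released = q * (dt / tau)                  # gradual exponential heat release
    q -= released
    carried += released                        # heat ferried into the base fluid

print(f"heat transported by one particle: {carried:.1f} (arbitrary units)")
```

Summing this ferried heat over the nanoparticle population is what yields the enhancement of the effective conductivity in the model.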

Keywords: brownian dynamics, molecular dynamics, nanofluid, thermal conductivity

Procedia PDF Downloads 371
25110 Effect of Nicorandil, Bone Marrow-Derived Mesenchymal Stem Cells and Their Combination in Isoproterenol-Induced Heart Failure in Rats

Authors: Sarah Elsayed Mohammed, Lamiaa Ahmed Ahmed, Mahmoud Mohammed Khattab

Abstract:

Aim: The aim of the present study was to investigate whether combined nicorandil and bone marrow-derived mesenchymal stem cell (BMDMSC) treatment could offer an additional benefit in ameliorating isoproterenol (ISO)-induced heart failure in rats. Methods: ISO (85 and 170 mg/kg/day) was injected subcutaneously for 2 successive days, respectively. By day 3, electrocardiographic changes were recorded, and serum was separated for determination of the CK-MB level to confirm myocardial damage. Nicorandil (3 mg/kg/day) was then given orally, with or without a single i.v. BMDMSC administration. Electrocardiography and echocardiography were recorded 2 weeks after the beginning of treatment. Rats were then sacrificed, and the ventricles were isolated for estimation of vascular endothelial growth factor (VEGF), tumor necrosis factor-alpha (TNF-α), and transforming growth factor-beta (TGF-β) contents and caspase-3 activity, as well as inducible nitric oxide synthase (iNOS) and connexin-43 protein expression. Moreover, histological analysis of myocardial fibrosis was performed, and cryosections were prepared for estimation of the homing of BMDMSCs. Results: ISO induced a significant increase in the ventricles/body weight ratio, left ventricular end-diastolic (LVEDD) and end-systolic dimensions (LVESD), ST segment, and QRS duration. Moreover, myocardial fibrosis as well as VEGF, TNF-α, and TGF-β contents were significantly increased. On the other hand, connexin-43 protein expression was significantly decreased, while caspase-3 and iNOS protein expression was significantly increased. Combined therapy provided additional improvement over cell treatment alone in reducing cardiac hypertrophy, fibrosis, and inflammation. Furthermore, combined therapy induced a significant increase in angiogenesis and BMDMSC homing and prevented the ISO-induced changes in iNOS, connexin-43, and caspase-3 protein expression. Conclusion: Combined nicorandil/BMDMSC treatment was superior to BMDMSC treatment alone in preventing ISO-induced heart failure in rats.

Keywords: fibrosis, isoproterenol, mesenchymal stem cells, nicorandil

Procedia PDF Downloads 532
25109 3D Estimation of Synaptic Vesicle Distributions in Serial Section Transmission Electron Microscopy

Authors: Mahdieh Khanmohammadi, Sune Darkner, Nicoletta Nava, Jens Randel Nyengaard, Jon Sporring

Abstract:

We study the effect of stress on the nervous system using two experimental groups of rats: sham rats and rats subjected to acute foot-shock stress. We investigate the synaptic vesicle density as a function of distance to the active zone in serial-section transmission electron microscope images in two and three dimensions. By estimating the density in 2D and 3D, we compare the two groups of rats.
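
A minimal sketch of the kind of distance-binned density estimate described, with hypothetical inputs (the study's actual pipeline registers serial sections first and treats the active-zone geometry more carefully):

```python
import numpy as np

def density_profile(distances_nm, bin_width=40.0, max_dist=400.0, dim=3):
    """Vesicle density vs. distance to the active zone: per-bin counts normalized
    by shell volume (3D) or annulus area (2D). For simplicity this treats the
    active zone as a point, which is an assumption of the sketch."""
    edges = np.arange(0.0, max_dist + bin_width, bin_width)
    counts, _ = np.histogram(distances_nm, bins=edges)
    if dim == 3:
        norm = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    else:
        norm = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)
    return edges[:-1], counts / norm

rng = np.random.default_rng(1)
d = rng.uniform(0.0, 400.0, 500)        # toy vesicle-to-active-zone distances (nm)
bins, dens = density_profile(d, dim=3)  # compare sham vs. stressed profiles this way
```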

Keywords: stress, 3-dimensional synaptic vesicle density, image registration, bioinformatics

Procedia PDF Downloads 278
25108 Data Mining and Knowledge Management Application to Enhance Business Operations: An Exploratory Study

Authors: Zeba Mahmood

Abstract:

Modern business organizations are adopting technological advancements to achieve a competitive edge and satisfy their consumers. Developments in the field of information technology systems have changed the way business is conducted today. Business operations today rely more on the data they obtain, and this data is continuously increasing in volume. Data stored in different locations is difficult to find and use without the effective implementation of data mining and knowledge management techniques. Organizations that smartly identify, obtain, and then convert data into useful formats for decision making and operational improvements create additional value for their customers and enhance their operational capabilities. Marketing and customer relationship departments of firms use data mining techniques to make relevant decisions; this paper emphasizes the identification of the different data mining and knowledge management techniques that are applied in different business industries. The challenges and issues in the execution of these techniques are also discussed and critically analyzed in this paper.

Keywords: knowledge, knowledge management, knowledge discovery in databases, business, operational, information, data mining

Procedia PDF Downloads 538
25107 Estimation of Pressure Profile and Boundary Layer Characteristics over NACA 4412 Airfoil

Authors: Anwar Ul Haque, Waqar Asrar, Erwin Sulaeman, Jaffar S. M. Ali

Abstract:

Pressure distribution data for standard airfoils are usually used for calibration purposes in subsonic wind tunnels. The results of such experiments are quite old and were obtained with the model oriented in the spanwise direction. In this manuscript, the pressure distribution over a NACA 4412 airfoil model is presented, with the 3D model placed in the lateral direction. The model is made of metal, with pressure ports distributed longitudinally as well as in the lateral direction. The pressure model was attached to the floor of the tunnel with the help of a base plate to give the specified angle of attack to the model. Before the start of the experiments, the pressure tubes of the respective ports of the 128-port pressure scanner were checked for leakage, and the losses due to the length of the pipes were incorporated in the results for the specified pressure range. Growth-rate maps of the boundary layer thickness were also plotted. It was found that with the increase in velocity, the dynamic pressure distribution also increased over the alpha sweep. The pressure distribution plots so obtained were overlaid with those obtained using the XFLR® software, a low-fidelity tool. It was found that at moderate and high angles of attack, the pressure coefficients obtained from the experiments are high when compared with the XFLR® results obtained along the span of the wing. This under-prediction by XFLR® is more obvious on the windward side than on the leeward side.
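
The quantity behind such plots is the pressure coefficient, Cp = (p − p∞)/q∞ with q∞ = ½ρV²; a minimal reduction of scanner readings (illustrative values, not the study's data) looks like this:

```python
import numpy as np

rho = 1.225                      # air density (kg/m^3), sea-level standard
V = 30.0                         # freestream velocity (m/s), illustrative
p_inf = 101_325.0                # freestream static pressure (Pa)
q_inf = 0.5 * rho * V**2         # dynamic pressure (Pa)

# port pressures (Pa): gauge readings (after subtracting tubing losses) + static
p_ports = np.array([-650.0, -420.0, -150.0, 90.0, 160.0]) + p_inf
cp = (p_ports - p_inf) / q_inf   # pressure coefficient at each port
print(np.round(cp, 2))           # strong suction (negative Cp) on the upper surface
```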

Keywords: subsonic flow, boundary layer, wind tunnel, pressure testing

Procedia PDF Downloads 320
25106 Image Inpainting Model with Small-Sample Size Based on Generative Adversarial Network and Genetic Algorithm

Authors: Jiawen Wang, Qijun Chen

Abstract:

The performance of most machine-learning methods for image inpainting depends on the quantity and quality of the training samples. However, it is very expensive or even impossible to obtain a great number of training samples in many scenarios. In this paper, an image inpainting model based on a generative adversarial network (GAN) is constructed for cases where the number of training samples is small. Firstly, a feature extraction network (F-net) is incorporated into the GAN to utilize the available information in the image to be inpainted. The weighted sum of the extracted features and random noise acts as the input to the generative network (G-net). The proposed network can be trained well even when the sample size is very small. Secondly, in the completion phase for each damaged image, a genetic algorithm is designed to search for an optimized noise input for G-net; based on this optimized input, the parameters of the G-net and F-net are further learned (once the completion of a given damaged image ends, the parameters are restored to the original values obtained in the training phase) to generate an image patch that not only fills the missing part of the damaged image smoothly but also has visual semantics.
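
The genetic search over the noise input can be sketched as follows; the stand-in generator `g_net`, the population size, and the masked reconstruction loss are assumptions for illustration, not the paper's trained networks or configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 64  # latent dimension, illustrative

def g_net(z):
    """Stand-in for the trained generator: maps a latent vector to an 'image'."""
    return np.tanh(z[:32])  # placeholder image of 32 pixels

def fitness(z, damaged, mask):
    # reconstruction error on the KNOWN pixels only (mask == 1)
    return -np.sum(mask * (g_net(z) - damaged) ** 2)

def ga_search(damaged, mask, pop=50, gens=200, sigma=0.1):
    population = rng.standard_normal((pop, DIM))
    for _ in range(gens):
        scores = np.array([fitness(z, damaged, mask) for z in population])
        elite = population[np.argsort(scores)[-pop // 2:]]          # selection
        parents = elite[rng.integers(0, len(elite), (pop, 2))]
        cross = rng.random((pop, DIM)) < 0.5                        # uniform crossover
        children = np.where(cross, parents[:, 0], parents[:, 1])
        population = children + sigma * rng.standard_normal((pop, DIM))  # mutation
    best = population[np.argmax([fitness(z, damaged, mask) for z in population])]
    return g_net(best)  # completion: known pixels match, missing ones are filled

target = np.tanh(rng.standard_normal(32))
mask = (rng.random(32) < 0.7).astype(float)   # 30% of pixels missing
completed = ga_search(target * mask, mask)
```

In the paper's scheme, the fine-tuning of G-net and F-net parameters would follow this latent search for each damaged image.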

Keywords: image inpainting, generative adversarial nets, genetic algorithm, small-sample size

Procedia PDF Downloads 130
25105 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data

Authors: Adarsh Shroff

Abstract:

Big data is a collection of datasets so large and complex that they become difficult to process using database management tools. Operations like search, analysis, and visualization on big data are performed using data mining, the process of extracting patterns or knowledge from large data sets. In recent years, however, mining results have tended to become stale and obsolete over time as the underlying data evolves. Incremental processing is a promising approach to refreshing mining results: it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value pair level incremental processing rather than task-level re-computation, supports not only one-step computation but also the more sophisticated iterative computation widely used in data mining applications, and incorporates a set of novel techniques to reduce the I/O overhead of accessing preserved fine-grain computation states. To assess the mining results, i2MapReduce is evaluated using a one-step algorithm and three iterative algorithms with diverse computation characteristics.
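
The key-value-pair-level incremental idea can be illustrated with a toy word count: preserved per-key state is updated by deltas instead of being recomputed from scratch. This is a simplified single-machine sketch, not the i2MapReduce implementation.

```python
from collections import Counter

state = Counter()                 # preserved fine-grain state: word -> count

def initial_run(docs):
    for doc in docs:
        state.update(doc.split())
    return dict(state)

def incremental_update(added_docs=(), removed_docs=()):
    """Apply only the delta; untouched keys keep their saved values."""
    for doc in added_docs:
        state.update(doc.split())
    for doc in removed_docs:
        state.subtract(doc.split())
    return {w: c for w, c in state.items() if c > 0}

initial_run(["big data mining", "data mining tools"])
print(incremental_update(added_docs=["big data analytics"]))
# only the counts touched by the new document change; no full recomputation
```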

Keywords: big data, map reduce, incremental processing, iterative computation

Procedia PDF Downloads 354
25104 Estimation of Residual Stresses in Thick Walled Cylinder by Radial Basis Artificial Neural Network

Authors: Mohammad Heidari

Abstract:

In this paper, a method for estimating the residual stresses in autofrettaged tubes made of high-strength steel, based on artificial neural networks, is presented. Many different thick-walled cylinders that were subjected to different conditions were studied. At first, the residual stress is calculated by an analytical solution. Then, by varying the parameters that influence the residual stresses, such as the percentage of autofrettage, internal pressure, wall ratio of the cylinder, material properties of the cylinder, and the Bauschinger and hardening effect factors, a neural network is created. These parameters are the inputs of the network; the output of the network is the residual stress. Numerical data employed for training the network and the capabilities of the model in predicting the residual stress have been verified. The output obtained from the neural network model is compared with numerical results, and the relative error has been calculated. Based on this verification error, it is shown that the radial basis function neural network has an average error of 2.75% in predicting the residual stress of a thick-walled cylinder. Further analysis of the residual stress of thick-walled cylinders under different input conditions has been investigated, and comparison of the modeling results with numerical considerations shows good agreement, which also proves the feasibility and effectiveness of the adopted approach.
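
A minimal sketch of a radial basis function network of this kind is given below, with Gaussian basis functions centred on training points and output weights fitted by least squares; the inputs and targets are synthetic stand-ins, not the study's autofrettage data.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_design(X, centers, gamma):
    """Gaussian radial basis activations for every (sample, center) pair."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

# toy inputs: [autofrettage %, internal pressure, wall ratio, hardening, Bauschinger]
X = rng.random((200, 5))
y = np.sin(3.0 * X[:, 0]) + 0.5 * X[:, 1]           # synthetic residual-stress target

centers = X[rng.choice(len(X), 25, replace=False)]  # 25 hidden RBF units
Phi = rbf_design(X, centers, gamma=10.0)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)         # fit linear output weights

pred = Phi @ w
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"training RMSE: {rmse:.4f}")
```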

Keywords: thick walled cylinder, residual stress, radial basis, artificial neural network

Procedia PDF Downloads 417
25103 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach

Authors: Jerry Q. Cheng

Abstract:

Currently, in analyzing large-scale recurrent event data, there are many challenges such as memory limitations, unscalable computing time, etc. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data is randomly divided into many subsets, and the maximum likelihood estimator from each individual data set is obtained. Then a weighted method is proposed to combine these individual estimators as the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of this proposed method. This approach is applied to a large real dataset of repeated heart failure hospitalizations.
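
The combination step can be sketched for a simple parametric case, here an exponential event-rate model with inverse-variance weights; this illustrates the divide-and-conquer idea only, not the paper's frailty-model estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
true_rate = 0.5
data = rng.exponential(1 / true_rate, 1_000_000)  # gap times between recurrent events

def subset_mle(x):
    """Exponential-rate MLE and its asymptotic variance on one subset."""
    rate = 1.0 / x.mean()
    return rate, rate**2 / len(x)                 # Var(rate_hat) ~ rate^2 / n

subsets = np.array_split(rng.permutation(data), 20)   # random division into 20 chunks
est = np.array([subset_mle(s) for s in subsets])      # (estimate, variance) per chunk
weights = 1.0 / est[:, 1]
combined = np.sum(weights * est[:, 0]) / np.sum(weights)  # weighted combination

full = 1.0 / data.mean()
print(combined, full)   # the two estimates agree closely (asymptotic equivalence)
```

Each chunk fits in memory on its own, which is exactly what makes the approach attractive for large recurrent-event datasets.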

Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing

Procedia PDF Downloads 168
25102 Policies and Politics of Infrastructure Provisioning in Nigeria

Authors: Olufemi Adedamola Oyedele

Abstract:

Infrastructure provision in Nigeria is now at its lowest ebb, in spite of being critical to the socio-economic and political development of any nation. This is partly because the policy that would ensure its adequate provisioning is missing, and partly because politics is affecting its provision. Policies are the basic principles by which a government is guided. Infrastructural development is the basis for measuring the performance of governments, and it is the foundation of good governance. Demand for infrastructural development is high, while the resources used in its provision are limited. Ethnic-interest agitation and lobbying for infrastructure provision are common in a multi-ethnic state like Nigeria. Most infrastructure is now decayed and needs repair or replacement. Government is the system that organizes, controls, and sensitizes the people in a society in order for all to have an acceptable standard of living. Governments have the power to put in place all the measures that they deem fit to make an environment conducive for everybody to live in. Infrastructure development in any environment requires needs assessment, feasibility and viability studies, and carrying out the physical development of the project. The challenge in Nigeria is largely that development is carried out not where it is needed but where the people are loyal. There are numerous abandoned projects, because they were started due to politics and not because they were feasible. Policies and politics greatly affect infrastructure provisioning in Nigeria, and this is the premise of this paper.

Keywords: infrastructure challenges, infrastructure development, policy making, politics, project finance

Procedia PDF Downloads 281
25101 Application of Data Driven Based Models as Early Warning Tools of High Stream Flow Events and Floods

Authors: Mohammed Seyam, Faridah Othman, Ahmed El-Shafie

Abstract:

The early warning of high stream flow events (HSF) and floods is an important aspect of the management of surface water and river systems. This process can be performed using either process-based models or data-driven models such as artificial intelligence (AI) techniques. The main goal of this study is to develop an efficient AI-based model for predicting the real-time hourly stream flow (Q) and to apply it as an early warning tool for HSF and floods in the downstream area of the Selangor River basin, taken here as a paradigm of humid tropical rivers in Southeast Asia. The performance of the AI-based models has been improved through the integration of lag time (Lt) estimation into the modelling process. A total of 8753 patterns of Q, water level, and rainfall (RF) hourly records, representing a one-year period (2011), were utilized in the modelling process. Six hydrological scenarios were arranged through hypothetical cases of input variables to investigate how changes in RF intensity at upstream stations can lead to the formation of floods. The initial stream flow was changed for each scenario in order to include a wide range of hydrological situations in this study. The performance evaluation of the developed AI-based model shows that a high correlation coefficient (R) between the observed and predicted Q is achieved. The AI-based model has been successfully employed in early warning through the advance detection of the hydrological conditions that could lead to the formation of floods and HSF, represented by three levels of severity (i.e., alert, warning, and danger). Based on the results of the scenarios, reaching the danger level in the downstream area required high RF intensity in at least two upstream areas. From these applications, it can be concluded that AI-based models are beneficial tools for local authorities for flood control and awareness.
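
A minimal sketch of such a lagged-input model is shown below (a scikit-learn MLP on synthetic hourly records; the lag of 2 hours, the layer size, and the variable set are assumptions, not the study's calibrated configuration).

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 8760                              # one year of hourly records
rain = rng.gamma(0.3, 2.0, n)         # upstream rainfall (mm/h), synthetic
lag = 2                               # lag time Lt between upstream RF and downstream Q
q = np.convolve(rain, [0.5, 0.3, 0.2])[:n]       # synthetic stream flow response
q = np.roll(q, lag) + rng.normal(0, 0.05, n)

# predict Q(t) from rainfall at t-lag, t-lag-1, t-lag-2 and the previous flow
X = np.column_stack([np.roll(rain, lag), np.roll(rain, lag + 1),
                     np.roll(rain, lag + 2), np.roll(q, 1)])[lag + 2:]
y = q[lag + 2:]
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=1000, random_state=0)
model.fit(X[:7000], y[:7000])
print("R on held-out hours:", np.corrcoef(model.predict(X[7000:]), y[7000:])[0, 1])
```

Thresholding the predicted Q against alert/warning/danger levels is then a straightforward post-processing step.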

Keywords: floods, stream flow, hydrological modelling, hydrology, artificial intelligence

Procedia PDF Downloads 248
25100 Adoption of Big Data by Global Chemical Industries

Authors: Ashiff Khan, A. Seetharaman, Abhijit Dasgupta

Abstract:

The new era of big data (BD) is influencing chemical industries tremendously, providing several opportunities to reshape the way they operate and helping them shift towards intelligent manufacturing. Given the availability of free software and the large amount of real-time data generated and stored in process plants, chemical industries are still in the early stages of big data adoption. The industry is just starting to realize the importance of the large amount of data it owns for making the right decisions and supporting its strategies. This article explores the professional competencies and data science skills that influence BD adoption in chemical industries, to help the sector move towards intelligent manufacturing quickly and reliably. The article draws on a literature review and identifies potential applications in the chemical industry for moving from conventional methods to a data-driven approach. The scope of this document is limited to the adoption of BD in chemical industries and the variables identified in this article. To achieve this objective, government, academia, and industry must work together to overcome all present and future challenges.

Keywords: chemical engineering, big data analytics, industrial revolution, professional competence, data science

Procedia PDF Downloads 87
25099 Secure Multiparty Computations for Privacy Preserving Classifiers

Authors: M. Sumana, K. S. Hareesha

Abstract:

Secure computations are essential when performing privacy-preserving data mining. Distributed privacy-preserving data mining involves two or more sites that cannot pool their data to a third party due to laws protecting the individual. Hence, in order to model the private data without compromising privacy or incurring information loss, secure multiparty computations are used. Secure computations of the product, mean, variance, dot product, and sigmoid function using the additive and multiplicative homomorphic properties are discussed. The computations are performed on vertically partitioned data, with a single site holding the class value.
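
One related building block, a secure sum via additive secret sharing, can be sketched as follows; this illustrates how a joint statistic is computed without any site revealing its input, though it does not reproduce the paper's homomorphic constructions.

```python
import secrets

P = 2**61 - 1  # a large prime modulus; all arithmetic is in Z_P

def share(value, n_parties):
    """Split a private value into n additive shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def secure_sum(private_values):
    """Each party distributes shares of its value; any single share is
    uniformly random, yet the recombined total is exact."""
    n = len(private_values)
    all_shares = [share(v, n) for v in private_values]
    partial = [sum(all_shares[i][j] for i in range(n)) % P for j in range(n)]
    return sum(partial) % P

# three sites jointly compute the sum of their private counts
print(secure_sum([42, 17, 99]))  # 158, with no site revealing its own value
```

A secure mean follows by dividing by the (public) total record count, and secure dot products on vertically partitioned data can be assembled from similar primitives.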

Keywords: homomorphic property, secure product, secure mean and variance, secure dot product, vertically partitioned data

Procedia PDF Downloads 412
25098 Estimating Anthropometric Dimensions for Saudi Males Using Artificial Neural Networks

Authors: Waleed Basuliman

Abstract:

Anthropometric dimensions are considered one of the important factors when designing human-machine systems. In this study, the estimation of anthropometric dimensions has been improved by using an Artificial Neural Network (ANN) model that is able to predict the anthropometric measurements of Saudi males in Riyadh City. A total of 1427 Saudi males aged 6 to 60 years participated in the measurement of 20 anthropometric dimensions. These anthropometric measurements are considered important for designing work and life applications in Saudi Arabia. The data were collected over eight months from different locations in Riyadh City. Five of these dimensions were used as predictor variables (inputs) of the model, and the remaining 15 dimensions were set to be the estimated variables (the model’s outcomes). The hidden layers were varied during the structuring stage, and the best performance was achieved with the network structure 6-25-15. The results showed that the developed neural network model was able to estimate the body dimensions of the Saudi male population in Riyadh City. The network’s mean absolute percentage error (MAPE) and root mean squared error (RMSE) were found to be 0.0348 and 3.225, respectively. These errors were found to be smaller, and thus better, than the errors reported in the literature. Finally, the accuracy of the developed neural network was evaluated by comparing the predicted outcomes with a regression model. The ANN model showed a higher coefficient of determination (R²) between the predicted and actual dimensions than the regression model.
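
For reference, the two reported error measures can be computed as follows (a generic sketch with toy vectors, not the study's data; note that the reported MAPE of 0.0348 is a fraction, i.e., 3.48%):

```python
import numpy as np

def mape(actual, predicted):
    """Mean absolute percentage error, as a fraction (0.0348 = 3.48%)."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return np.mean(np.abs((actual - predicted) / actual))

def rmse(actual, predicted):
    """Root mean squared error, in the units of the measurement."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return np.sqrt(np.mean((actual - predicted) ** 2))

stature_cm = np.array([172.0, 168.5, 175.2])   # toy measured dimensions
pred_cm = np.array([171.1, 169.9, 174.0])      # toy network outputs
print(mape(stature_cm, pred_cm), rmse(stature_cm, pred_cm))
```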

Keywords: artificial neural network, anthropometric measurements, back-propagation

Procedia PDF Downloads 488
25097 Robot Operating System-Based SLAM for a Gazebo-Simulated Turtlebot2 in 2D Indoor Environment with Cartographer Algorithm

Authors: Wilayat Ali, Li Sheng, Waleed Ahmed

Abstract:

The ability of a robot to simultaneously build a map of the environment and localize itself with respect to that environment is the most important element of mobile robotics. Many algorithms can be utilized to build up the SLAM process, and SLAM is a developing area in robotics research. Robot Operating System (ROS) is one of the frameworks which provide multiple algorithm nodes to work with and provide a transmission layer to robots. Among the algorithms extensively in use are Hector SLAM, Gmapping, and Cartographer SLAM. This paper describes a ROS-based simultaneous localization and mapping (SLAM) library, Google Cartographer, which is an open-source algorithm. The algorithm was applied to create a map using laser and pose data from a 2D lidar placed on a mobile robot. The robot model uses the Gazebo package and is simulated in RViz. Our research work's primary goal is to obtain a map through the Cartographer SLAM algorithm in a static indoor environment. From our research, it is shown that for indoor environments Cartographer is an applicable algorithm for generating 2D maps with a lidar placed on a mobile robot, because it uses both odometry and pose estimation. The algorithm has been evaluated, and maps are constructed against the SLAM algorithms presented by Turtlebot2 in the static indoor environment.
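
For instance, the occupancy grid published during mapping can be consumed from a Python ROS node as below; this is a minimal rospy sketch, and the `/map` topic name is the conventional default assumed here.

```python
#!/usr/bin/env python
import rospy
from nav_msgs.msg import OccupancyGrid

def on_map(msg):
    # msg.data holds row-major cell occupancy: -1 unknown, 0 free, 100 occupied
    info = msg.info
    rospy.loginfo("map %dx%d at %.3f m/cell",
                  info.width, info.height, info.resolution)

rospy.init_node("map_listener")
rospy.Subscriber("/map", OccupancyGrid, on_map)
rospy.spin()
```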

Keywords: SLAM, ROS, navigation, localization and mapping, gazebo, Rviz, Turtlebot2, slam algorithms, 2d indoor environment, cartographer

Procedia PDF Downloads 148
25096 Application of the EU Commission Waste Management Methodology Level(s) to a Construction and a Demolition in North-West Romania

Authors: Valean Maria

Abstract:

Construction and demolition waste management is a timely topic due to the urgency of its transition to sustainability. This sector is responsible for over a third of the waste generated in the E.U., while the legislation requires a proportion of at least 70% preparation for reuse and recycling, excluding backfilling. To this end, the E.U. Commission has provided the Level(s) methodology, allowing for the standardized planning and reporting of waste quantities across all levels of the construction process, from the architecture to the demolition and from the estimation stage to the actual measurements at the end of the operations. We applied Level(s) for the first time to the Romanian context, a developing E.U. country in which illegal dumping of construction waste in nature and landfills is still common practice. We performed a desk study of the buildings’ documents, followed by field studies of the sites, and finally the insertion and calculation of statistical data on the construction and demolition waste. We learned that Romania is far from the E.U. average in terms of the initial estimations of waste, with some numbers being higher and others lower, and that the price of evacuation to landfills is significantly lower in the developing country, a possible barrier to adopting the new regulations. Finally, we found that concrete is the predominant type of waste, in terms of quantity as well as cost of disposal. Further directions of research are provided, such as mapping out all of the alternative facilities in the region and calculating the financial costs and the CO2 footprint of preparing and delivering waste sustainably, towards a more sound and locally adapted model of waste management.

Keywords: construction, waste, management, levels, EU

Procedia PDF Downloads 77
25095 Cross Project Software Fault Prediction at Design Phase

Authors: Pradeep Singh, Shrish Verma

Abstract:

Software fault prediction models are created by using the source code, metrics computed from the same or a previous version of the code, and related fault data. Some companies do not store and keep track of all the artifacts which are required for software fault prediction. To construct a fault prediction model for such a company, training data from other projects can be one potential solution. The earlier we predict a fault, the lower the cost of correcting it. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on the design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can be used as an initial guideline for projects where no previous fault data is available. We analyze seven data sets from the NASA Metrics Data Program which offer design as well as code metrics. Overall, the results of cross-project learning are comparable to within-company data learning.
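
The cross-project setup reduces to training on one project's design metrics and testing on another's; a minimal sketch with synthetic stand-ins for the NASA MDP design metrics:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)

def synth_project(n, shift):
    """Synthetic design metrics (e.g., fan-in/fan-out, branch count) + fault label."""
    X = rng.normal(shift, 1.0, (n, 4))
    y = (X.sum(axis=1) + rng.normal(0, 2, n) > 4 * shift).astype(int)
    return X, y

X_src, y_src = synth_project(500, shift=1.0)   # project with historical fault data
X_tgt, y_tgt = synth_project(300, shift=1.3)   # new project, no fault history

clf = GaussianNB().fit(X_src, y_src)           # learn only from the other project
print("cross-project recall:", recall_score(y_tgt, clf.predict(X_tgt)))
```

The distribution shift between source and target projects is what makes the within-versus-cross comparison in the paper interesting.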

Keywords: software metrics, fault prediction, cross project, within project

Procedia PDF Downloads 344
25094 Comparing Emotion Recognition from Voice and Facial Data Using Time Invariant Features

Authors: Vesna Kirandziska, Nevena Ackovska, Ana Madevska Bogdanova

Abstract:

The problem of emotion recognition is a challenging one. It is still an open problem from the perspective of both intelligent systems and psychology. In this paper, both voice features and facial features are used for building an emotion recognition system. Support Vector Machine classifiers are built using raw data from video recordings. The results obtained for emotion recognition are given, and a discussion about the validity and the expressiveness of different emotions is presented. A comparison is made between the classifiers built from facial data only, from voice data only, and from the combination of both. The need for a better combination of the information from facial expressions and voice data is argued.
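
The comparison amounts to training one classifier per modality and one on the concatenated features; a minimal sketch with random stand-ins for the extracted voice and facial features:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300
labels = rng.integers(0, 6, n)                     # six emotion classes, toy labels
voice = rng.normal(labels[:, None], 2.0, (n, 20))  # toy voice features
face = rng.normal(labels[:, None], 2.0, (n, 30))   # toy facial features

for name, X in [("voice", voice), ("face", face),
                ("fused", np.hstack([voice, face]))]:
    acc = cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean()
    print(f"{name}: {acc:.2f}")   # feature-level fusion typically scores highest
```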

Keywords: emotion recognition, facial recognition, signal processing, machine learning

Procedia PDF Downloads 317
25093 Cryptosystems in Asymmetric Cryptography for Securing Data on Cloud at Various Critical Levels

Authors: Sartaj Singh, Amar Singh, Ashok Sharma, Sandeep Kaur

Abstract:

With the upcoming threats of the digital world, we need to work continuously on security in all its aspects, from hardware to software as well as data modelling. The rise in social media activity and the hunger for data of various entities lead to cybercrime and to more attacks on the privacy and security of persons. Cryptography has always been employed to prevent access to important data, using many processes. Symmetric-key and asymmetric-key cryptography have been used for keeping data secret at rest as well as in transmission. Various cryptosystems have evolved over time to make data more secure. In this research article, we study various cryptosystems in asymmetric cryptography and their applications and usefulness, with much emphasis given to elliptic curve cryptography, which involves algebraic mathematics.
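
As one concrete elliptic-curve example, an ECDH key agreement with the Python `cryptography` package is sketched below; the curve choice (SECP256R1) and key-derivation parameters are illustrative.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# each side keeps its private key secret and publishes only the public key
alice_priv = ec.generate_private_key(ec.SECP256R1())
bob_priv = ec.generate_private_key(ec.SECP256R1())

# both sides derive the same shared secret from the peer's public key
alice_shared = alice_priv.exchange(ec.ECDH(), bob_priv.public_key())
bob_shared = bob_priv.exchange(ec.ECDH(), alice_priv.public_key())
assert alice_shared == bob_shared

# stretch the raw shared secret into a symmetric key for data at rest/in transit
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"session key").derive(alice_shared)
```

ECC achieves comparable security to RSA with much shorter keys, which is why it receives the emphasis noted above.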

Keywords: cryptography, symmetric key cryptography, asymmetric key cryptography

Procedia PDF Downloads 124
25092 Worldwide GIS Based Earthquake Information System/Alarming System for Microzonation/Liquefaction and Its Application for Infrastructure Development

Authors: Rajinder Kumar Gupta, Rajni Kant Agrawal, Jaganniwas

Abstract:

One of the most frightening phenomena of nature is the occurrence of earthquakes, as they have terrible and disastrous effects. Many earthquakes occur every day worldwide, so there is a need for knowledge of the trends in earthquake occurrence worldwide. The recording and interpretation of data obtained from the establishment of a worldwide system of seismological stations has made this possible. From the analysis of recorded earthquake data, the earthquake parameters and source parameters can be computed, and earthquake catalogues can be prepared. These catalogues provide information on origin time, epicenter locations (in terms of latitude and longitude), focal depths, magnitude, and other related details of the recorded earthquakes, and they are used for seismic hazard estimation. Manual interpretation and analysis of these data are tedious and time-consuming. A geographical information system (GIS) is a computer-based system designed to store, analyze, and display geographic information. The implementation of integrated GIS technology provides an approach which permits rapid evaluation of complex inventory databases under a variety of earthquake scenarios and allows the user to interactively view results almost immediately. GIS technology provides a powerful tool for displaying outputs and permits users to see the graphical distribution of the impacts of different earthquake scenarios and assumptions. An endeavor has been made in the present study to compile earthquake data for the whole world in Visual Basic on the ArcGIS platform so that it can be used easily for further analysis by earthquake engineers. The basic data on time of occurrence, location, and size of earthquakes has been compiled for querying based on various parameters. A preliminary analysis tool is also provided in the user interface to interpret earthquake recurrence in a region. The user interface also includes the seismic hazard information already worked out under the GSHAP program; the seismic hazard, in terms of probability of exceedance within definite return periods, is provided for the world. The seismic zones of the Indian region are included in the user interface from IS 1893-2002, the code on earthquake-resistant design of buildings. City-wise satellite images have been inserted into the map, and based on actual data the following information can be extracted in real time: • analysis of soil parameters and their effects • microzonation information • seismic hazard and strong ground motion • soil liquefaction and its effect on the surrounding area • impacts of liquefaction on buildings and infrastructure • occurrence of future earthquakes and their effect on existing soil • propagation of earth vibration due to the occurrence of an earthquake. A GIS-based earthquake information system has thus been prepared for the whole world in Visual Basic on the ArcGIS platform and further extended to the micro level based on actual soil parameters. Individual tools have been developed for liquefaction, earthquake frequency, etc. All this information can be used for the development of infrastructure, i.e., multi-story structures, irrigation dams and their components, hydropower, etc., in real time, for the present and the future.

Keywords: GIS based earthquake information system, microzonation, analysis and real time information about liquefaction, infrastructure development

Procedia PDF Downloads 317
25091 Synchronization of Chaotic T-System via Optimal Control as an Adaptive Controller

Authors: Hossein Kheiri, Bashir Naderi, Mohamad Reza Niknam

Abstract:

In this paper, we study the optimal synchronization of the chaotic T-system with completely uncertain parameters. Optimal control laws and parameter estimation rules are obtained by using the Hamilton-Jacobi-Bellman (HJB) technique and the Lyapunov stability theorem. The derived control laws are optimal adaptive controls and make the states of the drive and response systems asymptotically synchronized. Numerical simulation shows the effectiveness and feasibility of the proposed method.
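
For reference, the chaotic T-system studied in this line of work is usually written as the three-dimensional flow below; the notation follows the common convention, and the paper's exact uncertain-parameter setup may differ.

```latex
\dot{x} = a\,(y - x), \qquad
\dot{y} = (c - a)\,x - a\,x z, \qquad
\dot{z} = x y - b\,z, \qquad a, b, c > 0.
```

The adaptive scheme treats a, b, c as unknown, driving a response copy of this system so that the synchronization error decays along a Lyapunov function while the parameter estimates are updated.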

Keywords: Lyapunov stability, synchronization, chaos, optimal control, adaptive control

Procedia PDF Downloads 489
25090 Data Recording for Remote Monitoring of Autonomous Vehicles

Authors: Rong-Terng Juang

Abstract:

Autonomous vehicles offer the possibility of significant benefits to social welfare. However, fully automated cars might not arrive in the near future. To speed the adoption of self-driving technologies, many governments worldwide are passing laws requiring data recorders for the testing of autonomous vehicles. Currently, a self-driving vehicle (e.g., a shuttle bus) has to be monitored from a remote control center. When an autonomous vehicle encounters an unexpected driving environment, such as road construction or an obstruction, it should request assistance from a remote operator. Nevertheless, large amounts of data, including images, radar and lidar data, etc., have to be transmitted from the vehicle to the remote center. Therefore, this paper proposes a data compression method for in-vehicle networks for the remote monitoring of autonomous vehicles. Firstly, the time-series data are rearranged into a multi-dimensional signal space. Upon arrival, for controller area networks (CAN), the new data are mapped onto a time-data two-dimensional space associated with the specific CAN identity. Secondly, the data are sampled based on differential sampling. Finally, the whole set of data is encoded using existing algorithms such as Huffman, arithmetic, and codebook encoding methods. To evaluate system performance, the proposed method was deployed on an in-house-built autonomous vehicle. The testing results show that the amount of data can be reduced to as little as 1/7 of the raw data.
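
The pipeline can be illustrated end to end on a toy CAN-style signal: differential sampling turns a slowly varying series into a low-entropy stream of small deltas, which a Huffman code then compresses. This is a simplified sketch; the deployed recorder's frame layout and codec details are richer.

```python
import heapq
from collections import Counter

def deltas(samples):
    """Differential sampling: store the first value, then successive differences."""
    return [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

def huffman_code(symbols):
    """Build a Huffman codebook from symbol frequencies."""
    heap = [[freq, i, {sym: ""}]
            for i, (sym, freq) in enumerate(Counter(symbols).items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        table = {s: "0" + c for s, c in lo[2].items()}
        table.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], i, table])
        i += 1
    return heap[0][2]

speed = [100, 100, 101, 101, 101, 102, 102, 101, 101, 100]  # slowly varying signal
d = deltas(speed)                     # mostly 0s and ±1s -> low entropy
book = huffman_code(d)
bits = "".join(book[s] for s in d)
print(len(bits), "bits")              # far fewer bits than raw 8-bit samples
```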

Keywords: autonomous vehicle, data compression, remote monitoring, controller area networks (CAN), Lidar

Procedia PDF Downloads 164
25089 Applicant Perceptions in Admission Process to Higher Education: The Influence of Social Anxiety

Authors: I. Diamant, R. Srouji

Abstract:

Applicant perceptions are attitudes, feelings, and cognitions which individuals have about selection procedures; they have mostly been studied in the context of personnel selection. The main aim of the present study is to expand the understanding of applicant perceptions, using the framework of Organizational Justice Theory, in the domain of selection for higher education. The secondary aim is to explore the relationships between individual differences in social anxiety and applicants’ perceptions. The selection process is an accept/reject situation; it was hypothesized that applicants with higher social anxiety would experience negative perceptions and a lower estimation of success, especially when subjected to the social interaction elements of the process (interview and group simulation). The effects of prior preparation and of post-process explanations offered at the end of the selection process were also explored. One hundred sixty applicants to a psychology M.A. program participated in this research and, following the selection process, completed questionnaires measuring social anxiety, social exclusion, ratings on several justice dimensions for each of the methods in the selection process, feelings of success, and self-estimation of compatibility. About half of the applicants also received explanations regarding the significance and the aims of the selection process. Results provided support for most of our hypotheses: applicants with higher social anxiety experienced an increased level of social exclusion in the selection process, perceived the selection as less fair, and ended with a lower feeling of success relative to applicants without social anxiety. These relationships were especially salient in the selection procedures which included social interaction. Additionally, preparation for the selection process was positively related to a favorable perception of fairness in the selection process. Finally, contrary to our hypothesis, explanations did not affect applicants’ perceptions. The results enhance our understanding of which factors affect the perceptions of applicants to higher education studies and contribute uniquely to the understanding of the effect of social anxiety on different aspects of the selection process as experienced by applicants. The findings clearly show that some individuals may be predisposed to react unfavorably to certain selection situations. In an age of increasing awareness of fairness in evaluation, selection, and hiring procedures, these findings are relevant and may contribute to the design of future selection methods for personnel in general and for higher education in particular.

Keywords: applicant perceptions, selection and assessment, organizational justice theory, social anxiety

Procedia PDF Downloads 153
25088 Multimedia Data Fusion for Event Detection in Twitter by Using Dempster-Shafer Evidence Theory

Authors: Samar M. Alqhtani, Suhuai Luo, Brian Regan

Abstract:

Data fusion technology can be the best way to extract useful information from multiple sources of data, and it has been widely applied in various applications. This paper presents a data fusion approach over multimedia data for event detection in Twitter by using Dempster-Shafer evidence theory. The methodology applies a mining algorithm to detect the event. There are two types of data in the fusion. The first is features extracted from text using the bag-of-words method, calculated using the term frequency-inverse document frequency (TF-IDF). The second is the visual features extracted by applying the scale-invariant feature transform (SIFT). The Dempster-Shafer theory of evidence is applied in order to fuse the information from these two sources. Our experiments have indicated that, compared to approaches using an individual data source, the proposed data fusion approach can increase the prediction accuracy of event detection. The experimental results showed that the proposed method achieved a high accuracy of 0.97, compared with 0.93 with text only and 0.86 with images only.
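
The fusion step itself, Dempster's rule of combination over the frame {event, no-event}, can be written directly; the mass values below are illustrative stand-ins for the outputs of the text and image classifiers.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions whose focal elements are frozensets."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y              # mass assigned to disjoint hypotheses
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

E, N = frozenset({"event"}), frozenset({"no_event"})
theta = E | N                              # total ignorance
m_text = {E: 0.7, N: 0.2, theta: 0.1}      # evidence from TF-IDF features
m_image = {E: 0.6, N: 0.1, theta: 0.3}     # evidence from SIFT features
print(dempster_combine(m_text, m_image))   # fused belief strongly favours "event"
```

The normalisation by 1 − conflict is what lets two weak, partially conflicting sources reinforce each other, which is the effect reflected in the 0.97 fused accuracy.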

Keywords: data fusion, Dempster-Shafer theory, data mining, event detection

Procedia PDF Downloads 411
25087 Legal Issues of Collecting and Processing Big Health Data in the Light of European Regulation 679/2016

Authors: Ioannis Iglezakis, Theodoros D. Trokanas, Panagiota Kiortsi

Abstract:

This paper aims to explore major legal issues arising from the collection and processing of Health Big Data in the light of the new European secondary legislation for the protection of the personal data of natural persons, placing emphasis on the General Data Protection Regulation 679/2016. Whether Big Health Data can be characterised as ‘personal data’ or not is really the crux of the matter. The legal ambiguity is compounded by the fact that, even though the processing of Big Health Data is premised on the de-identification of the data subject, the possibility of a combination of Big Health Data with other data circulating freely on the web or from other data files cannot be excluded. Another key point is that the application of some provisions of the GDPR to Big Health Data may both absolve the data controller of his legal obligations and deprive the data subject of his rights (e.g., the right to be informed), ultimately undermining the fundamental right to the protection of personal data of natural persons. Moreover, data subjects’ rights (e.g., the right not to be subject to a decision based solely on automated processing) are heavily impacted by the use of AI, algorithms, and technologies that reclaim health data for further use, producing sometimes ambiguous results that have a substantial impact on individuals. On the other hand, as the COVID-19 pandemic has revealed, Big Data analytics can offer crucial sources of information. In this respect, this paper identifies and systematises the legal provisions concerned, offering interpretative solutions that tackle dangers concerning data subjects’ rights while embracing the opportunities that Big Health Data has to offer. In addition, particular attention is attached to the scope of ‘consent’ as a legal basis for the collection and processing of Big Health Data, as the application of data analytics to Big Health Data signals the construction of new data and subject profiles. Finally, the paper addresses the knotty problem of role assignment (i.e., distinguishing between controller and processor/joint controllers and joint processors) in an era of extensive Big Health Data sharing. The findings are the fruit of a current research project conducted by a three-member research team at the Faculty of Law of the Aristotle University of Thessaloniki and funded by the Greek Ministry of Education and Religious Affairs.

Keywords: big health data, data subject rights, GDPR, pandemic

Procedia PDF Downloads 130
25086 Adaptive Data Approximations Codec (ADAC) for AI/ML-based Cyber-Physical Systems

Authors: Yong-Kyu Jung

Abstract:

The fast growth of information technology has led to demands to access and process data. Cyber-physical systems (CPSs) heavily depend on the timing of hardware/software operations and communication over the network, i.e., real-time and parallel operations in CPSs (e.g., autonomous vehicles). Data processing is an important means of overcoming the issues confronting data management and of reducing the gap between technological growth on one side and data complexity and channel bandwidth on the other. An adaptive perpetual data approximation method is introduced to manage the actual entropy of the digital spectrum. An ADAC, implemented as an accelerator and/or as apps for servers and smart connected devices, adaptively rescales digital contents (avg. 62.8%), data processing/access time and energy, and encryption/decryption overheads in AI/ML applications (e.g., facial ID/recognition).

Keywords: adaptive codec, AI, ML, HPC, cyber-physical, cybersecurity

Procedia PDF Downloads 80
25085 Bias in the Estimation of Covariance Matrices and Optimality Criteria

Authors: Juan M. Rodriguez-Diaz

Abstract:

The precision of parameter estimators in the Gaussian linear model is traditionally accounted for by the variance-covariance matrix of the asymptotic distribution. However, this measure can underestimate the true variance, especially for small samples. Traditionally, optimal design theory pays attention to this variance through its relationship with the model's information matrix. For this reason it seems convenient, at least in some cases, to adapt the optimality criteria in order to get the best designs for the actual variance structure; otherwise, the loss in efficiency of the designs obtained with the traditional approach may be very important.
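
In symbols, the concern is that the usual criterion targets the wrong matrix when observations are correlated; a sketch of the contrast in standard notation (not the paper's exact formulation):

```latex
% OLS estimator and its nominal variance under uncorrelated errors:
\hat{\beta} = (X^{\top}X)^{-1}X^{\top}y,
\qquad
\operatorname{Var}_{0}(\hat{\beta}) = \sigma^{2}(X^{\top}X)^{-1}.

% With correlated observations, \operatorname{Cov}(\varepsilon) = \Sigma, the true variance is
\operatorname{Var}(\hat{\beta})
  = (X^{\top}X)^{-1}X^{\top}\Sigma\,X\,(X^{\top}X)^{-1},

% so a D-optimality criterion should minimise \det\operatorname{Var}(\hat{\beta})
% for the actual \Sigma rather than for \sigma^{2}(X^{\top}X)^{-1}.
```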

Keywords: correlated observations, information matrix, optimality criteria, variance-covariance matrix

Procedia PDF Downloads 444