Search results for: data analysis of Uzbekistan
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 42036

40836 Review of Concepts and Tools Applied to Assess Risks Associated with Food Imports

Authors: A. Falenski, A. Kaesbohrer, M. Filter

Abstract:

Introduction: Risk assessments can be performed in various ways and in different degrees of complexity. In order to assess risks associated with imported foods, additional information needs to be taken into account compared to a risk assessment on regional products. The present review is an overview of currently available best-practice approaches and data sources used for food import risk assessments (IRAs). Methods: A literature review was performed. PubMed was searched for articles about food IRAs published in the years 2004 to 2014 (English and German texts only, search string “(English [la] OR German [la]) (2004:2014 [dp]) import [ti] risk”). Titles and abstracts were screened for import risks in the context of IRAs. The publications finally selected were analysed according to a predefined questionnaire extracting the following information: risk assessment guidelines followed, modelling methods used, data and software applied, and existence of an analysis of uncertainty and variability. IRAs cited in these publications were also included in the analysis. Results: The PubMed search resulted in 49 publications, 17 of which contained information about import risks and risk assessments. Within these, 19 cross-references were identified as being of interest for the present study. These included original articles, reviews and guidelines. At least one of the guidelines of the World Organisation for Animal Health (OIE) or the Codex Alimentarius Commission was referenced in each of the IRAs, either for the import of animals or for imports concerning foods, respectively. Interestingly, a combination of both was also used to assess the risk associated with the import of live animals serving as the source of food. Methods ranged from fully quantitative IRAs using probabilistic models and dose-response models to qualitative IRAs in which decision trees or severity tables were set up using parameter estimations based on expert opinions. Calculations were done using @Risk, R or Excel. Most heterogeneous was the type of data used, ranging from general information on imported goods (food, live animals) to pathogen prevalence in the country of origin. These data were either publicly available in databases or lists (e.g., OIE WAHID and Handystatus II, FAOSTAT, Eurostat, TRACES), accessible on a national level (e.g., herd information) or only open to a small group of people (flight passenger import data at national airport customs offices). In the IRAs, an uncertainty analysis was mentioned in some cases, but calculations were performed in only a few of them. Conclusion: The current state of the art in the assessment of risks of imported foods is characterized by great heterogeneity in the general methodology and data used. Information is often gathered on a case-by-case basis and reformatted by hand in order to perform the IRA. This analysis therefore illustrates the need for a flexible, modular framework supporting the connection of existing data sources with data analysis and modelling tools. Such an infrastructure could pave the way to IRA workflows applicable ad hoc, e.g., in a crisis situation.

Keywords: import risk assessment, review, tools, food import

Procedia PDF Downloads 301
40835 The Use of Image Processing Tools Applied to Analysing the Bouguer Gravity Anomaly Map (Tangier-Tetuan Area, Morocco)

Authors: Saad Bakkali

Abstract:

Image processing is a powerful tool for the enhancement of edges in images used in the interpretation of geophysical potential field data. Aerial and terrestrial gravimetric surveys were carried out in the region of Tangier-Tetuan. From the observed and measured gravity data, a Bouguer gravity anomaly map was prepared. This paper reports the results and interpretations of the transformed maps of the Bouguer gravity anomaly of the Tangier-Tetuan area using image processing. Filtering analysis based on classical image processing was applied. Image processing operators such as logarithmic and gamma correction were used. This paper also presents the results obtained from this image processing analysis of edge enhancement of the Bouguer gravity anomaly map of the Tangier-Tetuan zone.
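
As a rough illustration of the two operators named above, the following minimal Python sketch applies logarithmic and gamma (power-law) corrections to a normalized anomaly grid; the synthetic grid and parameter values are placeholders, not the Tangier-Tetuan survey data.

```python
import numpy as np

def normalize(grid):
    """Rescale a gridded anomaly map to the [0, 1] range."""
    g = np.asarray(grid, dtype=float)
    return (g - g.min()) / (g.max() - g.min() + 1e-12)

def gamma_correction(grid, gamma=0.4):
    """Power-law (gamma) operator; gamma < 1 brightens subtle anomalies."""
    return normalize(grid) ** gamma

def log_enhancement(grid):
    """Logarithmic operator; compresses large amplitudes and boosts small ones."""
    return np.log1p(normalize(grid)) / np.log(2.0)

# Synthetic stand-in for the Bouguer anomaly grid (the real map is not reproduced here)
bouguer = np.random.default_rng(0).normal(size=(256, 256)).cumsum(axis=0)
enhanced_gamma = gamma_correction(bouguer)
enhanced_log = log_enhancement(bouguer)
```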

Keywords: Bouguer, Tangier, filtering, gamma correction, logarithmic enhancement, edges

Procedia PDF Downloads 420
40834 Time-Series Load Data Analysis for User Power Profiling

Authors: Mahdi Daghmhehci Firoozjaei, Minchang Kim, Dima Alhadidi

Abstract:

In this paper, we present a power profiling model for smart grid consumers based on real-time load data acquired from smart meters. It profiles consumers’ power consumption behaviour using the dynamic time warping (DTW) clustering algorithm. Owing to this algorithm’s invariance to signal warping, time-shifted load data can be profiled and consumption features extracted. Two load types are defined, and the related load patterns are extracted for classifying consumption behaviour by DTW. The classification methodology is discussed in detail. To evaluate the performance of the method, we analyze the time-series load data measured by a smart meter in a real case. The results verify the effectiveness of the proposed profiling method, with a 90.91% true positive rate for load type clustering in the best case.
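
The sketch below is a minimal Python implementation of the DTW distance that underlies such clustering; the hourly load profiles are invented for illustration, and the paper's clustering and feature-extraction steps are not reproduced.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two load profiles."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Two hourly load profiles (kW) with the same shape but shifted in time
day1 = np.array([0.3, 0.3, 0.4, 1.2, 2.0, 1.1, 0.5, 0.4])
day2 = np.array([0.3, 0.4, 1.2, 2.0, 1.1, 0.5, 0.4, 0.3])
print(dtw_distance(day1, day2))    # small: DTW tolerates the time shift
print(np.abs(day1 - day2).sum())   # larger: point-wise distance does not
```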

Keywords: power profiling, user privacy, dynamic time warping, smart grid

Procedia PDF Downloads 147
40833 Detection of Change Points in Earthquakes Data: A Bayesian Approach

Authors: F. A. Al-Awadhi, D. Al-Hulail

Abstract:

In this study, we applied a Bayesian hierarchical model to detect single and multiple change points for daily earthquake body wave magnitude. Change point analysis is used in both backward (off-line) and forward (on-line) statistical research; in this study, it is used with the backward approach. Different types of change parameters are considered (mean, variance or both). The posterior model and the conditional distributions for single and multiple change points are derived and implemented using the BUGS software. The model is applicable to any set of data. The sensitivity of the model is tested using different prior and likelihood functions. Using Mb data, we concluded that between January 2002 and December 2003, three changes occurred in the mean magnitude of Mb in Kuwait and its vicinity.
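
As a deliberately simplified illustration of detecting a change in a mean, the sketch below computes a discrete posterior over a single change point with a flat prior on its location, segment means profiled at their maximum-likelihood values, and a known noise standard deviation; the paper's full hierarchical model in BUGS handles multiple change points, variance changes, and proper priors.

```python
import numpy as np
from scipy import stats

def change_point_posterior(y, sigma=0.3):
    """Approximate posterior over a single change point in the mean.

    Flat prior over the change location; the two segment means are profiled
    at their maximum-likelihood values and the noise s.d. is assumed known.
    """
    n = len(y)
    logp = np.full(n, -np.inf)
    for k in range(2, n - 1):          # change occurs after observation k
        mu1, mu2 = y[:k].mean(), y[k:].mean()
        logp[k] = (stats.norm.logpdf(y[:k], mu1, sigma).sum()
                   + stats.norm.logpdf(y[k:], mu2, sigma).sum())
    logp -= logp.max()
    post = np.exp(logp)
    return post / post.sum()

# Synthetic daily body-wave magnitudes with a shift in the mean at day 60
rng = np.random.default_rng(1)
mb = np.concatenate([rng.normal(4.0, 0.3, 60), rng.normal(4.4, 0.3, 60)])
posterior = change_point_posterior(mb)
print("most probable change point:", posterior.argmax())
```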

Keywords: multiple change points, Markov Chain Monte Carlo, earthquake magnitude, hierarchical Bayesian model

Procedia PDF Downloads 455
40832 Efficiency in Islamic Banks: Some Empirical Evidence from the Indonesian Finance Market

Authors: Ahmed Sameer El Khatib

Abstract:

The aim of the present paper is to examine the revenue efficiency of the Indonesian Islamic banking sector. The study also seeks to investigate the potential internal (bank specific) and external (macroeconomic) determinants that influence the revenue efficiency of Indonesian domestic Islamic banks. We employ the whole gamut of domestic and foreign Islamic banks operating in the Indonesian Islamic banking sector during the period of 2009 to 2018. The level of revenue efficiency is computed by using the Data Envelopment Analysis (DEA) method. Furthermore, we employ a panel regression analysis framework based on the Ordinary Least Square (OLS) method to examine the potential determinants of revenue efficiency. The results indicate that the level of revenue efficiency of Indonesian domestic Islamic banks is lower compared to their foreign Islamic bank counterparts. We find that bank market power, liquidity, and management quality significantly influence the improvement in revenue efficiency of the Indonesian domestic Islamic banks during the period under study. By calculating these efficiency concepts, we can observe the efficiency levels of the domestic and foreign Islamic banks. In addition, by comparing both cost and profit efficiency, we can identify the influence of the revenue efficiency on the banks’ profitability.
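
The study relies on DEA; the sketch below shows the basic linear programming machinery of the standard input-oriented CCR model using scipy (revenue efficiency additionally requires output prices). The bank inputs and outputs are toy numbers, not data from the study.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, o):
    """Input-oriented CCR efficiency score of decision-making unit `o`.

    X: (n_dmu, n_inputs) input matrix, Y: (n_dmu, n_outputs) output matrix.
    Decision variables: [theta, lambda_1, ..., lambda_n].
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                        # minimize theta
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])      # sum_j l_j x_ij <= theta * x_io
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])       # sum_j l_j y_rj >= y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[o]])
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[0]

# Toy data: 5 banks, 2 inputs (deposits, staff), 1 output (financing income)
X = np.array([[20, 30], [15, 25], [40, 60], [22, 28], [35, 50]], dtype=float)
Y = np.array([[10], [9], [18], [14], [15]], dtype=float)
print([round(dea_ccr_input(X, Y, o), 3) for o in range(len(X))])
```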

Keywords: Islamic Finance, Islamic Banks, Revenue Efficiency, Data Envelopment Analysis

Procedia PDF Downloads 239
40831 Performance Comparison of ADTree and Naive Bayes Algorithms for Spam Filtering

Authors: Thanh Nguyen, Andrei Doncescu, Pierre Siegel

Abstract:

Classification is an important data mining technique and can be used for data filtering in artificial intelligence. The broad applicability of classification to all kinds of data means it is used in nearly every field of modern life. Classification helps us group items according to the features judged interesting and useful. In this paper, we compare two classification methods, Naïve Bayes and ADTree, used to detect spam e-mail. This choice is motivated by the fact that the Naive Bayes algorithm is based on probability calculus while the ADTree algorithm is based on decision trees. The parameter settings of the above classifiers aim to maximize the true positive rate and minimize the false positive rate. The experimental results present classification accuracy and a cost analysis with a view to the optimal classifier choice for spam detection. We also point out the number of attributes needed to obtain a trade-off between the number of attributes and the classification accuracy.
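
A minimal scikit-learn sketch of the Naive Bayes side of the comparison is shown below; the corpus is a toy example, and ADTree (an alternating decision tree, typically run in Weka) has no scikit-learn equivalent, so only one of the two classifiers is illustrated.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.metrics import confusion_matrix

# Tiny illustrative corpus; a real evaluation would use a benchmark e-mail set
emails = [
    "win a free prize now", "cheap meds limited offer", "claim your reward today",
    "meeting agenda for monday", "project deadline moved to friday", "lunch at noon?",
]
labels = [1, 1, 1, 0, 0, 0]   # 1 = spam, 0 = ham

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(emails, labels)

print(model.predict(["free reward offer", "agenda for the project meeting"]))  # [1, 0]
# The true/false positive trade-off can be read from the confusion matrix
print(confusion_matrix(labels, model.predict(emails)))
```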

Keywords: classification, data mining, spam filtering, naive Bayes, decision tree

Procedia PDF Downloads 408
40830 Discriminant Analysis as a Function of Predictive Learning to Select Evolutionary Algorithms in Intelligent Transportation System

Authors: Jorge A. Ruiz-Vanoye, Ocotlán Díaz-Parra, Alejandro Fuentes-Penna, Daniel Vélez-Díaz, Edith Olaco García

Abstract:

In this paper, we present the use of discriminant analysis to select the evolutionary algorithms that better solve instances of the vehicle routing problem with time windows. We use indicators as independent variables to obtain the classification criteria, and the best algorithm among the generic genetic algorithm (GA), random search (RS), steady-state genetic algorithm (SSGA), and sexual genetic algorithm (SXGA) as the dependent variable for the classification. The discriminant classifier was trained with classic instances of the vehicle routing problem with time windows obtained from the Solomon benchmark. The discriminant analysis achieved a classification rate of 66.7%.
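
A minimal sketch of this idea, using scikit-learn's linear discriminant analysis, is shown below; the instance indicators and the "best algorithm" labels are randomly generated stand-ins for the Solomon-benchmark features and results used in the paper.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical instance indicators (e.g., customers, time-window tightness, ...)
rng = np.random.default_rng(7)
X = rng.normal(size=(60, 4))
# Label = best-performing algorithm on each instance: 0=GA, 1=RS, 2=SSGA, 3=SXGA
y = rng.integers(0, 4, size=60)

lda = LinearDiscriminantAnalysis()
print("classification rate:", cross_val_score(lda, X, y, cv=5).mean())

lda.fit(X, y)
print("predicted best algorithm for a new instance:", lda.predict(X[:1]))
```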

Keywords: Intelligent Transportation Systems, data-mining techniques, evolutionary algorithms, discriminant analysis, machine learning

Procedia PDF Downloads 470
40829 Possible Approach for Interlinking of Ponds to Mitigate Drought in Sivaganga Villages at Micro Level

Authors: Manikandan Sathianarayanan, Pernaidu Pasala

Abstract:

This paper presents the results of our studies concerning the implementation and exploitation of a Geographical Information System (GIS) dedicated to the support and assistance of decisions requested by drought management. In this study, the diverting of surplus water through canals, ponds, and check dams in the study area was examined. Remote sensing and GIS data were used to identify the drought-prone villages in Sivaganga taluk and to generate present land use, drainage pattern, slope, and contour maps. This analysis was carried out for diverting surplus water through the proposed canals and ponds. The results of the study indicate that if the surplus water from the ponds and streams is diverted to the drought villages in Sivaganga taluk, it will improve agricultural production due to the availability of water in the ponds. The improvement in agricultural production will in turn help to improve the economic condition of the farmers in the region.

Keywords: interlinking, spatial analysis, remote sensing, GIS

Procedia PDF Downloads 251
40828 Wireless Sensor Network for Forest Fire Detection and Localization

Authors: Tarek Dandashi

Abstract:

WSNs may provide a fast and reliable solution for the early detection of environmental events like forest fires. This is crucial for alerting and calling for fire brigade intervention. Sensor nodes communicate sensor data to a host station, which enables a global analysis and the generation of a reliable decision on a potential fire and its location. A WSN built with TinyOS and nesC for capturing and transmitting a variety of sensor information with controlled source, data rates and duration, and for recording/displaying activity traces, is presented. We propose a similarity distance (SD) between the distribution of currently sensed data and that of a reference. At any given time, a fire causes diverging opinions in the reported data, which alters the usual data distribution. Basically, SD consists of a metric on the Cumulative Distribution Function (CDF). SD is designed to be invariant to day-to-day changes of temperature, changes due to the surrounding environment, and normal changes in weather, all of which preserve the data locality. Evaluation shows that SD sensitivity is quadratic with respect to an increase in sensor node temperature for groups of sensors of different sizes and neighborhoods. Simulation of fire spreading, with ignition placed at random locations and some wind speed, shows that SD takes a few minutes to reliably detect fires and locate them. We also discuss false negatives and false positives and their impact on decision reliability.
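
The abstract does not spell out the SD metric, but one plausible CDF-based reading is the maximum gap between the empirical CDF of current readings and that of a reference distribution (a Kolmogorov-Smirnov-style statistic), sketched below on synthetic temperatures.

```python
import numpy as np

def similarity_distance(sample, reference, n_grid=200):
    """Maximum gap between the empirical CDFs of current and reference readings."""
    grid = np.linspace(min(sample.min(), reference.min()),
                       max(sample.max(), reference.max()), n_grid)
    cdf_s = np.searchsorted(np.sort(sample), grid, side="right") / len(sample)
    cdf_r = np.searchsorted(np.sort(reference), grid, side="right") / len(reference)
    return np.max(np.abs(cdf_s - cdf_r))

rng = np.random.default_rng(3)
reference = rng.normal(24.0, 1.5, 1000)   # reference daytime temperatures (deg C)
normal_day = rng.normal(25.0, 1.5, 50)    # ordinary warm day: data locality preserved
fire_event = rng.normal(38.0, 4.0, 50)    # some nodes report diverging readings
print(similarity_distance(normal_day, reference))   # small
print(similarity_distance(fire_event, reference))   # large -> raise an alert
```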

Keywords: forest fire, WSN, wireless sensor network, algorithm

Procedia PDF Downloads 260
40827 Students’ Perceptions on Educational Game for Learning Programming Subject: A Case Study

Authors: Roslina Ibrahim, Azizah Jaafar, Khalili Khalil

Abstract:

Educational games (EG) are regarded as a promising teaching and learning tool for the new generation. A growing number of studies and a growing literature on EG can be found. Both academic researchers and commercial developers have come out with various educational game prototypes and titles. Despite that, acceptance of educational games is still lacking among students. It is important to understand students’ perceptions of EG, since they are the main stakeholders of the technology. Thus, this study seeks to understand the perceptions of undergraduate students using a framework originating from user acceptance theory. The framework consists of six constructs with twenty-eight items. Data collection was done on 180 undergraduate students of Universiti Teknologi Malaysia, Kuala Lumpur, using a self-developed online EG called ROBO-C. Data analysis was done using descriptive statistics, factor analysis and correlations. Performance expectancy, effort expectancy, attitude, and enjoyment factors were found to be significantly correlated with the intention to use EG. This study provides a better understanding of the use of educational games among students.

Keywords: educational games, perceptions, acceptance, UTAUT

Procedia PDF Downloads 410
40826 Understanding Team Member Autonomy and Team Collaboration: A Qualitative Study

Authors: Ayşen Bakioğlu, Gökçen Seyra Çakır

Abstract:

This study aims to explore how research assistants who work in project teams experience team member autonomy and how they reconcile team member autonomy with team collaboration. The study utilizes snowball sampling. 20 research assistants who work in the faculties of education at Marmara University and Yıldız Technical University were interviewed. The analysis of the data involves content analysis; MAXQDA Plus 11, a qualitative data analysis software package, is used as the data analysis tool. According to the findings of this study, emerging themes include team norm formation, team coordination management, the role of individual tasks in team collaboration, and leadership distribution. According to the findings, interviewees experience the team norm formation process in terms of processes that pertain to task fulfillment and processes that pertain to the regulation of team dynamics. The team norm formation process instills a sense of responsibility amongst individual team members. Apart from that, the interviewees’ responses indicate that the realization of the obligation to work in a team contributes to the team norm formation process. The participants indicate that individual expectations are taken into consideration during the coordination of the team. The supervisor of the project team also has a crucial role in maintaining team collaboration. Coordination problems arise when an individual team member does not relate his/her academic field to the research topic of the project team. The findings indicate that leadership distribution in the project teams involves two leadership processes: leadership distribution based on processes that focus on individual team members and leadership distribution based on processes that focus on team interaction. Apart from that, individual tasks serve as a facilitator of collaboration amongst team members. Interviewees also indicate that individual tasks facilitate the expression of individuality.

Keywords: project teams in higher education, research assistant teams, team collaboration, team member autonomy

Procedia PDF Downloads 359
40825 Logistics Information Systems in the Distribution of Flour in Nigeria

Authors: Cornelius Femi Popoola

Abstract:

This study investigated logistics information systems in the distribution of flour in Nigeria. A case study design was used, and 50 staff of Honeywell Flour Mill were sampled for the study. Data generated through a questionnaire were analysed using correlation and regression analysis. The findings of the study revealed that logistics information systems such as e-commerce, interactive telephone systems and electronic data interchange positively correlated with the distribution of flour in Honeywell Flour Mill. The findings also showed that e-commerce, interactive telephone systems and electronic data interchange jointly and positively contribute to the distribution of flour in Honeywell Flour Mill in Nigeria (R = .935; Adj. R2 = .642; F (3,47) = 14.739; p < .05). The study therefore recommended that Honeywell Flour Mill should upgrade their logistics information systems to computer-to-computer communication of business transactions and documents, as well as adopt new technologies such as tracking-and-tracing systems (barcode scanning for packages and palettes), tracking vehicles with the Global Positioning System (GPS), measuring vehicle performance with ‘black boxes’ (containing logistics data), and Automatic Equipment Identification (AEI) into their systems.

Keywords: e-commerce, electronic data interchange, flour distribution, information system, interactive telephone systems

Procedia PDF Downloads 551
40824 Buy-and-Hold versus Alternative Strategies: A Comparison of Market-Timing Techniques

Authors: Jonathan J. Burson

Abstract:

With the rise of virtually costless, mobile-based trading platforms, stock market trading activity has increased significantly over the past decade, particularly for the millennial generation. This increased stock market attention, combined with the recent market turmoil due to the economic upset caused by COVID-19, make the topics of market-timing and forecasting particularly relevant. While the overall stock market saw an unprecedented, historically-long bull market from March 2009 to February 2020, the end of that bull market reignited a search by investors for a way to reduce risk and increase return. Similar searches for outperformance occurred in the early, and late 2000’s as the Dotcom bubble burst and the Great Recession led to years of negative returns for mean-variance, index investors. Extensive research has been conducted on fundamental analysis, technical analysis, macroeconomic indicators, microeconomic indicators, and other techniques—all using different methodologies and investment periods—in pursuit of higher returns with lower risk. The enormous variety of timeframes, data, and methodologies used by the diverse forecasting methods makes it difficult to compare the outcome of each method directly to other methods. This paper establishes a process to evaluate the market-timing methods in an apples-to-apples manner based on simplicity, performance, and feasibility. Preliminary findings show that certain technical analysis models provide a higher return with lower risk when compared to the buy-and-hold method and to other market-timing strategies. Furthermore, technical analysis models tend to be easier for individual investors both in terms of acquiring the data and in analyzing it, making technical analysis-based market-timing methods the preferred choice for retail investors.

Keywords: buy-and-hold, forecast, market-timing, probit, technical analysis

Procedia PDF Downloads 96
40823 Agricultural Land Suitability Analysis of Kampe-Omi Irrigation Scheme Using Remote Sensing and Geographic Information System

Authors: Olalekan Sunday Alabi, Titus Adeyemi Alonge, Olumuyiwa Idowu Ojo

Abstract:

Agricultural land suitability analysis and mapping play an imperative role in the sustainable utilization of scarce physical land resources. The objective of this study was to prepare a spatial database of physical land resources for irrigated agriculture, to assess land suitability for irrigation, and to develop a suitable area map of the study area. The study was conducted at the Kampe-Omi irrigation scheme located in Yagba West Local Government Area of Kogi State, Nigeria. Temperature and rainfall data of the study area were collected for 10 consecutive years (2005-2014). Geographic Information System (GIS) techniques were used to develop an irrigation land suitability map of the study area. Attribute parameters such as the slope, soil properties and topography of the study area were used for the analysis. The available data were arranged, a proximity analysis was performed in ArcGIS, and this resulted in five mapping units. The final agricultural land suitability map of the study area was derived after overlay analysis. Based on soil composition, slope, soil properties and topography, it was concluded that Kampe-Omi has rich sandy loam soil, which is viable for agricultural purposes; the soil composition is made up of 60% sand and 40% loam. The land-use pattern map of Kampe-Omi has vegetal areas and water bodies covering 55.6% and 19.3% of the total assessed area, respectively. The landform of Kampe-Omi is made up of 41.2% lowlands, 37.5% normal lands and 21.3% highlands. Kampe-Omi is adequately suitable for agricultural purposes, while an extra 20.2% of the area is highly suitable (making 72.6% in total) and 18.7% of the area is slightly suitable.
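
A minimal sketch of the AHP weighting and weighted overlay steps mentioned above is given below; the pairwise comparison matrix, the criteria ordering, and the reclassified rasters are illustrative assumptions, not the values used in the study.

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty's 1-9 scale) for four criteria:
# slope, soil properties, land use / land cover, topography (values are illustrative)
A = np.array([
    [1,   3,   5,   4],
    [1/3, 1,   3,   2],
    [1/5, 1/3, 1,   1/2],
    [1/4, 1/2, 2,   1],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (random index RI = 0.90 for a 4 x 4 matrix)
CI = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("criterion weights:", np.round(weights, 3), " CR =", round(CI / 0.90, 3))

# Weighted overlay: suitability = sum_i w_i * reclassified_layer_i (scores 1-4)
layers = np.random.default_rng(0).integers(1, 5, size=(4, 100, 100))
suitability = np.tensordot(weights, layers, axes=1)
```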

Keywords: remote sensing, GIS, Kampe–Omi, land suitability, mapping

Procedia PDF Downloads 210
40822 Human Digital Twin for Personal Conversation Automation Using Supervised Machine Learning Approaches

Authors: Aya Salama

Abstract:

Digital Twin is an emerging research topic that attracted researchers in the last decade. It is used in many fields, such as smart manufacturing and smart healthcare because it saves time and money. It is usually related to other technologies such as Data Mining, Artificial Intelligence, and Machine Learning. However, Human digital twin (HDT), in specific, is still a novel idea that still needs to prove its feasibility. HDT expands the idea of Digital Twin to human beings, which are living beings and different from the inanimate physical entities. The goal of this research was to create a Human digital twin that is responsible for real-time human replies automation by simulating human behavior. For this reason, clustering, supervised classification, topic extraction, and sentiment analysis were studied in this paper. The feasibility of the HDT for personal replies generation on social messaging applications was proved in this work. The overall accuracy of the proposed approach in this paper was 63% which is a very promising result that can open the way for researchers to expand the idea of HDT. This was achieved by using Random Forest for clustering the question data base and matching new questions. K-nearest neighbor was also applied for sentiment analysis.

Keywords: human digital twin, sentiment analysis, topic extraction, supervised machine learning, unsupervised machine learning, classification, clustering

Procedia PDF Downloads 85
40821 Interpreting Privacy Harms from a Non-Economic Perspective

Authors: Christopher Muhawe, Masooda Bashir

Abstract:

With increased Internet Communication Technology (ICT), the virtual world has become the new normal. At the same time, there is an unprecedented collection of massive amounts of data by both private and public entities. Unfortunately, this increase in data collection has gone hand in hand with an increase in data misuse and data breaches. Regrettably, the majority of data breach and data misuse claims have been unsuccessful in United States courts for failure to prove direct injury to physical or economic interests. The requirement to express data privacy harms from an economic or physical stance negates the fact that not all data harms are physical or economic in nature. The challenge is compounded by the fact that data breach harms and risks do not attach immediately. This research will use a descriptive and normative approach to show that not all data harms can be expressed in economic or physical terms. Expressing privacy harms purely from an economic or physical harm perspective negates the fact that data insecurity may result in harms which run counter to the functions of privacy in our lives: the promotion of liberty, selfhood, autonomy and human social relations, and the furtherance of the existence of a free society. There is no economic value that can be placed on these functions of privacy. The proposed approach addresses data harms from a psychological and social perspective.

Keywords: data breach and misuse, economic harms, privacy harms, psychological harms

Procedia PDF Downloads 195
40820 Sparse Principal Component Analysis: A Least Squares Approximation Approach

Authors: Giovanni Merola

Abstract:

Sparse Principal Components Analysis aims to find principal components with few non-zero loadings. We derive such sparse solutions by adding a genuine sparsity requirement to the original Principal Components Analysis (PCA) objective function. This approach differs from others because it preserves PCA's original optimality: uncorrelatedness of the components and least squares approximation of the data. To identify the best subset of non-zero loadings we propose a branch-and-bound search and an iterative elimination algorithm. This last algorithm finds sparse solutions with large loadings and can be run without specifying the cardinality of the loadings and the number of components to compute in advance. We give thorough comparisons with the existing sparse PCA methods and several examples on real datasets.

Keywords: SPCA, uncorrelated components, branch-and-bound, backward elimination

Procedia PDF Downloads 379
40819 Performance Analysis of PAPR Reduction in OFDM Systems based on Partial Transmit Sequence (PTS) Technique

Authors: Alcardo Alex Barakabitze, Tan Xiaoheng

Abstract:

Orthogonal Frequency Division Multiplexing (OFDM) is a special case of the Multi-Carrier Modulation (MCM) technique which transmits a stream of data over a number of lower data rate subcarriers. OFDM splits the total transmission bandwidth into a number of orthogonal and non-overlapping subcarriers and transmits the collection of bits called symbols in parallel using these subcarriers. This paper explores Peak-to-Average Power Ratio (PAPR) reduction using the Partial Transmit Sequence technique. We provide the distribution analysis and the basics of OFDM signals and then show how the PAPR increases as the number of subcarriers increases. We provide a performance analysis of the CCDF and of the PAPR expressed in decibels through MATLAB simulations. The simulation results show that, with the PTS technique, the performance of PAPR reduction in OFDM systems improves significantly as the number of sub-blocks increases. However, keeping the same sub-block partitioning, oversampling factor and number of OFDM blocks iterated for generating the CCDF, OFDM systems with 128 subcarriers show better PAPR reduction performance than OFDM systems with 256, 512 or more subcarriers.
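
A minimal numpy sketch of the PTS idea is given below: the subcarriers are split into interleaved sub-blocks, each block is rotated by a phase factor from {±1, ±j}, and the combination with the lowest PAPR is kept. Oversampling, which the paper uses for accurate CCDF estimation, is omitted here for brevity.

```python
import numpy as np
from itertools import product

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def pts_papr(X, n_blocks=4, phases=(1, -1, 1j, -1j)):
    """Rotate interleaved sub-blocks by phase factors and keep the best combination."""
    N = len(X)
    blocks = np.zeros((n_blocks, N), dtype=complex)
    for v in range(n_blocks):
        blocks[v, v::n_blocks] = X[v::n_blocks]        # interleaved partitioning
    time_blocks = np.fft.ifft(blocks, axis=1)
    best = np.inf
    for b in product(phases, repeat=n_blocks - 1):     # first block phase fixed to 1
        x = time_blocks[0] + sum(p * tb for p, tb in zip(b, time_blocks[1:]))
        best = min(best, papr_db(x))
    return best

N = 128                                                # number of subcarriers
rng = np.random.default_rng(0)
X = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=N)   # QPSK symbols
print("original PAPR (dB):", round(papr_db(np.fft.ifft(X)), 2))
print("PTS PAPR (dB):     ", round(pts_papr(X), 2))
```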

Keywords: OFDM, peak-to-average power ratio (PAPR), bit error rate (BER), subcarriers, wireless communications

Procedia PDF Downloads 513
40818 Direct and Indirect Effects of Childhood Traumas, Emotion Regulation Difficulties and Age on Tendency to Violence

Authors: Selin Kara-Bahçekapılı, Bengisu Nehir Aydın

Abstract:

Objective: This study aims to examine the relationships between childhood traumas (overprotection-control, emotional/physical/sexual abuse, emotional/physical neglect), age, emotion regulation difficulties, and the tendency to violence in adults. The direct and indirect effects of the 6 sub-factors of childhood traumas, emotion regulation difficulties, and age on the tendency to violence are evaluated within a theoretically derived model. Method: The population of this cross-sectional study consists of individuals between the ages of 18-65 living in Turkey. Data from 527 participants were obtained by online surveys and the convenience sampling method within the scope of the study. After applying the exclusion criteria and then an outlier data analysis, the data of 443 participants were included in the analysis. Data were collected with a demographic information form, a childhood trauma scale, an emotion regulation difficulties scale, and a violence tendency scale. The research data were analyzed in SPSS and AMOS using correlation, path analysis, and direct and indirect effects. Results: According to the research findings, the variables in the model explained 28.2% of the variance of the mean scores of the individuals' tendency to violence. Emotion regulation difficulties have the largest direct effect on the tendency to violence (d=.387; p<.01). The effects of overprotection-control, emotional neglect, and physical neglect on the tendency to violence are not significant. When the significant indirect effects of the variables on the tendency to violence through emotion regulation difficulties are examined, age has a negative effect, emotional neglect a positive effect, emotional abuse a positive effect, and overprotection-control a positive effect. The indirect effects of sexual abuse, physical neglect, and physical abuse on the tendency to violence are not significant. Childhood traumas and age explained 24.1% of the variance of the mean scores of the individuals’ emotion regulation difficulties. The variable that most affects emotion regulation difficulties is age (d=-.268; p<.001). The direct effects of sexual abuse, physical neglect, and physical abuse on emotion regulation difficulties are not significant. Conclusion: The results of the research emphasize the critical role of difficulties in emotion regulation in the tendency to violence. Difficulties in emotion regulation affect the tendency to violence both directly and through different mediating variables. In addition, some sub-factors of childhood traumas have direct and/or indirect effects on the tendency to violence. Emotional abuse and age have both direct and indirect effects on the tendency to violence through emotion regulation difficulties.

Keywords: childhood trauma, emotion regulation difficulties, tendency to violence, path analysis

Procedia PDF Downloads 95
40817 Analysis of the Unmanned Aerial Vehicles’ Incidents and Accidents: The Role of Human Factors

Authors: Jacob J. Shila, Xiaoyu O. Wu

Abstract:

As the applications of unmanned aerial vehicles (UAV) continue to increase across the world, it is critical to understand the factors that contribute to incidents and accidents associated with these systems. Given the variety of daily applications that could utilize the operations of the UAV (e.g., medical, security operations, construction activities, landscape activities), the main discussion has been how to safely incorporate the UAV into the national airspace system. The types of UAV incidents being reported range from near sightings by other pilots to actual collisions with aircraft or UAV. These incidents have the potential to impact the rest of aviation operations in a variety of ways, including human lives, liability costs, and delay costs. One of the largest causes of these incidents cited is the human factor; other causes cited include maintenance, aircraft, and others. This work investigates the key human factors associated with UAV incidents. To that end, the data related to UAV incidents that have occurred in the United States is both reviewed and analyzed to identify key human factors related to UAV incidents. The data utilized in this work is gathered from the Federal Aviation Administration (FAA) drone database. This study adopts the human factor analysis and classification system (HFACS) to identify key human factors that have contributed to some of the UAV failures to date. The uniqueness of this work is the incorporation of UAV incident data from a variety of applications and not just military data. In addition, identifying the specific human factors is crucial towards developing safety operational models and human factor guidelines for the UAV. The findings of these common human factors are also compared to similar studies in other countries to determine whether these factors are common internationally.

Keywords: human factors, incidents and accidents, safety, UAS, UAV

Procedia PDF Downloads 241
40816 1/Sigma Term Weighting Scheme for Sentiment Analysis

Authors: Hanan Alshaher, Jinsheng Xu

Abstract:

Large amounts of data on the web can provide valuable information. For example, product reviews help business owners measure customer satisfaction. Sentiment analysis classifies texts into two polarities: positive and negative. This paper examines movie reviews and tweets using a new term weighting scheme, called one-over-sigma (1/sigma), on benchmark datasets for sentiment classification. The proposed method aims to improve the performance of sentiment classification. The results show that 1/sigma is more accurate than the popular term weighting schemes. In order to verify if the entropy reflects the discriminating power of terms, we report a comparison of entropy values for different term weighting schemes.

Keywords: 1/sigma, natural language processing, sentiment analysis, term weighting scheme, text classification

Procedia PDF Downloads 199
40815 Multimedia Container for Autonomous Car

Authors: Janusz Bobulski, Mariusz Kubanek

Abstract:

The main goal of the research is to develop a multimedia container structure containing three types of images: RGB, lidar and infrared, properly calibrated to each other. An additional goal is to develop program libraries for creating and saving this type of file and for restoring it. It will also be necessary to develop a method of synchronizing the data from the lidar, RGB and infrared cameras. This type of file could be used in autonomous vehicles, which would certainly facilitate data processing by the intelligent autonomous vehicle management system. Autonomous cars are increasingly breaking into our consciousness. No one seems to have any doubts that self-driving cars are the future of motoring. Manufacturers promise that the first of them will reach showrooms within the next few years. Many experts believe that creating a network of communicating autonomous cars will make it possible to completely eliminate accidents. However, to make this possible, it is necessary to develop effective methods of detecting objects around the moving vehicle. In bad weather conditions, this task is difficult on the basis of the RGB (red, green, blue) image alone. Therefore, in such situations, the vehicle should be supported by information from other sources, such as lidar or infrared cameras. The problem is the different data formats that the individual types of devices return. In addition to these differences, there are problems with synchronizing and formatting these data. The goal of the project is to develop a file structure that can contain different types of data; this type of file is called a multimedia container. A multimedia container is a container that holds many data streams, which allows complete multimedia material to be stored in one file. The data streams located in such a container include streams of images, video and sound, as well as subtitles and additional information, i.e., metadata. As shown by preliminary studies, combining RGB and infrared images with lidar data allows for easier data analysis. Thanks to this approach, it will be possible to display the distance to an object in a colour photo. Such information can be very useful for drivers and for systems in autonomous cars.
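
The authors' container format is not specified in the abstract; the sketch below is only a minimal Python illustration of the idea: a container holding calibrated RGB, infrared, and lidar frames with a common timestamp, plus a nearest-timestamp helper for synchronizing the streams.

```python
from dataclasses import dataclass, field
from typing import Dict, List
import numpy as np

@dataclass
class SynchronizedFrame:
    """One time-aligned record of the three calibrated modalities."""
    timestamp: float        # seconds on a common clock
    rgb: np.ndarray         # H x W x 3 colour image
    infrared: np.ndarray    # H x W thermal image
    lidar: np.ndarray       # N x 4 point cloud (x, y, z, intensity)

@dataclass
class MultimediaContainer:
    metadata: Dict = field(default_factory=dict)            # calibration, sensor info
    frames: List[SynchronizedFrame] = field(default_factory=list)

    def add(self, frame: SynchronizedFrame):
        self.frames.append(frame)

    def save(self, path: str):
        arrays = {f"frame_{i}_{k}": getattr(f, k)
                  for i, f in enumerate(self.frames)
                  for k in ("timestamp", "rgb", "infrared", "lidar")}
        np.savez_compressed(path, meta=np.array([self.metadata], dtype=object), **arrays)

def nearest_index(timestamps, t):
    """Pair a frame at time t with the closest sample of another stream."""
    return int(np.argmin(np.abs(np.asarray(timestamps) - t)))
```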

Keywords: an autonomous car, image processing, lidar, obstacle detection

Procedia PDF Downloads 223
40814 Accurate Calculation of the Penetration Depth of a Bullet Using ANSYS

Authors: Eunsu Jang, Kang Park

Abstract:

In developing an armored ground combat vehicle (AGCV), it is a very important step to analyze the vulnerability (or the survivability) of the AGCV against the enemy’s attack. In the vulnerability analysis, penetration equations are usually used to obtain the penetration depth and check whether a bullet can penetrate the armor of the AGCV, which would cause damage to internal components or crews. The penetration equations are derived from penetration experiments, which require a long time and great effort. However, they usually hold only for the specific target material and the specific type of bullet used in the experiments. Thus, penetration simulation using ANSYS can be another option to calculate the penetration depth. However, it is very important to model the targets and select the input parameters carefully in order to get an accurate penetration depth. This paper performed a sensitivity analysis of the input parameters of ANSYS with respect to the accuracy of the calculated penetration depth. Two conflicting objectives need to be achieved in adopting ANSYS in penetration analysis: maximizing the accuracy of the calculation and minimizing the calculation time. To maximize the calculation accuracy, a sensitivity analysis of the input parameters for ANSYS was performed and the RMS error with respect to the experimental data was calculated. The input parameters, which include mesh size, boundary conditions, material properties and target diameter, were tested and selected to minimize the error between the calculated results from the simulation and the experimental data from papers on the penetration equation. To minimize the calculation time, the parameter values obtained from the accuracy analysis were adjusted to get optimized overall performance. As a result of the analysis, the following was found: 1) As the mesh size gradually decreases from 0.9 mm to 0.5 mm, both the penetration depth and the calculation time increase. 2) As the diameter of the target decreases from 250 mm to 60 mm, both the penetration depth and the calculation time decrease. 3) As the yield stress, which is one of the material properties of the target, decreases, the penetration depth increases. 4) The boundary condition with only the side surface of the target fixed gives more penetration depth than that with both the side and rear surfaces fixed. Using the above findings, the input parameters can be tuned to minimize the error between simulation and experiment. By using the simulation tool ANSYS with delicately tuned input parameters, penetration analysis can be done on a computer without actual experiments. Penetration experiment data are usually hard to get for security reasons, and published papers provide them only for a limited range of target materials. The next step of this research is to generalize this approach to anticipate the penetration depth by interpolating the known penetration experiments. This result may not be accurate enough to replace the penetration experiments, but such simulations can be used in the modelling and simulation stage, early in the design process of an AGCV.

Keywords: ANSYS, input parameters, penetration depth, sensitivity analysis

Procedia PDF Downloads 399
40813 Suitable Site Selection of Small Dams Using Geo-Spatial Technique: A Case Study of Dadu Tehsil, Sindh

Authors: Zahid Khalil, Saad Ul Haque, Asif Khan

Abstract:

Decision making about identifying suitable sites for any project by considering different parameters is difficult. Using GIS and Multi-Criteria Analysis (MCA) can make it easier for such projects. This technology has proved to be efficient and adequate in acquiring the desired information. In this study, GIS and MCA were employed to identify suitable sites for small dams in Dadu Tehsil, Sindh. The GIS software was used to create all the spatial parameters for the analysis. The parameters derived are slope, drainage density, rainfall, land use / land cover, soil groups, Curve Number (CN) and runoff index, with a spatial resolution of 30 m. The data used for deriving the above layers include the 30-meter resolution SRTM DEM, Landsat 8 imagery, rainfall from the National Centers for Environmental Prediction (NCEP) and soil data from the World Harmonized Soil Data (WHSD). The land use/land cover map is derived from Landsat 8 using supervised classification. Slope, drainage network and watershed are delineated by terrain processing of the DEM. The Soil Conservation Service (SCS) method is implemented to estimate the surface runoff from the rainfall. Prior to this, the SCS-CN grid is developed by integrating the soil and land use/land cover rasters. These layers, with some technical and ecological constraints, are assigned weights on the basis of suitability criteria. The pairwise comparison method, also known as the Analytic Hierarchy Process (AHP), is used as the MCA for assigning weights to each decision element. All the parameters and groups of parameters are integrated using weighted overlay in a GIS environment to produce the suitable sites for dams. The resultant layer is then classified into four classes, namely best suitable, suitable, moderate and less suitable. This study contributes to decision-making about suitable site analysis for small dams using geospatial data with a minimal amount of ground data. These suitability maps can be helpful for water resource management organizations in the determination of feasible rainwater harvesting (RWH) structures.
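
The runoff layer mentioned above follows the standard SCS curve number relation; a minimal sketch in SI units is given below, applied cell-wise to illustrative curve numbers (in the study this grid would feed the AHP weighted overlay).

```python
import numpy as np

def scs_runoff_mm(P, CN):
    """SCS curve number direct runoff (mm) for a rainfall depth P (mm)."""
    P = np.asarray(P, dtype=float)
    S = 25400.0 / np.asarray(CN, dtype=float) - 254.0   # potential maximum retention
    Ia = 0.2 * S                                        # initial abstraction
    return np.where(P > Ia, (P - Ia) ** 2 / (P + 0.8 * S), 0.0)

# Example: a 60 mm storm over cells with different curve numbers
print(scs_runoff_mm(60.0, CN=np.array([65, 80, 92])))
```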

Keywords: Remote sensing, GIS, AHP, RWH

Procedia PDF Downloads 389
40812 A Study of Predicting Judgments on Causes of Online Privacy Invasions: Based on U.S Judicial Cases

Authors: Minjung Park, Sangmi Chai, Myoung Jun Lee

Abstract:

Since there are growing concerns about online privacy, enterprises can become involved in various personal privacy infringement cases with resulting legal consequences. For companies involved in online business, it is important to pay extra attention to protecting users’ privacy. If firms are aware of the consequences of possible online privacy invasion cases, they can more actively prevent future online privacy infringements. This study attempts to predict the probability of ruling types caused by various invasion cases under the U.S. Personal Privacy Act. More specifically, this research explores online privacy invasion cases in which the defendant was found guilty, to identify types of criminal punishment such as penalties, imprisonment and probation, as well as compensation in civil cases. Based on 853 U.S. judicial cases related to data privacy, ranging from January 2000 to May 2016, this research examines the relationship between personal information infringement cases and adjudications. Through the analysis of 41,724 words extracted from the 853 legal cases, this study examined online users’ privacy invasion cases to predict the probability of conviction for a firm as an offender under both criminal and civil law. This research specifically examines the relationship between the cause of a privacy infringement and the judgment type, i.e., whether it leads to civil or criminal liability, in U.S. courts. This study applies network text analysis (NTA) for the data analysis, which is regarded as a useful method to discover embedded social trends within texts. According to our research results, for certain online privacy infringement cases caused by online spamming and adware, there is a high probability that firms will be found liable. Our research results provide meaningful insights to academia as well as industry. First, our study provides new insight by applying big data analytics to legal cases so that it can predict the cause of invasions and the legal consequences. Since there are few studies applying big data analytics in the domain of law, and specifically in online privacy, this study suggests a new area that future studies can explore. Secondly, this study reflects social influences, such as the development of privacy invasion technologies and changes in users’ level of awareness of online privacy, in the analysis of judicial cases by adopting the NTA method. Our research results indicate that firms need to improve technical and managerial systems to protect users’ online privacy and to avoid negative legal consequences.

Keywords: network text analysis, online privacy invasions, personal information infringements, predicting judgements

Procedia PDF Downloads 228
40811 The Effects of Virtual Reality Technology in Maternity Delivery: A Systematic Review and Meta-Analysis

Authors: Nuo Xu, Sijing Chen

Abstract:

Background: Childbirth is considered a critical traumatic event in our lives, positively or negatively impacting the mother's physiology and psychology, and even the whole family. Adverse birth experiences, such as labor pain, anxiety, and fear, can negatively impact the mother. Studies have shown that the immersive nature of VR can distract attention from pain and increase the focus on interventions for pain relief. However, the existing studies that applied VR to maternal delivery are still in their infancy and show disparate results, and their small sample sizes are not representative, so this review analyzed the effects of VR in labor, such as on maternal pain and anxiety, with a view to providing a basis for future applications. Search strategy: We searched PubMed, Embase, Web of Science, the Cochrane Library, CINAHL, the China National Knowledge Infrastructure, and the Wan-Fang database from inception to November 17, 2021. Selection Criteria: Randomized controlled trials (RCTs) in which VR technology was used as an intervention for pregnant women aged 18-35 years, with gestation >34 weeks and without complications, were included in this review. Data Collection and Analysis: Two researchers completed the study selection, data extraction, and assessment of study quality. For quantitative data we used the MD or SMD, and the RR (risk ratio) for qualitative data. A random-effects model and 95% confidence intervals (95% CI) were used. Main Results: 12 studies were included. Using VR could relieve pain during labor (MD=-1.81, 95% CI (-2.04, -1.57), P< 0.00001) and the active period (SMD=-0.41, 95% CI (-0.68, -0.14), P= 0.003), reduce anxiety (SMD=-1.39, 95% CI (-1.99, -0.78), P< 0.00001) and improve satisfaction (RR = 1.32; 95% CI (1.10, 1.59); P = 0.003), but the effect on the duration of the first (SMD=-1.12, 95% CI (-2.38, 0.13), P=0.08) and second (SMD=-0.22, 95% CI (-0.67, 0.24), P=0.35) stages of labor was not statistically significant. Conclusions: Compared with conventional care, VR technology can relieve labor pain and anxiety and improve satisfaction. However, extensive experimental validation is still needed.
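
For readers unfamiliar with the pooling behind figures such as MD=-1.81 (95% CI -2.04 to -1.57), the sketch below shows standard DerSimonian-Laird random-effects pooling of mean differences; the three trial values are illustrative, not the review's data.

```python
import numpy as np

def random_effects_md(md, se):
    """DerSimonian-Laird random-effects pooling of mean differences."""
    md, se = np.asarray(md, float), np.asarray(se, float)
    w = 1.0 / se**2                                   # fixed-effect weights
    md_fe = np.sum(w * md) / np.sum(w)
    Q = np.sum(w * (md - md_fe) ** 2)
    df = len(md) - 1
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (se**2 + tau2)                       # random-effects weights
    pooled = np.sum(w_re * md) / np.sum(w_re)
    se_pooled = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled), tau2

# Illustrative labour-pain mean differences (VR minus control) from three trials
pooled, ci, tau2 = random_effects_md(md=[-1.9, -1.6, -2.0], se=[0.25, 0.30, 0.40])
print(round(pooled, 2), tuple(round(c, 2) for c in ci), round(tau2, 3))
```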

Keywords: virtual reality, delivery, labor pain, anxiety, meta-analysis, systematic review

Procedia PDF Downloads 91
40810 Case Study Approach Using Scenario Analysis to Analyze Unabsorbed Head Office Overheads

Authors: K. C. Iyer, T. Gupta, Y. M. Bindal

Abstract:

Head office overhead (HOOH) is an indirect cost and is recovered through individual project billings by the contractor. Delay in a project impacts the absorption of the HOOH cost allocated to that particular project and thus diminishes the expected profit of the contractor. This unabsorbed HOOH cost is later claimed by contractors as damages. The subjective nature of the available formulae to compute unabsorbed HOOH is the difficulty that contractors and owners face, and they thus dispute it. The paper attempts to bring together the rationale of the various HOOH formulae by gathering a contractor’s HOOH cost data on all of its projects, using a case study approach, and comparing the variations in the values of HOOH using scenario analysis. The case study approach uses project data collected from four construction projects of a contractor in India to calculate unabsorbed HOOH costs with the various available formulae. Scenario analysis provides further variations in HOOH values after considering two independent situations, namely scope changes and new projects during the delay period. Interestingly, one of the findings in this study reveals that, in spite of HOOH getting absorbed by additional works available during the period of delay, a few formulae depict an increase in the value of unabsorbed HOOH, neglecting any absorption by the increase in scope. This indicates that these formulae are inappropriate for use in case of a change to the scope of work. Results of this study can help both parties decide on an appropriate formula more objectively, considering the events on a project causing the delay and the contractor's position in respect of obtaining new projects.
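
The abstract does not name the formulae compared, but one widely cited formula for unabsorbed HOOH is the Eichleay formula; the sketch below illustrates its arithmetic with purely hypothetical figures and is not a statement of the authors' method.

```python
def eichleay_unabsorbed_hooh(contract_billings, total_billings, total_hooh,
                             actual_days, delay_days):
    """Eichleay-style estimate of unabsorbed head office overhead.

    All figures refer to the contractor's records over the contract period;
    the values used below are purely hypothetical.
    """
    allocable = (contract_billings / total_billings) * total_hooh   # HOOH allocated to the project
    daily_rate = allocable / actual_days                            # daily allocable overhead
    return daily_rate * delay_days                                  # unabsorbed HOOH claimed

# A project billing 20 of the firm's 100 units of work, 12 units of firm-wide HOOH,
# 400 days of actual performance and 60 compensable delay days
print(eichleay_unabsorbed_hooh(20.0, 100.0, 12.0, 400, 60))  # -> 0.36
```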

Keywords: absorbed and unabsorbed overheads, head office overheads, scenario analysis, scope variation

Procedia PDF Downloads 163
40809 Role of Inherited Structures during Inversion Tectonics: An Example from Tunisia, North Africa

Authors: Aymen Arfaoui, Abdelkader Soumaya, Ali Kadri, Noureddine Ben Ayed

Abstract:

The Tunisian dorsal backland is located on the Eastern Atlas side of the Maghrebides (North Africa). The analysis of field data collected in the Rouas and Ruissate mountains area allowed us to develop new interpretations of its structural framework. Our kinematic analysis of fault-slip data reveals the presence of an extensional tectonic regime with NE-SW Shmin characterizing the Mesozoic times. In addition, geophysical data show that the synsedimentary normal faulting is accompanied by thickness variations of the sedimentary sequences and Triassic salt movements. Then, after the Eurasia-Africa plate convergence during the Eocene, compressive tectonic deformations affected and reactivated the inherited NW-SE and N-S trending normal faults as dextral strike-slip and reverse faults, respectively. This tectonic inversion, with a compressional to transpressional tectonic regime and NW-SE SHmax, continued during the successive shortening phases of the upper Miocene and Quaternary. The geometry of the Rouas and Ruissate belt is expressed as a fault propagation fold affecting Jurassic and Cretaceous deposits. The Triassic evaporites constitute the decollement levels, facilitating the detachment and deformation of the sedimentary cover. The backland of this thrust belt is defined by NNE-SSW trending imbrication features that are controlled by a N-S basement fault.

Keywords: Tunisian dorsal backland, fault slip data, synsedimentary faults, tectonic inversion, decollement level, fault propagation fold

Procedia PDF Downloads 139
40808 Intelligent Human Pose Recognition Based on EMG Signal Analysis and Machine 3D Model

Authors: Si Chen, Quanhong Jiang

Abstract:

Posture recognition technology is increasingly mature, and human movement information is now widely used in sports rehabilitation, human-computer interaction, medical health, human posture assessment, and other fields. This project goes back to the most basic ideas: it proposes to use acquisition equipment to collect myoelectric (EMG) data, to map muscle posture changes onto a degree of freedom through data processing, to jointly adjust the data and a three-dimensional muscle model, and to realize basic pose recognition. Based on this, bionic aids or medical rehabilitation equipment can be further developed with the help of robotic arms and cutting-edge technology, a direction with a bright future and unlimited development space.

Keywords: pose recognition, 3D animation, electromyography, machine learning, bionics

Procedia PDF Downloads 76
40807 A Review on 3D Smart City Platforms Using Remotely Sensed Data to Aid Simulation and Urban Analysis

Authors: Slim Namouchi, Bruno Vallet, Imed Riadh Farah

Abstract:

3D urban models provide powerful tools for decision making, urban planning, and smart city services. The accuracy of these 3D-based systems is directly related to the quality of the models. Since manual large-scale modeling, such as of cities or countries, is a highly time-intensive and very expensive process, fully automatic 3D building generation is needed. However, the result of the 3D modeling process depends on the input data, the properties of the captured objects, and the required characteristics of the reconstructed 3D model. Nowadays, producing a 3D real-world model is no longer a problem. Remotely sensed data have experienced a remarkable increase in recent years, especially data acquired using unmanned aerial vehicles (UAV). As scanning techniques develop, the amount of captured data grows and the resolution becomes more precise. This paper presents a literature review which aims to identify different methods of automatic 3D building extraction, either from LiDAR or from the combination of LiDAR and satellite or aerial images. Then, we present open-source technologies and data models (e.g., CityGML, PostGIS, Cesiumjs) used to integrate these models into geospatial base layers for smart city services.

Keywords: CityGML, LiDAR, remote sensing, GIS, smart city, 3D urban modeling

Procedia PDF Downloads 134