Search results for: crash data
21832 An Inverse Approach for Determining Creep Properties from a Miniature Thin Plate Specimen under Bending
Authors: Yang Zheng, Wei Sun
Abstract:
This paper describes a new approach that can be used to interpret the experimental creep deformation data obtained from a miniaturized thin-plate bending specimen test into the corresponding uniaxial data, based on an inverse application of the reference stress method. The geometry of the thin plate is fully defined by the span of the support, l, the width, b, and the thickness, d. Firstly, analytical solutions for the steady-state, load-line creep deformation rate of thin plates obeying Norton’s power law under plane-stress (b → 0) and plane-strain (b → ∞) conditions were obtained, from which it can be seen that the load-line deformation rate of the thin plate under plane-stress conditions is much higher than that under plane-strain conditions. Since analytical solutions are not available for plates with arbitrary b-values, finite element (FE) analyses were used to obtain them. Based on the FE results obtained for various b/l ratios and creep exponents, n, as well as the analytical solutions under plane-stress and plane-strain conditions, approximate numerical solutions for the deformation rate were obtained by curve fitting. Using these solutions, the reference stress method is utilised to establish the conversion relationships between the applied load and the equivalent uniaxial stress, and between the creep deformations of the thin plate and the equivalent uniaxial creep strains. Finally, the accuracy of the empirical solution was assessed by using a set of “theoretical” experimental data.
Keywords: bending, creep, thin plate, materials engineering
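For orientation, Norton's power law and the linear load-to-stress conversion that the reference stress method provides are conventionally written as below; the conversion factors γ and β are placeholders for illustration, not the paper's notation, and depend on the geometry (b/l) and the creep exponent n.

$$\dot{\varepsilon}^{c} = A\,\sigma^{n}, \qquad \sigma_{\mathrm{ref}} = \gamma\,\frac{P}{b\,d}, \qquad \dot{\varepsilon}^{c}_{\mathrm{ref}} = \frac{\dot{\Delta}}{\beta\,l}$$

Inverting these relationships is what maps a measured load-line deformation rate back to an equivalent uniaxial creep strain rate.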
Procedia PDF Downloads 474
21831 Analyzing the Commentator Network Within the French YouTube Environment
Authors: Kurt Maxwell Kusterer, Sylvain Mignot, Annick Vignes
Abstract:
To the best of our knowledge, YouTube is the largest video hosting platform in the world. A high number of creators, viewers, subscribers and commentators act in this specific ecosystem, which generates huge sums of money. Views, subscribers, and comments help to increase the popularity of content creators. The most popular creators are sponsored by brands and participate in marketing campaigns; for a few of them, this becomes a financially rewarding profession. This is made possible through the YouTube Partner Program, which shares revenue among creators based on their popularity. We believe that the role of comments in increasing popularity deserves emphasis. In what follows, YouTube is considered as a bilateral network between videos and commentators. Analyzing a detailed data set focused on French YouTubers, we consider each comment as a link between a commentator and a video. Our research question asks which predominant features of a video give it the highest probability of being commented on. Following on from this question, how can we use these features to predict the action of an agent in commenting on one video instead of another, considering the characteristics of the commentators, videos, topics, channels, and recommendations? We expect to see that the videos of more popular channels generate higher viewer engagement and thus are more frequently commented on. The interest lies in discovering features which have not classically been considered as markers for popularity on the platform. A quick view of our data set shows that 96% of the commentators comment only once on a given video. Thus, we study a non-weighted bipartite network between commentators and videos built on the sub-sample of 96% of unique comments. A link exists between two nodes when a commentator makes a comment on a video. We run an Exponential Random Graph Model (ERGM) approach to evaluate which characteristics influence the probability of commenting on a video. The creation of a link will be explained in terms of common video features, such as duration, quality, number of likes, number of views, etc. Our data are relevant for the period 2020-2021 and focus on the French YouTube environment. From this set of 391,588 videos, we extract the channels which can be monetized according to YouTube regulations (channels with at least 1,000 subscribers and more than 4,000 hours of viewing time during the last twelve months). In the end, we have a data set of 128,462 videos spanning 4,093 channels. Based on these videos, we have a data set of 1,032,771 unique commentators, with a mean of 2 comments per commentator, a minimum of 1 comment each, and a maximum of 584 comments.
Keywords: YouTube, social networks, economics, consumer behaviour
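As a hedged sketch of the data structure described (not the authors' code, and with hypothetical identifiers), the bipartite commentator-video network can be assembled with networkx before any ERGM estimation:

```python
import networkx as nx

# Hypothetical comment records as (commentator_id, video_id) pairs, already
# restricted to the 96% of commentators who comment once per video.
comments = [("u1", "v1"), ("u2", "v1"), ("u2", "v2"), ("u3", "v3")]

G = nx.Graph()
G.add_nodes_from({u for u, _ in comments}, kind="commentator")
G.add_nodes_from({v for _, v in comments}, kind="video")
G.add_edges_from(comments)  # one unweighted link per unique comment

# Per-video engagement: degree = number of distinct commentators
video_degree = {n: d for n, d in G.degree() if G.nodes[n]["kind"] == "video"}
print(video_degree)  # {'v1': 2, 'v2': 1, 'v3': 1}
```

Degree counts like these would then enter the ERGM as node-level covariates alongside video features such as duration or number of views.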
Procedia PDF Downloads 68
21830 Production Increase of C-Central Wells Bahr Essalam-Libya
Authors: Emed Krekshi, Walid Ben Husein
Abstract:
The Bahr Essalam gas-condensate field is located off the Libyan coast and is currently operated by Mellitah Oil and Gas (MOG). Gas and condensate are produced from the Bahr Essalam reservoir through a mixture of platform and subsea wells, with the subsea wells being gathered at the western manifolds and delivered to the Sabratha platform via a 22-inch pipeline. Gas is gathered and dehydrated on the Sabratha platform and then delivered to the Mellitah gas plant via an existing 36-inch gas export pipeline. The condensate separated on the Sabratha platform is delivered to the Mellitah gas plant via an existing 10-inch export pipeline. The Bahr Essalam Phase II project includes two production wells (CC16 and CC17) at C-Central A connected to the Sabratha platform via a new 10.9 km long 10”/14” production pipeline. Production rates from CC16 and CC17 have exceeded the maximum planned rate of 40 MMSCFD per well. A hydrothermal analysis was conducted to review and verify the input data, focusing on the variation of flowing wellhead pressure as a function of flowrate, and to review the available input data against the previous design input data to determine the extent of change. The steady-state and transient simulations performed with OLGA yielded coherent results and confirmed the possibility of achieving flow rates of up to 60 MMSCFD per well without exceeding the design temperatures, pressures, and velocities.
Keywords: Bahr Essalam, Mellitah Oil and Gas, production flow rates, steady and transient
Procedia PDF Downloads 58
21829 The Effect of Change Communication towards Commitment to Change through the Role of Organizational Trust
Authors: Enno R. Farahzehan, Wustari L. Mangundjaya
Abstract:
Organizational change is necessary to develop innovation and to compete with other competitors. Organizational changes are also made to defend the existence of the organization itself. Success in implementing organizational change depends on a variety of factors, one of which is the individuals (employees) who carry out the changes. Employees must have the willingness and ability to carry out the changes. Besides, employees must also have a commitment to change for successful organizational change to be created. This study aims to examine the effect of change communication on commitment to change through the role of organizational trust. The respondents of this study were employees who work in organizations which have undergone or are currently undergoing organizational changes. The data were collected using the Change Communication, Commitment to Change, and Organizational Trust Inventory instruments. The data were analyzed using regression. The results showed that change communication has an effect on commitment to change, and that this effect is stronger when mediated by organizational trust. This paper contributes to the knowledge and implications of organizational change by showing that change communication can affect commitment to change among employees if there is trust in the organization.
Keywords: change communication, commitment to change, organizational trust, organizational change
Procedia PDF Downloads 343
21828 The Classification Accuracy of Finance Data through Holder Functions
Authors: Yeliz Karaca, Carlo Cattani
Abstract:
This study focuses on the local Holder exponent as a measure of function regularity for time series related to finance data. The attributes of a finance dataset covering 13 countries (India, China, Japan, Sweden, France, Germany, Italy, Australia, Mexico, United Kingdom, Argentina, Brazil, USA) located on 5 different continents (Asia, Europe, Australia, North America and South America) have been examined. These countries are the ones most affected by the attributes related to financial development, covering the period from 2012 to 2017. Our study is concerned with the most important attributes that have an impact on the financial development of the countries identified. Our method comprises the following stages: (a) among the multifractal methods and Brownian motion Holder regularity functions (polynomial, exponential), significant and self-similar attributes were identified; (b) the significant and self-similar attributes were applied to Artificial Neural Network (ANN) algorithms (Feed Forward Back Propagation (FFBP) and Cascade Forward Back Propagation (CFBP)); (c) the classification accuracy outcomes were compared with respect to the attributes that affect the countries’ financial development. This study has made it possible to reveal, through the application of ANN algorithms, how the most significant attributes are identified within the relevant dataset via the Holder functions (polynomial and exponential).
Keywords: artificial neural networks, finance data, Holder regularity, multifractals
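For context, the standard definition of the quantity the study relies on (not the paper's own formulation): a function f has local Holder exponent h(x₀) at a point x₀ when

$$|f(x) - P_{x_0}(x - x_0)| \le C\,|x - x_0|^{\alpha}$$

holds for some constant C and polynomial P of degree less than α, and

$$h(x_0) = \sup\{\alpha > 0 : f \in C^{\alpha}(x_0)\}.$$

A smaller h(x₀) indicates rougher, less regular local behaviour of the time series, which is what makes it usable as a feature for classification.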
Procedia PDF Downloads 246
21827 3D Property Modelling of the Lower Acacus Reservoir, Ghadames Basin, Libya
Authors: Aimen Saleh
Abstract:
The Silurian Lower Acacus sandstone is one of the main reservoirs in northwest Libya. Our aim in this study is to gain a robust understanding of the hydrocarbon potential and distribution in the area. To date, the depositional environment of the Lower Acacus reservoir is still open to discussion and contradiction. Hence, building a three-dimensional (3D) property model is one way to support the analysis and description of the reservoir, its properties and its characterization, and this is of great value in this project. The 3D model integrates different data sets, incorporating well log data, petrophysical reservoir properties and seismic data. The finalized depositional environment model of the Lower Acacus concludes that the area is located in a deltaic transitional depositional setting, ranging from a wave-dominated to a tide-dominated delta type. This interpretation was carried out through a series of steps of model generation, core description and Formation Microresistivity Image (FMI) tool interpretation. The analysis of the core data shows that the Lower Acacus layers exhibit a strong effect of tidal energy, whose traces are found imprinted in different types of sedimentary structures, for example, the presence of cross-bedding such as herringbone structures and wavy and flaser cross-bedding. Despite the recognition of some minor marine transgression events in the area, the coarsening-upward cycles of sand and shale layers in the Lower Acacus demonstrate the presence of a major regressive phase of sea level. Consequently, we produced a final package of this model with a complementary set of facies distribution, porosity and oil presence; it also records the petroleum system and the process of hydrocarbon migration and accumulation. Finally, this model suggests that the area can be outlined into three main segments of hydrocarbon potential, which can serve as a textbook guide for future exploration and production strategies in the area.
Keywords: Acacus, Ghadames, Libya, Silurian
Procedia PDF Downloads 143
21826 Variations in Heat and Cold Waves over Southern India
Authors: Amit G. Dhorde
Abstract:
It is now well established that global surface air temperatures have increased significantly during the period that followed the industrial revolution. One of the main predictions of climate change is that the occurrence of extreme weather events will increase in the future. In many regions of the world, high-temperature extremes have already started occurring with rising frequency. The main objective of the present study is to understand spatial and temporal changes in days with heat and cold wave conditions over southern India. The study area includes the region of India that lies to the south of the Tropic of Cancer. To fulfill this objective, daily maximum and minimum temperature data for 80 stations were collected for the period 1969-2006 from the National Data Center of the India Meteorological Department. After assessing the homogeneity of the data, 62 stations were finally selected for the study. Heat and cold waves were classified as slight, moderate and severe based on the criteria given by the India Meteorological Department. For every year, the number of days experiencing heat and cold wave conditions was computed. These data were analyzed with linear regression to find any existing trend. Further, the time period was divided into four decades to investigate the decadal frequency of the occurrence of heat and cold waves. The results revealed that the average annual temperature over southern India shows an increasing trend, which signifies warming over this area. Further, slight cold waves during the winter season have been decreasing at the majority of the stations. The moderate cold waves also show a similar pattern at the majority of the stations. This is an indication of warming winters over the region. Besides this analysis, other extreme indices were also analyzed, such as extremely hot days, hot days, very cold nights, cold nights, etc. This analysis revealed that nights are becoming warmer and days are getting warmer over some regions too.
Keywords: heat wave, cold wave, southern India, decadal frequency
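A minimal sketch of the linear-regression trend test described, using synthetic stand-in data (the actual station counts are not reproduced here):

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical annual counts of heat-wave days at one station, 1969-2006
years = np.arange(1969, 2007)
rng = np.random.default_rng(0)
days = 5 + 0.08 * (years - 1969) + rng.normal(0, 1.5, years.size)

fit = linregress(years, days)  # slope = change in heat-wave days per year
print(f"trend: {fit.slope:.3f} days/yr, p-value: {fit.pvalue:.3f}")
```

A positive, statistically significant slope at a station would correspond to the increasing-frequency finding reported above.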
Procedia PDF Downloads 128
21825 Study of the Effect of Inclusion of TiO2 in Active Flux on Submerged Arc Welding of Low Carbon Mild Steel Plate and Parametric Optimization of the Process by Using DEA Based Bat Algorithm
Authors: Sheetal Kumar Parwar, J. Deb Barma, A. Majumder
Abstract:
Submerged arc welding is a very complex process. It is a very efficient and high-performance welding process. In the present study, an attempt has been made to reduce the welding distortion by increasing the amount of oxide flux through TiO2 addition in the submerged arc welding process. Care has been taken to avoid excessive addition of the agent so as to attain significant results. A Data Envelopment Analysis (DEA) based BAT algorithm is used for parametric optimization, in which DEA is used to convert multiple response parameters into a single response parameter. The present study also helps to establish the effectiveness of the addition of TiO2 in active flux during the submerged arc welding process.
Keywords: BAT algorithm, design of experiment, optimization, submerged arc welding
Procedia PDF Downloads 640
21824 Introducing a Proper Total Quality Management Model for Libraries
Authors: Alireza Shahraki, Kaveh Keshmiry Zadeh
Abstract:
Total quality management (TQM) in libraries is of particular importance because high-quality libraries can facilitate the sustained development process in countries. This study has been conducted to examine the feasibility of implementing total quality management in the libraries of Sistan and Baluchestan and to provide an appropriate model for this purpose. All of the officials and employees of Sistan and Baluchestan libraries (23 individuals) constitute the population of the study. The data-gathering tool is a questionnaire designed based on ISO 9000. The data extracted from the questionnaires were analyzed using SPSS software. Results indicate that the highest degree of conformance to the 8 principles of ISO 9000 is attributed to the principle of 'users' (69.9%) and the lowest degree is associated with 'decision making based on facts' (39.1%). Moreover, a significant relationship was observed among the items (1 and 3), (2 and 5), (2 and 7), (3 and 5), (4 and 5), (4 and 7), (4 and 8), (5 and 7), and (7 and 8). According to the research findings, it can generally be said that it is not currently feasible to utilize TQM in the libraries of Sistan and Baluchestan.
Keywords: quality management, total quality, university libraries, libraries management
Procedia PDF Downloads 340
21823 A Methodology of Using Fuzzy Logics and Data Analytics to Estimate the Life Cycle Indicators of Solar Photovoltaics
Authors: Thor Alexis Sazon, Alexander Guzman-Urbina, Yasuhiro Fukushima
Abstract:
This study outlines a method for developing a surrogate life cycle model based on fuzzy logic using three fuzzy inference methods: (1) the conventional Fuzzy Inference System (FIS), (2) the hybrid system of Data Analytics and Fuzzy Inference (DAFIS), which uses data clustering for defining the membership functions, and (3) the Adaptive Neuro-Fuzzy Inference System (ANFIS), a combination of fuzzy inference and an artificial neural network. These methods were demonstrated with a case study in which the Global Warming Potential (GWP) and the Levelized Cost of Energy (LCOE) of solar photovoltaics (PV) were estimated using Solar Irradiation, Module Efficiency, and Performance Ratio as inputs. The effects of using different fuzzy inference types, either Sugeno- or Mamdani-type, and of changing the number of input membership functions on the error between the calibration data and the model-generated outputs were also illustrated. The solution spaces of the three methods were consequently examined with a sensitivity analysis. ANFIS exhibited the lowest error, while DAFIS gave slightly lower errors compared to FIS. Increasing the number of input membership functions helped with error reduction in some cases but, at times, resulted in the opposite. Sugeno-type models gave errors that are slightly lower than those of the Mamdani-type. While ANFIS is superior in terms of error minimization, it could generate solutions that are questionable, i.e., negative GWP values of the solar PV system when the inputs were all at the upper end of their range. This shows that the applicability of ANFIS models highly depends on the range of cases on which they were calibrated. FIS and DAFIS generated more intuitive trends in the sensitivity runs. DAFIS demonstrated an optimal design point wherein increasing the input values does not improve the GWP and LCOE anymore. In the absence of data that could be used for calibration, conventional FIS presents a knowledge-based model that could be used for prediction. In the PV case study, conventional FIS generated errors that are just slightly higher than those of DAFIS. The inherent complexity of a life cycle study often hinders its widespread use in the industry and policy-making sectors. While the methodology does not guarantee a more accurate result compared to those generated by the Life Cycle Methodology, it does provide a relatively simpler way of generating knowledge- and data-based estimates that could be used during the initial design of a system.
Keywords: solar photovoltaic, fuzzy logic, inference system, artificial neural networks
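To make the conventional-FIS idea concrete, here is a minimal Mamdani-type sketch with scikit-fuzzy; the variable ranges, membership partitions, and rules are hypothetical illustrations, not the paper's calibrated model:

```python
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

# Hypothetical universes: irradiation (kWh/m2/yr), efficiency (-), LCOE ($/kWh)
irradiation = ctrl.Antecedent(np.arange(800, 2401, 1), "irradiation")
efficiency = ctrl.Antecedent(np.arange(0.10, 0.251, 0.001), "efficiency")
lcoe = ctrl.Consequent(np.arange(0.02, 0.31, 0.001), "lcoe")

irradiation.automf(3)  # auto-generates 'poor', 'average', 'good' terms
efficiency.automf(3)
lcoe["low"] = fuzz.trimf(lcoe.universe, [0.02, 0.02, 0.15])
lcoe["high"] = fuzz.trimf(lcoe.universe, [0.15, 0.30, 0.30])

rules = [
    ctrl.Rule(irradiation["good"] & efficiency["good"], lcoe["low"]),
    ctrl.Rule(irradiation["poor"] | efficiency["poor"], lcoe["high"]),
]
sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input["irradiation"] = 1800
sim.input["efficiency"] = 0.20
sim.compute()
print(sim.output["lcoe"])  # defuzzified LCOE estimate
```

In DAFIS the membership functions above would instead be derived from data clustering, and in ANFIS the rule consequents would be tuned by a neural network against calibration data.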
Procedia PDF Downloads 165
21822 Ensemble Methods in Machine Learning: An Algorithmic Approach to Derive Distinctive Behaviors of Criminal Activity Applied to the Poaching Domain
Authors: Zachary Blanks, Solomon Sonya
Abstract:
Poaching presents a serious threat to endangered animal species, environmental conservation, and human life. Additionally, some poaching activity has even been linked to supplying funds to support terrorist networks elsewhere around the world. Consequently, agencies dedicated to protecting wildlife habitats face a near-intractable task of adequately patrolling an entire area (spanning several thousand kilometers) given the limited resources, funds, and personnel at their disposal. Thus, agencies need predictive tools that are both high-performing and easily implementable by the user to help in learning how the significant features (e.g., animal population densities, topography, behavior patterns of the criminals within the area, etc.) interact with each other, in hopes of abating poaching. This research develops a classification model using machine learning algorithms to aid in forecasting future attacks that is both easy to train and performs well when compared to other models. We demonstrate how data imputation methods (specifically predictive mean matching, gradient boosting, and random forest multiple imputation) can be applied to analyze data and create significant predictions across a varied data set. Specifically, we apply these methods to improve the accuracy of adopted prediction models (Logistic Regression, Support Vector Machine, etc.). Finally, we assess the performance of the model and the accuracy of our data imputation methods by learning on a real-world data set constituting four years of imputed data and testing on one year of non-imputed data. This paper provides three main contributions. First, we extend work done by the Teamcore and CREATE (Center for Risk and Economic Analysis of Terrorism Events) research group at the University of Southern California (USC), working in conjunction with the Department of Homeland Security, to apply game theory and machine learning algorithms to develop more efficient ways of reducing poaching. This research introduces ensemble methods (Random Forests and Stochastic Gradient Boosting) and applies them to real-world poaching data gathered from Ugandan rainforest park rangers. Next, we consider the effect of data imputation on both the performance of various algorithms and the general accuracy of the method itself when applied to a dependent variable where a large number of observations are missing. Third, we provide an alternate approach to predict the probability of observing poaching both by season and by month. The results from this research are very promising. We conclude that by using Stochastic Gradient Boosting to predict observations for non-commercial poaching by season, we are able to produce statistically equivalent results while being orders of magnitude faster in computation time and complexity. Additionally, when predicting potential poaching incidents by individual month rather than by entire seasons, boosting techniques produce a mean area-under-the-curve increase of approximately 3% relative to previous prediction schedules based on entire seasons.
Keywords: ensemble methods, imputation, machine learning, random forests, statistical analysis, stochastic gradient boosting, wildlife protection
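A minimal sketch of the imputation-plus-boosting pipeline described, using scikit-learn and synthetic data; note that sklearn's IterativeImputer is a MICE-style stand-in here, not the paper's exact predictive-mean-matching or random-forest imputers:

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Hypothetical feature matrix (animal density, distance to road, patrol
# effort, ...) with ~20% values missing, and binary poaching labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
X[rng.random(X.shape) < 0.2] = np.nan
y = rng.integers(0, 2, size=500)

model = make_pipeline(
    IterativeImputer(random_state=0),        # fill missing observations
    GradientBoostingClassifier(random_state=0),  # stochastic gradient boosting
)
print(cross_val_score(model, X, y, scoring="roc_auc", cv=5).mean())
```

The area-under-the-curve score computed here mirrors the AUC metric the paper uses to compare monthly against seasonal prediction schedules.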
Procedia PDF Downloads 292
21821 Modelling High-Frequency Crude Oil Dynamics Using Affine and Non-Affine Jump-Diffusion Models
Authors: Katja Ignatieva, Patrick Wong
Abstract:
We investigate the dynamics of high-frequency energy prices, including crude oil and electricity prices. The returns of the underlying quantities are modelled using various parametric models, such as a stochastic volatility framework with correlated jumps in returns and volatility (SVCJ), as well as non-parametric alternatives, which are purely data-driven and do not require specification of the drift or the diffusion coefficient function. Using different statistical criteria, we investigate the performance of the considered parametric and non-parametric models in their ability to forecast price series and volatilities. Our models incorporate possible seasonalities in the underlying dynamics and utilise advanced estimation techniques for the dynamics of energy prices.
Keywords: stochastic volatility, affine jump-diffusion models, high frequency data, model specification, Markov chain Monte Carlo
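For reference, SVCJ dynamics are conventionally written in the affine Duffie-Pan-Singleton form shown below; the paper's exact parameterisation and seasonal adjustments may differ.

$$\frac{dS_t}{S_{t^-}} = \mu\,dt + \sqrt{V_t}\,dW_t^{S} + \left(e^{Z_t^{S}} - 1\right) dN_t$$
$$dV_t = \kappa(\theta - V_t)\,dt + \sigma_v \sqrt{V_t}\,dW_t^{V} + Z_t^{V}\,dN_t$$

Here the two Brownian motions have correlation ρ, N_t is a Poisson process with intensity λ, and the jump sizes Z^S and Z^V arrive contemporaneously and may themselves be correlated, which is what distinguishes SVCJ from simpler jump-diffusion specifications.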
Procedia PDF Downloads 104
21820 Educational Framework for Coaches on Injury Prevention in Adolescent Team Sports
Authors: Chantell Gouws, Lourens Millard, Anne Naude, Jan-Wessel Meyer, Brandon Stuwart Shaw, Ina Shaw
Abstract:
Background: Millions of South African youths participate in team sports, with netball and rugby being two of the largest worldwide. This increased participation and professionalism have resulted in an increase in the number of musculoskeletal injuries. Objective: This study examined the extent to which sport coaching knowledge translates to the injuries, and the prevention of injuries, of adolescents participating in netball and rugby. Methods: Thirty-four South African sports coaches participated in the study. Eighteen netball coaches and 16 rugby coaches with varying levels of coaching experience were selected to participate. An adapted version of Nash and Sproule’s questionnaire was used to investigate the coaches’ knowledge with regard to sport-specific common injuries, injury prevention, fitness/conditioning, individual technique development, training programs, mental training, and preparation of players. The analysis of data was carried out using a number of different techniques outlined by Nash and Sproule (2012), determined by the type of data. Descriptive statistics were used for the statistical analysis. Quantitative data were used to determine the educational framework and knowledge of sports coaches on injury prevention. Numerical data were obtained through questions on sports injuries, as well as on coaches’ sports knowledge levels. Participants’ knowledge was measured using a standardized scoring system. Results: For 0-4 years of netball coaching experience, 76.4% of the coaches had knowledge and experience and 33.3% appropriate first aid knowledge, while for 9-12 years and 13-16 years, 100% of the coaches had knowledge and experience and first aid knowledge. For 0-4 years of rugby coaching experience, 59.1% had knowledge and experience and 71% the appropriate first aid knowledge; for 17-20 years, 100% had knowledge and experience and first aid knowledge, while for 25 years or more, 45.5% had knowledge and experience. In netball, 90% of injuries were ankle injuries, followed by 70% for knee, 50% for shoulder, 20% for lower leg, and 15% for finger injuries. In rugby, 81% of the injuries occurred at the knee, followed by 50% for the shoulder, 40% for the ankle, 31% for the head and neck, and 25% for hamstring injuries. Six hours of training resulted in a 13% chance of injury in netball and a 32% chance in rugby. For 10 hours of training, the injury prevalence was 10% in netball and 17% in rugby, while 15 hours resulted in an injury incidence of 58% in netball players and a 25% chance in rugby players. Conclusion: This study highlights the need for coaches to improve their knowledge of injuries and injury prevention, along with factors that act as preventative measures and promote players’ well-being.
Keywords: musculoskeletal injury, sport coaching, sport trauma
Procedia PDF Downloads 161
21819 Tourists' Perception of Osun Osogbo Festival in Osogbo, Osun State, Nigeria
Authors: Yina Donald Orga
Abstract:
The Osun Osogbo festival is one of the biggest art festivals in Nigeria, with over 235,518 tourist visits in 2014. The purpose of this study is to generate data on tourists’ perception of the Osun Osogbo Festival in Osogbo, Osun State, Nigeria. Based on the population of 199,860 tourist visits at the Osun Osogbo festival in 2013, the Krejcie and Morgan sample size table was used to select 768 tourists/respondents. Likert-scale questionnaires were used to elicit data from the respondents. Descriptive statistics were used to describe the characteristics of the respondents and to analyse the tourists’ perception of the festival. The findings from the data analysed suggest that the trend of domestic and international tourist visits for the festival over the past ten years has shown a consistent increase since 2004, except in 2007 and 2008, and continued to increase up to 2013. This is an indication that the tourists are satisfied with the traditional, historical and authentic features of the festival. Also, findings from the study revealed that the tourists are not satisfied with the number of toilets at the Osun Sacred Grove, the crowd control of visitors during the festival, the medical personnel available to cater for visitors during the festival, etc. In view of the findings of the study, the following recommendations are suggested: provision of more toilets at the Osun Sacred Grove; recruitment of festival guides by the Osogbo Heritage Council to help control the huge crowd at the festival; and engagement of adequate medical personnel, by the Government of the State of Osun in conjunction with the Red Cross Society, to cater for the medical needs of visitors at the festival.
Keywords: festival, perception, positive, tourists
Procedia PDF Downloads 206
21818 End To End Process to Automate Batch Application
Authors: Nagmani Lnu
Abstract:
Often, Quality Engineering refers to testing applications that have either a User Interface (UI) or an Application Programming Interface (API). We often find mature test practices, standards, and automation for UI or API testing. However, another kind of application is present in almost all industries that deal with data in bulk and is often handled through something called a Batch Application. This is primarily an offline application that companies develop to process large data sets, often governed by multiple business rules. The challenge becomes more prominent when we try to automate batch testing. This paper describes the approaches taken to test a batch application from the financial industry covering the payment settlement process (a critical use case in all kinds of FinTech companies), resulting in 100% test automation in test creation and test execution. One can follow this approach for any other batch use case to achieve higher efficiency in the testing process.
Keywords: batch testing, batch test automation, batch test strategy, payments testing, payments settlement testing
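A minimal sketch of the kind of check such automation performs; the file layout, settlement rule, and names below are hypothetical, not the paper's actual framework:

```python
import csv
from decimal import Decimal

def settle(records):
    """Toy settlement rule (hypothetical): net amount per merchant."""
    totals = {}
    for r in records:
        totals[r["merchant"]] = totals.get(r["merchant"], Decimal("0")) + Decimal(r["amount"])
    return totals

def test_batch_settlement(tmp_path):
    # Arrange: write the input file the batch job would consume
    path = tmp_path / "payments.csv"
    path.write_text("merchant,amount\nm1,10.00\nm1,-2.50\nm2,5.00\n")
    with path.open() as f:
        records = list(csv.DictReader(f))
    # Act + Assert: compare batch output against independently computed totals
    assert settle(records) == {"m1": Decimal("7.50"), "m2": Decimal("5.00")}
```

Run under pytest, a suite of such data-driven assertions can cover both test creation (generating input files) and test execution (verifying the batch output), which is the end-to-end automation the paper describes.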
Procedia PDF Downloads 60
21817 Improving the Logistic System to Secure Effective Food Fish Supply Chain in Indonesia
Authors: Atikah Nurhayati, Asep A. Handaka
Abstract:
Indonesia is one of the world’s major fish producers and can feed not only its own citizens but also the people of the world. Currently, the total annual production is 11 million tons, expected to double by the year 2050. Given this potential, fishery has been an important part of the national food security system in Indonesia. Despite such potential, a big challenge faces Indonesians in making fish a reliable source of food, more specifically a source of protein intake. The long geographic distance between the fish production centers and the concentrations of consumers has prevented an effective supply chain from producers to consumers and therefore demands a good logistic system. This paper is based on our research, which aimed at analyzing the fish supply chain, and suggests relevant improvements to the chain. The research was conducted in 2016 in selected locations on Java Island, where intensive transactions in fishery commodities occur. Data used in this research comprise secondary data, namely time-series reports on production and distribution, and primary data regarding distribution aspects, collected through interviews with 100 purposively selected respondents representing fishers, traders and processors. The data were analyzed following the supply chain management framework and processed with logistic regression and validity tests. The main findings of the research are as follows. Firstly, it was found that improperly managed connectivity and logistic chains are the main cause of insecure availability and affordability for consumers. Secondly, the lack of quality of most local processed products is a major obstacle to improving affordability and connectivity. The paper concludes with a number of recommended strategies to tackle the problem. These include rationalization of the length of the existing supply chain, intensification of processing activities, and improvement of distribution infrastructure and facilities.
Keywords: fishery, food security, logistic, supply chain
Procedia PDF Downloads 241
21816 Hand Gestures Based Emotion Identification Using Flex Sensors
Authors: S. Ali, R. Yunus, A. Arif, Y. Ayaz, M. Baber Sial, R. Asif, N. Naseer, M. Jawad Khan
Abstract:
In this study, we propose a gesture-to-emotion recognition method using flex sensors mounted on the metacarpophalangeal joints. The flex sensors are fixed in a wearable glove, and the data from the glove are sent to a PC over Wi-Fi. Four gestures (finger pointing, thumbs up, fist open and fist close) were performed by five subjects. Each gesture is categorized into a sad, happy, or excited class based on the velocity and acceleration of the hand gesture. Seventeen inspectors observed the emotions and hand gestures of the five subjects. The emotional states based on the investigators’ assessments and on the acquired movement speed data were then compared. Overall, we achieved 77% accurate results. Therefore, the proposed design can be used for emotional-state detection applications.
Keywords: emotion identification, emotion models, gesture recognition, user perception
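A hedged sketch of the velocity/acceleration classification step; the thresholds and sampling rate are hypothetical illustrations, not the study's calibrated values:

```python
import numpy as np

def classify_emotion(flex_angles, dt=0.02):
    """Map a gesture's motion energy to an emotion class.

    flex_angles: joint-angle samples from the glove (degrees).
    The thresholds below are hypothetical, for illustration only.
    """
    velocity = np.gradient(flex_angles, dt)          # deg/s
    acceleration = np.gradient(velocity, dt)         # deg/s^2
    energy = np.mean(np.abs(velocity)) + np.mean(np.abs(acceleration))
    if energy < 50:
        return "sad"
    elif energy < 150:
        return "happy"
    return "excited"

# Example: a slow fist-close gesture sampled at 50 Hz over 2 seconds
angles = np.linspace(0, 60, 100)
print(classify_emotion(angles))
```

The inspectors' labels would then be compared against this automatic output to produce the accuracy figure reported above.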
Procedia PDF Downloads 285
21815 Causal Inference Engine between Continuous Emission Monitoring System Combined with Air Pollution Forecast Modeling
Authors: Yu-Wen Chen, Szu-Wei Huang, Chung-Hsiang Mu, Kelvin Cheng
Abstract:
This paper develops a data-driven model to deal with the causality between the Continuous Emission Monitoring System (CEMS, operated by the Environmental Protection Administration, Taiwan) in industrial factories and the air quality of the surrounding environment. Compared to the heavy burden of traditional numerical models of regional weather and air pollution simulation, the lightweight proposed model can provide hourly forecasts from current observations of weather, air pollution and factory emissions. The observation data include wind speed, wind direction, relative humidity, temperature and other variables. The observations can be collected in real time from the Open APIs of Civil IoT Taiwan, which are sourced from 439 weather stations, 10,193 qualitative air stations, 77 national quantitative stations and 140 CEMS quantitative industrial factories. This study completed a causal inference engine and produces an air pollution forecast for the next 12 hours related to local industrial factories. The pollution forecasts are produced hourly with a grid resolution of 1 km x 1 km on the IIoTC (Industrial Internet of Things Cloud) and saved in netCDF4 format. The elaborated procedures to generate forecasts comprise data recalibration, outlier elimination, Kriging interpolation, and particle tracking with random walk techniques for the mechanisms of diffusion and advection. The solution of these equations reveals the causality between factory emissions and the associated air pollution. Further, with the aid of installed real-time flue emission (Total Suspended Particulates, TSP) sensors and the forecasted air pollution map, this study also discloses the conversion mechanism between TSP and PM2.5/PM10 for different regional and industrial characteristics, based on long-term data observation and calibration. These different time-series qualitative and quantitative data successfully realize a practicable causal inference engine in the cloud for factory management control. Once the forecasted air quality for a region is marked as harmful, the correlated factories are notified and asked to suppress their operations and reduce emissions in advance.
Keywords: continuous emission monitoring system, total suspended particulates, causal inference, air pollution forecast, IoT
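A minimal sketch of the particle-tracking/random-walk mechanism named above, with hypothetical wind and diffusivity values rather than the paper's calibrated inputs:

```python
import numpy as np

def advect_diffuse(n_particles, wind_u, wind_v, diffusivity, dt, n_steps):
    """Lagrangian particle tracking: advection by wind plus a random walk
    whose step variance 2*K*dt mimics turbulent diffusion (illustrative)."""
    rng = np.random.default_rng(0)
    pos = np.zeros((n_particles, 2))  # all particles start at the stack
    sigma = np.sqrt(2.0 * diffusivity * dt)
    for _ in range(n_steps):
        pos[:, 0] += wind_u * dt + rng.normal(0, sigma, n_particles)
        pos[:, 1] += wind_v * dt + rng.normal(0, sigma, n_particles)
    return pos

# One hour of transport: 2 m/s wind, K = 50 m^2/s, 60 s time steps
plume = advect_diffuse(10_000, wind_u=2.0, wind_v=0.0,
                       diffusivity=50.0, dt=60.0, n_steps=60)
print(plume.mean(axis=0), plume.std(axis=0))  # plume centre and spread
```

Binning particle positions like these onto the 1 km x 1 km grid is one way such an engine can attribute downwind concentrations back to individual CEMS-monitored stacks.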
Procedia PDF Downloads 87
21814 Artificial Intelligent-Based Approaches for Task Offloading, Resource Allocation and Service Placement of Internet of Things Applications: State of the Art
Authors: Fatima Z. Cherhabil, Mammar Sedrati, Sonia-Sabrina Bendib
Abstract:
In order to support the continued growth and critical latency requirements of IoT applications, and to overcome various obstacles of traditional data centers, mobile edge computing (MEC) has emerged as a promising solution that extends cloud data processing and decision-making to edge devices. By adopting a MEC structure, IoT applications can be executed locally, on an edge server, on different fog nodes, or in distant cloud data centers. However, we are often faced with optimizing conflicting criteria, such as minimizing the energy consumption of the limited local capabilities (in terms of CPU, RAM, storage, bandwidth) of mobile edge devices while trying to keep performance high (reducing response time, increasing throughput and service availability). Achieving one goal may affect the other, making task offloading (TO), resource allocation (RA), and service placement (SP) complex processes. Studying the trade-off between conflicting criteria is a nontrivial multi-objective optimization problem. The paper provides a survey of different recent TO, SP, and RA multi-objective optimization (MOO) approaches used in edge computing environments, particularly artificial intelligence (AI) based ones, to satisfy the various objectives, constraints, and dynamic conditions of IoT applications.
Keywords: mobile edge computing, multi-objective optimization, artificial intelligence approaches, task offloading, resource allocation, service placement
Procedia PDF Downloads 115
21813 Adopting Data Science and Citizen Science to Explore the Development of African Indigenous Agricultural Knowledge Platform
Authors: Steven Sam, Ximena Schmidt, Hugh Dickinson, Jens Jensen
Abstract:
The goal of this study is to explore the potential of data science and citizen science approaches to develop an interactive, digital, open infrastructure that pulls together African indigenous agriculture and food systems data from multiple sources, making it accessible and reusable for policy, research and practice in modern food production efforts. The World Bank has recognised that African Indigenous Knowledge (AIK) is innovative and unique among local and subsistence smallholder farmers, and it is central to sustainable food production and to enhancing biodiversity and natural resources in many poor, rural societies. AIK refers to tacit knowledge held in different languages, cultures and skills, passed down from generation to generation by word of mouth. AIK is a key driver of food production, preservation, and consumption for more than 80% of citizens in Africa and can therefore assist modern efforts to reduce food insecurity and hunger. However, the documentation and dissemination of AIK remain a big challenge confronting librarians and other information professionals in Africa, and there is a risk of losing AIK owing to urban migration, modernisation, land grabbing, and the emergence of relatively small-scale commercial farming businesses. There is also a clear disconnect between AIK and scientific knowledge and modern efforts for sustainable food production. The study combines data science and citizen science approaches, through active community participation, to generate and share AIK for facilitating learning and promoting knowledge that is relevant for policy intervention and sustainable food production, through a curated digital platform based on FAIR principles. The study adopts key informant interviews along with a participatory photo and video elicitation approach, where farmers are given digital devices (mobile phones) to record and document their practices involving agriculture, food production, processing, and consumption by traditional means. Data collected are analysed using the UK Science and Technology Facilities Council’s proven citizen science methodology (Zooniverse) and data science. Outcomes are presented in participatory stakeholder workshops, where the researchers outline plans for creating the platform and developing the knowledge-sharing standard framework and copyright agreements. Overall, the study shows that learning from AIK, by investigating what local communities know and have, can improve understanding of food production and consumption, in particular in times of stress or shocks affecting the food systems and communities. Thus, the platform can be useful for local populations, research, and policy-makers, and it could lead to transformative innovation in the food system, creating a fundamental shift in the way the North supports sustainable, modern food production efforts in Africa.
Keywords: Africa indigenous agriculture knowledge, citizen science, data science, sustainable food production, traditional food system
Procedia PDF Downloads 82
21812 Energy Efficiency Analysis of Crossover Technologies in Industrial Applications
Authors: W. Schellong
Abstract:
Industry accounts for one-third of global final energy demand. Crossover technologies (e.g. motors, pumps, process heat, and air conditioning) play an important role in improving energy efficiency. These technologies are used in many applications independent of the production branch. In particular, electrical power is used by drives, pumps, compressors, and lighting. The paper demonstrates the algorithm of the energy analysis through selected case studies of typical industrial processes. The energy analysis represents an essential part of energy management systems (EMS). Generally, process control systems (PCS) can support EMS: they provide information about the production process, and they organize the maintenance actions. Combining these tools into an integrated process allows the development of an energy-critical-equipment strategy. Thus, asset and energy management can use the same common data to improve energy efficiency.
Keywords: crossover technologies, data management, energy analysis, energy efficiency, process control
Procedia PDF Downloads 211
21811 The Review of Permanent Downhole Monitoring System
Abstract:
With the increasingly difficult development and operating environments of exploration, there are many new challenges and difficulties in developing and exploiting oil and gas resources. These include the ability to dynamically monitor wells and to provide data and assurance for the completion and production of high-cost and complex wells. A key technology in providing these assurances and maximizing oilfield profitability is real-time permanent reservoir monitoring. Optical fiber sensing systems have gradually begun to replace traditional electronic systems. Traditional temperature sensors can only achieve single-point temperature monitoring, but fiber optic sensing systems based on the Bragg grating principle offer a high level of reliability, accuracy, stability, and resolution, enabling cost-effective monitoring that can be performed in real time, at any time, and without well intervention. Continuous data acquisition is performed along the entire wellbore. The integrated package with the downhole pressure gauge, packer, and surface system can also realize real-time dynamic monitoring of the pressure in selected downhole sections, avoiding oil well intervention and eliminating the production delays and operational risks of conventional surveys. Real-time information obtained through permanent optical fibers can also provide critical reservoir monitoring data for production and recovery optimization.
Keywords: PDHM, optical fiber, coiled tubing, photoelectric composite cable, digital-oilfield
Procedia PDF Downloads 79
21810 Wind Speed Forecasting Based on Historical Data Using Modern Prediction Methods in Selected Sites of Geba Catchment, Ethiopia
Authors: Halefom Kidane
Abstract:
This study aims to assess the wind resource potential and characterize the urban wind patterns of Hawassa City, Ethiopia. The estimation and characterization of wind resources are crucial for sustainable urban planning, renewable energy development, and climate change mitigation strategies. A secondary data collection method was used to carry out the study. The data collected at 2 meters were analyzed statistically and extrapolated to the standard heights of 10 meters and 30 meters using the power law equation. The standard deviation method was used to calculate the values of the scale and shape factors. From the analysis presented, the maximum and minimum mean daily wind speeds at 2 meters were 1.33 m/s and 0.05 m/s in 2016, 1.67 m/s and 0.14 m/s in 2017, and 1.61 m/s and 0.07 m/s in 2018, respectively. The maximum monthly average wind speed of Hawassa City at 2 meters in 2016 was noticed in December, at around 0.78 m/s, while in 2017 the maximum wind speed was recorded in January, with a magnitude of 0.80 m/s, and in 2018 June had the maximum speed, at 0.76 m/s. On the other hand, October was the month with the minimum mean wind speed in all years, with values of 0.47 m/s in 2016, 0.47 m/s in 2017 and 0.34 m/s in 2018. The annual mean wind speed at a height of 2 meters was 0.61 m/s in 2016, 0.64 m/s in 2017 and 0.57 m/s in 2018. From extrapolation, the annual mean wind speeds for 2016, 2017 and 2018 were 1.17 m/s, 1.22 m/s and 1.11 m/s at a height of 10 meters, and 3.34 m/s, 3.78 m/s and 3.01 m/s at a height of 30 meters, respectively. Thus, the site consists mainly of class-I wind speeds, even at the extrapolated heights.
Keywords: artificial neural networks, forecasting, min-max normalization, wind speed
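A small sketch of the two calculations named in the abstract, the power-law height extrapolation and the standard-deviation method for Weibull parameters; the shear exponent below is back-calculated so the example reproduces the reported 2016 figures, and the standard deviation is illustrative:

```python
import math

def extrapolate(u_ref, z_ref, z, alpha=0.405):
    """Power-law wind profile: u(z) = u_ref * (z / z_ref)**alpha."""
    return u_ref * (z / z_ref) ** alpha

def weibull_params(mean_u, std_u):
    """Standard deviation (empirical) method for Weibull shape k and scale c."""
    k = (std_u / mean_u) ** -1.086
    c = mean_u / math.gamma(1.0 + 1.0 / k)
    return k, c

print(extrapolate(0.61, 2.0, 10.0))             # ~1.17 m/s, the 2016 value at 10 m
print(weibull_params(mean_u=0.61, std_u=0.30))  # std_u is a hypothetical input
```

In practice the shear exponent alpha is itself estimated from paired measurements at two heights rather than assumed.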
Procedia PDF Downloads 76
21809 Correlates of Pedagogic Malpractices
Authors: Chinaza Uleanya, Martin Duma, Bongani Gamede
Abstract:
The research investigated pedagogic malpractices by lecturers in sub-Saharan African universities. The population of the study consisted of undergraduates and lecturers in selected universities in Nigeria and South Africa. A mixed-method approach was adopted for data collection. The sample population of the study was 480 undergraduate students and 16 lecturers. Questionnaires with a 4-point Likert scale were administered to the 480 respondents, while interviews were conducted with 6 lecturers. In addition, the teaching strategies of 10 lecturers were observed. Data analyses indicated that a poor work environment demotivates lecturers and leads them into pedagogic malpractice, which is one of the causes of the learning challenges faced by undergraduates. The findings of the study also show that pedagogic malpractice contributes to the high dropout rate in sub-Saharan African universities. Based on the results, it was recommended that qualified lecturers be employed and given conducive environments in which to work.
Keywords: malpractice, pedagogy, pedagogic malpractice, correlates
Procedia PDF Downloads 304
21808 Utilization of Informatics to Transform Clinical Data into a Simplified Reporting System to Examine the Analgesic Prescribing Practices of a Single Urban Hospital’s Emergency Department
Authors: Rubaiat S. Ahmed, Jemer Garrido, Sergey M. Motov
Abstract:
Clinical informatics (CI) enables the transformation of data into a systematic organization that improves the quality of care and the generation of positive health outcomes. Innovative technology through informatics that compiles accurate data on analgesic utilization in the emergency department (ED) can enhance pain management in this important clinical setting. We aim to establish a simplified reporting system through CI to examine and assess the analgesic prescribing practices in the ED through executing a U.S. federal grant project on opioid reduction initiatives. Queried data points of interest from a level-one trauma ED’s electronic medical records were used to create data sets and develop informational/visual reporting dashboards (in Microsoft Excel and Google Sheets) concerning analgesic usage across several predefined parameters and performance metrics using CI. The data were then qualitatively analyzed by departmental clinicians and leadership to evaluate ED analgesic prescribing trends. During a 12-month reporting period (Dec. 1, 2020 - Nov. 30, 2021) for the ongoing project, about 41% of all ED patient visits (N = 91,747) were for pain conditions, of which 81.6% received analgesics in the ED and at discharge (D/C). Of those treated with analgesics, 24.3% received opioids compared to 75.7% receiving opioid alternatives in the ED and at D/C, including non-pharmacological modalities. Demographics showed that among patients receiving analgesics, 56.7% were aged between 18 and 64, 51.8% were male, 51.7% were white, and 66.2% had government-funded health insurance. Ninety-one percent of all opioids were prescribed in the ED, with intravenous (IV) morphine, IV fentanyl, and morphine sulfate immediate release (MSIR) tablets accounting for 88.0% of ED-dispensed opioids. With 9.3% of all opioids prescribed at D/C, MSIR was dispensed 72.1% of the time; hydrocodone, oxycodone, and tramadol were used only 10-15% of the time, and hydromorphone 0% of the time. Of the opioid alternatives, non-steroidal anti-inflammatory drugs were utilized 60.3% of the time, local anesthetics and ultrasound-guided nerve blocks 23.5%, and acetaminophen 7.9%, as the primary non-opioid drug categories prescribed by ED providers. Non-pharmacological analgesia included virtual reality and other modalities. An average of 18.5 ED opioid orders and 1.9 opioid D/C prescriptions per 102.4 daily ED patient visits was observed for the period. Compared to other specialties within our institution, 2.0% of opioid D/C prescriptions were given by ED providers, compared to the national average of 4.8%. Opioid alternatives accounted for 69.7% and 30.3% of usage, versus 90.7% and 9.3% for opioids, in the ED and at D/C, respectively. There is a pressing need for concise, relevant, and reliable clinical data on analgesic utilization for ED providers and leadership to evaluate prescribing practices and make data-driven decisions. Basic computer software can be used to create effective visual reporting dashboards with indicators that convey relevant and timely information in an easy-to-digest manner. We accurately examined our ED’s analgesic prescribing practices using CI through dashboard reporting. Such reporting tools can quickly identify key performance indicators and prioritize data to enhance pain management and promote safe prescribing practices in the emergency setting.
Keywords: clinical informatics, dashboards, emergency department, health informatics, healthcare informatics, medical informatics, opioids, pain management, technology
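A minimal sketch of the kind of aggregation behind such a dashboard, done here in pandas with hypothetical column names and toy records rather than the hospital's actual extract:

```python
import pandas as pd

# Hypothetical extract of ED medication orders (column names illustrative)
orders = pd.DataFrame({
    "visit_id":  [1, 2, 3, 4, 5],
    "setting":   ["ED", "ED", "DC", "ED", "DC"],
    "drug":      ["morphine IV", "ketorolac", "MSIR", "fentanyl IV", "ibuprofen"],
    "is_opioid": [True, False, True, True, False],
})

# Share of opioid vs. non-opioid analgesics by setting (ED vs. discharge)
summary = (
    orders.groupby(["setting", "is_opioid"])
          .size()
          .groupby(level=0)
          .transform(lambda s: 100 * s / s.sum())
          .rename("percent")
          .reset_index()
)
print(summary)
```

Exporting a table like this to Excel or Google Sheets and charting it is essentially the lightweight dashboarding workflow the abstract describes.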
Procedia PDF Downloads 144
21807 Homeless Population Modeling and Trend Prediction Through Identifying Key Factors and Machine Learning
Authors: Shayla He
Abstract:
Background and Purpose: According to Chamie (2017), it is estimated that no fewer than 150 million people, or about 2 percent of the world’s population, are homeless. The homeless population in the United States has grown rapidly in the past four decades. In New York City, the sheltered homeless population has increased from 12,830 in 1983 to 62,679 in 2020. Knowing the trend of the homeless population is crucial in helping states and cities make affordable housing plans and other community service plans ahead of time, to better prepare for the situation. This study utilized data from New York City, examined the key factors associated with homelessness, and developed systematic modeling to predict future homeless populations. Using the best model developed, named HP-RNN, an analysis of the homeless population change during the months of 2020 and 2021, which were impacted by the COVID-19 pandemic, was conducted. Moreover, HP-RNN was tested on data from Seattle. Methods: The methodology involves four phases in developing robust prediction methods. Phase 1 gathered and analyzed raw data on homeless populations and demographic conditions from five urban centers. Phase 2 identified the key factors that contribute to the rate of homelessness. In Phase 3, three models were built using Linear Regression, Random Forest, and Recurrent Neural Network (RNN), respectively, to predict the future trend of society’s homeless population. Each model was trained and tuned on the dataset from New York City, with its accuracy measured by Mean Squared Error (MSE). In Phase 4, the final phase, the best model from Phase 3 was evaluated using the data from Seattle, which was not part of the model training and tuning process in Phase 3. Results: Compared to the Linear Regression based model used by HUD et al. (2019), HP-RNN significantly improved the prediction metrics, raising the Coefficient of Determination (R2) from -11.73 to 0.88 and reducing MSE by 99%. HP-RNN was then validated on the data from Seattle, WA, which showed a peak error of 14.5% between the actual and the predicted counts. Finally, the modeling results were used to predict the trend during the COVID-19 pandemic; they show a good correlation between the actual and the predicted homeless population, with a peak error of less than 8.6%. Conclusions and Implications: This work is the first to apply an RNN to model time series of homelessness-related data. The model shows a close correlation between the actual and the predicted homeless population. There are two major implications of this result. First, the model can be used to predict the homeless population for the next several years, and the prediction can help states and cities plan ahead on affordable housing allocation and other community services, to better prepare for the future. Moreover, this prediction can serve as a reference for policy makers and legislators as they seek to make changes that may impact the factors closely associated with the future homeless population trend.
Keywords: homeless, prediction, model, RNN
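A generic sketch of an RNN time-series forecaster of this kind in Keras; the HP-RNN architecture itself is not detailed in the abstract, so the layer sizes, lookback window, and stand-in data below are all hypothetical:

```python
import numpy as np
from tensorflow import keras

# Stand-in for a normalized monthly sheltered-homeless count series
series = np.sin(np.linspace(0, 20, 240)) * 0.3 + 0.5

def make_windows(x, lookback=12):
    """Slice the series into (lookback-months, next-month) training pairs."""
    X = np.stack([x[i:i + lookback] for i in range(len(x) - lookback)])
    y = x[lookback:]
    return X[..., None], y  # shape (samples, timesteps, 1 feature)

X, y = make_windows(series)
model = keras.Sequential([
    keras.layers.SimpleRNN(32, input_shape=(X.shape[1], 1)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")  # MSE, as in the paper's evaluation
model.fit(X, y, epochs=5, verbose=0)
print(model.predict(X[-1:], verbose=0))  # next-month forecast
```

Training on one city's windows and then scoring windows from another city mirrors the New York-to-Seattle validation design described above.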
Procedia PDF Downloads 121
21806 Democratic Political Socialization of the 5th and 6th Graders under the Authority of Dusit District Office, Bangkok
Authors: Mathinee Khongsatid, Phusit Phukamchanoad, Sakapas Saengchai
Abstract:
This research aims to study the democratic political socialization of 5th and 6th graders under the authority of the Dusit District Office, Bangkok, using stratified sampling for probability sampling and purposive sampling for non-probability sampling to collect data through the distribution of questionnaires to 300 respondents. This covers all of the schools under the authority of the Dusit District Office. The researcher analyzed the data using descriptive statistics, including the arithmetic mean and standard deviation. The results show that 5th and 6th graders under the authority of the Dusit District Office, Bangkok, have displayed some characteristics of democratic political socialization both inside and outside the classroom, as well as outside school. However, democratic political socialization in the classroom, through grouping and class participation, is much more emphasized.
Keywords: democratic, political socialization, students grades 5-6, descriptive statistics
Procedia PDF Downloads 276
21805 Comparison between Open and Closed System for Dewatering with Geotextile: Field and Comparative Study
Authors: Matheus Müller, Delma Vidal
Abstract:
The present paper presents two dewatering techniques for sludge, analyzing their operations and dewatering processes, with a view to improving the disposal conditions of residues with high liquid content. It describes the field tests performed on two geotextile systems, a closed geotextile tube and an open geotextile drying bed, both of which were submitted to two filling cycles. The sludge used in the filling cycles for the field trials is from the water treatment plant of the Technological Center of Aeronautics (CTA) in São José dos Campos, Brazil. Data on volume and height abatement due to dewatering and consolidation were collected over time, until constancy was observed. With the laboratory analysis of the sludge allied to the data collected in the field, it was possible to perform a critical comparative study between the observations and the scientific literature; in this way, this paper presents the data obtained and compares them with the bibliography. The tests were carried out on three fronts: field tests, including the filling cycles of the systems with the sludge from CTA, taking measurements of filling time per cycle, maximum filling height per cycle, and heights against the abatement by dewatering of the systems over time; laboratory tests, including the characterization of the sludge and the removal of material samples from the systems to ascertain the solids content within the systems over time; and comparison of the data obtained in the field and laboratory tests with the scientific literature. Through the study, it was possible to perceive that the densification of the material inside a closed system, such as the geotextile tube, occurs faster than that observed in the drying bed system. This accelerated densification can be brought about by the pumping pressure of the sludge during filling and by the confinement of the residue by the permeable geotextile membrane (which allows water to pass through), accelerating densification and dewatering under the material's own weight after filling with sludge.
Keywords: consolidation, dewatering, geotextile drying bed, geotextile tube
Procedia PDF Downloads 127
21804 Developing Indicators in System Mapping Process Through Science-Based Visual Tools
Authors: Cristian Matti, Valerie Fowles, Eva Enyedi, Piotr Pogorzelski
Abstract:
The system mapping process can be defined as a knowledge service in which a team of facilitators, experts and practitioners facilitates a guided conversation, enables the exchange of information and supports an iterative curation process. System mapping processes rely on science-based tools to introduce and simplify a variety of components and concepts of socio-technical systems through metaphors, while facilitating an interactive dialogue process to enable the design of co-created maps. System maps then work as “artifacts” that provide information and focus the conversation on specific areas around the defined challenge and the related decision-making process. Knowledge management facilitates the curation of the data gathered during the system mapping sessions through documentation practices and subsequent knowledge co-production, for which common practices from data science are applied to identify new patterns, hidden insights, recurrent loops and unexpected elements. This study presents empirical evidence on the application of these techniques to explore mechanisms by which visual tools provide guiding principles to portray system components, key variables and types of data through the lens of climate change. In addition, data science facilitates the structuring of elements that allow the analysis of layers of information through affinity and clustering analysis and, therefore, the development of simple indicators to support the decision-making process. This paper addresses methodological and empirical elements of the horizontal learning process that integrates system mapping through visual tools, interpretation, cognitive transformation and analysis. The process is designed to introduce practitioners to simple, iterative and inclusive processes that create actionable knowledge and enable a shared understanding of the system in which they are embedded.
Keywords: indicators, knowledge management, system mapping, visual tools
Procedia PDF Downloads 195
21803 Reference Architecture for Intelligent Enterprise Solutions
Authors: Shankar Kambhampaty, Harish Rohan Kambhampaty
Abstract:
Data in enterprise IT systems has been growing at a phenomenal pace. This has provided opportunities to run analytics to gather intelligence on key business parameters that enable enterprises to provide better products and services to customers. While there are several artificial intelligence and machine learning (AI/ML) and business intelligence (BI) tools and technologies available in the marketplace to run analytics, there is a need for an integrated view when developing intelligent solutions in enterprises. This paper progressively elaborates a reference model for enterprise solutions, builds an integrated view of data, information, and intelligence components, and presents a reference architecture for intelligent enterprise solutions. Finally, it applies the reference architecture to an insurance organization. The reference architecture is the outcome of experience and insights gathered from developing intelligent solutions for several organizations.
Keywords: architecture, model, intelligence, artificial intelligence, business intelligence, AI, BI, ML, analytics, enterprise
Procedia PDF Downloads 143