Search results for: ground truth data
24885 Geographical Data Visualization Using Video Games Technologies
Authors: Nizar Karim Uribe-Orihuela, Fernando Brambila-Paz, Ivette Caldelas, Rodrigo Montufar-Chaveznava
Abstract:
In this paper, we present the advances corresponding to the implementation of a strategy to visualize geographical data using a Software Development Kit (SDK) for video games. We use multispectral images from the Landsat 7 platform and Laser Imaging Detection and Ranging (LIDAR) data from the Mexican National Institute of Statistics and Geography (INEGI). We select a place of interest from the Landsat platform and apply some processing to the image (rotation, atmospheric correction and enhancement). The resulting image serves as a grayscale color-map to be fused with the LIDAR data, which were selected using the same coordinates as the Landsat scene. The LIDAR data are translated to 8-bit raw data. Both images are fused in software developed using Unity (an SDK employed for video games). The resulting image is then displayed and can be explored by moving around. The idea is that the software could be used by students of geology and geophysics at the Engineering School of the National University of Mexico. They will download the software and the images corresponding to a geological place of interest to a smartphone and will be able to virtually visit and explore the site with a virtual reality visor such as Google Cardboard.
Keywords: virtual reality, interactive technologies, geographical data visualization, video games technologies, educational material
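As a rough illustration of the data-preparation step described above (not the authors' actual Unity pipeline), the following Python sketch converts a LIDAR elevation grid to 8-bit raw values and pairs it with a grayscale Landsat-derived texture. The array shapes, value ranges and file names are hypothetical stand-ins.

```python
import numpy as np
from PIL import Image

# Hypothetical stand-ins: a grayscale Landsat-derived texture and a LIDAR elevation grid (meters)
rng = np.random.default_rng(0)
texture = rng.uniform(0, 255, size=(512, 512)).astype(np.uint8)
elevation = 1200.0 + 300.0 * rng.random((512, 512))

# Scale elevations to 8-bit raw data, as described in the abstract
e_min, e_max = elevation.min(), elevation.max()
height_8bit = np.round(255.0 * (elevation - e_min) / (e_max - e_min)).astype(np.uint8)

# Save the 8-bit heightmap and the grayscale color-map; in Unity the heightmap would drive
# terrain elevation while the texture is draped over it
Image.fromarray(height_8bit).save("lidar_heightmap_8bit.png")
Image.fromarray(texture).save("terrain_texture.png")
```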
Procedia PDF Downloads 246
24884 The Effects of Spatial Dimensions and Relocation and Dimensions of Sound Absorbers in a Space on the Objective Parameters of Sound
Authors: Mustafa Kavraz
Abstract:
This study investigated the differences in the objective parameters of sound depending on changes in the lengths of the lateral surfaces of a space and on the relocation of the sound absorbers placed on these surfaces. To this end, three room models were chosen. The widths and heights of these rooms were the same, but the lengths were varied. The smallest room was 8 m wide and 10 m long; the lengths of the other two rooms were 15 m and 20 m. For each model, the differences in the objective parameters of sound were determined by keeping all the material in the space intact and changing only the positions of the sound absorbers placed on the walls. The sound absorbers used on the walls were of two different sizes: 4 m and 8 m long and storey-height (3 m). In all room models, the sound absorbers were placed on the long walls in three different ways: at the end of the long walls where they meet the front wall; at the end of the long walls where they meet the back wall; and in the middle part of the long walls. Except for the specially placed sound absorbers, the floor, wall and ceiling surfaces were covered with three different materials. There were no constructional elements such as doors and windows on the walls. On the surfaces, the materials specified in the Odeon 10 material library were used as coating materials: linoleum as flooring, painted plaster as wall coating, and gypsum boards as ceiling covering (2 layers with a total thickness of 32 mm). These were preferred because they are the materials commonly used for these purposes.
Keywords: sound absorber, room model, objective parameters of sound, jnd
Procedia PDF Downloads 375
24883 Nonparametric Sieve Estimation with Dependent Data: Application to Deep Neural Networks
Authors: Chad Brown
Abstract:
This paper establishes general conditions for the convergence rates of nonparametric sieve estimators with dependent data. We present two key results: one for nonstationary data and another for stationary mixing data. Previous theoretical results often lack practical applicability to deep neural networks (DNNs). Using these conditions, we derive convergence rates for DNN sieve estimators in nonparametric regression settings with both nonstationary and stationary mixing data. The DNN architectures considered adhere to current industry standards, featuring fully connected feedforward networks with rectified linear unit activation functions, unbounded weights, and a width and depth that grow with sample size.
Keywords: sieve extremum estimates, nonparametric estimation, deep learning, neural networks, rectified linear unit, nonstationary processes
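To make the architecture concrete, here is a minimal PyTorch sketch of the class of networks described: a fully connected ReLU feedforward network whose width and depth grow with the sample size n. The growth rates below are illustrative assumptions, not the rates derived in the paper.

```python
import math
import torch
import torch.nn as nn

def dnn_sieve(n_samples: int, in_dim: int) -> nn.Sequential:
    """Fully connected ReLU network whose width and depth grow with n (illustrative rates)."""
    width = max(8, int(n_samples ** 0.5))      # assumed growth rate, not the paper's
    depth = max(2, int(math.log(n_samples)))   # assumed growth rate, not the paper's
    layers, d = [], in_dim
    for _ in range(depth):
        layers += [nn.Linear(d, width), nn.ReLU()]
        d = width
    layers.append(nn.Linear(d, 1))             # scalar regression output
    return nn.Sequential(*layers)

# The sieve expands as the sample size grows
for n in (100, 1000, 10000):
    net = dnn_sieve(n, in_dim=5)
    print(n, sum(p.numel() for p in net.parameters()))
```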
Procedia PDF Downloads 42
24882 Investigating the Dynamic Plantar Pressure Distribution in Individuals with Multiple Sclerosis
Authors: Hilal Keklicek, Baris Cetin, Yeliz Salci, Ayla Fil, Umut Altinkaynak, Kadriye Armutlu
Abstract:
Objectives and Goals: Spasticity is a common symptom characterized by a velocity-dependent increase in tonic stretch reflexes (muscle tone) in patients with multiple sclerosis (MS). Hypertonic muscles affect normal plantigrade contact by disturbing accommodation of the foot to the ground while walking. It is important to know the differences between healthy and neurologic foot features for the management of spasticity-related deformities and/or the determination of rehabilitation purposes and contents. This study was planned with the aim of investigating the dynamic plantar pressure distribution in individuals with MS and determining the differences from healthy individuals (HI). Methods: Fifty-five individuals with MS (108 feet with spasticity according to the Modified Ashworth Scale) and 20 HI (40 feet) were the participants of the study. A dynamic pedobarograph was utilized for evaluation of dynamic loading parameters. Participants were instructed to walk at their self-selected speed seven times to eliminate the learning effect. The parameters were divided into 2 categories, maximum loading pressure (N/cm2) and time of maximum pressure (ms), collected from the heel medial, heel lateral, midfoot, and the heads of the first, second, third, fourth and fifth metatarsal bones. Results: There were differences between the groups in maximum loading pressure of the heel medial (p < .001), heel lateral (p < .001), midfoot (p=.041) and 5th metatarsal areas (p=.036). Also, there were differences between the groups in the time of maximum pressure of all metatarsal areas, midfoot, heel medial and heel lateral (p < .001) in favor of HI. Conclusions: The study provided basic data about foot pressure distribution in individuals with MS. Results of the study primarily showed that spasticity of the lower extremity muscles disrupted posteromedial foot loading. Secondarily, according to the study results, spasticity led to inappropriate timing during load transfer from hindfoot to forefoot.
Keywords: multiple sclerosis, plantar pressure distribution, gait, norm values
Procedia PDF Downloads 321
24881 Effect of Electric Arc Furnace Coarse Slag Aggregate and Ground Granulated Blast Furnace Slag on Mechanical and Durability Properties of Roller Compacted Concrete Pavement
Authors: Amiya Kumar Thakur, Dinesh Ganvir, Prem Pal Bansal
Abstract:
Industrial by-product utilization has been encouraged due to environmental and economic factors. Electric arc furnace (EAF) slag aggregate is a by-product of the steel industry and its storage is a major concern; hence it can be used as a replacement for natural aggregate, as its physical and mechanical properties are comparable to or better than those of natural aggregates. The present study investigates the effect of partial and full replacement of natural coarse aggregate with coarse EAF slag aggregate and partial replacement of cement with ground granulated blast furnace slag (GGBFS) on the mechanical and durability properties of roller compacted concrete pavement (RCCP). The replacement levels of EAF slag aggregate were five (0%, 25%, 50%, 75% and 100%) and of GGBFS were two (0% and 30%). The EAF slag aggregate was stabilized by exposing it to outdoor conditions for several years, and a volumetric expansion test using a steam exposure device was done to check volume stability. The soil compaction method was used for mix proportioning of RCCP. The fresh properties of RCCP investigated were fresh density, and the modified Vebe test was done to measure the consistency of the concrete. For investigating the mechanical properties, various tests were done at 7 and 28 days (compressive strength, split tensile strength, flexural strength and modulus of elasticity), and non-destructive testing was done at 28 days (ultrasonic pulse velocity (UPV) and rebound hammer tests). The durability tests done at 28 days were water absorption, skid resistance and abrasion resistance. The results showed that with the increase in slag aggregate percentage there was an increase in the fresh density of the concrete and also a slight increase in the Vebe time, but with the 30% GGBFS replacement the Vebe time decreased and the fresh density was comparable to the 0% GGBFS mix. The compressive strength, split tensile strength, flexural strength and modulus of elasticity increased with the increase in slag aggregate percentage when compared to the control mix. But with the 30% GGBFS replacement there was a slight decrease in mechanical properties when compared to 100% cement concrete. In the UPV and rebound hammer tests, all the mixes showed excellent concrete quality. With the increase in slag aggregate percentage there was an increase in water absorption, skid resistance and abrasion resistance, but with the 30% GGBFS replacement the skid resistance, water absorption and abrasion resistance decreased when compared to 100% cement concrete. From the study it was found that the mixes containing 30% GGBFS with different percentages of EAF slag aggregate had comparable results for all the mechanical and durability properties when compared to 100% cement mixes. Hence 30% GGBFS can be used as cement replacement together with 100% EAF slag aggregate as a replacement for natural coarse aggregate.
Keywords: durability properties, electric arc furnace slag aggregate, GGBFS, mechanical properties, roller compacted concrete pavement, soil compaction method
Procedia PDF Downloads 146
24880 Integration of Big Data to Predict Transportation for Smart Cities
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
An intelligent transportation system is essential to build smarter cities. Machine learning based transportation prediction could be a highly promising approach by making invisible aspects visible. In this context, this research aims to build a prototype model that predicts the transportation network by using big data and machine learning technology. Among urban transportation systems, this research focuses on the bus system. The research problem is that the existing headway model cannot respond to dynamic transportation conditions; thus, bus delay problems often occur. To overcome this problem, a prediction model is presented to find patterns of bus delay by using machine learning with the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the results. The prototype model is built from real-time bus data. The data are gathered through public data portals and real-time Application Program Interfaces (APIs) provided by the government. These data are fundamental resources for organizing interval pattern models of bus operations based on traffic environment factors (road speeds, station conditions, weather, and real-time operating information of buses). The prototype model is designed with a machine learning tool (RapidMiner Studio) and tested for bus delay prediction. This research presents experiments to increase prediction accuracy for bus headway by analyzing urban big data. Big data analysis is important for predicting the future and finding correlations by processing huge amounts of data. Therefore, based on the analysis method, this research represents an effective use of machine learning and urban big data to understand urban dynamics.
Keywords: big data, machine learning, smart city, social cost, transportation network
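The authors build their model in RapidMiner Studio; purely as an illustration of the same idea in code, the sketch below trains a regression model on hypothetical traffic, weather and bus-status features to predict delay. The feature names, the synthetic data and the choice of a random forest are assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Hypothetical features: [road_speed_kmh, station_wait_s, rain_mm, temperature_c, scheduled_headway_s]
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 5))
# Hypothetical target: bus delay in seconds (synthetic stand-in for real portal/API data)
y = 30 - 5 * X[:, 0] + 8 * X[:, 2] + rng.normal(scale=5, size=5000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("MAE (s):", mean_absolute_error(y_test, model.predict(X_test)))
```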
Procedia PDF Downloads 260
24879 Integrated Model for Enhancing Data Security Performance in Cloud Computing
Authors: Amani A. Saad, Ahmed A. El-Farag, El-Sayed A. Helali
Abstract:
Cloud computing is an important and promising field in the recent decade. Cloud computing allows sharing of resources, services and information among the people of the whole world. Although the advantages of using clouds are great, there are many risks in a cloud. Data security is the most important and critical problem of cloud computing. In this research, a new security model for cloud computing is proposed for ensuring a secure communication system, hiding information from other users and saving the user's time. In this proposed model, the Blowfish encryption algorithm is used for exchanging information or data, and the SHA-2 cryptographic hash algorithm is used for data integrity. For the user authentication process, a username and password are used; the password is hashed with SHA-2 as a one-way function. The proposed system shows an improvement in the processing time of uploading and downloading files on the cloud in secure form.
Keywords: cloud computing, data security, SaaS, PaaS, IaaS, Blowfish
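A minimal sketch of the two primitives the model names, using the pycryptodome library: Blowfish in CBC mode for the data exchange and SHA-256 (a member of the SHA-2 family) for integrity and password hashing. The key handling and overall flow here are simplified assumptions, not the authors' full system.

```python
import hashlib
from Crypto.Cipher import Blowfish
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

key = get_random_bytes(16)                      # shared secret (simplified key management)
data = b"file contents to upload"

# Integrity digest with SHA-256 (SHA-2 family)
digest = hashlib.sha256(data).hexdigest()

# Blowfish encryption in CBC mode (block size is 8 bytes)
cipher = Blowfish.new(key, Blowfish.MODE_CBC)
ciphertext = cipher.iv + cipher.encrypt(pad(data, Blowfish.block_size))

# Decryption and integrity check on download
iv, body = ciphertext[:8], ciphertext[8:]
plain = unpad(Blowfish.new(key, Blowfish.MODE_CBC, iv).decrypt(body), Blowfish.block_size)
assert hashlib.sha256(plain).hexdigest() == digest

# One-way password hashing with SHA-256 (a real system should also salt and stretch)
stored_password = hashlib.sha256(b"user-password").hexdigest()
```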
Procedia PDF Downloads 477
24878 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks
Authors: Wang Yichen, Haruka Yamashita
Abstract:
In recent years, in the field of sports, decision making such as selecting members for a game and the strategy of the game based on the analysis of accumulated sports data has been widely attempted. In fact, in the NBA basketball league, where the world's highest-level players gather, teams analyze the data using various statistical techniques to win games. However, it is difficult to analyze the game data for each play, such as ball tracking or the motion of the players, because the situation of the game changes rapidly and the structure of the data is complicated. Therefore, an analysis method for real-time game play data is needed. In this research, we propose an analytical model for "determining the optimal lineup composition" using real-time play data, which is considered to be difficult for all coaches. Because replacing the entire lineup is too complicated, the actual questions for the replacement of players are "whether or not the lineup should be changed" and "whether or not a Small Ball lineup should be adopted". Therefore, we propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, we can accumulate scoring data for each play, which indicates a player's contribution to the game, and the scoring data can be considered as time series data. In order to compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time series data, and an NN (Neural Network) model, which can analyze the situation on the court, to build a prediction model of the score. This model is capable of identifying the current optimal lineup for different situations. In this research, we collected the accumulated NBA data from the 2019-2020 season. We then apply the method to actual basketball play data to verify the reliability of the proposed model.
Keywords: recurrent neural network, players lineup, basketball data, decision making model
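A minimal PyTorch sketch of the kind of hybrid architecture described: an RNN (here an LSTM) encodes the per-play scoring time series, a feedforward network encodes the current game situation, and the two encodings are combined to predict the score. The dimensions, feature choices and the way the branches are merged are illustrative assumptions, not the paper's exact model.

```python
import torch
import torch.nn as nn

class LineupScoreModel(nn.Module):
    def __init__(self, play_features=4, situation_features=10, hidden=32):
        super().__init__()
        self.rnn = nn.LSTM(play_features, hidden, batch_first=True)    # time-series branch
        self.situation_net = nn.Sequential(                            # game-situation branch
            nn.Linear(situation_features, hidden), nn.ReLU())
        self.head = nn.Sequential(nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, plays, situation):
        _, (h, _) = self.rnn(plays)                 # last hidden state summarizes the play history
        combined = torch.cat([h[-1], self.situation_net(situation)], dim=1)
        return self.head(combined)                  # predicted score contribution of a lineup

# Dummy batch: 8 sequences of 20 plays, plus a situation vector (lineup, time left, margin, ...)
model = LineupScoreModel()
pred = model(torch.randn(8, 20, 4), torch.randn(8, 10))
print(pred.shape)  # torch.Size([8, 1])
```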
Procedia PDF Downloads 133
24877 Utilization of Treated Spent Pot Lining By-Product from Primary Aluminum Production in Cement and Concrete
Authors: Hang Tran, Victor Brial, Luca Sorelli, Claudiane Ouellet-Plamondon, David Conciatori, Laurent Birry
Abstract:
Spent pot lining (SPL) is a by-product generated from primary aluminum production. SPL consists of two parts: the first cut is rich in carbonaceous materials, and the second cut is rich in aluminum and silicon oxides. After treatment by the hydrometallurgical Low Caustic Leaching and Liming process, the refractory part of the SPL becomes an inert material, called LCLL ash in this project. LCLL ash was calcined at different temperatures (800 and 1000°C), and the calcined LCLL ash was ground as fine as cement and used to replace part of the cement in concrete production. The effect of LCLL ash on the chemical properties, mechanical properties and fresh behavior of the concrete was evaluated by isothermal calorimetry, compressive tests, and slump tests. These results were compared to the reference mixture.
Keywords: spent pot lining, concrete, cement, compressive strength, calorimetry
Procedia PDF Downloads 218
24876 A Dynamic Equation for Downscaling Surface Air Temperature
Authors: Ch. Surawut, D. Sukawat
Abstract:
In order to utilize results from global climate models, dynamical and statistical downscaling techniques have been developed. For dynamical downscaling, usually a limited-area numerical model is used, with associated high computational cost. This research proposes a dynamic equation for specific space-time regional climate downscaling from the Educational Global Climate Model (EdGCM) for Southeast Asia. The equation is for surface air temperature and provides downscaled values of surface air temperature at any specific location and time without running a regional climate model. In the proposed equation, surface air temperature is approximated from ground temperature, sensible heat flux and 2 m wind speed. Results from the application of the equation show that its errors are smaller than the errors of direct interpolation from EdGCM.
Keywords: dynamic equation, downscaling, inverse distance, weight interpolation
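The abstract does not reproduce the equation itself, so as a point of comparison here is a minimal sketch of the inverse-distance weighting interpolation that the keywords refer to (the baseline against which the dynamic equation is evaluated). The grid coordinates and temperature values are hypothetical.

```python
import numpy as np

def idw_interpolate(xy_known, values, xy_target, power=2.0):
    """Inverse-distance weighted interpolation of coarse-grid values to a target point."""
    d = np.linalg.norm(xy_known - xy_target, axis=1)
    if np.any(d == 0):                      # target coincides with a grid point
        return float(values[np.argmin(d)])
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))

# Hypothetical coarse-grid surface air temperatures (degrees C) at four EdGCM grid points
grid_xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
grid_t2m = np.array([26.4, 27.1, 25.8, 26.9])
print(idw_interpolate(grid_xy, grid_t2m, np.array([0.3, 0.6])))
```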
Procedia PDF Downloads 306
24875 Challenges in Multi-Cloud Storage Systems for Mobile Devices
Authors: Rajeev Kumar Bedi, Jaswinder Singh, Sunil Kumar Gupta
Abstract:
The demand for cloud storage is increasing because users want continuous access to their data. Cloud storage has revolutionized the way users access their data. A lot of cloud storage service providers are available, such as DropBox and G Drive, providing limited free storage; for extra storage, users have to pay money, which acts as a burden on users. To avoid the issue of limited free storage, the concept of Multi Cloud Storage was introduced. In this paper, we discuss the limitations of existing Multi Cloud Storage systems for mobile devices.
Keywords: cloud storage, data privacy, data security, multi cloud storage, mobile devices
Procedia PDF Downloads 699
24874 Satellite Connectivity for Sustainable Mobility
Authors: Roberta Mugellesi Dow
Abstract:
As the climate crisis becomes unignorable, it is imperative that new services are developed addressing not only the needs of customers but also taking into account their impact on the environment. The Telecommunication and Integrated Application (TIA) Directorate of ESA is supporting the green transition with particular attention to sustainable mobility. "Accelerating the shift to sustainable and smart mobility" is at the core of the European Green Deal strategy, which seeks a 90% reduction in related emissions by 2050. Transforming the way that people and goods move is essential to increasing mobility while decreasing environmental impact, and transport must be considered holistically to produce a shared vision of green intermodal mobility. The use of space technologies, integrated with terrestrial technologies, is an enabler of smarter traffic management and increased transport efficiency for automated and connected multimodal mobility. Satellite connectivity, including future 5G networks, and digital technologies such as Digital Twin, AI, Machine Learning, and cloud-based applications are key enablers of sustainable mobility. SatCom is essential to ensure that connectivity is ubiquitously available, even in remote and rural areas, or in case of a failure, through the convergence of terrestrial and SatCom connectivity networks. This is especially crucial when there are risks of network failures or cyber-attacks targeting terrestrial communication. SatCom ensures communication network robustness and resilience. The combination of terrestrial and satellite communication networks is making possible intelligent and ubiquitous V2X systems and PNT services with significantly enhanced reliability and security, hyper-fast wireless access, as well as much more seamless communication coverage. SatNav is essential in providing accurate tracking and tracing capabilities for automated vehicles and in guiding them to target locations. SatNav can also enable location-based services like car sharing applications, parking assistance, and fare payment. In addition to GNSS receivers, wireless connections, radar, lidar, and other installed sensors can enable automated vehicles to monitor their surroundings, to "talk to each other" and with infrastructure in real time, and to respond to changes instantaneously. SatEO can be used to provide the maps required for traffic management, as well as to evaluate conditions on the ground, assess changes and provide key data for monitoring and forecasting air pollution and other important parameters. Earth Observation derived data are used to provide meteorological information such as wind speed and direction, humidity, and other parameters that must be fed into models contributing to traffic management services. The paper will provide examples of services and applications that have been developed aiming to identify innovative solutions and new business models enabled by new digital technologies that engage the space and non-space ecosystems together to deliver value and provide innovative, greener solutions in the mobility sector. Examples include Connected Autonomous Vehicles, electric vehicles, green logistics, and others. The relevant technologies are hybrid SatCom and 5G providing ubiquitous coverage, IoT integration with non-space technologies, as well as navigation, PNT technology, and other space data.
Keywords: sustainability, connectivity, mobility, satellites
Procedia PDF Downloads 133
24873 Talent Management through Integration of Talent Value Chain and Human Capital Analytics Approaches
Authors: Wuttigrai Ngamsirijit
Abstract:
Talent management in today's modern organizations has become data-driven due to the demand for objective human resource decision making and the development of analytics technologies. HR managers have faced obstacles in exploiting data and information to reach effective talent management decisions. These include process-based data and records; insufficient human capital-related measures and metrics; a lack of capability in modeling data in strategic ways; and the time consumed in adding up numbers before decisions can be made. This paper proposes a framework for talent management through the integration of talent value chain and human capital analytics approaches. It encompasses key data, measures, and metrics regarding strategic talent management decisions along the organizational and talent value chain. Moreover, specific predictive and prescriptive models incorporating these data and information are recommended to help managers understand the state of talent, gaps in managing talent and the organization, and ways to develop optimized talent strategies.
Keywords: decision making, human capital analytics, talent management, talent value chain
Procedia PDF Downloads 187
24872 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem
Authors: Ouafa Amira, Jiangshe Zhang
Abstract:
Clustering is an unsupervised machine learning technique; its aim is to extract the data structures, in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means clustering (FCM) is one of the most well-known fuzzy clustering methods. It is based on solving an optimization problem, in which the minimization of a given cost function is studied. This minimization aims to decrease the dissimilarity inside clusters, where the dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function which takes values in the interval [0, 1]. In FCM clustering, the membership degree is constrained by the condition that the sum of a data object's memberships in all clusters must be equal to one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization approaches have played a part in the fuzzy c-means clustering technique; this process introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where in our optimization problem we aim to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because an appropriate membership degree leads to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that our proposed model achieves good accuracy.
Keywords: clustering, fuzzy c-means, regularization, relative entropy
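For reference, a common way to write this kind of objective is sketched below in LaTeX: the classical FCM cost with its sum-to-one constraint, followed by a relative-entropy (Kullback-Leibler) regularized variant of the memberships. The exact form used by the authors may differ; the weight lambda and the prior memberships pi_ik are assumptions of this sketch.

```latex
% Classical FCM objective with the sum-to-one membership constraint
\min_{U,V} \; J_m(U,V) = \sum_{k=1}^{N}\sum_{i=1}^{C} u_{ik}^{\,m}\,\lVert x_k - v_i \rVert^2
\quad \text{s.t.} \quad \sum_{i=1}^{C} u_{ik} = 1,\; u_{ik} \ge 0 .

% A relative-entropy (KL) regularized variant: memberships are pulled toward priors \pi_{ik}
\min_{U,V} \; \sum_{k=1}^{N}\sum_{i=1}^{C} u_{ik}\,\lVert x_k - v_i \rVert^2
\;+\; \lambda \sum_{k=1}^{N}\sum_{i=1}^{C} u_{ik}\,\ln\frac{u_{ik}}{\pi_{ik}} ,
\quad \text{s.t.} \quad \sum_{i=1}^{C} u_{ik} = 1 .
```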
Procedia PDF Downloads 259
24871 Sampled-Data Model Predictive Tracking Control for Mobile Robot
Authors: Wookyong Kwon, Sangmoon Lee
Abstract:
In this paper, a sampled-data model predictive tracking control method is presented for mobile robots, which are modeled as constrained continuous-time linear parameter varying (LPV) systems. The presented sampled-data predictive controller is designed by a linear matrix inequality approach. Based on the input delay approach, a controller design condition is derived by constructing a new Lyapunov function. Finally, a numerical example is given to demonstrate the effectiveness of the presented method.
Keywords: model predictive control, sampled-data control, linear parameter varying systems, LPV
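As a sketch of the setting described, the generic formulation below shows an LPV plant with a piecewise-constant (sampled-data) control input rewritten through the input-delay approach. The specific matrices, scheduling parameters and constraints of the paper are not given in the abstract, so this is only the standard form under those assumptions.

```latex
% Constrained continuous-time LPV plant with scheduling parameter \theta(t)
\dot{x}(t) = A(\theta(t))\,x(t) + B(\theta(t))\,u(t), \qquad u(t) \in \mathcal{U},\; x(t) \in \mathcal{X}.

% Sampled-data state feedback: the input is held constant between sampling instants t_k
u(t) = K\,x(t_k), \qquad t_k \le t < t_{k+1}.

% Input-delay representation: with \tau(t) = t - t_k \in [0, h),
% the closed loop becomes a system with a time-varying input delay
u(t) = K\,x\bigl(t - \tau(t)\bigr).
```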
Procedia PDF Downloads 309
24870 Development of Typical Meteorological Year for Passive Cooling Applications Using World Weather Data
Authors: Nasser A. Al-Azri
Abstract:
The effectiveness of passive cooling techniques is assessed based on bioclimatic charts, which require the typical meteorological year (TMY) for a specified location for their development. However, TMYs are not always available, mainly due to the scarcity of solar radiation records, which are an essential component in developing common TMYs intended for general use. Since solar radiation is not required in the development of the bioclimatic chart, this work suggests developing TMYs based solely on the relevant parameters. This approach improves the accuracy of the developed TMY, since only the relevant parameters are considered, and it also makes the development of the TMY more accessible, since solar radiation data are not used. The paper also discusses the development of the TMY from the raw data available in the NOAA-NCDC archive of world weather data and the construction of bioclimatic charts for some randomly selected locations around the world.
Keywords: bioclimatic charts, passive cooling, TMY, weather data
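The abstract does not spell out the month-selection procedure, so purely as an illustration of how a TMY is commonly assembled, the sketch below scores each candidate month with a Finkelstein-Schafer-style statistic computed only on passive-cooling-relevant parameters; the column names and the choice of dry-bulb temperature and relative humidity as the relevant parameters are assumptions, not the paper's method.

```python
import numpy as np
import pandas as pd

def fs_statistic(month_values: np.ndarray, long_term_values: np.ndarray) -> float:
    """Finkelstein-Schafer statistic: mean absolute difference between the candidate
    month's empirical CDF and the long-term CDF of the same calendar month."""
    grid = np.sort(long_term_values)
    cdf_long = np.searchsorted(np.sort(long_term_values), grid, side="right") / len(long_term_values)
    cdf_month = np.searchsorted(np.sort(month_values), grid, side="right") / len(month_values)
    return float(np.mean(np.abs(cdf_month - cdf_long)))

def pick_typical_month(df: pd.DataFrame, month: int, params=("dry_bulb_c", "rel_humidity")):
    """df has columns: year, month, plus hourly values of the assumed relevant parameters."""
    long_term = df[df["month"] == month]
    scores = {}
    for year, grp in long_term.groupby("year"):
        scores[year] = sum(fs_statistic(grp[p].to_numpy(), long_term[p].to_numpy()) for p in params)
    return min(scores, key=scores.get)   # the year whose month is most "typical"
```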
Procedia PDF Downloads 240
24869 Treatment of Acid Mine Drainage with Modified Fly Ash
Authors: Sukla Saha, Alok Sinha
Abstract:
Acid mine drainage (AMD) is the generation of acidic water from active as well as abandoned mines. AMD is generated by the oxidation of pyrites present in the rock in mining areas. Sulfur-oxidizing bacteria such as Thiobacillus ferrooxidans act as a catalyst in this oxidation process. AMD is characterized by an extremely low pH (2-3) with elevated concentrations of different heavy metals such as Fe, Al, Zn, Mn, Cu and Co, and anions such as sulfate and chloride. AMD contaminates ground water as well as surface water, which leads to the degradation of water quality. Moreover, it has a detrimental effect on aquatic organisms and degrades the environment. In the present study, AMD is treated with fly ash modified with an alkaline agent (NaOH). This modified fly ash (MFA) was experimentally shown to be a very effective neutralizing agent for the treatment of AMD. It was observed that the pH of treated AMD rose from 1.51 to 9.22 with a 100 g/L MFA dose. Approximately 99% removal of Fe, Al, Mn, Cu and Co took place with the same MFA dose. The treated water complies with the effluent discharge standard (IS: 2490-1981).
Keywords: acid mine drainage, heavy metals, modified fly ash, neutralization
Procedia PDF Downloads 151
24868 Geospatial Assessments on Impacts of Land Use Changes and Climate Change in Nigeria Forest Ecosystems
Authors: Samuel O. Akande
Abstract:
Human-induced climate change is likely to have severe consequences for forest ecosystems in Nigeria. Recent discussions and emphasis on issues concerning the environment justify the need for this research, which examined deforestation monitoring in Oban Forest, Nigeria, using remote sensing techniques. Landsat images from the TM (1986), ETM+ (2001) and OLI (2015) sensors were obtained from the Landsat online archive and processed using Erdas Imagine 2014 and ArcGIS 10.3 to obtain the land use/land cover and Normalized Difference Vegetation Index (NDVI) values. Ground control points of deforested areas were collected for validation. It was observed that the forest cover decreased in area by about 689.14 km² between 1986 and 2015. The NDVI was used to determine the vegetation health of the forest and its implications for agricultural sustainability. The result showed that the total percentage of healthy forest cover had reduced to about 45.9% from 1986 to 2015. The results obtained from the analysed questionnaires showed that there was a positive correlation between the causes and effects of deforestation in the study area. The coefficient of determination was calculated as R² ≥ 0.7 to ascertain the level of anthropogenic activities, such as fuelwood harvesting, intensive farming, logging, urbanization, and engineering construction activities, responsible for deforestation in the study area. Similarly, temperature and rainfall data were obtained from the Nigerian Meteorological Agency (NIMET) for the period 1986 to 2015 in the study area. It was observed that there was a significant increase in temperature while rainfall decreased over the study area. Responses from the administered questionnaires also showed that further destruction of the forest ecosystem in Oban Forest could be reduced to a bare minimum if fuelwood harvesting were disallowed. Thus, the projected impacts of climate change on Nigeria's forest ecosystems and environmental stability are better imagined than experienced.
Keywords: deforestation, ecosystems, normalized difference vegetation index, sustainability
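As a small illustration of the NDVI computation underlying this analysis, the following numpy sketch derives NDVI from the red and near-infrared bands of a scene and flags low-vigor pixels. The synthetic reflectance arrays and the 0.3 "healthy" threshold are assumptions for illustration, not values from the study.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), with a small epsilon to avoid division by zero."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / (nir + red + 1e-9)

# Hypothetical reflectance arrays standing in for Landsat red and NIR bands
rng = np.random.default_rng(1)
red_band = rng.uniform(0.02, 0.30, size=(100, 100))
nir_band = rng.uniform(0.05, 0.60, size=(100, 100))

v = ndvi(red_band, nir_band)
healthy_fraction = np.mean(v > 0.3)          # assumed threshold for "healthy" vegetation
print(f"NDVI range: {v.min():.2f} to {v.max():.2f}, healthy fraction: {healthy_fraction:.1%}")
```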
Procedia PDF Downloads 193
24867 Development of Management System of the Experience of Defensive Modeling and Simulation by Data Mining Approach
Authors: D. Nam Kim, D. Jin Kim, Jeonghwan Jeon
Abstract:
Defensive Modeling and Simulation (M&S) is a system which enables training that would otherwise be impracticable, by reducing constraints of time, space and financial resources. The necessity of defensive M&S has been increasing not only for education and training but also for virtual combat. Soldiers who use defensive M&S for education and training obtain empirical knowledge and know-how. However, the knowledge obtained by individual soldiers has not been managed and utilized yet, due to the nature of military organizations: confidentiality and frequent change of members. Therefore, this study aims to develop a management system for the experience of defensive M&S based on a data mining approach. Since the individual empirical knowledge gained through using defensive M&S is both quantitative and qualitative data, a data mining approach is appropriate for dealing with it. This research is expected to be helpful for soldiers and military policy makers.
Keywords: data mining, defensive M&S, management system, knowledge management
Procedia PDF Downloads 255
24866 Timely Detection and Identification of Abnormalities for Process Monitoring
Authors: Hyun-Woo Cho
Abstract:
The detection and identification of abnormalities in multivariate manufacturing processes are quite important in order to maintain good product quality. Unusual behaviors or events encountered during operation can have a serious impact on the process and product quality; thus, they should be detected and identified as soon as possible. This paper focuses on the efficient representation of process measurement data for detecting and identifying abnormalities. The method is effective in representing fault patterns of process data. In addition, it is robust to measurement noise, so that reliable outcomes can be obtained. To evaluate its performance, a simulation process was utilized, and the effect of adopting linear and nonlinear methods in the detection and identification was tested with different simulation data. It was shown that the use of a nonlinear technique produced more satisfactory and more robust results for the simulation data sets. This monitoring framework can help operating personnel detect the occurrence of process abnormalities and identify their assignable causes on an on-line or real-time basis.
Keywords: detection, monitoring, identification, measurement data, multivariate techniques
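The abstract does not name the specific multivariate technique, so as a stand-in the sketch below uses a common baseline for this kind of monitoring: a PCA model fitted on normal operating data, with Hotelling's T² and squared prediction error (SPE) statistics used to flag abnormal samples. The data, component count and control limits are all assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
normal = rng.normal(size=(500, 6))                       # training data from normal operation
test = np.vstack([rng.normal(size=(50, 6)),              # normal test samples
                  rng.normal(loc=3.0, size=(10, 6))])    # samples with a simulated fault

pca = PCA(n_components=3).fit(normal)
scores = pca.transform(test)

# Hotelling's T^2 in the retained subspace
t2 = np.sum(scores ** 2 / pca.explained_variance_, axis=1)
# Squared prediction error (residual) in the discarded subspace
spe = np.sum((test - pca.inverse_transform(scores)) ** 2, axis=1)

# Simple empirical control limits from the training data (assumed 99th percentiles)
train_scores = pca.transform(normal)
t2_lim = np.percentile(np.sum(train_scores ** 2 / pca.explained_variance_, axis=1), 99)
spe_lim = np.percentile(np.sum((normal - pca.inverse_transform(train_scores)) ** 2, axis=1), 99)

alarms = (t2 > t2_lim) | (spe > spe_lim)
print("flagged samples:", np.where(alarms)[0])
```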
Procedia PDF Downloads 236
24865 Imputation of Urban Movement Patterns Using Big Data
Authors: Eusebio Odiari, Mark Birkin, Susan Grant-Muller, Nicolas Malleson
Abstract:
Big data typically refers to consumer datasets revealing some detailed heterogeneity in human behavior, which, if harnessed appropriately, could potentially revolutionize our understanding of the collective phenomena of the physical world. Inadvertent missing values skew these datasets and compromise the validity of the analysis. Here we discuss a conceptually consistent strategy for identifying other relevant datasets to combine with available big data, to plug the gaps and to create a rich, requisite, comprehensive dataset for subsequent analysis. Specifically, emphasis is on how these methodologies can, for the first time, enable the construction of more detailed pictures of passenger demand and drivers of mobility on the railways. These methodologies can predict the influence of changes within the network (like a change in timetable or the impact of a new station), explain local phenomena outside the network (like rail-heading) and the other impacts of urban morphology. Our analysis also reveals that our new imputation data model provides for more equitable revenue sharing among network operators who manage different parts of the integrated UK railways.
Keywords: big data, micro-simulation, mobility, ticketing data, commuters, transport, synthetic, population
Procedia PDF Downloads 231
24864 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory
Authors: Xiaochen Mu
Abstract:
Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. The design of the data property rights system has a hierarchical characteristic aimed at decoupling raw data from data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for the different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This perfectly aligns with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the "bundle of rights" theory, it establishes a specific three-level set of data rights. The paper analyzes the cases Google v. Vidal-Hall, Halliday v. Creation Consumer Finance, Douglas v. Hello! Ltd, Campbell v. MGN and Imerman v. Tchenguiz, and concludes that recognizing property rights over personal data and protecting data under the framework of intellectual property will be beneficial for establishing the tort of misuse of personal information.
Keywords: data protection, property rights, intellectual property, big data
Procedia PDF Downloads 40
24863 The Admissibility of Evidence Obtained in Contravention of the Right to Privacy in a Criminal Trial: A Comparative Study of Poland and Germany
Authors: Konstancja Syller
Abstract:
International law and European regulations remain largely silent on the admissibility of evidence obtained illegally in a criminal trial. Although Article 6 of the European Convention on Human Rights guarantees the right to a fair trial, it does not directly regulate the procedural status of specific sources or means of proof. Therefore, it is the preserve of national legislation and national law enforcement authorities to decide on this matter. In most countries, especially in Germany and Poland, a rather complex normative approach to the issue of proof obtained in violation of the right to privacy is evident, which in practice leads to many interpretive doubts. In Germany, jurisprudence has a significant impact on this matter. The Constitutional Court and the Supreme Court of Germany protect the right to privacy quite firmly - they have ruled on the inadmissibility of obtaining proof in the form of a diary or a journal as a measure protecting a constitutionally guaranteed right. At the same time, however, the Supreme Court is less convinced on the issue of whether materials collected as a result of an inspection, call recordings or listening to premises, carried out in breach of the law, can be used in a criminal trial. Generally speaking, German courts attach crucial importance to the principle of truth and the principle of proportionality, which both enable a judgement to be made as to the possibility of using evidence obtained unlawfully. By comparison, in Poland there is almost no jurisprudence of the Constitutional Tribunal relating directly to the issue of illegal evidence. This is somewhat surprising, considering that the doctrinal analysis of the admissibility of using such proof in a criminal trial is performed in relation to standards resulting from the Constitution. Moreover, a crucial de lege lata legal provision, which allows the use of proof obtained in infringement of the provisions governing criminal proceedings or through a forbidden act, is widely criticised within the legal profession, and therefore many courts give it their own interpretation, at odds with the legislator's intentions. The conclusive aim of this article is the comparison of the two civil-law legal systems' standards regarding the admissibility of evidence obtained in contravention of the right to privacy in a criminal trial, also taking into account EU legislation and judicature.
Keywords: criminal trial, evidence, Germany, right to privacy, Poland
Procedia PDF Downloads 156
24862 The Influence of Housing Choice Vouchers on the Private Rental Market
Authors: Randy D. Colon
Abstract:
Through a freedom of information request, data pertaining to Housing Choice Voucher (HCV) households have been obtained from the Chicago Housing Authority, including rent price and number of bedrooms per HCV household, community area, and zip code from 2013 to the first quarter of 2018. Similar data pertaining to the private rental market will be obtained through public records from the United States Department of Housing and Urban Development. The datasets will be analyzed with statistical and mapping software to investigate the potential link between HCV households and distorted rent prices. Quantitative data will be supplemented by qualitative data to investigate the lived experience of Chicago residents. Qualitative data will be collected at community meetings in the Chicago Englewood neighborhood through participation in neighborhood meetings and informal interviews with residents and community leaders. The qualitative data will be used to gain insight into the lived experience of community leaders and residents of the Englewood neighborhood in relation to housing, the rental market, and HCV. While there is an abundance of quantitative data on this subject, this qualitative data is necessary to capture the lived experience of local residents affected by a changing rental market. This topic reflects concerns voiced by members of the Englewood community, and this study aims to keep the community relevant in its findings.
Keywords: Chicago, housing, housing choice voucher program, housing subsidies, rental market
Procedia PDF Downloads 118
24861 The Dynamic Metadata Schema in Neutron and Photon Communities: A Case Study of X-Ray Photon Correlation Spectroscopy
Authors: Amir Tosson, Mohammad Reza, Christian Gutt
Abstract:
Metadata stands at the forefront of advancing data management practices within research communities, with particular significance in the realms of neutron and photon scattering. This paper introduces a groundbreaking approach—dynamic metadata schema—within the context of X-ray Photon Correlation Spectroscopy (XPCS). XPCS, a potent technique unravelling nanoscale dynamic processes, serves as an illustrative use case to demonstrate how dynamic metadata can revolutionize data acquisition, sharing, and analysis workflows. This paper explores the challenges encountered by the neutron and photon communities in navigating intricate data landscapes and highlights the prowess of dynamic metadata in addressing these hurdles. Our proposed approach empowers researchers to tailor metadata definitions to the evolving demands of experiments, thereby facilitating streamlined data integration, traceability, and collaborative exploration. Through tangible examples from the XPCS domain, we showcase how embracing dynamic metadata standards bestows advantages, enhancing data reproducibility, interoperability, and the diffusion of knowledge. Ultimately, this paper underscores the transformative potential of dynamic metadata, heralding a paradigm shift in data management within the neutron and photon research communities.
Keywords: metadata, FAIR, data analysis, XPCS, IoT
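To make the idea of a dynamic metadata schema more tangible, here is a small Python sketch in which an XPCS experiment starts from a base schema and extends it at run time with experiment-specific fields. All field names and values are hypothetical and not taken from the paper.

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class DynamicMetadata:
    """Base schema plus experiment-specific fields added as the experiment evolves."""
    base: Dict[str, Any] = field(default_factory=dict)
    extensions: Dict[str, Any] = field(default_factory=dict)

    def extend(self, **fields: Any) -> None:
        self.extensions.update(fields)            # tailor the schema to the running experiment

    def as_record(self) -> Dict[str, Any]:
        return {**self.base, **self.extensions}

# Hypothetical XPCS acquisition: core FAIR-style fields plus fields added along the workflow
meta = DynamicMetadata(base={"facility": "example-synchrotron", "technique": "XPCS",
                             "detector": "area-detector", "sample_id": "S-042"})
meta.extend(q_range_nm_inv=[0.01, 0.1], exposure_time_s=0.002)     # added at acquisition time
meta.extend(g2_model="single-exponential", relaxation_time_s=1.3)  # added at analysis time
print(meta.as_record())
```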
Procedia PDF Downloads 62
24860 War and Peace in the Hands of the Media: Review of Global Media Reports and Their Influencing Factors on the Foreign and Security Policy Opinions of the Population
Authors: Ismahane Emma Karima Bessi
Abstract:
Military sociology is largely avoided. Discussing the military as a societal phenomenon and the social dimensions of war and peace is now considered a disgraceful and neglected province of social science, even though it has a major impact on global populations. The first official press war began with William Howard Russell in the mid-19th century. The media are crucial to war and peace. Even Gaius Julius Caesar used his "Commentarii de Bello Gallico" as a media tool to influence his warfare, and Napoleon Bonaparte also knew how important the press was for his actions. This shows how important history is for crisis and war journalism. The one-sided media coverage that every country is confronted with ultimately keeps people from a genuine interest in the truth and leaves gross knowledge gaps, preventing them from forming an accurate picture of reality. There is a need to examine the relationship between the military, war, and the media and to look at the ways in which the media are involved in military conflicts, in this case as an adjunct, i.e., war because of the media. These are promoted or initiated by the following factors: photos intended for the visual manipulation of the population; the pressure from politicians and parties who urge and exert their influence on the global media to share the same pattern of opinion; and, most importantly, the media profiting from the war by listening to popular reactions and passing them on with new visuals. These influence political elections. The media occupy a huge and ubiquitous part of the population's attention. They have the ability to make a country that is in constant crisis and war mode appear in a brilliant light of peace. An article or photograph taken by one journalist has a tremendous impact, as it can shape the minds of millions of people. Most wars currently have state-political reasons. The parties therefore want to have on their side their (potential) voters, who are influenced by the media. The military is loathed or loved. An understanding must be created that a military well trained in the natural sciences, history, and sociology can save or protect the lives of many people. The theoretical methods for this are defined and evaluated in more detail in this paper.
Keywords: war, history, military, science, journalism, crisis
Procedia PDF Downloads 83
24859 Exploring SSD Suitable Allocation Schemes in Compliance with Workload Patterns
Authors: Jae Young Park, Hwansu Jung, Jong Tae Kim
Abstract:
Whether the data have been well parallelized is an important factor in Solid-State Drive (SSD) performance. SSD parallelism is affected by the allocation scheme, which is directly connected to SSD performance. Representative allocation schemes include dynamic allocation and static allocation. Dynamic allocation is more adaptive in exploiting write-operation parallelism, while static allocation is better for read-operation parallelism. Therefore, it is hard to select the appropriate allocation scheme when the workload mixes read and write operations. We simulated a few mixed data patterns and analyzed the results to help make the right choice for better performance. The results show that static allocation is more suitable when the data arrival interval is long enough for prior operations to finish and the environment is continuously read-intensive. Dynamic allocation performs best for write performance and random data patterns.
Keywords: dynamic allocation, NAND flash based SSD, SSD parallelism, static allocation
Procedia PDF Downloads 340
24858 Social Data Aggregator and Locator of Knowledge (STALK)
Authors: Rashmi Raghunandan, Sanjana Shankar, Rakshitha K. Bhat
Abstract:
Social media contributes a vast amount of data and information about individuals to the internet. This project greatly reduces the need for unnecessary manual analysis of large and diverse social media profiles by filtering out and combining the useful information from various social media profiles and eliminating irrelevant data. It differs from existing social media aggregators in that it does not provide a consolidated view of various profiles; instead, it provides consolidated information derived from the subject's posts and other activities. It also allows analysis over multiple profiles and analytics based on several profiles. We strive to provide a query system that gives a natural language answer to questions when a user does not wish to go through the entire profile. The information provided can be filtered according to the different use cases it is used for.
Keywords: social network, analysis, Facebook, LinkedIn, git, big data
Procedia PDF Downloads 444
24857 Effects of the Type of Soil on the Efficiency of a Bioremediation Dispositive by Using Bacterium Hydrocarbonoclastes
Authors: Amel Bouderhem, Aminata Ould El Hadj Khelil, Amina N. Djrarbaoui, Aroussi Aroussi
Abstract:
The present work aims to determine the influence of the nature of the soil on the effectiveness of the biodegradation of hydrocarbons by a mixture of hydrocarbonoclastic bacterial strains. Bioaugmentation and biostimulation processes are applied to soil samples deliberately polluted with crude oil. For the evaluation of the biodegradation of hydrocarbons, the bacterial load, the pH and the total organic carbon are monitored in the different experimental batches. The bacterial load of the sandy soil in the control batches varies from 45.2 × 10⁸ CFU/ml at the beginning of the experiment to 214.07 × 10⁸ CFU/ml at the end, while that of the silty-clay soil varies between 103.31 × 10⁸ CFU/ml and 614.86 × 10⁸ CFU/ml. A strong increase in bacterial biomass was found during the treatment of all samples. This increase is greater in the bioaugmented sand samples, where biomass increased from 63.16 × 10⁸ CFU/ml to 309.68 × 10⁸ CFU/ml, than in the bioaugmented silty-clay soil samples, whose bacterial content evolved from 73.01 × 10⁸ CFU/ml to 631.80 × 10⁸ CFU/ml.
Keywords: pollution, hydrocarbons, bioremediation, hydrocarbonoclastic bacteria, soil, texture
Procedia PDF Downloads 476
24856 Data Integrity between Ministry of Education and Private Schools in the United Arab Emirates
Authors: Rima Shishakly, Mervyn Misajon
Abstract:
Education is similar to other businesses and industries. Achieving data integrity is essential in order to provide significant support for all the stakeholders in the educational sector. Efficient data collection, flow, processing, storage and retrieval are vital in order to deliver successful solutions to the different stakeholders. The Ministry of Education (MOE) in the United Arab Emirates (UAE) has adopted 'Education 2020', a series of five-year plans designed to introduce advanced education management information systems. As part of this program, in 2010 the MOE implemented Student Information Systems (SIS) to manage and monitor the students' data and information flow between the MOE and international private schools in the UAE. This paper discusses data integrity concerns between the MOE and private schools. The paper will clarify the data integrity issues and indicate the challenges that face private schools in the UAE.
Keywords: education management information systems (EMIS), student information system (SIS), United Arab Emirates (UAE), Ministry of Education (MOE), Knowledge and Human Development Authority (KHDA), Abu Dhabi Education Council (ADEC)
Procedia PDF Downloads 222