Search results for: geographic data streams
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25871

24341 Analyzing the Risk Based Approach in General Data Protection Regulation: Basic Challenges Connected with Adapting the Regulation

Authors: Natalia Kalinowska

Abstract:

The adoption of the General Data Protection Regulation (GDPR) concluded four years of work by the European Commission in this area in the European Union. Considering the far-reaching changes that GDPR will introduce, the European legislator envisaged a two-year transitional period: member states and companies have to prepare for the new regulation by 25 May 2018. The idea that marks a new attitude to data protection in the European Union is the risk-based approach. So far, as a result of implementing Directive 95/46/EC, many European countries (including Poland) have adopted very detailed regulations specifying technical and organisational security measures; the Polish implementing rules, for example, indicate even how long a password should be. Under the new approach, from May 2018 controllers and processors will be obliged to apply security measures adequate to the level of risk associated with the specific data processing. Risk in GDPR should be interpreted as the likelihood of a breach of the rights and freedoms of the data subject. According to Recital 76, the likelihood and severity of the risk to the rights and freedoms of the data subject should be determined by reference to the nature, scope, context and purposes of the processing. GDPR does not indicate which security measures should be applied; the recitals give only examples such as anonymisation or encryption. It is the controller's decision which security measures are considered sufficient, and the controller will be responsible if these measures prove insufficient or if the identification of the risk level is incorrect. The regulation indicates a few levels of risk. Recital 76 distinguishes risk and high risk, but some lawyers identify one more category, low risk/no risk, meaning data processing that is unlikely to result in a risk to the rights and freedoms of natural persons. GDPR also lists types of processing for which a controller does not have to evaluate the level of risk because they are classified as 'high risk' processing, e.g. large-scale processing of special categories of data or processing using new technologies. The methodology includes analysis of legal regulations, e.g. GDPR and the Polish Act on the Protection of Personal Data, as well as ICO guidelines and articles concerning the risk-based approach in GDPR. The main conclusion is that an appropriate risk assessment is the key to keeping data safe and avoiding financial penalties. On the one hand, this approach seems more equitable, not only for controllers and processors but also for data subjects; on the other hand, it increases controllers' uncertainty in the assessment, which could have a direct impact on incorrect data protection and potential responsibility for infringement of the regulation.

Keywords: general data protection regulation, personal data protection, privacy protection, risk based approach

Procedia PDF Downloads 254
24340 UAV’s Enhanced Data Collection for Heterogeneous Wireless Sensor Networks

Authors: Kamel Barka, Lyamine Guezouli, Assem Rezki

Abstract:

In this article, we propose a protocol called DataGA-DRF (a protocol for Data collection using a Genetic Algorithm through Dynamic Reference Points) that collects data from heterogeneous wireless sensor networks. The protocol is based on DGA (Destination selection according to Genetic Algorithm) to control the movement of the UAV (unmanned aerial vehicle) between dynamic reference points that virtually represent the sensor node deployment. The dynamics of these points ensure an even distribution of energy consumption among the sensors and also improve network performance. To determine the best points, DataGA-DRF uses a clustering algorithm such as K-Means.
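
As an illustration of the reference-point step described above, the short sketch below clusters sensor node coordinates with K-Means so that the cluster centroids can serve as dynamic reference points for the UAV tour. The node coordinates and cluster count are hypothetical, and the genetic-algorithm route selection of DataGA-DRF is not shown.

```python
# Minimal sketch: derive dynamic reference points from sensor positions with K-Means.
# The coordinates and number of reference points below are illustrative only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
sensor_xy = rng.uniform(0, 1000, size=(200, 2))  # 200 sensor nodes in a 1 km x 1 km field

kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(sensor_xy)
reference_points = kmeans.cluster_centers_        # candidate UAV visiting points
members = kmeans.labels_                          # which node reports to which point

for k, rp in enumerate(reference_points):
    print(f"reference point {k}: ({rp[0]:.1f}, {rp[1]:.1f}) "
          f"serves {np.sum(members == k)} nodes")
```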

Keywords: heterogeneous wireless networks, unmanned aerial vehicles, reference point, collect data, genetic algorithm

Procedia PDF Downloads 88
24339 Circular Tool and Dynamic Approach to Grow the Entrepreneurship of Macroeconomic Metabolism

Authors: Maria Areias, Diogo Simões, Ana Figueiredo, Anishur Rahman, Filipa Figueiredo, João Nunes

Abstract:

It is expected that close to 7 billion people will live in urban areas by 2050. In order to improve the sustainability of territories and their transition towards a circular economy, it is necessary to understand their metabolism and to promote and guide the entrepreneurial response. The study of a macroeconomic metabolism involves the quantification of the inputs, outputs and storage of energy, water, materials and wastes for an urban region, and this quantification and analysis represent an opportunity for promoting green entrepreneurship. There are several methods to assess the environmental impacts of an urban territory, such as human and environmental risk assessment (HERA), life cycle assessment (LCA), ecological footprint assessment (EF), material flow analysis (MFA), physical input-output tables (PIOT), ecological network analysis (ENA) and multicriteria decision analysis (MCDA), among others. However, no consensus exists about which of these assessment methods is best for analysing the sustainability of such complex systems. Taking into account the weaknesses and needs identified, the CiiM - Circular Innovation Inter-Municipality project aims to define a uniform and globally accepted methodology through the integration of various methodologies and dynamic approaches, increasing the efficiency of macroeconomic metabolisms and promoting entrepreneurship in a circular economy. The pilot territory considered in the CiiM project has a total area of 969,428 ha and 897,256 inhabitants (about 41% of the population of the Center Region). The main economic activities in the pilot territory, which contribute to a gross domestic product of 14.4 billion euros, are social support activities for the elderly; construction of buildings; road transport of goods; retailing in supermarkets and hypermarkets; mass production of other garments; inpatient health facilities; and the manufacture of other components and accessories for motor vehicles. The region's business network consists mostly of micro and small companies (similar to the Central Region of Portugal), with a total of 53,708 companies identified in the CIM Region of Coimbra (39 large companies), 28,146 in the CIM Viseu Dão Lafões (22 large companies) and 24,953 in CIM Beiras and Serra da Estrela (13 large companies). The construction of the database took into account data available from the National Institute of Statistics (INE), the General Directorate of Energy and Geology (DGEG), Eurostat, Pordata, the Strategy and Planning Office (GEP), the Portuguese Environment Agency (APA), the Commission for Coordination and Regional Development (CCDR) and the Inter-municipal Communities (CIM), as well as dedicated databases. In addition to the collection of statistical data, it was necessary to identify and characterize the different stakeholder groups in the pilot territory that are relevant to the different metabolism components under analysis. The CiiM project also adds the potential of a Geographic Information System (GIS), so that it is possible to obtain geospatial results for the territorial metabolisms (rural and urban) of the pilot region. This platform will be a powerful tool for visualizing the flows of products and services that occur within the region and will support the stakeholders, improving their circular performance and identifying new business ideas and symbiotic partnerships.

Keywords: circular economy tools, life cycle assessment, macroeconomic metabolism, multicriteria decision analysis, decision support tools, circular entrepreneurship, industrial and regional symbiosis

Procedia PDF Downloads 107
24338 HPPDFIM-HD: Transaction Distortion and Connected Perturbation Approach for Hierarchical Privacy Preserving Distributed Frequent Itemset Mining over Horizontally-Partitioned Dataset

Authors: Fuad Ali Mohammed Al-Yarimi

Abstract:

Many algorithms have been proposed to provide privacy preservation in data mining. These protocols are based on two main approaches: the perturbation approach and the cryptographic approach. The first is based on perturbation of the valuable information, while the second uses cryptographic techniques. The perturbation approach is much more efficient but with reduced accuracy, while the cryptographic approach can provide solutions with perfect accuracy. However, the cryptographic approach is much slower and requires considerable computation and communication overhead. In this paper, a new scalable protocol is proposed which combines the advantages of the perturbation and distortion approaches with the cryptographic approach to perform privacy-preserving distributed frequent itemset mining on horizontally partitioned data. Both the privacy and the performance characteristics of the proposed protocol are studied empirically.
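
The perturbation side of such a protocol can be pictured with the minimal sketch below, in which each site adds Gaussian noise to its local itemset support counts before sharing them. The itemsets, counts and noise scale are assumptions for illustration, and the cryptographic aggregation layer of the proposed protocol is not shown.

```python
# Minimal sketch: Gaussian perturbation of local support counts before aggregation.
# Itemsets, counts and the noise scale are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)

# Local support counts of candidate itemsets at three horizontally partitioned sites.
local_supports = {
    "site_A": {("bread", "milk"): 120, ("bread", "butter"): 45},
    "site_B": {("bread", "milk"): 95,  ("bread", "butter"): 60},
    "site_C": {("bread", "milk"): 80,  ("bread", "butter"): 30},
}
sigma = 5.0  # perturbation strength: larger sigma -> more privacy, less accuracy

def perturb(counts, sigma):
    """Each site distorts its counts with zero-mean Gaussian noise."""
    return {k: max(0.0, v + rng.normal(0.0, sigma)) for k, v in counts.items()}

perturbed = {site: perturb(c, sigma) for site, c in local_supports.items()}

# The miner only ever sees perturbed counts; the global support is their sum.
itemsets = {("bread", "milk"), ("bread", "butter")}
global_support = {i: sum(p[i] for p in perturbed.values()) for i in itemsets}
print(global_support)
```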

Keywords: anonymity data, data mining, distributed frequent itemset mining, gaussian perturbation, perturbation approach, privacy preserving data mining

Procedia PDF Downloads 509
24337 Implementation of Data Science in Field of Homologation

Authors: Shubham Bhonde, Nekzad Doctor, Shashwat Gawande

Abstract:

For the use and the import of keys and ID transmitters as well as body control modules with radio transmission, homologation is required in many countries. The final deliverables of product homologation are certificates. Across the world of homologation, there are approximately 200 certificates per product, most of them in local languages. It is challenging to manually investigate each certificate and extract relevant data, such as the expiry date or approval date. It is most important to obtain accurate data from the certificate, as inaccuracy may lead to missed re-homologation of certificates, resulting in a non-compliance situation. There is scope for automating the reading of certificate data in the field of homologation, and we use deep learning as the tool for this automation. We first trained a model using machine learning by providing all countries' basic data; this model was trained only once, by feeding PDF and JPG files through an ETL process, and eventually the trained model gives more accurate results. As an outcome, we obtain the expiry date and approval date of a certificate with a single click. This will eventually help to implement automation features on a broader level in the database where certificates are stored and will reduce human error to almost negligible levels.
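
The extraction step can be pictured with the rule-based stand-in below, which pulls approval and expiry dates out of OCR text with regular expressions. The certificate text, field labels and date formats are hypothetical, and the deep learning model actually trained by the authors is not reproduced here; this is only a sketch of the kind of output the automated pipeline produces.

```python
# Minimal sketch: rule-based extraction of certificate dates from OCR text.
# This stands in for the trained deep learning model; text and labels are made up.
import re

ocr_text = """
Radio Type Approval Certificate
Approval date: 12.03.2021
Valid until: 11.03.2026
"""

DATE = r"(\d{2}[./-]\d{2}[./-]\d{4})"

def extract_dates(text):
    approval = re.search(r"Approval date[:\s]+" + DATE, text, re.IGNORECASE)
    expiry = re.search(r"(?:Valid until|Expiry date)[:\s]+" + DATE, text, re.IGNORECASE)
    return {
        "approval_date": approval.group(1) if approval else None,
        "expiry_date": expiry.group(1) if expiry else None,
    }

print(extract_dates(ocr_text))  # {'approval_date': '12.03.2021', 'expiry_date': '11.03.2026'}
```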

Keywords: homologation, re-homologation, data science, deep learning, machine learning, ETL (extract transform loading)

Procedia PDF Downloads 165
24336 Additive Weibull Model Using Warranty Claim and Finite Element Analysis Fatigue Analysis

Authors: Kanchan Mondal, Dasharath Koulage, Dattatray Manerikar, Asmita Ghate

Abstract:

This paper presents an additive reliability model using warranty data and Finite Element Analysis (FEA) data. Warranty data for any product give insight into its underlying issues and are often used by reliability engineers to build prediction models that forecast the failure rate of parts. However, there is one major limitation in using warranty data for prediction: warranty periods constitute only a small fraction of the total lifetime of a product, most of the time covering only the infant mortality and useful life zones of a bathtub curve. Predicting with warranty data alone in these cases does not generally provide results with the desired accuracy. The failure rate of a mechanical part is driven by random issues initially and by wear-out or usage-related issues at later stages of the lifetime. For better predictability of the failure rate, one needs to explore the failure rate behavior in the wear-out zone of the bathtub curve. Due to cost and time constraints, it is not always possible to test samples until failure, but FEA fatigue analysis can provide the failure rate behavior of a part well beyond the warranty period, more quickly and at lower cost. In this work, the authors propose an Additive Weibull Model which makes use of both warranty and FEA fatigue analysis data for predicting failure rates. It involves modeling two data sets for a part, one with existing warranty claims and the other with fatigue life data. Hazard-rate-based Weibull estimation is used for modeling the warranty data, whereas S-N-curve-based Weibull parameter estimation is used for the FEA data. The two separate Weibull models' parameters are estimated and combined to form the proposed Additive Weibull Model for prediction.
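
The combination step can be written down compactly: if the warranty-based Weibull has parameters (beta1, eta1) and the FEA-fatigue-based Weibull has (beta2, eta2), the additive model adds their hazard rates, which is equivalent to multiplying the two reliability functions. The sketch below evaluates this additive form for illustrative parameter values only; the actual estimates from the warranty and S-N data are not reported in the abstract.

```python
# Minimal sketch of the additive Weibull model: h(t) = h1(t) + h2(t),
# i.e. R(t) = exp(-(t/eta1)**beta1 - (t/eta2)**beta2).
# Parameter values below are illustrative, not the paper's estimates.
import numpy as np

beta1, eta1 = 0.8, 5.0e4   # warranty-claim Weibull (early/random failures)
beta2, eta2 = 3.5, 2.0e5   # FEA-fatigue Weibull (wear-out failures)

def hazard(t, beta, eta):
    """Weibull hazard rate."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

def reliability_additive(t):
    """Reliability of the additive Weibull model (competing-risk form)."""
    return np.exp(-(t / eta1) ** beta1 - (t / eta2) ** beta2)

t = np.array([1.0e4, 5.0e4, 1.0e5, 2.0e5])  # operating hours, illustrative
for ti in t:
    h = hazard(ti, beta1, eta1) + hazard(ti, beta2, eta2)
    print(f"t={ti:9.0f}  additive hazard={h:.3e}  R(t)={reliability_additive(ti):.4f}")
```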

Keywords: bathtub curve, fatigue, FEA, reliability, warranty, Weibull

Procedia PDF Downloads 78
24335 Eco-Environmental Vulnerability Evaluation in Mountain Regions Using Remote Sensing and Geographical Information System: A Case Study of Pasol Gad Watershed of Garhwal Himalaya, India

Authors: Suresh Kumar Bandooni, Mirana Laishram

Abstract:

The Mid Himalaya of Garhwal Himalaya in Uttarakhand (India) has complex physiographic features with diversified climatic conditions and is therefore susceptible to environmental vulnerability. Natural disasters as well as anthropogenic activities accelerate the rate of environmental vulnerability. To analyse the environmental vulnerability, we used geoinformatics technologies and numerical models, adopting Spatial Principal Component Analysis (SPCA). The model consists of factors such as slope, land use/land cover, soil, forest fire risk, landslide susceptibility zone, human population density and vegetation index. From this model, the environmental vulnerability integrated index (EVSI) is calculated for the Pasol Gad Watershed of Garhwal Himalaya for the years 1987, 2000 and 2013, and the vulnerability is classified into five levels, i.e. very low, low, medium, high and very high, by means of the cluster principle. The results for the eco-environmental vulnerability distribution in the study area show that the medium, high and very high levels dominate, caused mainly by anthropogenic activities and natural disasters. Therefore, proper management for the conservation of resources is an utmost necessity of the present century. It is strongly believed that participation at the community level, along with social workers, institutions and non-governmental organizations (NGOs), has become a must to conserve and protect the environment.
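
A simplified, aspatial version of the index calculation can be sketched as follows: the factor layers are standardized, a principal component analysis is run, and the integrated index is the variance-weighted sum of the leading components, which is then sliced into five levels. The layer values, component count and break method below are assumptions; the study's spatial PCA additionally accounts for location, which this sketch omits.

```python
# Minimal, aspatial sketch of a PCA-based vulnerability index.
# Factor values are synthetic; the study's spatial PCA (SPCA) is not reproduced.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
factors = pd.DataFrame({
    "slope": rng.uniform(0, 45, 500),
    "landuse_score": rng.uniform(0, 1, 500),
    "soil_erodibility": rng.uniform(0, 1, 500),
    "forest_fire_risk": rng.uniform(0, 1, 500),
    "landslide_susceptibility": rng.uniform(0, 1, 500),
    "population_density": rng.uniform(0, 1000, 500),
    "ndvi": rng.uniform(-0.1, 0.9, 500),
})

X = StandardScaler().fit_transform(factors)
pca = PCA(n_components=3).fit(X)
scores = pca.transform(X)

# Integrated index: components weighted by their explained-variance ratio.
index = pd.Series(scores @ pca.explained_variance_ratio_)

# Five vulnerability levels from quantile breaks (the paper uses a cluster principle).
levels = pd.qcut(index, 5, labels=["very low", "low", "medium", "high", "very high"])
print(levels.value_counts())
```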

Keywords: eco-environment vulnerability, spatial principal component analysis, remote sensing, geographic information system, institutions, Himalaya

Procedia PDF Downloads 264
24334 Spatial Analysis of Flood Vulnerability in Highly Urbanized Area: A Case Study in Taipei City

Authors: Liang Weichien

Abstract:

Without adequate information and mitigation plans for natural disasters, the risk to urban populated areas will increase in the future as populations grow, especially in Taiwan. Taiwan is recognized as one of the world's high-risk areas, where an average of 5.7 floods occur per year, and should seek to strengthen coherence and consensus on how cities can plan for floods and climate change. Therefore, this study aims at understanding the vulnerability to flooding in Taipei City, Taiwan, by creating indicators and calculating the vulnerability of each study unit. The indicators were grouped into sensitivity and adaptive capacity based on the definition of vulnerability of the Intergovernmental Panel on Climate Change, and were weighted using Principal Component Analysis. However, current research is based on the assumption that the composition and influence of the indicators are the same in different areas. This disregards spatial correlation and might result in inaccurate explanations of local vulnerability. This study used Geographically Weighted Principal Component Analysis, adding a geographic weighting matrix to capture the different main flood impact characteristics in different areas. Cross-validation and the Akaike Information Criterion were used to select the bandwidth, with a Gaussian pattern as the bandwidth weighting scheme. The ultimate outcome can be used to reduce damage potential by integrating the outputs into local mitigation plans and urban planning.
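
The core of the geographically weighted step can be sketched as a locally weighted PCA: for each study unit, neighbouring units receive Gaussian weights that decay with distance under a chosen bandwidth, and the eigen-decomposition of the weighted covariance matrix gives location-specific loadings. The coordinates, indicator values and bandwidth below are assumptions; bandwidth selection by cross-validation or AIC is not shown.

```python
# Minimal sketch of geographically weighted PCA (GWPCA) with a Gaussian kernel.
# Coordinates, indicators and the bandwidth are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(120, 2))        # study-unit centroids (km)
X = rng.normal(size=(120, 6))                     # standardized vulnerability indicators
bandwidth = 2.0                                   # km; normally chosen by CV or AIC

def local_loadings(i):
    """First local principal component at study unit i."""
    d = np.linalg.norm(coords - coords[i], axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)        # Gaussian weighting scheme
    w /= w.sum()
    mean = w @ X
    Xc = X - mean
    cov = (Xc * w[:, None]).T @ Xc                 # locally weighted covariance matrix
    eigval, eigvec = np.linalg.eigh(cov)
    return eigvec[:, -1], eigval[-1] / eigval.sum()

pc1, var_share = local_loadings(0)
print("local PC1 loadings:", np.round(pc1, 3))
print("local variance explained by PC1:", round(var_share, 3))
```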

Keywords: flood vulnerability, geographically weighted principal components analysis, GWPCA, highly urbanized area, spatial correlation

Procedia PDF Downloads 288
24333 A Preliminary Survey of Mosses, in Galahitiya, Meneripitiya Grama Niladhari Division in Rathnapura District of Sri Lanka

Authors: B. W. U. Deepashika

Abstract:

Rathnapura is located in the south-western part of Sri Lanka, the so-called wet zone. This area receives rainfall mainly from the south-west monsoons from May to September; during the remaining months of the year there is also considerable precipitation due to convective rains. The average annual precipitation is about 4,000 to 5,000 mm, the average temperature varies from 24 to 35 °C, and humidity levels are high. Mosses are one of the important groups of the flora of this region and are very sensitive to climatic changes. Proper exploration and systematic studies of mosses in many parts of the country have not yet been carried out; therefore, launching a study of the bryophyte flora of the country has become very important. The preliminary survey of bryophytes was carried out in Galahitiya, Meneripitiya Grama Niladhari Division, located in Ratnapura district in Sabaragamuwa province, situated 20 kilometres away from Rathnapura; its geographical coordinates are 6° 35' North, 80° 35' East. Samples were collected from different habitats, including home gardens, near wells, a small forest patch, tea land, near a stream, from a non-cemented wall, from a cement wall, and from ditches. Two small quadrats (1 × 1 m²) were used in each study site. Taxa were identified up to the generic level using taxonomic keys produced for different geographic regions of the world. In the present survey, a total of nine mosses belonging to seven families were identified to their generic level: Bryaceae (3) (Bryum sp., Brachymenium sp., Pohlia sp.), Fissidentaceae (1) (Fissidens sp.), Leucobryaceae (1) (Octoblepharum sp.), Calymperaceae (1) (Calymperes sp.), Polytrichaceae (1) (Pogonatum sp.), Pterobryaceae (1) (Pterobryopsis sp.), and Sematophyllaceae (1) (Taxithelium sp.).

Keywords: mosses, wet zone, Sabaragamuwa province, Sri Lanka

Procedia PDF Downloads 230
24332 Data-Focused Digital Transformation for Smart Net-Zero Cities: A Systems Thinking Approach

Authors: Farzaneh Mohammadi Jouzdani, Vahid Javidroozi, Monica Mateo Garcia, Hanifa Shah

Abstract:

The emergence of smart net-zero city development in recent years has attracted significant attention and interest from communities and scholars worldwide as a potential solution to the critical requirement for urban sustainability. This research-in-progress paper investigates the development of smart net-zero cities in order to propose a digital transformation roadmap with a primary focus on data. Employing systems thinking as an underpinning theory, the study advocates the necessity of a holistic strategy for understanding the complex interdependencies and interrelationships that characterise urban systems. The proposed methodology involves an in-depth investigation of current data-driven approaches in the smart net-zero city, followed by predictive analysis methods to evaluate the holistic impact of these approaches on moving toward a smart net-zero city. The study is expected to deliver systemic interventions followed by a data-focused, systemic digital transformation roadmap for smart net-zero cities, contributing to a more holistic understanding of urban sustainability.

Keywords: smart city, net-zero city, digital transformation, systems thinking, data integration, data-driven approach

Procedia PDF Downloads 30
24331 Analysis of an Alternative Data Base for the Estimation of Solar Radiation

Authors: Graciela Soares Marcelli, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Claudineia Brazil, Rafael Haag

Abstract:

The sun is a source of renewable energy, and its use as both a source of heat and light is one of the most promising energy alternatives for the future. To design thermal or photovoltaic systems, a solar irradiation database is necessary. Brazil still has a reduced number of meteorological stations that provide radiometric data, which makes reanalysis systems a quite significant alternative. ERA-Interim is a global atmospheric reanalysis produced by the European Centre for Medium-Range Weather Forecasts (ECMWF). The data assimilation system used for the production of ERA-Interim is based on a 2006 version of the IFS (Cy31r2) and includes a 4-dimensional variational analysis (4D-Var) with a 12-hour analysis window. The spatial resolution of the dataset is approximately 80 km at 60 vertical levels from the surface up to 0.1 hPa. This work aims to make a comparative analysis between the ERA-Interim data and the data observed in the Solarimetric Atlas of the State of Rio Grande do Sul, to verify its applicability in the absence of an observed data network. The results are analysed for a study region to assess the reanalysis as an alternative source for estimating the energy potential of a given region.
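
The comparison between reanalysis and observed irradiation can be summarized with a few standard agreement statistics, as in the hedged sketch below; the monthly values are invented placeholders, not data from the Solarimetric Atlas or ERA-Interim.

```python
# Minimal sketch: agreement statistics between observed and ERA-Interim irradiation.
# The monthly values below are placeholders, not the study's data.
import numpy as np

observed = np.array([6.2, 5.8, 5.1, 4.0, 3.1, 2.6, 2.9, 3.6, 4.3, 5.2, 6.0, 6.4])   # kWh/m2/day
reanalysis = np.array([6.0, 5.9, 5.3, 4.2, 3.0, 2.5, 3.1, 3.5, 4.5, 5.0, 5.8, 6.5])

bias = np.mean(reanalysis - observed)
rmse = np.sqrt(np.mean((reanalysis - observed) ** 2))
corr = np.corrcoef(observed, reanalysis)[0, 1]

print(f"mean bias = {bias:+.2f} kWh/m2/day")
print(f"RMSE      = {rmse:.2f} kWh/m2/day")
print(f"r         = {corr:.3f}")
```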

Keywords: energy potential, reanalyses, renewable energy, solar radiation

Procedia PDF Downloads 167
24330 Big Data Analytics and Public Policy: A Study in Rural India

Authors: Vasantha Gouri Prathapagiri

Abstract:

Innovations in the ICT sector facilitate a better quality of life for citizens across the globe. Countries that facilitate the usage of new ICT techniques, such as big data analytics, find it easier to fulfil the needs of their citizens. Big data is characterised by its volume, variety and speed; analytics involves processing it in a cost-effective way in order to draw conclusions for useful application. Big data analytics also draws on machine learning and artificial intelligence, leading to accurate data presentation useful for public policy making. Hence, using data analytics in public policy making is a proper way to march towards all-round development of any country, since data-driven insights can help the government to take important strategic decisions with regard to the socio-economic development of its country. Developed nations like the UK and the USA are already far ahead on the path of digitization with the support of big data analytics. India is a huge country and is currently on the path of massive digitization, being realised through the Digital India Mission; internet connections per household are rising every year. This transforms into a massive data set that has the potential to improve the public services delivery system into an effective service mechanism for Indian citizens. In fact, when compared to developed nations, this capacity is being underutilized in India, particularly in the administrative system in rural areas. The present paper focuses on the need for the adoption of big data analytics in Indian rural administration and its contribution towards faster development of the country. The results of the research point to the need for increasing awareness and serious capacity building of the government personnel working in rural development with regard to big data analytics and its utility for the development of the country. Multiple public policies are framed and implemented for rural development, yet the results are not as effective as they should be; big data has a major role to play in this context, as it can assist in improving both policy making and implementation aimed at the all-round development of the country.

Keywords: Digital India Mission, public service delivery system, public policy, Indian administration

Procedia PDF Downloads 163
24329 4G LTE Dynamic Pricing: The Drivers, Benefits, and Challenges

Authors: Ahmed Rashad Harb Riad Ismail

Abstract:

The purpose of this research is to study the potential of dynamic pricing if deployed by mobile operators and to analyse its effects from both the operator and the consumer side. The research also presents the recommended conditions for a successful dynamic pricing deployment, the factors identifying the types of markets where dynamic pricing can be effective, and a proposal for a dynamic pricing stakeholders' framework. Currently, the mobile telecommunications industry is witnessing a dramatic growth rate in data consumption, fostered mainly by higher-speed data technology such as 4G LTE and by smart device penetration rates. However, operators' revenue from data services lags behind and is decoupled from this growth in data consumption. Pricing strategy is a key factor affecting this ecosystem. Since the introduction of 4G LTE technology will multiply the pace of data growth, the revenue and usage gap will grow wider if pricing strategies remain constant, risking the sustainability of the ecosystem. Therefore, this research study focuses on dynamic pricing for 4G LTE data services, investigating the drivers, benefits and challenges of 4G LTE dynamic pricing and the feasibility of its deployment in practice from different perspectives, including those of operators, regulators, consumers and telecommunications equipment manufacturers.

Keywords: LTE, dynamic pricing, EPC, research

Procedia PDF Downloads 340
24328 Prediction of Wind Speed by Artificial Neural Networks for Energy Application

Authors: S. Adjiri-Bailiche, S. M. Boudia, H. Daaou, S. Hadouche, A. Benzaoui

Abstract:

In this work, the variation of wind speed with altitude is calculated and described by a neural network model. Measured data, namely wind speed and direction, temperature and humidity at 10 m, are used as inputs, with the wind speed at 50 m above sea level as the target. A comparison between the predicted wind speeds and values extrapolated to 50 m above sea level is performed. The results show that prediction by the artificial neural network method is very accurate.
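
A minimal version of the workflow described above is sketched below: a small neural network maps the 10 m measurements (speed, direction, temperature, humidity) to the 50 m wind speed, and its output is compared with a power-law vertical extrapolation. The synthetic data, network size and shear exponents are assumptions, not the authors' configuration.

```python
# Minimal sketch: neural-network prediction of 50 m wind speed from 10 m measurements,
# compared with a power-law extrapolation. All data and settings are illustrative.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n = 2000
v10 = rng.weibull(2.0, n) * 6.0                      # 10 m wind speed (m/s)
direction = rng.uniform(0, 360, n)                   # wind direction (deg)
temp = rng.normal(20, 5, n)                          # air temperature (degC)
humidity = rng.uniform(20, 90, n)                    # relative humidity (%)
alpha_true = 0.18                                    # shear used to build the synthetic target
v50 = v10 * (50 / 10) ** alpha_true + rng.normal(0, 0.3, n)

X = np.column_stack([v10, direction, temp, humidity])
X_tr, X_te, y_tr, y_te = train_test_split(X, v50, test_size=0.25, random_state=0)

ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(20, 10), max_iter=2000, random_state=0))
ann.fit(X_tr, y_tr)

# Power-law (vertical extrapolation) baseline with an assumed 1/7 shear exponent.
v50_power = X_te[:, 0] * (50 / 10) ** (1 / 7)

rmse = lambda y, p: float(np.sqrt(np.mean((y - p) ** 2)))
print("ANN RMSE:      ", round(rmse(y_te, ann.predict(X_te)), 3))
print("Power-law RMSE:", round(rmse(y_te, v50_power), 3))
```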

Keywords: MATLAB, neural network, power low, vertical extrapolation, wind energy, wind speed

Procedia PDF Downloads 698
24327 A Case Study at PT Bank XYZ on The Role of Compensation, Career Development, and Employee Engagement towards Employee Performance

Authors: Ahmad Badawi Saluy, Novawiguna Kemalasari

Abstract:

This study aims to examine, analyze and explain the impacts of compensation, career development and employee engagement on employee performance, both partially and simultaneously (a case study at PT Bank XYZ). The research design is quantitative descriptive causal research involving 30 respondents. Primary data were obtained from distributed questionnaires and secondary data from journals and books. Data analysis used SmartPLS 3 and consisted of outer model and inner model tests. The results show that compensation, career development and employee engagement each have a positive impact on employee performance partially, and a positive and significant impact simultaneously. The independent variable with the greatest impact is employee engagement.

Keywords: compensation, career development, employee engagement, employee performance

Procedia PDF Downloads 155
24326 Spectral Anomaly Detection and Clustering in Radiological Search

Authors: Thomas L. McCullough, John D. Hague, Marylesa M. Howard, Matthew K. Kiser, Michael A. Mazur, Lance K. McLean, Johanna L. Turk

Abstract:

Radiological search and mapping depends on the successful recognition of anomalies in large data sets which contain varied and dynamic backgrounds. We present a new algorithmic approach for real-time anomaly detection which is resistant to common detector imperfections, avoids the limitations of a source template library and provides immediate, and easily interpretable, user feedback. This algorithm is based on a continuous wavelet transform for variance reduction and evaluates the deviation between a foreground measurement and a local background expectation using methods from linear algebra. We also present a technique for recognizing and visualizing spectrally similar clusters of data. This technique uses Laplacian Eigenmap Manifold Learning to perform dimensional reduction which preserves the geometric "closeness" of the data while maintaining sensitivity to outlying data. We illustrate the utility of both techniques on real-world data sets.
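
Two of the building blocks named above have compact counterparts, sketched here on synthetic spectra: a Ricker-wavelet convolution as a stand-in for the continuous wavelet transform used for variance reduction, and Laplacian Eigenmap manifold learning (scikit-learn's SpectralEmbedding) for dimensional reduction before clustering. The spectra, kernel width and neighbour count are assumptions, and the full background-deviation test of the algorithm is not reproduced.

```python
# Minimal sketch: wavelet smoothing of spectra plus Laplacian Eigenmap embedding
# for clustering spectrally similar measurements. Spectra and settings are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.manifold import SpectralEmbedding

rng = np.random.default_rng(5)
n_spectra, n_channels = 300, 256
spectra = rng.poisson(20, size=(n_spectra, n_channels)).astype(float)
spectra[:40, 100:110] += rng.poisson(60, size=(40, 10))   # a source-like cluster

def ricker(points, a):
    """Ricker ('Mexican hat') wavelet kernel, used here for variance reduction."""
    x = np.arange(points) - (points - 1) / 2.0
    return (1 - (x / a) ** 2) * np.exp(-0.5 * (x / a) ** 2)

kernel = ricker(25, 4.0)
smoothed = np.array([np.convolve(s, kernel, mode="same") for s in spectra])

# Laplacian Eigenmaps (spectral embedding) preserves "closeness" while reducing dimension.
embedding = SpectralEmbedding(n_components=2, n_neighbors=15, random_state=0)
coords = embedding.fit_transform(smoothed)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(coords)
print("cluster sizes:", np.bincount(labels))
```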

Keywords: radiological search, radiological mapping, radioactivity, radiation protection

Procedia PDF Downloads 698
24325 Knowledge Engineering Based Smart Healthcare Solution

Authors: Rhaed Khiati, Muhammad Hanif

Abstract:

In the past decade, smart healthcare systems have been on an upward trajectory, especially with the evolution of hospitals and their increasing reliance on bioinformatics and software specializing in healthcare. Doctors have become more reliant on technology than ever, something that in the past would have been looked down upon, as technology has become imperative in reducing overall costs and improving the quality of patient care. With patient-doctor interactions becoming more necessary and more complicated than ever, systems must be developed while taking into account costs, patient comfort and patient data, among other things. In this work, we propose a smart hospital bed which combines the complexity and big data usage of traditional healthcare systems with the comfort found in soft beds, while taking concerns such as data confidentiality, security and the maintenance of SLAs into account. This work potentially provides users, namely patients and doctors, with seamless interaction with their respective nurses, as well as faster access to up-to-date personal data, including prescriptions and the severity of the condition, in contrast to previous research in the area, which lacks consideration of such provisions.

Keywords: big data, smart healthcare, distributed systems, bioinformatics

Procedia PDF Downloads 202
24324 Transformation of the Business Model in an Occupational Health Care Company Embedded in an Emerging Personal Data Ecosystem: A Case Study in Finland

Authors: Tero Huhtala, Minna Pikkarainen, Saila Saraniemi

Abstract:

Information technology has long been used as an enabler of exchange for goods and services. Services are evolving from generic to personalized, and the reverse use of customer data has been discussed in both academia and industry for the past few years. This article presents the results of an empirical case study in the area of preventive health care services. The primary data were gathered in workshops, in which future personal data-based services were conceptualized by analyzing future scenarios from a business perspective. The aim of this study is to understand business model transformation in emerging personal data ecosystems. The work was done as a case study in the context of occupational healthcare. The results have implications to theory and practice, indicating that adopting personal data management principles requires transformation of the business model, which, if successfully managed, may provide access to more resources, potential to offer better value, and additional customer channels. These advantages correlate with the broadening of the business ecosystem. Expanding the scope of this study to include more actors would improve the validity of the research. The results draw from existing literature and are based on findings from a case study and the economic properties of the healthcare industry in Finland.

Keywords: ecosystem, business model, personal data, preventive healthcare

Procedia PDF Downloads 255
24323 Design of an Instrumentation Setup and Data Acquisition System for a Gas Turbine Engine Using Suitable DAQ Software

Authors: Syed Nauman Bin Asghar Bukhari, Mohtashim Mansoor, Mohammad Nouman

Abstract:

An engine test-bed system is a fundamental tool for measuring the dynamic parameters, economic performance and reliability of an aircraft engine, and its automation and accuracy directly influence the precision of the acquired and analysed data. In this paper, we present the design of a digital data acquisition (DAQ) system for a vintage aircraft engine test bed that lacks the capability of displaying all the analysed parameters at one convenient location (one panel, one screen). Recording such measurements in the vintage test bed is not only time-consuming but also prone to human error. Digitizing such a measurement system requires a DAQ system capable of recording these parameters and displaying them on a one-screen, one-panel monitor. The challenge in designing an upgrade to the vintage system arises from the need to build and integrate a digital measurement system from scratch, with a minimal budget and minimal modifications to the existing vintage system. The proposed design not only displays all the key performance and maintenance parameters of the gas turbine engine for the operator as well as the quality inspector on separate screens but also records the data for further processing and archiving.

Keywords: Gas turbine engine, engine test cell, data acquisition, instrumentation

Procedia PDF Downloads 128
24322 Contextual Analysis of Spekboom (Portulacaria afra) on Air Quality: A Case of Durban, South Africa

Authors: C. Greenstone, R. Hansmann, K. Lawrence

Abstract:

Portulacaria afra, commonly known as Spekboom, is an indigenous South African plant. Spekboom is recognized as medicinal, nutrient-rich, easy to grow and drought-tolerant, and for its climate-change-combating benefits. Durban's air quality currently falls below the acceptable level. Urban greening absorbs air pollutants, which can improve human health; however, urban planning often neglects the effect of air quality on human health. It is therefore imperative to carry out an investigation that quantifies the effect of the Spekboom plant on air quality. Though Spekboom brings numerous advantages to ecosystems, its effect on air quality in context-specific locales remains under-researched. This study seeks to address this gap and bring forward the effect of Spekboom on air quality and on improving human health overall, using locations with specific characteristics ranging from industrial to commercial and residential. The study adopted a field sampling and spatial analysis approach, collecting cuttings of Spekboom from various locations to measure the amount of toxins absorbed by the plant and thereafter using Geographic Information Systems (GIS) to spatially map the location of each sample. Based on the results found, the implementation of Spekboom as an air purifier in areas that have poor air quality can be carried out; Spekboom could even be cultivated around cities, forming a green belt to improve air quality on a much larger scale. Spekboom's low-maintenance characteristics make the entire implementation process quite simple. Proposed future research is to collect yearly cuttings from the same plants in order to obtain a longitudinal, long-term assessment of air quality improvements in areas where Spekboom is implemented.

Keywords: air quality, human health, portulacaria afra, spekboom

Procedia PDF Downloads 23
24321 Water End-Use Classification with Contemporaneous Water-Energy Data and Deep Learning Network

Authors: Khoi A. Nguyen, Rodney A. Stewart, Hong Zhang

Abstract:

‘Water-related energy’ is energy use which is directly or indirectly influenced by changes to water use. Informatics applying a range of mathematical, statistical and rule-based approaches can be used to reveal important information on demand from the available data provided at second, minute or hourly intervals. This study combines these two concepts to improve the current water end-use disaggregation problem by applying a wide range of the most advanced pattern recognition techniques to analyse concurrent high-resolution water-energy consumption data. The results obtained show that the recognition accuracies of all end uses have increased significantly, especially for mechanised categories, including the clothes washer, dishwasher and evaporative air cooler, where over 95% of events were correctly classified.
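
The classification stage can be pictured with the small sketch below, in which a neural network assigns end-use events to categories from simple features of the paired water and energy traces (volume, duration, peak flow, coincident energy). The features, labels and classifier are illustrative stand-ins for the paper's deep learning network and smart-meter data.

```python
# Minimal sketch: classifying water end-use events from paired water-energy features.
# Features, labels and the classifier are illustrative stand-ins for the paper's network.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(11)
classes = ["clothes washer", "dishwasher", "shower", "tap"]

def synth_event(label):
    """Crude synthetic event: [volume L, duration s, peak flow L/min, energy Wh]."""
    base = {"clothes washer": [60, 2400, 8, 500],
            "dishwasher":     [15, 5400, 4, 900],
            "shower":         [50,  420, 9, 250],
            "tap":            [ 2,   30, 5,   1]}[label]
    return np.array(base) * rng.normal(1.0, 0.15, 4)

y = rng.choice(len(classes), 2000)
X = np.array([synth_event(classes[i]) for i in y])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1500, random_state=0))
clf.fit(X_tr, y_tr)
print("accuracy:", round(clf.score(X_te, y_te), 3))
```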

Keywords: deep learning network, smart metering, water end use, water-energy data

Procedia PDF Downloads 308
24320 Strategy Management of Soybean (Glycine max L.) for Dealing with Extreme Climate through the Use of Cropsyst Model

Authors: Aminah Muchdar, Nuraeni, Eddy

Abstract:

The aims of the research are: (1) to verify the CropSyst crop model against experimental field data for soybean and (2) to predict the planting time and potential yield of soybean using the CropSyst model. The research is divided into several stages: (1) a calibration stage conducted in the field from June until September 2015, and (2) a model application stage, where the data obtained from the field calibration are included in the CropSyst model. The data required by the model are climate data, soil data and crop genetic data. The agreement between the field results and the CropSyst simulation is indicated by an Efficiency Index (EF) of 0.939, showing that the CropSyst model is well suited for use, and a calculated RRMSE of 1.922%, showing that the relative prediction error of the simulation against the field results is about 1.92%. It is concluded that the CropSyst-based prediction of soybean planting time is valid for use, and that the appropriate planting time for soybean, mainly on rain-fed land, is at the end of the rainy season; in this study, the first planting time (June 2, 2015) gave the highest production because some rain still fell at that time. The Tanggamus variety is more resistant to delayed planting, as its percentage yield decrease per decade is lower than the average of all varieties.
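
The two agreement statistics quoted above can be reproduced from paired observed and simulated values, as in the sketch below; the formulas follow the usual Nash-Sutcliffe-type efficiency index and relative RMSE definitions, and the sample yields are placeholders rather than the study's data.

```python
# Minimal sketch: Efficiency Index (EF) and relative RMSE (RRMSE) between observed
# and simulated soybean yields. Values are placeholders, not the study's data.
import numpy as np

observed = np.array([2.10, 1.95, 2.30, 1.80, 2.05])   # t/ha, field measurements
simulated = np.array([2.08, 1.99, 2.25, 1.84, 2.02])  # t/ha, CropSyst output

def efficiency_index(obs, sim):
    """Nash-Sutcliffe-type model efficiency: 1 is a perfect fit."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rrmse(obs, sim):
    """Relative RMSE, expressed as a percentage of the observed mean."""
    return 100.0 * np.sqrt(np.mean((obs - sim) ** 2)) / obs.mean()

print(f"EF    = {efficiency_index(observed, simulated):.3f}")
print(f"RRMSE = {rrmse(observed, simulated):.2f} %")
```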

Keywords: soybean, Cropsyst, calibration, efficiency Index, RRMSE

Procedia PDF Downloads 185
24319 Impact of Acculturation Stress and Work-Family Conflict on the Health and Wellbeing of African Immigrants in the US: A Case Study of Ghanaian Immigrants

Authors: Rodlyn Remina Hines

Abstract:

Africans who migrate to the United States (U.S.) go through an acculturation period. When they join the U.S. workforce during the period in which they are still becoming acquainted with the new geographic area and culture, they may experience work-family conflict in addition to the stressors of acculturation. This study investigated the impact of acculturation stress and work-family conflict on the health and wellbeing of African immigrants in the U.S. using a growing immigrant population. Ghanaian immigrants (n = 100; males = 43%, females = 56%) residing in New York and Massachusetts, United States, were recruited via purposive sampling to investigate the role acculturation stress and work-family conflict play in the health and wellbeing of African immigrants in the U.S. Using sociocultural theory, three hypotheses were proposed: (1) high acculturation stress will lead to high work-family conflict, (2) high work-family conflict will result in poor health and wellbeing, and (3) work-family conflict will mediate the relationship between acculturation stress and health and wellbeing. The results fully supported the first hypothesis and partially supported the second and third. High acculturation stress led to high work-family conflict. Although high work-family conflict resulted in poorer health and wellbeing, high family support mediated the relationship between work-family conflict and health and wellbeing. Participants who reported poor health also reported a lack of family or other support, and those who reported strong family or other support also reported excellent health and wellbeing even with high work-family conflict; the latter group did not expect their health and wellbeing to worsen. I draw on these findings to conclude that African immigrants in the U.S. experience significant acculturation stress and work-family conflict, resulting in poor health and wellbeing during their acculturation period if family or other support is lacking. These findings have implications for practitioners and policymakers.

Keywords: acculturation stress, work-family conflict, Ghanaian immigrants, health and wellbeing

Procedia PDF Downloads 93
24318 Low Impact Development Strategies Applied in the Water System Planning in the Coastal Eco-Green Campus

Authors: Ying Li, Zaisheng Hong, Weihong Wang

Abstract:

With the rapid growth in the size of Chinese universities, newly built campuses have been springing up everywhere in recent years. There is a strong push to build eco-green campuses, because the role of higher education institutions in the transition to a more sustainable society has been highlighted for almost three decades. Since a new campus is usually built on an undeveloped site where the basic infrastructure is not complete, finding proper strategies for the planning and design of the campus becomes a primary concern. Low Impact Development (LID) options have been proposed as an alternative approach to make better use of rainwater in the planning and design of an undeveloped site. On the basis of analysing the natural circumstances, geographic conditions and other relevant information, four main LID approaches are coordinated in this study of Hebei Union University: 'Storage', 'Retaining', 'Infiltration' and 'Purification'. 'Storage' refers to a large central lake on the campus for rainwater harvesting. 'Retaining' means rainwater gardens scattered across the campus, also known as bioretention areas, which mimic naturally created pools of water to decrease surface runoff. 'Infiltration' is achieved with grassed swales, which also serve as floodway channels. 'Purification' uses either natural or artificial wetlands to reduce pollutants such as nitrogen and phosphorus in the water body. With the above-mentioned measures for the integrated use of rainwater in the acid and alkali area of the coastal district, an eco-green campus and ecological sustainability will be realized, providing further insight and reference.

Keywords: newly built campus, low impact development, planning design, rainwater reuse

Procedia PDF Downloads 251
24317 Comparing Performance of Neural Network and Decision Tree in Prediction of Myocardial Infarction

Authors: Reza Safdari, Goli Arji, Robab Abdolkhani Maryam zahmatkeshan

Abstract:

Background and purpose: Cardiovascular diseases are among the most common diseases in all societies, and the most important step in minimizing myocardial infarction and its complications is to minimize its risk factors. The amount of medical data is growing rapidly, and medical data mining has great potential for transforming these data into information. Using data mining techniques to generate predictive models for identifying those at risk is very helpful in reducing the effects of the disease. The present study aimed to collect data related to risk factors of myocardial infarction from patients' medical records and to develop predictive models using data mining algorithms. Methods: The present work was an analytical study conducted on a database containing 350 records. The data related to patients admitted to the Shahid Rajaei specialized cardiovascular hospital, Iran, in 2011, and were collected using a four-section data collection form. Data analysis was performed using SPSS and Clementine version 12. Seven predictive algorithms and one algorithm-based model for predicting association rules were applied to the data. Accuracy, precision, sensitivity, specificity, as well as positive and negative predictive values were determined and the final model was obtained. Results: Five parameters, including hypertension, DLP, tobacco smoking, diabetes, and A+ blood group, were the most critical risk factors of myocardial infarction. Among the models, the neural network model was found to have the highest sensitivity, indicating its ability to successfully diagnose the disease. Conclusion: Risk prediction models have great potential in facilitating the management of a patient with a specific disease; health interventions or lifestyle changes can therefore be guided by these models to improve the health of individuals at risk.
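
The evaluation measures listed above (accuracy, precision, sensitivity, specificity, and the positive and negative predictive values) all follow from a 2x2 confusion matrix, as the short sketch below shows; the counts are invented for illustration and are not the study's results.

```python
# Minimal sketch: deriving the reported evaluation measures from a 2x2 confusion matrix.
# The counts are invented for illustration, not the study's results.
tp, fp, fn, tn = 42, 8, 10, 290   # predicted-MI vs actual-MI counts (hypothetical)

accuracy = (tp + tn) / (tp + fp + fn + tn)
precision = tp / (tp + fp)            # also the positive predictive value (PPV)
sensitivity = tp / (tp + fn)          # recall / true positive rate
specificity = tn / (tn + fp)
npv = tn / (tn + fn)                  # negative predictive value

for name, value in [("accuracy", accuracy), ("precision/PPV", precision),
                    ("sensitivity", sensitivity), ("specificity", specificity),
                    ("NPV", npv)]:
    print(f"{name:14s} {value:.3f}")
```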

Keywords: decision trees, neural network, myocardial infarction, Data Mining

Procedia PDF Downloads 433
24316 Sparse Coding Based Classification of Electrocardiography Signals Using Data-Driven Complete Dictionary Learning

Authors: Fuad Noman, Sh-Hussain Salleh, Chee-Ming Ting, Hadri Hussain, Syed Rasul

Abstract:

In this paper, a data-driven dictionary approach is proposed for the automatic detection and classification of cardiovascular abnormalities. The electrocardiography (ECG) signal is represented by trained complete dictionaries that contain prototypes or atoms, avoiding the limitations of pre-defined dictionaries. The data-driven trained dictionaries simply take the ECG signal as input rather than extracting features, in order to study the set of parameters that yield the most descriptive dictionary. The approach inherently learns the complicated morphological changes in the ECG waveform, which is then used to improve the classification. The classification performance was evaluated with ECG data under two different preprocessing environments. In the first, the QT database is baseline-drift corrected and a notch filter removes the 60 Hz power line noise; in the second, the data are further filtered using a fast moving-average smoother. The experimental results on the QT database confirm that the proposed algorithm achieves a classification accuracy of 92%.
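
The core representation step can be sketched with scikit-learn's dictionary learning and sparse coding utilities: a dictionary is learned per class directly from raw beats, and a test beat is assigned to the class whose dictionary reconstructs it with the smallest error. The synthetic beats, dictionary size and sparsity level are assumptions, not the paper's configuration.

```python
# Minimal sketch: per-class dictionary learning and reconstruction-error classification
# of ECG beats. Signals, dictionary size and sparsity level are illustrative only.
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(9)
n_beats, n_samples = 100, 120

def synth_beats(shift):
    """Crude synthetic beat class: a shifted bump plus noise."""
    t = np.linspace(0, 1, n_samples)
    return np.exp(-((t - shift) ** 2) / 0.002) + 0.05 * rng.normal(size=(n_beats, n_samples))

classes = {"normal": synth_beats(0.40), "abnormal": synth_beats(0.55)}

# Learn one dictionary per class directly from the raw beats (no hand-crafted features).
dicts = {}
for label, X in classes.items():
    dl = DictionaryLearning(n_components=30, transform_algorithm="omp",
                            transform_n_nonzero_coefs=5, max_iter=20, random_state=0)
    dl.fit(X)
    dicts[label] = dl

def classify(beat):
    """Assign the class whose dictionary gives the smallest reconstruction error."""
    errors = {}
    for label, dl in dicts.items():
        code = dl.transform(beat[None, :])
        recon = code @ dl.components_
        errors[label] = float(np.linalg.norm(beat - recon))
    return min(errors, key=errors.get)

test_beat = synth_beats(0.55)[0]
print("predicted class:", classify(test_beat))
```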

Keywords: electrocardiogram, dictionary learning, sparse coding, classification

Procedia PDF Downloads 389
24315 A Deletion-Cost Based Fast Compression Algorithm for Linear Vector Data

Authors: Qiuxiao Chen, Yan Hou, Ning Wu

Abstract:

As the classic Douglas-Peucker Algorithm (DPA) has deficiencies such as a high risk of deleting key nodes by mistake, high complexity, time consumption and relatively slow execution speed, a new Deletion-Cost Based Compression Algorithm (DCA) for linear vector data is proposed. For each curve, the basic element of linear vector data, the deletion costs of all its middle nodes are calculated, and the minimum deletion cost is compared with a pre-defined threshold. If the former is greater than or equal to the latter, all remaining nodes are reserved and the curve's compression process is finished; otherwise, the node with the minimal deletion cost is deleted, the deletion costs of its two neighbours are updated, and the same loop is repeated on the compressed curve until termination. Through several comparative experiments using different types of linear vector data, DPA and DCA were compared in terms of compression quality and computing efficiency. The experimental results show that DCA outperforms DPA in both compression accuracy and execution efficiency.
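
The compression loop described above translates almost directly into code. The sketch below uses the area of the triangle formed by a node and its two neighbours as the deletion cost, which is an assumption since the abstract does not define the cost function; the threshold and curve are also illustrative, and neighbour costs are simply recomputed on each pass rather than updated incrementally.

```python
# Minimal sketch of the Deletion-Cost based Compression Algorithm (DCA) loop.
# The triangle-area deletion cost and the threshold are illustrative assumptions.
import numpy as np

def triangle_cost(p_prev, p, p_next):
    """Deletion cost of a middle node: area of the triangle with its two neighbours."""
    return 0.5 * abs((p_next[0] - p_prev[0]) * (p[1] - p_prev[1])
                     - (p[0] - p_prev[0]) * (p_next[1] - p_prev[1]))

def dca_compress(curve, threshold):
    pts = [np.asarray(p, dtype=float) for p in curve]
    while len(pts) > 2:
        costs = [triangle_cost(pts[i - 1], pts[i], pts[i + 1])
                 for i in range(1, len(pts) - 1)]
        i_min = int(np.argmin(costs))
        if costs[i_min] >= threshold:
            break                      # all remaining nodes are reserved
        del pts[i_min + 1]             # delete the node with the minimal deletion cost
        # Neighbour costs are recomputed on the next pass over the compressed curve.
    return np.array(pts)

curve = [(0, 0), (1, 0.1), (2, -0.05), (3, 2.0), (4, 2.1), (5, 0.0)]
print(dca_compress(curve, threshold=0.2))
```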

Keywords: Douglas-Peucker algorithm, linear vector data, compression, deletion cost

Procedia PDF Downloads 254
24314 Mobile Crowdsensing Scheme by Predicting Vehicle Mobility Using Deep Learning Algorithm

Authors: Monojit Manna, Arpan Adhikary

Abstract:

Mobile crowdsensing is an emerging paradigm across the globe in which users are selected to perform sensing tasks. In today's urban cities, mobile vehicles are adopted to perform data sensing and data collection because of their ubiquity and mobility. In this work, we focus on optimally selecting the mobile nodes that can collect the maximum amount of data from urban areas and fulfill the data requirement of a future period within a couple of minutes. We formulate the vehicle selection requirement as a data maximization problem under a budget. The application is implemented as a realistic online platform in which vehicles move continuously in real time, and the data center has the authority to select a set of vehicles immediately. A deep learning-based scheme with the help of mobile vehicles (DLMV) is proposed to collect sensing data from the urban environment. To address the future time horizon, this work proposes an offline deep learning-based algorithm to predict mobility, and then a greedy online algorithm that selects a subset of vehicles for this NP-complete problem under a limited budget. Extensive experimental evaluations are conducted on a real mobility dataset from Rome. The experimental results not only demonstrate the efficiency of our proposed solution but also prove the validity of DLMV, which improves the quantity of collected sensing data compared with other algorithms.
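
The recruitment step can be illustrated with a small greedy sketch: given each candidate vehicle's predicted coverage (which in the paper would come from the mobility-prediction network) and its recruitment cost, vehicles are added in order of marginal data gain per unit cost until the budget is exhausted. The coverage cells, per-cell data amounts, costs and budget below are invented.

```python
# Minimal greedy sketch for budget-constrained vehicle recruitment.
# Predicted coverage cells, per-cell data amounts, costs and the budget are invented;
# in the paper the coverage would come from the deep-learning mobility prediction.
vehicles = {
    "v1": {"cost": 4, "cells": {"A", "B", "C"}},
    "v2": {"cost": 3, "cells": {"B", "C", "D"}},
    "v3": {"cost": 2, "cells": {"E"}},
    "v4": {"cost": 5, "cells": {"A", "D", "E", "F"}},
}
data_per_cell = {"A": 10, "B": 6, "C": 8, "D": 7, "E": 9, "F": 4}
budget = 9

selected, covered, spent = [], set(), 0
while True:
    best, best_ratio = None, 0.0
    for vid, v in vehicles.items():
        if vid in selected or spent + v["cost"] > budget:
            continue
        gain = sum(data_per_cell[c] for c in v["cells"] - covered)  # marginal data gain
        ratio = gain / v["cost"]
        if ratio > best_ratio:
            best, best_ratio = vid, ratio
    if best is None:
        break
    selected.append(best)
    covered |= vehicles[best]["cells"]
    spent += vehicles[best]["cost"]

print("selected:", selected, "| spent:", spent,
      "| data collected:", sum(data_per_cell[c] for c in covered))
```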

Keywords: mobile crowdsensing, deep learning, vehicle recruitment, sensing coverage, data collection

Procedia PDF Downloads 81
24313 Modeling Breathable Particulate Matter Concentrations over Mexico City Retrieved from Landsat 8 Satellite Imagery

Authors: Rodrigo T. Sepulveda-Hirose, Ana B. Carrera-Aguilar, Magnolia G. Martinez-Rivera, Pablo de J. Angeles-Salto, Carlos Herrera-Ventosa

Abstract:

In order to diminish health risks, it is of major importance to monitor air quality; however, this process involves high costs in physical and human resources. In this context, this research is carried out with the main objective of developing a predictive model for concentrations of inhalable particles (PM10-2.5) using remote sensing. To develop the model, satellite images, mainly from Landsat 8, of the Mexico City Metropolitan Area were used. Using historical PM10 and PM2.5 measurements from RAMA (the Automatic Environmental Monitoring Network of Mexico City) and processing the available satellite images, a preliminary model was generated in which it was possible to observe critical opportunity areas that will allow the generation of a robust model. When the preliminary model was applied to the Mexico City scenes, three areas of great interest were identified due to the presumed high concentration of PM: zones with high plant density, bodies of water, and soil without constructions or vegetation. To date, work continues on improving the proposed preliminary model. In addition, a brief analysis was made of six models presented in articles developed in different parts of the world, in order to identify the optimal bands for the generation of a suitable model for Mexico City. It was found that infrared bands have helped to model in other cities, but the effectiveness that these bands could provide for the geographic and climatic conditions of Mexico City is still being evaluated.
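
A first-pass version of such a model is often a simple regression of ground-station PM concentrations on surface reflectance in a few Landsat 8 bands. The sketch below fits that kind of preliminary model on synthetic values; it is only an illustration, since the band combination and coefficients that work for Mexico City are exactly what the study is still evaluating.

```python
# Minimal sketch: preliminary regression of PM10 on Landsat 8 band reflectances.
# Reflectances and PM10 values are synthetic; the actual band selection for
# Mexico City is still under evaluation in the study.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(21)
n = 300
blue, red, nir, swir1 = (rng.uniform(0.02, 0.35, n) for _ in range(4))
pm10 = 80 * blue + 40 * red - 25 * nir + 15 * swir1 + rng.normal(0, 5, n)  # synthetic

X = np.column_stack([blue, red, nir, swir1])
X_tr, X_te, y_tr, y_te = train_test_split(X, pm10, test_size=0.3, random_state=0)

model = LinearRegression().fit(X_tr, y_tr)
print("R^2 on held-out pixels:", round(model.score(X_te, y_te), 3))
print("band coefficients:", np.round(model.coef_, 1))
```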

Keywords: air quality, modeling pollution, particulate matter, remote sensing

Procedia PDF Downloads 160
24312 Co-Synthesis of Exopolysaccharides and Polyhydroxyalkanoates Using Waste Streams: Solid-State Fermentation as an Alternative Approach

Authors: Laura Mejias, Sandra Monteagudo, Oscar Martinez-Avila, Sergio Ponsa

Abstract:

Bioplastics are gaining attention as potential substitutes for conventional fossil-derived plastics and as new components of specialized applications in different industries. They also constitute a sustainable alternative, since they are biodegradable and can be obtained from renewable sources. Thus, agro-industrial wastes appear as potential substrates for bioplastics production using microorganisms, considering that they are a suitable source of nutrients, low-cost, and available worldwide; this approach contributes to the biorefinery and circular economy paradigm. The present study assesses solid-state fermentation (SSF) technology for the co-synthesis of exopolysaccharides (EPS) and polyhydroxyalkanoates (PHA), two attractive biodegradable bioplastics, using the brewing industry leftover brewer's spent grain (BSG). After an initial screening of diverse PHA-producing bacteria, it was found that Burkholderia cepacia presented the highest EPS and PHA production potential via SSF of BSG. Thus, B. cepacia served to identify the most relevant aspects affecting EPS+PHA co-synthesis at lab scale (100 g). Since these are growth-dependent processes, they were monitored online through oxygen consumption using a dynamic respirometric system, while also quantifying biomass production (gravimetrically) and the obtained products (EtOH precipitation for EPS and solid-liquid extraction coupled with GC-FID for PHA). Results showed that B. cepacia grew up to 81 mg per gram of dry BSG (gDM) at 30°C after 96 h, up to 618 times more than the other tested strains. The crude EPS production was 53 mg g⁻¹DM (2% carbohydrates), and purity reached 98% after a dialysis purification step. Simultaneously, B. cepacia accumulated up to 36% (dry basis) of the produced biomass as PHA, mainly composed of polyhydroxybutyrate (P3HB). The maximum PHA production was reached after 48 h with 12.1 mg g⁻¹DM, representing threefold the levels previously reported using SSF. Moisture content and aeration strategy were the most significant variables affecting the simultaneous production. The results show the potential of co-synthesis via SSF as an attractive alternative to enhance bioprocess feasibility for obtaining these bioplastics in residue-based systems.

Keywords: bioplastics, brewer’s spent grain, circular economy, solid-state fermentation, waste to product

Procedia PDF Downloads 147