Search results for: heterogeneous massive data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25998

24378 Perceived and Performed E-Health Literacy: Survey and Simulated Performance Test

Authors: Efrat Neter, Esther Brainin, Orna Baron-Epel

Abstract:

Background: Connecting end-users to newly developed ICT technologies and channeling patients to new products requires an assessment of compatibility. End users' assessment is conveyed in the concept of eHealth literacy. The study examined the association between perceived and performed eHealth literacy (EHL) in an age-heterogeneous sample in Israel. Methods: Participants included 100 Israeli adults (mean age 43, SD 13.9) who were first interviewed by phone and then tested on a computer simulation of health-related Internet tasks. Performed, perceived and evaluated EHL were assessed. Levels of successful task completion represented performed EHL, while evaluated EHL included observed motivation, confidence, and the amount of help provided. Results: The skills of accessing, understanding, appraising, applying, and generating new information had a decreasing successful completion rate as task complexity increased. Generating new information, though highly correlated with all other skills, was the least correlated with the other skills. Perceived and performed EHL were correlated (r=.40, P=.001), while facets of performance (i.e., digital literacy and EHL) were highly correlated (r=.89, P<.001). Participants low and high in performed EHL were significantly different: low performers were older, had attained less education, had used the Internet for less time and perceived themselves as less healthy. They also encountered more difficulties, required more assistance, were less confident in their conduct and exhibited less motivation than high performers. Conclusions: The association in this age-heterogeneous sample was larger than in previous age-homogeneous samples. The moderate association between perceived and performed EHL indicates that the two are associated yet distinct, the latter requiring separate assessment. Features of future rapid performed-EHL assessment tools are discussed.

Keywords: eHealth, health literacy, performance, simulation

Procedia PDF Downloads 236
24377 Probing Scientific Literature Metadata in Search for Climate Services in African Cities

Authors: Zohra Mhedhbi, Meheret Gaston, Sinda Haoues-Jouve, Julia Hidalgo, Pierre Mazzega

Abstract:

In the current context of climate change, supporting national and local stakeholders to make climate-smart decisions is necessary but still underdeveloped in many countries. To overcome this problem, the Global Framework for Climate Services (GFCS), implemented under the aegis of the United Nations in 2012, has initiated many programs in different countries. The GFCS contributes to the development of Climate Services, an instrument based on the production and transfer of scientific climate knowledge for specific users such as citizens, urban planning actors, or agricultural professionals. As cities concentrate economic, social and environmental issues that make them more vulnerable to climate change, the New Urban Agenda (NUA), adopted at Habitat III in October 2016, highlights the importance of paying particular attention to disaster risk management, climate and environmental sustainability and urban resilience. In order to support the implementation of the NUA, the World Meteorological Organization (WMO) has identified the urban dimension as one of its priorities and has proposed a new tool, the Integrated Urban Services (IUS), for more sustainable and resilient cities. In the southern countries, there is a lack of development of climate services, which can be partially explained by problems related to their economic financing. In addition, it is often difficult to make climate change a priority in urban planning, given the more traditional urban challenges these countries face, such as massive poverty, high population growth, etc. Climate services and Integrated Urban Services, particularly in African cities, are expected to contribute to the sustainable development of cities. These tools will help promote the acquisition of meteorological and socio-ecological data on their transformations, encourage coordination between national or local institutions providing various sectoral urban services, and should contribute to the achievement of the objectives defined by the United Nations Framework Convention on Climate Change (UNFCCC) or the Paris Agreement, and the Sustainable Development Goals. To assess the state of the art on these various points, the Web of Science metadatabase is queried. With a query combining the keywords "climate*" and "urban*", more than 24,000 articles are identified, the source of more than 40,000 distinct keywords (including synonyms and acronyms) which finely mesh the conceptual field of research. The occurrence of one or more names of the 514 African cities of more than 100,000 inhabitants, or of African countries, reduces this base to a smaller corpus of about 1410 articles (2990 keywords). 41 countries and 136 African cities are cited. The lexicometric analysis of the metadata of the articles and the analysis of the structural indicators (various centralities) of the networks induced by the co-occurrence of expressions related more specifically to climate services show the development potential of these services, identify the gaps which remain to be filled for their implementation and make it possible to compare the diversity of national and regional situations with regard to these services.
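
A minimal sketch of the kind of keyword co-occurrence network analysis described above, assuming the keywords of each article have already been extracted from the Web of Science metadata; the sample records, city names and keyword lists are illustrative placeholders, and networkx supplies the centrality indicators.

```python
# Sketch of a lexicometric co-occurrence analysis; the records are placeholders.
from itertools import combinations
from collections import Counter
import networkx as nx

records = [
    {"city": "Dakar", "keywords": ["climate services", "urban planning", "heat wave"]},
    {"city": "Nairobi", "keywords": ["climate services", "flood risk", "urban planning"]},
    {"city": "Accra", "keywords": ["flood risk", "urban planning", "adaptation"]},
]

# Count how often two keywords appear together in the same article.
cooccurrence = Counter()
for rec in records:
    for a, b in combinations(sorted(set(rec["keywords"])), 2):
        cooccurrence[(a, b)] += 1

# Build the co-occurrence network and compute simple structural indicators.
G = nx.Graph()
for (a, b), weight in cooccurrence.items():
    G.add_edge(a, b, weight=weight)

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G, weight="weight")
for node in G.nodes:
    print(f"{node}: degree={degree[node]:.2f}, betweenness={betweenness[node]:.2f}")
```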

Keywords: African cities, climate change, climate services, integrated urban services, lexicometry, networks, urban planning, web of science

Procedia PDF Downloads 195
24376 Data-Focused Digital Transformation for Smart Net-Zero Cities: A Systems Thinking Approach

Authors: Farzaneh Mohammadi Jouzdani, Vahid Javidroozi, Monica Mateo Garcia, Hanifa Shah

Abstract:

The emergence of smart net-zero cities in recent years has attracted significant attention and interest from communities and scholars worldwide as a potential response to the critical requirement for urban sustainability. This research-in-progress paper investigates the development of smart net-zero cities in order to propose a digital transformation roadmap for such cities with a primary focus on data. Employing systems thinking as an underpinning theory, the study advocates the necessity of a holistic strategy for understanding the complex interdependencies and interrelationships that characterise urban systems. The proposed methodology involves an in-depth investigation of current data-driven approaches in the smart net-zero city, followed by the use of predictive analysis methods to evaluate the holistic impact of these approaches on moving toward a smart net-zero city. The expected outcome is a systemic intervention followed by a data-focused, systemic digital transformation roadmap for smart net-zero cities, contributing to a more holistic understanding of urban sustainability.

Keywords: smart city, net-zero city, digital transformation, systems thinking, data integration, data-driven approach

Procedia PDF Downloads 23
24375 Impact of Drought in Farm Level Income in the United States

Authors: Anil Giri, Kyle Lovercamp, Sankalp Sharma

Abstract:

Farm level incomes fluctuate significantly due to extreme weather events such as drought. In the light of recent extreme weather events, it is important to understand the implications of floods and droughts for farm level incomes. This study examines the variation in farm level incomes for the United States in drought and non-drought years. Factoring in heterogeneity across enterprises (crop, livestock) and geography, this paper analyzes the impact of drought on farm level incomes at the state and national level. Preliminary results show that the livestock industry is affected more, owing to the lag in production of its input feed crops. Furthermore, preliminary results also show that while crop producers are not affected much by drought, as the price and quantity effects worked in opposite directions with similar magnitude, that was not the case for livestock and horticulture enterprises. Results also showed that even when the price effect was not as high, the crop insurance component helped absorb much of the shock for crop producers. Finally, the effect was heterogeneous across states, with coastal states affected more than the Midwest region. This study should generate a lot of interest from policy makers across the world, as some countries are actively seeking to increase subsidies in their agriculture sector. This study shows how subsidies absorb the shocks for one enterprise more than for others. Finally, this paper should also give economists an insight for designing and recommending policies that are optimal given the production levels of different enterprises in different countries.

Keywords: farm level income, United States, crop, livestock

Procedia PDF Downloads 281
24374 Analysis of an Alternative Data Base for the Estimation of Solar Radiation

Authors: Graciela Soares Marcelli, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Claudineia Brazil, Rafael Haag

Abstract:

The sun is a source of renewable energy, and its use as both a source of heat and light is one of the most promising energy alternatives for the future. To design thermal or photovoltaic systems, a solar irradiation database is necessary. Brazil still has a reduced number of meteorological stations providing radiation measurements, which makes reanalysis systems a quite significant alternative source of data. ERA-Interim is a global atmospheric reanalysis produced by the European Centre for Medium-Range Weather Forecasts (ECMWF). The data assimilation system used for the production of ERA-Interim is based on a 2006 version of the IFS (Cy31r2). The system includes a 4-dimensional variational analysis (4D-Var) with a 12-hour analysis window. The spatial resolution of the dataset is approximately 80 km on 60 vertical levels from the surface up to 0.1 hPa. This work aims to make a comparative analysis between the ERA-Interim data and the data observed in the Solarimetric Atlas of the State of Rio Grande do Sul, to verify its applicability in the absence of an observed data network. The results obtained for the study region are analysed to assess the reanalysis as an alternative for estimating the solar energy potential of a given region.
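
The comparison between reanalysis and observed radiation can be sketched with a few standard agreement statistics; the monthly values below are illustrative placeholders, not data from ERA-Interim or the Solarimetric Atlas.

```python
# Sketch of an agreement check between observed and reanalysis radiation values.
import numpy as np

observed = np.array([18.2, 16.9, 14.8, 11.5, 9.3, 8.1])     # MJ/m^2/day (example)
era_interim = np.array([17.5, 16.1, 15.2, 12.0, 9.8, 8.6])  # MJ/m^2/day (example)

bias = np.mean(era_interim - observed)
rmse = np.sqrt(np.mean((era_interim - observed) ** 2))
corr = np.corrcoef(era_interim, observed)[0, 1]

print(f"bias = {bias:.2f}, RMSE = {rmse:.2f}, r = {corr:.2f}")
```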

Keywords: energy potential, reanalyses, renewable energy, solar radiation

Procedia PDF Downloads 164
24373 Fractional, Component and Morphological Composition of Ambient Air Dust in the Areas of Mining Industry

Authors: S.V. Kleyn, S.Yu. Zagorodnov, А.А. Kokoulina

Abstract:

Technogenic emissions of the mining and processing complex are characterized by a high content of chemical components and solid dust particles. However, each industrial enterprise and the surrounding area have features that require refinement and parameterization. Numerous studies have shown the negative impact of fine dust PM10 and PM2.5 on health, as well as the possibility of absorption of toxic components, including heavy metals, by dust particles. The target of the study was the quantitative assessment of the fractional and particle size composition of ambient air dust in the area affected by a primary magnesium production complex. Also, we tried to describe the morphology features of the dust particles. Study methods. To identify the dust emission sources, an analysis of the production process was carried out. The particulate composition of the emissions was measured using a Microtrac S3500 laser particle analyzer (covered particle size range: 20 nm to 2000 µm). Particle morphology and the component composition were established by electron microscopy using a high-resolution scanning microscope (magnification 5 to 300,000 times) with the X-ray fluorescence device S3400N ‘HITACHI’. The chemical composition was identified by X-ray analysis of the samples using an XRD-700 ‘Shimadzu’ X-ray diffractometer. Determination of the dust pollution level was carried out using model calculations of the dispersion of emissions in the atmosphere. The calculations were verified by instrumental studies. Results of the study. The results demonstrated that the dust emissions of different technical processes are heterogeneous and their fractional structure is complicated. The percentage of particle sizes up to 2.5 micrometres inclusive ranged from 0.00 to 56.70%; particle sizes up to 10 microns inclusive, from 0.00 to 85.60%; particle sizes greater than 10 microns, from 14.40% to 100.00%. During microscopy, the presence of nanoscale particles was detected. The studied dust particles have round, irregular, cubic and integral shapes. The composition of the dust includes magnesium, sodium, potassium, calcium, iron, and chlorine. On the basis of the obtained results, model calculations of the dispersion of dust emissions were performed and the areas of fine dust PM10 and PM2.5 distribution were established. It was found that the emissions of the fine fractions PM10 and PM2.5 are dispersed over large distances and beyond the boundary of the industrial site of the enterprise. The population living near the enterprise is exposed to the risk of diseases associated with dust exposure. The data are transferred to the economic entity to support decisions on measures to minimize the risks. Exposure and health risk indicators are used to provide targeted health and preventive care to citizens living in the area of the facility's negative impact.
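
A minimal sketch of the fractional analysis reported above, assuming a binned particle-size distribution is available from the laser particle analyzer; the bin edges and mass fractions below are illustrative, not measured values.

```python
# Cumulative mass percentages below the PM2.5 and PM10 cut-offs (illustrative bins).
import numpy as np

bin_edges_um = np.array([0.02, 0.1, 0.5, 1.0, 2.5, 5.0, 10.0, 50.0, 2000.0])
mass_percent = np.array([1.0, 4.0, 8.0, 15.0, 20.0, 22.0, 18.0, 12.0])  # sums to 100

cumulative = np.cumsum(mass_percent)
pm25 = cumulative[np.searchsorted(bin_edges_um, 2.5) - 1]
pm10 = cumulative[np.searchsorted(bin_edges_um, 10.0) - 1]
print(f"<= 2.5 um: {pm25:.1f} %   <= 10 um: {pm10:.1f} %   > 10 um: {100 - pm10:.1f} %")
```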

Keywords: dust emissions, exposure assessment, PM10, PM2.5

Procedia PDF Downloads 261
24372 4G LTE Dynamic Pricing: The Drivers, Benefits, and Challenges

Authors: Ahmed Rashad Harb Riad Ismail

Abstract:

The purpose of this research is to study the potential of Dynamic Pricing if deployed by mobile operators and to analyse its effects from both the operator and consumer sides. Furthermore, the research study presents the recommended conditions for successful Dynamic Pricing deployment, the factors identifying the types of markets where Dynamic Pricing can be effective, and a proposal for a Dynamic Pricing stakeholders’ framework. Currently, the mobile telecommunications industry is witnessing a dramatic growth rate in data consumption, fostered mainly by higher-speed data technology such as 4G LTE and by smart device penetration rates. However, operators’ revenue from data services lags behind and is decoupled from this data consumption growth. Pricing strategy is a key factor affecting this ecosystem. Since the introduction of the 4G LTE technology will increase the pace of data growth severalfold, if pricing strategies remain constant, the revenue and usage gap will grow wider, risking the sustainability of the ecosystem. Therefore, this research study is focused on Dynamic Pricing for 4G LTE data services, researching the drivers, benefits and challenges of 4G LTE Dynamic Pricing and the feasibility of its deployment in practice from different perspectives, including the points of view of operators, regulators, consumers, and telecommunications equipment manufacturers.

Keywords: LTE, dynamic pricing, EPC, research

Procedia PDF Downloads 333
24371 Prediction of Wind Speed by Artificial Neural Networks for Energy Application

Authors: S. Adjiri-Bailiche, S. M. Boudia, H. Daaou, S. Hadouche, A. Benzaoui

Abstract:

In this work, the variation of wind speed with altitude is modelled and described using artificial neural networks. Measured wind speed and direction, temperature and humidity at 10 m are used as input data, and the wind speed at 50 m above sea level is used as the target. A comparison between the predicted wind speeds and those extrapolated to 50 m above sea level is performed. The results show that the prediction by the artificial neural network method is very accurate.
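
A minimal sketch of the extrapolation scheme described above, using scikit-learn's MLPRegressor as a stand-in for the authors' network; the synthetic 10 m measurements and the power-law-shaped target are illustrative assumptions, not the measured data.

```python
# Neural-network regression from 10 m measurements to 50 m wind speed (synthetic data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(0, 12, n),     # wind speed at 10 m (m/s)
    rng.uniform(0, 360, n),    # wind direction (deg)
    rng.uniform(5, 35, n),     # temperature (deg C)
    rng.uniform(20, 90, n),    # relative humidity (%)
])
# Synthetic target: 10 m speed scaled by a power-law-like factor plus noise.
y = X[:, 0] * (50 / 10) ** 0.14 + rng.normal(0, 0.3, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000,
                                   random_state=0))
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```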

Keywords: MATLAB, neural network, power law, vertical extrapolation, wind energy, wind speed

Procedia PDF Downloads 693
24370 A Case Study at PT Bank XYZ on The Role of Compensation, Career Development, and Employee Engagement towards Employee Performance

Authors: Ahmad Badawi Saluy, Novawiguna Kemalasari

Abstract:

This study aims to examine, analyze and explain the impacts of compensation, career development and employee engagement on employee performance, both partially and simultaneously (a case study at PT Bank XYZ). The research design is quantitative descriptive causal research involving 30 respondents. Data come from primary and secondary sources: primary data were obtained through questionnaire distribution, and secondary data were obtained from journals and books. Data analysis used model testing with the SmartPLS 3 application, consisting of outer model and inner model tests. The results showed that compensation, career development and employee engagement each have a positive impact on employee performance partially, while together they have a positive and significant impact on employee performance simultaneously. The independent variable with the greatest impact is employee engagement.

Keywords: compensation, career development, employee engagement, employee performance

Procedia PDF Downloads 152
24369 Spectral Anomaly Detection and Clustering in Radiological Search

Authors: Thomas L. McCullough, John D. Hague, Marylesa M. Howard, Matthew K. Kiser, Michael A. Mazur, Lance K. McLean, Johanna L. Turk

Abstract:

Radiological search and mapping depends on the successful recognition of anomalies in large data sets which contain varied and dynamic backgrounds. We present a new algorithmic approach for real-time anomaly detection which is resistant to common detector imperfections, avoids the limitations of a source template library and provides immediate, and easily interpretable, user feedback. This algorithm is based on a continuous wavelet transform for variance reduction and evaluates the deviation between a foreground measurement and a local background expectation using methods from linear algebra. We also present a technique for recognizing and visualizing spectrally similar clusters of data. This technique uses Laplacian Eigenmap Manifold Learning to perform dimensional reduction which preserves the geometric "closeness" of the data while maintaining sensitivity to outlying data. We illustrate the utility of both techniques on real-world data sets.
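
A minimal sketch of the two ingredients described above, with simplifications: a channel-wise deviation score against a local background expectation (standing in for the wavelet-based variance reduction), and scikit-learn's SpectralEmbedding as a Laplacian eigenmap implementation; the Poisson spectra are synthetic placeholders for real gamma-ray histograms.

```python
# (1) Score a foreground spectrum against a background expectation;
# (2) embed spectra with Laplacian eigenmaps to reveal clusters.
import numpy as np
from sklearn.manifold import SpectralEmbedding

rng = np.random.default_rng(1)
n_channels = 128
background = rng.poisson(50, size=(200, n_channels)).astype(float)
foreground = background[0] + 30 * np.exp(-0.5 * ((np.arange(n_channels) - 64) / 2) ** 2)

# Deviation of the foreground from the background expectation, channel by channel,
# in units of the background standard deviation.
mu = background.mean(axis=0)
sigma = background.std(axis=0) + 1e-9
z = (foreground - mu) / sigma
print(f"max channel deviation: {z.max():.1f} sigma")

# Laplacian eigenmap (spectral embedding) of the spectra for cluster visualisation.
embedding = SpectralEmbedding(n_components=2, random_state=0)
coords = embedding.fit_transform(np.vstack([background, foreground]))
print("2-D embedding shape:", coords.shape)
```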

Keywords: radiological search, radiological mapping, radioactivity, radiation protection

Procedia PDF Downloads 696
24368 Knowledge Engineering Based Smart Healthcare Solution

Authors: Rhaed Khiati, Muhammad Hanif

Abstract:

In the past decade, smart healthcare systems have been on an ascendant drift, especially with the evolution of hospitals and their increasing reliance on bioinformatics and software specializing in healthcare. Doctors have become more reliant on technology than ever, something that in the past would have been looked down upon, as technology has become imperative in reducing overall costs and improving the quality of patient care. With patient-doctor interactions becoming more necessary and more complicated than ever, systems must be developed while taking into account costs, patient comfort, and patient data, among other things. In this work, we propose a smart hospital bed, which mixes the complexity and big data usage of traditional healthcare systems with the comfort found in soft beds, while taking concerns such as data confidentiality, security, and the maintenance of service-level agreements (SLAs) into account. This work potentially provides users, namely patients and doctors, with seamless interaction with their respective nurses, as well as faster access to up-to-date personal data, including prescriptions and the severity of the condition, in contrast to previous research in the area, which lacks consideration of such provisions.

Keywords: big data, smart healthcare, distributed systems, bioinformatics

Procedia PDF Downloads 198
24367 Transformation of the Business Model in an Occupational Health Care Company Embedded in an Emerging Personal Data Ecosystem: A Case Study in Finland

Authors: Tero Huhtala, Minna Pikkarainen, Saila Saraniemi

Abstract:

Information technology has long been used as an enabler of exchange for goods and services. Services are evolving from generic to personalized, and the reverse use of customer data has been discussed in both academia and industry for the past few years. This article presents the results of an empirical case study in the area of preventive health care services. The primary data were gathered in workshops, in which future personal data-based services were conceptualized by analyzing future scenarios from a business perspective. The aim of this study is to understand business model transformation in emerging personal data ecosystems. The work was done as a case study in the context of occupational healthcare. The results have implications for theory and practice, indicating that adopting personal data management principles requires transformation of the business model, which, if successfully managed, may provide access to more resources, the potential to offer better value, and additional customer channels. These advantages correlate with the broadening of the business ecosystem. Expanding the scope of this study to include more actors would improve the validity of the research. The results draw from existing literature and are based on findings from a case study and the economic properties of the healthcare industry in Finland.

Keywords: ecosystem, business model, personal data, preventive healthcare

Procedia PDF Downloads 249
24366 The Feasibility of Glycerol Steam Reforming in an Industrial Sized Fixed Bed Reactor Using Computational Fluid Dynamic (CFD) Simulations

Authors: Mahendra Singh, Narasimhareddy Ravuru

Abstract:

For the past decade, the production of biodiesel has significantly increased, along with that of its by-product, glycerol. The massive entry of biodiesel-derived glycerol into the glycerol market has caused its value to plummet. Newer ways to utilize the glycerol by-product must be implemented or the biodiesel industry will face serious economic problems. The biodiesel industry should consider steam reforming glycerol to produce hydrogen gas. Steam reforming is the most efficient way of producing hydrogen, and there is a lot of demand for it in the petroleum and chemical industries. This study investigates the feasibility of glycerol steam reforming in an industrial-sized fixed bed reactor. In this paper, using computational fluid dynamics (CFD) simulations, the extent of the transport resistances that would occur in an industrial-sized reactor can be visualized. An important parameter in reactor design is the size of the catalyst particle. The catalyst cannot be so large that transport resistances are too high, but also not so small that an extraordinary amount of pressure drop occurs. The goal of this paper is to find the catalyst size that, under various flow rates, results in the highest conversion. Computational fluid dynamics simulated the transport resistances, and a pseudo-homogeneous reactor model was used to evaluate the pressure drop and conversion. CFD simulations showed that glycerol steam reforming has strong internal diffusion resistances resulting in extremely low effectiveness factors. In the pseudo-homogeneous reactor model, the highest conversion obtained with a Reynolds number of 100 (29.5 kg/h) was 9.14%, using a 1/6 inch catalyst diameter. Due to the low effectiveness factors and high carbon deposition rates, a fluidized bed is recommended as the appropriate reactor to carry out glycerol steam reforming.
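
The pellet-scale reasoning above can be illustrated with the standard textbook relations: the Thiele modulus and effectiveness factor for a first-order reaction in a spherical pellet, and the Ergun equation for bed pressure drop. All parameter values below (rate constant, effective diffusivity, bed properties) are assumed for illustration and are not taken from the paper.

```python
# Thiele modulus / effectiveness factor for a spherical pellet, plus Ergun pressure drop.
import numpy as np

def effectiveness_factor_sphere(radius, k, d_eff):
    """Effectiveness factor for a first-order reaction in a spherical pellet."""
    phi = radius * np.sqrt(k / d_eff)          # Thiele modulus
    return (3.0 / phi) * (1.0 / np.tanh(phi) - 1.0 / phi), phi

def ergun_pressure_drop(length, u, d_p, eps, mu, rho):
    """Pressure drop (Pa) over a packed bed from the Ergun equation."""
    viscous = 150.0 * mu * (1 - eps) ** 2 * u / (eps ** 3 * d_p ** 2)
    inertial = 1.75 * (1 - eps) * rho * u ** 2 / (eps ** 3 * d_p)
    return (viscous + inertial) * length

# 1/6 inch pellet diameter; rate constant, diffusivity and bed data are assumed.
d_pellet = (1 / 6) * 0.0254                    # m
eta, phi = effectiveness_factor_sphere(d_pellet / 2, k=5.0, d_eff=1e-6)
dp = ergun_pressure_drop(length=2.0, u=0.5, d_p=d_pellet,
                         eps=0.4, mu=3e-5, rho=0.5)
print(f"Thiele modulus = {phi:.1f}, effectiveness factor = {eta:.3f}")
print(f"Ergun pressure drop = {dp / 1000:.1f} kPa")
```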

Keywords: computational fluid dynamic, fixed bed reactor, glycerol, steam reforming, biodiesel

Procedia PDF Downloads 308
24365 The Impact of Food Inflation on Poverty: An Analysis of the Different Households in the Philippines

Authors: Kara Gianina D. Rosas, Jade Emily L. Tong

Abstract:

This study assesses the vulnerability of households to food price shocks. Using the Philippines as a case study, the researchers aim to understand how such shocks can cause food insecurity in different types of households. This paper measures the impact of actual food price changes during the food crisis of 2006-2009 on poverty in relation to households' spatial location. Households are classified as rural or urban and agricultural or non-agricultural. By treating food prices and consumption patterns as heterogeneous, this study differs from conventional poverty analysis, as actual prices are used. Merging the Family Income and Expenditure Survey (FIES) with the Consumer Price Index (CPI) dataset, the researchers were able to determine the effects on poverty measures, specifically the headcount index, poverty gap, and poverty severity. The study finds that, without other interventions, food inflation would lead to a significant increase in the number of households that fall below the poverty threshold, except for households whose income is derived from agricultural activities. It also finds that much of the inflation during these years was fueled by the rise in staple food prices. Essentially, this paper aims to broaden the economic perspective of policymakers with regard to the heterogeneity of the impacts of inflation by analyzing the deeper microeconomic levels of different subgroups. In hopes of finding a solution to lessen the inequality gap between the rural and urban poor, this paper aims to aid policymakers in creating projects targeted towards food insecurity.
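
The three poverty measures named above (headcount index, poverty gap, poverty severity) are conventionally computed as the Foster-Greer-Thorbecke (FGT) family; the sketch below assumes that formulation, and the incomes and poverty line are illustrative numbers only.

```python
# FGT poverty measures: alpha=0 headcount, alpha=1 poverty gap, alpha=2 severity.
import numpy as np

def fgt(income, poverty_line, alpha):
    """Foster-Greer-Thorbecke poverty measure P_alpha."""
    income = np.asarray(income, dtype=float)
    poor = income < poverty_line
    gap = np.where(poor, (poverty_line - income) / poverty_line, 0.0)
    return np.mean(np.where(poor, gap ** alpha, 0.0))

household_income = [140, 90, 210, 60, 300, 110]   # per-capita income (example)
z = 120                                           # poverty threshold (example)

print("headcount index: ", fgt(household_income, z, alpha=0))
print("poverty gap:     ", fgt(household_income, z, alpha=1))
print("poverty severity:", fgt(household_income, z, alpha=2))
```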

Keywords: poverty, food inflation, agricultural households, non-agricultural households, net consumption ratio, urban poor, rural poor, head count index, poverty gap, poverty severity

Procedia PDF Downloads 246
24364 Design of an Instrumentation Setup and Data Acquisition System for a GAS Turbine Engine Using Suitable DAQ Software

Authors: Syed Nauman Bin Asghar Bukhari, Mohtashim Mansoor, Mohammad Nouman

Abstract:

An engine test-bed system is a fundamental tool to measure the dynamic parameters, economic performance, and reliability of an aircraft engine, and its automation and accuracy directly influence the precision of the acquired and analysed data. In this paper, we present the design of a digital Data Acquisition (DAQ) system for a vintage aircraft engine test bed that lacks the capability of displaying all the analyzed parameters at one convenient location (one panel, one screen). Recording such measurements in the vintage test bed is not only time consuming but also prone to human error. Digitizing such a measurement system requires a Data Acquisition (DAQ) system capable of recording these parameters and displaying them on a one-screen, one-panel monitor. The challenge in designing an upgrade to a vintage system arises from the need to build and integrate a digital measurement system from scratch with a minimal budget and minimal modifications to the existing vintage system. The proposed design not only displays all the key performance and maintenance parameters of the gas turbine engine for the operator as well as the quality inspector on separate screens but also records the data for further processing and archiving.

Keywords: Gas turbine engine, engine test cell, data acquisition, instrumentation

Procedia PDF Downloads 123
24363 Cable De-Commissioning of Legacy Accelerators at CERN

Authors: Adya Uluwita, Fernando Pedrosa, Georgi Georgiev, Christian Bernard, Raoul Masterson

Abstract:

CERN is an international organisation, funded by 23 countries, that provides the particle physics community with excellence in particle accelerators and other related facilities. Founded in 1954, CERN has a wide range of accelerators that allow groundbreaking science to be conducted. Accelerators bring particles to high levels of energy and make them collide with each other or with fixed targets, creating specific conditions that are of high interest to physicists. A chain of accelerators is used to ramp up the energy of particles and eventually inject them into the largest and most recent one: the Large Hadron Collider (LHC). Among this chain of machines is, for instance, the Proton Synchrotron, which was started in 1959 and is still in operation. These machines, called "injectors", keep evolving over time, as does the related infrastructure. Massive decommissioning of obsolete cables started in 2015 at CERN in the frame of the so-called "injectors de-cabling project phase 1". Its goal was to replace aging cables and remove unused ones, freeing space for new cables necessary for upgrades and consolidation campaigns. To proceed with the de-cabling, a project co-ordination team was assembled. The start of this project led to the investigation of legacy cables throughout the organisation. The identification of cables stacked over half a century proved to be arduous. Phase 1 of the injectors de-cabling ran for 3 years and was completed successfully after overcoming some difficulties. Phase 2, started 3 years later, focused on improving safety and structure with the introduction of a quality assurance procedure. This paper discusses the implementation of this quality assurance procedure throughout phase 2 of the project and the transition between the two phases. Several hundred kilometres of cable were removed in the injectors complex at CERN from 2015 to 2023.

Keywords: CERN, de-cabling, injectors, quality assurance procedure

Procedia PDF Downloads 93
24362 War Heritage: Different Perceptions of the Dominant Discourse among Visitors to the “Adem Jashari” Memorial Complex in Prekaz

Authors: Zana Llonçari Osmani, Nita Llonçari

Abstract:

In Kosovo, public rhetoric and popular sentiment position the War of 1998-99 (the war) as central to the formation of contemporary Kosovo's national identity. This period was marked by the forced massive displacement of Kosovo Albanians, the destruction of entire settlements, the loss of family members, and the profound emotional trauma experienced by civilians, particularly those who actively participated in the war as members of the Kosovo Liberation Army (KLA). Amidst these profound experiences, the Prekaz Massacre (The Massacre) is widely regarded as the defining event that preceded the final struggles of 1999 and the long-awaited attainment of independence. This study aims to explore how different visitors perceive the dominant discourse at The Memorial, a site dedicated to commemorating the Prekaz Massacre, and to identify the factors that influence their perceptions. The research employs a comprehensive mixed-method approach, combining online surveys, critical discourse analysis of visitor impressions, and content analysis of media representations. The findings of the study highlight the significant role played by original material remains in shaping visitor perceptions of The Memorial in comparison to the curated symbols and figurative representations interspersed throughout the landscape. While the design elements and physical layout of the memorial undeniably hold significance in conveying the memoryscape, there are notable shortcomings in enhancing the overall visitor experience. Visitors are still primarily influenced by the tangible remnants of the war, suggesting that there is room for improvement in how design elements can more effectively contribute to the memorial's narrative and the collective memory of the Prekaz Massacre.

Keywords: critical discourse analysis, memorialisation, national discourse, public rhetoric, war tourism

Procedia PDF Downloads 85
24361 Water End-Use Classification with Contemporaneous Water-Energy Data and Deep Learning Network

Authors: Khoi A. Nguyen, Rodney A. Stewart, Hong Zhang

Abstract:

‘Water-related energy’ is energy use which is directly or indirectly influenced by changes to water use. Informatics, applying a range of mathematical, statistical and rule-based approaches, can be used to reveal important information on demand from the available data provided at second, minute or hourly intervals. This study aims to combine these two concepts to improve current water end-use disaggregation by applying a wide range of the most advanced pattern recognition techniques to analyse concurrent high-resolution water-energy consumption data. The obtained results show that the recognition accuracies of all end uses have significantly increased, especially for mechanised categories, including the clothes washer, dishwasher and evaporative air cooler, where over 95% of events were correctly classified.
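
A minimal sketch of event classification from contemporaneous water-energy features; a small multilayer perceptron stands in here for the deep network used in the study, and the synthetic events (volume, duration, energy per event) are placeholders for real metered data.

```python
# Classify water end-use events from per-event water and energy features (synthetic).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
labels = ["clothes washer", "dishwasher", "shower", "tap"]
X, y = [], []
for i, label in enumerate(labels):
    n = 200
    volume = rng.normal(20 + 15 * i, 3, n)        # litres per event
    duration = rng.normal(5 + 4 * i, 1, n)        # minutes
    energy = rng.normal(0.1 + 0.3 * i, 0.05, n)   # kWh drawn during the event
    X.append(np.column_stack([volume, duration, energy]))
    y += [label] * n
X = np.vstack(X)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000,
                                  random_state=0))
clf.fit(X_train, y_train)
print("classification accuracy:", clf.score(X_test, y_test))
```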

Keywords: deep learning network, smart metering, water end use, water-energy data

Procedia PDF Downloads 306
24360 The Effect of Neurocognitive Exercise Program on ADHD Symptoms, Attention, and Dynamic Balance in Medication Naive Children with ADHD: A Pilot Study

Authors: Nurullah Buker, Ezgi Karagoz, Yesim Salik Sengul, Sevay Alsen Guney, Gokhan Yoyler, Aylin Ozbek

Abstract:

Attention Deficit Hyperactivity Disorder (ADHD) is one of the most common neurodevelopmental disorders, with heterogeneous clinical features such as inattention, hyperactivity, and impulsivity. Many different types of exercise interventions have been employed for children with ADHD. However, previous studies have usually examined the effects of non-specific exercise programs or the short-term effects of exercise. The aim of this study is to investigate the effect of the Neurocognitive Exercise Program (NEP), a structured exercise program derived from Life Kinetik that is relatively new for children with ADHD, on symptoms, attention, and dynamic balance in medication-naive children with ADHD. Fourteen medication-naive children (7-12 years) with ADHD were included in the intervention group. The NEP was performed once a week for ten weeks. The intervention group also performed a structured home exercise program on the other six days of the week, for ten weeks. The children in the intervention group were assessed at baseline, in the third month, in the sixth month, and in the twelfth month regarding ADHD-related symptoms, attention, and dynamic balance. Fifteen age-matched typically developing children were assessed once to establish normative values. The Hyperactivity-Impulsivity score and dynamic balance were found to improve after the NEP in the ADHD group in the 3rd month (p<0.05). In addition, these results were similar for both groups after the NEP and at the end of the 12th month (p>0.05). The NEP may provide beneficial effects on hyperactivity-impulsivity, oppositional defiance, and dynamic balance in children with ADHD, and the improvements may be maintained in the long term.

Keywords: ADHD, attention problems, dynamic balance, neurocognitive exercise

Procedia PDF Downloads 81
24359 Strategy Management of Soybean (Glycine max L.) for Dealing with Extreme Climate through the Use of Cropsyst Model

Authors: Aminah Muchdar, Nuraeni, Eddy

Abstract:

The aims of the research are: (1) to verify the CropSyst crop model against experimental field data for soybean and (2) to predict the planting time and potential yield of soybean using the CropSyst model. The research is divided into several stages: (1) the calibration stage, conducted in the field from June until September 2015; and (2) the model application stage, in which the data obtained from the field calibration are included in the CropSyst model. The data required by the model are climate data, soil data, and crop genetic data. The agreement between the results obtained in the field and the CropSyst simulation is indicated by an Efficiency Index (EF) of 0.939, showing that the CropSyst model performs well. The calculated RRMSE of 1.922% shows that the prediction error of the simulation relative to the results obtained in the field is about 1.92%. It is concluded that the CropSyst-based prediction of soybean planting time is valid for use, and that the appropriate planting time for soybean, mainly on rain-fed land, is at the end of the rainy season; in this study, the first planting time (June 2, 2015) gave the highest production, because at that time there was still some rain. The Tanggamus variety is more resistant to delayed planting, as its percentage yield decrease per decade of delay is lower than the average of all varieties.
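
The two goodness-of-fit statistics quoted above can be reproduced in a few lines; the sketch assumes the Efficiency Index in its Nash-Sutcliffe form and RRMSE as the RMSE normalised by the observed mean, and the observed/simulated yields are illustrative numbers only.

```python
# Efficiency Index (Nash-Sutcliffe form) and relative RMSE between observations and simulation.
import numpy as np

def efficiency_index(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

def rrmse(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    rmse = np.sqrt(np.mean((observed - simulated) ** 2))
    return 100 * rmse / observed.mean()

obs = [2.31, 2.10, 1.95, 1.72, 1.60]   # observed soybean yield, t/ha (example)
sim = [2.28, 2.15, 1.90, 1.75, 1.58]   # simulated yield, t/ha (example)
print(f"EF = {efficiency_index(obs, sim):.3f}")
print(f"RRMSE = {rrmse(obs, sim):.2f} %")
```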

Keywords: soybean, CropSyst, calibration, efficiency index, RRMSE

Procedia PDF Downloads 180
24358 Aluminum Matrix Composites Reinforced by Glassy Carbon-Titanium Spatial Structure

Authors: B. Hekner, J. Myalski, P. Wrzesniowski

Abstract:

This study presents aluminum matrix composites reinforced by glassy carbon (GC) and titanium (Ti). In the first step, the heterophase (GC+Ti), spatial (skeleton-like) form of reinforcement was obtained via our own method: a polyurethane foam (with a spatial, open-cell structure) covered by a suspension of Ti particles in phenolic resin was pyrolyzed. In the second step, the prepared heterogeneous foams were infiltrated by an aluminium alloy. The manufactured composites are intended for industrial application, especially as a material used in the tribological field. From this point of view, the glassy carbon was applied to stabilise the coefficient of friction at the required value of 0.6 and to reduce wear. Furthermore, wear can be limited by the application of the titanium phase, which exhibits high mechanical properties. Moreover, the fabrication of a thin titanium layer on the carbon skeleton reduces the contact between the aluminium alloy and carbon and thus limits the creation of the aluminium carbide phase. However, the main modification involves the manufacturing of the reinforcement in the form of a 3D skeleton foam. This kind of reinforcement offers a few important advantages compared to the classical form of reinforcement (particles): the possibility to control the homogeneity of the reinforcement phase in the composite material; a relatively simple composite manufacturing technique (infiltration); the possibility to apply the reinforcement only in required places of the material; strict control of the phase composition; and high quality of bonding between the components of the material. This research is funded by NCN under grant UMO-2016/23/N/ST8/00994.

Keywords: metal matrix composites, MMC, glassy carbon, heterophase composites, tribological application

Procedia PDF Downloads 118
24357 Comparing Performance of Neural Network and Decision Tree in Prediction of Myocardial Infarction

Authors: Reza Safdari, Goli Arji, Robab Abdolkhani Maryam zahmatkeshan

Abstract:

Background and purpose: Cardiovascular diseases are among the most common diseases in all societies. The most important step in minimizing myocardial infarction and its complications is to minimize its risk factors. The amount of medical data is growing rapidly. Medical data mining has a great potential for transforming these data into information. Using data mining techniques to generate predictive models for identifying those at risk is very helpful for reducing the effects of the disease. The present study aimed to collect data related to risk factors of myocardial infarction from patients' medical records and to develop predictive models using data mining algorithms. Methods: The present work was an analytical study conducted on a database containing 350 records. Data were related to patients admitted to Shahid Rajaei specialized cardiovascular hospital, Iran, in 2011. Data were collected using a four-sectioned data collection form. Data analysis was performed using SPSS and Clementine version 12. Seven predictive algorithms and one algorithm-based model for predicting association rules were applied to the data. Accuracy, precision, sensitivity, specificity, as well as positive and negative predictive values were determined and the final model was obtained. Results: Five parameters, including hypertension, DLP, tobacco smoking, diabetes, and A+ blood group, were the most critical risk factors of myocardial infarction. Among the models, the neural network model was found to have the highest sensitivity, indicating its ability to successfully diagnose the disease. Conclusion: Risk prediction models have great potential for facilitating the management of a patient with a specific disease. Therefore, health interventions or changes in life style can be conducted based on these models to improve the health conditions of the individuals at risk.
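
The evaluation metrics listed above follow directly from a 2x2 confusion matrix; the sketch below uses illustrative counts, not the study's results.

```python
# Accuracy, precision, sensitivity, specificity, PPV and NPV from a confusion matrix.
def binary_metrics(tp, fp, fn, tn):
    return {
        "accuracy":    (tp + tn) / (tp + fp + fn + tn),
        "precision":   tp / (tp + fp),            # same as PPV
        "sensitivity": tp / (tp + fn),            # recall / true positive rate
        "specificity": tn / (tn + fp),
        "ppv":         tp / (tp + fp),
        "npv":         tn / (tn + fn),
    }

# Example: a classifier tested on 350 records with 90 true MI cases (illustrative).
metrics = binary_metrics(tp=78, fp=22, fn=12, tn=238)
for name, value in metrics.items():
    print(f"{name}: {value:.3f}")
```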

Keywords: decision trees, neural network, myocardial infarction, data mining

Procedia PDF Downloads 429
24356 Thermal Ageing of a 316 Nb Stainless Steel: From Mechanical and Microstructural Analyses to Thermal Ageing Models for Long Time Prediction

Authors: Julien Monnier, Isabelle Mouton, Francois Buy, Adrien Michel, Sylvain Ringeval, Joel Malaplate, Caroline Toffolon, Bernard Marini, Audrey Lechartier

Abstract:

Chosen to design and assemble massive components for the nuclear industry, the 316 Nb austenitic stainless steel (also called 316 Nb) suits this function well thanks to its mechanical, heat and corrosion handling properties. However, these properties might change during the steel's life due to thermal ageing causing changes within its microstructure. Our main purpose is to determine whether the 316 Nb will keep its mechanical properties after exposure to industrial temperatures (around 300 °C) over a long period of time (< 10 years). The 316 Nb is composed of different phases, which are austenite as the main phase, niobium carbides, and ferrite remaining from the ferrite-to-austenite transformation during the process. Our purpose is to understand thermal ageing effects on the material microstructure and properties and to propose a model predicting the evolution of 316 Nb properties as a function of temperature and time. To do so, based on the Fe-Cr and 316 Nb phase diagrams, we studied the thermal ageing of 316 Nb steel alloys (1%v of ferrite) and welds (10%v of ferrite) for various temperatures (350, 400, and 450 °C) and ageing times (from 1 to 10,000 hours). Higher temperatures have been chosen to reduce thermal treatment time by exploiting the kinetic effect of temperature on 316 Nb ageing without modifying reaction mechanisms. Our results from early ageing times show no effect on the steel's global properties linked to austenite stability, but an increase of ferrite hardness during thermal ageing has been observed. It has been shown that austenite's crystalline structure (fcc) grants it thermal stability; however, ferrite's crystalline structure (bcc) favours iron-chromium demixing and the formation of iron-rich and chromium-rich phases within the ferrite. Observations of thermal ageing effects on the ferrite's microstructure were necessary to understand the changes caused by the thermal treatment. Analyses have been performed using different techniques such as Atom Probe Tomography (APT) and Differential Scanning Calorimetry (DSC). A demixing of the alloy's elements leading to the formation of iron-rich (α phase, bcc structure), chromium-rich (α’ phase, bcc structure), and nickel-rich (fcc structure) phases within the ferrite has been observed and associated with the increase of the ferrite's hardness. APT results provide information about the phases' volume fractions and compositions, allowing hardness measurements to be associated with the volume fractions of the different phases and a method to be set up for calculating the growth rate of α’ and nickel-rich particles depending on temperature. The same methodology has been applied to the DSC results, which allowed us to measure the enthalpy of α’ phase dissolution between 500 and 600 °C. To summarise, we started from mechanical and macroscopic measurements and explained the results through a microstructural study. The data obtained have been matched to CALPHAD models' predictions and used to improve these calculations and to employ them to predict the change in 316 Nb properties during the industrial process.

Keywords: stainless steel characterization, atom probe tomography APT, vickers hardness, differential scanning calorimetry DSC, thermal ageing

Procedia PDF Downloads 93
24355 Sparse Coding Based Classification of Electrocardiography Signals Using Data-Driven Complete Dictionary Learning

Authors: Fuad Noman, Sh-Hussain Salleh, Chee-Ming Ting, Hadri Hussain, Syed Rasul

Abstract:

In this paper, a data-driven dictionary approach is proposed for the automatic detection and classification of cardiovascular abnormalities. The electrocardiography (ECG) signal is represented by trained complete dictionaries that contain prototypes or atoms, avoiding the limitations of pre-defined dictionaries. The data-driven trained dictionaries simply take the ECG signal as input rather than extracted features, in order to study the set of parameters that yield the most descriptive dictionary. The approach inherently learns the complicated morphological changes in the ECG waveform, which are then used to improve the classification. The classification performance was evaluated with ECG data under two different preprocessing environments. In the first category, the QT database is baseline-drift corrected and a notch filter removes the 60 Hz power line noise. In the second category, the data are further filtered using a fast moving-average smoother. The experimental results on the QT database confirm that our proposed algorithm achieves a classification accuracy of 92%.
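
A minimal sketch of the sparse-coding pipeline described above: learn a data-driven dictionary from raw ECG segments, encode each segment sparsely, and classify the codes. Synthetic beats stand in for QT-database records, a "complete" dictionary is taken to mean as many atoms as signal dimensions, and a linear SVM stands in for the paper's classifier.

```python
# Dictionary learning + sparse coding + classification on synthetic ECG-like beats.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning, sparse_encode
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_samples, beat_len = 600, 64
t = np.linspace(0, 1, beat_len)
normal = np.exp(-0.5 * ((t - 0.5) / 0.05) ** 2)          # sharp QRS-like peak
abnormal = np.exp(-0.5 * ((t - 0.5) / 0.12) ** 2)        # widened complex
X = np.vstack([normal + 0.1 * rng.standard_normal((n_samples // 2, beat_len)),
               abnormal + 0.1 * rng.standard_normal((n_samples // 2, beat_len))])
y = np.array([0] * (n_samples // 2) + [1] * (n_samples // 2))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Complete" dictionary: as many atoms as signal dimensions.
dico = MiniBatchDictionaryLearning(n_components=beat_len, alpha=1.0,
                                   random_state=0).fit(X_train)
code_train = sparse_encode(X_train, dico.components_, algorithm="omp",
                           n_nonzero_coefs=8)
code_test = sparse_encode(X_test, dico.components_, algorithm="omp",
                          n_nonzero_coefs=8)

clf = LinearSVC().fit(code_train, y_train)
print("classification accuracy:", clf.score(code_test, y_test))
```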

Keywords: electrocardiogram, dictionary learning, sparse coding, classification

Procedia PDF Downloads 386
24354 A Deletion-Cost Based Fast Compression Algorithm for Linear Vector Data

Authors: Qiuxiao Chen, Yan Hou, Ning Wu

Abstract:

As there are deficiencies in the classic Douglas-Peucker Algorithm (DPA), such as a high risk of deleting key nodes by mistake, high complexity, time consumption and relatively slow execution speed, a new Deletion-Cost Based Compression Algorithm (DCA) for linear vector data was proposed. For each curve (the basic element of linear vector data), all the deletion costs of its middle nodes were calculated, and the minimum deletion cost was compared with the pre-defined threshold. If the former was greater than or equal to the latter, all remaining nodes were reserved and the curve's compression process was finished. Otherwise, the node with the minimal deletion cost was deleted, its two neighbors' deletion costs were updated, and the same loop on the compressed curve was repeated until termination. Through several comparative experiments using different types of linear vector data, DPA and DCA were compared in terms of compression quality and computing efficiency. Experimental results showed that DCA outperformed DPA in both compression accuracy and execution efficiency.
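
A runnable sketch of the DCA loop described above; the specific deletion cost used here (the distance from a node to the chord joining its current neighbours) is an assumed choice, and the sketch recomputes costs on each pass rather than updating only the two neighbours.

```python
# Deletion-cost based curve compression, following the loop described above.
import math

def deletion_cost(prev, node, nxt):
    """Distance from `node` to the line through its two neighbours."""
    (x1, y1), (x0, y0), (x2, y2) = prev, node, nxt
    dx, dy = x2 - x1, y2 - y1
    seg_len = math.hypot(dx, dy)
    if seg_len == 0:
        return math.hypot(x0 - x1, y0 - y1)
    return abs(dy * x0 - dx * y0 + x2 * y1 - y2 * x1) / seg_len

def dca_compress(curve, threshold):
    nodes = list(curve)
    while len(nodes) > 2:
        # Costs of all current middle nodes.
        costs = [deletion_cost(nodes[i - 1], nodes[i], nodes[i + 1])
                 for i in range(1, len(nodes) - 1)]
        min_cost = min(costs)
        if min_cost >= threshold:        # every remaining node is significant
            break
        # Delete the node with the minimal cost and loop again.
        del nodes[1 + costs.index(min_cost)]
    return nodes

curve = [(0, 0), (1, 0.05), (2, -0.02), (3, 1.5), (4, 1.48), (5, 0.1), (6, 0)]
print(dca_compress(curve, threshold=0.1))
```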

Keywords: Douglas-Peucker algorithm, linear vector data, compression, deletion cost

Procedia PDF Downloads 251
24353 Multimedia Container for Autonomous Car

Authors: Janusz Bobulski, Mariusz Kubanek

Abstract:

The main goal of the research is to develop a multimedia container structure containing three types of images (RGB, lidar and infrared), properly calibrated to each other. An additional goal is to develop program libraries for creating, saving and restoring this type of file. It will also be necessary to develop a method of synchronizing data from the lidar, RGB and infrared cameras. Autonomous cars are increasingly breaking into our consciousness. No one seems to have any doubts that self-driving cars are the future of motoring. Manufacturers promise that moving the first of them to showrooms is the prospect of the next few years. Many experts believe that creating a network of communicating autonomous cars will be able to completely eliminate accidents. However, to make this possible, it is necessary to develop effective methods of detecting objects around the moving vehicle. In bad weather conditions, this task is difficult on the basis of the RGB (red, green, blue) image alone. Therefore, in such situations, the vehicle should be supported by information from other sources, such as lidar or infrared cameras. The problem is the different data formats that individual types of devices return. In addition to these differences, there is a problem with the synchronization and formatting of these data. The goal of the project is to develop a file structure that can contain different types of data. This type of file is called a multimedia container. A multimedia container is a container that holds many data streams, which allows complete multimedia material to be stored in one file. Among the data streams located in such a container should be streams of images, video, sound, and subtitles, as well as additional information, i.e., metadata. This type of file could be used in autonomous vehicles, which would certainly facilitate data processing by the intelligent autonomous vehicle management system. As shown by preliminary studies, combining RGB and infrared images with lidar data allows for easier data analysis. Thanks to this, it will be possible to display the distance to an object in a color photo. Such information can be very useful for drivers and for systems in autonomous cars.
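
A minimal sketch of the kind of container discussed above: one file holding a time-synchronised RGB image, infrared image and lidar point cloud plus metadata; the layout (a compressed NumPy archive with a JSON header) is an assumption for illustration, not the format developed in the project.

```python
# Write and read a simple multi-stream frame container (assumed layout).
import json
import numpy as np

def write_container(path, rgb, infrared, lidar_points, timestamp):
    meta = {"timestamp": timestamp,
            "streams": ["rgb", "infrared", "lidar"],
            "rgb_shape": list(rgb.shape),
            "infrared_shape": list(infrared.shape),
            "lidar_points": int(lidar_points.shape[0])}
    np.savez_compressed(path, rgb=rgb, infrared=infrared,
                        lidar=lidar_points, meta=json.dumps(meta))

def read_container(path):
    with np.load(path) as data:
        meta = json.loads(str(data["meta"]))
        return data["rgb"], data["infrared"], data["lidar"], meta

# Example frame: 480x640 RGB, 240x320 infrared, 10k lidar points (x, y, z, intensity).
rgb = np.zeros((480, 640, 3), dtype=np.uint8)
ir = np.zeros((240, 320), dtype=np.uint16)
lidar = np.random.rand(10_000, 4).astype(np.float32)

write_container("frame_000001.npz", rgb, ir, lidar, timestamp=1700000000.123)
rgb2, ir2, lidar2, meta = read_container("frame_000001.npz")
print(meta)
```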

Keywords: an autonomous car, image processing, lidar, obstacle detection

Procedia PDF Downloads 226
24352 Biogas Enhancement Using Iron Oxide Nanoparticles and Multi-Wall Carbon Nanotubes

Authors: John Justo Ambuchi, Zhaohan Zhang, Yujie Feng

Abstract:

The quick development and usage of nanotechnology have resulted in the massive use of various nanoparticles, such as iron oxide nanoparticles (IONPs) and multi-wall carbon nanotubes (MWCNTs). Thus, this study investigated the role of IONPs and MWCNTs in enhancing bioenergy recovery. Results show that IONPs at a concentration of 750 mg/L and MWCNTs at a concentration of 1500 mg/L induced faster substrate utilization and biogas production rates than the control. IONPs exhibited higher chemical oxygen demand (COD) removal efficiency than MWCNTs, while, on the contrary, MWCNT performance in biogas generation was more remarkable than that of IONPs. Furthermore, scanning electron microscopy (SEM) investigation revealed that the extracellular polymeric substances (EPS) excreted by the anaerobic granular sludge (AGS) interacted with the nanoparticles. This interaction created a protective barrier for the microbial consortia, hence reducing their cytotoxicity. Microbial community analyses revealed the predominance of bacteria belonging to Anaerolineaceae and Longilinea. Their role in biodegradation of the substrate could have been highly boosted by the nanoparticles. The predominance of archaea of the genera Methanosaeta and Methanobacterium enhanced the methanation process. The presence of bacteria of the genus Geobacter was also reported. Their presence might have significantly contributed to direct interspecies electron transfer in the system. Exposure of AGS to nanoparticles promoted direct interspecies electron transfer among the anaerobic fermenting bacteria and their counterpart methanogens during the anaerobic digestion process. These results provide useful insights for understanding the response of microorganisms to IONPs and MWCNTs in the complex natural environment.

Keywords: anaerobic granular sludge, extracellular polymeric substances, iron oxide nanoparticles, multi-wall carbon nanotubes

Procedia PDF Downloads 293
24351 Mobile Crowdsensing Scheme by Predicting Vehicle Mobility Using Deep Learning Algorithm

Authors: Monojit Manna, Arpan Adhikary

Abstract:

Mobile crowdsensing is an emerging paradigm across the globe in which users are selected to perform sensing tasks. In today's urban cities, mobile vehicles are adopted to perform data sensing and data collection thanks to their universality and mobility. In this work, we focus on the optimal selection of mobile nodes so as to collect the maximum amount of data from urban areas and fulfill the required data demand in a future period within a couple of minutes. We map the vehicle requirements onto a data-maximization problem with a budget constraint. The application implementation is set up to model a realistic online platform in which vehicles move continuously in real time. The data center has the authority to select a set of vehicles immediately. A deep learning-based scheme with the help of mobile vehicles (DLMV) is proposed to collect sensing data from the urban environment. To account for the future time period, this work proposes a deep learning-based offline algorithm to predict mobility. We then propose a greedy online algorithm that selects a subset of vehicles for this NP-complete problem under a limited budget. Extensive experimental evaluations are conducted on a real mobility dataset from Rome. The experimental results not only confirm the efficiency of our proposed solution but also prove the validity of DLMV and show an improvement in the quantity of collected sensing data compared with other algorithms.
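
A minimal sketch of the budget-limited greedy selection step described above: each candidate vehicle has a recruitment cost and a (predicted) set of covered road cells, and vehicles are picked by coverage gain per unit cost until the budget is exhausted; the candidate data are illustrative.

```python
# Greedy budgeted vehicle selection for maximum sensing coverage (illustrative data).
def greedy_select(candidates, budget):
    """candidates: dict vehicle_id -> (cost, set_of_covered_cells)."""
    selected, covered, spent = [], set(), 0.0
    remaining = dict(candidates)
    while remaining:
        # Marginal coverage gain per unit cost for each remaining vehicle.
        def gain_per_cost(vid):
            cost, cells = remaining[vid]
            return len(cells - covered) / cost
        best = max(remaining, key=gain_per_cost)
        cost, cells = remaining.pop(best)
        if spent + cost > budget or not (cells - covered):
            continue                     # skip vehicles that do not fit or add nothing
        selected.append(best)
        covered |= cells
        spent += cost
    return selected, covered, spent

candidates = {
    "v1": (3.0, {"a", "b", "c"}),
    "v2": (2.0, {"c", "d"}),
    "v3": (4.0, {"e", "f", "g", "h"}),
    "v4": (1.0, {"a"}),
}
print(greedy_select(candidates, budget=6.0))
```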

Keywords: mobile crowdsensing, deep learning, vehicle recruitment, sensing coverage, data collection

Procedia PDF Downloads 78
24350 Microbial Degradation of Lignin for Production of Valuable Chemicals

Authors: Fnu Asina, Ivana Brzonova, Keith Voeller, Yun Ji, Alena Kubatova, Evguenii Kozliak

Abstract:

Lignin, a heterogeneous three-dimensional biopolymer, is one of the building blocks of lignocellulosic biomass. Due to its limited chemical reactivity, lignin is currently processed as a low-value by-product in pulp and paper mills. Among various industrial lignins, Kraft lignin represents a major source of by-products generated during the widely employed pulping process across the pulp and paper industry. Therefore, valorization of Kraft lignin holds great potential, as this would provide a readily available source of aromatic compounds for various industrial applications. Microbial degradation is well known for using both highly specific ligninolytic enzymes secreted by microorganisms and mild operating conditions compared with conventional chemical approaches. In this study, the degradation of Indulin AT lignin was assessed by comparing the effects of basidiomycetous fungi (Coriolus versicolor and Trametes gallica) and Actinobacteria (Mycobacterium sp. and Streptomyces sp.) to two commercial laccases, from T. versicolor ( ≥ 10 U/mg) and C. versicolor ( ≥ 0.3 U/mg). After 54 days of cultivation, the extent of microbial degradation was significantly higher than that achieved by the commercial laccases, reaching a maximum of 38 wt% degradation for C. versicolor-treated samples. Lignin degradation was further confirmed by thermal carbon analysis with a five-step temperature protocol. Compared with commercial laccases, a significant decrease in char formation at 850 °C was observed among all microbially degraded lignins, with a corresponding carbon percentage increase from 200 °C to 500 °C. To complement the carbon analysis results, chemical characterization of the degraded products at different stages of the delignification by microorganisms and commercial laccases was performed by pyrolysis-GC-MS.

Keywords: lignin, microbial degradation, pyrolysis-GC-MS, thermal carbon analysis

Procedia PDF Downloads 412
24349 A Biometric Template Security Approach to Fingerprints Based on Polynomial Transformations

Authors: Ramon Santana

Abstract:

The use of biometric identifiers in the field of information security, access control to resources, and authentication in ATMs and banking, among others, is of great concern because of the safety of biometric data. Eight vulnerabilities have been detected in the general architecture of a biometric system, six of which allow obtaining the minutiae template in plain text. The main consequence of obtaining minutiae templates is the loss of the biometric identifier for life. To mitigate these vulnerabilities, several models to protect minutiae templates have been proposed. Several vulnerabilities in the cryptographic security of these models allow biometric data to be obtained in plain text. In order to increase the cryptographic security and ease of reversibility, a minutiae template protection model is proposed. The model aims to provide cryptographic protection and facilitate the reversibility of data using two levels of security. The first level of security is the data transformation level. This level generates data invariant to rotation and translation; this transformation is irreversible. The second level of security is the evaluation level, where the encryption key is generated and data are evaluated using a defined evaluation function. The model is aimed at mitigating known vulnerabilities of previously proposed models, basing its security on the impossibility of polynomial reconstruction.

Keywords: fingerprint, template protection, bio-cryptography, minutiae protection

Procedia PDF Downloads 170