Search results for: accurate data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26492

25832 Malaria Vulnerability Mapping from the Space: A Case Study of Damaturu Town-Nigeria

Authors: Isa Muhammad Zumo

Abstract:

Malaria is one of the most serious illnesses that can affect humans. It is caused by parasitic protozoans of the genus Plasmodium and is typically transmitted by the bite of a female Anopheles mosquito. Governments and non-governmental organizations have launched numerous initiatives to combat the threat of malaria in communities. Nevertheless, insufficient attention has been paid to accurate and current information on the size and location of the sites that favour mosquito development. Because mosquitoes can only reproduce in specific habitats with surface water, this study will locate and map the favourable sites that act as mosquito breeding grounds. Spatial and attribute data relating to favourable mosquito breeding places will be collected and analysed using Geographic Information Systems (GIS). The major findings will fall into five classes showing the areas vulnerable and at risk for malaria cases: very high, high, moderate, low, and extremely low vulnerability. The maps produced by this study will be of great use to the health department in combating malaria.
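The five-class output described above can be sketched as a simple classification rule applied per grid cell. This is a hypothetical illustration: the suitability scores and thresholds below are assumed, not taken from the study's GIS analysis.

```python
# Hypothetical sketch: classify grid cells into the five malaria
# vulnerability classes named in the abstract, based on an illustrative
# breeding-site suitability score (e.g. derived in GIS from proximity
# to surface water). Thresholds are assumed for illustration only.

def vulnerability_class(score):
    """Map a 0-1 breeding-site suitability score to a class label."""
    if score >= 0.8:
        return "very high"
    if score >= 0.6:
        return "high"
    if score >= 0.4:
        return "moderate"
    if score >= 0.2:
        return "low"
    return "extremely low"

# Illustrative cells with invented scores
cells = {"A1": 0.91, "B4": 0.55, "C2": 0.12}
classes = {cell: vulnerability_class(s) for cell, s in cells.items()}
print(classes)
```

In a real GIS workflow, the score raster would come from overlay analysis of surface-water and settlement layers; the classification rule itself stays this simple.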

Keywords: Malaria, vulnerability, mapping, space, Damaturu

Procedia PDF Downloads 57
25831 Analytical and Numerical Solutions for the Thermal Analysis of a Mechanical Draft Wet Cooling Tower

Authors: Hamed Djalal

Abstract:

The thermal analysis of a mechanical draft wet cooling tower is performed in this study by modelling the heat and mass transfer in the packing zone. After combining the heat and mass transfer laws with the mass and energy balances, and invoking the Merkel assumptions, an ordinary differential equation system is first derived and solved numerically by the Runge-Kutta method to determine the variation of the water and air temperatures, the humidity, and other properties along the packing zone. Secondly, by making some linear assumptions for the air saturation curve, an analytical solution is formed; it was originally developed for air washer calculations but is applied here to the cooling tower to express the same parameters mathematically as functions of the packing height. Finally, both solutions achieve good agreement with experimental data, but the numerical one appears to be the more accurate for modelling the heat and mass transfer process in the wet cooling tower.
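The numerical route described above can be sketched with a toy stand-in for the Merkel equations: a single cooling ODE integrated along the packing height by classical fourth-order Runge-Kutta. The transfer coefficient and temperatures below are assumed for illustration, not the study's fitted values.

```python
# Minimal sketch: integrate a toy Merkel-type cooling equation
#   dTw/dz = -k * (Tw - Twb)
# along the packing height with classical 4th-order Runge-Kutta.
# k, inlet temperature and wet-bulb temperature are illustrative.

def rk4(f, y0, z0, z1, n):
    """Integrate dy/dz = f(z, y) from z0 to z1 in n steps."""
    h = (z1 - z0) / n
    z, y = z0, y0
    for _ in range(n):
        k1 = f(z, y)
        k2 = f(z + h / 2, y + h * k1 / 2)
        k3 = f(z + h / 2, y + h * k2 / 2)
        k4 = f(z + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        z += h
    return y

K, T_WB = 0.8, 22.0                     # assumed coefficient, wet-bulb temp (C)

def cooling(z, tw):
    return -K * (tw - T_WB)

t_out = rk4(cooling, 40.0, 0.0, 1.5, 100)   # 40 C water in, 1.5 m of packing
print(round(t_out, 2))
```

The full Merkel model couples several such equations (water temperature, air enthalpy, humidity), but each is advanced along the packing height in exactly this way.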

Keywords: evaporative cooling, cooling tower, air washer, humidification, moist air, heat and mass transfer

Procedia PDF Downloads 96
25830 Privacy Preserving Data Publishing Based on Sensitivity in Context of Big Data Using Hive

Authors: P. Srinivasa Rao, K. Venkatesh Sharma, G. Sadhya Devi, V. Nagesh

Abstract:

Privacy-preserving data publication is a major concern at present because the amount of data published through the internet is increasing day by day; because of its size, this data is termed Big Data. This project addresses privacy preservation in the context of Big Data using a data warehousing solution called Hive. We implemented Nearest Similarity Based Clustering (NSB) with bottom-up generalization to achieve (v,l)-anonymity, which addresses sensitivity vulnerabilities and ensures individual privacy. We also calculate sensitivity levels by a simple comparison method using index values, classifying the different levels of sensitivity. Experiments were carried out in the Hive environment to verify the efficiency of the algorithms with Big Data. The framework also supports the execution of existing algorithms without any changes. The model in the paper outperforms existing models.
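The bottom-up generalization step can be illustrated with a simplified stand-in: repeatedly coarsen a quasi-identifier until every equivalence class holds at least k records. This shows k-anonymity-style generalization only; the paper's (v,l)-anonymity adds sensitivity levels on top of this idea, which are not reproduced here.

```python
# Simplified stand-in for bottom-up generalization: truncate a
# quasi-identifier (here a postcode) one digit at a time until every
# equivalence class contains at least k records. Postcodes are invented.

from collections import Counter

def generalize(postcodes, k):
    codes = list(postcodes)
    while codes[0] and min(Counter(codes).values()) < k:
        codes = [c[:-1] for c in codes]       # drop one trailing digit
    return codes

records = ["50021", "50022", "50178", "50179"]
gen = generalize(records, 2)
print(gen)   # each generalized postcode now appears at least twice
```

A real implementation would choose which attribute to generalize at each step by an information-loss metric rather than coarsening a single column blindly.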

Keywords: sensitivity, sensitive level, clustering, Privacy Preserving Data Publication (PPDP), bottom-up generalization, Big Data

Procedia PDF Downloads 293
25829 Modelling Spatial Dynamics of Terrorism

Authors: André Python

Abstract:

To this day, terrorism persists as a worldwide threat, exemplified by the deadly attacks of January 2015 in Paris and the ongoing massacres perpetrated by ISIS in Iraq and Syria. In response to this threat, states deploy various counterterrorism measures, the cost of which could be reduced through effective preventive measures. To increase the efficiency of preventive measures, policy-makers may benefit from accurate predictive models that capture the complex spatial dynamics of terrorism occurring at a local scale. Although empirical research carried out at country level has confirmed theories explaining the diffusion processes of terrorism across space and time, scholars have failed to assess theories of diffusion at a local scale. Moreover, since scholars have not made the most of recent statistical modelling approaches, they have been unable to build predictive models accurate in both space and time. In an effort to address these shortcomings, this research suggests a novel approach to systematically assess the theories of terrorism's diffusion at a local scale and provide a predictive model of the local spatial dynamics of terrorism worldwide. With a focus on the lethal terrorist events that occurred after 9/11, this paper addresses the following question: why and how does lethal terrorism diffuse in space and time? Based on geolocalised data on worldwide terrorist attacks and covariates gathered from 2002 to 2013, a binomial spatio-temporal point process is used to model the probability of terrorist attacks on a sphere (the world), the surface of which is discretised in the form of Delaunay triangles and refined in areas of specific interest. Within a Bayesian framework, the model is fitted through an integrated nested Laplace approximation, a recent fitting approach that computes fast and accurate estimates of posterior marginals. Hence, for each location in the world, the model provides a probability of encountering a lethal terrorist attack and measures of volatility, which inform on the model's predictability. Diffusion processes are visualised through interactive maps that highlight space-time variations in the probability and volatility of encountering a lethal attack from 2002 to 2013. Based on the previous twelve years of observation, the location and lethality of terrorist events in 2014 are accurately predicted. Throughout the global scope of this research, local diffusion processes such as escalation and relocation are systematically examined: the former describes an expansion from areas of high concentration of lethal terrorist events (hotspots) to neighbouring areas, while the latter is characterised by changes in the location of hotspots. By controlling for the effect of geographical, economic and demographic variables, the results of the model suggest that the diffusion processes of lethal terrorism are jointly driven by contagious and non-contagious factors that operate on a local scale – as predicted by theories of diffusion. Moreover, by providing a quantitative measure of predictability, the model prevents policy-makers from making decisions based on highly uncertain predictions. Ultimately, this research may provide important complementary tools to enhance the efficiency of policies that aim to prevent and combat terrorism.

Keywords: diffusion process, terrorism, spatial dynamics, spatio-temporal modeling

Procedia PDF Downloads 349
25828 Optimizing Residential Housing Renovation Strategies at Territorial Scale: A Data Driven Approach and Insights from the French Context

Authors: Rit M., Girard R., Villot J., Thorel M.

Abstract:

In a scenario of extensive residential housing renovation, stakeholders need models that support decision-making through a deep understanding of the existing building stock and accurate energy demand simulations. To address this need, we have modified an optimization model using open data that enables the study of renovation strategies at both territorial and national scales. This approach provides (1) a strategy for simplifying decision trees drawn from theoretical combinations, (2) input to decision-makers on real-world renovation constraints, (3) more reliable identification of energy-saving measures (changes in technology or behaviour), and (4) the discrepancies between currently planned and actually achieved strategies. The main contribution of the studies described in this document is the geographic scale: all residential buildings in the areas of interest were modeled and simulated using national data (geometries and attributes). These buildings were then renovated, where necessary, in accordance with the environmental objectives, taking into account the constraints applicable to each territory (number of renovations per year) or at the national level (renovation of thermally deficient dwellings, i.e. Energy Performance Certificates F and G). This differs from traditional approaches that focus on only a few buildings or archetypes. The model can also be used to analyze the evolution of a building stock as a whole, as it can take into account the construction of new buildings as well as their demolition or sale. Using specific case studies of French territories, this paper highlights a significant discrepancy between the strategies currently advocated by decision-makers and those proposed by our optimization model. The discrepancy is particularly evident in critical metrics such as the relationship between the number of renovations per year and achievable climate targets, or between the financial support currently available to households and the costs they must still bear.
In addition, users are free to seek optimizations for their building stock across a range of different metrics (e.g., financial, energy, environmental, or life cycle analysis). These results are a clear call to re-evaluate existing renovation strategies and take a more nuanced and customized approach. As the climate crisis moves inexorably forward, harnessing the potential of advanced technologies and data-driven methodologies is imperative.

Keywords: residential housing renovation, MILP, energy demand simulations, data-driven methodology

Procedia PDF Downloads 67
25827 A Fuzzy Kernel K-Medoids Algorithm for Clustering Uncertain Data Objects

Authors: Behnam Tavakkol

Abstract:

Uncertain data mining algorithms use different ways to consider uncertainty in data, such as representing a data object as a sample of points or as a probability distribution. Fuzzy methods have long been used for clustering traditional (certain) data objects; they produce non-crisp cluster labels. For uncertain data, however, few fuzzy clustering methods have been developed beyond some fuzzy k-medoids algorithms. In this work, we develop a fuzzy kernel k-medoids algorithm for clustering uncertain data objects. The developed algorithm is superior to existing fuzzy k-medoids algorithms in clustering data sets with non-linearly separable clusters.
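The membership step of such an algorithm can be sketched as follows, with assumed details: a kernel-induced distance d(x, m) = K(x,x) - 2K(x,m) + K(m,m) under an RBF kernel, and fuzzy memberships in the usual fuzzy c-means form. The paper targets uncertain objects; plain points are used here to keep the sketch short.

```python
# Illustrative sketch (assumed details) of one fuzzy kernel k-medoids
# step: kernel-induced distances to the current medoids, turned into
# fuzzy memberships. Kernel width and fuzzifier values are assumptions.

import math

GAMMA, FUZZ = 0.5, 2.0                    # assumed kernel width, fuzzifier

def rbf(a, b):
    return math.exp(-GAMMA * sum((x - y) ** 2 for x, y in zip(a, b)))

def kdist(x, m):
    """Squared distance between x and medoid m in kernel feature space."""
    return rbf(x, x) - 2 * rbf(x, m) + rbf(m, m)

def memberships(points, medoids):
    out = []
    for x in points:
        d = [max(kdist(x, m), 1e-12) for m in medoids]   # avoid div by 0
        inv = [1.0 / di ** (1.0 / (FUZZ - 1.0)) for di in d]
        s = sum(inv)
        out.append([v / s for v in inv])
    return out

pts = [(0.0, 0.0), (0.1, 0.0), (3.0, 3.0), (3.1, 3.0)]
u = memberships(pts, medoids=[pts[0], pts[2]])
print([[round(v, 3) for v in row] for row in u])
```

A full algorithm would alternate this step with a medoid update (picking, for each cluster, the object minimizing the membership-weighted kernel distances) until the memberships stabilize.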

Keywords: clustering algorithm, fuzzy methods, kernel k-medoids, uncertain data

Procedia PDF Downloads 215
25826 Democracy Bytes: Interrogating the Exploitation of Data Democracy by Radical Terrorist Organizations

Authors: Nirmala Gopal, Sheetal Bhoola, Audecious Mugwagwa

Abstract:

This paper discusses the continued infringement and exploitation of data by non-state actors for destructive purposes, with an emphasis on radical terrorist organizations. It discusses how terrorist organizations access and use data to further their nefarious agendas. It then examines how cybersecurity, designed as a tool to curb data exploitation, has been ineffective in addressing global citizens' concerns about how their data can be kept safe and used only for its intended purpose. The study interrogates several policies and data protection instruments, such as the Data Protection Act, cybersecurity policies, the Protection of Personal Information (PPI) and the General Data Protection Regulation (GDPR), to understand data use and storage in democratic states. The study outcomes point to the fact that international cybersecurity and cybercrime legislation, policies, and conventions have not curbed violations of data access and use by radical terrorist groups. The study recommends ways to enhance cybersecurity and reduce cyber risks using democratic principles.

Keywords: cybersecurity, data exploitation, terrorist organizations, data democracy

Procedia PDF Downloads 202
25825 Healthcare Data Mining Innovations

Authors: Eugenia Jilinguirian

Abstract:

In the healthcare industry, data mining is essential: it transforms the field by extracting useful information from large datasets. Data mining is the process of applying advanced analytical methods to large collections of patient records and medical histories in order to identify patterns, correlations, and trends. Healthcare professionals can improve diagnostic accuracy, uncover hidden linkages, and predict disease outcomes by carefully examining these data. Additionally, data mining supports personalized medicine by tailoring treatment to the unique attributes of each patient. This proactive strategy helps allocate resources more efficiently, enhances patient care, and streamlines operations. However, to apply data mining effectively and to ensure the appropriate use of private healthcare information, issues like data privacy and security must be carefully considered. Data mining remains vital in the search for more effective, efficient, and individualized healthcare solutions as technology evolves.

Keywords: data mining, healthcare, big data, individualised healthcare, healthcare solutions, database

Procedia PDF Downloads 64
25824 Summarizing Data Sets for Data Mining by Using Statistical Methods in Coastal Engineering

Authors: Yunus Doğan, Ahmet Durap

Abstract:

Coastal regions are among the places most heavily used by both natural processes and a growing population. In coastal engineering, the most valuable data concern wave behaviour, and the amount of such data becomes very large because observations run for hours, days and months. In this study, statistical methods such as wave spectrum analysis and standard descriptive statistics were used. The goal is to discover profiles of different coastal areas using these statistical methods and thus to obtain, from the big data, an instance-based data set for analysis with data mining algorithms. In the experimental studies, six sample data sets of wave behaviour, obtained from 20-minute observations in Mersin Bay, Turkey, were converted to an instance-based form, and different clustering techniques from data mining were used to discover similar coastal places. Moreover, this study discusses how this summarization approach can be used in other fields that collect big data, such as medicine.
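The summarization step described above can be sketched as collapsing a long record of wave observations into one instance (feature vector) per site, the form that is then fed to clustering. The numbers below are invented sample data, not the Mersin Bay measurements.

```python
# Minimal sketch: summarize a wave-height time series into a single
# instance of statistical features. The "hs" feature approximates
# significant wave height as the mean of the highest third of waves.

import statistics

def summarize(heights):
    top_third = sorted(heights, reverse=True)[: max(1, len(heights) // 3)]
    return {
        "mean_h": statistics.mean(heights),
        "std_h": statistics.pstdev(heights),
        "hs": statistics.mean(top_third),   # significant wave height proxy
        "max_h": max(heights),
    }

# Invented 20-minute record of wave heights (m) for one site
site_a = [0.4, 0.5, 0.6, 0.5, 0.7, 0.4]
row = summarize(site_a)
print({k: round(v, 3) for k, v in row.items()})
```

One such row per site yields a small instance-based data set on which any standard clustering algorithm (k-means, hierarchical, etc.) can group similar coastal places.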

Keywords: clustering algorithms, coastal engineering, data mining, data summarization, statistical methods

Procedia PDF Downloads 360
25823 Sustainable Design for Building Envelope in Hot Climates: A Case Study for the Role of the Dome as a Component of an Envelope in Heat Exchange

Authors: Akeel Noori Almulla Hwaish

Abstract:

Architectural design is influenced by the actual thermal behaviour of building components, which in turn depends not only on their steady and periodic thermal characteristics but also on exposure effects, orientation, surface colour, and climatic fluctuations at the given location. Design data and environmental parameters should be produced accurately for specified locations, so that architects and engineers can confidently apply them in design calculations that enable precise evaluation of the influence of the various parameters relating to each component of the envelope, which together indicate the overall thermal performance of the building. The present paper assesses the thermal behaviour and characteristics of the opaque and transparent parts of one of the most distinctive symbolic components of the building envelope, the dome: its thermal behaviour under the impact of solar temperatures and its role in heat exchange for the specific U-values of alternative construction materials. The research method considers hot-dry weather and a new mosque in Baghdad, Iraq as a case study. Data will be presented in light of indoor thermal comfort criteria, in terms of design parameters and a thermal assessment of a “model dome”. Design alternatives and energy conservation considerations will also be discussed using comparative computer simulations. The findings are incorporated to outline conclusions clarifying the important role of the dome in the heat exchange of the whole building envelope in approaching indoor thermal comfort, and to guide further research.

Keywords: building envelope, sustainable design, dome impact, hot-climates, heat exchange

Procedia PDF Downloads 473
25822 Uniqueness and Repeatability Analysis for Slim Tube Determined Minimum Miscibility Pressure

Authors: Waqar Ahmad Butt, Gholamreza Vakili Nezhaad, Ali Soud Al Bemani, Yahya Al Wahaibi

Abstract:

Miscible gas injection processes, as secondary recovery methods, can be applied to a large number of mature reservoirs to improve the displacement of trapped oil. Successful miscible gas injection requires an accurate estimate of the minimum miscibility pressure (MMP) to make the injection process feasible, economical, and effective. There are several methods of MMP determination, such as the slim tube approach, vanishing interfacial tension, and the rising bubble apparatus; the slim tube is the experimental technique deployed in this study. The slim tube method is not standardized for MMP determination with respect to either operating procedure or design. Therefore, 25 slim tube runs were conducted with three coil lengths (12, 18 and 24 m) of constant diameter and three injection rates (0.08, 0.1 and 0.15 cc/min) to evaluate the uniqueness and repeatability of the determined MMP. A trend of decreasing MMP with increasing coil length was found, while no clear trend was found between MMP and injection rate. The lowest MMP and highest recovery were observed with the longest coil and the lowest injection rate. This shows that slim-tube-measured MMP does not depend solely on the characteristics of the interacting fluids but is also affected by the coil selected and the injection rate chosen. Therefore, both slim tube design and procedure need to be standardized. It is recommended to use the lowest practical injection rate and a coil length estimated from the distance between the injection and producing wells for accurate and reliable MMP determination.

Keywords: coil length, injection rate, minimum miscibility pressure, multiple contacts miscibility

Procedia PDF Downloads 252
25821 Performance Evaluation of Parallel Surface Modeling and Generation on Actual and Virtual Multicore Systems

Authors: Nyeng P. Gyang

Abstract:

Even though past, current and future trends suggest that multicore and cloud computing systems are increasingly ubiquitous, this class of parallel systems is nonetheless underutilized in general, and barely used for research on parallel Delaunay triangulation for surface modeling and generation in particular. The performance of actual (physical) and virtual (cloud) multicore machines was evaluated while executing various algorithms that implement different parallelization strategies for the incremental insertion technique of the Delaunay triangulation algorithm. T-tests were run on the collected data to determine whether differences in various performance metrics (including execution time, speedup and efficiency) were statistically significant. Results show that the actual machine is approximately twice as fast as the virtual machine at executing the same programs for the various parallelization strategies. Results, which furnish the scalability behaviour of the various parallelization strategies, also show that some of the performance differences between these systems across different runs of the algorithms were statistically significant. A few pseudo-superlinear speedup results computed from the raw data are not true superlinear speedup values: they arise from one particular way of computing speedups, and they disappear and give way to asymmetric speedups, which are the kind of speedups that actually occur in the experiments performed.
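The metrics discussed above can be sketched directly: speedup and efficiency from measured execution times, plus the sanity check that flags suspicious (pseudo-superlinear) results. The times below are illustrative, not the measured data from the study.

```python
# Sketch of the performance metrics: speedup = T_serial / T_parallel,
# efficiency = speedup / cores. A speedup exceeding the core count is
# flagged for inspection, since true superlinear speedup is rare and
# often an artifact of how the baseline time was measured.

def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, cores):
    return speedup(t_serial, t_parallel) / cores

T1, T8, CORES = 120.0, 18.0, 8            # illustrative seconds: 1 vs 8 cores
s = speedup(T1, T8)
e = efficiency(T1, T8, CORES)
superlinear = s > CORES                   # flag suspicious results
print(round(s, 3), round(e, 3), superlinear)
```

Pairing such per-run metrics from the physical and virtual machines is what feeds the t-tests on the differences reported in the abstract.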

Keywords: cloud computing systems, multicore systems, parallel Delaunay triangulation, parallel surface modeling and generation

Procedia PDF Downloads 204
25820 Access to Health Data in Medical Records in Indonesia in Terms of Personal Data Protection Principles: The Limitation and Its Implication

Authors: Anny Retnowati, Elisabeth Sundari

Abstract:

This research aims to elaborate the meaning of personal data protection principles for patient access to health data in medical records in Indonesia, and its implications. The method is normative legal research, examining Indonesian health law regarding the patient's right to access their health data in medical records. The data are analysed qualitatively using the interpretation method to elaborate how personal data protection principles limit patients' access to their data in medical records. The results show that patients only have the right to obtain copies of their health data in medical records; there is no right to inspect the records directly at any time. Indonesian health law thus limits the principle of patients' broad access to their health data in medical records. This restriction implies a reduction of personal data protection as part of human rights. This research contributes by showing that limiting personal data protection may violate human rights.

Keywords: access, health data, medical records, personal data, protection

Procedia PDF Downloads 91
25819 Conceptualizing the Knowledge to Manage and Utilize Data Assets in the Context of Digitization: Case Studies of Multinational Industrial Enterprises

Authors: Martin Böhmer, Agatha Dabrowski, Boris Otto

Abstract:

The trend of digitization significantly changes the role of data for enterprises. Data turn from an enabler into an intangible organizational asset that requires management and qualifies as a tradeable good. The idea of a networked economy has gained momentum in the data domain as collaborative approaches to data management emerge. Traditional organizational knowledge consequently needs to be extended by comprehensive knowledge about data. This knowledge is vital for organizations to ensure that data quality requirements are met and that data can be effectively utilized and sovereignly governed. As this specific knowledge has so far received little attention from academics, the aim of the research presented in this paper is to conceptualize it by proposing a “data knowledge model”. Relevant model entities have been identified based on a design science research (DSR) approach that iteratively integrates insights from various industry case studies and from literature research.

Keywords: data management, digitization, industry 4.0, knowledge engineering, metamodel

Procedia PDF Downloads 355
25818 Remote Sensing and GIS Based Methodology for Identification of Low Crop Productivity in Gautam Buddha Nagar District

Authors: Shivangi Somvanshi

Abstract:

Poor crop productivity in salt-affected environments in the country is due to insufficient and untimely canal supply to agricultural land and inefficient field water management practices. This could degrade further owing to inadequate maintenance of the canal network, ongoing secondary soil salinization and waterlogging, and worsening groundwater quality. Large patches of low productivity occur in irrigation commands due to waterlogging and salt-affected soil, particularly in years of scarce rainfall. Satellite remote sensing has been used for mapping areas of low crop productivity, waterlogging and salinity in irrigation commands, but the spatial results obtained so far are insufficiently reliable for further use because soil quality parameters change rapidly over the years. Existing spatial databases of the canal network and flow data, groundwater quality and salt-affected soils were obtained from the central and state line departments/agencies and integrated in GIS. An integrated methodology based on remote sensing and GIS was therefore developed in the ArcGIS environment on the basis of canal supply status, groundwater quality, salt-affected soils, and the satellite-derived vegetation index (NDVI), salinity index (NDSI) and waterlogging index (NSWI). This methodology was tested for identification and delineation of areas of low productivity in Gautam Buddha Nagar district (Uttar Pradesh). The area affected by this problem was found to lie mainly in the Dankaur and Jewar blocks of the district. The problem area was verified with ground data and found to be approximately 78% accurate. The methodology has the potential to be used in other irrigation commands in the country to obtain reliable spatial data on low crop productivity.
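The index computations the methodology combines can be sketched per pixel. NDVI uses the standard (NIR - Red)/(NIR + Red) form; the NDSI variant shown, (SWIR - NIR)/(SWIR + NIR), is one common salinity formulation and is an assumption here, since the abstract does not give its formula.

```python
# Hedged sketch of the satellite index computations. NDVI is the
# standard formulation; the NDSI shown is one common salinity variant
# (an assumption, as the abstract does not specify the formula).

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def ndsi(swir, nir):
    return (swir - nir) / (swir + nir)

# Illustrative per-pixel surface reflectances
pixel = {"red": 0.12, "nir": 0.45, "swir": 0.30}
v = ndvi(pixel["nir"], pixel["red"])
s = ndsi(pixel["swir"], pixel["nir"])
print(round(v, 3), round(s, 3))
```

In the GIS workflow, these per-pixel values are computed band-wise over whole scenes and then overlaid with the canal supply, groundwater quality and soil layers to delineate low-productivity areas.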

Keywords: remote sensing, GIS, salt affected soil, crop productivity, Gautam Buddha Nagar

Procedia PDF Downloads 284
25817 Analysis and Forecasting of Bitcoin Price Using Exogenous Data

Authors: J-C. Leneveu, A. Chereau, L. Mansart, T. Mesbah, M. Wyka

Abstract:

Extracting and interpreting information from Big Data will remain a challenge for years to come in several sectors, such as finance. Currently, numerous methods (such as technical analysis) are used to try to understand and anticipate market behaviour, with mixed results, because it still seems impossible to predict a financial trend exactly. The increase in data available on the Internet, and its diversity, represent a great opportunity for the financial world: alongside standard financial data, it is possible to draw on exogenous data so as to take more macroeconomic factors into account. Coupling the interpretation of these data with standard methods could yield more precise trend predictions. In this paper, in order to observe the influence of exogenous data on price independently of the other effects usually at work in classical markets, the behaviour of Bitcoin users is introduced into a model reconstituting Bitcoin's value, which is elaborated and tested for prediction purposes.
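The idea of augmenting a price model with an exogenous signal can be sketched with a toy regression: fit price against a user-activity variable by least squares and extrapolate. The data are invented and the fit is deliberately minimal; the paper's actual model of Bitcoin user behaviour is far richer.

```python
# Toy sketch: fit price against an exogenous user-activity signal with
# simple least squares, then predict. All numbers are invented.

def fit_line(x, y):
    """Return (intercept, slope) of the least-squares line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

activity = [10, 20, 30, 40]               # exogenous signal (e.g. new users)
price = [105, 110, 115, 120]              # invented price series
a, b = fit_line(activity, price)
predicted = a + b * 50                    # extrapolate to activity = 50
print(a, b, predicted)
```

A realistic version would combine several exogenous covariates with standard market features and validate out of sample rather than extrapolating a single perfectly linear toy series.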

Keywords: big data, bitcoin, data mining, social network, financial trends, exogenous data, global economy, behavioral finance

Procedia PDF Downloads 354
25816 Ab Initio Study of Structural, Elastic, Electronic and Thermal Properties of Full Heusler

Authors: M. Khalfa, H. Khachai, F. Chiker, K. Bougherara, R. Khenata, G. Murtaza, M. Harmel

Abstract:

A theoretical study of the structural, elastic, electronic and thermodynamic properties of Fe2VX (X = Al, Ga) was carried out by means of the fully relativistic version of the full-potential augmented plane wave plus local orbitals method. For the exchange and correlation potential we used both the generalized gradient approximation (GGA) and the local density approximation (LDA). Our calculated ground-state properties, such as lattice constants, bulk moduli and elastic constants, are more accurate when the GGA is employed rather than the LDA, and these results agree very well with the available experimental and theoretical data. Furthermore, the thermal effects on some macroscopic properties of Fe2VAl and Fe2VGa are predicted using the quasi-harmonic Debye model, in which lattice vibrations are taken into account. We successfully obtained the variations of the primitive cell volume, volume expansion coefficient, heat capacities and Debye temperature with pressure and temperature in the ranges 0–40 GPa and 0–1500 K.

Keywords: full Heusler, FP-LAPW, electronic properties, thermal properties

Procedia PDF Downloads 493
25815 Accelerating Decision-Making in Oil and Gas Wells: 'A Digital Transformation Journey for Rapid and Precise Insights from Well History Data'

Authors: Linung Kresno Adikusumo, Ivan Ramos Sampe Immanuel, Liston Sitanggang

Abstract:

An excellent well work program in the oil and gas industry can have numerous positive business impacts, contributing to operational efficiency, increased production, enhanced safety, and improved financial performance. In short, an excellent well work program not only ensures the immediate success of specific projects but also has a broader positive impact on the overall business performance and reputation of the oil and gas company, positioning it for long-term success in a competitive and dynamic industry. Nevertheless, a number of challenges were encountered when developing a good well work program, such as the poor quality and lack of integration of well documentation, incomplete well histories, and poor accessibility of well documentation. As a result, well work programs were delivered less accurately, and well damage was managed slowly. Our solution, implementing digital technology by developing a web-based database and application, not only solves those issues but also provides easily accessible reports and a user-friendly display for management as well as engineers to analyze the reports' content. The application aims to revolutionize the documentation of well history in oil and gas exploration and production: the current lack of a streamlined, comprehensive system for capturing, organizing, and accessing well-related data makes it difficult to maintain accurate and up-to-date records. Our innovative solution introduces a user-friendly and efficient platform designed to capture well history documentation seamlessly.

Keywords: digital, drilling, well work, application

Procedia PDF Downloads 75
25814 Integrated Intensity and Spatial Enhancement Technique for Color Images

Authors: Evan W. Krieger, Vijayan K. Asari, Saibabu Arigela

Abstract:

Video imagery captured for real-time security and surveillance applications is typically captured under complex lighting conditions. These less-than-ideal conditions can result in imagery with underexposed or overexposed regions, and the video is often too low in resolution for certain applications. The purpose of security and surveillance video is to enable accurate conclusions based on the images seen in the video; if poor lighting and low resolution occur in the captured video, the ability to draw accurate conclusions from the received information is reduced. We propose a solution to this problem: image preprocessing to improve these images before use in a particular application. The proposed algorithm integrates an intensity enhancement algorithm with a super resolution technique. The intensity enhancement portion consists of a nonlinear inverse sign transformation and an adaptive contrast enhancement. The super resolution portion is a single-image technique: a Fourier phase feature based method that uses a machine learning approach with kernel regression. The proposed technique intelligently integrates these algorithms to produce a high quality output while also being more efficient than the sequential use of these algorithms. This integration is accomplished by running the proposed algorithm on the intensity image derived from the original color image. After enhancement and super resolution, a color restoration technique is employed to obtain an improved-visibility color image.
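The intensity-enhancement stage can be illustrated with a simple adaptive nonlinear transfer curve applied to a normalized intensity channel, where dark images are brightened more. The specific curve and adaptation rule below are assumptions for illustration; the paper's inverse sign transformation and Fourier-feature super resolution are not reproduced here.

```python
# Illustrative sketch of adaptive nonlinear intensity enhancement only.
# The gamma-style curve and the mean-based adaptation rule are assumed,
# not the paper's actual transformation.

def enhance(intensity):
    """Apply an adaptive power-law curve to intensities in [0, 1]."""
    mean = sum(intensity) / len(intensity)
    gamma = 0.5 if mean < 0.3 else 1.0     # brighten only dark images
    return [v ** gamma for v in intensity]

dark = [0.04, 0.09, 0.16, 0.25]            # underexposed intensities in [0,1]
out = enhance(dark)
print([round(v, 2) for v in out])
```

In the integrated pipeline described above, this kind of transform would run on the intensity channel before super resolution, with color restored from the original chrominance afterwards.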

Keywords: dynamic range compression, multi-level Fourier features, nonlinear enhancement, super resolution

Procedia PDF Downloads 553
25813 The System Dynamics Research of China-Africa Trade, Investment and Economic Growth

Authors: Emma Serwaa Obobisaa, Haibo Chen

Abstract:

International trade and outward foreign direct investment are widely recognized factors in economic growth and development. Though several scholars have sought to reveal the influence of trade and outward foreign direct investment (FDI) on economic growth, most studies utilized common econometric models such as vector autoregression and aggregated the variables, which has largely produced contradictory and mixed results. Thus, there is an urgent need for precise study of the effects of trade and FDI on economic growth, applying strong econometric models and disaggregating the variables into their separate individual components to explicate their respective effects. This will guarantee the provision of policies and strategies geared towards individual variables to ensure sustainable development and growth. This study, therefore, seeks to examine the causal effect of China-Africa trade and outward foreign direct investment on the economic growth of Africa using a robust and recent econometric approach, the system dynamics model. Our study tests an ensemble of vital variables predominant in recent studies on trade-FDI-economic growth causality: foreign direct investment, international trade, and economic growth. Our results showed that the system dynamics method provides more accurate statistical inference regarding the direction of causality among the variables than conventional methods such as OLS and Granger causality predominantly used in the literature, as it is more robust and provides accurate critical values.
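As a toy illustration of the system dynamics idea invoked above, a stock-and-flow model treats GDP as a stock whose inflow depends on trade and FDI levels, integrated step by step. The functional form and all coefficients below are hypothetical assumptions for illustration only, not the paper's estimated model.

```python
def simulate_growth(gdp0, trade, fdi, years, a=0.03, b=0.02, c=0.01):
    """Toy stock-and-flow sketch: GDP is a stock; its yearly inflow is a
    linear function of current GDP, trade volume, and FDI inflow.
    Coefficients a, b, c are illustrative, not estimated values."""
    gdp = [gdp0]
    for t in range(years):
        growth = gdp[-1] * a + trade[t] * b + fdi[t] * c  # inflow this year
        gdp.append(gdp[-1] + growth)                      # accumulate the stock
    return gdp
```

A real system dynamics study would add feedback loops (e.g. growth raising trade capacity) and calibrate the rates against data, but the accumulation-over-time structure is the same.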

Keywords: economic growth, outward foreign direct investment, system dynamics model, international trade

Procedia PDF Downloads 103
25812 On the Combination of Patient-Generated Data with Data from a Secure Clinical Network Environment: A Practical Example

Authors: Jeroen S. de Bruin, Karin Schindler, Christian Schuh

Abstract:

With increasingly more mobile health applications appearing due to the popularity of smartphones, the possibility arises that these data can be used to improve the medical diagnostic process, as well as the overall quality of healthcare, while at the same time lowering costs. However, to date there have been no reports of a successful combination of patient-generated data from smartphones with data from clinical routine. In this paper, we describe how these two types of data can be combined in a secure way without modification to hospital information systems, and how they can be used together in a medical expert system for automatic nutritional classification and triage.

Keywords: mobile health, data integration, expert systems, disease-related malnutrition

Procedia PDF Downloads 476
25811 The Critical Relevance of Credit and Debt Data in Household Food Security Analysis: The Risks of Ineffective Response Actions

Authors: Siddharth Krishnaswamy

Abstract:

Problem Statement: Currently, when analyzing household food security, the most commonly studied food access indicators are household income and expenditure. Larger studies do take into account other indices such as credit and employment, but these are baseline studies and by definition are conducted infrequently. Food security analysis for access is usually dedicated to analyzing income and expenditure indicators, and both are notoriously inconsistent. Yet this data can very often end up being the basis on which household food access is calculated and, by extension, be used for decision making. Objectives: This paper argues that along with income and expenditure, credit and debt information should be collected so that an accurate analysis of household food security (and in particular food access) can be determined. Because this information is not routinely collected and analyzed, the actual situation is often masked: a household’s food access and availability patterns may appear adequate mainly as a result of borrowing, and may even reflect a long-term dependency (a debt cycle). In other words, such a household is in reality worse off than it appears, a fact masked by its performance on basic access indicators. Procedures/methodologies/approaches: Existing food security data sets collected in 2005 in Azerbaijan, in 2010 across Myanmar, and in 2014-15 across Uganda were used to support the theory that analyzing household income and expenditure alone, versus analyzing the same alongside data on credit and borrowing patterns, results in an entirely different picture of household food access. Furthermore, the data analyzed depicts food consumption patterns across groups of households and relates these to the extent of dependency on credit, i.e. households borrowing money in order to meet food needs.
Finally, response options based on analyzing only income and expenditure, and response options based on income, expenditure, credit, and borrowing, from the same geographical area of operation, are studied and discussed. Results: The purpose of this work was to see if existing methods of household food security analysis could be improved. It is hoped that food security analysts will collect household-level information on credit and debt and analyze it against income, expenditure, and consumption patterns. This will help determine if a household’s food access and availability depend on unsustainable strategies such as borrowing money for food or carrying sustained debts. Conclusions: The results clearly show how much relevant information is missing from food access analysis if household debt and borrowing are not analyzed along with the typical food access indicators, and the serious repercussions this has on programmatic response and interventions.

Keywords: analysis, food security indicators, response, resilience analysis

Procedia PDF Downloads 331
25810 The Prospects of Leveraging (Big) Data for Accelerating a Just Sustainable Transition around Different Contexts

Authors: Sombol Mokhles

Abstract:

This paper explores the prospects of utilising (big) data for enabling the just transition of diverse cities. Our key purpose is to offer a framework of applications and implications of utilising (big) data in comparing sustainability transitions across different cities. Relying on cosmopolitan comparison, this paper explains the potential applications of (big) data as well as its limitations. The paper calls for adopting a data-driven and just perspective that includes different cities around the world. Keeping a just and inclusive approach front and centre ensures a just transition with synergistic effects that leave nobody behind.

Keywords: big data, just sustainable transition, cosmopolitan city comparison, cities

Procedia PDF Downloads 98
25809 Quantification and Detection of Non-Sewer Water Infiltration and Inflow in Urban Sewer Systems

Authors: M. Beheshti, S. Saegrov, T. M. Muthanna

Abstract:

Separated sewer systems are designed to transfer wastewater from houses and industrial areas to wastewater treatment plants. Unwanted water in sewer systems is a well-known problem: storm-water inflow can be around 50% of foul sewer flow, and groundwater infiltration into the sewer system can exceed 50% of total wastewater volume in deteriorated networks. Infiltration and inflow of non-sewer water (I/I) into sewer systems is unfavorable in separated sewer systems and can overload the system and reduce the efficiency of wastewater treatment plants. Moreover, I/I has negative economic, environmental, and social impacts on urban areas. Therefore, for sustainable management of urban sewer systems, I/I of unwanted water should be considered carefully, and a maintenance and rehabilitation plan should be implemented for these water infrastructure assets. This study presents a methodology to identify and quantify the level of I/I into the sewer system. The amount of I/I is evaluated by accurate flow measurement in separated sewer systems for specified isolated catchments in the city of Trondheim (Norway). Advanced information about the characteristics of I/I is gained by CCTV inspection of sewer pipelines with high I/I contribution. Enhanced knowledge about the detection and localization of non-sewer water in the foul sewer system during wet and dry weather conditions will make it possible to identify and prioritize problems in the sewer system and to take decisions for long-term rehabilitation and renewal planning. Furthermore, preventive measures and optimization of sewer system functionality and efficiency can be executed through maintenance. In this way, the operation of the sewer system can be improved by maintaining and rehabilitating existing pipelines in a more practical, cost-effective, and environmentally friendly way.
This study is conducted on specified catchments with different properties in Trondheim. Risvollan, one of these catchments, has a measuring station that records hydrological parameters throughout the year, as well as a good database. For assessing infiltration in a separated sewer system, the flow rate measurement method can be applied to obtain a general view of the network condition from an infiltration point of view. This study discusses commonly used and advanced methods of localizing and quantifying I/I in sewer systems. A combination of these methods gives sewer operators the possibility to compare different techniques and obtain reliable and accurate I/I data, which is vital for long-term rehabilitation plans.
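One common flow-measurement-based way to quantify infiltration, which the abstract alludes to but does not specify, is the minimum night flow approach: in dry weather, the night-time flow minimum is assumed to be dominated by groundwater infiltration. The sketch below is a generic illustration of that idea, not the study's actual procedure; the 0.9 correction for residual sanitary flow is an assumed parameter.

```python
import numpy as np

def estimate_infiltration(dry_weather_flow_lps, water_use_fraction=0.9):
    """Rough minimum-night-flow estimate of groundwater infiltration.
    dry_weather_flow_lps: measured flows (L/s) over a dry-weather period.
    Returns (infiltration in L/s, infiltration share of mean flow)."""
    q = np.asarray(dry_weather_flow_lps, dtype=float)
    night_min = q.min()
    # some night flow may still be sanitary; subtract an assumed share
    infiltration = night_min * water_use_fraction
    share = infiltration / q.mean()
    return infiltration, share
```

Inflow, by contrast, would be estimated from the difference between wet-weather and dry-weather hydrographs for the same catchment.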

Keywords: flow rate measurement, infiltration and inflow (I/I), non-sewer water, separated sewer systems, sustainable management

Procedia PDF Downloads 333
25808 A Robust and Efficient Segmentation Method Applied for Cardiac Left Ventricle with Abnormal Shapes

Authors: Peifei Zhu, Zisheng Li, Yasuki Kakishita, Mayumi Suzuki, Tomoaki Chono

Abstract:

Segmentation of the left ventricle (LV) from cardiac ultrasound images provides a quantitative functional analysis of the heart for diagnosing disease. The Active Shape Model (ASM) is a widely used approach for LV segmentation but suffers from the drawback that initialization of the shape model may not be sufficiently close to the target, especially when dealing with abnormal shapes in disease. In this work, a two-step framework is proposed to improve the accuracy and speed of model-based segmentation. First, a robust and efficient detector based on a Hough forest is proposed to localize cardiac feature points, and these points are used to predict the initial fitting of the LV shape model. Second, to achieve more accurate and detailed segmentation, ASM is applied to further fit the LV shape model to the cardiac ultrasound image. The performance of the proposed method is evaluated on a dataset of 800 cardiac ultrasound images, mostly of abnormal shapes. The proposed method is compared to several combinations of ASM and existing initialization methods. The experimental results demonstrate that the accuracy of feature point detection for initialization was improved by 40% compared to existing methods. Moreover, the proposed method significantly reduces the number of necessary ASM fitting loops, thus speeding up the whole segmentation process. Therefore, the proposed method achieves more accurate and efficient segmentation results and is applicable to unusual heart shapes arising from cardiac diseases, such as left atrial enlargement.
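The handover between the two steps, using detected feature points to place the mean shape model before ASM fitting, can be sketched with a least-squares similarity alignment (Procrustes analysis). This is a generic sketch of that initialization step under the assumption of point-to-point correspondence; the paper's Hough forest detector and full ASM loop are not reproduced here.

```python
import numpy as np

def init_shape_from_landmarks(mean_shape, detected_pts):
    """Align the mean LV shape model to detector-predicted feature points
    with a similarity transform (scale + rotation + translation), giving
    the starting shape for the subsequent ASM fitting. Sketch only."""
    X, Y = mean_shape, detected_pts          # (N, 2) corresponding points
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)    # center both point sets
    # optimal rotation from the SVD of the cross-covariance (Procrustes)
    U, S, Vt = np.linalg.svd(Xc.T @ Yc)
    R = (U @ Vt).T
    s = S.sum() / (Xc ** 2).sum()            # least-squares scale
    return s * Xc @ R.T + Y.mean(0)          # transformed initial shape
```

A good initialization of this kind is what reduces the number of ASM iterations the abstract reports.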

Keywords: hough forest, active shape model, segmentation, cardiac left ventricle

Procedia PDF Downloads 335
25807 Pharmacokinetic Monitoring of Glimepiride and Ilaprazole in Rat Plasma by High Performance Liquid Chromatography with Diode Array Detection

Authors: Anil P. Dewani, Alok S. Tripathi, Anil V. Chandewar

Abstract:

This manuscript reports the development and validation of a quantitative high performance liquid chromatography method for the pharmacokinetic evaluation of glimepiride (GLM) and ilaprazole (ILA) in rat plasma. Plasma samples were processed by solid-phase extraction (SPE). The analytes were resolved on a Phenomenex C18 column (4.6 mm × 250 mm; 5 µm particle size) using an isocratic elution mode comprising methanol:water (80:20 % v/v), with the pH of the water adjusted to 3 using formic acid. The total run time was 10 min at 225 nm as the common detection wavelength, and the flow rate throughout was 1 mL/min. The method was validated over the concentration range from 10 to 600 ng/mL for GLM and ILA in rat plasma. Metformin (MET) was used as the internal standard. Validation data demonstrated the method to be selective, sensitive, accurate, and precise. The limits of detection were 1.54 and 4.08, and the limits of quantification 5.15 and 13.62, for GLM and ILA respectively; the method demonstrated excellent linearity with correlation coefficients (r²) of 0.999. The intra- and inter-day precision (RSD%) values were < 2.0% for both ILA and GLM. The method was successfully applied in pharmacokinetic studies following oral administration in rats.
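Validation figures like those above are conventionally derived from the calibration line; a common (ICH-style) convention puts LOD = 3.3σ/slope and LOQ = 10σ/slope, with σ the residual standard deviation of the fit. The sketch below illustrates that calculation on hypothetical numbers; it is not the paper's data or necessarily its exact procedure.

```python
import numpy as np

def calibration_figures(conc_ng_ml, response):
    """Fit a linear calibration line and derive LOD, LOQ, and r^2.
    Uses the common convention LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope,
    where sigma is the residual standard deviation of the regression."""
    conc = np.asarray(conc_ng_ml, dtype=float)
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = np.asarray(response, dtype=float) - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)            # 2 fitted parameters
    r2 = np.corrcoef(conc, response)[0, 1] ** 2
    return 3.3 * sigma / slope, 10 * sigma / slope, r2
```

By construction LOQ/LOD = 10/3.3 under this convention, and the reported r² of 0.999 corresponds to near-linear detector response across the validated range.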

Keywords: pharmacokinetics, glimepiride, ilaprazole, HPLC, SPE

Procedia PDF Downloads 367
25806 Strategic Workplace Security: The Role of Malware and the Threat of Internal Vulnerability

Authors: Modesta E. Ezema, Christopher C. Ezema, Christian C. Ugwu, Udoka F. Eze, Florence M. Babalola

Abstract:

Some employees knowingly or unknowingly contribute to the loss of data and expose data to threats in the process of getting their jobs done. Many organizations today face the challenge of securing their data as cyber criminals constantly devise new ways of attacking an organization’s secret data. This paper outlines the latest strategies that must be put in place to protect these important data from attack in a collaborative workplace. It also introduces Advanced Persistent Threats (APTs) and how they work. An empirical study was conducted to collect data from employees in data centers on how data could be protected from malicious code and cyber criminals, and their responses are considered in detail to help counter the activities of malicious code and cyber criminals in our workplaces.

Keywords: data, employee, malware, work place

Procedia PDF Downloads 382
25805 Application of Building Information Modelling In Analysing IGBC® Ratings (Sustainability Analyses)

Authors: Lokesh Harshe

Abstract:

The building construction sector accounts for 36% of global energy consumption and 39% of CO₂ emissions. Professionals in the built environment sector have long been aware of the industry’s contribution to CO₂ emissions and are now moving towards more sustainable practices. As a result, many organizations have introduced rating systems that address global warming in the construction sector by ranking construction projects on sustainability parameters. The pre-construction phase of any building project is the most important time to make decisions addressing sustainability. Traditionally, it is very difficult to collect data from different stakeholders and bring it together to support decisions based on factual data for sustainability analyses in the pre-construction phase. Building Information Modelling (BIM) is the solution: a single model results from the collaborative BIM process, in which all information is shared, extracted, communicated, and stored on one platform that everyone can access to make decisions based on real-time data. The focus of this research is the Indian green rating system IGBC®, with the objective of understanding IGBC® requirements and developing a framework that relates the rating processes to BIM. A hypothetical (architectural) model of a hostel building is developed using AutoCAD 2019 and Revit Architecture 2019, and the framework is applied to generate sustainability analysis results using Green Building Studio (GBS) and Revit add-ins. The results of any sustainability analysis are generated within a fraction of a minute, which is very quick compared with traditional sustainability analysis and may save considerable time and cost.
The future scope is to integrate architectural, structural, and MEP models to perform accurate sustainability analyses with inputs from industry professionals working on real-life green BIM projects.

Keywords: sustainability analyses, BIM, green rating systems, IGBC®, LEED

Procedia PDF Downloads 52
25804 Simulation of Optimal Runoff Hydrograph Using Ensemble of Radar Rainfall and Blending of Runoffs Model

Authors: Myungjin Lee, Daegun Han, Jongsung Kim, Soojun Kim, Hung Soo Kim

Abstract:

Recently, localized heavy rainfall and typhoons have occurred frequently due to climate change, and the resulting damage is growing, so more accurate prediction of rainfall and runoff is needed. However, gauge rainfall has limited accuracy in space. Radar rainfall explains the spatial variability of rainfall better than gauge rainfall, but it is mostly underestimated and carries uncertainty. Therefore, an ensemble of radar rainfall was simulated using the error structure between radar and gauge rainfall to overcome this uncertainty. The simulated ensemble was used as input data to rainfall-runoff models to obtain an ensemble of runoff hydrographs. Previous studies have discussed the accuracy of rainfall-runoff models: even if the same input data, such as rainfall, are used for runoff analysis in the same basin, different models can give different results because of the uncertainty involved in the models. Therefore, we used two models, the SSARR model, which is a lumped model, and the Vflo model, which is a distributed model, and tried to simulate the optimum runoff considering the uncertainty of each rainfall-runoff model. The study basin is located in the Han river basin, and we obtained one integrated, optimum runoff hydrograph using blending methods such as Multi-Model Super Ensemble (MMSE), Simple Model Average (SMA), and Mean Square Error (MSE) weighting. From this study, we could confirm the accuracy of rainfall and rainfall-runoff models using ensemble scenarios and various rainfall-runoff models, and this result can be used to study flood control measures under climate change. Acknowledgements: This work is supported by the Korea Agency for Infrastructure Technology Advancement (KAIA) grant funded by the Ministry of Land, Infrastructure and Transport (Grant 18AWMP-B083066-05).
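Two of the blending ideas named in the abstract can be sketched simply: Simple Model Average (SMA) is an unweighted mean of the model hydrographs, while an error-based blend weights each model by its past fit to observations. The inverse-MSE weighting below is a common convention used here for illustration, not necessarily the paper's exact formulation.

```python
import numpy as np

def blend_hydrographs(q_models, q_obs=None):
    """Blend runoff hydrographs from several rainfall-runoff models.
    q_models: array-like of shape (n_models, n_timesteps).
    Without observations, returns the Simple Model Average (SMA);
    with observations, returns an inverse-MSE weighted blend."""
    Q = np.asarray(q_models, dtype=float)
    sma = Q.mean(axis=0)                       # SMA: unweighted mean
    if q_obs is None:
        return sma
    mse = ((Q - np.asarray(q_obs, dtype=float)) ** 2).mean(axis=1)
    w = (1.0 / mse) / (1.0 / mse).sum()        # better-fitting models weigh more
    return w @ Q                               # weighted blend per timestep
```

The MMSE method in the abstract is a more elaborate regression-based generalization of this weighting idea.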

Keywords: radar rainfall ensemble, rainfall-runoff models, blending method, optimum runoff hydrograph

Procedia PDF Downloads 279
25803 Acceptance of Big Data Technologies and Its Influence towards Employee’s Perception on Job Performance

Authors: Jia Yi Yap, Angela S. H. Lee

Abstract:

With the use of big data technologies, organizations can obtain the results they are interested in. Big data technologies load all the data that is useful to organizations and provide a better way of analysing it. The purpose of this research is to gather employees’ opinions from firms in Malaysia to explore the use of big data technologies in their organizations and how this may affect employees’ perceptions of job performance. Therefore, to identify whether accepting big data technologies in an organization affects employees’ perceptions, a questionnaire will be distributed to employees from different small and medium-sized enterprises (SMEs) listed in Malaysia. The proposed conceptual model will be tested with other variables to examine the relationships between them.

Keywords: big data technologies, employee, job performance, questionnaire

Procedia PDF Downloads 296