Search results for: workflow privacy
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 605

425 Fast Transient Workflow for External Automotive Aerodynamic Simulations

Authors: Christina Peristeri, Tobias Berg, Domenico Caridi, Paul Hutcheson, Robert Winstanley

Abstract:

In recent years, the demand for rapid innovation in the automotive industry has led to the need for accelerated simulation procedures while retaining a detailed representation of the simulated phenomena. The project’s aim is to create a fast transient workflow for external aerodynamic CFD simulations of road vehicles. The geometry used was the SAE Notchback Closed Cooling DrivAer model, and the simulation results were compared with data from wind tunnel tests. The meshes generated for this study were of two types: one was a mix of polyhedral cells near the surface and hexahedral cells away from the surface; the other was an octree hex mesh with a rapid method of fitting to the surface. Three grid refinement levels were used for each mesh type, with the biggest total cell count, for the octree mesh, being close to 1 billion. A series of steady-state solutions was obtained on the three grid levels using a pseudo-transient coupled solver and a k-omega-based RANS turbulence model. In all cases, a mesh-independent solution was found at the medium level of refinement (200 million cells). Stress-Blended Eddy Simulation (SBES), which uses a shielding function to explicitly switch between RANS and LES modes, was chosen for the transient simulations. A converged pseudo-transient steady-state solution was used to initialize the transient SBES run, which was set up with the SIMPLEC pressure-velocity coupling scheme to reach the fastest solution (on both CPU and GPU solvers). An important part of this project was the use of FLUENT’s Multi-GPU solver. A Tesla A100 GPU has been shown to be 8x faster than an Intel 48-core Skylake CPU system, leading to significant simulation speed-up compared to the traditional CPU solver. The current study used 4 Tesla A100 GPUs and 192 CPU cores. The combination of rapid octree meshing and GPU computing shows significant promise in reducing time and hardware costs for industrial-strength aerodynamic simulations.
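
A quick back-of-envelope check of the hardware comparison quoted above; all figures come from the abstract, and the simple throughput-scaling assumption is ours:

```python
# One A100 is reported as ~8x a 48-core Skylake node, so the study's
# 4 GPUs deliver roughly the throughput of 4 * 8 * 48 = 1536 Skylake-core
# equivalents, i.e. about 8x the 192 CPU cores used alongside them.
gpus, speedup_per_gpu, cores_per_node = 4, 8, 48
cpu_cores = 192
core_equivalents = gpus * speedup_per_gpu * cores_per_node   # 1536
print(core_equivalents / cpu_cores)                          # -> 8.0
```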

Keywords: CFD, DrivAer, LES, Multi-GPU solver, octree mesh, RANS

Procedia PDF Downloads 84
424 Bitcoin, Blockchain and Smart Contract: Attacks and Mitigations

Authors: Mohamed Rasslan, Doaa Abdelrahman, Mahmoud M. Nasreldin, Ghada Farouk, Heba K. Aslan

Abstract:

Blockchain is a distributed database that endorses transparency, while Bitcoin is a decentralized cryptocurrency (electronic cash) that endorses anonymity and is powered by blockchain technology. Smart contracts are programs stored on a blockchain that execute when predetermined conditions are fulfilled. Smart contracts automate agreement execution so that all participants are immediately certain of the outcome, without any intermediary's involvement or loss of time. Currently, the Bitcoin market is worth billions of dollars. Bitcoin can be transferred from one purchaser to another without the need for an intermediary bank. Network nodes verify Bitcoin transactions through cryptography, and the transactions are registered in a public ledger called the “blockchain”. Bitcoin can be exchanged for other coins, merchandise, and services. The rapid growth of Bitcoin’s market value encourages adversaries to exploit its weaknesses and vulnerabilities for profit; it also motivates scientists to catalogue known vulnerabilities, offer countermeasures, and predict future threats. In this paper, we study blockchain technology and Bitcoin from the attacker’s point of view. Furthermore, mitigations for the attacks are suggested, and contemporary security solutions are discussed. Finally, research directions toward strict security and privacy protocols are elaborated.

Keywords: Cryptocurrencies, Blockchain, Bitcoin, Smart Contracts, Peer-to-Peer Network, Security Issues, Privacy Techniques

Procedia PDF Downloads 49
423 The Regulation of Reputational Information in the Sharing Economy

Authors: Emre Bayamlıoğlu

Abstract:

This paper aims to provide an account of the legal and regulative aspects of algorithmic reputation systems, with a special emphasis on the sharing economy (i.e., Uber, Airbnb, Lyft) business model. The first section starts with an analysis of the legal and commercial nature of the tripartite relationship among the parties, namely the host platform, the individual sharers/service providers, and the consumers/users. The section further examines to what extent an algorithmic system of reputational information could serve as an alternative to legal regulation. Shortcomings are explained and analyzed with specific examples from the Airbnb platform, a pioneering success in the sharing economy. The following section focuses on the issue of governance and control of reputational information. It first analyzes the legal consequences of algorithmic filtering systems that detect undesired comments, and how a delicate balance could be struck between competing interests such as freedom of speech, privacy, and the integrity of commercial reputation. The third section deals with the problem of manipulation by users. Indeed, many sharing economy businesses employ techniques of data mining and natural language processing to verify the consistency of feedback, since software agents referred to as "bots" are employed by users to "produce" fake reputation values. Such automated techniques are deceptive, with significant negative effects that undermine the trust upon which the reputational system is built. The fourth section explores concerns with regard to data mobility, data ownership, and privacy. Reputational information provided by consumers in the form of textual comments may be regarded as writing eligible for copyright protection, and algorithmic reputational systems also contain personal data pertaining to both the individual entrepreneurs and the consumers. The final section starts with an overview of the notion of reputation as a communitarian and collective form of referential trust and provides an evaluation of the above legal arguments from the perspective of the public interest in the integrity of reputational information. The paper concludes with guidelines and design principles for algorithmic reputation systems that address the legal implications raised above.

Keywords: sharing economy, design principles of algorithmic regulation, reputational systems, personal data protection, privacy

Procedia PDF Downloads 440
422 Understanding Complexity at Pre-Construction Stage in Project Planning of Construction Projects

Authors: Mehran Barani Shikhrobat, Roger Flanagan

Abstract:

Construction planning and scheduling based on current tools and techniques is either deterministic in nature (Gantt charts, CPM) or applies only a crude probability of completion (PERT) to each task. These methods assume an "ideal project" that starts with a complete set of clearly defined goals and constraints that remain constant throughout its duration, whereas every real project embodies assumptions and influences. Construction planners continue to apply the traditional methods and tools of “hard” project management that were developed for such ideal projects, neglecting the potential influence of complexity on the design and construction process. The aim of this research is to investigate the emergence and growth of complexity in project planning and to provide a model that considers the influence of complexity on the total project duration at the post-contract-award, pre-construction stage of a project. The literature review showed that complexity originates from different sources: the environment, technical issues, and workflow interactions. These can be divided into two categories of complexity factors: first, project tasks, and second, project organisation and management. Project-task complexity may originate from performance issues, lack of resources, or environmental changes for a specific task; complexity factors that relate to organisation and management refer to workflow and the interdependence of different parts. The literature review also highlighted the ineffectiveness of traditional tools and techniques in planning for complexity. In this research, the fundamental causes of complexity in construction projects were investigated through a questionnaire with industry experts, and the results were used to develop a model that considers the core complexity factors and their interactions. System dynamics was used to investigate the model and to consider the influence of complexity on project planning. Feedback from experts revealed 20 major complexity factors that impact project planning; the factors are divided into five categories, known as core complexity factors. To understand the relative weight of each factor, the Analytic Hierarchy Process (AHP) was used, as sketched below. The comparison showed that externalities are ranked as the biggest influence across the complexity factors. The research underlines that there are many internal and external factors that impact project activities and the project overall, and it shows the importance of considering the influence of complexity on the project master plan undertaken at the post-contract-award, pre-construction phase of a project.
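
For readers unfamiliar with the AHP step, a minimal sketch of the priority-weight computation follows; only "externalities" is named in the abstract, so the other category labels and all pairwise judgment values are illustrative assumptions, not the paper's data.

```python
import numpy as np

def ahp_weights(pairwise):
    """AHP priority weights from a pairwise comparison matrix via the
    principal eigenvector method, plus Saaty's consistency ratio."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))            # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                # normalised priority weights
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index, n = 3..5
    ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
    return w, ci / ri

# Hypothetical pairwise judgments over five core complexity categories.
categories = ["externalities", "tasks", "organisation", "technical", "environment"]
A = [[1, 3, 3, 5, 5],
     [1/3, 1, 1, 3, 3],
     [1/3, 1, 1, 3, 3],
     [1/5, 1/3, 1/3, 1, 1],
     [1/5, 1/3, 1/3, 1, 1]]
w, cr = ahp_weights(A)
for name, weight in zip(categories, w):
    print(f"{name:13s} {weight:.3f}")
print(f"consistency ratio: {cr:.3f}")           # < 0.10 is conventionally acceptable
```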

Keywords: project planning, project complexity measurement, planning uncertainty management, project risk management, strategic project scheduling

Procedia PDF Downloads 70
421 Health Status Monitoring of COVID-19 Patients through Blood Tests and Naïve Bayes

Authors: Carlos Arias-Alcaide, Cristina Soguero-Ruiz, Paloma Santos-Álvarez, Adrián García-Romero, Inmaculada Mora-Jiménez

Abstract:

Analysing clinical data with computers in a way that has an impact on practitioners’ workflow is a challenge nowadays. This paper provides a first approach for monitoring the health status of COVID-19 patients through the use of some biomarkers (blood tests) and the simplest Naïve Bayes classifier. Data from two Spanish hospitals were considered, showing the potential of our approach to estimate reasonable posterior probabilities even some days before the event.
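
A minimal sketch of the classification approach described here, using scikit-learn's Gaussian Naïve Bayes; the hospital data are not public, so the dataset below is a synthetic stand-in and the biomarker names in the comment are assumptions:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Toy stand-in for the hospital data: rows are patient-days, columns are
# blood biomarkers (e.g. CRP, LDH, lymphocyte count; names assumed),
# y indicates whether the adverse event occurred in the follow-up window.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

clf = GaussianNB().fit(X[:150], y[:150])

# The monitoring signal is the posterior probability of the event class,
# re-evaluated each day as new blood tests arrive.
posterior = clf.predict_proba(X[150:])[:, 1]
print(posterior[:5])
```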

Keywords: Bayesian model, blood biomarkers, classification, health tracing, machine learning, posterior probability

Procedia PDF Downloads 183
420 Preparedness for Nurses to Adopt the Implementation of Inpatient Medication Order Entry (IPMOE) System at United Christian Hospital (UCH) in Hong Kong

Authors: Yiu K. C. Jacky, Tang S. K. Eric, W. Y. Tsang, C. Y. Li, C. K. Leung

Abstract:

Objectives: (1) to enhance the competence of nurses in using IPMOE for drug administration; (2) to ensure a safe and smooth hospital-wide transition to IPMOE. Methodology: (1) Well-structured governance: to make provision for IPMOE implementation, multidisciplinary governance structures at the corporate and local levels were established. (2) Staff engagement: a series of staff engagement events was conducted, including a staff forum, IPMOE hospital visits, a kick-off ceremony, and the establishment of an IPMOE webpage, to familiarize frontline staff with the forthcoming implementation. (3) Well-organized training program, from workshop to workplace: two different IPMOE training programs were tailor-made to introduce the core features of the administration module. Fifty-five identical training classes and six train-the-trainer workshops were organized in Q2-Q3 2015, and a lending scheme for IPMOE hardware was launched for hands-on practice, further extending the training from the workshop to the workplace. (4) Standard guidelines and workflow: the related workflow and guidelines were developed to help users acquire competence with IPMOE and fully familiarize themselves with the standardized contingency plan. (5) Facilities and equipment: the installation of IPMOE hardware was promptly arranged for rollout, and an IPMOE training venue was established for staff training. (6) Risk management strategy: the UCH Medication Safety Forum was organized in December 2015 to share “tricks and tips” on IPMOE, which were further disseminated on the webpage to raise awareness of medication safety. A hospital-wide annual audit of drug administration was planned to gauge compliance and identify room for improvement. Results: through the comprehensive training plan, over 1,000 UCH nurses attended the training program, with positive feedback; they agreed that their competence in using IPMOE was enhanced. By the end of November 2015, 28 wards (over 1,000 inpatient beds) across the departments of M&G, SUR, O&T, and O&G had successfully rolled out IPMOE within 5 months. A smooth and safe transition to IPMOE was achieved. Eventually, we all became prepared to embed IPMOE into daily nursing and to work together for medication safety at UCH.

Keywords: drug administration, inpatient medication order entry system, medication safety, nursing informatics

Procedia PDF Downloads 302
419 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective

Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou

Abstract:

The analysis of geographic inequality relies heavily on the use of location-enabled statistical data and quantitative measures to present the spatial patterns of selected phenomena and analyze their differences. To protect the privacy of individual instances and link to administrative units, point-based datasets are spatially aggregated into area-based statistical datasets, where only the overall status for the selected level of spatial units is used for decision making. The partition of the spatial units thus has a dominant influence on the analyzed results, a problem well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan based on spatial partition principles that aim for homogeneous population and household counts. Compared to traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select an appropriate dissemination level for publishing statistical data. This paper compares the results of respectively using TGSC and township units on mortality data and examines the spatial characteristics of their outcomes. For the mortality data of Taitung County between January 1st, 2008 and December 31st, 2010, the all-cause age-standardized death rate (ASDR) at the township level ranges from 571 to 1,757 per 100,000 persons, whereas the 2nd dissemination area (TGSC) shows greater variation, ranging from 0 to 2,222 per 100,000. The finer granularity of TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality and can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment). The management and analysis of the statistical data referring to the TGSC in this research is strongly supported by Geographic Information System (GIS) technology. An integrated workflow was developed that consists of processing death certificates, geocoding street addresses, quality assurance of geocoded results, automatic calculation of statistical measures, standardized encoding of measures, and geo-visualization of statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic distribution perspective to further examine the hidden spatial characteristics of mortality data and justify the analyzed results. With a common statistical area framework like TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways, helping to avoid wrong decision making.
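
The ASDR figures quoted above come from direct age standardization, which the workflow computes automatically; a minimal sketch follows, with assumed illustrative counts rather than the Taitung data:

```python
import numpy as np

def asdr(deaths, population, std_pop, per=100_000):
    """Directly age-standardized death rate: age-specific rates weighted
    by a standard population's age structure."""
    rates = np.asarray(deaths) / np.asarray(population)   # age-specific rates
    weights = np.asarray(std_pop) / np.sum(std_pop)       # standard weights
    return per * np.sum(rates * weights)

# Illustrative counts for one spatial unit (three age bands; numbers assumed)
deaths     = [2, 15, 60]
population = [12_000, 9_000, 4_000]
std_pop    = [20_000, 15_000, 8_000]          # e.g. a WHO or national standard
print(f"ASDR: {asdr(deaths, population, std_pop):.0f} per 100,000")
```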

Keywords: mortality map, spatial patterns, statistical area, variation

Procedia PDF Downloads 223
418 Social Media Resignation the Only Way to Protect User Data and Restore Cognitive Balance, a Literature Review

Authors: Rajarshi Motilal

Abstract:

The birth of the Internet and the rise of social media marked an important chapter in the history of humankind. Often termed the fourth scientific revolution, the Internet has changed human lives and cognisance. The birth of Web 2.0, followed by the launch of social media and social networking sites, added another milestone to these technological advancements, where connectivity and the influx of information became dominant. With billions of individuals using the Internet and social media sites in the 21st century, “users” became “consumers”, and orthodox marketing reshaped itself into digital marketing. Furthermore, organisations started using sophisticated algorithms to predict consumer purchase behaviour and manipulate it to sustain themselves in such a competitive environment. The rampant storage and analysis of individual data became the new normal, raising many questions about data privacy. Excessive Internet usage also brought other problems: individuals becoming addicted, scavenging for societal approval and instant gratification, subsequently leading to a collective dualism, isolation, and finally depression. This study aims to determine the relationship between social media usage in the modern age and the rise of psychological and cognitive imbalances in human minds. The literature review is a timely addition to existing work at a moment when the world is constantly debating whether social media resignation is the only way to protect user data and restore the decaying cognitive balance.

Keywords: social media, digital marketing, consumer behaviour, internet addiction, data privacy

Procedia PDF Downloads 47
417 A Medical Vulnerability Scoring System Incorporating Health and Data Sensitivity Metrics

Authors: Nadir A. Carreon, Christa Sonderer, Aakarsh Rao, Roman Lysecky

Abstract:

With the advent of complex software and increased connectivity, the security of life-critical medical devices is becoming an increasing concern, particularly given their direct impact on human safety. Security is essential, but it is impossible to develop completely secure and impenetrable systems at design time. Therefore, it is important to assess the potential impact of exploiting a vulnerability on the security and safety of such critical medical systems. The Common Vulnerability Scoring System (CVSS) calculates the severity of exploitable vulnerabilities; however, for medical devices, it does not consider the unique challenges of impacts on human health and privacy. Thus, a medical device on which human life depends (e.g., pacemakers, insulin pumps) can score very low, while a system on which human life does not depend (e.g., hospital archiving systems) might score very high. In this paper, we propose a medical vulnerability scoring system (MVSS) that extends CVSS to address the health and privacy concerns of medical devices. We propose incorporating two new parameters, namely health impact and sensitivity impact. Sensitivity refers to the type of information that can be stolen from the device, and health represents the impact on the safety of the patient if the vulnerability is exploited (e.g., potential harm, life-threatening). We evaluate fifteen different known vulnerabilities in medical devices and compare MVSS against two state-of-the-art medical-device-oriented vulnerability scoring systems and the foundational CVSS.
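
The paper's exact scoring formula is not reproduced in the abstract; the sketch below only illustrates the idea of blending a CVSS base score with health and sensitivity terms, with weights that are our assumptions:

```python
def mvss(cvss_base, health_impact, sensitivity_impact):
    """Illustrative MVSS-style adjustment (weights are assumptions, not the
    paper's formula): blend the CVSS base score with a health-impact term
    and a data-sensitivity term, capping the result at 10."""
    assert 0.0 <= cvss_base <= 10.0
    assert 0.0 <= health_impact <= 1.0 and 0.0 <= sensitivity_impact <= 1.0
    adjusted = cvss_base * (1 + 1.0 * health_impact + 0.25 * sensitivity_impact)
    return min(round(adjusted, 1), 10.0)

# With a health term in play, a pacemaker flaw with life-threatening impact
# outranks a hospital archiving flaw despite a lower CVSS base score.
print(mvss(cvss_base=5.0, health_impact=1.0, sensitivity_impact=0.6))  # -> 10.0
print(mvss(cvss_base=8.0, health_impact=0.0, sensitivity_impact=0.3))  # -> 8.6
```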

Keywords: common vulnerability system, medical devices, medical device security, vulnerabilities

Procedia PDF Downloads 128
416 Design of Visual Repository, Constraint and Process Modeling Tool Based on Eclipse Plug-Ins

Authors: Rushiraj Heshi, Smriti Bhandari

Abstract:

Master Data Management requires the creation of a central repository, the application of constraints on the repository, and the design of processes to manage data. Designing repositories, repository constraints, and business processes is a tedious and time-consuming task for a large enterprise. Hence, visual repository, constraint, and process (workflow) modeling is the most critical step in Master Data Management. In this paper, we realize a visual modeling tool for implementing repositories, constraints, and processes as Eclipse plug-ins using GMF/EMF, following the principles of Model-Driven Engineering (MDE).

Keywords: EMF, GMF, GEF, repository, constraint, process

Procedia PDF Downloads 462
415 Process Improvement and Redesign of the Immuno Histology (IHC) Lab at MSKCC: A Lean and Ergonomic Study

Authors: Samantha Meyerholz

Abstract:

MSKCC offers patients cutting-edge cancer care with the highest quality standards. However, many patients and industry members do not realize that the operations of the Immunology Histology (IHC) Lab are the backbone for carrying out this mission. The IHC lab manufactures blocks and slides containing critical tissue samples that will be read by a pathologist to diagnose and dictate a patient’s treatment course. The lab processes 200 requests daily, leading to the generation of approximately 2,000 slides and 1,100 blocks each day. Lab material is transported through labeling, cutting, staining, and sorting manufacturing stations, while being managed by multiple techs throughout the space. The quality of the stain, as well as the wait times associated with processing requests, is directly associated with patients receiving rapid treatments and having a wider range of care options. This project aims to improve slide request turnaround time for rush and non-rush cases, while increasing the quality of each request filled (no missing slides or poorly stained items). Rush cases are to be filled in less than 24 hours, while standard cases are allotted a 48-hour time period. Reducing turnaround times enables patients to communicate sooner with their clinical team regarding their diagnosis, ultimately leading to faster treatments and potentially better outcomes. Additional project goals included streamlining tech and material workflow, while reducing waste and increasing efficiency. The project followed a DMAIC structure with emphasis on lean and ergonomic principles that could be integrated into an evolving lab culture. Load times and batching processes were analyzed using process mapping, FMEA analysis, waste analysis, engineering observation, 5S, and spaghetti diagramming. Reduction of lab technician movement, as well as their body position at each workstation, was of top concern to pathology leadership. With new equipment being brought into the lab to carry out workflow improvements, screen and tool placement was discussed with the techs in focus groups to reduce variation and increase comfort throughout the workspace. 5S analysis was completed in two phases in the IHC lab, helping to drive solutions that reduced rework and tech motion. The IHC lab plans to continue utilizing these techniques to further reduce the time gap between tissue analysis and cancer care.

Keywords: engineering, ergonomics, healthcare, lean

Procedia PDF Downloads 198
414 Computational, Human, and Material Modalities: An Augmented Reality Workflow for Building Form-Found Textile Structures

Authors: James Forren

Abstract:

This research paper details a recent demonstrator project in which digitally form-found textile structures were built by human craftspersons wearing augmented reality (AR) head-worn displays (HWDs). The project utilized a wet-state natural fiber / cementitious matrix composite to generate minimal-bending shapes in tension which, when cured and rotated, performed as minimal-bending compression members. The significance of the project is that it synthesizes computational structural simulations with visually guided handcraft production. Computational and physical form-finding methods with textiles are well characterized in the development of architectural form. One difficulty, however, is physically building computer simulations, which often requires complicated digital fabrication workflows. AR HWDs, by contrast, have been used to build complex digital forms from bricks, wood, plastic, and steel without digital fabrication devices; these projects utilize, instead, the tacit-knowledge motor schemas of the human craftsperson. Computational simulations offer unprecedented speed and performance in solving complex structural problems. Human craftspersons possess highly efficient complex spatial reasoning motor schemas. And textiles offer efficient form-generating possibilities for individual structural members and overall structural forms. This project proposes that the synthesis of these three modalities of structural problem-solving (computational, human, and material) may not only develop efficient structural form but offer further creative potentialities when the respective intelligence of each modality is productively leveraged. The project methodology pertains to its three modalities of production: 1) computational, 2) human, and 3) material. A proprietary three-dimensional graphic statics simulator generated a three-legged arch as a wireframe model. This wireframe was discretized into nine modules, three per leg. Each module was modeled as a woven matrix of one-inch diameter cords, and each woven matrix was transmitted to a holographic engine running on the HWDs. Craftspersons wearing the HWDs then wove wet cementitious cords within a simple falsework frame to match the minimal-bending form displayed in front of them. Once the woven components cured, they were demounted from the frame and assembled into a full structure using the holographically displayed computational model as a guide. The assembled structure was approximately eighteen feet in diameter and ten feet in height and matched the holographic model to under an inch of tolerance. The construction validated the computational simulation of the minimal-bending form, as the structure was dimensionally stable for a ten-day period, after which it was disassembled. The demonstrator illustrated the facility with which a computationally derived, structurally stable form could be achieved through the holographically guided, complex three-dimensional motor schema of the human craftsperson. However, the workflow traveled unidirectionally from computer to human to material, failing to fully leverage the intelligence of each modality. Subsequent research (a workshop testing human interaction with a physics-engine simulation of string networks, and work on the use of HWDs to capture hand gestures in weaving) seeks to develop further interactivity with rope and cord towards a bi-directional workflow within full-scale building environments.

Keywords: augmented reality, cementitious composites, computational form finding, textile structures

Procedia PDF Downloads 138
413 Applying Big Data Analysis to Efficiently Exploit the Vast Unconventional Tight Oil Reserves

Authors: Shengnan Chen, Shuhua Wang

Abstract:

Successful production of hydrocarbons from unconventional tight oil reserves has changed the energy landscape in North America. The oil contained within these reservoirs typically will not flow to the wellbore at economic rates without assistance from advanced horizontal wells and multi-stage hydraulic fracturing. Efficient and economic development of these reserves is a priority for society, government, and industry, especially under the current low oil prices. Meanwhile, society needs technological and process innovations to enhance oil recovery while concurrently reducing environmental impacts. Recently, big data analysis and artificial intelligence have become very popular, developing data-driven insights for better designs and decisions in various engineering disciplines; however, the application of data mining in petroleum engineering is still in its infancy. This research aims to apply intelligent data analysis and data-driven models to exploit unconventional oil reserves both efficiently and economically. More specifically, a comprehensive database including reservoir geological data, reservoir geophysical data, well completion data, and production data for thousands of wells is first established to discover valuable insights and knowledge related to tight oil reserves development. Several data analysis methods are introduced to analyze such a huge dataset. For example, K-means clustering is used to partition all observations into clusters; principal component analysis is applied to emphasize the variation and bring out strong patterns in the dataset, making the big data easy to explore and visualize; and exploratory factor analysis (EFA) is used to identify the complex interrelationships between well completion data and well production data. Different data mining techniques, such as artificial neural networks, fuzzy logic, and machine learning techniques, are then summarized, and appropriate ones are selected to analyze the database based on prediction accuracy, model robustness, and reproducibility. Advanced knowledge and patterns are finally recognized and integrated into a modified self-adaptive differential evolution optimization workflow to enhance oil recovery and maximize the net present value (NPV) of the unconventional oil resources. This research will advance knowledge in the development of unconventional oil reserves and bridge the gap between big data and performance optimization in these formations. The newly developed data-driven optimization workflow is a powerful approach to guide field operations, leading to better designs, higher oil recovery, and greater economic return for future wells in unconventional oil reserves.
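
A minimal sketch of the clustering-plus-dimensionality-reduction step described above; the real wells database is not public, so the attribute values here are synthetic stand-ins:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Stand-in for the wells database: rows are wells, columns mix geological,
# completion, and production attributes (all values synthetic).
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 12))

Xs = StandardScaler().fit_transform(X)        # put attributes on one scale

# PCA to expose the dominant patterns and make the data easy to visualize
pca = PCA(n_components=3).fit(Xs)
Z = pca.transform(Xs)
print("explained variance:", pca.explained_variance_ratio_.round(2))

# K-means to partition wells into analogous groups
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Z)
print("wells per cluster:", np.bincount(labels))
```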

Keywords: big data, artificial intelligence, enhance oil recovery, unconventional oil reserves

Procedia PDF Downloads 258
412 The Technophobia among Older Adults in China

Authors: Erhong Sun, Xuchun Ye

Abstract:

Technophobia, namely the fear or dislike of modern advanced technologies, plays a central role in age-related digital divides and is considered a new risk factor for older adults, as it can affect their daily lives through low adherence to digital living. Indeed, there is considerable heterogeneity among the older adults who experience technophobia. Therefore, the aim of this study was to identify different technophobia typologies of older people and to examine their associations with subjective age. A sample of 704 retired adults over the age of 55 was recruited in China. Technophobia and subjective age were each assessed with a questionnaire. Latent profile analysis was used to identify technophobia subgroups, using three dimensions (techno-anxiety, techno-paranoia, and privacy concerns) as indicators, and the association between the identified subgroups and subjective age was explored. In summary, four different technophobia typologies were identified among older adults in China. Combined with an investigation of personal background characteristics and subjective age, this draws a more nuanced image of the technophobia phenomenon among older adults in China. First, not all older adults suffer from technophobia, with about half of the subjects belonging to the “low-technophobia” and “medium-technophobia” profiles. Second, privacy concerns play an important role in the classification of technophobia among older adults. Third, subjective age might be a protective factor against technophobia in older adults. Although the causal direction between the identified technophobia typologies and subjective age remains uncertain, our findings suggest that future interventions should focus on subjective age by breaking age stereotypes around technology, to reduce the negative effect of technophobia on older adults. Future development of this research will involve extensive investigation of the detailed impact of technophobia on senior populations, measurement of the negative outcomes, and the formulation of innovative educational and clinical pathways.
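
Latent profile analysis is usually run in dedicated statistical packages; as a rough stand-in, a Gaussian mixture over the three indicators with BIC-based selection of the number of profiles captures the same idea (the data below are synthetic, not the survey responses):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Stand-in for the survey data: 704 respondents scored on the three
# indicators (techno-anxiety, techno-paranoia, privacy concerns).
rng = np.random.default_rng(1)
X = rng.normal(size=(704, 3))

# Fit 1..6 profiles and keep the solution with the lowest BIC,
# mirroring the usual model-selection step in latent profile analysis.
models = [GaussianMixture(n_components=k, random_state=0).fit(X)
          for k in range(1, 7)]
best = min(models, key=lambda m: m.bic(X))
print("profiles retained:", best.n_components)
profiles = best.predict(X)                    # profile membership per respondent
```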

Keywords: technophobia, older adults, latent profile analysis, subjective age

Procedia PDF Downloads 45
411 A Review of Gas Hydrate Rock Physics Models

Authors: Hemin Yuan, Yun Wang, Xiangchun Wang

Abstract:

Gas hydrate is drawing attention because its worldwide abundance is enormous, almost twice the conventional hydrocarbon reserves, making it a potential alternative source of energy. It is widely distributed in permafrost and on continental ocean shelves, and many countries have launched national programs for investigating gas hydrate. Gas hydrate is mainly explored through seismic methods, which include bottom simulating reflectors (BSR), amplitude blanking, and polarity reversal. These seismic methods are effective at finding gas hydrate formations but usually carry large uncertainties when applied to invert the micro-scale petrophysical properties of the formations, due to a lack of constraints. Rock physics modeling links the micro-scale structures of rocks to their macro-scale elastic properties and can work as an effective constraint for the seismic methods. A number of rock physics models have been proposed for gas hydrate modeling, addressing different mechanisms and applications. However, these models are generally not well classified, and it is confusing to determine the appropriate model for a specific study. Moreover, since the modeling usually involves multiple models and steps, it is difficult to determine the source of uncertainties. To solve these problems, we summarize the developed models/methods and classify them in four ways: according to the hydrate micro-scale morphology in sediments, the purpose of reservoir characterization, the stage of gas hydrate generation, and the lithology type of the hosting sediments. Some sub-categories may overlap, but they have different priorities. We also analyze the priorities of the different models, bring up their shortcomings, and explain the appropriate application scenarios. Moreover, by comparing the models, we summarize a general workflow for the modeling procedure, which includes forming the rock matrix, generating the dry rock frame, mixing the pore fluids, and a final fluid substitution in the rock frame. These procedures have been widely used in various gas hydrate modeling studies and have been confirmed to be effective. We also analyze the potential sources of uncertainty in each modeling step, which enables the potential uncertainties in the modeling to be clearly recognized. In the end, we explicate the general problems of the current models, including the influences of pressure and temperature, pore geometry, hydrate morphology, and rock structure change during gas hydrate dissociation and re-generation. We also point out that attenuation is severely affected by gas hydrate in sediments and may work as an indicator to map gas hydrate concentration. Our work classifies rock physics models of gas hydrate into different categories, generalizes the modeling workflow, and analyzes the modeling uncertainties and potential problems, which can facilitate the rock physics characterization of gas-hydrate-bearing sediments and provide hints for future studies.
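
The final fluid-substitution step of the generalized workflow is typically carried out with Gassmann's equation; a minimal sketch follows, with illustrative input moduli that are assumptions rather than values from the paper:

```python
import numpy as np

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Gassmann fluid substitution: saturated bulk modulus from the dry-frame,
    mineral, and fluid moduli plus porosity (shear modulus is unchanged)."""
    a = (1.0 - k_dry / k_min) ** 2
    b = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + a / b

# Illustrative values (moduli in GPa) for an unconsolidated quartz sand
k_dry, mu, k_min, phi, rho = 2.5, 1.8, 36.6, 0.38, 2.0e3
k_brine = 2.3
k_sat = gassmann_ksat(k_dry, k_min, k_brine, phi)
vp = np.sqrt((k_sat + 4.0 * mu / 3.0) * 1e9 / rho)         # P-wave velocity, m/s
print(f"K_sat = {k_sat:.2f} GPa, Vp = {vp:.0f} m/s")
```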

Keywords: gas hydrate, rock physics model, modeling classification, hydrate morphology

Procedia PDF Downloads 112
410 An Investigation on Opportunities and Obstacles on Implementation of Building Information Modelling for Pre-fabrication in Small and Medium Sized Construction Companies in Germany: A Practical Approach

Authors: Nijanthan Mohan, Rolf Gross, Fabian Theis

Abstract:

The conventional method used in the construction industry has often resulted in significant rework, since most decisions are taken on site under the pressure of project deadlines and with improper information flow, which results in ineffective coordination. However, today’s architecture, engineering, and construction (AEC) stakeholders demand faster and more accurate deliverables, efficient buildings, and smart processes, which turns out to be a tall order. Hence, the building information modelling (BIM) concept was developed as a solution to fulfill the above-mentioned necessities. Even though BIM has been successfully implemented in much of the world, it is still in its early stages in Germany, since stakeholders are sceptical of its reliability and efficiency. Due to the huge capital requirement, small and medium-sized construction companies are still reluctant to implement the BIM workflow in their projects. The purpose of this paper is to analyse the opportunities and obstacles of implementing BIM for prefabrication. Among all the advantages of BIM, prefabrication is chosen for this paper because it plays a vital role in the time and cost factors of a construction project. The positive impact of prefabrication can be explicitly observed by the project stakeholders and participants, which helps break through the scepticism among small-scale construction companies. The analysis consists of the development of a process workflow for implementing prefabrication in building construction, followed by a practical approach executed in two case studies. The first case study represents on-site prefabrication, and the second off-site prefabrication. It was planned in such a way that the first case study gives first-hand experience on the BIM model for the workers at the site, so that they can make much use of the created BIM model, which is a better representation than the traditional 2D plan. The main aim of the first case study is to build confidence in the implementation of BIM models, which was then succeeded by the execution of off-site prefabrication in the second case study. Based on the case studies, a cost and time analysis was made, and it is inferred that the implementation of BIM for prefabrication can reduce construction time and ensure minimal or no waste, better accuracy, and less problem-solving at the construction site. It was also observed that this process requires more planning time and better communication and coordination between different disciplines such as mechanical, electrical, plumbing, and architecture, which was the major obstacle to successful implementation. This paper was written from the perspective of small and medium-sized mechanical contracting companies in the private building sector in Germany.

Keywords: building information modelling, construction wastes, pre-fabrication, small and medium sized company

Procedia PDF Downloads 84
409 Detection of Patient Roll-Over Using High-Sensitivity Pressure Sensors

Authors: Keita Nishio, Takashi Kaburagi, Yosuke Kurihara

Abstract:

Recent advances in medical technology have served to enhance average life expectancy; however, the total time for which patients are prescribed complete bed rest has also increased. Since maintaining a constant lying posture can lead to pressure ulcers (bedsores), the development of a system to detect patient roll-over becomes imperative. For this purpose, extant studies have proposed the use of cameras, and favorable results have been reported; continuous on-camera monitoring, however, tends to violate patient privacy. We have previously proposed an unconstrained bio-signal measurement system that can detect body motion during sleep without violating the patient’s privacy. In this study, we therefore propose a roll-over detection method based on the data obtained from that bio-signal measurement system. Signals recorded by the sensor were assumed to comprise respiration, pulse, body-motion, and noise components. Compared with the respiration and pulse components, the body-motion component during roll-over generates large vibrations; thus, analysis of the body-motion component facilitates detection of a roll-over tendency. The large vibration associated with roll-over motion has a great effect on the root mean square (RMS) value of the body-motion component, calculated over short 10 s segments. The RMS value of each segment was compared to a threshold set in advance; if the RMS value in any segment exceeded the threshold, the corresponding data were considered to indicate the occurrence of a roll-over. In order to validate the proposed method, we conducted an experiment. A bi-directional microphone was adopted as a high-sensitivity pressure sensor and placed between the mattress and the bed frame. Recorded signals passed through an analog band-pass filter (BPF) operating over the 0.16-16 Hz bandwidth, which allowed the respiration, pulse, and body-motion components to pass whilst removing the noise component. The output from the BPF was A/D converted at a sampling frequency of 100 Hz, and the measurement time was 480 seconds. Five subjects provided ten recordings in total. Subjects lay on a mattress in the supine position and, upon the investigator's instruction, rolled over into four different positions: supine to left lateral, left lateral to prone, prone to right lateral, and right lateral to supine. Recorded data were divided into 48 segments at 10 s intervals, and the corresponding RMS value for each segment was calculated. The system was evaluated by the agreement between the investigator’s instructions and the detected segments, and an accuracy of 100% was achieved. While reviewing the time series of recorded data, segments indicating roll-over tendencies were observed to demonstrate a large amplitude; however, clear differences between decubitus and roll-over motion could not be confirmed. Extant research has a disadvantage in terms of patient privacy; the proposed study, by contrast, demonstrates precise detection of patient roll-over tendencies without violating privacy. As a future prospect, decubitus estimation before and after roll-over could be attempted. Since clear differences between decubitus and roll-over motion could not be confirmed in this paper, future studies could be based on utilization of the respiration and pulse components.
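
A minimal sketch of the detection pipeline described above; a digital Butterworth filter stands in for the paper's analog BPF, and the threshold value and synthetic test signal are assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 100                      # sampling frequency (Hz), as in the paper
SEG = 10 * FS                 # 10 s segments

def detect_rollover(signal, threshold):
    """Band-pass the pressure signal (0.16-16 Hz), then flag 10 s segments
    whose RMS exceeds a preset threshold, as described in the abstract."""
    b, a = butter(2, [0.16, 16], btype="bandpass", fs=FS)
    x = filtfilt(b, a, signal)
    n = len(x) // SEG
    rms = np.sqrt(np.mean(x[: n * SEG].reshape(n, SEG) ** 2, axis=1))
    return np.flatnonzero(rms > threshold)    # indices of roll-over segments

# Synthetic 480 s recording: low-level noise plus one large "roll-over" burst
rng = np.random.default_rng(0)
sig = rng.normal(scale=0.1, size=480 * FS)
sig[200 * FS : 203 * FS] += rng.normal(scale=2.0, size=3 * FS)
print(detect_rollover(sig, threshold=0.2))    # -> segment 20 (200-210 s)
```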

Keywords: bedsore, high-sensitivity pressure sensor, roll-over, unconstrained bio-signal measurement

Procedia PDF Downloads 94
408 Ethical Artificial Intelligence: An Exploratory Study of Guidelines

Authors: Ahmad Haidar

Abstract:

The rapid adoption of artificial intelligence (AI) technology holds unforeseen risks like privacy violation, unemployment, and algorithmic bias, triggering research institutions, governments, and companies to develop principles of AI ethics. The extensive and diverse literature on AI lacks an analysis of the evolution of the principles developed in recent years. There are two fundamental purposes of this paper. The first is to provide insights into how the principles of AI ethics have changed recently, including concepts like risk management and public participation; in doing so, a NOISE (Needs, Opportunities, Improvements, Strengths, & Exceptions) analysis is presented. The second is to offer a framework for building ethical AI linked to sustainability. This research adopts an explorative approach, more specifically an inductive approach, to address the theoretical gap. Consequently, this paper tracks the different efforts toward “trustworthy AI” and “ethical AI”, compiling a list of 12 documents released from 2017 to 2022. The analysis of this list unifies the different approaches toward trustworthy AI in two steps: first, splitting the principles into two categories, technical and net benefit, and second, testing the frequency of each principle, yielding the technical principles that may be useful for stakeholders considering the lifecycle of AI, or what is known as sustainable AI. Sustainable AI is the third wave of AI ethics and a movement to drive change throughout the entire lifecycle of AI products (i.e., idea generation, training, re-tuning, implementation, and governance) in the direction of greater ecological integrity and social fairness. In this vein, the results suggest transparency, privacy, fairness, safety, autonomy, and accountability as recommended technical principles to include in the lifecycle of AI. Another contribution is to capture the different bases that aid the process of AI for sustainability (e.g., towards the sustainable development goals); the results indicate data governance, do no harm, human well-being, and risk management as crucial AI-for-sustainability principles. This study’s last contribution clarifies how the principles evolved. To illustrate, in 2018, the Montreal Declaration mentioned principles such as well-being, autonomy, privacy, solidarity, democratic participation, equity, and diversity; in 2021, notions emerged from the European Commission proposal, including public trust, public participation, scientific integrity, risk assessment, flexibility, benefit and cost, and interagency coordination. The study design strengthens the validity of previous studies, and we advance knowledge in trustworthy AI by considering recent documents, linking principles with sustainable AI and AI for sustainability, and shedding light on the evolution of guidelines over time.

Keywords: artificial intelligence, AI for sustainability, declarations, framework, regulations, risks, sustainable AI

Procedia PDF Downloads 65
407 Humanising Hospital Retrofitting: The Case Study of Malaysia Public Hospitals

Authors: Nur Faridatull Syafinaz Ahmad Tajudin

Abstract:

A hospital is a setting where individuals who are ill or injured are treated and cared for by doctors and nurses; sanatoriums are settings where people can receive treatment and rest, particularly when recovering from a protracted illness. Hospitals are reportedly designed primarily to meet the needs of medical personnel, by maximising their functionality and workflow, and frequently do a poor job of determining patients' physical and emotional requirements and expectations. The literature on hospital design has recently focused more on the seeming need to "humanise" medical facilities. Despite the popularity of this design objective, "humanising" a space has hardly ever been defined or critically examined, and the term "humanistic design" has covered a broad range of design elements and designer interpretations. In reality, the hospital's layout and design may have a massive effect on how patients feel, what they experience, and how they heal.

Keywords: hospital retrofitting, hospital design, humanising hospital, spatial design

Procedia PDF Downloads 78
406 Unified Theory of Acceptance and Use of Technology in Evaluating Voters' Intention Towards the Adoption of Electronic Forensic Election Audit System

Authors: Sijuade A. A., Oguntoye J. P., Awodoye O. O., Adedapo O. A., Wahab W. B., Okediran O. O., Omidiora E. O., Olabiyisi S. O.

Abstract:

Electronic voting systems have been introduced to improve the efficiency, accuracy, and transparency of the election process in many countries around the world, including Nigeria. However, concerns have been raised about the security and integrity of these systems, and one way to address them is through the implementation of electronic forensic election audit systems. This study evaluates voters' intention to adopt electronic forensic election audit systems using the Unified Theory of Acceptance and Use of Technology (UTAUT). UTAUT is a widely used model in the field of information systems for explaining the factors that influence individuals' intention to use a technology; in this study, an extended model was proposed that relates performance expectancy, effort expectancy, social influence, facilitating conditions, a cost factor, and a privacy factor to voters’ behavioural intention. A total of 294 responses were collected from a selected population of electorates who had participated in at least one electioneering process in Nigeria. The data were then analyzed statistically using partial least squares structural equation modeling (PLS-SEM). The results show that all variables have a significant effect on the electorates’ behavioural intention to adopt the development and implementation of an electronic forensic election audit system in Nigeria.

Keywords: election audit, voters, UTAUT, performance expectancy, effort expectancy, social influence, facilitating conditions, cost factor, privacy factor, behavioural intention

Procedia PDF Downloads 41
405 ESRA: An End-to-End System for Re-identification and Anonymization of Swiss Court Decisions

Authors: Joel Niklaus, Matthias Sturmer

Abstract:

The publication of judicial proceedings is a cornerstone of many democracies, as it enables the court system to be held accountable by ensuring that justice is done in accordance with the law. Equally important is privacy, a fundamental human right (Article 12 of the Universal Declaration of Human Rights). It is therefore important that the parties (especially minors, victims, or witnesses) involved in court decisions be anonymized securely. Today, the anonymization of court decisions in Switzerland is performed either manually or semi-automatically using primitive software. While much research has been conducted on anonymization for tabular data, the literature on anonymization for unstructured text documents is thin and virtually non-existent for court decisions. In 2019, it was shown that manual anonymization is not secure enough: in 21 of 25 attempted Swiss federal court decisions related to pharmaceutical companies, the pharmaceuticals and legal parties involved could be manually re-identified by linking the decisions with external databases using regular expressions. An automated re-identification system serves as an automated test of the safety of existing anonymizations and thus promotes the right to privacy. Manual anonymization is also very expensive (recurring annual costs of over CHF 20M in Switzerland alone, according to one estimate); consequently, many Swiss courts publish only a fraction of their decisions, and an automated anonymization system would reduce these costs substantially, creating capacity for much more comprehensive publication. For the re-identification system, topic modeling with latent Dirichlet allocation is used to cluster over 500K Swiss court decisions into meaningful related categories. A comprehensive knowledge base of publicly available data (such as social media, newspapers, government documents, geographical information systems, business registers, online address books, obituary portals, web archives, etc.) is constructed to serve as an information hub for re-identification. For the actual re-identification, a general-purpose language model is fine-tuned on the respective part of the knowledge base for each category of court decisions separately; the input to the model is the court decision to be re-identified, and the output is a probability distribution over named entities constituting possible re-identifications. For the anonymization system, named entity recognition (NER) is used to recognize the tokens that need to be anonymized. Since the focus lies on Swiss court decisions in German, a corpus of Swiss legal texts will be built for training the NER model. The recognized named entities are replaced by the category determined by the NER model plus an identifier, to preserve context. This work is part of an ongoing research project conducted by an interdisciplinary research consortium; both a legal analysis and the implementation of the proposed system design, ESRA, will be performed within the next three years. This study introduces the system design of ESRA, an end-to-end system for re-identification and anonymization of Swiss court decisions. Firstly, the re-identification system tests the safety of existing anonymizations and thus promotes privacy. Secondly, the anonymization system substantially reduces the costs of manual anonymization of court decisions and thus enables a more comprehensive publication practice.
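
A minimal sketch of the anonymization step, using an off-the-shelf spaCy German model as a stand-in for ESRA's legal-domain NER model fine-tuned on Swiss court texts; the example sentence is invented and the emitted entity labels depend on the model:

```python
import spacy

# Off-the-shelf German pipeline; ESRA would use a NER model trained on a
# corpus of Swiss legal texts instead.
nlp = spacy.load("de_core_news_sm")

def anonymize(text):
    """Replace each recognized entity with its category plus a stable
    identifier, preserving context as described for ESRA."""
    doc = nlp(text)
    ids, out, last = {}, [], 0
    for ent in doc.ents:
        ids.setdefault((ent.label_, ent.text), len(ids) + 1)
        out.append(text[last:ent.start_char])
        out.append(f"<{ent.label_}_{ids[(ent.label_, ent.text)]}>")
        last = ent.end_char
    out.append(text[last:])
    return "".join(out)

print(anonymize("Hans Muster aus Bern klagte gegen die Beispiel AG."))
# e.g. -> "<PER_1> aus <LOC_2> klagte gegen die <ORG_3>."
```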

Keywords: artificial intelligence, courts, legal tech, named entity recognition, natural language processing, privacy, topic modeling

Procedia PDF Downloads 108
404 Computer-Aided Depression Screening: A Literature Review on Optimal Methodologies for Mental Health Screening

Authors: Michelle Nighswander

Abstract:

Suicide can be a tragic response to mental illness, and it is difficult for people to disclose or discuss suicidal impulses. The stigma surrounding mental health can create a reluctance to seek help for mental illness. Patients may feel pressure to exhibit a socially desirable demeanor rather than reveal these issues, especially if they sense their healthcare provider is pressed for time or if they do not have an extensive history with that provider. Overcoming these barriers can be challenging. Although there are several validated depression and suicide risk instruments, the varying processes used to administer these tools may impact the truthfulness of the responses. A literature review was conducted to find evidence of the impact of the environment on the accuracy of depression screening. Many investigations do not describe the environment, and fewer still use a comparison design. However, three studies demonstrated that computerized self-reporting may be more likely to elicit truthful and accurate responses than a face-to-face interview, due to the increased privacy when responding; these studies also showed that patients reported positive reactions to computerized screening for other stigmatizing health conditions, such as alcohol use during pregnancy. Computerized self-screening for depression offers the possibility of more privacy and patient reflection, which could then send a targeted message of risk to the healthcare provider. This could potentially increase accuracy while also increasing time efficiency for the clinic. Considering the persistent effects of mental health stigma, how these screening questions are posed can impact patients’ responses. This literature review analyzes trends in depression screening methodologies, the impact of setting on the results, and how this may assist in overcoming one barrier caused by stigma.

Keywords: computerized self-report, depression, mental health stigma, suicide risk

Procedia PDF Downloads 103
403 A Case Study on the Impact of Technology Readiness in a Department of Clinical Nurses

Authors: Julie Delany

Abstract:

To thrive in today’s digital climate, it is vital that organisations adopt new technology and prepare for rising digital trends. This proves more difficult in government where, traditionally, people lack change readiness. While individuals may have a desire to work smarter, this does not necessarily mean embracing technology. This paper discusses the rollout of an application into a small department of highly experienced nurses. The goal was to both streamline the department's workflow and provide a platform for gathering essential business metrics. The biggest challenges were adoption and motivating the nurses to change their routines and learn new computer skills. Two-thirds struggled with the change, and as a result, some jeopardised the validity of the business metrics. In conclusion, there are lessons learned and recommendations for similar projects.

Keywords: change ready, information technology, end-user, iterative method, rollout plan, data analytics

Procedia PDF Downloads 104
402 Smart City Solutions for Enhancing the Cultural and Historic Value of Urban Heritage Sites

Authors: Farnoosh Faal

Abstract:

The trend among smart cities is to incorporate technological advancements to better manage and protect their cultural heritage sites. This study investigates how smart city solutions can improve the cultural and historical significance of urban heritage sites and assesses present practices and potential for the future. The paper delves into the literature to examine how smart city technologies can be utilized to increase knowledge and respect for cultural heritage, as well as promote sustainable tourism and economic growth. The article reviews various instances of smart city initiatives across different regions of the world, pinpointing innovative tactics and best practices in improving the cultural and historical worth of urban heritage sites. Additionally, it analyzes the difficulties and limitations associated with implementing these solutions, including community involvement, privacy concerns, and data management issues. The conclusions drawn from this paper propose that smart city solutions offer a substantial opportunity to augment the cultural and historical value of urban heritage sites. By effectively integrating technology into heritage management, there can be greater comprehension and admiration for cultural heritage, enhanced visitor experience, and support for sustainable tourism. However, to fully exploit the potential of smart city solutions in this context, it is crucial to prioritize community engagement and participation, as well as ensure that data management practices are transparent, responsible, and respectful of privacy. In summary, this paper offers guidance and advice to policymakers, urban planners, and heritage management professionals who want to increase the cultural and historical significance of urban heritage sites through the application of smart city solutions. It emphasizes the significance of creating comprehensive and cooperative strategies, as well as ensuring that efforts to preserve heritage are sustainable, fair, and efficient.

Keywords: smart city, urban heritage, sustainable tourism, heritage preservation

Procedia PDF Downloads 62
401 The Recording of Personal Data in the Spanish Criminal Justice System and Its Impact on the Right to Privacy

Authors: Deborah García-Magna

Abstract:

When a person goes through the criminal justice system, whether as a suspect, arrestee, defendant, or convict, certain personal data are recorded, and a wide range of persons and organizations may have access to them. The recording of data can have a great impact on the daily life of the person concerned during the period of time determined by the legislation. In addition, this registered information can refer to various aspects not strictly related to the alleged or actually committed infraction. In some areas, Spanish legislation does not clearly determine the cancellation period of the registers, nor what happens when they are cancelled, since some of the files are not really erased and remain recorded even if their consultation is no longer allowed or it is stated that they should not be taken into account. Thus, access to the recorded data of arrested or convicted persons may reduce their possibilities of reintegration into society. In this research, some of the areas in which data recording has a special impact on the lives of affected persons are analyzed in a critical manner, taking into account Spanish legislation and jurisprudence and the influence of the European Court of Human Rights, the Council of Europe, and other supranational instruments. In particular, the analysis covers the scope of video surveillance in public spaces, the police record, the recording of personal data for the purposes of police investigation (especially DNA and psychological profiles), the registry of administrative and minor offenses (especially as they are taken into account to impose aggravating circumstances), criminal records (of adults, minors, and legal entities), and the registration of special circumstances occurring during the execution of the sentence (files of inmates under special surveillance –FIES–, disciplinary sanctions, special therapies in prison, etc.).

Keywords: ECHR jurisprudence, formal and informal criminal control, privacy, disciplinary sanctions, social reintegration

Procedia PDF Downloads 120
400 An Exploratory Investigation into the Quality of Life of People with Multi-Drug Resistant Pulmonary Tuberculosis (MDR-PTB) Using the ICF Core Sets: A Preliminary Investigation

Authors: Shamila Manie, Soraya Maart, Ayesha Osman

Abstract:

Introduction: People diagnosed with multidrug-resistant pulmonary tuberculosis (MDR-PTB) are subjected to prolonged hospitalization in South Africa. It has thus become essential for research to move beyond a purely medical approach and include social and environmental factors when looking at the impact of the disease on those affected. Aim: To explore the factors affecting individuals with multidrug-resistant pulmonary tuberculosis during long-term hospitalization, using the comprehensive ICF core sets for obstructive pulmonary disease (OPD) and cardiopulmonary (CPR) conditions, at Brooklyn Chest Hospital (BCH). Methods: A quantitative, descriptive, cross-sectional study design was utilized. A convenience sample of 19 adults at Brooklyn Chest Hospital was interviewed. Results: Most participants reported a decrease in exercise tolerance (b455: n=11); however, this did not limit participation. Participants reported that a lack of privacy in the environment (e155) was a barrier to health. The presence of health professionals (e355) and the provision of skills development services (e585) were facilitators of health and well-being. No differences existed in the functional ability of HIV-positive and HIV-negative participants in this sample. Conclusion: The ICF core sets appeared valid in identifying the barriers and facilitators experienced by individuals with MDR-PTB admitted to BCH. The hospital environment must be improved to add to the quality of life of those admitted, especially by improving privacy within the wards. Although the social grant is seen as a facilitator, greater emphasis must be placed on preparing individuals to be economically active in the labour market when they are discharged.

Keywords: multidrug resistant tuberculosis, MDR ICF core sets, health-related quality of life (HRQoL), hospitalization

Procedia PDF Downloads 316
399 Autonomic Recovery Plan with Server Virtualization

Authors: S. Hameed, S. Anwer, M. Saad, M. Saady

Abstract:

For autonomic recovery with server virtualization, a cogent plan that combines recovery techniques and backups with virtualized servers can be developed instead of assigning an idle physical server to backup operations. In addition to reducing hardware costs and the data center footprint, such a disaster recovery plan can ensure system uptime and meet objectives for high availability, recovery time, recovery point, server provisioning, and quality of service. This autonomic solution would also support disaster management, testing, and development at the recovery site. In this research, a workflow plan is proposed for supporting disaster recovery with virtualization, providing virtual monitoring, requirements engineering, solution decision making, quality testing, and disaster management. This recovery model would make disaster recovery easier, faster, and less error-prone.
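
A minimal sketch of the monitor-and-failover loop such an autonomic plan implies is given below; the server names, the health probe, and the start_standby_vm() call are hypothetical placeholders for a real hypervisor API, not the authors' implementation.

```python
# Sketch of an autonomic recovery loop with a virtualized standby server.
# All names and calls are illustrative stand-ins for a real hypervisor API.
import time
from dataclasses import dataclass


@dataclass
class Server:
    name: str
    healthy: bool = True


def check_health(server: Server) -> bool:
    # Placeholder probe; a real monitor would ping the service or read heartbeats.
    return server.healthy


def start_standby_vm(name: str) -> Server:
    # Placeholder for a hypervisor call, e.g. cloning a template from the latest backup.
    print(f"Provisioning standby VM '{name}' from the latest backup ...")
    return Server(name=name)


def recovery_loop(primary: Server, standby_name: str, interval: float = 1.0) -> Server:
    """Monitor the primary server; on failure, fail over to a virtual standby."""
    while check_health(primary):
        time.sleep(interval)
    standby = start_standby_vm(standby_name)
    print(f"Failover complete: requests now routed to '{standby.name}'")
    return standby


if __name__ == "__main__":
    primary = Server("app-primary", healthy=False)  # simulate an outage
    recovery_loop(primary, "app-standby-vm")
```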

Keywords: autonomous intelligence, disaster recovery, cloud computing, server virtualization

Procedia PDF Downloads 135
398 A Web-Based Real Property Updating System for Efficient and Sustainable Urban Development: A Case Study in Ethiopia

Authors: Eyosiyas Aga

Abstract:

The development of information and communication technology has transformed paper-based mapping and land registration processes into computerized, networked systems. The computerization and networking of real property information systems play a vital role in good governance and the sustainable development of emerging countries through cost-effective, easy, and accessible service delivery for the customer. An efficient, transparent, and sustainable real property system is becoming basic infrastructure for urban development, improving data management and service delivery in the responsible organizations. In Ethiopia, real property administration is paper-based; as a result, it faces problems of poor data management, illegal transactions, corruption, and poor service delivery. To solve these problems and facilitate the real property market, the implementation of a web-based real property updating system is crucial. Web-based real property updating is one automation (computerization) method for facilitating data sharing and reducing the time and cost of service delivery in a real property administration system. In addition, it is useful for integrating data across different information systems and organizations. The system is designed by combining and customizing open source software supported by Open Geospatial Consortium (OGC) standards. OGC standards such as the Web Feature Service (WFS) and Web Map Service (WMS) are the most widely used standards for supporting and improving web-based real property updating: they allow the integration of data from different sources and can be used to maintain consistency of data throughout transactions. PostgreSQL and GeoServer are used to manage the real property data and connect it to the flex viewer and user interface. The system is designed for both an internal updating workflow (the municipality), which mainly updates spatial and textual information, and an external workflow (the customer), which focuses on serving and interacting with the customer. This research assessed the potential of open source web applications and adopted this technology for a real property updating system in Ethiopia in a simple, cost-effective, and secure way. The existing workflow for real property updating was analyzed to identify bottlenecks, and a new workflow was designed for the system. Requirements were identified through questionnaires and a literature review, and the system was prototyped for the study area. The research mainly aimed to integrate human resources with technology in the design of the system to reduce data inconsistency and security problems. In addition, the research reflects on the current situation of real property administration and the contribution of an effective data management system to efficient, transparent, and sustainable urban development in Ethiopia.
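
As an illustration of how a client of such a system might retrieve parcel features through the OGC Web Feature Service standard named above, here is a minimal sketch; the GeoServer URL and the cadastre:parcels layer name are hypothetical placeholders for the municipality's actual deployment, not details taken from the study.

```python
# Sketch of a WFS GetFeature request against a GeoServer instance.
# Endpoint and layer name are assumed placeholders; the query parameters
# themselves are standard OGC WFS 1.1.0 parameters.
import requests

GEOSERVER_WFS = "http://localhost:8080/geoserver/wfs"  # assumed local GeoServer

params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "cadastre:parcels",        # hypothetical workspace:layer
    "outputFormat": "application/json",    # GeoJSON output
    "maxFeatures": 10,
}

resp = requests.get(GEOSERVER_WFS, params=params, timeout=30)
resp.raise_for_status()

for feature in resp.json()["features"]:
    # Each feature carries the parcel geometry plus its textual attributes,
    # which an updating workflow would edit via WFS-T (transactions).
    print(feature["id"], feature["properties"])
```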

Keywords: cadaster, real property, sustainable, transparency, web feature service, web map service

Procedia PDF Downloads 240
397 Insurance of Agricultural Activities as the Basis for Food Security

Authors: J. B. Akshataeva, G. T. Aigarinova, A. Amankulova, D. S. Kalkanova

Abstract:

This article examines some aspects of the insurance of agricultural activities and the strategic documents on deepening investment opportunities. The development of the insurance market is a task facing both society and the state. The article also examines problems of agricultural insurance development in the market economy of Kazakhstan as the basis for food security.

Keywords: agriculture, food safety, insurance, privacy issues

Procedia PDF Downloads 477
396 Multi-Level Priority Based Task Scheduling Algorithm for Workflows in Cloud Environment

Authors: Anju Bala, Inderveer Chana

Abstract:

Task scheduling is the key concern for the execution of performance-driven workflow applications. Because efficient scheduling can have a major impact on system performance, task scheduling is often used to assign requests to resources efficiently based on cloud resource characteristics. In this paper, a priority-based task scheduling algorithm is proposed that prioritizes tasks based on the length of their instructions. The proposed scheduling approach prioritizes the tasks of cloud applications according to limits set by six sigma control charts based on dynamic threshold values. Further, the proposed algorithm has been validated through the CloudSim toolkit. The experimental results demonstrate that the proposed algorithm is effective for handling multiple task lists from workflows and considerably reduces makespan and execution time.
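
A minimal sketch of one plausible reading of the multi-level priority assignment follows: priorities derive from task instruction lengths, with level boundaries set control-chart-style at one standard deviation around the mean. The three-level split and the one-sigma limits are illustrative assumptions, not the authors' exact six sigma scheme.

```python
# Sketch of multi-level priority assignment from task instruction lengths,
# using control-chart-style bands (mean +/- one sigma) as level boundaries.
# The band width and the three-level split are illustrative assumptions.
from statistics import mean, stdev


def assign_priorities(task_lengths):
    """Map each task length (e.g., millions of instructions) to a priority level."""
    mu, sigma = mean(task_lengths), stdev(task_lengths)
    levels = []
    for length in task_lengths:
        if length > mu + sigma:        # above the upper control limit
            levels.append("high")      # long tasks scheduled first to cut makespan
        elif length < mu - sigma:      # below the lower control limit
            levels.append("low")
        else:
            levels.append("medium")    # within the control band
    return levels


tasks = [1200, 300, 950, 2400, 150, 800]   # hypothetical workflow task lengths
print(list(zip(tasks, assign_priorities(tasks))))
```

In a scheduler built on this idea, the thresholds would be recomputed as new task lists arrive, which is one way the "dynamic threshold values" mentioned in the abstract could be realized.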

Keywords: cloud computing, priority based scheduling, task scheduling, VM allocation

Procedia PDF Downloads 485