Search results for: big data ecosystem
24343 Data Security and Privacy Challenges in Cloud Computing
Authors: Amir Rashid
Abstract:
Cloud computing frameworks enable organizations to cut expenses by outsourcing computation resources on demand. At present, customers of cloud service providers have no means of verifying the confidentiality and integrity of their data. To address this issue, we propose a Trusted Cloud Computing Platform (TCCP). TCCP enables Infrastructure as a Service (IaaS) providers, such as Amazon EC2, to provide a closed-box execution environment that guarantees confidential execution of guest virtual machines. It also allows customers to attest to the IaaS provider and determine whether the service is secure before they launch their virtual machines. This paper proposes the TCCP for guaranteeing the confidentiality and integrity of data outsourced to IaaS providers. The TCCP provides the abstraction of a closed-box execution environment for a customer's VM, guaranteeing that no privileged administrator of the cloud provider can inspect or tamper with its data. Furthermore, before launching the VM, the TCCP allows a customer to reliably and remotely verify that the backend provider is running a trusted TCCP. This capability extends attestation to the whole service and thus allows a customer to confirm that its data are handled in a secure mode.
Keywords: cloud security, IaaS, cloud data privacy and integrity, hybrid cloud
Procedia PDF Downloads 297
24342 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record
Authors: Raghavi C. Janaswamy
Abstract:
In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structural restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions have been predicted as a node classification task using graph-based open-source EHR data, the Synthea database, stored in Tigergraph. The Synthea dataset is leveraged because it closely represents real-world data and is voluminous. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to get nodes and edges from the Tigergraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the graph neural network (GNN), adopting self-supervised learning techniques with autoencoders to generate the node embeddings and eventually perform the node classification using those embeddings. The model predicts patient conditions ranging from common to rare situations. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
Keywords: electronic health record, graph neural network, heterogeneous data, prediction
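The pipeline sketched in the abstract above (graph extraction, tensorization with PyTorch, node classification with PyTorch Geometric) can be illustrated with a minimal example. The graph tensors, feature dimensions, and the two-layer GCN below are illustrative assumptions, not the authors' actual model or the Synthea schema; in practice the nodes and edges would come from TigerGraph via pyTigerGraph.

```python
# Minimal sketch: GNN node classification on an EHR-style graph with PyTorch Geometric.
# The features, edges, and labels here stand in for data that would normally be pulled
# from TigerGraph via pyTigerGraph; all sizes and the 2-layer GCN are assumptions.
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

num_nodes, num_features, num_conditions = 100, 16, 5        # hypothetical sizes
x = torch.randn(num_nodes, num_features)                    # node features (e.g., patient/encounter embeddings)
edge_index = torch.randint(0, num_nodes, (2, 400))          # placeholder patient-diagnosis-style edges
y = torch.randint(0, num_conditions, (num_nodes,))          # condition labels for node classification
data = Data(x=x, edge_index=edge_index, y=y)

class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(num_features, 32)
        self.conv2 = GCNConv(32, num_conditions)

    def forward(self, data):
        h = F.relu(self.conv1(data.x, data.edge_index))
        return self.conv2(h, data.edge_index)

model = GCN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
for epoch in range(50):                                     # short illustrative training loop
    optimizer.zero_grad()
    loss = F.cross_entropy(model(data), data.y)
    loss.backward()
    optimizer.step()
print("predicted conditions:", model(data).argmax(dim=1)[:10])
```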
Procedia PDF Downloads 85
24341 A Proposal to Tackle Security Challenges of Distributed Systems in the Healthcare Sector
Authors: Ang Chia Hong, Julian Khoo Xubin, Burra Venkata Durga Kumar
Abstract:
Distributed systems offer many benefits to the healthcare industry. From big data analysis to business intelligence, the increased computational power and efficiency of distributed systems serve as an invaluable resource for the healthcare sector to utilize. However, as the usage of these distributed systems increases, many issues arise. The main focus of this paper is on security issues. Many security issues stem from distributed systems in the healthcare industry, particularly information security. Personal data are especially sensitive in the healthcare industry. If important information is leaked (e.g., IC number, credit card number, address), a person's identity, financial status, and safety might be compromised. This results in the responsible organization losing a lot of money in compensating these people, and even more resources are expended trying to fix the fault. Therefore, a framework for a blockchain-based healthcare data management system is proposed. In this framework, the use of a blockchain network is explored to store the encryption key of the patient's data. The actual data are encrypted, and the encrypted form, called ciphertext, is stored on a cloud storage platform. Furthermore, some issues have to be emphasized and tackled for future improvements, such as proposing a multi-user scheme, addressing authentication issues, or migrating the backend processes into the blockchain network. Due to the nature of blockchain technology, the data will be tamper-proof, and its read-only function can only be accessed by authorized users such as doctors and nurses. This guarantees the confidentiality and immutability of the patient's data.
Keywords: distributed, healthcare, efficiency, security, blockchain, confidentiality and immutability
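A minimal sketch of the storage pattern described above — ciphertext kept off-chain while a key reference is recorded on an append-only ledger — is given below. The in-memory "cloud store" and hash-chained list are simplified stand-ins for a real cloud platform and blockchain network; the Fernet cipher and all names are illustrative assumptions, not the framework's concrete design.

```python
# Sketch of the described pattern: encrypt the patient record, store the ciphertext in
# "cloud" storage, and record a tamper-evident, hash-chained entry referencing the key.
# A Python list stands in for the blockchain and a dict for cloud storage (assumptions).
import hashlib, json, time
from cryptography.fernet import Fernet   # pip install cryptography

cloud_storage = {}      # placeholder for a cloud object store
ledger = []             # placeholder for a blockchain of key-reference blocks

def store_record(patient_id: str, record: dict) -> str:
    key = Fernet.generate_key()
    ciphertext = Fernet(key).encrypt(json.dumps(record).encode())
    cloud_storage[patient_id] = ciphertext                      # off-chain ciphertext
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    block = {"patient_id": patient_id,
             "key": key.decode(),                               # in practice the key itself would be protected
             "timestamp": time.time(),
             "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    ledger.append(block)
    return block["hash"]

def read_record(patient_id: str) -> dict:
    block = next(b for b in ledger if b["patient_id"] == patient_id)
    plaintext = Fernet(block["key"].encode()).decrypt(cloud_storage[patient_id])
    return json.loads(plaintext)

store_record("P-001", {"diagnosis": "hypertension", "allergy": "penicillin"})
print(read_record("P-001"))
```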
Procedia PDF Downloads 183
24340 Design and Implementation of a Geodatabase and WebGIS
Authors: Sajid Ali, Dietrich Schröder
Abstract:
The merging of the internet and the Web has created many disciplines, and Web GIS is one of these disciplines, dealing with geospatial data in a proficient way. Web GIS technologies have provided easy access to, and sharing of, geospatial data over the internet. However, the European Caribbean Association (Europaische Karibische Gesselschaft - EKG) lacks a single platform for easy, multi-user access to its data to assist its members and the wider research community. The technique presented in this paper deals with the design of a geodatabase using PostgreSQL/PostGIS as an object-relational database management system (ORDBMS) for competent dissemination and management of spatial data, and a Web GIS built with OpenGeo Suite for fast sharing and distribution of the data over the internet. The characteristics of the required geodatabase design have been studied, and a specific methodology is given for designing the Web GIS. Finally, validation of this web-based geodatabase was performed with two desktop GIS packages and a web map application, and it is discussed that the contribution has all the desired modules to expedite further research in the area as per the requirements.
Keywords: desktop GIS software, European Caribbean association, geodatabase, OpenGeo suite, PostgreSQL/PostGIS, WebGIS, web map application
Procedia PDF Downloads 338
24339 Integration of “FAIR” Data Principles in Longitudinal Mental Health Research in Africa: Lessons from a Landscape Analysis
Authors: Bylhah Mugotitsa, Jim Todd, Agnes Kiragga, Jay Greenfield, Evans Omondi, Lukoye Atwoli, Reinpeter Momanyi
Abstract:
The INSPIRE network aims to build an open, ethical, sustainable, and FAIR (Findable, Accessible, Interoperable, Reusable) data science platform, particularly for longitudinal mental health (MH) data. While studies have been done at the clinical and population level, there still exists limitations in data and research in LMICs, which pose a risk of underrepresentation of mental disorders. It is vital to examine the existing longitudinal MH data, focusing on how FAIR datasets are. This landscape analysis aimed to provide both overall level of evidence of availability of longitudinal datasets and degree of consistency in longitudinal studies conducted. Utilizing prompters proved instrumental in streamlining the analysis process, facilitating access, crafting code snippets, categorization, and analysis of extensive data repositories related to depression, anxiety, and psychosis in Africa. While leveraging artificial intelligence (AI), we filtered through over 18,000 scientific papers spanning from 1970 to 2023. This AI-driven approach enabled the identification of 228 longitudinal research papers meeting inclusion criteria. Quality assurance revealed 10% incorrectly identified articles and 2 duplicates, underscoring the prevalence of longitudinal MH research in South Africa, focusing on depression. From the analysis, evaluating data and metadata adherence to FAIR principles remains crucial for enhancing accessibility and quality of MH research in Africa. While AI has the potential to enhance research processes, challenges such as privacy concerns and data security risks must be addressed. Ethical and equity considerations in data sharing and reuse are also vital. There’s need for collaborative efforts across disciplinary and national boundaries to improve the Findability and Accessibility of data. Current efforts should also focus on creating integrated data resources and tools to improve Interoperability and Reusability of MH data. Practical steps for researchers include careful study planning, data preservation, machine-actionable metadata, and promoting data reuse to advance science and improve equity. Metrics and recognition should be established to incentivize adherence to FAIR principles in MH researchKeywords: longitudinal mental health research, data sharing, fair data principles, Africa, landscape analysis
Procedia PDF Downloads 88
24338 Nitrification Efficiency and Community Structure of Municipal Activated Sewage Sludge
Authors: Oluyemi O. Awolusi, Abimbola M. Enitan, Sheena Kumari, Faizal Bux
Abstract:
Nitrification is essential to biological processes designed to remove ammonia and/or total nitrogen. It removes excess nitrogenous compounds in wastewater, which could be very toxic to aquatic fauna or cause a serious imbalance in the aquatic ecosystem. Efficient nitrification is linked to an in-depth knowledge of the structure and dynamics of the nitrifying community within wastewater treatment systems. In this study, molecular techniques were employed to characterize the microbial structure of activated sludge [ammonia-oxidizing bacteria (AOB) and nitrite-oxidizing bacteria (NOB)] in a municipal wastewater treatment plant, with the intention of linking it to plant efficiency. PCR-based phylogenetic analysis was also carried out. The average operating and environmental parameters, as well as the specific nitrification rate of the plant, were investigated during the study. During the investigation, the average temperature was 23±1.5°C. Other operational parameters, such as mixed liquor suspended solids and chemical oxygen demand, inversely correlated with ammonia removal. The dissolved oxygen level in the plant was consistently lower than the optimum (between 0.24 and 1.267 mg/l) during this study. The plant was treating wastewater with influent ammonia concentrations of 31.69 and 24.47 mg/l. The influent flow rate was 96.81 ML/day during the period. The dominant nitrifiers included Nitrosomonas spp., Nitrobacter spp., and Nitrospira spp. The AOB correlated with nitrification efficiency and temperature. This study shows that the specific ammonia oxidation rate and the specific nitrate formation rate can serve as good indicators of the plant's overall nitrification performance.
Keywords: ammonia monooxygenase α-subunit gene, amoA, ammonia-oxidizing bacteria, AOB, nitrite-oxidizing bacteria, NOB, specific nitrification rate
Procedia PDF Downloads 457
24337 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study
Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos
Abstract:
This article presents a method to analyze the use of indoor spaces based on data analytics obtained from in-built digital devices. The study uses the data generated by in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights into space occupancy, user behaviour, and comfort. These devices, originally installed to facilitate remote operations, report data through the internet, which the research uses to analyze information on the real-time human use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable solution to analyze building interior spaces without incorporating external data collection systems such as dedicated sensors. The methodology is applied to a real case study of coliving: a residential building of 3000 m², 7 floors, and 80 users in the centre of Madrid. The case study applies the method to classify the IoT devices and to assess, clean, and analyze the collected data based on the analysis framework. The information is collected remotely through the devices' different platforms; the first step is to curate the data and understand what insights each device can provide according to the objectives of the study. This generates an analysis framework that can be scaled for future building assessment, even beyond the residential sector. The method adjusts the parameters to be analyzed to the dataset available in the IoT network of each building. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.
Keywords: in-place devices, IoT, human-centred data-analytics, spatial design
Procedia PDF Downloads 196
24336 Nursery Treatments May Improve Restoration Outcomes by Reducing Seedling Transplant Shock
Authors: Douglas E. Mainhart, Alejandro Fierro-Cabo, Bradley Christoffersen, Charlotte Reemts
Abstract:
Semi-arid ecosystems across the globe have faced land conversion for agriculture and resource extraction activities, posing a threat to the important ecosystem services they provide. Revegetation-centered restoration efforts in these regions face low success rates due to limited soil water availability and high temperatures, leading to elevated seedling mortality after planting. Typical methods to alleviate these stresses require costly post-planting interventions aimed at improving soil moisture status. We set out to evaluate the efficacy of applying in-nursery treatments to address transplant shock. Four native Tamaulipan thornscrub species were compared. Three treatments were applied: elevated CO2 and drought hardening (four-week exposure each) and an antitranspirant foliar spray (the day prior to planting). Our goal was to answer two primary questions: (1) Do treatments improve survival and growth of seedlings in the early period post-planting? (2) If so, what underlying physiological changes are associated with this improved performance? To this end, we measured leaf gas exchange (stomatal conductance, light-saturated photosynthetic rate, water use efficiency), leaf morphology (specific leaf area), and osmolality before and upon the conclusion of treatments. A subset of seedlings from all treatments has been planted and will be monitored in the coming months for in-field survival and growth. First-month field survival was high for all treatment groups (>85%) due to ample rainfall following planting. Growth data were unreliable due to high herbivory (68% of all sampled plants). While elevated CO2 had infrequent or no detectable influence on all aspects of leaf gas exchange, drought hardening reduced stomatal conductance in three of the four species measured without negatively impacting photosynthesis. Both CO2 and drought hardening elevated leaf osmolality in two species. Antitranspirant application significantly reduced conductance in all species for up to four days and reduced photosynthesis in two species. Antitranspirants also increased the variability of water use efficiency compared to controls. Collectively, these results suggest that antitranspirants and drought hardening are viable treatments for reducing short-term water loss during the transplant shock period. Elevated CO2, while not effective at reducing water loss, may be useful for promoting more favorable water status via osmotic adjustment. These practices could improve restoration outcomes in Tamaulipan thornscrub and other semi-arid systems. Further research should focus on evaluating combinations of these treatments and their species-specific viability.
Keywords: conservation, drought conditioning, semi-arid restoration, plant physiology
Procedia PDF Downloads 85
24335 A Unique Multi-Class Support Vector Machine Algorithm Using MapReduce
Authors: Aditi Viswanathan, Shree Ranjani, Aruna Govada
Abstract:
With data sizes constantly expanding, and with classical machine learning algorithms that analyze such data requiring larger and larger amounts of computation time and storage space, the need to distribute computation and memory requirements among several computers has become apparent. Although substantial work has been done in developing distributed binary SVM algorithms and multi-class SVM algorithms individually, the field of multi-class distributed SVMs remains largely unexplored. This research seeks to develop an algorithm that implements the Support Vector Machine over a multi-class data set and is efficient in a distributed environment. For this, we recursively choose the best binary split of a set of classes using a greedy technique, much like a divide-and-conquer approach. Our algorithm has shown better computation time during the testing phase than the traditional sequential SVM methods (One vs. One, One vs. Rest) and outperforms them as the size of the data set grows. This approach also classifies the data with higher accuracy than the traditional multi-class algorithms.
Keywords: distributed algorithm, MapReduce, multi-class, support vector machine
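The recursive class-splitting idea can be sketched (without the MapReduce distribution layer) as below. The greedy split criterion here — separating the two most distant class centroids — is a naive heuristic chosen only for illustration and is not necessarily the criterion used by the authors.

```python
# Sketch of a recursive binary-split multi-class SVM (divide-and-conquer over classes).
# The centroid-distance split heuristic and the use of scikit-learn's SVC are
# illustrative assumptions; the MapReduce layer described in the abstract is omitted.
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import load_iris

def build_tree(X, y, classes):
    if len(classes) == 1:
        return classes[0]                                   # leaf: a single class remains
    centroids = {c: X[y == c].mean(axis=0) for c in classes}
    a, b = max(((i, j) for i in classes for j in classes if i < j),
               key=lambda p: np.linalg.norm(centroids[p[0]] - centroids[p[1]]))
    left = [c for c in classes
            if np.linalg.norm(centroids[c] - centroids[a]) <= np.linalg.norm(centroids[c] - centroids[b])]
    right = [c for c in classes if c not in left]
    side = np.isin(y, left).astype(int)                     # 1 = left group, 0 = right group
    clf = SVC(kernel="linear").fit(X, side)                 # binary SVM for this split
    mask = np.isin(y, left)
    return {"clf": clf,
            "left": build_tree(X[mask], y[mask], left),
            "right": build_tree(X[~mask], y[~mask], right)}

def predict_one(node, x):
    while isinstance(node, dict):
        node = node["left"] if node["clf"].predict([x])[0] == 1 else node["right"]
    return node

X, y = load_iris(return_X_y=True)
tree = build_tree(X, y, sorted(np.unique(y)))
print([predict_one(tree, x) for x in X[:5]])
```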
Procedia PDF Downloads 399
24334 Information Management Approach in the Prediction of Acute Appendicitis
Authors: Ahmad Shahin, Walid Moudani, Ali Bekraki
Abstract:
This research aims at presenting a predictive data mining model to handle an accurate diagnosis of acute appendicitis with patients for the purpose of maximizing the health service quality, minimizing morbidity/mortality, and reducing cost. However, acute appendicitis is the most common disease which requires timely accurate diagnosis and needs surgical intervention. Although the treatment of acute appendicitis is simple and straightforward, its diagnosis is still difficult because no single sign, symptom, laboratory or image examination accurately confirms the diagnosis of acute appendicitis in all cases. This contributes in increasing morbidity and negative appendectomy. In this study, the authors propose to generate an accurate model in prediction of patients with acute appendicitis which is based, firstly, on the segmentation technique associated to ABC algorithm to segment the patients; secondly, on applying fuzzy logic to process the massive volume of heterogeneous and noisy data (age, sex, fever, white blood cell, neutrophilia, CRP, urine, ultrasound, CT, appendectomy, etc.) in order to express knowledge and analyze the relationships among data in a comprehensive manner; and thirdly, on applying dynamic programming technique to reduce the number of data attributes. The proposed model is evaluated based on a set of benchmark techniques and even on a set of benchmark classification problems of osteoporosis, diabetes and heart obtained from the UCI data and other data sources.Keywords: healthcare management, acute appendicitis, data mining, classification, decision tree
Procedia PDF Downloads 349
24333 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery
Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene
Abstract:
Data flows and the purposes of reporting the data differ and depend on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form the dataset constructed for each time point, containing all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrating methodological approach must be developed to respond to the indicated problem. The proposed methodology contains several steps: (1) collecting context data sets and data validation; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study involves Grubbs outlier analysis, particularly for data cleaning and for identifying the statistical significance of data-reporting event cases. The Grubbs test is often used as it tests one extreme value at a time against the bounds expected under a normal distribution. In the study area, the test has not been widely applied by authors, except where the Grubbs test for outlier detection was used to identify outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select those forms of genetic algorithm construction that have more possibilities of extracting the best solution. For freight delivery management, genetic algorithm schemas are used as a more effective technique. Accordingly, an adaptable genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor. The authors suggest a methodology for multi-objective analysis, which evaluates the collected context data sets and uses this evaluation to determine a delivery corridor for freight transfer services in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value in the management of multi-modal transportation processes.
Keywords: multi-objective, analysis, data flow, freight delivery, methodology
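A minimal sketch of the Grubbs outlier test at the 99% confidence level mentioned above is shown below; the fuel-consumption figures are invented for illustration, while the test statistic and critical value follow the standard Grubbs formulation.

```python
# Sketch: two-sided Grubbs test for a single outlier at alpha = 0.01 (99% confidence).
# The sample values are hypothetical; only the statistic and critical value are standard.
import numpy as np
from scipy import stats

def grubbs_outlier(values, alpha=0.01):
    x = np.asarray(values, dtype=float)
    n = len(x)
    mean, sd = x.mean(), x.std(ddof=1)
    idx = np.argmax(np.abs(x - mean))
    g = abs(x[idx] - mean) / sd                              # Grubbs statistic
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)              # two-sided critical t value
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return x[idx], g, g_crit, g > g_crit

fuel_per_100km = [28.1, 27.9, 28.4, 27.7, 28.0, 35.6, 28.2]   # hypothetical reports
value, g, g_crit, is_outlier = grubbs_outlier(fuel_per_100km)
print(f"suspect={value}, G={g:.2f}, G_crit={g_crit:.2f}, outlier={is_outlier}")
```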
Procedia PDF Downloads 179
24332 Minimization of Denial of Services Attacks in Vehicular Adhoc Networking by Applying Different Constraints
Authors: Amjad Khan
Abstract:
The security of vehicular ad hoc networking is of great importance, as failures involve serious threats to life. Thus, to provide secure communication amongst vehicles on the road, the conventional security system is not enough. It is necessary to prevent network resources from being wasted and to protect them against malicious nodes, so as to ensure data bandwidth availability to the legitimate nodes of the network. This work provides a non-conventional security system by introducing some constraints to minimize DoS (denial of service) attacks, especially on data and bandwidth. The data packets received by a node in the network pass through a number of tests, and if any of the tests fails, the node drops those data packets and does not forward them any further. Also, if a node claims to be the nearest node for forwarding emergency messages, the sender can effectively identify whether the claim is true or false by using these constraints. Consequently, the DoS (denial of service) attack is minimized by the instant availability of data without wasting the network resources.
Keywords: black hole attack, grey hole attack, intransient traffic tempering, networking
Procedia PDF Downloads 283
24331 Traffic Prediction with Raw Data Utilization and Context Building
Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao
Abstract:
Traffic prediction is essential in a multitude of ways in modern urban life. The researchers of earlier work in this domain carry out the investigation chiefly with two major focuses: (1) the accurate forecast of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of raw data and the full context of the prediction timestamp. Concentrating on the two drawbacks of earlier work, we devise an approach that can address these issues in a two-phase framework. First, we utilize the raw trajectories to a greater extent through building a VLA table and data compression. We obtain the intra-trajectory features with graph-based encoding and the intertrajectory ones with a grid-based model and the technique of back projection that restore their surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information that are around it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution that combines the strength of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction.Keywords: traffic prediction, raw data utilization, context building, data reduction
Procedia PDF Downloads 126
24330 Bio-Inspired Design Approach Analysis: A Case Study of Antoni Gaudi and Santiago Calatrava
Authors: Marzieh Imani
Abstract:
Antoni Gaudi and Santiago Calatrava have reputation for designing bio-inspired creative and technical buildings. Even though they have followed different independent approaches towards design, the source of bio-inspiration seems to be common. Taking a closer look at their projects reveals that Calatrava has been influenced by Gaudi in terms of interpreting nature and applying natural principles into the design process. This research firstly discusses the dialogue between Biomimicry and architecture. This review also explores human/nature discourse during the history by focusing on how nature revealed itself to the fine arts. This is explained by introducing naturalism and romantic style in architecture as the outcome of designers’ inclination towards nature. Reviewing the literature, theoretical background and practical illustration of nature have been included. The most dominant practical aspects of imitating nature are form and function. Nature has been reflected in architectural science resulted in shaping different architectural styles such as organic, green, sustainable, bionic, and biomorphic. By defining a set of common aspects of Gaudi and Calatrava‘s design approach and by considering biomimetic design categories (organism, ecosystem, and behaviour as the main division and form, function, process, material, and construction as subdivisions), Gaudi’s and Calatrava’s project have been analysed. This analysis explores if their design approaches are equivalent or different. Based on this analysis, Gaudi’s architecture can be recognised as biomorphic while Calatrava’s projects are literally biomimetic. Referring to these architects, this review suggests a new set of principles by which a bio-inspired project can be determined either biomorphic or biomimetic.Keywords: biomimicry, Calatrava, Gaudi, nature
Procedia PDF Downloads 286
24329 Seismic Interpretation and Petrophysical Evaluation of SM Field, Libya
Authors: Abdalla Abdelnabi, Yousf Abushalah
Abstract:
The G Formation is a major gas-producing reservoir in the SM Field, eastern Libya. It is called the G limestone because it consists of shallow marine limestone. Well data and 3D seismic data, in conjunction with the results of a previous study, were used to delineate the hydrocarbon reservoir of the Middle Eocene G Formation in the SM Field area. The data include three-dimensional seismic data acquired in 2009, covering an area of approximately 75 mi², with more than 9 wells penetrating the reservoir. Seismic data are used to identify stratigraphic and structural features such as channels and faults, which may play a significant role in hydrocarbon trapping. The well data are used for the petrophysical analysis of the SM Field. The average porosity of the Middle Eocene G Formation is very good, with porosity reaching 24%, especially around well W6. Average water saturation was calculated for each well from porosity and resistivity logs using Archie's formula; the average water saturation across the wells is 25%. Structural mapping of the top and bottom of the Middle Eocene G Formation revealed that the highest area in the SM Field is at 4800 ft subsea, around wells W4, W5, W6, and W7, and the deepest point is at 4950 ft subsea. Correlation between wells using well data and structural maps created from seismic data revealed that the net thickness of the G Formation ranges from 0 ft in the northern part of the field to 235 ft in the southwest and south of the field. The gas-water contact is found at 4860 ft using the resistivity log. The net isopach map, using both the trapezoidal and pyramidal rules, is used to calculate the total bulk volume. The original gas in place and the recoverable gas were calculated volumetrically to be 890 billion standard cubic feet (BSCF) and 630 BSCF, respectively.
Keywords: 3D seismic data, well logging, petrel, kingdom suite
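The volumetric estimate referenced above follows the standard gas-in-place relation, OGIP = 43560·A·h·φ·(1−Sw)/Bg, with recoverable gas obtained by applying a recovery factor. The sketch below works through that arithmetic; the area, net pay, gas formation volume factor, and recovery factor are assumed placeholders chosen only so the result lands near the reported order of magnitude — only the 890 and 630 BSCF figures come from the abstract.

```python
# Sketch of a volumetric original-gas-in-place (OGIP) estimate:
#   OGIP [scf] = 43560 * A[acres] * h[ft] * phi * (1 - Sw) / Bg[rcf/scf]
# followed by recoverable gas = OGIP * recovery factor.  Inputs marked "hypothetical"
# are assumptions; porosity and Sw are the averages reported in the abstract.
ACRE_FT_TO_CF = 43560.0

area_acres = 7000.0        # hypothetical productive area
net_pay_ft = 80.0          # hypothetical average net thickness
porosity = 0.24            # average porosity from the abstract
sw = 0.25                  # average water saturation from the abstract
bg = 0.005                 # hypothetical gas formation volume factor, rcf/scf
recovery_factor = 0.7      # hypothetical

bulk_volume_cf = ACRE_FT_TO_CF * area_acres * net_pay_ft
ogip_scf = bulk_volume_cf * porosity * (1.0 - sw) / bg
recoverable_scf = ogip_scf * recovery_factor

print(f"OGIP        = {ogip_scf / 1e9:,.0f} BSCF")        # ~878 BSCF with these inputs
print(f"Recoverable = {recoverable_scf / 1e9:,.0f} BSCF")  # ~615 BSCF with these inputs
```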
Procedia PDF Downloads 148
24328 Analysis of Spatial and Temporal Data Using Remote Sensing Technology
Authors: Kapil Pandey, Vishnu Goyal
Abstract:
Spatial and temporal data analysis is very well known in the field of satellite image processing. When spatial data are combined with time series analysis, significant results are obtained in change detection studies. In this paper, GIS and remote sensing techniques have been used to detect change using time series satellite imagery of Uttarakhand state during the years 1990-2010. Natural vegetation, urban area, forest cover, etc. were chosen as the main land use classes to study. Land use/land cover classes for several years were prepared using satellite images. The maximum likelihood supervised classification technique was adopted in this work; finally, a land use change index was generated, and graphical models were used to present the changes.
Keywords: GIS, landuse/landcover, spatial and temporal data, remote sensing
Procedia PDF Downloads 431
24327 Determination of Soil Loss by Erosion in Different Land Covers Categories and Slope Classes in Bovilla Watershed, Tirana, Albania
Authors: Valmir Baloshi, Fran Gjoka, Nehat Çollaku, Elvin Toromani
Abstract:
As a sediment production mechanism, soil erosion is the main environmental threat to the Bovilla watershed, including the decline of water quality of the Bovilla reservoir that provides drinking water to Tirana city (the capital of Albania). Therefore, an experiment with 25 erosion plots for soil erosion monitoring has been set up since June 2017. The aim was to determine the soil loss on plot and watershed scale in Bovilla watershed (Tirana region) for implementation of soil and water protection measures or payments for ecosystem services (PES) programs. The results of erosion monitoring for the period June 2017 - May 2018 showed that the highest values of surface runoff were noted in bare land of 38829.91 liters on slope of 74% and the lowest values in forest land of 12840.6 liters on slope of 64% while the highest values of soil loss were found in bare land of 595.15 t/ha on slope of 62% and lowest values in forest land of 18.99 t/ha on slope of 64%. These values are much higher than the average rate of soil loss in the European Union (2.46 ton/ha/year). In the same sloping class, the soil loss was reduced from orchard or bare land to the forest land, and in the same category of land use, the soil loss increased with increasing land slope. It is necessary to conduct chemical analyses of sediments to determine the amount of chemical elements leached out of the soil and end up in the reservoir of Bovilla. It is concluded that PES programs should be implemented for rehabilitation of sub-watersheds Ranxe, Vilez and Zall-Bastar of the Bovilla watershed with valuable conservation practices.Keywords: ANOVA, Bovilla, land cover, slope, soil loss, watershed management
Procedia PDF Downloads 158
24326 An Empirical Investigation of the Challenges of Secure Edge Computing Adoption in Organizations
Authors: Hailye Tekleselassie
Abstract:
Edge computing is a distributed computing paradigm that brings enterprise applications closer to data sources such as IoT devices or local edge servers, and possible security incidents could hold back the adoption of such new technologies. This investigation therefore assessed the awareness of workers in technology and communications organizations and of computer users who rely on cloud services. Surveys were used to achieve these objectives, and questions about trust were a key part of them. Problems such as data privacy, integrity, and availability are the factors affecting a company's acceptance of the cloud service.
Keywords: IoT, data, security, edge computing
Procedia PDF Downloads 82
24325 Multi Tier Data Collection and Estimation, Utilizing Queue Model in Wireless Sensor Networks
Authors: Amirhossein Mohajerzadeh, Abolghasem Mohajerzadeh
Abstract:
In this paper, a target parameter is estimated with the desired precision in hierarchical wireless sensor networks (WSN), while the proposed algorithm also tries to prolong network lifetime as much as possible using an efficient data collection algorithm. The target parameter's distribution function is considered unknown. Sensor nodes sense the environment and send the data to the base station, called the fusion center (FC), using a hierarchical data collection algorithm. The FC reconstructs the underlying phenomenon based on the collected data. Considering the aggregation level x, the goal is to provide the essential infrastructure to find the best value for the aggregation level in order to prolong network lifetime as much as possible while the desired accuracy is guaranteed (the required sample size depends entirely on the desired precision). First, the sample size calculation algorithm is discussed; second, the average queue length based on an M/M[x]/1/K queue model is determined, and it is used for the energy consumption calculation. Nodes can decrease transmission cost by aggregating incoming data. Furthermore, the performance of the new algorithm is evaluated in terms of lifetime and estimation accuracy.
Keywords: aggregation, estimation, queuing, wireless sensor network
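The dependence of the required sample size on the desired precision can be illustrated with the usual normal-approximation bound; the standard deviation, error tolerance, and confidence level below are assumptions, not the paper's parameters, and the paper's own sample size algorithm may differ.

```python
# Sketch: required sample size n for estimating a mean within +/- E at a given confidence,
# assuming a known (or estimated) standard deviation sigma:  n = ceil((z * sigma / E)^2).
# The sigma, E, and confidence values below are illustrative assumptions.
import math
from scipy import stats

def required_sample_size(sigma: float, error: float, confidence: float = 0.95) -> int:
    z = stats.norm.ppf(1 - (1 - confidence) / 2)   # two-sided critical value
    return math.ceil((z * sigma / error) ** 2)

print(required_sample_size(sigma=4.0, error=0.5, confidence=0.95))   # -> 246 samples
```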
Procedia PDF Downloads 186
24324 Research and Application of Consultative Committee for Space Data Systems Wireless Communications Standards for Spacecraft
Authors: Cuitao Zhang, Xiongwen He
Abstract:
According to the new requirements of the future spacecraft, such as networking, modularization and non-cable, this paper studies the CCSDS wireless communications standards, and focuses on the low data-rate wireless communications for spacecraft monitoring and control. The application fields and advantages of wireless communications are analyzed. Wireless communications technology has significant advantages in reducing the weight of the spacecraft, saving time in spacecraft integration, etc. Based on this technology, a scheme for spacecraft data system is put forward. The corresponding block diagram and key wireless interface design of the spacecraft data system are given. The design proposal of the wireless node and information flow of the spacecraft are also analyzed. The results show that the wireless communications scheme is reasonable and feasible. The wireless communications technology can meet the future spacecraft demands in networking, modularization and non-cable.Keywords: Consultative Committee for Space Data Systems (CCSDS) standards, information flow, non-cable, spacecraft, wireless communications
Procedia PDF Downloads 327
24323 Inversion of Electrical Resistivity Data: A Review
Authors: Shrey Sharma, Gunjan Kumar Verma
Abstract:
High-density electrical prospecting has been widely used in groundwater investigation, civil engineering, and environmental surveys. For efficient inversion, the forward modeling routine, sensitivity calculation, and inversion algorithm must be efficient. This paper attempts to provide a brief summary of the past and ongoing developments of the method. It includes reviews of the procedures used for data acquisition, processing, and inversion of electrical resistivity data based on a compilation of academic literature. In recent times there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D and even 4-D surveys efficiently to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method is also presented.
Keywords: inversion, limitations, optimization, resistivity
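The linearized least-squares scheme mentioned above typically iterates a damped (regularized) Gauss-Newton update of the form m ← m + (JᵀJ + λI)⁻¹ Jᵀ(d − f(m)). The sketch below shows that update loop; the toy forward model is only a stand-in to keep the example self-contained, not a resistivity forward solver, and the damping choice is an assumption.

```python
# Sketch of a damped (Levenberg-Marquardt style) linearized least-squares update loop,
# the kind of local optimization commonly used in 2-D resistivity inversion.
# The forward model f(m) is a toy nonlinear function standing in for a real
# finite-difference/finite-element resistivity forward solver.
import numpy as np

def forward(m):                       # toy forward model (assumption)
    return np.array([m[0] + m[1], m[0] * m[1], m[0] ** 2 - m[1]])

def jacobian(m, eps=1e-6):            # finite-difference sensitivities
    f0 = forward(m)
    J = np.zeros((len(f0), len(m)))
    for j in range(len(m)):
        dm = m.copy(); dm[j] += eps
        J[:, j] = (forward(dm) - f0) / eps
    return J

m_true = np.array([2.0, 0.5])         # "true" model used to make synthetic data
d_obs = forward(m_true)               # synthetic "observed" data
m = np.array([1.0, 1.0])              # starting model
lam = 0.1                             # damping / regularization factor
for _ in range(20):
    r = d_obs - forward(m)
    J = jacobian(m)
    dm = np.linalg.solve(J.T @ J + lam * np.eye(len(m)), J.T @ r)
    m = m + dm
print("recovered model:", m, "residual norm:", np.linalg.norm(d_obs - forward(m)))
```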
Procedia PDF Downloads 363
24322 Exploring the Correlation between Population Distribution and Urban Heat Island under Urban Data: Taking Shenzhen Urban Heat Island as an Example
Authors: Wang Yang
Abstract:
Shenzhen is a modern city shaped by China's reform and opening-up policy, and the development of its urban morphology has been driven by the administration of the Chinese government. The city's planning paradigm is primarily affected by spatial structure and human behavior. The urban agglomeration is subjectively divided into several groups and centers, and under this influence the inherent laws of the city's development tend to be neglected. With the continuous development of the internet, big data technology has been introduced in China, and data mining and data analysis have become important tools in municipal research. Data mining has been utilized to improve data collection, such as obtaining business data, traffic data, and population data. Before data mining, government data were collected by traditional means and then analyzed in city-relationship research, delaying the timeliness of urban development studies, especially for the contemporary city. Data update speeds are now very fast and internet-based. The city's points of interest (POI) obtained through data mining serve as the data source affecting city design, while satellite remote sensing is used as a reference object; the city is analyzed in both directions, the administrative paradigm of government is broken, and urban research is restored. Therefore, the use of data mining in urban analysis is very important. The satellite remote sensing data of Shenzhen for July 2018 were measured by the MODIS sensor and can be utilized to perform land surface temperature inversion and analyze the heat island distribution of Shenzhen. This article acquired and classified data on Shenzhen using web crawler technology. Shenzhen heat island data and points of interest were simulated and analyzed on a GIS platform to discover the main features of the distribution of functionally equivalent areas. Shenzhen is located in an east-west oriented area of China, and the city's main streets follow the direction of city development; therefore, the functional areas of the city are also distributed in the east-west direction. The urban heat island can be mapped according to the functional urban areas, and regional POIs show a corresponding pattern. The research results clearly show that the distribution of the urban heat island and the distribution of urban POIs are in one-to-one correspondence. The urban heat island is primarily influenced by the properties of the underlying surface, setting aside the impact of the urban climate. Using urban POIs as the object of analysis, the distribution of municipal POIs and population aggregation are closely connected, so that the distribution of the population corresponds with the distribution of the urban heat island.
Keywords: POI, satellite remote sensing, the population distribution, urban heat island thermal map
Procedia PDF Downloads 103
24321 Community Level Vulnerabilities to Climate Change in Cox’s Bazar-Teknaf Coastal Area of Bangladesh
Authors: Pronob Kumar Mozumder, M. Abdur Rob Mollah
Abstract:
This research was conducted in two coastal locations of Bangladesh from February, 2013 to January, 2014.The objective of this research was to assess the potential vulnerabilities of climate change on local ecosystem and people and to identify and recommend local level adaptation strategies to climate change. Focus group discussions, participatory rural appraisal, interviewing local elderly people were conducted. Perceptions about climate change indicate that local people are experiencing impacts of climate change. According to local people, temperature, cyclone, rain, water-logging, siltation, salinity, erosion, and flash flood are increasing. Vulnerability assessment revealed that local people are variously affected by abnormal climate related disasters. This is jeopardizing their livelihoods, risking their lives, health, and their assets. This prevailing climatic situation in the area is also impacting their environmental conditions, biodiversity and natural resources, and their economic activities. The existing adaptation includes using traditional boat and mobile phone while fishing and making house on high land and lower height. Proposed adaptation for fishing boat are using more than 60 feet length with good timber, putting at least 3 longitudinal bar along upper side, using enough vertical side bars. The homestead measures include use of cross bracing of wall frame, roof tying with extra-post by ropes and plantation of timber tree against wind.Keywords: community level vulnerabilities, climate change, Cox’s Bazar-Teknaf Coastal Area, Bangladesh
Procedia PDF Downloads 535
24320 Design and Development of Data Mining Application for Medical Centers in Remote Areas
Authors: Grace Omowunmi Soyebi
Abstract:
Data mining is the extraction of information from a large database, which helps in predicting a trend or behavior, thereby helping management make knowledge-driven decisions. One principal problem of most hospitals in rural areas is their reliance on a paper file management system for keeping records. A lot of time is wasted when a patient visits the hospital, possibly in an emergency, and the nurse or attendant has to search through voluminous files before the patient's file can be retrieved; this delay may cause something unexpected to happen to the patient. This data mining application is to be designed using the Structured Systems Analysis and Design Method, which will help in a well-articulated analysis of the existing file management system, a feasibility study, and proper documentation of the design and implementation of a computerized medical record system. This computerized system will replace the file management system and help to easily retrieve a patient's record with increased data security, provide access to clinical records for decision-making, and reduce the time it takes for a patient to be attended to.
Keywords: data mining, medical record system, systems programming, computing
Procedia PDF Downloads 207
24319 A Comprehensive Framework to Ensure Data Security in Cloud Computing: Analysis, Solutions, and Approaches
Authors: Loh Fu Quan, Fong Zi Heng, Burra Venkata Durga Kumar
Abstract:
Cloud computing has completely transformed the way many businesses operate. Traditionally, the confidential data of a business are stored on computers located within the premises of the business, so a lot of business capital is put towards maintaining computing resources and hiring IT teams to manage them. The advent of cloud computing changes everything. Instead of purchasing and managing their own infrastructure, many businesses have started to shift towards working with the cloud through a cloud service provider (CSP), leading to cost savings. However, it also introduces security risks. This research paper focuses on the security risks that arise during data migration and user authentication in cloud computing. To overcome this problem, this paper provides a comprehensive framework that includes Transport Layer Security (TLS), user authentication, security tokens, and multi-level data encryption. This framework aims to prevent unauthorized access to cloud resources and data leakage, ensuring the confidentiality of sensitive information. The framework can be used by cloud service providers to strengthen the security of their cloud and instil confidence in their users.
Keywords: Cloud computing, Cloud security, Cloud security issues, Cloud security framework
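One way to picture the combination of security tokens and multi-level encryption described above is the envelope-encryption sketch below. The token scheme, key hierarchy, and all names are illustrative assumptions rather than the paper's concrete design; TLS would additionally protect the transport channel itself and is not shown here.

```python
# Sketch: token-gated access plus two-level ("envelope") encryption -- a per-object data
# key that is itself encrypted under a master key.  Names and key handling are assumptions.
import secrets
from cryptography.fernet import Fernet   # pip install cryptography

master_key = Fernet.generate_key()       # would live in a KMS/HSM in practice
issued_tokens = set()

def issue_token(username: str, password: str) -> str:
    # placeholder authentication step; a real system would verify credentials properly
    token = secrets.token_urlsafe(32)
    issued_tokens.add(token)
    return token

def encrypt_for_cloud(plaintext: bytes):
    data_key = Fernet.generate_key()
    ciphertext = Fernet(data_key).encrypt(plaintext)          # level 1: data encrypted with data key
    wrapped_key = Fernet(master_key).encrypt(data_key)        # level 2: data key wrapped with master key
    return ciphertext, wrapped_key

def decrypt_from_cloud(token: str, ciphertext: bytes, wrapped_key: bytes) -> bytes:
    if token not in issued_tokens:
        raise PermissionError("invalid security token")
    data_key = Fernet(master_key).decrypt(wrapped_key)
    return Fernet(data_key).decrypt(ciphertext)

tok = issue_token("alice", "s3cret")
ct, wk = encrypt_for_cloud(b"quarterly financials")
print(decrypt_from_cloud(tok, ct, wk))
```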
Procedia PDF Downloads 116
24318 Using AI for Analysing Political Leaders
Authors: Shuai Zhao, Shalendra D. Sharma, Jin Xu
Abstract:
This research uses advanced machine learning models to test a number of hypotheses regarding political executives. Specifically, it analyses the impact these powerful leaders have on economic growth by using leaders' data from the Archigos database from 1835 to the end of 2015. The data are processed with AutoGluon, which was developed by Amazon. Automated machine learning (AutoML) tools such as AutoGluon can automatically extract features from the data and then use multiple classifiers to train on them. A linear regression model and a classification model are used to establish the relationship between leaders and economic growth (GDP per capita growth) and to clarify the relationship between leaders' characteristics and economic growth from a machine learning perspective. Our work may serve as a model, or a signal, for collaboration between the fields of statistics and artificial intelligence (AI) that can light the way for political researchers and economists.
Keywords: comparative politics, political executives, leaders' characteristics, artificial intelligence
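A minimal AutoGluon sketch of the tabular workflow described above is given below. The feature columns and the synthetic data are invented placeholders for the Archigos-derived dataset (the study's actual variables are not specified in the abstract), and the preset and time limit are arbitrary choices.

```python
# Sketch: fitting an AutoGluon TabularPredictor to leader-level features to predict
# GDP-per-capita growth.  The columns and synthetic values are assumptions standing in
# for the Archigos-derived dataset; they are not the study's real variables.
import numpy as np
import pandas as pd
from autogluon.tabular import TabularPredictor   # pip install autogluon.tabular

rng = np.random.default_rng(0)
n = 200
train = pd.DataFrame({
    "tenure_years": rng.integers(1, 20, n),                  # hypothetical leader features
    "entry_mode": rng.choice(["regular", "irregular"], n),
    "age_at_entry": rng.integers(35, 80, n),
})
train["gdp_pc_growth"] = 2.0 - 0.05 * train["tenure_years"] + rng.normal(0, 1, n)

predictor = TabularPredictor(label="gdp_pc_growth", problem_type="regression").fit(
    train, time_limit=120, presets="medium_quality"
)
print(predictor.leaderboard())
print(predictor.predict(train.drop(columns=["gdp_pc_growth"])).head())
```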
Procedia PDF Downloads 85
24317 Data Quality on Regular Immunization Programme at Birkod District: Somali Region, Ethiopia
Authors: Eyob Seife, Tesfalem Teshome, Bereket Seyoum, Behailu Getachew, Yohans Demis
Abstract:
Developing countries continue to face preventable communicable diseases, such as vaccine-preventable diseases. The Expanded Programme on Immunization (EPI) was established by the World Health Organization in 1974 to control these diseases. Health data use is crucial in decision-making, but ensuring data quality remains challenging. The study aimed to assess the accuracy ratio, timeliness, and quality index of regular immunization programme data in the Birkod district of the Somali Region, Ethiopia. For poor data quality, technical, contextual, behavioral, and organizational factors are among contributors. The study used a quantitative cross-sectional design conducted in September 2022GC using WHO-recommended data quality self-assessment tools. The accuracy ratio and timeliness of reports on regular immunization programmes were assessed for two health centers and three health posts in the district for one fiscal year. Moreover, the quality index assessment was conducted at the district level and health facilities by trained assessors. The study found poor data quality in the accuracy ratio and timeliness of reports at all health units, which includes zeros. Overreporting was observed for most facilities, particularly at the health post level. Health centers showed a relatively better accuracy ratio than health posts. The quality index assessment revealed poor quality at all levels. The study recommends that responsible bodies at different levels improve data quality using various approaches, such as the capacitation of health professionals and strengthening the quality index components. The study highlighted the need for attention to data quality in general, specifically at the health post level, and improving the quality index at all levels, which is essential.Keywords: Birkod District, data quality, quality index, regular immunization programme, Somali Region-Ethiopia
Procedia PDF Downloads 88
24316 The Results of Longitudinal Water Quality Monitoring of the Brandywine River, Chester County, Pennsylvania by High School Students
Authors: Dina L. DiSantis
Abstract:
Strengthening a sense of responsibility while relating global sustainability concepts such as water quality and pollution to a local water system can be achieved by teaching students to conduct and interpret water quality monitoring tests. When students conduct their own research, they become better stewards of the environment. Providing outdoor learning and place-based opportunities for students helps connect them to the natural world. By conducting stream studies and collecting data, students are able to better understand how the natural environment is a place where everything is connected. Students have been collecting physical, chemical and biological data along the West and East Branches of the Brandywine River, in Pennsylvania for over ten years. The stream studies are part of the advanced placement environmental science and aquatic science courses that are offered as electives to juniors and seniors at the Downingtown High School West Campus in Downingtown, Pennsylvania. Physical data collected includes: temperature, turbidity, width, depth, velocity, and volume of flow or discharge. The chemical tests conducted are: dissolved oxygen, carbon dioxide, pH, nitrates, alkalinity and phosphates. Macroinvertebrates are collected with a kick net, identified and then released. Students collect the data from several locations while traveling by canoe. In the classroom, students prepare a water quality data analysis and interpretation report based on their collected data. The summary of the results from longitudinal water quality data collection by students, as well as the strengths and weaknesses of student data collection will be presented.Keywords: place-based, student data collection, sustainability, water quality monitoring
Procedia PDF Downloads 155
24315 Sandy Soil Properties under Different Plant Cover Types in Drylands, Sudan
Authors: Rayan Elsiddig Eltaib, Yamanaka Norikazu, Mubarak Abdelrahman Abdalla
Abstract:
This study investigated the effects of Acacia Senegal, Calotropis procera, Leptadenia pyrotechnica, Ziziphus spina Christi, Balanites aegyptiaca, Indigofera oblongigolia, Arachis hypogea and Sesimum indicum grown in the western region of White Nile State on soil properties of the 0-10, 10-30, 30-60 and 60-90 cm depths. Soil properties were: pH(paste), electrical conductivity of the saturation extract (ECe), total N (TN), organic carbon (OC), soluble K, available P, aggregate stability and water holding capacity. Triplicate Soil samples were collected after the end of the rainy season using 5 cm diameter auger. Results indicated that pH, ECe and TN were not significantly different among plant cover types. In the top 10-30 cm depth, OC under all types was significantly higher than the control (4.1 to 7.7 fold). The highest (0.085%) OC was found under the Z. spina Christi and A. Senegal whereas the lowest (0.045%) was reported under the A. hypogea. In the 10-30 cm depth, with the exception of A. hypogea, Z. spina christi and S. indicum, P content was almost similar but significantly higher than the control by 72 to 129%. In the 10-30 cm depth, K content under the S. indicum (0.46 meq/L) was exceptionally high followed by Z. spina christi (0.102 meq/L) as compared to the control (0.029 meq/L). Water holding capacity and aggregate stability of the top 0-10 cm depth were not significantly different among plant cover types. Based on the fact that accumulation of organic matter in the soil profile of any ecosystem is an important indicator of soil quality, results of this study may conclude that (1) cultivation of A.senegal, B.aegyptiaca and Z. spina Christi improved soil quality whereas (2) cultivation of A. hypogea or soil that is solely invaded with C. procera and L.pyrotechnica may induce soil degradation.Keywords: canopy, crops, shrubs, soil properties, trees
Procedia PDF Downloads 282
24314 Assessing the Impact of Autonomous Vehicles on Supply Chain Performance – A Case Study of Agri-Food Supply Chain
Authors: Nitish Suvarna, Anjali Awasthi
Abstract:
In an era marked by rapid technological advancements, the integration of Autonomous Vehicles into supply chain networks represents a transformative shift, promising to redefine the paradigms of logistics and transportation. This thesis delves into a comprehensive assessment of the impact of autonomous vehicles on supply chain performance, with a particular focus on network design, operational efficiency, and environmental sustainability. Employing the advanced simulation capabilities of anyLogistix (ALX), the study constructs a digital twin of a conventional supply chain network, encompassing suppliers, production facilities, distribution centers, and customer endpoints. The research methodically integrates Autonomous Vehicles into this intricate network, aiming to unravel the multifaceted effects on transportation logistics including transit times, cost-efficiency, and sustainability. Through simulations and scenarios analysis, the study scrutinizes the operational resilience and adaptability of supply chains in the face of dynamic market conditions and disruptive technologies like Autonomous Vehicles. Furthermore, the thesis undertakes carbon footprint analysis, quantifying the environmental benefits and challenges associated with the adoption of Autonomous Vehicles in supply chain operations. The insights from this research are anticipated to offer a strategic framework for industry stakeholders, guiding the adoption of Autonomous Vehicles to foster a more efficient, responsive, and sustainable supply chain ecosystem. The findings aim to serve as a cornerstone for future research and practical implementations in the realm of intelligent transportation and supply chain management.Keywords: autonomous vehicle, agri-food supply chain, ALX simulation, anyLogistix
Procedia PDF Downloads 74