Search results for: minimum data set

24815 Integration of “FAIR” Data Principles in Longitudinal Mental Health Research in Africa: Lessons from a Landscape Analysis

Authors: Bylhah Mugotitsa, Jim Todd, Agnes Kiragga, Jay Greenfield, Evans Omondi, Lukoye Atwoli, Reinpeter Momanyi

Abstract:

The INSPIRE network aims to build an open, ethical, sustainable, and FAIR (Findable, Accessible, Interoperable, Reusable) data science platform, particularly for longitudinal mental health (MH) data. While studies have been done at the clinical and population level, there still exist limitations in data and research in LMICs, which pose a risk of underrepresentation of mental disorders. It is therefore vital to examine the existing longitudinal MH data, focusing on how FAIR the datasets are. This landscape analysis aimed to provide both an overall level of evidence on the availability of longitudinal datasets and the degree of consistency across the longitudinal studies conducted. Utilizing prompts proved instrumental in streamlining the analysis process, facilitating access to, categorization of, and analysis of extensive data repositories related to depression, anxiety, and psychosis in Africa, as well as crafting code snippets. Leveraging artificial intelligence (AI), we filtered through over 18,000 scientific papers spanning from 1970 to 2023. This AI-driven approach enabled the identification of 228 longitudinal research papers meeting the inclusion criteria. Quality assurance revealed that 10% of the articles were incorrectly identified, along with 2 duplicates; the remaining papers underscore the predominance of longitudinal MH research in South Africa, with a focus on depression. From the analysis, evaluating data and metadata adherence to FAIR principles remains crucial for enhancing the accessibility and quality of MH research in Africa. While AI has the potential to enhance research processes, challenges such as privacy concerns and data security risks must be addressed. Ethical and equity considerations in data sharing and reuse are also vital. There is a need for collaborative efforts across disciplinary and national boundaries to improve the Findability and Accessibility of data. Current efforts should also focus on creating integrated data resources and tools to improve the Interoperability and Reusability of MH data. Practical steps for researchers include careful study planning, data preservation, machine-actionable metadata, and promoting data reuse to advance science and improve equity. Metrics and recognition should be established to incentivize adherence to FAIR principles in MH research.

Keywords: longitudinal mental health research, data sharing, fair data principles, Africa, landscape analysis

Procedia PDF Downloads 89
24814 3D Numerical Studies on External Aerodynamics of a Flying Car

Authors: Sasitharan Ambicapathy, J. Vignesh, P. Sivaraj, Godfrey Derek Sams, K. Sabarinath, V. R. Sanal Kumar

Abstract:

The external flow simulation of a flying car at the take-off phase is a daunting task owing to the fact that the prediction of the transient unsteady flow features during its deployment phase is very complex. In this paper, 3D numerical simulations of the external flow of a proposed Ferrari F430 flying car with different NACA 9618 rectangular wings have been carried out. Additionally, the aerodynamic characteristics have been generated for optimizing its geometry to achieve the minimum take-off velocity with better overall performance on both road and air. The three-dimensional standard k-omega turbulence model has been used for capturing the intrinsic flow physics during the take-off phase. In the numerical study, a fully implicit finite volume scheme of the compressible, Reynolds-averaged Navier-Stokes equations is employed. Through detailed parametric analytical studies, we have conjectured that the Ferrari F430 flying car fitted with high wings having three different deployment histories during the take-off phase is the best choice for accomplishing better performance in commercial applications.

Keywords: aerodynamics of flying car, air taxi, negative lift, roadable airplane

Procedia PDF Downloads 420
24813 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 68
24812 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study

Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos

Abstract:

This article presents a method to analyze the use of indoor spaces based on data analytics obtained from in-built digital devices. The study uses the data generated by the in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights on space occupancy, user behaviour, and comfort. Those devices, originally installed to facilitate remote operations, report data through the internet that the research uses to analyze information on real-time human use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable solution to analyze building interior spaces without incorporating external data collection systems such as dedicated sensors. The methodology is applied to a real coliving case study, a residential building of 3000 m², 7 floors, and 80 users in the centre of Madrid. The case study applies the method to classify the IoT devices and to assess, clean, and analyze the collected data based on the analysis framework. The information is collected remotely through the different devices' platforms. The first step is to curate the data and understand what insights each device can provide according to the objectives of the study; this generates an analysis framework that can be scaled up for future building assessment, even beyond the residential sector. The method adjusts the parameters to be analyzed to the dataset available in the IoT network of each building. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.

Keywords: in-place devices, IoT, human-centred data-analytics, spatial design

Procedia PDF Downloads 197
24811 Performance Evaluation of Task Scheduling Algorithm on LCQ Network

Authors: Zaki Ahmad Khan, Jamshed Siddiqui, Abdus Samad

Abstract:

The scheduling and mapping of tasks on a set of processors is considered a critical problem in parallel and distributed computing systems. This paper deals with the problem of dynamic scheduling on a special type of multiprocessor architecture known as the Linear Crossed Cube (LCQ) network. This proposed multiprocessor is a hybrid network that combines the features of both linear and cube-based architectures. Two standard dynamic scheduling schemes, namely Minimum Distance Scheduling (MDS) and Two Round Scheduling (TRS), are implemented on the LCQ network. Parallel tasks are mapped, and the load imbalance is evaluated on different sets of processors in the LCQ network. The simulation results are evaluated, and through a thorough analysis of the results, an effort is made to obtain the best solution for the given network in terms of the load imbalance left and the execution time. Other performance metrics, such as speedup and efficiency, are also evaluated with the given dynamic algorithms.

Keywords: dynamic algorithm, load imbalance, mapping, task scheduling

Procedia PDF Downloads 450
24810 A Unique Multi-Class Support Vector Machine Algorithm Using MapReduce

Authors: Aditi Viswanathan, Shree Ranjani, Aruna Govada

Abstract:

With data sizes constantly expanding, and with classical machine learning algorithms that analyze such data requiring larger and larger amounts of computation time and storage space, the need to distribute computation and memory requirements among several computers has become apparent. Although substantial work has been done in developing distributed binary SVM algorithms and multi-class SVM algorithms individually, the field of multi-class distributed SVMs remains largely unexplored. This research seeks to develop an algorithm that implements the Support Vector Machine over a multi-class data set and is efficient in a distributed environment. For this, we recursively choose the best binary split of a set of classes using a greedy technique, much like the divide-and-conquer approach. Our algorithm has shown better computation time during the testing phase than the traditional sequential SVM methods (One vs. One, One vs. Rest) and outperforms them as the size of the data set grows. This approach also classifies the data with higher accuracy than the traditional multi-class algorithms.
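
For illustration, below is a minimal, single-machine sketch of the "tree of binary SVMs" idea described in this abstract: a set of classes is split in two recursively, with each split chosen greedily. The greedy criterion (cross-validated accuracy of candidate one-vs-rest-of-set splits), the RBF kernel, and the use of scikit-learn are assumptions for the sketch; the paper's MapReduce distribution layer is not reproduced here.

```python
# Sketch: recursive greedy binary splitting of a class set into a tree of binary SVMs.
# Assumes each class has at least a few samples (needed for 3-fold cross-validation).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

class SVMTreeNode:
    def __init__(self, classes):
        self.classes = list(classes)   # class labels handled by this node
        self.clf = None                # binary SVM separating left vs. right subsets
        self.left = self.right = None

def best_greedy_split(X, y, classes):
    """Greedily pick the class whose 'this class vs. the rest of the set' split
    gives the best cross-validated binary accuracy."""
    mask = np.isin(y, classes)
    Xs, ys = X[mask], y[mask]
    best = None
    for c in classes:
        score = cross_val_score(SVC(kernel="rbf"), Xs, (ys == c).astype(int), cv=3).mean()
        if best is None or score > best[0]:
            best = (score, [c], [k for k in classes if k != c])
    return best[1], best[2]

def build_svm_tree(X, y, classes):
    node = SVMTreeNode(classes)
    if len(classes) == 1:              # leaf: a single class remains
        return node
    left, right = best_greedy_split(X, y, classes)
    mask = np.isin(y, classes)
    node.clf = SVC(kernel="rbf").fit(X[mask], np.isin(y[mask], left).astype(int))
    node.left = build_svm_tree(X, y, left)
    node.right = build_svm_tree(X, y, right)
    return node

def predict_one(node, x):
    while node.clf is not None:        # descend until a leaf (single class) is reached
        node = node.left if node.clf.predict(x.reshape(1, -1))[0] == 1 else node.right
    return node.classes[0]
```

In a distributed setting, each node's training data can be partitioned across workers and the per-candidate scores aggregated, which is where the MapReduce step would enter.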

Keywords: distributed algorithm, MapReduce, multi-class, support vector machine

Procedia PDF Downloads 401
24809 Information Management Approach in the Prediction of Acute Appendicitis

Authors: Ahmad Shahin, Walid Moudani, Ali Bekraki

Abstract:

This research aims to present a predictive data mining model for accurate diagnosis of acute appendicitis in patients, with the purpose of maximizing health service quality, minimizing morbidity/mortality, and reducing cost. Acute appendicitis is the most common condition requiring timely, accurate diagnosis and surgical intervention. Although the treatment of acute appendicitis is simple and straightforward, its diagnosis is still difficult because no single sign, symptom, laboratory test or imaging examination accurately confirms the diagnosis of acute appendicitis in all cases. This contributes to increasing morbidity and negative appendectomy rates. In this study, the authors propose to generate an accurate model for the prediction of patients with acute appendicitis, which is based, firstly, on a segmentation technique associated with the ABC algorithm to segment the patients; secondly, on applying fuzzy logic to process the massive volume of heterogeneous and noisy data (age, sex, fever, white blood cell count, neutrophilia, CRP, urine, ultrasound, CT, appendectomy, etc.) in order to express knowledge and analyze the relationships among the data in a comprehensive manner; and thirdly, on applying a dynamic programming technique to reduce the number of data attributes. The proposed model is evaluated against a set of benchmark techniques and on a set of benchmark classification problems of osteoporosis, diabetes and heart disease obtained from the UCI data repository and other data sources.

Keywords: healthcare management, acute appendicitis, data mining, classification, decision tree

Procedia PDF Downloads 350
24808 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery

Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene

Abstract:

Data flows and the purposes for reporting data differ depending on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form the dataset constructed for each time point, containing all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrating methodological approach must be developed to respond to the indicated problem. The proposed methodology contains several steps: (1) collecting context data sets and data validation; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study involves Grubbs' outlier analysis, used particularly for data cleaning and for identifying the statistical significance of data reporting event cases. The Grubbs test is often used because it assesses one extreme value at a time against the boundaries of the standard normal distribution. In the study area, the test has not been widely applied by authors, except where the Grubbs test for outlier detection was used to identify outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select forms of genetic algorithm construction that offer greater possibilities of extracting the best solution. For freight delivery management, genetic algorithm structures are used as a more effective technique; accordingly, an adaptable genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor. The authors suggest a methodology for multi-objective analysis that evaluates the collected context data sets and uses this evaluation to determine a delivery corridor for freight transfer services in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value for the management of multi-modal transportation processes.
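
As a reference for the data-cleaning step, the sketch below applies Grubbs' test for a single outlier at the 99% confidence level mentioned above; the critical value follows the standard Student-t formula, and the fuel-consumption figures are hypothetical placeholders.

```python
# Sketch of Grubbs' test (two-sided, one suspect value) at the 99% confidence level.
import numpy as np
from scipy import stats

def grubbs_test(x, alpha=0.01):
    """Return (G, G_crit, is_outlier) for the most extreme value in x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    g = np.max(np.abs(x - x.mean())) / x.std(ddof=1)   # test statistic
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)         # t critical value, n-2 d.o.f.
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return g, g_crit, g > g_crit

# Hypothetical fuel-consumption readings (litres/100 km) with one suspect value.
fuel = [28.1, 27.9, 28.4, 28.0, 41.7, 27.8, 28.2]
print(grubbs_test(fuel))
```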

Keywords: multi-objective, analysis, data flow, freight delivery, methodology

Procedia PDF Downloads 180
24807 Study the Action of Malathion Induced Enzymatic Changes in the Target Organ of Fish Labeo Rohita

Authors: Sudha Summarwar, Jyotsana Pandey, Deepali Lall

Abstract:

Malathion has a strong tendency to accumulate in the organs of fish, whether it is present in traces or in higher amounts in the aquatic environment, and it accumulates to a greater extent in the organs directly exposed to it. The accumulation was found to be time- and concentration-dependent, being maximum in the gills and minimum in the brain. The effect of different sub-lethal concentrations (1/5th, 1/10th, 1/15th, 1/20th, and 1/25th fractions of the 96 hr LC50) of malathion on acid phosphatase (AcPase), alkaline phosphatase (AlPase), serum glutamic oxaloacetic transaminase (SGOT), serum glucose-6-phosphatase (S-G-6-Pase), and serum glutamic pyruvic transaminase (SGPT) in the blood of Labeo rohita exposed for periods of 15, 30, 45, and 60 days has been studied in the present investigation. In general, the alterations were concentration- and duration-dependent.

Keywords: AcPase, AlPase, Labeo rohita, malathion, S-G-6-Pase, SGOT, SGPT

Procedia PDF Downloads 327
24806 Antitrypanosomal Activity of Stigmasterol: An in silico Approach

Authors: Mohammed Auwal Ibrahim, Aminu Mohammed

Abstract:

Stigmasterol has previously been reported to possess antitrypanosomal activity in in vitro and in vivo models; however, the mechanism of this activity is yet to be elucidated. In the present study, molecular docking was used to decipher the mode of interaction and binding affinity of stigmasterol to three known antitrypanosomal drug targets, viz. adenosine kinase, ornithine decarboxylase and triose phosphate isomerase. Stigmasterol was found to bind to the selected trypanosomal enzymes with minimum binding energies of -4.2, -6.5 and -6.6 kcal/mol for adenosine kinase, ornithine decarboxylase, and triose phosphate isomerase, respectively. Hydrogen bonding was not involved in the interaction of stigmasterol with any of the three enzymes; instead, hydrophobic interactions appeared to play a vital role in the binding, which was predicted to be a non-competitive-like type of inhibition. It was concluded that binding to the three selected enzymes, especially triose phosphate isomerase, might be involved in the antitrypanosomal activity of stigmasterol, but that this is not mediated via hydrogen bond interactions.

Keywords: antitrypanosomal, in silico, molecular docking, stigmasterol

Procedia PDF Downloads 278
24805 Minimization of Denial of Services Attacks in Vehicular Adhoc Networking by Applying Different Constraints

Authors: Amjad Khan

Abstract:

The security of vehicular ad hoc networking is of great importance, as failures can involve serious threats to life. To provide secure communication among vehicles on the road, the conventional security system is not enough. It is necessary to prevent network resources from being wasted and to protect them against malicious nodes, so as to ensure data bandwidth availability for the legitimate nodes of the network. This work provides a non-conventional security system by introducing constraints to minimize DoS (denial of service) attacks, especially on data and bandwidth. The data packets received by a node in the network pass through a number of tests, and if any of the tests fails, the node drops those data packets and does not forward them any further. Also, if a node claims to be the nearest node for forwarding emergency messages, the sender can effectively identify whether the claim is true or false by using these constraints. Consequently, the DoS (denial of service) attack is minimized by the instant availability of data without wasting network resources.

Keywords: black hole attack, grey hole attack, intransient traffic tempering, networking

Procedia PDF Downloads 284
24804 Traffic Prediction with Raw Data Utilization and Context Building

Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao

Abstract:

Traffic prediction is essential in a multitude of ways in modern urban life. Earlier work in this domain has chiefly had two major focuses: (1) the accurate forecast of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of the raw data and the full context of the prediction timestamp. Concentrating on these two drawbacks of earlier work, we devise an approach that addresses these issues in a two-phase framework. First, we utilize the raw trajectories to a greater extent by building a VLA table and applying data compression. We obtain the intra-trajectory features with graph-based encoding and the inter-trajectory ones with a grid-based model and a back-projection technique that restores their surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and to attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information around it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution, which combines the strength of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction.

Keywords: traffic prediction, raw data utilization, context building, data reduction

Procedia PDF Downloads 127
24803 Catch Composition and Amount of Illegal and Unreported Fishing in Iranian Coastal Waters - Hormozgan Province

Authors: Yasemi Mehran, Parsa Mehran, Farzingohar Mehrnaz

Abstract:

Illegal, unreported, and unregulated (IUU) fishing has been identified as one of the most serious threats to the sustainability of the world's fisheries. In the present study, illegal and unreported fishing of different species in the waters of the Persian Gulf and the Oman Sea (Hormozgan province) was evaluated. Of the 47 species from 33 families identified in this study, 39 species were teleosts, 4 were elasmobranchs and 4 were invertebrates. The total weight of the illegal and unreported catch was 78525.22 tonnes. Maximum and minimum values were found for Dussumiera acuta (20640.74 tonnes) and Tenualosa ilisha (0.733 tonnes), respectively. The most important commercial species groups were Scombridae, Carangidae and Clupeidae, respectively. Teleosts with 91.15%, elasmobranchs with 4.82% and invertebrates with 4.03% constituted the total weight of the illegal and unreported fishing. The results of this study provide valuable information for achieving sustainable management of fish resources.

Keywords: catch composition, illegal, unreported fishing, Hormozgan province

Procedia PDF Downloads 297
24802 Seismic Interpretation and Petrophysical Evaluation of SM Field, Libya

Authors: Abdalla Abdelnabi, Yousf Abushalah

Abstract:

The G Formation is a major gas-producing reservoir in the SM Field, eastern Libya. It is called the G limestone because it consists of shallow marine limestone. Well data and 3D seismic data, in conjunction with the results of a previous study, were used to delineate the hydrocarbon reservoir of the Middle Eocene G Formation of the SM Field area. The data include three-dimensional seismic data acquired in 2009, covering an area of approximately 75 mi², with more than 9 wells penetrating the reservoir. The seismic data are used to identify stratigraphic and structural features, such as channels and faults, which may play a significant role in hydrocarbon trapping. The well data are used for the petrophysical analysis of the SM Field. The average porosity of the Middle Eocene G Formation is very good, reaching 24%, especially around well W6. Average water saturation was calculated for each well from porosity and resistivity logs using Archie's formula; the overall average water saturation is 25%. Structural mapping of the top and bottom of the Middle Eocene G Formation revealed that the highest area in the SM Field is at 4800 ft subsea, around wells W4, W5, W6, and W7, and the deepest point is at 4950 ft subsea. Correlation between wells using the well data and the structural maps created from the seismic data revealed that the net thickness of the G Formation ranges from 0 ft in the northern part of the field to 235 ft in the southwestern and southern parts. The gas-water contact is found at 4860 ft using the resistivity log. The net isopach map, using both the trapezoidal and pyramid rules, is used to calculate the total bulk volume. The original gas in place and the recoverable gas were calculated volumetrically to be 890 Billion Standard Cubic Feet (BSCF) and 630 BSCF, respectively.
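
The two calculations named in the abstract, water saturation from Archie's formula and the volumetric gas-in-place estimate, follow standard forms; a small sketch is given below. The Archie constants, Rw, Rt, Bg and the area/thickness inputs are illustrative placeholders, not values reported for the SM Field.

```python
# Sketch of the petrophysical/volumetric workflow: Archie water saturation and
# volumetric original gas in place (OGIP). All numeric inputs below are illustrative.
def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Archie's equation: Sw = ((a*Rw) / (phi^m * Rt))^(1/n)."""
    return ((a * rw) / (phi**m * rt)) ** (1.0 / n)

def ogip_scf(area_acres, h_ft, phi, sw, bg_rcf_per_scf):
    """Volumetric OGIP [scf] = 43,560 * A[acres] * h[ft] * phi * (1 - Sw) / Bg."""
    return 43560.0 * area_acres * h_ft * phi * (1.0 - sw) / bg_rcf_per_scf

sw = archie_sw(rw=0.07, rt=20.0, phi=0.24)     # ~0.25 with these placeholder inputs
gas = ogip_scf(area_acres=3000, h_ft=150, phi=0.24, sw=sw, bg_rcf_per_scf=0.004)
print(f"Sw = {sw:.2f}, OGIP = {gas / 1e9:.0f} BSCF")
```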

Keywords: 3D seismic data, well logging, petrel, kingdom suite

Procedia PDF Downloads 150
24801 Analysis of Spatial and Temporal Data Using Remote Sensing Technology

Authors: Kapil Pandey, Vishnu Goyal

Abstract:

Spatial and temporal data analysis is well known in the field of satellite image processing. When spatial data are combined with time series analysis, they give significant results in change detection studies. In this paper, GIS and remote sensing techniques have been used to detect change using time series satellite imagery of Uttarakhand state over the years 1990-2010. Natural vegetation, urban area, forest cover, etc. were chosen as the main land-use classes to study. Land-use/land-cover classes for several years were prepared using satellite images. The maximum likelihood supervised classification technique was adopted in this work, and finally a land-use change index was generated and graphical models were used to present the changes.
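
As a companion to the change-detection step, the sketch below derives a land-use transition matrix and a simple change index from two classified rasters; the class codes and the tiny example arrays are hypothetical stand-ins for full classified scenes of 1990 and 2010.

```python
# Sketch of post-classification change detection between two classified land-use rasters.
import numpy as np

CLASSES = {1: "natural vegetation", 2: "urban area", 3: "forest cover"}

def change_matrix(lc_t1, lc_t2, classes):
    """Rows = class at date 1, columns = class at date 2 (pixel counts)."""
    k = len(classes)
    m = np.zeros((k, k), dtype=int)
    for i, c1 in enumerate(classes):
        for j, c2 in enumerate(classes):
            m[i, j] = np.sum((lc_t1 == c1) & (lc_t2 == c2))
    return m

def change_index(m):
    """Fraction of pixels whose class changed between the two dates."""
    return 1.0 - np.trace(m) / m.sum()

lc_1990 = np.array([[1, 1, 3], [1, 2, 3], [3, 3, 3]])   # hypothetical classified scene
lc_2010 = np.array([[2, 1, 3], [2, 2, 3], [3, 2, 3]])
m = change_matrix(lc_1990, lc_2010, sorted(CLASSES))
print(m, f"change index = {change_index(m):.2f}")
```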

Keywords: GIS, landuse/landcover, spatial and temporal data, remote sensing

Procedia PDF Downloads 433
24800 Bioassay Guided Isolation of Cytotoxic and Antimicrobial Components from Ethyl Acetate Extracts of Cassia sieberiana D.C. (Fabaceae)

Authors: Sani Abubakar, Oumar Al-Mubarak Adoum

Abstract:

The leaf extracts of Cassia sieberiana D.C. were screened for antimicrobial activity against Staphylococcus aureus, Salmonella typhi, and Escherichia coli and for cytotoxicity using the Brine Shrimp Test (BST). The crude ethanol extract and the chloroform-, aqueous-, ethyl acetate-, methanol-, and n-hexane-soluble fractions were tested for antimicrobial activity and cytotoxicity. The ethyl acetate fraction proved to be the most active, inducing complete lethality at minimum doses in the BST and also being active against Salmonella typhi. The bioactivity results were used to guide the column chromatography, which led to the isolation of the pure compound CSB-8. This compound was active in the BST, with an LC₅₀ value of 34 (722-182) µg/ml, and showed remarkable activity against Salmonella typhi (zone of inhibition 25 mm) at 10,000 µg/ml. The ¹H-NMR, ¹³C NMR, FTIR, and GC-MS spectra of the compound suggested the proposed structure to be 2-pentadecanone.

Keywords: antimicrobial bioassay, cytotoxicity, column chromatography, Cassia sieberiana D.C.

Procedia PDF Downloads 45
24799 Aspen Plus Simulation of Saponification of Ethyl Acetate in the Presence of Sodium Hydroxide in a Plug Flow Reactor

Authors: U. P. L. Wijayarathne, K. C. Wasalathilake

Abstract:

This work presents the modelling and simulation of the saponification of ethyl acetate in the presence of sodium hydroxide in a plug flow reactor using the Aspen Plus simulation software. Plug flow reactors are widely used in industry due to their non-mixing property. The use of plug flow reactors becomes significant when there is a need for a continuous large-scale reaction or a fast reaction. Plug flow reactors have a high volumetric unit conversion, as the occurrence of side reactions is minimal. In this research, Aspen Plus V8.0 was successfully used to simulate the plug flow reactor. In order to simulate the process as accurately as possible, the HYSYS Peng-Robinson EOS package was used as the property method. The results obtained from the simulation were verified against the experiment carried out in the EDIBON plug flow reactor module. The correlation coefficient (r²) was 0.98, which showed that the simulation results satisfactorily fit the experimental model. The developed model can be used as a guide for understanding the reaction kinetics of a plug flow reactor.
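
For context, the ideal plug-flow model being simulated reduces, for an equimolar NaOH/ethyl acetate feed, to the standard second-order design equation dX/dV = k*Ca0*(1-X)^2/v0; a numerical sketch is shown below. The rate constant, concentrations, flow rate and reactor volume are illustrative assumptions, not the operating values of the study.

```python
# Sketch of the ideal plug-flow model for NaOH + ethyl acetate saponification
# (second order overall: first order in each reactant, fed equimolar here).
from scipy.integrate import solve_ivp

k = 0.11         # L/(mol*s), assumed rate constant near room temperature
ca0 = 0.05       # mol/L, inlet NaOH = inlet ethyl acetate concentration (assumed)
v0 = 0.5 / 60    # L/s, volumetric flow rate (assumed)
V_reactor = 0.4  # L, reactor volume (assumed)

def dX_dV(V, X):
    # PFR design equation for equimolar feed: dX/dV = k*Ca0*(1-X)^2 / v0
    return [k * ca0 * (1.0 - X[0]) ** 2 / v0]

sol = solve_ivp(dX_dV, (0.0, V_reactor), [0.0])
X_out = sol.y[0, -1]

# Analytical check for a second-order, equimolar feed: X = k*Ca0*tau / (1 + k*Ca0*tau)
tau = V_reactor / v0
X_analytical = k * ca0 * tau / (1.0 + k * ca0 * tau)
print(f"outlet conversion: numerical {X_out:.3f}, analytical {X_analytical:.3f}")
```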

Keywords: aspen plus, modelling, plug flow reactor, simulation

Procedia PDF Downloads 602
24798 An Empirical Investigation of the Challenges of Secure Edge Computing Adoption in Organizations

Authors: Hailye Tekleselassie

Abstract:

Edge computing is a distributed computing paradigm that brings enterprise applications closer to data sources such as IoT devices or local edge servers, but potential incidents can hinder the adoption of these new technologies. This investigation was carried out to study the awareness of information and communications technology workers and computer users who rely on cloud services. Surveys were used to achieve these objectives. Questions about trust are also key. Problems such as data privacy, integrity, and availability are the factors affecting organizations' acceptance of cloud and edge services.

Keywords: IoT, data, security, edge computing

Procedia PDF Downloads 83
24797 Software Verification of Systematic Resampling for Optimization of Particle Filters

Authors: Osiris Terry, Kenneth Hopkinson, Laura Humphrey

Abstract:

Systematic resampling is the most popular resampling method in particle filters. This paper seeks to further the understanding of systematic resampling by defining a formula made up of variables from the sampling equation and the particle weights. The formula is then verified via SPARK, a software verification language. The verified systematic resampling formula states that the minimum/maximum number of possible samples taken of a particle is equal to the floor/ceiling value of the particle weight divided by the sampling interval, respectively. This allows for the creation of a randomness spectrum within which each resampling method falls. Methods on the lower end, e.g., systematic resampling, have less randomness and are thus quicker to reach an estimate. Although lower randomness allows for error through a larger bias towards the size of the weight, this bias creates vulnerabilities to noise in the environment, e.g., jamming. In conclusion, this is the first step in characterizing each resampling method, which will allow target-tracking engineers to pick the best resampling method for their environment instead of choosing the most popularly used one.
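
To make the verified relationship concrete, here is a short sketch of systematic resampling; the number of copies of each particle it returns can be checked against the floor/ceiling of the weight divided by the sampling interval 1/N. The example weights are arbitrary.

```python
# Sketch of systematic resampling; particle i receives between floor(w_i/u) and
# ceil(w_i/u) copies, where u = 1/N is the sampling interval (the verified property).
import numpy as np

def systematic_resample(weights, rng=np.random.default_rng()):
    n = len(weights)
    u = 1.0 / n                                      # sampling interval
    positions = (rng.random() + np.arange(n)) * u    # one random offset, then strided
    cumulative = np.cumsum(weights)
    idx = np.searchsorted(cumulative, positions)
    return np.minimum(idx, n - 1)                    # guard against round-off near 1.0

w = np.array([0.50, 0.26, 0.14, 0.07, 0.03])
counts = np.bincount(systematic_resample(w), minlength=len(w))
u = 1.0 / len(w)
print(counts)                              # copies drawn of each particle
print(np.floor(w / u), np.ceil(w / u))     # the floor/ceiling bounds
```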

Keywords: SPARK, software verification, resampling, systematic resampling, particle filter, tracking

Procedia PDF Downloads 84
24796 Multi Tier Data Collection and Estimation, Utilizing Queue Model in Wireless Sensor Networks

Authors: Amirhossein Mohajerzadeh, Abolghasem Mohajerzadeh

Abstract:

In this paper, a target parameter is estimated with desirable precision in hierarchical wireless sensor networks (WSN), while the proposed algorithm also tries to prolong network lifetime as much as possible using an efficient data collection algorithm. The target parameter distribution function is considered unknown. Sensor nodes sense the environment and send the data to the base station, called the fusion center (FC), using a hierarchical data collection algorithm. The FC reconstructs the underlying phenomena based on the collected data. Considering the aggregation level, x, the goal is to provide the essential infrastructure to find the best value for the aggregation level in order to prolong network lifetime as much as possible while desirable accuracy is guaranteed (the required sample size fully depends on the desired precision). First, the sample size calculation algorithm is discussed; second, the average queue length is determined based on an M/M[x]/1/K queue model and used for the energy consumption calculation. Nodes can decrease transmission cost by aggregating incoming data. Furthermore, the performance of the new algorithm is evaluated in terms of lifetime and estimation accuracy.
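
For the sample-size step, a common Gaussian-based rule links the number of samples to the desired precision: n >= (z_{alpha/2} * sigma / d)^2, where d is the target margin of error. Whether the paper uses exactly this rule is an assumption; the sigma and d values below are placeholders.

```python
# Illustrative sample-size rule: estimate a mean to within margin d at confidence 1 - alpha.
import math
from scipy.stats import norm

def required_sample_size(sigma, d, alpha=0.05):
    z = norm.ppf(1.0 - alpha / 2.0)           # two-sided Gaussian critical value
    return math.ceil((z * sigma / d) ** 2)

print(required_sample_size(sigma=4.0, d=0.5))  # 246 samples for these placeholder values
```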

Keywords: aggregation, estimation, queuing, wireless sensor network

Procedia PDF Downloads 186
24795 Research and Application of Consultative Committee for Space Data Systems Wireless Communications Standards for Spacecraft

Authors: Cuitao Zhang, Xiongwen He

Abstract:

In view of the new requirements of future spacecraft, such as networking, modularization and cable-free (non-cable) design, this paper studies the CCSDS wireless communications standards and focuses on low data-rate wireless communications for spacecraft monitoring and control. The application fields and advantages of wireless communications are analyzed. Wireless communications technology has significant advantages in reducing the weight of the spacecraft, saving time in spacecraft integration, etc. Based on this technology, a scheme for a spacecraft data system is put forward. The corresponding block diagram and the key wireless interface design of the spacecraft data system are given. The design proposal for the wireless node and the information flow of the spacecraft are also analyzed. The results show that the wireless communications scheme is reasonable and feasible and that wireless communications technology can meet future spacecraft demands in networking, modularization and non-cable design.

Keywords: Consultative Committee for Space Data Systems (CCSDS) standards, information flow, non-cable, spacecraft, wireless communications

Procedia PDF Downloads 329
24794 The Role of Hausa Oral Praise Singer in Conflict Management and Social Mobilization in Nigeria

Authors: Ladan Surajo

Abstract:

Nigeria, as a third-world country, has many people who cannot read and write, which constitutes a stumbling block to modern means of communication. It is a well-known fact that Nigeria is a heterogeneous country, with an estimated 450 or more ethnic groups communicating in divergent languages. Despite this scenario, the English, Hausa, Igbo and Yoruba languages are predominantly used in the country. Apart from English, Hausa has the widest coverage of usage among the indigenous languages in Nigeria; its importance in the areas of social mobilization and conflict management therefore cannot be overemphasized. Hausa oral singers deploy their artistic and God-endowed talents through singing to mobilize and sensitize local communities about government programmes and the ills of other social problems of the society. It is the belief of this researcher that, if used properly, Hausa oral singers will assist immensely in reducing some social ills of the society in Nigeria to the barest minimum, more so as music is the food of the heart and has a resounding impact in changing the behaviour of individuals and groups.

Keywords: oral, singers, praise, social mobilization, conflict management

Procedia PDF Downloads 462
24793 Inversion of Electrical Resistivity Data: A Review

Authors: Shrey Sharma, Gunjan Kumar Verma

Abstract:

High-density electrical prospecting has been widely used in groundwater investigation, civil engineering and environmental surveys. For efficient inversion, the forward modeling routine, sensitivity calculation, and inversion algorithm must be efficient. This paper attempts to provide a brief summary of the past and ongoing developments of the method. It includes reviews of the procedures used for data acquisition, processing and inversion of electrical resistivity data based on a compilation of the academic literature. In recent times there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D and even 4-D surveys efficiently to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method has also been presented.
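
The linearized least-squares inversion mentioned above boils down to a damped (regularized) Gauss-Newton update, m_{k+1} = m_k + (J^T J + lam*I)^{-1} J^T (d_obs - f(m_k)); a generic numerical sketch follows. The tiny linear "forward model" is a placeholder standing in for a real resistivity forward solver and its sensitivity (Jacobian) calculation.

```python
# Sketch of one damped least-squares (regularized Gauss-Newton) iteration for inversion.
import numpy as np

def damped_least_squares_step(m, d_obs, forward, jacobian, lam=0.1):
    J = jacobian(m)                     # sensitivity matrix at the current model
    r = d_obs - forward(m)              # data residual
    dm = np.linalg.solve(J.T @ J + lam * np.eye(len(m)), J.T @ r)
    return m + dm

# Placeholder linear "forward model" (real apparent-resistivity modelling is nonlinear).
A = np.array([[1.0, 0.4], [0.3, 1.2], [0.8, 0.8]])
forward = lambda m: A @ m
jacobian = lambda m: A
d_obs = np.array([2.1, 3.0, 3.3])

m = np.zeros(2)
for _ in range(5):                      # a few iterations toward the best-fit model
    m = damped_least_squares_step(m, d_obs, forward, jacobian)
print(m)
```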

Keywords: inversion, limitations, optimization, resistivity

Procedia PDF Downloads 365
24792 Exploring the Correlation between Population Distribution and Urban Heat Island under Urban Data: Taking Shenzhen Urban Heat Island as an Example

Authors: Wang Yang

Abstract:

Shenzhen is a modern city born of China's reform and opening-up policy, and the development of its urban form has been shaped by the administration of the Chinese government. The city's planning paradigm is primarily affected by spatial structure and human behavior. The subjective urban agglomeration center is divided into several groups and centers; in examining this effect, the formally prescribed pattern of city development is better set aside. With the continuous development of the internet, big data technology has been introduced in China, and data mining and data analysis have become important tools in municipal research. Data mining has been utilized to improve data cleaning for inputs such as business data, traffic data and population data. Prior to data mining, government data were collected by traditional means and then analyzed through city-relationship research, which delayed the timeliness of urban analysis, especially for the contemporary city, whereas internet-based data are updated very quickly. The city's points of interest (POI) extracted in this study serve as a data source reflecting city design, while satellite remote sensing is used as a reference object; city analysis is conducted in both directions, the purely administrative paradigm is broken, and urban research is grounded in observed data. Therefore, the use of data mining in urban analysis is very important. The satellite remote sensing data of Shenzhen in July 2018 were measured by the MODIS sensor and can be utilized to perform land surface temperature inversion and to analyze the urban heat island distribution of Shenzhen. This article acquired and classified data from Shenzhen using data crawler technology. Data on the Shenzhen heat island and points of interest were simulated and analyzed on a GIS platform to discover the main features of the distribution of functionally equivalent areas. Shenzhen extends in an east-west direction, and the city's main streets are also laid out according to this direction of city development; it is therefore determined that the functional areas of the city are also distributed in the east-west direction. The urban heat island can be expressed as a heat map corresponding to the functional urban areas, and regional POIs show a matching pattern. The research results clearly show that the distribution of the urban heat island and the distribution of urban POIs are in one-to-one correspondence. The urban heat island is primarily influenced by the properties of the underlying surface, apart from the influence of the urban climate. Using urban POIs as the object of analysis, the distribution of municipal POIs and population aggregation are closely connected, so that the distribution of the population corresponds with the distribution of the urban heat island.
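
The final correlation step can be illustrated by comparing a gridded POI density against the MODIS-derived land-surface-temperature grid; the two small arrays below are hypothetical placeholders for the real Shenzhen grids.

```python
# Sketch: correlate gridded POI density with a land-surface-temperature (heat island) grid.
import numpy as np
from scipy.stats import pearsonr

poi_density = np.array([[120, 340, 80], [60, 510, 150], [30, 90, 40]])      # POIs per cell
lst_celsius = np.array([[33.1, 36.8, 32.0], [31.5, 38.2, 33.9], [30.2, 32.5, 30.9]])

r, p = pearsonr(poi_density.ravel(), lst_celsius.ravel())
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```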

Keywords: POI, satellite remote sensing, the population distribution, urban heat island thermal map

Procedia PDF Downloads 104
24791 Growth Performance and Critical Supersaturation of Heterogeneous Condensation for High Concentration of Insoluble Sub-Micron Particles

Authors: Jie Yin, Jun Zhang

Abstract:

Measuring the growth performance and critical supersaturation of a particle group has high reference value for constructing a supersaturated water vapor environment that can improve the removal efficiency of high-concentration particle groups. The critical supersaturation and the variation of growth performance with supersaturation for high-concentration particles were measured with a flow cloud chamber. The findings suggest that the influence of particle concentration on growth performance decreases as supersaturation increases. Reducing the residence time and increasing the particle concentration have similar effects on the growth performance of a high-concentration particle group. Increasing the particle concentration and shortening the residence time both increase the critical supersaturation of the particle group. The critical supersaturation required to activate a high-concentration particle group is lower than that of a single particle when the minimum particle size in the group is the same as that of the single particle.
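
As background for the size dependence noted above, the critical supersaturation of a fully wettable insoluble particle is often first approximated by the Kelvin equation evaluated at the particle diameter, S_crit = exp(4*sigma*M / (rho*R*T*d_p)); a sketch follows. Fletcher's heterogeneous-nucleation theory would refine this with a contact-angle factor, and using this estimate alongside the abstract's observations is an assumption for illustration only.

```python
# Kelvin-equation estimate of critical supersaturation for a wettable particle of diameter d_p.
import numpy as np

SIGMA = 0.072    # N/m, surface tension of water near 25 C
M_W   = 0.018    # kg/mol, molar mass of water
RHO_W = 1000.0   # kg/m^3, density of liquid water
R_GAS = 8.314    # J/(mol*K)

def kelvin_critical_supersaturation(d_p, temperature=298.15):
    """Critical saturation ratio S_crit for a fully wettable particle of diameter d_p (m)."""
    return np.exp(4.0 * SIGMA * M_W / (RHO_W * R_GAS * temperature * d_p))

for d_nm in (50, 100, 500, 1000):
    s = kelvin_critical_supersaturation(d_nm * 1e-9)
    print(f"d = {d_nm:4d} nm -> S_crit ~ {s:.3f} ({(s - 1) * 100:.2f}% supersaturation)")
```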

Keywords: sub-micron particles, heterogeneous condensation, critical supersaturation, nucleation

Procedia PDF Downloads 157
24790 A Proposal of Ontology about Brazilian Government Transparency Portal

Authors: Estela Mayra de Moura Vianna, Thiago José Tavares Ávila, Bruno Morais Silva, Diego Henrique Bezerra, Paulo Henrique Gomes Silva, Alan Pedro da Silva

Abstract:

The Brazilian Federal Constitution defines access to information as a crucial right of the citizen, a right regulated by the Law on Access to Public Information. Accordingly, the Fiscal Responsibility Act of 2000, amended in 2009 by the "Law of Transparency", began demanding wider disclosure of public accounts to society, including through electronic media for public access. Thus, public entities began to create "Transparency Portals", which aim to gather a diversity of data and information. However, this information, in general, is still published in formats that do not make the data easy for citizens to understand and that could be made more readily usable, especially for audit purposes. In this context, a proposed ontology of the Brazilian Transparency Portal can play a key role in making these data more accessible. This study aims to identify and implement, in an ontology, the data model of the Transparency Portal ecosystem, with emphasis on activities that use these data for applications such as audits, press activities, social control of government, and others.

Keywords: audit, government transparency, ontology, public sector

Procedia PDF Downloads 506
24789 Study of the Antimicrobial Activity of Aminoreductone against Pathogenic Bacteria in Comparison with Other Antibiotics

Authors: Vu Thu Trang, Lam Xuan Thanh, Samira Sarter, Tomoko Shimamura, Hiroaki Takeuchi  

Abstract:

The antimicrobial activity of aminoreductone (AR), a product formed in the initial stage of the Maillard reaction, was screened against pathogenic bacteria. Significant growth inhibition by AR against all 7 isolates (Staphylococcus aureus ATCC® 25923™, Salmonella Typhimurium ATCC® 14028™, Bacillus cereus ATCC® 13061™, Bacillus subtilis ATCC® 11774™, Escherichia coli ATCC® 25922™, Enterococcus faecalis ATCC® 29212™, Listeria innocua ATCC® 33090™) was observed by the standard disc diffusion method. The inhibition zone for each isolate with AR (2.5 mg) ranged from 15±0 mm to 28.3±0.4 mm in diameter. The minimum inhibitory concentration (MIC) of AR ranged from 20 mM to 26 mM across the seven isolates tested. AR also showed a similar growth-inhibition effect in comparison with antibiotics frequently used for the treatment of bacterial infections, such as amikacin, ciprofloxacin, meropenem, and levofloxacin. The results indicate that foods containing AR are valuable sources of bioactive compounds against pathogenic bacteria.

Keywords: pathogenic bacteria, aminoreductone, Maillard reaction, antimicrobial activity

Procedia PDF Downloads 384
24788 Design and Development of Data Mining Application for Medical Centers in Remote Areas

Authors: Grace Omowunmi Soyebi

Abstract:

Data mining is the extraction of information from a large database, which helps in predicting a trend or behavior, thereby helping management make knowledge-driven decisions. One principal problem of most hospitals in rural areas is the use of a paper file management system for keeping records. A lot of time is wasted when a patient visits the hospital, possibly in an emergency, and the nurse or attendant has to search through voluminous files before the patient's file can be retrieved; this delay may cause something unexpected to happen to the patient. This data mining application is to be designed using a structured systems analysis and design method, which will support a well-articulated analysis of the existing file management system, a feasibility study, and proper documentation of the design and implementation of a computerized medical record system. This computerized system will replace the file management system and help to retrieve a patient's record easily, with increased data security, provide access to clinical records for decision-making, and reduce the time it takes for a patient to be attended to.

Keywords: data mining, medical record system, systems programming, computing

Procedia PDF Downloads 209
24787 Computational Fluid Dynamics and Experimental Evaluation of Two Batch Type Electrocoagulation Stirred Tank Reactors Used in the Removal of Cr (VI) from Waste Water

Authors: Phanindra Prasad Thummala, Umran Tezcan Un

Abstract:

In this study, a hydrodynamic analysis of two batch-type electrocoagulation stirred tank reactors, used for the electrocoagulation treatment of Cr(VI) wastewater, was carried out using computational fluid dynamics (CFD). The aim of the study was to evaluate the impact of mixing characteristics on the overall performance of the electrocoagulation reactor. The CFD simulations were performed using the ANSYS FLUENT 14.4 software. The mixing performance of each reactor was evaluated by numerically modelling tracer dispersion in each reactor configuration. Tracer dispersion was assumed to be uniform when the ratio of the minimum to maximum tracer concentration reached 90%. In parallel, an experimental evaluation of both electrocoagulation reactors for the removal of Cr(VI) from wastewater was also carried out. The results of the CFD and experimental analyses clearly show that the reactor which achieves higher uniformity in less time performs better as an electrocoagulation reactor for the removal of Cr(VI) from wastewater.
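
The mixing-time criterion used in the CFD analysis can be evaluated from monitoring-point tracer histories as sketched below; the time series are hypothetical stand-ins for values exported from the ANSYS FLUENT solution.

```python
# Sketch of the uniformity criterion: mixing is complete once the minimum-to-maximum
# tracer concentration ratio across monitoring points reaches 90%.
import numpy as np

def mixing_time(times, conc, threshold=0.90):
    """conc: array of shape (n_times, n_points) of tracer concentration."""
    ratio = conc.min(axis=1) / conc.max(axis=1)
    reached = np.nonzero(ratio >= threshold)[0]
    return times[reached[0]] if reached.size else None

times = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])   # s (hypothetical sampling instants)
conc = np.array([[0.20, 1.80, 0.60],
                 [0.50, 1.50, 0.80],
                 [0.80, 1.20, 0.90],
                 [0.92, 1.05, 0.95],
                 [0.96, 1.02, 0.98],
                 [0.99, 1.01, 1.00]])
print(f"90% uniformity reached at t = {mixing_time(times, conc)} s")
```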

Keywords: CFD, stirred tank reactors, electrocoagulation, Cr(VI) wastewater

Procedia PDF Downloads 462
24786 A Comprehensive Framework to Ensure Data Security in Cloud Computing: Analysis, Solutions, and Approaches

Authors: Loh Fu Quan, Fong Zi Heng, Burra Venkata Durga Kumar

Abstract:

Cloud computing has completely transformed the way many businesses operate. Traditionally, the confidential data of a business is stored on computers located within the premises of the business, so a lot of business capital is put towards maintaining computing resources and hiring IT teams to manage them. The advent of cloud computing changes everything. Instead of purchasing and managing their own infrastructure, many businesses have started to shift towards working with the cloud with the help of a cloud service provider (CSP), leading to cost savings. However, it also introduces security risks. This research paper focuses on the security risks that arise during data migration and user authentication in cloud computing. To overcome this problem, this paper provides a comprehensive framework that includes Transport Layer Security (TLS), user authentication, security tokens and multi-level data encryption. This framework aims to prevent unauthorized access to cloud resources and data leakage, ensuring the confidentiality of sensitive information. It can be used by cloud service providers to strengthen the security of their clouds and instil confidence in their users.
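
A minimal sketch of three of the layers named in the framework (TLS transport security, token-based authentication, and two-level "envelope" encryption) is given below. The composition is illustrative and assumes the Python cryptography package for symmetric encryption; it is not the paper's reference implementation.

```python
# Illustrative layering of TLS, an HMAC-signed access token, and two-level encryption
# of a payload before it leaves the client.
import hashlib, hmac, secrets, ssl
from cryptography.fernet import Fernet

# 1. Transport security: require TLS 1.2+ with certificate verification.
tls_context = ssl.create_default_context()
tls_context.minimum_version = ssl.TLSVersion.TLSv1_2

# 2. Authentication: issue an HMAC-signed token for an authenticated user.
SERVER_SECRET = secrets.token_bytes(32)
def issue_token(user_id: str) -> str:
    payload = f"{user_id}:{secrets.token_urlsafe(8)}"
    sig = hmac.new(SERVER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{sig}"

# 3. Multi-level encryption: a data key encrypts the payload, and a master key
#    wraps the data key, so the CSP never sees either in the clear.
master_key = Fernet.generate_key()
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"confidential business record")
wrapped_key = Fernet(master_key).encrypt(data_key)

print(issue_token("alice"), len(ciphertext), len(wrapped_key))
```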

Keywords: cloud computing, cloud security, cloud security issues, cloud security framework

Procedia PDF Downloads 121