Search results for: linked data
25627 Manual Pit Emptiers and Their Health: Profiles, Determinants and Interventions
Authors: Ivy Chumo, Sheillah Simiyu, Hellen Gitau, Isaac Kisiangani, Caroline Kabaria, Kanyiva Muindi, Blessing Mberu
Abstract:
The global sanitation workforce bridges the gap between sanitation infrastructure and the provision of sanitation services through essential public service work. Manual pit emptiers often perform this work at the cost of their dignity, safety, and health, as it requires repeated heavy physical activities such as lifting, carrying, pulling, and pushing. This exposes them to occupational and environmental health hazards, risking illness, injury, and death. This study extends previous work by presenting occupational health risks and suggestions for improvement in informal settlements of Nairobi, Kenya. It is a qualitative study conducted among sanitation stakeholders in the Korogocho, Mukuru and Kibera informal settlements in Nairobi. Data were captured using digital voice recorders, transcribed and thematically analysed. The discussion notes were further supported by observational notes made during the interviews. These formed the basis for a robust picture of the occupational health of manual pit emptiers: a lack or inappropriate use of protective clothing and prolonged working hours were described as contributing to occupational health hazards. To continue working, manual pit emptiers had devised coping strategies, which include working in groups, improvising protective clothing, sharing the available protective clothing, working at night and consuming alcoholic drinks while at work. Many of these strategies are detrimental to their health. Occupational health hazards among pit emptiers stem in part from a lack of collaboration amongst the stakeholders responsible for their health, safety and provision of PPE. Collaboration amongst sanitation stakeholders is paramount for health and safety and for ensuring the provision and use of personal protective devices.
Keywords: sanitation, occupational health, manual emptiers, informal settlements
Procedia PDF Downloads 206
25626 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking
Authors: Trevor Toy, Josef Langerman
Abstract:
Around a quarter of the world’s data is generated by the financial sector, with global non-cash transactions estimated to have reached 708.5 billion over the period beginning in 2018. With Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders to openly, legitimately and consensually share the data required to enable it. Integration and sharing of anonymised transactional data are still operated in silos and centralised between the large corporate entities in the ecosystem that have the resources to do so. Smaller fintechs generating data and businesses looking to consume data are largely excluded from the process. There is therefore a growing demand for accessible transactional data for analytical purposes and to support the rapid global adoption of Open Banking. This research provides a solution framework that aims to deliver a secure decentralised marketplace for (1) data providers to list their transactional data, (2) data consumers to find and access that data, and (3) data subjects (the individuals making the transactions that generate the data) to manage and sell the data that relates to themselves. The platform also provides an integrated system for downstream transaction-related data from merchants, enriching the data product available to build a comprehensive view of a data subject’s spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers. This core component of the platform is developed on a decentralised blockchain contract, with a market layer that manages transaction, user, pricing, payment, tagging, contract, control, and lineage features pertaining to user interactions on the platform. One of the platform’s key features is enabling the participation and management of personal data by the individuals from whom the data is generated. The framework was demonstrated with a proof-of-concept on the Ethereum blockchain, where an individual can securely manage access to their own personal data and to that individual’s identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour in correlation with key demographic information. This platform solution can ultimately support the growth, prosperity, and development of economies, businesses, communities, and individuals by providing accessible and relevant transactional data for big data analytics and open banking.
Keywords: big data markets, open banking, blockchain, personal data management
Procedia PDF Downloads 75
25625 Experimental Evaluation of Succinct Ternary Tree
Authors: Dmitriy Kuptsov
Abstract:
Tree data structures, such as binary or, in general, k-ary trees, are essential in computer science. The applications of these data structures range from data search and retrieval to sorting and ranking algorithms. Naive implementations of these data structures can consume prohibitively large volumes of random access memory, limiting their applicability in certain solutions. In such cases, a more advanced representation of these data structures is essential. In this paper we present the design of a compact version of the ternary tree data structure and demonstrate the results of an experimental evaluation using the static dictionary problem. We compare these results with the results for binary and regular ternary trees. The evaluation shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experimental evaluation) than a regular ternary tree and in certain configurations shows performance comparable to regular ternary trees. We have evaluated the performance of the algorithms on both 32- and 64-bit operating systems.
Keywords: algorithms, data structures, succinct ternary tree, performance evaluation
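The abstract does not detail the encoding, so the following is only a plausible sketch of one common succinct layout: a LOUDS-style level-order bitmap for a fixed-arity (ternary) tree, with a plain prefix-sum array standing in for the o(n)-bit rank directory a truly succinct design would use.

```python
from collections import deque

class SuccinctTernaryTree:
    """Level-order bitmap encoding of a ternary tree (LOUDS-style sketch).

    Each node contributes 3 bits marking the presence of its
    left / middle / right child; labels are stored in BFS order.
    A real succinct structure replaces the prefix-sum rank array
    below with an o(n)-bit rank directory.
    """

    def __init__(self, root):
        self.bits, self.labels = [], []
        queue = deque([root])
        while queue:
            node = queue.popleft()
            self.labels.append(node["label"])
            for slot in ("left", "mid", "right"):
                child = node.get(slot)
                self.bits.append(1 if child else 0)
                if child:
                    queue.append(child)
        # prefix sums give O(1) rank queries (not succinct, for clarity)
        self.rank = [0]
        for b in self.bits:
            self.rank.append(self.rank[-1] + b)

    def child(self, i, slot):
        """BFS index of child `slot` (0=left, 1=mid, 2=right) of node i."""
        pos = 3 * i + slot
        return self.rank[pos + 1] if self.bits[pos] else None

# usage: root with a left child 'a' and a middle child 'b'
tree = SuccinctTernaryTree(
    {"label": "r", "left": {"label": "a"}, "mid": {"label": "b"}}
)
assert tree.labels[tree.child(0, 0)] == "a"
assert tree.labels[tree.child(0, 1)] == "b"
```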
Procedia PDF Downloads 167
25624 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement
Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti
Abstract:
Data centers have grown continuously in size and demand over the last two decades. Planning for the deployment of resources has been shallow and has usually resorted to over-provisioning: data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted, with little thought, is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories. We then establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing
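The abstract does not enumerate its models; as one hedged illustration of the title's technique, gradient boosting with a quantile loss (a standard scikit-learn facility) can forecast an upper quantile of usage, so that provisioning to the forecast bounds the SLA-violation probability. The trace, lag features and quantile below are assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# synthetic CPU-usage trace: daily cycle plus noise (stand-in for real telemetry)
t = np.arange(2000)
usage = 50 + 30 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 5, t.size)

# lagged usage values as features -> predict the next interval
lags = 6
X = np.column_stack([usage[i:-(lags - i)] for i in range(lags)])
y = usage[lags:]

# upper-quantile model: provisioning to the 95th percentile keeps the
# SLA-violation probability near 5% without full over-provisioning
model = GradientBoostingRegressor(loss="quantile", alpha=0.95, n_estimators=200)
model.fit(X[:-200], y[:-200])

pred = model.predict(X[-200:])
print("fraction of demand exceeding the 95th-percentile forecast:",
      np.mean(y[-200:] > pred))
```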
Procedia PDF Downloads 111
25623 Prosperous Digital Image Watermarking Approach by Using DCT-DWT
Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar
Abstract:
Every day, tons of data are embedded in digital media or distributed over the internet. The data are so widely distributed that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information, known as a watermark, into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image or text impressed onto paper that provides evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this project, we have concentrated on embedding watermarks in images. The main consideration for any watermarking scheme is its robustness to various attacks.
Keywords: watermarking, digital, DCT-DWT, security
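The embedding rule itself is not given in the abstract; the sketch below shows one common DWT-DCT hybrid variant with assumed parameters (Haar wavelet, additive embedding of strength alpha into DCT coefficients of the LL sub-band), using NumPy, PyWavelets and SciPy. Detection in this additive scheme would compare against the unmarked coefficients.

```python
import numpy as np
import pywt
from scipy.fft import dctn, idctn

def embed_watermark(host, wm_bits, alpha=10.0):
    """Hybrid DWT-DCT embedding sketch (additive, non-blind).

    host    : 2-D float array (grayscale image)
    wm_bits : 1-D array of {0, 1} watermark bits
    """
    # one-level Haar DWT; embed in the LL sub-band for robustness
    LL, (LH, HL, HH) = pywt.dwt2(host, "haar")
    coeffs = dctn(LL, norm="ortho")

    # write bits into a band of low/mid-frequency coefficients
    rows, cols = np.unravel_index(
        np.arange(16, 16 + wm_bits.size), coeffs.shape)
    coeffs[rows, cols] += alpha * (2 * wm_bits - 1)  # bit 1 -> +alpha, 0 -> -alpha

    LL_marked = idctn(coeffs, norm="ortho")
    return pywt.idwt2((LL_marked, (LH, HL, HH)), "haar")

host = np.random.default_rng(1).uniform(0, 255, (256, 256))
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])
marked = embed_watermark(host, bits)
print("max pixel change:", np.abs(marked - host).max())
```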
Procedia PDF Downloads 426
25622 Machine Learning Data Architecture
Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap
Abstract:
Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and the vending of model output into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort into building the other components of the architecture, which we do not believe is the best use of scientists’ bandwidth. We propose a system architecture, created using AWS services, that brings industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists’ work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.
Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning
Procedia PDF Downloads 70
25621 Impact Analysis of Quality Control Practices in Veterinary Diagnostic Labs in Lahore, Pakistan
Authors: Faiza Marrium, Masood Rabbani, Ali Ahmad Sheikh, Muhammad Yasin Tipu, Javed Muhammad, Sohail Raza
Abstract:
More than 75% of the diseases that have spread in the human population globally over the past 10 years are linked to the veterinary sector. Veterinary diagnostic labs are a powerful ally for the diagnosis, prevention and monitoring of animal diseases in any country. In order to avoid the detrimental effects of errors in disease diagnosis and biorisk management, there is a dire need to establish quality control systems. In the current study, 3 private and 6 public sector veterinary diagnostic labs were selected for survey. A questionnaire based on the biorisk management guidelines of CWA 15793 was designed to find quality control breaches in lab design, personnel, equipment and consumables, quality control measures adopted in the lab, waste management, environmental monitoring and customer care. The data were analyzed statistically by frequency distribution using SPSS version 18.0. A non-significant difference was found in all parameters of lab design, personnel, equipment and consumables, quality control measures adopted in the lab, waste management, environmental monitoring and customer care, with average percentages of 46.6, 57.77, 52.7, 55.5, 54.44, 48.88 and 60, respectively. A non-significant difference among all nine labs was found, with the highest average compliance percentages across all parameters being Lab 2 (78.13), Lab 3 (70.56), Lab 5 (57.51), Lab 6 (56.37), Lab 4 (55.02), Lab 9 (49.58), Lab 7 (47.76), Lab 1 (41.01) and Lab 8 (36.09). This study shows that veterinary diagnostic labs in Lahore district are not giving proper attention to the quality of their systems, and there is no significant difference between the setups of private and public sector laboratories. These results show that most parameters fall between 50 and 80 percent, which needs work and improvement as per WHO criteria.
Keywords: veterinary lab, quality management system, accreditation, regulatory body, disease identification
Procedia PDF Downloads 149
25620 Securing Land Rights for Food Security in Africa: An Appraisal of Links Between Smallholders’ Land Rights and the Right to Adequate Food in Ethiopia
Authors: Husen Ahmed Tura
Abstract:
There are strong links between secure land rights and food security in Africa. However, as land is owned by governments, land users do not have adequate legislative protection. This article explores normative and implementation gaps in relation to small-scale farmers’ land rights under Ethiopian law. It finds that the law facilitates the eviction of small-scale farmers and indigenous peoples from their land without adequate alternative means of livelihood. It argues that, as access to land and other natural resources is strongly linked to the right to adequate food, Ethiopia should reform its land laws in the light of its legal obligations under international human rights law to respect, protect and fulfill the right to adequate food and ensure freedom from hunger.
Keywords: smallholder, secure land rights, food security, right to food, land grabbing, forced evictions
Procedia PDF Downloads 312
25619 Study of COVID-19 Intensity Correlated with Specific Biomarkers and Environmental Factors
Authors: Satendra Pal Singh, Dalip Kr. Kakru, Jyoti Mishra, Rajesh Thakur, Tarana Sarwat
Abstract:
COVID-19 remains an enigma as far as morbidity and mortality are concerned. The rate of recovery varies from person to person and depends upon the accessibility of the healthcare system and the roles played by physicians and caregivers. It is envisaged that, with the passage of time, people will become immune to this virus, and those who are vulnerable will sustain themselves with the help of vaccines. The proposed study examines how the severity of COVID-19 is associated with specific biomarkers and how these correlate with age and gender. We will assess the overall homeostasis of persons who were affected by coronavirus infection and also of those who recovered from it. Some people show severe effects, while others show very mild symptoms even though they have low CT values. Thus far, it is unclear why the new strain of the virus has different effects on different people in terms of age, gender, and ABO blood type. According to the data, the fatality rate among those with heart disease was 10.5 percent, 7.3 percent among diabetics, and 6 percent among those already affected by other comorbidities. However, some COVID-19 cases are worse than others, and this is not fully explainable to date. Overall, the data show that the ABO blood group affects susceptibility to SARS-CoV-2 infection, while other studies also show phenotypic effects of the blood group related to COVID. It is an accepted fact that females have stronger immune systems than males, which may be related to the fact that females have two ‘X’ chromosomes, which might carry a more effective immunity-booster gene capable of protecting the female; specific sex hormones also induce a better immune response in a specific gender. This calls for in-depth analysis to gain insight into this dilemma. COVID-19 is still not fully characterized, and thus we are not very familiar with its biology, mode of infection, susceptibility, and overall viral load in the human body. How many virus particles are needed to infect a person? How do comorbidities contribute to coronavirus infection? Since the emergence of this virus in 2020, a large number of papers have been published, and vaccines have been prepared; but a large number of questions remain unanswered. Human susceptibility to infection by COVID-19 needs to be established in order to develop a better strategy to fight this virus. Our study will examine the impact of demography on the severity of COVID-19 infection and, at the same time, will look into gender-specific sensitivity to COVID-19 and the variation of different biochemical markers in COVID-19-positive patients. In addition, we will study the correlation, if any, between COVID severity and ABO blood group type, and the occurrence of the most common blood group type amongst positive patients.
Keywords: coronavirus, ABO blood group, age, gender
Procedia PDF Downloads 102
25618 A Comparison of Image Data Representations for Local Stereo Matching
Authors: André Smith, Amr Abdel-Dayem
Abstract:
The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine the effectiveness of these data representations in reducing the cost for the correct correspondence relative to other possible matches.
Keywords: colour data, local stereo matching, stereo correspondence, disparity map
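The paper's cost functions are not reproduced here; a minimal sketch of a standard local pipeline, a sum-of-absolute-differences (SAD) cost with winner-takes-all selection, shows where the choice of pixel representation enters (window size and disparity range are assumptions).

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sad_disparity(left, right, max_disp=32, window=5):
    """Winner-takes-all local stereo matching with an SAD cost.

    left, right: rectified grayscale images as 2-D float arrays.
    The per-pixel cost (sum of absolute differences over a window)
    is the component whose input representation the paper compares.
    """
    h, w = left.shape
    cost = np.full((h, w, max_disp), np.inf)
    for d in range(max_disp):
        diff = np.abs(left[:, d:] - right[:, : w - d])
        # uniform_filter averages over the window; scaling by the
        # window area turns the mean back into a sum (SAD)
        cost[:, d:, d] = uniform_filter(diff, size=window) * window ** 2
    return np.argmin(cost, axis=2)  # disparity = lowest-cost match

# usage with a synthetic shifted pair (true disparity of 4 pixels)
rng = np.random.default_rng(0)
left = rng.uniform(0, 1, (64, 96))
right = np.roll(left, -4, axis=1)
disp = sad_disparity(left, right, max_disp=16)
print("median disparity:", np.median(disp[:, 16:-16]))
```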
Procedia PDF Downloads 373
25617 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System
Authors: Karima Qayumi, Alex Norta
Abstract:
The rapid generation of high-volume and broadly varied data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purposes of developing their business. The recently decentralized data-management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.
Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)
Procedia PDF Downloads 436
25616 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design
Authors: Qing K. Zhu
Abstract:
Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates in one chip, following Moore’s law. Designers encounter numerous report files during design iterations using timing and noise analysis tools. This paper presents our work using data-mining techniques combined with HTML tables to extract and represent critical timing and noise data. When we apply this data-mining tool in real applications, running speed is important. The software employs table look-up techniques in the programming to achieve reasonable running speed, based on performance testing results. We added several advanced features for the application in one industry chip design.
Keywords: VLSI design, data mining, big data, HTML forms, web, VLSI, EDA, timing, noise
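The tool itself is not published in the abstract; as a hedged illustration, the sketch below parses a synthetic slack report (the line format is an assumption, since report formats vary by vendor), uses a dictionary table look-up per line for O(1) processing, and emits an HTML table of the violating paths.

```python
import re
from collections import defaultdict

# synthetic report lines standing in for a real STA tool's output
REPORT = """\
Path: clk_core -> u_alu/reg_q  slack -0.12 ns
Path: clk_core -> u_fpu/reg_m  slack  0.45 ns
Path: clk_io   -> u_pad/reg_z  slack -0.03 ns
"""

LINE_RE = re.compile(r"Path:\s+(\S+)\s+->\s+(\S+)\s+slack\s+(-?\d+\.\d+)")

# table look-up keyed by clock domain keeps per-line work O(1),
# which matters when reports run to millions of lines
by_clock = defaultdict(list)
for m in LINE_RE.finditer(REPORT):
    clock, endpoint, slack = m.group(1), m.group(2), float(m.group(3))
    if slack < 0:  # keep only violating (critical) paths
        by_clock[clock].append((endpoint, slack))

rows = "".join(
    f"<tr><td>{clk}</td><td>{ep}</td><td>{slack:.2f}</td></tr>"
    for clk, paths in sorted(by_clock.items())
    for ep, slack in sorted(paths, key=lambda p: p[1])
)
html = ("<table><tr><th>Clock</th><th>Endpoint</th>"
        f"<th>Slack (ns)</th></tr>{rows}</table>")
print(html)
```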
Procedia PDF Downloads 255
25615 Adverse Childhood Experiences (ACEs) and Later-Life Depression: Perceived Social Support as a Potential Protective Factor
Authors: E. Von Cheong, Carol Sinnott, Darren Dahly, Patricia M. Kearney
Abstract:
Introduction and Aim: Adverse childhood experiences (ACEs) are all too common and have been linked to poorer health and wellbeing across the life course. While the prevention of ACEs is a worthy goal, it is important that we also try to lessen the impact of ACEs for those who do experience them. This study aims to investigate associations between adverse childhood experiences (ACEs) and later-life depressive symptoms, and to explore whether perceived social support (PSS) moderates these. Method: We analysed baseline data from the Mitchelstown (Ireland) 2010-11 cohort involving 2047 men and women aged 50–69 years. Self-reported assessments included ACEs (Centers for Disease Control ACE questionnaire), PSS (Oslo Social Support Scale), and depressive symptoms (CES-D). The primary exposure was self-report of at least one ACE. We also investigated the effects of ACE exposure by the subtypes abuse, neglect, and household dysfunction. Associations between each of these exposures and depressive symptoms were estimated using logistic regression, adjusted for socio-demographic factors that were selected using the Directed Acyclic Graph (DAG) approach. We also tested whether the estimated associations varied across levels of PSS (poor, moderate, and good). Results: 23.7% of participants reported at least one ACE (95% CI: 21.9% to 25.6%). ACE exposures (overall or subtype) were associated with a higher odds of depressive symptoms, but only among individuals with poor PSS. For example, exposure to any ACE (vs. none) was associated with 3 times the odds of depressive symptoms (Adjusted OR 2.97; 95% CI 1.63 to 5.40) among individuals reporting poor PSS, while among those reporting moderate PSS, the adjusted OR was 1.18 (95% CI 0.72 to 1.94). Discussion: ACEs are common among older adults in Ireland and are associated with higher odds of later-life depressive symptoms among those also reporting poor PSS. Interventions that enhance perception of social support following ACE exposure may help reduce the burden of depression in older populations.
Keywords: adverse childhood experiences, depression, later-life, perceived social support
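A hedged sketch of the kind of model described, with effect modification expressed as an ACE x PSS interaction term, fitted on simulated stand-in data (variable names, coefficients and the adjustment set below are assumptions, not the cohort's actual specification):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# simulated stand-in for the Mitchelstown cohort variables
rng = np.random.default_rng(42)
n = 2047
df = pd.DataFrame({
    "ace": rng.integers(0, 2, n),                        # any ACE vs none
    "pss": rng.choice(["poor", "moderate", "good"], n),  # Oslo PSS category
    "age": rng.integers(50, 70, n),
    "sex": rng.choice(["m", "f"], n),
})
# inject the reported pattern: ACEs raise depression odds mainly under poor PSS
logit = -1.5 + 1.1 * (df.ace * (df.pss == "poor")) + 0.2 * df.ace
df["depressed"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

# the ACE x PSS interaction tests whether the association varies by
# support level; age and sex stand in for the DAG-selected covariates
model = smf.logit("depressed ~ ace * C(pss) + age + C(sex)",
                  data=df).fit(disp=False)
print(model.summary().tables[1])
```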
Procedia PDF Downloads 248
25614 Introduction of Electronic Health Records to Improve Data Quality in Emergency Department Operations
Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe
Abstract:
In its simplest form, data quality can be defined as ‘fitness for use’; it is a multi-dimensional concept. Emergency Departments (EDs) require information to treat patients and, on the other hand, are the primary source of information regarding accidents, injuries, emergencies, etc. They are also the starting point of various patient registries, databases and surveillance systems. This interventional study was carried out to improve data quality at the ED of the National Hospital of Sri Lanka (NHSL), the premier trauma care centre in Sri Lanka, by introducing an eHealth solution. The study consisted of three components. A research study was conducted to assess the quality of data in relation to five selected dimensions of data quality, namely accuracy, completeness, timeliness, legibility and reliability. The intervention was to develop and deploy an electronic emergency department information system (eEDIS). Post-intervention assessment confirmed that all five dimensions of data quality had improved, with the most significant improvements in the accuracy and timeliness dimensions.
Keywords: electronic health records, electronic emergency department information system, emergency department, data quality
Procedia PDF Downloads 278
25613 The Role of Financial Literacy in Driving Consumer Well-Being
Authors: Amin Nazifi, Amir Raki, Doga Istanbulluoglu
Abstract:
The incorporation of technological advancements into financial services, commonly referred to as Fintech, is primarily aimed at promoting services that are accessible, convenient, and inclusive, thereby benefiting both consumers and businesses. Fintech services employ a variety of technologies, including Artificial Intelligence (AI), blockchain, and big data, to enhance the efficiency and productivity of traditional services. Cryptocurrency, a component of Fintech, is projected to be a trillion-dollar industry, with over 320 million consumers globally investing in various forms of cryptocurrencies. However, these potentially transformative services can also lead to adverse outcomes. For instance, recent Fintech innovations have been increasingly linked to misconduct and disservice, resulting in serious implications for consumer well-being. This could be attributed to the ease of access to Fintech, which enables adults to trade cryptocurrencies, shares, and stocks via mobile applications. However, little is known about the darker aspects of technological advancements such as Fintech. Hence, this study aims to generate scholarly insights into the design of robust and resilient Fintech services that can add value to businesses and enhance consumer well-being. Using a mixed-method approach, the study will investigate the personal and contextual factors influencing consumers’ adoption and usage of technology innovations and their impacts on consumer well-being. First, semi-structured interviews will be conducted with a sample of Fintech users until theoretical saturation is achieved. Subsequently, based on the findings of the first study, a quantitative study will be conducted to develop and empirically test the impacts of these factors on consumers’ well-being using an online survey with a sample of 300 participants experienced in using Fintech services. This study will contribute to the growing Transformative Service Research (TSR) literature by addressing the latest priorities in service research and shedding light on the impact of Fintech services on consumer well-being.
Keywords: consumer well-being, financial literacy, Fintech, service innovation
Procedia PDF Downloads 72
25612 Data Presentation of Lane-Changing Events Trajectories Using HighD Dataset
Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba
Abstract:
We present a descriptive analysis of lane-changing events on multi-lane roads. The data come from the Highway Drone Dataset (highD), which provides microscopic vehicle trajectories recorded on highways. This paper describes and analyses the role of the different parameters and their significance. Using the highD data, we aim to find the most frequent reasons that motivate drivers to change lanes. We used the programming language R for the processing of these data. We analyze the involvement and relationship of the different variables for each parameter of the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration. This was studied according to the class of the vehicle (car or truck) and according to the maneuver it undertook (overtaking or falling back).
Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process
Procedia PDF Downloads 265
25611 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photon Beams in Varian Linear Accelerator
Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain
Abstract:
Objective: The main purpose of this study is to compare the percent depth dose (PDD) and in-plane and cross-plane profiles of Varian golden beam data to the measured data of 6 and 18 MV photons for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires an extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles and dose rate tables for open and wedged fields into the treatment planning system, enabling it to calculate the MUs and dose distribution. Varian offers a generic set of beam data as reference data, which is, however, not recommended for clinical use. In this study, we compared the generic beam data with the measured beam data to evaluate the reliability of the generic beam data for clinical purposes. Methods and Material: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian’s algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a Semiflex ion chamber and MEPHYSTO software. The online available Varian golden beam data were compared with the measured data to evaluate the accuracy of the golden beam data for the commissioning of the Eclipse treatment planning system. Results: The deviation between measured and golden beam data was at most 2%. In PDDs, the deviation increases at deeper depths. Similarly, the profiles show the same trend of increasing deviation at larger field sizes and greater depths. Conclusion: The study shows that the percentage deviation between measured and golden beam data is within the acceptable tolerance and can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.
Keywords: percent depth dose, flatness, symmetry, golden beam data
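As a worked illustration of the comparison metric (the depth-dose values below are hypothetical samples, not the study's measurements), the local percentage deviation checked against the ~2% tolerance is computed as:

```python
import numpy as np

# hypothetical depth-dose samples (percent of maximum) at matched depths
depths_mm = np.array([15, 50, 100, 150, 200, 250])
measured = np.array([100.0, 91.2, 74.8, 61.0, 49.7, 40.5])
golden = np.array([100.0, 91.0, 74.2, 60.1, 48.8, 39.7])

# local percentage deviation, the quantity compared against the tolerance
deviation = 100.0 * (measured - golden) / golden
for d, dev in zip(depths_mm, deviation):
    print(f"depth {d:4d} mm: {dev:+.2f} %")
print("max |deviation|:", np.abs(deviation).max().round(2), "%")
```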
Procedia PDF Downloads 493
25610 Training Manual of Organic Agriculture Farming for the Farmers: A Case Study from Kunjpura and Surrounding Villages
Authors: Rishi Pal Singh
Abstract:
In the Indian scenario, organic agriculture is growing through the conscious efforts of inspired people who are able to create the best possible relationship between the earth and humans. Nowadays, the major challenges are its entry into the policy-making framework, its entry into the global market, and weak sensitization among farmers. During the last two decades, however, the contamination of the environment and food linked with poor agricultural techniques has turned the mindset of farmers towards organic farming. In view of the above, a small-scale project was set up to introduce 20 farmers from Kunjpura and surrounding villages to organic farming. The project has been running for the last three crop seasons (starting from October 2016), and it has been found that organic farming can meet demand and support the overall development of rural areas. The farmers in this project work on the principle that nature never demands unreasonable quantities of water, does not mine the soil, and does not destroy microbes and other organisms. According to Organic Monitor estimates, global sales have reached into the billions. In this initiative, wheat and rice were considered first, and crop production was observed to have grown by almost 10-15% per year relative to the previous crop. The approach is concerned not only with profit or loss but also emphasizes health, ecology, fairness and the care and enrichment of the soil. Several techniques were used, such as biological fertilizers instead of chemicals, multiple cropping, temperature management, rainwater harvesting, development of the farmers’ own seed, vermicompost and the integration of animals. In the first year, to increase the fertility of the land, legumes (moong, cowpea and red gram) were grown in strips for 60, 90 and 120 days. Simultaneously, a mixture of compost and vermicompost in the proportion 2:1 was applied at the rate of 2.0 tons per acre, enriched with 5 kg Azotobacter and 5 kg Rhizobium biofertilizer. To supply phosphorus, 250 kg of rock phosphate was used. After one month, jivamrut can be applied with the irrigation water or during rainy days. In the next season, a compost-vermicompost mixture at 2.5 tons/ha was used for all types of crops. After this treatment, the soil is ready for high-value ordinary/horticultural crops; the amounts of the above biofertilizers, compost-vermicompost and rock phosphate may be increased accordingly. The significance of the project is that the farmers now believe in cultural alternatives (use of disease-free own seed, organic pest management), the maintenance of biodiversity, crop rotation practices and the health benefits of organic farming. Organic farming projects of this type should be set up at the gram, block and district administration levels.
Keywords: organic farming, Kunjpura, compost, bio-fertilizers
Procedia PDF Downloads 199
25609 Variable-Fidelity Surrogate Modelling with Kriging
Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans
Abstract:
Variable-fidelity surrogate modelling offers an efficient way to approximate function data that are available in multiple degrees of accuracy, each with a different computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, enhanced with gradient data of different degrees of accuracy, are constructed. These gradient-enhanced Kriging surrogate models are then strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data are useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile incorporating gradient data of varying degrees of accuracy.
Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients
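The recursive CoKriging coupling referred to is commonly written in the autoregressive form below; this is the standard formulation from the literature, and the paper's exact notation may differ.

```latex
% Recursive CoKriging (autoregressive) linking fidelity levels t-1 and t:
\begin{align}
  Z_t(\mathbf{x}) &= \rho_{t-1}\, Z_{t-1}(\mathbf{x}) + \delta_t(\mathbf{x}),
  \qquad t = 2, \dots, s, \\
  \delta_t(\mathbf{x}) &\sim
  \mathcal{GP}\!\left(\mu_t,\; \sigma_t^2\, \psi_t(\mathbf{x}, \mathbf{x}')\right),
  \qquad \delta_t \perp Z_{t-1},
\end{align}
% where Z_t is the level-t surrogate, rho_{t-1} a scaling factor, and
% delta_t an independent discrepancy process; gradient-enhanced Kriging
% additionally conditions each level on derivative observations dZ/dx_i.
```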
Procedia PDF Downloads 559
25608 Robust Barcode Detection with Synthetic-to-Real Data Augmentation
Authors: Xiaoyan Dai, Hsieh Yisan
Abstract:
Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes a deep learning-based barcode detection method using synthetic-to-real data augmentation. We first augment the barcodes themselves; we then augment the images containing the barcodes to generate a large variety of data close to actual shooting environments. Comparisons with previous works and evaluations on our original data show that this approach achieves state-of-the-art performance on various real images. In addition, the system uses hybrid resolution for the barcode “scan” and is applicable to real-time applications.
Keywords: barcode detection, data augmentation, deep learning, image-based processing
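The paper's augmentation pipeline is not specified beyond the two stages; the sketch below illustrates plausible image-level augmentations mimicking varied shooting conditions with OpenCV (function names, file name and parameter ranges are illustrative assumptions, not the paper's configuration).

```python
import numpy as np
import cv2

def augment(img, rng):
    """Synthetic-to-real style augmentations for a barcode image crop."""
    h, w = img.shape[:2]

    # photometric: brightness/contrast jitter mimicking exposure changes
    alpha = rng.uniform(0.6, 1.4)   # contrast
    beta = rng.uniform(-40, 40)     # brightness
    img = cv2.convertScaleAbs(img, alpha=alpha, beta=beta)

    # blur: defocus or motion during capture
    k = int(rng.choice([1, 3, 5]))
    img = cv2.GaussianBlur(img, (k, k), 0)

    # geometric: mild perspective warp, as from off-axis shooting
    jitter = rng.uniform(-0.05, 0.05, (4, 2)) * [w, h]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = (src + jitter).astype(np.float32)
    M = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(img, M, (w, h),
                               borderMode=cv2.BORDER_REPLICATE)

rng = np.random.default_rng(7)
barcode = cv2.imread("barcode.png")  # hypothetical input crop
if barcode is not None:
    samples = [augment(barcode, rng) for _ in range(8)]
```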
Procedia PDF Downloads 176
25607 Nutrition Environments and the Development of Taste Preferences: A Cross-Sectional Study of Primary School Children in Trinidad and Tobago
Authors: Fareena Alladin
Abstract:
In the Caribbean, issues of food security, health and taste are intricately linked, seen most clearly in the increasing incidence of lifestyle diseases among children coupled with a taste for high calorie and Westernized diets. In order to fully appreciate this link, the role of nutrition environments must be examined. To this end, the present study incorporates tenets of Bourdieu’s social constructivist theory with the Community Nutrition Environment Model. The aim of this study was to examine the relationships between availability of and access to healthy/unhealthy foods within nutrition environments, namely the household and school, and the development of taste preferences for healthy/unhealthy foods among primary school children in a selected educational district in Trinidad and Tobago. A cross-sectional survey of 400 children between the ages of 9 and 11 years was conducted. Data analysis was conducted using SPSS 24. Results indicated that availability of healthy food at home was positively correlated with preference for vegetables, and negatively correlated with preference for salty snacks and fast food. The availability of unhealthy food within the home was found to be negatively correlated with preference for vegetables and positively correlated with preference for salty snacks. Access to unhealthy foods at school had a positive correlation with preference for fast food. These findings highlight the role of the food environment in shaping taste preferences, and point to the need for interrogating the centrality of food security concerns in emerging health concerns of Caribbean countries. Such interrogations are a necessary part of the development of research agendas, and policy formulation and implementation.
Keywords: food security, nutrition environment, taste preference, Trinidad and Tobago
Procedia PDF Downloads 138
25606 The Future of Insurance: P2P Innovation versus Traditional Business Model
Authors: Ivan Sosa Gomez
Abstract:
Digitalization has impacted the entire insurance value chain, and the growing movement towards P2P platforms and the collaborative economy is also beginning to have a significant impact. P2P insurance is defined as an innovation enabling policyholders to pool their capital, self-organize, and self-manage their own insurance. In this context, new InsurTech start-ups are emerging as peer-to-peer (P2P) providers, based on a model that differs from traditional insurance. As a result, although P2P platforms do not change the fundamental basis of insurance, they do enable potentially more efficient business models to be established in terms of ensuring the coverage of risk. It is therefore relevant to determine whether P2P innovation can have substantial effects on the future of the insurance sector. For this purpose, it is considered necessary to examine P2P innovation from a business perspective and to build a comparison between a traditional model and a P2P model from an actuarial perspective. Objectives: The objectives are (1) to represent P2P innovation in the business model compared to the traditional insurance model and (2) to establish a comparison between a traditional model and a P2P model from an actuarial perspective. Methodology: The research design is defined as action research, in the sense of understanding and solving the problems of a collectivity linked to an environment, applying theory and best practices. The study is carried out through the participatory variant, which involves the collaboration of participants, who are considered experts in this design. Prolonged immersion in the field is the main instrument of data collection. Finally, an actuarial model is developed for the calculation of premiums that allows projections of future scenarios and the generation of conclusions comparing the two models. Main Contributions: From an actuarial and business perspective, we aim to contribute by developing a comparison of the two models in the coverage of risk in order to determine whether P2P innovation can have substantial effects on the future of the insurance sector.
Keywords: Insurtech, innovation, business model, P2P, insurance
Procedia PDF Downloads 94
25605 Word Spotting in Images of Handwritten Historical Documents
Authors: Issam Ben Jami
Abstract:
Information retrieval in digital libraries is very important because the most famous historical documents hold significant value. Word spotting in historical documents is a difficult problem because the writing in such documents is naturally cursive and exhibits wide variability in the scale and translation of words within the same document. We first present a system for automatic recognition based on the extraction of interest points of words from the image model. The key-point extraction phase builds a representation of the image as a synthetic description of shape in a multidimensional space. We then use advanced methods that can find and describe interest points invariant to scale, rotation and lighting, linked to local configurations of pixels. We test this approach on documents of the 15th century. Our experiments give important results.
Keywords: feature matching, historical documents, pattern recognition, word spotting
Procedia PDF Downloads 278
25604 Analysis of Delivery of Quad Play Services
Authors: Rahul Malhotra, Anurag Sharma
Abstract:
Fiber-based access networks can deliver performance that supports the increasing demand for high-speed connections. One of the new technologies that have emerged in recent years is the Passive Optical Network. This paper demonstrates the simultaneous delivery of triple-play services (data, voice, and video) and presents a comparative investigation of the suitability of various data rates. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to the increase in bit error rate.
Keywords: FTTH, quad play, play service, access networks, data rate
Procedia PDF Downloads 419
25603 Aerosol Characterization in a Coastal Urban Area in Rimini, Italy
Authors: Dimitri Bacco, Arianna Trentini, Fabiana Scotto, Flavio Rovere, Daniele Foscoli, Cinzia Para, Paolo Veronesi, Silvia Sandrini, Claudia Zigola, Michela Comandini, Marilena Montalti, Marco Zamagni, Vanes Poluzzi
Abstract:
The Po Valley, in the north of Italy, is one of the most polluted areas in Europe. The air quality of the area is linked not only to anthropic activities but also to its geographical characteristics and stagnant weather conditions, with frequent inversions, especially in the cold season. Even the coastal areas show high values of particulate matter (PM10 and PM2.5), because the area enclosed between the Adriatic Sea and the Apennines does not favor the dispersion of air pollutants. The aim of the present work was to identify the main sources of particulate matter in Rimini, a tourist city in northern Italy. Two sampling campaigns were carried out in 2018, one in winter (60 days) and one in summer (30 days), at 4 sites: an urban background, a city hotspot, a suburban background, and a rural background. The samples were characterized by the concentrations of the ionic components of the particulate and of the main anhydrosugars, in particular levoglucosan, a marker of biomass burning, because one of the most important anthropogenic sources in the area, both in winter and, surprisingly, in summer, is biomass burning. Furthermore, three sampling points were chosen in order to maximize the contribution of a specific biomass source: one in a residential area (domestic cooking and domestic heating), one in an agricultural area (weed fires), and one in the tourist area (restaurant cooking). At these sites, the analyses were enriched with quantification of the carbonaceous component (organic and elemental carbon) and with measurement of the particle number concentration and aerosol size distribution (6-600 nm). The results showed a very significant impact of biomass combustion due to domestic heating in the winter period, along with many intense peaks attributable to episodic wood fires. In the summer season, an appreciable signal linked to biomass combustion was also measured, although much less intense than in winter, attributable to domestic cooking activities. Further interesting results were the total absence of a sea salt contribution in the finer particulate (PM2.5), while in PM10 the contribution becomes appreciable only in particular wind conditions (strong wind from the north or north-east). Finally, it is interesting to note that in a small town like Rimini, the summer traffic source appears to be even more relevant than that measured in a much larger city (Bologna), due to tourism.
Keywords: aerosol, biomass burning, seacoast, urban area
Procedia PDF Downloads 136
25602 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network
Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson
Abstract:
The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods of analysis of the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come with a cost. Little research has been undertaken on how to specify optimally what data to capture, transmit, process, and store at various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the required attributes and classification to take manufacturing digital data from various sources to determine the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision making on which dataset should be processed at the ‘edge’ and what to send to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0
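The fast-and-frugal heuristic itself is not reproduced in the abstract; a minimal sketch of such a tree, with assumed cues and thresholds, looks like this: cues are checked one at a time in a fixed order, and the first decisive cue exits with a routing decision.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    """Attributes assumed for classifying a machine-tool data stream."""
    rate_hz: float          # sampling rate of the source signal
    latency_critical: bool  # needed for a real-time control decision?
    size_mb: float          # batch size per transmission window
    retention_days: int     # how long the data must be kept

def route(sample: Sample) -> str:
    """Fast-and-frugal tree: each cue either decides or passes on."""
    if sample.latency_critical:
        return "edge"             # exit 1: control loops cannot wait for the WAN
    if sample.rate_hz > 1000:
        return "edge"             # exit 2: reduce raw high-rate signals locally
    if sample.size_mb > 100:
        return "edge-preprocess"  # exit 3: compress/feature-extract, then upload
    if sample.retention_days > 30:
        return "cloud"            # exit 4: archival and fleet-wide analytics
    return "cloud"

print(route(Sample(rate_hz=25000, latency_critical=False,
                   size_mb=512, retention_days=365)))  # -> edge
```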
Procedia PDF Downloads 185
25601 Denoising Transient Electromagnetic Data
Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen
Abstract:
Transient electromagnetic (TEM) data plays a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates in different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize the wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibits a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform
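A hedged sketch of the two stages described, signal averaging followed by wavelet soft-thresholding (the db4 wavelet and the universal threshold below are common defaults, not FASTSNAP's actual settings; the decay model is synthetic):

```python
import numpy as np
import pywt

def denoise_decays(decays, wavelet="db4", level=4):
    """Stack-average repeated TEM decays, then soft-threshold the
    wavelet coefficients of the averaged curve.

    decays: (n_repeats, n_gates) array of voltage decays for one mode.
    """
    stacked = decays.mean(axis=0)  # averaging: SNR grows as sqrt(n_repeats)

    coeffs = pywt.wavedec(stacked, wavelet, level=level)
    # noise scale from the finest detail band (median absolute deviation)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(stacked.size))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: stacked.size]

# usage: 64 repeated soundings of a noisy power-law decay
rng = np.random.default_rng(3)
t = np.linspace(1e-5, 1e-2, 512)
clean = 1e-3 * t ** -1.5
raw = clean + rng.normal(0, 2e3, (64, t.size))  # noise dominates late times
print(denoise_decays(raw)[:3])
```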
Procedia PDF Downloads 90
25600 Attribute Analysis of Quick Response Code Payment Users Using Discriminant Non-negative Matrix Factorization
Authors: Hironori Karachi, Haruka Yamashita
Abstract:
Recently, quick response (QR) code payment systems have become popular. Many companies are introducing new QR code payment services, and these services compete with each other to increase their number of users. To increase the number of users, we should understand how the demographic information, usage information, and value of users differ between services. In this study, we analyze real-world data provided by Nomura Research Institute, including demographic data of users and information on their usage of two services: LINE Pay and PayPay. Non-negative Matrix Factorization (NMF) is widely used for analyzing and interpreting such data presented in matrix form; however, the target data suffer from missing values. We use EM-algorithm NMF (EMNMF) to complete the unknown values and so characterize the given data. Moreover, for comparing the results of NMF analyses of two matrices, Discriminant NMF (DNMF) shows the difference in user features between the two matrices. In this study, we combine EMNMF and DNMF and analyze the target data. As an interpretation, we show the difference in user features between LINE Pay and PayPay.
Keywords: data science, non-negative matrix factorization, missing data, quality of services
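A minimal sketch of an EM-style NMF for matrices with missing entries, alternating imputation with multiplicative updates (the masking scheme and toy data are assumptions, not the NRI dataset, and the paper's EMNMF may differ in detail):

```python
import numpy as np

def em_nmf(X, mask, k=5, iters=200, eps=1e-9):
    """NMF with missing entries, EM style: unknown cells are repeatedly
    imputed with the current reconstruction (E-step) and the factors
    refit with Lee-Seung multiplicative updates (M-step).

    X    : (users x attributes) nonnegative matrix, e.g. usage counts
    mask : boolean array, True where X is observed
    """
    rng = np.random.default_rng(0)
    W = rng.uniform(0.1, 1.0, (X.shape[0], k))
    H = rng.uniform(0.1, 1.0, (k, X.shape[1]))
    for _ in range(iters):
        # E-step: replace missing cells by the model's reconstruction
        X_filled = np.where(mask, X, W @ H)
        # M-step: multiplicative updates on the completed matrix
        H *= (W.T @ X_filled) / (W.T @ W @ H + eps)
        W *= (X_filled @ H.T) / (W @ (H @ H.T) + eps)
    return W, H

# usage: toy payment-service usage matrix with ~30% missing entries
rng = np.random.default_rng(1)
X = rng.poisson(3, (100, 12)).astype(float)
mask = rng.uniform(size=X.shape) > 0.3
W, H = em_nmf(X, mask, k=4)
print("observed-cell RMSE:",
      np.sqrt(np.mean((X - W @ H)[mask] ** 2)))
```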
Procedia PDF Downloads 134
25599 Developing Guidelines for Public Health Nurse Data Management and Use in Public Health Emergencies
Authors: Margaret S. Wright
Abstract:
Background/Significance: During many recent public health emergencies/disasters, public health nursing data have been missing or delayed, potentially impacting decision-making and response. Data used as evidence for decision-making in response, planning, and mitigation have been erratic and slow, decreasing the ability to respond. Methodology: Applying best practices in data management and data use in public health settings, guided by the concepts outlined in ‘Disaster Standards of Care’ models, leads to recommendations for a model of best practices in data management and use in public health disasters/emergencies by public health nurses. As the ‘patient’ in public health disasters/emergencies is the community (local, regional or national), guidelines for patient documentation are incorporated in the recommendations. Findings: Using this model, public health nurses could better plan how to prepare for, respond to, and mitigate disasters in their communities, and better participate in decision-making in all three phases, bringing public health nursing data to the discussion as part of the evidence base for decision-making.
Keywords: data management, decision making, disaster planning documentation, public health nursing
Procedia PDF Downloads 225
25598 Nutritional Profile and Food Intake Trends amongst Hospital Dieted Diabetic Eye Disease Patients of India
Authors: Parmeet Kaur, Nighat Yaseen Sofi, Shakti Kumar Gupta, Veena Pandey, Rajvaedhan Azad
Abstract:
Nutritional status and prevailing blood glucose trends amongst hospitalized patients have been linked to clinical outcomes. Therefore, the present study was undertaken to assess the anthropometric and dietary intake trends of hospitalized diabetic eye disease (DED) patients. DED patients with type 1 or 2 diabetes aged over 20 years were enrolled. Actual food intake was determined by the weighed food record method. The Mifflin-St Jeor predictive equation, multiplied by a combined stress and activity factor of 1.3, was applied to estimate caloric needs. A questionnaire was further administered to obtain the reasons for inadequate dietary intake. The results indicated the validity of joint analysis of body mass index in combination with waist circumference for clinical risk prediction. The dietary data showed a significant difference (p < 0.0005) between average daily caloric and carbohydrate intake and actual daily caloric and carbohydrate needs. Mean fasting and post-prandial plasma glucose levels were 150.71 ± 72.200 mg/dL and 219.76 ± 97.365 mg/dL, respectively. Improvements in food delivery systems and nutrition education are indicated to reduce plate waste and to enable a better understanding of the dietary aspects of diabetes management. A team approach of nurses, physicians and other health care providers is required, alongside the expertise of dietetics professionals. To conclude, the findings of the present study will be useful in planning the nutrition care process (NCP) for optimizing glucose control as a component of quality medical nutrition therapy (MNT) in hospitalized DED patients.
Keywords: nutritional status, diabetic eye disease, nutrition care process, medical nutrition therapy
Procedia PDF Downloads 357