Search results for: gaps in data ecosystems
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26022

24912 Prosperous Digital Image Watermarking Approach by Using DCT-DWT

Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar

Abstract:

Every day, tons of data are embedded in digital media or distributed over the internet. These data are distributed so widely that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information, known as a watermark, into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image or text impressed onto paper that provides evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this project, we have concentrated on implementing watermarking in images. The main consideration for any watermarking scheme is its robustness to various attacks.
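To make the embedding idea concrete, here is a minimal sketch of hybrid DWT-DCT watermark insertion in Python, assuming an 8-bit grayscale host image; the coefficient positions and the strength alpha are illustrative choices, not the authors' exact scheme.

```python
import numpy as np
import pywt
from scipy.fft import dct, idct

def dct2(a):
    return dct(dct(a, axis=0, norm='ortho'), axis=1, norm='ortho')

def idct2(a):
    return idct(idct(a, axis=0, norm='ortho'), axis=1, norm='ortho')

def embed(host, watermark_bits, alpha=10.0):
    # 1-level DWT: the watermark goes into the low-frequency (LL) subband
    LL, (LH, HL, HH) = pywt.dwt2(host.astype(float), 'haar')
    C = dct2(LL)
    # Additively modulate one mid-band DCT coefficient per watermark bit
    for k, bit in enumerate(watermark_bits):
        i, j = 4 + k // 8, 4 + k % 8          # illustrative positions
        C[i, j] += alpha if bit else -alpha
    return pywt.idwt2((idct2(C), (LH, HL, HH)), 'haar')

host = np.random.randint(0, 256, (256, 256))  # stand-in for a real image
marked = embed(host, [1, 0, 1, 1, 0, 0, 1, 0])
print(np.abs(marked - host).max())            # embedding distortion
```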

Keywords: watermarking, digital, DCT-DWT, security

Procedia PDF Downloads 421
24911 Machine Learning Data Architecture

Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap

Abstract:

Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and model output vending into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort in building the other components of the architecture, which we do not believe is the best use of scientists' bandwidth. We propose a system architecture, created using AWS services, that brings industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists' work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.
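As a rough illustration of the separation of concerns described above, the sketch below models the batch pipeline as swappable stages in plain Python; the stage names mirror the abstract, while the callables themselves are placeholders that would map to managed AWS services provisioned via infrastructure-as-code.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class BatchMLPipeline:
    source_data: Callable[[], Any]          # data sourcing (e.g. an S3 pull)
    build_features: Callable[[Any], Any]    # feature engineering
    train: Callable[[Any], Any]             # model training
    deploy: Callable[[Any], None]           # model deployment
    vend: Callable[[Any, Any], None]        # vend output into a data store

    def run(self):
        raw = self.source_data()
        features = self.build_features(raw)
        model = self.train(features)
        self.deploy(model)
        self.vend(model, features)

# Scientists supply only build_features/train; engineers own the rest.
```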

Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning

Procedia PDF Downloads 62
24910 Effects of Soil Erosion on Vegetation Development

Authors: Josephine Wanja Nyatia

Abstract:

The relationship between vegetation and soil erosion deserves attention due to its scientific importance and practical applications. A great deal of information is available about the mechanisms and benefits of vegetation in the control of soil erosion, but the effects of soil erosion on vegetation development and succession are poorly documented. Research shows that soil erosion is the most important driving force in the degradation of upland and mountain ecosystems. Soil erosion interferes with the process of plant community development and vegetation succession, commencing with seed formation, acting throughout the whole growth phase, and affecting seed availability, dispersal, germination and establishment, plant community structure and spatial distribution. There have been almost no studies on the effects of soil erosion on seed development and availability, of surface flows on seed movement and redistribution, or of their influences on the soil seed bank and on vegetation establishment and distribution. However, these effects may be the main cause of low vegetation cover in regions of high soil erosion activity, and these issues need to be investigated. Moreover, soil erosion is not only a negative influence on vegetation succession and restoration but also a driving force of plant adaptation and evolution. Consequently, we need to study the effects of soil erosion on ecological processes and on the development and regulation of vegetation succession from the points of view of pedology and of vegetation, plant and seed ecology, and to establish an integrated theory and technology for deriving practical solutions to soil erosion problems.

Keywords: soil erosion, vegetation, development, seed availability

Procedia PDF Downloads 85
24909 Conceptual and Funnel Methods Contribution to Critical Literature Review: PhD Construction Management

Authors: Samuel Quashie

Abstract:

This study demonstrates the applicability and contribution of conceptual and funnel methods during the literature review stage of a PhD in Construction Management focused on the development of an integrated management system for post-disaster reconstruction. The conceptual review method builds upon the strengths of relevant material, detailing the major points and areas covered, and evaluates less relevant literature. Publications are reviewed in an integrated style, challenging the scientific theory and seeking to develop new insights. The funnel method groups reviews by commonality, regardless of topic or thesis statement, showing that the literature review draws on different kinds of information to increase the variety and diversity of the investigation. The results demonstrate the ability of the conceptual and funnel methods to review and appraise the relevant literature, organise it in an integrated style, and allow an evaluation of the credentials, originality, theoretical base, context and significance of quality work to emerge. The objectives of the review are met, gaps in knowledge are identified, and further studies are directed to answer the research questions.

Keywords: PhD, construction management, critical literature review, conceptual and funnel methods

Procedia PDF Downloads 415
24908 A Comparison of Image Data Representations for Local Stereo Matching

Authors: André Smith, Amr Abdel-Dayem

Abstract:

The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how effectively these representations reduce the cost of the correct correspondence relative to other possible matches.
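For a concrete sense of the comparison, the sketch below evaluates a sum-of-absolute-differences (SAD) matching cost under two representations (RGB and grayscale); the window size, disparity range, and synthetic 5-pixel shift are illustrative assumptions, not the paper's protocol.

```python
import numpy as np

def sad_cost(left, right, y, x, d, w=4):
    # Cost of matching the left patch at (y, x) to the right patch at (y, x - d)
    pl = left[y - w:y + w + 1, x - w:x + w + 1]
    pr = right[y - w:y + w + 1, x - d - w:x - d + w + 1]
    return np.abs(pl.astype(float) - pr.astype(float)).sum()

def best_disparity(left, right, y, x, d_max=32):
    costs = [sad_cost(left, right, y, x, d) for d in range(d_max)]
    return int(np.argmin(costs))

rgb_l = np.random.randint(0, 256, (128, 128, 3))
rgb_r = np.roll(rgb_l, -5, axis=1)            # synthetic 5-px disparity
gray_l, gray_r = rgb_l.mean(axis=2), rgb_r.mean(axis=2)

print(best_disparity(rgb_l, rgb_r, 64, 64))   # cost evaluated on RGB data
print(best_disparity(gray_l, gray_r, 64, 64)) # cost evaluated on grayscale
```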

Keywords: colour data, local stereo matching, stereo correspondence, disparity map

Procedia PDF Downloads 368
24907 State and Benefit: Delivering the First State of the Bays Report for Victoria

Authors: Scott Rawlings

Abstract:

Victoria’s first State of the Bays report is an historic baseline study of the health of Port Phillip Bay and Western Port. The report includes 50 assessments of 36 indicators across a broad array of topics, from the nitrogen cycle and water quality to key marine species and habitats. This paper discusses the processes for determining and assessing the indicators and comments on future priorities identified to maintain and improve the health of these waterways. Victoria’s population is now at six million and growing at a rate of over 100,000 people per year – the highest increase in Australia – and the population of greater Melbourne is over four million. Port Phillip Bay and Western Port are vital marine assets at the centre of this growth and will require adaptive strategies if they are to remain in good condition and continue to deliver environmental, economic and social benefits. In 2014, it was in recognition of these pressures that the incoming Victorian Government committed to reporting on the state of the bays every five years. The inaugural State of the Bays report was issued by the independent Victorian Commissioner for Environmental Sustainability. The report brought together what is known about both bays, based on existing research. It is a baseline on which future reports will build and, over time, include more of Victoria’s marine environment. Port Phillip Bay and Western Port generally demonstrate healthy systems. Specific threats linked to population growth are a significant pressure. Impacts are more significant where human activity is more intense and where nutrients are transported to the bays around the mouths of creeks and drainage systems. The transport of high loads of nutrients and pollutants to the bays from peak rainfall events is likely to increase with climate change – as will sea level rise. Marine pests are also a threat: more than 100 introduced marine species have become established in Port Phillip Bay and can compete with native species, alter habitat, reduce important fish stocks and potentially disrupt nitrogen cycling processes. This study confirmed that the data collection regime is better within the Marine Protected Areas of Port Phillip Bay than in other parts. The State of the Bays report is a positive and practical example of what can be achieved through collaboration and cooperation between environmental reporters, government agencies, academic institutions, data custodians and NGOs. The State of the Bays 2016 provides an important foundation by identifying knowledge gaps and research priorities for future studies and reports on the bays. It builds a strong evidence base to effectively manage the bays and supports an adaptive management framework. The report proposes a set of indicators for future reporting that will support a step change in the approach to monitoring and managing the bays – a shift from reporting only on what we do know to reporting on what we need to know.

Keywords: coastal science, marine science, Port Phillip Bay, state of the environment, Western Port

Procedia PDF Downloads 209
24906 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of high volumes and a broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purposes of developing their business. Consequently, today's decentralized data-management environments rely on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and from all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.

Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)

Procedia PDF Downloads 428
24905 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design

Authors: Qing K. Zhu

Abstract:

Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates in one chip, following Moore's law. Designers encounter numerous report files during design iterations using timing and noise analysis tools. This paper presents our work using data mining techniques, combined with HTML tables, to extract and represent critical timing and noise data. When this data-mining tool is applied in real applications, running speed is important. The software employs table look-up techniques to achieve reasonable running speed, as confirmed by performance testing. We added several advanced features for application in an industrial chip design.
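The flavour of the approach can be sketched as follows: critical entries are mined from report lines into a dictionary look-up table and rendered as an HTML table. The line format and threshold here are hypothetical stand-ins, since the real tool parses vendor-specific reports.

```python
import re

# Assumed line format: "endpoint_name slack_ns noise_mv"
ROW = re.compile(r"^(\S+)\s+(-?\d+\.\d+)\s+(\d+\.\d+)")

def mine(report_lines, slack_limit=0.0):
    worst = {}                       # dict as the fast look-up table
    for line in report_lines:
        m = ROW.match(line)
        if not m:
            continue
        path, slack, noise = m.group(1), float(m.group(2)), float(m.group(3))
        if path not in worst or slack < worst[path][0]:
            worst[path] = (slack, noise)   # keep worst slack per endpoint
    rows = "".join(f"<tr><td>{p}</td><td>{s}</td><td>{n}</td></tr>"
                   for p, (s, n) in sorted(worst.items()) if s < slack_limit)
    return ("<table><tr><th>Path</th><th>Slack (ns)</th>"
            f"<th>Noise (mV)</th></tr>{rows}</table>")

report = ["u_core/reg_12 -0.153 41.2", "u_core/reg_12 -0.101 39.8",
          "u_io/pad_3 0.420 12.0"]
print(mine(report))                  # only failing paths are tabulated
```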

Keywords: VLSI design, data mining, big data, HTML forms, web, EDA, timing, noise

Procedia PDF Downloads 252
24904 Introduction of Electronic Health Records to Improve Data Quality in Emergency Department Operations

Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe

Abstract:

In its simplest form, data quality can be defined as 'fitness for use', and it is a multi-dimensional concept. Emergency Departments (EDs) require information to treat patients and, at the same time, are the primary source of information regarding accidents, injuries, emergencies, etc. They are also the starting point of various patient registries, databases and surveillance systems. This interventional study was carried out to improve data quality at the ED of the National Hospital of Sri Lanka (NHSL), the premier trauma care centre in Sri Lanka, by introducing an eHealth solution. The study consisted of three components. A research study was conducted to assess the quality of data in relation to five selected dimensions of data quality, namely accuracy, completeness, timeliness, legibility and reliability. The intervention was to develop and deploy an electronic emergency department information system (eEDIS). Post-intervention assessment confirmed that all five dimensions of data quality had improved, with the most significant improvements in the accuracy and timeliness dimensions.
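As an illustration of how such dimensions can be quantified, the sketch below scores completeness and timeliness on toy ED records; the field names and the 10-minute timeliness threshold are assumptions, not the study's instrument.

```python
import pandas as pd

records = pd.DataFrame({
    "triage_category": ["red", None, "yellow"],
    "diagnosis":       ["fracture", "burn", None],
    "arrival_time":    pd.to_datetime(["10:00", "10:05", "10:20"]),
    "entry_time":      pd.to_datetime(["10:04", "10:30", "10:25"]),
})

# Completeness: share of non-missing values per clinical field
completeness = records[["triage_category", "diagnosis"]].notna().mean()

# Timeliness: share of records entered within 10 minutes of arrival
delay = records["entry_time"] - records["arrival_time"]
timeliness = (delay <= pd.Timedelta(minutes=10)).mean()

print(completeness)
print(f"timely entries: {timeliness:.0%}")
```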

Keywords: electronic health records, electronic emergency department information system, emergency department, data quality

Procedia PDF Downloads 272
24903 Data Presentation of Lane-Changing Events Trajectories Using HighD Dataset

Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba

Abstract:

We present a descriptive analysis of lane-changing events on multi-lane roads. The data come from the Highway Drone Dataset (HighD), which contains microscopic vehicle trajectories recorded on highways. This paper describes and analyses the role of the different parameters and their significance. Using the HighD data, we aim to find the most frequent reasons that motivate drivers to change lanes. We used the programming language R to process these data. We analyze the involvement and relationships of the variables characterising the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration. This was studied according to the class of the vehicle (car or truck) and according to the maneuver it undertook (overtaking or falling back).
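Although the study used R, the pandas sketch below illustrates the same kind of per-maneuver feature extraction; the column names are illustrative stand-ins for the HighD track fields.

```python
import pandas as pd

tracks = pd.DataFrame({
    "ego_speed":       [32.1, 31.8, 30.5],   # m/s
    "preceding_speed": [28.0, 27.9, 27.7],
    "gap_m":           [22.0, 18.5, 14.9],   # distance to the leader
    "vehicle_class":   ["car", "car", "truck"],
    "maneuver":        ["overtake", "overtake", "fall_back"],
})

# Two of the parameters analysed above: speed difference and time gap
tracks["speed_diff"] = tracks["ego_speed"] - tracks["preceding_speed"]
tracks["time_gap_s"] = tracks["gap_m"] / tracks["ego_speed"]

# Summarise by vehicle class and maneuver, mirroring the analysis above
print(tracks.groupby(["vehicle_class", "maneuver"])[
    ["speed_diff", "time_gap_s"]].mean())
```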

Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process

Procedia PDF Downloads 260
24902 Adoption of Digital Storytelling Tool to Teach 21st Century Skills by Malaysian Pre-service Teachers

Authors: Siti Aisyah binti Jumpaan

Abstract:

The integration of 21ˢᵗ century skills (PAK-21) has made its way into the Malaysian curriculum since the Ministry of Education introduced its implementation in 2016. This study was conducted to explore pre-service teachers' readiness to integrate 21st century skills in the classroom via the digital storytelling (DST) method and to find gaps between theory and practice that can be integral to pre-service teachers' professional growth. A qualitative research method was used, involving six respondents selected through purposive sampling. Their interview responses and lesson plans were analysed using narrative analysis. The findings showed that pre-service teachers had a moderate level of readiness to integrate 21st century skills using DST. They demonstrated a high level of preparedness in writing their lesson plans, but the interviews revealed that they struggled with implementation due to several factors, such as a lack of technology and failure to obtain students' participation. This study further strengthens the need for a specialised curriculum for pre-service teachers on teaching 21st century skills via DST.

Keywords: digital storytelling, 21ˢᵗ century skills, preservice teachers, teacher training

Procedia PDF Downloads 90
24901 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photon Beams in a Varian Linear Accelerator

Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain

Abstract:

Objective: The main purpose of this study is to compare the percent depth dose (PDD) and in-plane and cross-plane profiles of Varian golden beam data to measured data for 6 and 18 MV photons for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles and dose-rate tables for open and wedged fields into the treatment planning system, enabling calculation of monitor units and dose distributions. Varian offers a generic set of beam data as reference data but does not recommend it for clinical use. In this study, we compared the generic beam data with measured beam data to evaluate the reliability of the generic data for clinical purposes. Methods and Material: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian's algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a Semiflex ion chamber and MEPHYSTO software. The online available Varian golden beam data were compared with the measured data to evaluate their accuracy for the commissioning of the Eclipse treatment planning system. Results: The deviation between measured and golden beam data was at most 2%. In PDDs, the deviation increases at deeper depths. Similarly, the profiles show the same trend of increasing deviation at larger field sizes and increasing depths. Conclusion: The study shows that the percentage deviation between measured and golden beam data is within the acceptable tolerance and can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.
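The core comparison can be illustrated with a few lines of Python: percentage deviation of a measured PDD curve from the golden beam curve on a common depth grid. The numbers below are illustrative, not the study's data.

```python
import numpy as np

depth_cm = np.array([1.5, 5.0, 10.0, 20.0])
pdd_meas = np.array([100.0, 86.2, 67.1, 38.9])   # measured, e.g. 6 MV
pdd_gold = np.array([100.0, 86.9, 67.9, 39.6])   # Varian golden beam

# Percentage deviation relative to the golden beam reference
dev = 100.0 * (pdd_meas - pdd_gold) / pdd_gold
for d, e in zip(depth_cm, dev):
    print(f"depth {d:5.1f} cm: deviation {e:+.2f}%")
print("within 2% tolerance:", bool(np.all(np.abs(dev) <= 2.0)))
```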

Keywords: percent depth dose, flatness, symmetry, golden beam data

Procedia PDF Downloads 488
24900 Variable-Fidelity Surrogate Modelling with Kriging

Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans

Abstract:

Variable-fidelity surrogate modelling offers an efficient way to approximate function data available in multiple degrees of accuracy, each with varying computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, enhanced with gradient data of different degrees of accuracy, are constructed. These gradient-enhanced Kriging surrogate models are then strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data are useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile incorporating gradient data of varying degrees of accuracy.
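A minimal two-fidelity sketch in the spirit of recursive CoKriging is shown below: a Gaussian process fitted to cheap low-fidelity samples plus a second process fitted to the discrepancy. It omits the gradient enhancement that is central to the paper and fixes the scaling factor at one, both simplifying assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

f_lo = lambda x: np.sin(8 * x)                 # cheap approximation
f_hi = lambda x: np.sin(8 * x) + 0.3 * x       # expensive "truth"

X_lo = np.linspace(0, 1, 21).reshape(-1, 1)    # many cheap samples
X_hi = np.linspace(0, 1, 5).reshape(-1, 1)     # few expensive samples

# Level 1: GP on low-fidelity data
gp_lo = GaussianProcessRegressor(RBF(0.1)).fit(X_lo, f_lo(X_lo))

# Level 2: GP on the discrepancy, with scaling rho fixed at 1 (assumed)
rho = 1.0
delta = f_hi(X_hi) - rho * gp_lo.predict(X_hi).reshape(-1, 1)
gp_d = GaussianProcessRegressor(RBF(0.3)).fit(X_hi, delta)

X = np.array([[0.37]])
pred = rho * gp_lo.predict(X) + gp_d.predict(X)
print(pred, f_hi(X))                           # surrogate vs. truth
```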

Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients

Procedia PDF Downloads 556
24899 Robust Barcode Detection with Synthetic-to-Real Data Augmentation

Authors: Xiaoyan Dai, Hsieh Yisan

Abstract:

Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes deep learning-based barcode detection using synthetic-to-real data augmentation. We first augment the barcodes themselves; we then augment the images containing the barcodes to generate a large variety of data close to the actual shooting environments. Comparisons with previous works and evaluations on our original data show that this approach achieves state-of-the-art performance on various real images. In addition, the system uses hybrid resolution for barcode “scan” and is applicable to real-time applications.
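The augmentation idea can be sketched as below: synthetic barcode crops are warped, blurred, and noised to mimic real shooting conditions. The parameter ranges are illustrative assumptions, not the paper's recipe.

```python
import numpy as np
import cv2

def augment(img, rng):
    h, w = img.shape[:2]
    # Random perspective jitter of the four corners (simulates camera tilt)
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = (src + rng.uniform(-0.05 * w, 0.05 * w, src.shape)).astype(np.float32)
    img = cv2.warpPerspective(img, cv2.getPerspectiveTransform(src, dst), (w, h))
    # Defocus blur and additive sensor noise (simulate shooting conditions)
    img = cv2.GaussianBlur(img, (5, 5), 0)
    noise = rng.normal(0, 8, img.shape)
    return np.clip(img.astype(float) + noise, 0, 255).astype(np.uint8)

rng = np.random.default_rng(0)
barcode = np.zeros((80, 160), np.uint8)
barcode[:, ::4] = 255                        # crude stand-in for bars
augmented = [augment(barcode, rng) for _ in range(8)]  # training variants
```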

Keywords: barcode detection, data augmentation, deep learning, image-based processing

Procedia PDF Downloads 167
24898 Application of a Theoretical Framework as a Context for a Travel Behavior Change Policy Intervention

Authors: F. Moghtaderi, M. Burke, J. Troelsen

Abstract:

There has been a significant decline in active travel and a massive increase in the use of car-dependent travel modes in many countries during the past two decades. This increased use of motorized travel modes is accompanied by evident risks to people's physical and mental health, ranging from overweight and obesity to increasing air pollution. In response to these rising concerns, local councils and other interested organizations around the world have introduced a variety of initiatives to reduce the dominance of cars in daily journeys. However, the nature of these kinds of interventions, which relate to human behavior, creates considerable complexity. Travel behavior, and changing this behavior, has two different aspects: people's attitudes and perceptions toward sustainable and healthy modes of travel and toward motorized modes (especially private car use), and people's behavior change processes. There is no comprehensive model to guide policy interventions and increase their likelihood of success. A comprehensive theoretical framework is required to facilitate and guide the processes of data collection and analysis and to achieve the best possible guidelines for policy makers. Addressing this gap in travel behavior change research, this paper identifies and suggests a multidimensional framework to facilitate planning interventions. A structured mixed-method design is suggested to expand the scope and improve the analytic power of the results, given the complexity of human behavior. To understand people's attitudes, a theory focused on attitudes toward a particular travel behavior was needed; the literature around the theory of planned behavior (TPB) was the most useful, and the theory has proven to be a good predictor of behavior change. The other aspect of the research relates to people's decision-making processes and guidelines for further interventions; here, a theory was needed to facilitate and direct the interventions' design. The transtheoretical model of behavior change (TTM) was used to derive a set of useful guidelines for further interventions aimed at increasing active and sustainable travel. Consequently, a combination of these two theories (TTM and TPB) is presented as an appropriate concept for identifying and designing travel behavior change interventions.

Keywords: behavior change theories, theoretical framework, travel behavior change interventions, urban research

Procedia PDF Downloads 373
24897 Adolescents’ and Young Adults’ Well-Being, Health, and Loneliness during the COVID-19 Pandemic

Authors: Jessica Hemberg, Amanda Sundqvist, Yulia Korzhina, Lillemor Östman, Sofia Gylfe, Frida Gädda, Lisbet Nyström, Henrik Groundstroem, Pia Nyman-Kurkiala

Abstract:

Purpose: There are large gaps in the literature on COVID-19 pandemic-related mental health outcomes and after-effects specific to adolescents and young adults. The study's aim was to explore adolescents' and young adults' experiences of well-being, health, and loneliness during the COVID-19 pandemic. Method: A qualitative exploratory design with qualitative content analysis was used. Twenty-three participants (aged 19–27; 4 men and 19 women) were interviewed. Results: Four themes emerged: changed social networks – fewer and closer contacts; changed mental and physical health; increased physical and social loneliness; and well-being, internal growth, and need for support. Conclusion: Adolescents' and young adults' experiences of well-being, health, and loneliness are subtle and complex. Participants experienced changed social networks, mental and physical health, and well-being, as well as internal growth, a need for support, and increased loneliness. Clear information on how to seek help and support from professionals should be made available.

Keywords: adolescents, COVID-19 pandemic, health, interviews, loneliness, qualitative, well-being, young adults

Procedia PDF Downloads 95
24896 Analysis of Delivery of Quad Play Services

Authors: Rahul Malhotra, Anurag Sharma

Abstract:

Fiber-based access networks can deliver performance that supports the increasing demand for high-speed connections. One of the new technologies that has emerged in recent years is the Passive Optical Network. This paper demonstrates the simultaneous delivery of triple-play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to the increase in bit error rate.
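The trade-off can be illustrated with the standard Gaussian-noise relation BER = 0.5·erfc(Q/√2); the Q values assigned to each data rate below are assumed numbers for illustration, not the paper's results.

```python
import numpy as np
from scipy.special import erfc

# Assumed Q factors degrading as the data rate increases
for rate_gbps, q in [(2.5, 7.0), (5.0, 6.0), (10.0, 4.8)]:
    ber = 0.5 * erfc(q / np.sqrt(2))
    print(f"{rate_gbps:4.1f} Gbps: Q = {q}, BER = {ber:.2e}")
```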

Keywords: FTTH, quad play, play service, access networks, data rate

Procedia PDF Downloads 412
24895 Microplastic Concentrations and Fluxes in Urban Compartments: A Systemic Approach at the Scale of the Paris Megacity

Authors: Rachid Dris, Robin Treilles, Max Beaurepaire, Minh Trang Nguyen, Sam Azimi, Vincent Rocher, Johnny Gasperi, Bruno Tassin

Abstract:

Microplastic sources and fluxes in urban catchments are only poorly studied. Most often, the approaches taken focus on a single source and only describe the contamination levels and types (shape, size, polymers). In order to gain improved knowledge of microplastic inputs at urban scales, estimating and comparing various fluxes is necessary. The Laboratoire Eau, Environnement et Systèmes Urbains (LEESU), the Laboratoire Eau Environnement (LEE) and the SIAAP (Service public de l’assainissement francilien) initiated several projects to investigate different urban sources and flows of microplastics. A systemic approach is undertaken at the scale of the Paris megacity, and several compartments are considered, including atmospheric fallout, wastewater treatment plants, runoff and combined sewer overflows. These investigations are carried out within the Limnoplast and OPUR projects. Atmospheric fallout was sampled with a stainless-steel funnel during consecutive periods of 2 to 3 weeks, covering both wet and dry periods. Different treatment steps were sampled in two wastewater treatment plants (Seine-Amont for activated sludge and Seine-Centre for biofiltration) of the SIAAP, including sludge samples. Microplastics were also investigated in combined sewer overflows as well as in stormwater at the outlet of a suburban catchment (Sucy-en-Brie, France) during four rain events. Samples are treated using hydrogen peroxide digestion (H₂O₂ 30%) in order to reduce organic material. Microplastics are then extracted from the samples by density separation using NaI (d = 1.6 g.cm⁻³). Samples are filtered on metallic filters with a porosity of 14 µm between steps to separate them from the solutions (H₂O₂ and NaI). The last filtration is carried out on alumina filters. Infrared mapping analysis (using a micro-FTIR with an MCT detector) is performed on each alumina filter. The resulting maps are analyzed using the microplastic analysis software siMPle, developed by Aalborg University, Denmark and the Alfred Wegener Institute, Germany. Blanks were systematically carried out to account for sample contamination. This presentation aims to synthesize the data found in the various projects. In order to carry out a systemic approach and compare the various inputs, all the data were converted into annual microplastic fluxes (number of microplastics per year) and extrapolated to the Parisian agglomeration. PP, PE and alkyd are the most prevalent polymers found in stormwater samples. Rain intensity and microplastic concentrations did not show any clear correlation. Considering the runoff volumes and the impervious surface area of the studied catchment, a flux of 4×10⁷–9×10⁷ MPs.yr⁻¹.ha⁻¹ was estimated. Samples of wastewater treatment plants and atmospheric fallout are currently being analyzed in order to finalize this assessment. The representativeness of such samplings and the uncertainties related to the extrapolations will be discussed, and gaps in knowledge will be identified. The data provided by such an approach will help to prioritize future research as well as policy efforts.
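As a rough illustration of how such per-hectare annual fluxes are obtained, the sketch below scales a concentration by runoff volume and catchment area; every number in it is an assumption for illustration, not a project measurement.

```python
# All inputs below are assumed values for illustration only
concentration_mp_per_l = 25.0          # microplastics per litre of runoff
annual_runoff_m3 = 60_000.0            # runoff volume per year
catchment_ha = 25.0                    # impervious catchment area

flux_per_year = concentration_mp_per_l * annual_runoff_m3 * 1_000  # litres per m³
flux_per_ha = flux_per_year / catchment_ha
print(f"{flux_per_ha:.1e} MPs per year per hectare")  # falls in the 4e7-9e7 range
```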

Keywords: microplastics, atmosphere, wastewater, urban runoff, Paris megacity, urban waters

Procedia PDF Downloads 180
24894 A Systematic Review on Experimental, FEM Analysis and Simulation of the Metal Spinning Process

Authors: Amol M. Jadhav, Sharad S. Chudhari, S. S. Khedkar

Abstract:

This review presents a thorough survey of published work on the experimental analysis, FEM analysis, and simulation of the metal spinning process. All the papers in this survey are taken from Elsevier publications, most of them from the Journal of Materials Processing Technology. Over the last two decades or so, the metal spinning process has gradually been adopted as a chipless forming method for the production of engineering components in small to medium batch quantities. The review aims to provide insight into the experimentation, FEM analysis of various components, and simulation of the metal spinning process, and to act as a guide for researchers working on metal spinning processes. The review of existing work reveals several gaps in current knowledge of metal spinning processes. The experimental evaluations cover thickness strain, spinning force, twisting angle, and surface roughness of the conventional and shear metal spinning processes; the FEM evaluations cover path definition with sufficiently fine meshes to capture workpiece behaviour; and the effects of roller feed rate, roller direction, and roller type are simulated. The metal spinning process is flexible enough to produce a wider range of product shapes and to form more challenging materials.

Keywords: metal spinning, FEM analysis, simulation of metal spinning, mechanical engineering

Procedia PDF Downloads 385
24893 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network

Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson

Abstract:

The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods of analysis of the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come with a cost. Little research has been undertaken on how to specify optimally what data to capture, transmit, process, and store at various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the required attributes and classification to take manufacturing digital data from various sources to determine the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision making on which dataset should be processed at the ‘edge’ and what to send to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
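The decision-making step can be illustrated with a fast-and-frugal decision list, where cues are checked in order and the first decisive cue routes the dataset; the cue order and thresholds below are illustrative assumptions, not the paper's validated framework.

```python
def route(dataset):
    # Cues are checked one at a time; the first decisive cue wins.
    if dataset["latency_critical"]:        # cue 1: real-time need
        return "edge"
    if dataset["size_mb"] > 500:           # cue 2: bandwidth/cost of transfer
        return "edge"
    if dataset["needs_history"]:           # cue 3: long-term context required
        return "cloud"
    return "cloud"                         # default: central analytics

spindle_vibration = {"latency_critical": True, "size_mb": 80,
                     "needs_history": False}
maintenance_log = {"latency_critical": False, "size_mb": 2,
                   "needs_history": True}
print(route(spindle_vibration), route(maintenance_log))  # edge cloud
```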

Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0

Procedia PDF Downloads 176
24892 Denoising Transient Electromagnetic Data

Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen

Abstract:

Transient electromagnetic (TEM) data plays a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates at different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibits a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
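The two-step scheme can be sketched as follows: stack averaging of repeated transients followed by soft wavelet thresholding. The wavelet choice, decomposition level, and threshold rule are illustrative; the FASTSNAP processing details are not reproduced here.

```python
import numpy as np
import pywt

rng = np.random.default_rng(1)
t = np.linspace(1e-5, 1e-2, 1024)
clean = np.exp(-t / 2e-3)                          # idealised TEM decay
stack = clean + rng.normal(0, 0.05, (64, t.size))  # 64 noisy repeats

avg = stack.mean(axis=0)                           # signal-averaging step

# Wavelet denoising: soft-threshold the detail coefficients
coeffs = pywt.wavedec(avg, 'db4', level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise estimate
thr = sigma * np.sqrt(2 * np.log(avg.size))        # universal threshold
coeffs[1:] = [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, 'db4')[:avg.size]

# Residual error before and after the wavelet step
print(np.abs(avg - clean).std(), np.abs(denoised - clean).std())
```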

Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform

Procedia PDF Downloads 83
24891 Ultra-High Molecular Weight Polyethylene (UHMWPE) for Radiation Dosimetry Applications

Authors: Malik Sajjad Mehmood, Aisha Ali, Hamna Khan, Tariq Yasin, Masroor Ikram

Abstract:

Ultra-high molecular weight polyethylene (UHMWPE) is a polymer belonging to the polyethylene (PE) family, with monomer –CH2– and an average molecular weight of approximately 3-6 million g/mol. Due to its chemical, mechanical, physical and biocompatibility properties, it has been extensively used in the fields of electrical insulation, medicine, orthopedics, microelectronics, engineering, chemistry and the food industry. In order to alter or modify the properties of UHMWPE for a particular application of interest, various procedures are in practice, e.g., treating the material with high-energy irradiation such as gamma rays, e-beams, and ion bombardment. Radiation treatment of UHMWPE induces free radicals within its matrix, and these free radicals are the precursors of chain scission, chain accumulation, formation of double bonds, molecular emission, crosslinking, etc. All the aforementioned physical and chemical processes are mainly responsible for the modification of polymer properties for particular applications of interest, e.g., fabricating LEDs, optical sensors, antireflective coatings, and polymeric optical fibers, and, most importantly, for radiation dosimetry applications. Therefore, to check the feasibility of using UHMWPE for radiation dosimetry applications, compressed sheets of UHMWPE were irradiated at room temperature (~25°C) to total doses of 30 kGy and 100 kGy, respectively, while one sheet was kept unirradiated as a reference. Transmittance data (from 400 nm to 800 nm) of e-beam-irradiated UHMWPE and its hybrids were measured using a Mueller-matrix spectro-polarimeter. As a result, significant changes occur in the absorption behavior of the irradiated samples. To analyze these radiation-induced changes in the polymer matrix, the Urbach edge method and the modified Tauc equation were used. The results reveal that the optical activation energy decreases with irradiation: the values are 2.85 meV, 2.48 meV, and 2.40 meV for the control, 30 kGy, and 100 kGy samples, respectively. The direct and indirect energy band gaps were also found to decrease with irradiation due to the variation of C=C unsaturation in clusters. We believe that the reported results open new horizons for radiation dosimetry applications.
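The Urbach-edge analysis behind such activation energies can be illustrated in a few lines: the absorption edge follows ln(α) = ln(α₀) + E/E_u, so the inverse slope of a linear fit of ln(α) against photon energy gives the Urbach energy. The values below are synthetic, not the study's spectra.

```python
import numpy as np

E_eV = np.linspace(2.8, 3.2, 9)               # photon energies
E_u_true = 0.05                                # synthetic Urbach energy (eV)
alpha = 1e3 * np.exp(E_eV / E_u_true)          # exponential absorption edge

# Linear fit of ln(alpha) vs E; the slope is 1/E_u
slope, intercept = np.polyfit(E_eV, np.log(alpha), 1)
print(f"Urbach energy: {1.0 / slope * 1e3:.1f} meV")   # ~50.0 meV
```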

Keywords: electron beam, radiation dosimetry, Tauc’s equation, UHMWPE, Urbach method

Procedia PDF Downloads 405
24890 Attribute Analysis of Quick Response Code Payment Users Using Discriminant Non-negative Matrix Factorization

Authors: Hironori Karachi, Haruka Yamashita

Abstract:

Recently, quick response (QR) code payment systems have become popular. Many companies introduce new QR code payment services, and the services compete with each other to increase their number of users. To increase the number of users, we should grasp the differences in demographic information, usage information, and user value between services. In this study, we analyze real-world data provided by Nomura Research Institute, including the demographic data of users and information on their usage of two services, LINE Pay and PayPay. Non-negative Matrix Factorization (NMF) is widely used to analyze and interpret such data; however, the target data suffer from missing values. We use EM-algorithm NMF (EMNMF) to complete the unknown values and understand the features of data presented in matrix form. Moreover, for comparing the results of NMF analyses of two matrices, Discriminant NMF (DNMF) shows the differences in user features between the two matrices. In this study, we combine EMNMF and DNMF to analyze the target data. As the interpretation, we show the differences in user features between LINE Pay and PayPay.
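The missing-data idea can be sketched with a generic masked NMF, where multiplicative updates are confined to observed cells so that unknown entries are imputed by the low-rank model; this is a simplified stand-in for the authors' EMNMF/DNMF combination.

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((30, 8))                # users x usage/demographic features
M = rng.random(V.shape) > 0.2          # mask: 1 = observed, 0 = missing

k = 3                                  # number of latent factors
W, H = rng.random((30, k)), rng.random((k, 8))
eps = 1e-9
for _ in range(300):
    # Multiplicative updates restricted to observed entries via the mask
    WH = W @ H
    W *= ((M * V) @ H.T) / ((M * WH) @ H.T + eps)
    WH = W @ H
    H *= (W.T @ (M * V)) / (W.T @ (M * WH) + eps)

V_completed = W @ H                    # imputed matrix for interpretation
print(np.abs(V - V_completed)[M].mean())   # fit on observed entries
```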

Keywords: data science, non-negative matrix factorization, missing data, quality of services

Procedia PDF Downloads 130
24889 Climate Change Impacts on Oyster Aquaculture - Part I: Identification of Key Factors

Authors: Emmanuel Okine Neokye, Xiuquan Wang, Krishna K. Thakur, Pedro Quijon, Rana Ali Nawaz, Sana Basheer

Abstract:

Oysters are enriched with high-quality protein and are widely known for their exquisite taste. The production of oysters plays an important role in the local economies of coastal communities in many countries, including Atlantic Canada, because of their high economic value. However, because of the changing climatic conditions in recent years, oyster aquaculture faces potentially negative impacts, such as increasing water acidification, rising water temperatures, high salinity, invasive species, algal blooms, and other environmental factors. Although a few isolated effects of climate change on oyster aquaculture have been reported in recent years, it is not well understood how climate change will affect oyster aquaculture from a systematic perspective. In the first part of this study, we present a systematic review of the impacts of climate change and some key environmental factors affecting oyster production on a global scale. The study also identifies knowledge gaps and challenges. In addition, we present key research directions that will facilitate future investigations.

Keywords: climate change, oyster production, oyster aquaculture, greenhouse gases

Procedia PDF Downloads 11
24888 Developing Guidelines for Public Health Nurse Data Management and Use in Public Health Emergencies

Authors: Margaret S. Wright

Abstract:

Background/Significance: During many recent public health emergencies and disasters, public health nursing data have been missing or delayed, potentially impacting decision-making and response. Data used as evidence for decision-making in response, planning, and mitigation have been erratic and slow to arrive, decreasing the ability to respond. Methodology: Applying best practices in data management and data use in public health settings, guided by the concepts outlined in 'Disaster Standards of Care' models, leads to recommendations for a model of best practices in public health nurses' data management and use in public health disasters and emergencies. As the 'patient' in public health disasters and emergencies is the community (local, regional or national), guidelines for patient documentation are incorporated in the recommendations. Findings: Using this model, public health nurses could better plan how to prepare for, respond to, and mitigate disasters in their communities and better participate in decision-making in all three phases, bringing public health nursing data to the discussion as part of the evidence base for decision-making.

Keywords: data management, decision making, disaster planning documentation, public health nursing

Procedia PDF Downloads 221
24887 An Embarrassingly Simple Semi-supervised Approach to Increase Recall in Online Shopping Domain to Match Structured Data with Unstructured Data

Authors: Sachin Nagargoje

Abstract:

Completely labeled data is often difficult to obtain in practice, and even when one manages to obtain it, its quality is always in question. In the shopping vertical, offers are the input data, provided by advertisers with or without good-quality information. In this paper, the author investigates the possibility of using a very simple semi-supervised learning approach to increase the recall of unhealthy offers (those with badly written titles or partial product details) in the shopping vertical domain. The semi-supervised learning method improved recall in the smartphone category by 30% in A/B testing on 10% of traffic and increased the year-over-year (YoY) number of impressions per month by 33% in production. This also produced a significant increase in revenue, which cannot be publicly disclosed.
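The pseudo-labeling idea can be sketched with scikit-learn's self-training wrapper: a classifier trained on the few labeled offers labels the unlabeled ones it is confident about, then retrains. The features and confidence threshold below are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(0)
X = rng.random((500, 4))                        # offer features (e.g. title stats)
y_true = (X[:, 0] + X[:, 1] > 1).astype(int)    # 1 = unhealthy offer
y = np.full(500, -1)                            # -1 marks unlabeled offers
labeled = rng.choice(500, 40, replace=False)
y[labeled] = y_true[labeled]                    # only 40 human labels

# Self-training: pseudo-label unlabeled offers above the confidence threshold
model = SelfTrainingClassifier(LogisticRegression(), threshold=0.8)
model.fit(X, y)

pred = model.predict(X)
recall = (pred[y_true == 1] == 1).mean()
print(f"recall on unhealthy offers: {recall:.2f}")
```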

Keywords: semi-supervised learning, clustering, recall, coverage

Procedia PDF Downloads 120
24886 Genodata: The Human Genome Variation Using BigData

Authors: Surabhi Maiti, Prajakta Tamhankar, Prachi Uttam Mehta

Abstract:

Since the completion of the Human Genome Project, there has been an unparalleled escalation in the sequencing of genomic data. This project was a first major leap in the field of medical research, especially in genomics. It won accolades by using a concept called Big Data, which was earlier used extensively to generate value for business. Big Data makes use of data sets generally in the form of files of terabyte, petabyte, or exabyte scale; such data sets were traditionally used and managed with spreadsheets and RDBMSs. The voluminous data made the process tedious and time-consuming, and hence a stronger framework called Hadoop was introduced in the field of genetic sciences to make data processing faster and more efficient. This paper focuses on using Spark, which is gaining momentum with the advancement of Big Data technologies. Cloud storage is an effective medium for storing the large data sets generated from genetic research and the resultant sets produced by Spark analysis.
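In the spirit of the pipeline described, the PySpark sketch below loads variant records from cloud storage and aggregates them in parallel; the bucket path and the three-column layout are assumptions for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("genodata").getOrCreate()

# e.g. a CSV export of variants: sample_id, chromosome, variant_type
variants = spark.read.csv("gs://genome-bucket/variants.csv", header=True)

# Count variants per chromosome and type, distributed across the cluster
per_chrom = (variants
             .groupBy("chromosome", "variant_type")
             .agg(F.count("*").alias("n_variants"))
             .orderBy("chromosome"))
per_chrom.show()
spark.stop()
```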

Keywords: human genome project, big data, genomic data, Spark, cloud storage, Hadoop

Procedia PDF Downloads 258
24885 Assessing Diagnostic and Evaluation Tools for Use in Urban Immunisation Programming: A Critical Narrative Review and Proposed Framework

Authors: Tim Crocker-Buque, Sandra Mounier-Jack, Natasha Howard

Abstract:

Background: Due to both the increasing scale and speed of urbanisation, urban areas in low- and middle-income countries (LMICs) host increasingly large populations of under-immunised children, with the additional associated risks of rapid disease transmission in high-density living environments. Multiple interdependent factors are associated with these coverage disparities in urban areas, and most evidence comes from relatively few countries, predominantly India, Kenya and Nigeria, with some from Pakistan, Iran and Brazil. This study aimed to identify, describe, and assess the main tools used to measure or improve coverage of immunisation services in poor urban areas. Methods: The authors used a qualitative review design, including academic and non-academic literature, to identify tools used to improve coverage of public health interventions in urban areas. They selected and extracted sources that provided good examples of specific tools, or categories of tools, used in contexts relevant to urban immunisation. Diagnostic tools (e.g., for data collection, analysis, and insight generation), programme tools (e.g., for investigating or improving ongoing programmes) and interventions (e.g., multi-component or stand-alone with evidence) were included to provide a range of types and availability of relevant tools. These were then prioritised using a decision-analysis framework, and a tool selection guide for programme managers was developed. Results: The authors reviewed tools used in urban immunisation contexts and tools designed for (i) non-immunisation and/or non-health interventions in urban areas and (ii) immunisation in rural contexts with relevance for urban areas (e.g., Reaching Every District/Child/Zone). Many approaches combined several tools and methods, which the authors categorised as diagnostic, programme, and intervention. The most common diagnostic tools were cross-sectional surveys, key informant interviews, focus group discussions, secondary analysis of routine data, and geographical mapping of outcomes, resources, and services. Programme tools involved multiple stages of data collection, analysis, insight generation, and intervention planning, and included guidance documents from the WHO (World Health Organization), UNICEF (United Nations Children's Fund), USAID (United States Agency for International Development), and governments, as well as articles reporting on diagnostics, interventions, and/or evaluations to improve urban immunisation. Interventions involved service improvement, education, reminder/recall, incentives, outreach, or mass media, or were multi-component. The main gaps in existing tools were assessment of macro/policy-level factors, exploration of effective immunisation communication channels, and measurement of in- and out-migration. The proposed framework uses a problem-tree approach to suggest tools for five common challenges (identifying populations, understanding communities, issues with service access and use, improving services, improving coverage) based on context and available data. Conclusion: This study identified many tools relevant to evaluating urban LMIC immunisation programmes, with significant crossover between tools. This was encouraging in terms of supporting the identification of common areas, but problematic because data volumes, instructions, and activities could overwhelm managers, and tools are not always applied to suitable contexts. Further research is needed on how best to combine tools and methods to suit local contexts. The authors' initial framework can be tested and developed further.

Keywords: health equity, immunisation, low and middle-income countries, poverty, urban health

Procedia PDF Downloads 139
24884 Innovations and Challenges: Multimodal Learning in Cybersecurity

Authors: Tarek Saadawi, Rosario Gennaro, Jonathan Akeley

Abstract:

There is rapidly growing demand for professionals to fill positions in cybersecurity, recognized as a national priority both by government agencies and the private sector. Cybersecurity is a very wide technical area encompassing all measures that can be taken in an electronic system to prevent criminal or unauthorized use of data and resources. This requires defending computers, servers, networks, and their users from any kind of malicious attack. The need to address this challenge has been recognized globally but is particularly acute in the New York metropolitan area, home to some of the largest financial institutions in the world, which are prime targets of cyberattacks. In New York State alone, there are currently around 57,000 jobs in the cybersecurity industry, with more than 23,000 unfilled positions. The Cybersecurity Program at City College is a collaboration between the Departments of Computer Science and Electrical Engineering. In Fall 2020, The City College of New York matriculated its first students in the Cybersecurity Master of Science program. The program was designed to fill gaps in the previous offerings and evolved out of an established partnership with Facebook on cybersecurity education. City College has designed a program where courses, curricula, syllabi, materials, labs, etc., are developed in cooperation and coordination with industry whenever possible, ensuring that students graduating from the program have the necessary background to segue seamlessly into industry jobs. The Cybersecurity Program has created multiple pathways for prospective students to obtain the necessary prerequisites to apply, in order to build a more diverse student population. The program can also be pursued on a part-time basis, which makes it available to working professionals. Since City College's Cybersecurity M.S. program was established to equip students with the advanced technical skills needed to thrive in a high-demand, rapidly evolving field, it incorporates a range of pedagogical formats. From its outset, the program has sought to provide both the theoretical foundations necessary for meaningful work in the field and labs and applied learning projects aligned with the skill sets required by industry. The efforts have involved collaboration with outside organizations and with visiting professors designing new courses on topics such as adversarial AI, data privacy, secure cloud computing, and blockchain. Although the program was initially designed with a single asynchronous course in the curriculum, with the rest of the classes offered in person, the advent of the COVID-19 pandemic necessitated a move to fully online learning. The shift to online learning has provided lessons for future development, offering examples of some inherent advantages of the medium in addition to its drawbacks. This talk will address the structure of the newly implemented Cybersecurity Master's Program and discuss innovations, challenges, and possible future directions.

Keywords: cybersecurity, New York, City College, graduate degree, Master of Science

Procedia PDF Downloads 146
24883 Ontology for a Voice Transcription of OpenStreetMap Data: The Case of Space Apprehension by Visually Impaired Persons

Authors: Said Boularouk, Didier Josselin, Eitan Altman

Abstract:

In this paper, we present a vocal ontology of OpenStreetMap (OSM) data for the apprehension of space by visually impaired people. The platform, based on produsage, gives data producers the freedom to choose the descriptors of geocoded locations. Unfortunately, this freedom, also called folksonomy, complicates subsequent data searches. We try to solve this issue with a simple but usable method to extract data from OSM databases and deliver it to visually impaired people using Text To Speech (TTS) technology. We focus on helping people with visual disabilities to plan their itinerary, to comprehend a map by querying a computer, and to obtain information about the surrounding environment in a mono-modal human-computer dialogue.
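The mapping from folksonomic tags to speech can be sketched as below, with a tiny illustrative ontology and pyttsx3 standing in for whatever TTS engine the system actually uses.

```python
import pyttsx3

ONTOLOGY = {                      # OSM key/value pair -> spoken concept
    ("amenity", "pharmacy"): "a pharmacy",
    ("highway", "crossing"): "a pedestrian crossing",
    ("amenity", "bench"):    "a bench",
}

def describe(tags, distance_m, direction):
    # Fall back to a generic phrase when no ontology entry matches
    concept = next((ONTOLOGY[(k, v)] for k, v in tags.items()
                    if (k, v) in ONTOLOGY), "an unknown place")
    return f"There is {concept} {distance_m} meters to your {direction}."

sentence = describe({"amenity": "pharmacy"}, 40, "left")
engine = pyttsx3.init()
engine.say(sentence)              # mono-modal audio output
engine.runAndWait()
```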

Keywords: TTS, ontology, OpenStreetMap, visually impaired

Procedia PDF Downloads 295