Search results for: privacy and data protection law
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26410

24940 Prosperous Digital Image Watermarking Approach by Using DCT-DWT

Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar

Abstract:

Every day, vast amounts of data are embedded in digital media or distributed over the internet. These data are so widely distributed that they can be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information, known as a watermark, into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated intense research in the field of watermarking. A watermark is a form, image, or text impressed onto paper that provides evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this work, we concentrate on embedding an invisible watermark in an image using a combined DCT-DWT approach. The main consideration for any watermarking scheme is its robustness to various attacks.
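
The DWT stage of such a scheme can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it embeds watermark bits additively into the high-frequency (HH) subband of a one-level Haar DWT and recovers them by comparison with the original image. The DCT step of the full DCT-DWT scheme is omitted for brevity, and all function names are hypothetical.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2D Haar DWT: returns (LL, LH, HL, HH) subbands."""
    a = img[0::2, 0::2].astype(float)
    b = img[0::2, 1::2].astype(float)
    c = img[1::2, 0::2].astype(float)
    d = img[1::2, 1::2].astype(float)
    return ((a + b + c + d) / 2.0, (a + b - c - d) / 2.0,
            (a - b + c - d) / 2.0, (a - b - c + d) / 2.0)

def haar_idwt2(ll, lh, hl, hh):
    """Inverse of haar_dwt2 (perfect reconstruction)."""
    h, w = ll.shape
    img = np.empty((2 * h, 2 * w))
    img[0::2, 0::2] = (ll + lh + hl + hh) / 2.0
    img[0::2, 1::2] = (ll + lh - hl - hh) / 2.0
    img[1::2, 0::2] = (ll - lh + hl - hh) / 2.0
    img[1::2, 1::2] = (ll - lh - hl + hh) / 2.0
    return img

def embed_watermark(img, bits, alpha=4.0):
    """Additively embed bits (as +/- alpha) into the first HH coefficients."""
    ll, lh, hl, hh = haar_dwt2(img)
    flat = hh.ravel().copy()
    flat[: len(bits)] += alpha * (2.0 * np.array(bits) - 1.0)
    return haar_idwt2(ll, lh, hl, flat.reshape(hh.shape))

def extract_watermark(marked, original, n):
    """Recover n bits by comparing HH coefficients against the original."""
    hh_m = haar_dwt2(marked)[3].ravel()
    hh_o = haar_dwt2(original)[3].ravel()
    return [int(d > 0) for d in (hh_m[:n] - hh_o[:n])]
```

Because the Haar transform reconstructs perfectly, the embedded ±alpha offsets survive the round trip and the bits can be read back exactly; robustness to attacks would depend on the embedding strength and subband chosen.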

Keywords: watermarking, digital, DCT-DWT, security

Procedia PDF Downloads 413
24939 Machine Learning Data Architecture

Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap

Abstract:

Most companies see increasing adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and vending of model output into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort into building the other components of the architecture, which we do not believe is the best use of their bandwidth. We propose a system architecture, built from AWS services, that brings industry best practices to managing the workflow and simplifies model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists’ work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.

Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning

Procedia PDF Downloads 54
24938 Wearable System for Prolonged Cooling and Dehumidifying of PPE in Hot Environments

Authors: Lun Lou, Jintu Fan

Abstract:

While personal protective equipment (PPE) prevents healthcare personnel from being exposed to harmful surroundings, it creates a barrier to the dissipation of body heat and perspiration, leading to severe heat stress during prolonged exposure, especially in hot environments. Most existing personal cooling strategies struggle to deliver effective cooling over a long duration while remaining lightweight. This work aimed to develop a lightweight (<1.0 kg) and inexpensive wearable air cooling and dehumidifying system (WCDS) that can be worn underneath protective clothing and provide 50 W of mean cooling power for more than 5 hours at an environmental temperature of 35°C without compromising the protection of the PPE. In the WCDS, blowers activate an internal air circulation inside the clothing microclimate, which does not interfere with the protection of the PPE. A purpose-designed air cooling and dehumidifying chamber reduces the air temperature and humidity inside the protective clothing. The cooled, dried air is then supplied to the upper chest and back through a branching tubing system for personal cooling. A detachable ice cooling unit, applied from the outside of the PPE, extracts heat from the clothing microclimate. This combination allows convenient replacement of the cooling unit to refresh the cooling effect, providing continuous cooling without taking off the PPE or adding excessive weight. A preliminary thermal manikin test showed that the WCDS reduced the microclimate temperature inside the PPE by about 8°C on average for 60 minutes at environmental temperatures of 28.0°C and 33.5°C. Replacing the ice cooling unit every hour maintains this cooling effect, while the longest operating duration is set by the blowers' battery, which lasts about 6 hours.
This design is especially helpful for PPE users, such as healthcare workers in infectious and hot environments, where continuous cooling and dehumidifying are needed but changing protective clothing may increase the risk of infection. The new WCDS will not only improve the thermal comfort of PPE users but can also extend their safe working duration.

Keywords: personal thermal management, heat stress, PPE, health care workers, wearable device

Procedia PDF Downloads 66
24937 A Comparison of Image Data Representations for Local Stereo Matching

Authors: André Smith, Amr Abdel-Dayem

Abstract:

The stereo matching problem, though studied for several decades, remains an active area of research. The goal is to find correspondences between elements in a pair of stereoscopic images; from these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advances in this field span techniques from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function used to evaluate the likelihood of a particular match between points in each image. While the cost is, at its core, based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how well each representation reduces the cost of the correct correspondence relative to other possible matches.
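
As a concrete example of the kind of cost function referred to above, the sketch below implements a sum-of-absolute-differences (SAD) cost over square patches and a winner-takes-all disparity search. This is an illustrative local-matching baseline, not the representations or cost functions compared in the paper.

```python
import numpy as np

def sad_cost(left, right, x, y, d, w=1):
    """Sum of absolute differences between the (2w+1)^2 patch centred at
    (y, x) in the left image and the patch at (y, x-d) in the right image."""
    pl = left[y - w:y + w + 1, x - w:x + w + 1].astype(float)
    pr = right[y - w:y + w + 1, x - d - w:x - d + w + 1].astype(float)
    return float(np.abs(pl - pr).sum())

def best_disparity(left, right, x, y, max_d, w=1):
    """Winner-takes-all: the disparity with the lowest matching cost."""
    costs = [sad_cost(left, right, x, y, d, w) for d in range(max_d + 1)]
    return int(np.argmin(costs))
```

For the correct disparity, the two patches coincide and the cost drops to its minimum; the paper's question is how strongly different pixel representations separate this minimum from the costs of wrong matches.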

Keywords: colour data, local stereo matching, stereo correspondence, disparity map

Procedia PDF Downloads 361
24936 Photoprotective and Antigenotoxic Effects of a Mixture of Posoqueria latifolia Flower Extract and Kaempferol Against Ultraviolet B Radiation

Authors: Silvia Ximena Barrios, Diego Armando Villamizar Mantilla, Raquel Elvira Ocazionez, Elena E. Stashenko, María Pilar Vinardell, Jorge Luis Fuentes

Abstract:

Introduction: Skin overexposure to solar radiation is a serious public health concern because of its potential carcinogenicity. Preventive protection strategies using photoprotective agents are therefore critical to counteract the harmful effects of solar radiation. Plants may be a source of photoprotective compounds that inhibit the cellular mutations involved in skin cancer initiation. This work evaluated the photoprotective and antigenotoxic effects against ultraviolet B (UVB) radiation of a mixture of Posoqueria latifolia flower extract and kaempferol (MixPoKa). Methods: The photoprotective efficacy of MixPoKa (P. latifolia flower extract 250 μg/ml and kaempferol 349.5 μM) was evaluated using in vitro indices such as the sun protection factor (SPFᵢₙ ᵥᵢₜᵣₒ) and the critical wavelength (λc). The photostability (Eff) of MixPoKa at human minimal erythema doses (MED), according to the Fitzpatrick skin scale, was also estimated. Cytotoxicity and genotoxicity/antigenotoxicity were studied in MRC5 human fibroblasts using the trypan blue exclusion and comet assays, respectively. The kinetics of genetic damage repair after irradiation, in the presence and absence of MixPoKa, was also evaluated. Results: The UV absorbance of MixPoKa was high across the spectral bands between 200 and 400 nm. The UVB photoprotection efficacy of MixPoKa was high (SPFᵢₙ ᵥᵢₜᵣₒ = 25.70 ± 0.06), showed a wide photoprotection spectrum (λc = 380 ± 0), and proved photostable (Eff = 92.3–100.0%). MixPoKa was neither cytotoxic nor genotoxic in MRC5 human fibroblasts but presented a significant antigenotoxic effect against UVB radiation. Additionally, MixPoKa stimulated DNA repair post-irradiation. The potential of this phytochemical mixture as a sunscreen ingredient is discussed. Conclusion: MixPoKa showed a significant antigenotoxic effect against UVB radiation and stimulated DNA repair after irradiation; it could be used as an ingredient in a sunscreen cream.

Keywords: flower extract, photoprotection, antigenotoxicity, cytotoxicity, genotoxicity

Procedia PDF Downloads 68
24935 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of high-volume, highly varied data from new technologies poses challenges for generating business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods to develop their business. Consequently, modern decentralized data-management environments rely on a distributed computing paradigm. With data stored in highly distributed systems, implementing distributed data-mining techniques is a challenge: the aim is to gather knowledge from every domain and from all datasets stemming from distributed resources. As agent technologies offer significant help in managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business-intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.

Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)

Procedia PDF Downloads 420
24934 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design

Authors: Qing K. Zhu

Abstract:

Very large scale integration (VLSI) design has become very complex due to the continuous integration of millions of gates in one chip, following Moore's law. Designers encounter numerous report files during design iterations with timing and noise analysis tools. This paper presents our work using data mining techniques, combined with HTML tables, to extract and represent critical timing and noise data. When this data-mining tool is applied to real designs, running speed matters; the software therefore employs table look-up techniques in its implementation, achieving reasonable running speed in performance tests. We added several advanced features for the application in one industry chip design.
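
The table look-up idea can be illustrated as follows. The report format here (one "endpoint slack" pair per line) is hypothetical, invented for the example, since timing-report formats vary between EDA tools; the point is that a parsed dictionary gives O(1) look-up per endpoint during later queries.

```python
def build_slack_table(report_lines):
    """Parse hypothetical 'endpoint slack_ns' report lines into a dict,
    so repeated queries during design iterations are O(1) look-ups."""
    table = {}
    for line in report_lines:
        parts = line.split()
        if len(parts) == 2:
            endpoint, slack = parts
            table[endpoint] = float(slack)
    return table

def worst_paths(table, n):
    """Return the n most timing-critical endpoints (lowest slack first)."""
    return sorted(table.items(), key=lambda kv: kv[1])[:n]
```

The `worst_paths` output maps directly onto rows of an HTML summary table of critical paths.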

Keywords: VLSI design, data mining, big data, HTML forms, web, EDA, timing, noise

Procedia PDF Downloads 241
24933 Introduction of Electronic Health Records to Improve Data Quality in Emergency Department Operations

Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe

Abstract:

In its simplest form, data quality can be defined as 'fitness for use'; it is a multi-dimensional concept. Emergency departments (EDs) require information to treat patients and are, at the same time, the primary source of information regarding accidents, injuries, emergencies, etc. They are also the starting point of various patient registries, databases, and surveillance systems. This interventional study was carried out to improve data quality at the ED of the National Hospital of Sri Lanka (NHSL), the premier trauma care centre in Sri Lanka, by introducing an e-health solution. The study consisted of three components. A research study assessed the quality of data in relation to five selected dimensions of data quality, namely accuracy, completeness, timeliness, legibility, and reliability. The intervention was to develop and deploy an electronic emergency department information system (eEDIS). Post-intervention assessment confirmed that all five dimensions of data quality had improved, with the most significant improvements in the accuracy and timeliness dimensions.

Keywords: electronic health records, electronic emergency department information system, emergency department, data quality

Procedia PDF Downloads 261
24932 Data Presentation of Lane-Changing Events Trajectories Using HighD Dataset

Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba

Abstract:

We present a descriptive analysis of lane-changing events on multi-lane roads. The data come from the Highway Drone Dataset (HighD), which contains microscopic vehicle trajectories recorded on highways. This paper describes and analyses the role of the different parameters and their significance. Using the HighD data, we aim to find the most frequent reasons that motivate drivers to change lanes. We use the programming language R to process the data. We analyse the involvement and relationships of the variables describing the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration, according to the class of the vehicle (car or truck) and the manoeuvre undertaken (overtaking or falling back).
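
The surrounding-vehicle variables named above are straightforward to derive from trajectory records. The sketch below (in Python rather than the R used in the study; argument names are illustrative, not HighD's exact column names) computes distance headway, speed difference, and time gap between the ego vehicle and a lead vehicle.

```python
def surrounding_features(ego_x, ego_v, lead_x, lead_v):
    """Ego-to-lead features from one trajectory frame.
    Positions in metres along the road, speeds in m/s."""
    distance = lead_x - ego_x          # distance headway
    speed_diff = lead_v - ego_v        # positive: lead pulling away
    time_gap = distance / ego_v if ego_v > 0 else float("inf")
    return {"distance": distance, "speed_diff": speed_diff, "time_gap": time_gap}
```

A shrinking time gap to a slower lead vehicle is a typical precursor of an overtaking manoeuvre, which is the kind of pattern the descriptive analysis looks for.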

Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process

Procedia PDF Downloads 247
24931 Measuring Organizational Resiliency for Flood Response in Thailand

Authors: Sudha Arlikatti, Laura Siebeneck, Simon A. Andrew

Abstract:

The objective of this research is to measure organizational resiliency through four attributes, namely rapidity, redundancy, resourcefulness, and robustness, and to provide recommendations for resiliency building in flood-risk communities. The research was conducted in Thailand following the severe floods of 2011 triggered by Tropical Storm Nock-ten. The floods lasted over eight months starting in June 2011, affecting 65 of the country's 76 provinces and over 12 million people. Funding from a US National Science Foundation grant was used to collect ephemeral data in rural (Ayutthaya), suburban (Pathum Thani), and urban (Bangkok) provinces of Thailand. Semi-structured face-to-face interviews were conducted in Thai with 44 contacts from public, private, and non-profit organizations, including universities, schools, automobile companies, vendors, tourist agencies, monks from temples, faith-based organizations, and government agencies. Multiple triangulations were used to analyze the data: selective themes identified from the qualitative data were validated with quantitative data and news media reports. This gave a more comprehensive view of how organizations in different geographic settings varied in their understanding of what enhanced or hindered their resilience and, consequently, their speed and capacity to respond. The findings suggest that the urban province of Bangkok scored highest in resourcefulness, rapidity of response, robustness, and ability to rebound. This is not surprising considering that it is the country's capital and the seat of the government, economic, military, and tourism sectors. However, contrary to expectations, all 44 respondents noted that the rural province of Ayutthaya was the fastest to recover among the three.
Its organizations scored high on redundancy and rapidity of response due to the strength of social networks, a flood disaster subculture born of annual flooding, and the help provided by monks from temples and faith-based organizations. Organizations in the suburban community of Pathum Thani scored lowest on rapidity of response and resourcefulness due to limited and ambiguous warnings, lack of prior flood experience, and controversy over government flood protection works, such as sandbagging, favouring the capital city of Bangkok over them. Such a micro-level examination of organizational resilience in rural, suburban, and urban areas of a country, through a mixed-methods study, has merit in producing a nuanced understanding of the importance of disaster subcultures and religious norms for resilience. It can help refocus attention on the strengths of social networks and social capital for flood mitigation.

Keywords: disaster subculture, flood response, organizational resilience, Thailand floods, religious beliefs and response, social capital and disasters

Procedia PDF Downloads 148
24930 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photon Beams in a Varian Linear Accelerator

Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain

Abstract:

Objective: The main purpose of this study is to compare the percent depth doses (PDDs) and in-plane and cross-plane profiles of the Varian golden beam data to measured data for 6 and 18 MV photons, for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning a treatment planning system requires extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles, and dose-rate tables for open and wedged fields into the treatment planning system, enabling it to calculate monitor units (MUs) and dose distributions. Varian offers a generic set of beam data as reference data but does not recommend it for clinical use. In this study, we compared the generic beam data with measured beam data to evaluate whether the generic data are reliable enough for clinical purposes. Methods and Material: PDDs and profiles of open and wedged fields, for different field sizes and at different depths, were measured as per Varian's algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom, a Semiflex ionization chamber, and MEPHYSTO software. The online available Varian golden beam data were compared with the measured data to evaluate their accuracy for commissioning the Eclipse treatment planning system. Results: The deviation between measured and golden beam data was at most 2%. In the PDDs, the deviation increases at deeper depths; similarly, the profiles show increasing deviation at larger field sizes and greater depths. Conclusion: The study shows that the percentage deviation between measured and golden beam data is within the acceptable tolerance and the golden beam data can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.
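
The pass/fail comparison described above reduces to a point-by-point percent deviation against the golden values. A minimal sketch (illustrative only; clinical commissioning is done with dedicated QA software, and the 2% tolerance here simply mirrors the figure quoted in the abstract):

```python
import numpy as np

def percent_deviation(measured, golden):
    """Point-by-point percent deviation of measured data, relative to
    the corresponding golden beam value."""
    m = np.asarray(measured, dtype=float)
    g = np.asarray(golden, dtype=float)
    return 100.0 * (m - g) / g

def within_tolerance(measured, golden, tol=2.0):
    """True if every point deviates by no more than tol percent."""
    return bool(np.all(np.abs(percent_deviation(measured, golden)) <= tol))
```

The same check applies unchanged to PDD curves sampled versus depth and to profiles sampled across the field.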

Keywords: percent depth dose, flatness, symmetry, golden beam data

Procedia PDF Downloads 481
24929 Variable-Fidelity Surrogate Modelling with Kriging

Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans

Abstract:

Variable-fidelity surrogate modelling offers an efficient way to approximate function data available at multiple degrees of accuracy, each with a different computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, enhanced with gradient data of different degrees of accuracy, are constructed. These gradient-enhanced Kriging surrogate models are then strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data are useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile to incorporate gradient data of varying degrees of accuracy.
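
A full recursive CoKriging implementation is beyond the scope of an abstract, but the underlying multi-fidelity idea, correcting a cheap low-fidelity model by a scale factor ρ and an additive discrepancy δ(x), can be sketched with ordinary least squares standing in for the Kriging models. This is an illustrative simplification of the multi-fidelity structure, not the paper's gradient-enhanced method.

```python
import numpy as np

def fit_multifidelity(x, y_lo, y_hi, deg=2):
    """Fit y_hi ≈ rho * y_lo + delta(x), with delta a polynomial in x.
    Least squares replaces the Kriging/CoKriging machinery for brevity."""
    # Design matrix columns: [y_lo, 1, x, x^2, ...]
    A = np.column_stack([y_lo] + [x**k for k in range(deg + 1)])
    coef, *_ = np.linalg.lstsq(A, y_hi, rcond=None)
    return coef[0], coef[1:]          # rho, polynomial coefficients of delta

def predict_high(x, y_lo, rho, delta):
    """High-fidelity prediction from low-fidelity values plus correction."""
    return rho * y_lo + sum(c * x**k for k, c in enumerate(delta))
```

When the high-fidelity data really are a scaled-and-shifted version of the low-fidelity model, the fit recovers ρ and δ exactly; CoKriging generalises this by modelling δ(x) as a Gaussian process, optionally conditioned on gradients.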

Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients

Procedia PDF Downloads 544
24928 Robust Barcode Detection with Synthetic-to-Real Data Augmentation

Authors: Xiaoyan Dai, Hsieh Yisan

Abstract:

Barcode processing of captured images is a significant challenge, as different shooting conditions can result in very different barcode appearances. This paper proposes deep-learning-based barcode detection using synthetic-to-real data augmentation. We first augment the barcodes themselves; we then augment the images containing them to generate a large variety of data close to actual shooting environments. Comparisons with previous work and evaluations on our own data show that this approach achieves state-of-the-art performance on various real images. In addition, the system uses hybrid resolution for the barcode “scan” and is applicable to real-time applications.
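
The kind of photometric augmentation alluded to above can be sketched with basic NumPy operations: random brightness scaling, additive sensor noise, and a small blur. This is an illustrative stand-in; the paper's actual augmentation pipeline (and its geometric distortions) is not reproduced here.

```python
import numpy as np

def augment_image(img, rng):
    """Photometric augmentation for a grayscale image: brightness jitter,
    Gaussian sensor noise, and a 1x3 box blur (circular at the borders,
    for simplicity). Parameters are arbitrary illustrative choices."""
    out = img.astype(float) * rng.uniform(0.6, 1.4)   # illumination change
    out += rng.normal(0.0, 5.0, img.shape)            # sensor noise
    out = (np.roll(out, 1, axis=1) + out + np.roll(out, -1, axis=1)) / 3.0
    return np.clip(out, 0.0, 255.0)
```

Applying many such random draws to each rendered synthetic barcode yields training data whose appearance variation approaches real shooting conditions.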

Keywords: barcode detection, data augmentation, deep learning, image-based processing

Procedia PDF Downloads 153
24927 Child Marriages in Africa: Using a Rights-Based Approach to Protect the Girl-Child in Nigeria

Authors: Foluke Abimbola

Abstract:

The United Nations Convention on the Rights of the Child has been signed and ratified by most countries due to concern about the abuses and crimes committed against children both locally and internationally. It is a shame that, in view of the peculiar hardships experienced by children today, the natural right to childhood has to be protected by a vast array of laws and international conventions. 194 countries have so far acceded to and ratified the Convention on the Rights of the Child, and some countries, such as Nigeria, have enacted the convention as domestic law; yet child abuse is still rampant, not only in Nigeria but all over the world. In Nigeria, the Child Rights Act was passed into law in 2003, with provisions similar to those of the UN convention. Despite the age of marriage provided in the Child Rights Act 2003, many communities still practise child marriage to the detriment of the girl-child; cases abound of children having to withdraw from school as a result of these premature marriages. Unfortunately, the Constitution of the Federal Republic of Nigeria 1999 appears indirectly to support early marriage for girls: section 29 (4) states that a woman who is married is deemed to be of full age, whereas 'full age' as a general term in the Constitution means 18 years old and above. Section 29 (4) may thus be interpreted to mean that a girl of 12, if married, is deemed to be of 'full age'. In view of these discrepancies, which continue to justify this unwholesome practice, this paper proffers solutions to this unlawful act and makes recommendations to existing institutions, using a rights-based approach, on how to prevent or substantially reduce the practice. A comparative analysis with other African countries is adopted in order to identify effective policies that may be implemented to protect these girls.
This paper thus examines the issue of child marriage, which remains rampant in African countries, particularly Nigeria, and which also affects the girl-child's right to an education. Such children need special protection, and this paper recommends ways in which state institutions, particularly in Nigeria, may introduce policies to curb incidences of child marriage and child sexual abuse, while proffering strategies for the prevention of these crimes.

Keywords: child abuse, child marriages, child rights, constitutions, the girl-child

Procedia PDF Downloads 127
24926 Analysis of Delivery of Quad Play Services

Authors: Rahul Malhotra, Anurag Sharma

Abstract:

Fiber-based access networks can deliver the performance needed to support the increasing demand for high-speed connections. One technology that has emerged in recent years is the passive optical network (PON). This paper demonstrates the simultaneous delivery of triple-play services (data, voice, and video) and presents a comparative investigation of the suitability of various data rates. It is shown that as the data rate increases, the number of users that can be accommodated decreases due to the increase in bit error rate.

Keywords: FTTH, quad play, play service, access networks, data rate

Procedia PDF Downloads 395
24925 Protection and Immune Responses of DNA Vaccines Targeting Virulence Factors of Streptococcus iniae in Nile Tilapia (Oreochromis niloticus)

Authors: Pattanapon Kayansamruaj, Ha Thanh Dong, Nopadon Pirarat, Channarong Rodkhum

Abstract:

Streptococcus iniae (SI) is a devastating bacterial pathogen causing heavy mortality in farmed fish. Commercial bacterin vaccines have reportedly failed: outbreaks caused by new serotypes of SI emerged in farms after vaccination and subsequently caused severe losses. In the present study, we attempted to develop effective DNA vaccines against SI infection using Nile tilapia (Oreochromis niloticus) as an animal model. Two monovalent DNA vaccines were constructed by inserting the coding sequences of the cell-wall-associated virulence factor genes eno (α-enolase) and mtsB (hydrophobic membrane protein) into the cytomegalovirus expression vector pCI-neo. In the animal trial, 30 g Nile tilapia were injected intramuscularly with 15 µg of each vaccine (the mock-vaccine group was injected with naked pCI-neo) and maintained for 35 days before being challenged with pathogenic SI at a dose of 10⁷ CFU/fish. At 13 days post-challenge, the relative percent survival of the pEno, pMtsB, and mock-vaccine groups was 57%, 45%, and 27%, respectively. The expression levels of immune-response-associated genes, namely IL1β, TNF-α, TGF-β, COX2, IL-6, IL-12, and IL-13, were investigated in the spleens of experimental animals at 7 days post-vaccination (PV) and 7 days post-challenge (PC) using quantitative RT-PCR. In general, at 7 days PV the pEno-vaccinated group exhibited the highest up-regulation (1.7 to 2.9 fold) of every gene but TGF-β, compared to the pMtsB and mock-vaccine groups. At 7 days PC, the pEno group showed significant up-regulation (1.4 to 8.5 fold) of immune-related genes, similar to the mock-vaccine group, while the pMtsB group had the lowest up-regulation (0.7 to 3.3 fold). In summary, this study indicates that the pEno and pMtsB vaccines can elicit immune responses in fish, and the magnitude of gene expression at 7 days PV was consistent with the protection level conferred by each vaccine.
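
Relative percent survival (RPS) figures like those quoted above are conventionally computed from group mortalities with Amend's formula; a one-line sketch (the underlying group mortalities are not restated in the abstract, so the test values below are illustrative):

```python
def relative_percent_survival(mortality_vaccinated, mortality_control):
    """Amend's RPS: percent reduction in mortality of a vaccinated group
    relative to unvaccinated controls. Mortalities are fractions in (0, 1]."""
    return (1.0 - mortality_vaccinated / mortality_control) * 100.0
```

An RPS of 57%, for example, means the vaccinated group suffered 57% less mortality than the control group over the challenge period.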

Keywords: gene expression, DNA vaccine, Nile tilapia, Streptococcus iniae

Procedia PDF Downloads 322
24924 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network

Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson

Abstract:

The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods of analysis of the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come with a cost. Little research has been undertaken on how to specify optimally what data to capture, transmit, process, and store at various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the required attributes and classification to take manufacturing digital data from various sources to determine the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision making on which dataset should be processed at the ‘edge’ and what to send to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
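
A fast-and-frugal tree, as mentioned above, answers a small number of yes/no questions in a fixed order and stops at the first one that discriminates. The sketch below is hypothetical: the attributes and thresholds are invented for illustration and are not taken from the paper's classification framework.

```python
def place_dataset(size_mb, latency_critical, needs_history):
    """Fast-and-frugal tree: each question either decides the placement
    or passes the dataset to the next question."""
    if latency_critical:
        return "edge"    # real-time machine decisions cannot wait for the cloud
    if needs_history:
        return "cloud"   # long-horizon analytics needs central storage
    if size_mb > 100:
        return "edge"    # pre-process large raw streams locally to save bandwidth
    return "cloud"       # small, non-urgent data is cheap to ship upstream
```

The appeal of this heuristic is that it is transparent and cheap to evaluate per dataset, which matters when the placement decision itself must run at the edge.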

Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0

Procedia PDF Downloads 169
24923 Denoising Transient Electromagnetic Data

Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen

Abstract:

Transient electromagnetic (TEM) data play a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates in different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. For each mode, we use a signal-averaging technique to increase the signal-to-noise ratio. Additionally, we apply the wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves data quality, notably suppressing severe noise at late times. The resulting denoised data exhibit a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
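
The two denoising steps described above, stacking repeated transients and thresholding away noise-dominated samples, can be illustrated in a few lines. For brevity this sketch applies soft thresholding directly to the averaged decay; the study applies it to wavelet coefficients instead.

```python
import numpy as np

def stack_average(repeats):
    """Average N repeated transient decays; uncorrelated noise shrinks
    by roughly a factor of sqrt(N)."""
    return np.mean(np.asarray(repeats, dtype=float), axis=0)

def soft_threshold(signal, thresh):
    """Shrink samples toward zero; values below the threshold vanish,
    which is how small noise-dominated coefficients are suppressed."""
    s = np.asarray(signal, dtype=float)
    return np.sign(s) * np.maximum(np.abs(s) - thresh, 0.0)
```

In a wavelet-domain workflow, `soft_threshold` would be applied to the detail coefficients before the inverse transform, preserving the slowly varying decay while removing late-time noise.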

Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform

Procedia PDF Downloads 77
24922 Attribute Analysis of Quick Response Code Payment Users Using Discriminant Non-negative Matrix Factorization

Authors: Hironori Karachi, Haruka Yamashita

Abstract:

Quick response (QR) code payment systems have recently become popular. Many companies are introducing new QR code payment services, and these services compete with each other to increase their number of users. To grow a user base, it is necessary to understand how the demographic information, usage patterns, and value of users differ between services. In this study, we analyse real-world data provided by the Nomura Research Institute, comprising demographic data and usage information for users of two services, LINE Pay and PayPay. Non-negative matrix factorization (NMF) is widely used for analysing and interpreting such matrix-shaped data; however, the target data contain missing values. We therefore apply EM-algorithm NMF (EMNMF), which completes the unknown values while uncovering the features of the data. Moreover, to compare the NMF analyses of two matrices, discriminant NMF (DNMF) reveals the differences in user features between them. In this study, we combine EMNMF and DNMF to analyse the target data and, as an interpretation, show the differences in user features between LINE Pay and PayPay.
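
The missing-data NMF described above can be approximated with mask-weighted multiplicative updates, in which unobserved entries simply drop out of the update, a common EM-flavoured treatment. This sketch is not the authors' exact EMNMF, and the discriminant (DNMF) step is omitted.

```python
import numpy as np

def masked_nmf(X, mask, rank, iters=2000, seed=0):
    """Factor X ≈ W @ H using only entries where mask == 1.
    Mask-weighted multiplicative updates for the Frobenius objective."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, rank)) + 0.1
    H = rng.random((rank, m)) + 0.1
    Xm = X * mask                     # unobserved entries zeroed out
    eps = 1e-9                        # guards against division by zero
    for _ in range(iters):
        W *= (Xm @ H.T) / ((mask * (W @ H)) @ H.T + eps)
        H *= (W.T @ Xm) / (W.T * 1.0 @ (mask * (W @ H)) + eps)
    return W, H
```

After fitting, `W @ H` also fills in the masked (missing) entries, and the rows of `H` can be read as latent user-feature profiles.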

Keywords: data science, non-negative matrix factorization, missing data, quality of services

Procedia PDF Downloads 121
24921 Developing Guidelines for Public Health Nurse Data Management and Use in Public Health Emergencies

Authors: Margaret S. Wright

Abstract:

Background/Significance: During many recent public health emergencies and disasters, public health nursing data have been missing or delayed, potentially impairing decision-making and response. Data used as evidence for decision-making in response, planning, and mitigation have been erratic and slow to arrive, reducing the capacity to respond. Methodology: Applying best practices in data management and data use in public health settings, guided by the concepts outlined in 'Disaster Standards of Care' models, leads to recommendations for a model of best practices in data management and use by public health nurses in public health disasters and emergencies. Because the 'patient' in a public health disaster or emergency is the community (local, regional, or national), guidelines for patient documentation are incorporated into the recommendations. Findings: Using this model, public health nurses could better plan how to prepare for, respond to, and mitigate disasters in their communities, and better participate in decision-making in all three phases, bringing public health nursing data to the discussion as part of the evidence base for decision-making.

Keywords: data management, decision making, disaster planning documentation, public health nursing

Procedia PDF Downloads 209
24920 The Special Testimony as a Methodology for Social Workers to Ensure the Rights of Children and Adolescents Who Are Victims of Sexual Violence

Authors: Natany Rodrigues De Carvalho, Denise Bomtempo Birche De Carvalho

Abstract:

The purpose of this study is to analyze the special testimony as a methodology for social workers to ensure the rights of children and adolescents who are victims of sexual violence. The specific objectives are: a) to contextualize, through the specialized literature, the social history of childhood and adolescence; b) to investigate, in the scientific literature, sexual violence against children and adolescents as an analytical category; c) to identify, with social workers, whether the special testimony contributes to the defense of children and adolescents. To meet these objectives, we used qualitative research along three complementary axes: a) participant observation through insertion in the research field (supervised internships I and II); b) a survey of the literature on the subject; c) semi-structured interviews with social workers of the TJDFT. Content analysis was used to systematize and interpret the collected data. The results were organized into three chapters: a) a literature review contextualizing the social history of childhood and adolescence up to the present; b) sexual violence against children and adolescents and its categories of analysis; c) an understanding of the special testimony in the Federal District and Territories in guaranteeing the rights of children and adolescents, identifying its main points from the perspective of social workers. The results showed how a lack of interdisciplinarity in the special testimony can lead to incomplete protection of child and adolescent victims of sexual violence.

Keywords: childhood and adolescence, sexual violence, special testimony, social work

Procedia PDF Downloads 308
24919 An Embarrassingly Simple Semi-supervised Approach to Increase Recall in Online Shopping Domain to Match Structured Data with Unstructured Data

Authors: Sachin Nagargoje

Abstract:

Complete labeled data is often difficult to obtain in practical scenarios, and even when it can be obtained, its quality is often in question. In the shopping vertical, the input data are offers supplied by advertisers, with varying quality of information. In this paper, the author investigates a very simple semi-supervised learning approach to increase the recall of unhealthy offers (those with badly written titles or partial product details) in the shopping vertical domain. The semi-supervised method improved recall in the Smart Phone category by 30% in A/B testing on 10% of traffic and increased the year-over-year (YoY) number of impressions per month by 33% in production. It also produced a significant increase in revenue, which cannot be publicly disclosed.
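A simple self-training recipe of the kind described can be illustrated as follows. This is a hedged sketch under assumed details (a nearest-centroid base classifier and a softmax-over-distances confidence), not the production system: confidently pseudo-labelled offers are folded back into the training set each round.

```python
import numpy as np

def self_train(X_lab, y_lab, X_unlab, threshold=0.9, max_rounds=10):
    """Minimal self-training loop: fit a nearest-centroid classifier on
    the labelled offers, pseudo-label unlabelled offers whose confidence
    exceeds the threshold, fold them into the training set, repeat.
    Confidence is a softmax over negative centroid distances."""
    X_lab, y_lab = np.asarray(X_lab, float), np.asarray(y_lab)
    pool = np.asarray(X_unlab, float)
    for _ in range(max_rounds):
        if len(pool) == 0:
            break
        classes = np.unique(y_lab)
        centroids = np.stack([X_lab[y_lab == c].mean(axis=0) for c in classes])
        d = np.linalg.norm(pool[:, None, :] - centroids[None], axis=2)
        s = np.exp(-(d - d.min(axis=1, keepdims=True)))  # stable softmax
        p = s / s.sum(axis=1, keepdims=True)
        conf, pred = p.max(axis=1), p.argmax(axis=1)
        keep = conf >= threshold
        if not keep.any():
            break
        X_lab = np.vstack([X_lab, pool[keep]])
        y_lab = np.concatenate([y_lab, classes[pred[keep]]])
        pool = pool[~keep]
    return X_lab, y_lab
```

Recall improves because offers the initial labelled set missed are swept in wherever the model is already confident, at the cost of some label noise if the threshold is set too low.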

Keywords: semi-supervised learning, clustering, recall, coverage

Procedia PDF Downloads 112
24918 Genodata: The Human Genome Variation Using BigData

Authors: Surabhi Maiti, Prajakta Tamhankar, Prachi Uttam Mehta

Abstract:

Since the completion of the Human Genome Project, there has been an unparalleled escalation in the sequencing of genomic data. The project was a first major leap in the field of medical research, especially genomics, and won accolades for its use of Big Data, a concept previously applied mainly to generating business value. Big Data involves data sets on the order of terabytes, petabytes, or exabytes, which were traditionally managed with spreadsheets and relational database systems. Such voluminous data made processing tedious and time-consuming, so a stronger framework, Hadoop, was introduced to genetic science to make data processing faster and more efficient. This paper focuses on Spark, which is gaining momentum with the advancement of Big Data technologies, and on cloud storage as an effective medium for storing both the large data sets generated by genetic research and the result sets produced by Spark analysis.

Keywords: human genome project, Bigdata, genomic data, SPARK, cloud storage, Hadoop

Procedia PDF Downloads 248
24917 Creating Legitimate Expectations in International Energy Investments: Role of the Stability Provisions

Authors: Rahmi Kopar

Abstract:

The legitimate expectations principle is considered one of the most dominant elements of the fair and equitable treatment standard, which is today's most relied-upon treaty standard. Since its use by arbitral tribunals is relatively new, the contours of the legitimate expectations concept under investment treaty law have not yet been precisely defined, and views in both arbitral awards and scholarly writing about its limits and use remain fragmented, even though the principle is 'firmly rooted in arbitral practice.' International energy investments, owing to their characteristics, are particularly exposed to certain types of risk, especially political risk. Several mechanisms therefore exist to protect an energy investment against those risks, and stabilization is one of these investment protection methods. Stability provisions can be found in domestic legislation, in contractual clauses, or in separate legal stability agreements. This paper starts by examining the roots of the contentious concept of legitimate expectations, with reference to its application in the domestic legal systems from which the doctrine was transplanted into investment treaty law. The paper then turns to investment treaty law and analyses the main contours of the doctrine as understood and applied by arbitral tribunals. The question of what gives rise to an investor's legitimate expectations is answered mainly by three categories of sources: the general legal framework prevalent in the host state, representations made by the officials or organs of the host state, and contractual commitments. However, there is no unanimity among arbitral tribunals and scholars about the form these sources should take. At this point, the study discusses the sources of stability provisions and the effect of stability provisions found in these various legal sources on creating legitimate expectations for the investor.
The main questions discussed in this paper are therefore: a) Do stability provisions found in different legal sources create legitimate expectations on the investor's side? b) If so, what level of legitimate expectation do they create? These questions are answered mainly by reference to investment treaty jurisprudence.

Keywords: fair and equitable treatment standard, international energy investments, investment protection, legitimate expectations, stabilization

Procedia PDF Downloads 207
24916 Corrosion Protective Coatings in Machines Design

Authors: Cristina Diaz, Lucia Perez, Simone Visigalli, Giuseppe Di Florio, Gonzalo Fuentes, Roberto Canziani, Paolo Gronchi

Abstract:

Over the last 50 years, material selection has been one of the main decisions in machine design across industrial applications, owing to the numerous physical, chemical, mechanical, and technological factors involved. Corrosion interacts with all of these factors and affects the life cycle, failure incidents, and lifetime costs of a machine. Corrosion is the deterioration or destruction of metals through reaction with the environment, generally a wet one. In the food, dewatering, concrete, and paper industries, among others, corrosion is an unsolved problem that can alter the characteristics of the final product. Depending on the selected metal, its surface, and its working environment, corrosion prevention may involve changing the metal, applying a coating, cathodic protection, corrosion inhibitors, etc. In the vast majority of situations, the solution is a corrosion-resistant material or, failing that, a corrosion-protection coating. Stainless steels such as AISI 304 and AISI 316 are widely used in machine design for their strength, ease of cleaning, corrosion resistance, and appearance. However, they do not suit every application, and coatings against corrosion are sometimes required, such as paints, galvanizing, chrome plating, or SiO₂, TiO₂, or ZrO₂ coatings. In this work, coatings based on a bilayer of titanium-tantalum, titanium-niobium, titanium-hafnium, or titanium-zirconium were developed by PVD (physical vapor deposition) in a magnetron sputtering configuration, with the aim of reducing corrosion on AISI 304 and AISI 316 substrates and comparing them with titanium alloy substrates. Ti alloys display exceptional corrosion resistance to chlorides, sour and oxidizing acidic media, and seawater, so a Ti alloy (99%) was included for comparison with the coated AISI 304 and AISI 316 stainless steels.
Corrosion tests were conducted with a Gamry instrument under the ASTM G5-94 standard, using electrolytes such as tomato salsa, wine, olive oil, wet compost, a mix of sand and concrete with water, and NaCl, to represent different industrial environments. In all tested environments, the coated AISI 304 and AISI 316 substrates showed improved corrosion resistance compared with the uncoated substrates. Compared with the uncoated Ti alloy substrate, the coated stainless steel substrates in some cases reached a similar current density. Moreover, the titanium-zirconium and titanium-tantalum coatings reduced the current density by more than two orders of magnitude for all substrates studied, including the coated Ti alloy substrates. In conclusion, Ti-Ta, Ti-Zr, Ti-Nb, and Ti-Hf coatings were developed to improve the corrosion resistance of AISI 304 and AISI 316, and after corrosion tests in several industrial environments, the substrates showed improved corrosion resistance. Similar processes were carried out on Ti alloy (99%) substrates. Coated AISI 304 and AISI 316 stainless steel can reach surface corrosion protection similar to that of uncoated Ti alloy (99%), and coated Ti alloy (99%) can further increase its corrosion resistance with these coatings.

Keywords: coatings, corrosion, PVD, stainless steel

Procedia PDF Downloads 151
24915 Spatial Mapping and Change Detection of a Coastal Woodland Mangrove Habitat in Fiji

Authors: Ashneel Ajay Singh, Anish Maharaj, Havish Naidu, Michelle Kumar

Abstract:

Mangroves are foundation species of estuarine land areas. Mangrove patches provide a nursery, a food source, and protection for numerous aquatic, intertidal, and land-based organisms. Mangroves also contribute to coastal protection, maintain water clarity, and are among the largest sinks for blue carbon sequestration. In Pacific Island countries, many coastal communities depend heavily on coastal resources, and mangroves play a key ecological and economic role in structuring the availability of these resources. Fiji has a large mangrove patch located in the Votua area of Ba province. Globally, mangrove coverage continues to decline with changing climatic conditions and anthropogenic activities. Baseline information from wetland maps and time-series change detection is an essential reference for developing effective mangrove management plans: such maps reveal the status of the resource and the effects of anthropogenic activities and climate change. In this study, we used remote sensing and GIS tools for mapping and temporal change detection over a period of more than 20 years in Votua, Fiji, using Landsat imagery. The Landsat program, which started in 1972 as the Earth Resources Technology Satellite, has since acquired millions of images of Earth; this archive makes it possible to map temporal changes in mangrove forests. The mangrove patch comprises the species Rhizophora stylosa, Rhizophora samoensis, Bruguiera gymnorrhiza, Lumnitzera littorea, Heritiera littoralis, Excoecaria agallocha, and Xylocarpus granatum. Change detection analysis revealed a significant reduction in the mangrove patch over the years. This information serves as a baseline for developing and implementing effective management plans for one of Fiji's biggest mangrove patches.
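Landsat-based change detection of the kind described often reduces to band arithmetic. The sketch below is a deliberately simplified stand-in for the authors' workflow: it thresholds NDVI at two dates and flags pixels that switch class; the threshold value and the use of NDVI alone are assumptions.

```python
import numpy as np

def ndvi(red, nir):
    """NDVI from red and near-infrared reflectance bands."""
    red, nir = np.asarray(red, float), np.asarray(nir, float)
    return (nir - red) / (nir + red + 1e-9)

def mangrove_change(red_t1, nir_t1, red_t2, nir_t2, veg_threshold=0.3):
    """Flag pixels whose vegetated/non-vegetated class flips between the
    two dates; the 0.3 NDVI threshold is an assumed, illustrative value."""
    veg1 = ndvi(red_t1, nir_t1) > veg_threshold
    veg2 = ndvi(red_t2, nir_t2) > veg_threshold
    loss = veg1 & ~veg2   # vegetated at t1, bare at t2
    gain = ~veg1 & veg2
    return loss, gain
```

Summing the `loss` and `gain` masks and multiplying by the pixel area gives the hectares of mangrove lost or gained between the two acquisition dates.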

Keywords: climate change, GIS, Landsat, mangrove, temporal change

Procedia PDF Downloads 171
24914 Soil and Environmental Management Awareness as Professional Competency of the Agricultural Extension Officers for Their Plans Implementation

Authors: Muhammad Zafarullah Khan

Abstract:

The competency of Agricultural Extension Officers (AEOs) in soil and environmental management awareness is important for interacting with farming communities working different soil types. A questionnaire was developed to collect data from all AEOs on the present and needed level of each competency, rated on a Likert scale from 1 (very low) to 5 (very high). A wide competency gap was found for the suitability of various soil types for horticultural and agronomic crops and for the reclamation of saline soils. The suitability ranking of soil types for horticultural crops (diff. = 1.21), for agronomic crops (diff. = 1.20), and soil-borne diseases (diff. = 1.19) were the top three competencies where training or improvement is needed. To close this gap, we recommend that the professional qualifications of AEOs be enhanced and that training opportunities be provided, particularly in soil and environmental management awareness. Such training would increase their competency and add highly skilled manpower to the system for sustainable development and environmental protection. It is therefore recommended that AEOs receive pre-service and in-service training in soil and environmental management, equipping them to work effectively with farming communities, raise living standards, and alleviate poverty while protecting the environment.
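The reported gaps are simple differences between needed and present competency means on the 1-5 Likert scale. The snippet below reproduces that arithmetic with illustrative "present" means (only the differences are given in the abstract), so the input numbers are hypothetical.

```python
# Gap analysis on the 1-5 Likert scale: needed minus present mean score
# per competency, ranked descending. The "present" and "needed" means
# below are hypothetical; only the differences appear in the abstract.
competencies = {
    "soil suitability for horticultural crops": (2.90, 4.11),
    "soil suitability for agronomic crops":     (2.90, 4.10),
    "soil-borne diseases":                      (2.90, 4.09),
}
gaps = sorted(((needed - present, name)
               for name, (present, needed) in competencies.items()),
              reverse=True)
for gap, name in gaps:
    print(f"{name}: gap = {gap:.2f}")
```

Ranking by this difference, rather than by the raw "needed" score, is what surfaces the competencies where training would close the largest deficit.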

Keywords: professional competency, agricultural extension officers, soil and environmental management awareness, plans implementation

Procedia PDF Downloads 381
24913 Ontology for a Voice Transcription of OpenStreetMap Data: The Case of Space Apprehension by Visually Impaired Persons

Authors: Said Boularouk, Didier Josselin, Eitan Altman

Abstract:

In this paper, we present a vocal ontology of OpenStreetMap (OSM) data for the apprehension of space by visually impaired people. The platform, based on produsage, gives data producers the freedom to choose the descriptors of geocoded locations. Unfortunately, this freedom, also called folksonomy, complicates subsequent searches of the data. We address this issue with a simple but usable method for extracting data from OSM databases and delivering them to visually impaired people through text-to-speech (TTS) technology. We focus on helping people with visual disabilities plan an itinerary and comprehend a map by querying a computer and receiving information about the surrounding environment in a mono-modal human-computer dialogue.
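One way to tame folksonomy for spoken output is to map free-form OSM tags onto a small controlled vocabulary before handing the text to a TTS engine. The sketch below is purely illustrative; the tag table, field names, and phrasing are assumptions, not the authors' ontology.

```python
# Map free-form OSM tags onto a small controlled vocabulary so a TTS
# engine can read a coherent description of the surroundings. The tag
# table, field names, and phrasing are illustrative assumptions.
SPOKEN = {
    ("amenity", "pharmacy"): "a pharmacy",
    ("highway", "crossing"): "a pedestrian crossing",
    ("shop", "bakery"): "a bakery",
}

def describe(elements):
    """elements: dicts with 'tags' and 'distance_m', e.g. as extracted
    from an OSM query; returns one sentence ready for text-to-speech."""
    parts = []
    for el in sorted(elements, key=lambda e: e["distance_m"]):
        for (key, value), phrase in SPOKEN.items():
            if el["tags"].get(key) == value:
                parts.append(f"{phrase} at {el['distance_m']} meters")
    if not parts:
        return "No known places nearby."
    return "Nearby: " + ", then ".join(parts) + "."
```

Sorting by distance keeps the spoken order aligned with the user's immediate surroundings, which matters in a purely auditory, mono-modal dialogue.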

Keywords: TTS, ontology, open street map, visually impaired

Procedia PDF Downloads 285
24912 Design and Development of a Platform for Analyzing Spatio-Temporal Data from Wireless Sensor Networks

Authors: Walid Fantazi

Abstract:

Advances in sensor technology (microelectromechanical systems (MEMS), wireless communications, embedded systems, distributed processing, and wireless sensor applications) have contributed to a broad range of WSN applications capable of collecting large amounts of spatiotemporal data in real time. These systems require real-time processing to manage storage and to query the data as they arrive. To meet these needs, we propose a snapshot spatiotemporal data model based on object-oriented concepts. This model reduces data redundancy, which makes it easier to execute spatiotemporal queries and saves analysis time. Further, to ensure the robustness of the system and eliminate congestion in main memory, we propose an in-RAM spatiotemporal indexing technique called Captree*. On top of this, we offer an RIA (rich internet application)-based SOA application architecture that allows remote monitoring and control.
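The idea of an in-RAM spatiotemporal index can be illustrated with a toy grid-and-time-bucket structure. This is not the paper's Captree* (whose internals are not described here); it only shows why bucketing readings by space and time lets a range query scan a handful of buckets instead of the whole data set.

```python
from collections import defaultdict

class GridTimeIndex:
    """Toy in-RAM spatiotemporal index: readings are bucketed by a coarse
    spatial grid cell and a time window, so a range query only scans the
    buckets it overlaps instead of every reading."""
    def __init__(self, cell=10.0, window=60.0):
        self.cell, self.window = cell, window
        self.buckets = defaultdict(list)

    def _key(self, x, y, t):
        return (int(x // self.cell), int(y // self.cell), int(t // self.window))

    def insert(self, x, y, t, value):
        self.buckets[self._key(x, y, t)].append((x, y, t, value))

    def query(self, x0, x1, y0, y1, t0, t1):
        hits = []
        for i in range(int(x0 // self.cell), int(x1 // self.cell) + 1):
            for j in range(int(y0 // self.cell), int(y1 // self.cell) + 1):
                for k in range(int(t0 // self.window), int(t1 // self.window) + 1):
                    for (x, y, t, v) in self.buckets.get((i, j, k), []):
                        if x0 <= x <= x1 and y0 <= y <= y1 and t0 <= t <= t1:
                            hits.append(v)
        return hits
```

A tree-based index refines this same principle: the coarse cells become a hierarchy, so both point inserts and range queries stay cheap as the data volume grows.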

Keywords: WSN, indexing data, SOA, RIA, geographic information system

Procedia PDF Downloads 243
24911 Holistic Urban Development: Incorporating Both Global and Local Optimization

Authors: Christoph Opperer

Abstract:

The rapid urbanization of modern societies and the need for sustainable urban development demand innovative solutions that meet both individual and collective needs while addressing environmental concerns. To address these challenges, this paper presents a study that explores the potential of spatial and energetic/ecological optimization to enhance the performance of urban settlements at both the architectural and the urban scale. The study focuses on the application of biological principles and self-organization processes in urban planning and design, aiming for a balance between ecological performance, architectural quality, and individual living conditions. The research adopts a case study approach, focusing on a 10-hectare brownfield site in the south of Vienna. The site is surrounded by a small-scale built environment, which makes it an appropriate starting point for the research and design process; however, the selected urban form is not a prerequisite for the proposed methodology, whose findings can be applied to various urban forms and densities. The methodology divides the overall building mass and program into individual small housing units. A computational model has been developed to optimize the distribution of these units, considering factors such as solar exposure/radiation, views, privacy, proximity to sources of disturbance (such as noise), and minimal internal circulation areas. The model also ensures that existing vegetation and buildings on the site are preserved and incorporated into the optimization and design process. It allows simultaneous optimization at two scales, architectural and urban design, which have traditionally been addressed sequentially. This holistic design approach yields individual and collective benefits, resulting in urban environments that balance ecology and architectural quality.
The optimization process produces a seemingly random distribution of housing units that is, in fact, a densified hybrid of traditional garden settlements and allotment settlements. This urban typology was selected for its compatibility with the surrounding urban context, although the presented methodology can be extended to other forms of urban development and density levels. The benefits of this approach are threefold. First, it determines the ideal housing distribution that optimizes solar radiation for each building density level, essentially extending the concept of sustainable building to the urban scale. Second, the method enhances living quality by considering the orientation and positioning of individual functions within each housing unit, achieving optimal views and privacy. Third, the algorithm's flexibility and robustness facilitate the efficient implementation of urban development with various stakeholders, architects, and construction companies without compromising its performance. At the core of the research is the application of global and local optimization strategies to create efficient design solutions. By considering both the performance of individual units and the collective performance of the urban aggregation, we ensure an optimal balance between private and communal benefits. By promoting a holistic understanding of urban ecology and integrating advanced optimization strategies, our methodology offers a sustainable and efficient response to the challenges of modern urbanization.
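The greedy flavour of such a distribution model can be sketched as follows. This toy version places units one at a time on a grid, scoring candidate cells for unshaded solar exposure, distance from noise sources, and spacing for privacy; all weights, the shading rule, and the noise-source handling are invented for illustration and stand far below the richness of the model described above.

```python
import itertools

def place_units(n_units, grid=20, noise_sources=((0, 10),)):
    """Greedily place housing units on a grid. Each candidate cell is
    scored for unshaded southern exposure (no unit directly south of it),
    Manhattan distance from noise sources, and spacing from
    already-placed units; weights and rules are invented for
    illustration."""
    placed = []
    def score(cell):
        x, y = cell
        solar = 0.0 if (x, y - 1) in placed else 1.0
        noise = min(abs(x - nx) + abs(y - ny) for nx, ny in noise_sources)
        spacing = min((abs(x - px) + abs(y - py) for px, py in placed),
                      default=grid)
        return 2.0 * solar + 0.1 * min(noise, 10) + 0.2 * min(spacing, 5)
    cells = list(itertools.product(range(grid), range(grid)))
    for _ in range(n_units):
        best = max((c for c in cells if c not in placed), key=score)
        placed.append(best)
    return placed
```

Replacing the greedy loop with a global search over all placements, while keeping the per-unit scores, is one way to read the paper's pairing of local and global optimization.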

Keywords: sustainable development, self-organization, ecological performance, solar radiation and exposure, daylight, visibility, accessibility, spatial distribution, local and global optimization

Procedia PDF Downloads 53