Search results for: privacy and data protection law

25061 Hierarchical Clustering Algorithms in Data Mining

Authors: Z. Abdullah, A. R. Hamdan

Abstract:

Clustering is a process of grouping objects and data into clusters so that data objects in the same cluster are similar to one another. Clustering is one of the areas of data mining, and its algorithms can be classified into partitioning, hierarchical, density-based, and grid-based methods. In this paper, we survey and review four major hierarchical clustering algorithms: CURE, ROCK, CHAMELEON, and BIRCH. The resulting state of the art of these algorithms will help in addressing current problems, as well as in deriving more robust and scalable clustering algorithms.
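Purely as an illustration of the agglomerative idea shared by these algorithms (not of CURE, ROCK, CHAMELEON, or BIRCH themselves, which rely on specialized structures), a minimal SciPy sketch on assumed synthetic data might look as follows.

```python
# Minimal sketch of agglomerative hierarchical clustering (illustrative only;
# CURE, ROCK, CHAMELEON and BIRCH use specialized structures not shown here).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Two synthetic groups of 2-D points (assumed example data).
data = np.vstack([rng.normal(0, 0.5, (20, 2)), rng.normal(5, 0.5, (20, 2))])

# Build the cluster hierarchy bottom-up with average linkage.
Z = linkage(data, method="average", metric="euclidean")

# Cut the dendrogram into two flat clusters.
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)
```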

Keywords: clustering, unsupervised learning, algorithms, hierarchical

Procedia PDF Downloads 875
25060 End to End Monitoring in Oracle Fusion Middleware for Data Verification

Authors: Syed Kashif Ali, Usman Javaid, Abdullah Chohan

Abstract:

In large enterprises, multiple departments use different sorts of information systems and databases according to their needs. These systems are independent and heterogeneous in nature, and sharing information/data between them is not an easy task. The usage of middleware technologies has made data sharing between systems much easier. However, monitoring the exchange of data/information between target and source systems for verification purposes is often complex or impossible for the maintenance department due to security/access privileges on the target and source systems. In this paper, we present our experience with an end-to-end data monitoring approach implemented at the middleware level in Oracle BPEL for data verification, without the help of any dedicated monitoring tool.
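The abstract does not detail the BPEL-level implementation; as a hedged, technology-neutral sketch of the underlying verification idea (reconciling what left the source with what arrived at the target), the following uses two in-memory SQLite databases and a hypothetical orders table.

```python
# Illustrative sketch only: reconcile record counts between a source and a
# target system as a basic end-to-end data verification check.
import sqlite3

def count_rows(conn, table):
    # Count rows currently present in the given table.
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

# In-memory stand-ins for the source and target systems (hypothetical data).
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for conn, n in ((source, 1000), (target, 998)):
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY)")
    conn.executemany("INSERT INTO orders (id) VALUES (?)",
                     [(i,) for i in range(n)])

src_n, tgt_n = count_rows(source, "orders"), count_rows(target, "orders")
print("OK" if src_n == tgt_n else f"MISMATCH: source={src_n}, target={tgt_n}")
```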

Keywords: service level agreement, SOA, BPEL, oracle fusion middleware, web service monitoring

Procedia PDF Downloads 468
25059 Motives and Barriers of Using Airbnb: Findings from Mixed Method Approach

Authors: Ghada Mohammed, Mohamed Abdel Salam, Passent Tantawi

Abstract:

The study aimed to investigate the impact of motives and barriers for Egyptian users to use Airbnb, as a platform of peer-to-peer accommodation instead of hotels, on overall attitude towards Airbnb. A sequential mixed-methods approach was adopted: a comprehensive research model was developed from both the literature and the results of the qualitative phase, and then tested via an online questionnaire. The findings revealed that the motives of price, home benefits, privacy, and online reviews significantly explained overall attitude towards Airbnb, while the main barriers were, respectively, perceived risk and distrust, both of which predict overall attitude. Among subjective norms, only social influence predicted behavioral intention to use Airbnb. The study may serve as a practical reference for practitioners as well as researchers when developing programs and strategies to manage Airbnb consumers' needs and decision process. Some of the main conclusions drawn from this study are that variety was one of the major things users like about Airbnb and that the most important motives are the functional ones, like price, rather than the experiential ones, like authenticity.

Keywords: airbnb, barriers, disruptive innovation, motives, sharing economy

Procedia PDF Downloads 138
25058 Synthesis and Characterization of Heterogeneous Silver Nanoparticles for Protection of Ancient Egyptian Artifacts from Microbial Deterioration

Authors: Mohamed Abd Elfattah Ibraheem Elghrbawy

Abstract:

Biodeterioration of cultural heritage is a complex process caused by the interaction of many physical, chemical, and biological agents; the growth of microorganisms can cause staining, cracking, powdering, disfigurement, and displacement of monument material, leading to its permanent loss. Organisms causing biodeterioration of monuments have usually been controlled with chemical products (biocides). In order to overcome the impact of biocides on the environment, human health, and monument substrates, alternative tools such as antimicrobial agents from natural products can be used for monument conservation and protection. The problem is how to formulate antibacterial agents with high efficiency and low toxicity. Various types of biodegradable metal nanoparticles (MNPs) have many applications in plant extract delivery, and nano-encapsulation of metal and natural antimicrobial agents using polymers such as chitosan increases their efficacy, specificity, and targeting ability. In this work, silver nanoparticles are green-synthesized and characterized with natural products extracted from plants with antimicrobial properties, using an eco-friendly one-pot synthesis. The newly synthesized mixture is then encapsulated using biopolymers such as chitosan nanoparticles. The dispersion and homogeneity of the antimicrobial heterogeneous metal nanoparticles encapsulated by biopolymers will be characterized and confirmed by Fourier Transform Infrared Spectroscopy (FTIR), Transmission Electron Microscopy (TEM), Scanning Electron Microscopy (SEM), and a Zetasizer. The effect of the antimicrobial biopolymer metal nano-formulations on normal human cell lines will be investigated to evaluate the environmental safety of these formulations. The antimicrobial activity of the biopolymeric antimicrobial metal nanoparticle formulations will be investigated to evaluate their efficiency against different pathogenic bacteria and fungi.

Keywords: antimicrobial, biodeterioration, chitosan, cultural heritage, silver

Procedia PDF Downloads 66
25057 Dissimilarity Measure for General Histogram Data and Its Application to Hierarchical Clustering

Authors: K. Umbleja, M. Ichino

Abstract:

Symbolic data mining has been developed to analyze data in very large datasets. It is also useful in cases when entry-specific details should remain hidden. Symbolic data mining is quickly gaining popularity as the datasets in need of analysis become ever larger. One type of such symbolic data is the histogram, which makes it possible to store huge amounts of information in a single variable with a high level of granularity. Other types of symbolic data can also be described as histograms, making the histogram a very important and general symbolic data type; a method developed for histograms can therefore also be applied to other types of symbolic data. Due to its complex structure, analyzing histograms is complicated. This paper proposes a method that allows two histogram-valued variables to be compared and therefore finds the dissimilarity between two histograms. The proposed method uses the Ichino-Yaguchi dissimilarity measure for mixed feature-type data analysis as a base and develops a dissimilarity measure specifically for histogram data, which allows histograms with different numbers of bins and bin widths (so-called general histograms) to be compared. The proposed dissimilarity measure is then used as a measure for clustering. Furthermore, a linkage method based on weighted averages is proposed, together with the concept of cluster compactness to measure the quality of clustering. The method is then validated by application to real datasets. As a result, the proposed dissimilarity measure is found to produce adequate and comparable results on general histograms without loss of detail or the need to transform the data.
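The authors' Ichino-Yaguchi-based measure is not reproduced here; as a simplified stand-in that still handles histograms with different bin counts and widths, the sketch below compares the two histograms' cumulative distributions on the union of their bin edges (all data are assumed).

```python
# Simplified stand-in for a dissimilarity between two "general" histograms
# (different bin counts and widths). This is NOT the authors' Ichino-Yaguchi
# based measure; it compares the histograms' cumulative distributions.
import numpy as np

def cdf_at(edges, probs, x):
    # Piecewise-linear CDF of a histogram evaluated at points x.
    cum = np.concatenate(([0.0], np.cumsum(probs)))
    return np.interp(x, edges, cum)

def histogram_dissimilarity(edges_a, probs_a, edges_b, probs_b):
    # L1 distance between the two CDFs, integrated over the union of edges.
    grid = np.union1d(edges_a, edges_b)
    gap = np.abs(cdf_at(np.asarray(edges_a), probs_a, grid)
                 - cdf_at(np.asarray(edges_b), probs_b, grid))
    widths = np.diff(grid)
    return float(np.sum(widths * 0.5 * (gap[:-1] + gap[1:])))  # trapezoid rule

# Two histograms with different numbers of bins and bin widths.
d = histogram_dissimilarity([0, 1, 2, 4], [0.2, 0.5, 0.3],
                            [0, 2, 3, 4, 5], [0.4, 0.3, 0.2, 0.1])
print(d)
```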

Keywords: dissimilarity measure, hierarchical clustering, histograms, symbolic data analysis

Procedia PDF Downloads 153
25056 WiFi Data Offloading: Bundling Method in a Canvas Business Model

Authors: Majid Mokhtarnia, Alireza Amini

Abstract:

Mobile operators deal with increasing data traffic as a critical issue. As a result, a vital responsibility of the operators is to deal with such a trend in order to create added value. This paper addresses a bundling method in a Canvas business model in a WiFi Data Offloading (WDO) strategy, by which some elements of the model may be affected. In the proposed method, a number of data packages are offered to subscribers, some of which include a complimentary volume of WiFi-offloaded data. The paper at hand analyses this method from the viewpoints of attractiveness and profitability. The results demonstrate that the quality of implementation of the WDO strongly affects the final result and helps the decision maker make the best choice.

Keywords: bundling, canvas business model, telecommunication, WiFi data offloading

Procedia PDF Downloads 185
25055 Digital Health During a Pandemic: Critical Analysis of the COVID-19 Contact Tracing Apps

Authors: Mohanad Elemary, Imose Itua, Rajeswari B. Matam

Abstract:

Virologists and public health experts have been predicting potential pandemics from coronaviruses for decades. The viruses that caused the SARS and MERS outbreaks and the Nipah virus led to many lost lives, but still, the COVID-19 pandemic caused by the SARS-CoV-2 virus surprised many scientific communities, experts, and governments with its ease of transmission and its pathogenicity. Governments of various countries reacted by locking down entire populations in their homes to combat the devastation caused by the virus, which led to a loss of livelihood and economic hardship for many individuals and organizations. To revive national economies and support their citizens in resuming their lives, governments focused on the development and use of contact tracing apps as a digital way to track and trace exposure. Google and Apple introduced the Exposure Notification System (ENS) framework. Independent organizations and countries also developed different frameworks for contact tracing apps. The efficiency, popularity, and adoption rate of these various apps have differed across countries. In this paper, we present a critical analysis of the different contact tracing apps with respect to their efficiency, adoption rate, and general perception, and of the governmental strategies and policies that led to the development of the applications. When it comes to the European countries, each of them followed an individualistic approach to the same problem, resulting in different realizations of a similarly functioning application with differing results in use and acceptance. The study conducted an extensive review of existing literature, policies, and reports across multiple disciplines, from which a framework was developed and then validated through interviews with six key stakeholders in the field, including founders and executives in digital health startups and corporates as well as experts from international organizations like the World Health Organization. A framework of best practices and tactics is the result of this research. The framework looks at three main questions regarding the contact tracing apps: how to develop them, how to deploy them, and how to regulate them. The findings are based on the best practices applied by governments across multiple countries, the mistakes they made, and the best practices applied in similar situations in the business world. The findings include multiple strategies for the development milestone, regarding establishing frameworks for cooperation with the private sector and how to design the features and user experience for a transparent, effective, and rapidly adaptable app. For the deployment section, several tactics are discussed regarding communication messages, marketing campaigns, persuasive psychology, and initial deployment scale strategies. The paper also discusses the data privacy dilemma and how to build a more sustainable system of health-related data processing and utilization. This is done through principles-based regulations specific to health data that allow them to be used for the public good. This framework offers insights into strategies and tactics that could be implemented as protocols for future public health crises and emergencies, whether global or regional.

Keywords: contact tracing apps, COVID-19, digital health applications, exposure notification system

Procedia PDF Downloads 129
25054 Distributed Perceptually Important Point Identification for Time Series Data Mining

Authors: Tak-Chung Fu, Ying-Kit Hung, Fu-Lai Chung

Abstract:

In the field of time series data mining, the concept of the Perceptually Important Point (PIP) identification process was first introduced in 2001. This process originally worked for financial time series pattern matching and was then found suitable for time series dimensionality reduction and representation. Its strength lies in preserving the overall shape of the time series by identifying the salient points in it. With the rise of Big Data, time series data contributes a major proportion, especially data generated by sensors in Internet of Things (IoT) environments. Given the nature of PIP identification and the successful cases, it is worthwhile to further explore the opportunity to apply PIP to time series ‘Big Data’. However, the performance of PIP identification has always been considered the limitation when dealing with ‘Big’ time series data. In this paper, two distributed versions of PIP identification based on the Specialized Binary (SB) Tree are proposed. The proposed approaches solve the bottleneck encountered when running the PIP identification process on a standalone computer. Improvement in terms of speed is obtained by the distributed versions.
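The distributed, SB-tree-based versions are the paper's contribution and are not reproduced here; the following sketch shows only the basic sequential PIP identification using vertical distance, as a baseline illustration of the process being parallelized.

```python
# Basic sequential Perceptually Important Point (PIP) identification using
# vertical distance to the line between neighbouring PIPs. Illustrative only.
import numpy as np

def vertical_distance(x, y, x1, y1, x2, y2):
    # Vertical distance from (x, y) to the line through (x1, y1) and (x2, y2).
    slope = (y2 - y1) / (x2 - x1)
    return abs(y1 + slope * (x - x1) - y)

def identify_pips(series, k):
    """Return the indices of k perceptually important points."""
    n = len(series)
    pips = [0, n - 1]                      # always keep the two endpoints
    while len(pips) < k:
        best_idx, best_dist = None, -1.0
        for left, right in zip(pips, pips[1:]):
            for i in range(left + 1, right):
                d = vertical_distance(i, series[i], left, series[left],
                                      right, series[right])
                if d > best_dist:
                    best_idx, best_dist = i, d
        pips.append(best_idx)
        pips.sort()
    return pips

ts = np.sin(np.linspace(0, 6 * np.pi, 200))
ts += 0.1 * np.random.default_rng(1).normal(size=200)
print(identify_pips(ts, 7))
```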

Keywords: distributed computing, performance analysis, Perceptually Important Point identification, time series data mining

Procedia PDF Downloads 420
25053 Analysing Techniques for Fusing Multimodal Data in Predictive Scenarios Using Convolutional Neural Networks

Authors: Philipp Ruf, Massiwa Chabbi, Christoph Reich, Djaffar Ould-Abdeslam

Abstract:

In recent years, convolutional neural networks (CNNs) have demonstrated high performance in image analysis, but oftentimes only structured data are available regarding a specific problem. By interpreting structured data as images, CNNs can effectively learn and extract valuable insights from tabular data, leading to improved predictive accuracy and uncovering hidden patterns that may not be apparent in traditional structured data analysis. By applying a single neural network to analyze multimodal data, e.g., both structured and unstructured information, significant advantages in terms of time complexity and energy efficiency can be achieved. Converting structured data into images and merging them with existing visual material offers a promising solution for applying CNNs to multimodal datasets, as they often occur in a medical context. By employing suitable preprocessing techniques, structured data is transformed into image representations, where the respective features are expressed as different formations of colors and shapes. In an additional step, these representations are fused with existing images to incorporate both types of information. The resulting image is then analyzed using a CNN.
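As a hedged sketch of this fusion step, the snippet below maps a row of tabular features onto a small grayscale "image" and stacks it as an extra channel on an existing image; the simple reshape-and-scale encoding is an assumption, not the paper's color/shape scheme.

```python
# Hedged sketch: turn a row of tabular features into a 2-D "image" and stack
# it as an extra channel on an existing image, so one CNN can consume both.
import numpy as np

def tabular_to_image(row, size=28):
    # Min-max scale features to [0, 1], pad to size*size values, reshape.
    row = np.asarray(row, dtype=float)
    row = (row - row.min()) / (row.max() - row.min() + 1e-9)
    padded = np.zeros(size * size)
    padded[: len(row)] = row
    return padded.reshape(size, size)

def fuse(image, tabular_row):
    # Stack the existing grayscale image and the tabular "image" as channels.
    tab_img = tabular_to_image(tabular_row, size=image.shape[0])
    return np.stack([image, tab_img], axis=-1)   # shape: (H, W, 2)

existing_image = np.random.default_rng(0).random((28, 28))  # placeholder scan
features = [37.2, 80, 120, 1, 0, 5.4]                       # placeholder values
fused = fuse(existing_image, features)
print(fused.shape)   # (28, 28, 2) -> ready for a CNN with 2 input channels
```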

Keywords: CNN, image processing, tabular data, mixed dataset, data transformation, multimodal fusion

Procedia PDF Downloads 107
25052 The Impact of the Windows Opening on the Design of Buildings in Islamic Architecture

Authors: Salma I. Dwidar, Amal A. Abdel-Sattar

Abstract:

The window openings are the key to the relationship between the inside and the outside of any building. They are the eye that sees out, the lungs of the construction, and the ear that hears. The success of the building, as well as the comfort of its users, depends mainly on this relationship. Usually, windows are affected by human factors, like religious, social, political, and economic factors, as well as environmental factors, like climatic, aesthetic, and functional factors. In Islamic architecture, the windows were one of the most important elements of physiological and psychological comfort for the users of the buildings. Windows are considered one of the main parameters in designing internal and external facades, where the window openings occupy a big part of the formation of the external facade of the buildings. This paper discusses the importance of the window openings and their relationship to residential buildings in Islamic architecture. It addresses the rules that have been followed in the design of windows in Islamic architecture to achieve privacy and thermal comfort when there are no technological elements within the dwellings. Also, it demonstrates the effects of windows on building form and identity and how they give the buildings a distinctive architectural fingerprint.

Keywords: window openings, thermal comfort, residential buildings, the Islamic architecture, human considerations

Procedia PDF Downloads 211
25051 Knowledge Discovery and Data Mining Techniques in Textile Industry

Authors: Filiz Ersoz, Taner Ersoz, Erkin Guler

Abstract:

This paper addresses issues and techniques for the textile industry using data mining. Data mining was applied to data on the stitching of garment products obtained from a textile company. The techniques applied to this data were the CHAID algorithm, the CART algorithm, regression analysis, and artificial neural networks. Classification-based analyses were used in the data mining, and a decision model about production per person and the variables affecting production was found by this method. In the study, the results show that as daily working time increases, production per person decreases. In addition, total daily working time and production per person show a negative relationship, and it is the strongest relationship observed.
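As a minimal sketch of one of the named techniques, a CART-style decision tree in scikit-learn applied to production-style records, with hypothetical column names and synthetic values:

```python
# Minimal sketch of a CART-style decision tree on tabular production data.
# Column names and the synthetic data are assumptions for illustration.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical stitching-line records: daily working time (hours), operator
# experience (years), and whether production per person met the target.
df = pd.DataFrame({
    "daily_hours": [8, 9, 10, 11, 8, 12, 9, 10, 11, 12, 8, 9],
    "experience":  [1, 3, 2, 5, 4, 1, 2, 3, 1, 2, 5, 4],
    "target_met":  [1, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 1],
})

X_train, X_test, y_train, y_test = train_test_split(
    df[["daily_hours", "experience"]], df["target_met"],
    test_size=0.25, random_state=0)

tree = DecisionTreeClassifier(max_depth=3, random_state=0)  # CART in scikit-learn
tree.fit(X_train, y_train)
print("test accuracy:", tree.score(X_test, y_test))
```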

Keywords: data mining, textile production, decision trees, classification

Procedia PDF Downloads 342
25050 The Effect of Composite Hybridization on the Back Face Deformation of Armor Plates

Authors: Attef Kouadria, Yehya Bouteghrine, Amar Manaa, Tarek Mouats, Djalel Eddine Tria, Hamid Abdelhafid Ghouti

Abstract:

Personal protection systems have been used in several forms for centuries. Lightweight composite structures have been in great demand due to their high ratios of mechanical properties to weight in comparison to heavy and cumbersome steel plates. In this regard, lighter ceramic plates with a backing plate made of high-strength polymeric fibers, mostly aramids, are widely used for protection against ballistic threats. This study aims to improve the ballistic performance of ceramic/composite plates subjected to ballistic impact by reducing the back face deformation (BFD) measured after each test. A new hybridization technique was developed in this investigation to increase the energy absorption capabilities of the backing plates. The hybridization consists of combining different types of aramid fabrics, with different linear densities of aramid fibers (Dtex) and different areal densities, with an epoxy resin to form the backing plate. Therefore, several composite structure architectures were prepared and tested. For a better understanding of the effect of the hybridization, a series of tensile, compression, and shear tests was conducted to determine the mechanical properties of the homogeneous composite materials prepared from the different fabrics. It was found that the hybridization allows the backing plate to combine the mechanical properties of the fabrics used. Aramid fabrics with higher Dtex were found to increase the mechanical strength of the backing plate, while those with lower Dtex were found to enhance the lateral wave dispersion ratio due to their lower areal density. Therefore, the back face deformation was significantly reduced in comparison to a homogeneous composite plate.

Keywords: aramid fabric, ballistic impact, back face deformation, body armor, composite, mechanical testing

Procedia PDF Downloads 141
25049 Investigation of Delivery of Triple Play Data in GE-PON Fiber to the Home Network

Authors: Ashima Anurag Sharma

Abstract:

Optical fiber based networks can deliver performance that can support the increasing demand for high speed connections. One of the new technologies that has emerged in recent years is the Passive Optical Network (PON). This research paper aims to show the simultaneous delivery of triple play services (data, voice, and video). A comparison between various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be served decreases due to the increase in bit error rate.

Keywords: BER, PON, TDMPON, GPON, CWDM, OLT, ONT

Procedia PDF Downloads 518
25048 Microarray Gene Expression Data Dimensionality Reduction Using PCA

Authors: Fuad M. Alkoot

Abstract:

Different experimental technologies, such as microarray sequencing, have been proposed to generate high-resolution genetic data in order to understand the complex dynamic interactions between complex diseases and the biological system components of genes and gene products. However, the generated samples have a very large dimension, reaching thousands, hindering all attempts to design a classifier system that can identify diseases based on such data. Additionally, the high overlap in the class distributions makes the task more difficult. The data we experiment with is generated for the identification of autism. It includes 142 samples, which is small compared to the large dimension of the data. The classifier systems trained on this data yield very low classification rates that are almost equivalent to a guess. We aim at reducing the data dimension and improving it for classification. Here, we experiment with applying a multistage PCA to the genetic data to reduce its dimensionality. Results show a significant improvement in the classification rates, which increases the possibility of building an automated system for autism detection.
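The exact multistage scheme is not spelled out in the abstract; one plausible reading, a two-stage block-wise PCA, is sketched below on synthetic data with the same sample count.

```python
# Hedged sketch of a two-stage PCA reduction: PCA per feature block, then PCA
# on the concatenated components. This is one plausible reading only; the
# paper's multistage scheme is not described in detail in the abstract.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(142, 10000))        # 142 samples, thousands of features

# Stage 1: reduce each block of 1000 features to 20 components.
blocks = np.array_split(np.arange(X.shape[1]), 10)
stage1 = np.hstack([PCA(n_components=20, random_state=0).fit_transform(X[:, b])
                    for b in blocks])

# Stage 2: reduce the concatenated components further for classification.
X_reduced = PCA(n_components=30, random_state=0).fit_transform(stage1)
print(X_reduced.shape)                   # (142, 30)
```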

Keywords: PCA, gene expression, dimensionality reduction, classification, autism

Procedia PDF Downloads 550
25047 Mathematical Model of the Spread of Herpes Simplex Virus Type-2 in Heterosexual Relations with and without Condom Usage in a College Population

Authors: Jacob A. Braun

Abstract:

This paper uses mathematical modeling to show the spread of Herpes Simplex Virus type-2, with and without the usage of condoms, in a college population. The model uses four differential equations to calculate the data for the simulation. The dt increment used is one week. It also runs over a fixed period; the period chosen was five years, to represent time spent in college. The average age of the individual is 21, once again to represent the age of someone in college. In the total population, there are almost two times as many women who have Herpes Simplex type-2 as men. Additionally, Herpes Simplex type-2 does not have a known cure. The goal of the model is to show how condom usage affects women’s chances of receiving the virus, in the hope of being able to reduce the number of women infected. In the end, the model demonstrates that condoms offer significant protection to women from the virus. Since fewer women are infected with the virus when condoms are used, in turn, fewer males are infected. Since Herpes Simplex type-2 affects the carrier for their whole life, a small decrease in infections could lead to large ramifications over time. Specifically, a small decrease in infections at a young age, such as in college, could have a very big effect on the long-term number of people infected with the virus.
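The paper's four differential equations are not given in the abstract; the sketch below is therefore only a generic two-sex susceptible-infected model, stepped weekly over five years, showing how a condom-effectiveness factor can scale transmission. All parameter values are assumptions.

```python
# Illustrative sketch only: generic two-sex susceptible-infected model with a
# condom-effectiveness factor. Not the paper's equations; parameters assumed.
def simulate(condom_use=0.0, condom_efficacy=0.95, weeks=5 * 52):
    S_w, I_w = 0.90, 0.10            # susceptible / infected women (fractions)
    S_m, I_m = 0.95, 0.05            # susceptible / infected men
    beta_m_to_w = 0.10               # weekly transmission rate, men -> women
    beta_w_to_m = 0.05               # weekly transmission rate, women -> men
    reduction = 1.0 - condom_use * condom_efficacy
    dt = 1.0                         # one week per step
    for _ in range(weeks):
        new_inf_w = reduction * beta_m_to_w * S_w * I_m * dt
        new_inf_m = reduction * beta_w_to_m * S_m * I_w * dt
        S_w, I_w = S_w - new_inf_w, I_w + new_inf_w   # no recovery (HSV-2)
        S_m, I_m = S_m - new_inf_m, I_m + new_inf_m
    return I_w, I_m

print("no condoms:    ", simulate(condom_use=0.0))
print("80% condom use:", simulate(condom_use=0.8))
```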

Keywords: college, condom, Herpes, mathematical modelling

Procedia PDF Downloads 204
25046 Cardiokey: A Binary and Multi-Class Machine Learning Approach to Identify Individuals Using Electrocardiographic Signals on Wearable Devices

Authors: S. Chami, J. Chauvin, T. Demarest, Stan Ng, M. Straus, W. Jahner

Abstract:

Biometric tools such as fingerprints and iris scans are widely used in industry to protect critical assets. However, their vulnerability and lack of robustness raise several worries about the protection of highly critical assets. Biometrics based on electrocardiographic (ECG) signals is a robust identification tool. However, most of the state-of-the-art techniques have worked on clinical signals, which are of high quality and less noisy than signals extracted from wearable devices like a smartwatch. In this paper, we present a complete machine learning pipeline that identifies people using ECG extracted from an off-person device. An off-person device is a wearable device that is not used in a medical context, such as a smartwatch. In addition, one of the main challenges of ECG biometrics is the variability of the ECG across different persons and different situations. To solve this issue, we propose two different approaches: a per-person classifier and a one-for-all classifier. The first approach suggests building a binary classifier to distinguish one person from others. The second approach suggests a multi-class classifier that distinguishes the selected set of individuals from non-selected individuals (others). In the preliminary results, the binary classifier obtained a performance of 90% in terms of accuracy on balanced data. The second approach reported a log loss of 0.05 as a multi-class score.
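As a hedged sketch of the two set-ups on synthetic feature vectors standing in for wearable ECG features (the real feature extraction and models are not described in the abstract):

```python
# Hedged sketch of the per-person binary classifier vs. the one-for-all
# multi-class classifier, on synthetic stand-ins for ECG-derived features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_people, n_per_person, n_feat = 5, 60, 16
X = np.vstack([rng.normal(loc=p, scale=1.0, size=(n_per_person, n_feat))
               for p in range(n_people)])
y = np.repeat(np.arange(n_people), n_per_person)

# Approach 1: per-person binary classifier ("this person" vs "everyone else").
target_person = 2
y_binary = (y == target_person).astype(int)
binary_clf = RandomForestClassifier(random_state=0)
print("binary accuracy:", cross_val_score(binary_clf, X, y_binary, cv=5).mean())

# Approach 2: one-for-all multi-class classifier over the enrolled set.
multi_clf = RandomForestClassifier(random_state=0)
print("multi-class accuracy:", cross_val_score(multi_clf, X, y, cv=5).mean())
```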

Keywords: biometrics, electrocardiographic, machine learning, signals processing

Procedia PDF Downloads 133
25045 Data Science-Based Key Factor Analysis and Risk Prediction of Diabetes

Authors: Fei Gao, Rodolfo C. Raga Jr.

Abstract:

This research proposal will ascertain the major risk factors for diabetes and design a predictive model for risk assessment. The project aims to improve diabetes early detection and management by utilizing data science techniques, which may improve patient outcomes and healthcare efficiency. The phase relation values of each attribute were used to analyze and choose the attributes that might influence the examinee's survival probability, using the Diabetes Health Indicators Dataset from Kaggle as the research data. We compare and evaluate eight machine learning algorithms. Our investigation begins with comprehensive data preprocessing, including feature engineering and dimensionality reduction, aimed at enhancing data quality. The dataset, comprising health indicators and medical data, serves as a foundation for training and testing these algorithms. A rigorous cross-validation process is applied, and we assess their performance using five key metrics: accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). After analyzing the data characteristics, we investigate their impact on the likelihood of diabetes and develop corresponding risk indicators.
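A sketch of the evaluation protocol described above, cross-validating a single placeholder model with the five named metrics on imbalanced synthetic data (the study's eight algorithms and real dataset are not reproduced):

```python
# Sketch of cross-validation with the five named metrics on a placeholder
# model and synthetic, imbalanced, diabetes-like toy data.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_validate

X, y = make_classification(n_samples=1000, n_features=20, weights=[0.85],
                           random_state=0)

scoring = ["accuracy", "precision", "recall", "f1", "roc_auc"]
results = cross_validate(GradientBoostingClassifier(random_state=0),
                         X, y, cv=5, scoring=scoring)

for metric in scoring:
    scores = results[f"test_{metric}"]
    print(f"{metric:>9}: {scores.mean():.3f} +/- {scores.std():.3f}")
```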

Keywords: diabetes, risk factors, predictive model, risk assessment, data science techniques, early detection, data analysis, Kaggle

Procedia PDF Downloads 61
25044 A Methodology to Integrate Data in the Company Based on the Semantic Standard in the Context of Industry 4.0

Authors: Chang Qin, Daham Mustafa, Abderrahmane Khiat, Pierre Bienert, Paulo Zanini

Abstract:

Nowadays, companies are facing many challenges in the process of digital transformation, which can be a complex and costly undertaking. Digital transformation involves the collection and analysis of large amounts of data, which can create challenges around data management and governance. Furthermore, companies are also challenged to integrate data from multiple systems and technologies. Despite these pains, companies are still pursuing digitalization, because by embracing advanced technologies they can improve efficiency, quality, decision-making, and customer experience while also creating different business models and revenue streams. This paper focuses on the issue that data is stored in data silos with different schemas and structures. The conventional approaches to addressing this issue involve utilizing data warehousing, data integration tools, data standardization, and business intelligence tools. However, these approaches primarily focus on the grammar and structure of the data and neglect the importance of semantic modeling and semantic standardization, which are essential for achieving data interoperability. Here, the challenge of data silos in Industry 4.0 is addressed by developing a semantic modeling approach compliant with Asset Administration Shell (AAS) models as an efficient standard for communication in Industry 4.0. The paper highlights how our approach can facilitate the data mapping process and semantic lifting according to existing industry standards such as ECLASS and other industrial dictionaries. It also incorporates the Asset Administration Shell technology to model and map the company’s data and utilizes a knowledge graph for data storage and exploration.

Keywords: data interoperability in industry 4.0, digital integration, industrial dictionary, semantic modeling

Procedia PDF Downloads 83
25043 Chemical Warfare Agent Simulant by Photocatalytic Filtering Reactor: Effect of Operating Parameters

Authors: Youcef Serhane, Abdelkrim Bouzaza, Dominique Wolbert, Aymen Amin Assadi

Abstract:

Throughout history, the use of chemical weapons has not been exclusive to combat between army corps; some of these weapons are also found in very targeted intelligence operations (political assassinations), organized crime, and terrorist organizations. To improve the speed of action, important technological devices have been developed in recent years, in particular in the field of protection and decontamination techniques, to better protect against and neutralize a chemical threat. In order to assess certain protective and decontaminating technologies or to improve medical countermeasures, tests must be conducted. In view of the great toxicity of real chemical warfare agents, simulants can be used, chosen according to the desired application. Here, we present an investigation of using a photocatalytic filtering reactor (PFR) for highly contaminated environments containing diethyl sulfide (DES). This target pollutant is used as a simulant of a CWA, namely Yperite (mustard gas). The influence of the inlet concentration, up to high concentrations of DES (1200 ppmv, i.e., 5 g/m³ of air), has been studied. Also, the conversion rate was monitored under different relative humidities and different flow rates (respiratory flow - standards: ISO/DIS 8996 and NF EN 14387 + A1). In order to understand the efficacy of pollutant neutralization by the PFR, a kinetic model based on the Langmuir–Hinshelwood (L–H) approach and taking into account the mass transfer step was developed. This allows us to determine the adsorption and kinetic degradation constants with no influence of mass transfer. The obtained results confirm that this small reactor configuration presents an extremely promising way to use photocatalysis for the treatment of highly contaminated environments containing real chemical warfare agents. Also, it can give rise to an individual protection device (an autonomous cartridge for a gas mask).
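For reference, a common single-site form of the Langmuir–Hinshelwood rate law used in such kinetic fits is sketched below; whether the authors include additional terms (e.g., for competing adsorption of water vapour) is not stated, so this is the generic form only.

```latex
% Generic single-site Langmuir-Hinshelwood rate expression (sketch):
% r : degradation rate of DES, k : kinetic degradation constant,
% K : adsorption equilibrium constant, C : gas-phase DES concentration.
r \;=\; -\frac{dC}{dt} \;=\; \frac{k\,K\,C}{1 + K\,C}
```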

Keywords: photocatalysis, photocatalytic filtering reactor, diethylsulfide, chemical warfare agents

Procedia PDF Downloads 93
25042 Trends of Cutaneous Melanoma in New Zealand: 2010 to 2020

Authors: Jack S. Pullman, Daniel Wen, Avinash Sharma, Bert Van Der Werf, Richard Martin

Abstract:

Background: New Zealand (NZ) melanoma incidence rates are amongst the highest in the world. Previous studies investigating the incidence of melanoma in NZ were performed for the periods 1995 – 1999 and 2000 – 2004 and suggested increasing melanoma incidence rates. Aim: The aim of the study is to provide an up-to-date review of trends in cutaneous melanoma in NZ from the New Zealand Cancer Registry (NZCR) for 2010 – 2020. Methods: De-identified data were obtained from the NZCR, and relevant demographic and histopathologic information was extracted. Statistical analyses were conducted to calculate age-standardized incidence rates for invasive melanoma (IM) and melanoma in situ (MIS). Secondary results included Breslow thickness and melanoma subtype analysis. Results: There was a decline in the IM age-standardized incidence rate from 30.4 to 23.9 per 100,000 person-years between 2010 and 2020, alongside an increase in the MIS incidence rate from 37.1 to 50.3 per 100,000 person-years. Men had a statistically significantly higher IM incidence rate (p <0.001) and Breslow thickness (p <0.001) compared with women. Increased age was associated with a higher incidence of IM and with presentation with melanoma of greater Breslow thickness and more advanced T stage. Conclusion: The incidence of IM in NZ has decreased in the last decade and was associated with an increase in MIS incidence over the same period. This can be explained by earlier detection, dermoscopy, the maturity of prevention campaigns, and/or a change in skin protection behavior.
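For readers unfamiliar with the metric, direct age-standardisation of incidence rates has the generic form below; the specific standard population weights used for the NZCR figures are not restated here.

```latex
% Directly age-standardised incidence rate (generic form):
% r_i : observed incidence rate in age group i
% w_i : weight of age group i in the chosen standard population
\mathrm{ASR} \;=\; \frac{\sum_i w_i \, r_i}{\sum_i w_i}
```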

Keywords: melanoma, incidence, epidemiology, New Zealand

Procedia PDF Downloads 54
25041 Discussing Embedded versus Central Machine Learning in Wireless Sensor Networks

Authors: Anne-Lena Kampen, Øivind Kure

Abstract:

Machine learning (ML) can be implemented in Wireless Sensor Networks (WSNs) as a central solution or as a distributed solution where the ML is embedded in the nodes. Embedding improves privacy and may reduce prediction delay. In addition, the number of transmissions is reduced. However, quality factors such as prediction accuracy, fault detection efficiency, and coordinated control of the overall system suffer. Here, we discuss and highlight the trade-offs that should be considered when choosing between embedded and centralized ML, especially for multihop networks. In addition, we present estimations that demonstrate the energy trade-offs between embedded and centralized ML. Although the total network energy consumption is lower with central prediction, it makes the network more prone to partitioning due to the high forwarding load on the one-hop nodes. Moreover, the continuous improvements in the number of operations per joule for embedded devices will move the energy balance toward embedded prediction.
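A back-of-the-envelope sketch of the trade-off described above, comparing forwarding every raw sample over multiple hops with computing locally and reporting occasionally; every numeric constant is an illustrative assumption, not a measured value.

```python
# Back-of-the-envelope energy comparison: central prediction forwards every
# raw sample over several hops, embedded prediction computes locally and
# transmits only occasional results. All constants below are assumptions.
E_TX_PER_BIT = 200e-9      # J per bit transmitted (assumed radio cost)
E_RX_PER_BIT = 150e-9      # J per bit received/forwarded
E_PER_OP     = 5e-12       # J per arithmetic operation on the node

def central_energy(samples, bits_per_sample, hops):
    # Each sample crosses `hops` links (one tx + one rx per link).
    per_link = bits_per_sample * (E_TX_PER_BIT + E_RX_PER_BIT)
    return samples * hops * per_link

def embedded_energy(samples, ops_per_inference, bits_per_result, hops,
                    report_every=100):
    compute = samples * ops_per_inference * E_PER_OP
    reports = ((samples // report_every) * bits_per_result * hops
               * (E_TX_PER_BIT + E_RX_PER_BIT))
    return compute + reports

print("central :", central_energy(10_000, 128, hops=4), "J")
print("embedded:", embedded_energy(10_000, 50_000, 32, hops=4), "J")
```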

Keywords: central machine learning, embedded machine learning, energy consumption, local machine learning, wireless sensor networks, WSN

Procedia PDF Downloads 139
25040 Flowing Online Vehicle GPS Data Clustering Using a New Parallel K-Means Algorithm

Authors: Orhun Vural, Oguz Bayat, Rustu Akay, Osman N. Ucan

Abstract:

This study presents a new parallel approach to clustering GPS data. The evaluation was made by comparing the execution times of various clustering algorithms on GPS data. This paper aims to propose a parallel, neighborhood-based K-means algorithm to make clustering faster. The proposed parallelization approach assumes that each GPS data point represents a vehicle, and that vehicles close to each other communicate after the vehicles are clustered. This parallelization approach has been examined on differently sized, continuously changing GPS data and compared with the serial K-means algorithm and other serial clustering algorithms. The results demonstrated that the proposed parallel K-means algorithm works much faster than the other clustering algorithms.
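As a hedged sketch of the general idea, the snippet below parallelizes only the assignment step of K-means over chunks of GPS points with a process pool; the paper's neighborhood-based communication scheme between nearby vehicles is not reproduced.

```python
# Sketch: parallelise the assignment step of K-means over chunks of GPS
# points with a process pool; the centroid update stays serial.
import numpy as np
from multiprocessing import Pool

def assign_chunk(args):
    points, centroids = args
    # Nearest-centroid label for every point in this chunk.
    d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

def parallel_kmeans(points, k=4, iters=10, workers=4):
    rng = np.random.default_rng(0)
    centroids = points[rng.choice(len(points), k, replace=False)]
    with Pool(workers) as pool:
        for _ in range(iters):
            chunks = np.array_split(points, workers)
            labels = np.concatenate(pool.map(assign_chunk,
                                             [(c, centroids) for c in chunks]))
            # Keep the old centroid if a cluster happens to be empty.
            centroids = np.array([points[labels == j].mean(axis=0)
                                  if np.any(labels == j) else centroids[j]
                                  for j in range(k)])
    return labels, centroids

if __name__ == "__main__":
    # Synthetic lat/lon points standing in for streaming vehicle GPS data.
    gps = np.random.default_rng(1).uniform([40.0, 29.0], [41.0, 30.0], (5000, 2))
    labels, cents = parallel_kmeans(gps)
    print(cents)
```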

Keywords: parallel k-means algorithm, parallel clustering, clustering algorithms, clustering on flowing data

Procedia PDF Downloads 212
25039 The Effects of a Nursing Dignity Care Program on Patients’ Dignity in Care

Authors: Yea-Pyng Lin

Abstract:

Dignity is a core element of nursing care. Maintaining the dignity of patients is an important issue because the health and recovery of patients can be adversely affected by a lack of dignity in their care. The aim of this study was to explore the effects of a nursing dignity care program upon patients’ dignity in care. A quasi-experimental research design was implemented. Nurses were recruited by purposive sampling, and their patients were recruited by simple random sampling. Nurses in the experimental group received the nursing educational program on dignity care, while nurses in the control group received in-service education as usual. Data were collected via two instruments: the dignity in care scale for nurses and the dignity in care scale for patients, both of which were developed by the researcher. Both questionnaires consisted of three domains: agreement, importance, and frequency of providing dignity care. A total of 178 nurses in the experimental group and 193 nurses in the control group completed the pretest and the follow-up evaluations at the first month, the third month, and the sixth month. The number of patients who were cared for by the nurses in the experimental group was 94 at the pretest. The numbers of patients in the post-tests at the first, third, and sixth months were 91, 85, and 77, respectively. In the control group, 88 patients completed the pretest, and 80 filled out the post-test at the first month, 77 at the third, and 74 at the sixth month. The major findings revealed that the scores of the agreement domain among nurses in the experimental group were significantly different from those of the control group at each point in time. The scores of the importance domain between these two groups also displayed significant differences at the pretest and the first month of the post-test. Moreover, the frequencies of providing dignity care to patients were significantly different at the pretest and the third and sixth months of the post-test. However, for patients, the experimental group differed significantly from the control group only in the frequencies of receiving dignity care, especially in the items of ‘privacy care,’ ‘communication care,’ and ‘emotional care.’ The results show that the nursing program on dignity care could increase nurses’ dignity care for patients in the three domains of agreement, importance, and frequency of providing dignity care. For patients, only the frequencies of receiving dignity care were significantly increased. Therefore, the nursing program on dignity care could be applicable to nurses’ in-service education and practice to enhance the ability of nurses to care for patients’ dignity.

Keywords: nurses, patients, dignity care, quasi-experimental, nursing education

Procedia PDF Downloads 456
25038 Cognitive Science Based Scheduling in Grid Environment

Authors: N. D. Iswarya, M. A. Maluk Mohamed, N. Vijaya

Abstract:

A grid is an infrastructure that allows the deployment of large amounts of distributed data from multiple locations to reach a common goal. Scheduling data-intensive applications becomes challenging as the data sets involved are very large. Only two solutions exist to tackle this challenging issue. First, the computation that requires huge data sets to be processed can be transferred to the data site. Second, the required data sets can be transferred to the computation site. In the former scenario, the computation cannot be transferred since the servers are storage/data servers with little or no computational capability. Hence, the second scenario can be considered for further exploration. During scheduling, transferring huge data sets from one site to another requires more network bandwidth. In order to mitigate this issue, this work focuses on incorporating cognitive science in scheduling. Cognitive science is the study of the human brain and its related activities. Current research is mainly focused on incorporating cognitive science into various computational modeling techniques. In this work, the problem-solving approach of the human brain is studied and incorporated into data-intensive scheduling in grid environments. Here, a cognitive engine (CE) is designed and deployed at various grid sites. The intelligent agents present in the CE help in analyzing the request and creating the knowledge base. Depending upon the link capacity, a decision is taken on whether to transfer the data sets or to partition them. Prediction of the next request is made by the agents to serve the requesting site with data sets in advance. This reduces the data availability time and data transfer time. The replica catalog and metadata catalog created by the agents assist in the decision-making process.

Keywords: data grid, grid workflow scheduling, cognitive artificial intelligence

Procedia PDF Downloads 387
25037 Heritage and Tourism in the Era of Big Data: Analysis of Chinese Cultural Tourism in Catalonia

Authors: Xinge Liao, Francesc Xavier Roige Ventura, Dolores Sanchez Aguilera

Abstract:

With the development of the Internet, the study of tourism behavior has rapidly expanded from the traditional physical market to the online market. Data on the Internet is characterized by dynamic changes, and new data appear all the time. In recent years, a large volume of data has been generated from sources such as forums, blogs, and other platforms, which have expanded over time and space; together they constitute large-scale Internet data, known as Big Data. This data of technological origin, which derives from the use of devices and the activity of multiple users, is becoming a source of great importance for the study of geography and the behavior of tourists. The study will focus on cultural heritage tourist practices in the context of Big Data. The research will focus on exploring the characteristics and behavior of Chinese tourists in relation to the cultural heritage of Catalonia. Geographical information, target image, and perceptions in user-generated content will be studied through the analysis of data from Weibo, the largest microblogging social network in China. Through the analysis of the behavior of heritage tourists in the Big Data environment, this study will understand the practices (activities, motivations, perceptions) of cultural tourists and then the needs and preferences of tourists, in order to better guide the sustainable development of tourism at heritage sites.

Keywords: Barcelona, Big Data, Catalonia, cultural heritage, Chinese tourism market, tourists’ behavior

Procedia PDF Downloads 126
25036 Towards A Framework for Using Open Data for Accountability: A Case Study of A Program to Reduce Corruption

Authors: Darusalam, Jorish Hulstijn, Marijn Janssen

Abstract:

The media have revealed a variety of corruption cases in regional and local governments all over the world. Many governments have pursued anti-corruption reforms and have created systems of checks and balances. Three types of corruption are faced by citizens: administrative corruption, collusion, and extortion. Accountability is one of the benchmarks for building transparent government. The public sector is required to report the results of the programs that have been implemented so that citizens can judge whether the institution has been working economically, efficiently, and effectively. Open Data offers solutions for the implementation of good governance in organizations that want to be more transparent. In addition, Open Data can create transparency and accountability towards the community. The objective of this paper is to build a framework of open data for accountability for combating corruption. This paper will investigate the relationship between open data and accountability as part of anti-corruption initiatives. This research will investigate the impact of open data implementation on public organizations.

Keywords: open data, accountability, anti-corruption, framework

Procedia PDF Downloads 318
25035 Generative Adversarial Network for Bidirectional Mappings between Retinal Fundus Images and Vessel Segmented Images

Authors: Haoqi Gao, Koichi Ogawara

Abstract:

Retinal vascular segmentation of color fundus images is the basis of ophthalmic computer-aided diagnosis and large-scale disease screening systems. Early screening for fundus diseases has great value for clinical medical diagnosis. The traditional methods depend on the experience of the doctor, which is time-consuming, labor-intensive, and inefficient. Furthermore, medical images are scarce and fraught with legal concerns regarding patient privacy. In this paper, we propose a new Generative Adversarial Network based on CycleGAN for retinal fundus images. This method can generate not only synthetic fundus images but also the corresponding segmentation masks, which has certain application value and poses challenges in computer vision and computer graphics. In the results, we evaluate our proposed method both quantitatively and qualitatively. For generated segmented images, our method achieves a Dice coefficient of 0.81 and a PR of 0.89 on the DRIVE dataset. For generated synthetic fundus images, we use a "Toy Experiment" to verify the state-of-the-art performance of our method.
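A minimal sketch of the CycleGAN-style cycle-consistency term behind such a bidirectional mapping (fundus to vessel mask and back); the tiny convolutional "generators" are placeholders, and the adversarial and identity losses of a full CycleGAN are omitted.

```python
# Minimal sketch of the cycle-consistency term of a CycleGAN-style model.
# The single-layer "generators" below are placeholders, not real architectures.
import torch
import torch.nn as nn

G = nn.Conv2d(3, 1, kernel_size=3, padding=1)   # fundus image -> vessel mask
F = nn.Conv2d(1, 3, kernel_size=3, padding=1)   # vessel mask -> fundus image
l1 = nn.L1Loss()

fundus = torch.rand(4, 3, 64, 64)               # placeholder fundus batch
mask   = torch.rand(4, 1, 64, 64)               # placeholder segmentation batch

# Cycle-consistency: translating to the other domain and back should recover
# the original input in both directions.
cycle_loss = l1(F(G(fundus)), fundus) + l1(G(F(mask)), mask)
print(cycle_loss.item())
```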

Keywords: retinal vascular segmentation, generative adversarial network, CycleGAN, fundus images

Procedia PDF Downloads 133
25034 Efficient GIS Based Public Health System for Disease Prevention

Authors: K. M. G. T. R. Waidyarathna, S. M. Vidanagamachchi

Abstract:

The public health system that exists in Sri Lanka has a satisfactorily complete information flow when compared to other systems in developing countries. The availability of a good health information system has contributed immensely to achieving health indices that are in line with developed countries like the US and the UK. However, the health information flow at the moment is completely paper based. In Sri Lanka, fields like banking, accounting, and engineering have incorporated information and communication technology to the same extent that can be observed in any other country. The field of medicine has lagged behind those fields throughout the world, mainly due to its complexity and issues like privacy, confidentiality, and the lack of people with knowledge in both Information Technology (IT) and medicine. Sri Lanka’s situation is much worse, and the gap is rapidly increasing with huge IT initiatives by private-public partnerships in all other countries. The major goal of the framework is to support minimizing the spread of diseases. To achieve that, a web-based framework with web mapping should be implemented for this application domain. The aim of this GIS-based public health system is a secure, flexible, easy-to-maintain environment for creating and maintaining public health records that is easy for the relevant parties to interact with.

Keywords: DHIS2, GIS, public health, Sri Lanka

Procedia PDF Downloads 555
25033 Syndromic Surveillance Framework Using Tweets Data Analytics

Authors: David Ming Liu, Benjamin Hirsch, Bashir Aden

Abstract:

Syndromic surveillance aims to detect or predict disease outbreaks through the analysis of medical sources of data. Using social media data like tweets for syndromic surveillance is becoming more and more popular, with the aid of open platforms for collecting data and the advantages of microblogging text and mobile geographic location features. In this paper, a syndromic surveillance framework with a machine learning kernel using tweet data analytics is presented. Influenza and the three cities of Abu Dhabi, Al Ain, and Dubai in the United Arab Emirates are used as the test disease and trial areas. Hospital case data provided by the Health Authority of Abu Dhabi (HAAD) are used for correlation purposes. In our model, a Latent Dirichlet Allocation (LDA) engine is adapted to perform supervised learning classification, and N-fold cross-validation confusion matrices are given as the simulation results, with an overall system recall of 85.595% achieved.
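As a hedged sketch of such a pipeline (LDA topic features from tweet text feeding a supervised classifier evaluated with N-fold cross-validation), with toy tweets and labels standing in for the study's data:

```python
# Hedged sketch: LDA topic features from tweet text, then a supervised
# classifier evaluated with N-fold cross-validation. Toy data only.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix, recall_score

tweets = ["fever and cough all week", "lovely day in abu dhabi",
          "flu season is terrible this year", "traffic in dubai tonight",
          "sore throat and chills again", "new cafe opened in al ain"] * 20
labels = [1, 0, 1, 0, 1, 0] * 20          # 1 = influenza-related

counts = CountVectorizer().fit_transform(tweets)
topics = LatentDirichletAllocation(n_components=5, random_state=0).fit_transform(counts)

pred = cross_val_predict(LogisticRegression(max_iter=1000), topics, labels, cv=5)
print(confusion_matrix(labels, pred))
print("recall:", recall_score(labels, pred))
```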

Keywords: syndromic surveillance, tweets, machine learning, data mining, Latent Dirichlet Allocation (LDA), influenza

Procedia PDF Downloads 104
25032 Tackling the Digital Divide: Enhancing Video Consultation Access for Digital Illiterate Patients in the Hospital

Authors: Wieke Ellen Bouwes

Abstract:

This study aims to unravel which factors enhance the accessibility of video consultations (VCs) for patients with low digital literacy. Thirteen in-depth interviews with patients, hospital employees, eHealth experts, and digital support organizations were held. Patients with low digital literacy received in-home support during real-time video consultations and were observed during the set-up of these consultations. Key findings highlight the importance of patient acceptance, emphasizing the benefits of video consultations and avoiding standardized courses. The lack of a uniform video consultation system across healthcare providers poses a barrier. Familiarity among healthcare practitioners with support organizations (which support patients in the usage of digital tools) enhances accessibility. Moreover, considerations regarding the Dutch General Data Protection Regulation (GDPR) law influence the support patients receive. Also, provider readiness to use video consultations influences patient access. Further, alignment between learning styles and support methods seems to determine patients' ability to learn how to use video consultations. Future research could delve into tailored learning styles and technological solutions for remote access to further explore the effectiveness of learning methods.

Keywords: video consultations, digital literacy skills, effectiveness of support, intra- and inter-organizational relationships, patient acceptance of video consultations

Procedia PDF Downloads 65