Search results for: Privacy Preserving Data Publication (PPDP)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25488

24498 Facilitating Primary Care Practitioners to Improve Outcomes for People With Oropharyngeal Dysphagia Living in the Community: An Ongoing Realist Review

Authors: Caroline Smith, Professor Debi Bhattacharya, Sion Scott

Abstract:

Introduction: Oropharyngeal dysphagia (OD) affects around 15% of older people; however, it is often unrecognised and underdiagnosed until they are hospitalised. There is a need for primary care healthcare practitioners (HCPs) to assume a proactive role in identifying and managing OD to prevent adverse outcomes such as aspiration pneumonia. Understanding the determinants of primary care HCPs undertaking this new behaviour provides the intervention targets to address. This realist review, underpinned by the Theoretical Domains Framework (TDF), aims to synthesise the relevant literature and develop programme theories to understand what interventions work, how they work and under what circumstances, to facilitate HCPs in preventing harm from OD. Combining realist methodology with behavioural science will permit conceptualisation of intervention components as theoretical behavioural constructs, thus informing the design of a future behaviour change intervention. Furthermore, through the TDF’s linkage to a taxonomy of behaviour change techniques, we will identify corresponding behaviour change techniques to include in this intervention. Methods & analysis: We are following the five steps for undertaking a realist review: 1) clarify the scope, 2) search the literature, 3) appraise and extract data, 4) synthesise the evidence, and 5) evaluate. We have searched the Medline, Google Scholar, PubMed, EMBASE, CINAHL, AMED, Scopus and PsycINFO databases. We are obtaining additional evidence through grey literature, snowball sampling, lateral searching and consultation with the stakeholder group. Literature is being screened, evaluated and synthesised in Excel and NVivo. We will appraise evidence in relation to its relevance and rigour. Data will be extracted and synthesised according to their relation to the initial programme theories (IPTs). The IPTs were constructed after the preliminary literature search, informed by the TDF and with input from a stakeholder group of patient and public involvement advisors, general practitioners, speech and language therapists, geriatricians and pharmacists. We will follow the Realist and Meta-narrative Evidence Syntheses: Evolving Standards (RAMESES) quality and publication standards to report study results. Results: In this ongoing review, our search has identified 1417 manuscripts, with approximately 20% progressing to full-text screening. We inductively generated 10 IPTs that hypothesise that practitioners require: the knowledge to spot the signs and symptoms of OD; the skills to provide initial advice and support; and access to resources in their working environment to support them in conducting these new behaviours. We mapped the 10 IPTs to 8 TDF domains and then generated a further 12 IPTs deductively, using the domain definitions to cover the remaining 6 TDF domains. Deductively generated IPTs broadened our thinking to consider domains such as ‘Emotion’, ‘Optimism’ and ‘Social Influence’, e.g. if practitioners perceive that patients, carers and relatives expect initial advice and support, then they will be more likely to provide this, because they will feel obligated to do so. After prioritisation with stakeholders using a modified nominal group technique approach, a maximum of 10 IPTs will progress to testing against the literature.

Keywords: behaviour change, deglutition disorders, primary healthcare, realist review

Procedia PDF Downloads 82
24497 The Quality of Business Relationships in the Tourism System: An Imaginary Organisation Approach

Authors: Armando Luis Vieira, Carlos Costa, Arthur Araújo

Abstract:

The tourism system can be viewed as a network of relationships amongst business partners where the success of each actor will ultimately be determined by the success of the whole network. Especially since the publication of Gummesson’s (1996) ‘theory of imaginary organisations’, which suggests that organisational effectiveness largely depends on managing relationships and sharing resources and activities, relationship quality (RQ) has been increasingly recognised as a main source of value creation and competitive advantage. However, there is still ambiguity around this topic, and managers and researchers have been recurrently reporting the need to better understand and capitalise on the quality of interactions with business partners. This research aims at testing an RQ model from a relational, imaginary organisation approach. Two mail surveys provide the perceptions of 725 hotel representatives about their business relationships with tour operators, and of 1,224 corporate client representatives about their business relationships with hotels (21.9% and 38.8% response rates, respectively). The analysis contributes to enhancing our understanding of the linkages between RQ and its determinants, and identifies the role of their dimensions. Structural equation modelling results highlight trust as the dominant dimension, the crucial role of commitment and satisfaction, and suggest customer orientation as a complementary building block. Findings also emphasise problem-solving behaviour and selling orientation as the most relevant dimensions of customer orientation. The comparison of the two ‘dyads’ deepens the discussion and enriches the suggested theoretical and managerial guidelines concerning the contribution of quality relationships to business performance.

Keywords: corporate clients, destination competitiveness, hotels, relationship quality, structural equations modelling, tour operators

Procedia PDF Downloads 390
24496 A Policy Strategy for Building Energy Data Management in India

Authors: Shravani Itkelwar, Deepak Tewari, Bhaskar Natarajan

Abstract:

Energy consumption data plays a vital role in energy efficiency policy design, implementation, and impact assessment. Any demand-side energy management intervention's success relies on the availability of accurate, comprehensive, granular, and up-to-date data on energy consumption. The building sector, including residential and commercial buildings, is one of the largest consumers of energy in India after the industrial sector. With economic growth and increasing urbanization, the building sector is projected to grow at an unprecedented rate, resulting in a 5.6-fold increase in energy consumption by 2047 compared to 2017. Therefore, energy efficiency interventions will play a vital role in decoupling floor area growth from the associated energy demand, thereby increasing the need for robust data. In India, multiple institutions are involved in the collection and dissemination of data. This paper focuses on energy consumption data management in the building sector in India for both residential and commercial segments. It evaluates the robustness of data available through administrative and survey routes to estimate the key performance indicators and identify critical data gaps for making informed decisions. The paper explores several issues in the data, such as lack of comprehensiveness, non-availability of disaggregated data, discrepancies between different data sources, inconsistent building categorization, and others. The identified data gaps are justified with appropriate examples. Moreover, the paper prioritizes the required data in order of relevance to policymaking and groups it into "available," "easy to get," and "hard to get" categories. The paper concludes with recommendations to address the data gaps by leveraging digital initiatives, strengthening institutional capacity, institutionalizing exclusive building energy surveys, and standardizing building categorization, among others, to strengthen the management of building sector energy consumption data.

Keywords: energy data, energy policy, energy efficiency, buildings

Procedia PDF Downloads 182
24495 Ethnic Identity Formation in Diaspora of Bajau Samah: An Ethnomusicological Study of Bertitik Music Ensemble in the Northwest Coast of Sabah, Malaysia

Authors: Mohd Hassan Abdullah, Mohd Azam Sulong, Mohd Nizam Nasrifan, Nor Azman Mohd Ramli, Suflan Faidzal Arshad

Abstract:

The Bajau Samah are a maritime ethnic community inhabiting the west coast of Sabah, Malaysia. The majority of this ethnic group embrace Islam and practice their own culture. The Bertitik music ensemble is one of the musical practices performed at various social events, especially weddings. The ensemble, which combines several musical instruments including gongs, drums and kulintangan, is played by six musicians to accompany various social events in the community. The position of the Bajau Samah in a multi-ethnic community comprising the Kadazandusun, Rungus, Suluk, Malay, Iranun and others exposes them to the cultural activities and various artistic elements of the surrounding communities. Western influences have also played an important role in the process of hybridity and acculturation in this society. Cultural change and the influx of foreign cultures have threatened the sustainability of this musical practice. This study aims to musicologically analyze the elements of the Bertitik ensemble that form the uniqueness of the cultural identity of the Bajau Samah ethnic group. An ethnomusicological approach has been used to examine the essence of the Bertitik music repertoire in depth. An ethnographic study design comprising fieldwork, interviews, observations and document analysis as the main methods was utilized to collect data. Music recordings were transcribed in the form of musical notation and then analyzed based on the theory of "the norms of musical styles". This study reveals that the musical elements featured in the ensemble represent a symbol of cultural identity for this ethnic group. The findings of the study were documented in the form of musicological analysis, audio and video recordings, as well as transcriptions in musical notation of the repertoire of the music ensemble. This study is in line with the National Cultural Policy gazetted by the government, which calls for the "conservation, preservation and development of culture towards strengthening the foundations of National Culture through joint research, development, education, expansion and cultural relations." It will benefit various parties, including students, teachers, academics, cultural arts activists and so on, towards preserving the nation's cultural heritage as well as strengthening the spirit of nationhood among the people of the various races and ethnic groups in Malaysia.

Keywords: ethnomusicology, ethnic music, Malaysian music, cultural identity

Procedia PDF Downloads 136
24494 A Survey on Data-Centric and Data-Aware Techniques for Large Scale Infrastructures

Authors: Silvina Caíno-Lores, Jesús Carretero

Abstract:

Large scale computing infrastructures have been widely developed with the core objective of providing a suitable platform for high-performance and high-throughput computing. These systems are designed to support resource-intensive and complex applications, which can be found in many scientific and industrial areas. Currently, large scale data-intensive applications are hindered by the high latencies that result from the access to vastly distributed data. Recent works have suggested that improving data locality is key to moving towards exascale infrastructures efficiently, as solutions to this problem aim to reduce the bandwidth consumed in data transfers and the overheads that arise from them. There are several techniques that attempt to move computations closer to the data. In this survey we analyse the different mechanisms that have been proposed to provide data locality for large scale high-performance and high-throughput systems. This survey intends to assist the scientific computing community in understanding the various technical aspects and strategies that have been reported in recent literature regarding data locality. As a result, we present an overview of locality-oriented techniques, which are grouped into four main categories: application development, task scheduling, in-memory computing and storage platforms. Finally, the authors include a discussion on future research lines and synergies among the former techniques.

Keywords: data locality, data-centric computing, large scale infrastructures, cloud computing

Procedia PDF Downloads 255
24493 Wind Speed Data Analysis in Colombia in 2013 and 2015

Authors: Harold P. Villota, Alejandro Osorio B.

Abstract:

Energy meteorology is an area for studying energy complementarity and the use of renewable sources in interconnected systems. In order to diversify the energy matrix in Colombia with wind sources, it is necessary to understand the available databases. However, the time series provided by 260 automatic weather stations contain empty and invalid records, so the purpose of this work is to fill the time series by selecting two years to characterize and impute, and to use them as a basis for completing the data between 2005 and 2020.
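
As an aside for readers unfamiliar with this kind of gap-filling, the sketch below shows one common way to characterize and impute a wind-speed series with pandas; the file name, column names, sentinel value and interpolation settings are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of time-series gap filling for one weather station.
import pandas as pd

# Hypothetical CSV with one station's records: timestamp, wind_speed (m/s)
df = pd.read_csv("station_0001.csv", parse_dates=["timestamp"])
df = df.set_index("timestamp").sort_index()

# Treat sentinel values such as -999 as missing before imputation
df["wind_speed"] = df["wind_speed"].where(df["wind_speed"] >= 0)

# Regularise to an hourly grid, then impute short gaps by time interpolation
hourly = df["wind_speed"].resample("1H").mean()
filled = hourly.interpolate(method="time", limit=6)   # fill gaps up to 6 hours

print(f"missing before: {hourly.isna().sum()}, after: {filled.isna().sum()}")
```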

Keywords: complementarity, wind speed, renewable, Colombia, characterization, imputation

Procedia PDF Downloads 161
24492 Industrial Process Mining Based on Data Pattern Modeling and Nonlinear Analysis

Authors: Hyun-Woo Cho

Abstract:

Unexpected events may occur with serious impacts on industrial processes. This work utilizes a data representation technique to model and analyze process data patterns for the purpose of diagnosis. In this work, the use of a triangular representation of process data is evaluated using a simulated process. Furthermore, the effects of using different pre-treatment techniques, based on either linear or nonlinear reduced spaces, were compared. This work extracts the fault pattern in the reduced space, not in the original data space. The results have shown that the diagnosis method based on the nonlinear technique produced more reliable results and outperforms the linear method.
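
The sketch below illustrates, on invented data, the contrast the abstract draws between diagnosis in a linear and a nonlinear reduced space; it is a generic PCA versus kernel PCA example, not the authors' triangular-representation method.

```python
# Fault scoring in a reduced space: linear (PCA) vs. nonlinear (kernel PCA).
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(0)
X_normal = rng.normal(size=(500, 10))                        # normal operating data
X_fault = X_normal[:50] + rng.normal(3, 1, size=(50, 10))    # shifted "faulty" batch

def t2_scores(model, X_train, X_test):
    """Hotelling-T2-like statistic in the reduced space of a fitted model."""
    T_train = model.fit_transform(X_train)
    T_test = model.transform(X_test)
    var = T_train.var(axis=0) + 1e-12
    return ((T_test ** 2) / var).sum(axis=1)

lin = PCA(n_components=3)
nonlin = KernelPCA(n_components=3, kernel="rbf", gamma=0.1)

for name, model in [("linear PCA", lin), ("kernel PCA", nonlin)]:
    scores = t2_scores(model, X_normal, X_fault)
    print(name, "mean fault score:", scores.mean().round(2))
```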

Keywords: process monitoring, data analysis, pattern modeling, fault, nonlinear techniques

Procedia PDF Downloads 384
24491 A Sequence of Traumatic Pain: Feminist Issues within Laila Al-Othman’s Ṣamt al-Farāshāt (Silence of the Butterflies)

Authors: Khaled Igbaria

Abstract:

Laila Al-Othman is a well-known feminist writer in Kuwait and the entire Arab world. She was born in 1943 in Kuwait to a large and wealthy family. The author has written several short stories, as well as novels, such as The Woman and the Cat (1985) and Wasumayya Comes out of the Sea (1986), which was chosen as one of the best 100 Arab novels of the 21st century. Another prominent novel of hers is Ṣamt al-Farāshāt [Silence of the Butterflies] (2007), which was highly controversial in her native Kuwait upon publication. This study explores her engagement with feminism through the different ways in which the novel addresses several feminist issues, mainly forced marriage, rape and sexual abuse, gender-based physical and sexual violence, and enforced silence. This paper focuses on demonstrating the social obstacles and continuous trauma caused by a sequence of pain experienced by Arab women in their patriarchal society. This study argues that the novel reveals a sustained effort to raise the banner of feminism and a strong desire to liberate Arab women from patriarchal domination. Al-Othman successfully and uniquely represents women as gender-based traumatic victims of sexual and physical violence, forced silence, and general oppression in the patriarchal Arab society, as those needing help, support, protection, and liberation. They are not represented as independent or free. Methodologically, the study employs a qualitative literary analysis method in addition to trauma theory and psychoanalysis, concentrating on the feminist issues highlighted in the novel.

Keywords: Al-Othman, Arab women's pain, trauma within narration, Silence of the Butterflies

Procedia PDF Downloads 57
24490 Recommender System Based on Mining Graph Databases for Data-Intensive Applications

Authors: Mostafa Gamal, Hoda K. Mohamed, Islam El-Maddah, Ali Hamdi

Abstract:

In recent years, many digital documents have been created on the web due to the rapid growth of ’social applications’ communities, or ’data-intensive applications’. The evolution of online multimedia data poses new challenges in storing and querying large amounts of data for online recommender systems. Graph data models have been shown to be more efficient than relational data models for processing complex data. This paper will explain the key differences between graph and relational databases, their strengths and weaknesses, and why using graph databases is the best technology for building a real-time recommendation system. The paper will also discuss several similarity metric algorithms that can be used to compute a similarity score for pairs of nodes based on their neighbourhoods or their properties. Finally, the paper will explore how NLP strategies offer the premise to improve the accuracy and coverage of real-time recommendations by extracting information from the stored unstructured knowledge, which makes up the bulk of the world’s data, to enrich the graph database with this information. As the size and number of data items are increasing rapidly, the proposed system should meet current and future needs.
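
As a minimal illustration of the neighbourhood-based similarity scoring the paper mentions, the snippet below computes Jaccard similarity over a tiny, invented user-item graph; production graph databases expose comparable metrics natively.

```python
# Neighbourhood (Jaccard) similarity for a toy recommendation scenario.
def jaccard(graph, a, b):
    """Similarity of two nodes based on their shared neighbours."""
    na, nb = set(graph[a]), set(graph[b])
    union = na | nb
    return len(na & nb) / len(union) if union else 0.0

# Hypothetical bipartite graph: users -> items they interacted with
graph = {
    "alice": {"item1", "item2", "item3"},
    "bob":   {"item2", "item3", "item4"},
    "carol": {"item5"},
}

# Recommend to alice the items of her most similar user
sims = {u: jaccard(graph, "alice", u) for u in graph if u != "alice"}
best = max(sims, key=sims.get)
recommendations = graph[best] - graph["alice"]
print(best, sims[best], recommendations)
```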

Keywords: graph databases, NLP, recommendation systems, similarity metrics

Procedia PDF Downloads 102
24489 Digital Revolution: A Veritable Infrastructure for Technological Development

Authors: Osakwe Jude Odiakaosa

Abstract:

Today’s digital society is characterized by e-education or e-learning, e-commerce, and so on. All these have been propelled by the digital revolution. Digital technology, such as computer technology, the Global Positioning System (GPS) and Geographic Information Systems (GIS), has had a tremendous impact on the field of technology. This development has positively affected the scope, methods and speed of data acquisition, data management and the rate of delivery of the results (maps and other map products) of data processing. This paper addresses the impact of the revolution brought about by digital technology.

Keywords: digital revolution, internet, technology, data management

Procedia PDF Downloads 446
24488 Effect of Phenolic Compounds on Off-Odor Development and Oxidative Stability of Camel Meat during Refrigerated Storage

Authors: Sajid Maqsood, Aysha Al Rashedi, Aisha Abushelaibi, Kusaimah Manheem

Abstract:

The impact of different natural antioxidants on lipid oxidation, microbial load and sensorial quality in ground camel meat (leg region) during 9 days of refrigerated storage was investigated. Control camel meat showed higher lipid oxidation products (peroxide value (PV) and thiobarbituric acid reactive substances (TBARS)) during the storage period. Upon addition of the different natural antioxidants, PV and TBARS were retarded, especially in samples treated with tannic acid (TA), catechin (CT) and gallic acid (GA) (p<0.05). Haem iron content decreased with increasing storage period and was found to be lower in samples treated with caffeic acid (CA) and gallic acid (GA) at the end of the storage period (p<0.05). Furthermore, lower mesophilic bacterial counts (MBC) and psychrophilic bacterial counts (PBC) were observed in TA- and CT-treated samples compared to the control and other samples (p<0.05). Camel meat treated with TA and CT also received higher likeness scores for colour, odor and overall appearance compared to control samples (p<0.05). Therefore, the addition of different natural antioxidants, especially TA and CT, retarded lipid oxidation and microbial growth and was also effective in maintaining the sensory attributes (colour and odor) of ground camel meat during storage at 4°C. Hence, TA and CT could be considered potential natural antioxidants for preserving the quality of camel meat displayed on refrigerated shelves.

Keywords: natural antioxidants, lipid oxidation, quality, camel meat

Procedia PDF Downloads 431
24487 Implementation of Big Data Concepts Led by the Business Pressures

Authors: Snezana Savoska, Blagoj Ristevski, Violeta Manevska, Zlatko Savoski, Ilija Jolevski

Abstract:

Big data is widely accepted by pharmaceutical companies as a result of business demands created through legal pressure. Pharmaceutical companies face many legal demands as well as standards-related demands and have to adapt their procedures to the legislation. To manage these demands, they have to standardize the usage of their current information technology and use the latest software tools. This paper highlights some important aspects of the experience with big data project implementation in a Macedonian pharmaceutical company. These projects improved the company's business processes with the help of new software tools selected to comply with legal and business demands. The company uses IT as a strategic tool to obtain competitive advantage in the market and to reengineer its processes towards the new Internet economy and quality demands. The company is required to manage vast amounts of structured as well as unstructured data. For these reasons, it implements projects for emerging and appropriate software tools which have to deal with the big data concepts accepted in the company.

Keywords: big data, unstructured data, SAP ERP, documentum

Procedia PDF Downloads 265
24486 The Human Process of Trust in Automated Decisions and Algorithmic Explainability as a Fundamental Right in the Exercise of Brazilian Citizenship

Authors: Paloma Mendes Saldanha

Abstract:

Access to information is a prerequisite for democracy while also guiding the material construction of fundamental rights. The exercise of citizenship requires knowing, understanding, questioning, advocating for, and securing rights and responsibilities. In other words, it goes beyond mere active electoral participation and materializes through awareness and the struggle for rights and responsibilities in the various spaces occupied by the population in their daily lives. In times of hyper-cultural connectivity, active citizenship is shaped through ethical trust processes, most often established between humans and algorithms. Automated decisions, so prevalent in various everyday situations, such as purchase preference predictions, virtual voice assistants, reduction of accidents in autonomous vehicles, content removal, resume selection, etc., have already found their place as a normalized discourse that sometimes does not reveal or make clear what violations of fundamental rights may occur when algorithmic explainability is lacking. In other words, technological and market development promotes a normalization for the use of automated decisions while silencing possible restrictions and/or breaches of rights through a culturally modeled, unethical, and unexplained trust process, which hinders the possibility of the right to a healthy, transparent, and complete exercise of citizenship. In this context, the article aims to identify the violations caused by the absence of algorithmic explainability in the exercise of citizenship through the construction of an unethical and silent trust process between humans and algorithms in automated decisions. As a result, it is expected to find violations of constitutionally protected rights such as privacy, data protection, and transparency, as well as the stipulation of algorithmic explainability as a fundamental right in the exercise of Brazilian citizenship in the era of virtualization, facing a threefold foundation called trust: culture, rules, and systems. To do so, the author will use a bibliographic review in the legal and information technology fields, as well as the analysis of legal and official documents, including national documents such as the Brazilian Federal Constitution, as well as international guidelines and resolutions that address the topic in a specific and necessary manner for appropriate regulation based on a sustainable trust process for a hyperconnected world.

Keywords: artificial intelligence, ethics, citizenship, trust

Procedia PDF Downloads 56
24485 Saving Energy at a Wastewater Treatment Plant through Electrical and Production Data Analysis

Authors: Adriano Araujo Carvalho, Arturo Alatrista Corrales

Abstract:

This paper intends to show how electrical energy consumption and production data analysis was used to find opportunities to save energy at the Taboada wastewater treatment plant in Callao, Peru. In order to access the data, independent data networks were used for both electrical and process instruments, and the data were analyzed under an ISO 50001 energy audit, which considered Energy Performance Indexes for each process and followed the step-by-step guide presented in this text. Thanks to the aforementioned methodology and data mining techniques applied to information gathered through electronic multimeters (conveniently placed on substation switchboards connected to a cloud network), it was possible to identify the performance of each process thoroughly and thus reveal saving opportunities that were previously hidden. The data analysis brought both cost and energy reductions, allowing the plant to save significant resources and to be certified under ISO 50001.
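
For readers unfamiliar with Energy Performance Indexes, the toy calculation below shows one plausible per-process EnPI (kWh per cubic metre treated); the figures and process names are made up and do not come from the Taboada plant.

```python
# Illustrative per-process Energy Performance Index (EnPI) computation.
records = [
    # (process, kWh consumed, m3 of wastewater handled)
    ("pre-treatment", 1200.0, 95000.0),
    ("pumping",       5400.0, 95000.0),
    ("disinfection",   800.0, 95000.0),
]

for process, kwh, m3 in records:
    enpi = kwh / m3                     # kWh/m3 for this process
    print(f"{process:15s} EnPI = {enpi:.4f} kWh/m3")

# Plant-level intensity, assuming all processes handle the same treated volume
total = sum(kwh for _, kwh, _ in records) / records[0][2]
print(f"plant-level EnPI = {total:.4f} kWh/m3")
```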

Keywords: energy and production data analysis, energy management, ISO 50001, wastewater treatment plant energy analysis

Procedia PDF Downloads 191
24484 Data Clustering in Wireless Sensor Network Implemented on Self-Organization Feature Map (SOFM) Neural Network

Authors: Krishan Kumar, Mohit Mittal, Pramod Kumar

Abstract:

A wireless sensor network is one of the most promising communication networks for monitoring remote environmental areas. In this network, all the sensor nodes communicate with each other via radio signals. The sensor nodes have capabilities of sensing, data storage and processing. Sensor nodes collect information from neighboring nodes and deliver it to a particular node. Data collection and processing are carried out by data aggregation techniques. For data aggregation in the sensor network, a clustering technique is implemented by applying a self-organizing feature map (SOFM) neural network. Some of the sensor nodes are selected as cluster head nodes. Information is aggregated at the cluster head nodes from the non-cluster head nodes and is then transferred to the base station (or sink nodes). The aim of this paper is to manage the huge amount of data with the help of the SOM neural network. Clustered data are selected for transfer to the base station instead of the whole of the information aggregated at the cluster head nodes. This reduces the battery consumption associated with managing the huge volume of data. The network lifetime is enhanced to a great extent.
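
The following toy script sketches how a small self-organising feature map could group sensor-node readings into candidate clusters; the node data, map size and training constants are invented for illustration and the neighbourhood function is heavily simplified.

```python
# Toy SOFM used to group sensor-node readings prior to cluster-head selection.
import numpy as np

rng = np.random.default_rng(1)
readings = rng.random((100, 2))          # e.g. node positions or sensed values
n_clusters = 4                           # one prototype per prospective cluster head
weights = rng.random((n_clusters, 2))    # SOM prototype vectors

lr, epochs = 0.5, 50
for epoch in range(epochs):
    eta = lr * (1 - epoch / epochs)                          # decaying learning rate
    for x in readings:
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))    # best-matching unit
        # pull the winner (and, crudely, its ring neighbours) towards the sample
        for j in range(n_clusters):
            h = np.exp(-abs(j - bmu))                        # simple neighbourhood kernel
            weights[j] += eta * h * (x - weights[j])

# Assign each node to its nearest prototype; these groups feed cluster-head selection
assignment = np.argmin(((readings[:, None, :] - weights) ** 2).sum(axis=2), axis=1)
print(np.bincount(assignment, minlength=n_clusters))
```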

Keywords: artificial neural network, data clustering, self organization feature map, wireless sensor network

Procedia PDF Downloads 513
24483 Excision and Reconstruction of a Hypertrophic and Functional Bleb with Bovine Pericardium (Tutopatch®) and Amniotic Membrane: A Case Report

Authors: Blanca Fatela Cantillo, Silvia Iglesias Cerrato, Guadalupe Garrido Ceca

Abstract:

Purpose: Bleb dysfunction is a late complication following glaucoma filtration surgery. We describe our surgical technique for the excision and reconstruction of a hypertrophic bleb using a bovine pericardium patch graft (Tutopatch®) and amniotic membrane. Material and methods: The case report presents a hypertrophic bleb extending over the cornea with good intraocular pressure control. The hanging bleb, without leak, caused dysesthesia and high irregular astigmatism. Bleb reconstruction involved the excision of corneal fibrous material and avascular conjunctiva, preserving the original sclera and Tenon's capsule. A bovine pericardium patch graft (Tutopatch®) was placed over these with fixed sutures, reinforcing the underlying sclera, and the conjunctiva was advanced. The superior corneal epithelial defect was covered using an amniotic membrane. Conclusion: Repair of bleb dysfunction with varied techniques has been reported, including conjunctival advancement and the use of scleral, dural or pericardium patch grafts. Additional use of amniotic membrane promotes epithelialization and exhibits anti-fibrotic and anti-inflammatory features. Reconstruction with a bovine pericardium patch graft and amniotic membrane resulted in pain relief, visual rehabilitation, and good aesthetic results, with preservation of bleb function.

Keywords: reconstruction, hypertrophic bleb, bovine pericardium, amniotic membrane, dysesthesia of the bleb

Procedia PDF Downloads 73
24482 Review and Comparison of Associative Classification Data Mining Approaches

Authors: Suzan Wedyan

Abstract:

Data mining is one of the main phases of Knowledge Discovery in Databases (KDD), which is responsible for finding hidden and useful knowledge in databases. There are many different tasks in data mining, including regression, pattern recognition, clustering, classification, and association rule mining. In recent years a promising data mining approach called associative classification (AC) has been proposed; AC integrates classification and association rule discovery to build classification models (classifiers). This paper surveys and critically compares several AC algorithms with reference to the different procedures used in each algorithm, such as rule learning, rule sorting, rule pruning, classifier building, and class allocation for test cases.
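
A minimal sketch of two of the procedures named above, rule sorting and class allocation, is given below; the rules, ordering criteria and default class are illustrative assumptions rather than any specific AC algorithm surveyed in the paper.

```python
# Rule sorting and class allocation as commonly done in associative classifiers.
from dataclasses import dataclass

@dataclass
class Rule:
    antecedent: frozenset   # items that must appear in the instance
    label: str              # predicted class
    support: float
    confidence: float

rules = [
    Rule(frozenset({"a"}),      "no",  support=0.40, confidence=0.80),
    Rule(frozenset({"a", "b"}), "yes", support=0.20, confidence=0.95),
    Rule(frozenset({"c"}),      "yes", support=0.10, confidence=0.75),
]

# Typical AC ordering: higher confidence first, ties broken by support,
# then by shorter (more general) antecedent.
rules.sort(key=lambda r: (-r.confidence, -r.support, len(r.antecedent)))

def classify(instance, rules, default="no"):
    """Class allocation: first sorted rule whose antecedent is satisfied wins."""
    for r in rules:
        if r.antecedent <= instance:
            return r.label
    return default

print(classify({"a", "b", "c"}, rules))   # -> "yes" (most confident matching rule)
```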

Keywords: associative classification, classification, data mining, learning, rule ranking, rule pruning, prediction

Procedia PDF Downloads 532
24481 Preserving the Cultural Values of the Mararoa River and Waipuna–Freshwater Springs, Southland New Zealand: An Integration of Traditional and Scientific Knowledge

Authors: Erine van Niekerk, Jason Holland

Abstract:

In Māori culture, water is considered to be the foundation of all life and has its own mana (spiritual power) and mauri (life force). Water classification for cultural values therefore includes categories such as waitapu (sacred water), waimanawa-whenua (water from under the land) and waipuna (freshwater springs), the relationship between water quantity and quality, and the relationship between surface water and groundwater. Particular rivers and lakes have special significance to iwi and hapū within their rohe (tribal areas). The Mararoa River, including its freshwater springs and wetlands, is an example of such an area. There is currently little information available about the sources, characteristics and behaviour of these important water resources, and this study on the water quality of the Mararoa River and adjacent freshwater springs will provide valuable information to be used in informed decisions about water management. The regional council of Southland, Environment Southland, is required to make changes under its water quality policy in order to comply with the requirements of the new National Standards for Freshwater, including consulting with Māori to determine strategies for decision making. This requires an approach that combines traditional knowledge with scientific knowledge in the decision-making process. This study provides the scientific data that can be used in future decision making on freshwater springs, combined with the traditional values for this particular area. Several parameters have been tested in situ as well as in a laboratory. Parameters such as temperature, salinity, electrical conductivity, total dissolved solids, total Kjeldahl nitrogen, total phosphorus, total suspended solids, and Escherichia coli, among others, show that the recorded values of all test parameters fall within the recommended ANZECC guidelines and Environment Southland standards and do not raise any concerns for the water quality of the springs and the river at the moment. However, the destruction of natural areas, particularly due to changes in farming practices, and the changes to water quality caused by the introduction of Didymosphenia geminata (didymo) mean that Māori have already lost many of their traditional mahinga kai (food sources). There is a major change from land uses such as sheep farming to dairying in Southland, which puts freshwater resources under pressure. It is, therefore, important to draw on traditional knowledge and spirituality alongside scientific knowledge to protect the waters of the Mararoa River and waipuna. This study hopes to contribute scientific knowledge to preserve the cultural values of these significant waters.

Keywords: cultural values, freshwater springs, Maori, water quality

Procedia PDF Downloads 278
24480 Hierarchical Checkpoint Protocol in Data Grids

Authors: Rahma Souli-Jbali, Minyar Sassi Hidri, Rahma Ben Ayed

Abstract:

A grid of computing nodes has emerged as a representative means of connecting distributed computers or resources scattered all over the world for the purposes of computing and distributed storage. Since fault tolerance becomes complex due to the availability of resources in a decentralized grid environment, it can be used in connection with replication in data grids. The objective of our work is to present fault tolerance in data grids with a data replication-driven model based on clustering. The performance of the protocol is evaluated with the OMNeT++ simulator. The computational results show the efficiency of our protocol in terms of recovery time and the number of processes involved in rollbacks.
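
As a rough illustration of the checkpoint-and-rollback idea (not the authors' hierarchical protocol), the sketch below snapshots a cluster's process states and reports the two metrics the abstract evaluates: recovery time and the number of processes rolled back.

```python
# Highly simplified cluster-level checkpointing with rollback recovery.
import copy
import time

class Cluster:
    def __init__(self, name, processes):
        self.name = name
        self.state = {p: 0 for p in processes}   # process -> logical progress
        self.checkpoints = []

    def step(self):
        for p in self.state:
            self.state[p] += 1                   # simulate work

    def checkpoint(self):
        # keep a replicated copy; in a data grid this would be stored at another site
        self.checkpoints.append(copy.deepcopy(self.state))

    def recover(self):
        start = time.perf_counter()
        last = self.checkpoints[-1]
        rolled_back = sum(1 for p in self.state if self.state[p] != last[p])
        self.state = copy.deepcopy(last)
        return rolled_back, time.perf_counter() - start

c = Cluster("c0", ["p1", "p2", "p3"])
for i in range(5):
    c.step()
    if i == 2:
        c.checkpoint()

rolled, secs = c.recover()
print(f"processes rolled back: {rolled}, recovery time: {secs:.6f}s")
```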

Keywords: data grids, fault tolerance, clustering, chandy-lamport

Procedia PDF Downloads 334
24479 An Observation of the Information Technology Research and Development Based on Article Data Mining: A Survey Study on Science Direct

Authors: Muhammet Dursun Kaya, Hasan Asil

Abstract:

One of the most important factors in research and development is deep insight into the evolution of scientific development. State-of-the-art tools and instruments can considerably assist researchers, and many organizations worldwide have become aware of the advantages of data mining for acquiring the knowledge hidden in unstructured data. This paper reviews the articles on information technology published in the past five years with the aid of data mining. A clustering approach was used to study these articles, and the results revealed that three topics, namely health, innovation, and information systems, have captured the special attention of researchers.
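
A compact example of the clustering step described above is sketched here with TF-IDF vectors and k-means; the sample abstracts and the choice of three clusters are illustrative assumptions, not the paper's dataset or tooling.

```python
# Cluster a handful of toy article abstracts by topic.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

abstracts = [
    "electronic health records and clinical decision support",
    "open innovation strategies in software firms",
    "enterprise information systems integration",
    "patient monitoring with mobile health sensors",
]

X = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

for label, text in zip(km.labels_, abstracts):
    print(label, text[:45])
```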

Keywords: information technology, data mining, scientific development, clustering

Procedia PDF Downloads 274
24478 Security in Resource Constraints: Network Energy Efficient Encryption

Authors: Mona Almansoori, Ahmed Mustafa, Ahmad Elshamy

Abstract:

Wireless nodes in a sensor network gather and process critical information and are designed to process and communicate it; the information flowing through such a network is critical for decision making and data processing, and the integrity of these data is one of the most critical factors in wireless security, which must be ensured without compromising the processing and transmission capability of the network. This paper presents a mechanism to securely transmit data over a chain of sensor nodes without compromising the throughput of the network, utilizing the battery resources available at each sensor node.
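
The toy snippet below is not the authors' protocol; it merely illustrates hop-by-hop protection with neighbour-shared symmetric keys along a chain of nodes, using the `cryptography` package's Fernet primitive as a stand-in for a lightweight cipher.

```python
# Hop-by-hop link protection along a chain of sensor nodes (illustration only).
from cryptography.fernet import Fernet

chain = ["node_a", "node_b", "node_c", "sink"]

# Hypothetical pre-shared key per adjacent pair (neighbour-based key sharing)
link_keys = {(chain[i], chain[i + 1]): Fernet.generate_key()
             for i in range(len(chain) - 1)}

def forward(payload: bytes) -> bytes:
    """Encrypt on each link; the next hop verifies integrity and recovers the data."""
    data = payload
    for i in range(len(chain) - 1):
        f = Fernet(link_keys[(chain[i], chain[i + 1])])
        data = f.encrypt(data)   # sender protects the link
        data = f.decrypt(data)   # receiver authenticates and decrypts
    return data

assert forward(b"sensor reading: 21.7C") == b"sensor reading: 21.7C"
```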

Keywords: hybrid protocol, data integrity, lightweight encryption, neighbor based key sharing, sensor node data processing, Z-MAC

Procedia PDF Downloads 142
24477 Data Mining Techniques for Anti-Money Laundering

Authors: M. Sai Veerendra

Abstract:

Today, money laundering (ML) poses a serious threat not only to financial institutions but also to the nation. This criminal activity is becoming more and more sophisticated and seems to have moved from the cliché of drug trafficking to financing terrorism, and surely not forgetting personal gain. Most financial institutions internationally have been implementing anti-money laundering (AML) solutions to fight investment fraud activities. However, traditional investigative techniques consume numerous man-hours. Recently, data mining approaches have been developed and are considered well-suited techniques for detecting ML activities. Within the scope of a collaboration project on developing a new data mining solution for AML units in an international investment bank in Ireland, we survey recent data mining approaches for AML. In this paper, we present not only these approaches but also give an overview of the important factors in building data mining solutions for AML activities.

Keywords: data mining, clustering, money laundering, anti-money laundering solutions

Procedia PDF Downloads 531
24476 Digitizing Masterpieces in Italian Museums: Techniques, Challenges and Consequences from Giotto to Caravaggio

Authors: Ginevra Addis

Abstract:

The possibility of reproducing physical artifacts in a digital format is one of the opportunities offered by the technological advancements in information and communication most frequently promoted by museums. Indeed, the study and conservation of our cultural heritage have seen significant advancement due to three-dimensional acquisition and modeling technology. A variety of laser scanning systems has been developed, based either on optical triangulation or on time-of-flight measurement, capable of producing digital 3D images of complex structures with high resolution and accuracy. It is necessary, however, to explore the challenges and opportunities that this practice brings to museums. The purpose of this paper is to understand what changes are introduced by digital techniques in those museums that are hosting digital masterpieces. The methodology investigates three distinguished Italian exhibitions related to the territory of Milan, analyzing the following issues in museum practice: 1) how digitizing art masterpieces increases the number of visitors; 2) what need calls for the digitization of artworks; 3) which techniques are most used; 4) what the setting is; 5) the consequences of not publishing hard copies of catalogues; 6) how these practices are envisioned in the future. Findings will show how interconnection plays an important role in rebuilding a collection spread all over the world; secondly, how digital artwork duplication and the extension of reality entail new forms of accessibility; thirdly, that collection and preservation through the digitization of images have both a social and an educational mission; and fourthly, that the convergence of the properties of different media (such as the web and radio) is key to encouraging people to get actively involved in digital exhibitions. The present analysis will suggest further research that should create museum models and interaction spaces that act as catalysts for innovation.

Keywords: digital masterpieces, education, interconnection, Italian museums, preservation

Procedia PDF Downloads 173
24475 Development of New Technology Evaluation Model by Using Patent Information and Customers' Review Data

Authors: Kisik Song, Kyuwoong Kim, Sungjoo Lee

Abstract:

Many global firms and corporations derive new technologies and opportunities by identifying vacant technologies through patent analysis. However, previous studies failed to focus on technologies that promised continuous growth in industrial fields. Most studies that derive new technology opportunities do not test their practical effectiveness. Since previous studies depended on expert judgment, it became costly and time-consuming to evaluate new technologies based on patent analysis. Therefore, this research suggests a quantitative and systematic approach to technology evaluation indicators by using patent data and data from customer communities. The first step involves collecting the two types of data. These data are used to construct evaluation indicators, and the indicators are then applied to the evaluation of new technologies. This type of data mining allows a new method of technology evaluation and a better prediction of how new technologies will be adopted.

Keywords: data mining, evaluating new technology, technology opportunity, patent analysis

Procedia PDF Downloads 369
24474 Anomaly Detection Based on System Log Data

Authors: M. Kamel, A. Hoayek, M. Batton-Hubert

Abstract:

With the increase of network virtualization and the disparity of vendors, the continuous monitoring and detection of anomalies cannot rely on static rules. An advanced analytical methodology is needed to discriminate between ordinary events and unusual anomalies. In this paper, we focus on log data (textual data), which is a crucial source of information on network performance. We then introduce an algorithm used as a pipeline to help with the pretreatment of such data, group it into patterns, and dynamically label each pattern as an anomaly or not. Such tools will provide users and experts with a continuous, real-time log monitoring capability to detect anomalies and failures in the underlying system that can affect performance. An application to real-world data illustrates the algorithm.
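
A minimal sketch of the pipeline idea, masking variable tokens into templates and flagging rare templates, is shown below; the log lines, regexes and threshold are invented and far simpler than the scoring the paper describes.

```python
# Group log lines into templates and flag rare templates as candidate anomalies.
import re
from collections import Counter

def template(line):
    """Crude pretreatment: replace IPs, hex ids and numbers with placeholders."""
    line = re.sub(r"\b\d{1,3}(\.\d{1,3}){3}\b", "<IP>", line)
    line = re.sub(r"0x[0-9a-fA-F]+", "<HEX>", line)
    return re.sub(r"\d+", "<NUM>", line)

logs = [
    "connection from 10.0.0.5 port 5432 established",
    "connection from 10.0.0.9 port 5433 established",
    "connection from 10.0.0.7 port 5431 established",
    "kernel panic at 0xdeadbeef on node 17",
]

patterns = [template(l) for l in logs]
counts = Counter(patterns)

threshold = 0.3                      # patterns rarer than 30% of lines are flagged
for line, pat in zip(logs, patterns):
    score = counts[pat] / len(logs)  # higher = more ordinary
    label = "ANOMALY" if score < threshold else "ok"
    print(f"{label:7s} {line}")
```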

Keywords: logs, anomaly detection, ML, scoring, NLP

Procedia PDF Downloads 91
24473 Innovations and Challenges: Multimodal Learning in Cybersecurity

Authors: Tarek Saadawi, Rosario Gennaro, Jonathan Akeley

Abstract:

There is rapidly growing demand for professionals to fill positions in Cybersecurity. This is recognized as a national priority both by government agencies and the private sector. Cybersecurity is a very wide technical area which encompasses all measures that can be taken in an electronic system to prevent criminal or unauthorized use of data and resources. This requires defending computers, servers, networks, and their users from any kind of malicious attacks. The need to address this challenge has been recognized globally but is particularly acute in the New York metropolitan area, home to some of the largest financial institutions in the world, which are prime targets of cyberattacks. In New York State alone, there are currently around 57,000 jobs in the Cybersecurity industry, with more than 23,000 unfilled positions. The Cybersecurity Program at City College is a collaboration between the Departments of Computer Science and Electrical Engineering. In Fall 2020, The City College of New York matriculated its first students in the Cybersecurity Master of Science program. The program was designed to fill gaps in the previous offerings and evolved out of an established partnership with Facebook on Cybersecurity Education. City College has designed a program where courses, curricula, syllabi, materials, labs, etc., are developed in cooperation and coordination with industry whenever possible, ensuring that students graduating from the program will have the necessary background to seamlessly segue into industry jobs. The Cybersecurity Program has created multiple pathways for prospective students to obtain the necessary prerequisites to apply in order to build a more diverse student population. The program can also be pursued on a part-time basis which makes it available to working professionals. Since City College's Cybersecurity M.S. program was established to equip students with the advanced technical skills needed to thrive in a high-demand, rapidly-evolving field, it incorporates a range of pedagogical formats. From its outset, the Cybersecurity program has sought to provide both the theoretical foundations necessary for meaningful work in the field along with labs and applied learning projects aligned with skillsets required by industry. The efforts have involved collaboration with outside organizations and with visiting professors designing new courses on topics such as Adversarial AI, Data Privacy, Secure Cloud Computing, and blockchain. Although the program was initially designed with a single asynchronous course in the curriculum with the rest of the classes designed to be offered in-person, the advent of the COVID-19 pandemic necessitated a move to fully online learning. The shift to online learning has provided lessons for future development by providing examples of some inherent advantages to the medium in addition to its drawbacks. This talk will address the structure of the newly-implemented Cybersecurity Master's Program and discuss the innovations, challenges, and possible future directions.

Keywords: cybersecurity, New York, City College, graduate degree, Master of Science

Procedia PDF Downloads 142
24472 EnumTree: An Enumerative Biclustering Algorithm for DNA Microarray Data

Authors: Haifa Ben Saber, Mourad Elloumi

Abstract:

In a number of domains, such as DNA microarray data analysis, we need to cluster the rows (genes) and columns (conditions) of a data matrix simultaneously in order to identify groups of rows that are constant over a group of columns. This kind of clustering is called biclustering. Biclustering algorithms are extensively used in DNA microarray data analysis, and more effective biclustering algorithms are highly desirable and needed. We introduce a new algorithm, called Enumerative Tree (EnumTree), for the biclustering of binary microarray data. EnumTree adopts the approach of enumerating biclusters and extracts all biclusters of consistently good quality. The main idea of EnumTree is the construction of a new tree structure to represent adequately the different biclusters discovered during the process of enumeration. This algorithm adopts the strategy of extracting all biclusters at a time. The performance of the proposed algorithm is assessed using both synthetic and real DNA microarray data; our algorithm outperforms other biclustering algorithms for binary microarray data and extracts biclusters with different numbers of rows. Moreover, we test the biological significance using a gene annotation web tool to show that our proposed method is able to produce biologically relevant biclusters.
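
To make the enumeration idea concrete, the naive sketch below lists all all-ones biclusters of a tiny binary matrix; real enumerative algorithms such as EnumTree prune this search with a tree structure, which this illustration deliberately omits.

```python
# Naive enumeration of all-ones biclusters in a small binary matrix.
from itertools import combinations
import numpy as np

M = np.array([[1, 1, 0, 1],
              [1, 1, 0, 0],
              [0, 1, 1, 1]])

min_rows, min_cols = 2, 2
biclusters = []
for k in range(min_cols, M.shape[1] + 1):
    for cols in combinations(range(M.shape[1]), k):
        # keep the rows that are all ones on this column subset
        rows = [r for r in range(M.shape[0]) if M[r, list(cols)].all()]
        if len(rows) >= min_rows:
            biclusters.append((tuple(rows), cols))

for rows, cols in biclusters:
    print("rows", rows, "cols", cols)
```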

Keywords: DNA microarray, biclustering, gene expression data, tree, data mining

Procedia PDF Downloads 369
24471 The Impact of Financial Reporting on Sustainability

Authors: Lynn Ruggieri

Abstract:

The worldwide pandemic has only increased sustainability awareness. The public is demanding that businesses be held accountable for their impact on the environment. While financial data enjoys uniformity in reporting requirements, there are no uniform reporting requirements for non-financial data. Europe is leading the way with some standards being implemented for reporting non-financial sustainability data; however, there is no uniformity globally. And without uniformity, there is not a clear understanding of what information to include and how to disclose it. Sustainability reporting will provide important information to stakeholders and will enable businesses to understand their impact on the environment. Therefore, there is a crucial need for this data. This paper looks at the history of sustainability reporting in the countries of the European Union and throughout the world and makes a case for worldwide reporting requirements for sustainability.

Keywords: financial reporting, non-financial data, sustainability, global financial reporting

Procedia PDF Downloads 174
24470 Investigating the Role of Artificial Intelligence in Developing Creativity in Architecture Education in Egypt: A Case Study of Design Studios

Authors: Ahmed Radwan, Ahmed Abdel Ghaney

Abstract:

This paper delves into the transformative potential of artificial intelligence (AI) in fostering creativity within the domain of architecture education, with a specific emphasis on its implications within design studios; the convergence of AI and architectural pedagogy has introduced avenues for redefining the boundaries of creative expression and problem-solving. By harnessing AI-driven tools, students and educators can collaboratively explore a spectrum of design possibilities, stimulate innovative ideation, and engage in multidimensional design processes. This paper investigates the ways in which AI contributes to architectural creativity by facilitating generative design, pattern recognition, virtual reality experiences, and sustainable design optimization. Furthermore, the study examines the balance between AI-enhanced creativity and the preservation of the core principles of architectural design and education, ensuring that technology is harnessed to augment rather than replace foundational design skills. Through an exploration of Egypt's architectural heritage and contemporary challenges, this research underscores how AI can synergize with cultural context and historical insights to inspire cutting-edge architectural solutions. By analyzing AI's impact on nurturing creativity among Egyptian architecture students, this paper seeks to contribute to the ongoing discourse on the integration of technology within global architectural education paradigms. It is hoped that this research will guide the thoughtful incorporation of AI in fostering creativity while preserving the authenticity and richness of architectural design education in Egypt and beyond.

Keywords: architecture, artificial intelligence, architecture education, Egypt

Procedia PDF Downloads 71
24469 Exploration into Bio Inspired Computing Based on Spintronic Energy Efficiency Principles and Neuromorphic Speed Pathways

Authors: Anirudh Lahiri

Abstract:

Neuromorphic computing, inspired by the intricate operations of biological neural networks, offers a revolutionary approach to overcoming the limitations of traditional computing architectures. This research proposes the integration of spintronics with neuromorphic systems, aiming to enhance computational performance, scalability, and energy efficiency. Traditional computing systems, based on the Von Neumann architecture, struggle with scalability and efficiency due to the segregation of memory and processing functions. In contrast, the human brain exemplifies high efficiency and adaptability, processing vast amounts of information with minimal energy consumption. This project explores the use of spintronics, which utilizes the electron's spin rather than its charge, to create more energy-efficient computing systems. Spintronic devices, such as magnetic tunnel junctions (MTJs) manipulated through spin-transfer torque (STT) and spin-orbit torque (SOT), offer a promising pathway to reducing power consumption and enhancing the speed of data processing. The integration of these devices within a neuromorphic framework aims to replicate the efficiency and adaptability of biological systems. The research is structured into three phases: an exhaustive literature review to build a theoretical foundation, laboratory experiments to test and optimize the theoretical models, and iterative refinements based on experimental results to finalize the system. The initial phase focuses on understanding the current state of neuromorphic and spintronic technologies. The second phase involves practical experimentation with spintronic devices and the development of neuromorphic systems that mimic synaptic plasticity and other biological processes. The final phase focuses on refining the systems based on feedback from the testing phase and preparing the findings for publication. The expected contributions of this research are twofold. Firstly, it aims to significantly reduce the energy consumption of computational systems while maintaining or increasing processing speed, addressing a critical need in the field of computing. Secondly, it seeks to enhance the learning capabilities of neuromorphic systems, allowing them to adapt more dynamically to changing environmental inputs, thus better mimicking the human brain's functionality. The integration of spintronics with neuromorphic computing could revolutionize how computational systems are designed, making them more efficient, faster, and more adaptable. This research aligns with the ongoing pursuit of energy-efficient and scalable computing solutions, marking a significant step forward in the field of computational technology.

Keywords: material science, biological engineering, mechanical engineering, neuromorphic computing, spintronics, energy efficiency, computational scalability, synaptic plasticity

Procedia PDF Downloads 33