Search results for: privacy preserving analytics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1087

727 A Lightweight Blockchain: Enhancing Internet of Things Driven Smart Buildings Scalability and Access Control Using Intelligent Direct Acyclic Graph Architecture and Smart Contracts

Authors: Syed Irfan Raza Naqvi, Zheng Jiangbin, Ahmad Moshin, Pervez Akhter

Abstract:

Currently, the IoT depends on a centralized client-server architecture that causes various scalability and privacy vulnerabilities. Distributed ledger technology (DLT) introduces a set of opportunities for the IoT, leading to practical ideas for components at all levels of existing architectures. Blockchain technology (BCT), as realized in systems such as Bitcoin (BTC) and Ethereum, appears to be one approach to solving several IoT problems and offers multiple possibilities. However, IoT nodes are resource-constrained devices with insufficient capacity to bear the computational overhead of blockchain consensus mechanisms; traditional BCT also presents the IoT with poor scalability, poor energy efficiency, and transaction fees. IOTA is a distributed ledger based on a Directed Acyclic Graph (DAG) that ensures machine-to-machine (M2M) micro-transactions are free of charge. IOTA has the potential to address existing IoT-related difficulties such as infrastructure scalability, privacy, and access control mechanisms. We propose SLDBI, a Scalable, Lightweight DAG-based Blockchain Design for Intelligent IoT Systems, which adapts the DAG-based Tangle and implements a lightweight message data model to address these IoT limitations. It enables the smooth integration of new IoT devices into a variety of applications. SLDBI enables comprehensive access control, energy efficiency, and scalability in IoT ecosystems by utilizing the Masked Authenticated Messaging (MAM) protocol and the IOTA Smart Contract Protocol (ISCP). Furthermore, we propose performing proof-of-work (PoW) computation on the full node in an energy-efficient way. Experiments have been carried out to show the capability of the Tangle to achieve better scalability while maintaining energy efficiency. The findings demonstrate user access control management at multiple granularity levels and show that the design scales up to massive networks with thousands of IoT nodes, such as Smart Connected Buildings (SCBs).
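
As an illustration of the DAG-ledger idea the design builds on, the toy sketch below shows transactions approving two previous "tips"; the uniform tip selection and the data model are simplifying assumptions for illustration, not the SLDBI design itself.

```python
import hashlib
import random

class Transaction:
    """A node in a toy DAG ledger: each new transaction approves two tips."""
    def __init__(self, payload, parents):
        self.payload = payload
        self.parents = parents  # hashes of the approved transactions
        self.hash = hashlib.sha256(
            (payload + "".join(parents)).encode()).hexdigest()

class Tangle:
    def __init__(self):
        genesis = Transaction("genesis", [])
        self.txs = {genesis.hash: genesis}
        self.tips = {genesis.hash}  # transactions not yet approved by anyone

    def attach(self, payload):
        # Uniform random tip selection -- real IOTA uses weighted random walks.
        parents = random.sample(sorted(self.tips), min(2, len(self.tips)))
        tx = Transaction(payload, parents)
        self.txs[tx.hash] = tx
        self.tips -= set(parents)   # parents are now approved
        self.tips.add(tx.hash)
        return tx

tangle = Tangle()
for i in range(5):
    tangle.attach(f"sensor-reading-{i}")
print(f"{len(tangle.txs)} transactions, {len(tangle.tips)} tips")
```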

Keywords: blockchain, IoT, directed acyclic graph, scalability, access control, architecture, smart contract, smart connected buildings

Procedia PDF Downloads 89
726 A Methodology of Using Fuzzy Logics and Data Analytics to Estimate the Life Cycle Indicators of Solar Photovoltaics

Authors: Thor Alexis Sazon, Alexander Guzman-Urbina, Yasuhiro Fukushima

Abstract:

This study outlines a method for developing a surrogate life cycle model based on fuzzy logic using three fuzzy inference methods: (1) the conventional Fuzzy Inference System (FIS), (2) the hybrid system of Data Analytics and Fuzzy Inference (DAFIS), which uses data clustering to define the membership functions, and (3) the Adaptive Neuro-Fuzzy Inference System (ANFIS), a combination of fuzzy inference and an artificial neural network. These methods were demonstrated with a case study in which the Global Warming Potential (GWP) and the Levelized Cost of Energy (LCOE) of solar photovoltaics (PV) were estimated using Solar Irradiation, Module Efficiency, and Performance Ratio as inputs. The study also illustrates how the choice of inference type (Sugeno or Mamdani) and the number of input membership functions affect the error between the calibration data and the model-generated outputs. The solution spaces of the three methods were then examined with a sensitivity analysis. ANFIS exhibited the lowest error, while DAFIS gave slightly lower errors than FIS. Increasing the number of input membership functions helped with error reduction in some cases but, at times, resulted in the opposite. Sugeno-type models gave errors slightly lower than those of the Mamdani type. While ANFIS is superior in terms of error minimization, it can generate questionable solutions, e.g., negative GWP values for the solar PV system when the inputs are all at the upper end of their ranges. This shows that the applicability of ANFIS models depends strongly on the range of cases on which they were calibrated. FIS and DAFIS generated more intuitive trends in the sensitivity runs. DAFIS exhibited an optimal design point beyond which increasing the input values no longer improves the GWP and LCOE. In the absence of data that could be used for calibration, conventional FIS offers a knowledge-based model that can be used for prediction. In the PV case study, conventional FIS generated errors only slightly higher than those of DAFIS. The inherent complexity of a life cycle study often hinders its widespread use in the industry and policy-making sectors. While the methodology does not guarantee results more accurate than those generated by the full life cycle methodology, it does provide a relatively simple way of generating knowledge- and data-based estimates that can be used during the initial design of a system.
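
To make the modelling approach concrete, the sketch below builds a minimal Mamdani-type FIS with the scikit-fuzzy control API; the variable ranges, three-term membership partitions, and the rule base are illustrative assumptions, not the calibrated models from the study.

```python
import numpy as np
from skfuzzy import control as ctrl  # pip install scikit-fuzzy

# Illustrative universes -- not the study's calibrated ranges.
irradiation = ctrl.Antecedent(np.arange(800, 2401, 1), 'irradiation')  # kWh/m2/yr
efficiency = ctrl.Antecedent(np.arange(10, 25, 0.1), 'efficiency')     # %
gwp = ctrl.Consequent(np.arange(20, 101, 1), 'gwp')                    # gCO2-eq/kWh

irradiation.automf(3)  # auto-generates 'poor' / 'average' / 'good' terms
efficiency.automf(3)
gwp.automf(3)          # here 'poor' means a low GWP value, 'good' a high one

rules = [
    ctrl.Rule(irradiation['good'] & efficiency['good'], gwp['poor']),   # low GWP
    ctrl.Rule(irradiation['average'], gwp['average']),
    ctrl.Rule(irradiation['poor'] | efficiency['poor'], gwp['good']),   # high GWP
]

sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input['irradiation'] = 1800
sim.input['efficiency'] = 18
sim.compute()
print(round(sim.output['gwp'], 1))   # defuzzified GWP estimate
```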

Keywords: solar photovoltaic, fuzzy logic, inference system, artificial neural networks

Procedia PDF Downloads 138
725 Data Quality as a Pillar of Data-Driven Organizations: Exploring the Benefits of Data Mesh

Authors: Marc Bachelet, Abhijit Kumar Chatterjee, José Manuel Avila

Abstract:

Data quality is a key component of any data-driven organization. Without data quality, organizations cannot effectively make data-driven decisions, which often leads to poor business performance. It is therefore important for an organization to ensure that the data it uses is of high quality. This is where the concept of data mesh comes in. Data mesh is a decentralized organizational and architectural approach to data management that can help organizations improve the quality of data. The concept of data mesh was first introduced in 2020. Its purpose is to decentralize data ownership, making it easier for domain experts to manage the data. This can help organizations improve data quality by reducing reliance on centralized data teams and allowing domain experts to take charge of their data. This paper discusses how a set of elements, including data mesh, are tools capable of increasing data quality. One of the key benefits of data mesh is improved metadata management. In a traditional data architecture, metadata management is typically centralized, which can lead to data silos and poor data quality. With data mesh, metadata is managed in a decentralized manner, ensuring accurate and up-to-date metadata and thereby improving data quality. Another benefit of data mesh is the clarification of roles and responsibilities. In a traditional data architecture, data teams are responsible for managing all aspects of data, which can lead to confusion and ambiguity in responsibilities. With data mesh, domain experts are responsible for managing their own data, which provides clarity in roles and responsibilities and improves data quality. Additionally, data mesh can contribute to a new form of organization that is more agile and adaptable. By decentralizing data ownership, organizations can respond more quickly to changes in their business environment, which in turn can improve overall performance by enabling better business insights through better reports and visualization tools. Monitoring and analytics are also important aspects of data quality. With data mesh, monitoring and analytics are decentralized, allowing domain experts to monitor and analyze their own data. This helps identify and address data quality problems quickly, leading to improved data quality. Data culture is another major aspect of data quality. With data mesh, domain experts are encouraged to take ownership of their data, which can help create a data-driven culture within the organization. This can lead to improved data quality and better business outcomes. Finally, the paper explores the contribution of AI in the coming years. AI can help enhance data quality by automating many data-related tasks, such as data cleaning and data validation. By integrating AI into data mesh, organizations can further enhance the quality of their data. The concepts above are illustrated by experience feedback from AEKIDEN, an international data-driven consultancy that has successfully implemented a data mesh approach. By sharing its experience, AEKIDEN can help other organizations understand the benefits and challenges of implementing data mesh and improving data quality.

Keywords: data culture, data-driven organization, data mesh, data quality for business success

Procedia PDF Downloads 105
724 Evaluating Global ‘Thing’ Security of Consumer Products

Authors: Achutha Raman

Abstract:

Today's brave new world features a bonanza of digitally interconnected products, or ‘things,’ that improve convenience, possibilities, and in some cases efficiency for consumers. Nonetheless, even as the market accelerates, this Internet of ‘things’ is subject to substantial leakage of consumer personal data. First defining the fluid concept of ‘things,’ this paper subsequently uses case studies taken from the EU, Asia, and the US to highlight large gaps and comprehensively evaluate the state of security for consumer ‘things.’ Ultimately, this paper offers several ways of improving the status quo, focusing especially on an evaluative approach that augments the standard mechanism of Firmware Over the Air (FOTA) updates and ought to be easy to implement.

Keywords: cybersecurity, FOTA, Internet of Things, transnational privacy

Procedia PDF Downloads 194
723 FPGA Implementation of the BB84 Protocol

Authors: Jaouadi Ikram, Machhout Mohsen

Abstract:

The development of a quantum key distribution (QKD) system on a field-programmable gate array (FPGA) platform is the subject of this paper. A quantum cryptographic protocol is designed based on the properties of quantum information and the characteristics of FPGAs. The proposed protocol performs key extraction, reconciliation, error correction, and privacy amplification tasks to generate a perfectly secret final key. We modeled the presence of an eavesdropper in the system, with a strategy that reveals some of the exchanged information without being noticed. Using an FPGA card with a 100 MHz clock frequency, we demonstrate the evolution of the error rate, as well as of the mutual information (between the two interlocutors, and with the eavesdropper), from one step of the key-generation process to the next.
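
The sifting logic at the heart of BB84 is easy to simulate in software; below is a sketch of the protocol logic under a noiseless, eavesdropper-free channel (the paper's implementation is in FPGA hardware, so this is only an illustration of the algorithm).

```python
import random

n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice('+x') for _ in range(n)]   # rectilinear or diagonal
bob_bases   = [random.choice('+x') for _ in range(n)]

# Bob's measurement: correct bit when bases match, random outcome otherwise.
bob_bits = [b if ab == bb else random.randint(0, 1)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: keep only positions where the bases agree (announced publicly).
sifted = [(a, b) for a, b, ab, bb
          in zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
key = [a for a, _ in sifted]
errors = sum(a != b for a, b in sifted)   # nonzero would indicate noise or a spy
print(f"sifted key length {len(key)}, errors {errors}")
```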

Keywords: QKD, BB84, protocol, cryptography, FPGA, key, security, communication

Procedia PDF Downloads 159
722 Tourism Management of the Heritage and Archaeological Sites in Egypt

Authors: Sabry A. El Azazy

Abstract:

Archaeological heritage sites are among the most important tourist attractions worldwide. Egypt has various archaeological sites and historical locations that are classified within the list of archaeological heritage destinations in the world, such as Cairo, Luxor, Aswan, Alexandria, and Sinai. This study focuses on how to manage archaeological sites and provide them with all services according to travelers' needs. Tourism management depends on strategic planning to support the national economy and sustainable development. Additionally, tourism management has to apply highly effective standards of security, promotion, advertisement, sales, and marketing while taking the preservation of monuments into consideration. In Egypt, the archaeological heritage sites must be well managed and protected, which would assist tourism management, especially in times of crisis. Recently, monumental places and archaeological heritage sites have been affected by unstable conditions and threatened. It is essential to focus on preserving this heritage, and more effort and cooperation between tourism organizations and the Ministry of Archaeology are needed to protect the archaeology and promote the tourism industry. Methodology: Qualitative methods were used as the overall approach to this study. Interviews and observations provided the researcher with the required in-depth insight into the research subject. The researcher was a lecturer in tourist guidance, which allowed visits to all historical sites in Egypt. Additionally, the researcher had the privilege of communicating with tourism specialists and attending meetings, conferences, and events focused on the research subject. Objectives: The main purpose of the research was to gain information in order to develop theoretical research on how to benefit from these historical sites both economically and culturally, and to pursue further research and scientific studies suited to the tourism and hospitality sector. The researcher draws on previous experience to present further studies in fields related to tourism and archaeological heritage. Pursuing this course of study enables the researcher to acquire the abilities and competencies necessary to achieve the set goal. Results: Professional tourism management focuses on making Egypt one of the most important destinations in the world and on providing heritage and archaeological sites with all the services that will place these locations on the international tourism map. Tourist interest in visiting Egypt and a flourishing tourism sector support and strengthen Egypt's national economy and local communities, while preserving its heritage and archaeology. Conclusions: Egypt has many tourism attractions in its heritage, archaeological sites, and touristic places. These places need more attention and effort to be included in tourism programs and opened to visitors from all over the world. Such efforts will encourage both local and international tourism to see this great civilization and will provide different touristic activities.

Keywords: archaeology, archaeological sites, heritage, ministry of archaeology, national economy, touristic attractions, tourism management, tourism organizations

Procedia PDF Downloads 123
721 Mining Big Data in Telecommunications Industry: Challenges, Techniques, and Revenue Opportunity

Authors: Hoda A. Abdel Hafez

Abstract:

Mining big data represents a big challenge nowadays. Much research is concerned with mining massive amounts of data and big data streams. Mining big data faces many challenges, including scalability, speed, heterogeneity, accuracy, provenance, and privacy. In the telecommunication industry, mining big data is like mining for gold: it represents a big opportunity for maximizing revenue streams. This paper discusses the characteristics of big data (volume, variety, velocity, and veracity), data mining techniques and tools for handling very large data sets, mining big data in telecommunications, and the benefits and opportunities gained from them.

Keywords: mining big data, big data, machine learning, telecommunication

Procedia PDF Downloads 374
720 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking

Authors: Trevor Toy, Josef Langerman

Abstract:

Around a quarter of the world’s data is generated by the financial sector, with an estimated 708.5 billion global non-cash transactions in recent years. With Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders to openly, legitimately, and consensually share the data required to enable it. Integration and sharing of anonymised transactional data still operate in silos, centralised among the large corporate entities in the ecosystem that have the resources to do so. Smaller fintechs generating data and businesses looking to consume data are largely excluded from the process. There is therefore a growing demand for accessible transactional data, both for analytical purposes and to support the rapid global adoption of Open Banking. This research provides a solution framework for a secure, decentralised marketplace in which (1) data providers list their transactional data, (2) data consumers find and access that data, and (3) data subjects (the individuals making the transactions that generate the data) manage and sell the data that relates to them. The platform also provides an integrated system for downstream transaction-related data from merchants, enriching the data product to build a comprehensive view of a data subject’s spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers. This core component is developed as a decentralised blockchain contract with a market layer that manages the transaction, user, pricing, payment, tagging, contract, control, and lineage features pertaining to user interactions on the platform. One of the platform’s key features is enabling individuals to participate in and manage the personal data that they generate. A proof-of-concept was developed on the Ethereum blockchain in which an individual can securely manage access to their own personal data and to their identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour correlated with key demographic information. This platform solution can ultimately support the growth, prosperity, and development of economies, businesses, communities, and individuals by providing accessible and relevant transactional data for big data analytics and open banking.
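
Below is a toy sketch of the three-role market mechanism described above; plain Python stands in for the on-chain contract, and the class names, flat-price rule, and consent check are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Listing:
    provider: str
    dataset_id: str
    price: float
    consents: set = field(default_factory=set)   # data subjects who opted in

class Marketplace:
    """Stand-in for the on-chain market layer: listing, consent, access."""
    def __init__(self):
        self.listings = {}

    def list_data(self, provider, dataset_id, price):
        self.listings[dataset_id] = Listing(provider, dataset_id, price)

    def grant_consent(self, dataset_id, subject_id):
        self.listings[dataset_id].consents.add(subject_id)

    def purchase(self, dataset_id, consumer, payment):
        lst = self.listings[dataset_id]
        if payment < lst.price:
            raise ValueError("insufficient payment")
        if not lst.consents:
            raise PermissionError("no data subjects have consented")
        # A real contract would also split payment between provider and subjects.
        return f"{consumer} granted access to {dataset_id}"

m = Marketplace()
m.list_data("BankA", "card-tx-2023", price=10.0)
m.grant_consent("card-tx-2023", "subject-42")
print(m.purchase("card-tx-2023", "fintech-x", payment=10.0))
```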

Keywords: big data markets, open banking, blockchain, personal data management

Procedia PDF Downloads 51
719 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach

Authors: Jerry Q. Cheng

Abstract:

Analyzing large-scale recurrent event data currently poses many challenges, such as memory limitations and unscalable computing time. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data is randomly divided into many subsets, and the maximum likelihood estimator is obtained from each individual data set. A weighted method is then proposed to combine these individual estimators into the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of the proposed method. The approach is applied to a large real dataset of repeated heart failure hospitalizations.
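
The combine step can be illustrated with inverse-variance weighting of subset estimates; in the sketch below, a simple exponential event-rate MLE stands in for the paper's parametric frailty models.

```python
import numpy as np

rng = np.random.default_rng(0)
gaps = rng.exponential(scale=2.0, size=1_000_000)  # inter-event times, true rate 0.5

def rate_mle(x):
    """MLE of an exponential rate and its asymptotic variance lambda^2 / n."""
    lam = 1.0 / x.mean()
    return lam, lam**2 / len(x)

# Divide: fit on subsets; conquer: inverse-variance weighted average.
estimates = [rate_mle(chunk) for chunk in np.array_split(gaps, 20)]
lams = np.array([e[0] for e in estimates])
weights = 1.0 / np.array([e[1] for e in estimates])
combined = (weights * lams).sum() / weights.sum()

print(f"combined {combined:.4f} vs full-data {rate_mle(gaps)[0]:.4f}")
```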

Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing

Procedia PDF Downloads 138
718 Inferring Cognitive Skill in Concept Space

Authors: Rania A. Aboalela, Javed I. Khan

Abstract:

This research presents a learning assessment theory of Cognitive Skill in Concept Space (CS2) to measure assessed knowledge in terms of the cognitive skill levels of concepts. The cognitive skill levels refer to whether a student has acquired a state at the level of understanding, applying, analyzing, etc. The theory comprises three constructions: a graph paradigm of a semantic/ontological scheme, the concept states of the theory, and the assessment analytics, i.e., the process of estimating the sets of concept states at a certain skill level. A concept state indicates whether a student has already learned, is ready to learn, or is not ready to learn a certain skill level. An experiment is conducted to validate the CS2 theory.
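
The concept-state construction lends itself to a small graph computation; below is a sketch in which a concept is "ready to learn" once all of its prerequisites are learned (the prerequisite graph and concepts are invented for illustration).

```python
# Prerequisite DAG: concept -> concepts that must be learned first.
prereqs = {
    "variables": [],
    "loops": ["variables"],
    "functions": ["variables"],
    "recursion": ["functions", "loops"],
}
learned = {"variables", "loops"}   # states verified by earlier assessment

def concept_state(concept):
    if concept in learned:
        return "learned"
    if all(p in learned for p in prereqs[concept]):
        return "ready to learn"
    return "not ready to learn"

for c in prereqs:
    print(f"{c}: {concept_state(c)}")
# "recursion" is not ready to learn because "functions" is not yet learned.
```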

Keywords: cognitive skill levels, concept states, concept space, knowledge assessment theory

Procedia PDF Downloads 294
717 A Case Study on the Impact of Technology Readiness in a Department of Clinical Nurses

Authors: Julie Delany

Abstract:

To thrive in today’s digital climate, it is vital that organisations adopt new technology and prepare for rising digital trends. This proves more difficult in government, where, traditionally, people lack change readiness. While individuals may have a desire to work smarter, this does not necessarily mean embracing technology. This paper discusses the rollout of an application into a small department of highly experienced nurses. The goal was both to streamline the department's workflow and to provide a platform for gathering essential business metrics. The biggest challenges were adoption and motivating the nurses to change their routines and learn new computer skills. Two-thirds struggled with the change, and as a result, some jeopardised the validity of the business metrics. The paper concludes with lessons learned and recommendations for similar projects.

Keywords: change ready, information technology, end-user, iterative method, rollout plan, data analytics

Procedia PDF Downloads 114
716 Sharing Personal Information for Connection: The Effect of Social Exclusion on Consumer Self-Disclosure to Brands

Authors: Jiyoung Lee, Andrew D. Gershoff, Jerry Jisang Han

Abstract:

Most extant research on consumer privacy concerns and their willingness to share personal data has focused on contextual factors (e.g., types of information collected, type of compensation) that lead to consumers’ personal information disclosure. Unfortunately, the literature lacks a clear understanding of how consumers’ incidental psychological needs may influence consumers’ decisions to share their personal information with companies or brands. In this research, we investigate how social exclusion, which is an increasing societal problem, especially since the onset of the COVID-19 pandemic, leads to increased information disclosure intentions for consumers. Specifically, we propose and find that when consumers become socially excluded, their desire for social connection increases, and this desire leads to a greater willingness to disclose their personal information with firms. The motivation to form and maintain interpersonal relationships is one of the most fundamental human needs, and many researchers have found that deprivation of belongingness has negative consequences. Given the negative effects of social exclusion and the universal need to affiliate with others, people respond to exclusion with a motivation for social reconnection, resulting in various cognitive and behavioral consequences, such as paying greater attention to social cues and conforming to others. Here, we propose personal information disclosure as another form of behavior that can satisfy such social connection needs. As self-disclosure can serve as a strategic tool in creating and developing social relationships, those who have been socially excluded and thus have greater social connection desires may be more willing to engage in self-disclosure behavior to satisfy such needs. We conducted four experiments to test how feelings of social exclusion can influence the extent to which consumers share their personal information with brands. Various manipulations and measures were used to demonstrate the robustness of our effects. Through the four studies, we confirmed that (1) consumers who have been socially excluded show greater willingness to share their personal information with brands and that (2) such an effect is driven by the excluded individuals’ desire for social connection. Our findings shed light on how the desire for social connection arising from exclusion influences consumers’ decisions to disclose their personal information to brands. We contribute to the consumer disclosure literature by uncovering a psychological need that influences consumers’ disclosure behavior. We also extend the social exclusion literature by demonstrating that exclusion influences not only consumers’ choice of products but also their decision to disclose personal information to brands.

Keywords: consumer-brand relationship, consumer information disclosure, consumer privacy, social exclusion

Procedia PDF Downloads 79
715 Real Time Multi Person Action Recognition Using Pose Estimates

Authors: Aishrith Rao

Abstract:

Human activity recognition is an important aspect of video analytics, and many approaches have been proposed to enable action recognition. In this approach, the model identifies the actions of the multiple people in the frame and classifies them accordingly. A few approaches use RNNs and 3D CNNs, which are computationally expensive and cannot be trained with the small datasets that are currently available. Multi-person action recognition is performed in order to understand the positions and actions of the people present in the video frame. The size of the video frame can be adjusted as a hyper-parameter, depending on the hardware resources available. OpenPose is used to calculate pose estimates, using a CNN to produce heat maps, one of which provides skeleton features, i.e., joint features. These features are then extracted, and a classification algorithm can be applied to classify the action.
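
The classification stage can be as simple as a classifier over flattened joint coordinates; the sketch below assumes OpenPose-style keypoints have already been extracted (the 18-joint layout, the random stand-in data, and the scikit-learn classifier are illustrative assumptions).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

N_JOINTS = 18   # OpenPose COCO layout; (x, y) per joint

# Stand-in training data: per-person keypoint vectors with action labels.
rng = np.random.default_rng(1)
X_train = rng.random((200, N_JOINTS * 2))        # flattened (x, y) skeletons
y_train = rng.choice(["stand", "walk", "wave"], size=200)

clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)

def classify_frame(skeletons):
    """skeletons: one (18, 2) keypoint array per person detected in the frame."""
    feats = np.stack([s.reshape(-1) for s in skeletons])
    return clf.predict(feats)    # one action label per person

frame_people = [rng.random((N_JOINTS, 2)) for _ in range(3)]
print(classify_frame(frame_people))
```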

Keywords: human activity recognition, computer vision, pose estimates, convolutional neural networks

Procedia PDF Downloads 113
714 RFID and Intelligence: A Smart Authentication Method for Blind People​

Authors: V. Vishu, R. Manimegalai

Abstract:

This work combines intelligence and radio frequency identification to provide an enhanced authentication method for visually challenged people. The main goal is to provide improved authentication by combining the Advanced Encryption Standard (AES) algorithm with intelligence: the encryption key is generated as a combination of intelligent information from sensors and tag values. The main challenges are security, privacy, and cost. The method also evaluates the amount of interaction between sensors and its influence on visually challenged people’s mental and physical states. The proposal is to apply these ideas to support independent living and assist users toward a good quality of life.
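
The key-generation idea can be sketched by hashing the tag value together with live sensor context into an AES key; below is a minimal illustration using PyCryptodome, with the sensor fields and fusion rule invented here.

```python
import hashlib
from Crypto.Cipher import AES   # pip install pycryptodome

def derive_key(tag_uid: bytes, sensor_reading: bytes) -> bytes:
    """Fuse an RFID tag value with live sensor context into a 128-bit AES key."""
    return hashlib.sha256(tag_uid + sensor_reading).digest()[:16]

tag_uid = bytes.fromhex("04a23bb2c41180")     # illustrative tag UID
sensor = b"temp=21.5;accel=0.02"              # illustrative sensor snapshot
key = derive_key(tag_uid, sensor)

cipher = AES.new(key, AES.MODE_GCM)
ciphertext, mac = cipher.encrypt_and_digest(b"unlock-door-3")

# A verifier with the same tag + sensor context recomputes the same key.
verifier = AES.new(derive_key(tag_uid, sensor), AES.MODE_GCM, nonce=cipher.nonce)
print(verifier.decrypt_and_verify(ciphertext, mac))   # b'unlock-door-3'
```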

Keywords: AES, encryption, intelligence, smart key

Procedia PDF Downloads 221
713 A Text Classification Approach Based on Natural Language Processing and Machine Learning Techniques

Authors: Rim Messaoudi, Nogaye-Gueye Gning, François Azelart

Abstract:

Automatic text classification applies mostly natural language processing (NLP) and other AI-guided techniques to classify text automatically in a faster and more accurate manner. This paper discusses the use of predictive maintenance to manage incident tickets within the company. It proposes a tool that processes and analyses the comments and notes written by administrators after resolving an incident ticket, with the goal of increasing the quality of these comments. The tool is based on NLP and machine learning techniques to perform textual analytics on the extracted data. The approach was tested using real data from the French National Railways (SNCF) and yielded high-quality results.
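
Below is a minimal sketch of such a comment classifier with scikit-learn; the labels and training snippets are invented stand-ins for the SNCF data, and the TF-IDF/logistic-regression pipeline is one plausible realization rather than the authors' exact model.

```python
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Invented stand-ins for resolution comments and their quality labels.
comments = [
    "restarted service, root cause logged with ticket reference",
    "fixed",
    "replaced faulty relay, verified on site, closure confirmed by client",
    "done see other ticket",
]
labels = ["good", "poor", "good", "poor"]

model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),   # word and bigram features
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(comments, labels)

print(model.predict(["checked logs, root cause documented, client informed"]))
```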

Keywords: machine learning, text classification, NLP techniques, semantic representation

Procedia PDF Downloads 67
712 Rule Insertion Technique for Dynamic Cell Structure Neural Network

Authors: Osama Elsarrar, Marjorie Darrah, Richard Devin

Abstract:

This paper discusses the idea of capturing an expert’s knowledge in the form of human understandable rules and then inserting these rules into a dynamic cell structure (DCS) neural network. The DCS is a form of self-organizing map that can be used for many purposes, including classification and prediction. This particular neural network is considered to be a topology preserving network that starts with no pre-structure, but assumes a structure once trained. The DCS has been used in mission and safety-critical applications, including adaptive flight control and health-monitoring in aerial vehicles. The approach is to insert expert knowledge into the DCS before training. Rules are translated into a pre-structure and then training data are presented. This idea has been demonstrated using the well-known Iris data set and it has been shown that inserting the pre-structure results in better accuracy with the same training.
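
The insertion idea can be imitated outside a DCS: the sketch below pre-seeds prototype nodes from rule-like class descriptions before a simplified SOM-style update on the Iris data. The plain nearest-prototype map deliberately replaces the actual DCS mechanics, and the rule values are illustrative.

```python
import numpy as np
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

# "Expert rules" expressed as typical feature values per class -- these seed
# the pre-structure instead of starting from a structureless network.
protos = np.array([
    [5.0, 3.4, 1.5, 0.2],   # IF petals short and narrow THEN setosa
    [5.9, 2.8, 4.3, 1.3],   # IF petals mid-length THEN versicolor
    [6.6, 3.0, 5.5, 2.0],   # IF petals long and wide THEN virginica
])

# One epoch of a best-matching-unit update (much simplified vs. a real DCS).
for x in X:
    bmu = np.argmin(((protos - x) ** 2).sum(axis=1))
    protos[bmu] += 0.05 * (x - protos[bmu])

pred = np.argmin(((X[:, None, :] - protos[None]) ** 2).sum(axis=2), axis=1)
print(f"accuracy with rule-seeded prototypes: {(pred == y).mean():.2f}")
```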

Keywords: neural network, self-organizing map, rule extraction, rule insertion

Procedia PDF Downloads 145
711 Resource Framework Descriptors for Interestingness in Data

Authors: C. B. Abhilash, Kavi Mahesh

Abstract:

Human beings are the most advanced species on earth, largely because of the ability to communicate and share information via human language. In today's world, a huge amount of data is available on the web in text format, which has also resulted in the generation of big data in structured and unstructured formats. In general, this data is textual and highly unstructured. To get insights and actionable content from it, we need to incorporate the concepts of text mining and natural language processing. Our study focuses on interesting data, from which interesting facts are generated for the knowledge base. The approach is to derive the analytics from the text via the application of natural language processing. Using the Semantic Web's Resource Description Framework (RDF), we generate triples from the given data and derive the interesting patterns. The methodology also illustrates data integration using RDF for reliable, interesting patterns.
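
Triple generation and pattern queries are concise with rdflib; below is a sketch with an invented namespace and facts, where a simple SPARQL filter stands in for an interestingness measure.

```python
from rdflib import Graph, Literal, Namespace, RDF   # pip install rdflib

EX = Namespace("http://example.org/")   # illustrative namespace
g = Graph()

# Triples derived from text, e.g. "Bangalore hosts 40% of India's IT firms."
g.add((EX.Bangalore, RDF.type, EX.City))
g.add((EX.Bangalore, EX.hosts, EX.ITFirms))
g.add((EX.Bangalore, EX.share, Literal(0.40)))

# Pattern query to surface candidate interesting facts.
q = """
SELECT ?city ?share WHERE {
    ?city a <http://example.org/City> ;
          <http://example.org/share> ?share .
    FILTER (?share > 0.25)
}
"""
for city, share in g.query(q):
    print(f"interesting: {city} share={share}")
```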

Keywords: RDF, interestingness, knowledge base, semantic data

Procedia PDF Downloads 130
710 Model Order Reduction Using Hybrid Genetic Algorithm and Simulated Annealing

Authors: Khaled Salah

Abstract:

Model order reduction has been one of the most challenging topics of the past years. In this paper, a hybrid of the genetic algorithm (GA) and the simulated annealing algorithm (SA) is used to approximate high-order transfer functions (TFs) by lower-order TFs. The hybrid algorithm is applied to model order reduction with attention to improving accuracy and preserving the properties of the original model, two important issues for improving the performance of simulation and computation and for maintaining the behavior of the original complex models being reduced. Compared to conventional mathematical methods that have been used to obtain reduced-order models of high-order complex models, the proposed method provides better results in terms of run-time. The proposed technique could therefore be used in electronic design automation (EDA) tools.
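
Below is a compact sketch of the hybrid idea: a GA population searches reduced-order transfer-function coefficients while an SA-style acceptance test occasionally keeps worse offspring; the target system, orders, and operator settings are illustrative assumptions.

```python
import numpy as np
from scipy import signal

# Illustrative 4th-order original system: 1 / ((s+1)(s+2)(s+3)(s+4)).
b_full, a_full = [1.0], [1.0, 10.0, 35.0, 50.0, 24.0]
w = np.logspace(-1, 2, 200)
_, h_full = signal.freqs(b_full, a_full, worN=w)

def error(params):
    """Frequency-response mismatch of a 2nd-order candidate b0/(s^2+a1*s+a0)."""
    b0, a1, a0 = params
    _, h = signal.freqs([b0], [1.0, a1, a0], worN=w)
    return np.abs(h - h_full).mean()

rng = np.random.default_rng(0)
pop = rng.uniform(0.01, 20.0, size=(30, 3))          # GA population
temp = 1.0
for _ in range(150):
    fit = np.array([error(p) for p in pop])
    parents = pop[np.argsort(fit)[:10]]               # GA selection
    mates = parents[rng.integers(0, 10, (2, 30))]
    children = mates.mean(axis=0) + rng.normal(0, 0.1, (30, 3))  # crossover + mutation
    child_fit = np.array([error(c) for c in children])
    # SA-style acceptance: sometimes keep a worse child, per exp(-dE/T).
    accept = (child_fit < fit) | (rng.random(30) < np.exp(-(child_fit - fit) / temp))
    pop = np.where(accept[:, None], children, pop)
    temp *= 0.97                                      # cooling schedule

best = pop[np.argmin([error(p) for p in pop])]
print("reduced-order [b0, a1, a0]:", np.round(best, 3))
```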

Keywords: genetic algorithm, simulated annealing, model reduction, transfer function

Procedia PDF Downloads 125
709 Independent Encryption Technique for Mobile Voice Calls

Authors: Nael Hirzalla

Abstract:

The legality of some countries' or agencies' acts of spying on the public's personal phone calls has become a hot topic in many social groups' discussions. Such an act is widely considered an invasion of privacy. It may be justified when singling out specific cases, but spying without limits is unacceptable. This paper discusses the need for a technique to secure mobile voice calls that is not only simple and lightweight but also independent of any encryption standard or library. It then presents and tests an encryption algorithm based on frequency scrambling, showing a fair and delay-free process that can be used to protect phone calls from such spying.
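
Below is a minimal sketch of keyed frequency scrambling on one audio frame: FFT the frame, permute the interior bins with a key-seeded permutation, and inverse-FFT. The frame size and key handling are illustrative, not the paper's exact algorithm.

```python
import numpy as np

FRAME = 512   # samples per frame (illustrative)

def scramble(frame, key, inverse=False):
    """Permute FFT bins with a key-seeded permutation; same key descrambles."""
    spec = np.fft.rfft(frame)
    # Permute only the interior bins; DC and Nyquist stay put so the
    # spectrum remains a valid real-signal spectrum.
    idx = np.arange(1, len(spec) - 1)
    perm = np.random.default_rng(key).permutation(idx)
    out = spec.copy()
    if inverse:
        out[perm] = spec[idx]   # undo the permutation
    else:
        out[idx] = spec[perm]
    return np.fft.irfft(out, n=len(frame))

rng = np.random.default_rng(7)
voice = rng.standard_normal(FRAME)          # stand-in for a voice frame
sent = scramble(voice, key=1234)
recovered = scramble(sent, key=1234, inverse=True)
print(np.allclose(voice, recovered))        # True: lossless per frame
```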

Keywords: frequency scrambling, mobile applications, real-time voice encryption, spying on calls

Procedia PDF Downloads 447
708 Artificial Intelligence Ethics: What Business Leaders Need to Consider for the Future

Authors: Kylie Leonard

Abstract:

Investment in artificial intelligence (AI) can be an attractive opportunity for business leaders, as there are many easy-to-see benefits, including improved task completion rates, lower overall costs, and better forecasting. Business leaders are often unaware of the challenges that can accompany AI, such as data center costs, access to data, employee acceptance, and privacy concerns. In addition to the benefits and challenges of AI, it is important to practice AI ethics to ensure the safe creation of AI. AI ethics includes aspects of algorithmic bias, limits to transparency, and surveillance. To be a good business leader, it is critical to address all the considerations involving the challenges of AI and AI ethics.

Keywords: artificial intelligence, artificial intelligence ethics, business leaders, business concerns

Procedia PDF Downloads 119
707 Open Consent and Artificial Intelligence for Health Research in South Africa

Authors: Amy Gooden

Abstract:

Various modes of consent have been utilized in health research, but open consent has not been explored in South Africa’s AI research context. Open consent entails the sharing of data without assurances of privacy and may be seen as an attempt to marry open science with informed consent. Because all potential uses of data are unknown, it has been questioned whether consent can be informed. Instead of trying to adapt existing modes of consent, why not adopt a new perspective? This is what open consent proposes and what this research will explore in AI health research in South Africa.

Keywords: artificial intelligence, consent, health, law, research, South Africa

Procedia PDF Downloads 129
706 A Novel Unconditionally Secure and Lightweight Bipartite Key Agreement Protocol

Authors: Jun Liu

Abstract:

This paper introduces a new bipartite key agreement (2PKA) protocol that provides unconditional security and is lightweight. The unconditional security stems from the known impossibility of distinguishing a particular solution from all possible solutions of an underdetermined system of equations. This indistinguishability prevents an adversary from inferring the common secret key even with access to an unlimited amount of computing capability. The new 2PKA protocol is also lightweight because the calculation of a common secret key makes use only of simple modular arithmetic. This information-theoretic 2PKA scheme provides the desired features of Key Confirmation (KC), Session Key (SK) security, Known-Key (KK) security, protection of individual privacy, and a uniformly distributed value of the common key under a prime modulus.
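
The security intuition can be demonstrated with a toy underdetermined modular system: many candidate secrets remain consistent with everything the adversary observes. The parameters below are illustrative, and this is not the paper's protocol.

```python
import numpy as np
from itertools import product

p = 7                                   # small prime modulus (illustrative)
rng = np.random.default_rng(3)
A = rng.integers(0, p, size=(2, 4))     # public: 2 equations, 4 unknowns
x = rng.integers(0, p, size=4)          # secret vector
b = A @ x % p                           # what the adversary observes

# Brute-force every candidate secret: many remain consistent with (A, b).
consistent = [v for v in product(range(p), repeat=4)
              if np.array_equal(A @ np.array(v) % p, b)]
print(f"{len(consistent)} secrets consistent with the observation")
# With 2 independent equations over Z_7 in 4 unknowns, ~7^2 = 49 candidates
# remain, so the true secret is information-theoretically indistinguishable.
```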

Keywords: bipartite key agreement, information-theoretic cryptography, perfect security, lightweight

Procedia PDF Downloads 40
705 GPU-Accelerated Triangle Mesh Simplification Using Parallel Vertex Removal

Authors: Thomas Odaker, Dieter Kranzlmueller, Jens Volkert

Abstract:

We present an approach to triangle mesh simplification designed to be executed on the GPU. We use a quadric error metric to calculate an error value for each vertex of the mesh and order all vertices based on this value. This step is followed by the parallel removal of a number of vertices with the lowest calculated error values. To allow for the parallel removal of multiple vertices we use a set of per-vertex boundaries that prevent mesh foldovers even when simplification operations are performed on neighbouring vertices. We execute multiple iterations of the calculation of the vertex errors, ordering of the error values and removal of vertices until either a desired number of vertices remains in the mesh or a minimum error value is reached. This parallel approach is used to speed up the simplification process while maintaining mesh topology and avoiding foldovers at every step of the simplification.
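
The per-vertex error and batch-selection steps can be sketched on the CPU: quadrics accumulated from incident face planes, then one batch of lowest-error vertices chosen per iteration. The random test mesh is illustrative; boundary constraints and GPU details are omitted.

```python
import numpy as np

def face_quadrics(verts, faces):
    """One 4x4 quadric per face, from its plane [a,b,c,d] with ax+by+cz+d=0."""
    v0, v1, v2 = (verts[faces[:, i]] for i in range(3))
    n = np.cross(v1 - v0, v2 - v0)
    n /= np.linalg.norm(n, axis=1, keepdims=True)
    d = -(n * v0).sum(axis=1, keepdims=True)
    plane = np.hstack([n, d])                       # (F, 4)
    return plane[:, :, None] * plane[:, None, :]    # (F, 4, 4) outer products

def vertex_errors(verts, faces):
    """Sum each vertex's incident face quadrics, then evaluate v^T Q v."""
    Q = np.zeros((len(verts), 4, 4))
    fq = face_quadrics(verts, faces)
    for i in range(3):
        np.add.at(Q, faces[:, i], fq)
    h = np.hstack([verts, np.ones((len(verts), 1))])   # homogeneous coords
    return np.einsum('vi,vij,vj->v', h, Q, h)

# Per iteration: pick a batch of the lowest-error vertices for parallel removal.
rng = np.random.default_rng(0)
verts = rng.random((100, 3))
faces = np.array([rng.choice(100, 3, replace=False) for _ in range(150)])
batch = np.argsort(vertex_errors(verts, faces))[:10]
print("vertices selected for parallel removal:", batch)
```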

Keywords: computer graphics, half edge collapse, mesh simplification, precomputed simplification, topology preserving

Procedia PDF Downloads 339
704 Intellectual Property Protection of CRISPR Related Technologies

Authors: Zheng Miao, Dennis Fernandez

Abstract:

CRISPR research has the potential to completely transform life science, agriculture, livestock, and the health care industry. The intellectual property derived from this research has attracted significant attention in academia as well as in the biopharmaceutical industry, culminating in an urgent need for strategic IP protection. We review the rudimentary concepts and key competitors of CRISPR technologies as well as the paramount strategies for intellectual property protection. Further, we elaborate on prosecution issues related to CRISPR patents and on possible approaches to various patent laws, interferences, and litigation. Finally, we address how the bioinformatics of CRISPR technology invites inquiry into issues of privacy and a host of ethical concerns.

Keywords: bioinformatics, CRISPR, biotechnology, intellectual property

Procedia PDF Downloads 231
703 An Approach for Determining and Reducing Vehicle Turnaround Time for Outbound Logistics by Using Critical Path Method

Authors: Prajakta M. Wazat, D. N. Raut

Abstract:

The study concerns a fast-moving consumer goods (FMCG) beverage company in which the portion of the supply chain that deals with outbound logistics is taken up for improvement, in order to reduce logistics cost using the critical path method (CPM). Logistics is a major portion of the supply chain for which customers are not willing to pay, as it adds cost to the product without adding value. It is necessary to ensure that products are delivered to clients at the right time while preserving high-quality standards from the beginning to the end of the supply chain. CPM is a logical sequencing method wherein the most efficient route is achieved by arranging the series of events. CPM makes it possible to identify the critical activities, minimizing delays and interruptions by providing a feasible solution.
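
The CPM computation itself is a longest-path pass over the activity network; below is a sketch with invented outbound-logistics activities and durations.

```python
from graphlib import TopologicalSorter   # Python 3.9+

# Invented activities: duration (hours) and predecessors.
activities = {
    "load_order":     (2, []),
    "quality_check":  (1, ["load_order"]),
    "route_planning": (1, []),
    "dispatch":       (1, ["quality_check", "route_planning"]),
    "delivery":       (6, ["dispatch"]),
    "pod_collection": (1, ["delivery"]),   # proof of delivery
}

order = TopologicalSorter(
    {k: set(v[1]) for k, v in activities.items()}).static_order()

earliest_finish, critical_pred = {}, {}
for act in order:
    dur, preds = activities[act]
    start = max((earliest_finish[p] for p in preds), default=0)
    earliest_finish[act] = start + dur
    critical_pred[act] = max(preds, key=earliest_finish.get) if preds else None

# Walk back from the last-finishing activity to list the critical path.
end = max(earliest_finish, key=earliest_finish.get)
path, node = [], end
while node:
    path.append(node)
    node = critical_pred[node]
print("critical path:", " -> ".join(reversed(path)),
      f"({earliest_finish[end]} h turnaround)")
```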

Keywords: FMCG, supply chain, outbound logistics, vehicle turnaround time, critical path method, cost reduction

Procedia PDF Downloads 141
702 Application of Deep Learning Algorithms in Agriculture: Early Detection of Crop Diseases

Authors: Manaranjan Pradhan, Shailaja Grover, U. Dinesh Kumar

Abstract:

The farming community in India, as in other parts of the world, is one of the most stressed communities, owing to increasing input costs (seeds, fertilizers, pesticides), droughts, and reduced revenue, which have led to farmer suicides. The lack of an integrated farm advisory system in India adds to farmers' problems. Farmers need the right information during the early stages of a crop's lifecycle to prevent damage and loss in revenue. In this paper, we use deep learning techniques to develop an early warning system for the detection of crop diseases, using images taken by farmers with their smartphones. The work leads to a smart assistant, built on analytics and big data, that can help farmers with early diagnosis of crop diseases and corrective actions. The classical approach to crop disease management has been to identify diseases at the crop level. Recently, ImageNet classification using convolutional neural networks (CNNs) has been used successfully to identify diseases at the individual plant level. Our model uses convolution filters, max pooling, dense layers, and dropout (to avoid overfitting). Models are built for binary classification (healthy or not healthy) and multi-class classification (identifying which disease). Transfer learning is used to adapt the weights learnt on the ImageNet dataset to crop diseases, which reduces the number of training epochs. One-shot learning is used to learn from very few images, while data augmentation techniques such as rotation, zoom, shift, and blurring improve accuracy on images taken from farms. Models built using a combination of these techniques are more robust for real-world deployment. Our model is validated on the tomato crop, which in India is affected by ten different diseases, and achieves an accuracy of more than 95% in correctly classifying the diseases. The main contribution of our research is a personal assistant for farmers for managing plant disease; although validated on tomato, it can easily be extended to other crops. Advances in computing and the availability of large data have made deep learning successful in computer vision, natural language processing, image recognition, etc. With these robust models and high smartphone penetration, implementation is feasible, resulting in timely advice to farmers, increasing farmers' income, and reducing input costs.
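
Below is a condensed sketch of the described pipeline: a pre-trained ImageNet backbone with frozen weights, augmentation layers, dropout, and a softmax head over the disease classes. The image size, backbone choice, and dataset objects are illustrative assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 10   # e.g., the ten tomato disease categories

augment = tf.keras.Sequential([            # data augmentation, as described
    layers.RandomRotation(0.1),
    layers.RandomZoom(0.1),
    layers.RandomTranslation(0.1, 0.1),
])

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False                     # transfer learning: reuse ImageNet weights

model = tf.keras.Sequential([
    tf.keras.Input(shape=(224, 224, 3)),
    augment,
    layers.Rescaling(1.0 / 127.5, offset=-1),   # MobileNetV2 expects [-1, 1]
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),                        # regularisation against overfitting
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # datasets assumed
```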

Keywords: analytics in agriculture, CNN, crop disease detection, data augmentation, image recognition, one shot learning, transfer learning

Procedia PDF Downloads 100
701 Cost-Effective, Accuracy Preserving Scalar Characterization for mmWave Transceivers

Authors: Mohammad Salah Abdullatif, Salam Hajjar, Paul Khanna

Abstract:

The development of instrument-grade mmWave transceivers comes with many challenges. A general rule of thumb is that the performance of the instrument must exceed the performance of the unit under test in terms of accuracy and stability. The calibration and characterization of mmWave transceivers are important pillars of testing commercial products. Using a vector network analyzer (VNA) with a mixer option has proven to be a high-performance approach to calibrating mmWave transceivers; however, it comes at a high cost. In this work, a reduced-cost method to calibrate mmWave transceivers is proposed and compared with the VNA approach. Significant challenges are discussed, and an approach to meet the requirements is proposed.

Keywords: mmWave transceiver, scalar characterization, coupler connection, magic tee connection, calibration, VNA, vector network analyzer

Procedia PDF Downloads 86
700 A Modular Framework for Enabling Analysis for Educators with Different Levels of Data Mining Skills

Authors: Kyle De Freitas, Margaret Bernard

Abstract:

Enabling data mining analysis among a wider audience of educators is an active area of research within the educational data mining (EDM) community. This paper proposes a framework for developing an environment that caters both for educators who have little technical data mining skill and for more advanced users with some data mining expertise. The framework architecture was developed through a review of the strengths and weaknesses of existing models in the literature. The proposed framework provides a modular architecture that lets future researchers focus on the development of specific areas within the EDM process. Finally, the paper highlights a strategy of enabling analysis through either predefined questions or a guided data mining process, and shows how the developed questions and the analysis conducted can be reused and extended over time.

Keywords: educational data mining, learning management system, learning analytics, EDM framework

Procedia PDF Downloads 300
699 People Abandoning Mobile Social Games: Using Candy Crush Saga as an Example

Authors: Pei-Shan Wei, Szu-Ying Lee, Hsi-Peng Lu, Jen-Chuen Tzou, Chien-I Weng

Abstract:

Mobile social games have recently become extremely popular, spawning a whole new entertainment culture. However, mobile game players are fickle, quickly and easily picking up and abandoning games. This pilot study seeks to identify factors that lead users to discontinue playing mobile social games. We identified three sacrifices that can prompt users to abandon games: monetary sacrifice, time sacrifice, and privacy sacrifice. The results showed that monetary sacrifice has a greater impact than the other two factors on players' decisions to discontinue use.

Keywords: abandon, mobile devices, mobile social games, perceived sacrifice

Procedia PDF Downloads 472
698 The Use of Emerging Technologies in Higher Education Institutions: A Case of Nelson Mandela University, South Africa

Authors: Ayanda P. Deliwe, Storm B. Watson

Abstract:

The COVID-19 pandemic has disrupted the established practices of higher education institutions (HEIs). Most higher education institutions worldwide had to shift from traditional face-to-face to online learning. The online environment and new online tools are disrupting the way in which higher education is presented, and the structures of higher education institutions have been impacted by rapid advancements in information and communication technologies. Emerging technologies should not be viewed in a negative light because, as opposed to the traditional curriculum that worked to create productive and efficient researchers, emerging technologies encourage creativity and innovation. Using technology together with traditional means will therefore enhance teaching and learning. Emerging technologies in higher education not only change the experience of students and lecturers and the content itself but also influence the attraction and retention of students. Higher education institutions are under immense pressure because they compete not only locally and nationally; emerging technologies also expand the competition internationally, eliminating border barriers and allowing students to study in the country of their choice regardless of where they are in the world. Technology is finding its way into the lecture room day by day, and academics need to utilise the technology at their disposal if they want to get through to their students. Academics now compete for students' attention with social media platforms such as WhatsApp, Snapchat, Instagram, Facebook, and TikTok, which poses a significant challenge to higher education institutions. It is therefore critical to pay attention to emerging technologies in order to see how they can be incorporated into the classroom to improve educational quality while remaining relevant to the work industry. This study aims to understand how emerging technologies have been utilised at Nelson Mandela University in presenting teaching and learning activities since April 2020. The primary objective is to analyse how academics are incorporating emerging technologies into their teaching and learning activities. This objective was pursued through a literature review clarifying and conceptualising the emerging technologies being utilised by higher education institutions and reviewing and analysing their use, and it will be investigated further through an empirical analysis of the use of emerging technologies at Nelson Mandela University. Findings from the literature review revealed that emerging technology is impacting several key areas in higher education institutions, such as the attraction and retention of students, the enhancement of teaching and learning, increased global competition, the elimination of border barriers, and the digital divide. The literature review further identified learning management systems, open educational resources, learning analytics, and artificial intelligence as the most prevalent emerging technologies in higher education institutions. These emerging technologies will be analysed through the empirical study to identify how they are being utilised at Nelson Mandela University.

Keywords: artificial intelligence, emerging technologies, learning analytics, learner management systems, open educational resources

Procedia PDF Downloads 52