Search results for: data integrity and privacy
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25445

24695 Data Quality as a Pillar of Data-Driven Organizations: Exploring the Benefits of Data Mesh

Authors: Marc Bachelet, Abhijit Kumar Chatterjee, José Manuel Avila

Abstract:

Data quality is a key component of any data-driven organization. Without data quality, organizations cannot effectively make data-driven decisions, which often leads to poor business performance. It is therefore important for an organization to ensure that the data it uses is of high quality. This is where the concept of data mesh comes in. Data mesh is a decentralized organizational and architectural approach to data management that can help organizations improve the quality of their data. The concept was first introduced in 2020. Its purpose is to decentralize data ownership, making it easier for domain experts to manage the data. This can help organizations improve data quality by reducing reliance on centralized data teams and allowing domain experts to take charge of their data. This paper discusses how a set of elements, including data mesh, can serve as tools for increasing data quality. One of the key benefits of data mesh is improved metadata management. In a traditional data architecture, metadata management is typically centralized, which can lead to data silos and poor data quality. With data mesh, metadata is managed in a decentralized manner, ensuring accurate and up-to-date metadata and thereby improving data quality. Another benefit of data mesh is the clarification of roles and responsibilities. In a traditional data architecture, data teams are responsible for managing all aspects of data, which can lead to confusion and ambiguity about responsibilities. With data mesh, domain experts are responsible for managing their own data, which helps clarify roles and responsibilities and improves data quality. Data mesh can also contribute to a new, more agile and adaptable form of organization. By decentralizing data ownership, organizations can respond more quickly to changes in their business environment, which in turn can improve overall performance by providing better business insights through better reports and visualization tools. Monitoring and analytics are also important aspects of data quality. With data mesh, monitoring and analytics are decentralized, allowing domain experts to monitor and analyze their own data. This helps identify and address data quality problems quickly, leading to improved data quality. Data culture is another major aspect of data quality. With data mesh, domain experts are encouraged to take ownership of their data, which can help create a data-driven culture within the organization and lead to improved data quality and better business outcomes. Finally, the paper explores the contribution of AI in the coming years. AI can help enhance data quality by automating many data-related tasks, such as data cleaning and data validation. By integrating AI into data mesh, organizations can further enhance the quality of their data. The concepts above are illustrated with experience feedback from AEKIDEN, an international data-driven consultancy that has successfully implemented a data mesh approach. By sharing its experience, AEKIDEN can help other organizations understand the benefits and challenges of implementing data mesh and improving data quality.
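
As a minimal illustration of the decentralized ownership described above (an editorial sketch, not AEKIDEN's implementation; all names are hypothetical), a domain-owned data product can carry its own metadata and quality checks:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class DataProduct:
    """A domain-owned data product: the domain team, not a central team,
    maintains the metadata and quality checks that travel with the data."""
    domain: str                                   # e.g. "sales", "logistics"
    owner: str                                    # domain expert responsible for the data
    metadata: Dict[str, str] = field(default_factory=dict)
    quality_checks: List[Callable[[dict], bool]] = field(default_factory=list)

    def publish(self, record: dict) -> bool:
        """Accept a record only if it passes the domain's own quality checks."""
        return all(check(record) for check in self.quality_checks)

# Hypothetical usage: the sales domain owns its schema and its rules.
sales = DataProduct(
    domain="sales",
    owner="sales-domain-team",
    metadata={"schema_version": "1.2", "refresh": "daily"},
    quality_checks=[lambda r: r.get("amount", 0) >= 0,
                    lambda r: "customer_id" in r],
)
print(sales.publish({"customer_id": "C42", "amount": 99.5}))  # True
print(sales.publish({"amount": -1}))                          # False
```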

Keywords: data culture, data-driven organization, data mesh, data quality for business success

Procedia PDF Downloads 129
24694 The Design of Imaginable Urban Road Landscape

Authors: Wang Zhenzhen, Wang Xu, Hong Liangping

Abstract:

With the rapid development of cities, the way people commute has changed greatly; at the same time, people in the contemporary world have higher physical and psychological requirements. However, current urban road landscapes ignore these changes: their elements are often boring, confusing and fragmented, lacking integrity and hierarchy. In this situation, and in order to shape beautiful, identifiable and unique road landscapes, this article concentrates on the goal of imaginability. The paper analyses the main elements of the urban road landscape, the concept of the image and its generation mechanism, and then discusses the necessity and connotation of building imaginable urban road landscapes as well as the main problems of current urban road landscapes in terms of imaginability. Finally, the paper proposes in detail how to design an imaginable urban road landscape, based on a specific case.

Keywords: identifiability, imaginability, road landscape, the image of the city

Procedia PDF Downloads 434
24693 The Idea of Reputation in a Post-Truth Era

Authors: Karen Armstrong

Abstract:

This paper considers the importance of acquiring, cultivating, and protecting one’s personal online reputation in a post-truth era. Although the idea of the individual is an essential psychological construct, the concept now necessarily includes our online reputation. This online reputation has expanded to become almost more important than any other factor in our professional, social and psychological development. The discussion first considers philosophical ideas of the self, followed by an examination of underlying concepts of perception and interpretation in a post-truth world. It then discusses the recent shift towards constructing the self through posted images, in words and photos. Next, the relation between private personal life and exterior social life, including our reputation in a variety of realms, is addressed. This includes the adoption of specific strategies and behaviors that facilitate accuracy, currency and necessary modifications with regard to our online reputation. Finally, the discussion concludes with specific ways in which we can negotiate the fluid dynamic between reputation and our inner and outer selves to optimum effect.

Keywords: image, post-truth, privacy, reputation, surveillance

Procedia PDF Downloads 251
24692 Development and Application of the Proctoring System with Face Recognition for User Registration on the Educational Information Portal

Authors: Meruyert Serik, Nassipzhan Duisegaliyeva, Danara Tleumagambetova, Madina Ermaganbetova

Abstract:

This research paper explores the process of creating a proctoring system by evaluating the implementation of practical face recognition algorithms. The research work was reviewed by students of the educational programs "6B01511-Computer Science", "7M01511-Computer Science", "7M01525-STEM Education", and "8D01511-Computer Science" at L.N. Gumilyov Eurasian National University. As an outcome, a proctoring system will be created that enables tests to be conducted and academic integrity to be checked within the system. Test work is carried out thanks to the correct operation of the system. The resulting proctoring system will serve as the basis for automating the educational information portal using machine learning.
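
As an illustrative sketch only (the abstract does not specify the algorithms or libraries used), an identity check of the kind a proctoring system performs can be written with the open-source face_recognition package; the image file names are hypothetical:

```python
import face_recognition  # open-source wrapper around dlib's face encoder

# Encoding stored at registration time (hypothetical file name, one face assumed).
registered_img = face_recognition.load_image_file("registered_student.jpg")
registered_enc = face_recognition.face_encodings(registered_img)[0]

# Frame captured by the webcam during the test session (hypothetical file name).
session_img = face_recognition.load_image_file("webcam_frame.jpg")
session_encs = face_recognition.face_encodings(session_img)

if not session_encs:
    print("No face detected - flag for review")
elif len(session_encs) > 1:
    print("Multiple faces detected - possible integrity violation")
else:
    match = face_recognition.compare_faces([registered_enc], session_encs[0],
                                           tolerance=0.6)[0]
    print("Identity confirmed" if match else "Face does not match registration")
```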

Keywords: artificial intelligence, education portal, face recognition, machine learning, proctoring

Procedia PDF Downloads 113
24691 Big Data Analysis with RHadoop

Authors: Ji Eun Shin, Byung Ho Jung, Dong Hoon Lim

Abstract:

It is almost impossible to store or analyze exponentially growing big data with traditional technologies. Hadoop is a new technology that makes this possible. The R programming language is by far the most popular statistical tool for big data analysis based on distributed processing with Hadoop technology. Using RHadoop, which integrates the R and Hadoop environments, we implemented parallel multiple regression analysis with different sizes of real data. Experimental results showed that our RHadoop system became much faster as the number of data nodes increased. We also compared the performance of RHadoop with the lm function and the biglm package based on big memory. The results showed that our RHadoop approach was faster than the other packages, owing to parallel processing that increases the number of map tasks as the data size grows.
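
The study itself is implemented in R with RHadoop; as a language-neutral sketch of the underlying idea (not the authors' code), parallel multiple regression under MapReduce reduces to computing the partial sums XᵀX and Xᵀy on each data block in the map step and solving the combined normal equations in the reduce step:

```python
import numpy as np

def map_partial_sums(X_block, y_block):
    """Map step: per-block sufficient statistics for least squares."""
    return X_block.T @ X_block, X_block.T @ y_block

def reduce_and_solve(partials):
    """Reduce step: add the partial sums and solve the normal equations."""
    XtX = sum(p[0] for p in partials)
    Xty = sum(p[1] for p in partials)
    return np.linalg.solve(XtX, Xty)

# Simulated data split across "nodes" (blocks) to mimic HDFS splits.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(10_000), rng.normal(size=(10_000, 3))])
beta_true = np.array([1.0, 2.0, -0.5, 0.3])
y = X @ beta_true + rng.normal(scale=0.1, size=10_000)

blocks = zip(np.array_split(X, 4), np.array_split(y, 4))   # 4 map tasks
beta_hat = reduce_and_solve([map_partial_sums(Xb, yb) for Xb, yb in blocks])
print(np.round(beta_hat, 3))   # close to [1.0, 2.0, -0.5, 0.3]
```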

Keywords: big data, Hadoop, parallel regression analysis, R, RHadoop

Procedia PDF Downloads 430
24690 Challenges of Blockchain Applications in the Supply Chain Industry: A Regulatory Perspective

Authors: Pardis Moslemzadeh Tehrani

Abstract:

Due to the emergence of blockchain technology and the benefits of cryptocurrencies, intelligent or smart contracts are gaining traction. Artificial intelligence (AI) is transforming our lives and is being embraced by a wide range of sectors. Smart contracts, which are at the heart of blockchains, incorporate AI characteristics. Such contracts are referred to as "smart" because the underlying technology allows contracting parties to agree on terms expressed in computer code that defines machine-readable instructions for computers to follow under specific situations. The transmission happens automatically if the conditions are met. Initially utilised for financial transactions, blockchain applications have since expanded to the financial, insurance, and medical sectors, as well as supply networks. Raw material acquisition by suppliers, design and fabrication by manufacturers, delivery of final products to consumers, and even post-sales logistics assistance are all part of supply chains. Many issues are linked with managing supply chains from the planning and coordination stages onwards, and because of their complexity they can be implemented in a smart contract on a blockchain. Manufacturing delays and limited third-party supplies of product components have raised concerns about the integrity and accountability of supply chains for food and pharmaceutical items. Other concerns include regulatory compliance in multiple jurisdictions and transportation conditions (for instance, many products must be kept in temperature-controlled environments to ensure their effectiveness). Products are handled by several providers before reaching customers in modern economic systems. Information is sent between suppliers, shippers, distributors, and retailers at every stage of the production and distribution process, and it travels more effectively when intermediaries are eliminated from the equation. The use of blockchain technology could be a viable solution to these coordination issues. In blockchains, smart contracts allow for the rapid transmission of production data, logistical data, inventory levels, and sales data. This research investigates the legal and technical advantages and disadvantages of AI-blockchain technology in the supply chain business. It aims to uncover the applicable legal problems and barriers to applying AI-blockchain technology to supply chains, particularly in the food industry. It also discusses the essential legal and technological issues and impediments to supply chain implementation for stakeholders, as well as methods for overcoming them before releasing the technology to clients. Because little research has been done on this topic, it is difficult for industry stakeholders to grasp how blockchain technology could be used in their respective operations. As a result, the focus of this research is on building advanced and complex contractual terms in supply chain smart contracts on blockchains to cover unforeseen supply chain challenges.
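
Production smart contracts are usually written in a blockchain language such as Solidity; purely as a hedged illustration of the machine-readable terms described above, the sketch below models a clause that releases full payment only if temperature-controlled shipping conditions were respected (all thresholds and field names are hypothetical):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ShipmentContract:
    """Toy model of a smart-contract clause: terms are code, execution is automatic."""
    max_temp_c: float      # agreed temperature ceiling for the goods
    payment_due: float     # amount released to the supplier on compliance

    def settle(self, temperature_log: List[float]) -> float:
        """Release full payment if every reading respected the ceiling,
        otherwise apply an agreed penalty (here 50% - a hypothetical term)."""
        compliant = all(t <= self.max_temp_c for t in temperature_log)
        return self.payment_due if compliant else self.payment_due * 0.5

contract = ShipmentContract(max_temp_c=8.0, payment_due=10_000.0)
print(contract.settle([4.2, 5.0, 7.9]))   # 10000.0 - conditions met
print(contract.settle([4.2, 9.3, 7.9]))   # 5000.0  - breach detected
```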

Keywords: blockchain, supply chain, IoT, smart contract

Procedia PDF Downloads 120
24689 A Mutually Exclusive Task Generation Method Based on Data Augmentation

Authors: Haojie Wang, Xun Li, Rui Yin

Abstract:

To address memorization overfitting in the meta-learning MAML algorithm, a method for generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutex task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution of the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to exponential growth of computation, this paper also proposes a key data extraction method that extracts only part of the data to generate the mutex tasks. The experiments show that the method of generating mutually exclusive tasks can effectively address memorization overfitting in the meta-learning MAML algorithm.
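
A hedged sketch of the general mechanism (not the authors' implementation): mutual exclusivity can be induced by remapping the same inputs to different labels in each generated task, so that no fixed input-to-label mapping can be memorized.

```python
import random

def make_mutex_tasks(texts, labels, n_tasks=3, n_classes=2, seed=0):
    """Generate tasks whose label assignments are mutually exclusive:
    the same input receives a task-specific (shuffled) label in each task,
    so a meta-learner cannot memorize one global input-to-label mapping."""
    rng = random.Random(seed)
    tasks = []
    for _ in range(n_tasks):
        permutation = list(range(n_classes))
        rng.shuffle(permutation)                    # task-specific label remapping
        remapped = [permutation[y] for y in labels]
        tasks.append(list(zip(texts, remapped)))
    return tasks

texts = ["great movie", "terrible plot", "loved it", "waste of time"]
labels = [1, 0, 1, 0]
for i, task in enumerate(make_mutex_tasks(texts, labels)):
    print(f"task {i}: {task}")
```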

Keywords: data augmentation, mutex task generation, meta-learning, text classification

Procedia PDF Downloads 89
24688 The Increasing Importance of the Role of AI in Higher Education

Authors: Joshefina Bengoechea Fernandez, Alex Bell

Abstract:

In its 2021 guidance for policy makers, UNESCO proposed four areas where AI can be applied in educational settings: 1) education management and delivery; 2) learning and assessment; 3) empowering teachers and facilitating teaching; and 4) providing lifelong learning possibilities (UNESCO, 2021). As with blockchain technologies, AI will automate the management of educational institutions, including, but not limited to, admissions, timetables, attendance, and homework monitoring. Furthermore, AI will be used to select relevant learning content across learning platforms for each student, based on his or her personalized needs. A problem educators face is the “one-size-fits-all” approach, which does not work with a diverse student population. The purpose of this paper is to examine whether the implementation of technology is the solution to the problems faced in higher education. The paper builds upon a constructivist approach, combining a literature review and research on key publications and academic reports.

Keywords: artificial intelligence, learning platforms, students' personalised needs, lifelong learning, privacy, ethics

Procedia PDF Downloads 98
24687 Efficient Positioning of Data Aggregation Point for Wireless Sensor Network

Authors: Sifat Rahman Ahona, Rifat Tasnim, Naima Hassan

Abstract:

Data aggregation is a helpful technique for reducing the data communication overhead in wireless sensor networks. One of the important tasks of data aggregation is the positioning of the aggregator points. Although a great deal of work has been done on data aggregation, the efficient positioning of aggregation points has received comparatively little attention. In this paper, the authors focus on the positioning, or placement, of aggregation points in a wireless sensor network and propose an algorithm for selecting aggregator positions in a scenario where aggregator nodes are more powerful than sensor nodes.
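
The paper's selection algorithm is not detailed in the abstract; as one hedged illustration of aggregator placement, the sketch below positions aggregation points at k-means centroids of the sensor coordinates, so each more powerful aggregator sits near the sensors it serves (scikit-learn assumed; coordinates are synthetic):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
sensors = rng.uniform(0, 100, size=(60, 2))      # synthetic sensor coordinates (m)

k = 4                                            # number of aggregator nodes to place
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(sensors)

# Each cluster centroid is a candidate position for one aggregation point;
# sensors forward their readings to the nearest aggregator.
for i, centre in enumerate(km.cluster_centers_):
    members = int(np.sum(km.labels_ == i))
    print(f"aggregator {i}: position {np.round(centre, 1)}, serves {members} sensors")
```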

Keywords: aggregation point, data communication, data aggregation, wireless sensor network

Procedia PDF Downloads 150
24686 Spatial Econometric Approaches for Count Data: An Overview and New Directions

Authors: Paula Simões, Isabel Natário

Abstract:

This paper reviews a number of theoretical aspects of implementing an explicit spatial perspective in econometrics for modelling non-continuous data in general, and count data in particular. It provides an overview of the several spatial econometric approaches available for modelling data collected with reference to location in space, from classical spatial econometrics to recent developments in spatial econometric models for count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures under different assumptions, and to the constraints and implications embedded in the various specifications in the literature. The review combines insights from the classical spatial econometrics literature as well as from hierarchical modelling and analysis of spatial data, in order to identify possible new directions for the processing of count data in a spatial hierarchical Bayesian econometric context.
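
As a hedged illustration of the kind of specification the review covers (one common textbook form, not a model taken from the paper), a hierarchical Bayesian spatial-lag Poisson regression for areal count data can be written as

\[
\begin{aligned}
y_i \mid \mu_i &\sim \operatorname{Poisson}(\mu_i),\\
\log \mu_i &= \rho \sum_{j=1}^{n} w_{ij} \log \mu_j + \mathbf{x}_i^{\top}\boldsymbol{\beta} + \varepsilon_i,
\qquad \varepsilon_i \sim \mathcal{N}(0,\sigma^2),\\
\rho &\sim \operatorname{Uniform}(-1,1), \qquad \boldsymbol{\beta} \sim \mathcal{N}(\mathbf{0}, \tau^2 \mathbf{I}),
\end{aligned}
\]

where the \(w_{ij}\) are row-standardized spatial weights and \(\rho\) captures the spatial lag autocorrelation discussed above.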

Keywords: spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data

Procedia PDF Downloads 587
24685 A NoSQL Based Approach for Real-Time Managing of Robotics's Data

Authors: Gueidi Afef, Gharsellaoui Hamza, Ben Ahmed Samir

Abstract:

This paper deals with the continual growth of data, which has driven the emergence of new data management solutions: NoSQL databases. They span several areas, such as personalization, profile management, real-time big data, content management, catalogs, customer views, mobile applications, the Internet of Things, digital communication, and fraud detection. The use of these database management systems is increasing. They store data very well, and with the trend towards big data, new storage challenges demand new structures and methods for managing enterprise data. The new intelligent machines in the e-learning sector thrive on more data, so smart machines can learn more and faster. Robotics is the use case on which our tests focus. Implementing NoSQL for robotics wrestles all the data robots acquire into a usable form, because with ordinary approaches to robotics we face severe limits in managing and finding the exact information in real time. Our proposed approach is demonstrated through experimental studies and a running example used as a use case.
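
As an illustrative sketch (the paper's implementation details are not given in the abstract), the snippet below logs robot sensor readings to MongoDB, a document-oriented NoSQL store, through the pymongo driver and retrieves the latest reading in real time; the connection string and field names are hypothetical:

```python
from datetime import datetime, timezone
import pymongo

# Hypothetical local MongoDB instance; schema-less documents suit heterogeneous sensors.
client = pymongo.MongoClient("mongodb://localhost:27017")
telemetry = client["robotics"]["telemetry"]
telemetry.create_index([("robot_id", 1), ("ts", -1)])   # fast "latest reading" queries

def log_reading(robot_id, sensor, value):
    """Insert one sensor reading as a flexible JSON-like document."""
    telemetry.insert_one({
        "robot_id": robot_id, "sensor": sensor, "value": value,
        "ts": datetime.now(timezone.utc),
    })

def latest_reading(robot_id, sensor):
    """Fetch the most recent reading for a robot/sensor pair."""
    return telemetry.find_one({"robot_id": robot_id, "sensor": sensor},
                              sort=[("ts", pymongo.DESCENDING)])

log_reading("arm-01", "joint_temp_c", 41.7)
print(latest_reading("arm-01", "joint_temp_c"))
```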

Keywords: NoSQL databases, database management systems, robotics, big data

Procedia PDF Downloads 347
24684 Fuzzy Optimization Multi-Objective Clustering Ensemble Model for Multi-Source Data Analysis

Authors: C. B. Le, V. N. Pham

Abstract:

In modern data analysis, multi-source data appears more and more often in real applications. Multi-source data clustering has emerged as an important issue in the data mining and machine learning community. Different data sources provide information about different aspects of the data, so linking multiple sources is essential to improve clustering performance. In practice, however, multi-source data is often heterogeneous, uncertain, and large, which is considered a major challenge of multi-source data. Ensembles are versatile machine learning models in which learning techniques can work in parallel on big data, and clustering ensembles have been shown to outperform any standard clustering algorithm in terms of accuracy and robustness. However, most traditional clustering ensemble approaches are based on a single-objective function and single-source data. This paper proposes a new clustering ensemble method for multi-source data analysis: the fuzzy optimized multi-objective clustering ensemble method, called FOMOCE. Firstly, a clustering ensemble mathematical model based on the structure of the multi-objective clustering function, multi-source data, and dark knowledge is introduced. Then, rules for extracting dark knowledge from the input data, the clustering algorithms, and the base clusterings are designed and applied. Finally, a clustering ensemble algorithm is proposed for multi-source data analysis. Experiments were performed on standard sample data sets, and the results demonstrate the superior performance of the FOMOCE method compared to existing clustering ensemble methods and multi-source clustering methods.
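
The FOMOCE algorithm itself is not reproduced here; as a hedged baseline illustration of a clustering ensemble, the sketch below combines several base clusterings through a co-association matrix and extracts a consensus partition with hierarchical clustering (scikit-learn and SciPy assumed, synthetic data):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

X, _ = make_blobs(n_samples=150, centers=3, random_state=0)

# Base clusterings, e.g. one per data source or per random initialisation.
base_labels = [KMeans(n_clusters=3, n_init=5, random_state=s).fit_predict(X)
               for s in range(10)]

# Co-association matrix: fraction of base clusterings that put i and j together.
n = X.shape[0]
coassoc = np.zeros((n, n))
for labels in base_labels:
    coassoc += (labels[:, None] == labels[None, :])
coassoc /= len(base_labels)

# Consensus partition: hierarchical clustering on (1 - co-association) distances.
np.fill_diagonal(coassoc, 1.0)
dist = squareform(1.0 - coassoc, checks=False)
consensus = fcluster(linkage(dist, method="average"), t=3, criterion="maxclust")
print(np.bincount(consensus))   # sizes of the consensus clusters
```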

Keywords: clustering ensemble, multi-source, multi-objective, fuzzy clustering

Procedia PDF Downloads 181
24683 Development of Scale in Evaluation of Effectiveness of Motivation of Divine Leadership

Authors: Parviz Abadi

Abstract:

Leadership is a key driver of organizational achievement. The research presented herein intends to provide tools for assessing Divine Leadership, which are imperative for quantitative evaluation of this leadership. The effectiveness of this leadership has never been examined. Various tests can be applied to it, such as evaluating it against follower motivation or the impact it has on organizational success. One common means of evaluating a phenomenon is to conduct a quantitative study of the hypotheses related to the subject. The dimensions enacted in this leadership consist of humility, integrity, empowerment, altruism, and being visionary. However, these elements of the leadership construct are latent and cannot easily be assessed directly, so it is necessary to develop tangible items that relate to the construct. The study presented herein was conducted to develop scales that are tangible and can be applied in a quantitative study to assess this leadership. The study resulted in a detailed questionnaire consisting of 40 questions that can be presented to survey participants.

Keywords: leadership, management, scale development, organizations

Procedia PDF Downloads 49
24682 Modeling Activity Pattern Using XGBoost for Mining Smart Card Data

Authors: Eui-Jin Kim, Hasik Lee, Su-Jin Park, Dong-Kyu Kim

Abstract:

Smart-card data are expected to provide information on activity patterns as an alternative to conventional person trip surveys. The focus of this study is to propose a method that trains on person trip surveys to supplement smart-card data, which do not contain the purpose of each trip. We selected only features available from smart-card data, such as spatio-temporal information on the trip and geographic information system (GIS) data near the stations, to train on the survey data. XGBoost, a state-of-the-art tree-based ensemble classifier, was used to train data from multiple sources. This classifier uses a more regularized model formalization to control over-fitting and shows very fast execution times with good performance. The validation results showed that the proposed method efficiently estimated the trip purpose. The GIS data around the station and the duration of stay at the destination were significant features in modelling trip purpose.
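
A hedged sketch of the modelling step (the features and labels below are synthetic stand-ins for the smart-card and GIS features described, not the study's data):

```python
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 5_000
# Stand-in features: boarding hour, in-vehicle time (min), duration of stay at
# the destination (min), and a simple land-use class near the alighting station.
X = np.column_stack([
    rng.integers(5, 24, n),            # boarding hour
    rng.uniform(5, 90, n),             # in-vehicle time
    rng.uniform(10, 600, n),           # duration of stay at destination
    rng.integers(0, 4, n),             # GIS land-use class near station
])
y = rng.integers(0, 3, n)              # trip-purpose label from the survey (synthetic)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = XGBClassifier(n_estimators=200, max_depth=5, learning_rate=0.1,
                      reg_lambda=1.0)  # regularization helps control over-fitting
model.fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, model.predict(X_te)))
```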

Keywords: activity pattern, data fusion, smart-card, XGBoost

Procedia PDF Downloads 239
24681 Distinct Patterns of Resilience Identified Using Smartphone Mobile Experience Sampling Method (M-ESM) and a Dual Model of Mental Health

Authors: Hussain-Abdulah Arjmand, Nikki S. Rickard

Abstract:

The response to stress can be highly heterogeneous and may be influenced by methodological factors. The integrity of the data is optimized by measuring both positive and negative affective responses to an event, by measuring responses in real time as close to the stressful event as possible, and by utilizing data collection methods that do not interfere with naturalistic behaviours. The aim of the current study was to explore short-term prototypical responses to major stressor events on outcome measures encompassing both positive and negative indicators of psychological functioning. A novel mobile experience sampling methodology (m-ESM) was utilized to monitor affective responses to stressors in real time. A smartphone mental health app (‘Moodprism’), which prompts users daily to report both their positive and negative mood as well as whether any significant event has occurred in the past 24 hours, was developed for this purpose. A sample of 142 participants was recruited as part of the promotion of this app. Participants’ daily reported experiences of stressor events, levels of depressive symptoms, and positive affect were collected across a 30-day period as they used the app. For each participant, major stressor events were identified based on the subjective severity of the event as rated by the user. Depression and positive affect ratings were extracted for the three days following the event, and responses to the event were scaled relative to the participant’s general reactivity across the remainder of the 30-day period. Participants were first clustered into groups based on initial reactivity and subsequent recovery following a stressor event, which revealed distinct patterns of responding in depressive symptomatology and positive affect. Participants were then grouped based on their allocation to clusters in each outcome variable. A highly individualised manner of responding to stressor events, in symptoms of depression and levels of positive affect, was observed. A complete description of the novel profiles identified will be presented at the conference. These findings suggest that real-time measurement of both positive and negative functioning in response to stressors yields a more complex set of responses than previously observed with retrospective reporting. The use of smartphone technology to measure individualized responding also shed significant insight.

Keywords: depression, experience sampling methodology, positive functioning, resilience

Procedia PDF Downloads 235
24680 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment

Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane

Abstract:

Digital investigators often have a hard time spotting evidence in digital information. It has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technologies, and specific procedures used in digital investigation are not keeping up with criminal developments, and criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence (AI) is invaluable in identifying crime: it has been observed that AI-based algorithms are highly effective in detecting risks, preventing criminal activity, and forecasting illegal activity. Providing objective data and conducting an assessment is the goal of digital forensics and digital investigation, which assists in developing a plausible theory that can be presented as evidence in court. Researchers and other authorities have used the available data as evidence in court to convict individuals. This research paper aims to develop a multi-agent framework for digital investigations using specific intelligent software agents (ISA). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task. The rules and knowledge contained within each agent depend on the type of investigation. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The MADIK framework is implemented using the Java Agent Development Framework with Eclipse, a Postgres repository, and a rule engine for agent reasoning. The proposed framework was tested using the Lone Wolf image files and datasets, and experiments were conducted using various sets of ISA and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute. As a result of loading the agents, 5 percent of the time was lost, as the File Path Agent prescribed deleting 1,510, while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing for further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
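
The MADIK agents themselves are written in Java on the Java Agent Development Framework; as a hedged, language-neutral illustration of what a Hash Set Agent style check does, the sketch below hashes files under an evidence directory and flags matches against a known hash set (paths and hash values are placeholders):

```python
import hashlib
from pathlib import Path

# Hypothetical known hash set (e.g. previously identified files of interest).
KNOWN_HASHES = {
    "5d41402abc4b2a76b9719d911017c592",   # placeholder MD5 values
    "098f6bcd4621d373cade4e832627b4f6",
}

def md5_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so large evidence files do not exhaust memory."""
    digest = hashlib.md5()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def hash_set_scan(evidence_dir: str):
    """Yield files whose hash matches the known set - candidates for the report."""
    for path in Path(evidence_dir).rglob("*"):
        if path.is_file() and md5_of(path) in KNOWN_HASHES:
            yield path

for hit in hash_set_scan("./lone_wolf_extract"):   # hypothetical mount point
    print("known file located:", hit)
```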

Keywords: artificial intelligence, computer science, criminal investigation, digital forensics

Procedia PDF Downloads 207
24679 A Mutually Exclusive Task Generation Method Based on Data Augmentation

Authors: Haojie Wang, Xun Li, Rui Yin

Abstract:

To address memorization overfitting in the model-agnostic meta-learning (MAML) algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutex task by mapping one feature of the data to multiple labels so that the generated mutex task is inconsistent with the data distribution of the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to exponential growth of computation, this paper also proposes a key data extraction method that extracts only part of the data to generate the mutex tasks. The experiments show that the method of generating mutually exclusive tasks can effectively address memorization overfitting in the meta-learning MAML algorithm.

Keywords: mutex task generation, data augmentation, meta-learning, text classification

Procedia PDF Downloads 134
24678 Understanding Non-Utilization of AI Tools for Research and Academic Writing among Academic Staff in Nigerian Universities: A Paradigm Shift

Authors: Abubakar Abdulkareem, Nasir Haruna Soba

Abstract:

This study investigates the non-utilization of AI tools for research and academic writing among academic staff in Nigerian universities, using Rogers' perceived attributes of innovation theory as the framework guiding the investigation. The study was framed in an interpretative research paradigm; a qualitative methodology and a case study research design were adopted. Interviews were conducted with 20 academic staff, and a thematic analysis process was used to identify 115 narratives. The narratives were organized into five major categories and further collapsed into five theoretical constructs explaining the non-use of AI tools for research and academic writing. Findings from this study reveal that the reasons for non-utilization include lack of awareness, perceived complexity, trust and reliability concerns, cost and accessibility, ethical and privacy concerns, and cultural and institutional factors.

Keywords: non-utilization, AI tools, research and academic writing, academic staff

Procedia PDF Downloads 35
24677 Inter Religion Harmony and World Peace: Theory from Shah Wali Ullah's Philosophy

Authors: Muhammad Usman Ghani

Abstract:

Religious tolerance is essential for the establishment of peace in the world. In the system created by Almighty Allah, where a great deal of diversity is found, the world still holds unity within itself. In today's world, human beings have been divided by clashes of civilizations or on the basis of religious or linguistic differences. Shah Wali Ullah, a religious scholar of the Indo-Pak subcontinent, describes four ethics on the basis of which all religions of the world can unite. In his philosophy of religion he says that a number of elements are common to all religions, but four are especially common: cleanliness, noble deeds, relation to the Almighty (the existence of the Almighty), and justice. He adds that this universe also holds its integrity within itself: all humans differ in their attributes, but being human is common to them all; similarly, all species of the universe differ in their nature, but being creatures of God is shared by all of them.

Keywords: inter-religious relation, peace and harmony, unity, four common ethics/virtues

Procedia PDF Downloads 339
24676 Calculus of Turbojet Performances for Ideal Case

Authors: S. Bennoud, S. Hocine, H. Slme

Abstract:

Developments in turbine cooling technology play an important role in increasing the thermal efficiency and power output of modern gas turbines, in particular turbojets. Advanced turbojets operate at high temperatures to improve thermal efficiency and power output, temperatures that are far above the permissible metal temperatures. There is therefore a critical need to cool the blades in order to give them a maximum service life and safe operation. The main objective of this work is to calculate turbojet performance, as well as the cooling of the turbine blades. The developed application enables the calculation of turbojet performance at different altitudes in order to find an optimal operating point, making it possible to maintain the turbine blades at an acceptable maximum temperature and to limit local temperature variations so as to guarantee their integrity over the entire lifespan of the engine.
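
For reference (a standard textbook relation, not a result of the paper), the ideal Brayton cycle underlying such calculations has a thermal efficiency fixed entirely by the compressor pressure ratio \(\pi_c\) and the ratio of specific heats \(\gamma\):

\[
\eta_{th} = 1 - \frac{1}{\pi_c^{(\gamma-1)/\gamma}},
\]

so that, for example, \(\pi_c = 10\) and \(\gamma = 1.4\) give \(\eta_{th} \approx 0.48\).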

Keywords: brayton cycle, turbine blades cooling, turbojet cycle, turbojet performances

Procedia PDF Downloads 218
24675 Criminal Laws Associated with Cyber-Medicine and Telemedicine in Current Law Systems in the World

Authors: Shahryar Eslamitabar

Abstract:

Currently, the internet plays an important role in various scientific, commercial and service practices. Thanks to information and communication technology, the healthcare industry can offer professional medical services over the internet in a wider geographical area, a practice generally known as cyber-medicine. With appealing benefits such as convenience in offering healthcare services, improved accessibility to those services, enhanced information exchange, cost-effectiveness and time-saving, tele-health has increasingly developed innovative models of healthcare delivery. However, it presents many potential hazards to cyber-patients that are inherent in the use of the system. First, there are legal issues associated with the communication and transfer of information on the internet, including licensure, malpractice, liability and jurisdiction, as well as privacy, confidentiality and security of personal data, the most important challenge brought about by this system. Additional items of concern are technological and ethical. Although there are some rules to deal with the pitfalls of cyber-medicine practice in the USA and some European countries, for all these developments it is still being practiced in a legal vacuum in many countries. In addition to domestic legislation to deal with potential problems arising from the system, it is also imperative that international or regional agreements be developed to achieve harmonization of laws among countries and states. This article discusses some implications posed by the practice of cyber-medicine in the healthcare system according to the experience of some developed countries, using a comparative study of laws. It also reviews the status of tele-health laws in Iran. Finally, it is intended to pave the way for countries like Iran, with a newly established judicial system for health law, to develop appropriate regulations by providing some recommendations.

Keywords: tele-health, cyber-medicine, telemedicine, criminal laws, legislations, time-saving

Procedia PDF Downloads 653
24674 Revolutionizing Traditional Farming Using Big Data/Cloud Computing: A Review on Vertical Farming

Authors: Milind Chaudhari, Suhail Balasinor

Abstract:

Due to massive deforestation and an ever-increasing population, the organic content of the soil is depleting at a much faster rate, and there is a significant chance that global food production will drop by 40% in the next two decades. Vertical farming can help sustain food production by leveraging big data and cloud computing to ensure plants are grown naturally, providing the optimum nutrients and sunlight determined by analyzing millions of data points. This paper outlines the most important parameters in vertical farming and how a combination of big data and AI helps in calculating and analyzing these millions of data points. Finally, the paper outlines how different organizations are controlling the indoor environment by leveraging big data to enhance food quantity and quality.

Keywords: big data, IoT, vertical farming, indoor farming

Procedia PDF Downloads 167
24673 Data Challenges Facing Implementation of Road Safety Management Systems in Egypt

Authors: A. Anis, W. Bekheet, A. El Hakim

Abstract:

Implementing a Road Safety Management System (SMS) in a crowded developing country such as Egypt is a necessity. Establishing a sustainable SMS requires a comprehensive, reliable data system for all information pertinent to road crashes. This paper surveys the data available in Egypt and assesses its validity for use in an SMS. The research provides some missing data, points to the data that remain unavailable in Egypt, and looks forward to the contribution of the scientific community, the authorities, and the public in solving the problem of missing or unreliable crash data. The data required for implementing an SMS in Egypt are divided into three categories. The first is available data, such as fatality and injury rates, which this research shows may be inconsistent and unreliable. The second category is data that are not available but may be estimated; an example of estimating vehicle cost is given in this research. The third is data that are not available and can only be measured case by case, such as the functional and geometric properties of a facility. Some questions are raised in this research for the scientific community, such as how to improve the links among road safety stakeholders in order to obtain a consistent, unbiased, and reliable data system.

Keywords: road safety management system, road crash, road fatality, road injury

Procedia PDF Downloads 134
24672 Review Paper on an Algorithm Enhancing Privacy and Security in Online Meeting Platforms Using a Secured Encryption

Authors: Tonderai Muchenje, Mkhatshwa Phethile

Abstract:

People know that communication with one another is necessary, and during unexpected natural disasters and outbreaks of epidemics and pandemics, the need for online meeting platforms becomes most important. The development of the telecommunication sector has also played an important role. The COVID-19 pandemic and the new normal it created resulted in an overwhelming production of online meeting platforms to cope with the situation. This software was initially used mainly for business communication, but the COVID-19 pandemic rapidly changed that: at present, these virtual meeting applications are used not only for informal meetings with friends and relatives but also for formal meetings in the business and education (university) sectors. In this article, an attempt is made to list useful, secure ways of using online meeting platforms.
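
As one hedged example of the kind of secure practice such a review might list (not a protocol taken from the paper), meeting artefacts such as chat transcripts or recordings can be protected with authenticated symmetric encryption from the Python cryptography package:

```python
from cryptography.fernet import Fernet

# In practice the key would be exchanged over a secure channel and never stored
# alongside the data; it is generated here only to make the example runnable.
key = Fernet.generate_key()
cipher = Fernet(key)                      # AES-128-CBC with HMAC-SHA256 authentication

transcript = b"Meeting notes: Q3 budget discussion - confidential"
token = cipher.encrypt(transcript)        # ciphertext safe to store or transmit
print(token[:40], b"...")

recovered = cipher.decrypt(token)         # fails loudly if the token was tampered with
assert recovered == transcript
```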

Keywords: virtual background, Zoom, secure online algorithm, RingCentral, Pexip, TeamViewer, Microsoft Teams

Procedia PDF Downloads 106
24671 A Comparative Study between Japan and the European Union on Software Vulnerability Public Policies

Authors: Stefano Fantin

Abstract:

The present analysis stems from research undertaken in the course of the European-funded project EUNITY, which targets the gaps in research and development on cybersecurity and privacy between Europe and Japan. Under these auspices, the research presents a study of the policy approaches of Japan, the EU and a number of Member States of the Union with regard to the handling and discovery of software vulnerabilities, with the aim of identifying methodological differences and similarities. The research builds upon a functional comparative analysis of both public policies and legal instruments from the identified jurisdictions. The result of this analysis is based on semi-structured interviews with EUNITY partners, as well as on the researcher's participation in a recent report from the Center for EU Policy Study on software vulnerability. The European Union presents a rather fragmented legal framework on software vulnerabilities. The presence of a number of different pieces of legislation at the EU level (including the Network and Information Security Directive, the Critical Infrastructure Directive, the Directive on Attacks against Information Systems and the proposal for a Cybersecurity Act), none with a clear focus on this subject, makes it difficult for both national governments and end-users (software owners, researchers and private citizens) to gain a clear understanding of the Union's approach. Additionally, the current data protection reform package (the General Data Protection Regulation) seems to create legal uncertainty around security research. To date, at the Member State level, a few efforts towards transparent practices have been made, namely by the Netherlands, France, and Latvia; this research explains what policy approach these countries have taken. Japan started implementing a coordinated vulnerability disclosure policy in 2004. To date, the framework has been amended twice (in 2014 and 2017) and is complemented by a series of instruments allowing researchers to disclose any new discovery responsibly. However, the policy has started to lose its efficiency due to a significant increase in reports made to the authority in charge. To conclude, the research reveals two asymmetric policy approaches, time-wise and content-wise. The analysis therefore concludes with a series of policy recommendations based on the lessons learned from both regions, towards a common approach to the security of European and Japanese markets, industries and citizens.

Keywords: cybersecurity, vulnerability, European Union, Japan

Procedia PDF Downloads 152
24670 Big Data-Driven Smart Policing: Big Data-Based Patrol Car Dispatching in Abu Dhabi, UAE

Authors: Oualid Walid Ben Ali

Abstract:

Big Data has become one of the buzzwords of today. The recent explosion of digital data has led organizations, both private and public, into a new era of more efficient decision making. At some point, businesses decided to use that concept in order to learn what makes their clients tick, with phrases like ‘sales funnel’ analysis, ‘actionable insights’, and ‘positive business impact’. So, it stands to reason that Big Data was viewed through green (read: money) colored lenses. Somewhere along the line, however, someone realized that collecting and processing data does not have to serve business purposes only; it can also be used to assist law enforcement, to improve policing, or to enhance road safety. This paper briefly presents how Big Data has been used in the field of policing to improve decision making in the daily operations of the police. As an example, we present a big-data-driven system which is used to accurately dispatch patrol cars in a geographic environment. The system is also used to allocate, in real time, the nearest patrol car to the location of an incident. This system has been implemented and applied in the Emirate of Abu Dhabi in the UAE.
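
A hedged sketch of the core dispatching step only (not the Abu Dhabi system itself): given live patrol-car GPS positions, the unit closest to an incident can be selected with the haversine distance; all coordinates below are hypothetical.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def nearest_available_car(incident, cars):
    """Return the closest patrol car that is currently free."""
    free = [c for c in cars if c["available"]]
    return min(free, key=lambda c: haversine_km(incident[0], incident[1],
                                                c["lat"], c["lon"]))

patrol_cars = [  # hypothetical live positions streamed from in-car GPS units
    {"id": "P-101", "lat": 24.4539, "lon": 54.3773, "available": True},
    {"id": "P-102", "lat": 24.4660, "lon": 54.3660, "available": False},
    {"id": "P-103", "lat": 24.4320, "lon": 54.4450, "available": True},
]
incident_location = (24.4512, 54.3968)
print("dispatch:", nearest_available_car(incident_location, patrol_cars)["id"])
```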

Keywords: big data, big data analytics, patrol car allocation, dispatching, GIS, intelligent, Abu Dhabi, police, UAE

Procedia PDF Downloads 487
24669 Efficient Internal Generator Based on Random Selection of an Elliptic Curve

Authors: Mustapha Benssalah, Mustapha Djeddou, Karim Drouiche

Abstract:

Random number generation (RNG) is of significant importance for the security and privacy of numerous applications, such as RFID technology and smart cards. The quality of the generated bit sequences is paramount: a weak internal generator, for example, can directly cause an entire application to be insecure, making it pointless to employ strong algorithms in that application. In this paper, we propose a new pseudo-random number generator (PRNG), suitable for ECC-based cryptosystems, constructed by randomly selecting points from several randomly selected elliptic curves. The main contribution of this work is the enlargement of the generator's internal state by extending the set of its output realizations to several automatically selected curves. The quality and statistical characteristics of the proposed PRNG are validated using the chi-square goodness-of-fit test and the empirical NIST Special Publication 800-22 statistical test suite.
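
The authors' construction is only summarized above; as a heavily simplified, insecure toy illustration of drawing pseudo-random output from scalar multiplication on an elliptic curve, consider the sketch below (one tiny curve, a toy state update; for intuition only, never for real cryptography):

```python
# Toy curve y^2 = x^3 + 2x + 3 over GF(97); the point (3, 6) lies on it.
P_MOD, A, B = 97, 2, 3
G = (3, 6)

def ec_add(P, Q):
    """Affine point addition on the toy curve (None is the point at infinity)."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None
    if P == Q:
        lam = (3 * x1 * x1 + A) * pow((2 * y1) % P_MOD, -1, P_MOD) % P_MOD
    else:
        lam = (y2 - y1) * pow((x2 - x1) % P_MOD, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, P):
    """Double-and-add scalar multiplication."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

def toy_ec_prng(seed, n_bytes):
    """Emit bytes from x-coordinates of successive scalar multiples of G."""
    state, out = seed, bytearray()
    while len(out) < n_bytes:
        point = ec_mul(state, G)
        x = 0 if point is None else point[0]
        out.append(x % 256)
        state = (state * 31 + x + 1) % 10_007   # simple state update, not secure
    return bytes(out)

print(toy_ec_prng(seed=12345, n_bytes=8).hex())
```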

Keywords: PRNG, security, cryptosystem, ECC

Procedia PDF Downloads 439
24668 Mining Multicity Urban Data for Sustainable Population Relocation

Authors: Xu Du, Aparna S. Varde

Abstract:

In this research, we propose to conduct diagnostic and predictive analysis of the key factors and consequences of urban population relocation. To achieve this goal, urban simulation models extract urban development trends, as land use change patterns, from a variety of data sources. The results are treated as part of urban big data, together with other information such as population change and economic conditions. Multiple data mining methods are deployed on these data to analyze nonlinear relationships between parameters. The result determines the driving forces of population relocation with respect to urban sprawl and urban sustainability and their related parameters. Experiments so far reveal that the data mining methods discover useful knowledge from the multicity urban data. This work sets the stage for developing a comprehensive urban simulation model that caters to specific questions from targeted users and contributes towards achieving sustainability as a whole.

Keywords: data mining, environmental modeling, sustainability, urban planning

Procedia PDF Downloads 301
24667 A Preliminary Comparative Study Between the United Kingdom and Taiwan: Public Private Collaboration and Cooperation in Tackling Large Scale Cyberattacks

Authors: Chi-Hsuan Cheng

Abstract:

This research aims to evaluate public-private partnerships against cyberattacks by comparing the UK and Taiwan. First, the study analyses major cyberattacks and factors influencing cybersecurity in both countries. Second, it assesses the effectiveness of current cyber defence strategies in combating cyberattacks by comparing the approaches taken in the UK and Taiwan, while also evaluating the cyber resilience of both nations. Lastly, the research evaluates existing public-private partnerships by comparing those in the UK and Taiwan, and proposes recommendations for enhancing cooperation and collaboration mechanisms in tackling cyberattacks. Grounded theory serves as the core research method. Theoretical sampling is used to recruit participants in both the UK and Taiwan, including investigators, police officers, and professionals from cybersecurity firms. Semi-structured interviews are conducted in English in the UK and Mandarin in Taiwan, recorded with consent, and pseudonymised for privacy. Data analysis involves open coding, grouping excerpts into codes, and categorising codes. Axial coding connects codes into categories, leading to the development of a codebook. The process continues iteratively until theoretical saturation is reached. Finally, selective coding identifies the core topic: evaluating public-private cooperation against cyberattacks and its implications for social and policing strategies in the UK and Taiwan. This highlights the current status of the cybersecurity industry, governmental plans for cybersecurity, and contributions to cybersecurity from both government sectors and cybersecurity firms, with a particular focus on public-private partnerships. In summary, this research aims to offer practical recommendations to law enforcement, the private sector, and academia for reflecting on current strategies and tailoring future approaches in cybersecurity.

Keywords: cybersecurity, cybercrime, public private partnerships, cyberattack

Procedia PDF Downloads 68
24666 Model Order Reduction for Frequency Response and Effect of Order of Method for Matching Condition

Authors: Aref Ghafouri, Mohammad javad Mollakazemi, Farhad Asadi

Abstract:

In this paper, a model order reduction method is used to approximate linear and nonlinear aspects of some experimental data. The method can be used to obtain an offline reduced model that approximates the experimental data, reproduces and follows the data and the order of the system, and matches the experimental data at some frequency ratios. In this study, the method is compared across different experimental data sets, and the influence of the chosen reduction order on obtaining the best and sufficient matching condition for following the data is investigated in terms of the imaginary and real parts of the frequency response curve. Finally, the effect of the reduction order, an important parameter, on nonlinear experimental data is explained further.
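
As a hedged illustration of the matching condition discussed (a generic frequency-domain fitting criterion, not necessarily the authors' exact formulation), a reduced model \(H_r\) of order \(r\) can be chosen to fit the measured frequency response in both its real and imaginary parts:

\[
\min_{H_r} \; \sum_{k=1}^{N} \Big[\big(\operatorname{Re} H(j\omega_k) - \operatorname{Re} H_r(j\omega_k)\big)^2 + \big(\operatorname{Im} H(j\omega_k) - \operatorname{Im} H_r(j\omega_k)\big)^2\Big],
\]

where increasing the order \(r\) tightens the fit at the cost of model complexity.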

Keywords: frequency response, order of model reduction, frequency matching condition, nonlinear experimental data

Procedia PDF Downloads 396