Search results for: pervasive computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1105

Search results for: pervasive computing

715 The Internet of Things: A Survey of Authentication Mechanisms and Protocols for the Shifting Paradigm of Communicating Entities

Authors: Nazli Hardy

Abstract:

A multidisciplinary application of computer science and interactive database-driven web applications, the Internet of Things (IoT) represents a digital ecosystem with pervasive technological, social, and economic impact on the human population. It is a long-term technology, and its development is built around the connection of everyday objects to the Internet. It is estimated that by 2020, with billions of people connected to the Internet, the number of connected devices will exceed 50 billion; IoT thus represents a paradigm shift in our current interconnected ecosystem, a communication shift that will unavoidably affect people, businesses, consumers, clients, and employees. By nature, in order to provide a cohesive and integrated service, connected devices need to collect, aggregate, store, mine, and process personal and personalized data on individuals and corporations in a variety of contexts and environments. A significant factor in this paradigm shift is the necessity for secure and appropriate transmission, processing, and storage of the data. Thus, while the benefits of the applications appear to be boundless, these same opportunities are bounded by concerns such as trust, privacy, security, loss of control, and related issues. This poster and presentation look at multi-factor authentication (MFA) mechanisms that need to evolve from the login-password tuple to an Identity and Access Management (IAM) model, and on to the more cohesive Identity Relationship Management (IRM) standard. It also compares and contrasts messaging protocols that are appropriate for the IoT ecosystem.

Keywords: Internet of Things (IoT), authentication, protocols, survey

Procedia PDF Downloads 277
714 Time Integrated Measurements of Radon and Thoron Progeny Concentration in Various Dwellings of Bathinda District of Punjab Using Deposition Based Progeny Sensors

Authors: Kirandeep Kaur, Rohit Mehra, Pargin Bangotra

Abstract:

Radon and thoron are pervasive radioactive gases, and so are their progenies. The progenies of radon and thoron are present in the indoor atmosphere as attached/unattached fractions. In the present work, the seasonal variation of the concentration of attached and total (attached + unattached) nanosized decay products of indoor radon and thoron has been studied in the dwellings of Bathinda District of Punjab using deposition-based progeny sensors over long integrated times, which are independent of air turbulence. The preliminary results of these measurements are reported, particularly regarding DTPS (Direct Thoron Progeny Sensor) and DRPS (Direct Radon Progeny Sensor), for the first time in Bathinda. It has been observed that there is a strong linear relationship between total EERC (Equilibrium Equivalent Radon Concentration) and EETC (Equilibrium Equivalent Thoron Concentration) in the rainy season (R2 = 0.83). Further, a strong linear relation between total indoor radon concentration and the attached fraction has also been observed for the same rainy season (R2 = 0.91). The concentration of the attached progeny of radon (EERCatt) is 76.3% of the total Equilibrium Equivalent Radon Concentration (EERC).

Keywords: radon, thoron, progeny, DTPS/DRPS, EERC, EETC, seasonal variation

Procedia PDF Downloads 388
713 Managing Data from One Hundred Thousand Internet of Things Devices Globally for Mining Insights

Authors: Julian Wise

Abstract:

Newcrest Mining is one of the world’s top five gold and rare earth mining organizations by production, reserves, and market capitalization. This paper elaborates on the data acquisition processes employed by Newcrest, in collaboration with the Fortune 500 listed organization Insight Enterprises, to standardize machine learning solutions which process data from over one hundred thousand distributed Internet of Things (IoT) devices located at mine sites globally. Through the utilization of cloud and edge computing technologies in the software architecture, these developments enable standardized processes for machine learning applications to influence the strategic optimization of mineral processing. Target objectives of the machine learning optimizations include time savings on mineral processing, production efficiencies, risk identification, and increased production throughput. The data acquired and utilized for predictive modelling is processed through edge computing by resources collectively stored within a data lake. Being involved in the digital transformation has necessitated the standardization of software architecture to manage the machine learning models submitted by vendors, to ensure effective automation and continuous improvement of the mineral process models. Operating at scale, the system processes hundreds of gigabytes of data per day from distributed mine sites across the globe for the purposes of improved worker safety and production efficiency through big data applications.

Keywords: mineral technology, big data, machine learning operations, data lake

Procedia PDF Downloads 87
712 Reconceptualizing the Place of Empire in European Women’s Travel Writing through the Lens of Iberian Texts

Authors: Gayle Nunley

Abstract:

Between the mid-nineteenth and early twentieth century, a number of Western European women broke with gender norms of their time and undertook to write and publish accounts of their own international journeys. In addition to contributing to their contemporaries’ progressive reimagining of the space and place of female experience within the public sphere, these often orientalism-tinged texts have come to provide key source material for the analysis of gendered voice in the narration of Empire, particularly with regard to works associated with Europe’s then-ascendant imperial powers, Britain and France. Incorporation of contemporaneous writings from the once-dominant Empires of Iberian Europe introduces an important additional lens onto this process. By bringing to bear geographic notions of placedness together with discourse analysis, the examination of works by Iberian Europe’s female travelers in conjunction with those of their more celebrated Northern European peers reveals a pervasive pattern of conjoined belonging and displacement traceable throughout the broader corpus, while also underscoring the insufficiency of binary paradigms of gendered voice. The re-situating of women travelers’ participation in the European imperial project to include voices from the Iberian south creates a more robust understanding of these writers’ complex, and often unexpectedly modern, engagement with notions of gender, mobility, ‘otherness’ and contact-zone encounter acted out both within and against the imperial paradigm.

Keywords: colonialism, orientalism, Spain, travel writing, women travelers

Procedia PDF Downloads 89
711 Securitizing Terrorism: A Critical Appraisal of Pakistan’s Counter-Terrorism Approach

Authors: Bilal Zubair

Abstract:

In a constantly challenging internal security environment, Pakistan is finding ways to improvise and respond to the new variations in the pervasive phenomenon of terrorism. The state’s endeavors towards securitizing terrorism as an existential threat are both extensive and intensive and have systematically incorporated both military and non-military means. Since 2007, the military has been conducting intermittent operations and by 2014 had successfully neutralized the terrorists’ ability to target vital security installations and security personnel. The terrorists have responded by targeting communities which are soft targets and extremely vulnerable to organized assaults. Within this context, the study aims to explain the emerging trends of terrorism in Pakistan, whose multi-layered and complex developments are having far-reaching implications for state and society. With a view to exploring the underlying reasons, present trends, and ensuing ramifications of the emerging trends in terrorism, this study examines the following: first, the historical processes and development of terrorism in Pakistan; secondly, the processes of securitization, which include political consensus, legal frameworks, and military operations against the terrorist groups; thirdly, the socio-cultural dimensions and geopolitical influences on the transforming nature of sectarian terrorism. The study will also highlight the grey areas and weak links in the ongoing securitization process. Finally, the study will thoroughly explore the societal insecurity which is manifested in internal displacements, identity crises, and the weakening of the socio-political fabric of the state.

Keywords: counter-terrorism, terrorism, sectarianism, securitizing

Procedia PDF Downloads 270
710 An Evolutionary Approach for QAOA for Max-Cut

Authors: Francesca Schiavello

Abstract:

This work aims to create a hybrid algorithm, combining the Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in place of traditional gradient-based optimization processes. QAOAs were first introduced in 2014, when their algorithm performed better than the best known classical algorithm for Max-Cut graphs. Whilst classical algorithms have improved since then and have returned to being faster and more efficient, this was a huge milestone for quantum computing, and that work is often used as a benchmarking tool and a foundation for exploring variants of QAOA. This, alongside other famous algorithms like Grover’s or Shor’s, highlights to the world the potential that quantum computing holds. It also presents the prospect of a real quantum advantage where, if the hardware continues to improve, this could constitute a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side of things in the hope of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they generate in creating solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the limited number of available qubits, which restricts the scale of the problem that can be solved. These three issues are intertwined and are part of the motivation for using EAs in this work. Firstly, EAs are not based on gradient or linear optimization methods for the search in the latent space, and because of their freedom from gradients, they should suffer less from barren plateaus. Secondly, given that this algorithm performs a search in the solution space through a population of solutions, it can also be parallelized to speed up the search and optimization problem. The evaluation of the cost function, like in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOA with respect to purely classical algorithms. Thirdly, because of the nature and structure of EAs, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA attached to QAOA can perform on par with the traditional QAOA using a COBYLA optimizer, which is a linear-approximation-based method, and in some instances, it can even find a better Max-Cut. Whilst the final objective of the work is to create an algorithm that can consistently beat the original QAOA, or its variants, through either speedups or quality of the solution, this initial result is promising and shows the potential of EAs in this field. Further tests need to be performed on an array of different graphs, with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.
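As an illustration of the hybrid approach described above, the sketch below replaces the gradient-based parameter search with a simple evolutionary loop over the variational angles. Because the actual QAOA expectation value depends on a quantum backend, a smooth classical surrogate cost function stands in for it here; the population size, mutation scale, and function names are illustrative assumptions, not the authors' implementation.

```python
import math
import random

# Placeholder for the QAOA expectation value; in practice this would call a
# quantum simulator or hardware backend. A smooth classical surrogate stands
# in so the sketch runs on its own.
def qaoa_expectation(angles):
    return sum(math.sin(g) * math.cos(b) for g, b in zip(angles[0::2], angles[1::2]))

def evolve_qaoa_parameters(n_layers=2, pop_size=20, generations=50,
                           mutation_scale=0.1, seed=0):
    rng = random.Random(seed)
    n_params = 2 * n_layers  # one (gamma, beta) pair per QAOA layer
    # Initial population of random angle vectors in [0, 2*pi).
    population = [[rng.uniform(0, 2 * math.pi) for _ in range(n_params)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Evaluate fitness (larger expectation value = larger cut).
        scored = sorted(population, key=qaoa_expectation, reverse=True)
        parents = scored[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            p = rng.choice(parents)
            # Gaussian mutation: no gradients are needed, so barren plateaus
            # affect the search less than gradient-based optimisers.
            children.append([a + rng.gauss(0, mutation_scale) for a in p])
        population = parents + children
    best = max(population, key=qaoa_expectation)
    return best, qaoa_expectation(best)

if __name__ == "__main__":
    angles, value = evolve_qaoa_parameters()
    print("best angles:", [round(a, 3) for a in angles], "expectation:", round(value, 3))
```

Since the fitness evaluations across the population are mutually independent, this loop is also the point where the parallelization mentioned above would apply.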

Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization

Procedia PDF Downloads 34
709 Orbital Tuning of Marl-Limestone Alternations (Upper Tithonian to Upper Berriasian) in North-South Axis (Tunisia): Geochronology and Sequence Implications

Authors: Hamdi Omar Omar, Hela Fakhfakh, Chokri Yaich

Abstract:

This work reflects the integration of different techniques, such as field sampling and observations, magnetic susceptibility measurement, cyclostratigraphy, and sequence stratigraphy. The combination of these results allows us to reconstruct the environmental evolution of the Sidi Khalif Formation in the North-South Axis (NOSA), of Upper Tithonian, Berriasian, and Lower Valanginian age. Six sedimentary facies were identified; they are primarily influenced by open marine sedimentation receiving increasing terrigenous influx. Spectral analysis, based on MS variation (for the outcropping section) and wireline-logging gamma ray (GR) variation (for the sub-area section), shows a pervasive dominance of 405-kyr eccentricity cycles with the expression of 100-kyr eccentricity, obliquity, and precession. This study provides (for the first time) a precise duration of 2.4 Myr for the outcropping Sidi Khalif Formation, with a sedimentation rate of 5.4 cm/kyr, and of 3.24 Myr for the sub-area section, with a sedimentation rate of 7.64 cm/kyr. We outlined 27 fifth-order depositional sequences, 8 Milankovitch depositional sequences, and 2 major third-order cycles for the outcropping section, controlled by the long-eccentricity (405 kyr) cycles and the precession index cycles. This study has demonstrated the potential of MS and GR to be used as proxies to develop an astronomically calibrated time scale for the Mesozoic era.
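A quick consistency check of the durations and sedimentation rates quoted above (a sketch only; the implied section thicknesses are inferred, not reported in the abstract):

```python
# Duration (Myr) times sedimentation rate (cm/kyr) gives the implied section thickness.
def implied_thickness_m(duration_myr, rate_cm_per_kyr):
    return duration_myr * 1_000 * rate_cm_per_kyr / 100  # kyr * cm/kyr -> metres

print(implied_thickness_m(2.4, 5.4))    # outcropping section: ~129.6 m
print(implied_thickness_m(3.24, 7.64))  # sub-area section: ~247.5 m
```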

Keywords: Berriasian, magnetic susceptibility, orbital tuning, Sidi Khalif Formation

Procedia PDF Downloads 239
708 An Analytical Metric and Process for Critical Infrastructure Architecture System Availability Determination in Distributed Computing Environments under Infrastructure Attack

Authors: Vincent Andrew Cappellano

Abstract:

In the early phases of critical infrastructure system design, translating distributed computing requirements to an architecture carries risk given the multitude of approaches (e.g., cloud, edge, fog). In many systems, a single requirement for system uptime/availability is used to encompass the system’s intended operations. However, when architected, systems may meet those availability requirements only during normal operations and not during component failure, or during outages caused by adversary attacks on critical infrastructure (e.g., physical, cyber). System designers lack a structured method to evaluate availability requirements against candidate system architectures through deep degradation scenarios (i.e., from normal operations all the way down to significant damage of communications or physical nodes). This increases the risk of poor selection of a candidate architecture due to the absence of insight into true performance for systems that must operate as a piece of critical infrastructure. This research effort proposes a process to analyze critical infrastructure system availability requirements and a candidate set of system architectures, producing a metric assessing these architectures over a spectrum of degradations to aid in selecting appropriate resilient architectures. To accomplish this effort, a set of simulation and evaluation efforts are undertaken that will process, in an automated way, a set of sample requirements into a set of potential architectures where system functions and capabilities are distributed across nodes. Nodes and links will have specific characteristics and, based on sampled requirements, contribute to the overall system functionality, such that as they are impacted/degraded, the impacted functional availability of the system can be determined. A machine learning reinforcement-based agent will structurally impact the nodes, links, and characteristics (e.g., bandwidth, latency) of a given architecture to provide an assessment of system functional uptime/availability under these scenarios. By varying the intensity of the attack and related aspects, we can create a structured method of evaluating the performance of candidate architectures against each other to create a metric rating their resilience to these attack types/strategies. Through multiple simulation iterations, sufficient data will exist to compare this availability metric, and an architectural recommendation, against the baseline requirements and against existing multi-factor computing architectural selection processes. It is intended that this additional data will create an improvement in the matching of resilient critical infrastructure system requirements to the correct architectures and implementations that will support improved operation during times of system degradation due to failures and infrastructure attacks.
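The sketch below illustrates, under simplified assumptions, the kind of functional-availability metric the process aims to produce: a toy architecture graph with functions assigned to nodes, random node failures in place of the reinforcement-learning attack agent, and an availability value averaged over failure scenarios. Node names, topology, and the random-failure model are illustrative only.

```python
import random
from collections import deque

# Hypothetical architecture: node -> set of functions hosted, plus undirected links.
NODES = {"edge1": {"sense"}, "edge2": {"sense"}, "fog1": {"aggregate"},
         "cloud": {"analyze", "store"}, "gw": set()}
LINKS = {("edge1", "gw"), ("edge2", "gw"), ("gw", "fog1"), ("fog1", "cloud")}

def reachable(alive, start):
    """Nodes reachable from `start` over links between surviving nodes (BFS)."""
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for a, b in LINKS:
            for v in ((b,) if a == u else (a,) if b == u else ()):
                if v in alive and v not in seen:
                    seen.add(v)
                    queue.append(v)
    return seen

def functional_availability(alive, root="gw"):
    """Fraction of system functions still hosted on a node reachable from the root."""
    if root not in alive:
        return 0.0
    live = reachable(alive, root)
    all_funcs = set().union(*NODES.values())
    live_funcs = set().union(*(NODES[n] for n in live))
    return len(live_funcs & all_funcs) / len(all_funcs)

def availability_curve(trials=200, seed=1):
    """Average availability after k random node failures, k = 0..len(NODES)."""
    rng = random.Random(seed)
    curve = []
    for k in range(len(NODES) + 1):
        total = 0.0
        for _ in range(trials):
            dead = set(rng.sample(sorted(NODES), k))
            total += functional_availability(set(NODES) - dead)
        curve.append(total / trials)
    return curve

if __name__ == "__main__":
    print([round(a, 2) for a in availability_curve()])
```

The area under the resulting degradation curve is one natural candidate for the single resilience metric described above.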

Keywords: architecture, resiliency, availability, cyber-attack

Procedia PDF Downloads 74
707 Exploring Data Stewardship in Fog Networking Using Blockchain Algorithm

Authors: Ruvaitha Banu, Amaladhithyan Krishnamoorthy

Abstract:

IoT networks today solve various consumer problems, from home automation systems to aiding in driving autonomous vehicles, through the deployment of multiple devices. For example, in an autonomous vehicle environment, multiple sensors are available on roads to monitor weather and road conditions and interact with each other to aid the vehicle in reaching its destination safely and on time. IoT systems are predominantly dependent on the cloud environment for data storage and computing needs, which results in latency problems. With the advent of fog networks, some of this storage and computing is pushed to the edge/fog nodes, saving network bandwidth and reducing latency proportionally. Managing the data stored in these fog nodes becomes crucial, as they might also store sensitive information required for a certain application. Data management in fog nodes is strenuous because fog networks are dynamic in terms of their availability and hardware capability. It becomes more challenging when the nodes in the network also have short lifespans, detaching and joining frequently. When an end user or fog node wants to access, read, or write data stored in another fog node, a new protocol becomes necessary to access/manage the data stored in the fog devices, as a conventional static way of managing the data does not work in fog networks. The proposed solution discusses a protocol that acts by defining sensitivity levels for the data being written and read. Additionally, a distinct data distribution and replication model among the fog nodes is established to decentralize the access mechanism. In this paper, the proposed model implements stewardship of the data stored in the fog node using reinforcement learning, so that access to the data is determined dynamically based on the requests.
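A minimal sketch of the sensitivity-level idea, assuming a static policy for readability: each data item carries a sensitivity level, access is granted only when the requesting fog node's clearance meets it, and replication narrows as sensitivity rises. Node names, levels, and thresholds are hypothetical; in the proposed model the access decision is learned dynamically with reinforcement learning rather than fixed as here.

```python
from dataclasses import dataclass

# Illustrative sensitivity levels: higher means more sensitive data.
PUBLIC, INTERNAL, SENSITIVE = 0, 1, 2

@dataclass
class DataItem:
    key: str
    sensitivity: int  # PUBLIC / INTERNAL / SENSITIVE

@dataclass
class FogNode:
    name: str
    clearance: int    # highest sensitivity this node may read or store

def may_access(requester: FogNode, item: DataItem, write: bool = False) -> bool:
    """Static stand-in for the dynamic (RL-driven) access decision."""
    needed = item.sensitivity + (1 if write else 0)  # writes need one level more
    return requester.clearance >= min(needed, SENSITIVE)

def replication_targets(item: DataItem, nodes, max_copies: int = 3):
    """Replicate less-sensitive data more widely; sensitive data stays put."""
    copies = max(1, max_copies - item.sensitivity)
    eligible = [n for n in nodes if n.clearance >= item.sensitivity]
    return eligible[:copies]

if __name__ == "__main__":
    nodes = [FogNode("fog-a", SENSITIVE), FogNode("fog-b", INTERNAL), FogNode("fog-c", PUBLIC)]
    item = DataItem("road/segment42/ice-warning", INTERNAL)
    print(may_access(nodes[2], item))                       # False: clearance too low
    print([n.name for n in replication_targets(item, nodes)])
```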

Keywords: IoT, fog networks, data stewardship, dynamic access policy

Procedia PDF Downloads 32
706 On the Causes of Boko Haram Terrorism: Socio-Economic versus Religious Injunctions

Authors: Sogo Angel Olofinbiyi

Abstract:

There have been widespread assumptions across the globe that the root cause of Boko Haram terrorism in Nigeria is religious rather than socio-economic. An investigation into this dichotomy allowed this study to fully demonstrate that the root cause of Boko Haram’s terrorist actions emanates from the non-fulfillment of socio-economic goals, prompted by the violation of fundamental human rights, corruption, poverty, and unconstitutional and undemocratic practices in the northern part of the Nigerian state. To establish the root cause of the terrorism crisis in the country, the study critically appraised the socio-economic context of the insurgency by adopting one-on-one in-depth interviews involving forty (40) participants to interrogate the phenomenon. Empirical evidence from the study demonstrated that the evolution of Boko Haram terrorism was a response to socio-economic phlebotomy, political and moral putrescence, and the dehumanization of people, stemming from a combination of decades of mismanagement and pervasive corruption by various Nigerian leaders. The study concludes that, as long as the endemic socio-economic problems caused by global capitalism vis-à-vis unequal hegemonic power exchange, as expressed in socio-political, ethno-religious, and cultural forms, persist in Nigerian society, the terrorist insurgency will recur and remain an inevitable enterprise and indeed a normal social reaction to every undesirable state of affairs. Based on the findings, the study urges the amelioration of the conditions of the vast majority of the Nigerian populace by making socio-economic facilities available to them through the political state.

Keywords: Boko Haram Terrorism, insurgency, socio-economic, religious injunctions

Procedia PDF Downloads 160
705 Breaking Barriers: Utilizing Innovation to Improve Educational Outcomes for Students with Disabilities

Authors: Emily Purdom, Rachel Robinson

Abstract:

As the number of students worldwide requiring speech-language therapy, occupational therapy, and mental health services during their school day increases, innovation is becoming progressively more important to meet the demand. Telepractice can be used to reach a greater number of students requiring specialized therapy while maintaining the highest quality of care. It can be provided in a way that is not only effective but ultimately more convenient for the student, teacher, and therapist, without the added burden of travel. Teletherapy eradicates many hurdles of traditional on-site service delivery and helps to solve the pervasive shortage of certified professionals. Because location is no longer a barrier to specialized education plans for students with disabilities when teletherapy is conducted, many advantages can be realized. Increased frequency of engagement is possible, along with students receiving specialized care from a clinician who may not be in their immediate area. Educational teams, including parents, can work together more easily and engage in face-to-face, student-centered collaboration through videoconference. Practical strategies will be provided for connecting students with qualified therapists without the typical in-person dynamic. In most cases, better therapy outcomes are achieved when treatment is most convenient for the student and educator. This workshop will promote discussion in the field of education to increase advocacy for remote service delivery. It will serve as a resource for those wanting to expand their knowledge of options for students with special needs afforded through innovation.

Keywords: education technology, innovation, student support services, telepractice

Procedia PDF Downloads 218
704 High Performance Computing Enhancement of Agent-Based Economic Models

Authors: Amit Gill, Lalith Wijerathne, Sebastian Poledna

Abstract:

This research presents the details of the implementation of a high performance computing (HPC) extension of agent-based economic models (ABEMs) to simulate hundreds of millions of heterogeneous agents. ABEMs offer an alternative approach to studying the economy as a dynamic system of interacting heterogeneous agents, and are gaining popularity as an alternative to standard economic models. Over the last decade, ABEMs have been increasingly applied to study various problems related to monetary policy, bank regulations, etc. When it comes to predicting the effects of local economic disruptions, like major disasters, changes in policies, exogenous shocks, etc., on the economy of a country or region, it is pertinent to study how the disruptions cascade through every single economic entity, affecting its decisions and interactions, and eventually affect the macroeconomic parameters. However, such simulations with hundreds of millions of agents are hindered by the lack of HPC-enhanced ABEMs. In order to address this, a scalable Distributed Memory Parallel (DMP) implementation of ABEMs has been developed using the Message Passing Interface (MPI). A balanced distribution of computational load among MPI processes (i.e., CPU cores) of computer clusters, while taking all the interactions among agents into account, is a major challenge for scalable DMP implementations. Economic agents interact on several random graphs, some of which are centralized (e.g., credit networks), whereas others are dense with random links (e.g., consumption markets). The agents are partitioned into mutually exclusive subsets based on a representative employer-employee interaction graph, while the remaining graphs are made available at a minimum communication cost. To minimize the number of communications among MPI processes, real-life solutions like the introduction of recruitment agencies, sales outlets, local banks, and local branches of government in each MPI process are adopted. Efficient communication among MPI processes is achieved by combining MPI derived data types with the new features of the latest MPI functions. Most of the communications are overlapped with computations, thereby significantly reducing the communication overhead. The current implementation is capable of simulating a small open economy. As an example, a single time step of a 1:1 scale model of Austria (i.e., about 9 million inhabitants and 600,000 businesses) can be simulated in 15 seconds. The implementation is further being enhanced to simulate a 1:1 model of the Euro zone (i.e., 322 million agents).
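A minimal mpi4py sketch of the distribution idea described above, under simplifying assumptions: agents are partitioned across MPI processes, each rank simulates its local agents, and only aggregate quantities are exchanged per step, standing in for the coarse-grained constructs (sales outlets, local banks, recruitment agencies) that keep cross-rank messaging low. The agent behaviour and the numbers are placeholders, not the paper's model.

```python
# Run with e.g.: mpiexec -n 4 python abem_sketch.py   (requires mpi4py and an MPI runtime)
from mpi4py import MPI
import random

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_AGENTS = 1_000_000                                     # global agent count (illustrative)
local_n = N_AGENTS // size + (rank < N_AGENTS % size)    # balanced partition across ranks

rng = random.Random(rank)                                # each rank owns a disjoint agent subset
incomes = [rng.uniform(1_000, 5_000) for _ in range(local_n)]

def step(incomes):
    # Each agent spends a random fraction of income at the local "sales outlet",
    # so the consumption market needs no per-agent cross-rank messages.
    return sum(inc * rng.uniform(0.3, 0.9) for inc in incomes)

local_demand = step(incomes)

# One collective per time step aggregates demand; this stands in for the
# coarse-grained exchanges described above.
global_demand = comm.allreduce(local_demand, op=MPI.SUM)

if rank == 0:
    print(f"{size} ranks, {N_AGENTS} agents, aggregate demand ~ {global_demand:,.0f}")
```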

Keywords: agent-based economic model, high performance computing, MPI-communication, MPI-process

Procedia PDF Downloads 106
703 Educators’ Adherence to Learning Theories and Their Perceptions on the Advantages and Disadvantages of E-Learning

Authors: Samson T. Obafemi, Seraphin D. Eyono-Obono

Abstract:

Information and Communication Technologies (ICTs) are pervasive nowadays, including in education, where they are expected to improve the performance of learners. However, the hope placed in ICTs to find viable solutions to the problem of poor academic performance in schools in the developing world has not yet yielded the expected benefits. This problem serves as the motivation for this study, whose aim is to examine the perceptions of educators on the advantages and disadvantages of e-learning. This aim is subdivided into two types of research objectives. Objectives on the identification and design of theories and models are achieved using content analysis and literature review. However, the objective on the empirical testing of such theories and models is achieved through a survey of educators from different schools in the Pinetown District of the South African KwaZulu-Natal province. SPSS is used to quantitatively analyse the data collected by the questionnaire of this survey, using descriptive statistics and Pearson correlations, after assessing the validity and the reliability of the data. The main hypothesis driving this study is that there is a relationship between the demographics of educators and their adherence to learning theories on one side, and their perceptions of the advantages and disadvantages of e-learning on the other side, as argued by existing research; but this research views these learning theories from three perspectives: educators’ adherence to self-regulated learning, to constructivism, and to progressivism. This hypothesis was fully confirmed by the empirical study, except for the demographic factor, where teachers’ level of education was found to be the only demographic factor affecting the perceptions of educators on the advantages and disadvantages of e-learning.

Keywords: academic performance, e-learning, learning theories, teaching and learning

Procedia PDF Downloads 258
702 Local Homology Modules

Authors: Fatemeh Mohammadi Aghjeh Mashhad

Abstract:

In this paper, we give several ways for computing generalized local homology modules by using Gorenstein flat resolutions. Also, we find some bounds for vanishing of generalized local homology modules.

Keywords: a-adic completion functor, generalized local homology modules, Gorenstein flat modules

Procedia PDF Downloads 386
701 Heat Transfer and Diffusion Modelling

Authors: R. Whalley

Abstract:

The heat transfer modelling for a diffusion process will be considered. Difficulties in computing the time-distance dynamics of the representation will be addressed. Incomplete and irrational Laplace functions will be identified as the computational issue. Alternative approaches to the response evaluation process will be provided. An illustrative application problem will be presented. Graphical results confirming the theoretical procedures employed will be provided.

Keywords: heat, transfer, diffusion, modelling, computation

Procedia PDF Downloads 527
700 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations

Authors: Deepak Singh, Rail Kuliev

Abstract:

The abstract highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, “reskilling and upskilling” employees, and establishing robust data management training programs play an essential and integral part in this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), and artificial intelligence (AI) and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. In the present study, by embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in the ever-evolving industry.

Keywords: master data management, IoT, AI&ML, cloud computing, data optimization

Procedia PDF Downloads 42
699 Systematic Discovery of Bacterial Toxins Against Plant Pathogenic Fungi

Authors: Yaara Oppenheimer-Shaanan, Nimrod Nachmias, Marina Campos Rocha, Neta Schlezinger, Noam Dotan, Asaf Levy

Abstract:

Fusarium oxysporum, a fungus that attacks a broad range of plants and can cause infections in humans, operates across different kingdoms. This pathogen encounters varied conditions, such as temperature, pH, and nutrient availability, in plant and human hosts. The Fusarium oxysporum species complex, pervasive in soils globally, can affect numerous plants, including key crops like tomatoes and bananas. Controlling Fusarium infections can involve biocontrol agents that hinder the growth of harmful strains. Our research developed a computational method to identify toxin domains within a vast number of microbial genomes, leading to the discovery of nine distinct toxins capable of killing bacteria and fungi, including Fusarium. These toxins appear to function as enzymes, causing significant damage to cellular structures, membranes and DNA. We explored biological control using bacteria that produce polymorphic toxins, finding that certain bacteria, non-pathogenic to plants, offer a safe biological alternative for Fusarium management, as they did not harm macrophage cells or C. elegans. Additionally, we elucidated the 3D structures of two toxins with their protective immunity proteins, revealing their function as unique DNases. These potent toxins are likely instrumental in microbial competition within plant ecosystems and could serve as biocontrol agents to mitigate Fusarium wilt and related diseases.

Keywords: microbial toxins, antifungal, Fusarium oxysporum, bacterial-fungal interactions

Procedia PDF Downloads 23
698 Detailed Quantum Circuit Design and Evaluation of Grover's Algorithm for the Bounded Degree Traveling Salesman Problem Using the Q# Language

Authors: Wenjun Hou, Marek Perkowski

Abstract:

The Traveling Salesman Problem is famous in computing and graph theory. In short, it asks for the Hamiltonian cycle of the least total weight in a given graph with N nodes. All variations on this problem, such as those with K-bounded-degree nodes, are classified as NP-complete in classical computing. Although several papers propose theoretical high-level designs of quantum algorithms for the Traveling Salesman Problem, no quantum circuit implementation of these algorithms has been created, to the best of our knowledge. In contrast to previous papers, the goal of this paper is not to optimize some abstract complexity measures based on the number of oracle iterations, but to be able to evaluate the real circuit and time costs of the quantum computer. Using the emerging quantum programming language Q# developed by Microsoft, which runs quantum circuits in a quantum computer simulation, an implementation of the bounded-degree problem and its respective quantum circuit were created. To apply Grover’s algorithm to this problem, a quantum oracle was designed, evaluating the cost of a particular set of edges in the graph as well as its validity as a Hamiltonian cycle. Repeating the Grover algorithm with an oracle that finds a successively lower cost each time allows the decision problem to be transformed into an optimization problem, finding the minimum cost of Hamiltonian cycles. N log₂ K qubits are put into an equiprobable superposition by applying the Hadamard gate on each qubit. Within these N log₂ K qubits, the method uses an encoding in which every node is mapped to a set of its encoded edges. The oracle consists of several blocks of circuits: a custom-written edge weight adder, node index calculator, uniqueness checker, and comparator, which were all created using only quantum Toffoli gates, including their special forms, the Feynman and Pauli X gates. The oracle begins by using the edge encodings specified by the qubits to calculate each node that the path visits, adding up the edge weights along the way. Next, the oracle uses the calculated nodes from the previous step and checks that all the nodes are unique. Finally, the oracle checks that the calculated cost is less than the previously calculated cost. By performing the oracle an optimal number of times, a correct answer can be generated with very high probability. The oracle of the Grover algorithm is modified using the recalculated minimum cost value, and this procedure is repeated until the cost cannot be further reduced. This algorithm and circuit design have been verified, using several datasets, to generate correct outputs.
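As a classical reference for the oracle logic described above (a sketch in Python rather than Q#, with an illustrative four-node, degree-2 graph), the predicate below mirrors the oracle's blocks: it follows the encoded edge choices, adds up the edge weights, checks node uniqueness, and compares the cost against the current threshold. The reversible, Toffoli-based circuit realization is the contribution of the work itself and is not reproduced here.

```python
# Classical reference for the oracle's predicate: "is this edge selection a valid
# Hamiltonian cycle with total weight below the current threshold?"
# Bounded-degree encoding: each node stores the index (log2 K bits) of the
# outgoing edge chosen among its K incident edges.

GRAPH = {  # node -> list of (neighbour, weight); degree bounded by K = 2 here
    0: [(1, 4), (3, 3)],
    1: [(2, 2), (0, 4)],
    2: [(3, 5), (1, 2)],
    3: [(0, 3), (2, 5)],
}

def oracle_predicate(edge_choices, threshold):
    """edge_choices[i] = index of the edge taken out of node i (the qubit register)."""
    visited, cost, node = [], 0, 0
    for _ in GRAPH:                                   # follow exactly N hops from node 0
        visited.append(node)
        nxt, w = GRAPH[node][edge_choices[node]]      # "node index calculator" + "edge weight adder"
        cost += w
        node = nxt
    unique = len(set(visited)) == len(GRAPH)          # "uniqueness checker"
    closes = node == 0                                # cycle returns to the start
    return unique and closes and cost < threshold     # "comparator"

if __name__ == "__main__":
    # Grover would mark all satisfying choices in superposition; classically we
    # just enumerate the 2^(N log2 K) = 16 possibilities, lowering the threshold.
    from itertools import product
    best = float("inf")
    for choice in product(range(2), repeat=len(GRAPH)):
        if oracle_predicate(list(choice), best):
            best = sum(GRAPH[n][choice[n]][1] for n in range(len(GRAPH)))
    print("minimum Hamiltonian cycle cost:", best)
```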

Keywords: quantum computing, quantum circuit optimization, quantum algorithms, hybrid quantum algorithms, quantum programming, Grover’s algorithm, traveling salesman problem, bounded-degree TSP, minimal cost, Q# language

Procedia PDF Downloads 156
697 Methods for Solving Identification Problems

Authors: Fadi Awawdeh

Abstract:

In this work, we highlight the key concepts in using semigroup theory as a methodology used to construct efficient formulas for solving inverse problems. The proposed method depends on some results concerning integral equations. The experimental results show the potential and limitations of the method and imply directions for future work.

Keywords: identification problems, semigroup theory, methods for inverse problems, scientific computing

Procedia PDF Downloads 456
696 The State and Poverty Reduction Strategy in Nigeria: An Assessment

Authors: Musa Ogah Ari

Abstract:

Poverty has engaged the attention of the global community. Both rich and poor countries are concerned about its prevalence and impacts. This phenomenon is more pervasive among developing countries, with the greater challenges manifesting among African countries. In Nigeria, people live on very low incomes, so decent three-square meals, clothing, shelter, and other basic necessities are very difficult to come by for most of the population. Quality health facilities are seriously lacking for the over 160 million population of the state. Equally lacking are educational and social infrastructures that can be made available to the people at affordable rates. Roads linking the interior parts of the state are generally in deplorable condition, particularly in the rainy season. Safe drinking water is hard to come by, as the state is not properly placed and equipped to function at full capacity in the interest of the people. The challenge of poverty is therefore enormous for both the national and state governments, as is its debilitating scourge. Although the ruling elites in Nigeria claim to reduce the rising profile of poverty through a series of policies and programmes, food production, the promotion and funding of co-operatives for agriculture, the improvement of infrastructure in rural areas, guaranteed employment through skill acquisition, assistance to rural women to break away from poverty, and the provision of small-scale credit facilities to poor members of the public have all been abysmally low. It is observed that the poverty alleviation programmes and policies failed because they were, by nature, character, and implementation, pro-elite and anti-masses. None of the programmes or policies engaged the rural poor, either in terms of formulation or implementation.

Keywords: the state, poverty, government policies, strategies, social amenities, corruption

Procedia PDF Downloads 325
695 Internet of Things, Edge and Cloud Computing in Rock Mechanical Investigation for Underground Surveys

Authors: Esmael Makarian, Ayub Elyasi, Fatemeh Saberi, Olusegun Stanley Tomomewo

Abstract:

Rock mechanical investigation is one of the most crucial activities in underground operations, especially in surveys related to hydrocarbon exploration and production, geothermal reservoirs, energy storage, mining, and geotechnics. There is a wide range of traditional methods for driving, collecting, and analyzing rock mechanics data. However, these approaches may not be suitable or work perfectly in some situations, such as fractured zones. Cutting-edge technologies have been provided to solve and optimize the mentioned issues. Internet of Things (IoT), Edge, and Cloud Computing technologies (ECt & CCt, respectively) are among the most widely used and new artificial intelligence methods employed for geomechanical studies. IoT devices act as sensors and cameras for real-time monitoring and mechanical-geological data collection of rocks, such as temperature, movement, pressure, or stress levels. Structural integrity, especially for cap rocks within hydrocarbon systems, and rock mass behavior assessment, to further activities such as enhanced oil recovery (EOR) and underground gas storage (UGS), or to improve safety risk management (SRM) and potential hazards identification (P.H.I), are other benefits from IoT technologies. EC techniques can process, aggregate, and analyze data immediately collected by IoT on a real-time scale, providing detailed insights into the behavior of rocks in various situations (e.g., stress, temperature, and pressure), establishing patterns quickly, and detecting trends. Therefore, this state-of-the-art and useful technology can adopt autonomous systems in rock mechanical surveys, such as drilling and production (in hydrocarbon wells) or excavation (in mining and geotechnics industries). Besides, ECt allows all rock-related operations to be controlled remotely and enables operators to apply changes or make adjustments. It must be mentioned that this feature is very important in environmental goals. More often than not, rock mechanical studies consist of different data, such as laboratory tests, field operations, and indirect information like seismic or well-logging data. CCt provides a useful platform for storing and managing a great deal of volume and different information, which can be very useful in fractured zones. Additionally, CCt supplies powerful tools for predicting, modeling, and simulating rock mechanical information, especially in fractured zones within vast areas. Also, it is a suitable source for sharing extensive information on rock mechanics, such as the direction and size of fractures in a large oil field or mine. The comprehensive review findings demonstrate that digital transformation through integrated IoT, Edge, and Cloud solutions is revolutionizing traditional rock mechanical investigation. These advanced technologies have empowered real-time monitoring, predictive analysis, and data-driven decision-making, culminating in noteworthy enhancements in safety, efficiency, and sustainability. Therefore, by employing IoT, CCt, and ECt, underground operations have experienced a significant boost, allowing for timely and informed actions using real-time data insights. The successful implementation of IoT, CCt, and ECt has led to optimized and safer operations, optimized processes, and environmentally conscious approaches in underground geological endeavors.

Keywords: rock mechanical studies, internet of things, edge computing, cloud computing, underground surveys, geological operations

Procedia PDF Downloads 34
694 ABET Accreditation Process for Engineering and Technology Programs: Detailed Process Flow from Criteria 1 to Criteria 8

Authors: Amit Kumar, Rajdeep Chakrabarty, Ganesh Gupta

Abstract:

This paper illustrates the detailed accreditation process of the Accreditation Board for Engineering and Technology (ABET) for accrediting engineering and technology programs. ABET is a non-governmental agency that accredits engineering and technology, applied and natural sciences, and computing sciences programs. ABET was founded on 10 May 1932 by the Institute of Electrical and Electronics Engineers. International industries accept ABET-accredited institutes as having the highest standards in their academic programs. In this accreditation, there are eight criteria in general: criterion 1 describes the student outcome evaluations; criterion 2 measures the program's educational objectives; criterion 3 is the student outcome calculated from the marks obtained by students; criterion 4 establishes continuous improvement; criterion 5 focuses on the curriculum of the institute; criterion 6 concerns the faculty of the institute; criterion 7 measures the facilities provided by the institute; and finally, criterion 8 focuses on institutional support for the staff of the institute. In this paper, we focus on the calculative part of each criterion, with equations and suitable examples, the files and documentation required for each criterion, and the total workflow of the process. The references and the values used to illustrate the calculations are all taken from the samples provided on ABET's official website. In the final section, we also discuss the criterion-wise score weightage, followed by evaluation with timeframes and deadlines.
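The sketch below shows the general shape of the outcome calculations referred to above, assuming a locally defined rubric: a course-outcome attainment level derived from marks against a pass threshold and target fraction, then mapped onto program outcomes through a weighted CO-PO matrix. The thresholds, weights, and mapping are illustrative assumptions, not values prescribed by ABET or taken from the paper.

```python
# Illustrative attainment calculation: the pass mark (60%) and the target
# (70% of students reaching it) are local policy choices, not ABET-mandated numbers.
MARKS = {"CO1": [72, 65, 58, 81, 90, 55], "CO2": [61, 40, 77, 68, 59, 83]}
CO_PO_MAP = {"CO1": {"PO1": 3, "PO2": 1}, "CO2": {"PO1": 2, "PO3": 3}}  # weights 1-3

def co_attainment(marks, pass_mark=60.0, target_fraction=0.70):
    """Fraction of students above the pass mark, scaled to a 0-3 attainment level."""
    frac = sum(m >= pass_mark for m in marks) / len(marks)
    return round(3 * min(frac / target_fraction, 1.0), 2)

def po_attainment(co_levels, co_po_map):
    """Weighted average of course-outcome levels mapped onto each program outcome."""
    po_scores = {}
    for co, pos in co_po_map.items():
        for po, weight in pos.items():
            num, den = po_scores.get(po, (0.0, 0.0))
            po_scores[po] = (num + weight * co_levels[co], den + weight)
    return {po: round(num / den, 2) for po, (num, den) in po_scores.items()}

if __name__ == "__main__":
    co_levels = {co: co_attainment(m) for co, m in MARKS.items()}
    print("CO attainment:", co_levels)
    print("PO attainment:", po_attainment(co_levels, CO_PO_MAP))
```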

Keywords: Engineering Accreditation Committee, Computing Accreditation Committee, performance indicator, Program Educational Objective, ABET Criterion 1 to 7, IEEE, National Board of Accreditation, MOOCS, Board of Studies, stakeholders, course objective, program outcome, articulation, attainment, CO-PO mapping, CO-PO-SO mapping, PDCA cycle, degree certificates, course files, course catalogue

Procedia PDF Downloads 38
693 The Triple Threat: Microplastic, Nanoplastic, and Macroplastic Pollution and Their Cumulative Impacts on Marine Ecosystem

Authors: Tabugbo B. Ifeyinwa, Josephat O. Ogbuagu, Okeke A. Princewill, Victor C. Eze

Abstract:

The increasing amount of plastic pollution in maritime settings poses a substantial risk to the functioning of ecosystems and the preservation of biodiversity. This comprehensive analysis combines the most recent data on the environmental effects of pollution from macroplastics, microplastics, and nanoplastics within marine ecosystems. Our goal is to provide a comprehensive understanding of the cumulative impacts of accumulating plastic waste on marine life by outlining the origins, processes, and ecological repercussions connected with each size category of plastic debris. Compared to macroplastics, which primarily contribute to physical harm through entanglement and ingestion by marine fauna, microplastics and nanoplastics have more insidious, chemically mediated effects that can pass through biological barriers and affect the health of cells and whole organisms. The review underlines a vital need for research that crosses disciplinary boundaries to untangle the intricate interactions that the various sizes of plastic pollution have with marine animals, evaluate the long-term ecological repercussions, and identify effective measures for mitigating plastic pollution. Additionally, we urge governmental interventions and worldwide cooperation to address this pervasive environmental concern. Specifically, we identify significant knowledge gaps in the detection and impact assessment of nanoplastics. To protect marine biodiversity and preserve ecosystem services, this review highlights how urgent it is to address the broad spectrum of plastic pollution.

Keywords: macroplastic pollution, marine ecosystem, microplastic pollution, nanoplastic pollution

Procedia PDF Downloads 32
692 Development of Mobile Application for Internship Program Management Using the Concept of Model View Controller (MVC) Pattern

Authors: Shutchapol Chopvitayakun

Abstract:

Over the last five years in particular, mobile devices, mobile applications, and mobile users, supported by the deployment of wireless communication and cellular networks, have all grown significantly in number and capability. They are being integrated with each other for multiple purposes and are pervasively deployed in every business and non-business sector, such as education, medicine, travel, finance, real estate, and many more. The objective of this study was to develop a mobile application for seniors, or final-year students, who enroll in the internship program at a tertiary (undergraduate) school and do onsite practice at real field sites, real organizations, and real workplaces. During the internship session, all students, as interns, are required to exercise, drill, and train onsite at specific locations and on specific tasks, possibly with some assignments from their supervisors. Their workplaces include both private and government corporations and enterprises. This mobile application is developed under the schema of a transactional processing system that enables users to keep a daily work or practice log, monitor actual working locations, and follow the daily tasks of each trainee. Moreover, it provides useful guidance from each intern’s advisor in case of emergency. Finally, it can summarize all transactional data and then calculate each intern’s cumulative hours from the field practice session.

Keywords: internship, mobile application, Android OS, smart phone devices, mobile transactional processing system, guidance and monitoring, tertiary education, senior students, model view controller (MVC)

Procedia PDF Downloads 286
691 Estimating Heavy Metal Leakage and Environmental Damage from Cigarette Butt Disposal in Urban Areas through CBPI Evaluation

Authors: Muhammad Faisal, Zai-Jin You, Muhammad Naeem

Abstract:

Concerns about the environment, public health, and the economy are raised by the fact that the world produces around 6 trillion cigarettes annually. Arguably the most pervasive form of environmental litter, this dangerous waste must be eliminated. The researchers set out to estimate how much pollution is seeping out of cigarette butts in metropolitan areas by studying their distribution and concentration. To this end, the cigarette butt pollution index (CBPI) was applied in 29 different areas. The locations were monitored monthly for a full calendar year. The conditions for conducting the investigation of the venues were the same on weekends and on weekdays. Using the average metal leakage ratio under various climatic conditions and the average weight of a cigarette butt, we were able to estimate the total heavy metal leakage. The findings revealed that the annual average value of the index for the areas investigated ranged from 1.38 to 10.4. According to these numbers, just 27.5% of the areas had a low pollution rating, while 43.5% had a major pollution status or worse. Weekends saw the largest fall (31% on average) in all locations' indices, while spring and summer saw the largest increase (26% on average) compared to autumn and winter. It was calculated that the average amount of heavy metals such as Cr, Cu, Cd, Zn, and Pb that seep into the environment from discarded cigarette butts in commercial, residential, and park areas is 0.25 µg/m², 0.078 µg/m², and 0.18 µg/m², respectively. Cigarette butts are among the most prevalent forms of litter in the areas examined. This litter is the source of a wide variety of contaminants, including heavy metals, and poses a significant risk to the city.
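The estimation step described above amounts to multiplying an area's butt density by the average butt weight and each metal's leakage ratio; the sketch below shows that arithmetic with placeholder numbers (the densities, butt weight, and leakage ratios are assumptions, not the study's measured values).

```python
# Illustrative arithmetic only: all input values are placeholders.
AVG_BUTT_WEIGHT_G = 0.23                       # grams per discarded butt (assumed)
BUTT_DENSITY_PER_M2 = {"commercial": 0.9, "residential": 0.3, "park": 0.6}
LEAKAGE_UG_PER_G = {"Cr": 0.4, "Cu": 0.5, "Cd": 0.1, "Zn": 0.9, "Pb": 0.3}  # assumed

def leakage_flux(area_type):
    """Heavy-metal leakage in micrograms per square metre for one area type."""
    butt_mass = BUTT_DENSITY_PER_M2[area_type] * AVG_BUTT_WEIGHT_G   # g of butts per m^2
    return {metal: round(ratio * butt_mass, 3) for metal, ratio in LEAKAGE_UG_PER_G.items()}

if __name__ == "__main__":
    for area in BUTT_DENSITY_PER_M2:
        flux = leakage_flux(area)
        print(area, flux, "total:", round(sum(flux.values()), 3), "µg/m²")
```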

Keywords: heavy metal, hazardous waste, waste management, litter

Procedia PDF Downloads 55
690 A Rationale to Describe Ambident Reactivity

Authors: David Ryan, Martin Breugst, Turlough Downes, Peter A. Byrne, Gerard P. McGlacken

Abstract:

An ambident nucleophile is a nucleophile that possesses two or more distinct nucleophilic sites that are linked through resonance and are effectively “in competition” for reaction with an electrophile. Examples include enolates, pyridone anions, and nitrite anions, among many others. Reactions of ambident nucleophiles and electrophiles are extremely prevalent at all levels of organic synthesis. The principle of hard and soft acids and bases (the “HSAB principle”) is most commonly cited in the explanation of selectivities in such reactions. Although this rationale is pervasive in any discussion on ambident reactivity, the HSAB principle has received considerable criticism. As a result, the principle’s supplantation has become an area of active interest in recent years. This project focuses on developing a model for rationalizing ambident reactivity. Presented here is an approach that incorporates computational calculations and experimental kinetic data to construct Gibbs energy profile diagrams. The preferred site of alkylation of nitrite anion with a range of ‘hard’ and ‘soft’ alkylating agents was established by ¹H NMR spectroscopy. Pseudo-first-order rate constants were measured directly by ¹H NMR reaction monitoring, and the corresponding second-order constants and Gibbs energies of activation were derived. These, in combination with computationally derived standard Gibbs energies of reaction, were sufficient to construct Gibbs energy wells. By representing the ambident system as a series of overlapping Gibbs energy wells, a more intuitive picture of ambident reactivity emerges. Here, previously unexplained switches in reactivity in reactions involving closely related electrophiles are elucidated.
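The step from a measured second-order rate constant to a Gibbs energy of activation is normally taken through the Eyring equation, ΔG‡ = RT ln(kB·T / (h·k)). A short sketch with an assumed rate constant and temperature, since the measured values are not given in the abstract:

```python
import math

# Physical constants (SI units)
R  = 8.314462618      # J mol^-1 K^-1
KB = 1.380649e-23     # J K^-1
H  = 6.62607015e-34   # J s

def gibbs_activation(k, T=298.15):
    """Eyring equation: Delta G‡ = R T ln(kB T / (h k)); for a second-order k
    (M^-1 s^-1) this implicitly assumes a 1 M standard state."""
    return R * T * math.log(KB * T / (H * k))

if __name__ == "__main__":
    k2 = 3.0e-4        # M^-1 s^-1, assumed illustrative value, not from the study
    print(f"ΔG‡ ≈ {gibbs_activation(k2) / 1000:.1f} kJ/mol at 298 K")
```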

Keywords: ambident, Gibbs, nucleophile, rates

Procedia PDF Downloads 58
689 Porphyry Cu-Mo-(Au) Mineralization at Paraga Area, Nakhchivan District, Azerbaijan: Evidence from Mineral Paragenesis, Hydrothermal Alteration and Geochemical Studies

Authors: M. Kumral, A. Abdelnasser, M. Budakoglu, M. Karaman, D. K. Yildirim, Z. Doner, A. Bostanci

Abstract:

The Paraga area is located in the extreme eastern part of the Nakhchivan district, at the boundary with Armenia. The field study is situated in the Ordubad region, 9 km from Paraga village, at an elevation of 2300-2800 m above sea level. It lies within a region of low-grade metamorphic porphyritic volcanic and plutonic rocks. The detailed field studies revealed that this area is composed mainly of metagabbro-diorite intrusive rocks of porphyritic character emplaced into meta-andesitic rocks. This complex is later intruded by unmapped olivine gabbroic rocks. The Cu-Mo-(Au) mineralization at the Paraga deposit is vein-type mineralization that is essentially related to a stockwork of quartz veins which cut the dioritic rocks and is concentrated in the eastern and northeastern parts of the area, with different orientations: N80W, N25W, N70E, and N45E. This mineralization is also associated with two shear zones oriented N75W and N15E. The host porphyritic rocks were affected by intense sulfidation, carbonatization, sericitization, and silicification with pervasive hematitic alteration, accompanied by mineralized quartz veins and quartz-carbonate veins. The sulfide minerals, which are chalcopyrite, pyrite, arsenopyrite, sphalerite, and molybdenite, occur either within these mineralized quartz veins, disseminated in the highly altered rocks, or at the peripheries between the altered host rock and the veins. Gold is found as inclusions disseminated in arsenopyrite and pyrite, as well as in their cracks.

Keywords: porphyry Cu-Mo-(Au), Paraga area, Nakhchivan, Azerbaijan, paragenesis, hydrothermal alteration

Procedia PDF Downloads 380
688 Research and Implementation of Cross-domain Data Sharing System in Net-centric Environment

Authors: Xiaoqing Wang, Jianjian Zong, Li Li, Yanxing Zheng, Jinrong Tong, Mao Zhan

Abstract:

With the rapid development of network and communication technology, a great deal of data has been generated in different domains of a network. These data show a trend of increasing scale and more complex structure. Therefore, an effective and flexible cross-domain data-sharing system is needed. The Cross-domain Data Sharing System (CDSS) in a net-centric environment is composed of three sub-systems. The data distribution sub-system provides a data exchange service through publish-subscribe technology that supports asynchronism and multi-to-multi communication, which adapts to the needs of a dynamic and large-scale distributed computing environment. The access control sub-system adopts Attribute-Based Access Control (ABAC) technology to uniformly model various data attributes such as subject, object, permission, and environment, which effectively monitors the activities of users accessing resources and ensures that legitimate users get effective access control rights within a legal time. The cross-domain access security negotiation sub-system automatically determines the access rights between different security domains in the process of interactive disclosure of digital certificates and access control policies, through trust policy management and negotiation algorithms, which provides an effective means for cross-domain trust relationship establishment and access control in a distributed environment. The CDSS’s asynchronous, multi-to-multi, and loosely coupled communication features can adapt well to data exchange and sharing in dynamic, distributed, and large-scale network environments. Next, we will give CDSS new features to support the mobile computing environment.
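A minimal sketch of the ABAC decision described above: a request carries subject, object, action, and environment attributes, and access is granted only if every predicate of the applicable policy holds. The attribute names and the example policy are illustrative assumptions, not the system's actual policy model.

```python
from datetime import datetime, timezone

# One illustrative ABAC policy: predicates over subject, object, action, environment.
POLICY = {
    "description": "Analysts may read data shared across domains during its validity window",
    "rules": [
        lambda s, o, a, e: s.get("role") == "analyst",
        lambda s, o, a, e: s.get("domain") == o.get("owner_domain") or o.get("shared") is True,
        lambda s, o, a, e: a == "read",
        lambda s, o, a, e: e["time"] < o["valid_until"],   # access only "within a legal time"
    ],
}

def abac_decision(subject, obj, action, environment, policy=POLICY):
    """Grant only if every attribute predicate of the policy holds."""
    return all(rule(subject, obj, action, environment) for rule in policy["rules"])

if __name__ == "__main__":
    subject = {"role": "analyst", "domain": "domain-B"}
    obj = {"owner_domain": "domain-A", "shared": True,
           "valid_until": datetime(2030, 1, 1, tzinfo=timezone.utc)}
    env = {"time": datetime.now(timezone.utc)}
    print(abac_decision(subject, obj, "read", env))   # True under this example policy
```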

Keywords: data sharing, cross-domain, data exchange, publish-subscribe

Procedia PDF Downloads 103
687 Lying Decreases Relying: Deceiver's Distrust in Online Restaurant Reviews

Authors: Jenna Barriault, Reeshma Haji

Abstract:

Online consumer behaviour and reliance on online reviews may be more pervasive than ever, and this necessitates a better scientific understanding of the widespread phenomenon of online deception. The present research focuses on the understudied topic of deceiver’s distrust, where those who engage in deception later have less trust in others, in the context of online restaurant reviews. The purpose was to examine deception and valence in online restaurant reviews and the effects they had on deceiver’s distrust. Undergraduate university students (N = 76) completed an online study where valence was manipulated by telling participants that either positive or negative reviews were influential and asking them to write a correspondingly valenced review. Deception was manipulated in the same task. Participants in the deception condition were asked to write an online restaurant review that ran counter to their actual experience of the restaurant (a negative review of a restaurant they liked, a positive review of a restaurant they did not like). In the no-deception condition, participants were asked to write a review reflecting what they actually liked or did not like (based on the valence condition to which they were randomly assigned). Participants’ trust was then assessed through various measures, including future reliance on online reviews. There was a main effect of deception on reliance on online reviews. Consistent with deceiver’s distrust, those who deceived reported that they would rely less on online reviews. This study demonstrates that even when participants are induced to write a deceptive review, it can result in deceiver’s distrust, thereby lowering their trust in online reviews. If trust or reliance can be altered through deception in online reviews, people may start questioning the objectivity or true representation of a company based on such reviews. A primary implication is that people may reduce their reliance upon online reviews if they know such reviews are easily subject to manipulation. The findings of this study also contribute to the limited research regarding deceiver’s distrust in an online context, and further research is needed to clarify the specific conditions in which it is most likely to occur.

Keywords: deceiver’s distrust, deception, online reviews, trust, valence

Procedia PDF Downloads 94
686 Enhancing Scalability in Ethereum Network Analysis: Methods and Techniques

Authors: Stefan K. Behfar

Abstract:

The rapid growth of the Ethereum network has brought forth the urgent need for scalable analysis methods to handle the increasing volume of blockchain data. In this research, we propose efficient methodologies for making Ethereum network analysis scalable. Our approach leverages a combination of graph-based data representation, probabilistic sampling, and parallel processing techniques to achieve unprecedented scalability while preserving critical network insights. Data Representation: We develop a graph-based data representation that captures the underlying structure of the Ethereum network. Each block transaction is represented as a node in the graph, while the edges signify temporal relationships. This representation ensures efficient querying and traversal of the blockchain data. Probabilistic Sampling: To cope with the vastness of the Ethereum blockchain, we introduce a probabilistic sampling technique. This method strategically selects a representative subset of transactions and blocks, allowing for concise yet statistically significant analysis. The sampling approach maintains the integrity of the network properties while significantly reducing the computational burden. Graph Convolutional Networks (GCNs): We incorporate GCNs to process the graph-based data representation efficiently. The GCN architecture enables the extraction of complex spatial and temporal patterns from the sampled data. This combination of graph representation and GCNs facilitates parallel processing and scalable analysis. Distributed Computing: To further enhance scalability, we adopt distributed computing frameworks such as Apache Hadoop and Apache Spark. By distributing computation across multiple nodes, we achieve a significant reduction in processing time and enhanced memory utilization. Our methodology harnesses the power of parallelism, making it well-suited for large-scale Ethereum network analysis. Evaluation and Results: We extensively evaluate our methodology on real-world Ethereum datasets covering diverse time periods and transaction volumes. The results demonstrate its superior scalability, outperforming traditional analysis methods. Our approach successfully handles the ever-growing Ethereum data, empowering researchers and developers with actionable insights from the blockchain. Case Studies: We apply our methodology to real-world Ethereum use cases, including detecting transaction patterns, analyzing smart contract interactions, and predicting network congestion. The results showcase the accuracy and efficiency of our approach, emphasizing its practical applicability in real-world scenarios. Security and Robustness: To ensure the reliability of our methodology, we conduct thorough security and robustness evaluations. Our approach demonstrates high resilience against adversarial attacks and perturbations, reaffirming its suitability for security-critical blockchain applications. Conclusion: By integrating graph-based data representation, GCNs, probabilistic sampling, and distributed computing, we achieve network scalability without compromising analytical precision. This approach addresses the pressing challenges posed by the expanding Ethereum network, opening new avenues for research and enabling real-time insights into decentralized ecosystems. Our work contributes to the development of scalable blockchain analytics, laying the foundation for sustainable growth and advancement in the domain of blockchain research and application.
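A minimal sketch of the first two ingredients described above, under assumed field names: transactions become graph nodes keyed by hash, consecutive transactions within a block are linked by temporal edges, and a Bernoulli sample keeps a representative subset before any heavier (e.g., GCN) processing. A production pipeline would read real blocks from an Ethereum node or dataset rather than the synthetic stand-in used here.

```python
import random

def sample_transactions(blocks, keep_prob=0.05, seed=7):
    """Bernoulli sampling: each transaction kept independently with probability keep_prob."""
    rng = random.Random(seed)
    return [tx for block in blocks for tx in block["txs"] if rng.random() < keep_prob]

def build_graph(blocks):
    """Transactions as nodes; a temporal edge links each transaction to the next in its block."""
    nodes, edges = {}, []
    for block in blocks:
        txs = block["txs"]
        for i, tx in enumerate(txs):
            nodes[tx["hash"]] = {"block": block["number"], "value": tx["value"]}
            if i > 0:
                edges.append((txs[i - 1]["hash"], tx["hash"]))
    return nodes, edges

if __name__ == "__main__":
    # Synthetic stand-in data; real input would come from an Ethereum node or export.
    rng = random.Random(0)
    blocks = [{"number": n,
               "txs": [{"hash": f"0x{n:04x}{i:04x}", "value": rng.randint(0, 10**6)}
                       for i in range(200)]}
              for n in range(50)]
    sample = sample_transactions(blocks)
    nodes, edges = build_graph(blocks)
    print(f"{len(nodes)} tx nodes, {len(edges)} temporal edges, {len(sample)} sampled txs")
```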

Keywords: Ethereum, scalable network, GCN, probabilistic sampling, distributed computing

Procedia PDF Downloads 42