Search results for: pervasive computing
757 Law Versus Tradition: Beliefs in and Practices of Witchcraft in Contemporary Ghana and the Law
Authors: Baba Iddrisu Musah
Abstract:
Many Ghanaians, including the rich and the downtrodden, the elite and the unlettered, rural and urban dwellers, politicians and civil servants, in one way or the other believe in and practice witchcraft. The existence of witches’ camps in northern Ghana, the rise of Pentecostal churches, especially in southern Ghana, with a penchant for cleansing people of witchcraft, as well as media reports of witchcraft imputations assuming wider dimensions in the country, often classified as a citadel of democracy, good governance and human rights in Africa, buttress the pervasive nature of the belief in and practice of witchcraft in the country. This is in spite of the fact that tremendous efforts, especially by British colonial authorities, were made to regulate witchcraft beliefs and their associated practices. Informed by Western values and philosophy, colonial authorities considered witchcraft illogical and unscientific. This paper, which is largely a review of existing literature, supplemented by archival information from the national archives of Ghana, focuses on the nature of witchcraft regulation in Ghana’s pre-colonial and colonial past, as well as immediately after Ghana obtained her independence in 1957. The article concludes by rhetorically questioning whether the belief in and practice of witchcraft in contemporary Ghana in general, and the existence of witches’ camps in the northern region of the country in particular, can be attributed to the failure of past regulations as well as of present government policies.
Keywords: colonial, natives, regulation, witchcraft
Procedia PDF Downloads 257
756 Exploring Social Impact of Emerging Technologies from Futuristic Data
Authors: Heeyeul Kwon, Yongtae Park
Abstract:
Despite their highly touted benefits, emerging technologies have unleashed pervasive concerns regarding unintended and unforeseen social impacts. Thus, those wishing to create safe and socially acceptable products need to identify such side effects and mitigate them prior to market proliferation. Various methodologies in the field of technology assessment (TA), namely Delphi, impact assessment, and scenario planning, have been widely incorporated in such circumstances. However, the literature faces a major limitation in its sole reliance on participatory workshop activities, overlooking the availability of a massive untapped source of futuristic information flooding through the Internet. This research thus seeks to gain insights into the utilization of futuristic data, future-oriented documents from the Internet, as a supplementary method to generate social impact scenarios whilst capturing the perspectives of experts from a wide variety of disciplines. To this end, network analysis is conducted based on the social keywords extracted from the futuristic documents by text mining, which is then used as a guide to produce a comprehensive set of detailed scenarios. Our proposed approach facilitates harmonized depictions of possible hazardous consequences of emerging technologies and thereby makes decision makers more aware of, and responsive to, broad qualitative uncertainties.
Keywords: emerging technologies, futuristic data, scenario, text mining
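A minimal sketch of the keyword-network step described in the abstract, assuming the futuristic documents have already been reduced by text mining to sets of social keywords (the documents and keywords below are hypothetical stand-ins, not the study’s corpus):

```python
# Build a co-occurrence network of social keywords and rank them by
# centrality; central keywords suggest themes around which detailed
# impact scenarios could be drafted. All inputs are illustrative.
from itertools import combinations
import networkx as nx

documents = [
    {"privacy", "surveillance", "autonomy"},
    {"privacy", "unemployment", "inequality"},
    {"surveillance", "autonomy", "trust"},
]

G = nx.Graph()
for doc in documents:
    for a, b in combinations(sorted(doc), 2):
        # Edge weight counts the documents in which two keywords co-occur.
        w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

for kw, c in sorted(nx.degree_centrality(G).items(), key=lambda x: -x[1]):
    print(f"{kw}: {c:.2f}")
```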
Procedia PDF Downloads 492
755 Application of Griddization Management to Construction Hazard Management
Authors: Lingzhi Li, Jiankun Zhang, Tiantian Gu
Abstract:
Hazard management that can prevent fatal accidents and property losses is a fundamental process during the buildings’ construction stage. However, due to a lack of safety supervision resources and operational pressures, the conduct of hazard management is poor and ineffective in China. In order to improve the quality of construction safety management, it is critical to explore the use of information technologies to ensure that the process of hazard management is efficient and effective. After exploring the existing problems of construction hazard management in China, this paper develops a griddization management model for construction hazard management. First, following the knowledge grid infrastructure, the griddization computing infrastructure for construction hazard management is designed, which includes five layers: resource entity layer, information management layer, task management layer, knowledge transformation layer and application layer. This infrastructure serves as the technical support for realizing grid management. Second, this study divides construction hazards into grids at the city, district and construction-site levels according to grid principles. Third, a griddization management process including hazard identification, assessment and control is developed. Meanwhile, all stakeholders of construction safety management, such as owners, contractors, supervision organizations and government departments, should take the corresponding responsibilities in this process. Finally, a case study based on actual construction hazard identification, assessment and control is used to validate the effectiveness and efficiency of the proposed griddization management model. The advantage of this model is to realize information sharing and cooperative management between various safety management departments.
Keywords: construction hazard, griddization computing, grid management, process
Procedia PDF Downloads 276
754 Cloud Resources Utilization and Science Teacher’s Effectiveness in Secondary Schools in Cross River State, Nigeria
Authors: Michael Udey Udam
Abstract:
Background: This study investigated the impact of cloud resources, a component of cloud computing, on science teachers’ effectiveness in secondary schools in Cross River State. Three (3) research questions and three (3) alternative hypotheses guided the study. Method: The descriptive survey design was adopted for the study. The population of the study comprised 1209 science teachers in public secondary schools of Cross River State. Sample: A sample of 487 teachers was drawn from the population using a stratified random sampling technique. A researcher-made structured questionnaire with 18 items was used for data collection. Research question one was answered using the Pearson Product Moment Correlation, while research question two and the hypotheses were answered using Analysis of Variance (ANOVA) statistics in the Statistical Package for Social Sciences (SPSS) at a 0.05 level of significance. Results: The results of the study revealed that there is a positive correlation between the utilization of cloud resources in teaching and teaching effectiveness among science teachers in secondary schools in Cross River State; there is a negative correlation between gender and utilization of cloud resources among these teachers; and there is a significant correlation between teaching experience and the utilization of cloud resources. Conclusion: The study justifies the effectiveness of the Cross River State government policy of introducing cloud computing into the education sector. The study recommends that the policy be sustained.
Keywords: cloud resources, science teachers, effectiveness, secondary school
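The statistical tests named above are standard; a hedged re-creation of the analysis pipeline (Pearson correlation, one-way ANOVA at the 0.05 level) on synthetic stand-in scores, using SciPy rather than SPSS, might look like this:

```python
from scipy import stats

cloud_use     = [3.1, 4.2, 2.8, 4.9, 3.6]  # cloud-resource utilization scores (illustrative)
effectiveness = [3.4, 4.5, 3.0, 4.7, 3.9]  # teaching-effectiveness scores (illustrative)

r, p = stats.pearsonr(cloud_use, effectiveness)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")

# One-way ANOVA across hypothetical teaching-experience groups.
novice, mid, veteran = [2.9, 3.2, 3.0], [3.8, 4.1, 3.9], [4.5, 4.8, 4.6]
F, p = stats.f_oneway(novice, mid, veteran)
print(f"ANOVA F = {F:.2f}, p = {p:.3f}, significant at 0.05: {p < 0.05}")
```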
Procedia PDF Downloads 76
753 An Adjoint-Based Method to Compute Derivatives with Respect to Bed Boundary Positions in Resistivity Measurements
Authors: Mostafa Shahriari, Theophile Chaumont-Frelet, David Pardo
Abstract:
Resistivity measurements are used to characterize the Earth’s subsurface. They are categorized into two different groups: (a) those acquired on the Earth’s surface, for instance, controlled source electromagnetic (CSEM) and magnetotellurics (MT), and (b) those recorded with borehole logging instruments such as Logging-While-Drilling (LWD) devices. LWD instruments are mostly used for geo-steering purposes, i.e., to adjust the dip and azimuthal angles of a well trajectory to drill along a particular geological target. Modern LWD tools measure all nine components of the magnetic field corresponding to three orthogonal transmitter and receiver orientations. In order to map the Earth’s subsurface and perform geo-steering, we invert measurements using a gradient-based method that utilizes the derivatives of the recorded measurements with respect to the inversion variables. For resistivity measurements, these inversion variables are usually the constant resistivity value of each layer and the bed boundary positions. It is well known how to compute derivatives with respect to the constant resistivity value of each layer using semi-analytic or numerical methods. However, similar formulas for computing the derivatives with respect to bed boundary positions are unavailable. The main contribution of this work is to provide an adjoint-based formulation for computing derivatives with respect to the bed boundary positions. The key idea in obtaining the aforementioned adjoint-state formulations for the derivatives is to separate the tangential and normal components of the field and treat them differently. This formulation allows us to compute the derivatives faster and more accurately than with traditional finite-difference approximations. In the presentation, we shall first derive a formula for computing the derivatives with respect to the bed boundary positions for the potential equation. Then, we shall extend our formulation to 3D Maxwell’s equations. Finally, by considering a 1D domain and reducing the dimensionality of the problem, which is a common practice in the inversion of resistivity measurements, we shall derive a formulation to compute the derivatives of the measurements with respect to the bed boundary positions using a 1.5D variational formulation. Then, we shall illustrate the accuracy and convergence properties of our formulations by comparing numerical results with the analytical derivatives for the potential equation. For the 1.5D Maxwell’s system, we shall compare our numerical results based on the proposed adjoint-based formulation versus those obtained with a traditional finite-difference approach. Numerical results shall show that our proposed adjoint-based technique produces enhanced-accuracy solutions while its cost is negligible, as opposed to the finite-difference approach, which requires the solution of one additional problem per derivative.
Keywords: inverse problem, bed boundary positions, electromagnetism, potential equation
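For orientation, the finite-difference baseline that the adjoint formulation is compared against can be sketched on a toy measurement model; the two-layer function below is a hypothetical stand-in, not the paper’s forward solver:

```python
import math

def measurement(z_boundary, rho1=1.0, rho2=10.0, tool_depth=2.0):
    # Toy apparent-resistivity reading for a two-layer medium with a
    # smooth bed boundary at depth z_boundary (illustrative only).
    t = 1.0 / (1.0 + math.exp(-(tool_depth - z_boundary)))
    return rho1 * (1.0 - t) + rho2 * t

def fd_derivative(z, h=1e-4):
    # Central finite difference: each derivative costs extra forward
    # solves, which is exactly the per-derivative cost the adjoint
    # approach is reported to avoid.
    return (measurement(z + h) - measurement(z - h)) / (2.0 * h)

print(fd_derivative(1.5))
```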
Procedia PDF Downloads 178
752 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads
Authors: Gaurav Kumar Sinha
Abstract:
In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.
Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies
Procedia PDF Downloads 69
751 Yawning Computing Using Bayesian Networks
Authors: Serge Tshibangu, Turgay Celik, Zenzo Ncube
Abstract:
Road crashes kill over a million people every year and leave millions more injured or permanently disabled. Various annual reports reveal that the percentage of fatal crashes due to fatigue or the driver falling asleep comes directly after the percentage of fatal crashes due to intoxicated drivers. This percentage is higher than the combined percentage of fatal crashes due to illegal/unsafe U-turns and illegal/unsafe reversing. Although a relatively small percentage of police reports on road accidents highlights drowsiness and fatigue, the importance of these factors is greater than we might think, hidden by the undercounting of such events. Some scenarios show that these factors are significant in accidents with killed and injured people. Hence the need for an automatic driver fatigue detection system in order to considerably reduce the number of accidents owing to fatigue. This research approaches the driver fatigue detection problem in an innovative way by combining cues collected from both temporal analysis of drivers’ faces and the environment. Monotony in the driving environment is inter-related with visual symptoms of fatigue on drivers’ faces to achieve fatigue detection. Optical and infrared (IR) sensors are used to analyse the monotony in the driving environment and to detect the visual symptoms of fatigue on the human face. Internal cues from drivers’ faces and external cues from the environment are combined using machine learning algorithms to automatically detect fatigue.
Keywords: intelligent transportation systems, Bayesian networks, yawning computing, machine learning algorithms
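As a minimal illustration of combining internal and external cues, a naive-Bayes fusion of a facial cue (yawning) with an environmental cue (monotony) could look like the sketch below; all probabilities are hypothetical stand-ins for values that would be learned from data:

```python
def posterior_fatigue(yawning, monotonous, p_fatigue=0.1):
    # (P(cue | fatigued), P(cue | alert)) -- hypothetical learned values.
    p_yawn = (0.7, 0.1)
    p_mono = (0.6, 0.3)

    def lik(pair, fatigued, present):
        p = pair[0] if fatigued else pair[1]
        return p if present else 1.0 - p

    # Naive Bayes: cues assumed independent given the driver's state.
    joint_f = lik(p_yawn, True, yawning) * lik(p_mono, True, monotonous) * p_fatigue
    joint_a = lik(p_yawn, False, yawning) * lik(p_mono, False, monotonous) * (1.0 - p_fatigue)
    return joint_f / (joint_f + joint_a)

print(f"P(fatigue | yawn, monotony) = {posterior_fatigue(True, True):.2f}")
```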
Procedia PDF Downloads 456
750 Combined Effects of Microplastics and Climate Change on Marine Life
Authors: Vikrant Sinha, Himanshu Singh, Nitish Kumar Singh, Sujal Nag
Abstract:
The combination of microplastics and climate change creates an urgent and complex challenge for marine ecosystems. Microplastics were primarily found on land, but they are now pervasive in marine environments as well, affecting a wide range of marine species, from zooplankton to larger mammals. These pollutants interfere with major biological processes like feeding and reproduction, causing disruption throughout the food web as microplastics accumulate at different trophic levels. Meanwhile, climate change has accelerated these effects, and the concentration of microplastics is increasing day by day. Rising temperatures, melting ice, increased rainfall runoff, and shifting wind patterns are transforming marine environments in ways that intensify the burden on marine life. This dual stress is particularly acute in fragile marine ecosystems such as coral reefs and mangroves. Addressing this intertwined crisis requires not only efforts to restrain plastic pollution but also adaptive strategies for climate mitigation. This research emphasizes the critical need for combined approaches to save marine biodiversity and withstand the rapid changes in the environment.
Keywords: microplastic pollution, climate change impacts, marine ecosystems, biodiversity threats, zooplankton ingestion, trophic accumulation, coral reef degradation, ecosystem resilience, plastic pollution mitigation, climate adaptation strategies, SST, sea surface temperature
Procedia PDF Downloads 13
749 FRATSAN: A New Software for Fractal Analysis of Signals
Authors: Hamidreza Namazi
Abstract:
Fractal analysis assesses the fractal characteristics of data. It consists of several methods for assigning fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion, networks, etc. Fractal analysis is now widely used in all areas of science. An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; rather, other essential characteristics have to be considered. For this purpose, a Visual C++ based software package called FRATSAN (FRActal Time Series ANalyser) was developed, which extracts information from signals through three measures: fractal dimension, Jeffrey’s measure and the Hurst exponent. After computing these measures, the software plots the graphs for each measure. Besides computing the three measures, the software can classify whether or not a signal is fractal. In fact, the software uses a dynamic method of analysis for all the measures. A sliding window is selected with a length equal to 10% of the total number of data entries. This sliding window is moved one data entry at a time to obtain all the measures. This makes the computation very sensitive to slight changes in the data, thereby giving the user an acute analysis of the data. In order to test the performance of the software, a set of EEG signals was given as input, and the results were computed and plotted. The software is useful not only for fundamental fractal analysis of signals but for other purposes as well. For instance, by analyzing the Hurst exponent plot of a given EEG signal in patients with epilepsy, the onset of seizure can be predicted by noticing sudden changes in the plot.
Keywords: EEG signals, fractal analysis, fractal dimension, Hurst exponent, Jeffrey’s measure
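The sliding-window scheme described above is straightforward to reproduce in outline; the sketch below uses a crude single-window rescaled-range (R/S) estimate of the Hurst exponent on a synthetic stand-in signal (FRATSAN itself is Visual C++; Python is used here purely for illustration):

```python
import numpy as np

def hurst_rs(x):
    # Crude single-window rescaled-range (R/S) estimate; production code
    # would regress R/S over multiple scales.
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())          # cumulative deviation profile
    r = y.max() - y.min()                # range
    s = x.std()                          # standard deviation
    return np.log(r / s) / np.log(len(x)) if s > 0 else np.nan

signal = np.random.default_rng(0).standard_normal(1000)  # stand-in for an EEG trace
win = max(2, len(signal) // 10)                          # 10% sliding window
track = [hurst_rs(signal[i:i + win]) for i in range(len(signal) - win + 1)]
print(f"{len(track)} window estimates, mean H = {np.nanmean(track):.2f}")
```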
Procedia PDF Downloads 469
748 Harmonization of Conflict Ahadith between Dissociation and Peaceful Co-Existence with Non-Muslims
Authors: Saheed Biodun Qaasim-Badmusi
Abstract:
A lot has been written on peaceful co-existence with non-Muslims in Islam, but little attention is paid to the conflict between Ahadith relating to dissociation from non-Muslims as a kernel of Islamic faith and those indicating peaceful co-existence with them. Undoubtedly, proper understanding of seemingly contradictory prophetic traditions is an antidote to the bane of pervasive extremism in our society. This is what calls for the need to shed light on the harmonization of conflict Ahadith between dissociation and peaceful co-existence with non-Muslims. It is in view of the above that efforts are made in this paper to collate Ahadith pertaining to dissociation from non-Muslims as well as co-existence with them. Consequently, a critical study of their authenticity is briefly explained before proceeding to an analysis of their linguistic and contextual meanings. To arrive at an accurate interpretation, harmonization is graphically applied. The result shows that dissociation from non-Muslims as a bedrock of Islamic faith could be explained in Sunnah by the prohibition of participating in or deriving satisfaction from their religious matters and anti-Islamic activities. Also, freedom of apostasy, ignoring da`wah with wisdom and seeking non-Muslims’ support against Muslims are frowned upon in Sunnah as phenomena of dissociation from non-Muslims. All the aforementioned are strictly prohibited in Sunnah, whether under the pretext of enhancing peaceful co-existence with non-Muslims or not. Peaceful co-existence with non-Muslims, meanwhile, is evidenced in Sunnah by the permissibility of visiting the sick among them, exchanging gifts with them, forgiving the wrong among them, having good relationships with non-Muslim neighbours, maintaining ties of non-Muslim kinship, conducting legal business transactions with them and the like. Finally, the degree of peaceful co-existence with non-Muslims is determined by their attitude towards Islam and Muslims.
Keywords: Ahadith, conflict, co-existence, non-Muslims
Procedia PDF Downloads 146
747 When the ‘Buddha’s Tree Itself Becomes a Rhizome’: The Religious Itinerant, Nomad Science and the Buddhist State
Authors: James Taylor
Abstract:
This paper considers the political, geo-philosophical musings of Deleuze and Guattari on spatialisation, place and movement in relation to the religious nomad (wandering ascetics and reclusive forest monks) inhabiting the borderlands of Thailand. A nomadic science involves improvised ascetic practices between the molar lines striated by modern state apparatuses. The wandering ascetics, inhabiting a frontier political ecology, stand in contrast to the appropriating, sedentary metaphysics and sanctifying arborescence of statism and its corollary place-making, embedded in rootedness and territorialisation. It is argued that the religious nomads, residing on the endo-exteriorities of the state, came to represent a rhizomatic and politico-ontological threat to the centre-nation and its apparatus of capture. The paper also theorises transitions and movement at the borderlands in the context of the state’s monastic reforms. These reforms, and their pervasive royal science, problematised the interstitial zones of the early ascetic wanderers in their radical cross-cutting networks and lines, moving within and across demarcated frontiers. Indeed, the ascetic wanderers and their allegorical war machine were seen as a source of wild, free-floating charisma and mystical power, eventually appropriated by the centre-nation in its becoming unitary and fixed.
Keywords: Deleuze and Guattari, religious nomad, centre-nation, borderlands, Buddhism
Procedia PDF Downloads 85
746 Psychometric Properties of the Sensory Processing Measure Preschool-Home among Children with Autism in Saudi Arabia
Authors: Shahad Alkhalifah, John Wright
Abstract:
Autism spectrum disorder (ASD) is a pervasive developmental disorder associated, for 42% to 88% of people with ASD, with sensory processing disorders. Sensory processing disorders (SPD) impact daily functioning, and it is therefore essential to be able to diagnose them accurately. Currently, however, there is no assessment tool available for the Saudi Arabian (SA) population that covers a wide enough age range. Therefore, this study aimed to assess the psychometric properties of the Sensory Processing Measure Preschool-Home Form (SPM-P) when used in English with a population of English-speaking Saudi participants. This was chosen due to time limitations and the urgency of providing practitioners with appropriate tools. Using a convenience sampling approach, a group of caregivers of typically developing (TD) children and a group of caregivers of children with ASD were recruited (N = 40 and N = 16, respectively), and completed the SPM-P Home Form. Participants were also invited to complete it again after two weeks for test-retest reliability, and respectively, nine and five agreed. Reliability analyses suggested some issues with a few items when used in the Saudi culture, and, along with interscale correlations, highlighted concerns with the factor structure. However, it was also found that the SPM-P Home has good criterion-based validity, and it is therefore suggested that it can be used until a tool is developed through translation and cultural adaptation. It is also suggested that the current factor structure of the SPM-P Home be reassessed using a large sample.
Keywords: autism, sensory, assessment, reliability, sensory processing dysfunction, preschool, validity
Procedia PDF Downloads 230
745 Managing Data from One Hundred Thousand Internet of Things Devices Globally for Mining Insights
Authors: Julian Wise
Abstract:
Newcrest Mining is one of the world’s top five gold and rare earth mining organizations by production, reserves and market capitalization. This paper elaborates on the data acquisition processes employed by Newcrest, in collaboration with the Fortune 500 listed organization Insight Enterprises, to standardize machine learning solutions which process data from over a hundred thousand distributed Internet of Things (IoT) devices located at mine sites globally. Through the utilization of cloud software architecture technologies and edge computing, these technological developments enable standardized processes for machine learning applications to influence the strategic optimization of mineral processing. Target objectives of the machine learning optimizations include time savings in mineral processing, production efficiencies, risk identification, and increased production throughput. The data acquired and utilized for predictive modelling is processed through edge computing and collectively stored within a data lake. Being involved in the digital transformation has necessitated standardizing the software architecture to manage the machine learning models submitted by vendors, to ensure effective automation and continuous improvement of the mineral process models. Operating at scale, the system processes hundreds of gigabytes of data per day from distributed mine sites across the globe, for the purposes of improved worker safety and production efficiency through big data applications.
Keywords: mineral technology, big data, machine learning operations, data lake
Procedia PDF Downloads 112
744 Corn Production in the Visayas: An Industry Study from 2002-2019
Authors: Julie Ann L. Gadin, Andrearose C. Igano, Carl Joseph S. Ignacio, Christopher C. Bacungan
Abstract:
Corn production has been an important and pervasive industry in the Visayas for many years. Its role as a substitute commodity for rice heightens demand among health-conscious consumers. Unfortunately, the corn industry is confronted with several challenges, such as weak institutions. Considering these issues, the paper examined the factors that influence corn production in the three administrative regions of the Visayas, namely, Western Visayas, Central Visayas, and Eastern Visayas. The data used was retrieved from a variety of publicly available sources such as the Philippine Statistics Authority, the Department of Agriculture, the Philippine Crop Insurance Corporation, and the International Disaster Database. Utilizing a dataset from 2002 to 2019, the indicators were tested using three multiple linear regression (MLR) models. Results showed that the land area harvested (p=0.02) and the value of corn production (p=0.00) are statistically significant variables that influence corn production in the Visayas. Given these findings, it is suggested that policies on forest conversion and sustainable land management should be effective in enabling farmworkers to obtain land to grow corn crops, especially in rural regions. Furthermore, the Biofuels Act of 2006, the Livestock Industry Restructuring and Rationalization Act, and the supporting policy, Senate Bill No. 225, or an Act Establishing the Philippine Corn Research Institute and Appropriating Funds, should be enforced inclusively in order to improve demand in the corn-allied industries, which may lead to an increase in the value and volume of corn production in the Visayas.
Keywords: corn, industry, production, MLR, Visayas
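A hedged sketch of the multiple linear regression setup described above, with synthetic stand-in observations in place of the 2002-2019 regional data:

```python
import statsmodels.api as sm

area_harvested = [10.2, 11.0, 10.8, 12.1, 12.5, 13.0]      # '000 ha (illustrative)
value_of_prod  = [95.0, 102.3, 99.1, 118.4, 121.0, 130.2]  # PHP millions (illustrative)
production     = [41.5, 44.8, 43.2, 50.9, 52.6, 56.1]      # '000 MT (illustrative)

# Regress production on the two indicators found significant in the study.
X = sm.add_constant(list(zip(area_harvested, value_of_prod)))
model = sm.OLS(production, X).fit()
print(model.summary())  # coefficients and p-values for each indicator
```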
Procedia PDF Downloads 216
743 An Evolutionary Approach for QAOA for Max-Cut
Authors: Francesca Schiavello
Abstract:
This work aims to create a hybrid algorithm, combining the Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in place of traditional gradient-based optimization processes. QAOAs were first introduced in 2014, where, at the time, the algorithm performed better than the best known classical algorithm for Max-Cut on certain graphs. Whilst classical algorithms have improved since then and have returned to being faster and more efficient, this was a huge milestone for quantum computing, and the work is often used as a benchmarking tool and a foundation for exploring variants of QAOA. This, alongside other famous algorithms like Grover’s or Shor’s, highlights to the world the potential that quantum computing holds. It also presents the reality of a real quantum advantage where, if the hardware continues to improve, this could constitute a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side of things in the hope of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they generate in creating solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the limited number of qubits, which restricts the scale of the problem that can be solved. These three issues are intertwined and are part of the motivation for using EAs in this work. Firstly, EAs are not based on gradient or linear optimization methods for the search in the latent space, and because of their freedom from gradients, they should suffer less from barren plateaus. Secondly, given that this algorithm performs a search in the solution space through a population of solutions, it can also be parallelized to speed up the search and optimization problem. The evaluation of the cost function, as in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOAs with respect to purely classical algorithms. Thirdly, because of the nature and structure of EAs, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA attached to QAOA can perform on par with the traditional QAOA using a COBYLA optimizer, which is a linear-approximation-based method, and in some instances, it can even produce a better Max-Cut. Whilst the final objective of the work is to create an algorithm that can consistently beat the original QAOA, or its variants, through either speedups or solution quality, this initial result is promising and shows the potential of EAs in this field. Further tests need to be performed on an array of different graphs, with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.
Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization
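The evolutionary component can be sketched as follows; note that the classical cut weight stands in here for the fitness, whereas in the hybrid algorithm described above the fitness of a candidate would be the QAOA expectation value for its parameters:

```python
import random

# Toy weighted graph: a 4-cycle plus one chord.
edges = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (3, 0, 1.0), (0, 2, 1.0)]
n, pop_size, generations = 4, 20, 50

def cut_value(bits):
    # Total weight of edges crossing the partition -- the Max-Cut objective.
    return sum(w for u, v, w in edges if bits[u] != bits[v])

def mutate(bits, rate=0.2):
    return [b ^ (random.random() < rate) for b in bits]

random.seed(1)
population = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
for _ in range(generations):
    # Truncation selection: keep the best half, refill with mutated copies.
    population.sort(key=cut_value, reverse=True)
    survivors = population[:pop_size // 2]
    population = survivors + [mutate(random.choice(survivors)) for _ in survivors]

best = max(population, key=cut_value)
print(f"best cut {best} with value {cut_value(best)}")
```

Because each individual’s fitness is evaluated independently, this loop parallelizes naturally, which is the speedup avenue the abstract mentions.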
Procedia PDF Downloads 60
742 An Analytical Metric and Process for Critical Infrastructure Architecture System Availability Determination in Distributed Computing Environments under Infrastructure Attack
Authors: Vincent Andrew Cappellano
Abstract:
In the early phases of critical infrastructure system design, translating distributed computing requirements to an architecture carries risk, given the multitude of approaches (e.g., cloud, edge, fog). In many systems, a single requirement for system uptime/availability is used to encompass the system’s intended operations. However, architected systems may perform to those availability requirements only during normal operations, and not during component failures or outages caused by adversary attacks on critical infrastructure (e.g., physical, cyber). System designers lack a structured method to evaluate availability requirements against candidate system architectures through deep degradation scenarios (i.e., from normal operations all the way down to significant damage of communications or physical nodes). This increases the risk of poor selection of a candidate architecture due to the absence of insight into true performance for systems that must operate as a piece of critical infrastructure. This research effort proposes a process to analyze critical infrastructure system availability requirements and a candidate set of system architectures, producing a metric assessing these architectures over a spectrum of degradations to aid in selecting appropriately resilient architectures. To accomplish this effort, a set of simulation and evaluation efforts is undertaken that will process, in an automated way, a set of sample requirements into a set of potential architectures where system functions and capabilities are distributed across nodes. Nodes and links will have specific characteristics and, based on sampled requirements, contribute to the overall system functionality, such that as they are impacted/degraded, the impacted functional availability of the system can be determined. A reinforcement-learning-based agent will structurally impact the nodes, links, and characteristics (e.g., bandwidth, latency) of a given architecture to provide an assessment of system functional uptime/availability under these scenarios. By varying the intensity of the attack and related aspects, we can create a structured method of evaluating the performance of candidate architectures against each other, producing a metric rating their resilience to these attack types/strategies. Through multiple simulation iterations, sufficient data will exist to compare this availability metric, and an architectural recommendation against the baseline requirements, with existing multi-factor computing architectural selection processes. It is intended that this additional data will improve the matching of resilient critical infrastructure system requirements to the correct architectures and implementations that will support improved operation during times of system degradation due to failures and infrastructure attacks.
Keywords: architecture, resiliency, availability, cyber-attack
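A scripted stand-in for the degradation sweep (not the reinforcement-learning agent itself) gives a feel for the proposed metric; the topology, function placement, and availability rule below are all hypothetical:

```python
import random
import networkx as nx

def build_architecture():
    g = nx.Graph()
    g.add_edges_from([(0, 1), (1, 2), (2, 3), (3, 0), (1, 4), (4, 5)])
    functions = {0: "ingest", 2: "compute", 4: "compute", 5: "control"}
    return g, functions

def functional_availability(g, functions):
    # Toy rule: a function survives if its host node still exists and is
    # connected to the surviving control node.
    alive = [n for n in g.nodes if n in functions]
    control = [n for n in alive if functions[n] == "control"]
    if not control:
        return 0.0
    reachable = nx.node_connected_component(g, control[0])
    return sum(1 for n in alive if n in reachable) / len(functions)

random.seed(0)
for intensity in [0.0, 0.2, 0.4, 0.6]:
    g, fns = build_architecture()
    g.remove_nodes_from([n for n in list(g.nodes) if random.random() < intensity])
    print(f"attack intensity {intensity:.1f}: availability {functional_availability(g, fns):.2f}")
```

Averaging such runs across intensities yields a single resilience score per candidate architecture, which is the comparison the abstract proposes.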
Procedia PDF Downloads 109
741 Exploring Data Stewardship in Fog Networking Using Blockchain Algorithm
Authors: Ruvaitha Banu, Amaladhithyan Krishnamoorthy
Abstract:
IoT networks today solve various consumer problems, from home automation systems to aiding in driving autonomous vehicles, through the coordination of multiple devices. For example, in an autonomous vehicle environment, multiple sensors are available on roads to monitor weather and road conditions and interact with each other to aid the vehicle in reaching its destination safely and on time. IoT systems are predominantly dependent on the cloud environment for data storage and computing needs, which results in latency problems. With the advent of fog networks, some of this storage and computing is pushed to the edge/fog nodes, saving network bandwidth and reducing latency proportionally. Managing the data stored in these fog nodes becomes crucial, as they might also store sensitive information required for a certain application. Data management in fog nodes is strenuous because fog networks are dynamic in terms of their availability and hardware capability. It becomes more challenging when the nodes in the network have short lifespans, detaching and joining frequently. When an end-user or fog node wants to access, read, or write data stored in another fog node, a new protocol becomes necessary, as a conventional static way of managing the data doesn’t work in fog networks. The proposed solution discusses a protocol that acts by defining sensitivity levels for the data being written and read. Additionally, a distinct data distribution and replication model among the fog nodes is established to decentralize the access mechanism. In this paper, the proposed model implements stewardship of the data stored in the fog node using reinforcement learning, so that access to the data is determined dynamically based on the requests.
Keywords: IoT, fog networks, data stewardship, dynamic access policy
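The sensitivity-level gating can be illustrated with a minimal store; the levels and clearances below are hypothetical, and the static check stands in for the dynamic, reinforcement-learned policy the paper proposes:

```python
from enum import IntEnum

class Sensitivity(IntEnum):
    PUBLIC = 0
    INTERNAL = 1
    CONFIDENTIAL = 2

class FogNodeStore:
    def __init__(self):
        self._data = {}  # key -> (sensitivity level, value)

    def write(self, clearance, key, value, level):
        if clearance < level:
            raise PermissionError("clearance below data sensitivity")
        self._data[key] = (level, value)

    def read(self, clearance, key):
        level, value = self._data[key]
        if clearance < level:
            raise PermissionError("clearance below data sensitivity")
        return value

store = FogNodeStore()
store.write(Sensitivity.CONFIDENTIAL, "road_sensor_7", {"ice": True}, Sensitivity.INTERNAL)
print(store.read(Sensitivity.INTERNAL, "road_sensor_7"))  # permitted
```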
Procedia PDF Downloads 60
740 Geometrical Analysis of Tiling Patterns in Azari Style: The Case of Tabriz Kaboud Mosque
Authors: Seyyedeh Faezeh Miralami, Sahar Sayyadchapari, Mona Laleh, Zahra Poursafar
Abstract:
Tiling patterns are a magnificent display of decoration from the Islamic period. They transform dusty and dreary facades into splendid and ornate ones. Due to ideological factors and elements of Azari style decoration, geometrical patterns and vegetative designs became prevalent and pervasive in religious sites like mosques. Objectives: The objective of this research is a study of the tiling patterns in the Tabriz Kaboud Mosque, as a splendid work of architecture in the Azari style. In this study, the geometrical designs and tiling patterns employed in the mosque decorations are examined and analyzed. Method: The research is based on a descriptive analysis method. Data and information are collected from library documents and field study, then refined and polished into an illustrative conclusion. Findings: In religious sites such as mosques, geometry represents ‘divination’ in Christian theology and ‘Unity with God’ or ‘Tawhid’ in Islamic terminology. In other words, science, literature, architecture, and all forms of human expression and representation are pointed towards one cause, unity or divination. The tiling patterns of the Kaboud Mosque, mostly hexagonal, circular, square and triangular, form outstanding architectonic features which recount a story, a narration of divination or unification with the One.
Keywords: tiling, Azari style, Tabriz Kaboud Mosque, Islamic architecture
Procedia PDF Downloads 325
739 High Performance Computing Enhancement of Agent-Based Economic Models
Authors: Amit Gill, Lalith Wijerathne, Sebastian Poledna
Abstract:
This research presents the details of the implementation of a high performance computing (HPC) extension of agent-based economic models (ABEMs) to simulate hundreds of millions of heterogeneous agents. ABEMs offer an alternative approach to studying the economy as a dynamic system of interacting heterogeneous agents, and are gaining popularity as an alternative to standard economic models. Over the last decade, ABEMs have been increasingly applied to study various problems related to monetary policy, bank regulations, etc. When it comes to predicting the effects of local economic disruptions, like major disasters, changes in policies, or exogenous shocks, on the economy of a country or region, it is pertinent to study how the disruptions cascade through every single economic entity, affecting its decisions and interactions, and eventually affect the economic macro parameters. However, such simulations with hundreds of millions of agents are hindered by the lack of HPC-enhanced ABEMs. In order to address this, a scalable distributed memory parallel (DMP) implementation of ABEMs has been developed using the message passing interface (MPI). A balanced distribution of computational load among MPI processes (i.e., CPU cores) of computer clusters, while taking all the interactions among agents into account, is a major challenge for scalable DMP implementations. Economic agents interact on several random graphs, some of which are centralized (e.g., credit networks), whereas others are dense with random links (e.g., consumption markets). The agents are partitioned into mutually exclusive subsets based on a representative employer-employee interaction graph, while the remaining graphs are made available at a minimum communication cost. To minimize the number of communications among MPI processes, real-life solutions like the introduction of recruitment agencies, sales outlets, local banks, and local branches of government in each MPI process are adopted. Efficient communication among MPI processes is achieved by combining MPI derived data types with the new features of the latest MPI functions. Most of the communications are overlapped with computations, thereby significantly reducing the communication overhead. The current implementation is capable of simulating a small open economy. As an example, a single time step of a 1:1 scale model of Austria (i.e., about 9 million inhabitants and 600,000 businesses) can be simulated in 15 seconds. The implementation is being further enhanced to simulate a 1:1 model of the Euro-zone (i.e., 322 million agents).
Keywords: agent-based economic model, high performance computing, MPI-communication, MPI-process
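A schematic mpi4py sketch of the partitioning idea, with a trivial stand-in for the real agent step: agents are split into mutually exclusive blocks per MPI process, each rank advances its own block, and a macro aggregate is combined collectively:

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

TOTAL_AGENTS = 1_000_000
lo = rank * TOTAL_AGENTS // size          # contiguous block partition
hi = (rank + 1) * TOTAL_AGENTS // size

local_output = 0.0
for agent in range(lo, hi):
    local_output += 1.0                   # stand-in for one agent's decision step

# Combine local results into an economy-wide aggregate on every rank.
total = comm.allreduce(local_output, op=MPI.SUM)
if rank == 0:
    print(f"{size} ranks, aggregate output = {total:.0f}")
```

Run with, e.g., `mpiexec -n 4 python abem_sketch.py`; the real implementation additionally overlaps such communication with computation, as the abstract notes.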
Procedia PDF Downloads 130
738 Local Homology Modules
Authors: Fatemeh Mohammadi Aghjeh Mashhad
Abstract:
In this paper, we give several ways for computing generalized local homology modules by using Gorenstein flat resolutions. Also, we find some bounds for vanishing of generalized local homology modules.
Keywords: a-adic completion functor, generalized local homology modules, Gorenstein flat modules
Procedia PDF Downloads 419
737 Heat Transfer and Diffusion Modelling
Authors: R. Whalley
Abstract:
Heat transfer modelling for a diffusion process will be considered. Difficulties in computing the time-distance dynamics of the representation will be addressed. Incomplete and irrational Laplace functions will be identified as the computational issue. Alternative approaches to the response evaluation process will be provided. An illustrative application problem will be presented. Graphical results confirming the theoretical procedures employed will be provided.
Keywords: heat, transfer, diffusion, modelling, computation
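One standard example of the computational issue named above (offered here as an assumption about the class of problems meant): for one-dimensional diffusion on a semi-infinite domain, the Laplace-domain response is an irrational function of s, so no finite-order rational inverse exists:

```latex
% 1D diffusion with a boundary input at x = 0, decaying at infinity.
\[
\frac{\partial u}{\partial t} = D\,\frac{\partial^2 u}{\partial x^2},
\qquad u(0,t) = f(t), \qquad u(x,t) \to 0 \ \text{as}\ x \to \infty,
\]
% Laplace transform in t gives an irrational transfer function of s.
\[
\bar{u}(x,s) = \bar{f}(s)\, e^{-x\sqrt{s/D}}.
\]
```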
Procedia PDF Downloads 554
736 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations
Authors: Deepak Singh, Rail Kuliev
Abstract:
This paper highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, ‘reskilling and upskilling’ employees, and establishing robust data management training programs play an essential and integral part in this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), and artificial intelligence (AI) and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. In the present study, by embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in the ever-evolving industry.
Keywords: master data management, IoT, AI&ML, cloud computing, data optimization
Procedia PDF Downloads 70
735 The Effects of Emotional Working Memory Training on Trait Anxiety
Authors: Gabrielle Veloso, Welison Ty
Abstract:
Trait anxiety is a pervasive tendency to attend to and experience fears and worries to a disproportionate degree across various situations. This study sought to determine whether participants who undergo emotional working memory training would have significantly lower scores on trait anxiety scales post-intervention. The study also sought to determine whether emotional regulation mediated the relationship between working memory training and trait anxiety. Forty-nine participants underwent 20 days of computerized emotional working memory training called Emotional Dual n-back, which involves viewing a continuous stream of emotional content on a grid and then remembering the location and color of items presented on the grid. Participants in the treatment group had significantly lower trait anxiety compared to controls post-intervention. Mediation analysis determined that working memory training had no significant relationship to anxiety as measured by Beck’s Anxiety Inventory-Trait (BAIT), but was significantly related to anxiety as measured by form Y2 of the Spielberger State-Trait Anxiety Inventory (STAI-Y2). Emotion regulation, as measured by the Emotion Regulation Questionnaire (ERQ), was found not to mediate between working memory training and trait anxiety reduction. Results suggest that working memory training may be useful in reducing psychoemotional symptoms rather than somatic symptoms of trait anxiety. Moreover, it is proposed that future research further examine the mediating role of emotion regulation via neuroimaging and the development of more comprehensive measures of emotion regulation.
Keywords: anxiety, emotion regulation, working-memory, working-memory training
Procedia PDF Downloads 152
734 The Internet of Things: A Survey of Authentication Mechanisms and Protocols for the Shifting Paradigm of Communicating Entities
Authors: Nazli Hardy
Abstract:
A multidisciplinary application of computer science and interactive database-driven web applications, the Internet of Things (IoT) represents a digital ecosystem that has pervasive technological, social, and economic impact on the human population. It is a long-term technology, and its development is built around the connection of everyday objects to the Internet. It is estimated that by 2020, with billions of people connected to the Internet, the number of connected devices will exceed 50 billion; thus IoT represents a paradigm shift in our current interconnected ecosystem, a communication shift that will unavoidably affect people, businesses, consumers, clients, and employees. By nature, in order to provide a cohesive and integrated service, connected devices need to collect, aggregate, store, mine, and process personal and personalized data on individuals and corporations in a variety of contexts and environments. A significant factor in this paradigm shift is the necessity for secure and appropriate transmission, processing and storage of the data. Thus, while the benefits of the applications appear to be boundless, these same opportunities are bounded by concerns such as trust, privacy, security, loss of control, and related issues. This poster and presentation look at multi-factor authentication (MFA) mechanisms that need to change from the login-password tuple to an Identity and Access Management (IAM) model, and on to the more cohesive Identity Relationship Management (IRM) standard. It also compares and contrasts messaging protocols that are appropriate for the IoT ecosystem.
Keywords: Internet of Things (IoT), authentication, protocols, survey
Procedia PDF Downloads 300
733 Detailed Quantum Circuit Design and Evaluation of Grover’s Algorithm for the Bounded Degree Traveling Salesman Problem Using the Q# Language
Authors: Wenjun Hou, Marek Perkowski
Abstract:
The Traveling Salesman Problem is famous in computing and graph theory. In short, it asks for the Hamiltonian cycle of the least total weight in a given graph with N nodes. All variations on this problem, such as those with K-bounded-degree nodes, are classified as NP-complete in classical computing. Although several papers propose theoretical high-level designs of quantum algorithms for the Traveling Salesman Problem, to the best of our knowledge no quantum circuit implementation of these algorithms has been created. In contrast to previous papers, the goal of this paper is not to optimize some abstract complexity measures based on the number of oracle iterations, but to evaluate the real circuit and time costs on a quantum computer. Using the emerging quantum programming language Q# developed by Microsoft, which runs quantum circuits in a quantum computer simulation, an implementation of the bounded-degree problem and its respective quantum circuit were created. To apply Grover’s algorithm to this problem, a quantum oracle was designed, evaluating the cost of a particular set of edges in the graph as well as its validity as a Hamiltonian cycle. Repeating the Grover algorithm with an oracle that finds a successively lower cost each time makes it possible to transform the decision problem into an optimization problem, finding the minimum cost of Hamiltonian cycles. N log₂ K qubits are put into an equiprobabilistic superposition by applying the Hadamard gate to each qubit. Within these N log₂ K qubits, the method uses an encoding in which every node is mapped to a set of its encoded edges. The oracle consists of several blocks of circuits: a custom-written edge weight adder, node index calculator, uniqueness checker, and comparator, which were all created using only quantum Toffoli gates, including their special forms, Feynman and Pauli X. The oracle begins by using the edge encodings specified by the qubits to calculate each node that the path visits, adding up the edge weights along the way. Next, the oracle uses the calculated nodes from the previous step and checks that all the nodes are unique. Finally, the oracle checks that the calculated cost is less than the previously calculated cost. By performing the oracle an optimal number of times, a correct answer can be generated with very high probability. The oracle of the Grover algorithm is modified using the recalculated minimum cost value, and this procedure is repeated until the cost cannot be further reduced. This algorithm and circuit design have been verified, using several datasets, to generate correct outputs.
Keywords: quantum computing, quantum circuit optimization, quantum algorithms, hybrid quantum algorithms, quantum programming, Grover’s algorithm, traveling salesman problem, bounded-degree TSP, minimal cost, Q# language
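The qubit and iteration counts implied by the encoding above follow from elementary Grover scaling; the node, degree, and solution-count values below are hypothetical stand-ins:

```python
import math

def grover_resources(n_nodes, k_degree, n_solutions=1):
    # N log2(K) qubits span a search space of 2**(N log2 K) = K**N edge
    # assignments; Grover needs roughly (pi/4) * sqrt(space / solutions)
    # oracle iterations for a high success probability.
    qubits = n_nodes * math.ceil(math.log2(k_degree))
    space = 2 ** qubits
    iters = math.floor((math.pi / 4) * math.sqrt(space / n_solutions))
    return qubits, space, iters

for n, k in [(5, 4), (8, 4), (10, 8)]:
    q, s, i = grover_resources(n, k)
    print(f"N={n}, K={k}: {q} qubits, search space {s}, ~{i} oracle iterations")
```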
Procedia PDF Downloads 191
732 Time Integrated Measurements of Radon and Thoron Progeny Concentration in Various Dwellings of Bathinda District of Punjab Using Deposition Based Progeny Sensors
Authors: Kirandeep Kaur, Rohit Mehra, Pargin Bangotra
Abstract:
Radon and thoron are pervasive radioactive gases, and so are their progenies. The progenies of radon and thoron are present in the indoor atmosphere as attached/unattached fractions. In the present work, the seasonal variation of the concentration of attached and total (attached + unattached) nanosized decay products of indoor radon and thoron has been studied in the dwellings of Bathinda District of Punjab using deposition-based progeny sensors over long integration times, which are independent of air turbulence. The preliminary results of these measurements are reported, particularly regarding DTPS (Direct Thoron Progeny Sensor) and DRPS (Direct Radon Progeny Sensor), for the first time in Bathinda. It has been observed that there is a strong linear relationship between total EERC (Equilibrium Equivalent Radon Concentration) and EETC (Equilibrium Equivalent Thoron Concentration) in the rainy season (R² = 0.83). Further, a strong linear relation between total indoor radon concentration and the attached fraction has also been observed for the same rainy season (R² = 0.91). The concentration of the attached progeny of radon (EERCatt) is 76.3% of the total Equilibrium Equivalent Radon Concentration (EERC).
Keywords: radon, thoron, progeny, DTPS/DRPS, EERC, EETC, seasonal variation
Procedia PDF Downloads 417
731 Reconceptualizing the Place of Empire in European Women’s Travel Writing through the Lens of Iberian Texts
Authors: Gayle Nunley
Abstract:
Between the mid-nineteenth and early twentieth century, a number of Western European women broke with gender norms of their time and undertook to write and publish accounts of their own international journeys. In addition to contributing to their contemporaries’ progressive reimagining of the space and place of female experience within the public sphere, these often orientalism-tinged texts have come to provide key source material for the analysis of gendered voice in the narration of Empire, particularly with regard to works associated with Europe’s then-ascendant imperial powers, Britain and France. Incorporation of contemporaneous writings from the once-dominant Empires of Iberian Europe introduces an important additional lens onto this process. By bringing to bear geographic notions of placedness together with discourse analysis, the examination of works by Iberian Europe’s female travelers in conjunction with those of their more celebrated Northern European peers reveals a pervasive pattern of conjoined belonging and displacement traceable throughout the broader corpus, while also underscoring the insufficiency of binary paradigms of gendered voice. The re-situating of women travelers’ participation in the European imperial project to include voices from the Iberian south creates a more robust understanding of these writers’ complex, and often unexpectedly modern, engagement with notions of gender, mobility, ‘otherness’ and contact-zone encounter acted out both within and against the imperial paradigm.
Keywords: colonialism, orientalism, Spain, travel writing, women travelers
Procedia PDF Downloads 113
730 Securitizing Terrorism: A Critical Appraisal of Pakistan’s Counter-Terrorism Approach
Authors: Bilal Zubair
Abstract:
In a constantly challenging internal security environment, Pakistan is finding ways to improvise and respond to new variations of the pervasive phenomenon of terrorism. The state’s endeavors towards securitizing terrorism as an existential threat are both extensive and intensive, and have systematically incorporated both military and non-military means. Since 2007, the military has been conducting intermittent operations and by 2014 had successfully neutralized the terrorists’ ability to target vital security installations and security personnel. The terrorists have responded by targeting communities, which are soft targets and extremely vulnerable to organized assaults. Within this context, the study aims to explain the emerging trends of terrorism in Pakistan, whose multi-layered and complex developments are having far-reaching implications for state and society. With a view to exploring the underlying reasons, present trends and ensuing ramifications of the emerging trends in terrorism, this study examines the following: first, the historical processes and development of terrorism in Pakistan; second, the processes of securitization, which include political consensus, legal frameworks and military operations against the terrorist groups; third, the socio-cultural dimensions and geopolitical influences on the transforming nature of sectarian terrorism. The study will also highlight the grey areas and weak links in the ongoing securitization process. Finally, the study will thoroughly explore the societal insecurity which is manifested in internal displacements, identity crises and the weakening of the socio-political fabric of the state.
Keywords: counter-terrorism, terrorism, sectarianism, securitizing
Procedia PDF Downloads 299
729 Methods for Solving Identification Problems
Authors: Fadi Awawdeh
Abstract:
In this work, we highlight the key concepts in using semigroup theory as a methodology for constructing efficient formulas for solving inverse problems. The proposed method depends on some results concerning integral equations. The experimental results show the potential and limitations of the method and imply directions for future work.
Keywords: identification problems, semigroup theory, methods for inverse problems, scientific computing
Procedia PDF Downloads 481
728 Orbital Tuning of Marl-Limestone Alternations (Upper Tithonian to Upper Berriasian) in North-South Axis (Tunisia): Geochronology and Sequence Implications
Authors: Hamdi Omar Omar, Hela Fakhfakh, Chokri Yaich
Abstract:
This work reflects the integration of different techniques, such as field sampling and observations, magnetic susceptibility (MS) measurement, cyclostratigraphy and sequence stratigraphy. The combination of these results allows us to reconstruct the environmental evolution of the Sidi Khalif Formation in the North-South Axis (NOSA), of Upper Tithonian, Berriasian and Lower Valanginian age. Six sedimentary facies were identified; they are primarily influenced by open marine sedimentation receiving increasing terrigenous influx. Spectral analysis, based on MS variation (for the outcropping section) and wireline gamma ray (GR) variation (for the sub-area section), shows a pervasive dominance of 405-kyr eccentricity cycles with the expression of 100-kyr eccentricity, obliquity and precession. This study provides (for the first time) a precise duration of 2.4 myr for the outcropping Sidi Khalif Formation, with a sedimentation rate of 5.4 cm/kyr, and a duration of 3.24 myr for the sub-area section, with a sedimentation rate of 7.64 cm/kyr. We outlined 27 fifth-order depositional sequences, 8 Milankovitch depositional sequences and 2 major third-order cycles for the outcropping section, controlled by the long eccentricity (405 kyr) cycles and the precession index cycles. This study has demonstrated the potential of MS and GR to be used as proxies to develop an astronomically calibrated time scale for the Mesozoic era.
Keywords: Berriasian, magnetic susceptibility, orbital tuning, Sidi Khalif Formation
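The spectral step is easy to reproduce in outline; the sketch below applies a periodogram to a synthetic proxy series containing 405-kyr and 100-kyr components, with the sampling interval and amplitudes as hypothetical stand-ins for the MS/GR records:

```python
import numpy as np
from scipy.signal import periodogram

dt = 5.0                                 # sample spacing, kyr (illustrative)
t = np.arange(0.0, 3000.0, dt)           # ~3 myr of record
series = (np.sin(2 * np.pi * t / 405.0)              # long eccentricity
          + 0.5 * np.sin(2 * np.pi * t / 100.0)      # short eccentricity
          + 0.3 * np.random.default_rng(0).standard_normal(t.size))

freqs, power = periodogram(series, fs=1.0 / dt)      # frequencies in cycles/kyr
freqs, power = freqs[1:], power[1:]                  # drop the zero frequency
for i in sorted(np.argsort(power)[::-1][:3], key=lambda j: freqs[j]):
    print(f"period ~{1.0 / freqs[i]:.0f} kyr, power {power[i]:.1f}")
```

Peaks near 405 kyr and 100 kyr would then anchor the tuning of the depth series to an astronomical time scale.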
Procedia PDF Downloads 266