Search results for: granular computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1163

713 Life Cycle Analysis of Using Brick Waste in Road Technology

Authors: Mezhoud Samy, Toumi Youcef, Boukendekdji Otmane

Abstract:

Nowadays, industrial by-products and waste are increasing along with growing public needs. The engineering sector has turned to sustainable development, emphasizing environmental and life cycle assessment as an important objective. Among this waste, the remains of red bricks (DBR) may be an alternative worth investigating, given their availability and abundance at construction sites. In this context, this work aims to valorize DBR in road concrete (BR). The DBR is incorporated by substituting it for the granular fractions of mixtures otherwise made from noble quarry materials. The experimental plan determines the physico-mechanical and environmental performance of BRs manufactured from DBR with a cement content of 6.5%, compared with a control BR without DBR. The studied characteristics are Proctor compaction, compressive strength, flexural tensile strength at 7 and 28 days, modulus of elasticity, and total shrinkage. The results of this experimental study showed that the characteristics of the recycled aggregates (DBR) are lower than those of natural aggregates but remain acceptable with respect to regulations. The results also show that the mechanical performance of BR made with DBR is lower than that of the control BR without DBR but remains appreciable, encouraging its use in the road sector. Recycled aggregates can constitute an interesting economic and ecological alternative but require basic precautions before any use.

Keywords: life cycle assessment, brick waste, road concrete, performance

Procedia PDF Downloads 73
712 Cloud Resources Utilization and Science Teacher’s Effectiveness in Secondary Schools in Cross River State, Nigeria

Authors: Michael Udey Udam

Abstract:

Background: This study investigated the impact of cloud resources, a component of cloud computing, on science teachers’ effectiveness in secondary schools in Cross River State. Three (3) research questions and three (3) alternative hypotheses guided the study. Method: The descriptive survey design was adopted for the study. The population of the study comprised 1209 science teachers in public secondary schools of Cross River State. Sample: A sample of 487 teachers was drawn from the population using a stratified random sampling technique. An 18-item researcher-made structured questionnaire was used for data collection. Research question one was answered using the Pearson Product Moment Correlation, while research question two and the hypotheses were answered using Analysis of Variance (ANOVA) in the Statistical Package for Social Sciences (SPSS) at a 0.05 level of significance. Results: The results of the study revealed a positive correlation between the utilization of cloud resources in teaching and teaching effectiveness among science teachers in secondary schools in Cross River State; a negative correlation between gender and utilization of cloud resources; and a significant correlation between teaching experience and the utilization of cloud resources. Conclusion: The study justifies the effectiveness of the Cross River State government policy of introducing cloud computing into the education sector and recommends that the policy be sustained.
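
As a rough illustration of the analysis pipeline described above, the sketch below computes a Pearson product-moment correlation and a one-way ANOVA in Python with SciPy rather than SPSS; the variable names, grouping, and sample data are invented for demonstration, not drawn from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical scores: cloud-resource utilization and teaching
# effectiveness for 487 teachers (invented data for illustration).
utilization = rng.normal(3.5, 0.6, 487)
effectiveness = 0.5 * utilization + rng.normal(0, 0.5, 487)

# RQ1-style analysis: Pearson product-moment correlation.
r, p_value = stats.pearsonr(utilization, effectiveness)
print(f"Pearson r = {r:.3f}, p = {p_value:.4f}")

# Hypothesis-style analysis: one-way ANOVA of utilization across
# hypothetical teaching-experience groups (e.g. <5, 5-10, >10 years).
groups = np.array_split(utilization, 3)
f_stat, p_anova = stats.f_oneway(*groups)
print(f"ANOVA F = {f_stat:.3f}, p = {p_anova:.4f}")
print("significant at 0.05" if p_anova < 0.05 else "not significant at 0.05")
```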

Keywords: cloud resources, science teachers, effectiveness, secondary school

Procedia PDF Downloads 47
711 A Study on FWD Deflection Bowl Parameters for Condition Assessment of Flexible Pavement

Authors: Ujjval J. Solanki, Prof. (Dr.) P. J. Gundaliya, Prof. M. D. Barasara

Abstract:

The Falling Weight Deflectometer (FWD) is applied to evaluate the structural performance of flexible pavements. Back calculation is required to determine the modulus of elasticity of an existing in-service pavement. The back calculation process needs in-depth field experience to select input ranges for the moduli of elasticity of the bituminous, granular, and subgrade layers, and a number of trials are required to match the calculated moduli with the FWD deflections observed in the field. The study was carried out on the Barnala-Mansa State Highway, Punjab, India, using FWD before and after overlay; deflections were obtained at 0 (under the load cell), 300, 600, 900, 1200, 1500, and 1800 mm from the load cell, and these seven deflection results were used to calculate the Surface Curvature Index (SCI), Base Damage Index (BDI), and Base Curvature Index (BCI). These indices are useful for predicting the structural performance of in-service pavement and for identifying homogeneous sections for condition assessment. The ranges determined before and after overlay are SCI 520 to 51, BDI 294 to 63, and BCI 83 to 0.27 for the old pavement, and SCI 272 to 23, BDI 228 to 28, and BCI 25.85 to 4.60 for the new pavement. The indices also show good correlation with the back-calculated moduli of elasticity of all three layers.
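
The deflection bowl indices are simple differences of sensor readings. The abstract does not spell out the definitions, so the sketch below assumes the conventions common in the literature (SCI = D0 - D300, BDI = D300 - D600, BCI = D600 - D900); the sample readings are invented.

```python
def bowl_indices(deflections):
    """Deflection bowl indices from the seven FWD readings (microns)
    at 0, 300, 600, 900, 1200, 1500, and 1800 mm from the load cell."""
    d0, d300, d600, d900, d1200, d1500, d1800 = deflections
    return {
        "SCI": d0 - d300,    # surface curvature index (bituminous layer)
        "BDI": d300 - d600,  # base damage index (base layer)
        "BCI": d600 - d900,  # base curvature index (subbase/subgrade)
    }

# Example with invented readings for an in-service pavement:
print(bowl_indices([620, 100, 60, 32, 20, 12, 8]))
```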

Keywords: back calculation, base damage index, base curvature index, FWD (Falling Weight Deflectometer), surface curvature index

Procedia PDF Downloads 310
710 An Adjoint-Based Method to Compute Derivatives with Respect to Bed Boundary Positions in Resistivity Measurements

Authors: Mostafa Shahriari, Theophile Chaumont-Frelet, David Pardo

Abstract:

Resistivity measurements are used to characterize the Earth’s subsurface. They are categorized into two different groups: (a) those acquired on the Earth’s surface, for instance, controlled source electromagnetic (CSEM) and magnetotellurics (MT), and (b) those recorded with borehole logging instruments such as Logging-While-Drilling (LWD) devices. LWD instruments are mostly used for geo-steering purposes, i.e., to adjust the dip and azimuthal angles of a well trajectory to drill along a particular geological target. Modern LWD tools measure all nine components of the magnetic field corresponding to three orthogonal transmitter and receiver orientations. In order to map the Earth’s subsurface and perform geo-steering, we invert measurements using a gradient-based method that utilizes the derivatives of the recorded measurements with respect to the inversion variables. For resistivity measurements, these inversion variables are usually the constant resistivity value of each layer and the bed boundary positions. It is well known how to compute derivatives with respect to the constant resistivity value of each layer using semi-analytic or numerical methods. However, similar formulas for computing the derivatives with respect to bed boundary positions are unavailable. The main contribution of this work is to provide an adjoint-based formulation for computing derivatives with respect to the bed boundary positions. The key idea to obtain the aforementioned adjoint state formulations for the derivatives is to separate the tangential and normal components of the field and treat them differently. This formulation allows us to compute the derivatives faster and more accurately than with traditional finite-difference approximations. In the presentation, we shall first derive a formula for computing the derivatives with respect to the bed boundary positions for the potential equation. Then, we shall extend our formulation to 3D Maxwell’s equations. Finally, by considering a 1D domain and reducing the dimensionality of the problem, which is a common practice in the inversion of resistivity measurements, we shall derive a formulation to compute the derivatives of the measurements with respect to the bed boundary positions using a 1.5D variational formulation. Then, we shall illustrate the accuracy and convergence properties of our formulations by comparing numerical results with the analytical derivatives for the potential equation. For the 1.5D Maxwell’s system, we shall compare our numerical results based on the proposed adjoint-based formulation versus those obtained with a traditional finite difference approach. Numerical results shall show that our proposed adjoint-based technique produces enhanced accuracy solutions while its cost is negligible, as opposed to the finite difference approach that requires the solution of one additional problem per derivative.
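
For orientation, the traditional finite-difference baseline the authors compare against can be sketched as follows. Here `forward_model` is a hypothetical stand-in for the 1.5D simulation mapping bed boundary depths to a measurement; a real implementation would solve Maxwell’s equations, and the step size and model are illustrative assumptions.

```python
import numpy as np

def forward_model(boundaries):
    # Hypothetical stand-in for the 1.5D electromagnetic simulation:
    # maps bed boundary depths (m) to one recorded measurement.
    return np.sum(np.exp(-np.asarray(boundaries) / 10.0))

def fd_gradient(boundaries, h=1e-3):
    """Central finite differences w.r.t. each bed boundary position.

    Needs two extra forward solves per boundary, which is exactly the
    per-derivative cost the adjoint formulation avoids.
    """
    boundaries = np.asarray(boundaries, dtype=float)
    grad = np.empty_like(boundaries)
    for k in range(boundaries.size):
        up, down = boundaries.copy(), boundaries.copy()
        up[k] += h
        down[k] -= h
        grad[k] = (forward_model(up) - forward_model(down)) / (2 * h)
    return grad

print(fd_gradient([5.0, 12.5, 20.0]))
```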

Keywords: inverse problem, bed boundary positions, electromagnetism, potential equation

Procedia PDF Downloads 160
709 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 45
708 Yawning Computing Using Bayesian Networks

Authors: Serge Tshibangu, Turgay Celik, Zenzo Ncube

Abstract:

Road crashes kill over a million people every year and leave millions more injured or permanently disabled. Various annual reports reveal that the percentage of fatal crashes due to fatigue or the driver falling asleep comes directly after the percentage of fatal crashes due to intoxicated drivers, and is higher than the combined percentage of fatal crashes due to illegal/unsafe U-turns and illegal/unsafe reversing. Although a relatively small percentage of police reports on road accidents highlights drowsiness and fatigue, the importance of these factors is greater than we might think, hidden by the undercounting of such events. Some scenarios show that these factors are significant in accidents with killed and injured people; hence the need for an automatic driver fatigue detection system to considerably reduce the number of accidents owing to fatigue. This research approaches the driver fatigue detection problem in an innovative way by combining cues collected from both temporal analysis of drivers’ faces and the environment. Monotony in the driving environment is inter-related with visual symptoms of fatigue on drivers’ faces to achieve fatigue detection. Optical and infrared (IR) sensors are used to analyse the monotony in the driving environment and to detect the visual symptoms of fatigue on the human face. Internal cues from drivers’ faces and external cues from the environment are combined using machine learning algorithms to automatically detect fatigue.
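
A minimal sketch of this cue-fusion idea with a Bayesian network is shown below, using the pgmpy library; the network structure, variable names, and probabilities are invented for illustration and are not taken from the paper.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Road monotony (environment cue) influences fatigue; fatigue
# influences yawning (facial cue). All variables are binary.
model = BayesianNetwork([("Monotony", "Fatigue"), ("Fatigue", "Yawning")])

cpd_monotony = TabularCPD("Monotony", 2, [[0.7], [0.3]])
cpd_fatigue = TabularCPD(
    "Fatigue", 2,
    [[0.9, 0.5],   # P(no fatigue | monotony = 0, 1)
     [0.1, 0.5]],  # P(fatigue    | monotony = 0, 1)
    evidence=["Monotony"], evidence_card=[2])
cpd_yawning = TabularCPD(
    "Yawning", 2,
    [[0.8, 0.2],   # P(no yawn | fatigue = 0, 1)
     [0.2, 0.8]],  # P(yawn    | fatigue = 0, 1)
    evidence=["Fatigue"], evidence_card=[2])

model.add_cpds(cpd_monotony, cpd_fatigue, cpd_yawning)
assert model.check_model()

# Infer fatigue given a yawn detected on a monotonous road.
inference = VariableElimination(model)
posterior = inference.query(["Fatigue"], evidence={"Yawning": 1, "Monotony": 1})
print(posterior)
```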

Keywords: intelligent transportation systems, bayesian networks, yawning computing, machine learning algorithms

Procedia PDF Downloads 440
707 FRATSAN: A New Software for Fractal Analysis of Signals

Authors: Hamidreza Namazi

Abstract:

Fractal analysis assesses the fractal characteristics of data. It consists of several methods to assign fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion, networks, etc. Fractal analysis is now widely used in all areas of science. An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; rather, other essential characteristics have to be considered. For this purpose, a Visual C++ based software called FRATSAN (FRActal Time Series ANalyser) was developed which extracts information from signals through three measures: the fractal dimension, Jeffrey’s measure, and the Hurst exponent. After computing these measures, the software plots the graphs for each measure. Besides computing the three measures, the software can classify whether or not a signal is fractal. In fact, the software uses a dynamic method of analysis for all the measures. A sliding window is selected with a length equal to 10% of the total number of data entries and is moved one data entry at a time to obtain all the measures. This makes the computation very sensitive to slight changes in the data, thereby giving the user an acute analysis of the data. In order to test the performance of this software, a set of EEG signals was given as input and the results were computed and plotted. This software is useful not only for fundamental fractal analysis of signals but also for other purposes. For instance, by analysing the Hurst exponent plot of the EEG signal of a patient with epilepsy, the onset of a seizure can be predicted from sudden changes in the plot.
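
As a rough sketch of this sliding-window scheme, the Python fragment below estimates the Hurst exponent with classic rescaled-range (R/S) analysis over a window equal to 10% of the series, moved one entry at a time as described. FRATSAN’s exact estimators are not specified in the abstract, so the R/S method and all parameters here are illustrative assumptions.

```python
import numpy as np

def hurst_rs(x):
    """Rescaled-range (R/S) estimate of the Hurst exponent."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for k in (1, 2, 4, 8):                    # dyadic sub-series counts
        n = len(x) // k
        if n < 8:
            break
        rs = []
        for start in range(0, k * n, n):      # non-overlapping segments
            seg = x[start:start + n]
            z = np.cumsum(seg - seg.mean())   # cumulative deviations
            if seg.std() > 0:
                rs.append((z.max() - z.min()) / seg.std())
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs)))
    return np.polyfit(log_n, log_rs, 1)[0]    # slope of log R/S vs log n

def sliding_hurst(signal, frac=0.10):
    """Hurst exponent over a window of 10% of the data, moved one entry
    at a time, as described for FRATSAN."""
    w = max(int(len(signal) * frac), 32)
    return np.array([hurst_rs(signal[i:i + w])
                     for i in range(len(signal) - w + 1)])

rng = np.random.default_rng(0)
eeg_like = np.cumsum(rng.standard_normal(2000))  # toy signal, H ~ 0.5
print(sliding_hurst(eeg_like)[:5])
```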

Keywords: EEG signals, fractal analysis, fractal dimension, hurst exponent, Jeffrey’s measure

Procedia PDF Downloads 441
706 Design and Evaluation of Production Performance Dashboard for Achieving Oil and Gas Production Target

Authors: Ivan Ramos Sampe Immanuel, Linung Kresno Adikusumo, Liston Sitanggang

Abstract:

Achieving oil and gas production targets in an upstream oil and gas company is a complex undertaking that requires collaborative engagement from a multidisciplinary team. In addition to conducting exploration activities and executing well intervention programs, an upstream oil and gas enterprise must assess the feasibility of attaining its predetermined production goals. Monitoring production performance is therefore a critical activity for ensuring progress towards the established oil and gas targets, and decisions within the upstream management team are informed by the production performance information received. To augment this decision-making process, a production performance dashboard emerges as a viable solution, providing an integrated and centralized tool. Such a dashboard offers a user-friendly interface for monitoring production performance while preserving the granularity of the underlying data. Integrating diverse data sources into a unified production performance dashboard establishes a single authoritative source, enhancing the organization’s capacity to maintain a consolidated foundation for its business requirements. Additionally, the improved accessibility of the production performance dashboard to business users demonstrates its impact in facilitating the monitoring of organizational targets.

Keywords: production, performance, dashboard, data analytics

Procedia PDF Downloads 44
705 Identification of Microbial Community in an Anaerobic Reactor Treating Brewery Wastewater

Authors: Abimbola M. Enitan, John O. Odiyo, Feroz M. Swalaha

Abstract:

The study of microbial ecology and function in anaerobic digestion processes is essential to control the biological processes and to understand the symbiotic relationships between the microorganisms involved in converting the complex organic matter in industrial wastewater into simple molecules. In this study, the diversity and quantity of the bacterial community in granular sludge taken from the different compartments of a full-scale upflow anaerobic sludge blanket (UASB) reactor treating brewery wastewater were investigated using polymerase chain reaction (PCR) and real-time quantitative PCR (qPCR). The phylogenetic analysis showed three major eubacterial phyla, belonging to Proteobacteria, Firmicutes, and Chloroflexi, in the full-scale UASB reactor, with different groups populating different compartments. The qPCR assay showed a high abundance of eubacteria, with concentration increasing along the reactor’s compartments. This study extends our understanding of the diversity, topological distribution, and shifts in concentration of microbial communities in the different compartments of a full-scale UASB reactor treating brewery wastewater. The colonization of, and trophic interactions among, these microbial populations in reducing and transforming complex organic matter within UASB reactors were established.

Keywords: bacteria, brewery wastewater, real-time quantitative PCR, UASB reactor

Procedia PDF Downloads 239
704 Managing Data from One Hundred Thousand Internet of Things Devices Globally for Mining Insights

Authors: Julian Wise

Abstract:

Newcrest Mining is one of the world’s top five gold and rare earth mining organizations by production, reserves, and market capitalization. This paper elaborates on the data acquisition processes employed by Newcrest, in collaboration with the Fortune 500 organization Insight Enterprises, to standardize machine learning solutions that process data from over one hundred thousand distributed Internet of Things (IoT) devices located at mine sites globally. Through cloud software architecture and edge computing, these technological developments enable standardized machine learning processes that inform the strategic optimization of mineral processing. Target objectives of the machine learning optimizations include time savings in mineral processing, production efficiencies, risk identification, and increased production throughput. The data acquired and utilized for predictive modelling is processed through edge computing, with resources collectively stored within a data lake. Involvement in the digital transformation has necessitated standardizing the software architecture that manages the machine learning models submitted by vendors, to ensure effective automation and continuous improvement of the mineral process models. Operating at scale, the system processes hundreds of gigabytes of data per day from distributed mine sites across the globe for improved worker safety and production efficiency through big data applications.

Keywords: mineral technology, big data, machine learning operations, data lake

Procedia PDF Downloads 91
703 Green Technologies Developed by JSC “NIUIF”

Authors: Andrey Norov

Abstract:

In recent years, the Samoilov Research Institute for Mineral Fertilizers JSC “NIUIF”, the oldest (established in September 1919) industry-oriented institute in Russia, has developed a range of sustainable, environment-friendly, zero-waste technologies that ensure minimal consumption of materials and energy resources and are fully consistent with the principles of Green Chemistry. These include: - An ecofriendly, energy- and resource-saving technology for sulfuric acid from sulfur according to the DC-DA scheme (double conversion - double absorption); - An improved zero-waste technology for wet phosphoric acid (WPA) by the dihydrate-hemihydrate process, applicable to various types of phosphate raw materials; - A flexible, efficient, zero-waste, universal technology for NP / NPS / NPK / NPKS fertilizers with maximum heat recovery from chemical processes; - A novel zero-waste technology, without analogue, for granular PK / PKS / NPKS fertilizers with controlled dissolution rate and nutrient supply into the soil, which allows a number of wastes and by-products to be processed; - An innovative, resource-saving joint processing of phosphogypsum and fluorosilicic acid (FSA) production wastes into ammonium sulfate, with simultaneous neutralization of fluoride compounds and no lime used; - A new fertilizer technology of increased environmental and agrochemical efficiency (currently under development). All the listed green technologies are protected by Russian and Eurasian patents. The development of ecofriendly, safe, green technologies is ongoing at JSC “NIUIF”.

Keywords: NPKS fertilizers, FSA, sulfuric acid, WPA

Procedia PDF Downloads 70
702 An Evolutionary Approach for QAOA for Max-Cut

Authors: Francesca Schiavello

Abstract:

This work aims to create a hybrid algorithm, combining the Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in place of traditional gradient-based optimization processes. QAOAs were first introduced in 2014, when their algorithm performed better than the best known classical algorithm for Max-Cut graphs. Whilst classical algorithms have improved since then and have returned to being faster and more efficient, this was a huge milestone for quantum computing, and the work is often used as a benchmarking tool and a foundation for exploring variants of QAOA. This, alongside other famous algorithms like Grover’s or Shor’s, highlights the potential that quantum computing holds. It also points to a real quantum advantage: if the hardware continues to improve, this could constitute a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side of things in the hope of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they generate in creating solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the limited number of available qubits, which bounds the scale of the problem that can be solved. These three issues are intertwined and motivate the use of EAs in this work. Firstly, EAs are not based on gradient or linear optimization methods for the search in the latent space, and because of their freedom from gradients, they should suffer less from barren plateaus. Secondly, given that this algorithm searches the solution space through a population of solutions, it can be parallelized to speed up the search and optimization. The evaluation of the cost function, as in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOA with respect to purely classical algorithms. Thirdly, because of the nature and structure of EAs, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA attached to QAOA can perform on par with the traditional QAOA using a COBYLA optimizer, which is a linear-approximation-based method, and in some instances it can even find a better Max-Cut. Whilst the final objective of the work is to create an algorithm that can consistently beat the original QAOA or its variants, through either speedups or solution quality, this initial result is promising and shows the potential of EAs in this field. Further tests need to be performed on an array of different graphs, with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.
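
A minimal, self-contained sketch of the hybrid idea is given below: it simulates a depth-1 QAOA circuit for Max-Cut on a 4-node ring with plain NumPy and optimizes the (gamma, beta) angles with a toy mutation-and-selection EA instead of a gradient or COBYLA optimizer. The graph, population size, and mutation scale are invented for illustration; this is not the author’s implementation.

```python
import numpy as np

# Tiny Max-Cut instance: 4-node ring graph.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n, dim = 4, 16

# Cut value of every computational basis state.
costs = np.array([sum(((z >> i) & 1) != ((z >> j) & 1) for i, j in edges)
                  for z in range(dim)], dtype=float)

def qaoa_expectation(params):
    """Expected cut value of the depth-1 QAOA state |gamma, beta>."""
    gamma, beta = params
    state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)  # |+>^n
    state *= np.exp(-1j * gamma * costs)                   # cost layer
    u = np.array([[np.cos(beta), -1j * np.sin(beta)],      # e^{-i beta X}
                  [-1j * np.sin(beta), np.cos(beta)]])
    mixer = u
    for _ in range(n - 1):
        mixer = np.kron(mixer, u)                          # X-mixer on all qubits
    state = mixer @ state
    return float(np.real(state.conj() @ (costs * state)))

def evolve(pop_size=20, generations=50, sigma=0.3, seed=0):
    """Toy EA: keep the best half, refill with Gaussian mutations."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(0, np.pi, size=(pop_size, 2))
    for _ in range(generations):
        fitness = np.array([qaoa_expectation(p) for p in pop])
        elite = pop[np.argsort(fitness)[-pop_size // 2:]]
        pop = np.vstack([elite, elite + rng.normal(0, sigma, elite.shape)])
    best = max(pop, key=qaoa_expectation)
    return best, qaoa_expectation(best)

params, expected_cut = evolve()
print(f"best (gamma, beta) = {params}, expected cut = {expected_cut:.3f}")
```

Because each individual’s fitness evaluation is independent, the fitness loop is exactly the part that can be parallelized, which is the speedup the abstract anticipates.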

Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization

Procedia PDF Downloads 39
701 Case Study: Drilled Shafts Installation in Difficult Site Conditions; Loose Sand and High Water Table

Authors: Anthony El Hachem, Hosam Salman

Abstract:

Selecting the most effective construction method for drilled shafts below a high phreatic surface can be a challenging task that requires effective communication between the design and construction teams. Slurry placement, temporary casing, and permanent casing are the three most commonly used installation techniques to ensure the stability of the drilled hole before casting the concrete. Each one of these methods has its implications for the installation and performance of the drilled piers. Drilled shafts were designed to support a fire wall for an energy project in Central Texas. The subsurface consisted of interlayers of sands and clays of varying shear strengths. The design recommended that the shafts be installed with temporary casing or slurry displacement due to the anticipated groundwater seepage through the granular soils. During the foundation construction, it was very difficult to maintain the stability of the hole, and the contractor requested that the shafts be installed using permanent casing. Therefore, the foundation design was modified to ensure that the cased shafts achieve the required load capacity. Effective and continuous communication between the owner, contractor, and design team during field shaft installation mitigated the unforeseen challenges and helped the team complete the project successfully.

Keywords: construction challenges, deep foundations, drilled shafts, loose sands under high water table, permanent casing

Procedia PDF Downloads 169
700 An Analytical Metric and Process for Critical Infrastructure Architecture System Availability Determination in Distributed Computing Environments under Infrastructure Attack

Authors: Vincent Andrew Cappellano

Abstract:

In the early phases of critical infrastructure system design, translating distributed computing requirements to an architecture carries risk, given the multitude of approaches (e.g., cloud, edge, fog). In many systems, a single requirement for system uptime/availability is used to encompass the system’s intended operations. However, architected systems may meet those availability requirements only during normal operations, and not during component failure or during outages caused by adversary attacks on critical infrastructure (e.g., physical, cyber). System designers lack a structured method to evaluate availability requirements against candidate system architectures through deep degradation scenarios (i.e., from normal operations all the way down to significant damage of communications or physical nodes). This increases the risk of poor selection of a candidate architecture due to the absence of insight into true performance for systems that must operate as a piece of critical infrastructure. This research effort proposes a process to analyze critical infrastructure system availability requirements and a candidate set of system architectures, producing a metric that assesses these architectures over a spectrum of degradations to aid in selecting appropriately resilient architectures. To accomplish this effort, a set of simulation and evaluation efforts is undertaken that processes, in an automated way, a set of sample requirements into a set of potential architectures where system functions and capabilities are distributed across nodes. Nodes and links have specific characteristics and, based on the sampled requirements, contribute to the overall system functionality, such that as they are impacted/degraded, the impacted functional availability of the system can be determined. A reinforcement-learning-based agent structurally impacts the nodes, links, and characteristics (e.g., bandwidth, latency) of a given architecture to provide an assessment of system functional uptime/availability under these scenarios. By varying the intensity of the attack and related aspects, a structured method is created for evaluating the performance of candidate architectures against each other, yielding a metric that rates their resilience to these attack types/strategies. Through multiple simulation iterations, sufficient data will exist to compare this availability metric, and an architectural recommendation against the baseline requirements, with existing multi-factor computing architectural selection processes. It is intended that this additional data will improve the matching of resilient critical infrastructure system requirements to the correct architectures and implementations that will support improved operation during times of system degradation due to failures and infrastructure attacks.
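
A minimal sketch of the degradation sweep described above is given below: functions are spread across the nodes of a candidate architecture graph, nodes are knocked out one by one, and functional availability is scored as the fraction of function pairs that can still communicate. The graph size, function placement, and the availability metric itself are invented stand-ins, not the paper’s actual simulation.

```python
import random
import networkx as nx

def functional_availability(graph, function_nodes):
    """Fraction of function pairs with a surviving communication path."""
    pairs = [(a, b) for i, a in enumerate(function_nodes)
             for b in function_nodes[i + 1:]]
    alive = [(a, b) for a, b in pairs
             if a in graph and b in graph and nx.has_path(graph, a, b)]
    return len(alive) / len(pairs) if pairs else 0.0

random.seed(1)
arch = nx.gnm_random_graph(20, 40, seed=1)       # candidate architecture
functions = random.sample(list(arch.nodes), 6)   # nodes hosting functions

g = arch.copy()
for step in range(10):                           # deepening degradation
    victim = random.choice(list(g.nodes))        # attack / failure event
    g.remove_node(victim)
    print(f"degradation {step + 1}: availability = "
          f"{functional_availability(g, functions):.2f}")
```

Averaging such sweeps over many random attack sequences would give the kind of spectrum-of-degradation metric the abstract proposes for comparing candidate architectures.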

Keywords: architecture, resiliency, availability, cyber-attack

Procedia PDF Downloads 77
699 Exploring Data Stewardship in Fog Networking Using Blockchain Algorithm

Authors: Ruvaitha Banu, Amaladhithyan Krishnamoorthy

Abstract:

IoT networks today solve various consumer problems, from home automation systems to aiding autonomous driving through the deployment of multiple devices. For example, in an autonomous vehicle environment, multiple sensors are available on roads to monitor weather and road conditions and interact with each other to help the vehicle reach its destination safely and on time. IoT systems are predominantly dependent on the cloud environment for data storage and computing needs, which results in latency problems. With the advent of fog networks, some of this storage and computing is pushed to the edge/fog nodes, saving network bandwidth and reducing latency proportionally. Managing the data stored in these fog nodes becomes crucial, as they might also store sensitive information required for a certain application. Data management in fog nodes is strenuous because fog networks are dynamic in terms of their availability and hardware capability, and it becomes more challenging when the nodes in the network are also short-lived, detaching and joining frequently. When an end-user or fog node wants to access, read, or write data stored in another fog node, a new protocol becomes necessary to access and manage the data stored in the fog devices, as a conventional static way of managing the data does not work in fog networks. The proposed solution is a protocol that defines sensitivity levels for the data being written and read. Additionally, a distinct data distribution and replication model among the fog nodes is established to decentralize the access mechanism. In this paper, the proposed model implements stewardship of the data stored in the fog nodes using reinforcement learning, so that access to the data is determined dynamically based on the requests.
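
A minimal sketch of sensitivity-gated storage across fog nodes is given below. The class names, clearance scheme, and replication rule are invented to illustrate the idea, and the reinforcement-learning access policy from the paper is reduced here to a static clearance check.

```python
from dataclasses import dataclass, field

@dataclass
class FogNode:
    node_id: str
    clearance: int                      # highest sensitivity it may hold
    store: dict = field(default_factory=dict)

class FogNetwork:
    def __init__(self, nodes):
        self.nodes = nodes

    def write(self, key, value, sensitivity):
        # Replicate only to nodes cleared for this sensitivity level.
        targets = [n for n in self.nodes if n.clearance >= sensitivity]
        for node in targets:
            node.store[key] = (value, sensitivity)
        return [n.node_id for n in targets]

    def read(self, key, requester_clearance):
        for node in self.nodes:
            if key in node.store:
                value, sensitivity = node.store[key]
                if requester_clearance >= sensitivity:
                    return value
                raise PermissionError("insufficient clearance")
        raise KeyError(key)

net = FogNetwork([FogNode("edge-a", 1), FogNode("edge-b", 3)])
print(net.write("road/temp", 21.5, sensitivity=2))   # -> ['edge-b']
print(net.read("road/temp", requester_clearance=3))  # -> 21.5
```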

Keywords: IoT, fog networks, data stewardship, dynamic access policy

Procedia PDF Downloads 36
698 Comparison of Regime Transition between Ellipsoidal and Spherical Particle Assemblies in a Model Shear Cell

Authors: M. Hossain, H. P. Zhu, A. B. Yu

Abstract:

This paper presents a numerical investigation of the regime transition of a flow of ellipsoidal particles and a comparison with that of a spherical particle assembly. Particle assemblies consisting of spherical particles and of ellipsoidal particles with a 2.5:1 aspect ratio are examined separately under similar flow conditions in a shear cell model developed numerically based on the discrete element method. Correlations among elastically scaled stress, kinetically scaled stress, coordination number, and volume fraction are investigated, and show important similarities and differences between the spherical and ellipsoidal particle assemblies. In particular, the volume fractions at the points of regime transition are identified for both types of particles. It is found that, compared with the spherical particle assembly, the ellipsoidal particle assembly has a higher volume fraction at the quasistatic-to-intermediate regime transition and a lower volume fraction at the intermediate-to-inertial regime transition. Finally, the relationship between coordination number and volume fraction shows strikingly distinct features for the two cases, suggesting that, unlike for spherical particles, the effect of the shear rate on the coordination number is not significant for ellipsoidal particles. This work offers a glimpse of ongoing research in one of the most active areas of this field and holds promise for understanding the rheology of more complex-shaped particles, building on the strong foundation of spherical-particle rheology.
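
For context, granular rheology commonly locates such regime transitions with the dimensionless inertial number; the definition and the indicative thresholds below are standard in the literature, not taken from this abstract:

```latex
% Inertial number (standard definition in granular rheology):
I = \dot{\gamma}\, d\, \sqrt{\frac{\rho_p}{P}}
% \dot{\gamma}: shear rate, d: particle diameter,
% \rho_p: particle density, P: confining pressure.
% Indicatively, I \lesssim 10^{-3} marks the quasistatic regime and
% I \gtrsim 10^{-1} the inertial regime, with the intermediate
% (dense-flow) regime in between.
```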

Keywords: DEM, granular rheology, non-spherical particles, regime transition

Procedia PDF Downloads 249
697 High Performance Computing Enhancement of Agent-Based Economic Models

Authors: Amit Gill, Lalith Wijerathne, Sebastian Poledna

Abstract:

This research presents the details of the implementation of a high performance computing (HPC) extension of agent-based economic models (ABEMs) to simulate hundreds of millions of heterogeneous agents. ABEMs offer an alternative approach to studying the economy as a dynamic system of interacting heterogeneous agents and are gaining popularity as an alternative to standard economic models. Over the last decade, ABEMs have been increasingly applied to study various problems related to monetary policy, bank regulations, etc. When it comes to predicting the effects of local economic disruptions, like major disasters, changes in policies, or exogenous shocks, on the economy of a country or region, it is pertinent to study how the disruptions cascade through every single economic entity, affecting its decisions and interactions, and eventually affect the economic macro parameters. However, such simulations with hundreds of millions of agents are hindered by the lack of HPC-enhanced ABEMs. In order to address this, a scalable Distributed Memory Parallel (DMP) implementation of ABEMs has been developed using the Message Passing Interface (MPI). A balanced distribution of computational load among MPI processes (i.e., CPU cores) of computer clusters, while taking all the interactions among agents into account, is a major challenge for scalable DMP implementations. Economic agents interact on several random graphs, some of which are centralized (e.g., credit networks) whereas others are dense with random links (e.g., consumption markets). The agents are partitioned into mutually exclusive subsets based on a representative employer-employee interaction graph, while the remaining graphs are made available at a minimum communication cost. To minimize the number of communications among MPI processes, real-life solutions like the introduction of recruitment agencies, sales outlets, local banks, and local branches of government in each MPI process are adopted. Efficient communication among MPI processes is achieved by combining MPI derived data types with the new features of the latest MPI functions. Most of the communications are overlapped with computations, thereby significantly reducing the communication overhead. The current implementation is capable of simulating a small open economy. As an example, a single time step of a 1:1 scale model of Austria (i.e., about 9 million inhabitants and 600,000 businesses) can be simulated in 15 seconds. The implementation is being further enhanced to simulate a 1:1 model of the Euro-zone (i.e., 322 million agents).
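
A minimal mpi4py sketch of the partitioning idea described above: each MPI process owns a disjoint subset of agents, computes a local market quantity, and a single collective reduction stands in for the centralized interaction graphs. The agent logic and numbers are invented placeholders, not the authors’ model.

```python
# Run with e.g.: mpiexec -n 4 python abem_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

TOTAL_AGENTS = 1_000_000
local_n = TOTAL_AGENTS // size          # balanced static partition

rng = np.random.default_rng(rank)       # per-process RNG stream
income = rng.lognormal(mean=10.0, sigma=0.5, size=local_n)

# Local step: every agent decides consumption (placeholder rule).
consumption = 0.8 * income

# Centralized interaction (e.g. a consumption market): aggregate
# demand across all processes with a single collective call.
local_demand = consumption.sum()
global_demand = comm.allreduce(local_demand, op=MPI.SUM)

if rank == 0:
    print(f"{size} processes, {TOTAL_AGENTS} agents, "
          f"aggregate demand = {global_demand:.3e}")
```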

Keywords: agent-based economic model, high performance computing, MPI-communication, MPI-process

Procedia PDF Downloads 109
696 Local Homology Modules

Authors: Fatemeh Mohammadi Aghjeh Mashhad

Abstract:

In this paper, we give several ways of computing generalized local homology modules using Gorenstein flat resolutions. We also find some bounds for the vanishing of generalized local homology modules.

Keywords: a-adic completion functor, generalized local homology modules, Gorenstein flat modules

Procedia PDF Downloads 392
695 Heat Transfer and Diffusion Modelling

Authors: R. Whalley

Abstract:

The heat transfer modelling for a diffusion process will be considered. Difficulties in computing the time-distance dynamics of the representation will be addressed. Incomplete and irrational Laplace functions will be identified as the computational issue. Alternative approaches to the response evaluation process will be provided. An illustrative application problem will be presented, with graphical results confirming the theoretical procedures employed.
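
To sketch where such irrational Laplace-domain functions arise, consider the standard one-dimensional diffusion model; this is a generic derivation, assuming zero initial temperature, not taken from the paper:

```latex
% 1-D heat diffusion and its Laplace-domain form (zero initial condition):
\frac{\partial T}{\partial t} = \alpha \frac{\partial^2 T}{\partial x^2}
\quad\Longrightarrow\quad
s\,\bar{T}(x,s) = \alpha\,\frac{\mathrm{d}^2 \bar{T}}{\mathrm{d}x^2},
\qquad
\bar{T}(x,s) = \bar{T}(0,s)\, e^{-x\sqrt{s/\alpha}}
% (bounded solution on a semi-infinite domain)
```

The factor e^{-x sqrt(s/alpha)} is irrational in s, so it admits no finite-order rational realization and no elementary closed-form inverse for general inputs, which is presumably the computational difficulty the abstract refers to.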

Keywords: heat, transfer, diffusion, modelling, computation

Procedia PDF Downloads 534
694 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations

Authors: Deepak Singh, Rail Kuliev

Abstract:

The abstract highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, “reskilling and upskilling” employees, and establishing robust data management training programs play an essential and integral part in this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), and artificial intelligence (AI) and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. In the present study, by embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in the ever-evolving industry.

Keywords: master data management, IoT, AI&ML, cloud computing, data optimization

Procedia PDF Downloads 52
693 Detailed Quantum Circuit Design and Evaluation of Grover's Algorithm for the Bounded Degree Traveling Salesman Problem Using the Q# Language

Authors: Wenjun Hou, Marek Perkowski

Abstract:

The Traveling Salesman Problem is famous in computing and graph theory. In short, it asks for the Hamiltonian cycle of least total weight in a given graph with N nodes. All variations of this problem, such as those with K-bounded-degree nodes, are classified as NP-complete in classical computing. Although several papers propose theoretical high-level designs of quantum algorithms for the Traveling Salesman Problem, no quantum circuit implementation of these algorithms has been created, to the best of our knowledge. In contrast to previous papers, the goal of this paper is not to optimize some abstract complexity measure based on the number of oracle iterations, but to evaluate the real circuit and time costs on the quantum computer. Using the emerging quantum programming language Q# developed by Microsoft, which runs quantum circuits in a quantum computer simulation, an implementation of the bounded-degree problem and its respective quantum circuit were created. To apply Grover’s algorithm to this problem, a quantum oracle was designed that evaluates the cost of a particular set of edges in the graph as well as its validity as a Hamiltonian cycle. Repeating the Grover algorithm with an oracle that finds a successively lower cost each time transforms the decision problem into an optimization problem, finding the minimum cost of Hamiltonian cycles. N log₂ K qubits are put into a uniform superposition by applying the Hadamard gate to each qubit. Within these N log₂ K qubits, the method uses an encoding in which every node is mapped to a set of its encoded edges. The oracle consists of several blocks of circuits: a custom-written edge weight adder, a node index calculator, a uniqueness checker, and a comparator, all created using only quantum Toffoli gates, including its special forms, the Feynman (CNOT) and Pauli X gates. The oracle begins by using the edge encodings specified by the qubits to calculate each node that the path visits, adding up the edge weights along the way. Next, the oracle uses the calculated nodes from the previous step and checks that all the nodes are unique. Finally, the oracle checks that the calculated cost is less than the previously calculated cost. By applying the oracle an optimal number of times, a correct answer can be generated with very high probability. The oracle of the Grover algorithm is modified using the recalculated minimum cost value, and this procedure is repeated until the cost cannot be further reduced. This algorithm and circuit design have been verified, using several datasets, to generate correct outputs.
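
A classical sketch of the oracle predicate described above is given below, using the same edge-set encoding: each node contributes log₂ K bits selecting one of its K incident edges, and the oracle marks bit strings that form a Hamiltonian cycle cheaper than the current threshold. The graph, K, and helper names are invented for illustration; the quantum oracle evaluates this predicate in superposition with reversible gates.

```python
import math

K = 2                                   # bounded degree
# incident_edges[node] = list of (neighbor, weight), K entries per node
incident_edges = {
    0: [(1, 4), (3, 9)],
    1: [(2, 3), (0, 4)],
    2: [(3, 5), (1, 3)],
    3: [(0, 9), (2, 5)],
}
N = len(incident_edges)
BITS = int(math.log2(K))                # bits per node

def oracle_predicate(bits, threshold):
    """True iff the encoded edge choices form a Hamiltonian cycle
    with total weight below the current threshold."""
    total, node, visited = 0, 0, []
    for _ in range(N):                  # node index calculator
        visited.append(node)
        choice = int(bits[node * BITS:(node + 1) * BITS], 2)
        neighbor, weight = incident_edges[node][choice]
        total += weight                 # edge weight adder
        node = neighbor
    unique = len(set(visited)) == N     # uniqueness checker
    closed = node == 0                  # cycle returns to start
    return unique and closed and total < threshold  # comparator

print(oracle_predicate("0000", threshold=22))  # 0->1->2->3->0, cost 21
```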

Keywords: quantum computing, quantum circuit optimization, quantum algorithms, hybrid quantum algorithms, quantum programming, Grover’s algorithm, traveling salesman problem, bounded-degree TSP, minimal cost, Q# language

Procedia PDF Downloads 167
692 Strength of Soft Clay Reinforced with Polypropylene Column

Authors: Muzamir Hasan, Anas Bazirgan

Abstract:

Granular columns are a technique for improving bearing capacity, accelerating the dissipation of excess pore water pressure, and reducing settlement in weak soft soils. This research investigates the role of a polypropylene (PP) column in improving the shear strength and compressibility of soft reconstituted kaolin clay by determining the effects of the area replacement ratio, height penetrating ratio, and volume replacement ratio of a single polypropylene column on the strength characteristics. Reinforced kaolin samples, 50 mm in diameter and 100 mm in height, were subjected to Unconfined Compression (UCT) and Unconsolidated Undrained (UU) triaxial tests. Using the PP column reinforcement with height penetrating ratios of 0.8, 0.5, and 0.3, shear strength increased by approximately 5.27%, 26.22%, and 64.28%, and by 37.14%, 42.33%, and 51.17%, for area replacement ratios of 25% and 10.24%, respectively. Meanwhile, UU testing showed increases in shear strength of 24.01%, 23.17%, and 23.49%, and of 28.79%, 27.29%, and 30.81%, for the same ratios. Based on the UCT results, the undrained shear strength generally increased as the height penetration ratio decreased. However, based on the Mohr-Coulomb failure criteria applied to the UU test results, the installation of polypropylene columns did not show any significant difference in effective friction angle, although there was an increase in the apparent cohesion and undrained shear strength of the kaolin clay. In conclusion, the polypropylene column greatly improved the shear strength and could therefore be implemented to reduce the cost of soil improvement as a replacement for non-renewable materials.

Keywords: polypropylene, UCT, UU test, Kaolin S300, ground improvement

Procedia PDF Downloads 309
691 A Study on the Reinforced Earth Walls Using Sandwich Backfills under Seismic Loads

Authors: Kavitha A. S., L. Govindaraju

Abstract:

Reinforced earth walls offer an excellent solution to many problems associated with earth retaining structures, especially under seismic conditions. The use of cohesive soils as backfill material reduces the cost of reinforced soil walls if proper drainage measures are taken. This paper presents a numerical study of the application of a new technique, called the sandwich technique, in reinforced earth walls. In this technique, a thin layer of granular soil is placed above and below each reinforcement layer to mobilize interface friction, and the remaining portion of the backfill is filled with the existing in-situ cohesive soil. A 6 m high reinforced earth wall has been analysed as a two-dimensional plane-strain finite element model. Three types of reinforcing elements, namely geotextile, geogrid, and metallic strips, were used. The horizontal wall displacements and the tensile loads in the reinforcement were used as the criteria to evaluate the results at the end of the construction and dynamic excitation phases. Also, to verify the effect of the sandwich layer on the performance of the wall, the thickness of the sand fill surrounding the reinforcement was varied. At the end of the construction stage, the wall with sandwich-type backfill yielded lower displacements than the wall with cohesive soil as backfill. Also, with sandwich backfill, the reinforcement loads were substantially lower than with cohesive backfill. Further, it is found that the sandwich technique as backfill combined with geogrid reinforcement is a good combination to reduce the deformations of geosynthetic reinforced walls during seismic loading.

Keywords: geogrid, geotextile, reinforced earth, sandwich technique

Procedia PDF Downloads 270
690 Methods for Solving Identification Problems

Authors: Fadi Awawdeh

Abstract:

In this work, we highlight the key concepts in using semigroup theory as a methodology for constructing efficient formulas for solving inverse problems. The proposed method depends on some results concerning integral equations. The experimental results show the potential and limitations of the method and suggest directions for future work.

Keywords: identification problems, semigroup theory, methods for inverse problems, scientific computing

Procedia PDF Downloads 457
689 Internet of Things, Edge and Cloud Computing in Rock Mechanical Investigation for Underground Surveys

Authors: Esmael Makarian, Ayub Elyasi, Fatemeh Saberi, Olusegun Stanley Tomomewo

Abstract:

Rock mechanical investigation is one of the most crucial activities in underground operations, especially in surveys related to hydrocarbon exploration and production, geothermal reservoirs, energy storage, mining, and geotechnics. There is a wide range of traditional methods for deriving, collecting, and analyzing rock mechanics data. However, these approaches may not be suitable or work well in some situations, such as fractured zones. Cutting-edge technologies have been provided to solve and optimize the mentioned issues. The Internet of Things (IoT), Edge Computing, and Cloud Computing technologies (ECt and CCt, respectively) are among the most widely used new artificial intelligence methods employed for geomechanical studies. IoT devices act as sensors and cameras for real-time monitoring and mechanical-geological data collection of rocks, such as temperature, movement, pressure, or stress levels. Other benefits of IoT technologies include assessing structural integrity, especially for cap rocks within hydrocarbon systems, and rock mass behavior, in support of further activities such as enhanced oil recovery (EOR) and underground gas storage (UGS), or to improve safety risk management (SRM) and potential hazard identification (PHI). ECt can immediately process, aggregate, and analyze data collected by IoT on a real-time scale, providing detailed insights into the behavior of rocks in various situations (e.g., stress, temperature, and pressure), establishing patterns quickly, and detecting trends. Therefore, this state-of-the-art and useful technology can support autonomous systems in rock mechanical surveys, such as drilling and production (in hydrocarbon wells) or excavation (in the mining and geotechnics industries). Besides, ECt allows all rock-related operations to be controlled remotely and enables operators to apply changes or make adjustments; this feature is very important for environmental goals. More often than not, rock mechanical studies consist of different data, such as laboratory tests, field operations, and indirect information like seismic or well-logging data. CCt provides a useful platform for storing and managing large volumes of heterogeneous information, which can be very useful in fractured zones. Additionally, CCt supplies powerful tools for predicting, modeling, and simulating rock mechanical information, especially in fractured zones within vast areas. It is also a suitable platform for sharing extensive information on rock mechanics, such as the direction and size of fractures in a large oil field or mine. The comprehensive review findings demonstrate that digital transformation through integrated IoT, Edge, and Cloud solutions is revolutionizing traditional rock mechanical investigation. These advanced technologies have empowered real-time monitoring, predictive analysis, and data-driven decision-making, culminating in noteworthy enhancements in safety, efficiency, and sustainability. Therefore, by employing IoT, CCt, and ECt, underground operations have experienced a significant boost, allowing for timely and informed actions using real-time data insights. The successful implementation of IoT, CCt, and ECt has led to safer operations, optimized processes, and environmentally conscious approaches in underground geological endeavors.

Keywords: rock mechanical studies, internet of things, edge computing, cloud computing, underground surveys, geological operations

Procedia PDF Downloads 38
688 Comparison of Several Diagnostic Methods for Detecting Bovine Viral Diarrhea Virus Infection in Cattle

Authors: Azizollah Khodakaram-Tafti, Ali Mohammadi, Ghasem Farjanikish

Abstract:

Bovine viral diarrhea virus (BVDV) is one of the most important viral pathogens of cattle worldwide, belonging to the genus Pestivirus, family Flaviviridae. The aim of the present study was to compare several diagnostic methods and determine the prevalence of BVDV infection for the first time in dairy herds of Fars province, Iran. For initial screening, a total of 400 blood samples were randomly collected from 12 industrial dairy herds and analyzed using reverse transcription (RT)-PCR on the buffy coat. In the second step, blood samples and also ear notch biopsies were collected from 100 cattle of the infected farms and tested by antigen capture ELISA (ACE), RT-PCR, and immunohistochemistry (IHC). In the initial screening, nested RT-PCR (outer primers 0I100/1400R and inner primers BD1/BD2) detected acute infection in 16 out of 400 buffy coat samples (4%). Also, 8 out of 100 samples (2%) were identified as persistent infection (PI) by all of the diagnostic tests alike, including RT-PCR, ACE, and IHC on buffy coat, serum, and skin samples, respectively. Immunoreactivity for BVDV antigen, appearing as brown, coarsely to finely granular staining, was observed within the cytoplasm of the epithelial cells of the epidermis and hair follicles and also in subcutaneous stromal cells. These findings confirm the importance of monitoring BVDV infection in cattle of this region and suggest the detection and elimination of PI calves for the control and eradication of this disease.

Keywords: antigen capture ELISA, bovine viral diarrhea virus, immunohistochemistry, RT-PCR, cattle

Procedia PDF Downloads 343
687 ABET Accreditation Process for Engineering and Technology Programs: Detailed Process Flow from Criteria 1 to Criteria 8

Authors: Amit Kumar, Rajdeep Chakrabarty, Ganesh Gupta

Abstract:

This paper illustrates the detailed accreditation process of the Accreditation Board for Engineering and Technology (ABET) for accrediting engineering and technology programs. ABET is a non-governmental agency that accredits programs in engineering and technology, applied and natural sciences, and computing sciences. ABET was founded on 10th May 1932 by a group of professional engineering societies. International industries accept ABET-accredited institutes as having the highest standards in their academic programs. In this accreditation there are eight criteria in general: criterion 1 describes the student outcome evaluations; criterion 2 covers the program educational objectives; criterion 3 concerns the student outcomes calculated from the marks obtained by students; criterion 4 establishes continuous improvement; criterion 5 focuses on the curriculum of the institute; criterion 6 on the faculty; criterion 7 on the facilities provided by the institute; and criterion 8 on institutional support for the staff of the institute. In this paper, we focus on the calculative part of each criterion, with equations and suitable examples, the files and documentation required for each criterion, and the total workflow of the process. The references and values used to illustrate the calculations are all taken from the samples provided on ABET’s official website. In the final section, we also discuss the criterion-wise score weightage, followed by the evaluation timeframe and deadlines.
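
A hedged sketch of the criterion-3-style attainment arithmetic that such papers walk through is given below: course-outcome (CO) attainment is derived from marks, then mapped to program-outcome (PO) attainment via a CO-PO matrix. The thresholds, attainment levels, and matrix values here are invented for illustration, not ABET's or the paper's official figures.

```python
import numpy as np

marks = np.array([72, 55, 81, 40, 66, 90, 58, 35])   # one CO's assessment
TARGET = 50                                           # pass threshold (%)

# CO attainment level from the share of students meeting the target.
share = np.mean(marks >= TARGET) * 100
co_level = 3 if share >= 70 else 2 if share >= 60 else 1 if share >= 50 else 0
print(f"{share:.0f}% met target -> CO attainment level {co_level}")

# PO attainment: weighted average of CO levels through the CO-PO map
# (rows = COs, columns = POs; entries are 0-3 correlation strengths).
co_levels = np.array([co_level, 2, 3])
co_po_map = np.array([[3, 1],
                      [2, 0],
                      [1, 3]])
weights = co_po_map.sum(axis=0)
po_attainment = (co_levels @ co_po_map) / np.where(weights == 0, 1, weights)
print("PO attainment:", np.round(po_attainment, 2))
```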

Keywords: Engineering Accreditation Committee, Computing Accreditation Committee, performance indicator, Program Educational Objective, ABET Criterion 1 to 7, IEEE, National Board of Accreditation, MOOCS, Board of Studies, stakeholders, course objective, program outcome, articulation, attainment, CO-PO mapping, CO-PO-SO mapping, PDCA cycle, degree certificates, course files, course catalogue

Procedia PDF Downloads 44
686 Computational Fluid Dynamics (CFD) Simulation Approach for Developing New Powder Dispensing Device

Authors: Revanth Rallapalli

Abstract:

Manually dispensing solids and powders can be difficult, as it requires gradually pouring the material while checking the amount on the scale. Current systems are manual and non-continuous in nature, are user-dependent, and make powder dispensation difficult to control. Recurrent dosing of powdered medicines in precise amounts, quickly and accurately, has been an all-time challenge. Various new powder dispensing mechanisms are being designed to overcome these challenges, including a battery-operated screw conveyor mechanism being innovated to overcome the above problems. These inventions are numerically evaluated at the concept development level by employing Computational Fluid Dynamics (CFD) of gas-solids multiphase flow systems. CFD has been very helpful in the development of such devices, saving time and money by reducing the number of prototypes and tests. Furthermore, this paper describes a simulation of powder dispensation from the trocar’s end, with the powder considered as a secondary flow in air, using the technique called the Dense Discrete Phase Model incorporated with the Kinetic Theory of Granular Flow (DDPM-KTGF). Considering a powder volume fraction of 50%, the transportation of powder from the inlet side to the trocar’s end is achieved by rotation of the screw conveyor, and the performance is calculated over a 1 s time frame in an unsteady computation. This methodology will help designers develop and improve dispensation design concepts within a quick turnaround time.

Keywords: DDPM-KTGF, gas-solids multiphase flow, screw conveyor, unsteady computation

Procedia PDF Downloads 164
685 A Review of the Factors That Influence on Nutrient Removal in Upflow Filters

Authors: Ali Alzeyadi, Edward Loffill, Rafid Alkhaddar, Ali Alattabi

Abstract:

Phosphate, ammonium, and nitrates are forms of nutrients released from different sources. High nutrient levels contribute to the eutrophication of water bodies by accelerating the extraordinary growth of algae. Recently, many filtration and treatment systems have been developed and used for different removal processes. Due to the enhanced operational aspects of up-flow, continuous, granular media filters, researchers have become more interested in further developing this technology and its performance for nutrient removal from wastewater. Environmental factors significantly affect filtration performance, and understanding their impact will help to maintain the nutrient removal process. Phosphate removal by phosphate sorption materials (PSMs) and biological nitrogen removal are the methods of nutrient removal discussed in this paper; hence, the factors that influence these processes are the scope of this work. The findings show factors affecting both removal processes: the size, shape, and roughness of the filter media particles play a crucial role in supporting biofilm formation, and they also affect the reactivity of the surface between the media and phosphate. Many studies point to factors that significantly influence biological nitrogen removal, such as dissolved oxygen, temperature, and pH, owing to the sensitivity of biological processes, whereas phosphate removal by PSMs appears less affected by these factors. This review helps researchers form a comprehensive approach to studying nutrient removal in up-flow filtration systems.

Keywords: nitrogen biological treatment, nutrients, PSMs, upflow filter, wastewater treatment

Procedia PDF Downloads 295
684 Extra Skeletal Manifestations of Histiocytosis in Pediatrics

Authors: Ayda Youssef, Mohammed Ali Khalaf, Tarek Rafaat

Abstract:

Background: Langerhans cell histiocytosis (LCH) is a rare multi-systemic disease that shows an abnormal proliferation of Langerhans cells, associated with a granular infiltration that affects different structures of the human body, including the lung, liver, spleen, lymph nodes, brain, mucocutaneous tissues, soft tissue (head and neck), and salivary glands. Evaluation of the extent of disease is one of the major predictors of patient outcome. Objectives: To recognize the pathogenesis of Langerhans cell histiocytosis (LCH), describe the radiologic criteria that are suggestive of LCH in organs other than bone, and illustrate the appropriate differential diagnoses for LCH in each of the common extra-osseous sites. Material and methods: A retrospective study was done on 150 biopsy-proven LCH patients from 2007 to 2012. All patients underwent imaging studies, mostly US, CT, and MRI, which were reviewed to assess the extra-skeletal manifestations of LCH. Results: Of the 150 patients with biopsy-proven LCH, there were 33 patients with liver affection, 5 patients with splenic lesions, 55 patients with enlarged lymph nodes, 9 patients with CNS disease, and 11 patients with lung involvement. Conclusions: Because LCH is frequent in children and evaluation of the extent of disease is one of the major predictors of patient outcome, radiologists need to be familiar with its presentation in organs and regions of the body beyond the commonest site of affection (bone). A high index of suspicion should be maintained, and biopsy is recommended in the presence of radiological suspicion. Chemotherapy is the preferred therapeutic modality.

Keywords: langerhans cell histiocytosis, extra-skeletal, pediatrics, radiology

Procedia PDF Downloads 422