Search results for: monitoring networks
4314 Artificial Intelligence for Traffic Signal Control and Data Collection
Authors: Reggie Chandra
Abstract:
Traffic accidents and traffic signal optimization are correlated. However, 70-90% of the traffic signals across the USA are not synchronized. The reason behind that is insufficient resources to create and implement timing plans. In this work, we will discuss the use of a breakthrough Artificial Intelligence (AI) technology to optimize traffic flow and collect 24/7/365 accurate traffic data using a vehicle detection system. We will discuss recent advances in Artificial Intelligence technology, how AI works in vehicle, pedestrian, and bike data collection and in creating timing plans, and what the best workflow is for that. Apart from that, this paper will showcase how Artificial Intelligence makes signal timing affordable. We will introduce a technology that uses Convolutional Neural Networks (CNN) and deep learning algorithms to detect, collect data, develop timing plans and deploy them in the field. Convolutional Neural Networks are a class of deep learning networks inspired by the biological processes in the visual cortex. A neural net is modeled after the human brain. It consists of millions of densely connected processing nodes. It is a form of machine learning where the neural net learns to recognize vehicles through training - which is called Deep Learning. The well-trained algorithm overcomes most of the issues faced by other detection methods and provides nearly 100% traffic data accuracy. Through this continuous learning-based method, we can constantly update traffic patterns, generate an unlimited number of timing plans and thus improve vehicle flow. Convolutional Neural Networks not only outperform other detection algorithms but also, in cases such as classifying objects into fine-grained categories, outperform humans. Safety is of primary importance to traffic professionals, but they don't have the studies or data to support their decisions. Currently, one-third of transportation agencies do not collect pedestrian and bike data. We will discuss how the use of Artificial Intelligence for data collection can help reduce pedestrian fatalities and enhance the safety of all vulnerable road users. Moreover, it provides traffic engineers with tools that allow them to unleash their potential, instead of dealing with constant complaints, a snapshot of limited handpicked data, and multiple systems requiring additional work for adaptation. The methodologies used and proposed in the research contain a camera model identification method based on deep Convolutional Neural Networks. The proposed application was evaluated on our data sets acquired through a variety of daily real-world road conditions and compared with the performance of the commonly used methods requiring data collection by counting, evaluating, and adapting it, running it through well-established algorithms, and then deploying it to the field. This work explores themes such as how technologies powered by Artificial Intelligence can benefit your community and how to translate the complex and often overwhelming benefits into a language accessible to elected officials, community leaders, and the public. Exploring such topics empowers citizens with insider knowledge about the potential of better traffic technology to save lives and improve communities. The synergies that Artificial Intelligence brings to traffic signal control and data collection are unsurpassed. Keywords: artificial intelligence, convolutional neural networks, data collection, signal control, traffic signal
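A minimal Keras sketch of the kind of CNN classifier the abstract describes; the layer sizes, the 64x64 input resolution, and the four example classes are illustrative assumptions, not the architecture actually deployed by the authors.

```python
# Minimal CNN sketch for classifying detector image patches into road-user classes.
# Assumed: 64x64 RGB patches and four illustrative classes (vehicle, pedestrian, bike, background).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4  # vehicle, pedestrian, bike, background (assumed)

model = models.Sequential([
    layers.Input(shape=(64, 64, 3)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_patches, train_labels, epochs=10)  # per-class counts then feed the timing-plan workflow
```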
Procedia PDF Downloads 169
4313 Signal Estimation and Closed Loop System Performance in Atrial Fibrillation Monitoring with Communication Channels
Authors: Mohammad Obeidat, Ayman Mansour
Abstract:
In this paper, a unique issue arising from feedback control of an Atrial Fibrillation monitoring system with embedded communication channels is investigated. One of the important factors to measure the performance of the feedback control closed loop system is the disturbance and noise attenuation factor. It is important that the feedback system can attenuate such disturbances on the atrial fibrillation heart rate signals. Communication channels depend on network traffic conditions and deliver different throughput, implying that the sampling intervals may change. Since signal estimation is updated on the arrival of new data, its dynamics actually change with the sampling interval. Consequently, interaction among sampling, signal estimation, and the controller will introduce new issues in a remotely controlled Atrial Fibrillation system. This paper treats a remotely controlled atrial fibrillation system with one communication channel which connects the heart rate and rhythm measurements to the remote controller. Typical and optimal signal estimation schemes are represented by a signal averaging filter with its time constant derived from the step size of the signal estimation algorithm. Keywords: atrial fibrillation, communication channels, closed loop, estimation
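A minimal sketch of the signal averaging filter mentioned above, assuming a first-order (exponential) averager whose step size mu sets the time constant (roughly tau ≈ Ts/mu for small mu); the sampling interval, step size, and synthetic heart-rate values are illustrative, since channel-dependent sampling is precisely the issue the paper studies.

```python
import numpy as np

def averaging_filter(samples, mu):
    """First-order signal averaging: estimate[k] = estimate[k-1] + mu*(sample[k] - estimate[k-1])."""
    estimate = samples[0]
    out = []
    for y in samples:
        estimate += mu * (y - estimate)
        out.append(estimate)
    return np.array(out)

# Illustrative numbers: noisy heart-rate samples arriving every Ts seconds over the channel.
Ts, mu = 0.5, 0.1                      # assumed sampling interval (s) and step size
tau = Ts / mu                          # approximate filter time constant (s) for small mu
rng = np.random.default_rng(0)
hr = 80 + rng.normal(0, 5, 200)        # synthetic heart-rate signal, bpm
smoothed = averaging_filter(hr, mu)
print(f"time constant ~ {tau:.1f} s, last estimate {smoothed[-1]:.1f} bpm")
```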
Procedia PDF Downloads 378
4312 The Emerging Multi-Species Trap Fishery in the Red Sea Waters of Saudi Arabia
Authors: Nabeel M. Alikunhi, Zenon B. Batang, Aymen Charef, Abdulaziz M. Al-Suwailem
Abstract:
Saudi Arabia has a long history of using traps as a traditional fishing gear for catching commercially important demersal, mainly coral reef-associated fish species. Fish traps constitute the dominant small-scale fisheries in the Saudi waters of the Arabian Gulf (eastern seaboard of Saudi Arabia). Recently, however, traps have been increasingly used along the Saudi Red Sea coast (western seaboard), with a coastline of 1800 km (71%) compared to only 720 km (29%) in the Saudi Gulf region. The production trend for traps indicates a recent increase in catches and percent contribution to traditional fishery landings, thus ascertaining the rapid proliferation of trap fishing along the Saudi Red Sea coast. Reef-associated fish species, mainly groupers (Serranidae), emperors (Lethrinidae), parrotfishes (Scaridae), scads and trevallies (Carangidae), and snappers (Lutjanidae), dominate the trap catches, reflecting the reef-dominated shelf zone in the Red Sea. This ongoing investigation covers the following major objectives: (i) baseline studies to characterize the trap fishery through landing site visits and interview surveys; (ii) stock assessment using fisheries and biological data obtained through monthly landing site monitoring with the FLBEIA fishery operational model; (iii) operational impacts, derelict trap assessment, and by-catch analysis through bottom-mounted video cameras and onboard monitoring; (iv) elucidation of fishing grounds and derelict trap impacts by onboard monitoring, Remotely Operated underwater Vehicle, and Autonomous Underwater Vehicle surveys; and (v) analysis of gear design and operations, which covers colonization and deterioration experiments. The progress of this investigation on the impacts of the trap fishery on fish stocks and the marine environment in the Saudi Red Sea region is presented. Keywords: red sea, Saudi Arabia, fish trap, stock assessment, environmental impacts
Procedia PDF Downloads 350
4311 Normalized P-Laplacian: From Stochastic Game to Image Processing
Authors: Abderrahim Elmoataz
Abstract:
More and more contemporary applications involve data in the form of functions defined on irregular and topologically complicated domains (images, meshes, point clouds, networks, etc.). Such data are not organized as familiar digital signals and images sampled on regular lattices. However, they can be conveniently represented as graphs where each vertex represents measured data and each edge represents a relationship (connectivity or certain affinities or interaction) between two vertices. Processing and analyzing these types of data is a major challenge for both image and machine learning communities. Hence, it is very important to transfer to graphs and networks many of the mathematical tools which were initially developed on usual Euclidean spaces and proven to be efficient for many inverse problems and applications dealing with usual image and signal domains. Historically, the main tools for the study of graphs or networks come from combinatorial and graph theory. In recent years, there has been increasing interest in the investigation of one of the major mathematical tools for signal and image analysis: Partial Differential Equations (PDEs) and variational methods on graphs. The normalized p-Laplacian operator has been recently introduced to model a stochastic game called tug-of-war with noise. Part of the interest of this class of operators arises from the fact that it includes, as particular cases, the infinity Laplacian, the mean curvature operator, and the traditional Laplacian operator, which have been extensively used to model and solve problems in image processing. The purpose of this paper is to introduce and to study a new class of normalized p-Laplacians on graphs. The introduction is based on the extension of p-harmonious functions, introduced as a discrete approximation for both infinity Laplacian and p-Laplacian equations. Finally, we propose to use these operators as a framework for solving many inverse problems in image processing. Keywords: normalized p-laplacian, image processing, stochastic game, inverse problems
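A sketch of the p-harmonious relaxation the abstract builds on: each interior vertex is updated as a convex combination of the min/max and the mean of its neighbors, with the mixing weight alpha standing in for the dependence on p (alpha = 0 recovers the usual graph Laplacian/mean update, alpha = 1 the infinity-Laplacian "tug-of-war" update). The graph, boundary values, and alpha below are illustrative, not the paper's exact normalization.

```python
# p-harmonious relaxation on a graph with Dirichlet boundary values (illustrative sketch).
def p_harmonious(adjacency, boundary, alpha=0.5, iters=500):
    """adjacency: dict vertex -> list of neighbors; boundary: dict vertex -> fixed value."""
    u = {v: boundary.get(v, 0.0) for v in adjacency}
    for _ in range(iters):
        for v in adjacency:
            if v in boundary:
                continue
            nbr = [u[w] for w in adjacency[v]]
            u[v] = alpha * 0.5 * (max(nbr) + min(nbr)) + (1 - alpha) * sum(nbr) / len(nbr)
    return u

# Tiny path graph 0-1-2-3 with values fixed at the two ends.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(p_harmonious(adj, boundary={0: 0.0, 3: 1.0}, alpha=0.5))
```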
Procedia PDF Downloads 512
4310 Static Priority Approach to Under-Frequency Based Load Shedding Scheme in Islanded Industrial Networks: Using the Case Study of Fatima Fertilizer Company Ltd - FFL
Authors: S. H. Kazmi, T. Ahmed, K. Javed, A. Ghani
Abstract:
In this paper, a static scheme of under-frequency based load shedding is considered for chemical and petrochemical industries with islanded distribution networks relying heavily on the primary commodity, to ensure minimum production loss, plant downtime, or critical equipment shutdown. A simple methodology is proposed for in-house implementation of this scheme using under-frequency relays, and a step-by-step guide is provided, including the techniques to calculate maximum percentage overloads, frequency decay rates, time-based frequency response, and frequency-based time response of the system. The case study of the FFL electrical system is utilized, presenting the actual system parameters and the employed load shedding settings following the same series of steps. The arbitrary settings are then verified for worst overload conditions (loss of a generation source in this case), and the comprehensive system response is then investigated. Keywords: islanding, under-frequency load shedding, frequency rate of change, static UFLS
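A sketch of the textbook swing-equation estimate behind the frequency decay rate calculation mentioned above; the overload percentage, inertia constant, and relay threshold below are illustrative placeholders, not FFL's actual settings.

```python
# Initial frequency decay after loss of generation, from the aggregated swing equation:
#   df/dt = -dP * f_nominal / (2 * H), with dP the per-unit overload and H the inertia constant (s).
def frequency_decay_rate(overload_pu, f_nominal=50.0, inertia_h=4.0):
    return -overload_pu * f_nominal / (2.0 * inertia_h)

def time_to_threshold(f_threshold, overload_pu, f_nominal=50.0, inertia_h=4.0):
    """Time to reach an under-frequency relay threshold, assuming a constant decay rate."""
    rate = frequency_decay_rate(overload_pu, f_nominal, inertia_h)
    return (f_threshold - f_nominal) / rate

# Illustrative case: 30% overload on a 50 Hz islanded system, first shedding stage at 48.5 Hz.
print(f"df/dt = {frequency_decay_rate(0.30):.2f} Hz/s")
print(f"time to 48.5 Hz = {time_to_threshold(48.5, 0.30):.2f} s")
```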
Procedia PDF Downloads 486
4309 Green Wave Control Strategy for Optimal Energy Consumption by Model Predictive Control in Electric Vehicles
Authors: Furkan Ozkan, M. Selcuk Arslan, Hatice Mercan
Abstract:
Electric vehicles are becoming increasingly popular as a sustainable alternative to traditional combustion engine vehicles. However, to fully realize the potential of EVs in reducing environmental impact and energy consumption, efficient control strategies are essential. This study explores the application of green wave control using model predictive control for electric vehicles, coupled with energy consumption modeling using neural networks. The use of MPC allows for real-time optimization of the vehicles’ energy consumption while considering dynamic traffic conditions. By leveraging neural networks for energy consumption modeling, the EV's performance can be further enhanced through accurate predictions and adaptive control. The integration of these advanced control and modeling techniques aims to maximize energy efficiency and range while navigating urban traffic scenarios. The findings of this research offer valuable insights into the potential of green wave control for electric vehicles and demonstrate the significance of integrating MPC and neural network modeling for optimizing energy consumption. This work contributes to the advancement of sustainable transportation systems and the widespread adoption of electric vehicles. To evaluate the effectiveness of the green wave control strategy in real-world urban environments, extensive simulations were conducted using a high-fidelity vehicle model and realistic traffic scenarios. The results indicate that the integration of model predictive control and energy consumption modeling with neural networks had a significant impact on the energy efficiency and range of electric vehicles. Through the use of MPC, the electric vehicle was able to adapt its speed and acceleration profile in real time to optimize energy consumption while maintaining travel time objectives. The neural network-based energy consumption modeling provided accurate predictions, enabling the vehicle to anticipate and respond to variations in traffic flow, further enhancing energy efficiency and range. Furthermore, the study revealed that the green wave control strategy not only reduced energy consumption but also improved the overall driving experience by minimizing abrupt acceleration and deceleration, leading to a smoother and more comfortable ride for passengers. These results demonstrate the potential for green wave control to revolutionize urban transportation by enhancing the performance of electric vehicles and contributing to a more sustainable and efficient mobility ecosystem. Keywords: electric vehicles, energy efficiency, green wave control, model predictive control, neural networks
Procedia PDF Downloads 54
4308 Finding the Optimal Meeting Point Based on Travel Plans in Road Networks
Authors: Mohammad H. Ahmadi, Vahid Haghighatdoost
Abstract:
Given a set of source locations for a group of friends, a set of trip plans for each group member as a sequence of Categories-of-Interest (COIs) (e.g., restaurant), and finally a specific COI as a common destination at which all group members will gather, the goal in Meeting Point Based on Trip Plans (MPTP) queries is to find a Point-of-Interest (POI) from the different COIs such that the aggregate travel distance for the group is minimized. In this work, we considered two aggregate functions: Sum and Max. For solving this query, we propose an efficient pruning technique for shrinking the search space. Our approach contains three steps. In the first step, it prunes the search space around the source locations. In the second step, it prunes the search space around the centroid of the source locations. Finally, we compute the intersection of all pruned areas as the final refined search space. We prove that the POIs beyond the refined area cannot be part of the optimal answer set. The paper also covers an extensive performance study of the proposed technique. Keywords: meeting point, trip plans, road networks, spatial databases
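A brute-force sketch of the MPTP objective for the two aggregates considered above (Sum and Max), using straight-line distance as a stand-in for road-network travel distance; the paper's contribution is the pruning that avoids scanning every candidate, which is omitted here, and the coordinates are illustrative.

```python
import math

def aggregate_meeting_point(sources, candidate_pois, aggregate="sum"):
    """Return the candidate POI minimizing the aggregate travel distance from all sources."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])  # stand-in for road-network distance

    agg = sum if aggregate == "sum" else max
    return min(candidate_pois, key=lambda poi: agg(dist(s, poi) for s in sources))

sources = [(0, 0), (4, 0), (2, 5)]                 # group members' locations (illustrative)
restaurants = [(1, 1), (3, 2), (2, 4), (5, 5)]     # POIs of the destination COI (illustrative)
print("Sum-optimal:", aggregate_meeting_point(sources, restaurants, "sum"))
print("Max-optimal:", aggregate_meeting_point(sources, restaurants, "max"))
```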
Procedia PDF Downloads 185
4307 Using the Weakest Precondition to Achieve Self-Stabilization in Critical Networks
Authors: Antonio Pizzarello, Oris Friesen
Abstract:
Networks, such as the electric power grid, must demonstrate exemplary performance and integrity. Integrity depends on the quality of both the system design model and the deployed software. Integrity of the deployed software is key, for both the original versions and the many that occur throughout numerous maintenance activities. Current software engineering technology and practice do not produce adequate integrity. Distributed systems utilize networks where each node is an independent computer system. The connections between them are realized via a network that is normally redundantly connected to guarantee the presence of a path between two nodes in the case of failure of some branch. Furthermore, at each node, there is software which may fail. Self-stabilizing protocols are usually present that recognize failure in the network and perform a repair action that will bring the node back to a correct state. These protocols, first introduced by E. W. Dijkstra, are currently present in almost all Ethernets. Superstabilization protocols capable of reacting to a change in the network topology due to the removal or addition of a branch in the network are less common but are theoretically defined and available. This paper describes how to use the Software Integrity Assessment (SIA) methodology to analyze self-stabilizing software. SIA is based on the UNITY formalism for parallel and distributed programming, which allows the analysis of code for verifying the progress property p leads-to q that describes the progress of all computations starting in a state satisfying p to a state satisfying q via the execution of one or more system modules. As opposed to demonstrably inadequate test and evaluation methods, SIA allows the analysis and verification of any network self-stabilizing software as well as any other software that is designed to recover from failure without external intervention of maintenance personnel. The model to be analyzed is obtained by automatic translation of the system code to a transition system that is based on the use of the weakest precondition. Keywords: network, power grid, self-stabilization, software integrity assessment, UNITY, weakest precondition
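As a standard textbook illustration of the weakest-precondition calculus underlying the translation (not an excerpt from the SIA tool itself):

```latex
% Standard wp rules (textbook form), not an excerpt from the SIA tool:
\[
wp(x := E,\; Q) = Q[x := E], \qquad
wp(\text{if } B \text{ then } S,\; Q) = (B \Rightarrow wp(S, Q)) \wedge (\neg B \Rightarrow Q).
\]
% Example: wp(x := x + 1,  x > 0) = (x + 1 > 0) = (x > -1).
% UNITY progress: p \mapsto q ("p leads-to q") -- every computation that reaches a state
% satisfying p eventually reaches a state satisfying q.
```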
Procedia PDF Downloads 223
4306 A Machine Learning Approach for Efficient Resource Management in Construction Projects
Authors: Soheila Sadeghi
Abstract:
Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming, subjective, and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.Keywords: resource allocation, machine learning, optimization, data-driven decision-making, project management
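A minimal scikit-learn sketch of the Random Forest part of the methodology described above; the feature names (scope-change count, material delay days, etc.) and the CSV layout are hypothetical stand-ins for the case-study dataset, which is not reproduced here.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Hypothetical columns standing in for the case-study data.
df = pd.read_csv("project_activities.csv")
features = ["planned_cost", "scope_change_count", "material_delay_days", "crew_size", "duration_days"]
X, y = df[features], df["cost_overrun"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = RandomForestRegressor(n_estimators=300, random_state=42)
model.fit(X_train, y_train)

print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
# Feature importances point to the dominant cost drivers (e.g., scope changes, material delays).
for name, imp in sorted(zip(features, model.feature_importances_), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```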
Procedia PDF Downloads 40
4305 Deep-Learning to Generation of Weights for Image Captioning Using Part-of-Speech Approach
Authors: Tiago do Carmo Nogueira, Cássio Dener Noronha Vinhal, Gélson da Cruz Júnior, Matheus Rudolfo Diedrich Ullmann
Abstract:
Generating automatic image descriptions through natural language is a challenging task. Image captioning is a task that consistently describes an image by combining computer vision and natural language processing techniques. To accomplish this task, cutting-edge models use encoder-decoder structures. Thus, Convolutional Neural Networks (CNN) are used to extract the characteristics of the images, and Recurrent Neural Networks (RNN) generate the descriptive sentences of the images. However, cutting-edge approaches still suffer from problems of generating incorrect captions and accumulating errors in the decoders. To solve this problem, we propose a model based on the encoder-decoder structure, introducing a module that generates the weights according to the importance of the word to form the sentence, using the part-of-speech (PoS). Thus, the results demonstrate that our model surpasses state-of-the-art models.Keywords: gated recurrent units, caption generation, convolutional neural network, part-of-speech
Procedia PDF Downloads 102
4304 Global Mittag-Leffler Stability of Fractional-Order Bidirectional Associative Memory Neural Network with Discrete and Distributed Transmission Delays
Authors: Swati Tyagi, Syed Abbas
Abstract:
Fractional-order Hopfield neural networks are generally used to model the information processing among the interacting neurons. To show the constancy of the processed information, it is required to analyze the stability of these systems. In this work, we perform Mittag-Leffler stability analysis for the corresponding Caputo fractional-order bidirectional associative memory (BAM) neural networks with various time delays. We derive sufficient conditions to ensure the existence and uniqueness of the equilibrium point by using topological degree theory. By applying the fractional Lyapunov method and Mittag-Leffler functions, we derive sufficient conditions for the global Mittag-Leffler stability, which further imply the global asymptotic stability of the network equilibrium. Finally, we present two suitable examples to show the effectiveness of the obtained results. Keywords: bidirectional associative memory neural network, existence and uniqueness, fractional-order, Lyapunov function, Mittag-Leffler stability
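For reference, the Mittag-Leffler functions and the stability notion used above take the standard form stated in the fractional-order stability literature:

```latex
% One- and two-parameter Mittag-Leffler functions:
\[
E_{\alpha}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + 1)}, \qquad
E_{\alpha,\beta}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + \beta)}, \qquad \alpha, \beta > 0.
\]
% Mittag-Leffler stability of the equilibrium x = 0: solutions satisfy
\[
\| x(t) \| \le \Big\{ m\big(x(t_0)\big)\, E_{\alpha}\!\big(-\lambda (t - t_0)^{\alpha}\big) \Big\}^{b},
\quad \lambda > 0,\; b > 0,
\]
% with m(0) = 0 and m(x) >= 0 locally Lipschitz; this in turn implies asymptotic stability.
```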
Procedia PDF Downloads 365
4303 Web and Smart Phone-based Platform Combining Artificial Intelligence and Satellite Remote Sensing Data to Geoenable Villages for Crop Health Monitoring
Authors: Siddhartha Khare, Nitish Kr Boro, Omm Animesh Mishra
Abstract:
Recent food price hikes may signal the end of an era of predictable global grain crop plenty due to climate change, population expansion, and dietary changes. Food consumption will treble in 20 years, requiring enormous production expenditures. The climate and the atmosphere have changed owing to rainfall and seasonal cycles in the past decade. India's tropical agriculture relies on evapotranspiration and monsoons. In places with limited resources, global environmental change affects agricultural productivity and farmers' capacity to adjust to changing moisture patterns. Motivated by these difficulties, satellite remote sensing might be combined with near-surface imaging data (smartphones, UAVs, and PhenoCams) to enable phenological monitoring and fast evaluations of field-level consequences of extreme weather events on smallholder agriculture output. To accomplish this technique, we must digitally map all communities' agricultural boundaries and crop kinds. With the improvement of satellite remote sensing technologies, a geo-referenced database may be created for rural Indian agriculture fields. Using AI, we can design digital agricultural solutions for individual farms. The main objective is to geo-enable each farm along with its seasonal crop information by combining Artificial Intelligence (AI) with satellite and near-surface data and then carry out long-term crop monitoring through in-depth field analysis and scanning of fields with satellite-derived vegetation indices. We developed an AI-based algorithm to understand the time-lapse-based growth of vegetation using PhenoCam or smartphone-based images. We developed an Android platform where users can collect images of their fields through the Android application. These images will be sent to our local server, and further AI-based processing will then be done at our server. We are creating digital boundaries of individual farms and connecting these farms with our smartphone application to collect information about farmers and their crops in each season. We are extracting satellite-based information for each farm from Google Earth Engine APIs, merging these data with the crop data collected through our app according to each farm’s location, and creating a database that will provide information on crop quality at each location. Keywords: artificial intelligence, satellite remote sensing, crop monitoring, android and web application
Procedia PDF Downloads 100
4302 Synchronization of Two Mobile Robots
Authors: R. M. López-Gutiérrez, J. A. Michel-Macarty, H. Cervantes-De Avila, J. I. Nieto-Hipólito, C. Cruz-Hernández, L. Cardoza-Avendaño, S. Cortiant-Velez
Abstract:
It is well known that mankind benefits from the application of robot control by virtual handlers in industrial environments. In recent years, great interest has emerged in the control of multiple robots in order to carry out collective tasks. One main trend is to copy the natural organization that some organisms have, such as ants, bees, schools of fish, bird migration, etc. Surely, this collaborative work results in better outcomes than those obtained in an isolated or individual effort. This topic has great momentum because collaboration between several robots has the potential capability of carrying out more complicated tasks with better efficiency, resiliency, and fault tolerance, in cases such as coordinated navigation towards a target, terrain exploration, and search-and-rescue operations. In this work, synchronization of multiple autonomous robots is shown over a variety of coupling topologies: star, ring, chain, and global. In all cases, collective synchronous behavior is achieved in the complex networks formed with mobile robots. Nodes of these networks are modeled as masses and simulated in Matlab. Keywords: robots, synchronization, bidirectional, coordinate navigation
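A minimal numpy sketch of the diffusive-coupling idea behind these synchronization results: agents coupled through a network Laplacian converge to a common state. The first-order agent model, coupling gain, and ring topology below are deliberate simplifications of the mass models the authors simulate in Matlab.

```python
import numpy as np

def ring_laplacian(n):
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i - 1) % n] = A[i, (i + 1) % n] = 1.0   # ring coupling topology
    return np.diag(A.sum(axis=1)) - A

n, c, dt, steps = 5, 1.0, 0.01, 2000
L = ring_laplacian(n)
x = np.random.default_rng(1).uniform(-1, 1, n)        # initial robot states (e.g., headings)

for _ in range(steps):                                # x_dot = -c * L @ x (diffusive coupling)
    x = x - dt * c * (L @ x)

print("spread after coupling:", np.ptp(x))            # close to 0: the states have synchronized
```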
Procedia PDF Downloads 358
4301 Assessment the Implications of Regional Transport and Local Emission Sources for Mitigating Particulate Matter in Thailand
Authors: Ruchirek Ratchaburi, W. Kevin. Hicks, Christopher S. Malley, Lisa D. Emberson
Abstract:
Air pollution problems in Thailand have improved over the last few decades, but in some areas, concentrations of coarse particulate matter (PM₁₀) are above health and regulatory guidelines. It is, therefore, useful to investigate how PM₁₀ varies across Thailand, what conditions cause this variation, and how could PM₁₀ concentrations be reduced. This research uses data collected by the Thailand Pollution Control Department (PCD) from 17 monitoring sites, located across 12 provinces, and obtained between 2011 and 2015 to assess PM₁₀ concentrations and the conditions that lead to different levels of pollution. This is achieved through exploration of air mass pathways using trajectory analysis, used in conjunction with the monitoring data, to understand the contribution of different months, an hour of the day and source regions to annual PM₁₀ concentrations in Thailand. A focus is placed on locations that exceed the national standard for the protection of human health. The analysis shows how this approach can be used to explore the influence of biomass burning on annual average PM₁₀ concentration and the difference in air pollution conditions between Northern and Southern Thailand. The results demonstrate the substantial contribution that open biomass burning from agriculture and forest fires in Thailand and neighboring countries make annual average PM₁₀ concentrations. The analysis of PM₁₀ measurements at monitoring sites in Northern Thailand show that in general, high concentrations tend to occur in March and that these particularly high monthly concentrations make a substantial contribution to the overall annual average concentration. In 2011, a > 75% reduction in the extent of biomass burning in Northern Thailand and in neighboring countries resulted in a substantial reduction not only in the magnitude and frequency of peak PM₁₀ concentrations but also in annual average PM₁₀ concentrations at sites across Northern Thailand. In Southern Thailand, the annual average PM₁₀ concentrations for individual years between 2011 and 2015 did not exceed the human health standard at any site. The highest peak concentrations in Southern Thailand were much lower than for Northern Thailand for all sites. The peak concentrations at sites in Southern Thailand generally occurred between June and October and were associated with air mass back trajectories that spent a substantial proportion of time over the sea, Indonesia, Malaysia, and Thailand prior to arrival at the monitoring sites. The results show that emissions reductions from biomass burning and forest fires require action on national and international scales, in both Thailand and neighboring countries, such action could contribute to ensuring compliance with Thailand air quality standards.Keywords: annual average concentration, long-range transport, open biomass burning, particulate matter
Procedia PDF Downloads 183
4300 Ultra Reliable Communication: Availability Analysis in 5G Cellular Networks
Authors: Yosra Benchaabene, Noureddine Boujnah, Faouzi Zarai
Abstract:
To meet the growing demand of users, the fifth generation (5G) will continue to provide services at higher data rates with higher carrier frequencies and wider bandwidths. As part of the 5G communication paradigm, Ultra Reliable Communication (URC) is envisaged as an important technology pillar for providing anywhere and anytime services to end users. Ultra Reliable Communication (URC) is considered an important technology, which is why it has become an active research topic. In this work, we analyze the availability of a service in the space domain. We characterize spatially available areas consisting of all locations that meet a performance requirement with confidence, and we define cell availability, system availability, individual user availability, and user-oriented system availability. A Poisson point process (PPP) and Voronoi tessellation are adopted to model the spatial characteristics of a cell deployment in heterogeneous networks. Numerical results are presented, also highlighting the effect of different system parameters on the achievable link availability. Keywords: URC, dependability and availability, space domain analysis, Poisson point process, Voronoi Tessellation
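A Monte Carlo sketch of the space-domain availability idea: base stations are dropped as a Poisson point process, each location is associated with its nearest station (which is exactly the Voronoi-cell assignment), and availability is the fraction of locations meeting a coverage requirement. The density, coverage radius, and area size are illustrative values, not the paper's parameters.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(7)
side, lam, r_max = 10_000.0, 2e-6, 300.0   # area side (m), BS density (1/m^2), coverage radius (m) -- illustrative
n_bs = rng.poisson(lam * side * side)
bs = rng.uniform(0, side, size=(n_bs, 2))  # PPP: Poisson number of points, uniformly placed
users = rng.uniform(0, side, size=(50_000, 2))

d, _ = cKDTree(bs).query(users)            # nearest-BS distance = Voronoi-cell association
availability = np.mean(d <= r_max)         # fraction of locations meeting the requirement
print(f"{n_bs} BSs, empirical availability = {availability:.3f}, "
      f"analytic 1 - exp(-pi*lam*r^2) = {1 - np.exp(-np.pi * lam * r_max**2):.3f}")
```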
Procedia PDF Downloads 122
4299 Analysis of Vibratory Signals Based on Local Mean Decomposition (LMD) for Rolling Bearing Fault Diagnosis
Authors: Toufik Bensana, Medkour Mihoub, Slimane Mekhilef
Abstract:
The use of vibration analysis has been established as the most common and reliable method of analysis in the field of condition monitoring and diagnostics of rotating machinery. Rolling bearings cover a broad range of rotary machines and play a crucial role in the modern manufacturing industry. Unfortunately, the vibration signals collected from a faulty bearing are generally nonstationary, nonlinear, and contaminated by strong noise interference, so it is essential to obtain the fault features correctly. In this paper, a novel numerical analysis method based on local mean decomposition (LMD) is proposed. LMD decomposes the signal into a series of product functions (PFs), each of which is the product of an envelope signal and a purely frequency modulated (FM) signal. The envelope of a PF is the instantaneous amplitude (IA), and the derivative of the unwrapped phase of the purely frequency modulated (FM) signal is the instantaneous frequency (IF). After that, the fault characteristic frequency of the roller bearing can be extracted by performing spectrum analysis on the instantaneous amplitude of the PF component containing the dominant fault information. The results show the effectiveness of the proposed technique in fault detection and diagnosis of rolling element bearings. Keywords: fault diagnosis, rolling element bearing, local mean decomposition, condition monitoring
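A sketch of the final step described above (spectrum analysis of an instantaneous amplitude), using a Hilbert-transform envelope of a synthetic impacting signal as a stand-in for the envelope of the dominant PF; the LMD sifting itself, and the real bearing data, are not reproduced here, and the fault and resonance frequencies are made-up examples.

```python
import numpy as np
from scipy.signal import hilbert

fs, f_fault, f_res = 12_000, 100.0, 3_000.0        # sample rate, fault and resonance freqs (illustrative)
t = np.arange(0, 1.0, 1 / fs)
impacts = (np.sin(2 * np.pi * f_fault * t) > 0.999).astype(float)   # crude periodic impact train
ring = np.exp(-t[:200] * 800) * np.sin(2 * np.pi * f_res * t[:200]) # decaying resonance response
signal = np.convolve(impacts, ring, "same") + 0.2 * np.random.default_rng(0).normal(size=t.size)

envelope = np.abs(hilbert(signal))                  # instantaneous amplitude (stand-in for a PF envelope)
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)

peak = freqs[np.argmax(spectrum[freqs < 500])]      # look for the fault characteristic frequency
print(f"dominant envelope-spectrum peak ~ {peak:.1f} Hz (expected near {f_fault} Hz)")
```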
Procedia PDF Downloads 390
4298 Monitoring of Serological Test of Blood Serum in Indicator Groups of the Population of Central Kazakhstan
Authors: Praskovya Britskaya, Fatima Shaizadina, Alua Omarova, Nessipkul Alysheva
Abstract:
Planned preventive vaccination, which is carried out in the Republic of Kazakhstan, has promoted a permanent decrease in the incidence of measles and viral hepatitis B. People of young, working age prevail in the structure of VHB patients. Monitoring of infectious incidence, monitoring of immunization coverage of the population, and random serological control of immunity enable well-timed identification of the distribution of the causative agent, assessment of the effectiveness of the measures taken, and forecasting. Serological blood analysis was conducted in indicator groups of the population of Central Kazakhstan for the purpose of identifying antibody titres for vaccine-preventable infections (measles, viral hepatitis B). Measles antibodies were determined by the enzyme-linked assay (ELA) method with the 'VektoKor' IgG test system ('Vektor-Best' JSC). Antibodies to the HBs antigen of hepatitis B virus in blood serum were identified by the enzyme-linked assay (ELA) method with the 'VektoHBsAg – antibodies' test system ('Vektor-Best' JSC). The result of the analysis is considered positive if the concentration of IgG to measles virus in the studied sample is equal to 0.18 IU/ml or more. The protective level of anti-HBsAg concentration is 10 mIU/ml. The results of the study of postvaccinal measles immunity showed that seropositive people made up 87.7% of the total number surveyed. The level of postvaccinal immunity to measles differs between age groups. Among people older than 56, the percentage of seropositive individuals was 95.2%. Among people aged 15-25, 87.0% were seropositive, and at the age of 36-45, 86.6%. In the age groups of 25-35 and 36-45, the share of seropositive people was approximately at the same level – 88.5% and 88.8%, respectively. The share of people seronegative to the measles virus was 12.3%. The biggest share of seronegative people was found among those aged 36-45 (13.4%) and 15-25 (13.0%). The analysis of the examined people for the existence of postvaccinal immunity to viral hepatitis B showed that, of all those surveyed, only 33.5% had a protective anti-HBsAg concentration of 10 mIU/ml or more. The biggest share of people protected from the VHB virus was observed in the age group of 36-45 and was 60%. In the indicator group above 56, seropositive people made up 4.8%. A high percentage of seronegative people was observed in all studied age groups, from 40.0% to 95.2%. The group least protected from contracting VHB is people above 56 (95.2%). The probability of contracting VHB is also high among young people aged 25-35, where the percentage of seronegative people was 80%. Thus, the results of the conducted research testify to the need for serological monitoring of postvaccinal immunity for the purpose of operational assessment of the epidemiological situation, early identification of its changes, and prediction of approaching danger. Keywords: antibodies, blood serum, immunity, immunoglobulin
Procedia PDF Downloads 255
4297 An Attentional Bi-Stream Sequence Learner (AttBiSeL) for Credit Card Fraud Detection
Authors: Amir Shahab Shahabi, Mohsen Hasirian
Abstract:
Modern societies, marked by expansive Internet connectivity and the rise of e-commerce, are now integrated with digital platforms at an unprecedented level. The efficiency, speed, and accessibility of e-commerce have garnered a substantial consumer base. Against this backdrop, electronic banking has undergone rapid proliferation within the realm of online activities. However, this growth has inadvertently given rise to an environment conducive to illicit activities, notably electronic payment fraud, posing a formidable challenge to the domain of electronic banking. A pivotal role in upholding the integrity of electronic commerce and business transactions is played by electronic fraud detection, particularly in the context of credit cards which underscores the imperative of comprehensive research in this field. To this end, our study introduces an Attentional Bi-Stream Sequence Learner (AttBiSeL) framework that leverages attention mechanisms and recurrent networks. By incorporating bidirectional recurrent layers, specifically bidirectional Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) layers, the proposed model adeptly extracts past and future transaction sequences while accounting for the temporal flow of information in both directions. Moreover, the integration of an attention mechanism accentuates specific transactions to varying degrees, as manifested in the output of the recurrent networks. The effectiveness of the proposed approach in automatic credit card fraud classification is evaluated on the European Cardholders' Fraud Dataset. Empirical results validate that the hybrid architectural paradigm presented in this study yields enhanced accuracy compared to previous studies.Keywords: credit card fraud, deep learning, attention mechanism, recurrent neural networks
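A compressed Keras sketch of the bi-stream idea described above (bidirectional LSTM and GRU layers followed by an attention-weighted pooling over the transaction sequence); the sequence length, feature count, and layer widths are assumptions, and the real AttBiSeL architecture may differ in detail.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

SEQ_LEN, N_FEATURES = 30, 16                    # assumed: 30 past transactions, 16 features each

inp = layers.Input(shape=(SEQ_LEN, N_FEATURES))
x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(inp)
x = layers.Bidirectional(layers.GRU(32, return_sequences=True))(x)

# Attention: score each time step, softmax over time, weighted sum of the sequence.
scores = layers.Dense(1)(x)
weights = layers.Softmax(axis=1)(scores)
context = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([weights, x])

out = layers.Dense(1, activation="sigmoid")(context)   # fraud probability
model = Model(inp, out)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC()])
model.summary()
```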
Procedia PDF Downloads 18
4296 Harnessing Artificial Intelligence and Machine Learning for Advanced Fraud Detection and Prevention
Authors: Avinash Malladhi
Abstract:
Forensic accounting is a specialized field that involves the application of accounting principles, investigative skills, and legal knowledge to detect and prevent fraud. With the rise of big data and technological advancements, artificial intelligence (AI) and machine learning (ML) algorithms have emerged as powerful tools for forensic accountants to enhance their fraud detection capabilities. In this paper, we review and analyze various AI/ML algorithms that are commonly used in forensic accounting, including supervised and unsupervised learning, deep learning, natural language processing, Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), Support Vector Machines (SVMs), Decision Trees, and Random Forests. We discuss their underlying principles, strengths, and limitations and provide empirical evidence from existing research studies demonstrating their effectiveness in detecting financial fraud. We also highlight potential ethical considerations and challenges associated with using AI/ML in forensic accounting. Furthermore, we emphasize the benefits of these technologies in improving fraud detection and prevention in forensic accounting. Keywords: AI, machine learning, forensic accounting & fraud detection, anti money laundering, Benford's law, fraud triangle theory
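Since Benford's law appears among the keywords, here is a short sketch of the first-digit test often paired with these models; the transaction amounts are synthetic, and the chi-square cut-off should be chosen according to the analyst's own methodology.

```python
import numpy as np
from scipy.stats import chisquare

def first_digit_counts(amounts):
    digits = np.array([int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a != 0])
    return np.array([(digits == d).sum() for d in range(1, 10)])

benford_p = np.log10(1 + 1 / np.arange(1, 10))        # expected first-digit proportions
amounts = np.random.default_rng(0).lognormal(mean=6, sigma=1.2, size=5000)  # synthetic ledger
observed = first_digit_counts(np.round(amounts, 2))
expected = benford_p * observed.sum()

stat, p_value = chisquare(observed, expected)
print("observed proportions:", np.round(observed / observed.sum(), 3))
print(f"chi-square = {stat:.1f}, p = {p_value:.3f}")   # a low p-value flags deviation from Benford's law
```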
Procedia PDF Downloads 93
4295 The Effect of Environmental, Social, and Governance (ESG) Disclosure on Firms’ Credit Rating and Capital Structure
Authors: Heba Abdelmotaal
Abstract:
This paper explores the impact of the extent of a company's environmental, social, and governance (ESG) disclosure on credit rating and capital structure. The analysis is based on a sample of 202 firms from the FTSE 350 over the period 2008-2013. The ESG disclosure score is measured using the proprietary Bloomberg score, based on the extent of a company's Environmental, Social, and Governance (ESG) disclosure. The credit rating is measured by The QuiScore, which is a measure of the likelihood that a company will become bankrupt in the twelve months following the date of calculation. Capital structure is measured by the long-term debt ratio. Two hypotheses are tested using panel data regression. The results suggest that a higher degree of ESG disclosure leads to a better credit rating. There is a significant negative relationship between ESG disclosure and the long-term debt percentage. The paper includes implications for transparency: the transparency resulting from ESG disclosure could support the monitoring function. The monitoring role of disclosure lies in increasing transparency for credit rating agencies, and it could also affect managers’ actions. This study provides empirical evidence on the materiality of ESG disclosure for credit rating changes and firms’ capital structure decision-making. Keywords: capital structure, credit rating agencies, ESG disclosure, panel data regression
Procedia PDF Downloads 360
4294 Deep Learning Based Unsupervised Sport Scene Recognition and Highlights Generation
Authors: Ksenia Meshkova
Abstract:
With the increasing amount of multimedia data, it is very important to automate and speed up the process of obtaining metadata. This process means not just recognition of some object or its movement, but recognition of the entire scene versus separate frames, and having timeline segmentation as a final result. Labeling datasets is time-consuming; besides, attributing characteristics to particular scenes is clearly difficult due to their nature. In this article, we will consider the application of autoencoders to unsupervised scene recognition and clusterization based on interpretable features. Further, we will focus on the particular types of autoencoders that are relevant to our study. We will take a look at the specificity of deep learning related to information theory and rate-distortion theory and describe the solutions addressing the poor interpretability of deep learning in media content processing. In conclusion, we will present the results of a custom framework, based on autoencoders, capable of scene recognition as studied above, with highlights generation resulting from this recognition. We will not describe in detail the mathematics of how neural networks work but will clarify the necessary concepts and pay attention to important nuances. Keywords: neural networks, computer vision, representation learning, autoencoders
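A minimal Keras sketch of the pipeline implied above: a dense autoencoder compresses per-frame feature vectors, and the latent codes are clustered to group frames into scenes. The feature dimensionality, latent size, and number of clusters are assumptions, and the random arrays stand in for real frame features.

```python
import numpy as np
from tensorflow.keras import layers, Model
from sklearn.cluster import KMeans

FRAME_DIM, LATENT_DIM, N_SCENES = 2048, 32, 6        # assumed sizes

inp = layers.Input(shape=(FRAME_DIM,))
z = layers.Dense(256, activation="relu")(inp)
z = layers.Dense(LATENT_DIM, activation="relu", name="latent")(z)
out = layers.Dense(256, activation="relu")(z)
out = layers.Dense(FRAME_DIM, activation="linear")(out)

autoencoder = Model(inp, out)
encoder = Model(inp, z)
autoencoder.compile(optimizer="adam", loss="mse")

frames = np.random.rand(1000, FRAME_DIM).astype("float32")   # stand-in for per-frame features
autoencoder.fit(frames, frames, epochs=5, batch_size=64, verbose=0)

codes = encoder.predict(frames, verbose=0)
scene_labels = KMeans(n_clusters=N_SCENES, n_init=10).fit_predict(codes)  # unsupervised scene groups
# Consecutive frames sharing a label form segments; segment boundaries give the timeline, and
# segments ranked by a chosen score can be exported as highlights.
```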
Procedia PDF Downloads 127
4293 Therapeutic Drug Monitoring by Dried Blood Spot and LC-MS/MS: Novel Application to Carbamazepine and Its Metabolite in Paediatric Population
Authors: Giancarlo La Marca, Engy Shokry, Fabio Villanelli
Abstract:
Epilepsy is one of the most common neurological disorders, with an estimated prevalence of 50 million people worldwide. Twenty five percent of the epilepsy population is represented in children under the age of 15 years. For antiepileptic drugs (AED), there is a poor correlation between plasma concentration and dose especially in children. This was attributed to greater pharmacokinetic variability than adults. Hence, therapeutic drug monitoring (TDM) is recommended in controlling toxicity while drug exposure is maintained. Carbamazepine (CBZ) is a first-line AED and the drug of first choice in trigeminal neuralgia. CBZ is metabolised in the liver into carbamazepine-10,11-epoxide (CBZE), its major metabolite which is equipotent. This develops the need for an assay able to monitor the levels of both CBZ and CBZE. The aim of the present study was to develop and validate a LC-MS/MS method for simultaneous quantification of CBZ and CBZE in dried blood spots (DBS). DBS technique overcomes many logistical problems, ethical issues and technical challenges faced by classical plasma sampling. LC-MS/MS has been regarded as superior technique over immunoassays and HPLC/UV methods owing to its better specificity and sensitivity, lack of interference or matrix effects. Our method combines advantages of DBS technique and LC-MS/MS in clinical practice. The extraction process was done using methanol-water-formic acid (80:20:0.1, v/v/v). The chromatographic elution was achieved by using a linear gradient with a mobile phase consisting of acetonitrile-water-0.1% formic acid at a flow rate of 0.50 mL/min. The method was linear over the range 1-40 mg/L and 0.25-20 mg/L for CBZ and CBZE respectively. The limit of quantification was 1.00 mg/L and 0.25 mg/L for CBZ and CBZE, respectively. Intra-day and inter-day assay precisions were found to be less than 6.5% and 11.8%. An evaluation of DBS technique was performed, including effect of extraction solvent, spot homogeneity and stability in DBS. Results from a comparison with the plasma assay are also presented. The novelty of the present work lies in being the first to quantify CBZ and its metabolite from only one 3.2 mm DBS disc finger-prick sample (3.3-3.4 µl blood) by LC-MS/MS in a 10 min. chromatographic run.Keywords: carbamazepine, carbamazepine-10, 11-epoxide, dried blood spots, LC-MS/MS, therapeutic drug monitoring
Procedia PDF Downloads 417
4292 Design and Developing the Infrared Sensor for Detection and Measuring Mass Flow Rate in Seed Drills
Authors: Bahram Besharti, Hossein Navid, Hadi Karimi, Hossein Behfar, Iraj Eskandari
Abstract:
Multiple or missed sowing by seed drills is a common problem on the farm. This problem causes overuse of seeds, wasted energy, increased crop treatment cost, and reduced crop yield at harvest. To be informed of the mentioned faults and to monitor the performance of seed drills during sowing, developing a seed sensor for detecting and monitoring seed mass flow rate in a delivery tube is essential. In this research, an infrared seed sensor was developed to estimate seed mass flow rate in seed drills. The developed sensor comprised a pair of spaced-apart circuits, one acting as an IR transmitter and the other acting as an IR receiver. Optical coverage in the sensing section was obtained by setting IR LEDs and photo-diodes directly on opposite sides. Passing seeds interrupted the radiation beams to the photo-diodes, which caused the output voltages to change. The voltage differences of the sensing units were summed by a microcontroller and converted to an analog value by a DAC chip. The sensor was tested using a roller seed metering device with three types of seeds, consisting of chickpea, wheat, and alfalfa (representing large, medium, and fine seed, respectively). The results revealed a good fit between the voltage received from the seed sensor and the mass flow of seeds in the delivery tube. A linear trend line was fitted to the data collected for the three seeds as a model of seed mass flow. A final mass flow model was developed for seeds of various sizes based on the voltages received from the seed sensor, the thousand-seed weight, and the equivalent diameter of the seeds. The developed infrared seed sensor, besides monitoring the mass flow of seeds in field operations, can be used for the assessment of mechanical planter seed metering unit performance in the laboratory and provides an easy calibration method for seed drills before planting in the field. Keywords: seed flow, infrared, seed sensor, seed drills
Procedia PDF Downloads 366
4291 A Study on Vulnerability of Alahsa Governorate to Generate Urban Heat Islands
Authors: Ilham S. M. Elsayed
Abstract:
The purpose of this study is to investigate the status of Alahsa Governorate and its vulnerability to generating urban heat islands. Alahsa Governorate is a famous oasis in the Arabian Peninsula that includes several oil centers. An extensive literature review was done to collect previous relevant data on the urban heat island of Alahsa Governorate. Data used for the purpose of this research were collected from authorized bodies who control weather station networks over Alahsa Governorate, Eastern Province, Saudi Arabia. Although the number of weather stations within the region is very limited and the analysis using GIS software and its techniques is difficult and limited, the data analyzed confirm an increase in temperature of more than 2 °C from 2004 to 2014. Such an increase is considerable whenever human health and comfort are the concern. The increase in temperature within one decade confirms the presence of urban heat islands. The study concludes that Alahsa Governorate is vulnerable to creating urban heat islands and that more attention should be drawn to strategic planning of the governorate, which is developing at a high pace with considerably increasing levels of urbanization. Keywords: Alahsa Governorate, population density, Urban Heat Island, weather station
Procedia PDF Downloads 250
4290 A Distributed Smart Battery Management System – sBMS, for Stationary Energy Storage Applications
Authors: António J. Gano, Carmen Rangel
Abstract:
Currently, electric energy storage systems for stationary applications have attracted increasing interest, namely with the integration of local renewable energy power sources into energy communities. Li-ion batteries are considered the leading electric storage devices to achieve this integration, and Battery Management Systems (BMS) are decisive for their control and optimum performance. In this work, the development of a smart BMS (sBMS) prototype with a modular distributed topology is described. The system, still under development, has a distributed architecture with modular characteristics to operate with different battery pack topologies and charge capacities, integrating adaptive algorithms for functional state real-time monitoring and management of multicellular Li-ion batteries, and is intended for application in the context of a local energy community fed by renewable energy sources. This sBMS system includes different developed hardware units: (1) cell monitoring units (CMUs) for interfacing with and monitoring each individual cell or module within the battery pack; (2) a battery monitoring and switching unit (BMU) for global battery pack monitoring, thermal control, and functional operating state switching; (3) a main management and local control unit (MCU) for local sBMS management and control, also serving as a communications gateway to external systems and devices. This architecture is fully expandable to battery packs with a large number of cells, or modules, interconnected in series, as the several units have local data acquisition and processing capabilities, communicate over a standard CAN bus, and will be able to operate almost autonomously. The CMU units are intended to be used with Li-ion cells but can be used with other cell chemistries, with output voltages within the 2.5 to 5 V range. The different units' characteristics and specifications are described, including the different implemented hardware solutions. The developed hardware supports both passive and active methods for charge equalization, considered fundamental functionalities for optimizing the performance and the useful lifetime of a Li-ion battery pack. The functional characteristics of the different units of this sBMS system, including the acquisition of different process variables using a flexible set of sensors, can support the development of custom algorithms for estimating the parameters defining the functional states of the battery pack (State-of-Charge, State-of-Health, etc.), as well as different charge equalization strategies and algorithms. This sBMS system is intended to interface with other systems and devices using standard communication protocols, like those used by the Internet of Things. In the future, this sBMS architecture can evolve to a fully decentralized topology, with all the units using Wi-Fi protocols and integrating a mesh network, making the MCU unit unnecessary. The status of the work in progress is reported, leading to conclusions on the system already built: the implemented hardware solution serves not only as a fully functional, advanced, and configurable battery management system but also as a platform for developing custom algorithms and optimization strategies to achieve better performance of stationary electric energy storage devices. Keywords: Li-ion battery, smart BMS, stationary electric storage, distributed BMS
Procedia PDF Downloads 101
4289 Greenland Monitoring Using Vegetation Index: A Case Study of Lal Suhanra National Park
Authors: Rabia Munsaf Khan, Eshrat Fatima
Abstract:
The analysis of the spatial extent and temporal change of vegetation cover using remotely sensed data is of critical importance to agricultural sciences. Pakistan, being an agricultural country, depends on this resource, as it makes up 70% of the GDP. The case study is of Lal Suhanra National Park, which is not only the biggest forest reserve of Pakistan but also of Asia. The study is performed using different temporal images of Landsat. Also, the Landsat results are cross-checked using Sentinel-2 imagery, as it has both higher spectral and spatial resolution. Vegetation can easily be detected using NDVI, which is a common and widely used index. It is an important vegetation index, widely applied in research on global environmental and climatic change. The images are then classified to observe the change that occurred over 15 years. Vegetation cover maps of 2000 and 2016 are used to generate the map of vegetation change detection for the respective years and to find out the changing pattern of vegetation cover. Also, the NDVI values aided in the detection of the percentage decrease in vegetation cover. The study reveals that the vegetation cover of the area decreased significantly between 2000 and 2016. Keywords: Landsat, normalized difference vegetation index (NDVI), sentinel 2, Greenland monitoring
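A numpy sketch of the NDVI-based change estimate described above; the random reflectance arrays are placeholders for the co-registered red and near-infrared Landsat bands of the 2000 and 2016 scenes, and the 0.3 vegetation threshold is an assumption.

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    return (nir - red) / (nir + red + eps)

def vegetated_fraction(nir, red, threshold=0.3):
    return float(np.mean(ndvi(nir, red) > threshold))

# Placeholders standing in for co-registered Landsat band arrays (surface reflectance, 0-1).
rng = np.random.default_rng(3)
red_2000, nir_2000 = rng.uniform(0.05, 0.2, (100, 100)), rng.uniform(0.3, 0.6, (100, 100))
red_2016, nir_2016 = rng.uniform(0.05, 0.25, (100, 100)), rng.uniform(0.2, 0.5, (100, 100))

v2000, v2016 = vegetated_fraction(nir_2000, red_2000), vegetated_fraction(nir_2016, red_2016)
change_map = ndvi(nir_2016, red_2016) - ndvi(nir_2000, red_2000)   # negative values = vegetation loss
print(f"vegetated fraction 2000: {v2000:.2%}, 2016: {v2016:.2%}, "
      f"decrease: {(v2000 - v2016) / v2000:.1%}")
```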
Procedia PDF Downloads 309
4288 A Case Study on Machine Learning-Based Project Performance Forecasting for an Urban Road Reconstruction Project
Authors: Soheila Sadeghi
Abstract:
In construction projects, predicting project performance metrics accurately is essential for effective management and successful delivery. However, conventional methods often depend on fixed baseline plans, disregarding the evolving nature of project progress and external influences. To address this issue, we introduce a distinct approach based on machine learning to forecast key performance indicators, such as cost variance and earned value, for each Work Breakdown Structure (WBS) category within an urban road reconstruction project. Our proposed model leverages time series forecasting techniques, namely Autoregressive Integrated Moving Average (ARIMA) and Long Short-Term Memory (LSTM) networks, to predict future performance by analyzing historical data and project progress. Additionally, the model incorporates external factors, including weather patterns and resource availability, as features to improve forecast accuracy. By harnessing the predictive capabilities of machine learning, our performance forecasting model enables project managers to proactively identify potential deviations from the baseline plan and take timely corrective measures. To validate the effectiveness of the proposed approach, we conduct a case study on an urban road reconstruction project, comparing the model's predictions with actual project performance data. The outcomes of this research contribute to the advancement of project management practices in the construction industry by providing a data-driven solution for enhancing project performance monitoring and control.Keywords: project performance forecasting, machine learning, time series forecasting, cost variance, schedule variance, earned value management
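A short statsmodels sketch of the ARIMA part of the forecasting approach described above; the monthly cost-variance series and the (1, 1, 1) order are illustrative, and in practice the order would be selected per WBS category (with the LSTM branch handled separately).

```python
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical monthly cost-variance history for one WBS category (e.g., "Earthworks").
cv = pd.Series(
    [0.0, -1.2, -2.5, -1.8, -3.1, -4.0, -3.6, -4.8, -5.5, -5.1, -6.3, -7.0],
    index=pd.period_range("2022-01", periods=12, freq="M"),
)

model = ARIMA(cv, order=(1, 1, 1))        # order chosen for illustration only
fitted = model.fit()
forecast = fitted.forecast(steps=3)       # next three reporting periods
print(forecast)
# External drivers (weather, resource availability) can be added via the `exog` argument.
```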
Procedia PDF Downloads 39
4287 Long-Term Indoor Air Monitoring for Students with Emphasis on Particulate Matter (PM2.5) Exposure
Authors: Seyedtaghi Mirmohammadi, Jamshid Yazdani, Syavash Etemadi Nejad
Abstract:
One of the main indoor air parameters in classrooms is dust pollution, and it depends on the particle size and exposure duration. However, there is a lack of data about the exposure level to PM2.5 concentrations in rural-area classrooms. The objective of the current study was to assess students' exposure to PM2.5 in classrooms. One year of monitoring was carried out for fifteen schools by time-series sampling to evaluate indoor air PM2.5 in the rural district of Sari city, Iran. A hygrometer and a thermometer were used to measure some psychrometric parameters (temperature, relative humidity, and wind speed), and a real-time dust monitor (MicroDust Pro, Casella, UK) was used to monitor particulate matter (PM2.5) concentration. The results show that the mean indoor PM2.5 concentration in the studied classrooms was 135 µg/m³. The regression model indicated a positive correlation between indoor PM2.5 concentration and relative humidity, as well as with distance from the city center and classroom size. Meanwhile, the regression model revealed that indoor PM2.5 concentration, relative humidity, and dry-bulb temperature were significant at the 0.05, 0.035, and 0.05 levels, respectively. A statistical predictive model was obtained from multiple regression modeling of indoor PM2.5 concentration and indoor psychrometric conditions. Keywords: classrooms, concentration, humidity, particulate matters, regression
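A statsmodels sketch of the kind of multiple regression model reported above; the column names and the synthetic data frame are placeholders for the measured classroom records, and the fitted coefficients will of course differ from the study's.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Placeholder records standing in for the classroom measurements.
rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "rh": rng.uniform(30, 80, n),              # relative humidity, %
    "temp": rng.uniform(18, 30, n),            # dry-bulb temperature, deg C
    "distance_km": rng.uniform(5, 40, n),      # distance from the city center
    "room_m2": rng.uniform(30, 70, n),         # classroom size
})
df["pm25"] = 40 + 0.8 * df.rh + 1.2 * df.distance_km + 0.5 * df.room_m2 + rng.normal(0, 10, n)

model = smf.ols("pm25 ~ rh + temp + distance_km + room_m2", data=df).fit()
print(model.summary().tables[1])               # coefficients and p-values per predictor
```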
Procedia PDF Downloads 335
4286 AI Software Algorithms for Drivers Monitoring within Vehicles Traffic - SiaMOTO
Authors: Ioan Corneliu Salisteanu, Valentin Dogaru Ulieru, Mihaita Nicolae Ardeleanu, Alin Pohoata, Bogdan Salisteanu, Stefan Broscareanu
Abstract:
Creating a personalized statistic for an individual within the population using IT systems, based on the searches and intercepted spheres of interest they manifest, is just one 'atom' of the artificial intelligence analysis network. However, having the ability to generate statistics based on individual data intercepted from large demographic areas leads to reasoning like that issued by a human mind with global strategic ambitions. The DiaMOTO device is a technical sensory system that allows the interception of car events caused by a driver, positioning them in time and space. The device's connection to the vehicle allows the creation of a source of data whose analysis can create psychological, behavioural profiles of the drivers involved. The SiaMOTO system collects data from many vehicles equipped with DiaMOTO, driven by many different drivers with a unique fingerprint in their approach to driving. In this paper, we aimed to explain the software infrastructure of the SiaMOTO system, a system designed to monitor and improve driver driving behaviour, as well as the criteria and algorithms underlying the intelligent analysis process.Keywords: artificial intelligence, data processing, driver behaviour, driver monitoring, SiaMOTO
Procedia PDF Downloads 91
4285 Marine Environmental Monitoring Using an Open Source Autonomous Marine Surface Vehicle
Authors: U. Pruthviraj, Praveen Kumar R. A. K. Athul, K. V. Gangadharan, S. Rao Shrikantha
Abstract:
An open-source-based autonomous unmanned marine surface vehicle (UMSV) is developed for marine applications such as pollution control, environmental monitoring, and thermal imaging. A double rotomoulded hull boat is deployed, which is rugged, tough, quick to deploy, and fast-moving. It is suitable for environmental monitoring, and it is designed for easy maintenance. A 2 HP electric outboard marine motor is used, which is powered by a lithium-ion battery that can also be charged from a solar charger. All connections are completely waterproof to an IP67 rating. At full throttle, the marine motor is capable of up to 7 km/h. The motor is integrated with an open-source-based controller using a Cortex M4F for adjusting the direction of the motor. This UMSV can be operated in three modes: semi-autonomous, manual, and fully automated. One of the channels of a 2.4 GHz radio link 8-channel transmitter is used for toggling between the different modes of the UMSV. An onboard GPS system has been fitted to the electric outboard marine motor to find the range and GPS position. The entire system can be assembled in the field in less than 10 minutes. A FLIR Lepton thermal camera core is integrated with a 64-bit quad-core Linux-based open-source processor, facilitating real-time capture of thermal images, and the results are stored on a micro SD card, which serves as the data storage device for the system. The thermal camera is interfaced to the open-source processor through the SPI protocol. These thermal images are used for finding oil spills and for looking for people who are drowning in low visibility during the night. A real-time clock (RTC) module is attached to the battery to provide the date and time of the captured thermal images. For the live video feed, a 900 MHz long-range video transmitter and receiver is set up, through which, with a higher power output, a range of 40 miles has been achieved. A multi-parameter probe is used to measure the following parameters: conductivity, salinity, resistivity, density, dissolved oxygen content, ORP (Oxidation-Reduction Potential), pH level, temperature, water level, and pressure (absolute). The maximum pressure it can withstand is 160 psi, up to 100 m. This work represents a field demonstration of an open-source-based autonomous navigation system for a marine surface vehicle. Keywords: open source, autonomous navigation, environmental monitoring, UMSV, outboard motor, multi-parameter probe
Procedia PDF Downloads 241