Search results for: accuracy of payment time
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20383


19813 Predicting Financial Distress in South Africa

Authors: Nikki Berrange, Gizelle Willows

Abstract:

Business rescue has become increasingly popular since its inclusion in the Companies Act of South Africa in May 2011. The Alternate Exchange (AltX) of the Johannesburg Stock Exchange has experienced a marked increase in the number of companies entering business rescue. This study sampled twenty companies listed on the AltX to determine whether Altman’s Z-score model for emerging markets (ZEM) or Taffler’s Z-score model is the more accurate model for predicting financial distress for small to medium-sized companies in South Africa. The study was performed over three different time horizons: one, two, and three years prior to the event of financial distress, in order to determine how many companies each model predicted would be unlikely to succeed, as well as the predictive ability and accuracy of the respective models. The study found that Taffler’s Z-score model had a greater ability to predict financial distress across all three time horizons.
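For readers unfamiliar with the two scoring models compared above, the sketch below shows how the commonly cited coefficient sets for Altman's emerging-markets Z''-score and Taffler's Z-score can be computed from balance-sheet ratios. The coefficients and ratio definitions are the ones usually reported in the bankruptcy-prediction literature, not values taken from this study, so treat them as an illustrative assumption.

```python
# Illustrative sketch (not from the paper): commonly cited coefficient sets for the
# Altman emerging-markets Z''-score and Taffler's Z-score, as reported in the
# bankruptcy-prediction literature. Ratio definitions here are assumptions.

def altman_zem(working_capital, retained_earnings, ebit, book_equity,
               total_assets, total_liabilities):
    """Altman Z''-score for emerging markets (higher = healthier)."""
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = book_equity / total_liabilities
    return 3.25 + 6.56 * x1 + 3.26 * x2 + 6.72 * x3 + 1.05 * x4

def taffler_z(profit_before_tax, current_assets, current_liabilities,
              total_assets, total_liabilities, no_credit_interval):
    """Taffler's Z-score (positive = low distress risk)."""
    x1 = profit_before_tax / current_liabilities
    x2 = current_assets / total_liabilities
    x3 = current_liabilities / total_assets
    x4 = no_credit_interval
    return 3.20 + 12.18 * x1 + 2.50 * x2 - 10.68 * x3 + 0.029 * x4
```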

Keywords: Altman’s ZEM-score, Altman’s Z-score, AltX, business rescue, Taffler’s Z-score

Procedia PDF Downloads 339
19812 Simulation of Pedestrian Service Time at Different Delay Times

Authors: Imran Badshah

Abstract:

Pedestrian service time reflects the performance of a facility and is a key parameter for analyzing the capability of the facilities provided to serve pedestrians. The pedestrian level of service (LOS) mainly depends on pedestrian time and safety. The time a pedestrian spends obtaining a service is mainly influenced by the number of available service points and the time each pedestrian takes to be served, referred to as the delay time. In this paper, we analyze the simulated pedestrian service time under different delay times. A simulation is performed in AnyLogic by developing a model that reflects the real scenario of pedestrian services such as ticket machine gates at rail stations, airports, shopping malls, and cinema halls. The simulated pedestrian time is determined for various delay values. The simulated results show how pedestrian time changes with the delay pattern. The histogram and time plot of the model give the mean, maximum, and minimum values of the pedestrian time. This study helps to check the behavior of pedestrian time for various services such as subway stations, airports, shopping malls, and cinema halls.
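As a rough illustration of the queueing logic described above (service points, per-pedestrian delay time, and the resulting service-time statistics), the following Python/SimPy sketch simulates ticket gates; it is not the AnyLogic model itself, and the arrival rate, number of gates, and delay values are assumed for illustration.

```python
# Minimal sketch of the queueing idea (not the AnyLogic model): pedestrians arrive at
# ticket gates and the per-pedestrian delay time drives the total service time.
# Arrival rate, gate count, and delay values are assumed, not taken from the paper.
import random
import simpy

ARRIVAL_INTERVAL = 2.0   # mean seconds between pedestrian arrivals (assumed)
DELAY_TIME = 5.0         # mean seconds each pedestrian occupies a gate (assumed)
NUM_GATES = 3
service_times = []       # time from arrival until service completes, per pedestrian

def pedestrian(env, gates):
    arrived = env.now
    with gates.request() as req:          # wait for a free gate
        yield req
        yield env.timeout(random.expovariate(1.0 / DELAY_TIME))  # delay at the gate
    service_times.append(env.now - arrived)

def source(env, gates):
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_INTERVAL))
        env.process(pedestrian(env, gates))

env = simpy.Environment()
gates = simpy.Resource(env, capacity=NUM_GATES)
env.process(source(env, gates))
env.run(until=3600)  # simulate one hour

print(f"mean={sum(service_times)/len(service_times):.1f}s, "
      f"min={min(service_times):.1f}s, max={max(service_times):.1f}s")
```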

Keywords: agent-based simulation, anylogic model, pedestrian behavior, time delay

Procedia PDF Downloads 191
19811 Accurate Positioning Method of Indoor Plastering Robot Based on Line Laser

Authors: Guanqiao Wang, Hongyang Yu

Abstract:

There is a lot of repetitive work in the traditional construction industry, and replacing manual labour with robots in these repetitive tasks can significantly improve production efficiency. Therefore, robots appear more and more frequently in the construction industry. Navigation and positioning are very important tasks for construction robots, and the requirements for positioning accuracy are very high. Traditional indoor robots mainly use radiofrequency or vision methods for positioning. Compared with ordinary robots, the indoor plastering robot needs to be positioned closer to the wall for wall plastering, so the requirements for construction positioning accuracy are higher; the traditional navigation and positioning methods have large errors, which can leave the robot without an exact position, so the wall cannot be plastered or the plastering error is large. A new positioning method is proposed that is assisted by line lasers and uses image-processing-based positioning to refine the traditional positioning result. In actual work, filtering, edge detection, the Hough transform, and other operations are performed on the images captured by the camera. Each time the position of the laser line is found, it is compared with the reference value, and the robot is translated or rotated to complete the positioning. The experimental results show that the actual positioning error is reduced to less than 0.5 mm by this accurate positioning method.
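A minimal sketch of the image-processing steps named above (filtering, edge detection, Hough transform) is given below using OpenCV; the thresholds, the choice of the most nearly vertical segment, and the reference column are illustrative assumptions rather than the paper's actual parameters.

```python
# Hedged sketch of the laser-line localisation pipeline described above (filtering,
# edge detection, Hough transform). Thresholds and camera setup are assumptions,
# not values from the paper.
import cv2
import numpy as np

def laser_line_offset_px(frame_bgr, reference_x):
    """Return the horizontal offset (pixels) of the detected laser line
    from a reference column; None if no line is found."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)          # noise filtering
    edges = cv2.Canny(blurred, 50, 150)                  # edge detection
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=80, minLineLength=100, maxLineGap=10)
    if lines is None:
        return None
    # pick the most nearly vertical segment (laser line projected on the wall)
    x1, y1, x2, y2 = min(lines[:, 0, :], key=lambda l: abs(l[0] - l[2]))
    line_x = (x1 + x2) / 2.0
    return line_x - reference_x   # sign tells which way to translate/rotate the robot
```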

Keywords: indoor plastering robot, navigation, precise positioning, line laser, image processing

Procedia PDF Downloads 128
19810 The Classification Accuracy of Finance Data through Hölder Functions

Authors: Yeliz Karaca, Carlo Cattani

Abstract:

This study focuses on the local Hölder exponent as a measure of function regularity for time series related to finance data. The attributes of a finance dataset belonging to 13 countries (India, China, Japan, Sweden, France, Germany, Italy, Australia, Mexico, United Kingdom, Argentina, Brazil, USA) located on 5 different continents (Asia, Europe, Australia, North America, and South America) have been examined. These countries are the ones most affected by the attributes with regard to financial development, covering the period from 2012 to 2017. Our study is concerned with the most important attributes that have an impact on the financial development of the countries identified. Our method comprises the following stages: (a) among the multifractal methods and Brownian motion Hölder regularity functions (polynomial, exponential), significant and self-similar attributes have been identified; (b) the significant and self-similar attributes have been applied to Artificial Neural Network (ANN) algorithms (Feed Forward Back Propagation (FFBP) and Cascade Forward Back Propagation (CFBP)); (c) the outcomes of classification accuracy have been compared with respect to the attributes that affect the countries’ financial development. This study has revealed, through the application of ANN algorithms, how the most significant attributes are identified within the relevant dataset via the Hölder functions (polynomial and exponential).
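The following sketch illustrates stage (b) only: feeding a matrix of selected attributes into a feed-forward back-propagation classifier (here scikit-learn's MLPClassifier). The data shapes, labels, and network size are placeholders, not the study's dataset or configuration.

```python
# Illustrative sketch of stage (b): feeding the selected (significant, self-similar)
# attributes into a feed-forward back-propagation classifier. The dataset, labels,
# and network size are placeholders, not the study's actual data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(390, 6))          # e.g. 13 countries x 30 observations x 6 selected attributes
y = rng.integers(0, 2, size=390)       # placeholder class labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
ffbp = MLPClassifier(hidden_layer_sizes=(20,), solver="sgd",
                     max_iter=2000, random_state=0)      # feed-forward back-propagation
ffbp.fit(X_tr, y_tr)
print("classification accuracy:", accuracy_score(y_te, ffbp.predict(X_te)))
```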

Keywords: artificial neural networks, finance data, Hölder regularity, multifractals

Procedia PDF Downloads 230
19809 The Impact of the Training Program Provided by the Saudi Archery Federation on the Electromyography of the Bow Arm Muscles

Authors: Hana Aljumayi, Mohammed Issa

Abstract:

The aim of this study was to investigate the effect of the training program for professional athletes at the Saudi Archery Federation on the electrical activity of the muscles involved in pulling the bowstring and on maximum voluntary contraction (MVC), and to identify the relationship between the electrical activity of these muscles and shooting accuracy among female archers. The researcher used a descriptive approach suitable for the nature of the study, and a sample of nine female archers was selected using purposive sampling. An EMG device was used to measure signal amplitude, signal frequency, spectral energy, and MVC. The results showed statistically significant differences in signal amplitude among muscles, with F(8,1)=5.91 and a significance level of 0.02. There were also statistically significant differences between muscles in terms of signal frequency, with F(8,1)=8.23 and a significance level of 0.02. Bonferroni test results indicated statistically significant differences between measurements at a significance level of 0.05, with anterior measurements showing an average difference of 16.4 compared to other measurements. Furthermore, there was a significant negative correlation between signal amplitude in the calf muscle and shooting accuracy (r=-0.78) at a significance level of 0.02. There was also a significant positive correlation between signal frequency in the calf muscle and shooting accuracy (r=0.72) at a significance level of 0.04. In conclusion, it appears that the training program for archery athletes focused more on skill development than on physical aspects such as muscle activity and strength development. However, it did have a statistically significant effect on signal amplitude, but not on signal frequency or MVC development, in the muscles involved in pulling the bowstring.

Keywords: electrical activity of muscles, archery sport, shooting accuracy, muscles

Procedia PDF Downloads 48
19808 Coarse Grid Computational Fluid Dynamics Fire Simulations

Authors: Wolfram Jahn, Jose Manuel Munita

Abstract:

While computational fluid dynamics (CFD) simulations of fire scenarios are commonly used in the design of buildings, less attention has been given to the use of CFD simulations as an operational tool for the fire services. The reason for this lack of attention lies mainly in the fact that CFD simulations typically take a long time to complete, and their results would thus not be available in time to be of use during an emergency. Firefighters often face uncertain conditions when entering a building to attack a fire. They would greatly benefit from a technology based on predictive fire simulations, able to assist their decision-making process. The principal constraint to faster CFD simulations is the fine grid necessary to accurately resolve the physical processes that govern a fire. This paper explores the possibility of overcoming this constraint by using coarse-grid CFD simulations for fire scenarios, and proposes a methodology to present the simulation results in a meaningful way that can be used by firefighters during an emergency. Data from real-scale compartment fire tests were used to compare CFD fire models with different grid arrangements, and empirical correlations were obtained to interpolate data points into the grids. The results show that the strongly predominant effect of the heat release rate of the fire on the fluid dynamics allows for the use of coarse grids with relatively low overall impact on simulation results. Simulations with an acceptable level of accuracy could be run in real time, thus making them useful as a forecasting tool for emergency response purposes.

Keywords: CFD, fire simulations, emergency response, forecast

Procedia PDF Downloads 302
19807 The Unscented Kalman Filter Implementation for the Sensorless Speed Control of a Permanent Magnet Synchronous Motor

Authors: Justas Dilys

Abstract:

This paper addresses the implementation and optimization of an Unscented Kalman Filter (UKF) for Permanent Magnet Synchronous Motor (PMSM) sensorless control using an ARM Cortex-M3 microcontroller. Various optimization levels based on arithmetic calculation reduction were implemented on the ARM Cortex-M3 microcontroller. The execution time of the UKF estimator was up to 90 µs without loss of accuracy. Moreover, simulation studies on the Unscented Kalman Filter were carried out using Matlab to explore the usability of the UKF in a sensorless PMSM drive.
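To illustrate the estimator structure being optimized, the sketch below runs a plain UKF predict/update loop with FilterPy on a toy two-state model; it is not the paper's PMSM model, nor the fixed-point ARM Cortex-M3 implementation, and all noise settings are assumed.

```python
# Hedged sketch of the UKF predict/update cycle only (FilterPy, on a toy 2-state model),
# to illustrate the estimator structure; it is not the paper's PMSM model or the
# ARM Cortex-M3 implementation.
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

dt = 1e-4  # 100 µs control period (assumed)

def fx(x, dt):
    # toy process model: [angle, speed]; angle integrates speed
    return np.array([x[0] + x[1] * dt, x[1]])

def hx(x):
    # toy measurement model: only the angle-related quantity is observed
    return np.array([x[0]])

points = MerweScaledSigmaPoints(n=2, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=2, dim_z=1, dt=dt, fx=fx, hx=hx, points=points)
ukf.x = np.array([0.0, 0.0])
ukf.P *= 0.1
ukf.R *= 0.01
ukf.Q *= 1e-6

for z in np.sin(np.linspace(0, 0.1, 1000)):   # simulated measurements
    ukf.predict()
    ukf.update(np.array([z]))

print("estimated state:", ukf.x)
```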

Keywords: unscented kalman filter, ARM, PMSM, implementation

Procedia PDF Downloads 143
19806 Analytical and Numerical Investigation of Friction-Restricted Growth and Buckling of Elastic Fibers

Authors: Peter L. Varkonyi, Andras A. Sipos

Abstract:

The quasi-static growth of elastic fibers is studied in the presence of distributed contact with an immobile surface, subject to isotropic dry or viscous friction. Unlike classical problems of elastic stability modelled by autonomous dynamical systems with multiple time scales (slowly varying bifurcation parameter, and fast system dynamics), this problem can only be formulated as a non-autonomous system without time scale separation. It is found that the fibers initially converge to a trivial, straight configuration, which is later replaced by divergence reminiscent of buckling phenomena. In order to capture the loss of stability, a new definition of exponential stability against infinitesimal perturbations for systems defined over finite time intervals is developed. A semi-analytical method for the determination of the critical length based on eigenvalue analysis is proposed. The post-critical behavior of the fibers is studied numerically by using variational methods. The emerging post-critical shapes and the asymptotic behavior as length goes to infinity are identified for simple spatial distributions of growth. Comparison with physical experiments indicates reasonable accuracy of the theoretical model. Some applications from modeling plant root growth to the design of soft manipulators in robotics are briefly discussed.

Keywords: buckling, elastica, friction, growth

Procedia PDF Downloads 177
19805 Multidirectional Product Support System for Decision Making in Textile Industry Using Collaborative Filtering Methods

Authors: A. Senthil Kumar, V. Murali Bhaskaran

Abstract:

In the information technology field, people use various tools and software for official and personal purposes. Nowadays, people worry about choosing data access and extraction tools when buying and selling their products, as well as about various quality factors such as price, durability, color, size, and availability of the product. The main purpose of this research study is to find solutions to these unsolved existing problems. The proposed algorithm is a Multidirectional Rank Prediction (MDRP) decision-making algorithm designed to support effective strategic decisions at all levels of data extraction; it is evaluated on a real-time textile dataset and the results are analyzed. Finally, the results are obtained and compared with existing measurement methods such as PCC, SLCF, and VSS. The resulting accuracy is higher than that of the existing rank prediction methods.
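For context, the sketch below shows the Pearson's Correlation Coefficient (PCC) baseline that the proposed MDRP algorithm is compared against: a user-based collaborative-filtering prediction of a missing product rating. The small rating matrix is invented for illustration and is not the textile dataset used in the study.

```python
# Sketch of the Pearson's Correlation Coefficient (PCC) baseline used in the comparison:
# user-based collaborative filtering that predicts a missing product rating from similar
# users. The rating matrix is invented for illustration (rows: users, cols: products).
import numpy as np

ratings = np.array([[5, 3, 0, 4],
                    [4, 2, 5, 5],
                    [1, 5, 2, 0],
                    [2, 4, 4, 1]], dtype=float)   # 0 = not rated

def pcc(u, v):
    mask = (u > 0) & (v > 0)                      # co-rated products only
    if mask.sum() < 2:
        return 0.0
    r = np.corrcoef(u[mask], v[mask])[0, 1]
    return 0.0 if np.isnan(r) else r

def predict(user, item, k=2):
    sims = [(pcc(ratings[user], ratings[other]), other)
            for other in range(len(ratings))
            if other != user and ratings[other, item] > 0]
    sims = [s for s in sorted(sims, reverse=True)[:k] if s[0] > 0]  # positively correlated neighbours
    if not sims:
        return np.nan
    num = sum(s * ratings[o, item] for s, o in sims)
    den = sum(s for s, _ in sims)
    return num / den

print("predicted rating of user 0 for product 2:", predict(0, 2))
```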

Keywords: Knowledge Discovery in Databases (KDD), Multidirectional Rank Prediction (MDRP), Pearson’s Correlation Coefficient (PCC), Vector Space Similarity (VSS)

Procedia PDF Downloads 265
19804 SISSLE in Consensus-Based Ripple: Some Improvements in Speed, Security, Last Mile Connectivity and Ease of Use

Authors: Mayank Mundhra, Chester Rebeiro

Abstract:

Cryptocurrencies are rapidly finding wide application in areas such as Real Time Gross Settlements and Payments Systems. Ripple is a cryptocurrency that has gained prominence with banks and payment providers. It solves the Byzantine Generals’ Problem with its Ripple Protocol Consensus Algorithm (RPCA), where each server maintains a list of servers, called the Unique Node List (UNL), that represents the network for that server and is trusted not to collectively defraud it. The server believes that the network has come to a consensus when members of the UNL come to a consensus on a transaction. In this paper, we improve Ripple to achieve better speed, security, last mile connectivity and ease of use. We implement guidelines and automated systems for building and maintaining UNLs for resilience, robustness, improved security, and efficient information propagation. We enhance the system to ensure that each server receives information from across the whole network rather than just from the UNL members. We also introduce the paradigm of UNL overlap as a function of information propagation and the trust a server assigns to its own UNL. Our design not only reduces vulnerabilities such as eclipse attacks, but also makes it easier to identify malicious behaviour and entities attempting to fraudulently double-spend or stall the system. We provide experimental evidence of the benefits of our approach over the current Ripple scheme. We observe speedups of ≥ 4.97x and success-rate improvements of 98.22x for information propagation, and ≥ 3.16x and 51.70x, respectively, for consensus.

Keywords: Ripple, Kelips, unique node list, consensus, information propagation

Procedia PDF Downloads 127
19803 Evaluation of Clinical Decision Support System in Electronic Medical Record System: A Case of Malawi National Art Electronic Medical Record System

Authors: Pachawo Bisani, Goodall Nyirenda

Abstract:

The Malawi National Antiretroviral Therapy (NART) Electronic Medical Record (EMR) system was designed and developed with guidance from the Ministry of Health through the Department of HIV and AIDS (DHA), with the aim of supporting the management of HIV patient data and reporting in high-prevalence ART clinics. As of 2021, the system has been scaled up to over 206 facilities across the country. The system is integrated with a clinical decision support system (CDSS) to assist healthcare providers in making decisions about an individual patient at a particular point in time. Despite the NART EMR undergoing several evaluations and assessments, little has been done to evaluate the clinical decision support system within it. Hence, this study aimed to evaluate the use of the CDSS in the NART EMR system in Malawi. The study adopted a mixed-method approach, and data were collected through interviews, observations, and questionnaires. The study has revealed that the CDSS tools were integrated into the ART clinic workflow, making them easy for users to use. The study has also revealed challenges in system reliability and information accuracy. Despite the challenges, the study further revealed that the system is effective and efficient, and overall, users are satisfied with it. The study recommends that the implementers focus more on the logic behind the clinical decision-support intervention in order to address some of the concerns and enhance the accuracy of the information supplied. The study further suggests consulting the system's actual users throughout implementation.

Keywords: clinical decision support system, electronic medical record system, usability, antiretroviral therapy

Procedia PDF Downloads 70
19802 Improving the Analytical Power of Dynamic DEA Models, by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach

Authors: Elias K. Maragos, Petros E. Maravelakis

Abstract:

In Dynamic Data Envelopment Analysis (DDEA), a subfield of Data Envelopment Analysis (DEA), the productivity of Decision Making Units (DMUs) is considered in relation to time. In this setting, as most researchers accept, some outputs produced by a DMU in one period are used as inputs in a later period; these outputs are known as intermediates. The common DDEA models do not take into account the shape of the distribution of these input, output, or intermediate data, assuming that their virtual values do not deviate from linearity. This weakness limits the accuracy and analytical power of the traditional DDEA models. In this paper, the authors, using the concept of piecewise linear inputs and outputs, propose an extended DDEA model. The proposed model increases the flexibility of the traditional DDEA models and improves the measurement of the dynamic performance of DMUs.
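As background, the sketch below computes the classical static, input-oriented CCR efficiency score as a linear programme, the basic building block that DDEA models extend over time; the toy data are invented, and the sketch does not implement the proposed piecewise linear model.

```python
# Hedged sketch of a basic (static, input-oriented CCR) DEA efficiency score computed as a
# linear programme, to show the building block that DDEA extends over time. The toy data
# are illustrative; this is not the proposed piecewise linear DDEA model.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0],    # inputs:  rows = input types, cols = DMUs
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[4.0, 5.0, 6.0, 8.0]])   # outputs: rows = output types, cols = DMUs

def ccr_efficiency(o):
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                       # minimise theta
    A_in = np.hstack([-X[:, [o]], X])                 # sum_j lam_j x_ij <= theta * x_io
    A_out = np.hstack([np.zeros((s, 1)), -Y])         # sum_j lam_j y_rj >= y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for dmu in range(X.shape[1]):
    print(f"DMU {dmu}: efficiency = {ccr_efficiency(dmu):.3f}")
```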

Keywords: Dynamic Data Envelopment Analysis, DDEA, piecewise linear inputs, piecewise linear outputs

Procedia PDF Downloads 145
19801 Water Body Detection and Estimation from Landsat Satellite Images Using Deep Learning

Authors: M. Devaki, K. B. Jayanthi

Abstract:

The identification of water bodies from satellite images has recently received a great deal of attention. Different methods have been developed to distinguish water bodies from various satellite images that vary in terms of time and space. Urban water-body identification arises in numerous applications. There has been a sharp rise in the usage of satellite images to map natural resources, including urban water bodies and forests, during the past several years, because water and forest resources depend on each other so heavily that ongoing monitoring of both is essential to their sustainable management. The relevant elements from satellite pictures have been chosen using a variety of techniques, including machine learning. Then, a convolutional neural network (CNN) architecture is created that classifies a superpixel from a complex metropolitan scene into one of two classes: containing water or not. The deep learning technique, CNN, has advanced tremendously in a variety of visual-related tasks. A CNN can improve classification performance by learning the spectral-spatial regularities of the input data and extracting deep features hierarchically from raw images. The extent of the water body is then calculated using the satellite image's resolution. Experimental results demonstrate that the suggested method outperformed conventional approaches in terms of water extraction accuracy from remote-sensing images, with an average overall accuracy of 97%.
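A minimal sketch of a two-class (water / non-water) CNN for superpixel patches is shown below in PyTorch; the band count, patch size, and layer sizes are assumptions for illustration and do not reproduce the paper's exact architecture.

```python
# Minimal sketch of a two-class (water / non-water) CNN for small superpixel patches,
# in the spirit of the architecture described above. Band count, patch size, and layer
# sizes are assumptions; they are not the paper's exact network.
import torch
import torch.nn as nn

class WaterPatchCNN(nn.Module):
    def __init__(self, in_bands=6):            # e.g. six Landsat bands (assumed)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_bands, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 64), nn.ReLU(),
            nn.Linear(64, 2),                   # water vs. non-water
        )

    def forward(self, x):                       # x: (batch, bands, 32, 32) patches
        return self.classifier(self.features(x))

model = WaterPatchCNN()
logits = model(torch.randn(4, 6, 32, 32))       # four dummy superpixel patches
print(logits.shape)                             # torch.Size([4, 2])
```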

Keywords: water body, deep learning, satellite images, convolutional neural network

Procedia PDF Downloads 71
19800 E-Procurement Adoption and Effective Service Delivery in the Uganda Coffee Industry

Authors: Taus Muganda

Abstract:

This research explores the intricate relationship between e-procurement adoption and effective service delivery in the Uganda Coffee Industry, focusing on the processes involved, key actors, and the impact of digital transformation. The study is guided by three prominent theories, Actor-Network Theory, Resource-Based View Theory, and Institutional Theory, to comprehensively explore the dynamics of e-procurement in the context of the coffee sector. The primary aim of this project is to examine the e-procurement adoption process and its role in enhancing service delivery within the Uganda Coffee Industry. The research questions guiding this inquiry are: firstly, whether e-procurement adoption and implementation contribute to achieving quality service delivery; and secondly, how e-procurement adoption can be effectively realized within the Uganda Coffee Industry. To address these questions, the study has laid out specific objectives. Firstly, it seeks to investigate the impact of e-procurement on effective service delivery, analysing how the integration of digital processes influences the overall quality of services provided in the coffee industry. Secondly, it aims to critically analyse the measures required to achieve effective delivery outcomes through the adoption and implementation of e-procurement, assessing the strategies that can maximize the benefits of digital transformation. Furthermore, the research endeavours to identify and examine the key actors instrumental in achieving effective service delivery within the Uganda Coffee Industry. By utilizing Actor-Network Theory, the study will elucidate the network of relationships and collaborations among actors involved in the e-procurement process. The research contributes to addressing a critical gap in the sector. Despite coffee being the leading export crop in Uganda, constituting 16% of total exports, there is a recognized need for digital transformation, specifically in the realm of e-procurement, to enhance the productivity of producers and contribute to the economic growth of the country. The study aims to provide insights into transforming the Uganda Coffee Industry by focusing on improving the e-procurement services delivered to actors in the coffee sector. The three forms of e-procurement investigated in this research—E-Sourcing, E-Payment, and E-Invoicing—serve as focal points in understanding the multifaceted dimensions of digital integration within the Uganda Coffee Industry. This research endeavours to offer practical recommendations for policymakers, industry stakeholders, and the UCDA to strategically leverage e-procurement for the benefit of the entire coffee value chain.

Keywords: e-procurement, effective service delivery, actors, actor-network theory, resource-based view theory, institutional theory, e-invoicing, e-payment, e-sourcing

Procedia PDF Downloads 44
19799 An Experimental Study on the Variability of Nonnative and Native Inference of Word Meanings in Timed and Untimed Conditions

Authors: Swathi M. Vanniarajan

Abstract:

Reading research suggests that online contextual vocabulary comprehension while reading is an interactive and integrative process. One’s success in it depends on a variety of factors including the amount and the nature of available linguistic and nonlinguistic cues, his/her analytical and integrative skills, schema memory (content familiarity), and processing speed characterized along the continuum of controlled to automatic processing. The experiment reported here, conducted with 30 native speakers as one group and 30 nonnative speakers as another group (all graduate students), hypothesized that, while working on 24 tasks that required them to comprehend an unfamiliar word in real time without backtracking, due to the differences in the nature of their respective reading processes, the nonnative subjects would be less able to construct the meanings of the unknown words by integrating the multiple but sufficient contextual cues provided in the text, whereas the native subjects would be able to. The results indicated that there were significant inter-group as well as intra-group differences in terms of the quality of definitions given. However, when given additional time, while the nonnative speakers could significantly improve the quality of their definitions, the native speakers in general did not, suggesting that, all things being equal, time is a significant factor for success in nonnative vocabulary and reading comprehension processes and that accuracy precedes automaticity in the development of nonnative reading processes as well.

Keywords: reading, second language processing, vocabulary comprehension

Procedia PDF Downloads 148
19798 The Effects of Advisor Status and Time Pressure on Decision-Making in a Luggage Screening Task

Authors: Rachel Goh, Alexander McNab, Brent Alsop, David O'Hare

Abstract:

In a busy airport, the decision whether to take passengers aside and search their luggage for dangerous items can have important consequences. If an officer fails to search and stop a bag containing a dangerous object, a life-threatening incident might occur. But stopping a bag unnecessarily means that the officer might lose time searching the bag and face an angry passenger. Passengers’ bags, however, are often cluttered with personal belongings of varying shapes and sizes. It can be difficult to determine what is dangerous or not, especially if the decisions must be made quickly in cases of busy flight schedules. Additionally, the decision to search bags is often made with input from the surrounding officers on duty. This scenario raises several questions: 1) Past findings suggest that humans are more reliant on an automated aid when under time pressure in a visual search task, but does this translate to human-human reliance? 2) Are humans more likely to agree with another person if the person is assumed to be an expert or a novice in these ambiguous situations? In the present study, forty-one participants performed a simulated luggage-screening task. They were partnered with an advisor of two different statuses (expert vs. novice), but of equal accuracy (90% correct). Participants made two choices each trial: their first choice with no advisor input, and their second choice after advisor input. The second choice was made within either 2 seconds or 8 seconds; failure to do so resulted in a long time-out period. Under the 2-second time pressure, participants were more likely to disagree with their own first choice and agree with the expert advisor, regardless of whether the expert was right or wrong, but especially when the expert suggested that the bag was safe. The findings indicate a tendency for people to assume less responsibility for their decisions and defer to their partner, especially when a quick decision is required. This over-reliance on others’ opinions might have negative consequences in real life, particularly when relying on fallible human judgments. More awareness is needed regarding how a stressful environment may influence reliance on others’ opinions, and better techniques are needed to make the best decisions under high stress and time pressure.

Keywords: advisors, decision-making, time pressure, trust

Procedia PDF Downloads 159
19797 Benchmarking of Pentesting Tools

Authors: Esteban Alejandro Armas Vega, Ana Lucila Sandoval Orozco, Luis Javier García Villalba

Abstract:

The benchmarking of tools for dynamic analysis of vulnerabilities in web applications is done periodically, because these tools update their knowledge bases and search algorithms from time to time in order to improve their accuracy. Unfortunately, the vast majority of these evaluations are made by software enthusiasts who publish their results on blogs or non-academic websites, and always with the same evaluation methodology. Similarly, most academics who have carried out this type of analysis from a scientific standpoint use the same methodology as the empirical authors. This paper is motivated by the interest in answering questions that many users of these tools have been asking over the years, such as whether a tool truly tests and evaluates every vulnerability it claims to, or whether it really delivers a complete report of all the vulnerabilities tested and exploited. These questions have also motivated previous work, but without definitive answers. The aim of this paper is to show results that truly answer, at least for the tested tools, all those unanswered questions. All the results have been obtained by changing the common benchmarking model used in those previous works.

Keywords: cybersecurity, IDS, security, web scanners, web vulnerabilities

Procedia PDF Downloads 303
19796 A Model for Diagnosis and Prediction of Coronavirus Using Neural Network

Authors: Sajjad Baghernezhad

Abstract:

Meta-heuristic and hybrid algorithms have proven highly capable of modeling medical problems. In this study, a neural network was used to predict COVID-19 among high-risk and low-risk patients. The target population consisted of 550 high-risk and low-risk patients from the Kerman University of Medical Sciences medical center. The memetic algorithm, which is a combination of a genetic algorithm and a local search algorithm, was used to update the weights of the neural network and improve its accuracy. The initial study showed that the accuracy of the neural network was 88%. After updating the weights with the memetic algorithm, the accuracy increased to 93%. For the proposed model, the sensitivity, specificity, positive predictive value, and accuracy were 97.4, 92.3, 95.8, 96.2, and 0.918, respectively; for the genetic algorithm model, 87.05, 9.20 7, 89.45, 97.30, and 0.967; and for the logistic regression model, 87.40, 95.20, 93.79, 0.87, and 0.916. Based on the findings of this study, neural network models have a lower error rate in the diagnosis of patients based on individual variables and vital signs compared to the regression model. The findings of this study can help planners and health care providers in designing programs and the early diagnosis of COVID-19.
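The sketch below illustrates the memetic idea named above, a genetic algorithm combined with a simple local search, applied to the weights of a very small neural network on synthetic data; the data, network size, and GA settings are placeholders, not the study's patient dataset or configuration.

```python
# Hedged sketch of the memetic idea (genetic algorithm + local search) used to tune the
# weights of a small neural network. Data, network size, and GA settings are synthetic
# placeholders, not the study's patient dataset or exact configuration.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                       # 8 vital-sign/variable features (assumed)
y = (X[:, 0] + X[:, 1] > 0).astype(int)             # placeholder high-/low-risk label

H = 5                                                # hidden units
N_W = 8 * H + H                                      # weights: input->hidden + hidden->output

def accuracy(w):
    W1 = w[:8 * H].reshape(8, H)
    w2 = w[8 * H:]
    pred = (np.tanh(X @ W1) @ w2 > 0).astype(int)
    return (pred == y).mean()

def local_search(w, steps=20, step_size=0.1):        # simple hill climbing (the "memetic" part)
    best, best_fit = w, accuracy(w)
    for _ in range(steps):
        cand = best + rng.normal(scale=step_size, size=N_W)
        if accuracy(cand) > best_fit:
            best, best_fit = cand, accuracy(cand)
    return best

pop = rng.normal(size=(30, N_W))                     # initial population of weight vectors
for gen in range(50):
    fit = np.array([accuracy(ind) for ind in pop])
    parents = pop[np.argsort(fit)[::-1][:10]]        # selection of the fittest
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        mask = rng.random(N_W) < 0.5                 # uniform crossover
        children.append(np.where(mask, a, b) + rng.normal(scale=0.05, size=N_W))  # mutation
    pop = np.array(children)
    pop[0] = local_search(parents[0])                # refine the best individual locally

print("best training accuracy:", max(accuracy(ind) for ind in pop))
```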

Keywords: COVID-19, decision support technique, neural network, genetic algorithm, memetic algorithm

Procedia PDF Downloads 55
19795 System Response of a Variable-Rate Aerial Application System

Authors: Daniel E. Martin, Chenghai Yang

Abstract:

Variable-rate aerial application systems are becoming more readily available; however, aerial applicators typically only use the systems for constant-rate application of materials, allowing the systems to compensate for upwind and downwind ground speed variations. Much of the resistance to variable-rate aerial application system adoption in the U.S. pertains to applicators’ trust in the systems to turn on and off automatically as desired. The objectives of this study were to evaluate a commercially available variable-rate aerial application system under field conditions to demonstrate both the response and accuracy of the system to desired application rate inputs. This study involved planting oats in a 35-acre fallow field during the winter months to establish a uniform green backdrop in early spring. A binary (on/off) prescription application map was generated and a variable-rate aerial application of glyphosate was made to the field. Airborne multispectral imagery taken before and two weeks after the application documented actual field deposition and efficacy of the glyphosate. When compared to the prescription application map, these data provided application system response and accuracy information. The results of this study will be useful for quantifying and documenting the response and accuracy of a commercially available variable-rate aerial application system so that aerial applicators can be more confident in their capabilities and the use of these systems can increase, taking advantage of all that aerial variable-rate technologies have to offer.

Keywords: variable-rate, aerial application, remote sensing, precision application

Procedia PDF Downloads 457
19794 A Methodology for Automatic Diversification of Document Categories

Authors: Dasom Kim, Chen Liu, Myungsu Lim, Su-Hyeon Jeon, ByeoungKug Jeon, Kee-Young Kwahk, Namgyu Kim

Abstract:

Recently, numerous documents, including unstructured data and text, have been created due to the rapid increase in the usage of social media and the Internet. Each document is usually provided with a specific category for the convenience of its users. In the past, categorization was performed manually. However, in the case of manual categorization, not only is the accuracy of the categorization not guaranteed, but the categorization also requires a large amount of time and incurs high costs. Many studies have been conducted on the automatic creation of categories to overcome the limitations of manual categorization. Unfortunately, most of these methods cannot be applied to categorizing complex documents with multiple topics, because the methods work by assuming that one document can be categorized into only one category. In order to overcome this limitation, some studies have attempted to categorize each document into multiple categories. However, they are also limited in that their learning process involves training on a multi-categorized document set. These methods therefore cannot be applied to the multi-categorization of most documents unless multi-categorized training sets are provided. To overcome the requirement of a multi-categorized training set imposed by traditional multi-categorization algorithms, we previously proposed a new methodology that can extend the category of a single-categorized document to multiple categories by analyzing relationships among categories, topics, and documents. In this paper, we design a survey-based verification scenario for estimating the accuracy of our automatic categorization methodology.

Keywords: big data analysis, document classification, multi-category, text mining, topic analysis

Procedia PDF Downloads 255
19793 Discrete-Time Bulk Queue with Service Capacity Depending on Previous Service Time

Authors: Yutae Lee

Abstract:

This paper considers a discrete-time bulk-arrival, bulk-service queueing system in which the service capacity varies depending on the previous service time. By using the generating function technique and the supplementary variable method, we compute the distributions of the queue length at an arbitrary slot boundary and at a departure time.

Keywords: discrete-time queue, bulk queue, variable service capacity, queue length distribution

Procedia PDF Downloads 459
19792 Violence Detection and Tracking on Moving Surveillance Video Using Machine Learning Approach

Authors: Abe Degale D., Cheng Jian

Abstract:

Violent action recognition is crucial when creating automated video surveillance systems. In recent years, hand-crafted feature detectors have been the primary method for achieving violence detection, such as the recognition of fighting activity. Researchers have also looked into learning-based representational models. On benchmark datasets created especially for the detection of violent sequences in sports and movies, these methods produced good accuracy results. The Hockey dataset's videos with surveillance camera motion present challenges for these algorithms in learning discriminative features. Deep representation-based methods have shown success in image recognition and human activity detection challenges. For the purpose of detecting violent images and identifying aggressive human behaviours, this research proposes a deep representation-based model using the transfer learning idea. The results show that the suggested approach outperforms state-of-the-art accuracy levels by learning the most discriminative features, attaining 99.34% and 99.98% accuracy on the Hockey and Movies datasets, respectively.
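The following sketch shows the transfer-learning pattern described above in PyTorch/torchvision: a pretrained backbone is frozen and only a new two-class head (violence vs. non-violence) is trained. The backbone choice and the frame-level setup are assumptions; the paper's exact model may differ.

```python
# Hedged sketch of the transfer-learning setup: a pretrained backbone is reused and only
# a new classification head is trained for the two classes (violence / non-violence).
# The ResNet-50 backbone and frame-level setup are assumptions, not the paper's exact model.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)  # pretrained backbone
for param in model.parameters():
    param.requires_grad = False                      # freeze pretrained features
model.fc = nn.Linear(model.fc.in_features, 2)        # new head: violence vs. non-violence

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# one illustrative training step on dummy frames (batch of 8 RGB frames, 224x224)
frames = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))
optimizer.zero_grad()
loss = criterion(model(frames), labels)
loss.backward()
optimizer.step()
print("loss:", loss.item())
```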

Keywords: violence detection, Faster R-CNN, transfer learning, surveillance video

Procedia PDF Downloads 78
19791 Some Accuracy Related Aspects in Two-Fluid Hydrodynamic Sub-Grid Modeling of Gas-Solid Riser Flows

Authors: Joseph Mouallem, Seyed Reza Amini Niaki, Norman Chavez-Cussy, Christian Costa Milioli, Fernando Eduardo Milioli

Abstract:

Sub-grid closures for filtered two-fluid models (fTFM), useful in large scale simulations (LSS) of riser flows, can be derived from highly resolved simulations (HRS) with microscopic two-fluid modeling (mTFM). Accurate sub-grid closures require accurate mTFM formulations as well as accurate correlation of the relevant filtered parameters to suitable independent variables. This article deals with both of those issues. The accuracy of mTFM is addressed by assessing the impact of gas sub-grid turbulence on HRS filtered predictions. A gas-turbulence-like effect is artificially inserted by means of a stochastic forcing procedure implemented in physical space over the momentum conservation equation of the gas phase. The correlation issue is addressed by introducing a three-filtered-variable correlation analysis (three-marker analysis) performed under a variety of different macro-scale conditions typical of risers. While the more elaborate correlation procedure clearly improved accuracy, accounting for gas sub-grid turbulence had no significant impact on the predictions.

Keywords: fluidization, gas-particle flow, two-fluid model, sub-grid models, filtered closures

Procedia PDF Downloads 103
19790 Geophysical Methods and Machine Learning Algorithms for Stuck Pipe Prediction and Avoidance

Authors: Ammar Alali, Mahmoud Abughaban

Abstract:

Cost reduction and drilling optimization are the goals of many drilling operators. Historically, stuck pipe incidents have been a major segment of non-productive time (NPT) associated costs. Traditionally, stuck pipe problems are treated as part of operations and solved post-sticking. However, the real key to savings and success is in predicting stuck pipe incidents and avoiding the conditions leading to their occurrence. Previous attempts at stuck-pipe prediction have neglected the local geology of the problem. The proposed predictive tool utilizes geophysical data processing techniques and Machine Learning (ML) algorithms to predict drilling activity events in real time using surface drilling data with minimum computational power. The method combines two types of analysis: (1) real-time prediction, and (2) cause analysis. Real-time prediction aggregates the input data, including historical drilling surface data, geological formation tops, and petrophysical data, from wells within the same field. The input data are then flattened per geological formation and stacked per stuck-pipe incident. The algorithm uses two physical methods (stacking and flattening) to filter any noise in the signature and create a robust pre-determined pilot that adheres to the local geology. Once the drilling operation starts, the Wellsite Information Transfer Standard Markup Language (WITSML) live surface data are fed into a matrix and aggregated at a similar frequency as the pre-determined signature. Then, the matrix is correlated with the pre-determined stuck-pipe signature for this field, in real time. The correlation used is a machine learning Correlation-based Feature Selection (CFS) algorithm, which selects features relevant to the class while identifying redundant features. The correlation output is interpreted as a probability curve for stuck pipe incident prediction in real time. Once this probability passes a fixed threshold defined by the user, the other component, cause analysis, alerts the user of the expected incident based on the set of pre-determined signatures. A set of recommendations is then provided to reduce the associated risk. The validation process involved feeding historical drilling data from an onshore oil field as a live stream, mimicking actual drilling conditions. Pre-determined signatures were created beforehand for three problematic geological formations in this field. Three wells were processed as case studies, and the stuck-pipe incidents were predicted successfully, with an accuracy of 76%. This accuracy of detection could have resulted in around a 50% reduction in NPT, equivalent to a 9% cost saving in comparison with offset wells. The prediction of stuck-pipe problems requires a method to capture geological, geophysical, and drilling data and to recognize the indicators of this issue at the field and geological formation level. This paper illustrates the efficiency and robustness of the proposed cross-disciplinary approach in its ability to produce such signatures and predict this NPT event.

Keywords: drilling optimization, hazard prediction, machine learning, stuck pipe

Procedia PDF Downloads 204
19789 Enhanced Planar Pattern Tracking for an Outdoor Augmented Reality System

Authors: L. Yu, W. K. Li, S. K. Ong, A. Y. C. Nee

Abstract:

In this paper, a scalable augmented reality framework for handheld devices is presented. The presented framework is enabled by a server-client data communication structure, in which the search for tracking targets among a database of images is performed on the server side, while pixel-wise 3D tracking is performed on the client side, which, in this case, is a handheld mobile device. Image search on the server side adopts a residual-enhanced image descriptor representation that gives the framework a scalability property. The tracking algorithm on the client side is based on a gravity-aligned feature descriptor, which takes advantage of a sensor-equipped mobile device, and an optimized intensity-based image alignment approach that ensures the accuracy of 3D tracking. Automatic content streaming is achieved by using a key-frame selection algorithm, client working-phase monitoring, and standardized rules for content communication between the server and client. The recognition accuracy test performed on a standard dataset shows that the method adopted in the presented framework outperforms the Bag-of-Words (BoW) method used in some of the previous systems. Experimental tests conducted on a set of video sequences indicated real-time performance of the tracking system, with a frame rate of 15-30 frames per second. The presented framework is shown to be functional in practical situations through a demonstration application on a campus walk-around.

Keywords: augmented reality framework, server-client model, vision-based tracking, image search

Procedia PDF Downloads 265
19788 Comparison of the Distillation Curve Obtained Experimentally with the Curve Extrapolated by a Commercial Simulator

Authors: Lívia B. Meirelles, Erika C. A. N. Chrisman, Flávia B. de Andrade, Lilian C. M. de Oliveira

Abstract:

True Boiling Point (TBP) distillation is one of the most common experimental techniques for the determination of petroleum properties. This curve provides information about the performance of petroleum in terms of its cuts. The experiment is performed over a few days. Techniques exist to determine the properties faster, using software that calculates the distillation curve when only limited information about the crude oil is known. In order to evaluate the accuracy of distillation curve prediction, eight points of the TBP curve and the specific gravity curve (348 K and 523 K) were inserted into the HYSYS Oil Manager, and the extended curve was evaluated up to 748 K. The methods were able to predict the curve with errors of 0.6%-9.2% (software vs. ASTM) and 0.2%-5.1% (software vs. Spaltrohr).

Keywords: distillation curve, petroleum distillation, simulation, true boiling point curve

Procedia PDF Downloads 423
19787 The Synergistic Effects of Blockchain and AI on Enhancing Data Integrity and Decision-Making Accuracy in Smart Contracts

Authors: Sayor Ajfar Aaron, Sajjat Hossain Abir, Ashif Newaz, Mushfiqur Rahman

Abstract:

Investigating the convergence of blockchain technology and artificial intelligence, this paper examines their synergistic effects on data integrity and decision-making within smart contracts. By implementing AI-driven analytics on blockchain-based platforms, the research identifies improvements in automated contract enforcement and decision accuracy. The paper presents a framework that leverages AI to enhance transparency and trust while blockchain ensures immutable record-keeping, culminating in significantly optimized operational efficiencies in various industries.

Keywords: artificial intelligence, blockchain, data integrity, smart contracts

Procedia PDF Downloads 24
19786 Sea-Land Segmentation Method Based on the Transformer with Enhanced Edge Supervision

Authors: Lianzhong Zhang, Chao Huang

Abstract:

Sea-land segmentation is a basic step in many tasks such as sea surface monitoring and ship detection. The existing sea-land segmentation algorithms have poor segmentation accuracy, and their parameter adjustments are cumbersome, making it difficult to meet actual needs. Also, current sea-land segmentation adopts traditional deep learning models that use Convolutional Neural Networks (CNN). At present, the transformer architecture has achieved great success in the field of natural images, but its application in the field of radar images has been studied less. Therefore, this paper proposes a sea-land segmentation method based on the transformer architecture with strengthened edge supervision. It uses a self-attention mechanism with a gating strategy to better learn relative position bias. Meanwhile, an additional edge supervision branch is introduced. The decoder stage allows the feature information of the two branches to interact, thereby improving the edge precision of the sea-land segmentation. Based on the Gaofen-3 satellite image dataset, the experimental results show that the method proposed in this paper can effectively improve the accuracy of sea-land segmentation, especially the accuracy of sea-land edges. The mean IoU (Intersection over Union), edge precision, overall precision, and F1 score reach 96.36%, 84.54%, 99.74%, and 98.05%, respectively, which are superior to those of mainstream segmentation models and show high practical application value.
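For reference, the metrics reported above (per-class IoU / mean IoU and overall precision) can be computed from a predicted mask and a ground-truth mask as in the short sketch below; it assumes binary sea/land labels and is independent of the segmentation model.

```python
# Small sketch of how mean IoU and overall accuracy can be computed from a predicted mask
# and a ground-truth mask; binary sea/land labels are assumed, and the masks are dummies.
import numpy as np

def segmentation_metrics(pred, gt, num_classes=2):
    conf = np.zeros((num_classes, num_classes), dtype=np.int64)
    for c_true in range(num_classes):
        for c_pred in range(num_classes):
            conf[c_true, c_pred] = np.sum((gt == c_true) & (pred == c_pred))
    ious = [conf[c, c] / (conf[c, :].sum() + conf[:, c].sum() - conf[c, c])
            for c in range(num_classes)]
    overall_acc = np.trace(conf) / conf.sum()
    return float(np.mean(ious)), float(overall_acc)

pred = np.random.randint(0, 2, size=(256, 256))   # dummy predicted sea/land mask
gt = np.random.randint(0, 2, size=(256, 256))     # dummy ground truth
print(segmentation_metrics(pred, gt))
```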

Keywords: SAR, sea-land segmentation, deep learning, transformer

Procedia PDF Downloads 149
19785 Real Estate Trend Prediction with Artificial Intelligence Techniques

Authors: Sophia Liang Zhou

Abstract:

For investors, businesses, consumers, and governments, an accurate assessment of future housing prices is crucial to critical decisions in resource allocation, policy formation, and investment strategies. Previous studies are contradictory about the macroeconomic determinants of housing prices and have largely focused on one or two areas using point prediction. This study aims to develop data-driven models to accurately predict future housing market trends in different markets. This work studied five metropolitan areas representing different market trends and compared three time-lag situations: no lag, 6-month lag, and 12-month lag. Linear regression (LR), random forest (RF), and artificial neural network (ANN) models were employed to model the real estate price using datasets with the S&P/Case-Shiller home price index and 12 demographic and macroeconomic features, such as gross domestic product (GDP), resident population, personal income, etc., in five metropolitan areas: Boston, Dallas, New York, Chicago, and San Francisco. The data from March 2005 to December 2018 were collected from the Federal Reserve Bank, FBI, and Freddie Mac. In the original data, some factors are monthly, some quarterly, and some yearly. Thus, two methods to compensate for missing values, backfilling or interpolation, were compared. The models were evaluated by accuracy, mean absolute error, and root mean square error. The LR and ANN models outperformed the RF model due to RF’s inherent limitations. Both the ANN and LR methods generated predictive models with high accuracy (> 95%). It was found that personal income, GDP, population, and measures of debt consistently appeared as the most important factors. It was also shown that the technique used to compensate for missing values in the dataset and the implementation of time lag can have a significant influence on model performance and require further investigation. The best performing models varied for each area, but the backfilled 12-month-lag LR models and the interpolated no-lag ANN models showed the best stable performance overall, with accuracies > 95% for each city. This study reveals the influence of input variables in different markets. It also provides evidence to support future studies in identifying the optimal time lag and data imputation methods for establishing accurate predictive models.
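A compact sketch of the model comparison described above (LR, RF, and an ANN regressor) on synthetic features is given below with scikit-learn; the data, lag handling, and hyper-parameters are placeholders rather than the study's dataset.

```python
# Illustrative sketch of the model comparison (linear regression, random forest, and a
# neural network regressor) on synthetic macroeconomic-style features; the data, lag
# handling, and hyper-parameters are placeholders, not the study's dataset.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
X = rng.normal(size=(166, 12))                       # ~monthly records x 12 features (assumed)
y = X @ rng.normal(size=12) + rng.normal(scale=0.1, size=166)   # proxy home-price index

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
models = {
    "LR": LinearRegression(),
    "RF": RandomForestRegressor(n_estimators=200, random_state=1),
    "ANN": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=5000, random_state=1),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "MAE:", round(mean_absolute_error(y_te, model.predict(X_te)), 3))
```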

Keywords: linear regression, random forest, artificial neural network, real estate price prediction

Procedia PDF Downloads 86
19784 Electroencephalogram Based Approach for Mental Stress Detection during Gameplay with Level Prediction

Authors: Priyadarsini Samal, Rajesh Singla

Abstract:

Many mobile games provide entertainment while also introducing stress to the human brain. The brain-computer interface (BCI) plays an important role in recognizing this mental stress, offering various neuroimaging approaches that help analyze brain signals. Electroencephalography (EEG) is the most commonly used method among them, as it is non-invasive, portable, and economical. This paper investigates the patterns in brain signals when mental stress is introduced. Two healthy volunteers played a game whose aim was to search for hidden words in a grid, with the levels chosen randomly. The EEG signals during gameplay were recorded to investigate the impact of stress as the levels changed from easy to medium to hard. A total of 16 EEG features were analyzed for this experiment, including band-power features with relative powers and event-related desynchronization, along with statistical features. A support vector machine was used as the classifier, which resulted in an accuracy of 93.9% for three-level stress analysis; for two levels, accuracies of 92% and 98% were achieved. In addition, another game similar in nature was played by the volunteers. A suitable regression model was designed for prediction, in which the feature sets of the first and second games were used for testing and training purposes, respectively, and an accuracy of 73% was found.
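The classification step can be sketched as follows: a support vector machine trained on a trials-by-features matrix of EEG band-power and statistical features labelled with stress levels. The feature matrix here is synthetic, and the electrode montage and feature extraction are omitted.

```python
# Hedged sketch of the classification step only: an SVM on a matrix of EEG features
# (e.g. relative band powers and statistics) labelled with three stress levels.
# The feature matrix is synthetic; feature extraction from raw EEG is omitted.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(7)
X = rng.normal(size=(120, 16))                 # trials x 16 EEG features (assumed shape)
y = rng.integers(0, 3, size=120)               # stress level: easy / medium / hard

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```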

Keywords: brain computer interface, electroencephalogram, regression model, stress, word search

Procedia PDF Downloads 169