Search results for: accuracy assessment.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9152

7982 Detection of Internal Mold Infection of Intact Tomatoes by Non-Destructive, Transmittance VIS-NIR Spectroscopy

Authors: K. Petcharaporn

Abstract:

The external characteristics of tomatoes, such as freshness, color, and size, are typically used in quality control for tomato sorting. However, internal mold infection of an intact tomato cannot be detected by visual assessment, and destructive methods are unsuitable for sorting. In this study, a non-destructive technique, transmittance visible and near-infrared (VIS-NIR) spectroscopy, was used to predict internal mold infection in intact tomatoes. Spectra of 200 samples, comprising 100 normal and 100 mold-infected tomatoes, were acquired in the wavelength range of 665-955 nm. These data were used with partial least squares discriminant analysis (PLS-DA) to build a classification model separating normal from internally mold-infected tomatoes. For this task, the data were split into a training set of 140 samples and a test set of 60 samples. The spectra of normal and internally mold-infected tomatoes showed different features in the visible wavelength range. Combined spectral pretreatments of standard normal variate (SNV) transformation and Savitzky-Golay smoothing gave the optimal calibration model on the training set, with 85.0% accuracy (63 out of 71 normal samples and 56 out of 69 internal mold samples). The classification accuracy of the best model on the test set was 91.7% (29 out of 29 normal samples and 26 out of 31 internal mold samples). These results show that transmittance VIS-NIR spectroscopy can serve as a non-destructive technique for predicting internal mold infection in intact tomatoes.

Keywords: tomato, mold, quality, prediction, transmittance

Procedia PDF Downloads 361
7981 Developing an Accurate AI Algorithm for Histopathologic Cancer Detection

Authors: Leah Ning

Abstract:

This paper discusses the development of a machine learning algorithm that accurately detects metastatic breast cancer (cancer that has spread beyond its site of origin) in images taken from pathology scans of lymph node sections. An accurate artificial intelligence (AI) algorithm would help significantly in breast cancer diagnosis, since manual examination of lymph node scans is both tedious and often highly subjective. Using AI in the diagnostic process provides a more straightforward, reliable, and efficient method for medical professionals, enabling faster diagnosis and therefore more immediate treatment. The overall approach was to train a convolutional neural network (CNN) on a set of pathology scan data and use the trained model to classify a new scan as benign or malignant, outputting a 0 or a 1, respectively. The final model's prediction accuracy is high: 100% on the training set and over 70% on the test set. Achieving such accuracy with an AI model is significant for medical pathology and cancer detection: a tool capable of quick detection will substantially help medical professionals and patients suffering from cancer.
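The abstract does not specify the CNN architecture, so the following is only a minimal NumPy sketch of the kind of forward pass involved: one convolution, a ReLU, global average pooling, and a sigmoid giving the probability of malignancy. The patch size, kernel, and weights are all hypothetical.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D convolution (cross-correlation, as in most DL libraries)."""
    H, W = img.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def predict(img, kernel, w, b):
    """Conv -> ReLU -> global average pool -> sigmoid: P(malignant)."""
    feat = np.maximum(conv2d(img, kernel), 0.0)      # ReLU activation
    pooled = feat.mean()                             # global average pooling
    return 1.0 / (1.0 + np.exp(-(w * pooled + b)))   # sigmoid output

rng = np.random.default_rng(1)
scan = rng.random((32, 32))                          # stand-in pathology patch
p = predict(scan, rng.normal(size=(3, 3)), w=1.0, b=0.0)
label = int(p > 0.5)                                 # 1 = malignant, 0 = benign
```

A production model would stack many such layers and learn the kernel and weights by backpropagation; this sketch only shows the binary decision described in the abstract.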

Keywords: breast cancer detection, AI, machine learning, algorithm

Procedia PDF Downloads 89
7980 Predicting Low Birth Weight Using Machine Learning: A Study on 53,637 Ethiopian Birth Data

Authors: Kehabtimer Shiferaw Kotiso, Getachew Hailemariam, Abiy Seifu Estifanos

Abstract:

Introduction: Despite low birth weight (LBW) accounting for the largest share of neonatal mortality and morbidity, predicting births with LBW for better intervention preparation remains challenging. This study aims to predict LBW using a dataset of 53,637 birth cohorts collected from 36 primary hospitals across seven regions in Ethiopia from February 2022 to June 2024. Methods: We identified ten explanatory variables related to maternal and neonatal characteristics, including maternal education, age, residence, history of miscarriage or abortion, history of preterm birth, type of pregnancy, number of live births, number of stillbirths, antenatal care frequency, and sex of the fetus, to predict LBW. Using WEKA 3.8.2, we developed and compared seven machine learning algorithms. Data preprocessing included handling missing values, detecting outliers, and ensuring the integrity of birth weight records. Model performance was evaluated with accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (ROC AUC), using 10-fold cross-validation. Results: The decision tree, J48, logistic regression, and gradient boosted trees models achieved the highest accuracy (94.5% to 94.6%), with a precision of 93.1% to 93.3%, an F1-score of 92.7% to 93.1%, and a ROC AUC of 71.8% to 76.6%. Conclusion: This study demonstrates the effectiveness of machine learning models in predicting LBW. The high accuracy and recall achieved indicate that these models can serve as valuable tools for healthcare policymakers and providers in identifying at-risk newborns and implementing timely interventions toward the Sustainable Development Goal (SDG) targets on neonatal mortality.
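The study used WEKA 3.8.2; a scikit-learn analogue of the same evaluation protocol (several classifiers compared under 10-fold cross-validation with accuracy, precision, recall, F1, and ROC AUC) can be sketched as follows. The synthetic data merely stands in for the ten maternal/neonatal predictors and the LBW label.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in: 10 features, binary LBW label.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

models = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "logistic_regression": LogisticRegression(max_iter=1000),
    "gradient_boosted_trees": GradientBoostingClassifier(random_state=0),
}
scoring = ["accuracy", "precision", "recall", "f1", "roc_auc"]

results = {}
for name, model in models.items():
    # 10-fold CV, mirroring the abstract's evaluation protocol.
    cv = cross_validate(model, X, y, cv=10, scoring=scoring)
    results[name] = {m: cv[f"test_{m}"].mean() for m in scoring}
    print(name, results[name])
```

On real, imbalanced birth data one would also stratify the folds and inspect recall on the minority (LBW) class specifically.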

Keywords: low birth weight, machine learning, classification, neonatal mortality, Ethiopia

Procedia PDF Downloads 20
7979 Incorporating Multiple Supervised Learning Algorithms for Effective Intrusion Detection

Authors: Umar Albalawi, Sang C. Suh, Jinoh Kim

Abstract:

As the Internet continues to expand with an enormous number of applications, cyber-threats have increased accordingly. Accurate, timely detection of malicious traffic is therefore a critical security concern in today's Internet. One approach to intrusion detection is to use machine learning (ML) techniques. Several methods based on ML algorithms have been introduced over the past years, but they are largely limited in detection accuracy and/or in the time and space complexity required to run them. In this work, we present a novel method for intrusion detection that incorporates a set of supervised learning algorithms. The proposed technique provides high accuracy and outperforms existing techniques that rely on a single learning method. In addition, our technique uses partial flow information (rather than full flow information) for detection; it is thus lightweight and well suited to online operation, with the property of early identification. On the publicly available mid-Atlantic CCDC intrusion dataset, our proposed technique yields a detection rate of over 99% with a very low false alarm rate (0.4%).
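The abstract does not detail how the supervised learners are combined; one common way to incorporate several of them, sketched here with scikit-learn on synthetic data, is majority voting. The feature set and base learners are assumptions standing in for the paper's partial-flow features and its actual ensemble.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for partial-flow features (e.g., early packet sizes/timings);
# class 1 = malicious, kept rare as in real traffic.
X, y = make_classification(n_samples=2000, n_features=8, weights=[0.9, 0.1],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Combine several supervised learners; majority vote over their predictions.
ensemble = VotingClassifier(estimators=[
    ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
    ("lr", LogisticRegression(max_iter=1000)),
    ("nb", GaussianNB()),
], voting="hard")
ensemble.fit(X_tr, y_tr)

pred = ensemble.predict(X_te)
detection_rate = (pred[y_te == 1] == 1).mean()   # recall on malicious flows
false_alarm = (pred[y_te == 0] == 1).mean()      # FP rate on benign flows
print(f"detection rate: {detection_rate:.3f}, false alarm rate: {false_alarm:.3f}")
```

The two reported metrics, detection rate and false alarm rate, correspond to recall on the malicious class and the false positive rate on the benign class, as computed at the end.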

Keywords: intrusion detection, supervised learning, traffic classification, computer networks

Procedia PDF Downloads 348
7978 The Effectiveness of the Repositioning Campaign of PKO BP Brand on the Basis of Questionnaire Research

Authors: Danuta Szwajca

Abstract:

Image is a very important intangible asset of a contemporary enterprise, especially for a bank, which is an institution of public trust. A positive, desired image can effectively distinguish a bank from its competition and build customer confidence and loyalty. PKO BP is the largest bank operating on the Polish financial market. Over the years, an unfavorable image of the bank became embedded in customers' minds: an old-fashioned, stagnant institution resistant to change, which resulted in customer loss and an ageing customer base. For this reason, in 2010 the bank launched a campaign of radical image change, along with a strategy of modernizing branches and improving the product offer. The objective of this article is to assess the effectiveness of the brand repositioning campaign, which lasted three years. The assessment is based on the results of questionnaire research concerning how the bank was perceived before and after the campaign.

Keywords: advertising campaign, brand repositioning, image of the bank, repositioning

Procedia PDF Downloads 422
7977 Assessment of Aminopolyether on 18F-FDG Samples

Authors: Renata L. C. Leão, João E. Nascimento, Natalia C. E. S. Nascimento, Elaine S. Vasconcelos, Mércia L. Oliveira

Abstract:

The quality control procedures for a radiopharmaceutical include assessment of its chemical purity. The method suggested by international pharmacopeias consists of a thin-layer chromatographic run. In this paper, the method proposed by the United States Pharmacopeia (USP) is compared to a direct method for determining the final concentration of aminopolyether in fludeoxyglucose (18F-FDG) preparations. The direct approach (no chromatographic run) was achieved by placing the thin-layer chromatography (TLC) plate directly in an iodine vapor chamber. Both methods were validated, and both gave adequate results for determining the concentration of aminopolyether in 18F-FDG preparations. However, the direct method is more sensitive, faster, and simpler than the reference method (with a chromatographic run), and it may be preferred for routine quality control of 18F-FDG.

Keywords: chemical purity, Kryptofix 222, thin layer chromatography, validation

Procedia PDF Downloads 198
7976 Developing a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform

Authors: Reza Mohammadzadeh

Abstract:

The major challenges of geotechnical engineering in underground spaces arise from uncertainty and varying probabilities. The collection, collation, and collaboration of existing data, incorporated into analysis and design for a given prospect evaluation, provide a reliable, practical problem-solving method under uncertainty. Machine learning (ML) is a subfield of artificial intelligence in statistical science that applies various techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming) to data so that systems automatically learn and improve without being explicitly programmed, and make decisions and predictions. In this paper, a conceptual database schema of geotechnical risks in underground coal mining, based on a cloud system architecture, has been designed. A new approach to risk assessment using a three-dimensional risk matrix supported by the level of knowledge (LoK) is proposed in this model. The stages of the model's workflow methodology are then described. To train the data and deploy the LoK models, an ML platform was implemented: IBM Watson Studio, a leading data science tool and data-driven cloud-integrated ML platform, is employed in this study. As a use case, a dataset of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and the results are outlined.
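The paper's exact three-dimensional risk matrix is not given in the abstract; the sketch below only illustrates the idea of extending the conventional likelihood-by-consequence matrix with a level-of-knowledge (LoK) axis, so that poorly understood hazards are ranked more conservatively. The scales, weights, and class boundaries are all illustrative assumptions.

```python
# Hypothetical 3-D risk matrix: likelihood x consequence, tempered by LoK.

def risk_score(likelihood, consequence, lok):
    """likelihood, consequence on a 1..5 scale; lok in 1 (low)..3 (high)."""
    base = likelihood * consequence              # classic 2-D risk matrix
    uncertainty_factor = {1: 1.5, 2: 1.2, 3: 1.0}[lok]  # assumed weighting
    return base * uncertainty_factor

def risk_class(score):
    """Assumed class boundaries for the illustrative scale above."""
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

# A roof-fall hazard: moderately likely, severe, but poorly characterised.
score = risk_score(likelihood=3, consequence=4, lok=1)
print(score, risk_class(score))  # 18.0 high
```

With full knowledge (lok=3) the same hazard would score 12 ("medium"), showing how the third dimension changes the ranking.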

Keywords: data model, geotechnical risks, machine learning, underground coal mining

Procedia PDF Downloads 274
7975 Role of Pulp Volume Method in Assessment of Age and Gender in Lucknow, India: An Observational Study

Authors: Anurag Tripathi, Sanad Khandelwal

Abstract:

Age and gender determination are required in forensics for victim identification. Secondary dentine is deposited throughout life, resulting in decreased pulp volume and size. Evaluating pulp volume using cone beam computed tomography (CBCT) is a non-invasive method to estimate the age and gender of an individual. Aims/Objectives: The study was conducted to estimate age and determine sex by measuring tooth pulp volume with the help of CBCT, and to evaluate the efficacy of the pulp volume method. An observational study of one year's duration on CBCT data of individuals was conducted in Lucknow. Maxillary central incisors (CI) and maxillary canines (C) of randomly selected samples were assessed for pulp volume using software. Statistical analysis: chi-square test, arithmetic mean, standard deviation, Pearson's correlation, and linear and logistic regression analysis. Results: CBCT data of ninety individuals aged 18-70 years were evaluated for the pulp volume of the central incisor and canine (CI & C). The Pearson correlation coefficients between tooth pulp volume (CI & C) and chronological age indicated that pulp volume decreases with age. Validation of the equations for sex determination showed higher prediction accuracy for CI (56.7%) and lower accuracy for C (53.3%). Conclusion: Pulp volume obtained from CBCT is a reliable indicator for age estimation, though its accuracy for gender prediction was modest.
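The statistical workflow described (Pearson correlation, linear regression for age, logistic regression for sex) can be sketched with scikit-learn. The synthetic data below only assumes the direction of the relationship the abstract reports, namely that pulp volume shrinks with age; the numbers are not the study's.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(2)
n = 90                                    # sample size, as in the study
age = rng.uniform(18, 70, n)
# Assumed negative relationship: secondary dentine reduces pulp volume with age.
pulp_volume = 40 - 0.3 * age + rng.normal(0, 3, n)
sex = rng.integers(0, 2, n)               # 0 = female, 1 = male (synthetic)

# Age estimation: linear regression of age on pulp volume.
age_model = LinearRegression().fit(pulp_volume.reshape(-1, 1), age)
r = np.corrcoef(pulp_volume, age)[0, 1]   # Pearson correlation
print(f"Pearson r: {r:.2f}, slope: {age_model.coef_[0]:.2f}")

# Sex determination: logistic regression on pulp volume. With sex unrelated to
# volume here, accuracy hovers near chance, mirroring the modest 53-57%
# accuracies reported in the abstract.
sex_model = LogisticRegression().fit(pulp_volume.reshape(-1, 1), sex)
sex_acc = sex_model.score(pulp_volume.reshape(-1, 1), sex)
```

The negative Pearson r reproduces the study's qualitative finding; the near-chance sex accuracy illustrates why pulp volume alone is a weak sex discriminator.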

Keywords: forensic, dental age, pulp volume, cone beam computed tomography

Procedia PDF Downloads 97
7974 Proposed Model to Assess E-Government Readiness in Jordan

Authors: Hadeel Abdulatif, Maha Alkhaffaf

Abstract:

E-government is the use of information and communication technology to enrich access to, and the delivery of, government services for citizens, business partners, and employees. Policy makers and regulatory bodies must be cognizant of the degree of readiness of a populace in order to design and implement efficient e-government programs. This paper aims to provide a transparent situation analysis of the official e-government website in Jordan. It focuses on assessing e-government in Jordan through a website assessment using international criteria for e-government websites, and it also analyses environmental factors, consisting of cultural and business environment factors. In reviewing the literature, the researchers found that governments' efforts toward e-government vary according to a country's readiness and other key implementation factors, leading to diverse e-government experiences; thus, there is a need to study the impact of key factors on implementing e-government in Jordan.

Keywords: e-government, environmental factors, website assessment, readiness

Procedia PDF Downloads 291
7973 Real-Time Sensor Fusion for Mobile Robot Localization in an Oil and Gas Refinery

Authors: Adewole A. Ayoade, Marshall R. Sweatt, John P. H. Steele, Qi Han, Khaled Al-Wahedi, Hamad Karki, William A. Yearsley

Abstract:

Understanding the behavioral characteristics of sensors is a crucial step in fusing data from several sensors of different types. This paper introduces a practical, real-time approach to integrating heterogeneous sensor data to achieve higher localization accuracy for a mobile robot than would be possible with any individual sensor. We use this approach in both indoor and outdoor environments, and it is especially appropriate for environments such as oil and gas refineries because of their sparse and featureless nature. We have studied the contribution of each individual sensor's data to the overall accuracy achieved by the fusion process. A sequential-update extended Kalman filter (EKF) with validation gates was used to integrate GPS data, compass data, WiFi data, inertial measurement unit (IMU) data, vehicle velocity, and pose estimates from a fiducial marker system. Results show that the approach can enable a mobile robot to navigate autonomously in any environment using a priori information.
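The core mechanics of a sequential-update filter with validation gates can be shown in a few lines of NumPy. The paper fuses GPS, compass, WiFi, IMU, velocity, and fiducial poses; here a 2-D position state and two scalar position sensors stand in, the measurement model is linear (so the EKF reduces to a plain Kalman filter), and the gate threshold is an assumed chi-square value. Each measurement is fused one at a time, and a measurement whose normalized innovation exceeds the gate is rejected as an outlier.

```python
import numpy as np

GATE = 9.0  # ~3-sigma validation gate (assumed threshold, chi-square, 1 dof)

def predict(x, P, u, Q):
    """Simple motion predict: x <- x + u, P <- P + Q."""
    return x + u, P + Q

def sequential_update(x, P, measurements, R):
    """Fuse each scalar measurement one at a time, skipping gated outliers."""
    for z, h, r in zip(measurements, np.eye(len(x)), R):
        innov = z - h @ x                 # innovation
        S = h @ P @ h + r                 # innovation variance (scalar)
        if innov ** 2 / S > GATE:         # validation gate: reject outlier
            continue
        K = P @ h / S                     # Kalman gain
        x = x + K * innov
        P = P - np.outer(K, h @ P)
    return x, P

x, P = np.zeros(2), np.eye(2)
x, P = predict(x, P, u=np.array([1.0, 0.0]), Q=0.1 * np.eye(2))
# A consistent x-measurement, and a wildly inconsistent y-measurement that
# the validation gate discards.
x, P = sequential_update(x, P, measurements=[1.1, 50.0], R=[0.2, 0.2])
print(x, np.diag(P))
```

Sequential scalar updates avoid inverting a full innovation covariance matrix, which is one reason this formulation suits real-time operation on a robot.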

Keywords: inspection mobile robot, navigation, sensor fusion, sequential update extended Kalman filter

Procedia PDF Downloads 469
7972 Risk Assessment of Oil Spill Pollution by Integration of GNOME, ALOHA and GIS in Bandar Abbas Coast, Iran

Authors: Mehrnaz Farzingohar, Mehran Yasemi, Ahmad Savari

Abstract:

Oil products are imported and exported via the Rajaee tanker terminal. During loading and discharging, oil has in several cases been released at the berths, creating spills. The spills spread within a short time, seriously affecting the Rajaee port environment and even areas beyond it. The trajectory and fate of the oil spills were investigated by modeling and classified into three risk levels based on the modeling results. First, GNOME (General NOAA Operational Modeling Environment) was applied to model the trajectory of the liquid oil. Second, ALOHA (Areal Locations of Hazardous Atmospheres), an air quality model, was integrated to predict the path of oil evaporation in the air. Based on the identified zones, the high-risk areas are marked by colored dots whose densities were calculated and displayed on a map of the affected places. Wind and water circulation moved the pollution to the east of Rajaee Port, where it accumulated along about 12 km of coastline. Approximately 20 km of the northeastern shore of Qeshm Island is covered by the three levels of risk areas. Since the prevailing wind direction is SSW, the pollution was pushed to the east; the highest-risk zones formed on the crest edges, while low-risk zones appeared in the concavities. This assessment helps management and emergency systems monitor the exposed places according to priority and find the best approaches to protecting the environment.

Keywords: oil spill, modeling, pollution, risk assessment

Procedia PDF Downloads 385
7971 Holistic Simulation-Based Impact Analysis Framework for Sustainable Manufacturing

Authors: Mijoh A. Gbededo, Kapila Liyanage, Sabuj Mallik

Abstract:

The emerging approaches to sustainable manufacturing are considered solution-oriented, aiming to address environmental, economic, and social issues holistically. However, the analysis of the interdependencies among the three sustainability dimensions has not been fully captured in the literature. In a recent review of approaches to sustainable manufacturing, two categories of techniques were identified: 1) sustainable product development (SPD) and 2) sustainability performance assessment (SPA). The challenges of these approaches relate not only to arguments and misconceptions about the relationships between the techniques and sustainable development, but also to their inability to capture and integrate the three sustainability dimensions. This calls for a clear definition of some of the approaches and a road map toward a holistic approach that supports sustainability decision-making. In this context, eco-innovation, social impact assessment, and life cycle sustainability analysis play an important role. This paper deployed an integrative approach that amalgamates sustainable manufacturing approaches and the theories of reciprocity and motivation into a holistic simulation-based impact analysis framework. The findings of this research can guide sustainability analysts in capturing aspects of the three sustainability dimensions in an analytical model. Additionally, the findings can aid the construction of a holistic simulation model of sustainable manufacturing and support effective decision-making.

Keywords: life cycle sustainability analysis, sustainable manufacturing, sustainability performance assessment, sustainable product development

Procedia PDF Downloads 172
7970 Detection of Internal Mold Infection of Intact Tomatoes by Non-Destructive, Transmittance VIS-NIR Spectroscopy

Authors: K. Petcharaporn, N. Prathengjit

Abstract:

The external characteristics of tomatoes, such as freshness, color, and size, are typically used in quality control for tomato sorting. However, internal mold infection of an intact tomato cannot be detected by visual assessment, and destructive methods are unsuitable for sorting. In this study, a non-destructive technique, transmittance visible and near-infrared (VIS-NIR) spectroscopy, was used to predict internal mold infection in intact tomatoes. Spectra of 200 samples, comprising 100 normal and 100 mold-infected tomatoes, were acquired in the wavelength range of 665-955 nm. These data were used with partial least squares discriminant analysis (PLS-DA) to build a classification model separating normal from internally mold-infected tomatoes. For this task, the data were split into a training set of 140 samples and a test set of 60 samples. The spectra of normal and internally mold-infected tomatoes showed different features in the visible wavelength range. Combined spectral pretreatments of standard normal variate (SNV) transformation and Savitzky-Golay smoothing gave the optimal calibration model on the training set, with 85.0% accuracy (63 out of 71 normal samples and 56 out of 69 internal mold samples). The classification accuracy of the best model on the test set was 91.7% (29 out of 29 normal samples and 26 out of 31 internal mold samples). These results show that transmittance VIS-NIR spectroscopy can serve as a non-destructive technique for predicting internal mold infection in intact tomatoes.

Keywords: tomato, mold, quality, prediction, transmittance

Procedia PDF Downloads 518
7969 The Development of Online Teaching Materials for Students in Thailand

Authors: Pitimanus Bunlue

Abstract:

The objectives of this study were to identify the unique characteristics of the Salaya old market in Phutthamonthon, Nakhon Pathom, and to develop effective video media to promote homeland awareness among local people. The characteristic features of the community were summarized from historical data, community observation, and interviews with residents. The acquired data were used to develop a video describing the community's prominent features. The quality of the video was then assessed by interviewing local people in the old market about content accuracy, video and narration quality, and their sense of homeland awareness after watching it. The result was a six-minute video containing historical data and the outstanding features of the community. Based on the interviews, the content accuracy was rated good, and the picture quality and narration were rated very good. Most people also developed a sense of homeland awareness after watching the video.

Keywords: audio-visual, creating homeland awareness, Phutthamonthon Nakhon Pathom, research and development

Procedia PDF Downloads 290
7968 Implementation of CNV-CH Algorithm Using Map-Reduce Approach

Authors: Aishik Deb, Rituparna Sinha

Abstract:

We have developed an algorithm to detect abnormal segments (structural variations) in the genome across a number of samples. We have worked on simulated as well as real data from BAM files and have designed a segmentation algorithm in which abnormal segments are detected. This algorithm aims to improve the accuracy and performance of the existing CNV-CH algorithm. The next-generation sequencing (NGS) approach is very fast and can generate large sequences in a reasonable time, and the resulting huge volume of sequence information gives rise to the need for big-data, parallel approaches to segmentation. We have therefore designed a map-reduce version of the existing CNV-CH algorithm, in which a large amount of sequence data can be segmented and structural variations in the human genome detected. We compared the efficiency of the traditional and map-reduce algorithms with respect to precision, sensitivity, and F-score. The advantages of our algorithm are its speed and better accuracy. It can be applied to detect structural variations within a genome, which in turn can be used to detect various genetic disorders, such as cancer. Such defects may be caused by new mutations or changes in the DNA and generally result in abnormally high or low base coverage and quantification values.
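The map-reduce idea can be illustrated compactly: per-position read coverage is split into chunks, each chunk is segmented independently (map), and adjacent segments with similar coverage are merged across chunk boundaries (reduce). The real CNV-CH algorithm uses convex-hull segmentation; a simple threshold-based segmenter stands in for it here, and the chunking merely simulates two workers.

```python
from functools import reduce

def segment(chunk, start, threshold=2.0):
    """Map step: split one coverage chunk into (start, end, mean) segments."""
    segments, seg_start = [], 0
    for i in range(1, len(chunk) + 1):
        if i == len(chunk) or abs(chunk[i] - chunk[seg_start]) > threshold:
            seg = chunk[seg_start:i]
            segments.append((start + seg_start, start + i, sum(seg) / len(seg)))
            seg_start = i
    return segments

def merge(left, right, threshold=2.0):
    """Reduce step: merge boundary segments whose mean coverage is close."""
    if left and right and abs(left[-1][2] - right[0][2]) <= threshold:
        s, e = left[-1][0], right[0][1]
        n_l = left[-1][1] - left[-1][0]
        n_r = right[0][1] - right[0][0]
        mean = (left[-1][2] * n_l + right[0][2] * n_r) / (n_l + n_r)
        return left[:-1] + [(s, e, mean)] + right[1:]
    return left + right

coverage = [30, 31, 30, 60, 61, 60, 61, 30, 30]   # a high-coverage CNV bump
chunks = [coverage[:5], coverage[5:]]             # pretend two workers
mapped = [segment(c, start) for c, start in zip(chunks, [0, 5])]
segments = reduce(merge, mapped)
print(segments)  # the 60x bump is recovered as one segment spanning the boundary
```

Because each map step sees only its own chunk, the work parallelizes across machines; the reduce step's only cross-chunk concern is stitching boundary segments, which keeps communication small.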

Keywords: cancer detection, convex hull segmentation, map reduce, next generation sequencing

Procedia PDF Downloads 135
7967 Inferring Human Mobility in India Using Machine Learning

Authors: Asra Yousuf, Ajaykumar Tannirkulum

Abstract:

Inferring rural-urban migration trends can help design effective policies that promote better urban planning and rural development. In this paper, we describe how machine learning algorithms can be applied to predict people's internal migration decisions. We train our model on data collected from household surveys in Tamil Nadu. To measure the model's performance, we use data on past migration from the National Sample Survey Organisation of India. The features used for training include socioeconomic characteristics of each individual, such as age, gender, place of residence, outstanding loans, and strength of the household, together with the individual's past migration history. We perform a comparative analysis of a number of machine learning algorithms to determine their prediction accuracy. Our results show that machine learning algorithms provide higher prediction accuracy than statistical models. Our goal in this research is to propose the use of data science techniques for understanding human decisions and behaviour in developing countries.
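Survey features of the kind listed above mix numeric and categorical variables, so a typical pipeline one-hot encodes the categoricals before fitting a classifier. The sketch below assumes synthetic data and a toy migration rule; the column names are taken from the abstract but the values are invented.

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "age": rng.integers(18, 60, n),
    "gender": rng.choice(["male", "female"], n),
    "residence": rng.choice(["rural", "urban"], n),
    "outstanding_loan": rng.integers(0, 2, n),
})
# Toy rule standing in for real survey outcomes: young rural respondents migrate.
y = ((df["age"] < 30) & (df["residence"] == "rural")).astype(int)

model = Pipeline([
    ("encode", ColumnTransformer(
        [("cat", OneHotEncoder(), ["gender", "residence"])],
        remainder="passthrough")),          # numeric columns pass through as-is
    ("clf", RandomForestClassifier(n_estimators=100, random_state=0)),
])
scores = cross_val_score(model, df, y, cv=5, scoring="accuracy")
print(f"mean CV accuracy: {scores.mean():.3f}")
```

Swapping the classifier stage for, say, `LogisticRegression` is how one would run the paper's algorithm comparison within the same pipeline.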

Keywords: development, migration, internal migration, machine learning, prediction

Procedia PDF Downloads 270
7966 Vibration-Based Monitoring of Tensioning Stay Cables of an Extradosed Bridge

Authors: Chun-Chung Chen, Bo-Han Lee, Yu-Chi Sung

Abstract:

Monitoring the tensioning force of stay cables is a significant issue in assessing the structural safety of extradosed bridges, and it is known that there is a high correlation between the existing tension force and the vibration frequencies of the cables. This paper presents the frequency characteristics of the stay cables of an extradosed bridge in the field, obtained using vibration-based monitoring methods. The vibration frequencies of each stay cable were measured in stages from the beginning to the completion of bridge construction. The results show that the trend of frequency variation differs among cables of different lengths at each measured stage. The observed feature can support the bridge's long-term monitoring system and contribute to the assessment of bridge safety.
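The tension-frequency correlation the abstract relies on is usually expressed through the taut-string model: for a cable of length L and mass per unit length m with negligible bending stiffness and sag, the n-th natural frequency is f_n = (n / 2L) sqrt(T / m), so the tension can be back-calculated as T = 4 m L^2 (f_n / n)^2. The cable parameters below are hypothetical, and real stay cables need corrections for sag and bending stiffness.

```python
def cable_tension(freq_hz, mode_n, length_m, mass_per_m):
    """Taut-string tension estimate: T = 4 * m * L^2 * (f_n / n)^2.

    Assumes negligible bending stiffness and sag, which is only an
    approximation for real stay cables.
    """
    return 4.0 * mass_per_m * length_m ** 2 * (freq_hz / mode_n) ** 2

# Hypothetical cable: 100 m long, 50 kg/m, measured fundamental at 1.2 Hz.
T = cable_tension(freq_hz=1.2, mode_n=1, length_m=100.0, mass_per_m=50.0)
print(f"estimated tension: {T / 1000:.0f} kN")  # 2880 kN
```

The L^2 dependence is why the abstract's observation matters: cables of different lengths show different frequency-variation trends for the same change in tension.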

Keywords: vibration-based method, extradosed bridges, bridge health monitoring, bridge stay cables

Procedia PDF Downloads 144
7965 Automated Tracking and Statistics of Vehicles at the Signalized Intersection

Authors: Qiang Zhang, Xiaojian Hu

Abstract:

An intersection is a place where vehicles and pedestrians must pass through, turn, and evacuate. Obtaining the motion data of vehicles near an intersection is of great significance for transportation research. Since there are usually many targets, with frequent conflicts between them, it is difficult to obtain vehicle motion parameters from traffic videos of intersections. Considering the characteristics of traffic videos, this paper applies video techniques to automated tracking, counting, and trajectory extraction of vehicles, collecting traffic data from roadside surveillance cameras installed near intersections. Based on video recognition, the vehicles in each lane near the intersection are tracked, their trajectories extracted, and counts taken under various degrees of occlusion and visibility. The performance is compared with currently recognized CPU-based real-time tracking-by-detection algorithms. The presented system is faster than the others and has better real-time performance. The accuracy of direction estimation reached about 94.99% on average, and the accuracy of classification and statistics reached about 75.12% on average.

Keywords: tracking and statistics, vehicle, signalized intersection, motion parameter, trajectory

Procedia PDF Downloads 219
7964 Spatial Working Memory Is Enhanced by the Differential Outcome Procedure in a Group of Participants with Mild Cognitive Impairment

Authors: Ana B. Vivas, Antonia Ypsilanti, Aristea I. Ladas, Angeles F. Estevez

Abstract:

Mild cognitive impairment (MCI) is considered an intermediate stage between normal and pathological aging, as a substantial percentage of people diagnosed with MCI later convert to dementia of the Alzheimer's type. Memory is one of the first cognitive processes to deteriorate in this condition. In the present study, we employed the differential outcomes procedure (DOP) to improve visuospatial memory in a group of participants with MCI. The DOP uses a conditional discriminative learning task in which a correct choice response to a specific stimulus-stimulus association is reinforced with a particular reinforcer or outcome. A group of 10 participants with MCI and a matched control group had to learn and keep in working memory four target locations out of eight possible locations where a shape could be presented. Results showed that participants with MCI had significantly better terminal accuracy when a unique outcome was paired with a location (76%) than in a non-differential outcome condition (64%). This finding suggests that the DOP is useful for improving working memory in MCI patients, which may delay their conversion to dementia.

Keywords: mild cognitive impairment, working memory, differential outcomes, cognitive process

Procedia PDF Downloads 458
7963 Spatiotemporal Neural Network for Video-Based Pose Estimation

Authors: Bin Ji, Kai Xu, Shunyu Yao, Jingjing Liu, Ye Pan

Abstract:

Human pose estimation is a popular research area in computer vision because of its important applications in human-machine interfaces. In recent years, 2D human pose estimation based on convolutional neural networks has made great progress. However, more and more practical applications involve video, so it is natural to consider how to combine spatial and temporal information to balance computing cost and accuracy. To address this issue, this study proposes a new spatiotemporal model, Spatiotemporal Net (STNet), that combines temporal and spatial information more rationally; as a result, the predicted keypoint heatmaps are potentially more accurate and spatially more precise. While maintaining recognition accuracy, the algorithm handles the spatiotemporal series in a decoupled way, which greatly reduces the model's computation and thus its resource consumption. This study demonstrates the effectiveness of our network on the Penn Action dataset, and the results indicate superior performance over existing methods.

Keywords: convolutional long short-term memory, deep learning, human pose estimation, spatiotemporal series

Procedia PDF Downloads 147
7962 Revolutionizing Financial Forecasts: Enhancing Predictions with Graph Convolutional Networks (GCN) - Long Short-Term Memory (LSTM) Fusion

Authors: Ali Kazemi

Abstract:

Within volatile and interconnected international financial markets, accurately predicting market trends holds substantial value for traders and financial institutions. Traditional machine learning techniques have made significant strides in forecasting market movements; however, the complex and networked nature of financial data calls for more sophisticated approaches. This study presents a method for financial market prediction that leverages the synergistic capabilities of Graph Convolutional Networks (GCNs) and Long Short-Term Memory (LSTM) networks. The proposed algorithm is designed to forecast the trends of stock market indices and cryptocurrency prices, using a comprehensive dataset spanning January 1, 2015, to December 31, 2023. This period, marked by substantial volatility and transformation in financial markets, provides a solid basis for training and testing the predictive model. The algorithm integrates diverse data sources to construct a dynamic financial graph that reflects market intricacies. Daily opening, closing, high, and low prices were collected for key stock market indices (e.g., S&P 500, NASDAQ) and major cryptocurrencies (e.g., Bitcoin, Ethereum), ensuring a holistic view of market trends. Daily trading volumes were also incorporated to capture market activity and liquidity, providing critical insight into buying and selling dynamics. Furthermore, recognizing the profound influence of the macroeconomic environment on financial markets, key macroeconomic indicators such as interest rates, inflation rates, GDP growth, and unemployment rates were integrated into the model. The GCN component learns relational patterns among the financial instruments, which are represented as nodes in a comprehensive market graph.
Edges in this graph encode relationships based on co-movement patterns and sentiment correlations, enabling the model to grasp the complex network of influences governing market movements. Complementing this, the LSTM component is trained on sequences of the spatio-temporal representations learned by the GCN, enriched with historical price and volume data, allowing it to capture and predict temporal market trends accurately. In a comprehensive evaluation of the GCN-LSTM algorithm across the stock market and cryptocurrency datasets, the model demonstrated superior predictive accuracy and profitability compared to conventional and alternative machine learning benchmarks. Specifically, the model achieved a Mean Absolute Error (MAE) of 0.85%, indicating high precision in predicting daily price movements. The Root Mean Square Error (RMSE) was 1.2%, underscoring the model's effectiveness in minimizing large prediction errors, which is vital in volatile markets. Furthermore, when assessed on directional market movements, the model achieved an accuracy of 78%, significantly outperforming the benchmark models, which averaged 65%. This level of accuracy is instrumental for strategies that depend on predicting the direction of price moves. This study demonstrates the efficacy of combining graph-based and sequential deep learning in financial market prediction and highlights the value of a comprehensive, data-driven evaluation framework. The findings offer investors and financial analysts a powerful tool for navigating the complexities of modern financial markets.
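The reported MAE, RMSE, and directional-accuracy figures correspond to standard forecast metrics. A minimal sketch of how such metrics are computed follows; the daily returns below are made-up illustrations, not the study's data:

```python
import numpy as np

def evaluate_forecast(y_true, y_pred):
    """Return MAE, RMSE, and directional accuracy for daily return forecasts."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mae = np.mean(np.abs(y_pred - y_true))
    rmse = np.sqrt(np.mean((y_pred - y_true) ** 2))
    # Directional accuracy: fraction of days where the predicted sign
    # of the movement matches the realized sign.
    directional = np.mean(np.sign(y_pred) == np.sign(y_true))
    return mae, rmse, directional

# Hypothetical daily returns (in %): realized vs. predicted
actual = [0.5, -1.2, 0.3, 0.8, -0.4]
predicted = [0.4, -0.9, -0.1, 1.0, -0.6]
mae, rmse, acc = evaluate_forecast(actual, predicted)
```

The same three functions, applied to the model's out-of-sample predictions, would yield figures comparable to the 0.85% MAE, 1.2% RMSE, and 78% directional accuracy reported above.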

Keywords: financial market prediction, graph convolutional networks (GCNs), long short-term memory (LSTM), cryptocurrency forecasting

Procedia PDF Downloads 63
7961 A Theoretical Study on Pain Assessment through Human Facial Expression

Authors: Mrinal Kanti Bhowmik, Debanjana Debnath Jr., Debotosh Bhattacharjee

Abstract:

Facial expression is a significant channel of human communication and can be used to extract emotional features accurately. People in pain often show variations in facial expression that are readily observable to others, and a core set of facial actions tends to occur, or to increase in intensity, when people are in pain. To describe such changes in facial appearance, the Facial Action Coding System (FACS) was pioneered by Ekman and Friesen for human observers. According to Prkachin and Solomon, a small set of these actions carries the bulk of the information about pain, and on this basis the Prkachin and Solomon Pain Intensity (PSPI) metric was defined. Facial expression, as a behavioral channel of communication, therefore provides an important window into the non-verbal communication of pain. People express pain in many ways, and this pain behavior is the basis on which most inferences about pain are drawn in clinical and research settings. To understand the roles of different pain behaviors, it is essential to study their properties; for the past several years, research has concentrated on one specific form of pain behavior, facial expression. This paper presents a comprehensive study of pain assessment approaches that model and estimate the intensity of the pain a patient is suffering. It also reviews the historical background of different pain assessment techniques in the context of painful expressions. The approaches surveyed incorporate FACS from a psychological perspective and estimate a pain intensity score using the PSPI metric. The paper provides an in-depth analysis of the different approaches used in pain estimation, presents the observations drawn from each technique, and offers a brief study of the features that distinguish real from fake pain.
The necessity of this study lies in the emerging field of painful-face assessment in clinical settings.
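The PSPI metric referenced above is conventionally computed from a handful of FACS action-unit intensities. A minimal sketch of the standard formulation follows; the example intensities are illustrative:

```python
def pspi_score(au4, au6, au7, au9, au10, au43):
    """Prkachin and Solomon Pain Intensity (PSPI) from FACS action-unit
    intensities: brow lowering (AU4), orbital tightening (AU6, AU7),
    and levator contraction (AU9, AU10), each coded 0-5, plus eye
    closure (AU43), coded 0 or 1.  Range: 0 (no pain) to 16."""
    return au4 + max(au6, au7) + max(au9, au10) + au43

# Example: moderate brow lowering and orbital tightening, eyes open
score = pspi_score(au4=2, au6=3, au7=1, au9=0, au10=1, au43=0)
```

Because only the stronger of each paired action unit contributes, the metric rewards the overall configuration of the pain face rather than summing every coded muscle movement.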

Keywords: facial action coding system (FACS), pain, pain behavior, Prkachin and Solomon pain intensity (PSPI)

Procedia PDF Downloads 345
7960 Systematic Review of Quantitative Risk Assessment Tools and Their Effect on Racial Disproportionality in Child Welfare Systems

Authors: Bronwen Wade

Abstract:

Over the last half-century, child welfare systems have increasingly relied on quantitative risk assessment tools, such as actuarial or predictive risk tools. These tools are developed through statistical analysis of how attributes captured in administrative data relate to future child maltreatment. Some scholars argue that attributes in administrative data can serve as proxies for race and that quantitative risk assessment tools reify racial bias in decision-making. Others argue that these tools provide more “objective” and “scientific” guides for decision-making than subjective social worker judgment. This study performs a systematic review of the literature on the impact of quantitative risk assessment tools on racial disproportionality; it examines methodological biases in work on this topic, summarizes key findings, and provides suggestions for further work. A search of CINAHL, PsycINFO, the ProQuest Social Science Premium Collection, and the ProQuest Dissertations and Theses Collection was performed. Academic and grey literature were included. The review includes studies that use quasi-experimental methods as well as development, validation, or re-validation studies of quantitative risk assessment tools. PROBAST (Prediction model Risk of Bias Assessment Tool) and CHARMS (CHecklist for critical Appraisal and data extraction for systematic Reviews of prediction Modelling Studies) were used to assess the risk of bias and guide data extraction for the development, validation, and re-validation studies. ROBINS-I (Risk of Bias in Non-Randomized Studies of Interventions) was used to assess bias and guide data extraction for the quasi-experimental studies identified. Due to heterogeneity among the papers, a meta-analysis was not feasible, and a narrative synthesis was conducted. Eleven papers met the eligibility criteria, and each has an overall high risk of bias based on the PROBAST and ROBINS-I assessments.
This is deeply concerning, as major policy decisions have been made based on a limited number of studies with a high risk of bias. The findings on racial disproportionality have been mixed and depend on the tool and approach used. Authors use various definitions for racial equity, fairness, or disproportionality. These concepts of statistical fairness are connected to theories about the reason for racial disproportionality in child welfare or social definitions of fairness that are usually not stated explicitly. Most findings from these studies are unreliable, given the high degree of bias. However, some of the less biased measures within studies suggest that quantitative risk assessment tools may worsen racial disproportionality, depending on how disproportionality is mathematically defined. Authors vary widely in their approach to defining and addressing racial disproportionality within studies, making it difficult to generalize findings or approaches across studies. This review demonstrates the power of authors to shape policy or discourse around racial justice based on their choice of statistical methods; it also demonstrates the need for improved rigor and transparency in studies of quantitative risk assessment tools. Finally, this review raises concerns about the impact that these tools have on child welfare systems and racial disproportionality.
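To make concrete why findings "depend on how disproportionality is mathematically defined," the toy sketch below contrasts two common statistical definitions, selection-rate parity and false-positive-rate parity, on the same hypothetical screening decisions (all names and values are invented, not data from any reviewed study):

```python
def selection_rate(decisions, groups, group):
    """Fraction of cases in `group` flagged high-risk (decision == 1)."""
    flagged = [d for d, g in zip(decisions, groups) if g == group]
    return sum(flagged) / len(flagged)

def false_positive_rate(decisions, outcomes, groups, group):
    """Among cases in `group` with no subsequent maltreatment finding
    (outcome == 0), the fraction nonetheless flagged high-risk."""
    negatives = [d for d, o, g in zip(decisions, outcomes, groups)
                 if g == group and o == 0]
    return sum(negatives) / len(negatives)

# Two groups with identical selection rates but unequal false-positive rates
decisions = [1, 0, 1, 0]          # 1 = flagged high-risk
outcomes  = [1, 0, 0, 1]          # 1 = later substantiated maltreatment
groups    = ["A", "A", "B", "B"]

sel_a = selection_rate(decisions, groups, "A")
sel_b = selection_rate(decisions, groups, "B")
fpr_a = false_positive_rate(decisions, outcomes, groups, "A")
fpr_b = false_positive_rate(decisions, outcomes, groups, "B")
```

Here a tool satisfies selection-rate parity (both groups flagged at 50%) while completely failing false-positive-rate parity, illustrating how an author's choice of fairness definition can reverse a study's conclusion.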

Keywords: actuarial risk, child welfare, predictive risk, racial disproportionality

Procedia PDF Downloads 51
7959 Factor Analysis Based on Semantic Differential of the Public Perception of Public Art: A Case Study of the Malaysia National Monument

Authors: Yuhanis Ibrahim, Sung-Pil Lee

Abstract:

This study addresses the factors that contribute to an assessment framework for public art, memorial monuments specifically. Memorial monuments carry significant and rich messages, whether the intention of the work is to mark and commemorate an important event or to inform younger generations about the past. A public monument should relate to the public and raise awareness of significant issues. By investigating the impact of an existing public memorial artwork, this study aims to help the stakeholders of upcoming public art projects ensure that a lucid memorial message is delivered directly to the public. The public is the main actor, as the public is the fundamental purpose for which the art was created, and perception is framed as one of the reliable evaluation tools for assessing the factors behind a public artwork's impact. The Malaysia National Monument was selected as the case study. Public perceptions were first gathered using a questionnaire involving 115 participants to obtain keywords; the Semantic Differential Methodology (SDM) was then adopted to evaluate perceptions of the memorial monument. These perceptions were measured for reliability and then factorized using Factor Analysis with the Principal Component Analysis (PCA) method to obtain a concise set of factors for monument assessment. The results revealed four factors that influence public perception of the monument: aesthetics, audience, topology, and public reception. The study concludes by proposing these factors for the assessment of future public memorial art projects, especially in Malaysia.
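The standardize-then-extract-components step of the analysis above can be sketched as follows; the ratings matrix and the bipolar adjective scales are hypothetical illustrations, not the study's questionnaire items:

```python
import numpy as np

# Hypothetical semantic-differential ratings: rows are respondents,
# columns are bipolar adjective scales (1-7), e.g. ugly-beautiful,
# chaotic-ordered, cold-warm, distant-engaging.
ratings = np.array([
    [6, 5, 2, 3],
    [7, 6, 3, 2],
    [5, 5, 2, 3],
    [2, 3, 6, 6],
    [3, 2, 7, 5],
    [2, 2, 6, 6],
], dtype=float)

# Standardize each scale, then extract principal components via SVD.
z = (ratings - ratings.mean(axis=0)) / ratings.std(axis=0)
u, s, vt = np.linalg.svd(z, full_matrices=False)
explained = s**2 / np.sum(s**2)  # variance share of each component
loadings = vt                    # rows: components, cols: scales
```

Scales that load heavily on the same component are grouped and named (here the study arrived at aesthetics, audience, topology, and public reception); the `explained` shares indicate how many factors are worth retaining.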

Keywords: factor analysis, public art, public perception, semantical differential methodology

Procedia PDF Downloads 501
7958 Optimized Real Ground Motion Scaling for Vulnerability Assessment of Building Considering the Spectral Uncertainty and Shape

Authors: Chen Bo, Wen Zengping

Abstract:

Building on the results of previous studies, this work focuses on the selection and scaling of real ground motion records for structural performance-based seismic evaluation using nonlinear dynamic analysis. The input earthquake ground motions should be determined appropriately so that they are compatible with the site-specific hazard level considered. Thus, an optimized selection and scaling method is established that uses not only Monte Carlo simulation to create stochastic simulated spectra, accounting for the multivariate lognormal distribution of the target spectrum, but also a spectral shape parameter. Its application to structural fragility analysis is demonstrated through case studies. Compared to previous schemes that do not consider the uncertainty of the target spectrum, the method shown here ensures that the selected records agree well with the median value, standard deviation, and spectral correlation of the target spectrum, and more fully reflects the uncertainty of the site-specific hazard level. Meanwhile, it helps improve computational efficiency and matching accuracy. Given the important influence of the target spectrum's uncertainty on structural seismic fragility analysis, this work provides a reasonable and reliable basis for structural seismic evaluation under scenario earthquake environments.
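The Monte Carlo step, simulating target spectra from a multivariate lognormal distribution, can be sketched as below. The periods, medians, dispersions, and correlation model are assumptions for illustration, not the study's hazard inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target spectrum: median spectral acceleration (g) and
# logarithmic standard deviation at a few periods.
periods = np.array([0.2, 0.5, 1.0, 2.0])
median_sa = np.array([0.80, 0.55, 0.30, 0.12])
sigma_ln = np.array([0.45, 0.50, 0.55, 0.60])

# Assumed inter-period correlation decaying with period separation.
idx = np.arange(len(periods))
corr = 0.8 ** np.abs(np.subtract.outer(idx, idx))

# Covariance of log spectral ordinates, then joint lognormal sampling:
# each row of `spectra` is one simulated realization of the target spectrum.
cov = corr * np.outer(sigma_ln, sigma_ln)
n = 20000
log_samples = rng.multivariate_normal(np.log(median_sa), cov, size=n)
spectra = np.exp(log_samples)

# The sample medians should reproduce the target median spectrum.
sample_median = np.median(spectra, axis=0)
```

Real records would then be selected and scaled to match these simulated spectra, so that the record set carries the target spectrum's median, dispersion, and inter-period correlation rather than the median alone.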

Keywords: ground motion selection, scaling method, seismic fragility analysis, spectral shape

Procedia PDF Downloads 291
7957 The Change of Urban Land Use/Cover Using Object Based Approach for Southern Bali

Authors: I. Gusti A. A. Rai Asmiwyati, Robert J. Corner, Ashraf M. Dewan

Abstract:

Changes in land use/cover (LULC) strongly affect spatial structure and function; they can disrupt socio-cultural practices and disturb physical elements. It has therefore become essential to understand the dynamics of LULC in time and space, as this understanding is a critical input for developing sustainable LULC. This study attempts to map and monitor LULC change in Bali, Indonesia, from 2003 to 2013. Using object-based classification to improve accuracy, together with change detection, multi-temporal land use/cover data were extracted from a set of ASTER satellite images. The overall accuracies of the classification maps for 2003 and 2013 were 86.99% and 80.36%, respectively. Built-up area and paddy field were the dominant land use/cover types in both years. The dominant patch increase in 2003 illustrated the rapid fragmentation of paddy fields and the large transformation taking place. This approach is new for the diverse urban features of Bali, which has been growing rapidly, and it improved the classification accuracy compared with manual pixel-based classification.
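Overall accuracy figures such as the 86.99% and 80.36% reported above are conventionally derived from a confusion matrix of classified versus reference labels. A sketch with a hypothetical three-class matrix:

```python
import numpy as np

def overall_accuracy(confusion):
    """Overall accuracy from a confusion matrix whose rows are the
    classified map labels and whose columns are the reference labels."""
    confusion = np.asarray(confusion, dtype=float)
    return np.trace(confusion) / confusion.sum()

def kappa(confusion):
    """Cohen's kappa: agreement corrected for chance agreement."""
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    po = np.trace(confusion) / total                 # observed agreement
    pe = np.sum(confusion.sum(axis=0) *
                confusion.sum(axis=1)) / total**2    # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical 3-class assessment: built-up, paddy field, other
cm = [[50, 3, 2],
      [4, 40, 6],
      [1, 2, 42]]
oa = overall_accuracy(cm)
```

Reporting kappa alongside overall accuracy is common in LULC accuracy assessment because it discounts the agreement expected by chance from the class proportions.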

Keywords: land use/cover, urban, Bali, ASTER

Procedia PDF Downloads 539
7956 Assessing the Influence of Station Density on Geostatistical Prediction of Groundwater Levels in a Semi-arid Watershed of Karnataka

Authors: Sakshi Dhumale, Madhushree C., Amba Shetty

Abstract:

The effect of station density on the geostatistical prediction of groundwater levels is of critical importance to ensure accurate and reliable predictions. Monitoring station density directly impacts the accuracy and reliability of geostatistical predictions by influencing the model's ability to capture localized variations and small-scale features in groundwater levels. This is particularly crucial in regions with complex hydrogeological conditions and significant spatial heterogeneity. Insufficient station density can result in larger prediction uncertainties, as the model may struggle to adequately represent the spatial variability and correlation patterns of the data. On the other hand, an optimal distribution of monitoring stations enables effective coverage of the study area and captures the spatial variability of groundwater levels more comprehensively. In this study, we investigate the effect of station density on the predictive performance of groundwater levels using the geostatistical technique of Ordinary Kriging. The research utilizes groundwater level data collected from 121 observation wells within the semi-arid Berambadi watershed, gathered over a six-year period (2010-2015) from the Indian Institute of Science (IISc), Bengaluru. The dataset is partitioned into seven subsets representing varying sampling densities, ranging from 15% (12 wells) to 100% (121 wells) of the total well network. The results obtained from different monitoring networks are compared against the existing groundwater monitoring network established by the Central Ground Water Board (CGWB). The findings of this study demonstrate that higher station densities significantly enhance the accuracy of geostatistical predictions for groundwater levels. The increased number of monitoring stations enables improved interpolation accuracy and captures finer-scale variations in groundwater levels. 
These results shed light on the relationship between station density and the geostatistical prediction of groundwater levels, emphasizing the importance of appropriate station densities to ensure accurate and reliable predictions. The insights gained from this study have practical implications for designing and optimizing monitoring networks, facilitating effective groundwater level assessments, and enabling sustainable management of groundwater resources.
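Ordinary Kriging, the interpolation technique used above, can be sketched minimally as follows. This generic implementation assumes a linear variogram and toy well locations; it is not the Berambadi network or the study's fitted variogram:

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, slope=1.0):
    """Minimal ordinary kriging with a linear variogram g(h) = slope * h.
    xy: (n, 2) station coordinates, z: (n,) groundwater levels,
    xy0: (2,) prediction location.  Returns the kriged estimate."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    gamma = slope * d
    # Kriging system: [[gamma, 1], [1, 0]] @ [w, mu] = [gamma0, 1],
    # where the last row/column enforces that the weights sum to one.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = slope * np.linalg.norm(xy - xy0, axis=1)
    w = np.linalg.solve(A, b)[:n]
    return float(w @ z)

# Hypothetical wells at four corners, predicting the level at the center
wells = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
levels = np.array([10., 12., 14., 16.])
estimate = ordinary_kriging(wells, levels, np.array([0.5, 0.5]))
```

Dropping wells from `xy` and `z` and comparing the kriged surface against the withheld observations is exactly the kind of density experiment the study performs across its 15% to 100% network subsets.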

Keywords: station density, geostatistical prediction, groundwater levels, monitoring networks, interpolation accuracy, spatial variability

Procedia PDF Downloads 56
7955 A Phishing Email Detection Approach Using Machine Learning Techniques

Authors: Kenneth Fon Mbah, Arash Habibi Lashkari, Ali A. Ghorbani

Abstract:

Phishing e-mails are a security issue that not only annoys online users but has also resulted in significant financial losses for businesses. Phishing advertisements and pornographic e-mails are difficult to detect, as attackers have become increasingly intelligent and professional. Attackers track users and adjust their attacks based on users' interests and on hot topics that can be extracted from community news and journals. This research focuses on deceptive phishing attacks and their variants, such as attacks delivered through advertisements and pornographic e-mails. We propose a framework called the Phishing Alerting System (PHAS) to accurately classify e-mails as phishing, advertisement, or pornographic. PHAS can detect and alert users to all types of deceptive e-mails to support their decision making. A well-known e-mail dataset was used for these experiments; based on previously extracted features, a detection accuracy of 93.11% is obtainable using the J48 and KNN machine learning techniques. Our proposed framework achieved approximately the same accuracy as the benchmark on this dataset.
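The KNN classification step can be sketched as below; the e-mail features and their values here are hypothetical illustrations, not the features extracted from the benchmark dataset:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify feature vector x by majority vote among its k nearest
    training examples (Euclidean distance)."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(dists)[:k]]
    values, counts = np.unique(nearest, return_counts=True)
    return values[np.argmax(counts)]

# Hypothetical features: [n_urls, n_urgent_words, mismatched_sender]
X = np.array([[8, 5, 1], [6, 4, 1], [7, 6, 1],   # phishing examples
              [1, 0, 0], [0, 1, 0], [2, 0, 0]])  # legitimate examples
y = np.array(["phish", "phish", "phish", "ham", "ham", "ham"])

label = knn_predict(X, y, np.array([7, 5, 1]))
```

In practice the features are extracted from e-mail headers, URLs, and body text, and the detection accuracy is measured by cross-validation against labeled phishing, advertisement, and pornographic classes.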

Keywords: phishing e-mail, phishing detection, anti phishing, alarm system, machine learning

Procedia PDF Downloads 336
7954 Age-Related Changes of the Sella Turcica Morphometry in Adults Older Than 20-25 Years

Authors: Yu. I. Pigolkin, M. A. Garcia Corro

Abstract:

Age determination of unknown dead bodies in forensic personal identification is a complicated process involving numerous methods and techniques. Skeletal remains are less exposed to the influence of environmental factors. To enhance the accuracy of forensic age estimation, additional properties of bones that correlate with age need to be identified. Material and Methods: Dimensional examination of the sella turcica was carried out on cadavers whose crania had been opened with a circular vibrating saw. The sample consisted of 90 Russian subjects ranging in age from two months to 87 years. Results: A consistent pattern of dimensional variation throughout life was detected. No gender differences were observed in the morphometry of the sella turcica. Using the depth and length of the sella turcica together made it possible to assign an examined specimen to a certain age period. Conclusions: Combined with the results of existing age-determination methods, the morphometry of the sella turcica can serve as an additional characteristic that corroborates the values obtained and, accordingly, increases the accuracy of forensic biological age diagnosis.
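The idea of using sella turcica depth and length jointly to place a specimen in an age period can be sketched as a least-squares fit; all measurements below are invented for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical (depth mm, length mm, age yr) triples, illustration only
depth  = np.array([4.0, 5.5, 6.8, 7.4, 8.1, 8.6])
length = np.array([6.0, 7.2, 8.9, 9.5, 10.3, 10.8])
age    = np.array([1., 12., 30., 45., 62., 80.])

# Least-squares fit: age ~ b0 + b1 * depth + b2 * length
X = np.column_stack([np.ones_like(depth), depth, length])
coef, *_ = np.linalg.lstsq(X, age, rcond=None)

def estimate_age(d, l):
    """Estimated age (years) from sella turcica depth and length (mm)."""
    return coef[0] + coef[1] * d + coef[2] * l

fitted = X @ coef
```

In forensic practice the continuous estimate would be reported as an age interval, reflecting the residual scatter of the fit and the natural variability of the bone dimensions.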

Keywords: age-related changes in bone structures, forensic personal identification, sella turcica morphometry, body identification

Procedia PDF Downloads 274
7953 Virtual Chemistry Laboratory as Pre-Lab Experiences: Stimulating Student's Prediction Skill

Authors: Yenni Kurniawati

Abstract:

Prediction skill in chemistry experiments is important for pre-service chemistry students, as it stimulates reflective thinking at each stage of many chemistry experiments, both qualitatively and quantitatively. A virtual chemistry laboratory was designed to give students the opportunity and time to practice many kinds of chemistry experiments repeatedly, anywhere and at any time, before they perform the real experiment. The content of the virtual chemistry laboratory was constructed using the Model of Educational Reconstruction and developed to enhance students' ability to predict experimental results, analyze the causes of error, and calculate accuracy and precision while handling chemicals carefully. The research showed changes in students' decision-making and a strong awareness of accuracy, but still a low concern for precision. It enhanced students' level of reflective thinking related to their prediction skill by one to two stages on average. Most students could predict the characteristics of the product of an experiment, and even whether the result would be erroneous. In addition, they took the experiments more seriously and were more curious about the results. This study recommends extending the approach to other subject matter, providing more opportunities for students to learn about other kinds of chemistry experiment designs.
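The accuracy and precision the students are asked to calculate can be sketched with the usual definitions (percent error of the mean for accuracy, relative standard deviation for precision); the titration volumes below are illustrative values, not data from the study:

```python
import numpy as np

def accuracy_and_precision(measurements, true_value):
    """Accuracy: closeness of the sample mean to the true value
    (percent error).  Precision: spread of repeated measurements
    (relative standard deviation, percent)."""
    m = np.asarray(measurements, dtype=float)
    percent_error = abs(m.mean() - true_value) / true_value * 100
    rsd = m.std(ddof=1) / m.mean() * 100
    return percent_error, rsd

# Hypothetical repeated titration volumes (mL), true value 25.00 mL
trial = [24.9, 25.1, 25.0, 24.8, 25.2]
err, rsd = accuracy_and_precision(trial, 25.00)
```

A student whose trials are accurate but imprecise (small `err`, large `rsd`), as the study observed, centers on the right value yet scatters widely from run to run.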

Keywords: virtual chemistry laboratory, chemistry experiments, prediction skill, pre-lab experiences

Procedia PDF Downloads 338