Search results for: hierarchical text classification models
7500 Representations of Wolves (Canis lupus) in Feature Films: The Detailed Analysis of the Text and Picture in the Chosen Movies
Authors: Barbara Klimek
Abstract:
Wolves are one of the most misrepresented species in literature and the media. They are often portrayed as vicious, man-eating beasts whose main life goal is to hunt and kill people. Many movie directors use wolves as their main characters in different types of films, especially horror, thriller and science fiction movies, to create gore and fear. This, in turn, results in people being afraid of wolves and wanting to destroy them. Such cultural creations have caused wolves to be stalked, abused and killed by people, and in many areas they have been completely wiped out. This paper analyzes the representations of wolves in the chosen films in four main aspects: 1. the overall picture: true versus false, positive versus negative, based on stereotypes or realistic, displaying wolf behavior typical of the species or fake; 2. subjectivity: how humans treat and talk about the animals, as subjects or as objects; 3. animal welfare: how humans treat wolves and nature, and whether the human-animal relations are positive and appropriate or negative and abusive; 4. empathy: whether human characters are shown to share the wolves' suffering, whether they display signs of empathy towards the animals, and whether the animals empathize with humans. The detailed analysis of the text and pictures in the chosen films concludes that wolves are strongly misrepresented in the movies. Their behavior is shown as fake and negative, based on stereotypes and myths, and the human-animal relations are shown mainly as negative: people fear the animals and hunt them, and wolves stalk, follow, attack and kill humans. This shows that people do not understand the needs of these animals and are unable to show empathy towards them. The article discusses the above-mentioned study results in detail and presents many examples. Animal representations in cultural creations, including film, have a great impact on how people treat particular species of animals.
The media shape people’s attitudes, which in turn results in people either respecting and protecting the animals or fearing, disliking and destroying the particular species.
Keywords: film, movies, representations, wolves
Procedia PDF Downloads 213
7499 Channel Characteristics and Morphometry of a Part of Umtrew River, Meghalaya
Authors: Pratyashi Phukan, Ranjan Saikia
Abstract:
Morphometry incorporates the quantitative study of the area, altitude, volume and slope profiles of a land, and the drainage basin characteristics of the area concerned. Fluvial geomorphology includes the consideration of the linear, areal and relief aspects of a fluvially originated drainage basin. The linear aspect deals with the hierarchical orders of streams, the numbers and lengths of stream segments, and the various relationships among them. The areal aspect includes the analysis of basin perimeters, basin shape, basin area, and related morphometric laws. The relief aspect incorporates, besides hypsometric, climographic and altimetric analysis, the study of absolute and relative reliefs, relief ratios, average slope, etc. In this paper we have analysed the relationship among stream velocity, channel shape, sediment load, channel width, channel depth, etc.
Keywords: morphometry, hydraulic geometry, Umtrew river, Meghalaya
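The linear aspect described above (hierarchical stream orders and the numbers of streams in each order) is commonly summarized by Horton's bifurcation ratio; a minimal sketch, with hypothetical stream counts rather than Umtrew basin data:

```python
# Bifurcation ratio R_b = N_u / N_(u+1): the number of streams of one
# order divided by the number of streams of the next higher order.
def bifurcation_ratios(stream_counts):
    """stream_counts[u] = number of streams of order u+1, lowest order first."""
    return [stream_counts[i] / stream_counts[i + 1]
            for i in range(len(stream_counts) - 1)]

# Hypothetical counts for a 4th-order basin: 64 first-order streams,
# 16 second-order, 4 third-order, 1 fourth-order.
counts = [64, 16, 4, 1]
ratios = bifurcation_ratios(counts)       # [4.0, 4.0, 4.0]
mean_rb = sum(ratios) / len(ratios)       # 4.0
```

Natural basins typically show a mean bifurcation ratio between about 3 and 5; strong deviations hint at structural control of the drainage network.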
Procedia PDF Downloads 459
7498 Communication Strategies of Russian-English Asymmetric Bilinguals Given Insufficient Language Faculty
Authors: Varvara Tyurina
Abstract:
In the age of globalization, Internet communication as a new format of interaction has become an integral part of our daily routine. The Internet environment allows for new conditions and provides participants in a communication act with extra communication tools which can be used on Internet forums or in chat rooms. As a result, communicants tend to alter their behavior patterns in contrast to those practiced in live communication. It is not yet clear which communication strategies participants in Internet communication abide by and what determines their choices. Given the continually changing environment of a forum or a chat, the behavior of a communicant can be interpreted in terms of autopoiesis theory, which sees adaptation as the major tool for coexistence between the living system and its niche. Each communication act is seen as interaction between the communicant (i.e., the living system) and the overall environment of the forum (i.e., the niche) rather than one particular interlocutor. When communicating via the Internet, participants are believed to aim at reaching a balance between themselves and the environment of a forum or a chat. The research focuses on unveiling the adaptation strategies employed by a communicant in particular cases and looks into the reasons they are employed. There is a correlation between the language faculty of the communicants and the strategies they opt for when communicating on Internet forums and in chat rooms. The research included an experiment with a sample of Russian-English asymmetric bilinguals aged 16-25. Respondents were given two texts of equivalent content but of different language complexity. They had to respond to the texts as if they were making a reciprocal comment at a forum.
It has been revealed that when communicants realize that their language faculty is not sufficient to understand the initial text, they tend to amend their communication strategy in order to maintain the balance with the niche (remain involved in the communication). The most common strategies for responding to a difficult-to-understand text were self-presentation, veiling of poor language faculty, and response evasion. The research has so far focused on a very narrow aspect of the correlation between language faculty and communication behavior, namely the syntactic and lexicological complexity of the initial texts. It is essential to conduct a series of experiments that dwell on other characteristics of the texts to determine the range of cases in which language faculty determines the choice of adaptation strategy.
Keywords: adaptation, communication strategies, internet communication, verbal interaction, autopoiesis theory
Procedia PDF Downloads 362
7497 Reliability Evaluation of a Payment Model in Mobile E-Commerce Using Colored Petri Net
Authors: Abdolghader Pourali, Mohammad V. Malakooti, Muhammad Hussein Yektaie
Abstract:
A mobile payment system in mobile e-commerce generally has high security, so that the user can trust it for business deals, sales, financial transactions, etc. However, an architecture or payment model in e-commerce only shows the way of interaction and collaboration among users and mortgagers; it does not present stakeholders with any evaluation of the effectiveness and trustworthiness of financial transactions. In this paper, we present a detailed assessment of the reliability of a mobile payment model in mobile e-commerce using formal models and colored Petri nets. Finally, we demonstrate that the reliability of this system has a high value (case study: a secure payment model in mobile commerce).
Keywords: reliability, colored Petri net, assessment, payment models, m-commerce
Procedia PDF Downloads 537
7496 Applying Arima Data Mining Techniques to ERP to Generate Sales Demand Forecasting: A Case Study
Authors: Ghaleb Y. Abbasi, Israa Abu Rumman
Abstract:
This paper modeled sales history, archived from 2012 to 2015 and aggregated in monthly bins, for five products of a medical supply company in Jordan. Consistent patterns extracted from the sales demand history in the Enterprise Resource Planning (ERP) system were used to generate sales demand forecasts using the time series analysis technique Auto Regressive Integrated Moving Average (ARIMA). This was used to model and estimate realistic sales demand patterns and to decide the best models for the five products. Analysis revealed that the current replenishment system led to inventory overstocking.
Keywords: ARIMA models, sales demand forecasting, time series, R code
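The differencing-plus-autoregression core of an ARIMA fit of the kind described above can be sketched in a few lines; the monthly demand series below is synthetic (a trend plus noise), not the company's sales history, and only the AR(1) component after one differencing step is shown:

```python
import numpy as np

# Sketch of the core of an ARIMA(1,1,0) fit on synthetic monthly demand:
# difference the series once (the "I" step) to remove the trend, then
# fit an AR(1) model to the differences by least squares.
np.random.seed(0)
demand = 100.0 + 2.0 * np.arange(48) + np.random.normal(0.0, 1.0, 48)

diff = np.diff(demand)                                  # d = 1
X = np.column_stack([diff[:-1], np.ones(len(diff) - 1)])
phi, c = np.linalg.lstsq(X, diff[1:], rcond=None)[0]    # AR(1) coefficient

# One-step-ahead forecast of next month's demand:
forecast = demand[-1] + phi * diff[-1] + c
```

A production fit (e.g. with an ARIMA routine in R, as the keywords suggest the authors used) would also select the orders p, d, q and estimate an MA component; this sketch shows only the mechanics.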
Procedia PDF Downloads 385
7495 Little RAGNER: Toward Lightweight, Generative, Named Entity Recognition through Prompt Engineering, and Multi-Level Retrieval Augmented Generation
Authors: Sean W. T. Bayly, Daniel Glover, Don Horrell, Simon Horrocks, Barnes Callum, Stuart Gibson, Mac Misuira
Abstract:
We assess the suitability of recent, ∼7B parameter, instruction-tuned language models for Generative Named Entity Recognition (GNER). Alongside Retrieval Augmented Generation (RAG), and supported by task-specific prompting, our proposed Multi-Level Information Retrieval method achieves notable improvements over fine-tuned entity-level and sentence-level methods. We conclude that language models directed toward this task are highly capable when distinguishing between positive classes (precision). However, smaller models seem to struggle to find all entities (recall). Poorly defined classes such as "Miscellaneous" exhibit substantial declines in performance, likely due to the ambiguity they introduce into the prompt. This is partially resolved through a self-verification method using engineered prompts that encode stricter class definitions, particularly where class boundaries are in danger of overlapping, such as the conflation between the location "Britain" and the nationality "British". Finally, we explore correlations between model performance on the GNER task and performance on relevant academic benchmarks.
Keywords: generative named entity recognition, information retrieval, lightweight artificial intelligence, prompt engineering, personal information identification, retrieval augmented generation, self-verification
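The self-verification idea described above (re-checking a candidate entity against stricter class definitions, e.g. location "Britain" versus nationality "British") can be sketched as a prompt builder; the class definitions and wording below are illustrative, not the authors' actual prompts:

```python
# Illustrative self-verification prompt for generative NER.
# The class definitions are hypothetical, written to separate classes
# whose boundaries risk overlapping.
CLASS_DEFINITIONS = {
    "LOC": "A physical place or geopolitical entity, e.g. 'Britain'.",
    "MISC": "Nationalities, adjectival forms and other named entities "
            "that fit no other class, e.g. 'British'.",
}

def verification_prompt(sentence, entity, predicted_class):
    defs = "\n".join(f"- {c}: {d}" for c, d in CLASS_DEFINITIONS.items())
    return (
        f"Class definitions:\n{defs}\n\n"
        f"Sentence: {sentence}\n"
        f"Candidate entity: '{entity}' predicted as {predicted_class}.\n"
        "Answer 'yes' if the prediction matches the definitions, "
        "otherwise answer with the correct class."
    )

prompt = verification_prompt("British firms invested abroad.", "British", "LOC")
```

The model's answer to such a prompt either confirms the first-pass label or corrects it, which is how the conflation of adjectival nationalities with locations can be caught.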
Procedia PDF Downloads 47
7494 The Use of Drones in Measuring Environmental Impacts of the Forest Garden Approach
Authors: Andrew J. Zacharias
Abstract:
The forest garden approach (FGA) was established by Trees for the Future (TREES) over the organization’s 30 years of agroforestry projects in Sub-Saharan Africa. This method transforms traditional agricultural systems into highly managed gardens that produce food and marketable products year-round. The effects of the FGA on food security, dietary diversity, and economic resilience have been measured closely, and TREES has begun to closely monitor the environmental impacts through the use of sensors mounted on unmanned aerial vehicles, commonly known as 'drones'. These drones collect thousands of pictures to create 3-D models in both the visible and the near-infrared wavelengths. Analysis of these models provides TREES with quantitative and qualitative evidence of improvements to the annual above-ground biomass and leaf area indices, as measured in-situ using NDVI calculations.
Keywords: agroforestry, biomass, drones, NDVI
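The NDVI figure mentioned above is computed per pixel from the near-infrared and visible red bands of the drone imagery; a minimal sketch, with hypothetical reflectance values:

```python
# NDVI = (NIR - Red) / (NIR + Red), per pixel, bounded in [-1, 1].
# Healthy vegetation reflects strongly in NIR and absorbs red, so it
# scores close to +1; bare soil and water score near zero or below.
def ndvi(nir, red):
    denom = nir + red
    return (nir - red) / denom if denom != 0 else 0.0

# Hypothetical reflectance values (0-1 scale) from a drone survey:
vegetation = ndvi(nir=0.6, red=0.1)    # ≈ 0.714, dense canopy
bare_soil = ndvi(nir=0.25, red=0.2)    # ≈ 0.111, sparse cover
```

Tracking the distribution of such per-pixel values across repeat flights is what allows year-on-year biomass and leaf-area comparisons.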
Procedia PDF Downloads 157
7493 An Adjusted Network Information Criterion for Model Selection in Statistical Neural Network Models
Authors: Christopher Godwin Udomboso, Angela Unna Chukwu, Isaac Kwame Dontwi
Abstract:
In selecting a statistical neural network model, the Network Information Criterion (NIC) has been observed to be sample biased, because it does not account for sample size. The selection of a model from a set of fitted candidate models requires objective, data-driven criteria. In this paper, we derive and investigate the Adjusted Network Information Criterion (ANIC), based on Kullback’s symmetric divergence, which is designed to be an asymptotically unbiased estimator of the expected Kullback-Leibler information of a fitted model. The analyses show that, in general, the ANIC improves model selection over a wider range of sample sizes than does the NIC.
Keywords: statistical neural network, network information criterion, adjusted network information criterion, transfer function
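The sample-size adjustment described above is in the same spirit as the small-sample correction of AIC (AICc); the sketch below uses that analogy to show the shape of such a correction and is not the authors' exact ANIC formula:

```python
# NIC-style criterion: -2 * log-likelihood + 2k, with k the number of
# network parameters. The adjusted form below adds a penalty that is
# large when the sample size n is small relative to k and vanishes as
# n grows, analogous to the AIC -> AICc correction (illustrative only,
# not the authors' derivation).
def nic(log_lik, k):
    return -2.0 * log_lik + 2.0 * k

def adjusted_nic(log_lik, k, n):
    return nic(log_lik, k) + (2.0 * k * (k + 1)) / (n - k - 1)

small_sample = adjusted_nic(log_lik=-120.0, k=10, n=30)    # heavy extra penalty
large_sample = adjusted_nic(log_lik=-120.0, k=10, n=3000)  # ~ plain NIC
```

The point of such a correction is exactly the abstract's claim: rankings of candidate networks stop flipping as the criterion stays honest at small n.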
Procedia PDF Downloads 567
7492 Experimental Parameters’ Effects on the Electrical Discharge Machining Performances
Authors: Asmae Tafraouti, Yasmina Layouni, Pascal Kleimann
Abstract:
The growing market for Microsystems (MST) and Micro-Electromechanical Systems (MEMS) is driving the research for alternative manufacturing techniques to microelectronics-based technologies, which are generally expensive and time-consuming. Hot embossing and micro-injection molding of thermoplastics appear to be industrially viable processes. However, both require the use of master models, usually made of hard materials such as steel. These master models cannot be fabricated using standard microelectronics processes. Thus, other micromachining processes are used, such as laser machining or micro-electrical discharge machining (µEDM). In this work, µEDM has been used. The principle of µEDM is based on the use of a thin cylindrical micro-tool that erodes the workpiece surface. The two electrodes are immersed in a dielectric with a distance of a few micrometers (gap). When an electrical voltage is applied between the two electrodes, electrical discharges are generated, which machine the material. In order to produce master models with high resolution and smooth surfaces, it is necessary to control the discharge mechanism well. However, several problems are encountered, such as the randomness of the electrical discharge process, fluctuation of the discharge energy, inversion of the electrodes' polarity, and wear of the micro-tool. The effect of different parameters, such as the applied voltage, the working capacitor, the micro-tool diameter, and the initial gap, has been studied. This analysis helps to improve the machining performance, such as the workpiece surface condition and the lateral gap of the craters.
Keywords: craters, electrical discharges, micro-electrical discharge machining, microsystems
Procedia PDF Downloads 74
7491 Electrical Load Estimation Using Estimated Fuzzy Linear Parameters
Authors: Bader Alkandari, Jamal Y. Madouh, Ahmad M. Alkandari, Anwar A. Alnaqi
Abstract:
A new formulation of the fuzzy linear estimation problem is presented. It is formulated as a linear programming problem. The objective is to minimize the spread of the data points, taking into consideration the type of membership function of the fuzzy parameters, so as to satisfy the constraints on each measurement point and to ensure that the original membership is included in the estimated membership. Different models are developed for a fuzzy triangular membership. The proposed models are applied to different examples from the area of fuzzy linear regression and finally to examples of estimating the electrical load on a busbar. It has been found that the proposed technique is well suited for electrical load estimation, since the nature of the load is characterized by uncertainty and vagueness.
Keywords: fuzzy regression, load estimation, fuzzy linear parameters, electrical load estimation
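The linear-programming formulation described above can be sketched in the classical Tanaka style: symmetric triangular fuzzy coefficients (center, spread), total spread minimized subject to every measurement lying inside the estimated fuzzy band. The load data below are hypothetical, and this is a generic possibilistic-regression sketch rather than the authors' exact model:

```python
import numpy as np
from scipy.optimize import linprog

# Fuzzy linear regression y ~ A0 + A1*x with triangular coefficients
# A_j = (a_j, c_j): center a_j, nonnegative spread c_j. Minimize the
# total spread sum_i (c0 + c1*|x_i|) subject to each y_i being covered
# by the band [center - spread, center + spread].
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # hypothetical load driver
y = np.array([2.1, 3.9, 6.2, 8.0, 9.8])   # hypothetical busbar load

n = len(x)
# Decision variables: [a0, a1, c0, c1].
cost = np.array([0.0, 0.0, n, np.abs(x).sum()])
A_ub = np.vstack([
    np.column_stack([-np.ones(n), -x, -np.ones(n), -np.abs(x)]),  # band top >= y
    np.column_stack([np.ones(n), x, -np.ones(n), -np.abs(x)]),    # band bottom <= y
])
b_ub = np.concatenate([-y, y])
res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None), (None, None), (0, None), (0, None)])
a0, a1, c0, c1 = res.x
```

The optimal spreads directly express the uncertainty and vagueness of the load that the abstract refers to: the noisier the measurements, the wider the minimal covering band.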
Procedia PDF Downloads 540
7490 Finite Element-Based Stability Analysis of Roadside Settlements Slopes from Barpak to Yamagaun through Laprak Village of Gorkha, an Epicentral Location after the 7.8Mw 2015 Barpak, Gorkha, Nepal Earthquake
Authors: N. P. Bhandary, R. C. Tiwari, R. Yatabe
Abstract:
The research employs the finite element method to evaluate the stability of roadside settlement slopes from Barpak to Yamagaun through Laprak village of Gorkha, Nepal, after the 7.8Mw 2015 Barpak, Gorkha, Nepal earthquake. It includes three major villages of Gorkha, i.e., Barpak, Laprak and Yamagaun, that were devastated by the 2015 Gorkha earthquake. The road head distances from Barpak to Laprak and from Laprak to Yamagaun are about 14 and 29 km, respectively. The epicentral distances of the main shock of magnitude 7.8 and the aftershock of magnitude 6.6 were respectively 7 and 11 kilometers (south-east) from the Barpak village, nearer to Laprak and Yamagaun. It is also believed that the epicenter of the main shock was not in the Barpak village, as reported until now, but somewhere nearer to the Yamagaun village; the chaos experienced during the earthquake in Yamagaun was much greater than in Barpak. In this context, we have carried out a detailed study to investigate the stability of the Yamagaun settlement slope as a case study, where ground fissures, ground settlement, multiple cracks and toe failures are the most severe. In this regard, the stability issues of the existing settlements and the proposed road alignment on the Yamagaun village slope, which is surrounded by many newly activated landslides, are addressed. Looking at the importance of this issue, a field survey was carried out to understand the behavior of the ground fissures and the multiple failure characteristics of the slopes. The results suggest that the Yamagaun slope at Profiles 2-2, 3-3 and 4-4 is not safe enough for infrastructure development, even in normal soil slope conditions, as per material models 2, 3 and 4; however, the slope seems quite safe at Profile 1-1 for all four material models. The results also indicate that the first three profiles are marginally safe for material models 2, 3 and 4, respectively. Profile 4-4 is not safe enough for all four material models.
Thus, Profile 4-4 needs special care to make the slope stable.
Keywords: earthquake, finite element method, landslide, stability
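The study itself uses a finite element analysis; as a much simpler point of reference, the trade-off between slope angle and soil strength that decides whether a profile is "safe" can be sketched with the infinite-slope limit-equilibrium factor of safety. The soil parameters below are hypothetical, and this baseline is explicitly not the FEM used in the paper:

```python
import math

# Infinite-slope limit-equilibrium factor of safety:
#   FS = (c' + gamma*z*cos^2(beta)*tan(phi)) / (gamma*z*sin(beta)*cos(beta))
# FS > 1 means resisting forces exceed driving forces. Parameters are
# hypothetical, for illustration only.
def factor_of_safety(c, phi_deg, gamma, z, beta_deg):
    """c: cohesion (kPa); phi: friction angle (deg); gamma: unit weight
    (kN/m^3); z: failure-surface depth (m); beta: slope angle (deg)."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    resisting = c + gamma * z * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

gentle = factor_of_safety(c=10.0, phi_deg=30.0, gamma=18.0, z=3.0, beta_deg=20.0)
steep = factor_of_safety(c=10.0, phi_deg=30.0, gamma=18.0, z=3.0, beta_deg=40.0)
```

With the same soil, steepening the slope from 20° to 40° drops the factor of safety toward unity, which is the marginal-safety regime the abstract reports for several profiles.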
Procedia PDF Downloads 348
7489 A Bathtub Curve from Nonparametric Model
Authors: Eduardo C. Guardia, Jose W. M. Lima, Afonso H. M. Santos
Abstract:
This paper presents a nonparametric method to obtain the hazard rate “bathtub curve” for power system components. The model is a mixture of the three known phases of a component’s life: the decreasing failure rate (DFR), the constant failure rate (CFR) and the increasing failure rate (IFR), represented by three parametric Weibull models. The parameters are obtained from a simultaneous fitting process of the model to the kernel nonparametric hazard rate curve. From the Weibull parameters and failure rate curves, the useful lifetime and the characteristic lifetime are defined. To demonstrate the model, the historic time-to-failure data of distribution transformers were used as an example. The resulting “bathtub curve” shows the failure rate over the equipment lifetime, which can be applied in economic and replacement decision models.
Keywords: bathtub curve, failure analysis, lifetime estimation, parameter estimation, Weibull distribution
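The three-phase mixture described above can be sketched directly: a Weibull hazard is decreasing for shape β < 1, constant for β = 1 and increasing for β > 1, and summing one of each yields a bathtub shape. The shape and scale values below are illustrative, not parameters fitted to transformer data:

```python
# Weibull hazard rate: h(t) = (beta/eta) * (t/eta)**(beta - 1).
def weibull_hazard(t, beta, eta):
    return (beta / eta) * (t / eta) ** (beta - 1)

# Bathtub hazard as the sum of three Weibull hazards (illustrative
# parameters, not fitted to transformer failure data):
def bathtub_hazard(t):
    return (weibull_hazard(t, beta=0.5, eta=1.0)      # infant mortality (DFR)
            + weibull_hazard(t, beta=1.0, eta=10.0)   # useful life (CFR)
            + weibull_hazard(t, beta=5.0, eta=20.0))  # wear-out (IFR)

early = bathtub_hazard(0.1)      # high: early failures dominate
useful = bathtub_hazard(10.0)    # low, near-constant mid-life rate
wear_out = bathtub_hazard(30.0)  # rising again: ageing dominates
```

In the paper's setting, the fitting step adjusts the three (β, η) pairs so this composite curve tracks the kernel-estimated hazard; the crossover points then delimit the useful lifetime.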
Procedia PDF Downloads 446
7488 Epigenetic Drugs for Major Depressive Disorder: A Critical Appraisal of Available Studies
Authors: Aniket Kumar, Jacob Peedicayil
Abstract:
Major depressive disorder (MDD) is a common and important psychiatric disorder. Several clinical features of MDD suggest an epigenetic basis for its pathogenesis. Since epigenetics (heritable changes in gene expression not involving changes in DNA sequence) may underlie the pathogenesis of MDD, epigenetic drugs such as DNA methyltransferase inhibitors (DNMTi) and histone deacetylase inhibitors (HDACi) may be useful for treating MDD. The available literature indexed in PubMed on preclinical drug trials of epigenetic drugs for the treatment of MDD was investigated. The search terms used were ‘depression’ or ‘depressive’ and ‘HDACi’ or ‘DNMTi’. Among epigenetic drugs, it was found that there were three preclinical trials using HDACi and three using DNMTi for the treatment of MDD. All the trials were conducted on rodents (mice or rats). The animal models of depression that were used were the learned helplessness-induced model, the forced swim test, the open field test, and the tail suspension test. One study used a genetic rat model of depression (the Flinders Sensitive Line). The HDACi tested were sodium butyrate, compound 60 (Cpd-60), and valproic acid. The DNMTi tested were 5-azacytidine and decitabine. All three preclinical trials using HDACi showed an antidepressant effect in animal models of depression, as did all three preclinical trials using DNMTi. Thus, epigenetic drugs, namely HDACi and DNMTi, may prove to be useful in the treatment of MDD and merit further investigation for the treatment of this disorder.
Keywords: DNA methylation, drug discovery, epigenetics, major depressive disorder
Procedia PDF Downloads 188
7487 Design and Implementation of Wireless Synchronized AI System for Security
Authors: Saradha Priya
Abstract:
Developing a virtual human is very important to meet the challenges that occur in many applications where humans find it difficult or risky to perform the task. A robot is a machine that can perform a task automatically or with guidance. Robotics is generally a combination of artificial intelligence and physical machines (motors), while computational intelligence involves the programmed instructions. This project proposes a robotic vehicle that has a camera, a PIR sensor, and text-command-based movement. It is specially designed to perform surveillance and a few other tasks in the most efficient way. Serial communication takes place between a remote base station, a GUI application, and a PC.
Keywords: ZigBee, camera, PIR sensor, wireless transmission, DC motor
Procedia PDF Downloads 349
7486 A Biomechanical Model for the Idiopathic Scoliosis Using the Antalgic-Trak Technology
Authors: Joao Fialho
Abstract:
The mathematical modelling of idiopathic scoliosis has been studied throughout the years. The models presented in those papers are based on the orthotic stabilization of idiopathic scoliosis, in which a transversal force is applied to the human spine in a continuous form. When considering the ATT (Antalgic-Trak Technology) device, the existing models cannot be used, as the forces applied are no longer transversal nor applied in a continuous manner; in this device, vertical traction is applied. In this study we propose to model idiopathic scoliosis using the ATT (Antalgic-Trak Technology) device and, with the parameters obtained from the mathematical modelling, set up a case-by-case individualized therapy plan for each patient.
Keywords: idiopathic scoliosis, mathematical modelling, human spine, Antalgic-Trak technology
Procedia PDF Downloads 269
7485 Quality Assessment of New Zealand Mānuka Honeys Using Hyperspectral Imaging Combined with Deep 1D-Convolutional Neural Networks
Authors: Hien Thi Dieu Truong, Mahmoud Al-Sarayreh, Pullanagari Reddy, Marlon M. Reis, Richard Archer
Abstract:
New Zealand mānuka honey is a honeybee product derived mainly from Leptospermum scoparium nectar. The potent antibacterial activity of mānuka honey derives principally from methylglyoxal (MGO), in addition to the hydrogen peroxide and other lesser activities present in all honey. MGO is formed from dihydroxyacetone (DHA), unique to L. scoparium nectar. Mānuka honey also has an idiosyncratic phenolic profile that is useful as a chemical marker. Authentic mānuka honey is highly valuable, but almost all honey is formed from natural mixtures of nectars harvested by a hive over a time period. Once diluted by other nectars, mānuka honey irrevocably loses value. We aimed to apply hyperspectral imaging to honey frames before bulk extraction to minimise the dilution of genuine mānuka honey by other honey and to ensure authenticity at the source. This technology is non-destructive and suitable for an industrial setting. Chemometrics using linear Partial Least Squares (PLS) and Support Vector Machine (SVM) models showed limited efficacy in interpreting the chemical footprints, due to large non-linear relationships between predictor and predictand in a large sample set, likely caused by honey quality variability across geographic regions. Therefore, an advanced modelling approach, one-dimensional convolutional neural networks (1D-CNN), was investigated for analysing the hyperspectral data and extracting biochemical information from honey. The 1D-CNN model showed superior prediction of honey quality (R² = 0.73, RMSE = 2.346, RPD = 2.56) to PLS (R² = 0.66, RMSE = 2.607, RPD = 1.91) and SVM (R² = 0.67, RMSE = 2.559, RPD = 1.98). Classification of mono-floral mānuka honey from multi-floral and non-mānuka honey exceeded 90% accuracy for all models tried. Overall, this study reveals the potential of HSI and deep learning modelling for automating the evaluation of honey quality in frames.
Keywords: mānuka honey, quality, purity, potency, deep learning, 1D-CNN, chemometrics
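The basic building block a 1D-CNN applies to a spectrum (a learned filter slid along the wavelength axis, a nonlinearity, then pooling) can be sketched in plain numpy; the spectrum and kernel below are toy values, not honey data, and a real model stacks many trained filters:

```python
import numpy as np

# One 1D-CNN building block for spectral data: 1-D cross-correlation
# over the wavelength axis, ReLU activation, global max pooling.
def conv1d_relu_maxpool(spectrum, kernel):
    n, k = len(spectrum), len(kernel)
    feature_map = np.array([
        np.dot(spectrum[i:i + k], kernel) for i in range(n - k + 1)
    ])
    feature_map = np.maximum(feature_map, 0.0)  # ReLU
    return feature_map.max()                    # global max pooling

# An edge-detecting kernel responds to rising slopes in the spectrum:
activation = conv1d_relu_maxpool(np.array([1.0, 2.0, 3.0, 4.0]),
                                 np.array([-1.0, 0.0, 1.0]))  # 2.0
```

Because the same kernel is applied at every wavelength position, the filter can pick up a diagnostic absorption feature wherever it sits in the band, which is what lets the 1D-CNN capture the non-linear spectral structure that defeated PLS and SVM here.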
Procedia PDF Downloads 139
7484 Translation and Transculturality in Contemporary Chinese Art: A Case Study of Gu Wenda’s 'Forest of Stone Steles' and 'United Nations: Temple of Heaven'
Authors: Rui Zhang
Abstract:
Translation has been elevated to one of the key notions in contemporary cultural discourse for a wide range of fields. It focuses not only on communication or transmission of meaning between different languages, but also on ways in which the very act of translation can be understood as a metaphor for cultural process. In recent years, the notion of translation has been employed by some contemporary Chinese artists in a conceptual way, whose works contribute to constructing/deconstructing global/local cultural discourse and their own cultural identities. This study examines two artworks by contemporary Chinese artist Gu Wenda from a translational perspective, namely Forest of Stone Steles - Retranslation & Rewriting of Tang Poetry and United Nations - China Monument: Temple of Heaven, aiming to broaden the scope of Translation Studies to investigate visual culture and enrich the methodological approach to contemporary Chinese art. Focusing on the relationship between translation, visuality and materiality in these two works, this study explores the nature of translation as part of the production of cultural discourse in the age of globalization as well as a way of establishing cultural identity. Gu Wenda, one of the most prestigious artists in contemporary China, is considered a pioneer of the ’85 Art Movement in China, and thereafter he went abroad for his artistic pursuits. His transnational experience enriches his cultural identity and the underlying discourse constructed/deconstructed in many of his works. In the two works already mentioned, the concept of translation is deployed by Gu Wenda on both a linguistic level and a metaphorical level for artistic expression. These two works produce discourses in which the artist’s perception of cultural identity in a transnational context is articulated by the tension between source text and target text.
Based on the conceptual framework of cultural identity proposed by Stuart Hall, analyses of Gu Wenda’s cultural identity as revealed through translation in these two works are centred on two axes, i.e., the axis of similarity and continuity with Chinese intellectual culture and the axis of difference and rupture with it, and the dialogic relationship between these two vectors. It argues that, besides serving as a means of constructing visuality in the two works, translation metaphorizes Gu Wenda’s journey from overcoming his cultural identity anxiety to re-establishing a transcultural identity embedded in the underlying discourse.
Keywords: contemporary Chinese art, cultural identity, transculturality, translation
Procedia PDF Downloads 497
7483 Locus of Control, Metacognitive Knowledge, Metacognitive Regulation, and Student Performance in an Introductory Economics Course
Authors: Ahmad A. Kader
Abstract:
In the Principles of Microeconomics course taught during the Fall Semester 2019, 158 out of 179 students completed two questionnaires and a survey describing their demographic and academic profiles. The two questionnaires comprise the 29 items of the Rotter Locus of Control Scale and the 52 items of the Schraw and Dennison Metacognitive Awareness Scale; the 52 items consist of 17 items describing knowledge of cognition and 37 items describing regulation of cognition. The paper is intended to show the combined influence of locus of control, metacognitive knowledge, and metacognitive regulation on student performance. The survey covers variables that have been tested and recognized in the economic education literature, which include GPA, gender, age, course level, race, student classification, whether the course was required or elective, employment, whether a high school economics course was taken, and attendance. Regression results show that of the economic education variables, GPA, classification, whether the course was required or elective, and attendance are the only significant influences on student grade. Of the educational psychology variables, the regression results show that the locus of control variable has a negative and significant effect, while the metacognitive knowledge variable has a positive and significant effect on student grade. Also, the adjusted R-squared value increased markedly with the addition of the locus of control, metacognitive knowledge, and metacognitive regulation variables to the regression equation. The t-test results also show that students who are internally oriented and score high on the metacognitive knowledge scale significantly outperform students who are externally oriented and score low on the metacognitive knowledge scale.
The implications of these results for educators are discussed in the paper.
Keywords: locus of control, metacognitive knowledge, metacognitive regulation, student performance, economic education
Procedia PDF Downloads 120
7482 On the Use of Analytical Performance Models to Design a High-Performance Active Queue Management Scheme
Authors: Shahram Jamali, Samira Hamed
Abstract:
One of the open issues in the Random Early Detection (RED) algorithm is how to set its parameters to reach high performance under the dynamic conditions of the network. Although the original RED uses fixed values for its parameters, this paper follows a model-based approach to upgrade the performance of the RED algorithm. It models the router’s queue behavior using a Markov model and uses this model to predict future conditions of the queue. This prediction helps the proposed algorithm to tune RED’s parameters and provide efficiency and better performance. Extensive packet-level simulations confirm that the proposed algorithm, called Markov-RED, outperforms RED and FARED in terms of queue stability, bottleneck utilization and dropped packet count.
Keywords: active queue management, RED, Markov model, random early detection algorithm
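The RED mechanics that such a model-based scheme tunes can be sketched in a few lines: an exponentially weighted moving average (EWMA) of the queue length, and a drop probability that rises linearly between two thresholds. The threshold and weight values below are illustrative, not the paper's tuned settings:

```python
# RED's two core computations. A model-based scheme such as Markov-RED
# adjusts min_th, max_th and max_p at run time instead of fixing them.
def ewma(avg, sample, weight=0.002):
    """Smoothed queue-length estimate updated on each packet arrival."""
    return (1.0 - weight) * avg + weight * sample

def red_drop_probability(avg, min_th=5.0, max_th=15.0, max_p=0.1):
    if avg < min_th:
        return 0.0          # queue short: accept everything
    if avg >= max_th:
        return 1.0          # queue long: force drop
    # Linear ramp from 0 to max_p between the two thresholds:
    return max_p * (avg - min_th) / (max_th - min_th)

p_low, p_mid, p_high = (red_drop_probability(a) for a in (3.0, 10.0, 20.0))
```

Because the average, not the instantaneous queue length, drives the drop decision, short bursts pass through while sustained congestion is signalled early; the open problem the abstract addresses is choosing the ramp parameters as traffic changes.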
Procedia PDF Downloads 539
7481 Ontology Expansion via Synthetic Dataset Generation and Transformer-Based Concept Extraction
Authors: Andrey Khalov
Abstract:
The rapid proliferation of unstructured data in IT infrastructure management demands innovative approaches for extracting actionable knowledge. This paper presents a framework for ontology-based knowledge extraction that combines relational graph neural networks (R-GNN) with large language models (LLMs). The proposed method leverages the DOLCE framework as the foundational ontology, extending it with concepts from ITSMO for domain-specific applications in IT service management and outsourcing. A key component of this research is the use of transformer-based models, such as DeBERTa-v3-large, for automatic entity and relationship extraction from unstructured texts. Furthermore, the paper explores how transfer learning techniques can be applied to fine-tune a large language model (LLaMA) to generate synthetic datasets that improve precision in BERT-based entity recognition and ontology alignment. The resulting IT Ontology (ITO) serves as a comprehensive knowledge base that integrates domain-specific insights from ITIL processes, enabling more efficient decision-making. Experimental results demonstrate significant improvements in knowledge extraction and relationship mapping, offering a cutting-edge solution for enhancing cognitive computing in IT service environments.
Keywords: ontology expansion, synthetic dataset, transformer fine-tuning, concept extraction, DOLCE, BERT, taxonomy, LLM, NER
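The synthetic-dataset step described above produces (text, labeled-span) pairs for NER fine-tuning. The paper generates them with a fine-tuned LLM; as a minimal stand-in, a template filler with hypothetical ITSM-flavoured entities shows the output format such a generator must produce:

```python
import random

# Template-based sketch of synthetic NER training data. Templates and
# entity values are hypothetical; in the paper this role is played by a
# fine-tuned LLaMA model rather than fixed templates.
TEMPLATES = [
    "{service} ticket escalated to {team}.",
    "{team} resolved the outage affecting {service}.",
]
ENTITIES = {
    "service": ["Email", "VPN", "Payroll"],
    "team": ["Network Ops", "Service Desk"],
}

def generate_example(rng):
    template = rng.choice(TEMPLATES)
    fills = {slot: rng.choice(vals) for slot, vals in ENTITIES.items()}
    text = template.format(**fills)
    spans = []  # (start, end, label) character spans for the NER model
    for slot, value in fills.items():
        start = text.find(value)
        if start != -1:
            spans.append((start, start + len(value), slot.upper()))
    return text, spans

rng = random.Random(0)
sample_text, sample_spans = generate_example(rng)
```

However the examples are produced, the span offsets must index the exact entity surface forms in the generated text, since that alignment is what the BERT-style token classifier is trained against.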
Procedia PDF Downloads 14
7480 Using Historical Data for Stock Prediction
Authors: Sofia Stoica
Abstract:
In this paper, we use historical data to predict the stock price of a tech company. To this end, we use a dataset consisting of the stock prices over the past five years of ten major tech companies – Adobe, Amazon, Apple, Facebook, Google, Microsoft, Netflix, Oracle, Salesforce, and Tesla. We experimented with a variety of models – a linear regression model, K-Nearest Neighbors (KNN), and a sequential neural network – and algorithms – Multiplicative Weight Update and AdaBoost. We found that the sequential neural network performed best, with a testing error of 0.18%. Interestingly, the linear model performed second best, with a testing error of 0.73%. These results show that historical data alone is enough to obtain high accuracy, and a simple algorithm like linear regression performs comparably to more sophisticated models while taking less time and fewer resources to implement.
Keywords: finance, machine learning, opening price, stock market
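The linear-regressor baseline can be reproduced with plain least squares on lagged prices. The five-day window below is an assumption for illustration, not the paper's configuration:

```python
import numpy as np

def make_lagged(prices, n_lags=5):
    """Turn a price series into (previous n_lags prices, next price) pairs."""
    prices = np.asarray(prices, dtype=float)
    X = np.stack([prices[i:i + n_lags] for i in range(len(prices) - n_lags)])
    y = prices[n_lags:]
    return X, y

def fit_linear(X, y):
    """Ordinary least squares with an intercept term."""
    A = np.column_stack([X, np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_next(coef, window):
    """Predict the next price from the most recent lag window."""
    return float(np.dot(coef[:-1], window) + coef[-1])
```

On a perfectly linear price trend this recovers the next price exactly, which is the sense in which a simple model can compete with heavier ones on smooth historical data.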
Procedia PDF Downloads 190
7479 Classification of Coughing and Breathing Activities Using Wearable and a Light-Weight DL Model
Authors: Subham Ghosh, Arnab Nandi
Abstract:
Background: The proliferation of Wireless Body Area Networks (WBAN) and Internet of Things (IoT) applications demonstrates the potential for continuous monitoring of physical changes in the body. These technologies are vital for health monitoring tasks, such as identifying coughing and breathing activities, which are necessary for disease diagnosis and management. Monitoring activities such as coughing and deep breathing can provide valuable insights into a variety of medical issues. Wearable radio-based antenna sensors, which are lightweight and easy to incorporate into clothing or portable goods, provide continuous monitoring. This mobility gives them a substantial advantage over stationary environmental sensors such as cameras and radar, which are constrained to fixed locations. Furthermore, using compressive techniques provides benefits such as reduced data transmission rates and memory requirements. These wearable sensors thus offer more advanced and diverse health monitoring capabilities. Methodology: This study analyzes the feasibility of using a semi-flexible antenna operating at 2.4 GHz (ISM band) and positioned around the neck and near the mouth to identify three activities: coughing, deep breathing, and idleness. A vector network analyzer (VNA) is used to collect time-varying complex reflection coefficient data from the perturbed antenna nearfield. The reflection coefficient (S11) conveys nuanced information caused by simultaneous variations in the nearfield radiation of the three activities over time. The signatures are sparsely represented with Gaussian-windowed Gabor spectrograms. A reassignment step sharpens the ridges of the spectrogram images to improve their resolution and focus on essential components. The antenna is biocompatible in terms of specific absorption rate (SAR).
The sparsely represented Gabor spectrogram images are fed into a lightweight deep learning (DL) model for feature extraction and classification. Two antenna locations are investigated in order to determine the most effective placement for the three activities. Findings: Cross-validation techniques were used on data from both locations. Due to the complex form of the recorded S11, separate analyses and assessments were performed on the magnitude, phase, and their combination. The combination of magnitude and phase fared better than the separate analyses. Various sliding window sizes, ranging from 1 to 5 seconds, were tested to find the best window for activity classification. It was found that a neck-mounted design was effective at detecting the three distinct activities.
Keywords: activity recognition, antenna, deep-learning, time-frequency
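A Gabor spectrogram is an STFT taken with a Gaussian window. A minimal version (without the ridge-reassignment step the study applies, and with illustrative window length and standard deviation) can be sketched with SciPy:

```python
import numpy as np
from scipy.signal import spectrogram

def gabor_spectrogram(sig, fs, nperseg=128, gauss_std=16):
    """Magnitude spectrogram with a Gaussian window (i.e., a Gabor transform).
    Returns (frequencies, times, spectrogram matrix)."""
    freqs, times, S = spectrogram(sig, fs=fs, window=("gaussian", gauss_std),
                                  nperseg=nperseg, noverlap=nperseg // 2)
    return freqs, times, S
```

In the pipeline above, `sig` would be the time-varying S11 record (magnitude, phase, or their combination), and the resulting images would feed the lightweight DL classifier.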
Procedia PDF Downloads 10
7478 Downscaling GRACE Gravity Models Using Spectral Combination Techniques for Terrestrial Water Storage and Groundwater Storage Estimation
Authors: Farzam Fatolazadeh, Kalifa Goita, Mehdi Eshagh, Shusen Wang
Abstract:
The Gravity Recovery and Climate Experiment (GRACE) is a satellite mission with twin satellites for the precise determination of spatial and temporal variations in the Earth’s gravity field. The products of this mission are monthly global gravity models containing the spherical harmonic coefficients and their errors. These GRACE models can be used for estimating terrestrial water storage (TWS) variations across the globe at large scales, thereby offering an opportunity for surface and groundwater storage (GWS) assessments. Yet, the ability of GRACE to monitor changes at smaller scales is too limited for local water management authorities. This is largely due to the low spatial and temporal resolutions of its models (~200,000 km2 and one month, respectively). High-resolution GRACE data products would substantially enrich the information that is needed by local-scale decision-makers while offering data for regions that lack adequate in situ monitoring networks, including northern parts of Canada. Such products could eventually be obtained through downscaling. In this study, we extended the spectral combination theory to simultaneously downscale GRACE spatiotemporally, from its coarse 3° spatial resolution to 0.25° and from its monthly temporal resolution to daily. This method combines the monthly gravity field solutions of GRACE and daily hydrological model products, in the form of both low- and high-frequency signals, to produce high spatiotemporal resolution TWSA and GWSA products. The main contribution and originality of this study is to comprehensively and simultaneously consider GRACE and hydrological variables and their uncertainties to form the estimator in the spectral domain. The downscaled products are therefore expected to reach acceptable accuracy.
Keywords: GRACE satellite, groundwater storage, spectral combination, terrestrial water storage
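The core idea of spectral combination — take the reliable low-frequency content from GRACE and the high-frequency detail from the hydrological model — can be caricatured in one dimension with a hard spectral split. This is only a sketch: the estimator described above works in spherical harmonics and weights each band by its uncertainty, whereas the cutoff here is an arbitrary illustrative choice:

```python
import numpy as np

def spectral_combination_1d(coarse, fine, cutoff_bin):
    """Keep Fourier bins below cutoff_bin from the coarse (GRACE-like) signal
    and bins at or above it from the fine (hydrology-model) signal."""
    C = np.fft.rfft(coarse)
    F = np.fft.rfft(fine)
    combined = F.copy()
    combined[:cutoff_bin] = C[:cutoff_bin]
    return np.fft.irfft(combined, n=len(coarse))
```

The design point is that each source contributes the frequency band where it is trusted, which is what the uncertainty weighting generalizes.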
Procedia PDF Downloads 83
7477 Assessment of Pre-Processing Influence on Near-Infrared Spectra for Predicting the Mechanical Properties of Wood
Authors: Aasheesh Raturi, Vimal Kothiyal, P. D. Semalty
Abstract:
We studied the mechanical properties of Eucalyptus tereticornis using FT-NIR spectroscopy. First, spectra were pre-processed to eliminate useless information. Then, a prediction model was constructed by partial least squares regression. To study the influence of pre-processing on the prediction of mechanical properties for NIR analysis of wood samples, we applied various pretreatment methods: straight line subtraction, constant offset elimination, vector normalization, min-max normalization, multiplicative scattering correction, first derivative, second derivative, and their combinations with other treatments, such as first derivative + straight line subtraction, first derivative + vector normalization, and first derivative + multiplicative scattering correction. For each combination of preprocessing method and NIR region, RMSECV, RMSEP and the optimum factors/rank were obtained during the optimization process of model development. More than 350 combinations were evaluated. More than one pre-processing method gave good calibration/cross-validation and prediction/test models, but only the best calibration/cross-validation and prediction/test models are reported here. The results show that one can safely use the NIR region between 4000 and 7500 cm⁻¹ with straight line subtraction, constant offset elimination, first derivative or second derivative preprocessing, which were found to be the most appropriate for model development.
Keywords: FT-NIR, mechanical properties, pre-processing, PLS
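Several of the pretreatments listed above are one-liners on a spectra matrix (rows = samples, columns = wavenumbers). This sketch uses NumPy and illustrative data, not the study's spectra:

```python
import numpy as np

def constant_offset_elimination(spectra):
    """Shift each spectrum so its minimum is zero."""
    return spectra - spectra.min(axis=-1, keepdims=True)

def vector_normalization(spectra):
    """Scale each spectrum to unit Euclidean norm."""
    return spectra / np.linalg.norm(spectra, axis=-1, keepdims=True)

def first_derivative(spectra):
    """Numerical first derivative along the wavenumber axis."""
    return np.gradient(spectra, axis=-1)
```

Combinations such as "first derivative + vector normalization" are just function composition; the pretreated matrix then feeds the PLS regression.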
Procedia PDF Downloads 362
7476 Economic Development Impacts of Connected and Automated Vehicles (CAV)
Authors: Rimon Rafiah
Abstract:
This paper will present a combination of two seemingly unrelated models: one for estimating economic development impacts resulting from transportation investment, and another for increasing CAV penetration in order to reduce congestion. Measuring economic development impacts resulting from transportation investments is becoming more recognized around the world. Examples include the UK’s Wider Economic Benefits (WEB) model, Economic Impact Assessments in the USA, various input-output models, and additional models around the world. The economic impact model is based on WEB and rests on the following premise: investments in transportation will reduce the cost of personal travel, enabling firms to be more competitive, creating additional throughput (the same road allows more people to travel), and reducing the cost of workers' travel to a new workplace. This reduction in travel costs is estimated in out-of-pocket terms for a given localized area and is then translated into additional employment based on regional labor supply elasticity. This additional employment is conservatively assumed to be at minimum wage levels, translated into GDP terms, and from there into direct taxation (i.e., an increase in tax taken by the government). The CAV model is based on economic principles such as CAV usage, supply, and demand. CAV usage can increase capacity through a variety of means – increased automation (Levels I through IV) and also increased penetration and usage, which several forecasts predict will reach 50% by 2030, with possible full conversion by 2045-2050. Several countries have passed policies and/or legislation ending sales of new gasoline-powered vehicles from 2030 onward. Supply was measured via increased capacity on given infrastructure as a function of both CAV penetration and implemented technologies.
The CAV model, as implemented in the USA, has shown significant savings in travel time and in vehicle operating costs, which can be translated into economic development impacts in terms of job creation, GDP growth and salaries. The models also have policy implications and can be adapted for use in Japan.
Keywords: CAV, economic development, WEB, transport economics
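The WEB-style chain described above — travel cost reduction → extra employment via a labor-supply elasticity → GDP at minimum wage → direct taxation — is simple enough to state as arithmetic. Every number below is a made-up placeholder, not a result or calibration from the paper:

```python
def web_impact(cost_reduction_frac, labor_supply_elasticity,
               baseline_jobs, annual_min_wage, tax_rate):
    """Toy WEB chain: a fractional fall in generalized travel cost is
    converted to extra jobs, then to GDP, then to direct taxation."""
    extra_jobs = labor_supply_elasticity * cost_reduction_frac * baseline_jobs
    gdp_gain = extra_jobs * annual_min_wage
    tax_gain = gdp_gain * tax_rate
    return extra_jobs, gdp_gain, tax_gain
```

The conservative assumptions in the abstract (minimum-wage valuation, localized area) correspond to choosing small values for `annual_min_wage` and `baseline_jobs`.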
Procedia PDF Downloads 74
7475 Liquid-Liquid Equilibrium Study in Solvent Extraction of o-Cresol from Coal Tar
Authors: Dewi Selvia Fardhyanti, Astrilia Damayanti
Abstract:
Coal tar is a liquid by-product of coal gasification and carbonization processes, and also of some industries such as steel, power plants, cement, and others. This liquid oil mixture contains various kinds of useful compounds, such as aromatic and phenolic compounds. These compounds are widely used as raw materials for insecticides, dyes, medicines, perfumes, coloring matters, and many others. This research investigates thermodynamic modelling of liquid-liquid equilibria (LLE) in solvent extraction of o-Cresol from coal tar. The equilibria are modeled with ternary-component Wohl, Van Laar, and Three-Suffix Margules models. The values of the parameters involved are obtained by curve-fitting to the experimental data. Based on the comparison between calculated and experimental data, it turns out that, among the three models studied, the Three-Suffix Margules model seems to be the best at predicting the LLE of o-Cresol for these systems.
Keywords: coal tar, o-Cresol, Wohl, Van Laar, three-suffix margules
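For a binary pair, the three-suffix (two-parameter) Margules model gives closed-form activity coefficients, which is the building block behind the ternary fit described above. The parameter values used in the test are illustrative, not the study's fitted constants:

```python
import math

def margules_three_suffix(x1, A12, A21):
    """Two-parameter (three-suffix) Margules activity coefficients for a
    binary mixture: returns (gamma1, gamma2) at mole fraction x1."""
    x2 = 1.0 - x1
    ln_g1 = x2 ** 2 * (A12 + 2.0 * (A21 - A12) * x1)
    ln_g2 = x1 ** 2 * (A21 + 2.0 * (A12 - A21) * x2)
    return math.exp(ln_g1), math.exp(ln_g2)
```

In practice, A12 and A21 would be obtained by curve-fitting to tie-line data, as the abstract describes.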
Procedia PDF Downloads 277
7474 AutoML: Comprehensive Review and Application to Engineering Datasets
Authors: Parsa Mahdavi, M. Amin Hariri-Ardebili
Abstract:
The development of accurate machine learning and deep learning models traditionally demands hands-on expertise and a solid background to fine-tune hyperparameters. With the continuous expansion of datasets in various scientific and engineering domains, researchers increasingly turn to machine learning methods to unveil hidden insights that may elude classic regression techniques. This surge in adoption raises concerns about the adequacy of the resultant meta-models and, consequently, the interpretation of the findings. In response to these challenges, automated machine learning (AutoML) emerges as a promising solution, aiming to construct machine learning models with minimal intervention or guidance from human experts. AutoML encompasses crucial stages such as data preparation, feature engineering, hyperparameter optimization, and neural architecture search. This paper provides a comprehensive overview of the principles underpinning AutoML, surveying several widely used AutoML platforms. Additionally, the paper offers a glimpse into the application of AutoML on various engineering datasets. By comparing these results with those obtained through classical machine learning methods, the paper quantifies the uncertainties inherent in the application of a single ML model versus the holistic approach provided by AutoML. These examples showcase the efficacy of AutoML in extracting meaningful patterns and insights, emphasizing its potential to revolutionize the way we approach and analyze complex datasets.
Keywords: automated machine learning, uncertainty, engineering dataset, regression
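At its core, the hyperparameter optimization stage that AutoML automates is a cross-validated search loop. A deliberately tiny version over a ridge-regression grid illustrates the mechanics; the model family, grid, and fold count are all illustrative choices, not what any surveyed platform does:

```python
import numpy as np

def ridge_fit(X, y, alpha):
    """Closed-form ridge regression (no intercept, for brevity)."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

def tiny_automl(X, y, alphas, k=5):
    """Pick the alpha with the lowest k-fold cross-validated MSE."""
    n = len(X)
    folds = np.array_split(np.arange(n), k)
    best_alpha, best_err = None, np.inf
    for alpha in alphas:
        fold_errs = []
        for fold in folds:
            train = np.ones(n, dtype=bool)
            train[fold] = False
            w = ridge_fit(X[train], y[train], alpha)
            fold_errs.append(float(np.mean((X[fold] @ w - y[fold]) ** 2)))
        err = float(np.mean(fold_errs))
        if err < best_err:
            best_alpha, best_err = alpha, err
    return best_alpha, best_err
```

Real AutoML systems extend this same loop across model families, feature pipelines, and neural architectures.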
Procedia PDF Downloads 61
7473 Regularization of Gene Regulatory Networks Perturbed by White Noise
Authors: Ramazan I. Kadiev, Arcady Ponosov
Abstract:
Mathematical models of gene regulatory networks can in many cases be described by ordinary differential equations with switching nonlinearities, where the initial value problem is ill-posed. Several regularization methods are known in the case of deterministic networks, but the presence of stochastic noise leads to several technical difficulties. In the presentation, it is proposed to apply the methods of the stochastic singular perturbation theory going back to Yu. Kabanov and Yu. Pergamentshchikov. This approach is used to regularize the above ill-posed problem, which, e.g., makes it possible to design stable numerical schemes. Several examples are provided in the presentation, which support the efficiency of the suggested analysis. The method can also be of interest in other fields of biomathematics, where differential equations contain switchings, e.g., in neural field models.
Keywords: ill-posed problems, singular perturbation analysis, stochastic differential equations, switching nonlinearities
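One common way to make such switching equations numerically tractable is to replace the Heaviside switch with a steep sigmoid before discretizing. A toy one-gene Euler–Maruyama sketch follows; the drift form, threshold, and steepness parameter are illustrative and not taken from the presentation's singular-perturbation construction:

```python
import numpy as np

def smooth_step(x, eps):
    """Sigmoid regularization of the Heaviside switch; tends to a step as eps -> 0."""
    return 1.0 / (1.0 + np.exp(-x / eps))

def euler_maruyama(x0, theta, eps, sigma, dt, n_steps, rng):
    """Simulate dX = (smooth_step(X - theta, eps) - X) dt + sigma dW."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        drift = smooth_step(x[i] - theta, eps) - x[i]
        x[i + 1] = x[i] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x
```

With the smoothed switch, the drift is Lipschitz, so standard numerical schemes are stable; this is the practical payoff that the regularization theory justifies.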
Procedia PDF Downloads 194
7472 Improved Anatomy Teaching by the 3D Slicer Platform
Authors: Ahmedou Moulaye Idriss, Yahya Tfeil
Abstract:
Medical imaging technology has become an indispensable tool in many branches of biomedicine, healthcare, and research, and is vitally important for the training of professionals in these fields. It is not only about the tools, technologies, and knowledge provided but also about the community that this training project proposes. In order to raise the level of anatomy teaching in the medical school of Nouakchott in Mauritania, it is necessary, and even urgent, to facilitate access to modern technology for African countries. The role of technology as a key driver of sustainable development has long been recognized. Anatomy is an essential discipline for the training of medical students and a key element in the training of medical specialists. The quality and results of the work of a young surgeon depend on a sound knowledge of anatomical structures. Teaching anatomy is difficult, as the discipline is neglected by medical students in many academic institutions. However, anatomy remains a vital part of any medical education program. When anatomy is presented in various planes, medical students report difficulties in understanding: they fail to develop the ability to visualize and mentally manipulate 3D structures and are sometimes unable to correctly identify neighbouring or associated structures. This is the case, for example, when they have to identify structures related to the caudate lobe as the liver is moved into different positions. In recent decades, modern educational tools using digital sources have tended to replace old methods. One of the main reasons for this change is the lack of cadavers in laboratories with poorly qualified staff. The emergence of increasingly sophisticated mathematical models, image processing, and visualization tools in biomedical imaging research has enabled sophisticated three-dimensional (3D) representations of anatomical structures.
In this paper, we report our current experience at the Faculty of Medicine in Nouakchott, Mauritania. One of our main aims is to create a local learning community in the field of anatomy. The main technological platform used in this project is called 3D Slicer. 3D Slicer is an open-source application, available for free, for viewing, analyzing, and interacting with biomedical imaging data. Using the 3D Slicer platform, we created anatomical atlases of parts of the human body from real medical images, including the head, thorax, abdomen, liver, pelvis, and upper and lower limbs. Data were collected from several local hospitals and also from the website. We used MRI and CT-scan imaging data from children and adults. Many different anatomy atlases exist, both in print and digital forms. An anatomy atlas displays three-dimensional anatomical models, image cross-sections of labelled structures and source radiological imaging, and a text-based hierarchy of structures. The open and free online anatomical atlases developed by our anatomy laboratory team will be available to our students. This will allow pedagogical autonomy and remedy current shortcomings, responding more fully to the objectives of sustainable local development of quality education and good health at the national level. To make this work a reality, our team produced several atlases, available in our faculty in the form of research projects.
Keywords: anatomy, education, medical imaging, three dimensional
Procedia PDF Downloads 241
7471 Working Capital Management and Profitability of UK Firms: A Contingency Theory Approach
Authors: Ishmael Tingbani
Abstract:
This paper adopts a contingency theory approach to investigate the relationship between working capital management and profitability, using data on 225 British firms listed on the London Stock Exchange for the period 2001-2011. The paper employs a panel data analysis on a series of interactive models to estimate this relationship. The findings of the study confirm the relevance of the contingency theory. Evidence from the study suggests that the impact of working capital management on profitability varies and is constrained by organizational contingencies (environment, resources, and management factors) of the firm. These findings have implications for a more balanced and nuanced view of working capital management policy for policy-makers.
Keywords: working capital management, profitability, contingency theory approach, interactive models
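The interactive models described boil down to regressions with a product term, where the moderating effect of a contingency factor appears as the interaction coefficient. A minimal sketch on simulated data follows; the variable names and coefficients are illustrative, not the study's panel specification:

```python
import numpy as np

def ols_with_interaction(wcm, contingency, profit):
    """Regress profitability on a working-capital measure, a contingency
    factor, and their interaction. Returns
    [intercept, b_wcm, b_contingency, b_interaction]."""
    X = np.column_stack([np.ones(len(profit)), wcm, contingency,
                         wcm * contingency])
    beta, *_ = np.linalg.lstsq(X, profit, rcond=None)
    return beta
```

A significant `b_interaction` is what "the impact of working capital management varies with organizational contingencies" looks like in estimation terms.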
Procedia PDF Downloads 347