Search results for: Task Scenarios
84 Automation of Heat Exchanger using Neural Network
Authors: Sudhir Agashe, Ashok Ghatol, Sujata Agashe
Abstract:
In this paper, the development of a heat exchanger as a pilot plant for educational purposes is discussed, and the use of a neural network for controlling the process is presented. The aim of the study is to highlight the need for a specific Pseudo Random Binary Sequence (PRBS) to excite a process under control. As the neural network is a data-driven technique, the method of data generation plays an important role; in light of this, designing a careful experimental procedure for data generation was a crucial task. Heat exchange is a complex process, which has a capacity and a time lag as process elements. The proposed system is a typical pipe-in-pipe heat exchanger. The complexity of the system demands careful selection, proper installation and commissioning. The temperature, flow, and pressure sensors play a vital role in the control performance. The final control element used is a pneumatically operated control valve. While carrying out the experimentation on the heat exchanger, a well-drafted procedure was followed, giving utmost attention to the safety of the system. The results obtained are encouraging and reveal that if the process details are completely known as far as process parameters are concerned and the utilities are well stabilized, then feedback systems are suitable, whereas the neural network control paradigm is useful for processes with nonlinearity and limited process knowledge. The implementation of NN control reinforces the concepts of process control and the NN control paradigm. The results also underline the importance of an excitation signal designed specifically for the process at hand. Data acquisition, processing, and presentation in a suitable format are the most important aspects when validating the results.
Keywords: Process identification, neural network, heat exchanger.
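As an illustration of the PRBS excitation discussed above, the following sketch generates a maximal-length binary sequence with a linear-feedback shift register; the register length, feedback taps, amplitude levels and sequence length are illustrative assumptions rather than the authors' experimental settings.

```python
# Illustrative PRBS generator (not the authors' actual excitation settings).
# A maximal-length sequence from a 7-bit linear-feedback shift register (LFSR);
# the clock period should be chosen relative to the process time constant.

def prbs(n_bits=7, taps=(7, 6), length=200, levels=(-1.0, 1.0)):
    """Generate a +/-1 PRBS using an n-bit LFSR with the given feedback taps."""
    state = [1] * n_bits                      # non-zero seed
    out = []
    for _ in range(length):
        out.append(levels[state[-1]])         # map output bit {0,1} -> excitation level
        feedback = 0
        for t in taps:                        # XOR of the tapped bits
            feedback ^= state[t - 1]
        state = [feedback] + state[:-1]       # shift the register
    return out

if __name__ == "__main__":
    signal = prbs()
    print(signal[:16])                        # first few excitation samples
```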
83 Distinctive Features of Legal Relations in the Area of Subsoil Use, Renewal and Protection in Ukraine
Authors: N. Maksimentseva
Abstract:
The issue of public administration in subsoil use, renewal and protection is of high importance for Ukraine, since it is strongly linked to the energy security of the state and should enable the people of Ukraine to efficiently exercise their proprietary rights to natural resources and the redistribution of national wealth. As stipulated in Article 11 of the Subsoil Code of Ukraine (the Code), the authorities that administer the industry are limited to central executive bodies and local governments. In particular, the Code stipulates that Ukraine’s Cabinet of Ministers carries out public administration in geological exploration, production and protection of subsoil. Other state bodies of public administration include the central public authority responsible for state environmental protection policies; the central public authority in charge of implementing state geological exploration and efficient subsoil use policies; and the central authority in charge of state health and safety control policies. There are also public authorities in the Autonomous Republic of Crimea, local executive bodies, and other state authorities and local self-government authorities in compliance with the laws of Ukraine. This article is devoted to the analysis of legal relations in the area of public administration of subsoil use, renewal and protection in Ukraine. The main approaches to studying the essence of legal relations in the named area, as well as its tasks, functions and methods, are analyzed. It is concluded that the legal relationship in the field of public administration of subsoil use, renewal and protection is characterized by the specifics of its task (development of natural resources).
Keywords: Legal relations, public administration, Subsoil Code of Ukraine, subsoil use, renewal and protection.
82 Spatial Query Localization Method in Limited Reference Point Environment
Authors: Victor Krebss
Abstract:
The task of object localization is one of the major challenges in creating intelligent transportation systems. Unfortunately, in densely built-up urban areas, localization based on GPS alone produces a large error, or simply becomes impossible. New opportunities for localization arise from the rapidly emerging concept of wireless ad-hoc networks. Such a network allows the potential distance between objects to be estimated from the received signal level, and a graph of distances to be constructed in which nodes are the objects to be localized and edges are estimates of the distances between pairs of nodes. Given the known coordinates of individual nodes (anchors), it is possible to determine the location of all (or part) of the remaining nodes of the graph. Moreover, a road map available in digital format can provide localization routines with valuable additional information to narrow the node location search. However, despite the abundance of well-known algorithms for solving the localization problem and significant research efforts, there are still many issues that are currently addressed only partially. In this paper, we propose a localization approach based on mapping the graph of distances onto digital road map data. In effect, the problem is reduced to embedding the distance graph into a graph representing the geolocation data of the area. This makes it possible to localize objects in some cases even if only one reference point is available. We propose a simple embedding algorithm and a sample implementation as spatial queries over sensor network data stored in a spatial database, allowing effective use of spatial indexing, optimized spatial search routines and geometry functions.
Keywords: Intelligent Transportation System, Sensor Network, Localization, Spatial Query, GIS, Graph Embedding.
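The distance-graph construction described above can be sketched as follows, assuming a log-distance path-loss model for converting received signal levels into distances; the path-loss parameters, node names and measurements are hypothetical, and the embedding onto road-map data is not shown.

```python
# Hypothetical sketch: build a distance graph from received signal strength (RSSI)
# readings using a log-distance path-loss model. All parameters are illustrative.
import networkx as nx

TX_POWER_DBM = -40.0   # assumed RSSI at 1 m
PATH_LOSS_EXP = 2.7    # assumed path-loss exponent for an urban canyon

def rssi_to_distance(rssi_dbm):
    """Invert the log-distance model: rssi = tx_power - 10*n*log10(d)."""
    return 10 ** ((TX_POWER_DBM - rssi_dbm) / (10 * PATH_LOSS_EXP))

# (node_a, node_b, measured RSSI in dBm) -- made-up measurements
measurements = [("anchor1", "car7", -63.0), ("car7", "car9", -70.5),
                ("anchor1", "car9", -78.0)]

graph = nx.Graph()
for a, b, rssi in measurements:
    graph.add_edge(a, b, distance=rssi_to_distance(rssi))

# Nodes with known coordinates (anchors) seed the embedding step.
graph.nodes["anchor1"]["xy"] = (532.0, 118.0)
for a, b, data in graph.edges(data=True):
    print(f"{a} -- {b}: ~{data['distance']:.1f} m")
```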
81 A Spatial Hypergraph Based Semi-Supervised Band Selection Method for Hyperspectral Imagery Semantic Interpretation
Authors: Akrem Sellami, Imed Riadh Farah
Abstract:
Hyperspectral imagery (HSI) typically provides a wealth of information captured over a wide range of the electromagnetic spectrum for each pixel in the image. Hence, a pixel in HSI is a high-dimensional vector of intensities with a large spectral range and a high spectral resolution, which makes semantic interpretation a challenging task of HSI analysis. In this paper, we focus on object classification as HSI semantic interpretation. However, HSI classification still faces several issues, among them the spatial variability of spectral signatures, the high number of spectral bands, and the high cost of true sample labeling. The high number of spectral bands combined with the low number of training samples poses the problem of the curse of dimensionality. In order to address this problem, we introduce a dimensionality reduction process aimed at improving the classification of HSI. The presented approach is a semi-supervised band selection method based on a spatial hypergraph embedding model that represents higher-order relationships, with different weights for the spatial neighbors of each centroid pixel. This semi-supervised band selection has been developed to select useful bands for object classification. The presented approach is evaluated on AVIRIS and ROSIS HSIs and compared to other dimensionality reduction methods. The experimental results demonstrate the efficacy of our approach compared to many existing dimensionality reduction methods for HSI classification.
Keywords: Hyperspectral image, spatial hypergraph, dimensionality reduction, semantic interpretation, band selection, feature extraction.
80 A Distributed Mobile Agent Based on Intrusion Detection System for MANET
Authors: Maad Kamal Al-Anni
Abstract:
This study concerns the use of an Artificial Neural Network, specifically a Multilayer Perceptron (MLP), for the classification and clustering of Mobile Ad hoc Network (MANET) vulnerabilities. A MANET is a set of ubiquitous, intelligent internetworking devices that can sense their environment, forming an autonomous system of mobile nodes connected via wireless links. Security is the most important concern in a MANET because of the easy penetration scenarios that occur in such a self-configuring network. One of the powerful techniques used for inspecting network packets is the Intrusion Detection System (IDS); in this article, we show the effectiveness of artificial neural networks used as a machine learning method, together with a stochastic approach (information gain), to classify malicious behaviors in a simulated network with respect to different IDS techniques. The monitoring agent is responsible for the detection inference engine; the audit data are collected from the collecting agent by simulating node attacks, and the outputs are contrasted with the normal behaviors of the framework. Whenever there is any deviation from the ordinary behaviors, the monitoring agent considers the event an attack. We demonstrate a signature-based IDS approach in a MANET by implementing the back propagation algorithm over an ensemble-based Traffic Table (TT), so that the signatures of malicious behaviors or undesirable activities can be prognosticated and efficiently figured out by tuning the parametric set-up of the back propagation algorithm. The experimental results empirically show the effectiveness of the approach, with a detection ratio of up to 98.6%. Performance metrics are also included, presented as Xgraph plots of different measures such as Packet Delivery Ratio (PDR), Throughput (TP), and Average Delay (AD).
Keywords: Mobile ad hoc network, MANET, intrusion detection system, back propagation algorithm, neural networks, traffic table, multilayer perceptron, feed-forward back-propagation, network simulator 2.
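A minimal sketch of the classification stage described in this abstract is given below, assuming tabular audit features: mutual information (information gain) ranks the features, and a multilayer perceptron trained by backpropagation labels traffic as normal or malicious. The synthetic data stand in for the paper's Traffic Table.

```python
# Illustrative only: information-gain feature ranking + MLP classifier,
# standing in for the paper's backpropagation over the Traffic Table.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                  # hypothetical audit features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # 1 = malicious, 0 = normal (synthetic)

# Rank features by information gain and keep the most informative ones.
info_gain = mutual_info_classif(X, y, random_state=0)
top = np.argsort(info_gain)[::-1][:4]

X_train, X_test, y_train, y_test = train_test_split(
    X[:, top], y, test_size=0.3, random_state=0)

mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
mlp.fit(X_train, y_train)                      # trained with backpropagation
print("detection accuracy:", mlp.score(X_test, y_test))
```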
79 Selecting the Best Sub-Region Indexing the Images in the Case of Weak Segmentation Based On Local Color Histograms
Authors: Mawloud Mosbah, Bachir Boucheham
Abstract:
The color histogram is considered the oldest method used by CBIR systems for indexing images. Global histograms, however, do not include spatial information; this is why later techniques have attempted to overcome this limitation by involving a segmentation task as a preprocessing step. Weak segmentation is employed by local histograms, while other methods such as the CCV (Color Coherence Vector) are based on strong segmentation. Indexing based on local histograms consists of splitting the image into N overlapping blocks or sub-regions and then computing the histogram of each block. The dissimilarity between two images is consequently reduced to computing the distances between the N local histograms of both images, resulting in N*N values; generally, the lowest value is taken into account to rank images, meaning that the lowest value designates which sub-region is used to index the images of the queried collection. In this paper, we examine the local histogram indexing method in order to compare its results against those given by the global histogram. We also address another noteworthy issue when relying on local histograms, namely which value, among the N*N values, to trust when comparing images; in other words, on which sub-region among the N*N pairs of sub-regions the indexing of images should be based. Based on the results achieved here, it seems that relying on local histograms, which imposes an extra overhead on the system by involving another preprocessing step, namely segmentation, does not necessarily produce better results. In addition, we propose some ideas for selecting the local histogram used to encode the image, rather than relying on the local histogram having the lowest distance to the query histograms.
Keywords: CBIR, Color Global Histogram, Color Local Histogram, Weak Segmentation, Euclidean Distance.
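The block-wise matching described above can be sketched as follows: each image is split into N sub-regions (non-overlapping here for simplicity), a normalized color histogram is computed per block, and the N*N pairwise Euclidean distances between the two images' block histograms are examined, with the minimum used for ranking. The block and bin counts are arbitrary illustrative choices.

```python
# Sketch of local-histogram matching: N blocks per image, N*N Euclidean
# distances between block histograms, minimum distance used for ranking.
import numpy as np

def block_histograms(image, grid=2, bins=8):
    """Split an RGB image into grid*grid blocks and return one histogram per block."""
    h, w, _ = image.shape
    hists = []
    for i in range(grid):
        for j in range(grid):
            block = image[i*h//grid:(i+1)*h//grid, j*w//grid:(j+1)*w//grid]
            hist, _ = np.histogramdd(block.reshape(-1, 3),
                                     bins=(bins, bins, bins), range=[(0, 256)] * 3)
            hists.append(hist.ravel() / hist.sum())   # normalized local histogram
    return np.array(hists)

def min_block_distance(img_a, img_b, grid=2):
    ha, hb = block_histograms(img_a, grid), block_histograms(img_b, grid)
    dists = np.linalg.norm(ha[:, None, :] - hb[None, :, :], axis=2)  # N*N matrix
    return dists.min()                # lowest value ranks the image pair

query = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
candidate = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
print("min block distance:", min_block_distance(query, candidate))
```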
78 Policy Brief/Note of Philippine Health Issues: Human Rights Violations Committed on Healthcare Workers
Authors: Trina Isabel D. Santiago, Daniel C. Chua, Jumee F. Tayaban, Joseph Daniel S. Timbol, Joshua M. Yanes
Abstract:
Numerous instances of human rights violations against healthcare workers have been reported during the COVID-19 pandemic in the Philippines. This paper aims to explore these civil and political rights violations and propose recommendations to address them. Our review shows that a wide range of civil and political human rights violations have been committed by individual citizens and government agencies against individual healthcare workers and health worker groups. These violations include discrimination, red-tagging, evictions, illegal arrests, and acts of violence ranging from chemical attacks to homicide. If left unchecked, these issues, compounded by the pandemic, may lead to the exacerbation of the pre-existing problems of the Philippine healthcare system. Despite the existing reports by human rights groups and public media articles, there still seems to be a lack of government action to condemn and prevent these violations. The existence of government agencies which directly contribute to these violations, along with the lack of condemnation from other agencies, further propagates the problem. Given these issues, this policy brief recommends the establishment of an inter-agency task force for the protection of the human rights of healthcare workers, as well as the expedited passing of current legislative bills toward the same goal. For more immediate action, we call for the establishment of a dedicated hotline for these incidents with adequate appointment and training of point persons, construction of clear guidelines, and closer collaboration between government agencies in being united against these issues.
Keywords: COVID-19 pandemic, healthcare workers, human rights violations, Philippines.
77 Fully Automated Methods for the Detection and Segmentation of Mitochondria in Microscopy Images
Authors: Blessing Ojeme, Frederick Quinn, Russell Karls, Shannon Quinn
Abstract:
The detection and segmentation of mitochondria from fluorescence microscopy are crucial for understanding the complex structure of the nervous system. However, the constant fission and fusion of mitochondria and image distortion in the background make the task of detection and segmentation challenging. Although a number of open-source software tools and artificial intelligence (AI) methods exist for analyzing mitochondrial images, the scarcity of the combined expertise in the medical field and AI required to utilize these tools poses a challenge to their full adoption and use in clinical settings. Motivated by the advantages of automated methods in terms of good performance, minimal detection time, ease of implementation, and cross-platform compatibility, this study proposes a fully automated framework for the detection and segmentation of mitochondria using both image shape information and descriptive statistics. Using the low-cost, open-source Python and OpenCV library, the algorithms are implemented in three stages: pre-processing, image binarization, and coarse-to-fine segmentation. The proposed model is validated using a fluorescence mitochondrial dataset. Ground-truth labels generated using Labkit were also used to evaluate the performance of our detection and segmentation model using precision, recall and Rand index. The study produces good detection and segmentation results and reports the challenges encountered during the image analysis of mitochondrial morphology from the fluorescence mitochondrial dataset. A discussion of the methods and future perspectives of fully automated frameworks concludes the paper.
Keywords: 2D, Binarization, CLAHE, detection, fluorescence microscopy, mitochondria, segmentation.
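The keywords above suggest a CLAHE-based pre-processing step followed by binarization and contour-level segmentation; a generic OpenCV sketch along those lines is shown below. The parameters and steps are assumptions, not the authors' exact pipeline.

```python
# Generic OpenCV sketch (not the authors' exact pipeline): CLAHE pre-processing,
# Otsu binarization, then coarse contour extraction as candidate mitochondria.
import cv2
import numpy as np

def segment_mitochondria(gray):
    # 1) Pre-processing: contrast-limited adaptive histogram equalization.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(gray)
    # 2) Binarization: Gaussian blur + Otsu threshold.
    blurred = cv2.GaussianBlur(enhanced, (5, 5), 0)
    _, binary = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # 3) Coarse segmentation: keep contours above a minimum area.
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > 20.0]

if __name__ == "__main__":
    frame = np.random.randint(0, 255, (256, 256), dtype=np.uint8)  # placeholder image
    print("candidate objects:", len(segment_mitochondria(frame)))
```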
76 Novel Adaptive Channel Equalization Algorithms by Statistical Sampling
Authors: János Levendovszky, András Oláh
Abstract:
In this paper, novel statistical sampling based equalization techniques and CNN based detection are proposed to increase the spectral efficiency of multiuser communication systems over fading channels. Multiuser communication combined with selective fading can result in interferences which severely deteriorate the quality of service in wireless data transmission (e.g. CDMA in mobile communication). The paper introduces new equalization methods to combat interferences by minimizing the Bit Error Rate (BER) as a function of the equalizer coefficients. This provides higher performance than traditional Minimum Mean Square Error equalization. Since the calculation of BER as a function of the equalizer coefficients is of exponential complexity, statistical sampling methods are proposed to approximate the gradient, which yields fast equalization and superior performance compared to the traditional algorithms. Efficient estimation of the gradient is achieved by using stratified sampling and the Li-Silvester bounds. A simple mechanism is derived to identify the dominant samples in real time, for the sake of efficient estimation. The equalizer weights are adapted recursively by minimizing the estimated BER. The near-optimal performance of the new algorithms is also demonstrated by extensive simulations. The paper also develops a Cellular Neural Network (CNN) based approach to detection. In this case, fast quadratic optimization is carried out by the CNN, whereas the task of the equalizer is to ensure the required template structure (sparseness) for the CNN. The performance of the method has also been analyzed by simulations.
Keywords: Cellular Neural Network, channel equalization, communication over fading channels, multiuser communication, spectral efficiency, statistical sampling.
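A greatly simplified illustration of BER-driven equalizer adaptation is given below: the bit error rate is approximated on a fixed random sample (common random numbers) and the equalizer weights are adapted with a finite-difference gradient. The stratified sampling and Li-Silvester bounds used in the paper are not reproduced.

```python
# Simplified illustration only: the BER of a linear equalizer is approximated on
# a fixed statistical sample and the weights are adapted with a finite-difference
# gradient. Channel taps, noise level and step sizes are illustrative.
import numpy as np

rng = np.random.default_rng(1)
CHANNEL = np.array([1.0, 0.5, 0.2])           # assumed FIR channel taps
N_BITS, NOISE_STD = 4000, 0.4

bits = rng.choice([-1.0, 1.0], size=N_BITS)                 # sampled symbols
noise = rng.normal(0.0, NOISE_STD, N_BITS)                  # sampled noise
received = np.convolve(bits, CHANNEL)[:N_BITS] + noise

def sampled_ber(weights):
    """BER estimated on the fixed statistical sample above."""
    equalized = np.convolve(received, weights)[:N_BITS]
    return np.mean(np.sign(equalized[10:]) != bits[10:])

weights = np.zeros(5)
weights[0] = 1.0                              # start from a pass-through equalizer
step, eps = 0.2, 0.05
for _ in range(30):                           # crude BER-gradient descent
    base = sampled_ber(weights)
    grad = np.array([(sampled_ber(weights + eps * np.eye(5)[k]) - base) / eps
                     for k in range(5)])
    weights -= step * grad
print("sampled BER after adaptation:", sampled_ber(weights))
```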
75 Information Retrieval: Improving Question Answering Systems by Query Reformulation and Answer Validation
Authors: Mohammad Reza Kangavari, Samira Ghandchi, Manak Golpour
Abstract:
Question answering (QA) aims at retrieving precise information from a large collection of documents. Most question answering systems are composed of three main modules: question processing, document processing and answer processing. The question processing module plays an important role in QA systems by reformulating questions. The answer processing module is, moreover, an emerging topic in QA systems, where these systems are often required to rank and validate candidate answers. These techniques, aiming at finding short and precise answers, are often based on semantic relations and keyword co-occurrence. This paper discusses a new model for question answering which improves the two main modules, question processing and answer processing, both of which affect the evaluation of the system's operation. Two important components form the basis of question processing. The first is question classification, which specifies the types of the question and the expected answer. The second is reformulation, which converts the user's question into a question understandable by the QA system in a specific domain. The objective of an answer validation task is then to judge the correctness of an answer returned by a QA system, according to the text snippet given to support it. For validating answers, we apply candidate answer filtering and candidate answer ranking, followed by a final validation step based on user voting. The paper also describes a new architecture for the question and answer processing modules, along with the modeling, implementation and evaluation of the system. The system differs from most question answering systems in its answer validation model, which makes it more suitable for finding exact answers. Evaluation of the model on a total of 50 questions shows a 92% improvement in the system's decisions.
Keywords: Answer processing, answer validation, classification, question answering, query reformulation.
74 Automated Fact-Checking By Incorporating Contextual Knowledge and Multi-Faceted Search
Authors: Wenbo Wang, Yi-fang Brook Wu
Abstract:
The spread of misinformation and disinformation has become a major concern, particularly with the rise of social media as a primary source of information for many people. As a means to address this phenomenon, automated fact-checking has emerged as a safeguard against the spread of misinformation and disinformation. Existing fact-checking approaches aim to determine whether a news claim is true or false, and they have achieved decent veracity prediction accuracy. However, state-of-the-art methods rely on manually verified external information to assist the checking model in making judgments, which requires significant human resources. This study presents a framework, SAC, which focuses on 1) augmenting the representation of a claim by incorporating additional context using general-purpose, comprehensive and authoritative data; 2) developing a search function to automatically select relevant, new and credible references; 3) focusing on the parts of the representations of a claim and its reference that are most relevant to the fact-checking task. The experimental results demonstrate that: 1) augmenting the representations of claims and references through the use of a knowledge base, combined with the multi-head attention technique, contributes to improved fact-checking performance; 2) SAC with auto-selected references outperforms existing fact-checking approaches with manually selected references. Future directions of this study include I) exploring the knowledge graph in Wikidata to dynamically augment the representations of claims and references without introducing too much noise; II) exploring semantic relations in claims and references to further enhance fact-checking.
Keywords: Fact checking, claim verification, Deep Learning, Natural Language Processing.
73 Working with Children and Young People as a much Neglected Area of Education within the Social Studies Curriculum in Poland
Authors: Marta Czechowska-Bieluga
Abstract:
Social work education in Poland focuses mostly on developing competencies that address the needs of individuals and families affected by a variety of life's problems. As a result of the ageing of the Polish population, much attention is equally devoted to adults, including the elderly. However, social work with children and young people is an area of education which should be given more consideration. Social work students are mostly trained to cater to the needs of families, and the competencies aimed at responding to the needs of children and young people do not receive enough attention and are only offered in elective classes. This paper reviews the social work programmes offered by selected higher education institutions in Poland in terms of social work training aimed at helping children and young people address their life problems. The analysis conducted in this study indicates that university education for social work focuses on training professionals who will provide assistance only to adults. Due to changes in the social and political situation, including, in particular, changes in social policy implemented for the needy, it is necessary to extend this area of education to include the specificity of support for children and young people, especially in light of the emergence of new support professions within the area of social work. For example, family assistants, whose task is to support parents in performing their roles as guardians and educators, also assist children. It therefore becomes necessary to equip social work professionals with competencies which include issues related to the quality of life of underage people living in families. Social work curricula should be extended to include issues of child and young person development and the patterns governing this phase of life.
Keywords: Social work education, social work programmes, social worker, university.
72 Experimental Simulation Set-Up for Validating Out-Of-The-Loop Mitigation when Monitoring High Levels of Automation in Air Traffic Control
Authors: Oliver Ohneiser, Francesca De Crescenzio, Gianluca Di Flumeri, Jan Kraemer, Bruno Berberian, Sara Bagassi, Nicolina Sciaraffa, Pietro Aricò, Gianluca Borghini, Fabio Babiloni
Abstract:
An increasing degree of automation in air traffic will also change the role of the air traffic controller (ATCO). ATCOs will fulfill significantly more monitoring tasks compared to today. However, this rather passive role may lead to Out-Of-The-Loop (OOTL) effects comprising vigilance decrement and reduced situation awareness. The project MINIMA (Mitigating Negative Impacts of Monitoring high levels of Automation) has conceived a system to control and mitigate such OOTL phenomena. In order to demonstrate the MINIMA concept, an experimental simulation set-up has been designed. This set-up consists of two parts: 1) a Task Environment (TE) comprising a Terminal Maneuvering Area (TMA) simulator and 2) a Vigilance and Attention Controller (VAC) based on neurophysiological data recording, such as electroencephalography (EEG) and eye-tracking devices. The current vigilance level and the attention focus of the controller are measured during the ATCO's active work in front of the human machine interface (HMI). The derived vigilance level and attention focus trigger adaptive automation functionalities in the TE to avoid OOTL effects. This paper describes the full-scale experimental set-up and the component development work towards it. Hence, it encompasses a pre-test whose results influenced the development of the VAC as well as the functionalities of the final TE and the VAC's two sub-components.
Keywords: Automation, human factors, air traffic controller, MINIMA, OOTL, Out-Of-The-Loop, EEG, electroencephalography, HMI, human machine interface.
71 The Study of Formal and Semantic Errors of Lexis by Persian EFL Learners
Authors: Mohammad J. Rezai, Fereshteh Davarpanah
Abstract:
Producing a text in a language which is not one's mother tongue can be a demanding task for language learners. Examining lexical errors committed by EFL learners is a challenging area of investigation which can shed light on the process of second language acquisition. Despite the considerable number of investigations into grammatical errors, few studies have tackled formal and semantic errors of lexis committed by EFL learners. The current study aimed at examining Persian learners' formal and semantic errors of lexis in English. To this end, 60 students at three different proficiency levels were asked to write on 10 different topics in 10 separate sessions. Finally, 600 essays written by Persian EFL learners were collected, acting as the corpus of the study. An error taxonomy comprising formal and semantic errors was selected to analyze the corpus. The formal category covered misselection and misformation errors, while the semantic errors were classified into lexical, collocational and lexicogrammatical categories. Each category was further classified into subcategories depending on the identified errors. The results showed that there were 2583 errors in the corpus of 9600 words, among which 2030 were formal errors and 553 were semantic errors. Formal errors were the most frequent in the corpus (78.6%) and were most prevalent at the advanced level (42.4%). The semantic errors (21.4%) were most frequent at the low intermediate level (40.5%). Among formal errors of lexis, the highest number of errors belonged to misformation errors (98%), while misselection errors constituted 2% of the errors. Additionally, no significant differences were observed among the three semantic error subcategories, namely collocational, lexical choice and lexicogrammatical. The results of the study can shed light on the challenges faced by EFL learners in the second language acquisition process.
Keywords: Collocational errors, lexical errors, Persian EFL learners, semantic errors.
70 Combined Effect of Moving and Open Boundary Conditions in the Simulation of Inland Inundation Due to Far Field Tsunami
Authors: M. Ashaque Meah, Md. Fazlul Karim, M. Shah Noor, Nazmun Nahar Papri, M. Khalid Hossen, M. Ismoen
Abstract:
Tsunami and inundation modelling due to far field tsunami propagation in a limited area is a very challenging numerical task because it involves many aspects, such as the formation of various types of waves and the irregularities of coastal boundaries. To compute the effect of a far field tsunami and the extent of inland inundation along the coastal belts of the west coast of Malaysia and Southern Thailand, a formulated boundary condition and a moving boundary condition are used simultaneously. In this study, a boundary-fitted curvilinear grid system is used in order to incorporate the coastal and island boundaries accurately, as the boundaries of the model domain are curvilinear in nature and the bending is high. The tsunami response of the 26 December 2004 event along the west open boundary of the model domain is computed to simulate the effect of the far field tsunami. Based on the data of the tsunami source at the west open boundary of the model domain, a boundary condition is formulated and applied to simulate the tsunami response along the coastal and island boundaries. During the simulation process, a moving boundary condition is initiated instead of a fixed vertical seaside wall. The extent of inland inundation and the tsunami propagation pattern are computed. Some comparisons are carried out to test the validity of the simultaneous use of the two boundary conditions. All simulations show excellent agreement with the observational data.
Keywords: Open boundary condition, moving boundary condition, boundary-fitted curvilinear grids, far field tsunami, Shallow Water Equations, tsunami source, Indonesian tsunami of 2004.
69 COVID_ICU_BERT: A Fine-tuned Language Model for COVID-19 Intensive Care Unit Clinical Notes
Authors: Shahad Nagoor, Lucy Hederman, Kevin Koidl, Annalina Caputo
Abstract:
Doctors’ notes reflect their impressions, attitudes, clinical sense, and opinions about patients’ conditions and progress, as well as other information that is essential for doctors’ daily clinical decisions. Despite their value, clinical notes are insufficiently researched within the language processing community. Automatically extracting information from unstructured text data is known to be a difficult task, as opposed to dealing with structured information such as physiological vital signs, images and laboratory results. The aim of this research is to investigate how Natural Language Processing (NLP) and machine learning techniques applied to clinician notes can assist doctors' decision making in the Intensive Care Unit (ICU) for coronavirus disease 2019 (COVID-19) patients. The hypothesis is that clinical outcomes like survival or mortality can be used to inform the judgement of clinical sentiment in ICU clinical notes. This paper presents two contributions. First, we introduce COVID_ICU_BERT, a fine-tuned version of a clinical transformer model that can reliably predict clinical sentiment for notes of COVID patients in the ICU. We train the model on clinical notes for COVID-19 patients, ones not previously seen by Bio_ClinicalBERT or Bio_Discharge_Summary_BERT. The model based on Bio_ClinicalBERT achieves higher predictive accuracy than the one based on Bio_Discharge_Summary_BERT (Acc 93.33%, AUC 0.98, and Precision 0.96). Second, we perform data augmentation using clinical contextual word embedding based on a pre-trained clinical model to balance the samples in each class of the data (survived vs. deceased patients). Data augmentation improves the accuracy of prediction slightly (Acc 96.67%, AUC 0.98, and Precision 0.92).
Keywords: BERT fine-tuning, clinical sentiment, COVID-19, data augmentation.
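A minimal fine-tuning sketch in the spirit of this abstract is shown below, using the publicly available Bio_ClinicalBERT checkpoint and the Hugging Face transformers API; the note texts, labels and training settings are placeholders, not the authors' data or hyperparameters.

```python
# Minimal sketch, not the authors' code: fine-tune a clinical BERT checkpoint
# for binary clinical-sentiment / outcome classification on placeholder notes.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "emilyalsentzer/Bio_ClinicalBERT"        # public clinical BERT weights
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

notes = ["Patient weaned off vasopressors, tolerating spontaneous breathing trial.",
         "Worsening hypoxemia overnight, escalating ventilatory support."]
labels = torch.tensor([1, 0])                         # 1 = positive outcome (placeholder)

batch = tokenizer(notes, padding=True, truncation=True, max_length=256,
                  return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):                                    # toy number of epochs
    outputs = model(**batch, labels=labels)           # forward pass with loss
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

model.eval()
with torch.no_grad():
    preds = model(**batch).logits.argmax(dim=-1)
print(preds.tolist())
```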
68 Construction and Validation of a Hybrid Lumbar Spine Model for the Fast Evaluation of Intradiscal Pressure and Mobility
Authors: Ali Hamadi Dicko, Nicolas Tong-Yette, Benjamin Gilles, François Faure, Olivier Palombi
Abstract:
A novel hybrid model of the lumbar spine, allowing fast static and dynamic simulations of the disc pressure and the spine mobility, is introduced in this work. Our contribution is to combine rigid bodies, deformable finite elements, articular constraints, and springs into a unique model of the spine. Each vertebra is represented by a rigid body controlling a surface mesh to model contacts on the facet joints and the spinous process. The discs are modeled using a heterogeneous tetrahedral finite element model. The facet joints are represented as elastic joints with six degrees of freedom, while the ligaments are modeled using non-linear one-dimensional elastic elements. The challenge we tackle is to make these different models efficiently interact while respecting the principles of Anatomy and Mechanics. The mobility, the intradiscal pressure, the facet joint force and the instantaneous center of rotation of the lumbar spine are validated against the experimental and theoretical results of the literature on flexion, extension, lateral bending as well as axial rotation. Our hybrid model greatly simplifies the modeling task and dramatically accelerates the simulation of pressure within the discs, as well as the evaluation of the range of motion and the instantaneous centers of rotation, without penalizing precision. These results suggest that for some types of biomechanical simulations, simplified models allow far easier modeling and faster simulations compared to usual full-FEM approaches without any loss of accuracy.
Keywords: Hybrid, modeling, fast simulation, lumbar spine.
67 Space Telemetry Anomaly Detection Based on Statistical PCA Algorithm
Authors: B. Nassar, W. Hussein, M. Mokhtar
Abstract:
The critical concern of satellite operations is to ensure the health and safety of satellites. The worst case in this perspective is probably the loss of a mission, but the more common interruption of satellite functionality can also result in compromised mission objectives. All the data acquired from the spacecraft are known as telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e. a status or a measurement) to be checked. As a consequence, TM monitoring systems are continuously improved to reduce the time required to respond to changes in a satellite's state of health. A fast assessment of the current state of the satellite is thus very important in order to respond to occurring failures. Statistical multivariate latent techniques are among the vital learning tools used to tackle the problem above coherently. Information extraction from such rich data sources using advanced statistical methodologies is a challenging task due to the massive volume of data. To solve this problem, in this paper, we present a proposed unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to an actual remote sensing spacecraft. Data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions: normal and faulty states. The models were built and tested under these conditions, and the results show that the algorithm could successfully differentiate between the two operating conditions. Furthermore, the algorithm provides useful information for prediction, as well as adding more insight and physical interpretation to the ADCS operation.
Keywords: Space telemetry monitoring, multivariate analysis, PCA algorithm, space operations.
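A compact sketch of PCA-based telemetry monitoring in the spirit of this abstract: a PCA model is fit on telemetry acquired under normal operation, and new samples are flagged when their reconstruction error (squared prediction error) exceeds an empirical threshold. The data, component count and threshold rule are illustrative assumptions.

```python
# Illustrative PCA monitoring sketch: train on normal telemetry, flag samples
# whose reconstruction error (SPE / Q statistic) exceeds a percentile threshold.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
normal = rng.normal(size=(1000, 12))                         # placeholder ADCS-like telemetry
faulty = normal[:50] + rng.normal(3.0, 1.0, size=(50, 12))   # injected deviation

scaler = StandardScaler().fit(normal)
pca = PCA(n_components=4).fit(scaler.transform(normal))

def spe(samples):
    """Squared prediction error of each sample against the PCA model."""
    z = scaler.transform(samples)
    reconstructed = pca.inverse_transform(pca.transform(z))
    return np.sum((z - reconstructed) ** 2, axis=1)

threshold = np.percentile(spe(normal), 99)                   # simple empirical control limit
print("normal samples flagged:", np.mean(spe(normal) > threshold))
print("faulty samples flagged:", np.mean(spe(faulty) > threshold))
```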
66 Expert Witness Testimony in the Battered Woman Syndrome
Authors: Ana Pauna
Abstract:
Expert witness testimony (EWT) is a kind of information given by an expert specialized in the field (here, battered woman syndrome, BWS) to the jury in order to help the court better understand the case. EWT does not always work in favor of battered women. Two main decision-making models are discussed in the paper: the mathematical model and the explanation model. In the first model, the jurors calculate "the importance and strength of each piece of evidence", whereas in the second model they try to integrate the EWT with the evidence and create a coherent story that would describe the crime. The jury often misunderstands and misjudges battered women for their action (or in this case inaction). They assume that these women are masochists and accept being mistreated, for if a man abuses a woman constantly, she should and could divorce him or simply leave at any time. Research in the domain has found that expert witness testimony indeed has a powerful influence on jurors' decisions, thus its quality needs to be further explored. One important factor that needs further study is a bias called the dispositionist worldview (a belief that what happens to people is of their own doing). This kind of attributional bias represents a tendency to think that a person's behavior is due to his or her disposition, even when the behavior is clearly attributable to the situation. Hypothesis: The hypothesis of this paper is that if a juror has a dispositionist worldview, then he or she will blame the rape victim for triggering the assault. The juror would therefore commit the fundamental attribution error and believe that the victim's disposition, and not the situation she was in, caused the rape. Methods: The subjects in the study were 500 randomly sampled undergraduate students from McGill, Concordia, Université de Montréal and UQAM. Dispositionist worldview was scored on the Dispositionist Worldview Questionnaire. After reading the rape scenarios, each student was asked to play the role of a juror and answer a questionnaire consisting of 7 questions about the responsibility, causality and fault of the victim. Results: The results confirm the hypothesis, which states that if a juror has a dispositionist worldview, then he or she will blame the rape victim for triggering the assault. In doing so, the juror commits the fundamental attribution error, believing that the victim's disposition, and not the constraints or opportunities of the situation, caused the rape scenario.
Keywords: bias, expert/witness testimony, attribution error, jury, rape myth.
65 Exploring SL Writing and SL Sensitivity during Writing Tasks: Poor and Advanced Writing in a Context of Second Language Other than English
Authors: S. Figueiredo, M. Alves Martins, C. Silva, C. Simões
Abstract:
This study is part of a larger empirical research project that examines second language (SL) learners' profiles and valid procedures for performing complete and diagnostic assessment in schools. 102 learners of Portuguese as a SL, aged 7 to 17 years and speakers of distinct home languages, were assessed in several linguistic tasks. In this article, we focus on writing performance in the specific task of narrative essay composition. The written outputs were measured using scores on six components adapted from an English SL assessment context (Alberta Education): linguistic vocabulary, grammar, syntax, strategy, socio-linguistic, and discourse. The writing processes and strategies in the Portuguese language used by different immigrant students were analysed to determine the features and diversity of deficits in authentic texts produced by SL writers. Differentiated performance was examined against the following variables: grades, previous schooling, home language, instruction in the first language, and exposure to Portuguese as a Second Language. Speakers of Indo-Aryan languages showed low writing scores compared to their peers, and the type of language and its respective cognitive mapping (such as Mandarin and Arabic) was the predictor, not linguistic distance. Home language instruction should also be prominently considered in further research to understand the specificities of the cognitive academic profile in a Romance-language learning context. Additionally, this study also examined teachers' representations, which are addressed here to understand the educational implications of second language teaching for the psychological distress of different minorities in schools of specific host countries.
Keywords: Second language, writing assessment, home language, immigrant students, Portuguese language.
64 ZMP Based Reference Generation for Biped Walking Robots
Authors: Kemalettin Erbatur, Özer Koca, Evrim Taşkıran, Metin Yılmaz, Utku Seven
Abstract:
The last fifteen years have witnessed fast improvements in the field of humanoid robotics. The human-like robot structure is more suitable to the human environment, with its superior obstacle avoidance properties, when compared with wheeled service robots. However, walking control for bipedal robots is a challenging task due to their complex dynamics. Stable reference generation plays a very important role in control. The Linear Inverted Pendulum Model (LIPM) and the Zero Moment Point (ZMP) criterion are applied in a number of studies for stable walking reference generation of biped walking robots. This paper follows the same main approach. We propose a natural and continuous ZMP reference trajectory for a stable and human-like walk. The ZMP reference trajectories move forward under the sole of the support foot when the robot body is supported by a single leg. The robot center of mass trajectory is obtained from the predefined ZMP reference trajectories by a Fourier series approximation method. The Gibbs phenomenon problem, common with Fourier approximations of discontinuous functions, is avoided by employing continuous ZMP references. Also, these ZMP reference trajectories possess pre-assigned single and double support phases, which are very useful in experimental tuning work. The ZMP based reference generation strategy is tested via three-dimensional full-dynamics simulations of a 12-degrees-of-freedom biped robot model. Simulation results indicate that the proposed reference trajectory generation technique is successful.
Keywords: Biped robot, Linear Inverted Pendulum Model, Zero Moment Point, Fourier series approximation.
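The Fourier-series step described above can be illustrated as follows: under the LIPM relation p = c - (z_c/g)*c_ddot, each Fourier coefficient of a periodic ZMP reference maps to a center-of-mass coefficient through division by (1 + z_c*w_k^2/g). The square-wave-like ZMP reference and the parameters below are placeholders, not the paper's trajectory.

```python
# Sketch of the LIPM/Fourier step: for a periodic ZMP reference p(t), the CoM
# Fourier coefficients are C_k = P_k / (1 + z_c * w_k**2 / g), since
# p = c - (z_c / g) * c_ddot. The ZMP reference below is a placeholder.
import numpy as np

G, Z_C = 9.81, 0.8          # gravity, assumed constant CoM height [m]
T, N = 1.6, 400             # gait period [s], samples per period
t = np.linspace(0.0, T, N, endpoint=False)

# Placeholder ZMP reference: alternating support sides (lateral direction).
zmp_ref = 0.1 * np.sign(np.sin(2 * np.pi * t / T))

P = np.fft.rfft(zmp_ref)
w = 2 * np.pi * np.fft.rfftfreq(N, d=T / N)    # harmonic angular frequencies
C = P / (1.0 + Z_C * w**2 / G)                 # attenuate each ZMP harmonic
com_ref = np.fft.irfft(C, n=N)                 # CoM reference trajectory

print("ZMP peak: %.3f m, CoM peak: %.3f m" % (zmp_ref.max(), com_ref.max()))
```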
63 Impact of Computer-Mediated Communication on Virtual Teams' Performance: An Empirical Study
Authors: Nadeem Ehsan, Ebtisam Mirza, Muhammad Ahmad
Abstract:
In a complex project environment, project teams face multi-dimensional communication problems that can ultimately lead to project breakdown. Team performance varies between a face-to-face (FTF) environment and groups working remotely in a computer-mediated communication (CMC) environment. A brief review of the Input-Process-Output model suggested by James E. Driskell, Paul H. Radtke and Eduardo Salas in "Virtual Teams: Effects of Technological Mediation on Team Performance" (2003) has been done to develop the basis of this research. This model theoretically analyzes the effects of technological mediation on team processes, such as cohesiveness, status and authority relations, counternormative behavior and communication. An empirical study described in this paper has been undertaken to test the "cohesiveness" of diverse project teams in a multi-national organization. This study uses both quantitative and qualitative techniques for data gathering and analysis. These techniques include interviews and questionnaires for data collection and graphical data representation for analyzing the collected data. Computer-mediated technology may impact team performance because of differences in cohesiveness among teams, and this difference may be moderated by factors such as the type of communication environment, the type of task and the temporal context of the team. Based on the reviewed model, sets of hypotheses are devised and tested. This research reports on a study that compared team cohesiveness among virtual teams using CMC and non-CMC communication mediums. The findings suggest that CMC can help virtual teams increase cohesiveness among their members, making CMC an effective medium for increasing productivity and team performance.
Keywords: Computer-mediated Communication, Virtual Teams, Team Performance, Team Cohesiveness.
62 The Use of Mobile Phone as Enhancement to Mark Multiple Choice Objectives English Grammar and Literature Examination: An Exploratory Case Study of Preliminary National Diploma Students, Abdu Gusau Polytechnic, Talata Mafara, Zamfara State, Nigeria
Authors: T. Abdulkadir
Abstract:
Marking and assessment of multiple choice examinations has often been regarded as a cumbersome and herculean task to accomplish manually in Nigeria. This is usually because large numbers of candidates take the same examination simultaneously. Marking such a mammoth number of booklets daunts even the fastest paid examiners, who often undertake the job with the resulting consequences of stress and boredom. This paper explores the evolution of the approach and aims to turn the marking of multiple choice objective examinations into a creative, or perhaps more relaxing, activity via the use of the mobile phone. A pragmatic method was employed to carry out this work, rather than a formal in-depth research approach, owing to the novelty of the mobile-smartphone e-marking scheme. Moreover, being a newly evolved scheme, no recent academic work sharing the same topic, the use of a cell phone as an e-marking technique, was found online; hence the dearth of even miscellaneous citations in this work. Anticipation of further future advancements is what steered the motive of this paper, which lays out the fundamental proposition. The paper introduces for the first time the concept of mobile-smartphone e-marking, the steps to achieve it, as well as the merits and demerits of the technique, all spelt out in the subsequent pages.
Keywords: Cell phone, e-marking scheme, mobile phone, mobile-smart phone, multiple choice objectives, smartphone.
61 A Design for Customer Preferences Model by Cluster Analysis of Geometric Features and Customer Preferences
Authors: Yuan-Jye Tseng, Ching-Yen Chen
Abstract:
In the design cycle, a main design task is to determine the external shape of the product. The external shape of a product is one of the key factors that can affect customers' preferences, linked to the motivation to buy the product, especially in the case of a consumer electronic product such as a mobile phone. The relationship between the external shape and customer preferences needs to be studied to enhance the customer's purchase desire and action. In this research, a design for customer preferences model is developed for investigating the relationships between the external shape and the customer preferences of a product. In the first stage, the names of the geometric features are collected and evaluated from the data of the specified internet web pages using the developed text miner. The key geometric features can be determined if their number of occurrences on the web pages is relatively high. For each key geometric feature, the numerical values are extracted using the text miner from the internet data collected from the web pages. In the second stage, a cluster analysis model is developed to evaluate the numerical values of the key geometric features and divide the external shapes into several groups. Several design suggestion cases can be proposed, for example, a large model, a mid-size model, and a mini model, for designing a mobile phone. A customer preference index is developed by evaluating the numerical data of each of the key geometric features of the design suggestion cases. The design suggestion case with the top ranking of the customer preference index can be selected as the final design of the product. In this paper, an example product of a notebook computer is illustrated. It shows that the external shape of a product can be used to drive customer preferences. The presented design for customer preferences model is useful for determining a suitable external shape of the product to increase customer preferences.
Keywords: Cluster analysis, customer preferences, design evaluation, design for customer preferences, product design.
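The cluster-analysis stage described above can be sketched with a standard algorithm such as k-means over the collected numerical feature values; the feature names, values and number of groups below are hypothetical.

```python
# Hypothetical sketch of the cluster-analysis stage: group external shapes by
# their key geometric feature values (here: screen size, thickness, weight).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Rows = candidate shapes, columns = key geometric features (made-up values).
features = np.array([
    [17.3, 2.2, 2.5],   # screen size [in], thickness [cm], weight [kg]
    [15.6, 1.9, 1.8],
    [14.0, 1.7, 1.4],
    [13.3, 1.5, 1.2],
    [11.6, 1.4, 1.0],
    [10.1, 1.2, 0.9],
])

scaled = StandardScaler().fit_transform(features)
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)
for label in sorted(set(groups)):
    print(f"group {label}: shapes {np.where(groups == label)[0].tolist()}")
```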
60 Semantic Enhanced Social Media Sentiments for Stock Market Prediction
Authors: K. Nirmala Devi, V. Murali Bhaskaran
Abstract:
Traditional document representation for classification follows the Bag of Words (BoW) approach to represent term weights. The conventional method uses the Vector Space Model (VSM) to exploit the statistical information of terms in the documents, but it fails to capture the semantic information as well as the order of the terms present in the documents. The phrase-based approach follows the order of the terms present in the documents but not the semantics behind the words. Therefore, a semantic concept based approach is used in this paper to enhance the semantics by incorporating ontology information. A novel method is proposed to forecast the intraday directional movement of stock market prices based on sentiments from Twitter and Money Control news articles. Stock market forecasting is a very difficult and highly complicated task because it is affected by many factors such as economic conditions, political events and investor sentiment. Stock market series are generally dynamic, nonparametric, noisy and chaotic by nature. Sentiment analysis, along with the wisdom of crowds, can automatically compute the collective intelligence of future performance in many areas such as stock markets, box office sales and election outcomes. The proposed method utilizes collective sentiments for the stock market to predict stock price directional movements. The collective sentiments from the above social media have strong predictive power over stock price directional movements (up/down), as verified using the Granger causality test.
Keywords: Bag of Words, Collective Sentiments, Ontology, Semantic relations, Sentiments, Social media, Stock Prediction, Twitter, Vector Space Model and wisdom of crowds.
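The Granger causality step mentioned above can be illustrated with statsmodels: a daily sentiment series is tested for predictive power over the next day's price direction. Both series below are synthetic placeholders.

```python
# Illustrative Granger-causality check: does a daily sentiment score help
# predict the stock's directional movement? Series below are synthetic.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(7)
n = 250
sentiment = rng.normal(size=n)
# Synthetic "direction" series that partly follows yesterday's sentiment.
direction = np.sign(np.roll(sentiment, 1) + 0.8 * rng.normal(size=n))
direction[0] = 1.0

data = pd.DataFrame({"direction": direction, "sentiment": sentiment})
# Tests whether the 2nd column Granger-causes the 1st, for lags 1..3.
results = grangercausalitytests(data[["direction", "sentiment"]].values, maxlag=3)
for lag, res in results.items():
    print(f"lag {lag}: ssr F-test p-value = {res[0]['ssr_ftest'][1]:.4f}")
```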
59 Review of Strategies for Hybrid Energy Storage Management System in Electric Vehicle Application
Authors: Kayode A. Olaniyi, Adeola A. Ogunleye, Tola M. Osifeko
Abstract:
Electric Vehicles (EVs) appear to be gaining increasing patronage as a feasible alternative to Internal Combustion Engine Vehicles (ICEVs), owing to their low emissions and high operating efficiency. EV energy storage systems are required to handle high energy and power density demands, constrained by limited space, operating temperature, weight and cost. The choice of strategies for energy storage evaluation, monitoring and control remains a challenging task. This paper presents a review of various energy storage technologies and recent research on battery evaluation techniques used in EV applications. It also underscores strategies for hybrid energy storage management and control schemes for the improvement of EV stability and reliability. The study reveals that, despite the advances recorded in battery technologies, there is still no cell which possesses both the optimum power and energy densities, among other requirements, for EV application. However, combining two or more energy storage devices into a hybrid, allowing the advantageous attributes of each device to be utilized, is a promising solution. The review also reveals that state of charge (SoC) is the most crucial quantity to be estimated for the battery. The conventional method of SoC measurement is, however, questioned in the literature, and adaptive algorithms that include all models of disturbances are being proposed. The review further suggests that a heuristic-based approach is commonly adopted in the development of strategies for hybrid energy storage system management. The alternative, optimization-based approach is found to be more accurate but is memory and computationally intensive, and as such is not recommended in most real-time applications.
Keywords: Hybrid electric vehicle, hybrid energy storage, battery state estimation, state of charge, state of health.
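As a baseline for the SoC discussion in this review, the conventional coulomb-counting estimate is sketched below; the capacity, initial SoC and current profile are illustrative, and the adaptive or observer-based methods favored in the literature are not shown.

```python
# Conventional coulomb-counting SoC estimate (the baseline questioned in the
# literature); capacity, initial SoC, and current profile are illustrative.
import numpy as np

CAPACITY_AH = 50.0          # assumed usable pack capacity [Ah]
DT_S = 1.0                  # sampling period [s]

def coulomb_count(current_a, soc0=0.9):
    """Integrate current (positive = discharge) to track state of charge."""
    soc = soc0 - np.cumsum(current_a) * DT_S / (CAPACITY_AH * 3600.0)
    return np.clip(soc, 0.0, 1.0)

# Hypothetical drive-cycle snippet: cruise, acceleration, regenerative braking.
current = np.concatenate([np.full(600, 20.0), np.full(60, 120.0), np.full(120, -30.0)])
soc = coulomb_count(current)
print(f"SoC after {current.size} s: {soc[-1]:.4f}")
```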
58 Assessing the Impact of High Fidelity Human Patient Simulation on Teamwork among Nursing, Medicine and Pharmacy Undergraduate Students
Authors: S. MacDonald, A. Manuel, R. Law, N. Bandruak, A. Dubrowski, V. Curran, J. Smith-Young, K. Simmons, A. Warren
Abstract:
High fidelity human patient simulation has been used for many years by health sciences education programs to foster critical thinking, engage learners, improve confidence, improve communication, and enhance psychomotor skills. Unfortunately, there is a paucity of research on the use of high fidelity human patient simulation to foster teamwork among nursing, medicine and pharmacy undergraduate students. This study compared the impact of high fidelity and low fidelity simulation education on teamwork among nursing, medicine and pharmacy students. For the purpose of this study, two innovative teaching scenarios were developed based on the care of an adult patient experiencing acute anaphylaxis: one high fidelity using a human patient simulator and one low fidelity using case based discussions. A within subjects, pretest-posttest, repeated measures design was used with two-treatment levels and random assignment of individual subjects to teams of two or more professions. A convenience sample of twenty-four (n=24) undergraduate students participated, including: nursing (n=11), medicine (n=9), and pharmacy (n=4). The Interprofessional Teamwork Questionnaire was used to assess for changes in students’ perception of their functionality within the team, importance of interprofessional collaboration, comprehension of roles, and confidence in communication and collaboration. Student satisfaction was also assessed. Students reported significant improvements in their understanding of the importance of interprofessional teamwork and of the roles of nursing and medicine on the team after participation in both the high fidelity and the low fidelity simulation. However, only participants in the high fidelity simulation reported a significant improvement in their ability to function effectively as a member of the team. All students reported that both simulations were a meaningful learning experience and all students would recommend both experiences to other students. These findings suggest there is merit in both high fidelity and low fidelity simulation as a teaching and learning approach to foster teamwork among undergraduate nursing, medicine and pharmacy students. However, participation in high fidelity simulation may provide a more realistic opportunity to practice and function as an effective member of the interprofessional health care team.
Keywords: Acute anaphylaxis, high fidelity human patient simulation, low fidelity simulation, interprofessional education.
57 Scientific Methods in Educational Management: The Metasystems Perspective
Authors: Elena A. Railean
Abstract:
Although scientific methods have been the subject of a large number of papers, the term ‘scientific methods in educational management’ is still not well defined. In this paper, the metasystems perspective is adopted to define this term and to distinguish such methods from those used in the era of the scientific management and knowledge management paradigms. In our opinion, scientific methods in educational management rely on global phenomena, events, and processes and their influence on the educational organization. Currently, scientific methods in educational management are integrated with the globalization, cognitivisation, and openness of educational systems and with global events such as the COVID-19 pandemic. Concrete scientific methods are nested in a hierarchy of increasingly abstract models of educational management, which form the context of the global impact on education in general and on learning outcomes in particular. However, scientific methods can be assigned to a specific mission, strategy, or tactic of educational management of a concrete organization, whether through global management, local development of the school organization, and/or development of the life-long successful learner. By accepting this assignment, the scientific method becomes a personal goal of each individual within the educational organization, or the option to develop the educational organization to global standards. In our opinion, scientific methods in educational management need to confine their scope to the deep analysis of the concrete tasks of the educational system (i.e., teaching, learning, assessment, development), which results in concrete strategies of organizational development. More important is seeking ways to achieve a dynamic equilibrium between the strategy and tactics of planetary tasks in the field of global education, which results in a need for ecological methods of learning and communication. In sum, the distinction between local and global scientific methods depends on the subjective conception of task assignment, measurement, and appraisal. Finally, we conclude that scientific methods are not holistic in themselves, but are the strategy and tactics implemented in the global context by an effective educational/academic manager.
Keywords: Educational management, scientific management, educational leadership, scientific method in educational management.
56 Genetic Algorithm Application in a Dynamic PCB Assembly with Carryover Sequence-Dependent Setups
Authors: M. T. Yazdani Sabouni, Rasaratnam Logendran
Abstract:
We consider a typical problem in the assembly of printed circuit boards (PCBs) in a two-machine flow shop system, with the objective of simultaneously minimizing the weighted sum of weighted tardiness and weighted flow time. The investigated problem is a group scheduling problem in which PCBs are assembled in groups, and the interest is in finding the best sequence of groups, as well as of the boards within each group, to minimize the objective function value. The setup operation between any two board groups is characterized by carryover sequence-dependent setup times, which exactly matches the real application of this problem. As a technical constraint, all of the boards must be kitted by kitting staff before the assembly operation starts (the kitting operation). The main idea developed in this paper is to completely eliminate the role of the kitting staff by assigning the kitting task to the machine operator during the operator's idle time, which is referred to as the integration of internal (machine) and external (kitting) setup times. Performing the kitting operation, which prepares the next set of boards while the current boards are being assembled, results in boards continuously entering the system, i.e., having dynamic arrival times. Consequently, a dynamic PCB assembly system is introduced for the first time in the assembly of PCBs, which also has characteristics similar to those of just-in-time manufacturing. The problem investigated is computationally very complex, meaning that finding optimal solutions, especially as the problem size grows, is practically impossible. Thus, a heuristic based on a Genetic Algorithm (GA) is employed. An example problem illustrating the application of the GA is demonstrated, and numerical results of applying the GA to several instances are also provided.
Keywords: Genetic algorithm, Dynamic PCB assembly, Carryover sequence-dependent setup times, Multi-objective.
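The abstract does not spell out the GA's encoding or operators, so the following Python sketch is only a hedged illustration of a group-scheduling chromosome (a permutation of groups, each carrying a permutation of its boards) evolved with order-based crossover and swap mutation. The flat setup constant and the simplified two-machine objective are stand-ins for the paper's carryover sequence-dependent setups and dynamic arrivals; all instance data, names and parameters are hypothetical.

```python
# Hedged sketch of a GA for group scheduling, loosely patterned on the problem
# described above. The chromosome encoding, operators, flat setup time, and the
# simplified two-machine objective are illustrative assumptions, not the
# authors' actual formulation.
import random

random.seed(0)

# Hypothetical instance: board = (machine-1 time, machine-2 time, weight, due date).
GROUPS = {
    "G1": [(4, 3, 1.0, 20), (2, 5, 2.0, 25)],
    "G2": [(3, 2, 1.5, 15), (6, 4, 1.0, 40), (1, 2, 0.5, 30)],
    "G3": [(5, 5, 2.0, 35)],
}
SETUP = 3  # simplified group-to-group setup (the paper uses carryover sequence-dependent setups)


def random_chromosome():
    """A chromosome: a sequence of groups, each with a board order inside it."""
    groups = random.sample(list(GROUPS), len(GROUPS))
    return [(g, random.sample(range(len(GROUPS[g])), len(GROUPS[g]))) for g in groups]


def evaluate(chrom):
    """Weighted flow time plus weighted tardiness in a two-machine flow shop."""
    t1 = t2 = cost = 0.0
    for gi, (g, order) in enumerate(chrom):
        if gi > 0:
            t1 += SETUP                      # setup before starting a new group
        for idx in order:
            p1, p2, w, due = GROUPS[g][idx]
            t1 += p1                         # completion time on machine 1
            t2 = max(t2, t1) + p2            # completion time on machine 2
            cost += w * t2 + w * max(0.0, t2 - due)
    return cost


def crossover(a, b):
    """Order-based crossover on the group sequence (board orders come from parent a)."""
    cut = random.randint(1, len(a) - 1)
    head = a[:cut]
    kept = {g for g, _ in head}
    return head + [(g, o) for g, o in b if g not in kept]


def mutate(chrom):
    """Swap two groups and reshuffle the board order of one of them."""
    c = [(g, list(o)) for g, o in chrom]
    i, j = random.sample(range(len(c)), 2)
    c[i], c[j] = c[j], c[i]
    random.shuffle(c[i][1])
    return c


def genetic_algorithm(pop_size=30, generations=200):
    population = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=evaluate)
        survivors = population[: pop_size // 2]        # elitist truncation selection
        while len(survivors) < pop_size:
            p1, p2 = random.sample(survivors[: pop_size // 2], 2)
            survivors.append(mutate(crossover(p1, p2)))
        population = survivors
    return min(population, key=evaluate)


best = genetic_algorithm()
print("best objective:", round(evaluate(best), 2))
print("group sequence:", [g for g, _ in best])
```

A faithful implementation would replace the SETUP constant with carryover sequence-dependent setup logic and release boards dynamically as the integrated kitting operation completes.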
55 A Sentence-to-Sentence Relation Network for Recognizing Textual Entailment
Authors: Isaac K. E. Ampomah, Seong-Bae Park, Sang-Jo Lee
Abstract:
Over the past decade, there have been promising developments in Natural Language Processing (NLP), with several investigations of approaches focusing on Recognizing Textual Entailment (RTE). These include models based on lexical similarities, models based on formal reasoning, and, most recently, deep neural models. In this paper, we present a sentence encoding model that exploits sentence-to-sentence relation information for RTE. In terms of sentence modeling, convolutional neural networks (CNNs) and recurrent neural networks (RNNs) adopt different approaches. RNNs are known to be well suited to sequence modeling, whilst CNNs are suited to extracting n-gram features through their filters and can learn ranges of relations via the pooling mechanism. We combine these strengths of RNNs and CNNs to present a unified model for the RTE task. Our model combines relation vectors computed from the phrasal representations of each sentence with the final encoded sentence representations. Firstly, we pass each sentence through a convolutional layer to extract a sequence of higher-level phrase representations, from which the first relation vector is computed. Secondly, the phrasal representation of each sentence from the convolutional layer is fed into a Bidirectional Long Short-Term Memory (Bi-LSTM) network to obtain the final sentence representations, from which a second relation vector is computed. The relation vectors are combined and then used, in the same fashion as an attention mechanism, over the Bi-LSTM outputs to yield the final sentence representations for classification. Experiments on the Stanford Natural Language Inference (SNLI) corpus suggest that this is a promising technique for RTE.
Keywords: Deep neural models, natural language inference, recognizing textual entailment, sentence-to-sentence relation.
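Since the abstract only outlines the architecture, the PyTorch sketch below approximates the described pipeline (convolutional phrase encoder, relation vector, Bi-LSTM, second relation vector, attention-style combination). The layer sizes, the relation-vector formula, and the attention weighting are assumptions made for illustration, not details taken from the paper.

```python
# Rough PyTorch sketch in the spirit of the pipeline described above: CNN phrase
# features -> relation vector, Bi-LSTM states -> second relation vector, with an
# attention-style pooling of the Bi-LSTM outputs. Layer sizes and the relation /
# attention formulations are illustrative assumptions, not the authors' exact model.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SentenceRelationRTE(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=100, conv_dim=128,
                 hidden_dim=128, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # 1-D convolution over tokens extracts n-gram (phrase-level) features.
        self.conv = nn.Conv1d(emb_dim, conv_dim, kernel_size=3, padding=1)
        self.bilstm = nn.LSTM(conv_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.classifier = nn.Linear(4 * conv_dim + 8 * hidden_dim, num_classes)

    def encode(self, tokens):
        x = self.embed(tokens)                                          # (B, T, E)
        phrases = F.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)  # (B, T, C)
        states, _ = self.bilstm(phrases)                                # (B, T, 2H)
        return phrases, states

    @staticmethod
    def relate(a, b):
        """A common relation-vector heuristic between two pooled representations."""
        return torch.cat([a, b, torch.abs(a - b), a * b], dim=-1)

    def forward(self, premise, hypothesis):
        p_phr, p_states = self.encode(premise)
        h_phr, h_states = self.encode(hypothesis)

        # First relation vector: from max-pooled convolutional phrase features.
        rel_phrase = self.relate(p_phr.max(dim=1).values, h_phr.max(dim=1).values)

        # Attention-style pooling of Bi-LSTM states using cross-sentence scores.
        scores = torch.bmm(p_states, h_states.transpose(1, 2))          # (B, Tp, Th)
        p_vec = (F.softmax(scores.max(dim=2).values, dim=1).unsqueeze(-1)
                 * p_states).sum(dim=1)
        h_vec = (F.softmax(scores.max(dim=1).values, dim=1).unsqueeze(-1)
                 * h_states).sum(dim=1)

        # Second relation vector: from the attended sentence representations.
        rel_sentence = self.relate(p_vec, h_vec)
        return self.classifier(torch.cat([rel_phrase, rel_sentence], dim=-1))


model = SentenceRelationRTE()
premise = torch.randint(1, 10000, (2, 12))       # batch of 2 premises, 12 tokens each
hypothesis = torch.randint(1, 10000, (2, 9))     # batch of 2 hypotheses, 9 tokens each
logits = model(premise, hypothesis)              # (2, 3) class scores
```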