Search results for: text preprocessing

452 Post-Occupancy Evaluation of Greenway Based on Multi-Source Data: A Case Study of Jincheng Greenway in Chengdu

Authors: Qin Zhu

Abstract:

Under the Park City development concept, the Tianfu Greenway system, as the basic and pre-configured element of Chengdu's citywide park construction, connects urban open spaces with linear and circular structures and carries the ecological, cultural, and recreational functions of the park system. Chengdu's greenway construction is in full swing. In the process of greenway planning and construction, the landscape effect of greenways on improving urban quality is highly valued, while the long-term impact of user experience on the sustainable development of greenways is often ignored. It is therefore very important to test the effectiveness of greenway construction from the perspective of users. Taking Jincheng Greenway in Chengdu as an example, this paper introduces multi-source data to construct a post-occupancy evaluation model of greenways and adopts behavior mapping, questionnaire surveys, web text analysis, and importance-performance analysis (IPA) to comprehensively evaluate users' behavior characteristics and satisfaction. The evaluation results capture the actual behavior patterns and overall needs of users, so that the experience of building greenways can be fed back in time, providing guidance for optimizing and improving built greenways and for planning and constructing future greenways.

Keywords: multi-source data, greenway, IPA analysis, post-occupancy evaluation (POE)

Procedia PDF Downloads 42
451 Classification of ECG Signal Based on Mixture of Linear and Non-Linear Features

Authors: Mohammad Karimi Moridani, Mohammad Abdi Zadeh, Zahra Shahiazar Mazraeh

Abstract:

In recent years, the use of intelligent systems in biomedical engineering has increased dramatically, especially in the diagnosis of various diseases. Because the electrocardiogram (ECG) is relatively simple to record, it is a good tool for showing the function of the heart and the diseases associated with it. The aim of this paper is to design an intelligent system for automatically distinguishing a normal electrocardiogram signal from an abnormal one. Using this diagnostic system, a person's heart condition can be identified in a very short time and with high accuracy. The data used in this article are from the PhysioNet database, made available in 2016 for researchers to develop the best method for detecting normal signals from abnormal ones. The data come from both genders, and the recording time varies from several seconds to several minutes. All data are labeled normal or abnormal. Because of the limited positional accuracy and recording time of the ECG, and because the signal in some diseases closely resembles the normal signal, the heart rate variability (HRV) signal was used. Measuring and analyzing heart rate variability over time to evaluate the activity of the heart, and differentiating types of heart failure from one another, is of interest to experts. In the preprocessing stage, after noise cancellation by an adaptive Kalman filter and extraction of the R wave by the Pan–Tompkins algorithm, R-R intervals were extracted and the HRV signal was generated. In the processing stage, a new idea is presented: in addition to using the statistical characteristics of the signal, a return map is created and nonlinear characteristics of the HRV signal are extracted, reflecting the nonlinear nature of the signal. Finally, artificial neural networks, widely used in the field of ECG signal processing, together with the distinctive features were used to classify normal signals from abnormal ones. To evaluate the efficiency of the proposed classifiers, the area under the ROC curve (AUC) was used. Simulation results in the MATLAB environment showed that the AUC of the MLP neural network and the SVM were 0.893 and 0.947, respectively. The results of the proposed algorithm also indicated that greater use of nonlinear characteristics in classifying normal and patient signals yielded better performance. Today, research aims at quantitatively analyzing the linear and nonlinear, or deterministic and random, nature of the heart rate variability signal, because it has been shown that these properties can indicate the health status of an individual's heart. The study of the nonlinear behavior and dynamics of the heart's neural control system in the short and long term provides new information on how the cardiovascular system functions and has driven the development of research in this field. Given that the ECG signal contains important information and is one of the common tools used by physicians to diagnose heart disease, but that its accuracy over a limited time is restricted and some of its information is hidden from the viewpoint of physicians, the intelligent system proposed in this paper can help physicians diagnose normal and patient individuals with greater speed and accuracy and can be used as a complementary system in treatment centers.
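
The preprocessing chain described above (R-wave detection, R-R intervals, HRV features) can be made concrete. Below is a minimal Python sketch of the HRV feature step, assuming R-peak sample indices from a Pan–Tompkins-style detector are already available; the sampling rate, synthetic peaks, and the exact feature set (SDNN, RMSSD, and return-map SD1/SD2) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def hrv_features(r_peaks, fs=360.0):
    """Time-domain and return-map (Poincare) features from R-peak indices."""
    rr = np.diff(r_peaks) / fs * 1000.0         # R-R intervals in ms
    sdnn = np.std(rr, ddof=1)                   # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # short-term variability
    # Return map: plot RR(n+1) against RR(n); SD1/SD2 are the dispersions
    # perpendicular to and along the identity line (nonlinear features).
    x, y = rr[:-1], rr[1:]
    sd1 = np.std((y - x) / np.sqrt(2), ddof=1)
    sd2 = np.std((y + x) / np.sqrt(2), ddof=1)
    return {"SDNN": sdnn, "RMSSD": rmssd, "SD1": sd1, "SD2": sd2}

# Synthetic R-peak locations, assuming a 360 Hz sampling rate:
rng = np.random.default_rng(0)
peaks = np.cumsum(rng.normal(300, 15, size=200)).astype(int)
print(hrv_features(peaks))
```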

Keywords: heart rate variability, signal processing, linear and non-linear features, classification methods, ROC curve

Procedia PDF Downloads 228
450 Flood Hazard and Risk Mapping to Assess Ice-Jam Flood Mitigation Measures

Authors: Karl-Erich Lindenschmidt, Apurba Das, Joel Trudell, Keanne Russell

Abstract:

In this presentation, we explore options for mitigating ice-jam flooding along the Athabasca River in western Canada. We consider not only flood hazard, expressed in this case as the probability of flood depths and extents being exceeded, but also flood risk, for which expected annual damages are calculated. Calculating flood risk allows a cost-benefit analysis to be made, so that decisions on the best mitigation options are based not solely on flood hazard but also on the costs related to flood damages and the benefits of mitigation. A river ice model is used to simulate extreme ice-jam flood events, from which scenarios are run to determine flood exposure and damages in flood-prone areas along the river. We concentrate on four mitigation options: the placement of a dike, artificial breakage of the ice cover along the river, the installation of an ice-control structure, and the construction of a reservoir. However, no mitigation option is totally failsafe. For example, dikes can still be overtopped and breached, and ice jams may still occur in areas of the river where ice covers have been artificially broken up. Hence, for all options, it is recommended that zoning of building developments away from greater flood hazard areas be upheld. Flood mitigation can have the negative effect of giving inhabitants a false sense of security that flooding may not happen again, leading to zoning policies being relaxed. (Text adapted from Lindenschmidt [2022] "Ice Destabilization Study - Phase 2", submitted to the Regional Municipality of Wood Buffalo, Alberta, Canada)
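
To make the risk side of the analysis concrete, the following minimal sketch shows how expected annual damages (EAD) and the benefit of a mitigation option can be computed from scenario results; all probabilities and damage figures below are hypothetical, not values from the study.

```python
import numpy as np

# Hypothetical annual exceedance probabilities (AEP) of ice-jam flood
# scenarios versus damages (M$), ordered by increasing probability.
aep    = np.array([0.01, 0.04, 0.10, 0.20, 0.50])  # 100-year down to 2-year
damage = np.array([55.0, 18.0,  6.0,  1.5,  0.0])  # damage per event, M$

# Expected annual damage is the area under the damage-AEP curve.
ead = np.trapz(damage, aep)

# A mitigation option pays off when its annualized cost is below the EAD
# reduction it achieves (hypothetical post-mitigation damages below).
damage_mitigated = np.array([30.0, 8.0, 2.0, 0.5, 0.0])
benefit = ead - np.trapz(damage_mitigated, aep)
print(f"EAD: {ead:.2f} M$/yr, annual mitigation benefit: {benefit:.2f} M$/yr")
```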

Keywords: ice jam, flood hazard, river ice modelling, flood risk

Procedia PDF Downloads 153
449 Semantic Differences between Bug Labeling of Different Repositories via Machine Learning

Authors: Pooja Khanal, Huaming Zhang

Abstract:

Labeling of issues/bugs, also known as bug classification, plays a vital role in software engineering. Some known labels/classes of bugs are 'User Interface', 'Security', and 'API'. Most of the time, when reporters report a bug, they try to assign some predefined label to it. Those issues are reported for a project, and each project is a repository in GitHub/GitLab, which contains multiple issues. There are many software project repositories, ranging from individual projects to commercial projects. The labels assigned in different repositories may depend on various factors, such as human instinct, generalization of labels, and the label assignment policy followed by the reporter. While the reporter of an issue may instinctively give that issue a label, another person reporting the same issue may label it differently. Thus, it is not known mathematically whether a label in one repository is similar to or different from the label in another repository. Hence, the primary goal of this research is to find the semantic differences between the bug labeling of different repositories via machine learning. Independent optimal classifiers for individual repositories are built first, using the text features from the reported issues. The optimal classifiers may include a combination of multiple classifiers stacked together. Then, those classifiers are used to cross-test other repositories, so that the similarity of labels can be deduced mathematically. The product of this ongoing research includes a formalized open-source GitHub issues database that is used to deduce the similarity of the labels pertaining to the different repositories.
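
The cross-testing idea can be sketched in a few lines: train a classifier on one repository's issue texts and score it on another's. The issue texts and labels below are invented, and the single TF-IDF plus logistic-regression pipeline is an illustrative stand-in for the stacked optimal classifiers described above.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.pipeline import make_pipeline

repo_a_texts = ["login button misaligned on mobile", "dropdown overlaps footer",
                "XSS possible in comment field", "token leaks in debug log"]
repo_a_labels = ["User Interface", "User Interface", "Security", "Security"]

repo_b_texts = ["menu icon renders off screen", "SQL injection via search box"]
repo_b_labels = ["User Interface", "Security"]

# Optimal classifier for repository A, trained on its issue text features.
clf_a = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf_a.fit(repo_a_texts, repo_a_labels)

# Cross-test on repository B: a high score suggests the two repositories use
# these labels with a similar meaning; a low score, a semantic difference.
print(f1_score(repo_b_labels, clf_a.predict(repo_b_texts), average="macro"))
```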

Keywords: bug classification, bug labels, GitHub issues, semantic differences

Procedia PDF Downloads 174
448 Intelligent Chatbot Generating Dynamic Responses Through Natural Language Processing

Authors: Aarnav Singh, Jatin Moolchandani

Abstract:

The proposed research work aims to build a query-based AI chatbot that can answer questions on any topic. A chatbot is software that converses with users via text messages. In the proposed system, the chatbot generates a response based on the user's query: natural language processing is used to analyze the query, and a set of texts is used to form a concise answer. The texts are obtained through web scraping, filtering for credible sources from a web search. The objective of this project is a chatbot that provides simple, clear, and accurate answers without the user having to read through a large number of articles and websites in great detail. In addition, this article offers an overview of the worldwide interest in chatbots and explores the reasons for their broad acceptance and their usefulness across many industries.
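
As an illustration of the answer-forming step, the following minimal sketch ranks already-scraped passages against a query by TF-IDF cosine similarity; the passages and query are invented, and the abstract does not specify the system's actual NLP pipeline, so this is an assumption-laden stand-in.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Candidate passages, assumed to be the output of the scraping/filtering step.
passages = [
    "Photosynthesis converts light energy into chemical energy in plants.",
    "The process occurs mainly in the chloroplasts of plant cells.",
    "Association football is the world's most popular sport.",
]
query = "Where does photosynthesis take place?"

vec = TfidfVectorizer().fit(passages + [query])
scores = cosine_similarity(vec.transform([query]), vec.transform(passages))[0]
answer = passages[scores.argmax()]   # most relevant passage as a concise reply
print(answer)
```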

Keywords: chatbot, artificial intelligence, natural language processing, web scraping

Procedia PDF Downloads 39
447 Visual Template Detection and Compositional Automatic Regular Expression Generation for Business Invoice Extraction

Authors: Anthony Proschka, Deepak Mishra, Merlyn Ramanan, Zurab Baratashvili

Abstract:

Small and medium-sized businesses receive over 160 billion invoices every year. Since these documents exhibit many subtle differences in layout and text, automatically extracting structured fields such as sender name, amount, and VAT rate from them is an open research question. In this paper, existing work in template-based document extraction is extended, and a system is devised that reliably extracts all required fields for up to 70% of all documents in the data set, more than any previously reported method. Approaches are described for 1) detecting through visual features which template a given document belongs to, 2) automatically generating extraction rules for a new template by composing regular expressions from multiple components, and 3) computing confidence scores that indicate the accuracy of the automatic extractions. The system can generate templates with as little as one training sample and only requires the ground-truth field values instead of detailed annotations such as bounding boxes that are hard to obtain. The system is deployed and used inside commercial accounting software.
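
Approach (2) can be illustrated with a toy example of composing an extraction rule from reusable regular-expression components; the component set and the invoice line below are assumptions for illustration, not the system's actual rule library.

```python
import re

# Reusable pattern components; a real library would cover many field types.
COMPONENTS = {
    "label":    r"(?:Total|Amount\s+Due|Gross)",
    "currency": r"(?:EUR|USD|€|\$)",
    "number":   r"\d{1,3}(?:[.,]\d{3})*[.,]\d{2}",
}

def compose(*parts):
    """Join component patterns into one rule, tolerating flexible whitespace."""
    return re.compile(r"\s*".join(COMPONENTS[p] for p in parts))

rule = compose("label", "currency", "number")
match = rule.search("Invoice 4711\nTotal EUR 1.234,56\n")
print(match.group(0) if match else "no match")
```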

Keywords: data mining, information retrieval, business, feature extraction, layout, business data processing, document handling, end-user trained information extraction, document archiving, scanned business documents, automated document processing, F1-measure, commercial accounting software

Procedia PDF Downloads 103
446 Experimental Device to Test Corrosion Behavior of Materials in the Molten Salt Reactor Environment

Authors: Jana Petru, Marie Kudrnova

Abstract:

The use of technologies working with molten salts is conditioned on finding suitable construction materials that meet several demanding criteria. In addition to temperature resistance, materials must show corrosion resistance to the salts; they must also meet mechanical requirements and other requirements according to the area of use, for example, radiation resistance in molten salt reactors. The present text describes an experimental device for studying the corrosion resistance of candidate materials in molten salt mixtures; this work is a partial task of the international project ADAR, which evaluates advanced nuclear reactors based on molten salts. The design of the device is based on a test exposure of Inconel 625 to the Hitec salt mixture in a high-temperature tube furnace. Besides the metallographic evaluation of the behavior of alloy 625 in the nitrate salt mixture, the main result of this pre-exposure is a list of operational and construction problems that were essential for the design of the new experimental equipment. The main output is the scheme of a newly designed gas-tight experimental apparatus capable of operating in an inert argon atmosphere at temperatures up to 600 °C and a pressure of 3 bar, in the presence of a corrosive salt environment, with exposure times of hundreds of hours. This device will enable the study of promising construction materials for nuclear energy.

Keywords: corrosion, experimental device, molten salt, steel

Procedia PDF Downloads 100
445 Implementing Text Using Political and Current Issues to Create Choreography: “The Pledge 2.0”

Authors: Muhammad Fairul Azreen bin Mohd Zahid, Melissa Querk, Aimi Nabila bt Anizaim

Abstract:

This research takes a practice-as-research approach that produces a choreography as its outcome. The ideas developed organically, as an 'epiphany' arising from meetings, brainstorming, and situations in the surroundings. In this study, the researchers approach the national pillars of Malaysia, known as 'Rukun Negara', to develop a choreographic idea. J. L. Austin's theory of Speech Acts is used to compose the choreography, with the 'Rukun Negara' as a guideline, in a contemporary work titled The Pledge 2.0 that also fosters the spirit of unity. These approaches offer flexibility in creating a choreographic piece. The Pledge 2.0 crosses boundaries by using texts and weighty issues in its choreographic development. It emphasizes the delivery of speech through verbal and nonverbal body language. Besides using the theory of Speech Acts, the process of creating this piece lays bare the normative structure implicit in performance practice. Converging current issues into the final choreographic piece is vital, as this research explores several choreography methods from different perspectives. Hence, the audience will be able to see that the world of dance always revolves in line with diachronic processes in many ways. The method used in this research is qualitative and is applied in finding movement that fits the given facts.

Keywords: performing arts, speech act, performative, nationalism, choreography, politics in dance

Procedia PDF Downloads 64
444 A Transformer-Based Question Answering Framework for Software Contract Risk Assessment

Authors: Qisheng Hu, Jianglei Han, Yue Yang, My Hoa Ha

Abstract:

When a company is considering purchasing software for commercial use, contract risk assessment is critical to identify risks and mitigate potential adverse business impacts, e.g., security, financial, and regulatory risks. Contract risk assessment requires reviewers with specialized knowledge and time to evaluate legal documents manually. Specifically, validating contracts for a software vendor requires the following steps: manual screening, interpreting legal documents, and extracting risk-prone segments. To automate the process, we propose a framework that assists legal contract risk identification by leveraging pre-trained deep learning models and natural language processing techniques. Given a set of pre-defined risk evaluation questions, our framework utilizes pre-trained transformer-based question-answering models to identify risk-prone sections in a contract. The question-answering model encodes the concatenated question-contract text and predicts the start and end positions for clause extraction. Because only limited labelled data were available for training, we leveraged transfer learning, fine-tuning the models on the CUAD dataset. On a dataset comprising 287 contract documents and 2,000 labelled samples, our best model achieved an F1 score of 0.687.
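
The span-extraction step can be sketched with a generic extractive question-answering model from the Hugging Face hub; the model name, contract snippet, and question below are illustrative, not the authors' CUAD-fine-tuned model.

```python
from transformers import pipeline

# A publicly available extractive QA model stands in for the fine-tuned one.
qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

contract = ("The Supplier may terminate this Agreement at any time "
            "without notice and without liability to the Customer.")
question = "What are the termination conditions for the supplier?"

result = qa(question=question, context=contract)
# 'start'/'end' give the clause position in the contract text;
# 'score' supports a confidence threshold for flagging risk-prone sections.
print(result["answer"], result["score"])
```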

Keywords: contract risk assessment, NLP, transfer learning, question answering

Procedia PDF Downloads 105
443 Assessing EU-China Security Interests: From Contradiction to Convergence

Authors: Julia Gurol

Abstract:

Why do we observe a shift towards convergence in EU-China security interests? While contradictory attitudes towards key principles of inter-state and region-to-state relations, including state sovereignty, territorial integrity, and intervention policies, have long hindered EU-China inter-regional cooperation beyond the economic realm, collaboration on peace and security issues is now becoming a key pillar of European-Chinese relations. In addition, the Belt and Road Initiative, China's most ambitious foreign policy project, explicitly touches upon several European foreign policy and security preferences. Based on these counterintuitive findings, this paper traces the process of convergence of Sino-European security interests. Drawing on qualitative text analysis of official Chinese and European policy papers and documents from the establishment of diplomatic relations in 1975 until today, it assesses the striking change over time. On this basis, the paper uses theories of neo-functionalism, inter-regionalism, and securitization, and borrows from constructivist views in International Relations theory, to expound possible motives for the change in Chinese and, respectively, European preferences in the security realm. The results reveal interesting insights into the decisive factors and motives behind both actors' foreign policies. The paper concludes with a discussion of the further potential and difficulties of EU-China security cooperation.

Keywords: belt and road initiative, China, European Union, foreign policy, neo-functionalism, security

Procedia PDF Downloads 262
442 Introducing a Video-Based E-Learning Module to Improve Disaster Preparedness at a Tertiary Hospital in Oman

Authors: Ahmed Al Khamisi

Abstract:

The Disaster Preparedness Standard (DPS) is one of the elements evaluated by Accreditation Canada International (ACI). ACI emphasizes training and educating all staff, including service providers and senior leaders, on emergency and disaster preparedness at orientation and annually thereafter. A lack of awareness and a deficit of knowledge about the DPS among healthcare providers have been noticed in a tertiary hospital where ACI standards were implemented. Therefore, this paper aims to introduce a video-based e-learning (VB-EL) module that explains the hospital's disaster plan in simple language and will be easily accessible to all healthcare providers through the hospital's website. The healthcare disaster preparedness coordinator in the targeted hospital will be responsible for ensuring that the VB-EL module is ready by 25 April 2019. The module will be developed based on the Kirkpatrick evaluation method. VB-EL combines different forms of data, such as images, motion, sound, and text, in a complementary fashion that suits the diverse learning styles and individual learning pace of healthcare providers. Moreover, the module can be adjusted more easily than other tools to control the information that healthcare providers receive, and it enables them to stop, rewind, fast-forward, and replay content as many times as needed. Anticipated limitations in the development of this module include the challenge of preparing the VB-EL content and resistance from healthcare providers.

Keywords: Accreditation Canada International, Disaster Preparedness Standard, Kirkpatrick evaluation method, video-based e-learning

Procedia PDF Downloads 131
441 Efficacy of Cognitive Rehabilitation Therapy on Poststroke Depression among Survivors of Stroke: A Systematic Review

Authors: Zahra Hassani

Abstract:

Background and Purpose: Poststroke depression (PSD) is a complication of stroke that reduces the patient's chance of recovery and can make the patient irritable and change their personality. Cognitive rehabilitation is a non-pharmacological method that improves deficits such as attention and memory as well as symptoms of depression. The purpose of the present study is therefore to evaluate the efficacy of cognitive rehabilitation therapy on poststroke depression among survivors of stroke. Method: A systematic search of the Google Scholar, PubMed, Science Direct, and Elsevier databases between 2015 and 2019 was conducted using the keywords 'cognitive rehabilitation therapy', 'post-stroke', and 'depression'. In this process, studies that examined the efficacy of cognitive rehabilitation therapy on poststroke depression among survivors of stroke were included. Results: Inclusion criteria were full-text availability, interventional design, and non-review articles. The included articles differed significantly in the indices studied, sample sizes, and methods of implementation. The review showed that cognitive rehabilitation therapy plays a significant role in reducing the symptoms of poststroke depression. These interventions are also effective in improving problem-solving skills, memory, and attention and concentration. Conclusion: This study emphasizes the development of efficient and flexible adaptive skills through cognitive processes and their effect on reducing depression in patients after stroke.

Keywords: cognitive therapy, depression, stroke, rehabilitation

Procedia PDF Downloads 102
440 The Translation of Original Metaphor in Literature

Authors: Esther Matthews

Abstract:

This paper looks at ways of translating new metaphors: those conceived and created by authors, which are often called ‘original’ metaphors in the world of Translation Studies. An original metaphor is the most extreme form of figurative language, often dramatic and shocking in effect. It displays unexpected juxtapositions of language, suggesting there could be as many different translations as there are translators. However, some theorists say original metaphors should be translated ‘literally’ or ‘word for word’ as far as possible, suggesting a similarity between translators’ solutions. How do literary translators approach this challenge? This study focuses on Spanish-English translations of a novel full of original metaphors: Nada by Carmen Laforet (1921 – 2004). Original metaphors from the text were compared to the four published English translations by Inez Muñoz, Charles Franklin Payne, Glafyra Ennis, and Edith Grossman. These four translators employed a variety of translation methods, but they translated ‘literally’ in well over half of the original metaphors studied. In a two-part translation exercise and questionnaire, professional literary translators were asked to translate a number of these metaphors. Many different methods were employed, but again, over half of the original metaphors were translated literally. Although this investigation was limited to one author and language pair, it gives a clear indication that, although literary translators’ solutions vary, on the whole, they prefer to translate original metaphors as literally as possible within the confines of English grammar and syntax. It also reveals literary translators’ desire to reproduce the distinctive character of an author’s work as accurately as possible for the target reader.

Keywords: translation, original metaphor, literature, translator training

Procedia PDF Downloads 246
439 Real-Time Neuroimaging for Rehabilitation of Stroke Patients

Authors: Gerhard Gritsch, Ana Skupch, Manfred Hartmann, Wolfgang Frühwirt, Hannes Perko, Dieter Grossegger, Tilmann Kluge

Abstract:

Rehabilitation of stroke patients is dominated by classical physiotherapy. A current field of research is the application of neurofeedback techniques to help stroke patients overcome their motor impairments. Especially if a certain limb is completely paralyzed, neurofeedback is often the last treatment option. Certain exercises, like imagining the impaired motor function, have to be performed to stimulate the neuroplasticity of the brain, so that the corresponding activity takes place in the parts of the cortex neighboring the injured region. During the exercises, it is very important to keep the motivation of the patient at a high level. For this reason, the natural feedback that is missing due to the lack of movement of the affected limb may be replaced by a synthetic feedback based on motor-related brain function. To generate such a synthetic feedback, a system is needed that measures, detects, localizes, and visualizes the motor-related µ-rhythm. Fast therapeutic success can only be achieved if the feedback features high specificity and comes in real time without large delay. We describe such an approach, which offers a 3D visualization of µ-rhythms in real time with a delay of 500 ms. This is accomplished by combining smart EEG preprocessing in the frequency domain with source localization techniques. The algorithm first selects the EEG channel featuring the most prominent rhythm in the alpha frequency band from a so-called motor channel set (C4, Cz, C3, CP6, CP4, CP2, CP1, CP3, CP5). If the amplitude in the alpha frequency band of this electrode exceeds a threshold, a µ-rhythm is detected. To prevent detection of a mixture of posterior alpha activity and µ-activity, the amplitudes in the alpha band outside the motor channel set are not allowed to be in the same range as those of the main channel. The EEG signal of the main channel is used as a template for calculating the spatial distribution of the µ-rhythm over all electrodes. This spatial distribution is the input for an inverse method that provides the 3D distribution of the µ-activity within the brain, which is visualized as a color-coded 3D activity map. This approach mitigates the influence of eye-lid artifacts on the localization performance. First results from several healthy subjects show that the system is capable of detecting and localizing the rarely appearing µ-rhythm. In most cases, the results match findings from visual EEG analysis. Frequent eye-lid artifacts have no influence on the system performance. Furthermore, the system is able to run in real time; due to the design of the frequency transformation, the processing delay is 500 ms. The first results are promising, and we plan to extend the test data set to further evaluate the performance of the system. The relevance of the system to the therapy of stroke patients has to be shown in studies with real patients after CE certification of the system. This work was performed within the project 'LiveSolo' funded by the Austrian Research Promotion Agency (FFG) (project number: 853263).
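
The channel-selection and detection logic can be sketched as follows; the sampling rate, threshold factor, and synthetic data are illustrative assumptions, not the system's calibrated values.

```python
import numpy as np
from scipy.signal import welch

FS = 250                       # sampling rate in Hz, assumed for illustration
MOTOR = ["C4", "Cz", "C3", "CP6", "CP4", "CP2", "CP1", "CP3", "CP5"]

def alpha_power(x, fs=FS):
    """Mean power spectral density in the 8-13 Hz alpha band."""
    f, pxx = welch(x, fs=fs, nperseg=fs)          # 1-second windows
    band = (f >= 8) & (f <= 13)
    return pxx[band].mean()

# Synthetic 2-second EEG with a 10 Hz mu-like rhythm injected into C3.
rng = np.random.default_rng(0)
eeg = {ch: rng.normal(size=2 * FS) for ch in MOTOR + ["O1", "O2"]}
eeg["C3"] += np.sin(2 * np.pi * 10 * np.arange(2 * FS) / FS)

powers = {ch: alpha_power(x) for ch, x in eeg.items()}
main = max(MOTOR, key=powers.get)                 # most prominent motor channel
# Detect a mu-rhythm only when the motor channel clearly dominates the
# posterior (non-motor) alpha channels; the factor 3 is an assumption.
is_mu = powers[main] > 3 * max(powers[ch] for ch in ("O1", "O2"))
print(main, is_mu)
```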

Keywords: real-time EEG neuroimaging, neurofeedback, stroke, EEG signal processing, rehabilitation

Procedia PDF Downloads 365
438 Techniques to Teach Reading at Pre-Reading Stage

Authors: Anh Duong

Abstract:

The three-phase reading lesson has been put forward around the world as a new and innovative framework corresponding to the learner-centered trend in English language teaching and learning. Among the three stages, pre-reading attracts much attention from teachers and researchers for its vital role in preparing students with knowledge of and interest in the reading class. The researcher's desire to exemplify the effectiveness of activities prior to text reading motivated the current study. Three main aspects were investigated in this paper: teachers' and students' perceptions of the pre-reading stage, teachers' exploitation of pre-reading techniques, and teachers' recommendations of effective pre-reading activities. Focusing on pre-reading techniques for first-year students at the English Department, this study involved 200 freshmen and 10 teachers from Division 1 in a questionnaire survey. Interviews with the teachers and classroom observation were employed to gain insight into the responses from the earlier instrument. After a detailed procedure of data analysis, the researcher found that, in line with the participants' positive view of the pre-reading stage, this phase was frequently conducted by the surveyed teachers. Although pre-reading activities clearly helped motivate students to read and created a joyful learning atmosphere, they did not fulfill their further function of supporting students' reading comprehension. A range of techniques, and points to note when preparing and conducting the pre-reading phase, were therefore collected from the interviewed teachers. The findings allowed the researcher to propose related pedagogical implications concerning teachers' sources of pre-reading techniques, variations of the suggested activities, and the first-year reading syllabus.

Keywords: pre-reading stage, pre-reading techniques, teaching reading, language teaching

Procedia PDF Downloads 467
437 Performance Assessment of Multi-Level Ensemble for Multi-Class Problems

Authors: Rodolfo Lorbieski, Silvia Modesto Nassar

Abstract:

Many supervised machine learning tasks require decision making across numerous different classes. Multi-class classification has several applications, such as face recognition, text recognition, and medical diagnostics. The objective of this article is to analyze an adapted method of stacking for multi-class problems, which combines ensembles within the ensemble itself. For this purpose, a training scheme similar to stacking was used, but with three levels: the final decision-maker (level 2) is trained on the combined outputs of a pair of meta-classifiers (level 1) from tree-based and Bayesian families, which are in turn trained by pairs of base classifiers (level 0) of the same family. This strategy seeks to promote diversity among the ensembles forming the level-2 meta-classifier. Three performance measures were used: (1) accuracy, (2) area under the ROC curve, and (3) time, for three factors: (a) datasets, (b) experiments, and (c) levels. To compare the factors, a three-way ANOVA test was executed for each performance measure, considering 5 datasets by 25 experiments by 3 levels. A triple interaction between factors was observed only for time. Accuracy and area under the ROC curve presented similar results, showing a double interaction between level and experiment, as well as with the dataset factor. It was concluded that level 2 had an average performance above the other levels and that the proposed method is especially efficient for multi-class problems when compared to binary problems.
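
The nesting of ensembles within an ensemble can be sketched with scikit-learn's StackingClassifier; the concrete estimators below are illustrative stand-ins and do not reproduce the article's exact configuration.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import BernoulliNB, GaussianNB
from sklearn.tree import DecisionTreeClassifier, ExtraTreeClassifier

# Level 1: each meta-classifier stacks a pair of level-0 base classifiers
# from the same family (one tree-based stack, one Bayesian stack).
tree_stack = StackingClassifier(
    estimators=[("dt", DecisionTreeClassifier()), ("et", ExtraTreeClassifier())],
    final_estimator=LogisticRegression(max_iter=1000))
bayes_stack = StackingClassifier(
    estimators=[("gnb", GaussianNB()), ("bnb", BernoulliNB())],
    final_estimator=LogisticRegression(max_iter=1000))

# Level 2: the final decision-maker combines the two level-1 stacks.
level2 = StackingClassifier(
    estimators=[("trees", tree_stack), ("bayes", bayes_stack)],
    final_estimator=LogisticRegression(max_iter=1000))

X, y = load_iris(return_X_y=True)            # a standard multi-class dataset
print(cross_val_score(level2, X, y, cv=5).mean())
```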

Keywords: stacking, multi-layers, ensemble, multi-class

Procedia PDF Downloads 249
436 Exploring the Difficulties of Acceleration Concept from the Perspective of Historical Textual Analysis

Authors: Yun-Ju Chiu, Feng-Yi Chen

Abstract:

Kinematics is the starting point for learning mechanics in a physics course. The concept of acceleration plays an important role in learning kinematics. Teachers usually teach the concept through the formulas and graphs of kinematics and the well-known law F = ma. However, over the past few decades, many researchers have revealed numerous difficulties students have in learning acceleration. One of these difficulties is that students frequently confuse acceleration with velocity and force. Why is the concept of acceleration so difficult to learn? The aim of this study is to understand the conceptual evolution of acceleration through historical textual analysis. Text analysis and one-to-one interviews with high school students and teachers are used in this study. This study finds that the history of science constructed from textbooks is usually quite different from the real evolution of history. For example, most teachers and students believe that the best-known law F = ma was written down by Newton. Yet the second law is not expressed as F = ma in Newton's best-known book, the Principia, of 1687. Even more than one hundred years later, a famous Cambridge textbook, An Elementary Treatise on Mechanics by Whewell of Trinity College, did not express this law as F = ma. In Whewell's time, early-to-mid-nineteenth-century Britain, the concept of acceleration was not only ambiguous but also confused with the concept of force. The process of learning the concept of acceleration is analogous to its conceptual development in history. Study from the perspective of historical textual analysis will promote the understanding of concept learning difficulties, the development of professional physics teaching, and the improvement of the content of physics textbooks.

Keywords: acceleration, textbooks, mechanics, misconception, history of science

Procedia PDF Downloads 231
435 Evaluation of Video Development about Exclusive Breastfeeding as a Nutrition Education Media for Posyandu Cadre

Authors: Ari Istiany, Guspri Devi Artanti, M. Si

Abstract:

Based on the results of Riskesdas, awareness of the importance of exclusive breastfeeding is still low, at only 15.3%. This condition puts infants at high risk of infectious diseases, such as diarrhea and acute respiratory infection. Therefore, the aim of this study was to evaluate the development of a video about exclusive breastfeeding as a nutrition education medium for posyandu cadres. This research used development methods to produce the video about exclusive breastfeeding. The study was conducted in the urban area of Rawamangun, East Jakarta. Respondents were one media expert from the Department of Educational Technology - UNJ, two subject matter experts from the Department of Home Economics - UNJ, and 20 posyandu cadres, who assessed the quality of the video. Aspects assessed included the legibility of text, image display quality, color composition, clarity of sound, appropriateness of music, duration, suitability of the material, and language. Data were analyzed descriptively using frequency distribution tables, mean values, and standard deviations. The results showed that the average assessment scores according to the media expert, the subject matter experts, and the posyandu cadres were 3.43 ± 0.51 (good), 4.37 ± 0.52 (very good), and 3.6 ± 0.73 (good), respectively. The conclusion is that the exclusive breastfeeding video is feasible as a medium for nutrition education. Suggestions for improving the visual media are to add illustrations, material about correct breastfeeding technique, and pictures of healthy babies.

Keywords: exclusive breastfeeding, posyandu cadre, video, nutrition education

Procedia PDF Downloads 392
434 The Impact of Psychiatric Symptoms on Return to Work after Occupational Injury

Authors: Kuan-Han Lin, Kuan-Yin Lin, Ka-Chun Siu

Abstract:

The purpose of this systematic review was to determine the impact of post-traumatic stress disorder (PTSD) symptoms or depressive symptoms on return to work (RTW) after occupational injury. Original articles on clinical trials and observational studies published between January 1980 and November 2016 were retrieved from PubMed, MEDLINE, and PsycINFO. Two reviewers evaluated the abstracts identified by the search criteria for full-text review. To be included in the final analysis, studies were required to use either an intervention or an observational study design to examine the association between psychiatric symptoms and RTW. A modified checklist designed by Downs & Black and Crombie was used to assess the methodological quality of the included studies. A total of 58 articles were identified from the electronic databases after duplicates were removed. Seven studies fulfilled the inclusion criteria and were critically reviewed. The rates of RTW in the included studies ranged from 6% to 63.6% among workers after occupational injuries. This review found that post-traumatic stress symptoms and depressive symptoms were negatively associated with RTW. Although the impact of psychiatric symptoms on RTW after occupational injury remains poorly understood, this review highlights the important finding that injured workers with psychiatric symptoms had poor RTW outcomes. Future work should address the effective management of the psychiatric factors affecting RTW among workers.

Keywords: depressive symptom, occupational injury, post-traumatic stress disorder, return to work

Procedia PDF Downloads 246
433 Electrical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one method of load monitoring used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, event detection, and feature extraction, with general appliance modeling and identification at the final stage. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, which are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used to tune general appliance models for the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand, and then detecting the times at which each selected appliance changes state. In order to fit the capabilities of practical existing smart meters, we work on low-sampling-rate data with a frequency of 1/60 Hz. The data are simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature. LPG is a numerical software package that simulates the behaviour of people inside a house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of the specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract a power interval within which the operation of the selected appliance falls, along with a time vector of the values delimiting the state transitions of the appliance. Appliance signatures are then formed from the extracted power, geometrical, and statistical features. Afterwards, these signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and real data from the Reference Energy Disaggregation Dataset (REDD). For that, we compute performance metrics based on the confusion matrix, considering accuracy, precision, recall, and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as detection techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).
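
The DTW matching step can be sketched directly: align a measured power signature with each general appliance model and take the alignment cost as the dissimilarity. The power signatures below are illustrative, not LPG or REDD data.

```python
import numpy as np

def dtw(a, b):
    """Classic dynamic-time-warping cost between two 1-D power signatures."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# General appliance models and one measured signature (W, 1/60 Hz samples).
fridge_model = np.array([0, 120, 120, 120, 0], dtype=float)
heater_model = np.array([0, 2000, 2000, 0], dtype=float)
measured     = np.array([0, 0, 115, 125, 120, 0], dtype=float)

# The appliance whose model warps onto the measurement most cheaply wins.
print(dtw(measured, fridge_model), dtw(measured, heater_model))
```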

Keywords: electrical disaggregation, DTW, general appliance modeling, event detection

Procedia PDF Downloads 49
432 Challenging the Stereotypes: A Critical Study of Chotti Munda, His Arrow, and Sula

Authors: Khushboo Gokani, Renu Josan

Abstract:

Mahasweta Devi and Toni Morrison are two stalwarts of Indian-English and Afro-American literature, respectively. The writings of these two novelists are authentic and powerful records of the lives of the people, because much of their personal experience has gone into the making of their works. Devi, a representative force in Indian English literature, is also a social activist working with the tribals of Bihar, Jharkhand, Orissa, and West Bengal. Most of her works echo the lives and struggles of the subalterns, as is evident in her 'best-beloved book' Chotti Munda and His Arrow. The novelist focuses on the struggle of the tribals against the colonial and feudal powers to create their identity, thereby embarking on the ideological project called Setting the Record Straight. The Nobel laureate Toni Morrison, on the other hand, brings to the fore the crucial issues of gender, race, and class in many of her significant works. In one of her representative works, Sula, the protagonist emerges as a non-conformist and directly confronts the notion of a 'good woman' nurtured by the community of the Blacks. In addition, the struggle of the Blacks against White domination also becomes an important theme of the text. The thrust of the paper lies in making a critical analysis of the portrayal of the heroic attempts of the subaltern protagonists and the artistic endeavor of the novelists in challenging the stereotypes.

Keywords: the struggle of the muted groups, subaltern, center and periphery, challenging the stereotypes

Procedia PDF Downloads 218
431 The Universal Theory: Role of Imaginary Pressure on Different Relative Motions

Authors: Sahib Dino Naseerani

Abstract:

The presented text discusses the concept of imaginary pressure and its role in different relative motions. It explores how imaginary pressure, the combined effect of external atmospheric pressure and real pressure, affects various substances and their physical properties. The study aims to understand the impact of imaginary pressure and its potential applications in different contexts, such as spaceflight. The main objective is to investigate the role of imaginary pressure in different relative motions. Specifically, the researchers examine how imaginary pressure affects the contraction and mass variation of a body when it is in motion at the speed of light. The study seeks to provide insights into the behavior and consequences of imaginary pressure in various scenarios. The data were collected from three research papers. This research contributes to a better understanding of the theoretical implications of imaginary pressure. It elucidates how imaginary pressure is held responsible for the contraction and mass variation of a body in motion, particularly at the speed of light. The findings shed light on the behavior of substances under the influence of imaginary pressure, providing insights for future scientific studies. By examining different substances in liquid and solid forms, the research explores the consequences of imaginary pressure on their volume, length, and mass.

Keywords: imaginary pressure, contraction, variation, relative motion

Procedia PDF Downloads 75
430 Investigating Online Literacy among Undergraduates in Malaysia

Authors: Vivien Chee Pei Wei

Abstract:

Today we live in a scenario in which letters share space with images on screens that vary in size, shape, and style. The popularization of television, then the computer, and now e-readers, tablets, and smartphones has made electronic media assume the role that was previously restricted to printed materials. Since the extensive use of new technologies to produce, disseminate, collect, and access electronic publications began, the changes to reading have intensified. Reading online involves more than utilizing specific skills, strategies, and practices; it also involves negotiating multiple information sources. In this study, different perspectives on digital reading are explored in order to define the key aspects of the term. The focus is on how new technologies affect undergraduates' reading behavior, which in turn gives readers different reading levels and engagement with the text and other supporting materials in the same media. Also important is the relationship between reading platforms, reading levels, and formats of electronic publications. The study looks at the online reading practices of about 100 undergraduates from a local university. The data, collected through a survey and interviews with the respondents, are analyzed thematically. Findings from this study show that digital and traditional reading are interrelated and should be viewed not as separate but as complementary to each other. However, reading online complicates some of the skills required by traditional reading. Consequently, in order to successfully read and comprehend multiple sources of information online, undergraduates need regular opportunities to practice and develop their skills as part of their natural reading practices.

Keywords: concepts, digital reading, literacy, traditional reading

Procedia PDF Downloads 291
429 The World View of Tere Liye in Negeri Para Bedebah: An Analysis Based on Lucien Goldmann's Genetic Structuralism

Authors: Muhammad Fadli Muslimin

Abstract:

Negeri Para Bedebah is one of the works of Tere Liye, an Indonesian author. In literary works, fiction always tries to reflect the reality of the society in which the author or social groups live. The essence or nature of society is a reality, while a literary work is fiction, and both are social facts. Negeri Para Bedebah is a novel, a fiction that is a social fact and holds an important role in reality; it can be read as a representation of social, economic, and political aspects of Indonesia. The purpose of this study is to reveal the world view of Tere Liye through the novel Negeri Para Bedebah. Analyzing the novel using Lucien Goldmann's genetic structuralism, which chiefly focuses on world view, shows that a literary work is a structure and that it is homologous with the structure of society. The structure of a literary work is not chiefly homologous to the structure of society directly, but to the world view that grows and develops inside the society. The methodological approach used in this paper is a dialectic method, which focuses on the starting and ending points that lie in the literary text, paying attention to coherent meanings. The result of this study is that Tere Liye shows us his world view of the structure of the society in which he lives, as an imaginative form of the world that is nevertheless homologous to reality itself.

Keywords: homology, literary work, society, structure, world view

Procedia PDF Downloads 487
428 Post-Structural Study of Gender in Shakespearean Othello from a Butlerian Perspective

Authors: Muhammad Shakeel Rehman Hissam

Abstract:

This study analyzes gender in Othello by applying Judith Butler's post-structural theory of gender and gender performance. The analysis of the play provides a context for examining what effects the drama has on our understanding of gender identity. The study sets out to examine whether there is any evidence or ground in the selected Shakespearean work that challenges the taken-for-granted gender roles prescribed by patriarchy. The focal point in this study of Othello is that the actions and performances of characters determine their gender identity rather than their sexuality. It argues that the gender of Shakespearean characters leaves no constant, fixed, structural impression; on the contrary, the characters undergo consistent variations in their behavior and performance, which impart fluidity and volatility to them. The theoretical underpinning of the present study is Butler's prominent work, Gender Trouble: Feminism and the Subversion of Identity, and her post-structural theory of gender performativity, through which the selected play is analyzed. The gender-centric plot of the play is riddled with the fluidity of gender. The most fascinating aspect of the play is the transformation of genders on the basis of performances by different characters, and through these transformations gender identity is revealed and determined. The study reconstructs accepted gender norms by challenging the traditional concept of gender based on the sexual differences of characters.

Keywords: post structural, gender, performativity, socio-cultural gender norms, binaries, Othello, Butler, identity

Procedia PDF Downloads 344
427 Improved Classification Procedure for Imbalanced and Overlapped Situations

Authors: Hankyu Lee, Seoung Bum Kim

Abstract:

The issue of imbalance and overlap in the class distribution is important in various applications of data mining. An imbalanced dataset is a special case in classification problems in which the number of observations of one class (the major class) heavily exceeds the number of observations of the other class (the minor class). An overlapped dataset is the case where many observations are shared between the two classes. Imbalanced and overlapped data are frequently found in real examples, including fraud and abuse detection in healthcare, quality prediction in manufacturing, text classification, oil spill detection, remote sensing, and so on. The class imbalance and overlap problem is a challenging issue because it degrades the performance of most standard classification algorithms. In this study, we propose a classification procedure that can effectively handle imbalanced and overlapped datasets by splitting the data space into three parts: non-overlapping, lightly overlapping, and severely overlapping, and applying a classification algorithm in each part. These three parts are determined based on the Hausdorff distance and the margin of a modified support vector machine. An experimental study was conducted to examine the properties of the proposed method and to compare it with other classification algorithms. The results showed that the proposed method outperformed the competitors under various imbalanced and overlapped situations. Moreover, the applicability of the proposed method was demonstrated through an experiment with real data.
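
The overlap diagnostic can be sketched with the symmetric Hausdorff distance between the two classes' point sets, which the procedure combines with the margin of a modified support vector machine; the data below are illustrative.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

# Synthetic imbalanced, overlapping classes (500 major vs. 50 minor points).
rng = np.random.default_rng(1)
major = rng.normal(0.0, 1.0, size=(500, 2))
minor = rng.normal(1.0, 1.0, size=(50, 2))

# Symmetric Hausdorff distance: the maximum of the two directed distances.
d_h = max(directed_hausdorff(major, minor)[0],
          directed_hausdorff(minor, major)[0])
print(f"Hausdorff distance between classes: {d_h:.2f}")
```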

Keywords: classification, imbalanced data with class overlap, split data space, support vector machine

Procedia PDF Downloads 282
426 Technology in the Calculation of People's Health Level: Design of a Computational Tool

Authors: Sara Herrero Jaén, José María Santamaría García, María Lourdes Jiménez Rodríguez, Jorge Luis Gómez González, Adriana Cercas Duque, Alexandra González Aguna

Abstract:

Background: The concept of health has evolved throughout history. The health level is determined by the individual's own perception. It is a dynamic process over time, so variations can be seen from one moment to the next. Knowing the health of the patients one cares for therefore facilitates decision making in care. Objective: To design a technological tool that calculates a person's health level sequentially over time. Material and Methods: Deductive methodology through text analysis, extraction and logical formalization of knowledge, and education with an expert group. Study period: September 2015 to the present. Results: A computational tool for use by health personnel has been designed. It has 11 variables, each of which can be given a value from 1 to 5, with 1 being the minimum value and 5 the maximum. By adding the results of the 11 variables, we obtain a magnitude at a certain time: the health level of the person. The health calculator represents a person's health level at a given time, establishing temporal cuts that are useful for determining the evolution of the individual over time. Conclusion: Information and Communication Technologies (ICT) support training and assistance in various disciplinary areas, and their relevance in the field of health is important to highlight. Based on this formalization of health, care acts can be directed towards some of the propositional elements of the concept above, and these care acts will modify a person's health level. The health calculator allows the prioritization and prediction of different health care strategies in hospital units.
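
The scoring logic reduces to a small function: eleven variables rated 1 to 5 and summed into a magnitude between 11 and 55, recorded at temporal cuts. The abstract does not list the variable names, so the sketch below keeps them anonymous.

```python
from datetime import date

VARIABLES = 11

def health_level(scores: list[int]) -> int:
    """Sum eleven 1-5 ratings into a single health-level magnitude (11-55)."""
    if len(scores) != VARIABLES or not all(1 <= s <= 5 for s in scores):
        raise ValueError("expected eleven ratings between 1 and 5")
    return sum(scores)

# Temporal cuts: repeated measurements show the individual's evolution.
history = {date(2016, 1, 10): health_level([3] * 11),
           date(2016, 6, 10): health_level([4] * 11)}
print(history)
```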

Keywords: calculator, care, eHealth, health

Procedia PDF Downloads 238
425 DCDNet: Lightweight Document Corner Detection Network Based on Attention Mechanism

Authors: Kun Xu, Yuan Xu, Jia Qiao

Abstract:

Document detection plays an important role in optical character recognition and text analysis. Because traditional detection methods have weak generalization ability, while deep neural networks have complex structures and large numbers of parameters that cannot be deployed well on mobile devices, this paper proposes a lightweight Document Corner Detection Network (DCDNet). DCDNet is a two-stage architecture. The first stage, with an Encoder-Decoder structure, adopts depthwise separable convolution to greatly reduce the network parameters. By introducing the Feature Attention Union (FAU) module, the second stage enhances the feature information of the spatial and channel dimensions and adaptively adjusts the size of the receptive field to enhance the feature expression ability of the model. To address the large imbalance in pixel counts between corner and non-corner regions, a Weighted Binary Cross-Entropy Loss (WBCE Loss) is proposed, defining corner detection as a classification problem to make the training process more efficient. To make up for the lack of datasets for document corner detection, a dataset containing 6,620 images, named the Document Corner Detection Dataset (DCDD), was created. Experimental results show that the proposed method obtains fast, stable, and accurate detection results on DCDD.
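
A weighted binary cross-entropy of the kind described can be sketched in PyTorch by up-weighting the rare corner pixels; the weighting scheme below is a common choice and an assumption, not necessarily the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def wbce_loss(logits: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Up-weight the rare corner pixels by the non-corner/corner ratio."""
    pos = target.sum().clamp(min=1.0)          # number of corner pixels
    neg = target.numel() - pos                 # number of non-corner pixels
    return F.binary_cross_entropy_with_logits(
        logits, target, pos_weight=neg / pos)

logits = torch.randn(1, 1, 64, 64)             # predicted corner heat map
target = torch.zeros(1, 1, 64, 64)
target[0, 0, 10, 10] = 1.0                     # a single labeled corner pixel
print(wbce_loss(logits, target))
```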

Keywords: document detection, corner detection, attention mechanism, lightweight

Procedia PDF Downloads 331
424 Empirical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one method of load monitoring used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, event detection, and feature extraction, with general appliance modeling and identification at the final stage. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, which are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used to tune general appliance models for the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand, and then detecting the times at which each selected appliance changes state. In order to fit the capabilities of practical existing smart meters, we work on low-sampling-rate data with a frequency of 1/60 Hz. The data are simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature. LPG is a numerical software package that simulates the behaviour of people inside a house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of the specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract a power interval within which the operation of the selected appliance falls, along with a time vector of the values delimiting the state transitions of the appliance. Appliance signatures are then formed from the extracted power, geometrical, and statistical features. Afterwards, these signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and real data from the Reference Energy Disaggregation Dataset (REDD). For that, we compute performance metrics based on the confusion matrix, considering accuracy, precision, recall, and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as detection techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).

Keywords: general appliance model, non-intrusive load monitoring, event detection, unsupervised techniques

Procedia PDF Downloads 52
423 A Systematic Review Examining the Experimental Methodology behind in vivo Testing of Hiatus Hernia and Diaphragmatic Hernia Mesh

Authors: Whitehead-Clarke T., Beynon V., Banks J., Karanjia R., Mudera V., Windsor A., Kureshi A.

Abstract:

Introduction: Mesh implants are regularly used to help repair both hiatus hernias (HH) and diaphragmatic hernias (DH). In vivo studies are used to test not only mesh safety but, increasingly, comparative efficacy. Our work examines the field of in vivo mesh testing for HH and DH models to establish current practices and standards. Method: This systematic review was registered with PROSPERO. The Medline and Embase databases were searched for relevant in vivo studies. 44 articles were identified and underwent abstract review, at which point 22 were excluded. 4 further studies were excluded after full-text review, leaving 18 to undergo data extraction. Results: Of the 18 studies identified, 9 used an in vivo HH model and 9 a DH model. 5 studies undertook mechanical testing on tissue samples, all uniaxial in nature. Testing strip widths ranged from 1-20 mm (median 3 mm). Testing speeds varied from 1.5-60 mm/minute. On histology, the most commonly assessed structural and cellular factors were neovascularization and macrophages, respectively (n=9 each). Structural analysis was mostly qualitative, whereas cellular analysis was equally likely to be quantitative. 11 studies assessed adhesion formation, of which 8 used one of four scoring systems. 8 studies measured mesh shrinkage. Discussion: In vivo studies assessing mesh for HH and DH repair are uncommon. Within this relatively young field, we encourage surgical and materials-testing institutions to discuss its standardisation.

Keywords: hiatus, diaphragmatic, hernia, mesh, materials testing, in vivo

Procedia PDF Downloads 193