Search results for: sparse representation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1389


729 Distributed Perceptually Important Point Identification for Time Series Data Mining

Authors: Tak-Chung Fu, Ying-Kit Hung, Fu-Lai Chung

Abstract:

In the field of time series data mining, the concept of the Perceptually Important Point (PIP) identification process was first introduced in 2001. The process was originally developed for financial time series pattern matching and was then found suitable for time series dimensionality reduction and representation. Its strength lies in preserving the overall shape of a time series by identifying the salient points in it. With the rise of Big Data, time series data contributes a major proportion, especially data generated by sensors in Internet of Things (IoT) environments. Given the nature of PIP identification and its successful applications, it is worth further exploring the opportunity to apply PIP to time series ‘Big Data’. However, the performance of PIP identification has always been considered the limitation when dealing with ‘Big’ time series data. In this paper, two distributed versions of PIP identification based on the Specialized Binary (SB) Tree are proposed. The proposed approaches remove the bottleneck of running the PIP identification process on a standalone computer, and the distributed versions yield improvements in terms of speed.
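The core of PIP identification can be sketched in a few lines. This is a minimal single-machine version (not the SB-Tree-based or distributed variants the paper proposes), using the vertical-distance criterion, which is one common choice of importance measure:

```python
def pip_identify(series, n_points):
    """Iteratively pick Perceptually Important Points (PIPs).

    Start with the two endpoints, then repeatedly add the point with the
    largest vertical distance to the chord joining its neighbouring PIPs,
    preserving the overall shape of the series with few points.
    """
    n = len(series)
    pips = [0, n - 1]                       # endpoints are always PIPs
    while len(pips) < min(n_points, n):
        best_idx, best_dist = None, -1.0
        for left, right in zip(pips, pips[1:]):
            for i in range(left + 1, right):
                # vertical distance from point i to the chord (left, right)
                slope = (series[right] - series[left]) / (right - left)
                interp = series[left] + slope * (i - left)
                d = abs(series[i] - interp)
                if d > best_dist:
                    best_idx, best_dist = i, d
        pips.append(best_idx)
        pips.sort()
    return pips

# The spike at index 2 is the most salient interior point.
print(pip_identify([0, 0, 5, 0, 0], 3))  # -> [0, 2, 4]
```

Each pass scans every interval between existing PIPs, which is exactly the per-query cost the distributed versions aim to reduce for long series.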

Keywords: distributed computing, performance analysis, Perceptually Important Point identification, time series data mining

Procedia PDF Downloads 435
728 Governance Question and the Participatory Policy Making: Making the Process Functional in Nigeria

Authors: Albert T. Akume, P. D. Dahida

Abstract:

This paper examines the effect of various epochs of government on policy making in Nigeria. In both epochs, the character of governance and public policy making was exclusive, non-participatory, and self-centric. As a consequence, the interests of the citizenry were neither represented nor protected, nor were the needs of all groups fairly met. The introduction of the post-1999 democratic government demands that the hitherto skewed pattern of policy making cease to be a feature of governance; hence the need for citizen participation in the policy making process. The question, then, is which mode is most appropriate to engender public participation so as to make the policy making process functional. Given the prevailing social, economic, and political dilemmas, the utilization of the direct mode of citizen participation to affect policy outcomes is doubtful, if not unattainable. It is because of these predicaments that this paper, using a documentary research design, argues for the utilization of the indirect mode of citizen participation in the policy making process, so as to shape public policy outcomes appropriately and with less cost, acrimony, and delay.

Keywords: governance, public policy, participation, representation, civil society

Procedia PDF Downloads 374
727 Provenance in Scholarly Publications: Introducing the provCite Ontology

Authors: Maria Joseph Israel, Ahmed Amer

Abstract:

Our work aims to broaden the application of provenance technology beyond its traditional domains of scientific workflow management and database systems by offering a general provenance framework to capture richer and extensible metadata in unstructured textual data sources such as literary texts, commentaries, translations, and digital humanities. Specifically, we demonstrate the feasibility of capturing and representing expressive provenance metadata, including more of the context for citing scholarly works (e.g., the authors’ explicit or inferred intentions at the time of developing their research content for publication), while also supporting subsequent augmentation with similar additional metadata (by third parties, be they human or automated). To better capture the nature and types of possible citations, in our proposed provenance scheme metaScribe, we extend standard provenance conceptual models to form our proposed provCite ontology. This provides a conceptual framework which can accurately capture and describe more of the functional and rhetorical properties of a citation than can be achieved with any current models.

Keywords: knowledge representation, provenance architecture, ontology, metadata, bibliographic citation, semantic web annotation

Procedia PDF Downloads 117
726 Dynamic Ad-hoc Topologies for Mobile Robot Navigation Based on Non-Uniform Grid Maps

Authors: Peter Sauer, Thomas Hinze, Petra Hofstedt

Abstract:

Avoiding obstacles in the surrounding environment and navigating to a given target are among the most important tasks for mobile robots, and different data structures suit these different tasks. To avoid nearby obstacles, occupancy grid maps are an ideal representation of the surroundings. For less fine-grained tasks, such as navigating from one room to another in an apartment, pure grid maps are inappropriate: grid maps are very detailed, and calculating paths between rooms based on them would take too long. Instead, graph-based data structures, so-called topologies, turn out to be a proper choice for such tasks. In this paper we present two methods to dynamically create topologies from grid maps. Both methods are based on non-uniform grid maps. The topologies are generated on the fly and can easily be modified to represent changes in the environment. This allows a hybrid approach to controlling mobile robots, where, depending on the situation and the current task, either the grid map or the generated topology may be used.
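As a rough illustration of deriving a graph from a grid map, the sketch below connects free cells of a toy uniform grid into an adjacency structure. This is only the simplest possible case; the paper's methods operate on non-uniform grids and build the topology dynamically:

```python
def grid_to_topology(grid):
    """Build an adjacency dict whose nodes are free grid cells.

    grid is a list of rows; 0 marks a free cell, 1 an occupied one.
    Each free cell is linked to its free 4-neighbours, yielding a
    graph on which room-to-room paths can be searched cheaply.
    """
    rows, cols = len(grid), len(grid[0])
    graph = {}
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 0:
                nbrs = []
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                        nbrs.append((nr, nc))
                graph[(r, c)] = nbrs
    return graph

g = grid_to_topology([[0, 1],
                      [0, 0]])
print(sorted(g))  # -> [(0, 0), (1, 0), (1, 1)]
```

A practical hybrid controller would keep the grid for local obstacle avoidance and query a (much smaller) graph like this for global navigation.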

Keywords: robot navigation, occupancy grids, topological maps, dynamic map creation

Procedia PDF Downloads 563
725 DocPro: A Framework for Processing Semantic and Layout Information in Business Documents

Authors: Ming-Jen Huang, Chun-Fang Huang, Chiching Wei

Abstract:

With the recent advances in deep neural networks, we observe new applications of NLP (natural language processing) and CV (computer vision), powered by deep neural networks, for processing business documents. However, creating a real-world document processing system requires integrating several NLP and CV tasks rather than treating them separately. There is a need for a unified approach to processing documents containing textual and graphical elements with rich formats, diverse layout arrangements, and distinct semantics. In this paper, a framework that fulfills this unified approach is presented. The framework includes a representation model definition for holding the information generated by the various tasks, and specifications defining the coordination between those tasks. The framework is a blueprint for building a system that can process documents with rich formats, styles, and multiple types of elements. The flexible and lightweight design of the framework can help build systems for diverse business scenarios, such as contract monitoring and reviewing.
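To make the idea of a shared representation model concrete, here is a hypothetical sketch in which every document element carries both layout information (a bounding box from CV) and semantic annotations (written by NLP tasks). The names and fields are illustrative only, not the framework's actual definitions:

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    """One document element holding both layout and semantics."""
    kind: str                       # 'paragraph', 'table', 'figure', ...
    text: str = ''
    bbox: tuple = (0, 0, 0, 0)      # (x0, y0, x1, y1) from layout analysis
    semantics: dict = field(default_factory=dict)   # filled in by NLP tasks

@dataclass
class Document:
    """A shared container that successive tasks read from and write to."""
    elements: list = field(default_factory=list)

    def annotate(self, index, key, value):
        # Any task can attach a semantic label to an element.
        self.elements[index].semantics[key] = value

doc = Document([Element('paragraph', 'This contract expires 2025-01-01.')])
doc.annotate(0, 'contains_date', True)
print(doc.elements[0].semantics)  # -> {'contains_date': True}
```

The point of such a structure is coordination: a layout-analysis task populates `bbox`, an NER task populates `semantics`, and downstream review logic reads both from one object.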

Keywords: document processing, framework, formal definition, machine learning

Procedia PDF Downloads 218
724 Intelligent Process Data Mining for Monitoring the Fault-Free Operation of Industrial Processes

Authors: Hyun-Woo Cho

Abstract:

Real-time fault monitoring and diagnosis of large-scale production processes is helpful and necessary in order to operate industrial processes safely and efficiently while producing a good final product quality. Unusual and abnormal events may have a serious impact on the process, such as malfunctions or breakdowns. This work utilizes process measurement data obtained on-line for the safe and fault-free operation of industrial processes. To this end, it evaluates the proposed intelligent process data monitoring framework on a simulated process. The monitoring scheme extracts the fault pattern in a reduced space for a reliable data representation. Moreover, this work presents the results of using both linear and nonlinear techniques for the monitoring purpose. It is shown that the nonlinear technique produced more reliable monitoring results and outperformed the linear methods. The adoption of the qualitative monitoring model helps to reduce the sensitivity of the fault pattern to noise.

Keywords: process data, data mining, process operation, real-time monitoring

Procedia PDF Downloads 640
723 Advancements in AI Training and Education for a Future-Ready Healthcare System

Authors: Shamie Kumar

Abstract:

Background: Radiologists and radiographers (RR) need to educate themselves and their colleagues to ensure that AI is integrated safely, usefully, and in a meaningful way, so that it always benefits patients. AI education and training are fundamental to the way RR work with and interact with AI, so that they feel confident using it as part of their clinical practice in a way they understand. Methodology: This exploratory research outlines the current education and training gaps for radiographers and radiologists in AI radiology diagnostics. It reviews the status, skills, and challenges of education and teaching; the use of artificial intelligence within daily clinical practice; why it is fundamental; and why learning about AI is essential for wider adoption. Results: Current knowledge among RR is very sparse and country dependent, and with radiologists being the majority of end-users of AI, the targeted training and learning opportunities available to them surpass those available to radiographers. Many papers suggest that there is a lack of knowledge, understanding, and training in AI in radiology amongst RR, and that because of this they cannot comprehend exactly how AI works, how it integrates, the benefits of using it, and its limitations. There are indications that they wish to receive specific training; however, both professions need to actively engage in learning about AI and develop the skills that enable them to use it effectively. Variability is to be expected across the professions in their degree of commitment to AI, as most do not understand its value; this only adds to the need to train and educate RR. Currently, there is little AI teaching in either undergraduate or postgraduate study programs, and it is not readily available.
In addition, there are other training programs, courses, workshops, and seminars available; most of these are short, single sessions rather than a continuation of learning, covering a basic understanding of AI and peripheral topics such as ethics, law, and the potential of AI. There appears to be an obvious gap between the content such training programs offer and what RR need and want to learn. Because of this, there is a risk of ineffective learning outcomes and of attendees feeling a lack of clarity and depth of understanding of the practicalities of using AI in a clinical environment. Conclusion: Education, training, and courses need defined learning outcomes with relevant concepts, ensuring that theory and practice are taught as a continuous learning process based on use cases specific to a clinical working environment. Undergraduate and postgraduate courses should be developed robustly, with delivery by experts in the field; in addition, training and other programs should be delivered as continuing professional development and aligned with accredited institutions for a degree of quality assurance.

Keywords: artificial intelligence, training, radiology, education, learning

Procedia PDF Downloads 85
722 Identification of Shocks from Unconventional Monetary Policy Measures

Authors: Margarita Grushanina

Abstract:

After several prominent central banks, including the European Central Bank (ECB), the Federal Reserve System (Fed), the Bank of Japan, and the Bank of England, employed unconventional monetary policies in the aftermath of the financial crisis of 2008-2009, the problem of identifying the effects of such policies became of great interest. One of the main difficulties in identifying shocks from unconventional monetary policy measures in structural VAR analysis is that they are often anticipated, which leads to a non-fundamental MA representation of the VAR model. Moreover, unconventional monetary policy actions may indirectly transmit to markets information about the future stance of the interest rate, which calls into question the plausibility of the assumption of orthogonality between shocks from unconventional and conventional policy measures. This paper offers a method of identification that takes the abovementioned issues into account. The author uses factor-augmented VARs to increase the information set, together with identification through heteroskedasticity of the error terms and rank restrictions on the matrix of the errors’ second moments to deal with the cross-correlation of the structural shocks.

Keywords: factor-augmented VARs, identification through heteroskedasticity, monetary policy, structural VARs

Procedia PDF Downloads 348
721 Mechanistic Modelling to De-risk Process Scale-up

Authors: Edwin Cartledge, Jack Clark, Mazaher Molaei-Chalchooghi

Abstract:

The mixing in the crystallization step of active pharmaceutical ingredient manufacture was studied via advanced modeling tools to enable a successful scale-up. A virtual representation of the vessel was created, and computational fluid dynamics was used to simulate multiphase flow and, thus, the mixing environment within the vessel. The study identified a significant dead zone underneath the impeller and found that increasing the impeller speed and power did not improve the mixing. A series of sensitivity analyses showed that the vessel had to be redesigned to improve mixing, and that optimal mixing could be obtained by adding two extra cylindrical baffles. The same two baffles from the simulated environment were then constructed and added to the process vessel. By identifying these potential issues before starting manufacture and modifying the vessel to ensure good mixing, this study mitigated a failed crystallization and potential batch disposal, which could have resulted in a significant loss of high-value material.

Keywords: active pharmaceutical ingredient, baffles, computational fluid dynamics, mixing, modelling

Procedia PDF Downloads 97
720 Quasi-Periodicity of Tonic Intervals in the Octave and Innovation of Themes in Music Compositions

Authors: R. C. Tyagi

Abstract:

Quasi-periodicity of the frequency intervals observed in the Shruti-based Absolute Scale of Music has been used to graphically identify the anchor notes ‘Vadi’ and ‘Samvadi’, which are nodal points for the expansion, elaboration, and iteration of the emotional theme represented by the characteristic tonic arrangement in Raga compositions. This analysis leads to defining the tonic parameters in the octave, including the key-note frequency, the tonic intervals’ anchor notes, and the onset and range of quasi-periodicities as exponents of 2. Such a uniform representation of characteristic data would facilitate the computational analysis and synthesis of music compositions and also help develop noise suppression techniques. Criteria for tuning strings for compatibility with the placement of frets on fingerboards are discussed. Natural rhythmic cycles in music compositions are analytically shown to lie between 3 and 126 beats.

Keywords: absolute scale, anchor notes, computational analysis, frets, innovation, noise suppression, Quasi-periodicity, rhythmic cycle, tonic interval, Shruti

Procedia PDF Downloads 304
719 Comparative Study of Different Enhancement Techniques for Computed Tomography Images

Authors: C. G. Jinimole, A. Harsha

Abstract:

One of the key problems in the analysis of Computed Tomography (CT) images is their poor contrast. Image enhancement can be used to improve the visual clarity and quality of the images, or to provide a better transformed representation for further processing. Contrast enhancement is one of the accepted image enhancement methods in various applications in the medical field, and is helpful for visualizing and extracting details of brain infarctions, tumors, and cancers from CT images. This paper presents a comparative study of five contrast enhancement techniques suitable for CT images: Power Law Transformation, Logarithmic Transformation, Histogram Equalization, Contrast Stretching, and Laplacian Transformation. All these techniques are compared with each other to find out which enhancement provides the better contrast for CT images. For the comparison, the parameters Peak Signal to Noise Ratio (PSNR) and Mean Square Error (MSE) are used. Logarithmic Transformation provided the clearest and best-quality image of all the techniques studied and obtained the highest PSNR value. The comparison points to the better approach for future research, especially for mapping abnormalities in CT images resulting from brain injuries.
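Two of the compared ingredients, the logarithmic transformation and the PSNR/MSE quality metrics, can be sketched as follows. A flat list of grey levels stands in for a CT image; this is a minimal illustration, not the authors' evaluation code:

```python
import math

def log_transform(pixels, max_val=255):
    """Logarithmic transformation s = c * log(1 + r): expands dark grey
    levels and compresses bright ones, raising contrast in dark regions."""
    c = max_val / math.log(1 + max_val)
    return [c * math.log(1 + r) for r in pixels]

def mse(a, b):
    """Mean Square Error between two (flattened) images."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, max_val=255):
    """Peak Signal to Noise Ratio in dB; higher means closer to the reference."""
    m = mse(a, b)
    return float('inf') if m == 0 else 10 * math.log10(max_val ** 2 / m)

dark = [0, 10, 20, 40]            # a dark, low-contrast "image"
enhanced = log_transform(dark)
print(enhanced[1] > dark[1])      # dark levels are lifted -> True
```

Identical images give MSE 0 (PSNR infinite), and a larger PSNR against a reference indicates less distortion, which is how the five techniques can be ranked.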

Keywords: computed tomography, enhancement techniques, increasing contrast, PSNR and MSE

Procedia PDF Downloads 314
718 Transfer Learning for Protein Structure Classification at Low Resolution

Authors: Alexander Hudson, Shaogang Gong

Abstract:

Structure determination is key to understanding protein function at a molecular level. Whilst significant advances have been made in predicting structure and function from amino acid sequence, researchers must still rely on expensive, time-consuming analytical methods to visualise detailed protein conformation. In this study, we demonstrate that it is possible to make accurate (≥80%) predictions of protein class and architecture from structures determined at low (>3A) resolution, using a deep convolutional neural network trained on high-resolution (≤3A) structures represented as 2D matrices. Thus, we provide proof of concept for high-speed, low-cost protein structure classification at low resolution, and a basis for extension to the prediction of function. We investigate the impact of the input representation on classification performance, showing that side-chain information may not be necessary for fine-grained structure predictions. Finally, we confirm that high-resolution, low-resolution, and NMR-determined structures inhabit a common feature space, thus providing a theoretical foundation for boosting with single-image super-resolution.
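The 2D-matrix input mentioned above is commonly a pairwise residue distance map. A minimal sketch of building one from C-alpha coordinates (illustrative, not the authors' exact preprocessing):

```python
import math

def distance_map(ca_coords):
    """Pairwise C-alpha distance matrix: an image-like 2D representation
    of a 3D structure that a convolutional network can consume."""
    n = len(ca_coords)
    return [[math.dist(ca_coords[i], ca_coords[j]) for j in range(n)]
            for i in range(n)]

# Three residues on a line, 3.8 A apart (a typical C-alpha spacing).
dmap = distance_map([(0.0, 0.0, 0.0), (3.8, 0.0, 0.0), (7.6, 0.0, 0.0)])
print(round(dmap[0][2], 6))  # -> 7.6
```

The matrix is symmetric with a zero diagonal, and, unlike raw coordinates, it is invariant to rotation and translation, which is part of why it suits 2D convolutional models.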

Keywords: transfer learning, protein distance maps, protein structure classification, neural networks

Procedia PDF Downloads 136
717 Clarifier Dialogue Interface to Resolve Linguistic Ambiguities in an E-Learning Environment

Authors: Dalila Souilem, Salma Boumiza, Abdelkarim Abdelkader

Abstract:

The Clarifier Dialogue Interface (CDI) is part of an online teaching system based on human-machine communication in a learning situation. The interface is used during the learning activity, specifically in the evaluation step, to clarify ambiguities in the learner's response. The CDI can generate patterns allowing access to an information system, using the selectors associated with lexical units. To instantiate these patterns, the user request (especially the learner's response) must be analyzed and interpreted to deduce the canonical form, the semantic form, and the subject of the sentence. For this interface to be efficient at the interpretation level, a set of substitution operators is applied in order to extend the possibilities of manipulating natural language. A second approach presented in this paper focuses on object languages, with new prospects such as combining natural language with information system techniques in the area of online education. Thus, all the operators, the CDI, and the other interfaces associated with the domain expertise and teaching strategies will be unified using the FRAME representation form.

Keywords: dialogue, e-learning, FRAME, information system, natural language

Procedia PDF Downloads 377
716 Improvement of the 3D Finite Element Analysis of High Voltage Power Transformer Defects in Time Domain

Authors: M. Rashid Hussain, Shady S. Refaat

Abstract:

The high voltage power transformer is the most essential part of electrical power utilities. Reliability of the transformers is the utmost concern, and any transformer failure can lead to catastrophic losses for an electric power utility. The causes of transformer failure include insulation failure by partial discharge, core and tank failure, cooling unit failure, current transformer failure, etc. For the study of power transformer defects, finite element analysis (FEA) can provide valuable information on the severity of defects. FEA provides a more accurate representation of complex geometries because it considers thermal, electrical, and environmental influences on the insulation models, yielding the basic characteristics of the insulation system under normal and partial discharge conditions. The purpose of this paper is the time-domain analysis of a 3D model of high voltage power transformer defects using FEA, studying the electric field distribution at different points on the defects.

Keywords: power transformer, finite element analysis, dielectric response, partial discharge, insulation

Procedia PDF Downloads 158
715 Medical Knowledge Management since the Integration of Heterogeneous Data until the Knowledge Exploitation in a Decision-Making System

Authors: Nadjat Zerf Boudjettou, Fahima Nader, Rachid Chalal

Abstract:

Knowledge management means acquiring and representing knowledge relevant to a domain, a task, or a specific organization in order to facilitate its access, reuse, and evolution. This usually means building, maintaining, and evolving an explicit representation of knowledge. The next step is to provide access to that knowledge, that is to say, to disseminate it in order to enable effective use. Knowledge management in the medical field aims to improve the performance of the medical organization by allowing individuals in the care facility (doctors, nurses, paramedics, etc.) to capture, share, and apply collective knowledge in order to make optimal decisions in real time. In this paper, we propose a knowledge management approach based on a technique for integrating heterogeneous data in the medical field by creating a data warehouse, a technique for extracting knowledge from medical data by choosing a data mining technique, and finally a technique for exploiting that knowledge in a case-based reasoning system.

Keywords: data warehouse, data mining, knowledge discovery in database, KDD, medical knowledge management, Bayesian networks

Procedia PDF Downloads 395
714 Simulation of 1D Dielectric Barrier Discharge in Argon Mixtures

Authors: Lucas Wilman Crispim, Patrícia Hallack, Maikel Ballester

Abstract:

This work aims at modeling electric discharges in gas mixtures. The mathematical model mimics the ignition process in a commercial spark plug when a high voltage is applied to the plug terminals. A longitudinal, one-dimensional Cartesian domain is chosen as the simulation region. Energy and mass transfer are considered in a macroscopic fluid representation, while energy transfer in molecular collisions and chemical reactions is contemplated at the microscopic level. The macroscopic model is represented by a set of uncoupled partial differential equations. Microscopic effects are studied within a discrete model for electronic and molecular collisions in the framework of ZDPlasKin, a plasma modeling numerical tool. The BOLSIG+ solver is employed to solve the electron Boltzmann equation. An operator splitting technique is used to separate the microscopic and macroscopic models. The simulated gas is a mixture of neutral, excited, and ionized atomic argon. The spatial and temporal evolution of these species and of the temperature are presented and discussed.

Keywords: CFD, electronic discharge, ignition, spark plug

Procedia PDF Downloads 162
713 Grid-Connected Inverter Experimental Simulation and Droop Control Implementation

Authors: Nur Aisyah Jalalludin, Arwindra Rizqiawan, Goro Fujita

Abstract:

In this study, we aim to demonstrate an experimental simulation of a microgrid system that makes a large-scale microgrid system easy to understand. Such a model is required for industrial training and learning environments. However, in order to create an exact representation of a microgrid system, the laboratory-scale system must fulfill the requirements of a grid-connected inverter, in which power values are assigned to the system to cope with the intermittent output from renewable energy sources. Aside from that, during changes in load capacity, the grid-connected system must be able to supply power from the utility grid side and the microgrid side in a balanced manner. Therefore, droop control is installed on the inverter's control board to maintain equal power sharing between both sides. Power control in the stand-alone condition and droop control in the grid-connected condition must both be implemented in order to maintain a stabilized system. Based on the experimental results, power control and droop control can both be applied in the system, as confirmed by comparing the experimental and reference values.
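The equal power sharing that droop control provides follows from the P-f droop law, f = f0 - m * P: all inverters settle at one common system frequency, so identical droop coefficients imply identical injected powers. A minimal sketch (the coefficients below are illustrative, not the study's hardware values):

```python
def droop_power(f, f0=50.0, m=0.001):
    """Invert the P-f droop law f = f0 - m * P to get the active power
    (W) an inverter injects when the system settles at frequency f (Hz).

    f0 is the no-load frequency and m the droop coefficient (Hz per W);
    both values here are illustrative placeholders.
    """
    return (f0 - f) / m

# Two inverters with the same droop coefficient see the same grid
# frequency, hence they share the load equally.
p1 = droop_power(49.9)
p2 = droop_power(49.9)
print(abs(p1 - p2) < 1e-9)  # -> True
```

Unequal sharing, when desired, is obtained the same way by assigning different droop coefficients so that each unit's power is proportional to its rating.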

Keywords: droop control, droop characteristic, grid-connected inverter, microgrid, power control

Procedia PDF Downloads 886
712 Women’s Leadership for Sustainable Outcomes: On the Road to Gender Equality for a Better Tomorrow

Authors: Deepika Faugoo

Abstract:

Gender equality stands as the cornerstone of societal progress, intricately woven into the very essence of the 2030 Sustainable Development Goals (SDGs). Yet, the gender leadership gap remains a formidable obstacle hindering global equality. Despite women's educational advancements, their underrepresentation in senior roles persists as a baffling anomaly. Drawing from contemporary research, empirical evidence, and secondary data, this paper underscores the imperative of advancing women in leadership to drive SDGs related to empowerment and gender equality by 2030. It highlights the undeniable link between women leaders and sustainable outcomes, citing case studies and examples of their contributions to financial performance, prosperity, economic growth, and societal well-being. Exploring persistent barriers and emerging challenges, it offers actionable strategies to enhance women's representation in leadership, promising transformative benefits for organizations and societies. Amidst societal upheavals, gender equality emerges as a potent solution, catalyzing change toward a future where every voice resonates, ensuring no one is left behind.

Keywords: senior leadership, empowerment, SDGs, gender equality

Procedia PDF Downloads 69
711 Temporal and Spacial Adaptation Strategies in Aerodynamic Simulation of Bluff Bodies Using Vortex Particle Methods

Authors: Dario Milani, Guido Morgenthal

Abstract:

The fluid dynamic computation of wind-caused forces on bluff bodies, e.g., light flexible civil structures or airplane wings at high incidence approaching the ground, is one of the major criteria governing their design. Such structures may exhibit a significant dynamic response, requiring the use of small-scale devices, such as guide vanes in bridge design, to control these effects. The focus of this paper is the numerical simulation of the bluff body problem involving multiscale phenomena induced by small-scale devices. One solution method for CFD simulation that is relatively successful in this class of applications is the Vortex Particle Method (VPM). The method is based on a grid-free Lagrangian formulation of the Navier-Stokes equations, where the velocity field is modeled by particles representing local vorticity. These vortices are convected by the free-stream velocity as well as diffused. This representation yields the main advantages of low numerical diffusion; compact discretization, as the vorticity is strongly localized; implicit accounting for the free-space boundary conditions typical of this class of FSI problems; and a natural representation of the vortex creation process inherent in bluff body flows. When the particle resolution reaches the Kolmogorov dissipation length, the method becomes a Direct Numerical Simulation (DNS). However, it is crucial to note that any solution method aims at balancing the computational cost against the achievable accuracy. In the classical VPM, if the fluid domain is discretized by Np particles, the computational cost is O(Np²). For the coupled FSI problem of interest, for example large structures such as long-span bridges, the aerodynamic behavior may be influenced or even dominated by small structural details such as barriers, handrails, or fairings.
For such geometrically complex and dimensionally large structures, resolving the complete domain with the conventional VPM particle discretization might become prohibitively expensive to compute, even for moderate numbers of particles. It is possible to reduce this cost either by reducing the number of particles or by controlling their local distribution. It is also possible to increase the accuracy of the solution, without substantially increasing the global computational cost, by computing a correction of the particle-particle interaction in certain regions of interest. In this paper, different strategies are presented to extend the conventional VPM so as to reduce the computational cost whilst resolving the required details of the flow. The methods include temporal sub-stepping, to increase the accuracy of the particle convection in certain regions, as well as dynamically re-discretizing the particle map, to control both the global and the local number of particles. Finally, these methods are applied to a test case, and the resulting improvements in the efficiency and accuracy of the proposed extensions are presented, together with their relevant applications.
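The O(Np²) cost mentioned above comes from the direct particle-particle evaluation of induced velocities. A bare-bones 2D point-vortex version of that kernel (without the diffusion, sub-stepping, or re-discretization discussed here) looks like this:

```python
import math

def induced_velocity(particles):
    """Direct O(Np^2) evaluation of the velocity every 2D vortex particle
    induces on every other via the point-vortex (Biot-Savart) kernel.

    particles: list of (x, y, gamma) with gamma the circulation strength.
    Returns a list of (u, v) velocity components, one per particle.
    """
    vel = []
    for (xi, yi, _gi) in particles:
        u = v = 0.0
        for (xj, yj, gj) in particles:
            dx, dy = xi - xj, yi - yj
            r2 = dx * dx + dy * dy
            if r2 > 0.0:                      # skip self-interaction
                # u = -G*dy / (2*pi*r^2),  v = G*dx / (2*pi*r^2)
                u += -gj * dy / (2 * math.pi * r2)
                v += gj * dx / (2 * math.pi * r2)
        vel.append((u, v))
    return vel

# A unit vortex at the origin induces v = 1/(2*pi) at (1, 0).
vel = induced_velocity([(0.0, 0.0, 1.0), (1.0, 0.0, 0.0)])
print(abs(vel[1][1] - 1 / (2 * math.pi)) < 1e-12)  # -> True
```

The nested loop makes the quadratic cost explicit; the adaptation strategies in the paper attack exactly this term by controlling where, and how often, such interactions are resolved.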

Keywords: adaptation, fluid dynamic, remeshing, substepping, vortex particle method

Procedia PDF Downloads 262
710 A Supervised Approach for Word Sense Disambiguation Based on Arabic Diacritics

Authors: Alaa Alrakaf, Sk. Md. Mizanur Rahman

Abstract:

Over the last two decades, Arabic natural language processing (ANLP) has become increasingly important. One of the key issues related to ANLP is ambiguity: in the Arabic language, different pronunciations of one word may have different meanings. Furthermore, ambiguity also has an impact on the effectiveness and efficiency of Machine Translation (MT). The issue of ambiguity has limited the usefulness and accuracy of translation from Arabic to English, and the lack of Arabic resources makes the ambiguity problem more complicated. Additionally, the orthographic level of representation cannot specify the exact meaning of a word. This paper looks at the diacritics of the Arabic language and uses them to disambiguate words. The proposed word sense disambiguation approach uses a Diacritizer application to diacritize Arabic text and then finds the most accurate sense of an ambiguous word using a Naïve Bayes classifier. Our experimental study shows that using Arabic diacritics with a Naïve Bayes classifier enhances the accuracy of choosing the appropriate sense by 23% and also decreases ambiguity in machine translation.
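The sense-selection step can be sketched with a toy Naïve Bayes classifier over context words. The training pairs, the transliterated example, and the feature choice below are illustrative only, not the authors' pipeline or their Diacritizer tool:

```python
from collections import Counter, defaultdict
import math

class NaiveBayesWSD:
    """Toy Naive Bayes word-sense classifier: scores each sense as
    log P(sense) + sum of log P(word | sense), with Laplace smoothing."""

    def __init__(self):
        self.sense_counts = Counter()
        self.word_counts = defaultdict(Counter)
        self.vocab = set()

    def train(self, examples):
        # examples: list of (context_words, sense) pairs
        for words, sense in examples:
            self.sense_counts[sense] += 1
            for w in words:
                self.word_counts[sense][w] += 1
                self.vocab.add(w)

    def predict(self, words):
        total = sum(self.sense_counts.values())
        best_sense, best_lp = None, float('-inf')
        for sense, n in self.sense_counts.items():
            lp = math.log(n / total)
            denom = sum(self.word_counts[sense].values()) + len(self.vocab)
            for w in words:
                lp += math.log((self.word_counts[sense][w] + 1) / denom)
            if lp > best_lp:
                best_sense, best_lp = sense, lp
        return best_sense

# The undiacritized form of Arabic 'علم' can read as 'ilm (knowledge)
# or 'alam (flag); after diacritization, context words vote for a sense.
clf = NaiveBayesWSD()
clf.train([(['school', 'study'], 'knowledge'),
           (['country', 'flagpole'], 'flag')])
print(clf.predict(['study', 'school']))  # -> knowledge
```

In the proposed approach, the diacritized form itself becomes an additional, highly discriminative feature alongside the context, which is what drives the reported accuracy gain.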

Keywords: Arabic natural language processing, machine learning, machine translation, Naive bayes classifier, word sense disambiguation

Procedia PDF Downloads 358
709 Multimodal Discourse Analysis of Egyptian Political Movies: A Case Study of 'People at the Top Ahl Al Kemma' Movie

Authors: Mariam Waheed Mekheimar

Abstract:

Nascent research has advanced discourse analysis to include different modes, such as images, sound, and text. The focus of this study is to elucidate how images are embedded with texts in an audio-visual medium such as cinema to send political messages; it also seeks to broaden our understanding of politics beyond a relatively narrow conceptualization of the 'political' by studying non-traditional discourses such as cinematic discourse. The aim herein is to develop a systematic approach to film analysis that captures political meanings in films. The method adopted in this research is Multimodal Discourse Analysis (MDA), focusing on the embedding of visuals with texts, since today's era is the era of images, and that necessitates analyzing them. Drawing on the writings of O'Halloran, Kress and van Leeuwen, John Bateman, and Janina Wildfeuer, different modalities will be studied to understand how these modes interact in cinematic discourse. The movie 'People at the Top' is selected as an example to unravel the political meanings throughout the film, tackling the cinematic representation of the notion of social justice.

Keywords: Egyptian cinema, multimodal discourse analysis, people at the top, social justice

Procedia PDF Downloads 422
708 Recurrent Neural Networks with Deep Hierarchical Mixed Structures for Chinese Document Classification

Authors: Zhaoxin Luo, Michael Zhu

Abstract:

In natural languages, there are always complex semantic hierarchies, and obtaining feature representations based on these hierarchies is key to a model's success. Several RNN models have recently been proposed that use latent indicators to obtain the hierarchical structure of documents. However, a model that uses only a single layer of latent indicators cannot capture the true hierarchical structure of a language, especially a complex language like Chinese. In this paper, we propose a deep layered model that stacks arbitrarily many RNN layers equipped with latent indicators. By using EM and training the model hierarchically, we solve the computational problem of stacking RNN layers and make it possible to stack arbitrarily many of them. Our deep hierarchical model not only achieves results comparable to large pre-trained models on the Chinese short text classification problem but also achieves state-of-the-art results on the Chinese long text classification problem.
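The stacking principle itself, minus the latent indicators and EM training that the paper contributes, can be sketched as vanilla RNN layers where the hidden-state sequence of one layer becomes the input sequence of the next; all dimensions and random weights below are arbitrary assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_layer(inputs, W_x, W_h):
    """Vanilla RNN forward pass; returns the hidden state at every step."""
    h = np.zeros(W_h.shape[0])
    states = []
    for x in inputs:
        h = np.tanh(W_x @ x + W_h @ h)
        states.append(h)
    return np.array(states)

# Hypothetical sizes: a 6-token document with 5-dim embeddings,
# fed through two stacked 4-dim RNN layers.
T, d, k = 6, 5, 4
tokens = rng.normal(size=(T, d))
layer1 = rnn_layer(tokens, rng.normal(size=(k, d)), rng.normal(size=(k, k)))
layer2 = rnn_layer(layer1, rng.normal(size=(k, k)), rng.normal(size=(k, k)))

doc_repr = layer2[-1]   # final top-layer state as the document feature
print(doc_repr.shape)   # (4,)
```

A classifier head over `doc_repr` would complete the pipeline; the paper's latent indicators additionally gate which hidden states propagate upward.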

Keywords: natural language processing, recurrent neural network, hierarchical structure, document classification, Chinese

Procedia PDF Downloads 68
707 Under the ‘Fourth World’: A Discussion to the Transformation of Character-Settings in Chinese Ethnic Minority Films

Authors: Sicheng Liu

Abstract:

Based on the key issues in current Fourth World studies, this article analyzes the features of character-settings in Chinese ethnic minority films as a generalizable transformation that proceeds from a microcosmic representation. It argues that, as mediation, films record the current state of people and their surroundings, while 'Fourth World' theorization (or Fourth Cinema) provides a new perspective on ethnic minority topics in China. Just as Fourth Cinema focuses on the depiction of indigenous groups, ethnic minority films portray the non-Han nationalities in China; both types share the motif of returning history-writing to minority members' own hands. The discussion involves three types of cinematic role-settings in Chinese minority-themed films, illustrating that, similar to the creative principles of Fourth Cinema, the themes and narratives of these films are becoming more individualized, with greater concern for minority grassroots.

Keywords: 'fourth world', Chinese ethnic minority films, ethnicity and culture reflection, 'mother tongue' (muyu), emphasis on the individual and the spiritual

Procedia PDF Downloads 188
706 Multi-Dimensional Experience of Processing Textual and Visual Information: Case Study of Allocations to Places in the Mind’s Eye Based on Individual’s Semantic Knowledge Base

Authors: Joanna Wielochowska, Aneta Wielochowska

Abstract:

Whilst the relationship between scientific areas such as cognitive psychology, neurobiology, and philosophy of mind has been emphasized in recent decades of research, concepts and discoveries made in these fields overlap and complement each other in their quest for answers to similar questions. The object of the following case study is to describe, analyze, and illustrate the nature and characteristics of a certain cognitive experience which appears to display features of synaesthesia, or rather high-level synaesthesia (ideasthesia). The research was conducted on two subjects, the authors, monozygotic twins (both polysynaesthetes) experiencing involuntary associations of identical nature. The authors made attempts to identify which cognitive and conceptual dependencies may guide this experience. Operating on self-introduced nomenclature, the described phenomenon, the multi-dimensional processing of textual and visual information, aims to define a relationship that involuntarily and immediately couples content introduced by means of text or image with a sensation of appearing in a certain place in the mind's eye. More precisely: (I) defining a concept introduced by means of textual content during the activity of reading or writing, or (II) defining a concept introduced by means of visual content during the activity of looking at image(s), with the simultaneous sensation of being allocated to a given place in the mind's eye. A place can then be defined as a cognitive representation of a certain concept. During the activity of processing information, a person has an immediate and involuntary feeling of appearing in a certain place themselves, just like a character in a story, 'observing' a venue or scenery from one or more perspectives and angles. That forms a unique and unified experience, constituting a background mental landscape of the text or image being looked at.
We came to the conclusion that semantic allocations to a given place can be divided into categories and subcategories and are naturally linked with an individual's semantic knowledge base. A place can be defined as a representation of one's unique idea of a given concept that has been established in one's semantic knowledge base. The multi-level structure of the selectivity of places in the mind's eye, as a reaction to given information (one stimulus), draws comparisons to structures and patterns found in botany. Double-flowered varieties of flowers and the whorl arrangement characteristic of the components of some flower species are given as an illustrative example. A composition of petals that fan out from a single point and wrap around a stem inspired the idea that, just as in nature, in the philosophy of mind there are patterns driven by a logic specific to a given phenomenon. The study intertwines terms perceived through a philosophical lens, such as the definition of meaning, the subjectivity of meaning, and the mental atmosphere of places. Analysis of this rare experience aims to contribute to the constantly developing theoretical framework of the philosophy of mind and to influence how the human semantic knowledge base, and the processing of given content in terms of distinguishing between information and meaning, are researched.

Keywords: information and meaning, information processing, mental atmosphere of places, patterns in nature, philosophy of mind, selectivity, semantic knowledge base, senses, synaesthesia

Procedia PDF Downloads 124
705 Labview-Based System for Fiber Links Events Detection

Authors: Bo Liu, Qingshan Kong, Weiqing Huang

Abstract:

With the rapid development of modern communication, diagnosing fiber-optic quality and faults in real time has attracted wide attention. In this paper, a LabVIEW-based system is proposed for fiber-optic fault detection. A wavelet threshold denoising method combined with empirical mode decomposition (EMD) is applied to denoise the optical time domain reflectometer (OTDR) signal, and a method based on the Gabor representation is then used to detect events. Experimental measurements show that the signal-to-noise ratio (SNR) of the OTDR signal is improved by 1.34 dB on average compared with using the wavelet threshold denoising method alone. The proposed system scores highly in event detection capability and accuracy, and its maximum detectable fiber length is 65 km.
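A minimal sketch of the wavelet-threshold stage alone (not the paper's full EMD-plus-wavelet pipeline or its Gabor event detector): a one-level Haar transform with soft thresholding at the universal threshold, applied to a synthetic exponential decay standing in for an OTDR backscatter trace. All signal parameters are invented.

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar transform: (approximation, detail) coefficients."""
    pairs = x.reshape(-1, 2)
    return (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2), \
           (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)

def haar_idwt(a, d):
    """Inverse of haar_dwt."""
    out = np.empty(2 * len(a))
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out

def soft_threshold(c, t):
    """Shrink coefficients toward zero by t; zero out anything below t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

rng = np.random.default_rng(1)
n = 1024
clean = np.exp(-np.linspace(0, 5, n))       # idealized backscatter decay
noisy = clean + rng.normal(scale=0.05, size=n)

a, d = haar_dwt(noisy)
sigma = np.median(np.abs(d)) / 0.6745       # robust noise-level estimate
t = sigma * np.sqrt(2 * np.log(n))          # universal threshold
denoised = haar_idwt(a, soft_threshold(d, t))

def snr_db(ref, sig):
    return 10 * np.log10(np.sum(ref**2) / np.sum((ref - sig)**2))

gain = snr_db(clean, denoised) - snr_db(clean, noisy)
print(round(gain, 2))  # positive SNR improvement in dB
```

In practice a multi-level transform with a longer wavelet would be used; the single Haar level is kept only to make the threshold-and-reconstruct cycle visible.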

Keywords: empirical mode decomposition, events detection, Gabor transform, optical time domain reflectometer, wavelet threshold denoising

Procedia PDF Downloads 123
704 Optimized Dynamic Bayesian Networks and Neural Verifier Test Applied to On-Line Isolated Characters Recognition

Authors: Redouane Tlemsani, Belkacem Kouninef, Abdelkader Benyettou

Abstract:

In this paper, our system is a Markovian system that can be seen as a dynamic Bayesian network. One of the major interests of such systems resides in the complete training of the models (topology and parameters) from training data. Bayesian networks represent models of uncertain knowledge about complex phenomena. They are a union of probability theory and graph theory that provides effective tools to represent a joint probability distribution over a set of random variables; knowledge is represented by describing, through graphs, the causality relations existing between the variables that define the field of study. The theory of dynamic Bayesian networks is a generalization of Bayesian networks to dynamic processes. Our objective amounts to finding the structure that best represents the relationships (dependencies) between the variables of a dynamic Bayesian network. In pattern recognition applications, the structure is usually fixed in advance, which obliges us to admit some strong assumptions (for example, independence between some variables).
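As a concrete anchor, the simplest dynamic Bayesian network is a hidden Markov model: one hidden variable per time slice, linked by a transition dependency. The sketch below runs its forward algorithm with invented toy parameters to show how the joint distribution over time slices is evaluated; it is not the paper's character-recognition model.

```python
import numpy as np

# Hypothetical 2-state, 3-symbol HMM parameters.
pi = np.array([0.6, 0.4])                          # initial state distribution
A = np.array([[0.7, 0.3],                          # transition probabilities
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],                     # emission probabilities
              [0.1, 0.3, 0.6]])

def forward(obs):
    """Forward algorithm: P(observation sequence), marginalizing hidden states."""
    alpha = pi * B[:, obs[0]]                      # initialize with first symbol
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]              # propagate and re-weight
    return alpha.sum()

p = forward([0, 2, 1])
print(round(p, 4))
```

Structure learning, the abstract's objective, would then search over which inter-slice arcs (here fixed as the single state-to-state arc) best fit the data.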

Keywords: Arabic online character recognition, dynamic Bayesian network, pattern recognition, networks

Procedia PDF Downloads 618
703 Rethinking the History of an Expanding City through Its Images: Birmingham, England, the Nineteenth Century

Authors: Lin Chang

Abstract:

Birmingham, England was a town in the late eighteenth century and became the nation's second-largest city by the late nineteenth century, expanding rapidly in both population and size. Three generations of artists from a local family, the Lines, made a large number of drawings and paintings depicting the growth and changes of their city. At first sight, the meaning of the pictures seems straightforward: they record what was torn down and what was newly built. However, beyond being read as maps, the pictures reveal a struggle in vision over whether unsightly manufactories and their smoking chimneys should be visualized, and over where the borders of the town should be positioned and understood as they continued to grow and encroach upon the immediate countryside. This art-historical paper examines a number of topographic views by the Lines family and explores how they, through unusual depictions of rural and urban scenery, managed to give form to the borderlands between the country and the city. The paper argues that while the idea of the country and the city seems to be common sense, the two realms actually pose a difficulty for visual representation, namely where exactly their borders lie, and that the idea itself has dichotomized the way people consider landscape imagery.

Keywords: Birmingham, suburb, urban fringes, landscape

Procedia PDF Downloads 197
702 Self-denigration in Doctoral Defense Sessions: Scale Development and Validation

Authors: Alireza Jalilifar, Nadia Mayahi

Abstract:

The dissertation defense, as a complicated and conflict-prone context, entails the adoption of elegant interactional strategies, one of which is self-denigration. This study aimed to develop and validate a self-denigration model that fits the context of doctoral defense sessions in applied linguistics. Two focus group discussions provided the basis for developing this conceptual model, which assumes 10 functions for self-denigration, namely good manners, modesty, affability, altruism, assertiveness, diffidence, coercive self-deprecation, evasion, diplomacy, and flamboyance. These functions were used to design a 40-item questionnaire on the attitudes of applied linguists concerning self-denigration in defense sessions. Confirmatory factor analysis of the questionnaire indicated the predictive ability of the measurement model. The findings suggest that self-denigration in doctoral defense sessions is the social representation of the participants' values, ideas, and practices, adopted as a negotiation strategy and a conflict management policy for the purpose of establishing harmony and maintaining resilience. This study has implications for doctoral students and academics and illuminates further research on self-denigration in other contexts.

Keywords: academic discourse, politeness, self-denigration, grounded theory, dissertation defense

Procedia PDF Downloads 137
701 Modeling User Context Using CEAR Diagram

Authors: Ravindra Dastikop, G. S. Thyagaraju, U. P. Kulkarni

Abstract:

Even though the number of context-aware applications is increasing day by day along with the number of users, there is still no generic programming paradigm for context-aware applications. This situation could be remedied by designing and developing an appropriate context modeling and programming paradigm for such applications. In this paper, we propose a static context model, together with metrics for validating its expressiveness and understandability. The proposed context model describes a user's situation using context entities, attributes, and relationships. The model, an extended hybrid of the ER model, ontology models, and graphical models, is specifically meant for expressing and understanding the user's situation in a context-aware environment. It is useful for understanding context-aware problems, preparing documentation, and designing programs and databases. The model makes use of a context entity-attribute-relationship (CEAR) diagram to represent the associations between context entities and attributes, and we have identified a new set of graphical notations for improving the expressiveness and understandability of context from the end-user perspective.
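A hedged sketch of how a CEAR-style situation might be rendered in code: entities carrying attribute dictionaries, linked by named relationships. The entity names, attributes, and relationship labels below are invented for illustration and are not drawn from the paper's notation.

```python
from dataclasses import dataclass, field

@dataclass
class ContextEntity:
    """A context entity with a name and a bag of attribute values."""
    name: str
    attributes: dict = field(default_factory=dict)

@dataclass
class Relationship:
    """A named association between two context entities."""
    subject: ContextEntity
    predicate: str
    obj: ContextEntity

# Hypothetical situation: a user walking in a dimly lit room, carrying a phone.
user = ContextEntity("User", {"activity": "walking", "mood": "calm"})
room = ContextEntity("Room", {"temperature": 22, "light": "dim"})
phone = ContextEntity("Phone", {"battery": 0.8})

situation = [
    Relationship(user, "located_in", room),
    Relationship(user, "carries", phone),
]

# A situation is the set of entities plus the relationships among them,
# which is exactly what a CEAR diagram draws graphically.
for r in situation:
    print(f"{r.subject.name} --{r.predicate}--> {r.obj.name}")
```

A context-aware application would query such a structure (e.g., "is the user in a dim room?") rather than reading raw sensor values directly.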

Keywords: user context, context entity, context entity attributes, situation, sensors, devices, relationships, actors, expressiveness, understandability

Procedia PDF Downloads 344
700 The Impact of COVID-19 Pandemic on the Issue and Ideological Congruence of Trump and Bolsonaro Administrations

Authors: Flavio Contrera, Paulo Cesar Gregorio

Abstract:

Recent political developments and government control actions in the face of the COVID-19 pandemic draw attention to the contrast between the duties of government and the demands of democratic representation. Elected by mobilizing far-right issues, Trump and Bolsonaro moved away from WHO guidelines but had to accommodate demands concerning the health and social protection systems on the one hand and demands from the economic sector on the other. This study used the MARPOR Project method to assess the impact of the COVID-19 pandemic on the issue and ideological congruence between the electoral and governmental arenas in both the Trump and Bolsonaro administrations. Findings reveal issue congruence between arenas in 'National Way of Life: Positive', 'Law and Order', and 'Technology and Infrastructure' for Donald Trump, and in 'Welfare State Expansion' for Bolsonaro. Ideological estimation results show that Trump and Bolsonaro, positioned on the right in their presidential elections, initially moved to the center-right. However, high-frequency welfare policy actions during the COVID-19 pandemic moved the ideological estimations of both governments to the center-left, despite their denialist rhetoric.
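The ideological estimation can be illustrated with a RILE-style left-right score in the spirit of MARPOR: the share of right-coded statements minus the share of left-coded ones. The category names and counts below are invented for illustration and do not reproduce MARPOR's actual coding scheme.

```python
# Hypothetical left- and right-coded category sets.
LEFT = {"welfare_expansion", "market_regulation", "labour_positive"}
RIGHT = {"law_and_order", "national_way_of_life", "free_market"}

def rile(counts):
    """Right share minus left share, scaled to [-100, 100]."""
    total = sum(counts.values())
    right = sum(v for k, v in counts.items() if k in RIGHT)
    left = sum(v for k, v in counts.items() if k in LEFT)
    return 100 * (right - left) / total

# Invented quasi-sentence counts: a right-leaning campaign platform versus
# a pandemic-era government program dominated by welfare measures.
campaign = {"law_and_order": 30, "national_way_of_life": 20,
            "welfare_expansion": 10, "technology_infrastructure": 40}
pandemic = {"law_and_order": 10, "welfare_expansion": 45,
            "technology_infrastructure": 45}

print(rile(campaign), rile(pandemic))  # right-of-center vs. left-of-center
```

The sign flip between the two scores mirrors the abstract's finding that high-frequency welfare actions pulled both governments' estimated positions toward the center-left.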

Keywords: congruence, COVID-19, Donald Trump, Jair Bolsonaro

Procedia PDF Downloads 231