Search results for: semantic computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1460

530 Electronic Physical Activity Record (EPAR): Key for Data Driven Physical Activity Healthcare Services

Authors: Rishi Kanth Saripalle

Abstract:

Medical experts highly recommend including physical activity in everyone’s daily routine, irrespective of gender or age, as it helps to improve various medical issues or curb potential ones. Simultaneously, experts are diligently trying to provide various healthcare services (interventions, plans, exercise routines, etc.) that promote healthy living and increase physical activity within ever more hectic schedules. With the introduction of wearables, individuals are able to track, analyze, and visualize their daily physical activities. However, there is no commonly agreed standard for representing, gathering, aggregating, and analyzing an individual’s physical activity data from multiple disparate sources (exercise plans, multiple wearables, etc.). This makes it highly impractical to develop data-driven physical activity applications and healthcare programs. Further, the inability to integrate physical activity data into an individual’s Electronic Health Record to provide a holistic picture of that individual’s health still eludes the experts. This article identifies three primary reasons for this issue. First, there is no agreed standard, in either structure or semantics, for representing and sharing physical activity data across disparate systems. Second, various organizations (e.g., LA Fitness, Gold’s Gym) and research-backed interventions and programs still primarily rely on paper or unstructured formats (such as text or notes) to keep track of the data generated from physical activities. Finally, most wearable devices operate in silos. This article identifies the underlying problem, explores the idea of reusing existing standards, and identifies the essential modules required to move forward.
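As an illustration of the aggregation problem the abstract describes, the sketch below shows what a minimal, source-agnostic physical activity record might look like once disparate sources share one schema. The field names and structure are illustrative assumptions, not the standard the article proposes:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class ActivitySample:
    # One normalized observation from any source (wearable, gym log, exercise plan)
    source: str              # e.g. "wearable:watch" or "plan:gym-log" (assumed labels)
    activity: str            # e.g. "walking", "cycling"
    start: datetime
    duration_min: float
    calories: float = 0.0

@dataclass
class ElectronicPhysicalActivityRecord:
    patient_id: str
    samples: List[ActivitySample] = field(default_factory=list)

    def total_minutes(self, activity: str) -> float:
        # Aggregation across disparate sources becomes trivial once data share one schema
        return sum(s.duration_min for s in self.samples if s.activity == activity)

epar = ElectronicPhysicalActivityRecord("p-001")
epar.samples.append(ActivitySample("wearable:watch", "walking", datetime(2021, 5, 1, 8), 30.0, 120.0))
epar.samples.append(ActivitySample("plan:gym-log", "walking", datetime(2021, 5, 2, 8), 45.0, 180.0))
print(epar.total_minutes("walking"))  # 75.0
```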

Keywords: electronic physical activity record, physical activity in EHR EIM, tracking physical activity data, physical activity data standards

Procedia PDF Downloads 270
529 3D Modeling Approach for Cultural Heritage Structures: The Case of Virgin of Loreto Chapel in Cusco, Peru

Authors: Rony Reátegui, Cesar Chácara, Benjamin Castañeda, Rafael Aguilar

Abstract:

Nowadays, heritage building information modeling (HBIM) is considered an efficient tool to represent and manage information on cultural heritage (CH). The basis of this tool relies on a 3D model generally obtained from a cloud-to-BIM procedure. There are different methods to create an HBIM model that go from manual modeling based on the point cloud to the automatic detection of shapes and the creation of objects. The selection of these methods depends on the desired level of development (LOD), level of information (LOI), and grade of generation (GOG), as well as on the availability of commercial software. This paper presents the 3D modeling of a stone masonry chapel using Recap Pro, Revit, and the Dynamo interface, following a three-step methodology. The first step consists of the manual modeling of simple structural (e.g., regular walls, columns, floors, wall openings) and architectural (e.g., cornices, moldings, and other minor details) elements using the point cloud as reference. Then, Dynamo is used for generative modeling of complex structural elements such as vaults, infills, and domes. Finally, semantic information (e.g., materials, typology, state of conservation) and pathologies are added within the HBIM model as text parameters and generic model families, respectively. The application of this methodology allows the documentation of CH following a relatively simple-to-apply process that ensures adequate LOD, LOI, and GOG levels. In addition, the easy implementation of the method, as well as the use of only one BIM software package with its respective plugin for the scan-to-BIM modeling process, means that this methodology can be adopted by a larger number of users with intermediate knowledge and limited resources, since the BIM software used has a free student license.

Keywords: cloud-to-BIM, cultural heritage, generative modeling, HBIM, parametric modeling, Revit

Procedia PDF Downloads 124
528 A Security Cloud Storage Scheme Based Accountable Key-Policy Attribute-Based Encryption without Key Escrow

Authors: Ming Lun Wang, Yan Wang, Ning Ruo Sun

Abstract:

With the development of cloud computing, more and more users utilize cloud storage services. However, several issues exist: 1) the cloud server steals the shared data; 2) sharers collude with the cloud server to steal the shared data; 3) the cloud server tampers with the shared data; 4) sharers and the key generation center (KGC) conspire to steal the shared data. In this paper, we use the advanced encryption standard (AES), hash algorithms, and accountable key-policy attribute-based encryption without key escrow (WOKE-AKP-ABE) to build a secure cloud storage scheme. The data are encrypted to protect privacy, and hash algorithms prevent the cloud server from tampering with the data uploaded to the cloud. Analysis results show that this scheme can resist collusion attacks.
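One building block of the scheme, hash-based tamper detection, can be sketched in a few lines. This is only an illustration of the integrity-check idea using SHA-256; the AES encryption and WOKE-AKP-ABE components of the actual scheme are omitted:

```python
import hashlib

def digest(data: bytes) -> str:
    # A hash kept (or signed) by the data owner lets any later copy be verified
    return hashlib.sha256(data).hexdigest()

# Owner uploads the ciphertext and records its digest
uploaded = b"encrypted shared data"
stored_digest = digest(uploaded)

# Later, a downloaded copy is checked against the recorded digest
tampered = b"encrypted shared dat4"
assert digest(uploaded) == stored_digest   # the intact copy verifies
assert digest(tampered) != stored_digest   # any server-side modification is detected
print("integrity check passed")
```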

Keywords: cloud storage security, sharing storage, attributes, Hash algorithm

Procedia PDF Downloads 369
527 Frequency Transformation with Pascal Matrix Equations

Authors: Phuoc Si Nguyen

Abstract:

Frequency transformation with Pascal matrix equations is a method for transforming an electronic filter (analogue or digital) into another filter. The technique is based on frequency transformation in the s-domain, the bilinear z-transform with pre-warping frequency, the inverse bilinear transformation, and a very useful application of Pascal’s triangle that simplifies computing and enables calculation by hand when transforming from one filter to another. This paper introduces two methods to transform a filter into a digital filter: frequency transformation from the s-domain into the z-domain, and frequency transformation within the z-domain. Further, two Pascal matrix equations are derived: an analogue-to-digital filter Pascal matrix equation and a digital-to-digital filter Pascal matrix equation. These are used to design a desired digital filter from a given filter.
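The role Pascal’s triangle plays can be illustrated with a generic bilinear substitution: mapping an s-domain polynomial to the z-domain expands entirely into binomial coefficients, which is why the transformation can be tabulated by hand. This sketch is a plain polynomial substitution, not the paper’s exact Pascal matrix equations:

```python
from math import comb

def bilinear_transform(analog, c=1.0):
    """Map an s-domain polynomial sum a_k s^k to the z-domain via the
    bilinear substitution s = c (1 - z) / (1 + z), clearing denominators.
    `analog` lists a_0..a_n in ascending powers of s; the result lists
    z-polynomial coefficients, ascending. Every weight in the expansion
    is a binomial coefficient, i.e. an entry of Pascal's triangle."""
    n = len(analog) - 1
    out = [0.0] * (n + 1)
    for k, ak in enumerate(analog):
        # Expand ak * c^k * (1 - z)^k * (1 + z)^(n - k) term by term
        for i in range(k + 1):
            for j in range(n - k + 1):
                out[i + j] += ak * c**k * comb(k, i) * (-1)**i * comb(n - k, j)
    return out

# H(s) = s + 1 with c = 1 becomes the constant 2 after clearing denominators
print(bilinear_transform([1.0, 1.0]))  # [2.0, 0.0]
```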

Keywords: frequency transformation, bilinear z-transformation, pre-warping frequency, digital filters, analog filters, pascal’s triangle

Procedia PDF Downloads 530
526 Multi-source Question Answering Framework Using Transformers for Attribute Extraction

Authors: Prashanth Pillai, Purnaprajna Mangsuli

Abstract:

Oil exploration and production companies invest considerable time and effort to extract essential well attributes (such as well status, surface and target coordinates, wellbore depths, and event timelines) from unstructured data sources like technical reports, which are often non-standardized, multimodal, and highly domain-specific by nature. It is also important to consider the context when extracting attribute values from reports that contain information on multiple wells/wellbores. Moreover, semantically similar information may often be depicted in different data syntax representations across multiple pages and document sources. We propose a hierarchical multi-source fact extraction workflow based on a deep learning framework to extract essential well attributes at scale. An information retrieval module based on the transformer architecture was used to rank relevant pages in a document source utilizing the page image embeddings and semantic text embeddings. A question answering framework utilizing the LayoutLM transformer was used to extract attribute-value pairs, incorporating the text semantics and layout information from the top relevant pages in a document. To better handle context while dealing with multi-well reports, we incorporate a dynamic query generation module to resolve ambiguities. The extracted attribute information from various pages and documents is standardized to a common representation using a parser module to facilitate information comparison and aggregation. Finally, we use a probabilistic approach to fuse information extracted from multiple sources into a coherent well record. The applicability of the proposed approach and its performance were studied on several real-life well technical reports.
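The page-ranking step of such a retrieval module reduces to scoring pages by cosine similarity between a query embedding and each page embedding. The sketch below uses tiny hand-made vectors as stand-ins for the transformer embeddings the paper describes; the names and dimensions are illustrative assumptions:

```python
from math import sqrt

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def rank_pages(query_vec, page_vecs):
    # Return page indices ordered by similarity to the query, best first
    scores = [(cosine(query_vec, v), i) for i, v in enumerate(page_vecs)]
    return [i for _, i in sorted(scores, reverse=True)]

query = [1.0, 0.0, 1.0]                       # toy "wellbore depth" query embedding
pages = [[0.9, 0.1, 0.8],                     # page about depths -> most similar
         [0.0, 1.0, 0.0],                     # unrelated page
         [0.5, 0.5, 0.5]]                     # mixed-content page
print(rank_pages(query, pages))  # [0, 2, 1]
```

In the actual workflow, the top-ranked pages would then be passed to the question answering model for attribute-value extraction.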

Keywords: natural language processing, deep learning, transformers, information retrieval

Procedia PDF Downloads 180
525 Solving Linear Systems Involved in Convex Programming Problems

Authors: Yixun Shi

Abstract:

Many interior point methods for convex programming solve an (n+m)×(n+m) linear system in each iteration. Many implementations solve this system by considering an equivalent m×m system (4), as listed in the paper, so that the job is reduced to solving system (4). However, system (4) has to be solved exactly, since otherwise the error would be entirely passed onto the last m equations of the original system. Often a Cholesky factorization is computed to obtain the exact solution of (4). One Cholesky factorization must be done in every iteration, resulting in high computational costs. In this paper, two iterative methods for solving linear systems using vector division are combined and embedded into interior point methods. Instead of computing one Cholesky factorization in each iteration, the approach requires only one Cholesky factorization in the entire procedure, thus significantly reducing the amount of computation needed to solve the problem. Based on this, a hybrid algorithm for solving convex programming problems is proposed.
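The cost argument rests on the fact that a Cholesky factor, once computed, can be reused cheaply for every new right-hand side via triangular substitutions. The toy pure-Python sketch below illustrates that reuse pattern; it is not the paper’s vector-division method:

```python
from math import sqrt

def cholesky(A):
    # A must be symmetric positive definite; returns lower-triangular L with A = L L^T
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = sqrt(A[i][i] - s) if i == j else (A[i][j] - s) / L[j][j]
    return L

def solve_with_factor(L, b):
    # Forward substitution (L y = b), then back substitution (L^T x = y):
    # both are O(n^2), far cheaper than the O(n^3) factorization itself
    n = len(L)
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x

A = [[4.0, 2.0], [2.0, 3.0]]
L = cholesky(A)                             # factor once ...
print(solve_with_factor(L, [8.0, 7.0]))     # ... reuse for each new right-hand side
```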

Keywords: convex programming, interior point method, linear systems, vector division

Procedia PDF Downloads 386
524 Optimal Design of Redundant Hybrid Manipulator for Minimum Singularity

Authors: Arash Rahmani, Ahmad Ghanbari, Abbas Baghernezhad, Babak Safaei

Abstract:

In the design of parallel manipulators, the mean value of a dexterity measure over the workspace volume is usually considered as the objective function in optimization algorithms. These indexes are quite complicated to compute for a hybrid parallel manipulator (HPM) owing to the infinite number of solutions for every point within the workspace of a redundant manipulator. In this paper, spatial isotropic design axioms are extended as a well-known method for the optimum design of manipulators. An upper limit for the isotropy measure of the HPM is calculated, and instead of computing and minimizing the isotropy measure itself, the obtained limit is minimized. To this end, two different objective functions are suggested, obtained from the objective functions of the comprising modules. Finally, using a genetic algorithm (GA), the best geometric parameters are obtained for a specific hybrid parallel robot composed of two modified Gough-Stewart platforms (MGSP).
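The GA step can be sketched in miniature. The objective below is a stand-in quadratic rather than the isotropy bound of the MGSP modules, and the operator choices (tournament selection, blend crossover, Gaussian mutation) are common defaults assumed for illustration, not the paper’s exact configuration:

```python
import random

def genetic_minimize(f, bounds, pop=30, gens=60, seed=1):
    # Tiny real-coded GA: tournament selection, blend crossover, Gaussian mutation
    rng = random.Random(seed)
    lo, hi = bounds
    P = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            a = min(rng.sample(P, 3), key=f)       # tournament winner 1
            b = min(rng.sample(P, 3), key=f)       # tournament winner 2
            child = 0.5 * (a + b) + rng.gauss(0.0, 0.02 * (hi - lo))
            nxt.append(min(max(child, lo), hi))    # keep within design bounds
        P = nxt
    return min(P, key=f)

# Stand-in objective with its minimum at x = 2
best = genetic_minimize(lambda x: (x - 2.0) ** 2, (-10.0, 10.0))
print(best)   # converges close to 2.0
```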

Keywords: hybrid manipulator, spatial isotropy, genetic algorithm, optimum design

Procedia PDF Downloads 324
523 Linguistic Misinterpretation and the Dialogue of Civilizations

Authors: Oleg Redkin, Olga Bernikova

Abstract:

Globalization and migration have made cross-cultural contacts more frequent and intensive. Sometimes these contacts lead to misunderstandings between partners in communication and to misinterpretations of verbal messages, which some researchers tend to consider a 'clash of civilizations'. In most cases, the reasons may be found in cultural and linguistic differences, and hence in misinterpretations of intentions and behavior. The current research examines factors of verbal and non-verbal communication that should be taken into consideration in cross-cultural contacts. Language is one of the most important manifestations of the cultural code, and it is often considered one of the special features of a civilization. The Arabic language, in particular, is commonly associated with Islam and the Arab-Muslim civilization. It is one of the most important markers of self-identification for more than 200 million native speakers. Arabic is the language of the Quran and hence the symbol of religious affiliation for more than one billion Muslims around the globe. Adequate interpretation of Arabic texts requires profound knowledge of its grammar and the semantics of its vocabulary. Communicating sides who belong to different cultural groups are guided by different models of behavior and hierarchies of values; besides, the vocabulary each of them uses in the dialogue may convey different semantic realities and vary in connotations. In this context, direct, literal translation in most cases cannot adequately convey the meaning of the original message. Moreover, peculiarities and diversities of extralinguistic information, such as body language, communicative etiquette, cultural background, and religious affiliation, may make the dialogue even more difficult.
It is very likely that the so-called 'clash of civilizations' is in most cases due to misinterpretation of the counterpart's means of discourse, such as language, cultural codes, and models of behavior, rather than to basic contradictions between partners in communication. In the process of communication, one has to rely on universal values rather than focus on cultural or religious peculiarities, and to take into account the current linguistic and extralinguistic context.

Keywords: Arabic, civilization, discourse, language, linguistic

Procedia PDF Downloads 207
522 Data Poisoning Attacks on Federated Learning and Preventive Measures

Authors: Beulah Rani Inbanathan

Abstract:

In the present era, it is evident from numerous incidents that data privacy is being compromised in various ways. Conventional machine learning uses a centralized server: data is given as input, analyzed by the algorithms residing on that server, and outputs are predicted. However, the data must be sent by the user each time the algorithm analyzes input to predict an output, which is prone to threats. A solution to this issue is federated learning, where only the models get updated while the data resides on the local machine and is never exchanged with the other local models. Nevertheless, even these local models are susceptible to data poisoning, as many experiments have clearly shown. This paper delves into the ways in which data poisoning occurs and the many settings in which it remains prevalent, including poisoning attacks on IoT devices, edge devices, autoregressive models, and industrial IoT systems. It also offers some points on how these attacks could be mitigated in order to protect data that is personal, sensitive, or harmful when exposed.
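The effect of a single poisoned client on federated aggregation, and one common mitigation, can be shown with a toy coordinate-wise aggregator. Coordinate-wise median is a standard robust-aggregation defense used here for illustration; the paper surveys attacks rather than prescribing this particular fix:

```python
from statistics import mean, median

def aggregate(updates, robust=False):
    # Each update is a list of model weights from one client; combine coordinate-wise
    combine = median if robust else mean
    return [combine(w) for w in zip(*updates)]

honest = [[0.9, 1.1], [1.0, 1.0], [1.1, 0.9]]
poisoned = honest + [[100.0, -100.0]]          # one malicious client update

print(aggregate(poisoned))                # plain averaging is dragged far off
print(aggregate(poisoned, robust=True))   # the median stays near the honest values
```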

Keywords: data poisoning, federated learning, Internet of Things, edge computing

Procedia PDF Downloads 71
521 Adapting an Accurate Reverse-time Migration Method to USCT Imaging

Authors: Brayden Mi

Abstract:

Reverse time migration has been widely used in the petroleum exploration industry to reveal subsurface images and to detect rock and fluid properties since the early 1980s. The seismic technology involves the construction of a velocity model through interpretive model construction, seismic tomography, or full waveform inversion, and the reverse-time propagation of the acquired seismic data and the original wavelet used in the acquisition. The methodology has matured from 2D imaging in simple media to handling full 3D imaging challenges in extremely complex geological conditions today. Conventional ultrasound computed tomography (USCT) utilizes travel-time inversion to reconstruct the velocity structure of an organ. With that velocity structure, USCT data can be migrated with the "bent-ray" method. Its seismic counterpart is called Kirchhoff depth migration, in which the source of reflective energy is traced by ray tracing and summed to produce a subsurface image. It is well known that ray-tracing-based migration has severe limitations in strongly heterogeneous media and irregular acquisition geometries. Reverse time migration (RTM), on the other hand, fully accounts for wave phenomena, including multiple arrivals and turning rays due to complex velocity structures. It has the capability to fully reconstruct the image detectable within its acquisition aperture. RTM algorithms typically require a rather accurate velocity model and demand high computing power, and may not be applicable to real-time imaging as normally required in day-to-day medical operations. However, with the improvement of computing technology, this computational bottleneck may not present a challenge in the near future. Present-day RTM algorithms are typically implemented from a flat datum for the seismic industry, but they can be modified to accommodate any acquisition geometry and aperture, as long as sufficient illumination is provided.
Such flexibility of RTM can be conveniently exploited for USCT imaging if the spatial coordinates of the transmitters and receivers are known and enough data is collected to provide full illumination. This paper proposes an implementation of a full 3D RTM algorithm for USCT imaging to produce an accurate 3D acoustic image, based on the phase-shift-plus-interpolation (PSPI) method for wavefield extrapolation. In this method, each acquired data set (shot) is propagated back in time, and a known ultrasound wavelet is propagated forward in time, with PSPI wavefield extrapolation and a piecewise constant velocity model of the organ (breast). The imaging condition is then applied to produce a partial image. Although each image is subject to the limitation of its own illumination aperture, the stack of multiple partial images produces a full image of the organ, with a much-reduced noise level compared with the individual partial images.
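The final two steps, applying the imaging condition per shot and stacking partial images, can be sketched with toy arrays. This shows the standard zero-lag cross-correlation imaging condition; the wavefields here are hand-made stand-ins rather than PSPI-extrapolated fields:

```python
def imaging_condition(src_wavefield, rcv_wavefield):
    """Zero-lag cross-correlation imaging condition.
    Both wavefields are indexed [time][space]; the partial image at each
    point is the sum over time of source x receiver wavefield amplitudes."""
    nt, nx = len(src_wavefield), len(src_wavefield[0])
    image = [0.0] * nx
    for t in range(nt):
        for x in range(nx):
            image[x] += src_wavefield[t][x] * rcv_wavefield[t][x]
    return image

def stack(partial_images):
    # Stacking partial images from many shots suppresses uncorrelated noise
    return [sum(vals) for vals in zip(*partial_images)]

# Toy 3-timestep, 4-point wavefields: the fields coincide only at x = 2,
# so the image focuses energy at that reflector position
src = [[0, 0, 1, 0], [0, 0, 2, 0], [0, 0, 1, 0]]
rcv = [[0, 0, 1, 0], [0, 0, 2, 0], [0, 0, 1, 0]]
print(imaging_condition(src, rcv))  # [0.0, 0.0, 6.0, 0.0]
```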

Keywords: illumination, reverse time migration (RTM), ultrasound computed tomography (USCT), wavefield extrapolation

Procedia PDF Downloads 57
520 Integration of Wireless Sensor Networks and Radio Frequency Identification (RFID): An Assessment

Authors: Arslan Murtaza

Abstract:

RFID (Radio Frequency Identification) and WSN (wireless sensor network) are two significant wireless technologies that have an extensive diversity of applications and provide limitless future potential. RFID is used to identify the existence and location of objects, whereas a WSN is used to sense and monitor the environment. Incorporating RFID with WSN not only provides the identity and location of an object but also provides information regarding the condition of the object carrying a sensor-enabled RFID tag. The combination can be widely used in stock management, asset tracking, asset counting, security, military applications, environmental monitoring and forecasting, healthcare, intelligent homes, intelligent transport vehicles, warehouse management, and precision agriculture. This assessment presents a brief introduction to RFID, WSN, and their integration, then reviews applications related to both technologies, and also discusses the status of projects on WSN and RFID integration carried out by different computing groups.

Keywords: wireless sensor network, RFID, embedded sensor, Wi-Fi, Bluetooth, integration, time saving, cost efficient

Procedia PDF Downloads 315
519 Easily Memorable Strong Password Generation and Retrieval

Authors: Shatadru Das, Natarajan Vijayarangan

Abstract:

In this paper, a system and method for generating and recovering an authorization code has been designed and analyzed. The system creates an authorization code by accepting a base sentence from a user. Based on the characters present in this base sentence, the system computes a base-sentence matrix. The system also generates a plurality of patterns. The user can either select a pattern from the multiple patterns suggested by the system or create his/her own. The system then performs multiplications between the base-sentence matrix and the selected pattern matrix at different stages along the path forward to obtain a strong authorization code. In case the user forgets the base sentence, the system has a provision to manage and retrieve the 'forgotten' authorization code. This is done by fragmenting the base sentence into different matrices and storing the fragmented matrices in a repository after computing matrix multiplications, protected with a security question-answer approach and a secret key provided by the user.
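The base-sentence-matrix times pattern-matrix idea can be illustrated in miniature. The character mapping, the 3x3 matrix shape, the output alphabet, and the code-derivation step below are all assumptions made for the sketch, not the paper’s actual construction:

```python
def sentence_matrix(sentence, size=3):
    # Map alphanumeric characters to small integers and fill a size x size matrix
    vals = [ord(c) % 26 for c in sentence if c.isalnum()]
    vals = (vals * (size * size))[: size * size]       # repeat/trim to fit the matrix
    return [vals[i * size:(i + 1) * size] for i in range(size)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def authorization_code(sentence, pattern):
    product = matmul(sentence_matrix(sentence), pattern)
    # Flatten the matrix product into characters drawn from a fixed alphabet
    alphabet = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
    return "".join(alphabet[v % len(alphabet)] for row in product for v in row)

pattern = [[1, 0, 2], [0, 1, 0], [2, 0, 1]]            # a user-selected pattern matrix
code = authorization_code("my memorable base sentence", pattern)
print(code)        # deterministic: same sentence + pattern -> same code
```

The point of the construction is that the memorable base sentence never appears in the code itself, yet the code is reproducible on demand.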

Keywords: easy authentication, key retrieval, memorable passwords, strong password generation

Procedia PDF Downloads 380
518 Integration of Smart Grid Technologies with Smart Phones for Energy Monitoring and Management

Authors: Arjmand Khaliq, Pemra Sohaib

Abstract:

There is an increasing trend in the use of smart devices in the present age. The growth of computing techniques and advancements in hardware have brought the use of sensors and smart devices to a high degree over time, so the use of smart devices for control, management, communication, and optimization has become very popular. This paper proposes a methodology that involves a sensing and switching unit for loads, two-way communication between the utility company and consumers' smart phones using cellular techniques, and price signaling, resulting in the active participation of users in energy management. The goal of this proposed control methodology is the active participation of users in energy management while accommodating renewable energy resources. It will provide load adjustment according to the consumer's choice, increased security and reliability for the consumer, switching of loads according to consumer needs, and monitoring and management of energy.

Keywords: cellular networks, energy management, renewable energy source, smart grid technology

Procedia PDF Downloads 386
517 Simulation and Modeling of High Voltage Pulse Transformer

Authors: Zahra Emami, H. Reza Mesgarzade, A. Morad Ghorbami, S. Reza Motahari

Abstract:

This paper presents a method for calculating parasitic elements, consisting of leakage inductance and parasitic capacitance, in a high voltage pulse transformer. The parasitic elements of pulse transformers significantly influence the resulting pulse shape of a power modulator system. In order to predict these effects on the pulse shape before constructing the transformer, an electrical model is needed. The procedures for computing these elements are based on finite element analysis. The finite element model of the pulse transformer is created using the software Ansys Maxwell 3D. Finally, the transformer's parasitic elements are calculated and compared with values obtained from an actual test, and the pulse modulator is simulated and the results compared with an actual test of the pulse modulator. The results obtained are very similar to the test values.

Keywords: pulse transformer, simulation, modeling, Maxwell 3D, modulator

Procedia PDF Downloads 441
516 Empowering a New Frontier in Heart Disease Detection: Unleashing Quantum Machine Learning

Authors: Sadia Nasrin Tisha, Mushfika Sharmin Rahman, Javier Orduz

Abstract:

Machine learning is applied in a variety of fields throughout the world, and the healthcare sector has benefited enormously from it. One of the most effective approaches for predicting human heart disease is to use machine learning applications to classify data and predict the outcome as a classification. However, with the rapid advancement of quantum technology, quantum computing has emerged as a potential game-changer for many applications. Quantum algorithms have the potential to execute substantially faster than their classical equivalents, which can lead to significant improvements in computational performance and efficiency. In this study, we applied quantum machine learning concepts to predict coronary heart disease from text data. We ran three experiments with three different feature sets, on a data set of 100 data points. We pursue a comparative analysis of the quantum and classical approaches, highlighting the potential benefits of quantum machine learning for predicting heart disease.

Keywords: quantum machine learning, SVM, QSVM, matrix product state

Procedia PDF Downloads 73
515 Blockchain’s Feasibility in Military Data Networks

Authors: Brenden M. Shutt, Lubjana Beshaj, Paul L. Goethals, Ambrose Kam

Abstract:

Communication security is of particular interest to military data networks. A relatively novel approach to network security is blockchain, a cryptographically secured distributed ledger with a decentralized consensus mechanism for data transaction processing. Recent advances in blockchain technology have proposed new techniques for both data validation and trust management, as well as different frameworks for managing dataflow. The purpose of this work is to test the feasibility of different blockchain architectures as applied to military command and control networks. Various architectures are tested through discrete-event simulation, and feasibility is determined based upon a blockchain design's ability to maintain long-term stable performance at industry standards of throughput, network latency, and security. This work proposes a consortium blockchain architecture with a computationally inexpensive consensus mechanism, one that leverages a Proof-of-Identity (PoI) concept and a reputation management mechanism.
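The tamper-evidence property that makes a ledger attractive for command and control traffic comes from hash chaining alone, independent of the consensus mechanism. The minimal sketch below shows only that chaining; the PoI consensus and reputation management of the proposed architecture are omitted:

```python
import hashlib, json

def block_hash(block):
    # The hash covers the payload and the previous block's hash, chaining blocks together
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, payload):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"prev": prev, "payload": payload}
    block["hash"] = block_hash({"prev": prev, "payload": payload})
    chain.append(block)

def verify(chain):
    # Recompute every link; any edit to an earlier block breaks all later links
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        if block["prev"] != prev or block["hash"] != block_hash(
                {"prev": prev, "payload": block["payload"]}):
            return False
    return True

chain = []
append_block(chain, "order: hold position")
append_block(chain, "order: advance at dawn")
assert verify(chain)
chain[0]["payload"] = "order: retreat"      # tampering is immediately detectable
assert not verify(chain)
print("tampering detected")
```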

Keywords: blockchain, consensus mechanism, discrete-event simulation, fog computing

Procedia PDF Downloads 120
514 Variational Explanation Generator: Generating Explanation for Natural Language Inference Using Variational Auto-Encoder

Authors: Zhen Cheng, Xinyu Dai, Shujian Huang, Jiajun Chen

Abstract:

Recently, explanatory natural language inference has attracted much attention for the interpretability of logic relationship prediction; the task is also known as explanation generation for Natural Language Inference (NLI). Existing explanation generators based on discriminative encoder-decoder architectures have achieved noticeable results. However, we find that these discriminative generators usually generate explanations with correct evidence but incorrect logic semantics. This is because logic information is implicitly encoded in the premise-hypothesis pairs and is difficult to model. In fact, the same logic information exists in both the premise-hypothesis pair and the explanation, and it is easy to extract the logic information that is explicitly contained in the target explanation. Hence we assume that there exists a latent space of logic information while generating explanations. Specifically, we propose a generative model called Variational Explanation Generator (VariationalEG) with a latent variable to model this space. Trained with the guidance of explicit logic information in target explanations, the latent variable in VariationalEG can capture the implicit logic information in premise-hypothesis pairs effectively. Additionally, to tackle the problem of posterior collapse while training VariationalEG, we propose a simple yet effective approach called Logic Supervision on the latent variable to force it to encode logic information. Experiments on the explanation generation benchmark e-SNLI (explanation-Stanford Natural Language Inference) demonstrate that the proposed VariationalEG achieves significant improvement over previous studies and yields a state-of-the-art result. Furthermore, we analyze the generated explanations to demonstrate the effect of the latent variable.
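At the core of any such variational generator is the reparameterization trick used to sample the latent variable: z = mu + sigma * eps with eps drawn from a standard normal, so the randomness is isolated in eps and gradients can flow through mu and log-variance. The scalar toy below illustrates only this sampling step, not the VariationalEG architecture:

```python
import math, random

def sample_latent(mu, log_var, rng):
    # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, 1).
    # The encoder would predict mu and log_var; here they are given directly.
    sigma = math.exp(0.5 * log_var)
    return mu + sigma * rng.gauss(0.0, 1.0)

rng = random.Random(0)
samples = [sample_latent(1.0, 0.0, rng) for _ in range(10000)]  # sigma = 1
mean = sum(samples) / len(samples)
print(mean)   # the sample mean approaches mu = 1.0
```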

Keywords: natural language inference, explanation generation, variational auto-encoder, generative model

Procedia PDF Downloads 128
513 On Block Vandermonde Matrix Constructed from Matrix Polynomial Solvents

Authors: Malika Yaici, Kamel Hariche

Abstract:

In control engineering, systems described by matrix fractions are studied through the properties of block roots, also called solvents. These solvents are usually dealt with in block Vandermonde matrix form. Inverses and determinants of Vandermonde matrices and block Vandermonde matrices are used in solving problems of numerical analysis in many domains but require costly computations. Even though Vandermonde matrices are well known and there are many methods, generally based on interpolation techniques, to compute their inverses and determinants, methods to compute the inverse and determinant of a block Vandermonde matrix have not been well studied. In this paper, some properties of these matrices and iterative algorithms to compute the determinant and the inverse of a block Vandermonde matrix are given. These methods are derived from the partitioned-matrix inversion and determinant computation methods. Due to the great size of these matrices, parallelization may be a solution to reduce the computation cost, so a parallelization of these algorithms is proposed and validated by a comparison using algorithmic complexity.
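For orientation, the scalar case the block methods generalize is compact: a Vandermonde matrix is built from successive powers of its points, and its determinant has the closed-form product of pairwise differences. The block case treated in the paper replaces the scalar points with solvent matrices, for which no such simple closed form is available:

```python
def vandermonde(points):
    # Row i holds successive powers of points[i]: [1, x_i, x_i^2, ...]
    n = len(points)
    return [[x ** j for j in range(n)] for x in points]

def vandermonde_det(points):
    # Closed form: product of (x_j - x_i) over all pairs i < j
    det = 1
    for j in range(len(points)):
        for i in range(j):
            det *= points[j] - points[i]
    return det

print(vandermonde([1, 2, 3]))      # [[1, 1, 1], [1, 2, 4], [1, 3, 9]]
print(vandermonde_det([1, 2, 3]))  # (2-1)(3-1)(3-2) = 2
```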

Keywords: block vandermonde matrix, solvents, matrix polynomial, matrix inverse, matrix determinant, parallelization

Procedia PDF Downloads 218
512 Chemical Reaction Algorithm for Expectation Maximization Clustering

Authors: Li Ni, Pen ManMan, Li KenLi

Abstract:

Clustering has been an intensive research topic for many years because of its multifaceted applications in areas such as biology, information retrieval, medicine, and business. Expectation maximization (EM) is a kind of algorithm framework used in clustering methods and one of the ten classic algorithms of machine learning. Traditionally, optimization of an objective function has been the standard approach in EM; hence, research has investigated the utility of evolutionary computing and related techniques in this regard. Chemical Reaction Optimization (CRO) is a recently established method, and the properties embedded in CRO are used to solve optimization problems. This paper presents an algorithm framework (EM-CRO) with modified CRO operators based on EM clustering problems. The hybrid algorithm mainly addresses the initial-value sensitivity of objective function optimization clustering algorithms. Our experiments take the classic EM algorithms k-means and fuzzy k-means as examples, optimizing their initial values through the CRO algorithm to obtain the K-means-CRO and FKM-CRO algorithms. The experimental results show improved efficiency in solving objective function optimization clustering problems.
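The initial-value sensitivity the hybrid targets is easy to demonstrate: the same k-means iteration converges to different local optima from different starting centers. The toy 1-D example below (three natural groups, two centers) stands in for the problem CRO is used to solve; it is not the paper’s EM-CRO algorithm:

```python
def kmeans(points, centers, iters=20):
    # Lloyd's algorithm on 1-D data; the outcome depends on the initial centers
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Three natural groups near 0, 5, and 10, but only two centers:
# different starts settle into different local optima of the objective
data = [0.0, 0.1, 5.0, 5.1, 10.0, 10.1]
print(kmeans(data, centers=[0.0, 10.0]))   # one local optimum
print(kmeans(data, centers=[9.9, 10.2]))   # a different local optimum
```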

Keywords: chemical reaction optimization, expectation maximization, initialization, objective function clustering

Procedia PDF Downloads 691
511 An Event-Related Potential Investigation of Speech-in-Noise Recognition in Native and Nonnative Speakers of English

Authors: Zahra Fotovatnia, Jeffery A. Jones, Alexandra Gottardo

Abstract:

Speech communication often occurs in environments where noise conceals part of a message, and listeners must compensate for the missing auditory information by picking up distinct acoustic cues and using semantic and sentential context to recreate the speaker's intended message. This situation seems to be more challenging in a nonnative than in a native language. On the other hand, early bilinguals are expected to show an advantage over late bilingual and monolingual speakers of a language due to their better executive functioning components. In this study, English monolingual speakers were compared with early and late nonnative speakers of English to understand speech-in-noise (SIN) processing and its underlying neurobiological features. Auditory mismatch negativities (MMNs) were recorded using a double-oddball paradigm in response to a minimal pair that differed in its middle vowel (beat/bit) at Wilfrid Laurier University in Ontario, Canada. The results did not show any significant structural or electroneural differences across groups. However, vocabulary knowledge correlated positively with performance on tests that measured SIN processing in participants who learned English after age 6, and their performance on the test correlated negatively with the integral area amplitudes in the left superior temporal gyrus (STG). In addition, the STG was engaged before the inferior frontal gyrus (IFG) in the noise-free and low-noise test conditions in all groups. We infer that the pre-attentive processing of words engages the temporal lobes earlier than the fronto-central areas and that vocabulary knowledge aids the nonnative perception of degraded speech.

Keywords: degraded speech perception, event-related brain potentials, mismatch negativities, brain regions

Procedia PDF Downloads 89
510 Heuristic Evaluation of Children’s Authoring Tool for Game Making

Authors: Laili Farhana Md Ibharim, Maizatul Hayati Mohamad Yatim

Abstract:

The main purpose of this study is to evaluate, by heuristic inspection, authoring tools with which children can develop games. The researcher selected 15 authoring tools for making games intended specifically for educational purposes. Nine students from the Diploma of Game Design and Development course and four lecturers from the computing department were involved in this evaluation. A usability heuristic checklist was used as a guideline for the students and lecturers to observe and test the selected authoring tools. The study found that only a few authoring tools fulfill most of the heuristic requirements and are suitable for children. In this evaluation, only six out of fifteen authoring tools passed more than five elements of the heuristic inspection checklist. The researcher identified that, in order to develop a usable authoring tool, developers have to emphasize children's acceptance of, and interaction with, the authoring tool. Furthermore, the authoring tool can serve to enhance their mental development, especially creativity and skill.

Keywords: authoring tool, children, game making, heuristic

Procedia PDF Downloads 334
509 Exploring Twitter Data on Human Rights Activism on Olympics Stage through Social Network Analysis and Mining

Authors: Teklu Urgessa, Joong Seek Lee

Abstract:

Social media is becoming the primary choice of activists to make their voices heard, for two main reasons. The first is the emergence of Web 2.0, which gave users the opportunity to become content creators rather than passive recipients. The second is the control of mainstream mass media outlets by governments and individuals with political and economic interests. This paper explores Twitter data on network actors discussing the marathon silver medalist at Rio 2016, who showed solidarity with the Oromo protesters in Ethiopia at the marathon finish line when he won silver. The aim is to discover important insights using social network analysis and mining. The hashtag #FeyisaLelisa was used for the Twitter network search. The actors' network was visualized and analyzed. It showed that the central influencers during the first 10 days of August were international media outlets, whereas individual activists became central in September. The degree distribution of the network is scale-free: the frequency of degrees decays according to a power law. Text mining was also used to derive meaningful themes from the tweet corpus about the selected event. The semantic network indicated 15 important clusters of concepts that provided different insights into the why, who, where, and how of the situation surrounding the event. The sentiments of the words in the tweets were also analyzed, indicating that 95% of the opinions in the tweets were either positive or neutral. Overall, the findings showed that the marathoner's Olympic-stage protest brought the Oromo protest to global attention. A new framework for event-based social network analysis and mining is proposed, based on the practical procedures followed in this research.
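As an illustration of the degree-distribution analysis described above, the following is a minimal Python sketch using a small hypothetical edge list (the user names and outlets are invented; the actual study crawled #FeyisaLelisa tweets):

```python
from collections import Counter

# Hypothetical retweet/mention edges (source, target) from a hashtag crawl.
edges = [("user1", "bbc"), ("user2", "bbc"), ("user3", "bbc"),
         ("user4", "bbc"), ("user2", "cnn"), ("user5", "cnn"),
         ("user6", "activist1"), ("user1", "cnn"), ("user7", "bbc")]

# Count the degree (number of incident edges) of every node.
degree = Counter()
for src, dst in edges:
    degree[src] += 1
    degree[dst] += 1

# Frequency of each degree value: in a scale-free network this
# distribution decays roughly as a power law, with many low-degree
# nodes and a handful of highly connected hubs (the influencers).
dist = Counter(degree.values())
hubs = [n for n, d in degree.items() if d >= 4]
print(sorted(dist.items()))   # many degree-1 nodes, few hubs
print(hubs)
```

On real data one would fit the tail of `dist` to estimate the power-law exponent rather than eyeball it.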

Keywords: human rights, Olympics, social media, network analysis, social network mining

Procedia PDF Downloads 235
508 ICT Education: Digital History Learners

Authors: Lee Bih Ni, Elvis Fung

Abstract:

This article reviews the new generation of students in order to understand their expectations and attitudes. These students work on school projects and creative work, use educational software and digital resources, employ social networking tools to communicate with friends, and take part in competitions. Today's students have been described as the new millennium learners. They use information and communication technology (ICT) more creatively and innovatively at home than at school, because at home ICT serves different purposes than those that usually occur in school. They also collaborate and communicate more effectively when they are at home. Most children enter school already equipped with basic ICT skills, and some arrive with skills more advanced than most schools expect. Many teachers can help such students; however, many still work in a "traditional" way, without a computer, and do not see the benefits of using computers in the education system, nor the new social computing networks that will shape how young people learn and work in the future.

Keywords: ICT education, digital history, new generation of students, benefits of using a computer

Procedia PDF Downloads 386
507 Adjusted LOLE and EENS Indices for the Consideration of Load Excess Transfer in Power Systems Adequacy Studies

Authors: François Vallée, Jean-François Toubeau, Zacharie De Grève, Jacques Lobry

Abstract:

When evaluating the capacity of a generation park to cover the load in transmission systems, the traditional Loss of Load Expectation (LOLE) and Expected Energy Not Served (EENS) indices can be used. While those indices allow computing the annual duration and severity of load non-covering situations, they do not take into account the fact that the load excess is generally shifted from one penury state (an hour or quarter-hour) to the following one. In this paper, a sequential Monte Carlo framework is introduced in order to compute adjusted LOLE and EENS indices. In practice, these adapted indices make it possible to consider the effect of load excess transfer on the global adequacy of a generation park, thus providing a more accurate evaluation of this quantity.
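The effect of load excess transfer on the indices can be sketched as follows. This is a minimal illustration with invented capacity and load distributions, not the authors' system model:

```python
import random

def adequacy_indices(capacity, load, shift_excess=False):
    """Compute LOLE (hours) and EENS (MWh) over one simulated year.

    With shift_excess=True, any unserved load in one hour is carried
    over and added to the demand of the following hour."""
    lole, eens, carry = 0, 0.0, 0.0
    for c, l in zip(capacity, load):
        demand = l + (carry if shift_excess else 0.0)
        excess = demand - c
        if excess > 0:
            lole += 1
            eens += excess
            carry = excess
        else:
            carry = 0.0
    return lole, eens

random.seed(0)
hours = 8760
capacity = [random.gauss(100, 15) for _ in range(hours)]  # available generation (MW)
load = [random.gauss(85, 10) for _ in range(hours)]       # hourly load (MW)

print(adequacy_indices(capacity, load))                     # traditional indices
print(adequacy_indices(capacity, load, shift_excess=True))  # adjusted indices
```

Because the carried-over excess can only increase later demand, the adjusted indices are never smaller than the traditional ones, which is the bias the paper's adjusted indices capture.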

Keywords: expected energy not served, loss of load expectation, Monte Carlo simulation, reliability, wind generation

Procedia PDF Downloads 389
506 A Pervasive System Architecture for Smart Environments in Internet of Things Context

Authors: Patrick Santos, João Casal, João Santos, Luis Varandas, Tiago Alves, Carlos Romeiro, Sérgio Lourenço

Abstract:

Nowadays, technology makes it possible, on the one hand, to communicate with various objects of daily life through the Internet, and on the other, to have these objects interact with each other through this channel. Simultaneously, with the rise of smartphones as the most ubiquitous technology in people's lives, new agents for these devices have emerged: Intelligent Personal Assistants. These agents have the goal of helping users manage and organize their information, as well as supporting them in their day-to-day tasks. Moreover, another emerging concept is cloud computing, which allows computation and storage to move off users' devices, bringing benefits in terms of performance, security, interoperability and others. Connecting these three paradigms, in this work we propose an architecture for an intelligent system which provides an interface that assists the user in smart environments, informing, suggesting actions and allowing the user to manage the objects of his/her daily life.

Keywords: internet of things, cloud, intelligent personal assistant, architecture

Procedia PDF Downloads 493
505 BigCrypt: A Probable Approach of Big Data Encryption to Protect Personal and Business Privacy

Authors: Abdullah Al Mamun, Talal Alkharobi

Abstract:

As data sizes grow, people increasingly store large amounts of secret information in cloud storage. Companies routinely need to transfer massive business files from one end to another. Privacy is lost if data is transmitted as-is, and repeating this scenario without securing the communication mechanism through proper encryption compounds the risk. Although asymmetric-key encryption solves the key-distribution problem of symmetric-key encryption, it can only encrypt a limited amount of data, which makes it inapplicable to large-scale data encryption. In this paper, we propose a Pretty Good Privacy-style approach to encrypting big data using both symmetric and asymmetric keys. Our goal is to encrypt huge collections of information and transmit them through a secure communication channel, preserving business and personal privacy. To justify our method, an experimental dataset from three different platforms is provided. We show that our approach works efficiently and reliably on massive volumes of varied data.
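The hybrid structure described above (a symmetric key for the bulk data, an asymmetric key to protect that symmetric key) can be sketched as follows. This is a toy illustration only: a SHA-256 keystream stands in for a real symmetric cipher such as AES, and the textbook RSA key below is far too small for actual use:

```python
import hashlib
import secrets

def stream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR data with a SHA-256 counter keystream."""
    stream, counter = bytearray(), 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

# Textbook RSA key pair (p, q are tiny here; real keys are 2048+ bits).
p, q = 61, 53
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))

def rsa_wrap(session_key: bytes) -> list:
    return [pow(b, e, n) for b in session_key]    # encrypt key byte by byte

def rsa_unwrap(wrapped: list) -> bytes:
    return bytes(pow(c, d, n) for c in wrapped)

# Hybrid encryption: symmetric cipher for the bulk data, asymmetric
# cipher only for the short session key.
message = b"massive business file contents ..."
session_key = secrets.token_bytes(16)
ciphertext = stream_xor(session_key, message)
wrapped_key = rsa_wrap(session_key)

# Receiver: unwrap the session key with the RSA private key, then decrypt.
recovered = stream_xor(rsa_unwrap(wrapped_key), ciphertext)
assert recovered == message
```

The asymmetric operation only ever touches the 16-byte session key, so the size limit of asymmetric encryption never applies to the data itself.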

Keywords: big data, cloud computing, cryptography, hadoop, public key

Procedia PDF Downloads 303
504 Multithreading/Multiprocessing Simulation of The International Space Station Multibody System Using A Divide and Conquer Dynamics Formulation with Flexible Bodies

Authors: Luong A. Nguyen, Elihu Deneke, Thomas L. Harman

Abstract:

This paper describes a multibody dynamics algorithm formulated for parallel implementation on multiprocessor computing platforms using the divide-and-conquer approach. The system of interest is a general topology of rigid and elastic articulated bodies with or without loops. The algorithm is an extension of Featherstone’s divide and conquer approach to include the flexible-body dynamics formulation. The equations of motion, configured for the International Space Station (ISS) with its robotic manipulator arm as a system of articulated flexible bodies, are implemented in separate computer processors. The performance of this divide-and-conquer algorithm implementation in multiple processors is compared with an existing method implemented on a single processor.
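The divide-and-conquer structure can be sketched as below. This is a structural illustration only: each body is reduced to a single number, the two-handle assembly step of Featherstone's method is replaced with a placeholder sum, and threads stand in for the separate processors used in the paper:

```python
from concurrent.futures import ThreadPoolExecutor

def assemble(bodies):
    """Recursively assemble a chain of bodies, binary-tree style.

    In the real DCA, this step combines the articulated-body inertias
    of two sub-chains; here each body is just a number and the combine
    is a sum, to show the recursion shape only."""
    if len(bodies) == 1:
        return bodies[0]
    mid = len(bodies) // 2
    left = assemble(bodies[:mid])
    right = assemble(bodies[mid:])
    return left + right          # placeholder for the two-handle combine

def assemble_parallel(bodies, workers=2):
    # Top level of the recursion: each half of the system is assembled
    # concurrently, mirroring the per-processor split of the algorithm.
    mid = len(bodies) // 2
    with ThreadPoolExecutor(max_workers=workers) as pool:
        left = pool.submit(assemble, bodies[:mid])
        right = pool.submit(assemble, bodies[mid:])
        return left.result() + right.result()

chain = list(range(1, 17))              # a 16-body serial chain
assert assemble_parallel(chain) == assemble(chain)
```

Because the assembly tree is balanced, the two halves carry equal work, which is what makes the split across processors pay off.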

Keywords: multibody dynamics, multiple processors, multithreading, divide-and-conquer algorithm, computational efficiency, flexible body dynamics

Procedia PDF Downloads 316
503 Vascular Crossed Aphasia in Dextrals: A Study on Bengali-Speaking Population in Eastern India

Authors: Durjoy Lahiri, Vishal Madhukar Sawale, Ashwani Bhat, Souvik Dubey, Gautam Das, Biman Kanti Roy, Suparna Chatterjee, Goutam Gangopadhyay

Abstract:

Crossed aphasia has been an area of considerable interest for cognitive researchers, as it offers a fascinating insight into cerebral lateralization of language function. We conducted an observational study in the stroke unit of a tertiary care neurology teaching hospital in eastern India on subjects with crossed aphasia over a period of four years. During the study period, we detected twelve cases of crossed aphasia caused by ischemic stroke in strongly right-handed patients. The age, gender, vernacular language and educational status of the patients were noted. Aphasia type and severity were assessed using the validated Bengali version of the Western Aphasia Battery. Computed tomography, magnetic resonance imaging and angiography were used to evaluate the location and extent of the ischemic lesion in the brain. Our series of 12 cases of crossed aphasia included 7 males and 5 females, with a mean age of 58.6 years. Eight patients were found to have Broca's aphasia, 3 had transcortical motor aphasia and 1 patient suffered from global aphasia. Nine patients had very severe aphasia and 3 suffered from mild aphasia. The mirror-image type of crossed aphasia was found in 3 patients, whereas 9 had the anomalous variety. In our study, crossed aphasia was found to be more frequent in males, and the anomalous pattern was more common than the mirror-image one. The majority of the patients had motor-type aphasia, and no patient was found to have a pure comprehension deficit. We hypothesize that in the Bengali-speaking right-handed population, the lexical-semantic system of the language network remains loyal to the left hemisphere even if the phonological output system is anomalously located in the right hemisphere.

Keywords: aphasia, crossed, lateralization, language function, vascular

Procedia PDF Downloads 168
502 An ALM Matrix Completion Algorithm for Recovering Weather Monitoring Data

Authors: Yuqing Chen, Ying Xu, Renfa Li

Abstract:

The development of matrix completion theory provides new approaches for data gathering in Wireless Sensor Networks (WSN). The existing matrix completion algorithms for WSN mainly consider how to reduce the sampling number, without considering the real-time performance when recovering the data matrix. In order to guarantee recovery accuracy while simultaneously reducing the recovery time, we propose a new ALM algorithm to recover weather monitoring data. Extensive experiments have been carried out to investigate the performance of the proposed ALM algorithm using different parameter settings, sampling rates and sampling models. In addition, we compare the proposed ALM algorithm with some existing algorithms in the literature. Experimental results show that the ALM algorithm obtains better overall recovery accuracy with less computing time, which demonstrates that it is an effective and efficient approach for recovering real-world weather monitoring data in WSN.
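The singular value thresholding idea underlying such matrix completion can be sketched as follows. This is a generic soft-impute illustration on synthetic low-rank data, not the authors' ALM algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth low-rank "weather" matrix (sensors x time steps), rank 2.
U, V = rng.standard_normal((20, 2)), rng.standard_normal((2, 30))
M = U @ V

mask = rng.random(M.shape) < 0.6      # ~60% of entries actually sampled
X = np.where(mask, M, 0.0)            # initialize unknown entries to zero

tau = 0.5                             # singular value threshold
for _ in range(300):
    # SVD of the current estimate, then soft-threshold the singular values
    # to promote low rank, then re-impose the observed entries.
    Uk, s, Vk = np.linalg.svd(X, full_matrices=False)
    X = Uk @ np.diag(np.maximum(s - tau, 0.0)) @ Vk
    X[mask] = M[mask]

err = np.linalg.norm((X - M)[~mask]) / np.linalg.norm(M[~mask])
print(f"relative error on missing entries: {err:.3e}")
```

An ALM-based solver targets the same nuclear-norm objective but typically converges in far fewer iterations, which is the recovery-time advantage the abstract refers to.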

Keywords: wireless sensor network, matrix completion, singular value thresholding, augmented Lagrange multiplier

Procedia PDF Downloads 363
501 Marketing Strategy of Agricultural Products in Remote Districts: A Case Study of Mudan Township, Taiwan

Authors: Ying-Hsiang Ho, Hsiao-Tseng Lin

Abstract:

Mudan Township is a remote mountainous area in Taiwan. In recent years, due to population migration, inconvenient transportation, the digital divide, and low production, the marketing of agricultural products has become a major issue. This research aims to develop marketing strategies suitable for the agricultural products of rural areas. The main objective of this work is to conduct in-depth interviews with scholars and experts in the marketing field and, combined with the marketing mix (4Ps), to analyze and summarize possible marketing strategies for agricultural products in remote districts. The interviewees consist of seven industry experts with practical experience in producing, marketing, and selling agricultural products and three professors with experience in teaching marketing management. The in-depth interviews were conducted for about an hour each, using a pre-drafted interview outline. The results of the interviews were summarized by semantic analysis and are presented in terms of the 4P marketing mix. The results indicate that, in terms of product, high quality and original characteristics can be added through the implementation of production traceability, organic certification, and cultural packaging. In terms of place, we found that the use of emerging communities, an emphasis on cross-industry alliances, the improvement of rural households' information application capabilities, production and marketing groups, and contract farming systems are the development priorities. In terms of promotion, emphasis should be placed on the management of Internet social media and word-of-mouth marketing. Mudan Township may also consider promoting agricultural products through special festivals such as farmers' markets, the wild ginger flower season and the hot spring season. This research also proposes relevant recommendations for the government's public sector and related industries as a reference for promoting agricultural products in remote areas.

Keywords: marketing strategy, remote districts, agricultural products, in-depth interviews

Procedia PDF Downloads 110