Search results for: editable emoticon
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9

9 Effective Editable Emoticon Description Schema for Mobile Applications

Authors: Jiwon Lee, Si-hwan Jang, Sanghyun Joo

Abstract:

The popularity of emoticons has risen as mobile messengers have become widespread. At the same time, several problems have emerged from the innate characteristics of emoticons. Having too many emoticons makes it difficult for people to select one that suits their intention; conversely, users sometimes cannot find an emoticon that expresses their exact intention. Poor information delivery is another problem, since most current emoticons focus on conveying emotion. In this situation, we propose a new concept, the editable emoticon, to address these drawbacks. Users can edit the components inside the proposed editable emoticon and send it to express their exact intention. In this way, the number of editable emoticons can be kept reasonable while still expressing the user's exact intention. Furthermore, editable emoticons can serve as information deliverers, depending on the user's intention and editing skill. In this paper, we propose the concept of editable emoticons and a schema-based editable emoticon description method. The proposed description method requires about 200 times less transmission bandwidth than the compared screen-capturing method. Furthermore, the description method is designed for compatibility, since it follows the MPEG-UD international standard. The proposed editable emoticons can be exploited not only in mobile applications but also in various fields such as education and medicine.
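As an illustrative sketch of the idea, an editable emoticon can be transmitted as a few hundred bytes of structured description instead of a captured image, which is where the bandwidth advantage comes from. The element and attribute names below are invented for illustration and are not the actual MPEG-UD schema:

```python
# Hypothetical sketch: an editable emoticon sent as structured metadata
# (base emoticon plus user-edited components) rather than a rendered image.
import xml.etree.ElementTree as ET

def describe_emoticon(base_id, components):
    """Serialize a base emoticon id and its edited components as XML."""
    root = ET.Element("EditableEmoticon", {"base": base_id})
    for name, value in components.items():
        comp = ET.SubElement(root, "Component", {"name": name})
        comp.text = value
    return ET.tostring(root, encoding="unicode")

desc = describe_emoticon(
    "smile-01", {"eyes": "wink", "mouth": "grin", "caption": "see you at 3pm"}
)
# A description of a few hundred bytes replaces a screen capture of tens of kilobytes.
print(len(desc.encode("utf-8")))
```

The receiving client would then render the emoticon locally from the base asset and the listed component edits.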

Keywords: description schema, editable emoticon, emoticon transmission, mobile applications

Procedia PDF Downloads 272
8 The Use of Emoticons in Polite Phrases of Greeting and Thanks

Authors: Zuzana Komrsková

Abstract:

This paper shows the connection between emoticons and politeness in written computer-mediated communication. It examines whether there are differences in emoticon use between Czech and English tweets. My assumptions about the use of emoticons were based on the use of greetings and thanks in real, face-to-face situations. The first assumption, that a welcome greeting phrase would be accompanied by a positive emoticon, was correct. For the farewell greeting, however, both positive and negative emoticons are possible; my results show a lower frequency of negative emoticons in this context. I also quite often found both a positive and a negative emoticon in the same tweet. The expression of gratitude is associated with positive emotions. The results show that emoticons very often accompany polite phrases of greeting and thanks in both Czech and English, which suggests that emoticons have become an integral part of these phrases.

Keywords: Czech, emoticon, English, politeness, Twitter

Procedia PDF Downloads 382
7 Customizable Sonic EEG Neurofeedback Environment to Train Self-Regulation of Momentary Mental and Emotional State

Authors: Cyril Kaplan, Nikola Jajcay

Abstract:

We developed a purely sonic, music-based, highly customizable EEG neurofeedback environment designed to administer a new neurofeedback training protocol. The protocol concentrates on improving the ability to switch between several mental states characterized by different levels of arousal, each correlated with specific brain-wave activity patterns in specific regions of the neocortex. This paper describes the neurofeedback training environment we developed and its specificities, and can thus serve as a manual for other neurofeedback users (both researchers and practitioners) interested in our editable open-source program (available for download and use under a CC license). Responses and reactions of the first trainees who used our environment are presented in this article. A combination of qualitative methods (thematic analysis of the trainees' neurophenomenological insights and post-session semi-structured interviews) and quantitative methods (power spectrum analysis of EEG recorded during training) was employed to obtain a multifaceted view of our new training protocol.

Keywords: EEG neurofeedback, mixed methods, self-regulation, switch-between-states training

Procedia PDF Downloads 191
6 Colour Recognition Pen Technology in Dental Technique and Dental Laboratories

Authors: M. Dabirinezhad, M. Bayat Pour, A. Dabirinejad

Abstract:

Recognition of the colour spectrum of the teeth plays a significant role in dental laboratories producing dentures. Since each patient's teeth vary in type and colour, the exact and most suitable colour must be specified to produce a denture. Usually, dentists use shade pallets to identify the colour that suits a patient based on the colour of the adjacent teeth. Consequently, dentists can make human errors in recognizing the optimum colour for the patient, which can be frustrating for the patient. According to the statistics, some patients report that they are not satisfied with the colour of their dentures after installation in their mouths. This problem stems from insufficient accuracy during the colour recognition step of denture production. The colour recognition pen (CRP) is a technology to distinguish the colour spectrum of the intended teeth with high accuracy. The CRP is equipped with a sensor capable of reading and analysing a wide range of spectra. It is also connected to a database that contains all the spectrum ranges available on the market; the database is editable and updatable according to market requirements. A further advantage of this invention is that it saves the patient time, since there is no need to redo denture production after a failure on the first try.
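A minimal sketch of the matching step such a pen might perform: compare a sensor reading against database entries and return the nearest shade. The shade names, colour values, and distance metric below are illustrative assumptions; the abstract does not specify the matching algorithm:

```python
# Hypothetical nearest-shade matching against an editable shade database.
import math

# Invented shade entries; a real database would hold spectral ranges.
SHADE_DB = {"A1": (246, 238, 220), "A2": (240, 228, 205), "B1": (248, 242, 230)}

def match_shade(reading, database=SHADE_DB):
    """Return the database shade closest to the sensor reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(database, key=lambda name: dist(reading, database[name]))

print(match_shade((241, 229, 206)))  # nearest entry to the reading
```

Because the database is a plain editable mapping, new shades arriving on the market can be added without changing the matching logic.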

Keywords: colour recognition pen, colour spectrum, dental laboratory, denture

Procedia PDF Downloads 172
5 Towards a Large Scale Deep Semantically Analyzed Corpus for Arabic: Annotation and Evaluation

Authors: S. Alansary, M. Nagi

Abstract:

This paper presents an approach to semantic annotation of an Arabic corpus using the Universal Networking Language (UNL) framework. UNL is a promising strategy for providing a large collection of semantically annotated texts with formal, deep semantics rather than shallow ones. The result constitutes an editable semantic resource (semantic graphs) that integrates various phenomena, including predicate-argument structure, scope, tense, thematic roles and rhetorical relations, into a single semantic formalism for knowledge representation. The paper also presents the Interactive Analysis tool for automatic semantic annotation (IAN), as well as the cornerstone of the proposed methodology: the disambiguation and transformation rules. Semantic annotation using UNL has been applied to a corpus of 20,000 Arabic sentences representing the most frequent structures in the Arabic Wikipedia. The representation was illustrated at different linguistic levels, starting from the morphological level, passing through the syntactic level, until the semantic representation is reached. The output has been evaluated using the F-measure and is 90% accurate. This demonstrates the power of the formal environment, which enables intelligent text processing and search.
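For reference, the F-measure used in the evaluation is the harmonic mean of precision and recall. A minimal computation, with illustrative counts rather than the corpus figures, is:

```python
# F-measure (F1) from annotation counts: correct annotations (true positives),
# spurious ones (false positives), and missed ones (false negatives).
def f_measure(true_positives, false_positives, false_negatives):
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return 2 * precision * recall / (precision + recall)

# With 90 correct, 10 spurious, and 10 missed annotations,
# precision and recall are both 0.9, giving an F-measure of 0.9.
print(f_measure(90, 10, 10))
```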

Keywords: semantic analysis, semantic annotation, Arabic, universal networking language

Procedia PDF Downloads 563
4 BIM Data and Digital Twin Framework: Preserving the Past and Predicting the Future

Authors: Mazharuddin Syed Ahmed

Abstract:

This research presents a framework used to develop the Ara Polytechnic College of Architecture Studies building "Kahukura", which is Green Building certified. The framework integrates the development of a smart-building digital twin by utilizing Building Information Modelling (BIM) and its BIM maturity levels, including Levels of Development (LOD), the eight dimensions of BIM, Heritage BIM (H-BIM) and Facility Management BIM (FM-BIM). The research also outlines a structured approach to building performance analysis and integration with the circular economy, encapsulated within a five-level digital twin framework. Starting with Level 1, the Descriptive Twin provides a live, editable visual replica of the built asset, allowing for specific data inclusion and extraction. Advancing to Level 2, the Informative Twin integrates operational and sensory data, enhancing data verification and system integration. At Level 3, the Predictive Twin utilizes operational data to generate insights and proactive management suggestions. Progressing to Level 4, the Comprehensive Twin simulates future scenarios, enabling robust "what-if" analyses. Finally, Level 5, the Autonomous Twin, represents the pinnacle of digital twin evolution, capable of learning and acting autonomously on behalf of users.

Keywords: building information modelling, circular economy integration, digital twin, predictive analytics

Procedia PDF Downloads 19
3 Myanmar Character Recognition Using Eight Direction Chain Code Frequency Features

Authors: Kyi Pyar Zaw, Zin Mar Kyu

Abstract:

Character recognition is the process of converting a text image file into an editable and searchable text file. Feature extraction is the heart of any character recognition system: the recognition rate may be low or high depending on the extracted features. In this paper, 25 features per character are used for recognition. Character recognition comprises three basic steps: character segmentation, feature extraction and classification. In the segmentation step, a horizontal cropping method is used for line segmentation and a vertical cropping method for character segmentation. In the feature extraction step, features are extracted in two ways. First, 8 features are extracted from the entire input character using eight-direction chain code frequency extraction. Second, the input character is divided into 16 blocks; for each block, the 8 feature values obtained through the same eight-direction chain code frequency extraction are summed to form a single feature per block, yielding 16 block features. We use the number of holes as an additional feature to cluster similar characters. With these features, we can recognize almost all common Myanmar characters at various font sizes. All 25 features are used in both the training and testing phases. In the classification step, characters are classified by matching all features of the input character against the already trained character features.
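The whole-character features described above can be sketched as a Freeman chain code histogram: walk the character boundary, encode each step as one of eight directions, and count how often each direction occurs. The direction numbering and the toy boundary below are illustrative assumptions, not the paper's exact conventions:

```python
# Eight-direction (Freeman) chain code frequency features for a boundary.
# Image coordinates are assumed: x grows right, y grows downward.
DIRECTIONS = {(1, 0): 0, (1, -1): 1, (0, -1): 2, (-1, -1): 3,
              (-1, 0): 4, (-1, 1): 5, (0, 1): 6, (1, 1): 7}

def chain_code_frequencies(boundary):
    """Return an 8-bin histogram of chain codes along a closed boundary."""
    freq = [0] * 8
    # Pair each boundary point with its successor, wrapping around at the end.
    for (x0, y0), (x1, y1) in zip(boundary, boundary[1:] + boundary[:1]):
        code = DIRECTIONS[(x1 - x0, y1 - y0)]
        freq[code] += 1
    return freq

# A 2x2 square traced clockwise: one step in each of the four axis directions.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(chain_code_frequencies(square))  # -> [1, 0, 1, 0, 1, 0, 1, 0]
```

The 16 block features would then come from running the same extraction per block and summing each block's 8 bins into a single value.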

Keywords: chain code frequency, character recognition, feature extraction, features matching, segmentation

Procedia PDF Downloads 294
2 Creating a Digital Map to Monitor the Care of People Living with HIV/Aids in Porto Alegre, Brazil: An Experience Report

Authors: Tiago Sigal Linhares, Ana Amélia Nascimento da Silva Bones, Juliana Miola, McArthur Alexander Barrow, Airton Tetelbom Stein

Abstract:

Introduction: As a result of increased globalization and changing migration trends, a significant portion of People Living with HIV/AIDS (PLWHA) are expected to change their place of residence over time. In order to provide better health care, monitor the HIV epidemic and plan urban public health care and policies, there is a growing need for a strategy to monitor PLWHA care, location and migration patterns. The Porto Alegre District is characterized by a high prevalence of PLWHA and is considered one of the epicenters of the HIV epidemic in Latin America. Objectives: The aim of this study is to create a digital, easily editable map that visually represents the location of PLWHA and monitors their migration within the city and the country, in an effort to promote longitudinal care. Methods: This experience report used Google Maps Map Creator to generate an active digital map showing the location and changes in residence of 165 PLWHA who received care over the last four years at two Primary Health Care (PHC) clinics in downtown Porto Alegre, which serve an estimated population of five thousand patients. Their current addresses were retrieved from the unified Brazilian health care system digital records (e-SUS) and updated on the map. Results: A digital map of PLWHA's current residence locations was created. It made it possible to visualize areas with a large concentration of PLWHA and the migration of this population within the city as well as to other cities, regions and states. Conclusions: An easily reproducible, free map could aid PLWHA monitoring, urban public health planning, targeted interventions and situational diagnosis. Moreover, a visual representation of PLWHA location and migration could help bring more attention and investment to areas with geographic inequities or a higher prevalence of PLWHA. It also enables local PHC units to be notified of monitored patients within their area who are at clinical risk or have abandoned treatment, supporting active case finding and improving the care of PLWHA.

Keywords: health care, medical public health, theoretical and conceptual innovations, urban public health

Procedia PDF Downloads 99
1 Predicting OpenStreetMap Coverage by Means of Remote Sensing: The Case of Haiti

Authors: Ran Goldblatt, Nicholas Jones, Jennifer Mannix, Brad Bottoms

Abstract:

Accurate, complete, and up-to-date geospatial information is the foundation of successful disaster management. When the 2010 Haiti earthquake struck, accurate and timely information on the distribution of critical infrastructure was essential for the disaster response community and for effective search and rescue operations. Existing geospatial datasets such as Google Maps did not have comprehensive coverage of these features. In the days following the earthquake, many organizations released high-resolution satellite imagery, catalyzing a worldwide effort to map Haiti and support the recovery operations. OpenStreetMap (OSM), a collaborative project to create a free editable map of the world, used this imagery to help volunteers digitize roads, buildings, and other features, creating the most detailed map of Haiti in existence in just a few weeks. However, large portions of the island are still not fully covered by OSM, and there is an increasing need for a tool that automatically identifies which areas in Haiti, as well as in other countries vulnerable to disasters, are not fully mapped. The objective of this project is to leverage different types of remote sensing measurements, together with machine learning approaches, to identify geographical areas where OSM coverage of building footprints is incomplete. Several remote sensing measures and derived products were assessed as potential predictors of OSM building footprint coverage, including: intensity of light emitted at night (based on VIIRS measurements); spectral indices derived from the Sentinel-2 satellite (normalized difference vegetation index (NDVI), normalized difference built-up index (NDBI), soil-adjusted vegetation index (SAVI) and urban index (UI)); surface texture (based on Sentinel-1 SAR measurements); and elevation and slope. Additional remote sensing derived products, such as Hansen Global Forest Change, DLR's Global Urban Footprint (GUF) and the World Settlement Footprint (WSF), were also evaluated as predictors, as was the OSM street and road network (including junctions). A supervised classification with a random forest classifier predicted 89% of the variation in OSM building footprint area in a given cell. These predictions allowed for the identification of cells that are predicted to contain buildings but are not yet mapped. With these results, the methodology could be adapted to any location to assist in preparing for future disastrous events and to ensure that essential geospatial information is available to support response and recovery efforts during and following major disasters.
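The final identification step can be sketched as follows: once the model has predicted the expected building footprint per grid cell, flag cells where the prediction far exceeds what OSM has mapped. The thresholds and field names below are illustrative assumptions, not values from the study (which used a random forest for the prediction itself):

```python
# Hypothetical coverage-gap screening over per-cell predictions.
def find_unmapped_cells(cells, min_predicted=1000.0, max_mapped_ratio=0.25):
    """Return ids of cells with substantial predicted building footprint
    (>= min_predicted square metres) but little OSM-mapped footprint
    (< max_mapped_ratio of the prediction)."""
    gaps = []
    for cell in cells:
        predicted = cell["predicted_m2"]
        mapped = cell["osm_mapped_m2"]
        if predicted >= min_predicted and mapped < max_mapped_ratio * predicted:
            gaps.append(cell["id"])
    return gaps

cells = [
    {"id": "A1", "predicted_m2": 5000.0, "osm_mapped_m2": 4800.0},  # well mapped
    {"id": "A2", "predicted_m2": 4200.0, "osm_mapped_m2": 300.0},   # likely gap
    {"id": "A3", "predicted_m2": 200.0,  "osm_mapped_m2": 0.0},     # little built-up
]
print(find_unmapped_cells(cells))  # -> ['A2']
```

Cells flagged this way would be candidates for directed mapping campaigns before a disaster strikes.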

Keywords: disaster management, Haiti, machine learning, OpenStreetMap, remote sensing

Procedia PDF Downloads 101