Search results for: learning emotion
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7065

7065 Deep-Learning Based Approach to Facial Emotion Recognition through Convolutional Neural Network

Authors: Nouha Khediri, Mohammed Ben Ammar, Monji Kherallah

Abstract:

Recently, facial emotion recognition (FER) has become increasingly essential to understanding the state of the human mind. Accurately classifying emotion from the face is a challenging task. In this paper, we present a facial emotion recognition approach named CV-FER that benefits from deep learning, especially CNN and VGG16. First, the data are pre-processed with data cleaning and rotation. Then, we augment the data and feed it to our FER model, which contains five convolution layers and five pooling layers. Finally, a softmax classifier is used in the output layer to recognize emotions. The paper also reviews prior work on facial emotion recognition based on deep learning. Experiments show that our model outperforms other methods on the same FER2013 database, yielding a recognition rate of 92%. We also put forward some suggestions for future work.
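The softmax output layer the abstract describes can be sketched in a few lines of NumPy; the logit values and the seven-class FER2013 label order in the comment are illustrative, not taken from the paper:

```python
import numpy as np

def softmax(logits):
    """Softmax over the last axis, shifted by the max for numerical stability."""
    z = logits - np.max(logits, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical logits for the 7 FER2013 emotion classes
# (angry, disgust, fear, happy, sad, surprise, neutral).
logits = np.array([1.2, -0.3, 0.4, 3.1, 0.0, 0.9, 1.5])
probs = softmax(logits)
predicted = int(np.argmax(probs))  # index of the recognized emotion
```

The max-shift does not change the result but avoids overflow for large logits.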

Keywords: CNN, deep-learning, facial emotion recognition, machine learning

Procedia PDF Downloads 51
7064 The Relationships among Learning Emotion, Major Satisfaction, Learning Flow, and Academic Achievement in Medical School Students

Authors: S. J. Yune, S. Y. Lee, S. J. Im, B. S. Kam, S. Y. Baek

Abstract:

This study explored whether academic emotion, major satisfaction, and learning flow are associated with academic achievement in medical school. Emotion and affective factors are known to be important in students' learning and performance. Emotion has taken the stage in much of the contemporary educational psychology literature, no longer relegated to secondary status behind traditionally studied cognitive constructs. Medical school students (n=164) completed an online survey on academic emotion, major satisfaction, and learning flow. Academic performance was operationalized as students' average grade on two semester exams. For data analysis, correlation analysis, multiple regression analysis, hierarchical multiple regression analyses, and ANOVA were conducted. The results largely confirmed the hypothesized relations among academic emotion, major satisfaction, learning flow, and academic achievement. Positive academic emotion was correlated with academic achievement (β=.191) and had 8.5% explanatory power for achievement. In particular, sense of accomplishment had a significant impact on learning performance (β=.265). On the other hand, negative emotion, major satisfaction, and learning flow did not affect academic performance. There were also differences by grade in sense of great (F=5.446, p=.001) and interest (F=2.78, p=.043) among the positive emotions, and in boredom (F=3.55, p=.016), anger (F=4.346, p=.006), and petulance (F=3.779, p=.012) among the negative emotions. This study suggests that medical students' positive emotion is an important contributor to their academic achievement. At the same time, it is important to consider that some negative emotions can act to increase one's motivation. Of particular importance is the notion that instructors can and should create learning environments that foster positive emotion in students. In doing so, instructors improve their chances of positively impacting students' achievement emotions, as well as their subsequent motivation, learning, and performance. This result has implications for medical educators striving to understand the personal emotional factors that influence learning and performance in medical training.
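The "explanatory power" figure comes from hierarchical regression: the increase in R² when a predictor is added to a baseline model. A minimal sketch of that incremental-R² logic with synthetic data (not the study's data; effect size and sample are invented) might look like:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit of y on X (X should include an intercept column)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return 1.0 - ss_res / ss_tot

rng = np.random.default_rng(0)
n = 164                                            # sample size from the study
emotion = rng.normal(size=n)                       # synthetic emotion scores
achievement = 0.3 * emotion + rng.normal(size=n)   # synthetic grades

ones = np.ones((n, 1))
r2_base = r_squared(ones, achievement)             # intercept-only model: 0
r2_full = r_squared(np.hstack([ones, emotion[:, None]]), achievement)
delta_r2 = r2_full - r2_base                       # variance explained by emotion
```

In the study, the analogous ΔR² for positive emotion was 8.5%.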

Keywords: academic achievement, learning emotion, learning flow, major satisfaction

Procedia PDF Downloads 231
7063 Multimodal Characterization of Emotion within Multimedia Space

Authors: Dayo Samuel Banjo, Connice Trimmingham, Niloofar Yousefi, Nitin Agarwal

Abstract:

Technological advancement and its omnipresent connectivity have pushed humans past the boundaries and limitations of a computer screen, physical state, or geographical location. It has provided a depth of avenues that facilitate human-computer interaction that was once inconceivable, such as audio and body language detection. Given the complex modalities of emotion, it becomes vital to study human-computer interaction, as it is the starting point for a thorough understanding of the emotional state of users and, in the context of social networks, of the producers of multimodal information. This study first confirms the higher classification accuracy of multimodal emotion detection systems compared to unimodal solutions. Second, it explores the characterization of multimedia content according to the emotions it expresses, and the coherence of emotion across modalities, by utilizing deep learning models to classify emotion in each modality.
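The multimodal-over-unimodal gain the study examines is often realized with late fusion: averaging the class probabilities produced by per-modality classifiers. A minimal sketch (the label set, probabilities, and equal weights below are invented for illustration, not the study's models):

```python
import numpy as np

EMOTIONS = ["anger", "joy", "sadness", "surprise"]  # illustrative label set

def late_fusion(modality_probs, weights=None):
    """Weighted average of per-modality class-probability vectors."""
    probs = np.asarray(modality_probs, dtype=float)
    w = np.ones(len(probs)) if weights is None else np.asarray(weights, float)
    return (w[:, None] * probs).sum(axis=0) / w.sum()

text_p  = [0.10, 0.70, 0.10, 0.10]   # hypothetical text classifier output
image_p = [0.05, 0.55, 0.25, 0.15]   # hypothetical image classifier output
audio_p = [0.20, 0.40, 0.30, 0.10]   # hypothetical audio classifier output

fused = late_fusion([text_p, image_p, audio_p])
label = EMOTIONS[int(np.argmax(fused))]
```

A modality that is noisy on its own can still sharpen the fused decision when the other modalities agree.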

Keywords: affective computing, deep learning, emotion recognition, multimodal

Procedia PDF Downloads 100
7062 Emotion Oriented Students' Opinioned Topic Detection for Course Reviews in Massive Open Online Course

Authors: Zhi Liu, Xian Peng, Monika Domanska, Lingyun Kang, Sannyuya Liu

Abstract:

Massive open online education has become increasingly popular among learners worldwide. An increasing number of course reviews are generated on Massive Open Online Course (MOOC) platforms, which offer an interactive feedback channel for learners to express opinions and feelings about their learning. These reviews typically contain subjective emotion and topic information about the courses; however, detecting these opinions manually is time-consuming. In this paper, we propose an emotion-oriented topic detection model to automatically detect the aspects students express opinions about in course reviews. The known overall emotion orientation and the emotional words in each review are used to guide the joint probabilistic modeling of emotion and aspects. Experiments on real-life review data verify that the course-emotion-aspect distribution can be computed to capture the most significant opinioned topics in each course unit. The proposed technique supports intelligent learning analytics, helping teachers improve pedagogies and developers improve the user experience.
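The paper's model is a learned joint probabilistic model; as a much simpler illustration of how emotional words can guide opinion detection in a review, consider a toy lexicon-based scorer (the lexicon and the decision rule are invented for this sketch):

```python
# Toy emotion lexicon -- the paper instead learns emotion/aspect
# distributions jointly from the review corpus.
POSITIVE = {"great", "clear", "helpful", "love"}
NEGATIVE = {"boring", "confusing", "hard", "hate"}

def review_orientation(review):
    """Return (orientation, opinion_words) for a single course review."""
    words = review.lower().replace(",", " ").replace(".", " ").split()
    pos = [w for w in words if w in POSITIVE]
    neg = [w for w in words if w in NEGATIVE]
    if len(pos) > len(neg):
        orientation = "positive"
    elif len(neg) > len(pos):
        orientation = "negative"
    else:
        orientation = "neutral"
    return orientation, pos + neg

orientation, cues = review_orientation(
    "The lectures are great but the quizzes are confusing and hard.")
```

The emotional cue words extracted this way are the kind of signal the model uses to anchor its emotion-aspect distributions.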

Keywords: Massive Open Online Course (MOOC), course reviews, topic model, emotion recognition, topical aspects

Procedia PDF Downloads 231
7061 Comparing Emotion Recognition from Voice and Facial Data Using Time Invariant Features

Authors: Vesna Kirandziska, Nevena Ackovska, Ana Madevska Bogdanova

Abstract:

Emotion recognition is a challenging problem that remains open from the perspective of both intelligent systems and psychology. In this paper, both voice features and facial features are used to build an emotion recognition system. Support Vector Machine classifiers are built using raw data from video recordings. The results obtained for emotion recognition are presented, together with a discussion of the validity and expressiveness of different emotions. Classifiers built from facial data only, from voice data only, and from the combination of both are compared, and the need for a better combination of the information from facial expressions and voice data is argued.
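The "combination of both" variant typically means feature-level fusion: concatenating each sample's voice and face feature vectors before training one classifier. A small sketch with synthetic 2-D features and a nearest-centroid stand-in (the paper trains SVMs; the data and labels here are invented):

```python
import numpy as np

def fuse(voice_feat, face_feat):
    """Feature-level fusion: concatenate voice and face vectors per sample."""
    return np.hstack([voice_feat, face_feat])

def nearest_centroid_predict(X_train, y_train, x):
    """Simple stand-in classifier (the paper uses Support Vector Machines)."""
    labels = sorted(set(y_train))
    centroids = {c: X_train[np.array(y_train) == c].mean(axis=0) for c in labels}
    return min(labels, key=lambda c: np.linalg.norm(x - centroids[c]))

# Tiny synthetic example: 2-D voice features + 2-D face features per sample
voice = np.array([[0.0, 0.1], [0.1, 0.0], [1.0, 0.9], [0.9, 1.0]])
face  = np.array([[0.0, 0.0], [0.1, 0.1], [1.0, 1.0], [0.9, 0.9]])
y = ["calm", "calm", "angry", "angry"]

X = fuse(voice, face)  # shape (4, 4): one fused vector per sample
pred = nearest_centroid_predict(
    X, y, fuse(np.array([0.05, 0.05]), np.array([0.0, 0.1])))
```

The fused vectors let one classifier weigh both modalities at once, which is where the combined classifier's advantage comes from.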

Keywords: emotion recognition, facial recognition, signal processing, machine learning

Procedia PDF Downloads 279
7060 Facial Emotion Recognition Using Deep Learning

Authors: Ashutosh Mishra, Nikhil Goyal

Abstract:

A 3D facial emotion recognition model based on deep learning is proposed in this paper. Two convolution layers and a pooling layer are employed in the deep learning architecture, with pooling performed after the convolutions. The probabilities for the various classes of human faces are calculated using the sigmoid activation function. The Kaggle dataset is used to verify the accuracy of the deep learning-based face recognition model. The model's accuracy is about 65 percent, which is lower than that of other facial expression recognition techniques, despite the significant gains in representation precision afforded by the nonlinearity of deep image representations.
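The sigmoid activation the abstract mentions maps each class score independently into (0, 1); unlike softmax, the resulting per-class probabilities need not sum to one. A minimal sketch (the scores are illustrative):

```python
import numpy as np

def sigmoid(x):
    """Logistic activation mapping real-valued scores to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

scores = np.array([-2.0, 0.0, 3.0])  # hypothetical per-class scores
probs = sigmoid(scores)              # independent per-class probabilities
```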

Keywords: facial recognition, computational intelligence, convolutional neural network, depth map

Procedia PDF Downloads 183
7059 Color-Based Emotion Regulation Model: An Affective E-Learning Environment

Authors: Sabahat Nadeem, Farman Ali Khan

Abstract:

Emotions are considered a vital factor affecting information handling, level of attention, memory capacity, and decision making. The latest e-learning systems therefore take the affective state of learners into consideration to make the learning process more effective and enjoyable. One such use of users' affective information is in systems that regulate users' emotions toward a state optimally desirable for learning. So far, this objective has been pursued with the help of teaching strategies, background music, guided imagery, video clips, and odors. Nevertheless, we know that colors can affect human emotions: the relationship between color and emotion strongly influences how we perceive our environment, and the colors of an interface can likewise affect the user positively as well as negatively. This affective behavior of color, and its use as an emotion regulation agent, has not yet been exploited. This research therefore proposes a Color-based Emotion Regulation Model (CERM), a new framework that can automatically adapt its colors according to the user's emotional state and personality type and can help produce a desirable emotional effect, aiming to provide unobtrusive emotional support to users of an e-learning environment. CERM is evaluated by comparing it with a classical non-adaptive, statically colored learning management system. Results indicate that the colors of the interface, when carefully selected, have a significant positive impact on learners' emotions.
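The core adaptation loop (emotional state in, interface palette out) can be sketched with a hypothetical mapping; CERM's actual mapping also conditions on personality type and is not specified in the abstract, so the states, palettes, and rules below are invented:

```python
# Hypothetical emotion -> interface palette mapping (not CERM's actual rules).
CALMING = {"hex": "#AEC6CF", "name": "soft blue"}
ENERGIZING = {"hex": "#FFD166", "name": "warm yellow"}
NEUTRAL = {"hex": "#F5F5F5", "name": "light gray"}

def pick_palette(emotional_state):
    """Choose an interface palette intended to nudge the learner's state."""
    if emotional_state in {"anxious", "frustrated", "angry"}:
        return CALMING      # de-arouse a negative high-arousal state
    if emotional_state in {"bored", "tired"}:
        return ENERGIZING   # raise arousal for a low-energy state
    return NEUTRAL          # leave a workable state alone

palette = pick_palette("bored")
```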

Keywords: affective learning, e-learning, emotion regulation, emotional design

Procedia PDF Downloads 269
7058 An Investigation of the Effectiveness of Emotion Regulation Training on the Reduction of Cognitive-Emotion Regulation Problems in Patients with Multiple Sclerosis

Authors: Mahboobeh Sadeghi, Zahra Izadi Khah, Mansour Hakim Javadi, Masoud Gholamali Lavasani

Abstract:

Background: Since there is a relation between psychological and physiological factors, the aim of this study was to examine the effect of emotion regulation training on cognitive emotion regulation problems in patients with Multiple Sclerosis (MS). Method: In a randomized clinical trial, thirty patients diagnosed with Multiple Sclerosis and referred to the state welfare organization were selected. The sample was randomized into either an experimental group or a non-intervention control group. The subjects participated in 75-minute treatment sessions held three times a week for 4 weeks (12 sessions). All 30 individuals were administered the Cognitive Emotion Regulation Questionnaire (CERQ), which they completed at pretest and post-test. The data were analyzed using MANCOVA. Results: Emotion regulation training significantly decreased cognitive emotion regulation problems in patients with Multiple Sclerosis (p < 0.001). Conclusions: Emotion regulation training can be used to treat cognitive emotion regulation problems in Multiple Sclerosis.

Keywords: Multiple Sclerosis, cognitive-emotion regulation, emotion regulation, MS

Procedia PDF Downloads 416
7057 Generating Music with More Refined Emotions

Authors: Shao-Di Feng, Von-Wun Soo

Abstract:

Generating symbolic music with specific emotions is a challenging task because symbolic music datasets with emotion labels are scarce and incomplete. This research aims to generate music with more refined emotions from training datasets labeled only with the four quadrants of Russell's 2D emotion model. We build on the theory of Music FaderNets, mapping arousal and valence to low-level attributes, and construct a symbolic music generation model by combining a transformer with a GM-VAE. We adopt an in-attention mechanism for the model and improve it by allowing modulation by conditional information. We show that the model can control music generation according to the emotions users specify as high-level linguistic expressions, by manipulating the corresponding low-level musical attributes. Finally, we evaluate model performance using a pre-trained emotion classifier on EMOPIA, a pop piano MIDI dataset, and through subjective listening evaluation we demonstrate that the model can correctly generate music with more refined emotions.
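The valence/arousal-to-attribute mapping is learned in the paper (via GM-VAE "faders"); purely as an illustration of the idea, a hand-written mapping from quadrant coordinates to low-level attributes might look like this (the attribute formulas are invented):

```python
def attribute_faders(valence, arousal):
    """Map Russell-model coordinates in [-1, 1] to illustrative low-level
    musical attributes. The paper learns this mapping; these rules are
    hypothetical stand-ins."""
    tempo_bpm = 90 + 50 * arousal               # higher arousal -> faster
    mode = "major" if valence >= 0 else "minor" # positive valence -> major
    note_density = 4 + 4 * (arousal + 1) / 2    # notes per bar band, 4..8
    return {"tempo_bpm": round(tempo_bpm), "mode": mode,
            "note_density": round(note_density, 1)}

happy = attribute_faders(valence=0.8, arousal=0.6)   # Q1: high V, high A
sad = attribute_faders(valence=-0.7, arousal=-0.5)   # Q3: low V, low A
```

Refining emotions then amounts to moving continuously within a quadrant rather than picking one of four discrete labels.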

Keywords: music generation, music emotion controlling, deep learning, semi-supervised learning

Procedia PDF Downloads 49
7056 Individualized Emotion Recognition Through Dual Representations and Group-Established Ground Truth

Authors: Valentina Zhang

Abstract:

While facial expression is a complex and individualized behavior, all facial emotion recognition (FER) systems known to us rely on a single facial representation and are trained on universal data. We conjecture that: (i) different facial representations can provide different, sometimes complementary views of emotions; (ii) when employed collectively in a discussion group setting, they enable more accurate emotion reading, which is highly desirable in autism care and other application contexts sensitive to errors. In this paper, we first study FER using pixel-based deep learning vs. semantics-based deep learning in the context of deepfake videos. Our experiment indicates that while the semantics-trained model performs better on articulated facial feature changes, the pixel-trained model outperforms it on subtle or rare facial expressions. Armed with these findings, we have constructed an adaptive FER system that learns from both types of models for dyadic or small interacting groups, and that further leverages the synthesized group emotions as the ground truth for individualized FER training. Using a collection of group conversation videos, we demonstrate that FER accuracy and personalization can benefit from such an approach.

Keywords: neurodivergence care, facial emotion recognition, deep learning, ground truth for supervised learning

Procedia PDF Downloads 93
7055 Correlation between Speech Emotion Recognition Deep Learning Models and Noises

Authors: Leah Lee

Abstract:

This paper examines the correlation between deep learning models and emotions under noise, to see whether or not noise masks emotions. The deep learning models used are plain convolutional neural networks (CNN), an auto-encoder, long short-term memory (LSTM), and Visual Geometry Group-16 (VGG-16). The emotion datasets used are the Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS), the Crowd-sourced Emotional Multimodal Actors Dataset (CREMA-D), the Toronto Emotional Speech Set (TESS), and Surrey Audio-Visual Expressed Emotion (SAVEE). To make the data four times bigger, the original audio files are combined with stretch and pitch augmentations. From the augmented datasets, five different features are extracted as inputs to the models, and eight different emotions are classified. The noise variations are white noise, dog barking, and cough sounds, at signal-to-noise ratios (SNR) of 0, 20, and 40. In sum, for each deep learning model, nine different sets with noise and SNR variations, plus the augmented audio files without any noise, are used in the experiment. To compare the deep learning models, accuracy and the receiver operating characteristic (ROC) are checked.
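Mixing a noise source into speech at a target SNR, as in the 0/20/40 dB conditions above, amounts to scaling the noise so the speech-to-noise power ratio matches the target. A sketch with a toy sine-wave "speech" signal (the signals are synthetic; the paper's exact mixing procedure is not specified):

```python
import numpy as np

def mix_at_snr(speech, noise, snr_db):
    """Scale `noise` so the mixture has the requested SNR in dB, then add it."""
    p_speech = np.mean(speech ** 2)
    p_noise = np.mean(noise ** 2)
    target_p_noise = p_speech / (10 ** (snr_db / 10))
    scaled = noise * np.sqrt(target_p_noise / p_noise)
    return speech + scaled

rng = np.random.default_rng(1)
speech = np.sin(2 * np.pi * 220 * np.arange(16000) / 16000)  # 1 s toy tone
noise = rng.normal(size=16000)                               # white noise

mix = mix_at_snr(speech, noise, snr_db=20)
```

At SNR 0 the noise power equals the speech power, which is where masking effects on emotion cues should be strongest.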

Keywords: auto-encoder, convolutional neural networks, long short-term memory, speech emotion recognition, visual geometry group-16

Procedia PDF Downloads 30
7054 Emotion Recognition in Video and Images in the Wild

Authors: Faizan Tariq, Moayid Ali Zaidi

Abstract:

Facial emotion recognition algorithms are expanding rapidly nowadays, with different algorithms combined in different ways to generate the best results. Six basic emotions are studied in this area. We tried to recognize facial expressions using object detection algorithms instead of traditional ones, choosing two object detectors: Faster R-CNN and YOLO. For pre-processing, we used image rotation and batch normalization. The dataset chosen for the experiments is Static Facial Expressions in the Wild (SFEW). Our approach worked well, but there is still a lot of room for improvement, which we leave as future work.

Keywords: face recognition, emotion recognition, deep learning, CNN

Procedia PDF Downloads 146
7053 Intrinsic Motivational Factor of Students in Learning Mathematics and Science Based on Electroencephalogram Signals

Authors: Norzaliza Md. Nor, Sh-Hussain Salleh, Mahyar Hamedi, Hadrina Hussain, Wahab Abdul Rahman

Abstract:

Motivation is mainly the students' desire to engage in the learning process; however, it also depends on the goal behind their involvement or non-involvement in academic activity. Even when students' motivation is at the same level, its basis may differ. This study focuses on the intrinsic motivational factor, in which students enjoy learning or feel a sense of accomplishment in an activity or study pursued for its own sake. Intrinsic motivation in learning mathematics and science has been found difficult to achieve because it depends on students' interest. In the Programme for International Student Assessment (PISA) for mathematics and science, Malaysia is ranked third lowest. The main problem in the Malaysian educational system is that students tend to be extrinsically motivated: they must score well in exams in order to achieve a good result and enroll as university students. The use of electroencephalogram (EEG) signals to identify students' intrinsic motivation in learning science and mathematics has been scarce. In this study, we identify the correlation between a precursor emotion and its dynamic emotion to verify students' intrinsic motivation in learning mathematics and science. The 2-D Affective Space Model (ASM) is used to relate the precursor emotion and its dynamic emotion based on four basic emotions, happy, calm, fear, and sad, which serve as reference stimuli. An EEG device is used to capture the brain waves, Mel Frequency Cepstral Coefficients (MFCC) are adopted for feature extraction, and the features are fed to a Multilayer Perceptron (MLP) to classify the valence and arousal axes of the ASM. The results show that the precursor emotion influenced the dynamic emotions, and that most students have no interest in mathematics and science, judging by the negative emotions (sad and fear) appearing in the EEG signals. We hope these results can help us further relate students' behavior and intrinsic motivation toward learning mathematics and science.
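Once the MLP has placed an EEG epoch on the valence and arousal axes, mapping that point to one of the four reference emotions is a quadrant lookup on the 2-D ASM. A sketch, assuming the conventional axis signs (positive valence right, positive arousal up):

```python
def asm_quadrant(valence, arousal):
    """Classify a point on the 2-D Affective Space Model into one of the
    study's four reference emotions. Axis sign conventions are assumed."""
    if valence >= 0:
        return "happy" if arousal >= 0 else "calm"
    return "fear" if arousal >= 0 else "sad"

# e.g. a classifier's (valence, arousal) output for one EEG epoch
emotion = asm_quadrant(valence=-0.4, arousal=-0.6)
```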

Keywords: EEG, MLP, MFCC, intrinsic motivational factor

Procedia PDF Downloads 325
7052 The Effectiveness of Dialectical Behavior Therapy in Developing Emotion Regulation Skill for Adolescent with Intellectual Disability

Authors: Shahnaz Safitri, Rose Mini Agoes Salim, Pratiwi Widyasari

Abstract:

Intellectual disability is characterized by significant limitations in intellectual functioning and adaptive behavior that appear before the age of 18. Its most prominent impacts in adolescents are failure to establish interpersonal relationships as socially expected and lower academic achievement. Meanwhile, emotion regulation skills are known to support individual functioning, both by nourishing the development of social skills and by facilitating learning and adaptation in school. This study examines the effectiveness of Dialectical Behavior Therapy (DBT) in developing emotion regulation skills in adolescents with intellectual disability. DBT's special consideration of clients' social environment and biological condition is foreseen to be key to developing emotion regulation capacity in subjects with intellectual disability. Observations of the client's behavior, conducted before and after completion of the DBT intervention program, found an improvement in the client's knowledge of and attitudes toward the mastery of emotion regulation skills. In addition, the client's consistency in actually practicing emotion regulation techniques over time was largely influenced by the support received from the client's social circles.

Keywords: adolescent, dialectical behavior therapy, emotion regulation, intellectual disability

Procedia PDF Downloads 262
7051 A Neuroscience-Based Learning Technique: Framework and Application to STEM

Authors: Dante J. Dorantes-González, Aldrin Balsa-Yepes

Abstract:

Existing learning techniques such as problem-based learning, project-based learning, and case-study learning focus mainly on technical details but give no specific guidelines on the learner's experience or on emotional learning aspects such as arousal salience and valence, even though emotional states are important factors affecting engagement and retention. Some approaches involving emotion in educational settings, such as social and emotional learning, lack neuroscientific rigor and make no use of specific neurobiological mechanisms; neurobiological approaches, in turn, lack educational applicability; and educational approaches mainly focus on cognitive aspects and disregard conditioning-based learning. The authors first explain why it is hard to learn thoughtfully, then use neurobiological mapping to track the main limbic system functions, such as the reward circuit, and their relations with perception, memories, motivations, sympathetic and parasympathetic reactions, and sensations, as well as the brain cortex. They conclude by explaining the major finding: the mechanisms of nonconscious learning and the triggers that guarantee long-term memory potentiation. Afterward, an educational framework for practical application and guidelines for instructors are established. An implementation example in engineering education is given, namely the study of tuned mass dampers for attenuating earthquake oscillations in skyscrapers. This work represents an original learning technique based on nonconscious learning mechanisms that enhances long-term memories and complements existing cognitive learning methods.

Keywords: emotion, emotion-enhanced memory, learning technique, STEM

Procedia PDF Downloads 55
7050 Tourist Emotion, Creative Experience and Behavioral Intention in Creative Tourism

Authors: Yi-Ju Lee

Abstract:

This study tested the hypothesized relationships among tourist emotion, creative experience, and behavioral intention in the context of handmade ancient candy in Tainan, Taiwan. A face-to-face questionnaire survey was administered in Anping, Tainan. The results revealed significant positive relationships between emotion, creative experience, and behavioral intention in handmade activities. The paper offers suggestions for enhancing behavioral intention and guidance for creative tourism.

Keywords: creative tourism, sense of achievement, unique learning, interaction with instructors

Procedia PDF Downloads 296
7049 Parental Bonding and Cognitive Emotion Regulation

Authors: Fariea Bakul, Chhanda Karmaker

Abstract:

The present study investigated the effects of parental bonding on adults' cognitive emotion regulation, as well as gender differences in parental bonding and cognitive emotion regulation. Data were collected using a convenience sampling technique from 100 adult students (50 males and 50 females) of different universities in Dhaka city, aged 20 to 25 years, using the Bengali versions of the Parental Bonding Inventory and the Cognitive Emotion Regulation Questionnaire. The data were analyzed using multiple regression analysis and independent-samples t-tests. The results revealed that only father's care (β=0.317, p < 0.05) was significantly positively associated with adults' cognitive emotion regulation. Adjusted R² indicated that the model explained 30% of the variance in adults' adaptive cognitive emotion regulation. No significant association was found between parental bonding and less adaptive cognitive emotion regulation strategies. The independent-samples t-tests revealed no significant gender differences in either parental bonding or cognitive emotion regulation.

Keywords: cognitive emotion regulation, parental bonding, parental care, parental over-protection

Procedia PDF Downloads 328
7048 Facial Emotion Recognition with Convolutional Neural Network Based Architecture

Authors: Koray U. Erbas

Abstract:

Neural networks are appealing for many applications because they can learn complex non-linear relationships between input and output data. As the number of neurons and layers in a neural network increases, more complex relationships can be represented with automatically extracted features. Nowadays, Deep Neural Networks (DNNs) are widely used for computer vision problems such as classification, object detection, segmentation, and image editing. In this work, facial emotion recognition is performed by the proposed Convolutional Neural Network (CNN)-based DNN architecture on the FER2013 dataset. Moreover, the effects of different hyperparameters (activation function, kernel size, initializer, batch size, and network size) are investigated, and ablation study results for the pooling layer, dropout, and batch normalization are presented.
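A hyperparameter study like the one described is often organized as a grid: every combination of the candidate values is trained once and compared. A sketch of the bookkeeping (the candidate values below are illustrative; the paper does not list its exact search space):

```python
from itertools import product

# Illustrative search space -- not the paper's actual candidate values.
grid = {
    "activation": ["relu", "elu"],
    "kernel_size": [3, 5],
    "initializer": ["glorot_uniform", "he_normal"],
    "batch_size": [32, 64],
}

# One dict per combination; each would be trained once on FER2013
# and the runs compared on validation accuracy.
configs = [dict(zip(grid, values)) for values in product(*grid.values())]
n_runs = len(configs)  # 2 * 2 * 2 * 2 = 16 training runs
```

Ablations (removing pooling, dropout, or batch normalization one at a time) are then just extra rows in the same comparison table.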

Keywords: convolutional neural network, deep learning, deep learning based FER, facial emotion recognition

Procedia PDF Downloads 215
7047 Age Related Changes in the Neural Substrates of Emotion Regulation: Mechanisms, Consequences, and Interventions

Authors: Yasaman Mohammadi

Abstract:

Emotion regulation is a complex process that allows individuals to manage and modulate their emotional responses in order to adaptively respond to environmental demands. As individuals age, emotion regulation abilities may decline, leading to an increased vulnerability to mood disorders and other negative health outcomes. Advances in neuroimaging techniques have greatly enhanced our understanding of the neural substrates underlying emotion regulation and age-related changes in these neural systems. Additionally, genetic research has identified several candidate genes that may influence age-related changes in emotion regulation. In this paper, we review recent findings from neuroimaging and genetic research on age-related changes in the neural substrates of emotion regulation, highlighting the mechanisms and consequences of these changes. We also discuss potential interventions, including cognitive and behavioral approaches, that may be effective in mitigating age-related declines in emotion regulation. We propose that a better understanding of the mechanisms underlying age-related changes in emotion regulation may lead to the development of more targeted interventions aimed at promoting healthy emotional functioning in older adults. Overall, this paper highlights the importance of studying age-related changes in emotion regulation and provides a roadmap for future research in this field.

Keywords: emotion regulation, aging, neural substrates, neuroimaging, emotional functioning, healthy aging

Procedia PDF Downloads 63
7046 Emotion Recognition Using Artificial Intelligence

Authors: Rahul Mohite, Lahcen Ouarbya

Abstract:

This paper focuses on the interplay between humans and computer systems and the ability of these systems to understand and respond to human emotions, including non-verbal communication. Current emotion recognition systems are based solely on either facial or verbal expressions; their limitation is that they require large training data sets. This paper proposes a system for recognizing human emotions that combines both speech and facial emotion recognition. The system utilizes advanced techniques such as deep learning and image recognition to identify facial expressions and comprehend emotions. The results show that the proposed system, based on the combination of facial expression and speech, outperforms existing ones based solely on either facial or verbal expressions: it detects human emotion with an accuracy of 86%, whereas the existing systems reach 70% using verbal expression only and 76% using facial expression only. The increasing significance of and demand for facial recognition technology in emotion recognition are also discussed.

Keywords: facial recognition, expression recognition, deep learning, image recognition, facial technology, signal processing, image classification

Procedia PDF Downloads 62
7045 Pattern Discovery from Student Feedback: Identifying Factors to Improve Student Emotions in Learning

Authors: Angelina A. Tzacheva, Jaishree Ranganathan

Abstract:

Interest in Science, Technology, Engineering, and Mathematics (STEM) education, especially Computer Science education, has increased drastically across the country. This fuels efforts toward recruiting and admitting a diverse population of students. The changing conditions in terms of student population, diversity, and expected teaching and learning outcomes thus provide a platform for innovative teaching models and technologies. The methods adopted should also raise the quality of such innovations and have a positive impact on student learning. The Light-Weight Team is an active learning pedagogy considered a low-stakes activity, with little or no direct impact on student grades. Emotion plays a major role in students' motivation to learn. In this work, we use student feedback data with emotion classification, gathered through surveys at a public research institution in the United States, and apply an actionable pattern discovery method. Actionable patterns are patterns that provide suggestions, in the form of rules, to help the user achieve better outcomes. The proposed method provides meaningful insight into changes that can be incorporated in the Light-Weight Team activities and the resources utilized in the course. The results suggest how to shift student emotions to a more positive state, focusing in particular on the emotions 'Trust' and 'Joy'.

Keywords: actionable pattern discovery, education, emotion, data mining

Procedia PDF Downloads 57
7044 Job Characteristics, Emotion Regulation and University Teachers' Well-Being: A Job Demands-Resources Analysis

Authors: Jiying Han

Abstract:

Teaching is widely known to be an emotional endeavor, and teachers’ ability to regulate their emotions is important for their well-being and the effectiveness of their classroom management. Considering that teachers’ emotion regulation is an underexplored issue in the field of educational research, some studies have attempted to explore the role of emotion regulation in teachers’ work and to explore the links between teachers’ emotion regulation, job characteristics, and well-being, based on the Job Demands-Resources (JD-R) model. However, those studies targeted primary or secondary teachers. So far, very little is known about the relationships between university teachers’ emotion regulation and its antecedents and effects on teacher well-being. Based on the job demands-resources model and emotion regulation theory, this study examined the relationships between job characteristics of university teaching (i.e., emotional job demands and teaching support), emotion regulation strategies (i.e., reappraisal and suppression), and university teachers’ well-being. Data collected from a questionnaire survey of 643 university teachers in China were analysed. The results indicated that (1) both emotional job demands and teaching support had desirable effects on university teachers’ well-being; (2) both emotional job demands and teaching support facilitated university teachers’ use of reappraisal strategies; and (3) reappraisal was beneficial to university teachers’ well-being, whereas suppression was harmful. These findings support the applicability of the job demands-resources model to the contexts of higher education and highlight the mediating role of emotion regulation.

Keywords: emotional job demands, teaching support, emotion regulation strategies, the job demands-resources model

Procedia PDF Downloads 114
7043 A Systematic Review of Emotion Regulation through Music in Children, Adults, and the Elderly

Authors: Fabiana Ribeiro, Ana Moreno, Antonio Oliveira, Patricia Oliveira-Silva

Abstract:

Music is present in our daily lives and is often used to change the emotions of its listeners. The objective of this study was therefore to explore and synthesize results examining the use and effects of music on emotion regulation in children, adults, and the elderly, and to clarify whether music is effective across ages in promoting emotion regulation. A literature search was conducted using ISI Web of Knowledge, PubMed, PsycINFO, and Scopus; the inclusion criteria comprised children, adolescents, and young and older adults, including healthy populations. Articles applying a musical intervention, specifically music listening, and assessing emotion regulation directly through reports or neurophysiological measures were included in this review. The results showed age differences in the function of music listening: adolescents showed age-related increments in emotional listening compared to children, as did young adults in comparison to older adults; the former use music for emotion regulation and social connection, while older adults also use music for emotion regulation in the pursuit of personal growth. Moreover, some of the studies showed that personal characteristics also determine the efficiency of the emotion regulation strategy. In conclusion, music can benefit all the ages investigated; however, this review detected a need to develop adequate paradigms for exploring the use of music in emotion regulation.

Keywords: music, emotion, regulation, musical listening

Procedia PDF Downloads 129
7042 Development of an EEG-Based Real-Time Emotion Recognition System on Edge AI

Authors: James Rigor Camacho, Wansu Lim

Abstract:

Over the last few years, the development of new wearable and processing technologies has accelerated in order to harness physiological data, such as electroencephalograms (EEGs), for EEG-based applications. Among physiological signals, EEG has been demonstrated to be a source of emotion recognition signals with the highest classification accuracy. However, when emotion recognition systems are used for real-time classification, the training unit is frequently left to run offline or in the cloud rather than working locally on the edge. That strategy has hampered research, and the full potential of using an edge AI device has yet to be realized. Edge AI devices are high-performance computers that can process complex algorithms: they can collect, process, and store data on their own, and can also analyze and apply complicated algorithms such as localization, detection, and recognition in real-time applications, making them powerful embedded devices. The NVIDIA Jetson series, specifically the Jetson Nano device, was used in the implementation. The cEEGrid, which is integrated with the open-source brain-computer interface platform OpenBCI, is used to collect the EEG signals. This paper proposes an EEG-based real-time emotion recognition system on edge AI. Machine learning-based classifiers were used to perform graphical spectrogram categorization of the EEG signals and to predict emotional states from the properties of the input data. The EEG signals were analyzed using the K-Nearest Neighbor (KNN) technique, a supervised learning method, until the emotional state was identified. In the EEG signal processing, after each EEG signal is received in real time and translated from the time domain to the frequency domain, the Fast Fourier Transform (FFT) technique is used to observe the frequency bands in each EEG signal. To appropriately capture the variance of each EEG frequency band, the power density, standard deviation, and mean are calculated and employed as features. The next stage is to use these selected features to predict emotion in the EEG data with the KNN technique; arousal and valence datasets are used to train the parameters defined by KNN. Because classification, recognition of specific classes, and emotion prediction are conducted both online and locally on the edge, the KNN technique increased the performance of the emotion recognition system on the NVIDIA Jetson Nano. Finally, this implementation aims to bridge the research gap in cost-effective and efficient real-time emotion recognition on a resource-constrained hardware device such as the NVIDIA Jetson Nano. Deployed at the AI edge, EEG-based emotion identification can be employed in applications that can rapidly expand its use in research and industry.
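The frequency-band feature extraction followed by KNN classification can be sketched in a few lines of pure Python. The band edges, the naive DFT, and the two synthetic classes below are assumptions made for the sketch, not the authors' exact configuration:

```python
import math
from collections import Counter

BANDS = [(4, 8), (8, 13), (13, 30)]  # theta, alpha, beta (Hz); illustrative

def band_features(signal, fs, bands=BANDS):
    """Band power via a naive O(n^2) DFT, plus the mean and standard
    deviation of the window (echoing the paper's FFT-derived power,
    mean, and std features). A real system would use an FFT library."""
    n = len(signal)
    power, freqs = [], []
    for k in range(n // 2):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power.append((re * re + im * im) / n)
        freqs.append(k * fs / n)
    feats = [sum(p for f, p in zip(freqs, power) if lo <= f < hi) for lo, hi in bands]
    mean = sum(signal) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in signal) / n)
    return feats + [mean, std]

def knn_predict(train, query, k=3):
    """Supervised K-Nearest-Neighbor majority vote (Euclidean distance)."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Two synthetic one-second 'EEG' windows at fs = 128 Hz: a 6 Hz (theta-band)
# rhythm vs. a 25 Hz (beta-band) rhythm, standing in for two emotional states.
fs = 128
theta = lambda ph: [math.sin(2 * math.pi * 6 * t / fs + ph) for t in range(fs)]
beta = lambda ph: [math.sin(2 * math.pi * 25 * t / fs + ph) for t in range(fs)]
train = ([(band_features(theta(p), fs), "low-arousal") for p in (0.0, 0.8)]
         + [(band_features(beta(p), fs), "high-arousal") for p in (0.0, 0.8)])
print(knn_predict(train, band_features(theta(1.5), fs)))  # -> low-arousal
```

The band-power features separate the two rhythms cleanly, so even this tiny KNN votes correctly; on real EEG the same pipeline would feed noisy multi-channel windows into a much larger training set.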

Keywords: edge AI device, EEG, emotion recognition system, supervised learning algorithm, sensors

Procedia PDF Downloads 69
7041 Speech Detection Model Based on a Deep Neural Network Classifier for Speech Emotion Recognition

Authors: A. Shoiynbek, K. Kozhakhmet, P. Menezes, D. Kuanyshbay, D. Bayazitov

Abstract:

Speech emotion recognition (SER) has received increasing research interest in recent years. Most research work has used emotional speech collected under controlled conditions: recordings of actors imitating and artificially producing emotions in front of a microphone. There are four issues with that approach: (1) the emotions are not natural, which means that machines learn to recognize fake emotions; (2) the emotions are very limited in quantity and poor in variety of speaking; (3) SER is language-dependent; and (4) consequently, whenever researchers want to start working on SER, they need to find a good emotional database in their language. In this paper, we propose an approach to creating an automatic tool for speech emotion extraction based on facial emotion recognition and describe the sequence of actions of the proposed approach. One of the first objectives in this sequence is speech detection. The paper gives a detailed description of a speech detection model based on a fully connected deep neural network for the Kazakh and Russian languages. Despite the high speech detection results for Kazakh and Russian, the described process is suitable for any language. To illustrate the working capacity of the developed model, we have performed an analysis of speech detection and extraction on real tasks.
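The speech detection step can be illustrated with a toy energy-based voice-activity detector, a much simpler hand-crafted stand-in for the paper's fully connected DNN classifier. The frame sizes, sample rate, and threshold below are illustrative assumptions:

```python
import math

def frame_energies(samples, frame_len=400, hop=160):
    """Short-time log energy (dB) per frame; 400-sample frames with a
    160-sample hop correspond to 25 ms / 10 ms at a 16 kHz sample rate."""
    energies = []
    for start in range(0, len(samples) - frame_len + 1, hop):
        frame = samples[start:start + frame_len]
        energy = sum(s * s for s in frame) / frame_len
        energies.append(10 * math.log10(energy + 1e-12))  # avoid log(0)
    return energies

def detect_speech(samples, threshold_db=-30.0, frame_len=400, hop=160):
    """Flag frames whose log energy exceeds a fixed threshold as speech."""
    return [e > threshold_db for e in frame_energies(samples, frame_len, hop)]

# 0.1 s of silence followed by 0.1 s of a 440 Hz tone at 16 kHz.
samples = [0.0] * 1600 + [math.sin(2 * math.pi * 440 * t / 16000) for t in range(1600)]
flags = detect_speech(samples)
print(flags.count(True), "of", len(flags), "frames flagged as speech")
```

A DNN-based detector replaces the fixed threshold with a learned decision over richer per-frame features, which is what makes it robust to noise and language-independent.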

Keywords: deep neural networks, speech detection, speech emotion recognition, Mel-frequency cepstrum coefficients, collecting speech emotion corpus, collecting speech emotion dataset, Kazakh speech dataset

Procedia PDF Downloads 61
7040 Emotion Regulation Mediates the Relationship between Affective Disposition and Depression

Authors: Valentina Colonnello, Paolo Maria Russo

Abstract:

Studies indicate a link between individual differences in affective disposition and depression, as well as between emotion dysregulation and depression. However, the specific role of emotion dysregulation domains in mediating the relationship between affective disposition and depression remains largely unexplored. In three cross-sectional quantitative studies (total n = 1350), we explored the extent to which specific emotion regulation difficulties mediate the relationship between personal distress disposition (Study 1), separation distress as a primary emotional trait (Study 2), and an insecure, anxious attachment style (Study 3) and depression. Across all studies, we found that the relationship between affective disposition and depression was mediated by difficulties in accessing adaptive emotion regulation strategies. These findings underscore the potential for modifiable abilities that could be targeted through preventive interventions.

Keywords: emotions, mental health, individual traits, personality

Procedia PDF Downloads 8
7039 A Comparison of South East Asian Face Emotion Classification Based on Optimized Ellipse Data Using a Clustering Technique

Authors: M. Karthigayan, M. Rizon, Sazali Yaacob, R. Nagarajan, M. Muthukumaran, Thinaharan Ramachandran, Sargunam Thirugnanam

Abstract:

In this paper, a set of irregular and regular ellipse fitting equations, optimized using a genetic algorithm (GA), is applied to lip and eye features to classify human emotions. Two South East Asian (SEA) faces are considered in this work for the emotion classification. Six emotions and one neutral state are considered as the outputs. Each subject shows unique characteristics of the lip and eye features for the various emotions. The GA is adopted to optimize the irregular ellipse characteristics of the lip and eye features in each emotion; that is, the top portion of the lip configuration is part of one ellipse and the bottom part of a different ellipse. Two ellipse-based fitness equations are proposed for the lip configuration, and the relevant parameters that define the emotions are listed. The GA method has achieved reasonably successful classification of emotion. In some classifications, however, the optimized data values of one emotion overlap with the ranges of other emotions. To overcome this overlapping problem between the optimized emotion values, and at the same time to improve the classification, a fuzzy clustering method (FCM) has been implemented to offer better classification. The GA-FCM approach offers reasonably good classification within the ranges of the clusters; this has been demonstrated by applying it to the two SEA subjects, improving the classification rate.
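The ellipse-fitting idea can be sketched with a toy GA. The fitness function (the residual of the implicit ellipse equation), the axis-aligned origin-centred encoding, and the GA operators below are simplifying assumptions for the sketch, not the paper's exact equations:

```python
import math
import random

def ellipse_fitness(points, a, b):
    """Mean squared residual of the implicit ellipse equation
    x^2/a^2 + y^2/b^2 = 1 over the contour points (lower is better)."""
    return sum((x * x / (a * a) + y * y / (b * b) - 1.0) ** 2
               for x, y in points) / len(points)

def fit_ellipse_ga(points, pop_size=30, generations=80, seed=0):
    """Toy GA (elitist truncation selection + Gaussian mutation) over
    the semi-axes (a, b) of an axis-aligned, origin-centred ellipse."""
    rng = random.Random(seed)
    pop = [(rng.uniform(0.1, 5.0), rng.uniform(0.1, 5.0)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ab: ellipse_fitness(points, *ab))
        parents = pop[:pop_size // 4]          # keep the fittest quarter
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.choice(parents)
            children.append((max(0.05, a + rng.gauss(0, 0.1)),
                             max(0.05, b + rng.gauss(0, 0.1))))
        pop = parents + children
    return min(pop, key=lambda ab: ellipse_fitness(points, *ab))

# Recover the semi-axes of a synthetic 'lip contour' sampled from
# an ellipse with a = 2, b = 1.
contour = [(2.0 * math.cos(k * math.pi / 8), math.sin(k * math.pi / 8))
           for k in range(16)]
a, b = fit_ellipse_ga(contour)
```

In the paper's setting, the upper and lower lip would each get their own ellipse and fitness equation, and the optimized parameters per emotion become the classification features.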

Keywords: ellipse fitness function, genetic algorithm, emotion recognition, fuzzy clustering

Procedia PDF Downloads 516
7038 Emotion-Convolutional Neural Network for Perceiving Stress from Audio Signals: A Brain Chemistry Approach

Authors: Anup Anand Deshmukh, Catherine Soladie, Renaud Seguier

Abstract:

Emotion plays a key role in many applications, such as healthcare, where it helps to capture patients’ emotional behavior. Unlike typical ASR (Automated Speech Recognition) problems, which focus on 'what was said', it is equally important to understand 'how it was said.' Certain emotions are given more importance due to their effectiveness in understanding human feelings. In this paper, we propose an approach that models human stress from audio signals. The research challenge in speech emotion detection is finding the appropriate set of acoustic features corresponding to an emotion. Another difficulty lies in defining the very meaning of emotion and being able to categorize it in a precise manner. Supervised machine learning models, including state-of-the-art deep learning classification methods, rely on the availability of clean and labelled data, and one of the problems in affective computing is the limited amount of annotated data; the existing labelled emotion datasets are highly subject to the perception of the annotator. We address the first issue, feature selection, by exploiting traditional MFCC (Mel-Frequency Cepstral Coefficient) features in a convolutional neural network. Our proposed Emo-CNN (Emotion-CNN) architecture treats speech representations in a manner similar to how CNNs treat images in a vision problem. Our experiments show that Emo-CNN consistently and significantly outperforms popular existing methods over multiple datasets. It achieves 90.2% categorical accuracy on the Emo-DB dataset. We claim that Emo-CNN is robust to speaker variations and environmental distortions. The proposed approach achieves 85.5% speaker-dependent categorical accuracy on the SAVEE (Surrey Audio-Visual Expressed Emotion) dataset, beating the existing CNN-based approach by 10.2%. To tackle the second problem, the subjectivity of stress labels, we use Lovheim’s cube, a 3-dimensional projection of emotions. Monoamine neurotransmitters are chemical messengers in the brain that transmit signals involved in perceiving emotions, and the cube explains the relationship between these neurotransmitters and the positions of emotions in 3D space. The emotion representations learnt by the Emo-CNN are mapped to the cube using three-component PCA (Principal Component Analysis), which is then used to model human stress. This proposed approach not only circumvents the need for labelled stress data but also complies with the psychological theory of emotions given by Lovheim’s cube. We believe that this work is the first step towards creating a connection between artificial intelligence and the chemistry of human emotions.
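Once the learnt representations are projected onto three PCA components, placing them in the cube reduces to a nearest-corner lookup. A minimal sketch follows; the per-axis rescaling and the decision to leave the corner-to-emotion labels unassigned (they should be taken from Lovheim's 2012 paper rather than guessed) are choices of this sketch:

```python
from itertools import product
import math

def minmax(column):
    """Rescale one embedding dimension (a list of values) to [0, 1]."""
    lo, hi = min(column), max(column)
    return [(v - lo) / (hi - lo) for v in column]

def nearest_corner(point):
    """Return the corner of the unit cube nearest to a 3-D point.
    In Lovheim's model each corner corresponds to one of eight basic
    emotions; the corner-to-emotion labels should be filled in from
    Lovheim (2012) rather than guessed here."""
    corners = list(product((0.0, 1.0), repeat=3))
    return min(corners, key=lambda c: math.dist(point, c))

# A rescaled three-component PCA projection of a learnt representation
# would be looked up like this:
print(nearest_corner((0.9, 0.8, 0.1)))  # -> (1.0, 1.0, 0.0)
```

Distances to the cube's corners (rather than a hard lookup) can also serve as soft, label-free stress-related coordinates, which is the spirit of the approach described above.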

Keywords: deep learning, brain chemistry, emotion perception, Lovheim's cube

Procedia PDF Downloads 113
7037 Various Perspectives on the Concept of Emotion Labor

Authors: Jae Soo Do, Kyoung-Seok Kim

Abstract:

Radical changes in the industrial environment and spectacular developments in IT have shifted the current of management from people-centered to technology- or IT-centered. Interpersonal emotional exchanges have become insipid, and interactive services have been reduced to mechanical reactions. This study offers various perspectives on emotional labor, building on the traditional studies of it. The present day, in which human emotions are liable to be served up as something machinized, makes the study of human emotions all the more momentous. Previous research on emotional labor has commonly and basically dealt with the relationship between the active group who performs actions and the passive group to whom the actions are done. This study focuses on the passive group and offers the new perspective of 'liquid emotion' as a defense mechanism protecting the passive group from the external environment. In particular, it presents a concrete discussion of directions for future studies on liquid emotion as a newly suggested perspective.

Keywords: emotion labor, surface acting, deep acting, liquid emotion

Procedia PDF Downloads 307
7036 Emotion Expression of the Leader and Collective Efficacy: Pride and Guilt

Authors: Hsiu-Tsu Cho

Abstract:

Collective efficacy refers to a group’s sense of its capacity to complete a task successfully or to reach its objectives. Little effort has been expended on investigating the relationship between a leader’s emotion expression and collective efficacy. In this study, we examined the impact of different emotions, and of a group leader’s expression of them, on collective efficacy, and explored whether the effects of emotion expression differed between negative and positive emotions. A total of 240 undergraduate and graduate students, recruited using Facebook and posters at a university, participated in this research. The participants were separated randomly into 80 four-person groups, each consisting of three participants and a confederate. They were randomly assigned to one of five conditions: a 2 (pride vs. guilt) × 2 (emotion expression by the group leader vs. no emotion expression) factorial design plus a control condition. Each four-person group competed for a reward by solving the five-disk Tower of Hanoi puzzle and making decisions on an investment case. We surveyed the participants using an emotion measure revised from previous research and a collective efficacy questionnaire, both on 5-point scales. To induce an emotion of pride (or guilt), the experimenter announced after the group task whether the group’s performance was good enough to have a chance of winning the reward (ranking in the top or bottom 20% among all groups). Following the instructions for the assigned condition, the leader (confederate) either expressed or did not express a feeling of pride (or guilt). To check the emotion manipulation, we added a control condition in which the experimenter revealed no results regarding group performance so as to maintain a neutral emotion.
One-way ANOVAs and post hoc pairwise comparisons among the three emotion conditions (pride, guilt, and control) were conducted on the pride and guilt scores (pride: F(1,75) = 32.41, p < .001; guilt: F(1,75) = 6.75, p < .05). The results indicated that the manipulations of emotion were successful. A two-way between-measures ANOVA was conducted to examine the predicted main effects of emotion type and emotion expression, as well as their interaction effect, on collective efficacy. The experimental findings suggest that pride did not affect collective efficacy more than guilt did (F(1,60) = 1.90, ns) and that the group leader did not motivate collective efficacy simply by expressing or withholding emotion (F(1,60) = .89, ns). However, the interaction effect of emotion type and emotion expression was statistically significant (F(1,60) = 4.27, p < .05, ω² = .066), accounting for 6.6% of the variance. Additional results revealed that, under the pride condition, the leader enhanced collective efficacy when expressing emotion, whereas, under the guilt condition, an expression of emotion could reduce collective efficacy. Overall, these findings challenge the assumption that the effects of expressing emotion are the same for all emotions, and suggest that a leader should be cautious when expressing negative emotions toward a group to avoid reducing group effectiveness.
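The one-way ANOVA F values reported for the manipulation checks can be computed in a few lines. This is a textbook sketch of the statistic with invented numbers, not the study's analysis code or data:

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group mean square divided
    by within-group mean square."""
    k = len(groups)                       # number of conditions
    n = sum(len(g) for g in groups)       # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Three small illustrative 'condition' groups: a clearly deviating
# third group yields a large F.
print(one_way_anova_f([[1, 2, 3], [2, 3, 4], [6, 7, 8]]))  # -> 21.0
```

A large F relative to its critical value under F(k-1, n-k) is what licenses the post hoc pairwise comparisons described above.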

Keywords: collective efficacy, group leader, emotion expression, pride, guilt

Procedia PDF Downloads 291