{"title":"Development System for Emotion Detection Based on Brain Signals and Facial Images","authors":"Suprijanto, Linda Sari, Vebi Nadhira, IGN. Merthayasa, Farida I.M","volume":26,"journal":"International Journal of Psychological and Behavioral Sciences","pagesStart":13,"pagesEnd":21,"ISSN":"1307-6892","URL":"https:\/\/publications.waset.org\/pdf\/8769","abstract":"
Detection of human emotions has many potential applications. One application is to quantify audience attentiveness in order to evaluate the acoustic quality of a concert hall, where evaluation is conventionally based on the subjective audio preferences reported by the audience. To obtain a fairer evaluation of acoustic quality, this research proposes a system for multimodal emotion detection: one modality is based on brain signals measured with an electroencephalogram (EEG), and the second is a sequence of facial images. In the experiment, a customized audio signal consisting of normal and disordered sounds was played in order to stimulate positive\/negative emotional feedback from the volunteers. EEG signals from the temporal lobes (electrodes T3 and T4) were used to measure the brain response, and sequences of facial images were used to monitor facial expression while the volunteers listened to the audio signal. From the EEG signal, features were extracted from changes in the brain waves, particularly the alpha and beta bands. Facial expression features were extracted from an analysis of motion in the image sequences. We implemented an advanced optical flow method to detect the most active facial muscles in the transition from a neutral to another emotional expression, represented as vector flow maps. To reduce the difficulty of detecting the emotional state, the vector flow maps are transformed into a compass mapping that represents the major directions and velocities of facial movement. The results showed that the power of the beta wave increased when the disordered sound stimulus was given, although each volunteer gave a different emotional feedback. Based on the features derived from the facial images, optical flow compass mapping is promising as additional information for deciding on the emotional feedback.","references":"[1] C. Busso, Z. Deng, S. Yildirim, M. Bulut, C. M. Lee, A. Kazemzadeh, S. Lee, U. Neumann, S.
Narayanan, \"Analysis of Emotion Recognition using Facial Expressions, Speech and Multimodal Information\", Proceedings of the 6th ACM International Conference on Multimodal Interfaces (ICMI 2004), State College, PA, Oct. 2004. [2] M. Pantic and L. J. M. Rothkrantz, \"Toward an Affect-Sensitive Multimodal Human-Computer Interaction\", Proceedings of the IEEE, Vol. 91, No. 9, September 2003. [3] D. DeCarlo and D. Metaxas, \"Optical Flow Constraints on Deformable Models with Applications to Face Tracking\", International Journal of Computer Vision, 38(2), pp. 99-127, July 2000. [4] S. Periaswamy and H. Farid, \"Elastic Registration in the Presence of Intensity Variations\", IEEE Transactions on Medical Imaging, 2003. [5] J. F. Cohn, A. J. Zlochower, J. J. Lien, and T. Kanade, \"Feature-Point Tracking by Optical Flow Discriminates Subtle Differences in Facial Expression\", Proceedings of the Third IEEE International Conference on Automatic Face and Gesture Recognition, Japan, April 1998. [6] J. Lien, T. Kanade, J. Cohn, and C. Li, \"Automatic Analysis of Facial Expressions: The State of the Art\", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, Issue 12, pp. 1424-1445, 2000. [7] S. Sato, K. Nishio, and Y. Ando, \"Propagation of alpha waves corresponding to subjective preference from the right hemisphere to the left with changes in the IACC of a sound field\", J. Temporal Des. Arch. Environ., 3, 60-69, 2003. [8] J. L. Semmlow, Biosignal and Biomedical Image Processing: MATLAB-Based Applications (Signal Processing and Communications, 22), Marcel Dekker, Inc., 2004. [9] J. G. Webster, Medical Instrumentation: Application and Design, Third Edition, John Wiley and Sons, USA, 1998. [10] J. A. Coan, J. J. B. Allen, and E.
Harmon-Jones, \"Voluntary Facial Expression and Hemispheric Asymmetry over the Frontal Cortex\", Psychophysiology, 38 (2001), 912-925, Cambridge. [11] C. J. Duthoit et al., \"Optical Flow Image Analysis of Facial Expression of Human Emotion - Forensic Applications\", Journal e-Forensic, January 2008.","publisher":"World Academy of Science, Engineering and Technology","index":"Open Science Index 26, 2009"}