Electroencephalography-Based Intention Recognition and Consensus Assessment during Emergency Response

Authors: Siyao Zhu, Yifang Xu

Abstract:

After natural and man-made disasters, robots can bypass hazards, expedite the search, and acquire situational awareness that would otherwise be unavailable for designing rescue plans. A brain-computer interface (BCI) is a promising option for overcoming the limitations of tedious manual robot control and operation in urgent search-and-rescue tasks. This study tests the feasibility of using electroencephalography (EEG) signals to decode human intentions and to detect the level of consensus with robot-provided information. EEG signals were classified using machine-learning and deep-learning methods to discriminate search intentions and agreement perceptions. The results show average classification accuracies of 67% for intention recognition and 72% for consensus assessment, demonstrating the potential of incorporating recognizable bioelectrical responses of users into advanced robot-assisted systems for emergency response.

Keywords: Consensus assessment, electroencephalogram, EEG, emergency response, human-robot collaboration, intention recognition, search and rescue.
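The abstract does not detail the classification pipeline, but a common baseline for this kind of binary EEG discrimination is band-power feature extraction followed by a simple classifier. The sketch below is a minimal illustration on synthetic data only; the sampling rate, frequency bands, and nearest-centroid classifier are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)
FS = 250  # sampling rate in Hz (hypothetical)

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` within the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def make_trial(dominant_hz):
    """Synthetic 1-second 'EEG' trial: one dominant oscillation plus noise."""
    t = np.arange(FS) / FS
    return np.sin(2 * np.pi * dominant_hz * t) + 0.5 * rng.standard_normal(FS)

# Two synthetic classes standing in for the two mental states:
# alpha-dominant (10 Hz) vs. beta-dominant (20 Hz) trials.
trials = [make_trial(10) for _ in range(40)] + [make_trial(20) for _ in range(40)]
labels = np.array([0] * 40 + [1] * 40)

# Feature vector per trial: (alpha-band power, beta-band power)
feats = np.array([[band_power(x, FS, 8, 13), band_power(x, FS, 14, 30)]
                  for x in trials])

# Nearest-centroid classifier, trained on the first half of each class
train = np.r_[0:20, 40:60]
test = np.r_[20:40, 60:80]
centroids = np.array([feats[train][labels[train] == c].mean(axis=0)
                      for c in (0, 1)])
pred = np.argmin(np.linalg.norm(feats[test][:, None] - centroids, axis=2), axis=1)
accuracy = (pred == labels[test]).mean()
print(f"test accuracy: {accuracy:.2f}")
```

Real EEG is far noisier and less separable than this synthetic data, which is consistent with the moderate (67%–72%) accuracies reported in the abstract.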

