
CLC number: TP39

On-line Access: 2022-08-27

Received: 2021-10-17

Revision Accepted: 2022-05-08

Crosschecked: 2022-07-28


ORCID:

Shaojie LI: https://orcid.org/0000-0003-2432-0482
Wei LI: https://orcid.org/0000-0002-5388-129X
Xiaowei ZHANG: https://orcid.org/0000-0001-8562-416X
Bin HU: https://orcid.org/0000-0003-3514-5413


Frontiers of Information Technology & Electronic Engineering  2022 Vol.23 No.8 P.1158-1173

http://doi.org/10.1631/FITEE.2100489


A personality-guided affective brain–computer interface for implementation of emotional intelligence in machines


Author(s):  Shaojie LI, Wei LI, Zejian XING, Wenjie YUAN, Xiangyu WEI, Xiaowei ZHANG, Bin HU

Affiliation(s):  School of Information Science and Engineering, Lanzhou University, Lanzhou 730000, China

Corresponding email(s):   zhangxw@lzu.edu.cn, bh@lzu.edu.cn

Key Words:  Electroencephalogram (EEG), Emotion recognition, Attention mechanism, Personality traits


Shaojie LI, Wei LI, Zejian XING, Wenjie YUAN, Xiangyu WEI, Xiaowei ZHANG, Bin HU. A personality-guided affective brain–computer interface for implementation of emotional intelligence in machines[J]. Frontiers of Information Technology & Electronic Engineering, 2022, 23(8): 1158-1173.

@article{Li2022personality,
title="A personality-guided affective brain–computer interface for implementation of emotional intelligence in machines",
author="Shaojie LI and Wei LI and Zejian XING and Wenjie YUAN and Xiangyu WEI and Xiaowei ZHANG and Bin HU",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="23",
number="8",
pages="1158-1173",
year="2022",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.2100489"
}

%0 Journal Article
%T A personality-guided affective brain–computer interface for implementation of emotional intelligence in machines
%A Shaojie LI
%A Wei LI
%A Zejian XING
%A Wenjie YUAN
%A Xiangyu WEI
%A Xiaowei ZHANG
%A Bin HU
%J Frontiers of Information Technology & Electronic Engineering
%V 23
%N 8
%P 1158-1173
%@ 2095-9184
%D 2022
%I Zhejiang University Press & Springer
%DOI 10.1631/FITEE.2100489

TY - JOUR
T1 - A personality-guided affective brain–computer interface for implementation of emotional intelligence in machines
A1 - Shaojie LI
A1 - Wei LI
A1 - Zejian XING
A1 - Wenjie YUAN
A1 - Xiangyu WEI
A1 - Xiaowei ZHANG
A1 - Bin HU
JO - Frontiers of Information Technology & Electronic Engineering
VL - 23
IS - 8
SP - 1158
EP - 1173
SN - 2095-9184
Y1 - 2022
PB - Zhejiang University Press & Springer
DO - 10.1631/FITEE.2100489
ER -


Abstract: 
Affective brain–computer interfaces have become an increasingly important means of achieving emotional intelligence in human–machine collaboration. However, due to the complexity of electroencephalogram (EEG) signals and individual differences in emotional response, it is still a great challenge to design a reliable and effective model. Considering the influence of personality traits on emotional response, it would be helpful to integrate personality information and EEG signals for emotion recognition. This study proposes a personality-guided attention neural network that can use personality information to learn effective EEG representations for emotion recognition. Specifically, we first use a convolutional neural network to extract rich temporal and regional representations of EEG signals, and a special convolution kernel is designed to learn inter- and intra-regional correlations simultaneously. Second, inspired by the fact that electrodes within distinct brain scalp regions play different roles in emotion recognition, a personality-guided regional-attention mechanism is proposed to further explore the contributions of electrodes within a region and between regions. Finally, attention-based long short-term memory is designed to explore the temporal dynamics of EEG signals. Experiments on the AMIGOS dataset, which is a dataset for multimodal research on affect, personality traits, and mood of individuals and groups, show that the proposed method can significantly improve the performance of subject-independent emotion recognition and outperform state-of-the-art methods.
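
To make the pipeline described above concrete, here is a minimal sketch, in PyTorch, of one way such a network could be wired together: a region-wise convolution for intra-regional correlations, a full-height padded convolution for inter-regional mixing, a Big-Five-conditioned attention over scalp regions, and an attention-weighted LSTM over time. The class name PersonalityGuidedAttentionNet, the 14-channel/7-region electrode grouping, and all hyperparameters below are illustrative assumptions, not the authors' published implementation.

# Illustrative sketch only; not the authors' code.
import torch
import torch.nn as nn

class PersonalityGuidedAttentionNet(nn.Module):
    def __init__(self, n_channels=14, n_regions=7, n_personality=5,
                 cnn_dim=32, lstm_dim=64, n_classes=2):
        super().__init__()
        assert n_channels % n_regions == 0
        self.n_regions = n_regions
        ch_per_region = n_channels // n_regions
        # Intra-regional kernel: spans all electrodes of one scalp region and
        # slides over time, producing one feature row per region.
        self.intra = nn.Conv2d(1, cnn_dim, kernel_size=(ch_per_region, 7),
                               stride=(ch_per_region, 1), padding=(0, 3))
        # Inter-regional kernel: spans all regions at once (zero-padded, which
        # keeps R output rows for odd n_regions), mixing features across regions.
        self.inter = nn.Conv2d(cnn_dim, cnn_dim, kernel_size=(n_regions, 1),
                               padding=(n_regions // 2, 0))
        # Personality-guided regional attention: Big-Five scores are projected
        # and concatenated with each region's features to score its contribution.
        self.personality_proj = nn.Linear(n_personality, cnn_dim)
        self.region_score = nn.Linear(2 * cnn_dim, 1)
        # Attention-based LSTM over the time axis.
        self.lstm = nn.LSTM(cnn_dim, lstm_dim, batch_first=True)
        self.time_score = nn.Linear(lstm_dim, 1)
        self.classifier = nn.Linear(lstm_dim, n_classes)

    def forward(self, eeg, personality):
        # eeg: (batch, n_channels, n_time); personality: (batch, n_personality)
        h = torch.relu(self.intra(eeg.unsqueeze(1)))       # (B, cnn_dim, R, T)
        h = torch.relu(self.inter(h))                      # (B, cnn_dim, R, T)
        region_feat = h.mean(dim=3).transpose(1, 2)        # (B, R, cnn_dim)
        p = self.personality_proj(personality)             # (B, cnn_dim)
        p = p.unsqueeze(1).expand(-1, self.n_regions, -1)  # (B, R, cnn_dim)
        scores = self.region_score(torch.cat([region_feat, p], dim=-1))
        alpha = torch.softmax(scores, dim=1)               # attention over regions
        # Pool regions with their personality-guided attention weights.
        weighted = (h * alpha.transpose(1, 2).unsqueeze(-1)).sum(dim=2)  # (B, cnn_dim, T)
        out, _ = self.lstm(weighted.transpose(1, 2))       # (B, T, lstm_dim)
        beta = torch.softmax(self.time_score(out), dim=1)  # attention over time
        return self.classifier((beta * out).sum(dim=1))    # (B, n_classes)

# Example forward pass on data shaped like a 14-channel AMIGOS segment
# (a 128 Hz sampling rate is assumed here purely for the example).
model = PersonalityGuidedAttentionNet()
eeg = torch.randn(8, 14, 128)   # batch of 8 one-second EEG segments
big_five = torch.rand(8, 5)     # normalized Big-Five personality scores
logits = model(eeg, big_five)   # (8, 2), e.g. low/high valence

The key design choice illustrated here is that the region-attention scores are conditioned on the Big-Five personality vector, so the same EEG features can be weighted differently for participants with different personality profiles.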

A personality-guided affective brain–computer interface for implementation of emotional intelligence in machines

Shaojie LI, Wei LI, Zejian XING, Wenjie YUAN, Xiangyu WEI, Xiaowei ZHANG, Bin HU
School of Information Science and Engineering, Lanzhou University, Lanzhou 730000, China
Abstract: Affective brain–computer interfaces (BCIs) have become an important route to achieving emotional intelligence in human–machine collaboration. However, owing to the complexity of electroencephalogram (EEG) signals and individual differences in emotional response, designing a reliable and effective model remains a great challenge. Considering that individuals with different personality traits perceive and respond to emotion differently, integrating personality information with EEG signals is helpful for emotion recognition. We therefore propose a personality-guided attention neural network that uses personality information to learn more effective EEG representations for emotion recognition. Specifically, we first use a convolutional neural network to extract temporal and spatial representations of EEG signals, and design a special convolution kernel to simultaneously learn the correlations of EEG channels within and between different scalp regions. Second, considering that different scalp regions may play different roles in emotion recognition, a personality-guided regional-attention mechanism is proposed to further explore the contributions of EEG channels within and between regions. Finally, an attention-based long short-term memory (LSTM) network is designed to model the temporal dynamics of EEG signals. Experimental results on the AMIGOS dataset (a dataset for multimodal research on affect, personality traits, and mood of individuals and groups) show that the proposed method significantly improves the performance of subject-independent emotion recognition and outperforms existing emotion recognition methods.

Key words: Electroencephalogram; Emotion recognition; Attention mechanism; Personality traits



