CLC number: TP183
On-line Access: 2024-12-26
Received: 2023-11-17
Revision Accepted: 2024-02-26
Crosschecked: 2025-01-24
Yanping ZHU, Lei HUANG, Jixin CHEN, Shenyun WANG, Fayu WAN, Jianan CHEN. VG-DOCoT: a novel DO-Conv and transformer framework via VAE-GAN technique for EEG emotion recognition[J]. Frontiers of Information Technology & Electronic Engineering, 2024, 25(11): 1497-1514.
@article{Zhu2024VGDOCoT,
title="VG-DOCoT: a novel DO-Conv and transformer framework via VAE-GAN technique for EEG emotion recognition",
author="Yanping ZHU and Lei HUANG and Jixin CHEN and Shenyun WANG and Fayu WAN and Jianan CHEN",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="25",
number="11",
pages="1497-1514",
year="2024",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.2300781"
}
%0 Journal Article
%T VG-DOCoT: a novel DO-Conv and transformer framework via VAE-GAN technique for EEG emotion recognition
%A Yanping ZHU
%A Lei HUANG
%A Jixin CHEN
%A Shenyun WANG
%A Fayu WAN
%A Jianan CHEN
%J Frontiers of Information Technology & Electronic Engineering
%V 25
%N 11
%P 1497-1514
%@ 2095-9184
%D 2024
%I Zhejiang University Press & Springer
%DOI 10.1631/FITEE.2300781
TY - JOUR
T1 - VG-DOCoT: a novel DO-Conv and transformer framework via VAE-GAN technique for EEG emotion recognition
A1 - Yanping ZHU
A1 - Lei HUANG
A1 - Jixin CHEN
A1 - Shenyun WANG
A1 - Fayu WAN
A1 - Jianan CHEN
JO - Frontiers of Information Technology & Electronic Engineering
VL - 25
IS - 11
SP - 1497
EP - 1514
SN - 2095-9184
Y1 - 2024
PB - Zhejiang University Press & Springer
DO - 10.1631/FITEE.2300781
ER -
Abstract: Human emotions are intricate psychological phenomena that reflect an individual’s current physiological and psychological state. Emotions have a pronounced influence on human behavior, cognition, communication, and decision-making. However, current emotion recognition methods often suffer from suboptimal performance and limited scalability in practical applications. To address this problem, a novel electroencephalogram (EEG) emotion recognition network named VG-DOCoT is proposed, built on depthwise over-parameterized convolution (DO-Conv), transformer, and variational autoencoder-generative adversarial network (VAE-GAN) structures. Specifically, during preprocessing, differential entropy (DE) features are extracted from the EEG signals and mapped into a representation that preserves temporal, spatial, and frequency information. To enlarge the training set, a VAE-GAN is employed for data augmentation. A novel convolution module, DO-Conv, is used in place of the traditional convolution layers to improve the network, and a transformer structure is introduced into the framework to capture global dependencies in the EEG signals. Using the proposed model, binary classification is carried out on the DEAP dataset, achieving accuracies of 92.52% for arousal and 92.27% for valence. A ternary classification of neutral, positive, and negative emotions is then conducted on the SEED dataset, yielding an average prediction accuracy of 93.77%. The proposed method significantly improves the accuracy of EEG-based emotion recognition.
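The DE features mentioned in the abstract are commonly computed band by band under a Gaussian assumption, DE = 0.5 ln(2*pi*e*sigma^2) per channel and frequency band. The Python sketch below illustrates one plausible extraction step; the band boundaries, sampling rate, and window length are assumptions for illustration and are not taken from the paper.

# Minimal sketch of band-wise differential entropy (DE) extraction from EEG.
# Band limits, sampling rate, and segment length are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

BANDS = {"theta": (4, 8), "alpha": (8, 14), "beta": (14, 31), "gamma": (31, 45)}

def bandpass(x, lo, hi, fs, order=4):
    # Zero-phase Butterworth band-pass filter applied to one channel.
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def differential_entropy(x):
    # DE of a band-limited signal assumed Gaussian: 0.5 * ln(2*pi*e*var(x)).
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def de_features(segment, fs=128):
    # segment: (n_channels, n_samples) EEG window -> (n_bands, n_channels) DE map,
    # which the pipeline would then arrange into its temporal/spatial/frequency form.
    feats = np.empty((len(BANDS), segment.shape[0]))
    for b_idx, (lo, hi) in enumerate(BANDS.values()):
        for ch in range(segment.shape[0]):
            feats[b_idx, ch] = differential_entropy(bandpass(segment[ch], lo, hi, fs))
    return feats

The DO-Conv module described in the abstract over-parameterizes an ordinary convolution by pairing a per-input-channel depthwise kernel D with the conventional kernel W and folding the two into a single k x k kernel, so inference costs no more than a standard convolution. The PyTorch sketch below follows this kernel-composition reading with the minimal depth D_mul = k*k; the shapes, initialization, and padding are one plausible convention, not the authors' implementation.

# Illustrative DO-Conv-style layer: train D and W jointly, fold them into one
# effective kernel, then apply an ordinary 2D convolution.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DOConv2d(nn.Module):
    def __init__(self, c_in, c_out, k):
        super().__init__()
        self.k = k
        d_mul = k * k  # minimal over-parameterization depth (assumption)
        self.W = nn.Parameter(torch.randn(c_out, c_in, d_mul) * 0.01)
        # D starts as the identity so the folded kernel initially equals W.
        self.D = nn.Parameter(torch.eye(k * k).repeat(c_in, 1, 1))  # (c_in, k*k, d_mul)

    def forward(self, x):
        # Fold: W_eff[o, c, s] = sum_d D[c, s, d] * W[o, c, d]
        w = torch.einsum("csd,ocd->ocs", self.D, self.W)
        w = w.reshape(-1, self.W.shape[1], self.k, self.k)  # (c_out, c_in, k, k)
        return F.conv2d(x, w, padding=self.k // 2)

At inference such a layer behaves like a drop-in replacement for nn.Conv2d(c_in, c_out, k, padding=k//2), which is presumably why it can stand in for the traditional convolution layers without changing the rest of the network.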