On-line Access: 2024-08-27

Received: 2023-10-17

Revision Accepted: 2024-05-08

Frontiers of Information Technology & Electronic Engineering (in press)

http://doi.org/10.1631/FITEE.2300781


VG-DOCoT: a novel DO-Conv and transformer framework via VAE-GAN technique for EEG emotion recognition


Author(s):  Yanping ZHU, Lei HUANG, Jixin CHEN, Shenyun WANG, Fayu WAN, Jianan CHEN

Affiliation(s):  Nanjing University of Information Science and Technology, School of Electronic and Information Engineering, Nanjing 210044, China

Corresponding email(s):   001520@nuist.edu.cn, 20211249221@nuist.edu.cn, 202212490689@nuist.edu.cn, wangsy2006@126.com, 002470@nuist.edu.cn, 202212490688@nuist.edu.cn

Key Words:  Emotion recognition, EEG, Depthwise over-parameterized convolution (DO-Conv), Transformer, VAE-GAN


Yanping ZHU, Lei HUANG, Jixin CHEN, Shenyun WANG, Fayu WAN, Jianan CHEN. VG-DOCoT: a novel DO-Conv and transformer framework via VAE-GAN technique for EEG emotion recognition[J]. Frontiers of Information Technology & Electronic Engineering (in press). https://doi.org/10.1631/FITEE.2300781

@article{title="VG-DOCoT: a novel DO-Conv and transformer framework via VAE-GAN technique for EEG emotion recognition",
author="Yanping ZHU, Lei HUANG, Jixin CHEN, Shenyun WANG, Fayu WAN, Jianan CHEN",
journal="Frontiers of Information Technology & Electronic Engineering",
note="in press",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.2300781"
}


Abstract: 
Human emotions are intricate psychological phenomena that reflect an individual's current physiological and psychological state. Emotions have a pronounced influence on human behavior, cognition, communication, and decision-making. However, current emotion recognition methods often suffer from suboptimal performance and limited scalability in practical applications. To address this problem, a novel electroencephalogram (EEG) emotion recognition network named VG-DOCoT is proposed, which is built on depthwise over-parameterized convolution (DO-Conv), a transformer, and a VAE-GAN structure. Specifically, in preprocessing, differential entropy features are extracted from the EEG signals and mapped into a representation that preserves temporal, spatial, and frequency information. To enlarge the training set, a VAE-GAN is employed for data augmentation. DO-Conv modules replace the traditional convolution layers to strengthen the network, and a transformer structure is introduced into the framework to capture global dependencies in the EEG signals. Using the proposed model, binary classification on the DEAP dataset achieves an accuracy of 92.52% for arousal and 92.27% for valence. A ternary classification on the SEED dataset, distinguishing neutral, positive, and negative emotions, yields an average prediction accuracy of 93.77%. The proposed method thus significantly improves the accuracy of EEG-based emotion recognition.
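
The following PyTorch sketch illustrates two of the ingredients described above: a depthwise over-parameterized convolution, in which an extra depthwise kernel is folded into an ordinary convolution kernel at every forward pass, followed by a small transformer encoder that models global dependencies across feature positions. This is a minimal illustration under assumed shapes (a 4-band 9x9 differential-entropy map, 32 feature channels, three output classes); the class name DOConv2d and all hyperparameters are assumptions for demonstration, not the authors' VG-DOCoT implementation.

# Illustrative sketch only: DOConv2d, the 4x9x9 feature layout, and all layer
# sizes are assumptions for demonstration, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DOConv2d(nn.Module):
    """Depthwise over-parameterized 2D convolution (DO-Conv).

    A trainable depthwise kernel D is folded into a conventional kernel W on
    every forward pass, so training sees extra parameters while inference
    costs the same as a single Conv2d.
    """

    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1, padding=1, d_mul=None):
        super().__init__()
        k = kernel_size
        self.k, self.stride, self.padding = k, stride, padding
        self.d_mul = d_mul or k * k                       # depth multiplier
        # Conventional kernel W: (out_ch, in_ch, d_mul)
        self.W = nn.Parameter(torch.randn(out_ch, in_ch, self.d_mul) * 0.02)
        # Depthwise kernel D: (in_ch, d_mul, k*k), started near identity so the
        # layer initially behaves like a plain convolution.
        D = torch.zeros(in_ch, self.d_mul, k * k)
        eye = torch.eye(k * k)[: self.d_mul]
        D[:, : eye.shape[0], :] = eye
        self.D = nn.Parameter(D)
        self.bias = nn.Parameter(torch.zeros(out_ch))

    def forward(self, x):
        # Collapse the two kernels: W'[o, c, :] = sum_d W[o, c, d] * D[c, d, :]
        W_prime = torch.einsum("ocd,cdk->ock", self.W, self.D)
        W_prime = W_prime.view(self.W.shape[0], self.W.shape[1], self.k, self.k)
        return F.conv2d(x, W_prime, self.bias, stride=self.stride, padding=self.padding)


if __name__ == "__main__":
    # Toy batch of differential-entropy feature maps:
    # (batch, frequency bands, height, width); the exact layout is an assumption.
    x = torch.randn(8, 4, 9, 9)
    feats = DOConv2d(4, 32)(x)                    # local features via DO-Conv
    tokens = feats.flatten(2).transpose(1, 2)     # (batch, 81 positions, 32 dims)
    encoder = nn.TransformerEncoder(
        nn.TransformerEncoderLayer(d_model=32, nhead=4, batch_first=True),
        num_layers=2,
    )
    pooled = encoder(tokens).mean(dim=1)          # global dependencies, then pooling
    logits = nn.Linear(32, 3)(pooled)             # e.g., SEED's three emotion classes
    print(logits.shape)                           # torch.Size([8, 3])

At inference time the folded kernel W' can be precomputed once, which is the main appeal of DO-Conv: extra capacity during training at no extra runtime cost.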


