Frontiers of Information Technology & Electronic Engineering

ISSN 2095-9184 (print), ISSN 2095-9230 (online)

LDformer: a parallel neural network model for long-term power forecasting

Abstract: Accurate long-term power forecasting is important for grid operation decisions and for customers' power consumption management, as it underpins a reliable power supply and the economical operation of the grid. However, most time-series forecasting models perform poorly on long-sequence prediction tasks involving large amounts of data. To address this challenge, we propose a parallel time-series prediction model called LDformer. First, we combine Informer with long short-term memory (LSTM) to obtain deep representations of the time series. Then, we propose a parallel encoder module to improve the robustness of the model, and combine convolutional layers with the attention mechanism to avoid value redundancy in attention. Finally, we propose a probabilistic sparse (ProbSparse) self-attention mechanism combined with UniDrop to reduce the computational overhead and mitigate the risk of losing key connections in the sequence. Experimental results on five datasets show that LDformer outperforms state-of-the-art methods in most cases across different long-sequence prediction tasks.

Key words: Long-term power forecasting; Long short-term memory (LSTM); UniDrop; Self-attention mechanism
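
To give a concrete picture of the ProbSparse self-attention plus UniDrop step described in the abstract, here is a minimal PyTorch sketch. It is an illustration only, not the authors' code: the class name, the sampling factor, the tensor shapes, and the plain attention-weight dropout standing in for UniDrop are all assumptions.

import math
import torch
import torch.nn as nn

class ProbSparseSelfAttention(nn.Module):
    # ProbSparse-style self-attention: only the u most "active" queries
    # attend in full; the remaining queries fall back to the mean of V.
    # For clarity this sketch scores every key (O(L^2)); the mechanism in
    # Informer also samples keys to reach O(L log L).
    def __init__(self, d_model, dropout=0.1, factor=5):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.drop = nn.Dropout(dropout)  # plain dropout standing in for UniDrop
        self.factor = factor             # assumed sampling factor

    def forward(self, x):                # x: (batch, seq_len, d_model)
        B, L, D = x.shape
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = q @ k.transpose(-2, -1) / math.sqrt(D)      # (B, L, L)
        # Sparsity measure: max minus mean of each query's score row.
        # Queries with a flat, near-uniform row carry little information.
        m = scores.max(dim=-1).values - scores.mean(dim=-1)  # (B, L)
        u = min(L, max(1, self.factor * math.ceil(math.log(L + 1))))
        top = m.topk(u, dim=-1).indices                      # (B, u)
        # Lazy queries default to the mean of V; active ones attend fully.
        out = v.mean(dim=1, keepdim=True).expand(B, L, D).clone()
        b = torch.arange(B, device=x.device).unsqueeze(-1)   # (B, 1) batch index
        attn = self.drop(torch.softmax(scores[b, top], dim=-1))
        out[b, top] = attn @ v                               # write (B, u, D) rows
        return out

# Hypothetical usage: an LSTM supplies the deep sequence representation
# that the sparse attention layer then refines, echoing the abstract's
# first step (all sizes below are made up for illustration).
lstm = nn.LSTM(input_size=7, hidden_size=64, batch_first=True)
layer = ProbSparseSelfAttention(d_model=64)
h, _ = lstm(torch.randn(8, 96, 7))  # 8 series, 96 time steps, 7 features
y = layer(h)                        # -> (8, 96, 64)

The top-u query selection is what cuts the attention cost on long sequences, while the dropout on the attention weights plays the regularizing role that UniDrop serves in the paper.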

Chinese Summary (translated): LDformer: a parallel neural network model for long-term power forecasting

Ran TIAN, Xinmei LI, Zhongyu MA, Yanxing LIU, Jingxia WANG, Chu WANG
College of Computer Science and Engineering, Northwest Normal University, Lanzhou 730070, China
Abstract: Accurate long-term power forecasting is very important for grid operation decisions and customer power consumption management, guaranteeing a reliable power supply and the economical operation of the grid. However, most time-series forecasting models perform poorly on long-sequence prediction tasks with large data volumes and high accuracy requirements. To address this challenge, a parallel time-series prediction model named LDformer is proposed. First, Informer is combined with a long short-term memory (LSTM) network to obtain deep representations of the time series. Second, a parallel encoder module is proposed to improve the robustness of the model, and convolutional layers are combined with the attention mechanism to avoid value redundancy in attention. Finally, a probabilistic sparse self-attention mechanism combined with UniDrop is proposed to reduce the computational overhead and mitigate the risk of losing key connections in the sequence. Experimental results on five real-world datasets show that LDformer outperforms state-of-the-art baselines on most of the different long-sequence prediction tasks.

Key words: Long-term power forecasting; Long short-term memory (LSTM); UniDrop; Self-attention mechanism


DOI: 10.1631/FITEE.2200540
CLC number: TP183
Full text downloaded: 773
Summary downloaded: 164
Clicked: 1182
Cited: 0
On-line Access: 2023-06-21
Received: 2022-11-03
Revision Accepted: 2023-09-21
Crosschecked: 2023-02-13
