Frontiers of Information Technology & Electronic Engineering
ISSN 2095-9184 (print), ISSN 2095-9230 (online)
2023 Vol.24 No.9 P.1287-1301
LDformer: a parallel neural network model for long-term power forecasting
Abstract: Accurate long-term power forecasting is important for power-grid decision-making and for customers' power-consumption management, as it helps ensure a reliable power supply and economical grid operation. However, most time-series forecasting models perform poorly on long-time-series prediction tasks involving large amounts of data. To address this challenge, we propose a parallel time-series prediction model called LDformer. First, we combine Informer with long short-term memory (LSTM) to obtain deep representations of the time series. Then, we propose a parallel encoder module to improve the robustness of the model, and combine convolutional layers with the attention mechanism to avoid value redundancy in attention. Finally, we propose a probabilistic sparse (ProbSparse) self-attention mechanism combined with UniDrop to reduce the computational overhead and mitigate the risk of losing key connections in the sequence. Experimental results on five datasets show that LDformer outperforms state-of-the-art methods in most cases across different long-time-series prediction tasks.
Key words: Long-term power forecasting; Long short-term memory (LSTM); UniDrop; Self-attention mechanism
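To make the attention component described in the abstract concrete, below is a minimal Python/NumPy sketch of a ProbSparse-style self-attention step with dropout applied to the attention weights. It is illustrative only: the names (prob_sparse_attention, sampling_factor) and the simplifications (full score computation instead of Informer's key sampling, plain dropout standing in for UniDrop) are assumptions for this sketch, not the authors' implementation.

# Sketch: ProbSparse-style self-attention with attention-weight dropout.
# Assumptions: single head, no masking; full scores are computed for clarity,
# whereas Informer samples keys to estimate query sparsity at lower cost.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def prob_sparse_attention(Q, K, V, sampling_factor=5, drop_rate=0.1, rng=None):
    """Scaled dot-product attention where only the top-u 'active' queries
    (ranked by a sparsity measurement) attend; the remaining 'lazy' queries
    fall back to the mean of V, following the ProbSparse idea from Informer."""
    rng = rng or np.random.default_rng(0)
    L_Q, d = Q.shape

    # Sparsity measurement M(q) = max_k(q.k/sqrt(d)) - mean_k(q.k/sqrt(d))
    scores = Q @ K.T / np.sqrt(d)                  # (L_Q, L_K)
    M = scores.max(axis=1) - scores.mean(axis=1)   # (L_Q,)

    # Keep only the u most "active" queries, u = c * ln(L_Q), clipped to L_Q
    u = min(L_Q, max(1, int(sampling_factor * np.log(L_Q))))
    top_idx = np.argsort(M)[-u:]

    # Lazy queries take the mean of V; active queries use full attention
    out = np.repeat(V.mean(axis=0, keepdims=True), L_Q, axis=0)
    attn = softmax(scores[top_idx], axis=-1)

    # Dropout on attention weights (a simple stand-in for the UniDrop component)
    mask = (rng.random(attn.shape) >= drop_rate).astype(attn.dtype)
    attn = attn * mask / (1.0 - drop_rate)

    out[top_idx] = attn @ V
    return out

# Toy usage: a 96-step sequence with 64-dimensional representations
rng = np.random.default_rng(1)
Q = rng.standard_normal((96, 64))
K = rng.standard_normal((96, 64))
V = rng.standard_normal((96, 64))
print(prob_sparse_attention(Q, K, V).shape)  # (96, 64)

In the paper's setting this kind of sparse attention is what keeps the cost of long input sequences manageable, while the dropout-style regularization addresses overfitting without severing too many query-key connections.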
College of Computer Science and Engineering, Northwest Normal University, Lanzhou 730070, China
DOI: 10.1631/FITEE.2200540
CLC number: TP183
On-line Access: 2024-08-27
Received: 2023-10-17
Revision Accepted: 2024-05-08
Crosschecked: 2023-02-13