Frontiers of Information Technology & Electronic Engineering
ISSN 2095-9184 (print), ISSN 2095-9230 (online)
2025 Vol.26 No.2 P.301-308
TSNet: a foundation model for wireless network status prediction in digital twins
Abstract: Predicting future network status is a key capability of digital twin networks: it helps operators estimate network performance and take proactive measures in advance. Existing methods, including statistical, machine learning-based, and deep learning-based methods, suffer from limited generalization ability and heavy dependence on training data. To overcome these drawbacks, inspired by the success of the pretraining and fine-tuning framework in the natural language processing (NLP) and computer vision (CV) domains, we propose TSNet, a Transformer-based foundation model for predicting various network status measurements in digital twins. To adapt the Transformer architecture to time-series data, frequency learning attention and time-series decomposition blocks are introduced. We also design a fine-tuning strategy that enables TSNet to adapt to new data or scenarios. Experiments demonstrate that zero-shot prediction with TSNet, without any training data, outperforms fully supervised baseline methods. Prediction accuracy can be further improved by fine-tuning the proposed model. Overall, TSNet exhibits strong generalization capability and achieves high accuracy across various datasets.
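The abstract mentions time-series decomposition blocks and a frequency-domain view of the input. The paper itself does not give implementation details here; the following is a minimal sketch of the standard moving-average decomposition used in Transformer time-series models, plus a real-FFT magnitude transform as a stand-in for the representation a frequency learning attention block would operate on. The function names and the kernel size are illustrative assumptions, not taken from TSNet.

```python
import numpy as np

def series_decompose(x, kernel_size=25):
    """Split a 1-D series into trend and seasonal parts via moving average.

    kernel_size is a hypothetical choice for illustration; the padding
    replicates the endpoints so the output keeps the input length.
    """
    pad = kernel_size // 2
    padded = np.concatenate([np.full(pad, x[0]), x, np.full(pad, x[-1])])
    trend = np.convolve(padded, np.ones(kernel_size) / kernel_size, mode="valid")
    seasonal = x - trend  # residual after removing the smoothed trend
    return trend, seasonal

def frequency_features(x):
    """Magnitudes of the real FFT: a simple frequency-domain view of the
    series, as a frequency-aware attention block might consume."""
    return np.abs(np.fft.rfft(x))
```

Decomposing the input before attention lets the model treat the slowly varying trend and the periodic residual separately, which is the usual motivation for such blocks in time-series Transformers.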
Huawei Technologies Co., Ltd., Shenzhen 518129, China
DOI: 10.1631/FITEE.2400295
On-line Access: 2025-03-07
Received: 2024-04-16
Revision Accepted: 2024-12-03
Crosschecked: 2025-03-07