Frontiers of Information Technology & Electronic Engineering
ISSN 2095-9184 (print), ISSN 2095-9230 (online)
2025 Vol.26 No.10 P.1793-1808
Knowledge distillation for financial large language models: a systematic review of strategies, applications, and evaluation
Abstract: Financial large language models (FinLLMs) offer immense potential for financial applications. However, excessive deployment costs and considerable inference latency constitute major obstacles; as a prominent compression methodology, knowledge distillation (KD) offers an effective solution to these difficulties. This work conducts a comprehensive survey of how KD interacts with FinLLMs, covering three core aspects: strategy, application, and evaluation. At the strategy level, this review introduces a structured taxonomy to comparatively analyze existing distillation pathways. At the application level, it puts forward a logical upstream–midstream–downstream framework to systematically explain the practical value of distilled models in the financial field. At the evaluation level, to address the absence of standards in the financial field, it constructs a comprehensive evaluation framework spanning multiple dimensions such as financial accuracy, reasoning fidelity, and robustness. In summary, this research aims to provide a clear roadmap for this interdisciplinary field and to accelerate the development of distilled FinLLMs.
Key words: Financial large language models (FinLLMs); Knowledge distillation; Model compression; Quantitative trading
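The core technique surveyed here, knowledge distillation, trains a small student model to match a large teacher's softened output distribution. A minimal sketch of the classic response-based distillation loss (temperature-scaled softmax plus KL divergence, scaled by T^2 as in the standard formulation) is shown below; the function names and toy logits are illustrative and not taken from the paper:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher temperature softens the
    # distribution, exposing the teacher's "dark knowledge" about
    # similarities between classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) between temperature-softened distributions,
    # multiplied by T^2 so gradients keep a comparable scale across
    # temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student that matches the teacher exactly incurs zero loss;
# a divergent student incurs a positive loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # 0.0
print(distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0]) > 0)  # True
```

In practice this soft-label term is combined with the ordinary cross-entropy on ground-truth labels, and for LLM distillation it is applied token by token over the vocabulary distribution.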
1Ping An Technology (Shenzhen) Co., Ltd., Shenzhen 518046, China
2Institute of Advanced Technology, University of Science and Technology of China, Hefei 230027, China
DOI: 10.1631/FITEE.2500282
CLC number: TP391
On-line Access: 2025-11-17
Received: 2025-04-30
Revision Accepted: 2025-11-18
Crosschecked: 2025-09-05