
Frontiers of Information Technology & Electronic Engineering

ISSN 2095-9184 (print), ISSN 2095-9230 (online)

Efficient decoding self-attention for end-to-end speech synthesis

Abstract: Self-attention has been innovatively applied to text-to-speech (TTS) because of its parallel structure and strength in modeling sequential data. However, when it is used in end-to-end speech synthesis with an autoregressive decoding scheme, inference is relatively slow owing to the quadratic complexity in sequence length. This problem becomes particularly severe on devices without graphics processing units (GPUs). To alleviate this problem, we propose an efficient decoding self-attention (EDSA) module as an alternative. Combined with a dynamic programming decoding procedure, it accelerates TTS model inference to linear computational complexity. Experiments on Mandarin and English datasets show that the proposed model with EDSA achieves 720% and 50% higher inference speed on the central processing unit (CPU) and GPU, respectively, with almost the same performance. This method therefore eases the deployment of such models when GPU resources are limited. In addition, our model may perform better than the baseline Transformer TTS on out-of-domain utterances.

Key words: Efficient decoding; End-to-end; Self-attention; Speech synthesis
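
The abstract's central claim is that a dynamic programming decoding procedure reduces autoregressive self-attention inference from quadratic to linear complexity in sequence length. The paper's actual EDSA formulation is not reproduced on this page; purely as an illustrative sketch of how a constant-size recurrent state can replace explicit attention over the whole decoded prefix, the toy code below uses a generic kernelized linear-attention recurrence. All names and the feature map here are assumptions of the sketch, not the authors' implementation.

    import numpy as np

    def phi(x):
        # Positive feature map; this elu(x)+1-style choice is an assumption
        # of the sketch, not taken from the paper.
        return np.maximum(x, 0.0) + 1.0

    class LinearDecodingAttention:
        """Toy causal self-attention with a constant-size running state:
        T decoding steps cost O(T d^2) overall (linear in length T)
        instead of the O(T^2 d) of explicit attention over the prefix."""

        def __init__(self, d):
            self.S = np.zeros((d, d))  # running sum of outer(phi(k), v)
            self.z = np.zeros(d)       # running sum of phi(k)

        def step(self, q, k, v):
            # Dynamic-programming update: fold the new key/value pair
            # into the state, then read out the output for query q.
            self.S += np.outer(phi(k), v)
            self.z += phi(k)
            return (phi(q) @ self.S) / (phi(q) @ self.z + 1e-6)

    rng = np.random.default_rng(0)
    attn = LinearDecodingAttention(d=8)
    for _ in range(100):             # each step costs O(d^2), independent of t
        x = rng.standard_normal(8)   # self-attention: q = k = v = x
        y = attn.step(x, x, x)
    print(y.shape)                   # (8,)

With explicit masked attention, step t must attend over a length-t prefix at O(t·d) cost; folding the prefix into a running state makes every step O(d^2) regardless of position, which is the kind of saving that a CPU speedup on long autoregressive mel-spectrogram sequences would reflect.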

Chinese Summary: An efficient decoding self-attention network for end-to-end speech synthesis

Wei ZHAO 1,2, Li XU 1,2
1 College of Electrical Engineering, Zhejiang University, Hangzhou 310027, China
2 Robotics Institute of Zhejiang University, Yuyao 315400, China

Abstract: Self-attention networks are widely used in speech synthesis (TTS) owing to their parallel structure and strong sequence-modeling capability. However, when autoregressive decoding is used for end-to-end speech synthesis, inference is relatively slow because of the quadratic complexity in sequence length. This efficiency problem becomes more severe when the deployment device has no graphics processing unit (GPU). To solve this problem, an efficient decoding self-attention network (EDSA) is proposed as an alternative. With a dynamic programming decoding procedure, TTS model inference is effectively accelerated to linear computational complexity. Experimental results on Mandarin and English datasets show that the proposed EDSA model improves inference speed by 720% and 50% on the central processing unit (CPU) and GPU, respectively, with almost the same performance. Hence, this method eases the deployment of such models when GPU resources are limited. Moreover, the proposed model may outperform the baseline Transformer TTS on out-of-domain utterances.

Key words: Efficient decoding; End-to-end; Self-attention network; Speech synthesis



DOI: 10.1631/FITEE.2100501
CLC number: TN912.3
Full text downloaded: 4462
Summary downloaded: 300
Clicked: 2283
Cited: 0
On-line Access: 2022-07-21
Received: 2021-10-21
Revision Accepted: 2022-07-21
Crosschecked: 2022-01-09
