
Frontiers of Information Technology & Electronic Engineering

ISSN 2095-9184 (print), ISSN 2095-9230 (online)

Fractional-order global optimal backpropagation machine trained by an improved fractional-order steepest descent method

Abstract: We introduce the fractional-order global optimal backpropagation machine, which is trained by an improved fractional-order steepest descent method (FSDM). This machine is a fractional-order backpropagation neural network (FBPNN), a state-of-the-art fractional-order branch of the family of backpropagation neural networks (BPNNs), and it differs from the majority of previous classic first-order BPNNs, which are trained by the traditional first-order steepest descent method. The reverse incremental search of the proposed FBPNN proceeds in the negative directions of the approximate fractional-order partial derivatives of the mean square error. First, the theoretical concept of an FBPNN trained by an improved FSDM is described mathematically. Then, the mathematical proof of fractional-order global optimal convergence, an assumption on the network structure, and the fractional-order multi-scale global optimization of the FBPNN are analyzed in detail. Finally, we perform three types of experiments comparing the performance of an FBPNN with that of a classic first-order BPNN: example function approximation, fractional-order multi-scale global optimization, and a comparison of global search and error-fitting abilities on real data. The superior ability of an FBPNN to locate the global optimal solution is its major advantage over a classic first-order BPNN.

Key words: Fractional calculus, Fractional-order backpropagation algorithm, Fractional-order steepest descent method, Mean square error, Fractional-order multi-scale global optimization
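The core idea of fractional-order steepest descent can be illustrated with a minimal sketch. This is not the authors' FBPNN implementation: the one-term Caputo-type approximation D^v f(w) ≈ f'(w)·|w − w0|^(1−v)/Γ(2−v), and the names `fractional_sd`, `grad`, and `w0`, are illustrative assumptions. Setting the order v = 1 recovers classic first-order steepest descent.

```python
import math

def fractional_sd(grad, w, w0, order=0.9, lr=0.1, steps=100):
    """Sketch of fractional-order steepest descent on a scalar parameter.

    Assumes the common one-term Caputo-type approximation
        D^v f(w) ~= f'(w) * |w - w0|^(1 - v) / gamma(2 - v),
    where w0 is the lower terminal of the fractional derivative.
    With order == 1 the update reduces to w <- w - lr * f'(w).
    """
    c = 1.0 / math.gamma(2.0 - order)          # 1 / Gamma(2 - v)
    for _ in range(steps):
        # Approximate fractional-order derivative of the error at w
        frac_grad = grad(w) * c * abs(w - w0) ** (1.0 - order)
        # Incremental search in the negative direction of that derivative
        w = w - lr * frac_grad
    return w

# Toy example: minimize E(w) = (w - 3)^2, so grad E = 2(w - 3).
w_star = fractional_sd(lambda w: 2.0 * (w - 3.0), w=5.0, w0=0.0, order=0.9)
```

For this convex toy error the iterate approaches the minimizer w = 3; the fractional order v and the terminal w0 reshape the effective step size, which is the lever the paper exploits for multi-scale global search.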

Chinese Summary: Fractional-order global optimal backpropagation machine trained by an improved fractional-order steepest descent method

Yi-fei Pu1, Jian Wang2
1College of Computer Science, Sichuan University, Chengdu 610065, China
2College of Science, China University of Petroleum (East China), Qingdao 266580, China



DOI: 10.1631/FITEE.1900593
CLC number: O235; N93


On-line Access: 2020-06-12
Received: 2019-10-31
Revision Accepted: 2020-01-13
Crosschecked: 2020-03-31

Journal of Zhejiang University-SCIENCE, 38 Zheda Road, Hangzhou 310027, China
Tel: +86-571-87952276; Fax: +86-571-87952331; E-mail: jzus@zju.edu.cn
Copyright © 2000~ Journal of Zhejiang University-SCIENCE