## Abstract

Efficient learning by the backpropagation (BP) algorithm is required for many practical applications. The BP algorithm calculates the weight changes of artificial neural networks, and a common approach is to use a two-term algorithm consisting of a learning rate (LR) and a momentum factor (MF). The major drawbacks of the two-term BP learning algorithm are the problems of local minima and slow convergence speeds, which limit the scope for real-time applications. Recently, the addition of an extra term, called a proportional factor (PF), to the two-term BP algorithm was proposed. The third term increases the speed of the BP algorithm. However, the PF term can also degrade the convergence of the BP algorithm, and criteria for evaluating convergence are required to facilitate the application of the three-term BP algorithm. This paper analyzes the convergence of the new three-term backpropagation algorithm. If the learning parameters of the three-term BP algorithm satisfy the conditions given in this paper, then it is guaranteed that the system is stable and will converge to a local minimum. It is proved that if at least one of the eigenvalues of the matrix F (composed of the Hessian of the cost function and the system Jacobian of the error vector at each iteration) is negative, then the system becomes unstable. The paper also shows that all the local minima of the three-term BP algorithm cost function are stable. Relationships between the learning parameters are established such that the stability conditions are met.
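To make the update rule concrete, here is a minimal sketch of a three-term weight update on a toy scalar problem. The exact form used in the paper is not reproduced here; this assumes the commonly cited structure in which the weight change combines a gradient term (learning rate), the previous weight change (momentum factor), and the current output error (proportional factor). The names `lr`, `mf`, and `pf` and the toy cost are illustrative.

```python
import numpy as np

def three_term_update(w, grad, prev_dw, error, lr=0.1, mf=0.5, pf=0.01):
    """One assumed three-term BP step:
    dw(k) = -lr * grad(k) + mf * dw(k-1) + pf * e(k)
    combining learning rate, momentum factor, and proportional factor."""
    dw = -lr * grad + mf * prev_dw + pf * error
    return w + dw, dw

# Toy demo: minimize the quadratic cost E(w) = 0.5 * (t - w*x)**2.
x, t = 1.0, 2.0          # scalar input and target
w, dw = 0.0, 0.0          # initial weight and previous weight change
for _ in range(100):
    e = t - w * x         # output error e(k)
    grad = -e * x         # gradient dE/dw
    w, dw = three_term_update(w, grad, dw, e)

print(w)                  # approaches the minimizer t/x = 2.0
```

For this quadratic, the iteration is a linear recurrence in the error and weight change; with these illustrative parameter values its eigenvalues lie inside the unit circle, so the weight converges, which mirrors the kind of eigenvalue-based stability condition the paper derives.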

| Original language | British English |
|---|---|
| Pages (from-to) | 1341-1347 |
| Number of pages | 7 |
| Journal | Neural Networks |
| Volume | 18 |
| Issue number | 10 |
| DOIs | |
| State | Published - Dec 2005 |

## Keywords

- Backpropagation
- Jury test
- Neural networks
- Stability