Pubmed for Handhelds
Title: Global convergence of online BP training with dynamic learning rate.
Author: Zhang R, Xu ZB, Huang GB, Wang D.
Journal: IEEE Trans Neural Netw Learn Syst; 2012 Feb; 23(2):330-41.
PubMed ID: 24808511.

Abstract: The online backpropagation (BP) training procedure has been extensively explored in scientific research and engineering applications. One of the main factors affecting the performance of online BP training is the learning rate. This paper proposes a new dynamic learning rate based on an estimate of the minimum error. The global convergence theory of the online BP training procedure with the proposed learning rate is then studied. It is proved that: 1) the error sequence converges to the global minimum error; and 2) the weight sequence converges to a fixed point at which the error function attains its global minimum. The obtained global convergence theory underlies the successful applications of the online BP training procedure. Illustrative examples are provided to support the theoretical analysis.
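To make the abstract's idea concrete, below is a minimal sketch of online BP training whose step size is driven by an estimate of the minimum error. The specific rule used here, eta_t = (E_t - E_min) / ||grad E_t||^2 (a Polyak-style step size, capped for stability), is an illustrative assumption chosen because it likewise depends on an estimated minimum error; it is not the exact learning-rate formula from the paper, and the network, data, and constants are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) on [-pi, pi] (illustrative, not from the paper)
X = rng.uniform(-np.pi, np.pi, size=(200, 1))
Y = np.sin(X)

# One-hidden-layer network with tanh activation
H = 8
W1 = rng.normal(0, 0.5, size=(1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, size=(H, 1)); b2 = np.zeros(1)

E_min_est = 0.0   # assumed estimate of the global minimum error
eps = 1e-12

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

def total_error():
    _, out = forward(X)
    return float(np.mean(0.5 * np.sum((out - Y) ** 2, axis=1)))

initial = total_error()
for epoch in range(30):
    # Online BP: update the weights after every single training sample
    for i in rng.permutation(len(X)):
        x, y = X[i:i + 1], Y[i:i + 1]
        h, out = forward(x)
        # Backpropagated gradients of the squared error on this sample
        d_out = out - y                       # (1, 1)
        gW2 = h.T @ d_out
        gb2 = d_out.ravel()
        d_h = (d_out @ W2.T) * (1 - h ** 2)   # (1, H)
        gW1 = x.T @ d_h
        gb1 = d_h.ravel()
        # Dynamic learning rate from the estimated minimum error
        # (Polyak-style choice, an assumption; capped at 1.0 for stability)
        g_sq = (gW1 ** 2).sum() + (gb1 ** 2).sum() + (gW2 ** 2).sum() + (gb2 ** 2).sum()
        E_t = 0.5 * float(np.sum((out - y) ** 2))
        eta = min((E_t - E_min_est) / (g_sq + eps), 1.0)
        W1 -= eta * gW1; b1 -= eta * gb1
        W2 -= eta * gW2; b2 -= eta * gb2

final = total_error()
print(f"mean error before: {initial:.4f}  after: {final:.4f}")
```

The step size shrinks automatically as the current error approaches the estimated minimum, which mirrors the qualitative behavior the abstract describes; the paper's convergence proof applies to its own rate formula, not to this sketch.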