An Adaptive Stochastic Nesterov Accelerated Quasi Newton Method for Training RNNs

Published 9 Sep 2019 in cs.LG and stat.ML (arXiv:1909.03620v1)

Abstract: A common problem in training neural networks is the vanishing and/or exploding gradient problem, which is seen most prominently when training Recurrent Neural Networks (RNNs), and several algorithms have been proposed to address it. This paper proposes a novel adaptive stochastic Nesterov accelerated quasi-Newton (aSNAQ) method for training RNNs. aSNAQ is an accelerated method that combines Nesterov's gradient term with second-order curvature information. The performance of the proposed method is evaluated in TensorFlow on benchmark sequence-modeling problems. The results show improved performance at a low per-iteration cost, indicating that the method can be used effectively to train RNNs.
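The abstract describes an update that combines a Nesterov lookahead step with second-order curvature information maintained at low per-iteration cost, i.e., a limited-memory quasi-Newton approximation. The sketch below illustrates that general idea in NumPy; it is a minimal illustration, not the authors' exact aSNAQ algorithm, and the names `grad_fn`, `mu`, `alpha`, and `memory` are placeholders chosen for this example.

```python
# Minimal sketch of a stochastic Nesterov-accelerated quasi-Newton update:
# gradient evaluated at the momentum lookahead point, search direction from
# an L-BFGS-style two-loop recursion over stored curvature pairs.
from collections import deque
import numpy as np

def two_loop_direction(grad, pairs):
    """Approximate -H @ grad via the L-BFGS two-loop recursion over (s, y) pairs."""
    q = grad.copy()
    alphas = []
    for s, y in reversed(pairs):          # newest pair first
        rho = 1.0 / float(y @ s)
        a = rho * float(s @ q)
        alphas.append((a, rho, s, y))
        q = q - a * y
    if pairs:                             # scale by gamma = s'y / y'y (newest pair)
        s, y = pairs[-1]
        q *= float(s @ y) / float(y @ y)
    for a, rho, s, y in reversed(alphas): # oldest pair first
        b = rho * float(y @ q)
        q = q + (a - b) * s
    return -q                             # descent direction

def nesterov_qn_step(w, v, grad_fn, pairs, mu=0.9, alpha=0.01, memory=8, eps=1e-8):
    """One update: stochastic gradient at the Nesterov lookahead, quasi-Newton direction."""
    g = grad_fn(w + mu * v)               # gradient at the lookahead point
    d = two_loop_direction(g, list(pairs))
    v_new = mu * v + alpha * d
    w_new = w + v_new
    s = w_new - w                         # candidate curvature pair
    y = grad_fn(w_new) - g
    if s @ y > eps * (s @ s):             # keep the approximation positive definite
        pairs.append((s, y))
        if len(pairs) > memory:
            pairs.popleft()               # bounded memory keeps per-step cost low
    return w_new, v_new

if __name__ == "__main__":
    # toy usage: minimize an ill-conditioned quadratic with noisy gradients
    rng = np.random.default_rng(0)
    A = np.diag([1.0, 10.0, 100.0])
    grad_fn = lambda w: A @ w + 0.01 * rng.standard_normal(3)
    w, v, pairs = np.ones(3), np.zeros(3), deque()
    for _ in range(200):
        w, v = nesterov_qn_step(w, v, grad_fn, pairs)
    print(w)                              # approaches the minimizer at the origin
```

The `s @ y > 0` check before storing a curvature pair is the standard safeguard that keeps the implicit Hessian approximation positive definite, which matters when gradients are stochastic and curvature estimates are noisy.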

Citations (3)
