Nesterov's Acceleration For Approximate Newton

Published 17 Oct 2017 in cs.LG (arXiv:1710.08496v1)

Abstract: Optimization plays a key role in machine learning. Recently, stochastic second-order methods have attracted much attention because of their low per-iteration computational cost. However, these algorithms may perform poorly when the Hessian is hard to approximate both accurately and efficiently, and to our knowledge there is no effective way to handle this problem. In this paper, we resort to Nesterov's acceleration technique to improve the convergence of a class of second-order methods called approximate Newton. We give a theoretical analysis showing that Nesterov's acceleration improves the convergence of approximate Newton just as it does for first-order methods. Accordingly, we propose an accelerated regularized sub-sampled Newton method. In experiments, the accelerated algorithm performs much better than the original regularized sub-sampled Newton method, empirically validating our theory. Moreover, its performance is comparable to, and sometimes better than, that of classical algorithms.
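The abstract outlines the method's structure: take a regularized sub-sampled Newton step, but from a Nesterov look-ahead point rather than the current iterate. The sketch below illustrates that structure on l2-regularized logistic regression. It is a minimal reconstruction under assumptions, not the paper's algorithm: the function and parameter names (`accelerated_subsampled_newton`, `momentum`, `reg`, `sample_size`), the constant momentum coefficient, and the fixed Hessian regularizer are all hypothetical choices made for this example.

```python
import numpy as np

def logistic_grad(w, X, y, lam):
    """Gradient of l2-regularized logistic loss; labels y are in {-1, +1}."""
    z = np.clip(y * (X @ w), -35.0, 35.0)   # clip to avoid overflow in exp
    s = -y / (1.0 + np.exp(z))
    return X.T @ s / len(y) + lam * w

def subsampled_hessian(w, X, lam, sample_size, rng):
    """Hessian estimate built from a uniform subsample of the data rows."""
    idx = rng.choice(len(X), size=sample_size, replace=False)
    Xs = X[idx]
    p = 1.0 / (1.0 + np.exp(-np.clip(Xs @ w, -35.0, 35.0)))  # sigmoid; y**2 = 1 cancels
    d = p * (1.0 - p)
    return (Xs * d[:, None]).T @ Xs / sample_size + lam * np.eye(X.shape[1])

def accelerated_subsampled_newton(X, y, lam=1e-3, reg=1e-2, momentum=0.9,
                                  sample_size=256, iters=50, seed=0):
    """Nesterov-extrapolated regularized sub-sampled Newton (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    dim = X.shape[1]
    w = np.zeros(dim)
    w_prev = w.copy()
    for _ in range(iters):
        # Nesterov look-ahead point.
        v = w + momentum * (w - w_prev)
        g = logistic_grad(v, X, y, lam)
        # Regularize the sub-sampled Hessian (reg * I) so the solve stays stable
        # even when the subsample approximates the true Hessian poorly.
        H = subsampled_hessian(v, X, lam, sample_size, rng) + reg * np.eye(dim)
        w_prev = w
        w = v - np.linalg.solve(H, g)        # approximate Newton step at v
    return w

# Tiny synthetic demo.
rng = np.random.default_rng(1)
X = rng.standard_normal((1000, 20))
y = np.sign(X @ rng.standard_normal(20) + 0.1 * rng.standard_normal(1000))
w_hat = accelerated_subsampled_newton(X, y)
print(np.linalg.norm(logistic_grad(w_hat, X, y, 1e-3)))  # gradient norm at the solution
```

The point the sketch tries to capture is the ordering: extrapolate first, then evaluate the gradient and the regularized sub-sampled Hessian at the look-ahead point `v`, mirroring how Nesterov's scheme wraps a first-order step.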

Citations (11)
