
$S^{2}$-LBI: Stochastic Split Linearized Bregman Iterations for Parsimonious Deep Learning

Published 24 Apr 2019 in stat.ML, cs.CV, and cs.LG | (1904.10873v1)

Abstract: This paper proposes a novel Stochastic Split Linearized Bregman Iteration ($S^{2}$-LBI) algorithm to efficiently train deep networks. $S^{2}$-LBI introduces an iterative regularization path with structural sparsity, combining the computational efficiency of LBI with model selection consistency in learning structural sparsity. The computed solution path intrinsically enables us to enlarge or simplify a network, a property that theoretically follows from the dynamics of the $S^{2}$-LBI algorithm. Experimental results validate $S^{2}$-LBI on the MNIST and CIFAR-10 datasets. For example, on MNIST we can either boost a network with only 1.5K parameters (1 convolutional layer of 5 filters and 1 FC layer) to 98.40\% recognition accuracy, or prune $82.5\%$ of the parameters in the LeNet-5 network while still achieving 98.47\% recognition accuracy. In addition, we have learning results on ImageNet, which will be added in the next version of this report.
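The abstract does not spell out the split or stochastic components of $S^{2}$-LBI, but the core building block it extends, the Linearized Bregman Iteration, can be sketched. The following is a minimal, hedged illustration of a plain (non-split, full-gradient) LBI with $\ell_1$ soft-thresholding on a toy sparse least-squares problem; the hyperparameter names `alpha` (step size) and `kappa` (damping scale) and all numerical settings are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def soft_threshold(z, lam):
    # Proximal map of the l1 norm: shrink each coordinate toward zero.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def linearized_bregman(grad_fn, dim, alpha=0.01, kappa=10.0, steps=2000):
    # Sketch of plain Linearized Bregman Iteration (assumed form, not the
    # paper's split/stochastic variant). The dual variable z accumulates
    # gradient information; the primal w follows a sparse regularization
    # path obtained by soft-thresholding z.
    z = np.zeros(dim)
    w = np.zeros(dim)
    for _ in range(steps):
        z -= alpha * grad_fn(w)          # dual (gradient) accumulation
        w = kappa * soft_threshold(z, 1.0)  # sparse primal update
    return w

# Toy example: recover a 3-sparse vector from a least-squares loss.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]            # only 3 nonzero coefficients
b = A @ x_true

grad = lambda w: A.T @ (A @ w - b) / len(b)
x_hat = linearized_bregman(grad, dim=20)
print(np.count_nonzero(np.abs(x_hat) > 1e-3))  # sparsity of the estimate
```

Early in the iteration every coordinate of `w` is exactly zero; coordinates enter the model only once their accumulated dual variable crosses the threshold, which is what produces the regularization path of progressively denser models that the abstract refers to.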
