Generalization of GANs and overparameterized models under Lipschitz continuity

Published 6 Apr 2021 in cs.LG (arXiv:2104.02388v2)

Abstract: Generative adversarial networks (GANs) are so complex that existing learning theories do not satisfactorily explain why they succeed so well in practice. The same question remains largely open for deep neural networks. To fill this gap, we introduce a Lipschitz theory for analyzing generalization. We demonstrate its simplicity by analyzing the generalization and consistency of overparameterized neural networks, and then use the theory to derive Lipschitz-based generalization bounds for GANs. Our bounds show that penalizing the Lipschitz constant of the GAN loss can improve generalization. This result resolves the long-standing mystery of why imposing Lipschitz constraints on GANs often works so well in practice despite lacking a solid theoretical justification. Finally, and perhaps surprisingly, we show that when using Dropout or spectral normalization, both truly deep neural networks and GANs can generalize well without suffering the curse of dimensionality.
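
The Lipschitz constraints the abstract refers to are, in practice, usually imposed on the discriminator via spectral normalization or a gradient penalty. The sketch below is a minimal PyTorch illustration of both, not the paper's own code: the architecture, layer sizes, and helper names are illustrative assumptions.

```python
import torch
import torch.nn as nn

# A small discriminator whose linear layers are wrapped in spectral
# normalization; this rescales each weight matrix by its largest singular
# value, bounding every layer's (and hence the network's) Lipschitz constant.
# The architecture here is an assumed example, not the paper's model.
def make_discriminator(in_dim: int = 784) -> nn.Module:
    sn = nn.utils.spectral_norm
    return nn.Sequential(
        sn(nn.Linear(in_dim, 256)),
        nn.LeakyReLU(0.2),
        sn(nn.Linear(256, 256)),
        nn.LeakyReLU(0.2),
        sn(nn.Linear(256, 1)),
    )

# A WGAN-GP-style gradient penalty: penalizes the discriminator's gradient
# norm for deviating from 1 at points interpolated between real and fake
# samples, a common practical surrogate for a Lipschitz penalty on the loss.
def gradient_penalty(disc: nn.Module,
                     real: torch.Tensor,
                     fake: torch.Tensor) -> torch.Tensor:
    alpha = torch.rand(real.size(0), 1, device=real.device)
    interp = (alpha * real + (1 - alpha) * fake).requires_grad_(True)
    scores = disc(interp)
    grads, = torch.autograd.grad(
        outputs=scores.sum(), inputs=interp, create_graph=True,
    )
    return ((grads.norm(2, dim=1) - 1) ** 2).mean()
```

In training, the penalty would typically be added to the discriminator loss with some weight (WGAN-GP commonly uses 10), whereas spectral normalization needs no extra loss term since the weights are renormalized on every forward pass.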
