Many Paths to Equilibrium: GANs Do Not Need to Decrease a Divergence At Every Step

Authors
William Fedus, Mihaela Rosca, Balaji Lakshminarayanan, Andrew M. Dai, Shakir Mohamed, Ian Goodfellow

Generative adversarial networks (GANs) are a family of generative models that do not minimize a single training criterion. Unlike other generative models, the data distribution is learned via a game between a generator (the generative model) and a discriminator (a teacher providing training signal) that each minimize their own cost. GANs are designed to reach a Nash equilibrium at which each player cannot reduce their cost without changing the other players' parameters. One useful approach for the theory of GANs is to show that a divergence between the training distribution and the model distribution obtains its minimum value at equilibrium. Several recent research directions have been motivated by the idea that this divergence is the primary guide for the learning process and that every step of learning should decrease the divergence. We show that this view is overly restrictive. During GAN training, the discriminator provides learning signal in situations where the gradients of the divergences between distributions would not be useful. We provide empirical counterexamples to the view of GAN training as divergence minimization. Specifically, we demonstrate that GANs are able to learn distributions in situations where the divergence minimization point of view predicts they would fail. We also show that gradient penalties motivated from the divergence minimization perspective are equally helpful when applied in other contexts in which the divergence minimization perspective does not predict they would be helpful. This contributes to a growing body of evidence that GAN training may be more usefully viewed as approaching Nash equilibria via trajectories that do not necessarily minimize a specific divergence at each step.
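To make the setup concrete, below is a minimal sketch (in PyTorch, not the authors' code) of the alternating two-player updates the abstract describes: the discriminator and generator each take a step on their own cost, and a gradient penalty is added to the discriminator's loss even though the non-saturating game it is attached to has no divergence-minimization motivation for it. The network sizes, learning rates, penalty weight, and the choice to evaluate the penalty at real data points are all illustrative assumptions.

```python
import torch
import torch.nn as nn

latent_dim, data_dim, penalty_weight = 16, 2, 10.0

# Small illustrative networks; architectures are assumptions, not the paper's.
G = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(), nn.Linear(64, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real):
    batch = real.shape[0]

    # Discriminator update: minimize its own classification cost.
    z = torch.randn(batch, latent_dim)
    fake = G(z).detach()                      # generator held fixed here
    real = real.clone().requires_grad_(True)  # track grads w.r.t. inputs
    real_logits = D(real)
    fake_logits = D(fake)
    d_loss = (bce(real_logits, torch.ones(batch, 1))
              + bce(fake_logits, torch.zeros(batch, 1)))

    # Gradient penalty on the discriminator's input gradients at real points,
    # applied to the non-saturating game where divergence minimization does
    # not predict it should help (an illustrative placement).
    grad = torch.autograd.grad(real_logits.sum(), real, create_graph=True)[0]
    d_loss = d_loss + penalty_weight * (grad.norm(2, dim=1) ** 2).mean()
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator update: non-saturating loss, i.e. minimize -log D(G(z)).
    z = torch.randn(batch, latent_dim)
    g_loss = bce(D(G(z)), torch.ones(batch, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()

# Example: one update on a toy 2-D Gaussian batch.
d_loss, g_loss = train_step(torch.randn(128, data_dim))
```

Note that neither step above monitors or decreases any single divergence between the data and model distributions; each player simply descends its own cost, which is exactly the trajectory-to-equilibrium view the paper advocates.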
