Path: blob/master/notebooks/book2/26/gan_mixture_of_gaussians.ipynb
This notebook implements a Generative Adversarial Network to fit a synthetic dataset generated from a mixture of Gaussians in 2D.
The code was adapted from the ODEGAN code here: https://github.com/deepmind/deepmind-research/blob/master/ode_gan/odegan_mog16.ipynb. The original notebook was created by Chongli Qin.
Some modifications made by Mihaela Rosca were also incorporated.
Imports
Data Generation
Data is generated from a 2D mixture of Gaussians.
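A minimal sketch of the data generator, assuming the means lie on a grid in the 2D plane (the function name, grid size, and component standard deviation here are illustrative choices, not taken from the original code):

```python
import numpy as np

def sample_mog(n_samples, n_per_side=4, std=0.02, scale=2.0, seed=0):
    """Sample points from a 2D mixture of Gaussians.

    The component means lie on an n_per_side x n_per_side grid
    (16 modes with the defaults); each component is isotropic
    with standard deviation `std`.
    """
    rng = np.random.default_rng(seed)
    coords = np.linspace(-scale, scale, n_per_side)
    means = np.array([(x, y) for x in coords for y in coords])
    # Pick a mixture component uniformly at random for each sample,
    # then add Gaussian noise around its mean.
    idx = rng.integers(len(means), size=n_samples)
    return means[idx] + std * rng.normal(size=(n_samples, 2))

data = sample_mog(512)
```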
Plotting
Models and Training
A multilayer perceptron with the ReLU activation function.
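Such an MLP can be sketched in plain JAX as follows (layer sizes and the He-style initialization scale are assumptions for illustration):

```python
import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    """Initialize (weight, bias) pairs for an MLP with the given layer sizes."""
    params = []
    for d_in, d_out in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (d_in, d_out)) * jnp.sqrt(2.0 / d_in)
        params.append((w, jnp.zeros(d_out)))
    return params

def mlp(params, x):
    """Forward pass: ReLU on hidden layers, linear output layer."""
    for w, b in params[:-1]:
        x = jax.nn.relu(x @ w + b)
    w, b = params[-1]
    return x @ w + b
```

The same function can serve as both generator (noise in, 2D sample out) and discriminator (2D point in, scalar logit out) by choosing the layer sizes.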
The loss function for the discriminator is:

$$\mathcal{L}_D = -\mathbb{E}_{x \sim p_{\text{data}}}[\log D(x)] - \mathbb{E}_{z \sim p(z)}[\log(1 - D(G(z)))]$$

where $D(x) = \sigma(d(x))$ is the sigmoid of the discriminator output, as in the original GAN.
The loss function for the generator is:

$$\mathcal{L}_G = -\mathbb{E}_{z \sim p(z)}[\log D(G(z))]$$

where we use $-\log D(G(z))$ rather than $\log(1 - D(G(z)))$ for the non-saturating generator loss.
Perform a training step by first updating the discriminator parameters $\phi$ using the gradient $\nabla_\phi \mathcal{L}_D$ and then updating the generator parameters $\theta$ using the gradient $\nabla_\theta \mathcal{L}_G$.
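The losses and the alternating update can be sketched as below, assuming the discriminator returns a raw logit $d(x)$ and plain SGD updates (the original code uses its own optimizer setup; function names and the learning rate here are illustrative). Note that $\log(1 - \sigma(\ell)) = \log \sigma(-\ell)$, which gives a numerically stable form:

```python
import jax
import jax.numpy as jnp

def disc_loss(d_params, g_params, x_real, z, disc, gen):
    """-E[log D(x)] - E[log(1 - D(G(z)))], with D(x) = sigmoid(d(x))."""
    real_logits = disc(d_params, x_real)
    fake_logits = disc(d_params, gen(g_params, z))
    return -(jnp.mean(jax.nn.log_sigmoid(real_logits))
             + jnp.mean(jax.nn.log_sigmoid(-fake_logits)))

def gen_loss(g_params, d_params, z, disc, gen):
    """Non-saturating generator loss: -E[log D(G(z))]."""
    fake_logits = disc(d_params, gen(g_params, z))
    return -jnp.mean(jax.nn.log_sigmoid(fake_logits))

def train_step(d_params, g_params, x_real, z, disc, gen, lr=0.05):
    """One alternating SGD step: discriminator first, then generator."""
    d_grads = jax.grad(disc_loss)(d_params, g_params, x_real, z, disc, gen)
    d_params = jax.tree_util.tree_map(lambda p, g: p - lr * g,
                                      d_params, d_grads)
    g_grads = jax.grad(gen_loss)(g_params, d_params, z, disc, gen)
    g_params = jax.tree_util.tree_map(lambda p, g: p - lr * g,
                                      g_params, g_grads)
    return d_params, g_params
```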
Plot Results
Plot the data and the examples generated by the generator.