Path: blob/master/notebooks/book2/26/gan_jax_celebA_demo.ipynb
AUTHOR OF THE NOTEBOOK: Susnato Dhar (https://github.com/susnato)
ACKNOWLEDGEMENTS AND MENTIONS
- A full explanation of GANs can be found in the book Probabilistic Machine Learning: Advanced Topics by Kevin P. Murphy.
- I would like to thank Valentin Goldité for creating this awesome repository implementing GANs in JAX.
- I would like to thank the creators of the CelebA dataset.
INSTALL REQUIREMENTS
CLONE THE REPOSITORY
GAN JAX - A toy project to generate images from GANs with JAX
This project aims to bring the power of JAX, a Python framework developed by Google and DeepMind, to training Generative Adversarial Networks for image generation.
- All credit for implementing DCGAN and ProGAN in JAX goes to the author of the repository.
- I took some of the examples from the repository (e.g., the pretrained CIFAR model) and converted them into a self-contained Colab, as described by Kevin P. Murphy in this thread.
IMPORTS AND CONFIGS
DATASET
We are using the CelebA dataset.
To use it, we either have to download it from the official website, or, to get the cropped version hosted on Kaggle, we need a Kaggle account and a Kaggle API token.
Since I was facing issues downloading from the official Google Drive link, and creating a Kaggle account and setting up the token and Kaggle API may not be convenient for every user, I uploaded the dataset to Dropbox and made it public so that everyone can use it without any setup. If you face any issues with this link, you can email me.
Let's download the files from the Dropbox link and extract the images.
DOWNLOAD THE DATASET
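The download step can be sketched as follows. Note that the Dropbox URL in the usage comment is a placeholder, not the actual public link used by this notebook:

```python
import os
import zipfile
import urllib.request

def download_and_extract(url, archive_path, out_dir):
    """Download a zip archive (skipped if already present) and extract it."""
    if not os.path.exists(archive_path):
        urllib.request.urlretrieve(url, archive_path)
    os.makedirs(out_dir, exist_ok=True)
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(out_dir)
    return sorted(os.listdir(out_dir))

# Example (placeholder URL -- substitute the real public Dropbox link):
# files = download_and_extract("https://www.dropbox.com/s/<id>/celeba.zip?dl=1",
#                              "celeba.zip", "data/celeba")
```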
PREPARE THE DATASET
The code in the main repository works without errors, but it shuffles the images after creating the dataset with tf.data.Dataset, which takes some time, so I commented that line out. Feel free to use load_images from utils.data instead.
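As a minimal sketch of the idea (plain NumPy rather than tf.data, and not the repository's actual load_images), making shuffling optional looks like this:

```python
import numpy as np

def make_batches(images, batch_size, shuffle=False, seed=0):
    """Yield fixed-size batches; shuffling is optional since it slows startup."""
    idx = np.arange(len(images))
    if shuffle:
        np.random.default_rng(seed).shuffle(idx)
    # Drop the last partial batch, as is common for GAN training.
    for start in range(0, len(idx) - batch_size + 1, batch_size):
        yield images[idx[start:start + batch_size]]
```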
Let's look at a few images from the dataset to make sure they are OK.
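One convenient way to inspect a batch is to tile it into a single grid image; this helper is an illustration, not part of the repository:

```python
import numpy as np

def image_grid(images, rows, cols):
    """Tile the first rows*cols HxWxC images into one big grid image."""
    n, h, w, c = images.shape
    assert n >= rows * cols, "not enough images for the requested grid"
    grid = images[:rows * cols].reshape(rows, cols, h, w, c)
    # Interleave rows and columns so each image keeps its spatial layout.
    grid = grid.transpose(0, 2, 1, 3, 4).reshape(rows * h, cols * w, c)
    return grid

# plt.imshow(image_grid(batch, 2, 4)) would then show 8 images at once.
```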
MODEL
We are using a ProGAN (Progressive Growing of GANs) model.
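Progressive growing means the generator and discriminator start at a tiny resolution and double it stage by stage until the target resolution is reached; the start and target values below are illustrative, not the repository's exact settings:

```python
def progan_resolutions(start=4, final=64):
    """Resolutions visited during progressive growing (doubled each stage)."""
    res = [start]
    while res[-1] < final:
        res.append(res[-1] * 2)
    return res

# progan_resolutions(4, 64) -> [4, 8, 16, 32, 64]
```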
MODEL TRAINING AND TESTING
Let's first get some initial results quickly using the pre-trained model; then we will move on to training.
MODEL TESTING
TEST THE PRE-TRAINED MODEL
We are very thankful to the author of the repository for providing pretrained DCGAN models on the CIFAR-10, CelebA, and MNIST datasets.
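Testing a pretrained generator boils down to drawing latent noise and mapping it to images. This NumPy sketch abstracts over the actual JAX model; `generator` is a stand-in for the repository's apply function with its loaded pretrained parameters:

```python
import numpy as np

def sample_images(generator, latent_dim, n, seed=0):
    """Draw n latent vectors from N(0, 1) and decode them into images."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n, latent_dim)).astype(np.float32)
    return generator(z)

# With a real model, `generator` would wrap the pretrained parameters,
# e.g. lambda z: apply_fn(params, z) in the JAX setting.
```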
MODEL TRAINING
To train the model, first define the trainer, then define the config, and finally start training with trainer.main.
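The trainer/config/run pattern can be sketched like this; the field names and the Trainer object below are illustrative, not the repository's exact interface:

```python
from dataclasses import dataclass

@dataclass
class TrainConfig:
    # Illustrative hyperparameters; adjust to match the repository's config.
    epochs: int = 10          # Colab-friendly; increase on Kaggle or locally
    batch_size: int = 64
    latent_dim: int = 128
    learning_rate: float = 2e-4

config = TrainConfig()
# trainer = Trainer(model, config)   # hypothetical trainer object
# trainer.main()                     # starts the training loop
```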
At first the predictions are only noise; later they start to make sense. Since Colab does not allow us to train the model for a very long time, we train for only 10 epochs.
If you want to train the model further, you will need a GPU on your local machine, or the training time will be enormous. If you don't have a GPU, you can use Kaggle, which gives you almost 35 hours of free GPU runtime per week (each notebook can run for up to 10 hours)!
- Download this notebook.
- Create a notebook on Kaggle and change the accelerator to GPU.
- Upload this notebook to Kaggle and run it there; don't forget to increase EPOCHS.
CONCLUSION
- We see that the outputs generated with the pretrained model are pretty good but still distorted. To overcome this, we can train for hundreds of epochs; since the dataset is very large, no data-related problems should arise.
- If you have made it to the end of the notebook without any errors, congrats, and thanks for reading!
- If you have any suggestions for improving the notebook, please write them in the comments.
REFERENCES