Types of GANs
| GAN Type | Key Idea | Objective / Loss Formula | Strengths | Limitations | Typical Use Cases |
|---|---|---|---|---|---|
| Vanilla GAN | Adversarial game between Generator (G) and Discriminator (D) | $\min_G \max_D V(D,G) = \mathbb{E}_{x \sim p_{data}}[\log D(x)] + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))]$ | Simple, foundational, easy to implement | Training instability, mode collapse, vanishing gradients | Academic learning, toy datasets (MNIST), GAN fundamentals |
| DCGAN | CNN-based GAN with architectural constraints | Same as Vanilla GAN (cross-entropy loss) | Stable training, good image quality, scalable to larger images | Still sensitive to hyperparameters | Image generation (faces, objects), representation learning |
| Conditional GAN (cGAN) | GAN conditioned on labels or attributes | $\min_G \max_D \mathbb{E}[\log D(x \mid y)] + \mathbb{E}[\log(1 - D(G(z \mid y)))]$ | Controlled generation, class-specific outputs | Requires labeled data | Image-to-image translation, class-specific synthesis |
| WGAN | Uses Wasserstein (Earth-Mover) distance | $\min_G \max_{D \in \mathcal{D}} \mathbb{E}[D(x)] - \mathbb{E}[D(G(z))]$ | Stable gradients, reduced mode collapse | Weight clipping harms capacity | High-quality image synthesis, stable GAN training |
| WGAN-GP | Gradient penalty instead of weight clipping | $\mathbb{E}[D(G(z))] - \mathbb{E}[D(x)] + \lambda\,\mathbb{E}[(\lVert \nabla_{\hat{x}} D(\hat{x}) \rVert_2 - 1)^2]$ | Very stable, best convergence properties | Higher computation cost | Production-grade image generation, medical & satellite imagery |

In the WGAN row, $\mathcal{D}$ denotes the set of 1-Lipschitz functions; in the WGAN-GP row, $\hat{x}$ is sampled uniformly along straight lines between real and generated samples.
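To make the objective column concrete, here is a minimal PyTorch sketch of the two core losses from the table: the Vanilla GAN cross-entropy objective and the Wasserstein critic objective. The tensor names (`d_real`, `d_fake`) are assumptions for illustration, meant as raw discriminator or critic outputs on real and generated batches.

```python
# The table's two core objectives as PyTorch expressions. Tensor names
# (d_real, d_fake) are illustrative assumptions: raw discriminator/critic
# outputs (logits) on real and generated batches.
import torch
import torch.nn.functional as F

def vanilla_gan_d_loss(d_real, d_fake):
    # Minimize -E[log D(x)] - E[log(1 - D(G(z)))], with d_* as logits
    ones, zeros = torch.ones_like(d_real), torch.zeros_like(d_fake)
    return (F.binary_cross_entropy_with_logits(d_real, ones)
            + F.binary_cross_entropy_with_logits(d_fake, zeros))

def wgan_critic_loss(d_real, d_fake):
    # Critic maximizes E[D(x)] - E[D(G(z))]; we minimize the negation
    return d_fake.mean() - d_real.mean()
```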
GANs with Full Training Pipelines and Visualization
This notebook provides:
- A conceptual explanation of GANs
- Five common GAN variants
- End-to-end training loops
- Image visualization outputs

Dataset used: MNIST (simple, fast, and standard for GAN demos)
Common Imports and Dataset
In [ ]:
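A minimal sketch of the common setup, assuming PyTorch and torchvision are installed; the hyperparameter values (`latent_dim = 100`, `batch_size = 128`) are illustrative choices, not fixed requirements.

```python
# Minimal sketch of a typical imports + MNIST setup cell.
# Hyperparameter values here are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
latent_dim = 100   # assumed size of the noise vector z
batch_size = 128   # assumed batch size

# MNIST scaled to [-1, 1] so a tanh generator output matches the data range
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.5,), (0.5,)),
])
dataset = datasets.MNIST(root="./data", train=True, download=True,
                         transform=transform)
loader = DataLoader(dataset, batch_size=batch_size, shuffle=True,
                    drop_last=True)
```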
Helper: Image Visualization
In [ ]:
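A minimal visualization helper sketch using matplotlib and torchvision's `make_grid`; the function name `show_images` and the grid size are illustrative choices.

```python
# Sketch of an image-grid helper; the name `show_images` is an assumption.
import matplotlib.pyplot as plt
from torchvision.utils import make_grid

def show_images(images, n=16, title=None):
    """Display up to n image tensors (values roughly in [-1, 1]) as a grid."""
    grid = make_grid(images[:n].detach().cpu(), nrow=4, normalize=True)
    plt.figure(figsize=(4, 4))
    plt.imshow(grid.permute(1, 2, 0).numpy())
    if title:
        plt.title(title)
    plt.axis("off")
    plt.show()
```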
1. Vanilla GAN – Full Training Loop
In [ ]:
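A minimal end-to-end Vanilla GAN sketch on MNIST, assuming the `loader`, `device`, `latent_dim`, and `show_images` definitions from the setup sketches above; the MLP layer sizes and epoch count are illustrative.

```python
# Vanilla GAN sketch: MLP generator and discriminator on flattened MNIST.
# Assumes `loader`, `device`, `latent_dim`, `nn`, `torch`, and `show_images`
# from the setup cells above. Layer sizes and epochs are assumptions.
G = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 28 * 28), nn.Tanh(),
).to(device)

D = nn.Sequential(
    nn.Linear(28 * 28, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),  # raw logit; BCEWithLogitsLoss applies the sigmoid
).to(device)

criterion = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))

epochs = 5  # assumed; raise for better samples
for epoch in range(epochs):
    for real, _ in loader:
        real = real.view(real.size(0), -1).to(device)
        ones = torch.ones(real.size(0), 1, device=device)
        zeros = torch.zeros(real.size(0), 1, device=device)

        # Discriminator step: push real -> 1, fake -> 0
        z = torch.randn(real.size(0), latent_dim, device=device)
        fake = G(z)
        d_loss = criterion(D(real), ones) + criterion(D(fake.detach()), zeros)
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()

        # Generator step: fool D into predicting 1 on fakes
        g_loss = criterion(D(fake), ones)
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()

    print(f"epoch {epoch}: d_loss={d_loss.item():.3f}, g_loss={g_loss.item():.3f}")
    show_images(G(torch.randn(16, latent_dim, device=device)).view(-1, 1, 28, 28))
```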
2. DCGAN – Architecture & Training Loop
In [ ]:
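A DCGAN sketch for 28×28 MNIST following the usual architectural constraints (strided convolutions instead of pooling, BatchNorm in hidden layers, ReLU/LeakyReLU activations), assuming `latent_dim` and `device` from the setup above. The filter counts are illustrative assumptions.

```python
# DCGAN-style generator/discriminator for 28x28 MNIST.
# Assumes `latent_dim`, `device`, and `nn` from the setup cell above.
dcgan_G = nn.Sequential(
    # z: (N, latent_dim, 1, 1) -> (N, 1, 28, 28)
    nn.ConvTranspose2d(latent_dim, 128, 7, 1, 0), nn.BatchNorm2d(128), nn.ReLU(),
    nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(),
    nn.ConvTranspose2d(64, 1, 4, 2, 1), nn.Tanh(),
).to(device)

dcgan_D = nn.Sequential(
    # (N, 1, 28, 28) -> one logit per image
    nn.Conv2d(1, 64, 4, 2, 1), nn.LeakyReLU(0.2),
    nn.Conv2d(64, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.LeakyReLU(0.2),
    nn.Conv2d(128, 1, 7, 1, 0), nn.Flatten(),
).to(device)
```

The training loop is the same as the Vanilla GAN loop above, except images stay as `(N, 1, 28, 28)` tensors (no flattening) and the noise must be shaped `(N, latent_dim, 1, 1)`, e.g. `torch.randn(batch_size, latent_dim, 1, 1, device=device)`.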
Note on Other GAN Types
To keep runtime reasonable, this notebook does not include full training loops for cGAN, WGAN, and WGAN-GP. They follow the same training pattern as above, with:
- Modified loss functions
- Label conditioning (cGAN)
- A critic trained with the Wasserstein loss (WGAN)
- A gradient penalty in place of weight clipping (WGAN-GP)

These patterns are covered conceptually in the table above and can be built by extending the training-loop template; a sketch of the key modifications follows below.
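As a sketch of those extensions, assuming the setup cells above: the cGAN modification concatenates a label encoding onto the inputs of both networks, and the WGAN-GP critic loss replaces cross-entropy with the Wasserstein estimate plus a gradient penalty. The names `condition` and `critic_loss_gp` are illustrative, not from the notebook.

```python
# Sketches of the key modifications; function names are assumptions.
import torch
import torch.nn.functional as F

# cGAN: condition both networks by concatenating a one-hot label encoding.
# Feed condition(z, y) to G and condition(flat_image, y) to D.
def condition(x, labels, num_classes=10):
    return torch.cat([x, F.one_hot(labels, num_classes).float()], dim=1)

# WGAN-GP: critic loss with a gradient penalty on random interpolates
# between real and fake batches (assumed flattened to (N, 784) here).
def critic_loss_gp(critic, real, fake, lam=10.0):
    alpha = torch.rand(real.size(0), 1, device=real.device)
    x_hat = (alpha * real + (1 - alpha) * fake.detach()).requires_grad_(True)
    grads = torch.autograd.grad(critic(x_hat).sum(), x_hat,
                                create_graph=True)[0]
    gp = ((grads.norm(2, dim=1) - 1) ** 2).mean()
    # E[D(G(z))] - E[D(x)] + lambda * E[(||grad||_2 - 1)^2]
    return critic(fake.detach()).mean() - critic(real).mean() + lam * gp

# Generator step for both WGAN variants: g_loss = -critic(fake).mean()
```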
Summary
| GAN Type | Training Stability | Conditional Control | Typical Use Case |
|---|---|---|---|
| Vanilla GAN | Low | No | Learning |
| DCGAN | Medium | No | Image generation |
| cGAN | Medium | Yes | Conditional synthesis |
| WGAN | High | No | Stable training |
| WGAN-GP | Very High | No | Production systems |
Architect Takeaway
- Vanilla GAN → teaching & demos
- DCGAN → standard image GAN
- WGAN-GP → enterprise-grade stability