1. Activation Functions
Activation functions decide how neurons respond to their input.
Without an activation → the network is just stacked linear math (it collapses to a single linear map)
With an activation → the network can learn complex non-linear patterns (see the sketch below)
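A minimal sketch of the first point, assuming PyTorch (the notebook does not specify a framework): two `Linear` layers with no activation in between fold into a single linear map, so depth alone adds no expressive power.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(4, 3)                                        # a small batch of inputs

stacked = nn.Sequential(nn.Linear(3, 8), nn.Linear(8, 2))    # two layers, no activation between them
W1, b1 = stacked[0].weight, stacked[0].bias
W2, b2 = stacked[1].weight, stacked[1].bias

with torch.no_grad():
    # Fold the two layers into one: W2(W1 x + b1) + b2 = (W2 W1) x + (W2 b1 + b2)
    W, b = W2 @ W1, W2 @ b1 + b2
    print(torch.allclose(stacked(x), x @ W.T + b, atol=1e-6))  # True: the stack is still one linear map
```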
Common activations used in GANs:
ReLU
LeakyReLU
Sigmoid
Tanh

| Activation Function | Mathematical Formula | Output Range | Small Curve (Indicative Shape) |
| ------------------- | -------------------- | ------------ | ------------------------------ |
| Sigmoid | $\sigma(x) = \frac{1}{1 + e^{-x}}$ | (0, 1) | ▁▂▃▄▅▆▇█ |
| Tanh | $\tanh(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}}$ | (-1, 1) | ▁▂▄▆█▆▄▂ |
| ReLU | $f(x) = \max(0, x)$ | [0, ∞) | ▁▁▁▁▂▄▆█ |
| Leaky ReLU | $f(x) = \max(\alpha x, x),\ \alpha \approx 0.01$ | (-∞, ∞) | ▁▂▃▁▂▄▆█ |
| ELU | $f(x) = x$ if $x > 0$; $\alpha(e^x - 1)$ if $x \le 0$ | (-α, ∞) | ▁▂▃▄▅▆▇█ |
| Softplus | $f(x) = \ln(1 + e^x)$ | (0, ∞) | ▁▂▃▄▅▆▇█ |
| Softmax | $\text{softmax}(z_i) = \frac{e^{z_i}}{\sum_j e^{z_j}}$ | (0, 1), sum = 1 | ▂▄▆█▆▄▂ |
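A quick sketch (PyTorch assumed, as above) that evaluates the activations from the table on a few sample inputs, so the output ranges can be checked directly.

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-3.0, -1.0, 0.0, 1.0, 3.0])

print("sigmoid :", torch.sigmoid(x))        # values in (0, 1)
print("tanh    :", torch.tanh(x))           # values in (-1, 1)
print("relu    :", F.relu(x))               # values in [0, inf)
print("leaky   :", F.leaky_relu(x, 0.01))   # negative side scaled by alpha = 0.01
print("elu     :", F.elu(x, alpha=1.0))     # values in (-1, inf) for alpha = 1
print("softplus:", F.softplus(x))           # smooth approximation of ReLU, values in (0, inf)
print("softmax :", F.softmax(x, dim=0))     # positive values that sum to 1
```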
Key Takeaway
Generator mostly uses ReLU in its hidden layers and Tanh at the output
Discriminator mostly uses LeakyReLU in its hidden layers and Sigmoid at the output
2. Feedforward Neural Network → GAN Discriminator Mapping
A GAN Discriminator is simply a binary classification neural network: it takes a sample and outputs the probability that the sample is real (see the sketch after the table).
| Component | Feedforward NN | GAN Discriminator |
|---|---|---|
| Input | Features | Image / Data sample |
| Hidden Layers | Dense layers | Dense / CNN layers |
| Output | Numeric value | Probability (Real / Fake) |
| Activation | ReLU | LeakyReLU |
| Final Layer | Linear | Sigmoid |
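A minimal sketch of this mapping, assuming PyTorch; the layer sizes (784 → 256 → 128 → 1) are illustrative choices, not prescribed by this notebook. Hidden layers use LeakyReLU and the final layer uses Sigmoid, matching the table.

```python
import torch
import torch.nn as nn

class Discriminator(nn.Module):
    def __init__(self, in_features: int = 784):   # e.g. a flattened 28x28 image
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, 256),
            nn.LeakyReLU(0.2),
            nn.Linear(256, 128),
            nn.LeakyReLU(0.2),
            nn.Linear(128, 1),
            nn.Sigmoid(),                          # probability that the input is real
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

D = Discriminator()
fake_batch = torch.randn(16, 784)                  # stand-in "images"
print(D(fake_batch).shape)                         # torch.Size([16, 1]) of real/fake probabilities
```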
3. Day-2 GAN Starter (Minimal Working GAN)
Goal: Learn a simple one-dimensional numeric distribution centered around the value 5.
This example shows the Generator and Discriminator interaction with minimal complexity.
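A minimal working sketch, assuming PyTorch; the network sizes, learning rates, and the real-data distribution N(5, 0.5) are illustrative choices, not fixed by this notebook. The Generator's output layer is linear here (no Tanh) because the target values around 5 are not bounded in [-1, 1].

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Generator: noise -> fake sample; Discriminator: sample -> P(real)
G = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.LeakyReLU(0.2), nn.Linear(16, 1), nn.Sigmoid())

loss_fn = nn.BCELoss()
opt_G = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_D = torch.optim.Adam(D.parameters(), lr=1e-3)

batch = 64
for step in range(3000):
    # --- train Discriminator: real samples ~ N(5, 0.5), fake samples from G ---
    real = 5.0 + 0.5 * torch.randn(batch, 1)
    fake = G(torch.randn(batch, 1)).detach()
    d_loss = loss_fn(D(real), torch.ones(batch, 1)) + loss_fn(D(fake), torch.zeros(batch, 1))
    opt_D.zero_grad()
    d_loss.backward()
    opt_D.step()

    # --- train Generator: try to make D label its samples as real ---
    fake = G(torch.randn(batch, 1))
    g_loss = loss_fn(D(fake), torch.ones(batch, 1))
    opt_G.zero_grad()
    g_loss.backward()
    opt_G.step()

    if step % 500 == 0:
        print(f"step {step}: generated mean = {fake.mean().item():.2f}")
```

If training behaves as intended, the printed generated mean should drift toward 5 as the steps progress.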