Path: blob/master/Applied Generative AI with GANS/1.4 Epoch Iteration and Batch.ipynb
Epoch vs Iteration in Neural Networks
Understanding the difference between epoch and iteration is crucial in machine learning model training.
1. Epoch
An epoch refers to one complete pass through the entire training dataset.
Suppose you have 10,000 data points and decide to train your model for 5 epochs. This means the model will see all 10,000 data points five times.
Increasing the number of epochs gives the model more opportunities to learn from the data, but too many epochs risks overfitting.
Example:
Dataset size = 10,000 samples
Batch size = 500
1 Epoch = The model processes all 10,000 samples once.
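The idea of "one epoch = every sample seen once" can be sketched in plain Python, using the illustrative numbers from the text (no actual model is trained here; the loop only tracks how many samples are visited):

```python
# One epoch = every sample visited exactly once, batch by batch.
dataset = list(range(10_000))   # stand-in for 10,000 training samples
batch_size = 500

seen = 0
for start in range(0, len(dataset), batch_size):
    batch = dataset[start:start + batch_size]
    seen += len(batch)          # a real model would train on `batch` here

print(seen)  # 10000 -> the full dataset was processed once, i.e. one epoch
```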
2. Iteration
An iteration refers to one update of the model's weights, i.e. one forward and backward pass over a single batch of data.
In the above example:
Dataset size = 10,000 samples
Batch size = 500
In one epoch, the model will require:

$$\text{Iterations per epoch} = \frac{\text{Total Samples}}{\text{Batch Size}} = \frac{10{,}000}{500} = 20$$
So, in one epoch, there will be 20 iterations.
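This calculation can be written as a small helper function. The function name is my own, and I use ceiling division as a hedge for the common case where the dataset size is not an exact multiple of the batch size (the source's 10,000 / 500 divides evenly, so the result matches):

```python
import math

def iterations_per_epoch(num_samples: int, batch_size: int) -> int:
    # Ceiling division: a final, smaller batch still counts as one
    # iteration when batch_size does not divide num_samples evenly.
    return math.ceil(num_samples / batch_size)

print(iterations_per_epoch(10_000, 500))  # 20
print(iterations_per_epoch(10_000, 300))  # 34 (the last batch has only 100 samples)
```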
Key Difference
| Aspect | Epoch | Iteration |
|---|---|---|
| Definition | One full pass through the entire dataset | One forward & backward pass over a single batch |
| Control | Controlled by specifying the number of epochs | Defined by batch size and dataset size |
| Example | 5 epochs = The model sees the full dataset 5 times | With batch size 500 and 10,000 data points, each epoch has 20 iterations |
Example Scenario
Suppose you train a model with the following settings:
Dataset size: 10,000 samples
Batch size: 500
Epochs: 3
Total Iterations = 3 (epochs) × 20 (iterations per epoch) = 60 iterations
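The scenario above maps directly onto the nested loops of a typical training routine: an outer loop over epochs and an inner loop over batches. This sketch only counts iterations; in real training, the commented line would be a forward pass, backward pass, and weight update:

```python
num_samples, batch_size, epochs = 10_000, 500, 3

total_iterations = 0
for epoch in range(epochs):                          # 3 full passes over the data
    for start in range(0, num_samples, batch_size):  # 20 batches per pass
        # one iteration: forward pass, backward pass, weight update on a batch
        total_iterations += 1

print(total_iterations)  # 60
```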
Quick Tip
More epochs = more opportunities to learn (but a higher risk of overfitting).
Smaller batch size = more iterations per epoch, giving more frequent but noisier weight updates (batch size usually needs tuning).