Kernel: Python 3 (ipykernel)
Machine Learning with PyTorch and Scikit-Learn
-- Code Examples
Package version checks
Add the folder to the Python path so that the check_packages.py script can be imported:
In [1]:
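The code for this cell is not included in this export. A minimal sketch of what it likely does, assuming check_packages.py sits one directory above the notebook (an assumption), is:

import sys
import os

# Assumption: check_packages.py lives in the parent directory of this notebook.
sys.path.insert(0, os.path.abspath('..'))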
Check recommended package versions:
In [3]:
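The cell's code is likewise omitted. A sketch that would produce a report like the one below, assuming check_packages.py exposes a check_packages(dict) helper (a hypothetical name) and using placeholder minimum versions, is:

from check_packages import check_packages  # hypothetical helper from check_packages.py

d = {
    'torch': '1.8.0',        # minimum versions below are placeholders
    'torchvision': '0.9.0',
    'numpy': '1.21.2',
    'matplotlib': '3.4.3',
}
check_packages(d)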
Out[3]:
[OK] torch 1.10.1
[OK] torchvision 0.11.2
[OK] numpy 1.21.5
[OK] matplotlib 3.5.1
Chapter 17 - Generative Adversarial Networks for Synthesizing New Data (Part 2/2)
Note that the optional watermark extension is a small IPython notebook plugin that I developed to make the code reproducible. You can just skip the following line(s).
In [2]:
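The cell itself is not shown in this export. A sketch using the watermark extension that would produce output similar to the one below (the exact flags are assumptions) is:

%load_ext watermark
%watermark -a "Sebastian Raschka, Yuxi (Hayden) Liu & Vahid Mirjalili" -u -d -p numpy,scipy,matplotlib,torch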
Out[2]:
Author: Sebastian Raschka, Yuxi (Hayden) Liu & Vahid Mirjalili
Last updated: 2021-12-27
numpy : 1.21.5
scipy : 1.7.3
matplotlib: 3.5.1
torch : 1.10.1
In [2]:
Improving the quality of synthesized images using a convolutional and Wasserstein GAN
Transposed convolution
In [3]:
Out[3]:
In [4]:
Out[4]:
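The two cells above are not reproduced here (in the book they illustrate how a transposed convolution upsamples a feature map). As a small standalone illustration, not part of the original notebook, the output size of nn.ConvTranspose2d follows (n - 1)*stride - 2*padding + kernel_size:

import torch
import torch.nn as nn

x = torch.randn(1, 16, 7, 7)                        # a 7x7 feature map with 16 channels
upsample = nn.ConvTranspose2d(16, 8, kernel_size=4,
                              stride=2, padding=1)   # (7-1)*2 - 2*1 + 4 = 14
print(upsample(x).shape)                             # torch.Size([1, 8, 14, 14])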
Batch normalization
In [5]:
Out[5]:
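The cell above is not reproduced here (it illustrates batch normalization). As a small illustration, not from the original notebook, nn.BatchNorm2d standardizes each channel over the batch and spatial dimensions and then applies a learnable scale and shift:

import torch
import torch.nn as nn

torch.manual_seed(1)
x = torch.randn(8, 3, 28, 28) * 5.0 + 2.0   # batch of 8 images, 3 channels, shifted and scaled
bn = nn.BatchNorm2d(num_features=3)
out = bn(x)

# Per channel, the normalized activations have roughly zero mean and unit variance:
print(out.mean(dim=(0, 2, 3)))   # ~0
print(out.std(dim=(0, 2, 3)))    # ~1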
Implementing the generator and discriminator
In [6]:
Out[6]:
In [7]:
Out[7]:
Setting up Google Colab
In [8]:
In [9]:
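The cell's code is not included in this export; a sketch that produces the output below (the device assignment is an assumption) is:

import torch

print(torch.__version__)
print("GPU Available:", torch.cuda.is_available())

# Assumption: select the GPU when available, otherwise fall back to the CPU.
device = torch.device('cuda:0') if torch.cuda.is_available() else torch.device('cpu')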
Out[9]:
1.10.0+cu113
GPU Available: True
In [10]:
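This cell's code is also omitted. Since the generator printed further below outputs 28x28 single-channel images, the training data is presumably MNIST. A sketch of a loading cell (the path, batch size, and Normalize parameters are assumptions; the Tanh output layer suggests scaling images to [-1, 1]) is:

import torchvision
from torchvision import transforms
from torch.utils.data import DataLoader

image_path = './'                                    # assumed download location
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=(0.5,), std=(0.5,)),   # map pixel values to [-1, 1]
])

mnist_dataset = torchvision.datasets.MNIST(root=image_path, train=True,
                                           transform=transform, download=True)
batch_size = 64                                      # assumed batch size
mnist_dl = DataLoader(mnist_dataset, batch_size=batch_size,
                      shuffle=True, drop_last=True)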
Train the DCGAN model
In [11]:
In [12]:
In [13]:
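The cell's code is not shown, but the architectures can be read off the printed modules below. A sketch consistent with that output (the helper name make_generator_network, the variable names, and the use of device from the Colab setup sketch are assumptions) is:

import torch.nn as nn

def make_generator_network(input_size=100, n_filters=32):
    # Upsamples a (input_size, 1, 1) noise vector to a 28x28 single-channel image.
    return nn.Sequential(
        nn.ConvTranspose2d(input_size, n_filters*4, 4, 1, 0, bias=False),
        nn.BatchNorm2d(n_filters*4),
        nn.LeakyReLU(0.2),
        nn.ConvTranspose2d(n_filters*4, n_filters*2, 3, 2, 1, bias=False),
        nn.BatchNorm2d(n_filters*2),
        nn.LeakyReLU(0.2),
        nn.ConvTranspose2d(n_filters*2, n_filters, 4, 2, 1, bias=False),
        nn.BatchNorm2d(n_filters),
        nn.LeakyReLU(0.2),
        nn.ConvTranspose2d(n_filters, 1, 4, 2, 1, bias=False),
        nn.Tanh(),
    )

class Discriminator(nn.Module):
    # Downsamples a 28x28 image to a single probability via strided convolutions.
    def __init__(self, n_filters=32):
        super().__init__()
        self.network = nn.Sequential(
            nn.Conv2d(1, n_filters, 4, 2, 1, bias=False),
            nn.LeakyReLU(0.2),
            nn.Conv2d(n_filters, n_filters*2, 4, 2, 1, bias=False),
            nn.BatchNorm2d(n_filters*2),
            nn.LeakyReLU(0.2),
            nn.Conv2d(n_filters*2, n_filters*4, 3, 2, 1, bias=False),
            nn.BatchNorm2d(n_filters*4),
            nn.LeakyReLU(0.2),
            nn.Conv2d(n_filters*4, 1, 4, 1, 0, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.network(x).view(-1, 1)

gen_model = make_generator_network().to(device)    # device from the Colab setup sketch (assumption)
print(gen_model)

disc_model = Discriminator().to(device)
print(disc_model)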
Out[13]:
Sequential(
  (0): ConvTranspose2d(100, 128, kernel_size=(4, 4), stride=(1, 1), bias=False)
  (1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (2): LeakyReLU(negative_slope=0.2)
  (3): ConvTranspose2d(128, 64, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
  (4): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (5): LeakyReLU(negative_slope=0.2)
  (6): ConvTranspose2d(64, 32, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
  (7): BatchNorm2d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (8): LeakyReLU(negative_slope=0.2)
  (9): ConvTranspose2d(32, 1, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
  (10): Tanh()
)
Discriminator(
  (network): Sequential(
    (0): Conv2d(1, 32, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (1): LeakyReLU(negative_slope=0.2)
    (2): Conv2d(32, 64, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (3): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (4): LeakyReLU(negative_slope=0.2)
    (5): Conv2d(64, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
    (6): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (7): LeakyReLU(negative_slope=0.2)
    (8): Conv2d(128, 1, kernel_size=(4, 4), stride=(1, 1), bias=False)
    (9): Sigmoid()
  )
)
In [14]:
In [15]:
In [16]:
In [17]:
In [18]:
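The training cells ([14]-[18]) are omitted from this export. A condensed sketch of a DCGAN training loop that prints per-epoch average generator/discriminator losses like the log below is given here; the optimizer settings, the noise sampling, and the exact loss averaging are assumptions, and gen_model, disc_model, mnist_dl, and device refer to the sketches above:

import torch
import torch.nn as nn

z_size = 100
loss_fn = nn.BCELoss()
g_optimizer = torch.optim.Adam(gen_model.parameters(), lr=0.0002)   # learning rates are assumptions
d_optimizer = torch.optim.Adam(disc_model.parameters(), lr=0.0002)

def create_noise(batch_size, z_size, mode_z='uniform'):
    # Uniform noise in [-1, 1] by default; standard normal otherwise.
    if mode_z == 'uniform':
        return torch.rand(batch_size, z_size, 1, 1) * 2 - 1
    return torch.randn(batch_size, z_size, 1, 1)

num_epochs = 100
for epoch in range(1, num_epochs + 1):
    g_losses, d_losses = [], []
    for x, _ in mnist_dl:
        x = x.to(device)
        batch_size = x.size(0)

        # --- Discriminator step: real images labeled 1, generated images labeled 0 ---
        d_optimizer.zero_grad()
        d_real = disc_model(x)
        d_loss_real = loss_fn(d_real, torch.ones_like(d_real))

        z = create_noise(batch_size, z_size).to(device)
        fake = gen_model(z)
        d_fake = disc_model(fake.detach())
        d_loss_fake = loss_fn(d_fake, torch.zeros_like(d_fake))

        d_loss = d_loss_real + d_loss_fake
        d_loss.backward()
        d_optimizer.step()

        # --- Generator step: try to make the discriminator output 1 on fakes ---
        g_optimizer.zero_grad()
        d_fake = disc_model(fake)
        g_loss = loss_fn(d_fake, torch.ones_like(d_fake))
        g_loss.backward()
        g_optimizer.step()

        g_losses.append(g_loss.item())
        d_losses.append(d_loss.item())

    avg_g = sum(g_losses) / len(g_losses)
    avg_d = sum(d_losses) / len(d_losses)
    print(f'Epoch {epoch:03d} | Avg Losses >> G/D {avg_g:.4f}/{avg_d:.4f}')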
Out[18]:
Epoch 001 | Avg Losses >> G/D 4.5246/0.1213
Epoch 002 | Avg Losses >> G/D 4.3550/0.1622
Epoch 003 | Avg Losses >> G/D 3.5067/0.2883
Epoch 004 | Avg Losses >> G/D 3.0862/0.3388
Epoch 005 | Avg Losses >> G/D 2.9113/0.3387
Epoch 006 | Avg Losses >> G/D 2.8452/0.3513
Epoch 007 | Avg Losses >> G/D 2.8693/0.3268
Epoch 008 | Avg Losses >> G/D 2.9476/0.3122
Epoch 009 | Avg Losses >> G/D 2.9538/0.3233
Epoch 010 | Avg Losses >> G/D 2.9754/0.3221
Epoch 011 | Avg Losses >> G/D 3.0405/0.2882
Epoch 012 | Avg Losses >> G/D 3.0717/0.2732
Epoch 013 | Avg Losses >> G/D 3.1362/0.2705
Epoch 014 | Avg Losses >> G/D 3.2441/0.2409
Epoch 015 | Avg Losses >> G/D 3.3397/0.2372
Epoch 016 | Avg Losses >> G/D 3.4194/0.2276
Epoch 017 | Avg Losses >> G/D 3.3906/0.2368
Epoch 018 | Avg Losses >> G/D 3.4867/0.2339
Epoch 019 | Avg Losses >> G/D 3.4793/0.2192
Epoch 020 | Avg Losses >> G/D 3.4953/0.2327
Epoch 021 | Avg Losses >> G/D 3.5337/0.2347
Epoch 022 | Avg Losses >> G/D 3.5944/0.1870
Epoch 023 | Avg Losses >> G/D 3.7159/0.1911
Epoch 024 | Avg Losses >> G/D 3.7075/0.2069
Epoch 025 | Avg Losses >> G/D 3.7487/0.2201
Epoch 026 | Avg Losses >> G/D 3.7958/0.1948
Epoch 027 | Avg Losses >> G/D 3.7065/0.2164
Epoch 028 | Avg Losses >> G/D 3.7537/0.2130
Epoch 029 | Avg Losses >> G/D 3.8666/0.1889
Epoch 030 | Avg Losses >> G/D 3.8544/0.1767
Epoch 031 | Avg Losses >> G/D 3.8471/0.1977
Epoch 032 | Avg Losses >> G/D 3.9974/0.1798
Epoch 033 | Avg Losses >> G/D 3.9846/0.1895
Epoch 034 | Avg Losses >> G/D 4.0168/0.1613
Epoch 035 | Avg Losses >> G/D 4.0456/0.1923
Epoch 036 | Avg Losses >> G/D 4.0748/0.1498
Epoch 037 | Avg Losses >> G/D 4.1228/0.1949
Epoch 038 | Avg Losses >> G/D 4.1116/0.1533
Epoch 039 | Avg Losses >> G/D 4.1364/0.1808
Epoch 040 | Avg Losses >> G/D 4.1411/0.1698
Epoch 041 | Avg Losses >> G/D 4.0890/0.1765
Epoch 042 | Avg Losses >> G/D 4.2081/0.1688
Epoch 043 | Avg Losses >> G/D 4.1475/0.1516
Epoch 044 | Avg Losses >> G/D 4.1919/0.1799
Epoch 045 | Avg Losses >> G/D 4.1874/0.1800
Epoch 046 | Avg Losses >> G/D 4.3176/0.1370
Epoch 047 | Avg Losses >> G/D 4.3111/0.1493
Epoch 048 | Avg Losses >> G/D 4.3623/0.1528
Epoch 049 | Avg Losses >> G/D 4.4283/0.1412
Epoch 050 | Avg Losses >> G/D 4.4012/0.1598
Epoch 051 | Avg Losses >> G/D 4.3176/0.1499
Epoch 052 | Avg Losses >> G/D 4.4296/0.1422
Epoch 053 | Avg Losses >> G/D 4.3227/0.1637
Epoch 054 | Avg Losses >> G/D 4.3717/0.1721
Epoch 055 | Avg Losses >> G/D 4.5143/0.1397
Epoch 056 | Avg Losses >> G/D 4.4612/0.1260
Epoch 057 | Avg Losses >> G/D 4.6134/0.1242
Epoch 058 | Avg Losses >> G/D 4.6475/0.1305
Epoch 059 | Avg Losses >> G/D 4.4129/0.1753
Epoch 060 | Avg Losses >> G/D 4.5356/0.1320
Epoch 061 | Avg Losses >> G/D 4.4809/0.1574
Epoch 062 | Avg Losses >> G/D 4.5545/0.1480
Epoch 063 | Avg Losses >> G/D 4.6331/0.1299
Epoch 064 | Avg Losses >> G/D 4.6304/0.1485
Epoch 065 | Avg Losses >> G/D 4.6701/0.1383
Epoch 066 | Avg Losses >> G/D 4.7063/0.1308
Epoch 067 | Avg Losses >> G/D 4.6042/0.1421
Epoch 068 | Avg Losses >> G/D 4.6230/0.1342
Epoch 069 | Avg Losses >> G/D 4.7576/0.1189
Epoch 070 | Avg Losses >> G/D 4.7010/0.1398
Epoch 071 | Avg Losses >> G/D 4.7855/0.1310
Epoch 072 | Avg Losses >> G/D 4.5200/0.1621
Epoch 073 | Avg Losses >> G/D 4.7661/0.1308
Epoch 074 | Avg Losses >> G/D 4.7582/0.1241
Epoch 075 | Avg Losses >> G/D 4.7590/0.1085
Epoch 076 | Avg Losses >> G/D 4.8761/0.1062
Epoch 077 | Avg Losses >> G/D 4.8262/0.1239
Epoch 078 | Avg Losses >> G/D 4.8433/0.1375
Epoch 079 | Avg Losses >> G/D 4.9040/0.1087
Epoch 080 | Avg Losses >> G/D 4.9725/0.0859
Epoch 081 | Avg Losses >> G/D 4.8008/0.1711
Epoch 082 | Avg Losses >> G/D 4.8954/0.1103
Epoch 083 | Avg Losses >> G/D 4.9791/0.1283
Epoch 084 | Avg Losses >> G/D 4.9569/0.1007
Epoch 085 | Avg Losses >> G/D 4.8485/0.1428
Epoch 086 | Avg Losses >> G/D 5.0601/0.0951
Epoch 087 | Avg Losses >> G/D 4.9004/0.1282
Epoch 088 | Avg Losses >> G/D 4.9365/0.1444
Epoch 089 | Avg Losses >> G/D 4.8809/0.1165
Epoch 090 | Avg Losses >> G/D 5.0286/0.1063
Epoch 091 | Avg Losses >> G/D 4.8509/0.1583
Epoch 092 | Avg Losses >> G/D 5.0228/0.1115
Epoch 093 | Avg Losses >> G/D 5.1098/0.0902
Epoch 094 | Avg Losses >> G/D 5.0027/0.1224
Epoch 095 | Avg Losses >> G/D 5.1134/0.1100
Epoch 096 | Avg Losses >> G/D 5.0539/0.1301
Epoch 097 | Avg Losses >> G/D 5.0903/0.1090
Epoch 098 | Avg Losses >> G/D 5.1078/0.0903
Epoch 099 | Avg Losses >> G/D 5.1667/0.0946
Epoch 100 | Avg Losses >> G/D 5.1004/0.1002
In [19]:
Out[19]:
Dissimilarity measures between two distributions
In [20]:
Out[20]:
In [21]:
Out[21]:
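The two cells above are not reproduced here (in the book they visualize how total variation, KL divergence, and the earth mover's, i.e. EM or Wasserstein, distance compare two distributions). As a small numerical aside, not part of the original notebook, SciPy can compute the 1-D EM distance between two samples directly:

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(1)
p = rng.normal(loc=0.0, scale=1.0, size=10_000)   # samples from P
q = rng.normal(loc=2.0, scale=1.0, size=10_000)   # samples from Q, shifted by 2

# For two distributions that differ only by a shift, the EM distance equals the shift (~2.0):
print(wasserstein_distance(p, q))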
Using EM distance in practice for GANs
Gradient penalty
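No code cell accompanies this heading in the export. As a sketch, not necessarily the book's exact implementation, the WGAN-GP gradient penalty evaluates the critic on random interpolations between real and generated images and pushes the gradient norm at those points toward 1:

import torch

def gradient_penalty(critic, real, fake, device, lambda_gp=10.0):
    # One random mixing coefficient per image in the batch.
    batch_size = real.size(0)
    alpha = torch.rand(batch_size, 1, 1, 1, device=device)
    interpolated = (alpha * real + (1 - alpha) * fake).requires_grad_(True)

    crit_out = critic(interpolated)
    grads = torch.autograd.grad(outputs=crit_out, inputs=interpolated,
                                grad_outputs=torch.ones_like(crit_out),
                                create_graph=True, retain_graph=True)[0]
    grad_norm = grads.view(batch_size, -1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1) ** 2).mean()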
Implementing WGAN-GP to train the DCGAN model
In [22]:
In [23]:
In [24]:
In [25]:
In [26]:
In [29]:
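The WGAN-GP training cells are likewise omitted. A condensed sketch of the critic/generator updates that would produce a per-epoch critic-loss log like the one below is given here; the number of critic iterations, the learning rates, and the model names gen_wgan/disc_wgan are assumptions (the critic is assumed to be the discriminator architecture without the final Sigmoid), gradient_penalty is the sketch above, and mnist_dl/device come from earlier cells:

import torch

g_optimizer = torch.optim.Adam(gen_wgan.parameters(), lr=0.0002)
d_optimizer = torch.optim.Adam(disc_wgan.parameters(), lr=0.0002)

critic_iterations = 5
z_size = 100
num_epochs = 100

for epoch in range(1, num_epochs + 1):
    d_losses = []
    for x, _ in mnist_dl:
        x = x.to(device)
        batch_size = x.size(0)

        # --- Critic: minimize D(fake) - D(real) + gradient penalty ---
        for _ in range(critic_iterations):
            d_optimizer.zero_grad()
            z = torch.randn(batch_size, z_size, 1, 1, device=device)
            fake = gen_wgan(z).detach()
            d_loss = (disc_wgan(fake).mean() - disc_wgan(x).mean()
                      + gradient_penalty(disc_wgan, x, fake, device))
            d_loss.backward()
            d_optimizer.step()
        d_losses.append(d_loss.item())

        # --- Generator: maximize the critic score of generated images ---
        g_optimizer.zero_grad()
        z = torch.randn(batch_size, z_size, 1, 1, device=device)
        g_loss = -disc_wgan(gen_wgan(z)).mean()
        g_loss.backward()
        g_optimizer.step()

    print(f'Epoch {epoch:03d} | D Loss >> {sum(d_losses)/len(d_losses):.4f}')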
Out[29]:
Epoch 001 | D Loss >> -0.5891
Epoch 002 | D Loss >> -0.6277
Epoch 003 | D Loss >> -0.6074
Epoch 004 | D Loss >> -0.6043
Epoch 005 | D Loss >> -0.5765
Epoch 006 | D Loss >> -0.5410
Epoch 007 | D Loss >> -0.5239
Epoch 008 | D Loss >> -0.4946
Epoch 009 | D Loss >> -0.4813
Epoch 010 | D Loss >> -0.4701
Epoch 011 | D Loss >> -0.4644
Epoch 012 | D Loss >> -0.4613
Epoch 013 | D Loss >> -0.4554
Epoch 014 | D Loss >> -0.4508
Epoch 015 | D Loss >> -0.4517
Epoch 016 | D Loss >> -0.4494
Epoch 017 | D Loss >> -0.4498
Epoch 018 | D Loss >> -0.4505
Epoch 019 | D Loss >> -0.4476
Epoch 020 | D Loss >> -0.4534
Epoch 021 | D Loss >> -0.4516
Epoch 022 | D Loss >> -0.4560
Epoch 023 | D Loss >> -0.4531
Epoch 024 | D Loss >> -0.4543
Epoch 025 | D Loss >> -0.4548
Epoch 026 | D Loss >> -0.4536
Epoch 027 | D Loss >> -0.4508
Epoch 028 | D Loss >> -0.4526
Epoch 029 | D Loss >> -0.4547
Epoch 030 | D Loss >> -0.4585
Epoch 031 | D Loss >> -0.4624
Epoch 032 | D Loss >> -0.4575
Epoch 033 | D Loss >> -0.4569
Epoch 034 | D Loss >> -0.4602
Epoch 035 | D Loss >> -0.4634
Epoch 036 | D Loss >> -0.4656
Epoch 037 | D Loss >> -0.4623
Epoch 038 | D Loss >> -0.4629
Epoch 039 | D Loss >> -0.4692
Epoch 040 | D Loss >> -0.4627
Epoch 041 | D Loss >> -0.4696
Epoch 042 | D Loss >> -0.4640
Epoch 043 | D Loss >> -0.4694
Epoch 044 | D Loss >> -0.4657
Epoch 045 | D Loss >> -0.4717
Epoch 046 | D Loss >> -0.4667
Epoch 047 | D Loss >> -0.4726
Epoch 048 | D Loss >> -0.4695
Epoch 049 | D Loss >> -0.4732
Epoch 050 | D Loss >> -0.4656
Epoch 051 | D Loss >> -0.4696
Epoch 052 | D Loss >> -0.4681
Epoch 053 | D Loss >> -0.4673
Epoch 054 | D Loss >> -0.4726
Epoch 055 | D Loss >> -0.4760
Epoch 056 | D Loss >> -0.4732
Epoch 057 | D Loss >> -0.4789
Epoch 058 | D Loss >> -0.4750
Epoch 059 | D Loss >> -0.4715
Epoch 060 | D Loss >> -0.4726
Epoch 061 | D Loss >> -0.4750
Epoch 062 | D Loss >> -0.4770
Epoch 063 | D Loss >> -0.4805
Epoch 064 | D Loss >> -0.4738
Epoch 065 | D Loss >> -0.4745
Epoch 066 | D Loss >> -0.4744
Epoch 067 | D Loss >> -0.4759
Epoch 068 | D Loss >> -0.4729
Epoch 069 | D Loss >> -0.4790
Epoch 070 | D Loss >> -0.4756
Epoch 071 | D Loss >> -0.4816
Epoch 072 | D Loss >> -0.4716
Epoch 073 | D Loss >> -0.4740
Epoch 074 | D Loss >> -0.4742
Epoch 075 | D Loss >> -0.4829
Epoch 076 | D Loss >> -0.4780
Epoch 077 | D Loss >> -0.4805
Epoch 078 | D Loss >> -0.4764
Epoch 079 | D Loss >> -0.4754
Epoch 080 | D Loss >> -0.4743
Epoch 081 | D Loss >> -0.4777
Epoch 082 | D Loss >> -0.4824
Epoch 083 | D Loss >> -0.4813
Epoch 084 | D Loss >> -0.4811
Epoch 085 | D Loss >> -0.4794
Epoch 086 | D Loss >> -0.4782
Epoch 087 | D Loss >> -0.4737
Epoch 088 | D Loss >> -0.4800
Epoch 089 | D Loss >> -0.4820
Epoch 090 | D Loss >> -0.4818
Epoch 091 | D Loss >> -0.4793
Epoch 092 | D Loss >> -0.4786
Epoch 093 | D Loss >> -0.4795
Epoch 094 | D Loss >> -0.4800
Epoch 095 | D Loss >> -0.4808
Epoch 096 | D Loss >> -0.4787
Epoch 097 | D Loss >> -0.4825
Epoch 098 | D Loss >> -0.4768
Epoch 099 | D Loss >> -0.4849
Epoch 100 | D Loss >> -0.4794
In [30]:
Out[30]:
Mode collapse
In [31]:
Out[31]:
Readers may ignore the next cell.
In [4]:
Out[4]:
[NbConvertApp] WARNING | Config option `kernel_spec_manager_class` not recognized by `NbConvertApp`.
[NbConvertApp] Converting notebook ch17_part2.ipynb to script
[NbConvertApp] Writing 13766 bytes to ch17_part2.py