Path: blob/master/CNN/lab-10-6-mnist_batchnorm.ipynb
Kernel: Python 3
In [1]:
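Only the cell outputs survive in this export; the code in each cell was stripped. The original framework is not visible (the `bn_`/`nn_` naming appears in several versions of this lab), so the sketches below reconstruct a comparable experiment in PyTorch — every name, layer size, and value not present in the log is an assumption. A typical first cell imports the libraries and fixes the random seed:

```python
# Hypothetical reconstruction: the original cell's code is not in the export.
import torch
import torch.nn as nn

# Fix the seed so the batchnorm / no-batchnorm comparison is repeatable.
torch.manual_seed(777)

# Use the GPU when one is available.
device = "cuda" if torch.cuda.is_available() else "cpu"
```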
In [2]:
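A hyperparameter cell usually follows. The epoch count of 10 is visible in the log below; the learning rate and batch size here are assumptions:

```python
# Assumed values; only training_epochs = 10 is confirmed by the log output.
learning_rate = 0.01
training_epochs = 10
batch_size = 32
```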
In [3]:
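The data-loading cell presumably fetches MNIST (e.g. via `torchvision.datasets.MNIST`) and wraps it in train and validation loaders. To keep this sketch runnable offline, it fakes the same tensor shapes with random data — substitute the real dataset in practice:

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Synthetic stand-in for MNIST (1x28x28 images, labels 0-9) so the sketch
# runs without a download; the real notebook would load the actual dataset.
n_train, n_valid = 256, 64
train_x = torch.rand(n_train, 1, 28, 28)
train_y = torch.randint(0, 10, (n_train,))
valid_x = torch.rand(n_valid, 1, 28, 28)
valid_y = torch.randint(0, 10, (n_valid,))

train_loader = DataLoader(TensorDataset(train_x, train_y),
                          batch_size=32, shuffle=True)
valid_loader = DataLoader(TensorDataset(valid_x, valid_y), batch_size=32)
```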
In [4]:
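The log compares a batch-normalized network (the `bn_*` columns) against an otherwise identical plain network (`nn_*`). A plausible pair of models, with assumed layer sizes (784 → 32 → 32 → 10 is common in versions of this lab):

```python
import torch
import torch.nn as nn

# Network WITH batch normalization after each hidden linear layer.
bn_model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(784, 32), nn.BatchNorm1d(32), nn.ReLU(),
    nn.Linear(32, 32),  nn.BatchNorm1d(32), nn.ReLU(),
    nn.Linear(32, 10),
)

# Same architecture WITHOUT batch normalization, for the nn_* columns.
nn_model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(784, 32), nn.ReLU(),
    nn.Linear(32, 32),  nn.ReLU(),
    nn.Linear(32, 10),
)
```

`BatchNorm1d` normalizes each hidden unit over the batch dimension and then rescales with learned gain and bias, which is what lets the `bn_` model train with lower loss in the log below.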
In [5]:
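A loss/optimizer cell likely comes next: one criterion shared by both networks, and a separate optimizer per model so they train independently. Sketched here with a stand-in model so the cell runs on its own:

```python
import torch
import torch.nn as nn

# Stand-in model so this sketch is self-contained; the real notebook would
# pass bn_model and nn_model from the previous cell instead.
model = nn.Sequential(nn.Flatten(), nn.Linear(784, 10))

criterion = nn.CrossEntropyLoss()  # softmax + NLL over the 10 digit classes
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
```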
In [6]:
In [7]:
In [8]:
In [9]:
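The output below comes from the training loop. The essential point the lab demonstrates is that batch normalization behaves differently at train and eval time: `model.train()` normalizes with per-batch statistics, `model.eval()` with running averages, so the mode must be toggled before each phase. A self-contained sketch of such a loop over both models (synthetic data and a tiny architecture stand in for the real cell):

```python
import torch
import torch.nn as nn
from torch.utils.data import TensorDataset, DataLoader

torch.manual_seed(0)

# Synthetic stand-in for MNIST so the sketch runs offline.
X = torch.rand(256, 1, 28, 28)
y = torch.randint(0, 10, (256,))
train_loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)
valid_loader = DataLoader(TensorDataset(X[:64], y[:64]), batch_size=32)

def make_model(use_bn):
    layers = [nn.Flatten(), nn.Linear(784, 32)]
    if use_bn:
        layers.append(nn.BatchNorm1d(32))
    layers += [nn.ReLU(), nn.Linear(32, 10)]
    return nn.Sequential(*layers)

models = {"bn": make_model(True), "nn": make_model(False)}
optims = {k: torch.optim.Adam(m.parameters(), lr=0.01) for k, m in models.items()}
criterion = nn.CrossEntropyLoss()

def evaluate(model, loader):
    model.eval()                      # BatchNorm switches to running statistics
    loss = correct = total = 0
    with torch.no_grad():
        for xb, yb in loader:
            out = model(xb)
            loss += criterion(out, yb).item() * len(yb)
            correct += (out.argmax(1) == yb).sum().item()
            total += len(yb)
    return loss / total, correct / total

for epoch in range(1, 3):
    for name, model in models.items():
        model.train()                 # BatchNorm uses per-batch statistics
        for xb, yb in train_loader:
            optims[name].zero_grad()
            loss = criterion(model(xb), yb)
            loss.backward()
            optims[name].step()
    (bn_l, bn_a), (nn_l, nn_a) = (evaluate(models[k], valid_loader)
                                  for k in ("bn", "nn"))
    print(f"[Epoch {epoch}-VALID] bn_loss:{bn_l:.5f}(bn_acc:{bn_a:.2f}) "
          f"vs nn_loss:{nn_l:.5f}(nn_acc:{nn_a:.2f})")
```

The print format mirrors the log below; the actual losses there were produced by the original (lost) code, not this sketch.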
Out[9]:
[Epoch 1-TRAIN] Batchnorm Loss(Acc): bn_loss:0.15962(bn_acc:0.95) vs No Batchnorm Loss(Acc): nn_loss:0.18374(nn_acc:0.95)
[Epoch 1-VALID] Batchnorm Loss(Acc): bn_loss:0.17032(bn_acc:0.95) vs No Batchnorm Loss(Acc): nn_loss:0.18346(nn_acc:0.94)
[Epoch 2-TRAIN] Batchnorm Loss(Acc): bn_loss:0.11463(bn_acc:0.96) vs No Batchnorm Loss(Acc): nn_loss:0.15138(nn_acc:0.96)
[Epoch 2-VALID] Batchnorm Loss(Acc): bn_loss:0.13644(bn_acc:0.96) vs No Batchnorm Loss(Acc): nn_loss:0.18243(nn_acc:0.95)
[Epoch 3-TRAIN] Batchnorm Loss(Acc): bn_loss:0.10040(bn_acc:0.97) vs No Batchnorm Loss(Acc): nn_loss:0.14125(nn_acc:0.96)
[Epoch 3-VALID] Batchnorm Loss(Acc): bn_loss:0.12638(bn_acc:0.96) vs No Batchnorm Loss(Acc): nn_loss:0.16916(nn_acc:0.95)
[Epoch 4-TRAIN] Batchnorm Loss(Acc): bn_loss:0.08774(bn_acc:0.97) vs No Batchnorm Loss(Acc): nn_loss:0.18832(nn_acc:0.94)
[Epoch 4-VALID] Batchnorm Loss(Acc): bn_loss:0.12113(bn_acc:0.96) vs No Batchnorm Loss(Acc): nn_loss:0.22995(nn_acc:0.93)
[Epoch 5-TRAIN] Batchnorm Loss(Acc): bn_loss:0.08417(bn_acc:0.97) vs No Batchnorm Loss(Acc): nn_loss:0.12536(nn_acc:0.96)
[Epoch 5-VALID] Batchnorm Loss(Acc): bn_loss:0.11717(bn_acc:0.97) vs No Batchnorm Loss(Acc): nn_loss:0.17861(nn_acc:0.95)
[Epoch 6-TRAIN] Batchnorm Loss(Acc): bn_loss:0.07613(bn_acc:0.98) vs No Batchnorm Loss(Acc): nn_loss:0.13720(nn_acc:0.96)
[Epoch 6-VALID] Batchnorm Loss(Acc): bn_loss:0.10860(bn_acc:0.97) vs No Batchnorm Loss(Acc): nn_loss:0.18730(nn_acc:0.95)
[Epoch 7-TRAIN] Batchnorm Loss(Acc): bn_loss:0.07492(bn_acc:0.98) vs No Batchnorm Loss(Acc): nn_loss:0.12262(nn_acc:0.97)
[Epoch 7-VALID] Batchnorm Loss(Acc): bn_loss:0.11592(bn_acc:0.97) vs No Batchnorm Loss(Acc): nn_loss:0.17871(nn_acc:0.96)
[Epoch 8-TRAIN] Batchnorm Loss(Acc): bn_loss:0.06917(bn_acc:0.98) vs No Batchnorm Loss(Acc): nn_loss:0.12606(nn_acc:0.97)
[Epoch 8-VALID] Batchnorm Loss(Acc): bn_loss:0.10645(bn_acc:0.97) vs No Batchnorm Loss(Acc): nn_loss:0.17672(nn_acc:0.95)
[Epoch 9-TRAIN] Batchnorm Loss(Acc): bn_loss:0.06872(bn_acc:0.98) vs No Batchnorm Loss(Acc): nn_loss:0.10984(nn_acc:0.97)
[Epoch 9-VALID] Batchnorm Loss(Acc): bn_loss:0.11192(bn_acc:0.97) vs No Batchnorm Loss(Acc): nn_loss:0.16769(nn_acc:0.96)
[Epoch 10-TRAIN] Batchnorm Loss(Acc): bn_loss:0.05977(bn_acc:0.98) vs No Batchnorm Loss(Acc): nn_loss:0.09960(nn_acc:0.97)
[Epoch 10-VALID] Batchnorm Loss(Acc): bn_loss:0.11220(bn_acc:0.97) vs No Batchnorm Loss(Acc): nn_loss:0.16750(nn_acc:0.96)
Learning finished
In [10]:
In [11]:
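The final cells presumably plot the recorded losses (the figure itself is not preserved in this export). A sketch that re-enters the validation losses from the log above and plots them with matplotlib:

```python
# Hypothetical plotting cell; the loss values are copied from the log above.
import matplotlib
matplotlib.use("Agg")        # render off-screen
import matplotlib.pyplot as plt

epochs = range(1, 11)
bn_valid_loss = [0.17032, 0.13644, 0.12638, 0.12113, 0.11717,
                 0.10860, 0.11592, 0.10645, 0.11192, 0.11220]
nn_valid_loss = [0.18346, 0.18243, 0.16916, 0.22995, 0.17861,
                 0.18730, 0.17871, 0.17672, 0.16769, 0.16750]

plt.plot(epochs, bn_valid_loss, label="with batchnorm")
plt.plot(epochs, nn_valid_loss, label="without batchnorm")
plt.xlabel("epoch")
plt.ylabel("validation loss")
plt.legend()
plt.savefig("batchnorm_vs_plain.png")
```

The curves make the log's conclusion visible at a glance: the batch-normalized model reaches a consistently lower validation loss at every epoch.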
Out[11]: (figure output not preserved in this export)