Path: blob/master/notebooks/book1/14/layer_norm_jax.ipynb
Kernel: Python 3
Please find the PyTorch implementation of this notebook here: https://colab.research.google.com/github/probml/pyprobml/blob/master/notebooks/book1/14/layer_norm_torch.ipynb
In [2]:
In [3]:
Out[3]:
batch norm
[[-1. 1. 1.]
[ 1. -1. -1.]]
layer norm
[[ 0.47376014 -1.39085732 0.91709718]
[ 1.41421356 -0.70711669 -0.70709687]]
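The exact ±1 entries in the batch-norm output arise because each column holds only two values, and any two distinct values standardize to -1 and +1. A minimal NumPy sketch of the two normalizations, using a hypothetical 2x3 input (the notebook's actual data is not shown, so the layer-norm numbers below differ from the output above, while the ±1 batch-norm pattern is generic for a two-row batch):

```python
import numpy as np

# Hypothetical 2x3 input; the notebook's actual data is not reproduced here.
X = np.array([[1.0, 5.0, 6.0],
              [4.0, 2.0, 3.0]])

print("batch norm")
# Normalize each feature (column) across the batch dimension (axis 0).
print((X - X.mean(axis=0)) / X.std(axis=0))

print("layer norm")
# Normalize each example (row) across the feature dimension (axis 1).
print((X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True))
```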
In [4]:
Out[4]:
WARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)
batch norm
[[-0.99999815 0.99978346 0.99999744]
[ 0.99999815 -0.9997831 -0.9999975 ]]
layer norm
[[ 0.473758 -1.3908514 0.9170933 ]
[ 1.4142125 -0.70711625 -0.7070964 ]]
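The JAX batch-norm entries are only approximately ±1, which is consistent with dividing by sqrt(variance + eps) for a small eps rather than by the standard deviation itself. A minimal JAX sketch under that assumption, again with hypothetical input data:

```python
import jax.numpy as jnp

def normalize(X, axis, eps=1e-5):
    # Standardize along `axis`, dividing by sqrt(var + eps); the small eps
    # is what makes the batch-norm entries slightly smaller than 1 in magnitude.
    mean = X.mean(axis=axis, keepdims=True)
    var = X.var(axis=axis, keepdims=True)
    return (X - mean) / jnp.sqrt(var + eps)

# Hypothetical input, as before; the notebook's actual data is not shown.
X = jnp.array([[1.0, 5.0, 6.0],
               [4.0, 2.0, 3.0]])

print("batch norm")
print(normalize(X, axis=0))   # per-column normalization over the batch
print("layer norm")
print(normalize(X, axis=1))   # per-row normalization over the features
```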