Path: blob/master/notebooks/book1/15/positional_encoding_jax.ipynb
Kernel: Python 3
A PyTorch implementation of this notebook is available here: https://colab.research.google.com/github/probml/pyprobml/blob/master/notebooks/book1/15/positional_encoding_torch.ipynb
Author: Susnato Dhar (GitHub: https://github.com/susnato)
Positional encoding for transformers.
- We show how to implement positional encoding using JAX, based on sec. 10.6 of http://d2l.ai/chapter_attention-mechanisms/self-attention-and-positional-encoding.html.
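A minimal sketch of sinusoidal positional encoding in JAX, following the d2l.ai formulation referenced above (the function name `positional_encoding` and its arguments are illustrative, not taken verbatim from the notebook's cells):

```python
import jax.numpy as jnp

def positional_encoding(max_len, num_hiddens):
    """Sinusoidal positional encoding of shape (max_len, num_hiddens).

    P[i, 2j]   = sin(i / 10000^(2j / num_hiddens))
    P[i, 2j+1] = cos(i / 10000^(2j / num_hiddens))
    """
    # Column vector of positions i and row vector of rates 10000^(2j/d).
    i = jnp.arange(max_len).reshape(-1, 1)
    rates = jnp.power(10000.0, jnp.arange(0, num_hiddens, 2) / num_hiddens)
    angles = i / rates  # shape (max_len, num_hiddens // 2)
    # sin goes in the even columns, cos in the odd columns.
    P = jnp.zeros((max_len, num_hiddens))
    P = P.at[:, 0::2].set(jnp.sin(angles))
    P = P.at[:, 1::2].set(jnp.cos(angles))
    return P

# Example: encodings for a sequence of length 60 with 32-dimensional embeddings.
P = positional_encoding(60, 32)
print(P.shape)  # (60, 32)
```

Because each dimension pair oscillates at a different wavelength, the encoding for position i + k is a fixed linear function of the encoding for position i, which is what lets attention layers pick up on relative positions.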
WARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)