Path: blob/master/notebooks/book1/15/positional_encoding_torch.ipynb
Kernel: Python 3
Please find the JAX implementation of this notebook here: https://colab.research.google.com/github/probml/pyprobml/blob/master/notebooks/book1/15/positional_encoding_jax.ipynb
Positional encoding for transformers.
We show how to implement positional encoding, based on Sec. 10.6 of http://d2l.ai/chapter_attention-mechanisms/self-attention-and-positional-encoding.html. Since self-attention is permutation-invariant, position-dependent sinusoidal signals are added to the token embeddings so the model can make use of sequence order.
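For a sequence of length n and embedding dimension d, the encoding matrix P has entries p(i, 2j) = sin(i / 10000^(2j/d)) and p(i, 2j+1) = cos(i / 10000^(2j/d)), which are added to the token embeddings. The code cells below are a minimal sketch of this recipe in PyTorch, modeled on the d2l.ai implementation cited above; class names, variable names, and plot choices are assumptions, not verbatim notebook contents.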
In [2]:
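```python
# Setup cell: standard imports (assumed).
import torch
from torch import nn
import matplotlib.pyplot as plt
```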
In [3]:
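```python
# A minimal PositionalEncoding module following the d2l.ai recipe
# (a sketch; names and defaults are assumptions).
class PositionalEncoding(nn.Module):
    def __init__(self, num_hiddens, dropout, max_len=1000):
        super().__init__()
        self.dropout = nn.Dropout(dropout)
        # Precompute a long-enough encoding table P of shape (1, max_len, num_hiddens)
        self.P = torch.zeros((1, max_len, num_hiddens))
        # X[i, j] = i / 10000^(2j / num_hiddens)
        X = torch.arange(max_len, dtype=torch.float32).reshape(-1, 1) / torch.pow(
            10000,
            torch.arange(0, num_hiddens, 2, dtype=torch.float32) / num_hiddens)
        self.P[:, :, 0::2] = torch.sin(X)  # even columns: sine
        self.P[:, :, 1::2] = torch.cos(X)  # odd columns: cosine

    def forward(self, X):
        # Add the encodings for the first X.shape[1] positions to the embeddings
        X = X + self.P[:, :X.shape[1], :].to(X.device)
        return self.dropout(X)
```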
In [5]:
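```python
# Plot a few encoding dimensions as a function of position; the
# oscillation frequency decreases with column index
# (a sketch; d2l.ai plots columns 6-9 of a 32-dim encoding).
encoding_dim, num_steps = 32, 60
pos_encoding = PositionalEncoding(encoding_dim, dropout=0)
pos_encoding.eval()
X = pos_encoding(torch.zeros((1, num_steps, encoding_dim)))
P = pos_encoding.P[:, :X.shape[1], :]
for col in range(6, 10):
    plt.plot(torch.arange(num_steps), P[0, :, col], label=f"col {col}")
plt.xlabel("row (position)")
plt.legend()
plt.show()
```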
Out[5]: [matplotlib figure]
In [6]:
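```python
# Heatmap of the full encoding table: x-axis is position, y-axis is
# encoding dimension (a sketch; d2l.ai renders this with show_heatmaps).
plt.pcolormesh(P[0].numpy().T, cmap="viridis")
plt.xlabel("row (position)")
plt.ylabel("column (encoding dimension)")
plt.colorbar()
plt.show()
```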
Out[6]: [matplotlib figure]