Name | Size
---|---
README.md | 8.2 KB
attention_jax.ipynb | 56.9 KB
attention_torch.ipynb | 56.7 KB
bert_jax.ipynb | 149.8 KB
bert_torch.ipynb | 136.5 KB
cnn1d_sentiment_jax.ipynb | 71.1 KB
cnn1d_sentiment_torch.ipynb | 69.3 KB
entailment_attention_mlp_jax.ipynb | 88.3 KB
entailment_attention_mlp_torch.ipynb | 84.8 KB
gru_jax.ipynb | 99.6 KB
gru_torch.ipynb | 91.6 KB
kernel_regression_attention.ipynb | 162.4 KB
lstm_jax.ipynb | 98.8 KB
lstm_torch.ipynb | 91.2 KB
multi_head_attention_jax.ipynb | 9.7 KB
multi_head_attention_torch.ipynb | 10.7 KB
nmt_attention_jax.ipynb | 98.7 KB
nmt_attention_torch.ipynb | 93 KB
nmt_jax.ipynb | 73.3 KB
nmt_torch.ipynb | 66.2 KB
positional_encoding_jax.ipynb | 104 KB
positional_encoding_torch.ipynb | 99.2 KB
rnn_jax.ipynb | 111.2 KB
rnn_sentiment_jax.ipynb | 96.2 KB
rnn_sentiment_torch.ipynb | 93.4 KB
rnn_torch.ipynb | 109.4 KB
transformers_jax.ipynb | 317.2 KB
transformers_torch.ipynb | 297.5 KB