Path: tree/master/4 - Natural Language Processing with Attention Models/Week 2
| Name | Size |
|---|---|
| C4W2_A1_Transformer_Summarizer.ipynb | 115 KB |
| C4W2_L1_Attention.ipynb | 10.2 KB |
| C4W2_L2_Transformer_Decoder.ipynb | 13.6 KB |
| attention_lnb_figs/ | - |
| causal.png | 83.5 KB |
| data/ | - |
| decoder.png | 113.5 KB |
| dotproduct.png | 80.8 KB |
| masked-attention.png | 29.2 KB |
| transformer.png | 203.3 KB |
| transformerNews.png | 425 KB |
| transformer_decoder.png | 124.5 KB |
| transformer_decoder_1.png | 42.4 KB |
| transformer_decoder_lnb_figs/ | - |
| transformer_decoder_zoomin.png | 332.9 KB |
| vocab_dir/ | - |