Path: tree/master/4 - Natural Language Processing with Attention Models/Week 1
| Name | Size |
|---|---|
| C4W1_A1_NMT_with_Attention.ipynb | 95 KB |
| C4W1_L1_Ungraded_Lab_Stack_Semantics.ipynb | 20.5 KB |
| C4W1_L2_Ungraded_Lab_Bleu_Score.ipynb | 52.3 KB |
| NMTModel.png | 125.9 KB |
| Stack1.png | 35.5 KB |
| Stack2.png | 35.7 KB |
| Stack3.png | 34.9 KB |
| Stack4.png | 46.2 KB |
| attention_overview.png | 94.1 KB |
| data/ | - |
| input_encoder.png | 15.4 KB |
| output_dir/ | - |
| plain_rnn.png | 59.3 KB |
| pre_attention_decoder.png | 17.7 KB |
| w1_unittest.py | 19.1 KB |
| wmt19_can.txt | 5.8 KB |
| wmt19_ref.txt | 11.8 KB |
| wmt19_src.txt | 10 KB |