Path: blob/master/second_edition/chapter11_part02_sequence-models.ipynb
This is a companion notebook for the book Deep Learning with Python, Second Edition. For readability, it only contains runnable code blocks and section titles, and omits everything else in the book: text paragraphs, figures, and pseudocode.
To follow what's going on, I recommend reading the notebook side by side with your copy of the book.
This notebook was generated for TensorFlow 2.6.
Processing words as a sequence: The sequence model approach
A first practical example
Downloading the data
Preparing the data
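The code cell for this step is omitted in this extract. A minimal sketch of loading labeled text with `keras.utils.text_dataset_from_directory`, using a tiny hypothetical directory in place of the real aclImdb data, might look like:

```python
import pathlib
import tempfile
from tensorflow import keras

# Hypothetical stand-in for the aclImdb directory layout used in the
# book: one subdirectory per class, one text file per review.
root = pathlib.Path(tempfile.mkdtemp()) / "aclImdb_demo"
for label in ("pos", "neg"):
    directory = root / "train" / label
    directory.mkdir(parents=True)
    (directory / "0.txt").write_text("a tiny placeholder review")

batch_size = 2
train_ds = keras.utils.text_dataset_from_directory(
    str(root / "train"), batch_size=batch_size)

for inputs, targets in train_ds:
    print("inputs dtype:", inputs.dtype)    # raw string reviews
    print("targets shape:", targets.shape)  # one 0/1 label per review
    break
```

Labels are inferred from the subdirectory names; each batch pairs raw review strings with integer class labels.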
Preparing integer sequence datasets
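The cell itself is omitted here; a small sketch of turning raw strings into fixed-length integer sequences with a `TextVectorization` layer, using toy values in place of the book's vocabulary size and sequence length, could be:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Toy limits standing in for the IMDB setup (the book uses a much
# larger vocabulary and longer sequences).
max_tokens = 100
max_length = 8
text_vectorization = layers.TextVectorization(
    max_tokens=max_tokens,
    output_mode="int",                  # emit integer word indices
    output_sequence_length=max_length,  # pad/truncate to a fixed length
)
corpus = ["the movie was great", "the movie was terrible"]
text_vectorization.adapt(corpus)  # build the vocabulary from the corpus

int_sequences = text_vectorization(tf.constant(corpus))
print(int_sequences.shape)  # (2, 8)
```

Sequences shorter than `output_sequence_length` are right-padded with index 0, which matters for masking later on.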
A sequence model built on one-hot encoded vector sequences
Training a first basic sequence model
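The model cell is omitted in this extract. A sketch of a bidirectional LSTM over one-hot encoded token sequences, with the `tf.one_hot` call wrapped in a `Lambda` layer so it sits inside the functional graph (a wrapper I'm assuming here for portability), might be:

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

max_tokens = 20000  # vocabulary size, as in the book's IMDB setup

# Each integer token is expanded on the fly into a one-hot vector of
# size max_tokens, then processed by a bidirectional LSTM.
inputs = keras.Input(shape=(None,), dtype="int64")
embedded = layers.Lambda(
    lambda t: tf.one_hot(t, depth=max_tokens))(inputs)
x = layers.Bidirectional(layers.LSTM(32))(embedded)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(1, activation="sigmoid")(x)
model = keras.Model(inputs, outputs)
model.compile(optimizer="rmsprop",
              loss="binary_crossentropy",
              metrics=["accuracy"])
```

Training would then call `model.fit(...)` on the integer datasets. The one-hot expansion makes each timestep a 20,000-dimensional vector, which is slow and memory-hungry; that cost motivates the Embedding layer introduced next.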
Understanding word embeddings
Learning word embeddings with the Embedding layer
Instantiating an Embedding layer
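The cell body is omitted here; instantiating the layer is a one-liner, shown below with a demo call added to illustrate the shapes (the 256-dimensional output is the kind of value the book uses):

```python
import tensorflow as tf
from tensorflow.keras import layers

max_tokens = 20000
# input_dim is the vocabulary size, output_dim the dimensionality of
# the learned word vectors.
embedding_layer = layers.Embedding(input_dim=max_tokens, output_dim=256)

# The layer maps integer token indices to dense vectors:
vectors = embedding_layer(tf.constant([[3, 5, 7]]))
print(vectors.shape)  # (1, 3, 256)
```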
Model that uses an Embedding layer trained from scratch
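With the code cell omitted from this extract, a sketch of the model, swapping the one-hot expansion for an `Embedding` layer whose vectors are learned end to end, could be:

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

max_tokens = 20000

# Same architecture as the one-hot model, but the Embedding layer
# learns a dense 256-dimensional vector per word during training.
inputs = keras.Input(shape=(None,), dtype="int64")
embedded = layers.Embedding(input_dim=max_tokens, output_dim=256)(inputs)
x = layers.Bidirectional(layers.LSTM(32))(embedded)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(1, activation="sigmoid")(x)
model = keras.Model(inputs, outputs)
model.compile(optimizer="rmsprop",
              loss="binary_crossentropy",
              metrics=["accuracy"])
```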
Understanding padding and masking
Using an Embedding layer with masking enabled
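The cell is omitted in this extract; the key change is passing `mask_zero=True` so the layer treats index 0 as padding. A small check of the mask it produces, with toy sizes of my choosing:

```python
import tensorflow as tf
from tensorflow.keras import layers

# With mask_zero=True, index 0 is treated as padding: the layer emits
# a boolean mask that downstream layers (e.g. LSTM) use to skip the
# padded timesteps.
embedding = layers.Embedding(input_dim=10, output_dim=4, mask_zero=True)
ids = tf.constant([[5, 3, 0, 0]])
mask = embedding.compute_mask(ids)
print(mask)  # padded positions come out False
```

The mask propagates automatically through the functional graph; no other layer needs to be changed.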
Using pretrained word embeddings
Parsing the GloVe word-embeddings file
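The parsing cell is omitted here. Each line of a GloVe file is a word followed by its space-separated vector components; a sketch of the parsing loop, run on two made-up in-memory lines instead of the real `glove.6B` download, might be:

```python
import numpy as np

# Two fabricated entries in the glove.6B text format (the real file
# has 100-dimensional vectors and hundreds of thousands of words).
sample_lines = [
    "the 0.1 0.2 0.3",
    "movie 0.4 0.5 0.6",
]

embeddings_index = {}
for line in sample_lines:
    word, coefs = line.split(maxsplit=1)
    embeddings_index[word] = np.array(coefs.split(), dtype="float32")

print(f"Found {len(embeddings_index)} word vectors.")
```

On the real file the same loop would iterate over the opened file object line by line.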
Preparing the GloVe word-embeddings matrix
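With the cell omitted, a sketch of building the matrix that maps each vocabulary index to its pretrained vector, using a hypothetical tiny vocabulary and made-up vectors, could be:

```python
import numpy as np

embedding_dim = 3
# Hypothetical pretrained vectors, standing in for the parsed GloVe
# index from the previous step.
embeddings_index = {
    "the": np.array([0.1, 0.2, 0.3], dtype="float32"),
    "movie": np.array([0.4, 0.5, 0.6], dtype="float32"),
}
# Vocabulary as TextVectorization reports it: index 0 is padding,
# index 1 the out-of-vocabulary token.
vocabulary = ["", "[UNK]", "the", "movie", "great"]
word_index = dict(zip(vocabulary, range(len(vocabulary))))

embedding_matrix = np.zeros((len(vocabulary), embedding_dim))
for word, i in word_index.items():
    vector = embeddings_index.get(word)
    if vector is not None:  # words missing from GloVe stay all-zero
        embedding_matrix[i] = vector
```

Row `i` of the matrix holds the GloVe vector for word index `i`; words without a pretrained vector keep an all-zero row.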
Model that uses a pretrained Embedding layer
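The final cell is also omitted from this extract. One way to sketch it, loading the pretrained matrix into a frozen `Embedding` layer via `set_weights` after the model is built (a random matrix stands in for the GloVe one here):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

max_tokens, embedding_dim = 1000, 100
# Stand-in for the GloVe-derived matrix built in the previous step.
embedding_matrix = np.random.rand(max_tokens, embedding_dim).astype("float32")

# trainable=False freezes the pretrained vectors; mask_zero=True keeps
# padding masked as before.
embedding_layer = layers.Embedding(
    max_tokens, embedding_dim, mask_zero=True, trainable=False)

inputs = keras.Input(shape=(None,), dtype="int64")
embedded = embedding_layer(inputs)
x = layers.Bidirectional(layers.LSTM(32))(embedded)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(1, activation="sigmoid")(x)
model = keras.Model(inputs, outputs)
model.compile(optimizer="rmsprop",
              loss="binary_crossentropy",
              metrics=["accuracy"])

# Load the pretrained vectors into the now-built frozen layer.
embedding_layer.set_weights([embedding_matrix])
```

Only the LSTM and Dense weights are updated during training; the word vectors stay fixed at their pretrained values.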