Kernel: Python 3
MLP applied to IMDB movie reviews (binary sentiment analysis)
We use the IMDB movie review dataset, where the task is to classify the sentiment of each review as positive or negative. We work with the preprocessed (word-index encoded) version of the dataset from https://www.tensorflow.org/datasets.
In [1]:
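# The code for this cell is not preserved in the transcript. A minimal
# sketch of the likely setup, assuming the standard imports used by the
# rest of the notebook.
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
from tensorflow import keras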
In [2]:
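# A sketch of the data-loading cell (code not shown), assuming the Keras
# built-in copy of the preprocessed IMDB data. vocab_size = 10000 is
# inferred from the Embedding layer's 160,000 parameters (10,000 words
# x 16 dimensions) in the model summary further down.
imdb = keras.datasets.imdb
vocab_size = 10000
(train_data, train_labels), (test_data, test_labels) = imdb.load_data(num_words=vocab_size)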
In [5]:
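# A sketch of the inspection cell that produced the output below: the
# training-set shape, the first review as word indices, and two reviews
# decoded back to text. The index offsets follow the standard Keras IMDB
# convention (<PAD>=0, <START>=1, <UNK>=2).
print(train_data.shape)
print(train_data[0])

# Build a reverse lookup from word index to word.
word_index = imdb.get_word_index()
word_index = {w: i + 3 for w, i in word_index.items()}
word_index["<PAD>"] = 0
word_index["<START>"] = 1
word_index["<UNK>"] = 2
word_index["<UNUSED>"] = 3
reverse_word_index = {i: w for w, i in word_index.items()}

def decode_review(encoded):
    return " ".join(reverse_word_index.get(i, "?") for i in encoded)

for i in range(2):
    print(f"example {i}, label {train_labels[i]}")
    print(decode_review(train_data[i]))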
Out[5]:
(25000,)
[1, 14, 22, 16, 43, 530, 973, 1622, 1385, 65, 458, 4468, 66, 3941, 4, 173, 36, 256, 5, 25, 100, 43, 838, 112, 50, 670, 2, 9, 35, 480, 284, 5, 150, 4, 172, 112, 167, 2, 336, 385, 39, 4, 172, 4536, 1111, 17, 546, 38, 13, 447, 4, 192, 50, 16, 6, 147, 2025, 19, 14, 22, 4, 1920, 4613, 469, 4, 22, 71, 87, 12, 16, 43, 530, 38, 76, 15, 13, 1247, 4, 22, 17, 515, 17, 12, 16, 626, 18, 2, 5, 62, 386, 12, 8, 316, 8, 106, 5, 4, 2223, 5244, 16, 480, 66, 3785, 33, 4, 130, 12, 16, 38, 619, 5, 25, 124, 51, 36, 135, 48, 25, 1415, 33, 6, 22, 12, 215, 28, 77, 52, 5, 14, 407, 16, 82, 2, 8, 4, 107, 117, 5952, 15, 256, 4, 2, 7, 3766, 5, 723, 36, 71, 43, 530, 476, 26, 400, 317, 46, 7, 4, 2, 1029, 13, 104, 88, 4, 381, 15, 297, 98, 32, 2071, 56, 26, 141, 6, 194, 7486, 18, 4, 226, 22, 21, 134, 476, 26, 480, 5, 144, 30, 5535, 18, 51, 36, 28, 224, 92, 25, 104, 4, 226, 65, 16, 38, 1334, 88, 12, 16, 283, 5, 16, 4472, 113, 103, 32, 15, 16, 5345, 19, 178, 32]
example 0, label 1
<START> this film was just brilliant casting location scenery story direction everyone's really suited the part they played and you could just imagine being there robert <UNK> is an amazing actor and now the same being director <UNK> father came from the same scottish island as myself so i loved the fact there was a real connection with this film the witty remarks throughout the film were great it was just brilliant so much that i bought the film as soon as it was released for <UNK> and would recommend it to everyone to watch and the fly fishing was amazing really cried at the end it was so sad and you know what they say if you cry at a film it must have been good and this definitely was also <UNK> to the two little boy's that played the <UNK> of norman and paul they were just brilliant children are often left out of the <UNK> list i think because the stars that play them all grown up are such a big profile for the whole film but these children are amazing and should be praised for what they have done don't you think the whole story was so lovely because it was true and was someone's life after all that was shared with us all
example 1, label 0
<START> big hair big boobs bad music and a giant safety pin these are the words to best describe this terrible movie i love cheesy horror movies and i've seen hundreds but this had got to be on of the worst ever made the plot is paper thin and ridiculous the acting is an abomination the script is completely laughable the best is the end showdown with the cop and how he worked out who the killer is it's just so damn terribly written the clothes are sickening and funny in equal <UNK> the hair is big lots of boobs <UNK> men wear those cut <UNK> shirts that show off their <UNK> sickening that men actually wore them and the music is just <UNK> trash that plays over and over again in almost every scene there is trashy music boobs and <UNK> taking away bodies and the gym still doesn't close for <UNK> all joking aside this is a truly bad film whose only charm is to look back on the disaster that was the 80's and have a good old laugh at how bad everything was back then
In [6]:
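# A sketch of the padding cell: pad or truncate every review to 256
# tokens, post-padding with the <PAD> index 0, which matches the
# (25000, 256) shape and the trailing zeros shown below.
train_data = keras.preprocessing.sequence.pad_sequences(
    train_data, value=word_index["<PAD>"], padding="post", maxlen=256)
test_data = keras.preprocessing.sequence.pad_sequences(
    test_data, value=word_index["<PAD>"], padding="post", maxlen=256)
print(train_data.shape)
print(train_data[0])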
Out[6]:
(25000, 256)
[ 1 14 22 16 43 530 973 1622 1385 65 458 4468 66 3941
4 173 36 256 5 25 100 43 838 112 50 670 2 9
35 480 284 5 150 4 172 112 167 2 336 385 39 4
172 4536 1111 17 546 38 13 447 4 192 50 16 6 147
2025 19 14 22 4 1920 4613 469 4 22 71 87 12 16
43 530 38 76 15 13 1247 4 22 17 515 17 12 16
626 18 2 5 62 386 12 8 316 8 106 5 4 2223
5244 16 480 66 3785 33 4 130 12 16 38 619 5 25
124 51 36 135 48 25 1415 33 6 22 12 215 28 77
52 5 14 407 16 82 2 8 4 107 117 5952 15 256
4 2 7 3766 5 723 36 71 43 530 476 26 400 317
46 7 4 2 1029 13 104 88 4 381 15 297 98 32
2071 56 26 141 6 194 7486 18 4 226 22 21 134 476
26 480 5 144 30 5535 18 51 36 28 224 92 25 104
4 226 65 16 38 1334 88 12 16 283 5 16 4472 113
103 32 15 16 5345 19 178 32 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0]
In [7]:
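# A sketch of the model definition, reconstructed from the summary below:
# a 16-dimensional embedding over the 10,000-word vocabulary, global
# average pooling over the sequence, a 16-unit ReLU layer, and a sigmoid
# output for the binary label (160,289 parameters in total).
embed_size = 16
model = keras.Sequential([
    keras.layers.Embedding(vocab_size, embed_size),
    keras.layers.GlobalAveragePooling1D(),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.summary()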
Out[7]:
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
embedding (Embedding) (None, None, 16) 160000
_________________________________________________________________
global_average_pooling1d (Gl (None, 16) 0
_________________________________________________________________
dense (Dense) (None, 16) 272
_________________________________________________________________
dense_1 (Dense) (None, 1) 17
=================================================================
Total params: 160,289
Trainable params: 160,289
Non-trainable params: 0
_________________________________________________________________
In [8]:
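# A sketch of the training cell. The optimizer, validation split, and
# batch size are assumptions, but they are consistent with the 30 steps
# per epoch in the log below (15,000 training examples / batch size 512)
# and with the 'acc'/'val_acc' metric names.
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["acc"])

# Hold out the first 10,000 reviews for validation.
x_val, partial_x_train = train_data[:10000], train_data[10000:]
y_val, partial_y_train = train_labels[:10000], train_labels[10000:]

history = model.fit(partial_x_train, partial_y_train,
                    epochs=50, batch_size=512,
                    validation_data=(x_val, y_val))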
Out[8]:
Epoch 1/50
30/30 [==============================] - 0s 17ms/step - loss: 0.6917 - acc: 0.5245 - val_loss: 0.6891 - val_acc: 0.5353
Epoch 2/50
30/30 [==============================] - 0s 13ms/step - loss: 0.6844 - acc: 0.6204 - val_loss: 0.6796 - val_acc: 0.7214
Epoch 3/50
30/30 [==============================] - 0s 12ms/step - loss: 0.6700 - acc: 0.7399 - val_loss: 0.6630 - val_acc: 0.6894
Epoch 4/50
30/30 [==============================] - 0s 13ms/step - loss: 0.6465 - acc: 0.7456 - val_loss: 0.6363 - val_acc: 0.7644
Epoch 5/50
30/30 [==============================] - 0s 13ms/step - loss: 0.6129 - acc: 0.7871 - val_loss: 0.6016 - val_acc: 0.7877
Epoch 6/50
30/30 [==============================] - 0s 13ms/step - loss: 0.5714 - acc: 0.8108 - val_loss: 0.5612 - val_acc: 0.8076
Epoch 7/50
30/30 [==============================] - 0s 13ms/step - loss: 0.5255 - acc: 0.8345 - val_loss: 0.5191 - val_acc: 0.8228
Epoch 8/50
30/30 [==============================] - 0s 13ms/step - loss: 0.4798 - acc: 0.8501 - val_loss: 0.4795 - val_acc: 0.8355
Epoch 9/50
30/30 [==============================] - 0s 13ms/step - loss: 0.4374 - acc: 0.8631 - val_loss: 0.4447 - val_acc: 0.8446
Epoch 10/50
30/30 [==============================] - 0s 12ms/step - loss: 0.4005 - acc: 0.8731 - val_loss: 0.4142 - val_acc: 0.8541
Epoch 11/50
30/30 [==============================] - 0s 13ms/step - loss: 0.3679 - acc: 0.8827 - val_loss: 0.3899 - val_acc: 0.8588
Epoch 12/50
30/30 [==============================] - 0s 13ms/step - loss: 0.3401 - acc: 0.8909 - val_loss: 0.3706 - val_acc: 0.8636
Epoch 13/50
30/30 [==============================] - 0s 12ms/step - loss: 0.3172 - acc: 0.8962 - val_loss: 0.3533 - val_acc: 0.8692
Epoch 14/50
30/30 [==============================] - 0s 13ms/step - loss: 0.2972 - acc: 0.9017 - val_loss: 0.3403 - val_acc: 0.8730
Epoch 15/50
30/30 [==============================] - 0s 13ms/step - loss: 0.2791 - acc: 0.9074 - val_loss: 0.3291 - val_acc: 0.8751
Epoch 16/50
30/30 [==============================] - 0s 13ms/step - loss: 0.2631 - acc: 0.9119 - val_loss: 0.3214 - val_acc: 0.8749
Epoch 17/50
30/30 [==============================] - 0s 13ms/step - loss: 0.2497 - acc: 0.9153 - val_loss: 0.3127 - val_acc: 0.8800
Epoch 18/50
30/30 [==============================] - 0s 13ms/step - loss: 0.2365 - acc: 0.9209 - val_loss: 0.3066 - val_acc: 0.8814
Epoch 19/50
30/30 [==============================] - 0s 13ms/step - loss: 0.2250 - acc: 0.9253 - val_loss: 0.3014 - val_acc: 0.8810
Epoch 20/50
30/30 [==============================] - 0s 13ms/step - loss: 0.2142 - acc: 0.9274 - val_loss: 0.2975 - val_acc: 0.8811
Epoch 21/50
30/30 [==============================] - 0s 12ms/step - loss: 0.2049 - acc: 0.9303 - val_loss: 0.2939 - val_acc: 0.8832
Epoch 22/50
30/30 [==============================] - 0s 14ms/step - loss: 0.1953 - acc: 0.9355 - val_loss: 0.2914 - val_acc: 0.8838
Epoch 23/50
30/30 [==============================] - 0s 12ms/step - loss: 0.1867 - acc: 0.9399 - val_loss: 0.2896 - val_acc: 0.8845
Epoch 24/50
30/30 [==============================] - 0s 13ms/step - loss: 0.1790 - acc: 0.9425 - val_loss: 0.2879 - val_acc: 0.8838
Epoch 25/50
30/30 [==============================] - 0s 13ms/step - loss: 0.1711 - acc: 0.9448 - val_loss: 0.2868 - val_acc: 0.8841
Epoch 26/50
30/30 [==============================] - 0s 13ms/step - loss: 0.1641 - acc: 0.9489 - val_loss: 0.2879 - val_acc: 0.8828
Epoch 27/50
30/30 [==============================] - 0s 13ms/step - loss: 0.1577 - acc: 0.9511 - val_loss: 0.2867 - val_acc: 0.8845
Epoch 28/50
30/30 [==============================] - 0s 13ms/step - loss: 0.1515 - acc: 0.9538 - val_loss: 0.2878 - val_acc: 0.8854
Epoch 29/50
30/30 [==============================] - 0s 12ms/step - loss: 0.1452 - acc: 0.9567 - val_loss: 0.2876 - val_acc: 0.8845
Epoch 30/50
30/30 [==============================] - 0s 12ms/step - loss: 0.1391 - acc: 0.9590 - val_loss: 0.2885 - val_acc: 0.8843
Epoch 31/50
30/30 [==============================] - 0s 13ms/step - loss: 0.1345 - acc: 0.9608 - val_loss: 0.2898 - val_acc: 0.8851
Epoch 32/50
30/30 [==============================] - 0s 12ms/step - loss: 0.1290 - acc: 0.9631 - val_loss: 0.2903 - val_acc: 0.8863
Epoch 33/50
30/30 [==============================] - 0s 12ms/step - loss: 0.1237 - acc: 0.9648 - val_loss: 0.2922 - val_acc: 0.8852
Epoch 34/50
30/30 [==============================] - 0s 12ms/step - loss: 0.1192 - acc: 0.9674 - val_loss: 0.2936 - val_acc: 0.8853
Epoch 35/50
30/30 [==============================] - 0s 13ms/step - loss: 0.1145 - acc: 0.9686 - val_loss: 0.2963 - val_acc: 0.8849
Epoch 36/50
30/30 [==============================] - 0s 12ms/step - loss: 0.1102 - acc: 0.9700 - val_loss: 0.2977 - val_acc: 0.8838
Epoch 37/50
30/30 [==============================] - 0s 12ms/step - loss: 0.1058 - acc: 0.9715 - val_loss: 0.3003 - val_acc: 0.8842
Epoch 38/50
30/30 [==============================] - 0s 12ms/step - loss: 0.1022 - acc: 0.9730 - val_loss: 0.3029 - val_acc: 0.8839
Epoch 39/50
30/30 [==============================] - 0s 12ms/step - loss: 0.0981 - acc: 0.9745 - val_loss: 0.3055 - val_acc: 0.8827
Epoch 40/50
30/30 [==============================] - 0s 13ms/step - loss: 0.0947 - acc: 0.9754 - val_loss: 0.3085 - val_acc: 0.8825
Epoch 41/50
30/30 [==============================] - 0s 12ms/step - loss: 0.0915 - acc: 0.9763 - val_loss: 0.3143 - val_acc: 0.8788
Epoch 42/50
30/30 [==============================] - 0s 13ms/step - loss: 0.0885 - acc: 0.9773 - val_loss: 0.3151 - val_acc: 0.8826
Epoch 43/50
30/30 [==============================] - 0s 13ms/step - loss: 0.0842 - acc: 0.9796 - val_loss: 0.3191 - val_acc: 0.8802
Epoch 44/50
30/30 [==============================] - 0s 13ms/step - loss: 0.0811 - acc: 0.9813 - val_loss: 0.3228 - val_acc: 0.8805
Epoch 45/50
30/30 [==============================] - 0s 13ms/step - loss: 0.0780 - acc: 0.9813 - val_loss: 0.3260 - val_acc: 0.8801
Epoch 46/50
30/30 [==============================] - 0s 12ms/step - loss: 0.0759 - acc: 0.9819 - val_loss: 0.3300 - val_acc: 0.8793
Epoch 47/50
30/30 [==============================] - 0s 13ms/step - loss: 0.0724 - acc: 0.9842 - val_loss: 0.3329 - val_acc: 0.8791
Epoch 48/50
30/30 [==============================] - 0s 12ms/step - loss: 0.0701 - acc: 0.9850 - val_loss: 0.3371 - val_acc: 0.8789
Epoch 49/50
30/30 [==============================] - 0s 13ms/step - loss: 0.0673 - acc: 0.9855 - val_loss: 0.3412 - val_acc: 0.8786
Epoch 50/50
30/30 [==============================] - 0s 13ms/step - loss: 0.0649 - acc: 0.9871 - val_loss: 0.3458 - val_acc: 0.8783
In [9]:
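# A sketch of the evaluation cell: list the quantities recorded during
# training, then evaluate on the 25,000-review test set (782 batches at
# the default batch size of 32).
history_dict = history.history
print(history_dict.keys())
model.evaluate(test_data, test_labels)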
Out[9]:
dict_keys(['loss', 'acc', 'val_loss', 'val_acc'])
782/782 [==============================] - 1s 2ms/step - loss: 0.3712 - acc: 0.8652
[0.37122461199760437, 0.8651999831199646]
In [10]:
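# A sketch of the loss-curve plot; the figure itself is not preserved in
# this transcript.
loss = history_dict["loss"]
val_loss = history_dict["val_loss"]
epochs = range(1, len(loss) + 1)

plt.plot(epochs, loss, "bo", label="Training loss")
plt.plot(epochs, val_loss, "b-", label="Validation loss")
plt.title("Training and validation loss")
plt.xlabel("Epochs")
plt.ylabel("Loss")
plt.legend()
plt.show()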
Out[10]:
In [11]:
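# A sketch of the accuracy-curve plot; the figure itself is not preserved.
acc = history_dict["acc"]
val_acc = history_dict["val_acc"]

plt.plot(epochs, acc, "bo", label="Training acc")
plt.plot(epochs, val_acc, "b-", label="Validation acc")
plt.title("Training and validation accuracy")
plt.xlabel("Epochs")
plt.ylabel("Accuracy")
plt.legend()
plt.show()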
Out[11]:
In [12]:
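# A sketch of the callback setup (code not shown). The checkpoint path
# matches the INFO messages in the next cell's output; saving only the
# best model by validation loss is an assumption. The dot-printing
# callback is a hypothetical helper that would explain the '.' characters
# in that log.
checkpoint_path = "imdb_keras_best_model.ckpt"
checkpoint_cb = keras.callbacks.ModelCheckpoint(
    checkpoint_path, monitor="val_loss", save_best_only=True)
dot_cb = keras.callbacks.LambdaCallback(
    on_epoch_end=lambda epoch, logs: print(".", end=""))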
In [13]:
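# A sketch of the run that produced the log below: fit a fresh copy of
# the model with the checkpoint and dot callbacks, silent except for one
# dot per epoch. The epoch count and other hyperparameters are
# assumptions (chosen to roughly match the number of dots in the log).
best_model = keras.Sequential([
    keras.layers.Embedding(vocab_size, embed_size),
    keras.layers.GlobalAveragePooling1D(),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
best_model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["acc"])
history2 = best_model.fit(partial_x_train, partial_y_train,
                          epochs=20, batch_size=512,
                          validation_data=(x_val, y_val),
                          callbacks=[checkpoint_cb, dot_cb],
                          verbose=0)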
Out[13]:
.WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow/python/training/tracking/tracking.py:111: Model.state_updates (from tensorflow.python.keras.engine.training) is deprecated and will be removed in a future version.
Instructions for updating:
This property should not be used in TensorFlow 2.0, as updates are applied automatically.
WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow/python/training/tracking/tracking.py:111: Layer.updates (from tensorflow.python.keras.engine.base_layer) is deprecated and will be removed in a future version.
Instructions for updating:
This property should not be used in TensorFlow 2.0, as updates are applied automatically.
INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
.INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
..INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
.INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
.INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
.INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
.INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
.INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
.INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
.INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
.INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
.INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
.INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
.INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
..INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
.INFO:tensorflow:Assets written to: imdb_keras_best_model.ckpt/assets
..