Path: blob/master/Model-1/WordClassifier-Seq2Seq2CNN.ipynb
Kernel: Python 3
Recurrent Neural Network - Word Classification
Using a special model
Implemented in TensorFlow. A Seq2Seq model generates the sequence of letter images from the word image, and an RNN then recognises the individual letters.
TODO
In [1]:
Out[1]:
Tensorflow 1.4.0
Loading images
In [2]:
In [3]:
Out[3]:
Loading words...
-> Number of words: 5069
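The label vocabulary is not shown in this export. A minimal sketch, assuming 0 = PAD, 1 = EOS, then the uppercase and lowercase letters; this assignment is consistent with the index sequences printed in the training log below (e.g. [24 36 41 31 42 50 1] decodes to "Window"):

```python
# Assumed label vocabulary: 0 = PAD, 1 = EOS, 2..27 = 'A'..'Z', 28..53 = 'a'..'z'.
import string

PAD, EOS = 0, 1
CHARS = string.ascii_uppercase + string.ascii_lowercase
char2idx = {c: i + 2 for i, c in enumerate(CHARS)}
idx2char = {i: c for c, i in char2idx.items()}

def encode_word(word):
    """Map a word to class indices and terminate it with EOS."""
    return [char2idx[c] for c in word] + [EOS]

print(len(CHARS) + 2)         # 54 output classes
print(encode_word("Window"))  # [24, 36, 41, 31, 42, 50, 1]
```

This also explains the 54 in the logits shape printed further down: 52 letters plus the PAD and EOS symbols.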
Settings
In [4]:
Dataset
In [5]:
Out[5]:
Training images: 4055
Testing images: 1014
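The split code is not included in the export, but the counts above correspond to an 80/20 cut of the 5069 loaded words. A hypothetical sketch (the seed and shuffling are assumptions):

```python
# Hypothetical 80/20 train/test split; the cut point reproduces the
# 4055/1014 counts printed above for the 5069 loaded words.
import random

words = list(range(5069))    # stand-in for the loaded (image, label) pairs
random.seed(42)
random.shuffle(words)

cut = int(0.8 * len(words))  # 4055
train, test = words[:cut], words[cut:]
print("Training images:", len(train))
print("Testing images:", len(test))
```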
In [6]:
Out[6]:
Total train images 4055
In [7]:
In [8]:
Out[8]:
Iterator created.
Iterator created.
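The two "Iterator created." lines suggest one iterator each for the train and test sets. A minimal sketch of such an iterator (hypothetical; the notebook's own implementation is not shown), which pads every label sequence in a batch with PAD = 0 to the longest sequence in that batch:

```python
# Minimal batch iterator sketch: yields batches whose label sequences
# are right-padded with PAD = 0 to the longest sequence in the batch.
PAD = 0

def batch_iterator(labels, batch_size):
    for start in range(0, len(labels), batch_size):
        batch = labels[start:start + batch_size]
        max_len = max(len(seq) for seq in batch)
        yield [seq + [PAD] * (max_len - len(seq)) for seq in batch]

batches = list(batch_iterator([[5, 1], [7, 8, 9, 1], [2, 1]], batch_size=2))
print(batches[0])  # [[5, 1, 0, 0], [7, 8, 9, 1]]
print(batches[1])  # [[2, 1]]
```

This batch-local padding is why the "expected" rows in the training log end in runs of 0s of varying length.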
Placeholders
In [9]:
Decoder Train Feeds
In [10]:
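The train-decoder feeds follow the usual teacher-forcing pattern. A sketch (the GO symbol is an assumption; here it reuses the EOS id 1, a common convention): the decoder input is the gold sequence shifted right by one step starting with GO, and the decoder target is the same sequence ending with EOS.

```python
# Teacher-forcing feeds for the train decoder (sketch; GO = EOS = 1 is
# an assumption). Inputs are the targets shifted right by one step.
EOS = 1

def decoder_train_feeds(target):
    """target: label indices without the terminator, e.g. [24, 36, 41]."""
    decoder_inputs = [EOS] + target   # GO, t0, t1, ...
    decoder_targets = target + [EOS]  # t0, t1, ..., EOS
    return decoder_inputs, decoder_targets

print(decoder_train_feeds([24, 36, 41]))  # ([1, 24, 36, 41], [24, 36, 41, 1])
```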
Encoder
In [11]:
In [12]:
In [13]:
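The encoder cells are missing from this export. Going by the description (Seq2Seq generating a sequence of letter images), the word image is presumably consumed as a sequence of vertical slices. A conceptual sketch in numpy; the slice width and stride here are pure assumptions:

```python
# Cut a word image into a sequence of fixed-width vertical slices for
# the encoder (illustrative; width/stride values are assumptions).
import numpy as np

def image_to_slices(img, width=8, stride=4):
    """img: (height, total_width) array -> (num_slices, height, width)."""
    slices = [img[:, i:i + width]
              for i in range(0, img.shape[1] - width + 1, stride)]
    return np.stack(slices)

img = np.zeros((64, 40), dtype=np.float32)  # dummy word image
print(image_to_slices(img).shape)           # (9, 64, 8)
```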
Decoder
In [14]:
TRAIN DECODER
In [15]:
INFERENCE DECODER
In [16]:
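At inference time there is no gold sequence to feed, so the decoder runs greedily: each step takes the argmax class and feeds it back as the next input, up to a maximum length. A sketch of the control flow (the step function here is a toy stand-in, and the maximum length of 23 is taken from the longest prediction in the log below):

```python
# Greedy inference sketch: feed each predicted symbol back as the next
# input; stop at EOS. This mirrors why predictions in the log end in
# runs of repeated 1s once the decoder reaches EOS.
EOS, MAX_LEN = 1, 23

def greedy_decode(step_fn, max_len=MAX_LEN):
    """step_fn(prev_id, t) -> next_id; stand-in for one decoder step."""
    out, prev = [], EOS  # start from the GO/EOS symbol
    for t in range(max_len):
        prev = step_fn(prev, t)
        out.append(prev)
        if prev == EOS:
            break
    return out

# Toy step function that spells [5, 7] and then terminates.
script = [5, 7, EOS]
print(greedy_decode(lambda prev, t: script[t]))  # [5, 7, 1]
```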
In [17]:
RNN
In [18]:
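The RNN cell itself is not visible in the export. Purely as an illustration of the recurrence that runs over the encoded letter features (all sizes made up), a vanilla RNN step in numpy:

```python
# Minimal recurrent step: h_t = tanh(x_t W_xh + h_{t-1} W_hh + b).
# Illustrative only; the notebook's actual cell/sizes are not shown.
import numpy as np

def rnn(inputs, W_xh, W_hh, b):
    """inputs: (time, in_dim); returns the final hidden state (hid_dim,)."""
    h = np.zeros(W_hh.shape[0])
    for x in inputs:
        h = np.tanh(x @ W_xh + h @ W_hh + b)
    return h

rng = np.random.default_rng(0)
T, in_dim, hid = 5, 16, 32
h = rnn(rng.normal(size=(T, in_dim)),
        rng.normal(size=(in_dim, hid)) * 0.1,
        rng.normal(size=(hid, hid)) * 0.1,
        np.zeros(hid))
print(h.shape)  # (32,)
```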
Optimizer
Weights + Paddings
In [19]:
In [20]:
In [ ]:
(3, 11, 54)
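The tuple printed above plausibly reflects logits of shape (batch, time, num_classes). The "Weights + Paddings" heading suggests a sequence loss in which PAD steps get weight 0 so that padding does not contribute to the gradient. A numpy sketch of such a masked cross-entropy (an assumption about the loss; the TF code is not shown):

```python
# Masked sequence loss: per-step cross-entropy over (B, T, C) logits,
# with PAD (= 0) target steps weighted 0.
import numpy as np

def masked_loss(logits, targets, pad=0):
    """logits: (B, T, C); targets: (B, T) ints; mean over non-PAD steps."""
    # log-softmax for numerical stability
    z = logits - logits.max(axis=-1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=-1, keepdims=True))
    b, t = np.indices(targets.shape)
    nll = -log_probs[b, t, targets]
    mask = (targets != pad).astype(float)
    return (nll * mask).sum() / mask.sum()

logits = np.zeros((3, 11, 54))                 # uniform logits
targets = np.array([[5, 1] + [0] * 9] * 3)
print(round(masked_loss(logits, targets), 3))  # -log(1/54) ≈ 3.989
```

With uniform logits the loss is log(54) ≈ 3.99, roughly where the training log below starts before dropping toward ~1.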
Training
In [ ]:
batch 0 - loss: 5.0267339
expected > [45 48 30 32 1 0 0 0 0 0 0 0]
predicted > [50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50]
expected > [24 36 41 31 42 50 1 0 0 0 0 0]
predicted > [50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50]
batch 1500 - loss: 3.0129859
expected > [10 41 46 47 28 39 39 1 0 0]
predicted > [46 36 39 39 1 1 1 1]
expected > [32 41 32 45 34 52 1 0 0 0]
predicted > [46 42 42 1 1 1 1 1]
batch 3000 - loss: 2.2141256
expected > [38 32 1 0 0 0]
predicted > [29 32 1 1 1]
expected > [10 39 39 1 0 0]
predicted > [34 32 1 1 1]
batch 4500 - loss: 2.4519362
expected > [11 36 41 31 45 36 30 35 1 0]
predicted > [11 36 31 31 41 41 31 1 1]
expected > [46 32 46 36 47 1 0 0 0 0]
predicted > [46 32 46 46 47 47 1 1 1]
batch 6000 - loss: 2.2202351
expected > [21 28 47 28 41 28 1 0 0 0]
predicted > [21 28 47 32 32 28 1 1]
expected > [12 32 49 36 41 1 0 0 0 0]
predicted > [12 32 40 40 40 1 1 1]
batch 7500 - loss: 1.8579268
expected > [45 28 47 36 42 1 0]
predicted > [45 32 47 47 42 1]
expected > [50 35 42 1 0 0 0]
predicted > [49 32 39 42 1 1]
batch 9000 - loss: 1.6952639
expected > [49 42 31 28 1 0 0 0 0]
predicted > [49 42 31 28 1 1 1 1 1]
expected > [53 32 41 36 47 1 0 0 0]
predicted > [30 32 41 36 47 1 1 1 1]
batch 10500 - loss: 1.288264
expected > [2 1 0 0 0 0]
predicted > [28 1 1 1 1]
expected > [20 1 0 0 0 0]
predicted > [10 1 1 1 1]
batch 12000 - loss: 1.2034814
expected > [20 47 32 39 28 1 0 0 0 0]
predicted > [20 47 32 39 28 1 1 1 1]
expected > [ 4 35 28 39 38 1 0 0 0 0]
predicted > [ 4 38 28 39 38 1 1 1 1]
batch 13500 - loss: 1.581913
expected > [10 34 41 28 30 1]
predicted > [20 43 32 28 1]
expected > [ 3 28 52 1 0 0]
predicted > [ 3 28 52 1 1]
batch 15000 - loss: 0.97768688
expected > [29 31 36 1 0 0]
predicted > [29 47 36 1 1]
expected > [47 45 36 43 1 0]
predicted > [29 45 43 43 1]
batch 16500 - loss: 1.7931079
expected > [38 45 28 47 28 46 36 1 0 0]
predicted > [38 45 28 39 28 45 36 1 1 1 1]
expected > [39 32 47 48 46 38 28 1 0 0]
predicted > [39 32 48 46 39 28 28 28 1 1 1]
batch 18000 - loss: 1.3911827
expected > [ 2 39 36 30 32 1 0 0 0 0]
predicted > [38 39 40 30 32 32 1 1 1 1]
expected > [30 36 46 39 42 1 0 0 0 0]
predicted > [28 36 46 39 42 1 1 1 1 1]
batch 19500 - loss: 1.6664226
expected > [47 32 35 31 52 1]
predicted > [47 32 31 31 52 1]
expected > [42 1 0 0 0 0]
predicted > [42 1 1 1 1 1]
batch 21000 - loss: 1.7441137
expected > [11 36 41 31 45 36 30 35 1 0]
predicted > [11 36 41 45 36 45 38 1]
expected > [45 36 46 38 1 0 0 0 0 0]
predicted > [45 36 28 38 1 1 1 1]
batch 22500 - loss: 1.4565325
expected > [46 40 32 47 1 0 0 0 0]
predicted > [48 41 32 47 1 1 1 1 1 1]
expected > [25 32 41 28 1 0 0 0 0]
predicted > [25 32 41 28 1 1 1 1 1 1]
batch 24000 - loss: 1.4579525
expected > [34 1 0 0 0 0 0]
predicted > [26 1 1 1 1]
expected > [43 45 32 47 47 52 1]
predicted > [43 42 53 52 1]
batch 25500 - loss: 1.1414853
expected > [42 45 34 28 41 1 0 0]
predicted > [30 45 34 28 41 1 1]
expected > [19 48 31 42 39 33 1 0]
predicted > [ 5 28 31 28 39 33 1]
batch 27000 - loss: 1.6382754
expected > [28 1 0 0 0]
predicted > [28 1 1 1 1 1]
expected > [10 39 39 1 0]
predicted > [24 1 1 1 1 1]
batch 28500 - loss: 1.542223
expected > [43 28 48 46 32 1 0 0 0]
predicted > [43 28 47 32 32 1 1 1 1]
expected > [53 28 46 47 48 43 52 1 0]
predicted > [43 28 46 47 48 41 42 1 1]
batch 30000 - loss: 1.039789
expected > [32 51 30 36 47 32 1 0 0]
predicted > [32 51 30 36 47 32 1 1 1 1]
expected > [40 42 47 42 45 1 0 0 0]
predicted > [40 42 47 42 45 1 1 1 1 1]
batch 31500 - loss: 0.82488751
expected > [12 36 41 42 1 0 0 0 0 0]
predicted > [25 36 41 42 1 1 1 1 1 1]
expected > [43 45 36 30 32 1 0 0 0 0]
predicted > [43 45 36 41 32 1 1 1 1 1]
batch 33000 - loss: 2.1014812
expected > [53 28 1 0 0 0 0 0 0]
predicted > [53 28 1 1 1 1 1 1 1]
expected > [34 45 42 48 43 1 0 0 0]
predicted > [34 45 42 28 1 1 1 1 1]
batch 34500 - loss: 1.229347
expected > [40 28 47 32 45 36 28 39 1 0 0 0]
predicted > [40 28 47 32 45 28 28 1 1 1]
expected > [46 28 50 1 0 0 0 0 0 0 0 0]
predicted > [46 28 48 48 1 1 1 1 1 1]
batch 36000 - loss: 1.4402622
expected > [28 30 1 0 0 0]
predicted > [28 30 1 1 1 1]
expected > [48 31 1 0 0 0]
predicted > [48 31 1 1 1 1]
batch 37500 - loss: 1.3146204
expected > [47 52 43 32 1]
predicted > [47 52 43 32 1]
expected > [28 39 32 1 0]
predicted > [28 39 32 1 1]
batch 39000 - loss: 1.7878778
expected > [30 28 45 32 1 0 0 0 0 0 0]
predicted > [30 28 45 32 1 1 1 1 1 1 1]
expected > [46 48 29 46 30 45 36 29 32 1 0]
predicted > [46 48 29 46 30 45 29 32 1 1 1]
batch 40500 - loss: 0.89473355
expected > [46 48 31 31 32 41 1 0 0 0]
predicted > [46 45 31 31 32 41 41 1 1 1]
expected > [41 32 46 47 28 30 36 39 42 1]
predicted > [41 32 46 47 28 30 36 39 36 1]
batch 42000 - loss: 1.211858
expected > [47 50 42 1 0 0 0]
predicted > [47 50 42 1 1 1 1]
expected > [35 45 28 47 1 0 0]
predicted > [35 45 28 47 1 1 1]
batch 43500 - loss: 1.6062182
expected > [30 28 37 1 0 0]
predicted > [ 4 28 37 1 1 1 1]
expected > [37 28 38 42 1 0]
predicted > [52 32 38 42 1 1 1]
batch 45000 - loss: 0.84628856
expected > [2 1 0 0 0 0]
predicted > [2 1 1 1 1 1]
expected > [44 48 36 47 32 1]
predicted > [34 40 36 47 32 1]
batch 46500 - loss: 1.578271
expected > [39 32 53 32 47 1 0]
predicted > [29 32 32 47 1 1 1]
expected > [50 36 31 32 1 0 0]
predicted > [50 36 31 32 1 1 1]
batch 48000 - loss: 1.5562747
expected > [45 36 46 38 1 0 0 0 0]
predicted > [45 36 46 38 1 1 1 1]
expected > [45 28 30 32 38 1 0 0 0]
predicted > [45 28 30 32 38 1 1 1]
Training interrupted, model saved.
'models/word-clas/en/SeqRNN/Classifier2'
In [ ]:
Expected images: 4
Predicted images: 4
Expected images: 4
Predicted images: 4
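To score pairs like the "expected"/"predicted" rows in the training log, both sequences can be trimmed at the first EOS before comparison, which discards the trailing PAD 0s and repeated 1s. A small sketch (the evaluation code itself is not included in the export):

```python
# Trim a label sequence at the first EOS (= 1), dropping PAD/EOS tails,
# then compare expected vs predicted on the remaining symbols.
EOS = 1

def trim(seq):
    out = []
    for s in seq:
        if s == EOS:
            break
        out.append(s)
    return out

expected  = [49, 42, 31, 28, 1, 0, 0, 0, 0]   # a pair from the log above
predicted = [49, 42, 31, 28, 1, 1, 1, 1, 1]
print(trim(expected) == trim(predicted))      # True
```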