Path: blob/master/Generative NLP Models using Python/Lab Work RNN for Alphabetical Sequence Generation .ipynb
Kernel: Python 3 (ipykernel)
Alphabetical Sequence Generation with a Simple RNN
In [1]:
In [10]:
In [11]:
In [8]:
Out[8]:
['a',
'b',
'c',
'd',
'e',
'f',
'g',
'h',
'i',
'j',
'k',
'l',
'm',
'n',
'o',
'p',
'q',
'r',
's',
't',
'u',
'v',
'w',
'x',
'y',
'z']
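The code for the cells above was not preserved in this export; only the alphabet printout survives. A minimal sketch of what likely produced it, using the standard `string` module (the mapping names `char_to_idx` and `idx_to_char` are assumptions, not taken from the original):

```python
import string

# Build the working alphabet and index mappings for the lab
alphabet = list(string.ascii_lowercase)          # ['a', 'b', ..., 'z']
char_to_idx = {ch: i for i, ch in enumerate(alphabet)}
idx_to_char = {i: ch for i, ch in enumerate(alphabet)}

alphabet
```
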
In [5]:
Out[5]:
Input Sequence = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24]
In [6]:
Out[6]:
Output Sequence = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25]
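These two printouts show the standard next-character setup: the input is the alphabet's indices 0–24 (`'a'`–`'y'`) and the target is the same sequence shifted one step ahead, 1–25 (`'b'`–`'z'`). A sketch of the (missing) cells that produced them:

```python
# Inputs are indices 0..24; each target is the index of the NEXT letter
input_seq = list(range(25))                  # 'a'..'y'
output_seq = [i + 1 for i in input_seq]      # 'b'..'z'

print("Input Sequence =", input_seq)
print("Output Sequence =", output_seq)
```
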
In [7]:
Out[7]:
Input tensor = tensor([[ 0],
[ 1],
[ 2],
[ 3],
[ 4],
[ 5],
[ 6],
[ 7],
[ 8],
[ 9],
[10],
[11],
[12],
[13],
[14],
[15],
[16],
[17],
[18],
[19],
[20],
[21],
[22],
[23],
[24]])
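The tensor printout above has shape `(25, 1)`: one index per time step. The model cells themselves are missing from this export; the following is a plausible reconstruction, assuming a one-layer `nn.RNN` feeding a linear classifier over the 26 letters (the class name `CharRNN` and hidden size 32 are assumptions):

```python
import torch
import torch.nn as nn

input_seq = list(range(25))      # 'a'..'y'
output_seq = list(range(1, 26))  # 'b'..'z'

# Shape (25, 1), matching the printed tensor above
input_tensor = torch.tensor(input_seq).unsqueeze(1)
target_tensor = torch.tensor(output_seq)

class CharRNN(nn.Module):
    def __init__(self, hidden_size=32, vocab_size=26):
        super().__init__()
        self.rnn = nn.RNN(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, vocab_size)

    def forward(self, x):
        out, _ = self.rnn(x)   # out: (batch, seq_len, hidden_size)
        return self.fc(out)    # logits: (batch, seq_len, vocab_size)

model = CharRNN()
# Add batch and feature dims: (1, 25, 1) -> logits (1, 25, 26)
logits = model(input_tensor.float().view(1, 25, 1))
print(logits.shape)  # torch.Size([1, 25, 26])
```

Training would then minimize `nn.CrossEntropyLoss` between the per-step logits and `target_tensor`, the same shifted-by-one objective as in the TensorFlow section below.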
Alphabetical Sequence Generation using TensorFlow
In [18]:
In [19]:
In [20]:
Out[20]:
C:\Users\Suyashi144893\AppData\Local\anaconda3\Lib\site-packages\keras\src\layers\core\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.
super().__init__(activity_regularizer=activity_regularizer, **kwargs)
In [21]:
Out[21]:
Epoch 0, Loss: 0.0111, Accuracy: 1.0000
Epoch 50, Loss: 0.0111, Accuracy: 1.0000
Epoch 100, Loss: 0.0111, Accuracy: 1.0000
Epoch 150, Loss: 0.0111, Accuracy: 1.0000
Epoch 200, Loss: 0.0111, Accuracy: 1.0000
Epoch 250, Loss: 0.0111, Accuracy: 1.0000
Epoch 300, Loss: 0.0111, Accuracy: 1.0000
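The Keras warning and the epoch log above come from cells whose code was not preserved. A sketch of a setup consistent with both outputs, using the recommended `keras.Input` first layer (which avoids the warning shown) and a manual loop that logs every 50 epochs; the layer sizes and input scaling are assumptions:

```python
import numpy as np
from tensorflow import keras

# Indices 0..24 as inputs, indices 1..25 as integer class targets
X = np.arange(25, dtype=np.float32).reshape(25, 1, 1) / 25.0  # (samples, timesteps, features)
y = np.arange(1, 26)

model = keras.Sequential([
    keras.Input(shape=(1, 1)),                       # one time step, one feature
    keras.layers.SimpleRNN(32),
    keras.layers.Dense(26, activation="softmax"),    # one class per letter
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Log in the same format as the output above
for epoch in range(301):
    hist = model.fit(X, y, verbose=0)
    if epoch % 50 == 0:
        print(f"Epoch {epoch}, Loss: {hist.history['loss'][0]:.4f}, "
              f"Accuracy: {hist.history['accuracy'][0]:.4f}")
```

Note that the logged loss is identical (0.0111) from epoch 0 onward, which suggests the model had already been trained in an earlier run of the cell before this log was captured.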
In [24]:
Out[24]:
lmnopqrstuvwxyz
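The final output is a greedy roll-out from a seed letter: starting at `'k'`, the model repeatedly predicts the next index until it reaches `'z'`. The generation cell is missing, so here is the decoding loop with a stand-in for the trained model (`predict_next` below mimics a perfectly trained network; in the notebook it would be an `argmax` over `model.predict`):

```python
import string

alphabet = list(string.ascii_lowercase)

def predict_next(idx):
    # Stand-in for the trained model: a perfect network maps index i to i + 1
    return idx + 1

def generate(seed_char, steps):
    """Greedily generate `steps` letters after `seed_char` (hypothetical helper)."""
    idx = alphabet.index(seed_char)
    chars = []
    for _ in range(steps):
        idx = predict_next(idx)
        chars.append(alphabet[idx])
    return "".join(chars)

print(generate("k", 15))  # -> lmnopqrstuvwxyz
```

Seeding with `'k'` (index 10) and rolling forward 15 steps reproduces the printed output `lmnopqrstuvwxyz` exactly.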