GitHub Repository: suyashi29/python-su
Path: blob/master/Generative AI for Intelligent Data Handling/Day 5.1 RNN for Text Sequence.ipynb
Kernel: Python 3 (ipykernel)

An RNN can be used to generate a sequence of text, for example character by character or word by word. A simple Recurrent Neural Network (RNN) supports text sequence generation in several ways:

Character-level Text Generation:

  • Generate text character by character: predict the next character in a sequence from the previous characters.

  • Application: generating new text in the style of a given text corpus (e.g., generating new Shakespearean text).

Word-level Text Generation:

  • Generate text word by word: predict the next word in a sequence from the previous words.

  • Application: creating coherent sentences or paragraphs based on a training corpus (e.g., generating news headlines).

Both setups reduce to the same sliding-window framing: slide a fixed window over the token stream and learn to predict the token that follows, as sketched below.
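
A minimal sketch of that shared sliding-window framing, using only the standard library; the helper name make_pairs and the window sizes are illustrative, not taken from the examples that follow:

# Minimal sketch of the sliding-window framing both modes share: each
# training pair is (previous tokens, next token). The helper name and
# window sizes are illustrative, not from this notebook's examples.
def make_pairs(tokens, window=3):
    pairs = []
    for i in range(len(tokens) - window):
        pairs.append((tokens[i:i + window], tokens[i + window]))
    return pairs

# Character-level: tokens are characters
print(make_pairs(list("hello world"), window=3)[:2])
# Word-level: tokens are words
print(make_pairs("the quick brown fox jumps".split(), window=2)[:2])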

Example 1: Predicting the Next Character in a String

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import SimpleRNN, Dense
from tensorflow.keras.utils import to_categorical

# Generate a sequence of characters
alphabet = "abcdefghijklmnopqrstuvwxyz"
sequence_length = len(alphabet)

# Prepare data: sliding windows of 4 characters and the character that follows
X = []
y = []
for i in range(sequence_length - 4):
    X.append([ord(char) for char in alphabet[i:i+4]])
    y.append(ord(alphabet[i+4]))

X = np.array(X).reshape((-1, 4, 1)) / 255.0  # Normalize byte values to [0, 1]
y = to_categorical(y, num_classes=256)       # One-hot encoding over the byte range
y
array([[0., 0., 0., ..., 0., 0., 0.],
       [0., 0., 0., ..., 0., 0., 0.],
       [0., 0., 0., ..., 0., 0., 0.],
       ...,
       [0., 0., 0., ..., 0., 0., 0.],
       [0., 0., 0., ..., 0., 0., 0.],
       [0., 0., 0., ..., 0., 0., 0.]], dtype=float32)
# Define RNN model
model = Sequential()
model.add(SimpleRNN(50, activation='relu', input_shape=(4, 1)))
model.add(Dense(256, activation='softmax'))
model.compile(optimizer='adam', loss='categorical_crossentropy')

# Train the model
model.fit(X, y, epochs=1000, verbose=0)
<keras.callbacks.History at 0x1bde3d4b940>
# Generate a new sequence: predict the character that follows a 4-character window
input_sequence = np.array([ord(char) for char in "abcd"]).reshape((1, 4, 1)) / 255.0
prediction = model.predict(input_sequence, verbose=0)
predicted_char = chr(np.argmax(prediction))
input_sequence
array([[[0.38039216],
        [0.38431373],
        [0.38823529],
        [0.39215686]]])
# Print results
print("Input Sequence:", "abcd")
print("Next Character Prediction:", predicted_char)
Input Sequence: abcd
Next Character Prediction: e
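
To produce more than one character, the same trained model can be called in a loop, feeding each prediction back into the 4-character window. A sketch reusing model and the normalization from Example 1; the helper name extend_sequence and the loop length are illustrative:

# Sketch: extend the seed by repeatedly predicting the next character
# and sliding the 4-character window forward. Reuses `model` from the
# cells above; the helper name and n_chars value are illustrative.
def extend_sequence(model, seed="abcd", n_chars=5):
    window = list(seed)
    out = seed
    for _ in range(n_chars):
        x = np.array([ord(c) for c in window]).reshape((1, 4, 1)) / 255.0
        next_char = chr(np.argmax(model.predict(x, verbose=0)))
        out += next_char
        window = window[1:] + [next_char]  # drop the oldest char, append the newest
    return out

print(extend_sequence(model))  # e.g. "abcdefghi" if training converged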

Example 2: Generate text character by character with an LSTM

import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM

# Example input text
text = "how are you feeling today?"

# Create character mappings
chars = sorted(set(text))
char_to_idx = {ch: idx for idx, ch in enumerate(chars)}
idx_to_char = {idx: ch for idx, ch in enumerate(chars)}
num_chars = len(chars)

# Prepare input-output pairs for training
max_len = 10  # Adjust max_len based on the sequences used for training
step = 1
sequences = []
next_chars = []
for i in range(0, len(text) - max_len, step):
    sequences.append(text[i:i + max_len])
    next_chars.append(text[i + max_len])

# Vectorization: one-hot arrays of shape (samples, window, vocabulary)
X = np.zeros((len(sequences), max_len, num_chars), dtype=np.float32)
y = np.zeros((len(sequences), num_chars), dtype=np.float32)
for i, sequence in enumerate(sequences):
    for t, char in enumerate(sequence):
        X[i, t, char_to_idx[char]] = 1.0
    y[i, char_to_idx[next_chars[i]]] = 1.0

# Build the RNN model
model = Sequential()
model.add(LSTM(128, input_shape=(max_len, num_chars)))
model.add(Dense(num_chars, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')

# Train the model
model.fit(X, y, batch_size=1, epochs=100, verbose=2)

# Function to generate text by repeatedly sampling the next character
def generate_text(model, seed_text, max_len, num_chars):
    generated_text = seed_text
    for _ in range(max_len):
        x_pred = np.zeros((1, max_len, num_chars), dtype=np.float32)
        for t, char in enumerate(seed_text):
            x_pred[0, t, char_to_idx[char]] = 1.0
        preds = model.predict(x_pred, verbose=0)[0]
        next_index = np.random.choice(num_chars, p=preds)
        next_char = idx_to_char[next_index]
        generated_text += next_char
        seed_text = seed_text[1:] + next_char
    return generated_text

# Generate a sequence; use max_len consistent with training, and a seed
# whose characters all occur in the training text
generated_sequence = generate_text(model, seed_text="hello ", max_len=10, num_chars=num_chars)
print("Generated Sequence:")
print(generated_sequence)
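
Note that generate_text above samples the next character from the full predicted distribution (np.random.choice), which adds variety but gives no control over how random the output is. A common refinement, not part of this notebook, is temperature scaling; a hedged sketch:

# Sketch: temperature-scaled sampling, a common refinement that is not
# part of the original notebook. Lower temperature -> more conservative
# choices; higher temperature -> more diverse, riskier choices.
def sample_with_temperature(preds, temperature=0.8):
    preds = np.asarray(preds, dtype=np.float64)
    logits = np.log(preds + 1e-8) / temperature       # rescale log-probabilities
    probs = np.exp(logits) / np.sum(np.exp(logits))   # renormalize to sum to 1
    return np.random.choice(len(preds), p=probs)

# Drop-in replacement for the sampling line inside generate_text:
# next_index = sample_with_temperature(preds, temperature=0.8)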

Example 3: Word-level Sequence Generation

import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM, Embedding
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Example text data (replace with your dataset)
text_data = [
    "Hi what are you doing today? Any plans"
]

# Tokenize the text data
tokenizer = Tokenizer()
tokenizer.fit_on_texts(text_data)
total_words = len(tokenizer.word_index) + 1

# Create n-gram input sequences: every prefix of each line
input_sequences = []
for line in text_data:
    token_list = tokenizer.texts_to_sequences([line])[0]
    for i in range(1, len(token_list)):
        n_gram_sequence = token_list[:i+1]
        input_sequences.append(n_gram_sequence)

# Pad sequences for equal length input
max_sequence_len = max([len(seq) for seq in input_sequences])
input_sequences = np.array(pad_sequences(input_sequences, maxlen=max_sequence_len, padding='pre'))

# Create predictors (all but the last token) and labels (the last token)
predictors, label = input_sequences[:, :-1], input_sequences[:, -1]

# Convert labels to categorical one-hot encoding
label = tf.keras.utils.to_categorical(label, num_classes=total_words)

# Build the model
model = Sequential()
model.add(Embedding(total_words, 10, input_length=max_sequence_len-1))
model.add(LSTM(50))
model.add(Dense(total_words, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

# Train the model
model.fit(predictors, label, epochs=100, verbose=1)

# Function to generate text by repeatedly predicting the most likely next word
def generate_text(seed_text, next_words, model, max_sequence_len):
    for _ in range(next_words):
        token_list = tokenizer.texts_to_sequences([seed_text])[0]
        token_list = pad_sequences([token_list], maxlen=max_sequence_len-1, padding='pre')
        predicted_probs = model.predict(token_list, verbose=0)[0]
        predicted_index = np.argmax(predicted_probs)
        output_word = ""
        for word, index in tokenizer.word_index.items():
            if index == predicted_index:
                output_word = word
                break
        seed_text += " " + output_word
    return seed_text

# Generate text
generated_text = generate_text("Hi Rubi", 5, model, max_sequence_len)
print(generated_text)
Epoch 1/100
1/1 [==============================] - 3s 3s/step - loss: 2.1941 - accuracy: 0.2857
...
Epoch 50/100
1/1 [==============================] - 0s 5ms/step - loss: 1.4698 - accuracy: 0.5714
...
Epoch 100/100
1/1 [==============================] - 0s 6ms/step - loss: 0.3788 - accuracy: 1.0000
Hi Rubi what are you doing today
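
It can help to inspect what the n-gram preparation in Example 3 actually produced: every training row is a left-padded prefix of the sentence, and the last token of each row is split off as the label. A small inspection sketch, assuming tokenizer and input_sequences from the cell above are still in scope:

# Sketch: inspect the n-gram training rows built in Example 3.
# Assumes tokenizer and input_sequences are still in scope.
idx_to_word = {idx: w for w, idx in tokenizer.word_index.items()}
for row in input_sequences[:3]:
    words = [idx_to_word.get(i, "<pad>") for i in row if i != 0]
    print(row, "->", " ".join(words))
# The final token of each padded row is the label; everything before
# it (after the padding) is the predictor window.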
# Variant of Example 3: same word-level model on a slightly larger corpus,
# with the seed phrase and output length taken from user input
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM, Embedding
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Example text data (replace with your dataset)
text_data = [
    "The quick brown fox jumps over the lazy dog.",
    "She sells seashells by the seashore.",
    "How much wood would a woodchuck chuck if a woodchuck could chuck wood?"
]

# Tokenize the text data
tokenizer = Tokenizer()
tokenizer.fit_on_texts(text_data)
total_words = len(tokenizer.word_index) + 1

# Create n-gram input sequences: every prefix of each line
input_sequences = []
for line in text_data:
    token_list = tokenizer.texts_to_sequences([line])[0]
    for i in range(1, len(token_list)):
        n_gram_sequence = token_list[:i+1]
        input_sequences.append(n_gram_sequence)

# Pad sequences for equal length input
max_sequence_len = max([len(seq) for seq in input_sequences])
input_sequences = np.array(pad_sequences(input_sequences, maxlen=max_sequence_len, padding='pre'))

# Create predictors and labels
predictors, label = input_sequences[:, :-1], input_sequences[:, -1]

# Convert labels to categorical one-hot encoding
label = tf.keras.utils.to_categorical(label, num_classes=total_words)

# Build the model
model = Sequential()
model.add(Embedding(total_words, 10, input_length=max_sequence_len-1))
model.add(LSTM(50))
model.add(Dense(total_words, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

# Train the model
model.fit(predictors, label, epochs=100, verbose=1)

# Function to generate text
def generate_text(seed_text, next_words, model, max_sequence_len):
    for _ in range(next_words):
        token_list = tokenizer.texts_to_sequences([seed_text])[0]
        token_list = pad_sequences([token_list], maxlen=max_sequence_len-1, padding='pre')
        predicted_probs = model.predict(token_list, verbose=0)[0]
        predicted_index = np.argmax(predicted_probs)
        output_word = ""
        for word, index in tokenizer.word_index.items():
            if index == predicted_index:
                output_word = word
                break
        seed_text += " " + output_word
    return seed_text

# User input to generate text
user_input = input("Enter a starting phrase: ")
num_words = int(input("Enter number of words to generate: "))
generated_text = generate_text(user_input.lower(), num_words, model, max_sequence_len)
print("Generated Text:", generated_text)
Epoch 1/100
1/1 [==============================] - 2s 2s/step - loss: 3.1357 - accuracy: 0.0800
...
Epoch 50/100
1/1 [==============================] - 0s 8ms/step - loss: 2.8009 - accuracy: 0.1200
...
Epoch 100/100
1/1 [==============================] - 0s 8ms/step - loss: 1.7319 - accuracy: 0.3600
Enter a starting phrase: Delhi
Enter number of words to generate: 30
Generated Text: delhi sells sells sells much the the the the the the dog dog dog dog could could could chuck could chuck could could could could could could could could could could
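
The repetition in the generated text ("sells sells sells", "could could could ...") is typical of greedy decoding on a tiny corpus: generate_text always takes np.argmax, so once the model settles into a high-probability loop it never leaves it. Sampling from the top few candidates instead, a standard remedy that is not part of this notebook, usually breaks such loops; a hedged sketch:

# Sketch: top-k sampling as an alternative to np.argmax inside
# generate_text. A standard remedy for repetitive output; the function
# name and the k value are illustrative, not from the notebook.
def sample_top_k(probs, k=3):
    top = np.argsort(probs)[-k:]           # indices of the k most likely words
    p = probs[top] / probs[top].sum()      # renormalize over those k
    return int(np.random.choice(top, p=p))

# Drop-in replacement inside generate_text:
# predicted_index = sample_top_k(predicted_probs, k=3)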