
Introduction to Neural Networks

What is a Neural Network?

A Neural Network is a series of algorithms that attempt to recognize underlying relationships in a set of data through a process that mimics the way the human brain operates.

Neural Networks are made of layers:

  • Input Layer: Receives input data

  • Hidden Layers: Process the data using weights and activation functions

  • Output Layer: Produces the final prediction


Let's walk through a simple forward-pass calculation for a feedforward neural network with:

  • 3 inputs

  • 2 hidden layers (4 neurons in the first, 3 in the second)

  • 1 output neuron

  • ReLU as the activation function in the hidden layers and a linear activation at the output

We'll calculate the output step-by-step.


1. Input Layer

Let the inputs be:

$$\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 1.0 \\ 0.5 \\ -1.5 \end{bmatrix}$$

2. Hidden Layer 1 (4 neurons)

Let the weights for the first hidden layer be:

$$W^{[1]} = \begin{bmatrix} 0.2 & -0.3 & 0.5 \\ -0.4 & 0.1 & -0.2 \\ 0.3 & 0.6 & -0.1 \\ -0.5 & 0.2 & 0.4 \end{bmatrix}, \quad \mathbf{b}^{[1]} = \begin{bmatrix} 0.1 \\ -0.1 \\ 0.2 \\ 0.0 \end{bmatrix}$$

Calculate $z^{[1]} = W^{[1]} \mathbf{x} + \mathbf{b}^{[1]}$:

$$z^{[1]} = \begin{bmatrix} 0.2 \cdot 1.0 + (-0.3) \cdot 0.5 + 0.5 \cdot (-1.5) + 0.1 \\ -0.4 \cdot 1.0 + 0.1 \cdot 0.5 + (-0.2) \cdot (-1.5) - 0.1 \\ 0.3 \cdot 1.0 + 0.6 \cdot 0.5 + (-0.1) \cdot (-1.5) + 0.2 \\ -0.5 \cdot 1.0 + 0.2 \cdot 0.5 + 0.4 \cdot (-1.5) + 0.0 \end{bmatrix} = \begin{bmatrix} 0.2 - 0.15 - 0.75 + 0.1 \\ -0.4 + 0.05 + 0.3 - 0.1 \\ 0.3 + 0.3 + 0.15 + 0.2 \\ -0.5 + 0.1 - 0.6 \end{bmatrix} = \begin{bmatrix} -0.6 \\ -0.15 \\ 0.95 \\ -1.0 \end{bmatrix}$$

Apply ReLU: $a^{[1]} = \text{ReLU}(z^{[1]}) = \max(0, z^{[1]})$

$$a^{[1]} = \begin{bmatrix} 0 \\ 0 \\ 0.95 \\ 0 \end{bmatrix}$$

3. Hidden Layer 2 (3 neurons)

Weights and bias:

$$W^{[2]} = \begin{bmatrix} 0.1 & -0.2 & 0.3 & 0.4 \\ -0.3 & 0.6 & -0.1 & 0.2 \\ 0.5 & -0.4 & 0.2 & -0.1 \end{bmatrix}, \quad \mathbf{b}^{[2]} = \begin{bmatrix} 0.0 \\ 0.1 \\ -0.2 \end{bmatrix}$$

Calculate $z^{[2]} = W^{[2]} a^{[1]} + \mathbf{b}^{[2]}$:

Only the third entry of $a^{[1]}$ is nonzero, so only the third column of $W^{[2]}$ contributes:

$$z^{[2]} = \begin{bmatrix} 0.3 \cdot 0.95 + 0.0 \\ -0.1 \cdot 0.95 + 0.1 \\ 0.2 \cdot 0.95 - 0.2 \end{bmatrix} = \begin{bmatrix} 0.285 \\ 0.005 \\ -0.01 \end{bmatrix}$$

Apply ReLU:

$$a^{[2]} = \max(0, z^{[2]}) = \begin{bmatrix} 0.285 \\ 0.005 \\ 0 \end{bmatrix}$$

4. Output Layer (1 neuron)

Weights and bias:

$$W^{[3]} = \begin{bmatrix} 0.4 & -0.6 & 0.3 \end{bmatrix}, \quad b^{[3]} = 0.05$$

$$y = W^{[3]} a^{[2]} + b^{[3]} = 0.4 \cdot 0.285 - 0.6 \cdot 0.005 + 0.3 \cdot 0 + 0.05 = 0.114 - 0.003 + 0.05 = 0.161$$

Final Output:

$$\boxed{0.161}$$
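We can verify this forward pass with a few lines of NumPy. This is a sketch added for illustration, reusing the exact weights and biases from the worked example above (relu here is a small helper, not part of the example):

import numpy as np

def relu(z):
    return np.maximum(0, z)

# Inputs and parameters from the worked example above
x = np.array([1.0, 0.5, -1.5])
W1 = np.array([[ 0.2, -0.3,  0.5],
               [-0.4,  0.1, -0.2],
               [ 0.3,  0.6, -0.1],
               [-0.5,  0.2,  0.4]])
b1 = np.array([0.1, -0.1, 0.2, 0.0])
W2 = np.array([[ 0.1, -0.2,  0.3,  0.4],
               [-0.3,  0.6, -0.1,  0.2],
               [ 0.5, -0.4,  0.2, -0.1]])
b2 = np.array([0.0, 0.1, -0.2])
W3 = np.array([0.4, -0.6, 0.3])
b3 = 0.05

a1 = relu(W1 @ x + b1)    # hidden layer 1 -> [0, 0, 0.95, 0]
a2 = relu(W2 @ a1 + b2)   # hidden layer 2 -> [0.285, 0.005, 0]
y = W3 @ a2 + b3          # linear output  -> 0.161
print(round(y, 3))        # 0.161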

Understanding Neural Networks in Deep Learning

Neural networks are capable of learning and identifying patterns directly from data without pre-defined rules. These networks are built from several key components:

  • Neurons: The basic units that receive inputs; each neuron is governed by a threshold and an activation function.

  • Connections: Links between neurons that carry information, regulated by weights and biases.

  • Weights and Biases: These parameters determine the strength and influence of connections.

  • Propagation Functions: Mechanisms that help process and transfer data across layers of neurons.

  • Learning Rule: The method that adjusts weights and biases over time to improve accuracy.
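To make these components concrete, here is a minimal sketch (added for illustration, not from the notebook) of a single neuron combining inputs, weights, a bias, and a threshold activation:

import numpy as np

weights = np.array([0.5, -0.2])   # connection strengths (weights)
bias = 0.1                        # bias shifts the firing threshold

def step(z):
    # Threshold activation: fire (1) only if the weighted sum exceeds 0
    return 1 if z > 0 else 0

x = np.array([1.0, 3.0])          # incoming signals
print(step(weights @ x + bias))   # 0.5*1.0 - 0.2*3.0 + 0.1 = 0.0 -> 0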

Activation Functions

An activation function is a mathematical operation applied to the output of each neuron in a neural network layer. Without an activation function, a neural network is just a linear function, no matter how many layers you stack.

Example: two stacked layers without activations collapse into a single linear map:

$$y = W_1 x + b_1 \;\Rightarrow\; y = W_2 (W_1 x + b_1) + b_2 = (W_2 W_1)\,x + (W_2 b_1 + b_2)$$

which is still just one linear function of $x$.
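A quick NumPy check of this collapse (illustrative random values, not from the notebook):

import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)
x = rng.normal(size=3)

two_layers = W2 @ (W1 @ x + b1) + b2         # two stacked linear layers
one_layer = (W2 @ W1) @ x + (W2 @ b1 + b2)   # one equivalent linear layer
print(np.allclose(two_layers, one_layer))    # True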

  • Activation functions allow the model to learn complex patterns and non-linear relationships.

  • They help the model learn complex mappings: real-world problems (like image recognition, text generation, etc.) are non-linear in nature, and activation functions let the network capture such non-linear mappings.

  • They control the output range. Activation functions can:

    • Limit outputs (e.g., between 0 and 1, or -1 and 1)

    • Introduce probabilities (e.g., softmax in classification)

    • Help with gradient flow (functions like ReLU improve training speed and reduce vanishing gradients)

| Name    | Formula                              | Output Range | Use Case                                              |
|---------|--------------------------------------|--------------|-------------------------------------------------------|
| ReLU    | f(x) = max(0, x)                     | [0, ∞)       | Hidden layers (fast & efficient)                      |
| Sigmoid | f(x) = 1 / (1 + e^(-x))              | (0, 1)       | Binary classification                                 |
| Tanh    | f(x) = (e^x - e^(-x))/(e^x + e^(-x)) | (-1, 1)      | Hidden layers (zero-centered; often better than sigmoid) |
| Softmax | f(x)ᵢ = e^(xᵢ) / Σⱼ e^(xⱼ)           | (0, 1)       | Multi-class classification output                     |
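A minimal NumPy sketch of these four functions (added for illustration; sample values chosen arbitrarily):

import numpy as np

def relu(x):
    return np.maximum(0, x)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def softmax(x):
    e = np.exp(x - np.max(x))   # subtract max for numerical stability
    return e / e.sum()

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))       # [0. 0. 3.]
print(sigmoid(x))    # ~[0.119  0.5    0.953]
print(np.tanh(x))    # ~[-0.964 0.     0.995]
print(softmax(x))    # ~[0.006  0.047  0.946], sums to 1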
# Python lists vs NumPy arrays
a = [1, 23, 34]
b = [9, 8, 7]
a + b          # "+" on lists concatenates: [1, 23, 34, 9, 8, 7]
type(a + b)    # list

import numpy as np
a1 = np.array(a)
b1 = np.array(b)
a1 + b1        # element-wise addition
a1 / b1        # element-wise division
a1 * b1        # element-wise multiplication (the displayed output)
array([ 9, 184, 238])
# Import Required Libraries
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

Number Sequence Generation Task

Objective: Train a neural network to learn a sequence pattern. Example: 1 → 2, 2 → 3, ..., 9 → 10.

# Prepare training data
X = np.array([i for i in range(1, 50)])
y = np.array([i + 1 for i in range(1, 50)])
# Try similar for y = 2x or y = x*x; go with 50, 100

# Reshape input for Keras (samples, features)
X = X.reshape(-1, 1)
y = y.reshape(-1, 1)
X
y
array([[ 2], [ 3], [ 4], [ 5], [ 6], [ 7], [ 8], [ 9], [10], [11], [12], [13], [14], [15], [16], [17], [18], [19], [20], [21], [22], [23], [24], [25], [26], [27], [28], [29], [30], [31], [32], [33], [34], [35], [36], [37], [38], [39], [40], [41], [42], [43], [44], [45], [46], [47], [48], [49], [50]])

Build a Simple Neural Network

model = Sequential([
    Dense(50, activation='relu', input_shape=(1,)),  # Hidden layer
    Dense(1)                                         # Output layer
])

# Compile the model
model.compile(optimizer='adam', loss='mse')

# Summary of the model
model.summary()
C:\Users\Suyashi144893\AppData\Local\anaconda3\Lib\site-packages\keras\src\layers\core\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead. super().__init__(activity_regularizer=activity_regularizer, **kwargs)

Train the Model

  • An epoch is a single pass through the entire training data.

  • After each epoch, the model weights are updated based on the training data.

verbose controls how much detail is printed during training: 0 = silent, 1 = progress bar, 2 = one line per epoch.

model.fit(X, y, epochs=250, verbose=1)
Epoch 1/250 2/2 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0133
Epoch 2/250 2/2 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0130
Epoch 3/250 2/2 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - loss: 0.0141
... (intermediate epochs omitted; loss decreases gradually from ~0.013 to ~0.009) ...
Epoch 249/250 2/2 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - loss: 0.0091
Epoch 250/250 2/2 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0092
<keras.src.callbacks.history.History at 0x1ca9e275750>

Test the Model

# Predict the next number in the sequence
test_input = np.array([[100]])
predicted = model.predict(test_input)
print(f"Input: {test_input[0][0]} → Predicted Output: {predicted[0][0]:.2f}")
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step Input: 100 → Predicted Output: 101.39
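Note that the model was trained only on inputs 1–49, so predicting at 100 is an extrapolation; it works reasonably well here because the learned mapping is close to y = x + 1. A small follow-up sketch (added for illustration, assuming the trained model above) predicts several inputs in one batch:

# Predict several inputs at once (sketch; reuses the trained `model` above)
batch = np.array([[5], [25], [75]])
preds = model.predict(batch, verbose=0)
for x_val, p in zip(batch.ravel(), preds.ravel()):
    print(f"{x_val} → {p:.2f}")   # each should be close to x + 1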