suyashi29
GitHub Repository: suyashi29/python-su
Path: blob/master/Applied Generative AI with GANS/3 Introduction to Neural Networks.ipynb
Kernel: Python 3 (ipykernel)

Introduction to Neural Networks

What is a Neural Network?

A Neural Network is a series of algorithms that attempt to recognize underlying relationships in a set of data through a process that mimics the way the human brain operates.

Neural Networks are made of layers:

  • Input Layer: Receives input data

  • Hidden Layers: Process the data using weights and activation functions

  • Output Layer: Produces the final prediction


Example of a non-linear function: f(x) = sin(x)

Classification problems (metric: accuracy):

Accuracy = (TP + TN) / (TP + TN + FP + FN)

  • Binary classification → sigmoid output function

  • Multi-class classification → softmax output function

Regression → linear (or tanh) output, MSE loss
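As a quick check of the accuracy formula, here is a minimal sketch using hypothetical confusion-matrix counts (the numbers are made up for illustration):

```python
# Confusion-matrix counts for a hypothetical binary classifier
tp, tn, fp, fn = 50, 40, 5, 5

# Accuracy = (TP + TN) / (TP + TN + FP + FN)
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(accuracy)  # 0.9
```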

Simple mathematical calculation for a feedforward neural network with:

  • 3 inputs

  • 2 hidden layers:

    • 1st hidden layer: 4 neurons

    • 2nd hidden layer: 3 neurons

  • 1 output neuron

  • ReLU activation in the hidden layers and a linear activation at the output

We'll calculate the output step-by-step.


1. Input Layer

Let the inputs be:

$$\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 1.0 \\ 0.5 \\ -1.5 \end{bmatrix}$$

2. Hidden Layer 1 (4 neurons)

Let the weights for the first hidden layer be:

$$W^{[1]} = \begin{bmatrix} 0.2 & -0.3 & 0.5 \\ -0.4 & 0.1 & -0.2 \\ 0.3 & 0.6 & -0.1 \\ -0.5 & 0.2 & 0.4 \end{bmatrix}, \quad \mathbf{b}^{[1]} = \begin{bmatrix} 0.1 \\ -0.1 \\ 0.2 \\ 0.0 \end{bmatrix}$$

Calculate $z^{[1]} = W^{[1]} \mathbf{x} + \mathbf{b}^{[1]}$:

$$z^{[1]} = \begin{bmatrix} 0.2 \cdot 1.0 + (-0.3) \cdot 0.5 + 0.5 \cdot (-1.5) + 0.1 \\ -0.4 \cdot 1.0 + 0.1 \cdot 0.5 + (-0.2) \cdot (-1.5) - 0.1 \\ 0.3 \cdot 1.0 + 0.6 \cdot 0.5 + (-0.1) \cdot (-1.5) + 0.2 \\ -0.5 \cdot 1.0 + 0.2 \cdot 0.5 + 0.4 \cdot (-1.5) + 0.0 \end{bmatrix} = \begin{bmatrix} 0.2 - 0.15 - 0.75 + 0.1 \\ -0.4 + 0.05 + 0.3 - 0.1 \\ 0.3 + 0.3 + 0.15 + 0.2 \\ -0.5 + 0.1 - 0.6 + 0.0 \end{bmatrix} = \begin{bmatrix} -0.6 \\ -0.15 \\ 0.95 \\ -1.0 \end{bmatrix}$$

Apply ReLU: $a^{[1]} = \text{ReLU}(z^{[1]}) = \max(0, z^{[1]})$

$$a^{[1]} = \begin{bmatrix} 0 \\ 0 \\ 0.95 \\ 0 \end{bmatrix}$$

3. Hidden Layer 2 (3 neurons)

Weights and bias:

$$W^{[2]} = \begin{bmatrix} 0.1 & -0.2 & 0.3 & 0.4 \\ -0.3 & 0.6 & -0.1 & 0.2 \\ 0.5 & -0.4 & 0.2 & -0.1 \end{bmatrix}, \quad \mathbf{b}^{[2]} = \begin{bmatrix} 0.0 \\ 0.1 \\ -0.2 \end{bmatrix}$$

Calculate $z^{[2]} = W^{[2]} a^{[1]} + \mathbf{b}^{[2]}$:

Only the third entry of $a^{[1]}$ is nonzero, so only the third column of $W^{[2]}$ contributes:

$$z^{[2]} = \begin{bmatrix} 0.3 \cdot 0.95 + 0.0 \\ -0.1 \cdot 0.95 + 0.1 \\ 0.2 \cdot 0.95 - 0.2 \end{bmatrix} = \begin{bmatrix} 0.285 \\ 0.005 \\ -0.01 \end{bmatrix}$$

Apply ReLU:

$$a^{[2]} = \max(0, z^{[2]}) = \begin{bmatrix} 0.285 \\ 0.005 \\ 0 \end{bmatrix}$$

4. Output Layer (1 neuron)

Weights and bias:

$$W^{[3]} = \begin{bmatrix} 0.4 & -0.6 & 0.3 \end{bmatrix}, \quad b^{[3]} = 0.05$$

$$y = W^{[3]} \cdot a^{[2]} + b^{[3]} = 0.4 \cdot 0.285 - 0.6 \cdot 0.005 + 0.3 \cdot 0 + 0.05 = 0.114 - 0.003 + 0.05 = 0.161$$

Final Output:

$$\boxed{0.161}$$
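The whole forward pass above can be verified numerically. This is a minimal NumPy sketch using the same weights, biases, and input:

```python
import numpy as np

# Input vector and parameters from the worked example
x = np.array([1.0, 0.5, -1.5])
W1 = np.array([[ 0.2, -0.3,  0.5],
               [-0.4,  0.1, -0.2],
               [ 0.3,  0.6, -0.1],
               [-0.5,  0.2,  0.4]])
b1 = np.array([0.1, -0.1, 0.2, 0.0])
W2 = np.array([[ 0.1, -0.2,  0.3,  0.4],
               [-0.3,  0.6, -0.1,  0.2],
               [ 0.5, -0.4,  0.2, -0.1]])
b2 = np.array([0.0, 0.1, -0.2])
W3 = np.array([0.4, -0.6, 0.3])
b3 = 0.05

def relu(z):
    return np.maximum(0, z)

# Layer-by-layer forward pass
a1 = relu(W1 @ x + b1)   # hidden layer 1: [0, 0, 0.95, 0]
a2 = relu(W2 @ a1 + b2)  # hidden layer 2: [0.285, 0.005, 0]
y = W3 @ a2 + b3         # linear output
print(round(y, 3))  # 0.161
```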

Understanding Neural Networks in Deep Learning

Neural networks are capable of learning and identifying patterns directly from data without pre-defined rules. These networks are built from several key components:

  • Neurons: The basic units that receive inputs; each neuron is governed by a threshold and an activation function.

  • Connections: Links between neurons that carry information, regulated by weights and biases.

  • Weights and Biases: These parameters determine the strength and influence of connections.

  • Propagation Functions: Mechanisms that help process and transfer data across layers of neurons.

  • Learning Rule: The method that adjusts weights and biases over time to improve accuracy.

Activation Functions

An activation function is a mathematical operation applied to the output of each neuron in a neural network layer. Without an activation function, a neural network is just a linear function, no matter how many layers you stack.

Example:

$$y = W_1 x + b_1 \quad\Rightarrow\quad y = W_2 (W_1 x + b_1) + b_2 = (W_2 W_1) x + (W_2 b_1 + b_2)$$

The composition is still a single linear function of $x$, so stacking linear layers adds no expressive power.
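This collapse of stacked linear layers into one can be checked numerically; a quick sketch with arbitrary random weights:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3)); b1 = rng.normal(size=4)
W2 = rng.normal(size=(2, 4)); b2 = rng.normal(size=2)
x = rng.normal(size=3)

# Two stacked linear layers (no activation in between)
y_stacked = W2 @ (W1 @ x + b1) + b2

# The equivalent single linear layer: W = W2 W1, b = W2 b1 + b2
W = W2 @ W1
b = W2 @ b1 + b2
y_single = W @ x + b

print(np.allclose(y_stacked, y_single))  # True
```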

  • Activation functions allow the model to learn complex patterns and non-linear relationships.

  • They help the model learn complex mappings: real-world problems (image recognition, text generation, etc.) are non-linear in nature, and activation functions let the network capture such non-linear mappings.

  • They control the output range. Activation functions can:

    • Limit outputs (e.g., between 0 and 1, or between -1 and 1)

    • Produce probabilities (e.g., softmax in classification)

    • Help with gradient flow (functions like ReLU improve training speed and reduce vanishing gradients)

| Name | Formula | Output Range | Use Case |
|---|---|---|---|
| ReLU | f(x) = max(0, x) | [0, ∞) | Hidden layers (fast & efficient) |
| Sigmoid | f(x) = 1 / (1 + e^(-x)) | (0, 1) | Binary classification |
| Tanh | f(x) = (e^x - e^(-x)) / (e^x + e^(-x)) | (-1, 1) | Zero-centered; can be better than sigmoid |
| Softmax | f(x)ᵢ = e^(xᵢ) / Σⱼ e^(xⱼ) | (0, 1) | Multi-class classification output |
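The four functions in the table can be sketched directly in NumPy:

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def softmax(x):
    e = np.exp(x - np.max(x))  # subtract max for numerical stability
    return e / e.sum()

z = np.array([-1.0, 0.0, 2.0])
print(relu(z))       # [0. 0. 2.]
print(sigmoid(0.0))  # 0.5
print(np.tanh(0.0))  # 0.0
print(softmax(z))    # three probabilities that sum to 1
```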
# Import required libraries
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

Number Sequence Generation Task

Objective: Train a neural network to learn a sequence pattern. Example: 1 → 2, 2 → 3, ..., 9 → 10.
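Since the target pattern y = x + 1 is exactly linear, a closed-form least-squares fit recovers it; this NumPy sketch is shown for comparison (the neural network is only really needed for patterns that are not this simple):

```python
import numpy as np

# Training pairs for the sequence task: y = x + 1
X = np.arange(1, 51, dtype=float)
y = X + 1

# Least-squares fit of y = w*x + b using the design matrix [x, 1]
A = np.column_stack([X, np.ones_like(X)])
(w, b), *_ = np.linalg.lstsq(A, y, rcond=None)
print(w, b)  # w ≈ 1.0, b ≈ 1.0
```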

# Prepare training data: y = x + 1
X = np.array([i for i in range(1, 51)])
y = np.array([i + 1 for i in range(1, 51)])
# Try similar patterns: y = 2x, y = x*x; or larger ranges (50, 100)

# Reshape input for Keras (samples, features)
X = X.reshape(-1, 1)
y = y.reshape(-1, 1)
X
array([[ 1], [ 2], [ 3], [ 4], [ 5], [ 6], [ 7], [ 8], [ 9], [10], [11], [12], [13], [14], [15], [16], [17], [18], [19], [20], [21], [22], [23], [24], [25], [26], [27], [28], [29], [30], [31], [32], [33], [34], [35], [36], [37], [38], [39], [40], [41], [42], [43], [44], [45], [46], [47], [48], [49], [50]])

Build a Simple Neural Network

model = Sequential([
    Dense(50, activation='relu', input_shape=(1,)),  # Hidden layer
    Dense(1)                                         # Output layer (linear)
])

# Compile the model
model.compile(optimizer='adam', loss='mse')

# Summary of the model
model.summary()
C:\Users\Suyashi144893\AppData\Local\anaconda3\Lib\site-packages\keras\src\layers\core\dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead. super().__init__(activity_regularizer=activity_regularizer, **kwargs)

Train the Model

  • An epoch is a single pass through the entire training data.

  • After each epoch, the model's weights are updated based on the training data.

verbose controls how much detail is printed during training: 0 (silent), 1 (progress bar), or 2 (one line per epoch).

model.fit(X, y, epochs=300, verbose=1)
Epoch 1/300
2/2 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - loss: 0.0142
Epoch 2/300
2/2 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0145
Epoch 3/300
2/2 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.0146
[... output for epochs 4-297 omitted; the loss drifts down from about 0.014 to 0.008 ...]
Epoch 298/300
2/2 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - loss: 0.0076
Epoch 299/300
2/2 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0074
Epoch 300/300
2/2 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - loss: 0.0080
<keras.src.callbacks.history.History at 0x11901464dd0>

Test the Model

Note that 150 lies well outside the training range (1 to 50), so this prediction also tests how well the learned mapping extrapolates.

# Predict the next number in the sequence
test_input = np.array([[150]])
predicted = model.predict(test_input)
print(f"Input: 150 → Predicted Output: {predicted[0][0]:.2f}")
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step Input: 150 → Predicted Output: 151.62