
An example of a binary classification neural network built with Keras and trained via backpropagation:

  • In this example, we use the make_classification function from scikit-learn to generate synthetic binary classification data. The neural network architecture has an input layer with 20 neurons (since we have 20 features), a hidden layer with 16 neurons and ReLU activation, and an output layer with a single neuron and sigmoid activation for binary classification.

# Import necessary libraries
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.datasets import make_classification

# Generate synthetic data for binary classification
X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                           n_clusters_per_class=2, random_state=42)

# Split the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Standardize the data
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

Model

  • The model.compile line specifies 'adam' as the optimization algorithm, 'binary_crossentropy' as the loss function (suitable for binary classification), and 'accuracy' as the metric to monitor during training.

  • The model.fit line is where backpropagation takes place: the model is trained on the training data for the specified number of epochs, with gradients of the loss propagated backward to update the weights after each batch (a minimal sketch of one such update follows).
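
The Keras call hides the gradient computation, so as a point of reference, here is a minimal NumPy sketch of what one backpropagation step computes for a single sigmoid output neuron trained with binary cross-entropy. This is an illustrative assumption for teaching purposes, not Keras internals; the toy batch and learning rate are made up.

import numpy as np

# Toy batch: 4 samples, 3 features, binary labels (illustrative values)
rng = np.random.default_rng(0)
X_toy = rng.normal(size=(4, 3))
y_toy = np.array([0., 1., 1., 0.])

# Parameters of a single sigmoid neuron
w = np.zeros(3)
b = 0.0
lr = 0.1  # learning rate (plain gradient descent; Adam adds adaptive step sizes)

# Forward pass: predicted probabilities
z = X_toy @ w + b
p = 1.0 / (1.0 + np.exp(-z))  # sigmoid activation

# Binary cross-entropy loss (what 'binary_crossentropy' computes)
loss = -np.mean(y_toy * np.log(p) + (1 - y_toy) * np.log(1 - p))

# Backward pass: for sigmoid + binary cross-entropy, dL/dz simplifies to (p - y) / n
grad_z = (p - y_toy) / len(y_toy)
grad_w = X_toy.T @ grad_z  # chain rule through z = Xw + b
grad_b = grad_z.sum()

# Gradient-descent update
w -= lr * grad_w
b -= lr * grad_b
print(f"loss before update: {loss:.4f}")  # ln(2) ≈ 0.6931 with zero-initialized weights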

# Build a binary classification neural network
model = Sequential()
model.add(Dense(units=16, activation='relu', input_dim=20))  # Hidden layer; input layer has 20 features
model.add(Dense(units=1, activation='sigmoid'))              # Output layer with a single neuron (binary classification)

# Compile the model
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# Train the model (backpropagation happens here)
model.fit(X_train, y_train, epochs=10, batch_size=32, validation_split=0.2)

# Evaluate the model on the test set
loss, accuracy = model.evaluate(X_test, y_test)
print(f'Test Loss: {loss:.4f}, Test Accuracy: {accuracy:.4f}')

# Make predictions on new data
sample_data = np.random.randn(5, 20)  # 5 random samples with 20 features
predictions = model.predict(sample_data)
print("Predictions:")
print(predictions)
Epoch 1/10 - loss: 0.6381 - accuracy: 0.6219 - val_loss: 0.6127 - val_accuracy: 0.6625
Epoch 2/10 - loss: 0.5859 - accuracy: 0.6938 - val_loss: 0.5795 - val_accuracy: 0.7188
Epoch 3/10 - loss: 0.5462 - accuracy: 0.7563 - val_loss: 0.5496 - val_accuracy: 0.7500
Epoch 4/10 - loss: 0.5121 - accuracy: 0.8047 - val_loss: 0.5244 - val_accuracy: 0.7875
Epoch 5/10 - loss: 0.4829 - accuracy: 0.8266 - val_loss: 0.4991 - val_accuracy: 0.7875
Epoch 6/10 - loss: 0.4560 - accuracy: 0.8391 - val_loss: 0.4758 - val_accuracy: 0.8188
Epoch 7/10 - loss: 0.4318 - accuracy: 0.8500 - val_loss: 0.4540 - val_accuracy: 0.8188
Epoch 8/10 - loss: 0.4093 - accuracy: 0.8641 - val_loss: 0.4337 - val_accuracy: 0.8188
Epoch 9/10 - loss: 0.3887 - accuracy: 0.8734 - val_loss: 0.4142 - val_accuracy: 0.8313
Epoch 10/10 - loss: 0.3697 - accuracy: 0.8859 - val_loss: 0.3968 - val_accuracy: 0.8375
Test Loss: 0.3762, Test Accuracy: 0.8550
Predictions:
[[0.37654167]
 [0.49492466]
 [0.7099343 ]
 [0.7996195 ]
 [0.11795358]]
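
One caveat about the prediction step above: sample_data is drawn from a standard normal and passed to the model directly, while the training data went through StandardScaler. For real unseen data you would apply the fitted scaler first and then threshold the sigmoid outputs to get class labels. A sketch, reusing the scaler and model defined earlier:

new_data = np.random.randn(5, 20)             # stand-in for unseen samples
new_data_scaled = scaler.transform(new_data)  # reuse the scaler fitted on X_train
probs = model.predict(new_data_scaled)        # probabilities from the sigmoid output
labels = (probs > 0.5).astype(int)            # threshold at 0.5 for class labels
print(labels)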

Multi-class classification example using backpropagation in Keras

Let us first understand the Iris data.

import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.datasets import load_iris

# Load the Iris dataset
iris = load_iris()

# Create a DataFrame for the features
iris_df = pd.DataFrame(data=iris.data, columns=iris.feature_names)
# Add the target column to the DataFrame
iris_df['target'] = iris.target

# Display the first few rows of the DataFrame
iris_df.head()
# Display descriptive statistics for the features
print("\nDescriptive Statistics for Features:")
iris_df.describe()
Descriptive Statistics for Features:
# Display the count of each class in the target column
print("\nCount of Each Class in Target:")
print(iris_df['target'].value_counts())
Count of Each Class in Target:
0    50
1    50
2    50
Name: target, dtype: int64
# Pairplot for feature distributions and pairwise relationships
# (sns.pairplot creates its own figure, so a separate plt.figure call is not needed)
sns.pairplot(iris_df, hue='target', markers=["o", "s", "D"])
<seaborn.axisgrid.PairGrid at 0x11f2e0363d0>
[Pairplot of the four Iris features, colored by target class]


  • We use the Iris dataset, a well-known dataset for multi-class classification. The neural network architecture has an input layer with 4 neurons (the Iris dataset has 4 features), a hidden layer with 16 neurons and ReLU activation, and an output layer with 3 neurons (one for each class) using softmax activation for multi-class classification. A short sketch of what the softmax output and loss compute follows.
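
To make the one-hot encoding and the softmax output concrete, here is a small NumPy sketch of what the output layer and 'categorical_crossentropy' compute for a single sample. The logit values are made-up illustrative numbers, not output from the model below.

import numpy as np

# One-hot encoding, as produced by to_categorical: class 2 -> [0, 0, 1]
y_true = np.array([0., 0., 1.])

# Raw output-layer scores (logits) before the softmax activation (illustrative)
logits = np.array([0.5, 1.2, 2.1])

# Softmax turns logits into probabilities that sum to 1
exp_scores = np.exp(logits - logits.max())  # subtract max for numerical stability
probs = exp_scores / exp_scores.sum()

# Categorical cross-entropy: negative log-probability of the true class
loss = -np.sum(y_true * np.log(probs))
print(probs, loss)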

# Import necessary libraries
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.utils import to_categorical
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.datasets import load_iris

# Load the Iris dataset (a popular multi-class classification dataset)
iris = load_iris()
X, y = iris.data, iris.target
# Convert labels to one-hot encoding
y_one_hot = to_categorical(y, num_classes=3)

# Split the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y_one_hot, test_size=0.2, random_state=42)

# Standardize the data
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)
# Build a multi-class classification neural network
species_model = Sequential()
species_model.add(Dense(units=16, activation='relu', input_dim=4))  # Hidden layer; the Iris dataset has 4 input features
species_model.add(Dense(units=3, activation='softmax'))             # Output layer with 3 neurons (one per class) and softmax activation
# Compile the model
species_model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

# Train the model (backpropagation happens here)
species_model.fit(X_train, y_train, epochs=50, batch_size=10, validation_split=0.2)
Epoch 1/50 - loss: 1.2379 - accuracy: 0.2500 - val_loss: 1.1222 - val_accuracy: 0.2083
Epoch 2/50 - loss: 1.1869 - accuracy: 0.2500 - val_loss: 1.0711 - val_accuracy: 0.2500
Epoch 3/50 - loss: 1.1427 - accuracy: 0.2500 - val_loss: 1.0221 - val_accuracy: 0.2500
Epoch 4/50 - loss: 1.1007 - accuracy: 0.2708 - val_loss: 0.9831 - val_accuracy: 0.3333
Epoch 5/50 - loss: 1.0638 - accuracy: 0.3438 - val_loss: 0.9443 - val_accuracy: 0.5000
Epoch 6/50 - loss: 1.0294 - accuracy: 0.3750 - val_loss: 0.9104 - val_accuracy: 0.5833
Epoch 7/50 - loss: 0.9985 - accuracy: 0.3750 - val_loss: 0.8774 - val_accuracy: 0.6667
Epoch 8/50 - loss: 0.9693 - accuracy: 0.4688 - val_loss: 0.8487 - val_accuracy: 0.7917
Epoch 9/50 - loss: 0.9419 - accuracy: 0.5312 - val_loss: 0.8178 - val_accuracy: 0.8333
Epoch 10/50 - loss: 0.9163 - accuracy: 0.5417 - val_loss: 0.7901 - val_accuracy: 0.8750
Epoch 11/50 - loss: 0.8918 - accuracy: 0.5938 - val_loss: 0.7658 - val_accuracy: 0.8750
Epoch 12/50 - loss: 0.8693 - accuracy: 0.6354 - val_loss: 0.7418 - val_accuracy: 0.8750
Epoch 13/50 - loss: 0.8467 - accuracy: 0.6667 - val_loss: 0.7220 - val_accuracy: 0.8750
Epoch 14/50 - loss: 0.8259 - accuracy: 0.6875 - val_loss: 0.6996 - val_accuracy: 0.8750
Epoch 15/50 - loss: 0.8050 - accuracy: 0.6979 - val_loss: 0.6794 - val_accuracy: 0.8750
Epoch 16/50 - loss: 0.7851 - accuracy: 0.7500 - val_loss: 0.6619 - val_accuracy: 0.9167
Epoch 17/50 - loss: 0.7641 - accuracy: 0.7812 - val_loss: 0.6439 - val_accuracy: 0.9583
Epoch 18/50 - loss: 0.7435 - accuracy: 0.7917 - val_loss: 0.6270 - val_accuracy: 0.9583
Epoch 19/50 - loss: 0.7240 - accuracy: 0.7917 - val_loss: 0.6075 - val_accuracy: 0.9583
Epoch 20/50 - loss: 0.7026 - accuracy: 0.8125 - val_loss: 0.5920 - val_accuracy: 0.9583
Epoch 21/50 - loss: 0.6818 - accuracy: 0.8333 - val_loss: 0.5767 - val_accuracy: 0.9583
Epoch 22/50 - loss: 0.6613 - accuracy: 0.8542 - val_loss: 0.5622 - val_accuracy: 0.9167
Epoch 23/50 - loss: 0.6412 - accuracy: 0.8542 - val_loss: 0.5451 - val_accuracy: 0.9167
Epoch 24/50 - loss: 0.6218 - accuracy: 0.8542 - val_loss: 0.5321 - val_accuracy: 0.9167
Epoch 25/50 - loss: 0.6028 - accuracy: 0.8646 - val_loss: 0.5192 - val_accuracy: 0.9167
Epoch 26/50 - loss: 0.5844 - accuracy: 0.8542 - val_loss: 0.5073 - val_accuracy: 0.9167
Epoch 27/50 - loss: 0.5673 - accuracy: 0.8542 - val_loss: 0.4953 - val_accuracy: 0.9167
Epoch 28/50 - loss: 0.5497 - accuracy: 0.8542 - val_loss: 0.4797 - val_accuracy: 0.9167
Epoch 29/50 - loss: 0.5333 - accuracy: 0.8646 - val_loss: 0.4690 - val_accuracy: 0.9167
Epoch 30/50 - loss: 0.5183 - accuracy: 0.8646 - val_loss: 0.4572 - val_accuracy: 0.9167
Epoch 31/50 - loss: 0.5024 - accuracy: 0.8750 - val_loss: 0.4479 - val_accuracy: 0.9167
Epoch 32/50 - loss: 0.4886 - accuracy: 0.8750 - val_loss: 0.4374 - val_accuracy: 0.9167
Epoch 33/50 - loss: 0.4749 - accuracy: 0.8750 - val_loss: 0.4284 - val_accuracy: 0.9167
Epoch 34/50 - loss: 0.4615 - accuracy: 0.8750 - val_loss: 0.4203 - val_accuracy: 0.9167
Epoch 35/50 - loss: 0.4499 - accuracy: 0.8854 - val_loss: 0.4135 - val_accuracy: 0.9167
Epoch 36/50 - loss: 0.4381 - accuracy: 0.8854 - val_loss: 0.4044 - val_accuracy: 0.9167
Epoch 37/50 - loss: 0.4269 - accuracy: 0.8854 - val_loss: 0.3964 - val_accuracy: 0.9167
Epoch 38/50 - loss: 0.4164 - accuracy: 0.8854 - val_loss: 0.3881 - val_accuracy: 0.9167
Epoch 39/50 - loss: 0.4069 - accuracy: 0.8854 - val_loss: 0.3808 - val_accuracy: 0.9167
Epoch 40/50 - loss: 0.3975 - accuracy: 0.8854 - val_loss: 0.3741 - val_accuracy: 0.9167
Epoch 41/50 - loss: 0.3900 - accuracy: 0.8854 - val_loss: 0.3706 - val_accuracy: 0.9167
Epoch 42/50 - loss: 0.3805 - accuracy: 0.8854 - val_loss: 0.3637 - val_accuracy: 0.9167
Epoch 43/50 - loss: 0.3729 - accuracy: 0.8854 - val_loss: 0.3585 - val_accuracy: 0.9167
Epoch 44/50 - loss: 0.3655 - accuracy: 0.8958 - val_loss: 0.3526 - val_accuracy: 0.9167
Epoch 45/50 - loss: 0.3579 - accuracy: 0.8958 - val_loss: 0.3469 - val_accuracy: 0.9167
Epoch 46/50 - loss: 0.3512 - accuracy: 0.8958 - val_loss: 0.3394 - val_accuracy: 0.9167
Epoch 47/50 - loss: 0.3449 - accuracy: 0.8958 - val_loss: 0.3347 - val_accuracy: 0.9167
Epoch 48/50 - loss: 0.3386 - accuracy: 0.8958 - val_loss: 0.3310 - val_accuracy: 0.9167
Epoch 49/50 - loss: 0.3329 - accuracy: 0.8958 - val_loss: 0.3247 - val_accuracy: 0.9167
Epoch 50/50 - loss: 0.3271 - accuracy: 0.8958 - val_loss: 0.3186 - val_accuracy: 0.9167
<keras.callbacks.History at 0x11f2f43fca0>
# Evaluate the model on the test set
loss, accuracy = species_model.evaluate(X_test, y_test)
print(f'Test Loss: {loss:.4f}, Test Accuracy: {accuracy:.4f}')
Test Loss: 0.2724, Test Accuracy: 0.9667
# Make predictions on new data
sample_data = np.random.randn(4, 4)  # 4 random samples with 4 features
predictions = species_model.predict(sample_data)
print("Predictions:")
print(predictions)
Predictions:
[[0.4744252  0.36803943 0.15753537]
 [0.03827874 0.78655976 0.1751615 ]
 [0.26816195 0.5401332  0.19170485]
 [0.07509188 0.4688139  0.4560942 ]]
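
Each prediction row is a probability distribution over the three classes (softmax outputs sum to 1), so the predicted class is the index of the largest probability:

# Convert class probabilities to predicted class labels
predicted_classes = np.argmax(predictions, axis=1)
print(predicted_classes)  # [0, 1, 1, 1] for the rows above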

We use a neural network with one hidden layer and visualize the activations of the neurons in that hidden layer. The idea is that neurons with consistently high activations for certain inputs may point to important features.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
# Activation visualization for important features
# (the model is passed in explicitly so the function works with species_model,
#  whose 4 input features match X_test; the original global `model` had 20 inputs)
def visualize_activations(model, data, layer_index):
    from keras.models import Model
    # Build a model that outputs the activations of the requested layer
    intermediate_layer_model = Model(inputs=model.input, outputs=model.layers[layer_index].output)
    activations = intermediate_layer_model.predict(data)
    plt.figure(figsize=(12, 6))
    # One histogram per hidden unit (a 2 x 8 grid for the 16 units)
    for i in range(activations.shape[1]):
        plt.subplot(2, 8, i + 1)
        plt.hist(activations[:, i], bins=20, color='blue', alpha=0.7)
        plt.title(f'Activation {i}')
    plt.tight_layout()
    plt.show()
# Inspect the first (hidden) layer of the Iris model
visualize_activations(species_model, X_test, layer_index=0)
[Histograms of the 16 hidden-layer activations on the test set]