Custom Backpropagation Implementation with NumPy

Overview

This project implements backpropagation from scratch using only NumPy to train a neural network for classification tasks. It supports networks with any number of layers and allows switching between multiple activation functions. The model achieves 98.1% accuracy on the MNIST handwritten digit dataset.

Features

  • Flexible network design: Supports any number of layers and neurons per layer.
  • Multiple activation support: ReLU and Sigmoid activations are implemented and can be swapped freely.
  • Validated with gradient checking: Numerical gradient checks confirm that backpropagation is implemented correctly (see the sketch after this list).
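As a rough illustration of the gradient-checking idea: each analytic gradient from backprop is compared against a centered finite-difference estimate of the loss. This is a minimal sketch, not the repo's actual code; `loss_fn` (a zero-argument closure that recomputes the loss) and `params` are hypothetical names.

```python
import numpy as np

def grad_check(loss_fn, params, analytic_grads, eps=1e-5):
    """Compare backprop gradients against centered finite differences."""
    for p, g in zip(params, analytic_grads):
        it = np.nditer(p, flags=["multi_index"])
        while not it.finished:
            idx = it.multi_index
            old = p[idx]
            p[idx] = old + eps
            loss_plus = loss_fn()
            p[idx] = old - eps
            loss_minus = loss_fn()
            p[idx] = old  # restore the original value
            numeric = (loss_plus - loss_minus) / (2 * eps)
            rel_err = abs(numeric - g[idx]) / max(abs(numeric) + abs(g[idx]), 1e-12)
            assert rel_err < 1e-5, f"gradient mismatch at {idx}: {rel_err:.2e}"
            it.iternext()
```

A relative error well below 1e-5 at every index is the usual sign that the analytic gradients are correct.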

Backpropagation

The implementation draws on techniques from Stanford's lecture notes on backpropagation. The notes are well written and work through simplified, step-by-step computations of the gradients.
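For reference, the core per-layer gradient rules those notes derive look like the following in NumPy. This is an illustrative sketch under the usual dense-layer setup (y = ReLU(x @ W + b)); the function and variable names are mine, not the repo's.

```python
import numpy as np

def dense_relu_backward(x, W, pre_act, d_out):
    """Backward pass through one dense layer with ReLU.

    x:       layer input, shape (batch, n_in)
    W:       weight matrix, shape (n_in, n_out)
    pre_act: x @ W + b, cached from the forward pass
    d_out:   upstream gradient dL/dy from the layer above
    """
    d_pre = d_out * (pre_act > 0)   # ReLU gate: gradient flows only where pre-activation > 0
    dW = x.T @ d_pre                # dL/dW
    db = d_pre.sum(axis=0)          # dL/db
    dx = d_pre @ W.T                # dL/dx, passed on to the previous layer
    return dW, db, dx
```

Chaining this rule from the output layer back to the input is the whole of backprop for a feedforward network.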

Training + Architecture

  • Weight Initialization: Weights drawn from a uniform distribution within ±0.01.
  • Learning rate: 0.01.
  • Layer sizes: [256, 128, 10] neurons (two hidden layers plus the 10-class output) for MNIST classification.
  • ReLU activation for the hidden layers, softmax output layer for final predictions (see the sketch after this list).
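A minimal sketch of this setup, assuming flattened 28×28 MNIST images (784 inputs) and interpreting "within ±0.01" as uniform initialization on [-0.01, 0.01]; names and structure are illustrative, not the repo's:

```python
import numpy as np

rng = np.random.default_rng(0)
sizes = [784, 256, 128, 10]   # MNIST pixels -> hidden layers -> 10 digit classes
weights = [rng.uniform(-0.01, 0.01, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]
lr = 0.01

def forward(x):
    """ReLU on the hidden layers, softmax on the output layer."""
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(0, x @ W + b)
    logits = x @ weights[-1] + biases[-1]
    logits -= logits.max(axis=1, keepdims=True)   # subtract max for numerical stability
    exp = np.exp(logits)
    return exp / exp.sum(axis=1, keepdims=True)   # per-row class probabilities
```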

Feedforward Architecture

[Figure: diagram of the feedforward network]

Results

Achieved 98.1% accuracy on the MNIST handwritten digit dataset.

Loss Comparison

[Figure: training loss curves]

Future Ideas

  • Add Adam optimizer for faster training (the standard update rule is sketched after this list).
  • Experiment with other activation functions like Leaky ReLU or Tanh.
  • Apply more hyperparameter tuning.
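Should Adam be added, a single update step for one parameter array would follow the standard algorithm; this sketch shows the textbook rule only and is not part of the repo yet.

```python
import numpy as np

def adam_step(p, g, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update for parameter p given gradient g at timestep t (1-indexed)."""
    m = b1 * m + (1 - b1) * g        # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * g * g    # second-moment (uncentered variance) estimate
    m_hat = m / (1 - b1 ** t)        # bias-correct the moment estimates
    v_hat = v / (1 - b2 ** t)
    p = p - lr * m_hat / (np.sqrt(v_hat) + eps)
    return p, m, v
```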

Acknowledgments

Stanford's lecture notes on backpropagation, which guided the gradient derivations.
