
@Iron-Stark

  • Repo Link

  • Features:

    • Implemented CPU, naive, Thrust, and work-efficient scan variants.
    • Implemented forward propagation and backward propagation for a multilayer perceptron; it classifies all 52/52 examples in the given dataset correctly.
    • Beyond the provided data, also trained the network on the MNIST dataset of 60,000 handwritten digits, reaching 95.65% accuracy.
  • Feedback: The MLP assignment could have been better structured, but I had fun implementing backpropagation from scratch. One recommendation for the future: provide more than one training example per class, since training and evaluating on the same dataset is memorization rather than learning. What I enjoyed most was training on MNIST; that could be the assignment in future iterations of the course.
