Building various MLP architectures on the MNIST dataset using Keras
Creating a multi-layer perceptron to classify each input image as one of the digit classes [0,1,2,3,4,5,6,7,8,9].
To achieve this, we try out different optimizers, activation functions, and weight initializers, and add layers such as Dropout and Batch Normalization to reduce overfitting.
The objective of this task is to understand how an MLP behaves under different parameter choices (in fact, hyperparameters).
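As a minimal sketch of one such experiment (an assumed architecture, not necessarily the exact model used here), the snippet below builds a Keras MLP for the 10 MNIST classes with ReLU activations, He weight initialization, the Adam optimizer, and Batch Normalization plus Dropout layers to combat overfitting:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense, Dropout, BatchNormalization

def build_mlp(input_dim=784, num_classes=10):
    # Flattened 28x28 MNIST images -> 784-dim input vector
    model = Sequential([
        Input(shape=(input_dim,)),
        Dense(512, activation="relu", kernel_initializer="he_normal"),
        BatchNormalization(),  # normalizes layer activations, speeds up training
        Dropout(0.3),          # randomly drops 30% of units to reduce overfitting
        Dense(256, activation="relu", kernel_initializer="he_normal"),
        BatchNormalization(),
        Dropout(0.3),
        Dense(num_classes, activation="softmax"),  # one probability per digit 0-9
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_mlp()
model.summary()
```

Swapping the optimizer (e.g. `sgd`, `rmsprop`), the activation, the initializer, or the dropout rate in this function is one way to compare the hyperparameter settings described above.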