This repository contains from-scratch implementations of several machine learning models for multi-class classification and regression tasks. The implemented models include XGBoost, Fisher's Linear Discriminant Analysis (LDA), AdaBoost, a multi-class Support Vector Machine (SVM), and a Decision Tree. These models were built without relying on external machine learning libraries.

## Course Project Information
This project was completed as part of the coursework for a "Machine Learning" course. The goal was to gain a deeper understanding of the inner workings of machine learning algorithms by implementing them from scratch.

## Implemented Models
- XGBoost: a gradient boosting algorithm known for its efficiency and strong predictive performance.
- Fisher's LDA: Fisher's Linear Discriminant Analysis for classification tasks.
- AdaBoost: an ensemble method that combines many weak learners into a strong learner by reweighting training samples.
- SVM (multi-class): a Support Vector Machine adapted to multi-class classification.
- Decision Tree: a tree-based model used for both classification and regression tasks.

Minimal sketches of the core idea behind each of these models are shown below.
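For XGBoost, the sketch below illustrates the two quantities at the heart of tree construction in the standard second-order formulation: the optimal leaf weight and the split gain, both computed from per-sample gradients and Hessians. The function names `leaf_weight` and `split_gain`, and the use of NumPy, are illustrative choices, not necessarily how this repository implements them.

```python
import numpy as np

def leaf_weight(g, h, lam=1.0):
    """Optimal leaf weight for the regularized second-order objective:
    w* = -sum(g) / (sum(h) + lambda)."""
    return -g.sum() / (h.sum() + lam)

def split_gain(g, h, left_mask, lam=1.0, gamma=0.0):
    """Gain of splitting one node into a left child (left_mask) and a right child."""
    def score(gs, hs):
        return gs.sum() ** 2 / (hs.sum() + lam)
    return 0.5 * (score(g[left_mask], h[left_mask])
                  + score(g[~left_mask], h[~left_mask])
                  - score(g, h)) - gamma
```

For squared-error regression (as on the carbon emissions dataset), `g` is simply the residual (prediction minus target) and `h` is 1 for every sample.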
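For Fisher's LDA, a minimal multi-class sketch is shown below: it builds the within-class and between-class scatter matrices and takes the leading eigenvectors of S_w⁻¹ S_b as projection directions. The function name `lda_directions` is hypothetical; the repository's implementation may differ in details.

```python
import numpy as np

def lda_directions(X, y, n_components=1):
    """Fisher's LDA: directions maximizing between-class over within-class scatter."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))   # within-class scatter
    Sb = np.zeros((d, d))   # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - overall_mean).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # Leading eigenvectors of Sw^{-1} Sb give the discriminant directions
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs[:, order[:n_components]].real
```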
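For AdaBoost, the following is a minimal binary sketch using decision stumps: each round fits the stump with the lowest weighted error, then up-weights the samples it misclassified. Labels are assumed to be in {-1, +1}; the helper names are illustrative, and a multi-class variant (e.g. one-vs-rest) would be needed for the crime-rate task.

```python
import numpy as np

def adaboost_train(X, y, n_rounds=50):
    """Binary AdaBoost with threshold stumps; y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)           # sample weights
    ensemble = []                     # (alpha, feature, threshold, polarity)
    for _ in range(n_rounds):
        best = None
        # Exhaustive search over stumps for the lowest weighted error
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol)
        err, j, thr, pol = best
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))  # learner weight
        pred = pol * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)                     # up-weight mistakes
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(a * p * np.where(X[:, j] <= t, 1, -1) for a, j, t, p in ensemble)
    return np.sign(score)
```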
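A common way to adapt a binary SVM to multiple classes is one-vs-rest: train one binary classifier per class and predict the class with the largest decision value. The sketch below assumes a linear SVM trained by sub-gradient descent on the hinge loss; the function names and hyperparameters are illustrative, and the repository may use a different formulation (e.g. one-vs-one or a kernelized dual solver).

```python
import numpy as np

def train_linear_svm(X, y, lr=0.01, C=1.0, epochs=100):
    """Binary linear SVM via sub-gradient descent on the hinge loss; y in {-1, +1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in np.random.permutation(n):
            if y[i] * (X[i] @ w + b) < 1:          # margin violated
                w -= lr * (w / (C * n) - y[i] * X[i])
                b += lr * y[i]
            else:                                   # only regularization pulls on w
                w -= lr * w / (C * n)
    return w, b

def train_one_vs_rest(X, y, classes):
    """One binary SVM per class, trained on class-vs-everything-else labels."""
    return {c: train_linear_svm(X, np.where(y == c, 1, -1)) for c in classes}

def predict_one_vs_rest(models, X):
    """Predict the class whose SVM gives the largest decision value."""
    keys = list(models)
    scores = np.stack([X @ w + b for w, b in (models[c] for c in keys)])
    return np.array(keys)[np.argmax(scores, axis=0)]
```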
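A decision tree is grown by repeatedly choosing the feature/threshold split that most reduces node impurity. The sketch below shows that split search for classification using Gini impurity; a regression tree would instead minimize variance (mean squared error) at each node. The function names `gini` and `best_split` are illustrative, not taken from this repository.

```python
import numpy as np

def gini(y):
    """Gini impurity of a label array."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Exhaustive search for the (feature, threshold) pair with the largest impurity decrease."""
    best_gain, best = 0.0, None
    parent = gini(y)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            left = X[:, j] <= thr
            if left.all() or not left.any():
                continue
            gain = parent - (left.mean() * gini(y[left]) + (~left).mean() * gini(y[~left]))
            if gain > best_gain:
                best_gain, best = gain, (j, thr)
    return best, best_gain
```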
## Datasets

The implemented models were trained and tested on two datasets:

- Carbon Emissions Dataset: used for regression tasks.
- Crime Rate Dataset: used for multi-class classification tasks.
## Usage

1. Clone the repository and move into the source directory:

   ```bash
   git clone https://github.com/your-username/Implementing-ML-Models-from-scratch-for-multi-class-classification-and-regression.git
   cd Implementing-ML-Models-from-scratch-for-multi-class-classification-and-regression/src/
   ```

2. Run the models: execute the provided scripts or notebooks to train and evaluate the implemented machine learning models.
## Author

Rishabh Patil