Libraries used: numpy, scipy, matplotlib and graphviz.
Optimization-based algorithms use gradient descent by default; the more advanced optimizers supported by the scipy.optimize.minimize function can be used instead, as in the sketch below.
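As an illustration (the data and objective below are toy stand-ins, not this library's API), `scipy.optimize.minimize` can drive the same fit that gradient descent would, here for a ridge-regression objective:

```python
import numpy as np
from scipy.optimize import minimize

# Toy regression data (hypothetical, for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

def ridge_loss(w, X, y, alpha=1.0):
    """Ridge regression objective: squared error plus an L2 penalty on w."""
    residual = X @ w - y
    return residual @ residual + alpha * (w @ w)

# L-BFGS-B is one of the gradient-based methods minimize supports;
# it approximates curvature instead of taking fixed gradient steps.
result = minimize(ridge_loss, x0=np.zeros(3), args=(X, y), method="L-BFGS-B")
print(result.x)  # fitted weights, close to [1.5, -2.0, 0.5]
```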
- List of algorithms:
  - Supervised Learning:
    - Ridge Regression
    - Logistic Regression with He initialization, Momentum and RMSprop
    - Decision Tree Classifier with Information Gain, Gain Ratio and Gini Index Criteria (see the split-criteria sketch after this list)
    - Shallow and Deep Neural Networks with He initialization, Momentum and RMSprop (see the optimizer sketch after this list)
    - K-Nearest Neighbours Classifier and Regressor with Minkowski Distance (see the distance sketch after this list)
    - Support Vector Machine Classifier and Regressor
    - OneVsRest Classifier
    - Naive Bayes-based Classifiers (see the Laplace smoothing sketch after this list):
      - Gaussian Classifier with Laplace Smoothing (works with both numerical and categorical features)
      - Bernoulli Classifier with Laplace Smoothing
      - Multinomial Classifier with Laplace Smoothing
  - Unsupervised Learning:
    - K-Means Clustering with Minkowski Distance
    - Principal Component Analysis using Singular Value Decomposition (see the SVD sketch after this list)
  - Ensemble Methods:
    - Bagging Classifier
    - Random Forest Classifier
    - Majority Vote Classifier
  - Evaluation Metrics (see the micro/macro averaging sketch after this list):
    - Confusion Matrix
    - Micro and Macro Accuracy
    - Micro and Macro Recall
    - Micro and Macro Precision
    - Micro and Macro F1-score
    - Mean Absolute Error (MAE)
    - Mean Squared Error (MSE)
    - R-squared score
    - Log Loss Score
    - Zero One Loss
  - Model Selection (see the resampling sketch after this list):
    - Holdout
    - Repeated Stratified K-Fold
    - Leave-one-out
    - Repeated Bootstrap
  - Preprocessing (see the normalizer sketch after this list):
    - MinMax Normalizer
    - Z-score Normalizer
  - Plotting:
    - Plot Simple Regression
    - Plot Simple Decision Boundary
    - Plot Simple Clustering
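The decision tree's three split criteria differ only in how a candidate partition of the labels is scored. A minimal numpy sketch of the scoring functions (standalone, not this library's implementation):

```python
import numpy as np

def entropy(y):
    """Shannon entropy of a label vector, in bits."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini(y):
    """Gini impurity: probability of misclassifying a randomly drawn sample."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def information_gain(y, splits):
    """Entropy reduction achieved by partitioning y into the given subsets."""
    n = len(y)
    return entropy(y) - sum(len(s) / n * entropy(s) for s in splits)

def gain_ratio(y, splits):
    """Information gain normalized by the split's own entropy, which
    penalizes attributes that fragment the data into many subsets."""
    n = len(y)
    split_info = -sum(len(s) / n * np.log2(len(s) / n) for s in splits)
    return information_gain(y, splits) / split_info if split_info > 0 else 0.0

y = np.array([0, 0, 1, 1, 1, 0])
left, right = y[:3], y[3:]
print(information_gain(y, [left, right]), gain_ratio(y, [left, right]), gini(y))
```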
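He initialization draws weights with variance 2 / fan_in, and Momentum and RMSprop replace the plain gradient step with running averages. A hedged sketch of the update rules (the gradient here is a random stand-in for backpropagation):

```python
import numpy as np

rng = np.random.default_rng(0)
fan_in, fan_out = 64, 32

# He initialization: variance 2 / fan_in keeps ReLU activations well scaled.
W = rng.normal(scale=np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

lr, beta, eps = 0.01, 0.9, 1e-8
velocity = np.zeros_like(W)  # Momentum: running average of gradients
cache = np.zeros_like(W)     # RMSprop: running average of squared gradients

for step in range(100):
    grad = rng.normal(size=W.shape)  # stand-in for a backprop gradient

    # Momentum: smooth the gradient direction over successive steps.
    velocity = beta * velocity + (1 - beta) * grad
    W = W - lr * velocity

    # RMSprop (alternative to the momentum step above): scale each
    # weight's step by the inverse of its recent gradient magnitude.
    # cache = beta * cache + (1 - beta) * grad ** 2
    # W = W - lr * grad / (np.sqrt(cache) + eps)
```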
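Minkowski distance generalizes Manhattan (p = 1) and Euclidean (p = 2) distance, and is the metric both KNN and K-Means use here. A one-function sketch (illustrative, not the library's code):

```python
import numpy as np

def minkowski(a, b, p=2):
    """Minkowski distance: (sum_i |a_i - b_i|^p)^(1/p)."""
    return np.sum(np.abs(a - b) ** p) ** (1.0 / p)

a, b = np.array([0.0, 0.0]), np.array([3.0, 4.0])
print(minkowski(a, b, p=1))  # 7.0 (Manhattan)
print(minkowski(a, b, p=2))  # 5.0 (Euclidean)
```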
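Laplace smoothing adds a pseudocount alpha to every feature count so that values never seen in training do not zero out a class's likelihood. A sketch for the multinomial case (function and variable names are hypothetical, not this library's API):

```python
import numpy as np

def smoothed_feature_log_probs(X, y, alpha=1.0):
    """Multinomial parameters with Laplace (add-alpha) smoothing:
    P(term j | class c) = (count_jc + alpha) / (total_c + alpha * n_features)."""
    classes = np.unique(y)
    n_features = X.shape[1]
    log_probs = np.empty((len(classes), n_features))
    for i, c in enumerate(classes):
        counts = X[y == c].sum(axis=0)
        log_probs[i] = np.log((counts + alpha) / (counts.sum() + alpha * n_features))
    return classes, log_probs

# Toy word-count matrix: 4 documents over a 3-term vocabulary.
X = np.array([[2, 1, 0], [1, 1, 0], [0, 0, 3], [0, 1, 2]])
y = np.array([0, 0, 1, 1])
print(smoothed_feature_log_probs(X, y)[1])  # no -inf, despite zero counts
```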
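PCA via SVD factors the centered data matrix directly instead of forming a covariance matrix, which is cheaper and more numerically stable. A minimal sketch (not the library's implementation):

```python
import numpy as np

def pca_svd(X, n_components):
    """Project X onto its top principal components using SVD. The right
    singular vectors of the centered data are the principal axes."""
    X_centered = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:n_components]                        # principal axes
    explained_var = S[:n_components] ** 2 / (len(X) - 1)  # variance per axis
    return X_centered @ components.T, explained_var

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
scores, var = pca_svd(X, n_components=2)
print(scores.shape, var)  # (50, 2) projection and the two largest variances
```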
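Micro averaging pools the per-class counts before computing a metric, while macro averaging computes the metric per class and then takes the unweighted mean, so small classes count equally. A sketch for precision from a confusion matrix (rows = true class, columns = predicted; this layout is an assumption, check the library's convention):

```python
import numpy as np

def micro_macro_precision(cm):
    """cm[i, j] = number of samples of true class i predicted as class j."""
    tp = np.diag(cm).astype(float)
    predicted = cm.sum(axis=0).astype(float)  # all predictions per class
    micro = tp.sum() / predicted.sum()        # pool counts, then divide
    macro = np.mean(tp / predicted)           # per-class precision, then average
    return micro, macro

cm = np.array([[5, 1, 0],
               [2, 3, 1],
               [0, 1, 7]])
print(micro_macro_precision(cm))
```

Recall is the same computation with row sums in place of column sums.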
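Each model-selection scheme boils down to generating (train, test) index pairs over the data. A sketch of holdout and repeated bootstrap with numpy (illustrative; the library's own splitters may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

def holdout(n, test_fraction=0.3):
    """Single random train/test split of n sample indices."""
    idx = rng.permutation(n)
    cut = int(n * (1 - test_fraction))
    return idx[:cut], idx[cut:]

def repeated_bootstrap(n, n_repeats=10):
    """Each round draws n training indices with replacement;
    the out-of-bag indices form that round's test set."""
    for _ in range(n_repeats):
        train = rng.integers(0, n, size=n)
        test = np.setdiff1d(np.arange(n), train)
        yield train, test

train, test = holdout(100)
for train, test in repeated_bootstrap(100, n_repeats=3):
    print(len(train), len(test))  # ~37 samples end up out-of-bag on average
```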
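Both normalizers are per-feature affine transforms whose statistics must come from the training split only and then be reused on the test split. An illustrative sketch (not the library's classes):

```python
import numpy as np

def minmax_normalize(X_train, X_test):
    """Rescale each feature to [0, 1] using training-set min and max."""
    lo, hi = X_train.min(axis=0), X_train.max(axis=0)
    scale = np.where(hi > lo, hi - lo, 1.0)  # guard constant features
    return (X_train - lo) / scale, (X_test - lo) / scale

def zscore_normalize(X_train, X_test):
    """Center each feature and scale to unit variance, using training statistics."""
    mu, sigma = X_train.mean(axis=0), X_train.std(axis=0)
    sigma = np.where(sigma > 0, sigma, 1.0)
    return (X_train - mu) / sigma, (X_test - mu) / sigma

rng = np.random.default_rng(0)
X_train, X_test = rng.normal(size=(80, 4)), rng.normal(size=(20, 4))
Xtr, Xte = zscore_normalize(X_train, X_test)
print(Xtr.mean(axis=0).round(6))  # ~0 for every feature
```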