# 🧠 ANN Optimizer Comparison on CIFAR-10

This project compares the performance of different optimization algorithms on a simple Artificial Neural Network (ANN) using the CIFAR-10 dataset.


πŸ“ Files

  • ANN(Different optimizers).py – Main script that:
    • Loads CIFAR-10 dataset
    • Trains the same ANN architecture using multiple optimizers
    • Visualizes validation loss and accuracy
    • Compares final test accuracy for each optimizer
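The script's exact architecture and hyperparameters live in the file itself; the following is only a sketch of the data-loading and model-building steps, assuming a plain Keras `Sequential` model (the layer widths here are illustrative, not the script's actual values):

```python
import tensorflow as tf

# Load CIFAR-10: 50,000 train / 10,000 test RGB images of shape (32, 32, 3)
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

def build_ann():
    # A plain fully connected network: flatten the image, then dense layers.
    # Layer widths are illustrative; the script defines the real architecture.
    return tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(32, 32, 3)),
        tf.keras.layers.Dense(512, activation="relu"),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
```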

## 🧪 Optimizers Used

- Gradient Descent (GD)
- Stochastic Gradient Descent (SGD)
- Momentum
- RMSprop
- Adam
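Keras has no separate class for plain gradient descent or momentum; both are conventionally expressed through `tf.keras.optimizers.SGD`. One plausible mapping (the learning rates are illustrative, not the script's actual values):

```python
import tensorflow as tf

# One plausible mapping of the five optimizers onto Keras classes.
# Plain GD differs from mini-batch SGD only in batch size (full batch
# vs. mini-batches), so both use the SGD class here.
optimizers = {
    "GD":       tf.keras.optimizers.SGD(learning_rate=0.01),
    "SGD":      tf.keras.optimizers.SGD(learning_rate=0.01),
    "Momentum": tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9),
    "RMSprop":  tf.keras.optimizers.RMSprop(learning_rate=0.001),
    "Adam":     tf.keras.optimizers.Adam(learning_rate=0.001),
}
```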

## 📊 Evaluation

- Training and validation are performed on 50% of the CIFAR-10 dataset.
- Plots:
  - Validation Loss vs. Epochs
  - Validation Accuracy vs. Epochs
- Prints the final test accuracy for each optimizer (one way this loop could look is sketched below)
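Continuing the sketches above (this assumes `build_ann`, `optimizers`, and the loaded data from the earlier snippets), the comparison loop and plots could look roughly like this; the epoch count and validation split are illustrative:

```python
import matplotlib.pyplot as plt

# Use 50% of the training set, per the evaluation setup above.
x_half, y_half = x_train[:25000], y_train[:25000]

histories = {}
for name, opt in optimizers.items():
    model = build_ann()  # the same architecture for every optimizer
    model.compile(optimizer=opt,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # epochs and validation_split are illustrative values
    histories[name] = model.fit(x_half, y_half, epochs=10,
                                validation_split=0.2, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    print(f"{name}: final test accuracy = {acc:.4f}")

# Validation-loss curves; the accuracy plot is analogous with "val_accuracy".
for name, hist in histories.items():
    plt.plot(hist.history["val_loss"], label=name)
plt.xlabel("Epoch")
plt.ylabel("Validation loss")
plt.legend()
plt.show()
```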

## 📦 Requirements

Install the necessary packages with:

```bash
pip install tensorflow matplotlib seaborn scikit-learn
```

## ▶️ How to Run

Simply run the script:

```bash
python "ANN(Different optimizers).py"
```

Make sure your environment has internet access so the CIFAR-10 dataset can be downloaded if it is not already cached locally.


## 🎯 Objective

To observe and compare, for each optimizer on a classification task:

- Convergence behavior
- Generalization performance
- Training speed


## 👀 Author

Ishan Bansal
*Deep Learning Explorer*
