AlexNet Performance Analysis : Impact of Batch Normalization on Imagenette

Overview

This project experimentally evaluates the impact of Batch Normalization (BN) on model convergence speed and final performance during deep learning training.

Using the Imagenette dataset, a 10-class subset of ImageNet, I implemented and trained AlexNet, a classic CNN architecture, and compared two otherwise identical models: one with Batch Normalization and one without.


Dataset

  • Name : Imagenette (version 2-320)
  • Source : FastAI (Jeremy Howard)
  • Characteristics : A subset of the ImageNet dataset containing only 10 classes.
  • Data Split (see the loading sketch after this list) :
    • train : Training images (9469 images)
    • val : Validation images (3925 images)
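To make the split concrete, here is a minimal loading sketch in PyTorch. It assumes the imagenette2-320 archive from FastAI has been downloaded and extracted under ./data; the path, batch size, and transforms are illustrative, not the repository's actual settings:

```python
import torch
from torchvision import datasets, transforms

# Standard ImageNet-style preprocessing; AlexNet expects 224x224 inputs.
normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                 std=[0.229, 0.224, 0.225])
train_tf = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    normalize,
])
val_tf = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    normalize,
])

# imagenette2-320/ ships train/ and val/ folders with one subdirectory per class,
# so torchvision's generic ImageFolder loader is sufficient.
train_ds = datasets.ImageFolder("./data/imagenette2-320/train", transform=train_tf)
val_ds = datasets.ImageFolder("./data/imagenette2-320/val", transform=val_tf)

train_loader = torch.utils.data.DataLoader(train_ds, batch_size=128,
                                           shuffle=True, num_workers=4)
val_loader = torch.utils.data.DataLoader(val_ds, batch_size=128,
                                         shuffle=False, num_workers=4)
```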

Model Architecture

I implemented a model that follows the original AlexNet paper as closely as possible.

  1. Features (Convolutional Layers)
    • Consists of 5 Convolutional Layers and 3 MaxPool Layers.
    • If use_bn = True, BatchNorm2d is applied immediately after each Conv Layer (before ReLU).
  2. Classifier (Fully Connected Layers)
    • Consists of 3 Linear Layers and 2 Dropout (p=0.5) Layers.
    • The final output is logits (Softmax is included in CrossEntropyLoss). A PyTorch sketch of the full architecture follows this list.
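The repository's exact code is not reproduced here; the following is a minimal PyTorch sketch consistent with the description above, with layer shapes following the original AlexNet paper and assuming inputs resized to 224×224:

```python
import torch
import torch.nn as nn

class AlexNet(nn.Module):
    """AlexNet with an optional BatchNorm2d after each conv layer (before ReLU)."""

    def __init__(self, num_classes: int = 10, use_bn: bool = False):
        super().__init__()

        def conv_block(in_ch, out_ch, **kwargs):
            layers = [nn.Conv2d(in_ch, out_ch, **kwargs)]
            if use_bn:
                layers.append(nn.BatchNorm2d(out_ch))  # BN before ReLU
            layers.append(nn.ReLU(inplace=True))
            return layers

        # 5 conv layers and 3 max-pool layers, as in the original paper.
        self.features = nn.Sequential(
            *conv_block(3, 64, kernel_size=11, stride=4, padding=2),
            nn.MaxPool2d(kernel_size=3, stride=2),
            *conv_block(64, 192, kernel_size=5, padding=2),
            nn.MaxPool2d(kernel_size=3, stride=2),
            *conv_block(192, 384, kernel_size=3, padding=1),
            *conv_block(384, 256, kernel_size=3, padding=1),
            *conv_block(256, 256, kernel_size=3, padding=1),
            nn.MaxPool2d(kernel_size=3, stride=2),
        )
        # 3 linear layers with 2 dropout (p=0.5) layers.
        self.classifier = nn.Sequential(
            nn.Dropout(p=0.5),
            nn.Linear(256 * 6 * 6, 4096),  # 6x6 feature map for 224x224 inputs
            nn.ReLU(inplace=True),
            nn.Dropout(p=0.5),
            nn.Linear(4096, 4096),
            nn.ReLU(inplace=True),
            nn.Linear(4096, num_classes),  # raw logits; Softmax lives in the loss
        )

    def forward(self, x):
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.classifier(x)
```

The two experimental variants are then simply AlexNet(use_bn=False) and AlexNet(use_bn=True). Because the model returns raw logits, training uses nn.CrossEntropyLoss, which applies LogSoftmax internally.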

Results

(Figures: training results comparing the models with and without Batch Normalization.)
