# tinny

This is a small educational neural network library implemented in pure NumPy. It originated from my experiments during the Neural Networks and Deep Learning course at AGH University (see the course directory for Jupyter notebooks containing my solutions to course assignments, along with personal commentary and critique).

Currently it contains implementations of:

  • Energy-based models (EBMs): Restricted Boltzmann Machines (RBMs) and Deep Belief Networks (DBNs), trained using Contrastive Divergence (CD-K) and Persistent Contrastive Divergence (PCD).

  • Conventional feed-forward networks trained via backpropagation.
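
To give a sense of how CD-K training works, here is a minimal NumPy sketch of one CD-K update for a Bernoulli RBM. The function name and signature are illustrative only, not tinny's actual API:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd_k_step(W, b, c, v0, k=1, lr=0.01):
    """One CD-K parameter update for a Bernoulli RBM (hypothetical helper).

    W: weights of shape (n_visible, n_hidden)
    b: visible bias, c: hidden bias
    v0: a minibatch of binary visible vectors, shape (batch, n_visible)
    """
    # Positive phase: hidden unit probabilities given the data
    ph0 = sigmoid(v0 @ W + c)

    # Negative phase: k steps of block Gibbs sampling starting from the data
    vk = v0
    for _ in range(k):
        hk = (rng.random(ph0.shape) < sigmoid(vk @ W + c)).astype(float)
        vk = (rng.random(v0.shape) < sigmoid(hk @ W.T + b)).astype(float)
    phk = sigmoid(vk @ W + c)

    # Gradient estimate: positive statistics minus negative statistics
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - vk.T @ phk) / n
    b += lr * (v0 - vk).mean(axis=0)
    c += lr * (ph0 - phk).mean(axis=0)
    return W, b, c
```

PCD differs only in that the Gibbs chain is initialized from persistent "fantasy" particles carried over between updates instead of from the data minibatch.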

Unlike more general-purpose libraries, this project does not implement a full autograd engine. Instead, it adopts a simplified approach in which layers are defined as parametrized tensor functions of the form y = Layer(x; θ). These functions accept a single tensor input and return a single tensor output, resulting in a path-like computation graph. While this limits flexibility (e.g. for branching architectures), it remains extensible: features like residual connections are still achievable via helper layers (e.g. a custom Residual layer that ensures correct gradient flow).
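
The path-like design can be sketched as follows. The `Dense` and `Residual` classes and the `forward`/`backward` helpers below are illustrative assumptions, not tinny's actual interface; they show how a plain list of single-input, single-output layers suffices, and how a residual helper restores a skip connection within that constraint:

```python
import numpy as np

class Dense:
    """A layer as a parametrized tensor function y = x @ W + b."""
    def __init__(self, n_in, n_out, rng):
        self.W = rng.normal(0, 0.1, (n_in, n_out))
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x  # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_y):
        # Accumulate parameter gradients; return gradient w.r.t. the input
        self.dW = self.x.T @ grad_y
        self.db = grad_y.sum(axis=0)
        return grad_y @ self.W.T

class Residual:
    """Helper wrapping an inner layer so that y = x + f(x).

    The gradient flows through both the inner layer and the identity
    branch, so the skip connection fits the single-input/single-output
    layer contract."""
    def __init__(self, inner):
        self.inner = inner

    def forward(self, x):
        return x + self.inner.forward(x)

    def backward(self, grad_y):
        return grad_y + self.inner.backward(grad_y)

# A path-like computation graph is just a list of layers applied in order.
def forward(layers, x):
    for layer in layers:
        x = layer.forward(x)
    return x

def backward(layers, grad):
    for layer in reversed(layers):
        grad = layer.backward(grad)
    return grad
```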

This repository also serves as a personal sandbox for implementing and experimenting with different models, whether in pure NumPy, PyTorch, or alternative frameworks like tinygrad. As such, you can expect new and varied commits over time. See the notebooks directory for examples.

You can also find accompanying theory notes and derivations here.

## Installation

```sh
git clone https://github.com/barhanc/tinny.git
cd tinny
uv sync
```
