This repository contains a PyTorch implementation of the Subspace Rotation Algorithm (SRA) for training a Restricted Hopfield Network (RHN) with bipolar discrete patterns.
The current code is designed for bipolar patterns, i.e. values in {−1, +1}. To train with binary patterns instead (values in {0, 1}), you will need to adjust the activation function in the output layer accordingly.
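As a sketch of what that adjustment might involve (the helper names below are hypothetical and not part of this repository), bipolar states pair naturally with a sign-style activation, while binary states pair with a 0/1 step; patterns can also be converted between the two encodings:

```python
import torch

# Hypothetical helpers: convert between binary {0, 1} and bipolar {-1, +1} encodings.
def to_bipolar(binary):
    return 2.0 * binary - 1.0

def to_binary(bipolar):
    return (bipolar + 1.0) / 2.0

# Output activations: a sign-style step for bipolar states (ties at 0 go to +1) ...
def bipolar_activation(x):
    return torch.where(x >= 0, torch.ones_like(x), -torch.ones_like(x))

# ... and a 0/1 step for binary states.
def binary_activation(x):
    return torch.where(x >= 0, torch.ones_like(x), torch.zeros_like(x))
```

The two step functions agree up to the encoding: `to_binary(bipolar_activation(x))` equals `binary_activation(x)` for any input.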
- Overview
  - Model: Restricted Hopfield Network (RHN)
  - Training algorithm: Subspace Rotation Algorithm (SRA)
  - Pattern type: Bipolar discrete patterns (default)
  - Goal: Efficiently train RHN weights so that stored patterns are stable attractors, with improved convergence and robustness compared to classical Hebbian or gradient-based methods.
This code is mainly for research / demonstration of SRA on RHN, not for production deployment.
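For intuition, the "stable attractor" goal above can be illustrated with a minimal recall sketch. This is not the repository's implementation and not the SRA training rule; it only assumes a bipartite (restricted) weight matrix `W` mapping visible to hidden units, with bipolar sign activations, and the helper names are hypothetical:

```python
import torch

def sign_bipolar(x):
    # Map activations to {-1, +1}; ties at 0 go to +1.
    return torch.where(x >= 0, torch.ones_like(x), -torch.ones_like(x))

def recall_step(W, v):
    # One visible -> hidden -> visible pass through the bipartite weights
    # (a simplified recall rule, assumed here for illustration).
    h = sign_bipolar(W @ v)
    return sign_bipolar(W.T @ h)

def is_stable(W, pattern):
    # A stored pattern is a fixed point if one recall step returns it unchanged.
    return torch.equal(recall_step(W, pattern), pattern)
```

With suitably trained weights, each stored pattern is returned unchanged by `recall_step`, and small distortions of it are mapped back toward the stored pattern; the quality of `W` determines how large a basin of attraction each pattern has.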
- Citation
If you use this code in academic work, please cite the following papers on the Restricted Hopfield Network and the Subspace Rotation Algorithm:
```bibtex
@inproceedings{lin2023basin,
  title        = {On the Basin of Attraction and Capacity of Restricted Hopfield Network as an Auto-Associative Memory},
  author       = {Lin, Ci and Yeap, Tet Hin and Kiringa, Iluju},
  booktitle    = {2023 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC)},
  pages        = {146--154},
  year         = {2023},
  organization = {IEEE}
}

@inproceedings{lin2024sra,
  title        = {Subspace Rotation Algorithm for Training Restricted Hopfield Network},
  author       = {Lin, Ci and Yeap, Tet and Kiringa, Iluju},
  booktitle    = {2024 IEEE 36th International Conference on Tools with Artificial Intelligence (ICTAI)},
  pages        = {740--747},
  year         = {2024},
  doi          = {10.1109/ICTAI62512.2024.00110}
}
```
- License
All rights reserved. For academic use only.