This repository implements the method presented in the paper "Multiplicative update rules for accelerating deep learning training and increasing robustness" (Neurocomputing, 2024). The table below lists the optimizer variants provided, together with the update rules each one supports:
| Base Optimizer | Multiplicative Variant | Hybrid Variant | Supported Update Rules |
|---|---|---|---|
| SGD | MSGD | HSGD | M_ABS, H_ABS |
| Adam | MAdam | HAdam | M_ABS, H_ABS |
| Adagrad | MAdagrad | HAdagrad | M_ABS, H_ABS |
| RMSprop | MRMSprop | HRMSprop | M_ABS, H_ABS |
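
Below is a minimal usage sketch with PyTorch. The import path, the class name, and the constructor arguments are assumptions made for illustration; the optimizers are assumed to follow the standard `torch.optim` interface, so check the source for the actual API and for how the M_ABS and H_ABS update rules are selected.

```python
import torch
import torch.nn as nn

# Assumed import path and class name; the actual module layout in this
# repository may differ.
from optimizers import MAdam

model = nn.Linear(784, 10)
criterion = nn.CrossEntropyLoss()

# Hypothetical constructor call, assuming the multiplicative variant accepts
# the same basic arguments as torch.optim.Adam.
optimizer = MAdam(model.parameters(), lr=1e-3)

# One training step on a random batch, using the usual torch.optim interface.
x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))

optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()  # applies the multiplicative update rather than the additive one
```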
If you use this code in your research, please cite the following paper:

```bibtex
@article{kirtas2024multiplicative,
  title={Multiplicative update rules for accelerating deep learning training and increasing robustness},
  author={Kirtas, Manos and Passalis, Nikolaos and Tefas, Anastasios},
  journal={Neurocomputing},
  volume={576},
  pages={127352},
  year={2024},
  publisher={Elsevier}
}
```
This work has received funding from the research project "Energy Efficient and Trustworthy Deep Learning - DeepLET", which is implemented in the framework of the H.F.R.I. call "Basic Research Financing (Horizontal support of all Sciences)" under the National Recovery and Resilience Plan "Greece 2.0", funded by the European Union – NextGenerationEU (H.F.R.I. Project Number: 16762). This publication reflects the authors' views only. The European Commission is not responsible for any use that may be made of the information it contains.