Add SOTA optimisers #61

@GilesStrong

Description

There was a big kerfuffle in 2019 about some new optimisers: Rectified Adam (RAdam) (Liu et al., 2019), Lookahead (Zhang, Lucas, Hinton, & Ba, 2019), and a combination of the two, Ranger (which now also includes Gradient Centralization (Yong, Huang, Hua, & Zhang, 2020)).
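
For context, Ranger is essentially Lookahead wrapped around RAdam (plus gradient centralisation in recent versions). A rough sketch of the combination is below, using the pip-installable torch-optimizer package as one possible source of the implementations; the package choice and the k/alpha values are illustrative, not anything LUMIN currently provides:

```python
import torch
import torch_optimizer  # one possible source of RAdam/Lookahead; not part of LUMIN

model = torch.nn.Linear(10, 1)

# Ranger ~= Lookahead wrapped around RAdam: RAdam acts as the inner "fast"
# optimiser, while Lookahead keeps a slow copy of the weights and interpolates
# towards the fast weights every k steps with rate alpha.
base_opt = torch_optimizer.RAdam(model.parameters(), lr=1e-3)
opt = torch_optimizer.Lookahead(base_opt, k=5, alpha=0.5)
```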

Having tried these (except the latest version of Ranger), I've not found much improvement over Adam, but that was only on one dataset. Reported performance of Ranger on other datasets looks quite good, though, so it could still be useful.

User-defined optimisers can already be used in LUMIN by passing the partial optimiser to the opt_args argument of ModelBuilder, e.g. opt_args = {'eps':1e-08, 'opt':partial(RAdam)}. It could be useful, however, to include these optimisers in LUMIN so they can be used easily, without the user having to copy in external code.
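
For instance, a minimal sketch of the current workaround (the ModelBuilder arguments other than opt_args are illustrative placeholders, and the RAdam import assumes the user has copied or installed one of the external implementations):

```python
from functools import partial

from lumin.nn.models.model_builder import ModelBuilder
from radam import RAdam  # assumed external implementation, copied/installed by the user

# Only opt_args is the point here: the partial optimiser goes in under 'opt',
# and the remaining keys are forwarded to it as keyword arguments.
model_builder = ModelBuilder(
    objective='classification',  # illustrative placeholder arguments
    n_in=30,
    n_out=1,
    opt_args={'opt': partial(RAdam), 'eps': 1e-8},
)
```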

These git repos include Apache 2.0-licensed implementations of RAdam and Ranger, so inclusion should be straightforward.

Metadata

Assignees: No one assigned

Labels: enhancement (New feature or request), good first issue (Good for newcomers), low priority (Not urgent and won't degrade with time)

Projects: No projects

Milestone: No milestone

Relationships: None yet

Development: No branches or pull requests
