Conversation

**@arturshagito0** (Contributor)

The optimizer cannot be pickled directly in later PyTorch versions - its `optimizer.state_dict()` object must be saved instead. This small change ensures that only the optimizer's `state_dict` is saved for the `optimizer` attribute:

```python
# ... path, pickle_module=pickle_module, **kwargs)  (truncated context from the diff)
attributes = {item: getattr(self, item) for item in self.PRESERVE if item != "optimizer"}
optimizer = getattr(self, "optimizer")
attributes["optimizer"] = optimizer.state_dict()
```
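The idea behind the change can be sketched as follows. `DummyOptimizer` and the `Model` class here are stand-ins for the real PyTorch optimizer and the project's class, not the actual code; only the pattern (replace the optimizer instance with its picklable `state_dict()` before dumping) is what the diff does.

```python
import os
import pickle
import tempfile


class DummyOptimizer:
    """Stand-in for a torch.optim.Optimizer exposing the state_dict() API."""

    def __init__(self, lr=0.01):
        self.lr = lr

    def state_dict(self):
        # A plain dict is always picklable, unlike optimizer objects
        # in later PyTorch versions.
        return {"lr": self.lr}


class Model:
    # Hypothetical list of attributes preserved on save.
    PRESERVE = ("epoch", "optimizer")

    def __init__(self):
        self.epoch = 3
        self.optimizer = DummyOptimizer(lr=0.1)

    def save(self, path):
        # Collect preserved attributes, swapping the optimizer instance
        # for its picklable state_dict() -- the pattern from the diff.
        attributes = {item: getattr(self, item)
                      for item in self.PRESERVE if item != "optimizer"}
        attributes["optimizer"] = self.optimizer.state_dict()
        with open(path, "wb") as f:
            pickle.dump(attributes, f)


path = os.path.join(tempfile.gettempdir(), "checkpoint.pkl")
Model().save(path)
with open(path, "rb") as f:
    restored = pickle.load(f)
# restored["optimizer"] is now a plain dict, e.g. {"lr": 0.1}
```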
Member

But how do we load the optimizer back from that state if we need to? In the current implementation we dump the Optimizer instance and can deserialize it directly for use.
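One possible answer, following the usual PyTorch pattern: construct a fresh optimizer (over the same parameters) and restore its state with `load_state_dict`. `DummyOptimizer` below is a hypothetical stand-in that mimics that API; it is not the project's code.

```python
import pickle


class DummyOptimizer:
    """Stand-in mimicking torch.optim.Optimizer's state_dict API."""

    def __init__(self, lr=0.01):
        self.lr = lr

    def state_dict(self):
        return {"lr": self.lr}

    def load_state_dict(self, state):
        # Restore internal state from a previously saved dict.
        self.lr = state["lr"]


# Saving: serialize only the picklable state_dict, as the PR proposes.
opt = DummyOptimizer(lr=0.1)
blob = pickle.dumps(opt.state_dict())

# Loading: build a fresh optimizer, then restore its state.
new_opt = DummyOptimizer()
new_opt.load_state_dict(pickle.loads(blob))
assert new_opt.lr == 0.1
```

The trade-off the reviewer points at is real: with only a `state_dict`, loading requires reconstructing the optimizer object first, whereas pickling the instance restores it in one step.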

