Conversation

@mjfox3 mjfox3 commented Jan 13, 2022

Using the package in an environment without CUDA support causes it to fail. Adding a parameter to disable CUDA when necessary allows it to function normally.

@amiyamandal-dev

@mjfox3 I guess

use_cuda: bool = torch.cuda.is_available()

might be a better approach
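A minimal sketch of the suggested pattern, assuming PyTorch may or may not be installed (the `cuda_available` helper and `load_model` function here are hypothetical, just to illustrate defaulting the flag to what the runtime actually supports):

```python
def cuda_available() -> bool:
    # Stand-in for torch.cuda.is_available(): reports False on a
    # machine without PyTorch or without a usable CUDA device.
    try:
        import torch
        return torch.cuda.is_available()
    except ImportError:
        return False

def load_model(use_cuda: bool = cuda_available()) -> str:
    # Pick the device string based on the flag; callers on CPU-only
    # boxes need not pass use_cuda=False explicitly.
    return "cuda" if use_cuda else "cpu"

print(load_model(use_cuda=False))  # prints "cpu"
```

One caveat with this approach: a default argument is evaluated once, at function definition time, so the CUDA check happens at import rather than per call. Using `use_cuda: bool = None` and resolving it inside the function body avoids that if the environment can change at runtime.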
