Conversation

@gloriforge
I added a device option to MiniCheck so inference can run on either the GPU or the CPU. On devices with limited resources, such as the Raspberry Pi, inference should run on the CPU instead of the GPU. The new option accepts either 'cpu' or 'cuda' to select the inference device.
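A minimal sketch of how such a device option might be validated, assuming a hypothetical `resolve_device` helper (the names and defaults here are illustrative, not MiniCheck's actual API):

```python
VALID_DEVICES = ("cpu", "cuda")

def resolve_device(device=None):
    """Pick the inference device for a hypothetical MiniCheck-style wrapper.

    On resource-limited hardware (e.g. a Raspberry Pi) callers pass 'cpu';
    'cuda' selects the GPU. Defaults conservatively to 'cpu'.
    """
    if device is None:
        device = "cpu"  # safe default when no accelerator is assumed
    if device not in VALID_DEVICES:
        raise ValueError(f"device must be one of {VALID_DEVICES}, got {device!r}")
    return device
```

In practice the resolved string would be passed on to the underlying model-loading call (e.g. moving model weights to the chosen device), so invalid values fail fast before any weights are loaded.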
