This repository was archived by the owner on Aug 25, 2025. It is now read-only.

10gb lambda #14

@Emveez

Description


So AWS Lambda now supports up to 10 GB of memory, and compute capacity scales in proportion to the memory allocation. I ran inference with a 3 GB allocation and compared it with 10 GB, but did not see any major improvement. Why could this be? Maybe the statically compiled Torch cannot use all the vCPUs?
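One thing worth checking: Lambda grants roughly one full vCPU at 1,769 MB and up to 6 vCPUs at 10,240 MB, but PyTorch (and the OpenMP/MKL runtimes it links) sizes its thread pool when it initializes, and that sizing may not match what the sandbox actually exposes. A minimal, stdlib-only sketch of how one might pin the thread counts to the visible cores before importing torch (the `configure_threads` helper name is hypothetical, not part of any library):

```python
import os

def configure_threads():
    """Set intra-op thread env vars to the number of visible vCPUs.

    Must run BEFORE importing torch, because OpenMP/MKL read these
    variables only once, when their thread pools are created.
    """
    n = os.cpu_count() or 1  # Lambda exposes vCPUs proportional to memory
    os.environ.setdefault("OMP_NUM_THREADS", str(n))
    os.environ.setdefault("MKL_NUM_THREADS", str(n))
    return n

n_threads = configure_threads()
# After this, `import torch` followed by torch.get_num_threads()
# can confirm whether the extra vCPUs are actually being used.
```

If `torch.get_num_threads()` still reports 1 on the 10 GB configuration, that would support the hypothesis that the statically compiled build is single-threaded regardless of allocation.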
