
Ability to run locally #9

@iplayfast

Description


I'm able to run autogen against a local model using a config list like:

```python
config_list = [
    {
        "model": "mistral-7b-instruct-v0.1.Q5_0.gguf",  # the name of your running model
        "api_base": "http://0.0.0.0:5001/v1",  # the local address of the API
        "api_type": "open_ai",
        "api_key": "sk-111111111111111111111111111111111111111111111111",  # just a placeholder
    }
]
```

This talks to text-generation-webui with its OpenAI API emulation turned on.
It would be nice to have a similar way of using a local LLM with autogen-ui.
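For context, a minimal sketch of how such a config list is typically wrapped into the `llm_config` dict that autogen agents accept (the model name, endpoint URL, and key below are the placeholders from this issue, not values autogen-ui currently reads):

```python
# Sketch: package the local-server settings into an llm_config dict.
# All values are placeholders copied from the config list above.
config_list = [
    {
        "model": "mistral-7b-instruct-v0.1.Q5_0.gguf",  # running local model
        "api_base": "http://0.0.0.0:5001/v1",  # local OpenAI-compatible endpoint
        "api_type": "open_ai",
        "api_key": "sk-111111111111111111111111111111111111111111111111",  # placeholder
    }
]

llm_config = {"config_list": config_list, "temperature": 0}

# With autogen installed and the local server running, this would be
# passed to an agent, e.g.:
#   import autogen
#   assistant = autogen.AssistantAgent("assistant", llm_config=llm_config)
```

If autogen-ui exposed a way to supply an `llm_config` (or an `OAI_CONFIG_LIST` file) like this, the same local setup would work there too.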
