Copilot AI commented Jan 11, 2026

Description

Users needed clarity on whether they can use their own local models (e.g., served from an LMStudio instance) or are restricted to the predefined ones. This PR adds documentation explaining model-provider configuration.

Changes

  • New documentation: docs/models.md (384 lines) covering:

    • Built-in providers: OpenAI, LMStudio (port 1234, default model openai/gpt-oss-20b), Ollama (port 11434, default model gpt-oss:20b)
    • Custom provider configuration via ~/.codex/config.toml with all available options (auth, wire protocols, HTTP headers, retry settings)
    • Provider examples: Azure OpenAI, OpenRouter, Together.ai, Groq
    • Configuration profiles for quick provider switching
    • Environment variable overrides (CODEX_OSS_BASE_URL, CODEX_OSS_PORT, OPENAI_BASE_URL)
    • Troubleshooting common issues
  • Updated links:

    • README.md: Added model selection documentation to docs section
    • docs/config.md: Added reference to model selection guide
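The environment-variable overrides listed above can be set in the shell before launching the CLI. The values below are illustrative only, assuming a local LMStudio server on its default port:

```shell
# Point the --oss provider at a local OpenAI-compatible server.
# Host and port are examples; adjust to your setup.
export CODEX_OSS_BASE_URL="http://localhost:1234/v1"  # full base-URL override
# Alternatively, override only the port:
# export CODEX_OSS_PORT=1234
codex --oss
```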

Example Configuration

```toml
# ~/.codex/config.toml
[model_providers.lmstudio]
name = "LMStudio"
base_url = "http://localhost:1234/v1"
wire_api = "responses"
requires_openai_auth = false

[profiles.local]
model_provider = "lmstudio"
model = "llama-3.3-70b-instruct"
```

```shell
codex --oss  # Use LMStudio
codex --oss -m deepseek/r1-distill-qwen-32b  # Specify model
codex -p local  # Use profile
```

Configuration applies to both CLI and IDE usage.
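For the Ollama provider mentioned above, an analogous custom entry can be sketched. This is a hypothetical fragment, assuming Ollama's OpenAI-compatible endpoint on its default port (11434) and the default model noted earlier; the profile name `local-ollama` is made up for illustration:

```toml
# ~/.codex/config.toml — illustrative entry; adjust to your setup
[model_providers.ollama]
name = "Ollama"
base_url = "http://localhost:11434/v1"
wire_api = "chat"
requires_openai_auth = false

[profiles.local-ollama]
model_provider = "ollama"
model = "gpt-oss:20b"
```

Switching would then be `codex -p local-ollama`, mirroring the profile example above.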

Original prompt

In this article, the details are present of how to select specific models when using the extension. I would like to review this to see if I can use my own choice of local models (I have an LMStudio instance with a wide range of models available) or if I am restricted only to the provided ones. https://developers.openai.com/codex/ide/features#switch-between-models



Copilot AI and others added 2 commits January 11, 2026 23:05
Co-authored-by: ian-morgan99 <106852423+ian-morgan99@users.noreply.github.com>
Copilot AI changed the title [WIP] Review local model selection options for LMStudio Add documentation for model selection and custom providers Jan 11, 2026
Copilot AI requested a review from ian-morgan99 January 11, 2026 23:09