2 changes: 1 addition & 1 deletion .gitignore
@@ -8,5 +8,5 @@ __pycache__/
build/
dist/
*.egg-info/

*.md
env.sh
13 changes: 12 additions & 1 deletion laser/main.py
@@ -26,6 +26,13 @@
"google": (GoogleGenAI, {}),
"openai": (OpenAIResponses, {}),
"chat": (OpenAILegacy, {}),
"litellm": (
OpenAILegacy,
{
"base_url": os.getenv("LITELLM_PROXY_API_BASE", "http://localhost:4000"),
"api_key": os.getenv("LITELLM_PROXY_API_KEY"),
},
),
}

SYSTEM_PROMPT = """You are Laser, a coding agent for this repository.
@@ -73,7 +80,7 @@ def parse_args():
    parser.add_argument(
        "--model",
        default="openai/gpt-5.2",
        help="Model identifier to use in the form <provider>/<model>",
        help="Model identifier to use in the form <provider>/<model>. Providers: anthropic, google, openai, chat, litellm",
    )
    parser.add_argument(
        "--max-tokens",
@@ -217,6 +224,10 @@ def build_provider(model: str, max_tokens: int) -> ChatProvider:
    provider_cls, provider_kwargs = PROVIDER_CONFIGS[provider]
    if provider_cls is Anthropic:
        provider_kwargs = {**provider_kwargs, "default_max_tokens": max_tokens}
    if provider == "litellm":
Copilot AI Jan 14, 2026
The LITELLM_MODEL environment variable override behavior should be documented. Add a comment explaining that this allows a runtime model override for the litellm provider, since this behavior is unique to litellm and not present in the other providers.

Suggested change
if provider == "litellm":
if provider == "litellm":
# Allow runtime model override specifically for the litellm provider via
# the LITELLM_MODEL environment variable. This override behavior is unique
# to litellm and is not supported by other providers in this CLI.

        env_model = os.getenv("LITELLM_MODEL")
        if env_model:
            model_name = env_model
    return provider_cls(model=model_name, **provider_kwargs)


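The review comment above asks for the LITELLM_MODEL override to be documented. For readers following along, here is a minimal, self-contained sketch of how the new litellm entry resolves a provider and model. The names PROVIDER_CONFIGS, build_provider, LITELLM_MODEL, LITELLM_PROXY_API_BASE, and LITELLM_PROXY_API_KEY come from this diff; StubProvider and the standalone "<provider>/<model>" split are illustrative stand-ins, not the repository's actual OpenAILegacy provider or argument parsing.

import os
from dataclasses import dataclass

@dataclass
class StubProvider:
    # Hypothetical stand-in for OpenAILegacy; it only mirrors the constructor
    # arguments this diff passes (model, base_url, api_key).
    model: str
    base_url: str = "http://localhost:4000"
    api_key: str | None = None

PROVIDER_CONFIGS = {
    "litellm": (
        StubProvider,
        {
            "base_url": os.getenv("LITELLM_PROXY_API_BASE", "http://localhost:4000"),
            "api_key": os.getenv("LITELLM_PROXY_API_KEY"),
        },
    ),
}

def build_provider(model: str) -> StubProvider:
    # "--model litellm/glm-4.7" splits into provider "litellm" and model name "glm-4.7".
    provider, model_name = model.split("/", 1)
    provider_cls, provider_kwargs = PROVIDER_CONFIGS[provider]
    if provider == "litellm":
        # When LITELLM_MODEL is set, it overrides the <model> segment from the CLI.
        env_model = os.getenv("LITELLM_MODEL")
        if env_model:
            model_name = env_model
    return provider_cls(model=model_name, **provider_kwargs)

if __name__ == "__main__":
    os.environ["LITELLM_MODEL"] = "glm-4.7"
    # Even though the CLI asked for "some-other-model", the env var wins for litellm.
    print(build_provider("litellm/some-other-model").model)  # prints: glm-4.7

In practice this means the export LITELLM_MODEL=glm-4.7 line in run.sh takes precedence over whatever follows litellm/ in --model, which is exactly the behavior the comment asks to call out.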
14 changes: 14 additions & 0 deletions run.sh
@@ -0,0 +1,14 @@
export LITELLM_MODEL=glm-4.7
export LITELLM_PROXY_API_KEY=sk-1234
export LITELLM_PROXY_API_BASE=http://192.168.43.179:4001
Comment on lines +1 to +3
Copilot AI Jan 14, 2026
The script contains hardcoded credentials (API key) and a specific IP address. This convenience script should not be committed with real credentials or environment-specific configuration. Consider creating a template file (e.g., run.sh.example) instead, or documenting that users should customize these values.

Suggested change
export LITELLM_MODEL=glm-4.7
export LITELLM_PROXY_API_KEY=sk-1234
export LITELLM_PROXY_API_BASE=http://192.168.43.179:4001
# Default model can be overridden via LITELLM_MODEL
export LITELLM_MODEL="${LITELLM_MODEL:-glm-4.7}"
# Require the API key to be provided via the environment; do not hardcode it here.
if [ -z "${LITELLM_PROXY_API_KEY:-}" ]; then
echo "Error: LITELLM_PROXY_API_KEY is not set. Please export it in your environment before running this script." >&2
exit 1
fi
export LITELLM_PROXY_API_KEY
# Default API base can be overridden via LITELLM_PROXY_API_BASE; use localhost by default.
export LITELLM_PROXY_API_BASE="${LITELLM_PROXY_API_BASE:-http://127.0.0.1:4001}"

uv run laser --model litellm/glm-4.7 $@

exit

o use:
Copilot AI Jan 14, 2026
Incomplete text. Line 8 appears to be missing the first character. Should be 'To use:' instead of 'o use:'.

Suggested change
o use:
To use:


$: uv tool install git+https://github.com/ExpressGradient/laser
$: laser

or
Copilot AI Jan 14, 2026
Missing capital letter. Should be 'Or' instead of 'or' for proper sentence structure.

Suggested change
or
Or

$: uvx --from git+https://github.com/ExpressGradient/laser laser
Comment on lines +6 to +14
Copilot AI Jan 14, 2026
The exit statement on line 6 makes all subsequent documentation (lines 8-14) unreachable and non-functional. If this is intentional to prevent accidental execution, the documentation should be moved above the exit or the exit should be removed.

Suggested change
exit
o use:
$: uv tool install git+https://github.com/ExpressGradient/laser
$: laser
or
$: uvx --from git+https://github.com/ExpressGradient/laser laser
# To use:
#
# $: uv tool install git+https://github.com/ExpressGradient/laser
# $: laser
#
# or
# $: uvx --from git+https://github.com/ExpressGradient/laser laser
exit
