
Conversation


@xingyaoww xingyaoww commented Jan 10, 2026

Summary

This PR implements OAuth PKCE flow for authenticating with OpenAI's ChatGPT service, allowing users with ChatGPT Plus/Pro subscriptions to use Codex models without consuming API credits.

Key features:

  • OAuth PKCE flow with local callback server for secure authentication
  • Credential storage with automatic token refresh (stored in ~/.local/share/openhands/auth/)
  • LLM.subscription_login() classmethod for easy access
  • Support for multiple Codex models via ChatGPT subscription:
    • gpt-5.2-codex
    • gpt-5.2
    • gpt-5.1-codex-max
    • gpt-5.1-codex-mini

Usage:

from openhands.sdk import LLM

# First time: opens browser for OAuth login
llm = LLM.subscription_login(model="gpt-5.2-codex")

# Subsequent calls: reuses cached credentials
llm = LLM.subscription_login(model="gpt-5.2-codex")

# Force fresh login
llm = LLM.subscription_login(model="gpt-5.2-codex", force_login=True)

Implementation inspired by opencode's OAuth flow.

New modules:

  • openhands/sdk/llm/auth/__init__.py - Auth module exports
  • openhands/sdk/llm/auth/credentials.py - Credential storage and retrieval
  • openhands/sdk/llm/auth/openai.py - OpenAI OAuth PKCE flow implementation
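
The PKCE handshake these modules implement can be sketched with the standard library alone. This is an illustrative sketch, not the SDK's actual code (which lives in auth/openai.py); the helper name is hypothetical.

```python
import base64
import hashlib
import secrets

def generate_pkce_pair() -> tuple[str, str]:
    """Generate an RFC 7636 code_verifier and S256 code_challenge.

    Illustrative sketch only; the real implementation may differ.
    """
    # 32 random bytes -> 43-char base64url verifier (padding stripped)
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # challenge = BASE64URL(SHA256(verifier)), again without padding
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge
```

The verifier is kept locally and sent only in the final token exchange; only the challenge appears in the browser-visible authorization URL, which is what makes the flow safe without a client secret.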

Checklist

  • If the PR is changing/adding functionality, are there tests to reflect this?
  • If there is an example, have you run the example to make sure that it works?
  • If there are instructions on how to run the code, have you followed the instructions and made sure that it works?
  • If the feature is significant enough to require documentation, is there a PR open on the OpenHands/docs repository with the same branch name?
  • Is the GitHub CI passing?

Note: This feature requires a ChatGPT Plus/Pro subscription to test the actual OAuth flow. The unit tests cover the credential storage, PKCE generation, URL building, and mock token refresh scenarios.



Agent Server images for this PR

GHCR package: https://github.com/OpenHands/agent-sdk/pkgs/container/agent-server

Variants & Base Images

Variant   Architectures   Base Image
java      amd64, arm64    eclipse-temurin:17-jdk
python    amd64, arm64    nikolaik/python-nodejs:python3.12-nodejs22
golang    amd64, arm64    golang:1.21-bookworm

Pull (multi-arch manifest)

# Each variant is a multi-arch manifest supporting both amd64 and arm64
docker pull ghcr.io/openhands/agent-server:1bfee76-python

Run

docker run -it --rm \
  -p 8000:8000 \
  --name agent-server-1bfee76-python \
  ghcr.io/openhands/agent-server:1bfee76-python

All tags pushed for this build

ghcr.io/openhands/agent-server:1bfee76-golang-amd64
ghcr.io/openhands/agent-server:1bfee76-golang_tag_1.21-bookworm-amd64
ghcr.io/openhands/agent-server:1bfee76-golang-arm64
ghcr.io/openhands/agent-server:1bfee76-golang_tag_1.21-bookworm-arm64
ghcr.io/openhands/agent-server:1bfee76-java-amd64
ghcr.io/openhands/agent-server:1bfee76-eclipse-temurin_tag_17-jdk-amd64
ghcr.io/openhands/agent-server:1bfee76-java-arm64
ghcr.io/openhands/agent-server:1bfee76-eclipse-temurin_tag_17-jdk-arm64
ghcr.io/openhands/agent-server:1bfee76-python-amd64
ghcr.io/openhands/agent-server:1bfee76-nikolaik_s_python-nodejs_tag_python3.12-nodejs22-amd64
ghcr.io/openhands/agent-server:1bfee76-python-arm64
ghcr.io/openhands/agent-server:1bfee76-nikolaik_s_python-nodejs_tag_python3.12-nodejs22-arm64
ghcr.io/openhands/agent-server:1bfee76-golang
ghcr.io/openhands/agent-server:1bfee76-java
ghcr.io/openhands/agent-server:1bfee76-python

About Multi-Architecture Support

  • Each variant tag (e.g., 1bfee76-python) is a multi-arch manifest supporting both amd64 and arm64
  • Docker automatically pulls the correct architecture for your platform
  • Individual architecture tags (e.g., 1bfee76-python-amd64) are also available if needed

Implement OAuth PKCE flow for authenticating with OpenAI's ChatGPT service,
allowing users with ChatGPT Plus/Pro subscriptions to use Codex models
(gpt-5.2-codex, gpt-5.2, gpt-5.1-codex-max, gpt-5.1-codex-mini) without
consuming API credits.

Key features:
- OAuth PKCE flow with local callback server for secure authentication
- Credential storage with automatic token refresh
- LLM.subscription_login() classmethod for easy access
- Support for multiple Codex models via ChatGPT subscription

Usage:
  from openhands.sdk import LLM
  llm = LLM.subscription_login(model='gpt-5.2-codex')

Inspired by opencode's implementation (anomalyco/opencode#7537).

Co-authored-by: openhands <openhands@all-hands.dev>

github-actions bot commented Jan 10, 2026

Coverage

Coverage Report

File                                      Stmts   Miss   Cover   Missing
openhands-sdk/openhands/sdk/llm/
   llm.py                                   438     88     79%   365, 386–387, 423, 600, 701, 703–704, 732, 782, 793–795, 799–803, 811–813, 823–825, 828–829, 833, 835–836, 838, 861–866, 986, 989–992, 1143–1144, 1153, 1166, 1168–1173, 1175–1192, 1195–1199, 1201–1202, 1208–1217, 1268, 1270
openhands-sdk/openhands/sdk/llm/auth/
   credentials.py                            62      1     98%   104
   openai.py                                266    101     62%   196–197, 201–203, 235, 238–239, 241–243, 245–246, 251, 253–257, 281–282, 293–295, 300–301, 310–312, 437–440, 443, 446, 448–449, 451–455, 460–465, 471–475, 481–482, 485–491, 497, 499–501, 503–505, 507–509, 511, 515–516, 519–520, 524, 527, 533–535, 538, 593, 595, 604, 661–662, 666, 669–673, 676–678, 681–682, 698, 737
TOTAL                                    16712   4918    70%

- Use authlib for PKCE generation (generate_token, create_s256_code_challenge)
- Use aiohttp.web for OAuth callback server (cleaner than raw asyncio)
- Add Codex-specific parameters (store=false, instructions) via litellm_extra_body
- Add max_output_tokens=None for Codex (not supported by the API)
- Reduce code from ~575 lines to ~492 lines

This addresses the 'Instructions are not valid' error by properly
configuring the Codex API parameters.

Co-authored-by: openhands <openhands@all-hands.dev>
@xingyaoww

@OpenHands /codereview-roasted and help me fix all the issues and make sure pre-commit run --all-files pass


openhands-ai bot commented Jan 20, 2026

I'm on it! xingyaoww can track my progress at all-hands.dev

- Move time import to module level in credentials.py
- Replace os.uname() with platform.system()/machine() for cross-platform compatibility
- Move urllib.parse.urlencode import to module level in openai.py
- Remove extra blank line before __all__ in auth/__init__.py
- Apply ruff formatting fixes to llm.py

Co-authored-by: openhands <openhands@all-hands.dev>
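
The cross-platform replacement mentioned above can be sketched as follows. The string format and function name are illustrative, not the SDK's exact code; the point is that platform.system()/platform.machine() work on Windows, unlike the POSIX-only os.uname().

```python
import platform

def build_originator() -> str:
    """Build a cross-platform client identifier string.

    os.uname() raises AttributeError on Windows; platform.system() and
    platform.machine() are portable. Format shown here is hypothetical.
    """
    return f"openhands-sdk ({platform.system()}; {platform.machine()})"
```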

openhands-ai bot commented Jan 20, 2026

Summary

I completed the roasted code review and fixed all identified issues. The changes have been pushed to the feat/openai-subscription-auth branch.

Code Review (Linus-Style) 🔴 Needs Improvement

Critical Issues Fixed:

  1. [credentials.py] time module was imported inside methods instead of at module level - violates Python best practices
  2. [openai.py] os.uname() is Unix-only and would break on Windows - replaced with cross-platform platform.system()/platform.machine()
  3. [openai.py] urllib.parse.urlencode was imported inside a function - moved to top-level imports

Style Fixes:
4. [llm.py] Applied ruff formatting fixes for line length consistency

Changes Made (3 files, -6 lines net)

File            Change
credentials.py  Moved import time to module level, removed 2 inline imports
openai.py       Replaced os with platform module, moved urlencode import to top
llm.py          Ruff formatting fixes (auto-applied by pre-commit)

Verification ✅

  • All pre-commit hooks pass
  • All 25 auth tests pass
  • All 9 serialization tests pass

The PR is now updated at: #1682


openhands-agent and others added 4 commits January 20, 2026 02:10
Refactor the inline subscription transport check into a reusable @property
method on the LLM class. This improves code readability and allows the
subscription check to be used elsewhere if needed.

Co-authored-by: openhands <openhands@all-hands.dev>
… heuristics

- Add _is_subscription private attribute to LLM class (default False)
- Update is_subscription property to use the attribute directly
- Set _is_subscription=True in OpenAISubscriptionAuth.create_llm()
- Update tests to explicitly set _is_subscription for subscription tests
- Add example file for subscription login usage

This removes the URL-based heuristics and ensures is_subscription only
returns True for LLMs created via LLM.subscription_login().

Co-authored-by: openhands <openhands@all-hands.dev>
- Hardcode stream=True, temperature=None, max_output_tokens=None in subscription_login
- Skip unsupported parameters in responses_options for subscription mode
- Force parameters to None after LLM init to prevent auto-detection from model info

Fixes API errors: 'Unsupported parameter: max_output_tokens', 'Unsupported parameter: temperature', 'Stream must be set to true'

Co-authored-by: openhands <openhands@all-hands.dev>
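
The parameter filtering described in this commit can be sketched as below. This is a hypothetical standalone version: the constant and function names are illustrative, and the parameter names simply follow the error messages quoted above.

```python
# Parameters the Codex subscription endpoint rejects, per the API errors
# quoted above ("Unsupported parameter: ...", "Stream must be set to true").
UNSUPPORTED_IN_SUBSCRIPTION = {"temperature", "max_output_tokens"}

def responses_options(options: dict, *, subscription: bool) -> dict:
    """Return request options with subscription-incompatible keys removed.

    Sketch only; the SDK's real responses_options handling may differ.
    """
    opts = dict(options)  # avoid mutating the caller's dict
    if subscription:
        for key in UNSUPPORTED_IN_SUBSCRIPTION:
            opts.pop(key, None)
        opts["stream"] = True  # the subscription endpoint requires streaming
    return opts
```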
@xingyaoww

@OpenHands review diff against main, understand the purpose of this PR, then open a new PR in OpenHands/docs repo to document the new feature added in this PR. The doc should appear under sdk/guides/llm_subscriptions.mdx - please follow the format of other LLM provider, keep in mind that OpenAI subscription is the first provider we supported, we will open support for other subscriptions in the future


openhands-ai bot commented Jan 20, 2026

I'm on it! xingyaoww can track my progress at all-hands.dev

Add a Literal type alias SupportedVendor = Literal["openai"] for the
vendor parameter in subscription_login methods. This provides better
type safety and makes it easier to add more vendors in the future.

Changes:
- Define SupportedVendor type in openhands/sdk/llm/auth/openai.py
- Export SupportedVendor from openhands/sdk/llm/auth/__init__.py
- Update vendor parameter type in:
  - LLM.subscription_login()
  - subscription_login()
  - subscription_login_async()

Co-authored-by: openhands <openhands@all-hands.dev>
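
The Literal alias pattern described above can be sketched like this. The alias matches the commit description; the validation helper is a hypothetical addition to show why the pattern helps when new vendors are added.

```python
from typing import Literal, get_args

# Per the commit above: the real definition lives in
# openhands/sdk/llm/auth/openai.py.
SupportedVendor = Literal["openai"]

def validate_vendor(vendor: str) -> SupportedVendor:
    """Fail fast on unsupported vendors (hypothetical helper).

    Adding a vendor later only requires extending the Literal; static
    type checkers then flag every call site that needs updating.
    """
    if vendor not in get_args(SupportedVendor):
        raise ValueError(
            f"Unsupported vendor: {vendor!r}; expected one of {get_args(SupportedVendor)}"
        )
    return vendor  # type: ignore[return-value]
```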
xingyaoww pushed a commit to OpenHands/docs that referenced this pull request Jan 20, 2026
…hentication

This adds documentation for the new subscription_login() feature that allows
users to authenticate with their ChatGPT Plus/Pro subscription to access
Codex models without consuming API credits.

Key features documented:
- OAuth PKCE authentication flow
- Credential caching and automatic token refresh
- Supported models (gpt-5.2-codex, gpt-5.2, gpt-5.1-codex-max, gpt-5.1-codex-mini)
- Configuration options (force_login, open_browser)
- Credential storage location

Related SDK PR: OpenHands/software-agent-sdk#1682

Co-authored-by: openhands <openhands@all-hands.dev>
- Extract subscription-specific message transformation into dedicated methods
  (_transform_for_subscription, _inject_system_prefix)
- Use list comprehension for message normalization
- Consolidate create_llm method in OpenAISubscriptionAuth
- Reduce code duplication and improve readability
- Fix test assertion to match actual originator header value

Co-authored-by: openhands <openhands@all-hands.dev>

openhands-ai bot commented Jan 20, 2026

Summary

I've reviewed the PR diff for OpenHands/software-agent-sdk#1682 (feat/openai-subscription-auth) and created documentation for the new LLM subscription feature in the OpenHands/docs repository.

Checklist

  • ✅ Reviewed the SDK PR diff to understand the new LLM.subscription_login() feature
  • ✅ Created documentation at sdk/guides/llm-subscriptions.mdx following the format of other LLM guides
  • ✅ Updated docs.json to add the new page to the "LLM Features" navigation group (placed first in the list)
  • ✅ Included note that OpenAI is the first provider with more to come in the future
  • ✅ Pushed changes to branch feat/openai-subscription-auth (matching the SDK branch name)
  • ✅ Opened draft PR in OpenHands/docs repo

Changes Made

New file: sdk/guides/llm-subscriptions.mdx (179 lines)

  • Introduction with Info note about future provider support
  • Full code example from the SDK
  • "How It Works" section explaining:
    • subscription_login() usage
    • OAuth PKCE flow steps
    • Automatic token management
  • Supported models table (gpt-5.2-codex, gpt-5.2, gpt-5.1-codex-max, gpt-5.1-codex-mini)
  • Configuration options (force_login, open_browser)
  • Requirements section
  • Credential storage details
  • Next steps links

Updated: docs.json (+1 line)

  • Added sdk/guides/llm-subscriptions to the "LLM Features" navigation group

Pull Request

Documentation PR: OpenHands/docs#267 (draft)


Move _transform_for_subscription and _inject_system_prefix from LLM class
to auth/openai.py as standalone utility functions:
- transform_for_subscription()
- inject_system_prefix()

This improves code organization by keeping subscription-specific logic
in the auth module.

Co-authored-by: openhands <openhands@all-hands.dev>
@xingyaoww xingyaoww marked this pull request as ready for review January 20, 2026 14:47
@xingyaoww xingyaoww requested review from enyst and neubig January 20, 2026 14:47

@all-hands-bot all-hands-bot left a comment


Overall the OAuth PKCE implementation is solid with proper state validation. However, there are several security and platform compatibility issues that should be addressed before merging. See inline comments for details.


except Exception as e:
logger.warning(f"Failed to decode JWT: {e}")
return None

🔴 Critical: Decoding JWT without signature verification is a security risk. While you trust the token came from OpenAI's auth server, if an attacker can intercept the OAuth callback (e.g., through DNS spoofing, malicious proxy, or compromised network), they could inject a manipulated token with a forged account ID.

Recommendation: Use a proper JWT library (e.g., PyJWT) to verify the signature before extracting claims, or at minimum validate the token by making an API call to OpenAI to verify it's legitimate. Even though PKCE prevents token theft, it doesn't prevent token manipulation in transit.

Suggested change
  except Exception as e:
      logger.warning(f"Failed to decode JWT: {e}")
+     # TODO: Consider adding JWT signature verification using PyJWT
+     # to prevent potential token manipulation attacks
      return None


@OpenHands can you address this concern?


I'm on it! xingyaoww can track my progress at all-hands.dev


I've addressed the security concern raised by @all-hands-bot about decoding JWT without signature verification. The changes have been pushed to the PR.

Summary of Changes

Security Fix: The _extract_chatgpt_account_id function now verifies JWT signatures using OpenAI's published JWKS (JSON Web Key Set) before extracting claims. This prevents potential token injection attacks through OAuth callback interception (e.g., DNS spoofing, malicious proxy, or compromised network).

Key Changes:

  1. Added _JWKSCache class (openhands-sdk/openhands/sdk/llm/auth/openai.py):

    • Thread-safe caching of OpenAI's JWKS
    • Fetches keys from https://auth.openai.com/.well-known/jwks.json
    • 1-hour cache TTL to minimize network calls
  2. Updated _extract_chatgpt_account_id function:

    • Now uses authlib.jose.jwt.decode() with the JWKS to verify signatures
    • Validates standard JWT claims (issuer, expiration)
    • Returns None if signature verification fails (graceful degradation)
  3. Added comprehensive tests (tests/sdk/llm/auth/test_openai.py):

    • TestJWTVerification class with tests for valid JWT, invalid signature, JWKS fetch failure, missing account ID, and expired token
    • TestJWKSCache class with tests for cache fetching, caching behavior, and cache clearing
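
To see why the original unverified decode was flagged, note that a JWT payload is just base64url-encoded JSON that anyone can read or forge. A stdlib sketch of that insecure pattern (the claim name used in the test is generic, not necessarily what the SDK extracts):

```python
import base64
import json

def decode_jwt_payload_unverified(token: str) -> dict:
    """Decode a JWT payload WITHOUT signature verification.

    INSECURE by design: this mirrors the pattern the review flagged.
    Anyone can craft a token whose payload decodes cleanly, which is why
    the fix verifies signatures against OpenAI's published JWKS first.
    """
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))
```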

Checklist:

  • ✅ Security concern addressed - JWT signatures are now verified
  • ✅ Uses existing authlib dependency (no new dependencies added)
  • ✅ All 22 tests pass
  • ✅ Type checks pass (pyright)
  • ✅ Changes are minimal and focused on the security fix
  • ✅ Pushed to feat/openai-subscription-auth branch


Comment on lines +94 to +96
creds_file = self._get_credentials_file(credentials.vendor)
with open(creds_file, "w") as f:
json.dump(credentials.model_dump(), f, indent=2)

🟠 Important: OAuth tokens (access_token and refresh_token) are stored in plaintext JSON. While file permissions help on Unix systems, this is still a security concern. If an attacker gains file system access, they can steal the tokens.

Recommendation: Consider encrypting the tokens at rest using a key derived from system credentials or keyring integration (e.g., keyring package). For now, at minimum document this security limitation in the docstring.
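
Short of full encryption at rest, owner-only permissions on both directory and file are the usual baseline on POSIX. A minimal sketch of that pattern (function name and JSON fields are illustrative, not the SDK's actual credentials.py):

```python
import json
import os
from pathlib import Path

def save_credentials(creds_dir: Path, vendor: str, data: dict) -> Path:
    """Write credentials as JSON readable only by the owner (POSIX).

    Sketch of the pattern discussed above. Note the tokens are still
    plaintext on disk; chmod only limits which local users can read them.
    """
    creds_dir.mkdir(parents=True, exist_ok=True)
    if os.name != "nt":
        creds_dir.chmod(0o700)  # rwx------ on the directory
    creds_file = creds_dir / f"{vendor}.json"
    creds_file.write_text(json.dumps(data, indent=2))
    if os.name != "nt":
        creds_file.chmod(0o600)  # rw------- on the file
    return creds_file
```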

site = web.TCPSite(runner, "localhost", self._oauth_port)

try:
await site.start()

🟠 Important: No error handling for port conflicts. If port 1455 is already in use, site.start() will raise an exception and the OAuth flow fails without a helpful message.

Recommendation: Add port conflict handling:

Suggested change
- await site.start()
+ try:
+     await site.start()
+ except OSError as e:
+     if "address already in use" in str(e).lower():
+         raise RuntimeError(
+             f"OAuth callback server port {self._oauth_port} is already in use. "
+             f"Please close the conflicting application or specify a different port."
+         ) from e
+     raise
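
A complementary option is to probe the port before starting the callback server, so the user gets an actionable error up front. A stdlib sketch (there is an inherent race between the check and the actual bind, so the exception handling above is still needed):

```python
import socket

def port_is_free(port: int, host: str = "localhost") -> bool:
    """Return True if the OAuth callback port can currently be bound.

    Best-effort pre-check only: another process can still grab the port
    between this check and the real bind.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False
```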

Comment on lines +142 to +144
if not response.is_success:
raise RuntimeError(f"Token exchange failed: {response.status_code}")
return response.json()

🟡 Suggestion: Include the response body in error messages for better debugging. OAuth providers often return helpful error details in the response body.

Suggested change
  if not response.is_success:
-     raise RuntimeError(f"Token exchange failed: {response.status_code}")
+     error_detail = response.text if response.text else "No details"
+     raise RuntimeError(
+         f"Token exchange failed: {response.status_code} - {error_detail}"
+     )
  return response.json()

Comment on lines +159 to +161
if not response.is_success:
raise RuntimeError(f"Token refresh failed: {response.status_code}")
return response.json()

🟡 Suggestion: Same as above - include response body in error message.

Suggested change
  if not response.is_success:
-     raise RuntimeError(f"Token refresh failed: {response.status_code}")
+     error_detail = response.text if response.text else "No details"
+     raise RuntimeError(
+         f"Token refresh failed: {response.status_code} - {error_detail}"
+     )
  return response.json()

) -> dict[str, Any]:
"""Exchange authorization code for tokens."""
async with AsyncClient() as client:
response = await client.post(

🟡 Suggestion: Add an explicit timeout to HTTP requests to prevent hanging indefinitely if the OAuth server is slow or unresponsive.

Suggested change
- async with AsyncClient() as client:
+ async with AsyncClient(timeout=30.0) as client:

logger = get_logger(__name__)

# OAuth configuration for OpenAI Codex
CLIENT_ID = "app_EMoamEEZ73f0CkXaXp7hrann"

🟢 Nit: Document that this is a public client ID and why it's safe to hardcode it here. This will help future reviewers understand this is intentional.

Suggested change
  # OAuth configuration for OpenAI Codex
+ # This is a public client ID for OpenAI's OAuth flow (safe to commit)
  CLIENT_ID = "app_EMoamEEZ73f0CkXaXp7hrann"

@property
def credentials_dir(self) -> Path:
"""Get the credentials directory, creating it if necessary."""
self._credentials_dir.mkdir(parents=True, exist_ok=True)

🟡 Suggestion: Also set restrictive permissions on the directory itself, not just the files within it.

Suggested change
  self._credentials_dir.mkdir(parents=True, exist_ok=True)
+ # Set directory permissions to owner-only (rwx------)
+ if os.name != "nt":
+     self._credentials_dir.chmod(0o700)
  return self._credentials_dir

xingyaoww and others added 3 commits January 20, 2026 23:06
Co-authored-by: OpenHands Bot <contact@all-hands.dev>
Co-authored-by: OpenHands Bot <contact@all-hands.dev>
Address security concern about decoding JWT without signature verification.
Now verifies JWT signatures using OpenAI's published JWKS before extracting
claims, preventing potential token injection attacks through OAuth callback
interception.

Changes:
- Add _JWKSCache class for thread-safe caching of OpenAI's JWKS
- Update _extract_chatgpt_account_id to verify JWT signature using authlib
- Add comprehensive tests for JWT verification and JWKS caching
- Import authlib.jose for JWT/JWKS handling

Co-authored-by: openhands <openhands@all-hands.dev>
@xingyaoww

@OpenHands please implement the following for me:

(1) a short notice that’s hard to miss, (2) a longer explainer the first time (or behind --help / --verbose), and (3) a clear “continue = accept” moment.

Minimal banner (print every time login starts)

Signing in with ChatGPT uses your ChatGPT account. By continuing, you confirm you are a ChatGPT End User and are subject to OpenAI’s Terms of Use.
https://openai.com/policies/terms-of-use/

First-time / verbose disclaimer (print once per machine or per config profile)

Important info about Sign in with ChatGPT:

• Data rights & privacy: Content sent through this Sign in with ChatGPT flow (“Sign in with ChatGPT Content”) is handled by OpenAI as an independent controller and is not subject to our DPA or our app’s training/personalization settings. If you copy content to ChatGPT/Codex through this flow, it remains in your ChatGPT account under OpenAI’s Terms of Use.
• Prohibited data: Do not use this flow to process Protected Health Information (PHI) or personal data of children under 13 (or the applicable age of digital consent).
Continue only if you understand and agree.

@OpenHands please fix the failing actions on PR #1682 at branch feat/openai-subscription-auth


openhands-ai bot commented Jan 20, 2026

I'm on it! xingyaoww can track my progress at all-hands.dev

- Add minimal banner shown every time login starts with Terms of Use link
- Add verbose disclaimer shown on first login or with verbose flag
- Add 'continue = accept' confirmation prompt before proceeding
- Store consent acknowledgment marker in credentials directory
- Handle non-interactive mode: require prior consent or raise error
- Fix undefined 'logger' in credentials.py (use warnings.warn instead)
- Add comprehensive tests for consent banner system

Co-authored-by: openhands <openhands@all-hands.dev>

openhands-ai bot commented Jan 20, 2026

I've successfully implemented the consent banner system for ChatGPT sign-in and fixed the failing pre-commit checks. The changes have been pushed to the feat/openai-subscription-auth branch.

Summary of Changes

1. Fixed Pre-commit Failure

  • credentials.py: Fixed undefined logger error by replacing it with warnings.warn() for the Windows file permissions warning

2. Implemented Consent Banner System (as requested)

(1) Minimal banner (shown every time login starts):

Signing in with ChatGPT uses your ChatGPT account. By continuing, you confirm you are a ChatGPT End User and are subject to OpenAI's Terms of Use.
https://openai.com/policies/terms-of-use/

(2) First-time/verbose disclaimer (shown once per machine or with verbose=True):

Important info about Sign in with ChatGPT:

• Data rights & privacy: Content sent through this Sign in with ChatGPT flow ("Sign in with ChatGPT Content") is handled by OpenAI as an independent controller and is not subject to our DPA or our app's training/personalization settings. If you copy content to ChatGPT/Codex through this flow, it remains in your ChatGPT account under OpenAI's Terms of Use.
• Prohibited data: Do not use this flow to process Protected Health Information (PHI) or personal data of children under 13 (or the applicable age of digital consent).
Continue only if you understand and agree.

(3) Clear "continue = accept" moment:

  • Interactive prompt: Do you want to continue? [y/N]:
  • User must type y or yes to proceed
  • Consent acknowledgment is stored in ~/.local/share/openhands/auth/.chatgpt_consent_acknowledged

3. New API Parameters

Added to subscription_login() and subscription_login_async():

  • verbose: bool = False - Always show full disclaimer
  • skip_consent: bool = False - Skip consent prompt (for programmatic use)

4. Non-interactive Mode Handling

  • If consent was previously acknowledged: proceeds automatically
  • If first time in non-interactive mode: raises RuntimeError with clear message

5. Tests

Added 14 comprehensive tests for the consent banner system covering:

  • Banner and disclaimer content validation
  • Consent marker path and acknowledgment
  • User acceptance/decline flows
  • First-time vs subsequent behavior
  • Verbose flag behavior
  • Non-interactive mode handling
  • Keyboard interrupt and EOF handling

All 36 tests pass and all pre-commit checks pass.
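
The consent gate described above can be sketched as follows. Marker filename, prompt wording, and function signature are illustrative, not the SDK's exact implementation; only the behavior (banner every time, persisted acknowledgment, explicit "continue = accept", error in non-interactive first runs) follows the summary.

```python
from pathlib import Path

BANNER = (
    "Signing in with ChatGPT uses your ChatGPT account. By continuing, you "
    "confirm you are a ChatGPT End User and are subject to OpenAI's Terms of Use.\n"
    "https://openai.com/policies/terms-of-use/"
)

def check_consent(auth_dir: Path, *, interactive: bool = True,
                  verbose: bool = False, ask=input) -> bool:
    """Return True once the user has accepted; persist a marker file.

    Hypothetical sketch of the flow described above.
    """
    marker = auth_dir / ".chatgpt_consent_acknowledged"
    if marker.exists() and not verbose:
        return True  # previously acknowledged: proceed silently
    if not interactive:
        raise RuntimeError(
            "First-time login requires interactive consent to OpenAI's Terms of Use."
        )
    print(BANNER)
    if ask("Do you want to continue? [y/N]: ").strip().lower() in ("y", "yes"):
        auth_dir.mkdir(parents=True, exist_ok=True)
        marker.touch()  # remember acceptance for future runs
        return True
    return False
```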


neubig
neubig previously approved these changes Jan 23, 2026
@neubig neubig dismissed their stale review January 23, 2026 16:23

sorry, accidentally pressed button

@neubig neubig self-requested a review January 23, 2026 16:31

@neubig neubig left a comment


Basically LGTM, a few comments!

expires_in=tokens.get("expires_in", 3600),
)
return updated
except Exception as e:

Maybe it'd be better to not catch this exception and allow it to propagate and handle it appropriately in the app code?


Agreed

# Message transformation utilities for subscription mode
# =========================================================================

DEFAULT_SUBSCRIPTION_INSTRUCTIONS = (

The message and the variable name seem to not match very well. Maybe "DEFAULT_SYSTEM_MESSAGE"?

Comment on lines 1222 to 1223
vendor: SupportedVendor = "openai",
model: str = "gpt-5.2-codex",

I think we should just remove the defaults here.


One thing I'm a bit worried about is that because all of these are mocked, I don't know if we'll be able to actually detect if any of this breaks. It'd be worth thinking about how we can do integration tests.


Yeah, this is actually tricky, because you will need to go through an OAuth flow, so it'd be pretty hard to do it in CI. @OpenHands please remove the useless tests and open an issue to track the need to add integration tests for OAuth

@xingyaoww

@OpenHands please address comments from @neubig and also take my follow up comment under it into consideration


openhands-ai bot commented Jan 23, 2026

I'm on it! xingyaoww can track my progress at all-hands.dev

- Let exception propagate in refresh_if_needed() instead of catching it
  (allows app code to handle errors appropriately)
- Rename DEFAULT_SUBSCRIPTION_INSTRUCTIONS to DEFAULT_SYSTEM_MESSAGE
  (better matches the variable's purpose)
- Remove default values for vendor and model in subscription_login()
  (requires explicit specification)
- Remove heavily mocked JWT/JWKS tests that can't detect real breakages
- Update example to use explicit vendor parameter
- Created issue #1806 to track integration test requirements

Co-authored-by: openhands <openhands@all-hands.dev>

openhands-ai bot commented Jan 23, 2026

Summary

I've addressed all the review comments from @neubig and @xingyaoww's follow-up on PR #1682. The changes have been pushed to the feat/openai-subscription-auth branch.

Changes Made

Comment Action Taken
@neubig (line 418): "Maybe it'd be better to not catch this exception and allow it to propagate" ✅ Removed the try/except block in refresh_if_needed() - exceptions now propagate to allow app code to handle them appropriately
@neubig (line 717): "The message and the variable name seem to not match very well. Maybe 'DEFAULT_SYSTEM_MESSAGE'?" ✅ Renamed DEFAULT_SUBSCRIPTION_INSTRUCTIONS to DEFAULT_SYSTEM_MESSAGE
@neubig (line 1223): "I think we should just remove the defaults here" ✅ Removed default values for vendor and model parameters in subscription_login() - now requires explicit specification
@neubig (test_openai.py): "Because all of these are mocked, I don't know if we'll be able to actually detect if any of this breaks" ✅ Removed heavily mocked TestJWTVerification and TestJWKSCache test classes (8 tests removed)
@xingyaoww follow-up: "Remove the useless tests and open an issue to track the need to add integration tests for oauth" ✅ Created GitHub issue #1806 to track integration test requirements

Files Modified

  • openhands-sdk/openhands/sdk/llm/auth/openai.py - Exception propagation, variable rename
  • openhands-sdk/openhands/sdk/llm/llm.py - Removed default parameter values
  • tests/sdk/llm/auth/test_openai.py - Removed 8 heavily mocked tests (169 lines removed)
  • examples/01_standalone_sdk/34_subscription_login.py - Updated to use explicit vendor parameter

Verification

  • ✅ All pre-commit checks pass
  • ✅ All 28 remaining tests pass
  • ✅ Changes pushed to feat/openai-subscription-auth branch

Related



openhands-ai bot commented Jan 23, 2026

Looks like there are a few issues preventing this PR from being merged!

  • GitHub Actions are failing:
    • Check duplicate example numbers
    • [Optional] Docs example

If you'd like me to help, just leave a comment, like

@OpenHands please fix the failing actions on PR #1682 at branch `feat/openai-subscription-auth`

Feel free to include any additional details that might help me get this PR into a better state.
