
Feature/epic cocode mthds 1 #71

Open
lchoquel wants to merge 2 commits into feature/Chicago from feature/Epic-cocode-mthds-1

Conversation


@lchoquel lchoquel commented Feb 22, 2026

Note

Medium Risk
Touches core inference configuration (model availability, defaults, and inputs), so misconfigurations could break model selection or pipeline execution across environments.

Overview
Refactors Pipelex inference configuration by removing large editor/assistant rule files (.blackboxrules, .cursor/rules/*, .github/copilot-instructions.md) and restructuring the model deck from a single base_deck.toml into separate decks for LLMs, image generation, and extraction.

Updates backend model catalogs across providers to add a thinking_mode field, expand model input support to include pdf for many LLMs, adjust/extend model lists and token limits (notably new Claude/GPT/Gemini/Mistral entries), and tighten some image-gen prompt rules (e.g., positive_only).

Written by Cursor Bugbot for commit 94541fb.

lchoquel and others added 2 commits February 21, 2026 23:36
…t AI instruction files

Migrate pipeline files from .plx to .mthds extension, restructure the inference deck into separate files (llm, img_gen, extract, custom), update backend configs, and consolidate AI coding instructions into CLAUDE.md by removing .blackboxrules, .cursor/rules, .windsurfrules.md, AGENTS.md, and .github/copilot-instructions.md.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Replace typing.List with builtin list[str] and remove unnecessary
from __future__ import annotations to fix annotation resolution
failure after pipelex update.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
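The second commit's typing change amounts to something like the following sketch (illustrative function and names, not the actual cocode code): drop the `from __future__ import annotations` import and replace `typing.List` with the builtin generic `list[str]`, which is evaluated eagerly and resolves correctly.

```python
# Before (hypothetical illustration of the broken pattern):
#
#   from __future__ import annotations   # deferred annotation evaluation
#   from typing import List              # legacy generic alias
#
#   def collect_paths(paths: List[str]) -> List[str]: ...
#
# After: builtin generics (Python 3.9+), no future import needed, so
# annotations resolve at definition time.
def collect_paths(paths: list[str]) -> list[str]:
    """Return paths deduplicated, preserving first-seen order."""
    seen: set[str] = set()
    # set.add returns None (falsy), so this keeps p only on first sight.
    return [p for p in paths if not (p in seen or seen.add(p))]
```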

@cursor cursor bot left a comment


Cursor Bugbot has reviewed your changes and found 2 potential issues.

Bugbot Autofix is OFF. To automatically fix reported issues with Cloud Agents, enable Autofix in the Cursor dashboard.

best-gemini = "nano-banana-pro"
best-blackforestlabs = "flux-2-pro"

default-general = "flux-2-pro"


Image generation alias references non-existent model

High Severity

The aliases best-blackforestlabs and default-general both point to "flux-2-pro", but no backend configuration file defines a model with that name. The fal.toml backend only defines flux-pro, flux-pro/v1.1, flux-pro/v1.1-ultra, flux-2, and fast-lightning-sdxl. Since default-general is used by the gen-image preset (which is the choice_default), all default image generation will fail to resolve a model. The intended model is likely flux-2.
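If `flux-2` is indeed the intended model (the report suggests it, but this remains an assumption), the fix would look roughly like:

```toml
# Hypothetical corrected aliases: point at a model that the fal.toml
# backend actually defines (flux-2) instead of the undefined flux-2-pro.
best-blackforestlabs = "flux-2"

default-general = "flux-2"
```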


[extract.aliases]
default-premium = "azure-document-intelligence"
default-extract-document = "mistral-document-ai-2505"
default-extract-image = "mistral-document-ai-2505"


Extract alias references non-existent extraction model

High Severity

The aliases default-extract-document and default-extract-image both point to "mistral-document-ai-2505", but no backend defines this model. The Mistral backend only defines mistral-ocr-2503, mistral-ocr-2505, mistral-ocr-2512, and mistral-ocr. Since default-extract-document is the choice_default for extraction, all default document extraction will fail to resolve a model.
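Assuming the 2505-dated OCR model is the intended target (an inference from the date suffix in the broken alias, not something the report states), a sketch of the fix:

```toml
[extract.aliases]
default-premium = "azure-document-intelligence"
# Point the defaults at a model the Mistral backend actually defines;
# mistral-ocr-2505 is assumed here from the "2505" suffix in the old alias.
default-extract-document = "mistral-ocr-2505"
default-extract-image = "mistral-ocr-2505"
```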



@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 94541fb8eb


@@ -28,7 +28,7 @@ description = "Find code files that implement or use elements mentioned in docs"
inputs = { doc_file = "DocumentationFile", repo_map = "RepositoryMap" }
output = "FilePath[]"
model = { model = "llm_for_large_codebase", temperature = 0.1 }


P1: Replace removed model alias with a valid deck reference

This pipe still uses model = "llm_for_large_codebase", but this commit deletes .pipelex/inference/deck/base_deck.toml (where that identifier was defined) and the new deck files (1_llm_deck.toml + x_custom_llm_deck.toml) do not reintroduce it, so model resolution for this step will fail at runtime (the same stale reference is also present in cocode/pipelines/text_utils.mthds).
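One way to address this would be to reintroduce the identifier in the custom deck file. The sketch below is speculative: the section name, file layout, and target model are all assumptions based on the file names mentioned above, not on the actual Pipelex deck schema.

```toml
# Hypothetical entry for x_custom_llm_deck.toml: restore the alias that
# base_deck.toml used to define, mapping it to a long-context model the
# new decks know about. Section name and target model are assumptions.
[llm.aliases]
llm_for_large_codebase = "gemini-2.5-pro"
```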

