Conversation

@devin-ai-integration
Contributor

Summary

Adds 6 tests for batch=True support in image generation, complementing the backend changes in vlm-lab#1586. Updates the mock ImagePredictions.generate() to return status="pending" when batch=True is passed (matching the real API behavior), and adds test cases covering file paths, URLs, multiple images, PIL objects, explicit batch=False, and the batch→wait polling flow.
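The mock contract described above can be sketched as follows. This is an illustrative stand-in only, assuming the names `MockImagePredictions`, `MockPrediction`, and the `status`/`batch` fields from the description; the real `ImagePredictions.generate()` in `vlmrun/client/predictions.py` may differ:

```python
from dataclasses import dataclass, field


@dataclass
class MockPrediction:
    """Hypothetical stand-in for the client's prediction object."""
    id: str
    status: str
    response: dict = field(default_factory=dict)


class MockImagePredictions:
    """Illustrative mock: batch=True yields a pending prediction,
    while the default batch=False completes synchronously."""

    def generate(self, images, domain="image.generation", batch=False):
        if batch:
            # Batch requests are queued server-side; the caller polls later.
            return MockPrediction(id="pred_123", status="pending")
        return MockPrediction(
            id="pred_123", status="completed", response={"images": images}
        )
```

With this shape, each of the six tests can assert on `status` directly after calling `generate()` with the relevant input type (file path, URL, list, PIL object).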

Review & Testing Checklist for Human

  • These are mock-only tests: they validate the mock's contract (batch=True → pending response) but do not exercise the real ImagePredictions.generate() in vlmrun/client/predictions.py. Consider whether an integration test against the live API (after vlm-lab#1586 is deployed) is needed to verify the full round-trip.
  • test_image_generate_batch_then_wait: The mock's Prediction.wait() always returns completed immediately — it doesn't simulate a pending→completed transition. Verify this is acceptable or if a more realistic mock (e.g., returning pending on first call, completed on second) would be more valuable.
  • Run make test locally to confirm all 224 tests pass (6 new + 218 existing).
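If the more realistic mock suggested in the second checklist item is preferred, one possible sketch (class and method names here are assumptions, not the repository's actual code) is a `wait()` that reports pending on the first poll and completed afterwards:

```python
class PollingMockPrediction:
    """Illustrative alternative mock: the first wait() still sees the
    batch job in flight; subsequent calls see it resolved."""

    def __init__(self, id):
        self.id = id
        self.status = "pending"
        self._polls = 0

    def wait(self, timeout=60):
        # Simulate exactly one in-flight poll before the job resolves.
        self._polls += 1
        if self._polls >= 2:
            self.status = "completed"
        return self
```

This would let test_image_generate_batch_then_wait assert the pending→completed transition rather than an immediate completion.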

Notes

  • Link to Devin run
  • Requested by: @spillai
  • Pre-existing black formatting failure on vlmrun/cli/_cli/chat.py is unrelated to these changes.

- Update mock ImagePredictions.generate() to return pending status for batch=True
- Add test_image_generate_batch_with_file
- Add test_image_generate_batch_with_url
- Add test_image_generate_batch_with_multiple_images
- Add test_image_generate_batch_with_pil_images
- Add test_image_generate_non_batch_still_works
- Add test_image_generate_batch_then_wait (batch + polling)

Co-Authored-By: Sudeep Pillai <sudeep.pillai@gmail.com>
@devin-ai-integration
Contributor Author

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring
