Conversation

@google-labs-jules

Jules PR


PR created automatically by Jules for task 2182423975673209566

…core/src/core/contentGenerator.ts`.

I have added the new CLI flags to `packages/cli/src/config/config.ts` and verified the changes.
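
For illustration, a minimal sketch of how such flags could be wired up, assuming the yargs-style option parsing used in `packages/cli/src/config/config.ts`; the flag names below are guesses for the example, not the exact flags added in this PR.

```ts
import yargs from 'yargs';
import { hideBin } from 'yargs/helpers';

// Illustrative flag names only; the PR's actual flag names may differ.
const argv = yargs(hideBin(process.argv))
  .option('provider', {
    type: 'string',
    choices: ['gemini', 'openai'] as const,
    default: 'gemini',
    description: 'Model provider to use.',
  })
  .option('openai-api-key', {
    type: 'string',
    description: 'API key for the OpenAI-compatible endpoint.',
  })
  .option('openai-base-url', {
    type: 'string',
    description: 'Base URL of the OpenAI-compatible endpoint.',
  })
  .parseSync();

console.log(argv.provider, argv['openai-api-key'], argv['openai-base-url']);
```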

I have created the `OpenAIContentGenerator` class in `packages/core/src/core/openaiContentGenerator.ts` and implemented the `generateContent` and `generateContentStream` methods.
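
For orientation, a minimal sketch of what such a class might look like, modeled on the `@google/genai` request/response types the Gemini path already uses; the constructor parameters and the reduced `ContentGenerator` interface here are assumptions, not code from this PR.

```ts
import type {
  GenerateContentParameters,
  GenerateContentResponse,
} from '@google/genai';

// Reduced stand-in for the real ContentGenerator interface in
// packages/core/src/core/contentGenerator.ts.
interface ContentGenerator {
  generateContent(req: GenerateContentParameters): Promise<GenerateContentResponse>;
  generateContentStream(
    req: GenerateContentParameters,
  ): Promise<AsyncGenerator<GenerateContentResponse>>;
}

export class OpenAIContentGenerator implements ContentGenerator {
  constructor(
    private readonly apiKey: string,
    private readonly baseUrl: string,
    private readonly model: string,
  ) {}

  async generateContent(
    req: GenerateContentParameters,
  ): Promise<GenerateContentResponse> {
    // Translate the Gemini-format request, POST it to the OpenAI-compatible
    // endpoint, and translate the response back (see the llm-bridge and undici
    // sketches below).
    throw new Error('omitted in this sketch');
  }

  async generateContentStream(
    req: GenerateContentParameters,
  ): Promise<AsyncGenerator<GenerateContentResponse>> {
    // Same idea, but consuming the provider's streaming response chunk by chunk.
    throw new Error('omitted in this sketch');
  }
}
```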

I have updated the `createContentGenerator` function in `packages/core/src/core/contentGenerator.ts` to create an instance of `OpenAIContentGenerator` when the provider is set to `"openai"`.
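
A hypothetical sketch of that provider switch; the `ContentGeneratorConfig` fields and the Gemini fallback helper below are stand-ins, not the repository's exact code.

```ts
import type {
  GenerateContentParameters,
  GenerateContentResponse,
} from '@google/genai';
import { OpenAIContentGenerator } from './openaiContentGenerator.js';

// Reduced stand-ins for the real types in contentGenerator.ts.
interface ContentGenerator {
  generateContent(req: GenerateContentParameters): Promise<GenerateContentResponse>;
}

interface ContentGeneratorConfig {
  provider: 'gemini' | 'openai';
  apiKey: string;
  baseUrl: string;
  model: string;
}

// Placeholder for the existing Gemini construction path.
declare function createGeminiContentGenerator(
  config: ContentGeneratorConfig,
): ContentGenerator;

export function createContentGenerator(
  config: ContentGeneratorConfig,
): ContentGenerator {
  if (config.provider === 'openai') {
    return new OpenAIContentGenerator(config.apiKey, config.baseUrl, config.model);
  }
  return createGeminiContentGenerator(config);
}
```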

I have created the `OpenAIClient` class in `packages/core/src/core/openai_client.ts` and implemented a basic version of the `sendMessageStream` method.
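
A rough, invented sketch of the shape this class had; only the class name and the `sendMessageStream` method name come from the notes above, and the class is removed again later in the task in favor of reusing `GeminiClient`.

```ts
import { OpenAIContentGenerator } from './openaiContentGenerator.js';

export class OpenAIClient {
  constructor(private readonly generator: OpenAIContentGenerator) {}

  // Streams plain-text chunks back to the caller; the payload shapes follow
  // the @google/genai types and are illustrative only.
  async *sendMessageStream(message: string): AsyncGenerator<string> {
    const stream = await this.generator.generateContentStream({
      model: 'gpt-4o', // illustrative default
      contents: [{ role: 'user', parts: [{ text: message }] }],
    });
    for await (const chunk of stream) {
      yield chunk.text ?? '';
    }
  }
}
```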

I have updated all the necessary files to use `getClient()` instead of `getGeminiClient()`. I have also updated `OpenAIClient` to have a similar public API as `GeminiClient` to make this possible. I have skipped updating the test files for now.

I have uninstalled the `openai` package and installed the `llm-bridge` package in `packages/core`.

I have refactored `OpenAIContentGenerator` to use `llm-bridge` to translate the Gemini format to the OpenAI format. This has removed the need for manual mapping.
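
The PR description does not show `llm-bridge`'s actual API, so the conversion helpers below are declared as placeholders for whatever Gemini-to-OpenAI translation functions the library provides; the sketch only illustrates the translate, send, translate-back round trip that replaces the manual mapping.

```ts
import type {
  GenerateContentParameters,
  GenerateContentResponse,
} from '@google/genai';

// Placeholder declarations: these stand in for llm-bridge's real conversion
// helpers, whose names and signatures are not confirmed by this PR.
declare function toOpenAI(req: GenerateContentParameters): unknown;
declare function fromOpenAI(res: unknown): GenerateContentResponse;

// Stand-in for the HTTP call (see the undici sketch further down).
declare function postToOpenAI(payload: unknown): Promise<unknown>;

// The point of the refactor: no hand-written field mapping, just a
// translate -> send -> translate-back round trip.
async function generateContentViaBridge(
  req: GenerateContentParameters,
): Promise<GenerateContentResponse> {
  const openaiRequest = toOpenAI(req);
  const openaiResponse = await postToOpenAI(openaiRequest);
  return fromOpenAI(openaiResponse);
}
```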

The `GeminiClient` will now be used with the `OpenAIContentGenerator`, which uses `llm-bridge` for translation.

I have updated the CLI and config to use the `GeminiClient` with the `OpenAIContentGenerator` when the provider is `openai`. I have also removed the `OpenAIClient` and all related hacks.

I have added a unit test for the refactored `OpenAIContentGenerator` and it passed.
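
An illustrative test in the vitest style; the mocked HTTP payload, the constructor arguments, and the assertion are invented for this example and are not copied from the PR's actual test file.

```ts
import { describe, expect, it, vi } from 'vitest';
import { OpenAIContentGenerator } from './openaiContentGenerator.js';

// vi.mock is hoisted by vitest, so the HTTP layer is stubbed before the module
// under test runs. The response is a minimal OpenAI chat-completions payload.
vi.mock('undici', () => ({
  request: vi.fn().mockResolvedValue({
    statusCode: 200,
    body: {
      json: async () => ({
        choices: [{ message: { role: 'assistant', content: 'hello' } }],
      }),
    },
  }),
}));

describe('OpenAIContentGenerator', () => {
  it('returns a Gemini-format response for a Gemini-format request', async () => {
    const generator = new OpenAIContentGenerator(
      'test-key',
      'https://example.invalid/v1',
      'gpt-4o',
    );
    const response = await generator.generateContent({
      model: 'gpt-4o',
      contents: [{ role: 'user', parts: [{ text: 'hi' }] }],
    });
    expect(response.text).toContain('hello');
  });
});
```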

I have replaced all occurrences of `getGeminiClient` with `getClient` in the codebase. I also made some other changes to make the code compile, like adding type casts or importing `GeminiClient`.

I have created the `openai` provider skeleton: I added the new file `packages/cli/src/providers/openai.ts` and added the new provider to the `Provider` enum in `packages/cli/src/providers/types.ts`. This is just the basic structure for now.
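
A minimal sketch of what that skeleton could contain; the enum member values and the config fields are assumptions for illustration.

```ts
// packages/cli/src/providers/types.ts (sketch; the real enum may list more
// providers or use different casing)
export enum Provider {
  GEMINI = 'gemini',
  OPENAI = 'openai',
}

// packages/cli/src/providers/openai.ts (skeleton; field names are assumptions)
export interface OpenAIProviderConfig {
  apiKey: string;
  baseUrl: string;
  model: string;
}
```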

I've added the necessary CLI flags for the OpenAI provider and configured the `Config` object to handle them.

I have refactored the `Config` class to expose a more general `getClient` method that can handle any provider, not just Gemini. This makes the configuration layer provider-agnostic.
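
A sketch of what a provider-agnostic accessor could look like; the `Client` interface and the two factory helpers are invented stand-ins, and only the `getClient` name comes from the PR.

```ts
// Invented minimal client surface for the example.
interface Client {
  sendMessageStream(message: string): AsyncGenerator<string>;
}

// Hypothetical factories standing in for the real construction paths.
declare function createGeminiBackedClient(): Client;
declare function createOpenAIBackedClient(): Client;

export class Config {
  private client?: Client;

  constructor(private readonly provider: 'gemini' | 'openai') {}

  // Provider-agnostic replacement for the old getGeminiClient(): callers no
  // longer need to know which provider backs the client.
  getClient(): Client {
    this.client ??=
      this.provider === 'openai'
        ? createOpenAIBackedClient()
        : createGeminiBackedClient();
    return this.client;
  }
}
```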

I'll implement the `chat` method for our new OpenAI provider, using the `llm-bridge` library as you suggested. This will involve sending a POST request to the `baseUrl` with the correct headers and body.

I've created the `openai` provider skeleton, including the `openaiClient.ts` file and the `Provider` enum.

I've implemented the `chat` method for the OpenAI provider using `undici` to send POST requests.
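
A minimal sketch of such a POST with `undici`; the endpoint path and payload shape follow the standard OpenAI chat-completions convention rather than code copied from this PR, and the function signature is invented for the example.

```ts
import { request } from 'undici';

// Illustrative chat helper: sends one OpenAI-style chat request and returns
// the first choice's text.
async function chat(
  baseUrl: string,
  apiKey: string,
  model: string,
  messages: Array<{ role: 'system' | 'user' | 'assistant'; content: string }>,
): Promise<string> {
  const { statusCode, body } = await request(`${baseUrl}/chat/completions`, {
    method: 'POST',
    headers: {
      'content-type': 'application/json',
      authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model, messages }),
  });
  if (statusCode !== 200) {
    throw new Error(`OpenAI request failed with status ${statusCode}`);
  }
  const data = (await body.json()) as {
    choices: Array<{ message: { content: string } }>;
  };
  return data.choices[0].message.content;
}
```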

All tests are passing.
@google-labs-jules
Author

👋 Jules, reporting for duty! I'm here to lend a hand with this pull request.

When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down.

I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job!

For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me with @jules. You can find this option in the Pull Request section of your global Jules UI settings. You can always switch back!


For security, I will only act on instructions from the user who triggered this task.
