I think there is an issue with Trae not being able to return the response into the chat. The same setup works in other clients such as Cherry Studio and Kilocode, but not in the Trae client.
Request log from my LLM API proxy:
"timestamp": "2025-12-30T23:46:03.743Z",
"level": "INFO",
"category": "UPSTREAM_RES",
"status": 200,
"response": {
"content": "Hello! How can I help you with your project today?\n\nI see you're working in the `ccs-dashboard` directory. If you have any specific tasks or questions about the codebase, feel free to let me know! I can help you with:\n\n* **Understanding the code:** Explaining how different parts of the API work.\n* **Fixing bugs:** Identifying and resolving issues in your Node.js application.\n* **Adding features:** Implementing new endpoints or logic.\n* **Refactoring:** Suggesting improvements to make the code cleaner and more efficient.\n\nWhat's on your mind?",
"reasoningContent": null,
"toolCalls": [],
"usage": {
"prompt_tokens": 7184,
"completion_tokens": 134,
"total_tokens": 7318
}
}
},
{
"timestamp": "2025-12-30T23:46:03.743Z",
"level": "INFO",
"category": "RESPONSE",
"status": 200,
"summary": {
"stream": false,
"hasToolCall": false,
"contentLength": 564,
"hasReasoning": false,
"usage": {
"prompt_tokens": 7184,
"completion_tokens": 134,
"total_tokens": 7318
}
}
}
]
The log does contain the expected response content; it's just that Trae isn't able to render that response into the chat.
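For reference, this is a minimal sketch of the non-streaming response body the proxy would have forwarded to the client, assuming it speaks the OpenAI-compatible chat completions format (which the `prompt_tokens`/`completion_tokens` usage fields suggest). The `id`, `created`, and `model` values are placeholders, and the content is shortened; only the usage numbers come from the log above:

```json
{
  "id": "chatcmpl-placeholder",
  "object": "chat.completion",
  "created": 1767138363,
  "model": "placeholder-model",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! How can I help you with your project today? ..."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 7184,
    "completion_tokens": 134,
    "total_tokens": 7318
  }
}
```

Since the log shows `"stream": false`, whatever Trae fails to display is presumably a single non-streaming body of roughly this shape rather than an SSE stream.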