Extend the OpenAI API spec to include reasoning content blocks in chat completion responses (OpenAI-compatible inference endpoint) #3578
d8rt8v started this conversation in Feature Requests
Replies: 2 comments
- Hi @d8rt8v - the OpenAI spec doesn't have fields for raw reasoning, so we currently don't return it in the OpenAI-compatible endpoint. You'll have to use the standard
- Hi @d8rt8v @GabrielBianconi, please check my PR #3746
Hello, when making a curl request to an OpenAI-compatible endpoint like /openai/v1/chat/completions, I only receive the final assistant message. However, with the /inference endpoint, I'm able to see the model's reasoning traces.
Is there any way to enable reasoning output when using the /chat/completions endpoint, or is this strictly not supported?
Thanks!
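For illustration, here is a minimal Python sketch of the shape difference being described. The block field names (`type`, `thought`, `text`) are assumptions for the sake of the example, not the endpoint's documented schema; the point is only that the OpenAI chat-completions response format has a single `message.content` string and no slot for raw reasoning blocks:

```python
import json

# Hypothetical /inference-style response: content is a list of typed
# blocks, some of which carry reasoning (field names are assumed).
inference_response = json.loads("""
{
  "content": [
    {"type": "thought", "text": "First, consider the units..."},
    {"type": "text", "text": "The answer is 42."}
  ]
}
""")

# OpenAI-style /chat/completions response: only the final assistant
# message is present, with no field for raw reasoning.
chat_response = json.loads("""
{
  "choices": [
    {"message": {"role": "assistant", "content": "The answer is 42."}}
  ]
}
""")

# Reasoning is recoverable only from the first shape.
thoughts = [b["text"] for b in inference_response["content"]
            if b["type"] == "thought"]
print(thoughts)
print(chat_response["choices"][0]["message"]["content"])
```

This is why the feature request asks to extend the spec: until the OpenAI-compatible response carries reasoning content blocks, clients that need reasoning traces must call the native /inference endpoint instead.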