OpenAI-compatible stream API sends one extra empty chunk after the finish chunk. #4955
joneepenk started this conversation in Bug Reports
Replies: 1 comment 2 replies
Hi @joneepenk - could you please share a full response from the Qwen API and the corresponding stream from TensorZero, so we can more easily debug what's causing this? I don't have the Qwen API set up on my end, so I can't reproduce it. I believe this last chunk is because of usage: TensorZero always requests it for observability, but since you didn't request it, you get a nominal chunk. More broadly, does this present an issue for your application? How? Thanks
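For context: on OpenAI-compatible streaming APIs, a client that wants token usage sets `stream_options: {"include_usage": true}`, and the server then emits one final chunk with an empty `choices` array and a populated `usage` object. A minimal sketch of handling that trailing chunk, assuming chunks are plain decoded dicts (the function name is illustrative, not from TensorZero):

```python
def split_usage(chunks):
    """Split a chunk stream into content chunks and the trailing usage chunk.

    When stream_options.include_usage is set, the final chunk arrives with an
    empty `choices` list and carries the `usage` object; all other chunks are
    treated as content.
    """
    content, usage = [], None
    for chunk in chunks:
        if not chunk.get("choices") and chunk.get("usage"):
            usage = chunk["usage"]
        else:
            content.append(chunk)
    return content, usage
```

A client that did not ask for usage can simply drop any chunk whose `choices` list is empty.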
Calling Alibaba Cloud's Qwen API (https://dashscope.aliyuncs.com/compatible-mode/v1) proxied through t0, with code like the following:
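(The original code sample was lost from this report; below is a hypothetical reconstruction of how such a streaming call typically looks. The base URL, endpoint path, API key, and model name are assumptions, not taken from the report.)

```python
# Sketch: stream a chat completion from an OpenAI-compatible endpoint
# using only the standard library. All connection details are assumptions.
import json
import urllib.request


def parse_sse_lines(lines):
    """Yield parsed chunk dicts from OpenAI-style SSE 'data: ...' lines."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and SSE comments
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            return  # OpenAI-compatible streams end with a [DONE] sentinel
        yield json.loads(payload)


def stream_chat(base_url, api_key, model, messages):
    """POST a streaming chat completion and yield each parsed chunk."""
    body = json.dumps({"model": model, "messages": messages, "stream": True})
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body.encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        yield from parse_sse_lines(raw.decode("utf-8") for raw in resp)


# Example (requires a running TensorZero gateway; URL and model are guesses):
# for chunk in stream_chat("http://localhost:3000/openai/v1", "dummy-key",
#                          "qwen3-max", [{"role": "user", "content": "hi"}]):
#     print(chunk)
```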
After the Qwen API returned a finish chunk and t0 forwarded it, t0 sent an extra empty chunk to the client:
Finish chunk from Qwen, forwarded by t0:

```json
{"id":"chatcmpl-f95c8cd3-fb95-445e-9440-3d48640609d7","choices":[{"delta":{"content":"","function_call":null,"refusal":null,"role":null,"tool_calls":null},"finish_reason":"stop","index":0,"logprobs":null}],"created":1764762469,"model":"qwen3-max","object":"chat.completion.chunk","service_tier":null,"system_fingerprint":null,"usage":null}
```

Extra empty chunk then sent by t0:

```json
{"id":"019ae40c-ea36-70b0-aa98-c9f80ff54c03","choices":[{"delta":{"content":null,"function_call":null,"refusal":null,"role":null,"tool_calls":null},"finish_reason":null,"index":0,"logprobs":null}],"created":1764762644,"model":"tensorzero::model_name::qwen3-max","object":"chat.completion.chunk","service_tier":null,"system_fingerprint":"","usage":null,"episode_id":"019ae40c-ea37-7f12-a95c-0ab9805f67da"}
```
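If the extra chunk causes trouble downstream, a client can filter out chunks that carry neither content, tool calls, nor a finish reason. A minimal sketch keyed to the chunk shape shown above (the helper name is illustrative):

```python
def is_empty_chunk(chunk: dict) -> bool:
    """True when no choice carries content, tool calls, or a finish reason,
    i.e. a chunk like the trailing one t0 emits after forwarding `stop`."""
    return all(
        choice.get("finish_reason") is None
        and not (choice.get("delta") or {}).get("content")
        and not (choice.get("delta") or {}).get("tool_calls")
        for choice in chunk.get("choices", [])
    )
```

Note that the finish chunk above passes this filter: its `content` is empty, but `finish_reason` is `"stop"`, so it is not considered empty.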