OpenAICompatibleToolMessage tool result should be an array #4880

@wangfenjin

Description

```rust
use serde::Deserialize;
use serde_json::Value;

#[derive(Clone, Debug, Deserialize, PartialEq)]
pub struct OpenAICompatibleAssistantMessage {
    pub content: Option<Value>,
    pub tool_calls: Option<Vec<OpenAICompatibleToolCall>>,
}

#[derive(Clone, Debug, Deserialize, PartialEq)]
#[serde(tag = "role")]
#[serde(rename_all = "lowercase")]
pub enum OpenAICompatibleMessage {
    #[serde(alias = "developer")]
    System(OpenAICompatibleSystemMessage),
    User(OpenAICompatibleUserMessage),
    Assistant(OpenAICompatibleAssistantMessage),
    Tool(OpenAICompatibleToolMessage),
}
```
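For context, the `OpenAICompatibleToolMessage` variant referenced in the title presumably carries a single `tool_call_id`, mirroring the OpenAI chat-completions `tool` role. A minimal sketch (the exact field set in the codebase may differ):

```rust
use serde::Deserialize;
use serde_json::Value;

// Sketch of the tool-result message, modeled on the OpenAI "tool" role:
// one tool message answers exactly one tool_call_id, so N parallel tool
// calls must be answered by N tool messages. The field set here is an
// assumption; the real struct in the codebase may differ.
#[derive(Clone, Debug, Deserialize, PartialEq)]
pub struct OpenAICompatibleToolMessage {
    pub content: Option<Value>,
    pub tool_call_id: String,
}
```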

When the LLM assistant issues multiple tool calls in a single turn, the tool results that follow must also be provided as an array, one result per call, each with a matching tool_call_id. Otherwise the API returns an error like:

"error": {

    "message": "An assistant message with 'tool_calls' must be followed by tool messages responding to each 'tool_call_id'. The following tool_call_ids did not have response messages: call_5lCYn0RkWOzv6sWEqhWAs0k1",

    "type": "invalid_request_error",

    "param": "messages.[5].role",

    "code": null

  }

Currently this is reproducible with gpt-5 and claude 4; grok seems to accept multiple tool sections without complaint.
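To make the requirement concrete, here is a sketch of the message sequence the upstream API expects after a parallel tool call (the tool names, arguments, and call ids below are made up for illustration):

```rust
use serde_json::json;

fn main() {
    // After the assistant issues two tool calls in a single turn, every
    // tool_call_id must be answered by its own `role: "tool"` message.
    let messages = json!([
        {
            "role": "assistant",
            "content": null,
            "tool_calls": [
                { "id": "call_abc", "type": "function",
                  "function": { "name": "get_weather", "arguments": "{\"city\":\"Paris\"}" } },
                { "id": "call_def", "type": "function",
                  "function": { "name": "get_time", "arguments": "{\"city\":\"Paris\"}" } }
            ]
        },
        { "role": "tool", "tool_call_id": "call_abc", "content": "18°C, clear" },
        { "role": "tool", "tool_call_id": "call_def", "content": "14:32 CET" }
    ]);
    println!("{}", serde_json::to_string_pretty(&messages).unwrap());
}
```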
