LangChain Reference
    langgraph.prebuilt.tool_node.ToolRuntime
    Class · Since v1.0

    ToolRuntime

    Runtime context automatically injected into tools.

    Note

    This is distinct from Runtime (from langgraph.runtime), which is injected into graph nodes and middleware. ToolRuntime includes additional tool-specific attributes like config, state, and tool_call_id that Runtime does not have.

    When a tool function has a parameter named runtime with type hint ToolRuntime, the tool execution system will automatically inject an instance containing:

    • state: The current graph state
    • tool_call_id: The ID of the current tool call
    • config: RunnableConfig for the current execution
    • context: Runtime context (shared with Runtime)
    • store: BaseStore instance for persistent storage (shared with Runtime)
    • stream_writer: StreamWriter for streaming output (shared with Runtime)

    No Annotated wrapper is needed; simply declare runtime: ToolRuntime as a parameter.

    ToolRuntime(
      self,
      state: StateT,
      context: ContextT,
      config: RunnableConfig,
      stream_writer: StreamWriter,
      tool_call_id: str | None,
      store: BaseStore | None,
      execution_info: ExecutionInfo | None = None,
      server_info: ServerInfo | None = None
    )

    Bases

    _DirectlyInjectedToolArgGeneric[ContextT, StateT]

    Example:

    from langchain_core.tools import tool
    from langchain.tools import ToolRuntime
    
    @tool
    def my_tool(x: int, runtime: ToolRuntime) -> str:
        """Tool that accesses runtime context."""
        # Access state
        messages = runtime.state["messages"]
    
        # Access tool_call_id
        print(f"Tool call ID: {runtime.tool_call_id}")
    
        # Access config
        print(f"Run ID: {runtime.config.get('run_id')}")
    
        # Access runtime context
        user_id = runtime.context.get("user_id")
    
        # Access store
        runtime.store.put(("metrics",), "count", {"value": 1})
    
        # Stream output (StreamWriter is a callable)
        runtime.stream_writer("Processing...")
    
        return f"Processed {x}"

    This is a marker class used for type checking and detection. The actual runtime object will be constructed during tool execution.
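    Because the class is only a type-checking marker, a tool's model-facing schema can exclude it by inspecting annotations. The sketch below is a hypothetical, stdlib-only illustration of that idea (the `ToolRuntime` stub and `model_facing_params` helper are not real LangGraph APIs):

```python
import inspect
from typing import get_type_hints


class ToolRuntime:
    """Illustrative marker stand-in; the real class lives in langgraph.prebuilt."""


def model_facing_params(fn) -> list[str]:
    """Return only the parameters the model should see in the tool schema."""
    hints = get_type_hints(fn)
    return [
        name
        for name in inspect.signature(fn).parameters
        if hints.get(name) is not ToolRuntime
    ]


def my_tool(x: int, runtime: ToolRuntime) -> str:
    return f"Processed {x}"


print(model_facing_params(my_tool))  # → ['x']
```

    In this sketch the `runtime` parameter is filtered out of the schema, so the model is only ever asked to supply `x`, matching the behavior the reference describes.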

    Used in Docs

    • Build a personal assistant with subagents
    • Build a SQL assistant with on-demand skills
    • Build customer support with handoffs
    • Context engineering in Deep Agents
    • Going to production

    Constructors

    __init__

    Name            Type
    state           StateT
    context         ContextT
    config          RunnableConfig
    stream_writer   StreamWriter
    tool_call_id    str | None
    store           BaseStore | None
    execution_info  ExecutionInfo | None
    server_info     ServerInfo | None

    Attributes

    state: StateT
    context: ContextT
    config: RunnableConfig
    stream_writer: StreamWriter
    tool_call_id: str | None
    store: BaseStore | None
    execution_info: ExecutionInfo | None
    server_info: ServerInfo | None
    View source on GitHub