
@varun29ankuS

Summary

This SEP proposes adding an optional conversationEvents capability that allows MCP servers to subscribe to conversation-level events. When a user sends a message, subscribed servers receive the content and can respond with context to inject into the LLM's prompt.
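Under this proposal, a server would advertise the capability during initialization. A rough sketch of what that declaration might look like in TypeScript (the `conversationEvents` and `onUserMessage` names come from this SEP; the surrounding field shapes are illustrative assumptions, not a finalized schema):

```typescript
// Illustrative sketch of the proposed capability declaration.
// Only "conversationEvents" and "onUserMessage" are named by the SEP;
// everything else here is an assumed shape for demonstration.

interface ConversationEventsCapability {
  // Server asks to receive each user message as it is sent.
  onUserMessage: boolean;
}

interface ServerCapabilities {
  conversationEvents?: ConversationEventsCapability;
}

const capabilities: ServerCapabilities = {
  conversationEvents: { onUserMessage: true },
};
```

Because the capability is optional, clients that don't recognize it would simply ignore the field.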

Motivation

Building a memory system for AI assistants, I hit a wall: MCP servers only receive data when the LLM explicitly calls a tool. If it doesn't call recall(), the memory system is blind.

Workarounds like strong tool descriptions, piggybacking on other tools, or client-side hooks are unreliable: none of them guarantees the server sees every user message. The protocol needs first-class support for automatic context injection.

Use Cases

  • Memory systems that surface relevant past conversations
  • Knowledge bases that inject relevant docs based on the question
  • Project context that reminds assistants about codebase conventions
  • User preferences that personalize responses automatically

Key Points

  • New conversationEvents capability with onUserMessage subscription
  • Servers respond with context to inject (plain text or structured)
  • 500ms timeout to prevent blocking
  • Purely additive - backward compatible with existing servers/clients
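To make the flow above concrete, here is a hypothetical end-to-end sketch: a server-side `onUserMessage` handler that returns plain-text context, and a client-side guard implementing the 500ms timeout. The event and response shapes, the helper names, and the in-memory store are all illustrative assumptions, not part of any existing MCP SDK:

```typescript
// Hypothetical sketch of the proposed flow. "onUserMessage" is named by
// the SEP; all other shapes and helpers here are assumptions.

interface UserMessageEvent {
  conversationId: string;
  content: string;
}

interface ContextInjection {
  text: string; // plain-text context to inject into the LLM's prompt
}

// Toy in-memory store standing in for a real memory system.
const memories = new Map<string, string>([
  ["deploy", "Deploys go through the staging pipeline first."],
  ["style", "This codebase uses 2-space indentation."],
]);

// Server side: surface any stored note whose keyword appears in the message.
function onUserMessage(event: UserMessageEvent): ContextInjection | null {
  const text = event.content.toLowerCase();
  const hits = [...memories]
    .filter(([keyword]) => text.includes(keyword))
    .map(([, note]) => note);
  return hits.length > 0 ? { text: hits.join("\n") } : null;
}

// Client side: race the handler against a 500 ms timeout so a slow
// server can never block delivery of the user's message.
async function collectContext(
  event: UserMessageEvent,
  handler: (
    e: UserMessageEvent,
  ) => ContextInjection | null | Promise<ContextInjection | null>,
  timeoutMs = 500,
): Promise<ContextInjection | null> {
  const timeout = new Promise<null>((resolve) =>
    setTimeout(() => resolve(null), timeoutMs),
  );
  return Promise.race([Promise.resolve(handler(event)), timeout]);
}
```

On timeout the client injects nothing and proceeds, which is what makes the capability purely additive from the user's perspective.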

Full specification in the SEP file.


Happy to iterate on the details. The core ask: let servers see the conversation so they can provide automatic context.

Add capability for MCP servers to subscribe to conversation events,
enabling automatic context injection for memory systems, knowledge
bases, and context-aware assistants.