Give your AI tools instant access to your entire coding history. OpenMemory remembers everything you code and automatically provides context to GitHub Copilot, Cursor, Claude, and other AI assistants.
- Works with GitHub Copilot, Cursor, Claude, Windsurf, Codex, and any MCP-compatible AI
- Auto-configures all AI tools on first run with zero manual setup
- Tracks every file edit, save, and open automatically
- NEW: Manually add code selections to memory with a single click
- Compresses memories to reduce tokens by 30-70%
- Query responses under 80ms with smart caching
- Real-time token savings and compression metrics
- Background processing never blocks UI
- Install this extension
- Start the backend server
- Click the OpenMemory icon in the status bar to verify the connection
- Start coding - your AI tools can now access your coding memory
A running OpenMemory backend server is required.
- `openmemory.backendUrl`: Backend URL (default: `http://localhost:8080`)
- `openmemory.apiKey`: API key for auth (optional)
- `openmemory.useMCP`: Use MCP protocol mode (default: `false`) - connects to the backend MCP server, which exposes the tools `openmemory_query`, `openmemory_store`, `openmemory_list`, `openmemory_get`, and `openmemory_reinforce`
- `openmemory.mcpServerPath`: Path to the backend MCP server (default: `backend/dist/ai/mcp.js`)
- `openmemory.userId`: Custom user ID (optional; defaults to an auto-generated ID)
- `openmemory.projectName`: Custom project name (optional; defaults to the workspace name)
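These settings go in VS Code's `settings.json`. A minimal sketch using the defaults listed above (the `openmemory.projectName` value is a placeholder; omit any setting to keep its default):

```json
{
  "openmemory.backendUrl": "http://localhost:8080",
  "openmemory.useMCP": false,
  "openmemory.projectName": "my-project"
}
```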
- `OpenMemory: Query Context` - Search your coding memory
- `OpenMemory: Add Selection to Memory` - Save selected code to memory
- `OpenMemory: Quick Note` - Add a manual note to memory
- `OpenMemory: View Patterns` - View detected patterns
- `OpenMemory: Toggle Tracking` - Pause or resume tracking
- `OpenMemory: Setup` - Configure the backend and settings
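Any of these commands can also be bound to a keyboard shortcut in `keybindings.json`. A sketch, assuming a hypothetical command ID of `openmemory.queryContext` (the actual IDs may differ; check the extension's contributed commands in VS Code):

```jsonc
{
  // Hypothetical command ID for "OpenMemory: Query Context"
  "key": "ctrl+alt+m",
  "command": "openmemory.queryContext",
  "when": "editorTextFocus"
}
```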
All data is stored locally. No telemetry. The source code is open for audit.
Check that the backend is running: `curl http://localhost:8080/health`
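The health check above can be wrapped in a small script that reports status without hanging when the backend is down. A minimal sketch, assuming the default port 8080 (the `check_backend` helper name is illustrative, not part of OpenMemory):

```shell
#!/bin/sh
# Probe the OpenMemory backend health endpoint.
# -f: fail on HTTP errors; -sS: quiet but show errors; --max-time: don't hang.
check_backend() {
  if curl -fsS --max-time 2 "${1:-http://localhost:8080}/health" >/dev/null 2>&1; then
    echo up
  else
    echo down
  fi
}

check_backend
```

If it prints `down`, start the backend and re-run the check before troubleshooting the extension itself.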
For issues, see the GitHub repository
Made by the Cavira team