Intercept LLM API traffic and visualize token usage in a real-time terminal dashboard. Track costs, debug prompts, and monitor context window usage across your AI development sessions.
python cli proxy developer-tools rich terminal-dashboard ai-tools llm anthropic token-counter mitmproxy
Updated Feb 2, 2026 - Python
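The core idea is interception via a mitmproxy addon: Anthropic Messages API responses report token counts in a `usage` field, which can be accumulated and fed to a dashboard. Below is a minimal, hypothetical sketch of that interception step (not this project's actual code; the class name and printed summary are illustrative only).

```python
# Minimal sketch: a mitmproxy addon that reads token counts from the
# "usage" field of Anthropic API responses. Run with:
#   mitmdump -s token_tracker.py
import json

from mitmproxy import http


class TokenUsageTracker:
    """Accumulates input/output token counts seen in intercepted responses."""

    def __init__(self) -> None:
        self.input_tokens = 0
        self.output_tokens = 0

    def response(self, flow: http.HTTPFlow) -> None:
        # Only inspect Anthropic API traffic.
        if flow.request.pretty_host != "api.anthropic.com":
            return
        try:
            body = json.loads(flow.response.get_text())
        except (ValueError, TypeError):
            # Streaming or non-JSON responses are skipped in this sketch.
            return
        usage = body.get("usage")
        if usage:
            self.input_tokens += usage.get("input_tokens", 0)
            self.output_tokens += usage.get("output_tokens", 0)
            # A real dashboard would render this with rich instead of printing.
            print(f"tokens so far: in={self.input_tokens} out={self.output_tokens}")


addons = [TokenUsageTracker()]
```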