Intercept LLM API traffic and visualize token usage in a real-time terminal dashboard. Track costs, debug prompts, and monitor context window usage across your AI development sessions.
Updated Jan 29, 2026 · Python
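The description above mentions tracking costs and token usage from intercepted API traffic. As a rough illustration of what that accounting involves, here is a minimal sketch that tallies tokens and estimated cost from OpenAI-style `usage` blocks in captured responses; the price table and function names are illustrative assumptions, not part of itmproxy's actual API, and the rates are placeholder example values rather than real prices.

```python
# Hypothetical sketch: summing token counts and estimated cost from
# intercepted API responses that carry an OpenAI-style "usage" field.
# Prices below are placeholder example values (USD per 1M tokens).
EXAMPLE_PRICES = {
    "gpt-4o": (2.50, 10.00),  # (input rate, output rate) — example values only
}

def tally(responses, prices=EXAMPLE_PRICES):
    """Aggregate prompt/completion tokens and estimated cost across responses."""
    totals = {"prompt_tokens": 0, "completion_tokens": 0, "cost_usd": 0.0}
    for r in responses:
        usage = r.get("usage", {})
        n_in = usage.get("prompt_tokens", 0)
        n_out = usage.get("completion_tokens", 0)
        totals["prompt_tokens"] += n_in
        totals["completion_tokens"] += n_out
        # Unknown models contribute tokens but zero cost.
        in_rate, out_rate = prices.get(r.get("model", ""), (0.0, 0.0))
        totals["cost_usd"] += (n_in * in_rate + n_out * out_rate) / 1_000_000
    return totals
```

A dashboard like the one described could refresh such totals per session and compare prompt-token counts against each model's context window limit.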