Real-time visibility into what your AI agent is thinking: tokens, context window, caching, and session uptime.
The Context Inspector gives you a live view of your agent's internal state:
How many tokens the current conversation uses, broken down into input, output, and cached tokens.
Total capacity vs. used capacity. See how close you are to the model's context limit (e.g., 200K tokens for Claude).
Which parts of the conversation are cached. Cached tokens are faster and cheaper to process.
How long the current session has been active. Long sessions may benefit from a restart to clear memory.
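The figures above can be combined into a quick capacity check. A minimal sketch, assuming a 200K-token limit and hypothetical usage numbers (the field names here are illustrative, not an actual API):

```python
# Hypothetical illustration of the figures the Context Inspector reports.
# The field names and numbers are assumptions for this sketch.
CONTEXT_LIMIT = 200_000  # e.g., Claude's context window

usage = {
    "input_tokens": 42_000,   # cached tokens are a subset of input:
    "output_tokens": 8_500,   # they still occupy the context window,
    "cached_tokens": 30_000,  # they just bill at a lower rate
}

# Context occupancy counts input + output; caching affects cost, not capacity.
total_used = usage["input_tokens"] + usage["output_tokens"]
pct_full = total_used / CONTEXT_LIMIT * 100
print(f"{total_used:,} / {CONTEXT_LIMIT:,} tokens ({pct_full:.1f}% of context)")
```

Note that cached tokens reduce cost and latency but do not free up context space, which is why the occupancy calculation ignores them.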
The Context Inspector shows a visual breakdown of where tokens are being used:
Hover over any bar to see more details about what's consuming tokens in that category.
Tokens are the units LLM APIs use for billing. The Context Inspector helps you understand costs:
Everything your agent reads: your messages, files, memory, and the system prompt. These are cheaper than output tokens.
Everything your agent writes: responses, code, tool calls. These cost more than input tokens (typically 3-5x).
Tokens that don't need reprocessing (like your system prompt and memory). These are ~10x cheaper than regular input tokens and much faster.
Pro tip: If token costs are high, check the Context Inspector to see what's using the most tokens. Long daily files or verbose tool outputs can add up quickly.
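To make the pricing ratios above concrete, here is a back-of-the-envelope cost estimate. The dollar rates are placeholders chosen only to match the ratios described (output roughly 5x input, cache reads roughly 10x cheaper than input); check your provider's pricing page for real numbers:

```python
# Placeholder rates in $/million tokens -- NOT real prices, chosen only
# to reflect the ratios in the text above.
PRICE_PER_MTOK = {
    "input": 3.00,    # baseline
    "output": 15.00,  # output ~5x input (typical 3-5x range)
    "cached": 0.30,   # cache reads ~10x cheaper than regular input
}

def estimate_cost(input_tok, output_tok, cached_tok):
    """Cached tokens bill at the cache-read rate instead of the input rate."""
    fresh_input = input_tok - cached_tok  # cached tokens are a subset of input
    return (fresh_input * PRICE_PER_MTOK["input"]
            + output_tok * PRICE_PER_MTOK["output"]
            + cached_tok * PRICE_PER_MTOK["cached"]) / 1_000_000

# e.g., 42K input (30K of it cached) and 8.5K output:
print(f"${estimate_cost(42_000, 8_500, 30_000):.4f}")
```

Note how output tokens dominate the total even at a fraction of the volume, which is why trimming verbose tool outputs pays off quickly.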
The Context Inspector updates automatically as your conversation progresses:
You don't need to manually refresh — the inspector stays in sync with your conversation automatically.
The Context Inspector also shows which LLM model your agent is using:
If you switch models mid-conversation (e.g., from Claude to GPT-4), the inspector updates to show the new model's specs.
Use the Context Inspector to troubleshoot or optimize:
Check if you're near the context limit. If so, start a new session or archive old daily files.
Look at token breakdown to see what's consuming the most. Trim verbose memory or tool outputs.
If context is full, old conversation history gets pruned. Check if important info was lost.
See exactly what context the agent has access to — helps identify missing or incorrect information.
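The troubleshooting steps above can be sketched as a simple health check. The 80% threshold, the session-length cutoff, and the function name are assumptions for illustration, not part of Pinchr itself:

```python
# A simple health check mirroring the troubleshooting steps above.
# Thresholds and names are illustrative assumptions.
def context_warnings(used_tokens, limit=200_000, session_hours=0.0):
    warnings = []
    # Near the context limit: risk of old history being pruned.
    if used_tokens / limit > 0.8:
        warnings.append("Near context limit: start a new session "
                        "or archive old daily files.")
    # Long-running sessions may benefit from a restart.
    if session_hours > 24:
        warnings.append("Long-running session: a restart can clear memory.")
    return warnings

for w in context_warnings(175_000, session_hours=30):
    print(w)
```

A session at 175K of 200K tokens and 30 hours of uptime would trip both checks here.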
Access the Context Inspector from anywhere in Pinchr:
Press to toggle the inspector panel
You can also open it from the top toolbar by clicking the "Context" button.
Join our community or reach out — we're here to help.