Token Tracker: Token usage tracker for AI coding agent CLIs
Token Tracker is a free, open-source, local-first dashboard that automatically tracks token usage across 13 AI coding agent CLIs in one place: Claude Code, Codex CLI, Cursor IDE, Gemini CLI, OpenCode, OpenClaw, Every Code, Kiro, Hermes Agent, GitHub Copilot, Kimi Code, oh-my-pi (omp), and CodeBuddy. It focuses on minimal data collection, auditable metrics, and clear visibility into model and project usage. Token counts only — never prompts or conversation content.
Install in one command
npx tokentracker-cli
On first run, the CLI auto-detects installed AI coding tools, installs the appropriate hooks
(SessionEnd hooks, TOML notify arrays, plugins, or passive log readers depending on the tool), and
starts a local dashboard at http://localhost:7680. macOS users can also install the
native menu bar app via brew install --cask mm7894215/tokentracker/tokentracker.
What you get after installing
Token Tracker runs entirely on your machine. After npx tokentracker-cli, a local
dashboard at http://localhost:7680 exposes the following views — none of them require an
account, and your data never leaves your device unless you opt into cloud sync.
- Unified usage dashboard — input, output, cached, and cache-creation tokens across all 13 AI coding tools.
- Cost analysis — per-model pricing breakdown across 70+ models.
- Usage limits view — live rate-limit and subscription status for Claude, Codex, Cursor, Gemini, Kiro, and Antigravity.
- Project attribution — see which repository or workspace consumed which tokens.
- Activity heatmap & trend charts — GitHub-style contribution calendar plus day / week / month / total / custom-range charts.
- Skills browser — discover and install agent skills from the open ecosystem.
- macOS menu bar app & desktop widget — at-a-glance usage without opening the dashboard.
Privacy by design: token counts only, never prompts or conversation content. Local-first; cloud sync is optional and powers only the public leaderboard.
Supported AI coding agent CLIs
- Claude Code — Anthropic's official coding agent CLI; integrated via SessionEnd hook.
- Codex CLI — OpenAI Codex command-line agent; integrated via TOML notify array in ~/.codex/config.toml.
- Cursor IDE — Cursor editor; usage fetched via local SQLite auth + CSV API.
- Gemini CLI — Google Gemini coding CLI; SessionEnd hook integration.
- OpenCode — open-source coding agent; plugin + SQLite audit.
- OpenClaw — modern session plugin integration.
- Every Code — Codex-compatible TOML notify hook.
- Kiro — SQLite + JSONL hybrid log reader.
- Hermes Agent — SQLite sessions table at ~/.hermes/state.db.
- GitHub Copilot — OpenTelemetry file exporter integration (COPILOT_OTEL_FILE_EXPORTER_PATH).
- Kimi Code — passive wire.jsonl reader.
- oh-my-pi (omp) — passive JSONL session reader.
- CodeBuddy (Tencent) — Claude Code fork; SessionEnd hook in ~/.codebuddy/settings.json.
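For the Codex-style tools, integration amounts to an entry in the tool's TOML config. The snippet below is a sketch of the shape only; the hook command and arguments are illustrative placeholders, not the actual values tokentracker-cli writes:

```toml
# ~/.codex/config.toml (illustrative — installer writes the real command)
notify = ["tokentracker-hook", "--source", "codex"]
```

The installer normally adds and removes these entries for you, so hand-editing is only needed if auto-detection misses a tool.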
Who should use Token Tracker
Token Tracker is built for developers, founders, and engineering teams who rely on multiple AI coding assistants and need a single place to understand token consumption. Instead of checking each tool separately, you can compare usage trends by model, project, and time window in one dashboard.
Teams use Token Tracker to answer practical questions: which model is consuming the most budget, whether a new workflow increases cost, how a Pro plan is tracking against rate limits, and how token usage shifts over days or weeks. This makes budgeting, optimization, and reporting easier without introducing heavy tracking overhead.
How it works
Each supported tool emits usage data when a session ends. Token Tracker's parser
(src/lib/rollout.js) normalizes 9 distinct log formats into a unified schema: half-hour
UTC buckets keyed by (source, model, hour_start). Aggregated buckets are written to
queue.jsonl and served from a local HTTP API on port 7680. The React dashboard reads
from those endpoints; cloud sync (optional) uploads bucketed counts to InsForge for the public
leaderboard.
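The normalization step can be pictured as a small reducer. This is a minimal sketch, not the actual src/lib/rollout.js: the event field names (timestamp, input_tokens, output_tokens) are assumptions, and only two counters are shown, but it illustrates rounding timestamps down to half-hour UTC boundaries and aggregating by (source, model, hour_start):

```javascript
// Sketch: collapse raw usage events into half-hour UTC buckets keyed by
// (source, model, hour_start). Event shape is illustrative, not the real schema.
function bucketEvents(events) {
  const buckets = new Map();
  const HALF_HOUR_MS = 30 * 60 * 1000;
  for (const ev of events) {
    // Round the event timestamp down to the nearest 30-minute UTC boundary.
    const hourStart = new Date(
      Math.floor(ev.timestamp.getTime() / HALF_HOUR_MS) * HALF_HOUR_MS
    ).toISOString();
    const key = `${ev.source}|${ev.model}|${hourStart}`;
    const bucket = buckets.get(key) ?? {
      source: ev.source,
      model: ev.model,
      hour_start: hourStart,
      input: 0,
      output: 0,
    };
    bucket.input += ev.input_tokens;
    bucket.output += ev.output_tokens;
    buckets.set(key, bucket);
  }
  return [...buckets.values()];
}
```

Each resulting bucket is one line of the queue file, which is why only aggregated counts (never prompt text) ever reach disk or the optional cloud sync.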
Frequently asked questions
How do I install Token Tracker?
Run npx tokentracker-cli. The CLI auto-detects your AI coding tools, installs hooks, and opens the dashboard.
Which AI coding CLIs does Token Tracker support?
Claude Code, Codex CLI, Cursor IDE, Gemini CLI, OpenCode, OpenClaw, Every Code, Kiro, Hermes Agent, GitHub Copilot, Kimi Code, oh-my-pi (omp), and CodeBuddy — 13 in total.
Is Token Tracker free and open source?
Yes. Source on GitHub at github.com/mm7894215/TokenTracker; npm package tokentracker-cli.
Does Token Tracker collect my prompts or conversations?
No. Only token counts, model names, timestamps, and project attribution are recorded. Prompts and completions are never read or uploaded.
What does Token Tracker track?
Input, output, cached, and cache-creation tokens per model, project, and time window. It also surfaces rate limits and subscription status for Claude Pro, ChatGPT plans, Cursor, Gemini, Kiro, and Antigravity.
Does Token Tracker work offline?
Yes. Token Tracker is local-first. All parsing, aggregation, and dashboard rendering happen on your machine. Cloud sync is optional and powers only the public leaderboard.
Resources
- Homepage and install instructions
- Public token usage leaderboard
- Claude IP region checker
- npm: tokentracker-cli
- GitHub: mm7894215/TokenTracker
- Report an issue
Note: The usage dashboard, rate-limit panel, project attribution, skills browser,
and widget previews are part of the local app at http://localhost:7680 after install —
they are not standalone public URLs.