Instrument once. Attribute every LLM API call to a project, feature, or workflow — with full agentic cost rollup across Anthropic, OpenAI, Gemini, and DeepSeek.
No proxy servers. No SDK rewrites. Kostrack wraps your existing clients and writes cost data asynchronously — never blocking your LLM calls.
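The "writes cost data asynchronously" claim is a classic write-behind pattern. A generic sketch (not Kostrack's actual implementation): the call path only enqueues a usage record, and a background thread flushes records to storage.

```python
# Generic write-behind sketch: recording a cost event never blocks the
# caller; a daemon thread drains the queue into a storage sink.
import queue
import threading


class CostRecorder:
    def __init__(self, sink):
        self._q = queue.Queue()
        self._sink = sink  # e.g. a database writer callable
        self._worker = threading.Thread(target=self._drain, daemon=True)
        self._worker.start()

    def record(self, event: dict) -> None:
        # Called on the LLM request path: an O(1) enqueue, no I/O.
        self._q.put(event)

    def _drain(self) -> None:
        # Background thread: pulls events and writes them to the sink.
        while True:
            event = self._q.get()
            if event is None:  # shutdown sentinel
                break
            self._sink(event)

    def close(self) -> None:
        # Flush remaining events, then stop the worker.
        self._q.put(None)
        self._worker.join()
```

A real implementation would batch writes and handle sink failures, but the call-path cost stays a single queue insert either way.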
1. Run `docker compose up -d`. TimescaleDB and a pre-wired Grafana dashboard are ready in under a minute.
2. Call `kostrack.configure(dsn=...)` at app startup with your TimescaleDB connection string. That's it.
3. Replace `from anthropic import Anthropic` with `from kostrack import Anthropic`. Identical API.
4. Pass a `tags` dict to attribute costs. Wrap agentic workflows with `kostrack.trace()` for per-run cost rollup.

Everything needed for real cost governance — not just a dashboard on top of your billing page.
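The steps above combine into a few lines. A minimal sketch, assuming the API surface described here — the DSN, model name, and the exact `workflow=` and `tags=` keyword arguments are illustrative assumptions, not a documented signature:

```python
import kostrack
from kostrack import Anthropic  # drop-in for `from anthropic import Anthropic`

# Point Kostrack at TimescaleDB once, at app startup (DSN is illustrative).
kostrack.configure(dsn="postgresql://localhost:5432/kostrack")

client = Anthropic()  # identical API to the official SDK

# Parameter names below are assumptions for illustration.
with kostrack.trace(workflow="ticket-triage"):  # per-run cost rollup
    client.messages.create(
        model="claude-sonnet-4",  # model name is illustrative
        max_tokens=256,
        messages=[{"role": "user", "content": "Summarize this ticket."}],
        tags={"project": "support", "feature": "triage"},  # cost attribution
    )
```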
- Zero-configuration local stack: `docker compose up` gives you TimescaleDB + Grafana with pre-provisioned dashboards.
- One-line install: `pip install kostrack`.
- Correct per-provider token extraction — cache writes, reasoning tokens, context caching, and DeepSeek-R1 thinking tokens.
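Why per-provider extraction matters: each API reports usage under different field names. A simplified normalizer, illustrative only — the field names follow the Anthropic and OpenAI response shapes, with the Gemini and DeepSeek branches omitted for brevity:

```python
from dataclasses import dataclass


@dataclass
class TokenUsage:
    input_tokens: int
    output_tokens: int
    cache_write_tokens: int = 0
    cache_read_tokens: int = 0
    reasoning_tokens: int = 0


def normalize_usage(provider: str, usage: dict) -> TokenUsage:
    """Map a provider-specific usage payload to one normalized record."""
    if provider == "anthropic":
        # Anthropic reports cache writes and reads separately from input tokens.
        return TokenUsage(
            input_tokens=usage["input_tokens"],
            output_tokens=usage["output_tokens"],
            cache_write_tokens=usage.get("cache_creation_input_tokens", 0),
            cache_read_tokens=usage.get("cache_read_input_tokens", 0),
        )
    if provider == "openai":
        # OpenAI nests reasoning tokens under completion_tokens_details.
        details = usage.get("completion_tokens_details") or {}
        return TokenUsage(
            input_tokens=usage["prompt_tokens"],
            output_tokens=usage["completion_tokens"],
            reasoning_tokens=details.get("reasoning_tokens", 0),
        )
    raise ValueError(f"unknown provider: {provider}")
```

Billing the cache-write or reasoning tokens at the wrong rate is exactly the error a naive "prompt + completion" tracker makes.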
Not just the engineer who writes the code — every person who cares about AI spend has a workflow.
- Non-engineers work from the `kostrack` command. No Python knowledge required beyond installation.
- The spend dashboard is always one click away at `:8080/spend`.
- For engineers, it's `docker compose up -d` and one import change. That's it.