Introducing AI Traffic Audit
I split the reusable core out of my private AI traffic research repo and open-sourced it: a local CDP-based auditor for browser AI products.
2026-04-12
I’ve spent a lot of time lately watching what browser-based AI products actually do on the wire.
That started as a private research repo because the first phase was messy by design: personal captures, one-off experiments, rough analysis scripts, draft writeups, and a few local UI ideas that were not ready to leave the lab.
But there was a cleaner core inside it. A reusable tool.
So I split that core out and open-sourced it as AI Traffic Audit.
What it is
AI Traffic Audit is a local tool for inspecting browser-based AI traffic through the Chrome DevTools Protocol.
It does a few practical things:
- captures AI-related network traffic into a local SQLite database
- classifies traffic into functional requests vs telemetry and tracking
- extracts identifiers, events, protocol hints, and message flows
- compares behavior across ChatGPT, Claude, Gemini, Google AI mode, and Grok
- lets you export, replay, diff, and inspect the traffic later instead of trying to live inside the Network tab
It is not a remote interception tool. It is not built on private APIs. It watches your own browser traffic on your own machine and keeps the data local.
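As a rough sketch of the capture-and-classify step: the auditor consumes CDP `Network.requestWillBeSent` events and writes classified rows into SQLite. The function names, domain heuristics, and schema below are illustrative assumptions, not the repo's actual API.

```python
import sqlite3

# Substrings that usually mark telemetry/analytics endpoints.
# Illustrative heuristics, not the project's actual rule set.
TELEMETRY_HINTS = ("statsig", "sentry", "segment", "analytics", "telemetry")

def classify(url: str) -> str:
    """Crude functional-vs-telemetry split based on URL substrings."""
    lowered = url.lower()
    return "telemetry" if any(h in lowered for h in TELEMETRY_HINTS) else "functional"

def record(db: sqlite3.Connection, event: dict) -> None:
    """Store one CDP Network.requestWillBeSent event as a classified row."""
    req = event["params"]["request"]
    db.execute(
        "INSERT INTO requests (url, method, kind) VALUES (?, ?, ?)",
        (req["url"], req["method"], classify(req["url"])),
    )

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE requests (url TEXT, method TEXT, kind TEXT)")

# A trimmed-down CDP event, as it would arrive over the DevTools websocket.
sample = {
    "method": "Network.requestWillBeSent",
    "params": {"request": {"url": "https://chatgpt.com/backend-api/conversation",
                           "method": "POST"}},
}
record(db, sample)
print(db.execute("SELECT url, kind FROM requests").fetchall())
```

The point of persisting every event like this, rather than eyeballing it live, is that classification becomes a query you can rerun and refine later.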
Why I split it out
The private repo was useful for research, but it was the wrong artifact to publish.
It mixed together:
- the actual reusable tool
- local databases and exports
- draft charts and writing
- experimental side surfaces
- personal usage-derived artifacts
That is bad release hygiene.
The right move was to separate the public surface from the lab. So the public repo contains the tool, the extension fallback, the local setup path, and the documentation. The private repo stays private.
I think this is a better pattern in general for research-heavy work. Not everything needs to be open-sourced raw. Sometimes the right public artifact is the sharpened subset.
What I care about with this project
The browser already exposes a lot. The problem is not access. The problem is survivability.
After a handful of sessions, the Network tab becomes noise. If you want to answer simple questions like:
- how much of the traffic is actually telemetry
- what gets sent before you hit submit
- which identifiers a product assigns
- what streaming protocol it uses
- how the behavior changes between sessions
you need something that stores, classifies, and lets you query the traffic later.
That is what this tool is for.
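Once captures live in SQLite, the first question above becomes a one-line query. The schema and rows here are stand-ins for illustration, not the tool's real ones.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE requests (url TEXT, kind TEXT)")
db.executemany(
    "INSERT INTO requests VALUES (?, ?)",
    [
        ("https://chatgpt.com/backend-api/conversation", "functional"),
        ("https://example.com/ces/v1/t", "telemetry"),
        ("https://browser-intake.example.com/api/v2/rum", "telemetry"),
    ],
)

# What fraction of captured requests are telemetry rather than product traffic?
# In SQLite, a comparison yields 0/1, so SUM(kind = 'telemetry') counts matches.
share = db.execute(
    "SELECT 100.0 * SUM(kind = 'telemetry') / COUNT(*) FROM requests"
).fetchone()[0]
print(f"{share:.0f}% telemetry")  # → 67% telemetry
```

The same pattern extends to the other questions: identifiers, pre-submit traffic, and cross-session diffs are all just queries over the stored rows.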
What stays private
The public repo is intentionally narrower than the internal workspace.
I did not ship:
- personal capture history
- local databases, logs, or exports
- draft blog assets
- half-finished experimental UI work
That was not just caution. It was discipline. A public repo should stand on its own.
Related writeup
If you want the findings rather than the tool, I wrote those up separately. That post is the analysis; AI Traffic Audit is the reusable instrumentation surface.
The repo
- GitHub: harsha-gouru/ai-traffic-audit
This is one of those projects where the product is not some giant platform. It is a better lens.
And right now, that feels more useful.