Meta has an internal ranking system where employees compete for the highest AI token consumption.
One employee built a leaderboard on the company intranet called “Claudeonomics” that tracks consumption across more than 85,000 employees, The Information reports. In just 30 days, employees burned through 60 trillion tokens; the top user averaged 281 billion tokens.
The leaderboard awards titles like “Token Legend,” “Model Connoisseur,” and “Cache Wizard” to nudge employees into building AI tools into their daily routines. But some employees simply leave AI agents running for hours to pad their numbers, wasting resources in the process, since every token costs money.
Still, “tokenmaxxing” has turned into a go-to productivity metric across Silicon Valley. Nvidia CEO Jensen Huang said he’d be “deeply alarmed” if an engineer pulling in $500,000 a year wasn’t consuming at least $250,000 worth of tokens. According to Forbes, Meta CTO Andrew Bosworth said one top engineer spends the equivalent of his salary on tokens and supposedly 10x’d his output.
Nobody has actually put up hard numbers to back any of this up, though. Measuring token consumption as a proxy for productivity is a bit like judging a truck driver by how much gas they burn. It tells you the engine is running, but not whether any freight is actually getting delivered.
Connecting raw usage and individual productivity gains to real business results is hard. For AI companies, establishing that connection matters a great deal: it's what justifies the massive investments pouring into AI right now. Even Google has resorted to reporting token consumption in its cloud offerings during quarterly earnings calls as a sign of growing adoption, and those numbers were inflated by reasoning tokens, which models generate internally before producing an answer. Showing usage instead of real revenue gains probably won't fly for long.