Goodhart’s law


When a measure becomes a target, it ceases to be a good measure.

What is Goodhart’s law?

Goodhart’s law is the spiritual warning for the era of Big Data. Named after British economist Charles Goodhart — who originally aimed it at 1970s monetary policy — the law describes the inevitable moment when people stop trying to do a good job and start trying to make the spreadsheet say they are doing a good job. It is the reason why “number of arrests” doesn’t necessarily mean a safer city, and why “lines of code written” definitely doesn’t mean better software.

The law works because humans are natural-born gamers of systems. In his book The Tyranny of Metrics, historian Jerry Z. Muller explores how this “metric fixation” replaces seasoned judgment with standardized numerical indicators. Muller argues that when we attach high-stakes rewards — like bonuses or promotions — to a single number, we aren’t incentivising performance; we are incentivising the manipulation of data. Surgeons might avoid risky but necessary operations to keep their success rates high, and teachers might teach to the test while the light of learning flickers out in the background.

To understand why these systems are so fragile, one should turn to Donella Meadows’ foundational text, Thinking in Systems. Meadows explains that every system has a purpose, but when we give it a narrow, measurable target, the system’s sub-units (ourselves) will pivot with terrifying efficiency to meet that target, even if doing so destroys the system’s original goal.

It’s the Cobra Effect in slow motion: the system gives you exactly what you asked for, but usually in the most useless way possible.

Sidebar

The Cobra Effect: a warning on perverse incentives

While Goodhart’s law tells us that a measure becomes useless once it becomes a target, the Cobra Effect shows us what happens when that target actually backfires.

The term originates from an anecdote of British colonial India. To reduce the number of venomous cobras in Delhi, the government offered a cash bounty for every dead snake. The measure (dead snakes handed in) was meant to serve the real goal (fewer live cobras in the city). However, citizens quickly realized it was easier to breed cobras in captivity and kill them for the reward than to hunt them in the wild. When the government eventually cancelled the bounty, the breeders released their now-worthless snakes, leaving the city with a larger cobra population than when the program started.

In modern systems, the Cobra Effect is the ultimate “perverse incentive.” It reminds us that when we incentivise a specific metric without deep systemic oversight, we don’t just get skewed data; we often get the exact opposite of the result we intended.

For Lex Nunc, Goodhart’s law serves as a reminder that the most important things in life are often the hardest to count. If you find yourself obsessing over a metric — whether it’s your daily step count, your social media likes, or your company’s KPIs — remember that the moment you treat that number as the goal, it loses its power to tell you the truth.