If you want to control a cobra infestation, don't set dead cobra heads as the target!

Why we should keep this in mind when we write our performance management system (PMS) goals and OKRs.

Goodhart’s Law is a simple mental model: When a measure becomes a target, it ceases to be a good measure.

Some context first:

If a measure of performance becomes a stated goal, humans tend to optimize for it, regardless of any associated consequences. The measure loses its value as a measure!

Goodhart’s Law is named after British economist Charles Goodhart, who referenced the concept in a 1975 article on British monetary policy. Goodhart wrote that, “Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.”

The phenomenon has been observed throughout history: humans consistently establish targets that invite manipulation.

Example: Indian cobras.

India had too many cobras, so the British colonial government began offering a bounty for every dead cobra. Locals started breeding cobras, killing them, and turning in the heads to collect the bounty. When the scheme was eventually scrapped, the now-worthless cobras were released into the wild, and the population ended up larger than before.

Let's break this down into OKRs. What the British OKR effectively looked like →

Objective - Increase the dead cobra headcount

KR - Number of cobra heads turned in per month/quarter

What it should have been →

Objective - Reduce the cobra population to X by the end of Qtr N

KR - Cobra population reduced by Y% per unit area within the Qtr N timeline
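For teams that track OKRs in tooling, here is a minimal, hypothetical sketch (not from the original post, and with placeholder figures) of the difference between the two versions: in the flawed OKR the key result counts the proxy (heads collected), while in the outcome-based OKR it tracks the thing we actually care about (the population itself).

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class KeyResult:
    description: str
    metric: str              # what is actually measured
    target_change_pct: float # e.g. -30.0 means "reduce by 30%"

@dataclass
class Objective:
    description: str
    key_results: List[KeyResult] = field(default_factory=list)

# The flawed OKR: the measure (heads collected) is also the target,
# so it invites gaming (breeding cobras just to turn in their heads).
flawed = Objective(
    description="Increase the dead cobra headcount",
    key_results=[KeyResult(
        description="Cobra heads turned in per quarter",
        metric="cobra_heads_collected",
        target_change_pct=+50.0,  # placeholder figure
    )],
)

# The outcome-based OKR: the key result tracks the population itself,
# measured independently of the bounty process.
outcome_based = Objective(
    description="Reduce the cobra population this quarter",
    key_results=[KeyResult(
        description="Cobra population per unit area, from field surveys",
        metric="cobra_population_per_sq_km",
        target_change_pct=-30.0,  # placeholder figure
    )],
)
```

The design point is simply that the metric in the key result should be collected independently of the behaviour it is meant to drive; once the measure and the target are the same number, Goodhart's Law takes over.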