“When a measure becomes a target, it ceases to be a good measure.” — Goodhart’s Law
Goodhart's Law, which observes that people adapt their behavior to the measures by which they are judged, applies across many fields of economics. It is not a comprehensive description of every possible sub-case, but it is an important concept. Its application is often illustrated with the use of IQ tests to predict job performance: even if test scores correlate with performance at, say, 0.6 in the general population, once hiring is targeted at the scores themselves, the relationship between scores and performance will eventually break down.
According to Goodhart's Law, when we target a measure for improvement, it ceases to be an effective measure. This is not to say that measurement should be eliminated altogether; the law merely warns us against basing our incentives, perceptions, and actions on a single measurement. Rather than abandoning measurement, we should avoid leaning on any one measure in isolation. Used carefully, and in combination, measures can still support better decisions and better outcomes.
The law also applies to dynamic systems that adjust their behavior to optimize a given metric. Humans and artificial agents alike are subject to it: the more heavily a single metric is optimized, the more the results will be skewed away from the underlying goal. Whichever metrics you choose, be sure to evaluate all outcomes against your actual objectives, not just against the numbers.
Goodhart's Law is useful to know, but ignoring it is dangerous. Unless the data you collect is systematically representative of the real world, you are likely to end up with an inaccurate measure. That is why it helps to pair opposing indicators, such as a quantity measure with a quality measure. Ideally, choose a set of metrics that together represent the true goals of your organization; this helps you avoid the frustrating results that come from optimizing one metric in isolation.
A classic illustration is the Cobra Effect. In colonial India, the government offered a monetary reward for every dead cobra in an effort to reduce the snake population. The bounty, a single KPI, was easy to manipulate: enterprising individuals began breeding cobras simply to collect the reward. When the program was scrapped, the now-worthless snakes were released, and the cobra population ended up larger than before. Any incentive tied to a single, gameable measure invites the same behavior: people satisfy the letter of the target while defeating its purpose.
Goodhart's Law originated in monetary economics. Charles Goodhart observed that once the Bank of England began targeting a particular measure of the money supply, the previously stable statistical relationships involving that measure broke down. His original formulation was that any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes. In other words, the very act of steering an economy by a measure changes the behavior that the measure was summarizing.
Education offers another instance. When standardized test scores are used as the sole measure of student performance, schools are pushed to teach to the test: scores rise, but genuine learning does not necessarily follow. A single score is not enough to judge what a student has actually learned.
The law also predicts behavior under central planning. Soviet planners ordered nail factories to maximize output measured by the number of nails produced, and the factories responded by turning out millions of tiny, useless nails. When the planners switched the criterion to total weight, the factories produced giant, heavy nails that were just as useless. As with the cobra bounty, an enterprising response to the incentive defeated its purpose.
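The nail-factory story can be written as a toy optimization (all numbers invented for illustration). Whichever single metric the planner picks, the factory's best response is an extreme, useless nail:

```python
# Toy sketch of the nail-factory story; all numbers are invented.
# Assume each nail costs some fixed setup time plus time proportional
# to its weight, and the factory picks whatever size maximizes the
# planner's chosen target.
MACHINE_HOURS = 1000.0
SETUP_H = 0.01          # assumed hours of labor per nail, any size
HOURS_PER_GRAM = 0.001  # assumed extra hours per gram of nail

sizes_g = [1, 10, 50, 500]  # candidate nail weights in grams

def nails_made(size):
    return MACHINE_HOURS / (SETUP_H + HOURS_PER_GRAM * size)

def weight_made(size):
    return size * nails_made(size)

best_for_count = max(sizes_g, key=nails_made)    # tiny nails win
best_for_weight = max(sizes_g, key=weight_made)  # giant nails win

print(best_for_count, best_for_weight)
```

Targeting count drives the factory to the smallest nail in the list, and targeting weight drives it to the largest; neither target mentions whether the nails are any use.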
Goodhart’s Law is expressed simply as: “When a measure becomes a target, it ceases to be a good measure.” In other words, when we set one specific goal, people will tend to optimize for that objective regardless of the consequences. This leads to problems when other equally important aspects of a situation are neglected. A call center manager thought that increasing the number of calls processed was a good objective, and his employees dutifully strove to increase their numbers. However, by choosing only one metric to measure success, he motivated employees to sacrifice courtesy in the name of quantity. People respond to incentives, and our natural inclination is to maximize the standards by which we are judged.
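The call-center incentive can be sketched in a few lines (a hypothetical model: the satisfaction curve and all numbers are invented). Scoring agents on call count alone rewards the shortest possible calls, while pairing throughput with a satisfaction measure rewards a middle ground:

```python
# Hypothetical sketch of the call-center example; the satisfaction
# model and all numbers are invented for illustration.
SHIFT_MINUTES = 480

def calls_handled(minutes_per_call):
    return SHIFT_MINUTES / minutes_per_call

def satisfaction(minutes_per_call):
    # Assumed toy model: rushed calls score poorly, longer calls
    # saturate toward a satisfaction of 1.
    m = minutes_per_call
    return m * m / (m * m + 16)

options = [2, 4, 6, 8, 10]  # candidate minutes per call

# Target the single metric: maximize call count alone.
best_single = max(options, key=calls_handled)

# Pair opposing indicators: reward throughput times courtesy.
best_paired = max(options, key=lambda m: calls_handled(m) * satisfaction(m))

print(best_single, best_paired)
```

Under the single metric the shortest call always wins; once courtesy is paired in, a longer call becomes optimal. The point is not this particular model but the shape of the fix: opposing indicators make gaming one number costly on the other.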