Cobra Effect and the Law of Unintended Consequences: How to design incentives and make plans work.

“But Mouse, you are not alone,

In proving foresight may be vain:

The best-laid schemes of mice and men

Go oft awry,

And leave us nothing but grief and pain,

For promised joy!”

– Robert Burns, 1785

Economist Horst Siebert coined the term ‘Cobra Effect’ in his book Der Kobra-Effekt, based on an anecdote from colonial-era India. Troubled by the number of venomous cobras in Delhi, the British government offered a reward for every dead cobra. Initially, this proved to be a successful strategy, as people killed large numbers of snakes for the reward. Over time, however, a few enterprising people began to breed cobras for the reward money. When the British government became aware of this thriving practice, it scrapped the reward. Bereft of any further incentive to breed cobras, the breeders set their now-worthless snakes free in the city, which led to a further spike in the number of cobras in Delhi.

The ‘Cobra Effect’ refers to the unintended negative consequences of an incentive that was designed to improve society or individual well-being. The law of unintended consequences is the broader phenomenon in which any action produces results that are not part of the actor’s purpose. These additional consequences may or may not be foreseeable or even immediately observable, and they may be beneficial, harmful, or neutral in their impact.

We can all recount instances of unintended consequences of our own purposeful actions, and history records innumerable examples of unintended consequences of the incentives, policies, and regulations of organisations and governments. While an unexpected outcome can be beneficial, such cases are rare. Far more often, the desired results fail to materialize and the negative consequences make the original problem worse.

The idea of unintended consequences dates back at least to the 17th-century English philosopher John Locke, who discussed the unintended consequences of interest rate regulation in his letter to Sir John Somers, a Member of Parliament.

Adam Smith also recognized the idea. In his 1776 book, The Wealth of Nations, he observed that the self-interested behavior of buyers and sellers in a marketplace unintentionally created a working economy in which the supply of goods met demand at fair and stable prices.

“It is not from the benevolence of the butcher, the brewer, or the baker, that we expect our dinner, but from their regard to their own interest.”

– Adam Smith

Friedrich Engels touched on the idea of unintended consequences in Ludwig Feuerbach and the End of Classical German Philosophy:

“The ends of the actions are intended, but the results which actually follow from these actions are not intended; or when they do seem to correspond to the end intended, they ultimately have consequences quite other than those intended.”

– Friedrich Engels

William A. Sherden writes in Best Laid Plans: The Tyranny of Unintended Consequences and How to Avoid Them:

“Sometimes unintended consequences are catastrophic, sometimes beneficial. Occasionally their impacts are imperceptible, at other times colossal. Large events frequently have a number of unintended consequences, but even small events can trigger them. There are numerous instances of purposeful deeds completely backfiring, causing the exact opposite of what was intended.”

The American sociologist Robert K. Merton grouped unintended consequences into three types:

Unexpected benefit: A positive outcome that was not intended (also referred to as luck, serendipity, or a windfall).

For example: (a) The creation of “no-man’s lands” during the Cold War, in places like the border between Eastern and Western Europe and the Korean Demilitarized Zone, allowed significant natural habitats to flourish.

(b) The sinking of ships in shallow waters during wartime has created many artificial coral reefs, which can be scientifically valuable and attract recreational divers.

(c) Pfizer developed the drug Viagra to lower blood pressure. Its use for treating erectile dysfunction was discovered as a side effect in clinical trials.

Unexpected drawback: An unexpected detriment occurring in addition to the desired effect of the policy.

For example: (a) Prohibition in the 1920s United States, enacted to suppress the alcohol trade, drove many small-time alcohol suppliers out of business but consolidated the hold of large-scale organized crime over the illegal alcohol industry.

(b) The CIA’s funding of the Afghan Mujahideen in the 1980s helped drive out the Soviets but destabilized Afghanistan, contributing to the rise of the Taliban and Al-Qaeda.

Perverse result: A perverse effect is contrary to what was originally intended (an intended solution makes the problem worse). It is closely related to another popular term, “perverse incentive”, which refers to an incentive designed to improve a situation that unintentionally rewards people for making it worse.

For example: (a) Passenger-side airbags in motorcars were intended as a safety feature but led to an increase in child fatalities in the mid-1990s because small children were being hit by airbags that deployed automatically during collisions. The supposed solution to this problem, moving the child seat to the back of the vehicle, led to an increase in the number of children forgotten in unattended cars, some of whom died in extreme temperatures.

(b) Forcing people to use overly complex passwords is another perverse incentive: faced with this complexity, we simply write our passwords down somewhere “safe”.

As such, perverse results are the true embodiment of the proverb:

“The road to hell is paved with good intentions”.

Why do the best-laid plans go astray?

In studies conducted by Harvard Business School professor John Kotter and the consultancies McKinsey and Bain, 70 percent of organizations failed in their attempts to introduce organizational change.

The British research group Organizational Aspects of Information Technology studied 14,000 organizations and found that 70 percent of IT projects had “failed in some way.”

 A KPMG study of 1,450 companies found that 61 percent of IT projects fail. A Standish Group study of 8,380 IT projects found that 84 percent had failed to meet their deadlines or budget goals and that 31 percent of the projects were cancelled midstream. 

The failure rate of new businesses is similarly high. According to a 2009 Small Business Administration report, one-third of new businesses fail within two years, and half fail within five years. Scott Shane, author of The Illusions of Entrepreneurship, found that 70 percent of new businesses fail within ten years of their founding.

Although these studies focused on businesses, there is no reason to believe that government agencies, associations, and other non-businesses would fare better in introducing changes to their organizations.

Our Simplicity-Seeking Minds

“Everything connects to everything else”

– Leonardo da Vinci

Right from birth, we start to see connections in the world around us. A drink quenches thirst; food relieves hunger; rest alleviates tiredness; medicine cures illness. Most of our cause-and-effect experiences involve straightforward, direct relationships. We believe that every problem has a cause and that simply eliminating the cause will yield a solution. As a result, we tend to think in terms of ‘linear’ causality: double the cause to double the effect, halve the cause to halve the effect, remove the cause to eliminate the effect.

In reality, the world is often more complex than we realise. We live in highly complex social systems containing many elements—people, organizations, and institutions—that interact in ways that make their behavior hard to anticipate. When an action is taken, the intended outcome might occur, but unexpected outcomes almost always occur alongside it. Even small events can give rise to enormous outcomes: World War I was triggered by the comparatively small event of the 1914 assassination of Austria’s Archduke Franz Ferdinand.

Ill-designed incentives are another reason our best-laid plans fail. The study of economics boils down to incentives. The right incentives, be they bonuses, subsidies, grants, or stock options, can get people to do nearly anything. But money alone does not make a good incentive. Many incentive systems have backfired because their designers failed to consider other interests and motivations. Good incentives acknowledge recognition, public perception, and the value of pursuing work that we can be proud of. So yes, if we want to persuade, we should appeal to interests, not reason. But when it comes to interests, appeal not just to net worth but also to self-worth.

Policymakers must carefully analyse, design, and provide the right incentives in the first place. Ignorance, errors in analysis, and immediate interests overriding long-term interests can all lead to perverse incentives and hence to unintended consequences.

Apart from the world’s inherent complexity and perverse incentives, other possible causes of unintended consequences include:

  • human stupidity,
  • self-deception,
  • failure to account for human nature, or other cognitive or emotional biases such as regarding familiar events as more likely to occur than unfamiliar ones,
  • overestimating the likelihood of success in pursuing projects,
  • inferring patterns from randomness, and
  • misunderstanding the probability of events happening. For example, one may infer that a project with 10 components, each with a 90 percent chance of succeeding, would have a high probability of success, when, in fact, the probability of all 10 components working is only about 35 percent (a quick calculation illustrating this follows the list).
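
As a minimal sketch of that last point (assuming the 10 components succeed or fail independently, and using purely illustrative numbers), the joint probability can be checked in a few lines of Python:

  # Chance that a project succeeds when every component must work.
  # Illustrative assumption: 10 independent components, each with a 90 percent success rate.
  p_component = 0.90
  n_components = 10
  p_all_work = p_component ** n_components
  print(f"Probability that all {n_components} components work: {p_all_work:.1%}")  # ~34.9%

The same arithmetic explains why adding components, approval steps, or handoffs to a plan quietly erodes its odds of success, even when every individual piece looks reliable.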

How to avoid unintended consequences?

“Any endeavor has unintended consequences. Any ill-conceived endeavor has more.”

– Stephen Tobolowsky, The Dangerous Animals Club

Given that our living world consists of tricky, complex systems that our faulty minds have difficulty dealing with, what can and should we do about intervening in our organizations, economies, and other social systems?

A school of thought advocates that the best way to avoid unintended consequences is to stop interfering with complex systems—both biological and social. Lewis Thomas, a famous physician and award-winning author, suggested in his essay ‘On Meddling’:

“Intervening is a way of causing trouble . . . the safest course seems to be to stand by and wring hands, but not to touch.”

Thomas’s advice is shared by Milton Friedman, a proponent of laissez-faire government. He suggested that the best way to avoid unintended consequences is for governments to do as little as possible.

However, when it comes to our social world, we must meddle, for without social intervention our world would be an undesirable place to live. Social interventions have undoubtedly accomplished much in the way of public health, civil rights, fair voting, and many other things that make our world a better place than if we had just let matters evolve on their own. Furthermore, organizations stagnate and become ineffective when left alone; continual small changes and periodic larger ones are needed to sustain them.

If we want to avoid unwanted surprises, we need to improve our intuitions concerning the operation of cause and effect in complex social-ecological systems. We need to develop methods that progress beyond simplistic linear thinking to minimize the tyranny of unintended consequences.

William Sherden, author of Best Laid Plans: The Tyranny of Unintended Consequences and How to Avoid Them, suggests a six-step process for intervening in complex social systems:

Step 1: Avoid Rushing the Big Bang

The biggest mistake is to rush overconfidently into a major initiative. Major initiatives are inherently riddled with unintended consequences, mostly terrible ones.

Step 2: Adopt a Humble Frame of Mind

The way to begin intervening in complex social systems is to adopt a humble perspective by understanding the nature of the challenge and the uncertainty of success.

Start by identifying what is known and unknown about a complex system.

Use counter-argumentation: ask yourself, or enlist the help of others to explore, how your initial assumptions and beliefs might be wrong.

Step 3: Develop a Deep Understanding of the System

Before developing a specific action plan, make sure you fully understand the system you seek to change by filling in the unknowns identified in the prior step. Think of the system as an ecosystem with many different players. Cast a large net to identify all the existing and potential stakeholders and assess their current and evolving roles. Identify their motives and how they might react to your proposed endeavor. Determine how these players are interlinked within the system and their factions and sources of power. Analyze the system’s history, its trends, and prior efforts to change it.

Step 4: Draft Plans Specifically Suited to the System

Avoid a one-size-fits-all approach in developing the plan. Consider how the system differs from seemingly similar others, recognizing that no two cities, organizations, countries, or cultures are ever exactly alike and that the same plan will not work in every case.

Step 5: Think through the Maze

The next step is to subject your plan to a mental acid test by thinking through how events might unfold and how unintended consequences might arise. This involves crafting potential outcome scenarios that consider the full array of possible results of implementing your plan. Use the inversion technique: invert your thinking and consider what you want to avoid, not just what you want to cause.

Step 6: Start Small and Learn by Doing

Start with small projects. This can include a limited version of the full program or a complete program piloted locally on a small scale. Starting with small projects enables you to quickly implement new concepts and learn by doing. You can get quick feedback to discover what works while encountering failures on a small scale.

“You can never do merely one thing.” 

– Garrett Hardin

Our actions and decisions can have surprising repercussions that have no relation to our initial intentions. Sometimes we can open Pandora’s box or kick a hornet’s nest without realizing it.  The lesson is that simplistic policies can come back to bite you.

The next time you hear a politician, bureaucrat, or business leader proclaiming a simple fix to a complex problem, better check for the cobras lurking in the bushes!

