Condi Rice was on ABC’s “This Week with George Stephanopoulos” talking about how Israel should “consider the broader consequences” of its current bombing activities in Lebanon, and she specifically urged Israel to try harder to avoid civilian casualties.
Huh? As much as I deplore Israel’s bombings, does anyone in the Bush Administration have standing to complain about a nation that bombs another country, kills its civilians, and fails to consider the broader consequences of its actions?
Let me hold Rice up as an example of “people who see in others what they desperately need to see in themselves.”
The problem, of course, is that both the Bush Administration and Israel have fallen into the belief that the best way to fight terrorism is grossly disproportionate military action, consequences be damned, despite a complete lack of evidence that such brutality actually reduces terrorism in the long term. Israel has employed this strategy for half a century, apparently to no avail. Likewise, the Bush Administration has employed this strategy for several years in Iraq, also to no avail. Why? Because actions have consequences, often the opposite of what is intended.
Jay Forrester, Professor Emeritus at MIT’s Sloan School of Management, had a lot to say about unintended consequences. He founded the study of system dynamics: a theoretical framework that helps explain why so many attempts to fix problems actually yield the opposite result. In his books “Industrial Dynamics” and “Urban Dynamics,” Forrester shows how complex systems (corporations, cities, nations, etc.) don’t respond to environmental changes (i.e., problem-solving efforts) in obvious ways, because feedback loops exist to counteract the changes.
See, for example, the following from Forrester’s 1991 paper, “System Dynamics and the Lessons of 35 Years”:
To the surprise of those unfamiliar with the devious nature of such dynamic systems, the computer model, based on policies known to people in the company, will usually generate the very difficulties that the company had been experiencing. In short, the policies that were believed to solve the problem are, instead, the cause of the problem. Such a situation creates a serious trap and often a downward spiral. If the policies being followed are believed to alleviate the problem, but, in hidden ways, are causing the problem, then, as the problem gets worse, pressures increase to apply still more strongly the very policies that are causing the problem.
Often, those feedback loops are very powerful, and can more than swamp simple-minded, brute-force efforts to change the system. The books and the field of study have existed for decades, but unfortunately few people use them. Certainly not arrogant, egocentric politicians eager to show the world how tough they are.
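The “fixes that fail” trap Forrester describes is easy to see in a toy model. The sketch below is my own invented illustration, not from Forrester’s books, and every rate in it is made up for the sake of the demonstration: force removes some militants directly (the intended effect), but its collateral damage recruits new ones (the hidden feedback). When recruitment outpaces removal, applying the policy more strongly makes the problem grow faster, exactly the downward spiral in the quote above.

```python
# Toy "fixes that fail" loop in the spirit of system dynamics.
# All rates are invented for illustration; nothing is calibrated to real data.

def simulate(retaliation_gain, steps=40,
             removal_rate=0.5,    # militants removed per unit of force (intended effect)
             recruit_rate=0.7,    # militants recruited per unit of force (hidden feedback)
             attrition=0.15):     # natural decline per step, absent any intervention
    militants = 100.0
    for _ in range(steps):
        force = retaliation_gain * militants   # policy: respond in proportion to the problem
        removed = removal_rate * force         # what the policy is supposed to accomplish
        recruited = recruit_rate * force       # the backlash the policy-maker ignores
        militants += recruited - removed - attrition * militants
        militants = max(militants, 0.0)
    return militants

# Per step the stock is multiplied by (1 + gain * (recruit - removal) - attrition):
# a restrained response (gain 0.5) gives a factor of 0.95, so the problem fades;
# a massive response (gain 2.0) gives a factor of 1.25, so the problem explodes.
print(f"restrained (gain 0.5): {simulate(0.5):12.1f}")
print(f"massive    (gain 2.0): {simulate(2.0):12.1f}")
```

The model rewards the counterintuitive move: doing *less* lets natural attrition win, while doing more feeds the loop that regenerates the problem, which is precisely why the “obvious” policy response keeps failing.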
In other words, whenever a politician uses the excuse, “no one could have foreseen this unintended adverse outcome,” NEVER, NEVER, NEVER let that politician off the hook with that pathetic excuse. In fact, many adverse outcomes are easily foreseeable, but only if you use a mental model that accurately represents the system to be fixed. In many — indeed, most — cases, adverse outcomes are not the result of unforeseeable random chance; they are the result of incompetent leadership failing to appreciate the problem’s feedback loops.
The result is what we see today: rich, powerful countries playing to the terrorists’ strengths. Countries that respond to terrorist attacks by killing orders of magnitude more civilians than they suffered themselves merely prove they are more heartless and cruel than the terrorists they condemn. The fact that they do not recognize this in themselves is what produces “unforeseeable” consequences.