Author, reviewer and revision dates:

Created by John C. Thomas on September 4, 2001

Revised, JCT, December 17, 2001

Evocative Image: http://homepage.mac.com/verachuckdave/PhotoAlbum.html

(Not Infinity)

Synonyms:

Abstract:
In developing complex systems, it is often expedient to build feedback loops based on ersatz measures of what we are really interested in assessing and controlling. While this may save effort in the short term, it often leads to serious problems and distortions, particularly in times of crisis or transition, when the correlation between ersatz measures and actuality substantially drifts or even suddenly disconnects. Actions can then be based on these measures or models of reality rather than on reality itself, with negative consequences. The solution is to perform regular "reality checks" to ensure that measures or indicators of reality continue to reflect that reality.

Problem:
In developing complex systems, it is often expedient to build feedback loops based on ersatz measures of what we are really interested in assessing and controlling. While this may save effort in the short term, it often leads to serious problems and distortions, particularly in times of crisis or transition, when the correlation between ersatz measures and actuality substantially drifts or even suddenly disconnects. Actions can then be based on these measures or models of reality rather than on reality itself. This can result in negative consequences.

Context:
Many problems contributed to the disaster at Three Mile Island. One crucial problem arose from the design of a feedback loop. A switch was supposed to close a valve. Beside the switch was a light that was supposed to show that the valve was closed. In fact, rather than being driven by feedback from the valve closure itself, the signal light was merely on a collateral circuit to the switch. All it actually showed was that the switch had moved position (Wickens, 1984). Under normal operation, when the valve worked properly, these two events were perfectly correlated. At a critical point in the meltdown, however, the valve failed to close, and an operator believed that it was closed. The resulting actions, taken on the assumption that the valve was closed, exacerbated the problems.
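The flaw can be shown in miniature. The following Python sketch is purely hypothetical: the Valve class, its attributes, and the two indicator functions are illustrative assumptions, not details of the actual plant. The ersatz indicator merely echoes the switch command; the reality-based indicator reads an independent position sensor.

    # Hypothetical sketch of the feedback flaw described above.
    class Valve:
        def __init__(self):
            self.commanded_closed = False  # what the switch asked for
            self.actually_closed = False   # what the hardware really did

        def command_close(self):
            # A healthy valve would also close; this one sticks open.
            self.commanded_closed = True

    def indicator_ersatz(valve):
        # Like the TMI light: wired to the switch, not the valve.
        return valve.commanded_closed

    def indicator_reality(valve):
        # A reality check: reads an independent sensor on the valve itself.
        return valve.actually_closed

    valve = Valve()
    valve.command_close()             # the valve fails to close
    print(indicator_ersatz(valve))    # True  -- the light says "closed"
    print(indicator_reality(valve))   # False -- the valve is still open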

In running an application program recently, I was given a feedback message that a file had been posted. In fact, it hadn't. The programmer, rather than checking whether the file was actually posted, had merely relied on the completion of a loop.
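The fix is to verify the outcome itself rather than the completion of the code that was supposed to produce it. The Python sketch below is hypothetical; post_records and records.dat are invented names, not taken from the application in question.

    # Hypothetical sketch: trusting loop completion vs. checking the disk.
    RECORDS_PATH = "records.dat"

    def post_records(records):
        with open(RECORDS_PATH, "w") as f:
            for r in records:
                f.write(r + "\n")

    records = ["alpha", "beta"]
    post_records(records)

    # Ersatz feedback: the loop finished, so announce success without looking.
    print("File posted.")

    # Reality check: read the file back and confirm it holds what we sent.
    with open(RECORDS_PATH) as f:
        written = f.read().splitlines()
    print("Verified on disk." if written == records
          else "Posting failed despite the success message.")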

In advertising campaigns, it is difficult to measure the impact on sales. Instead, companies typically measure the "recall" and "recognition" rates of ads. These may often be correlated with changes in sales, but in some cases an ad may be very memorable yet give the customer such a negative impression of the company that it decreases the chances of actually selling a product.

Historically, monarchs and dictators often surrounded themselves only with people who gave them good reports and support no matter how their decisions affected the reality of their realm. The performance of such rulers tended to deteriorate over time because their behavior was shaped by this ersatz feedback rather than by reality.

During the "oil crisis" in the seventies, oil companies relied on mathematical models of continually increasing demand. Year after year, for seven years, they relied on these models to predict demand despite the fact that, for seven years, demand actually went down. The results are purported to have cost them tens of billions of dollars (Van der Heijden, 1996).

Forces:
Organizations are often hierarchically decomposed and bureaucratic. It is therefore often simplest to communicate with those close to us in the hierarchy and to build systems whose model of reality relies only on things within the immediate control span of our small part of the organization.

While it is more comfortable to limit system design and development to things within one's own team or department, it is often precisely the work needed to capture more reality-based measures that reveals additional challenges and opportunities in business process coherence.

The measure of reality is often more time-consuming, more costly, or more difficult than the measure of something more proximal that is often correlated with reality.

It is likely to be exactly at times of crisis and transition that the correlation between proximal ersatz measures and their referent in reality will be destroyed.

It is likely to be exactly at times of crisis and transition that people will tend to simplify their cognitive models of the world and, among other things, forget that the proximal measure is only ersatz.

Solution:
Therefore:

Whenever feasible, feedback in any business process should be based on reality checks, not on ersatz measures. When this is too costly (as opposed to merely inconvenient or uncomfortable), then at least design systems so that the correlation between proximal measures and their referents in reality is double-checked periodically.
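One way to implement that weaker fallback is sketched below in Python: keep the cheap proximal measure for routine use, but periodically audit a random sample against the costly reality measure and raise an alarm when the correlation degrades. Every name and threshold here (proximal_measure, reality_measure, the 90% agreement bar) is an illustrative assumption, not part of the pattern itself.

    import random

    def proximal_measure(item):
        return item["indicator"]     # cheap ersatz signal, e.g., ad recall

    def reality_measure(item):
        return item["ground_truth"]  # costly reality check, e.g., actual sales

    def audit(items, sample_size=10, threshold=0.9):
        """Spot-check a sample; alarm if measure and reality have drifted apart."""
        sample = random.sample(items, min(sample_size, len(items)))
        agree = sum(1 for it in sample
                    if proximal_measure(it) == reality_measure(it))
        rate = agree / len(sample)
        if rate < threshold:
            raise RuntimeError(
                f"Proximal measure matches reality only {rate:.0%} of the "
                "time; recalibrate before trusting it further.")
        return rate

    items = [{"indicator": True, "ground_truth": True} for _ in range(50)]
    print(f"Agreement in audited sample: {audit(items):.0%}")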

Examples:
Rather than rely solely on a circle of politically minded advisors, Peter the Great disguised himself and inspected various situations in Russia in person.

Resulting Context:

Rationale:

Related Patterns:
System as a Whole
Convergent Measures
Drawing the Line
Who Speaks for Wolf

Known Uses:
Richard Feynman, during the Manhattan Project, noticed that the bureaucracy was worried about the possibility of accidentally stockpiling a critical mass of uranium. To prevent this, each section chief was required to ensure that their section did not have a critical mass. To ensure this, each section chief instructed each subsection chief to ensure that their subsection didn't have a critical mass, and so on, down to the smallest level of the bureaucracy. Upon hearing this plan, Feynman observed that neutrons probably didn't much care whose subsection they reported to!

In another incident, various bureaucrats were each trying to prove that they had better security than their peers. To prove this, they escalated to buying ever bigger and thicker safes; the bigger and thicker the safe, the more secure they felt their secrets were. Feynman discovered that more than half of the super-safe safes had been left with the factory-installed combination of 50-50-50 and were trivially easy to break into.

References:
Wickens, C. Engineering psychology and human performance. Columbus: Merrill, 1984.

Van der Heijden, K. Scenarios: The art of strategic conversation. Chichester: Wiley, 1996.

Hutchings, E., Leighton, R., Feynman, R., and Hibbs, A. Surely you're joking, Mr. Feynman! New York: Norton, 1997.

Underwood, P. The Walking People: A Native American oral history. San Anselmo, CA: Tribe of Two Press, 1993.
