The answer clearly lies within the Slinky itself. The hands that manipulate it suppress or release some behavior that is latent within the structure of the spring.
That is a central insight of systems theory.
Once we see the relationship between structure and behavior, we can begin to understand how systems work, what makes them produce poor results, and how to shift them into better behavior patterns.
So, what is a system? A system is a set of things—people, cells, molecules, or whatever—interconnected in such a way that they produce their own pattern of behavior over time.
A system is an interconnected set of elements that is coherently organized in a way that achieves something. If you look at that definition closely for a minute, you can see that a system must consist of three kinds of things: elements, interconnections, and a function or purpose.
Some interconnections in systems are actual physical flows, such as the water in the tree’s trunk or the students progressing through a university. Many interconnections are flows of information—signals that go to decision points or action points within a system. These kinds of interconnections are often harder to see, but the system reveals them to those who look.
The best way to deduce the system’s purpose is to watch for a while to see how the system behaves.
Purposes are deduced from behavior, not from rhetoric or stated goals.
An important function of almost every system is to ensure its own perpetuation.
Keeping sub-purposes and overall system purposes in harmony is an essential function of successful systems.
If you understand the dynamics of stocks and flows—their behavior over time—you understand a good deal about the behavior of complex systems.
A stock takes time to change, because flows take time to flow.
Stocks allow inflows and outflows to be decoupled and to be independent and temporarily out of balance with each other.
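The bathtub arithmetic behind a stock can be sketched in a few lines of code. This is only an illustrative sketch: the function name, the numbers, and the fixed flow rates are all hypothetical, chosen to show that a stock changes gradually even when its flows are steady.

```python
# Minimal stock-and-flow sketch: a stock integrates its flows over time.
# All names and rates here are hypothetical, for illustration only.

def simulate_stock(initial, inflow, outflow, steps, dt=1.0):
    """Euler-step a single stock: it changes by (inflow - outflow) * dt."""
    stock = initial
    history = [stock]
    for _ in range(steps):
        stock += (inflow - outflow) * dt   # flows take time to flow
        history.append(stock)
    return history

# Inflow exceeds outflow, so the stock rises steadily; the inflow and
# outflow are out of balance, and the stock absorbs the difference.
history = simulate_stock(initial=100.0, inflow=5.0, outflow=3.0, steps=10)
```

The stock acts as a buffer: the two flows never have to match from moment to moment, because the level of the stock takes up the slack.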
Systems thinkers see the world as a collection of stocks along with the mechanisms for regulating the levels in the stocks by manipulating flows. That means systems thinkers see the world as a collection of “feedback processes.”
A feedback loop is a closed chain of causal connections from a stock, through a set of decisions or rules or physical laws or actions that are dependent on the level of the stock, and back again through a flow to change the stock.
Balancing feedback loops are equilibrating or goal-seeking structures in systems and are both sources of stability and sources of resistance to change.
Reinforcing feedback loops are self-enhancing, leading to exponential growth or to runaway collapses over time. They are found whenever a stock has the capacity to reinforce or reproduce itself.
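The two loop types produce visibly different behavior over time, which a small sketch can make concrete. The gain, goal, and adjustment fraction below are hypothetical values, not anything from the text.

```python
# Sketch of the two basic feedback-loop types; all parameters are
# illustrative assumptions.

def reinforcing(stock, gain, steps):
    """Each step, the inflow is proportional to the stock itself."""
    out = [stock]
    for _ in range(steps):
        stock += gain * stock              # more stock -> more inflow
        out.append(stock)
    return out

def balancing(stock, goal, adjust, steps):
    """Each step, the flow closes a fraction of the gap to the goal."""
    out = [stock]
    for _ in range(steps):
        stock += adjust * (goal - stock)   # the gap shrinks each step
        out.append(stock)
    return out

growth = reinforcing(1.0, gain=0.1, steps=5)        # exponential growth
settling = balancing(0.0, goal=100.0, adjust=0.5, steps=5)  # approaches 100
```

The reinforcing loop compounds (each value is 1.1 times the last), while the balancing loop halves its remaining gap to the goal each step: equilibrating, goal-seeking behavior.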
You’ll be thinking not in terms of a static world, but a dynamic one. You’ll stop looking for who’s to blame; instead you’ll start asking, “What’s the system?” The concept of feedback opens up the idea that a system can cause its own behavior.
There’s an important general principle here, and also one specific to the thermostat structure. First the general one: The information delivered by a feedback loop can only affect future behavior; it can’t deliver a signal fast enough to correct the behavior that drove the current feedback. Even nonphysical information takes time to feed back into the system. A person in the system who makes a decision based on that feedback can’t change the behavior that drove the current feedback; his or her decisions will affect only future behavior.
A flow can’t react instantly to a flow. It can react only to a change in a stock, and only after a slight delay to register the incoming information.
A stock-maintaining balancing feedback loop must have its goal set appropriately to compensate for draining or inflowing processes that affect that stock. Otherwise, the feedback process will fall short of or exceed the target for the stock.
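The thermostat structure makes this principle concrete. In the hypothetical sketch below (all numbers are illustrative), heat leaks out of the room every step; if the furnace responds only to the gap between the goal and the current temperature, the room settles below the goal, because at equilibrium the gap must stay just large enough to offset the leak.

```python
# A balancing loop maintaining a stock (room temperature) against a
# draining process (heat leaking outdoors). Parameters are hypothetical.

def thermostat(temp, goal, gain, leak, steps):
    for _ in range(steps):
        heating = gain * (goal - temp)   # furnace responds to the gap
        temp += heating - leak           # the leak drains the stock each step
    return temp

# With the goal set at the desired 20 degrees, the room settles at 18:
# heating balances the leak only when the gap equals leak / gain = 2.
settled = thermostat(temp=10.0, goal=20.0, gain=0.5, leak=1.0, steps=50)
```

To actually hold 20 degrees, the goal would have to be set above 20 to compensate for the drain, which is exactly what the passage above warns about.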
Complex behaviors of systems often arise as the relative strengths of feedback loops shift, causing first one loop and then another to dominate behavior.
A delay in a balancing feedback loop makes a system likely to oscillate.
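The oscillation caused by a delay can be shown with a variation of the goal-seeking loop in which the correction acts on out-of-date information about the stock. The delay length and adjustment fraction below are hypothetical.

```python
# Sketch: a goal-seeking loop that acts on delayed information about
# its own stock. All numbers are illustrative assumptions.

def delayed_balancing(stock, goal, adjust, delay, steps):
    history = [stock] * (delay + 1)      # pretend the past equaled the start
    for _ in range(steps):
        perceived = history[-1 - delay]  # act on old information
        stock += adjust * (goal - perceived)
        history.append(stock)
    return history[delay:]

no_delay = delayed_balancing(0.0, goal=100.0, adjust=0.5, delay=0, steps=20)
with_delay = delayed_balancing(0.0, goal=100.0, adjust=0.5, delay=3, steps=20)
# Without delay, the stock glides smoothly up to the goal. With the delay,
# corrections keep arriving after they are needed, so the stock overshoots
# 100 and swings above and below the goal.
```

The correction keeps pushing after the stock has already passed the goal, so the system overshoots, then overcorrects in the other direction: oscillation.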
Systems need to be managed not only for productivity or stability but also for resilience—the ability to recover from perturbation, the ability to restore or repair themselves.
This capacity of a system to make its own structure more complex is called self-organization.
When a subsystem’s goals dominate at the expense of the total system’s goals, the resulting behavior is called suboptimization.
Hierarchical systems evolve from the bottom up. The purpose of the upper layers of the hierarchy is to serve the purposes of the lower layers.
Everything we think we know about the world is a model.
The behavior of a system is its performance over time—its growth, stagnation, decline, oscillation, randomness, or evolution.
The structure of a system is its interlocking stocks, flows, and feedback loops.
System structure is the source of system behavior. System behavior reveals itself as a series of events over time.
There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion—the questions we want to ask.
It’s a great art to remember that boundaries are of our own making, and that they can and should be reconsidered for each new discussion, problem, or purpose.
At any given time, the input that is most important to a system is the one that is most limiting.
Any physical entity with multiple inputs and outputs—a population, a production process, an economy—is surrounded by layers of limits.
Ultimately, the choice is not to grow forever but to decide what limits to live within.
Bounded rationality means that people make quite reasonable decisions based on the information they have. But they don’t have perfect information, especially about more distant parts of the system.
We call the system structures that produce such common patterns of problematic behavior archetypes.
Drift to low performance is a gradual process. If the system state plunged quickly, there would be an agitated corrective process. But if it drifts down slowly enough to erase the memory of (or belief in) how much better things used to be, everyone is lulled into lower and lower expectations, lower effort, lower performance.
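The drift can be sketched numerically: let the goal erode toward perceived performance while effort chases the (now lower) goal. Every name and rate below is a hypothetical illustration, not the author's model.

```python
# Sketch of "drift to low performance": the standard slips toward actual
# performance, and performance then chases the lowered standard.
# All parameters are illustrative assumptions.

def drift(performance, goal, erosion, effort, steps):
    for _ in range(steps):
        goal -= erosion * (goal - performance)        # standard slips to "actual"
        performance += effort * (goal - performance)  # effort chases the goal
        performance *= 0.98                           # steady background decay
    return performance, goal

eroded_p, eroded_g = drift(performance=80.0, goal=100.0,
                           erosion=0.2, effort=0.5, steps=40)
absolute_p, absolute_g = drift(performance=80.0, goal=100.0,
                               erosion=0.0, effort=0.5, steps=40)
# With an eroding goal, standard and performance ratchet down together.
# With an absolute goal (erosion=0), performance holds near the standard.
```

This is why the classic way out of the trap is to keep standards absolute, regardless of performance.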
THE TRAP: ESCALATION When the state of one stock is determined by trying to surpass the state of another stock—and vice versa—then there is a reinforcing feedback loop carrying the system into an arms race, a wealth race, a smear campaign, escalating loudness, escalating violence. The escalation is exponential and can lead to extremes surprisingly quickly. If nothing is done, the spiral will be stopped by someone’s collapse—because exponential growth cannot go on forever. THE WAY OUT
The best way out of this trap is to avoid getting in it. If caught in an escalating system, one can refuse to compete (unilaterally disarm), thereby interrupting the reinforcing loop. Or one can negotiate a new system with balancing loops to control the escalation.
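The exponential character of escalation, and the effect of refusing to compete, can both be seen in a toy sketch. The one-upmanship fraction and round counts are hypothetical.

```python
# Sketch of escalation: each side sets its stock to surpass the other's
# by a fixed margin. All numbers are illustrative assumptions.

def escalate(a, b, step_up, rounds, a_competes=True):
    for _ in range(rounds):
        if a_competes:
            a = b * (1 + step_up)   # A tries to exceed B
        b = a * (1 + step_up)       # B tries to exceed A
    return a, b

a, b = escalate(1.0, 1.0, step_up=0.1, rounds=20)
# Both stocks grow by a factor of 1.21 per round -- exponential escalation.
# If A unilaterally stops competing (a_competes=False), the reinforcing
# loop is interrupted and B's level stops climbing.
```

Topping the other side by only 10 percent per round still multiplies both stocks more than forty-fold in twenty rounds, which is why escalation reaches extremes surprisingly quickly.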
THE TRAP: SUCCESS TO THE SUCCESSFUL If the winners of a competition are systematically rewarded with the means to win again, a reinforcing feedback loop is created by which, if it is allowed to proceed uninhibited, the winners eventually take all, while the losers are eliminated. THE WAY OUT Diversification, which allows those who are losing the competition to get out of that game and start another one; strict limitation on the fraction of the pie any one winner may win (antitrust laws); policies that level the playing field, removing some of the advantage of the strongest players or increasing the advantage of the weakest; policies that devise rewards for success that do not bias the next round of competition.
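A minimal sketch of the reinforcing loop in this trap: if each round's reward goes entirely to the current leader, a modest initial advantage compounds into winner-take-all. The shares, pie size, and round count below are hypothetical.

```python
# Sketch of "success to the successful": each round's reward goes to the
# current winner, who thereby gains the means to win again.
# All numbers are illustrative assumptions.

def compete(shares, pie, rounds):
    shares = list(shares)               # don't mutate the caller's list
    for _ in range(rounds):
        winner = shares.index(max(shares))
        shares[winner] += pie           # the reward biases the next round
    return shares

start = [60.0, 40.0]                    # a modest initial advantage
end = compete(start, pie=10.0, rounds=50)
# The leader collects every reward; the loser never catches up.
```

A leveling rule of the kind the passage suggests, such as splitting part of each reward equally or capping any one player's fraction of the pie, weakens the reinforcing loop.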
THE TRAP: SHIFTING THE BURDEN TO THE INTERVENOR Shifting the burden, dependence, and addiction arise when a solution to a systemic problem reduces (or disguises) the symptoms, but does nothing to solve the underlying problem. Whether it is a substance that dulls one’s perception or a policy that hides the underlying trouble, the drug of choice interferes with the actions that could solve the real problem. If the intervention designed to correct the problem causes the self-maintaining capacity of the original system to atrophy or erode, then a destructive reinforcing feedback loop is set in motion. The system deteriorates; more and more of the solution is then required. The system will become more and more dependent on the intervention and less and less able to maintain its own desired state. THE WAY OUT Again, the best way out of this trap is to avoid getting in. Beware of symptom-relieving or signal-denying policies or practices that don’t really address the problem. Take the focus off short-term relief and put it on long-term restructuring.
THE TRAP: RULE BEATING Rules to govern a system can lead to rule beating—perverse behavior that gives the appearance of obeying the rules or achieving the goals, but that actually distorts the system.
THE WAY OUT Design, or redesign, rules to release creativity not in the direction of beating the rules, but in the direction of achieving the purpose of the rules.
THE TRAP: SEEKING THE WRONG GOAL System behavior is particularly sensitive to the goals of feedback loops. If the goals—the indicators of satisfaction of the rules—are defined inaccurately or incompletely, the system may obediently work to produce a result that is not really intended or wanted. THE WAY OUT Specify indicators and goals that reflect the real welfare of the system. Be especially careful not to confuse effort with result or you will end up with a system that is producing effort, not result.
10. Stock-and-Flow Structures—Physical systems and their nodes of intersection The plumbing structure, the stocks and flows and their physical arrangement, can have an enormous effect on how the system operates.
The only way to fix a system that is laid out poorly is to rebuild it, if you can.
9. Delays—The lengths of time relative to the rates of system changes Delays in feedback loops are critical determinants of system behavior. They are common causes of oscillations.
8. Balancing Feedback Loops—The strength of the feedbacks relative to the impacts they are trying to correct
Balancing feedback loops are ubiquitous in systems. Nature evolves them and humans invent them as controls to keep important stocks within safe bounds. A thermostat loop is the classic example.
7. Reinforcing Feedback Loops—The strength of the gain of driving loops A balancing feedback loop is self-correcting; a reinforcing feedback loop is self-reinforcing. The more it works, the more it gains power to work some more, driving system behavior in one direction.
6. Information Flows—The structure of who does and does not have access to information
Missing information flows are one of the most common causes of system malfunction. Adding or restoring information can be a powerful intervention, usually much easier and cheaper than rebuilding physical infrastructure. The tragedy of the commons that is crashing the world’s commercial fisheries occurs because there is little feedback from the state of the fish population to the decision to invest in fishing vessels.
5. Rules—Incentives, punishments, constraints The rules of the system define its scope, its boundaries, its degrees of freedom. Thou shalt not kill. Everyone has the right of free speech. Contracts are to be honored.
4. Self-Organization—The power to add, change, or evolve system structure
The most stunning thing living systems and some social systems can do is to change themselves utterly by creating whole new structures and behaviors. In biological systems that power is called evolution. In human economies it’s called technical advance or social revolution. In systems lingo it’s called self-organization.
3. Goals—The purpose or function of the system
Right there, the diversity-destroying consequence of the push for control demonstrates why the goal of a system is a leverage point superior to the self-organizing ability of a system.
2. Paradigms—The mind-set out of which the system—its goals, structure, rules, delays, parameters—arises
Paradigms are the sources of systems. From them, from shared social agreements about the nature of reality, come system goals and information flows, feedbacks, stocks, flows, and everything else about systems. No one has ever said that better than Ralph Waldo Emerson: Every nation and every man instantly surround themselves with a material apparatus which exactly corresponds to . . . their state of thought. Observe how every truth and every error, each a thought of some man’s mind, clothes itself with societies, houses, cities, language, ceremonies, newspapers. Observe the ideas of the present day . . . see how timber, brick, lime, and stone have flown into convenient shape, obedient to the master idea reigning in the minds of many persons. . . . It follows, of course, that the least enlargement of ideas . . . would cause the most striking changes of external things.
Systems modelers say that we change paradigms by building a model of the system, which takes us outside the system and forces us to see it whole. I say that because my own paradigms have been changed that way.
1. Transcending Paradigms There is yet one leverage point that is even higher than changing a paradigm. That is to keep oneself unattached in the arena of paradigms, to stay flexible, to realize that no paradigm is “true,” that every one, including the one that sweetly shapes your own worldview, is a tremendously limited understanding of an immense and amazing universe that is far beyond human comprehension. It is to “get” at a gut level the paradigm that there are paradigms, and to see that that itself is a paradigm, and to regard that whole realization as devastatingly funny. It is to let go into not-knowing, into what the Buddhists call enlightenment.
It is in this space of mastery over paradigms that people throw off addictions, live in constant joy, bring down empires, get locked up or burned at the stake or crucified or shot, and have impacts that last for millennia.
Self-organizing, nonlinear, feedback systems are inherently unpredictable. They are not controllable. They are understandable only in the most general way. The goal of foreseeing the future exactly and preparing for it perfectly is unrealizable. The idea of making a complex system do just what you want it to do can be achieved only temporarily, at best. We can never fully understand our world, not in the way our reductionist science has led us to expect. Our science itself, from quantum theory to the mathematics of chaos, leads us into irreducible uncertainty. For any objective other than the most trivial, we can’t optimize; we don’t even know what to optimize. We can’t keep track of everything. We can’t find a proper, sustainable relationship to nature, each other, or the institutions we create, if we try to do it from the role of omniscient conqueror.
Before you disturb the system in any way, watch how it behaves. If it’s a piece of music or a whitewater rapid or a fluctuation in a commodity price, study its beat. If it’s a social system, watch it work. Learn its history. Ask people who’ve been around a long time to tell you what has happened. If possible, find or make a time graph of actual data from the system—peoples’ memories are not always reliable when it comes to timing.
You don’t have to put forth your mental model with diagrams and equations, although doing so is a good practice. You can do it with words or lists or pictures or arrows showing what you think is connected to what. The more you do that, in any form, the clearer your thinking will become, the faster you will admit your uncertainties and correct your mistakes, and the more flexible you will learn to be. Mental flexibility—the willingness to redraw boundaries, to notice that a system has shifted into a new mode, to see how to redesign structure—is a necessity when you live in a world of flexible systems.
Remember, always, that everything you know, and everything everyone knows, is only a model. Get your model out there where it can be viewed. Invite others to challenge your assumptions and add their own. Instead of becoming a champion for one possible explanation or hypothesis or model, collect as many as possible. Consider all of them to be plausible until you find some evidence that causes you to rule one out. That way you will be emotionally able to see the evidence that rules out an assumption that may become entangled with your own identity.
If I could, I would add an eleventh commandment to the first ten: Thou shalt not distort, delay, or withhold information.
Use Language with Care and Enrich It with Systems Concepts Our information streams are composed primarily of language. Our mental models are mostly verbal. Honoring information means above all avoiding language pollution—making the cleanest possible use we can of language. Second, it means expanding our language so we can talk about complexity.
Pay Attention to What Is Important, Not Just What Is Quantifiable
Pretending that something doesn’t exist if it’s hard to quantify leads to faulty models. You’ve already seen the system trap that comes from setting goals around what is easily measured, rather than around what is important. So don’t fall into that trap. Human beings have been endowed not only with the ability to count, but also with the ability to assess quality. Be a quality detector.
Make Feedback Policies for Feedback Systems
Go for the Good of the Whole Remember that hierarchies exist to serve the bottom layers, not the top. Don’t maximize parts of systems or subsystems while ignoring the whole.
Listen to the Wisdom of the System Aid and encourage the forces and structures that help the system run itself. Notice how many of those forces and structures are at the bottom of the hierarchy. Don’t be an unthinking intervenor and destroy the system’s own self-maintenance capacities. Before you charge in to make things better, pay attention to the value of what’s already there.
Locate Responsibility in the System That’s a guideline both for analysis and design. In analysis, it means looking for the ways the system creates its own behavior. Do pay attention to the triggering events, the outside influences that bring forth one kind of behavior from the system rather than another. Sometimes those outside events can be controlled (as in reducing the pathogens in drinking water to keep down incidences of infectious disease). But sometimes they can’t. And sometimes blaming or trying to control the outside influence blinds one to the easier task of increasing responsibility within the system.
Stay Humble—Stay a Learner Systems thinking has taught me to trust my intuition more and my figuring-out rationality less, to lean on both as much as I can, but still to be prepared for surprises. Working with systems, on the computer, in nature, among people, in organizations, constantly reminds me of how incomplete my mental models are, how complex the world is, and how much I don’t know. The thing to do, when you don’t know, is not to bluff and not to freeze, but to learn. The way you learn is by experiment—or, as Buckminster Fuller put it, by trial and error, error, error. In a world of complex systems, it is not appropriate to charge forward with rigid, undeviating directives. “Stay the course” is only a good idea if you’re sure you’re on course. Pretending you’re in control even when you aren’t is a recipe not only for mistakes, but for not learning from mistakes. What’s appropriate when you’re learning is small steps, constant monitoring, and a willingness to change course as you find out more about where it’s leading. That’s hard.
Expand Time Horizons
There is a great deal of historical evidence to suggest that a society which loses its identity with posterity and which loses its positive image of the future loses also its capacity to deal with present problems, and soon falls apart. . . .
Defy the Disciplines
Seeing systems whole requires more than being “interdisciplinary,” if that word means, as it usually does, putting together people from different disciplines and letting them talk past each other.
They will have to go into learning mode. They will have to admit ignorance and be willing to be taught, by each other and by the system.
Expand the Boundary of Caring
As with everything else about systems, most people already know about the interconnections that make moral and practical rules turn out to be the same rules.
They just have to bring themselves to believe that which they know.
Don’t Erode the Goal of Goodness
We know what to do about drift to low performance. Don’t weigh the bad news more heavily than the good. And keep standards absolute.
For more resources, see also www.ThinkingInSystems.org
Systems Thinking and Business
Senge, Peter. The Fifth Discipline: The Art and Practice of the Learning Organization. (New York: Doubleday, 1990). Systems thinking in a business environment, and also the broader philosophical tools that arise from and complement systems thinking, such as mental-model flexibility and visioning.
Sherwood, Dennis. Seeing the Forest for the Trees: A Manager’s Guide to Applying Systems Thinking. (London: Nicholas Brealey Publishing, 2002).
Sterman, John D. Business Dynamics: Systems Thinking and Modeling for a Complex World. (Boston: Irwin/McGraw-Hill, 2000).
Macy, Joanna. Mutual Causality in Buddhism and General Systems Theory. (Albany, NY: State University of New York Press, 1991).
Meadows, Donella H. The Global Citizen. (Washington, DC: Island Press, 1991).