Thinking in Systems

This short book is the work of an experienced practitioner who wants to introduce the basic concepts and tools of her field to a general audience, while avoiding the dumbing down and trivialization that are so frequent in such works. This is not an easy task, given that systems science is either unknown to the general population or known only at a very superficial level. It is to Meadows' credit that she imparts to the reader the feeling of having understood not only some core concepts and deep truths about systems, but also the role of the systems analyst in a complex world. The book consists of three parts. The first teaches the very basic vocabulary of systems, describing the most basic kinds of systems in the process. The second explains how these concepts and principles lead to complex, surprising behavior in the real world, with ample examples of such behavior. The third and final part advises on how to change systems, and what to expect from our interventions.

Not surprisingly, the first part opens with a relatively straightforward definition of a system: a system is “an interconnected set of elements that is coherently organized in a way that achieves something”. This definition is obviously rather broad, and not particularly useful on its own, because the interesting thing about systems is not that they fit a certain definition or taxonomy, but that they exhibit similar behavior due to similar structures of feedback: “systems with similar feedback structures produce similar dynamic behaviors, even if the outward appearance of these systems is completely dissimilar” (p. 51). This is the fundamental tenet of systems research as a science: it does not treat a phenomenon (e.g. the economy, the environment, a social unit) in its entirety, but within an abstraction that sees it as consisting of the units of systems thinking.

These units are the concepts explained in the first chapter, the most important of which are stock, flow, feedback loop and archetype. To use Meadows’ examples, a stock is the amount of heat in a room, the population of fish in a lake, or the capital a company possesses. In the same order, flows are heating or cooling, fishing or fish reproducing, and the company spending or earning money. The central point of this book, and of systems research in general, is that if a stock of fish and a company have the same feedback structure (i.e. the stock affecting the flow into or out of that stock), then similar behavior, such as the goal-seeking of a balancing loop, will appear in both systems, and can be studied using the same tools. Examples of such patterns are given in the “systems zoo”, where prototypical systems (single- or two-stock, with or without delay, renewable or nonrenewable resources) are compared to each other. An especially interesting example is the role of time in systems, i.e. how delay in signals can lead to unexpected oscillations, which can either be harnessed to keep the system stable within certain bounds, or can grow into uncontrolled oscillations. Meadows does an excellent job of describing the kinds of behavior that can arise from simple system units, but it has to be said that her selection of examples is sometimes too simplistic (or maybe even irrelevant), such as siblings pushing each other as an example of feedback. Also, the pseudo-poem on systems on page 87 is plain horrible; it shouldn’t have been included, in my opinion.
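To make the stock-and-flow mechanics concrete, here is a minimal simulation sketch in Python (my own illustration, not from the book; all names and parameter values are assumptions): a room's heat is the stock, a heating flow driven by a balancing feedback loop pushes it toward a goal temperature, and introducing a delay in the perceived temperature turns the same smooth approach into the oscillations Meadows describes.

```python
# Minimal stock-and-flow sketch (illustrative; parameters are my own assumptions).
# Stock: room temperature. Inflow: heating, proportional to the perceived gap
# between the goal and the stock (a balancing feedback loop). A delay in the
# perception of the stock makes the same loop overshoot and oscillate.

def simulate(goal=20.0, start=10.0, gain=0.5, delay_steps=0, steps=40):
    stock = start
    history = [stock]
    for t in range(steps):
        # The controller reacts to a possibly outdated reading of the stock.
        perceived = history[-1 - delay_steps] if t >= delay_steps else history[0]
        heating_flow = gain * (goal - perceived)  # balancing loop: the gap drives the flow
        stock += heating_flow                     # the flow changes the stock
        history.append(stock)
    return history

if __name__ == "__main__":
    no_delay = simulate(delay_steps=0)
    with_delay = simulate(delay_steps=3)
    print("no delay  :", [round(x, 1) for x in no_delay[:12]])
    print("with delay:", [round(x, 1) for x in with_delay[:12]])
    # Without a delay the stock settles smoothly at the goal; with a delay the
    # same balancing loop repeatedly overshoots and oscillates around it.
```

The structural point is the one Meadows makes with the systems zoo: the behavior follows from the loop-with-delay structure, regardless of whether the stock is heat, fish, or capital.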

The second part builds on the description of systems given in the first to help the reader understand the complex, unexpected behavior exhibited by real-life systems, and the tools a systems analyst has at her disposal to probe deeper into them. Systems analysts do not stop at describing the events that take place in or around a system, or its behavior in a certain time window; they are more interested in the structures that lead to such behavior. These structures are deduced from the visible actors in the system, past events, and the comparison of behavior to a model. Factors that analysts need to pay attention to in this process are the character of the relationships (linear and nonlinear relationships exhibit completely different characteristics), the nature of the boundaries (whether they are realistic, or only an artifact of language), and the limiting factors (what they are, how they change, and how they affect other factors). Another way systems can surprise the layman is when they settle into a pattern that is undesirable for all parties, which Meadows calls a system trap. These negative patterns, examples of which are the tragedy of the commons or addiction, arise when the pull of all the actors in the system is in the direction that propagates the pattern, instead of leading out of it.
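To illustrate why such a trap is self-propagating, here is a small sketch (my own construction with made-up numbers, not an example taken from the book) of the tragedy of the commons: each herder's individually rational choice is to add another animal, and the aggregate of those choices drives the shared pasture toward collapse.

```python
# Tragedy-of-the-commons sketch (illustrative numbers of my own, not from the book).
# The shared pasture is a stock with a regrowth inflow and a grazing outflow.
# Each herder optimizes privately, so every actor's pull propagates the pattern.

REGROWTH_RATE = 0.1        # fraction of the remaining pasture that regrows each year
GRAZING_PER_ANIMAL = 2.0   # pasture units one animal consumes per year
MAX_PASTURE = 1000.0

def year_step(pasture, total_animals):
    """Advance the shared stock by one year: grazing outflow, then regrowth inflow."""
    demand = GRAZING_PER_ANIMAL * total_animals
    grazed = min(demand, pasture)                        # cannot graze more than exists
    pasture -= grazed
    pasture = min(MAX_PASTURE, pasture * (1 + REGROWTH_RATE))
    return pasture

herds = [10, 10, 10]       # three herders, initially at a sustainable level
pasture = MAX_PASTURE
for year in range(1, 26):
    # Individually rational move: the private benefit of one more animal is the
    # herder's alone, while the cost is spread across everyone sharing the pasture.
    herds = [h + 1 for h in herds]
    pasture = year_step(pasture, sum(herds))
    print(f"year {year:2d}: animals={sum(herds):3d} pasture={pasture:7.1f}")
# For the first few years regrowth keeps up with demand; once it no longer does,
# the pasture declines faster and faster, even though no herder intends the collapse.
```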

Meadows gives examples of system patterns and traps from many different domains, but most frequently from socio-political systems; in the case of traps, nearly all of her examples come from this domain. Many of the examples, such as the addiction trap, where a substance (drugs for a human, or oil for an economy) creates a dependency that increases with its own use, are revealing and realistic, but some, such as the concept of bounded rationality, or the trap of “success to the successful” (i.e. the rich getting richer), present a difficulty that I think points at the limits of systems thinking. Both of these concepts concern humans acting to the detriment of others without necessarily intending to do so, because the system leads them in that direction. In bounded rationality, for example, people act based on the limited information they have, within the limited time frame in which they can analyze it. The problem with these system-theoretic explanations is that they ignore power relations in society, a factor endemic to this domain and very difficult to capture with the tools of systems thinking. How, for example, would one place disinformation campaigns, where certain groups deliberately spread falsehoods to manipulate society, into a systems framework? The only positive outcome is more power for those groups, of which they are fully conscious.

The last part of the book is a plea for a more human and “integral” view of systems. Meadows asks the systems researcher to show humility and accept that systems have a character and will of their own that cannot be captured by short-sighted models. In her words, “We can’t impose our will on a system. We can listen to what the system tells us, and discover how its properties and our values can work together to bring forth something much better than could ever be produced by our will alone.” (p. 170). This attitude of humility is what makes this book particularly valuable, and a great introduction to systems thinking.