1.1 Complex Systems in a Nutshell
It may be rather unusual to begin a textbook with an outright definition of a topic, but anyway, here is what we mean by complex systems in this textbook1:
Complex systems are networks made of a number of components that interact with each other, typically in a nonlinear fashion. Complex systems may arise and evolve through self-organization, such that they are neither completely regular nor completely random, permitting the development of emergent behavior at macroscopic scales.
These properties can be found in many real-world systems, e.g., gene regulatory networks within a cell, physiological systems of an organism, brains and other neural systems, food webs, the global climate, stock markets, the Internet, social media, national and international economies, and even human cultures and civilizations. To better understand what complex systems are, it might help to know what they are not. One example of systems that are not complex is a collection of independent components, such as an ideal gas (as discussed in thermodynamics) and random coin tosses (as discussed in probability theory). This class of systems was called "problems of disorganized complexity" by American mathematician and systems scientist Warren Weaver. Conventional statistics works perfectly when handling such independent entities. Another example, which is at the other extreme, is a collection of strongly coupled components, such as rigid bodies (as discussed in classical mechanics) and fixed coin tosses (I'm not sure which discipline studies this). Weaver called this class of systems "problems of simplicity." In this class, the components of a system are tightly coupled to each other with only a few or no degrees of freedom left within the system, so one can describe the collection as a single entity with a small number of variables. There are very well-developed theories and tools available to handle either case. Unfortunately, however, most real-world systems are somewhere in between.
Complex systems science is a rapidly growing scientific research area that fills the huge gap between the two traditional views that consider systems made of either completely independent or completely coupled components. This is the gap where what Weaver called "problems of organized complexity" exist. Complex systems science develops conceptual, mathematical, and computational tools to describe systems made of interdependent components. It studies the structural and dynamical properties of various systems to obtain general, cross-disciplinary implications and applications.
Complex systems science has multiple historical roots and topical clusters of concepts, as illustrated in Fig. 1.1. There are two core concepts that go across almost all subareas of complex systems: emergence and self-organization. The idea of emergence was originally discussed in philosophy more than a century ago. There are many natural phenomena where some property of a system observed at macroscopic scales simply can't be reduced to the microscopic physical rules that drive the system's behavior. For example, you can easily tell that a dog wagging its tail is alive, but it is extremely difficult to explain what kind of microscopic physical/chemical processes going on in its body are making this organism "alive." Another typical example is your consciousness. You know you are conscious, but it is hard to describe what kind of neurophysiological processes make you a "conscious" entity. Those macroscopic properties (livingness, consciousness) are called emergent properties of the systems.
Despite its long history of discussion and debate, there are still a number of different definitions for the concept of emergence in complex systems science. However, the one thing that is common to most of the proposed definitions is that emergence is about the system's properties at different scales. If you observe a property at a macroscopic scale that is fundamentally different from what you would naturally expect from microscopic rules, then you are witnessing emergence. More concisely, emergence is a nontrivial relationship between the system's properties at different scales. This definition was proposed by complex systems scientist Yaneer Bar-Yam. I will adopt this definition since it is simple and consistent with most of the definitions proposed in the literature.
Emergence is a nontrivial relationship between the properties of a system at microscopic and macroscopic scales. Macroscopic properties are called emergent when they are hard to explain simply from microscopic properties.
Another key idea of complex systems science is self-organization, which is sometimes confused with emergence. Some researchers even use these terms almost interchangeably. One clear distinction, though, is that, while emergence is about scale, self-organization is about time (in addition to scale). Namely, you call something self-organizing when you observe that the system spontaneously organizes itself to produce a nontrivial macroscopic structure and/or behavior (or "order," if you will) as time progresses. In other words, self-organization is a dynamical process that looks as if it were going against the second law of thermodynamics (which states that the entropy of a closed system increases monotonically over time). Many physical, biological, and social systems show self-organizing behavior, which could appear mysterious when people were not aware of the possibility of self-organization. Of course, these systems are not truly going against the second law of thermodynamics, because they are open systems that are driven by energy flow coming from and going to the outside of the system. In a sense, the idea of self-organization gives a dynamical explanation for emergent properties of complex systems.
Self-organization is a dynamical process by which a system spontaneously forms nontrivial macroscopic structures and/or behaviors over time.
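To make this idea concrete, here is a minimal sketch of self-organization in action (my own illustrative example, not taken from the text): a one-dimensional cellular automaton in which each cell repeatedly adopts the majority state of its local neighborhood. Starting from a random configuration, locally ordered domains form spontaneously, which we can see by counting "domain walls" (adjacent cells in different states) over time.

```python
import random

def step(cells):
    """One synchronous update: each cell takes the majority state of
    itself and its two nearest neighbors (periodic boundary)."""
    n = len(cells)
    return [1 if cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n] >= 2 else 0
            for i in range(n)]

def domain_walls(cells):
    """Count adjacent cell pairs in differing states -- a crude
    measure of microscopic disorder."""
    n = len(cells)
    return sum(cells[i] != cells[(i + 1) % n] for i in range(n))

random.seed(0)  # reproducible random initial condition
cells = [random.randint(0, 1) for _ in range(100)]
history = [domain_walls(cells)]
for _ in range(20):
    cells = step(cells)
    history.append(domain_walls(cells))

# Disorder drops as ordered domains self-organize from randomness.
print("domain walls over time:", history)
```

The point of the sketch is that no global coordinator imposes the order: each cell follows only a local rule, yet a nontrivial macroscopic structure (large uniform domains) emerges as time progresses.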
Around these two key ideas, there are several topical clusters, which are illustrated in Fig. 1.1. Let’s quickly review them.
1 In fact, the first sentence of this definition is just a slightly wordier version of Herbert Simon's famous definition in his 1962 paper: "[A] complex system [is a system] made up of a large number of parts that interact in a non-simple way."