
# 3: Basics of Dynamical Systems


A dynamical system is a system whose state is uniquely specified by a set of variables and whose behavior is described by predefined rules.
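As a minimal sketch of this definition, the state can be a single variable and the predefined rule a difference equation. The particular rule x_{t+1} = a·x_t and the value a = 1.1 below are illustrative assumptions, not taken from the text:

```python
# State: one variable x. Predefined rule: x_{t+1} = a * x_t (illustrative choice).
a = 1.1
x = 1.0            # initial state
states = [x]
for t in range(5):
    x = a * x      # apply the rule to obtain the next state
    states.append(x)
print(states)      # the trajectory of states over time
```

Knowing the rule and the current state is enough to generate the entire future trajectory, which is exactly what "behavior described by predefined rules" means here.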

• 3.1: What are Dynamical Systems?
Dynamical systems theory is the very foundation of almost any kind of rule-based model of complex systems. It considers how systems change over time, not just the static properties of observations.
• 3.2: Phase Space
A phase space of a dynamical system is a theoretical space where every state of the system is mapped to a unique spatial location. The number of state variables needed to uniquely specify the system’s state is called the degrees of freedom in the system. You can build a phase space of a system by having an axis for each degree of freedom, i.e., by taking each state variable as one of the orthogonal axes.
• 3.3: What Can We Learn?
You can tell from the phase space what will eventually happen to a system’s state in the long run. For a deterministic dynamical system, its future state is uniquely determined by its current state (hence, the name “deterministic”). Trajectories of a deterministic dynamical system will never branch off in its phase space (though they could merge), because if they did, that would mean that multiple future states were possible, which would violate the deterministic nature of the system.