# 3.1: What are Dynamical Systems?


Dynamical systems theory is the very foundation of almost any kind of rule-based model of complex systems. It considers how systems change over time, not just the static properties of observations. A dynamical system can be informally defined as follows^{1}:

Definition: Dynamical System

A *dynamical system* is a system whose state is uniquely specified by a set of variables and whose behavior is described by predefined rules.

Examples of dynamical systems include population growth, a swinging pendulum, the motions of celestial bodies, and the behavior of “rational” individuals playing a negotiation game, to name a few. The first three examples sound legitimate, as those are systems that typically appear in physics textbooks. But what about the last example? Could human behavior be modeled as a deterministic dynamical system? The answer depends on how you formulate the model using relevant assumptions. If you assume that individuals always make decisions perfectly rationally, then the decision-making process becomes deterministic, and therefore the interactions among them may be modeled as a deterministic dynamical system. Of course, this doesn’t guarantee that the model is a good one; the assumption has to be critically evaluated based on the criteria discussed in the previous chapter.

Anyway, dynamical systems can be described over either discrete time steps or a continuous timeline. Their general mathematical formulations are as follows:

Definition: Discrete-time dynamical system

\[x_t = F(x_{t-1}, t) \label{3.1}\]

This type of model is called a *difference equation*, a *recurrence equation*, or an *iterative map* (the last name applies when there is no explicit \(t\) on the right-hand side).
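Such a discrete-time system can be simulated by repeatedly applying the update rule. Here is a minimal sketch in Python; the particular rule \(F(x) = a x\) (exponential population growth) and the values of \(a\) and \(x_0\) are illustrative assumptions, not taken from the text.

```python
# Iterate a discrete-time dynamical system x_t = F(x_{t-1}).
# F(x) = a * x is an assumed example rule (exponential growth, a = 1.1).

def F(x):
    return 1.1 * x  # maps the previous state to the next state

x = 1.0            # initial state x_0 (assumed)
trajectory = [x]
for t in range(1, 6):
    x = F(x)       # apply the update rule once per time step
    trajectory.append(x)

print(trajectory)  # states x_0 through x_5
```

Because the rule here does not depend explicitly on \(t\), this particular example is an iterative map in the sense defined above.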

Definition: Continuous-time dynamical system

\[\dfrac{dx}{dt}= F(x,t) \label{3.2}\]

This type of model is called a differential equation.
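A continuous-time system cannot be iterated directly, but its trajectory can be approximated numerically. The sketch below uses the simple forward Euler method; the rule \(F(x, t) = -0.5x\) (exponential decay), the initial state, and the step size are all assumed illustrative choices.

```python
# Approximate dx/dt = F(x, t) with the forward Euler method.
# F(x, t) = -0.5 * x (exponential decay) is an assumed example rule.

def F(x, t):
    return -0.5 * x

x = 1.0    # initial state x(0) (assumed)
t = 0.0
dt = 0.01  # step size (assumed); smaller dt gives a better approximation
for _ in range(int(2.0 / dt)):   # integrate from t = 0 to t = 2
    x = x + F(x, t) * dt         # Euler update: x(t + dt) ~ x(t) + F*dt
    t = t + dt

# x now approximates the exact solution exp(-0.5 * 2) = exp(-1)
```

In practice one would use a higher-order integrator (e.g., Runge–Kutta), but the Euler step makes the connection between the differential equation and its discrete approximation most transparent.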

In either case, \(x_t\) or \(x\) is the state variable of the system at time \(t\), which may take a scalar or vector value. \(F\) is a function that determines the rules by which the system changes its state over time. The formulas given above are first-order versions of dynamical systems (i.e., the equations don’t involve \(x_{t-2}\), \(x_{t-3}\), ..., or \(d^2x/dt^2\), \(d^3x/dt^3\), ...). But these first-order forms are general enough to cover all sorts of dynamics that are possible in dynamical systems, as we will discuss later.
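To see why the first-order forms lose no generality, note that any higher-order system can be converted into a first-order one by introducing auxiliary state variables. For example (a standard reduction, not spelled out in the text), a second-order differential equation

\[\dfrac{d^2x}{dt^2} = f\left(x, \dfrac{dx}{dt}, t\right)\]

becomes a first-order system in the vector state \((x, y)\) by letting \(y = dx/dt\):

\[\dfrac{dx}{dt} = y, \qquad \dfrac{dy}{dt} = f(x, y, t)\]

The same trick works for difference equations: \(x_t = F(x_{t-1}, x_{t-2}, t)\) becomes first-order in the vector \((x_t, y_t)\) with \(y_t = x_{t-1}\).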

Exercise \(\PageIndex{1}\)

Have you learned of any models in the natural or social sciences that are formulated as either discrete-time or continuous-time dynamical systems as shown above? If so, what are they? What are the assumptions behind those models?

Exercise \(\PageIndex{2}\)

What are some appropriate choices for state variables in the following systems?

- population growth
- swinging pendulum
- motions of celestial bodies
- behavior of “rational” individuals playing a negotiation game

---

^{1}A traditional definition of dynamical systems considers deterministic systems only, but stochastic (i.e., probabilistic) behaviors can also be modeled in a dynamical system by, for example, representing the probability distribution of the system’s states as a meta-level state.