
# 4.1: Discrete-Time Models with Difference Equations


Discrete-time models are easy to understand, develop, and simulate. They are readily implementable for stepwise computer simulation, and they are often suitable for modeling experimental data, which are almost always discrete. Moreover, they can represent abrupt changes in the system’s states, and possibly chaotic dynamics, using fewer variables than their continuous-time counterparts (this will be discussed more in Chapter 9). Discrete-time models of dynamical systems are often called difference equations, because you can rewrite any first-order discrete-time dynamical system with a state variable $$x$$ (Eq. (3.1.1)), i.e.,

$x_t = F(x_{t-1}, t) \label{4.1}$

into a “difference” form

$\Delta x = x_t - x_{t-1} = F(x_{t-1}, t) - x_{t-1} \label{4.2}$

which is mathematically more similar to differential equations. But in this book, we mostly stick to the original form, which directly specifies the next value of $$x$$ and is more straightforward and easier to understand.
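To see that the two forms describe the same dynamics, here is a minimal Python sketch; the model $$F$$ used (doubling the previous state) is a hypothetical example chosen for illustration, not one from the text.

```python
# The direct form x_t = F(x_{t-1}) and the difference form
# Delta x = F(x_{t-1}) - x_{t-1} generate identical trajectories.
# F is a hypothetical example model: exponential growth by doubling.

def F(x):
    return 2 * x

def direct_step(x):
    return F(x)        # x_t = F(x_{t-1})

def difference_step(x):
    dx = F(x) - x      # Delta x = F(x_{t-1}) - x_{t-1}
    return x + dx      # x_t = x_{t-1} + Delta x

x = y = 1
for _ in range(5):
    x, y = direct_step(x), difference_step(y)
print(x, y)  # both forms give 32
```

The difference form only adds and then re-subtracts $$x_{t-1}$$, which is why the direct form is usually the more convenient one to simulate.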

Note that Equation \ref{4.1} can also be written as

$x_{t+1}= F(x_t, t) \label{4.3}$

which is mathematically equivalent to Equation \ref{4.1} and perhaps more commonly used in the literature. But we will use the notation with $$x_t, x_{t-1}, x_{t-2}$$, etc., in this textbook, because this notation makes it easier to see how many previous steps are needed to calculate the next step (e.g., if the right-hand side contains $$x_{t-1}$$ and $$x_{t-2}$$, that means you will need to know the system’s state in the previous two steps to determine its next state). From a difference equation, you can produce a series of values of the state variable $$x$$ over time, starting with initial condition $$x_0$$:

$x_0, x_1, x_2, x_3, \ldots \label{4.4}$

This is called a time series. In this case, it is a prediction made using the difference equation model, but in other contexts, a time series can also mean a sequence of values obtained by empirical observation of a real-world system.
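As a concrete sketch of producing a time series, the Python snippet below iterates a difference equation from its initial conditions. The recurrence $$x_t = x_{t-1} + x_{t-2}$$ (Fibonacci-like) is purely our own illustrative choice; because this $$F$$ depends on the previous two states, two initial values are required, as noted above.

```python
# Producing a time series from a difference equation. The example
# recurrence x_t = x_{t-1} + x_{t-2} is illustrative only; since F
# depends on the previous TWO states, two initial values are needed.

def simulate_second_order(x0, x1, steps):
    """Return the time series [x_0, x_1, ...] after `steps` further updates."""
    series = [x0, x1]
    for _ in range(steps):
        series.append(series[-1] + series[-2])
    return series

print(simulate_second_order(0, 1, 6))  # -> [0, 1, 1, 2, 3, 5, 8, 13]
```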

Here is a very simple example of a discrete-time, discrete-state dynamical system. The system is made of two interacting components: A and B. Each component takes one of two possible states: blue or red. Their behaviors are determined by the following rules:

• A tries to stay the same color as B.
• B tries to be the opposite color of A.

These rules are applied to their states simultaneously in discrete time steps.
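The word "simultaneously" matters here: both next states must be computed from the current states before either one is overwritten. A minimal Python sketch of such a synchronous update loop is below; the transition functions are deliberately trivial placeholders (each component simply keeps its current color) so as not to give away the exercises that follow.

```python
# Skeleton for simulating two components updated simultaneously in
# discrete time. F_A and F_B are trivial placeholders (each component
# keeps its own color); the actual rules are left for Exercise 1.

def run(F_A, F_B, S_A, S_B, steps):
    """Return the time series [(S_A, S_B), ...] over `steps` updates."""
    history = [(S_A, S_B)]
    for _ in range(steps):
        # Simultaneous update: evaluate both F's on the current states
        # before reassigning either variable.
        S_A, S_B = F_A(S_A, S_B), F_B(S_A, S_B)
        history.append((S_A, S_B))
    return history

keep_A = lambda S_A, S_B: S_A  # placeholder, not the exercise's rule
keep_B = lambda S_A, S_B: S_B  # placeholder, not the exercise's rule
print(run(keep_A, keep_B, "blue", "blue", 3))
```

Python's tuple assignment evaluates the whole right-hand side first, which conveniently enforces the simultaneous update; updating the two variables one after another would instead implement an asynchronous rule.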

Exercise $$\PageIndex{1}$$

Write the state transition functions $$F_A(S_A, S_B)$$ and $$F_B(S_A, S_B)$$ for this system, where $$S_A$$ and $$S_B$$ are the states of A and B, respectively.

Exercise $$\PageIndex{2}$$

Produce a time series of $$(S_A, S_B)$$, using the model you created, starting with an initial condition in which both components are blue. What kind of behavior will arise?