Homework 3

Due at 11:59pm on Feb 15th, 2026. Submit to https://canvas.uw.edu/courses/1862386/assignments/11153452.

Problem 1

Given a nonlinear system, \(\dot{x}=f(x)\), one way to study local stability is to linearize the system. In particular, let \(x_e\) be an equilibrium (\(f(x_e)=0\)), and let \(A\) be the Jacobian of \(f\) evaluated at \(x_e\). For a small enough perturbation \(\Delta x\), \(\frac{d}{dt} (x_e+\Delta x)= \frac{d}{dt}\Delta x \approx f(x_e)+A\Delta x = A\Delta x\), so we can say that \(x_e\) is a stable equilibrium if the eigenvalues of \(A\) all have negative real parts.

Given the following nonlinear system

\[\dot{x} = f(x)= \begin{bmatrix} x_2 \\ x_1-x_1^3-0.5x_2 \\ -x_3 \end{bmatrix}.\]

a) Find all equilibrium points.

b) Find the stable equilibrium points.

Solution.

a) Set \(f(x_e) = 0\):

\[x_2 = 0, \qquad x_1 - x_1^3 - 0.5x_2 = 0, \qquad -x_3 = 0.\]

From the first and third equations: \(x_2 = 0\), \(x_3 = 0\). Substituting into the second:

\[x_1 - x_1^3 = x_1(1-x_1^2) = 0 \implies x_1 \in \{0, 1, -1\}\]

The three equilibrium points are:

\[x_e^{(1)} = \begin{pmatrix}0\\0\\0\end{pmatrix}, \qquad x_e^{(2)} = \begin{pmatrix}1\\0\\0\end{pmatrix}, \qquad x_e^{(3)} = \begin{pmatrix}-1\\0\\0\end{pmatrix}\]

b) The Jacobian is

\[A(x) = \begin{pmatrix} 0 & 1 & 0 \\ 1-3x_1^2 & -0.5 & 0 \\ 0 & 0 & -1 \end{pmatrix}\]

Looking at the eigenvalues of the Jacobian at the equilibria: at \(x_e^{(1)}\), the upper-left \(2\times 2\) block is \(\begin{pmatrix} 0 & 1 \\ 1 & -0.5 \end{pmatrix}\), whose determinant is \(-1 < 0\), so it has one positive eigenvalue and \(x_e^{(1)}\) is unstable. At \(x_e^{(2)}\) and \(x_e^{(3)}\), \(1-3x_1^2 = -2\), so the block is \(\begin{pmatrix} 0 & 1 \\ -2 & -0.5 \end{pmatrix}\) with trace \(-0.5 < 0\) and determinant \(2 > 0\); both of its eigenvalues therefore have negative real parts, and together with the third eigenvalue \(-1\) we conclude that \(x_e^{(2)} = (1,0,0)\) and \(x_e^{(3)} = (-1,0,0)\) are the stable equilibria.
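As a numerical sanity check, we can evaluate the Jacobian at each equilibrium and inspect its eigenvalues directly (a sketch using NumPy; the helper name `jacobian` is ours):

```python
import numpy as np

# Jacobian of f for Problem 1, evaluated at a point x
def jacobian(x):
    x1 = x[0]
    return np.array([
        [0.0,              1.0,  0.0],
        [1.0 - 3.0*x1**2, -0.5,  0.0],
        [0.0,              0.0, -1.0],
    ])

for xe in ([0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]):
    eigs = np.linalg.eigvals(jacobian(xe))
    print(xe, "stable" if np.all(eigs.real < 0) else "unstable")
```

The origin reports unstable (the block has determinant \(-1\)), while the other two equilibria report stable.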

Problem 2

Consider a swing equation with losses:

\[\begin{align} \dot{\theta} &= \omega \\ \dot{\omega} & = -\omega+P-(g-g\cos(\theta)+b\sin(\theta)), \end{align}\]

and we take \(g=0.5\) and \(b=1\). Find all values of \(P\) where at least one stable equilibrium exists.

Solution.

This asks: for which values of \(P\) does the equilibrium equation

\[P=0.5(1-\cos \theta)+\sin \theta\]

have a solution. The right-hand side equals \(0.5 + (\sin\theta - 0.5\cos\theta)\), a sinusoid of amplitude \(\sqrt{1^2+0.5^2}=\sqrt{1.25}\) centered at \(0.5\), so a solution exists exactly when \(P\) lies between its minimum and maximum:

\[P \in [0.5-\sqrt{1.25},\,0.5+\sqrt{1.25}].\]

For \(P\) strictly inside this interval, the root \(\theta^*\) where the right-hand side crosses \(P\) with positive slope \(c>0\) gives the linearization \(\begin{pmatrix} 0 & 1 \\ -c & -1 \end{pmatrix}\), whose eigenvalues (roots of \(\lambda^2+\lambda+c\)) have negative real parts, so a stable equilibrium indeed exists.
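A quick numerical check of this range, by sampling the right-hand side densely over one period (NumPy sketch):

```python
import numpy as np

# h(theta) = 0.5*(1 - cos(theta)) + sin(theta), sampled over one period
theta = np.linspace(0.0, 2.0*np.pi, 200001)
h = 0.5*(1.0 - np.cos(theta)) + np.sin(theta)

print(h.min(), h.max())  # approx 0.5 - sqrt(1.25) and 0.5 + sqrt(1.25)
```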

Problem 3

Linearization is fundamentally a local technique since it tells us what happens very close to an equilibrium point. A natural conjecture is that if we understand what happens locally at every point, we can piece them together to understand some global structure. Unfortunately, this is not true when the dimension of the problem is larger than 1.

Consider this system:

\[\dot{x}=\begin{bmatrix} -x_1+x_3 (x_1+x_2 x_3)^2 \\ -x_2-(x_1+x_2x_3)^2 \\ -x_3 \end{bmatrix}\]

a) Compute the eigenvalues of the Jacobian matrix linearized at some \(x\). Note that the eigenvalues are constants, i.e., they do not depend on the value of \(x\).

b) Directly verify that \(x(t)=(18e^t, -12e^{2t}, e^{-t})\) is a solution.

As a side note, this problem is actually related to a famous problem in algebraic geometry about polynomials, simply called the Jacobian Conjecture.

Solution.

a) Let \(u = x_1 + x_2 x_3\) for shorthand, so \(u^2\) appears in the dynamics. The Jacobian is:

\[A(x) = \begin{pmatrix} \dfrac{\partial f_1}{\partial x_1} & \dfrac{\partial f_1}{\partial x_2} & \dfrac{\partial f_1}{\partial x_3} \\[8pt] \dfrac{\partial f_2}{\partial x_1} & \dfrac{\partial f_2}{\partial x_2} & \dfrac{\partial f_2}{\partial x_3} \\[8pt] \dfrac{\partial f_3}{\partial x_1} & \dfrac{\partial f_3}{\partial x_2} & \dfrac{\partial f_3}{\partial x_3} \end{pmatrix}\]

By a somewhat tedious computation:

\[A(x) = \begin{pmatrix} -1+2x_3 u & 2x_3^2 u & u^2+2x_2 x_3 u \\ -2u & -1-2x_3 u & -2x_2 u \\ 0 & 0 & -1 \end{pmatrix}\]

The matrix is block upper triangular with a \(2\times 2\) upper-left block and a scalar lower-right block, so one eigenvalue is:

\[\lambda_3 = -1\]

The remaining two come from the \(2\times 2\) block:

\[B = \begin{pmatrix} -1+2x_3 u & 2x_3^2 u \\ -2u & -1-2x_3 u \end{pmatrix}\] \[\text{tr}(B) = -1+2x_3 u - 1 - 2x_3 u = -2\]

\[\det(B) = (-1+2x_3 u)(-1-2x_3 u) - (2x_3^2 u)(-2u) = 1 - 4x_3^2 u^2 + 4x_3^2 u^2 = 1\]

So the characteristic polynomial of \(B\) is \(\lambda^2 + 2\lambda + 1 = (\lambda+1)^2\), giving:

\[\lambda_1 = \lambda_2 = -1\]

All three eigenvalues are:

\[\boxed{\lambda_1 = \lambda_2 = \lambda_3 = -1}\]
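The claim that the eigenvalues are identically \(-1\) regardless of \(x\) can be spot-checked at random points (NumPy sketch; the helper name `jacobian3` is ours):

```python
import numpy as np

# Jacobian for Problem 3, with the shorthand u = x1 + x2*x3
def jacobian3(x):
    x1, x2, x3 = x
    u = x1 + x2*x3
    return np.array([
        [-1.0 + 2.0*x3*u,  2.0*x3**2*u,      u**2 + 2.0*x2*x3*u],
        [-2.0*u,          -1.0 - 2.0*x3*u,  -2.0*x2*u],
        [0.0,              0.0,             -1.0],
    ])

rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.standard_normal(3)
    eigs = np.linalg.eigvals(jacobian3(x))
    # the eigenvalue -1 is defective, so allow a small numerical tolerance
    assert np.allclose(eigs, -1.0, atol=1e-4)
print("all eigenvalues are -1 at every sampled point")
```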

b) Straightforward computation. With \(u = x_1 + x_2 x_3 = 18e^t + (-12e^{2t})(e^{-t}) = 6e^t\), each component of \(\dot{x}(t)\) matches \(f(x(t))\); for example, \(\dot{x}_1 = 18e^t\) while \(-x_1 + x_3 u^2 = -18e^t + e^{-t}\cdot 36e^{2t} = 18e^t\). Note that this solution grows without bound even though every linearization has all eigenvalues equal to \(-1\): local stability at every point does not imply any global structure.
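The verification in part (b) can also be done numerically, comparing the hand-computed derivative \(\dot{x}(t)\) against \(f(x(t))\) at sample times (NumPy sketch; the helper names are ours):

```python
import numpy as np

def f(x):
    x1, x2, x3 = x
    u = x1 + x2*x3
    return np.array([-x1 + x3*u**2, -x2 - u**2, -x3])

def x_sol(t):   # candidate solution
    return np.array([18*np.exp(t), -12*np.exp(2*t), np.exp(-t)])

def x_dot(t):   # its time derivative, computed by hand
    return np.array([18*np.exp(t), -24*np.exp(2*t), -np.exp(-t)])

for t in np.linspace(0.0, 1.0, 5):
    assert np.allclose(x_dot(t), f(x_sol(t)))
print("x(t) solves the ODE at the sampled times")
```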

Problem 4

One way to construct discrete time systems is as approximations to continuous time systems. Given \(\dot{x} = Ax\), we can approximate it as \(\frac{x_{t+1}-x_{t}}{\delta} =A x_t\), and rearranging gives

\[x_{t+1}=(I+\delta A) x_t=\hat{A} x_t.\]

Naturally, we expect that discretization shouldn’t change the stability of the system. Show that there exists a sufficiently small \(\delta>0\) such that \(\hat{A}\) is asymptotically stable (in the discrete sense) if and only if \(A\) is asymptotically stable (in the continuous sense).

Solution.

Write \(\lambda = \alpha + i\beta\) for an eigenvalue of \(A\). Then:

\[\vert 1+\delta\lambda\vert^2 = \vert 1+\delta\alpha + i\delta\beta\vert^2 = (1+\delta\alpha)^2 + \delta^2\beta^2\] \[= 1 + 2\delta\alpha + \delta^2\alpha^2 + \delta^2\beta^2 = 1 + 2\delta\alpha + \delta^2\vert \lambda\vert^2\]

So the condition \(\vert 1+\delta\lambda\vert^2 < 1\) becomes:

\[2\delta\alpha + \delta^2\vert \lambda\vert^2 < 0 \implies \delta\left(2\alpha + \delta\vert \lambda\vert^2\right) < 0\]

Since \(\delta > 0\), this reduces to:

\[2\alpha + \delta\vert \lambda\vert^2 < 0\]

\((\Rightarrow)\) If \(\text{Re}(\lambda_i) < 0\) for all \(i\), then \(\hat{A}\) is stable for small enough \(\delta\):

Write \(\lambda_i = \alpha_i + i\beta_i\) for the eigenvalues of \(A\), with \(\alpha_i < 0\) for all \(i\). We need:

\[\delta < \frac{-2\alpha_i}{\vert \lambda_i\vert^2} \quad \text{for all } i\]

Since there are finitely many eigenvalues and each bound is strictly positive, choosing

\[0 < \delta < \min_i \frac{-2\alpha_i}{\vert \lambda_i\vert^2}\]

ensures \(\vert 1+\delta\lambda_i\vert <1\) for all \(i\), so \(\hat{A}\) is asymptotically stable.

\((\Leftarrow)\) If \(\hat{A}\) is stable for some \(\delta > 0\), then \(A\) is stable:

Suppose for contradiction that some eigenvalue \(\lambda = \alpha+i\beta\) has \(\alpha \geq 0\). Then:

\[\vert 1+\delta\lambda\vert^2 = 1 + 2\delta\alpha + \delta^2\vert \lambda\vert^2 \geq 1\]

If \(\alpha > 0\), this is strictly greater than \(1\); if \(\alpha = 0\) and \(\beta \neq 0\), it equals \(1 + \delta^2\beta^2 > 1\); and if \(\lambda = 0\), then \(\vert 1+\delta\lambda\vert = 1\). In all cases \(\vert 1+\delta\lambda\vert \geq 1\), contradicting discrete asymptotic stability. So all eigenvalues of \(A\) must satisfy \(\text{Re}(\lambda_i)<0\).
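The bound \(\delta < \min_i -2\alpha_i/\vert\lambda_i\vert^2\) from the forward direction can be illustrated numerically: take a random continuous-time stable \(A\), compute the bound, and check the discretization on either side of it (NumPy sketch; the construction of \(A\) by shifting a random matrix is ours):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Build a random A with all eigenvalues in the open left half-plane
# by shifting a random matrix so its largest real part is -0.5
M = rng.standard_normal((n, n))
A = M - (np.max(np.linalg.eigvals(M).real) + 0.5) * np.eye(n)

lam = np.linalg.eigvals(A)
delta_max = np.min(-2.0*lam.real / np.abs(lam)**2)  # the bound from the proof

def discrete_stable(delta):
    # asymptotic stability of x_{t+1} = (I + delta*A) x_t
    return bool(np.all(np.abs(np.linalg.eigvals(np.eye(n) + delta*A)) < 1.0))

print(discrete_stable(0.5*delta_max))  # True: below the bound
print(discrete_stable(2.0*delta_max))  # False: above the bound
```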