Theorem. Let \(V\) be a finite-dim'l real ips, and let \(T\) be a linear operator on \(V\). Then there exists an onb \(\beta\) for \(V\) s.t.
\[[T]_{\beta}=\small\begin{pmatrix} A_1& & & & &\\ & \ddots& & & * &\\ & & A_r & & &\\ & & & c_1 & &\\ & & & & \ddots &\\ & & & & & c_k \end{pmatrix}\]where \(c_j\in \mathbb{R}\ (j=1,\cdots,k),\ A_j\in \mathbb{R}^{2\times 2}\ (j=1,\cdots,r)\) and every eval of \(A_j\) has the form \(a_j+b_j\sqrt{-1}\ (a_j,b_j\in \mathbb{R}, b_j\neq 0)\).
Proof. Induction on \(n=\dim V\). The cases \(n=0,1\) are trivial. Assume the statement holds for every real ips of dim \(l\le n-1\).
I. If \(V\) has a \(1\)-dim'l \(T\)-inv subspace, then \(T\) has a real eval \(\lambda\). Since \([T^*]_{\alpha}=([T]_{\alpha})^{t}\) for any onb \(\alpha\), \(T\) and \(T^*\) have the same characteristic polynomial, so \(\lambda\) is also an eval of \(T^*\). Let \(W\) be a \(1\)-dim'l subspace of the espace \(E_{\lambda}^{(T^*)}\); then \(W\) is \(T^*\)-inv. Thus \(W^{\perp}\) is \(T\)-inv and \(\dim W^{\perp}=n-1\).
By ih, there exists an onb \(\gamma\) for \(W^{\perp}\) s.t. \([T_{W^{\perp}}]_{\gamma}\) is of the form in the theorem's statement.
Let \(z\) be a unit vector in \(W\). Then \(\beta=\gamma\cup\{z\}\) is an onb for \(V\) s.t.
\[[T]_{\beta}=\begin{pmatrix}\begin{matrix}[T_{W^{\perp}}]_{\gamma}\\O_{1\times (n-1)}\end{matrix}\ |\ \phi_{\beta}(T(z))\end{pmatrix} \]is of the desired form. Besides, as in the refined Schur Theorem, \([T]_{\beta}(n,n)=\langle T(z),z\rangle=\langle z,T^*(z)\rangle=\lambda\). Hence the theorem can be refined: we may require the \(c_j\)'s to be evals of \(T\).
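The deflation step of Case I can be checked numerically. A minimal sketch, assuming `numpy`/`scipy`; the test matrix is a hypothetical example (any real matrix with a real eval works): take a unit evec \(z\) of \(A^{t}\), complete it to an onb with \(z\) last, and the last row of the matrix in the new basis becomes \((0,\cdots,0,\lambda)\).

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical test matrix (an assumption for illustration); its evals 2, 3, -1 are real.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, -1.0]])

# A real eval lam and a unit evec z of A^t (= A^* w.r.t. the standard ip).
w, V = np.linalg.eig(A.T)
i = int(np.argmin(np.abs(w.imag)))   # pick a (numerically) real eval
lam = w[i].real
z = V[:, i].real
z /= np.linalg.norm(z)

# onb for W^perp = z^perp, then beta = gamma ∪ {z} as the columns of Q.
N = null_space(z.reshape(1, -1))     # 3 x 2, orthonormal columns, all perpendicular to z
Q = np.column_stack([N, z])

M = Q.T @ A @ Q
# Last row is (0, ..., 0, lam): W^perp is A-inv and <Az, z> = <z, A^t z> = lam.
print(np.round(M, 8))
```

The last row of `M` is zero except for the \((n,n)\) entry, which equals `lam`, exactly as in the display above.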
II. Suppose \(V\) has no \(1\)-dim'l \(T\)-inv subspace. Then \(T\) has no real eval. Let \(\alpha\) be an arbitrary onb for \(V\). Denote \(A=[T]_{\alpha}\) and let \(L_A\) be the left-multiplication transformation on \(\mathbb{R}^n\): \(L_A(y)=Ay,\ \forall y\in \mathbb{R}^n\). Then \(L_A\) has no real eval.
It suffices to show that there exists an onb \(\beta'\) for \(\mathbb{R}^n\) s.t. \([L_A]_{\beta'}\) is of the form in the theorem's statement.
Define \(U\) to be the left-multiplication transformation on \(\mathbb{C}^n\): \(U(y)=Ay,\ \forall y\in \mathbb{C}^n\). Since \(\mathbb{C}\) is algebraically closed, \(U\) has an espace \(E_{\lambda}\); and since \(L_A\) has no real eval, \(\lambda=\lambda_1+i\lambda_2\) with \(\lambda_1,\lambda_2\in \mathbb{R},\ \lambda_2\neq 0\). By the property of adjoint operators, \(U^*\) has an evec \(x=x_1+ix_2\ (x_1,x_2\in\mathbb{R}^n)\) corresponding to the eval \(\overline{\lambda}=\lambda_1-i\lambda_2\).
Since \([U]_{\{e_1,\cdots,e_n\}}=A\), we have \([U^*]_{\{e_1,\cdots,e_n\}}=A^*\) and hence \(U^*(y)=A^*y,\ \forall y\in \mathbb{C}^n\). Splitting \(A^*x=\overline{\lambda}x\) into real and imaginary parts gives \(A^*x_1=\lambda_1x_1+\lambda_2x_2,\ A^*x_2=\lambda_1x_2-\lambda_2x_1\). Since \(\lambda_2\neq 0\), by the complexification argument \(\{x_1,x_2\}\) is linearly independent. Hence \(W:=\text{span}(\{x_1,x_2\})\) is a \(2\)-dim'l \(L_{A^*}\)-inv, i.e. \((L_A)^*\)-inv, subspace of \(\mathbb{R}^n\).
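The splitting of \(A^*x=\overline{\lambda}x\) into the two real relations above can be verified numerically. A minimal sketch, assuming `numpy`, with a hypothetical example matrix: rotation by \(90°\), which has no real eval.

```python
import numpy as np

# Hypothetical example (an assumption for illustration): rotation by 90 degrees,
# which has no real eval; its evals are ±i.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# An eval lam = lam1 + i*lam2 (lam2 != 0) of U(y) = Ay on C^n.
w, _ = np.linalg.eig(A)
i = int(np.argmax(w.imag))
lam1, lam2 = w[i].real, w[i].imag

# U^*(y) = A^* y = A^t y has an evec x = x1 + i*x2 for the eval conj(lam).
ws, Vs = np.linalg.eig(A.T)
j = int(np.argmin(np.abs(ws - np.conj(w[i]))))
x = Vs[:, j]
x1, x2 = x.real, x.imag

# The two real relations obtained by splitting A^t x = conj(lam) x:
print(np.allclose(A.T @ x1, lam1 * x1 + lam2 * x2))
print(np.allclose(A.T @ x2, lam1 * x2 - lam2 * x1))
# {x1, x2} is linearly independent, so W = span({x1, x2}) is 2-dim'l:
print(np.linalg.matrix_rank(np.column_stack([x1, x2])))
```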
Thus \(W^{\perp}\), taken in \(\mathbb{R}^n\), is an \((n-2)\)-dim'l \(L_A\)-inv subspace of \(\mathbb{R}^n\). By ih there exists an onb \(\gamma\) for \(W^{\perp}\) s.t. \([L_A|W^{\perp}]_{\gamma}\) is of the form in the statement of the theorem. Let \(\beta'=\gamma\cup\{x_1',x_2'\}\), where \(\{x_1',x_2'\}\) is an onb for \(W\); then \(\beta'\) is an onb for \(\mathbb{R}^n\) and
\[[L_A]_{\beta'} =\begin{pmatrix} \begin{matrix}[L_A|W^{\perp}]_{\gamma}\\O_{2\times (n-2)}\end{matrix}\ |\ \phi_{\beta'}(Ax_1')\ |\ \phi_{\beta'}(Ax_2')\\ \end{pmatrix} =\begin{pmatrix} [L_A|W^{\perp}]_{\gamma} & * \\ O_{2\times (n-2)} & B \end{pmatrix}\]where \(B_{ji}=\langle Ax_i' , x_j' \rangle\). Since \(\langle Ax_i' , x_j' \rangle=\langle x_i' , A^*x_j' \rangle=\langle A^*x_j' , x_i' \rangle\), we have \(B=\left([L_{A^*}|W]_{\{x_1',x_2'\}}\right)^{t}\) (the restriction makes sense because \(W\) is \(L_{A^*}\)-inv). The transpose has the same evals as \([L_{A^*}|W]_{\{x_1',x_2'\}}\), which is similar to \[[L_{A^*}|W]_{\{x_1,x_2\}}=\begin{pmatrix}\lambda_1 & -\lambda_2 \\ \lambda_2 & \lambda_1\end{pmatrix},\]so \(B\) has evals \(\lambda_1\pm i\lambda_2\).
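A quick numerical sanity check (assuming `numpy`; \(\lambda_1,\lambda_2\) are arbitrary sample values) that \(\begin{pmatrix}\lambda_1 & -\lambda_2\\ \lambda_2 & \lambda_1\end{pmatrix}\) and its transpose both have evals \(\lambda_1\pm i\lambda_2\):

```python
import numpy as np

lam1, lam2 = 0.5, 2.0   # arbitrary sample values, lam2 != 0
B = np.array([[lam1, -lam2],
              [lam2,  lam1]])

evals = np.linalg.eigvals(B)
print(sorted(evals, key=lambda t: t.imag))                   # lam1 ± i*lam2
# The transpose has the same evals (this is the step used for the block above).
print(sorted(np.linalg.eigvals(B.T), key=lambda t: t.imag))
```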
Note that \(L_A|W^{\perp}\) has no real eval (any eval of the restriction would be an eval of \(L_A\)). Hence, repeating the induction for \(L_A|W^{\perp}\), we see that \(\beta'\) can be chosen so that
\[[L_A]_{\beta'}=\small\begin{pmatrix} A_1 & & * \\ & \ddots & \\ & & A_r \end{pmatrix}\]where \(A_j\in \mathbb{R}^{2\times 2}\ (j=1,\cdots,r)\) and every eval of \(A_j\) has the form \(a_j+b_j\sqrt{-1}\ (a_j,b_j\in \mathbb{R},b_j\neq 0)\).
This completes the proof in this case. Besides, the theorem can be refined: we may require the evals of each \(A_j\), viewed as a complex matrix, to be evals of \([T]_{\alpha}\) as a matrix in \(\mathbb{C}^{n\times n}\).
Corollary. Every \(n\times n\) real matrix \(A\) is orthogonally similar to an upper triangular block matrix
\[C=\small\begin{pmatrix} A_1& & & & &\\ & \ddots& & & * &\\ & & A_r & & &\\ & & & c_1 & &\\ & & & & \ddots &\\ & & & & & c_k \end{pmatrix}\]where \(c_j\in \mathbb{R}\ (j=1,\cdots,k),\ A_j\in \mathbb{R}^{2\times 2}\ (j=1,\cdots,r)\) and every eval of \(A_j\) has the form \(a_j+b_ji\ (a_j,b_j\in \mathbb{R},b_j\neq 0)\).
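This corollary is exactly what `scipy.linalg.schur` with `output='real'` computes (the real Schur decomposition). A minimal sketch; note that LAPACK does not sort the diagonal blocks, so the \(2\times 2\) blocks need not come before the \(1\times 1\) ones as in the displayed form:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))   # arbitrary real test matrix

# Real Schur form: A = Z C Z^t with Z orthogonal and C quasi upper triangular
# (1x1 and 2x2 diagonal blocks).
C, Z = schur(A, output='real')

print(np.allclose(Z.T @ Z, np.eye(5)))   # Z is orthogonal
print(np.allclose(Z @ C @ Z.T, A))       # orthogonal similarity
print(np.allclose(np.tril(C, -2), 0))    # quasi-triangular: zero below the 1st subdiagonal
```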
Corollary. Let \(V\) be a finite-dim'l real ips, and let \(T\) be an orthogonal operator on \(V\). Then there exists an onb \(\beta\) for \(V\) s.t.
\[[T]_{\beta}=\small\begin{pmatrix} A_1& & & & &\\ & \ddots& & & &\\ & & A_r & & &\\ & & & c_1 & &\\ & & & & \ddots &\\ & & & & & c_k \end{pmatrix}\]where \(c_j\in \mathbb{R}\ (j=1,\cdots,k),\ A_j\in \mathbb{R}^{2\times 2}\ (j=1,\cdots,r)\) and every eval of \(A_j\) is of the form \(\cos\theta_j\pm\sqrt{-1}\sin\theta_j\ (\theta_j\in \mathbb{R})\). Moreover each \(c_j=\pm 1\), since every eval of an orthogonal operator has absolute value \(1\); and the \(*\) block of the theorem vanishes here because \([T]_{\beta}\) is both orthogonal and block upper triangular, hence block diagonal.
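For an orthogonal matrix the computed real Schur form is block diagonal with rotation blocks, matching this corollary. A minimal sketch, assuming `numpy`/`scipy`, with a hypothetical rotation angle:

```python
import numpy as np
from scipy.linalg import schur

theta = 0.7   # hypothetical rotation angle (an assumption for illustration)
# Rotation about the z-axis: orthogonal, with evals e^{±i*theta} and 1.
A = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

C, Z = schur(A, output='real')
print(np.allclose(Z @ C @ Z.T, A))                   # orthogonally similar
print(np.allclose(C.T @ C, np.eye(3)))               # C itself is orthogonal...
print(np.allclose(np.abs(np.linalg.eigvals(C)), 1))  # ...so all evals lie on the unit circle
```

Since `C` is orthogonal and quasi upper triangular, it is in fact block diagonal, as the corollary states.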
From: https://www.cnblogs.com/chaliceseven/p/16853190.html