**Proposition:** The set of diagonalizable linear transformations on a finite dimensional complex vector space is dense in the set of all linear operators on that space.

Before we are able to demonstrate the above proposition, we first define a

**quotient space**. Let \(V\) be a finite dimensional vector space, and let \(W\subset V\) be a subspace. We define an equivalence relation \(\equiv\) by \(v_1\equiv v_2\) if there is some \(w\in W\) such that \(v_1=v_2+w\). We then define \(V/W\) to be the set of equivalence classes given by this relation, and call it a quotient space.

We define addition and scalar multiplication by \([v_1]+[v_2]=[v_1+v_2]\) and \(a[v]=[av]\); it is easily verified that these operations are well defined and make \(V/W\) a vector space. By taking a basis of \(W\) and a basis of \(V/W\), it is also easily verified that \(\dim V/W+\dim W=\dim V\). Note that this means that if we extend a basis of \(W\) to a basis of \(V\), the equivalence classes of the added vectors will form a basis of \(V/W\).
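The dimension formula can be checked numerically: a minimal NumPy sketch (the ambient space and spanning vectors below are arbitrary examples, not taken from the text), identifying \(\dim W\) with the rank of a spanning set and \(\dim V/W\) with \(\dim V - \dim W\).

```python
import numpy as np

# Illustration of dim V/W = dim V - dim W for V = R^4 and
# W spanned by two (independent) vectors -- an assumed example.
dim_V = 4
W_span = np.array([[1.0, 0.0, 2.0, 0.0],
                   [0.0, 1.0, 0.0, 3.0]])   # rows span W
dim_W = np.linalg.matrix_rank(W_span)        # dim W = 2
dim_quotient = dim_V - dim_W                 # dim V/W = 2
print(dim_W, dim_quotient)
```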

Now let \(T\colon V\to V\) be a linear operator on \(V\) that maps \(W\) into itself. We define an operator \(\tilde T\colon V/W\to V/W\) by \(\tilde T[v]=[Tv]\); it is easily verified that \(\tilde T\) is well defined and linear. Furthermore, choose a basis of \(V\) that first lists a basis of \(W\) and then vectors not in \(W\). The matrix representation of \(T\) is then $$\begin{pmatrix}A&B\\0&C\end{pmatrix}$$ where \(A\) and \(C\) are square matrices of size \(\dim W\times \dim W\) and \(\dim V/W\times \dim V/W\) respectively. It is easily seen that the matrix representation of \(\tilde T\), in the basis of \(V/W\) formed by the equivalence classes of the added vectors, is the block \(C\) above.
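This block structure can be illustrated concretely. In the sketch below (the particular blocks are assumed examples), \(W=\operatorname{span}(e_1,\dots,e_k)\) is invariant, so vectors of \(W\) stay in \(W\) under \(T\), and the spectrum of \(T\) is the union of the spectra of \(A\) and \(C\):

```python
import numpy as np

# Assumed example: T preserves W = span(e1, e2) inside C^4, so its
# matrix in a basis listing W first is [[A, B], [0, C]].
k, n = 2, 4
A = np.array([[1.0, 2.0], [0.0, 3.0]])   # action of T on W
B = np.array([[5.0, 1.0], [0.0, 2.0]])
C = np.array([[4.0, 1.0], [2.0, 3.0]])   # matrix of the induced map on V/W
T = np.block([[A, B], [np.zeros((n - k, k)), C]])

# A vector in W (last n-k coordinates zero) is mapped back into W:
w = np.array([1.0, -1.0, 0.0, 0.0])
assert np.allclose((T @ w)[k:], 0.0)

# The eigenvalues of T are those of A together with those of C:
eigs_T = np.sort(np.linalg.eigvals(T).real)
eigs_AC = np.sort(np.concatenate([np.linalg.eigvals(A),
                                  np.linalg.eigvals(C)]).real)
assert np.allclose(eigs_T, eigs_AC)
```

Here \(A\) has eigenvalues \(1, 3\) and \(C\) has eigenvalues \(2, 5\), so \(T\) has eigenvalues \(1, 2, 3, 5\).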

**Lemma:** Let \(T\colon \mathbb{C}^n\to \mathbb{C}^n\) be a linear operator. Then there exists some basis in which its matrix representation is upper triangular.
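The lemma can also be observed numerically. One standard computational route to such a triangularization is the complex Schur decomposition, which writes any matrix as \(A = ZTZ^*\) with \(Z\) unitary (a change of basis) and \(T\) upper triangular. A sketch assuming SciPy is available:

```python
import numpy as np
from scipy.linalg import schur

# Numerical illustration of the lemma via the complex Schur form:
# A = Z T Z*, with Z unitary and T upper triangular.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
T, Z = schur(A, output='complex')

assert np.allclose(np.tril(T, k=-1), 0)        # T is upper triangular
assert np.allclose(Z @ T @ Z.conj().T, A)      # same operator, new basis
assert np.allclose(Z.conj().T @ Z, np.eye(4))  # Z is a unitary change of basis
```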

We are now nearly done with this section. For any upper triangular matrix, the eigenvalues are given by the diagonal entries. This means that any upper triangular matrix with distinct diagonal entries is diagonalizable. Given any upper triangular matrix, we may therefore find a diagonalizable matrix arbitrarily close by matching it on the off-diagonal entries and choosing the diagonal entries of the new matrix to be close to those of the original, yet distinct. We are able to do this because a matrix has only finitely many diagonal entries, while we have infinitely many choices for the entries of our approximating matrix. Thus, given any linear operator on \(\mathbb{C}^n\), we may find an arbitrarily close diagonalizable one (in fact, one with distinct eigenvalues) by representing our operator as an upper triangular matrix and finding a nearby diagonalizable matrix as above (which represents some linear transformation). Since the operator norm and matrix norms are equivalent, as was shown in part I, our proposition is proven. \(\blacksquare\)

**Proof (of the Lemma):** We proceed by induction on dimension. The base case, where \(n=1\), is trivial. We now proceed to the inductive step.

**Inductive step:** Assume that any operator on an \((n-1)\)-dimensional complex space may be represented by an upper triangular matrix. Now consider any linear operator \(T\colon \mathbb{C}^n\to \mathbb{C}^n\). By the fundamental theorem of algebra its characteristic polynomial must have at least one root \(\lambda\), which implies the existence of at least one eigenvector with eigenvalue \(\lambda\). The span of this eigenvector is invariant under the operator. Calling this span \(W\), we may consider the corresponding linear operator \(\tilde T\colon \mathbb{C}^n/W\to \mathbb{C}^n/W\). As we have just shown, we may choose an appropriate basis of \(\mathbb{C}^n\) such that $$T=\begin{pmatrix}\lambda&B\\0&C\end{pmatrix},\qquad \tilde T = C.$$ By our inductive assumption we may choose the basis of \(\mathbb{C}^n/W\) so that \(C\) is upper triangular; the matrix of \(T\) above is then upper triangular, which completes our proof. \(\blacksquare\)
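The perturbation argument above can be checked numerically on the simplest non-diagonalizable example, a \(2\times 2\) Jordan block (the matrix and tolerance below are assumed for illustration): nudging the repeated diagonal entries apart gives a nearby matrix with distinct eigenvalues, hence diagonalizable.

```python
import numpy as np

# A Jordan block: upper triangular, repeated eigenvalue 2, not diagonalizable.
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])

# Keep the off-diagonal entries, perturb the diagonal to distinct values.
eps = 1e-8
D = J + np.diag([0.0, eps])

# D is within eps of J entrywise, and its eigenvalues (the diagonal
# entries, since D is upper triangular) are distinct, so D is diagonalizable.
assert np.max(np.abs(D - J)) <= eps
lam = np.linalg.eigvals(D)
assert abs(lam[0] - lam[1]) > eps / 2
```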
