Diagonalization of Symmetric Matrices and QR Factorization
Theorem: If $A$ is a symmetric matrix, then eigenvectors associated to distinct eigenvalues are orthogonal.
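The proof is a short computation; here is a sketch, assuming real eigenpairs $Au = \lambda u$ and $Av = \mu v$ with $\lambda \neq \mu$:

```latex
% Sketch: for symmetric A, eigenvectors u, v with distinct
% eigenvalues lambda, mu are orthogonal.
\begin{align*}
  \lambda\,(u \cdot v) &= (Au)^{T} v = u^{T} A^{T} v \\
                       &= u^{T} A v = u \cdot (Av) = \mu\,(u \cdot v)
\end{align*}
% Subtracting gives (\lambda - \mu)(u \cdot v) = 0, and since
% \lambda \neq \mu, we conclude u \cdot v = 0.
```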
Definition: A square matrix with orthonormal columns is called an orthogonal matrix.
Theorem: If $Q$ is an orthogonal matrix, then $Q$ is invertible and its inverse is its transpose: $Q^{-1} = Q^T$.
Proof: Let $Q = [q_1 \; q_2 \; \cdots \; q_n]$. Then the $(i,j)$th entry of $Q^T Q$ is $q_i \cdot q_j$, which is $1$ if $i = j$ and is $0$ if $i \neq j$. Hence $Q^T Q = I$, so $Q^{-1} = Q^T$.
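As a quick numerical sanity check (a minimal sketch in NumPy; the rotation matrix below is just an illustrative choice of orthogonal matrix):

```python
import numpy as np

# An example orthogonal matrix: rotation by 30 degrees.
theta = np.pi / 6
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Q^T Q should be the identity, so Q^{-1} = Q^T.
print(np.allclose(Q.T @ Q, np.eye(2)))   # True
print(np.linalg.det(Q))                  # 1.0 (always +-1 for orthogonal Q)
```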
Questions:
What is the determinant of an orthogonal matrix? What does it tell you geometrically?
Is every matrix with determinant $\pm 1$ orthogonal?
How would you invert an orthogonal matrix?
Orthogonally Diagonalizable Matrices
Definition: A square matrix $A$ is orthogonally diagonalizable if there exists an orthogonal matrix $Q$ and a diagonal matrix $D$ such that $A = QDQ^T$.
(give an example in class.)
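For a concrete numerical example, here is a sketch using NumPy: `numpy.linalg.eigh` is designed for symmetric matrices and returns orthonormal eigenvectors, and the particular matrix below is just an illustrative choice.

```python
import numpy as np

# A symmetric matrix (chosen only for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For symmetric input, eigh returns an orthogonal eigenvector matrix Q.
eigenvalues, Q = np.linalg.eigh(A)
D = np.diag(eigenvalues)

print(np.allclose(Q.T @ Q, np.eye(2)))  # Q is orthogonal
print(np.allclose(Q @ D @ Q.T, A))      # A = Q D Q^T
```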
Theorem: (Spectral Theorem) A matrix $A$ is orthogonally diagonalizable if and only if $A$ is symmetric.
It is easy to see that if $A$ is orthogonally diagonalizable then $A$ is symmetric: if $A = QDQ^T$, then $A^T = (QDQ^T)^T = QD^TQ^T = QDQ^T = A$. The converse is harder and won't be proved in this class.
(Work out example 3 on page 354)
QR Factorization
Theorem: (QR factorization) Let $A$ be an $m \times n$ matrix with linearly independent columns. Then $A$ can be factored as $A = QR$, where $Q$ is an $m \times n$ matrix with orthonormal columns and $R$ is an $n \times n$ upper triangular matrix with nonnegative diagonal entries.
See the book for a full proof.
The matrix $Q$ is obtained by applying the Gram-Schmidt process to the columns of $A$. We can then obtain $R$ by computing $R = Q^T A$. This $R$ will be upper triangular, but the entries on its diagonal could be negative; we can fix this by flipping the signs of the offending rows of $R$ and the corresponding columns of $Q$.
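A minimal sketch of this construction (classical Gram-Schmidt in NumPy, with the sign fix at the end; the matrix `A` and the function name are illustrative, and production code would use a more numerically stable variant):

```python
import numpy as np

def qr_gram_schmidt(A):
    """QR via classical Gram-Schmidt; assumes A has linearly independent columns."""
    m, n = A.shape
    Q = np.zeros((m, n))
    for j in range(n):
        # Subtract from column j its projections onto the previous q's.
        v = A[:, j].copy()
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)
    R = Q.T @ A  # upper triangular, but diagonal entries may be negative
    # Fix: flip signs so diag(R) is nonnegative (columns of Q stay orthonormal).
    signs = np.sign(np.diag(R))
    signs[signs == 0] = 1
    return Q * signs, signs[:, None] * R

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q, R = qr_gram_schmidt(A)
print(np.allclose(Q @ R, A))            # A = QR
print(np.allclose(Q.T @ Q, np.eye(2)))  # orthonormal columns
```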
This is useful for solving linear systems: in Gaussian elimination, pivot manipulations can cause significant roundoff errors, and the QR factorization gives a more numerically stable alternative.
If $A$ is invertible with $A = QR$ and $Ax = b$, then $QRx = b$, so $Rx = Q^T b$. This is now a triangular system which can be solved with back substitution.
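A minimal sketch of this solve (using `numpy.linalg.qr` for the factorization and a hand-rolled back substitution; `A`, `b`, and the helper name are illustrative choices):

```python
import numpy as np

def back_substitute(R, y):
    """Solve Rx = y for upper triangular R by back substitution."""
    n = len(y)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

# Illustrative invertible system.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])

Q, R = np.linalg.qr(A)               # A = QR
x = back_substitute(R, Q.T @ b)      # solve Rx = Q^T b
print(np.allclose(A @ x, b))         # True
```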
(Work out example 4 on page 356)