Diagonalization of Symmetric Matrices
The first theorem of symmetric matrices:

If $A$ is symmetric, i.e. $A^T = A$, then any two eigenvectors from different eigenspaces are orthogonal.

To see this, let $v_1$ and $v_2$ be eigenvectors for distinct eigenvalues $\lambda_1 \neq \lambda_2$. Because

$$\lambda_1 (v_1 \cdot v_2) = (A v_1)^T v_2 = v_1^T A^T v_2 = v_1^T A v_2 = \lambda_2 (v_1 \cdot v_2)$$

the only condition which makes the equation hold is $v_1 \cdot v_2 = 0$.

With the help of this theorem, we can conclude that if a symmetric matrix has $n$ distinct eigenvalues, its corresponding eigenvectors must be mutually orthogonal.
The diagonalization of $A$ is

$$A = PDP^{-1} = PDP^T$$

where $P$ is an orthogonal matrix whose columns are the orthonormal eigenvectors of $A$, and $D$ is the diagonal matrix of eigenvalues.
The second theorem of symmetric matrices:

An $n \times n$ matrix $A$ is orthogonally diagonalizable if and only if $A$ is a symmetric matrix: $A^T = A$.
An Example
Create a random matrix.
Make it symmetric.
Perform the diagonalization with np.linalg.eig().
Check the norm of all eigenvectors.
Check the orthogonality of the eigenvectors, i.e. see if $P^T P = I$.
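Below is a minimal sketch of these four steps, assuming NumPy; the seed and the $4 \times 4$ size are illustrative choices, not from the original cells. (np.linalg.eigh is the routine specialized for symmetric matrices, but we follow the text and use np.linalg.eig.)

```python
import numpy as np

np.random.seed(0)                        # illustrative seed, for reproducibility
B = np.random.rand(4, 4)                 # create a random matrix
A = (B + B.T) / 2                        # make it symmetric

# Perform the diagonalization: columns of P are the eigenvectors.
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# Check the norm of all eigenvectors: each column has unit length.
print(np.linalg.norm(P, axis=0))         # [1. 1. 1. 1.]

# Check the orthogonality of the eigenvectors: P^T P = I.
print(np.allclose(P.T @ P, np.eye(4)))   # True

# Sanity check: A = P D P^T.
print(np.allclose(P @ D @ P.T, A))       # True
```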
The Spectral Theorem
An $n \times n$ symmetric matrix $A$ has the following properties:

$A$ has $n$ real eigenvalues, counting multiplicities.
The dimension of the eigenspace for each eigenvalue $\lambda$ equals the multiplicity of $\lambda$ as a root of the characteristic equation.
The eigenspaces are mutually orthogonal, in the sense that eigenvectors corresponding to different eigenvalues are orthogonal.
$A$ is orthogonally diagonalizable.
All these properties look obvious without proof, as the example above shows. However, the purpose of the theorem is not to reiterate the last section; it paves the way for the spectral decomposition.
Writing the diagonalization out explicitly, we get the representation known as the spectral decomposition:

$$A = PDP^T = \lambda_1 u_1 u_1^T + \lambda_2 u_2 u_2^T + \cdots + \lambda_n u_n u_n^T$$

Each $u_i u_i^T$ is a rank $1$ symmetric matrix, because all of its rows are multiples of $u_i^T$.
Following the example above, we demonstrate the decomposition in NumPy.
Check the rank of each $u_i u_i^T$ with np.linalg.matrix_rank().
Use the spectral decomposition to recover $A$:
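A sketch of these steps, continuing from the cells above (it reuses the A, eigenvalues, and P computed there):

```python
# Rank-1 terms lambda_i * u_i u_i^T of the spectral decomposition.
terms = [eigenvalues[i] * np.outer(P[:, i], P[:, i]) for i in range(4)]

# Each u_i u_i^T has rank 1 (scaling by lambda_i does not change that
# unless lambda_i = 0).
print([np.linalg.matrix_rank(np.outer(P[:, i], P[:, i])) for i in range(4)])  # [1, 1, 1, 1]

# Summing the rank-1 terms recovers A.
print(np.allclose(sum(terms), A))        # True
```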
Quadratic Form
A quadratic form is a function of the form $Q(x) = x^T A x$, where $A$ is an $n \times n$ symmetric matrix, which is called the matrix of the quadratic form.
Consider a matrix of a quadratic form

$$A = \begin{bmatrix} a_{11} & a_{12} \\ a_{12} & a_{22} \end{bmatrix}$$

and construct the quadratic form $x^T A x$ by direct matrix multiplication:

$$x^T A x = a_{11}x_1^2 + 2a_{12}x_1x_2 + a_{22}x_2^2$$
Fortunately, there is an easier way to calculate a quadratic form. Notice that the coefficient of each $x_i^2$ sits on the principal diagonal, and the coefficient of each cross term $x_i x_j$ is split evenly between the $(i, j)$- and $(j, i)$-entries of $A$.
Example
Consider another example with a symmetric matrix $A = [a_{ij}]$. All the $x_i^2$ terms are

$$a_{11}x_1^2 + a_{22}x_2^2 + \cdots + a_{nn}x_n^2$$

whose coefficients come from the principal diagonal. All the cross terms $x_i x_j$ with $i < j$ are

$$2a_{12}x_1x_2 + 2a_{13}x_1x_3 + \cdots + 2a_{n-1,n}x_{n-1}x_n$$

since the off-diagonal entries $a_{ij} = a_{ji}$ each contribute twice. Adding them up, the quadratic form is

$$Q(x) = \sum_i a_{ii}x_i^2 + \sum_{i<j} 2a_{ij}x_ix_j$$
Let's verify in SymPy.
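A minimal sketch of the verification; the symmetric matrix below is illustrative, not necessarily the one from the original cells.

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
x = sp.Matrix([x1, x2, x3])

# Illustrative symmetric matrix.
A = sp.Matrix([[ 5, -1, 2],
               [-1,  3, 4],
               [ 2,  4, 7]])

# Expand x^T A x: the squares come from the diagonal, and each cross term
# is twice the corresponding off-diagonal entry.
Q = sp.expand((x.T * A * x)[0])
print(Q)   # 5*x1**2 - 2*x1*x2 + 4*x1*x3 + 3*x2**2 + 8*x2*x3 + 7*x3**2
```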
The result is exactly the same as what we derived.
Change of Variable in Quadratic Forms
Converting the matrix of a quadratic form into a diagonal matrix can save us some trouble; that is to say, it leaves no cross-product terms.
Since $A$ is symmetric, there is an orthogonal matrix $P$ such that $A = PDP^T$ with $D$ diagonal. We can show that

$$x^T A x = (Py)^T A (Py) = y^T P^T A P y = y^T D y$$

where $x = Py$ defines a coordinate transformation and $y = P^T x$.
Consider a symmetric matrix $A$.
Find its eigenvalues and eigenvectors.
Test if each eigenvector is normalized.
We can compute

$$D = P^T A P$$

So the $D$ is diagonal, and the transformed quadratic form is

$$y^T D y = \lambda_1 y_1^2 + \lambda_2 y_2^2 + \cdots + \lambda_n y_n^2$$

with no cross-product terms. All of these steps are carried out in the sketch below.
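A sketch of the whole computation; the $2 \times 2$ matrix is an illustrative stand-in for the one in the original cells.

```python
import numpy as np

A = np.array([[3.0, 2.0],
              [2.0, 6.0]])              # illustrative symmetric matrix

# Find the eigenvalues and eigenvectors.
eigenvalues, P = np.linalg.eig(A)
print(eigenvalues)                      # 2 and 7 (order may vary)

# Test if each eigenvector is normalized.
print(np.linalg.norm(P, axis=0))        # [1. 1.]

# Compute D = P^T A P; the off-diagonal entries vanish.
D = P.T @ A @ P
print(np.round(D, 12))

# Verify x^T A x = y^T D y with y = P^T x, for a random x.
x = np.random.rand(2)
y = P.T @ x
print(np.isclose(x @ A @ x, y @ D @ y))  # True
```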
Visualize the Quadratic Form
The code is exceedingly lengthy, but intuitive.
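A sketch of one such plot, assuming Matplotlib: it draws the surface $Q(x) = x^T A x$ over a grid, reusing the same illustrative matrix as above.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative positive definite matrix; its quadratic form is a bowl.
A = np.array([[3.0, 2.0],
              [2.0, 6.0]])

# Evaluate Q(x) = a11*x1^2 + 2*a12*x1*x2 + a22*x2^2 on a grid.
x1 = np.linspace(-2, 2, 100)
x2 = np.linspace(-2, 2, 100)
X1, X2 = np.meshgrid(x1, x2)
Q = A[0, 0] * X1**2 + 2 * A[0, 1] * X1 * X2 + A[1, 1] * X2**2

fig = plt.figure(figsize=(8, 6))
ax = fig.add_subplot(projection='3d')
ax.plot_surface(X1, X2, Q, cmap='viridis', alpha=0.8)
ax.set_xlabel('$x_1$')
ax.set_ylabel('$x_2$')
ax.set_zlabel('$Q(x)$')
ax.set_title('Quadratic form $x^T A x$')
plt.show()
```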
Some terms need to be defined. A quadratic form $Q$ is:

positive definite if $Q(x) > 0$ for all $x \neq 0$
negative definite if $Q(x) < 0$ for all $x \neq 0$
positive semidefinite if $Q(x) \geq 0$ for all $x$
negative semidefinite if $Q(x) \leq 0$ for all $x$
indefinite if $Q(x)$ assumes both positive and negative values.
We have a theorem for quadratic forms and eigenvalues:
Let $A$ be an $n \times n$ symmetric matrix. Then the quadratic form $x^T A x$ is:

positive definite if and only if the eigenvalues of $A$ are all positive
negative definite if and only if the eigenvalues of $A$ are all negative
indefinite if and only if $A$ has both positive and negative eigenvalues
With the help of this theorem, we can immediately tell whether a quadratic form has a maximum, minimum, or saddle point after calculating the eigenvalues.
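A small sketch of this test; classify_quadratic_form is a hypothetical helper (not from the notebook), and np.linalg.eigvalsh is the eigenvalue routine specialized for symmetric matrices.

```python
import numpy as np

def classify_quadratic_form(A, tol=1e-10):
    """Classify x^T A x by the eigenvalues of the symmetric matrix A.
    (Hypothetical helper for illustration.)"""
    eigenvalues = np.linalg.eigvalsh(A)
    if np.all(eigenvalues > tol):
        return 'positive definite'    # unique minimum at the origin
    if np.all(eigenvalues < -tol):
        return 'negative definite'    # unique maximum at the origin
    if np.any(eigenvalues > tol) and np.any(eigenvalues < -tol):
        return 'indefinite'           # saddle point at the origin
    return 'semidefinite'

print(classify_quadratic_form(np.array([[3., 2.], [2., 6.]])))   # positive definite
print(classify_quadratic_form(np.array([[1., 2.], [2., -3.]])))  # indefinite
```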
Positive Definite Matrix
Positive definite matrices are among the most important matrix forms in linear algebra; we will show that their eigenvalues are always positive.

Let $A$ be positive definite and let $Av = \lambda v$ with $v \neq 0$. Premultiplying by $v^T$:

$$v^T A v = \lambda v^T v = \lambda \|v\|^2$$

$v^T A v$ must be positive, since we defined $A$ so; therefore $\lambda$ must be larger than $0$.
Try asking the other way around: if all eigenvalues of a symmetric matrix are positive, is it positive definite? Yes.
Here is the Principal Axes Theorem, which employs the orthogonal change of variable $x = Py$:

$$x^T A x = y^T D y = \lambda_1 y_1^2 + \lambda_2 y_2^2 + \cdots + \lambda_n y_n^2$$

If all of the $\lambda_i$'s are positive, $x^T A x$ is also positive for any $x \neq 0$.
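A quick numerical check, reusing the illustrative matrix from before (eigenvalues 2 and 7, both positive):

```python
import numpy as np

A = np.array([[3.0, 2.0],
              [2.0, 6.0]])          # both eigenvalues positive

rng = np.random.default_rng(1)      # illustrative seed
for _ in range(5):
    x = rng.standard_normal(2)      # a random nonzero vector
    print(x @ A @ x > 0)            # True every time
```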
Cholesky Decomposition
Cholesky decomposition is a modification of $LU$ decomposition, and it is more efficient than the $LU$ algorithm. If $A$ is a positive definite matrix, i.e. $x^T A x > 0$ for all $x \neq 0$, or equivalently every eigenvalue of $A$ is strictly positive, then $A$ can be decomposed into the product of a lower triangular matrix and its transpose:

$$A = LL^T$$
We will show this with NumPy.
Check if $LL^T = A$.
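A minimal sketch with an illustrative positive definite matrix; np.linalg.cholesky returns the lower triangular factor $L$.

```python
import numpy as np

# Illustrative positive definite matrix (all leading minors positive).
A = np.array([[4.0, 2.0, 2.0],
              [2.0, 5.0, 3.0],
              [2.0, 3.0, 6.0]])

L = np.linalg.cholesky(A)        # lower triangular factor
print(L)

# Check if L L^T = A.
print(np.allclose(L @ L.T, A))   # True
```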
Some Facts of Symmetric Matrices
Rank and Positive Definiteness
If a symmetric matrix $A$ does not have full rank, there must be a non-trivial vector $x$ satisfying

$$Ax = 0$$

which also means the quadratic form equals zero: $x^T A x = 0$. Thus $A$ cannot be a positive definite matrix if it does not have full rank. Conversely, a positive definite matrix must have full rank.
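A tiny sketch of this fact, with an illustrative rank-deficient symmetric matrix:

```python
import numpy as np

# Rank-1 symmetric matrix: it annihilates every vector orthogonal to (1, 1).
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
print(np.linalg.matrix_rank(A))   # 1, not full rank

x = np.array([1.0, -1.0])         # a non-trivial vector with A x = 0
print(A @ x)                      # [0. 0.]
print(x @ A @ x)                  # 0.0, so A is not positive definite
```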