Similarity
If $B = P^{-1}AP$, we say $B$ is similar to $A$. Decomposing $A$ into $PBP^{-1}$ is also called a similarity transformation, where $P$ is an invertible matrix.
If matrices $A$ and $B$ are similar, they have the same eigenvalues.
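As a quick check, here is a minimal SymPy sketch confirming that a similarity transformation preserves eigenvalues; both $A$ and $P$ below are arbitrary example matrices, not from the original chapter.

```python
import sympy as sp

A = sp.Matrix([[4, -2], [1, 1]])  # arbitrary example matrix
P = sp.Matrix([[1, 2], [0, 1]])   # invertible, since det(P) = 1

B = P.inv() * A * P               # B is similar to A

# Similar matrices share the same eigenvalues (with multiplicities).
print(A.eigenvals())              # {2: 1, 3: 1}
print(B.eigenvals())              # {2: 1, 3: 1}
```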
Here are some reasons why we need similarity transformations:
Invariant Properties: Similar matrices share many important properties, such as eigenvalues, determinant, trace, and rank. This invariance makes similarity transformations useful for simplifying problems without changing their fundamental characteristics.
Changing Basis: Similarity transformations can be interpreted as changing the basis in which the linear transformation represented by the matrix is expressed. This change of basis can make the problem easier to understand or solve.
Applications: Similarity transformations are widely used in various applications, including diagonalization, canonical forms, and simplifying differential equations.
Diagonalization, which we will explain below, is a special case of similarity transformation.
Diagonalizable Matrix and Special Decomposition
Let $A$ be an $n \times n$ matrix. If there exists an invertible matrix $P$ and a diagonal matrix $D$ such that

$$A = PDP^{-1}$$

then matrix $A$ is called a diagonalizable matrix.
Furthermore, the columns of $P$ are $n$ linearly independent eigenvectors of $A$, and the corresponding eigenvalues lie on the principal diagonal of $D$. In other words, $A$ is diagonalizable if and only if the dimension of its eigenspace basis is $n$.
Let's show why this equation holds. Define

$$P = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix} \quad \text{and} \quad D = \begin{bmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{bmatrix}$$

where $v_i$ is an eigenvector of $A$ and $\lambda_i$ is the corresponding eigenvalue.

We know that $Av_i = \lambda_i v_i$, i.e.

$$AP = \begin{bmatrix} Av_1 & Av_2 & \cdots & Av_n \end{bmatrix} = \begin{bmatrix} \lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n \end{bmatrix} = PD$$

Since $A$ has $n$ linearly independent eigenvectors, $P$ is invertible, and therefore

$$A = PDP^{-1}$$
Strictly speaking, if $A$ is symmetric, i.e. $A = A^T$, the procedure above is called spectral decomposition. The similar matrix $D$ holds all the eigenvalues on its diagonal, and $P$ is an orthogonal matrix, meaning any two of its columns are perpendicular. Writing $Q$ for this orthogonal matrix and using $Q^{-1} = Q^T$, the decomposition can be rewritten as

$$A = QDQ^T$$
We can show why all eigenvectors are orthogonal to each other. Set up two equations:

$$Av_1 = \lambda_1 v_1, \qquad Av_2 = \lambda_2 v_2$$

Take the transpose of the first equation and multiply by $v_2$:

$$(Av_1)^T v_2 = \lambda_1 v_1^T v_2$$

Since $A$ is symmetric ($A^T = A$):

$$v_1^T A v_2 = \lambda_1 v_1^T v_2$$

Substitute $Av_2 = \lambda_2 v_2$ into the equation:

$$\lambda_2 v_1^T v_2 = \lambda_1 v_1^T v_2$$

Simplify to get:

$$(\lambda_1 - \lambda_2)\, v_1^T v_2 = 0$$

Since $\lambda_1 \neq \lambda_2$, this implies that $v_1^T v_2 = 0$, meaning $v_1$ and $v_2$ are orthogonal.
A correlation/covariance matrix is a good example: it is symmetric and square, and all of its eigenvalues are real.
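A minimal numeric sketch of this fact, assuming some randomly generated data (not from the original text): the covariance matrix comes out symmetric, and its eigenvalues are all real.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))     # 100 samples of 3 variables

C = np.cov(X, rowvar=False)       # 3x3 covariance matrix
print(np.allclose(C, C.T))        # True: symmetric

# eigvalsh is the eigenvalue routine for symmetric matrices;
# it returns real eigenvalues (here also non-negative).
print(np.linalg.eigvalsh(C))
```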
Spectral Decomposition Visualization
Let's visualize spectral decomposition: $Q$ and $Q^T$ perform rotations because they are orthogonal, and $D$ performs scaling because it is diagonal.
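A sketch of one such visualization, assuming an arbitrary symmetric matrix $A$ (the original notebook's plotting code is not preserved here): we push the unit circle through $Q^T$, then $DQ^T$, then $QDQ^T$, showing rotation, scaling, and rotation back.

```python
import numpy as np
import matplotlib.pyplot as plt

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # arbitrary symmetric matrix
eigvals, Q = np.linalg.eigh(A)           # A = Q D Q^T for symmetric A
D = np.diag(eigvals)

t = np.linspace(0, 2 * np.pi, 200)
circle = np.vstack([np.cos(t), np.sin(t)])   # unit circle, shape (2, 200)

stages = [("unit circle", circle),
          ("after $Q^T$ (rotate)", Q.T @ circle),
          ("after $DQ^T$ (scale)", D @ Q.T @ circle),
          ("after $QDQ^T$ (rotate back)", Q @ D @ Q.T @ circle)]

fig, axes = plt.subplots(1, 4, figsize=(16, 4))
for ax, (title, pts) in zip(axes, stages):
    ax.plot(pts[0], pts[1])
    ax.set_title(title)
    ax.set_aspect("equal")
    ax.set_xlim(-4, 4)
    ax.set_ylim(-4, 4)
plt.show()
```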
Diagonalizing a Matrix
Consider a matrix $A$.
We seek to diagonalize the matrix $A$ by following these steps:
1. Compute the eigenvalues of $A$.
2. Compute the eigenvectors of $A$.
3. Construct $P$ from the eigenvectors.
4. Construct $D$ from the eigenvalues corresponding to the columns of $P$.
Reminder: the return value of SymPy's eigenvects() takes the form [(eigenval, multiplicity, eigenspace), ...].
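For illustration, here is a sketch of steps 1 and 2 in SymPy. The chapter's original matrix is not preserved in this text, so the $2 \times 2$ matrix below is a stand-in used throughout the examples that follow.

```python
import sympy as sp

A = sp.Matrix([[4, -2], [1, 1]])  # hypothetical stand-in matrix

print(A.eigenvals())   # {2: 1, 3: 1} -> eigenvalue: algebraic multiplicity
print(A.eigenvects())  # [(eigenval, multiplicity, [eigenvectors]), ...]
```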
Construct $P$:
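A sketch with the same stand-in matrix: collect the basis vectors of every eigenspace and stack them as the columns of $P$.

```python
import sympy as sp

A = sp.Matrix([[4, -2], [1, 1]])  # same stand-in matrix as above

# One eigenvector per basis vector of each eigenspace, in eigenvects() order.
eigenvectors = [v for _, _, vecs in A.eigenvects() for v in vecs]
P = sp.Matrix.hstack(*eigenvectors)
print(P)
```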
Construct $D$:
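Likewise for $D$: repeat each eigenvalue according to its multiplicity, in the same order as the eigenvectors used for $P$.

```python
import sympy as sp

A = sp.Matrix([[4, -2], [1, 1]])  # same stand-in matrix as above

eigenvalues = [lam for lam, mult, _ in A.eigenvects() for _ in range(mult)]
D = sp.diag(*eigenvalues)
print(D)
```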
We can verify that $A = PDP^{-1}$ holds:
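A sketch of the check, again with the stand-in matrix; because SymPy works in exact rational arithmetic, the comparison is exact.

```python
import sympy as sp

A = sp.Matrix([[4, -2], [1, 1]])
vects = A.eigenvects()
P = sp.Matrix.hstack(*[v for _, _, vecs in vects for v in vecs])
D = sp.diag(*[lam for lam, mult, _ in vects for _ in range(mult)])

# P * D * P^{-1} should reproduce A exactly.
print(P * D * P.inv() == A)  # True
```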
Of course, we don't need to go through this process step by step; there is a diagonalize method in SymPy.
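A minimal sketch using the same stand-in matrix; diagonalize() returns the pair $(P, D)$ directly.

```python
import sympy as sp

A = sp.Matrix([[4, -2], [1, 1]])  # same stand-in matrix as above

P, D = A.diagonalize()
print(P)                      # eigenvector matrix
print(D)                      # diagonal matrix of eigenvalues
print(P * D * P.inv() == A)   # True
```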
We obtain the same results as in the previous separate steps.
Sometimes we just want to test whether a matrix is diagonalizable; for that, use the is_diagonalizable method in SymPy.
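A short sketch: the stand-in matrix from above is diagonalizable, while a Jordan block (a classic defective matrix) is not.

```python
import sympy as sp

print(sp.Matrix([[4, -2], [1, 1]]).is_diagonalizable())  # True
print(sp.Matrix([[1, 1], [0, 1]]).is_diagonalizable())   # False: defective
```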
If $A$ is symmetric, all of its eigenvectors are orthogonal.
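A quick numeric check of this claim, using an arbitrary symmetric matrix (an assumption, not the original notebook's example):

```python
import sympy as sp

S = sp.Matrix([[2, 1], [1, 2]])   # arbitrary symmetric matrix
vecs = [v for _, _, basis in S.eigenvects() for v in basis]

# Dot products between eigenvectors of distinct eigenvalues vanish.
print(vecs[0].dot(vecs[1]))       # 0
```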
How Diagonalization Simplifies the Solution Process
Original System: The original system of differential equations is coupled, meaning that each equation involves multiple variables. For example:

$$\frac{d\mathbf{x}}{dt} = A\mathbf{x}$$

where $A$ is a matrix and $\mathbf{x}$ is a vector of variables.
Diagonalization: By finding the eigenvalues and eigenvectors of the matrix $A$, we can decompose $A$ into $PDP^{-1}$, where $D$ is a diagonal matrix of eigenvalues and $P$ is the matrix of eigenvectors:

$$A = PDP^{-1}$$

Change of Variables: We introduce a change of variables using the eigenvector matrix $P$:

$$\mathbf{x} = P\mathbf{y}$$

Substituting this into the original differential equation gives:

$$\frac{d(P\mathbf{y})}{dt} = AP\mathbf{y}$$

Simplifying, we get:

$$P\frac{d\mathbf{y}}{dt} = AP\mathbf{y}$$

Since $A = PDP^{-1}$:

$$P\frac{d\mathbf{y}}{dt} = PDP^{-1}P\mathbf{y} = PD\mathbf{y}$$

Multiplying both sides by $P^{-1}$:

$$\frac{d\mathbf{y}}{dt} = D\mathbf{y}$$

The transformed system is now decoupled because $D$ is a diagonal matrix. This means that each differential equation in the system involves only a single variable and can be solved independently:

$$\frac{dy_i}{dt} = \lambda_i y_i$$

where $\lambda_i$ are the eigenvalues of $A$.

Solving the Decoupled System: The decoupled differential equations are simple first-order linear differential equations, which have straightforward solutions:

$$y_i(t) = c_i e^{\lambda_i t}$$

where $c_i$ are constants determined by the initial conditions. Finally, we transform the solution back to the original variables using the eigenvector matrix $P$:

$$\mathbf{x}(t) = P\mathbf{y}(t)$$
This gives us the solution to the original system.
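To make this concrete, here is a small SymPy sketch, reusing the hypothetical stand-in matrix from earlier: it builds the decoupled solution $y_i(t) = c_i e^{\lambda_i t}$, maps it back with $\mathbf{x} = P\mathbf{y}$, and checks that the result satisfies $d\mathbf{x}/dt = A\mathbf{x}$.

```python
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2')
A = sp.Matrix([[4, -2], [1, 1]])          # hypothetical coupled system dx/dt = A x
P, D = A.diagonalize()

# Decoupled solutions y_i(t) = c_i * exp(lambda_i * t)
y = sp.Matrix([c1 * sp.exp(D[0, 0] * t),
               c2 * sp.exp(D[1, 1] * t)])

x = P * y                                 # transform back to original variables
print(x)

# Verify: dx/dt - A x should be the zero vector.
print(sp.simplify(x.diff(t) - A * x))
```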