Real-time collaboration for Jupyter Notebooks, Linux Terminals, LaTeX, VS Code, R IDE, and more,
all in one place. Commercial Alternative to JupyterHub.
Path: blob/master/notebooks/Chapter 4 - LU Factorization.ipynb
LU Factorization
LU factorization is popular due to its computational efficiency: it mainly facilitates solving systems of linear equations, although the flop count (number of floating-point operations) of LU is still higher than that of Gauss-Jordan elimination.
Nonetheless, LU factorization is particularly handy if you are computing multiple solutions of $Ax = b$.
For example, if you have a set of right-hand sides $b$ to substitute into the system, we only decompose $A$ once, whereas the Gauss-Jordan algorithm would have to redo its operations for every augmented $b$.
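A sketch of this reuse with SciPy's `lu_factor`/`lu_solve` pair, which factors $A$ once and then performs only cheap triangular solves per right-hand side (the matrix and right-hand sides below are assumed for illustration, not the notebook's original):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Example matrix and several right-hand sides (assumed for illustration)
A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
bs = [np.array([4., 10., 24.]),
      np.array([1., 0., 0.]),
      np.array([0., 1., 0.])]

lu_piv = lu_factor(A)          # factor A once (LU with partial pivoting)
for b in bs:
    x = lu_solve(lu_piv, b)    # reuse the factorization for each b
    print(x)
```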
LU Algorithm
We aim to decompose a matrix $A$ into $L$ and $U$, which represent a lower and an upper triangular matrix respectively. $L$ must have $1$'s on its principal diagonal. For instance,
$$
A=\underbrace{\left[\begin{array}{cccc} 1 & 0 & 0 & 0 \\ * & 1 & 0 & 0 \\ * & * & 1 & 0 \\ * & * & * & 1 \end{array}\right]}_{L}\underbrace{\left[\begin{array}{cccc} * & * & * & * \\ 0 & * & * & * \\ 0 & 0 & * & * \\ 0 & 0 & 0 & * \end{array}\right]}_{U}
$$
The feasibility of the decomposition hinges on whether $A$ can be converted into an upper triangular matrix by a series of row operations, i.e.
$$E_p \cdots E_2 E_1 A = U$$
Then
$$A = (E_p \cdots E_2 E_1)^{-1} U = LU$$
where
$$L = (E_p \cdots E_2 E_1)^{-1} = E_1^{-1} E_2^{-1} \cdots E_p^{-1}$$
We will hand calculate an example here:
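The elimination steps of such a hand calculation can be mirrored in code by applying elementary matrices explicitly; the matrix and multipliers below are an assumed example, not the notebook's original:

```python
import numpy as np

# Matrix chosen here for illustration
A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])

# Elementary matrices that zero out the entries below the diagonal
E1 = np.eye(3); E1[1, 0] = -2.   # R2 <- R2 - 2*R1
E2 = np.eye(3); E2[2, 0] = -4.   # R3 <- R3 - 4*R1
E3 = np.eye(3); E3[2, 1] = -3.   # R3 <- R3 - 3*R2

U = E3 @ E2 @ E1 @ A             # upper triangular
L = np.linalg.inv(E3 @ E2 @ E1)  # unit lower triangular
print(L)
print(U)
print(np.allclose(L @ U, A))     # the factors reproduce A
```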
Or we can compute $L$ and $U$ directly, because
$$L = (E_p \cdots E_2 E_1)^{-1}$$
Construct the augmented matrix $[\,E_i \mid I\,]$ for each elementary matrix and row-reduce to obtain $[\,I \mid E_i^{-1}\,]$.
And we get the inverses of the elementary matrices, whose product gives $L$:
The final result of the LU decomposition:
We can take a look at SciPy results.
It is easy to notice that the SciPy lu
function gives us more than $L$ and $U$: it also returns $P$, a permutation matrix. Theoretically, we don't always need row switches to convert $A$ into $U$, but in some situations we must make row switches beforehand, otherwise the decomposition fails to materialize.
Thus SciPy uses the $PLU$ decomposition to make the procedure always possible.
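A minimal sketch of calling `scipy.linalg.lu`, which returns the three factors with $A = PLU$ (the matrix here is an assumed example):

```python
import numpy as np
from scipy.linalg import lu

# Example matrix (assumed for illustration)
A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])

P, L, U = lu(A)                    # SciPy convention: A = P @ L @ U
print(P)                           # permutation matrix
print(L)                           # unit lower triangular
print(U)                           # upper triangular
print(np.allclose(P @ L @ U, A))   # the factors reproduce A
```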
Also $P = P^{-1}$, why? It's easier to analyze with elementary matrices: a row-switching elementary matrix is its own inverse (and for a permutation matrix in general, $P^{-1} = P^{T}$).
With this knowledge, what we are decomposing is actually
$$A = PLU$$
Rearranging, we can see
$$PA = LU$$
Solving Linear System by Using LU Factorization
Solve the linear system $Ax = b$. Perform the $LU$ decomposition of the coefficient matrix: $A = LU$.
Replace $A$ by $LU$ and we get $LUx = b$. Let $y = Ux$; now solve this pair of equations:
$$Ly = b, \qquad Ux = y$$
Construct the augmented matrix $[\,L \mid b\,]$ and solve $Ly = b$ for $y$.
Next we solve $Ux = y$. To show this in NumPy:
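A sketch of the two triangular solves with SciPy (the system below is an assumed example, since the notebook's original matrices are not reproduced here):

```python
import numpy as np
from scipy.linalg import lu, solve_triangular

# Example system (assumed for illustration)
A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
b = np.array([4., 10., 24.])

P, L, U = lu(A)                            # A = P @ L @ U
# Forward substitution: solve L y = P^T b
y = solve_triangular(L, P.T @ b, lower=True)
# Back substitution: solve U x = y
x = solve_triangular(U, y)
print(x)                                   # x should be [1., 1., 1.]
```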
If the process is correct, this is the solution of the system, and we can verify the result by invoking np.linalg.solve().
The results are the same; the decomposition works!
Cholesky Factorization
Cholesky decomposition is analogous to taking the square root of a matrix: a symmetric positive-definite matrix $\Sigma$ is factored as $\Sigma = LL^{T}$, where $L$ is lower triangular. A common example is the decomposition of a covariance matrix; in this context, the factor $L$ plays a role similar to that of a square root of the covariance (or correlation) matrix.
Suppose we have a vector $z$ of uncorrelated random variables. Each element of $z$ is a standard normal random variable with zero mean and unit variance. In mathematical terms, $E[zz^{T}] = I$, where $I$ is the identity matrix.
We can transform the uncorrelated series $z$ into a correlated one by setting $x = Lz$, where $LL^{T} = \Sigma$ is the Cholesky factorization of the target covariance matrix $\Sigma$.
To understand this, we examine the covariance matrix of the transformed vector $x$:
$$\operatorname{Cov}(x) = E[xx^{T}] = E[Lzz^{T}L^{T}] = L\,E[zz^{T}]\,L^{T} = LL^{T} = \Sigma$$
This shows that $x$ has the desired covariance structure.
Let's see a numerical example.
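A sketch of such an example with NumPy, using an assumed $2 \times 2$ target covariance matrix: draw uncorrelated standard normals, transform them with the Cholesky factor, and check that the sample covariance comes out close to the target.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target covariance matrix (assumed here for illustration)
Sigma = np.array([[4.0, 1.2],
                  [1.2, 1.0]])

L = np.linalg.cholesky(Sigma)        # Sigma = L @ L.T, L lower triangular

# Uncorrelated standard normal samples z, transformed as x = L z
z = rng.standard_normal((2, 100_000))
x = L @ z

# Sample covariance of x should be close to Sigma
print(np.cov(x))
```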