Real-time collaboration for Jupyter Notebooks, Linux Terminals, LaTeX, VS Code, R IDE, and more,
all in one place. Commercial Alternative to JupyterHub.
Path: blob/master/notebooks/Chapter 15 - Innear Product and Orthogonality.ipynb
The Dot Product
Consider two vectors $\mathbf{u}, \mathbf{v} \in \mathbb{R}^n$:

$$\mathbf{u}=\begin{bmatrix}u_1\\ \vdots\\ u_n\end{bmatrix},\qquad \mathbf{v}=\begin{bmatrix}v_1\\ \vdots\\ v_n\end{bmatrix}$$

The dot product of $\mathbf{u}$ and $\mathbf{v}$, i.e. $\mathbf{u}\cdot\mathbf{v}$, is defined as

$$\mathbf{u}\cdot\mathbf{v}=\mathbf{u}^T\mathbf{v}=u_1v_1+u_2v_2+\cdots+u_nv_n$$
Let's generate two random vectors and compare the equivalent dot-product operations in NumPy.
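A minimal sketch of the comparison, using a seeded generator so the run is reproducible (the specific seed and vector length are arbitrary choices, not from the original):

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded for reproducibility
u = rng.standard_normal(5)
v = rng.standard_normal(5)

# Three equivalent ways to compute the dot product in NumPy
d1 = np.dot(u, v)
d2 = u @ v               # matrix-multiplication operator on 1-D arrays
d3 = np.sum(u * v)       # elementwise product, then sum: u1*v1 + ... + un*vn

print(d1, d2, d3)        # all three agree
```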
The Norm of a Vector
The norm is the length of a vector, defined by

$$\|\mathbf{v}\|=\sqrt{\mathbf{v}\cdot\mathbf{v}}=\sqrt{v_1^2+v_2^2+\cdots+v_n^2}$$
The NumPy built-in function np.linalg.norm() is used for computing norms. By default, it computes the Euclidean length of a vector measured from the origin.
Verify the results.
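A quick check with a hypothetical vector whose length is known in closed form (the classic 3-4-5 triangle):

```python
import numpy as np

v = np.array([3.0, 4.0])

# np.linalg.norm computes the Euclidean (2-)norm by default
n1 = np.linalg.norm(v)
n2 = np.sqrt(v @ v)      # the definition: square root of v . v

print(n1, n2)            # both are 5.0
```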
This function can also compute the lengths of a whole group of vectors at once via its axis argument.
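For instance, stacking several (hypothetical) vectors as rows of a matrix, `axis=1` returns every row's length in one call:

```python
import numpy as np

# Each row is one vector; axis=1 computes the norm of every row
A = np.array([[3.0, 4.0],
              [1.0, 0.0],
              [0.0, 2.0]])
lengths = np.linalg.norm(A, axis=1)
print(lengths)  # [5. 1. 2.]
```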
Distance in $\mathbb{R}^n$
For $\mathbf{u}$ and $\mathbf{v}$ in $\mathbb{R}^n$, the distance between $\mathbf{u}$ and $\mathbf{v}$, written as $\operatorname{dist}(\mathbf{u},\mathbf{v})$, is the length of the vector $\mathbf{u}-\mathbf{v}$. That is,

$$\operatorname{dist}(\mathbf{u},\mathbf{v})=\|\mathbf{u}-\mathbf{v}\|$$
Suppose we have two vectors $\mathbf{u}$ and $\mathbf{v}$; compute the distance between them and visualize the results.
From the graph, we can read off that the distance is the length of the vector connecting the two points, the same result as np.linalg.norm(u - v).
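A distance computation sketched with hypothetical values chosen so the answer is exact (the difference is the 3-4-5 vector):

```python
import numpy as np

u = np.array([2.0, 7.0])
v = np.array([5.0, 3.0])

# dist(u, v) is the length of the difference vector u - v
d = np.linalg.norm(u - v)
print(d)  # 5.0, since u - v = (-3, 4)
```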
Orthogonal Vectors
We have two vectors $\mathbf{u}$ and $\mathbf{v}$. Squaring the distances from $\mathbf{u}$ to $\mathbf{v}$ and from $\mathbf{u}$ to $-\mathbf{v}$ gives

$$[\operatorname{dist}(\mathbf{u},\mathbf{v})]^2=\|\mathbf{u}-\mathbf{v}\|^2=\|\mathbf{u}\|^2+\|\mathbf{v}\|^2-2\,\mathbf{u}\cdot\mathbf{v}$$

$$[\operatorname{dist}(\mathbf{u},-\mathbf{v})]^2=\|\mathbf{u}+\mathbf{v}\|^2=\|\mathbf{u}\|^2+\|\mathbf{v}\|^2+2\,\mathbf{u}\cdot\mathbf{v}$$
Suppose we pick specific vectors $\mathbf{u}$ and $\mathbf{v}$; visualize the vectors and the distances.
Note that if $\operatorname{dist}(\mathbf{u},\mathbf{v})=\operatorname{dist}(\mathbf{u},-\mathbf{v})$, then $\mathbf{u}$ and $\mathbf{v}$ are orthogonal. According to the equations above, it must then be that

$$\mathbf{u}\cdot\mathbf{v}=0$$

This is one of the most important conclusions in linear algebra.
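The equivalence can be checked numerically; the vectors below are hypothetical, chosen so that their dot product is exactly zero:

```python
import numpy as np

u = np.array([2.0, 1.0])
v = np.array([-1.0, 2.0])  # chosen so that u . v = 0

# Orthogonality <=> equal distances from u to v and from u to -v
d_plus = np.linalg.norm(u - v)
d_minus = np.linalg.norm(u + v)

print(u @ v)             # 0.0
print(d_plus, d_minus)   # the two distances coincide
```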
Suppose there is another vector $\mathbf{w}$; let's plot it over the graph again.
Using SciPy's built-in functions, construct two matrices holding the head and tail coordinates of the vectors.
Verify with NumPy's np.linalg.norm.
Now let's test whether the vector $\mathbf{w}$ is perpendicular to the other two. The two distances have the same length, which means $\operatorname{dist}(\mathbf{u},\mathbf{w})=\operatorname{dist}(\mathbf{u},-\mathbf{w})$ and therefore $\mathbf{u}\cdot\mathbf{w}=0$.
Orthogonal Complements
In general, the set of all vectors that are orthogonal to a subspace $W$ is called the orthogonal complement of $W$, denoted $W^{\perp}$.
The most common example: the nullspace of $A$ is perpendicular to the row space of $A$, and the nullspace of $A^T$ is perpendicular to the column space of $A$.
Angles in $\mathbb{R}^n$
Here is the formula for calculating angles in a vector space; to derive it we need the law of cosines:

$$\|\mathbf{u}-\mathbf{v}\|^2=\|\mathbf{u}\|^2+\|\mathbf{v}\|^2-2\|\mathbf{u}\|\|\mathbf{v}\|\cos\theta$$

Expanding the left side as $\|\mathbf{u}\|^2+\|\mathbf{v}\|^2-2\,\mathbf{u}\cdot\mathbf{v}$ and rearranging, we get

$$\cos\theta=\frac{\mathbf{u}\cdot\mathbf{v}}{\|\mathbf{u}\|\|\mathbf{v}\|}$$

In statistics, $\cos\theta$ (applied to mean-centered data) is called the correlation coefficient.
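The angle formula translates directly into NumPy; the two vectors below are hypothetical, chosen so the angle between them is 45 degrees:

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

# cos(theta) = (u . v) / (||u|| ||v||)
cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(cos_theta)
print(np.degrees(theta))  # approximately 45 degrees
```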
Geometric Interpretation of Dot Product
You may have noticed that the terms dot product and inner product are sometimes used interchangeably. However, they are not identical: the inner product is the more general notion. Functions and polynomials can also have an inner product, while the term dot product usually refers specifically to the inner product on $\mathbb{R}^n$.
The dot product has an interesting geometric interpretation. Consider two vectors $\mathbf{u}$ and $\mathbf{v}$, pointing in different directions with an angle $\theta$ between them. Suppose $\mathbf{u}$ is a unit vector. We want to determine how much $\mathbf{v}$ is aligned with the direction of $\mathbf{u}$.

This value can be calculated by projecting $\mathbf{v}$ onto $\mathbf{u}$:

$$\mathbf{v}\cdot\mathbf{u}=\|\mathbf{v}\|\|\mathbf{u}\|\cos\theta=\|\mathbf{v}\|\cos\theta$$

Any nonzero vector $\mathbf{w}$ can be normalized to a unit vector $\mathbf{w}/\|\mathbf{w}\|$. By performing the calculation above, we can determine how much $\mathbf{v}$ is aligned with the direction of $\mathbf{w}$.
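A sketch of both steps with hypothetical vectors (alignment with a coordinate axis, then alignment with a normalized direction):

```python
import numpy as np

v = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])   # already a unit vector

# Scalar projection: how much of v lies along u
alignment = v @ u
print(alignment)           # 3.0 -- exactly the x-component of v

# Normalize an arbitrary direction first, then project onto it
w = np.array([2.0, 2.0])
w_hat = w / np.linalg.norm(w)
print(v @ w_hat)           # component of v along the diagonal direction
```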
Orthogonal Sets
If every pair of distinct vectors in a set $\{\mathbf{u}_1,\dots,\mathbf{u}_p\}$ in $\mathbb{R}^n$ is orthogonal, i.e. $\mathbf{u}_i\cdot\mathbf{u}_j=0$ whenever $i\neq j$, then $\{\mathbf{u}_1,\dots,\mathbf{u}_p\}$ is called an orthogonal set.
Naturally, an orthogonal set is linearly independent, so it is also an orthogonal basis for the space it spans. An orthogonal basis has the advantage that coordinates with respect to the basis can be computed quickly.

For instance, any $\mathbf{y}$ in $W=\operatorname{Span}\{\mathbf{u}_1,\dots,\mathbf{u}_p\}$ can be written

$$\mathbf{y}=c_1\mathbf{u}_1+c_2\mathbf{u}_2+\cdots+c_p\mathbf{u}_p$$

Because it is an orthogonal set, dotting both sides with $\mathbf{u}_j$ leaves only one term:

$$\mathbf{y}\cdot\mathbf{u}_j=c_j(\mathbf{u}_j\cdot\mathbf{u}_j)$$

Thus

$$c_j=\frac{\mathbf{y}\cdot\mathbf{u}_j}{\mathbf{u}_j\cdot\mathbf{u}_j},\qquad j=1,\dots,p$$
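A sketch of this coordinate formula with a hypothetical orthogonal basis of $\mathbb{R}^2$ (the vectors and $\mathbf{y}$ are illustrative values, not from the original):

```python
import numpy as np

# An orthogonal (not orthonormal) basis of R^2
u1 = np.array([3.0, 1.0])
u2 = np.array([-1.0, 3.0])
y = np.array([6.0, 2.0])

# c_j = (y . u_j) / (u_j . u_j), no linear system to solve
c1 = (y @ u1) / (u1 @ u1)
c2 = (y @ u2) / (u2 @ u2)

# Reconstruct y from its coordinates in the orthogonal basis
y_rec = c1 * u1 + c2 * u2
print(c1, c2)    # 2.0 0.0
print(y_rec)     # recovers y exactly
```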
Orthogonal Projection
For any $\mathbf{y}$ in $\mathbb{R}^n$, we want to decompose it as

$$\mathbf{y}=\hat{\mathbf{y}}+\mathbf{z}=\alpha\mathbf{u}+\mathbf{z}$$

where $\mathbf{z}$ is perpendicular to $\mathbf{u}$ and $\alpha$ is a scalar. For the subspace $L$ spanned by $\mathbf{u}$, the projection of $\mathbf{y}$ onto $L$ is denoted as $\operatorname{proj}_L\mathbf{y}=\hat{\mathbf{y}}$.

Because $\mathbf{z}\perp\mathbf{u}$, then $\mathbf{z}\cdot\mathbf{u}=0$; replacing $\mathbf{z}$ by $\mathbf{y}-\alpha\mathbf{u}$:

$$(\mathbf{y}-\alpha\mathbf{u})\cdot\mathbf{u}=0\quad\Rightarrow\quad \alpha=\frac{\mathbf{y}\cdot\mathbf{u}}{\mathbf{u}\cdot\mathbf{u}}$$

Now we have the formula for the projection onto the line $L$ spanned by $\mathbf{u}$:

$$\operatorname{proj}_L\mathbf{y}=\hat{\mathbf{y}}=\frac{\mathbf{y}\cdot\mathbf{u}}{\mathbf{u}\cdot\mathbf{u}}\mathbf{u}$$
A Visual Example in $\mathbb{R}^2$
Suppose we have vectors $\mathbf{y}$ and $\mathbf{u}$. Project $\mathbf{y}$ onto the subspace spanned by $\mathbf{u}$.

Let's use the formula to compute $\hat{\mathbf{y}}$ and $\mathbf{z}$.
With results above, we can plot the orthogonal projection.
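The computation behind the plot can be sketched as follows, with hypothetical values for $\mathbf{y}$ and $\mathbf{u}$ chosen so the arithmetic is exact:

```python
import numpy as np

y = np.array([2.0, 6.0])
u = np.array([4.0, 2.0])

# Projection of y onto the line spanned by u
alpha = (y @ u) / (u @ u)
y_hat = alpha * u        # component of y along u
z = y - y_hat            # component of y perpendicular to u

print(y_hat, z)
print(z @ u)             # 0.0 -- z is orthogonal to u by construction
```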
The Orthogonal Decomposition Theorem
To generalize the orthogonal projection in the higher dimension , we summarize the idea into the orthogonal decomposition theorem.
Let $W$ be a subspace of $\mathbb{R}^n$. Then each $\mathbf{y}$ in $\mathbb{R}^n$ can be written uniquely in the form

$$\mathbf{y}=\hat{\mathbf{y}}+\mathbf{z}$$

where $\hat{\mathbf{y}}$ is in $W$ and $\mathbf{z}$ is in $W^{\perp}$. In fact, if $\{\mathbf{u}_1,\dots,\mathbf{u}_p\}$ is any orthogonal basis of $W$, then

$$\hat{\mathbf{y}}=\frac{\mathbf{y}\cdot\mathbf{u}_1}{\mathbf{u}_1\cdot\mathbf{u}_1}\mathbf{u}_1+\cdots+\frac{\mathbf{y}\cdot\mathbf{u}_p}{\mathbf{u}_p\cdot\mathbf{u}_p}\mathbf{u}_p$$

and $\mathbf{z}=\mathbf{y}-\hat{\mathbf{y}}$.

In $\mathbb{R}^2$, we projected $\mathbf{y}$ onto a subspace spanned by a single vector $\mathbf{u}$; here we generalize the formula to $\mathbb{R}^n$, where $\mathbf{y}$ is projected onto $W$ spanned by $\mathbf{u}_1,\dots,\mathbf{u}_p$.
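The theorem can be sketched numerically with a hypothetical orthogonal basis of a plane $W$ in $\mathbb{R}^3$ (all values below are illustrative):

```python
import numpy as np

# Orthogonal basis of a plane W in R^3 (the xy-plane here)
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
y = np.array([2.0, 3.0, 5.0])

# y_hat = sum_j (y . u_j)/(u_j . u_j) u_j lies in W; z = y - y_hat lies in W-perp
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
z = y - y_hat
print(y_hat)  # [2. 3. 0.]
print(z)      # [0. 0. 5.], orthogonal to both u1 and u2
```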
A Visual Example in $\mathbb{R}^3$
Consider a subspace $W$ and a vector $\mathbf{y}$ that is not in $W$; decompose $\mathbf{y}$ into $\hat{\mathbf{y}}+\mathbf{z}$ and plot them,

where

$$\mathbf{z}=\mathbf{y}-\hat{\mathbf{y}}$$

The projection of $\mathbf{y}$ onto $W$ in $\mathbb{R}^3$ is

$$\hat{\mathbf{y}}=\frac{\mathbf{y}\cdot\mathbf{u}_1}{\mathbf{u}_1\cdot\mathbf{u}_1}\mathbf{u}_1+\frac{\mathbf{y}\cdot\mathbf{u}_2}{\mathbf{u}_2\cdot\mathbf{u}_2}\mathbf{u}_2$$
The plotting code is quite repetitive, but exceedingly intuitive.
Orthonormal Sets
An orthonormal set is obtained by normalizing an orthogonal set; it is also called an orthonormal basis.
Matrices whose columns form an orthonormal set are important for matrix computation.
Let $U=[\mathbf{u}_1\ \mathbf{u}_2\ \cdots\ \mathbf{u}_p]$, where the $\mathbf{u}_i$'s are from an orthonormal set. The product $UU^T=\sum_i \mathbf{u}_i\mathbf{u}_i^T$ is a sum of outer products, and each entry of $U^TU$ is an inner product $\mathbf{u}_i^T\mathbf{u}_j$.

Because the set is orthonormal, we have

$$\mathbf{u}_i^T\mathbf{u}_j=\begin{cases}1, & i=j\\ 0, & i\neq j\end{cases}$$

which means

$$U^TU=I$$
Recall that we have a general projection formula
$$\hat{\mathbf{y}}=\frac{\mathbf{y} \cdot \mathbf{u}_{1}}{\mathbf{u}_{1} \cdot \mathbf{u}_{1}} \mathbf{u}_{1}+\cdots+\frac{\mathbf{y} \cdot \mathbf{u}_{p}}{\mathbf{u}_{p} \cdot \mathbf{u}_{p}} \mathbf{u}_{p} =\frac{\mathbf{y} \cdot \mathbf{u}_{1}}{{\mathbf{u}_{1}^T} \mathbf{u}_{1}} \mathbf{u}_{1}+\cdots+\frac{\mathbf{y} \cdot \mathbf{u}_{p}}{{\mathbf{u}_{p}^T} \mathbf{u}_{p}} \mathbf{u}_{p}$$

If $\{\mathbf{u}_1,\dots,\mathbf{u}_p\}$ is an orthonormal set, then every denominator $\mathbf{u}_j^T\mathbf{u}_j=1$, so

$$\hat{\mathbf{y}}=(\mathbf{y}\cdot\mathbf{u}_1)\mathbf{u}_1+\cdots+(\mathbf{y}\cdot\mathbf{u}_p)\mathbf{u}_p=UU^T\mathbf{y}$$
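Both identities can be verified numerically; the two orthonormal columns below are hypothetical, obtained by normalizing an orthogonal pair:

```python
import numpy as np

# Two orthonormal vectors in R^3 as columns of U
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
U = np.column_stack([u1, u2])

# Orthonormal columns satisfy U^T U = I
print(U.T @ U)

# Projection of y onto Col(U) collapses to U U^T y
y = np.array([2.0, 3.0, 5.0])
y_hat = U @ U.T @ y
print(y_hat)  # [2. 3. 0.]
```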
Cross Product
This is the formula for the cross product:

$$\mathbf{u}\times\mathbf{v}=\begin{vmatrix}\mathbf{i}&\mathbf{j}&\mathbf{k}\\ u_1&u_2&u_3\\ v_1&v_2&v_3\end{vmatrix}$$

The output of the cross product is a vector perpendicular to both $\mathbf{u}$ and $\mathbf{v}$; its length equals $\|\mathbf{u}\|\|\mathbf{v}\|\sin\theta$.
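A sketch with the standard basis vectors of $\mathbb{R}^3$, whose cross product is known in closed form:

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])

w = np.cross(u, v)
print(w)  # [0. 0. 1.], perpendicular to both u and v

# Its length equals ||u|| ||v|| sin(theta); here theta = 90 degrees
print(np.linalg.norm(w))  # 1.0
```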