Image: ubuntu2004
Kernel: SageMath 9.1

### Projections

From Strang's *Introduction to Linear Algebra*, 5th ed.

Section 4.2

#### Projection Onto a Line

Projecting a vector $\mathbf{b}$ onto a vector $\mathbf{a}$ results in a vector in the direction of $\mathbf{a}$, but scaled. The scalar is written $\hat{x}$, so the projection is $\mathbf{p}=\hat{x}\mathbf{a}$. The error vector $\mathbf{e}$ runs from $\mathbf{p}$ to $\mathbf{b}$; its length is the shortest distance from $\mathbf{b}$ to the line through $\mathbf{a}$, and it is perpendicular to $\mathbf{a}$.

Projecting $\mathbf{b}$ onto $\mathbf{a}$, the error $\mathbf{e} = \mathbf{b} - \hat{x}\mathbf{a}$ must be perpendicular to $\mathbf{a}$:

$\mathbf{a} \cdot (\mathbf{b}-\hat{x}\mathbf{a}) = 0 \quad \text{or} \quad \mathbf{a} \cdot \mathbf{b} - \hat{x}\,\mathbf{a} \cdot \mathbf{a} = 0$

$\hat{x} = \frac{\mathbf{a} \cdot \mathbf{b}}{\mathbf{a} \cdot \mathbf{a}} = \frac{\mathbf{a}^T \mathbf{b}}{\mathbf{a}^T \mathbf{a}}$

Note, Sage does not distinguish between column and row vectors, and interprets them as needed in operations.

b = vector([1,1,1])
a = vector([1,2,2])
xhat = (a.dot_product(b))/(a.dot_product(a))
p = xhat*a
print("xhat = ", xhat)
print("p = ", p)
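
The same computation can be sanity-checked outside Sage in plain Python, using `fractions.Fraction` as a stand-in for Sage's exact rationals over `QQ` (this cross-check is not part of the original worksheet):

```python
# Cross-check of xhat and p in plain Python with exact rational arithmetic.
from fractions import Fraction

a = [Fraction(1), Fraction(2), Fraction(2)]
b = [Fraction(1), Fraction(1), Fraction(1)]

dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))

xhat = dot(a, b) / dot(a, a)        # a.b = 5, a.a = 9, so xhat = 5/9
p = [xhat * ai for ai in a]         # p = (5/9, 10/9, 10/9)
print("xhat =", xhat)
print("p =", p)
```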


$\hat{x}$ is determined by both $\mathbf{b}$ and $\mathbf{a}$. What we need next is the projection matrix $P$ that gives $\mathbf{p} = P \mathbf{b}$. The matrix $P$ is a transform for projecting ANY vector onto $\mathbf{a}$.

$\mathbf{p} = \hat{x}\mathbf{a} = \mathbf{a}\hat{x} = \mathbf{a}\,\frac{\mathbf{a}^T \mathbf{b}}{\mathbf{a}^T \mathbf{a}} = \frac{\mathbf{a}\mathbf{a}^T}{\mathbf{a}^T \mathbf{a}}\,\mathbf{b} = P\mathbf{b}$

Note that the numerator $\mathbf{a}\mathbf{a}^T$ is a column times a row, so $P$ is an $n\times n$ matrix.

# outer product a*a^T (column times row), divided by the scalar a.a
P = (matrix(QQ, 3, a)*matrix(QQ, 1, a))/(a*a)
P

P * b
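
A projection matrix is symmetric and idempotent: $P^T = P$ and $P^2 = P$ (projecting twice changes nothing). A quick NumPy check of both properties, plus agreement with $\hat{x}\mathbf{a}$ — NumPy is an assumption here, since the worksheet itself works over Sage's `QQ`:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([1.0, 1.0, 1.0])

# P = a a^T / (a^T a), the rank-one projection onto the line through a
P = np.outer(a, a) / a.dot(a)

assert np.allclose(P.T, P)        # symmetric
assert np.allclose(P @ P, P)      # idempotent: P^2 = P
assert np.allclose(P @ b, (a.dot(b) / a.dot(a)) * a)  # P b = xhat * a
```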


#### Projection Onto a Subspace

To find the projection $\mathbf{p}$ of a vector $\mathbf{b}$ onto the column space of a matrix $A$, we proceed as before: first find $\hat{x}$ in $\mathbf{p} = A\hat{x}$, then find $\mathbf{p}$, then find $P$ in $\mathbf{p} = P \mathbf{b}$.

As before, the error vector $\mathbf{e} = \mathbf{b} - A \hat{x}$ is perpendicular to the subspace.

Every column of $A$ is perpendicular to $\mathbf{e}$, so $A^T \mathbf{e} = \mathbf{0}$:

$A^T (\mathbf{b}-A \hat{x}) = \mathbf{0}$

From Strang, 5th ed.:

The combination $\mathbf{p} = \hat{x}_1\mathbf{a}_1 + \dots + \hat{x}_n\mathbf{a}_n$ that is closest to $\mathbf{b}$:

$\textbf{Find } \hat{x}\ (n\times 1) \text{ where } A^T(\mathbf{b}-A\hat{x}) = \mathbf{0} \text{ or } A^T \mathbf{b} = A^T A \hat{x}$

solve for $\hat{x}$ :

$(A^T A)^{-1} A^T \mathbf{b} = (A^T A)^{-1} A^T A \hat{x}$

$\hat{x} = (A^T A)^{-1} A^T \mathbf{b}$

$\textbf{Find } \mathbf{p} (m\times 1) \text{ where } \mathbf{p} = A\hat{x} = A(A^T A)^{-1}A^T\mathbf{b}$

$\textbf{Find } P (m \times m) \text{ where } P = A(A^T A)^{-1}A^T \text{ and } \mathbf{p} = P \mathbf{b}$

b = vector([6, 0, 0])
A = matrix(QQ, 3, [1, 0,
                   1, 1,
                   1, 2])

A.transpose() * A

A.transpose() * b

# solve the normal equations A^T A xhat = A^T b for xhat
xhat = (A.transpose() * A) \ (A.transpose() * b)
xhat

# find projection p
A * xhat
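
The worksheet stops at $\mathbf{p} = A\hat{x}$; the full projection matrix $P = A(A^T A)^{-1}A^T$ from the boxed formula can be built and checked the same way. A NumPy sketch of that last step (NumPy assumed, not part of the original Sage cells):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# P = A (A^T A)^{-1} A^T, the m x m matrix projecting onto C(A)
P = A @ np.linalg.inv(A.T @ A) @ A.T

xhat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations: xhat = (5, -3)
p = A @ xhat                               # p = (5, 2, -1)

assert np.allclose(P @ b, p)               # P b reproduces the projection
assert np.allclose(A.T @ (b - p), 0)       # error e = b - p is perpendicular to C(A)
```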

##### From Problem Set 4.2

Problem 5. Find the projection matrices $P_1$ and $P_2$ onto the lines through $\mathbf{a}_1 = \langle -1, 2, 2\rangle$ and $\mathbf{a}_2 = \langle 2, 2, -1\rangle$. Then find $P_1 P_2$.

a1 = vector([-1, 2, 2])
P1 = (matrix(QQ, 3, a1)*matrix(QQ, 1, a1))/(a1*a1)
P1

a2 = vector([2, 2, -1])
P2 = (matrix(QQ, 3, a2)*matrix(QQ, 1, a2))/(a2*a2)
P2

P1 * P2
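
Since $\mathbf{a}_1 \cdot \mathbf{a}_2 = -2 + 4 - 2 = 0$, the two lines are perpendicular and the product collapses: $P_1 P_2 = \mathbf{a}_1(\mathbf{a}_1^T\mathbf{a}_2)\mathbf{a}_2^T / (\mathbf{a}_1^T\mathbf{a}_1)(\mathbf{a}_2^T\mathbf{a}_2)$ is the zero matrix. A NumPy check (NumPy assumed):

```python
import numpy as np

a1 = np.array([-1.0, 2.0, 2.0])
a2 = np.array([2.0, 2.0, -1.0])

P1 = np.outer(a1, a1) / a1.dot(a1)
P2 = np.outer(a2, a2) / a2.dot(a2)

assert a1.dot(a2) == 0                         # the two lines are perpendicular
assert np.allclose(P1 @ P2, np.zeros((3, 3)))  # so P1 P2 = 0
```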


Problem 6. Project $\mathbf{b} = \langle 1, 0, 0\rangle$ onto $\mathbf{a}_1$, $\mathbf{a}_2$, and $\mathbf{a}_3 = \langle 2, -1, 2\rangle$. Find $\mathbf{p}_1 + \mathbf{p}_2 + \mathbf{p}_3$.

b = vector([1, 0, 0])
p1 = P1 * b
p1

p2 = P2 * b
p2

a3 = vector([2, -1, 2])
P3 = (matrix(QQ, 3, a3)*matrix(QQ, 1, a3))/(a3*a3)
p3 = P3 * b
p3

p1 + p2 + p3


Problem 7. Show that $P_1 + P_2 + P_3 = I$.

P1 + P2 + P3
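
This works because $\mathbf{a}_1$, $\mathbf{a}_2$, $\mathbf{a}_3$ are mutually perpendicular and span $\mathbb{R}^3$: the three rank-one projections decompose the identity, which also explains why $\mathbf{p}_1 + \mathbf{p}_2 + \mathbf{p}_3 = \mathbf{b}$ in Problem 6. A NumPy confirmation (NumPy assumed):

```python
import numpy as np

a1 = np.array([-1.0, 2.0, 2.0])
a2 = np.array([2.0, 2.0, -1.0])
a3 = np.array([2.0, -1.0, 2.0])

# rank-one projection onto the line through a
proj = lambda a: np.outer(a, a) / a.dot(a)

S = proj(a1) + proj(a2) + proj(a3)
assert np.allclose(S, np.eye(3))   # P1 + P2 + P3 = I
```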