An Interactive Applet powered by Sage.
(By Kim Kyoung Sik - 2012314138)
In the theory of vector spaces, a set of vectors is said to be linearly dependent if one of the vectors in the set can be defined as a linear combination of the others; if no vector in the set can be written in this way, then the vectors are said to be linearly independent. These concepts are central to the definition of dimension.
The vectors in a subset $\displaystyle S=\{{\vec {v}}_{1},{\vec {v}}_{2},\dots ,{\vec {v}}_{k}\}$ of a vector space $\displaystyle V\,$ are said to be linearly dependent if there exist scalars $\displaystyle a_{1},a_{2},\dots ,a_{k}$, not all zero, such that
$\displaystyle a_{1}{\vec {v}}_{1}+a_{2}{\vec {v}}_{2}+\cdots +a_{k}{\vec {v}}_{k}={\vec {0}}$,
where $\displaystyle a_{1},a_{2},\dots ,a_{k}\in K$. If, on the other hand, the only solution of this equation is $\displaystyle a_{1}=\cdots =a_{k}=0$, the vectors are said to be linearly independent.
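In Sage, one quick way to test such a set is to form the matrix whose rows are the vectors and compare its rank to the number of vectors. This is a minimal sketch; the sample vectors u and w are chosen here only for illustration.

# Vectors are linearly independent iff the matrix built from
# them as rows has rank equal to the number of vectors.
u = vector(QQ, [1, 2])
w = vector(QQ, [2, 4])        # w = 2*u, so {u, w} is dependent
M = matrix([u, w])
print(M.rank(), M.nrows())    # 1 2 -> rank < count: dependent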
Now consider the linear dependence of the two vectors $\displaystyle v_{1}=(1,1)$ and $\displaystyle v_{2}=(-3,2)$: we ask whether there are scalars $\displaystyle a_{1},a_{2}$, not both zero, with
$\displaystyle a_{1}{\begin{bmatrix}1\\1\end{bmatrix}}+a_{2}{\begin{bmatrix}-3\\2\end{bmatrix}}={\begin{bmatrix}0\\0\end{bmatrix}},$
or
$\displaystyle {\begin{bmatrix}1&-3\\1&2\end{bmatrix}}{\begin{bmatrix}a_{1}\\a_{2}\end{bmatrix}}={\begin{bmatrix}0\\0\end{bmatrix}}.$
Subtracting the first row from the second gives $\displaystyle 5a_{2}=0$, so $\displaystyle a_{2}=0$ and then $\displaystyle a_{1}=3a_{2}=0$. Hence $\displaystyle a_{1}=a_{2}=0$ is the only solution, which means that the vectors $\displaystyle v_{1}=(1,1)$ and $\displaystyle v_{2}=(-3,2)$ are linearly independent.
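The same conclusion can be checked in Sage by computing the right kernel of the coefficient matrix: a zero-dimensional kernel means the homogeneous system has only the trivial solution (a sketch, assuming exact rational arithmetic over QQ).

A = matrix(QQ, [[1, -3], [1, 2]])
# right_kernel() collects every solution a of A*a = 0;
# dimension 0 means a1 = a2 = 0 is the only one.
K = A.right_kernel()
print(K.dimension())          # 0 -> linearly independent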
Equivalently, collect the two vectors as the columns of the coefficient matrix
$\displaystyle A = {\begin{bmatrix}1&-3\\1&2\end{bmatrix}}.$
The homogeneous system has a nonzero solution exactly when the determinant of $\displaystyle A\ $ vanishes; here
$\displaystyle \det A=1\cdot 2-1\cdot (-3)=5\neq 0.\,\!$
Since $\displaystyle \det A\neq 0 \,\!$, the vectors $(1, 1)$ and $(-3, 2)$ are linearly independent.
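Since this page is meant as an interactive applet, a sketch along the following lines would let the reader vary the two vectors and watch the determinant test; the @interact layout, input names, and plot colors are assumptions for illustration, not the original applet.

@interact
def check(a=1, b=1, c=-3, d=2):
    # Rows of A are the two vectors (a, b) and (c, d).
    v1 = vector(QQ, [a, b])
    v2 = vector(QQ, [c, d])
    A = matrix([v1, v2])
    verdict = "independent" if A.det() != 0 else "dependent"
    print("det A =", A.det(), "-> linearly", verdict)
    # Draw both vectors from the origin.
    show(arrow((0, 0), v1, color='blue')
         + arrow((0, 0), v2, color='red'),
         aspect_ratio=1)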