
Euclidean vector spaces

Section 4.3 Bases

Now that we have the notions of span and linear independence in place, we simply combine them to define a basis of a vector space. In the spirit of Section 4.2, a basis of a vector space \(V\) should be understood as a minimal spanning set.
This section includes a number of theoretical results. There are two in particular that are worth highlighting, especially in regard to computational techniques for abstract vector spaces:
  1. If \(B\) is a basis of \(V\) containing exactly \(n\) elements, then any other basis \(B'\) also contains exactly \(n\) elements. (Theorem 4.4.11)
  2. If \(B\) is a basis for \(V\text{,}\) then every element of \(V\) can be written as a linear combination of elements of \(B\) in a unique way. (Theorem 4.3.4)
The first result allows us to define the dimension of a vector space as the number of elements in any given basis. The second result allows us to take any \(n\)-dimensional vector space \(V\) with chosen basis \(B=\{\boldv_1, \boldv_2, \dots, \boldv_n\}\) and effectively identify vectors \(\boldv\in V\) with the sequence \((c_1,c_2,\dots, c_n)\in \R^n\text{,}\) where
\begin{equation*} \boldv=c_1\boldv_1+c_2\boldv_2+\cdots +c_n\boldv_n\text{.} \end{equation*}
This observation has the following consequence: given any \(n\)-dimensional vector space \(V\text{,}\) no matter how exotic, once we choose a basis \(B\) of \(V\text{,}\) we can reduce any and all linear algebraic questions or computations about \(V\) to a corresponding question in \(\R^n\text{.}\) We will elaborate this idea further in Section 5.3.
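This coordinate identification is easy to experiment with numerically. The following sketch (our own illustration, using NumPy, not part of the text) computes the coordinates of a vector of \(\R^2\) relative to the nonstandard basis \(\{(1,1), (1,-1)\}\) that appears in Example 4.3.3, by solving the corresponding linear system.

```python
import numpy as np

# Basis vectors of R^2 as the columns of an invertible matrix.
B = np.array([[1.0, 1.0],
              [1.0, -1.0]]).T

v = np.array([3.0, -1.0])          # an arbitrary sample vector

# Solve c1*(1,1) + c2*(1,-1) = v; the solution is unique
# precisely because the columns form a basis.
coords = np.linalg.solve(B, v)
print(coords)                      # [1. 2.]

# Rebuilding v from its coordinate vector recovers v exactly.
assert np.allclose(B @ coords, v)
```

The coordinate vector \((c_1,c_2)\) is exactly the sequence described above that identifies \(v\) with an element of \(\R^n\).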

Subsection 4.3.1 Bases of vector spaces

Remark 4.3.2. Some standard bases.

The examples of standard spanning sets in Remark 4.2.8 are easily seen to be linearly independent, and hence are in fact bases. We list them again here, using the same notation, and refer to these as the standard bases for the given spaces.
  • Zero space.
    Let \(V=\{\boldzero\}\text{.}\) The empty set \(B=\emptyset=\{ \}\) is a basis for \(V\text{.}\) Note that \(B=\emptyset\) spans \(V\) by definition (Definition 4.2.1), and it satisfies the defining implication of linear independence (Definition 4.2.10) trivially.
  • Tuples.
    Let \(V=\R^n\text{.}\) The set \(B=\{\bolde_1, \bolde_2,\dots, \bolde_n\}\) is the standard basis of \(\R^n\text{.}\)
  • Matrices.
    Let \(V=M_{mn}\text{.}\) The set \(B=\{E_{ij}\colon 1\leq i\leq m, 1\leq j\leq n\}\) is the standard basis of \(M_{mn}\text{.}\)
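As a small computational aside (ours, not part of the text; the helper name `standard_basis_matrices` is made up), the standard basis \(\{E_{ij}\}\) of \(M_{mn}\) can be generated and checked in NumPy:

```python
import numpy as np

def standard_basis_matrices(m, n):
    """Build the standard basis of M_{mn}: E_ij has a 1 in entry
    (i, j) and zeros elsewhere (hypothetical helper name)."""
    basis = []
    for i in range(m):
        for j in range(n):
            E = np.zeros((m, n))
            E[i, j] = 1.0
            basis.append(E)
    return basis

B = standard_basis_matrices(2, 3)
print(len(B))   # m*n = 6 basis elements

# Any A in M_{23} is the combination sum_{i,j} a_ij * E_ij.
A = np.arange(6.0).reshape(2, 3)
rebuilt = sum(A[i, j] * B[3 * i + j] for i in range(2) for j in range(3))
assert np.allclose(rebuilt, A)
```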
Just as with spanning sets, bases are not in general unique: in fact, for any nonzero vector space there are infinitely many different bases.

Example 4.3.3. Some nonstandard bases.

For each \(V\) and \(B\) below, verify that \(B\) is a basis of \(V\text{.}\)
  1. \(V=\R^2\text{,}\) \(B=\{(1,1), (1,-1)\}\text{.}\)
  2. \(V=M_{22}\text{,}\)
    \begin{equation*} B=\left\{ A_1=\begin{bmatrix}0\amp 1\\ 1\amp 1 \end{bmatrix} , A_2=\begin{bmatrix}1\amp 0\\ 1\amp 1 \end{bmatrix} , A_3=\begin{bmatrix}1\amp 1\\ 0\amp 1 \end{bmatrix} , A_4=\begin{bmatrix}1\amp 1\\ 1\amp 0 \end{bmatrix} \right\}\text{.} \end{equation*}
Solution.
Each verification amounts to showing, using the techniques from Section 4.2, that the given \(B\) spans the given \(V\) and is linearly independent.
  1. Since neither element of \(B=\{(1,1), (1,-1)\}\) is a scalar multiple of the other, the set is linearly independent. To see that \(B\) spans \(\R^2\) we show that for any \((c,d)\in \R^2\) we have
    \begin{equation*} a(1,1)+b(1,-1)=(c,d) \end{equation*}
    for some \(a,b\in \R\text{.}\) Indeed we may take \(a=\frac{1}{2}(c+d)\) and \(b=\frac{1}{2}(c-d)\text{.}\) (These formulas were obtained by solving the corresponding system of two equations in the unknowns \(a\) and \(b\text{.}\))
  2. First we show that \(B=\{A_1,A_2,A_3,A_4\}\) spans \(M_{22}\text{.}\) Given an arbitrary element
    \begin{equation*} A=\begin{bmatrix}a\amp b\\ c\amp d\end{bmatrix}\text{,} \end{equation*}
    we must show that there exist scalars \(c_1,c_2,c_3,c_4\in \R\) satisfying
    \begin{equation*} c_1A_1+c_2A_2+c_3A_3+c_4A_4=A\text{.} \end{equation*}
    Expanding out the left side of the above equality, this becomes
    \begin{equation*} \begin{bmatrix} c_2+c_3+c_4 \amp c_1+c_3+c_4\\ c_1+c_2+c_4\amp c_1+c_2+c_3 \end{bmatrix}= \begin{bmatrix} a\amp b\\ c\amp d \end{bmatrix}\text{.} \end{equation*}
    Thus we have \(A\in \Span B\) if and only if the linear system with augmented matrix
    \begin{equation*} \begin{amatrix}[rrrr|r] 0\amp 1\amp 1\amp 1 \amp a\\ 1\amp 0\amp 1\amp 1 \amp b \\ 1\amp 1\amp 0 \amp 1 \amp c\\ 1\amp 1\amp 1\amp 0\amp d \end{amatrix} \end{equation*}
    is consistent. This augmented matrix row reduces to
    \begin{equation*} \begin{amatrix}[rrrr|r] \boxed{1}\amp 0\amp 1\amp 1 \amp b\\ 0\amp \boxed{1}\amp 1\amp 1 \amp a \\ 0\amp 0\amp \boxed{1} \amp 1/2 \amp \frac{1}{2}(a+b-c)\\ 0\amp 0\amp 0\amp \boxed{1}\amp \frac{1}{3}(a+b+c-2d) \end{amatrix}\text{.} \end{equation*}
    Since there is no leading one in the last column, we see that the corresponding system is consistent, and thus \(A\in \Span B\text{,}\) as desired.
    Turning to linear independence of \(B\text{,}\) we now endeavor to show that the only solution to
    \begin{equation*} c_1A_1+c_2A_2+c_3A_3+c_4A_4=\begin{bmatrix} 0\amp 0\\ 0\amp 0 \end{bmatrix} \end{equation*}
    is the trivial one \(c_1=c_2=c_3=c_4=0\text{.}\) Just as above, such a solution corresponds to a solution to the linear system with augmented matrix
    \begin{equation*} \begin{amatrix}[rrrr|r] 0\amp 1\amp 1\amp 1 \amp 0\\ 1\amp 0\amp 1\amp 1 \amp 0 \\ 1\amp 1\amp 0 \amp 1 \amp 0\\ 1\amp 1\amp 1\amp 0\amp 0 \end{amatrix}\text{,} \end{equation*}
    which row reduces to
    \begin{equation*} \begin{amatrix}[rrrr|r] \boxed{1}\amp 0\amp 1\amp 1 \amp 0\\ 0\amp \boxed{1}\amp 1\amp 1 \amp 0 \\ 0\amp 0\amp \boxed{1} \amp 1/2 \amp 0\\ 0\amp 0\amp 0\amp \boxed{1}\amp 0 \end{amatrix}\text{.} \end{equation*}
    Since the first four columns of this matrix contain leading ones, none of the unknowns \(c_i\) is free, which means that \((c_1,c_2,c_3,c_4)=(0,0,0,0)\) is the unique solution to the system. This proves that
    \begin{equation*} c_1A_1+c_2A_2+c_3A_3+c_4A_4=\boldzero\implies c_1=c_2=c_3=c_4=0\text{,} \end{equation*}
    as desired.
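The row reductions above can be checked numerically. Flattening each \(2\times 2\) matrix to a vector in \(\R^4\) turns both questions (spanning and independence) into a single rank computation; the sketch below (our own, using NumPy) confirms the conclusion of the example.

```python
import numpy as np

A1 = np.array([[0, 1], [1, 1]])
A2 = np.array([[1, 0], [1, 1]])
A3 = np.array([[1, 1], [0, 1]])
A4 = np.array([[1, 1], [1, 0]])

# Each flattened matrix becomes a column of a 4x4 matrix; B is a
# basis of M_22 exactly when this matrix has full rank 4.
M = np.column_stack([A.flatten() for A in (A1, A2, A3, A4)])
print(np.linalg.matrix_rank(M))   # 4, so B spans and is independent
```

Full rank means the homogeneous system has only the trivial solution (independence) and every right-hand side is attainable (spanning), matching the two row reductions carried out by hand.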
Proceeding directly from the definition, showing that a set \(B\) is a basis of \(V\) requires two steps: (i) show that \(\Span B= V\text{;}\) (ii) show that \(B\) is linearly independent. The following theorem gives rise to a convenient one-step technique for proving that a finite set \(B\) is a basis: show that every element of \(V\) can be written as a linear combination of elements of \(B\) in a unique way.

Proof.

Implication: \((1)\implies (2)\).
Suppose \(B\) is a basis. By definition, \(B\) spans \(V\text{,}\) and so every element of \(V\) can be written as a linear combination of elements of \(B\text{.}\) It remains to show that this linear combination is unique in the sense described. This follows from the fact that \(B\) is linearly independent. Indeed, if
\begin{equation*} c_1\boldv_1+c_2\boldv_2+\cdots +c_n\boldv_n=d_1\boldv_1+d_2\boldv_2+\cdots +d_n\boldv_n\text{,} \end{equation*}
then after some algebra we have
\begin{equation*} (c_1-d_1)\boldv_1+(c_2-d_2)\boldv_2+\cdots +(c_n-d_n)\boldv_n=\boldzero\text{.} \end{equation*}
Since \(B\) is linearly independent and since the \(\boldv_i\) are distinct, we must have \(c_i-d_i=0\text{,}\) and hence \(c_i=d_i\) for all \(1\leq i\leq n\text{.}\)
Implication: \((2)\implies (1)\).
If \(B\) satisfies (2), then clearly it spans \(V\text{.}\) The uniqueness of linear combinations of elements of \(B\) now easily implies \(B\) is linearly independent:
\begin{align*} c_1\boldv_1+c_2\boldv_2+\cdots +c_n\boldv_n=\boldzero \amp \implies c_1\boldv_1+c_2\boldv_2+\cdots +c_n\boldv_n=0\boldv_1+0\boldv_2+\cdots +0\boldv_n\\ \amp \implies c_1=0, c_2=0, \dots, c_n=0\text{,} \end{align*}
where the last step uses the fact that \(\boldzero=0\boldv_1+\cdots +0\boldv_n\) is the unique expression of \(\boldzero\) as a linear combination of the elements of \(B\text{.}\)
Theorem 4.3.4 yields the following one-step technique for proving a set is a basis.

Example 4.3.6. One-step technique for \(\R^3\).

Use the one step technique to decide whether the set
\begin{equation*} S=\{\boldv_1=(1,1,-3), \boldv_2=(1,0,-1), \boldv_3=(-1,1,-1), \boldv_4=(1,2,1)\} \end{equation*}
is a basis of \(\R^3\text{.}\)
Solution.
We ask whether for all elements \(\boldy=(a,b,c)\in \R^3\) we can write
\begin{equation} \boldy=c_1\boldv_1+c_2\boldv_2+c_3\boldv_3+c_4\boldv_4\tag{4.10} \end{equation}
for a unique choice of \(c_1,c_2,c_3, c_4\text{.}\) This is equivalent to asking whether the matrix equation
\begin{equation*} \underset{A}{\begin{amatrix}[rrrr] 1\amp 1\amp -1\amp 1\\ 1\amp 0\amp 1\amp 2\\ -3\amp -1\amp -1 \amp 1 \end{amatrix}}\, \underset{\boldx}{\colvec{c_1\\ c_2\\ c_3\\ c_4}}=\underset{\boldy}{\colvec{a\\ b\\ c}} \end{equation*}
has a unique solution \(\boldx=(c_1,c_2,c_3,c_4)\) for any choice of \(\boldy=(a,b,c)\text{.}\) Performing Gaussian elimination on the corresponding augmented matrix yields
\begin{equation*} \begin{amatrix}[rrrr|r] 1\amp 1\amp -1\amp 1\amp a \\ 1\amp 0\amp 1\amp 2\amp b\\ -3\amp -1\amp -1 \amp 1 \amp c \end{amatrix} \xrightarrow{\phantom{row}}U= \begin{amatrix}[rrrr|r] \boxed{1}\amp 1\amp -1\amp 1\amp a \\ 0\amp \boxed{1}\amp -2\amp -1\amp a-b\\ 0\amp 0\amp 0\amp \boxed{1} \amp (a+2b+c)/6 \end{amatrix}\text{.} \end{equation*}
Since the third column of \(U\) does not have a leading one, we conclude that the corresponding system has a free variable, and hence that for any given \((a,b,c)\in \R^3\) the equation (4.10) has either no solutions (inconsistent) or infinitely many solutions. In particular, it is not true that there is always a unique solution. Thus \(S\) is not a basis according to the one-step technique.
In fact, our Gaussian elimination analysis tells us exactly how \(S\) fails to be a basis. Since the last column of \(U\) does not have a leading one, the corresponding system is always consistent: i.e., there is always at least one solution \(\boldx=(c_1,c_2,c_3,c_4)\) to (4.10) for each \((a,b,c)\in \R^3\text{.}\) This tells us that \(S\) is a spanning set of \(\R^3\text{.}\) On the other hand, the existence of the free variable tells us that for \((a,b,c)=(0,0,0)=\boldzero\text{,}\) we will have infinitely many choices \(c_1,c_2,c_3,c_4\) satisfying
\begin{equation*} c_1\boldv_1+c_2\boldv_2+c_3\boldv_3+c_4\boldv_4=\boldzero\text{.} \end{equation*}
This shows that \(S\) is not linearly independent.
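A rank computation (our own numerical check, using NumPy) confirms both halves of this analysis at once: the matrix with columns \(\boldv_1,\dots,\boldv_4\) has rank \(3\), so \(S\) spans \(\R^3\), while four columns of rank three force a free variable, so \(S\) is dependent.

```python
import numpy as np

# Columns are v1, v2, v3, v4 from the example.
A = np.array([[ 1,  1, -1, 1],
              [ 1,  0,  1, 2],
              [-3, -1, -1, 1]], dtype=float)

rank = np.linalg.matrix_rank(A)
print(rank)         # 3 = dim R^3: the columns span R^3
print(A.shape[1])   # 4 > 3 columns: they cannot be independent
```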

Example 4.3.7. Video example: deciding if a set is a basis (\(\R^n\)).

Figure 4.3.8. Video: deciding if a set is a basis (\(\R^n\))

Example 4.3.9. Video example: deciding if a set is a basis (\(M_{nn}\)).

Figure 4.3.10. Video: deciding if a set is a basis
Besides deciding whether a given set is a basis, we will often want to come up with a basis of a given space on our own. The following “by inspection” technique often comes in handy in cases where the elements of the vector space can be described in a simple parametric manner.

Example 4.3.12. By-inspection basis technique.

Use Procedure 4.3.11 to compute a basis of the subspace \(W\subseteq \R^5\) defined as
\begin{equation*} W=\{(x_1,x_2,x_3,x_4,x_5)\in \R^5\colon x_1+x_2=x_4-x_5=0\}\text{.} \end{equation*}
Solution.
The two equations
\begin{align*} x_1+x_2\amp =0 \amp x_4-x_5\amp =0 \end{align*}
give two independent conditions on \(x_1,x_2\) and \(x_4,x_5\text{,}\) and no condition on \(x_3\text{.}\) We see that the general element of \(W\) can be described as
\begin{equation} (r,-r,s,t,t)=r\underset{\boldx_1}{\underbrace{(1,-1,0,0,0)}}+s\underset{\boldx_2}{\underbrace{(0,0,1,0,0)}}+t\underset{\boldx_3}{\underbrace{(0,0,0,1,1)}}\tag{4.11} \end{equation}
for arbitrary scalars \(r,s,t\text{.}\) It follows immediately that \(B=\{\boldx_1,\boldx_2,\boldx_3\}\) spans \(W\text{.}\) Furthermore, using (4.11), we have
\begin{align*} c_1\boldx_1+c_2\boldx_2+c_3\boldx_3=\boldzero \amp\implies (c_1,-c_1,c_2,c_3,c_3)=(0,0,0,0,0) \\ \amp \implies c_1=c_2=c_3=0 \end{align*}
for any scalars \(c_1,c_2,c_3\in \R\text{.}\) Thus \(B\) is linearly independent. We conclude \(B\) is a basis.
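The two claims can also be verified numerically. The sketch below (ours, using NumPy) checks that each proposed basis vector satisfies the defining equations of \(W\) and that the three vectors are linearly independent.

```python
import numpy as np

# The proposed basis vectors x1, x2, x3, stored as rows.
B = np.array([[1, -1, 0, 0, 0],
              [0,  0, 1, 0, 0],
              [0,  0, 0, 1, 1]], dtype=float)

# Membership in W: x1 + x2 = 0 and x4 - x5 = 0 for each row.
for x in B:
    assert x[0] + x[1] == 0 and x[3] - x[4] == 0

# Independence: the rank equals the number of vectors.
print(np.linalg.matrix_rank(B))   # 3
```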

Example 4.3.13. By-inspection basis technique.

Let \(W\subseteq M_{22}\) be the subspace of all symmetric matrices. Use Procedure 4.3.11 to compute a basis of \(W\text{.}\)
Solution.
We follow the three steps of Procedure 4.3.11.
  1. A general \(2\times 2\) symmetric matrix can be described parametrically as
    \begin{equation*} A=\begin{bmatrix} a\amp b\\ b\amp c \end{bmatrix}\text{.} \end{equation*}
  2. We have
    \begin{equation} \begin{bmatrix} a\amp b\\ b\amp c \end{bmatrix}= a \begin{bmatrix} 1\amp 0\\ 0\amp 0 \end{bmatrix} + b \begin{bmatrix} 0\amp 1\\ 1\amp 0 \end{bmatrix} + c \begin{bmatrix} 0\amp 0\\ 0\amp 1 \end{bmatrix}\text{.}\tag{4.12} \end{equation}
    It follows immediately that the set \(B=\{A_1, A_2, A_3\}\) is a spanning set, where
    \begin{equation*} A_1= \begin{bmatrix} 1\amp 0\\ 0\amp 0 \end{bmatrix}, A_2=\begin{bmatrix} 0\amp 1\\ 1\amp 0 \end{bmatrix} , A_3=\begin{bmatrix} 0\amp 0\\ 0\amp 1 \end{bmatrix}\text{.} \end{equation*}
  3. The expression (4.12) tells us that
    \begin{align*} aA_1+bA_2+cA_3 = \boldzero\amp \iff \begin{bmatrix} a\amp b\\ b\amp c \end{bmatrix}=\begin{bmatrix} 0\amp 0\\ 0\amp 0 \end{bmatrix} \\ \amp\iff a=b=c=0 \text{.} \end{align*}
    This proves \(B\) is linearly independent.
Since \(B\) is a linearly independent spanning set of \(W\text{,}\) it is a basis of \(W\text{.}\)
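As with the previous example, the conclusion is easy to confirm numerically (our own check, using NumPy): each proposed basis matrix is symmetric, the flattened matrices are independent, and a general symmetric matrix is recovered from its coefficients.

```python
import numpy as np

A1 = np.array([[1, 0], [0, 0]])
A2 = np.array([[0, 1], [1, 0]])
A3 = np.array([[0, 0], [0, 1]])

# Each element of B is symmetric.
for A in (A1, A2, A3):
    assert np.array_equal(A, A.T)

# Flattening to R^4 and computing the rank shows independence.
M = np.column_stack([A.flatten() for A in (A1, A2, A3)])
print(np.linalg.matrix_rank(M))   # 3

# Sample reconstruction with a = 2, b = 5, c = 7.
S = 2 * A1 + 5 * A2 + 7 * A3
assert np.array_equal(S, np.array([[2, 5], [5, 7]]))
```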

Exercises 4.3.2 Exercises

WeBWorK Exercises

1.
Find a basis for the vector space \(\lbrace A\in {\mathbb R}^{2\times 2} \mid \text{tr}(A)=0\rbrace\) of \(2\times 2\) matrices with trace 0.
\(B = \lbrace\) (2 × 2 array), (2 × 2 array), (2 × 2 array) \(\rbrace\text{.}\)
2.
A square matrix is half-magic if the sum of the numbers in each row and column is the same. Find a basis \(B\) for the vector space of \(2\times 2\) half-magic squares.
\(B = \lbrace\) (2 × 2 array), (2 × 2 array) \(\rbrace\text{.}\)

One-step basis technique.

For each vector space \(V\) and subset \(S\text{,}\) use the one-step technique (Procedure 4.3.5) to decide whether \(S\) is a basis for \(V\text{.}\)
4.
\(V=M_{22}\)
  1. \(S=\{A_1, A_2, A_3, A_4 \}\text{,}\) where
    \begin{equation*} A_1=\begin{bmatrix}1\amp 1\\ 1\amp 1 \end{bmatrix}, \ A_2=\begin{bmatrix}1\amp -1\\ 0\amp 0 \end{bmatrix}, \ A_3=\begin{bmatrix}0\amp -1\\ 1\amp 0 \end{bmatrix}, \ A_4=\begin{bmatrix}1\amp 0\\ 0\amp 0 \end{bmatrix} \end{equation*}
  2. \(S=\{A_1, A_2, A_3, A_4 \}\text{,}\) where
    \begin{equation*} A_1=\begin{bmatrix}1\amp 1\\ 1\amp 1 \end{bmatrix}, \ A_2=\begin{bmatrix}1\amp -1\\ -1\amp 0 \end{bmatrix}, \ A_3=\begin{bmatrix}0\amp 1\\ 1\amp 0 \end{bmatrix}, \ A_4=\begin{bmatrix}1\amp 0\\ 0\amp 0 \end{bmatrix} \end{equation*}

By-inspection basis technique.

For each given \(V\) and subspace \(W\subseteq V\text{,}\) provide a basis for \(W\) using the “by inspection” technique Procedure 4.3.11. In more detail:
  • Give a simple parametric description of the elements of \(W\text{.}\)
  • If your parametric description is simple enough, you should be able to find an obvious spanning set \(S\) of \(W\text{.}\)
  • Argue that your spanning set is linearly independent.
8.
\(V=M_{23}\text{,}\) \(W\) is the set of all matrices whose rows and columns all sum to zero

9.

Let \(B=\{\boldv_1,\boldv_2,\boldv_3\}\) be a basis for the vector space \(V\text{.}\) Let \(B'=\{\boldu_1,\boldu_2,\boldu_3\}\text{,}\) where
\begin{equation*} \boldu_1 = \boldv_1, \boldu_2 = \boldv_1 + \boldv_2, \boldu_3 = \boldv_1 +\boldv_2 + \boldv_3\text{.} \end{equation*}
Prove that \(B'\) is a basis.

10.

Let \(S=\{\boldv_1, \boldv_2, \dots, \boldv_k\}\) be a set of \(k\) distinct elements of \(\R^n\text{,}\) let \(A\) be an invertible \(n\times n\) matrix, and let \(S'=\{A\boldv_1, A\boldv_2,\dots, A\boldv_k\}\text{.}\) Prove that \(S\) is a basis of \(\R^n\) if and only if \(S'\) is a basis of \(\R^n\) as follows.
  1. Prove that \(A\boldv_i\ne A\boldv_j\) for all \(i\ne j\text{:}\) i.e., \(S'\) contains \(k\) distinct elements.
  2. Prove that if \(\{\boldv_1, \boldv_2, \dots, \boldv_k\}\) is a basis of \(\R^n\text{,}\) then \(\{A\boldv_1, A\boldv_2,\dots, A\boldv_k\}\) is also a basis for any invertible \(n\times n\) matrix \(A\text{.}\)
  3. Now prove that if \(\{A\boldv_1, A\boldv_2,\dots, A\boldv_k\}\) is a basis of \(\R^n\) for the invertible matrix \(A\text{,}\) then \(\{\boldv_1, \boldv_2, \dots, \boldv_k\}\) is a basis of \(\R^n\text{.}\)
    Hint: in (b) you showed that if you start with any basis and apply any invertible matrix to the elements of this basis, then you end up with another basis; think of a useful choice of matrix for the present situation.

11. Bases for important matrix subspaces.

Let \(V=M_{nn}\text{.}\) For each of the following subspaces \(W\subset M_{nn}\text{,}\) give a basis \(B\) of \(W\text{.}\) You must explicitly describe the elements of your basis as linear combinations of the elements \(E_{ij}\) of the standard basis for \(M_{nn}\text{.}\) No justification needed, as long as your proposed basis is simple enough.
  1. Upper triangular matrices.
    \(\displaystyle W=\{A\in M_{nn}\colon A \text{ is upper triangular}\}\)
  2. Symmetric matrices.
    \(\displaystyle W=\{A\in M_{nn}\colon A^T=A\}\)
  3. Skew-symmetric matrices.
    \(\displaystyle W=\{A\in M_{nn}\colon A^T=-A\}\)
Hint.
It might help to look at the \(n=2\) and \(n=3\) cases to get an idea of what these bases should be.