
Section 3.7 Dimension

Intuitively, we think of \(\R^2\) as a two-dimensional space, and \(\R^3\) as a three-dimensional one. Why? Loosely speaking, this notion of dimension has something to do with the number of degrees of freedom involved in specifying a particular element of the given space: to specify an element \(\boldv\in \R^2\) we need to give its two coordinates; to specify an element of \(\R^3\) we need to give its three coordinates. Clearly, this conception is too imprecise to serve as a mathematical definition. What exactly does “degrees of freedom” mean? And how do you quantify the number of “degrees of freedom” needed for a given space? For example, we also think of a plane \(W\subseteq \R^3\) passing through the origin as a two-dimensional object; it is not immediately clear how to square this intuition with our vague “degrees of freedom” formulation. In this section we introduce the definition of the dimension of a vector space, which is a rigorous articulation of these notions. Our definition, which relies on the concept of bases, seems simple enough: the dimension of a vector space \(V\) is defined as the number of elements contained in any basis \(B\) of \(V\text{.}\) However, as we will see, there is considerable work involved (a) in proving that this definition is well-defined, and (b) in showing that it captures the intuition of dimension described above.

Subsection 3.7.1 Dimension of a vector space

Definition 3.7.1. Cardinality of a set.

Let \(X\) be a set. The cardinality of \(X\), denoted \(\val{X}\text{,}\) is defined as follows:
  • If \(X\) is finite, then its cardinality is the number \(n\) of distinct elements it contains, written \(\val{X}=n\text{.}\)
  • If \(X\) is infinite, then we say it has infinite cardinality, and write \(\val{X}=\infty\text{.}\)
Theorem 3.7.2.

Let \(V\) be a vector space, and suppose the set \(S\subseteq V\) spans \(V\text{,}\) where \(\val{S}=n\text{.}\) Then any set \(S'\subseteq V\) with \(\val{S'}=r>n\) is linearly dependent.

Proof.
Let \(S=\{\boldv_1, \boldv_2, \dots, \boldv_n\}\text{,}\) and let \(S'=\{\boldw_1, \boldw_2, \dots, \boldw_r\}\text{.}\) Since \(S\) spans \(V\text{,}\) each element of \(S'\) is a linear combination of elements of \(S\text{:}\) i.e., we have
\begin{equation} \boldw_j=a_{1j}\boldv_1+a_{2j}\boldv_2+\cdots +a_{nj}\boldv_n=\sum_{i=1}^na_{ij}\boldv_i\tag{3.7.1} \end{equation}
for all \(1\leq j\leq r\text{.}\) Now consider the following chain of equivalences:
\begin{align*} c_1\boldw_1+c_2\boldw_2+\cdots +c_r\boldw_r=\boldzero \amp \iff c_1\left(\sum_{i=1}^na_{i1}\boldv_i\right)+c_2\left(\sum_{i=1}^na_{i2}\boldv_i\right)+\cdots +c_r\left(\sum_{i=1}^na_{ir}\boldv_i\right)=\boldzero\amp \text{(by (3.7.1))} \\ \amp \iff \sum_{j=1}^rc_j\sum_{i=1}^na_{ij}\boldv_i=\boldzero\\ \amp\iff \sum_{i=1}^n\left(\sum_{j=1}^ra_{ij}c_j\right)\boldv_i=\boldzero \\ \amp \iff \left(\sum_{j=1}^ra_{1j}c_j\right)\boldv_1+\left(\sum_{j=1}^ra_{2j}c_j\right)\boldv_2+\cdots +\left(\sum_{j=1}^ra_{nj}c_j\right)\boldv_n=\boldzero \text{.} \end{align*}
From the last vector equation, we see that if we can find a nonzero sequence \((c_1,c_2,\dots, c_r)\) satisfying
\begin{equation*} \sum_{j=1}^ra_{ij}c_j=0 \end{equation*}
for all \(1\leq i\leq n\text{,}\) then there is a nontrivial linear combination of the \(\boldw_j\) equal to the zero vector, and hence \(S'\) is linearly dependent. Such a sequence \((c_1,c_2,\dots, c_r)\) corresponds to a solution to the homogeneous linear system with augmented matrix
\begin{equation*} \begin{amatrix}[r|r] A\amp \boldzero \end{amatrix}\text{,} \end{equation*}
where \(A=[a_{ij}]_{n\times r}\text{.}\) Since this is a homogeneous system of \(n\) equations in \(r\) unknowns, and since \(r>n\text{,}\) there are in fact infinitely many solutions. (The system has at most \(n\) leading ones, so at least one of the \(r\) columns of an equivalent row echelon matrix must fail to contain a leading one, giving rise to a free variable.) In particular there is a nonzero solution \((c_1,c_2,\dots, c_r)\ne \boldzero\text{,}\) and we conclude that \(S'\) is linearly dependent.
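The counting argument above is easy to test computationally. What follows is a minimal sketch (assuming Python with SymPy; the matrix entries are invented for illustration): any homogeneous system with more unknowns than equations has a nonzero solution, which is exactly what forces \(S'\) to be linearly dependent.

```python
import sympy as sp

# Hypothetical coefficient matrix A = [a_ij] with n = 2 rows (equations)
# and r = 3 columns (unknowns), so r > n.
A = sp.Matrix([[1, 2, 3],
               [4, 5, 6]])

# nullspace() returns a basis for the solution space of A*c = 0;
# since r > n it is guaranteed to be nonempty.
c = A.nullspace()[0]
print(c.T)        # Matrix([[1, -2, 1]]): a nonzero solution
print((A * c).T)  # Matrix([[0, 0]]): A*c is the zero vector, as claimed
```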
The next theorem not only ensures that our definition of dimension (Definition 3.7.4) is well-defined, it also characterizes dimension as the minimal cardinality of a spanning set, and the maximal cardinality of a linearly independent set.

Theorem 3.7.3.

Let \(V\) be a vector space with finite basis \(B\text{,}\) where \(\val{B}=n\text{.}\)
  1. If \(S\subseteq V\) spans \(V\text{,}\) then \(\val{S}\geq n\text{.}\)
  2. If \(S\subseteq V\) is linearly independent, then \(\val{S}\leq n\text{.}\)
  3. Any basis \(B'\) of \(V\) satisfies \(\val{B'}=n\text{.}\)

Proof.
  1. Suppose by contradiction that \(\Span S=V\) and \(\val{S} < n\text{.}\) Then Theorem 3.7.2 would imply that \(B\text{,}\) having more than \(\val{S}\) elements, is linearly dependent. Since \(B\) is a basis, and hence linearly independent, this is a contradiction. We conclude that \(\val{S}\geq n\text{.}\)
  2. This also follows from Theorem 3.7.2: since \(B\) is a spanning set of \(V\) with \(\val{B}=n\text{,}\) any set containing more than \(n\) elements must be linearly dependent.
  3. This follows from (1) and (2): if \(B'\) is a basis, then since \(B'\) spans \(V\text{,}\) we have \(\val{B'}\geq n\) by (1); and since \(B'\) is linearly independent, we have \(\val{B'}\leq n\) by (2). We conclude that \(\val{B'}=n\text{.}\)

Definition 3.7.4. Dimension of a vector space.

Let \(V\) be a vector space. The dimension of \(V\), denoted \(\dim V\text{,}\) is defined as follows:
  • If \(V\) has a finite basis \(B\text{,}\) then the dimension of \(V\) is the number of elements of \(B\text{:}\) i.e., \(\dim V=\val{B}\text{.}\)
  • If \(V\) does not have a finite basis, then we say \(V\) is infinite dimensional, and write \(\dim V=\infty\text{.}\)

Remark 3.7.5.

Wouldn’t it have been more natural to simply say \(V\) is infinite dimensional if it has an infinite basis? As it turns out this is indeed equivalent to not having a finite basis, but to prove this we need to establish the general fact that we can always find a basis for a given vector space \(V\text{.}\) As intuitive as that claim may sound (i.e., that bases always exist), its proof requires some set theory methods that are not covered in this text. As such we will not include it in our treatment of dimension, and so will have to make do with our slightly awkward definition of infinite-dimensional vector spaces.
By definition, to show a vector space \(V\) has dimension \(n\text{,}\) we must exhibit a basis \(B\) with \(\val{B}=n\text{.}\) Similarly, to show \(V\) is infinite dimensional, we must show that it does not have a finite basis: equivalently, if you believe Remark 3.7.5, we must exhibit an infinite basis of \(V\text{.}\)

Procedure 3.7.6. Computing dimension.

To compute the dimension of a vector space \(V\text{:}\)
  1. Produce a basis \(B\) of \(V\text{,}\) or show that \(V\) has no finite basis.
  2. In the first case conclude \(\dim V=\val{B}\text{;}\) in the second, conclude \(\dim V=\infty\text{.}\)
It is especially easy to carry out Procedure 3.7.6 for a vector space when we have a basis \(B\) at the ready. This is the case for our list of familiar vector spaces, each of which is equipped with a standard basis.

Example 3.7.7. Familiar vector spaces.

For each \(V\) below we provide its standard basis \(B\) and compute \(\dim V=\val{B}\text{.}\)
  • Zero space.
    \(V=\{\boldzero\}\text{,}\) \(B=\emptyset=\{ \}\text{,}\) \(\dim V=0\)
  • Tuples.
    \(V=\R^n\text{,}\) \(B=\{\bolde_1, \bolde_2,\dots, \bolde_n\}\text{,}\) \(\dim V=\val{B}=n\)
  • Matrices.
    \(V=M_{mn}\text{,}\) \(B=\{E_{ij}\colon 1\leq i\leq m, 1\leq j\leq n\}\text{,}\) \(\dim V=\val{B}=m\, n\)
  • Polynomials of bounded degree.
    \(V=P_n\text{,}\) \(B=\{x^n, x^{n-1}, \dots, x, 1\}\text{,}\) \(\dim V=\val{B}=n+1\)
  • Polynomials.
    \(V=P\text{,}\) \(B=\{1,x, x^2, \dots\}\text{,}\) \(\dim V=\val{B}=\infty\)
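These standard-basis counts can be sanity-checked in coordinates. Below is a minimal sketch (assuming Python with NumPy) that verifies two of the counts by stacking basis vectors into a matrix and computing its rank; the rank equals the number of vectors exactly when they are linearly independent.

```python
import numpy as np

# dim R^n = n: the standard basis vectors e_1, ..., e_n stack into the
# n x n identity matrix, whose rank is n.
n = 4
print(np.linalg.matrix_rank(np.eye(n)))    # 4

# dim M_{mn} = m*n: flatten each matrix unit E_ij (1 in entry (i, j),
# zeros elsewhere) into a vector of length m*n and stack them.
m, n = 2, 3
units = []
for i in range(m):
    for j in range(n):
        E = np.zeros((m, n))
        E[i, j] = 1.0
        units.append(E.flatten())
print(np.linalg.matrix_rank(np.vstack(units)))   # 6 = m*n
```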
Even when we do not have a standard basis at our disposal, there are many situations when we can produce a basis of a space by inspection, allowing us to then easily compute the dimension.

Example 3.7.8. Subspace of \(P_3\).

Let \(W\subseteq P_3\) be the subspace of polynomials whose leading coefficient is equal to their constant term: i.e.,
\begin{equation*} W=\{a_3x^3+a_2x^2+a_1x+a_0\colon a_3=a_0\}\text{.} \end{equation*}
Compute \(\dim W\) using Procedure 3.7.6. You may take for granted that \(W\) is a subspace.
Solution.
First we use Procedure 3.6.13 to produce a basis of \(W\text{.}\) A general element of \(W\) has parametric description
\begin{equation} p(x)=ax^3+bx^2+cx+a=a(x^3+1)+bx^2+cx\text{,}\tag{3.7.2} \end{equation}
from which we see immediately that \(B=\{x^3+1, x^2, x\}\) is a spanning set of \(W\text{.}\) From (3.7.2) we also see that
\begin{align*} a(x^3+1)+bx^2+cx=\boldzero \amp \iff ax^3+bx^2+cx+a=0x^3+0x^2+0x+0 \\ \amp \iff a=b=c=0\text{.} \end{align*}
Thus \(B\) is a basis of \(W\text{.}\) Since \(B\) contains three distinct elements, we conclude that \(\dim W=\val{B}=3\text{.}\)
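The independence computation from (3.7.2) can be double-checked by identifying \(P_3\) with \(\R^4\) via coefficient vectors. Here is a minimal sketch, assuming Python with NumPy:

```python
import numpy as np

# Identify P_3 with R^4 via coefficient vectors (a3, a2, a1, a0).
# The rows below encode the proposed basis B = {x^3 + 1, x^2, x} of W.
B = np.array([[1, 0, 0, 1],    # x^3 + 1
              [0, 1, 0, 0],    # x^2
              [0, 0, 1, 0]])   # x

# Rank 3 = number of vectors, so B is linearly independent; together
# with the spanning argument above, dim W = 3.
print(np.linalg.matrix_rank(B))   # 3
```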

Example 3.7.9. Dimension of symmetric matrices.

Let \(W\subseteq M_{22}\) be the space of symmetric \(2\times 2\) matrices. Compute \(\dim W\) using Procedure 3.7.6. You may take for granted that \(W\) is a subspace.
Solution.
We saw in Example 3.6.14 that \(B=\{A_1, A_2, A_3\}\) is a basis of \(W\text{,}\) where
\begin{equation*} A_1= \begin{bmatrix} 1\amp 0\\ 0\amp 0 \end{bmatrix}, A_2=\begin{bmatrix} 0\amp 1\\ 1\amp 0 \end{bmatrix} , A_3=\begin{bmatrix} 0\amp 0\\ 0\amp 1 \end{bmatrix}\text{.} \end{equation*}
We conclude that \(\dim W=\val{B}=3\text{.}\)
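The independence of \(B\) (established in Example 3.6.14) can be replayed symbolically. A minimal sketch, assuming Python with SymPy:

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
A1 = sp.Matrix([[1, 0], [0, 0]])
A2 = sp.Matrix([[0, 1], [1, 0]])
A3 = sp.Matrix([[0, 0], [0, 1]])

# Impose a*A1 + b*A2 + c*A3 = 0 entrywise and solve: only the trivial
# solution exists, so {A1, A2, A3} is linearly independent.
L = a * A1 + b * A2 + c * A3
print(sp.solve([L[i, j] for i in range(2) for j in range(2)], [a, b, c]))
# {a: 0, b: 0, c: 0}
```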

Video example: computing dimension.

Figure 3.7.10. Video: computing dimension
The “contracting and expanding” theorem below is a very useful theoretical consequence of Theorem 3.7.3. It allows us to construct a customized basis from a given set \(S\text{.}\) This method is used prominently in the proof of the rank-nullity theorem.

Theorem 3.7.11. Contracting and expanding to bases.

Let \(V\) be a vector space of finite dimension \(n\text{,}\) and let \(S\subseteq V\) be a finite set.
  1. If \(\Span S=V\text{,}\) then there is a subset \(S'\subseteq S\) that is a basis of \(V\text{:}\) i.e., we can contract \(S\) to a basis of \(V\text{.}\)
  2. If \(S\) is linearly independent, then there is a basis \(B\) of \(V\) with \(S\subseteq B\text{:}\) i.e., we can expand \(S\) to a basis of \(V\text{.}\)

Proof.
Let \(S=\{\boldv_1, \boldv_2, \dots , \boldv_r\}\text{.}\)
  1. Assume \(\Span S=V\text{.}\) Let \(S'\) be a subset of \(S\) of minimal cardinality such that \(\Span S'\) is still equal to \(V\text{.}\) Such a set is guaranteed to exist: since \(S\) is finite, it has a finite number of subsets; of these subsets some will span, others will not; of the subsets that do span, there will be one (or more) that has the least number of elements.
    We claim that such a spanning subset \(S'\) of minimal cardinality is linearly independent, and hence is a basis for \(V\text{,}\) as desired. We give a proof by contradiction. Suppose \(S'\) is linearly dependent. Then some element of \(S'\text{,}\) call it \(\boldv_0\text{,}\) can be expressed as a linear combination of the other elements (Remark 3.5.10). Then in terms of span, the element \(\boldv_0\) is “redundant”. In other words, if we let \(S''=S'-\{\boldv_0\}\text{,}\) the set obtained by “throwing out” \(\boldv_0\text{,}\) then we have \(\Span S''=\Span S'=V\text{.}\) Since \(\val{S''} < \val{S'}\text{,}\) this contradicts our choice of \(S'\) as a spanning set of minimal cardinality. We conclude that \(S'\) is linearly independent, completing the proof.
  2. Assume \(S\) is linearly independent. The procedure below constructs a finite chain of sets
    \begin{equation*} S=S_0\subseteq S_1\subseteq \cdots \subseteq S_k \end{equation*}
    that ends with a basis \(B=S_k\text{.}\)
    • Initialization.
      Let \(S_0=S\text{.}\)
    • Expansion loop.
      If \(\Span S_i=V\text{,}\) return \(B=S_i\text{.}\) Otherwise set \(S_{i+1}=S_i\cup \{\boldv\}\text{,}\) where \(\boldv\) is any element of \(V\) that is not contained in \(\Span S_i\text{,}\) and repeat.
    Some observations:
    1. Each \(S_i\) is linearly independent. This can be shown by induction, the crucial point being that if \(S_i\) is linearly independent, and if \(\boldv\notin \Span S_i\text{,}\) then \(S_{i+1}=S_i\cup \{\boldv\}\) is linearly independent. The proof is left as an exercise.
    2. The procedure must halt. Why? Since \(\dim V=n\text{,}\) and since each \(S_i\) is linearly independent, we must have \(\val{S_i}\leq n\) for all \(i\) by Theorem 3.7.3. Since \(\val{S_i}< \val{S_{i+1}}\) and \(\val{S_{i}}\leq n\text{,}\) the procedure must halt in at most \(n\) steps.
    From (ii) we may assume the procedure halts at \(S_k\) for some \(k\geq 0\text{.}\) From (i) we know that \(S_k\) is linearly independent. Furthermore, since the procedure halts at \(S_k\text{,}\) we know that \(\Span S_k=V\text{.}\) It follows that \(B=S_k\) is a basis containing \(S\text{,}\) as desired.
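For concreteness, here is the expansion loop carried out in coordinates. A minimal sketch (assuming Python with NumPy, with \(V=\R^n\) and the candidate vectors drawn from the standard basis, which suffices since the \(\bolde_k\) span \(\R^n\)): membership \(\boldv\in\Span S_i\) is detected by whether appending \(\boldv\) raises the rank.

```python
import numpy as np

def extend_to_basis(S, n):
    """Extend a linearly independent list S of vectors in R^n to a basis,
    drawing candidate vectors from the standard basis e_1, ..., e_n."""
    basis = [np.asarray(v, dtype=float) for v in S]
    for k in range(n):
        if len(basis) == n:              # Span(basis) = R^n: halt
            break
        v = np.eye(n)[k]                 # candidate vector e_{k+1}
        candidate = np.vstack(basis + [v])
        # v lies outside Span(basis) exactly when the rank increases,
        # in which case basis + [v] is still linearly independent.
        if np.linalg.matrix_rank(candidate) > len(basis):
            basis.append(v)
    # The loop must halt with n vectors: if Span(basis) != R^n,
    # some e_k lies outside it, since the e_k together span R^n.
    return basis

print(extend_to_basis([[1, 1, 1]], 3))
# [array([1., 1., 1.]), array([1., 0., 0.]), array([0., 1., 0.])]
```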
The following corollary follows from Theorem 3.7.3 and Theorem 3.7.11. We call it a “street smarts” result as the first two statements give us a quick and dirty way of dashing a set’s hopes of being a basis. The third statement asserts that when a finite set’s cardinality matches the dimension of a space, then to prove it is a basis it suffices to prove either one of the two defining properties of Definition 3.6.1.

Corollary 3.7.12. Street smarts.

Let \(V\) be a vector space of finite dimension \(n\text{,}\) and let \(S\subseteq V\) be a finite set.
  1. If \(\val{S} < n\text{,}\) then \(S\) does not span \(V\text{.}\)
  2. If \(\val{S} > n\text{,}\) then \(S\) is linearly dependent.
  3. If \(\val{S}=n\text{,}\) then \(S\) is a basis of \(V\) if and only if \(S\) spans \(V\text{,}\) if and only if \(S\) is linearly independent.

Proof.
Statements (1) and (2) follow directly from Theorem 3.7.3. Statement (3) is an easy consequence of Theorem 3.7.11. For example, if \(S\) spans \(V\text{,}\) then there is a subset \(S'\subseteq S\) that is a basis of \(V\text{;}\) since all bases of \(V\) have \(n\) elements, and since \(\val{S}=n\text{,}\) we must have \(S'=S\text{;}\) thus \(S\) is a basis. The proof for a linearly independent set \(S\) is similar, and is left to the reader.
We are often in a situation where we wish to show a given subspace of a vector space is in fact the entire space. For example, when deciding whether a set \(S\) spans a vector space \(V\text{,}\) we are asking whether \(W=\Span S\) is all of \(V\text{.}\) As another example, given a linear transformation \(T\colon V\rightarrow W\text{,}\) one of the first things we wish to determine is whether the subspace \(\im T\) is in fact all of \(W\text{.}\) As the next result illustrates, dimension is a very useful tool in this regard.

Corollary 3.7.13. Dimension of subspaces.

Let \(W\) be a subspace of the vector space \(V\text{.}\)
  1. \(\dim W\leq \dim V\text{.}\)
  2. If \(V\) is finite dimensional, then \(W=V\) if and only if \(\dim W=\dim V\text{.}\)

Proof.
  1. Firstly, if \(\dim V=\infty\text{,}\) then clearly \(\dim W\leq \dim V\) for any subspace \(W\subseteq V\text{.}\)
    Now assume \(\dim V=n\text{.}\) Apply the “extending to a basis” procedure described in the proof of Theorem 3.7.11 to the empty set \(S=\{\}\text{,}\) which is linearly independent, considered as a subset of \(W\text{:}\) i.e., at each stage, if the current set \(S_i\) is not a basis of \(W\text{,}\) add any element \(\boldv\in W\) with \(\boldv\notin \Span S_i\text{.}\) Since \(W\subseteq V\text{,}\) and since \(\dim V=n\text{,}\) the linearly independent sets \(S_i\) cannot have more than \(n\) elements; thus the procedure must halt with a basis \(B\) of \(W\) satisfying \(\val{B}\leq n\text{.}\) We conclude that \(\dim W\leq \dim V\text{.}\)
  2. Clearly, if \(W=V\text{,}\) then \(\dim W=\dim V\text{.}\) For the other direction, assume \(\dim W=\dim V=n\text{.}\) Let \(B\) be a basis for \(W\text{.}\) Since \(B\) is lienarly independent, it can be extended to a basis \(B'\) of \(V\) (3.7.11). Since \(\val{B}=\dim W=\dim V\text{,}\) and since all bases of \(V\) have cardinality \(n\text{,}\) we must have \(B=B'\text{.}\) It follows that \(B\) is also a basis of \(V\text{,}\) and hence that
    \begin{equation*} V=\Span B=W\text{.} \end{equation*}

Example 3.7.14. Dimension of subspace.

Let \(W=\Span\{A_1, A_2, A_3\}\subseteq M_{22}\text{,}\) where
\begin{equation*} A_1= \begin{bmatrix} 1\amp 1\\ 1\amp 0 \end{bmatrix}, A_2= \begin{bmatrix} 0\amp 1\\ 1\amp 0 \end{bmatrix}, A_3= \begin{bmatrix} 0\amp 1\\ 1\amp 1 \end{bmatrix}\text{.} \end{equation*}
Let \(W'\) be the space of symmetric \(2\times 2\) matrices. Use Corollary 3.7.13 and the fact that \(\dim W'=3\) to prove that \(W=W'\text{.}\)
Solution.
  • Since each \(A_i\) is symmetric, and since the set of symmetric matrices \(W'\) is a subspace, we have \(W=\Span\{A_1, A_2,A_3\}\subseteq W'\) by statement (2) of Theorem 3.5.5.
  • The set \(B=\{A_1, A_2, A_3\}\) is linearly independent:
    \begin{align*} aA_1+bA_2+cA_3=\boldzero \amp \iff \begin{bmatrix} a\amp a+b+c\\ a+b+c\amp c \end{bmatrix}= \begin{bmatrix} 0\amp 0\\ 0\amp 0 \end{bmatrix}\\ \amp \iff a=c=0 \text{ and } a+b+c=0 \\ \amp \iff a=b=c=0\text{.} \end{align*}
    Thus \(B\) is a basis of \(W=\Span B\text{,}\) and we conclude that \(\dim W=3\text{.}\)
  • We saw in Example 3.7.9 that \(\dim W'=3\text{:}\) i.e., the space of symmetric \(2\times 2\) matrices is \(3\)-dimensional. Since \(W\) is a subspace of \(W'\) and \(\dim W=\dim W'=3\text{,}\) we conclude that \(W=W'\) by Corollary 3.7.13. (These computations are replayed numerically in the sketch below.)
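The containment and dimension count above translate directly into a rank check after flattening the matrices to vectors in \(\R^4\text{.}\) A minimal sketch, assuming Python with NumPy:

```python
import numpy as np

A1 = np.array([[1, 1], [1, 0]])
A2 = np.array([[0, 1], [1, 0]])
A3 = np.array([[0, 1], [1, 1]])

# Containment W ⊆ W': each spanning matrix is symmetric.
print(all(np.array_equal(A, A.T) for A in (A1, A2, A3)))   # True

# dim W = 3: the flattened matrices are linearly independent.
M = np.vstack([A.flatten() for A in (A1, A2, A3)])
print(np.linalg.matrix_rank(M))                            # 3
# Since dim W = dim W' = 3 and W ⊆ W', Corollary 3.7.13 gives W = W'.
```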
Corollary 3.7.13 is also of use in situations where we wish to characterize all subspaces of a given \(n\)-dimensional vector space \(V\text{.}\) By the corollary, any such subspace \(W\) must have dimension \(k\) for some \(0\leq k\leq n\text{;}\) and this in turn means that \(W\) must have a basis of cardinality \(k\text{.}\) We thus have the beginnings of a systematic description of all subspaces \(W\subseteq V\text{,}\) organized by dimension \(k=\dim W\text{,}\) for \(0\leq k\leq n\text{.}\)

Example 3.7.15. Subspaces of \(\R^3\).

We describe all subspaces \(W\) of \(\R^3\) ordered by dimension.
  • \(\dim W=0\).
    If \(\dim W=0\text{,}\) then \(W\) must have the empty set \(\{\}\) as a basis (by definition of dimension), and hence \(W=\Span\{ \}=\{\boldzero\}\text{,}\) the zero subspace.
  • \(\dim W=1\).
    If \(\dim W=1\text{,}\) then \(W\) has a basis \(B=\{\boldv\}\) consisting of a single nonzero vector \(\boldv\in \R^3\text{.}\) In this case we have \(W=\Span\{\boldv\}=\{t\boldv\colon t\in \R\}\text{,}\) which is a line passing through the origin with direction vector \(\boldv\text{.}\) Thus \(1\)-dimensional subspaces of \(\R^3\) are lines passing through the origin.
  • \(\dim W=2\).
    If \(\dim W=2\text{,}\) then \(W\) has a basis \(B=\{\boldv_1, \boldv_2\}\) containing two non-parallel vectors in \(\R^3\text{.}\) You learn in multivariable calculus that
    \begin{equation*} W=\Span\{\boldv_1, \boldv_2\}=\{s\boldv_1+t\boldv_2\colon s,t\in \R\} \end{equation*}
    is the plane in \(\R^3\) passing through the origin with “direction vectors” \(\boldv_1, \boldv_2\text{.}\) In fact, you can show that the cross product \(\boldn=\boldv_1\times \boldv_2\) is a normal vector for the plane \(W\) in this case; see the sketch following this list for a computational illustration. Thus the \(2\)-dimensional subspaces of \(\R^3\) are planes passing through the origin.
  • \(\dim W=3\).
    Assume \(\dim W=3\text{.}\) Since \(W\) is a subspace of \(\R^3\text{,}\) and since \(\dim W=\dim \R^3=3\text{,}\) we must have \(W=\R^3\) according to Corollary 3.7.13. It is as simple as that! Thus \(\R^3\) is the only \(3\)-dimensional subspace of \(\R^3\text{.}\)
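To illustrate the \(\dim W=2\) case concretely, here is a minimal sketch (assuming Python with NumPy; the basis vectors are invented): the cross product of the two basis vectors is orthogonal to every element of \(W\text{,}\) exhibiting \(W\) as a plane through the origin.

```python
import numpy as np

v1 = np.array([1.0, 0.0, 2.0])   # hypothetical basis of a plane W
v2 = np.array([0.0, 1.0, -1.0])

n = np.cross(v1, v2)             # a normal vector for W
print(n)                         # [-2.  1.  1.]

# Every s*v1 + t*v2 is orthogonal to n, so W = {x : n . x = 0}.
s, t = 3.0, -5.0
print(np.dot(n, s * v1 + t * v2))  # 0.0
```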
That was quite a dose of theory! For your convenience, we collect the various results on dimension, together with their nicknames, in one omnibus theorem.

Exercises 3.7.2 Street smarts!

WeBWork Exercises

1.
Are the following statements true or false?
  1. \(\R^n\) has exactly one subspace of dimension \(m\) for each of \(m = 0,1,2,\ldots, n\text{.}\)
  2. The nullity of a matrix \(A\) is the same as the dimension of the subspace spanned by the columns of \(A\text{.}\)
  3. If \(\{u_1, u_2, u_3\}\) is a basis for \(\R^3\text{,}\) then \(\Span\{u_1, u_2\}\) is a plane.
  4. The set \(\{0\}\) forms a basis for the zero subspace.
  5. Let \(m>n\text{.}\) Then \(U =\{u_1, u_2,\ldots, u_m\}\) in \(\R^n\) can form a basis for \(\R^n\) if the correct \(m-n\) vectors are removed from \(U\text{.}\)
2.
Suppose that \(S_1\) and \(S_2\) are nonzero subspaces, with \(S_1\) contained inside \(S_2\text{,}\) and suppose that \(\dim(S_2) = 3\text{.}\)
(a) What are the possible dimensions of \(S_1\text{?}\)
  • 1
  • 2
  • 3
  • 1 or 2
  • 1,2, or 3
(b) If \(S_1\neq S_2\) then what are the possible dimensions of \(S_1\text{?}\)
  • 1
  • 2
  • 3
  • 1 or 2
  • 1,2, or 3
Answer 1.
\(\text{1,2, or 3}\)
Answer 2.
\(\text{1 or 2}\)
Solution.
(a) The dimension of \(S_1\) cannot exceed the dimension of \(S_2\) since \(S_1\) is contained in \(S_2\text{.}\) \(S_1\) is non-zero, and thus its dimension cannot be 0. Hence 1, 2, or 3 are the possible dimensions of \(S_1\text{.}\)

(b) If \(S_1 \ne S_2\text{,}\) then \(S_1\) is properly contained in \(S_2\text{,}\) and the dimension of \(S_1\) is strictly less than the dimension of \(S_2\text{.}\) Thus only 1 or 2 are possible dimensions of \(S_1\text{.}\)
3.
Suppose that \(S_1\) and \(S_2\) are nonzero subspaces, with \(S_1\) contained inside \(S_2\text{,}\) and suppose that \(\dim(S_2) = 4\text{.}\)
  1. What are the possible dimensions of \(S_1\text{?}\)
  2. If \(S_1 \ne S_2\text{,}\) then what are the possible dimensions of \(S_1\text{?}\)
Solution.
(a) The dimension of \(S_1\) cannot exceed the dimension of \(S_2\text{,}\) since \(S_1\) is contained in \(S_2\text{.}\) \(S_1\) is non-zero, and thus its dimension cannot be zero. Hence the possible dimensions of \(S_1\) are 1, 2, 3, and 4.

(b) If \(S_1 \ne S_2\text{,}\) then \(S_1\) is properly contained in \(S_2\text{,}\) and the dimension of \(S_1\) is strictly less than the dimension of \(S_2\text{.}\) So the possible dimensions of \(S_1\) are 1, 2, and 3.
4.
Find the dimensions of the following vector spaces.
(a) The vector space of \(7 \times 7\) matrices with trace 0
(b) The vector space \({\mathbb R}^{\,5}\)
(c) The vector space of all upper triangular \(6 \times 6\) matrices
(d) The vector space \(P_4 [x]\) of polynomials with degree less than \(4\)
(e) The vector space \({\mathbb R}^{3 \times 2}\)
(f) The vector space of all diagonal \(7 \times 7\) matrices
Answer 1.
\(48\)
Answer 2.
\(5\)
Answer 3.
\(21\)
Answer 4.
\(4\)
Answer 5.
\(6\)
Answer 6.
\(7\)
5.
By deleting linearly dependent vectors, find a basis of each subspace and give the dimension of the subspace.
A. The dimension of \(\text{span} \left \lbrace \left[\begin{array}{c}-2\\-2\\\end{array}\right], \left[\begin{array}{c}3\\3\\\end{array}\right] \right \rbrace\) is .
B. The dimension of \(\text{span} \left \lbrace \left[\begin{array}{c}7\\-11\\\end{array}\right], \left[\begin{array}{c}-22\\33\\\end{array}\right] \right \rbrace\) is .
C. The dimension of \(\text{span} \left \lbrace \left[\begin{array}{c}3\\3\\3\\\end{array}\right], \left[\begin{array}{c}-9\\-9\\-9\\\end{array}\right], \left[\begin{array}{c}6\\6\\6\\\end{array}\right] \right \rbrace\) is .
D. The dimension of \(\text{span} \left \lbrace \left[\begin{array}{c}1\\-1\\1\\\end{array}\right], \left[\begin{array}{c}-5\\5\\-5\\\end{array}\right], \left[\begin{array}{c}2\\7\\8\\\end{array}\right] \right \rbrace\) is .
E. The dimension of \(\text{span} \left \lbrace \left[\begin{array}{c}0\\0\\0\\\end{array}\right], \left[\begin{array}{c}3\\0\\0\\\end{array}\right], \left[\begin{array}{c}3\\-3\\0\\\end{array}\right], \left[\begin{array}{c}1\\3\\3\\\end{array}\right] \right \rbrace\) is .
Answer 1.
\(1\)
Answer 2.
\(2\)
Answer 3.
\(1\)
Answer 4.
\(2\)
Answer 5.
\(3\)

Exercise Group.

For each vector space \(V\) and subset \(S\) use an appropriate statement from Corollary 3.7.12 to help decide whether \(S\) is a basis for \(V\text{.}\) Justify your answer.
6.
\(V=P_3\text{,}\) \(S=\{x^3-x^2+x-1, x^3+\pi\, x-\sqrt{2}, x^3+1\}\)
7.
\(V=\R^4\text{,}\) \(S=\{(1,1,1,1), (0,1,1,1), (0,0,1,1), (0,0,0,1) \}\)
8.
\(V=\R^3\text{,}\) \(S=\{(-217, 3e^2, 3), (1,1,1), (-\log 81, 15, 17), (2,-\sqrt{17}, \sqrt{19})\}\)

9.

Let \(W\subseteq \R^3\) be the set of solutions to the following homogeneous system:
\begin{align*} x+y+z\amp =\amp 0\\ 3x+2y-2z\amp =\amp 0\\ 4x+3y-z \amp =\amp 0\\ 6x+5y+z \amp =\amp 0\text{.} \end{align*}
  1. Compute a basis of \(W\text{.}\) Justify your answer.
  2. Compute \(\dim W\text{.}\)

10.

Compute bases and dimensions for the following subspaces of \(\R^4\text{.}\)
  1. \(\displaystyle W=\{(a,b,c,d)\in\R^4\colon d=a+b, c=a-b\}\)
  2. \(\displaystyle W=\{(a,b,c,d)\in \R^4\colon a+b=c+d\}\)

11.

Suppose \(B=\{\boldv_1,\boldv_2,\boldv_3\}\) is a basis for the vector space \(V\text{.}\) Let \(B'=\{\boldu_1,\boldu_2,\boldu_3\}\text{,}\) where
\begin{equation*} \boldu_1 = \boldv_1, \boldu_2 = \boldv_1 + \boldv_2, \boldu_3 = \boldv_1 +\boldv_2 + \boldv_3\text{.} \end{equation*}
Prove that \(B'\) is a basis.
Hint.
First explain why it is enough to show that \(B'\) is linearly independent.

12.

In multivariable calculus, a plane in \(\R^3\) is defined as the set of solutions to an equation of the form \(ax+by+cz=d\text{,}\) where at least one of \(a, b, c\) is nonzero. In particular, a plane passing through the origin \((0,0,0)\) is the set of solutions to an equation of the form \(ax+by+cz=0\text{,}\) where at least one of \(a, b, c\) is nonzero.
Prove that the 2-dimensional subspaces of \(\R^3\) are precisely the planes of \(\R^3\) that pass through the origin. In other words, show (a) that any plane \(\mathcal{P}\subseteq\R^3\) passing through the origin is a 2-dimensional subspace, and conversely, (b) that any 2-dimensional subspace \(W\subseteq \R^3\) is a plane passing through the origin.
Hint.
For (b), begin with a basis \(B=\{(a,b,c), (d,e,f)\}\) of \(W\text{,}\) and use the cross product to find a normal vector \(\boldn\) that defines \(W\) as a plane.

13.

Let \(V=M_{33}\text{,}\) \(W=\{A\in M_{33}\colon A^T=-A\}\text{,}\) and \(W'=\Span\{ A_1, A_2, A_3\}\text{,}\) where
\begin{equation*} A_1=\begin{bmatrix}0\amp 1\amp 2\\ -1\amp 0\amp 1\\ -2\amp -1\amp 0 \end{bmatrix} , \ A_2=\begin{bmatrix}0\amp 1\amp -1\\ -1\amp 0\amp 1\\ 1\amp -1\amp 0 \end{bmatrix} ,\ A_3=\begin{bmatrix}0\amp 1\amp 0\\ -1\amp 0\amp -1\\ 0\amp 1\amp 0 \end{bmatrix} \end{equation*}
Show that \(W'=W\) as follows:
  1. Show that \(W'\subseteq W\text{.}\)
  2. Compute the dimensions of \(W'\) and \(W\) and use Corollary 3.7.13.

14.

Let \(V=M_{23}\) and define
\begin{equation*} W=\{A\in M_{23}\colon \text{ all rows and columns of \(A\) sum to 0 } \}\text{.} \end{equation*}
Find a basis for \(W\) by inspection and compute its dimension.

15.

Let \(V=C(\R)\text{,}\) and let \(W=\Span S\text{,}\) where
\begin{equation*} S=\{f_1(x)=\cos(2x), f_2(x)=\sin(2x), f_3(x)=\sin(x)\cos(x), f_4(x)=\cos^2(x), f_5(x)=\sin^2(x) \}\text{.} \end{equation*}
Provide a basis for \(W\) and compute \(\dim W\text{.}\)
Hint.
You can contract \(S\) to a basis by throwing out some redundant elements.

16. Dimensions of important matrix subspaces.

Let \(V=M_{nn}\text{.}\) Compute \(\dim W\) for each subspace \(W\subset M_{nn}\text{.}\)
  1. Upper triangular matrices.
    \(\displaystyle W=\{A\in M_{nn}\colon A \text{ is upper triangular}\}\)
  2. Symmetric matrices.
    \(\displaystyle W=\{A\in M_{nn}\colon A^T=A\}\)
  3. Skew-symmetric matrices.
    \(\displaystyle W=\{A\in M_{nn}\colon A^T=-A\}\)
Hint.
Use your results from Exercise 3.6.3.14. The identity
\begin{equation*} \sum_{i=1}^n i=\frac{(n+1)n}{2} \end{equation*}
will be helpful.

17.

Let \(A\in M_{nn}\text{.}\) Show that there is a nonzero polynomial \(p(x)=a_{n^2}x^{n^2}+a_{n^2-1}x^{n^2-1}+\cdots +a_1x+a_0\) such that
\begin{equation*} p(A)=a_{n^2}A^{n^2}+a_{n^2-1}A^{n^2-1}+\cdots +a_1A+a_0I_n=\underset{n\times n}{\boldzero}\text{.} \end{equation*}
Hint.
Consider the set \(S=\{A^{n^2}, A^{n^2-1}, \dots, A, I\}\subseteq M_{nn}\) and use a relevant statement from Theorem 3.7.16. Treat two cases separately: (a) the powers of \(A\) are all distinct; (b) \(A^i=A^j\) for some \(0\leq i < j\leq n^2\text{.}\)