In this section we introduce another operation on \(\R^n\) called the dot product. As we will see, the dot product is an additional layer added to the vector space structure of \(\R^n\) that gives rise to a number of useful analytic tools. More generally, the dot product turns out to be just one example of what is called an inner product on a vector space. Inner products imbue vector spaces with valuable geometric notions, a few of which are highlighted below.
Distance and angle.
A notion of distance and angle between two vectors can be defined relative to a given inner product. These provide a numeric measurement of how “close” (distance) or “closely oriented” (angle) two vectors in our space are.
Orthogonality.
Two vectors of an inner product space are orthogonal if their inner product is equal to zero. Orthogonality leads to a general notion of orthogonal projection, and is part of the definition of an orthogonal basis.
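For a first taste of this notion, observe that relative to the dot product on \(\R^2\) the vectors \((1,1)\) and \((1,-1)\) are orthogonal, since
\begin{equation*}
(1,1)\cdot (1,-1)=1(1)+1(-1)=0\text{.}
\end{equation*}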
We will get into these geometric notions in detail later. For now we content ourselves simply with defining what an inner product is. We will also show that the dot product is just one example of infinitely many inner products that can be defined on \(\R^n\text{.}\) In particular, we will define a family of natural inner products on \(\R^n\) called weighted dot products, which play an important role in statistics and data science.
As mentioned above, the dot product is just one example of a more general notion called an inner product, which is an additional operation defined on a vector space.
Let \(V\) be a vector space. An inner product on \(V\) is an operation that takes as input a pair of vectors \(\boldv, \boldw\in V\) and outputs a scalar \(\langle \boldv, \boldw \rangle \in \R\text{.}\) Using function notation:
\begin{equation*}
\langle \ , \ \rangle \colon V\times V\rightarrow \R\text{.}
\end{equation*}
Furthermore, this operation must satisfy the following axioms.
i. (Symmetry) \(\langle \boldv, \boldw \rangle = \langle \boldw, \boldv \rangle\) for all \(\boldv, \boldw\in V\text{.}\)
ii. (Linearity) \(\langle c\boldv+d\boldw, \boldu \rangle = c\langle \boldv, \boldu \rangle+d\langle \boldw, \boldu \rangle\) for all \(\boldu, \boldv, \boldw\in V\) and all \(c,d\in \R\text{.}\)
iii. (Positive definiteness) For all \(\boldv\in V\) we have \(\langle \boldv, \boldv \rangle \geq 0\) (positivity), and \(\langle \boldv, \boldv \rangle = 0\) if and only if \(\boldv=\boldzero\) (definiteness).
An inner product space is a pair \((V, \langle , \rangle )\text{,}\) where \(V\) is a vector space, and \(\langle , \rangle \) is a choice of inner product on \(V\text{.}\)
Although we will almost exclusively work with the dot product in this treatment of linear algebra, it is worth considering a natural family of inner products on \(\R^n\) that the dot product fits nicely into: namely, weighted dot products. These examples of inner products are especially important in data science.
Let \(k_1, k_2, \dots , k_n\) be a list of positive real numbers: i.e., \(k_i> 0\) for all \(1\leq i\leq n\text{.}\) The weighted dot product with weights \((k_1,k_2,\dots, k_n)\) is the operation
\begin{equation*}
\langle \boldx, \boldy \rangle = k_1x_1y_1+k_2x_2y_2+\cdots +k_nx_ny_n=\sum_{i=1}^nk_ix_iy_i\text{,}
\end{equation*}
where \(\boldx=(x_1,x_2,\dots, x_n)\) and \(\boldy=(y_1,y_2,\dots, y_n)\text{.}\) Note that the dot product itself is the weighted dot product with weights \(k_i=1\) for all \(i\text{.}\)
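For example, taking \(n=2\) with weights \((k_1,k_2)=(2,3)\) (an arbitrary illustrative choice), we compute
\begin{equation*}
\langle (1,4), (5,-1) \rangle = 2(1)(5)+3(4)(-1)=10-12=-2\text{.}
\end{equation*}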
Let \(k_1, k_2, \dots , k_n\) be a list of positive real numbers. The weighted dot product on \(\R^n\) with weights \((k_1,k_2,\dots, k_n)\) is an inner product. In particular, the dot product is an inner product.
We verify each axiom in turn. Throughout we assume \(\boldx=(x_1,x_2,\dots, x_n)\text{,}\) \(\boldy=(y_1,y_2,\dots, y_n)\text{,}\) and \(\boldw=(w_1,w_2,\dots, w_n)\) are arbitrary elements of \(\R^n\text{.}\)
Symmetry follows from the commutativity of multiplication in \(\R\text{:}\)
\begin{equation*}
\langle \boldx, \boldy \rangle = \sum_{i=1}^nk_ix_iy_i=\sum_{i=1}^nk_iy_ix_i=\langle \boldy, \boldx \rangle\text{.}
\end{equation*}
For linearity, given any \(c,d\in \R\) we have
\begin{equation*}
\langle c\boldx+d\boldy, \boldw \rangle = \sum_{i=1}^nk_i(cx_i+dy_i)w_i=c\sum_{i=1}^nk_ix_iw_i+d\sum_{i=1}^nk_iy_iw_i=c\langle \boldx, \boldw \rangle+d\langle \boldy, \boldw \rangle\text{.}
\end{equation*}
Lastly, we verify positive definiteness. Since \(k_i> 0\) and \(x_i^2\geq 0\) for all \(i\) (squares of real numbers are nonnegative), we have \(\langle \boldx, \boldx \rangle=\sum_{i=1}^nk_ix_i^2\geq 0\) for any \(\boldx\text{.}\)
We now show that \(\angvec{\boldx,\boldx}=0\) if and only if \(\boldx=\boldzero=(0,0,\dots, 0)\text{.}\) The reverse implication is clear: if \(x_i=0\) for all \(i\text{,}\) then \(\sum_{i=1}^nk_ix_i^2=\sum_{i=1}^n 0=0\text{.}\) We prove the forward implication by showing that its contrapositive is true: i.e., if \(\boldx\ne \boldzero\text{,}\) then \(\angvec{\boldx,\boldx} \ne 0\text{.}\) If \(\boldx\ne \boldzero\text{,}\) then we have \(x_{i_0}\ne 0\) for some \(1\leq i_0\leq n\text{,}\) in which case \(k_{i_0}x_{i_0}^2 > 0\text{.}\) But then
\begin{equation*}
\angvec{\boldx,\boldx}=\sum_{i=1}^nk_ix_i^2\geq k_{i_0}x_{i_0}^2> 0\text{,}
\end{equation*}
since every term \(k_ix_i^2\) of the sum is nonnegative. Thus \(\angvec{\boldx,\boldx}\ne 0\text{,}\) as desired.
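To see positive definiteness in action on a concrete vector (with the same illustrative weights \((2,3)\) as above), note that
\begin{equation*}
\langle (1,4), (1,4) \rangle = 2(1)^2+3(4)^2=2+48=50> 0\text{,}
\end{equation*}
as the theorem guarantees.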
The positivity assumption on the weights is essential. Consider, for instance, what happens if we allow a negative weight, say \((k_1,k_2)=(1,-1)\text{,}\) yielding the operation
\begin{equation*}
\langle \boldx, \boldy \rangle = x_1y_1-x_2y_2\text{,}
\end{equation*}
where \(\boldx=(x_1,x_2), \boldy=(y_1,y_2)\text{.}\) This operation satisfies axioms i-ii of Definition 1.2.3. (See proof of Theorem 1.2.6: those verifications do not use the positivity of the weights.) However, it fails both the positivity and definiteness properties of axiom (iii): e.g., \(\langle (0,1), (0,1) \rangle = -1< 0\text{,}\) and \(\langle (1,1), (1,1) \rangle = 0\) even though \((1,1)\ne \boldzero\text{.}\)
Are there other examples of inner products besides weighted dot products that can be defined on \(\R^n\text{?}\) Indeed there are! For example, it is possible to show that the operation
defines an inner product on \(\R^2\text{.}\) In fact, getting ahead of ourselves once again, inner products on \(\R^n\) are in one-to-one correspondence with symmetric positive-definite \(n\times n\) matrices. For example, the inner product (1.4) corresponds to the matrix
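In general, under this correspondence a symmetric positive-definite \(n\times n\) matrix \(A\) gives rise to the inner product
\begin{equation*}
\langle \boldx, \boldy \rangle = \boldx^TA\boldy\text{,}
\end{equation*}
where \(\boldx\) and \(\boldy\) are regarded as column vectors. In particular, the weighted dot product with weights \((k_1,k_2,\dots, k_n)\) corresponds to the diagonal matrix with diagonal entries \(k_1, k_2, \dots, k_n\text{.}\)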
For our purposes, however, the family of weighted dot products will serve as a sufficiently rich source of examples of inner products. Furthermore, as we will see later, any inner product on \(\R^n\) can be viewed as being equal to the dot product “after a change of variables”. For example, letting \(\boldx=(x_1,x_2)\) and \(\boldy=(y_1,y_2)\text{,}\) the inner product (1.4) satisfies
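The same phenomenon is easy to see directly for any weighted dot product: setting \(\boldx'=(\sqrt{k_1}\,x_1,\dots, \sqrt{k_n}\,x_n)\) and \(\boldy'=(\sqrt{k_1}\,y_1,\dots, \sqrt{k_n}\,y_n)\) (a change of variables that simply rescales each coordinate), we have
\begin{equation*}
\langle \boldx, \boldy \rangle = \sum_{i=1}^nk_ix_iy_i=\sum_{i=1}^n(\sqrt{k_i}\,x_i)(\sqrt{k_i}\,y_i)=\boldx'\cdot \boldy'\text{.}
\end{equation*}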