The page you are reading is part of a draft (v2.0) of the "No bullshit guide to math and physics."

The text has since gone through many edits and is now available in print and electronic formats. The current edition of the book is v4.0, which is a substantial improvement over the draft version in terms of both content and language (I hired a professional editor).

I'm leaving the old wiki content up for the time being, but I highly encourage you to check out the finished book. An extended preview is available here (PDF, 106 pages, 5MB).


Abstract vector spaces

The math we learned for dealing with vectors can be applied more generally to vector-like things. We will see that several mathematical objects like matrices and polynomials behave similarly to vectors. For example, two polynomials $P$ and $Q$ are added by adding their coefficients for each power of $x$, the same way vectors are added component-wise.

In this section, we'll learn how to use the terminology and concepts associated with regular vector spaces to study other mathematical objects. In particular, we'll see that notions such as linear independence, basis, and dimension can be applied to pretty much any mathematical object that has components.

Definitions

To define an abstract vector space $(V,F,+,\cdot)$, we must specify four things:

  1. A set of vector-like objects $V=\{\mathbf{u},\mathbf{v},\ldots \}$.
  2. A field $F$ of scalar numbers, usually $F=\mathbb{R}$ or $F=\mathbb{C}$. In this section, $F=\mathbb{R}$.
  3. An addition operation “$+$” for the elements of $V$ that dictates how to add vectors: $\mathbf{u} + \mathbf{v}$.
  4. A scalar multiplication operation “$\cdot$” for scaling a vector by an element of the field. Scalar multiplication is usually denoted implicitly: $\alpha \mathbf{u}$ (without the dot).

A vector space satisfies the following eight axioms, for all scalars $\alpha, \beta \in F$ and all vectors $\mathbf{u}, \mathbf{v}, \mathbf{w} \in V$:

  1. $\mathbf{u} + (\mathbf{v}+ \mathbf{w}) = (\mathbf{u}+ \mathbf{v}) + \mathbf{w}$. (associativity of addition)
  2. $\mathbf{u} + \mathbf{v} = \mathbf{v} + \mathbf{u}$. (commutativity of addition)
  3. There exists a zero vector $\mathbf{0} \in V$ such that $\mathbf{u} + \mathbf{0} = \mathbf{0} +\mathbf{u} = \mathbf{u}$ for all $\mathbf{u} \in V$.
  4. For every $\mathbf{u} \in V$, there exists an inverse element $-\mathbf{u}$ such that $\mathbf{u} + (-\mathbf{u}) = \mathbf{u} -\mathbf{u} = \mathbf{0}$.
  5. $\alpha (\mathbf{u} + \mathbf{v}) = \alpha \mathbf{u} + \alpha \mathbf{v}$. (distributivity I)
  6. $(\alpha + \beta)\mathbf{u}= \alpha\mathbf{u} + \beta\mathbf{u}$. (distributivity II)
  7. $\alpha (\beta \mathbf{u})= (\alpha\beta) \mathbf{u}$. (associativity of scalar multiplication)
  8. There exists a unit scalar $1$ such that $1 \mathbf{u}= \mathbf{u}$.

If you know anything about vectors, then the above properties should be familiar to you. Indeed, these are the standard properties of the vector space $\mathbb{R}^n$ (and its subspaces), where the field $F$ is $\mathbb{R}$ and we use the standard vector addition and scalar multiplication operations.
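To make the axioms concrete, here is a minimal numerical spot-check of all eight properties for vectors in $\mathbb{R}^3$ with the standard operations, written in Python with NumPy. The particular vectors and scalars are arbitrary choices; passing these checks illustrates the axioms for these inputs but does not prove them in general.

```python
# Spot-check the eight vector space axioms for R^3 with sample values.
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 4.0])
w = np.array([2.0, -3.0, 1.0])
alpha, beta = 2.5, -0.75
zero = np.zeros(3)

assert np.allclose(u + (v + w), (u + v) + w)              # 1. associativity of addition
assert np.allclose(u + v, v + u)                          # 2. commutativity of addition
assert np.allclose(u + zero, u)                           # 3. zero vector
assert np.allclose(u + (-u), zero)                        # 4. additive inverse
assert np.allclose(alpha * (u + v), alpha*u + alpha*v)    # 5. distributivity I
assert np.allclose((alpha + beta) * u, alpha*u + beta*u)  # 6. distributivity II
assert np.allclose(alpha * (beta * u), (alpha*beta) * u)  # 7. associativity of scalar mult.
assert np.allclose(1 * u, u)                              # 8. unit scalar
print("All eight axioms hold for these sample vectors.")
```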

In this section, we'll see that many of the things we learned about vectors in $\mathbb{R}^n$ apply to other mathematical objects that are vector-like.

Examples

Matrices

Consider the vector space $\mathbb{R}^{m \times n}$ of $m\times n$ matrices over the real numbers. The addition operation for two matrices $A,B \in \mathbb{R}^{m \times n}$ is the usual rule of entry-wise matrix addition: $(A+B)_{ij} = a_{ij}+b_{ij}$.

This vector space is $mn$-dimensional, which can be seen by explicitly constructing a basis for the space. The standard basis consists of the $mn$ matrices $E_{ij}$ with zero entries everywhere except for a single one in the $i$th row and the $j$th column. This set is a basis because any matrix $A \in \mathbb{R}^{m \times n}$ can be written as the linear combination $A = \sum_{i,j} a_{ij}E_{ij}$ of the standard basis matrices, and each $E_{ij}$ is manifestly independent of the others.
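As an illustration, the following Python/NumPy sketch reconstructs a matrix as the linear combination $\sum_{i,j} a_{ij}E_{ij}$ of standard basis matrices; the matrix $A$ and the dimensions $2\times3$ are arbitrary example choices.

```python
# Write an arbitrary 2x3 matrix as a linear combination of the mn standard
# basis matrices E_ij (zero everywhere except a single 1 in row i, column j).
import numpy as np

m, n = 2, 3
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

reconstruction = np.zeros((m, n))
for i in range(m):
    for j in range(n):
        E_ij = np.zeros((m, n))
        E_ij[i, j] = 1.0                    # the (i,j)-th standard basis matrix
        reconstruction += A[i, j] * E_ij    # coefficient of E_ij is the entry a_ij

assert np.allclose(A, reconstruction)
print("A is a linear combination of the", m*n, "standard basis matrices.")
```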

Symmetric 2x2 matrices

Consider now the set of $2\times2$ symmetric matrices: \[ \mathbb{S}(2,2) \equiv \{ A \in \mathbb{R}^{2 \times 2} \ | \ A = A^T \}, \] in combination with the usual laws of matrix addition and scalar multiplication.

An explicit basis for this space is obtained as follows: \[ \mathbf{v}_1 = \begin{bmatrix} 1 & 0 \nl 0 & 0 \end{bmatrix}, \ \ \mathbf{v}_2 = \begin{bmatrix} 0 & 1 \nl 1 & 0 \end{bmatrix}, \ \ \mathbf{v}_3 = \begin{bmatrix} 0 & 0 \nl 0 & 1 \end{bmatrix}. \]

Observe how any symmetric matrix $\mathbf{s} \in \mathbb{S}(2,2)$ can be written as a linear combination: \[ \mathbf{s} = \begin{bmatrix} a & b \nl b & c \end{bmatrix} = a \begin{bmatrix} 1 & 0 \nl 0 & 0 \end{bmatrix} + b \begin{bmatrix} 0 & 1 \nl 1 & 0 \end{bmatrix} + c \begin{bmatrix} 0 & 0 \nl 0 & 1 \end{bmatrix}. \]

Since there are three vectors in the basis, the vector space of symmetric matrices $\mathbb{S}(2,2)$ is three-dimensional.
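Here is a short NumPy check of this decomposition for a sample symmetric matrix; the coefficients $a$, $b$, $c$ are arbitrary example values.

```python
# Verify that a symmetric 2x2 matrix decomposes as a*v1 + b*v2 + c*v3
# in the basis {v1, v2, v3} given above.
import numpy as np

v1 = np.array([[1.0, 0.0], [0.0, 0.0]])
v2 = np.array([[0.0, 1.0], [1.0, 0.0]])
v3 = np.array([[0.0, 0.0], [0.0, 1.0]])

a, b, c = 3.0, -1.0, 2.0
s = np.array([[a, b],
              [b, c]])                 # a generic symmetric matrix

assert np.allclose(s, s.T)             # confirm s is symmetric
assert np.allclose(s, a*v1 + b*v2 + c*v3)
print("s = a*v1 + b*v2 + c*v3, so {v1, v2, v3} spans S(2,2).")
```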

Polynomials of degree n

Define the vector space $P_n(x)$ of polynomials with real coefficients and degree less than or equal to $n$. The “vectors” in this space are polynomials of the form: \[ \mathbf{p} = a_0 + a_1x + a_2x^2 + \cdots + a_n x^n, \] where $a_0,a_1,\ldots,a_n$ are the coefficients of the polynomial $\mathbf{p}$.

The addition of vectors $\mathbf{p}, \mathbf{q} \in P_n(x)$ is performed component-wise: \[ \begin{align*} \mathbf{p} + \mathbf{q} & = (a_0+a_1x+\cdots+a_nx^n)+(b_0+b_1x+\cdots+b_nx^n) \nl & =(a_0+b_0)+(a_1+b_1)x+\cdots +(a_n+b_n)x^n. \end{align*} \] Similarly, scalar multiplication acts as you would expect: \[ \alpha \mathbf{p} = \alpha\cdot (a_0+a_1x+\cdots+a_nx^n)=(\alpha a_0)+(\alpha a_1)x+\cdots+(\alpha a_n)x^n. \]
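In code, a polynomial in $P_3(x)$ can be represented by its coefficient vector $(a_0, a_1, a_2, a_3)$, and the two operations then reduce to ordinary component-wise vector operations. The sketch below uses NumPy with arbitrary example polynomials.

```python
# Represent polynomials in P_3(x) by their coefficient vectors; addition and
# scalar multiplication become component-wise operations on these vectors.
import numpy as np

p = np.array([1.0, 0.0, 2.0, -1.0])   # p(x) = 1 + 2x^2 - x^3
q = np.array([3.0, 1.0, 0.0,  4.0])   # q(x) = 3 + x + 4x^3
alpha = 2.0

p_plus_q = p + q          # coefficients of (p+q)(x), added component-wise
alpha_p = alpha * p       # coefficients of (alpha*p)(x)

print("p + q     has coefficients", p_plus_q)   # [4. 1. 2. 3.]
print("alpha * p has coefficients", alpha_p)    # [ 2.  0.  4. -2.]
```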

The space $P_n(x)$ is $(n+1)$-dimensional since each “vector” in this space is specified by $n+1$ coefficients.

Functions

Another interesting vector space is the set of all functions $f:\mathbb{R} \to \mathbb{R}$ in combination with the point-wise addition and scalar multiplication operations: \[ (\mathbf{f}+\mathbf{g})(x) = f(x) + g(x), \qquad (\alpha\mathbf{f})(x) = \alpha f(x). \]

The space of functions is infinite-dimensional: no finite set of functions can span it since, for example, the monomials $1, x, x^2, x^3, \ldots$ are all linearly independent.
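The point-wise operations are easy to express in code by building new functions out of old ones. The following Python sketch uses arbitrary example functions $f$ and $g$.

```python
# Point-wise addition and scalar multiplication of functions R -> R,
# treating the functions themselves as "vectors".
import math

def f(x):
    return math.sin(x)

def g(x):
    return x**2

def add(f, g):
    """Return the function (f+g)(x) = f(x) + g(x)."""
    return lambda x: f(x) + g(x)

def scale(alpha, f):
    """Return the function (alpha*f)(x) = alpha*f(x)."""
    return lambda x: alpha * f(x)

h = add(f, g)            # h(x) = sin(x) + x^2
k = scale(3.0, f)        # k(x) = 3*sin(x)

print(h(1.0))            # sin(1) + 1 = 1.8414...
print(k(math.pi / 2))    # 3*sin(pi/2) = 3.0
```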

Discussion

In this section we saw that we can talk about linear independence and bases for more abstract vector spaces. Indeed, these notions are well defined for any vector-like object.

In the next section we will generalize the concept of orthogonality to abstract vector spaces. In order to do this, we have to define an abstract inner product operation.
