The page you are reading is part of a draft (v2.0) of the "No bullshit guide to math and physics."

The text has since gone through many edits and is now available in print and electronic formats. The current edition of the book is v4.0, which is a substantial improvement over this draft version in terms of content and language (I hired a professional editor).

I'm leaving the old wiki content up for the time being, but I highly encourage you to check out the finished book. An extended preview is available here (PDF, 106 pages, 5MB).


Abstract vector spaces

The math we learned for dealing with vectors can be applied more generally to vector-like things. We will see that several mathematical objects like matrices and polynomials behave similarly to vectors. For example, the addition of two polynomials P and Q is done by adding the coefficients for each power of x component-wise, the same way the addition of vectors happens component-wise.
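To make this concrete, here is a minimal Python sketch (my own illustration, not part of the text) showing that adding two polynomials via their coefficient lists is exactly the component-wise computation used for vector addition:

```python
P = [1, 2, 3]   # P(x) = 1 + 2x + 3x^2
Q = [4, 0, 5]   # Q(x) = 4 + 0x + 5x^2

# Component-wise addition of coefficients, exactly like vector addition.
P_plus_Q = [p + q for p, q in zip(P, Q)]
print(P_plus_Q)  # [5, 2, 8], i.e., (P+Q)(x) = 5 + 2x + 8x^2
```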

In this section, we'll learn how to use the terminology and concepts associated with regular vector spaces to study other mathematical objects. In particular we'll see that notions such as linear independence, basis, and dimension can be applied to pretty much all mathematical objects that have components.

Definitions

To specify an abstract vector space $(V, F, +, \cdot)$, we must specify four things:

  1. A set of vector-like objects $V = \{ \mathbf{u}, \mathbf{v}, \ldots \}$.
  2. A field $F$ of scalar numbers, usually $F = \mathbb{R}$ or $F = \mathbb{C}$. In this section we'll work with $F = \mathbb{R}$.
  3. An addition operation “$+$” for the elements of $V$ that dictates how to add vectors: $\mathbf{u} + \mathbf{v}$.
  4. A scalar multiplication operation “$\cdot$” for scaling a vector by an element of the field. Scalar multiplication is usually denoted implicitly $\alpha \mathbf{u}$ (without the dot).

A vector space satisfies the following eight axioms, for all scalars $\alpha, \beta \in F$ and all vectors $\mathbf{u}, \mathbf{v}, \mathbf{w} \in V$:

  1. $\mathbf{u} + (\mathbf{v} + \mathbf{w}) = (\mathbf{u} + \mathbf{v}) + \mathbf{w}$. (associativity of addition)
  2. $\mathbf{u} + \mathbf{v} = \mathbf{v} + \mathbf{u}$. (commutativity of addition)
  3. There exists a zero vector $\mathbf{0} \in V$ such that $\mathbf{u} + \mathbf{0} = \mathbf{0} + \mathbf{u} = \mathbf{u}$ for all $\mathbf{u} \in V$.
  4. For every $\mathbf{u} \in V$, there exists an inverse element $-\mathbf{u}$ such that $\mathbf{u} + (-\mathbf{u}) = \mathbf{u} - \mathbf{u} = \mathbf{0}$.
  5. $\alpha(\mathbf{u} + \mathbf{v}) = \alpha\mathbf{u} + \alpha\mathbf{v}$. (distributivity I)
  6. $(\alpha + \beta)\mathbf{u} = \alpha\mathbf{u} + \beta\mathbf{u}$. (distributivity II)
  7. $\alpha(\beta\mathbf{u}) = (\alpha\beta)\mathbf{u}$. (associativity of scalar multiplication)
  8. There exists a unit scalar $1$ such that $1\mathbf{u} = \mathbf{u}$.

If you know anything about vectors, then the above properties should be familiar to you. Indeed, these are the standard properties of the vector space $\mathbb{R}^n$ (and its subsets), where the field $F$ is $\mathbb{R}$ and we use the standard vector addition and scalar multiplication operations.
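As a quick sanity check, the eight axioms can be verified numerically for specific vectors in $\mathbb{R}^3$. The following sketch (my own, assuming NumPy is available) tests each axiom in the order listed above:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
w = np.array([7.0, 8.0, 9.0])
alpha, beta = 2.0, -3.0
zero = np.zeros(3)

assert np.allclose(u + (v + w), (u + v) + w)              # 1. associativity of +
assert np.allclose(u + v, v + u)                          # 2. commutativity of +
assert np.allclose(u + zero, u)                           # 3. zero vector
assert np.allclose(u + (-u), zero)                        # 4. additive inverse
assert np.allclose(alpha * (u + v), alpha*u + alpha*v)    # 5. distributivity I
assert np.allclose((alpha + beta) * u, alpha*u + beta*u)  # 6. distributivity II
assert np.allclose(alpha * (beta * u), (alpha*beta) * u)  # 7. assoc. of scaling
assert np.allclose(1.0 * u, u)                            # 8. unit scalar
print("all eight axioms hold for these vectors")
```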

In this section, we'll see that many of the things we learned about vectors in $\mathbb{R}^n$ apply to other mathematical objects that are vector-like.

Examples

Matrices

Consider the vector space of $m \times n$ matrices over the real numbers, $\mathbb{R}^{m \times n}$. The addition operation for two matrices $A, B \in \mathbb{R}^{m \times n}$ is the usual rule of matrix addition: $(A+B)_{ij} = a_{ij} + b_{ij}$.

This vector space is $mn$-dimensional, which we can see by explicitly constructing a basis for it. The standard basis consists of the matrices $E_{ij}$ with zero entries everywhere except for a single $1$ in the $i$th row and $j$th column. This set is a basis because every matrix $A \in \mathbb{R}^{m \times n}$ can be written as a linear combination of the standard basis matrices, and each $E_{ij}$ is manifestly linearly independent from the others.
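Here is a short sketch (my own illustration; the helper `E` is hypothetical, not notation from the text) that builds the standard basis for $\mathbb{R}^{2 \times 3}$ and reconstructs a matrix from it:

```python
import numpy as np

m, n = 2, 3
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

def E(i, j):
    """Standard basis matrix: a single 1 in row i, column j, zeros elsewhere."""
    B = np.zeros((m, n))
    B[i, j] = 1.0
    return B

# Any A is the linear combination  sum over i,j of a_ij * E(i,j).
A_rebuilt = sum(A[i, j] * E(i, j) for i in range(m) for j in range(n))
assert np.allclose(A, A_rebuilt)
print("A is a linear combination of the", m * n, "standard basis matrices")
```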

Symmetric $2 \times 2$ matrices

Consider now the set of $2 \times 2$ symmetric matrices, $S(2,2) \equiv \{ A \in \mathbb{R}^{2 \times 2} \ | \ A = A^T \}$, in combination with the usual laws of matrix addition and scalar multiplication.

An explicit basis for this space is obtained as follows: $\mathbf{v}_1 = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \quad \mathbf{v}_2 = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \quad \mathbf{v}_3 = \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}.$

Observe how any symmetric matrix $s \in S(2,2)$ can be written as a linear combination: $s = \begin{bmatrix} a & b \\ b & c \end{bmatrix} = a\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} + b\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} + c\begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}.$

Since there are three vectors in the basis, the vector space of symmetric matrices S(2,2) is three-dimensional.
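The decomposition above is easy to verify numerically. A minimal sketch (my own, assuming NumPy):

```python
import numpy as np

v1 = np.array([[1.0, 0.0], [0.0, 0.0]])
v2 = np.array([[0.0, 1.0], [1.0, 0.0]])
v3 = np.array([[0.0, 0.0], [0.0, 1.0]])

a, b, c = 2.0, -1.0, 5.0
s = np.array([[a, b],
              [b, c]])                     # an arbitrary symmetric matrix

assert np.allclose(s, s.T)                 # s is indeed symmetric
assert np.allclose(s, a*v1 + b*v2 + c*v3)  # s = a*v1 + b*v2 + c*v3
print("three coefficients (a, b, c) suffice: S(2,2) is three-dimensional")
```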

Polynomials of degree n

Define the vector space $P_n(x)$ of polynomials with real coefficients and degree less than or equal to $n$. The “vectors” in this space are polynomials of the form $\mathbf{p} = a_0 + a_1x + a_2x^2 + \cdots + a_nx^n$, where $a_0, a_1, \ldots, a_n$ are the coefficients of the polynomial $\mathbf{p}$.

The addition of vectors $\mathbf{p}, \mathbf{q} \in P_n(x)$ is performed component-wise: $\mathbf{p} + \mathbf{q} = (a_0 + a_1x + \cdots + a_nx^n) + (b_0 + b_1x + \cdots + b_nx^n) = (a_0 + b_0) + (a_1 + b_1)x + \cdots + (a_n + b_n)x^n.$ Similarly, scalar multiplication acts as you would expect: $\alpha \mathbf{p} = \alpha(a_0 + a_1x + \cdots + a_nx^n) = (\alpha a_0) + (\alpha a_1)x + \cdots + (\alpha a_n)x^n.$

The space $P_n(x)$ is $(n+1)$-dimensional since each “vector” in that space has $n+1$ coefficients.
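Because a polynomial in $P_n(x)$ is fully specified by its $n+1$ coefficients, we can represent it as a length-$(n+1)$ array and check that coefficient-wise operations agree with the usual polynomial operations. Another sketch of my own (the helper `evaluate` is hypothetical):

```python
import numpy as np

# Represent p in P_3(x) by its n+1 = 4 coefficients [a0, a1, a2, a3].
p = np.array([1.0, 0.0, 2.0, -1.0])   # p(x) = 1 + 2x^2 - x^3
q = np.array([3.0, 1.0, 0.0,  4.0])   # q(x) = 3 + x + 4x^3

def evaluate(coeffs, x):
    """Evaluate a0 + a1*x + ... + an*x^n at the point x."""
    return sum(a * x**k for k, a in enumerate(coeffs))

# Component-wise operations on coefficients match point-wise operations
# on the polynomials themselves.
x0 = 1.7
assert np.isclose(evaluate(p + q, x0), evaluate(p, x0) + evaluate(q, x0))
assert np.isclose(evaluate(2.5 * p, x0), 2.5 * evaluate(p, x0))
print("coefficient-vector operations match polynomial operations")
```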

Functions

Another interesting vector space is the set of all functions $f: \mathbb{R} \to \mathbb{R}$, in combination with the point-wise addition and scalar multiplication operations: $(f+g)(x) = f(x) + g(x), \qquad (\alpha f)(x) = \alpha f(x).$

The space of functions is infinite-dimensional: no finite set of functions spans it. For example, the monomials $1, x, x^2, x^3, \ldots$ already form an infinite set of linearly independent functions.
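The point-wise operations are straightforward to express as higher-order functions. A minimal sketch (my own illustration):

```python
import math

def add(f, g):
    """Point-wise addition: (f + g)(x) = f(x) + g(x)."""
    return lambda x: f(x) + g(x)

def scale(alpha, f):
    """Point-wise scaling: (alpha f)(x) = alpha * f(x)."""
    return lambda x: alpha * f(x)

h = add(math.sin, math.cos)   # h(x) = sin(x) + cos(x)
k = scale(3.0, math.exp)      # k(x) = 3 e^x

print(h(0.0))   # sin(0) + cos(0) = 1.0
print(k(0.0))   # 3 * e^0 = 3.0
```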

Discussion

In this section we saw that we can talk about linear independence and bases for more abstract vector spaces. Indeed, these notions are well defined for any vector-like object.

In the next section, we'll generalize the concept of orthogonality to abstract vector spaces. In order to do this, we have to define an abstract inner product operation.
