Essential Linear Algebra

In this section we summarize some of the theorems and facts of linear algebra that are most important for numerical methods.

Basics

An m by n matrix A has m rows and n columns. If A is m by n and B is n by p, then the product AB is defined as the m by p matrix with entries

(A B)_{i,j} = \sum_{k=1}^n A_{i,k} B_{k,j}

The matrix product is not commutative in general (AB \neq BA), but it is associative (A(BC) = (AB) C). The definition is motivated mostly by the fact that it represents composition of linear functions.
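
These properties are easy to illustrate numerically (a minimal sketch assuming NumPy; the matrices are arbitrary random examples):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((2, 3))   # 2 by 3
    B = rng.standard_normal((3, 4))   # 3 by 4
    C = rng.standard_normal((4, 2))   # 4 by 2

    # (A B)_{i,j} = sum over k of A_{i,k} B_{k,j}; the product is 2 by 4
    print((A @ B).shape)                          # (2, 4)

    # Associative, up to floating-point rounding
    print(np.allclose(A @ (B @ C), (A @ B) @ C))  # True

    # Not commutative: even for square matrices, ST != TS in general
    S = rng.standard_normal((2, 2))
    T = rng.standard_normal((2, 2))
    print(np.allclose(S @ T, T @ S))              # False (almost surely)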

Fundamental subspaces and rank

We very often think of an m by n matrix A as a linear function

F: \mathbb{C}^n \rightarrow \mathbb{C}^m

with F(x) = Ax. In this context we have the four fundamental subspaces. These are the range or column space of A:

R(A) = \{ y \in \mathbb{C}^m | y = Ax \text{ for some } x \in \mathbb{C}^n \}

the kernel, or nullspace, of A:

N(A) = ker(A) = \{ x \in \mathbb{C}^n | Ax = 0 \}

and their counterparts for the transpose: the row space of A, which is the range R(A^T), and the kernel N(A^T) (also called the left kernel of A).

The dimensions of R(A) and R(A^T) are equal, and this common dimension is called the rank of the matrix A, often denoted r. The dimension of N(A) is n - r, and the dimension of N(A^T) is m - r.
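
These dimensions are easy to check numerically via the rank (a minimal sketch assuming NumPy; A here is an arbitrary matrix built to have rank 2):

    import numpy as np

    m, n = 4, 3
    rng = np.random.default_rng(1)
    # Sum of two random outer products: a 4 by 3 matrix of rank 2
    A = (rng.standard_normal((m, 1)) @ rng.standard_normal((1, n))
         + rng.standard_normal((m, 1)) @ rng.standard_normal((1, n)))

    r = np.linalg.matrix_rank(A)
    print(r)             # 2: dim R(A) = dim R(A^T)
    print(n - r, m - r)  # 1 2: dim N(A), dim N(A^T)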

Trace and determinant

The trace of a square (n by n) matrix A is the sum of the diagonal elements:

tr(A) = \sum_{i=1}^n A_{i,i} = A_{1,1} + A_{2,2} + \ldots + A_{n,n}.
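
For example, assuming NumPy:

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    print(np.trace(A), A[0, 0] + A[1, 1])   # 5.0 5.0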

The determinant can be computed (or defined) recursively by cofactor expansion along any row i:

\det(A) = \sum_{j=1}^n (-1)^{i+j} A_{i,j} M_{i,j} = (-1)^{i+1} A_{i,1} M_{i,1} + (-1)^{i+2} A_{i,2} M_{i,2} + \ldots + (-1)^{i+n} A_{i,n} M_{i,n}

where the minor M_{i,j} is the determinant of the submatrix of A obtained by deleting row i and column j; the recursion terminates at the 1 by 1 case, where the determinant is just the single entry.
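
The recursion translates directly into code. The following is an illustrative sketch assuming NumPy (expansion along the first row, i = 1); cofactor expansion costs O(n!) operations, so it is never used in practice for large matrices:

    import numpy as np

    def det_cofactor(A):
        """Determinant by cofactor expansion along the first row."""
        n = A.shape[0]
        if n == 1:                  # base case: det of a 1 by 1 matrix
            return A[0, 0]
        total = 0.0
        for j in range(n):
            # M_{1,j}: determinant of A with row 1 and column j+1 deleted
            minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)
            total += (-1) ** j * A[0, j] * det_cofactor(minor)
        return total

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    print(det_cofactor(A), np.linalg.det(A))   # both equal 5, up to rounding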

The determinant is multiplicative, i.e.:

\det(A B) = \det(A) \det(B).

It is also invariant under transposition:

\det(A^T) = \det(A).
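
Both identities are easy to verify numerically (a sketch assuming NumPy; A and B are arbitrary random matrices):

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((3, 3))
    B = rng.standard_normal((3, 3))

    lhs = np.linalg.det(A @ B)
    rhs = np.linalg.det(A) * np.linalg.det(B)
    print(np.isclose(lhs, rhs))                              # True
    print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))  # True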

Angles and orthogonality

On a vector space V over a field K, an inner product is a map V \times V \rightarrow K, often denoted by <x,y> or <x|y>, which has the most important properties of the standard Euclidean inner product: it is positive-definite, linear in one argument (and conjugate-linear in the other), and conjugate-symmetric (<x,y> = \overline{<y,x>}).

For numerical purposes, the field K is almost always either the real or complex numbers, and the inner product is usually the standard Euclidean or Hermitian inner product <x,y> = y^* x (sometimes x^* y is used instead).

In a real inner product space we can define the angle between two nonzero vectors x and y by

\cos(\theta) = \frac{<x,y>}{|x| |y|}

where the lengths are also defined through the inner product: |x| = \sqrt{<x,x>}.
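
For example, with the Euclidean inner product on \mathbb{R}^2 (a sketch assuming NumPy):

    import numpy as np

    x = np.array([1.0, 0.0])
    y = np.array([1.0, 1.0])

    cos_theta = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    print(np.degrees(np.arccos(cos_theta)))   # 45.0 degrees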

Symmetry and Normality

A real matrix A is symmetric if A_{i,j} = A_{j,i}, and anti-symmetric if A_{i,j} = -A_{j,i}. Symmetric and anti-symmetric matrices are special cases of the normal matrices, which have the property that A^* A = A A^*, where A^* is the complex-conjugate transpose of A. Normal matrices can be unitarily diagonalized, so they possess a complete orthonormal set of eigenvectors. In addition, symmetric matrices have real eigenvalues and anti-symmetric matrices have purely imaginary eigenvalues (possibly including 0).
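
A quick numerical illustration (a sketch assuming NumPy; M is an arbitrary random matrix used to build symmetric and anti-symmetric examples):

    import numpy as np

    rng = np.random.default_rng(3)
    M = rng.standard_normal((3, 3))
    S = M + M.T   # symmetric:      S^T =  S
    K = M - M.T   # anti-symmetric: K^T = -K

    # Both are normal: for real matrices, A^* A = A^T A = A A^T = A A^*
    print(np.allclose(S.T @ S, S @ S.T))   # True
    print(np.allclose(K.T @ K, K @ K.T))   # True

    # Symmetric: real eigenvalues; anti-symmetric: purely imaginary ones
    print(np.linalg.eigvals(S))   # real, up to rounding
    print(np.linalg.eigvals(K))   # imaginary (one is 0 for odd n)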