Essential Linear Algebra
In this section we summarize some of the theorems and facts of linear algebra that are most important for numerical methods.
Basics
An m by n matrix A has m rows and n columns. If A is m by n and B is n by p, then the product AB is defined as the m by p matrix with entries

(AB)_{i,j} = \sum_{k=1}^{n} A_{i,k} B_{k,j}.
The matrix product is not commutative in general (AB \neq BA), but it is associative (A(BC) = (AB) C). The definition is motivated mostly by the fact that it represents composition of linear functions.
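These properties are easy to check numerically; a quick sketch with NumPy, using arbitrary example matrices:

```python
import numpy as np

# Arbitrary small matrices chosen to illustrate the product rules.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
C = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# The product is not commutative in general: AB != BA for this pair.
not_commutative = not np.allclose(A @ B, B @ A)

# But it is associative: A(BC) = (AB)C.
associative = np.allclose(A @ (B @ C), (A @ B) @ C)
```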
Fundamental subspaces and rank
We very often think of an m by n matrix A as a linear function

F : \mathbb{R}^n \rightarrow \mathbb{R}^m

with F(x) = Ax. In this context we have the four fundamental subspaces. These are the range, or column space, of A:

R(A) = \{ Ax : x \in \mathbb{R}^n \},

the kernel, or nullspace, of A:

N(A) = \{ x \in \mathbb{R}^n : Ax = 0 \},

together with their counterparts for the transpose: the row space, which is the range R(A^T), and the kernel N(A^T) (also called the left kernel of A).
The dimensions of R(A) and R(A^T) are equal, and this common dimension is called the rank of the matrix A, often denoted r. The dimension of N(A) is n - r, and the dimension of N(A^T) is m - r.
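The dimension counts can be verified numerically; a minimal sketch with NumPy, using a hypothetical 3 by 4 matrix constructed to have rank 2 (its third row is the sum of the first two):

```python
import numpy as np

# Hypothetical rank-deficient example: row 3 = row 1 + row 2, so r = 2.
A = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 3.0],
              [1.0, 1.0, 3.0, 4.0]])
m, n = A.shape

r = np.linalg.matrix_rank(A)   # dim R(A) = dim R(A^T) = r
dim_null = n - r               # dim N(A)   = n - r
dim_left_null = m - r          # dim N(A^T) = m - r
```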
Trace and determinant
The trace of a square (n by n) matrix A is the sum of the diagonal elements:

\mathrm{tr}(A) = \sum_{i=1}^{n} A_{i,i}.
The determinant can be computed (or defined) recursively by cofactor expansion along any row i:

\det(A) = \sum_{j=1}^{n} (-1)^{i+j} A_{i,j} M_{i,j},

where M_{i,j} is the determinant of the submatrix of A obtained by deleting row i and column j.
The determinant is multiplicative, i.e.,

\det(AB) = \det(A) \det(B).

It is also invariant under the transpose:

\det(A^T) = \det(A).
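Both identities (and the trace) are easy to spot-check with NumPy; the matrices below are arbitrary examples:

```python
import numpy as np

# Arbitrary 2x2 examples for checking the identities.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
B = np.array([[1.0, 4.0],
              [2.0, 1.0]])

# tr(A) = sum of diagonal entries = 2 + 3 = 5.
trace_A = np.trace(A)

# det(AB) = det(A) det(B)
multiplicative = np.isclose(np.linalg.det(A @ B),
                            np.linalg.det(A) * np.linalg.det(B))

# det(A^T) = det(A)
transpose_invariant = np.isclose(np.linalg.det(A.T), np.linalg.det(A))
```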
Angles and orthogonality
On a vector space V over a field K, an inner product is a map V \times V \rightarrow K, often denoted by <x,y> or <x|y>, which has the most important properties of the standard Euclidean inner product: it is positive-definite, linear in one argument, and conjugate-symmetric (<x,y> = \bar{<y,x>}).
For numerical purposes, the field K is almost always either the real or complex numbers, and the inner product is usually the standard Euclidean or Hermitian inner product <x,y> = y^* x (sometimes x^* y is used instead).
In a real inner product space we can define the angle between two vectors by

\cos \theta = <x,y> / (|x| \, |y|),

where the lengths are also defined through the inner product: |x| = \sqrt{<x,x>}.
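As a small numerical illustration with the standard Euclidean inner product (the vectors are arbitrary examples; the angle between them is \pi/4):

```python
import numpy as np

# Arbitrary example vectors; the angle between them is 45 degrees.
x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])

# |x| = sqrt(<x,x>), with <x,y> = x . y in the real Euclidean case.
norm_x = np.sqrt(x @ x)
norm_y = np.sqrt(y @ y)

# cos(theta) = <x,y> / (|x| |y|)
theta = np.arccos((x @ y) / (norm_x * norm_y))
```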
Symmetry and normality
A matrix A is symmetric if A_{i,j} = A_{j,i}, and anti-symmetric if A_{i,j} = -A_{j,i}. Real symmetric and anti-symmetric matrices are subsets of the normal matrices, which have the property that A^* A = A A^*, where A^* is the complex-conjugate transpose of A. Normal matrices can be unitarily diagonalized, so they possess a complete orthonormal set of eigenvectors. In addition, symmetric matrices have real eigenvalues and anti-symmetric matrices have purely imaginary eigenvalues (including 0).
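These eigenvalue facts can be checked numerically; a sketch with NumPy, using arbitrary 2 by 2 examples of each type:

```python
import numpy as np

# Arbitrary symmetric and anti-symmetric examples.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])    # symmetric:      S == S.T
K = np.array([[0.0, 1.0],
              [-1.0, 0.0]])   # anti-symmetric: K == -K.T

# Both are normal (real matrices, so A^* = A^T): A^T A == A A^T.
normal_S = np.allclose(S.T @ S, S @ S.T)
normal_K = np.allclose(K.T @ K, K @ K.T)

# Symmetric => real eigenvalues; anti-symmetric => purely imaginary.
real_eigs = np.allclose(np.linalg.eigvals(S).imag, 0.0)
imag_eigs = np.allclose(np.linalg.eigvals(K.astype(complex)).real, 0.0)
```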