Description:

  • We write $A \in \mathbb{R}^{m \times n}$
    • m rows and n columns
  • A collection of vectors arranged in rows or columns
    • $A = [a_1 \;\cdots\; a_n]$ (column vectors) or a stack of row vectors
  • Always think of a matrix as a transformation acting on vectors (see the sketch below)
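
A minimal NumPy sketch of this setup; the example matrix and vector are arbitrary illustrations:

```python
import numpy as np

# A 2 x 3 matrix: m = 2 rows, n = 3 columns.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
print(A.shape)      # (2, 3)

# Viewed as a collection of vectors: columns (or rows).
columns = [A[:, j] for j in range(A.shape[1])]   # three 2-vectors
rows = [A[i, :] for i in range(A.shape[0])]      # two 3-vectors

# Viewed as a transformation: A maps a 3-vector x to the 2-vector A @ x.
x = np.array([1.0, 0.0, -1.0])
print(A @ x)        # [-2. -2.]
```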

Matrix Algebra:

  • Transpose:
    • Exchange rows and columns
      • every element at $(i,j)$ becomes the element at $(j,i)$: $(A^T)_{ij} = A_{ji}$
  • Matrix Addition/Subtraction:
    • Must have the same size
    • Match element by element
    • Results in a new matrix of the same size
  • Multiplication:
    • Multiply with a scalar: $(\alpha A)_{ij} = \alpha A_{ij}$, i.e., every entry is scaled
    • Matrix-Vector product:
      • Between an $m \times n$ matrix $A$ and an $n$-vector $x$
      • Returns an $m$-vector, like a linear map
        • A vector transformed by the matrix
      • Denote by $Ax$ the $m$-vector with $i$-th component $(Ax)_i = \sum_{j=1}^{n} A_{ij} x_j$
      • If the columns of the matrix are given by the vectors $a_1, \dots, a_n$, then $Ax$ can be interpreted as a Linear Combination of the columns
        • Then we have $Ax = x_1 a_1 + x_2 a_2 + \dots + x_n a_n$
        • Think of the column vectors within the matrix as the transformation applied to the input vector
    • Vector-Matrix product:
      • We first need to transpose the vector to have $x^T A$, a $1 \times n$ row vector
    • Matrix-Matrix product:
      • Defined for $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$ (inner dimensions must match)
      • Combines matrix $A$ and matrix $B$ into one matrix
      • $AB$ is the $m \times p$ matrix with $(i,j)$ element given by $(AB)_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj}$
        • row $i$ of $A$ combined (dot product) with column $j$ of $B$
      • Column-wise interpretation:
        • the $j$-th column of $AB$ is $A b_j$, which is a matrix-vector product
      • Row-wise interpretation:
        • the $i$-th row of $AB$ is (row $i$ of $A$) times $B$, which is a vector-matrix product
      • Generally, $AB \neq BA$: matrix multiplication is not commutative (see the sketch after this list)
  • Division:
    • Done by multiplying with the Inverse (there is no direct matrix division)
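
A minimal NumPy sketch of these operations; the matrices below are arbitrary examples:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
x = np.array([2.0, -1.0])

# Transpose: (A^T)_ij = A_ji
print(A.T)

# Addition and scalar multiplication are element-wise.
print(A + B)
print(3 * A)

# Matrix-vector product = linear combination of the columns of A.
print(A @ x)                                   # [0. 2.]
print(x[0] * A[:, 0] + x[1] * A[:, 1])         # same result

# Matrix-matrix product, and non-commutativity: generally AB != BA.
print(A @ B)
print(np.allclose(A @ B, B @ A))               # False for this example
```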

As Linear System of Equations

Some classes of matrices:

Matrix as linear map:

  • A vector-valued function $f: \mathbb{R}^n \to \mathbb{R}^m$
  • A linear map can be represented by a matrix $A$, mapping an input vector $x$ to an output vector $y = Ax$
  • $f(x) = Ax$, where $A$ is an $m \times n$ matrix
  • Affine maps are linear functions plus a constant vector: $f(x) = Ax + b$ (see the sketch below)
    • for some $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$
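
A small NumPy sketch contrasting a linear map with an affine map; the specific $A$ and $b$ are illustrative:

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [2.0,  0.0],
              [0.0,  3.0]])     # maps R^2 -> R^3
b = np.array([1.0, 1.0, 1.0])

def linear_map(x):
    return A @ x               # f(x) = Ax

def affine_map(x):
    return A @ x + b           # f(x) = Ax + b

x, y = np.array([1.0, 2.0]), np.array([-3.0, 0.5])

# Linearity: f(x + y) = f(x) + f(y) holds for the linear map ...
print(np.allclose(linear_map(x + y), linear_map(x) + linear_map(y)))   # True
# ... but not for the affine map (the constant b breaks additivity).
print(np.allclose(affine_map(x + y), affine_map(x) + affine_map(y)))   # False
```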

Matrix Range and Rank:

  • For a matrix $A$, the columns $a_1, \dots, a_n$ are vectors, one vector per column
    • together the columns span the column space of $A$
    • a row space also exists for $A$, spanned by its rows (the column space of $A^T$)
  • The matrix transformation maps inputs into a new space where the outputs live
    • the range of a matrix is the space in which the outputs $Ax$ can exist
  • The range of $A$: $\mathcal{R}(A) = \{Ax : x \in \mathbb{R}^n\}$
    • ex: $Ax = x_1 a_1 + \dots + x_n a_n$
    • now the components $x_1, \dots, x_n$ and the columns $a_1, \dots, a_n$ form a Linear Combination of vectors
    • so $\mathcal{R}(A)$ is a Subspace, spanned by the columns
  • i.e., $\mathcal{R}(A) = \operatorname{span}(a_1, \dots, a_n)$
  • $\operatorname{rank}(A)$, the rank of $A$, is the dimension of $\mathcal{R}(A)$ (see the sketch after this list)
    • the rank equals the number of linearly independent columns of $A$
      • so the output of the matrix transformation lives in a space of dimension $= \operatorname{rank}(A)$
    • Full rank means all column vectors are linearly independent
      • for example:
        • the columns $a_1, \dots, a_n$ span the column space of $A$
        • the rows of $A$ span the row space of $A$, i.e. the column space of $A^T$
        • The dimension of the range is capped by the number of vectors: $\operatorname{rank}(A) \le \min(m, n)$
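
A short NumPy check of range and rank; the example matrix is chosen so one column is a combination of the others:

```python
import numpy as np

# Third column = first + second, so only 2 columns are linearly independent.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

print(np.linalg.matrix_rank(A))          # 2: dimension of the range R(A)

# Every output Ax is a linear combination of the columns of A.
x = np.array([2.0, -1.0, 3.0])
combo = x[0] * A[:, 0] + x[1] * A[:, 1] + x[2] * A[:, 2]
print(np.allclose(A @ x, combo))         # True

# Column rank equals row rank.
print(np.linalg.matrix_rank(A.T))        # 2
```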

Nullspace of matrix:

  • The nullspace of a matrix is a subspace
  • The set of vectors in the input space that are mapped to zero, denoted by $\mathcal{N}(A) = \{x \in \mathbb{R}^n : Ax = 0\}$
    • When transformed by the matrix, some vectors are sent to the origin
    • example: if a map squashes a 2D plane (inside 3D) onto a point at the origin, that 2D plane is the null space
  • $\mathcal{N}(A)$ and $\mathcal{R}(A^T)$ are mutually orthogonal subspaces (see the sketch below)
    • as the dot product of any vector from one with any vector from the other is 0
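
A NumPy sketch using the SVD to get a nullspace basis; the matrix is an arbitrary rank-1 example:

```python
import numpy as np

# A rank-1 matrix: it squashes a 2-D plane of inputs onto a line of outputs.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

# Null space from the SVD: right singular vectors whose singular values are ~0.
U, s, Vt = np.linalg.svd(A)
rank = len(s[s > 1e-10])
null_basis = Vt[rank:].T                 # columns span N(A)
print(np.allclose(A @ null_basis, 0))    # True: these inputs map to the origin

# N(A) is orthogonal to the row space R(A^T): all dot products are ~0.
row_space_basis = Vt[:rank].T
print(np.allclose(null_basis.T @ row_space_basis, 0))   # True
```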

Trace:

  • The trace of a square matrix is the sum of its diagonal elements: $\operatorname{tr}(A) = \sum_{i} A_{ii}$
  • The trace is a linear function of the entries of the matrix
  • $\operatorname{tr}(A) = \operatorname{tr}(A^T)$ for any square matrix $A$
  • $\operatorname{tr}(AB) = \operatorname{tr}(BA)$ for any matrices $A \in \mathbb{R}^{m \times n}$, $B \in \mathbb{R}^{n \times m}$ (see the check below)
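
A quick NumPy check of these trace identities on arbitrary example matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])          # 2 x 3
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [2.0, -1.0]])              # 3 x 2

S = A @ B                                # square, 2 x 2
print(np.trace(S), np.trace(S.T))        # trace(M) = trace(M^T)
print(np.trace(A @ B), np.trace(B @ A))  # trace(AB) = trace(BA)
```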

Matrix determinant:

  • Think of the matrix as a transformation; imagine 2 vectors forming a plane (a parallelogram): the determinant is the ratio of its size after the transformation to its size before
    • if the determinant is 0, the matrix transformation squishes the whole plane onto a line, or a point
    • generally, if the determinant is 0, the transformation squishes everything into a lower dimension
    • if the determinant is negative, the orientation of the plane is flipped
  • Of a Square Matrix $A \in \mathbb{R}^{n \times n}$:
    • inductive formula (Laplace's determinant expansion): $\det A = \sum_{j=1}^{n} (-1)^{i+j} A_{ij} \det A_{(i,j)}$ (see the sketch after this list)
    • where $i$ is any row index, chosen at will
    • $A_{(i,j)}$ denotes the $(n-1) \times (n-1)$ submatrix of $A$ obtained by eliminating row $i$ and column $j$ from $A$
  • For any square matrix $A \in \mathbb{R}^{n \times n}$ and a scalar $\alpha$: $\det(\alpha A) = \alpha^n \det A$
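
A minimal Python sketch of the Laplace expansion (expanding along the first row), checked against NumPy; the matrices are arbitrary examples:

```python
import numpy as np

def laplace_det(A):
    """Determinant via Laplace expansion along the first row (inductive formula)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        # Submatrix with row 0 and column j removed.
        sub = np.delete(np.delete(A, 0, axis=0), j, axis=1)
        total += (-1) ** j * A[0, j] * laplace_det(sub)
    return total

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
print(laplace_det(A), np.linalg.det(A))   # both ~5.0

# Negative determinant: the transformation flips orientation.
F = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(np.linalg.det(F))                    # -1.0
```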

Orthonormal matrix:

  • We have $U^T U = U U^T = I$ (the columns of $U$ are orthonormal)
  • Hence $U^{-1} = U^T$ (see the sketch below)
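
A NumPy sketch, building an orthonormal matrix from a QR factorization of a random matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # U has orthonormal columns

print(np.allclose(U.T @ U, np.eye(3)))             # True: U^T U = I
print(np.allclose(np.linalg.inv(U), U.T))          # True: U^{-1} = U^T

# Orthonormal transformations preserve lengths.
x = np.array([1.0, -2.0, 0.5])
print(np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x)))   # True
```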

Matrix inverse:

  • If $A$ is non-singular ($\det A \neq 0$), then we have $A^{-1}$ as the unique matrix such that $A A^{-1} = A^{-1} A = I$
    • If $A, B$ are both square and nonsingular, then $(AB)^{-1} = B^{-1} A^{-1}$
  • If $A$ is square and nonsingular, then $(A^{-1})^T = (A^T)^{-1}$, written $A^{-T}$
  • For a generic matrix $A \in \mathbb{R}^{m \times n}$, a pseudoinverse generalizes the inverse (see the sketch after this list):
    • a matrix $A^{li}$ is a left inverse of $A$ if $A^{li} A = I_n$
    • a matrix $A^{ri}$ is a right inverse of $A$ if $A A^{ri} = I_m$
    • In general, a matrix $A^{pi}$ is a pseudoinverse of $A$ if $A A^{pi} A = A$
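
A NumPy sketch of these facts; np.linalg.pinv gives the Moore-Penrose pseudoinverse, and the example matrices are arbitrary:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])

# (AB)^{-1} = B^{-1} A^{-1} for square nonsingular A, B.
print(np.allclose(np.linalg.inv(A @ B),
                  np.linalg.inv(B) @ np.linalg.inv(A)))      # True

# A tall, full-column-rank matrix: its pseudoinverse is a left inverse.
T = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
T_pinv = np.linalg.pinv(T)
print(np.allclose(T_pinv @ T, np.eye(2)))                    # True: left inverse
print(np.allclose(T @ T_pinv @ T, T))                        # True: A A^pi A = A
```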

Matrix Norm

Similar matrix:

  • Two matrices $A, B$ are said to be similar if there exists a nonsingular matrix $P$ such that $B = P^{-1} A P$
    • any $P$ with linearly independent column vectors works (e.g., 3 independent columns for a $3 \times 3$ matrix)
  • Similar matrices are different representations of the same linear map, under a change of basis in the underlying space
  • They have the same set of Eigenvalues
  • Matrix $B = P^{-1} A P$ represents the linear map $x \mapsto Ax$ in the new basis defined by the columns of $P$ (see the sketch below)
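
A NumPy check that similar matrices share eigenvalues; $A$ and the nonsingular $P$ below are arbitrary examples:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 2.0],
              [3.0, 4.0]])               # nonsingular: det(P) = -2
B = np.linalg.inv(P) @ A @ P             # B is similar to A

eig_A = np.sort(np.linalg.eigvals(A))
eig_B = np.sort(np.linalg.eigvals(B))
print(np.allclose(eig_A, eig_B))         # True: similar matrices share eigenvalues
```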

Eigenvector

Complex Matrix

Matrix Decomposition:

Matrix Completion