Overview

Columbia University, Fall 2022 · M. Miller Eismeier · Syllabus · Course notes

Course description

The Honors Math sequence covers linear algebra and multivariable calculus with a rigorous, proof-based approach. Beyond the course content itself, the Honors sequence aims to develop your skills as a proof writer who can clearly communicate why things are true, in addition to the usual computational content of introductory classes in calculus and linear algebra.

Honors Math A is a course in proof-based linear algebra. We will first discuss mathematical logic, then the idea of sets and functions, before moving on to linear algebra from both abstract and concrete perspectives. No calculus is used except as a source of examples.


Main definitions and theorems

Vector spaces and subspaces

  • (Lemma 38) Basis extension lemma. If $L$ is any linearly independent set in a finite-dimensional vector space $V$, there is a larger set $B \supseteq L$ which is a basis for $V$ (with $B = L$ if $L$ already spans $V$).
  • (Midterm, Problem 6) If two subspaces $U, W \subseteq V$ are complementary (i.e., they intersect trivially, so that $U \cap W = \{0\}$ and $U + W = V$), then $\dim U + \dim W = \dim V$.
    • (Sample Final 1, Problem 2) For all $v \in V$, there exist unique $u \in U$ and $w \in W$ so that $v = u + w$.
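
As a quick sanity check (a hypothetical example, not from the course materials), the unique decomposition $v = u + w$ can be computed concretely in $\mathbb{R}^2$ for the complementary subspaces $U = \operatorname{span}\{(1,0)\}$ and $W = \operatorname{span}\{(1,1)\}$:

```python
# Hypothetical example: U = span{(1, 0)} and W = span{(1, 1)} are
# complementary in R^2, so every v splits uniquely as v = u + w.

def decompose(v):
    """Solve a*(1, 0) + b*(1, 1) = v for (a, b); return (u, w)."""
    # Second coordinate forces b = v[1]; first coordinate then forces a.
    b = v[1]
    a = v[0] - b
    u = (a, 0)   # component in U
    w = (b, b)   # component in W
    return u, w

u, w = decompose((3, 5))
assert (u[0] + w[0], u[1] + w[1]) == (3, 5)
print(u, w)  # (-2, 0) (5, 5)
```

The solution $(a, b)$ is unique precisely because $U \cap W = \{0\}$; if the spanning vectors were parallel, the system would be underdetermined.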

Linear maps

  • (Theorem 45) Linear maps are determined by their values on a basis. Given a basis $v_1, \dots, v_n$ for a vector space $V$ and a list of vectors $w_1, \dots, w_n$ in a vector space $W$, there exists a unique linear map $T: V \to W$ for which $T(v_i) = w_i$ for all $i$.
  • (Theorem 49) Rank-nullity. If $T: V \to W$ is a linear map between finite-dimensional vector spaces, we have: $\dim \ker(T) + \dim \operatorname{im}(T) = \dim V$.
  • (Corollary 50) Following the rank-nullity theorem, we have: if $\dim V > \dim W$, then $T$ is not injective; if $\dim V < \dim W$, then $T$ is not surjective.
  • (Proposition 53) Given a linear map $T: V \to W$:
    • If $T$ is right invertible, then $T$ is surjective.
    • If $T$ is left invertible, then $T$ is injective.
    • If $T$ is invertible, then $T$ is bijective.
  • (Corollary 57) Invertible map theorem. If $T: V \to W$ is a linear map between two finite-dimensional vector spaces of the same dimension, then $T$ is injective $\iff$ $T$ is surjective $\iff$ $T$ is invertible.
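
Rank-nullity can be checked numerically (a hypothetical sketch, not from the course): compute $\dim \operatorname{im}(T)$ as the rank of the matrix of $T$ via elimination with exact arithmetic, and recover $\dim \ker(T)$ as $\dim V$ minus the rank.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix, via forward elimination with exact arithmetic."""
    A = [[Fraction(x) for x in row] for row in rows]
    m, n = len(A), len(A[0])
    r = 0  # index of the next pivot row
    for c in range(n):
        # Find a pivot in column c at or below row r.
        piv = next((i for i in range(r, m) if A[i][c] != 0), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        for i in range(r + 1, m):
            f = A[i][c] / A[r][c]
            A[i] = [A[i][j] - f * A[r][j] for j in range(n)]
        r += 1
    return r

# T: R^3 -> R^2 given by a 2x3 matrix; rank-nullity says rank + nullity = 3.
A = [[1, 2, 3],
     [2, 4, 6]]          # second row is a multiple of the first
rk = rank(A)             # dim im(T)
nullity = 3 - rk         # dim ker(T), by rank-nullity
assert rk + nullity == 3
print(rk, nullity)  # 1 2
```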

Gauss-Jordan algorithm

  • (Proposition 70) If $A$ is an $m \times n$ matrix and $\mathbf{b}$ is a vector (i.e., an $m \times 1$ matrix), and $A'$ and $\mathbf{b}'$ are obtained by performing the same elementary row operation, then the systems $A\mathbf{x} = \mathbf{b}$ and $A'\mathbf{x} = \mathbf{b}'$ have the same solutions.
  • (HW6, Problem 4) A matrix $A$ is invertible $\iff$ $\operatorname{rref}(A) = I_n$, the identity matrix.
  • (HW6, Problem 5) Algorithm to compute the inverse of a matrix. Let $A$ be an $n \times n$ matrix. Perform the Gauss-Jordan algorithm on the pair $(A \mid I_n)$ until $(\operatorname{rref}(A) \mid B)$ is reached for some matrix $B$. Then $A$ is invertible if and only if $\operatorname{rref}(A) = I_n$, in which case the inverse $A^{-1} = B$.
    • Following Proposition 70, if a pair $(A \mid \mathbf{b})$ has solution $\mathbf{x}$, then $(\operatorname{rref}(A) \mid \mathbf{b}')$ has the same solution $\mathbf{x}$.
    • If $A$ is invertible, the GJA transforms the system $(A \mid \mathbf{e}_i)$ into $(I_n \mid \mathbf{x}_i)$, so $\mathbf{x}_i$ is the unique vector with $A\mathbf{x}_i = \mathbf{e}_i$.
    • The vector $\mathbf{x}_i$ is thus the $i$'th column of $A^{-1}$.
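
The algorithm above can be sketched directly in Python (a minimal implementation for illustration, using exact rational arithmetic; partial pivoting and efficiency concerns are ignored):

```python
from fractions import Fraction

def inverse(A):
    """Invert an n x n matrix by row-reducing the augmented pair (A | I)."""
    n = len(A)
    # Build the augmented matrix (A | I) with exact arithmetic.
    M = [[Fraction(A[i][j]) for j in range(n)]
         + [Fraction(1 if i == j else 0) for j in range(n)]
         for i in range(n)]
    for c in range(n):
        piv = next((r for r in range(c, n) if M[r][c] != 0), None)
        if piv is None:
            return None               # rref(A) != I_n, so A is not invertible
        M[c], M[piv] = M[piv], M[c]
        p = M[c][c]
        M[c] = [x / p for x in M[c]]  # scale the pivot row so the pivot is 1
        for r in range(n):
            if r != c and M[r][c] != 0:
                f = M[r][c]           # clear the rest of column c
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [row[n:] for row in M]     # the right half is now A^{-1}

B = inverse([[2, 1], [5, 3]])
assert B == [[3, -1], [-5, 2]]        # since det = 1, the inverse is integral
assert inverse([[1, 2], [2, 4]]) is None   # singular matrix is detected
```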

Matrices with respect to a basis

  • (Proposition 74) The transition maps $P_{B \to B'}$ (converting $B$-coordinates into $B'$-coordinates) satisfy the following properties:
    • For any basis $B$ of $V$, we have $P_{B \to B}$ is the identity map $I$.
    • For any two bases $B, B'$ of $V$, we have $P_{B' \to B} = (P_{B \to B'})^{-1}$.
    • For any three bases $B, B', B''$, we have $P_{B' \to B''} P_{B \to B'} = P_{B \to B''}$.
  • (Proposition 75) Let $T: V \to W$ be a linear map, and choose bases $B = (v_1, \dots, v_n)$ for $V$ and $C$ for $W$. Then the matrix representation of $T$ with respect to the two bases is given by the matrix whose $i$'th column is the $C$-coordinate vector of $T(v_i)$.
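
A hypothetical worked example of Proposition 75 (the map and bases below are my own, not from the notes): build the matrix of $T(x, y) = (x + y, x - y)$ with respect to the bases $B = \{(1,1), (1,-1)\}$ for the domain and $C = \{(1,0), (1,1)\}$ for the codomain.

```python
# Column i of the matrix is the C-coordinate vector of T(v_i), v_i in B.

def T(v):
    x, y = v
    return (x + y, x - y)

def coords_in_C(v):
    """Solve a*(1, 0) + b*(1, 1) = v for (a, b), the C-coordinates of v."""
    b = v[1]
    a = v[0] - b
    return (a, b)

B = [(1, 1), (1, -1)]
columns = [coords_in_C(T(v)) for v in B]
# Transpose the list of columns into a row-major matrix.
matrix = [[col[i] for col in columns] for i in range(2)]
print(matrix)  # [[2, -2], [0, 2]]
```

Here $T(1,1) = (2,0)$ has $C$-coordinates $(2,0)$ and $T(1,-1) = (0,2)$ has $C$-coordinates $(-2,2)$, which appear as the two columns.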

Determinants

  • (Theorem 80) The determinant is well-behaved under matrix multiplication. If $A$ and $B$ are $n \times n$ matrices, then $\det(AB) = \det(A)\det(B)$.
    • If $A$ is invertible, $\det(A) \neq 0$ and $\det(A^{-1}) = \det(A)^{-1}$.
  • (Theorem 85) Determinant changes with elementary row or column operations.
    • If $B$ is obtained by swapping two rows or two columns of $A$, then $\det(B) = -\det(A)$.
    • If $B$ is obtained by scaling a row or column of $A$ by $c$, then $\det(B) = c \det(A)$.
    • If $B$ is obtained by adding a multiple of one row or column of $A$ to another, then $\det(B) = \det(A)$.
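
The three row-operation rules can be verified on a concrete $3 \times 3$ example (a hypothetical check, not from the course materials):

```python
def det3(A):
    """3x3 determinant by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = A
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

A = [[2, 1, 0], [1, 3, 1], [0, 1, 4]]
d = det3(A)  # 18

# Swapping two rows flips the sign of the determinant.
swapped = [A[1], A[0], A[2]]
assert det3(swapped) == -d

# Scaling a row by 5 scales the determinant by 5.
scaled = [[5 * x for x in A[0]], A[1], A[2]]
assert det3(scaled) == 5 * d

# Adding 3 * (row 0) to row 1 leaves the determinant unchanged.
added = [A[0], [y + 3 * x for x, y in zip(A[0], A[1])], A[2]]
assert det3(added) == d
```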

Eigen(things)

  • (Lemma 94) If $\lambda_1, \dots, \lambda_k$ are the eigenvalues of $T$, then the eigenspaces $E_{\lambda_1}, \dots, E_{\lambda_k}$ are independent, so we have $E_{\lambda_i} \cap \sum_{j \neq i} E_{\lambda_j} = \{0\}$ for all $i$.
  • (Lemma 95) If $U_1, \dots, U_k$ are independent subspaces, then $\dim(U_1 + \cdots + U_k) = \dim U_1 + \cdots + \dim U_k$. Furthermore, the separate bases for $U_1, \dots, U_k$ can be combined to form a basis for $U_1 + \cdots + U_k$.
  • (Theorem 97) A linear map $T$ is diagonalizable if and only if its characteristic polynomial can be split into linear factors, and the geometric multiplicity of $\lambda$ equals the algebraic multiplicity of $\lambda$ for every eigenvalue $\lambda$.
  • (Corollary 98) If $T: V \to V$ is a linear map with $\dim V$ distinct eigenvalues, then $T$ is diagonalizable.
  • (Corollary 100) If $T: V \to V$ is a linear map for some finite-dimensional nonzero vector space $V$ over the algebraically closed field $\mathbb{C}$, then $T$ has an eigenvector; that is, there exist some $v \neq 0$ and $\lambda \in \mathbb{C}$ so that $T(v) = \lambda v$.
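
A hypothetical $2 \times 2$ instance of Corollary 98 (the matrix below is my own example): $A$ has distinct eigenvalues $2$ and $3$, so it is diagonalizable, meaning $AP = PD$ where the columns of $P$ are a basis of eigenvectors and $D$ is diagonal.

```python
def matmul(X, Y):
    """Multiply two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1],
     [0, 3]]
# Eigenvectors: (1, 0) for eigenvalue 2 and (1, 1) for eigenvalue 3,
# placed as the columns of P.
P = [[1, 1],
     [0, 1]]
D = [[2, 0],
     [0, 3]]
# Diagonalizability: A P = P D, i.e. A = P D P^{-1}.
assert matmul(A, P) == matmul(P, D)
```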

Inner product spaces

  • (Definition 54) If $V$ is an inner product space, the norm, or magnitude, is a function $\|\cdot\|: V \to \mathbb{R}$ defined by $\|v\| = \sqrt{\langle v, v \rangle}$.
  • (Lemma 102) The norm on an inner product space satisfies the following properties:
    • (N1) $\|v\| \geq 0$, and $\|v\| = 0 \iff v = 0$.
    • (N2) $\|cv\| = |c| \, \|v\|$.
    • (N3) $\|v + w\|^2 = \|v\|^2 + 2\operatorname{Re}\langle v, w \rangle + \|w\|^2$.
      • In particular, if $\langle v, w \rangle$ has no real part, then $\|v + w\|^2 = \|v\|^2 + \|w\|^2$.

Orthogonality and orthonormal bases

  • (Definition 57) An orthonormal list of vectors in a finite-dimensional inner product space $V$ is a list of vectors $e_1, \dots, e_k$ for which $\|e_i\| = 1$ and $\langle e_i, e_j \rangle = 0$ for all $i \neq j$.
  • (Lemma 106) If $e_1, \dots, e_k$ is an orthonormal list, it is linearly independent.
  • (Proposition 107) Orthonormal basis extension. If $V$ is a finite-dimensional inner product space, every list of orthonormal vectors in $V$ can be extended to an orthonormal basis for $V$.
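
One standard way to carry out this extension constructively is Gram-Schmidt against the standard basis (a hypothetical sketch in $\mathbb{R}^n$, not the proof from the notes): feed in each standard basis vector, subtract its projections onto the vectors collected so far, and keep the remainder only if it is nonzero.

```python
from math import sqrt, isclose

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def extend_orthonormal(vectors, dim):
    """Extend an orthonormal list in R^dim to an orthonormal basis
    by running Gram-Schmidt against the standard basis vectors."""
    basis = list(vectors)
    for i in range(dim):
        e = [1.0 if j == i else 0.0 for j in range(dim)]
        # Subtract the projections onto the vectors collected so far.
        for b in basis:
            c = dot(e, b)
            e = [x - c * y for x, y in zip(e, b)]
        norm = sqrt(dot(e, e))
        if norm > 1e-12:            # keep e only if it adds a new direction
            basis.append([x / norm for x in e])
        if len(basis) == dim:
            break
    return basis

# Extend the single unit vector (1/sqrt(2), 1/sqrt(2), 0) to a basis of R^3.
B = extend_orthonormal([[1 / sqrt(2), 1 / sqrt(2), 0.0]], 3)
assert len(B) == 3
for i in range(3):
    for j in range(3):
        assert isclose(dot(B[i], B[j]), 1.0 if i == j else 0.0, abs_tol=1e-9)
```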

Transpose and dot product

  • (Proposition 113) If $T(v) = Av$ is a linear map $\mathbb{R}^n \to \mathbb{R}^m$, we have $(Av) \cdot w = v \cdot (A^T w)$ for all $v \in \mathbb{R}^n$ and $w \in \mathbb{R}^m$, and $(A^T)^T = A$.

Maps between inner product spaces

  • (Definition 59) If $V$ is an inner product space, an isometry is an invertible linear map $T: V \to V$ with $\langle T(v), T(w) \rangle = \langle v, w \rangle$ for all $v, w \in V$.
    • If $V$ is a real inner product space, $T$ is called an orthogonal transformation and has a corresponding orthogonal matrix. The set of $n \times n$ orthogonal matrices is denoted $O(n)$.
    • If $V$ is complex, $T$ is called a unitary transformation and has a corresponding unitary matrix. The set of $n \times n$ unitary matrices is denoted $U(n)$.
  • (Proposition 115) If $T: V \to V$ is a linear map of a finite-dimensional inner product space, then the following are equivalent:
    • $T$ is an isometry.
    • $T$ preserves norms; for all $v \in V$, we have $\|T(v)\| = \|v\|$.
    • $T$ satisfies the equation $T^* T = \mathrm{id}$ (for a matrix $A$, this reads $A^T A = I$ in the real case and $\bar{A}^T A = I$ in the complex case).
    • If $V = \mathbb{R}^n$ or $\mathbb{C}^n$ and $T$ corresponds with an $n \times n$ matrix $A$, then the columns of $A$ form an orthonormal basis for $V$.
  • (Lemma 118) Inner product and symmetric/Hermitian matrices.
    • A real matrix $A$ is symmetric ($A^T = A$) and a complex matrix is Hermitian ($\bar{A}^T = A$) if and only if for all $v, w$, we have $\langle Av, w \rangle = \langle v, Aw \rangle$.
    • A real matrix $A$ is skew-symmetric ($A^T = -A$) and a complex matrix is skew-Hermitian ($\bar{A}^T = -A$) if and only if for all $v, w$, we have $\langle Av, w \rangle = -\langle v, Aw \rangle$.
  • (Proposition 119) The eigenvalues of a symmetric or Hermitian matrix are all real, so that its characteristic polynomial splits into linear factors over $\mathbb{R}$. The eigenvalues of a skew-symmetric or skew-Hermitian matrix are all purely imaginary.
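
In the $2 \times 2$ real symmetric case, Proposition 119 can be seen by hand (a hypothetical worked example, not from the notes): $\begin{pmatrix} a & b \\ b & c \end{pmatrix}$ has characteristic polynomial $t^2 - (a+c)t + (ac - b^2)$, whose discriminant $(a+c)^2 - 4(ac - b^2) = (a-c)^2 + 4b^2$ is always nonnegative, so both roots are real.

```python
from math import sqrt

def symmetric_eigenvalues(a, b, c):
    """Real eigenvalues of the symmetric matrix [[a, b], [b, c]]."""
    disc = (a - c) ** 2 + 4 * b ** 2   # always >= 0, so the roots are real
    s = sqrt(disc)
    return ((a + c - s) / 2, (a + c + s) / 2)

lo, hi = symmetric_eigenvalues(2, 1, 2)   # matrix [[2, 1], [1, 2]]
print(lo, hi)  # 1.0 3.0
```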
