The 14 lectures will cover the material as broken down below:
1-3: Linear Systems, Matrix Algebra
3-4: Inverses and Transposes
4-5: Vector Spaces and Subspaces
8: Dimension and Subspaces
9-10: Linear Maps; the Rank-Nullity Theorem
11-12: Matrices representing Linear Maps
13-14: Inner Product Spaces
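The Rank-Nullity Theorem from lectures 9-10 says that for an m × n matrix A, rank(A) + dim N(A) = n. A minimal NumPy sketch (illustrative only, not part of the course materials; assumes NumPy is installed) checks this on a small example:

```python
import numpy as np

# 3 x 4 matrix whose third row is the sum of the first two,
# so rank(A) = 2 and rank-nullity predicts a 2-dimensional nullspace.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 3., 1., 1.]])
m, n = A.shape

rank = np.linalg.matrix_rank(A)

# The rows of Vt belonging to zero singular values form an
# orthonormal basis of the nullspace N(A).
U, s, Vt = np.linalg.svd(A)
null_basis = Vt[rank:]
nullity = null_basis.shape[0]

print(rank + nullity == n)   # rank-nullity: 2 + 2 = 4
```

The same SVD reappears in Part 7 of the notes; here it serves only to extract a nullspace basis numerically.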
Lecture Notes for Linear Algebra (2021)
Published May 2016. Includes a table of contents, a preface to the notes, lists of textbooks, websites, and video lectures, and sample sections 1.3, 3.3, 3.5, and 7.1.
- Linear Algebra @ OCW (video lectures from MIT: Math 18.06)
- Linear Algebra and Learning from Data @ OCW (video lectures: Math 18.065)
- Gilbert Strang @ OpenCourseWare
- Gilbert Strang's MIT Home Page
Other books by Gilbert Strang
- Linear Algebra for Everyone (new textbook, September 2020)
- Linear Algebra and Learning from Data (2019)
- Introduction to Linear Algebra, 5th edition (2016)
- Differential Equations and Linear Algebra
- Computational Science and Engineering
- Ordering Gilbert Strang's books

Detailed table of contents:

Front Matter (Contents and Preface)

Part 1: Basic Ideas of Linear Algebra
1.1 Linear Combinations of Vectors
1.2 Dot Products v · w and Lengths ||v|| and Angles θ
1.3 Matrices Multiplying Vectors: A times x
1.4 Column Space and Row Space of A
1.5 Dependent and Independent Columns
1.6 Matrix-Matrix Multiplication AB
1.7 Factoring A into CR: Column Rank = r = Row Rank
1.8 Rank One Matrices A = (1 column) times (1 row)

Part 2: Solving Linear Equations Ax = b: A is n by n
2.1 Inverse Matrices A⁻¹ and Solutions x = A⁻¹b
2.2 Triangular Matrix and Back Substitution for Ux = c
2.3 Elimination: Square A to Triangular U: Ax = b to Ux = c
2.4 Row Exchanges for Nonzero Pivots: Permutation P
2.5 Elimination with No Row Exchanges: Why Is A = LU?
2.6 Transposes / Symmetric Matrices / Dot Products
2.7 Changes in A⁻¹ from Changes in A (more advanced)

Part 3: Vector Spaces and Subspaces, Basis and Dimension
3.1 Vector Spaces and Four Fundamental Subspaces
3.2 Basis and Dimension of a Vector Space S
3.3 Independent Columns and Rows: Bases by Elimination
3.4 Ax = 0 and Ax = b: x nullspace and x particular
3.5 Four Fundamental Subspaces C(A), C(Aᵀ), N(A), N(Aᵀ)
3.6 Rank = Dimension of Column Space and Row Space
3.7 Graphs, Incidence Matrices, and Kirchhoff's Laws
3.8 Every Matrix A Has a Pseudoinverse A⁺

Part 4: Orthogonal Matrices Qᵀ = Q⁻¹ and Least Squares for Ax = b
4.1 Orthogonality of the Four Subspaces (two pairs)
4.2 Projections onto Lines and Subspaces
4.3 Least Squares Approximations (Regression): AᵀAx̂ = Aᵀb
4.4 Independent a's to Orthonormal q's by Gram-Schmidt
4.5 The Minimum Norm Solution to Ax = b (n > m) Is in the Row Space
4.6 Vector Norms and Matrix Norms

Part 5: Determinant of a Square Matrix
5.1 3 by 3 and n by n Determinants
5.2 Cofactors and the Formula for A⁻¹
5.3 det AB = (det A)(det B) and Cramer's Rule
5.4 Volume of Box = |Determinant of Edge Matrix E|

Part 6: Eigenvalues and Eigenvectors: Ax = λx and Aⁿx = λⁿx
6.1 Eigenvalues λ and Eigenvectors x: Ax = λx
6.2 Diagonalizing a Matrix: X⁻¹AX = Λ = eigenvalues
6.3 Symmetric Positive Definite Matrices: Five Tests
6.4 Solve Linear Differential Equations du/dt = Au
6.5 Matrices in Engineering: Derivatives to Differences
6.6 Rayleigh Quotients and Sx = λMx (two matrices)
6.7 Derivatives of the Inverse Matrix and the Eigenvalues
6.8 Interlacing Eigenvalues and Low Rank Changes in S

Part 7: Singular Values and Vectors: Av = σu and A = UΣVᵀ
7.1 Singular Vectors in U and V, Singular Values in Σ
7.2 Reduced SVD / Full SVD / Construct UΣVᵀ from AᵀA
7.3 The Geometry of A = UΣVᵀ: Rotate, Stretch, Rotate
7.4 Aₖ Is Closest to A: Principal Component Analysis (PCA)
7.5 Computing Eigenvalues of S and Singular Values of A
7.6 Computing Homework and Professor Townsend's Advice
7.7 Compressing Images by the SVD
7.8 The Victory of Orthogonality: Nine Reasons

Part 8: Linear Transformations and Their Matrices
8.1 Examples of Linear Transformations
8.2 Derivative Matrix D and Integral Matrix D⁺
8.3 Basis for V and Basis for Y ⇒ Matrix for T: V → Y

Part 9: Complex Numbers and the Fourier Matrix
9.1 Complex Numbers x + iy = re^(iθ): Unit Circle r = 1
9.2 Complex Matrices: Hermitian S = S̄ᵀ and Unitary Q⁻¹ = Q̄ᵀ
9.3 Fourier Matrix F and the Discrete Fourier Transform
9.4 Cyclic Convolution and the Convolution Rule
9.5 FFT: The Fast Fourier Transform
9.6 Cyclic Permutation P and Circulants C
9.7 The Kronecker Product A ⊗ B

Part 10: Learning from Data (Deep Learning with Neural Nets)
10.1 Learning Function F(x, v₀): Data v₀ and Weights x
10.2 playground.tensorflow.org: Circle Dataset
10.3 playground.tensorflow.org: Spiral Dataset
10.4 Creating the Architecture of Deep Learning
10.5 Convolutional Neural Nets: CNN in 1D and 2D
10.6 Counting Flat Pieces in the Graph of F
10.7 Three-Way Tensors Tᵢⱼₖ

Part 11: Computing Weights by Gradient Descent
11.1 Minimizing F(x) / Solving f(x) = 0
11.2 Minimizing a Quadratic Gives Linear Equations
11.3 Calculus for a Function F(x, y)
11.4 Minimizing the Loss: Stochastic Gradient Descent
11.5 Slow Convergence with Zigzag: Add Momentum
11.6 Direction of the Step xₖ₊₁ − xₖ: Step Length c
11.7 Chain Rule for ∇F and ∇L

Part 12: Basic Statistics: Mean, Variance, Covariance
12.1 Mean and Variance: Actual and Expected
12.2 Probability Distributions: Binomial, Poisson, Normal
12.3 Covariance Matrices and Joint Probabilities
12.4 Three Basic Inequalities of Statistics
12.5 Markov Matrices and Markov Chains
12.6 The Mean and Variance of z = x + y

Part 13: Graphs, Flows, and Linear Programming
13.1 Graph Incidence Matrix A and Laplacian Matrix AᵀA
13.2 Ohm's Law Combines with Kirchhoff's Law: AᵀCAx = f
13.3 Max Flow-Min Cut Problem in Linear Programming
13.4 Linear Programming and Duality: max = min
13.5 Finding Well-Connected Clusters in Graphs
13.6 Completing Rank One Matrices
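Section 1.7's factorization A = CR (which shows column rank = row rank) can be sketched in a few lines of NumPy. This is an illustrative example, not code from the notes, and it assumes the first r columns of A happen to be independent:

```python
import numpy as np

# Column 3 = column 1 + column 2, so A has rank r = 2.
A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 1., 2.]])

r = np.linalg.matrix_rank(A)

# C keeps the first r (independent) columns; R solves CR = A,
# expressing every column of A as a combination of the columns of C.
C = A[:, :r]
R = np.linalg.lstsq(C, A, rcond=None)[0]

print(np.allclose(A, C @ R))   # A = CR (up to roundoff)
```

Since C has r columns and R has r rows, A = CR exhibits the column space (spanned by C) and the row space (spanned by R) with the same dimension r.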
- Prof. Gilbert Strang
- Linear Algebra
- Unit 1: Vectors and Spaces
- Unit 2: Matrix Transformations
- Unit 3: Alternate Coordinate Systems (Bases)