Matrix Methods in Data Analysis, Signal Processing, and Machine Learning by Prof. Gilbert Strang (MIT)
Free videos and course materials uploaded by Massachusetts Institute of Technology staff.
The Column Space of A Contains All Vectors Ax
Multiplying and Factoring Matrices
Orthonormal Columns in Q Give Q′Q=I
Eigenvalues and Eigenvectors
Positive Definite and Semidefinite Matrices
Singular Value Decomposition (SVD)
Eckart-Young: The Closest Rank k Matrix to A
Norms of Vectors and Matrices
Four Ways to Solve Least Squares Problems
Survey of Difficulties with Ax=b
Minimizing ‖x‖ Subject to Ax=b
Computing Eigenvalues and Singular Values
Randomized Matrix Multiplication
Low Rank Changes in A and Its Inverse
Matrices A(t) Depending on t, Derivative = dA/dt
Derivatives of Inverse and Singular Values
Rapidly Decreasing Singular Values
Counting Parameters in SVD, LU, QR, Saddle Points
Saddle Points Continued, Maxmin Principle
Definitions and Inequalities
Minimizing a Function Step by Step
Gradient Descent: Downhill to a Minimum
Accelerating Gradient Descent (Use Momentum)
Linear Programming and Two-Person Games
Stochastic Gradient Descent
Structure of Neural Nets for Deep Learning
Backpropagation: Find Partial Derivatives
Computing in Class [No video available]
Computing in Class (cont.) [No video available]
Completing a Rank-One Matrix, Circulants!
Eigenvectors of Circulant Matrices: Fourier Matrix
ImageNet is a Convolutional Neural Network (CNN), The Convolution Rule
Neural Nets and the Learning Function
Distance Matrices, Procrustes Problem
Finding Clusters in Graphs
Alan Edelman and Julia Language
Linear algebra concepts are key to understanding and creating machine learning algorithms, especially as applied to deep learning and neural networks. This course reviews linear algebra with applications to probability, statistics, and optimization, and above all gives a full explanation of deep learning.
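One lecture above, Eckart-Young, states a concrete result at the heart of the course: the truncated SVD gives the closest rank-k matrix to A. Here is a minimal NumPy sketch (our illustration, not part of the course materials) that checks this numerically:

```python
import numpy as np

# Any real matrix A; a fixed seed keeps the demo reproducible.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))

# SVD: A = U @ diag(s) @ Vt, singular values sorted in decreasing order.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep the k largest singular values: the best rank-k approximation to A.
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Eckart-Young: the Frobenius-norm error equals the square root of the
# sum of the squared discarded singular values. The two numbers agree.
print(np.linalg.norm(A - A_k, "fro"))
print(np.sqrt(np.sum(s[k:] ** 2)))
```

By the theorem, no rank-2 matrix comes closer to A than A_k in the Frobenius or spectral norm, which is why the truncated SVD underlies the low-rank approximation ideas that recur throughout the course.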