Mathematics for Machine Learning

In this course, we’ll study the matrix theory behind the basic
concepts of machine learning. We’ll present matrix algebra and
derivatives of and with respect to matrices. These definitions are
crucial for understanding the optimization methods used in machine
learning.

Below is a detailed description of the course content, divided into
weekly lectures.
Week 1 Introduction.
Weeks 2-5 Trace, norm, distance, angle, orthogonality. Kronecker product,
vec operator, Hadamard product. Linear systems and
generalized inverses. Moore-Penrose inverse (see the short
numerical sketch after the schedule). Determinants.
Linear, bilinear and quadratic forms. Eigenvalues and eigenvectors
of matrices.
Weeks 6-11 Matrix differentiation (see the gradient example after the
schedule). Polar decomposition. Hessian.
Week 12 Minimization of a second-degree n-variable polynomial subject
to linear constraints (a worked formulation follows the schedule).
Weeks 13-14 Applications to machine learning.
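
As an illustration of the Weeks 2-5 material, here is a minimal
numerical sketch, assuming NumPy is available; the matrix A and
vector b are made up for illustration and are not part of the course
material. It shows how the Moore-Penrose inverse solves an
overdetermined linear system in the least-squares sense.

    # Minimal sketch (NumPy assumed): the Moore-Penrose inverse A^+ gives
    # the minimum-norm least-squares solution x = A^+ b of A x = b.
    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0],
                  [5.0, 6.0]])   # overdetermined: 3 equations, 2 unknowns
    b = np.array([1.0, 2.0, 2.0])

    x = np.linalg.pinv(A) @ b    # least-squares solution via A^+

    # Same solution from the dedicated least-squares routine, for comparison.
    x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(np.allclose(x, x_ref))  # True: the two solutions agree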
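For the Weeks 6-11 material, a standard example of matrix
differentiation, stated here only as an illustration rather than as
part of the official syllabus, is the gradient and Hessian of a
quadratic form:

\[
f(x) = x^\top A x, \qquad
\nabla_x f(x) = (A + A^\top)\, x, \qquad
\nabla_x^2 f(x) = A + A^\top ,
\]

which reduce to \(2Ax\) and \(2A\) when \(A\) is symmetric.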
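The Week 12 problem can be written, under the common assumptions that
\(A\) is symmetric positive definite and \(C\) has full row rank, as

\[
\min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\, x^\top A x + b^\top x
\quad \text{subject to} \quad C x = d .
\]

Setting the gradient of the Lagrangian
\(L(x,\lambda) = \tfrac{1}{2} x^\top A x + b^\top x + \lambda^\top (Cx - d)\)
to zero gives the linear system

\[
\begin{pmatrix} A & C^\top \\ C & 0 \end{pmatrix}
\begin{pmatrix} x \\ \lambda \end{pmatrix}
=
\begin{pmatrix} -b \\ d \end{pmatrix},
\]

whose \(x\)-component is the constrained minimizer.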