Machine Learning: Foundations and Applications
Professor: Paul J. Atzberger
MATH 260J Fall 2017, Meeting in Arts 1356
TR 2:00pm - 3:15pm


Welcome to the class website for Machine Learning: Foundations and Applications. This class serves as an accessible introduction to machine learning, covering both foundations and the practical use of methods in applications arising in recent data-driven fields, the natural sciences, and engineering. The class will also cover select advanced topics on deep learning and dimension reduction. Please check back periodically for updates to the course website.

In the course we plan to use the following books:

  • The Elements of Statistical Learning: Data Mining, Inference, and Prediction, by Hastie, Tibshirani, Friedman.
  • Foundations of Machine Learning, by Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar.

The course will also be based on recent papers from the literature and special lecture materials prepared by the instructor. Topics may be adjusted based on the backgrounds and interests of the class.

Syllabus [PDF]
Course Flyer [PDF]


  • Introduction
    • Historic developments and recent motivations
    • Statistical Learning Theory, PAC-Learnability
    • Concentration Inequalities and Sample Complexity Bounds
    • Rademacher Complexity, Vapnik–Chervonenkis Dimension
    • No-Free-Lunch Theorems
    • Motivating applications
  • Supervised learning
    • Linear methods for regression and classification
    • Parametric vs non-parametric regression
    • Kernel methods, Mercer's Theorem, Reproducing Kernel Hilbert Spaces
    • Model selection and bias-variance trade-offs
    • Support vector machines
    • Graphical models
    • Neural networks
  • Unsupervised learning
    • Clustering methods
    • Principal component analysis and related methods
    • Diffusion maps
    • Manifold learning
  • Computational methods for machine learning
    • Stochastic gradient descent
    • First-order non-linear optimization methods
    • Markov chain Monte Carlo sampling for posterior distributions
    • Sampling with Itô stochastic processes
    • Variational inference
    • Iterative methods and preconditioning
    • Dimensionality reduction
    • Sparse matrix methods
    • Stochastic averaging and multiscale methods
    • Example applications
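
As a concrete illustration of one of the computational methods listed above, the following is a minimal sketch of stochastic gradient descent applied to least-squares regression. The function name, step size, and toy data here are illustrative assumptions, not part of the course materials:

```python
import numpy as np

# Minimal sketch of stochastic gradient descent (SGD) for least-squares
# regression: at each step, the gradient is estimated from a single
# randomly chosen sample rather than from the full dataset.

def sgd_least_squares(X, y, lr=0.01, n_steps=2000, seed=0):
    """Fit w minimizing (1/2n) * sum_i (x_i . w - y_i)^2 via SGD."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_steps):
        i = rng.integers(n)                  # pick one sample at random
        grad = (X[i] @ w - y[i]) * X[i]      # gradient of the i-th loss term
        w -= lr * grad                       # descent step
    return w

# Toy data generated from a known linear model (noiseless),
# so SGD should approximately recover the true weights.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
w_true = np.array([2.0, -1.0])
y = X @ w_true
w_hat = sgd_least_squares(X, y)
```

Because the toy data are noiseless, the true weight vector is an exact stationary point of every per-sample loss, so the iterates converge to it; with noisy data one would typically decay the step size over time.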


Prerequisites: Calculus, linear algebra, and ideally some programming experience.


Grading: The grade for this special topics class will be based mostly on participation, specific homework assignments, and a final project.

Supplemental Materials:
