Machine Learning: Foundations and Applications
Professor: Paul J. Atzberger
MATH 94WK Spring 2021
T 10:00am - 11:00am




Description

Welcome to the class website for Machine Learning: Foundations and Applications. This seminar course aims to serve as an accessible introduction to machine learning. The course will cover foundations and the practical use of methods in applications arising in recent data-driven fields, the natural sciences, and engineering. The class will also cover select advanced topics in deep learning and dimension reduction. Please check back periodically for updates to the course website.

In the course we plan to use the following books:

  • The Elements of Statistical Learning: Data Mining, Inference, and Prediction, by Trevor Hastie, Robert Tibshirani, and Jerome Friedman.
  • Foundations of Machine Learning, by Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar.

The course will also be based on recent papers from the literature and special lecture materials prepared by the instructor. Topics may be adjusted based on the backgrounds and interests of the class.

Syllabus [PDF]

Topics

  • Foundations of Machine Learning / Data Science
    • Historic developments and recent motivations.
    • Concentration Inequalities and Sample Complexity Bounds.
    • Statistical Learning Theory, PAC-Learnability, related theorems.
    • Rademacher Complexity, Vapnik–Chervonenkis Dimension.
    • No-Free-Lunch Theorems.
    • High Dimensional Probability and Statistics.
    • Optimization theory and practice.
  • Supervised learning
    • Linear methods for regression and classification.
    • Model selection and bias-variance trade-offs.
    • Support vector machines.
    • Kernel methods.
    • Parametric vs non-parametric regression.
    • Neural network methods: deep learning.
    • Convolutional Neural Networks (CNNs).
    • Recurrent Neural Networks (RNNs).
  • Unsupervised learning
    • Clustering methods.
    • Kernel principal component analysis and related methods.
    • Manifold learning.
    • Neural network methods.
    • Autoencoders (AEs).
    • Generative Adversarial Networks (GANs).
  • Additional topics
    • Stochastic approximation and optimization.
    • Variational inference.
    • Generative Methods: GANs, AEs.
    • Graphical models.
    • Randomized numerical linear algebra approximations.
    • Dimensionality reduction.

Prerequisites:

Calculus, linear algebra, and ideally some experience programming.

Grading:

The grade for this special topics class will be based primarily on participation, along with specific homework assignments and a final project.

Supplemental Materials:

Additional Information

