Paul J. Atzberger

Description

Welcome to the class website for Special Topics in Machine Learning. This seminar covers select topics in machine learning concerning mathematical foundations and practical computational aspects, along with applications arising in recent data-driven fields, the natural sciences, and engineering. The class will also cover select advanced topics in deep learning and dimension reduction. Please check back for updates, as the course website will be revised periodically.

We plan to use the following books:

  • The Elements of Statistical Learning: Data Mining, Inference, and Prediction, by Trevor Hastie, Robert Tibshirani, and Jerome Friedman.
  • Foundations of Machine Learning, by Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar.

The course will also draw on recent papers from the literature and special lecture materials prepared by the instructor. Topics may be adjusted based on the backgrounds and interests of the class.

Syllabus [PDF]

Topics

  • Introduction
    • Historic developments and recent motivations
    • Statistical Learning Theory, PAC-Learnability
    • Concentration Inequalities and Sample Complexity Bounds
    • Rademacher Complexity, Vapnik–Chervonenkis Dimension
    • No-Free-Lunch Theorems
    • Motivating applications
  • Supervised learning
    • Linear methods for regression and classification
    • Parametric vs non-parametric regression
    • Kernel methods, Mercer's Theorem, Reproducing Kernel Hilbert Spaces
    • Model selection and bias-variance trade-offs
    • Support vector machines
    • Graphical models
    • Neural networks
  • Unsupervised learning
    • Clustering methods
    • Principal component analysis and related methods (see the sketch after this list)
    • Diffusion maps
    • Manifold learning
  • Advanced Topics
    • Non-linear optimization methods for machine learning
    • Theory for design of deep architectures
    • Regularization and stochastic gradient descent
    • Variational inference
    • Sparse matrix methods
    • Dimensionality reduction
    • Markov chain Monte Carlo sampling for posterior distributions
    • Computational architectures for machine learning
    • Example applications
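
As a concrete illustration of one of the unsupervised-learning topics above, the following is a minimal sketch of principal component analysis computed via the singular value decomposition in NumPy. The random toy data and the choice of two retained components are hypothetical, for illustration only.

    import numpy as np

    # Hypothetical toy data: 200 samples in 5 dimensions.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))

    # Center the data so the SVD recovers directions of maximal variance.
    Xc = X - X.mean(axis=0)

    # Thin SVD: the rows of Vt are the principal directions.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

    k = 2                      # number of components to retain
    Z = Xc @ Vt[:k].T          # project onto the top-k principal directions

    # Fraction of total variance captured by the top-k components.
    explained = (S[:k] ** 2).sum() / (S ** 2).sum()
    print(Z.shape, float(explained))

Here the squared singular values give the variance captured along each principal direction, one of the connections between linear algebra and the dimensionality-reduction topics listed above.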

Prerequisites:

Calculus, linear algebra, and ideally some experience programming.

Grading:

The grade for this special topics class will be based primarily on participation, along with specific homework assignments and a final project.

Supplemental Materials:

  • Python tutorial at Codecademy.
  • Python 2.7 documentation.
  • Enthought Canopy integrated analysis environment.
  • NumPy tutorial (see the short sanity-check sketch after this list).
  • Anaconda tutorial.
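
As a quick way to verify that a Python environment from any of the distributions above is working, the following minimal sketch solves a small linear system with NumPy and checks the result; the matrix and right-hand side are arbitrary example values.

    import numpy as np

    # Solve A x = b for a small 2x2 system and confirm the solution.
    A = np.array([[3.0, 1.0],
                  [1.0, 2.0]])
    b = np.array([9.0, 8.0])

    x = np.linalg.solve(A, b)     # expected: [2., 3.]

    print(x)
    print(np.allclose(A @ x, b))  # True if the solve succeeded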

Additional Materials

  • You can find additional materials on my related data science course websites:
    • [Course MATH CS 120: Fall 2018]
    • [Course MATH 260J: Fall 2018]
    • [Course MATH 260: Fall 2017]

