Winter 2018 - Seminar in Applied Mathematics

The UCSB student chapter of the Society for Industrial and Applied Mathematics (SIAM) will host a weekly, interdisciplinary seminar to provide a forum for graduate students to discuss their research in an informal setting. This is a great opportunity to polish your presentation skills in front of a general (technical) audience.

If you are interested in giving a talk during this seminar, please send an email to siam@math.ucsb.edu with a potential title, an abstract, and the date you would like to speak.


All are welcome to attend; cookies and coffee will be provided.


When: Mondays, 2:00-3:00 PM

Where: South Hall 6635

Winter 2018 Schedule

January 17th

Organizational Meeting

January 22nd

Speaker: Chris Gorman (Mathematics)

Title: Fast Algorithms for MSN Interpolation

Fast algorithms in numerical linear algebra frequently arise from exploiting matrix structure. Familiar examples include multiplication by the discrete Fourier matrix (via the Fast Fourier Transform) and factoring a tridiagonal matrix. This talk focuses on linear systems that arise from interpolation using the Minimum Sobolev Norm (MSN), which reduces to computing the LQ factorization of a Vandermonde system. When interpolating at Chebyshev nodes, there is considerable structure that can be exploited, allowing for fast algorithms. We present examples in one and two dimensions, interpolating function and derivative information.
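As a rough preview of the least-norm idea behind MSN interpolation (the fast, structured LQ-based solvers are the subject of the talk), here is a dense, non-fast sketch in Python. The node count, the weight exponent s, and the use of NumPy's minimum-norm least-squares routine are illustrative choices, not the speaker's algorithm.

import numpy as np

# Illustrative (dense) sketch: fit a degree-(m-1) Chebyshev expansion through
# n < m samples by minimizing a Sobolev-style weighted coefficient norm
# ||D a||_2 subject to the interpolation constraints V a = f.

def chebyshev_vandermonde(x, m):
    """Columns are Chebyshev polynomials T_0, ..., T_{m-1} evaluated at x."""
    V = np.zeros((len(x), m))
    V[:, 0] = 1.0
    if m > 1:
        V[:, 1] = x
    for k in range(2, m):
        V[:, k] = 2.0 * x * V[:, k - 1] - V[:, k - 2]
    return V

n, m, s = 10, 40, 2.0                                   # samples, coefficients, smoothness weight (illustrative)
x = np.cos((2 * np.arange(n) + 1) * np.pi / (2 * n))    # Chebyshev nodes of the first kind
f = np.exp(x) * np.sin(3 * x)                           # sample data

V = chebyshev_vandermonde(x, m)
D = np.diag((1.0 + np.arange(m)) ** s)                  # diagonal Sobolev-style weights

# Minimize ||D a|| subject to V a = f: substitute b = D a and take the
# minimum-norm least-squares solution of (V D^{-1}) b = f.
b, *_ = np.linalg.lstsq(V @ np.linalg.inv(D), f, rcond=None)
a = np.linalg.solve(D, b)

print(np.max(np.abs(V @ a - f)))                        # interpolation residual at the nodes

The dense solve above costs O(n m^2); the point of the talk is that the Chebyshev structure lets the underlying LQ factorization be computed much faster.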

January 29th

Speaker: Christian Bueno (Mathematics)

Title: Topological Vector Space Safari

Hilbert spaces and Banach spaces are frequently encountered environments that dot the mathematical landscape, both pure and applied. Occasionally one ventures beyond these familiar locales to explore more exotic areas such as weak-* topologies, Fréchet spaces, or the land of tempered distributions. Unfortunately, many are lost along the way. Though their definitions vary, all of these are unified under the banner of topological vector spaces, and in fact the above creatures all belong to the same family: the locally convex spaces. On this safari we'll learn what a locally convex (or not) topological vector space is, how to spot one, and how to tame your own! It is recommended to bring your point-set topology, linear algebra, and bug spray. A strong analysis background would be great, but it is certainly not necessary for survival.
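For those who want a head start, one standard textbook formulation of the central definition (included here only as a preview, not as part of the talk) is:

A topological vector space $X$ is \emph{locally convex} if its topology is generated
by a family of seminorms $\{p_\alpha\}_{\alpha \in A}$, i.e.\ the sets
\[
  U_{\alpha_1,\dots,\alpha_k;\,\varepsilon}(x_0)
  = \{\, x \in X : p_{\alpha_i}(x - x_0) < \varepsilon \ \text{for } i = 1,\dots,k \,\},
  \qquad \varepsilon > 0,
\]
form a neighborhood base at each $x_0 \in X$. Equivalently, $0$ has a neighborhood
base of convex sets. Banach and Hilbert spaces are the special case of a single norm,
while Fr\'echet spaces arise from a countable separating family of seminorms together
with completeness.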

February 5th

Speaker: Jay Roberts (Mathematics)

Title: A Primer on Deep Learning

Neural networks have become a staple of machine learning in recent years. Deep Convolutional Networks in particular have revolutionized image recognition and processing. In this talk I will introduce the basics of neural networks and the challenges that arise in deep learning. Some results from the mathematical theory of Deep Convolutional Networks will be covered, as well as recent work using ODEs to design new network architectures.
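As a small warm-up for the ODE connection mentioned above, here is a minimal sketch of a dense layer and of a residual block read as one forward-Euler step of an ODE. The weights, the ReLU nonlinearity, and the step size h are arbitrary illustrations, not the architectures discussed in the talk.

import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

# A plain fully connected layer: affine map followed by a nonlinearity.
def dense_layer(x, W, b):
    return relu(W @ x + b)

# A residual block viewed as one forward-Euler step of dx/dt = f(x, theta):
#   x_{k+1} = x_k + h * f(x_k, theta_k).
def residual_block(x, W, b, h=0.1):
    return x + h * relu(W @ x + b)

d = 8
x = rng.standard_normal(d)
W1, b1 = rng.standard_normal((d, d)), rng.standard_normal(d)
W2, b2 = rng.standard_normal((d, d)), rng.standard_normal(d)

y = dense_layer(x, W1, b1)        # one plain layer
z = residual_block(y, W2, b2)     # one ODE-style residual step
print(z.shape)

Reading deep residual networks as discretized ODEs is what lets ODE theory (e.g. stability of the flow) inform the design of new architectures.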

February 12th

Speaker: Christian Bueno (Mathematics)

Title: Neural Networks and the Universal Approximation Theorem

Why are neural networks able to describe such a wide variety of phenomena? The Universal Approximation Theorem (UAT) provides a partial answer by addressing what kinds of functions these networks can represent. In this talk we will carefully introduce neural networks, try to gain some intuition for why they are so flexible, and ultimately work through Cybenko's classic proof of the UAT.
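For reference, one common formulation of Cybenko's 1989 theorem (the statement the talk will work toward) is:

Let $\sigma$ be a continuous sigmoidal function, i.e.\ $\sigma(t) \to 1$ as
$t \to +\infty$ and $\sigma(t) \to 0$ as $t \to -\infty$. Then finite sums of the form
\[
  G(x) = \sum_{j=1}^{N} \alpha_j \, \sigma\!\bigl(w_j^{\mathsf{T}} x + \theta_j\bigr),
  \qquad \alpha_j, \theta_j \in \mathbb{R},\ w_j \in \mathbb{R}^n,
\]
are dense in $C([0,1]^n)$ with respect to the supremum norm: for every
$f \in C([0,1]^n)$ and every $\varepsilon > 0$ there is such a $G$ with
$\sup_{x \in [0,1]^n} |f(x) - G(x)| < \varepsilon$.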

February 19th

No Seminar

February 26th

Speaker: TBA (TBA)

Title: TBA

TBA

March 5th

Speaker: TBA (TBA)

Title: TBA

TBA

March 12th

Speaker: TBA (TBA)

Title: TBA

TBA