# All Seminars

Title: A limit theorem in optimal transportation theory
Seminar: Analysis and Differential Geometry
Speaker: Professor Gershon Wolansky of Technion - Israel Institute of Technology
Contact: Vladimir Oliker, oliker@emory.edu
Date: 2020-02-11 at 4:00PM
Venue: MSC W301
Abstract:
I’ll review the fundamental theory of optimal transportation and define the notion of a conditional Wasserstein metric on the set of probability measures. If time permits, I’ll discuss various applications to control, regularity of flows, dynamics, and the design of optimal networks.
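For readers new to the subject, the standard quadratic Wasserstein metric on probability measures with finite second moment, which the conditional variant of the talk refines, can be written as:

```latex
W_2(\mu, \nu) \;=\;
\left( \inf_{\pi \in \Pi(\mu,\nu)}
\int_{\mathbb{R}^d \times \mathbb{R}^d} |x - y|^2 \, d\pi(x,y) \right)^{1/2},
```

where $\Pi(\mu,\nu)$ denotes the set of couplings of $\mu$ and $\nu$, i.e. probability measures on the product space with marginals $\mu$ and $\nu$.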
Title: Moduli spaces in computer vision
Colloquium: Algebra and Number Theory
Speaker: Max Lieblich of University of Washington
Contact: David Zureick-Brown, david.zureick-brown@emory.edu
Date: 2020-02-10 at 2:30PM
Venue: Mathematics and Science Center: MSC E208
Abstract:
Moduli theory is one of the cornerstones of algebraic geometry. The underlying idea of the theory is that, given a class of mathematical objects, one can often find a universal space parametrizing those objects, and the geometry of this space gives us insight into the objects being parametrized. After introducing moduli theory with some basic classical examples, I will discuss recent applications to computer vision. As it turns out, the roots of computer vision are tightly intertwined with classical projective geometry. I will present the early history and basic geometric problems of computer vision, and then I will talk about how modern methods give us deeper insight into these problems, including new understandings of core algorithms that are used billions of times a day all over the planet.
Title: The foundation of a matroid
Colloquium: Algebra and Number Theory
Speaker: Matt Baker of Georgia Institute of Technology
Contact: David Zureick-Brown, david.zureick.brown@gmail.com
Date: 2020-02-05 at 4:00PM
Venue: MSC W303
Abstract:
Originally introduced independently by Hassler Whitney and Takeo Nakasawa, matroids are a combinatorial way of axiomatizing the notion of linear independence in vector spaces. If $K$ is a field and $n$ is a positive integer, any linear subspace of $K^n$ gives rise to a matroid; such matroids are called \textbf{representable} over $K$. Given a matroid $M$, one can ask over which fields $M$ is representable. More generally, one can ask about representability over partial fields in the sense of Semple and Whittle. Pendavingh and van Zwam introduced the \textbf{universal partial field} of a matroid $M$, which governs the representations of $M$ over all partial fields. Unfortunately, most matroids (asymptotically 100\%, in fact) are not representable over any partial field, and in this case, the universal partial field gives no information.
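For orientation, one standard axiomatization (the independence axioms) says a matroid is a pair $M = (E, \mathcal{I})$ with $E$ a finite set and $\mathcal{I} \subseteq 2^E$ a collection of "independent" sets satisfying:

```latex
\textbf{(I1)}\ \emptyset \in \mathcal{I}; \qquad
\textbf{(I2)}\ J \subseteq I \in \mathcal{I} \implies J \in \mathcal{I}; \qquad
\textbf{(I3)}\ I, J \in \mathcal{I},\ |I| > |J| \implies
\exists\, e \in I \setminus J \text{ with } J \cup \{e\} \in \mathcal{I}.
```

The matroid of a subspace $V \subseteq K^n$ takes $E = \{1, \dots, n\}$, with a subset of $E$ independent exactly when the corresponding columns of any matrix whose row space is $V$ are linearly independent.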

Oliver Lorscheid and I have introduced a generalization of the universal partial field which we call the \textbf{foundation} of a matroid. The foundation of $M$ is a type of algebraic object which we call a \textbf{pasture}; pastures include both hyperfields and partial fields. Pastures form a natural class of field-like objects within Lorscheid's theory of ordered blueprints, and they have desirable categorical properties (e.g., existence of products and coproducts) that make them a natural context in which to study algebraic invariants of matroids. The foundation of a matroid $M$ represents the functor taking a pasture $F$ to the set of rescaling equivalence classes of $F$-representations of $M$; in particular, $M$ is representable over a pasture $F$ if and only if there is a homomorphism from the foundation of $M$ to $F$. (In layman's terms, what we're trying to do is recast as much as possible of the theory of matroids and their representations in functorial ``Grothendieck-style'' algebraic geometry, with the goal of gaining new conceptual insights into various phenomena which were previously understood only through lengthy case-by-case analyses and ad hoc computations.)

As a particular application of this point of view, I will explain the classification which Lorscheid and I have recently obtained of all possible foundations for ternary matroids (matroids representable over the field of three elements). The proof of this classification theorem relies crucially on Tutte's celebrated Homotopy Theorem.
Title: An introduction to counting curves arithmetically
Colloquium: Algebra and Number Theory
Speaker: Jesse Kass of University of South Carolina
Contact: David Zureick-Brown, david.zureick-brown@emory.edu
Date: 2020-02-03 at 2:30PM
Venue: Mathematics and Science Center: MSC E208
Abstract:
A long-standing program in algebraic geometry focuses on counting curves in special configurations, such as the 27 lines on a cubic surface or the 3264 conics tangent to five given conics. While many important counting results have been proven purely in the language of algebraic geometry, a major modern discovery is that curve counts can often be interpreted in terms of algebraic topology, and this topological perspective reveals unexpected properties.

One problem in modern curve counting is that classical algebraic topology is only available when working over the real or complex numbers. A successful solution to this problem should produce curve counts over fields like the rational numbers in such a way as to record interesting arithmetic information. My talk will explain how to derive such counts using ideas from A1-homotopy theory. The talk will focus on joint work with Marc Levine, Jake Solomon, and Kirsten Wickelgren including a new result about lines on the cubic surface.
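To give the flavor of such an arithmetic count (stated here from the Kass-Wickelgren result under mild hypotheses on the field $k$; the talk itself is the authoritative source), the lines on a smooth cubic surface $S$ over $k$ are counted by a class in the Grothendieck-Witt ring of quadratic forms:

```latex
\sum_{\text{lines } \ell \subset S} \mathrm{Type}(\ell)
\;=\; 15\langle 1 \rangle + 12\langle -1 \rangle
\;\in\; \mathrm{GW}(k).
```

Taking the rank of this form recovers the classical count of 27 complex lines, while taking the signature over $k = \mathbb{R}$ recovers Segre's signed real count $15 - 12 = 3$.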
Title: Deep Learning meets Modeling: Taking the Best out of Both Worlds
Seminar: Computational Mathematics
Speaker: Gitta Kutyniok, Einstein Chair for Mathematics of TU Berlin
Contact: James Nagy, jnagy@emory.edu
Date: 2020-02-03 at 4:00PM
Venue: MSC N306
Abstract:
Pure model-based approaches are today often insufficient for solving complex inverse problems in imaging. At the same time, we witness the tremendous success of data-based methodologies, in particular, deep neural networks for such problems. However, pure deep learning approaches often neglect known and valuable information from the modeling world.

In this talk, we will provide an introduction to this circle of problems and then focus on the inverse problem of (limited-angle) computed tomography. We will develop a conceptual approach by combining the model-based method of sparse regularization by shearlets with the data-driven method of deep learning. Our solvers are guided by a microlocal analysis viewpoint to pay particular attention to the singularity structures of the data. Finally, we will show that our algorithm significantly outperforms previous methodologies, including methods based entirely on deep learning.
Title: Polynomials vanishing on points in projective space
Colloquium: Algebra and Number Theory
Speaker: Brooke Ullery of Harvard University
Contact: David Zureick-Brown, david.zureick-brown@emory.edu
Date: 2020-01-30 at 4:00PM
Venue: MSC W301
Abstract:
We all know that any two points in the plane lie on a unique line. However, three points will lie on a line only if those points are in a very special position: collinear. More generally if Z is a set of k points in n-space, we can ask what the set of polynomials of degree d in n variables that vanish on all the points of Z looks like. The answer depends not only on the values of k, d, and n but also (as we see in the case of three collinear points) on the geometry of Z. This question, in some form, dates back to at least the 4th century. We will talk about several attempts to answer it throughout history and some surprising connections to modern algebraic geometry.
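The dependence on the geometry of Z can be made concrete by comparing ranks of evaluation matrices of monomials at the points. The sketch below (the coordinates are my own illustrative choices, not from the talk) contrasts three collinear points with three general points in the plane for d = 1:

```python
import numpy as np

def eval_matrix(points):
    # One row per point: the degree-<=1 monomials 1, x, y evaluated there.
    return np.array([[1.0, x, y] for (x, y) in points])

collinear = [(0, 0), (1, 1), (2, 2)]   # all on the line x = y
general   = [(0, 0), (1, 0), (0, 1)]   # not collinear

# Linear polynomials vanishing on Z form the kernel of the evaluation
# matrix, so their dimension is 3 minus its rank.
rank_collinear = np.linalg.matrix_rank(eval_matrix(collinear))
rank_general   = np.linalg.matrix_rank(eval_matrix(general))

print(3 - rank_collinear)  # 1: the polynomial x - y vanishes on all three points
print(3 - rank_general)    # 0: no line passes through all three points
```

The same rank computation with higher-degree monomials detects the special position of Z for any d and n.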
Title: PDE-Principled Trustworthy Deep Learning Meets Computational Biology
Colloquium: Computational Mathematics
Speaker: Bao Wang of University of California, Los Angeles
Contact: James Nagy, jnagy@emory.edu
Date: 2020-01-23 at 3:00PM
Venue: MSC W303
Abstract:
Deep learning achieves tremendous success in image and speech recognition and machine translation. However, deep learning is not trustworthy.

1. How can we improve the robustness of deep neural networks? Deep neural networks are well known to be vulnerable to adversarial attacks. For instance, malicious attacks can fool the Tesla self-driving system by making a tiny change to the scene acquired by the intelligence system.
2. How can we compress high-capacity deep neural networks efficiently without loss of accuracy? The computational cost of inference by deep neural networks is notoriously one of the major bottlenecks for applying them to mobile devices.
3. How can we protect the private information that is used to train a deep neural network? Deep learning-based artificial intelligence systems may leak the private training data. Fredrikson et al. recently showed that a simple model-inversion attack can recover the portraits of the victims whose face images were used to train a face recognition system.
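The vulnerability to adversarial attacks can already be seen for a plain linear classifier. The toy numbers below are my own illustration (not from the talk) of how a small max-norm perturbation in the direction of -sign(w) flips a prediction:

```python
import numpy as np

# A toy linear classifier: predict +1 if w.x > 0, else -1.
w = np.array([2.0, -3.0])
x = np.array([1.0, 0.5])       # score = 2*1 - 3*0.5 = 0.5, so class +1

# Worst-case (FGSM-style) perturbation of max-norm epsilon for this model:
# move each coordinate against the sign of the corresponding weight.
epsilon = 0.3
delta = -epsilon * np.sign(w)   # pushes the score down as fast as possible
x_adv = x + delta

score, score_adv = w @ x, w @ x_adv
print(score)      # 0.5  -> class +1
print(score_adv)  # -1.0 -> class -1: flipped by a perturbation of size 0.3
```

For deep networks the same construction uses the sign of the input gradient of the loss instead of sign(w), and the perturbation can be visually imperceptible.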

In this talk, I will present some recent work on developing PDE-principled robust neural architecture and optimization algorithms for robust, accurate, private, and efficient deep learning. I will also present some potential applications of the data-driven approach for bio-molecule simulation and computer-aided drug design.
Title: Gradient Flows: From PDE to Data Analysis
Colloquium: Computational Mathematics
Speaker: Franca Hoffmann of California Institute of Technology
Contact: James Nagy, jnagy@emory.edu
Date: 2020-01-21 at 3:00PM
Venue: MSC W303
Abstract:
Certain diffusive PDEs can be viewed as infinite-dimensional gradient flows. This fact has led to the development of new tools in various areas of mathematics ranging from PDE theory to data science. In this talk, we focus on two different directions: model-driven approaches and data-driven approaches.
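The prototypical instance of the first sentence, due to Jordan, Kinderlehrer, and Otto, is that the heat equation is the gradient flow of the Boltzmann entropy with respect to the quadratic Wasserstein metric:

```latex
\partial_t \rho = \Delta \rho
\quad \text{is the } W_2\text{-gradient flow of} \quad
\mathcal{E}(\rho) = \int_{\mathbb{R}^d} \rho \log \rho \, dx .
```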

In the first part of the talk we use gradient flows for analyzing non-linear and non-local aggregation-diffusion equations when the corresponding energy functionals are not necessarily convex. Moreover, the gradient flow structure enables us to make connections to well-known functional inequalities, revealing possible links between the optimizers of these inequalities and the equilibria of certain aggregation-diffusion PDEs.

In the second part, we use and develop gradient flow theory to design novel tools for data analysis. We draw a connection between gradient flows and Ensemble Kalman methods for parameter estimation. We introduce the Ensemble Kalman Sampler - a derivative-free methodology for model calibration and uncertainty quantification in expensive black-box models. The interacting particle dynamics underlying our algorithm can be approximated by a novel gradient flow structure in a modified Wasserstein metric which reflects particle correlations. The geometry of this modified Wasserstein metric is of independent theoretical interest.
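As a minimal sketch of the derivative-free interacting-particle idea, the code below implements the basic ensemble Kalman inversion update, a deterministic relative of the Ensemble Kalman Sampler described above (the sampler adds a noise term so that the ensemble targets the posterior). The toy linear forward map and all parameter choices are my own assumptions, not the speaker's code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear forward map G(u) = A u and noiseless data y = A u_true.
A = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
u_true = np.array([1.0, -1.0])
y = A @ u_true
Gamma = 1e-4 * np.eye(3)        # assumed observational noise covariance

# Ensemble of J particles drawn from a standard normal prior.
J = 50
U = rng.standard_normal((J, 2))

for _ in range(30):
    G = U @ A.T                              # forward map on each particle
    Ubar, Gbar = U.mean(axis=0), G.mean(axis=0)
    Cup = (U - Ubar).T @ (G - Gbar) / J      # parameter-output cross-covariance
    Cpp = (G - Gbar).T @ (G - Gbar) / J      # output covariance
    K = Cup @ np.linalg.inv(Cpp + Gamma)     # ensemble Kalman gain
    U = U + (y - G) @ K.T                    # derivative-free update: no gradients of G

print(U.mean(axis=0))  # close to u_true = [1, -1]
```

Note that only evaluations of the forward map enter the update, which is what makes the method attractive for expensive black-box models.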
Title: Comparison principles for stochastic heat equations
Seminar: PDE Seminar
Speaker: Le Chen of Emory University
Date: 2020-01-17 at 1:00PM
Venue: MSC E408
Abstract:
The stochastic heat equation is a canonical model that is related to many models in mathematical physics, mathematical biology, particle systems, etc. It usually takes the following form: $\left(\frac{\partial }{\partial t} -\frac{1}{2}\Delta \right) u(t,x) = \rho(u(t,x)) \:\dot{M}(t,x), \qquad u(0,\cdot) =\mu, \qquad t>0, \: x\in \mathbb{R}^d,$ where $\mu$ is the initial data, $\dot{M}$ is a spatially homogeneous Gaussian noise that is white in time, and $\rho$ is a Lipschitz continuous function.

In this talk, we will study a particular set of properties of this equation --- the comparison principles, which include both {\it sample-path comparison} and {\it stochastic comparison principles}. These results are obtained for general initial data and under the weakest possible requirement on the correlation function --- Dalang's condition, namely, $\int_{\mathbb{R}^d}(1+|\xi|^2)^{-1}\hat{f}(d\xi)<\infty$, where $\hat{f}$ is the spectral measure of the noise. For the sample-path comparison, one can compare solutions pathwisely with respect to different initial conditions $\mu$, while for the stochastic comparison, one can compare certain functionals of the solutions either with respect to different diffusion coefficients $\rho$ or different correlation functions of the noise $f$. This talk is based on some joint works with Jingyu Huang and Kunwoo Kim.
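A standard sanity check of Dalang's condition (a textbook example, not specific to this talk): for space-time white noise the spatial correlation is $f = \delta_0$, so $\hat{f}$ is Lebesgue measure, and

```latex
\int_{\mathbb{R}^d} \frac{d\xi}{1 + |\xi|^2}
\;\begin{cases}
= \pi < \infty, & d = 1, \\
= \infty, & d \ge 2,
\end{cases}
```

so the equation driven by space-time white noise falls under this framework only in dimension one; in higher dimensions one needs spatially colored noise.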
Title: Geometry + Optimization: towards computational anatomy
Colloquium: Computational Mathematics
Speaker: Shahar Kovalsky of Duke University
Contact: James Nagy, jnagy@emory.edu
Date: 2020-01-16 at 3:00PM
Venue: MSC W303
Abstract:
Geometric shape-processing lies at the heart of various branches of science: from finite element simulation in engineering, through animation of virtual avatars, to applications such as the analysis of anatomical variations, or detection of structural anomalies in medicine and biology. The demand for such computational approaches in geometry is constantly growing, as 3-dimensional data becomes readily available and is integrated into various everyday uses.

I will begin my talk with a brief overview of optimization-based approaches for geometric problems, such as identifying pointwise correspondences between exemplars in a collection of shapes, or deforming shapes to satisfy prescribed constraints in a least-distorting manner. After discussing some of the theoretical and computational challenges arising in these optimization problems, I will focus on large-scale geometric problems and efficient first- and second-order algorithms for their optimization. Then, motivated by anatomical shape analysis, I will show applications of these computational approaches for shape characterization and comparison in evolutionary anthropology.

I will finish by briefly presenting two related but tangential works: a theoretical one on the characterization of planar harmonic maps into non-convex domains, and a clinical one on the prediction of thyroid cancer from biopsy images.