All Seminars
Title: Wave decay for star-shaped waveguides
Seminar: Analysis and PDE
Speaker: Kiril Datchev of Purdue
Contact: Yiran Wang, yiran.wang@emory.edu
Date: 2020-02-20 at 2:30PM
Venue: MSC N306
Abstract: Let $\Omega \subset \mathbb R^d$ be an unbounded open set. We wish to understand how decay of solutions to the wave equation on $\Omega$ is related to the geometry of $\Omega$.

When $\mathbb R^d \setminus \Omega$ is bounded, this is the celebrated obstacle scattering problem. Then a particularly favorable geometric assumption, going back to the original work of Morawetz, is that the obstacle is star-shaped. We adapt this assumption to the study of waveguides, which are domains bounded in some directions and unbounded in others, such as tubes or wires. We prove sharp wave decay rates for various waveguides, including the example of a disk removed from a straight planar waveguide, that is to say $\Omega = ((-1,1) \times \mathbb R) \setminus D$, where $D$ is a closed disk contained in $(-1,1) \times \mathbb R$. Our results are based on establishing estimates and pole-free regions for the resolvent of the Laplacian near the continuous spectrum.

This talk is based on joint work with Tanya Christiansen.
Title: A limit theorem in optimal transportation theory
Seminar: Analysis and Differential Geometry
Speaker: Professor Gershon Wolansky of Technion - Israel Institute of Technology
Contact: Vladimir Oliker, oliker@emory.edu
Date: 2020-02-11 at 4:00PM
Venue: MSC W301
Abstract: I’ll review the fundamental theory of optimal transportation and define the notion of conditional Wasserstein metric on the set of probability measures. If time permits, I’ll discuss various applications to control, regularity of flows, dynamics, and design of optimal networks.
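The conditional Wasserstein metric of the talk is a refinement of the classical one. As background, the classical Wasserstein-1 distance between two empirical measures on the line with equally many equally weighted atoms reduces to matching sorted samples, since the optimal coupling in one dimension is monotone. A minimal sketch (function name and example values are mine, not from the talk):

```python
import numpy as np

def wasserstein1_empirical(xs, ys):
    """W1 distance between two empirical measures with the same number of
    equally weighted atoms on the line: the optimal coupling is monotone,
    so it suffices to match sorted samples."""
    return np.mean(np.abs(np.sort(xs) - np.sort(ys)))

# Translating a measure by c moves it exactly distance c in W1:
a = np.array([0.0, 1.0, 2.0])
print(wasserstein1_empirical(a, a + 3.0))  # 3.0
```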
Title: Moduli spaces in computer vision
Colloquium: Algebra and Number Theory
Speaker: Max Lieblich of the University of Washington
Contact: David Zureick-Brown, david.zureick-brown@emory.edu
Date: 2020-02-10 at 2:30PM
Venue: Mathematics and Science Center: MSC E208
Abstract: Moduli theory is one of the cornerstones of algebraic geometry. The underlying idea of the theory is that, given a class of mathematical objects, one can often find a universal space parametrizing those objects, and the geometry of this space gives us insight into the objects being parametrized. After introducing moduli theory with some basic classical examples, I will discuss recent applications to computer vision. As it turns out, the roots of computer vision are tightly intertwined with classical projective geometry. I will present the early history and basic geometric problems of computer vision, and then I will talk about how modern methods give us deeper insight into these problems, including new understandings of core algorithms that are used billions of times a day all over the planet.
Title: The foundation of a matroid
Colloquium: Algebra and Number Theory
Speaker: Matt Baker of Georgia Institute of Technology
Contact: David Zureick-Brown, david.zureick.brown@gmail.com
Date: 2020-02-05 at 4:00PM
Venue: MSC W303
Abstract: Originally introduced independently by Hassler Whitney and Takeo Nakasawa, matroids are a combinatorial way of axiomatizing the notion of linear independence in vector spaces. If $K$ is a field and $n$ is a positive integer, any linear subspace of $K^n$ gives rise to a matroid; such matroids are called \textbf{representable} over $K$. Given a matroid $M$, one can ask over which fields $M$ is representable. More generally, one can ask about representability over partial fields in the sense of Semple and Whittle. Pendavingh and van Zwam introduced the \textbf{universal partial field} of a matroid $M$, which governs the representations of $M$ over all partial fields. Unfortunately, most matroids (asymptotically 100\%, in fact) are not representable over any partial field, and in this case the universal partial field gives no information.

Oliver Lorscheid and I have introduced a generalization of the universal partial field which we call the \textbf{foundation} of a matroid. The foundation of $M$ is a type of algebraic object which we call a \textbf{pasture}; pastures include both hyperfields and partial fields. Pastures form a natural class of field-like objects within Lorscheid's theory of ordered blueprints, and they have desirable categorical properties (e.g., existence of products and coproducts) that make them a natural context in which to study algebraic invariants of matroids. The foundation of a matroid $M$ represents the functor taking a pasture $F$ to the set of rescaling equivalence classes of $F$-representations of $M$; in particular, $M$ is representable over a pasture $F$ if and only if there is a homomorphism from the foundation of $M$ to $F$.

(In layman's terms, what we're trying to do is recast as much as possible of the theory of matroids and their representations in functorial ``Grothendieck-style'' algebraic geometry, with the goal of gaining new conceptual insights into various phenomena which were previously understood only through lengthy case-by-case analyses and ad hoc computations.) As a particular application of this point of view, I will explain the classification which Lorscheid and I have recently obtained of all possible foundations for ternary matroids (matroids representable over the field of three elements). The proof of this classification theorem relies crucially on Tutte's celebrated Homotopy Theorem.
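As a toy illustration of representability (this example is mine, not from the abstract): the uniform matroid $U_{2,4}$, with four elements of rank 2 and any two elements independent, is representable over GF(3) but not over GF(2), because a representation needs four pairwise independent nonzero vectors in the plane and the projective line over GF(q) has only q+1 points. A brute-force check:

```python
from itertools import combinations, product

def representable_U24(q):
    """Can the uniform matroid U_{2,4} be represented over GF(q), q prime?
    We need 4 nonzero vectors in GF(q)^2 with every pair linearly
    independent, i.e. every 2x2 determinant nonzero mod q."""
    vecs = [v for v in product(range(q), repeat=2) if v != (0, 0)]
    return any(
        all((a * d - b * c) % q != 0
            for (a, b), (c, d) in combinations(quad, 2))
        for quad in combinations(vecs, 4)
    )

print(representable_U24(2))  # False: the projective line over GF(2) has only 3 points
print(representable_U24(3))  # True:  e.g. (1,0), (0,1), (1,1), (1,2) work
```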
Title: An introduction to counting curves arithmetically
Colloquium: Algebra and Number Theory
Speaker: Jesse Kass of the University of South Carolina
Contact: David Zureick-Brown, david.zureick-brown@emory.edu
Date: 2020-02-03 at 2:30PM
Venue: Mathematics and Science Center: MSC E208
Abstract: A long-standing program in algebraic geometry focuses on counting the number of curves in special configurations, such as the lines on a cubic surface (27) or the conic curves tangent to 5 given conics (3264). While many important counting results have been proven purely in the language of algebraic geometry, a major modern discovery is that curve counts can often be interpreted in terms of algebraic topology, and this topological perspective reveals unexpected properties. One problem in modern curve counting is that classical algebraic topology is only available when working over the real or complex numbers. A successful solution to this problem should produce curve counts over fields like the rational numbers in such a way as to record interesting arithmetic information. My talk will explain how to derive such counts using ideas from $\mathbb{A}^1$-homotopy theory. The talk will focus on joint work with Marc Levine, Jake Solomon, and Kirsten Wickelgren, including a new result about lines on the cubic surface.
Title: Deep Learning Meets Modeling: Taking the Best out of Both Worlds
Seminar: Computational Mathematics
Speaker: Gitta Kutyniok, Einstein Chair for Mathematics, TU Berlin
Contact: James Nagy, jnagy@emory.edu
Date: 2020-02-03 at 4:00PM
Venue: MSC N306
Abstract: Pure model-based approaches are today often insufficient for solving complex inverse problems in imaging. At the same time, we witness the tremendous success of data-based methodologies, in particular deep neural networks, for such problems. However, pure deep learning approaches often neglect known and valuable information from the modeling world. In this talk, we will provide an introduction to this circle of problems and then focus on the inverse problem of (limited-angle) computed tomography. We will develop a conceptual approach by combining the model-based method of sparse regularization by shearlets with the data-driven method of deep learning. Our solvers are guided by a microlocal analysis viewpoint to pay particular attention to the singularity structures of the data. Finally, we will show that our algorithm significantly outperforms previous methodologies, including methods entirely based on deep learning.
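The model-based ingredient, sparse regularization, is classically solved by iterative soft-thresholding (ISTA). The generic numpy sketch below (not the shearlet-based solver of the talk; matrix and parameters are mine) minimizes $\tfrac12\|Ax-b\|^2 + \lambda\|x\|_1$:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=200):
    """Iterative soft-thresholding for 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - step * A.T @ (A @ x - b), step * lam)
    return x

# Toy check: with A = I the iteration reduces to soft-thresholding b once,
# so small entries of b are zeroed out and large ones shrink by lam.
x = ista(np.eye(3), np.array([3.0, 0.05, -2.0]), lam=0.1)
# x is approximately [2.9, 0, -1.9]
```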
Title: Polynomials vanishing on points in projective space
Colloquium: Algebra and Number Theory
Speaker: Brooke Ullery of Harvard University
Contact: David Zureick-Brown, david.zureick-brown@emory.edu
Date: 2020-01-30 at 4:00PM
Venue: MSC W301
Abstract: We all know that any two points in the plane lie on a unique line. However, three points will lie on a line only if those points are in a very special position: collinear. More generally, if $Z$ is a set of $k$ points in $n$-space, we can ask what the set of polynomials of degree $d$ in $n$ variables that vanish on all the points of $Z$ looks like. The answer depends not only on the values of $k$, $d$, and $n$, but also (as we see in the case of three collinear points) on the geometry of $Z$. This question, in some form, dates back to at least the 4th century. We will talk about several attempts to answer it throughout history and some surprising connections to modern algebraic geometry.
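The dependence on the geometry of $Z$ can be seen with plain linear algebra: polynomials of degree at most $d$ vanishing on $Z$ form the nullspace of the matrix of monomials evaluated at the points. A small numpy sketch of the affine plane case (function name and example points are mine):

```python
import numpy as np

def vanishing_space_dim(points, d):
    """Dimension of the space of polynomials of degree <= d in x, y
    vanishing at every point of Z (affine analogue of the question)."""
    # monomials x^i y^j with i + j <= d
    monos = [(i, j) for i in range(d + 1) for j in range(d + 1 - i)]
    # evaluation matrix: one row per point, one column per monomial
    M = np.array([[float(px ** i * py ** j) for (i, j) in monos]
                  for (px, py) in points])
    # vanishing polynomials form the nullspace of M
    return len(monos) - np.linalg.matrix_rank(M)

collinear = [(0, 0), (1, 1), (2, 2)]
generic = [(0, 0), (1, 0), (0, 1)]
print(vanishing_space_dim(collinear, 1))  # 1: the line y = x
print(vanishing_space_dim(generic, 1))    # 0: no line through all three
```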
Title: PDE-Principled Trustworthy Deep Learning Meets Computational Biology
Colloquium: Computational Mathematics
Speaker: Bao Wang of the University of California, Los Angeles
Contact: James Nagy, jnagy@emory.edu
Date: 2020-01-23 at 3:00PM
Venue: MSC W303
Abstract: Deep learning achieves tremendous success in image and speech recognition and machine translation. However, deep learning is not trustworthy.

1. How to improve the robustness of deep neural networks? Deep neural networks are well known to be vulnerable to adversarial attacks. For instance, malicious attacks can fool the Tesla self-driving system by making a tiny change to the scene acquired by the intelligence system.
2. How to compress high-capacity deep neural networks efficiently without loss of accuracy? It is notorious that the computational cost of inference by deep neural networks is one of the major bottlenecks for applying them to mobile devices.
3. How to protect the private information that is used to train a deep neural network? Deep learning-based artificial intelligence systems may leak the private training data. Fredrikson et al. recently showed that a simple model-inversion attack can recover the portraits of the victims whose face images are used to train the face recognition system.

In this talk, I will present some recent work on developing PDE-principled robust neural architectures and optimization algorithms for robust, accurate, private, and efficient deep learning. I will also present some potential applications of the data-driven approach to bio-molecule simulation and computer-aided drug design.
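The fragility behind the first question can be seen already for a linear model. Below is a minimal fast-gradient-sign-style sketch on logistic regression (the model, weights, and numbers are mine for illustration, not from the talk): a perturbation of size eps in the sign of the input gradient measurably lowers the model's confidence in the correct label.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm(x, w, y, eps):
    """Fast gradient sign attack on a logistic model p(y=1|x) = sigmoid(w.x).
    The gradient of the cross-entropy loss w.r.t. x is (p - y) * w."""
    grad_x = (sigmoid(w @ x) - y) * w
    return x + eps * np.sign(grad_x)

w = np.array([2.0, -1.0])
x = np.array([1.0, 0.5])          # clean input with true label y = 1
p_clean = sigmoid(w @ x)          # confident correct prediction
x_adv = fgsm(x, w, y=1, eps=0.3)  # small, targeted perturbation
p_adv = sigmoid(w @ x_adv)        # confidence drops
print(p_clean > p_adv)            # True
```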
Title: Gradient Flows: From PDE to Data Analysis
Colloquium: Computational Mathematics
Speaker: Franca Hoffmann of the California Institute of Technology
Contact: James Nagy, jnagy@emory.edu
Date: 2020-01-21 at 3:00PM
Venue: MSC W303
Abstract: Certain diffusive PDEs can be viewed as infinite-dimensional gradient flows. This fact has led to the development of new tools in various areas of mathematics ranging from PDE theory to data science. In this talk, we focus on two different directions: model-driven approaches and data-driven approaches.

In the first part of the talk we use gradient flows for analyzing non-linear and non-local aggregation-diffusion equations when the corresponding energy functionals are not necessarily convex. Moreover, the gradient flow structure enables us to make connections to well-known functional inequalities, revealing possible links between the optimizers of these inequalities and the equilibria of certain aggregation-diffusion PDEs.

In the second part, we use and develop gradient flow theory to design novel tools for data analysis. We draw a connection between gradient flows and Ensemble Kalman methods for parameter estimation. We introduce the Ensemble Kalman Sampler, a derivative-free methodology for model calibration and uncertainty quantification in expensive black-box models. The interacting particle dynamics underlying our algorithm can be approximated by a novel gradient flow structure in a modified Wasserstein metric which reflects particle correlations. The geometry of this modified Wasserstein metric is of independent theoretical interest.
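This is not the Ensemble Kalman Sampler itself (which adds carefully scaled noise and carries the modified Wasserstein gradient-flow structure), but its deterministic derivative-free core can be sketched: each particle moves along the empirical parameter-output cross-covariance weighted by its own data misfit, so no gradients of the forward map are needed. All names and numbers below are mine:

```python
import numpy as np

def eki_step(theta, G, y, dt=0.1):
    """One derivative-free ensemble Kalman update: move each particle
    along the empirical cross-covariance between parameters and model
    outputs, scaled by that particle's data misfit. No gradients of G."""
    out = np.array([G(t) for t in theta])              # shape (J, n_obs)
    C = (theta - theta.mean(0)).T @ (out - out.mean(0)) / len(theta)
    return theta - dt * (out - y) @ C.T

# Toy linear inverse problem: recover truth from y = A @ truth.
rng = np.random.default_rng(0)
A = np.array([[1.0, 0.0], [0.0, 2.0]])
truth = np.array([1.0, -1.0])
y = A @ truth
theta = rng.normal(size=(20, 2))                       # ensemble of J = 20 particles
misfit0 = np.linalg.norm(A @ theta.mean(0) - y)
for _ in range(200):
    theta = eki_step(theta, lambda t: A @ t, y)
misfit = np.linalg.norm(A @ theta.mean(0) - y)
# misfit is much smaller than misfit0: the ensemble mean approaches truth
```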
Title: Comparison principles for stochastic heat equations
Seminar: PDE
Speaker: Le Chen of Emory University
Contact: Maja Taskovic, maja.taskovic@emory.edu
Date: 2020-01-17 at 1:00PM
Venue: MSC E408
Abstract: The stochastic heat equation is a canonical model that is related to many models in mathematical physics, mathematical biology, particle systems, etc. It usually takes the following form: \[ \left(\frac{\partial }{\partial t} -\frac{1}{2}\Delta \right) u(t,x) = \rho(u(t,x)) \:\dot{M}(t,x), \qquad u(0,\cdot) =\mu, \qquad t>0, \: x\in \mathbb R^d, \] where $\mu$ is the initial data, $\dot{M}$ is a spatially homogeneous Gaussian noise that is white in time, and $\rho$ is a Lipschitz continuous function. In this talk, we will study a particular set of properties of this equation: the comparison principles, which include both {\it sample-path comparison} and {\it stochastic comparison} principles. These results are obtained for general initial data and under the weakest possible requirement on the correlation function, namely Dalang's condition: $\int_{\mathbb R^d}(1+|\xi|^2)^{-1}\hat{f}(d\xi)<\infty$, where $\hat{f}$ is the spectral measure of the noise. For the sample-path comparison, one can compare solutions pathwise with respect to different initial conditions $\mu$, while for the stochastic comparison, one can compare certain functionals of the solutions either with respect to different diffusion coefficients $\rho$ or different correlation functions of the noise $f$. This talk is based on some joint works with Jingyu Huang and Kunwoo Kim.
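For readers who want to see the equation in action, here is a minimal finite-difference Euler-Maruyama sketch, assuming $d=1$, periodic boundary conditions, space-time white noise (which satisfies Dalang's condition in one dimension), and multiplicative coefficient $\rho(u)=u$. Grid sizes are mine, and the explicit scheme needs $\Delta t \lesssim \Delta x^2$ for stability; discretized space-time white noise has standard deviation $\sqrt{\Delta t/\Delta x}$ per cell.

```python
import numpy as np

def she_euler(nx=64, nt=2000, T=0.1, rho=lambda u: u, seed=0):
    """Finite-difference Euler-Maruyama sketch of the 1-D stochastic heat
    equation (d/dt - 0.5*Laplacian) u = rho(u) * noise on [0, 1] with
    periodic boundary conditions and constant initial data u0 = 1."""
    rng = np.random.default_rng(seed)
    dx, dt = 1.0 / nx, T / nt                # here 0.5 * dt / dx^2 ~ 0.1 < 0.5
    u = np.ones(nx)
    for _ in range(nt):
        lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2
        noise = rng.normal(scale=np.sqrt(dt / dx), size=nx)
        u = u + 0.5 * dt * lap + rho(u) * noise
    return u

u = she_euler()   # one sample path at time T, evaluated on the spatial grid
```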