# All Seminars

Title: Polynomials vanishing on points in projective space

Colloquium: Algebra and Number Theory

Speaker: Brooke Ullery of Harvard University

Contact: David Zureick-Brown, david.zureick-brown@emory.edu

Date: 2020-01-30 at 4:00PM

Venue: MSC W301

Abstract: We all know that any two points in the plane lie on a unique line. However, three points will lie on a line only if those points are in a very special position: collinear. More generally, if Z is a set of k points in n-space, we can ask what the set of polynomials of degree d in n variables that vanish on all the points of Z looks like. The answer depends not only on the values of k, d, and n but also (as we see in the case of three collinear points) on the geometry of Z. This question, in some form, dates back at least to the 4th century. We will discuss several attempts to answer it throughout history and some surprising connections to modern algebraic geometry.
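
The dimension count behind the opening example can be checked numerically. The sketch below (restricted to the affine plane for simplicity, using numpy, with illustrative points) computes the space of degree-at-most-d polynomials vanishing on a point set Z as the kernel of a monomial-evaluation matrix:

```python
import numpy as np

def vanishing_space_dim(points, d):
    """Dimension of the space of polynomials of degree <= d in two
    variables vanishing at every point in `points`, computed as the
    kernel dimension of the monomial-evaluation matrix."""
    monomials = [(i, j) for i in range(d + 1) for j in range(d + 1 - i)]
    M = np.array([[x**i * y**j for (i, j) in monomials] for (x, y) in points])
    return len(monomials) - np.linalg.matrix_rank(M)

collinear = [(0, 0), (1, 1), (2, 2)]   # three points on the line y = x
generic   = [(0, 0), (1, 0), (0, 1)]   # three non-collinear points

# Degree-1 polynomials a + bx + cy form a 3-dimensional space.
print(vanishing_space_dim(collinear, 1))   # 1: the line y - x = 0
print(vanishing_space_dim(generic, 1))     # 0: only the zero polynomial
```

As the abstract notes, the answer for three points depends on their geometry: the collinear configuration admits a vanishing line, the generic one does not.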

Title: PDE-Principled Trustworthy Deep Learning Meets Computational Biology

Colloquium: Computational Mathematics

Speaker: Bao Wang of University of California, Los Angeles

Contact: James Nagy, jnagy@emory.edu

Date: 2020-01-23 at 3:00PM

Venue: MSC W303

Abstract: Deep learning achieves tremendous success in image and speech recognition and in machine translation. However, deep learning is not yet trustworthy, which raises three questions:

1. How can we improve the robustness of deep neural networks? Deep neural networks are well known to be vulnerable to adversarial attacks. For instance, malicious attacks can fool the Tesla self-driving system by making a tiny change to the scene acquired by the intelligence system.
2. How can we compress high-capacity deep neural networks efficiently without loss of accuracy? The computational cost of inference is notoriously one of the major bottlenecks for deploying deep neural networks on mobile devices.
3. How can we protect the private information used to train a deep neural network? Deep learning-based artificial intelligence systems may leak their private training data; Fredrikson et al. recently showed that a simple model-inversion attack can recover the portraits of the victims whose face images were used to train a face recognition system.

In this talk, I will present some recent work on developing PDE-principled robust neural architectures and optimization algorithms for robust, accurate, private, and efficient deep learning. I will also present some potential applications of this data-driven approach to biomolecule simulation and computer-aided drug design.
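
The adversarial vulnerability mentioned above can be illustrated with a standard one-step attack, the fast gradient sign method (FGSM) of Goodfellow et al. The sketch below (numpy, with hypothetical weights for a toy logistic model, not the speaker's system) perturbs an input in the direction that increases the loss:

```python
import numpy as np

# FGSM on a toy logistic model: perturb the input by eps in the
# direction that increases the loss. Weights and inputs are
# hypothetical; real attacks target deep networks.
w, b = np.array([2.0, -3.0, 1.0]), 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm(x, y, eps):
    """One-step adversarial perturbation of input x with true label y."""
    p = sigmoid(w @ x + b)
    grad_x = (p - y) * w               # gradient of cross-entropy w.r.t. x
    return x + eps * np.sign(grad_x)

x, y = np.array([0.5, -0.2, 0.1]), 1.0
p_clean = sigmoid(w @ x + b)           # confidence on the clean input
p_adv = sigmoid(w @ fgsm(x, y, eps=0.5) + b)
# A small perturbation lowers the model's confidence in the true label.
```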

Title: Gradient Flows: From PDE to Data Analysis

Colloquium: Computational Mathematics

Speaker: Franca Hoffmann of California Institute of Technology

Contact: James Nagy, jnagy@emory.edu

Date: 2020-01-21 at 3:00PM

Venue: MSC W303

Abstract: Certain diffusive PDEs can be viewed as infinite-dimensional gradient flows. This fact has led to the development of new tools in various areas of mathematics, ranging from PDE theory to data science. In this talk, we focus on two different directions: model-driven approaches and data-driven approaches. In the first part of the talk, we use gradient flows to analyze non-linear and non-local aggregation-diffusion equations when the corresponding energy functionals are not necessarily convex. Moreover, the gradient flow structure enables us to make connections to well-known functional inequalities, revealing possible links between the optimizers of these inequalities and the equilibria of certain aggregation-diffusion PDEs. In the second part, we use and develop gradient flow theory to design novel tools for data analysis. We draw a connection between gradient flows and Ensemble Kalman methods for parameter estimation, and introduce the Ensemble Kalman Sampler, a derivative-free methodology for model calibration and uncertainty quantification in expensive black-box models. The interacting particle dynamics underlying our algorithm can be approximated by a novel gradient flow structure in a modified Wasserstein metric that reflects particle correlations. The geometry of this modified Wasserstein metric is of independent theoretical interest.
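
The interacting-particle dynamics described above can be sketched in a few lines for a one-dimensional linear-Gaussian problem, where the posterior is known in closed form. This is a simplified illustration, not the talk's algorithm verbatim, and all parameter values are illustrative choices:

```python
import numpy as np

# Simplified Ensemble Kalman Sampler dynamics for a 1D linear-Gaussian
# inverse problem with identity forward map: the target posterior is
# N(1, 0.5) for the parameter choices below.
rng = np.random.default_rng(42)

y, gamma2, sigma2 = 2.0, 1.0, 1.0       # datum, noise var, prior var
J, dt, n_steps = 500, 0.01, 2000
theta = rng.normal(0.0, 1.0, size=J)    # initial ensemble drawn from the prior

for _ in range(n_steps):
    C = theta.var()                     # empirical ensemble covariance
    # preconditioned Langevin drift: -C * grad of negative log-posterior
    drift = -C * ((theta - y) / gamma2 + theta / sigma2)
    theta += dt * drift + np.sqrt(2.0 * C * dt) * rng.normal(size=J)

post_mean = y * sigma2 / (sigma2 + gamma2)      # closed-form posterior mean
post_var = sigma2 * gamma2 / (sigma2 + gamma2)  # closed-form posterior var
# The ensemble mean and variance approximate post_mean and post_var.
```

Note that the ensemble covariance C both preconditions the drift and scales the noise, which is what makes the scheme derivative-free in the general (nonlinear) setting.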

Title: Comparison principles for stochastic heat equations

Seminar: PDE Seminar

Speaker: Le Chen of Emory University

Contact: Maja Taskovic, maja.taskovic@emory.edu

Date: 2020-01-17 at 1:00PM

Venue: MSC E408

Abstract: The stochastic heat equation is a canonical model related to many models in mathematical physics, mathematical biology, particle systems, etc. It usually takes the following form: \[ \left(\frac{\partial }{\partial t} -\frac{1}{2}\Delta \right) u(t,x) = \rho(u(t,x)) \:\dot{M}(t,x), \qquad u(0,\cdot) =\mu, \qquad t>0, \: x\in \mathbb{R}^d, \] where $\mu$ is the initial data, $\dot{M}$ is a spatially homogeneous Gaussian noise that is white in time, and $\rho$ is a Lipschitz continuous function. In this talk, we will study a particular set of properties of this equation: the comparison principles, which include both {\it sample-path comparison} and {\it stochastic comparison} principles. These results are obtained for general initial data and under the weakest possible requirement on the correlation function, namely Dalang's condition: $\int_{\mathbb{R}^d}(1+|\xi|^2)^{-1}\hat{f}(\mathrm{d}\xi)<\infty$, where $\hat{f}$ is the spectral measure of the noise. For the sample-path comparison, one can compare solutions pathwise with respect to different initial conditions $\mu$, while for the stochastic comparison, one can compare certain functionals of the solutions either with respect to different diffusion coefficients $\rho$ or different correlation functions $f$ of the noise. This talk is based on joint works with Jingyu Huang and Kunwoo Kim.
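
As a concrete instance of Dalang's condition (a standard fact, stated here for orientation): space-time white noise corresponds to the correlation $f=\delta_0$, whose spectral measure $\hat{f}$ is Lebesgue measure, so the condition reduces to

```latex
\int_{\mathbb{R}^d} \frac{\mathrm{d}\xi}{1+|\xi|^2} < \infty ,
```

which holds if and only if $d=1$, since the integrand decays like $|\xi|^{-2}$ at infinity. The equation driven by space-time white noise is therefore covered by this framework only in one spatial dimension.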

Title: Geometry + Optimization: towards computational anatomy

Colloquium: Computational Mathematics

Speaker: Shahar Kovalsky of Duke University

Contact: James Nagy, jnagy@emory.edu

Date: 2020-01-16 at 3:00PM

Venue: MSC W303

Abstract: Geometric shape processing lies at the heart of various branches of science: from finite element simulation in engineering, through animation of virtual avatars, to the analysis of anatomical variation and the detection of structural anomalies in medicine and biology. The demand for such computational approaches to geometry is constantly growing as three-dimensional data becomes readily available and is integrated into various everyday uses. I will begin my talk with a brief overview of optimization-based approaches to geometric problems, such as identifying pointwise correspondences between exemplars in a collection of shapes, or deforming shapes to satisfy prescribed constraints in a least-distorting manner. After discussing some of the theoretical and computational challenges arising in these optimization problems, I will focus on large-scale geometric problems and efficient first- and second-order algorithms for their optimization. Then, motivated by anatomical shape analysis, I will show applications of these computational approaches to shape characterization and comparison in evolutionary anthropology. I will finish by briefly presenting two related but tangential works: a theoretical work on the characterization of planar harmonic maps into non-convex domains, and a clinical work on the prediction of thyroid cancer from biopsy images.

Title: Rethinking regularization in modern machine learning and computational imaging

Colloquium: Computational Mathematics

Speaker: Gregory Ongie

Contact: James Nagy, jnagy@emory.edu

Date: 2020-01-13 at 4:00PM

Venue: MSC W303

Abstract: Optimization is central to both supervised machine learning and inverse problems in computational imaging. These problems are often ill-posed, and some form of regularization is necessary to obtain a useful solution. However, new paradigms in machine learning and computational imaging necessitate rethinking the role of regularization, as I will illustrate with two examples. First, in the context of supervised learning with shallow neural networks, I will show how a commonly used form of regularization has a surprising reinterpretation as a convex regularizer in function space. This yields novel insights into the role of overparameterization and depth in learning with neural networks having ReLU activations. Second, I will discuss a novel network architecture for solving linear inverse problems in computational imaging called a Neumann network. Rather than using a pre-specified regularizer, Neumann networks effectively learn a regularizer from training data, outperforming classical techniques. Beyond these two examples, I will show how many open problems in the mathematical foundations of deep learning and computational imaging relate to understanding regularization in its many forms.
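
The Neumann-series viewpoint behind such architectures can be illustrated on a classical problem: with a fixed Tikhonov term standing in for the learned regularizer (purely for illustration), the truncated series recovers the regularized least-squares solution. A numpy sketch with hypothetical operator sizes:

```python
import numpy as np

# Truncated Neumann series for the Tikhonov-regularized inverse problem
#   x* = (A^T A + lam*I)^{-1} A^T y,
# using (A^T A + lam*I)^{-1} = eta * sum_k (I - eta*(A^T A + lam*I))^k.
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 5))              # hypothetical forward operator
y = rng.normal(size=8)                   # hypothetical measurements
lam = 0.5

B = A.T @ A + lam * np.eye(5)
eta = 1.0 / np.linalg.norm(B, 2)         # ensures spectral radius of I - eta*B < 1
x = np.zeros(5)
term = eta * (A.T @ y)                   # k = 0 term of the series
for _ in range(1000):
    x += term
    term -= eta * (B @ term)             # term <- (I - eta*B) @ term

x_direct = np.linalg.solve(B, A.T @ y)   # x should match this direct solve
```

A Neumann network replaces the fixed lam*I term in each series step with a learned mapping trained on data.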

Title: Data Assimilation for State and Parameter Estimation in Hurricane Storm Surge Modeling

Colloquium: Computational Mathematics

Speaker: Talea L. Mayo of University of Central Florida

Contact: James Nagy, jnagy@emory.edu

Date: 2020-01-09 at 1:30PM

Venue: MSC W301

Abstract: Numerical hydrodynamic models are frequently used within the coastal science and engineering communities to simulate tides, waves, and hurricane storm surges. The applications of these simulations are vast, and include hindcasts of historical events, forecasts of impending hurricanes, and long-term flood risk assessment. However, like most numerical models, they are subject to epistemic and aleatoric uncertainties, due to factors including the approximation of relevant physical processes by mathematical models, the subsequent numerical discretization, uncertain boundary and initial conditions, and unknown model parameters. Quantifying and reducing these uncertainties is essential for developing reliable and robust hydrodynamic models. Data assimilation methods can be used to estimate uncertain model states (e.g. water levels) by informing model output with observations. I have developed these methods for hurricane storm surge modeling applications to reduce uncertainties resulting from coarse spatial resolution (i.e. limited computational resources) and uncertain meteorological conditions. While state estimation is beneficial for accurately simulating the storm surge resulting from a single, observed hurricane, broader contributions can be made by estimating uncertain model parameters. To this end, I have also developed these methods for parameter estimation. In this talk, I will discuss applications of data assimilation methods for both state and parameter estimation in hurricane storm surge modeling.
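
The state-estimation step at the core of such methods can be sketched with a scalar Kalman analysis update. All numbers below are hypothetical; operational storm surge assimilation works with high-dimensional states and ensemble approximations of the covariance:

```python
import numpy as np

# Scalar Kalman analysis step: blend a forecast water level with a
# gauge observation, weighted by their respective uncertainties.
x_f = np.array([1.8])         # forecast water level (m)
P_f = np.array([[0.25]])      # forecast error covariance
H = np.array([[1.0]])         # observation operator (direct gauge reading)
y = np.array([2.3])           # observed water level (m)
R = np.array([[0.04]])        # observation error covariance

K = P_f @ H.T @ np.linalg.inv(H @ P_f @ H.T + R)   # Kalman gain
x_a = x_f + K @ (y - H @ x_f)                      # analysis state
P_a = (np.eye(1) - K @ H) @ P_f                    # analysis covariance
# The analysis lies between forecast and observation, closer to the
# observation because R < P_f, and its uncertainty shrinks.
```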

Title: Witt vectors and perfectoid rings

Seminar: Algebra

Speaker: Christopher Davis of UC Irvine

Contact: David Zureick-Brown, dzb@mathcs.emory.edu

Date: 2019-12-10 at 4:00PM

Venue: MSC W303

Abstract: This talk will introduce (p-typical) Witt vectors and the de Rham-Witt complex. Classically these are evaluated on rings of characteristic p, but our focus will be on rings which are p-torsion-free. In particular, we hope to discuss special features of Witt vectors and the de Rham-Witt complex when they are evaluated on p-torsion-free perfectoid rings. This is joint work with Irakli Patchkoria, and is based on earlier work of Hesselholt and Hesselholt-Madsen.
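
For readers new to Witt vectors, the defining structure can be made concrete over the integers (which are p-torsion-free): the ghost map determines the ring operations. A small illustrative sketch computes ghost components and recovers Witt addition by solving for one component at a time:

```python
# Ghost components of a p-typical Witt vector (x_0, x_1, ...):
#   w_n = x_0^(p^n) + p*x_1^(p^(n-1)) + ... + p^n*x_n.
def ghost(x, p):
    return [sum(p**i * x[i] ** (p ** (n - i)) for i in range(n + 1))
            for n in range(len(x))]

# Witt addition is the unique operation making the ghost map additive;
# over the integers we can solve for it component by component, and
# the divisions below are exact (the universal addition polynomials
# have integer coefficients).
def witt_add(x, y, p):
    gx, gy = ghost(x, p), ghost(y, p)
    z = []
    for n in range(len(x)):
        partial = sum(p**i * z[i] ** (p ** (n - i)) for i in range(n))
        z.append((gx[n] + gy[n] - partial) // p**n)
    return z

# Example in length-2 Witt vectors over Z with p = 2:
# (1, 0) + (0, 1) = (1, 1), since the ghost vectors [1, 1] and [0, 2]
# sum to [1, 3] = ghost((1, 1)).
```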

Title: Recent advances in the fast simulation of the Steady and Unsteady Incompressible Navier-Stokes Equations

Seminar: Computational Math

Speaker: Alessandro Veneziani of Emory University

Contact: Yuanzhe Xi, yuanzhe.xi@emory.edu

Date: 2019-12-06 at 2:00PM

Venue: MSC W303

Abstract: The efficient numerical solution of the steady incompressible Navier-Stokes equations has recently been receiving renewed attention, driven by applications in which the steady problem is solved as a surrogate for a time average (see, e.g., [Tang, Chun Xiang, et al., JACC: Cardiov Imag (2019)]). Efficient numerical solution is challenged by the absence of the time derivative, which complicates the algebraic structure of the problem. In this talk, we cover some recent advances: smart algebraic factorizations that mimic splitting strategies popular in the unsteady case [A. Viguerie, A. Veneziani, CMAME 330 (2018)], new stabilization techniques inspired by turbulence modeling [A. Viguerie, A. Veneziani, JCP 391 (2019)], and the treatment of nonstandard boundary conditions emerging in computational hemodynamics [A. Veneziani, A. Viguerie, in preparation (2019)], which inspired this research. In the latter case, the focus will be on the so-called backflow and inflow instabilities [H. Xu et al., to appear in JCP 2020] occurring in defective problems (i.e., problems where the available data are incomplete, so the mathematical formulation is not well-posed). Dedicated to the memory of Dr. G. Zanetti (1959-2019). This research was supported by NSF Project DMS-1620406.

Title: One trick with two applications

Seminar: Combinatorics

Speaker: Mathias Schacht of The University of Hamburg and Yale University

Contact: Dwight Duffus, dwightduffus@emory.edu

Date: 2019-12-06 at 4:00PM

Venue: MSC W303

Abstract: We discuss a recent key lemma of Alweiss, Lovett, Wu, and Zhang which led to a major improvement for the Erdős-Rado sunflower problem. Essentially the same lemma was also crucial in the recent work of Frankston, Kahn, Narayanan, and Park showing that thresholds of increasing properties of binomial random discrete structures are at most a log-factor away from the so-called (fractional) expectation threshold. This fairly general result gives a new proof of the Johansson-Kahn-Vu theorem for perfect matchings in random hypergraphs.
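
For reference, a sunflower is a family of sets whose pairwise intersections all equal their common core. A brute-force check of this definition, on an illustrative toy family:

```python
from itertools import combinations

def is_sunflower(sets):
    """True if every pairwise intersection equals the common core."""
    core = set.intersection(*sets)
    return all(a & b == core for a, b in combinations(sets, 2))

# Toy family: the first three sets form a 3-petal sunflower with
# core {1, 2}; no other triple does.
family = [{1, 2, 3}, {1, 2, 4}, {1, 2, 5}, {3, 4, 5}]
sunflowers = [trio for trio in combinations(family, 3)
              if is_sunflower(trio)]
```

The Erdős-Rado problem asks how large a family of w-element sets can be before a sunflower with a given number of petals is forced to appear.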