Upcoming Seminars
Title: The Hassett-Keel program in genus 4
Seminar: Algebra
Speaker: Kristin DeVleming of University of Massachusetts Amherst
Contact: Andrew Kobin, ajkobin@emory.edu
Date: 2024-03-19 at 4:00PM
Venue: MSC W303
Abstract: Studying the minimal model program with scaling on the moduli space of genus g curves, and interpreting the steps in a modular way, is known as the Hassett-Keel program. The first few steps are well understood, yet the program remains incomplete in general. We complete the Hassett-Keel program in genus 4 using wall-crossing in K-moduli and modular interpretations. This is joint work with Kenneth Ascher, Yuchen Liu, and Xiaowei Wang.
Title: Structure-conforming Operator Learning via Transformers
Seminar: Numerical Analysis and Scientific Computing
Speaker: Shuhao Cao of University of Missouri-Kansas City
Contact: Yuanzhe Xi, yuanzhe.xi@emory.edu
Date: 2024-03-21 at 10:00AM
Venue: MSC W201
Abstract: GPT, Stable Diffusion, AlphaFold 2, and other state-of-the-art deep learning models all use a neural architecture called the "Transformer". Since the publication of Google's "Attention Is All You Need" paper, the Transformer has become the ubiquitous architecture in deep learning. At the heart of the Transformer is the "attention mechanism". In this talk, we shall dissect the attention mechanism through the lens of traditional numerical methods, such as Galerkin methods and hierarchical matrix decomposition. We will report numerical results on designing attention-based neural networks according to the structure of a problem in traditional scientific computing, such as inverse problems for the Neumann-to-Dirichlet operator (EIT) or multiscale elliptic problems. Progress within different communities will be briefly reviewed to address some open problems on the mathematical properties of the attention mechanism in Transformers.
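For readers unfamiliar with the attention mechanism the abstract dissects, here is a minimal sketch of scaled dot-product attention in plain numpy. This is a generic illustration, not code from the talk; the function names are ours.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)        # pairwise similarity of queries and keys
    weights = softmax(scores, axis=-1)   # each row is a convex combination
    return weights @ V, weights

rng = np.random.default_rng(0)
n, d = 5, 4
Q, K, V = rng.standard_normal((3, n, d))
out, w = attention(Q, K, V)
```

Each output row is a weighted average of the value rows, with weights determined by query-key similarity; the Galerkin viewpoint mentioned in the abstract reads this matrix product structure as a learned (Petrov-)Galerkin projection.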
Title: Local heights on hyperelliptic curves for quadratic Chabauty
Seminar: Algebra
Speaker: Juanita Duque-Rosero of Boston University
Contact: Andrew Kobin, ajkobin@emory.edu
Date: 2024-03-26 at 4:00PM
Venue: MSC W303
Abstract: The method of quadratic Chabauty is a powerful tool to determine the set of rational points on curves. A key input for this method is the values of local height functions. In this talk, we will discuss an algorithm to compute these local heights at odd primes v not equal to p for hyperelliptic curves. Through applications, we will see how this work extends the reach of quadratic Chabauty to curves previously deemed inaccessible. This is joint work with Alexander Betts, Sachi Hashimoto, and Pim Spelier.
Title: Degeneracy of eigenvalues and singular values of parameter dependent matrices
Seminar: Numerical Analysis and Scientific Computing
Speaker: Alessandro Pugliese of Georgia Tech/University of Bari
Contact: Manuela Manetta, manuela.manetta@emory.edu
Date: 2024-03-28 at 10:00AM
Venue: MSC W201
Abstract: Hermitian matrices have real eigenvalues and an orthonormal set of eigenvectors. Do smooth Hermitian matrix-valued functions have smooth eigenvalues and eigenvectors? Starting from this question, we will first review known results on smooth eigenvalue and singular value decompositions of matrices that depend on one or several parameters, and then focus on our contribution: topological tools to detect and approximate parameter values where the eigenvalues or singular values of a matrix-valued function are degenerate (i.e., repeated or zero). The talk is based on joint work with Luca Dieci (Georgia Tech) and Alessandra Papini (Univ. of Florence).
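The abstract's opening question has a classical answer that a tiny numerical experiment makes concrete: for a smooth one-parameter Hermitian family with an eigenvalue crossing, the eigenvalues sorted in ascending order (as numerical routines return them) are continuous but not differentiable at the crossing, even though a smooth labeling exists. This example is ours, not from the talk.

```python
import numpy as np

def A(t):
    # smooth Hermitian family diag(t, -t) with an eigenvalue crossing at t = 0
    return np.array([[t, 0.0], [0.0, -t]])

ts = np.linspace(-1.0, 1.0, 5)
# eigvalsh returns eigenvalues in ascending order
sorted_eigs = np.array([np.linalg.eigvalsh(A(t)) for t in ts])
# sorted branches are (-|t|, |t|): continuous, with a kink at t = 0;
# the smooth labeling (t, -t) requires swapping branches at the crossing
```

Detecting such degenerate parameter values robustly, in one or several parameters, is exactly the problem the topological tools in the talk address.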
Title: Multifidelity linear regression for scientific machine learning from scarce data
Seminar: Numerical Analysis and Scientific Computing
Speaker: Elizabeth Qian of Georgia Tech
Contact: Elizabeth Newman, elizabeth.newman@emory.edu
Date: 2024-04-04 at 10:00AM
Venue: MSC W201
Abstract: Machine learning (ML) methods have garnered significant interest as potential methods for learning surrogate models for complex engineering systems for which traditional simulation is expensive. However, in many scientific and engineering settings, training data are scarce due to the cost of generating data from traditional high-fidelity simulations. ML models trained on scarce data have high variance and are sensitive to vagaries of the training data set. We propose a new multifidelity training approach for scientific machine learning that exploits the scientific context where data of varying fidelities and costs are available; for example, high-fidelity data may be generated by an expensive fully resolved physics simulation, whereas lower-fidelity data may arise from a cheaper model based on simplifying assumptions. We use the multifidelity data to define new multifidelity Monte Carlo estimators for the unknown parameters of linear regression models, and provide theoretical analyses that guarantee accuracy and improved robustness to small training budgets. Numerical results show that multifidelity learned models achieve order-of-magnitude lower expected error than standard training approaches when high-fidelity data are scarce.
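The multifidelity Monte Carlo idea behind the estimators in the abstract can be sketched in a few lines: a cheap correlated surrogate acts as a control variate, so many cheap samples correct the high variance of a few expensive ones. The models and sample sizes below are hypothetical toy choices, not from the talk, and the quantity estimated is a simple mean rather than regression parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical models of the same scalar quantity of interest on [0, 1]
f_hi = lambda x: np.sin(x) + 0.05 * x**2   # "expensive" high-fidelity model
f_lo = lambda x: np.sin(x)                 # "cheap" correlated low-fidelity model

x_hi = rng.uniform(0.0, 1.0, 20)     # few affordable high-fidelity samples
x_lo = rng.uniform(0.0, 1.0, 2000)   # many cheap low-fidelity samples

# control-variate (multifidelity Monte Carlo) estimate of E[f_hi(X)]:
# unbiased for any alpha; alpha = 1 is a simple choice when the models
# are strongly correlated (in practice alpha is tuned from sample statistics)
alpha = 1.0
mf_est = f_hi(x_hi).mean() + alpha * (f_lo(x_lo).mean() - f_lo(x_hi).mean())
```

Because the low-fidelity term cancels most of the variance of the 20-sample high-fidelity average, the combined estimate is far more robust to a small high-fidelity budget, which is the mechanism the talk's theory quantifies for linear regression.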