MATH Seminar
Title: What in the structure of data makes them learnable?
Seminar: Algebra
Speaker: Matthieu Wyart of EPFL
Contact: Matthias Chung, matthias.chung@emory.edu
Date: 2023-12-04 at 11:30 AM
Venue: N215
Abstract: Deep learning algorithms have achieved remarkable successes, yet why they work is unclear. Notably, they can learn many high-dimensional tasks, a feat that is generically infeasible due to the so-called curse of dimensionality. What structure of the data makes them learnable, and how deep neural networks exploit this structure, are central questions of the field. In the absence of an answer, relevant quantities such as the number of training data needed to learn a given task (the sample complexity) cannot be determined. I will show how deep neural networks trained with gradient descent can beat the curse of dimensionality when the task is hierarchically compositional, by building a good representation of the data that effectively lowers the dimension of the problem. This analysis also reveals how the sample complexity is affected by the hierarchical nature of the task. If time permits, I will also discuss how sample complexity is affected when the regions of the data that carry information about the task are sparse.
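For readers unfamiliar with the idea of a hierarchically compositional task, the sketch below generates a toy synthetic dataset in that spirit: a class label expands, level by level, into tuples of lower-level features, and the input is the resulting sequence of leaf features. This is an illustrative assumption only, not the speaker's actual model; all names and parameters (vocab, n_synonyms, branching, depth) are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy hierarchically compositional classification task.
vocab = 8        # number of distinct feature values at every level
n_classes = 4    # number of top-level labels (a subset of the vocabulary)
branching = 2    # each feature expands into a tuple of 2 lower-level features
n_synonyms = 3   # each feature has several equivalent expansions ("synonyms")
depth = 3        # levels of expansion; input length = branching ** depth

# Random production rules: at each level, feature v may expand into any one of
# its n_synonyms tuples of sub-features.
rules = [rng.integers(vocab, size=(vocab, n_synonyms, branching)) for _ in range(depth)]

def sample(label):
    """Expand a class label down the hierarchy into a sequence of leaf features."""
    features = np.array([label])
    for level in range(depth):
        # pick one synonymous expansion per feature, then replace each feature
        # by its tuple of sub-features
        synonyms = rng.integers(n_synonyms, size=features.size)
        features = rules[level][features, synonyms].reshape(-1)
    return features  # shape: (branching ** depth,)

# Inputs are leaf sequences; the target is the label they were generated from.
labels = rng.integers(n_classes, size=1000)
X = np.stack([sample(int(y)) for y in labels])
print(X.shape, labels.shape)  # (1000, 8) (1000,)
```

Because each label admits many synonymous expansions at every level, the label is not readable from any single leaf; a learner must effectively infer the intermediate features, which is the kind of representation-building the abstract refers to.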