Manifolds, and new families of multiscale functions that are easy to learn by Neural Networks

Seminar: 
Applied Mathematics
Event time: 
Wednesday, November 8, 2023 - 3:00pm
Location: 
LOM 214
Speaker: 
Shira Faigenbaum-Golovin
Speaker affiliation: 
Duke University
Event description: 

We consider several problems pertaining to low- and high-dimensional data and their relation to the approximation power of Neural Networks. Given a noisy point cloud sampled near a low-dimensional manifold in a high-dimensional space, we will address the question of denoising the data and reconstructing the manifold. To address this challenge, we introduce a framework named “Manifold Locally Optimal Projection” (MLOP) and provide its accompanying theoretical analysis.
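To make the setting concrete, below is a minimal Python sketch of a locally-optimal-projection-style denoising step, in which each noisy sample is pulled toward a locally weighted average of its neighbors. The function name, bandwidth, and iteration count are illustrative assumptions; this is a schematic in the spirit of LOP-type methods, not the MLOP algorithm presented in the talk.

    import numpy as np

    def lop_style_denoise(X, n_iters=5, h=0.5):
        # Illustrative LOP-style smoothing, not the MLOP method from the talk:
        # each point moves to a Gaussian-weighted mean of nearby points,
        # pulling noisy samples toward the underlying low-dimensional manifold.
        X = np.asarray(X, dtype=float)
        for _ in range(n_iters):
            d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
            W = np.exp(-d2 / h**2)                                # locality weights, bandwidth h
            np.fill_diagonal(W, 0.0)                              # exclude self-weighting
            X = (W @ X) / W.sum(axis=1, keepdims=True)            # weighted-mean update
        return X

    # Example: noisy samples near a circle (a 1-D manifold) embedded in R^50.
    rng = np.random.default_rng(0)
    t = rng.uniform(0.0, 2.0 * np.pi, 200)
    Y = np.zeros((200, 50))
    Y[:, 0], Y[:, 1] = np.cos(t), np.sin(t)
    denoised = lop_style_denoise(Y + 0.05 * rng.normal(size=Y.shape))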

In the second part of my talk, we will delve into the theoretical aspects of Neural Networks through the lens of approximation theory. Refinable functions, the solutions of refinement equations, are the building blocks of many constructions, including subdivision schemes used in computer graphics, wavelets, B-splines, and several fractals. Although our earlier work proved that all refinable functions can be implemented, to arbitrarily high precision, by ReLU-based Neural Networks, it was far from clear how such functions could be learned from data. We propose a different type of refinement that involves not only translation and rescaling but also mirroring; functions satisfying the resulting reflecto-refinement equations still generate multiresolution hierarchies that provide excellent approximation for many function spaces of interest, yet are also adapted to ReLU networks. We will illustrate how the proposed methodology can be used to create new families of functions.
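For concreteness, a classical refinement equation expresses a function as a finite combination of translated and rescaled copies of itself; a reflecto-refinement equation additionally allows mirrored copies. The abstract does not specify the exact form used in the talk, so the second equation below is only a schematic way to write such a variant:

    \varphi(x) = \sum_k c_k \, \varphi(2x - k)    (classical refinement)
    \varphi(x) = \sum_k c_k \, \varphi\big(\varepsilon_k (2x - k)\big), \quad \varepsilon_k \in \{-1, +1\}    (reflecto-refinement, schematic)

For instance, the piecewise-linear hat function is refinable in the classical sense, since it satisfies \varphi(x) = \tfrac{1}{2}\varphi(2x) + \varphi(2x - 1) + \tfrac{1}{2}\varphi(2x - 2).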

The talk is based on joint work with David Levin (TAU) and Ingrid Daubechies (Duke).