Efficient, Robust and Agnostic Generative Modeling with Group Symmetry and Regularized Divergences

Seminar: 
Applied Mathematics
Event time: 
Wednesday, November 6, 2024 - 2:30pm
Location: 
LOM 214
Speaker: 
Ziyu Chen
Speaker affiliation: 
UMass Amherst
Event description: 

In this talk, I will discuss our recent theoretical advances in generative modeling. The first part of the presentation will focus on learning distributions with symmetry. I will introduce results on the sample complexity of empirical estimation of probability divergences for group-invariant distributions, and present performance guarantees for GANs and score-based generative models that incorporate symmetry. Notably, I will offer the first quantitative comparison between data augmentation and directly embedding symmetry into models, highlighting the latter as a more fundamental approach for efficient learning. These findings underscore how incorporating symmetry into generative models can significantly enhance learning efficiency, particularly in data-limited scenarios.

The second part will cover $\alpha$-divergences with Wasserstein-1 regularization, which can be interpreted as $\alpha$-divergences whose variational form is restricted to Lipschitz test functions. I will demonstrate how, by using these divergences as objective functionals, generative learning can be made agnostic to assumptions about the target distribution, including heavy tails or low-dimensional and fractal supports. I will outline conditions under which these divergences are finite, under minimal assumptions on the target distribution, along with the associated gradient-flow formulation. This framework provides guarantees for a range of machine learning algorithms that optimize over this class of divergences.
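As a schematic illustration only (with notation assumed by this note rather than taken from the abstract), regularized divergences of the kind described above are often written in the following form, where $D_\alpha$ is an $\alpha$-divergence, $W_1$ is the Wasserstein-1 distance, $L$ is a Lipschitz constant, and $\Lambda^{Q}_{\alpha}$ denotes the convex-conjugate term from the variational representation of $D_\alpha(\cdot\,\|\,Q)$; the talk's exact definitions and conventions may differ:

$$ D_\alpha^{L}(P\,\|\,Q) \;=\; \inf_{\eta}\Bigl\{\, D_\alpha(\eta\,\|\,Q) \,+\, L\,W_1(P,\eta) \,\Bigr\} \;=\; \sup_{\phi\,\in\,\mathrm{Lip}_L}\Bigl\{\, \mathbb{E}_P[\phi] \,-\, \Lambda^{Q}_{\alpha}[\phi] \,\Bigr\}, $$

with $\mathrm{Lip}_L$ the set of $L$-Lipschitz test functions. The infimal-convolution expression makes the Wasserstein-1 regularization explicit, while the supremum over $\mathrm{Lip}_L$ corresponds to constraining the variational form of the $\alpha$-divergence to Lipschitz test functions.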