Overparameterized and Adversarially Robust Sparse Models

Seminar: Applied Mathematics
Event time: Monday, April 5, 2021 - 1:00pm
Location: Zoom Meeting ID: 97670014308
Speaker: Jeremias Sulam
Speaker affiliation: Johns Hopkins University
Event description: 

Abstract: Sparsity has been a driving force in signal & image processing and machine learning for decades. In this talk we’ll explore sparse representations based on dictionary learning techniques from two perspectives: over-parameterization and adversarial robustness. First, we will characterize the surprising phenomenon that dictionary recovery can be facilitated by searching over the space of larger (over-realized/over-parameterized) models. This phenomenon is general and independent of the specific dictionary learning algorithm used. We will demonstrate it in practice and analyze it theoretically by tying recovery measures to generalization bounds. We will further show that an efficient and provably correct distillation mechanism can be employed to recover the correct atoms from the over-realized model, consistently providing better recovery of the ground-truth model.
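As a rough illustration of the over-realization idea (a minimal sketch, not the speaker's actual experiments or algorithm), the snippet below learns dictionaries with p and 2p atoms on synthetic k-sparse data and scores how well each ground-truth atom is matched by its closest learned atom. The problem sizes, the scikit-learn learner, and the correlation-based recovery score are all illustrative assumptions.

# Minimal sketch: over-realized dictionary learning on synthetic sparse data.
# All sizes and the scikit-learn-based learner are illustrative assumptions.
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
n, p, k = 20, 50, 3                          # signal dim, # true atoms, sparsity
D_true = rng.standard_normal((n, p))
D_true /= np.linalg.norm(D_true, axis=0)     # unit-norm ground-truth atoms

# synthetic training signals: k-sparse combinations of ground-truth atoms
m = 500
X = np.zeros((m, n))
for i in range(m):
    support = rng.choice(p, size=k, replace=False)
    X[i] = D_true[:, support] @ rng.standard_normal(k)

for n_atoms in (p, 2 * p):                   # exactly-realized vs. over-realized
    learner = DictionaryLearning(n_components=n_atoms, max_iter=30,
                                 transform_algorithm="omp",
                                 transform_n_nonzero_coefs=k,
                                 random_state=0)
    learner.fit(X)
    D_hat = learner.components_.T            # shape (n, n_atoms)
    D_hat /= np.linalg.norm(D_hat, axis=0) + 1e-12
    # recovery proxy: best absolute correlation of each true atom with a learned atom
    corr = np.abs(D_true.T @ D_hat)          # shape (p, n_atoms)
    print(f"{n_atoms} atoms: mean best-match correlation = {corr.max(axis=1).mean():.3f}")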

We will then switch gears towards the analysis of adversarial examples, focusing on the hypothesis class obtained by coupling a sparsity-promoting encoder with a linear classifier, and show an interesting interplay between the flexibility and stability of the (supervised) representation map and a notion of margin in the feature space. Leveraging a mild encoder gap assumption on the learned representations, we will provide a bound on the generalization error of the robust risk under L2-bounded adversarial perturbations, as well as a robustness certificate for end-to-end classification. We will demonstrate the applicability of our analysis by computing certified accuracy on real data and comparing it with alternative approaches to certified robustness. This analysis will shed light on how to characterize this interplay for more general models.
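To give a concrete flavor of margin-based L2 certificates, the sketch below computes the largest perturbation radius that provably cannot flip the prediction of a purely linear classifier f(x) = Wx. This generic bound is only an illustration of the style of guarantee; it is not the talk's encoder-gap certificate, and the toy weights and input are assumptions.

# Generic margin-based L2 robustness certificate for a linear classifier.
# Not the talk's certificate; illustrative only.
import numpy as np

def certified_radius_linear(W, x):
    """Return (predicted class, radius r) such that no L2 perturbation of
    norm < r can change the prediction of f(x) = Wx. For each competing
    class j, the prediction is safe while ||delta||_2 < margin_j / ||w_pred - w_j||_2."""
    scores = W @ x
    pred = int(np.argmax(scores))
    radii = []
    for j in range(W.shape[0]):
        if j == pred:
            continue
        margin = scores[pred] - scores[j]
        radii.append(margin / (np.linalg.norm(W[pred] - W[j]) + 1e-12))
    return pred, min(radii)

# toy usage with random weights and input (illustrative only)
rng = np.random.default_rng(0)
W = rng.standard_normal((5, 30))   # 5 classes, 30-dimensional features
x = rng.standard_normal(30)
pred, r = certified_radius_linear(W, x)
print(f"predicted class {pred}, certified L2 radius {r:.3f}")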

Email tatianna.curtis@yale.edu for more information.