Wednesday, April 3, 2024
3pm 
Empirical Bayes provides a powerful approach to learning and adapting to latent structure in data. Theory and algorithms for empirical Bayes have a rich literature for sequence models, but are less understood in settings where latent variables and data interact through more complex designs. In this work, we study empirical Bayes estimation of an i.i.d. prior in Bayesian linear models, via the nonparametric maximum likelihood estimator (NPMLE). We introduce and study a system of gradient flow equations for optimizing the marginal log-likelihood, jointly over the prior and posterior measures in its Gibbs variational representation, using a smoothed reparametrization of the regression coefficients. A diffusion-based implementation yields a Langevin dynamics MCEM algorithm, where the prior law evolves continuously over time to optimize a sequence-model log-likelihood defined by the coordinates of the current Langevin iterate. We show consistency of the NPMLE as n, p → ∞ under mild conditions, including settings of random sub-Gaussian designs when n ≍ p. In high noise, we prove a uniform log-Sobolev inequality for the mixing of Langevin dynamics, for possibly misspecified priors and non-log-concave posteriors. We then establish polynomial-time convergence of the joint gradient flow to a near-NPMLE if the marginal negative log-likelihood is convex in a sublevel set of the initialization.

Location: LOM 214
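The Langevin dynamics MCEM idea in the abstract can be illustrated with a minimal numerical sketch: a toy Gaussian linear model with a grid-discretized prior, alternating one unadjusted Langevin step on the coefficients with a damped EM-style update of the prior weights. All choices here (the grid, smoothing variance, step size, and damping) are illustrative assumptions, not the algorithm from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Bayesian linear model y = X @ theta + noise, theta_i i.i.d. from an unknown prior.
n, p, sigma = 200, 50, 1.0
X = rng.normal(size=(n, p)) / np.sqrt(n)
theta_true = rng.choice([-2.0, 2.0], size=p)     # true prior: two point masses at +/-2
y = X @ theta_true + sigma * rng.normal(size=n)

# Discretize the prior as weights w on a fixed grid, smoothed by a small Gaussian
# kernel (a crude stand-in for the smoothed reparametrization in the abstract).
grid = np.linspace(-4.0, 4.0, 41)
w = np.full(grid.size, 1.0 / grid.size)
TAU2 = 0.05                                      # smoothing variance (assumed)

def responsibilities(theta, w):
    """Posterior weight of each grid atom for each coordinate of theta."""
    d2 = (theta[:, None] - grid[None, :]) ** 2   # shape (p, K)
    r = w[None, :] * np.exp(-d2 / (2 * TAU2))
    return r / (r.sum(axis=1, keepdims=True) + 1e-300)

theta = np.zeros(p)
eta = 1e-3                                       # Langevin step size (assumed)
for _ in range(2000):
    # E-step flavor: one unadjusted Langevin step targeting the posterior
    # under the *current* prior estimate.
    d = theta[:, None] - grid[None, :]
    kern = w[None, :] * np.exp(-d**2 / (2 * TAU2))
    grad_prior = -(kern * d).sum(axis=1) / (TAU2 * kern.sum(axis=1) + 1e-300)
    grad = X.T @ (y - X @ theta) / sigma**2 + grad_prior
    theta = theta + eta * grad + np.sqrt(2 * eta) * rng.normal(size=p)

    # M-step flavor: damped EM update of the prior weights from the current
    # Langevin iterate's coordinates, so the prior law evolves over time.
    w = 0.99 * w + 0.01 * responsibilities(theta, w).mean(axis=0)

# The learned prior should concentrate near the true atoms at +/-2.
mass_near_modes = float(w[np.abs(np.abs(grid) - 2.0) < 0.5].sum())
print(f"prior mass within 0.5 of +/-2: {mass_near_modes:.2f}")
```

In the paper's continuous-time formulation the prior evolves jointly with the diffusion as a gradient flow; the damped discrete update above is only the simplest caricature of that coupling.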
