Classification Logit Two-sample Testing by Neural Networks

Seminar: 
Applied Mathematics
Event time: 
Tuesday, December 3, 2019 - 4:00pm
Location: 
LOM 215
Speaker: 
Xiuyuan Cheng
Speaker affiliation: 
Duke
Event description: 

The recent success of generative adversarial networks and variational learning suggests that training a classifier network may work well in addressing the classical two-sample testing problem. Compared to kernel approaches, neural network approaches can have the computational advantage of scaling better to large samples once the model is trained. In this talk, we introduce the network-logit test, which uses the logit of a trained neural network classifier, evaluated on the two finite samples, as the test statistic. Theoretically, the power of the test to differentiate two smooth densities is proved provided that the network is sufficiently parametrized, and when the two densities lie on or near low-dimensional manifolds embedded in a possibly high-dimensional space, the needed network complexity depends only on the intrinsic manifold geometry. In experiments, the method demonstrates better performance than previous neural network tests that use classification accuracy as the test statistic, and compares favorably to certain kernel maximum mean discrepancy (MMD) tests on synthetic and hand-written digit datasets. We will also discuss limitations of the method and open questions. Joint work with Alexander Cloninger.
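
To make the idea concrete, the following is a minimal sketch (not the speakers' implementation) of a logit-based two-sample test: a tiny logistic-regression "classifier" stands in for the neural network, its logit is averaged over held-out samples from each distribution to form the statistic, and a permutation test calibrates the p-value. All function names, the synthetic Gaussian data, and the hyperparameters are illustrative assumptions.

```python
import numpy as np

def train_logistic(X, y, lr=0.1, epochs=500):
    # Minimal stand-in for a classifier network: logit(x) = w.x + b,
    # trained by gradient descent on the logistic loss.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid of the logit
        g = p - y                                # gradient of the logistic loss
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

def logit_statistic(Xte, Yte, w, b):
    # Test statistic: difference of mean logits on the two held-out samples.
    return (Xte @ w + b).mean() - (Yte @ w + b).mean()

def permutation_pvalue(Xte, Yte, w, b, n_perm=200, seed=0):
    # Calibrate the statistic by randomly re-splitting the pooled test set.
    rng = np.random.default_rng(seed)
    obs = abs(logit_statistic(Xte, Yte, w, b))
    pooled = np.vstack([Xte, Yte])
    n = len(Xte)
    count = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        if abs(logit_statistic(pooled[idx[:n]], pooled[idx[n:]], w, b)) >= obs:
            count += 1
    return (count + 1) / (n_perm + 1)

# Demo on synthetic data: two Gaussians with shifted means.
rng = np.random.default_rng(0)
P = rng.normal(0.0, 1.0, size=(400, 2))
Q = rng.normal(1.0, 1.0, size=(400, 2))
# Split each sample: train the classifier on one half,
# evaluate the test statistic on the held-out half.
Xtr, Xte = P[:200], P[200:]
Ytr, Yte = Q[:200], Q[200:]
X = np.vstack([Xtr, Ytr])
y = np.concatenate([np.ones(200), np.zeros(200)])  # label 1 = sample from P
w, b = train_logistic(X, y)
p_val = permutation_pvalue(Xte, Yte, w, b)
print(p_val < 0.05)
```

For clearly shifted distributions like these, the mean-logit gap on held-out data is large relative to its permutation null, so the test rejects. The held-out split matters: evaluating the statistic on the training samples would overstate the separation the classifier has found.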