Abstracts

Week of May 5, 2024

May 7, 2024
Applied Mathematics: Graph Cut-based Optimization for Semi-Supervised Learning
2:00pm - LOM 214

I will discuss some recent work on cut-based methods for graph-based semi-supervised learning. Classic methods such as Laplace learning are known to be degenerate in low label rate regimes and depend on carefully chosen heuristics to map their continuous-valued solutions to discrete labels. By considering a spectral relaxation of a graph cut problem, we formulate graph-based semi-supervised learning as the minimization of a quadratic over a Stiefel manifold. We develop sequential subspace methods (SSM) to recover critical points at which the associated multiplier matrix satisfies a certain upper bound condition. These critical points enjoy global optimality in certain special cases and can be found more efficiently by SSM than by alternative methods. Next, to address the difficulty of mapping between continuous-valued solutions and discrete labels, I will introduce an "exact" non-convex relaxation of the cardinality-constrained minimum cut problem with supervision and our ADMM-based algorithm for solving it. This method significantly outperforms the state of the art across various label-rate and imbalanced class regimes. Our work builds on earlier results by Hager and Krylyuk (SIAM Discrete Math, 1999) on graph partitioning by continuous optimization, Hager (SIAM Optimization, 2001) on sequential subspace algorithms for quadratic minimization over the sphere, and Calder, Cook, Thorpe, and Slepcev (ICML, 2020) on graph-based semi-supervised learning. Applications to classification on kNN, citation, and large product graphs at low label rates and in imbalanced class and label regimes will be discussed.
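To fix ideas, one schematic form of the Stiefel-manifold formulation described above is the following; the notation here (an n-vertex graph Laplacian L and k classes) is assumed for illustration rather than taken from the talk, and the handling of the labels is simplified:

\[
\min_{V \in \mathbb{R}^{n \times k}} \; \operatorname{tr}\!\left(V^{\top} L V\right)
\quad \text{subject to} \quad V^{\top} V = I_k ,
\]

where the feasible set \(\{V : V^{\top} V = I_k\}\) is the Stiefel manifold, supervision enters through additional constraints or penalties on the rows of \(V\) corresponding to labeled vertices, and the sequential subspace method searches this set for critical points of the quadratic.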