Spectral Sparsification and Regret Minimization Beyond Matrix Multiplicative Updates


In this paper, we provide a novel construction of the linear-sized spectral sparsifiers of Batson, Spielman and Srivastava. While previous constructions required $\Omega(n^4)$ running time, our sparsification routine can be implemented in almost-quadratic running time $O(n^{2+\varepsilon})$. The fundamental conceptual novelty of our work is the leveraging of a strong connection between sparsification and a regret minimization problem over density matrices. This connection was known to provide an interpretation of the randomized sparsifiers of Spielman and Srivastava via the application of matrix multiplicative weight updates (MWU). In this paper, we explain how matrix MWU naturally arises as an instance of the Follow-the-Regularized-Leader framework and generalize this approach to yield a larger class of updates. This new class allows us to accelerate the construction of linear-sized spectral sparsifiers and gives novel insight into the motivation behind the construction of Batson, Spielman and Srivastava.
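As a point of reference (this is not the paper's accelerated routine), the classical matrix MWU update over density matrices mentioned above can be sketched as follows; the function name `matrix_mwu` and the step size `eta` are illustrative choices, not names from the paper. Each play is $X_t = \exp(-\eta \sum_{s<t} L_s)/\mathrm{Tr}(\cdot)$, which is the Follow-the-Regularized-Leader update with the von Neumann entropy as regularizer.

```python
import numpy as np

def sym_expm(M):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return (V * np.exp(w)) @ V.T  # V diag(exp(w)) V^T

def matrix_mwu(losses, eta=0.5):
    """Play matrix MWU against a sequence of symmetric loss matrices.

    Each iterate is a density matrix (PSD with unit trace) obtained by
    exponentiating the negated cumulative loss and normalizing by the trace.
    """
    n = losses[0].shape[0]
    cumulative = np.zeros((n, n))
    plays = []
    for L in losses:
        X = sym_expm(-eta * cumulative)  # exponentiate past losses
        X /= np.trace(X)                 # normalize to unit trace
        plays.append(X)
        cumulative += L                  # accumulate the observed loss
    return plays

# Example: two rounds of random symmetric losses in dimension 3.
rng = np.random.default_rng(0)
losses = [(A + A.T) / 2 for A in rng.standard_normal((2, 3, 3))]
plays = matrix_mwu(losses)
print(np.allclose(plays[0], np.eye(3) / 3))  # → True: first play is the uniform density matrix I/n
```

With no losses observed yet, the update plays the maximum-entropy density matrix $I/n$; subsequent plays shift weight away from directions that have accumulated large loss.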

Proceedings of the forty-seventh annual ACM symposium on Theory of Computing