License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.ICALP.2016.11
URN: urn:nbn:de:0030-drops-62788
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2016/6278/


Cohen, Michael B.; Nelson, Jelani; Woodruff, David P.

Optimal Approximate Matrix Product in Terms of Stable Rank

pdf-format: LIPIcs-ICALP-2016-11.pdf (0.5 MB)


Abstract

We prove, using the subspace embedding guarantee in a black box way, that one can achieve the spectral norm guarantee for approximate matrix multiplication with a dimensionality-reducing map having m = O(r̃/epsilon^2) rows. Here r̃ is the maximum stable rank, i.e., the squared ratio of Frobenius and operator norms, of the two matrices being multiplied. This is a quantitative improvement over previous work of [Magen and Zouzias, SODA, 2011] and [Kyrillidis et al., arXiv, 2014], and it is also optimal for any oblivious dimensionality-reducing map. Furthermore, because our proofs rely on the subspace embedding property only in a black box way, our theorem applies to a much more general class of sketching matrices than was known before, in addition to achieving better bounds. For example, one can apply our theorem to efficient subspace embeddings such as the Subsampled Randomized Hadamard Transform or sparse subspace embeddings, or even to subspace embedding constructions that may be developed in the future.
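To make the guarantee concrete, the following is a minimal numerical sketch (not the paper's analysis): it draws a plain Gaussian sketching matrix S with a number of rows proportional to r̃/epsilon^2 and compares the spectral-norm error of the sketched product against epsilon*||A||*||B||. The constant c and the Gaussian choice of S are illustrative assumptions; the theorem covers any oblivious subspace embedding, including the SRHT and sparse embeddings.

import numpy as np

rng = np.random.default_rng(0)

def stable_rank(M):
    # r~ = ||M||_F^2 / ||M||_2^2: squared ratio of Frobenius and operator norms.
    return np.linalg.norm(M, "fro") ** 2 / np.linalg.norm(M, 2) ** 2

n, d = 2000, 50
# Matrices with decaying column scales, so stable rank << rank.
A = rng.standard_normal((n, d)) / np.arange(1, d + 1)
B = rng.standard_normal((n, d)) / np.arange(1, d + 1)

eps = 0.5
r_tilde = max(stable_rank(A), stable_rank(B))
c = 8  # hypothetical constant; the theorem asserts only m = O(r~/eps^2)
m = int(np.ceil(c * r_tilde / eps ** 2))

S = rng.standard_normal((m, n)) / np.sqrt(m)  # oblivious Gaussian sketch

err = np.linalg.norm((S @ A).T @ (S @ B) - A.T @ B, 2)
bound = eps * np.linalg.norm(A, 2) * np.linalg.norm(B, 2)
print(f"m = {m}, spectral error = {err:.4f}, eps*||A||*||B|| = {bound:.4f}")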

Our main theorem, via connections with spectral error matrix multiplication proven in previous work, implies quantitative improvements for approximate least squares regression and low rank approximation, and implies faster low rank approximation for popular kernels in machine learning such as the Gaussian and Sobolev kernels. Our main result has already been applied to improve dimensionality reduction guarantees for k-means clustering, and it also implies new results for nonparametric regression.
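As a hedged illustration of one such downstream use, the snippet below applies sketch-and-solve to an overdetermined least squares problem, comparing the sketched solution's residual to the exact one. The sketch size m is an arbitrary illustrative choice, not the bound from the paper, and the Gaussian sketch again stands in for any subspace embedding.

import numpy as np

rng = np.random.default_rng(1)
n, d = 5000, 20
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

m = 400  # m << n; illustrative choice, not the paper's bound
S = rng.standard_normal((m, n)) / np.sqrt(m)

# Exact solve vs. solving the much smaller sketched system min_x ||S(Ax - b)||.
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

print("exact residual:   ", np.linalg.norm(A @ x_exact - b))
print("sketched residual:", np.linalg.norm(A @ x_sketch - b))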

Lastly, we point out that the proof of the "BSS" deterministic row-sampling result of [Batson et al., SICOMP, 2012] can be modified to obtain deterministic row-sampling for approximate matrix product in terms of the stable rank of the matrices. The original "BSS" proof was in terms of the rank rather than the stable rank.
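For context, here is a compact, unoptimized rendition of the original rank-based BSS barrier-potential selection loop of [Batson et al., SICOMP, 2012], which deterministically selects at most d*n reweighted vectors from a decomposition of the identity; the stable-rank modification described above is not reproduced here. Parameter settings follow the BSS paper, while the variable names and the demo are ours.

import numpy as np

def bss_sparsify(V, d=9.0):
    # V: (m, n) rows v_i with sum_i v_i v_i^T = I_n. Returns weights s with at
    # most d*n nonzeros so the eigenvalues of sum_i s_i v_i v_i^T lie within a
    # ((sqrt(d)+1)/(sqrt(d)-1))^2 factor of each other.
    m, n = V.shape
    sd = np.sqrt(d)
    dL, dU = 1.0, (sd + 1) / (sd - 1)          # barrier shifts per step
    l, u = -n * sd, n * (d + sd) / (sd - 1)    # initial lower/upper barriers
    A = np.zeros((n, n))
    s = np.zeros(m)
    for _ in range(int(d * n)):
        lp, up = l + dL, u + dU
        Mu = np.linalg.inv(up * np.eye(n) - A)
        Ml = np.linalg.inv(A - lp * np.eye(n))
        # Potential drops caused by shifting each barrier alone.
        dPhiU = np.trace(np.linalg.inv(u * np.eye(n) - A)) - np.trace(Mu)
        dPhiL = np.trace(Ml) - np.trace(np.linalg.inv(A - l * np.eye(n)))
        # Quadratic forms v_i^T M v_i for all i at once.
        qU1 = np.einsum("ij,jk,ik->i", V, Mu @ Mu, V)
        qU2 = np.einsum("ij,jk,ik->i", V, Mu, V)
        qL1 = np.einsum("ij,jk,ik->i", V, Ml @ Ml, V)
        qL2 = np.einsum("ij,jk,ik->i", V, Ml, V)
        U = qU1 / dPhiU + qU2                  # upper-barrier threshold per vector
        L = qL1 / dPhiL - qL2                  # lower-barrier threshold per vector
        i = int(np.argmax(L - U))              # BSS lemma: some i has L_i >= U_i
        t = 2.0 / (U[i] + L[i])                # any weight with 1/t in [U_i, L_i] works
        A += t * np.outer(V[i], V[i])
        s[i] += t
        l, u = lp, up
    return s, A

# Demo: whiten random vectors so that sum_i v_i v_i^T = I_n exactly.
rng = np.random.default_rng(2)
m, n = 500, 8
X = rng.standard_normal((m, n))
evals, evecs = np.linalg.eigh(X.T @ X)
V = X @ evecs @ np.diag(evals ** -0.5) @ evecs.T
s, A = bss_sparsify(V)
w = np.linalg.eigvalsh(A)
print(f"kept {np.count_nonzero(s)} of {m} vectors, eigenvalue ratio = {w[-1] / w[0]:.2f}")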

BibTeX - Entry

@InProceedings{cohen_et_al:LIPIcs:2016:6278,
  author =	{Michael B. Cohen and Jelani Nelson and David P. Woodruff},
  title =	{{Optimal Approximate Matrix Product in Terms of Stable Rank}},
  booktitle =	{43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)},
  pages =	{11:1--11:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-013-2},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{55},
  editor =	{Ioannis Chatzigiannakis and Michael Mitzenmacher and Yuval Rabani and Davide Sangiorgi},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{http://drops.dagstuhl.de/opus/volltexte/2016/6278},
  URN =		{urn:nbn:de:0030-drops-62788},
  doi =		{10.4230/LIPIcs.ICALP.2016.11},
  annote =	{Keywords: subspace embeddings, approximate matrix multiplication, stable rank, regression, low rank approximation}
}

Keywords: subspace embeddings, approximate matrix multiplication, stable rank, regression, low rank approximation
Collection: 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)
Issue Date: 2016
Date of publication: 23.08.2016

