License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.APPROX-RANDOM.2016.7
URN: urn:nbn:de:0030-drops-66304
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2016/6630/


Cohen, Michael B. ; Musco, Cameron ; Pachocki, Jakub

Online Row Sampling

pdf-format:
LIPIcs-APPROX-RANDOM-2016-7.pdf (0.5 MB)


Abstract

Finding a small spectral approximation for a tall n x d matrix A is a fundamental numerical primitive. For a number of reasons, one often seeks an approximation whose rows are sampled from those of A. Row sampling improves interpretability, saves space when A is sparse, and preserves row structure, which is especially important, for example, when A represents a graph.

However, correctly sampling rows from A can be costly when the matrix is large and cannot be stored and processed in memory. Hence, a number of recent publications focus on row sampling in the streaming setting, using little more space than is required to store the output approximation [Kelner Levin 2013] [Kapralov et al. 2014].

Inspired by a growing body of work on online algorithms for machine learning and data analysis, we extend this work to a more restrictive online setting: we read rows of A one by one and immediately decide whether each row should be kept in the spectral approximation or discarded, without ever retracting these decisions. We present an extremely simple algorithm that approximates A up to multiplicative error epsilon and additive error delta using O(d log d log (epsilon ||A||_2^2/delta) / epsilon^2) online samples, with memory overhead proportional to the cost of storing the spectral approximation. We also present an algorithm that uses O(d^2) memory but only requires O(d log (epsilon ||A||_2^2/delta) / epsilon^2) samples, which we show is optimal.
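
To make the sampling rule concrete, the following is a minimal Python sketch of online leverage-score row sampling in the spirit of the algorithm described above; it is an illustration under assumed parameter choices (the oversampling constant c, the regularization lam = delta/eps, and the Sherman-Morrison update of the inverse Gram matrix), not the paper's exact procedure. Here the kept, rescaled rows form a matrix B whose Gram matrix B^T B is meant to approximate A^T A up to multiplicative error epsilon and additive error delta.

import numpy as np

def online_row_sample(rows, eps=0.5, delta=1e-3, c=8.0, seed=0):
    """Stream over the rows of A, keeping each row with probability
    proportional to an online (ridge) leverage score estimated against
    the approximation built so far. Kept rows are rescaled so that the
    sampled matrix matches A^T A in expectation."""
    rng = np.random.default_rng(seed)
    d = len(rows[0])
    lam = delta / eps                     # ridge term for the additive error
    inv_gram = np.eye(d) / lam            # inverse of (B^T B + lam * I)
    kept = []
    for a in rows:
        a = np.asarray(a, dtype=float)
        # Approximate online ridge leverage score of the incoming row.
        score = min(float(a @ inv_gram @ a), 1.0)
        # Oversampled keep probability (c and the log factor are illustrative).
        p = min(c * np.log(d + 1) * score / eps**2, 1.0)
        if rng.random() < p:
            w = 1.0 / np.sqrt(p)          # rescale so the kept row contributes a a^T / p
            wa = w * a
            kept.append(wa)
            # Sherman-Morrison update of inv(B^T B + lam * I) after adding wa.
            v = inv_gram @ wa
            inv_gram -= np.outer(v, v) / (1.0 + wa @ v)
    return np.array(kept)

Each incoming row is examined once and either kept (rescaled) or discarded for good, so memory is dominated by the kept rows plus the d x d inverse Gram matrix, matching the online setting described in the abstract.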

Our methods are clean and intuitive, allow for lower memory usage than prior work, and expose new theoretical properties of leverage score based matrix approximation.

BibTeX - Entry

@InProceedings{cohen_et_al:LIPIcs:2016:6630,
  author =	{Michael B. Cohen and Cameron Musco and Jakub Pachocki},
  title =	{{Online Row Sampling}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2016)},
  pages =	{7:1--7:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-018-7},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{60},
  editor =	{Klaus Jansen and Claire Mathieu and Jos{\'e} D. P. Rolim and Chris Umans},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{http://drops.dagstuhl.de/opus/volltexte/2016/6630},
  URN =		{urn:nbn:de:0030-drops-66304},
  doi =		{10.4230/LIPIcs.APPROX-RANDOM.2016.7},
  annote =	{Keywords: spectral sparsification, leverage score sampling, online sparsification}
}

Keywords: spectral sparsification, leverage score sampling, online sparsification
Collection: Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2016)
Issue Date: 2016
Date of publication: 06.09.2016

