License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.ITCS.2018.28
URN: urn:nbn:de:0030-drops-83374
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2018/8337/


Moshkovitz, Dana ; Moshkovitz, Michal

Entropy Samplers and Strong Generic Lower Bounds For Space Bounded Learning

pdf-format:
LIPIcs-ITCS-2018-28.pdf (0.6 MB)


Abstract

With any hypothesis class one can associate a bipartite graph whose vertices are the hypotheses H on one side and all possible labeled examples X on the other side; a hypothesis is connected to all the labeled examples that are consistent with it. We call this graph the hypotheses graph. We prove that any hypothesis class whose hypotheses graph is mixing cannot be learned using fewer than Omega(log^2 |H|) memory bits unless the learner uses at least |H|^Omega(1) labeled examples. Our work builds on a combinatorial framework that we suggested in a previous work for proving lower bounds on space-bounded learning. The strong lower bound is obtained by defining a new notion of pseudorandomness, the entropy sampler. Raz obtained a similar result using different ideas.
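To make the hypotheses graph concrete, here is a small illustrative sketch (not taken from the paper) that builds it for a toy class of threshold functions over a four-element domain; the class, domain size, and function names are assumptions chosen only for illustration:

```python
from itertools import product

# Toy hypothesis class (hypothetical example): threshold functions
# h_t(x) = 1 iff x >= t, over the domain {0, 1, 2, 3}.
domain = range(4)
thresholds = range(5)  # hypotheses h_0, ..., h_4

def h(t, x):
    """Threshold hypothesis: label 1 iff x >= t."""
    return 1 if x >= t else 0

# The other side of the bipartite graph: all labeled examples (x, b).
examples = list(product(domain, [0, 1]))

# Hypotheses graph: hypothesis t is adjacent to every labeled example
# it is consistent with, i.e. every (x, b) with h_t(x) = b.
edges = {t: [(x, b) for (x, b) in examples if h(t, x) == b]
         for t in thresholds}

# Each hypothesis is consistent with exactly one label per point,
# so every hypothesis has degree |domain| = 4.
assert all(len(edges[t]) == 4 for t in thresholds)
```

The lower bound in the paper concerns classes whose hypotheses graph mixes well; this tiny threshold class is only meant to show what the vertices and edges of the graph are.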

BibTeX - Entry

@InProceedings{moshkovitz_et_al:LIPIcs:2018:8337,
  author =	{Dana Moshkovitz and Michal Moshkovitz},
  title =	{{Entropy Samplers and Strong Generic Lower Bounds For Space Bounded Learning}},
  booktitle =	{9th Innovations in Theoretical Computer Science Conference (ITCS 2018)},
  pages =	{28:1--28:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-060-6},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{94},
  editor =	{Anna R. Karlin},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{http://drops.dagstuhl.de/opus/volltexte/2018/8337},
  URN =		{urn:nbn:de:0030-drops-83374},
  doi =		{10.4230/LIPIcs.ITCS.2018.28},
  annote =	{Keywords: learning, space bound, mixing, certainty, entropy sampler}
}

Keywords: learning, space bound, mixing, certainty, entropy sampler
Collection: 9th Innovations in Theoretical Computer Science Conference (ITCS 2018)
Issue Date: 2018
Date of publication: 12.01.2018

