License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.ITCS.2019.12
URN: urn:nbn:de:0030-drops-101053
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2018/10105/


Ben-Sasson, Eli ; Saig, Eden

The Complexity of User Retention

pdf-format:
LIPIcs-ITCS-2019-12.pdf (0.5 MB)


Abstract

This paper studies families of distributions T that are amenable to retentive learning, meaning that an expert can retain users who seek to predict their future, assuming user attributes are sampled from T and exposed gradually over time. Limited attention span is the main problem experts face in our model. We make two contributions.
First, we formally define the notions of retentively learnable distributions and properties. Along the way, we define a retention complexity measure of distributions and a natural class of retentive scoring rules that model the way users evaluate experts they interact with. These rules are shown to be tightly connected to the truth-eliciting "proper scoring rules" studied in Decision Theory since the 1950s [McCarthy, PNAS 1956].
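As a minimal illustration of properness (using the classical quadratic rule, not the paper's retentive rules): a scoring rule is proper when a forecaster maximizes their expected score by reporting the true probability. The sketch below verifies this numerically for a hypothetical true probability p = 0.7.

```python
def quadratic_score(report: float, outcome: int) -> float:
    """Quadratic (Brier-style) scoring rule for a binary event; higher is better."""
    return 1.0 - (outcome - report) ** 2

def expected_score(report: float, p: float) -> float:
    """Expected score of a report when the event occurs with true probability p."""
    return p * quadratic_score(report, 1) + (1 - p) * quadratic_score(report, 0)

p = 0.7  # assumed true probability, chosen for the example
reports = [i / 100 for i in range(101)]
best = max(reports, key=lambda r: expected_score(r, p))
assert abs(best - p) < 1e-9  # truthful reporting maximizes expected score
```

Expanding the expectation gives 1 - p(1-r)^2 - (1-p)r^2, whose derivative 2(p - r) vanishes exactly at r = p, which is what the grid search confirms.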
Second, we take a first step towards relating retention complexity to other measures of significance in computational complexity. In particular, we show that linear properties (over the binary field) are retentively learnable, whereas random Low Density Parity Check (LDPC) codes have, with high probability, maximal retention complexity. Intriguingly, these results resemble known results from the field of property testing and suggest that deeper connections between retentive distributions and locally testable properties may exist.
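To make the notion of a linear property over the binary field concrete: such a property is the kernel of a parity-check matrix H over GF(2), and it is closed under coordinate-wise XOR. The toy matrix below is hand-picked for illustration and does not come from the paper.

```python
import itertools

# Hypothetical parity-check matrix over GF(2): two constraints on 5-bit words.
H = [
    [1, 1, 0, 1, 0],
    [0, 1, 1, 0, 1],
]

def in_property(x) -> bool:
    """Word x satisfies the linear property iff every parity check of H holds mod 2."""
    return all(sum(h * xi for h, xi in zip(row, x)) % 2 == 0 for row in H)

# Enumerate the property: the kernel of a rank-2 matrix on 5 bits has 2^3 = 8 words.
code = [x for x in itertools.product([0, 1], repeat=5) if in_property(x)]
assert len(code) == 8

# Linearity: the XOR of any two words in the property is again in the property.
for a in code:
    for b in code:
        assert in_property(tuple(ai ^ bi for ai, bi in zip(a, b)))
```

LDPC codes are defined by the same kind of parity-check structure, only with sparse rows; the paper's result is that random such codes have, with high probability, maximal retention complexity, in contrast to the retentive learnability of linear properties.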

BibTeX - Entry

@InProceedings{bensasson_et_al:LIPIcs:2018:10105,
  author =	{Eli Ben-Sasson and Eden Saig},
  title =	{{The Complexity of User Retention}},
  booktitle =	{10th Innovations in Theoretical Computer Science Conference (ITCS 2019)},
  pages =	{12:1--12:30},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-095-8},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{124},
  editor =	{Avrim Blum},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{http://drops.dagstuhl.de/opus/volltexte/2018/10105},
  URN =		{urn:nbn:de:0030-drops-101053},
  doi =		{10.4230/LIPIcs.ITCS.2019.12},
  annote =	{Keywords: retentive learning, retention complexity, information elicitation, proper scoring rules}
}

Keywords: retentive learning, retention complexity, information elicitation, proper scoring rules
Collection: 10th Innovations in Theoretical Computer Science Conference (ITCS 2019)
Issue Date: 2018
Date of publication: 08.01.2019


Published by LZI