License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.ICALP.2016.87
URN: urn:nbn:de:0030-drops-62203

Braverman, Mark; Schneider, Jon

Information Complexity Is Computable

LIPIcs-ICALP-2016-87.pdf (0.5 MB)


The information complexity of a function f is the minimum amount of information Alice and Bob must exchange in order to compute f. In this paper we provide an algorithm for approximating the information complexity of an arbitrary function f to within any additive error epsilon > 0, thus resolving the open question of whether information complexity is computable.

In the process, we give the first explicit upper bound on the rate at which the information complexity of f restricted to b-bit protocols converges to the (unrestricted) information complexity of f.
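The paper's algorithm itself is not reproduced on this page, but the quantity it approximates, the information cost of a protocol, namely I(X; T | Y) + I(Y; T | X) where T is the transcript, can be illustrated on a toy example. The sketch below (not from the paper; the one-bit protocol and uniform input distribution are assumptions chosen for illustration) computes the information cost of a protocol in which Alice simply sends her input bit:

```python
from itertools import product
from math import log2

def cond_mi(joint, a_idx, b_idx, c_idx):
    """Conditional mutual information I(A; B | C), where `joint` maps
    outcome tuples to probabilities and the indices select coordinates."""
    def marg(idxs):
        # Marginal distribution over the selected coordinates.
        m = {}
        for outcome, p in joint.items():
            key = tuple(outcome[i] for i in idxs)
            m[key] = m.get(key, 0.0) + p
        return m

    p_abc = marg([a_idx, b_idx, c_idx])
    p_ac = marg([a_idx, c_idx])
    p_bc = marg([b_idx, c_idx])
    p_c = marg([c_idx])
    total = 0.0
    for (a, b, c), p in p_abc.items():
        if p > 0:
            total += p * log2(p * p_c[(c,)] / (p_ac[(a, c)] * p_bc[(b, c)]))
    return total

# Toy one-bit protocol: Alice sends her bit, so the transcript t equals x.
# Inputs (x, y) are uniform; outcomes are triples (x, y, t).
joint = {(x, y, x): 0.25 for x, y in product([0, 1], repeat=2)}

# Information cost = I(X; T | Y) + I(Y; T | X).
ic = cond_mi(joint, 0, 2, 1) + cond_mi(joint, 1, 2, 0)
print(ic)  # 1.0: the transcript reveals exactly Alice's one bit
```

Information complexity is the infimum of this cost over all protocols computing f (and over the hardest input distribution); the paper's contribution is that optimizing over b-bit protocols for an explicitly computable b suffices to approximate that infimum to within epsilon.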

BibTeX - Entry

@InProceedings{braverman_et_al:LIPIcs.ICALP.2016.87,
  author =	{Mark Braverman and Jon Schneider},
  title =	{{Information Complexity Is Computable}},
  booktitle =	{43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)},
  pages =	{87:1--87:10},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-013-2},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{55},
  editor =	{Ioannis Chatzigiannakis and Michael Mitzenmacher and Yuval Rabani and Davide Sangiorgi},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{},
  URN =		{urn:nbn:de:0030-drops-62203},
  doi =		{10.4230/LIPIcs.ICALP.2016.87},
  annote =	{Keywords: Communication complexity, convergence rate, information complexity}
}

Keywords: Communication complexity, convergence rate, information complexity
Collection: 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)
Issue Date: 2016
Date of publication: 23.08.2016
