License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.CCC.2016.6
URN: urn:nbn:de:0030-drops-58450
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2016/5845/


Guruswami, Venkatesan; Radhakrishnan, Jaikumar

Tight Bounds for Communication-Assisted Agreement Distillation



Abstract

Suppose Alice holds a uniformly random string X in {0,1}^N and Bob holds a noisy version Y of X where each bit of X is flipped independently with probability epsilon in [0,1/2]. Alice and Bob would like to extract a common random string of min-entropy at least k. In this work, we establish the communication versus success probability trade-off for this problem by giving a protocol and a matching lower bound (under the restriction that the string to be agreed upon is determined by Alice's input X). Specifically, we prove that in order for Alice and Bob to agree on a common string with probability 2^{-gamma*k} (gamma*k >= 1), the optimal communication (up to o(k) terms, and achievable for large N) is precisely (C*(1-gamma) - 2*sqrt(C*(1-C)*gamma)) * k, where C := 4*epsilon*(1-epsilon). In particular, the optimal communication to achieve Omega(1) agreement probability approaches 4*epsilon*(1-epsilon)*k.
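As a numeric illustration of the stated trade-off (an editorial sketch, not part of the paper), the following Python snippet evaluates (C*(1-gamma) - 2*sqrt(C*(1-C)*gamma)) * k with C = 4*epsilon*(1-epsilon). The clamp at zero is our reading: a negative value would simply mean the target agreement probability is already reachable with no communication.

```python
import math

def bsc_communication_bound(eps: float, gamma: float, k: float) -> float:
    """Evaluate (C*(1-gamma) - 2*sqrt(C*(1-C)*gamma)) * k with C = 4*eps*(1-eps),
    the communication (up to o(k) terms) stated in the abstract for agreeing
    with probability 2^{-gamma*k} in the bit-flip model.
    Clamping at 0 is an assumption: a negative value is read as
    "no communication needed" for the target agreement probability."""
    C = 4 * eps * (1 - eps)
    bound = (C * (1 - gamma) - 2 * math.sqrt(C * (1 - C) * gamma)) * k
    return max(bound, 0.0)

# Example: flip probability 0.1, target min-entropy k = 1000 bits.
for gamma in (0.001, 0.01, 0.1):
    print(f"gamma={gamma}: ~{bsc_communication_bound(0.1, gamma, 1000):.1f} bits")
```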

We also consider the case when Y is the output of the binary erasure channel on X, where each bit of Y equals the corresponding bit of X with probability 1-epsilon and is otherwise erased (that is, replaced by a "?"). In this case, the communication required becomes (epsilon*(1-gamma) - 2*sqrt(epsilon*(1-epsilon)*gamma)) * k. In particular, the optimal communication to achieve Omega(1) agreement probability approaches epsilon*k, and with no communication the optimal agreement probability approaches 2^{-((1-sqrt(1-epsilon))/(1+sqrt(1-epsilon))) * k}.
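To make the erasure-model expressions concrete, here is a similar Python sketch (again an illustration, not from the paper) that evaluates the communication bound and the zero-communication exponent quoted above; as before, clamping negative values at zero is our assumption that no communication is then required.

```python
import math

def bec_communication_bound(eps: float, gamma: float, k: float) -> float:
    """Evaluate (epsilon*(1-gamma) - 2*sqrt(epsilon*(1-epsilon)*gamma)) * k,
    the communication (up to o(k)) stated for the erasure model with erasure
    probability eps; clamping at 0 is an assumption (no communication needed)."""
    bound = (eps * (1 - gamma) - 2 * math.sqrt(eps * (1 - eps) * gamma)) * k
    return max(bound, 0.0)

def bec_zero_comm_exponent(eps: float) -> float:
    """Exponent c such that the zero-communication agreement probability
    approaches 2^{-c*k}: c = (1 - sqrt(1-eps)) / (1 + sqrt(1-eps))."""
    s = math.sqrt(1 - eps)
    return (1 - s) / (1 + s)

# Example: erasure probability 0.3, target min-entropy k = 1000 bits.
print(bec_communication_bound(0.3, 0.01, 1000))  # bits of communication
print(bec_zero_comm_exponent(0.3))               # roughly 0.089
```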

Our protocols are based on covering codes and extend the approach of (Bogdanov and Mossel, 2011) for the zero-communication case. Our lower bounds rely on hypercontractive inequalities. For the model of bit-flips, our argument extends the approach of (Bogdanov and Mossel, 2011) by allowing communication; for the erasure model, to the best of our knowledge the needed hypercontractivity statement had not been studied before, and it was established, in light of our application, by (Nair and Wang, 2015). We also obtain information complexity lower bounds for these tasks, and together with our protocol, they shed light on the recently popular "most informative Boolean function" conjecture of Courtade and Kumar.

BibTeX Entry

@InProceedings{guruswami_et_al:LIPIcs:2016:5845,
  author =	{Venkatesan Guruswami and Jaikumar Radhakrishnan},
  title =	{{Tight Bounds for Communication-Assisted Agreement Distillation}},
  booktitle =	{31st Conference on Computational Complexity (CCC 2016)},
  pages =	{6:1--6:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-008-8},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{50},
  editor =	{Ran Raz},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{http://drops.dagstuhl.de/opus/volltexte/2016/5845},
  URN =		{urn:nbn:de:0030-drops-58450},
  doi =		{10.4230/LIPIcs.CCC.2016.6},
  annote =	{Keywords: communication complexity, covering codes, hypercontractivity, information theory, lower bounds, pseudorandomness}
}

Keywords: communication complexity, covering codes, hypercontractivity, information theory, lower bounds, pseudorandomness
Collection: 31st Conference on Computational Complexity (CCC 2016)
Issue Date: 2016
Date of publication: 19.05.2016

