License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.APPROX-RANDOM.2015.544
URN: urn:nbn:de:0030-drops-53234
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2015/5323/


Bottesch, Ralph Christian; Gavinsky, Dmitry; Klauck, Hartmut

Correlation in Hard Distributions in Communication Complexity



Abstract

We study the effect that the amount of correlation in a bipartite distribution has on the communication complexity of a problem under that distribution. We introduce a new family of complexity measures that interpolates between the two previously studied extreme cases: the (standard) randomised communication complexity and the case of distributional complexity under product distributions.

- We give a tight characterisation of the randomised complexity of Disjointness under distributions with mutual information k, showing that it is Theta(sqrt(n(k+1))) for all 0 <= k <= n. This smoothly interpolates between the lower bounds of Babai, Frankl and Simon for the product-distribution case (k=0) and the bound of Razborov for the randomised case (a sanity check of both endpoints is sketched after this list). The upper bounds improve and generalise what was known for product distributions, and imply that any tight bound for Disjointness needs Omega(n) bits of mutual information in the corresponding distribution.

- We study the same question in the distributional quantum setting, and show a lower bound of Omega((n(k+1))^{1/4}) and an upper bound (obtained by constructing explicit communication protocols) that matches up to a logarithmic factor.

- We show that there are total Boolean functions f_d that have distributional communication complexity O(log(n)) under all distributions with mutual information up to o(n), while the (interactive) distributional complexity maximised over all distributions is Theta(log(d)) for n <= d <= 2^{n/100}. This shows, in particular, that the amount of correlation needed to show that a problem is hard can be much larger than the communication complexity of the problem itself.

- We show that in the setting of one-way communication under product distributions, the dependence of the communication cost on the allowed error epsilon is multiplicative in log(1/epsilon); the previous upper bounds had a dependence of at least 1/epsilon. This result explains, for the first time, how one-way communication complexity under product distributions is stronger than PAC-learning: both tasks are characterised by the VC-dimension, but they have very different error dependence (when learning from examples, reducing the error costs more); see the comparison sketched below.
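As a quick sanity check of the first two items, the stated bounds can be evaluated at their endpoints (a sketch in LaTeX, using only the bounds quoted in the abstract; the constants and the exact definition of the information measure are as in the paper):

% Endpoints of the randomised bound Theta(sqrt(n(k+1))) for Disjointness:
%   k = 0 (product distributions):  Theta(sqrt(n))  -- Babai, Frankl and Simon
%   k = n (unrestricted):           Theta(n)        -- Razborov
\[
  \Theta\!\bigl(\sqrt{n(k+1)}\bigr) =
  \begin{cases}
    \Theta(\sqrt{n}), & k = 0,\\
    \Theta(n),        & k = n.
  \end{cases}
\]
% The quantum lower bound Omega((n(k+1))^{1/4}) interpolates analogously:
%   k = 0: Omega(n^{1/4}),   k = n: Omega(sqrt(n)),
% the latter matching Razborov's Omega(sqrt(n)) quantum bound for Disjointness.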
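The contrast in the last item can be summarised as follows. This is a hedged sketch: the one-way bound is written in the multiplicative form claimed above, with the notation D^{1,prod}_eps invented here purely for illustration; the PAC bound is the standard realizable-case sample complexity, quoted only for contrast.

% One-way communication under product distributions (form claimed above):
%   the cost grows multiplicatively in log(1/eps), governed by the VC-dimension.
% PAC-learning from examples (standard sample-complexity bound):
%   the cost grows multiplicatively in 1/eps.
\[
  \mathrm{D}^{1,\mathrm{prod}}_{\varepsilon}(f)
    = O\!\bigl(\mathrm{VC}(f)\cdot\log(1/\varepsilon)\bigr)
  \qquad\text{vs.}\qquad
  m_{\mathrm{PAC}}(\varepsilon,\delta)
    = \Theta\!\Bigl(\tfrac{\mathrm{VC}(f)+\log(1/\delta)}{\varepsilon}\Bigr).
\]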

BibTeX - Entry

@InProceedings{bottesch_et_al:LIPIcs:2015:5323,
  author =	{Ralph Christian Bottesch and Dmitry Gavinsky and Hartmut Klauck},
  title =	{{Correlation in Hard Distributions in Communication Complexity}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2015)},
  pages =	{544--572},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-89-7},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{40},
  editor =	{Naveen Garg and Klaus Jansen and Anup Rao and Jos{\'e} D. P. Rolim},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{http://drops.dagstuhl.de/opus/volltexte/2015/5323},
  URN =		{urn:nbn:de:0030-drops-53234},
  doi =		{10.4230/LIPIcs.APPROX-RANDOM.2015.544},
  annote =	{Keywords: communication complexity; information theory}
}

Keywords: communication complexity; information theory
Collection: Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2015)
Issue Date: 2015
Date of publication: 13.08.2015

