License: Creative Commons Attribution 4.0 International license (CC BY 4.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.ITCS.2022.95
URN: urn:nbn:de:0030-drops-156912
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2022/15691/


Kong, Yuqing

More Dominantly Truthful Multi-Task Peer Prediction with a Finite Number of Tasks

pdf-format:
LIPIcs-ITCS-2022-95.pdf (2 MB)


Abstract

In the setting where we ask participants multiple similar, possibly subjective multiple-choice questions (e.g., Do you like Bulbasaur? Y/N; Do you like Squirtle? Y/N), peer prediction aims to design mechanisms that encourage honest feedback without verification. A series of works have successfully designed multi-task peer prediction mechanisms where reporting truthfully is better than any other strategy (dominantly truthful), but they require an infinite number of tasks. A recent work proposes the first multi-task peer prediction mechanism, the Determinant Mutual Information (DMI)-Mechanism, which is not only dominantly truthful but also works for a finite number of tasks (practical).
However, whether other practical dominantly-truthful multi-task peer prediction mechanisms exist remains an open question. This work answers the above question by providing
- a new family of information-monotone information measures: volume mutual information (VMI), where DMI is a special case;
- a new family of practical dominantly-truthful multi-task peer prediction mechanisms, VMI-Mechanisms.
To illustrate the importance of VMI-Mechanisms, we also provide a tractable effort incentive optimization goal. We show that DMI-Mechanism may not be optimal, but we can construct a sequence of VMI-Mechanisms that are approximately optimal.
The main technical highlight of this paper is a novel geometric information measure, Volume Mutual Information, based on a simple idea: we can measure an object A's information amount by the volume of objects that are less informative than A. Different densities over the objects lead to different information measures. This also gives Determinant Mutual Information a simple geometric interpretation.
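As a rough illustration of the DMI-Mechanism from the prior work referenced above (a sketch under our own reading, not code from this paper): for binary questions, DMI is the absolute determinant of the 2x2 joint answer distribution matrix, and the mechanism pays each pair of agents the product of determinants of their joint report-count matrices over two disjoint batches of tasks, which is an unbiased estimator of the squared determinant up to scaling.

```python
# Hedged sketch of the DMI-Mechanism payment for binary answers (0/1).
# Function names are illustrative, not from the paper.

def joint_counts(a, b):
    """2x2 matrix counting how often two agents' answers co-occur per task."""
    m = [[0, 0], [0, 0]]
    for x, y in zip(a, b):
        m[x][y] += 1
    return m

def det2(m):
    """Determinant of a 2x2 matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def dmi_payment(reports_a, reports_b):
    """Split the tasks into two disjoint halves and multiply the two
    determinants; truthful, informative reporting maximizes this in
    expectation (the dominant-truthfulness property)."""
    n = len(reports_a) // 2
    m1 = joint_counts(reports_a[:n], reports_b[:n])
    m2 = joint_counts(reports_a[n:], reports_b[n:])
    return det2(m1) * det2(m2)

# Perfectly correlated reports yield a large positive payment, while
# uninformative (independent) reports yield roughly zero in expectation.
a = [0, 1, 0, 1, 0, 1, 0, 1]
print(dmi_payment(a, a))  # prints 16
```

Note the two disjoint task batches: using the same tasks twice would bias the determinant estimate, which is why the mechanism needs a finite but non-trivial number of tasks.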

BibTeX - Entry

@InProceedings{kong:LIPIcs.ITCS.2022.95,
  author =	{Kong, Yuqing},
  title =	{{More Dominantly Truthful Multi-Task Peer Prediction with a Finite Number of Tasks}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{95:1--95:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/opus/volltexte/2022/15691},
  URN =		{urn:nbn:de:0030-drops-156912},
  doi =		{10.4230/LIPIcs.ITCS.2022.95},
  annote =	{Keywords: Information elicitation, information theory}
}

Keywords: Information elicitation, information theory
Collection: 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)
Issue Date: 2022
Date of publication: 25.01.2022

