License: Creative Commons Attribution 4.0 International license (CC BY 4.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.CCC.2023.12
URN: urn:nbn:de:0030-drops-182825
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2023/18282/


Goldberg, Halley ; Kabanets, Valentine

Improved Learning from Kolmogorov Complexity

pdf-format:
LIPIcs-CCC-2023-12.pdf (0.9 MB)


Abstract

Carmosino, Impagliazzo, Kabanets, and Kolokolova (CCC, 2016) showed that the existence of natural properties in the sense of Razborov and Rudich (JCSS, 1997) implies PAC learning algorithms in the sense of Valiant (Comm. ACM, 1984) for boolean functions in P/poly, under the uniform distribution and with membership queries. It remains an open problem to obtain, from natural properties, learning algorithms that do not rely on membership queries but instead use randomly drawn labeled examples.
Natural properties may be understood as an average-case version of MCSP, the problem of deciding the minimum size of a circuit computing a given truth-table. Problems related to MCSP include those concerning time-bounded Kolmogorov complexity. MKTP, for example, asks for the KT-complexity of a given string. KT-complexity is a relaxation of circuit size, as it does away with the requirement that a short description of a string be interpreted as a boolean circuit. In this work, under the assumption that MKTP and the related problem MK^tP are easy on average, we obtain learning algorithms for boolean functions in P/poly that
- work over any distribution D samplable by a family of polynomial-size circuits (given explicitly in the case of MKTP),
- only use randomly drawn labeled examples from D, and
- are agnostic (do not require the target function to belong to the hypothesis class).

Our results build upon the recent work of Hirahara and Nanashima (FOCS, 2021), who showed similar learning consequences under the stronger assumption that NP is easy on average.
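For orientation, the two complexity measures named in the abstract can be sketched roughly as follows (following the standard definitions in the meta-complexity literature; the paper's exact formulations may differ in details such as how the string length is encoded):

```latex
% KT-complexity (Allender et al.): the description p need only allow a
% universal machine U to answer bit queries about x quickly; the running
% time t is charged alongside the description length |p|.
\mathrm{KT}(x) \;=\; \min_{p,\,t}\bigl\{\, |p| + t \;:\;
  \forall\, i \le |x|+1,\;\; U^{p}(i,b) \text{ accepts in } t
  \text{ steps iff } x_i = b \,\bigr\}

% t-time-bounded Kolmogorov complexity, for a time bound t(n): the
% shortest program that prints x within t(|x|) steps of U.
K^{t}(x) \;=\; \min\bigl\{\, |p| \;:\; U(p) \text{ outputs } x
  \text{ within } t(|x|) \text{ steps} \,\bigr\}
```

MKTP and MK^tP are then the problems of deciding, given a string x and a threshold, whether KT(x) (respectively K^t(x)) is below that threshold.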

BibTeX - Entry

@InProceedings{goldberg_et_al:LIPIcs.CCC.2023.12,
  author =	{Goldberg, Halley and Kabanets, Valentine},
  title =	{{Improved Learning from Kolmogorov Complexity}},
  booktitle =	{38th Computational Complexity Conference (CCC 2023)},
  pages =	{12:1--12:29},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-282-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{264},
  editor =	{Ta-Shma, Amnon},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/opus/volltexte/2023/18282},
  URN =		{urn:nbn:de:0030-drops-182825},
  doi =		{10.4230/LIPIcs.CCC.2023.12},
  annote =	{Keywords: learning, Kolmogorov complexity, meta-complexity, average-case complexity}
}

Keywords: learning, Kolmogorov complexity, meta-complexity, average-case complexity
Collection: 38th Computational Complexity Conference (CCC 2023)
Issue Date: 2023
Date of publication: 10.07.2023

