Abstract
Carmosino, Impagliazzo, Kabanets, and Kolokolova (CCC, 2016) showed that the existence of natural properties in the sense of Razborov and Rudich (JCSS, 1997) implies PAC learning algorithms in the sense of Valiant (Comm. ACM, 1984), for boolean functions in P/poly, under the uniform distribution and with membership queries. It remains an open problem to obtain, from natural properties, learning algorithms that do not rely on membership queries but instead use randomly drawn labeled examples.
Natural properties may be understood as an average-case version of MCSP, the problem of deciding the minimum size of a circuit computing a given truth-table. Problems related to MCSP include those concerning time-bounded Kolmogorov complexity. MKTP, for example, asks for the KT-complexity of a given string. KT-complexity is a relaxation of circuit size, as it does away with the requirement that a short description of a string be interpreted as a boolean circuit. In this work, under assumptions of MKTP and the related problem MK^tP being easy on average, we get learning algorithms for boolean functions in P/poly that
- work over any distribution D samplable by a family of polynomial-size circuits (given explicitly in the case of MKTP),
- only use randomly drawn labeled examples from D, and
- are agnostic (do not require the target function to belong to the hypothesis class).

Our results build upon the recent work of Hirahara and Nanashima (FOCS, 2021), who showed similar learning consequences but under a stronger assumption that NP is easy on average.
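For context, KT-complexity can be sketched in LaTeX as follows. This is one standard formulation (following Allender et al.), not a definition given in the abstract itself: a string has low KT-complexity if a short program p lets a universal machine U answer queries about individual bits of x quickly, with the program length and the running time charged jointly.

```latex
% KT-complexity of a string x (one standard formulation):
% minimize program length plus per-bit query time.
\[
  \mathrm{KT}(x) \;=\; \min \bigl\{\, |p| + t \;:\;
    \forall i \le |x|+1,\;
    U^{p}(i) \text{ outputs } x_i \text{ within } t \text{ steps} \,\bigr\}
\]
% (with the convention that x_{|x|+1} is an end-of-string symbol,
% so that |x| is recoverable from the oracle p).
```

Under this view, circuit size for the truth-table x corresponds to the special case where p must be (an encoding of) a boolean circuit evaluated on input i; dropping that requirement is the relaxation the abstract refers to.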
BibTeX Entry
@InProceedings{goldberg_et_al:LIPIcs.CCC.2023.12,
author = {Goldberg, Halley and Kabanets, Valentine},
title = {{Improved Learning from Kolmogorov Complexity}},
booktitle = {38th Computational Complexity Conference (CCC 2023)},
pages = {12:1--12:29},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-95977-282-2},
ISSN = {1868-8969},
year = {2023},
volume = {264},
editor = {Ta-Shma, Amnon},
publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
address = {Dagstuhl, Germany},
URL = {https://drops.dagstuhl.de/opus/volltexte/2023/18282},
URN = {urn:nbn:de:0030-drops-182825},
doi = {10.4230/LIPIcs.CCC.2023.12},
annote = {Keywords: learning, Kolmogorov complexity, meta-complexity, average-case complexity}
}
Keywords: learning, Kolmogorov complexity, meta-complexity, average-case complexity
Collection: 38th Computational Complexity Conference (CCC 2023)
Issue Date: 2023
Date of publication: 10.07.2023