License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.STACS.2016.54
URN: urn:nbn:de:0030-drops-57550
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2016/5755/


Milovanov, Alexey

Algorithmic Statistics, Prediction and Machine Learning



Abstract

Algorithmic statistics considers the following problem: given a binary string x (e.g., some experimental data), find a "good" explanation of this data. It uses algorithmic information theory to give a formal definition of what a good explanation is. In this paper we extend this framework in two directions.

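As background (a standard notion from algorithmic statistics, recalled here as a hedged sketch rather than as the paper's exact definitions): an explanation of a string x is typically a finite set A containing x, and its quality is judged by the two-part description length

    C(A) + \log_2 |A| \;\ge\; C(x) - O(1),

where C denotes Kolmogorov complexity. The set A counts as a good explanation when C(A) is small and the left-hand side is close to C(x); equivalently, when the randomness deficiency d(x \mid A) = \log_2 |A| - C(x \mid A) is small, so that x looks like a typical element of A.
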
First, the explanations are not only interesting in themselves but are also used for prediction: we want to know what kind of data we may reasonably expect in similar situations (when the same experiment is repeated). We show that some kind of hierarchy can be constructed both in terms of algorithmic statistics and using the notion of a priori probability, and that these two approaches turn out to be equivalent (Theorem 5).

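For reference, the a priori probability mentioned here is the standard discrete one (the hierarchy and the equivalence are the paper's own results): for a universal prefix-free machine U,

    m(x) \;=\; \sum_{p \,:\, U(p) = x} 2^{-|p|},

the probability that U outputs x when its input bits are uniformly random; by the coding theorem, -\log_2 m(x) = K(x) + O(1), where K denotes prefix complexity.
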
Second, a more realistic approach, which goes back to machine learning theory, assumes that we are given not a single data string x but a set of "positive examples" x_1,...,x_l that all belong to some unknown set A, the property that we want to learn. We want this set A to contain all positive examples and to be as small and simple as possible. We show how algorithmic statistics can be extended to cover this situation (Theorem 11).

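One natural way to make the multi-example setting precise (a hedged sketch; the exact trade-off and statement are given in the paper, culminating in Theorem 11) is to look for a finite set A with

    \{x_1, \dots, x_l\} \subseteq A,

balancing the model complexity C(A) against the size term \log_2 |A| (or l \cdot \log_2 |A| when all l examples have to be specified inside A), in analogy with the single-string two-part code recalled above.
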
BibTeX - Entry

@InProceedings{milovanov:LIPIcs:2016:5755,
  author =	{Alexey Milovanov},
  title =	{{Algorithmic Statistics, Prediction and Machine Learning}},
  booktitle =	{33rd Symposium on Theoretical Aspects of Computer Science (STACS 2016)},
  pages =	{54:1--54:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-001-9},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{47},
  editor =	{Nicolas Ollinger and Heribert Vollmer},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{http://drops.dagstuhl.de/opus/volltexte/2016/5755},
  URN =		{urn:nbn:de:0030-drops-57550},
  doi =		{10.4230/LIPIcs.STACS.2016.54},
  annote =	{Keywords: algorithmic information theory, minimal description length, prediction, Kolmogorov complexity, learning}
}

Keywords: algorithmic information theory, minimal description length, prediction, Kolmogorov complexity, learning
Collection: 33rd Symposium on Theoretical Aspects of Computer Science (STACS 2016)
Issue Date: 2016
Date of publication: 16.02.2016

