License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/DagRep.8.3.60
URN: urn:nbn:de:0030-drops-92977
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2018/9297/


Loh, Po-Ling ; Mazumdar, Arya ; Papailiopoulos, Dimitris ; Urbanke, Rüdiger
Other contributors (eds., etc.): Po-Ling Loh and Arya Mazumdar and Dimitris Papailiopoulos and Rüdiger Urbanke

Coding Theory for Inference, Learning and Optimization (Dagstuhl Seminar 18112)

pdf-format:
dagrep_v008_i003_p060_18112.pdf (9 MB)


Abstract

This report documents the program and the outcomes of Dagstuhl Seminar 18112, "Coding Theory for Inference, Learning and Optimization."

Coding theory has recently found new applications in areas such as distributed machine learning, dimension reduction, and a variety of statistical problems involving estimation and inference. In machine learning applications that use large-scale data, it is desirable to communicate the results of distributed computations in an efficient and robust manner. In dimension reduction applications, the pseudorandom properties of algebraic codes may be used to construct projection matrices that are deterministic and facilitate algorithmic efficiency. Finally, relationships that have been forged between coding theory and problems in theoretical computer science, such as k-SAT or the planted clique problem, lead to a new interpretation of the sharp thresholds encountered in these settings in terms of thresholds in channel coding theory.
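To make the first of these application areas concrete, the sketch below (not taken from the report) illustrates one simple instance of code-based straggler robustness in distributed learning: a "fractional repetition" style gradient code, in which groups of workers redundantly compute the same partial gradient so the master can recover the full gradient from any subset of responses that covers every group. All names, function signatures, and parameters here are hypothetical illustrations, and the scheme shown is only one of many coded-computation constructions discussed in this line of work.

```python
# Minimal sketch (assumption, not from the seminar report) of fractional
# repetition gradient coding: n workers, up to s stragglers; the data is
# split into n/(s+1) groups, every worker in a group computes the same
# partial gradient, and the master recovers the full gradient as long as
# at least one worker per group responds.
import numpy as np

def make_groups(n_workers, s):
    assert n_workers % (s + 1) == 0, "n_workers must be divisible by s+1"
    group_size = s + 1
    return [list(range(g * group_size, (g + 1) * group_size))
            for g in range(n_workers // group_size)]

def worker_message(group_id, partial_grads):
    # Every worker in a group returns its group's partial gradient,
    # so any single survivor per group suffices for recovery.
    return partial_grads[group_id]

def recover(dim, groups, responses):
    # responses: dict worker_id -> partial gradient (stragglers are absent)
    total = np.zeros(dim)
    for g_id, members in enumerate(groups):
        survivor = next((w for w in members if w in responses), None)
        if survivor is None:
            raise RuntimeError("too many stragglers in group %d" % g_id)
        total += responses[survivor]
    return total

# Toy usage: 6 workers, tolerating s = 2 stragglers per group.
n, s, dim = 6, 2, 4
groups = make_groups(n, s)                                # 2 groups of 3 workers
partial_grads = [np.random.randn(dim) for _ in groups]    # per-group gradients
true_grad = sum(partial_grads)

responses = {}
for g_id, members in enumerate(groups):
    for w in members[:1]:          # pretend only one worker per group replied
        responses[w] = worker_message(g_id, partial_grads)

assert np.allclose(recover(dim, groups, responses), true_grad)
```

The design choice being illustrated is the coding-theoretic one: redundancy is placed deliberately (here, by repetition within groups) so that the aggregate of interest survives the erasure of any s worker responses, rather than relying on all workers finishing.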

The aim of this Dagstuhl Seminar was to draw together researchers from industry and academia who work in coding theory, particularly in these different (and somewhat disparate) application areas of machine learning and inference. The discussions and collaborations facilitated by this seminar were intended to spark new ideas about how coding theory may be used to improve and inform modern techniques for data analytics.

BibTeX - Entry

@Article{loh_et_al:DR:2018:9297,
  author =	{Po-Ling Loh and Arya Mazumdar and Dimitris Papailiopoulos and R{\"u}diger Urbanke},
  title =	{{Coding Theory for Inference, Learning and Optimization (Dagstuhl Seminar 18112)}},
  pages =	{60--73},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2018},
  volume =	{8},
  number =	{3},
  editor =	{Po-Ling Loh and Arya Mazumdar and Dimitris Papailiopoulos and R{\"u}diger Urbanke},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{http://drops.dagstuhl.de/opus/volltexte/2018/9297},
  URN =		{urn:nbn:de:0030-drops-92977},
  doi =		{10.4230/DagRep.8.3.60},
  annote =	{Keywords: Coding theory, Distributed optimization, Machine learning, Threshold phenomena}
}

Keywords: Coding theory, Distributed optimization, Machine learning, Threshold phenomena
Collection: Dagstuhl Reports, Volume 8, Issue 3
Issue Date: 2018
Date of publication: 25.07.2018

