License: Creative Commons Attribution 4.0 International license (CC BY 4.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.FORC.2023.10
URN: urn:nbn:de:0030-drops-179311
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2023/17931/


Awasthi, Pranjal; Jung, Christopher; Morgenstern, Jamie

Distributionally Robust Data Join

PDF: LIPIcs-FORC-2023-10.pdf (0.7 MB)


Abstract

Suppose we are given two datasets: a labeled dataset and an unlabeled dataset that also has additional auxiliary features not present in the first dataset. What is the most principled way to use these datasets together to construct a predictor?
The answer should depend upon whether these datasets are generated by the same or different distributions over their mutual feature sets, and upon how similar the test distribution will be to either of those distributions. In many applications, the two datasets will likely follow different distributions, but both may be close to the test distribution. We introduce the problem of building a predictor which minimizes the maximum loss over all probability distributions over the original features, auxiliary features, and binary labels that lie within Wasserstein distance r₁ of the empirical distribution over the labeled dataset and within Wasserstein distance r₂ of that of the unlabeled dataset. This can be thought of as a generalization of distributionally robust optimization (DRO) that allows for two data sources, one of which is unlabeled and may contain auxiliary features.
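In symbols, the objective described above can be sketched roughly as follows; the notation here is assumed rather than taken from the paper (P̂_L and P̂_U for the empirical distributions of the labeled and unlabeled datasets, W for a Wasserstein distance taken over the appropriate marginals, H for a hypothesis class, and ℓ for a loss function):

    \min_{h \in \mathcal{H}} \; \sup_{P:\; W(P, \hat{P}_L) \le r_1,\; W(P, \hat{P}_U) \le r_2} \; \mathbb{E}_{(x, x', y) \sim P}\!\left[\ell(h(x, x'), y)\right]

Here P ranges over joint distributions on the original features x, the auxiliary features x', and the binary label y; since the labeled data lack x' and the unlabeled data lack y, each distance W is understood to compare the relevant marginal of P with the corresponding empirical distribution.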

BibTeX Entry

@InProceedings{awasthi_et_al:LIPIcs.FORC.2023.10,
  author =	{Awasthi, Pranjal and Jung, Christopher and Morgenstern, Jamie},
  title =	{{Distributionally Robust Data Join}},
  booktitle =	{4th Symposium on Foundations of Responsible Computing (FORC 2023)},
  pages =	{10:1--10:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-272-3},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{256},
  editor =	{Talwar, Kunal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/opus/volltexte/2023/17931},
  URN =		{urn:nbn:de:0030-drops-179311},
  doi =		{10.4230/LIPIcs.FORC.2023.10},
  annote =	{Keywords: Distributionally Robust Optimization, Semi-Supervised Learning, Learning Theory}
}

Keywords: Distributionally Robust Optimization, Semi-Supervised Learning, Learning Theory
Collection: 4th Symposium on Foundations of Responsible Computing (FORC 2023)
Issue Date: 2023
Date of publication: 04.06.2023

