License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.MFCS.2018.41
URN: urn:nbn:de:0030-drops-96239
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2018/9623/


Dan, Chen ; Hansen, Kristoffer Arnsfelt ; Jiang, He ; Wang, Liwei ; Zhou, Yuchen

Low Rank Approximation of Binary Matrices: Column Subset Selection and Generalizations

pdf-format:
LIPIcs-MFCS-2018-41.pdf (0.6 MB)


Abstract

Low rank approximation of matrices is an important tool in machine learning. Given a data matrix, low rank approximation helps to find factors and patterns and provides concise representations of the data. Research on low rank approximation usually focuses on real matrices. However, in many applications data are binary (categorical) rather than continuous. This leads to the problem of low rank approximation of binary matrices. Here we are given a d x n binary matrix A and a small integer k < d. The goal is to find two binary matrices U and V of sizes d x k and k x n respectively, so that the Frobenius norm of A - U V is minimized. There are two models of this problem, depending on the definition of the dot product of binary vectors: the GF(2) model and the Boolean semiring model. Unlike low rank approximation of a real matrix, which can be solved efficiently by Singular Value Decomposition, we show that low rank approximation of a binary matrix is NP-hard, even for k=1.
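
As a concrete illustration of the two models, here is a minimal Python/NumPy sketch (not from the paper; the function names gf2_objective and boolean_objective are ours) that computes the squared Frobenius error ||A - U V||_F^2 for 0/1 matrices, which simply counts the entries where A and U V disagree:

import numpy as np

def gf2_objective(A, U, V):
    # Matrix product over GF(2): multiply over the integers, then reduce mod 2.
    UV = (U @ V) % 2
    return int((A != UV).sum())   # mismatched entries = ||A - UV||_F^2 for 0/1 matrices

def boolean_objective(A, U, V):
    # Matrix product over the Boolean semiring: an entry is 1 iff some AND term is 1.
    UV = ((U @ V) > 0).astype(int)
    return int((A != UV).sum())

# Tiny example: a rank-1 approximation of a 2 x 2 matrix.
A = np.array([[1, 0], [1, 1]])
U = A[:, [0]]                      # d x k with k = 1
V = np.array([[1, 0]])             # k x n
print(gf2_objective(A, U, V), boolean_objective(A, U, V))   # both report error 1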
In this paper, our main concern is the problem of Column Subset Selection (CSS), in which the low rank matrix U must be formed by k columns of the data matrix, and we are interested in the approximation ratio achievable by CSS for binary matrices. For the GF(2) model, we show that CSS has approximation ratio bounded by k/2 + 1 + k/(2(2^k - 1)), and that this bound is asymptotically tight. For the Boolean model, it turns out that CSS alone is no longer sufficient to obtain a bounded approximation ratio. We therefore develop a Generalized CSS (GCSS) procedure in which the columns of U are generated by Boolean formulas operating bitwise on selected columns of the data matrix. We show that the approximation ratio achieved by GCSS is bounded by 2^(k-1) + 1, and argue that an exponential dependency on k seems inherent.
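
The following toy sketch (again not from the paper; names such as css_gf2 and best_fit_column are hypothetical) makes the CSS search space explicit for the GF(2) model: U ranges over all sets of k columns of A, and for a fixed U the best V can be found column by column by enumerating all 2^k binary coefficient vectors. This brute force is only feasible for very small n and k, but it shows exactly what the approximation ratio of CSS is measured against:

from itertools import combinations, product
import numpy as np

def best_fit_column(U, a):
    # For fixed U (d x k), find v in {0,1}^k minimizing the Hamming distance
    # between (U v mod 2) and the target column a.
    k = U.shape[1]
    best_err, best_v = None, None
    for bits in product((0, 1), repeat=k):
        v = np.array(bits)
        err = int(((U @ v) % 2 != a).sum())
        if best_err is None or err < best_err:
            best_err, best_v = err, v
    return best_err, best_v

def css_gf2(A, k):
    # Try every choice of k columns of A as U; fit each column of A independently.
    d, n = A.shape
    best = None
    for cols in combinations(range(n), k):
        U = A[:, list(cols)]
        total, V_cols = 0, []
        for j in range(n):
            err, v = best_fit_column(U, A[:, j])
            total += err
            V_cols.append(v)
        if best is None or total < best[0]:
            best = (total, cols, np.array(V_cols).T)
    return best   # (total error, indices of the chosen columns, V of size k x n)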

BibTeX - Entry

@InProceedings{dan_et_al:LIPIcs:2018:9623,
  author =	{Chen Dan and Kristoffer Arnsfelt Hansen and He Jiang and Liwei Wang and Yuchen Zhou},
  title =	{{Low Rank Approximation of Binary Matrices: Column Subset Selection and Generalizations}},
  booktitle =	{43rd International Symposium on Mathematical Foundations  of Computer Science (MFCS 2018)},
  pages =	{41:1--41:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-086-6},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{117},
  editor =	{Igor Potapov and Paul Spirakis and James Worrell},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{http://drops.dagstuhl.de/opus/volltexte/2018/9623},
  URN =		{urn:nbn:de:0030-drops-96239},
  doi =		{10.4230/LIPIcs.MFCS.2018.41},
  annote =	{Keywords: Approximation Algorithms, Low Rank Approximation, Binary Matrices}
}

Keywords: Approximation Algorithms, Low Rank Approximation, Binary Matrices
Collection: 43rd International Symposium on Mathematical Foundations of Computer Science (MFCS 2018)
Issue Date: 2018
Date of publication: 27.08.2018

