License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following DOI: 10.4230/LIPIcs.ICALP.2019.33
URN: urn:nbn:de:0030-drops-106092
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2019/10609/


Chakraborty, Shantanav; Gilyén, András; Jeffery, Stacey

The Power of Block-Encoded Matrix Powers: Improved Regression Techniques via Faster Hamiltonian Simulation

pdf-format: LIPIcs-ICALP-2019-33.pdf (0.5 MB)


Abstract

We apply the framework of block-encodings, introduced by Low and Chuang (under the name standard-form), to the study of quantum machine learning algorithms and derive general results that are applicable to a variety of input models, including sparse matrix oracles and matrices stored in a data structure. We develop several tools within the block-encoding framework, such as singular value estimation of a block-encoded matrix, and quantum linear system solvers using block-encodings. The presented results give new techniques for Hamiltonian simulation of non-sparse matrices, which could be relevant for certain quantum chemistry applications, and which in turn imply an exponential improvement in the dependence on precision in quantum linear system solvers for non-sparse matrices.
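For reference, the standard definition from the block-encoding literature (the parameters below are the usual ones from that literature and are not quoted from this paper): a unitary U acting on a + s qubits is an (\alpha, a, \varepsilon)-block-encoding of an s-qubit operator A if

  \| A - \alpha \, (\langle 0|^{\otimes a} \otimes I) \, U \, (|0\rangle^{\otimes a} \otimes I) \| \le \varepsilon,

i.e. A/\alpha appears, up to error \varepsilon/\alpha, as the top-left block of U.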
In addition, we develop a variable-time amplitude estimation technique, based on Ambainis' variable-time amplitude amplification, which we also apply within the block-encoding framework.
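As background, plain amplitude estimation (Brassard, Høyer, Mosca and Tapp) is the primitive that the variable-time version refines; the sketch below states only the standard guarantee, not this paper's variable-time bounds. Given a unitary U with

  U|0\rangle = \sin\theta \, |1\rangle|\psi_{\mathrm{good}}\rangle + \cos\theta \, |0\rangle|\psi_{\mathrm{bad}}\rangle,

amplitude estimation outputs an estimate \tilde{p} of p = \sin^2\theta satisfying |\tilde{p} - p| \le 2\pi\sqrt{p(1-p)}/M + \pi^2/M^2 using M applications of U and U^\dagger, a quadratic saving over classical sampling.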
As applications, we design the following algorithms: (1) a quantum algorithm for the quantum weighted least squares problem, exhibiting a sixth-power improvement in the dependence on the condition number and an exponential improvement in the dependence on the precision over the previous best algorithm of Kerenidis and Prakash; (2) the first quantum algorithm for the quantum generalized least squares problem; and (3) quantum algorithms for estimating electrical-network quantities, including effective resistance and dissipated power, improving upon previous work.
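For context, the classical counterparts of applications (1)-(3) have standard closed forms (textbook facts, stated here only to fix notation; the quantum versions typically ask for a quantum state proportional to the solution vector rather than the vector itself). Assuming the data matrix A has full column rank, the weighted least squares solution with weight matrix W \succ 0 and target vector b is

  x_{\mathrm{WLS}} = (A^T W A)^{-1} A^T W b,

and the generalized least squares solution with covariance matrix \Omega \succ 0 is

  x_{\mathrm{GLS}} = (A^T \Omega^{-1} A)^{-1} A^T \Omega^{-1} b.

For an electrical network with Laplacian L, the effective resistance between nodes s and t is (e_s - e_t)^T L^{+} (e_s - e_t), where L^{+} denotes the Moore-Penrose pseudoinverse.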

BibTeX Entry

@InProceedings{chakraborty_et_al:LIPIcs:2019:10609,
  author =	{Shantanav Chakraborty and Andr{\'a}s Gily{\'e}n and Stacey Jeffery},
  title =	{{The Power of Block-Encoded Matrix Powers: Improved Regression Techniques via Faster Hamiltonian Simulation}},
  booktitle =	{46th International Colloquium on Automata, Languages, and Programming (ICALP 2019)},
  pages =	{33:1--33:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-109-2},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{132},
  editor =	{Christel Baier and Ioannis Chatzigiannakis and Paola Flocchini and Stefano Leonardi},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{http://drops.dagstuhl.de/opus/volltexte/2019/10609},
  URN =		{urn:nbn:de:0030-drops-106092},
  doi =		{10.4230/LIPIcs.ICALP.2019.33},
  annote =	{Keywords: Quantum algorithms, Hamiltonian simulation, Quantum machine learning}
}

Keywords: Quantum algorithms, Hamiltonian simulation, Quantum machine learning
Collection: 46th International Colloquium on Automata, Languages, and Programming (ICALP 2019)
Issue Date: 2019
Date of publication: 04.07.2019

