License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.ISAAC.2016.26
URN: urn:nbn:de:0030-drops-67960
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2016/6796/
Chen, Lijie
Adaptivity vs. Postselection, and Hardness Amplification for Polynomial Approximation
Abstract
We study the following problem: with the power of postselection (classically or quantumly), what is one's ability to answer adaptive queries to certain languages? More specifically, for which computational classes C do we have P^C contained in PostBPP or PostBQP? While a complete answer to this question seems out of reach given the current state of computational complexity theory, we study the analogous question in query complexity, which sheds light on the limitations of relativized methods (the relativization barrier) for the above question.
Informally, we show that, for a partial function f, if there is no efficient small bounded-error algorithm for f classically or quantumly, then there is no efficient postselection bounded-error algorithm to answer adaptive queries to f classically or quantumly. Our results imply a new proof of the classical oracle separation P^{NP^O} ⊄ PP^O, which is arguably more elegant. They also lead to a new oracle separation P^{SZK^O} ⊄ PP^O, which comes close to an oracle separation between SZK and PP, an open problem in the field of oracle separations.
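For readability, the two oracle separations mentioned above can be restated compactly in standard notation (this is only a notational restatement of the claims in the abstract; O denotes the oracle constructed in the paper):

\[
  \exists\, O:\; \mathsf{P}^{\mathsf{NP}^{O}} \not\subseteq \mathsf{PP}^{O},
  \qquad
  \exists\, O:\; \mathsf{P}^{\mathsf{SZK}^{O}} \not\subseteq \mathsf{PP}^{O}.
\]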
Our results also imply a hardness amplification construction for polynomial approximation: given a function f on n bits, we construct an adaptive version of f, denoted F, on O(m·n) bits, such that if f requires large degree to approximate to error 2/3 in a certain one-sided sense, then F requires large degree to approximate even to error 1/2 - 2^{-m}. Our construction achieves the same amplification as the work of Thaler (ICALP 2016) by composing f with a function of O(log n) deterministic query complexity, in sharp contrast to all previous results, where the composing amplifiers are hard functions in a certain sense.
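In approximate-degree notation, the amplification statement has roughly the following shape; this is a schematic sketch only, where \widetilde{\deg}_{\epsilon} denotes \epsilon-approximate degree, the superscript + marks the one-sided variant, and the precise quantitative bounds are as defined in the paper:

\[
  \widetilde{\deg}^{\,+}_{2/3}(f)\ \text{large}
  \;\Longrightarrow\;
  \widetilde{\deg}_{1/2 - 2^{-m}}(F)\ \text{large},
  \qquad F \text{ defined on } O(m \cdot n) \text{ bits.}
\]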
BibTeX - Entry
@InProceedings{chen:LIPIcs:2016:6796,
author = {Lijie Chen},
title = {{Adaptivity vs. Postselection, and Hardness Amplification for Polynomial Approximation}},
booktitle = {27th International Symposium on Algorithms and Computation (ISAAC 2016)},
pages = {26:1--26:12},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-95977-026-2},
ISSN = {1868-8969},
year = {2016},
volume = {64},
editor = {Seok-Hee Hong},
publisher = {Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
address = {Dagstuhl, Germany},
URL = {http://drops.dagstuhl.de/opus/volltexte/2016/6796},
URN = {urn:nbn:de:0030-drops-67960},
doi = {10.4230/LIPIcs.ISAAC.2016.26},
annote = {Keywords: approximate degree, postselection, hardness amplification, adaptivity}
}
Keywords: approximate degree, postselection, hardness amplification, adaptivity
Collection: 27th International Symposium on Algorithms and Computation (ISAAC 2016)
Issue Date: 2016
Date of publication: 07.12.2016