License: Creative Commons Attribution 4.0 International license (CC BY 4.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.ITCS.2023.70
URN: urn:nbn:de:0030-drops-175736
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2023/17573/


Hirahara, Shuichi ; Nanashima, Mikito

Learning Versus Pseudorandom Generators in Constant Parallel Time

pdf-format:
LIPIcs-ITCS-2023-70.pdf (0.7 MB)


Abstract

A polynomial-stretch pseudorandom generator (PPRG) in NC⁰ (i.e., constant parallel time) is one of the most important cryptographic primitives, especially for constructing highly efficient cryptography and indistinguishability obfuscation. The celebrated work on randomized encodings (Applebaum, Ishai, and Kushilevitz, SIAM Journal on Computing, 2006) characterizes sublinear-stretch pseudorandom generators in NC⁰ by the existence of logspace-computable one-way functions, but characterizing PPRGs in NC⁰ seems out of reach at present. Therefore, it is natural to ask which sort of hardness notion is essential for constructing PPRGs in NC⁰. In particular, to the best of our knowledge, all previously known candidates for PPRGs in NC⁰ follow a single framework based on Goldreich's one-way function.
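For orientation, the following is a minimal Python sketch of a Goldreich-style local PRG candidate of the kind alluded to above: every output bit applies one fixed predicate to a small random subset of seed bits, so each output bit is computable in constant parallel time. The predicate (the commonly studied 5-local XOR-AND predicate x₁⊕x₂⊕x₃⊕(x₄∧x₅)), the random hypergraph, and the stretch below are illustrative choices, not the construction proposed in the paper.

import random

def goldreich_prg(seed, m, predicate, locality):
    """Goldreich-style candidate: each output bit applies a fixed predicate
    to a small, randomly chosen subset of seed bits (constant locality)."""
    n = len(seed)
    rng = random.Random(0)          # fixed public hypergraph, reused across seeds
    output = []
    for _ in range(m):
        idx = rng.sample(range(n), locality)
        output.append(predicate([seed[i] for i in idx]))
    return output

def xor_and(bits):
    # The 5-local XOR-AND predicate x1 ^ x2 ^ x3 ^ (x4 & x5).
    x1, x2, x3, x4, x5 = bits
    return x1 ^ x2 ^ x3 ^ (x4 & x5)

n = 64
seed = [random.randint(0, 1) for _ in range(n)]
stretch = int(n ** 1.5)             # polynomial stretch: n^{1.5} output bits
print(goldreich_prg(seed, stretch, xor_and, locality=5)[:16])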
In this paper, we present a new learning-theoretic characterization of PPRGs in NC⁰ and related classes. Specifically, we consider the average-case hardness of learning for well-studied classes in parameterized settings, where the number of samples is restricted to be fixed-parameter tractable (FPT), and show that the following are equivalent (a toy sketch of this learning setup appears below):
- The existence of (a collection of) PPRGs in NC⁰.
- The average-case hardness of learning sparse 𝔽₂-polynomials on a sparse example distribution and an NC⁰-samplable target distribution (i.e., a distribution on target functions).
- The average-case hardness of learning Fourier-sparse functions on a sparse example distribution and an NC⁰-samplable target distribution.
- The average-case hardness of learning constant-depth parity decision trees on a sparse example distribution and an NC⁰-samplable target distribution.

Furthermore, we characterize a (single) PPRG in parity-NC⁰ by the average-case hardness of learning constant-degree 𝔽₂-polynomials on a uniform example distribution with FPT samples.

Based on our results, we propose new candidates for PPRGs in NC⁰ and related classes under a hardness assumption on a natural learning problem. An important property of the PPRGs in NC⁰ constructed in our framework is that the output bits are computed by various predicates; thus, they appear to resist attacks that depend on a specific property of one fixed predicate.
Conceptually, the main contribution of this study is to formalize a theory of FPT dualization of concept classes, which yields a meta-theorem for the first result. For the second result on PPRGs in parity-NC⁰, we use a different technique of pseudorandom 𝔽₂-polynomials.
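To make the learning-theoretic side concrete, the following toy Python sketch (our own illustration, not the paper's formalization) samples a sparse 𝔽₂-polynomial as the target and labeled examples from a sparse example distribution; the parameter choices (degree, number of monomials, example weight, sample count) are arbitrary and only meant to suggest what "FPT-many samples in the sparsity parameters" refers to.

import random

def sample_sparse_f2_polynomial(n, num_monomials, degree, rng):
    """A sparse F2-polynomial: the XOR of a few low-degree monomials."""
    monomials = [tuple(rng.sample(range(n), degree)) for _ in range(num_monomials)]
    def f(x):
        return sum(all(x[i] for i in mono) for mono in monomials) % 2
    return f, monomials

def sample_sparse_example(n, weight, rng):
    """A sparse example: only `weight` of the n coordinates are set to 1."""
    x = [0] * n
    for i in rng.sample(range(n), weight):
        x[i] = 1
    return x

rng = random.Random(1)
n = 32
f, monomials = sample_sparse_f2_polynomial(n, num_monomials=4, degree=2, rng=rng)

# The learner receives labeled examples (x, f(x)) and must predict f on fresh
# examples; the average-case hardness of this task (with a bounded, FPT number
# of samples) is what the paper's equivalences are about.
samples = [(x, f(x)) for x in (sample_sparse_example(n, weight=3, rng=rng) for _ in range(20))]
print(monomials)
print(samples[:3])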

BibTeX - Entry

@InProceedings{hirahara_et_al:LIPIcs.ITCS.2023.70,
  author =	{Hirahara, Shuichi and Nanashima, Mikito},
  title =	{{Learning Versus Pseudorandom Generators in Constant Parallel Time}},
  booktitle =	{14th Innovations in Theoretical Computer Science Conference (ITCS 2023)},
  pages =	{70:1--70:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-263-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{251},
  editor =	{Tauman Kalai, Yael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/opus/volltexte/2023/17573},
  URN =		{urn:nbn:de:0030-drops-175736},
  doi =		{10.4230/LIPIcs.ITCS.2023.70},
  annote =	{Keywords: Parallel cryptography, polynomial-stretch pseudorandom generators in NC⁰, PAC learning, average-case complexity, fixed-parameter tractability}
}

Keywords: Parallel cryptography, polynomial-stretch pseudorandom generators in NC⁰, PAC learning, average-case complexity, fixed-parameter tractability
Collection: 14th Innovations in Theoretical Computer Science Conference (ITCS 2023)
Issue Date: 2023
Date of publication: 01.02.2023

