Publication
FOCS 2021
Conference paper

Quantum learning algorithms imply circuit lower bounds


Abstract

We establish the first general connection between the design of quantum algorithms and circuit lower bounds. Specifically, let C be a class of polynomial-size concepts, and suppose that C can be PAC-learned with membership queries under the uniform distribution with error 1/2 - γ by a time-T quantum algorithm. We prove that if γ^2 · T ≪ 2^n/n, then BQE ⊄ C, where BQE = BQTIME[2^{O(n)}] is an exponential-time analogue of BQP. This result is optimal in both γ and T, since it is not hard to learn any class C of functions in (classical) time T = 2^n (with no error), or in quantum time T = poly(n) with error at most 1/2 - Ω(2^{-n/2}) via Fourier sampling. In other words, even a marginal quantum speedup over these generic learning algorithms would lead to major consequences for complexity lower bounds.

As a consequence, our result shows that the study of quantum learning speedups is intimately connected to fundamental open problems about algorithms, quantum computing, and complexity theory. Our proof builds on several works in learning theory, pseudorandomness, and computational complexity, and on a connection between non-trivial classical learning algorithms and circuit lower bounds established by Oliveira and Santhanam (CCC 2017). Extending their approach to quantum learning algorithms turns out to create significant challenges, since extracting computational hardness from a quantum computation is inherently more complicated.

To achieve that, we show among other results how pseudorandom generators imply learning-to-lower-bound connections in a generic fashion, construct the first conditional pseudorandom generator secure against uniform quantum computations, and extend the local list-decoding algorithm of Impagliazzo, Jaiswal, Kabanets and Wigderson (SICOMP 2010) to quantum circuits via a delicate analysis. We believe that these contributions are of independent interest and might find other applications.
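In clean notation, the main implication stated in the abstract reads as follows (an informal restatement of the theorem, not the authors' exact formal wording):

```latex
% Main theorem (informal). C is a class of polynomial-size concepts that
% is PAC-learnable with membership queries under the uniform distribution
% with error 1/2 - gamma by a quantum algorithm running in time T.
\[
  \gamma^{2} \cdot T \ll \frac{2^{n}}{n}
  \;\Longrightarrow\;
  \mathsf{BQE} \not\subset C,
  \qquad
  \text{where } \mathsf{BQE} = \mathsf{BQTIME}\!\left[2^{O(n)}\right].
\]
```

The optimality claim in the abstract corresponds to the two endpoints of this trade-off: T = 2^n with γ = 1/2 (exact classical learning) and T = poly(n) with γ = Ω(2^{-n/2}) (quantum Fourier sampling), neither of which satisfies the hypothesis.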
