Memory-Sample Lower Bounds for Learning with Classical-Quantum Hybrid Memory

Proceedings of the 55th Annual ACM Symposium on Theory of Computing, STOC 2023 (2023)

Abstract
In a work by Raz (J. ACM and FOCS'16), it was proved that any algorithm for parity learning on $n$ bits requires either $\Omega(n^2)$ bits of classical memory or an exponential (in $n$) number of random samples. A line of recent works continued this research direction and showed that, for a large collection of classical learning tasks, either super-linear classical memory or super-polynomially many samples are needed. All of these works model learning algorithms as classical branching programs, which perform classical computation within bounded memory. However, such results do not capture all physical computational models, notably quantum computers and the use of quantum memory. This leaves open the possibility that a small piece of quantum memory could significantly reduce the need for classical memory or samples, and thus completely change the nature of the classical learning task. Despite recent research on the necessity of quantum memory for intrinsically quantum learning problems, such as shadow tomography and purity testing, the role of quantum memory in classical learning tasks remains obscure.

In this work, we study classical learning tasks in the presence of quantum memory. We prove that any quantum algorithm with both classical and quantum memory for parity learning on $n$ bits requires either $\Omega(n^2)$ bits of classical memory, or $\Omega(n)$ bits of quantum memory, or an exponential number of samples. In other words, the memory-sample lower bound for parity learning remains qualitatively the same even if the learning algorithm can use, in addition to its classical memory, a quantum memory of size $cn$ (for some constant $c > 0$).

Our result is more general and applies to many other classical learning tasks. Following previous works, we represent a learning task by a matrix $M : A \times X \to \{-1, 1\}$: an unknown $x$ is sampled uniformly at random from a concept class $X$, and a learning algorithm tries to uncover $x$ by seeing a stream of random samples $(a_i, b_i = M(a_i, x))$, where for every $i$, $a_i \in A$ is chosen uniformly at random. Assume that $k, \ell, r$ are integers such that any submatrix of $M$ with at least $2^{-k} \cdot |A|$ rows and at least $2^{-\ell} \cdot |X|$ columns has bias of at most $2^{-r}$. We prove that any algorithm with classical-quantum hybrid memory for the learning problem corresponding to $M$ needs either (1) $\Omega(k \cdot \ell)$ bits of classical memory, or (2) $\Omega(r)$ qubits of quantum memory, or (3) $2^{\Omega(r)}$ random samples, to achieve a success probability of at least $2^{-O(r)}$.

Our results refute the possibility that a small amount of quantum memory significantly reduces the size of the classical memory needed for efficient learning of these problems. They also imply improved security for several existing cryptographic protocols in the bounded-storage model (protocols based on parity learning on $n$ bits), showing that security holds even against a quantum adversary with at most $cn^2$ bits of classical memory and $cn$ bits of quantum memory (for some constant $c > 0$).
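To make the streaming sample model concrete, here is a minimal illustrative sketch (not taken from the paper; the function name parity_sample_stream and all parameters are hypothetical) of how samples $(a_i, b_i = M(a_i, x))$ are generated for the parity learning instance, where $M(a, x) = (-1)^{\langle a, x \rangle \bmod 2}$:

```python
import numpy as np

def parity_sample_stream(n, num_samples, rng=None):
    """Yield samples (a_i, b_i) for a freshly drawn hidden parity x in {0,1}^n.

    This is only an illustration of the sample model: the hidden concept x is
    uniform over {0,1}^n, each query a_i is uniform over {0,1}^n, and the label
    is b_i = M(a_i, x) = (-1)^(<a_i, x> mod 2) in {-1, +1}.
    """
    rng = rng or np.random.default_rng()
    x = rng.integers(0, 2, size=n)           # unknown concept, sampled once
    for _ in range(num_samples):
        a = rng.integers(0, 2, size=n)       # uniformly random query a_i
        b = (-1) ** (int(a @ x) % 2)         # label b_i = M(a_i, x)
        yield a, b

if __name__ == "__main__":
    # With unbounded memory, O(n) samples suffice to recover x by Gaussian
    # elimination over GF(2); the paper's lower bound says that with o(n^2)
    # classical bits and o(n) qubits of memory, 2^{Omega(n)} samples are needed.
    for a, b in parity_sample_stream(n=8, num_samples=3):
        print(a, b)
```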
Keywords
Learning parity, Quantum lower bounds, Time-space lower bounds