An Improved Algorithm For Learning Sparse Parities In The Presence Of Noise

Theoretical Computer Science (2021)

Abstract
We revisit the Learning Sparse Parities with Noise (LSPN) problem on k out of n variables for k << n, and present the following findings.

1. For true parity size k = n^u for any 0 < u < 1 and noise rate η < 1/2, our first algorithm solves the (n, k, η)-LSPN problem with constant probability and time/sample complexity n^((1-u+o(1))k) / (1/2 - η)^2.

2. For any 1/2 < c₁ < 1, k = o(ηn / log n), and η ≤ n^(-c₁)/4, our second algorithm solves the (n, k, η)-LSPN problem with constant probability and time/sample complexity n^(2(1-c₁+o(1))k).

3. We show a "win-win" result about reducing the number of samples. Suppose there is an algorithm that solves the (n, k, η)-LSPN problem with probability Ω(1) and time/sample complexity n^(O(k)) for k = o(n^(1-c)), any noise rate η = n^(1-2c)/3, and 1/2 ≤ c < 1. Then either there exists an algorithm that solves the (n, k, μ)-LSPN problem under the lower noise rate μ = n^(-c)/3 using only 2n samples, or there exists an algorithm that solves the (n, k', μ)-LSPN problem for a much larger k' = n^(1-c) with probability n^(-O(k))/poly(n) and time complexity poly(n) · n^(O(k)), using only n samples.

Our algorithms are conceptually simple, combining a few basic techniques: majority voting, a reduction from the LSPN problem to its decisional variant, Goldreich-Levin list decoding, and computational sample amplification. (C) 2021 Elsevier B.V. All rights reserved.
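For concreteness, the naive n^(O(k)) baseline that results like these improve on can be sketched as follows: generate noisy parity samples (a, ⟨a, s⟩ ⊕ e) for a k-sparse secret s, then exhaustively score every k-subset by how often it agrees with the labels. This is an illustrative brute-force sketch only, not any of the paper's algorithms; the function names `gen_samples` and `brute_force_lspn` are our own.

```python
import itertools
import random

def gen_samples(n, secret, eta, m, rng):
    """Generate m LSPN samples (a, <a, s> XOR e) with noise rate eta.

    secret: sorted tuple of the k coordinates where s is 1 (assumed sparse).
    """
    samples = []
    for _ in range(m):
        a = [rng.randint(0, 1) for _ in range(n)]
        label = sum(a[i] for i in secret) % 2  # noiseless parity <a, s>
        if rng.random() < eta:                 # flip with probability eta
            label ^= 1
        samples.append((a, label))
    return samples

def brute_force_lspn(samples, n, k):
    """Try all C(n, k) candidate supports; return the one agreeing with
    the most labels. Runs in time O(C(n, k) * m) = n^(O(k))."""
    best, best_agree = None, -1
    for cand in itertools.combinations(range(n), k):
        agree = sum((sum(a[i] for i in cand) % 2) == b for a, b in samples)
        if agree > best_agree:
            best, best_agree = cand, agree
    return best
```

With noise rate η bounded away from 1/2, the true support agrees with roughly a (1-η) fraction of labels while wrong candidates agree with about half, so O(k log n / (1/2-η)^2) samples suffice for the majority count to separate them.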
Keywords
Learning theory, Algorithm analysis, Learning parity with noise, Learning sparse parity with noise, Sample amplification