Max-Information, Differential Privacy, and Post-selection Hypothesis Testing

2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS)

Abstract
In this paper, we initiate a principled study of how the generalization properties of approximate differential privacy can be used to perform adaptive hypothesis testing while giving statistically valid p-value corrections. We do this by observing that the guarantees of algorithms with bounded approximate max-information are sufficient to correct the p-values of adaptively chosen hypotheses, and then by proving that algorithms satisfying (ε, δ)-differential privacy have bounded approximate max-information when their inputs are drawn from a product distribution. This substantially extends the known connection between differential privacy and max-information, which previously was known to hold only for pure (ε, 0)-differential privacy. It also extends our understanding of max-information as a partially unifying measure controlling the generalization properties of adaptive data analyses. We also prove a lower bound showing that, despite the strong composition properties of max-information, when data is drawn from a product distribution, (ε, δ)-differentially private algorithms can come first in a composition with other algorithms satisfying max-information bounds, but not necessarily second, if the composition is required to itself satisfy a nontrivial max-information bound. In particular, this implies that the connection between (ε, δ)-differential privacy and max-information holds only for inputs drawn from product distributions, unlike the connection between (ε, 0)-differential privacy and max-information.
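The central mechanism the abstract describes is that a bound on approximate max-information yields a multiplicative p-value correction. The Python sketch below illustrates this under the standard definition: if an analysis has β-approximate max-information of at most k bits, then any event with probability p for a hypothesis fixed in advance has probability at most 2^k · p + β after adaptive selection. The function names and the numeric example are illustrative assumptions, not taken from the paper's statements.

```python
def corrected_p_value(p: float, k: float, beta: float) -> float:
    """Post-selection bound on an adaptively chosen p-value.

    Assumes the selection procedure has beta-approximate max-information
    at most k bits, i.e. for every event O,
        Pr[(X, A(X)) in O] <= 2**k * Pr[(X', A(X)) in O] + beta,
    where X' is an independent copy of the data. A test with p-value p
    under the null (for a fixed hypothesis) then has post-selection
    significance at most 2**k * p + beta.
    """
    return min(1.0, 2.0 ** k * p + beta)


def per_test_level(alpha: float, k: float, beta: float) -> float:
    """Significance level to use for each adaptively chosen test so that
    the corrected false-positive probability stays below alpha."""
    return max(0.0, (alpha - beta) / 2.0 ** k)


if __name__ == "__main__":
    # Illustrative numbers (assumed, not from the paper): an analysis
    # leaking at most k = 2 bits with slack beta = 1e-6.
    print(corrected_p_value(p=0.01, k=2.0, beta=1e-6))   # ~0.040001
    print(per_test_level(alpha=0.05, k=2.0, beta=1e-6))  # ~0.0125
```

The paper's contribution is then to show that (ε, δ)-differential privacy supplies such a bound k when inputs are drawn from a product distribution, so the same style of correction applies to differentially private adaptive analyses.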
Keywords
adaptive data analysis, differential privacy