Some Constructions of Private, Efficient, and Optimal K-Norm and Elliptic Gaussian Noise

Matthew Joseph, Alexander Yu

arXiv (2023)

Abstract
Differentially private computation often begins with a bound on some d-dimensional statistic's ℓ_p sensitivity. For pure differential privacy, the K-norm mechanism can improve on this approach using a norm tailored to the statistic's sensitivity space. Writing down a closed-form description of this optimal norm is often straightforward. However, running the K-norm mechanism reduces to uniformly sampling the norm's unit ball; this ball is a d-dimensional convex body, so general sampling algorithms can be slow. Turning to concentrated differential privacy, elliptic Gaussian noise offers similar improvement over spherical Gaussian noise. Once the shape of this ellipse is determined, sampling is easy; however, identifying the best such shape may be hard. This paper solves both problems for the simple statistics of sum, count, and vote. For each statistic, we provide a sampler for the optimal K-norm mechanism that runs in time Õ(d^2) and derive a closed-form expression for the optimal shape of elliptic Gaussian noise. The resulting algorithms all yield meaningful accuracy improvements while remaining fast and simple enough to be practical. More broadly, we suggest that problem-specific sensitivity space analysis may be an overlooked tool for private additive noise.
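The paper's fast samplers are tailored to the sensitivity spaces of sum, count, and vote and are not reproduced here. As generic context for the two mechanisms the abstract names, the sketch below shows (a) the standard recipe for the K-norm mechanism when the norm is ℓ1 (sample a uniform point in the ℓ1 unit ball, then scale by a Gamma-distributed radius) and (b) elliptic Gaussian noise given a shape matrix Σ. Function names, the ℓ1 choice of norm, and the parameterization are illustrative assumptions, not the paper's constructions.

```python
import numpy as np

def k_norm_l1_sample(d, eps, delta_1, rng=None):
    """Illustrative K-norm mechanism noise for the l1 norm (not the paper's
    statistic-specific sampler): density proportional to
    exp(-eps * ||z||_1 / delta_1) over R^d."""
    rng = np.random.default_rng(rng)
    # Uniform point in the l1 unit ball: first d coordinates of a
    # Dirichlet(1, ..., 1) vector over d+1 exponentials, with random signs.
    e = rng.exponential(size=d + 1)
    u = e[:d] / e.sum() * rng.choice([-1.0, 1.0], size=d)
    # Scale by a Gamma(d+1, delta_1/eps) radius; r * u then has the
    # K-norm density above (the standard ball-sampling reduction).
    r = rng.gamma(shape=d + 1, scale=delta_1 / eps)
    return r * u

def elliptic_gaussian_sample(sigma, rng=None):
    """Illustrative elliptic Gaussian noise N(0, Sigma) for a given
    (assumed positive-definite) shape matrix Sigma, via Cholesky."""
    rng = np.random.default_rng(rng)
    L = np.linalg.cholesky(sigma)
    return L @ rng.standard_normal(sigma.shape[0])
```

Once the optimal norm or ellipse shape is known in closed form, both samplers are cheap; the hard parts the paper addresses are sampling the right unit ball quickly (for pure DP) and deriving the right Σ (for concentrated DP).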