Contraction of $E_\gamma$-Divergence and Its Applications to Privacy

arXiv (2023)

Abstract
We investigate the contraction coefficients derived from strong data processing inequalities for the $E_\gamma$-divergence. By generalizing Dobrushin's celebrated coefficient from total variation distance to $E_\gamma$-divergence, we derive a closed-form expression for the contraction of $E_\gamma$-divergence. This result has fundamental consequences in two privacy settings. First, it implies that local differential privacy can be equivalently expressed in terms of the contraction of $E_\gamma$-divergence. This equivalent formulation can be used to precisely quantify the impact of local privacy on (Bayesian and minimax) estimation and hypothesis testing problems in terms of the reduction of effective sample size. Second, it leads to a new information-theoretic technique for analyzing the privacy guarantees of online algorithms. In this technique, we view such algorithms as compositions of amplitude-constrained Gaussian channels and then relate their contraction coefficients under the $E_\gamma$-divergence to the overall differential privacy guarantees. As an example, we apply our technique to derive the differential privacy parameters of gradient descent. Moreover, we show that this framework can be tailored to batch learning algorithms that can be implemented with one pass over the training dataset.
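For context, here is a minimal sketch of the quantities the abstract refers to, using the standard definitions from the strong data processing literature; the notation $\eta_\gamma(\mathsf{K})$ for the contraction coefficient and $\mathsf{K}$ for a channel (Markov kernel) is an assumption of this sketch, not taken from the paper. For $\gamma \geq 1$, the $E_\gamma$-divergence between distributions $P$ and $Q$ is
\[
E_\gamma(P \,\|\, Q) \;=\; \sup_{A} \bigl[ P(A) - \gamma\, Q(A) \bigr] \;=\; \int \bigl( \mathrm{d}P - \gamma\, \mathrm{d}Q \bigr)_{+},
\]
and the associated contraction coefficient of a channel $\mathsf{K}$ is
\[
\eta_\gamma(\mathsf{K}) \;=\; \sup_{P, Q \,:\, E_\gamma(P \| Q) > 0} \frac{E_\gamma(P\mathsf{K} \,\|\, Q\mathsf{K})}{E_\gamma(P \,\|\, Q)},
\]
where $P\mathsf{K}$ denotes the output distribution of $\mathsf{K}$ when the input is drawn from $P$. Setting $\gamma = 1$ recovers the total variation distance and Dobrushin's coefficient, which is the special case the abstract generalizes.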
Keywords
privacy, $E_\gamma$-divergence