Optimal Differentially Private PCA and Estimation for Spiked Covariance Matrices
CoRR (2024)
Abstract
Estimating a covariance matrix and its associated principal components is a
fundamental problem in contemporary statistics. While optimal estimation
procedures have been developed with well-understood properties, the increasing
demand for privacy preservation introduces new complexities to this classical
problem. In this paper, we study optimal differentially private Principal
Component Analysis (PCA) and covariance estimation within the spiked covariance
model.
We precisely characterize the sensitivity of eigenvalues and eigenvectors
under this model and establish the minimax rates of convergence for estimating
both the principal components and covariance matrix. These rates hold up to
logarithmic factors and encompass general Schatten norms, including spectral
norm, Frobenius norm, and nuclear norm as special cases.
We introduce computationally efficient differentially private estimators and
prove their minimax optimality, up to logarithmic factors. Additionally,
matching minimax lower bounds are established. Notably, in comparison with
the existing literature, our results accommodate a diverging rank, require no
eigengap condition between distinct principal components, and remain valid even
when the sample size is much smaller than the dimension.
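The paper's estimators are not reproduced in this abstract, but the general recipe for differentially private covariance estimation they build on can be illustrated with a standard Gaussian-mechanism sketch: perturb the sample covariance with symmetric Gaussian noise calibrated to its sensitivity, then take the top principal components of the noisy matrix. The function below is a hypothetical illustration, not the authors' procedure; the row-norm bound `c`, the simplified sensitivity bound, and the noise calibration are all assumptions of this sketch.

```python
import numpy as np

def dp_pca_sketch(X, k, eps, delta, c, seed=0):
    """Toy (eps, delta)-DP PCA via the Gaussian mechanism (illustrative only)."""
    n, d = X.shape
    # Clip rows to norm c so that one row change perturbs the covariance boundedly.
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    X = X * np.minimum(1.0, c / np.maximum(norms, 1e-12))
    S = X.T @ X / n
    # Frobenius-norm sensitivity of S when one row is replaced: at most 2*c^2 / n.
    sens = 2.0 * c**2 / n
    # Classical Gaussian-mechanism noise scale (a common, slightly loose calibration).
    sigma = sens * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    rng = np.random.default_rng(seed)
    Z = rng.normal(0.0, sigma, size=(d, d))
    Z = (Z + Z.T) / np.sqrt(2.0)  # symmetrize so the private estimate stays symmetric
    S_priv = S + Z
    # Top-k eigenpairs of the privatized covariance, sorted by decreasing eigenvalue.
    vals, vecs = np.linalg.eigh(S_priv)
    order = np.argsort(vals)[::-1][:k]
    return vals[order], vecs[:, order]
```

Note that this naive scheme does not attain the minimax rates described above (in particular, it does not exploit the spiked structure); it only shows where the sensitivity analysis of eigenvalues and eigenvectors enters the design of a private estimator.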