
Filtering time-dependent covariance matrices using time-independent eigenvalues

Journal of Statistical Mechanics: Theory and Experiment (2023)

Abstract
We propose a data-driven, model-free way to reduce the noise of covariance matrices of time-varying systems. If the true covariance matrix is time-invariant, non-linear shrinkage of the eigenvalues is known to yield the optimal estimator for large matrices. Such a method outputs eigenvalues that are highly dependent on its inputs, as common sense suggests. When the covariance matrix is time-dependent, we show that it is generally better to use the set of eigenvalues that encodes the average influence of the future on present eigenvalues, resulting in a set of time-independent average eigenvalues. This situation is widespread in nature, one example being financial markets, where non-linear shrinkage remains the gold-standard filtering method. Our approach outperforms non-linear shrinkage both for the Frobenius-norm distance, the typical loss function used for covariance filtering, and for financial portfolio variance minimization, which makes our method generically relevant to many problems of multivariate inference. Further analysis of financial data suggests that the expected overlap between past and future eigenvectors is systematically overestimated by methods designed for constant covariance matrices. Our method takes a simple empirical average of the eigenvector overlap matrix, which is enough to outperform non-linear shrinkage.
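The core idea can be illustrated with a minimal sketch. For each window, the oracle eigenvalue of a past eigenvector is its quadratic form with the future sample covariance; averaging these over time gives the time-independent "average oracle" eigenvalues described above. All variable names, window sizes, and the synthetic data below are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, win = 400, 10, 100          # hypothetical series length, dimension, window
X = rng.standard_normal((T, N))   # placeholder return series

# For each split point, project the "future" sample covariance onto
# the eigenvectors of the "past" sample covariance.
oracles = []
for t in range(win, T - win + 1, win):
    past, future = X[t - win:t], X[t:t + win]
    C_past = past.T @ past / win
    C_future = future.T @ future / win
    _, V = np.linalg.eigh(C_past)              # past eigenvectors (columns)
    # Oracle eigenvalues: diagonal of V^T C_future V
    oracles.append(np.einsum('ij,jk,ki->i', V.T, C_future, V))

# Time-independent average-oracle eigenvalues
avg_oracle = np.mean(oracles, axis=0)

# Filtered estimate for the latest window: its own eigenvectors,
# but the averaged (time-independent) eigenvalues
C_last = X[-win:].T @ X[-win:] / win
_, V = np.linalg.eigh(C_last)
C_filtered = V @ np.diag(avg_oracle) @ V.T
```

On real data one would replace the synthetic `X` with asset returns and compare `C_filtered` against a non-linear-shrinkage estimate via the Frobenius norm or out-of-sample portfolio variance.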
Keywords
covariance matrix filtering,random matrix theory,non-linear shrinkage,average oracle