Uniform inference in high-dimensional Gaussian graphical models

arXiv: Methodology (2023)

Abstract
Graphical models have become a popular tool for representing dependencies within large sets of variables and are crucial for representing causal structures. We provide results for uniform inference on high-dimensional graphical models, in which the number of target parameters d is potentially much larger than the sample size, under approximate sparsity. Our results highlight how graphical models can be estimated and recovered with modern machine learning methods in high-dimensional, complex settings. To construct simultaneous confidence regions for many target parameters, sufficiently fast estimation rates for the nuisance functions are crucial. In this context, we establish uniform estimation rates and sparsity guarantees for the square-root lasso estimator in a random design under approximate sparsity conditions, which may be of independent interest for related high-dimensional problems. We also demonstrate in a comprehensive simulation study that our procedure has good small-sample properties compared to existing methods, and we present two empirical applications.
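As a rough illustration of the node-wise estimation step described in the abstract (not the authors' implementation), the sketch below fits a square-root lasso regression of each variable on all others using cvxpy; nonzero coefficients suggest candidate edges of the Gaussian graphical model. The penalty level and the thresholding rule are simplified assumptions, and the paper's debiasing/double machine learning step that yields simultaneous confidence regions is not shown.

```python
# Minimal sketch, assuming cvxpy and numpy are available.
# The penalty level `lam` is a common rate-based choice, not the paper's tuning rule.
import numpy as np
import cvxpy as cp

def sqrt_lasso(X, y, lam):
    """Square-root lasso: minimize ||y - X b||_2 / sqrt(n) + lam * ||b||_1."""
    n, p = X.shape
    b = cp.Variable(p)
    objective = cp.norm(y - X @ b, 2) / np.sqrt(n) + lam * cp.norm1(b)
    cp.Problem(cp.Minimize(objective)).solve()
    return b.value

def nodewise_regressions(X, lam):
    """Regress each variable on the remaining ones; nonzero coefficients suggest edges."""
    n, d = X.shape
    coefs = np.zeros((d, d))
    for j in range(d):
        others = np.delete(np.arange(d), j)
        coefs[j, others] = sqrt_lasso(X[:, others], X[:, j], lam)
    return coefs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 200, 20
    X = rng.standard_normal((n, d))          # placeholder data with no true graph
    lam = 1.1 * np.sqrt(2 * np.log(d) / n)   # rate-based penalty level (assumption)
    B = nodewise_regressions(X, lam)
    # Crude (non-inferential) edge estimate: both directions non-negligible.
    edges = (np.abs(B) > 1e-6) & (np.abs(B.T) > 1e-6)
```

The "AND" rule for combining the two directed coefficients is only one conventional choice; the paper instead constructs uniformly valid confidence regions over many edges, which requires the debiasing machinery omitted here.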
Keywords
Conditional independence, Double/debiased machine learning, Gaussian graphical model, High-dimensional setting, Post-selection inference, Square-root lasso