Optimal Non-Asymptotic Bounds for the Sparse β Model

Xiaowei Yang, Lu Pan, Kun Cheng, Chao Liu

Mathematics (2023)

Abstract
This paper investigates the sparse beta model with an l1 penalty, a topic of active interest in research on network data models in both statistics and social network analysis. We present a refined algorithm for parameter estimation in the proposed model; because the loss function is convex, the algorithm coincides with the proximal gradient descent method. We study estimation consistency and establish an optimal non-asymptotic bound for the proposed estimator. Simulation studies corroborate the effectiveness of the methodology and illustrate its potential contribution to network data analysis.
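The abstract does not include code; as an illustration of the estimation approach it describes, below is a minimal sketch of proximal gradient descent with an l1 proximal step (soft-thresholding) applied to the beta-model log-likelihood. The function names, step-size rule, and iteration count are assumptions for illustration, not the authors' implementation or the paper's exact algorithm.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (element-wise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fit_sparse_beta(A, lam, n_iter=500, step=None):
    """Sketch: l1-penalized beta-model fit by proximal gradient descent.

    Model assumption: P(A_ij = 1) = sigmoid(beta_i + beta_j) for i != j.
    Objective: negative log-likelihood + lam * ||beta||_1.
    """
    n = A.shape[0]
    if step is None:
        step = 2.0 / n  # conservative step, below 1/L for the gradient's Lipschitz bound
    beta = np.zeros(n)
    mask = ~np.eye(n, dtype=bool)            # exclude self-loops
    for _ in range(n_iter):
        eta = beta[:, None] + beta[None, :]  # beta_i + beta_j
        p = 1.0 / (1.0 + np.exp(-eta))
        # Gradient of the negative log-likelihood w.r.t. beta_i:
        # sum over j != i of (p_ij - A_ij).
        grad = ((p - A) * mask).sum(axis=1)
        # Gradient step, then the l1 proximal (soft-threshold) step.
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

# Toy usage on a small symmetric 0/1 adjacency matrix.
rng = np.random.default_rng(0)
A = rng.integers(0, 2, size=(20, 20))
A = np.triu(A, 1)
A = A + A.T
beta_hat = fit_sparse_beta(A, lam=0.5)
```

The soft-thresholding step is what induces sparsity in the estimated beta vector; the larger the penalty level lam, the more coordinates are shrunk exactly to zero.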
Keywords
sparse beta model, l1 penalty, proximal gradient descent, consistency analysis