Paper Title

Tree-Projected Gradient Descent for Estimating Gradient-Sparse Parameters on Graphs

Authors

Sheng Xu, Zhou Fan, Sahand Negahban

Abstract

We study estimation of a gradient-sparse parameter vector $\boldsymbol{\theta}^* \in \mathbb{R}^p$, having strong gradient-sparsity $s^* := \|\nabla_G \boldsymbol{\theta}^*\|_0$ on an underlying graph $G$. Given observations $Z_1, \ldots, Z_n$ and a smooth, convex loss function $\mathcal{L}$ for which $\boldsymbol{\theta}^*$ minimizes the population risk $\mathbb{E}[\mathcal{L}(\boldsymbol{\theta}; Z_1, \ldots, Z_n)]$, we propose to estimate $\boldsymbol{\theta}^*$ by a projected gradient descent algorithm that iteratively and approximately projects gradient steps onto spaces of vectors having small gradient-sparsity over low-degree spanning trees of $G$. We show that, under suitable restricted strong convexity and smoothness assumptions for the loss, the resulting estimator achieves the squared-error risk $\frac{s^*}{n} \log(1 + \frac{p}{s^*})$ up to a multiplicative constant that is independent of $G$. In contrast, previous polynomial-time algorithms have only been shown to achieve this guarantee in more specialized settings, or under additional assumptions on $G$ and/or the sparsity pattern of $\nabla_G \boldsymbol{\theta}^*$. As applications of our general framework, we consider the examples of linear models and generalized linear models with random design.

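To make the algorithm concrete, below is a minimal Python sketch (our illustration, not the authors' implementation). It treats the special case where $G$ is a path graph, for which the projection onto gradient-sparse vectors can be computed exactly: the best piecewise-constant fit with at most $s + 1$ constant segments, found by dynamic programming. The names project_path_gradient_sparse and tree_projected_gd are hypothetical, and this exact path-graph projection stands in for the paper's approximate projections over low-degree spanning trees of a general graph $G$, which are what keep the projection step tractable beyond special cases.

import numpy as np

def project_path_gradient_sparse(v, s):
    # Exact squared-error projection of v onto {theta : ||grad_G theta||_0 <= s}
    # when G is a path: the best piecewise-constant fit to v with at most
    # s + 1 constant segments, via O(p^2 * s) dynamic programming.
    p = len(v)
    ps = np.concatenate(([0.0], np.cumsum(v)))               # prefix sums
    ps2 = np.concatenate(([0.0], np.cumsum(np.square(v))))   # prefix sums of squares

    def seg_cost(i, j):
        # Squared error of fitting v[i:j] (half-open) by its mean.
        tot = ps[j] - ps[i]
        return ps2[j] - ps2[i] - tot * tot / (j - i)

    kmax = min(s + 1, p)                      # segments allowed
    dp = np.full((kmax + 1, p + 1), np.inf)   # dp[k, j]: best cost for v[:j] using exactly k segments
    back = np.zeros((kmax + 1, p + 1), dtype=int)
    dp[0, 0] = 0.0
    for k in range(1, kmax + 1):
        for j in range(k, p + 1):
            for i in range(k - 1, j):         # i = start index of the k-th segment
                c = dp[k - 1, i] + seg_cost(i, j)
                if c < dp[k, j]:
                    dp[k, j] = c
                    back[k, j] = i
    k = 1 + int(np.argmin(dp[1:, p]))         # best achievable segment count
    out = np.empty(p)
    j = p
    while k > 0:                              # backtrack; fill each segment with its mean
        i = back[k, j]
        out[i:j] = (ps[j] - ps[i]) / (j - i)
        j, k = i, k - 1
    return out

def tree_projected_gd(grad, theta0, s, step=0.25, iters=100):
    # Projected gradient descent: a gradient step on the loss, then a
    # projection onto gradient-sparse vectors (path-graph stand-in for the
    # paper's approximate spanning-tree projections).
    theta = np.asarray(theta0, dtype=float).copy()
    for _ in range(iters):
        theta = project_path_gradient_sparse(theta - step * grad(theta), s)
    return theta

# Usage sketch: denoise a piecewise-constant signal with s* = 2 jumps under
# the squared loss 0.5 * ||theta - z||^2, whose gradient is theta - z.
rng = np.random.default_rng(0)
truth = np.repeat([0.0, 2.0, -1.0], 50)
z = truth + 0.5 * rng.standard_normal(truth.size)
theta_hat = tree_projected_gd(lambda th: th - z, np.zeros_like(z), s=2)

Restricting to a path (more generally, a spanning tree) is what makes the projection step computable by dynamic programming; the paper's contribution is showing that approximate projections over low-degree spanning trees of an arbitrary graph $G$ suffice for the stated risk guarantee.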