Paper Title

On Uniform Convergence and Low-Norm Interpolation Learning

Paper Authors

Lijia Zhou, Danica J. Sutherland, Nathan Srebro

Paper Abstract

We consider an underdetermined noisy linear regression model where the minimum-norm interpolating predictor is known to be consistent, and ask: can uniform convergence in a norm ball, or at least (following Nagarajan and Kolter) the subset of a norm ball that the algorithm selects on a typical input set, explain this success? We show that uniformly bounding the difference between empirical and population errors cannot show any learning in the norm ball, and cannot show consistency for any set, even one depending on the exact algorithm and distribution. But we argue we can explain the consistency of the minimal-norm interpolator with a slightly weaker, yet standard, notion: uniform convergence of zero-error predictors in a norm ball. We use this to bound the generalization error of low- (but not minimal-) norm interpolating predictors.
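
To make the central object of the abstract concrete, here is a minimal sketch (not code from the paper; all dimensions, the noise level, and the variable names are illustrative) of an underdetermined noisy linear regression problem. It constructs the minimum-norm interpolator via the pseudoinverse, and then a low- but not minimal-norm interpolator obtained by perturbing it along a null-space direction of the design matrix, so that it still fits the training data exactly. Roughly, the abstract's distinction is between bounding $\sup_{\|w\|\le B} |L(w) - \hat L(w)|$ over the whole norm ball versus bounding $L(w)$ only over predictors in the ball with $\hat L(w) = 0$.

```python
import numpy as np

# Illustrative setup: n samples, d >> n features, so X w = y has
# infinitely many interpolating solutions.
rng = np.random.default_rng(0)
n, d = 50, 500
X = rng.standard_normal((n, d))
w_star = np.zeros(d)
w_star[0] = 1.0                        # sparse ground-truth predictor
y = X @ w_star + 0.1 * rng.standard_normal(n)   # noisy labels

# Minimum-norm interpolator: argmin ||w||_2 subject to X w = y,
# given in closed form by the Moore-Penrose pseudoinverse.
w_min = np.linalg.pinv(X) @ y

# A low- (but not minimal-) norm interpolator: add a component lying in
# the null space of X, which leaves the training fit unchanged but
# strictly increases the norm (w_min is orthogonal to the null space).
v = rng.standard_normal(d)
v_null = v - np.linalg.pinv(X) @ (X @ v)        # project v onto null(X)
w_low = w_min + 0.1 * v_null / np.linalg.norm(v_null)

for name, w in [("min-norm", w_min), ("low-norm", w_low)]:
    train_err = np.mean((X @ w - y) ** 2)
    print(f"{name}: train error {train_err:.2e}, norm {np.linalg.norm(w):.3f}")
```

Both predictors interpolate the training set (train error is zero up to numerical precision); the second has a slightly larger norm, which is the regime the paper's final bound addresses.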
