Title

The Statistical Cost of Robust Kernel Hyperparameter Tuning

Authors

Raphael A. Meyer and Christopher Musco

Abstract

This paper studies the statistical complexity of kernel hyperparameter tuning in the setting of active regression under adversarial noise. We consider the problem of finding the best interpolant from a class of kernels with unknown hyperparameters, assuming only that the noise is square-integrable. We provide finite-sample guarantees for the problem, characterizing how increasing the complexity of the kernel class increases the complexity of learning kernel hyperparameters. For common kernel classes (e.g. squared-exponential kernels with unknown lengthscale), our results show that hyperparameter optimization increases sample complexity by just a logarithmic factor, in comparison to the setting where optimal parameters are known in advance. Our result is based on a subsampling guarantee for linear regression under multiple design matrices, combined with an ε-net argument for discretizing kernel parameterizations.
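As a toy illustration of the setting (not the paper's algorithm), the sketch below selects the lengthscale of a squared-exponential kernel from a geometric grid of candidates — the discretization step that an ε-net argument justifies — by fitting kernel ridge regression at each candidate and comparing held-out error. The paper's actual method uses active sampling with subsampling guarantees; the validation split, grid range, and regularization value here are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def se_kernel(x, z, lengthscale):
    # Squared-exponential kernel: K[i, j] = exp(-(x_i - z_j)^2 / (2 l^2))
    d2 = (x[:, None] - z[None, :]) ** 2
    return np.exp(-d2 / (2.0 * lengthscale ** 2))

def heldout_mse(x_tr, y_tr, x_va, y_va, lengthscale, reg=1e-3):
    # Kernel ridge regression fit at a fixed lengthscale, scored on held-out data.
    K = se_kernel(x_tr, x_tr, lengthscale)
    alpha = np.linalg.solve(K + reg * np.eye(len(x_tr)), y_tr)
    pred = se_kernel(x_va, x_tr, lengthscale) @ alpha
    return float(np.mean((pred - y_va) ** 2))

# Synthetic target: a smooth function plus square-integrable noise (toy data).
x = rng.uniform(-3.0, 3.0, size=200)
y = np.sin(2.0 * x) + 0.1 * rng.standard_normal(200)
x_tr, y_tr, x_va, y_va = x[:150], y[:150], x[150:], y[150:]

# Geometric grid over lengthscales: a multiplicative ε-net needs only
# logarithmically many candidates to cover a bounded lengthscale range.
grid = np.geomspace(0.05, 5.0, num=20)
errors = [heldout_mse(x_tr, y_tr, x_va, y_va, l) for l in grid]
best_lengthscale = float(grid[int(np.argmin(errors))])
```

Because the grid is geometric, covering a lengthscale range [a, b] to multiplicative accuracy 1 + ε takes only O(log(b/a) / ε) candidates, which matches the abstract's point that tuning costs just a logarithmic factor over knowing the optimal parameter in advance.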
