Paper Title

Hyperparameter Ensembles for Robustness and Uncertainty Quantification

Paper Authors

Florian Wenzel, Jasper Snoek, Dustin Tran, Rodolphe Jenatton

Paper Abstract

Ensembles over neural network weights trained from different random initialization, known as deep ensembles, achieve state-of-the-art accuracy and calibration. The recently introduced batch ensembles provide a drop-in replacement that is more parameter efficient. In this paper, we design ensembles not only over weights, but over hyperparameters to improve the state of the art in both settings. For best performance independent of budget, we propose hyper-deep ensembles, a simple procedure that involves a random search over different hyperparameters, themselves stratified across multiple random initializations. Its strong performance highlights the benefit of combining models with both weight and hyperparameter diversity. We further propose a parameter efficient version, hyper-batch ensembles, which builds on the layer structure of batch ensembles and self-tuning networks. The computational and memory costs of our method are notably lower than typical ensembles. On image classification tasks, with MLP, LeNet, ResNet 20 and Wide ResNet 28-10 architectures, we improve upon both deep and batch ensembles.
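
To make the first recipe concrete, here is a minimal, hypothetical sketch of hyper-deep ensembles in Python, using scikit-learn's MLPClassifier as a stand-in base model. The search space, the ensemble sizes, and the helper names are illustrative assumptions, and the full procedure in the paper also includes a greedy member-selection step that this sketch omits.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def sample_hparams(rng):
    # Assumed search space (illustrative; not the paper's exact choices).
    return {
        "learning_rate_init": 10 ** rng.uniform(-4, -1),  # log-uniform LR
        "alpha": 10 ** rng.uniform(-6, -2),               # L2 regularization
    }

def hyper_deep_ensemble(X, y, n_hparams=4, n_inits=3, seed=0):
    """Random search over hyperparameters, stratified across random inits."""
    rng = np.random.default_rng(seed)
    members = []
    for _ in range(n_hparams):        # hyperparameter diversity
        hp = sample_hparams(rng)
        for _ in range(n_inits):      # weight (initialization) diversity
            clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300,
                                random_state=int(rng.integers(2**31 - 1)),
                                **hp)
            members.append(clf.fit(X, y))
    return members

def ensemble_predict_proba(members, X):
    # Uniform mixture of the members' predictive distributions.
    return np.mean([m.predict_proba(X) for m in members], axis=0)
```

Hyper-batch ensembles, by contrast, share one weight matrix across members and modulate it with per-member rank-1 factors, as in batch ensembles, with those factors additionally conditioned on the member's hyperparameters through an embedding, in the spirit of self-tuning networks. The layer below is a loose illustration of that idea, not the paper's exact parameterization; the sizes and the linear maps R and S are assumptions.

```python
class HyperBatchDense:
    """Illustrative dense layer combining batch-ensemble rank-1 factors
    with hyperparameter conditioning (assumed form, see above)."""

    def __init__(self, d_in, d_out, n_members, d_embed, rng):
        self.W = rng.normal(scale=d_in ** -0.5, size=(d_in, d_out))  # shared
        self.r = rng.normal(size=(n_members, d_in))   # per-member input scales
        self.s = rng.normal(size=(n_members, d_out))  # per-member output scales
        # Assumed linear maps turning a hyperparameter embedding into
        # additive adjustments of the rank-1 factors.
        self.R = rng.normal(scale=0.01, size=(d_embed, d_in))
        self.S = rng.normal(scale=0.01, size=(d_embed, d_out))
        self.b = np.zeros((n_members, d_out))

    def forward(self, x, k, e_lambda):
        # x: (batch, d_in); k: member index; e_lambda: (d_embed,) embedding
        # of member k's hyperparameters.
        r_k = self.r[k] + e_lambda @ self.R
        s_k = self.s[k] + e_lambda @ self.S
        # Equivalent to x @ (self.W * np.outer(r_k, s_k)), but cheaper.
        return ((x * r_k) @ self.W) * s_k + self.b[k]
```

At test time, each member k would be evaluated with the embedding of its own hyperparameters and the outputs averaged, as in ensemble_predict_proba above; the rank-1 structure is what keeps the parameter and memory overhead per member small.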
