Paper Title


Improving robustness against common corruptions by covariate shift adaptation

Paper Authors

Steffen Schneider, Evgenia Rusak, Luisa Eck, Oliver Bringmann, Wieland Brendel, Matthias Bethge

Paper Abstract


Today's state-of-the-art machine vision models are vulnerable to image corruptions like blurring or compression artefacts, limiting their performance in many real-world applications. We here argue that popular benchmarks to measure model robustness against common corruptions (like ImageNet-C) underestimate model robustness in many (but not all) application scenarios. The key insight is that in many scenarios, multiple unlabeled examples of the corruptions are available and can be used for unsupervised online adaptation. Replacing the activation statistics estimated by batch normalization on the training set with the statistics of the corrupted images consistently improves the robustness across 25 different popular computer vision models. Using the corrected statistics, ResNet-50 reaches 62.2% mCE on ImageNet-C compared to 76.7% without adaptation. With the more robust DeepAugment+AugMix model, we improve the state of the art for a ResNet-50 model from 53.6% mCE to 45.4% mCE. Even adapting to a single sample improves robustness for the ResNet-50 and AugMix models, and 32 samples are sufficient to improve the current state of the art for a ResNet-50 architecture. We argue that results with adapted statistics should be included whenever reporting scores in corruption benchmarks and other out-of-distribution generalization settings.
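The core idea, replacing batch-normalization statistics estimated on the training set with statistics computed on the unlabeled corrupted batch, can be sketched in a few lines of numpy. This is an illustrative simplification, not the authors' implementation: the function name, the 2-D `(batch, channels)` feature layout, and the pseudo-sample count `N` used to blend training and test statistics are assumptions made here for clarity (with `N=0` corresponding to full adaptation to the corrupted batch).

```python
import numpy as np

def adapt_batchnorm_stats(features, train_mean, train_var, N=0, eps=1e-5):
    """Normalize activations with statistics estimated from the
    (unlabeled) corrupted batch, optionally blended with the
    training-set statistics.

    features : array of shape (batch, channels)
    N        : weight of the training statistics, expressed as a
               number of pseudo-samples; N=0 means the training
               statistics are discarded entirely (full adaptation).
    """
    n = features.shape[0]
    batch_mean = features.mean(axis=0)
    batch_var = features.var(axis=0)
    # Convex combination of train-time and test-time statistics.
    mean = (N * train_mean + n * batch_mean) / (N + n)
    var = (N * train_var + n * batch_var) / (N + n)
    return (features - mean) / np.sqrt(var + eps)
```

With `N=0`, each channel of the output is standardized using only the corrupted batch, which is what yields the reported mCE improvements on ImageNet-C; a positive `N` interpolates toward the unadapted model, which is useful when only a handful of corrupted samples are available.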
