Paper Title

There and Back Again: Revisiting Backpropagation Saliency Methods

Paper Authors

Sylvestre-Alvise Rebuffi, Ruth Fong, Xu Ji, Andrea Vedaldi

Paper Abstract

Saliency methods seek to explain the predictions of a model by producing an importance map across each input sample. A popular class of such methods is based on backpropagating a signal and analyzing the resulting gradient. Despite much research on such methods, relatively little work has been done to clarify the differences between such methods as well as the desiderata of these techniques. Thus, there is a need for rigorously understanding the relationships between different methods as well as their failure modes. In this work, we conduct a thorough analysis of backpropagation-based saliency methods and propose a single framework under which several such methods can be unified. As a result of our study, we make three additional contributions. First, we use our framework to propose NormGrad, a novel saliency method based on the spatial contribution of gradients of convolutional weights. Second, we combine saliency maps at different layers to test the ability of saliency methods to extract complementary information at different network levels (e.g., trading off spatial resolution and distinctiveness) and we explain why some methods fail at specific layers (e.g., Grad-CAM anywhere besides the last convolutional layer). Third, we introduce a class-sensitivity metric and a meta-learning inspired paradigm applicable to any saliency method for improving sensitivity to the output class being explained.
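
The abstract names NormGrad's core idea: saliency from the spatial contribution of gradients of convolutional weights. As a rough illustration only, below is a minimal PyTorch sketch of the simplest variant, in which each spatial location's contribution to a (virtual) identity layer's weight gradient is the outer product of the activation and the backpropagated gradient, and the Frobenius norm of that outer product factorizes into a product of the two vectors' norms. The model, hook placement, function names, and inputs here are illustrative assumptions, not the authors' reference implementation.

```python
import torch
from torchvision import models

def normgrad_saliency(model, layer, image, target_class):
    """Return an (H, W) saliency map for one (1, C, H_in, W_in) image."""
    acts = []
    # Capture the activation at the chosen layer during the forward pass.
    handle = layer.register_forward_hook(lambda m, i, o: acts.append(o))
    score = model(image)[0, target_class]
    handle.remove()

    act = acts[0]
    # Gradient of the class score w.r.t. the intermediate activation.
    grad = torch.autograd.grad(score, act)[0]

    # ||x_u|| * ||g_u|| over channels at every spatial location u:
    # the Frobenius norm of the per-location outer product x_u g_u^T.
    return (act.norm(dim=1) * grad.norm(dim=1))[0].detach()

# Usage with an untrained ResNet-50 and a random image (placeholders only;
# a real experiment would load pretrained weights and a normalized input).
model = models.resnet50().eval()
image = torch.randn(1, 3, 224, 224)
saliency = normgrad_saliency(model, model.layer4, image, target_class=243)
print(saliency.shape)  # torch.Size([7, 7]) at layer4 of ResNet-50
```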
