Paper Title

Effective Version Space Reduction for Convolutional Neural Networks

Paper Authors

Jiayu Liu, Ioannis Chiotellis, Rudolph Triebel, Daniel Cremers

Paper Abstract

In active learning, sampling bias can pose a serious inconsistency problem and hinder the algorithm from finding the optimal hypothesis. However, many active learning methods for neural networks are hypothesis space agnostic and do not address this problem. We examine active learning with convolutional neural networks through the principled lens of version space reduction. We identify the connection between two approaches, prior mass reduction and diameter reduction, and propose a new diameter-based querying method, the minimum Gibbs-vote disagreement. By estimating the version space diameter and bias, we illustrate how the version space of neural networks evolves and examine the realizability assumption. With experiments on the MNIST, Fashion-MNIST, SVHN and STL-10 datasets, we demonstrate that diameter reduction methods reduce the version space more effectively and perform better than prior mass reduction and other baselines, and that the Gibbs-vote disagreement is on par with the best query method.
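
The abstract only names the minimum Gibbs-vote disagreement criterion. As a rough illustration of the idea, the sketch below approximates the version space with a finite sample of hypotheses (e.g., independently trained CNNs or Monte Carlo dropout draws) and scores each pool point by the Gibbs-vote disagreement expected to remain after its label is revealed, assuming realizability so that hypotheses inconsistent with the revealed label are pruned. The function names and this sampling-based approximation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gibbs_vote_disagreement(preds, n_classes):
    # preds: (n_models, n_points) hard-label predictions of hypotheses
    # sampled from the (approximate) version space.
    # Returns, per point, the fraction of hypotheses whose prediction
    # differs from the majority vote, i.e. the Gibbs-vote disagreement.
    counts = np.stack([(preds == c).sum(axis=0) for c in range(n_classes)])
    votes = counts.argmax(axis=0)          # majority-vote label per point
    return (preds != votes[None, :]).mean(axis=0)

def min_gibbs_vote_query(preds, n_classes):
    # Greedy query selection: choose the pool index whose (unknown) label,
    # averaged over the empirical vote probabilities, is expected to leave
    # the smallest mean Gibbs-vote disagreement. Assumes realizability:
    # hypotheses that mispredict the revealed label leave the version space.
    n_models, n_points = preds.shape
    best_idx, best_val = 0, np.inf
    for i in range(n_points):
        expected = 0.0
        for c in range(n_classes):
            survivors = preds[:, i] == c   # hypotheses consistent with label c
            p_c = survivors.mean()         # empirical vote probability of c
            if p_c > 0.0:
                expected += p_c * gibbs_vote_disagreement(
                    preds[survivors], n_classes).mean()
        if expected < best_val:
            best_idx, best_val = i, expected
    return best_idx

# Toy usage with a hypothetical sample of 8 hypotheses on a pool of 50 points.
rng = np.random.default_rng(0)
preds = rng.integers(0, 10, size=(8, 50))
print(min_gibbs_vote_query(preds, n_classes=10))
```

In practice the per-candidate loop would be restricted to a subsample of the pool, since pruning and re-scoring for every point is quadratic in the pool size.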
