Paper Title

Multi-view Contrastive Learning for Online Knowledge Distillation

Authors

Chuanguang Yang, Zhulin An, Yongjun Xu

Abstract

Previous Online Knowledge Distillation (OKD) methods often mutually exchange probability distributions but neglect useful representational knowledge. We therefore propose Multi-view Contrastive Learning (MCL) for OKD to implicitly capture correlations among the feature embeddings encoded by multiple peer networks, which provide various views for understanding the input data instances. Benefiting from MCL, we learn a more discriminative representation space for classification than previous OKD methods. Experimental results on image classification demonstrate that our MCL-OKD outperforms other state-of-the-art OKD methods by large margins without incurring any additional inference cost. Code is available at https://github.com/winycg/MCL-OKD.
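
The core idea is contrastive alignment across the embeddings that the peer networks produce for the same input, treating each peer as a different "view" of the instance. The sketch below is a minimal, illustrative interpretation assuming an InfoNCE-style pairwise objective; the function names, temperature value, and the simple pairwise averaging are assumptions for illustration, not the paper's exact formulation (the official implementation is at the GitHub URL above).

```python
import torch
import torch.nn.functional as F

def infonce_pair(z_a, z_b, temperature=0.1):
    """InfoNCE between two views: z_a[i] and z_b[i] come from the same
    input instance (positive pair); all other rows act as negatives."""
    z_a = F.normalize(z_a, dim=1)
    z_b = F.normalize(z_b, dim=1)
    logits = z_a @ z_b.t() / temperature            # (N, N) cosine similarities
    targets = torch.arange(z_a.size(0), device=z_a.device)
    return F.cross_entropy(logits, targets)

def mcl_loss(embeddings, temperature=0.1):
    """Average the pairwise contrastive loss over all ordered pairs of
    peer-network embeddings (one (N, D) tensor per peer)."""
    losses = [
        infonce_pair(z_i, z_j, temperature)
        for i, z_i in enumerate(embeddings)
        for j, z_j in enumerate(embeddings)
        if i != j
    ]
    return torch.stack(losses).mean()

if __name__ == "__main__":
    # Stand-in for per-peer embeddings of one mini-batch,
    # e.g. embeddings = [peer(x) for peer in peer_networks].
    embeddings = [torch.randn(8, 128) for _ in range(4)]  # 4 peers, batch of 8
    print(mcl_loss(embeddings).item())
```

In training, a term like this would typically be added to each peer's usual cross-entropy classification loss; since only one network need be kept at deployment, this is consistent with the abstract's claim of no additional inference cost.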
