Paper Title

Differentiable PAC-Bayes Objectives with Partially Aggregated Neural Networks

Authors

Felix Biggs, Benjamin Guedj

Abstract

We make three related contributions motivated by the challenge of training stochastic neural networks, particularly in a PAC-Bayesian setting: (1) we show how averaging over an ensemble of stochastic neural networks enables a new class of \emph{partially-aggregated} estimators; (2) we show that these lead to provably lower-variance gradient estimates for non-differentiable signed-output networks; (3) we reformulate a PAC-Bayesian bound for these networks to derive a directly optimisable, differentiable objective and a generalisation guarantee, without using a surrogate loss or loosening the bound. This bound is twice as tight as that of Letarte et al. (2019) on a similar network type. We show empirically that these innovations make training easier and lead to competitive guarantees.
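The core idea behind contributions (1) and (2) can be illustrated with a toy sketch. For a non-differentiable sign-output unit with a Gaussian weight posterior, the expectation of the sign has a smooth closed form (via the Gaussian CDF, expressible with `erf`), so aggregating analytically yields a differentiable, lower-variance quantity than Monte Carlo sampling of the hard sign. This is a minimal one-weight illustration of that standard fact, not the paper's actual estimator; all variable names are illustrative:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

mu, sigma = 0.7, 1.0  # mean and std of the Gaussian posterior over a single weight (hypothetical values)
x = 1.0               # fixed scalar input

# Naive estimator: sample w ~ N(mu, sigma^2), emit sign(w * x).
# This is non-differentiable in mu and has variance 1 - E[sign]^2 per sample.
samples = np.sign((mu + sigma * rng.standard_normal(100_000)) * x)

# Aggregated output: E_w[sign(w * x)] = erf(mu * x / (sqrt(2) * sigma * |x|)),
# a smooth function of mu that can be differentiated directly, with zero sampling variance.
aggregated = erf(mu * x / (sqrt(2) * sigma * abs(x)))

print(samples.mean(), aggregated)  # the Monte Carlo mean should agree closely with the closed form
```

The paper's partially-aggregated estimators exploit this kind of analytic averaging over parts of a stochastic network; the sketch above only verifies the single-unit expectation it rests on.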
