Paper Title

Alpha-divergence Variational Inference Meets Importance Weighted Auto-Encoders: Methodology and Asymptotics

Authors

Kamélia Daudel, Joe Benton, Yuyang Shi, Arnaud Doucet

Abstract

Several algorithms involving the Variational Rényi (VR) bound have been proposed to minimize an alpha-divergence between a target posterior distribution and a variational distribution. Despite promising empirical results, those algorithms resort to biased stochastic gradient descent procedures and thus lack theoretical guarantees. In this paper, we formalize and study the VR-IWAE bound, a generalization of the Importance Weighted Auto-Encoder (IWAE) bound. We show that the VR-IWAE bound enjoys several desirable properties and notably leads to the same stochastic gradient descent procedure as the VR bound in the reparameterized case, but this time by relying on unbiased gradient estimators. We then provide two complementary theoretical analyses of the VR-IWAE bound and thus of the standard IWAE bound. Those analyses shed light on the benefits or lack thereof of these bounds. Lastly, we illustrate our theoretical claims over toy and real-data examples.
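The abstract refers to the VR-IWAE bound but does not reproduce it. As an illustrative sketch only (not the paper's reference implementation), the objective of the form (1/(1-α)) E[log((1/N) Σᵢ wᵢ^{1-α})], with importance weights wᵢ = p(x, zᵢ)/q(zᵢ), can be estimated by Monte Carlo on a toy conjugate Gaussian model. The model, the function name `vr_iwae_estimate`, and all parameter names here are assumptions for illustration:

```python
import numpy as np

def vr_iwae_estimate(x, mu, sigma, alpha, N, n_mc=2000, rng=None):
    """Monte Carlo estimate of a VR-IWAE-style objective,
        (1 / (1 - alpha)) * E[ log( (1/N) * sum_i w_i^(1 - alpha) ) ],
    with importance weights w_i = p(x, z_i) / q(z_i), for alpha in [0, 1).
    alpha = 0 recovers the usual IWAE estimator; N = 1 with alpha = 0 gives
    the standard ELBO. Toy model (an assumption for illustration):
        z ~ N(0, 1),  x | z ~ N(z, 1),  variational q = N(mu, sigma^2)."""
    rng = np.random.default_rng(rng)
    z = rng.normal(mu, sigma, size=(n_mc, N))                    # z_i ~ q
    log_joint = (-0.5 * z**2 - 0.5 * np.log(2 * np.pi)           # log p(z)
                 - 0.5 * (x - z)**2 - 0.5 * np.log(2 * np.pi))   # + log p(x | z)
    log_q = (-0.5 * ((z - mu) / sigma)**2
             - np.log(sigma) - 0.5 * np.log(2 * np.pi))          # log q(z)
    a = (1.0 - alpha) * (log_joint - log_q)                      # (1 - alpha) * log w_i
    m = a.max(axis=1, keepdims=True)                             # log-sum-exp stabilizer
    inner = np.log(np.exp(a - m).mean(axis=1)) + m[:, 0]         # log (1/N) sum_i w_i^(1-alpha)
    return inner.mean() / (1.0 - alpha)
```

For this conjugate model the marginal likelihood is available in closed form, log p(x) = log N(x; 0, 2), so one can check numerically that the estimates lower-bound it and that the α = 0 (IWAE) bound tightens as N grows, in line with the kind of behavior the paper's analyses quantify.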
