Paper Title
Neural Estimation of Submodular Functions with Applications to Differentiable Subset Selection
Paper Authors
Paper Abstract
Submodular functions and their variants, through their ability to characterize diversity and coverage, have emerged as a key tool for data selection and summarization. Many recent approaches to learning submodular functions suffer from limited expressiveness. In this work, we propose FLEXSUBNET, a family of flexible neural models for both monotone and non-monotone submodular functions. To fit a latent submodular function from (set, value) observations, FLEXSUBNET applies a concave function to modular functions in a recursive manner. Rather than drawing the concave function from a restricted family, we learn it from data using a highly expressive neural network that implements a differentiable quadrature procedure. Such an expressive neural model for concave functions may be of independent interest. Next, we extend this setup to provide a novel characterization of monotone α-submodular functions, a recently introduced notion of approximately submodular functions. We then use this characterization to design a novel neural model for such functions. Finally, we consider learning submodular set functions under distant supervision in the form of (perimeter-set, high-value-subset) pairs. This yields a novel subset selection method based on an order-invariant yet greedy sampler built around the above neural set functions. Our experiments on synthetic and real data show that FLEXSUBNET outperforms several baselines.
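To make the "concave composed with modular" recipe concrete, below is a minimal PyTorch sketch, not the authors' released code, of the two ingredients the abstract describes: a monotone concave scalar function learned by integrating a non-negative, non-increasing derivative network with a differentiable quadrature rule, and a set function obtained by applying such learned concave functions to learned non-negative modular functions. All class names, architecture choices, and the toy training target are illustrative assumptions rather than details from the paper.

```python
# A minimal PyTorch sketch (illustrative, not the authors' code) of a
# "concave over modular" neural set function fit to (set, value) pairs.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NonIncreasingPositive(nn.Module):
    """g(t) >= 0 and non-increasing in t: a net with non-negative weights and
    monotone activations is non-decreasing in its input, so feeding -t makes
    it non-increasing in t; a final softplus keeps the output non-negative."""

    def __init__(self, hidden: int = 16):
        super().__init__()
        self.w1 = nn.Parameter(torch.randn(hidden, 1))
        self.b1 = nn.Parameter(torch.zeros(hidden))
        self.w2 = nn.Parameter(torch.randn(1, hidden))
        self.b2 = nn.Parameter(torch.zeros(1))

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        h = torch.tanh(F.linear(-t.unsqueeze(-1), F.softplus(self.w1), self.b1))
        return F.softplus(F.linear(h, F.softplus(self.w2), self.b2)).squeeze(-1)


class LearnedConcave(nn.Module):
    """phi(x) = integral_0^x g(t) dt with g >= 0 and non-increasing, so phi is
    non-decreasing and concave with phi(0) = 0. The integral is approximated by
    a fixed-grid trapezoidal rule, keeping phi differentiable end to end (a
    stand-in for the differentiable quadrature procedure named in the abstract)."""

    def __init__(self, n_quad: int = 32):
        super().__init__()
        self.g = NonIncreasingPositive()
        self.n_quad = n_quad

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        grid = torch.linspace(0.0, 1.0, self.n_quad, device=x.device)  # (Q,)
        t = x.unsqueeze(-1) * grid                                     # (..., Q)
        g = self.g(t)
        dt = x.unsqueeze(-1) / (self.n_quad - 1)
        return torch.sum(0.5 * (g[..., 1:] + g[..., :-1]) * dt, dim=-1)


class ConcaveOverModular(nn.Module):
    """F(S) = sum_k phi_k(w_k . 1_S): each term is a concave, non-decreasing
    function of a non-negative modular function, hence monotone submodular,
    and the sum preserves monotone submodularity."""

    def __init__(self, n_items: int, n_terms: int = 3):
        super().__init__()
        self.weights = nn.Parameter(torch.rand(n_terms, n_items))
        self.concaves = nn.ModuleList([LearnedConcave() for _ in range(n_terms)])

    def forward(self, mask: torch.Tensor) -> torch.Tensor:
        # mask: (batch, n_items) with 0/1 entries indicating set membership.
        modular = mask @ F.softplus(self.weights).T          # (batch, n_terms)
        return sum(phi(modular[:, k]) for k, phi in enumerate(self.concaves))


# Fit to toy (set, value) observations by plain regression; the target
# sqrt(|S|) is itself monotone submodular.
model = ConcaveOverModular(n_items=20)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
masks = (torch.rand(256, 20) > 0.5).float()
values = masks.sum(dim=1).sqrt()
for _ in range(300):
    opt.zero_grad()
    F.mse_loss(model(masks), values).backward()
    opt.step()
```

This sketch covers only the base concave-over-modular case; the recursive stacking described in the abstract, as well as the α-submodular and distant-supervision variants, would build further structure on the same components.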