Paper Title
Learning to Learn Kernels with Variational Random Features
Paper Authors
Paper Abstract
In this work, we introduce kernels with random Fourier features in the meta-learning framework to leverage their strong few-shot learning ability. We propose meta variational random features (MetaVRF) to learn adaptive kernels for the base-learner, which is developed in a latent variable model by treating the random feature basis as the latent variable. We formulate the optimization of MetaVRF as a variational inference problem by deriving an evidence lower bound under the meta-learning framework. To incorporate shared knowledge from related tasks, we propose a context inference of the posterior, which is established by an LSTM architecture. The LSTM-based inference network can effectively integrate the context information of previous tasks with task-specific information, generating informative and adaptive features. The learned MetaVRF can produce kernels of high representational power with a relatively low spectral sampling rate and also enables fast adaptation to new tasks. Experimental results on a variety of few-shot regression and classification tasks demonstrate that MetaVRF delivers much better, or at least competitive, performance compared to existing meta-learning alternatives.
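To make the abstract's starting point concrete, here is a minimal sketch of the plain random Fourier feature (RFF) approximation that MetaVRF builds on: spectral frequencies are sampled from the kernel's spectral density (a Gaussian, for the RBF kernel) and the kernel value is approximated by an inner product of cosine features. This shows only the classical fixed-basis construction; the paper's contribution of treating the basis as a latent variable inferred by an LSTM is not implemented here, and all names below are illustrative.

```python
import numpy as np

def rff_features(X, omega, b):
    """Random Fourier feature map: phi(x) = sqrt(2/D) * cos(x @ omega + b)."""
    D = omega.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ omega + b)

rng = np.random.default_rng(0)
d, D = 3, 2000      # input dimension, number of spectral samples (bases)
gamma = 0.5         # target RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2)

# The spectral density of exp(-gamma * ||delta||^2) is N(0, 2 * gamma * I),
# so frequencies are drawn with standard deviation sqrt(2 * gamma).
omega = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

x = rng.normal(size=(1, d))
y = rng.normal(size=(1, d))
k_true = float(np.exp(-gamma * np.sum((x - y) ** 2)))
k_rff = float(rff_features(x, omega, b) @ rff_features(y, omega, b).T)
```

With `D = 2000` samples, `k_rff` closely tracks `k_true`; the abstract's point is that a learned, task-adaptive basis can reach comparable representational power at a much lower spectral sampling rate than such i.i.d. draws.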