Paper Title

Multitask Learning via Shared Features: Algorithms and Hardness

Paper Authors

Konstantina Bairaktari, Guy Blanc, Li-Yang Tan, Jonathan Ullman, Lydia Zakynthinou

Paper Abstract

We investigate the computational efficiency of multitask learning of Boolean functions over the $d$-dimensional hypercube that are related by means of a feature representation of size $k \ll d$ shared across all tasks. We present a polynomial-time multitask learning algorithm for the concept class of halfspaces with margin $\gamma$, which is based on a simultaneous boosting technique and requires only $\textrm{poly}(k/\gamma)$ samples per task and $\textrm{poly}(k\log(d)/\gamma)$ samples in total. In addition, we prove a computational separation: assuming there exists a concept class that cannot be learned in the attribute-efficient model, we can construct another concept class that can be learned in the attribute-efficient model but cannot be multitask learned efficiently. Multitask learning this concept class requires either super-polynomial time or a much larger total number of samples.
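The abstract names the simultaneous boosting technique without spelling it out. For intuition only, here is a minimal, hypothetical NumPy sketch of the general idea: in each boosting round, a single coordinate of the hypercube is chosen as a weak hypothesis for all tasks at once, so the final per-task ensembles jointly depend on only as many coordinates as there are rounds. This is an illustrative toy under our own assumptions (coordinate weak learners, AdaBoost-style reweighting), not the paper's algorithm; `simultaneous_boost` and its parameters are invented names.

```python
import numpy as np

def simultaneous_boost(X_tasks, y_tasks, rounds=50):
    """Toy sketch of boosting T tasks simultaneously (NOT the paper's algorithm).

    X_tasks: list of (n_t, d) arrays with entries in {-1, +1}
    y_tasks: list of (n_t,) label arrays in {-1, +1}
    """
    T = len(X_tasks)
    # Per-task example weights, initialized uniformly as in AdaBoost.
    weights = [np.full(len(y), 1.0 / len(y)) for y in y_tasks]
    ensemble = [[] for _ in range(T)]  # per task: list of (alpha, sign, coord)

    for _ in range(rounds):
        # corr[t, j] = weighted correlation of coordinate j with task t's labels.
        corr = np.stack([w * y @ X for w, y, X in zip(weights, y_tasks, X_tasks)])
        # Pick ONE coordinate with the largest total advantage over all tasks:
        # this is the "shared feature" committed to in this round.
        j = int(np.argmax(np.abs(corr).sum(axis=0)))
        for t in range(T):
            gamma = float(np.clip(corr[t, j], -0.99, 0.99))
            sign = 1.0 if gamma >= 0 else -1.0  # orient the coordinate per task
            alpha = 0.5 * np.log((1 + abs(gamma)) / (1 - abs(gamma)))
            ensemble[t].append((alpha, sign, j))
            # AdaBoost-style multiplicative reweighting on task t's examples.
            margin = y_tasks[t] * sign * X_tasks[t][:, j]
            weights[t] *= np.exp(-alpha * margin)
            weights[t] /= weights[t].sum()

    def predict(t, X):
        # Weighted majority vote of the chosen coordinates for task t.
        score = sum(a * s * X[:, j] for a, s, j in ensemble[t])
        return np.sign(score)

    return predict
```

Because every round commits all tasks to the same coordinate, the union of coordinates used across all tasks is bounded by the number of rounds, loosely mirroring the shared representation of size $k \ll d$ in the abstract.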
