Paper Title
Topological Deep Learning: Going Beyond Graph Data
Paper Authors
Abstract
Topological deep learning is a rapidly growing field that pertains to the development of deep learning models for data supported on topological domains such as simplicial complexes, cell complexes, and hypergraphs, which generalize many domains encountered in scientific computations. In this paper, we present a unifying deep learning framework built upon a richer data structure that includes widely adopted topological domains. Specifically, we first introduce combinatorial complexes, a novel type of topological domain. Combinatorial complexes can be seen as generalizations of graphs that maintain certain desirable properties. Similar to hypergraphs, combinatorial complexes impose no constraints on the set of relations. In addition, combinatorial complexes permit the construction of hierarchical higher-order relations, analogous to those found in simplicial and cell complexes. Thus, combinatorial complexes generalize and combine useful traits of both hypergraphs and cell complexes, which have emerged as two promising abstractions that facilitate the generalization of graph neural networks to topological spaces. Second, building upon combinatorial complexes and their rich combinatorial and algebraic structure, we develop a general class of message-passing combinatorial complex neural networks (CCNNs), focusing primarily on attention-based CCNNs. We characterize permutation and orientation equivariances of CCNNs, and discuss pooling and unpooling operations within CCNNs in detail. Third, we evaluate the performance of CCNNs on tasks related to mesh shape analysis and graph learning. Our experiments demonstrate that CCNNs have competitive performance as compared to state-of-the-art deep learning models specifically tailored to the same tasks. Our findings demonstrate the advantages of incorporating higher-order relations into deep learning models in different applications.
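The combinatorial complexes introduced in the abstract can be pictured as a rank-annotated set system: each cell is a non-empty set of vertices carrying an integer rank, and the rank function must respect inclusion (a cell contained in another cannot have a higher rank). The sketch below is illustrative only; the class name and API are assumptions, not the paper's implementation.

```python
class CombinatorialComplex:
    """Minimal sketch of a combinatorial complex: cells are non-empty
    vertex subsets, each carrying an integer rank. The defining axiom is
    that the rank function is order-preserving under inclusion: if cell
    a is strictly contained in cell b, then rank(a) <= rank(b)."""

    def __init__(self):
        self._ranks = {}  # frozenset of vertices -> rank

    def add_cell(self, vertices, rank):
        cell = frozenset(vertices)
        if not cell:
            raise ValueError("cells must be non-empty")
        # Enforce the order-preserving rank axiom against existing cells.
        for other, r in self._ranks.items():
            if cell < other and rank > r:
                raise ValueError(
                    f"rank {rank} of {set(cell)} exceeds rank {r} "
                    f"of its superset {set(other)}")
            if other < cell and r > rank:
                raise ValueError(
                    f"rank {rank} of {set(cell)} is below rank {r} "
                    f"of its subset {set(other)}")
        self._ranks[cell] = rank

    def skeleton(self, rank):
        """All cells of a given rank."""
        return [c for c, r in self._ranks.items() if r == rank]


# A graph is the special case with only rank-0 cells (vertices) and
# rank-1 cells (edges); higher ranks add hypergraph-style relations
# organized hierarchically, as in simplicial and cell complexes.
cc = CombinatorialComplex()
for v in (1, 2, 3):
    cc.add_cell({v}, rank=0)       # vertices
cc.add_cell({1, 2}, rank=1)        # an edge
cc.add_cell({1, 2, 3}, rank=2)     # a higher-order cell over all three
```

Unlike a simplicial complex, nothing forces the subsets of a cell to be present, and unlike a hypergraph, the ranks impose a hierarchy among relations; this is the sense in which combinatorial complexes combine traits of both.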