Title


Connecting Graph Convolutional Networks and Graph-Regularized PCA

Authors

Lingxiao Zhao, Leman Akoglu

Abstract


The graph convolution operator of the GCN model was originally motivated as a localized first-order approximation of spectral graph convolutions. This work takes a different view, establishing a mathematical connection between graph convolution and graph-regularized PCA (GPCA). Based on this connection, the GCN architecture, formed by stacking graph convolution layers, has a close relationship with stacked GPCA. We empirically demonstrate that the unsupervised embeddings produced by GPCA, paired with a 1- or 2-layer MLP, achieve similar or even better performance than GCN on semi-supervised node classification tasks across five datasets, including the Open Graph Benchmark (https://ogb.stanford.edu/). This suggests that the prowess of GCN is driven by graph-based regularization. In addition, we extend GPCA to the (semi-)supervised setting and show that it is equivalent to GPCA on a graph extended with "ghost" edges between nodes of the same label. Finally, we capitalize on the discovered relationship to design an effective initialization strategy based on stacked GPCA, enabling GCN to converge faster and achieve robust performance at a large number of layers. Notably, the proposed initialization is general-purpose and applies to other GNNs.
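The connection the abstract describes can be illustrated numerically: a GCN layer propagates features through the symmetrically normalized adjacency matrix, while GPCA smooths features by solving a Laplacian-regularized least-squares problem. The following is a minimal sketch, not the paper's implementation; the toy graph, features, and the regularization weight `alpha` are illustrative choices, and the GPCA step uses the standard closed-form smoothing operator (I + αL)⁻¹ from graph-regularized PCA formulations.

```python
import numpy as np

# Toy graph: 4 nodes on a path 0-1-2-3 (illustrative, not from the paper).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.array([[1.0, 0.0],
              [0.9, 0.1],
              [0.1, 0.9],
              [0.0, 1.0]])  # node feature matrix

# GCN-style propagation: add self-loops, then symmetrically normalize.
A_hat = A + np.eye(4)
d = A_hat.sum(axis=1)
A_norm = A_hat / np.sqrt(np.outer(d, d))  # D^{-1/2} (A + I) D^{-1/2}
gcn_smoothed = A_norm @ X

# GPCA-style smoothing: solve (I + alpha * L) Z = X,
# where L is the combinatorial graph Laplacian.
L = np.diag(A.sum(axis=1)) - A
alpha = 1.0
gpca_smoothed = np.linalg.solve(np.eye(4) + alpha * L, X)

# Both operators pull each node's features toward its neighbors',
# i.e. they act as graph-based regularizers of the feature matrix.
```

Both outputs are low-pass-filtered versions of `X`: the GPCA solve shrinks the Laplacian smoothness penalty tr(ZᵀLZ) relative to the raw features, which is the shared regularization effect the paper attributes GCN's performance to.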
