November 17 Paper Recommendation
Paper:
Deeper Insights into Graph Convolutional Networks for Semi-Supervised Learning
Authors: Qimai Li, Zhichao Han, Xiao-Ming Wu
Why we recommend it:
Starting from the GCN convolution formula, the paper shows through a few simple transformations that graph convolution is a special form of Laplacian smoothing, and proves that applying the smoothing enough times in succession makes the features of all vertices in a connected graph converge to the same values. From this it infers that deep GCNs make node features progressively less distinguishable (oversmoothing). The paper then discusses several other issues with GCNs, e.g., shallow GCNs are limited by their local receptive field and therefore need more training labels, and training a GCN requires a validation set of a certain size for early stopping; it also proposes some possible improvements. The connection to Laplacian smoothing is sketched below.
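For reference, the core identity behind this claim can be written out as follows (notation as in the paper: $\tilde{A} = A + I$ is the adjacency matrix with self-loops added, $\tilde{D}$ its degree matrix, and $\tilde{L} = \tilde{D} - \tilde{A}$; this is a reconstruction of the paper's derivation, not a verbatim quote):

```latex
% Laplacian smoothing of node features X with coefficient \gamma:
%   \hat{X} = X - \gamma \tilde{D}^{-1} \tilde{L} X
% Setting \gamma = 1 and using the symmetrically normalized Laplacian gives
\hat{X} = X - \tilde{D}^{-1/2} \tilde{L} \tilde{D}^{-1/2} X
        = \tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2} X ,
% which is exactly the (pre-activation) GCN layer:
H^{(l+1)} = \sigma\left( \tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2} H^{(l)} W^{(l)} \right)
```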
Abstract
Many interesting problems in machine learning are being revisited with new deep learning tools. For graph-based semi-supervised learning, a recent important development is graph convolutional networks (GCNs), which nicely integrate local vertex features and graph topology in the convolutional layers. Although the GCN model compares favorably with other state-of-the-art methods, its mechanisms are not clear and it still requires a considerable amount of labeled data for validation and model selection.
In this paper, we develop deeper insights into the GCN model and address its fundamental limits. First, we show that the graph convolution of the GCN model is actually a special form of Laplacian smoothing, which is the key reason why GCNs work, but it also brings potential concerns of oversmoothing with many convolutional layers. Second, to overcome the limits of the GCN model with shallow architectures, we propose both co-training and self-training approaches to train GCNs. Our approaches significantly improve GCNs in learning with very few labels, and exempt them from requiring additional labels for validation. Extensive experiments on benchmarks have verified our theory and proposals.
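To see the oversmoothing phenomenon numerically, here is a minimal numpy sketch (our illustration, not the authors' code): it repeatedly applies the random-walk form of the smoothing operator to random features on a small, arbitrarily chosen connected graph and prints how the per-feature range over nodes shrinks toward zero.

```python
# Minimal sketch of oversmoothing: repeated Laplacian smoothing on a
# connected graph drives every node's features to the same values.
# The toy graph and feature sizes below are arbitrary, for illustration only.
import numpy as np

# A small connected graph: a 5-node path plus one extra chord.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 2)]
n = 5
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

A_tilde = A + np.eye(n)             # adjacency with self-loops
d = A_tilde.sum(axis=1)
P = np.diag(1.0 / d) @ A_tilde      # random-walk smoothing operator D^-1 * A
# (Under GCN's symmetric normalization D^-1/2 A D^-1/2 the behavior is the
# same, except features converge proportionally to sqrt(degree) instead.)

X = np.random.randn(n, 3)           # random initial node features
for k in [1, 2, 5, 20, 100]:
    Xk = np.linalg.matrix_power(P, k) @ X
    spread = Xk.max(axis=0) - Xk.min(axis=0)   # per-feature range over nodes
    print(f"after {k:3d} smoothing steps, feature range = {spread.round(4)}")
```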