
Self-Supervised Learning of Graph Neural Networks: A Unified Review.

Yaochen Xie, Zhao Xu, Jingtun Zhang, Zhengyang Wang, Shuiwang Ji.   

Abstract

Deep models trained in supervised mode have achieved remarkable success on a variety of tasks. When labeled samples are limited, self-supervised learning (SSL) is emerging as a new paradigm for making use of large amounts of unlabeled samples. SSL has achieved promising performance on natural language and image learning tasks. Recently, there has been a trend to extend such success to graph data using graph neural networks (GNNs). In this survey, we provide a unified review of different ways of training GNNs using SSL. Specifically, we categorize SSL methods into contrastive and predictive models. For each category, we provide a unified framework and describe how the methods differ in each component under that framework. Our unified treatment of SSL methods for GNNs sheds light on the similarities and differences of various methods, setting the stage for developing new methods and algorithms. We also summarize different SSL settings and the corresponding datasets used in each setting. To facilitate methodological development and empirical comparison, we develop a standardized testbed for SSL in GNNs, including implementations of common baseline methods, datasets, and evaluation metrics.
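To make the contrastive category concrete: a common instantiation (not necessarily the exact formulation used in this survey's framework) trains a GNN encoder to produce embeddings of two augmented views of the same nodes or graphs, and applies an InfoNCE-style loss so that matching views score higher than mismatched ones. A minimal NumPy sketch, with all names hypothetical:

```python
import numpy as np

def info_nce_loss(z1, z2, tau=0.5):
    """InfoNCE-style contrastive loss between two views.

    z1, z2: (n, d) embeddings of the same n nodes (or graphs) under two
    different augmentations; matching rows are treated as positive pairs,
    all other rows as negatives.
    """
    # L2-normalize so dot products become cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau  # (n, n) temperature-scaled similarity matrix
    # Row-wise softmax cross-entropy with the diagonal as the target:
    # each z1[i] should be most similar to its own counterpart z2[i].
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    return float(np.mean(logsumexp - np.diag(sim)))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
aligned = info_nce_loss(z, z + 0.01 * rng.normal(size=(8, 16)))
mismatched = info_nce_loss(z, rng.normal(size=(8, 16)))
print(aligned, mismatched)  # nearly identical views yield a lower loss
```

In practice `z1` and `z2` would come from a GNN encoder applied to two stochastically augmented graphs (e.g., edge dropping or feature masking); the predictive category the abstract mentions instead derives supervision from the data itself, such as reconstructing masked attributes.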

Year:  2022        PMID: 35476575     DOI: 10.1109/TPAMI.2022.3170559

Source DB:  PubMed          Journal:  IEEE Trans Pattern Anal Mach Intell        ISSN: 0162-8828            Impact factor:   6.226


  1 in total

1.  Bringing Your Own View: Graph Contrastive Learning without Prefabricated Data Augmentations.

Authors:  Yuning You; Tianlong Chen; Zhangyang Wang; Yang Shen
Journal:  Proc Int Conf Web Search Data Min       Date:  2022-02-15
