PubMed for Handhelds
Title: SP-GNN: Learning structure and position information from graphs.
Authors: Chen Y, You J, He J, Lin Y, Peng Y, Wu C, Zhu Y.
Journal: Neural Netw; 2023 Apr; 161:505-514.
PubMed ID: 36805265.
Abstract: Graph neural networks (GNNs) are powerful models for learning from graph data. However, existing GNNs may have limited expressive power, particularly in capturing adequate structural and positional information from input graphs. Structural properties and node position information are unique to graph-structured data, yet few GNNs are capable of capturing them. This paper proposes Structure- and Position-aware Graph Neural Networks (SP-GNN), a new class of GNNs offering generic and expressive modeling of graph data. SP-GNN enhances the expressive power of GNN architectures by incorporating a near-isometric proximity-aware position encoder and a scalable structure encoder. Further, given a GNN learning task, SP-GNN can analyze the positional and structural awareness of that task using the embeddings computed by the two encoders. The resulting awareness scores can guide strategies for fusing the extracted positional and structural information with raw features, improving GNN performance on downstream tasks. We conduct extensive experiments with SP-GNN on various graph datasets and observe significant improvements in classification performance over existing GNN models.
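
To make the fusion idea in the abstract concrete, the sketch below shows one plausible way to combine precomputed positional and structural embeddings with raw node features before message passing. It is a minimal illustration only: the class names, the mean-aggregation layer, and the scalar "awareness"-style weights are hypothetical assumptions for this sketch, not the authors' SP-GNN encoders or their published implementation.

# Minimal sketch (PyTorch) of fusing positional/structural embeddings with raw
# node features before a GNN layer. All names and the weighting scheme are
# illustrative assumptions, not the paper's implementation.
import torch
import torch.nn as nn

class MeanAggLayer(nn.Module):
    """One mean-aggregation GNN layer over a dense adjacency matrix."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # Average neighbor features, then apply a linear transform and ReLU.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        return torch.relu(self.lin(adj @ x / deg))

class FusionGNN(nn.Module):
    """Concatenates raw features with weighted positional/structural embeddings."""
    def __init__(self, feat_dim, pos_dim, struct_dim, hidden_dim, num_classes,
                 pos_weight=1.0, struct_weight=1.0):
        super().__init__()
        # Scalar weights stand in (hypothetically) for task-level awareness
        # scores that would scale each embedding before fusion.
        self.pos_weight = pos_weight
        self.struct_weight = struct_weight
        self.gnn = MeanAggLayer(feat_dim + pos_dim + struct_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, num_classes)

    def forward(self, x, pos_emb, struct_emb, adj):
        fused = torch.cat(
            [x, self.pos_weight * pos_emb, self.struct_weight * struct_emb],
            dim=-1)
        return self.out(self.gnn(fused, adj))

# Usage on a toy 4-node path graph with random features and embeddings.
adj = torch.tensor([[0., 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]])
model = FusionGNN(feat_dim=8, pos_dim=4, struct_dim=4,
                  hidden_dim=16, num_classes=3)
logits = model(torch.randn(4, 8), torch.randn(4, 4), torch.randn(4, 4), adj)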