
KNN-GNN: A powerful graph neural network enhanced by aggregating K-nearest neighbors in common subspace

Bibliographic Details
Published in: Expert Systems with Applications, 2024-11, Vol. 253, Article 124217
Main Authors: Li, Longjie; Yang, Wenxin; Bai, Shenshen; Ma, Zhixin
Format: Article
Language: English
Description
Summary: It has been proven that graph neural networks (GNNs) are effective for a variety of graph learning-based applications. Typical GNNs iteratively aggregate messages from immediate neighbors under the homophily assumption. However, many real-world networks are heterophilous, and the ability of such GNNs on them is limited. Recently, some GNN models have been proposed to handle networks with heterophily via key designs such as aggregating higher-order neighbors and combining intermediate representations. However, "noise" transmitted from different-order neighbors is then injected into the node representations. In this paper, we propose a new GNN model, called KNN-GNN, to effectively perform node classification on networks with various homophily levels. The main idea of KNN-GNN is to learn a comprehensive and accurate representation for each node by integrating not only the local information from its neighborhood but also the non-local information held by similar nodes scattered across the network. Specifically, the local information of a node is generated from itself and its 1-hop neighbors. Then, all nodes are projected into a common subspace, where similar nodes are expected to be close to each other. The non-local information of a node is gathered by aggregating its K-nearest neighbors found in this common subspace. We evaluate the performance of KNN-GNN on both real and synthetic datasets, including networks with diverse homophily levels. The results demonstrate that KNN-GNN outperforms state-of-the-art baselines. Moreover, ablation experiments show that the core designs of KNN-GNN play a critical role in node representation learning.
•A new graph neural network model is proposed for both homophilous and heterophilous graphs.
•The proposed model learns node representations from both local and non-local information.
•Local information is produced from the neighborhood.
•Non-local information is gathered from the K-nearest neighbors searched in a common subspace.
ISSN: 0957-4174
eISSN: 1873-6793
DOI: 10.1016/j.eswa.2024.124217
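
To make the mechanism described in the abstract concrete, the sketch below illustrates, in PyTorch, one plausible reading of the core idea: local information is aggregated from a node and its 1-hop neighbors, all nodes are projected into a common subspace, and non-local information is aggregated from the K nearest neighbors found in that subspace. This is not the authors' implementation; the layer sizes, cosine similarity as the subspace distance, mean aggregation, and fusion by concatenation are all assumptions made for illustration.

# Minimal illustrative sketch of the idea described in the abstract (not the
# paper's code): combine local 1-hop aggregation with non-local aggregation over
# K-nearest neighbors searched in a learned common subspace. Cosine similarity,
# mean aggregation, and the fusion scheme are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F


class KNNGNNLayerSketch(nn.Module):
    def __init__(self, in_dim: int, hid_dim: int, out_dim: int, k: int = 5):
        super().__init__()
        self.k = k
        self.local_lin = nn.Linear(in_dim, hid_dim)      # transforms local (1-hop) messages
        self.subspace_lin = nn.Linear(in_dim, hid_dim)   # projects nodes into the common subspace
        self.nonlocal_lin = nn.Linear(in_dim, hid_dim)   # transforms non-local (KNN) messages
        self.out_lin = nn.Linear(2 * hid_dim, out_dim)   # fuses local and non-local information

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x:   [N, in_dim] node features
        # adj: [N, N] dense adjacency matrix (1 where an edge exists)

        # Local information: mean over the node itself and its 1-hop neighbors.
        adj_with_self = adj + torch.eye(adj.size(0), device=adj.device)
        deg = adj_with_self.sum(dim=1, keepdim=True).clamp(min=1.0)
        local = self.local_lin(adj_with_self @ x / deg)

        # Common subspace: project features and measure pairwise cosine similarity.
        z = F.normalize(self.subspace_lin(x), dim=1)
        sim = z @ z.t()
        sim.fill_diagonal_(-float("inf"))                # exclude a node from its own neighbor set

        # Non-local information: mean over the K most similar nodes in the subspace.
        knn_idx = sim.topk(self.k, dim=1).indices        # [N, K]
        nonlocal_msg = self.nonlocal_lin(x[knn_idx].mean(dim=1))

        # Fuse local and non-local representations.
        return self.out_lin(torch.cat([local, nonlocal_msg], dim=1))


if __name__ == "__main__":
    n, d = 8, 16
    x = torch.randn(n, d)
    adj = (torch.rand(n, n) > 0.7).float()
    adj = ((adj + adj.t()) > 0).float()                  # make the toy graph undirected
    layer = KNNGNNLayerSketch(in_dim=d, hid_dim=32, out_dim=4, k=3)
    print(layer(x, adj).shape)                           # torch.Size([8, 4])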