
Purity Skeleton Dynamic Hypergraph Neural Network


Bibliographic Details
Published in: Neurocomputing (Amsterdam), 2024-12, Vol. 610, Article 128539, p. 128539
Main Authors: Wang, Yuge, Yang, Xibei, Sun, Qiguo, Qian, Yuhua, Guo, Qihang
Format: Article
Language: English
Description
Summary: Recently, in the field of Hypergraph Neural Networks (HGNNs), the effectiveness of dynamic hypergraph construction, which aims to reduce structural noise within the hypergraph through embeddings, has been validated. However, existing dynamic construction methods fail to account for the information lost from the hypergraph during dynamic updates, which undermines hypergraph quality. Moreover, dynamic hypergraphs are constructed from graphs; several key nodes play a crucial role in the graph but are overlooked in the hypergraph. In this paper, we propose a Purity Skeleton Dynamic Hypergraph Neural Network (PS-DHGNN) to address the above issues. Firstly, we leverage a purity skeleton method to dynamically construct hypergraphs from fused feature and topology embeddings, which effectively reduces structural noise and prevents information loss. Secondly, we employ an incremental training strategy that trains in batches ordered by node importance, so that the key nodes forming the skeleton of the hypergraph remain highly valued. In addition, we utilize a novel loss function to learn structural information shared between the hypergraph and the graph. Extensive experiments on node classification and clustering tasks demonstrate that PS-DHGNN outperforms state-of-the-art methods. Notably, PS-DHGNN also performs well on real-world traffic flow datasets, which is highly meaningful in practice.
Highlights:
•Propose a novel dynamic hypergraph construction method to reduce structural noise.
•Introduce incremental learning to achieve batch training from key nodes to ordinary nodes.
•Use a contrastive loss and an objective loss to guide hypergraph and embedding generation.
•Achieve excellent performance on traffic flow datasets, which is of practical significance.
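The record describes the method only at abstract level; the Python snippet below is a minimal, illustrative sketch (not the authors' implementation) of two of the ideas named above: building a dynamic hypergraph by grouping each node with its nearest neighbours in a fused feature/topology embedding space, and ordering training batches by a simple node-importance score. The linear fusion weight alpha, the k-nearest-neighbour hyperedge rule, and the degree-based importance measure are assumptions made for illustration; the paper's actual purity skeleton criterion and loss terms are not reproduced here.

import numpy as np

def build_knn_hypergraph(feat_emb, topo_emb, k=4, alpha=0.5):
    """Build an incidence matrix H (n_nodes x n_hyperedges): each node spawns one
    hyperedge containing itself and its k nearest neighbours in the fused space."""
    z = alpha * feat_emb + (1.0 - alpha) * topo_emb      # assumed linear fusion of the two embeddings
    d2 = ((z[:, None, :] - z[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    n = z.shape[0]
    H = np.zeros((n, n))
    for i in range(n):
        members = np.argsort(d2[i])[: k + 1]             # the node itself plus its k nearest neighbours
        H[members, i] = 1.0                               # hyperedge i groups these nodes
    return H

def importance_ordered_batches(adj, batch_size):
    """Yield node-index batches from most to least 'important' (here: graph degree,
    a stand-in for the key-node / skeleton notion in the abstract)."""
    order = np.argsort(-adj.sum(axis=1))                  # high-degree nodes first
    for s in range(0, len(order), batch_size):
        yield order[s:s + batch_size]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 12, 8
    feat = rng.normal(size=(n, d))                        # toy node feature embeddings
    topo = rng.normal(size=(n, d))                        # toy topology embeddings
    adj = (rng.random((n, n)) < 0.3).astype(float)        # toy graph adjacency
    H = build_knn_hypergraph(feat, topo, k=3)
    print("incidence matrix shape:", H.shape)
    for b, batch in enumerate(importance_ordered_batches(adj, batch_size=4)):
        print("batch", b, "->", batch)

In a dynamic setting such as the one the abstract describes, a construction routine like build_knn_hypergraph would be re-run as the embeddings evolve during training, so the hyperedges track the current representation; the specific purity criterion and the contrastive/objective losses are detailed in the article itself.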
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2024.128539