Fast Online Node Labeling for Very Large Graphs

Bibliographic Details
Published in: arXiv.org, 2023-05
Main Authors: Zhou, Baojian; Sun, Yifan; Babanezhad, Reza
Format: Article
Language: English
Description
Summary: This paper studies the online node classification problem under a transductive learning setting. Current methods either invert a graph kernel matrix, with \(\mathcal{O}(n^3)\) runtime and \(\mathcal{O}(n^2)\) space complexity, or sample a large volume of random spanning trees, and are thus difficult to scale to large graphs. In this work, we propose an improvement based on the \textit{online relaxation} technique introduced by a series of works (Rakhlin et al., 2012; Rakhlin and Sridharan, 2015; 2017). We first prove an effective regret of \(\mathcal{O}(\sqrt{n^{1+\gamma}})\) when suitable parameterized graph kernels are chosen, then propose an approximate algorithm, FastONL, enjoying \(\mathcal{O}(k\sqrt{n^{1+\gamma}})\) regret based on this relaxation. The key ingredient of FastONL is a \textit{generalized local push} method that effectively approximates inverse matrix columns and applies to a series of popular kernels. Furthermore, the per-prediction cost is \(\mathcal{O}(\text{vol}(\mathcal{S})\log 1/\epsilon)\), depending only on a local neighborhood of the graph, with linear memory cost. Experiments show that our scalable method enjoys a better tradeoff between local and global consistency.
ISSN: 2331-8422
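
The generalized local push described in the summary extends the classical approximate personalized PageRank push of Andersen, Chung, and Lang (2006), which approximates a single column of a kernel inverse by pushing residual mass along edges and touching only a local neighborhood of the source node. Below is a minimal Python sketch of that classical push, not the authors' FastONL implementation; the function name approx_ppr_column, the non-lazy push variant, and the parameter defaults are illustrative assumptions. Its running time is bounded by the volume of the set of touched nodes, matching the locality claim in the summary.

```python
from collections import deque, defaultdict

def approx_ppr_column(adj, source, alpha=0.15, eps=1e-6):
    """Approximate one column of the personalized PageRank matrix,
    i.e., of alpha * (I - (1 - alpha) * A * D^{-1})^{-1}, using the
    local push method of Andersen, Chung, and Lang (2006).

    adj: dict mapping every node to its list of neighbors (undirected).
    Returns a sparse dict p; each residual satisfies r[u] < eps * deg(u)
    at termination, which bounds the degree-normalized error by eps.
    """
    p = defaultdict(float)   # accumulated estimate of the PPR column
    r = defaultdict(float)   # residual mass not yet pushed
    r[source] = 1.0
    queue = deque([source])  # nodes whose residual exceeds the threshold
    in_queue = {source}

    while queue:
        u = queue.popleft()
        in_queue.discard(u)
        deg_u = len(adj[u])
        # Skip dangling nodes and nodes pushed below threshold already.
        if deg_u == 0 or r[u] < eps * deg_u:
            continue
        mass = r[u]
        p[u] += alpha * mass  # keep an alpha fraction at u
        r[u] = 0.0
        # Spread the remaining (1 - alpha) mass evenly to neighbors
        # (non-lazy variant: no mass is held back on u itself).
        share = (1.0 - alpha) * mass / deg_u
        for v in adj[u]:
            r[v] += share
            if v not in in_queue and r[v] >= eps * len(adj[v]):
                queue.append(v)
                in_queue.add(v)
    return dict(p)

if __name__ == "__main__":
    # Example: a 4-node cycle graph 0-1-2-3-0.
    adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
    print(approx_ppr_column(adj, source=0, alpha=0.15, eps=1e-4))
```

The paper's generalized push replaces this specific PPR kernel with a family of parameterized graph kernels, but the control flow, pushing residual mass until every residual falls below a degree-scaled threshold, is the same locality mechanism.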