Open-source implementation for our TNNLS 2023 paper.

- We propose a graph gradual pruning framework, named CGP, to reduce the training and inference computing costs of GNN models while preserving their accuracy.
- We comprehensively sparsify the elements of GNNs, including graph structures, the node feature dimension, and model parameters, to significantly improve the efficiency of GNN models.
- Experimental results on various GNN models and datasets consistently validate the effectiveness and efficiency of our proposed CGP.
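To give a flavor of what gradual pruning means in general, here is a minimal, generic sketch of magnitude-based pruning with a cubic sparsity ramp (a common gradual-pruning schedule). This is an illustration only, not the CGP implementation: the function names and the schedule shape are assumptions, and CGP additionally prunes graph structure and feature dimensions, which this sketch does not cover.

```python
import numpy as np

def sparsity_schedule(step, total_steps, final_sparsity):
    """Hypothetical cubic gradual-sparsity schedule: the target
    sparsity ramps smoothly from 0 to final_sparsity."""
    frac = min(step / total_steps, 1.0)
    return final_sparsity * (1.0 - (1.0 - frac) ** 3)

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries so that roughly the
    given fraction of entries is zero; returns pruned copy and mask."""
    k = int(round(sparsity * weights.size))
    if k == 0:
        return weights.copy(), np.ones_like(weights, dtype=bool)
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask

# Toy weight matrix, gradually pruned to 90% sparsity over 100 steps.
rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64))
for step in range(0, 101, 25):
    s = sparsity_schedule(step, 100, final_sparsity=0.9)
    W_pruned, mask = magnitude_prune(W, s)
```

In a real training loop the mask would be applied between optimizer steps so the remaining weights can keep adapting as sparsity grows.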
Our proposed CGP is implemented in Python 3.7; the required libraries are listed in requirements.txt.
Once the requirements are installed, run:

```shell
sh xx.sh
```
All datasets used in the paper can be downloaded through PyTorch Geometric (PyG).