Simple Multi-resolution Gated GNN
Pasa L.; Navarin N.; Sperduti A.
2021
Abstract
Most Graph Neural Networks (GNNs) proposed in the literature tend to add complexity (and non-linearity) to the model. In this paper, we follow the opposite direction by proposing a simple linear multi-resolution architecture that implements a multi-head gating mechanism. We assessed the performance of the proposed architecture on node classification tasks. To perform a fair comparison and present significant results, we re-implemented the competing methods from the literature and ran the experimental evaluation under two different experimental settings with different model selection procedures. The proposed convolution, dubbed Simple Multi-resolution Gated GNN, exhibits state-of-the-art predictive performance on the considered benchmark datasets in terms of accuracy. In addition, it is considerably more efficient to compute than GAT, a well-known multi-head GNN from the literature.
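The abstract does not spell out the layer itself, so the following is only a minimal sketch of one plausible reading of "linear multi-resolution convolution with multi-head gating": propagate node features over several hops without non-linearities, project each resolution linearly, and let each head compute gating coefficients that mix the resolutions. The class name, hyper-parameters, use of a dense normalized adjacency, and the averaging of heads are assumptions for illustration, not the authors' released implementation.

import torch
import torch.nn as nn

class SimpleMultiResGatedConv(nn.Module):
    """Hypothetical layer: linear projections of multi-hop propagations,
    mixed per node by gating coefficients computed independently per head."""
    def __init__(self, in_dim, out_dim, num_hops=3, num_heads=4):
        super().__init__()
        self.num_hops = num_hops
        # one linear projection per resolution (hop), shared across heads
        self.proj = nn.ModuleList([nn.Linear(in_dim, out_dim, bias=False)
                                   for _ in range(num_hops + 1)])
        # one gate per head: maps node features to a score for each resolution
        self.gate = nn.ModuleList([nn.Linear(in_dim, num_hops + 1)
                                   for _ in range(num_heads)])

    def forward(self, x, adj_norm):
        # multi-resolution propagation: x, A x, A^2 x, ... (all linear)
        hops, h = [x], x
        for _ in range(self.num_hops):
            h = adj_norm @ h
            hops.append(h)
        # per-resolution linear projections, stacked as (N, R+1, out_dim)
        reps = torch.stack([p(h_r) for p, h_r in zip(self.proj, hops)], dim=1)
        outs = []
        for gate in self.gate:
            # per-node softmax over resolutions for this head: (N, R+1, 1)
            alpha = torch.softmax(gate(x), dim=-1).unsqueeze(-1)
            outs.append((alpha * reps).sum(dim=1))  # (N, out_dim)
        # average the heads; concatenation would be an equally plausible choice
        return torch.stack(outs, dim=0).mean(dim=0)

Because every operation on the features is linear, the multi-hop propagations can be precomputed once per graph, which is consistent with the efficiency advantage over GAT claimed in the abstract; the gating is the only input-dependent mixing step.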