
Deep recurrent graph neural networks

Pasa L.; Navarin N.; Sperduti A.
2020

Abstract

Graph Neural Networks (GNNs) show good results in classification and regression on graphs; nevertheless, most GNN models use a limited depth, i.e., they are composed of only a few stacked graph convolutional layers. One reason for this is that the number of parameters grows with the number of GNN layers. In this paper, we show how using a recurrent graph convolution layer can help in building deeper GNNs, without increasing the complexity of the training phase, while improving predictive performance. We also analyze how the depth of the model influences the final result.
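The core idea described in the abstract is weight sharing: the same graph convolution is applied repeatedly, so the parameter count stays constant as depth grows. Below is a minimal NumPy sketch of that idea, assuming a standard GCN-style propagation rule with symmetric normalization and a tanh nonlinearity; the function name, the propagation rule, and the nonlinearity are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def recurrent_gcn(A, X, W, depth):
    """Apply the SAME graph convolution `depth` times (weight sharing).

    A: (n, n) adjacency matrix, X: (n, d) node features,
    W: (d, d) shared weight matrix reused at every step, so the
    number of parameters does not depend on `depth`.
    """
    # GCN-style normalized adjacency with self-loops (an assumption here).
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt

    H = X
    for _ in range(depth):
        # Same W at every iteration: deeper propagation, no extra parameters.
        H = np.tanh(A_norm @ H @ W)
    return H

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path graph on 3 nodes
X = rng.standard_normal((3, 4))
W = rng.standard_normal((4, 4))  # W must be square so H keeps its width across steps
H = recurrent_gcn(A, X, W, depth=8)
print(H.shape)  # (3, 4)
```

Because `W` is square and shared, increasing `depth` enlarges the receptive field of each node (more hops of neighborhood information) without adding any trainable parameters, which is the property the abstract attributes to the recurrent layer.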
ESANN 2020 - Proceedings, 28th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning
Files in this product:

ES2020-107.pdf

Open access

Description: main article
Type: Published (publisher's version)
License: open access
Size: 1.63 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3366866
Citations
  • Scopus: 0