Blog posts

2023

Oversmoothing in GNNs: why does it happen so fast? (and do popular solutions such as residual connections or normalization really work?)

18 minute read

Oversmoothing is a well-known problem in message-passing GNNs, but how come it happens when a GNN has only 2-4 layers? In this post, I am going to discuss our ICLR’23 paper “A Non-Asymptotic Analysis of Oversmoothing in Graph Neural Networks”, which provides the first quantitative finite-depth theory of oversmoothing in GNNs and explains why oversmoothing occurs even at shallow depths.
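
To make the phenomenon concrete before getting to the theory, here is a minimal sketch of what oversmoothing looks like in practice. This is illustrative NumPy code, not code from the paper: the random graph, the features, and the "diversity" measure below are all assumptions chosen for the demo. Repeated GCN-style propagation drives node features toward one another, and the diversity measure collapses within a handful of layers.

```python
# Minimal oversmoothing demo (illustrative, not from the paper):
# repeatedly applying GCN-style propagation X <- A_hat @ X makes node
# features increasingly similar, and a simple diversity measure shrinks
# after only a few layers.

import numpy as np

rng = np.random.default_rng(0)

# A small random graph (illustrative choice), stored as a symmetric adjacency matrix.
n = 20
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T

# Symmetrically normalized adjacency with self-loops, as in GCN-style propagation.
A_tilde = A + np.eye(n)
d = A_tilde.sum(axis=1)
A_hat = A_tilde / np.sqrt(np.outer(d, d))

# Random initial node features.
X = rng.standard_normal((n, 8))

def diversity(X):
    """Mean squared distance of node features from their average:
    one simple proxy for how 'non-smooth' the features are."""
    return float(((X - X.mean(axis=0)) ** 2).sum(axis=1).mean())

# Apply propagation layer by layer (no weights or nonlinearities,
# to isolate the smoothing effect of message passing itself).
for layer in range(8):
    X = A_hat @ X
    print(f"after layer {layer + 1}: diversity = {diversity(X):.4f}")
```

Running this, the printed diversity drops sharply over the first few propagation steps, which is exactly the kind of fast, finite-depth collapse the post is about.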