Sitemap

A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.

Pages

Posts

Oversmoothing in GNNs: why does it happen so fast? (and do popular solutions such as residual connections or normalization really work?)

18 minute read

Published:

Oversmoothing is a well-known problem in message-passing GNNs, but why does it happen when a GNN has only 2-4 layers? In this post, I am going to discuss our ICLR’23 paper “A Non-Asymptotic Analysis of Oversmoothing in Graph Neural Networks”, which provides the first quantitative finite-depth theory of oversmoothing in GNNs and explains why oversmoothing occurs at such shallow depths.

Portfolio

Publications

Talks

Teaching