Join the Learning on Graphs and Geometry Reading Group: https://hannes-stark.com/logag-reading-group
Paper “Graph Neural Networks as Gradient Flows”: https://arxiv.org/abs/2206.10991
Abstract: Dynamical systems minimizing an energy are ubiquitous in geometry and physics. We propose a novel framework for GNNs where we parametrize (and *learn*) an energy functional and then take the GNN equations to be the gradient flow of this energy. This approach allows us to analyse the GNN evolution from a multi-particle perspective as learning attractive and repulsive forces in feature space via the positive and negative eigenvalues of a symmetric “channel-mixing” matrix. We conduct spectral analysis of the solutions and provide a better understanding of the role of channel-mixing in (residual) graph convolutional models and of its ability to steer the diffusion away from over-smoothing. We perform thorough ablation studies corroborating our theory and show competitive performance of simple models on homophilic and heterophilic datasets.
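For a concrete picture of the idea, here is a minimal PyTorch sketch of one gradient-flow layer in the spirit of the paper: node features evolve by an explicit-Euler step of dX/dt = ĀXW − XΩ, where W and Ω are learned symmetric channel-mixing matrices whose positive/negative eigenvalues act as attractive/repulsive forces. All names here (GradientFlowLayer, tau, A_hat) are illustrative assumptions, not the authors' implementation.

```python
import torch

class GradientFlowLayer(torch.nn.Module):
    """Sketch of a gradient-flow GNN layer: X <- X + tau * (A_hat @ X @ W - X @ Omega)."""

    def __init__(self, channels: int, tau: float = 0.1):
        super().__init__()
        # Unconstrained parameters; symmetry is enforced in forward().
        self.W_raw = torch.nn.Parameter(0.01 * torch.randn(channels, channels))
        self.Omega_raw = torch.nn.Parameter(0.01 * torch.randn(channels, channels))
        self.tau = tau  # Euler step size of the discretized flow

    def forward(self, X: torch.Tensor, A_hat: torch.Tensor) -> torch.Tensor:
        # Symmetrize so the update is the gradient flow of a quadratic energy;
        # the eigenvalues of W then split into attractive (+) and repulsive (-) modes.
        W = 0.5 * (self.W_raw + self.W_raw.T)
        Omega = 0.5 * (self.Omega_raw + self.Omega_raw.T)
        # One explicit-Euler step of dX/dt = A_hat X W - X Omega.
        return X + self.tau * (A_hat @ X @ W - X @ Omega)

if __name__ == "__main__":
    # Toy undirected graph with symmetric normalization D^{-1/2} A D^{-1/2}.
    n, d = 5, 8
    A = (torch.rand(n, n) < 0.4).float()
    A = ((A + A.T) > 0).float()
    deg = A.sum(dim=1).clamp(min=1.0)
    A_hat = A / torch.sqrt(deg[:, None] * deg[None, :])

    X = torch.randn(n, d)
    layer = GradientFlowLayer(d)
    for _ in range(10):  # iterating one layer with shared weights mimics the continuous flow
        X = layer(X, A_hat)
    print(X.shape)
```

Sharing the same weights across iterations (rather than stacking independent layers) is what makes the discrete network a faithful discretization of a single continuous energy-minimizing flow.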
Authors: Francesco Di Giovanni, James Rowbottom, Benjamin P. Chamberlain, Thomas Markovich, Michael M. Bronstein
Twitter Hannes: https://twitter.com/HannesStaerk
Twitter Dominique: https://twitter.com/dom_beaini
Twitter Valence Discovery: https://twitter.com/valence_ai
Reading Group Slack: https://join.slack.com/t/logag/shared_invite/zt-u0mbo1ec-zElmvd1oSCXGjXvxLSokvg
~
Chapters
00:00 - Intro
00:42 - Outline of Contributions and a Dual Message
08:14 - A Rough Picture: Low-pass vs. High-pass filtering
11:29 - LFD Dynamics: Numerical Example
18:09 - Are Graph Convolutional Models Doomed?
24:08 - Gradient Flows
38:52 - Comparison with Some Continuous GNN Models
41:52 - GNNs as Gradient Flows: Ablation Studies and Experiments
54:29 - Limitations and Future Directions
56:25 - Q&A