Valence Labs is a research engine within Recursion committed to advancing the frontier of AI in drug discovery. Learn more about our open roles: https://www.valencelabs.com/careers
Join the Learning on Graphs and Geometry Reading Group on Slack: https://join.slack.com/t/logag/shared_invite/zt-1ifx2ocpf-5kTNi9VYb8c5ghrTiQ0tBA
Abstract: We introduce Clifford Group Equivariant Neural Networks: a novel approach for constructing E(n)-equivariant networks. We identify and study the Clifford group, a subgroup inside the Clifford algebra, whose definition we slightly adjust to achieve several favorable properties. Primarily, the group’s action forms an orthogonal automorphism that extends beyond the typical vector space to the entire Clifford algebra while respecting the multivector grading. This leads to several non-equivalent subrepresentations corresponding to the multivector decomposition. Furthermore, we prove that the action respects not just the vector space structure of the Clifford algebra but also its multiplicative structure, i.e., the geometric product. These findings imply that every polynomial in multivectors, including their grade projections, constitutes an equivariant map with respect to the Clifford group, allowing us to parameterize equivariant neural network layers. Notable advantages are that these layers operate directly on a vector basis and elegantly generalize to any dimension. We demonstrate, notably from a single core implementation, state-of-the-art performance on several distinct tasks, including a three-dimensional n-body experiment, a four-dimensional Lorentz-equivariant high-energy physics experiment, and a five-dimensional convex hull experiment.
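To make the abstract's central claim concrete, below is a minimal numerical sketch in plain NumPy. It is an illustration only, not the paper's "single core implementation": it works in the two-dimensional Clifford algebra Cl(2,0), and the helper names (geometric_product, reverse, rotor, act) are chosen here for readability. It checks numerically that the rotor action commutes with the geometric product and respects the grading, which is the property that lets geometric-product polynomials serve as equivariant layers.

# Minimal sketch (not the authors' implementation): Clifford group action in Cl(2,0).
# Multivectors are coefficient arrays over the basis [1, e1, e2, e1e2].
import numpy as np

def geometric_product(a, b):
    """Geometric product in Cl(2,0), basis order [1, e1, e2, e12]."""
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return np.array([
        a0*b0 + a1*b1 + a2*b2 - a3*b3,   # scalar part
        a0*b1 + a1*b0 - a2*b3 + a3*b2,   # e1 part
        a0*b2 + a1*b3 + a2*b0 - a3*b1,   # e2 part
        a0*b3 + a1*b2 - a2*b1 + a3*b0,   # e12 part
    ])

def reverse(a):
    """Reversion: flips the sign of the bivector component."""
    return a * np.array([1.0, 1.0, 1.0, -1.0])

def rotor(theta):
    """Unit rotor R = exp(-theta/2 e12), representing a rotation by theta."""
    return np.array([np.cos(theta / 2), 0.0, 0.0, -np.sin(theta / 2)])

def act(R, x):
    """Action of a rotor on a multivector: x -> R x R~."""
    return geometric_product(geometric_product(R, x), reverse(R))

rng = np.random.default_rng(0)
R = rotor(0.7)
x, y = rng.normal(size=4), rng.normal(size=4)

# Acting first and multiplying after equals multiplying first and acting after,
# so any geometric-product polynomial of multivectors is an equivariant map.
lhs = geometric_product(act(R, x), act(R, y))
rhs = act(R, geometric_product(x, y))
assert np.allclose(lhs, rhs)

# The action also respects the multivector grading: the grade-1 (vector) part
# of act(R, e1) is just the rotated vector, with no scalar or bivector leakage.
v = np.array([0.0, 1.0, 0.0, 0.0])   # the vector e1
print(act(R, v))                     # ~ [0, cos(0.7), sin(0.7), 0]

The same check carries over to the higher-dimensional algebras used in the talk's experiments; only the multiplication table changes, which is why a single implementation generalizes across dimensions.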
Speaker: David Ruhe - https://davidruhe.github.io/
Twitter Hannes: https://twitter.com/HannesStaerk
Twitter Dominique: https://twitter.com/dom_beaini
Twitter datamol.io: https://twitter.com/datamol_io
~
Chapters
00:00 - Intro
07:14 - The Clifford Algebra
21:55 - Clifford Group Equivariant Networks: Theoretical Results
29:29 - Methodology: Linear Layers
30:44 - Methodology: Geometric Product Layers
35:39 - Experiments
46:18 - Final Remarks
47:38 - Q+A