Equivariant Models | Open Catalyst Intro Series | Ep. 6


Episode 6: In this episode, we explore ML models with equivariant internal representations. These representations are fascinating because they change predictably when the input changes: if the input atoms are rotated, the model’s internal representation “rotates” with them. We’ll discuss how a special set of basis functions called spherical harmonics is used in equivariant models to represent atom neighborhoods, and what makes them so mathematically interesting.
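To make the “rotate the input and the features rotate” idea concrete, here is a minimal sketch (not code from the series; it assumes only NumPy and SciPy) that checks equivariance for the simplest non-trivial degree, l = 1, where the real spherical harmonics of a direction are just its Cartesian components up to normalization, so the Wigner D-matrix acting on the features is the 3x3 rotation matrix itself:

import numpy as np
from scipy.spatial.transform import Rotation

def sh_l1(x):
    """Degree-1 real spherical harmonics of a 3D vector: its unit direction
    (up to ordering and constant factors, Y_1 ~ (x, y, z) / |x|)."""
    return x / np.linalg.norm(x)

rng = np.random.default_rng(0)
x = rng.normal(size=3)               # a random atom-neighbor direction
R = Rotation.random().as_matrix()    # a random 3D rotation matrix

lhs = sh_l1(R @ x)   # rotate the input, then compute the features
rhs = R @ sh_l1(x)   # compute the features, then rotate them (D(R) = R for l = 1)
print(np.allclose(lhs, rhs))         # True: the features "rotate" with the input

For higher degrees l, the same relation holds with the 3x3 rotation matrix replaced by the corresponding (2l+1)x(2l+1) Wigner D-matrix; libraries such as e3nn provide these building blocks for general l.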

This video series is aimed at machine learning and AI researchers interested in gaining a better understanding of how to explore machine learning problems in chemistry and materials science.

#opencatalyst #ai4science #climatechange

Additional materials:
A Gentle Introduction to Graph Neural Networks: https://distill.pub/2021/gnn-intro/
A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems: https://arxiv.org/pdf/2312.07511.pdf

Videos on Fourier Transforms:
But what is the Fourier Transform? A visual introduction
Fourier Analysis, Steve Brunton

Some equivariant model papers:
3D steerable CNNs: Learning rotationally equivariant features in volumetric data: https://arxiv.org/abs/1807.02547
Tensor Field Networks: Rotation- and translation-equivariant neural networks for 3D point clouds: https://arxiv.org/abs/1802.08219
Geometric and physical quantities improve E(3) equivariant message passing: https://arxiv.org/abs/2110.02905
Equivariant message passing for the prediction of tensorial properties and molecular spectra: https://arxiv.org/abs/2102.03150
E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials: https://arxiv.org/abs/2101.03164
MACE: Higher order equivariant message passing neural networks for fast and accurate force fields: https://arxiv.org/abs/2206.07697
Reducing SO(3) convolutions to SO(2) for efficient equivariant GNNs: https://arxiv.org/abs/2302.03655
EquiformerV2: Improved Equivariant Transformer for Scaling to Higher-Degree Representations: https://arxiv.org/abs/2306.12059
