AI Dynamics

Global AI News Aggregator

Convolution Equivariance vs Self-Attention Permutation Properties

Convolution is equivariant to translations.
Self-attention is equivariant to permutations.
They both have a role to play.
Conv is efficient for signals with strong local correlations and motifs that can appear anywhere.
SelfAtt is good for "object-based" representations where …

View original post on X: @ylecun
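The two symmetry claims in the quoted post can be checked numerically. The sketch below (an illustration, not from the original post) verifies that a circular 1-D convolution commutes with shifting its input, and that plain dot-product self-attention without positional encodings commutes with permuting its input tokens:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Convolution is equivariant to translations ---
# Circular 1-D convolution: shifting the input by s positions
# shifts the output by the same s positions.
def circular_conv(x, k):
    n = len(x)
    return np.array([sum(k[j] * x[(i - j) % n] for j in range(len(k)))
                     for i in range(n)])

x = rng.normal(size=8)
k = rng.normal(size=3)
shift = 2
lhs = circular_conv(np.roll(x, shift), k)   # conv(shift(x))
rhs = np.roll(circular_conv(x, k), shift)   # shift(conv(x))
assert np.allclose(lhs, rhs)

# --- Self-attention is equivariant to permutations ---
# Minimal self-attention with Q = K = V = X (no learned projections,
# no positional encoding): permuting the input rows (tokens) permutes
# the output rows the same way.
def self_attention(X):
    scores = X @ X.T / np.sqrt(X.shape[1])
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)    # row-wise softmax
    return w @ X

X = rng.normal(size=(5, 4))              # 5 tokens, dimension 4
perm = rng.permutation(5)
lhs = self_attention(X[perm])            # attend(permute(X))
rhs = self_attention(X)[perm]            # permute(attend(X))
assert np.allclose(lhs, rhs)
```

The permutation check holds because the attention scores of the permuted input are `P S Pᵀ`, the row-wise softmax commutes with that reordering, and the final `Pᵀ P` cancels; adding positional encodings would break it, which is exactly why transformers need them for sequence data.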
