AI Dynamics

Global AI News Aggregator

Transformers Remain Fundamentally Unchanged Despite Recent Extensions

There has been some incredible work extending transformers to other tasks (e.g., ViT) and improving their efficiency (e.g., FlashAttention), but deep down it's the same transformer, with some minor changes here and there.
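The unchanged core the post refers to is scaled dot-product attention. A minimal NumPy sketch (the function name and shapes here are illustrative, not from the post): ViT changes the *inputs* (image patches instead of word tokens) and FlashAttention changes *how* this is computed (tiled, memory-efficient kernels), but the math below is the same as in the original transformer.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Core transformer attention: softmax(q k^T / sqrt(d_k)) v.

    q, k: (seq_len, d_k); v: (seq_len, d_v).
    """
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                  # (seq_len, seq_len)
    # Numerically stable row-wise softmax.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                               # (seq_len, d_v)
```

Efficiency work like FlashAttention produces the same output as this function; it only reorders the computation to avoid materializing the full `(seq_len, seq_len)` score matrix.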

→ View original post on X — @jeande_d
