AI Dynamics

Global AI News Aggregator

Five Years Between Transformer Attention and FlashAttention Innovation

"5 years between Self-Attention Is All You Need and FlashAttention"
quite incredible stat, gives a pause

→ View original post on X (@karpathy)
