AI Dynamics

Global AI News Aggregator

Human Text Generation Differs from GPT Sequential Token Processing

When humans generate text (articles, posts, papers, etc.) they spend very different amounts of time per token, create intermediate work, make edits, etc. Very different from GPTs that just go chunk chunk chunk. But there seem to be enough puzzle pieces out and about to remedy this.
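To make the contrast concrete, here is a minimal toy sketch in Python of the two generation loops: a GPT-style single left-to-right pass with roughly constant compute per token, versus a draft-then-revise loop closer to how humans work, where compute per position varies with how many editing passes touch it. Everything here (VOCAB, next_token, the revision heuristic) is a hypothetical stand-in for illustration, not any real model or API.

```python
import random

# Hypothetical stand-ins: a toy vocabulary and a fake "model" that
# returns a next token. A real system would call an actual LM here.
VOCAB = ["the", "cat", "sat", "on", "the", "mat", "."]

def next_token(prefix: list[str]) -> str:
    """Stand-in for an autoregressive model's next-token sampler."""
    return random.choice(VOCAB)

def autoregressive_generate(n_tokens: int) -> list[str]:
    """GPT-style decoding: append one token at a time, left to right.
    Fixed cost per token, and no token is ever revisited."""
    tokens: list[str] = []
    for _ in range(n_tokens):
        tokens.append(next_token(tokens))
    return tokens

def draft_and_revise(n_tokens: int, n_passes: int = 3) -> list[str]:
    """Human-like loop (sketch): produce a rough draft, then make
    several editing passes that may rewrite any position. Compute per
    position now varies with how often the editor touches it."""
    draft = autoregressive_generate(n_tokens)
    for _ in range(n_passes):
        # Revisit a random position and reconsider it given its prefix;
        # a real system would score candidate edits with the model.
        i = random.randrange(n_tokens)
        draft[i] = next_token(draft[:i])
    return draft

if __name__ == "__main__":
    random.seed(0)
    print("one-pass:", " ".join(autoregressive_generate(6)))
    print("revised :", " ".join(draft_and_revise(6)))
```

The asymmetry the post points at is visible in the loop shapes: the first function commits to every token as it goes, while the second produces intermediate work (the draft) and spends extra, unevenly distributed compute on edits.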

→ View original post on X — @karpathy
