Human Text Generation Differs from GPT Sequential Token Processing

When humans generate text (articles, posts, papers, etc.), they spend very different amounts of time per token, create intermediate work, make edits, and so on. Very different from GPTs, which just go chunk chunk chunk, one token after another at the same pace, never going back. But there seem to be enough puzzle pieces out there to remedy this.
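As a toy sketch of the contrast (all names here are hypothetical stand-ins, not any real model API): the GPT-style loop only ever appends, spending the same effort on every token, while a human-style loop produces a draft and then makes revision passes that can touch any position.

```python
def gpt_style(prompt, n_tokens, next_token):
    """Append one token per step; earlier output is never revisited."""
    out = list(prompt)
    for _ in range(n_tokens):
        out.append(next_token(out))  # same amount of work every step
    return out

def human_style(prompt, n_tokens, next_token, revise, passes=2):
    """Produce a rough draft, then spend extra, uneven effort editing it."""
    draft = gpt_style(prompt, n_tokens, next_token)  # intermediate work
    for _ in range(passes):
        draft = revise(draft)  # edits may touch any position, not just the end
    return draft

# Illustrative stand-ins: a deterministic "model" and an "editor" that
# rewrites one earlier token in place.
next_token = lambda ctx: f"t{len(ctx)}"
revise = lambda d: [w.upper() if w == "t1" else w for w in d]

print(gpt_style(["a"], 3, next_token))            # ['a', 't1', 't2', 't3']
print(human_style(["a"], 3, next_token, revise))  # ['a', 'T1', 't2', 't3']
```

The point of the sketch is only the shape of the loop: the second function has a place where variable, targeted effort (and backtracking) can happen, and the first does not.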