AI Dynamics

Global AI News Aggregator

LLMs as Cognitive Engines Orchestrating Compute Infrastructure via Text

Good post. A lot of interest at the moment in wiring up LLMs to a wider compute infrastructure via text I/O (e.g. calculator, Python interpreter, Google search, scratchpads, databases, …). The LLM becomes the "cognitive engine" orchestrating resources, its thought stack trace in raw text

→ View original post on X — @karpathy
