AI Dynamics

Global AI News Aggregator

Marcus vs LeCun: Key Disagreements on LLMs and AGI

Lost track of what @garymarcus and @ylecun agree & disagree about?

Agree:
- LLMs are an off-ramp to AGI
- Human extinction is not likely
- Common sense is critical
- World models are critical

Disagree:
- GFM: Near-term risk of LLMs is serious
- GFM: Explicit symbolic reasoning will be

→ View original post on X — @garymarcus
