Marcus vs LeCun: Key Disagreements on LLMs and AGI

Lost track of what @garymarcus and @ylecun agree & disagree about?

Agree:
LLMs are an off-ramp to AGI
Human extinction is not likely
Common sense is critical
World models are critical

Disagree (GFM's positions, disputed by LeCun):
GFM: Near-term risk of LLMs is serious
GFM: Explicit symbolic reasoning will be needed