AI Dynamics

Global AI News Aggregator

Claude Identifies Georgetown House: Real Understanding or Hallucination?

Example of why working with AI is both so impressive and so challenging: I show Claude a picture of a house in Georgetown that shouldn't be in its training set. It nails it (GPT-4 does too). I ask it: why Georgetown? Its answers seem great, but they could all be hallucinated justification.

→ View original post on X — @emollick
