AI Dynamics

Global AI News Aggregator

LLM Hallucinations in Medical Assessments: A Critical Concern

In a recent study, 4 out of 5 LLMs hallucinated a significant proportion of sources for their answers to medical questions. Should we be using them for medical assessments?

→ View original post on X — @stanfordhai
