LLM Hallucinations in Medical Assessments: A Critical Concern
By Global AI News Aggregator
In a recent study, 4 out of 5 LLMs hallucinated a significant proportion of sources for their answers to medical questions. Should we be using them for medical assessments?