Or do you mean maximum truthseeking in the people thinking about and building AGI? If so, I'd ask whether you have very carefully and neutrally evaluated exactly how much expected outcome shift could be produced that way – which is what truthseeking looks like, in humans.
Truthseeking in AGI Development: Evaluating Expected Outcome Shifts