Red Team Your Own AI Beliefs: Empirical Testing for Critics

I think a very useful exercise for AI critics would be to "red team" their own beliefs: spend serious time and effort (perhaps with help) trying to get AI to do the things they claim it can't. If they are right, that is very valuable benchmarking information.