Examples vs. Presentations: Spaced Repetition in LLM Training

Is it the number of distinct examples that matters, or the number of times each example is presented to the model during training? For instance, humans use spaced repetition to memorize facts, but there is no equivalent technique in LLM training, where the typical regime samples examples uniformly at random.
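To make the contrast concrete, here is a minimal sketch of what a spaced-repetition-style data sampler could look like, as an alternative to uniform random sampling. This is a hypothetical Leitner-box scheduler, not an existing training API: examples the model handles poorly are presented more often, while mastered examples are spaced out.

```python
import random

class LeitnerSampler:
    """Hypothetical Leitner-style scheduler for training examples.

    Examples the model gets wrong are shown more often; examples it
    gets right are promoted to later boxes and sampled less frequently.
    """

    def __init__(self, examples, num_boxes=3, seed=0):
        self.rng = random.Random(seed)
        self.num_boxes = num_boxes
        # All examples start in box 0, the box reviewed most often.
        self.boxes = {ex: 0 for ex in examples}

    def sample(self):
        # Weight each example by how early its box is:
        # box 0 gets weight num_boxes, the last box gets weight 1.
        examples = list(self.boxes)
        weights = [self.num_boxes - self.boxes[ex] for ex in examples]
        return self.rng.choices(examples, weights=weights, k=1)[0]

    def update(self, example, correct):
        # Promote on success (longer spacing); demote to box 0 on failure.
        if correct:
            self.boxes[example] = min(self.boxes[example] + 1,
                                      self.num_boxes - 1)
        else:
            self.boxes[example] = 0
```

Under uniform random sampling every example has equal weight at every step; under this scheduler, the presentation count per example adapts to a per-example signal (here, a correct/incorrect flag standing in for, say, per-example loss).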