The basic motivation is that vocabularies are unbounded, but embedding tables can only be a fixed size. If your training data matches your test data well, this isn't a big deal for most applications: a word that is rare at training time will also be rare at test time, so collapsing it into a shared unknown token costs little.
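A minimal sketch of the standard workaround: cap the vocabulary at a fixed size, give the most frequent tokens their own embedding ids, and send everything else to a shared `<unk>` id. The function names and the size cap here are illustrative, not from any particular library.

```python
from collections import Counter

def build_vocab(tokens, max_size):
    """Build a fixed-size vocabulary from a token stream.

    The most frequent (max_size - 1) tokens get their own ids;
    everything else shares the <unk> id (0), so the embedding
    table never needs more than max_size rows.
    """
    counts = Counter(tokens)
    vocab = {"<unk>": 0}
    for token, _ in counts.most_common(max_size - 1):
        vocab[token] = len(vocab)
    return vocab

def lookup(vocab, token):
    # Unseen or truncated-away tokens map to the shared <unk> row.
    return vocab.get(token, vocab["<unk>"])

corpus = "the cat sat on the mat the cat ran".split()
vocab = build_vocab(corpus, max_size=4)
lookup(vocab, "the")    # frequent token keeps its own id
lookup(vocab, "zebra")  # out-of-vocabulary token falls back to <unk> (0)
```

Because rare training tokens and unseen test tokens both land on the `<unk>` row, the table stays a fixed size no matter how the vocabulary grows.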
Unbounded Vocabularies and Fixed-Size Embedding Tables Explained