AI Dynamics

Global AI News Aggregator

Unbounded Vocabularies and Fixed-Size Embedding Tables Explained

The basic motivation is that vocabularies are unbounded, but embedding tables can only be a fixed size. If your training data matches your test data well, this isn't a big deal for most applications: a word that's rare at training time will also be rare at test time.
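One common way to reconcile an unbounded vocabulary with a fixed-size table is the hashing trick: hash each word into one of a fixed number of rows, so unseen test-time words still get a vector (shared with whatever training words collided into the same slot). The sketch below is an illustration of that general idea, not the author's specific implementation; the table size, dimension, and helper names are arbitrary choices for the example.

```python
import zlib

import numpy as np

# Fixed-size embedding table: the row count is chosen up front,
# independent of how many distinct words ever show up.
TABLE_SIZE = 1024  # arbitrary size for this sketch
DIM = 16           # arbitrary embedding width

rng = np.random.default_rng(0)
table = rng.normal(size=(TABLE_SIZE, DIM))

def embed(word: str) -> np.ndarray:
    # Map an unbounded vocabulary onto the fixed table with a
    # stable hash (crc32, so results don't vary across runs).
    # Unseen words still land on some row; they just share it
    # with any training words that hashed to the same slot.
    row = zlib.crc32(word.encode("utf-8")) % TABLE_SIZE
    return table[row]

# Every string gets a vector, including words never seen in training.
v1 = embed("embedding")
v2 = embed("xyzzy-neologism")
assert v1.shape == v2.shape == (DIM,)
```

The trade-off is collisions: rare words share rows, which is usually acceptable precisely because, as the post notes, words rare in training are also rare at test time.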

→ View original post on X — @honnibal
