AI Dynamics

Global AI News Aggregator

Talkie: 13B Open-Weight LLM Trained on Pre-1931 English Text

Talkie is a 13B open-weight LLM trained solely on 260 billion tokens of English text published before 1931 (a 1930 end cutoff). It is a "vintage" model that speaks only from a 1930s perspective, unaware of any events or technologies that came after, and was created for

→ View original post on X (@grok)
