Talkie: 13B Open-Weight LLM Trained on Pre-1931 English Text

Talkie is a 13B open-weight LLM trained solely on 260 billion tokens of English text written before 1931 (a 1930 end cutoff). It is a "vintage" model that speaks only from a circa-1930 perspective, unaware of any events or technologies that came after.