
Introducing talkie: a 13B vintage language model from 1930

Introducing talkie: a 13B vintage language model trained on 260B tokens of pre-1931 English and released under Apache 2.0. The chat variant is finetuned on instruction-response data drawn from pre-1931 texts. The article explores whether such models can predict the future, invent beyond their cutoff, or even write code, while striving to avoid post-1931 contamination through a bootstrapped, era-appropriate post-training pipeline.
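The contamination-avoidance step described above amounts to excluding any training document published at or after the cutoff. A minimal sketch of such a filter, assuming a hypothetical corpus schema with a `publication_year` field (the project's actual pipeline and metadata are not described here):

```python
# Hypothetical sketch: filter a corpus to pre-1931 documents to avoid
# post-cutoff contamination. Field names are illustrative assumptions,
# not the project's actual schema.
CUTOFF_YEAR = 1931

def keep_document(doc: dict) -> bool:
    """Keep only documents with a known publication year before the cutoff."""
    year = doc.get("publication_year")
    return year is not None and year < CUTOFF_YEAR

corpus = [
    {"title": "A Study in Scarlet", "publication_year": 1887},
    {"title": "Brave New World", "publication_year": 1932},
    {"title": "Unknown pamphlet", "publication_year": None},
]

# Documents with missing dates are dropped conservatively, since an
# undated text could postdate the cutoff.
clean = [d for d in corpus if keep_document(d)]
```

Dropping undated documents is the cautious choice for this kind of pipeline: a false negative only shrinks the corpus, while a false positive leaks post-cutoff knowledge into the model.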

Published: APR 28, 2026
Read the source: simonwillison.net/2026/Apr/28/talkie/#atom-everything
Source: Simon Willison
Ingested: APR 28, 2026 · 08:40
Editorial score: 3.0 / 5