A couple of weeks ago, I wrote about [GPT-2](http://aiweirdness.com/post/182824715257/gpt-2-it-learned-on-the-internet), a text-generating neural network whose huge size and ability to track long-range context mean that it can generate text with an impressive degree of coherence. So impressive, in fact, that its programmers at OpenAI have only released a smaller, less-capable version of the model.
![It takes a bot to know one?](/content/images/size/w795/image/fetch/w_1200-c_limit-f_jpg-q_auto:good-fl_progressive:steep/https-3A-2F-2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com-2Fpublic-2Fimages-2Fa2664a84-6f90-421e-bbcc-f24594d84951_1354x916.png)