Now that so many of us are spending so much time in our own homes, the thought
of being stuck in a room is very much on our minds. If you’ve ever done an
escape room, you know that you can pay to be stuck in a room - one you can
only leave by solving a series of puzzles.
I’ve done several experiments with a text-generating neural network called
GPT-2 [https://aiweirdness.com/post/185085792997/gpt-2-it-cant-resist-a-list].
Trained at great expense by OpenAI (to the tune of tens of thousands of
dollars worth of computing power), GPT-2 learned to imitate all kinds of text
from the internet, and it can be surprisingly good at naming things. Thanks to
a Colab notebook implementation by Max Woolf
[https://github.com/minimaxir/gpt-2-simple], I can fine-tune it on a specific
kind of text - in this case, escape rooms.
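
For anyone curious what that workflow looks like in practice, Max Woolf's
notebook is built on his gpt-2-simple library, and a minimal fine-tuning run
looks roughly like the sketch below. The training file name and the step and
sample counts are placeholder choices of mine, not anything from the notebook
itself.

```python
# A minimal sketch of fine-tuning GPT-2 with gpt-2-simple
# (pip install gpt-2-simple). "escape_rooms.txt" is a hypothetical
# plain-text file of training examples.
import gpt_2_simple as gpt2

# Download the smallest (124M-parameter) pretrained GPT-2 model
gpt2.download_gpt2(model_name="124M")

# Fine-tune on the custom dataset for a modest number of steps
sess = gpt2.start_tf_sess()
gpt2.finetune(sess,
              dataset="escape_rooms.txt",
              model_name="124M",
              steps=500)

# Sample a few outputs from the fine-tuned model
gpt2.generate(sess, length=100, temperature=0.9, nsamples=5)
```

Raising or lowering the temperature parameter makes the samples weirder or
more conservative, which is where much of the fun comes from.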