I remember when my hometown got one of these giant wooden playgrounds. It must
have been in the early 90s and a kid could get lost for hours in there.
Or injured or full of splinters or chromated copper arsenate I guess, which is
why you don't see
This month I'm beginning 2022 as the first Futurist in Residence at the
Smithsonian Arts and Industries Building.
It's weird to think of myself as a futurist. I write a lot about the algorithms
we're calling artificial intelligence (AI), but rather than deal with
When you think about it, Christmas can get pretty weird.
There's the classic Christmas story of the Bible, and then there are all these
extra entities that aren't in the book but which become somehow part of
Christmas. And some of them are quite unsettling. There
Few could have predicted that the must-have toy of 1998 would be an owl-like
bilingual hamster doll with infrared sensors, or that in 1975 kids would be
begging their parents for a toy that is literally a single rock in a cardboard
box.
But could AI have predicted it? Could
When I was a kid I looked forward to opening advent calendar doors in December,
although the pictures behind the doors were pretty forgettable. A bell. A
snowflake. If you were lucky, a squirrel.
So I thought I'd see if I could generate something a bit more interesting,
One strange thing I've noticed about otherwise reasonably competent
text-generating neural nets is how their lists tend to go off the rails.
I noticed this first with GPT-2
[https://aiweirdness.com/post/185085792997/gpt-2-it-cant-resist-a-list]. But it
turns out GPT-3 is no exception.
Here's the largest model,
One of the fun things about working with a giant text-generating model with
general internet training is that when it finishes responding to one of your
prompts, it'll sometimes continue with a prompt of its own. (This can also be
one of the NOT fun things, if its
I'd been vaguely aware of pigeons until I read my friend Rosemary Mosco's book
A Pocket Guide to Pigeon Watching
[https://www.workman.com/products/a-pocket-guide-to-pigeon-watching].
Now I'm in love.
It started with the back cover, where there's a pigeon trying so
"What would it take to teach a machine to behave ethically?" A recent paper
[https://arxiv.org/abs/2110.07574] approached this question by collecting a
dataset that they called "Commonsense Norm Bank", from sources like advice
columns and internet forums, and then training a machine
I've amused myself before
[https://www.aiweirdness.com/this-is-the-openai-api-it-makes-spookily-20-06-11/]
by getting GPT-3 to change one kind of text into another. With just a few
style-change examples, I got it to change the Winnie the Pooh theme song
[https://kids.niehs.nih.gov/games/songs/movies/winnie-the-pooh-theme/index.htm]