Transformers have been such an explosive advance because they had all of the internet to train on.

All of that training data was pent up, ready to catch fire when the right spark came, and it gave a massive one-time boost.

It's tempting to extrapolate from that boost to a runaway effect, but that might not be the case!

Now we've run out of fresh, original material to train on, and the benefits of "simply scale the models more" are starting to peter out too.

We're faced with a possible plateau where it will take lots of steady, patient progress to get considerably farther.

That means there's a lot more time for society and the industry to catch up and figure out how to use this stuff!
