LLMs compress nearly all of humanity's background context into a teensy-weeny little hyperobject package.

· Bits and Bobs 12/2/24

They have effectively infinite background context, a marvel of lossy compression.

The answers to innumerable questions are encoded in that little hyperobject.

They just need the right question to get the answer out.

The most important thing now is: "What is the right question?"

The power of using LLMs comes from asking the right question.
