· Bits and Bobs 1/12/26
  • LLMs are gap-fillers, and they will implicitly fill every gap you leave with the most average content.
    • So it's your job to inject the entropy: give them non-average gaps to fill (see the sketch after this list).
    • If you ask it for a joke, you'll get one of the same ten hyper-bland ones.
    • If you ask it for a joke about the pope, an orange, and Richard Feynman, you'll get something novel.
      • Strictly speaking, you'll still get the most average answer to that request, but since the request is novel, the most average answer to it is novel too.
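
A minimal sketch of the entropy-injection idea in Python. Everything here (the `ELEMENTS` list, the `entropic_prompt` helper) is made up for illustration; the resulting string would go to whatever LLM client you actually use:

```python
import random

# Hypothetical pool of unrelated concrete elements; the caller, not the
# model, is the source of the entropy.
ELEMENTS = [
    "the pope", "an orange", "Richard Feynman", "a broken umbrella",
    "a tax audit", "a lighthouse keeper", "a 1987 fax machine",
    "a chess grandmaster", "a jar of pickles", "a weather balloon",
]

def entropic_prompt(task: str = "Tell me a joke", k: int = 3) -> str:
    """Build a prompt that forces the model to bridge k random, unrelated elements."""
    picks = random.sample(ELEMENTS, k)
    return f"{task} involving {', '.join(picks[:-1])}, and {picks[-1]}."

if __name__ == "__main__":
    # The bland gap: the model fills it with one of its ten stock jokes.
    print("bland:    Tell me a joke.")
    # The non-average gap: the model still gives its most-average completion,
    # but to a request it has effectively never seen.
    print("entropic:", entropic_prompt())
```

The design point is that the randomness lives in the caller, not the model: the model keeps doing its most-average thing, just over a prompt that is itself non-average.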