LLMs append the token that is most coherent with what's already in the context.

· Bits and Bobs 1/6/26
  • LLMs append the token that is most coherent with what's already in the context.
    • So if it makes an error, it will tend to repeat it,
    • because the error is now part of the context.
    • The most coherent token to append silently assumes that the error is right.
    • The more errors it makes, the more deeply ingrained they become.
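
The feedback loop above can be sketched with a toy model (illustrative only, not a real LLM): the "model" here just picks whichever candidate token co-occurs best with the tokens already in context, and the `prefs` co-occurrence weights are made up for the example.

```python
# Toy sketch of autoregressive error reinforcement.
# Assumption: "coherence" is modeled as a hand-written co-occurrence table.

def most_coherent(context, candidates, prefs):
    # Score each candidate by how strongly it co-occurs with the context so far.
    def score(c):
        return sum(prefs.get((t, c), 0) for t in context)
    return max(candidates, key=score)

prefs = {
    ("2+2=", "4"): 2, ("2+2=", "5"): 1,  # "4" is the better continuation...
    ("4", "4"): 5, ("5", "5"): 5,        # ...but each token reinforces itself.
}
candidates = ["4", "5"]

# Run 1: clean context -> picks the right token and keeps picking it.
ctx = ["2+2="]
for _ in range(3):
    ctx.append(most_coherent(ctx, candidates, prefs))
# ctx is now ["2+2=", "4", "4", "4"]

# Run 2: one early error ("5") lands in the context...
ctx_err = ["2+2=", "5"]
for _ in range(3):
    ctx_err.append(most_coherent(ctx_err, candidates, prefs))
# ...and every later step stays coherent with the error:
# ctx_err is now ["2+2=", "5", "5", "5", "5"]
```

Once the wrong token is in the context, its self-reinforcing weight outscores the correct answer at every subsequent step, which is the "deeply ingrained" effect described above.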