When an LLM goes off the rails, it gets into a self-poisoning spiral.

· Bits and Bobs 9/22/25
  • When an LLM goes off the rails, it gets into a self-poisoning spiral.
    • It poisons its own context and gets increasingly deluded.
      • Each turn, it puts things into its context that make future iterations even more confused.
    • An auto-catalytic process.
    • A human needs to be in the loop to keep it centered and grounded.
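The auto-catalytic loop above can be sketched with a toy simulation. This is purely illustrative: the multiplicative `amplification` model, the numbers, and the idea of a human "reset" every few turns are all assumptions chosen to make the dynamic visible, not measurements of any real model.

```python
def simulate(turns, amplification=1.3, baseline=0.05, human_every=None):
    """Toy model of context self-poisoning.

    Each turn's output lands back in the context, so confusion
    compounds multiplicatively (auto-catalytic). A human check every
    `human_every` turns re-grounds the conversation to `baseline`.
    All parameters are illustrative assumptions.
    """
    confusion = baseline  # small initial error already in the context
    trace = []
    for t in range(1, turns + 1):
        confusion *= amplification  # self-poisoning feedback step
        if human_every and t % human_every == 0:
            # Human-in-the-loop correction: cap confusion at baseline.
            confusion = min(confusion, baseline)
        trace.append(confusion)
    return trace

unattended = simulate(20)                # spiral runs unchecked
supervised = simulate(20, human_every=5) # periodic human grounding
```

With these toy numbers, the unattended run blows up geometrically while the supervised run stays bounded, which is the whole point of keeping a human in the loop.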