Why is LLM training convergent?

· Bits and Bobs 8/25/25
  • Why is LLM training convergent?
    • It feels like it should be divergent, diffusing through an unfathomably vast hyper-dimensional space.
    • But our intuitions for hyper-dimensional spaces are often wrong.
    • Hyper-dimensional spaces are interconnected in surprising and weird ways.
    • Wormholes that teleport from one region to another.
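
One concrete way intuition fails in high dimensions is concentration of measure: two random directions in 3-D can easily be strongly aligned, but in 10,000-D almost every pair is nearly orthogonal. A minimal sketch in plain Python (function names are my own, not from the post):

```python
import math
import random

def cosine(u, v):
    """Cosine similarity between two vectors of equal length."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def max_abs_cosine(dim, trials=100, seed=0):
    """Largest |cosine| seen between random Gaussian direction pairs."""
    rng = random.Random(seed)
    worst = 0.0
    for _ in range(trials):
        u = [rng.gauss(0, 1) for _ in range(dim)]
        v = [rng.gauss(0, 1) for _ in range(dim)]
        worst = max(worst, abs(cosine(u, v)))
    return worst

# In 3-D, some pairs out of 100 are strongly aligned;
# in 10,000-D, every pair is close to orthogonal.
print(f"3-D:      max |cos| = {max_abs_cosine(3):.3f}")
print(f"10,000-D: max |cos| = {max_abs_cosine(10_000):.3f}")
```

The same geometry is one reason gradient descent in a billion-parameter space behaves less "divergently" than a 3-D picture suggests: there is always room, in some direction, to move without interference.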