With variation, some systems converge and some diverge.
- If they diverge, you get a rat's nest.
- The variation compounds until the systems become mutually inscrutable.
- If they converge, you get a boring, over-saturated middle.
- LLMs converge their output toward the middle of the distribution.
- So the middle of the output distribution becomes over-saturated.
- That pushes differentiation out to the tails:
- Hyper-niche.
- Hyper-scale.
- This was already happening before LLMs, due to the zero cost of content distribution.
- But LLMs turbocharge it.
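The convergence claim above can be sketched as a toy simulation. This is a minimal, hypothetical illustration (the logits are made up, not from any real model): lowering the sampling temperature concentrates samples on the most likely option, over-saturating the middle and emptying the tails.

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Convert logits to a probability distribution; lower temperature
    sharpens the distribution around its mode."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

random.seed(0)

# Hypothetical token logits: one middle-of-the-distribution option
# slightly ahead of several near-equal alternatives.
logits = [2.0, 1.5, 1.4, 1.3, 1.2]

def mode_share(temperature, n=10_000):
    """Fraction of n samples that land on the single most likely option."""
    probs = softmax(logits, temperature)
    samples = random.choices(range(len(logits)), weights=probs, k=n)
    return samples.count(0) / n

# At temperature 1.0 the mode wins only a plurality; at a low
# temperature nearly all samples converge on it.
print(mode_share(1.0))
print(mode_share(0.2))
```

With these made-up numbers, the mode's share rises from roughly a third of samples at temperature 1.0 to the large majority at 0.2 — the same compression from a varied distribution toward an over-saturated middle that the note describes.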