Language isn't just a large mush of words; it's a high-dimensional, fractally structured crystal.
It's a tool applied to the real world, so the structure it absorbs into itself reflects actual structure in the world; if it didn't, the swarm wouldn't bother to absorb it.
Only when many members of the swarm find a piece of information useful does it become a new wrinkle in the hyperdimensional structure of language, propagated forward through time and space.
If no one uses a new wrinkle, it fades away over time, entropy eroding it.
Humans absorb knowledge into language. And then LLMs can come along and slurp up that knowledge.
LLMs and neural nets are extremely rudimentary processes; they just run at enormous scale, soaking in an incredibly rich corpus of a galactically detailed thing.