Consensus always pulls towards mush: the centroid, the average.
Notably, that centroid might not itself be a viable point.
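The geometric version of this claim is easy to check. A minimal sketch (with made-up points, purely for illustration): every individual point lies on the unit circle, but their average falls strictly inside it.

```python
import math

# Hypothetical example: three "viable" points, each exactly on the unit circle
angles = [2 * math.pi * k / 5 for k in range(3)]
points = [(math.cos(t), math.sin(t)) for t in angles]

# The centroid: average each coordinate across the points
centroid = (sum(x for x, _ in points) / len(points),
            sum(y for _, y in points) / len(points))

# Distance of the centroid from the origin
radius = math.hypot(*centroid)
print(radius)  # a value below 1.0: the average of on-circle points is off the circle
```

Every input satisfies the constraint; the average does not. The same failure mode shows up whenever the space of good answers is curved or multi-modal: interpolating between valid points lands you somewhere no one would endorse.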
LLMs are inherently a kind of planetary-scale consensus mechanism.
For example, if you ask an LLM for a "chicken paillard" recipe, it will do a good job. If you ask it for a "chili recipe," it is much more likely to give you disgusting slop, asymptotically approaching vomit in appearance and taste. Chicken paillard is a narrow, well-defined dish; "chili" spans countless incompatible variants, so the model's averaged answer lands near a centroid no actual cook ever vouched for.
Recipes published in cookbooks, or even shared on TikTok, had a real human in the loop asserting, "I tried this and it was good."
The LLM can't try the recipe itself, so it can serve you up something gross without realizing it.