LLMs are losing the ability to simulate real people.
- LLMs are, at base, a warped mirror of their human-generated training data.
- It used to be possible to use an LLM as a proxy for a swarm of humans, e.g. to predict how a population would respond to a survey.
- But as we've trained them harder for reasoning, they've lost the ability to represent nuanced, situated human perspectives.
- Their output distributions are getting sharper and sharper: probability mass collapses onto a single modal answer where real humans vary.
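
The "sharpening" claim can be made concrete with entropy: a model that mirrors a population spreads probability across answers, while a more aggressively post-trained model concentrates mass on one option. A minimal sketch of that effect (the five-option survey shares and the power-law sharpening are hypothetical stand-ins, not any model's actual distribution):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def sharpen(p, beta):
    """Raise each probability to the power beta and renormalize.

    beta > 1 crudely mimics training that concentrates the output
    distribution; beta = 1 leaves the distribution unchanged.
    """
    q = [x ** beta for x in p]
    z = sum(q)
    return [x / z for x in q]

# Hypothetical answer shares for a 5-option survey question, spread
# out the way a population-mirroring model might spread them.
base = [0.35, 0.25, 0.20, 0.12, 0.08]
sharp = sharpen(base, beta=4.0)

print(f"base entropy:  {entropy(base):.2f} bits")
print(f"sharp entropy: {entropy(sharp):.2f} bits")
print(f"top-option mass: {base[0]:.2f} -> {sharp[0]:.2f}")
```

The sharpened distribution has strictly lower entropy and a much larger top-option share, which is exactly the failure mode for survey simulation: the model still "answers", but the spread of answers no longer tracks the spread of people.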