LLMs can't make convincing recommendations on their own.
- They lack the lived experience of the real phenomena.
- Good recommendations must be based on real, situated human decisions and opinions.
- Review aggregator sites do this by averaging hundreds of reviews from real people.
- These averages can resonate because they are distilled from authentic human observations.
- LLMs can approximate this for popular things by absorbing the observations contained in the writing they were trained on.
- But recommendations about anything novel are just pattern-matching guesses.
- Hollow.