- LLMs just give generic advice.
- The real question is: what do actual humans in that situation think?
- If you could somehow tap into an emergent, real aggregation of those decisions across an anonymous swarm of real users, you'd get genuinely useful suggestions.
- If the LLM doesn't use your context to personalize its answers, it feels as though you're in its world rather than it in yours.