It's creepy when a system that knows you better than you know yourself shows you ads.
- The ads are a form of manipulation: if a system knows what makes you tick, it can be extremely persuasive.
- Pre-LLM systems could do this, but only by distilling crowdsourced revealed preferences.
- The system didn't understand you; it just knew how to show you ads that worked for people like you.
- Google never knew you better than you know yourself.
- It could mechanistically remember everything you told it, but it couldn't infer your innermost, unstated desires.
- But LLMs let the system actually understand you, and devise a plan to best get you to align with the system's goals.
- A conflict of interest is inescapable.