We maintain mental models of the systems we interact with, and those models can be wrong and sometimes confuse us.
Matt Webb told me he had heard somewhere that humans tend to categorize objects into four basic categories:
Rocks - No movement or agency; just obey the laws of physics.
Plants - Alive, but purely reactive and slow; you can garden them.
Animals - Alive and active, with some degree of agency but strictly dumber than us. We can trick them without having to be particularly clever.
People - Alive and with a level of intelligence and agency roughly equivalent to ours; we need to have a full-fledged theory of mind to deal with them.
Waymo cars feel like animals.
Raw AI foundation models feel something like plants.
ChatGPT feels like a person.