If you come up with a definition of a task LLMs can't do... can humans?
Understanding a machine or an animal better prompts you to reflect on your own human skills. What makes us human?
"LLMs can't reason!" / "... OK, but for that task, can humans do it?"
LLMs don't do novel reasoning, but they produce such a good and wide-ranging facsimile of it that you won't notice the difference in practically any situation. So does it matter?
Everything everywhere is basically just vibes, and that happens to work!