This generated some skepticism in the comments.
Let me try a slightly different derivation.
Learning means developing a better predictive model of the world.
Without error there is nothing in the model to correct.
Correcting weights to produce a better model is literally what learning means.
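This picture of learning as error-driven weight correction can be sketched in a few lines. The snippet below is a minimal illustration, not anything from the original post: a hypothetical linear model trained by gradient descent on squared error, where the update to each weight is proportional to the prediction error. When the error is zero, nothing changes, which is the point of the argument above.

```python
# Minimal sketch: learning as error-driven weight correction.
# A linear model predicts y from x; the prediction error drives
# the weight update (stochastic gradient descent on squared error).

def learn(pairs, lr=0.1, epochs=50):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in pairs:
            pred = w * x + b     # the model's current prediction
            error = pred - y     # zero error means no update below
            w -= lr * error * x  # correct the weights...
            b -= lr * error      # ...in proportion to the error
    return w, b

# Data generated by y = 2x + 1; learning should recover roughly that.
w, b = learn([(0, 1), (1, 3), (2, 5), (3, 7)])
```

Note that the only signal the learner ever receives is `error`; a model that is never wrong on its data has nothing to learn from.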
An error is a kind of failure.
Failure often hurts.
Sometimes error doesn't feel like failure (for example, in play), and sometimes failure doesn't hurt (for example, in a psychologically safe environment), but those are special environments that have to be cultivated.