· Bits and Bobs 1/21/25
  • Last week I asserted that there's no easy way to learn.
    • This generated some skepticism in the comments.
    • Let me try a slightly different derivation.
    • Learning means developing a better predictive model of the world.
    • Without error there is nothing in the model to correct.
    • Correcting weights to produce a better model is literally what learning means.
      • Especially for neural networks, but also for humans.
    • An error is a kind of failure.
    • Failure often hurts.
    • Sometimes error doesn't feel like failure (for example, in play) or failure doesn't hurt (for example, in a psychologically safe environment), but those are special environments that have to be cultivated.
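
The weight-correction point above can be made concrete with a minimal sketch of error-driven learning. This is the classic delta rule for a single linear neuron (the function name and values are illustrative, not from the episode): the update is proportional to the prediction error, so when the error is zero, nothing changes and no learning happens.

```python
# Minimal sketch of error-driven learning (the delta rule) for one
# linear neuron. Illustrative only: names and numbers are made up.
def learn_step(w, x, target, lr=0.1):
    """One update: nudge each weight in proportion to the error."""
    prediction = sum(wi * xi for wi, xi in zip(w, x))
    error = target - prediction      # zero error => zero correction
    return [wi + lr * error * xi for wi, xi in zip(w, x)]

# Repeated exposure to the same example drives the error toward zero.
w = [0.0, 0.0]
for _ in range(50):
    w = learn_step(w, [1.0, 2.0], target=5.0)
```

Once the model predicts the target exactly, `error` is zero and the weights stop moving, which is the sense in which a model with no error has nothing left to learn from.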
