· Bits and Bobs 8/4/25
  • This week someone framed alignment to me as "minimizing agency cost, as defined in the principal-agent problem."
    • When you interact with a superior intelligence, the principal-agent problem becomes more pressing, because the power differential is larger.
    • Swarms, like organizations, are a form of emergent collective intelligence and can be "smarter" than any individual member.
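
The agency-cost framing above can be sketched as a toy model. Everything here (the action names and payoff numbers) is a hypothetical illustration, not something from the post: the principal would pick the action that maximizes her payoff, the agent picks the action that maximizes his own, and the agency cost is the principal's loss from that gap.

```python
# Toy principal-agent sketch (hypothetical payoffs, for illustration only).
actions = ["careful", "fast", "reckless"]

# The principal values quality; the agent values low effort.
principal_payoff = {"careful": 10, "fast": 6, "reckless": 1}
agent_payoff     = {"careful": 2,  "fast": 8, "reckless": 9}

first_best = max(actions, key=principal_payoff.get)  # action the principal wants
delegated  = max(actions, key=agent_payoff.get)      # action the agent chooses

# Agency cost: what the principal loses by delegating to a self-interested agent.
agency_cost = principal_payoff[first_best] - principal_payoff[delegated]
print(first_best, delegated, agency_cost)  # → careful reckless 9
```

On this framing, "alignment" would mean reshaping the agent's payoffs (incentives, training objectives) so that `delegated` converges to `first_best` and the agency cost goes to zero.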
