I liked these fundamental reflections on alignment and collectives from SoftMax's Reimagining Alignment.
- "The result of this process is not just a big colony of cells, but an organism which is a new individual in itself. Something more than just the sum of its parts. The "we" of the cells becomes an "I", with goals that cannot be understood as some simple sum of the goals of the parts. Animals do the same thing, forming colonies and packs and so on. Even trees form these organically aligned collectives through mycelial networks. It happens at every scale, big and small."
- "Hierarchical alignment works fine, right up until the rules or person on top are wrong. The smarter the subordinate, the more likely this is. Hierarchical alignment is therefore a deceptive trap: it works best when the AI is weak and you need it least, and worse and worse when it's strong and you need it most. Organic alignment is by contrast a constant adaptive learning process, where the smarter the agent the more capable it becomes of aligning itself."