Bits and Bobs 2/10/25
1. Next week's Bits and Bobs will come out on Tuesday due to the US holiday.
2. With the new class of reasoning models, we have computerized both System 1 and System 2.
- A refresher on the System 1 / System 2 conceptual model of human cognition:
- System 1 - Automatic, cheap, parallel retrieval of past experiences.
- System 2 - Effortful, expensive, serial creation of novel reasoning.
- LLMs before the reasoning models were 100% System 1.
- It looked like they were able to reason, but that was due less to them being capable of reasoning and more due to their planetary scale.
- If you thought of them as just like a normal single human, you'd miss the fact that they are a totally different category: a planetary-scale hive mind of vibes and memories.
- LLMs can quickly retrieve past vibes and tweak them to fit a novel pattern, but only a little bit.
- But with planetary scale, it doesn't really matter–there is such huge coverage of scenarios that there's likely a pretty good one to draft off of.
- As a result, we tricked ourselves into thinking they could do novel reasoning when really what they were doing is cache retrieval with a bit of augmentation.
- But the reasoning extensions from o1, R1, and the others are like a proper System 2.
- The System 1 has enough coverage that if you give it the time and space to think step by step, it can do proper reasoning even for truly novel scenarios not captured directly in the System 1.
- It has human-style common knowledge it can use to reason its way through novel scenarios.
- These models also give a reinforcement-learning-style quality curve for that reasoning ability to then climb.
- Turns out that computers need time to reflect and think to get better answers, just like humans do!
- Both the computer System 1 and the computer System 2 are impressive on their own.
- The computer System 1 is orders of magnitude beyond any human's System 1 that has lived or could ever live.
- The computer System 2 is currently high school graduate level.
- One bonus it has: it's infinitely patient, unlike real high school students.
3. The Washington Post covered an example where OpenAI's Operator did something unexpected.
- The user had asked Operator to find cheap eggs around them, then left it alone for a few minutes.
- Operator ended up ordering expensive eggs to their house–even though the user had never asked it to actually buy any eggs.
- They had shared their home address so it could narrow its scope, and had given it access to Instacart so it could see prices… but hadn't expected it to actually buy something.
- From the New York Times's review:
- "In all, I found that using Operator was usually more trouble than it was worth. Most of what it did for me I could have done faster myself, with fewer headaches. Even when it worked, it asked for so many confirmations and reassurances before acting that I felt less like I had a virtual assistant and more like I was supervising the world's most insecure intern."
- This is a small example of the challenge of the agent frame, of giving something autonomous agency to operate on your behalf.
- If you aren't perfectly aligned, watch out!
- Agents are like the monkey's paw of wish granting.
- Be careful what you wish for if there's any ambiguity in what you asked it to do!
4. DeepSeek is a banana peel moment for OpenAI.
- A thing that looked untouchable and strong is revealed to actually be very precarious, in an embarrassing way in front of everyone.
5. I want magic that is cozy.
- Magic in the large is terrifying and inhuman.
- What if it decides you're in its way?
- Magic in the small can be empowering and human.
6. I want an enchanted loom to help me weave together software that accomplishes my wishes.
- A tool that I direct, that is enchanted with the insights from across the realm.
- The collective, cozy magic of all of the users, together, working for each of us as individuals.
7. I want cooperative software.
8. I want to garden my own software.
- Plant seeds of intention.
- Spread fertilizer of my data for the software to operate on.
- The system provides the trellis for the software to grow on.
- I pull weeds and prune back parts that are growing outside of what I want.
- Today's software is made via factory farming.
- Optimized for efficiency for the market as a whole, not my individual nutrition.
- Let's create organic software in community gardens.
- Human-scale, nutritious software.
9. I want truly personal computing.
- The word "PC" means "Personal Computer".
- It was in contrast to the centralized mainframes of the time.
- But it also meant a level of personal computation.
- Nobody could tell you no on your own computer.
- In the early days of the PC, it was messy and weird.
- You could install whatever software you wanted, and combine data from applications in novel ways via the filesystem.
- Then we moved it to the cloud for convenience, which also allowed new types of collaboration and social networking not possible locally.
- But now it was on someone else's server: someone else's turf.
- If you didn't like how a piece of software worked, you couldn't tweak it, or connect other tools to its data the way the filesystem allowed.
- You can use it or not use it; those are the only two options.
- This leads to a world where software tries to be not great for individuals but good enough for the largest possible market.
- A thing that people don't actively hate that's minimally better than alternatives.
- Even if you have the motivation and drive to improve or build, you can't.
- It's shrink wrapped, one-size-fits-none software.
- Why did we give up on the idea of personal computing in the era of the cloud?
- We lost the "computer" part of the PC for the cloud, and we also threw out the "personal" baby with the computer bathwater.
- Why not both? In a disruptive new era of AI, it's more important than ever before.
10. I don't want humane tech, I want tech that makes us more human.
11. Apps are more than just their code.
- If you had only the Instagram app binary, you'd have a teensy portion of what "Instagram" means.
- Instagram is a massive network of users, connections, and data.
- The app part is just a client to access that network.
- Most apps are actually a special kind of hyper-specific browser for a specific, proprietary network.
12. What would happen if we shattered apps into liquid software?
13. In today's architecture, the person who wrote the software has to be the one to extend it with new functionality.
- You can't run untrusted third-party (3P) code in a trusted context unless it's actively sandboxed.
- There are some ways to do sandboxing in the context of another experience (e.g. iframes) but they create noticeable seams between the host and the inner content. It's hard to make seamless experiences.
- In the architecture of today, this means only the person who writes the software can write the Turing-complete code that extends their app and operates on your data.
- The software runs on the software creator's servers, out of our reach.
- But things that you care a lot about will not necessarily be a thing they care about, if few other people have that use case... or it's against their business incentives.
14. SMTP is a weathered communication technology.
15. Email clients in today's software paradigm have to be one-size-fits-all.
- From one extreme of people who get only spam and a few emails a day, to people who get thousands of emails a day.
- But what if you could have an email client optimized entirely for you?
16. Don't play in a category.
- Don't play in a category. Define one.
- Don't just build a product, define a category.
17. Some new categories open up a whole new universe of categories within them.
- The web was such a meta-category definer.
- There was a new category of thing: the web browser.
- But within the web, there was an explosion of new categories that previously were impossible (online shopping, social networks, search engines, etc).
18. The browser of tomorrow will be distributed via the browser of today.
19. Aggregators are like monkey traps.
- Monkey traps have a delicious prize in a pot.
- The monkey sticks their hand in, grabs the prize… but then the monkey's fist with the prize inside is too large to fit out of the pot's mouth.
- The monkey is trapped; they don't want to give up the prize, but they can't leave with it.
20. If you use a linear process to react to a compounding curve, you will never catch up.
- Compounding curves are things that grow on their own, like a tree.
- But it takes time for the tree to start growing and become big and strong.
- If you need something big and strong right now, planting a tree will never be the thing you do.
- But if you had planted a tree before you needed it, you'd have your own self-growing thing.
- If you don't have a tree and are trying to tackle something that grows on its own, at each time step you will get more and more behind.
- It will never solve your tactical problem to plant the tree, so you never will.
- If you know you're up against a compounding force, you need to have the compounding complement.
- Plant it early.
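The gap between the two processes can be made concrete with a tiny sketch (the specific numbers are illustrative, not from the original):

```python
# A linear process adds a fixed amount of capacity each step; a
# compounding process (the "tree") grows by a fixed percentage.
def linear(start, add_per_step, steps):
    return start + add_per_step * steps

def compounding(start, rate, steps):
    return start * (1 + rate) ** steps

# The linear process wins early, but the tree overtakes it and then
# races away; that's why it pays to plant it before you need it.
for t in [0, 10, 20, 30, 40]:
    print(t, linear(0, 10, t), round(compounding(1, 0.20, t), 1))
```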
21. Search engines should coevolve with their ecosystem.
- Search engines in a late stage ecosystem are massive, complex beasts.
- But at the very beginning of the ecosystem, they can be simple and small.
- Then, as the ecosystem grows, you iteratively increase the complexity of the search engine too.
- Search engines are proprietary, opinionated guides that users seek out to help them navigate the wild, verdant, but overwhelming jungle of an ecosystem.
- The ecosystem will grow at a compounding rate, so the search engine must, too.
- If you build a search engine as a linear process, it will never catch up with the compounding momentum of the ecosystem.
- Linear process here means "An employee has to invest effort to get it to handle the increased scope of the ecosystem."
- A directory like Yahoo's was an example of a linear process.
- By default we build things with linear processes, but this will get you stuck.
- My first year out of college I worked in the Search team on a precursor to the Knowledge Graph.
- In the product we were building we needed to know which attributes were actually synonyms (e.g. "date of birth" and "birthdate").
- There were only a few dozen equivalence classes that covered almost all use cases, so I wrote up a CSV of configuration and we tried to check it into version control.
- The check-in was rejected: the team had an iron rule that the system's behavior had to be derived from the data itself, not from hand-curated configuration.
- That meant that as the ecosystem grew–as more users used it for more attribute names we'd never thought of, or in different languages–the system would be able to adapt and heal automatically, even without human intervention.
- This iron rule was one reason that the search engine could coevolve with the underlying, open-ended ecosystem.
- A tiny bit of additional work now made the system antifragile and auto-catalyzing, a compounding process, not a linear one.
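One common way to derive equivalence classes from data is a union-find structure; the sketch below is hypothetical (the attribute names beyond "date of birth" / "birthdate" are made-up stand-ins, not the actual system's data):

```python
# Minimal sketch: merge attribute names into equivalence classes from
# observed usage pairs, rather than from a hand-maintained CSV, so new
# names fold in automatically as the ecosystem grows.
class UnionFind:
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

uf = UnionFind()
# Pairs observed to be used interchangeably (hypothetical data).
for a, b in [("date of birth", "birthdate"), ("birthdate", "dob"),
             ("fecha de nacimiento", "date of birth")]:
    uf.union(a, b)

# All four names now land in one equivalence class.
assert uf.find("dob") == uf.find("fecha de nacimiento")
```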
22. A fundamental, inescapable battle that shapes a lot of the world around us: the warring curves.
- The warring curves are the logarithmic and the exponential.
- There are many scenarios where you get exponential cost for logarithmic returns.
- At the beginning they work great.
- But they are fundamentally capped at some frustratingly low ceiling.
- You can always get more return for more effort, but past a certain point it's effectively meaningless: all effort, no return.
- This is a vicious spiral.
- Close-ended, top-down contexts have this shape.
- This curve shows up because to capture more real-world value, you need a ton of rules and structure.
- Think of the portion of real world value captured by the system as the volume, and the set of rules and structure necessary to capture that value as the surface area.
- The real world is fundamentally, inescapably wrinkly and fractal; it does not behave according to simple linear rules that make sense to humans.
- As you get into smaller details, they have more surface area for a smaller amount of net new volume.
- This is where the shape of the curve comes from.
- The volume grows logarithmically and the surface area grows exponentially.
- This scenario cannot be tackled by simply putting more effort into it; the curves race away from each other.
- This problem only shows up if you try to capture the real world with top down rules.
- At the very early stages, you get a high amount of value for a small amount of effort, and everything seems to be going great.
- But unbeknownst to you, you are locked into a path of tragedy and heartache.
- As you get further, you get closer and closer to the inflection point where the curves cross.
- When the curves cross, you start getting extreme cost for very little benefit.
- As you get closer to the inflection point, you can tell something is wrong, but you think you're just losing your touch, and you push harder.
- Once you get past the inflection point you are lost. You are locked into this approach that used to work but now cannot work.
- Starting from scratch seems impossible, and so you toil there, stuck in a dead end, increasingly exhausted and angry, until you give up.
- You can get to 80% of the value with 20% of the effort, but you'll need infinite effort to get anywhere meaningfully beyond 80%.
- This phenomenon shows up in many situations.
- This showed up for Alexa and Google Assistant: grammar-based trees of behavior that got increasingly expensive to author for real-world scenarios.
- This shows up any time you try to capture a real-world phenomenon in a formal ontology.
- This shows up in a government trying to set precise laws.
- This is also part of why you get the tyranny of the rocket equation for organizations.
- That is, where adding an incremental employee is the best way to get more value, but each incremental employee adds less and less value until it becomes infinitesimal.
- It's also why organizations and organisms get to a certain size and can't get bigger, whereas emergent things like an ecosystem or economy or city can grow to an arbitrary scale.
- Top-down control works best at small scales, but it flips at larger scales, where coordination isn't possible.
- But all hope is not lost. There are other scenarios where you get exponential return for logarithmic effort.
- These combine the two warring curves in their maximal expression.
- This is the virtuous cycle.
- Open-ended, emergent contexts have this shape.
- This shape is wildly different from its cousin.
- It starts off with more effort than the alternative.
- But past the inflection point you get nearly infinite value created for a small amount of effort.
- It uses an emergent phenomenon to capture an emergent phenomenon.
- You just need to plant the seed and water it; it will grow into a majestic oak tree on its own.
- These kinds of situations are rare and hard to engineer, but they can be searched for and grown.
- Think truffle hunting or gardening.
- To solve modernity's problems, we need to focus more of our attention on this kind of virtuous cycle.
- The warring curves problem is impossible to fix with coherent effort.
- It can only be tackled with swarm energy that is emergent and imprecise.
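A minimal numeric sketch of the two warring curves (the constants are chosen purely for illustration):

```python
import math

# Returns grow logarithmically with effort (the "volume"), while costs
# grow exponentially (the "surface area" of rules and structure).
def value(effort):
    return 100 * math.log(1 + effort)  # diminishing returns, soft cap

def cost(effort):
    return 1.5 ** effort               # compounding cost

# Before the curves cross, value dwarfs cost; after, cost races away.
for effort in range(0, 25, 4):
    print(effort, round(value(effort)), round(cost(effort)))
```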
23. All else equal, power goes to the scarce thing.
- The scarce thing tends to be expensive.
- Software used to be scarce.
- But now software–at least, shitty software in the small–can be plentiful.
- That implies the creators of software will have less power.
24. Centralized things become bland.
- Centralized things try to get more marginal users to scale.
- The best way is to sand down the rough edges, the complexity, to appeal to those marginal users.
- Sanding down and aiming for the lowest common denominator is by definition bland.
25. Extremely simple, massively used protocols exist.
- But most protocols fail by being either too simple to coordinate in an interesting way or so bloated that they never get off the ground.
- There's an existence proof that it's possible to balance on that knife's edge, to find explosive growth with simplicity, but that doesn't mean it's easy to do.
- It's an emergent process with selection pressure, very hard to do by design.
26. Why do open protocols seem to only work when there's some dirt simple coordination format?
- Because to coordinate around a protocol people have to decide to use it, and the value of using it goes up with the number of other people who already use it.
- A classic network effect.
- The more complex it is–the more characters of normative spec text–the more there is to disagree with.
- Because the coordination point is an emergent property of multiple actors, a linear increase in length of things to disagree with leads to a super-linear decrease in likelihood to be a viable coordination point.
- This is the exact same inescapable coordination tragedy covered in the slime mold deck.
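The super-linear drop can be sketched with a toy probability model (an assumption for illustration, not from the original): suppose each of n adopters must accept every one of k normative clauses, and each clause is accepted independently with probability p.

```python
# Toy model: a protocol is a viable coordination point only if every
# adopter accepts every clause of the spec.
def viable(k_clauses, n_adopters, p=0.99):
    return p ** (k_clauses * n_adopters)

# A linear (10x) increase in spec length gives a far-worse-than-10x
# drop in the chance that everyone agrees.
for k in [10, 100, 1000]:
    print(k, viable(k, n_adopters=10))
```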
27. Ecosystems get to critical mass when they start building themselves.
- That is, the fire catches and becomes a self-sustaining flame.
- If the ecosystem isn't fundamentally better than more established alternatives, then it never catches up.
- Even with billions of dollars of investment, it's sometimes not possible to get to the critical mass point.
- But if the ecosystem is fundamentally better in some novel way, if it has no alternatives, then sometimes it can get to critical mass at a surprisingly low bar.
- You need:
- 1) a core that is valuable to a subset of the market on its own
- 2) a boundary gradient that pulls in more and more people over time.
- The boundary gradient can be a network effect, or it can be as simple as a thing that is values aligned, that people want to want.
28. Find a context that's smarter than you and listen to it.
- Emergent processes have a compounding growth curve that an individual process can never hope to beat.
- This is inspired by Christopher Alexander's notion of unfolding.
29. Business-oriented people are monetization first, product second.
- Product-oriented people are product first, monetization second.
- A PM might build a product that provides value to society even if they couldn't monetize it.
- A business person might build a product that detracts value from society as long as they could monetize it.
30. I love this frame of strong-link vs weak-link systems.
- Most problems are weak-link problems.
- If a single component fails the whole thing doesn't work.
- This means all of the focus is on improving the weakest link.
- Some special problems are strong-link problems.
- Only the best link matters; everything else can be ignored.
- This is one of the characteristics of antifragile systems.
- Science, evolution, capitalism, etc.
- Strong-link systems are ones that have massive distributed computation, with memes, organisms, companies, products fighting it out, and the best rise to the top.
- A kind of emergent percolation sort.
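The min/max framing of the two system types can be stated in a couple of lines of code:

```python
# Weak-link systems are graded by their worst component; strong-link
# systems by their best. Same components, opposite leverage.
components = [0.2, 0.5, 0.9]

weak_link_quality = min(components)    # a chain: fix the worst link
strong_link_quality = max(components)  # science: back the best bets

assert weak_link_quality == 0.2
assert strong_link_quality == 0.9
```

Improving the best component helps a strong-link system but does nothing for a weak-link one; fixing the worst does the reverse.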
31. The more uncertainty there is, the more people cling to proxy metrics.
- Humans have a tendency to replace a complex question with a simpler one (the proxy) and then solve that proxy.
- But they then forget that it's not the real problem and they are lost.
32. Optimize not for being on the right side of the consensus, but the right side of the outcome.
- Consensus is often taken as a proxy for what the outcome will be.
- In large organizations with lots of uncertainty and swirl, it's easiest to cling to consensus.
- But what ultimately matters most is the outcome.
- Would you rather go with the flow on a thing you think won't work, or go against the flow on a thing you think will?
33. "I would simply do X" is a dead end analysis.
- It's incurious about why it might not be that simple.
- If it were that simple, it would have almost certainly been done that way already.
- The frame has Saruman vibes: assuming everyone other than them is an idiot.
34. A special kind of game-changing value is created when a thing goes from non-consensus to consensus.
- If it's consensus before and after the transition, there's no alpha.
- Everyone already agreed it was important, so there's no pop of innovation value.
- If it's non-consensus before and after then it never gets momentum, it stays weird and niche.
- It's the crossing from non-consensus to consensus that creates the gradient of innovation value for a business.
- The bigger the difference from before to after, the more game-changing it is, the more innovation value that it creates and captures.
35. If you try to control an open system, you make it closed.
- The infinite potential becomes finite.
- You get control but the tradeoff is making it mortal, a discontinuous category change.
- However, you can garden an open system without making it closed.
- Gardening is about planting seeds, providing fertilizer and trellises, pruning back and weeding, but fundamentally giving the space to plants to grow.
- You get the best of both worlds!
36. Complexity catalyzes complexity.
- If you're competing over resources, you have to be at least as complex as your competitor; otherwise they can operate inside your OODA loop.
- The competitor with the fastest OODA loop will win, all else equal.
- So evolution in an ecosystem goes at a compounding rate.
- As predation happens it kicks off an accelerating arms race.
37. Pockets in a system have different properties than the main system.
- This is why they can often find innovations that will percolate out that the larger system couldn't find.
- The pocket has a different centroid than the main context, which might by happenstance be in a direction that is innovative.
- Though note that most "innovations" a pocket finds aren't viable in the main context; only the small set that makes it through the structural percolation gauntlet crosses over and works.
- The gauntlet is the selection pressure that culls the emergent process's outputs.
- Sometimes we use things like "ideas that resonate in my Bluesky clique" as a proxy for what will resonate in the broader context.
- But that only works if you have a random sample in the pocket.
- If there is a structural selection bias (which there must be in an emergent pocket, at least in some dimension) then it ceases to be a good proxy for the surrounding context.
38. Another frame for the two unteachable skills: intellectual curiosity and grit.
39. You can be "right there" in all but one dimension and still very far away.
- "I'm right where the GPS says I should be."
- "Yes, but you're at the ground floor; we're on floor 100."
40. Someone told me their theory that Berkeley is structurally more likely to deliver game changing innovations.
- It has the weirdos, the crazy ones, the ones who don't want to–who can't–fit into the established power structures.
41. If your system is entirely sound in its internal logic but doesn't grapple with the real world, you're building a video game.
42. In an ideology-driven movement, the person who the movement lifts up is not the best, but the most.
- Most committed to the ideology, not the most effective at achieving the outcome.
43. Once a group of people believe in a particular infinite together, they're already most of the way to a religion.
44. Just because you can heal it doesn't mean you should.
- Healing at the wrong pace layer is harmful because it hides the brokenness below.
- If it's broken below in a fundamental way, making it look harmonious on top is bad.
45. What you "need", "want", and "want to want" are all distinct.
- When they are aligned, floor it, lean in, that's the time of maximal growth.
- In the hero's journey, external circumstances put you in a crisis which realigns your "want to want" with your "need", and then your "want" aligning with your "want to want" is growth.
- Examples:
- Aligned: "need" + "want to want". Misaligned: "want" - Not motivated to do what you know you need to do.
- Aligned: "need" + "want". Misaligned: "want to want" - A gay person not yet out to themselves.
- Aligned: "want" + "want to want". Misaligned: "need" - Need personal growth and development to understand what you need.
46. A parable about the fundamental, inescapable horror of internal politics in large organizations.
- The internals of an organization are best understood by the word kayfabe.
- Kayfabe is a carny word that means "a thing that everyone knows is fake but everyone acts like is real."
- It's often applied to professional wrestling.
- But it also applies to politics within organizations.
- Organizations exist to cause some positive impact in the outer world.
- But over time, the social processes necessary to work together inside the organization towards a coherent outcome metastasize and take over the organization's soul.
- A little bit of kayfabe is not bad–it's healthy, even.
- Imagine if in every team meeting when the boss proposed a new goal someone raised their hand and said "here's ten reasons I think this will definitely not work."
- In that case, the plan definitely doesn't work, because no one on the team will even try.
- But if everyone entertains the idea that it might work, maybe as we work together we find a way to make it work.
- But kayfabe tends to only ever grow.
- Imagine that you are responsible for grading progress on an objective that will be rolled up multiple layers of the chain to the CEO.
- One of your projects is objectively in a "yellow" state, but by the time the final roll-up is presented to the CEO next week, you'll have gotten it to a "green" state.
- Maybe you see the solution already and simply need to execute it.
- If you mark it a "yellow" now, you'll be more likely to attract scrutiny that could randomize you and create extra meta-work, and it will be solved by the time of the report anyway.
- What do you mark it down as? Yellow or green?
- What most people would do is mark it green.
- This is reasonable and safe for the employee.
- It's also reasonable and safe for the company.
- The problem is that this same logic plays out at multiple plies up the org chart.
- But if you greenshift on top of a thing that's already been greenshifted, the greenshifting multiplies together.
- That means that multiple plies up, the rolled-up status could be off from the ground truth by many orders of magnitude.
- The organization exists to achieve a real outcome in the world, which requires it to understand the ground truth to effectively navigate it.
- But the kayfabe has decohered from reality.
- Imagine you notice the discrepancy–what do you do?
- If you walk over to the ground truth bell and threaten to ring it, someone more senior than you will pull you aside.
- "You're right, we're dangerously far from the ground truth. But if you ring that bell, it will cause chaos–all of the plans will be shattered in an instant and everything will decohere. Instead of ringing it, why not help fix it?"
- This seems reasonable, and so you agree.
- But as time goes on you realize that the kayfabe is not only stronger than you, it is stronger than any assemblage of individuals and is getting stronger every day: an emergent, compounding force.
- As it goes on, it is destroying value for your customers, for your employees, for the company, and for society.
- Resolute, you decide to go ring the bell, no matter the consequences.
- But right before you do, an anonymous zombie tackles you to the ground and stabs you in the dark before you destroy the organization.
- As a leader in an organization like that (and every organization is like this, at least a little): you have a hard decision to make.
- Do you go along with the kayfabe or try to understand the ground truth to create good outcomes in the world?
- As a leader, you have to hold both in your head at the same time–enough kayfabe to not get stabbed, but enough ground truth to actually achieve good outcomes.
- But the kayfabe will win over time.
- If you let go of the kayfabe, you'll get stabbed.
- If you let go of the ground truth, the outcome won't happen… but the social complexity makes it extremely indirect to attribute outcomes to actions anyway so you'll likely be safe as long as everyone thinks you're working hard.
- So the kayfabe tends to ratchet up and up.
- As you lose the grip on the ground truth you become a zombie.
- Once you have let go of the ground truth, all you have is the kayfabe, and defending the kayfabe becomes the end.
- If someone threatens it, they are threatening your infinity, and they must be stopped no matter what.
- So you stab them.
- So the question in this story is this: in the fullest manifestation of kayfabe in an organization, there are only two options:
- To ring the ground truth bell knowing you will get killed for it.
- To stab the person who is about to ring the bell.
- Which will you choose?
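The greenshifting arithmetic in the parable can be sketched as a toy model (the optimism factor is an illustrative assumption, not a measured value):

```python
# Each ply of the org chart nudges the reported completeness up by a
# modest factor before rolling it up to the next layer.
def reported(actual, plies, optimism=1.3):
    return actual * optimism ** plies

# A project that is really 10% done, rolled up through 8 plies,
# arrives at the top looking more than 80% done.
print(round(reported(0.10, 8), 2))
```

Each individual shift seems harmless; it's the multiplication across plies that decoheres the roll-up from the ground truth.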