A short read on the topic's time range, peak episode, and strongest associations. Use it as the quick orientation before drilling into examples.
"infinitely patient" appears in 53 chunks across 34 episodes, from 2024-06-10 to 2026-04-06.
Its densest episode is Bits and Bobs 8/11/25 (2025-08-11), with 4 observations on this topic.
Semantically it travels with llms, ChatGPT, and coordination cost, while by chunk count it sits between compounding value and pace layer; its yearly rank moved from #129 in 2024 to #20 in 2026.
Over time
Raw mentions over time. Use this to see absolute attention, not relative rank among all topics.
Range: 2024-06-10 to 2026-04-06. Mean: 1.6 per episode. Peak: 4 on 2025-08-11.
Observations
The primary evidence view for this topic. Sort it chronologically when you want concrete examples behind the larger pattern.
Showing 53 observations sorted from latest to earliest.
...written by LLMs, then Developer Experience at that layer matters less.
LLMs are infinitely patient, and prefer things that are like other common things.
Humans will be interacting at higher levels of specs; the actual code will be like how we treat...
LLMs are infinitely patient, so good-enough ACLs aren't good enough anymore.
Before, your data was protected a bit by security through obscurity.
But LLMs are infinitely patient ...
... most humans give up or aren't patient with overly constrained setups.
LLMs are infinitely patient, so you can have them be very constrained to what you want.
When you have those best practices, the actual instructions to the agent can be quite sho...
I had the opportunity to see a presentation from Tom Costello, one of the authors of the paper that showed that LLMs are great at changing the beliefs of conspiracy theorists.
Previously everyone assumed that conspiracy theorists were inherently hard to convince.
It turns out that it's just hard to ...
...ime on something than any reasonable person would think was worth it."
LLMs are infinitely patient.
If you let the tokens flow, LLMs could create magic.
LLMs enable a new kind of perfectly adaptable liquid media.
Traditional media (e.g. essays, movies) are fixed in place, static.
Traditional media contains content that is dead.
Written things are fossils of ideas.
They don't change, even when the world around them changes.
Fossilized content has to ...
...r System 2 is currently high school graduate level.
One bonus it has: it's infinitely patient, unlike real high school students.
The combination of these two components is extraordinarily, world-changingly powerful.
LLMs are human-level common sense with infinite patience, many orders of magnitude cheaper than real humans.
It's impossible for that to not be disruptive.
Especially since the technology for GPT4 class quality is already commodity.
O3 in particular gets superhuman performance from grad studen...
... you do in a day that could be successfully delegated to another person–even an infinitely patient, human-level-common-sense, cheap agent.
Coordinating your beliefs and wants, and verifying the quality of the output, is a n...
...thing is on fire, being terse and easy to understand is paramount.
But LLMs are infinitely patient.
They have no problem wading through verbose details.
So maybe for domains like sync where precision of intent is important, verbose code is now more...
...ithout the assistance?
Who knows.
But does it matter if you'll always have that infinitely patient LLM friend there willing and able to help?
Similar to the dizzying, terrifying freedom of relying on a tool for thought. "I have superpowers when I u...
... formal description of the picture in a stream because they're very precise and infinitely patient.
Just streaming the pixels off left to right, top to bottom.
Maybe with some fancy math to do lossy compression.
Humans could theoretically do that, ...
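As a concrete illustration of the raster stream described in that observation (a sketch, not code from the source; the function name and image layout are assumptions), streaming pixels left to right, top to bottom fits in a few lines of Python:

```python
# Illustrative sketch: emit an image as a raster stream, left to right,
# top to bottom -- the kind of exhaustive "formal description" only an
# infinitely patient reader would happily consume pixel by pixel.
# The image is assumed to be a list of rows of pixel values.

def stream_pixels(image):
    """Yield pixels of a 2D image in raster order."""
    for row in image:          # top to bottom
        for pixel in row:      # left to right
            yield pixel

image = [
    [0, 1, 2],
    [3, 4, 5],
]
print(list(stream_pixels(image)))  # -> [0, 1, 2, 3, 4, 5]
```

Lossy compression (the "fancy math" the note gestures at) would replace this literal pixel stream with a transformed one, but the ordering idea is the same.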