Jordan Rubin: "A library you can import through the right metaphor"
...mport through the right metaphor" The right jargon unlocks the right library of background context. LLMs understand almost all jargon.
12 chunks · 9 episodes
... week. In the past I've found being able to pass the Bits and Bobs to Claude as background context made it a much more powerful brainstorming partner. But recently the amount of Bits and Bobs–even just focusing on the ones related to my job–was far...
... media. This is one of the reasons my Bits and Bobs export is such an effective background context for me to feed to LLMs when I'm brainstorming. My Bits and Bobs is like my own personal intellectual orange juice concentrate.
...totally can go back and amend, tweak, edit the previous messages to give better background context.
LLMs compress nearly all of humanity's background context into a teensy weeny little hyperobject package. They have effectively infinite background context–a marvel of lossy compression. The answers to innum...
...ke Shark DNA vs Frog DNA. 'Frog DNA' is the generic mush the model learned, the background context it falls back on to fill in the gaps that you didn't specify in your prompt. With RAG you proactively select the most useful bits for it to use to fi...
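The RAG idea above can be sketched in a few lines. This is a minimal illustration, not any particular library's API: the notes, the word-overlap scoring, and the function names are all assumptions made for the sketch. Real systems would use embeddings, but the shape is the same — proactively select the most relevant bits and put them in the prompt, rather than letting the model fall back on its generic Frog DNA.

```python
# Minimal RAG sketch (illustrative, not a real library's API):
# score chunks against the query, keep the top-k, and pack them
# into the prompt as background context.

def score(query: str, chunk: str) -> int:
    """Crude relevance score: count of shared lowercase words."""
    return len(set(query.lower().split()) & set(chunk.lower().split()))

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    """Return the k chunks most relevant to the query."""
    return sorted(chunks, key=lambda ch: score(query, ch), reverse=True)[:k]

def build_prompt(query: str, chunks: list[str]) -> str:
    """Fill the context window with the selected chunks, then the question."""
    context = "\n---\n".join(retrieve(query, chunks))
    return f"Background context:\n{context}\n\nQuestion: {query}"

# Hypothetical personal notes standing in for a Bits and Bobs export.
notes = [
    "Jazz required decades of coevolution between musicians and audiences.",
    "Orange juice concentrate packs flavor into a small volume.",
    "LLMs compress humanity's background context into a small package.",
]
print(build_prompt("how do LLMs compress background context", notes))
```

A production version would swap the word-overlap score for embedding similarity, but the division of labor is identical: retrieval supplies the specified context, and the model's Frog DNA only fills whatever gaps remain.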
...How many of them hit a wall needing some kind of hyperobject with near infinite background context? If you sprinkle the pixie dust of LLMs on those old ideas, how many could spring to life?
...resentation to allow this to be feasible." You'd somehow need all of humanity's background context compressed into a tiny little hyperobject that you could cheaply, easily, and quickly query with arbitrary questions. Hmmmm….
What counts as MAYA has to do with the background context. For example, if you tried to introduce jazz in 1850 it would have been rejected. Jazz is a speciation event in a coevolutionary space. It required a...
...on length limits in my Claude chats that use projects that I've crammed full of background context. LLMs do better the more context they have to work with. People will cram contexts as full as the LLM will let them.
...xplainers or READMEs. Have documentation in a Google Doc, not a website. Assume background context the reader might not have. Connect nine of the ten dots. Hide signal in a swarm of interesting but distracting details.
...dd a teensy bit more context to help the idea make sense in the future once the background context is lost, develop the idea just a teensy bit, and maybe interlink with other recent related ideas. A lot of the time, similar ideas have come up in mu...