A new LLM attack: the CopyPasta License Attack.
...ack. Takes advantage of the fact humans don't actually read license blocks, but LLMs do. Infinite patience strikes again.
1,598 mentions · 615 chunks · 117 episodes
...ack. Takes advantage of the fact humans don't actually read license blocks, but LLMs do. Infinite patience strikes again.
...uality, convincing arguments[ak] used to require a significant skill to do. Now LLMs make it so anyone can spin them up at a moment's notice. That cheapens even truly well written and effective argumentation. Will we have to evolve a ...
With LLMs, the benefits seem to have shifted decisively to typed languages. Typed languages are much easier to iterate within without breaking things. The prob...
...ating strings of source code. AST is hard for humans to understand, but not for LLMs! LLMs have the patience to deal with complicated types. Even if they don't understand it at first, they are willing to try again and again and again ...
...ze queries. Follow standard practices and you don't have to think about it. Now LLMs make all text executable. Frameworks don't help. Everything is code. XSS has a solution: we can parse HTML/JS with 100% accuracy and sanitize it. Eve...
This week's wild west roundup, this time using LLMs incidentally in attack chains: Nx compromised: malware uses Claude code CLI to explore the filesystem zack_overflow: "A popular NPM package got compr...
... one I was thinking of. I imagine we'll see much more web traffic in the age of LLMs. A single user intent can spawn orders of magnitude more searches and fetches on their behalf. More traffic to websites, without more intent, seems l...
Are LLMs mass media or not? On the one hand there's a single shared model that has specific biases that shape all interactions with them. Some of them are rea...
...o multiplied by infinity is infinity. "Prompt injection won't be a problem once LLMs get perfectly good at not being tricked" is absurd. Prompt injection comes from a coevolving adversary, not a static distribution of quality. That me...
...ed on the current task. I wonder what other innovations will come form treating LLMs as not append-only ping-pong chat.
I think the world is ready for the dishwasher and microwave oven of LLMs. Today we have the faux humanoid robot that talks and acts like a (weird) human). Chat is a new UI to put on top of all those tools but not be the an...
The fact that LLMs still require humans in the loop to get good recurrent results undermines the "AGI is imminent" perspective. Even if the LLM is right 95% of the time...
...nd a potentially dangerous one! The most natural and powerful interactions with LLMs will not be via a chatbot.
Will LLMs lead to compounding ambiguity in communication? There's value in shrouding a potentially controversial statement in load-bearing ambiguity to keep th...
...um. Also, it's becoming more clear that owning the context will matter, to make LLMs maximally useful for a given user. The cookie jar is an obvious, highly distilled source of context. But if you actually look inside your cookie jar ...
...e missing the GUI for LLM native software. Chatbots are the command line to get LLMs to do structured things. You need to know arcane knowledge for LLMs to do structured or complex things for you today. It just so happens that they're...
To be interesting it has to be opinionated. That's why LLMs by default aren't interesting. They give you the view from nowhere[bw]. They seek perfect "objectivity", a one-size-fits-none impossibility.
...n top. Weaving stories required human effort. Now with the infinite patience of LLMs it's less about the human doing it and more about the human giving permission for data to be built upon.
The best way to think about LLMs for writing code is like managing a team of interns. You could have done the work yourself, but they did it and you verified.
...they could fix a bug you do want. Instead of vendoring a dependency, he has the LLMs draw on their inherent knowledge from all of open source to distill a bespoke, fit for purpose library in place.