LLMs make it so any text is "executable," so a possible injection attack.

· Bits and Bobs 2/18/25
  • LLMs make it so any text is "executable," so a possible injection attack.
    • This is because it allows english to be converted, explicitly or implicitly, to "executable" code as instructions for it to follow.
    • By default, the instructions it executes only affect what kinds of words it puts on your screen.
    • But if you give LLMs access to tools–computer programs outside its sandbox–the attack possibility explodes, because if the LLM can be tricked into using those tools that cause real-world side effects that might be dangerous to you.[yy][yz][za][zb][zc][zd]

More on this topic

From other episodes