LLMs are fundamentally exposed to the prompt injection problem.

· Bits and Bobs 2/3/25
  • LLMs are fundamentally exposed to the prompt injection problem.
    • There's no containment boundary between the data and control planes.[aci]
    • Unlike in SQL, there's no structural way to escape possibly dangerous input and remove it from the control plane.
    • It's all just squishy text.

More on this topic

From other episodes