LLMs are fundamentally exposed to the prompt injection problem.
- LLMs are fundamentally exposed to the prompt injection problem.
- There's no containment boundary between the data and control planes.[aci]
- Unlike in SQL, there's no structural way to escape possibly dangerous input and remove it from the control plane.
- It's all just squishy text.