Prompt injection can't be solved if you assume the chatbot is the main entity calling the shots.

· Bits and Bobs 5/26/25
  • Prompt injection can't be solved if you assume the chatbot is the main entity calling the shots.
    • Because chatbots are confusable, so they can't enforce security boundaries.
    • The chatbot is in charge, which can't be secure.
    • The chatbot has to be a feature, not the paradigm.

More on this topic

From other episodes