Prompt injection can't be solved if you assume the chatbot is the main entity calling the shots.
- Prompt injection can't be solved if you assume the chatbot is the main entity calling the shots.
- Because chatbots are confusable, so they can't enforce security boundaries.
- The chatbot is in charge, which can't be secure.
- The chatbot has to be a feature, not the paradigm.