LLMs make it so any text is "executable," so a possible injection attack.
- LLMs make it so any text is "executable," so a possible injection attack.
- This is because it allows english to be converted, explicitly or implicitly, to "executable" code as instructions for it to follow.
- By default, the instructions it executes only affect what kinds of words it puts on your screen.