A conversation a user has with an LLM is chock-full of private information.

· Bits and Bobs 8/12/24

You have to do quite a bit of denaturing before it's safe to use in other pipelines.

But imagine a situation where an LLM generates four different options in response to a user's high-level query.

Then the user picks one of those four options as the one to keep.

The individual options aren't that private (they're the LLM's answer to a high-level prompt, not a user's).

The pick itself isn't that sensitive – it's just a 1-of-4 choice among options produced by an external system.

That means the signal is very useful and needs far less denaturing than the conversation itself.
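A minimal sketch of what that signal could look like in practice: store only the LLM-generated options and the user's index, then expand the 1-of-4 pick into pairwise preference data. The names `PreferenceRecord` and `to_preference_pairs` are hypothetical, not from any particular system.

```python
from dataclasses import dataclass


@dataclass
class PreferenceRecord:
    # The options are LLM generations in response to a high-level query,
    # not the user's own text, so they carry little private information.
    options: list[str]
    # The user's pick is just an index into the options -- a
    # low-sensitivity, 1-of-4 signal.
    chosen: int


def to_preference_pairs(record: PreferenceRecord) -> list[tuple[str, str]]:
    """Expand one 1-of-4 pick into (chosen, rejected) pairs
    suitable for preference-style training."""
    chosen_text = record.options[record.chosen]
    return [
        (chosen_text, other)
        for i, other in enumerate(record.options)
        if i != record.chosen
    ]


record = PreferenceRecord(options=["a", "b", "c", "d"], chosen=2)
pairs = to_preference_pairs(record)
# One pick yields three (chosen, rejected) pairs: ("c","a"), ("c","b"), ("c","d")
```

Nothing in the record is the user's raw conversation, which is the point: the sensitive text never enters the downstream pipeline, only the external system's candidates and a choice bit.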