- The Washington Post covered an example where OpenAI's Operator did something unexpected.
- The user had asked Operator to find cheap eggs around them, then left it alone for a few minutes.
- Operator ended up ordering expensive eggs to their house, even though the user had never asked it to actually buy any eggs.
- They had shared their home address so Operator could narrow its search, and had given it access to Instacart so it could see prices, but hadn't expected it to actually buy anything.
- From The New York Times's review:
- "In all, I found that using Operator was usually more trouble than it was worth. Most of what it did for me I could have done faster myself, with fewer headaches. Even when it worked, it asked for so many confirmations and reassurances before acting that I felt less like I had a virtual assistant and more like I was supervising the world's most insecure intern."
- This is a small example of the challenge of the agent frame, of giving something autonomous agency to operate on your behalf.
- If the agent's understanding of your intent isn't perfectly aligned with what you actually want, watch out!
- Agents are like the monkey's paw of wish granting.
- Be careful what you wish for if there's any ambiguity in what you asked it to do!