Chats are append-only logs of messages because it's hard for the human to absorb changes in the conversation.
- Humans don't re-read every previous message of the conversation before responding; they use their imperfect fuzzy memory of what's already been said.
- But LLMs don't do that at all.
- LLMs "read" the whole context for every single token they produce.
- Which means that, in this modality, you can go back and amend, tweak, or edit previous messages to give the model better background context.
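
Since the model re-reads the full context on every request, the "history" you send to a chat API is just a list you are free to rewrite. A minimal sketch, assuming an OpenAI-style list of role/content messages (the API call itself is omitted; `amend` is a hypothetical helper, not a library function):

```python
def amend(history, index, new_content):
    """Return a copy of the chat history with one message rewritten.

    The model has no memory of the old version -- it only sees
    whatever list you send on the next request.
    """
    edited = [dict(m) for m in history]  # copy each message dict
    edited[index]["content"] = new_content
    return edited

history = [
    {"role": "user", "content": "My app crashes."},
    {"role": "assistant", "content": "Can you share the error?"},
]

# Instead of appending a correction, go back and enrich the first
# message with better background context before the next request.
history = amend(history, 0, "My Flask app crashes on startup with an ImportError.")
```

A human reader would find this kind of retroactive editing disorienting, but the model treats the edited transcript as if it had always been that way.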