Managing your Context Window
Control what (and how much) the AI sees in Chorus
Language models have a context window, which is the maximum number of tokens they can process at once. “Context” includes all the information in the chat, whether it’s text, images, or attachments.
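To get a feel for how much of a window a message uses, you can roughly estimate its token count. This is a minimal sketch, assuming the common rule of thumb of about 4 characters per token for English text; a model’s own tokenizer will give exact counts.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: assumes ~4 characters per token,
    a common heuristic for English text (not an exact count)."""
    return max(1, len(text) // 4)

# Example: estimate how much context a short message consumes.
message = "Language models have a context window measured in tokens."
print(estimate_tokens(message))
```

Attachments and images consume context too, so real usage is typically higher than a text-only estimate.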
Chorus provides several handy features to manage what’s in the context.
Managing what’s “In Chat”
One of Chorus’ core features is the ability to talk to multiple models at once.
In Chorus, only one AI message is kept in the chat history for each turn of the conversation. This is the message marked as “In Chat”. Not only does this keep your chat’s context smaller, it also creates an easy linear history for the LLMs to follow.
To control what’s “In Chat”, click on a message to keep it.
Summarizing and Starting a New Chat
If one of your models runs out of context, you’ll be presented with an error message. Pressing the Summarize and Start New Chat button will:
- Summarize and transcribe your current chat
- Create a new chat, attaching the transcription as a markdown file
- Queue up the message that the AI was originally unable to respond to