Expanded context windows for Cody Chat
We've increased input and output context limits across models to support more powerful, large-scale coding workflows.
Developers increasingly want to work with more context, whether it's reviewing all files in a pull request, generating a new Dockerfile based on all others in a repo, or reasoning across a large set of project resources. But context limits have traditionally capped how much they can reference in a single request, leading to fragmented workflows or lower-quality responses.
To help with this, we're expanding both input and output context windows in Cody Chat for Cody Enterprise customers.
Input context window (via @mention):
Output context window:
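To give a rough sense of how a larger input window changes what fits in a single request, here's a minimal client-side sketch of token budgeting for @mentioned files. The `INPUT_TOKEN_BUDGET` value, the `estimateTokens` heuristic (roughly four characters per token), and the helper names are illustrative assumptions, not Cody's actual limits, tokenizer, or implementation.

```typescript
// Minimal sketch: decide which @mentioned files fit in a hypothetical input budget.
// The budget value and the chars-per-token heuristic are illustrative assumptions,
// not Cody's actual limits or tokenizer.

const INPUT_TOKEN_BUDGET = 30_000; // hypothetical input context window, in tokens

// Rough heuristic: ~4 characters per token for typical source code.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

interface MentionedFile {
  path: string;
  content: string;
}

// Greedily include @mentioned files until the budget is exhausted.
function selectMentions(files: MentionedFile[], budget = INPUT_TOKEN_BUDGET): MentionedFile[] {
  const selected: MentionedFile[] = [];
  let used = 0;
  for (const file of files) {
    const cost = estimateTokens(file.content);
    if (used + cost > budget) break;
    selected.push(file);
    used += cost;
  }
  return selected;
}

// Example: with a larger window, more of a pull request's files fit in one request.
const prFiles: MentionedFile[] = [
  { path: 'Dockerfile', content: 'FROM node:20-alpine\nWORKDIR /app\n' },
  { path: 'src/server.ts', content: 'export const start = () => console.log("up");\n' },
];
console.log(selectMentions(prFiles).map(f => f.path));
```

With a larger input window, a budgeting step like this simply admits more @mentioned files before hitting the cutoff, which is what enables workflows such as reviewing an entire pull request in one chat.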
Learn more about context windows in our docs.