
Expanded context windows for Cody Chat

June 25, 2025

Developers increasingly want to work with more context, whether that's reviewing all files in a pull request, generating a new Dockerfile based on all others in a repo, or reasoning across a large set of project resources. But context limits have traditionally capped how much they can reference in a single request, leading to fragmented workflows or lower-quality responses.

To help with this, we’re expanding both input and output context windows in Cody Chat for Cody Enterprise customers.

Input context window (via @mention):

  • Claude & Gemini: ~115k tokens
  • OpenAI o-series: ~43k tokens
  • OpenAI GPT-series: ~52k tokens

Output context window:

  • All models: 16k tokens
  • Reasoning models: 100k tokens
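
As a rough way to reason about these limits, the sketch below estimates whether a set of @mentioned files fits a model's input window. The limit values come from this post; the ~4 characters-per-token heuristic and the model keys are illustrative assumptions, not Cody's actual tokenizer or API.

```python
# Rough fit check for @mentioned content against the input context
# windows listed above. The ~4 chars/token ratio is a common
# approximation; real tokenizers vary by model.

# Approximate input limits (tokens, via @mention) from this post.
# Model keys are hypothetical labels for illustration.
INPUT_LIMITS = {
    "claude": 115_000,
    "gemini": 115_000,
    "openai-o-series": 43_000,
    "openai-gpt-series": 52_000,
}

def estimate_tokens(text: str) -> int:
    """Crude estimate: roughly 4 characters per token."""
    return len(text) // 4

def fits_input_window(texts: list[str], model: str) -> bool:
    """True if the combined estimated tokens fit the model's input window."""
    total = sum(estimate_tokens(t) for t in texts)
    return total <= INPUT_LIMITS[model]

# Example: a small snippet easily fits any of the listed windows.
print(fits_input_window(["print('hello world')\n"], "openai-o-series"))  # True
```

A real integration would use the model's own tokenizer for accurate counts; the point here is simply that the per-model input budgets differ by roughly 2–3x, which matters when @mentioning large file sets.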

Learn more about context windows in our docs.
