Supported LLMs

Chat and Prompts

Cody supports a variety of cutting-edge large language models for use in chat and prompts, allowing you to select the best model for your use case.

In newer versions of Sourcegraph Enterprise (v5.6 and later), adding support for new models and providers is even easier; see Model Configuration for more information.
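With model configuration, administrators declare available models and defaults in the site configuration. The sketch below illustrates the general shape of such a block; the exact key names and model reference strings are assumptions here, so verify them against the Model Configuration documentation before use:

```json
{
  "modelConfiguration": {
    // Use Sourcegraph-provided models as the base set.
    "sourcegraph": {},
    // Pick which models Cody uses by default for each task.
    // Model references follow a "provider::api-version::model" pattern
    // (illustrative values; check your instance's available models).
    "defaultModels": {
      "chat": "anthropic::2024-10-22::claude-3-5-sonnet-latest",
      "fastChat": "anthropic::2024-10-22::claude-3-5-haiku-latest",
      "codeCompletion": "fireworks::v1::deepseek-coder-v2-lite-base"
    }
  }
}
```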
| Provider  | Model                                        | Free | Pro | Enterprise |
|-----------|----------------------------------------------|------|-----|------------|
| OpenAI    | GPT-4 Turbo                                  | -    | ✅  | ✅         |
| OpenAI    | GPT-4o                                       | -    | ✅  | ✅         |
| OpenAI    | GPT-4o-mini                                  | ✅   | ✅  | ✅         |
| OpenAI    | o3-mini-medium (experimental)                | ✅   | ✅  | ✅         |
| OpenAI    | o3-mini-high (experimental)                  | -    | -   | ✅         |
| OpenAI    | o1                                           | -    | ✅  | ✅         |
| Anthropic | Claude 3.5 Haiku                             | ✅   | ✅  | ✅         |
| Anthropic | Claude 3.5 Sonnet                            | ✅   | ✅  | ✅         |
| Anthropic | Claude 3.7 Sonnet                            | -    | ✅  | ✅         |
| Google    | Gemini 1.5 Pro                               | ✅   | ✅  | ✅ (beta)  |
| Google    | Gemini 2.0 Flash                             | ✅   | ✅  | ✅         |
| Google    | Gemini 2.0 Flash-Lite Preview (experimental) | ✅   | ✅  | ✅         |
To use Claude 3 Sonnet models with Cody Enterprise, make sure you've upgraded your Sourcegraph instance to the latest version. Note that Claude 3.7 Sonnet with thinking is not supported for BYOK deployments.

Autocomplete

Cody uses a set of autocomplete models optimized for the low-latency use case.

| Provider     | Model             | Free | Pro | Enterprise |
|--------------|-------------------|------|-----|------------|
| Fireworks.ai | DeepSeek-Coder-V2 | ✅   | ✅  | ✅         |
| Fireworks.ai | StarCoder         | -    | -   | ✅         |
| Anthropic    | Claude Instant    | -    | -   | ✅         |
The default autocomplete model for Cody Free, Pro, and Enterprise users is DeepSeek-Coder-V2.
The DeepSeek model used by Sourcegraph is hosted by Fireworks.ai as a single-tenant service in a US-based data center. For more information, see our Cody FAQ.