Model Configuration Examples
This section includes examples of how to configure Cody to use Sourcegraph-provided models with `modelConfiguration`.
Sourcegraph-provided models and BYOK (Bring Your Own Key)
By default, Sourcegraph is aware of several models from the following providers:
- "anthropic"
- "google"
- "fireworks"
- "mistral"
- "openai"
Override configuration of a model provider
Instead of Sourcegraph using its own servers to make LLM requests, you can bring your own API keys for a given model provider. For example, if you want all Anthropic API requests to go directly to your own Anthropic account using your own API key, rather than through Sourcegraph's servers, you can override the `anthropic` provider's configuration:
JSON{ "cody.enabled": true, "modelConfiguration": { "sourcegraph": {}, "providerOverrides": [ { "id": "anthropic", "displayName": "Anthropic BYOK", "serverSideConfig": { "type": "anthropic", "accessToken": "token", "endpoint": "https://api.anthropic.com/v1/messages" } } ], "defaultModels": { "chat": "anthropic::2024-10-22::claude-3.5-sonnet", "fastChat": "anthropic::2023-06-01::claude-3-haiku", "codeCompletion": "fireworks::v1::deepseek-coder-v2-lite-base" } }
In the configuration above:
- Enable Sourcegraph-provided models and do not set any overrides (note that `"modelConfiguration.modelOverrides"` is not specified)
- Route requests for Anthropic models directly to the Anthropic API (via the provider override specified for `"anthropic"`)
- Route requests for other models (such as the Fireworks model used for `"autocomplete"`) through Cody Gateway
Partially override provider config in the namespace
If you want to override the provider config for only some models in a namespace, you can route requests for those models directly to the LLM provider (bypassing Cody Gateway) while the remaining models in that namespace continue to use the Sourcegraph-configured provider config.
Example configuration
JSON{ "cody.enabled": true, "modelConfiguration": { "sourcegraph": {}, "providerOverrides": [ { "id": "anthropic-byok", "displayName": "Anthropic BYOK", "serverSideConfig": { "type": "anthropic", "accessToken": "token", "endpoint": "https://api.anthropic.com/v1/messages" } } ], "modelOverrides": [ { "modelRef": "anthropic-byok::2023-06-01::claude-3.5-sonnet", "displayName": "Claude 3.5 Sonnet", "modelName": "claude-3-5-sonnet-latest", "capabilities": ["edit", "chat"], "category": "accuracy", "status": "stable", "contextWindow": { "maxInputTokens": 45000, "maxOutputTokens": 4000 } }, ], "defaultModels": { "chat": "anthropic-byok::2023-06-01::claude-3.5-sonnet", "fastChat": "anthropic::2023-06-01::claude-3-haiku", "codeCompletion": "fireworks::v1::deepseek-coder-v2-lite-base" } }
In the configuration above, we:
- Enable Sourcegraph-supplied models (the `sourcegraph` field is not empty or `null`)
- Define a new provider with the ID `"anthropic-byok"` and configure it to use the Anthropic API
- Since this provider is unknown to Sourcegraph, no Sourcegraph-supplied models are available for it. Therefore, we add a custom model in the `"modelOverrides"` section
- Use the custom model configured in the previous step (`"anthropic-byok::2023-06-01::claude-3.5-sonnet"`) for `"chat"`. Requests for this model are sent directly to the Anthropic API, as set in the provider override
- For `"fastChat"` and `"autocomplete"`, use Sourcegraph-provided models via Cody Gateway
Config examples for various LLM providers
Below are configuration examples for setting up various LLM providers using BYOK. These examples apply whether or not you are using Sourcegraph-provided models.
- In this section, all configuration examples have Sourcegraph-provided models disabled. Refer to the previous section to use a combination of Sourcegraph-provided models and BYOK.
- Ensure that at least one model is available for each Cody feature (`"chat"` and `"autocomplete"`), regardless of the provider and model overrides configured. To verify this, view the configuration and confirm that appropriate models are listed in the `"defaultModels"` section.
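To make the shape of these examples concrete before diving into provider-specific settings, here is an illustrative sketch of a BYOK-only configuration. It assumes that setting `sourcegraph` to `null` disables Sourcegraph-provided models; the access token and model fields are placeholders, and the exact capability values accepted may depend on your Sourcegraph version:

```json
{
  "cody.enabled": true,
  "modelConfiguration": {
    // With Sourcegraph-provided models disabled, every model must be
    // defined explicitly via the overrides below.
    "sourcegraph": null,
    "providerOverrides": [
      {
        "id": "anthropic",
        "displayName": "Anthropic BYOK",
        "serverSideConfig": {
          "type": "anthropic",
          "accessToken": "token",
          "endpoint": "https://api.anthropic.com/v1/messages"
        }
      }
    ],
    "modelOverrides": [
      {
        "modelRef": "anthropic::2023-06-01::claude-3.5-sonnet",
        "displayName": "Claude 3.5 Sonnet",
        "modelName": "claude-3-5-sonnet-latest",
        "capabilities": ["chat", "edit", "autocomplete"],
        "category": "accuracy",
        "status": "stable",
        "contextWindow": {
          "maxInputTokens": 45000,
          "maxOutputTokens": 4000
        }
      }
    ],
    // Each Cody feature needs a default model; with Sourcegraph-provided
    // models disabled, every entry must reference a model defined above.
    "defaultModels": {
      "chat": "anthropic::2023-06-01::claude-3.5-sonnet",
      "fastChat": "anthropic::2023-06-01::claude-3.5-sonnet",
      "codeCompletion": "anthropic::2023-06-01::claude-3.5-sonnet"
    }
  }
}
```

A single model backs all three features here purely for brevity; in practice you would typically point `"fastChat"` and `"codeCompletion"` at smaller, faster models from your provider.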