Supported local Ollama models with Cody
Compatibility with Ollama is currently in the Experimental stage and is available for Cody Free and Pro plans. Support for Ollama is limited; feel free to contact us with any questions or feedback.
Cody Autocomplete with Ollama
To get autocomplete suggestions from Ollama locally, follow these steps:
- Install and run Ollama
- Download one of the supported local models with `ollama pull`. The `pull` command downloads models from the Ollama library to your local machine:
  - `ollama pull deepseek-coder-v2` for deepseek-coder
  - `ollama pull codellama:13b` for codellama
  - `ollama pull starcoder2:7b` for starcoder2
- Update Cody's VS Code settings to use the `experimental-ollama` autocomplete provider and configure the right model:

```json
"cody.autocomplete.advanced.provider": "experimental-ollama",
"cody.autocomplete.experimental.ollamaOptions": {
  "url": "http://localhost:11434",
  "model": "deepseek-coder-v2"
}
```
- Confirm Cody uses Ollama by looking at the Cody output channel or the autocomplete trace view (in the command palette)
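If you want to confirm the pull succeeded outside of Cody, Ollama also exposes its model list over HTTP via the `GET /api/tags` endpoint on the same URL configured above. A minimal Python sketch (the `installed_models` and `check_model` helper names are illustrative, not part of Cody):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # same URL as in the Cody settings above

def installed_models(tags_json: str) -> list[str]:
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in json.loads(tags_json).get("models", [])]

def check_model(model: str) -> bool:
    """Return True if `model` is already pulled on the local Ollama server."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        return any(name.startswith(model) for name in installed_models(resp.read()))

# Example /api/tags payload shape (abridged):
sample = '{"models": [{"name": "deepseek-coder-v2:latest"}]}'
print(installed_models(sample))  # → ['deepseek-coder-v2:latest']
```

If the model you configured in `cody.autocomplete.experimental.ollamaOptions` does not appear in this list, autocomplete requests will fail until you pull it.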
Cody chat with Ollama

To chat with Ollama locally, follow these steps:
- Download Ollama
- Start Ollama (make sure the Ollama logo is showing up in your menu bar)
- Select a chat model (a model whose name includes `instruct` or `chat`, for example, `gemma:7b-instruct-q4_K_M`) from the Ollama Library
- Pull (download) the chat model locally (for example, `ollama pull gemma:7b-instruct-q4_K_M`)
- Once the chat model is downloaded successfully, open Cody in VS Code
- Open a new Cody chat
- In the new chat panel, you should see the chat model you've pulled in the dropdown list
- Currently, you will need to restart VS Code to see the new models
You can run `ollama list` in your terminal to see what models are currently
available on your machine.
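Under the hood, chat models pulled this way are served by Ollama's `POST /api/chat` endpoint on `http://localhost:11434`. As a rough sketch of what a client's request body looks like (the `chat_request` helper is illustrative; this is not Cody's actual code):

```python
import json

def chat_request(model: str, prompt: str) -> bytes:
    """Build a JSON body for Ollama's POST /api/chat endpoint."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # set True to stream tokens as they are generated
    }).encode()

body = chat_request("gemma:7b-instruct-q4_K_M", "Explain this function")
# POST `body` to http://localhost:11434/api/chat to get the model's reply
```

The `model` field must match a name from `ollama list`, which is why a model only shows up in Cody's dropdown after it has been pulled.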
Run Cody offline with local Ollama models
You can use Cody with or without an internet connection. The offline mode does not require you to sign in with your Sourcegraph account to use Ollama. Click the button below the Ollama logo and you'll be ready to go.

You still have the option to switch to your Sourcegraph account whenever you want to use Claude, OpenAI, Gemini, etc.