Cody for JetBrains v6.0.14: Claude 3.5 Sonnet, Gemini 1.5, experimental Ollama support, and more
The Cody plugin has new models from Anthropic and Google, better feature discoverability, updated UI hints, experimental Ollama support, and more.

Cody for JetBrains v6.0.14 is now available! This plugin version has new models from Anthropic and Google, better feature discoverability, updated UI hints, experimental Ollama support, and more.
Cody for JetBrains now includes new flagship models from both Anthropic and Google: Claude 3.5 Sonnet and Gemini 1.5.
These models are available for chat and commands for all Cody users, including Free, Pro, and Enterprise.
We also recently upgraded Cody's free tier, and those changes are live for all users. Cody Pro is still available for $9/month for developers who want unlimited chat and commands plus access to the best flagship models.
You can now discover Cody actions more easily using the Search Everywhere feature in JetBrains.
Press Shift twice to open the Search Everywhere window. Then, type in the Cody: prefix to get a list of all supported Cody actions.

When you highlight a code selection, Cody provides an inline hint (Ctrl + Alt + Enter to Edit) next to your selection. This hint is now shown below the last line of the selection to be less intrusive. It also isn't shown when you only select a single line of code, which prevents it from appearing when you're not intentionally editing a code selection (such as when you use Find/Replace and highlight multiple single-line selections).
You can also turn off this UI hint in the Cody settings.

We understand that hitting rate limits can be frustrating, especially when it isn't obvious that you've reached them. For clarity, we've added notifications in the status bar to indicate when you've hit the chat and commands rate limit.

You can now power Cody's chat and commands with Ollama models running on your local machine. This lets you chat without sending messages over the internet to an LLM provider, so you can use Cody offline and code with an AI assistant wherever you are, even on an airplane!
You'll need to install Ollama and download a chat model such as CodeGemma or Llama 3. The README provides full setup instructions. Ollama support is currently available on the Free and Pro tiers.
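For a quick start, the setup looks roughly like this; the install script URL and model tags are standard Ollama defaults, but check the README linked above for the plugin-specific configuration:

```shell
# Install Ollama (macOS/Linux one-liner; see https://ollama.com for other platforms)
curl -fsSL https://ollama.com/install.sh | sh

# Download a chat model for Cody to use locally
ollama pull codegemma     # or: ollama pull llama3

# Ollama serves its local API on http://localhost:11434 by default;
# verify the model is available before pointing Cody at it
ollama list
```

Once a model is downloaded and Ollama is running, it should appear in Cody's model selector alongside the cloud models.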
See the changelog and GitHub releases for a complete list of changes.
Cody wouldn't be what it is without our amazing contributors. A big thank you to everyone who contributed, filed issues, and sent us feedback.
As always, we value your feedback in our support forum and on Discord. Happy Codying!
