Cody for JetBrains v5.5.2 release
Cody for JetBrains v5.5.2 is now available, including fixes for autocomplete formatting bugs, a new chat export function, and Claude 3 Sonnet for Cody Free users.
Read more
Cody for VS Code v1.12.0 is now available. This release brings Claude 3 Sonnet to Cody Free users as the new default model plus several improvements for context handling.
Read more
Cody empowers support engineers to unblock themselves and solve complex issues autonomously by leveraging its capabilities for documentation retrieval, error detection, script writing, and infrastructure explanation.
Read more
Learn how to use local LLM models, powered by Ollama, to chat with Cody without an internet connection.
Read more
Cody for VS Code v1.10.0 is now available. This release includes support for Claude 3 Haiku and several improvements to doc string generation and debugging.
Read more
Charles Goode, an electrical engineering student at Kennesaw State University, shares his experience using Cody to rapidly implement a GPS data logging system for the university's Formula SAE racing team, Kennesaw Motorsports.
Read more
The latest release of Sourcegraph's Cody plugin for JetBrains brings exciting new features and improvements, including Claude 3 support, intelligent file mentioning, better error reporting, and enhanced remote context for Cody Enterprise users.
Read more
Cody for VS Code v1.8.0 is now available and includes support for Claude 3, local Ollama models, @-mentioning line numbers, keybindings for custom commands, and automatic updating of the local search index.
Read more
No Internet? No problem. Learn how to use Ollama with Cody for VS Code to get local code completion.
Read more
This release adds several keyboard shortcuts, enables login in VSCodium, reduces autocomplete latency, and fixes issues with chat stealing editor focus and with file range display.
Read more
An informal guide for digital nomads, showcasing how Termux and Cody turn your tablet into a beach-friendly coding hub. Ditch the laptop for a lightweight setup with Termux's Linux shell, Visual Studio Code's web interface, and Cody.
Read more
Cody gets even better with multi-repo context support, faster completions, improved commands, and much more. Read on for all the details.
Read more
We're proud to announce Cody Enterprise, a significant milestone for Cody that helps bridge the gap between realizing the potential of AI coding assistants and meeting the unique needs of enterprises.
Read more
Context is key for AI coding assistants. Cody uses several methods of context fetching to provide answers and code relevant to enterprise-scale codebases.
Read more
Sourcegraph 5.3 includes security-focused features for Cody along with multi-repo context. Code Search also receives a new search results UX.
Read more
Cody for VS Code v1.4.0 is now available and includes a completely reworked Generate Unit Tests command, a new code editing menu, Ask Cody to Explain support for terminal output, easier access to Cody commands from chat, faster autocomplete, and more.
Read more
Cody for VS Code v1.2.0 is now available and includes updated chat history navigation, redesigned chat message editing, improved chat context limit handling, and a number of other bug fixes and improvements.
Read more
In this tutorial, we will build a React app that generates tweets from a given keyword using GPT-3. We will use Cody as our AI assistant and learn five tips to enhance our React development experience with AI.
Read more
So, there's this function. It's called a lot. More importantly, all those calls are on the critical path of a key user interaction. Let's talk about making it fast.
Read more
Introducing Cody v1.1.0: an enhanced Edit Code command with contextual awareness, offline autocomplete via Code Llama integration, and improved chat @-mentions, all aimed at boosting coding productivity in VS Code.
Read more