Installing Cody in Visual Studio
Learn how to use Cody and its features with the Visual Studio editor.
The Cody extension for Visual Studio enhances your coding experience by providing intelligent, contextually aware answers to your questions. This guide walks you through installing and setting up Cody in your Visual Studio editor.
Prerequisites
- You have the latest version of Visual Studio installed
- You have a Free or Pro account via Sourcegraph.com or a Sourcegraph Enterprise account
Install the Visual Studio extension
- Download the Cody extension for Visual Studio from the GitHub repository
- Run the installer and follow the prompts to install the extension
- Once installed, go to Extensions > Manage Extensions in Visual Studio and search for Cody in the Installed list to confirm the installation
Connect the extension to Sourcegraph
After a successful installation, go to Tools in the main menu bar and click Cody Chat in the drop-down. This opens a dialog box for connecting to your Sourcegraph instance.
Cody Free or Pro users can sign in to their Sourcegraph.com accounts through GitHub, GitLab, or Google. Sourcegraph Enterprise users should connect Cody via their Enterprise instance URL and an access token.
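For the Enterprise flow, the two values look roughly like the following (both are placeholders; access tokens are generated from your user settings on your Sourcegraph instance):

```
Sourcegraph instance URL: https://sourcegraph.example.com
Access token:             sgp_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```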
Complete these steps, and you'll be ready to start using Cody in Visual Studio.
Chat
Cody in Visual Studio allows you to ask questions about your code and get contextually aware answers. The chat window is available in a unified interface next to your code. All your previous and existing chats are stored for later use and can be accessed via the History icon in the top menu. You can download them as a .json file to share or use later, or delete them.
The chat input field comes with default @-mention context chips. These are automatically populated with the names of the files you have open in your editor. There is also a drop-down for LLM selection and a button to run pre-built prompts and commands.
LLM selection
Cody offers a variety of large language models (LLMs) to power your chat experience. Cody Pro users can select the LLM they want to use for chat and experiment to choose the best model for the job. Choose from Claude 3.5 Sonnet, Claude 3 Opus, Claude 3 Sonnet, Claude 3 Haiku, GPT-4o, GPT-4 Turbo, GPT-3.5 Turbo, Google Gemini 1.5 Pro, Gemini 1.5 Flash, and Mixtral, while Cody Free users have access to the latest base models from Anthropic, OpenAI, Google, and Mixtral.
Local models are also available through Ollama to Cody Free and Cody Pro users. To use a model in Cody chat, simply download it and run it in Ollama.
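A minimal sketch of that workflow from a terminal, assuming Ollama is already installed and using llama3 purely as an example model name:

```
ollama pull llama3   # download an example model
ollama serve         # start the local Ollama server if it isn't already running
```

Once the model is running locally, it should be selectable from the LLM drop-down in Cody's chat (exact steps may vary by Cody version).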
Administrators for Sourcegraph Enterprise instances can also choose between Claude and GPT models to set for their teams.
Selecting Context with @-mentions
Cody's chat allows you to add files and symbols as context in your messages.
- Type `@-file` and then a filename to include a file as context
- Type `@#` and then a symbol name to include the symbol's definition as context. Functions, methods, classes, types, etc., are all symbols
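For example, the following chat inputs (the file and symbol names are hypothetical) add a file and a symbol as context:

```
@-file OrderService.cs  What does this class do?
@#CalculateTotal  Where is this method defined and used?
```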
Context retrieval
When you start a new Cody chat, the chat input window opens with default @-mention context chips for all the context it intends to use. This context is based on your current repository and current file (or a file selection if you have code highlighted).
At any point, you can edit these context chips or remove them completely if you do not want to use them as context. A chat without any context chips instructs Cody to use no codebase context. However, you can always provide an alternate @-mention file or symbol for Cody to use as a new context source.
When you have both a repository and files @-mentioned, Cody will search the repository for context while prioritizing the mentioned files.
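For instance, with both a repository chip and a file mention in place, a question like the following searches the repository while prioritizing the mentioned file (the names below are hypothetical, and in practice the mentions are selected from the @-mention picker rather than typed free-form):

```
@acme/webapp @-file docs/setup.md  How do I configure the database connection?
```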
Prompts
Cody offers a variety of pre-built prompts and commands to help you get the most out of your chat experience. You can access these prompts and commands from the chat input field. Using one of these, you can ask Cody to:
- Edit your code
- Document your code
- Generate unit tests