Installing Cody in Eclipse
Learn how to use Cody and its features with the Eclipse editor.
The Cody extension for Eclipse enhances your coding experience by providing intelligent, contextually aware answers to your questions. This guide walks you through installing and setting up Cody in the Eclipse editor.
Prerequisites
- You have a supported version of the Eclipse IDE installed
- You have a Free or Pro account via Sourcegraph.com or a Sourcegraph Enterprise account
Install the Eclipse extension
- Inside Eclipse, go to Help > Install New Software
- Next, add the site URL https://sourcegraph.github.io/eclipse
- After adding this URL, you should see the Cody category in the list of available plugins
- Click Next and follow the installation instructions
- After you have completed the installation and restarted Eclipse, you should see the Cody view in the Window > Show View > Other menu
Connect the extension to Sourcegraph
After a successful installation, open the Cody view. You should see a button to sign into your Sourcegraph account.
Cody Free and Pro users can sign in with their Sourcegraph.com accounts, while Sourcegraph Enterprise users should connect Cody to their Enterprise instance using its URL and an access token.
Complete these steps, and you'll be ready to use Cody chat in Eclipse.
Chat
Cody in Eclipse allows you to ask questions about your code and get contextually aware answers. The chat window is available in a unified interface next to your code. All your previous chats are stored for later use and can be accessed via the History icon in the top menu. You can delete them, or download them as a .json file to share or reuse later.
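Because the export is plain JSON, it can be inspected with any JSON tooling. Below is a minimal sketch in Python that assumes a hypothetical export shape (a list of chats, each with a list of messages); the real export schema may differ.

```python
import json

# Hypothetical shape of a Cody chat export: a list of chats, each
# containing messages with "speaker" and "text" fields. This schema
# is illustrative only, not the documented export format.
sample_export = json.dumps([
    {"id": "chat-1", "messages": [
        {"speaker": "human", "text": "What does this function do?"},
        {"speaker": "assistant", "text": "It parses the config file."},
    ]}
])

chats = json.loads(sample_export)
for chat in chats:
    print(chat["id"], "has", len(chat["messages"]), "messages")
```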
The chat input field has default @-mention context chips, automatically populated with the names of the files you have open in your editor. There is also a drop-down for LLM selection and a button to run pre-built prompts.
LLM selection
Cody offers a variety of large language models (LLMs) to power your chat experience. Cody Free users can access the latest base models from Anthropic, OpenAI, Google, and Mixtral, while Cody Pro and Enterprise users can access an extended set of models.
Local models are also available through Ollama to Cody Free and Cody Pro users. To use a model in Cody chat, simply download it and run it in Ollama.
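As a sketch of that setup step, assuming the Ollama CLI is already installed locally (the model name llama3 below is just an example; any model from the Ollama library works the same way):

```shell
# Download an example model from the Ollama library
ollama pull llama3

# Run it locally so Cody can offer it as a chat model
ollama run llama3
```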
You can read more about it in our Supported LLM models docs.
Selecting Context with @-mentions
Cody's chat allows you to add files and symbols as context in your messages.
- Type @-file and then a filename to include a file as context
- Type @# and then a symbol name to include the symbol's definition as context. Functions, methods, classes, types, etc., are all symbols
Context retrieval
When you start a new Cody chat, the chat input opens with default @-mention context chips for all the context Cody intends to use. This context is based on your current repository and current file (or a selection, if you have code highlighted).
At any point, you can edit these context chips or remove them entirely if you do not want to use them. A chat without any context chips instructs Cody to use no codebase context. However, you can always @-mention alternate files or symbols for Cody to use as a new context source.
When you have both a repository and files @-mentioned, Cody will search the repository for context while prioritizing the mentioned files.
Prompts
Cody offers a variety of pre-built prompts to help you get the most out of your chat experience. You can access these prompts from the chat input field.