Cody supports additional context through Anthropic's Model Context Protocol

Chris Sev

Today marks a significant milestone in AI-assisted development: Anthropic has released the Model Context Protocol (MCP), an open standard for connecting AI models with external data. We're proud to announce that Sourcegraph is one of the first tools to support it. This integration opens up new possibilities for getting extra context into your editor.

For example, you can now get GitHub or Linear issues, connect to your Postgres database, and access internal documentation without leaving your IDE.

Here's an example of Cody connecting to a Postgres database to write a Prisma query after looking at the database schema:

Cody writing a Prisma query

What is Model Context Protocol?

MCP is Anthropic's new protocol that enables users to provide additional context to LLM-powered applications like Claude.ai. Think of it as a standardized way to feed external information into AI models, making them more aware of your specific use case and environment.

To get started with MCP, you would create an MCP server that connects to the data sources you want to use, and then an MCP client that connects to that server. The client could be Claude.ai, or you can connect Cody and your editor to your MCP server via OpenCtx.

As launch partners with Anthropic, we've ensured that Cody can seamlessly integrate with MCP, bringing this additional context right into your editor where you need it most.

What can you bring into Cody using MCP?

Anthropic has released several example MCP servers that show how to connect to various data sources. Cody supports all of these example servers out of the box, and it can also use your own MCP server, which we'll cover later.

  • Brave Search - Search the Brave search API
  • Postgres - Connect to your Postgres databases to query schema information and write optimized SQL
  • Filesystem - Access files on your local machine
  • Everything - A demo server showing MCP capabilities
  • Google Drive - Search and access your Google Drive documents
  • Google Maps - Get directions and information about places
  • Memory - Persistent memory backed by a knowledge graph
  • Git - Get git history and commit information
  • Puppeteer - Control headless Chrome for web automation
  • SQLite - Query SQLite databases

The beauty of MCP lies in its universality. Once you build an MCP server, it becomes a source of context for multiple tools - not just Cody. Here's how it works:

  1. Your MCP server provides structured context through a standardized protocol.
  2. Cody connects to this server through OpenCtx, our open standard for external context.
  3. The context becomes available in your editor through Cody chat.

Trying out the MCP to Cody integration

To get started with MCP, let's clone Anthropic's example servers and try them out. The servers run locally on your machine, and the Cody client connects to them.

  1. Clone the example-servers repository
  2. Install the dependencies: npm install
  3. Build the project: npm run build
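The steps above as shell commands (the repository URL is an assumption; substitute wherever you cloned the example servers from):

```shell
# Clone Anthropic's example servers (repo URL assumed; adjust if yours differs)
git clone https://github.com/modelcontextprotocol/servers.git example-servers
cd example-servers

# Install dependencies and compile each server into its build/ directory
npm install
npm run build
```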

This will create a build directory with the compiled server code. We will be pointing Cody to the servers in this directory.

Add the following to your VS Code JSON settings:

{
  "openctx.providers": {
    "https://openctx.org/npm/@openctx/provider-modelcontextprotocol": {
      "nodeCommand": "node",
      "mcp.provider.uri": "file:///PATH_TO_YOUR_EXAMPLE_SERVERS_FOLDER/example-servers/everything/build/index.js"
    }
  }
}

Now you can open Cody and you'll see the example-servers/everything server available as a context provider. This server acts as an example to show the capabilities of MCP.

Cody showing the Everything MCP server

The Postgres MCP server

You can try out the Postgres MCP server by updating your JSON settings to point to that example server.

{
  "openctx.providers": {
    "https://openctx.org/npm/@openctx/provider-modelcontextprotocol": {
      "nodeCommand": "node",
      "mcp.provider.uri": "file:///PATH_TO_YOUR_EXAMPLE_SERVERS_FOLDER/example-servers/postgres/build/index.js",
      "mcp.provider.args": ["postgresql://root:@127.0.0.1:5433/my-database"]
    }
  }
}

Notice that we added an mcp.provider.args field containing the connection string for the Postgres database. Cody can now connect to the database and use it as context in your editor.

Cody using Postgres

Building a Linear MCP integration

Want to create your own MCP server? Here's a quick tutorial. The Model Context Protocol team has created a Python SDK and a TypeScript SDK that make it easy to build your own context server.

Here's an example of a Node.js server, built with the TypeScript SDK, that accesses Linear issues. We will build it in the example-servers folder that you cloned earlier. In addition to that folder, you will need:

  • A Linear API key
  • Cody installed in your editor
  • A new file to create your MCP server: example-servers/src/linear/index.ts

Here is the full code to create an MCP server that will access Linear issues:

// example-servers/src/linear/index.ts
 
// Import required dependencies from MCP SDK and Linear SDK
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  ListResourcesRequestSchema,
  ReadResourceRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";
import { LinearClient } from "@linear/sdk";
 
// Get Linear API key from command line arguments
const args = process.argv.slice(2);
if (args.length === 0) {
  console.error("Please provide a Linear API key as a command-line argument");
  process.exit(1);
}
 
// Initialize Linear client with API key
const apiKey = args[0];
const linearClient = new LinearClient({
  apiKey,
});
 
// Create new MCP server instance with basic configuration
const server = new Server(
  {
    name: "example-servers/linear",
    version: "0.1.0",
  },
  {
    capabilities: {
      resources: {},
      tools: {},
    },
  }
);
 
// Handler for listing assigned issues
server.setRequestHandler(ListResourcesRequestSchema, async (request) => {
  const viewer = await linearClient.viewer;
  const issues = await viewer.assignedIssues();
 
  return {
    resources: issues.nodes.map((issue) => ({
      uri: `linear://${issue.id}`,
      name: issue.title,
      mimeType: "application/json",
    })),
    nextCursor: issues.pageInfo.endCursor,
  };
});
 
// Handler for reading detailed information about a specific issue
server.setRequestHandler(ReadResourceRequestSchema, async (request) => {
  const issueId = request.params.uri.replace("linear://", "");
  const issue = await linearClient.issue(issueId);
 
  const issueData = {
    id: issue.id,
    title: issue.title,
    description: issue.description,
    status: await issue.state,
    assignee: await issue.assignee,
    createdAt: issue.createdAt,
    updatedAt: issue.updatedAt,
  };
 
  return {
    contents: [
      {
        uri: request.params.uri,
        mimeType: "application/json",
        text: JSON.stringify(issueData, null, 2),
      },
    ],
  };
});
 
// Start the server using stdio transport
const transport = new StdioServerTransport();
await server.connect(transport);

You'll notice that this server registers handlers for two request types: ListResourcesRequestSchema and ReadResourceRequestSchema. The list-resources handler returns the list of resources the server can provide context for, and the read-resource handler returns the contents of a given resource.
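To make that flow concrete, here is a dependency-free sketch of the data shapes those two handlers exchange. The issue values are hypothetical examples, not real Linear data:

```typescript
// Plain-data sketch of the two handler responses (no SDK required).
// Issue data below is hypothetical, for illustration only.

type ResourceRef = { uri: string; name: string; mimeType: string };

// Mirrors the list-resources handler: one resource per assigned issue
function listResources(issues: { id: string; title: string }[]): ResourceRef[] {
  return issues.map((issue) => ({
    uri: `linear://${issue.id}`,
    name: issue.title,
    mimeType: "application/json",
  }));
}

// Mirrors the read-resource handler: strip the scheme to recover the issue id
function issueIdFromUri(uri: string): string {
  return uri.replace("linear://", "");
}

const refs = listResources([{ id: "ENG-42", title: "Fix flaky login test" }]);
console.log(refs[0].uri); // linear://ENG-42
console.log(issueIdFromUri(refs[0].uri)); // ENG-42
```

The linear:// URI scheme is just a convention: list-resources advertises each issue under a stable URI, and read-resource inverts that URI back into an issue ID it can look up.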

Run the build command again (npm run build) to compile your new server into an index.js file that we can point Cody to.

Now that you have this new index.js file for your MCP server, connect it to Cody using OpenCtx. Add this to your VS Code JSON settings:

{
  "openctx.providers": {
    "https://openctx.org/npm/@openctx/provider-modelcontextprotocol": {
      "nodeCommand": "node",
      "mcp.provider.uri": "file:///PATH_TO_YOUR_EXAMPLE_SERVERS_FOLDER/example-servers/linear/build/index.js",
      "mcp.provider.args": ["YOUR_LINEAR_API_KEY"]
    }
  }
}

Make sure to add your Linear API key to the mcp.provider.args field. Once you've added the settings, you'll see the Linear context provider in Cody's list of available providers.

Cody showing the Linear MCP server

What's next?

The combination of Anthropic's Model Context Protocol and Cody opens up endless possibilities for enhancing your development environment with relevant context. Whether you're building internal tools, accessing documentation, or connecting to external services, MCP provides a standardized way to bring that information right into your editor.

We're excited to see what the developer community builds with this integration. Have ideas for MCP servers? Share them with us! The future of context-aware coding is here, and it's more accessible than ever.

Ready to try it out? Install Cody and check out Anthropic's example MCP servers to get started. Happy coding!

Get Cody, the AI coding assistant

Cody makes it easy to write, fix, and maintain code.