AI coding assistants have unmistakably captured the imagination of developers, with the promise of speeding up coding tasks and helping them understand code in ways not possible before. Enterprises are also curious about the innovation and productivity gains AI can bring but have been cautious in their adoption so far.
In our decade of experience helping enterprises understand their large and complex codebases, we've learned that the key to addressing this caution is seamless integration with existing tools, a robust and thorough approach to security, and the ability to support enterprises and codebases of any size and sophistication. This is why we're excited to introduce Cody Enterprise, an AI coding assistant created specifically with enterprise requirements and scale in mind.
Flexibility and choice
The best enterprise AI coding assistant is one you can actually use in your current setup. We know that every enterprise has a unique way of working and tools they use, so interoperability and universality are at the core of how we have built Cody.
Universal support for code hosts
We’ve built Cody to work with your existing tech stack. It supports all major code hosts (GitLab, GitHub, Bitbucket, Gerrit, Perforce, and Azure DevOps), including multiple code hosts used in combination, because we believe you shouldn't need to upend your entire tooling stack to realize the benefits of AI. One reason Qualtrics, a global Experience Management (XM) company with over 1,000 software developers, chose Cody Enterprise for their developers is how seamlessly it worked with their GitLab implementation.
“We run our own GitLab instance within our own data centers, and Cody works seamlessly with it.”
-Godwin Babu, Sr. Manager, Qualtrics
You can read more about Qualtrics’ use of Cody Enterprise here.
Deployment choice
Cody Enterprise supports multiple deployment options and configurations to cater for different enterprise security and privacy requirements. Cody can be run in a Sourcegraph-managed, single-tenant cloud instance, but also supports self-hosted deployment so it can live in your own data centers alongside your self-hosted code hosts.
Choose your LLM
Large Language Models (LLMs) are launching and improving at a rapid pace, and each has its own strengths and ideal use cases. To set enterprises up for success in an ever-changing market, we’ve built Cody with LLM choice in mind rather than locking you into a single proprietary LLM.
Cody allows enterprises to choose from several LLM options, such as Anthropic Claude 2 and OpenAI GPT-4, and to deploy them in a completely secure and private environment within an Azure VNet or AWS VPC via “bring-your-own-key” services like Azure OpenAI and Amazon Bedrock. Leidos, a Fortune 500 innovation company rapidly addressing the world’s most vexing challenges in national security and health, adopted Cody because of the LLM flexibility it provides:
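As a rough sketch of what this looks like in practice, a site administrator points Cody at the chosen provider in the Sourcegraph site configuration. The exact field names below are illustrative and may vary by version and provider; consult the current Cody Enterprise documentation for the authoritative schema.

```json
{
  "cody.enabled": true,
  "completions": {
    "provider": "azure-openai",
    "chatModel": "gpt-4",
    "endpoint": "https://acmecorp-openai.openai.azure.com",
    "accessToken": "<set via secret management, not checked into config>"
  }
}
```

Because the provider is a configuration detail rather than a hard-coded dependency, switching to a different LLM later is a config change instead of a migration.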
“Generative AI is a fast-moving field, and the best model that’s out there today may not be the best model tomorrow. Something better could come out tomorrow. With a lot of solutions, you’re locked into an LLM and putting a lot of faith in that model to keep up with the pace of change. Using Cody means we can avoid that LLM lock-in.”
-Rob Linger, AI Architect at Leidos
Today, we're announcing that StarCoder is available as an LLM choice for Cody Enterprise code completions. In our testing, StarCoder achieved the highest Completion Acceptance Rate (CAR%) of any available LLM on real-world, everyday code completions.
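The announcement does not spell out how CAR% is calculated, but intuitively it is the share of suggested completions that developers actually accept. A minimal sketch of that intuition, assuming CAR% is simply accepted completions over completions shown (the production metric may filter or weight events differently):

```python
def completion_acceptance_rate(accepted: int, shown: int) -> float:
    """Return CAR%: the percentage of displayed completions a developer accepted.

    Illustrative definition only; the exact events counted in practice
    (e.g. partial acceptance, dismissed suggestions) may differ.
    """
    if shown == 0:
        return 0.0
    return 100.0 * accepted / shown
```

For example, a session with 236 accepted completions out of 1,000 shown yields a CAR of 23.6%. A higher CAR% means the model's suggestions more often match what the developer was about to write.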