Cody for VS Code v1.1.0 release
We're thrilled to announce the release of Cody v1.1.0, the latest update to our VS Code extension. This release is packed with new features and improvements designed to streamline coding workflows and enhance productivity.
Edit Code Command with Contextual Awareness
Cody now has an enhanced Edit Code command. With this update, you can add extra context when prompting Cody to make direct edits to your code, which helps Cody better understand your codebase so the edits it makes are more accurate. To do this, simply use the "@" symbol to include entire files or "@#" to include specific symbols. This lets Cody understand your code in a more holistic way, leading to more accurate, context-aware edits.
Autocomplete with Code Llama Integration
In our ongoing effort to expand AI-assisted coding, we're excited to unveil an experimental feature: local autocomplete with Ollama. Using Ollama, developers can run large language models directly on their own machines, letting the Code Llama model power real-time, inline suggestions in your local VS Code workspace as you code.
Try it:
1. Download, install, and run Ollama.
2. Download the Code Llama model: ollama pull codellama:7b-code
3. Update Cody's VS Code settings to use the unstable-ollama autocomplete provider, as shown in the sketch below.
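For reference, here is a minimal settings.json sketch. It assumes the provider is selected through the cody.autocomplete.advanced.provider setting, which is how the experimental Ollama provider was enabled around this release; the exact key may differ in newer versions, so check the extension's settings documentation.

{
  // Experimental: serve autocomplete from a locally running Ollama instance
  "cody.autocomplete.advanced.provider": "unstable-ollama"
}

Ollama must already be running with the codellama:7b-code model pulled (steps 1 and 2 above) before inline suggestions will appear.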