Sourcegraph 5.7
Improvements
- Batch Changes Add a container registry deny list
  - Add a container registry deny list to complement the allow list.
  - Configured in site config via `batchChanges.containerRegistryDenylist`. Mutually exclusive with `batchChanges.containerRegistryAllowlist`.
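A minimal site-config fragment using the new setting. The changelog does not document the value shape; a list of registry hosts is an assumption here, and the host shown is a placeholder:

```json
{
  "batchChanges.containerRegistryDenylist": [
    "registry.example.com"
  ]
}
```

Because the two settings are mutually exclusive, remove `batchChanges.containerRegistryAllowlist` before adding the deny list.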
- Cody Add `/.api/cody/context` API
  - New `POST /.api/cody/context` REST endpoint to retrieve a list of relevant source locations given a natural language query.
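A sketch of calling the new endpoint with Python's `urllib`. The instance URL and token are placeholders, and the request-body field name (`query`) is an illustrative assumption, not a documented schema:

```python
import json
import urllib.request

SRC_ENDPOINT = "https://sourcegraph.example.com"  # placeholder instance URL
SRC_ACCESS_TOKEN = "sgp_example"                  # placeholder access token

# Hypothetical request body: the exact schema is not given in this
# changelog entry, so treat the field name as illustrative only.
payload = {"query": "how is repository permission syncing scheduled?"}

req = urllib.request.Request(
    url=f"{SRC_ENDPOINT}/.api/cody/context",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"token {SRC_ACCESS_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would return the list of relevant source locations.
```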
Fixes
- Code Search Remove query expansion
  - Fixes a bug where "readme" was added to the context too often. (Backport of 28ff196a663f537c6cb6340f976a91431509a90e from #582.)
- Code Search Skip if git diff not found in hybrid search
  - When searching an unindexed commit, we consult indexed commits to speed up results. If the index contained a commit that no longer existed in git, we would error out due to a regression in v5.4.5099. This is now fixed.
- Batch Changes Workaround for a bug in GitHub
  - Backport of aad3a04f8c93561a61c404e69132e70a22d0acba from #576.
- Batch Changes Avoid "Name already exists on this account" when creating a fork by fetching the repo when the error happens
- Cody Correctly parse queries containing 'or'
  - Fixes a regression in Cody context where questions containing the word 'or' could return noisy or no results.
- Cody Fix error handling in LLM API
  - LLM API endpoints (`/.api/llm`) now return JSON-encoded HTTP bodies for non-200 status codes.
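Clients can now rely on parsing error bodies as JSON. A sketch, assuming the error body carries a human-readable `message` field (the exact shape is not specified in this entry):

```python
import json

def decode_llm_error(status: int, body: bytes):
    """Return an error message for a non-200 /.api/llm response, else None.

    As of Sourcegraph 5.7 these endpoints return JSON-encoded bodies for
    non-200 status codes; the "message" field name is an assumption.
    """
    if status == 200:
        return None
    err = json.loads(body)  # previously the body was not guaranteed to be JSON
    return err.get("message", str(err))
```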
- Cody Return valid `finish_reason` in `/.api/llm/chat/completions`
  - The LLM API `/.api/llm/chat/completions` now returns an OpenAI-compatible `finish_reason`.
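OpenAI-style clients typically branch on `finish_reason` to detect truncated replies; a minimal sketch of consuming such a response (the sample payload below is made up for the example):

```python
import json

# Illustrative response body in the OpenAI chat-completions shape; the
# field values are invented for this example.
raw = b'{"choices": [{"message": {"role": "assistant", "content": "Hello!"}, "finish_reason": "stop"}]}'

resp = json.loads(raw)
choice = resp["choices"][0]
if choice["finish_reason"] == "length":
    # The model hit its token limit; the reply may be truncated.
    print("response truncated")
```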
- Cody Allow `Bearer TOKEN` header for all LLM APIs
  - For compatibility with OpenAI clients, the `Bearer TOKEN` authorization header is now accepted by all API endpoints under the `/.api/llm` prefix.
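This lets an off-the-shelf OpenAI client pointed at `/.api/llm` authenticate with its usual header. Both header forms below should now work for those endpoints (a sketch; the token is a placeholder):

```python
SRC_ACCESS_TOKEN = "sgp_example"  # placeholder access token

# Sourcegraph's native header form:
native = {"Authorization": f"token {SRC_ACCESS_TOKEN}"}

# OpenAI-compatible form, now also accepted for endpoints under /.api/llm:
openai_style = {"Authorization": f"Bearer {SRC_ACCESS_TOKEN}"}
```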
- Local Check for rogue files and folders in svelte routes
  - Prevent `web-sveltekit` commands from running if there are untracked files under `src/routes`.
- Release Remove the other embedding reference
n/a
- Repo Updater Add WARN level logs every time we sync a code host
  - repo-updater now emits a WARN-level log with the result of every code host sync.
- Sg Make start commands cancel fn be sync.OnceFunc
  - The cancel funcs used by sg commands are now wrapped in `sync.OnceFunc` to prevent duplicate execution.
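Go's `sync.OnceFunc` returns a function that runs its argument at most once, so a cancel func stays safe to call repeatedly. The same guard sketched in Python, purely to illustrate the idea:

```python
import threading

def once(fn):
    """Wrap fn so repeated calls execute it at most once (like Go's sync.OnceFunc)."""
    lock = threading.Lock()
    called = False

    def wrapper():
        nonlocal called
        with lock:  # serialize callers so fn runs exactly once
            if called:
                return
            called = True
            fn()

    return wrapper

calls = []
cancel = once(lambda: calls.append("cancelled"))
cancel()
cancel()  # no-op: the wrapped function already ran
```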
- Sg Add deprecation notice to sg wolfi update-hashes
  - sg: fix panic when using `wolfi update-hashes`.
  - sg: add deprecation notice for `wolfi update-hashes`.
- Sg Check if we are ephemeral before getting lease time
  - sg: fix panic in Cloud Ephemeral listing when listing instances that are not ephemeral.
- Sg Clamp deployment name consistently in cloud ephemeral
  - Ensure deployment / instance names are clamped consistently everywhere for cloud ephemeral.