Sourcegraph Cody is an AI coding assistant developed by Sourcegraph, a company specializing in code search and code intelligence tools. Cody uses large language models combined with Sourcegraph's code graph technology to provide context-aware code completion, chat-based code assistance, and programmable commands. Unlike general-purpose AI coding assistants that treat source code as plain text, Cody is designed to understand the structure and relationships within an entire codebase by leveraging Sourcegraph's code search and indexing infrastructure. The product was first unveiled in June 2023 and reached general availability on December 14, 2023. As of March 2026, the Cody VS Code extension has over 814,000 installs on the Visual Studio Marketplace.
Sourcegraph announced in June 2025 that Cody's free and pro tiers would be discontinued, with individual users directed toward a new product called Amp. Cody Enterprise, however, remains actively supported and developed for large organizations.
Sourcegraph was founded in 2013 by Quinn Slack and Beyang Liu, both Stanford graduates. Slack, who taught himself to code at age nine, co-founded Blend Labs, an enterprise technology company focused on home lending. Liu began his career as an engineer at Google, where he grew accustomed to Google's internal code search tools and assumed that all software engineers had access to similar capabilities. The two met while working at Palantir, where they realized that most developers lacked access to powerful code search and had to rely on meetings and manual navigation to understand large codebases. They founded Sourcegraph to solve this problem by building a universal code search engine.
The company spent its first two years (2013 to 2015) prototyping a viable technical solution for large-scale code search. In 2016, the product transitioned to a public launch with features like jump-to-definition, find-references, and symbol search across repositories. Sourcegraph raised a $20 million Series A round in 2017, and early customers included Uber, Dropbox, and Lyft. Subsequent funding rounds brought the total raised to approximately $248 million, culminating in a $125 million Series D round led by Andreessen Horowitz in July 2021 that valued the company at $2.6 billion.
Sourcegraph introduced Cody as an AI coding assistant in June 2023, during the first major wave of generative AI tools for software development. The product was designed to combine Sourcegraph's existing code intelligence platform with large language models from providers like Anthropic and OpenAI. While competitors such as GitHub Copilot focused primarily on inline code suggestions, Cody was positioned as a codebase-aware assistant that could retrieve and reason over entire repositories.
Cody reached version 1.0 and general availability on December 14, 2023. At launch, it supported Claude 2.1 from Anthropic and GPT-4 Turbo from OpenAI, with model access also available through Microsoft Azure and Amazon Web Services Bedrock. The initial pricing included a free tier with limited monthly usage, a Pro tier at $9 per user per month, and an Enterprise tier planned for early 2024.
Throughout 2024, Sourcegraph expanded Cody's capabilities with features such as context filters for enterprise administrators, guardrails for open-source attribution, and support for additional LLM providers. The company introduced Cody Enterprise with features like SOC 2 Type II compliance, GDPR and CCPA compliance, role-based access control, and detailed audit logs. Notable enterprise customers included Leidos, Qualtrics, and Booking.com.
Sourcegraph also released Cody extensions for JetBrains IDEs and Visual Studio during this period, broadening the product's reach beyond VS Code.
In June 2025, Sourcegraph announced that Cody Free and Cody Pro plans would no longer accept new signups as of June 25, 2025, with both tiers discontinued entirely on July 23, 2025. Cody Enterprise was not affected and remained fully supported.
Sourcegraph directed individual and small-team users toward Amp, a new agentic coding tool built for multi-step edits, collaborative workflows, and deeper integration with frontier model capabilities. In December 2025, Sourcegraph announced that the Sourcegraph and Amp businesses would split into two independent companies. Dan Adler, who had served as Sourcegraph's CFO and infrastructure engineer since before the company's first customer, became Sourcegraph's CEO. Co-founders Quinn Slack and Beyang Liu departed to found Amp Inc., taking the Amp team with them. Both companies share the same board investors, including Craft Ventures, Redpoint Ventures, Sequoia Capital, Goldcrest, and Andreessen Horowitz (a16z).
Cody's core technical approach relies on Retrieval-Augmented Generation (RAG). When a user submits a query or requests a code completion, Cody retrieves relevant code snippets and documentation from the user's codebase and passes them alongside the query to a large language model. This approach differs from fine-tuning, which trains a model on specific data; RAG instead supplies up-to-date, relevant context at inference time. Sourcegraph's engineering team has stated that RAG is more appropriate for code because code changes frequently, making fine-tuning impractical for maintaining current knowledge of a codebase.
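The RAG flow described above can be sketched in a few lines of Python. Everything here is illustrative: `Snippet`, `retrieve`, `build_prompt`, and the word-overlap scoring are simplified stand-ins for Cody's actual retrieval pipeline, not its real code.

```python
from dataclasses import dataclass

@dataclass
class Snippet:
    path: str
    text: str

def retrieve(query: str, corpus: list[Snippet], k: int = 2) -> list[Snippet]:
    """Toy retriever: rank snippets by word overlap with the query."""
    terms = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda s: len(terms & set(s.text.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, corpus: list[Snippet]) -> str:
    """Assemble retrieved context and the user query into a single LLM prompt.

    This is the essence of RAG: relevant context is supplied at inference
    time rather than baked into the model via fine-tuning.
    """
    context = "\n\n".join(f"# {s.path}\n{s.text}" for s in retrieve(query, corpus))
    return f"{context}\n\nQuestion: {query}"
```

Because the context is fetched fresh on every request, the model always sees the codebase as it currently exists, which is the property the Sourcegraph team cites when preferring RAG over fine-tuning.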
Context gathering proceeds in two stages: retrieval and ranking. During retrieval, Cody gathers candidate context items from multiple sources, including local files open in the IDE, remote repositories indexed by Sourcegraph, and documentation. During ranking, Cody uses an adapted form of the BM25 ranking function, combined with other signals tuned for code search, to score and order the retrieved snippets by relevance. The system can search across up to 10 repositories simultaneously and produces a global ranking that combines remote and local context sources.
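A minimal version of BM25, the base of the ranking stage, might look like the following. `k1` and `b` are the standard BM25 free parameters; the tokenization details and the additional code-specific signals Cody layers on top are omitted.

```python
import math
from collections import Counter

def bm25_scores(query: list[str], docs: list[list[str]],
                k1: float = 1.2, b: float = 0.75) -> list[float]:
    """Score each tokenized document against the query with classic BM25."""
    n = len(docs)
    avgdl = sum(len(d) for d in docs) / n  # average document length
    # document frequency of each query term
    df = {t: sum(1 for d in docs if t in d) for t in query}
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for t in query:
            idf = math.log(1 + (n - df[t] + 0.5) / (df[t] + 0.5))
            # term-frequency saturation, normalized by document length
            denom = tf[t] + k1 * (1 - b + b * len(d) / avgdl)
            s += idf * tf[t] * (k1 + 1) / denom
        scores.append(s)
    return scores
```

The length normalization controlled by `b` matters for code: without it, long generated files and vendored bundles would dominate the ranking simply by containing more tokens.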
A key differentiator of Cody is its use of Sourcegraph's Code Graph technology. Rather than treating source code as flat text, the Code Graph analyzes the semantic structure of code: definitions, references, symbols, documentation comments, and the relationships between them. This structured understanding is built on SCIP (SCIP Code Intelligence Protocol), an open-source, language-agnostic protocol for indexing source code that Sourcegraph developed as a successor to LSIF (Language Server Index Format).
SCIP provides several advantages over LSIF, including human-readable symbol identifiers instead of opaque numeric IDs, static types from the Protobuf schema, and faster indexing performance. When Sourcegraph replaced lsif-node with scip-typescript, it achieved a 10x speedup. SCIP indexers exist for many programming languages:
| Indexer | Languages |
|---|---|
| scip-java | Java, Scala, Kotlin |
| scip-typescript | TypeScript, JavaScript |
| rust-analyzer | Rust |
| scip-clang | C, C++ |
| scip-ruby | Ruby |
| scip-python | Python |
| scip-dotnet | C#, Visual Basic |
| scip-dart | Dart |
| scip-php | PHP |
The Code Graph data produced by these indexers is uploaded to a Sourcegraph instance, where it powers precise code navigation (go-to-definition, find-references) and provides structured context for Cody's RAG pipeline.
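As an illustration of SCIP's human-readable symbol identifiers, the sketch below splits a SCIP-style symbol string into its parts. The shape shown (scheme, package manager, package name, version, then descriptors) follows the protocol's general layout, but this toy parser is a simplification: the real grammar defines escaping rules and descriptor suffixes that are ignored here, and the example symbol is hypothetical.

```python
def parse_scip_symbol(symbol: str) -> dict:
    """Split a SCIP-style symbol string into its space-separated fields.

    Simplified: the real SCIP grammar allows escaped spaces inside
    descriptors; this sketch assumes none occur past the version field.
    """
    scheme, manager, package, version, descriptors = symbol.split(" ", 4)
    return {
        "scheme": scheme,          # e.g. the indexer name
        "manager": manager,        # package manager (npm, maven, ...)
        "package": package,
        "version": version,
        "descriptors": descriptors,  # path to the symbol inside the package
    }
```

The point of the format is visible even in this toy: unlike LSIF's opaque numeric IDs, a SCIP symbol can be read, diffed, and debugged by a human.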
Cody's autocomplete system uses a different architecture than its chat feature, optimized for low latency. Instead of querying remote Sourcegraph instances, autocomplete relies primarily on local context sources: the active file, open tabs, and recently closed tabs. It uses the Tree-Sitter parsing library to identify the user's intent and determine what type of completion is appropriate (single-line vs. multi-line). Autocomplete uses completion-tuned LLMs rather than chat-optimized models. The default autocomplete model for Cody Enterprise is DeepSeek V2 Lite Base, chosen for its balance of speed and accuracy.
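The single-line versus multi-line decision can be illustrated with a toy heuristic. Cody uses Tree-sitter to parse the buffer and infer intent; the sketch below replaces that parser with a simple line-suffix check and is purely illustrative.

```python
def completion_mode(prefix: str) -> str:
    """Decide single- vs. multi-line completion from text before the cursor.

    Simplified stand-in for Cody's Tree-sitter-based intent detection:
    a line opening a new block (ending in ':' or '{') suggests the user
    wants a multi-line body; anywhere else, a single-line completion.
    """
    if not prefix.strip():
        return "single-line"
    last_line = prefix.rstrip("\n").splitlines()[-1]
    if last_line.rstrip().endswith((":", "{")):
        return "multi-line"
    return "single-line"
```

A real implementation works on the syntax tree rather than raw text, which lets it distinguish, say, a colon in a dict literal from one opening a function body, but the latency constraint is the same: the decision must be made locally, without a round trip to a server.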
Before adopting its current search-based retrieval approach, Cody used OpenAI's text-embedding-ada-002 model to create vector embeddings of code for similarity search. Sourcegraph abandoned this approach for several reasons: it required sending code to third-party embedding providers, it created maintenance complexity with vector database management, and it scaled poorly for organizations with 100,000 or more repositories. The switch to native Sourcegraph platform search eliminated these drawbacks.
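For contrast, the abandoned embeddings approach reduced retrieval to nearest-neighbor search over vectors, along these lines. The `nearest` helper and the two-dimensional toy vectors are illustrative only; real embeddings such as those from text-embedding-ada-002 have over a thousand dimensions and live in a dedicated vector database, which is precisely the operational burden Sourcegraph cites.

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def nearest(query_vec: list[float], index: list[tuple[str, list[float]]]) -> str:
    """Return the snippet id whose embedding is most similar to the query."""
    return max(index, key=lambda item: cosine(query_vec, item[1]))[0]
```

Every document (and every query) must first be sent to an embedding provider, and the index must be re-embedded as code changes; keyword-style search over an existing Sourcegraph index avoids both costs.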
Cody's chat interface allows developers to ask questions about their codebase in natural language. Users can ask Cody to explain code, generate new code, identify bugs, suggest refactors, or answer general programming questions. The chat interface supports @-mention syntax for precise context control:
| @-mention Type | Syntax | Purpose |
|---|---|---|
| File | @filename | Include a specific file as context |
| Symbol | @#symbolname | Include a symbol's definition |
| Repository | @repo | Search a specific repository for context |
| Directory | @directory/ | Include files from a directory (Enterprise) |
| Line range | @filepath:1-50 | Include specific line ranges from a file |
| Web URL | @url | Include content from a web page |
When both a repository and specific files are @-mentioned, Cody searches the repository for context while prioritizing the mentioned files. Enterprise users can also @-mention remote directories for multi-repository context.
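A simplified classifier for the @-mention syntax in the table above might look like the following. Note that `@filename` and `@repo` share the same surface syntax, so a real client resolves them against the workspace and the Sourcegraph instance rather than by pattern alone; this sketch is illustrative only.

```python
import re

def classify_mention(mention: str) -> str:
    """Classify an @-mention string per the syntax table (simplified)."""
    if re.fullmatch(r"@#\w+", mention):
        return "symbol"
    if re.fullmatch(r"@https?://\S+", mention):
        return "web url"
    if re.fullmatch(r"@\S+:\d+-\d+", mention):
        return "line range"
    if mention.endswith("/"):
        return "directory"
    # '@filename' and '@repo' are syntactically identical; a real client
    # disambiguates by looking the name up in the workspace and code host.
    return "file or repository"
```

The URL check runs before the line-range check so that a colon in `https://` is not misread as a line-range separator.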
Auto-edit is an AI-powered feature available in the VS Code extension that suggests code changes based on the developer's recent edits and cursor movements. Unlike traditional autocomplete, which only adds new code, auto-edit can suggest modifications to existing code in the current file and elsewhere. It analyzes the developer's editing patterns and proposes contextual modifications, including instant code review feedback and documentation suggestions.
Cody provides inline code completions as the developer types. The system supports single-line and multi-line suggestions in any programming language, though it performs best with Python, Go, JavaScript, and TypeScript. Autocomplete works in VS Code, JetBrains IDEs, and Visual Studio. Sourcegraph has reported that their autocomplete model reduces P75 latency by 350 milliseconds compared to earlier versions and improves the average completion acceptance rate by more than 4%.
Cody provides a set of built-in prompts for common development tasks, including explaining code, generating unit tests, fixing bugs, documenting code, and identifying code smells. Beyond these defaults, the Prompt Library allows users and teams to create, save, and share custom prompts. Prompts can include dynamic context (current file, current selection) and can be set as public or private. Enterprise teams use the Prompt Library to standardize workflows and enforce best practices across their organization. The Prompt Library is gradually replacing the older "commands" system, with full backward compatibility maintained during the transition.
Context Filters are an enterprise security feature that lets Sourcegraph administrators control which repositories Cody can access when sending context to third-party LLM providers. Administrators can configure include rules (Cody may only use repositories matching specified patterns) or exclude rules (Cody may not use repositories matching specified patterns). This prevents sensitive or proprietary code from being transmitted to external AI providers. Context Filters require Sourcegraph version 5.4.0 or later and supported Cody client versions.
Cody Enterprise includes open-source attribution guardrails that verify generated code does not replicate publicly available open-source code. The system matches any code snippet of at least ten lines against a database of 290,000 indexed open-source repositories. This helps enterprises reduce their exposure to potential copyright issues arising from AI-generated code.
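Conceptually, the guardrail slides a ten-line window over generated code and checks each window against the index. The exact-match set below is a toy stand-in: a production system matching against 290,000 repositories would use fingerprinting and normalization rather than literal string comparison.

```python
def find_matches(generated: str, indexed_snippets: set[str],
                 window: int = 10) -> list[int]:
    """Return the start line of every `window`-line run of generated code
    that also appears verbatim in the open-source index."""
    lines = generated.splitlines()
    hits = []
    for i in range(len(lines) - window + 1):
        chunk = "\n".join(lines[i : i + window])
        if chunk in indexed_snippets:
            hits.append(i)
    return hits
```

The ten-line threshold reflects the feature's design: short idioms match open-source code constantly and carry no attribution risk, so only longer verbatim runs are flagged.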
Cody supports models from multiple AI providers, allowing users and administrators to select the model best suited for their tasks. The following table lists the models available as of early 2026:
| Provider | Chat and Prompt Models | Autocomplete Models |
|---|---|---|
| Anthropic | Claude Opus 4.6, Claude Sonnet 4.6, Claude Sonnet 4.5, Claude Sonnet 4.5 with Thinking, Claude Opus 4.5, Claude Opus 4.5 with Thinking, Claude Haiku 4.5 | Claude Haiku 4.5, Claude Haiku 4.5 with Thinking |
| OpenAI | GPT-5.2, GPT-5.1, GPT-5, GPT-5 mini, GPT-5 nano, GPT-4o, GPT-4.1, GPT-4o-mini, GPT-4.1-mini, GPT-4.1-nano, o3, o4-mini | GPT-4.1-nano |
| Google DeepMind | Gemini 2.5 Flash, Gemini 2.5 Pro, Gemini 3 Pro (experimental), Gemini 3 Flash (experimental), Gemini 3.1 Pro (experimental) | N/A |
| Fireworks.ai | N/A | StarCoder, DeepSeek V2 Lite Base |
The default chat model is Claude Sonnet 4.5, and the default autocomplete model is DeepSeek V2 Lite Base. Enterprise administrators can configure which models are available to their organization's developers.
Cody is available as extensions and plugins for several development environments:
| IDE / Editor | Status | Extension ID / name |
|---|---|---|
| Visual Studio Code | General availability | sourcegraph.cody-ai |
| JetBrains IDEs (IntelliJ, PyCharm, WebStorm, etc.) | General availability | Cody: AI Coding Assistant |
| Visual Studio | Experimental | sourcegraph.cody-vs |
| Sourcegraph Web App | General availability | Built-in |
| Command Line Interface (CLI) | Available | N/A |
The VS Code extension is the most mature client, with over 814,000 installs as of March 2026. It supports all Cody features including chat, autocomplete, auto-edit, prompts, and context filters. The JetBrains extension covers the major JetBrains products and supports chat, autocomplete, and prompts. The Visual Studio extension remains in an experimental state with a more limited feature set.
Cody also integrates with multiple code hosts for repository access, including GitHub, GitLab, and Bitbucket.
Cody's pricing has changed significantly since its launch. The following table summarizes the tiers as they existed historically and their current status:
| Plan | Price | Status (as of July 2025) |
|---|---|---|
| Cody Free | $0 | Discontinued |
| Cody Pro | $9/user/month | Discontinued |
| Enterprise Starter | $19/user/month (up to 50 developers) | Cody removed; Code Search only |
| Enterprise | $59/user/month (25+ developers) | Active and supported |
Cody Enterprise includes full access to all features, administrative controls, context filters, guardrails, audit logging, and choice of cloud-hosted or self-hosted deployment. Enterprise customers can also bring their own LLM API keys for providers like Anthropic, OpenAI, and Google.
Sourcegraph does not use customer code to train AI models and maintains a zero-data-retention policy for enterprise deployments.
Sourcegraph reports that Cody is used by four of the six largest U.S. banks, more than 15 U.S. government agencies, and seven of the ten largest public technology companies. Named enterprise customers include Leidos, Qualtrics, Booking.com, Uber, Lyft, Dropbox, Plaid, Redfin, Databricks, and Reddit.
According to a testimonial from Coinbase, engineers using Cody save approximately five to six hours per week and write code roughly twice as fast.
Cody Enterprise provides administrators with analytics dashboards that track usage across active users, completions, commands, and chats, helping organizations measure adoption and return on investment.
Cody Enterprise offers several security features designed for regulated industries and large organizations, including SOC 2 Type II compliance, GDPR and CCPA compliance, Context Filters, open-source attribution guardrails, role-based access control, audit logging, and a zero-data-retention policy.
Cody competes in the AI code assistant market alongside several other products:
| Feature | Sourcegraph Cody | GitHub Copilot | Cursor | Amazon Q Developer |
|---|---|---|---|---|
| Codebase context | Full codebase via Sourcegraph search | Repository-level (Copilot Workspace) | Project-level indexing | AWS service integration |
| Model choice | Multiple providers (Anthropic, OpenAI, Google) | Primarily OpenAI models, some Anthropic | Multiple providers | Amazon Bedrock models |
| Code search integration | Built-in (Sourcegraph Code Search) | GitHub code search | Local project search | AWS CodeWhisperer |
| Self-hosted option | Yes | GitHub Enterprise Server | No (cloud only) | No (AWS cloud) |
| Enterprise analytics | Detailed usage dashboards | Copilot metrics | Limited | CloudWatch integration |
| Open-source guardrails | Yes (290K repositories) | Code reference filter | No | Reference tracking |
| IDE support | VS Code, JetBrains, Visual Studio | VS Code, JetBrains, Visual Studio, Neovim, Xcode | Cursor editor (VS Code fork) | VS Code, JetBrains |
Cody's primary competitive advantage is its deep integration with Sourcegraph's code search platform, which enables it to retrieve context from across an organization's entire codebase, including multiple repositories, code hosts, and programming languages. This is particularly valuable for large enterprises with hundreds or thousands of repositories. Competitors like GitHub Copilot have stronger distribution through GitHub's user base, while Cursor differentiates with its purpose-built IDE experience.
Sourcegraph's relationship with open source has evolved over time. The company made its core product open source in 2018 under an Apache 2.0 license. In June 2023, Sourcegraph relicensed much of its code under a proprietary enterprise license while keeping it source-available. In August 2024, the company moved its core repository to a private GitHub repository, making the code no longer publicly accessible. CEO Quinn Slack explained the decision by citing the need to protect "very differentiated" code and prevent abuse, stating that open source makes little sense for "full server-side end-user applications." A public snapshot of the repository as it existed before the transition was preserved on GitHub as "sourcegraph-public-snapshot" for reference.
The Cody VS Code extension source code was similarly made private, with a public snapshot available at sourcegraph/cody-public-snapshot. The SCIP Code Intelligence Protocol remains fully open source.
Amp is Sourcegraph's successor product for individual developers and small teams, designed as an agentic coding tool capable of autonomous reasoning, multi-step edits, and complex task execution. While Cody operates primarily as an assistant that responds to user queries, Amp takes a more autonomous approach, executing multi-step development tasks with less manual guidance.
As of December 2025, Sourcegraph and Amp operate as independent companies. Sourcegraph, under CEO Dan Adler, continues to develop and support Cody Enterprise as part of its code search platform for large organizations. Amp Inc., founded by Quinn Slack and Beyang Liu, focuses on building frontier coding agent technology. Both companies share the same board investors, and Sourcegraph's Deep Search feature incorporates lessons learned from Amp's development.