# Supported LLMs

## Chat and Commands
Cody supports a variety of cutting-edge large language models for use in Chat and Commands, allowing you to select the best model for your use case.
Provider | Model | Free | Pro | Enterprise |
---|---|---|---|---|
OpenAI | GPT-3.5 Turbo | ✅ | ✅ | ✅ |
OpenAI | GPT-4 | - | - | ✅ |
OpenAI | GPT-4 Turbo | - | ✅ | ✅ |
OpenAI | GPT-4o | - | ✅ | ✅ |
Anthropic | Claude 3 Haiku | ✅ | ✅ | ✅ |
Anthropic | Claude 3 Sonnet | ✅ | ✅ | ✅ |
Anthropic | Claude 3.5 Sonnet | ✅ | ✅ | ✅ |
Anthropic | Claude 3 Opus | - | ✅ | ✅ |
Mistral | Mixtral 8x7B | ✅ | ✅ | - |
Mistral | Mixtral 8x22B | ✅ | ✅ | - |
Ollama | variety | experimental | experimental | - |
Google Gemini | 1.5 Pro | ✅ | ✅ | ✅ (Beta) |
Google Gemini | 1.5 Flash | ✅ | ✅ | ✅ (Beta) |
To use the Claude 3 models (Opus and Sonnet) with Cody Enterprise, make sure your Sourcegraph instance is upgraded to the latest version.
## Autocomplete
Cody uses a separate set of models for autocomplete, chosen for low latency.
Provider | Model | Free | Pro | Enterprise |
---|---|---|---|---|
Fireworks.ai | StarCoder | ✅ | ✅ | ✅ |
Anthropic | Claude Instant | - | - | ✅ |
Google Gemini (Beta) | 1.5 Flash | - | - | ✅ |
Ollama (Experimental) | variety | ✅ | ✅ | - |
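Because Ollama support is experimental, models run locally on your machine rather than through a hosted provider. As an illustrative sketch only (the setting names below are assumptions and may differ between Cody versions; check the Cody extension's settings reference), pointing autocomplete at a local Ollama server from VS Code `settings.json` might look like:

```jsonc
{
  // Assumed setting names for experimental Ollama autocomplete --
  // verify against the Cody extension's documentation for your version.
  "cody.autocomplete.advanced.provider": "experimental-ollama",
  "cody.autocomplete.experimental.ollamaOptions": {
    "url": "http://localhost:11434",  // Ollama's default local endpoint
    "model": "codellama:7b-code"      // any code model you have pulled locally
  }
}
```

With a local setup like this, autocomplete latency and quality depend on your hardware and the chosen model rather than on a hosted provider.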
For information on context token limits, see our token limits documentation.