Desktop IDEs
Continue
Last updated March 3, 2026
Using Cencori with the Open Source Continue extension for VS Code and JetBrains.
Continue (continue.dev) is a popular open-source AI coding extension for VS Code and JetBrains. It lets you bring your own model and configuration.
Configuration
To use Cencori (and its Universal Proxy) with Continue, add it as a "Custom Provider" or "OpenAI Compatible" provider in your config.json.
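Because the provider is OpenAI-compatible, Continue will send standard OpenAI-style chat-completion requests to the apiBase you configure. A minimal sketch of that request shape, assuming the endpoint path and `csk_` key format described in this guide:

```python
import json

# Sketch of the OpenAI-compatible request Continue sends to Cencori.
# The URL and key prefix are taken from this guide; the payload fields
# follow the standard OpenAI chat-completions schema.
url = "https://cencori.com/v1/chat/completions"
headers = {
    "Authorization": "Bearer csk_YOUR_API_KEY",
    "Content-Type": "application/json",
}
payload = {
    # Any model id can go here; Cencori's Universal Proxy routes it.
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Explain this function."}],
}
body = json.dumps(payload)
print(body)
```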
1. Open Configuration
Press Cmd+Shift+P (or Ctrl+Shift+P) and search for "Continue: Open config.json".
2. Add Cencori Provider
Add this to your models array:
```json
{
  "models": [
    {
      "title": "Cencori (Auto)",
      "provider": "openai",
      "model": "gpt-4o",
      "apiKey": "csk_YOUR_API_KEY",
      "apiBase": "https://cencori.com/v1"
    }
  ]
}
```

Note: You can put ANY model in the `model` field (e.g. `claude-3-opus`, `gemini-2.5-flash`), and Cencori will route it correctly.
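For example, you could register several models behind the same Cencori endpoint and switch between them in Continue's model dropdown. A sketch (the model ids are illustrative; the apiBase and key format follow the snippet above):

```json
{
  "models": [
    {
      "title": "Cencori (Claude)",
      "provider": "openai",
      "model": "claude-3-opus",
      "apiKey": "csk_YOUR_API_KEY",
      "apiBase": "https://cencori.com/v1"
    },
    {
      "title": "Cencori (Gemini)",
      "provider": "openai",
      "model": "gemini-2.5-flash",
      "apiKey": "csk_YOUR_API_KEY",
      "apiBase": "https://cencori.com/v1"
    }
  ]
}
```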
Context Providers
Continue supports "Context Providers" (such as @docs) for pulling external documentation into the chat.
To let Continue know about Cencori's capabilities, add Cencori's llm.txt as a docs source:
- Type `@docs` in the Continue chat.
- Select "Add Docs".
- Enter: `https://cencori.com/llm.txt`
- Title: `Cencori SDK`
Now you can ask:
"@Cencori SDK how do I implement memory?"