Providers Overview
Connect to any LLM provider
YourGPT Copilot SDK supports multiple LLM providers out of the box. Switch providers without changing your frontend code.
All providers use the same API. Change one line in your config to switch from OpenAI to Anthropic.
Supported Providers
🟢
OpenAI
GPT-4o, GPT-4, GPT-3.5 Turbo
🟠
Anthropic
Claude 3.5 Sonnet, Claude 3 Opus
🔵
Google
Gemini 1.5 Pro, Gemini Flash
⚡
Groq
Llama 3.1, Mixtral (ultra-fast)
🟣
Mistral
Mistral Large, Medium, Small
☁️
Azure OpenAI
Enterprise OpenAI deployment
🦙
Ollama
Run models locally
⚙️
Custom Provider
Build your own adapter
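If none of the built-in providers fit, a custom adapter lets you plug in any backend. As a rough sketch, an adapter needs to accept the normalized chat messages and return a reply. The `ProviderAdapter` interface and method names below are illustrative assumptions, not the SDK's actual types:

```typescript
// Hypothetical shape of a custom provider adapter. The interface and
// method names here are illustrative, not the SDK's published API.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ProviderAdapter {
  name: string;
  // Send a chat completion request and resolve with the assistant's reply.
  complete(model: string, messages: ChatMessage[]): Promise<string>;
}

// Example: an "echo" adapter that just repeats the last user message,
// handy for wiring up and testing the chat UI without any API key.
const echoAdapter: ProviderAdapter = {
  name: "echo",
  async complete(_model, messages) {
    const last = messages[messages.length - 1];
    return `echo: ${last.content}`;
  },
};
```

Because the adapter is just an object implementing one async method, you can stub providers in tests or route requests to an internal inference service the same way.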
Quick Start
1. Choose Your Provider
```tsx
<YourGPTProvider
  runtimeUrl="/api/chat"
  llm={{
    provider: 'openai', // or 'anthropic', 'google', 'groq', etc.
    model: 'gpt-4o',
  }}
>
  <CopilotChat />
</YourGPTProvider>
```

2. Add API Key
```bash
# .env.local
OPENAI_API_KEY=sk-...
```

3. That's It!
The runtime automatically detects and uses the correct provider.
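One plausible way such detection works is by matching the configured provider against the API keys present in the environment. The mapping and function below are a minimal sketch of that idea, assuming the conventional environment variable names; the SDK's real detection logic may differ:

```typescript
// Sketch: pick the first provider whose API key is set in the environment.
// Env var names follow each vendor's common convention; this is an
// illustration of auto-detection, not the SDK's actual implementation.

const PROVIDER_ENV_KEYS: Record<string, string> = {
  openai: "OPENAI_API_KEY",
  anthropic: "ANTHROPIC_API_KEY",
  google: "GOOGLE_API_KEY",
  groq: "GROQ_API_KEY",
};

function detectProvider(env: Record<string, string | undefined>): string | null {
  for (const [provider, envKey] of Object.entries(PROVIDER_ENV_KEYS)) {
    if (env[envKey]) return provider;
  }
  return null; // no known key found
}
```

In a Node runtime you would pass `process.env` to a function like this once at startup.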
Switching Providers
Change one line - everything else stays the same:
```tsx
// OpenAI
llm={{ provider: 'openai', model: 'gpt-4o' }}

// Anthropic
llm={{ provider: 'anthropic', model: 'claude-3-5-sonnet-20241022' }}

// Google
llm={{ provider: 'google', model: 'gemini-1.5-pro' }}

// Groq (fastest)
llm={{ provider: 'groq', model: 'llama-3.1-70b-versatile' }}
```

Your tools, UI, and all frontend code remain unchanged. The SDK normalizes responses across providers.
Provider Comparison
| Provider | Speed | Quality | Cost | Best For |
|---|---|---|---|---|
| OpenAI | Fast | Excellent | $$ | General use |
| Anthropic | Medium | Excellent | $$ | Long context, safety |
| Google | Fast | Very Good | $ | Multimodal |
| Groq | Ultra Fast | Good | $ | Speed-critical apps |
| Mistral | Fast | Very Good | $ | European compliance |
| Ollama | Varies | Varies | Free | Local/private |