## Supported providers
| Provider | API Base URL | Model example | Auth |
|---|---|---|---|
| OpenAI | https://api.openai.com/v1 | gpt-4o | API key |
| Anthropic | https://api.anthropic.com | claude-sonnet-4-20250514 | API key |
| Ollama | http://localhost:11434/v1 | llama3.2 | None |
| Azure OpenAI | https://{resource}.openai.azure.com | Your deployment name | API key |
| Custom | Any OpenAI-compatible endpoint | Provider-specific | Provider-specific |
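The providers above authenticate differently, which matters if you ever call the endpoint directly to debug a connection. The header names below follow each provider's documented convention; the `auth_headers` helper itself is a hypothetical sketch, not part of Pastures.

```python
def auth_headers(provider: str, api_key: str = "") -> dict:
    """Return the HTTP headers a direct request to each provider would need."""
    if provider == "openai":      # also most OpenAI-compatible endpoints
        return {"Authorization": f"Bearer {api_key}"}
    if provider == "anthropic":   # Anthropic uses x-api-key plus a version header
        return {"x-api-key": api_key, "anthropic-version": "2023-06-01"}
    if provider == "azure":       # Azure OpenAI uses a plain api-key header
        return {"api-key": api_key}
    if provider == "ollama":      # local Ollama needs no auth at all
        return {}
    raise ValueError(f"unknown provider: {provider}")
```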
## Configure in Pastures
Navigate to Pastures → Settings and enter:

| Field | Value |
|---|---|
| AI Provider | Your provider (OpenAI, Anthropic, Ollama, etc.) |
| API Base URL | The provider’s API endpoint |
| API Key | Your provider’s API key (leave blank for Ollama) |
| Model | The model identifier |
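If you want to verify those four settings outside Pastures, OpenAI-compatible endpoints (OpenAI, Ollama, and most custom providers) expose a `POST {base_url}/chat/completions` route; Anthropic's native API uses a different path, so this sketch does not cover it. The helper below only builds the request from the same fields, using nothing beyond the standard library:

```python
import json
import urllib.request

def chat_request(base_url: str, model: str, prompt: str,
                 headers: dict) -> urllib.request.Request:
    """Build (but do not send) a minimal OpenAI-style chat completion request."""
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        url, data=body, method="POST",
        headers={"Content-Type": "application/json", **headers},
    )
```

Sending it with `urllib.request.urlopen(chat_request(...))` and getting any JSON response back is a quick sanity check that the base URL, key, and model name line up.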
## Ollama local setup
Ollama lets you run models locally with no API key and no data leaving your network.

### Install Ollama
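Once Ollama is installed and a model is pulled (e.g. `ollama pull llama3.2`), a running server answers plain HTTP on port 11434. This small check, a sketch rather than anything Pastures ships, confirms the server is reachable before you point Pastures at it:

```python
import urllib.request
import urllib.error

def ollama_reachable(base_url: str = "http://localhost:11434") -> bool:
    """Return True if a local Ollama server answers on its root endpoint."""
    try:
        with urllib.request.urlopen(base_url, timeout=2) as resp:
            return resp.status == 200  # root endpoint replies "Ollama is running"
    except (urllib.error.URLError, OSError):
        return False
```

If this returns `False`, start the server (`ollama serve`) or check that nothing else is bound to port 11434, then retry before configuring Pastures.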
## Limitations of BYO mode
BYO mode uses your model’s general knowledge for diagnostics. The following features are only available with Rancher Oracle:
- Source citations — GitHub issue links in diagnostic responses
- Fleet correlation — Cross-cluster pattern matching
- Fix verification — Post-fix monitoring