Using MetricChat
LLMs
MetricChat lets you bring your own LLM by configuring a provider and API key. It supports major providers and any OpenAI-compatible API.
Supported Providers
- OpenAI
- Anthropic
- Azure OpenAI
- Custom endpoints (Ollama, proxy services)
Recommended Models
OpenAI
| Model | Status |
|---|---|
| GPT-5.1 | Recommended |
| GPT-5 | Recommended |
| GPT-4.1 | Recommended |
| GPT-4.1 Mini | Supported |
Anthropic
| Model | Status |
|---|---|
| Claude 4.5 Opus | Recommended |
| Claude 4.5 Sonnet | Recommended |
| Claude 4 Sonnet | Supported |
| Claude 4 Opus | Supported |
Google
| Model | Status |
|---|---|
| Gemini 2.5 Pro | Recommended |
| Gemini 2.5 Flash | Supported |
Custom LLMs
Any LLM that supports OpenAI-compatible requests and streaming can be used. For custom providers like Ollama or proxy services, specify the base URL and API key.
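Before pointing MetricChat at a custom endpoint, it can help to confirm the endpoint really speaks the OpenAI chat completions protocol with streaming. The sketch below uses the OpenAI Python client against a local Ollama server as an example; the base URL, model name, and placeholder API key are illustrative values, not MetricChat settings.

```python
# Minimal sketch: check that a custom endpoint accepts OpenAI-compatible
# streaming chat requests. Ollama's local server is used as an example;
# substitute your own base URL, API key, and model name.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # the custom endpoint's base URL
    api_key="ollama",                      # any non-empty string works for Ollama
)

# Streaming chat completion, the request style an OpenAI-compatible LLM must support.
stream = client.chat.completions.create(
    model="llama3",                        # replace with a model your endpoint serves
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```

If the tokens stream back correctly, the same base URL and API key should work when added as a custom provider in MetricChat.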
Configuration
Adding an LLM Provider
- Navigate to Settings > LLMs
- Select from pre-configured models or add a custom model
- Enter your API key and any required configuration (a quick key-check sketch follows these steps)
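Before pasting a key into MetricChat, you can sanity-check it directly against the provider. The snippet below is a minimal sketch using OpenAI's Python client; reading the key from an environment variable and listing models are just one way to verify the key is accepted.

```python
# Minimal sketch: confirm an API key is valid before adding it in Settings > LLMs.
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
models = client.models.list()  # raises an authentication error if the key is invalid
print(f"Key accepted; {len(models.data)} models available.")
```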
Setting a Default LLM
- Go to Settings > LLMs
- Click the menu icon on the desired LLM row
- Select Make Default
Small Default LLM
You can set a smaller, less expensive model to handle back-office tasks such as the LLM judge, tests, and evaluations.
Selecting LLM Per Prompt
Users can choose a specific LLM from the dropdown in the prompt box for individual queries.
Note: Default LLM configuration requires admin permissions. Individual users can select their preferred LLM when creating reports.