LLMs

MetricChat lets you bring your own LLM by configuring a provider and API key. It supports major providers and any OpenAI-compatible API.

Supported Providers

  • OpenAI
  • Anthropic
  • Google
  • Azure OpenAI
  • Custom endpoints (Ollama, proxy services)

OpenAI

Model           Status
GPT-5.1         Recommended
GPT-5           Recommended
GPT-4.1         Recommended
GPT-4.1 Mini    Supported

Anthropic

Model               Status
Claude 4.5 Opus     Recommended
Claude 4.5 Sonnet   Recommended
Claude 4 Sonnet     Supported
Claude 4 Opus       Supported

Google

Model               Status
Gemini 2.5 Pro      Recommended
Gemini 2.5 Flash    Supported

Custom LLMs

Any LLM that exposes an OpenAI-compatible API with streaming support can be used. For custom providers such as Ollama or proxy services, specify the base URL and API key when adding the model.
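
As a rough illustration of what "OpenAI-compatible with streaming" means here, the sketch below sends a streaming chat completion to a locally running Ollama server using the OpenAI Python client. The base URL, model name, and placeholder API key are assumptions for this example; substitute whatever your custom provider or proxy requires.

    # Minimal sketch: streaming chat completion against an OpenAI-compatible
    # endpoint (here, a local Ollama server). The base URL, model name, and
    # placeholder API key are illustrative assumptions.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        api_key="ollama",                      # Ollama ignores the key, but the client requires one
    )

    stream = client.chat.completions.create(
        model="llama3.1",  # any model served by the endpoint
        messages=[{"role": "user", "content": "Say hello."}],
        stream=True,
    )

    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:
            print(delta, end="", flush=True)

Any provider or proxy that answers this kind of request and streams the response chunks back should work as a custom LLM.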

Configuration

Adding an LLM Provider

  1. Navigate to Settings > LLMs
  2. Select from pre-configured models or add a custom model
  3. Enter your API key and any required configuration (a quick way to sanity-check the key is sketched below)
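
If you want to confirm that a key works before entering it, a small standalone check like the one below can save a round of debugging. It uses the OpenAI Python client; the model name is only an example, and for other providers you would use their own SDK or pass a base_url as shown in the custom-LLM sketch above.

    # Minimal sketch: verify an API key with a tiny request before adding it
    # in Settings > LLMs. The model name is an example; pick one your key can access.
    import os
    from openai import OpenAI

    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

    resp = client.chat.completions.create(
        model="gpt-4.1-mini",
        messages=[{"role": "user", "content": "ping"}],
    )
    print("Key OK:", resp.choices[0].message.content)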

Setting a Default LLM

  1. Go to Settings > LLMs
  2. Click the menu icon on the desired LLM row
  3. Select Make Default

Small Default LLM

Set a smaller, less expensive model for back-office tasks like the LLM judge, tests, and evaluations.

Selecting LLM Per Prompt

Users can choose a specific LLM from the dropdown in the prompt box for individual queries.

Note: Default LLM configuration requires admin permissions. Individual users can select their preferred LLM when creating reports.
