Configuration
LLMGrid (SaaS) is configured via the Console at https://app.llmgrid.ai or through the Tenant Admin API. All configuration is tenant-scoped and versioned for auditability.

Providers (BYO Keys)
Add provider credentials under Console → Providers:

- OpenAI: OPENAI_API_KEY
- Anthropic: ANTHROPIC_API_KEY
- Azure OpenAI: endpoint, apiKey, and deploymentMap for model IDs
- Google Gemini: GOOGLE_API_KEY
- Other: API_KEY

Additional provider settings:

- Timeouts per provider
- Default region for compliance
- Optional headers for custom integrations
Routes
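As a sketch only: a route entered in the Console's JSON editor might look like the following. The field names (`route`, `models`, `strategy`, `fallbacks`) are assumptions for illustration, not a documented schema.

```json
{
  "route": "chat-default",
  "models": ["gpt-4o", "claude-sonnet-4"],
  "strategy": "latency",
  "fallbacks": ["gpt-4o-mini"]
}
```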
A route is a named logical endpoint that defines which models can be used and how to select among them. Routes are created and edited in the Console's JSON editor.

Create a Virtual Key on LLMGrid
Virtual Keys are API keys that allow Open WebUI to authenticate to the LLMGrid Gateway.

LLMGrid User Management Hierarchy
LLMGrid supports:

- Organization: Groups of teams (e.g., US Engineering, EU Developer Tools)
- Team: Groups of users (e.g., Open WebUI Team, Data Science Team)
- User: Individual user (developer, employee)
- Virtual Key: API key associated with a user or team for authentication
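The hierarchy above can be sketched as nested data; every name below is made up for illustration.

```python
# Illustrative sketch of the LLMGrid access hierarchy; all names are examples.
org = {
    "name": "US Engineering",                 # Organization: a group of teams
    "teams": [{
        "name": "Open WebUI Team",            # Team: a group of users
        "users": [{
            "name": "dev@example.com",        # User: an individual developer
            "virtual_keys": ["vk-openwebui-01"],  # Virtual Key(s) for authentication
        }],
    }],
}

# A Virtual Key may instead be associated with the team as a whole.
team_scoped_key = {"team": "Open WebUI Team", "key": "vk-team-01"}
```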
Create a Team
Navigate to LLMGrid Console → Access → Teams and create a new team.

Create a Virtual Key
Navigate to LLMGrid Console → Access → API Keys and create a Virtual Key.

- Assign the key to your team
- Specify which models/routes the key can access
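The two settings above could also be set through the Tenant Admin API. The sketch below prepares (but does not send) such a request, assuming a hypothetical `POST /admin/v1/keys` endpoint and field names; check the actual API reference before using.

```python
import json
import urllib.request

ADMIN_API = "https://app.llmgrid.ai/admin/v1"  # assumed Admin API base URL

def virtual_key_request(admin_token: str, team_id: str,
                        allowed_models: list[str]) -> urllib.request.Request:
    """Prepare (but do not send) a key-creation request; endpoint and fields are assumptions."""
    body = {
        "teamId": team_id,                # assign the key to your team
        "allowedModels": allowed_models,  # models/routes the key can access
    }
    return urllib.request.Request(
        f"{ADMIN_API}/keys",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {admin_token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = virtual_key_request("admin-token", "team-openwebui", ["gpt-4o"])
```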
Connect Open WebUI to LLMGrid
In Open WebUI:

- Go to Settings → Connections
- Create a new connection:
  - URL: https://LLMGrid.ai/api/v1
  - Key: Your Virtual Key from the "Create a Virtual Key" step
Test Request
- In Open WebUI, select a model from the dropdown (only models allowed by your Virtual Key will appear)
- Enter your prompt and click Submit
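The same test can be issued directly against the Gateway. This sketch assumes the Gateway exposes an OpenAI-compatible `/chat/completions` endpoint at the connection URL above; that compatibility is an assumption, not documented behavior.

```python
import json
import urllib.request

GATEWAY_URL = "https://LLMGrid.ai/api/v1"  # the URL from the connection settings
VIRTUAL_KEY = "vk-..."                     # your Virtual Key

def chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request (compatibility is assumed)."""
    body = {"model": model,
            "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{GATEWAY_URL}/chat/completions",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {VIRTUAL_KEY}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# To actually send it:
# with urllib.request.urlopen(chat_request("gpt-4o", "Hello")) as resp:
#     print(resp.read())
```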
Tracking Usage & Spend
Basic Tracking
After making requests, navigate to LLMGrid Console → Observability → Logs to view:

- Model used
- Token usage
- Cost information
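Cost information in the logs is derived from token usage. As a hedged illustration of that arithmetic only (the per-million-token prices below are made-up placeholders, not LLMGrid's actual rates):

```python
def request_cost(prompt_tokens: int, completion_tokens: int,
                 price_in_per_mtok: float, price_out_per_mtok: float) -> float:
    """Cost of one request given per-million-token prices (illustrative only)."""
    return (prompt_tokens * price_in_per_mtok
            + completion_tokens * price_out_per_mtok) / 1_000_000

# Example log entry: 1,000 prompt + 500 completion tokens,
# with placeholder prices of $3 / $15 per million tokens.
cost = request_cost(1_000, 500, 3.0, 15.0)  # 0.0105
```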