Documentation Index

Fetch the complete documentation index at: https://docs.llmgrid.ai/llms.txt

Use this file to discover all available pages before exploring further.

OpenAI Compatible Proxy: API Reference

LLMGrid provides an OpenAI‑compatible proxy, so you can keep using existing OpenAI SDKs and tooling without changing your application logic. To use LLMGrid, replace the base_url in your SDK configuration with your LLMGrid proxy endpoint and authenticate with your LLMGrid API key.

Compatibility Overview

The LLMGrid proxy is compatible with the core OpenAI API surface, including:
  • Chat completions
  • Embeddings
  • Streaming responses
  • Tool / function calling
  • SDK‑based clients
This enables seamless integration with:
  • OpenAI SDKs
  • Popular orchestration frameworks
  • Custom applications built on OpenAI‑style APIs

SDK Examples

OpenAI Python SDK

from openai import OpenAI

# Point the client at your LLMGrid proxy instead of api.openai.com.
client = OpenAI(
    api_key="YOUR_LLMGRID_API_KEY",           # your LLMGrid API key
    base_url="https://<your-llmgrid-proxy>",  # your LLMGrid proxy endpoint
)

response = client.chat.completions.create(
    model="gpt-4.1",
    messages=[
        {"role": "user", "content": "Explain retrieval-augmented generation"}
    ],
)

print(response.choices[0].message.content)
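Streaming works through the same client: pass stream=True and iterate over the returned chunks. A minimal sketch, where the proxy endpoint and model name are placeholders and collect_stream_text is an illustrative helper, not part of the SDK:

```python
def collect_stream_text(stream):
    """Concatenate the content deltas from a chat-completion stream.

    Illustrative helper; works with any iterable of OpenAI-style chunks.
    """
    parts = []
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:  # some chunks (e.g. the final one) carry no content
            parts.append(delta)
    return "".join(parts)

if __name__ == "__main__":
    # Requires the openai package and a reachable LLMGrid proxy.
    from openai import OpenAI

    client = OpenAI(
        api_key="YOUR_LLMGRID_API_KEY",
        base_url="https://<your-llmgrid-proxy>",
    )
    stream = client.chat.completions.create(
        model="gpt-4.1",
        messages=[{"role": "user", "content": "Explain streaming in one line"}],
        stream=True,
    )
    print(collect_stream_text(stream))
```

You can also print each delta as it arrives for a live-typing effect; the helper above simply buffers them instead.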
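Embeddings go through the same proxy endpoint via client.embeddings.create. In the sketch below, the model name text-embedding-3-small is an assumption (use whatever embedding model your proxy exposes) and cosine_similarity is a small illustrative helper:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors (illustrative helper)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

if __name__ == "__main__":
    # Requires the openai package and a reachable LLMGrid proxy.
    from openai import OpenAI

    client = OpenAI(
        api_key="YOUR_LLMGRID_API_KEY",
        base_url="https://<your-llmgrid-proxy>",
    )
    resp = client.embeddings.create(
        model="text-embedding-3-small",  # assumed model name
        input=["proxy endpoints", "API gateways"],
    )
    vectors = [item.embedding for item in resp.data]
    print(cosine_similarity(vectors[0], vectors[1]))
```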
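Tool / function calling uses the standard OpenAI tools schema unchanged. In this sketch, the get_weather tool and the dispatch_tool_call helper are hypothetical illustrations; only the tools payload shape comes from the OpenAI API:

```python
import json

def dispatch_tool_call(tool_call, registry):
    """Run a local function for an OpenAI-style tool call (illustrative helper)."""
    fn = registry[tool_call["name"]]
    args = json.loads(tool_call["arguments"])  # arguments arrive as a JSON string
    return fn(**args)

if __name__ == "__main__":
    # Requires the openai package and a reachable LLMGrid proxy.
    from openai import OpenAI

    client = OpenAI(
        api_key="YOUR_LLMGRID_API_KEY",
        base_url="https://<your-llmgrid-proxy>",
    )
    tools = [{
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool for illustration
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }]
    response = client.chat.completions.create(
        model="gpt-4.1",
        messages=[{"role": "user", "content": "What's the weather in Paris?"}],
        tools=tools,
    )
    call = response.choices[0].message.tool_calls[0]
    result = dispatch_tool_call(
        {"name": call.function.name, "arguments": call.function.arguments},
        {"get_weather": lambda city: f"Sunny in {city}"},
    )
    print(result)
```

In a real application you would append the tool result back to the conversation as a "tool" role message and call the API again; that round trip is omitted here for brevity.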