Documentation Index

Fetch the complete documentation index at: https://docs.llmgrid.ai/llms.txt

Use this file to discover all available pages before exploring further.

LangChain's OpenAI integration (ChatOpenAI) accepts a custom base_url, which lets you route requests through a proxy or any OpenAI-compatible endpoint such as LLMGrid:

import os

from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage

# Point ChatOpenAI at the LLMGrid endpoint instead of the default api.openai.com.
llm = ChatOpenAI(
    model="gpt-4.1",
    api_key=os.environ.get("LLMGRID_API_KEY"),  # set LLMGRID_API_KEY in your environment
    base_url="https://api.llmgrid.ai/v1",
    temperature=0.2,  # low temperature for more deterministic replies
)

# invoke() sends a single chat completion request and returns an AIMessage.
resp = llm.invoke([HumanMessage(content="Hello from LangChain via LLMGrid")])
print(resp.content)