Documentation Index
Fetch the complete documentation index at: https://to11.ai/docs/llms.txt
Use this file to discover all available pages before exploring further.
OpenAI SDK
This guide shows you how to point the official OpenAI SDK at the to11 gateway. The only change is the base_url — the rest of your code stays the same.
Python
Install the SDK if you haven’t already:
pip install openai
Set the base_url to your gateway:
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000/v1",
    api_key="sk-...",  # your OpenAI API key
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello from to11!"}],
    max_tokens=256,
)

print(response.choices[0].message.content)
TypeScript / Node.js
Install the SDK if you haven’t already:
npm install openai
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:4000/v1",
  apiKey: "sk-...", // your OpenAI API key
});

const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello from to11!" }],
  max_tokens: 256,
});

console.log(response.choices[0].message.content);
Streaming
Set stream=True (stream: true in the TypeScript SDK) to receive Server-Sent Events. The gateway forwards SSE chunks from OpenAI with zero-copy passthrough, adding sub-millisecond overhead.
stream = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Count to 10"}],
    stream=True,
)

for chunk in stream:
    content = chunk.choices[0].delta.content
    if content:
        print(content, end="", flush=True)
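If you also need the full reply once the stream finishes, the deltas can be accumulated as they arrive. A minimal sketch (collect_stream is a hypothetical helper, not part of the SDK):

```python
def collect_stream(stream):
    """Accumulate streamed content deltas into the full reply text."""
    parts = []
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:  # some chunks (e.g. role-only) carry no text content
            parts.append(delta)
    return "".join(parts)
```

Pass it the same stream object the example above creates; it returns the concatenated text of every chunk.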
Routing to Anthropic models
You can route OpenAI-format requests to Anthropic models. The gateway translates the request and returns the response in OpenAI format, so your code doesn’t change:
# Same OpenAI SDK, but targeting an Anthropic model
response = client.chat.completions.create(
    model="claude-sonnet-4-6",
    messages=[{"role": "user", "content": "Hello from to11!"}],
    max_tokens=256,
)
System messages are automatically extracted and forwarded as Anthropic’s top-level system field. Tool definitions are translated between formats.
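Conceptually, the system-message extraction looks like this. A simplified sketch of the behavior, not the gateway’s actual code:

```python
def split_system(messages):
    """Pull system messages out of an OpenAI-style message list,
    returning (system_text, remaining_messages) in the shape
    Anthropic's Messages API expects."""
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return "\n\n".join(system_parts), rest
```

The joined system text goes into Anthropic’s top-level system field, and the remaining user/assistant turns are sent as the messages array.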
Tool calling
Tool calling works identically to direct OpenAI usage:
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get current weather for a location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string"},
                },
                "required": ["location"],
            },
        },
    }],
)
If this request is routed to an Anthropic model, the gateway translates tool definitions to Anthropic’s input_schema format automatically.
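The mapping is mechanical: OpenAI nests the JSON Schema under function.parameters, while Anthropic uses a flat tool object whose schema lives in input_schema. A rough sketch of the translation (illustrative only, not the gateway’s actual code):

```python
def openai_tool_to_anthropic(tool):
    """Map an OpenAI function-tool definition to Anthropic's tool format."""
    fn = tool["function"]
    return {
        "name": fn["name"],
        "description": fn.get("description", ""),
        "input_schema": fn["parameters"],  # JSON Schema passes through unchanged
    }
```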
Environment variable configuration
If you prefer not to hard-code the base URL, use the OPENAI_BASE_URL environment variable:
export OPENAI_BASE_URL=http://localhost:4000/v1
# No base_url needed — the SDK reads OPENAI_BASE_URL
client = OpenAI(api_key="sk-...")