> Fetch the complete documentation index at https://to11.ai/docs/llms.txt to discover all available pages before exploring further.
# Vercel AI SDK

This guide shows you how to route the Vercel AI SDK through the to11 gateway using the OpenAI provider with a custom base URL.
## Installation

```bash
npm install ai @ai-sdk/openai
```
## Configuration

Create an OpenAI provider instance pointing at the to11 gateway:

```ts
import { createOpenAI } from "@ai-sdk/openai";

const to11 = createOpenAI({
  baseURL: "http://localhost:4000/v1",
  apiKey: "sk-...", // your OpenAI API key
});
```
## Generate text

```ts
import { generateText } from "ai";

const { text } = await generateText({
  model: to11("gpt-4o"),
  prompt: "Hello from to11!",
});

console.log(text);
```
## Stream text

```ts
import { streamText } from "ai";

const result = streamText({
  model: to11("gpt-4o"),
  prompt: "Count to 10",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```
## Use with Next.js

In a Next.js route handler:

```ts
// app/api/chat/route.ts
import { streamText } from "ai";
import { createOpenAI } from "@ai-sdk/openai";

const to11 = createOpenAI({
  baseURL: "http://localhost:4000/v1",
  apiKey: process.env.OPENAI_API_KEY!,
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: to11("gpt-4o"),
    messages,
  });

  return result.toDataStreamResponse();
}
```
On the client side, use the `useChat` hook as normal:

```tsx
// components/chat.tsx
"use client";
import { useChat } from "@ai-sdk/react";

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
      </form>
    </div>
  );
}
```
## Target Anthropic models

The to11 gateway routes based on the `model` field, so you can target Anthropic models through the same OpenAI provider:

```ts
const { text } = await generateText({
  model: to11("claude-sonnet-4-6"),
  prompt: "Hello from to11!",
});
```

The gateway translates the OpenAI-format request to Anthropic format and returns the response in OpenAI format, so the Vercel AI SDK works without changes.
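Conceptually, the routing step can be pictured as a prefix check on the model name. The sketch below is an illustration only; the `routeModel` name and the specific prefixes are assumptions, not to11's actual routing rules:

```typescript
// Hypothetical sketch of model-based routing. The real to11 gateway's
// rules may differ; the "claude-" prefix convention is an assumption.
type Provider = "openai" | "anthropic";

function routeModel(model: string): Provider {
  // Claude models go to Anthropic; everything else falls through to OpenAI.
  return model.startsWith("claude-") ? "anthropic" : "openai";
}
```

Because the gateway handles this mapping server-side, the client code never changes: it always speaks the OpenAI wire format regardless of which provider ultimately serves the request.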
## Tool calling

Tool calling works through the gateway as well:

```ts
import { generateText, tool } from "ai";
import { z } from "zod";

const { text, toolCalls } = await generateText({
  model: to11("gpt-4o"),
  prompt: "What's the weather in Paris?",
  tools: {
    getWeather: tool({
      description: "Get current weather for a location",
      parameters: z.object({
        location: z.string(),
      }),
      execute: async ({ location }) => {
        return { temperature: 18, unit: "celsius", location };
      },
    }),
  },
});
```