SDK Migration (updated 2026-01-22)

Switch Vercel AI SDK to abliteration.ai

Route the Vercel AI SDK through abliteration.ai using the OpenAI-compatible provider. Your generateText and streamText calls stay unchanged.

The Vercel AI SDK supports OpenAI-compatible providers via @ai-sdk/openai-compatible.

Your generateText, streamText, and useChat hooks work unchanged after configuration.

Definition
Switching the Vercel AI SDK base URL means configuring an OpenAI-compatible provider to route requests through an alternative endpoint.

Why it matters
  • Keep your existing generateText and streamText calls unchanged.
  • Works with Next.js App Router, Server Actions, and Edge Runtime.
  • Maintain React hooks like useChat and useCompletion.
  • Test alternative providers without changing your UI code.
How it works
  1. Install the @ai-sdk/openai-compatible package.
  2. Create a provider with baseURL: "https://api.abliteration.ai/v1".
  3. Replace your OpenAI key with your abliteration.ai API key.
  4. Use the provider's chatModel() in generateText or streamText.
Vercel AI SDK base URL switch
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";
import { generateText, streamText } from "ai";

// Create abliteration.ai provider
const ablit = createOpenAICompatible({
  baseURL: "https://api.abliteration.ai/v1",
  apiKey: process.env.ABLIT_KEY,
  name: "abliteration",
});

// generateText works unchanged
const { text } = await generateText({
  model: ablit.chatModel("abliterated-model"),
  prompt: "Summarize this document.",
});

// streamText works unchanged (in AI SDK 4.x it returns synchronously, no await)
const result = streamText({
  model: ablit.chatModel("abliterated-model"),
  messages: [{ role: "user", content: "Write a short story." }],
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
FAQ

Frequently asked questions.

Do React hooks like useChat still work?

Yes. Configure the provider in your API route, and client-side hooks work unchanged.
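A minimal sketch of such an API route for the Next.js App Router, assuming AI SDK 4.x; the file path, model ID, and ABLIT_KEY environment variable are illustrative placeholders:

```typescript
// app/api/chat/route.ts (hypothetical path)
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";
import { streamText } from "ai";

// Provider configured once at module scope, as in the example above
const ablit = createOpenAICompatible({
  baseURL: "https://api.abliteration.ai/v1",
  apiKey: process.env.ABLIT_KEY,
  name: "abliteration",
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: ablit.chatModel("abliterated-model"),
    messages,
  });

  // useChat on the client consumes this data stream unchanged
  return result.toDataStreamResponse();
}
```

The client-side useChat hook only sees the stream protocol returned by the route, so it never needs to know which provider is behind it.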

Does this work with Next.js Edge Runtime?

Yes. The OpenAI-compatible provider works in both Node.js and Edge runtimes.
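To opt a route into the Edge runtime, add the standard Next.js route segment config export alongside your handler; this is a one-line config fragment, not provider-specific:

```typescript
// app/api/chat/route.ts — run this route on the Edge runtime
export const runtime = "edge";
```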

Can I use multiple providers?

Yes. Create multiple providers and select them based on your routing logic.
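One way to structure that routing logic is a small provider registry keyed by task. The sketch below keeps the registry as plain data so the selection logic is testable; in real code each entry would wrap a createOpenAICompatible provider, and the keys, URLs, and model IDs are illustrative assumptions:

```typescript
// Registry of provider endpoints; in practice each entry would also
// hold the provider instance from createOpenAICompatible({ ... }).
const registry: Record<string, { baseURL: string; model: string }> = {
  abliteration: {
    baseURL: "https://api.abliteration.ai/v1",
    model: "abliterated-model",
  },
  openai: {
    baseURL: "https://api.openai.com/v1",
    model: "gpt-4o-mini",
  },
};

// Pick a provider entry by routing key, falling back to a default
function pickProvider(key: string): { baseURL: string; model: string } {
  return registry[key] ?? registry["abliteration"];
}
```

With the real providers in the registry, the handler would call pickProvider(key) and pass the resulting chatModel() to generateText or streamText, leaving the rest of the code identical.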