SDK Migration

LangChain's ChatOpenAI class supports any OpenAI-compatible endpoint via base_url. Pointing base_url at an alternative provider routes all requests through it, and your chains, agents, memory, and tools work unchanged after the switch.
```python
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

# Before: OpenAI default
# llm = ChatOpenAI(model="gpt-4")

# After: abliteration.ai
llm = ChatOpenAI(
    model="abliterated-model",
    base_url="https://api.abliteration.ai/v1",
    api_key="YOUR_ABLIT_KEY",
)

# Your chains work unchanged
response = llm.invoke("Summarize this document in three bullet points.")
print(response.content)

# Works with chains, agents, and LCEL
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("user", "{input}"),
])
chain = prompt | llm
result = chain.invoke({"input": "What is abliteration?"})
```

FAQ
Do my existing chains still work?
Yes. Chains, agents, and LCEL pipelines work unchanged after switching the LLM endpoint.
Can I stream responses?
Yes. Use llm.stream() or chain.stream() exactly as you would with OpenAI models.
Does this work with LangGraph?
Yes. LangGraph agents use the same ChatOpenAI class, so the switch applies automatically.