
Framework Integrations

Integrate KairosRoute with your favorite AI agent framework. Copy-paste code examples for every popular framework.

How It Works

KairosRoute is 100% OpenAI-compatible. Every framework connects via the OpenAI SDK or HTTP client. Simply:

  1. Install the framework's LLM library
  2. Point the base URL to https://api.kairosroute.com/v1
  3. Use your KairosRoute API key
  4. Set model="auto" to let KairosRoute intelligently route requests
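The four steps above can be sketched with nothing but Python's standard library. This only builds the request; the commented-out `urlopen` call is what would actually send it, and the API key is a placeholder:

```python
import json
import urllib.request

BASE_URL = "https://api.kairosroute.com/v1"  # step 2: KairosRoute endpoint
API_KEY = "kr-your-key"                      # step 3: your KairosRoute API key

# step 4: model="auto" lets KairosRoute choose the best model per request
payload = {
    "model": "auto",
    "messages": [{"role": "user", "content": "Say hello."}],
}

request = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(request) would send it; the response body is
# standard OpenAI chat-completion JSON.
```

In practice you would use the OpenAI SDK or a framework client (as in the examples below) rather than raw HTTP, but the wire format is identical.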

LangChain / LangGraph

Use KairosRoute with LangChain's ChatOpenAI for simple tasks or LangGraph for multi-agent workflows.

Installation

pip install langchain-openai langgraph

Basic Usage with ChatOpenAI

```python
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

# Initialize with KairosRoute endpoint
llm = ChatOpenAI(
    base_url="https://api.kairosroute.com/v1",
    api_key="kr-your-key",
    model="auto",  # KairosRoute picks the best model per request
)

# Simple invocation
response = llm.invoke("Summarize the key points of this document...")
print(response.content)

# With streaming
for chunk in llm.stream("Write a blog post about AI"):
    print(chunk.content, end="", flush=True)

# Chain usage
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("user", "{input}"),
])
chain = prompt | llm
result = chain.invoke({"input": "What is quantum computing?"})
print(result.content)
```

LangGraph Multi-Agent

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

# Create LLM with KairosRoute
llm = ChatOpenAI(
    base_url="https://api.kairosroute.com/v1",
    api_key="kr-your-key",
    model="auto",
)

# Define some tools
@tool
def search_web(query: str) -> str:
    """Search the web for information."""
    return f"Results for: {query}"

@tool
def calculate(expression: str) -> str:
    """Evaluate simple arithmetic expressions."""
    # Note: eval() on untrusted input is unsafe; use a proper math parser in production.
    return str(eval(expression))

# Create agent
tools = [search_web, calculate]
agent = create_react_agent(llm, tools)

# Run agent
result = agent.invoke({
    "messages": [("user", "Research AI trends and calculate 2024 growth rate")]
})
print(result["messages"][-1].content)
```

Pin Specific Models When Needed

```python
# Use auto-routing for most calls (best cost/quality balance)
regular_llm = ChatOpenAI(
    base_url="https://api.kairosroute.com/v1",
    api_key="kr-your-key",
    model="auto",
)

# Pin expensive models when you know you need them
expensive_llm = ChatOpenAI(
    base_url="https://api.kairosroute.com/v1",
    api_key="kr-your-key",
    model="claude-opus-4-6",  # Bypass routing for complex tasks
)

# Use in your chain
chain = prompt | expensive_llm
```

About model="auto":

Requests with model="auto" are routed to the most cost-effective model that can handle your task. KairosRoute analyzes request complexity, required capabilities, and quality metrics in real-time. This typically saves 40-60% on LLM costs compared to using a single fixed model.

Learn more: LangChain OpenAI Docs | LangGraph Docs

Pro Tips

Use model="auto" for most calls

Let KairosRoute's intelligent router optimize each request. This typically saves 40-60% on LLM costs compared to using a single fixed model, with no quality loss.

Pin expensive models when needed

For tasks that absolutely require top-tier models (e.g., complex reasoning, creative writing), specify the model directly: model="claude-opus-4-6". You bypass routing but maintain control.

Routing overhead is minimal

The intelligent routing algorithm adds less than 50ms of latency per request on average. Network and provider latency typically dominate, making routing overhead negligible.

All responses follow OpenAI format

Streaming, tool use, JSON mode, and function calling all work transparently, regardless of which provider handles the request. Your code doesn't need to know which model was used.
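To illustrate, the same parsing code handles every response regardless of provider. The JSON below is a hand-written sample in the standard OpenAI chat-completion shape, not real API output:

```python
import json

# Hand-written sample in the standard OpenAI chat-completion response shape.
sample = json.loads("""
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "model": "auto",
  "choices": [{
    "index": 0,
    "message": {"role": "assistant", "content": "Hello!"},
    "finish_reason": "stop"
  }],
  "usage": {"prompt_tokens": 5, "completion_tokens": 2, "total_tokens": 7}
}
""")

# This extraction works no matter which provider handled the request.
content = sample["choices"][0]["message"]["content"]
print(content)
```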

Check response headers for routing info

Response headers include X-KairosRoute-Model (which model handled it) and X-KairosRoute-Savings (your cost savings). Use these for monitoring and debugging.
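A minimal sketch of reading those headers, assuming they arrive as a plain dict (the header values below are invented for illustration; `log_routing_info` is a hypothetical helper, not part of any SDK):

```python
# Invented sample headers as they might appear on a KairosRoute response.
headers = {
    "content-type": "application/json",
    "x-kairosroute-model": "example-model-v1",
    "x-kairosroute-savings": "0.42",
}

def log_routing_info(headers: dict) -> str:
    """Pull routing metadata out of response headers (case-insensitive)."""
    lower = {k.lower(): v for k, v in headers.items()}
    model = lower.get("x-kairosroute-model", "unknown")
    savings = lower.get("x-kairosroute-savings", "0")
    return f"routed to {model}, saved {savings}"

print(log_routing_info(headers))
```

HTTP header names are case-insensitive, so normalizing keys before lookup keeps the helper robust across HTTP clients.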

Next Steps

Get Your API Key

Create a KairosRoute account to generate an API key for your integrations.

Dashboard →

API Reference

Dive into detailed API documentation and endpoint specifications.

Full Reference →

Quick Start

Get up and running in 5 minutes with our quick start guide.

Quick Start →

Migration Guide

Switch from OpenAI or other providers in 2 minutes.

Learn More →