Integrations
The Doubleword Inference API is OpenAI-compatible, so it works with any framework that supports custom OpenAI endpoints.
The guides below show how to configure each framework to route requests through the API. Frameworks that accept an `AsyncOpenAI` client can also use Autobatcher for batch pricing, cutting inference costs by up to 90% for background workloads.
| Framework | Language | Autobatcher support |
|---|---|---|
| LangChain / LangGraph | Python | Yes |
| Vercel AI SDK | TypeScript | Yes |
| LlamaIndex | Python | Yes |
| CrewAI | Python | — |
| OpenAI Agents SDK | Python | Yes |
| PydanticAI | Python | Yes |
| Google ADK | Python | — |
| Microsoft Agent Framework | Python | Yes |
| smolagents | Python | — |
| Mastra | TypeScript | Yes |
| Agno | Python | Yes |
| atomic-agents | Python | — |
| OpenClaw | Agent skill | — |
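Under the hood, every OpenAI-compatible integration reduces to the same thing: sending requests to the API's base URL with a bearer token, in the OpenAI chat-completions wire format. The stdlib-only sketch below builds such a request without any framework; the base URL, environment variable names, and model name are placeholder assumptions, not values from this documentation.

```python
import json
import os
import urllib.request

# Placeholder values -- substitute the base URL and key from your
# Doubleword dashboard (these names are assumptions for illustration).
BASE_URL = os.environ.get("DOUBLEWORD_BASE_URL", "https://api.example.com/v1")
API_KEY = os.environ.get("DOUBLEWORD_API_KEY", "dw-placeholder")


def build_chat_request(model: str, messages: list) -> urllib.request.Request:
    """Build an OpenAI-compatible /chat/completions request (not sent here)."""
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_chat_request("my-model", [{"role": "user", "content": "Hello"}])
# Against a live endpoint, the request would be sent with
# urllib.request.urlopen(req).
```

Each framework guide simply wires these same two settings (base URL and API key) into that framework's client configuration, typically via a custom `OpenAI` or `AsyncOpenAI` client.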