PydanticAI
PydanticAI supports custom OpenAI-compatible endpoints via its OpenAIProvider.
Install
pip install pydantic-ai
Configure
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.openai import OpenAIProvider
model = OpenAIChatModel(
    "{{selectedModel.id}}",
    provider=OpenAIProvider(
        base_url="https://api.doubleword.ai/v1",
        api_key="{{apiKey}}",
    ),
)
agent = Agent(model)
result = agent.run_sync("Say hello.")
print(result.output)
Batch pricing with Autobatcher
For background tasks where latency is not critical, use Autobatcher to route requests transparently through the Batch API at reduced cost:
pip install pydantic-ai autobatcher
from autobatcher import BatchOpenAI
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.openai import OpenAIProvider
client = BatchOpenAI(
    api_key="{{apiKey}}",
    base_url="https://api.doubleword.ai/v1",
)
model = OpenAIChatModel(
    "{{selectedModel.id}}",
    provider=OpenAIProvider(openai_client=client),
)
agent = Agent(model)
result = agent.run_sync("Say hello.")
print(result.output)
BatchOpenAI collects requests and submits them as batch jobs automatically, cutting inference costs by up to 90%. Your code stays the same; only the client changes.
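Batch routing pays off most when many requests are in flight at once, since the client can group them into a single batch job. A minimal fan-out sketch using PydanticAI's async `agent.run` (the prompt list and count here are illustrative, not part of the Autobatcher API):

```python
import asyncio

from autobatcher import BatchOpenAI
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.openai import OpenAIProvider

# Same setup as above: route requests through the batch client.
client = BatchOpenAI(
    api_key="{{apiKey}}",
    base_url="https://api.doubleword.ai/v1",
)
model = OpenAIChatModel(
    "{{selectedModel.id}}",
    provider=OpenAIProvider(openai_client=client),
)
agent = Agent(model)


async def main() -> None:
    # Submit all prompts concurrently; sequential run_sync calls would
    # leave the batcher only one request at a time to group.
    prompts = [f"Summarize document {i}." for i in range(100)]
    results = await asyncio.gather(*(agent.run(p) for p in prompts))
    for result in results:
        print(result.output)


asyncio.run(main())
```

Because batch jobs trade latency for cost, this pattern suits offline pipelines such as nightly summarization or evaluation runs rather than interactive requests.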