# OpenAI Compatible API
OpenMotoko exposes OpenAI-compatible endpoints so you can use it with any tool or library that supports the OpenAI API format.
## Endpoints

| Method | Path | Description |
|---|---|---|
| POST | `/v1/chat/completions` | Chat completions (streaming supported) |
| GET | `/v1/models` | List available models |

These endpoints live at the root path (not under `/api/`).
## Chat completions

### Request

```bash
curl http://localhost:3457/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-session-token" \
  -d '{
    "model": "balanced",
    "messages": [
      { "role": "system", "content": "You are a helpful assistant" },
      { "role": "user", "content": "What is OpenMotoko?" }
    ],
    "stream": false
  }'
```
### Response

```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1700000000,
  "model": "claude-sonnet-4-6",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "OpenMotoko is a personal AI agent..."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 25,
    "completion_tokens": 150,
    "total_tokens": 175
  }
}
```

### Streaming

Set `"stream": true` to receive Server-Sent Events:
```bash
curl http://localhost:3457/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-session-token" \
  -d '{
    "model": "fast",
    "messages": [
      { "role": "user", "content": "Hello" }
    ],
    "stream": true
  }'
```

Each SSE chunk looks like:
```text
data: {"id":"chatcmpl-abc123","object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":"Hello"},"finish_reason":null}]}

data: [DONE]
```
## List models

```bash
curl http://localhost:3457/v1/models \
  -H "Authorization: Bearer your-session-token"
```

```json
{
  "object": "list",
  "data": [
    { "id": "claude-sonnet-4-6", "object": "model", "owned_by": "anthropic" },
    { "id": "gpt-4o", "object": "model", "owned_by": "openai" }
  ]
}
```

## Model aliases
You can use OpenMotoko’s model aliases (`fast`, `balanced`, `smart`) in the `model` field. The router resolves them to the configured provider and model.
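Conceptually, alias resolution is a simple lookup. The sketch below illustrates the idea only: the provider/model pairs are placeholders, since the real mapping comes from your OpenMotoko configuration, not from any hard-coded table:

```python
# Illustrative alias table. The actual mapping lives in your OpenMotoko
# config; these provider/model pairs are examples, not defaults.
MODEL_ALIASES = {
    "fast": ("openai", "gpt-4o"),
    "balanced": ("anthropic", "claude-sonnet-4-6"),
    "smart": ("anthropic", "claude-sonnet-4-6"),
}


def resolve_model(name: str) -> tuple[str, str]:
    """Resolve an alias to (provider, model); pass unknown names through."""
    return MODEL_ALIASES.get(name, ("unknown", name))


print(resolve_model("balanced"))  # ('anthropic', 'claude-sonnet-4-6')
```

Non-alias model IDs pass through unchanged, so you can also request a concrete model like `gpt-4o` directly.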
## Using with third-party tools

### Python (openai library)

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:3457/v1",
    api_key="your-session-token",
)

response = client.chat.completions.create(
    model="balanced",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

### Continue.dev
In your Continue config:
```json
{
  "models": [
    {
      "title": "OpenMotoko",
      "provider": "openai",
      "model": "balanced",
      "apiBase": "http://localhost:3457/v1",
      "apiKey": "your-session-token"
    }
  ]
}
```

## Any OpenAI-compatible client
Point the base URL to `http://localhost:3457/v1` and use your session token as the API key. OpenMotoko handles routing to the configured LLM provider transparently.
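Any client that can send a POST request qualifies. The sketch below builds the URL, headers, and JSON body such a client would send; `build_chat_request` is an illustrative helper, not an OpenMotoko API, and actually sending the request is left to the caller since it needs a running server:

```python
import json


def build_chat_request(
    base_url: str, token: str, model: str, messages: list[dict]
) -> tuple[str, dict, bytes]:
    """Assemble the URL, headers, and body for a chat-completion call.

    Any HTTP library (urllib, requests, httpx, ...) can send the result.
    """
    url = f"{base_url}/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {token}",
    }
    body = json.dumps({"model": model, "messages": messages}).encode()
    return url, headers, body


url, headers, body = build_chat_request(
    "http://localhost:3457/v1",
    "your-session-token",
    "balanced",
    [{"role": "user", "content": "Hello"}],
)
print(url)  # http://localhost:3457/v1/chat/completions
```

Because the wire format matches OpenAI's, no client-side changes are needed beyond the base URL and the token.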