
Integration · 3 min read

Drop-In Replace the OpenAI SDK with Mycelis

Mycelis exposes a fully OpenAI-compatible API at https://mycelis.ai/api/proxy/v1. Any SDK or tool that supports OpenAI can switch to Mycelis in under a minute by changing two values: the base URL and the API key.

What you need

  • A Mycelis workspace with at least one agent configured
  • A Personal Access Token (PAT) from Workspace Settings → API Keys
  • Your agent slug (visible in the agent detail view)

Python

from openai import OpenAI

client = OpenAI(
    base_url="https://mycelis.ai/api/proxy/v1",
    api_key="your-mycelis-pat",
)

response = client.chat.completions.create(
    model="your-agent-slug",   # agent slug, not a model name
    messages=[{"role": "user", "content": "Hello!"}],
)

print(response.choices[0].message.content)

No other changes are required. All existing chat, streaming, embeddings, and tool-use calls work as-is.

Node.js / TypeScript

import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://mycelis.ai/api/proxy/v1",
  apiKey: "your-mycelis-pat",
});

const response = await client.chat.completions.create({
  model: "your-agent-slug",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(response.choices[0].message.content);

Using environment variables

Set these two variables and your existing code requires zero changes:

export OPENAI_BASE_URL="https://mycelis.ai/api/proxy/v1"
export OPENAI_API_KEY="your-mycelis-pat"

Both the Python and Node.js OpenAI SDKs read these variables automatically. Your model field should still be set to your agent slug.

Raw HTTP (curl)

curl https://mycelis.ai/api/proxy/v1/chat/completions \
  -H "Authorization: Bearer your-mycelis-pat" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "your-agent-slug",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

Streaming

Streaming works without any changes. Set stream=True (Python) or stream: true (Node.js) as you normally would:

stream = client.chat.completions.create(
    model="your-agent-slug",
    messages=[{"role": "user", "content": "Count to ten."}],
    stream=True,
)

for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="")

Notes

  • Model field = agent slug. The Mycelis gateway maps the slug to the underlying model configured in your agent. Changing the model in the agent configuration takes effect immediately without touching your code.
  • Unsupported parameters are silently ignored by the gateway, so calls that pass provider-specific options like logprobs or top_logprobs won't error out even if the underlying model doesn't support them.