Swap the OpenAI endpoint for Mycelis in any existing application. Change two lines, keep your entire codebase — and gain full control over which model runs, at what cost, under your own infrastructure.
Drop-in replacement for OpenAI
The Mycelis proxy gateway speaks the OpenAI Chat Completions API. Any SDK, framework, or tool that works with OpenAI works with Mycelis, with no code changes beyond the endpoint URL and API key. Route requests to your own deployed models, fine-tuned checkpoints, or configured agents.
Example: Python SDK
from openai import OpenAI

client = OpenAI(
    base_url="https://app.mycelis.ai/api/proxy/v1",
    api_key="pat_..."
)

# No other changes needed
response = client.chat.completions.create(
    model="my-agent",
    messages=[{"role": "user", "content": "Hello!"}]
)

What you get
Works with the official OpenAI Python, Node, and Go SDKs, as well as LangChain, LlamaIndex, and any framework that accepts a custom base URL.
Route different requests to different models by agent name. Use smart routing to automatically select the best model for each request.
Keep data within your own deployment. Choose between dedicated GPU instances and commercial deployments, with managed or BYOK keys.
Move existing OpenAI-powered apps to Mycelis without refactoring. Test with one agent, then expand to your whole stack.
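The per-agent routing described above can be sketched in plain Python. The agent names (`fast-drafter`, `careful-reviewer`) and the `auto` smart-routing alias below are hypothetical placeholders, not names Mycelis ships with; the payload shape is the standard Chat Completions request body.

```python
import json

# Hypothetical agent names; each would map to a model or agent
# you configured in your Mycelis deployment.
AGENT_FOR_TASK = {
    "draft": "fast-drafter",       # cheap, low-latency model
    "review": "careful-reviewer",  # larger, more accurate model
}

def build_request(task: str, prompt: str) -> dict:
    """Build an OpenAI-style Chat Completions payload, routed by task type."""
    return {
        # Fall back to "auto", a hypothetical smart-routing alias.
        "model": AGENT_FOR_TASK.get(task, "auto"),
        "messages": [{"role": "user", "content": prompt}],
    }

print(json.dumps(build_request("draft", "Summarize this ticket."), indent=2))
```

Because the proxy keys routing off the `model` field, a single client can fan requests out to different models without any per-model endpoint configuration.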
Create a free account, deploy a model, and point your existing code at Mycelis.
Get Started Free