OpenAI-compatible.
Your models.

Swap the OpenAI endpoint for Mycelis in any existing application. Change two lines, keep your entire codebase — and gain full control over which model runs, at what cost, under your own infrastructure.

Drop-in replacement for OpenAI

The Mycelis proxy gateway speaks the OpenAI Chat Completions API. Any SDK, framework, or tool that works with OpenAI works with Mycelis — without a single line of code changed beyond the endpoint URL and API key. Route requests to your own deployed models, fine-tuned checkpoints, or configured agents.

Example: Python SDK

Python
from openai import OpenAI

client = OpenAI(
    base_url="https://app.mycelis.ai/api/proxy/v1",
    api_key="pat_..."
)

# No other changes needed
response = client.chat.completions.create(
    model="my-agent",
    messages=[{"role": "user", "content": "Hello!"}]
)
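Under the hood, the SDK call above is a plain HTTPS request. The sketch below builds (but does not send) the equivalent raw request with the Python standard library, to make the wire format visible; the `/chat/completions` path and `Authorization: Bearer` header follow the standard OpenAI wire format, and exact gateway behavior should be checked against your deployment.

```python
import json
import urllib.request

# The SDK call is a POST to <base_url>/chat/completions with a Bearer
# token. Building the request by hand shows the wire format; nothing
# is sent here.
base_url = "https://app.mycelis.ai/api/proxy/v1"
payload = {
    "model": "my-agent",
    "messages": [{"role": "user", "content": "Hello!"}],
}
req = urllib.request.Request(
    url=f"{base_url}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer pat_...",  # your Mycelis API key
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.full_url)  # https://app.mycelis.ai/api/proxy/v1/chat/completions
```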

What you get

Full SDK Compatibility

Works with the official OpenAI Python, Node, and Go SDKs, as well as LangChain, LlamaIndex, and any framework that accepts a custom base URL.

Multi-Model Routing

Route different requests to different models by agent name. Use smart routing to automatically select the best model for each request.
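One way to route by agent name is a small client-side lookup that picks the `model` value per request. The agent names below (`fast-agent`, `reasoning-agent`) are hypothetical placeholders for agents you would configure in Mycelis; this is a minimal sketch, not the gateway's built-in smart routing.

```python
# Hypothetical task-to-agent routing table; through the gateway, each
# configured agent is addressed by its name via the standard "model" field.
ROUTES = {
    "chat": "fast-agent",
    "analysis": "reasoning-agent",
}

def pick_model(task: str, default: str = "fast-agent") -> str:
    """Return the Mycelis agent name to use for this task type."""
    return ROUTES.get(task, default)

print(pick_model("analysis"))  # reasoning-agent
```

The chosen name is then passed as `model=` in the usual `chat.completions.create` call.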

Your Own Infrastructure

Keep data within your own deployment. Run on dedicated GPU instances, or use commercial model deployments with managed or bring-your-own-key (BYOK) credentials.

Easy Migration

Move existing OpenAI-powered apps to Mycelis without refactoring. Test with one agent, then expand to your whole stack.
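A low-friction migration path is to keep the endpoint and key in environment variables and flip them per deployment, leaving call sites untouched. The official OpenAI Python SDK reads `OPENAI_API_KEY`, and recent versions also honor `OPENAI_BASE_URL`; verify this against the SDK version you run.

```python
import os

# Point an existing app at Mycelis via configuration only.
# OpenAI() picks these up automatically in recent SDK versions,
# so no code in the call sites has to change.
os.environ["OPENAI_BASE_URL"] = "https://app.mycelis.ai/api/proxy/v1"
os.environ["OPENAI_API_KEY"] = "pat_..."  # your Mycelis API key

print(os.environ["OPENAI_BASE_URL"])
```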

Frequently Asked Questions

Which OpenAI API features are supported?

The Mycelis gateway supports the Chat Completions endpoint, which covers the vast majority of use cases. Streaming, function calling, and system prompts all work as expected.
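For reference, streamed Chat Completions responses arrive as server-sent events: each line is `data: {json chunk}`, terminated by `data: [DONE]`, with the text delta in `choices[0].delta.content`. SDKs handle this parsing for you; the sketch below walks a canned sample purely to illustrate the format.

```python
import json

# A canned sample mirroring the Chat Completions streaming shape.
sample = (
    'data: {"choices": [{"delta": {"content": "Hel"}}]}\n'
    'data: {"choices": [{"delta": {"content": "lo!"}}]}\n'
    "data: [DONE]\n"
)

def collect_text(raw: str) -> str:
    """Concatenate the content deltas from an SSE-formatted stream."""
    parts = []
    for line in raw.splitlines():
        if not line.startswith("data: "):
            continue
        body = line[len("data: "):]
        if body == "[DONE]":
            break
        chunk = json.loads(body)
        parts.append(chunk["choices"][0]["delta"].get("content", ""))
    return "".join(parts)

print(collect_text(sample))  # Hello!
```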

What do I set as the model name?

Use the name of your Mycelis agent as the model name. Each agent you configure appears as a selectable model through the gateway.

Does this work with LangChain, LlamaIndex, or other frameworks?

Yes. Any framework that accepts a custom base URL for an OpenAI-compatible provider works with Mycelis. Set the base URL to your Mycelis gateway and provide your API key.

Migrate in minutes.

Create a free account, deploy a model, and point your existing code at Mycelis.

Get Started Free