Yes. Kodus supports Bring Your Own Key (BYOK) on every plan, letting you use your preferred LLM provider. You pay the provider directly — Kodus never marks up tokens and never sees your key in plain text.
How it works
- Open Settings → BYOK (app.kodus.io/organization/byok)
- Pick a recommended model from the curated catalog, or click Configure manually for any other provider/endpoint
- Paste your API key and click Test & save — Kodus validates the key with a cheap metadata call before persisting
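The validation step in the last bullet can be sketched as follows. This is a minimal example assuming an OpenAI-compatible provider, where listing models (`GET /v1/models`) is a cheap, token-free metadata call; the `fetch_models` transport and the key strings are hypothetical stand-ins for a real HTTP client, not Kodus internals.

```python
from typing import Callable

def validate_api_key(api_key: str, fetch_models: Callable[[dict], int]) -> bool:
    """Probe the provider with a cheap metadata call before persisting the key.

    `fetch_models` performs GET /v1/models with the given headers and returns
    the HTTP status code; a 200 means the key is accepted by the provider.
    """
    headers = {"Authorization": f"Bearer {api_key}"}
    return fetch_models(headers) == 200

# Stub transport standing in for a real HTTP client:
def fake_transport(headers: dict) -> int:
    return 200 if headers.get("Authorization") == "Bearer sk-good" else 401

print(validate_api_key("sk-good", fake_transport))  # True
print(validate_api_key("sk-bad", fake_transport))   # False
```

Listing models is a common choice for this probe because it authenticates the key without generating any billable tokens.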
Curated recommended models
Ready-to-click cards with pre-tuned defaults (temperature, max tokens, reasoning level):
- Claude Sonnet 4.6 / Opus 4.7 (Anthropic)
- Gemini 3.1 Pro custom tools (Google)
- GPT-5.4 (OpenAI)
- Kimi K2.6 Coding (Moonshot AI — Developer API or Kimi Code Plan)
- GLM 5.1 (Z.ai — Developer API or Coding Plan)
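A curated card's pre-tuned defaults can be pictured as a small config record. The field values and model identifiers below are illustrative placeholders, not the catalog's actual tuning.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ModelDefaults:
    # Knobs a curated card pre-fills; all values here are hypothetical.
    model: str
    temperature: float
    max_tokens: int
    reasoning_level: str

# Illustrative catalog entries (names and numbers are examples only):
CATALOG = {
    "claude-sonnet": ModelDefaults("claude-sonnet-4.6", 0.2, 8192, "medium"),
    "gpt": ModelDefaults("gpt-5.4", 0.2, 8192, "medium"),
}

print(CATALOG["gpt"].max_tokens)  # 8192
```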
Why use your own provider
- Data control — requests go directly to your provider
- Cost management — use your existing billing relationship, no markup
- Model choice — pick the model that works best for your codebase
- Compliance — meet internal requirements for AI tool usage
- Fallback resilience — configure a Main + Fallback to keep reviews running during provider outages
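The Main + Fallback behaviour in the last bullet can be sketched as a generic try-then-fallback pattern; the provider functions below are hypothetical stubs, not Kodus's actual implementation.

```python
from typing import Callable

def review_with_fallback(prompt: str,
                         main: Callable[[str], str],
                         fallback: Callable[[str], str]) -> str:
    """Send the request to the Main provider; on a provider error, retry on Fallback."""
    try:
        return main(prompt)
    except Exception:
        return fallback(prompt)

# Stubs simulating an outage on the Main provider:
def flaky_main(prompt: str) -> str:
    raise ConnectionError("provider outage")

def healthy_fallback(prompt: str) -> str:
    return f"review({prompt})"

print(review_with_fallback("diff", flaky_main, healthy_fallback))  # review(diff)
```

With a fallback configured, a provider outage degrades to a retry on the second provider instead of a failed review.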
Self-hosted LLM providers
If you self-host Kodus, you can point BYOK at self-hosted or aggregator endpoints like:
- Novita, Groq, Together AI, Fireworks AI
- Chutes, Synthetic
- Your own Ollama / vLLM / TGI instance (via the OpenAI Compatible provider)
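Pointing the OpenAI Compatible provider at a self-hosted instance mostly comes down to a base URL and a model name. The sketch below assumes a local Ollama server on its default port; the exact config keys and the model tag are illustrative, not Kodus's schema.

```python
# Hypothetical BYOK settings for a self-hosted endpoint reached through
# an OpenAI-compatible API; URL and model values are examples only.
byok_config = {
    "provider": "openai-compatible",
    "base_url": "http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
    "model": "qwen2.5-coder:32b",
    "api_key": "ollama",  # many local servers accept any placeholder key
}

print(byok_config["base_url"])  # http://localhost:11434/v1
```

vLLM and TGI expose the same kind of `/v1` OpenAI-compatible surface, so only the base URL and model name change.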