| title | type | permalink |
|---|---|---|
| litellm-zen-provider-catalog | note | voyage/research/litellm-zen-provider-catalog |
# Research: LiteLLM provider catalog and OpenCode Zen support

Date: 2026-03-08
Related plan: AI travel agent in Collections Recommendations
## LiteLLM provider enumeration

- The runtime provider list is available via `litellm.provider_list` and currently returns 128 provider IDs in this environment.
- The enum source `LlmProviders` can be used for canonical provider identifiers.
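A minimal enumeration sketch. `litellm.provider_list` is a real LiteLLM attribute; the fallback list here is a stub for environments where the package is not installed, and the 128 count is environment-specific:

```python
# Sketch: enumerate LiteLLM's runtime provider IDs.
try:
    import litellm

    # provider_list entries may be str-enum members; normalize to plain strings.
    provider_ids = sorted(
        p.value if hasattr(p, "value") else str(p) for p in litellm.provider_list
    )
except ImportError:
    # litellm not installed; illustrative stub, not the real 128-entry list.
    provider_ids = ["anthropic", "openai", "vertex_ai"]

assert "openai" in provider_ids
```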
## OpenCode Zen compatibility
- OpenCode Zen is not a native LiteLLM provider alias.
- Zen can be supported via LiteLLM's OpenAI-compatible routing using:
  - provider id in app: `opencode_zen`
  - model namespace: `openai/<zen-model>`
  - api_base: `https://opencode.ai/zen/v1`
- No new SDK dependency required.
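The routing above can be sketched as a small kwargs builder for `litellm.completion`. The function name and the `"grok-code"` model ID are illustrative placeholders, not confirmed Zen identifiers:

```python
# Sketch: route an OpenCode Zen model through LiteLLM's OpenAI-compatible path.
ZEN_API_BASE = "https://opencode.ai/zen/v1"


def zen_completion_kwargs(zen_model: str, api_key: str) -> dict:
    """Build keyword arguments for litellm.completion(**kwargs)."""
    return {
        # The openai/ prefix selects LiteLLM's OpenAI-compatible route.
        "model": f"openai/{zen_model}",
        "api_base": ZEN_API_BASE,
        "api_key": api_key,
    }


# Usage (would make a network call; "grok-code" is a placeholder model ID):
# import litellm
# resp = litellm.completion(
#     messages=[{"role": "user", "content": "hi"}],
#     **zen_completion_kwargs("grok-code", user_key),
# )
```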
## Recommended backend contract
- Add a backend source-of-truth endpoint: `GET /api/chat/providers/`.
- Response fields: `id`, `label`, `available_for_chat`, `needs_api_key`, `default_model`, `api_base`.
- Return all LiteLLM runtime providers; mark non-mapped providers with `available_for_chat=false` so the UI can display them without offering them for chat.
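A sketch of the response builder behind that endpoint, using the field names from the contract above. `CHAT_PROVIDER_MAP` and its default models are hypothetical placeholders, not the app's actual configuration:

```python
# Sketch: build the /api/chat/providers/ response from the runtime provider list.
# Hypothetical backend mapping of chat-capable providers (placeholder values).
CHAT_PROVIDER_MAP = {
    "openai": {"default_model": "gpt-4o-mini", "api_base": None},
    "opencode_zen": {
        "default_model": "grok-code",  # placeholder, not a confirmed Zen model ID
        "api_base": "https://opencode.ai/zen/v1",
    },
}


def build_provider_catalog(runtime_providers: list[str]) -> list[dict]:
    """Return all runtime providers, flagging which are usable for chat."""
    catalog = []
    for pid in sorted(runtime_providers):
        mapped = CHAT_PROVIDER_MAP.get(pid)
        catalog.append({
            "id": pid,
            "label": pid.replace("_", " ").title(),
            "available_for_chat": mapped is not None,
            "needs_api_key": True,
            "default_model": mapped["default_model"] if mapped else None,
            "api_base": mapped["api_base"] if mapped else None,
        })
    return catalog
```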
## Data/storage compatibility notes
- The existing `UserAPIKey(provider)` model supports adding `opencode_zen` without a migration.
- Provider IDs must be used consistently across serializer validation, key lookup, and the chat request payload.
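One way to enforce that consistency is a single shared validator used by all three code paths. The function and the provider set below are illustrative, not the app's actual code:

```python
# Sketch: one provider-ID validator shared by serializer validation,
# key lookup, and chat payload handling (hypothetical helper).
KNOWN_PROVIDER_IDS = {"openai", "anthropic", "opencode_zen"}


def validate_provider_id(provider: str) -> str:
    """Normalize a provider ID and reject unknown values."""
    normalized = provider.strip().lower()
    if normalized not in KNOWN_PROVIDER_IDS:
        raise ValueError(f"Unknown provider: {provider!r}")
    return normalized
```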
## Risks
- Zen model names may evolve; keep the default model configurable in the backend mapping.
- The full provider list is large; the UI should clearly communicate which providers are unavailable for chat.