| title | type | permalink |
|---|---|---|
| ai-configuration | note | voyage/knowledge/domain/ai-configuration |
# AI Configuration Domain

## WS1 Configuration Infrastructure

### WS1-F1: Instance-level env vars and key fallback

- `settings.py`: `VOYAGE_AI_PROVIDER`, `VOYAGE_AI_MODEL`, `VOYAGE_AI_API_KEY`
- `get_llm_api_key(user, provider)` falls back to the instance key only when the requested provider matches `VOYAGE_AI_PROVIDER`
- Fallback chain: user key -> matching-provider instance key -> error (sketched below)
- See `tech-stack.md`, `decisions.md`
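
A minimal sketch of the fallback chain, assuming the env vars above are exposed via Django settings; the `UserAPIKey` lookup (the `api_keys` related name and `key` field) is a hypothetical stand-in, not the exact implementation:

```python
# chat/llm_client.py (sketch) -- fallback chain:
# user key -> matching-provider instance key -> error
from django.conf import settings


def get_llm_api_key(user, provider: str) -> str:
    # 1. Per-user key stored in the database. The `api_keys` related name
    #    and the `key` field are assumptions for illustration.
    if user is not None and user.is_authenticated:
        user_key = (
            user.api_keys.filter(provider=provider)
            .values_list("key", flat=True)
            .first()
        )
        if user_key:
            return user_key

    # 2. The instance-level key applies only when the requested provider
    #    matches the instance default (VOYAGE_AI_PROVIDER).
    if provider == settings.VOYAGE_AI_PROVIDER and settings.VOYAGE_AI_API_KEY:
        return settings.VOYAGE_AI_API_KEY

    # 3. No key available for this provider.
    raise ValueError(f"No API key configured for provider {provider!r}")
```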
### WS1-F2: UserAISettings model

- `integrations/models.py`: `UserAISettings` (OneToOneField to user) with `preferred_provider` and `preferred_model` (see the sketch below)
- Endpoint: `/api/integrations/ai-settings/` (upsert pattern)
- Migration: `0008_useraisettings.py`
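
A sketch of the model under stated assumptions (field lengths, `blank=True`, and the `related_name` are guesses; the OneToOneField and the two preference fields come from this note):

```python
# integrations/models.py (sketch)
from django.conf import settings
from django.db import models


class UserAISettings(models.Model):
    """Per-user default LLM provider/model; one row per user, upserted."""

    user = models.OneToOneField(
        settings.AUTH_USER_MODEL,
        on_delete=models.CASCADE,
        related_name="ai_settings",  # related_name is an assumption
    )
    preferred_provider = models.CharField(max_length=64, blank=True)
    preferred_model = models.CharField(max_length=128, blank=True)
```

The upsert behind the endpoint then maps naturally to a single `UserAISettings.objects.update_or_create(user=request.user, defaults=...)` call in the view.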
### WS1-F3: Provider catalog enhancement

- `get_provider_catalog(user=None)` adds `instance_configured` and `user_configured` booleans (sketched below)
- User API keys are prefetched once per request (no N+1 queries)
- `ChatProviderCatalogEntry` TypeScript type updated with both fields
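
A sketch of how the two flags can be computed, assuming a hypothetical `PROVIDERS` registry and the same `api_keys` related name as above; the single `values_list` query up front is what avoids the per-provider N+1 lookups:

```python
# chat/llm_client.py (sketch)
from django.conf import settings

PROVIDERS = ("openai", "anthropic", "google")  # illustrative registry


def get_provider_catalog(user=None) -> list[dict]:
    # Fetch the user's keyed providers once per request (no N+1 queries)
    user_providers: set[str] = set()
    if user is not None and user.is_authenticated:
        user_providers = set(user.api_keys.values_list("provider", flat=True))

    return [
        {
            "provider": provider,
            "instance_configured": (
                provider == settings.VOYAGE_AI_PROVIDER
                and bool(settings.VOYAGE_AI_API_KEY)
            ),
            "user_configured": provider in user_providers,
        }
        for provider in PROVIDERS
    ]
```

The `ChatProviderCatalogEntry` TypeScript type mirrors these two booleans on the frontend.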
## Frontend Provider Selection (Fixed)

- No longer hardcodes `selectedProvider = 'openai'`; auto-selects the first usable provider (see the sketch after this list)
- Filters to configured and usable entries only (`available_for_chat && (user_configured || instance_configured)`)
- Shows a warning alert with a Settings link when no providers are configured
- Model selection uses a dropdown populated from `GET /api/chat/providers/{provider}/models/`
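
The selection rule, restated in Python to keep one language across this note's sketches (the real code lives in the TypeScript frontend); `catalog` entries carry the flags from `get_provider_catalog`, and the `preferred` parameter stands in for the saved `UserAISettings` default:

```python
def select_default_provider(catalog: list[dict], preferred: str | None = None):
    """Saved preference if usable, else the first usable entry, else None."""

    def usable(entry: dict) -> bool:
        # Mirrors: available_for_chat && (user_configured || instance_configured)
        return entry["available_for_chat"] and (
            entry["user_configured"] or entry["instance_configured"]
        )

    usable_entries = [e for e in catalog if usable(e)]
    if preferred is not None:
        for entry in usable_entries:
            if entry["provider"] == preferred:
                return entry
    # An empty result is where the UI shows the warning alert + Settings link
    return usable_entries[0] if usable_entries else None
```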
## Known Frontend Gaps

### Root Cause of User-Facing LLM Errors

Three compounding issues (all resolved):

1. Hardcoded `'openai'` default (fixed: auto-selects the first usable provider)
2. No provider status feedback (fixed: catalog `instance_configured`/`user_configured` fields now consumed)
3. `UserAISettings.preferred_provider` never loaded (fixed: Settings UI saves/loads DB defaults; chat initializes from saved prefs)

Related notes:

- `FIELD_ENCRYPTION_KEY` not set disables key storage (env-dependent)
- `ChatProviderCatalogEntry` TypeScript type was missing the new fields (fixed)
## Key Edit Reference Points

| Feature | File | Location |
|---|---|---|
| AI env vars | `backend/server/main/settings.py` | after `FIELD_ENCRYPTION_KEY` |
| Fallback key | `backend/server/chat/llm_client.py` | `get_llm_api_key()` |
| UserAISettings model | `backend/server/integrations/models.py` | after `UserAPIKey` |
| Catalog user flags | `backend/server/chat/llm_client.py` | `get_provider_catalog()` |
| Provider view | `backend/server/chat/views/__init__.py` | `ChatProviderCatalogViewSet` |