Files
voyage/backend/server/requirements.txt
alex 757140ec70 feat(chat): add LLM-powered travel agent with multi-provider support
Implement a full chat-based travel agent using LiteLLM for multi-provider
LLM support (OpenAI, Anthropic, Gemini, Ollama, Groq, Mistral, etc.).

Backend:
- New 'chat' Django app with ChatConversation and ChatMessage models
- Streaming SSE endpoint via StreamingHttpResponse
- 5 agent tools: search_places, list_trips, get_trip_details,
  add_to_itinerary, get_weather
- LiteLLM client wrapper with per-user API key retrieval
- System prompt with user preference context injection
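The streaming pattern described above (LiteLLM completion chunks relayed through a Django StreamingHttpResponse as Server-Sent Events) can be sketched roughly as follows. This is a minimal illustration, not the app's actual code: the helper names (`sse_frame`, `relay_deltas`) and the event payload shape (`{"delta": ...}` with a `[DONE]` sentinel) are assumptions.

```python
import json


def sse_frame(event: dict) -> str:
    """Format one Server-Sent Events frame: 'data: <json>\\n\\n'."""
    return f"data: {json.dumps(event)}\n\n"


def relay_deltas(deltas):
    """Turn a stream of text deltas (e.g. the content pieces pulled
    from litellm streaming chunks) into SSE frames, ending with a
    [DONE] sentinel so the client knows the response is complete."""
    for delta in deltas:
        if delta:  # skip empty chunks some providers emit
            yield sse_frame({"delta": delta})
    yield "data: [DONE]\n\n"


# In the Django view, a generator like this would be wrapped as:
#   StreamingHttpResponse(relay_deltas(...),
#                         content_type="text/event-stream")
# so the response is flushed chunk-by-chunk instead of buffered.
```

Yielding from a generator is what lets Django flush each frame as it is produced; returning a fully built string would defeat the streaming.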

Frontend:
- New /chat route with full-page chat UI (DaisyUI + Tailwind)
- Collapsible conversation sidebar with CRUD
- SSE streaming response display with tool call visualization
- Provider selector dropdown
- SSE proxy fix to stream text/event-stream without buffering
- Navbar link and i18n keys
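The tool calls the UI visualizes are presumably declared to the model in the OpenAI-style function schema that `litellm.completion(tools=...)` accepts across providers. A hedged sketch for one of the five tools; the parameter names and descriptions here are illustrative assumptions, not the repo's actual schema:

```python
# Assumed shape of one tool declaration in the OpenAI function-calling
# format that LiteLLM normalizes across providers. Field values are
# illustrative; only the structure ("type"/"function"/"parameters")
# is the standard schema.
GET_WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Fetch the current weather for a destination.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City or place name, e.g. 'Lisbon'",
                },
            },
            "required": ["location"],
        },
    },
}
```

When the model emits a `tool_calls` entry naming `get_weather`, the backend runs the matching Python function and feeds the result back as a `tool` role message before continuing the stream.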
2026-03-08 18:44:44 +00:00


Django==5.2.12
djangorestframework>=3.16.1
django-allauth==0.63.6
django-money==3.6.0
django-invitations==2.1.0
drf-yasg==1.21.15
django-cors-headers==4.9.0
coreapi==2.3.3
python-dotenv==1.2.2
psycopg2-binary==2.9.11
pillow==12.1.1
whitenoise==6.12.0
django-resized==1.0.3
django-geojson==4.2.0
setuptools==79.0.1
gunicorn==23.0.0
qrcode==8.2
slippers==0.6.2
django-allauth-ui==1.8.1
django-widget-tweaks==1.5.1
django-ical==1.9.2
icalendar==6.3.2
ijson==3.5.0
tqdm==4.67.3
overpy==0.7
publicsuffix2==2.20191221
geopy==2.4.1
psutil==6.1.1
geojson==3.2.0
gpxpy==1.6.2
pymemcache==4.0.0
legacy-cgi==2.6.4
requests>=2.32.5
cryptography>=46.0.5
django-mcp-server>=0.5.7
litellm>=1.72.3