feat(chat): add LLM-powered travel agent with multi-provider support

Implement a full chat-based travel agent using LiteLLM for multi-provider
LLM support (OpenAI, Anthropic, Gemini, Ollama, Groq, Mistral, etc.).

Backend:
- New 'chat' Django app with ChatConversation and ChatMessage models
- Streaming SSE endpoint via StreamingHttpResponse
- 5 agent tools: search_places, list_trips, get_trip_details,
  add_to_itinerary, get_weather
- LiteLLM client wrapper with per-user API key retrieval
- System prompt with user preference context injection
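The streaming endpoint described above can be sketched as a plain generator that formats Server-Sent Events frames. This is a hypothetical minimal sketch, not the commit's actual code: the `sse_event`/`stream_chat` names and the `{"type": "message", "content": ...}` payload shape are assumptions; in the Django view the generator would be fed by the LiteLLM stream and wrapped in `StreamingHttpResponse(..., content_type="text/event-stream")`.

```python
import json
from typing import Iterable, Iterator


def sse_event(payload: dict) -> str:
    """Format one Server-Sent Events frame: 'data: <json>\\n\\n'."""
    return f"data: {json.dumps(payload)}\n\n"


def stream_chat(deltas: Iterable[str]) -> Iterator[str]:
    """Yield one SSE frame per text delta, then a terminal [DONE] frame.

    In the view, `deltas` would come from the LiteLLM streaming response,
    and the result would be returned as
    StreamingHttpResponse(stream_chat(deltas), content_type="text/event-stream").
    """
    for delta in deltas:
        yield sse_event({"type": "message", "content": delta})
    yield "data: [DONE]\n\n"
```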
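The five agent tools listed above would typically be declared in the OpenAI function-calling schema that LiteLLM forwards to each provider. The tool names come from this commit message; the parameter shapes and the `tool_schema` helper are illustrative assumptions, not the repository's actual definitions.

```python
def tool_schema(name: str, description: str, properties: dict, required: list) -> dict:
    """Build one tool definition in the OpenAI function-calling format."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }


# Parameter shapes below are guesses for illustration only.
AGENT_TOOLS = [
    tool_schema("search_places", "Search for places matching a query.",
                {"query": {"type": "string"}}, ["query"]),
    tool_schema("list_trips", "List the current user's trips.", {}, []),
    tool_schema("get_trip_details", "Fetch details for one trip.",
                {"trip_id": {"type": "string"}}, ["trip_id"]),
    tool_schema("add_to_itinerary", "Add a place to a trip's itinerary.",
                {"trip_id": {"type": "string"}, "place_id": {"type": "string"}},
                ["trip_id", "place_id"]),
    tool_schema("get_weather", "Get a weather forecast for a location.",
                {"location": {"type": "string"}}, ["location"]),
]
```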

Frontend:
- New /chat route with full-page chat UI (DaisyUI + Tailwind)
- Collapsible conversation sidebar with CRUD
- SSE streaming response display with tool call visualization
- Provider selector dropdown
- SSE proxy fix to stream text/event-stream without buffering
- Navbar link and i18n keys
2026-03-08 18:44:44 +00:00
parent d4e0ef14b8
commit 757140ec70
20 changed files with 1518 additions and 6 deletions

@@ -69,6 +69,7 @@ INSTALLED_APPS = (
     "worldtravel",
     "users",
     "integrations",
+    "chat",
     "mcp_server",
     "django.contrib.gis",
     # 'achievements', # Not done yet, will be added later in a future update

@@ -31,6 +31,7 @@ schema_view = get_schema_view(
 urlpatterns = [
     path("api/", include("adventures.urls")),
     path("api/", include("worldtravel.urls")),
+    path("api/chat/", include("chat.urls")),
     path(
         getattr(settings, "DJANGO_MCP_ENDPOINT", "api/mcp"),
         MCPServerStreamableHttpView.as_view(