feat(chat): add LLM-powered travel agent with multi-provider support
Implement a full chat-based travel agent using LiteLLM for multi-provider LLM support (OpenAI, Anthropic, Gemini, Ollama, Groq, Mistral, etc.).

Backend:
- New 'chat' Django app with ChatConversation and ChatMessage models
- Streaming SSE endpoint via StreamingHttpResponse
- 5 agent tools: search_places, list_trips, get_trip_details, add_to_itinerary, get_weather
- LiteLLM client wrapper with per-user API key retrieval
- System prompt with user preference context injection

Frontend:
- New /chat route with full-page chat UI (DaisyUI + Tailwind)
- Collapsible conversation sidebar with CRUD
- SSE streaming response display with tool call visualization
- Provider selector dropdown
- SSE proxy fix to stream text/event-stream without buffering
- Navbar link and i18n keys
@@ -69,6 +69,7 @@ INSTALLED_APPS = (
     "worldtravel",
     "users",
     "integrations",
+    "chat",
     "mcp_server",
     "django.contrib.gis",
     # 'achievements', # Not done yet, will be added later in a future update
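The provider selector dropdown maps onto LiteLLM's convention of prefixed model strings (e.g. `ollama/llama3`). A minimal sketch, assuming one static default model per provider; every model name below is illustrative and not taken from this commit:

```python
# Illustrative provider-to-model mapping. LiteLLM routes a request to the
# right backend by the prefix of the model string; plain OpenAI model
# names need no prefix. The actual defaults in this commit are not shown.
DEFAULT_MODELS = {
    "openai": "gpt-4o-mini",
    "anthropic": "anthropic/claude-3-5-sonnet-20241022",
    "gemini": "gemini/gemini-1.5-flash",
    "ollama": "ollama/llama3",
    "groq": "groq/llama-3.1-8b-instant",
    "mistral": "mistral/mistral-small-latest",
}


def resolve_model(provider: str) -> str:
    """Map a provider chosen in the UI to a LiteLLM model string."""
    try:
        return DEFAULT_MODELS[provider]
    except KeyError:
        raise KeyError(f"unsupported provider: {provider}") from None
```

The resolved string, together with the per-user API key retrieved server-side, would then be passed to the LiteLLM client wrapper.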
@@ -31,6 +31,7 @@ schema_view = get_schema_view(
 urlpatterns = [
     path("api/", include("adventures.urls")),
     path("api/", include("worldtravel.urls")),
+    path("api/chat/", include("chat.urls")),
     path(
         getattr(settings, "DJANGO_MCP_ENDPOINT", "api/mcp"),
         MCPServerStreamableHttpView.as_view(
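The five agent tools named in the commit message would typically be routed through a name-to-handler table when the model emits a tool call. A hedged sketch with stub handlers; the real handlers in this commit query the database and external APIs, and only two of the five tools are stubbed here:

```python
# Stub handlers standing in for two of the commit's five agent tools.
def search_places(query: str) -> dict:
    return {"tool": "search_places", "query": query}


def get_weather(location: str) -> dict:
    return {"tool": "get_weather", "location": location}


# Dispatch table; list_trips, get_trip_details, and add_to_itinerary
# would be registered the same way.
TOOLS = {
    "search_places": search_places,
    "get_weather": get_weather,
}


def dispatch_tool(name: str, arguments: dict) -> dict:
    """Route an LLM tool call to its handler. Unknown names raise so the
    agent loop can surface an error event instead of failing silently."""
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**arguments)
```

Each tool's result can be serialized back into the SSE stream as a tool-call event, which is what the frontend's tool call visualization renders.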