Implement a full chat-based travel agent using LiteLLM for multi-provider LLM support (OpenAI, Anthropic, Gemini, Ollama, Groq, Mistral, etc.).

Backend:
- New 'chat' Django app with ChatConversation and ChatMessage models (sketched below)
- Streaming SSE endpoint via StreamingHttpResponse (view sketch after the urls excerpt below)
- 5 agent tools: search_places, list_trips, get_trip_details, add_to_itinerary, get_weather
- LiteLLM client wrapper with per-user API key retrieval
- System prompt with user preference context injection

Frontend:
- New /chat route with full-page chat UI (DaisyUI + Tailwind)
- Collapsible conversation sidebar with CRUD
- SSE streaming response display with tool-call visualization
- Provider selector dropdown
- SSE proxy fix to stream text/event-stream without buffering
- Navbar link and i18n keys
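The model schema itself isn't included in this excerpt. A minimal sketch of what the two models could look like, assuming conventional field names (everything below other than the model names is an assumption, not the actual schema):

# chat/models.py -- hypothetical sketch; actual field names may differ
from django.conf import settings
from django.db import models


class ChatConversation(models.Model):
    """One chat thread per user; listed in the collapsible sidebar."""

    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE,
                             related_name="chat_conversations")
    title = models.CharField(max_length=255, blank=True)
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)


class ChatMessage(models.Model):
    """A single turn: user input, assistant reply, or tool output."""

    ROLE_CHOICES = [("user", "user"), ("assistant", "assistant"), ("tool", "tool")]

    conversation = models.ForeignKey(ChatConversation, on_delete=models.CASCADE,
                                     related_name="messages")
    role = models.CharField(max_length=16, choices=ROLE_CHOICES)
    content = models.TextField(blank=True)
    created_at = models.DateTimeField(auto_now_add=True)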
12 lines · 277 B · Python
from django.urls import include, path
from rest_framework.routers import DefaultRouter

from .views import ChatViewSet

router = DefaultRouter()
router.register(r"conversations", ChatViewSet, basename="chat-conversation")

urlpatterns = [
    path("", include(router.urls)),
]
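The router exposes conversation CRUD for the sidebar, and the streaming endpoint can hang off the same viewset as an extra action. The view code isn't part of this excerpt; what follows is a minimal sketch of how the SSE action could be wired with litellm, under stated assumptions: the action name (send), the helpers (get_user_api_key, build_prompt, AGENT_TOOLS), and the default model are all hypothetical, and the tool-execution loop is elided for brevity.

# chat/views.py -- hypothetical sketch; every name here is an assumption
import json

import litellm
from django.http import StreamingHttpResponse
from rest_framework.decorators import action
from rest_framework.viewsets import ModelViewSet

# One of the five agent tools as an OpenAI-style function schema
# (litellm forwards this shape to every supported provider).
AGENT_TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "search_places",
            "description": "Search for places of interest near a location.",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        },
    },
    # ...plus list_trips, get_trip_details, add_to_itinerary, get_weather
]


def get_user_api_key(user, model):
    """Hypothetical: look up the user's stored key for the chosen provider."""
    # Per-user key storage isn't shown in this change; None falls back to env vars.
    return None


def build_prompt(conversation, user):
    """Hypothetical: system prompt with user-preference context, then history."""
    system = {"role": "system",
              "content": f"You are a travel-planning assistant for {user.get_username()}."}
    history = [{"role": m.role, "content": m.content}
               for m in conversation.messages.order_by("created_at")]
    return [system, *history]


class ChatViewSet(ModelViewSet):
    """Conversation CRUD plus a streaming SSE action."""

    # queryset / serializer_class / permission_classes omitted from this sketch.

    @action(detail=True, methods=["post"])
    def send(self, request, pk=None):
        conversation = self.get_object()
        model = request.data.get("model", "gpt-4o-mini")  # value from the provider dropdown

        def event_stream():
            response = litellm.completion(
                model=model,
                messages=build_prompt(conversation, request.user),
                api_key=get_user_api_key(request.user, model),
                tools=AGENT_TOOLS,
                stream=True,
            )
            for chunk in response:
                delta = chunk.choices[0].delta.content
                if delta:  # tool-call deltas (content=None) are skipped in this sketch
                    yield f"data: {json.dumps({'content': delta})}\n\n"
            yield "data: [DONE]\n\n"

        resp = StreamingHttpResponse(event_stream(), content_type="text/event-stream")
        resp["Cache-Control"] = "no-cache"
        resp["X-Accel-Buffering"] = "no"  # disable proxy buffering so tokens flush immediately
        return resp

The X-Accel-Buffering header is the server-side half of the "without buffering" fix: both the app server and the frontend SSE proxy have to avoid buffering text/event-stream for tokens to render incrementally.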