feat(chat): add LLM-powered travel agent with multi-provider support
Implement a full chat-based travel agent using LiteLLM for multi-provider LLM support (OpenAI, Anthropic, Gemini, Ollama, Groq, Mistral, etc.).

Backend:
- New 'chat' Django app with ChatConversation and ChatMessage models
- Streaming SSE endpoint via StreamingHttpResponse
- 5 agent tools: search_places, list_trips, get_trip_details, add_to_itinerary, get_weather
- LiteLLM client wrapper with per-user API key retrieval
- System prompt with user preference context injection

Frontend:
- New /chat route with full-page chat UI (DaisyUI + Tailwind)
- Collapsible conversation sidebar with CRUD
- SSE streaming response display with tool call visualization
- Provider selector dropdown
- SSE proxy fix to stream text/event-stream without buffering
- Navbar link and i18n keys
backend/server/chat/admin.py — Normal file, 17 additions
@@ -0,0 +1,17 @@
+from django.contrib import admin
+
+from .models import ChatConversation, ChatMessage
+
+
+@admin.register(ChatConversation)
+class ChatConversationAdmin(admin.ModelAdmin):
+    list_display = ("id", "user", "title", "updated_at", "created_at")
+    search_fields = ("title", "user__username")
+    list_filter = ("created_at", "updated_at")
+
+
+@admin.register(ChatMessage)
+class ChatMessageAdmin(admin.ModelAdmin):
+    list_display = ("id", "conversation", "role", "name", "created_at")
+    search_fields = ("conversation__id", "content", "name")
+    list_filter = ("role", "created_at")
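The `role`, `content`, and `name` fields surfaced in the ChatMessage admin above match the usual chat-completion message shape. A hypothetical helper (not part of this commit) showing how stored rows might be converted into the message list an LLM client such as LiteLLM expects:

```python
# Hypothetical helper: turn stored ChatMessage rows into the
# [{"role": ..., "content": ...}] list chat-completion APIs expect.
# Field names mirror the admin columns above; the function itself
# is an illustration, not code from this commit.
def to_llm_messages(rows):
    messages = []
    for row in rows:
        msg = {"role": row["role"], "content": row["content"]}
        # Tool-result messages carry the originating tool's name.
        if row.get("name"):
            msg["name"] = row["name"]
        messages.append(msg)
    return messages
```

In the real app the rows would come from the conversation's related `ChatMessage` queryset, ordered by `created_at`.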