[v0.12.0 pre] Planning & Itinerary Overhaul, Recommendation Engine, OIDC Enhancements, and More (#931)
* Fixes [REQUEST]: email-based auto-linking for OIDC (fixes #921)
* Add ClusterMap integration for regions and cities with fit-to-bounds functionality
* Update COUNTRY_REGION_JSON_VERSION to v3.0 and modify state ID generation to use the ISO2 code
* fix: handle the email-verification-required case during signup
  - Updated the signup action to return a specific message when the backend responds with a 401 status, indicating that the signup succeeded but email verification is required. This allows the frontend to display the appropriate message using an i18n key.
* feat: add Advanced Configuration documentation with optional environment variables
* Fixes #511
* fix: update appVersion to v0.11.0-main-121425 and enhance socialProviders handling on the settings page
* feat: implement social signup controls and update documentation for the new environment variables
* fix: update LocationCard props and enhance restore-data functionality
  - Changed the user prop to null in the LocationCard component on the dashboard page.
  - Added an isRestoring state to manage the loading state during data restoration in settings.
  - Updated the restore button to show a loading spinner while a restore operation is in progress.
* fix: update appVersion to v0.12.0-pre-dev-121625
* feat: implement itinerary planning feature with the CollectionItineraryPlanner component and related updates
* feat: add overnight lodging indicator and functionality to CollectionItineraryPlanner
* feat: add compact display option to LocationCard and enhance lodging filtering in CollectionItineraryPlanner
* feat(itinerary): add itinerary management features and link modal
  - Introduced ItineraryViewSet for managing itinerary items with create and reorder functionality.
  - Added itinerary linking capabilities in the CollectionModal and CollectionItineraryPlanner components.
  - Implemented the new ItineraryLinkModal for linking existing items to specific dates.
  - Enhanced the frontend with new modals for creating locations, lodging, transportation, notes, and checklists.
  - Updated the backend to handle itinerary item creation and reordering with appropriate permissions.
  - Improved data handling for unscheduled items and their association with the itinerary.
  - Added new frontend dependencies for enhanced functionality.
* feat(itinerary): implement auto-generation of itinerary items based on dated records
* feat(collection): enhance collection sharing logic and improve data handling on invite acceptance
* fix: update appVersion to the correct pre-dev version
* feat(wikipedia): implement image selection from Wikipedia with enhanced results display
* Refactor code structure for improved readability and maintainability
* feat: add CollectionRecommendationView component for displaying location recommendations
  - Implemented CollectionRecommendationView.svelte to handle location recommendations based on user input and selected categories.
  - Added Recommendation and RecommendationResponse types to types.ts for better type safety and structure.
  - Updated collections/[id]/+page.svelte to include a new view for recommendations, allowing users to switch between views seamlessly.
* fix: update appVersion and improve button accessibility in collection views
* feat: add canModify prop to collection components for user permission handling
* feat: add itinerary removal functionality to various cards and update UI components
  - Implemented a `removeFromItinerary` function in `LodgingCard`, `NoteCard`, and `TransportationCard` so users can remove items from their itinerary.
  - Replaced the trash icon with a calendar-remove icon in `LocationCard`, `LodgingCard`, `NoteCard`, and `TransportationCard` for better visual representation.
  - Updated the dropdown menus in `LodgingCard`, `NoteCard`, and `TransportationCard` to include the new remove-from-itinerary option.
  - Enhanced `CollectionItineraryPlanner` to pass itinerary items to the respective cards.
  - Removed `PointSelectionModal.svelte`, as it is no longer needed.
  - Refactored `LocationMedia.svelte` to integrate the `ImageManagement` component and clean up unused image-handling code.
* feat: enhance itinerary management with deduplication and initial-visit-date handling
* feat: add FullMap component for enhanced map functionality with clustering support
  - Introduced FullMap.svelte to handle map rendering, clustering, and marker management.
  - Updated the map page to use the FullMap component, replacing direct MapLibre usage.
  - Implemented clustering options and marker-property handling in FullMap.
  - Added utility functions for resolving theme colors and managing marker states.
  - Enhanced the user experience with hover popups and improved loading states for location details.
  - Updated the app version to v0.12.0-pre-dev-122225.
* feat: enhance map interaction on touch devices with custom popup handling
* feat: add progress tracker to folder views to display visited and planned locations
* feat: add map center and zoom state management with URL synchronization
* feat: add status and days-until-start fields to collections, with filtering options
* Component folder structure changes
* feat: add LodgingMedia and LodgingModal components for managing lodging details and media attachments
* feat: implement LocationSearchMap component for interactive location searching and mapping functionality
* fix: update contentType in the ImageManagement component to 'lodging' for correct media handling
* feat: enhance lodging management with date validation and update messages
* feat: implement lodging detail page with server-side loading and image modal functionality
  - Added a new server-side load function to fetch lodging details by ID.
  - Created a new Svelte component for the lodging detail page, including an image carousel and map integration.
  - Implemented a modal for displaying images with navigation.
  - Enhanced URL handling on the locations page to only read parameters.
* feat: add Transportation modal component and related routes
  - Implemented the TransportationModal component for creating and editing transportation entries.
  - Added server-side loading of transportation details in the new route [id]/+page.server.ts.
  - Created a new Svelte page for displaying transportation details with image and attachment handling.
  - Integrated a modal for editing transportation on the transportation details page.
  - Updated lodging routes to include a modal for editing lodging entries.
  - Removed an unused delete action from the lodging server-side logic.
* feat: add start_code and end_code fields to the Transportation model and update related components
* feat: implement date validation for itinerary items and add a day-picker modal for scheduling
* Reorder town and county checks in geocoding.py: fix detection when only a town exists for a location but the county is not a city name
* Use address keys only if a city is found
* Make sure reverse geocoding uses the correct key for cities (#938)
* Reorder town and county checks in geocoding.py: fix detection when only a town exists for a location but the county is not a city name
* Use address keys only if a city is found
* Refactor code structure for improved readability and maintainability
* Enhance collection management with modal updates and item handling
* feat: integrate CollectionMap component on the collections page and update map titles on the lodging and transportation pages
  - Replaced the inline map implementation with the CollectionMap component in collections/[id]/+page.svelte for better modularity.
  - Updated the map title in lodging/[id]/+page.svelte to reflect the lodging context.
  - Updated the map title in transportations/[id]/+page.svelte to reflect the transportation context.
  - Added functionality to collect and render GeoJSON data from transportation attachments in transportations/[id]/+page.svelte.
* chore: update copyright year to 2026 in various files
* feat: enhance backup export functionality with itinerary items and export IDs
* fix: improve dropdown close behavior by handling multiple event types
* fix: remove unnecessary cache decorator from the globespin function
* feat: add initial visit date support in ChecklistModal and NoteModal, with UI suggestions for prefilled dates
* feat: add a details view for checklist and note cards with edit functionality
* feat: add travel duration and GPX distance calculation to the Transportation model and UI
* feat: add primary image support to the Collection model, serializers, and UI components
* Refactor calendar components and enhance event detail handling
  - Replaced the direct calendar implementation with a reusable CalendarComponent in the calendar route.
  - Introduced EventDetailsModal for displaying event details, improving modularity and readability.
  - Added functionality to fetch event details asynchronously when an event is clicked.
  - Implemented ICS calendar download functionality with loading-state management.
  - Enhanced the collections page to support a calendar view, integrating event handling and timezone management.
  - Improved the lodging and transportation pages to display local time for stays and trips, including timezone badges.
  - Cleaned up unused code and comments for better maintainability.
* feat: enhance hero image handling in the collection view by prioritizing the primary image
* chore: update .env.example to include account email verification configuration
* feat: enhance the LodgingCard and TransportationCard components with expandable details and improved layout
* feat: add price and currency fields to locations, lodging, and transportation components
  - Introduced price and price_currency fields in the LocationModal, LodgingDetails, LodgingModal, TransportationDetails, and TransportationModal components.
  - Implemented MoneyInput and CurrencyDropdown components for handling monetary values and currency selection.
  - Updated data structures and types to accommodate the new price and currency fields across various models.
  - Enhanced cost summary calculations in collections and routes to display total costs by currency.
  - Added a user preference for default currency in settings, affecting new item forms.
  - Updated the UI to display price information in relevant components, ensuring consistent formatting and user experience.
* feat: add Development Timeline link to the overview and create timeline documentation
* feat: enhance map functionality with search and zoom features
  - Updated availableViews on the collection page to include a map view based on lodging and transportation locations.
  - Added search functionality to the map page, allowing users to filter pins by name and category.
  - Implemented an auto-zoom feature to adjust the map view based on filtered search results.
  - Introduced a search bar with a clear button for a better user experience.
* feat: enhance ISO code extraction and region-matching logic in the extractIsoCode function
* feat: enhance the extractIsoCode function with normalization for locality matching
* feat: update the extractIsoCode function to include additional ISO3166 levels for improved region matching
* feat: enhance the extractIsoCode function to handle cases without city information, and update CollectionMap to bind user data
* feat: add a cron job for syncing visited regions and cities; enhance Docker and supervisord configurations
* feat: add CollectionItineraryDay model and related functionality for itinerary-day metadata management
* feat: implement cleanup of out-of-range itinerary items and notify users of potential itinerary impacts when dates change
* Refactor collection page for improved localization and code clarity
  - Removed unused imports and made cost category labels reactive.
  - Updated the cost summary function to accept localized labels.
  - Enhanced localization of various UI elements, including buttons, headings, and statistics.
  - Improved user feedback messages for better clarity and consistency.
  - Ensured all relevant text is translatable via the i18n library.
* feat: add collaborator serialization and display in collections
  - Implemented `_build_profile_pic_url` and `_serialize_collaborator` functions for user profile picture URLs and serialization.
  - Updated `CollectionSerializer` and `UltraSlimCollectionSerializer` to include collaborators in the serialized output.
  - Enhanced `CollectionViewSet` to prefetch shared_with users for optimized queries.
  - Modified frontend components to display collaborators in collection details, including profile pictures and initials.
  - Added new localization strings for collaborators.
  - Refactored map and location components to improve usability and functionality.
  - Updated the app version to reflect the new changes.
* feat: add dynamic lodging icons based on type in the CollectionMap component
* feat: add CollectionStats component for detailed trip statistics
  - Implemented CollectionStats.svelte to display various collection statistics, including distances, activities, and locations visited.
  - Enhanced CollectionMap.svelte to filter activities by date range using the new getActivityDate function.
  - Updated LocationSearchMap.svelte to handle airport mode for start and end locations.
  - Modified types.ts to include an is_global property in CollectionItineraryItem for trip-wide items.
  - Updated +page.svelte to integrate the new stats view and manage view state accordingly.
* feat: enhance itinerary management by removing old items on date change for notes and checklists; normalize date handling in CollectionMap
* feat: add functionality to change an item's day and move items to the trip-wide itinerary
  - Implemented a changeDay function in the ChecklistCard, LocationCard, LodgingCard, NoteCard, and TransportationCard components so users can change the scheduled day of items.
  - Added a button to move items to the global (trip-wide) itinerary in the aforementioned components, with appropriate dispatch events.
  - Enhanced CollectionItineraryPlanner to handle moving items to the global itinerary, and added UI elements for unscheduled items.
  - Updated ItineraryDayPickModal to support deletion of source visits when moving locations.
  - Added new translations for "Change Day" and "Move Trip Wide" to the English locale.
* fix: specify the full path to python3 in the cron job and add SHELL and PATH variables
* fix: update appVersion to v0.12.0-pre-dev-010726
* feat: enhance CollectionItineraryPlanner and CollectionStats with dynamic links and transport-type normalization
* Add Dev Container + WSL install docs and link them in the install guide (#944) (#951)
* feat: enhance internationalization support in the CollectionMap and CollectionStats components
  - Added translation support for various labels and messages in CollectionMap.svelte and CollectionStats.svelte using svelte-i18n.
  - Updated the English and Chinese locale files with new translation keys for an improved user experience.
  - Simplified the rendering of recommendation views on the collections page.
* Refactor itinerary management and UI components
  - Updated ItineraryViewSet to handle visit updates and creations more efficiently, preserving visit IDs when moving between days.
  - Enhanced ChecklistCard, LodgingCard, TransportationCard, and NoteCard to include a new "Change Day" option in the actions menu.
  - Improved the user experience in CollectionItineraryPlanner by tracking the specific itinerary items being moved and ensuring only the relevant entries are deleted.
  - Added new location-sharing options in LodgingCard and TransportationCard for Apple Maps, Google Maps, and OpenStreetMap.
  - Updated translations in en.json for consistency and clarity.
  - Made minor UI adjustments for better accessibility and usability across various components.
* feat: implement action menus and close-event handling in card components
* feat: refactor the Dockerfile and supervisord configuration to remove cron and add a periodic sync script
* feat: enhance the LocationSearchMap and TransportationDetails components with initialization handling and airport-mode logic
* feat: add airport and location search mode labels to the localization file
* feat: enhance periodic sync logging and improve airport-mode handling in LocationSearchMap
* feat: enhance the unscheduled-items display with improved card interactions and accessibility
* Add a dev compose file for hot reload and update the WSL dev container docs (#958)
* feat: enhance localization for itinerary linking and transportation components
* Localization: update localization files with new keys and values
* fix: improve error messages for Overpass API responses
* chore: update dependencies in the frontend package.json and pnpm-lock.yaml
  - Updated @sveltejs/adapter-node from ^5.2.12 to ^5.4.0
  - Updated @sveltejs/adapter-vercel from ^5.7.0 to ^6.3.0
  - Updated tailwindcss from ^3.4.17 to ^3.4.19
  - Updated typescript from ^5.8.3 to ^5.9.3
  - Updated vite from ^5.4.19 to ^5.4.21
* chore: update dependencies in pnpm-lock.yaml to the latest versions
* Refactor code structure for improved readability and maintainability
* Refactor code structure for improved readability and maintainability
* fix: update package dependencies to resolve compatibility issues
* Add "worldtravel" translations to multiple locale files
  - Added a "worldtravel" key with translations for Spanish, French, Hungarian, Italian, Japanese, Korean, Dutch, Norwegian, Polish, Brazilian Portuguese, Russian, Slovak, Swedish, Turkish, Ukrainian, and Chinese.
  - Updated the navigation section in each locale file to include the new "worldtravel" entry.
* Add new screenshots and update the email verification message in the locale file
* feat: implement data restoration functionality with file import
  - Added a new `restoreData` action in `+page.server.ts` to handle file uploads for restoring collections.
  - Enhanced the UI in `+page.svelte` with an import button and a modal for import progress.
  - Integrated file-input handling to trigger form submission upon file selection.
  - Removed unused GSAP animations from the login, profile, and signup pages for cleaner code.
* feat: add modals for creating locations and lodging from recommendations; enhance image import functionality
* fix: adjust styles to prevent horizontal scroll and enhance floating-action-button visibility
* feat: enhance error handling and messaging for the Google Maps and OpenStreetMap geocoding functions
* fix: enhance error messaging for the Google Maps access-forbidden response
* feat: add a User-Agent header to Google Maps API requests and refine error messaging for the access-forbidden response
* fix: update the User-Agent header in Google Maps API requests for improved compatibility
* fix: disable proxy settings in the Google Maps API request to prevent connection issues
* fix: update the Trivy security scan configuration and add a .trivyignore for known false positives
* fix: refactor the update method to handle is_public cascading for related items
* feat: integrate django-invitations for user invitation management and update settings
* feat: add Tailwind CSS and DaisyUI plugin for styling
* feat: add Tailwind CSS and DaisyUI plugin for styling
* feat: add an "Invite a User" guide and update navigation links
* docs: update the "Invite a User" guide to include an email configuration tip
* feat: update the email invitation template for improved styling and clarity
* fix: remove a trailing backslash from the installation note in the Unraid documentation
* feat: add export/import messages and user email-verification prompts in multiple languages
* Squashed commit of the following (Weblate translations for AdventureLog/Web App):
  - commit a993a15b93ebb7521ae2e5cc31596b98b29fcd6c, Alex <div@alexe.at>, Mon Jan 12 20:44:47 2026 +0100: German, 100.0% (1048 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/de/
  - commit fdc455d9424fbb0f6b72179d9eb1340411700773, Ettore Atalan <atalanttore@googlemail.com>, Sat Jan 10 23:24:23 2026 +0100: German, 100.0% (1048 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/de/
  - commit 5942129c55e89dd999a13d4df9c40e6e3189355c, Orhun <orhunavcu@gmail.com>, Sun Jan 11 13:05:31 2026 +0100: Turkish, 100.0% (1048 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/tr/
  - commit 8712e43d8ba4a7e7fe163fb454d6577187f9a375, Henrique Fonseca Veloso <henriquefv@tutamail.com>, Fri Jan 9 22:53:11 2026 +0100: Portuguese (Brazil), 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/pt_BR/
  - commit 18ee56653470413afe8d71ecd2b5028f6e4cf118, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:57 2026 +0100: Dutch, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/nl/
  - commit 57783c544e583c035c8b57b5c10ca320f25f399e, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:14 2026 +0100: Arabic, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/ar/
  - commit fb09edfd85bc85234b1c1ba7dd499f2915093fff, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:26 2026 +0100: Spanish, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/es/
  - commit 554a207d8e454a1f7ae826e2a40d389b94be5512, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:21 2026 +0100: German, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/de/
  - commit b70b9db27fb8607beefeb288185601c8f5eae28d, Anonymous <noreply@weblate.org>, Fri Jan 9 22:53:02 2026 +0100: Norwegian Bokmål, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/nb_NO/
  - commit 3b467caa9007c553e4ae7de97f53b6e462161ea3, Anonymous <noreply@weblate.org>, Fri Jan 9 22:53:07 2026 +0100: Polish, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/pl/
  - commit 30fbbfba3572c8f78ec7c7e1a231e363aca1ef10, Anonymous <noreply@weblate.org>, Fri Jan 9 22:53:17 2026 +0100: Russian, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/ru/
  - commit 8cecb492cfcac0a1f93ee8919f7b41d978d331ee, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:42 2026 +0100: Italian, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/it/
  - commit f0d3d41029c89bfa83d5891ee7af70241f27b7be, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:38 2026 +0100: Hungarian, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/hu/
  - commit 102e0f1912d010d38755a1713abb2a7f7564aafb, Anonymous <noreply@weblate.org>, Fri Jan 9 22:53:21 2026 +0100: Slovak, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/sk/
  - commit 428b8f18cf6195a96b55109e0221413d82415a2f, Максим Горпиніч <gorpinicmaksim0@gmail.com>, Sat Jan 10 08:55:28 2026 +0100: Ukrainian, 100.0% (1048 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/uk/
  - commit 1a71aaf279ecab26c0c1fede05025732e6dcfa5e, Anonymous <noreply@weblate.org>, Fri Jan 9 22:53:27 2026 +0100: Swedish, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/sv/
  - commit 36ec3701f3a1a904e7c42ac4ffbe6a050dc6d1ed, Anonymous <noreply@weblate.org>, Fri Jan 9 22:53:43 2026 +0100: Chinese (Simplified Han script), 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/zh_Hans/
  - commit 65d8b74b340c877cad2028b7142c783a1b568d49, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:48 2026 +0100: Japanese, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/ja/
  - commit 4d11d1d31022583657e93aee70301a8ffcde1340, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:52 2026 +0100: Korean, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/ko/
  - commit bd1135bcb965ad73cf493771b15081cc97cf513a, Orhun <orhunavcu@gmail.com>, Fri Jan 9 22:53:33 2026 +0100: Turkish, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/tr/
  - commit 2c3d814119f4cf2dabd20933699f5b991f20f3e6, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:32 2026 +0100: French, 99.9% (1047 of 1048 strings). Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/fr/
* Refactor code structure and remove redundant code blocks for improved readability and maintainability
* fix: correct appVersion to match the latest pre-release version
* fix: add a missing vulnerability reference for jaraco.context in .trivyignore

---------

Co-authored-by: Lars Lehmann <33843261+larsl-net@users.noreply.github.com>
Co-authored-by: Lars Lehmann <lars@lmail.eu>
Co-authored-by: Nick Petrushin <n.a.petrushin@gmail.com>
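The signup change described in the commit message (a 401 from the backend means the account was created but email verification is pending, not that signup failed) can be sketched as follows. This is a hedged illustration only: the function name and i18n keys here are hypothetical, not the project's actual identifiers.

```python
def interpret_signup_response(status_code: int) -> dict:
    """Hypothetical sketch of the signup action's decision logic."""
    if status_code == 401:
        # Signup succeeded, but the backend requires email verification
        # before login; surface a translated notice instead of an error.
        return {"ok": True, "message_key": "auth.verification_required"}
    if 200 <= status_code < 300:
        return {"ok": True, "message_key": "auth.signup_success"}
    return {"ok": False, "message_key": "auth.signup_failed"}

print(interpret_signup_response(401))
```

The point of the branch order is that 401 is checked before the generic success/failure paths, so the verification case never falls through to a misleading "signup failed" message.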
@@ -27,6 +27,10 @@ EMAIL_BACKEND='console'
# GOOGLE_MAPS_API_KEY='key'
# ACCOUNT_EMAIL_VERIFICATION='none' # 'none', 'optional', 'mandatory' # You can change this as needed for your environment
# FORCE_SOCIALACCOUNT_LOGIN=False # When true, only social login is allowed (no password login) and the login page will show only social providers or redirect directly to the first provider if only one is configured.

# ------------------- #
# For Developers to start a Demo Database
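As a rough sketch of how a Django settings module might consume the optional variables above: the defaults, boolean parsing, and validation shown here are assumptions for illustration, not the project's actual settings code.

```python
import os

# Hedged sketch: read the optional variables from the .env example above.
# The defaults ("none", "False") mirror the commented-out example values.
ACCOUNT_EMAIL_VERIFICATION = os.environ.get("ACCOUNT_EMAIL_VERIFICATION", "none")
assert ACCOUNT_EMAIL_VERIFICATION in ("none", "optional", "mandatory")

# Env vars are strings, so a truthy check needs explicit parsing.
FORCE_SOCIALACCOUNT_LOGIN = os.environ.get(
    "FORCE_SOCIALACCOUNT_LOGIN", "False"
).strip().lower() in ("true", "1", "yes")

print(ACCOUNT_EMAIL_VERIFICATION, FORCE_SOCIALACCOUNT_LOGIN)
```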
@@ -1,7 +1,8 @@
import os
from django.contrib import admin
from django.utils.html import mark_safe
from .models import Location, Checklist, ChecklistItem, Collection, Transportation, Note, ContentImage, Visit, Category, ContentAttachment, Lodging, CollectionInvite, Trail, Activity
from django.utils.html import mark_safe, format_html
from django.urls import reverse
from .models import Location, Checklist, ChecklistItem, Collection, Transportation, Note, ContentImage, Visit, Category, ContentAttachment, Lodging, CollectionInvite, Trail, Activity, CollectionItineraryItem, CollectionItineraryDay
from worldtravel.models import Country, Region, VisitedRegion, City, VisitedCity
from allauth.account.decorators import secure_admin_login

@@ -82,11 +83,11 @@ from users.models import CustomUser

class CustomUserAdmin(UserAdmin):
    model = CustomUser
    list_display = ['username', 'is_staff', 'is_active', 'image_display', 'measurement_system']
    list_display = ['username', 'is_staff', 'is_active', 'image_display', 'measurement_system', 'default_currency']
    readonly_fields = ('uuid',)
    search_fields = ('username',)
    fieldsets = UserAdmin.fieldsets + (
        (None, {'fields': ('profile_pic', 'uuid', 'public_profile', 'disable_password', 'measurement_system')}),
        (None, {'fields': ('profile_pic', 'uuid', 'public_profile', 'disable_password', 'measurement_system', 'default_currency')}),
    )

    def image_display(self, obj):
        if obj.profile_pic:
@@ -146,6 +147,32 @@ class CollectionAdmin(admin.ModelAdmin):

class ActivityAdmin(admin.ModelAdmin):
    list_display = ('name', 'user', 'visit__location', 'sport_type', 'distance', 'elevation_gain', 'moving_time')


class CollectionItineraryItemAdmin(admin.ModelAdmin):
    list_display = ('collection', 'content_type', 'object_link', 'date', 'order')
    search_fields = ('collection__name', 'content_type__model')
    list_filter = ('content_type', 'date')
    raw_id_fields = ('collection',)
    readonly_fields = ('created_at',)

    def object_link(self, obj):
        """
        Display the generic related object; link to its admin change page if registered.
        """
        linked_obj = obj.item
        if not linked_obj:
            return "—"
        try:
            ct = obj.content_type
            app_label = ct.app_label
            model = ct.model
            admin_url = reverse('admin:%s_%s_change' % (app_label, model), args=[obj.object_id])
            return format_html('<a href="{}">{}</a>', admin_url, str(linked_obj))
        except Exception:
            # Fallback to plain text if any error (object not registered, missing id, etc.)
            return str(linked_obj)

    object_link.short_description = 'Item'

admin.site.register(CustomUser, CustomUserAdmin)
admin.site.register(Location, LocationAdmin)
admin.site.register(Collection, CollectionAdmin)

@@ -166,6 +193,8 @@ admin.site.register(Lodging)
admin.site.register(CollectionInvite, CollectionInviteAdmin)
admin.site.register(Trail)
admin.site.register(Activity, ActivityAdmin)
admin.site.register(CollectionItineraryItem, CollectionItineraryItemAdmin)
admin.site.register(CollectionItineraryDay)

admin.site.site_header = 'AdventureLog Admin'
admin.site.site_title = 'AdventureLog Admin Site'
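The `object_link` method above leans on Django's admin URL naming convention: every registered model gets a change view named `admin:<app_label>_<model>_change`, which `reverse()` resolves to a concrete URL. A dependency-free sketch of just the name construction (the app label `"adventures"` is an assumption for illustration, not necessarily this project's label):

```python
def admin_change_url_name(app_label: str, model: str) -> str:
    # Django names the admin change view "admin:<app_label>_<model>_change";
    # object_link passes this string to reverse() with the object's id.
    return 'admin:%s_%s_change' % (app_label, model)

name = admin_change_url_name("adventures", "collectionitineraryitem")
print(name)  # admin:adventures_collectionitineraryitem_change
```

If the model is not registered with the admin site, `reverse()` raises, which is why `object_link` wraps the call in a try/except and falls back to plain text.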
@@ -1,6 +1,8 @@
import requests
import time
import socket
import re
import unicodedata
from worldtravel.models import Region, City, VisitedRegion, VisitedCity
from django.conf import settings
@@ -10,7 +12,7 @@ def search_google(query):
    try:
        api_key = settings.GOOGLE_MAPS_API_KEY
        if not api_key:
            return {"error": "Missing Google Maps API key"}
            return {"error": "Geocoding service unavailable. Please check configuration."}

        # Updated to use the new Places API (New) endpoint
        url = "https://places.googleapis.com/v1/places:searchText"
@@ -34,7 +36,7 @@ def search_google(query):
        # Check if we have places in the response
        places = data.get("places", [])
        if not places:
            return {"error": "No results found"}
            return {"error": "No locations found for the given query."}

        results = []
        for place in places:
@@ -71,11 +73,25 @@ def search_google(query):

        return results

    except requests.exceptions.RequestException as e:
        return {"error": "Network error while contacting Google Maps", "details": str(e)}
    except Exception as e:
        return {"error": "Unexpected error during Google search", "details": str(e)}
    except requests.exceptions.Timeout:
        return {"error": "Request timed out while contacting Google Maps. Please try again."}
    except requests.exceptions.ConnectionError:
        return {"error": "Unable to connect to Google Maps service. Please check your internet connection."}
    except requests.exceptions.HTTPError as e:
        if response.status_code == 400:
            return {"error": "Invalid request to Google Maps. Please check your query."}
        elif response.status_code == 401:
            return {"error": "Authentication failed with Google Maps. Please check API configuration."}
        elif response.status_code == 403:
            return {"error": "Access forbidden to Google Maps. Please check API permissions."}
        elif response.status_code == 429:
            return {"error": "Too many requests to Google Maps. Please try again later."}
        else:
            return {"error": "Google Maps service error. Please try again later."}
    except requests.exceptions.RequestException:
        return {"error": "Network error while contacting Google Maps. Please try again."}
    except Exception:
        return {"error": "An unexpected error occurred during Google search. Please try again."}
def _extract_google_category(types):
|
||||
# Basic category inference based on common place types
|
||||
@@ -111,82 +127,231 @@ def _infer_addresstype(type_):
|
||||
|
||||
|
||||
def search_osm(query):
|
||||
url = f"https://nominatim.openstreetmap.org/search?q={query}&format=jsonv2"
|
||||
headers = {'User-Agent': 'AdventureLog Server'}
|
||||
response = requests.get(url, headers=headers)
|
||||
data = response.json()
|
||||
try:
|
||||
url = f"https://nominatim.openstreetmap.org/search?q={query}&format=jsonv2"
|
||||
headers = {'User-Agent': 'AdventureLog Server'}
|
||||
response = requests.get(url, headers=headers, timeout=(2, 5))
|
||||
response.raise_for_status()
|
||||
data = response.json()
|
||||
|
||||
return [{
|
||||
"lat": item.get("lat"),
|
||||
"lon": item.get("lon"),
|
||||
"name": item.get("name"),
|
||||
"display_name": item.get("display_name"),
|
||||
"type": item.get("type"),
|
||||
"category": item.get("category"),
|
||||
"importance": item.get("importance"),
|
||||
"addresstype": item.get("addresstype"),
|
||||
"powered_by": "nominatim",
|
||||
} for item in data]
|
||||
return [{
|
||||
"lat": item.get("lat"),
|
||||
"lon": item.get("lon"),
|
||||
"name": item.get("name"),
|
||||
"display_name": item.get("display_name"),
|
||||
"type": item.get("type"),
|
||||
"category": item.get("category"),
|
||||
"importance": item.get("importance"),
|
||||
"addresstype": item.get("addresstype"),
|
||||
"powered_by": "nominatim",
|
||||
} for item in data]
|
||||
except requests.exceptions.Timeout:
|
||||
return {"error": "Request timed out while contacting OpenStreetMap. Please try again."}
|
||||
except requests.exceptions.ConnectionError:
|
||||
return {"error": "Unable to connect to OpenStreetMap service. Please check your internet connection."}
|
||||
except requests.exceptions.HTTPError as e:
|
||||
if response.status_code == 400:
|
||||
return {"error": "Invalid request to OpenStreetMap. Please check your query."}
|
||||
elif response.status_code == 429:
|
||||
return {"error": "Too many requests to OpenStreetMap. Please try again later."}
|
||||
else:
|
||||
return {"error": "OpenStreetMap service error. Please try again later."}
|
||||
except requests.exceptions.RequestException:
|
||||
return {"error": "Network error while contacting OpenStreetMap. Please try again."}
|
||||
except Exception:
|
||||
return {"error": "An unexpected error occurred during OpenStreetMap search. Please try again."}
|
||||
|
||||
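The new exception ladder in `search_osm` maps each HTTP status code to a fixed user-facing message. A standalone sketch of that mapping — the helper name and lookup table are illustrative, not part of the diff:

```python
# Illustrative helper: centralize the status-code-to-message mapping used by
# the HTTPError branches above. Keys mirror the codes the diff handles.
OSM_HTTP_MESSAGES = {
    400: "Invalid request to OpenStreetMap. Please check your query.",
    429: "Too many requests to OpenStreetMap. Please try again later.",
}

def osm_http_error(status_code):
    # Unhandled codes collapse to the generic service-error message.
    message = OSM_HTTP_MESSAGES.get(
        status_code, "OpenStreetMap service error. Please try again later.")
    return {"error": message}

print(osm_http_error(429)["error"])
```

A dict keeps the strings in one place; the diff's `if/elif` chain is behaviorally equivalent.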
def search(query):
    """
    Unified search function that tries Google Maps first, then falls back to OpenStreetMap.
    """
    if getattr(settings, 'GOOGLE_MAPS_API_KEY', None):
        google_result = search_google(query)
        if "error" not in google_result:
            return google_result
    # If Google fails, fallback to OSM
    return search_osm(query)

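The `search()` wrapper above encodes a primary/fallback pattern: Google first when a key is configured, OSM otherwise or on error. A minimal dependency-free sketch of that flow — the function and stub provider names are hypothetical:

```python
# Sketch of the provider-fallback pattern: try the primary geocoder, fall back
# to the secondary whenever the primary is disabled or returns an error dict.
def unified_search(query, primary, fallback, primary_enabled=True):
    if primary_enabled:
        result = primary(query)
        # Error results are plain dicts with an "error" key; lists are successes.
        if not (isinstance(result, dict) and "error" in result):
            return result
    return fallback(query)

# Stub providers standing in for search_google / search_osm:
google = lambda q: {"error": "Missing Google Maps API key"}
osm = lambda q: [{"name": q, "powered_by": "nominatim"}]
print(unified_search("Paris", google, osm))  # falls back to the OSM stub
```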
# -----------------
# REVERSE GEOCODING
# -----------------

def extractIsoCode(user, data):
    """
    Extract the ISO code from the response data.
    Returns a dictionary containing the region name, country name, and ISO code if found.
    """
    iso_code = None
    town_city_or_county = None
    display_name = None
    country_code = None
    city = None
    visited_city = None
    location_name = None
    """
    Extract the ISO code from the response data.
    Returns a dictionary containing the region name, country name, and ISO code if found.
    """
    iso_code = None
    display_name = None
    country_code = None
    city = None
    visited_city = None
    location_name = None

    # town = None
    # city = None
    # county = None
    if 'name' in data.keys():
        location_name = data['name']

    if 'name' in data.keys():
        location_name = data['name']

    if 'address' in data.keys():
        keys = data['address'].keys()
        for key in keys:
            if key.find("ISO") != -1:
                iso_code = data['address'][key]
        if 'town' in keys:
            town_city_or_county = data['address']['town']
        if 'county' in keys:
            town_city_or_county = data['address']['county']
        if 'city' in keys:
            town_city_or_county = data['address']['city']
    if not iso_code:
        return {"error": "No region found"}

    region = Region.objects.filter(id=iso_code).first()
    visited_region = VisitedRegion.objects.filter(region=region, user=user).first()

    region_visited = False
    city_visited = False
    country_code = iso_code[:2]

    address = data.get('address', {}) or {}

    # Capture country code early for ISO selection and name fallback.
    country_code = address.get("ISO3166-1")
    state_name = address.get("state")

    # Prefer the most specific ISO 3166-2 code available before falling back to country-level.
    # France gets lvl4 (regions) first for city matching, then lvl6 (departments) as a fallback.
    preferred_iso_keys = (
        [
            "ISO3166-2-lvl10",
            "ISO3166-2-lvl9",
            "ISO3166-2-lvl8",
            "ISO3166-2-lvl4",
            "ISO3166-2-lvl6",
            "ISO3166-2-lvl7",
            "ISO3166-2-lvl5",
            "ISO3166-2-lvl3",
            "ISO3166-2-lvl2",
            "ISO3166-2-lvl1",
            "ISO3166-2",
        ]
        if country_code == "FR"
        else [
            "ISO3166-2-lvl10",
            "ISO3166-2-lvl9",
            "ISO3166-2-lvl8",
            "ISO3166-2-lvl4",
            "ISO3166-2-lvl7",
            "ISO3166-2-lvl6",
            "ISO3166-2-lvl5",
            "ISO3166-2-lvl3",
            "ISO3166-2-lvl2",
            "ISO3166-2-lvl1",
            "ISO3166-2",
        ]
    )

    iso_candidates = []
    for key in preferred_iso_keys:
        value = address.get(key)
        if value and value not in iso_candidates:
            iso_candidates.append(value)

    # If no region-level code, fall back to country code only as a last resort.
    if not iso_candidates and "ISO3166-1" in address:
        iso_candidates.append(address.get("ISO3166-1"))

    iso_code = iso_candidates[0] if iso_candidates else None

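The candidate-selection loop above walks the preferred key order, keeps unique values, and only falls back to the country-level `ISO3166-1` code when no region-level code exists. The same logic, extracted as a standalone sketch (the helper name is illustrative):

```python
# Most-specific-first ISO selection: walk the preferred key order, collect
# unique values, and fall back to ISO3166-1 only when nothing else matched.
def pick_iso_candidates(address, preferred_keys):
    candidates = []
    for key in preferred_keys:
        value = address.get(key)
        if value and value not in candidates:
            candidates.append(value)
    if not candidates and "ISO3166-1" in address:
        candidates.append(address["ISO3166-1"])
    return candidates

addr = {"ISO3166-1": "US", "ISO3166-2-lvl4": "US-ME"}
print(pick_iso_candidates(addr, ["ISO3166-2-lvl8", "ISO3166-2-lvl4"]))  # ['US-ME']
```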
    region_candidates = []
    for candidate in iso_candidates:
        if len(str(candidate)) <= 2:
            continue
        match = Region.objects.filter(id=candidate).first()
        if match and match not in region_candidates:
            region_candidates.append(match)

    region = region_candidates[0] if region_candidates else None

    # Fallback: attempt to resolve region by name and country code when no ISO match.
    if not region and state_name:
        region_queryset = Region.objects.filter(name__iexact=state_name)
        if country_code:
            region_queryset = region_queryset.filter(country__country_code=country_code)
        region = region_queryset.first()
    if region:
        if town_city_or_county:
            display_name = f"{town_city_or_county}, {region.name}, {country_code}"
            city = City.objects.filter(name__contains=town_city_or_county, region=region).first()
            visited_city = VisitedCity.objects.filter(city=city, user=user).first()
        iso_code = region.id
        if not country_code:
            country_code = region.country.country_code
        if region not in region_candidates:
            region_candidates.insert(0, region)

    if visited_region:
        region_visited = True
    if visited_city:
        city_visited = True
    if region:
        return {"region_id": iso_code, "region": region.name, "country": region.country.name, "country_id": region.country.country_code, "region_visited": region_visited, "display_name": display_name, "city": city.name if city else None, "city_id": city.id if city else None, "city_visited": city_visited, 'location_name': location_name}
    if not region:
        return {"error": "No region found"}

    if not country_code:
        country_code = region.country.country_code

    region_visited = False
    city_visited = False

    # ordered preference for best-effort locality matching
    locality_keys = [
        'suburb',
        'neighbourhood',
        'neighborhood',  # alternate spelling
        'city',
        'city_district',
        'town',
        'village',
        'hamlet',
        'locality',
        'municipality',
        'county',
    ]

    def _normalize_name(value):
        normalized = unicodedata.normalize("NFKD", value)
        ascii_only = normalized.encode("ascii", "ignore").decode("ascii")
        return re.sub(r"[^a-z0-9]", "", ascii_only.lower())

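The `_normalize_name` helper above makes locality comparison accent- and punctuation-insensitive: NFKD-decompose, drop the non-ASCII combining marks, lowercase, and strip non-alphanumerics. It can be exercised standalone:

```python
import re
import unicodedata

# Same normalization as the diff's _normalize_name helper: decompose accents,
# keep only ASCII, lowercase, and strip everything but letters and digits.
def normalize_name(value):
    normalized = unicodedata.normalize("NFKD", value)
    ascii_only = normalized.encode("ascii", "ignore").decode("ascii")
    return re.sub(r"[^a-z0-9]", "", ascii_only.lower())

print(normalize_name("São Paulo"))  # saopaulo
print(normalize_name("Québec"))     # quebec
```

Note that letters without a canonical decomposition (e.g. the "ø" in "Tromsø") are simply dropped rather than mapped to "o", which is why the diff still tries an exact `iexact` match first.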
    def match_locality(key_name, target_region):
        value = address.get(key_name)
        if not value:
            return None
        qs = City.objects.filter(region=target_region)

        # Use exact matches first to avoid broad county/name collisions (e.g. Troms vs Tromsø).
        exact_match = qs.filter(name__iexact=value).first()
        if exact_match:
            return exact_match

        normalized_value = _normalize_name(value)
        for candidate in qs.values_list('id', 'name'):
            candidate_id, candidate_name = candidate
            if _normalize_name(candidate_name) == normalized_value:
                return qs.filter(id=candidate_id).first()

        # Allow partial matching for most locality fields but keep county strict.
        if key_name == 'county':
            return None

        return qs.filter(name__icontains=value).first()

    chosen_region = region
    for candidate_region in region_candidates or [region]:
        for key_name in locality_keys:
            city = match_locality(key_name, candidate_region)
            if city:
                chosen_region = candidate_region
                iso_code = chosen_region.id
                break
        if city:
            break

    region = chosen_region
    iso_code = region.id
    visited_region = VisitedRegion.objects.filter(region=region, user=user).first()
    region_visited = bool(visited_region)

    if city:
        display_name = f"{city.name}, {region.name}, {country_code or region.country.country_code}"
        visited_city = VisitedCity.objects.filter(city=city, user=user).first()
        city_visited = bool(visited_city)
    else:
        display_name = f"{region.name}, {country_code or region.country.country_code}"

    return {
        "region_id": iso_code,
        "region": region.name,
        "country": region.country.name,
        "country_id": region.country.country_code,
        "region_visited": region_visited,
        "display_name": display_name,
        "city": city.name if city else None,
        "city_id": city.id if city else None,
        "city_visited": city_visited,
        'location_name': location_name,
    }

def is_host_resolvable(hostname: str) -> bool:
    try:
        socket.gethostbyname(hostname)
@@ -210,15 +375,28 @@ def reverse_geocode_osm(lat, lon, user):
    read_timeout = 5

    if not is_host_resolvable("nominatim.openstreetmap.org"):
        return {"error": "DNS resolution failed"}
        return {"error": "Unable to resolve OpenStreetMap service. Please check your internet connection."}

    try:
        response = requests.get(url, headers=headers, timeout=(connect_timeout, read_timeout))
        response.raise_for_status()
        data = response.json()
        return extractIsoCode(user, data)
    except requests.exceptions.Timeout:
        return {"error": "Request timed out while contacting OpenStreetMap. Please try again."}
    except requests.exceptions.ConnectionError:
        return {"error": "Unable to connect to OpenStreetMap service. Please check your internet connection."}
    except requests.exceptions.HTTPError as e:
        if response.status_code == 400:
            return {"error": "Invalid request to OpenStreetMap. Please check coordinates."}
        elif response.status_code == 429:
            return {"error": "Too many requests to OpenStreetMap. Please try again later."}
        else:
            return {"error": "OpenStreetMap service error. Please try again later."}
    except requests.exceptions.RequestException:
        return {"error": "Network error while contacting OpenStreetMap. Please try again."}
    except Exception:
        return {"error": "An internal error occurred while processing the request"}
        return {"error": "An unexpected error occurred during OpenStreetMap geocoding. Please try again."}

def reverse_geocode_google(lat, lon, user):
    api_key = settings.GOOGLE_MAPS_API_KEY
@@ -229,12 +407,22 @@ def reverse_geocode_google(lat, lon, user):
    params = {"latlng": f"{lat},{lon}", "key": api_key}

    try:
        response = requests.get(url, params=params)
        response = requests.get(url, params=params, timeout=(2, 5))
        response.raise_for_status()
        data = response.json()

        if data.get("status") != "OK":
            return {"error": "Geocoding failed"}
        status = data.get("status")
        if status != "OK":
            if status == "ZERO_RESULTS":
                return {"error": "No location found for the given coordinates."}
            elif status == "OVER_QUERY_LIMIT":
                return {"error": "Query limit exceeded for Google Maps. Please try again later."}
            elif status == "REQUEST_DENIED":
                return {"error": "Request denied by Google Maps. Please check API configuration."}
            elif status == "INVALID_REQUEST":
                return {"error": "Invalid request to Google Maps. Please check coordinates."}
            else:
                return {"error": "Geocoding failed. Please try again."}

        # Convert Google schema to Nominatim-style for extractIsoCode
        first_result = data.get("results", [])[0]
@@ -243,8 +431,25 @@ def reverse_geocode_google(lat, lon, user):
            "address": _parse_google_address_components(first_result.get("address_components", []))
        }
        return extractIsoCode(user, result_data)
    except requests.exceptions.Timeout:
        return {"error": "Request timed out while contacting Google Maps. Please try again."}
    except requests.exceptions.ConnectionError:
        return {"error": "Unable to connect to Google Maps service. Please check your internet connection."}
    except requests.exceptions.HTTPError as e:
        if response.status_code == 400:
            return {"error": "Invalid request to Google Maps. Please check coordinates."}
        elif response.status_code == 401:
            return {"error": "Authentication failed with Google Maps. Please check API configuration."}
        elif response.status_code == 403:
            return {"error": "Access forbidden to Google Maps. Please check API permissions."}
        elif response.status_code == 429:
            return {"error": "Too many requests to Google Maps. Please try again later."}
        else:
            return {"error": "Google Maps service error. Please try again later."}
    except requests.exceptions.RequestException:
        return {"error": "Network error while contacting Google Maps. Please try again."}
    except Exception:
        return {"error": "An internal error occurred while processing the request"}
        return {"error": "An unexpected error occurred during Google geocoding. Please try again."}

def _parse_google_address_components(components):
    parsed = {}
@@ -265,13 +470,23 @@ def _parse_google_address_components(components):
            state_code = short_name
        if "administrative_area_level_2" in types:
            parsed["county"] = long_name
        if "administrative_area_level_3" in types:
            parsed["municipality"] = long_name
        if "locality" in types:
            parsed["city"] = long_name
        if "sublocality" in types:
            parsed["town"] = long_name
        if "postal_town" in types:
            parsed.setdefault("city", long_name)
        if "sublocality" in types or any(t.startswith("sublocality_level_") for t in types):
            parsed["suburb"] = long_name
        if "neighborhood" in types:
            parsed["neighbourhood"] = long_name
        if "route" in types:
            parsed["road"] = long_name
        if "street_address" in types:
            parsed["address"] = long_name

    # Build composite ISO 3166-2 code like US-ME
    # Build composite ISO 3166-2 code like US-ME (matches Region.id in DB)
    if country_code and state_code:
        parsed["ISO3166-2-lvl1"] = f"{country_code}-{state_code}"

    return parsed
    return parsed

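The `_parse_google_address_components` hunk above translates Google's `address_components` into the Nominatim-style dict that `extractIsoCode` consumes, including the composite `ISO3166-2-lvl1` code. A reduced, self-contained sketch covering only the country/state/locality cases (the real parser handles many more component types, and the `ISO3166-1` assignment here is an assumption, not shown in the diff):

```python
# Hedged sketch: map a few Google address_components into Nominatim-style keys
# and build the composite ISO 3166-2 code (e.g. "US-ME") that matches Region.id.
def parse_components(components):
    parsed, country_code, state_code = {}, None, None
    for comp in components:
        types = comp["types"]
        long_name, short_name = comp["long_name"], comp["short_name"]
        if "country" in types:
            parsed["country"] = long_name
            country_code = short_name
        if "administrative_area_level_1" in types:
            parsed["state"] = long_name
            state_code = short_name
        if "locality" in types:
            parsed["city"] = long_name
    if country_code:
        parsed["ISO3166-1"] = country_code  # assumed key, mirrors extractIsoCode's lookup
    if country_code and state_code:
        parsed["ISO3166-2-lvl1"] = f"{country_code}-{state_code}"
    return parsed

sample = [
    {"types": ["locality"], "long_name": "Portland", "short_name": "Portland"},
    {"types": ["administrative_area_level_1"], "long_name": "Maine", "short_name": "ME"},
    {"types": ["country"], "long_name": "United States", "short_name": "US"},
]
print(parse_components(sample)["ISO3166-2-lvl1"])  # US-ME
```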
@@ -0,0 +1,259 @@
"""
Django management command to synchronize visited regions and cities based on user locations.

This command processes all users' visited locations and marks their regions and cities as visited.
It's designed to be run periodically (e.g., nightly cron job) to keep visited regions/cities up to date.

Usage:
    python manage.py sync_visited_regions
    python manage.py sync_visited_regions --dry-run
    python manage.py sync_visited_regions --user-id 123
    python manage.py sync_visited_regions --batch-size 50
"""

from django.core.management.base import BaseCommand, CommandError
from django.contrib.auth import get_user_model
from django.db import transaction
from django.db.models import Prefetch, Q
from adventures.models import Location
from worldtravel.models import Region, City, VisitedRegion, VisitedCity
from collections import defaultdict
import logging

logger = logging.getLogger(__name__)
User = get_user_model()


class Command(BaseCommand):
    help = 'Synchronize visited regions and cities based on user locations'

    def add_arguments(self, parser):
        parser.add_argument(
            '--dry-run',
            action='store_true',
            help='Show what would be created without making changes',
        )
        parser.add_argument(
            '--user-id',
            type=int,
            help='Sync visited regions for a specific user ID only',
        )
        parser.add_argument(
            '--batch-size',
            type=int,
            default=100,
            help='Number of users to process in each batch (default: 100)',
        )
        parser.add_argument(
            '--verbose',
            action='store_true',
            help='Show detailed output for each user',
        )

    def handle(self, *args, **options):
        dry_run = options['dry_run']
        user_id = options.get('user_id')
        batch_size = options['batch_size']
        verbose = options['verbose']

        if dry_run:
            self.stdout.write(
                self.style.WARNING('DRY RUN MODE - No changes will be made')
            )

        # Build user queryset
        users_queryset = User.objects.all()

        if user_id:
            users_queryset = users_queryset.filter(id=user_id)
            if not users_queryset.exists():
                raise CommandError(f'User with ID {user_id} not found')

        total_users = users_queryset.count()

        if total_users == 0:
            self.stdout.write(self.style.WARNING('No users found'))
            return

        self.stdout.write(f'Processing {total_users} user(s)...\n')

        # Track overall statistics
        total_new_regions = 0
        total_new_cities = 0
        users_processed = 0
        users_with_changes = 0

        # Process users in batches to manage memory
        user_ids = list(users_queryset.values_list('id', flat=True))

        for i in range(0, len(user_ids), batch_size):
            batch_user_ids = user_ids[i:i + batch_size]

            for user_id in batch_user_ids:
                try:
                    new_regions, new_cities = self._process_user(
                        user_id, dry_run, verbose
                    )

                    total_new_regions += new_regions
                    total_new_cities += new_cities
                    users_processed += 1

                    if new_regions > 0 or new_cities > 0:
                        users_with_changes += 1

                    # Progress indicator for large batches
                    if users_processed % 50 == 0:
                        self.stdout.write(
                            f'Processed {users_processed}/{total_users} users...'
                        )

                except Exception as e:
                    self.stdout.write(
                        self.style.ERROR(
                            f'Error processing user {user_id}: {str(e)}'
                        )
                    )
                    logger.exception(f'Error processing user {user_id}')

        # Summary
        self.stdout.write('\n' + '='*60)
        if dry_run:
            self.stdout.write(
                self.style.SUCCESS(
                    f'DRY RUN COMPLETE:\n'
                    f'  Users processed: {users_processed}\n'
                    f'  Users with changes: {users_with_changes}\n'
                    f'  Would create {total_new_regions} new visited regions\n'
                    f'  Would create {total_new_cities} new visited cities'
                )
            )
        else:
            self.stdout.write(
                self.style.SUCCESS(
                    f'SYNC COMPLETE:\n'
                    f'  Users processed: {users_processed}\n'
                    f'  Users with changes: {users_with_changes}\n'
                    f'  Created {total_new_regions} new visited regions\n'
                    f'  Created {total_new_cities} new visited cities'
                )
            )

    def _process_user(self, user_id, dry_run=False, verbose=False):
        """
        Process a single user and return counts of new regions and cities.
        Returns: (new_regions_count, new_cities_count)
        """
        # Get all visited locations with their region and city data in a single query
        visited_locations = Location.objects.filter(
            user_id=user_id
        ).select_related('region', 'city')

        # Collect unique regions and cities from visited locations
        regions_to_mark = set()
        cities_to_mark = set()

        for location in visited_locations:
            # Only process locations that are marked as visited
            if not location.is_visited_status():
                continue

            if location.region_id:
                regions_to_mark.add(location.region_id)

            if location.city_id:
                cities_to_mark.add(location.city_id)

        # Early exit if no regions or cities to mark
        if not regions_to_mark and not cities_to_mark:
            return 0, 0

        new_regions_count = 0
        new_cities_count = 0

        # Process regions
        if regions_to_mark:
            new_regions_count = self._sync_visited_regions(
                user_id, regions_to_mark, dry_run
            )

        # Process cities
        if cities_to_mark:
            new_cities_count = self._sync_visited_cities(
                user_id, cities_to_mark, dry_run
            )

        if verbose and (new_regions_count > 0 or new_cities_count > 0):
            self.stdout.write(
                f'User {user_id}: '
                f'{new_regions_count} new regions, '
                f'{new_cities_count} new cities'
            )

        return new_regions_count, new_cities_count

    def _sync_visited_regions(self, user_id, region_ids, dry_run=False):
        """Sync visited regions for a user. Returns count of new regions created."""
        # Get existing visited regions for this user in one query
        existing_visited_regions = set(
            VisitedRegion.objects.filter(
                user_id=user_id,
                region_id__in=region_ids
            ).values_list('region_id', flat=True)
        )

        # Determine which regions need to be created
        regions_to_create = region_ids - existing_visited_regions

        if not regions_to_create:
            return 0

        if dry_run:
            return len(regions_to_create)

        # Bulk create new VisitedRegion entries
        new_visited_regions = [
            VisitedRegion(region_id=region_id, user_id=user_id)
            for region_id in regions_to_create
        ]

        with transaction.atomic():
            VisitedRegion.objects.bulk_create(
                new_visited_regions,
                ignore_conflicts=True  # Handle race conditions gracefully
            )

        return len(regions_to_create)

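The region sync above boils down to a set difference: ids already recorded as visited are skipped, and only the remainder is bulk-created. A pure-Python sketch of that planning step, without the ORM (names are illustrative):

```python
# Dedupe the region ids collected from locations, subtract the ids already
# marked visited, and return what would be bulk-created (sorted for stability).
def plan_new_visits(visited_ids, location_region_ids):
    to_mark = set(location_region_ids)  # duplicates from multiple locations collapse
    return sorted(to_mark - set(visited_ids))

existing = {"US-ME", "FR-75"}
from_locations = ["US-ME", "US-NH", "US-NH", "DE-BY"]
print(plan_new_visits(existing, from_locations))  # ['DE-BY', 'US-NH']
```

`ignore_conflicts=True` on the subsequent `bulk_create` then covers the race where another process inserts one of these rows between the read and the write.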
    def _sync_visited_cities(self, user_id, city_ids, dry_run=False):
        """Sync visited cities for a user. Returns count of new cities created."""
        # Get existing visited cities for this user in one query
        existing_visited_cities = set(
            VisitedCity.objects.filter(
                user_id=user_id,
                city_id__in=city_ids
            ).values_list('city_id', flat=True)
        )

        # Determine which cities need to be created
        cities_to_create = city_ids - existing_visited_cities

        if not cities_to_create:
            return 0

        if dry_run:
            return len(cities_to_create)

        # Bulk create new VisitedCity entries
        new_visited_cities = [
            VisitedCity(city_id=city_id, user_id=user_id)
            for city_id in cities_to_create
        ]

        with transaction.atomic():
            VisitedCity.objects.bulk_create(
                new_visited_cities,
                ignore_conflicts=True  # Handle race conditions gracefully
            )

        return len(cities_to_create)
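The command's main loop processes user ids in fixed-size slices (`--batch-size`) to bound memory. The slicing scheme in isolation:

```python
# Yield fixed-size slices of a list; the final slice may be shorter.
def batches(ids, batch_size):
    for i in range(0, len(ids), batch_size):
        yield ids[i:i + batch_size]

print(list(batches([1, 2, 3, 4, 5], 2)))  # [[1, 2], [3, 4], [5]]
```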
@@ -0,0 +1,32 @@
# Generated by Django 5.2.6 on 2025-12-15 16:46

import django.db.models.deletion
import uuid
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('adventures', '0063_alter_activity_timezone_alter_lodging_timezone_and_more'),
        ('contenttypes', '0002_remove_content_type_name'),
    ]

    operations = [
        migrations.CreateModel(
            name='CollectionItineraryItem',
            fields=[
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
                ('object_id', models.UUIDField()),
                ('date', models.DateField(blank=True, null=True)),
                ('order', models.PositiveIntegerField(help_text='Manual order within a day')),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('collection', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='itinerary_items', to='adventures.collection')),
                ('content_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.contenttype')),
            ],
            options={
                'ordering': ['date', 'order'],
                'unique_together': {('collection', 'date', 'order')},
            },
        ),
    ]
@@ -0,0 +1,23 @@
# Generated by Django 5.2.6 on 2025-12-27 00:20

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('adventures', '0064_collectionitineraryitem'),
    ]

    operations = [
        migrations.AddField(
            model_name='transportation',
            name='end_code',
            field=models.CharField(blank=True, max_length=100, null=True),
        ),
        migrations.AddField(
            model_name='transportation',
            name='start_code',
            field=models.CharField(blank=True, max_length=100, null=True),
        ),
    ]
@@ -0,0 +1,19 @@
# Generated by Django 5.2.6 on 2026-01-02 18:11

import django.db.models.deletion
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('adventures', '0065_transportation_end_code_transportation_start_code'),
    ]

    operations = [
        migrations.AddField(
            model_name='collection',
            name='primary_image',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='primary_for_collections', to='adventures.contentimage'),
        ),
    ]
@@ -0,0 +1,48 @@
from django.db import migrations
import djmoney.models.fields


class Migration(migrations.Migration):

    dependencies = [
        ("adventures", "0066_collection_primary_image"),
    ]

    operations = [
        migrations.AddField(
            model_name="lodging",
            name="price_currency",
            field=djmoney.models.fields.CurrencyField(
                default="USD", editable=False, max_length=3
            ),
        ),
        migrations.AddField(
            model_name="transportation",
            name="price_currency",
            field=djmoney.models.fields.CurrencyField(
                default="USD", editable=False, max_length=3
            ),
        ),
        migrations.AlterField(
            model_name="lodging",
            name="price",
            field=djmoney.models.fields.MoneyField(
                blank=True,
                decimal_places=2,
                default_currency="USD",
                max_digits=12,
                null=True,
            ),
        ),
        migrations.AddField(
            model_name="transportation",
            name="price",
            field=djmoney.models.fields.MoneyField(
                blank=True,
                decimal_places=2,
                default_currency="USD",
                max_digits=12,
                null=True,
            ),
        ),
    ]
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@@ -0,0 +1,33 @@
# Generated by Django 5.2.8 on 2026-01-05 17:06

import django.db.models.deletion
import uuid
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('adventures', '0069_location_price_location_price_currency'),
    ]

    operations = [
        migrations.CreateModel(
            name='CollectionItineraryDay',
            fields=[
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
                ('date', models.DateField()),
                ('name', models.CharField(blank=True, max_length=200, null=True)),
                ('description', models.TextField(blank=True, null=True)),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('updated_at', models.DateTimeField(auto_now=True)),
                ('collection', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='itinerary_days', to='adventures.collection')),
            ],
            options={
                'verbose_name': 'Collection Itinerary Day',
                'verbose_name_plural': 'Collection Itinerary Days',
                'ordering': ['date'],
                'unique_together': {('collection', 'date')},
            },
        ),
    ]
@@ -0,0 +1,31 @@
# Generated by Django 5.2.8 on 2026-01-06 16:19

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('adventures', '0070_collectionitineraryday'),
        ('contenttypes', '0002_remove_content_type_name'),
    ]

    operations = [
        migrations.AlterUniqueTogether(
            name='collectionitineraryitem',
            unique_together=set(),
        ),
        migrations.AddField(
            model_name='collectionitineraryitem',
            name='is_global',
            field=models.BooleanField(default=False, help_text='Applies to the whole trip (no specific date)'),
        ),
        migrations.AddConstraint(
            model_name='collectionitineraryitem',
            constraint=models.UniqueConstraint(condition=models.Q(('is_global', False), ('date__isnull', False)), fields=('collection', 'date', 'order'), name='unique_order_per_collection_day'),
        ),
        migrations.AddConstraint(
            model_name='collectionitineraryitem',
            constraint=models.UniqueConstraint(condition=models.Q(('is_global', True)), fields=('collection', 'order'), name='unique_order_per_collection_global'),
        ),
    ]
@@ -1,4 +1,3 @@
from django.core.exceptions import ValidationError
import os
import uuid
from django.db import models
@@ -7,8 +6,8 @@ from adventures.managers import LocationManager
import threading
from django.contrib.auth import get_user_model
from django.contrib.postgres.fields import ArrayField
from django.forms import ValidationError
from django_resized import ResizedImageField
from djmoney.models.fields import MoneyField
from worldtravel.models import City, Country, Region, VisitedCity, VisitedRegion
from django.core.exceptions import ValidationError
from django.utils import timezone
@@ -16,6 +15,7 @@ from adventures.utils.timezones import TIMEZONES
from adventures.utils.sports_types import SPORT_TYPE_CHOICES
from adventures.utils.get_is_visited import is_location_visited
from django.contrib.contenttypes.fields import GenericForeignKey
from django.db.models import Q
from django.contrib.contenttypes.models import ContentType
from django.contrib.contenttypes.fields import GenericRelation

@@ -159,6 +159,7 @@ class Location(models.Model):
    tags = ArrayField(models.CharField(max_length=100), blank=True, null=True)
    description = models.TextField(blank=True, null=True)
    rating = models.FloatField(blank=True, null=True)
    price = MoneyField(max_digits=12, decimal_places=2, default_currency='USD', null=True, blank=True)
    link = models.URLField(blank=True, null=True, max_length=2083)
    is_public = models.BooleanField(default=False)
    longitude = models.DecimalField(max_digits=9, decimal_places=6, null=True, blank=True)
@@ -286,6 +287,13 @@ class Collection(models.Model):
    is_archived = models.BooleanField(default=False)
    shared_with = models.ManyToManyField(User, related_name='shared_with', blank=True)
    link = models.URLField(blank=True, null=True, max_length=2083)
    primary_image = models.ForeignKey(
        'ContentImage',
        on_delete=models.SET_NULL,
        related_name='primary_for_collections',
        null=True,
        blank=True,
    )

    # if connected locations are private and collection is public, raise an error
    def clean(self):
@@ -305,6 +313,7 @@ class Transportation(models.Model):
    name = models.CharField(max_length=200)
    description = models.TextField(blank=True, null=True)
    rating = models.FloatField(blank=True, null=True)
    price = MoneyField(max_digits=12, decimal_places=2, default_currency='USD', null=True, blank=True)
    link = models.URLField(blank=True, null=True, max_length=2083)
    date = models.DateTimeField(blank=True, null=True)
    end_date = models.DateTimeField(blank=True, null=True)
@@ -316,6 +325,8 @@ class Transportation(models.Model):
    origin_longitude = models.DecimalField(max_digits=9, decimal_places=6, null=True, blank=True)
    destination_latitude = models.DecimalField(max_digits=9, decimal_places=6, null=True, blank=True)
    destination_longitude = models.DecimalField(max_digits=9, decimal_places=6, null=True, blank=True)
    start_code = models.CharField(max_length=100, blank=True, null=True)  # Could be airport code, station code, etc.
    end_code = models.CharField(max_length=100, blank=True, null=True)  # Could be airport code, station code, etc.
    to_location = models.CharField(max_length=200, blank=True, null=True)
    is_public = models.BooleanField(default=False)
    collection = models.ForeignKey('Collection', on_delete=models.CASCADE, blank=True, null=True)
@@ -551,7 +562,7 @@ class Lodging(models.Model):
    check_out = models.DateTimeField(blank=True, null=True)
    timezone = models.CharField(max_length=50, choices=[(tz, tz) for tz in TIMEZONES], null=True, blank=True)
    reservation_number = models.CharField(max_length=100, blank=True, null=True)
    price = models.DecimalField(max_digits=9, decimal_places=2, blank=True, null=True)
    price = MoneyField(max_digits=12, decimal_places=2, default_currency='USD', null=True, blank=True)
    latitude = models.DecimalField(max_digits=9, decimal_places=6, null=True, blank=True)
    longitude = models.DecimalField(max_digits=9, decimal_places=6, null=True, blank=True)
    location = models.CharField(max_length=200, blank=True, null=True)
@@ -673,4 +684,104 @@ class Activity(models.Model):

    class Meta:
        verbose_name = "Activity"
        verbose_name_plural = "Activities"


class CollectionItineraryDay(models.Model):
    """Metadata for a specific day in a collection's itinerary"""
    id = models.UUIDField(default=uuid.uuid4, editable=False, unique=True, primary_key=True)
    collection = models.ForeignKey('Collection', on_delete=models.CASCADE, related_name='itinerary_days')
    date = models.DateField()
    name = models.CharField(max_length=200, blank=True, null=True)
    description = models.TextField(blank=True, null=True)
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta:
        unique_together = [['collection', 'date']]
        ordering = ['date']
        verbose_name = "Collection Itinerary Day"
        verbose_name_plural = "Collection Itinerary Days"

    def __str__(self):
        return f"{self.collection.name} - {self.date} - {self.name or 'Unnamed Day'}"


class CollectionItineraryItem(models.Model):
    id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)

    collection = models.ForeignKey(
        Collection,
        on_delete=models.CASCADE,
        related_name="itinerary_items"
    )

    # Generic reference to Visit, Transportation, Lodging, Note, etc.
    content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
    object_id = models.UUIDField()
    item = GenericForeignKey("content_type", "object_id")

    # Placement (planning concern, not content concern).
    # Either a specific date or marked as trip-wide (global). Exactly one of these applies.
    date = models.DateField(blank=True, null=True)
    is_global = models.BooleanField(default=False, help_text="Applies to the whole trip (no specific date)")
    order = models.PositiveIntegerField(help_text="Manual order within a day")

    created_at = models.DateTimeField(auto_now_add=True)

    class Meta:
        ordering = ["date", "order"]
        constraints = [
            # Ensure unique order per day for dated items
            models.UniqueConstraint(
                fields=["collection", "date", "order"],
                name="unique_order_per_collection_day",
                condition=Q(is_global=False) & Q(date__isnull=False),
            ),
            # Ensure unique order within the global group for a collection
            models.UniqueConstraint(
                fields=["collection", "order"],
                name="unique_order_per_collection_global",
                condition=Q(is_global=True),
            ),
        ]
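The two partial unique constraints above read as one rule: within a collection, `order` must be unique inside each dated day, and separately unique among the trip-wide (global) items. A plain-Python sketch of that invariant (the helper name and tuple shape are illustrative, not part of the model):

```python
def check_order_invariant(items):
    """items: iterable of (collection_id, date, is_global, order) tuples.
    Returns True when both partial-uniqueness rules hold."""
    seen = set()
    for collection_id, date, is_global, order in items:
        if is_global:
            # Global items share one ordering namespace per collection.
            key = (collection_id, "global", order)
        elif date is not None:
            # Dated items are ordered within their day.
            key = (collection_id, date, order)
        else:
            continue  # undated, non-global rows are not constrained
        if key in seen:
            return False
        seen.add(key)
    return True
```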

    def __str__(self):
        scope = "GLOBAL" if self.is_global else str(self.date)
        return f"{self.collection.name} - {self.content_type.model} - {scope} ({self.order})"

    def clean(self):
        # Enforce XOR between date and is_global
        if self.is_global and self.date is not None:
            raise ValidationError({
                "is_global": "Global items must not have a date.",
                "date": "Provide either a date or set is_global, not both.",
            })
        if (not self.is_global) and self.date is None:
            raise ValidationError({
                "date": "Dated items must include a date. To create a trip-wide item, set is_global=true.",
            })
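The `clean()` method enforces an exclusive-or between `date` and `is_global`. A minimal standalone sketch of the same rule (plain `ValueError` stands in for Django's `ValidationError`):

```python
def validate_placement(date, is_global):
    """Exactly one of `date` / `is_global` may apply, mirroring clean()."""
    if is_global and date is not None:
        raise ValueError("Global items must not have a date.")
    if (not is_global) and date is None:
        raise ValueError("Dated items must include a date.")
```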

    @property
    def start_datetime(self):
        obj = self.item

        for field in ("start_date", "check_in", "date"):
            if hasattr(obj, field):
                value = getattr(obj, field)
                if value:
                    return value

        return None

    @property
    def end_datetime(self):
        obj = self.item

        for field in ("end_date", "check_out"):
            if hasattr(obj, field):
                value = getattr(obj, field)
                if value:
                    return value

        return None
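Because the generic item may be a Visit (`start_date`/`end_date`), a Lodging (`check_in`/`check_out`), or a Transportation (`date`/`end_date`), both properties probe a short list of attribute names and return the first truthy value. The lookup can be exercised in isolation (the helper and fake class below are illustrative, not part of the PR):

```python
def first_truthy_attr(obj, fields):
    # Mirrors start_datetime / end_datetime: first truthy attribute wins.
    for field in fields:
        value = getattr(obj, field, None)
        if value:
            return value
    return None


class FakeLodging:
    # Stand-in for a Lodging row; only the date-ish attributes matter here.
    check_in = "2026-01-03T15:00"
    check_out = "2026-01-05T11:00"
```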

@@ -23,15 +23,28 @@ class IsPublicReadOnly(permissions.BasePermission):

class CollectionShared(permissions.BasePermission):
    """
    Allow full access if user is in shared_with of collection(s) or owner,
    read-only if public or shared_with,
    write only if owner or shared_with.
    Permission class for collection sharing functionality.

    This permission handles access control for collections and objects that are shared
    through collections. It supports:

    - Collection invite acceptance/declining for users with pending invites
    - Full access for collection owners and shared users
    - Read-only access for public objects (anonymous and authenticated users)
    - Write access for owners and users shared via collections

    Access Rules:
    - Anonymous users: read-only access to public objects
    - Invited users: can accept/decline invites for collections they're invited to
    - Collection owners: full access to their collections and related objects
    - Shared users: full access to collections they're shared with and related objects
    - Public access: read-only for objects marked as public
    """
    def has_object_permission(self, request, view, obj):
        user = request.user
        if not user or not user.is_authenticated:
            # Anonymous: only read public
            return request.method in permissions.SAFE_METHODS and obj.is_public
            return request.method in permissions.SAFE_METHODS and getattr(obj, 'is_public', False)

        # Special case for accept_invite and decline_invite actions
        # Allow access if user has a pending invite for this collection
@@ -55,10 +68,10 @@ class CollectionShared(permissions.BasePermission):

        # Read permission if public or owner
        if request.method in permissions.SAFE_METHODS:
            return obj.is_public or obj.user == user
            return getattr(obj, 'is_public', False) or getattr(obj, 'user', None) == user

        # Write permission only if owner or shared user via collections
        if obj.user == user:
        if getattr(obj, 'user', None) == user:
            return True

        if hasattr(obj, 'collections'):
@@ -120,6 +133,12 @@ class IsOwnerOrSharedWithFullAccess(permissions.BasePermission):
        if hasattr(obj, 'location'):
            obj = obj.location

        if type(obj).__name__ == 'CollectionItineraryItem':
            if hasattr(obj, 'object_id') and hasattr(obj, 'content_type'):
                content_object = obj.content_type.get_object_for_this_type(id=obj.object_id)
                obj = content_object

        # Anonymous users only get read access to public objects
        if not user or not user.is_authenticated:
            return is_safe_method and getattr(obj, 'is_public', False)
@@ -1,5 +1,5 @@
import os
from .models import Location, ContentImage, ChecklistItem, Collection, Note, Transportation, Checklist, Visit, Category, ContentAttachment, Lodging, CollectionInvite, Trail, Activity
from .models import Location, ContentImage, ChecklistItem, Collection, Note, Transportation, Checklist, Visit, Category, ContentAttachment, Lodging, CollectionInvite, Trail, Activity, CollectionItineraryItem, CollectionItineraryDay
from rest_framework import serializers
from main.utils import CustomModelSerializer
from users.serializers import CustomUserDetailsSerializer
@@ -7,11 +7,38 @@ from worldtravel.serializers import CountrySerializer, RegionSerializer, CitySer
from geopy.distance import geodesic
from integrations.models import ImmichIntegration
from adventures.utils.geojson import gpx_to_geojson
import gpxpy
import logging

logger = logging.getLogger(__name__)


def _build_profile_pic_url(user):
    """Return absolute-ish profile pic URL using PUBLIC_URL if available."""
    if not getattr(user, 'profile_pic', None):
        return None

    public_url = os.environ.get('PUBLIC_URL', 'http://127.0.0.1:8000').rstrip('/')
    public_url = public_url.replace("'", "")
    return f"{public_url}/media/{user.profile_pic.name}"


def _serialize_collaborator(user, owner_id=None, request_user=None):
    if not user:
        return None

    return {
        'uuid': str(user.uuid),
        'username': user.username,
        'first_name': user.first_name,
        'last_name': user.last_name,
        'profile_pic': _build_profile_pic_url(user),
        'public_profile': bool(getattr(user, 'public_profile', False)),
        'is_owner': owner_id == user.id,
        'is_current_user': bool(request_user and request_user.id == user.id),
    }


class ContentImageSerializer(CustomModelSerializer):
    class Meta:
        model = ContentImage
@@ -205,6 +232,31 @@ class VisitSerializer(serializers.ModelSerializer):
        if not validated_data.get('end_date') and validated_data.get('start_date'):
            validated_data['end_date'] = validated_data['start_date']
        return super().create(validated_data)


class CalendarVisitSerializer(serializers.ModelSerializer):
    class Meta:
        model = Visit
        fields = ['id', 'start_date', 'end_date', 'timezone']


class CalendarLocationSerializer(serializers.ModelSerializer):
    visits = CalendarVisitSerializer(many=True, read_only=True)
    category = serializers.SerializerMethodField()

    class Meta:
        model = Location
        fields = ['id', 'name', 'location', 'category', 'visits']

    def get_category(self, obj):
        if not obj.category:
            return None

        return {
            "name": obj.category.name,
            "icon": obj.category.icon,
        }


class LocationSerializer(CustomModelSerializer):
    images = serializers.SerializerMethodField()
@@ -227,7 +279,8 @@ class LocationSerializer(CustomModelSerializer):
        fields = [
            'id', 'name', 'description', 'rating', 'tags', 'location',
            'is_public', 'collections', 'created_at', 'updated_at', 'images', 'link', 'longitude',
            'latitude', 'visits', 'is_visited', 'category', 'attachments', 'user', 'city', 'country', 'region', 'trails'
            'latitude', 'visits', 'is_visited', 'category', 'attachments', 'user', 'city', 'country', 'region', 'trails',
            'price', 'price_currency'
        ]
        read_only_fields = ['id', 'created_at', 'updated_at', 'user', 'is_visited']

@@ -424,17 +477,19 @@ class TransportationSerializer(CustomModelSerializer):
    distance = serializers.SerializerMethodField()
    images = serializers.SerializerMethodField()
    attachments = serializers.SerializerMethodField()
    travel_duration_minutes = serializers.SerializerMethodField()

    class Meta:
        model = Transportation
        fields = [
            'id', 'user', 'type', 'name', 'description', 'rating',
            'id', 'user', 'type', 'name', 'description', 'rating', 'price', 'price_currency',
            'link', 'date', 'flight_number', 'from_location', 'to_location',
            'is_public', 'collection', 'created_at', 'updated_at', 'end_date',
            'origin_latitude', 'origin_longitude', 'destination_latitude', 'destination_longitude',
            'start_timezone', 'end_timezone', 'distance', 'images', 'attachments'
            'start_timezone', 'end_timezone', 'distance', 'images', 'attachments', 'start_code', 'end_code',
            'travel_duration_minutes'
        ]
        read_only_fields = ['id', 'created_at', 'updated_at', 'user', 'distance']
        read_only_fields = ['id', 'created_at', 'updated_at', 'user', 'distance', 'travel_duration_minutes']

    def get_images(self, obj):
        serializer = ContentImageSerializer(obj.images.all(), many=True, context=self.context)
@@ -447,6 +502,10 @@ class TransportationSerializer(CustomModelSerializer):
        return [attachment for attachment in serializer.data if attachment is not None]

    def get_distance(self, obj):
        gpx_distance = self._get_gpx_distance_km(obj)
        if gpx_distance is not None:
            return gpx_distance

        if (
            obj.origin_latitude and obj.origin_longitude and
            obj.destination_latitude and obj.destination_longitude
@@ -459,6 +518,68 @@ class TransportationSerializer(CustomModelSerializer):
            return None
        return None

    def _get_gpx_distance_km(self, obj):
        gpx_attachments = obj.attachments.filter(file__iendswith='.gpx')
        for attachment in gpx_attachments:
            distance_km = self._parse_gpx_distance_km(attachment.file)
            if distance_km is not None:
                return distance_km
        return None

    def _parse_gpx_distance_km(self, gpx_file_field):
        try:
            with gpx_file_field.open('r') as gpx_file:
                gpx = gpxpy.parse(gpx_file)

            total_meters = 0.0

            for track in gpx.tracks:
                for segment in track.segments:
                    segment_length = segment.length_3d() or segment.length_2d()
                    if segment_length:
                        total_meters += segment_length

            for route in gpx.routes:
                route_length = route.length_3d() or route.length_2d()
                if route_length:
                    total_meters += route_length

            if total_meters > 0:
                return round(total_meters / 1000, 2)
        except Exception as exc:
            logger.warning(
                "Failed to calculate GPX distance for file %s: %s",
                getattr(gpx_file_field, 'name', 'unknown'),
                exc,
            )
        return None

    def get_travel_duration_minutes(self, obj):
        if not obj.date or not obj.end_date:
            return None

        if self._is_all_day(obj.date) and self._is_all_day(obj.end_date):
            return None

        try:
            total_minutes = int((obj.end_date - obj.date).total_seconds() // 60)
            return total_minutes if total_minutes >= 0 else None
        except Exception:
            logger.warning(
                "Failed to calculate travel duration for transportation %s",
                getattr(obj, "id", "unknown"),
                exc_info=True,
            )
            return None

    def _is_all_day(self, dt_value):
        return (
            dt_value.time().hour == 0
            and dt_value.time().minute == 0
            and dt_value.time().second == 0
            and dt_value.time().microsecond == 0
        )
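`get_travel_duration_minutes` skips pairs of midnight-exact datetimes, treating them as all-day entries with no meaningful duration. The core arithmetic, extracted into a standalone sketch:

```python
from datetime import datetime


def is_all_day(dt):
    # A datetime at exactly midnight is treated as an all-day marker.
    t = dt.time()
    return t.hour == 0 and t.minute == 0 and t.second == 0 and t.microsecond == 0


def travel_duration_minutes(start, end):
    if not start or not end:
        return None
    if is_all_day(start) and is_all_day(end):
        return None  # all-day pair: duration is not meaningful
    total = int((end - start).total_seconds() // 60)
    return total if total >= 0 else None  # negative ranges are rejected
```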


class LodgingSerializer(CustomModelSerializer):
    images = serializers.SerializerMethodField()
    attachments = serializers.SerializerMethodField()
@@ -467,7 +588,7 @@ class LodgingSerializer(CustomModelSerializer):
        model = Lodging
        fields = [
            'id', 'user', 'name', 'description', 'rating', 'link', 'check_in', 'check_out',
            'reservation_number', 'price', 'latitude', 'longitude', 'location', 'is_public',
            'reservation_number', 'price', 'price_currency', 'latitude', 'longitude', 'location', 'is_public',
            'collection', 'created_at', 'updated_at', 'type', 'timezone', 'images', 'attachments'
        ]
        read_only_fields = ['id', 'created_at', 'updated_at', 'user']
@@ -583,16 +704,74 @@ class ChecklistSerializer(CustomModelSerializer):
        return data

class CollectionSerializer(CustomModelSerializer):
    collaborators = serializers.SerializerMethodField()
    locations = serializers.SerializerMethodField()
    transportations = serializers.SerializerMethodField()
    notes = serializers.SerializerMethodField()
    checklists = serializers.SerializerMethodField()
    lodging = serializers.SerializerMethodField()
    status = serializers.SerializerMethodField()
    days_until_start = serializers.SerializerMethodField()
    primary_image = ContentImageSerializer(read_only=True)
    primary_image_id = serializers.PrimaryKeyRelatedField(
        queryset=ContentImage.objects.all(),
        source='primary_image',
        write_only=True,
        required=False,
        allow_null=True,
    )

    class Meta:
        model = Collection
        fields = ['id', 'description', 'user', 'name', 'is_public', 'locations', 'created_at', 'start_date', 'end_date', 'transportations', 'notes', 'updated_at', 'checklists', 'is_archived', 'shared_with', 'link', 'lodging']
        read_only_fields = ['id', 'created_at', 'updated_at', 'user', 'shared_with']
        fields = [
            'id',
            'description',
            'user',
            'name',
            'is_public',
            'locations',
            'created_at',
            'start_date',
            'end_date',
            'transportations',
            'notes',
            'updated_at',
            'checklists',
            'is_archived',
            'shared_with',
            'collaborators',
            'link',
            'lodging',
            'status',
            'days_until_start',
            'primary_image',
            'primary_image_id',
        ]
        read_only_fields = ['id', 'created_at', 'updated_at', 'user', 'shared_with', 'status', 'days_until_start', 'primary_image']

    def get_collaborators(self, obj):
        request = self.context.get('request')
        request_user = getattr(request, 'user', None) if request else None

        users = []
        if obj.user:
            users.append(obj.user)
        users.extend(list(obj.shared_with.all()))

        collaborators = []
        seen = set()
        for user in users:
            if not user:
                continue
            key = str(user.uuid)
            if key in seen:
                continue
            seen.add(key)
            serialized = _serialize_collaborator(user, owner_id=obj.user_id, request_user=request_user)
            if serialized:
                collaborators.append(serialized)

        return collaborators
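`get_collaborators` builds an owner-first, order-preserving list de-duplicated on `str(user.uuid)`. The dedup step in isolation (hypothetical helper name, same seen-set pattern):

```python
def dedupe_preserving_order(keys):
    # First occurrence wins; later duplicates are dropped, as in
    # get_collaborators where the owner is appended before shared users.
    seen = set()
    result = []
    for key in keys:
        if key in seen:
            continue
        seen.add(key)
        result.append(key)
    return result
```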

    def get_locations(self, obj):
        if self.context.get('nested', False):
@@ -629,6 +808,72 @@ class CollectionSerializer(CustomModelSerializer):
            return []
        return LodgingSerializer(obj.lodging_set.all(), many=True, context=self.context).data

    def get_status(self, obj):
        """Calculate the status of the collection based on dates"""
        from datetime import date

        # If no dates, it's a folder
        if not obj.start_date or not obj.end_date:
            return 'folder'

        today = date.today()

        # Future trip
        if obj.start_date > today:
            return 'upcoming'

        # Past trip
        if obj.end_date < today:
            return 'completed'

        # Current trip
        return 'in_progress'

    def get_days_until_start(self, obj):
        """Calculate days until start for upcoming collections"""
        from datetime import date

        if not obj.start_date:
            return None

        today = date.today()

        if obj.start_date > today:
            return (obj.start_date - today).days

        return None
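Both collection serializers compute status the same way: missing dates mean a folder; otherwise today is compared against the date range. A testable extraction with an injectable `today` (the serializers themselves always use `date.today()`):

```python
from datetime import date


def collection_status(start, end, today):
    # No dates: the collection is a folder, not a trip.
    if not start or not end:
        return 'folder'
    if start > today:
        return 'upcoming'
    if end < today:
        return 'completed'
    return 'in_progress'


def days_until_start(start, today):
    # Only meaningful for upcoming trips; otherwise None.
    if start and start > today:
        return (start - today).days
    return None
```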

    def validate(self, attrs):
        data = super().validate(attrs)

        # Only validate primary image when explicitly provided
        if 'primary_image' not in data:
            return data

        primary_image = data.get('primary_image')
        if primary_image is None:
            return data

        request = self.context.get('request')
        if request and primary_image.user != request.user:
            raise serializers.ValidationError({
                'primary_image_id': 'You can only choose cover images you own.'
            })

        if self.instance and not self._image_belongs_to_collection(primary_image, self.instance):
            raise serializers.ValidationError({
                'primary_image_id': 'Cover image must come from a location in this collection.'
            })

        return data

    def _image_belongs_to_collection(self, image, collection):
        if ContentImage.objects.filter(id=image.id, location__collections=collection).exists():
            return True
        if ContentImage.objects.filter(id=image.id, visit__location__collections=collection).exists():
            return True
        return False

    def to_representation(self, instance):
        representation = super().to_representation(instance)

@@ -660,34 +905,109 @@ class CollectionInviteSerializer(serializers.ModelSerializer):

class UltraSlimCollectionSerializer(serializers.ModelSerializer):
    location_images = serializers.SerializerMethodField()
    location_count = serializers.SerializerMethodField()
    status = serializers.SerializerMethodField()
    days_until_start = serializers.SerializerMethodField()
    primary_image = ContentImageSerializer(read_only=True)
    collaborators = serializers.SerializerMethodField()

    class Meta:
        model = Collection
        fields = [
            'id', 'user', 'name', 'description', 'is_public', 'start_date', 'end_date',
            'is_archived', 'link', 'created_at', 'updated_at', 'location_images',
            'location_count', 'shared_with'
            'location_count', 'shared_with', 'collaborators', 'status', 'days_until_start', 'primary_image'
        ]
        read_only_fields = fields  # All fields are read-only for listing

    def get_collaborators(self, obj):
        request = self.context.get('request')
        request_user = getattr(request, 'user', None) if request else None

        users = []
        if obj.user:
            users.append(obj.user)
        users.extend(list(obj.shared_with.all()))

        collaborators = []
        seen = set()
        for user in users:
            if not user:
                continue
            key = str(user.uuid)
            if key in seen:
                continue
            seen.add(key)
            serialized = _serialize_collaborator(user, owner_id=obj.user_id, request_user=request_user)
            if serialized:
                collaborators.append(serialized)

        return collaborators

    def get_location_images(self, obj):
        """Get primary images from locations in this collection, optimized with select_related"""
        # Filter first, then slice (removed slicing)
        images = ContentImage.objects.filter(
            location__collections=obj
        ).select_related('user').prefetch_related('location')
        images = list(
            ContentImage.objects.filter(location__collections=obj)
            .select_related('user')
        )

        return ContentImageSerializer(
        def sort_key(image):
            if obj.primary_image and image.id == obj.primary_image.id:
                return (0, str(image.id))
            if image.is_primary:
                return (1, str(image.id))
            return (2, str(image.id))

        images.sort(key=sort_key)

        serializer = ContentImageSerializer(
            images,
            many=True,
            context={'request': self.context.get('request')}
        ).data
        )
        # Filter out None values from the serialized data
        return [image for image in serializer.data if image is not None]

    def get_location_count(self, obj):
        """Get count of locations in this collection"""
        # This uses the cached count if available, or does a simple count query
        return obj.locations.count()

    def get_status(self, obj):
        """Calculate the status of the collection based on dates"""
        from datetime import date

        # If no dates, it's a folder
        if not obj.start_date or not obj.end_date:
            return 'folder'

        today = date.today()

        # Future trip
        if obj.start_date > today:
            return 'upcoming'

        # Past trip
        if obj.end_date < today:
            return 'completed'

        # Current trip
        return 'in_progress'

    def get_days_until_start(self, obj):
        """Calculate days until start for upcoming collections"""
        from datetime import date

        if not obj.start_date:
            return None

        today = date.today()

        if obj.start_date > today:
            return (obj.start_date - today).days

        return None

    def to_representation(self, instance):
        representation = super().to_representation(instance)

@@ -700,4 +1020,47 @@ class UltraSlimCollectionSerializer(serializers.ModelSerializer):
            shared_uuids.append(str(user.uuid))
        representation['shared_with'] = shared_uuids
        return representation


class CollectionItineraryDaySerializer(CustomModelSerializer):
    class Meta:
        model = CollectionItineraryDay
        fields = ['id', 'collection', 'date', 'name', 'description', 'created_at', 'updated_at']
        read_only_fields = ['id', 'created_at', 'updated_at']

    def update(self, instance, validated_data):
        # Security: Prevent changing collection or date after creation
        # This prevents shared users from reassigning itinerary days to themselves
        validated_data.pop('collection', None)
        validated_data.pop('date', None)
        return super().update(instance, validated_data)


class CollectionItineraryItemSerializer(CustomModelSerializer):
    item = serializers.SerializerMethodField()
    start_datetime = serializers.ReadOnlyField()
    end_datetime = serializers.ReadOnlyField()
    object_name = serializers.ReadOnlyField(source='content_type.model')

    class Meta:
        model = CollectionItineraryItem
        fields = ['id', 'collection', 'content_type', 'object_id', 'item', 'date', 'is_global', 'order', 'start_datetime', 'end_datetime', 'created_at', 'object_name']
        read_only_fields = ['id', 'created_at', 'start_datetime', 'end_datetime', 'item', 'object_name']

    def update(self, instance, validated_data):
        # Security: Prevent changing collection, content_type, or object_id after creation
        # This prevents shared users from reassigning itinerary items to themselves
        # or linking items to objects they don't have permission to access
        validated_data.pop('collection', None)
        validated_data.pop('content_type', None)
        validated_data.pop('object_id', None)
        return super().update(instance, validated_data)

    def get_item(self, obj):
        """Return id and type for the linked item"""
        if not obj.item:
            return None

        return {
            'id': str(obj.item.id),
            'type': obj.content_type.model,
        }
@@ -1,7 +1,10 @@
from django.db.models.signals import m2m_changed
from django.db.models.signals import m2m_changed, post_delete
from django.dispatch import receiver
from django.contrib.contenttypes.models import ContentType

from adventures.models import Location


@receiver(m2m_changed, sender=Location.collections.through)
def update_adventure_publicity(sender, instance, action, **kwargs):
    """
@@ -24,3 +27,48 @@ def update_adventure_publicity(sender, instance, action, **kwargs):
    elif not has_public_collection and instance.is_public:
        instance.is_public = False
        instance.save(update_fields=['is_public'])


@receiver(post_delete)
def _remove_collection_itinerary_items_on_object_delete(sender, instance, **kwargs):
    """
    When any model instance is deleted, remove any CollectionItineraryItem that
    refers to it via the GenericForeignKey (matches by ContentType and object_id).

    This ensures that if a referenced item (e.g. a `Location`, `Visit`, `Transportation`,
    `Note`, etc.) is deleted, the itinerary entry that pointed to it is also removed.
    """
    # Avoid acting when a CollectionItineraryItem itself is deleted
    # to prevent needless extra queries.
    if sender.__name__ == 'CollectionItineraryItem':
        return

    # Resolve the content type for the model that was deleted
    try:
        ct = ContentType.objects.get_for_model(sender)
    except Exception:
        return

    # Import here to avoid circular import problems at module import time
    from adventures.models import CollectionItineraryItem

    # Try matching the primary key in its native form first, then as a string.
    # CollectionItineraryItem.object_id is a UUIDField in the model, but some
    # senders might have different PK representations; handle both safely.
    pk = instance.pk
    deleted = False
    try:
        qs = CollectionItineraryItem.objects.filter(content_type=ct, object_id=pk)
        if qs.exists():
            qs.delete()
            deleted = True
    except Exception:
        pass

    if not deleted:
        try:
            CollectionItineraryItem.objects.filter(content_type=ct, object_id=str(pk)).delete()
        except Exception:
            # If deletion fails for any reason, do nothing; we don't want to
            # raise errors during another model's delete.
            pass
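The two-pass filter at the end of the signal handler (native `pk`, then `str(pk)`) guards against the stored `object_id` and the sender's primary key disagreeing in type. A toy illustration of the fallback with an in-memory store (names and store shape are illustrative, not the project's schema):

```python
import uuid

# (content_type, object_id) -> itinerary entry; object_id happens to be stored as str
store = {}
pk = uuid.uuid4()
store[('location', str(pk))] = 'itinerary entry'

def remove_entries(content_type, pk):
    # Try the native representation first, then fall back to the string form.
    for candidate in (pk, str(pk)):
        key = (content_type, candidate)
        if key in store:
            del store[key]
            return True
    return False

print(remove_entries('location', pk))  # True: matched via str(pk)
```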
@@ -23,6 +23,8 @@ router.register(r'backup', BackupViewSet, basename='backup')
router.register(r'trails', TrailViewSet, basename='trails')
router.register(r'activities', ActivityViewSet, basename='activities')
router.register(r'visits', VisitViewSet, basename='visits')
router.register(r'itineraries', ItineraryViewSet, basename='itineraries')
router.register(r'itinerary-days', ItineraryDayViewSet, basename='itinerary-days')

urlpatterns = [
    # Include the router under the 'api/' prefix
backend/server/adventures/utils/autogenerate_itinerary.py (new file, 221 lines)
@@ -0,0 +1,221 @@
from typing import List
from datetime import date, timedelta
from django.db import transaction
from django.contrib.contenttypes.models import ContentType
from django.utils import timezone
from pytz import timezone as pytz_timezone
from adventures.models import Collection, CollectionItineraryItem, Visit, Lodging, Transportation, Note, Checklist
from rest_framework.exceptions import ValidationError


def _datetime_to_date_in_timezone(dt, timezone_str: str | None) -> date:
    """
    Convert a datetime to a date, accounting for timezone only if there's actual time information.

    If the datetime is at UTC midnight (00:00:00), treat it as a date-only value and don't convert.
    If the datetime has a time component, apply timezone conversion.

    Args:
        dt: datetime object (can be timezone-aware or naive)
        timezone_str: IANA timezone string (e.g., 'America/New_York')

    Returns:
        date: The date in the specified timezone (or UTC if date-only)
    """
    if dt is None:
        return None

    # If it's already a date, return it
    if isinstance(dt, date) and not hasattr(dt, 'time'):
        return dt

    # Check if this is a date-only value (stored as UTC midnight)
    # If time is 00:00:00, treat it as date-only and don't apply timezone conversion
    if hasattr(dt, 'hour') and dt.hour == 0 and dt.minute == 0 and dt.second == 0:
        return dt.date() if hasattr(dt, 'date') else dt

    # Ensure datetime is timezone-aware (assume UTC if naive)
    if hasattr(dt, 'tzinfo') and dt.tzinfo is None:
        dt = timezone.make_aware(dt, timezone.utc)

    # Convert to target timezone if provided, otherwise use UTC
    if timezone_str:
        try:
            target_tz = pytz_timezone(timezone_str)
            dt = dt.astimezone(target_tz)
        except Exception:
            # If timezone conversion fails, use UTC
            pass

    return dt.date() if hasattr(dt, 'date') else dt

def auto_generate_itinerary(collection: Collection) -> List[CollectionItineraryItem]:
    """
    Auto-generate itinerary items for a collection based on dated records.

    Rules:
    - Visits: Create one item per day of the visit (spanning multiple days)
    - Lodging: Create one item on check_in date only
    - Transportation: Create one item on start date
    - Notes: Create one item on their date if present
    - Checklists: Create one item on their date if present

    Order within a day (incremental):
    1. Lodging (check-ins)
    2. Visits
    3. Transportation
    4. Notes
    5. Checklists

    Args:
        collection: Collection to generate itinerary for

    Returns:
        List[CollectionItineraryItem]: Created itinerary items

    Raises:
        ValidationError: If collection already has itinerary items or has no dated records
    """

    # Validation: collection must have zero itinerary items
    if collection.itinerary_items.exists():
        raise ValidationError({
            "detail": "Collection already has itinerary items. Cannot auto-generate."
        })

    # Get collection date range
    if not collection.start_date or not collection.end_date:
        raise ValidationError({
            "detail": "Collection must have start_date and end_date set."
        })

    start_date = collection.start_date
    end_date = collection.end_date

    # Collect all items to be added, grouped by date
    items_by_date = {}  # date -> [(content_type, object_id, priority)]

    # Priority order for sorting within a day
    PRIORITY_LODGING = 1
    PRIORITY_VISIT = 2
    PRIORITY_TRANSPORTATION = 3
    PRIORITY_NOTE = 4
    PRIORITY_CHECKLIST = 5

    # Process Visits: one location item per day of the visit
    # Note: We reference the Location, not the Visit itself
    from adventures.models import Location

    visits = Visit.objects.filter(location__collections=collection).select_related('location').distinct()
    for visit in visits:
        if visit.start_date and visit.location:
            # Convert to date using visit's timezone
            visit_start = _datetime_to_date_in_timezone(visit.start_date, visit.timezone)
            visit_end = _datetime_to_date_in_timezone(visit.end_date, visit.timezone) if visit.end_date else visit_start

            # Only include dates within collection range
            visit_start = max(visit_start, start_date)
            visit_end = min(visit_end or visit_start, end_date)

            current_date = visit_start
            while current_date <= visit_end:
                if current_date not in items_by_date:
                    items_by_date[current_date] = []
                items_by_date[current_date].append((
                    ContentType.objects.get_for_model(Location),
                    visit.location.id,  # Use Location ID, not Visit ID
                    PRIORITY_VISIT
                ))
                current_date += timedelta(days=1)

    # Process Lodging: one item on check_in date only
    lodgings = Lodging.objects.filter(collection=collection)
    for lodging in lodgings:
        if lodging.check_in:
            # Convert to date using lodging's timezone
            checkin_date = _datetime_to_date_in_timezone(lodging.check_in, lodging.timezone)

            # Only include if within collection range
            if start_date <= checkin_date <= end_date:
                if checkin_date not in items_by_date:
                    items_by_date[checkin_date] = []
                items_by_date[checkin_date].append((
                    ContentType.objects.get_for_model(Lodging),
                    lodging.id,
                    PRIORITY_LODGING
                ))

    # Process Transportation: one item on start date
    transportations = Transportation.objects.filter(collection=collection)
    for transportation in transportations:
        if transportation.date:
            # Convert to date using transportation's start timezone
            trans_date = _datetime_to_date_in_timezone(transportation.date, transportation.start_timezone)

            # Only include if within collection range
            if start_date <= trans_date <= end_date:
                if trans_date not in items_by_date:
                    items_by_date[trans_date] = []
                items_by_date[trans_date].append((
                    ContentType.objects.get_for_model(Transportation),
                    transportation.id,
                    PRIORITY_TRANSPORTATION
                ))

    # Process Notes: one item on their date
    notes = Note.objects.filter(collection=collection)
    for note in notes:
        if note.date:
            # Notes don't have timezone field, use UTC
            note_date = _datetime_to_date_in_timezone(note.date, None)

            # Only include if within collection range
            if start_date <= note_date <= end_date:
                if note_date not in items_by_date:
                    items_by_date[note_date] = []
                items_by_date[note_date].append((
                    ContentType.objects.get_for_model(Note),
                    note.id,
                    PRIORITY_NOTE
                ))

    # Process Checklists: one item on their date
    checklists = Checklist.objects.filter(collection=collection)
    for checklist in checklists:
        if checklist.date:
            # Checklists don't have timezone field, use UTC
            checklist_date = _datetime_to_date_in_timezone(checklist.date, None)

            # Only include if within collection range
            if start_date <= checklist_date <= end_date:
                if checklist_date not in items_by_date:
                    items_by_date[checklist_date] = []
                items_by_date[checklist_date].append((
                    ContentType.objects.get_for_model(Checklist),
                    checklist.id,
                    PRIORITY_CHECKLIST
                ))

    # Validation: must have at least one dated record
    if not items_by_date:
        raise ValidationError({
            "detail": "No dated records found within collection date range."
        })

    # Create itinerary items
    created_items = []

    for day_date in sorted(items_by_date.keys()):
        # Sort items by priority within the day
        items = sorted(items_by_date[day_date], key=lambda x: x[2])

        for order, (content_type, object_id, priority) in enumerate(items):
            itinerary_item = CollectionItineraryItem.objects.create(
                collection=collection,
                content_type=content_type,
                object_id=object_id,
                date=day_date,
                order=order
            )
            created_items.append(itinerary_item)

    return created_items
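Stripped of the ORM, the core of the generator is grouping tuples by date and numbering them with per-day priority order. That scheduling step in isolation, with plain data (record names are illustrative):

```python
from datetime import date

# Same within-day priority as auto_generate_itinerary
PRIORITY = {'lodging': 1, 'visit': 2, 'transportation': 3, 'note': 4, 'checklist': 5}

records = [
    ('visit', 'loc-1', date(2025, 6, 2)),
    ('lodging', 'hotel-1', date(2025, 6, 2)),
    ('note', 'note-1', date(2025, 6, 1)),
]

items_by_date = {}
for kind, obj_id, day in records:
    items_by_date.setdefault(day, []).append((PRIORITY[kind], kind, obj_id))

schedule = []
for day in sorted(items_by_date):
    # Sort by priority, then assign incremental per-day order values.
    for order, (_, kind, obj_id) in enumerate(sorted(items_by_date[day])):
        schedule.append((day, order, kind, obj_id))

print(schedule)
# [(date(2025, 6, 1), 0, 'note', 'note-1'),
#  (date(2025, 6, 2), 0, 'lodging', 'hotel-1'),
#  (date(2025, 6, 2), 1, 'visit', 'loc-1')]
```

Note the lodging check-in lands before the visit on June 2 even though it appears later in the input, which is exactly the documented "Order within a day" rule.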
backend/server/adventures/utils/itinerary.py (new file, 115 lines)
@@ -0,0 +1,115 @@
from typing import List
from django.db import transaction
from django.utils.dateparse import parse_date, parse_datetime
from rest_framework.exceptions import ValidationError, PermissionDenied
from adventures.models import CollectionItineraryItem


@transaction.atomic
def reorder_itinerary_items(user, items_data: List[dict]):
    """Reorder itinerary items in bulk.

    Args:
        user: requesting user (for permission checks)
        items_data: list of dicts with keys `id`, `date`, `order`

    Returns:
        List[CollectionItineraryItem]: updated items (unsaved instances are saved by this function)

    Raises:
        ValidationError, PermissionDenied
    """
    if not items_data:
        raise ValidationError({"items": "This field is required and must not be empty."})

    if not isinstance(items_data, list):
        raise ValidationError({"items": "Must be a list of item updates."})

    # Resolve ids and fetch items
    item_ids = [item.get('id') for item in items_data if item.get('id')]
    items_qs = CollectionItineraryItem.objects.filter(id__in=item_ids).select_related('collection')

    if items_qs.count() != len(item_ids):
        raise ValidationError({"items": "One or more items not found."})

    items_map = {str(it.id): it for it in items_qs}

    # Permission checks: user must be collection owner or in shared_with
    for item_id in item_ids:
        item = items_map.get(item_id)
        if not item:
            continue

        collection = item.collection
        if not (collection.user == user or collection.shared_with.filter(id=user.id).exists()):
            raise PermissionDenied("You do not have permission to modify items in this collection.")

    # Two-phase update to avoid unique constraint races:
    # 1) assign very large temporary order values (guaranteed > existing orders)
    # 2) assign final date/order values
    temp_offset = 1_000_000
    temp_updates = []
    for i, item_data in enumerate(items_data):
        item_id = item_data.get('id')
        if not item_id:
            continue
        item = items_map.get(item_id)
        if not item:
            continue
        item.order = temp_offset + i
        temp_updates.append(item)

    if temp_updates:
        CollectionItineraryItem.objects.bulk_update(temp_updates, ['order'])

    # Finalize
    updated_items = []
    for item_data in items_data:
        item_id = item_data.get('id')
        if not item_id:
            continue
        item = items_map.get(item_id)
        if not item:
            continue

        new_date = item_data.get('date')
        new_is_global = item_data.get('is_global')
        new_order = item_data.get('order')
        # If is_global is explicitly provided, set it and reconcile date accordingly
        if new_is_global is not None:
            item.is_global = bool(new_is_global)
            if item.is_global:
                item.date = None
        if (new_date is not None) and (not item.is_global):
            # validate date is within collection bounds (if collection has start/end)
            parsed = None
            try:
                parsed = parse_date(str(new_date))
            except Exception:
                parsed = None
            if parsed is None:
                try:
                    dt = parse_datetime(str(new_date))
                    if dt:
                        parsed = dt.date()
                except Exception:
                    parsed = None

            collection = item.collection
            if parsed and collection:
                if collection.start_date and parsed < collection.start_date:
                    raise ValidationError({"items": f"Item {item_id} date {parsed} is before collection start date {collection.start_date}."})
                if collection.end_date and parsed > collection.end_date:
                    raise ValidationError({"items": f"Item {item_id} date {parsed} is after collection end date {collection.end_date}."})

            item.date = new_date
        if new_order is not None:
            item.order = new_order

        updated_items.append(item)

    if updated_items:
        CollectionItineraryItem.objects.bulk_update(updated_items, ['date', 'is_global', 'order'])

    return updated_items
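The two-phase trick generalizes: when (date, order) pairs carry a unique constraint, swapping two rows directly can collide mid-update, so every row is first parked on a disjoint temporary range. A toy model of the idea, with a set standing in for the database's uniqueness check (illustrative sketch, not the project code):

```python
def reorder(rows, new_orders, taken):
    """rows: {id: order}; new_orders: {id: desired order}; taken: set of occupied orders."""
    temp_offset = 1_000_000
    # Phase 1: park every affected row on a unique temporary order.
    for i, row_id in enumerate(new_orders):
        taken.discard(rows[row_id])
        rows[row_id] = temp_offset + i
        taken.add(rows[row_id])
    # Phase 2: assign final orders; the final slots were vacated in phase 1.
    for row_id, order in new_orders.items():
        if order in taken:
            raise ValueError(f'order {order} already occupied')
        taken.discard(rows[row_id])
        rows[row_id] = order
        taken.add(order)
    return rows

rows = {'a': 0, 'b': 1}
taken = {0, 1}
print(reorder(rows, {'a': 1, 'b': 0}, taken))  # {'a': 1, 'b': 0}
```

Without phase 1, setting `a` to order 1 while `b` still holds 1 would violate the constraint even though the final state is valid; the large offset assumes real orders never reach it.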
@@ -17,4 +17,5 @@ from .recommendations_view import *
from .import_export_view import *
from .trail_view import *
from .activity_view import *
from .visit_view import *
from .itinerary_view import *
@@ -2,11 +2,13 @@ from rest_framework import viewsets, status
from rest_framework.decorators import action
from rest_framework.response import Response
from django.db.models import Q
from adventures.models import Checklist
from adventures.models import Checklist, CollectionItineraryItem
from adventures.serializers import ChecklistSerializer
from rest_framework.exceptions import PermissionDenied
from adventures.permissions import IsOwnerOrSharedWithFullAccess
from rest_framework.permissions import IsAuthenticated
from django.contrib.contenttypes.models import ContentType
from django.db import transaction

class ChecklistViewSet(viewsets.ModelViewSet):
    serializer_class = ChecklistSerializer
@@ -43,15 +45,18 @@ class ChecklistViewSet(viewsets.ModelViewSet):
        # Retrieve the current object
        instance = self.get_object()

        # Store the old date before updating
        old_date = instance.date

        # Partially update the instance with the request data
        serializer = self.get_serializer(instance, data=request.data, partial=True)
        serializer.is_valid(raise_exception=True)

        # Retrieve the collection from the validated data
        new_collection = serializer.validated_data.get('collection')
        new_date = serializer.validated_data.get('date', old_date)

        user = request.user
        print(new_collection)

        if new_collection is not None and new_collection != instance.collection:
            # Check if the user is the owner of the new collection
@@ -62,34 +67,15 @@ class ChecklistViewSet(viewsets.ModelViewSet):
            if instance.collection is not None and instance.collection.user != user:
                raise PermissionDenied("You cannot remove the collection as you are not the owner.")

        # Perform the update
        self.perform_update(serializer)

        # Return the updated instance
        return Response(serializer.data)

    def partial_update(self, request, *args, **kwargs):
        # Retrieve the current object
        instance = self.get_object()

        # Partially update the instance with the request data
        serializer = self.get_serializer(instance, data=request.data, partial=True)
        serializer.is_valid(raise_exception=True)

        # Retrieve the collection from the validated data
        new_collection = serializer.validated_data.get('collection')

        user = request.user
        print(new_collection)

        if new_collection is not None and new_collection != instance.collection:
            # Check if the user is the owner of the new collection
            if new_collection.user != user or instance.user != user:
                raise PermissionDenied("You do not have permission to use this collection.")
        elif new_collection is None:
            # Handle the case where the user is trying to set the collection to None
            if instance.collection is not None and instance.collection.user != user:
                raise PermissionDenied("You cannot remove the collection as you are not the owner.")
        # If the date changed, remove old itinerary items for this checklist on the old date
        if old_date and new_date and old_date != new_date:
            checklist_ct = ContentType.objects.get_for_model(Checklist)
            old_itinerary_items = CollectionItineraryItem.objects.filter(
                content_type=checklist_ct,
                object_id=str(instance.id),
                date=old_date
            )
            old_itinerary_items.delete()

        # Perform the update
        self.perform_update(serializer)
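The date-change branch keys the stale rows by (content type, object id, old date), so only entries that pinned this checklist to its previous day are removed. The same logic in miniature, with an in-memory list standing in for the table (illustrative names and data):

```python
from datetime import date

itinerary = [
    {'type': 'checklist', 'object_id': 'c1', 'date': date(2025, 6, 1)},
    {'type': 'checklist', 'object_id': 'c1', 'date': date(2025, 6, 3)},
    {'type': 'note', 'object_id': 'n1', 'date': date(2025, 6, 1)},
]

def on_date_change(items, obj_type, obj_id, old_date, new_date):
    if old_date and new_date and old_date != new_date:
        # Drop only the rows that pinned this object to its old date.
        items = [i for i in items
                 if not (i['type'] == obj_type and i['object_id'] == obj_id and i['date'] == old_date)]
    return items

itinerary = on_date_change(itinerary, 'checklist', 'c1', date(2025, 6, 1), date(2025, 6, 2))
print(len(itinerary))  # 2: the June 3 entry and the note survive
```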
@@ -4,9 +4,18 @@ from django.db import transaction
from rest_framework import viewsets
from rest_framework.decorators import action
from rest_framework.response import Response
from adventures.models import Collection, Location, Transportation, Note, Checklist, CollectionInvite, ContentImage
from rest_framework.parsers import MultiPartParser
from rest_framework import status
from django.http import HttpResponse
from django.conf import settings
import io
import os
import json
import zipfile
import tempfile
from adventures.models import Collection, Location, Transportation, Note, Checklist, CollectionInvite, ContentImage, CollectionItineraryItem, Lodging, CollectionItineraryDay, ContentAttachment, Category
from adventures.permissions import CollectionShared
from adventures.serializers import CollectionSerializer, CollectionInviteSerializer, UltraSlimCollectionSerializer
from adventures.serializers import CollectionSerializer, CollectionInviteSerializer, UltraSlimCollectionSerializer, CollectionItineraryItemSerializer, CollectionItineraryDaySerializer
from users.models import CustomUser as User
from adventures.utils import pagination
from users.serializers import CustomUserDetailsSerializer as UserSerializer
@@ -38,22 +47,47 @@ class CollectionViewSet(viewsets.ModelViewSet):
        if order_by == 'name':
            queryset = queryset.annotate(lower_name=Lower('name'))
            ordering = 'lower_name'
            if order_direction == 'desc':
            if order_direction == 'asc':
                ordering = f'-{ordering}'
        elif order_by == 'start_date':
            ordering = 'start_date'
            if order_direction == 'asc':
            if order_direction == 'desc':
                ordering = 'start_date'
            else:
                ordering = '-start_date'
        else:
            order_by == 'updated_at'
            ordering = 'updated_at'
            if order_direction == 'asc':
            if order_direction == 'desc':
                ordering = '-updated_at'

        return queryset.order_by(ordering)

    def apply_status_filter(self, queryset):
        """Apply status filtering based on query parameter"""
        from datetime import date
        status_filter = self.request.query_params.get('status', None)

        if not status_filter:
            return queryset

        today = date.today()

        if status_filter == 'folder':
            # Collections without dates
            return queryset.filter(Q(start_date__isnull=True) | Q(end_date__isnull=True))
        elif status_filter == 'upcoming':
            # Start date in the future
            return queryset.filter(start_date__gt=today)
        elif status_filter == 'in_progress':
            # Currently ongoing
            return queryset.filter(start_date__lte=today, end_date__gte=today)
        elif status_filter == 'completed':
            # End date in the past
            return queryset.filter(end_date__lt=today)

        return queryset
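`apply_status_filter` partitions collections into four buckets by comparing today's date against the stored range. The same classification in plain Python (the standalone function is an illustrative sketch; field semantics mirror the queryset filters above):

```python
from datetime import date

def collection_status(start, end, today):
    # Collections missing either date behave as plain folders.
    if start is None or end is None:
        return 'folder'
    if start > today:
        return 'upcoming'
    if start <= today <= end:
        return 'in_progress'
    return 'completed'

today = date(2025, 6, 15)
print(collection_status(None, None, today))                            # folder
print(collection_status(date(2025, 7, 1), date(2025, 7, 5), today))    # upcoming
print(collection_status(date(2025, 6, 10), date(2025, 6, 20), today))  # in_progress
print(collection_status(date(2025, 5, 1), date(2025, 5, 5), today))    # completed
```

Both boundary days count as `in_progress`, matching the `__lte`/`__gte` pair in the view.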

    def get_serializer_context(self):
        """Override to add nested and exclusion contexts based on query parameters"""
        context = super().get_serializer_context()
@@ -80,45 +114,49 @@ class CollectionViewSet(viewsets.ModelViewSet):

    def get_optimized_queryset_for_listing(self):
        """Get optimized queryset for list actions with prefetching"""
        return self.get_base_queryset().select_related('user').prefetch_related(
        return self.get_base_queryset().select_related('user', 'primary_image').prefetch_related(
            Prefetch(
                'locations__images',
                queryset=ContentImage.objects.filter(is_primary=True).select_related('user'),
                to_attr='primary_images'
            )
            ),
            'shared_with'
        )

    def get_base_queryset(self):
        """Base queryset logic extracted for reuse"""
        if self.action == 'destroy':
            return Collection.objects.filter(user=self.request.user.id)
            queryset = Collection.objects.filter(user=self.request.user.id)

        if self.action in ['update', 'partial_update']:
            return Collection.objects.filter(
        elif self.action in ['update', 'partial_update', 'leave']:
            queryset = Collection.objects.filter(
                Q(user=self.request.user.id) | Q(shared_with=self.request.user)
            ).distinct()

        # Allow access to collections with pending invites for accept/decline actions
        if self.action in ['accept_invite', 'decline_invite']:
        elif self.action in ['accept_invite', 'decline_invite']:
            if not self.request.user.is_authenticated:
                return Collection.objects.none()
            return Collection.objects.filter(
                Q(user=self.request.user.id) |
                Q(shared_with=self.request.user) |
                Q(invites__invited_user=self.request.user)
            ).distinct()

        if self.action == 'retrieve':
                queryset = Collection.objects.none()
            else:
                queryset = Collection.objects.filter(
                    Q(user=self.request.user.id)
                    | Q(shared_with=self.request.user)
                    | Q(invites__invited_user=self.request.user)
                ).distinct()
        elif self.action == 'retrieve':
            if not self.request.user.is_authenticated:
                return Collection.objects.filter(is_public=True)
            return Collection.objects.filter(
                Q(is_public=True) | Q(user=self.request.user.id) | Q(shared_with=self.request.user)
                queryset = Collection.objects.filter(is_public=True)
            else:
                queryset = Collection.objects.filter(
                    Q(is_public=True)
                    | Q(user=self.request.user.id)
                    | Q(shared_with=self.request.user)
                ).distinct()
        else:
            # For list action and default base queryset, return collections owned by the user (exclude shared)
            queryset = Collection.objects.filter(
                Q(user=self.request.user.id) & Q(is_archived=False)
            ).distinct()

        # For list action, include collections owned by the user or shared with the user, that are not archived
        return Collection.objects.filter(
            (Q(user=self.request.user.id) | Q(shared_with=self.request.user)) & Q(is_archived=False)
        ).distinct()

        return queryset.select_related('primary_image').prefetch_related('shared_with')

    def get_queryset(self):
        """Get queryset with optimizations for list actions"""
@@ -131,9 +169,11 @@ class CollectionViewSet(viewsets.ModelViewSet):
        if not request.user.is_authenticated:
            return Response({"error": "User is not authenticated"}, status=400)

        # List should only return collections owned by the requesting user (shared collections are available
        # via the `shared` action).
        queryset = Collection.objects.filter(
            (Q(user=request.user.id) | Q(shared_with=request.user)) & Q(is_archived=False)
        ).distinct().select_related('user').prefetch_related(
            Q(user=request.user.id) & Q(is_archived=False)
        ).distinct().select_related('user', 'primary_image').prefetch_related(
            Prefetch(
                'locations__images',
                queryset=ContentImage.objects.filter(is_primary=True).select_related('user'),
@@ -141,6 +181,7 @@ class CollectionViewSet(viewsets.ModelViewSet):
            )
        )

        queryset = self.apply_status_filter(queryset)
        queryset = self.apply_sorting(queryset)
        return self.paginate_and_respond(queryset, request)
@@ -151,7 +192,7 @@ class CollectionViewSet(viewsets.ModelViewSet):

        queryset = Collection.objects.filter(
            Q(user=request.user)
        ).select_related('user').prefetch_related(
        ).select_related('user', 'primary_image').prefetch_related(
            Prefetch(
                'locations__images',
                queryset=ContentImage.objects.filter(is_primary=True).select_related('user'),
@@ -171,7 +212,7 @@ class CollectionViewSet(viewsets.ModelViewSet):

        queryset = Collection.objects.filter(
            Q(user=request.user.id) & Q(is_archived=True)
        ).select_related('user').prefetch_related(
        ).select_related('user', 'primary_image').prefetch_related(
            Prefetch(
                'locations__images',
                queryset=ContentImage.objects.filter(is_primary=True).select_related('user'),
@@ -184,67 +225,23 @@ class CollectionViewSet(viewsets.ModelViewSet):

        return Response(serializer.data)

    # this makes the is_public field of the collection cascade to the locations
    @transaction.atomic
    def update(self, request, *args, **kwargs):
        partial = kwargs.pop('partial', False)
        instance = self.get_object()
        serializer = self.get_serializer(instance, data=request.data, partial=partial)
        serializer.is_valid(raise_exception=True)
    def retrieve(self, request, pk=None):
        """Retrieve a collection and include itinerary items and day metadata in the response."""
        collection = self.get_object()
        serializer = self.get_serializer(collection)
        data = serializer.data

        if 'collection' in serializer.validated_data:
            new_collection = serializer.validated_data['collection']
            # if the new collection is different from the old one and the user making the request is not the owner of the new collection return an error
            if new_collection != instance.collection and new_collection.user != request.user:
                return Response({"error": "User does not own the new collection"}, status=400)
        # Include itinerary items inline with collection details
        itinerary_items = CollectionItineraryItem.objects.filter(collection=collection)
        itinerary_serializer = CollectionItineraryItemSerializer(itinerary_items, many=True)
        data['itinerary'] = itinerary_serializer.data

        # Include itinerary day metadata
        itinerary_days = CollectionItineraryDay.objects.filter(collection=collection)
        days_serializer = CollectionItineraryDaySerializer(itinerary_days, many=True)
        data['itinerary_days'] = days_serializer.data

        # Check if the 'is_public' field is present in the update data
        if 'is_public' in serializer.validated_data:
            new_public_status = serializer.validated_data['is_public']

            # if is_public has changed and the user is not the owner of the collection return an error
            if new_public_status != instance.is_public and instance.user != request.user:
                print(f"User {request.user.id} does not own the collection {instance.id} that is owned by {instance.user}")
                return Response({"error": "User does not own the collection"}, status=400)

            # Get all locations in this collection
            locations_in_collection = Location.objects.filter(collections=instance)

            if new_public_status:
                # If collection becomes public, make all locations public
                locations_in_collection.update(is_public=True)
            else:
                # If collection becomes private, check each location
                # Only set a location to private if ALL of its collections are private
                # Collect locations that do NOT belong to any other public collection (excluding the current one)
                location_ids_to_set_private = []

                for location in locations_in_collection:
                    has_public_collection = location.collections.filter(is_public=True).exclude(id=instance.id).exists()
                    if not has_public_collection:
                        location_ids_to_set_private.append(location.id)

                # Bulk update those locations
                Location.objects.filter(id__in=location_ids_to_set_private).update(is_public=False)

            # Update transportations, notes, and checklists related to this collection
            # These still use direct ForeignKey relationships
            Transportation.objects.filter(collection=instance).update(is_public=new_public_status)
            Note.objects.filter(collection=instance).update(is_public=new_public_status)
            Checklist.objects.filter(collection=instance).update(is_public=new_public_status)

            # Log the action (optional)
            action = "public" if new_public_status else "private"
            print(f"Collection {instance.id} and its related objects were set to {action}")

        self.perform_update(serializer)

        if getattr(instance, '_prefetched_objects_cache', None):
            # If 'prefetch_related' has been applied to a queryset, we need to
            # forcibly invalidate the prefetch cache on the instance.
            instance._prefetched_objects_cache = {}

        return Response(serializer.data)
        return Response(data)

    # make an action to retrieve all locations that are shared with the user
    @action(detail=False, methods=['get'])
@@ -477,10 +474,433 @@ class CollectionViewSet(viewsets.ModelViewSet):
|
||||
|
||||
return Response({"success": success_message})
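The privacy cascade above (a location only goes private when no *other* public collection still contains it) can be sketched without the ORM. This is a minimal illustration with hypothetical names (`locations_to_set_private` and its arguments are not part of the codebase):

```python
# In-memory sketch of the cascade rule: a location stays public as long as
# at least one collection containing it (other than the one being made
# private) is still public.
def locations_to_set_private(location_to_collections, public_collections, toggled):
    """Return the location ids that lose public status when `toggled` goes private."""
    remaining_public = public_collections - {toggled}
    ids = []
    for loc_id, collections in location_to_collections.items():
        if not (collections & remaining_public):
            ids.append(loc_id)
    return ids

# Location 1 is only in collection "a"; location 2 is also in public "b".
mapping = {1: {"a"}, 2: {"a", "b"}}
print(locations_to_set_private(mapping, {"a", "b"}, "a"))  # [1]
```

This mirrors the `exclude(id=instance.id).exists()` check in the view: only location 1 is privatized, because location 2 is still reachable through public collection "b".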
    @action(detail=True, methods=['get'], url_path='export')
    def export_collection(self, request, pk=None):
        """Export a single collection and its related content as a ZIP file."""
        collection = self.get_object()

        export_data = {
            'version': getattr(settings, 'ADVENTURELOG_RELEASE_VERSION', 'unknown'),
            # Omit export_date to keep template-friendly exports (no dates)
            'collection': {
                'id': str(collection.id),
                'name': collection.name,
                'description': collection.description,
                'is_public': collection.is_public,
                # Omit start/end dates
                'link': collection.link,
            },
            'locations': [],
            'transportation': [],
            'notes': [],
            'checklists': [],
            'lodging': [],
            # Omit itinerary_items entirely
            'images': [],
            'attachments': [],
            'primary_image_ref': None,
        }

        image_export_map = {}

        for loc in collection.locations.all().select_related('city', 'region', 'country'):
            loc_entry = {
                'id': str(loc.id),
                'name': loc.name,
                'description': loc.description,
                'location': loc.location,
                'tags': loc.tags or [],
                'rating': loc.rating,
                'link': loc.link,
                'is_public': loc.is_public,
                'longitude': float(loc.longitude) if loc.longitude is not None else None,
                'latitude': float(loc.latitude) if loc.latitude is not None else None,
                'city': loc.city.name if loc.city else None,
                'region': loc.region.name if loc.region else None,
                'country': loc.country.name if loc.country else None,
                'images': [],
                'attachments': [],
            }

            for img in loc.images.all():
                img_export_id = f"img_{len(export_data['images'])}"
                image_export_map[str(img.id)] = img_export_id
                export_data['images'].append({
                    'export_id': img_export_id,
                    'id': str(img.id),
                    'name': os.path.basename(getattr(img.image, 'name', 'image')),
                    'is_primary': getattr(img, 'is_primary', False),
                })
                loc_entry['images'].append(img_export_id)

            for att in loc.attachments.all():
                att_export_id = f"att_{len(export_data['attachments'])}"
                export_data['attachments'].append({
                    'export_id': att_export_id,
                    'id': str(att.id),
                    'name': os.path.basename(getattr(att.file, 'name', 'attachment')),
                })
                loc_entry['attachments'].append(att_export_id)

            export_data['locations'].append(loc_entry)

        if collection.primary_image:
            export_data['primary_image_ref'] = image_export_map.get(str(collection.primary_image.id))

        # Related content (these models have a direct FK to the collection)
        for t in Transportation.objects.filter(collection=collection):
            export_data['transportation'].append({
                'id': str(t.id),
                'type': getattr(t, 'transportation_type', None),
                'name': getattr(t, 'name', None),
                # Omit date
                'notes': getattr(t, 'notes', None),
            })
        for n in Note.objects.filter(collection=collection):
            export_data['notes'].append({
                'id': str(n.id),
                'title': getattr(n, 'title', None),
                'content': getattr(n, 'content', ''),
                # Omit created_at
            })
        for c in Checklist.objects.filter(collection=collection):
            items = []
            if hasattr(c, 'items'):
                items = [
                    {
                        'name': getattr(item, 'name', None),
                        'completed': getattr(item, 'completed', False),
                    } for item in c.items.all()
                ]
            export_data['checklists'].append({
                'id': str(c.id),
                'name': getattr(c, 'name', None),
                'items': items,
            })
        for l in Lodging.objects.filter(collection=collection):
            export_data['lodging'].append({
                'id': str(l.id),
                'type': getattr(l, 'lodging_type', None),
                'name': getattr(l, 'name', None),
                # Omit start_date/end_date
                'notes': getattr(l, 'notes', None),
            })
        # Intentionally omit itinerary_items from export

        # Create the ZIP in a temp file
        with tempfile.NamedTemporaryFile(delete=False, suffix='.zip') as tmp_file:
            with zipfile.ZipFile(tmp_file, 'w', zipfile.ZIP_DEFLATED) as zipf:
                zipf.writestr('metadata.json', json.dumps(export_data, indent=2))

                # Write image files
                for loc in collection.locations.all():
                    for img in loc.images.all():
                        export_id = image_export_map.get(str(img.id))
                        if not export_id:
                            continue
                        try:
                            file_name = os.path.basename(getattr(img.image, 'name', 'image'))
                            storage = getattr(img.image, 'storage', None)
                            if storage:
                                with storage.open(img.image.name, 'rb') as f:
                                    zipf.writestr(f'images/{export_id}-{file_name}', f.read())
                            elif hasattr(img.image, 'path'):
                                with open(img.image.path, 'rb') as f:
                                    zipf.writestr(f'images/{export_id}-{file_name}', f.read())
                        except Exception:
                            continue

                # Write attachment files
                for loc in collection.locations.all():
                    for att in loc.attachments.all():
                        try:
                            file_name = os.path.basename(getattr(att.file, 'name', 'attachment'))
                            storage = getattr(att.file, 'storage', None)
                            if storage:
                                with storage.open(att.file.name, 'rb') as f:
                                    zipf.writestr(f'attachments/{file_name}', f.read())
                            elif hasattr(att.file, 'path'):
                                with open(att.file.path, 'rb') as f:
                                    zipf.writestr(f'attachments/{file_name}', f.read())
                        except Exception:
                            continue

        with open(tmp_file.name, 'rb') as fh:
            data = fh.read()
        os.unlink(tmp_file.name)

        filename = f"collection-{collection.name.replace(' ', '_')}.zip"
        response = HttpResponse(data, content_type='application/zip')
        response['Content-Disposition'] = f'attachment; filename="{filename}"'
        return response
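The export/import pair above agrees on one contract: a `metadata.json` manifest plus media members prefixed by their `export_id`, all in one ZIP. A self-contained stdlib sketch of that roundtrip (the data values are invented for illustration):

```python
import io
import json
import zipfile

# Write a minimal archive in the same shape the export produces:
# metadata.json plus export_id-prefixed image members.
export_data = {'collection': {'name': 'Trip'}, 'images': [{'export_id': 'img_0'}]}
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED) as zipf:
    zipf.writestr('metadata.json', json.dumps(export_data, indent=2))
    zipf.writestr('images/img_0-photo.jpg', b'\xff\xd8fake-jpeg-bytes')

# Read it back the way the import endpoint does: parse the manifest,
# then locate media members by their export_id prefix.
with zipfile.ZipFile(io.BytesIO(buf.getvalue()), 'r') as zipf:
    meta = json.loads(zipf.read('metadata.json'))
    member = next(m for m in zipf.namelist() if m.startswith('images/img_0-'))

print(meta['collection']['name'], member)  # Trip images/img_0-photo.jpg
```

The `export_id` prefix keeps member names unique even when two photos share a filename, which is why the import searches by prefix rather than exact name.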
    @action(detail=False, methods=['post'], url_path='import', parser_classes=[MultiPartParser])
    def import_collection(self, request):
        """Import a single collection from a ZIP file. Handles name conflicts by appending (n)."""
        upload = request.FILES.get('file')
        if not upload:
            return Response({'detail': 'No file provided'}, status=status.HTTP_400_BAD_REQUEST)

        # Read the uploaded ZIP
        file_bytes = upload.read()
        with zipfile.ZipFile(io.BytesIO(file_bytes), 'r') as zipf:
            try:
                metadata = json.loads(zipf.read('metadata.json').decode('utf-8'))
            except KeyError:
                return Response({'detail': 'metadata.json missing'}, status=status.HTTP_400_BAD_REQUEST)

            base_name = (metadata.get('collection') or {}).get('name') or 'Imported Collection'

            # Ensure the name is unique per user
            existing_names = set(request.user.collection_set.values_list('name', flat=True))
            unique_name = base_name
            if unique_name in existing_names:
                i = 1
                while True:
                    candidate = f"{base_name} ({i})"
                    if candidate not in existing_names:
                        unique_name = candidate
                        break
                    i += 1
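The name-conflict handling is small enough to factor into a pure function, which makes the " (n)" suffix behavior easy to verify in isolation. A sketch (the function name is illustrative, not from the codebase):

```python
def unique_collection_name(base_name, existing_names):
    """Append " (n)" until the name no longer collides, mirroring the import path."""
    if base_name not in existing_names:
        return base_name
    i = 1
    while True:
        candidate = f"{base_name} ({i})"
        if candidate not in existing_names:
            return candidate
        i += 1

print(unique_collection_name("Trip", {"Trip", "Trip (1)"}))  # Trip (2)
```

Snapshotting `existing_names` into a set up front, as the view does, keeps each membership check O(1) and avoids re-querying the database per candidate.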
            from datetime import date  # local import; replaces the __import__('datetime') workaround

            collection_meta = metadata.get('collection') or {}
            new_collection = Collection.objects.create(
                user=request.user,
                name=unique_name,
                description=collection_meta.get('description'),
                is_public=collection_meta.get('is_public', False),
                start_date=date.fromisoformat(collection_meta['start_date']) if collection_meta.get('start_date') else None,
                end_date=date.fromisoformat(collection_meta['end_date']) if collection_meta.get('end_date') else None,
                link=collection_meta.get('link'),
            )

            image_export_map = {img['export_id']: img for img in metadata.get('images', [])}
            attachment_export_map = {att['export_id']: att for att in metadata.get('attachments', [])}

            # Duplicate-detection helpers, shared by every imported location
            from difflib import SequenceMatcher
            from django.core.files.base import ContentFile

            def _ratio(a, b):
                a = (a or '').strip().lower()
                b = (b or '').strip().lower()
                if not a and not b:
                    return 1.0
                return SequenceMatcher(None, a, b).ratio()

            def _coords_close(lat1, lon1, lat2, lon2, threshold=0.02):
                try:
                    if lat1 is None or lon1 is None or lat2 is None or lon2 is None:
                        return False
                    return abs(float(lat1) - float(lat2)) <= threshold and abs(float(lon1) - float(lon2)) <= threshold
                except Exception:
                    return False

            # Import locations
            for loc_data in metadata.get('locations', []):
                cat_obj = None
                if loc_data.get('category'):
                    cat_obj, _ = Category.objects.get_or_create(user=request.user, name=loc_data['category'])

                # Attempt to find a very similar existing location for this user
                incoming_name = loc_data.get('name') or 'Untitled'
                incoming_location_text = loc_data.get('location')
                incoming_lat = loc_data.get('latitude')
                incoming_lon = loc_data.get('longitude')

                existing_loc = None
                best_score = 0.0
                for cand in Location.objects.filter(user=request.user):
                    name_score = _ratio(incoming_name, cand.name)
                    loc_text_score = _ratio(incoming_location_text, getattr(cand, 'location', None))
                    close_coords = _coords_close(incoming_lat, incoming_lon, cand.latitude, cand.longitude)
                    # Define "very similar": strong name match OR decent name with location/coords match
                    combined_score = max(name_score, (name_score + loc_text_score) / 2.0)
                    if close_coords:
                        combined_score = max(combined_score, name_score + 0.1)  # small boost for coord proximity
                    if combined_score > best_score and (
                        name_score >= 0.92 or (name_score >= 0.85 and (loc_text_score >= 0.85 or close_coords))
                    ):
                        best_score = combined_score
                        existing_loc = cand

                if existing_loc:
                    # Link the existing location to the new collection instead of creating a duplicate
                    loc = existing_loc
                    loc.collections.add(new_collection)
                    created_new_loc = False
                else:
                    # Create a brand-new location
                    loc = Location.objects.create(
                        user=request.user,
                        name=incoming_name,
                        description=loc_data.get('description'),
                        location=incoming_location_text,
                        tags=loc_data.get('tags') or [],
                        rating=loc_data.get('rating'),
                        link=loc_data.get('link'),
                        is_public=bool(loc_data.get('is_public', False)),
                        longitude=incoming_lon,
                        latitude=incoming_lat,
                        category=cat_obj,
                    )
                    loc.collections.add(new_collection)
                    created_new_loc = True

                # Images: only import them for newly created locations to avoid duplicating user content
                if created_new_loc:
                    for export_id in loc_data.get('images', []):
                        img_meta = image_export_map.get(export_id)
                        if not img_meta:
                            continue
                        prefix = f"images/{export_id}-"
                        member = next((m for m in zipf.namelist() if m.startswith(prefix)), None)
                        if not member:
                            continue
                        file_bytes_img = zipf.read(member)
                        file_name_img = os.path.basename(member)
                        image_obj = ContentImage(
                            user=request.user,
                            image=ContentFile(file_bytes_img, name=file_name_img),
                        )
                        # Assign to the generic relation for Location
                        image_obj.content_object = loc
                        image_obj.save()
                        if img_meta.get('is_primary'):
                            new_collection.primary_image = image_obj
                            new_collection.save(update_fields=['primary_image'])

                # Attachments: same rule as images
                if created_new_loc:
                    for export_id in loc_data.get('attachments', []):
                        att_meta = attachment_export_map.get(export_id)
                        if not att_meta:
                            continue
                        file_name_att = att_meta.get('name', '')
                        member = next((m for m in zipf.namelist() if m == f"attachments/{file_name_att}"), None)
                        if not member:
                            continue
                        file_bytes_att = zipf.read(member)
                        attachment_obj = ContentAttachment(
                            user=request.user,
                            file=ContentFile(file_bytes_att, name=file_name_att),
                        )
                        # Assign to the generic relation for Location
                        attachment_obj.content_object = loc
                        attachment_obj.save()

        serializer = self.get_serializer(new_collection)
        return Response(serializer.data, status=status.HTTP_201_CREATED)
    def perform_create(self, serializer):
        # This is OK because you cannot share a collection while creating it
        serializer.save(user=self.request.user)

    def _cleanup_out_of_range_itinerary_items(self, collection):
        """Delete itinerary items and day metadata outside the collection's date range."""
        if not collection.start_date or not collection.end_date:
            # If no date range is set, don't delete anything
            return

        # Delete itinerary items outside the date range
        deleted_items = CollectionItineraryItem.objects.filter(
            collection=collection
        ).exclude(
            date__range=[collection.start_date, collection.end_date]
        ).delete()

        # Delete day metadata outside the date range
        deleted_days = CollectionItineraryDay.objects.filter(
            collection=collection
        ).exclude(
            date__range=[collection.start_date, collection.end_date]
        ).delete()

        return deleted_items, deleted_days
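Django's `date__range` lookup is inclusive on both ends, so `exclude(date__range=[start, end])` deletes exactly the items strictly before `start` or strictly after `end`. A stdlib sketch of that selection (`out_of_range` is an illustrative name):

```python
from datetime import date

def out_of_range(item_dates, start, end):
    """Dates outside the inclusive [start, end] window (the ones the cleanup deletes)."""
    return [d for d in item_dates if not (start <= d <= end)]

dates = [date(2025, 6, 1), date(2025, 6, 15), date(2025, 7, 2)]
# Items on the boundary days themselves survive the cleanup.
print(out_of_range(dates, date(2025, 6, 1), date(2025, 6, 30)))  # [datetime.date(2025, 7, 2)]
```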
    @transaction.atomic
    def update(self, request, *args, **kwargs):
        """Override update to handle is_public cascading and clean up out-of-range itinerary items when dates change."""
        instance = self.get_object()
        old_is_public = instance.is_public
        old_start_date = instance.start_date
        old_end_date = instance.end_date

        # Perform the standard update
        partial = kwargs.pop('partial', False)
        serializer = self.get_serializer(instance, data=request.data, partial=partial)
        serializer.is_valid(raise_exception=True)
        self.perform_update(serializer)

        # Check whether is_public changed
        new_is_public = serializer.instance.is_public
        is_public_changed = old_is_public != new_is_public

        # Handle is_public cascading
        if is_public_changed:
            if new_is_public:
                # The collection is being made public: update all linked items to public
                serializer.instance.locations.filter(is_public=False).update(is_public=True)
                serializer.instance.transportation_set.filter(is_public=False).update(is_public=True)
                serializer.instance.note_set.filter(is_public=False).update(is_public=True)
                serializer.instance.checklist_set.filter(is_public=False).update(is_public=True)
                serializer.instance.lodging_set.filter(is_public=False).update(is_public=True)
            else:
                # The collection is being made private: check each linked item and only
                # set it to private if it doesn't belong to any other public collection.

                # Handle locations (many-to-many relationship)
                locations_in_collection = serializer.instance.locations.filter(is_public=True)
                for location in locations_in_collection:
                    # Check if this location belongs to any other public collection
                    has_other_public_collection = location.collections.filter(
                        is_public=True
                    ).exclude(id=serializer.instance.id).exists()
                    if not has_other_public_collection:
                        location.is_public = False
                        location.save(update_fields=['is_public'])

                # Handle transportations, notes, checklists, and lodging (foreign key relationships)
                # Transportation
                transportations_to_check = serializer.instance.transportation_set.filter(is_public=True)
                for transportation in transportations_to_check:
                    transportation.is_public = False
                    transportation.save(update_fields=['is_public'])

                # Notes
                notes_to_check = serializer.instance.note_set.filter(is_public=True)
                for note in notes_to_check:
                    note.is_public = False
                    note.save(update_fields=['is_public'])

                # Checklists
                checklists_to_check = serializer.instance.checklist_set.filter(is_public=True)
                for checklist in checklists_to_check:
                    checklist.is_public = False
                    checklist.save(update_fields=['is_public'])

                # Lodging
                lodging_to_check = serializer.instance.lodging_set.filter(is_public=True)
                for lodging in lodging_to_check:
                    lodging.is_public = False
                    lodging.save(update_fields=['is_public'])

        # Check whether the dates changed
        new_start_date = serializer.instance.start_date
        new_end_date = serializer.instance.end_date
        dates_changed = (old_start_date != new_start_date or old_end_date != new_end_date)

        # Clean up out-of-range items if the dates changed
        if dates_changed:
            self._cleanup_out_of_range_itinerary_items(serializer.instance)

        if getattr(instance, '_prefetched_objects_cache', None):
            # If 'prefetch_related' has been applied to a queryset, we need to
            # forcibly invalidate the prefetch cache on the instance.
            instance._prefetched_objects_cache = {}

        return Response(serializer.data)

    def paginate_and_respond(self, queryset, request):
        paginator = self.pagination_class()
        page = paginator.paginate_queryset(queryset, request)
@@ -91,8 +91,13 @@ class GenerateDescription(viewsets.ViewSet):

        try:
            candidates = self.get_candidate_pages(name, lang)
            found_images = []

            for candidate in candidates:
                # Stop after collecting 8 valid images
                if len(found_images) >= 8:
                    break

                page_data = self.fetch_page(
                    lang=lang,
                    candidate=candidate,
@@ -113,13 +118,29 @@ class GenerateDescription(viewsets.ViewSet):
                # Try the original image first
                original_image = page_data.get('original')
                if original_image and self.is_valid_image(original_image.get('source')):
                    found_images.append({
                        'source': original_image.get('source'),
                        'width': original_image.get('width'),
                        'height': original_image.get('height'),
                        'title': page_data.get('title'),
                        'type': 'original'
                    })
                    continue

                # Fall back to the thumbnail
                thumbnail_image = page_data.get('thumbnail')
                if thumbnail_image and self.is_valid_image(thumbnail_image.get('source')):
                    found_images.append({
                        'source': thumbnail_image.get('source'),
                        'width': thumbnail_image.get('width'),
                        'height': thumbnail_image.get('height'),
                        'title': page_data.get('title'),
                        'type': 'thumbnail'
                    })

            if found_images:
                return Response({"images": found_images})

            return Response({"error": "No image found"}, status=404)

        except requests.exceptions.RequestException:
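The collection loop above has two stopping rules: a global cap of 8 images, and per-page original-then-thumbnail fallback with at most one variant per page. A self-contained sketch of that control flow, with `is_valid_image` simplified to a source check (all names here are illustrative):

```python
def collect_images(pages, limit=8):
    """Gather up to `limit` images, preferring the original and falling back to the thumbnail."""
    found = []
    for page in pages:
        if len(found) >= limit:
            break
        for kind in ('original', 'thumbnail'):
            img = page.get(kind)
            if img and img.get('source'):
                found.append({'source': img['source'], 'type': kind, 'title': page.get('title')})
                break  # never add both variants for the same page
    return found

pages = [
    {'title': 'A', 'original': {'source': 'a.jpg'}},
    {'title': 'B', 'original': None, 'thumbnail': {'source': 'b_thumb.jpg'}},
]
print([i['type'] for i in collect_images(pages)])  # ['original', 'thumbnail']
```

The `continue` after a successful original append in the view plays the role of the inner `break` here: a page contributes at most one image, so 8 results means 8 distinct candidate pages.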
@@ -4,64 +4,89 @@ from rest_framework.decorators import action
from rest_framework.permissions import IsAuthenticated
from icalendar import Calendar, Event, vText, vCalAddress
from datetime import datetime, timedelta
from adventures.serializers import LocationSerializer
from django.db.models import Prefetch
from adventures.models import Location, Visit

class IcsCalendarGeneratorViewSet(viewsets.ViewSet):
    permission_classes = [IsAuthenticated]

    @action(detail=False, methods=['get'])
    def generate(self, request):
        locations = (
            Location.objects.filter(user=request.user)
            .prefetch_related(
                Prefetch(
                    'visits',
                    queryset=Visit.objects.only('id', 'start_date', 'end_date')
                )
            )
            .only('id', 'name', 'description', 'location', 'link')
        )

        user = request.user
        name = f"{user.first_name} {user.last_name}"

        cal = Calendar()
        cal.add('prodid', '-//My Adventure Calendar//example.com//')
        cal.add('version', '2.0')

        for location in locations:
            visits = list(location.visits.all())
            if not visits:
                continue

            for visit in visits:
                # Skip if start_date is missing
                start_value = getattr(visit, 'start_date', None)
                if not start_value:
                    continue

                # Normalize the start date
                if isinstance(start_value, str):
                    try:
                        start_dt = datetime.fromisoformat(start_value.replace('Z', '+00:00'))
                    except ValueError:
                        continue  # Skip if the start_date is invalid
                else:
                    start_dt = start_value

                start_date = start_dt.date() if hasattr(start_dt, 'date') else start_dt

                # Normalize the end date (inclusive of the final day)
                end_value = getattr(visit, 'end_date', None) or start_dt
                if isinstance(end_value, str):
                    try:
                        end_dt = datetime.fromisoformat(end_value.replace('Z', '+00:00'))
                    except ValueError:
                        end_dt = start_dt
                else:
                    end_dt = end_value

                end_date = end_dt.date() if hasattr(end_dt, 'date') else end_dt
                end_date = end_date + timedelta(days=1)

                # Create the event
                event = Event()
                event.add('summary', location.name)
                event.add('dtstart', start_date)
                event.add('dtend', end_date)
                event.add('dtstamp', datetime.now())
                event.add('transp', 'TRANSPARENT')
                event.add('class', 'PUBLIC')
                event.add('created', datetime.now())
                event.add('last-modified', datetime.now())
                event.add('description', getattr(location, 'description', '') or '')

                if getattr(location, 'location', None):
                    event.add('location', location.location)

                if getattr(location, 'link', None):
                    event.add('url', location.link)

                organizer = vCalAddress(f'MAILTO:{user.email}')
                organizer.params['cn'] = vText(name)
                event.add('organizer', organizer)

                cal.add_component(event)

        response = HttpResponse(cal.to_ical(), content_type='text/calendar')
        response['Content-Disposition'] = 'attachment; filename=adventures.ics'
        return response
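The `end_date + timedelta(days=1)` step above follows the iCalendar convention that `DTEND` for all-day events is exclusive: to show the final visited day on the calendar, the event must end one day past it. A minimal sketch of just that date arithmetic (`all_day_range` is an illustrative helper, not part of the codebase):

```python
from datetime import date, timedelta

def all_day_range(start, end=None):
    """All-day event bounds: DTEND is exclusive, so add one day past the last visited day."""
    end = end or start
    return start, end + timedelta(days=1)

# A visit from May 1 through May 3 needs DTEND = May 4 to render all three days.
print(all_day_range(date(2025, 5, 1), date(2025, 5, 3)))
```

Without the extra day, a multi-day visit would appear one day shorter in most calendar clients, and a single-day visit (start only) would have zero duration.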
@@ -19,7 +19,8 @@ from django.contrib.contenttypes.models import ContentType

from adventures.models import (
    Location, Collection, Transportation, Note, Checklist, ChecklistItem,
    ContentImage, ContentAttachment, Category, Lodging, Visit, Trail, Activity,
    CollectionItineraryItem
)
from worldtravel.models import VisitedCity, VisitedRegion, City, Region, Country

@@ -52,7 +53,8 @@ class BackupViewSet(viewsets.ViewSet):
            'checklists': [],
            'lodging': [],
            'visited_cities': [],
            'visited_regions': [],
            'itinerary_items': []
        }

        # Export Visited Cities
@@ -75,6 +77,9 @@ class BackupViewSet(viewsets.ViewSet):
                'icon': category.icon,
            })

        # Track images so we can reference them for collection primary images
        image_export_map = {}

        # Export Collections
        for idx, collection in enumerate(user.collection_set.all()):
            export_data['collections'].append({
@@ -175,7 +180,7 @@ class BackupViewSet(viewsets.ViewSet):
                location_data['trails'].append(trail_data)

            # Add images
            for image_index, image in enumerate(location.images.all()):
                image_data = {
                    'immich_id': image.immich_id,
                    'is_primary': image.is_primary,
@@ -184,6 +189,13 @@ class BackupViewSet(viewsets.ViewSet):
                if image.image:
                    image_data['filename'] = image.image.name.split('/')[-1]
                location_data['images'].append(image_data)

                image_export_map[image.id] = {
                    'location_export_id': idx,
                    'image_index': image_index,
                    'immich_id': image.immich_id,
                    'filename': image_data['filename'],
                }

            # Add attachments
            for attachment in location.attachments.all():
@@ -196,14 +208,21 @@ class BackupViewSet(viewsets.ViewSet):
                location_data['attachments'].append(attachment_data)

            export_data['locations'].append(location_data)

        # Attach collection primary image references (if any)
        for idx, collection in enumerate(user.collection_set.all()):
            primary = collection.primary_image
            if primary and primary.id in image_export_map:
                export_data['collections'][idx]['primary_image'] = image_export_map[primary.id]

        # Export Transportation
        for idx, transport in enumerate(user.transportation_set.all()):
            collection_export_id = None
            if transport.collection:
                collection_export_id = collection_name_to_id.get(transport.collection.name)

            export_data['transportation'].append({
                'export_id': idx,
                'type': transport.type,
                'name': transport.name,
                'description': transport.description,
@@ -225,12 +244,13 @@ class BackupViewSet(viewsets.ViewSet):
            })

        # Export Notes
        for idx, note in enumerate(user.note_set.all()):
            collection_export_id = None
            if note.collection:
                collection_export_id = collection_name_to_id.get(note.collection.name)

            export_data['notes'].append({
                'export_id': idx,
                'name': note.name,
                'content': note.content,
                'links': note.links,
@@ -240,12 +260,13 @@ class BackupViewSet(viewsets.ViewSet):
            })

        # Export Checklists
        for idx, checklist in enumerate(user.checklist_set.all()):
            collection_export_id = None
            if checklist.collection:
                collection_export_id = collection_name_to_id.get(checklist.collection.name)

            checklist_data = {
                'export_id': idx,
                'name': checklist.name,
                'date': checklist.date.isoformat() if checklist.date else None,
                'is_public': checklist.is_public,
@@ -263,12 +284,13 @@ class BackupViewSet(viewsets.ViewSet):
            export_data['checklists'].append(checklist_data)

        # Export Lodging
        for idx, lodging in enumerate(user.lodging_set.all()):
            collection_export_id = None
            if lodging.collection:
                collection_export_id = collection_name_to_id.get(lodging.collection.name)

            export_data['lodging'].append({
                'export_id': idx,
                'name': lodging.name,
                'type': lodging.type,
                'description': lodging.description,
@@ -286,6 +308,41 @@ class BackupViewSet(viewsets.ViewSet):
                'collection_export_id': collection_export_id
            })

        # Export Itinerary Items
        # Create export_id mappings for all content types
        location_id_to_export_id = {loc.id: idx for idx, loc in enumerate(user.location_set.all())}
        transportation_id_to_export_id = {t.id: idx for idx, t in enumerate(user.transportation_set.all())}
        note_id_to_export_id = {n.id: idx for idx, n in enumerate(user.note_set.all())}
        lodging_id_to_export_id = {l.id: idx for idx, l in enumerate(user.lodging_set.all())}
        checklist_id_to_export_id = {c.id: idx for idx, c in enumerate(user.checklist_set.all())}

        for collection_idx, collection in enumerate(user.collection_set.all()):
            for itinerary_item in collection.itinerary_items.all():
                content_type_str = itinerary_item.content_type.model
                item_reference = None

                # Determine how to reference the item based on content type using export_ids
                if content_type_str == 'location':
                    item_reference = location_id_to_export_id.get(itinerary_item.object_id)
                elif content_type_str == 'transportation':
                    item_reference = transportation_id_to_export_id.get(itinerary_item.object_id)
                elif content_type_str == 'note':
                    item_reference = note_id_to_export_id.get(itinerary_item.object_id)
                elif content_type_str == 'lodging':
                    item_reference = lodging_id_to_export_id.get(itinerary_item.object_id)
                elif content_type_str == 'checklist':
                    item_reference = checklist_id_to_export_id.get(itinerary_item.object_id)

                if item_reference is not None:
                    export_data['itinerary_items'].append({
                        'collection_export_id': collection_idx,
                        'content_type': content_type_str,
                        'item_reference': item_reference,
                        'date': itinerary_item.date.isoformat() if itinerary_item.date else None,
                        'is_global': itinerary_item.is_global,
                        'order': itinerary_item.order
                    })
||||
|
||||
# Create ZIP file
|
||||
with tempfile.NamedTemporaryFile(delete=False, suffix='.zip') as tmp_file:
|
||||
with zipfile.ZipFile(tmp_file.name, 'w', zipfile.ZIP_DEFLATED) as zip_file:
|
||||
@@ -402,6 +459,9 @@ class BackupViewSet(viewsets.ViewSet):

    def _clear_user_data(self, user):
        """Clear all existing user data before import"""
        # Delete itinerary items first (they reference collections and content)
        CollectionItineraryItem.objects.filter(collection__user=user).delete()

        # Delete in reverse order of dependencies
        user.activity_set.all().delete()  # Delete activities first
        user.trail_set.all().delete()  # Delete trails
@@ -439,7 +499,7 @@ class BackupViewSet(viewsets.ViewSet):
            'transportation': 0, 'notes': 0, 'checklists': 0,
            'checklist_items': 0, 'lodging': 0, 'images': 0,
            'attachments': 0, 'visited_cities': 0, 'visited_regions': 0,
            'trails': 0, 'activities': 0, 'gpx_files': 0, 'itinerary_items': 0
        }

        # Import Visited Cities
@@ -475,6 +535,9 @@ class BackupViewSet(viewsets.ViewSet):
                category_map[cat_data['name']] = category
                summary['categories'] += 1

        pending_primary_images = []
        location_images_map = {}

        # Import Collections
        for col_data in backup_data.get('collections', []):
            collection = Collection.objects.create(
@@ -498,6 +561,13 @@ class BackupViewSet(viewsets.ViewSet):
                    collection.shared_with.add(shared_user)
                except User.DoesNotExist:
                    pass

            # Defer primary image assignment until images are created
            if col_data.get('primary_image'):
                pending_primary_images.append({
                    'collection_export_id': col_data['export_id'],
                    'data': col_data['primary_image'],
                })

        # Import Locations
        for adv_data in backup_data.get('locations', []):
@@ -541,6 +611,7 @@ class BackupViewSet(viewsets.ViewSet):
            )
            location.save(_skip_geocode=True)  # Skip geocoding for now
            location_map[adv_data['export_id']] = location
            location_images_map.setdefault(adv_data['export_id'], [])

            # Add to collections using export_ids - MUST be done after save()
            for collection_export_id in adv_data.get('collection_export_ids', []):
@@ -638,13 +709,14 @@ class BackupViewSet(viewsets.ViewSet):
            for img_data in adv_data.get('images', []):
                immich_id = img_data.get('immich_id')
                if immich_id:
                    new_img = ContentImage.objects.create(
                        user=user,
                        immich_id=immich_id,
                        is_primary=img_data.get('is_primary', False),
                        content_type=content_type,
                        object_id=location.id
                    )
                    location_images_map[adv_data['export_id']].append(new_img)
                    summary['images'] += 1
                else:
                    filename = img_data.get('filename')
@@ -652,13 +724,14 @@ class BackupViewSet(viewsets.ViewSet):
                    try:
                        img_content = zip_file.read(f'images/{filename}')
                        img_file = ContentFile(img_content, name=filename)
                        new_img = ContentImage.objects.create(
                            user=user,
                            image=img_file,
                            is_primary=img_data.get('is_primary', False),
                            content_type=content_type,
                            object_id=location.id
                        )
                        location_images_map[adv_data['export_id']].append(new_img)
                        summary['images'] += 1
                    except KeyError:
                        pass
@@ -682,14 +755,32 @@ class BackupViewSet(viewsets.ViewSet):
                        pass

            summary['locations'] += 1

        # Apply primary image selections now that images exist
        for entry in pending_primary_images:
            collection = collection_map.get(entry['collection_export_id'])
            data = entry.get('data', {}) or {}
            if not collection:
                continue

            loc_export_id = data.get('location_export_id')
            img_index = data.get('image_index')
            if loc_export_id is None or img_index is None:
                continue

            images_for_location = location_images_map.get(loc_export_id, [])
            if 0 <= img_index < len(images_for_location):
                collection.primary_image = images_for_location[img_index]
                collection.save(update_fields=['primary_image'])

        # Import Transportation
        transportation_map = {}  # Map export_id to actual transportation object
        for trans_data in backup_data.get('transportation', []):
            collection = None
            if trans_data.get('collection_export_id') is not None:
                collection = collection_map.get(trans_data['collection_export_id'])

            transportation = Transportation.objects.create(
                user=user,
                type=trans_data['type'],
                name=trans_data['name'],
@@ -710,15 +801,19 @@ class BackupViewSet(viewsets.ViewSet):
                is_public=trans_data.get('is_public', False),
                collection=collection
            )
            # Only add to map if export_id exists (for backward compatibility with old backups)
            if 'export_id' in trans_data:
                transportation_map[trans_data['export_id']] = transportation
            summary['transportation'] += 1

        # Import Notes
        note_map = {}  # Map export_id to actual note object
        for note_data in backup_data.get('notes', []):
            collection = None
            if note_data.get('collection_export_id') is not None:
                collection = collection_map.get(note_data['collection_export_id'])

            note = Note.objects.create(
                user=user,
                name=note_data['name'],
                content=note_data.get('content'),
@@ -727,9 +822,13 @@ class BackupViewSet(viewsets.ViewSet):
                is_public=note_data.get('is_public', False),
                collection=collection
            )
            # Only add to map if export_id exists (for backward compatibility with old backups)
            if 'export_id' in note_data:
                note_map[note_data['export_id']] = note
            summary['notes'] += 1

        # Import Checklists
        checklist_map = {}  # Map export_id to actual checklist object
        for check_data in backup_data.get('checklists', []):
            collection = None
            if check_data.get('collection_export_id') is not None:
@@ -753,15 +852,19 @@ class BackupViewSet(viewsets.ViewSet):
                )
                summary['checklist_items'] += 1

            # Only add to map if export_id exists (for backward compatibility with old backups)
            if 'export_id' in check_data:
                checklist_map[check_data['export_id']] = checklist
            summary['checklists'] += 1

        # Import Lodging
        lodging_map = {}  # Map export_id to actual lodging object
        for lodg_data in backup_data.get('lodging', []):
            collection = None
            if lodg_data.get('collection_export_id') is not None:
                collection = collection_map.get(lodg_data['collection_export_id'])

            lodging = Lodging.objects.create(
                user=user,
                name=lodg_data['name'],
                type=lodg_data.get('type', 'other'),
@@ -779,6 +882,51 @@ class BackupViewSet(viewsets.ViewSet):
                is_public=lodg_data.get('is_public', False),
                collection=collection
            )
            # Only add to map if export_id exists (for backward compatibility with old backups)
            if 'export_id' in lodg_data:
                lodging_map[lodg_data['export_id']] = lodging
            summary['lodging'] += 1

        # Import Itinerary Items
        # The maps were already built while importing each content type
        for itinerary_data in backup_data.get('itinerary_items', []):
            collection = collection_map.get(itinerary_data['collection_export_id'])
            if not collection:
                continue

            content_type_str = itinerary_data['content_type']
            item_reference = itinerary_data['item_reference']

            # Get the actual object based on content type; item_reference is the export_id
            content_object = None
            content_type = None

            if content_type_str == 'location':
                content_object = location_map.get(item_reference)
                content_type = ContentType.objects.get(model='location')
            elif content_type_str == 'transportation':
                content_object = transportation_map.get(item_reference)
                content_type = ContentType.objects.get(model='transportation')
            elif content_type_str == 'note':
                content_object = note_map.get(item_reference)
                content_type = ContentType.objects.get(model='note')
            elif content_type_str == 'lodging':
                content_object = lodging_map.get(item_reference)
                content_type = ContentType.objects.get(model='lodging')
            elif content_type_str == 'checklist':
                content_object = checklist_map.get(item_reference)
                content_type = ContentType.objects.get(model='checklist')

            if content_object and content_type:
                CollectionItineraryItem.objects.create(
                    collection=collection,
                    content_type=content_type,
                    object_id=content_object.id,
                    date=itinerary_data.get('date') if not itinerary_data.get('is_global') else None,
                    is_global=bool(itinerary_data.get('is_global', False)),
                    order=itinerary_data['order']
                )
                summary['itinerary_items'] += 1

        return summary
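The import side mirrors the export: the positional export ids recorded on export are translated back through the `*_map` dictionaries that are filled while each content type is recreated. A minimal stand-in sketch of that round trip, using plain dicts in place of Django model rows:

```python
# Export side: reference each object by its position in the export, not its PK.
notes = ['packing list', 'museum hours']  # stand-ins for Note rows
note_id_to_export_id = {id(n): i for i, n in enumerate(notes)}
item_reference = note_id_to_export_id[id(notes[1])]

# Import side: export_id -> newly recreated object, as in note_map above.
note_map = {i: n for i, n in enumerate(notes)}
print(note_map[item_reference])  # museum hours
```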
518
backend/server/adventures/views/itinerary_view.py
Normal file
@@ -0,0 +1,518 @@
from adventures.models import Location, Collection, CollectionItineraryItem, Transportation, Note, Lodging, Visit, Checklist, CollectionItineraryDay
import datetime
from django.utils.dateparse import parse_date, parse_datetime
from django.contrib.contenttypes.models import ContentType
from django.db import models
from adventures.serializers import CollectionItineraryItemSerializer, CollectionItineraryDaySerializer
from adventures.utils.itinerary import reorder_itinerary_items
from adventures.utils.autogenerate_itinerary import auto_generate_itinerary
from rest_framework import viewsets, status
from rest_framework.decorators import action
from rest_framework.response import Response
from rest_framework.exceptions import ValidationError, PermissionDenied
from adventures.permissions import IsOwnerOrSharedWithFullAccess
from django.db.models import Q
from django.db import transaction
from django.utils import timezone

class ItineraryViewSet(viewsets.ModelViewSet):
    serializer_class = CollectionItineraryItemSerializer
    permission_classes = [IsOwnerOrSharedWithFullAccess]

    def get_queryset(self):
        user = self.request.user

        if not user.is_authenticated:
            return CollectionItineraryItem.objects.none()

        # Return itinerary items from collections the user owns or is shared with
        return CollectionItineraryItem.objects.filter(
            Q(collection__user=user) | Q(collection__shared_with=user)
        ).distinct().select_related('collection', 'collection__user').order_by('date', 'order')

    def create(self, request, *args, **kwargs):
        """
        Accept 'content_type' as either a ContentType PK or a model name string
        (e.g. 'location', 'lodging', 'transportation', 'note', 'visit'). If a
        string is provided, we resolve it to the appropriate ContentType PK and
        validate that the referenced object exists and the user has permission
        to access it.

        Optional parameter 'update_item_date': if True, update the actual item's
        date field to match the itinerary date.
        """
        if not request.user.is_authenticated:
            return Response({"error": "User is not authenticated"}, status=status.HTTP_401_UNAUTHORIZED)

        data = request.data.copy()
        content_type_val = data.get('content_type')
        object_id = data.get('object_id')
        update_item_date = data.get('update_item_date', False)
        target_date = data.get('date')
        is_global = data.get('is_global', False)
        # Normalize is_global to boolean
        if isinstance(is_global, str):
            is_global = is_global.lower() in ['1', 'true', 'yes']
        data['is_global'] = is_global

        # Support legacy field 'location' -> treat as content_type='location'
        if not content_type_val and data.get('location'):
            content_type_val = 'location'
            object_id = object_id or data.get('location')
            data['content_type'] = content_type_val
            data['object_id'] = object_id

        # If content_type is provided as a string model name, map to ContentType PK
        if content_type_val and isinstance(content_type_val, str):
            # If it's already numeric-like, leave it
            if not content_type_val.isdigit():
                content_map = {
                    'location': Location,
                    'transportation': Transportation,
                    'note': Note,
                    'lodging': Lodging,
                    'visit': Visit,
                    'checklist': Checklist,
                }

                if content_type_val not in content_map:
                    return Response({
                        'error': f"Invalid content_type. Must be one of: {', '.join(content_map.keys())}"
                    }, status=status.HTTP_400_BAD_REQUEST)

                model_class = content_map[content_type_val]

                # Validate referenced object exists
                try:
                    content_object = model_class.objects.get(id=object_id)
                except (ValueError, model_class.DoesNotExist):
                    return Response({'error': f"{content_type_val} not found"}, status=status.HTTP_404_NOT_FOUND)

                # Permission check
                permission_checker = IsOwnerOrSharedWithFullAccess()
                if not permission_checker.has_object_permission(request, self, content_object):
                    return Response({'error': 'User does not have permission to access this content'}, status=status.HTTP_403_FORBIDDEN)

                ct = ContentType.objects.get_for_model(model_class)
                data['content_type'] = ct.pk

                # If update_item_date is True and target_date is provided, update the item's date
                if update_item_date and target_date and content_object:
                    # Extract just the date part if target_date is a datetime string
                    clean_date = str(target_date).split('T')[0] if 'T' in str(target_date) else str(target_date)

                    # For locations, create an all-day visit instead of updating a date field
                    if content_type_val == 'location':
                        # Determine start/end bounds. Prefer explicit start_date/end_date
                        # if provided in the payload; otherwise use the single target date.
                        start_input = data.get('start_date') or clean_date
                        end_input = data.get('end_date') or clean_date

                        def parse_bounds(val):
                            if not val:
                                return None
                            s = str(val)
                            # If a datetime string is provided, parse it directly
                            if 'T' in s:
                                return parse_datetime(s)
                            # Otherwise parse as a date; converted to start/end-of-day datetimes below
                            d = parse_date(s)
                            if d:
                                return d
                            return None
                        # Normalize to date or datetime values
                        parsed_start = parse_bounds(start_input)
                        parsed_end = parse_bounds(end_input)

                        # If both are plain dates, convert to datetimes spanning the day
                        if isinstance(parsed_start, datetime.date) and not isinstance(parsed_start, datetime.datetime):
                            new_start = datetime.datetime.combine(parsed_start, datetime.time.min)
                        elif isinstance(parsed_start, datetime.datetime):
                            new_start = parsed_start
                        else:
                            new_start = None

                        if isinstance(parsed_end, datetime.date) and not isinstance(parsed_end, datetime.datetime):
                            new_end = datetime.datetime.combine(parsed_end, datetime.time.max)
                        elif isinstance(parsed_end, datetime.datetime):
                            new_end = parsed_end
                        else:
                            new_end = None

                        # If we couldn't parse bounds, fall back to the all-day target date
                        if not new_start or not new_end:
                            try:
                                d = parse_date(clean_date)
                                new_start = datetime.datetime.combine(d, datetime.time.min)
                                new_end = datetime.datetime.combine(d, datetime.time.max)
                            except Exception:
                                new_start = None
                                new_end = None

                        # Update an existing visit or create a new one. When moving between
                        # days, update the existing visit to preserve its ID and data.
                        if new_start and new_end:
                            source_visit_id = data.get('source_visit_id')

                            # If a source visit is provided, update it
                            if source_visit_id:
                                try:
                                    source_visit = Visit.objects.get(id=source_visit_id, location=content_object)
                                    source_visit.start_date = new_start
                                    source_visit.end_date = new_end
                                    source_visit.save(update_fields=['start_date', 'end_date'])
                                except Visit.DoesNotExist:
                                    # Fall back to the create logic below
                                    pass

                            # If no source visit was given, check for overlapping visits
                            if not source_visit_id:
                                # Check for an exact match to avoid duplicates
                                exact_match = Visit.objects.filter(
                                    location=content_object,
                                    start_date=new_start,
                                    end_date=new_end
                                ).exists()

                                if not exact_match:
                                    # Check for any overlapping visits
                                    overlap_q = Q(start_date__lte=new_end) & Q(end_date__gte=new_start)
                                    existing = Visit.objects.filter(location=content_object).filter(overlap_q).first()

                                    if existing:
                                        # Update the existing overlapping visit
                                        existing.start_date = new_start
                                        existing.end_date = new_end
                                        existing.save(update_fields=['start_date', 'end_date'])
                                    else:
                                        # Create a new visit
                                        Visit.objects.create(
                                            location=content_object,
                                            start_date=new_start,
                                            end_date=new_end,
                                            notes="Created from itinerary planning"
                                        )
                    else:
                        # For other item types, update their date field and preserve duration
                        if content_type_val == 'transportation':
                            # For transportation: update date and end_date, preserving duration and times
                            if hasattr(content_object, 'date') and hasattr(content_object, 'end_date'):
                                old_date = content_object.date
                                old_end_date = content_object.end_date

                                if old_date and old_end_date:
                                    # Extract the time from the original start date
                                    original_time = old_date.time()
                                    # Combine the new date with the original time
                                    new_date = datetime.datetime.combine(parse_date(clean_date), original_time)
                                    # Duration = end_date - date; apply the same duration to the new date
                                    duration = old_end_date - old_date
                                    new_end_date = new_date + duration
                                else:
                                    # No original end date; set end to the same as the start
                                    new_date = datetime.datetime.combine(parse_date(clean_date), datetime.time.min)
                                    new_end_date = new_date

                                content_object.date = new_date
                                content_object.end_date = new_end_date
                                content_object.save(update_fields=['date', 'end_date'])
                        elif content_type_val == 'lodging':
                            # For lodging: update check_in and check_out, preserving duration and times
                            if hasattr(content_object, 'check_in') and hasattr(content_object, 'check_out'):
                                old_check_in = content_object.check_in
                                old_check_out = content_object.check_out

                                if old_check_in and old_check_out:
                                    # Extract the time from the original check_in
                                    original_time = old_check_in.time()
                                    # Combine the new date with the original check-in time
                                    new_check_in = datetime.datetime.combine(parse_date(clean_date), original_time)
                                    # Duration = check_out - check_in; apply the same duration to the new check_in
                                    duration = old_check_out - old_check_in
                                    new_check_out = new_check_in + duration
                                else:
                                    # No original dates: check in at midnight on the selected day,
                                    # check out at midnight the next day
                                    new_check_in = datetime.datetime.combine(parse_date(clean_date), datetime.time.min)
                                    new_check_out = new_check_in + datetime.timedelta(days=1)

                                content_object.check_in = new_check_in
                                content_object.check_out = new_check_out
                                content_object.save(update_fields=['check_in', 'check_out'])
                        else:
                            # For notes, checklists, etc., just update the date field
                            date_field = None
                            if hasattr(content_object, 'date'):
                                date_field = 'date'
                            elif hasattr(content_object, 'start_date'):
                                date_field = 'start_date'

                            if date_field:
                                setattr(content_object, date_field, clean_date)
                                content_object.save(update_fields=[date_field])

        # Ensure order is unique for this collection+group combination (day or global)
        collection_id = data.get('collection')
        item_date = data.get('date')
        item_order = data.get('order', 0)

        # Basic XOR validation between date and is_global
        if is_global and item_date:
            return Response({'error': 'Global itinerary items must not include a date.'}, status=status.HTTP_400_BAD_REQUEST)
        if (not is_global) and not item_date:
            return Response({'error': 'Dated itinerary items must include a date.'}, status=status.HTTP_400_BAD_REQUEST)

        # Validate that the itinerary date (if provided) falls within the
        # collection's start_date/end_date range (if those bounds are set).
        if collection_id and item_date and not is_global:
            # Try to parse date or datetime-like values
            parsed_date = None
            try:
                parsed_date = parse_date(str(item_date))
            except Exception:
                parsed_date = None
            if parsed_date is None:
                try:
                    dt = parse_datetime(str(item_date))
                    if dt:
                        parsed_date = dt.date()
                except Exception:
                    parsed_date = None

            if parsed_date is not None:
                try:
                    collection_obj = Collection.objects.get(id=collection_id)
                except Collection.DoesNotExist:
                    return Response({'error': 'Collection not found'}, status=status.HTTP_404_NOT_FOUND)

                if collection_obj.start_date and parsed_date < collection_obj.start_date:
                    return Response({'error': 'Itinerary item date is before the collection start_date'}, status=status.HTTP_400_BAD_REQUEST)
                if collection_obj.end_date and parsed_date > collection_obj.end_date:
                    return Response({'error': 'Itinerary item date is after the collection end_date'}, status=status.HTTP_400_BAD_REQUEST)

        if collection_id:
            if is_global:
                # Max order within the global group
                existing_max = CollectionItineraryItem.objects.filter(
                    collection_id=collection_id,
                    is_global=True
                ).aggregate(max_order=models.Max('order'))['max_order']
                if existing_max is None:
                    existing_max = -1
                if item_order is None or item_order <= existing_max:
                    data['order'] = existing_max + 1
            elif item_date:
                # Find the maximum order for this collection+date
                existing_max = CollectionItineraryItem.objects.filter(
                    collection_id=collection_id,
                    date=item_date,
                    is_global=False
                ).aggregate(max_order=models.Max('order'))['max_order']

                # If the requested order conflicts with existing items,
                # assign the next available order
                if existing_max is not None and item_order <= existing_max:
                    data['order'] = existing_max + 1

        # Proceed with the normal serializer flow using the modified data
        serializer = self.get_serializer(data=data)
        serializer.is_valid(raise_exception=True)
        self.perform_create(serializer)

        # If we updated the item's date, include the updated object in the response for frontend sync
        response_data = serializer.data
        if update_item_date and content_type_val and object_id:
            if content_type_val == 'transportation':
                try:
                    t = Transportation.objects.get(id=object_id)
                    from adventures.serializers import TransportationSerializer
                    response_data['updated_object'] = TransportationSerializer(t).data
                except Transportation.DoesNotExist:
                    pass
            elif content_type_val == 'lodging':
                try:
                    l = Lodging.objects.get(id=object_id)
                    from adventures.serializers import LodgingSerializer
                    response_data['updated_object'] = LodgingSerializer(l).data
                except Lodging.DoesNotExist:
                    pass

        headers = self.get_success_headers(serializer.data)
        return Response(response_data, status=status.HTTP_201_CREATED, headers=headers)

    @transaction.atomic
    def destroy(self, request, *args, **kwargs):
        """
        Override destroy to remove associated visits when deleting a location itinerary item.

        When removing a location from the itinerary, any planned (future) visits at
        that location on the same date as the itinerary item should also be removed.

        If the preserve_visits=true query parameter is provided, visits will NOT be
        deleted. This is useful when moving items to the global/trip context, where
        we want to keep the visits.
        """
        instance = self.get_object()
        preserve_visits = request.query_params.get('preserve_visits', 'false').lower() == 'true'

        # Check if this is a location-type itinerary item
        location_ct = ContentType.objects.get_for_model(Location)
        if instance.content_type == location_ct and instance.object_id and not preserve_visits:
            try:
                location = Location.objects.get(id=instance.object_id)
                itinerary_date = instance.date

                if itinerary_date:
                    # Convert the itinerary date for comparison
                    if isinstance(itinerary_date, str):
                        itinerary_date = parse_date(itinerary_date)

                    # Find and delete visits at this location on this date;
                    # removing from the itinerary removes the associated visit
                    visits_to_delete = Visit.objects.filter(
                        location=location,
                        start_date__date=itinerary_date
                    )

                    deleted_count = visits_to_delete.count()
                    if deleted_count > 0:
                        visits_to_delete.delete()

            except Location.DoesNotExist:
                # Location doesn't exist; just proceed with deleting the itinerary item
                pass

        # Call the parent destroy to delete the itinerary item
        return super().destroy(request, *args, **kwargs)

    @action(detail=False, methods=['post'], url_path='reorder')
    @transaction.atomic
    def reorder(self, request):
        """
        Reorder itinerary items in bulk.

        Expected payload:
        {
            "items": [
                {"id": "uuid", "date": "2024-01-01", "order": 0},
                {"id": "uuid", "date": "2024-01-01", "order": 1},
                ...
            ]
        }
        """
        items_data = request.data.get('items', [])

        # Delegate to the reusable helper, which handles validation, permission
        # checks, and the two-phase update that avoids unique constraint races.
        updated_items = reorder_itinerary_items(request.user, items_data)

        serializer = self.get_serializer(updated_items, many=True)
        return Response(serializer.data, status=status.HTTP_200_OK)

    @action(detail=False, methods=['post'], url_path='auto-generate')
    @transaction.atomic
    def auto_generate(self, request):
        """
        Auto-generate itinerary items for a collection based on dated records.

        Only works when:
        - The collection has zero itinerary items
        - The collection has dated records (visits, lodging, transportation, notes, checklists)

        Expected payload:
        {
            "collection_id": "uuid"
        }

        Returns: a list of created itinerary items
        """
        collection_id = request.data.get('collection_id')

        if not collection_id:
            return Response(
                {"error": "collection_id is required"},
                status=status.HTTP_400_BAD_REQUEST
            )

        # Get the collection and check permissions
        try:
            collection = Collection.objects.get(id=collection_id)
        except Collection.DoesNotExist:
            return Response(
                {"error": "Collection not found"},
                status=status.HTTP_404_NOT_FOUND
            )

        # Permission check: the user must be the collection owner or in shared_with
        if not (collection.user == request.user or collection.shared_with.filter(id=request.user.id).exists()):
            return Response(
                {"error": "You do not have permission to modify this collection"},
                status=status.HTTP_403_FORBIDDEN
            )

        try:
            created_items = auto_generate_itinerary(collection)
            serializer = self.get_serializer(created_items, many=True)
            return Response({
                "message": f"Successfully generated {len(created_items)} itinerary items",
                "items": serializer.data
            }, status=status.HTTP_201_CREATED)
        except ValidationError as e:
            return Response(e.detail, status=status.HTTP_400_BAD_REQUEST)

class ItineraryDayViewSet(viewsets.ModelViewSet):
    """ViewSet for managing itinerary day metadata (names and descriptions)"""
    serializer_class = CollectionItineraryDaySerializer
    permission_classes = [IsOwnerOrSharedWithFullAccess]

    def get_queryset(self):
        user = self.request.user

        if not user.is_authenticated:
            return CollectionItineraryDay.objects.none()

        # Return day metadata from collections the user owns or is shared with
        return CollectionItineraryDay.objects.filter(
            Q(collection__user=user) | Q(collection__shared_with=user)
        ).distinct().select_related('collection', 'collection__user').order_by('date')

    def perform_create(self, serializer):
        """Ensure the user has permission to modify the collection"""
        collection = serializer.validated_data.get('collection')

        if not collection:
            raise ValidationError("Collection is required")

        # Check if the user has permission to modify this collection
        if not (collection.user == self.request.user or
                collection.shared_with.filter(id=self.request.user.id).exists()):
            raise PermissionDenied("You do not have permission to modify this collection")

        serializer.save()

    def perform_update(self, serializer):
        """Ensure the user has permission to modify the collection"""
        instance = self.get_object()
        collection = instance.collection

        # Check if the user has permission to modify this collection
        if not (collection.user == self.request.user or
                collection.shared_with.filter(id=self.request.user.id).exists()):
            raise PermissionDenied("You do not have permission to modify this collection")

        serializer.save()

    def perform_destroy(self, instance):
        """Ensure the user has permission to modify the collection"""
        collection = instance.collection

        # Check if the user has permission to modify this collection
        if not (collection.user == self.request.user or
                collection.shared_with.filter(id=self.request.user.id).exists()):
            raise PermissionDenied("You do not have permission to modify this collection")

        instance.delete()
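The transportation and lodging branches of `create()` above both reschedule an item by keeping its original time-of-day and duration while swapping only the calendar date. That arithmetic, in isolation (stdlib only, with illustrative values):

```python
import datetime

# Original stay: check in 2024-06-01 15:00, check out 2024-06-03 11:00.
old_check_in = datetime.datetime(2024, 6, 1, 15, 0)
old_check_out = datetime.datetime(2024, 6, 3, 11, 0)
new_day = datetime.date(2024, 6, 10)  # the day the item was dragged to

# Keep the original check-in time-of-day on the new date...
new_check_in = datetime.datetime.combine(new_day, old_check_in.time())
# ...and keep the original length of stay.
new_check_out = new_check_in + (old_check_out - old_check_in)
print(new_check_out)  # 2024-06-12 11:00:00
```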
@@ -1,15 +1,16 @@
from django.utils import timezone
from django.db import transaction
from django.core.exceptions import PermissionDenied
from django.db.models import Q, Max, Prefetch
from django.db.models.functions import Lower
from rest_framework import viewsets, status
from rest_framework.decorators import action
from rest_framework.response import Response
import requests
from adventures.models import Location, Category, CollectionItineraryItem, Visit
from django.contrib.contenttypes.models import ContentType
from adventures.permissions import IsOwnerOrSharedWithFullAccess
from adventures.serializers import LocationSerializer, MapPinSerializer, CalendarLocationSerializer
from adventures.utils import pagination

class LocationViewSet(viewsets.ModelViewSet):
@@ -208,6 +209,29 @@ class LocationViewSet(viewsets.ModelViewSet):
        serializer = self.get_serializer(queryset, many=True, context={'nested': nested, 'allowed_nested_fields': allowedNestedFields})
        return Response(serializer.data)

    @action(detail=False, methods=['get'])
    def calendar(self, request):
        """Return a lightweight payload for calendar rendering."""
        if not request.user.is_authenticated:
            return Response({"error": "User is not authenticated"}, status=400)

        queryset = (
            self.get_queryset()
            .filter(visits__isnull=False)
            .select_related('category')
            .prefetch_related(
                Prefetch(
                    'visits',
                    queryset=Visit.objects.only('id', 'start_date', 'end_date', 'timezone')
                )
            )
            .only('id', 'name', 'location', 'category__name', 'category__icon')
            .distinct()
        )

        serializer = CalendarLocationSerializer(queryset, many=True)
        return Response(serializer.data)

    @action(detail=True, methods=['get'], url_path='additional-info')
    def additional_info(self, request, pk=None):
        """Get adventure with additional sunrise/sunset information."""
@@ -277,6 +301,25 @@ class LocationViewSet(viewsets.ModelViewSet):
                raise PermissionDenied(
                    "You don't have permission to remove this location from one of the collections it's linked to."
                )
            else:
                # If the removal is permitted, also remove any itinerary items
                # in this collection that reference this Location instance.
                try:
                    ct = ContentType.objects.get_for_model(instance.__class__)
                    # Try deleting by native PK type first, then by string.
                    qs = CollectionItineraryItem.objects.filter(
                        collection=collection, content_type=ct, object_id=instance.pk
                    )
                    if qs.exists():
                        qs.delete()
                    else:
                        CollectionItineraryItem.objects.filter(
                            collection=collection, content_type=ct, object_id=str(instance.pk)
                        ).delete()
                except Exception:
                    # Don't raise on cleanup failures; deletion of itinerary items
                    # is best-effort and shouldn't block the update operation.
                    pass

    def _validate_collection_permissions(self, collections):
        """Validate permissions for all collections (used in create)."""
@@ -1,11 +1,13 @@
from rest_framework import viewsets, status
from rest_framework.response import Response
from django.db.models import Q
from adventures.models import Note, CollectionItineraryItem
from adventures.serializers import NoteSerializer
from rest_framework.exceptions import PermissionDenied
from adventures.permissions import IsOwnerOrSharedWithFullAccess
from rest_framework.decorators import action
from django.contrib.contenttypes.models import ContentType
from django.db import transaction


class NoteViewSet(viewsets.ModelViewSet):
    queryset = Note.objects.all()
@@ -53,15 +55,18 @@ class NoteViewSet(viewsets.ModelViewSet):
        # Retrieve the current object
        instance = self.get_object()

        # Store the old date before updating
        old_date = instance.date

        # Partially update the instance with the request data
        serializer = self.get_serializer(instance, data=request.data, partial=True)
        serializer.is_valid(raise_exception=True)

        # Retrieve the collection from the validated data
        new_collection = serializer.validated_data.get('collection')
        new_date = serializer.validated_data.get('date', old_date)

        user = request.user

        if new_collection is not None and new_collection != instance.collection:
            # Check if the user is the owner of the new collection
@@ -72,34 +77,15 @@ class NoteViewSet(viewsets.ModelViewSet):
            if instance.collection is not None and instance.collection.user != user:
                raise PermissionDenied("You cannot remove the collection as you are not the owner.")

        # If the date changed, remove old itinerary items for this note on the old date
        if old_date and new_date and old_date != new_date:
            note_ct = ContentType.objects.get_for_model(Note)
            old_itinerary_items = CollectionItineraryItem.objects.filter(
                content_type=note_ct,
                object_id=str(instance.id),
                date=old_date
            )
            old_itinerary_items.delete()

        # Perform the update
        self.perform_update(serializer)

        # Return the updated instance
        return Response(serializer.data)
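The date-change cleanup above can be restated without the ORM. A minimal sketch, where plain dicts stand in for `CollectionItineraryItem` rows and `drop_stale_itinerary_items` is a hypothetical helper, not project code:

```python
import datetime

def drop_stale_itinerary_items(items, note_id, old_date, new_date):
    """When a note's date changes, drop itinerary entries still pinned
    to the old date (mirrors the filter-and-delete above)."""
    if old_date and new_date and old_date != new_date:
        return [i for i in items
                if not (i["object_id"] == str(note_id) and i["date"] == old_date)]
    return items

items = [
    {"object_id": "7", "date": datetime.date(2025, 6, 1)},
    {"object_id": "7", "date": datetime.date(2025, 6, 2)},
    {"object_id": "9", "date": datetime.date(2025, 6, 1)},
]
kept = drop_stale_itinerary_items(items, 7, datetime.date(2025, 6, 1), datetime.date(2025, 6, 2))
print(len(kept))  # → 2
```

Only the entry for note 7 on the old date is removed; other notes on that date are untouched.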
@@ -5,197 +5,398 @@ from rest_framework.response import Response
from django.conf import settings
import requests
from geopy.distance import geodesic
import logging
from ..geocoding import search_google, search_osm

logger = logging.getLogger(__name__)


class RecommendationsViewSet(viewsets.ViewSet):
    permission_classes = [IsAuthenticated]
    OVERPASS_URL = "https://overpass-api.de/api/interpreter"
    NOMINATIM_URL = "https://nominatim.openstreetmap.org/search"
    HEADERS = {'User-Agent': 'AdventureLog Server'}

    # Quality thresholds
    MIN_GOOGLE_RATING = 3.0  # Minimum rating to include
    MIN_GOOGLE_REVIEWS = 5   # Minimum number of reviews
    MAX_RESULTS = 50         # Maximum results to return
    def calculate_quality_score(self, place_data):
        """
        Calculate a quality score based on multiple factors.
        Higher score = better quality recommendation.
        """
        import math
        score = 0.0

        # Rating contribution (0-50 points)
        rating = place_data.get('rating')
        if rating is not None and rating > 0:
            score += (rating / 5.0) * 50

        # Review count contribution (0-30 points, logarithmic scale)
        review_count = place_data.get('review_count')
        if review_count is not None and review_count > 0:
            # Logarithmic scale: 10 reviews = ~10 pts, 100 = ~20 pts, 1000 = ~30 pts
            score += min(30, math.log10(review_count) * 10)

        # Distance penalty (0-20 points, closer is better)
        distance_km = place_data.get('distance_km')
        if distance_km is not None:
            if distance_km < 1:
                score += 20
            elif distance_km < 5:
                score += 15
            elif distance_km < 10:
                score += 10
            elif distance_km < 20:
                score += 5

        # Verified/business status bonus (0-10 points)
        if place_data.get('is_verified') or place_data.get('business_status') == 'OPERATIONAL':
            score += 10

        # Has photos bonus (0-5 points)
        photos = place_data.get('photos')
        if photos and len(photos) > 0:
            score += 5

        # Has opening hours bonus (0-5 points)
        opening_hours = place_data.get('opening_hours')
        if opening_hours and len(opening_hours) > 0:
            score += 5

        return round(score, 2)
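To make the rubric concrete, here is a minimal standalone restatement of the same scoring rules with a worked example (`quality_score` and the sample dict are illustrative, not project code):

```python
import math

def quality_score(place):
    """Standalone restatement of the scoring rubric above."""
    score = 0.0
    rating = place.get('rating')
    if rating:
        score += (rating / 5.0) * 50                 # rating: 0-50 pts
    reviews = place.get('review_count')
    if reviews:
        score += min(30, math.log10(reviews) * 10)   # reviews: 0-30 pts, log scale
    dist = place.get('distance_km')
    if dist is not None:                             # proximity: 0-20 pts
        score += 20 if dist < 1 else 15 if dist < 5 else 10 if dist < 10 else 5 if dist < 20 else 0
    if place.get('is_verified'):
        score += 10                                  # verified/operational bonus
    if place.get('photos'):
        score += 5                                   # has photos
    if place.get('opening_hours'):
        score += 5                                   # has opening hours
    return round(score, 2)

sample = {'rating': 4.5, 'review_count': 200, 'distance_km': 3.2,
          'is_verified': True, 'photos': ['p1'], 'opening_hours': ['Mo-Fr 09:00-17:00']}
print(quality_score(sample))  # → 103.01  (45 + 23.01 + 15 + 10 + 5 + 5)
```

The log scale keeps a 10,000-review chain from drowning out a well-rated local spot: reviews can contribute at most 30 points while the rating itself contributes up to 50.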
    def parse_google_places(self, places, origin):
        """
        Parse Google Places API results into unified format.
        Enhanced with quality filtering and comprehensive data extraction.
        """
        locations = []
        api_key = getattr(settings, 'GOOGLE_MAPS_API_KEY', None)

        for place in places:
            location = place.get('location', {})
            types = place.get('types', [])

            # Extract display name
            display_name = place.get("displayName", {})
            name = display_name.get("text") if isinstance(display_name, dict) else display_name

            # Extract coordinates
            lat = location.get('latitude')
            lon = location.get('longitude')

            if not name or not lat or not lon:
                continue

            # Extract rating information
            rating = place.get('rating')
            review_count = place.get('userRatingCount', 0)

            # Quality filter: skip low-rated or unreviewed places
            if rating and rating < self.MIN_GOOGLE_RATING:
                continue
            if review_count < self.MIN_GOOGLE_REVIEWS:
                continue

            # Calculate distance
            distance_km = geodesic(origin, (lat, lon)).km

            # Extract address information
            formatted_address = place.get("formattedAddress") or place.get("shortFormattedAddress")

            # Extract business status
            business_status = place.get('businessStatus')
            is_operational = business_status == 'OPERATIONAL'

            # Extract opening hours
            opening_hours = place.get('regularOpeningHours', {})
            current_opening_hours = place.get('currentOpeningHours', {})
            is_open_now = current_opening_hours.get('openNow')

            # Extract photos and construct URLs
            photos = place.get('photos', [])
            photo_urls = []
            if photos and api_key:
                # Get first 5 photos and construct full URLs
                for photo in photos[:5]:
                    photo_name = photo.get('name', '')
                    if photo_name:
                        # Construct Google Places Photo API URL
                        # Format: https://places.googleapis.com/v1/{name}/media?key={key}&maxHeightPx=800&maxWidthPx=800
                        photo_url = f"https://places.googleapis.com/v1/{photo_name}/media?key={api_key}&maxHeightPx=800&maxWidthPx=800"
                        photo_urls.append(photo_url)

            # Extract contact information
            phone_number = place.get('nationalPhoneNumber') or place.get('internationalPhoneNumber')
            website = place.get('websiteUri')
            google_maps_uri = place.get('googleMapsUri')

            # Extract price level
            price_level = place.get('priceLevel')

            # Extract editorial summary/description
            editorial_summary = place.get('editorialSummary', {})
            description = editorial_summary.get('text') if isinstance(editorial_summary, dict) else None

            # Filter out unwanted types (generic categories)
            filtered_types = [t for t in types if t not in ['point_of_interest', 'establishment']]

            # Build unified response
            place_data = {
                "id": f"google:{place.get('id')}",
                "external_id": place.get('id'),
                "source": "google",
                "name": name,
                "description": description,
                "latitude": lat,
                "longitude": lon,
                "address": formatted_address,
                "tag": types[0] if types else None,
                "distance_km": round(distance_km, 2),
                "rating": rating,
                "review_count": review_count,
                "price_level": price_level,
                "types": filtered_types,
                "primary_type": filtered_types[0] if filtered_types else None,
                "business_status": business_status,
                "is_open_now": is_open_now,
                "opening_hours": opening_hours.get('weekdayDescriptions', []) if opening_hours else None,
                "phone_number": phone_number,
                "website": website,
                "google_maps_url": google_maps_uri,
                "photos": photo_urls,
                "is_verified": is_operational,
            }

            # Calculate quality score
            place_data['quality_score'] = self.calculate_quality_score(place_data)

            locations.append(place_data)

        return locations
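The photo-URL construction above is pure string formatting and can be pulled out into a small helper. A sketch (the function name and the fake key are illustrative, not project code):

```python
def photo_media_url(photo_name, api_key, max_px=800):
    """Build a Places Photo media URL in the same shape as the parser above.

    `photo_name` is the resource name returned in `places.photos[].name`,
    e.g. "places/<place_id>/photos/<photo_id>".
    """
    return (f"https://places.googleapis.com/v1/{photo_name}/media"
            f"?key={api_key}&maxHeightPx={max_px}&maxWidthPx={max_px}")

url = photo_media_url("places/abc123/photos/xyz", "FAKE_KEY")
print(url)
```

Note that the URL embeds the API key, so these links should only be handed to clients you would trust with that key (or proxied server-side).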
    def parse_overpass_response(self, data, request, origin):
        """
        Parse Overpass API (OSM) results into unified format.
        Enhanced with quality filtering and comprehensive data extraction.
        """
        nodes = data.get('elements', [])
        locations = []

        for node in nodes:
            if node.get('type') not in ['node', 'way', 'relation']:
                continue

            tags = node.get('tags', {})

            # Get coordinates (for ways/relations, use center)
            lat = node.get('lat') or node.get('center', {}).get('lat')
            lon = node.get('lon') or node.get('center', {}).get('lon')

            # Extract name (with fallbacks)
            name = tags.get('name') or tags.get('official_name') or tags.get('alt_name')

            if not name or lat is None or lon is None:
                continue

            # Calculate distance if possible
            distance_km = round(geodesic(origin, (lat, lon)).km, 2) if origin else None

            # Extract address information
            address_parts = [
                tags.get('addr:housenumber'),
                tags.get('addr:street'),
                tags.get('addr:suburb') or tags.get('addr:neighbourhood'),
                tags.get('addr:city'),
                tags.get('addr:state'),
                tags.get('addr:postcode'),
                tags.get('addr:country')
            ]
            formatted_address = ", ".join(filter(None, address_parts)) or None

            # Extract contact information
            phone = tags.get('phone') or tags.get('contact:phone')
            website = tags.get('website') or tags.get('contact:website') or tags.get('url')

            # Extract opening hours
            opening_hours = tags.get('opening_hours')

            # Extract rating/stars (if available)
            stars = tags.get('stars')

            # Determine category/type hierarchy
            category_keys = ['tourism', 'leisure', 'amenity', 'natural', 'historic', 'attraction', 'shop', 'sport']
            types = [tags.get(key) for key in category_keys if key in tags]
            primary_type = types[0] if types else None

            # Extract description and additional info
            description = tags.get('description') or tags.get('note')
            wikipedia = tags.get('wikipedia') or tags.get('wikidata')

            # Extract image if available
            image = tags.get('image') or tags.get('wikimedia_commons')

            # Quality filters for OSM data:
            # skip generic POIs without a specific category
            if not primary_type:
                continue

            # Skip construction or disused places
            if tags.get('disused') or tags.get('construction'):
                continue

            # Build unified response
            place_data = {
                "id": f"osm:{node.get('type')}:{node.get('id')}",
                "external_id": str(node.get('id')),
                "source": "osm",
                "name": name,
                "description": description,
                "latitude": lat,
                "longitude": lon,
                "address": formatted_address,
                "tag": next((tags.get(key) for key in ['leisure', 'tourism', 'natural', 'historic', 'amenity'] if key in tags), None),
                "distance_km": distance_km,
                "rating": None,  # OSM doesn't have ratings
                "review_count": None,
                "price_level": None,
                "types": types,
                "primary_type": primary_type,
                "business_status": None,
                "is_open_now": None,
                "opening_hours": [opening_hours] if opening_hours else None,
                "phone_number": phone,
                "website": website,
                "google_maps_url": None,
                "photos": [image] if image else [],
                "is_verified": bool(wikipedia),  # Has Wikipedia = more verified
                "osm_type": node.get('type'),
                "wikipedia": wikipedia,
                "stars": stars,
            }

            # Calculate quality score (will be lower without ratings)
            place_data['quality_score'] = self.calculate_quality_score(place_data)

            locations.append(place_data)

        # Sort by distance if available
        if origin:
            locations.sort(key=lambda x: x.get("distance_km") or float("inf"))

        return locations
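The `addr:*` flattening above relies on `filter(None, ...)` to skip missing parts. A minimal standalone sketch of the same idiom (the helper name and key subset are illustrative):

```python
def flatten_address(tags):
    """Join OSM addr:* parts into one line, skipping missing ones,
    as the parser above does."""
    keys = ['addr:housenumber', 'addr:street', 'addr:city', 'addr:postcode', 'addr:country']
    parts = [tags.get(k) for k in keys]
    return ", ".join(filter(None, parts)) or None

print(flatten_address({'addr:street': 'Main St', 'addr:city': 'Springfield'}))
# → Main St, Springfield
print(flatten_address({}))  # → None
```

Because `filter(None, ...)` drops falsy values, partially tagged OSM elements still produce a clean address instead of stray commas, and a fully untagged element yields `None` rather than an empty string.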
    def query_overpass(self, lat, lon, radius, category, request):
        """
        Query Overpass API (OpenStreetMap) for nearby places.
        Enhanced with better queries and error handling.
        """
        # Limit radius for OSM to prevent timeouts (max 5km due to server limits)
        osm_radius = min(radius, 5000)

        # Build optimized queries: reduced timeout and simplified patterns
        # to prevent 504 errors from the public Overpass server.
        if category == 'tourism':
            query = f"""
            [out:json][timeout:25];
            (
              nwr["tourism"~"attraction|viewpoint|museum|gallery|zoo|aquarium"](around:{osm_radius},{lat},{lon});
              nwr["historic"~"monument|castle|memorial"](around:{osm_radius},{lat},{lon});
              nwr["leisure"~"park|garden|nature_reserve"](around:{osm_radius},{lat},{lon});
            );
            out center tags 50;
            """
        elif category == 'lodging':
            query = f"""
            [out:json][timeout:25];
            nwr["tourism"~"hotel|motel|guest_house|hostel"](around:{osm_radius},{lat},{lon});
            out center tags 50;
            """
        elif category == 'food':
            query = f"""
            [out:json][timeout:25];
            nwr["amenity"~"restaurant|cafe|bar|pub"](around:{osm_radius},{lat},{lon});
            out center tags 50;
            """
        else:
            logger.error(f"Invalid category requested: {category}")
            return {"error": "Invalid category.", "results": []}

        try:
            response = requests.post(
                self.OVERPASS_URL,
                data=query,
                headers=self.HEADERS,
                timeout=30
            )
            response.raise_for_status()
            data = response.json()
        except requests.exceptions.Timeout:
            logger.warning(f"Overpass API timeout for {category} at ({lat}, {lon}) with radius {osm_radius}m")
            return {"error": f"OpenStreetMap query timed out. The service is overloaded. Radius limited to {int(osm_radius)}m.", "results": []}
        except requests.exceptions.HTTPError as e:
            if e.response.status_code == 504:
                logger.warning(f"Overpass API 504 Gateway Timeout for {category}")
                return {"error": "OpenStreetMap server is overloaded. Try again later or use Google source.", "results": []}
            logger.warning(f"Overpass API HTTP error: {e}")
            return {"error": "OpenStreetMap error: please try again later.", "results": []}
        except requests.exceptions.RequestException as e:
            logger.warning(f"Overpass API error: {e}")
            return {"error": "OpenStreetMap temporarily unavailable: please try again later.", "results": []}
        except ValueError as e:
            logger.error(f"Invalid JSON response from Overpass: {e}")
            return {"error": "Invalid response from OpenStreetMap.", "results": []}

        origin = (float(lat), float(lon))
        locations = self.parse_overpass_response(data, request, origin)

        logger.info(f"Overpass returned {len(locations)} results")
        return {"error": None, "results": locations}
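The Overpass QL strings above are plain f-strings, so the query shape can be verified without hitting the API. A sketch of the 'food' variant as a standalone builder (the helper is hypothetical; only the query shape comes from the code above):

```python
def overpass_food_query(lat, lon, radius_m, timeout_s=25, limit=50):
    """Assemble the same Overpass QL shape the view sends for 'food'."""
    return (
        f"[out:json][timeout:{timeout_s}];\n"
        f'nwr["amenity"~"restaurant|cafe|bar|pub"](around:{radius_m},{lat},{lon});\n'
        f"out center tags {limit};"
    )

q = overpass_food_query(48.8584, 2.2945, 2000)
print(q)
```

`nwr` matches nodes, ways, and relations in one clause, and `out center tags 50` returns at most 50 elements with their tags plus a computed center point for ways/relations, which is what lets the parser fall back to `node['center']` for coordinates.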
    def query_google_nearby(self, lat, lon, radius, category, request):
        """
        Query Google Places API (New) for nearby places.
        Enhanced with comprehensive field masks and better error handling.
        """
        api_key = settings.GOOGLE_MAPS_API_KEY

        # Updated to use the new Places API endpoint
        url = "https://places.googleapis.com/v1/places:searchNearby"

        # Comprehensive field mask to get all useful information
        headers = {
            'Content-Type': 'application/json',
            'X-Goog-Api-Key': api_key,
            'X-Goog-FieldMask': (
                'places.id,'
                'places.displayName,'
                'places.formattedAddress,'
                'places.shortFormattedAddress,'
                'places.location,'
                'places.types,'
                'places.rating,'
                'places.userRatingCount,'
                'places.businessStatus,'
                'places.priceLevel,'
                'places.websiteUri,'
                'places.googleMapsUri,'
                'places.nationalPhoneNumber,'
                'places.internationalPhoneNumber,'
                'places.editorialSummary,'
                'places.photos,'
                'places.currentOpeningHours,'
                'places.regularOpeningHours'
            )
        }

        # Map categories to place types - use multiple types for better coverage
        type_mapping = {
            'lodging': ['lodging', 'hotel', 'hostel', 'resort_hotel', 'extended_stay_hotel'],
            'food': ['restaurant', 'cafe', 'bar', 'bakery', 'meal_takeaway', 'meal_delivery'],
            'tourism': ['tourist_attraction', 'museum', 'art_gallery', 'aquarium', 'zoo', 'amusement_park', 'park', 'natural_feature'],
        }

        payload = {
            "includedTypes": type_mapping.get(category, ['tourist_attraction']),
            "maxResultCount": 20,
            "rankPreference": "DISTANCE",  # Sort by distance first
            "locationRestriction": {
                "circle": {
                    "center": {
@@ -208,7 +409,7 @@ class RecommendationsViewSet(viewsets.ViewSet):
        }

        try:
            response = requests.post(url, json=payload, headers=headers, timeout=15)
            response.raise_for_status()
            data = response.json()
@@ -216,41 +417,265 @@ class RecommendationsViewSet(viewsets.ViewSet):
            origin = (float(lat), float(lon))
            locations = self.parse_google_places(places, origin)

            logger.info(f"Google Places returned {len(locations)} quality results for category '{category}'")

            return Response(self._prepare_final_results(locations))

        except requests.exceptions.Timeout:
            logger.warning("Google Places API timeout, falling back to OSM")
            return self.query_overpass(lat, lon, radius, category, request)
        except requests.exceptions.RequestException as e:
            logger.warning(f"Google Places API error: {e}, falling back to OSM")
            return self.query_overpass(lat, lon, radius, category, request)
        except Exception as e:
            logger.error(f"Unexpected error with Google Places API: {e}")
            return self.query_overpass(lat, lon, radius, category, request)

    def _prepare_final_results(self, locations):
        """
        Prepare final results: sort by quality score and limit results.
        """
        # Sort by quality score (highest first)
        locations.sort(key=lambda x: x.get('quality_score', 0), reverse=True)

        # Limit to MAX_RESULTS
        locations = locations[:self.MAX_RESULTS]

        return locations
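The sort-and-cap step is easy to check in isolation. A minimal sketch (the free function and the small cap are illustrative; the view uses `MAX_RESULTS = 50`):

```python
def prepare_final(locations, max_results=3):
    """Sort by quality_score descending and cap the list, as above."""
    locations = sorted(locations, key=lambda x: x.get('quality_score', 0), reverse=True)
    return locations[:max_results]

sample = [{'name': n, 'quality_score': s} for n, s in
          [('a', 40), ('b', 90), ('c', 10), ('d', 70)]]
print([p['name'] for p in prepare_final(sample)])  # → ['b', 'd', 'a']
```

Entries missing `quality_score` default to 0, so unscored results sink to the bottom rather than raising a `KeyError`.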
    def _deduplicate_results(self, google_results, osm_results):
        """
        Deduplicate results from both sources based on name and proximity.
        Prioritize Google results when duplicates are found.
        """
        from difflib import SequenceMatcher

        def is_similar(name1, name2, threshold=0.85):
            """Check if two names are similar using fuzzy matching."""
            return SequenceMatcher(None, name1.lower(), name2.lower()).ratio() > threshold

        def is_nearby(loc1, loc2, max_distance_m=50):
            """Check if two locations are within max_distance_m meters."""
            dist = geodesic(
                (loc1['latitude'], loc1['longitude']),
                (loc2['latitude'], loc2['longitude'])
            ).meters
            return dist < max_distance_m

        # Start with all Google results (higher quality)
        deduplicated = list(google_results)

        # Add OSM results that don't match Google results
        for osm_loc in osm_results:
            is_duplicate = False
            for google_loc in google_results:
                if (is_similar(osm_loc['name'], google_loc['name']) and
                        is_nearby(osm_loc, google_loc)):
                    is_duplicate = True
                    break

            if not is_duplicate:
                deduplicated.append(osm_loc)

        return deduplicated
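The same name-plus-proximity dedup can be sketched with only the standard library. A rough haversine stands in for geopy's `geodesic` (that substitution, and the helper names, are assumptions for illustration):

```python
import math
from difflib import SequenceMatcher

def _meters(a, b):
    """Rough haversine distance in meters (stand-in for geopy's geodesic)."""
    lat1, lon1, lat2, lon2 = map(math.radians,
                                 (a['latitude'], a['longitude'], b['latitude'], b['longitude']))
    h = (math.sin((lat2 - lat1) / 2) ** 2 +
         math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))

def deduplicate(google_results, osm_results, threshold=0.85, max_distance_m=50):
    """Keep all Google hits; add an OSM hit only when no Google hit has a
    similar name within max_distance_m (mirrors the method above)."""
    merged = list(google_results)
    for osm in osm_results:
        dup = any(
            SequenceMatcher(None, osm['name'].lower(), g['name'].lower()).ratio() > threshold
            and _meters(osm, g) < max_distance_m
            for g in google_results
        )
        if not dup:
            merged.append(osm)
    return merged

g = [{'name': 'Eiffel Tower', 'latitude': 48.85837, 'longitude': 2.29448}]
o = [{'name': 'eiffel tower', 'latitude': 48.85839, 'longitude': 2.29450},   # duplicate, dropped
     {'name': 'Champ de Mars', 'latitude': 48.85610, 'longitude': 2.29770}]  # distinct, kept
print(len(deduplicate(g, o)))  # → 2
```

Requiring both conditions matters: two branches of a chain share a name but not a location, while two POIs at the same address can have unrelated names; neither alone marks a duplicate.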
@action(detail=False, methods=['get'])
|
||||
def query(self, request):
|
||||
"""
|
||||
Query both Google Places and OSM for recommendations.
|
||||
Returns unified, high-quality results sorted by quality score.
|
||||
|
||||
Query Parameters:
|
||||
- lat (required): Latitude
|
||||
- lon (required): Longitude
|
||||
- radius (optional): Search radius in meters (default: 5000, max: 50000)
|
||||
- category (required): Category - 'tourism', 'food', or 'lodging'
|
||||
- sources (optional): Comma-separated sources - 'google', 'osm', or 'both' (default: 'both')
|
||||
"""
|
||||
lat = request.query_params.get('lat')
|
||||
lon = request.query_params.get('lon')
|
||||
radius = request.query_params.get('radius', '1000')
|
||||
category = request.query_params.get('category', 'all')
|
||||
# Allow a free-text `location` parameter which will be geocoded
location_param = request.query_params.get('location')
radius = request.query_params.get('radius', '5000')
category = request.query_params.get('category')
sources = request.query_params.get('sources', 'both').lower()

# If lat/lon not supplied, try geocoding the free-text location param
if (not lat or not lon) and location_param:
    geocode_results = None

    # Try Google first if an API key is configured
    if getattr(settings, 'GOOGLE_MAPS_API_KEY', None):
        try:
            geocode_results = search_google(location_param)
        except Exception:
            logger.warning("Google geocoding failed; falling back to OSM")
            geocode_results = None

    # Fall back to OSM Nominatim
    if not geocode_results:
        try:
            geocode_results = search_osm(location_param)
        except Exception:
            logger.warning("OSM geocoding failed")
            geocode_results = None

    # Validate geocode results
    if isinstance(geocode_results, dict) and geocode_results.get('error'):
        # Log the internal geocoding error but avoid exposing sensitive details
        logger.warning("Geocoding helper returned an internal error")
        return Response({"error": "Geocoding failed. Please try a different location or contact support."}, status=400)

    if not geocode_results:
        return Response({"error": "Could not geocode provided location."}, status=400)

    # geocode_results is expected to be a list of results; pick the best (first)
    best = None
    if isinstance(geocode_results, list) and len(geocode_results) > 0:
        best = geocode_results[0]
    elif isinstance(geocode_results, dict):
        # Some helpers may return a dict when only one result is found
        best = geocode_results

    if not best:
        return Response({"error": "No geocoding results found."}, status=400)

    try:
        lat = float(best.get('lat') or best.get('latitude'))
        lon = float(best.get('lon') or best.get('longitude'))
    except Exception:
        return Response({"error": "Geocoding result missing coordinates."}, status=400)

    # Replace location_param with the display name, when available, for logging/debugging
    location_param = best.get('display_name') or best.get('name') or location_param
# Validation: lat and lon are required at this point
if not lat or not lon:
    return Response({
        "error": "Latitude and longitude parameters are required (or provide a 'location' parameter to geocode)."
    }, status=400)

try:
    lat = float(lat)
    lon = float(lon)
    radius = min(float(radius), 50000)  # Cap the radius at 50 km
except ValueError:
    return Response({
        "error": "Invalid latitude, longitude, or radius value."
    }, status=400)

valid_categories = ['lodging', 'food', 'tourism']
if category not in valid_categories:
    return Response({
        "error": f"Invalid category. Valid categories: {', '.join(valid_categories)}"
    }, status=400)
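The validation steps above reduce to a small, testable helper. A minimal sketch, assuming nothing beyond the checks shown (the helper name `parse_coords` is illustrative, not part of the codebase):

```python
def parse_coords(lat, lon, radius, max_radius=50000):
    """Mirror the view's checks: coerce to float, cap radius at max_radius,
    and signal invalid input with None."""
    try:
        return float(lat), float(lon), min(float(radius), float(max_radius))
    except (TypeError, ValueError):
        return None

print(parse_coords("41.3", "-72.9", "99999"))  # (41.3, -72.9, 50000.0)
print(parse_coords("41.3", "abc", "5000"))     # None
```

Returning None rather than raising keeps the calling view free to map every failure to a single 400 response.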

valid_sources = ['google', 'osm', 'both']
if sources not in valid_sources:
    return Response({
        "error": f"Invalid sources. Valid options: {', '.join(valid_sources)}"
    }, status=400)

api_key = getattr(settings, 'GOOGLE_MAPS_API_KEY', None)

google_results = []
osm_results = []

# Query Google Places if available and requested
if api_key and sources in ['google', 'both']:
    try:
        url = "https://places.googleapis.com/v1/places:searchNearby"
        headers = {
            'Content-Type': 'application/json',
            'X-Goog-Api-Key': api_key,
            'X-Goog-FieldMask': (
                'places.id,places.displayName,places.formattedAddress,'
                'places.shortFormattedAddress,places.location,places.types,'
                'places.rating,places.userRatingCount,places.businessStatus,'
                'places.priceLevel,places.websiteUri,places.googleMapsUri,'
                'places.nationalPhoneNumber,places.internationalPhoneNumber,'
                'places.editorialSummary,places.photos,'
                'places.currentOpeningHours,places.regularOpeningHours'
            )
        }

        type_mapping = {
            'lodging': ['lodging', 'hotel', 'hostel', 'resort_hotel'],
            'food': ['restaurant', 'cafe', 'bar', 'bakery'],
            'tourism': ['tourist_attraction', 'museum', 'art_gallery', 'aquarium', 'zoo', 'park'],
        }

        payload = {
            "includedTypes": type_mapping.get(category, ['tourist_attraction']),
            "maxResultCount": 20,
            "rankPreference": "DISTANCE",
            "locationRestriction": {
                "circle": {
                    "center": {"latitude": lat, "longitude": lon},
                    "radius": radius
                }
            }
        }

        response = requests.post(url, json=payload, headers=headers, timeout=15)
        response.raise_for_status()
        data = response.json()
        places = data.get('places', [])
        origin = (lat, lon)
        google_results = self.parse_google_places(places, origin)
        logger.info(f"Google Places: {len(google_results)} quality results")

    except Exception as e:
        logger.warning(f"Google Places failed: {e}")

# Query OSM if requested, or as a fallback when Google returned nothing
osm_error = None
if sources in ['osm', 'both'] or (sources == 'google' and not google_results):
    osm_response = self.query_overpass(lat, lon, radius, category, request)
    osm_results = osm_response.get('results', [])
    osm_error = osm_response.get('error')

    if osm_error:
        logger.warning(f"OSM query had issues: {osm_error}")

# Combine and deduplicate if using both sources
if sources == 'both' and google_results and osm_results:
    all_results = self._deduplicate_results(google_results, osm_results)
else:
    all_results = google_results + osm_results

# Prepare final results
final_results = self._prepare_final_results(all_results)

logger.info(f"Returning {len(final_results)} total recommendations")

# Build the response with metadata
response_data = {
    "count": len(final_results),
    "results": final_results,
    "sources_used": {
        "google": len(google_results),
        "osm": len(osm_results),
        "total_before_dedup": len(google_results) + len(osm_results)
    }
}

# Add warnings if there were errors but we still have some results
warnings = []
if osm_error and len(osm_results) == 0:
    warnings.append(osm_error)

if warnings:
    response_data["warnings"] = warnings

# If there are no results at all and the user requested only OSM, return an error status
if len(final_results) == 0 and sources == 'osm' and osm_error:
    # Log an internal notice for investigation, but do not expose details to clients
    logger.debug("OSM query error (internal)")
    return Response({
        "error": "OpenStreetMap service temporarily unavailable. Please try again later.",
        "count": 0,
        "results": [],
        "sources_used": response_data["sources_used"]
    }, status=503)

return Response(response_data)
@@ -45,41 +45,90 @@ class ReverseGeocodeViewSet(viewsets.ViewSet):

    @action(detail=False, methods=['post'])
    def mark_visited_region(self, request):
        """
        Marks regions and cities as visited based on the user's visited locations.
        Uses the pre-stored region/city data on locations to avoid expensive reverse geocoding.
        """
        new_region_count = 0
        new_regions = {}
        new_city_count = 0
        new_cities = {}

        # Get all visited locations with their region and city data
        visited_locations = Location.objects.filter(
            user=self.request.user
        ).select_related('region', 'city')

        # Track unique regions and cities to create VisitedRegion/VisitedCity entries
        regions_to_mark = set()
        cities_to_mark = set()

        for location in visited_locations:
            # Only process locations that are marked as visited
            if not location.is_visited_status():
                continue

            # Collect regions
            if location.region:
                regions_to_mark.add(location.region.id)

            # Collect cities
            if location.city:
                cities_to_mark.add(location.city.id)

        # Get existing visited regions for this user
        existing_visited_regions = set(
            VisitedRegion.objects.filter(
                user=self.request.user,
                region_id__in=regions_to_mark
            ).values_list('region_id', flat=True)
        )

        # Create new VisitedRegion entries
        new_visited_regions = []
        for region_id in regions_to_mark:
            if region_id not in existing_visited_regions:
                new_visited_regions.append(
                    VisitedRegion(region_id=region_id, user=self.request.user)
                )

        if new_visited_regions:
            VisitedRegion.objects.bulk_create(new_visited_regions)
            new_region_count = len(new_visited_regions)
            # Get region names for the response
            regions = Region.objects.filter(
                id__in=[vr.region_id for vr in new_visited_regions]
            )
            new_regions = {r.id: r.name for r in regions}

        # Get existing visited cities for this user
        existing_visited_cities = set(
            VisitedCity.objects.filter(
                user=self.request.user,
                city_id__in=cities_to_mark
            ).values_list('city_id', flat=True)
        )

        # Create new VisitedCity entries
        new_visited_cities = []
        for city_id in cities_to_mark:
            if city_id not in existing_visited_cities:
                new_visited_cities.append(
                    VisitedCity(city_id=city_id, user=self.request.user)
                )

        if new_visited_cities:
            VisitedCity.objects.bulk_create(new_visited_cities)
            new_city_count = len(new_visited_cities)
            # Get city names for the response
            cities = City.objects.filter(
                id__in=[vc.city_id for vc in new_visited_cities]
            )
            new_cities = {c.id: c.name for c in cities}

        return Response({
            "new_regions": new_region_count,
            "regions": new_regions,
            "new_cities": new_city_count,
            "cities": new_cities
        })
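The deduplication step in `mark_visited_region` boils down to a set difference between the ids to mark and the ids already linked, followed by a single bulk create. A minimal Django-free sketch of that core step (the function name is illustrative):

```python
def ids_to_create(marked_ids, existing_ids):
    """Only ids not already linked get a new Visited* row."""
    return sorted(set(marked_ids) - set(existing_ids))

print(ids_to_create([3, 1, 2, 2], [2]))  # [1, 3]
print(ids_to_create([], [1]))            # []
```

Computing the difference first means `bulk_create` issues one INSERT for all new rows instead of one query per location.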
@@ -1,11 +1,8 @@
"""
AdventureLog Server settings

Reference:
- Django settings: https://docs.djangoproject.com/en/stable/ref/settings/
"""

# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
@@ -15,14 +12,21 @@ from os import getenv
from pathlib import Path
from urllib.parse import urlparse
from publicsuffix2 import get_sld

# ---------------------------------------------------------------------------
# Environment & Paths
# ---------------------------------------------------------------------------
# Load environment variables from the .env file early so getenv works everywhere.
load_dotenv()

BASE_DIR = os.path.dirname(os.path.dirname(__file__))
# Quick-start development settings - unsuitable for production.
# See the Django deployment checklist for production hardening.

# ---------------------------------------------------------------------------
# Core Security & Debug
# ---------------------------------------------------------------------------
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = getenv('SECRET_KEY')

@@ -34,9 +38,13 @@ DEBUG = getenv('DEBUG', 'true').lower() == 'true'
# ALLOWED_HOSTS = [
#     '127.0.0.1',
#     'server'
# ]
ALLOWED_HOSTS = ['*']  # In production, restrict this to known hosts.

# ---------------------------------------------------------------------------
# Installed Apps
# ---------------------------------------------------------------------------
INSTALLED_APPS = (
    "allauth_ui",
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
@@ -44,7 +52,6 @@ INSTALLED_APPS = (
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'django.contrib.sites',
    'rest_framework',
    'rest_framework.authtoken',
    'allauth',
@@ -54,7 +61,9 @@ INSTALLED_APPS = (
    'allauth.socialaccount',
    'allauth.socialaccount.providers.github',
    'allauth.socialaccount.providers.openid_connect',
    'invitations',
    'drf_yasg',
    'djmoney',
    'corsheaders',
    'adventures',
    'worldtravel',
@@ -62,11 +71,14 @@ INSTALLED_APPS = (
    'integrations',
    'django.contrib.gis',
    # 'achievements',  # Not done yet; will be added in a future update
    'widget_tweaks',
    'slippers',
)

# ---------------------------------------------------------------------------
# Middleware
# ---------------------------------------------------------------------------
MIDDLEWARE = (
    'whitenoise.middleware.WhiteNoiseMiddleware',
    'adventures.middleware.XSessionTokenMiddleware',
@@ -83,9 +95,9 @@ MIDDLEWARE = (
    'allauth.account.middleware.AccountMiddleware',
)

# ---------------------------------------------------------------------------
# Caching
# ---------------------------------------------------------------------------
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.PyMemcacheCache',
@@ -101,9 +113,9 @@ ROOT_URLCONF = 'main.urls'

# WSGI_APPLICATION = 'demo.wsgi.application'

# ---------------------------------------------------------------------------
# Database
# ---------------------------------------------------------------------------
# Using legacy PG environment variables for compatibility with existing setups.

def env(*keys, default=None):
@@ -131,58 +143,52 @@ DATABASES = {

# ---------------------------------------------------------------------------
# Internationalization
# ---------------------------------------------------------------------------
LANGUAGE_CODE = 'en-us'

TIME_ZONE = 'UTC'

USE_I18N = True

USE_L10N = True

USE_TZ = True

# ---------------------------------------------------------------------------
# Frontend URL & Cookies
# ---------------------------------------------------------------------------
# Derive the frontend URL from the environment and configure cookie behavior.
unParsedFrontenedUrl = getenv('FRONTEND_URL', 'http://localhost:3000')
FRONTEND_URL = unParsedFrontenedUrl.translate(str.maketrans('', '', '\'"'))

SESSION_COOKIE_SAMESITE = 'Lax'

SESSION_COOKIE_NAME = 'sessionid'

# Secure cookies if the frontend is served over HTTPS
SESSION_COOKIE_SECURE = FRONTEND_URL.startswith('https')
CSRF_COOKIE_SECURE = FRONTEND_URL.startswith('https')


# Dynamically determine the cookie domain to support subdomains while avoiding IPs
hostname = urlparse(FRONTEND_URL).hostname
is_ip_address = hostname.replace('.', '').isdigit()
is_single_label = '.' not in hostname  # single-label hostnames (e.g., "localhost")

if is_ip_address or is_single_label:
    # Do not set a domain for IP addresses or single-label hostnames
    SESSION_COOKIE_DOMAIN = None
else:
    # Use publicsuffix2 to calculate the correct cookie domain,
    # falling back to the hostname if parsing fails
    cookie_domain = get_sld(hostname)
    SESSION_COOKIE_DOMAIN = f".{cookie_domain}" if cookie_domain else hostname

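The cookie-domain logic above can be exercised in isolation. A minimal sketch that mirrors the settings logic, with the `sld` parameter standing in for `publicsuffix2.get_sld(hostname)` (passing None mimics a parse failure):

```python
from urllib.parse import urlparse

def cookie_domain_for(frontend_url, sld=None):
    """Mirror of the settings logic: no cookie domain for IPs or
    single-label hosts; otherwise a leading-dot registrable domain."""
    hostname = urlparse(frontend_url).hostname
    is_ip_address = hostname.replace('.', '').isdigit()
    is_single_label = '.' not in hostname
    if is_ip_address or is_single_label:
        return None
    return f".{sld}" if sld else hostname

print(cookie_domain_for("http://localhost:3000"))                        # None
print(cookie_domain_for("http://192.168.1.10:8000"))                     # None
print(cookie_domain_for("https://log.example.com", sld="example.com"))   # .example.com
```

The leading dot makes the session cookie valid across subdomains, which is why IPs and bare hostnames must be excluded.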
# ---------------------------------------------------------------------------
# Static & Media Files
# ---------------------------------------------------------------------------
SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')


BASE_DIR = Path(__file__).resolve().parent.parent
STATIC_ROOT = BASE_DIR / "staticfiles"
STATIC_URL = '/static/'

MEDIA_URL = '/media/'
MEDIA_ROOT = BASE_DIR / 'media'  # Must match the NGINX root for media serving
STATICFILES_DIRS = [BASE_DIR / 'static']

STORAGES = {
@@ -196,6 +202,9 @@ STORAGES = {

SILENCED_SYSTEM_CHECKS = ["slippers.E001"]

# ---------------------------------------------------------------------------
# Templates
# ---------------------------------------------------------------------------
TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
@@ -212,31 +221,37 @@ TEMPLATES = [
    },
]

ALLAUTH_UI_THEME = "dim"

# ---------------------------------------------------------------------------
# Authentication & Accounts
# ---------------------------------------------------------------------------
DISABLE_REGISTRATION = getenv('DISABLE_REGISTRATION', 'false').lower() == 'true'
DISABLE_REGISTRATION_MESSAGE = getenv('DISABLE_REGISTRATION_MESSAGE', 'Registration is disabled. Please contact the administrator if you need an account.')

SOCIALACCOUNT_ALLOW_SIGNUP = getenv('SOCIALACCOUNT_ALLOW_SIGNUP', 'false').lower() == 'true'

AUTH_USER_MODEL = 'users.CustomUser'

ACCOUNT_ADAPTER = 'users.adapters.CustomAccountAdapter'
INVITATIONS_ADAPTER = ACCOUNT_ADAPTER
INVITATIONS_ACCEPT_INVITE_AFTER_SIGNUP = True
INVITATIONS_EMAIL_SUBJECT_PREFIX = 'AdventureLog: '
SOCIALACCOUNT_ADAPTER = 'users.adapters.CustomSocialAccountAdapter'
ACCOUNT_SIGNUP_FORM_CLASS = 'users.form_overrides.CustomSignupForm'

SESSION_SAVE_EVERY_REQUEST = True

LOGIN_REDIRECT_URL = FRONTEND_URL  # Redirect to the frontend after login

SOCIALACCOUNT_LOGIN_ON_GET = True
INVITATIONS_INVITE_FORM = 'users.form_overrides.UseAdminInviteForm'
INVITATIONS_SIGNUP_REDIRECT_URL = f"{FRONTEND_URL}/signup"

HEADLESS_FRONTEND_URLS = {
    "account_confirm_email": f"{FRONTEND_URL}/user/verify-email/{{key}}",
    "account_reset_password": f"{FRONTEND_URL}/user/reset-password",
    "account_reset_password_from_key": f"{FRONTEND_URL}/user/reset-password/{{key}}",
    "account_signup": f"{FRONTEND_URL}/signup",
    # Fallback if the handshake with the provider fails and the `next` URL is lost.
    "socialaccount_login_error": f"{FRONTEND_URL}/account/provider/callback",
}

@@ -249,8 +264,14 @@ AUTHENTICATION_BACKENDS = [
EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
SITE_ID = 1
ACCOUNT_EMAIL_REQUIRED = True
ACCOUNT_AUTHENTICATION_METHOD = 'username'
ACCOUNT_UNIQUE_EMAIL = True
ACCOUNT_EMAIL_VERIFICATION = getenv('ACCOUNT_EMAIL_VERIFICATION', 'none')  # 'none', 'optional', or 'mandatory'

SOCIALACCOUNT_EMAIL_AUTHENTICATION = True
SOCIALACCOUNT_EMAIL_AUTHENTICATION_AUTO_CONNECT = True  # Auto-link accounts by email
SOCIALACCOUNT_AUTO_SIGNUP = True  # Allow auto-signup after adapter checks

# When true, only social login is allowed (no password login); the login page shows
# only social providers, or redirects directly to the first provider if exactly one
# is configured.
FORCE_SOCIALACCOUNT_LOGIN = getenv('FORCE_SOCIALACCOUNT_LOGIN', 'false').lower() == 'true'

if getenv('EMAIL_BACKEND', 'console') == 'console':
    EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
@@ -274,6 +295,9 @@ else:
    # DEFAULT_FROM_EMAIL = 'mail@mail.user.com'

# ---------------------------------------------------------------------------
# Django REST Framework
# ---------------------------------------------------------------------------
REST_FRAMEWORK = {
    'DEFAULT_AUTHENTICATION_CLASSES': (
        'rest_framework.authentication.SessionAuthentication',
@@ -292,15 +316,18 @@ else:
)

# ---------------------------------------------------------------------------
# CORS & CSRF
# ---------------------------------------------------------------------------
# Both lists are derived from the CSRF_TRUSTED_ORIGINS environment variable.
CORS_ALLOWED_ORIGINS = [origin.strip() for origin in getenv('CSRF_TRUSTED_ORIGINS', 'http://localhost').split(',') if origin.strip()]

CSRF_TRUSTED_ORIGINS = [origin.strip() for origin in getenv('CSRF_TRUSTED_ORIGINS', 'http://localhost').split(',') if origin.strip()]

CORS_ALLOW_CREDENTIALS = True

DEFAULT_AUTO_FIELD = 'django.db.models.AutoField'

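The origin lists above share one parsing idiom: split on commas, trim whitespace, and drop empty entries so trailing commas are harmless. A minimal sketch of that idiom (the helper name is illustrative):

```python
def parse_origins(value):
    """Split a comma-separated origins string, trimming blanks."""
    return [o.strip() for o in value.split(',') if o.strip()]

print(parse_origins("http://localhost, https://example.com,"))
# ['http://localhost', 'https://example.com']
```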
# ---------------------------------------------------------------------------
# Logging
# ---------------------------------------------------------------------------
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
@@ -326,6 +353,9 @@ LOGGING = {
    },
}

# ---------------------------------------------------------------------------
# Public URLs & Third-Party Integrations
# ---------------------------------------------------------------------------
PUBLIC_URL = getenv('PUBLIC_URL', 'http://localhost:8000')

# ADVENTURELOG_CDN_URL = getenv('ADVENTURELOG_CDN_URL', 'https://cdn.adventurelog.app')
@@ -334,9 +364,9 @@ PUBLIC_URL = getenv('PUBLIC_URL', 'http://localhost:8000')
ADVENTURELOG_RELEASE_VERSION = 'v0.11.0'

# https://github.com/dr5hn/countries-states-cities-database/tags
COUNTRY_REGION_JSON_VERSION = 'v3.0'

# External service keys (do not hardcode secrets)
GOOGLE_MAPS_API_KEY = getenv('GOOGLE_MAPS_API_KEY', '')

STRAVA_CLIENT_ID = getenv('STRAVA_CLIENT_ID', '')
STRAVA_CLIENT_SECRET = getenv('STRAVA_CLIENT_SECRET', '')
@@ -33,6 +33,8 @@ urlpatterns = [

    path('csrf/', get_csrf_token, name='get_csrf_token'),
    path('public-url/', get_public_url, name='get_public_url'),

    path("invitations/", include('invitations.urls', namespace='invitations')),

    path('', TemplateView.as_view(template_name='home.html')),

@@ -6,5 +6,6 @@ def get_user_uuid(user):
class CustomModelSerializer(serializers.ModelSerializer):
    def to_representation(self, instance):
        representation = super().to_representation(instance)
        # Guard against instances without an associated user
        if hasattr(instance, 'user') and instance.user:
            representation['user'] = get_user_uuid(instance.user)
        return representation
@@ -1,6 +1,8 @@
Django==5.2.8
djangorestframework>=3.15.2
django-allauth==0.63.3
django-money==3.5.4
django-invitations==2.1.0
drf-yasg==1.21.4
django-cors-headers==4.4.0
coreapi==2.3.3
@@ -14,7 +16,7 @@ setuptools==79.0.1
gunicorn==23.0.0
qrcode==8.0
slippers==0.6.2
django-allauth-ui==1.7.0
django-widget-tweaks==1.5.0
django-ical==1.9.2
icalendar==6.1.0
105
backend/server/run_periodic_sync.py
Normal file
@@ -0,0 +1,105 @@
#!/usr/bin/env python3
"""
Periodic sync runner for AdventureLog.
Runs the sync_visited_regions management command once per day at local midnight.
Managed by supervisord so that it inherits the container environment variables.
"""
import os
import sys
import logging
import signal
import threading
from datetime import datetime, timedelta
from pathlib import Path

# Set up Django
sys.path.insert(0, str(Path(__file__).parent))
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'main.settings')

import django
django.setup()

from django.core.management import call_command

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s [%(levelname)s] %(message)s',
    handlers=[logging.StreamHandler(sys.stdout)]
)
logger = logging.getLogger(__name__)

# Event used to signal shutdown from signal handlers
_stop_event = threading.Event()


def _seconds_until_next_midnight() -> float:
    """Return the number of seconds until the next local midnight."""
    now = datetime.now()
    next_midnight = (now + timedelta(days=1)).replace(
        hour=0, minute=0, second=0, microsecond=0
    )
    return (next_midnight - now).total_seconds()


def _handle_termination(signum, frame):
    """Signal handler for SIGTERM and SIGINT: request a graceful shutdown."""
    logger.info(f"Received signal {signum}; shutting down gracefully...")
    _stop_event.set()


def run_sync():
    """Run the sync_visited_regions command."""
    try:
        logger.info("Running sync_visited_regions...")
        call_command('sync_visited_regions')
        logger.info("Sync completed successfully")
    except Exception as e:
        logger.error(f"Sync failed: {e}", exc_info=True)


def main():
    """Main loop: run the sync once per day at local midnight."""
    logger.info("Starting periodic sync worker for midnight background jobs...")

    # Install signal handlers so supervisord (or another process manager)
    # can request a clean shutdown via SIGTERM/SIGINT.
    signal.signal(signal.SIGTERM, _handle_termination)
    signal.signal(signal.SIGINT, _handle_termination)

    try:
        while not _stop_event.is_set():
            # Wait until the next local midnight (or until shutdown)
            wait_seconds = _seconds_until_next_midnight()
            hours = wait_seconds / 3600.0
            logger.info(
                f"Next sync scheduled in {wait_seconds:.0f}s (~{hours:.2f}h) at local midnight"
            )
            # Sleep until midnight, or until the stop event is set
            if _stop_event.wait(wait_seconds):
                break

            # It's midnight (or we woke up); run the sync once
            run_sync()

            # The loop then computes the next midnight and repeats
    except Exception:
        logger.exception("Unexpected error in periodic sync loop")
    finally:
        logger.info("Periodic sync worker exiting")


if __name__ == '__main__':
    try:
        main()
    except KeyboardInterrupt:
        # Fallback in case the signal is delivered as KeyboardInterrupt
        logger.info("KeyboardInterrupt received; exiting")
        _stop_event.set()
    except SystemExit:
        logger.info("SystemExit received; exiting")
    finally:
        logger.info("run_periodic_sync terminated")
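The scheduling math in the sync runner is worth checking at its boundary values; a minimal standalone version of `_seconds_until_next_midnight` with the clock injected for testing:

```python
from datetime import datetime, timedelta

def seconds_until_next_midnight(now):
    """Seconds from `now` to the next midnight in the same timezone as `now`."""
    next_midnight = (now + timedelta(days=1)).replace(
        hour=0, minute=0, second=0, microsecond=0
    )
    return (next_midnight - now).total_seconds()

print(seconds_until_next_midnight(datetime(2024, 1, 1, 23, 59)))  # 60.0
print(seconds_until_next_midnight(datetime(2024, 1, 1, 0, 0)))    # 86400.0
```

Note that exactly at midnight the function returns a full day, which is the desired behavior right after a run completes.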
2
backend/server/static/allauth_ui/tailwindcss
Normal file
@@ -0,0 +1,2 @@
@import url('https://cdn.tailwindcss.com');
@plugin "daisyui";
BIN
backend/server/static/favicon.png
Normal file
Binary file not shown. (After: 87 KiB)
3
backend/server/templates/admin/base.html
Normal file
@@ -0,0 +1,3 @@
{% extends "admin/base.html" %} {% load static %} {% block extrahead %}
<link rel="icon" href="{% static 'favicon.png' %}" />
{% endblock %}
@@ -4,6 +4,7 @@
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<meta name="description" content="AdventureLog API Server" />
<link rel="icon" href="/static/favicon.png" />
<meta name="author" content="Sean Morley" />
<title>AdventureLog API Server</title>

@@ -0,0 +1,96 @@
{% load i18n %} {% autoescape off %} {% blocktrans %}

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <title>You’re Invited to AdventureLog</title>
    <style>
      body {
        font-family: "Helvetica Neue", Helvetica, Arial, sans-serif;
        background-color: #f0f4f8;
        margin: 0;
        padding: 0;
        color: #1f2937;
      }
      .container {
        max-width: 600px;
        margin: 50px auto;
        background: #ffffff;
        border-radius: 16px;
        box-shadow: 0 8px 30px rgba(0, 0, 0, 0.08);
        padding: 40px 30px;
        text-align: center;
      }
      .logo {
        width: 80px;
        height: 80px;
        margin-bottom: 24px;
      }
      h1 {
        font-size: 28px;
        color: #111827;
        margin-bottom: 16px;
      }
      p {
        font-size: 16px;
        line-height: 1.65;
        margin-bottom: 24px;
        color: #1f2937;
      }
      .highlight {
        font-weight: 600;
        color: #4f46e5;
      }
      .button {
        display: inline-block;
        padding: 16px 32px;
        background: linear-gradient(90deg, #4f46e5, #6366f1);
        color: #ffffff !important;
        text-decoration: none;
        font-weight: 600;
        font-size: 16px;
        border-radius: 12px;
      }
      .footer {
        margin-top: 40px;
        font-size: 12px;
        line-height: 1.6;
        color: #9ca3af;
        text-align: center;
      }
    </style>
  </head>

  <body>
    <div class="container">
      <img
        src="https://adventurelog.app/adventurelog.png"
        alt="AdventureLog"
        class="logo"
      />

      <h1>You’re Invited to AdventureLog</h1>

      <p>Hello <strong>{{ email }}</strong>,</p>

      <p>
        Adventure awaits! You've been invited to join
        <span class="highlight">AdventureLog</span>, the ultimate travel
        companion to track, plan, and collaborate on your journeys.
      </p>
      <p>Hit the button below to accept your invitation and start exploring!</p>
      <p><a href="{{ invite_url }}" class="button">Join AdventureLog</a></p>

      <div class="footer">
        You’re receiving this email because someone invited you to join
        AdventureLog.<br />
        If you weren’t expecting this invitation, you can safely ignore this
        email.<br /><br />
        © 2023–2025 AdventureLog
      </div>
    </div>
  </body>
</html>

{% endblocktrans %} {% endautoescape %}
23
backend/server/templates/invitations/forms/_invite.html
Normal file
@@ -0,0 +1,23 @@
{# templates/invitations/invite.html #} {% load i18n %}

<h4 class="title">{% trans "Send Invitation" %}</h4>

<div class="alert alert-info">
  {% blocktrans %}To send an invitation, please go to the Django admin and
  create a new Invitation object.{% endblocktrans %}
</div>

<p>
  <a
    href="{% url 'admin:invitations_invitation_add' %}"
    class="btn btn-primary"
  >
    {% trans "Go to Django Admin" %}
  </a>
</p>

{% if request.user.is_staff %}
  <p class="text-muted small">
    {% blocktrans %}Only staff users can send invitations.{% endblocktrans %}
  </p>
{% endif %}
39
backend/server/templates/mfa/authenticate.html
Normal file
@@ -0,0 +1,39 @@
{% extends "mfa/authenticate.html" %}
{% load allauth %}
{% load allauth_ui %}
{% load i18n %}
{% block content %}
    {% trans "Two-Factor Authentication" as heading %}
    {% blocktranslate asvar subheading %}Your account is protected by two-factor authentication. Please enter an authenticator code:{% endblocktranslate %}
    {% url 'mfa_authenticate' as action_url %}
    {% #container heading=heading subheading=subheading %}
        {% translate "Activate" as button_text %}
        {% trans "Sign In" as button_text %}
        {% #form form=form url=action_url button_text=button_text %}
            {% csrf_token %}
        {% /form %}
        {% if "webauthn" in MFA_SUPPORTED_TYPES %}
            <div class="divider"></div>
            <h2 class="my-3 text-lg">{% translate "Alternative options" %}</h2>
            {% #form form=webauthn_form url=action_url use_default_button="false" %}
                <button type="submit" class="btn btn-neutral">
                    {% trans "Use a security key" %}
                </button>
                <a href="{% url "account_login" %}" class="btn btn-accent">{% trans "Cancel" %}</a>
                {% csrf_token %}
            {% /form %}
        {% endif %}
    {% /container %}
    {{ js_data|json_script:"js_data" }}
    {# djlint:off #}
    <script type="text/javascript">
        allauth.webauthn.forms.authenticateForm({
            ids: {
                authenticate: "mfa_webauthn_authenticate",
                credential: "{{ webauthn_form.credential.auto_id }}"
            },
            data: JSON.parse(document.getElementById('js_data').textContent)
        })
    </script>
    {# djlint:on #}
{% endblock content %}
@@ -1,10 +1,65 @@
from allauth.account.adapter import DefaultAccountAdapter
from django.conf import settings
# your_app/adapters.py

class NoNewUsersAccountAdapter(DefaultAccountAdapter):
    """
    Disable new user registration.
    """
from allauth.socialaccount.adapter import DefaultSocialAccountAdapter
from allauth.account.adapter import DefaultAccountAdapter
from allauth.account.signals import user_signed_up
from django.conf import settings
from django.urls import resolve, Resolver404
from invitations.models import Invitation


class CustomAccountAdapter(DefaultAccountAdapter):
    """Control regular signup based on DISABLE_REGISTRATION, but allow invites."""

    def is_open_for_signup(self, request):
        is_disabled = getattr(settings, 'DISABLE_REGISTRATION', False)
        return not is_disabled
        """
        Allow signup only if:
        - DISABLE_REGISTRATION is False, OR
        - the request is for the invitation acceptance URL, OR
        - there's a valid invitation key in the request parameters.
        """
        # If registration is globally open, allow as usual
        if settings.DISABLE_REGISTRATION is False:
            return True

        # If an invitation-verified email is stashed in the session, allow signup
        if hasattr(request, "session") and request.session.get("account_verified_email"):
            return True

        # When disabled, allow signups via invitation accept URL
        try:
            match = resolve(request.path_info)
            print("Resolved view name:", match.view_name)
            if match.view_name == "invitations:accept-invite":
                return True
        except Resolver404:
            pass

        # Block any other signup
        return False

    def get_user_signed_up_signal(self):
        """Return the allauth `user_signed_up` signal for compatibility with
        django-invitations which expects this method on the adapter.
        """
        return user_signed_up


class CustomSocialAccountAdapter(DefaultSocialAccountAdapter):
    """Control social signup based on SOCIALACCOUNT_ALLOW_SIGNUP setting"""

    def is_open_for_signup(self, request, sociallogin):
        """
        Determines if social signup is allowed.
        Check SOCIALACCOUNT_ALLOW_SIGNUP env variable.

        Returning False shows the same 'signup_closed.html' template
        as regular signup, but only blocks NEW social signups.
        Existing users can still log in.
        """
        # If social signup is disabled, only allow existing users
        if not settings.SOCIALACCOUNT_ALLOW_SIGNUP:
            return sociallogin.is_existing

        return True
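The signup-gating decision in `CustomAccountAdapter.is_open_for_signup` above can be sketched as a standalone function (a minimal sketch; `signup_allowed` and its parameters are illustrative stand-ins for the request/session/settings state, not part of the adapter API):

```python
# Illustrative sketch of the signup-gating checks above; the function name
# and parameters are hypothetical stand-ins for Django request state.
def signup_allowed(registration_disabled, session_verified_email, view_name):
    if not registration_disabled:  # registration globally open
        return True
    if session_verified_email:  # invitation-verified email stashed in session
        return True
    if view_name == "invitations:accept-invite":  # invitation accept URL
        return True
    return False  # block any other signup
```

The same three checks run in the same order as the adapter: global setting first, then session state, then URL resolution.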
@@ -4,3 +4,8 @@ from django.apps import AppConfig
class UsersConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'users'

    def ready(self):
        # Ensure signal handlers are registered
        import users.signals  # noqa: F401


@@ -1,3 +1,4 @@
from django.conf import settings
from django.contrib.auth.backends import ModelBackend
from allauth.socialaccount.models import SocialAccount
from allauth.account.auth_backends import AuthenticationBackend as AllauthBackend
@@ -7,6 +8,10 @@ User = get_user_model()

class NoPasswordAuthBackend(ModelBackend):
    def authenticate(self, request, username=None, password=None, **kwargs):
        # Block all password-based logins when social-only mode is enforced
        if getattr(settings, "FORCE_SOCIALACCOUNT_LOGIN", False) and password:
            return None

        # Handle allauth-specific authentication (like email login)
        allauth_backend = AllauthBackend()
        allauth_user = allauth_backend.authenticate(request, username=username, password=password, **kwargs)

@@ -14,4 +14,17 @@ class CustomSignupForm(forms.Form):

        # Save the user instance
        user.save()
        return user
        return user

class UseAdminInviteForm(forms.Form):
    """
    Dummy form that just tells admins to use the Django admin to send invites.
    """
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Remove any fields; we only want to show a message
        self.fields.clear()

    def as_widget(self):
        # This is not needed; we’ll just use a template
        pass
@@ -0,0 +1,18 @@
# Generated by Django 5.2.8 on 2026-01-03 20:30

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('users', '0005_customuser_measurement_system'),
    ]

    operations = [
        migrations.AddField(
            model_name='customuser',
            name='default_currency',
            field=models.CharField(choices=[('USD', 'US Dollar'), ('EUR', 'Euro'), ('GBP', 'British Pound'), ('JPY', 'Japanese Yen'), ('AUD', 'Australian Dollar'), ('CAD', 'Canadian Dollar'), ('CHF', 'Swiss Franc'), ('CNY', 'Chinese Yuan'), ('HKD', 'Hong Kong Dollar'), ('SGD', 'Singapore Dollar'), ('SEK', 'Swedish Krona'), ('NOK', 'Norwegian Krone'), ('DKK', 'Danish Krone'), ('NZD', 'New Zealand Dollar'), ('INR', 'Indian Rupee'), ('MXN', 'Mexican Peso'), ('BRL', 'Brazilian Real'), ('ZAR', 'South African Rand'), ('AED', 'UAE Dirham'), ('TRY', 'Turkish Lira')], default='USD', max_length=5),
        ),
    ]
@@ -3,6 +3,30 @@ from django.contrib.auth.models import AbstractUser
from django.db import models
from django_resized import ResizedImageField


CURRENCY_CHOICES = (
    ('USD', 'US Dollar'),
    ('EUR', 'Euro'),
    ('GBP', 'British Pound'),
    ('JPY', 'Japanese Yen'),
    ('AUD', 'Australian Dollar'),
    ('CAD', 'Canadian Dollar'),
    ('CHF', 'Swiss Franc'),
    ('CNY', 'Chinese Yuan'),
    ('HKD', 'Hong Kong Dollar'),
    ('SGD', 'Singapore Dollar'),
    ('SEK', 'Swedish Krona'),
    ('NOK', 'Norwegian Krone'),
    ('DKK', 'Danish Krone'),
    ('NZD', 'New Zealand Dollar'),
    ('INR', 'Indian Rupee'),
    ('MXN', 'Mexican Peso'),
    ('BRL', 'Brazilian Real'),
    ('ZAR', 'South African Rand'),
    ('AED', 'UAE Dirham'),
    ('TRY', 'Turkish Lira'),
)

class CustomUser(AbstractUser):
    email = models.EmailField(unique=True)  # Override the email field with unique constraint
    profile_pic = ResizedImageField(force_format="WEBP", quality=75, null=True, blank=True, upload_to='profile-pics/')
@@ -10,6 +34,8 @@ class CustomUser(AbstractUser):
    public_profile = models.BooleanField(default=False)
    disable_password = models.BooleanField(default=False)
    measurement_system = models.CharField(max_length=10, choices=[('metric', 'Metric'), ('imperial', 'Imperial')], default='metric')
    default_currency = models.CharField(max_length=5, choices=CURRENCY_CHOICES, default='USD')

    def __str__(self):
        return self.username
@@ -50,7 +50,7 @@ class UserDetailsSerializer(serializers.ModelSerializer):

    class Meta:
        model = CustomUser
        extra_fields = ['profile_pic', 'uuid', 'public_profile', 'measurement_system']
        extra_fields = ['profile_pic', 'uuid', 'public_profile']

        if hasattr(UserModel, 'USERNAME_FIELD'):
            extra_fields.append(UserModel.USERNAME_FIELD)
@@ -68,6 +68,8 @@ class UserDetailsSerializer(serializers.ModelSerializer):
        extra_fields.append('disable_password')
        if hasattr(UserModel, 'measurement_system'):
            extra_fields.append('measurement_system')
        if hasattr(UserModel, 'default_currency'):
            extra_fields.append('default_currency')

        fields = ['pk', *extra_fields]
        read_only_fields = ('email', 'date_joined', 'is_staff', 'is_superuser', 'is_active', 'pk', 'disable_password')
@@ -98,7 +100,7 @@ class CustomUserDetailsSerializer(UserDetailsSerializer):

    class Meta(UserDetailsSerializer.Meta):
        model = CustomUser
        fields = UserDetailsSerializer.Meta.fields + ['profile_pic', 'uuid', 'public_profile', 'has_password', 'disable_password', 'measurement_system']
        fields = UserDetailsSerializer.Meta.fields + ['has_password', 'disable_password']
        read_only_fields = UserDetailsSerializer.Meta.read_only_fields + ('uuid', 'has_password', 'disable_password')

    @staticmethod
53
backend/server/users/signals.py
Normal file
@@ -0,0 +1,53 @@
from django.db.models.signals import post_save, post_delete
from django.dispatch import receiver
from django.contrib.auth import get_user_model
from django.contrib.auth.models import AbstractUser
from allauth.account.models import EmailAddress
from django.db.models.signals import pre_delete

User = get_user_model()

def _sync_user_email(user: AbstractUser):
    from allauth.account.models import EmailAddress  # local import to avoid early import issues

    # Prefer the primary email if it exists
    primary = EmailAddress.objects.filter(user=user, primary=True).first()
    if primary:
        if user.email != primary.email:
            user.email = primary.email
            user.save(update_fields=['email'])
        return

    # Fallback: if any email exists, use the first; if none, leave user.email unchanged
    any_email = EmailAddress.objects.filter(user=user).order_by('id').first()
    if any_email and user.email != any_email.email:
        user.email = any_email.email
        user.save(update_fields=['email'])

@receiver(post_save)
def emailaddress_post_save(sender, instance, **kwargs):
    # Only react to allauth EmailAddress saves
    try:
        from allauth.account.models import EmailAddress
    except Exception:
        return
    if sender is EmailAddress:
        _sync_user_email(instance.user)

@receiver(post_delete)
def emailaddress_post_delete(sender, instance, **kwargs):
    # Only react to allauth EmailAddress deletes
    try:
        from allauth.account.models import EmailAddress
    except Exception:
        return
    if sender is EmailAddress:
        _sync_user_email(instance.user)

# Prevent deleting the last email address for a user
@receiver(pre_delete, sender=EmailAddress)
def prevent_deleting_last_email(sender, instance, using, **kwargs):
    user = instance.user
    email_count = EmailAddress.objects.filter(user=user).count()
    if email_count <= 1:
        raise ValueError("Cannot delete the last email address of a user.")
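The preference order `_sync_user_email` applies (primary address first, otherwise the first address on record, otherwise leave the user's email unchanged) can be sketched without the ORM; `pick_email` is a hypothetical helper for illustration only:

```python
# Sketch of the email-sync preference order used by _sync_user_email.
# `addresses` stands in for the user's EmailAddress rows as (email, primary) pairs.
def pick_email(current_email, addresses):
    primary = next((email for email, is_primary in addresses if is_primary), None)
    if primary:
        return primary  # prefer the primary address
    if addresses:
        return addresses[0][0]  # fall back to the first address on record
    return current_email  # no addresses at all: leave unchanged
```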
@@ -171,7 +171,8 @@ class EnabledSocialProvidersView(APIView):
            providers.append({
                'provider': provider.provider,
                'url': f"{getenv('PUBLIC_URL')}/accounts/{new_provider}/login/",
                'name': provider.name
                'name': provider.name,
                'usage_required': settings.FORCE_SOCIALACCOUNT_LOGIN
            })
        return Response(providers, status=status.HTTP_200_OK)


@@ -167,7 +167,7 @@ class Command(BaseCommand):
        # Process regions/states
        if country['states']:
            for state in country['states']:
                state_id = f"{country_code}-{state['state_code']}"
                state_id = f"{country_code}-{state['iso2']}"
                state_name = state['name']
                state_lat = round(float(state['latitude']), 6) if state['latitude'] else None
                state_lng = round(float(state['longitude']), 6) if state['longitude'] else None
@@ -1,4 +1,4 @@
from django.shortcuts import render, get_object_or_404
from django.shortcuts import get_object_or_404
from .models import Country, Region, VisitedRegion, City, VisitedCity
from .serializers import CitySerializer, CountrySerializer, RegionSerializer, VisitedRegionSerializer, VisitedCitySerializer
from rest_framework import viewsets, status
@@ -6,35 +6,8 @@ from rest_framework.permissions import IsAuthenticated
from rest_framework.response import Response
from rest_framework.decorators import api_view, permission_classes, action
from django.contrib.gis.geos import Point
from django.core.cache import cache
from django.views.decorators.cache import cache_page
from django.utils.decorators import method_decorator
from adventures.models import Location

# Cache TTL
CACHE_TTL = 60 * 60 * 24  # 1 day


def invalidate_visit_caches_for_region_and_user(region, user):
    """Invalidate cached visit lists for a given region and user.

    Removes both the per-region and per-country per-user cache keys so
    UI calls will refetch updated visited lists.
    """
    try:
        if region is None or user is None:
            return
        # per-region cache
        cache.delete(f"visits_by_region_{region.id}_{user.id}")
        # per-country cache (region -> country -> country_code)
        country_code = getattr(region.country, 'country_code', None)
        if country_code:
            cache.delete(f"visits_by_country_{country_code}_{user.id}")
    except Exception:
        # Avoid raising cache-related exceptions; best-effort invalidation
        pass
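The per-user cache keys that `invalidate_visit_caches_for_region_and_user` deletes follow a simple naming scheme. A minimal sketch of that scheme (key formats copied from the code above; the plain dict and the shortened function name stand in for Django's cache backend and the real helper):

```python
# Sketch of the per-user visit-cache keys; a plain dict stands in for
# Django's cache backend. Key formats match the views code above.
cache = {}

def invalidate_visit_caches(region_id, country_code, user_id):
    cache.pop(f"visits_by_region_{region_id}_{user_id}", None)
    if country_code:  # region -> country -> country_code may be missing
        cache.pop(f"visits_by_country_{country_code}_{user_id}", None)
```

Keying on both the region/country identifier and the user id is what lets a single user's create or destroy invalidate only that user's cached visit lists.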
@cache_page(CACHE_TTL)
@api_view(['GET'])
@permission_classes([IsAuthenticated])
def regions_by_country(request, country_code):
@@ -46,17 +19,11 @@ def regions_by_country(request, country_code):
@api_view(['GET'])
@permission_classes([IsAuthenticated])
def visits_by_country(request, country_code):
    cache_key = f"visits_by_country_{country_code}_{request.user.id}"
    data = cache.get(cache_key)
    if data is not None:
        return Response(data)
    country = get_object_or_404(Country, country_code=country_code)
    visits = VisitedRegion.objects.filter(region__country=country, user=request.user.id)
    serializer = VisitedRegionSerializer(visits, many=True)
    cache.set(cache_key, serializer.data, CACHE_TTL)
    return Response(serializer.data)

@cache_page(CACHE_TTL)
@api_view(['GET'])
@permission_classes([IsAuthenticated])
def cities_by_region(request, region_id):
@@ -68,14 +35,9 @@ def cities_by_region(request, region_id):
@api_view(['GET'])
@permission_classes([IsAuthenticated])
def visits_by_region(request, region_id):
    cache_key = f"visits_by_region_{region_id}_{request.user.id}"
    data = cache.get(cache_key)
    if data is not None:
        return Response(data)
    region = get_object_or_404(Region, id=region_id)
    visits = VisitedCity.objects.filter(city__region=region, user=request.user.id)
    serializer = VisitedCitySerializer(visits, many=True)
    cache.set(cache_key, serializer.data, CACHE_TTL)
    return Response(serializer.data)

# "Spin the globe" view: returns a random country, a random region in that country, and a random city in that region
@@ -99,7 +61,6 @@ def globespin(request):

    return Response(data)

@method_decorator(cache_page(CACHE_TTL), name='list')
class CountryViewSet(viewsets.ReadOnlyModelViewSet):
    queryset = Country.objects.all().order_by('name')
    serializer_class = CountrySerializer
@@ -134,7 +95,6 @@ class CountryViewSet(viewsets.ReadOnlyModelViewSet):
                continue
        return Response({'regions_visited': count})

@method_decorator(cache_page(CACHE_TTL), name='list')
class RegionViewSet(viewsets.ReadOnlyModelViewSet):
    queryset = Region.objects.all()
    serializer_class = RegionSerializer
@@ -158,22 +118,13 @@ class VisitedRegionViewSet(viewsets.ModelViewSet):
        serializer.is_valid(raise_exception=True)
        self.perform_create(serializer)
        headers = self.get_success_headers(serializer.data)
        # Invalidate caches for this region and its country for the user
        try:
            region = serializer.validated_data.get('region')
            invalidate_visit_caches_for_region_and_user(region, request.user)
        except Exception:
            pass
        return Response(serializer.data, status=status.HTTP_201_CREATED, headers=headers)

    def destroy(self, request, **kwargs):
        region = get_object_or_404(Region, id=kwargs['pk'])
        visited_region = VisitedRegion.objects.filter(user=request.user.id, region=region)
        if visited_region.exists():
            # capture region before deleting so we can invalidate caches
            affected_region = visited_region.first().region
            visited_region.delete()
            invalidate_visit_caches_for_region_and_user(affected_region, request.user)
            return Response(status=status.HTTP_204_NO_CONTENT)
        else:
            return Response({"error": "Visited region not found."}, status=status.HTTP_404_NOT_FOUND)
@@ -193,14 +144,10 @@ class VisitedCityViewSet(viewsets.ModelViewSet):
        serializer = self.get_serializer(data=request.data)
        serializer.is_valid(raise_exception=True)
        self.perform_create(serializer)
        # Ensure a VisitedRegion exists for the city and invalidate caches
        # Ensure a VisitedRegion exists for the city
        region = serializer.validated_data['city'].region
        if not VisitedRegion.objects.filter(user=request.user.id, region=region).exists():
            VisitedRegion.objects.create(user=request.user, region=region)
        try:
            invalidate_visit_caches_for_region_and_user(region, request.user)
        except Exception:
            pass
        headers = self.get_success_headers(serializer.data)
        return Response(serializer.data, status=status.HTTP_201_CREATED, headers=headers)

@@ -208,9 +155,7 @@ class VisitedCityViewSet(viewsets.ModelViewSet):
        city = get_object_or_404(City, id=kwargs['pk'])
        visited_city = VisitedCity.objects.filter(user=request.user.id, city=city)
        if visited_city.exists():
            region = city.region
            visited_city.delete()
            invalidate_visit_caches_for_region_and_user(region, request.user)
            return Response(status=status.HTTP_204_NO_CONTENT)
        else:
            return Response({"error": "Visited city not found."}, status=status.HTTP_404_NOT_FOUND)