[v0.12.0 pre] Planning & Itinerary Overhaul, Recommendation Engine, OIDC Enhancements, and More (#931)

* Fixes [REQUEST] Email-based auto-linking for OIDC
Fixes #921

* Add ClusterMap integration for regions and cities with fit-to-bounds functionality

* Update COUNTRY_REGION_JSON_VERSION to v3.0 and modify state ID generation to use ISO2 code

* fix: handle email verification required case during signup

Updated the signup action to return a specific message when the backend responds with a 401 status, indicating that the signup succeeded but email verification is required. This lets the frontend display the appropriate message via an i18n key.

* feat: add Advanced Configuration documentation with optional environment variables

* Fixes #511

* fix: update appVersion to v0.11.0-main-121425 and enhance socialProviders handling in settings page

* feat: implement social signup controls and update documentation for new environment variables

* fix: update LocationCard props and enhance restore data functionality

- Changed the user prop to null in LocationCard component on the dashboard page.
- Added isRestoring state to manage loading state during data restoration in settings.
- Updated the restore button to show a loading spinner when a restore operation is in progress.

* fix: update appVersion to v0.12.0-pre-dev-121625

* feat: implement itinerary planning feature with CollectionItineraryPlanner component and related updates

* feat: add overnight lodging indicator and functionality to CollectionItineraryPlanner

* feat: add compact display option to LocationCard and enhance lodging filtering in CollectionItineraryPlanner

* feat(itinerary): add itinerary management features and link modal

- Introduced ItineraryViewSet for managing itinerary items with create and reorder functionalities.
- Added itinerary linking capabilities in CollectionModal and CollectionItineraryPlanner components.
- Implemented new ItineraryLinkModal for linking existing items to specific dates.
- Enhanced the frontend with new modals for creating locations, lodging, transportation, notes, and checklists.
- Updated the backend to handle itinerary item creation and reordering with appropriate permissions.
- Improved data handling for unscheduled items and their association with the itinerary.
- Added new dependencies to the frontend for enhanced functionality.
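At its core, the reorder functionality above comes down to re-assigning sequential order values from a client-supplied ID sequence. A minimal sketch of that step, assuming items carry `id` and `order` fields (names are illustrative, not the actual ItineraryViewSet API):

```python
def reorder_items(items: list[dict], ordered_ids: list[str]) -> list[dict]:
    """Assign sequential order values following the client-supplied ID sequence."""
    by_id = {item["id"]: item for item in items}
    for position, item_id in enumerate(ordered_ids):
        # Ignore IDs the client sent that no longer exist server-side.
        if item_id in by_id:
            by_id[item_id]["order"] = position
    return sorted(items, key=lambda item: item["order"])
```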

* feat(itinerary): implement auto-generate functionality for itinerary items based on dated records

* feat(collection): enhance collection sharing logic and improve data handling on invite acceptance

* fix: update appVersion to correct pre-dev version

* feat(wikipedia): implement image selection from Wikipedia with enhanced results display

* Refactor code structure for improved readability and maintainability

* feat: add CollectionRecommendationView component for displaying location recommendations

- Implemented CollectionRecommendationView.svelte to handle location recommendations based on user input and selected categories.
- Added Recommendation and RecommendationResponse types to types.ts for better type safety and structure.
- Updated collections/[id]/+page.svelte to include a new view for recommendations, allowing users to switch between different views seamlessly.

* fix: update appVersion and improve button accessibility in collection views

* feat: add canModify prop to collection components for user permission handling

* feat: add itinerary removal functionality to various cards and update UI components

- Implemented `removeFromItinerary` function in `LodgingCard`, `NoteCard`, and `TransportationCard` to allow users to remove items from their itinerary.
- Replaced the trash icon with a calendar remove icon in `LocationCard`, `LodgingCard`, `NoteCard`, and `TransportationCard` for better visual representation.
- Updated the dropdown menus in `LodgingCard`, `NoteCard`, and `TransportationCard` to include the new remove from itinerary option.
- Enhanced `CollectionItineraryPlanner` to pass itinerary items to the respective cards.
- Removed `PointSelectionModal.svelte` as it is no longer needed.
- Refactored `LocationMedia.svelte` to integrate `ImageManagement` component and clean up unused code related to image handling.

* feat: enhance itinerary management with deduplication and initial visit date handling
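The deduplication mentioned here can be sketched as keeping the first occurrence of each identifying key; the key fields below are an assumption, not the real model:

```python
def dedupe_itinerary(items: list[dict]) -> list[dict]:
    """Keep the first occurrence of each (type, object_id, date) triple."""
    seen = set()
    unique = []
    for item in items:
        key = (item["type"], item["object_id"], item["date"])
        if key not in seen:
            seen.add(key)
            unique.append(item)
    return unique
```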

* feat: add FullMap component for enhanced map functionality with clustering support

- Introduced FullMap.svelte to handle map rendering, clustering, and marker management.
- Updated map page to utilize FullMap component, replacing direct MapLibre usage.
- Implemented clustering options and marker properties handling in FullMap.
- Added utility functions for resolving theme colors and managing marker states.
- Enhanced user experience with hover popups and improved loading states for location details.
- Updated app version to v0.12.0-pre-dev-122225.

* feat: enhance map interaction for touch devices with custom popup handling

* feat: add progress tracker for folder views to display visited and planned locations

* feat: add map center and zoom state management with URL synchronization

* feat: add status and days until start fields to collections with filtering options
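The "days until start" field is a simple date delta clamped at zero once the trip has begun; a sketch under that assumption:

```python
from datetime import date

def days_until_start(start_date: date, today: date) -> int:
    """Days remaining before a collection starts; 0 once the trip has begun."""
    return max((start_date - today).days, 0)
```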

* Component folder structure changes

* feat: add LodgingMedia and LodgingModal components for managing lodging details and media attachments

feat: implement LocationSearchMap component for interactive location searching and mapping functionality

* fix: update contentType in ImageManagement component to 'lodging' for correct media handling

* feat: enhance lodging management with date validation and update messages

* feat: implement lodging detail page with server-side loading and image modal functionality

- Added a new server-side load function to fetch lodging details by ID.
- Created a new Svelte component for the lodging detail page, including image carousel and map integration.
- Implemented a modal for displaying images with navigation.
- Enhanced URL handling in the locations page to only read parameters.

* feat: add Transportation modal component and related routes

- Implemented TransportationModal component for creating and editing transportation entries.
- Added server-side loading for transportation details in the new route [id]/+page.server.ts.
- Created a new Svelte page for displaying transportation details with image and attachment handling.
- Integrated modal for editing transportation in the transportation details page.
- Updated lodging routes to include a modal for editing lodging entries.
- Removed unused delete action from lodging server-side logic.

* feat: add start_code and end_code fields to Transportation model and update related components

* feat: implement date validation for itinerary items and add day picker modal for scheduling

* Reorder town and county checks in geocoding.py

Fix city detection when a location has only a town and the county value is not a city name

* Use address keys only if city is found

* Make sure reverse geocoding uses correct key for cities (#938)

* Reorder town and county checks in geocoding.py

Fix city detection when a location has only a town and the county value is not a city name

* Use address keys only if city is found

* Refactor code structure for improved readability and maintainability

* Enhance collection management with modal updates and item handling

* feat: integrate CollectionMap component in collections page and update map titles in lodging and transportation pages

- Replaced inline map implementation with CollectionMap component in collections/[id]/+page.svelte for better modularity.
- Updated the map title in lodging/[id]/+page.svelte to reflect lodging context.
- Updated the map title in transportations/[id]/+page.svelte to reflect transportation context.
- Added functionality to collect and render GeoJSON data from transportation attachments in transportations/[id]/+page.svelte.

* chore: update copyright year to 2026 in various files

* feat: enhance backup export functionality with itinerary items and export IDs

* fix: improve dropdown close behavior by handling multiple event types

* fix: remove unnecessary cache decorator from globespin function

* feat: add initial visit date support in ChecklistModal and NoteModal, with UI suggestions for prefilled dates

* feat: add details view for checklist and note cards with edit functionality

* feat: add travel duration and GPX distance calculation to Transportation model and UI
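GPX distance is typically accumulated by summing great-circle (haversine) distances between consecutive track points; a self-contained sketch of that approach, not the actual Transportation model code:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two coordinates in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius in km

def gpx_distance_km(points: list[tuple[float, float]]) -> float:
    """Sum segment distances over an ordered list of (lat, lon) track points."""
    return sum(haversine_km(*p1, *p2) for p1, p2 in zip(points, points[1:]))
```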

* feat: add primary image support to Collection model, serializers, and UI components

* Refactor calendar components and enhance event detail handling

- Replaced direct calendar implementation with a reusable CalendarComponent in the calendar route.
- Introduced EventDetailsModal for displaying event details, improving modularity and readability.
- Added functionality to fetch event details asynchronously when an event is clicked.
- Implemented ICS calendar download functionality with loading state management.
- Enhanced collections page to support calendar view, integrating event handling and timezone management.
- Improved lodging and transportation pages to display local time for stays and trips, including timezone badges.
- Cleaned up unused code and comments for better maintainability.

* feat: enhance hero image handling in collection view by prioritizing primary image

* chore: update .env.example to include account email verification configuration

* feat: enhance LodgingCard and TransportationCard components with expandable details and improved layout

* feat: add price and currency fields to locations, lodging, and transportation components

- Introduced price and price_currency fields in LocationModal, LodgingDetails, LodgingModal, TransportationDetails, and TransportationModal components.
- Implemented MoneyInput and CurrencyDropdown components for handling monetary values and currency selection.
- Updated data structures and types to accommodate new price and currency fields across various models.
- Enhanced cost summary calculations in collections and routes to display total costs by currency.
- Added user preference for default currency in settings, affecting new item forms.
- Updated UI to display price information in relevant components, ensuring consistent formatting and user experience.
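The per-currency cost summary described above amounts to grouping decimal prices by currency code; a sketch with illustrative field names and an assumed USD fallback:

```python
from collections import defaultdict
from decimal import Decimal

def totals_by_currency(items: list[dict]) -> dict:
    """Group item prices into per-currency totals, skipping unpriced items."""
    totals = defaultdict(Decimal)
    for item in items:
        price = item.get("price")
        if price is not None:
            # Decimal(str(...)) avoids float artifacts for numeric inputs.
            totals[item.get("price_currency") or "USD"] += Decimal(str(price))
    return dict(totals)
```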

* feat: add Development Timeline link to overview and create timeline documentation

* feat: enhance map functionality with search and zoom features

- Updated availableViews in collection page to include map view based on lodging and transportation locations.
- Added search functionality to the map page, allowing users to filter pins by name and category.
- Implemented auto-zoom feature to adjust the map view based on filtered search results.
- Introduced a search bar with a clear button for better user experience.

* feat: enhance ISO code extraction and region matching logic in extractIsoCode function

* feat: enhance extractIsoCode function with normalization for locality matching

* feat: update extractIsoCode function to include additional ISO3166 levels for improved region matching

* feat: enhance extractIsoCode function to handle cases without city information and update CollectionMap to bind user data
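The normalization used for locality matching is typically diacritic stripping plus case folding, so "Zürich" matches "zurich"; a minimal sketch (the actual extractIsoCode lives in the frontend, so this is illustrative only):

```python
import unicodedata

def normalize_locality(name: str) -> str:
    """Strip diacritics and fold case for locality comparison."""
    decomposed = unicodedata.normalize("NFKD", name)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch)).casefold()
```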

* feat: add cron job for syncing visited regions and cities, enhance Docker and supervisord configurations

* feat: add CollectionItineraryDay model and related functionality for itinerary day metadata management

* feat: implement cleanup of out-of-range itinerary items and notify users of potential impacts on itinerary when dates change

* Refactor collection page for improved localization and code clarity

- Removed unused imports and consolidated cost category labels to be reactive.
- Updated cost summary function to accept localized labels.
- Enhanced localization for various UI elements, including buttons, headings, and statistics.
- Improved user feedback messages for better clarity and consistency.
- Ensured all relevant text is translatable using the i18n library.

* feat: add collaborator serialization and display in collections

- Implemented `_build_profile_pic_url` and `_serialize_collaborator` functions for user profile picture URLs and serialization.
- Updated `CollectionSerializer` and `UltraSlimCollectionSerializer` to include collaborators in the serialized output.
- Enhanced `CollectionViewSet` to prefetch shared_with users for optimized queries.
- Modified frontend components to display collaborators in collection details, including profile pictures and initials.
- Added new localization strings for collaborators.
- Refactored map and location components to improve usability and functionality.
- Updated app version to reflect new changes.

* feat: add dynamic lodging icons based on type in CollectionMap component

* feat: add CollectionStats component for detailed trip statistics

- Implemented CollectionStats.svelte to display various statistics related to the collection, including distances, activities, and locations visited.
- Enhanced CollectionMap.svelte to filter activities based on date range using new getActivityDate function.
- Updated LocationSearchMap.svelte to handle airport mode for start and end locations.
- Modified types.ts to include is_global property in CollectionItineraryItem for trip-wide items.
- Updated +page.svelte to integrate the new stats view and manage view state accordingly.

* feat: enhance itinerary management by removing old items on date change for notes and checklists; normalize date handling in CollectionMap

* feat: add functionality to change day and move items to trip-wide itinerary

- Implemented changeDay function in ChecklistCard, LocationCard, LodgingCard, NoteCard, and TransportationCard components to allow users to change the scheduled day of items.
- Added a button to move items to the global (trip-wide) itinerary in the aforementioned components, with appropriate dispatch events.
- Enhanced CollectionItineraryPlanner to handle moving items to the global itinerary and added UI elements for unscheduled items.
- Updated ItineraryDayPickModal to support the deletion of source visits when moving locations.
- Added new translations for "Change Day" and "Move Trip Wide" in the English locale.

* fix: specify full path for python3 in cron job and add shell and path variables
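Cron runs jobs with a minimal environment, so relying on `python3` being on PATH is fragile; the fix pins the interpreter path and sets SHELL/PATH explicitly. An illustrative crontab fragment (the interpreter path, management command, and log path are assumptions, not the shipped configuration):

```crontab
SHELL=/bin/sh
PATH=/usr/local/bin:/usr/bin:/bin

# Absolute interpreter path: cron's default PATH may not include python3.
0 2 * * * /usr/local/bin/python3 /code/manage.py sync_regions >> /var/log/sync.log 2>&1
```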

* fix: update appVersion to v0.12.0-pre-dev-010726

* feat: enhance CollectionItineraryPlanner and CollectionStats with dynamic links and transport type normalization

* Add Dev Container + WSL install docs and link in install guide (#944) (#951)

* feat: enhance internationalization support in CollectionMap and CollectionStats components

- Added translation support for various labels and messages in CollectionMap.svelte and CollectionStats.svelte using svelte-i18n.
- Updated English and Chinese locale files to include new translation keys for improved user experience.
- Simplified the rendering of recommendation views in the collections page.

* Refactor itinerary management and UI components

- Updated ItineraryViewSet to handle visit updates and creations more efficiently, preserving visit IDs when moving between days.
- Enhanced ChecklistCard, LodgingCard, TransportationCard, and NoteCard to include a new "Change Day" option in the actions menu.
- Improved user experience in CollectionItineraryPlanner by tracking specific itinerary items being moved and ensuring only the relevant entries are deleted.
- Added new location sharing options in LodgingCard and TransportationCard for Apple Maps, Google Maps, and OpenStreetMap.
- Updated translations in en.json for consistency and clarity.
- Minor UI adjustments for better accessibility and usability across various components.

* feat: implement action menus and close event handling in card components

* feat: refactor Dockerfile and supervisord configuration to remove cron and add periodic sync script

* feat: enhance LocationSearchMap and TransportationDetails components with initialization handling and airport mode logic

* feat: add airport and location search mode labels to localization file

* feat: enhance periodic sync logging and improve airport mode handling in LocationSearchMap

* feat: enhance unscheduled items display with improved card interactions and accessibility

* Add dev compose for hot reload and update WSL dev container docs (#958)

* feat: enhance localization for itinerary linking and transportation components

* Localization: update localization files with new keys and values

* fix: improve error messages for Overpass API responses

* chore: update dependencies in frontend package.json and pnpm-lock.yaml

- Updated @sveltejs/adapter-node from ^5.2.12 to ^5.4.0
- Updated @sveltejs/adapter-vercel from ^5.7.0 to ^6.3.0
- Updated tailwindcss from ^3.4.17 to ^3.4.19
- Updated typescript from ^5.8.3 to ^5.9.3
- Updated vite from ^5.4.19 to ^5.4.21

* chore: update dependencies in pnpm-lock.yaml to latest versions

* Refactor code structure for improved readability and maintainability

* Refactor code structure for improved readability and maintainability

* fix: update package dependencies to resolve compatibility issues

* Add "worldtravel" translations to multiple locale files

- Added "worldtravel" key with translations for Spanish, French, Hungarian, Italian, Japanese, Korean, Dutch, Norwegian, Polish, Brazilian Portuguese, Russian, Slovak, Swedish, Turkish, Ukrainian, and Chinese.
- Updated the navigation section in each locale file to include the new "worldtravel" entry.

* Add new screenshots and update email verification message in locale file

* feat: Implement data restoration functionality with file import

- Added a new action `restoreData` in `+page.server.ts` to handle file uploads for restoring collections.
- Enhanced the UI in `+page.svelte` to include an import button and a modal for import progress.
- Integrated file input handling to trigger form submission upon file selection.
- Removed unused GSAP animations from the login, profile, and signup pages for cleaner code.

* feat: Add modals for creating locations and lodging from recommendations, enhance image import functionality

* fix: Adjust styles to prevent horizontal scroll and enhance floating action button visibility

* feat: Enhance error handling and messaging for Google Maps and OpenStreetMap geocoding functions

* fix: Enhance error messaging for Google Maps access forbidden response

* feat: Add User-Agent header to Google Maps API requests and refine error messaging for access forbidden response

* fix: Update User-Agent header in Google Maps API requests for improved compatibility

* fix: Disable proxy settings in Google Maps API request to prevent connection issues

* fix: Update Trivy security scan configuration and add .trivyignore for known false positives

* fix: Refactor update method to handle is_public cascading for related items

* feat: Integrate django-invitations for user invitation management and update settings

* feat: Add Tailwind CSS and DaisyUI plugin for styling

* feat: Add Tailwind CSS and DaisyUI plugin for styling

* feat: Add "Invite a User" guide and update navigation links

* docs: Update "Invite a User" guide to include email configuration tip

* feat: Update email invitation template for improved styling and clarity

* fix: Remove trailing backslash from installation note in Unraid documentation

* feat: Add export/import messages and user email verification prompts in multiple languages

* Squashed commit of the following:

commit a993a15b93ebb7521ae2e5cc31596b98b29fcd6c
Author: Alex <div@alexe.at>
Date:   Mon Jan 12 20:44:47 2026 +0100

    Translated using Weblate (German)

    Currently translated at 100.0% (1048 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/de/

commit fdc455d9424fbb0f6b72179d9eb1340411700773
Author: Ettore Atalan <atalanttore@googlemail.com>
Date:   Sat Jan 10 23:24:23 2026 +0100

    Translated using Weblate (German)

    Currently translated at 100.0% (1048 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/de/

commit 5942129c55e89dd999a13d4df9c40e6e3189355c
Author: Orhun <orhunavcu@gmail.com>
Date:   Sun Jan 11 13:05:31 2026 +0100

    Translated using Weblate (Turkish)

    Currently translated at 100.0% (1048 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/tr/

commit 8712e43d8ba4a7e7fe163fb454d6577187f9a375
Author: Henrique Fonseca Veloso <henriquefv@tutamail.com>
Date:   Fri Jan 9 22:53:11 2026 +0100

    Translated using Weblate (Portuguese (Brazil))

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/pt_BR/

commit 18ee56653470413afe8d71ecd2b5028f6e4cf118
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:52:57 2026 +0100

    Translated using Weblate (Dutch)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/nl/

commit 57783c544e583c035c8b57b5c10ca320f25f399e
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:52:14 2026 +0100

    Translated using Weblate (Arabic)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/ar/

commit fb09edfd85bc85234b1c1ba7dd499f2915093fff
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:52:26 2026 +0100

    Translated using Weblate (Spanish)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/es/

commit 554a207d8e454a1f7ae826e2a40d389b94be5512
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:52:21 2026 +0100

    Translated using Weblate (German)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/de/

commit b70b9db27fb8607beefeb288185601c8f5eae28d
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:53:02 2026 +0100

    Translated using Weblate (Norwegian Bokmål)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/nb_NO/

commit 3b467caa9007c553e4ae7de97f53b6e462161ea3
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:53:07 2026 +0100

    Translated using Weblate (Polish)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/pl/

commit 30fbbfba3572c8f78ec7c7e1a231e363aca1ef10
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:53:17 2026 +0100

    Translated using Weblate (Russian)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/ru/

commit 8cecb492cfcac0a1f93ee8919f7b41d978d331ee
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:52:42 2026 +0100

    Translated using Weblate (Italian)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/it/

commit f0d3d41029c89bfa83d5891ee7af70241f27b7be
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:52:38 2026 +0100

    Translated using Weblate (Hungarian)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/hu/

commit 102e0f1912d010d38755a1713abb2a7f7564aafb
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:53:21 2026 +0100

    Translated using Weblate (Slovak)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/sk/

commit 428b8f18cf6195a96b55109e0221413d82415a2f
Author: Максим Горпиніч <gorpinicmaksim0@gmail.com>
Date:   Sat Jan 10 08:55:28 2026 +0100

    Translated using Weblate (Ukrainian)

    Currently translated at 100.0% (1048 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/uk/

commit 1a71aaf279ecab26c0c1fede05025732e6dcfa5e
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:53:27 2026 +0100

    Translated using Weblate (Swedish)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/sv/

commit 36ec3701f3a1a904e7c42ac4ffbe6a050dc6d1ed
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:53:43 2026 +0100

    Translated using Weblate (Chinese (Simplified Han script))

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/zh_Hans/

commit 65d8b74b340c877cad2028b7142c783a1b568d49
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:52:48 2026 +0100

    Translated using Weblate (Japanese)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/ja/

commit 4d11d1d31022583657e93aee70301a8ffcde1340
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:52:52 2026 +0100

    Translated using Weblate (Korean)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/ko/

commit bd1135bcb965ad73cf493771b15081cc97cf513a
Author: Orhun <orhunavcu@gmail.com>
Date:   Fri Jan 9 22:53:33 2026 +0100

    Translated using Weblate (Turkish)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/tr/

commit 2c3d814119f4cf2dabd20933699f5b991f20f3e6
Author: Anonymous <noreply@weblate.org>
Date:   Fri Jan 9 22:52:32 2026 +0100

    Translated using Weblate (French)

    Currently translated at 99.9% (1047 of 1048 strings)

    Translation: AdventureLog/Web App
    Translate-URL: https://hosted.weblate.org/projects/adventurelog/web-app/fr/

* Refactor code structure and remove redundant code blocks for improved readability and maintainability

* fix: Correct appVersion to match the latest pre-release version

* fix: Add missing vulnerability reference for jaraco.context in .trivyignore

---------

Co-authored-by: Lars Lehmann <33843261+larsl-net@users.noreply.github.com>
Co-authored-by: Lars Lehmann <lars@lmail.eu>
Co-authored-by: Nick Petrushin <n.a.petrushin@gmail.com>
Author: Sean Morley · 2026-01-14 10:22:14 -05:00 · committed by GitHub
Parent: d954ac057d · Commit: c39ff2449e
206 changed files with 33830 additions and 14254 deletions


@@ -30,6 +30,12 @@ BACKEND_PORT=8016
DISABLE_REGISTRATION=False
# DISABLE_REGISTRATION_MESSAGE=Registration is disabled for this instance of AdventureLog.
# SOCIALACCOUNT_ALLOW_SIGNUP=False # When false, social providers cannot be used to create new user accounts when registration is disabled.
# FORCE_SOCIALACCOUNT_LOGIN=False # When true, only social login is allowed (no password login) and the login page will show only social providers or redirect directly to the first provider if only one is configured.
# ACCOUNT_EMAIL_VERIFICATION='none' # 'none', 'optional', 'mandatory' # You can change this as needed for your environment
# Optional: Use email
# https://adventurelog.app/docs/configuration/email.html
# EMAIL_BACKEND=email

(File diff suppressed because it is too large)


@@ -1,7 +1,9 @@
name: Trivy Security Scans
permissions:
contents: read # Minimal permissions needed for scanning source and images
contents: read
# Needed if you later add SARIF upload to GitHub Security
# security-events: write
on:
push:
@@ -13,7 +15,7 @@ on:
- main
- development
schedule:
- cron: "0 8 * * 1" # Weekly scan on Mondays at 8 AM UTC
- cron: "0 8 * * 1" # Weekly on Mondays at 8 AM UTC
jobs:
filesystem-scan:
@@ -33,6 +35,8 @@ jobs:
exit-code: 1
ignore-unfixed: true
severity: CRITICAL,HIGH
# Use .trivyignore to suppress known false positives
trivyignores: .trivyignore
image-scan:
name: Trivy Docker Image Scan (Backend & Frontend)
@@ -66,6 +70,7 @@ jobs:
exit-code: 1
ignore-unfixed: true
severity: CRITICAL,HIGH
trivyignores: .trivyignore
- name: Scan frontend Docker image with Trivy
uses: aquasecurity/trivy-action@master
@@ -75,3 +80,4 @@ jobs:
exit-code: 1
ignore-unfixed: true
severity: CRITICAL,HIGH
trivyignores: .trivyignore

.trivyignore (new file)

@@ -0,0 +1,13 @@
# Node.js / npm ecosystem
# glob CLI command injection (CVE-2025-64756)
# Only affects glob -c/--cmd; we only use glob as a library in Vite/SvelteKit.
CVE-2025-64756
# Go stdlib false positives in esbuild binary
# esbuild doesn't use the vulnerable archive/tar or crypto/x509 paths in a way that's exploitable.
CVE-2025-58183
CVE-2025-61729
# jaraco.context Has a Path Traversal Vulnerability Fixed via setuptools
GHSA-58pv-8j8x-9vj2

.vscode/settings.json (vendored)

@@ -35,6 +35,43 @@
"navbar.themes.forest",
"navbar.themes.night",
"navbar.themes.dark",
"navbar.themes.light"
"navbar.themes.light",
"transportation.modes.bike",
"transportation.modes.boat",
"transportation.modes.bus",
"transportation.modes.car",
"transportation.modes.other",
"transportation.modes.plane",
"transportation.modes.train",
"transportation.modes.walking",
"settings.account_settings",
"settings.email_change",
"settings.invalid_credentials",
"settings.mfa_required",
"settings.new_email",
"settings.no_emai_set",
"settings.not_verified",
"immich.no_images",
"immich.try_different_date",
"currencies.USD",
"currencies.EUR",
"currencies.GBP",
"currencies.JPY",
"currencies.AUD",
"currencies.CAD",
"currencies.CHF",
"currencies.CNY",
"currencies.HKD",
"currencies.SGD",
"currencies.SEK",
"currencies.NOK",
"currencies.DKK",
"currencies.NZD",
"currencies.INR",
"currencies.MXN",
"currencies.BRL",
"currencies.ZAR",
"currencies.AED",
"currencies.TRY"
]
}


@@ -1,5 +1,5 @@
AdventureLog: Self-hostable travel tracker and trip planner.
Copyright (C) 2023-2025 Sean Morley
Copyright (C) 2023-2026 Sean Morley
Contact: contact@seanmorley.com
This program is free software: you can redistribute it and/or modify


@@ -60,7 +60,7 @@ AdventureLog was created to solve a problem: the lack of a modern, open-source,
<img src="./brand/screenshots/dashboard.png" alt="Dashboard" />
<p>Displays a summary of your locations, including your world travel stats.</p>
<img src="./brand/screenshots/itinerary.png" alt="Itinerary" />
<p>Plan your adventures and travel itinerary with a list of activities and a map view. View your trip in a variety of ways, including an itinerary list, a map view, and a calendar view.</p>
<p>Plan your adventures and travel itinerary with a list of activities and a map view. View your trip in a variety of ways, including an itinerary list, a map view, and a calendar view. Order your plans and details to create the perfect trip.</p>
<img src="./brand/screenshots/countries.png" alt="Countries" />
<p>Lists all the countries you have visited and plan to visit, with the ability to filter by visit status.</p>
<img src="./brand/screenshots/regions.png" alt="Regions" />


@@ -69,5 +69,8 @@ RUN python3 manage.py collectstatic --noinput --verbosity 2
# Expose ports
EXPOSE 80 8000
# Start Supervisor
# Start with an entrypoint that runs init tasks then starts supervisord
ENTRYPOINT ["/code/entrypoint.sh"]
# Start supervisord to manage processes
CMD ["supervisord", "-c", "/etc/supervisor/conf.d/supervisord.conf"]


@@ -84,8 +84,4 @@ fi
cat /code/adventurelog.txt
# Start Gunicorn in foreground
exec gunicorn main.wsgi:application \
--bind [::]:8000 \
--workers 2 \
--timeout 120
exec "$@"
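The Dockerfile and entrypoint changes above split container startup into an ENTRYPOINT script that runs one-time init tasks and a CMD (supervisord) that the script receives as `"$@"`. Ending the script with `exec "$@"` replaces the shell with the CMD, so supervisord becomes PID 1 and receives container signals directly. A minimal sketch of the pattern (the init step and echoed command are illustrative, not the project's actual tasks):

```shell
#!/bin/sh
set -e

# run_cmd mimics the entrypoint tail: "$@" is the Dockerfile CMD.
# The real script uses a bare `exec "$@"` so the CMD takes over PID 1;
# the subshell here only keeps this demo alive after the handoff.
run_cmd() {
  ( exec "$@" )
}

# one-time init tasks (migrations, version banner, ...) would run here
run_cmd echo "supervisord would start now"
```

The same script then works with any CMD override (`docker run image bash`), since whatever follows the image name is simply exec'd.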


@@ -27,6 +27,10 @@ EMAIL_BACKEND='console'
# GOOGLE_MAPS_API_KEY='key'
# ACCOUNT_EMAIL_VERIFICATION='none' # 'none', 'optional', 'mandatory' # You can change this as needed for your environment
# FORCE_SOCIALACCOUNT_LOGIN=False # When true, only social login is allowed (no password login) and the login page will show only social providers or redirect directly to the first provider if only one is configured.
# ------------------- #
# For Developers to start a Demo Database


@@ -1,7 +1,8 @@
import os
from django.contrib import admin
from django.utils.html import mark_safe
from .models import Location, Checklist, ChecklistItem, Collection, Transportation, Note, ContentImage, Visit, Category, ContentAttachment, Lodging, CollectionInvite, Trail, Activity
from django.utils.html import mark_safe, format_html
from django.urls import reverse
from .models import Location, Checklist, ChecklistItem, Collection, Transportation, Note, ContentImage, Visit, Category, ContentAttachment, Lodging, CollectionInvite, Trail, Activity, CollectionItineraryItem, CollectionItineraryDay
from worldtravel.models import Country, Region, VisitedRegion, City, VisitedCity
from allauth.account.decorators import secure_admin_login
@@ -82,11 +83,11 @@ from users.models import CustomUser
class CustomUserAdmin(UserAdmin):
model = CustomUser
list_display = ['username', 'is_staff', 'is_active', 'image_display', 'measurement_system']
list_display = ['username', 'is_staff', 'is_active', 'image_display', 'measurement_system', 'default_currency']
readonly_fields = ('uuid',)
search_fields = ('username',)
fieldsets = UserAdmin.fieldsets + (
(None, {'fields': ('profile_pic', 'uuid', 'public_profile', 'disable_password', 'measurement_system')}),
(None, {'fields': ('profile_pic', 'uuid', 'public_profile', 'disable_password', 'measurement_system', 'default_currency')}),
)
def image_display(self, obj):
if obj.profile_pic:
@@ -146,6 +147,32 @@ class CollectionAdmin(admin.ModelAdmin):
class ActivityAdmin(admin.ModelAdmin):
list_display = ('name', 'user', 'visit__location', 'sport_type', 'distance', 'elevation_gain', 'moving_time')
class CollectionItineraryItemAdmin(admin.ModelAdmin):
list_display = ('collection', 'content_type', 'object_link', 'date', 'order')
search_fields = ('collection__name', 'content_type__model')
list_filter = ('content_type', 'date')
raw_id_fields = ('collection',)
readonly_fields = ('created_at',)
def object_link(self, obj):
"""
Display the generic related object; link to its admin change page if registered.
"""
linked_obj = obj.item
if not linked_obj:
return ""
try:
ct = obj.content_type
app_label = ct.app_label
model = ct.model
admin_url = reverse('admin:%s_%s_change' % (app_label, model), args=[obj.object_id])
return format_html('<a href="{}">{}</a>', admin_url, str(linked_obj))
except Exception:
# Fallback to plain text if any error (object not registered, missing id, etc.)
return str(linked_obj)
object_link.short_description = 'Item'
admin.site.register(CustomUser, CustomUserAdmin)
admin.site.register(Location, LocationAdmin)
admin.site.register(Collection, CollectionAdmin)
@@ -166,6 +193,8 @@ admin.site.register(Lodging)
admin.site.register(CollectionInvite, CollectionInviteAdmin)
admin.site.register(Trail)
admin.site.register(Activity, ActivityAdmin)
admin.site.register(CollectionItineraryItem, CollectionItineraryItemAdmin)
admin.site.register(CollectionItineraryDay)
admin.site.site_header = 'AdventureLog Admin'
admin.site.site_title = 'AdventureLog Admin Site'


@@ -1,6 +1,8 @@
import requests
import time
import socket
import re
import unicodedata
from worldtravel.models import Region, City, VisitedRegion, VisitedCity
from django.conf import settings
@@ -10,7 +12,7 @@ def search_google(query):
try:
api_key = settings.GOOGLE_MAPS_API_KEY
if not api_key:
return {"error": "Missing Google Maps API key"}
return {"error": "Geocoding service unavailable. Please check configuration."}
# Updated to use the new Places API (New) endpoint
url = "https://places.googleapis.com/v1/places:searchText"
@@ -34,7 +36,7 @@ def search_google(query):
# Check if we have places in the response
places = data.get("places", [])
if not places:
return {"error": "No results found"}
return {"error": "No locations found for the given query."}
results = []
for place in places:
@@ -71,11 +73,25 @@ def search_google(query):
return results
except requests.exceptions.RequestException as e:
return {"error": "Network error while contacting Google Maps", "details": str(e)}
except Exception as e:
return {"error": "Unexpected error during Google search", "details": str(e)}
except requests.exceptions.Timeout:
return {"error": "Request timed out while contacting Google Maps. Please try again."}
except requests.exceptions.ConnectionError:
return {"error": "Unable to connect to Google Maps service. Please check your internet connection."}
except requests.exceptions.HTTPError as e:
if response.status_code == 400:
return {"error": "Invalid request to Google Maps. Please check your query."}
elif response.status_code == 401:
return {"error": "Authentication failed with Google Maps. Please check API configuration."}
elif response.status_code == 403:
return {"error": "Access forbidden to Google Maps. Please check API permissions."}
elif response.status_code == 429:
return {"error": "Too many requests to Google Maps. Please try again later."}
else:
return {"error": "Google Maps service error. Please try again later."}
except requests.exceptions.RequestException:
return {"error": "Network error while contacting Google Maps. Please try again."}
except Exception:
return {"error": "An unexpected error occurred during Google search. Please try again."}
def _extract_google_category(types):
# Basic category inference based on common place types
@@ -111,82 +127,231 @@ def _infer_addresstype(type_):
def search_osm(query):
url = f"https://nominatim.openstreetmap.org/search?q={query}&format=jsonv2"
headers = {'User-Agent': 'AdventureLog Server'}
response = requests.get(url, headers=headers)
data = response.json()
try:
url = f"https://nominatim.openstreetmap.org/search?q={query}&format=jsonv2"
headers = {'User-Agent': 'AdventureLog Server'}
response = requests.get(url, headers=headers, timeout=(2, 5))
response.raise_for_status()
data = response.json()
return [{
"lat": item.get("lat"),
"lon": item.get("lon"),
"name": item.get("name"),
"display_name": item.get("display_name"),
"type": item.get("type"),
"category": item.get("category"),
"importance": item.get("importance"),
"addresstype": item.get("addresstype"),
"powered_by": "nominatim",
} for item in data]
return [{
"lat": item.get("lat"),
"lon": item.get("lon"),
"name": item.get("name"),
"display_name": item.get("display_name"),
"type": item.get("type"),
"category": item.get("category"),
"importance": item.get("importance"),
"addresstype": item.get("addresstype"),
"powered_by": "nominatim",
} for item in data]
except requests.exceptions.Timeout:
return {"error": "Request timed out while contacting OpenStreetMap. Please try again."}
except requests.exceptions.ConnectionError:
return {"error": "Unable to connect to OpenStreetMap service. Please check your internet connection."}
except requests.exceptions.HTTPError as e:
if response.status_code == 400:
return {"error": "Invalid request to OpenStreetMap. Please check your query."}
elif response.status_code == 429:
return {"error": "Too many requests to OpenStreetMap. Please try again later."}
else:
return {"error": "OpenStreetMap service error. Please try again later."}
except requests.exceptions.RequestException:
return {"error": "Network error while contacting OpenStreetMap. Please try again."}
except Exception:
return {"error": "An unexpected error occurred during OpenStreetMap search. Please try again."}
def search(query):
"""
Unified search function that tries Google Maps first, then falls back to OpenStreetMap.
"""
if getattr(settings, 'GOOGLE_MAPS_API_KEY', None):
google_result = search_google(query)
if "error" not in google_result:
return google_result
# If Google fails, fallback to OSM
return search_osm(query)
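The unified `search` above only tries Google when an API key is configured, and treats any dict containing an `"error"` key as a failure that triggers the Nominatim fallback. The control flow can be sketched with stand-in providers (the function and lambda names here are hypothetical, not part of the codebase):

```python
def search_with_fallback(query, primary, fallback):
    """Try the primary geocoder; a {'error': ...} dict triggers the fallback."""
    result = primary(query)
    if isinstance(result, dict) and "error" in result:
        return fallback(query)
    return result

# Stand-in providers: the primary fails, the fallback answers.
flaky_google = lambda q: {"error": "Too many requests"}
osm = lambda q: [{"name": q, "powered_by": "nominatim"}]
```

A successful primary short-circuits the fallback, so the second provider is only contacted on error.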
# -----------------
# REVERSE GEOCODING
# -----------------
def extractIsoCode(user, data):
"""
Extract the ISO code from the response data.
Returns a dictionary containing the region name, country name, and ISO code if found.
"""
iso_code = None
town_city_or_county = None
display_name = None
country_code = None
city = None
visited_city = None
location_name = None
"""
Extract the ISO code from the response data.
Returns a dictionary containing the region name, country name, and ISO code if found.
"""
iso_code = None
display_name = None
country_code = None
city = None
visited_city = None
location_name = None
# town = None
# city = None
# county = None
if 'name' in data.keys():
location_name = data['name']
if 'name' in data.keys():
location_name = data['name']
if 'address' in data.keys():
keys = data['address'].keys()
for key in keys:
if key.find("ISO") != -1:
iso_code = data['address'][key]
if 'town' in keys:
town_city_or_county = data['address']['town']
if 'county' in keys:
town_city_or_county = data['address']['county']
if 'city' in keys:
town_city_or_county = data['address']['city']
if not iso_code:
return {"error": "No region found"}
region = Region.objects.filter(id=iso_code).first()
visited_region = VisitedRegion.objects.filter(region=region, user=user).first()
region_visited = False
city_visited = False
country_code = iso_code[:2]
address = data.get('address', {}) or {}
# Capture country code early for ISO selection and name fallback.
country_code = address.get("ISO3166-1")
state_name = address.get("state")
# Prefer the most specific ISO 3166-2 code available before falling back to country-level.
# France gets lvl4 (regions) first for city matching, then lvl6 (departments) as a fallback.
preferred_iso_keys = (
[
"ISO3166-2-lvl10",
"ISO3166-2-lvl9",
"ISO3166-2-lvl8",
"ISO3166-2-lvl4",
"ISO3166-2-lvl6",
"ISO3166-2-lvl7",
"ISO3166-2-lvl5",
"ISO3166-2-lvl3",
"ISO3166-2-lvl2",
"ISO3166-2-lvl1",
"ISO3166-2",
]
if country_code == "FR"
else [
"ISO3166-2-lvl10",
"ISO3166-2-lvl9",
"ISO3166-2-lvl8",
"ISO3166-2-lvl4",
"ISO3166-2-lvl7",
"ISO3166-2-lvl6",
"ISO3166-2-lvl5",
"ISO3166-2-lvl3",
"ISO3166-2-lvl2",
"ISO3166-2-lvl1",
"ISO3166-2",
]
)
iso_candidates = []
for key in preferred_iso_keys:
value = address.get(key)
if value and value not in iso_candidates:
iso_candidates.append(value)
# If no region-level code, fall back to country code only as a last resort.
if not iso_candidates and "ISO3166-1" in address:
iso_candidates.append(address.get("ISO3166-1"))
iso_code = iso_candidates[0] if iso_candidates else None
region_candidates = []
for candidate in iso_candidates:
if len(str(candidate)) <= 2:
continue
match = Region.objects.filter(id=candidate).first()
if match and match not in region_candidates:
region_candidates.append(match)
region = region_candidates[0] if region_candidates else None
# Fallback: attempt to resolve region by name and country code when no ISO match.
if not region and state_name:
region_queryset = Region.objects.filter(name__iexact=state_name)
if country_code:
region_queryset = region_queryset.filter(country__country_code=country_code)
region = region_queryset.first()
if region:
if town_city_or_county:
display_name = f"{town_city_or_county}, {region.name}, {country_code}"
city = City.objects.filter(name__contains=town_city_or_county, region=region).first()
visited_city = VisitedCity.objects.filter(city=city, user=user).first()
iso_code = region.id
if not country_code:
country_code = region.country.country_code
if region not in region_candidates:
region_candidates.insert(0, region)
if visited_region:
region_visited = True
if visited_city:
city_visited = True
if region:
return {"region_id": iso_code, "region": region.name, "country": region.country.name, "country_id": region.country.country_code, "region_visited": region_visited, "display_name": display_name, "city": city.name if city else None, "city_id": city.id if city else None, "city_visited": city_visited, 'location_name': location_name}
if not region:
return {"error": "No region found"}
if not country_code:
country_code = region.country.country_code
region_visited = False
city_visited = False
# ordered preference for best-effort locality matching
locality_keys = [
'suburb',
'neighbourhood',
'neighborhood', # alternate spelling
'city',
'city_district',
'town',
'village',
'hamlet',
'locality',
'municipality',
'county',
]
def _normalize_name(value):
normalized = unicodedata.normalize("NFKD", value)
ascii_only = normalized.encode("ascii", "ignore").decode("ascii")
return re.sub(r"[^a-z0-9]", "", ascii_only.lower())
def match_locality(key_name, target_region):
value = address.get(key_name)
if not value:
return None
qs = City.objects.filter(region=target_region)
# Use exact matches first to avoid broad county/name collisions (e.g. Troms vs Tromsø).
exact_match = qs.filter(name__iexact=value).first()
if exact_match:
return exact_match
normalized_value = _normalize_name(value)
for candidate in qs.values_list('id', 'name'):
candidate_id, candidate_name = candidate
if _normalize_name(candidate_name) == normalized_value:
return qs.filter(id=candidate_id).first()
# Allow partial matching for most locality fields but keep county strict.
if key_name == 'county':
return None
return qs.filter(name__icontains=value).first()
chosen_region = region
for candidate_region in region_candidates or [region]:
for key_name in locality_keys:
city = match_locality(key_name, candidate_region)
if city:
chosen_region = candidate_region
iso_code = chosen_region.id
break
if city:
break
region = chosen_region
iso_code = region.id
visited_region = VisitedRegion.objects.filter(region=region, user=user).first()
region_visited = bool(visited_region)
if city:
display_name = f"{city.name}, {region.name}, {country_code or region.country.country_code}"
visited_city = VisitedCity.objects.filter(city=city, user=user).first()
city_visited = bool(visited_city)
else:
display_name = f"{region.name}, {country_code or region.country.country_code}"
return {
"region_id": iso_code,
"region": region.name,
"country": region.country.name,
"country_id": region.country.country_code,
"region_visited": region_visited,
"display_name": display_name,
"city": city.name if city else None,
"city_id": city.id if city else None,
"city_visited": city_visited,
'location_name': location_name,
}
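The locality matching above compares city names only after Unicode normalization, and the comment about "Troms vs Tromsø" is why exact matches are tried first: letters with no ASCII decomposition (like `ø`) are simply dropped, so distinct names can collapse to the same key. A standalone copy of the normalization helper shows the effect:

```python
import re
import unicodedata

def normalize_name(value: str) -> str:
    # NFKD splits accented letters into base + combining marks, which the
    # ASCII encode/ignore step strips; letters with no decomposition (ø) vanish.
    normalized = unicodedata.normalize("NFKD", value)
    ascii_only = normalized.encode("ascii", "ignore").decode("ascii")
    return re.sub(r"[^a-z0-9]", "", ascii_only.lower())
```

Here `"São Paulo"` normalizes cleanly, but `"Tromsø"` and `"Troms"` become identical, which is exactly the collision the exact-match-first ordering guards against.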
def is_host_resolvable(hostname: str) -> bool:
try:
socket.gethostbyname(hostname)
@@ -210,15 +375,28 @@ def reverse_geocode_osm(lat, lon, user):
read_timeout = 5
if not is_host_resolvable("nominatim.openstreetmap.org"):
return {"error": "DNS resolution failed"}
return {"error": "Unable to resolve OpenStreetMap service. Please check your internet connection."}
try:
response = requests.get(url, headers=headers, timeout=(connect_timeout, read_timeout))
response.raise_for_status()
data = response.json()
return extractIsoCode(user, data)
except requests.exceptions.Timeout:
return {"error": "Request timed out while contacting OpenStreetMap. Please try again."}
except requests.exceptions.ConnectionError:
return {"error": "Unable to connect to OpenStreetMap service. Please check your internet connection."}
except requests.exceptions.HTTPError as e:
if response.status_code == 400:
return {"error": "Invalid request to OpenStreetMap. Please check coordinates."}
elif response.status_code == 429:
return {"error": "Too many requests to OpenStreetMap. Please try again later."}
else:
return {"error": "OpenStreetMap service error. Please try again later."}
except requests.exceptions.RequestException:
return {"error": "Network error while contacting OpenStreetMap. Please try again."}
except Exception:
return {"error": "An internal error occurred while processing the request"}
return {"error": "An unexpected error occurred during OpenStreetMap geocoding. Please try again."}
def reverse_geocode_google(lat, lon, user):
api_key = settings.GOOGLE_MAPS_API_KEY
@@ -229,12 +407,22 @@ def reverse_geocode_google(lat, lon, user):
params = {"latlng": f"{lat},{lon}", "key": api_key}
try:
response = requests.get(url, params=params)
response = requests.get(url, params=params, timeout=(2, 5))
response.raise_for_status()
data = response.json()
if data.get("status") != "OK":
return {"error": "Geocoding failed"}
status = data.get("status")
if status != "OK":
if status == "ZERO_RESULTS":
return {"error": "No location found for the given coordinates."}
elif status == "OVER_QUERY_LIMIT":
return {"error": "Query limit exceeded for Google Maps. Please try again later."}
elif status == "REQUEST_DENIED":
return {"error": "Request denied by Google Maps. Please check API configuration."}
elif status == "INVALID_REQUEST":
return {"error": "Invalid request to Google Maps. Please check coordinates."}
else:
return {"error": "Geocoding failed. Please try again."}
# Convert Google schema to Nominatim-style for extractIsoCode
first_result = data.get("results", [])[0]
@@ -243,8 +431,25 @@ def reverse_geocode_google(lat, lon, user):
"address": _parse_google_address_components(first_result.get("address_components", []))
}
return extractIsoCode(user, result_data)
except requests.exceptions.Timeout:
return {"error": "Request timed out while contacting Google Maps. Please try again."}
except requests.exceptions.ConnectionError:
return {"error": "Unable to connect to Google Maps service. Please check your internet connection."}
except requests.exceptions.HTTPError as e:
if response.status_code == 400:
return {"error": "Invalid request to Google Maps. Please check coordinates."}
elif response.status_code == 401:
return {"error": "Authentication failed with Google Maps. Please check API configuration."}
elif response.status_code == 403:
return {"error": "Access forbidden to Google Maps. Please check API permissions."}
elif response.status_code == 429:
return {"error": "Too many requests to Google Maps. Please try again later."}
else:
return {"error": "Google Maps service error. Please try again later."}
except requests.exceptions.RequestException:
return {"error": "Network error while contacting Google Maps. Please try again."}
except Exception:
return {"error": "An internal error occurred while processing the request"}
return {"error": "An unexpected error occurred during Google geocoding. Please try again."}
def _parse_google_address_components(components):
parsed = {}
@@ -265,13 +470,23 @@ def _parse_google_address_components(components):
state_code = short_name
if "administrative_area_level_2" in types:
parsed["county"] = long_name
if "administrative_area_level_3" in types:
parsed["municipality"] = long_name
if "locality" in types:
parsed["city"] = long_name
if "sublocality" in types:
parsed["town"] = long_name
if "postal_town" in types:
parsed.setdefault("city", long_name)
if "sublocality" in types or any(t.startswith("sublocality_level_") for t in types):
parsed["suburb"] = long_name
if "neighborhood" in types:
parsed["neighbourhood"] = long_name
if "route" in types:
parsed["road"] = long_name
if "street_address" in types:
parsed["address"] = long_name
# Build composite ISO 3166-2 code like US-ME
# Build composite ISO 3166-2 code like US-ME (matches Region.id in DB)
if country_code and state_code:
parsed["ISO3166-2-lvl1"] = f"{country_code}-{state_code}"
return parsed
return parsed
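The parser above adapts Google's `address_components` schema to the Nominatim-style keys that `extractIsoCode` expects, including the composite ISO 3166-2 code. A reduced sketch of the mapping (covering only a few of the component types handled in the diff; the sample data is illustrative):

```python
def parse_address_components(components):
    """Reduced mapper: Google address_components -> Nominatim-style keys."""
    parsed = {}
    country_code = state_code = None
    for comp in components:
        types = comp.get("types", [])
        if "country" in types:
            country_code = comp.get("short_name")
            parsed["ISO3166-1"] = country_code
        if "administrative_area_level_1" in types:
            parsed["state"] = comp.get("long_name")
            state_code = comp.get("short_name")
        if "locality" in types:
            parsed["city"] = comp.get("long_name")
    # Composite ISO 3166-2 code like US-ME matches Region.id in the DB
    if country_code and state_code:
        parsed["ISO3166-2-lvl1"] = f"{country_code}-{state_code}"
    return parsed

sample = [
    {"types": ["country", "political"], "long_name": "United States", "short_name": "US"},
    {"types": ["administrative_area_level_1"], "long_name": "Maine", "short_name": "ME"},
    {"types": ["locality"], "long_name": "Portland", "short_name": "Portland"},
]
```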


@@ -0,0 +1,259 @@
"""
Django management command to synchronize visited regions and cities based on user locations.
This command processes all users' visited locations and marks their regions and cities as visited.
It's designed to be run periodically (e.g., nightly cron job) to keep visited regions/cities up to date.
Usage:
python manage.py sync_visited_regions
python manage.py sync_visited_regions --dry-run
python manage.py sync_visited_regions --user-id 123
python manage.py sync_visited_regions --batch-size 50
"""
from django.core.management.base import BaseCommand, CommandError
from django.contrib.auth import get_user_model
from django.db import transaction
from django.db.models import Prefetch, Q
from adventures.models import Location
from worldtravel.models import Region, City, VisitedRegion, VisitedCity
from collections import defaultdict
import logging
logger = logging.getLogger(__name__)
User = get_user_model()
class Command(BaseCommand):
help = 'Synchronize visited regions and cities based on user locations'
def add_arguments(self, parser):
parser.add_argument(
'--dry-run',
action='store_true',
help='Show what would be created without making changes',
)
parser.add_argument(
'--user-id',
type=int,
help='Sync visited regions for a specific user ID only',
)
parser.add_argument(
'--batch-size',
type=int,
default=100,
help='Number of users to process in each batch (default: 100)',
)
parser.add_argument(
'--verbose',
action='store_true',
help='Show detailed output for each user',
)
def handle(self, *args, **options):
dry_run = options['dry_run']
user_id = options.get('user_id')
batch_size = options['batch_size']
verbose = options['verbose']
if dry_run:
self.stdout.write(
self.style.WARNING('DRY RUN MODE - No changes will be made')
)
# Build user queryset
users_queryset = User.objects.all()
if user_id:
users_queryset = users_queryset.filter(id=user_id)
if not users_queryset.exists():
raise CommandError(f'User with ID {user_id} not found')
total_users = users_queryset.count()
if total_users == 0:
self.stdout.write(self.style.WARNING('No users found'))
return
self.stdout.write(f'Processing {total_users} user(s)...\n')
# Track overall statistics
total_new_regions = 0
total_new_cities = 0
users_processed = 0
users_with_changes = 0
# Process users in batches to manage memory
user_ids = list(users_queryset.values_list('id', flat=True))
for i in range(0, len(user_ids), batch_size):
batch_user_ids = user_ids[i:i + batch_size]
for user_id in batch_user_ids:
try:
new_regions, new_cities = self._process_user(
user_id, dry_run, verbose
)
total_new_regions += new_regions
total_new_cities += new_cities
users_processed += 1
if new_regions > 0 or new_cities > 0:
users_with_changes += 1
# Progress indicator for large batches
if users_processed % 50 == 0:
self.stdout.write(
f'Processed {users_processed}/{total_users} users...'
)
except Exception as e:
self.stdout.write(
self.style.ERROR(
f'Error processing user {user_id}: {str(e)}'
)
)
logger.exception(f'Error processing user {user_id}')
# Summary
self.stdout.write('\n' + '='*60)
if dry_run:
self.stdout.write(
self.style.SUCCESS(
f'DRY RUN COMPLETE:\n'
f' Users processed: {users_processed}\n'
f' Users with changes: {users_with_changes}\n'
f' Would create {total_new_regions} new visited regions\n'
f' Would create {total_new_cities} new visited cities'
)
)
else:
self.stdout.write(
self.style.SUCCESS(
f'SYNC COMPLETE:\n'
f' Users processed: {users_processed}\n'
f' Users with changes: {users_with_changes}\n'
f' Created {total_new_regions} new visited regions\n'
f' Created {total_new_cities} new visited cities'
)
)
def _process_user(self, user_id, dry_run=False, verbose=False):
"""
Process a single user and return counts of new regions and cities.
Returns: (new_regions_count, new_cities_count)
"""
# Get all visited locations with their region and city data in a single query
visited_locations = Location.objects.filter(
user_id=user_id
).select_related('region', 'city')
# Collect unique regions and cities from visited locations
regions_to_mark = set()
cities_to_mark = set()
for location in visited_locations:
# Only process locations that are marked as visited
if not location.is_visited_status():
continue
if location.region_id:
regions_to_mark.add(location.region_id)
if location.city_id:
cities_to_mark.add(location.city_id)
# Early exit if no regions or cities to mark
if not regions_to_mark and not cities_to_mark:
return 0, 0
new_regions_count = 0
new_cities_count = 0
# Process regions
if regions_to_mark:
new_regions_count = self._sync_visited_regions(
user_id, regions_to_mark, dry_run
)
# Process cities
if cities_to_mark:
new_cities_count = self._sync_visited_cities(
user_id, cities_to_mark, dry_run
)
if verbose and (new_regions_count > 0 or new_cities_count > 0):
self.stdout.write(
f'User {user_id}: '
f'{new_regions_count} new regions, '
f'{new_cities_count} new cities'
)
return new_regions_count, new_cities_count
def _sync_visited_regions(self, user_id, region_ids, dry_run=False):
"""Sync visited regions for a user. Returns count of new regions created."""
# Get existing visited regions for this user in one query
existing_visited_regions = set(
VisitedRegion.objects.filter(
user_id=user_id,
region_id__in=region_ids
).values_list('region_id', flat=True)
)
# Determine which regions need to be created
regions_to_create = region_ids - existing_visited_regions
if not regions_to_create:
return 0
if dry_run:
return len(regions_to_create)
# Bulk create new VisitedRegion entries
new_visited_regions = [
VisitedRegion(region_id=region_id, user_id=user_id)
for region_id in regions_to_create
]
with transaction.atomic():
VisitedRegion.objects.bulk_create(
new_visited_regions,
ignore_conflicts=True # Handle race conditions gracefully
)
return len(regions_to_create)
def _sync_visited_cities(self, user_id, city_ids, dry_run=False):
"""Sync visited cities for a user. Returns count of new cities created."""
# Get existing visited cities for this user in one query
existing_visited_cities = set(
VisitedCity.objects.filter(
user_id=user_id,
city_id__in=city_ids
).values_list('city_id', flat=True)
)
# Determine which cities need to be created
cities_to_create = city_ids - existing_visited_cities
if not cities_to_create:
return 0
if dry_run:
return len(cities_to_create)
# Bulk create new VisitedCity entries
new_visited_cities = [
VisitedCity(city_id=city_id, user_id=user_id)
for city_id in cities_to_create
]
with transaction.atomic():
VisitedCity.objects.bulk_create(
new_visited_cities,
ignore_conflicts=True # Handle race conditions gracefully
)
return len(cities_to_create)
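Both `_sync_visited_regions` and `_sync_visited_cities` follow the same idempotent recipe: set-difference the wanted IDs against existing rows, then `bulk_create(..., ignore_conflicts=True)`. That makes the nightly cron safe to re-run, since already-visited rows produce an empty plan. The planning step in isolation (pure Python, no ORM; the IDs are illustrative):

```python
def plan_missing(wanted_ids, existing_ids):
    """Rows to insert = wanted minus already-present; safe to re-run nightly."""
    return set(wanted_ids) - set(existing_ids)
```

`ignore_conflicts=True` then covers the race where another process inserts the same row between the read and the bulk insert.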


@@ -0,0 +1,32 @@
# Generated by Django 5.2.6 on 2025-12-15 16:46
import django.db.models.deletion
import uuid
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('adventures', '0063_alter_activity_timezone_alter_lodging_timezone_and_more'),
('contenttypes', '0002_remove_content_type_name'),
]
operations = [
migrations.CreateModel(
name='CollectionItineraryItem',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('object_id', models.UUIDField()),
('date', models.DateField(blank=True, null=True)),
('order', models.PositiveIntegerField(help_text='Manual order within a day')),
('created_at', models.DateTimeField(auto_now_add=True)),
('collection', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='itinerary_items', to='adventures.collection')),
('content_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.contenttype')),
],
options={
'ordering': ['date', 'order'],
'unique_together': {('collection', 'date', 'order')},
},
),
]


@@ -0,0 +1,23 @@
# Generated by Django 5.2.6 on 2025-12-27 00:20
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('adventures', '0064_collectionitineraryitem'),
]
operations = [
migrations.AddField(
model_name='transportation',
name='end_code',
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='transportation',
name='start_code',
field=models.CharField(blank=True, max_length=100, null=True),
),
]


@@ -0,0 +1,19 @@
# Generated by Django 5.2.6 on 2026-01-02 18:11
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('adventures', '0065_transportation_end_code_transportation_start_code'),
]
operations = [
migrations.AddField(
model_name='collection',
name='primary_image',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='primary_for_collections', to='adventures.contentimage'),
),
]


@@ -0,0 +1,48 @@
from django.db import migrations
import djmoney.models.fields
class Migration(migrations.Migration):
dependencies = [
("adventures", "0066_collection_primary_image"),
]
operations = [
migrations.AddField(
model_name="lodging",
name="price_currency",
field=djmoney.models.fields.CurrencyField(
default="USD", editable=False, max_length=3
),
),
migrations.AddField(
model_name="transportation",
name="price_currency",
field=djmoney.models.fields.CurrencyField(
default="USD", editable=False, max_length=3
),
),
migrations.AlterField(
model_name="lodging",
name="price",
field=djmoney.models.fields.MoneyField(
blank=True,
decimal_places=2,
default_currency="USD",
max_digits=12,
null=True,
),
),
migrations.AddField(
model_name="transportation",
name="price",
field=djmoney.models.fields.MoneyField(
blank=True,
decimal_places=2,
default_currency="USD",
max_digits=12,
null=True,
),
),
]
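The djmoney migration above stores each price as two columns: a DECIMAL(12,2) amount (`price`) and a non-editable 3-character currency code (`price_currency`, defaulting to USD). A plain-`Decimal` stand-in showing the shape of the stored pair (the helper itself is illustrative, not djmoney's API):

```python
from decimal import Decimal

def as_price_columns(amount, currency="USD"):
    # max_digits=12, decimal_places=2 -> amounts are quantized to cents
    value = Decimal(str(amount)).quantize(Decimal("0.01"))
    assert len(currency) == 3, "currency codes are 3 letters (ISO 4217)"
    return {"price": value, "price_currency": currency}
```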

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@@ -0,0 +1,33 @@
# Generated by Django 5.2.8 on 2026-01-05 17:06
import django.db.models.deletion
import uuid
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('adventures', '0069_location_price_location_price_currency'),
]
operations = [
migrations.CreateModel(
name='CollectionItineraryDay',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('date', models.DateField()),
('name', models.CharField(blank=True, max_length=200, null=True)),
('description', models.TextField(blank=True, null=True)),
('created_at', models.DateTimeField(auto_now_add=True)),
('updated_at', models.DateTimeField(auto_now=True)),
('collection', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='itinerary_days', to='adventures.collection')),
],
options={
'verbose_name': 'Collection Itinerary Day',
'verbose_name_plural': 'Collection Itinerary Days',
'ordering': ['date'],
'unique_together': {('collection', 'date')},
},
),
]


@@ -0,0 +1,31 @@
# Generated by Django 5.2.8 on 2026-01-06 16:19
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('adventures', '0070_collectionitineraryday'),
('contenttypes', '0002_remove_content_type_name'),
]
operations = [
migrations.AlterUniqueTogether(
name='collectionitineraryitem',
unique_together=set(),
),
migrations.AddField(
model_name='collectionitineraryitem',
name='is_global',
field=models.BooleanField(default=False, help_text='Applies to the whole trip (no specific date)'),
),
migrations.AddConstraint(
model_name='collectionitineraryitem',
constraint=models.UniqueConstraint(condition=models.Q(('is_global', False), ('date__isnull', False)), fields=('collection', 'date', 'order'), name='unique_order_per_collection_day'),
),
migrations.AddConstraint(
model_name='collectionitineraryitem',
constraint=models.UniqueConstraint(condition=models.Q(('is_global', True)), fields=('collection', 'order'), name='unique_order_per_collection_global'),
),
]
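The migration above replaces the old blanket `unique_together` with two partial unique constraints: dated (non-global) itinerary items are unique on (collection, date, order), while global items are unique on (collection, order), so each namespace keeps its own order sequence. The invariant those conditions enforce can be checked in plain Python (the tuple layout is illustrative):

```python
def violates_order_constraints(items):
    """items: (collection_id, date, order, is_global) tuples.
    Mirrors the two conditional UniqueConstraints from the migration."""
    seen = set()
    for collection, date, order, is_global in items:
        if is_global:
            key = (collection, "global", order)
        elif date is not None:
            key = (collection, date, order)
        else:
            continue  # undated, non-global rows are left unconstrained
        if key in seen:
            return True
        seen.add(key)
    return False
```

Note that a dated item and a global item may share the same order value without conflict, since they fall under different constraints.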


@@ -1,4 +1,3 @@
from django.core.exceptions import ValidationError
import os
import uuid
from django.db import models
@@ -7,8 +6,8 @@ from adventures.managers import LocationManager
import threading
from django.contrib.auth import get_user_model
from django.contrib.postgres.fields import ArrayField
from django_resized import ResizedImageField
from djmoney.models.fields import MoneyField
from worldtravel.models import City, Country, Region, VisitedCity, VisitedRegion
from django.core.exceptions import ValidationError
from django.utils import timezone
@@ -16,6 +15,7 @@ from adventures.utils.timezones import TIMEZONES
from adventures.utils.sports_types import SPORT_TYPE_CHOICES
from adventures.utils.get_is_visited import is_location_visited
from django.contrib.contenttypes.fields import GenericForeignKey
from django.db.models import Q
from django.contrib.contenttypes.models import ContentType
from django.contrib.contenttypes.fields import GenericRelation
@@ -159,6 +159,7 @@ class Location(models.Model):
tags = ArrayField(models.CharField(max_length=100), blank=True, null=True)
description = models.TextField(blank=True, null=True)
rating = models.FloatField(blank=True, null=True)
price = MoneyField(max_digits=12, decimal_places=2, default_currency='USD', null=True, blank=True)
link = models.URLField(blank=True, null=True, max_length=2083)
is_public = models.BooleanField(default=False)
longitude = models.DecimalField(max_digits=9, decimal_places=6, null=True, blank=True)
@@ -286,6 +287,13 @@ class Collection(models.Model):
is_archived = models.BooleanField(default=False)
shared_with = models.ManyToManyField(User, related_name='shared_with', blank=True)
link = models.URLField(blank=True, null=True, max_length=2083)
primary_image = models.ForeignKey(
'ContentImage',
on_delete=models.SET_NULL,
related_name='primary_for_collections',
null=True,
blank=True,
)
# if connected locations are private and collection is public, raise an error
def clean(self):
@@ -305,6 +313,7 @@ class Transportation(models.Model):
name = models.CharField(max_length=200)
description = models.TextField(blank=True, null=True)
rating = models.FloatField(blank=True, null=True)
price = MoneyField(max_digits=12, decimal_places=2, default_currency='USD', null=True, blank=True)
link = models.URLField(blank=True, null=True, max_length=2083)
date = models.DateTimeField(blank=True, null=True)
end_date = models.DateTimeField(blank=True, null=True)
@@ -316,6 +325,8 @@ class Transportation(models.Model):
origin_longitude = models.DecimalField(max_digits=9, decimal_places=6, null=True, blank=True)
destination_latitude = models.DecimalField(max_digits=9, decimal_places=6, null=True, blank=True)
destination_longitude = models.DecimalField(max_digits=9, decimal_places=6, null=True, blank=True)
start_code = models.CharField(max_length=100, blank=True, null=True) # Could be airport code, station code, etc.
end_code = models.CharField(max_length=100, blank=True, null=True) # Could be airport code, station code, etc.
to_location = models.CharField(max_length=200, blank=True, null=True)
is_public = models.BooleanField(default=False)
collection = models.ForeignKey('Collection', on_delete=models.CASCADE, blank=True, null=True)
@@ -551,7 +562,7 @@ class Lodging(models.Model):
check_out = models.DateTimeField(blank=True, null=True)
timezone = models.CharField(max_length=50, choices=[(tz, tz) for tz in TIMEZONES], null=True, blank=True)
reservation_number = models.CharField(max_length=100, blank=True, null=True)
price = MoneyField(max_digits=12, decimal_places=2, default_currency='USD', null=True, blank=True)
latitude = models.DecimalField(max_digits=9, decimal_places=6, null=True, blank=True)
longitude = models.DecimalField(max_digits=9, decimal_places=6, null=True, blank=True)
location = models.CharField(max_length=200, blank=True, null=True)
@@ -673,4 +684,104 @@ class Activity(models.Model):
class Meta:
verbose_name = "Activity"
verbose_name_plural = "Activities"
class CollectionItineraryDay(models.Model):
"""Metadata for a specific day in a collection's itinerary"""
id = models.UUIDField(default=uuid.uuid4, editable=False, unique=True, primary_key=True)
collection = models.ForeignKey('Collection', on_delete=models.CASCADE, related_name='itinerary_days')
date = models.DateField()
name = models.CharField(max_length=200, blank=True, null=True)
description = models.TextField(blank=True, null=True)
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
class Meta:
unique_together = [['collection', 'date']]
ordering = ['date']
verbose_name = "Collection Itinerary Day"
verbose_name_plural = "Collection Itinerary Days"
def __str__(self):
return f"{self.collection.name} - {self.date} - {self.name or 'Unnamed Day'}"
class CollectionItineraryItem(models.Model):
id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
collection = models.ForeignKey(
Collection,
on_delete=models.CASCADE,
related_name="itinerary_items"
)
# Generic reference to Visit, Transportation, Lodging, Note, etc
content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
object_id = models.UUIDField()
item = GenericForeignKey("content_type", "object_id")
# Placement (planning concern, not content concern)
# Either a specific date or marked as trip-wide (global). Exactly one of these applies.
date = models.DateField(blank=True, null=True)
is_global = models.BooleanField(default=False, help_text="Applies to the whole trip (no specific date)")
order = models.PositiveIntegerField(help_text="Manual order within a day")
created_at = models.DateTimeField(auto_now_add=True)
class Meta:
ordering = ["date", "order"]
constraints = [
# Ensure unique order per day for dated items
models.UniqueConstraint(
fields=["collection", "date", "order"],
name="unique_order_per_collection_day",
condition=Q(is_global=False) & Q(date__isnull=False),
),
# Ensure unique order within the global group for a collection
models.UniqueConstraint(
fields=["collection", "order"],
name="unique_order_per_collection_global",
condition=Q(is_global=True),
),
]
def __str__(self):
scope = "GLOBAL" if self.is_global else str(self.date)
return f"{self.collection.name} - {self.content_type.model} - {scope} ({self.order})"
def clean(self):
# Enforce XOR between date and is_global
if self.is_global and self.date is not None:
raise ValidationError({
"is_global": "Global items must not have a date.",
"date": "Provide either a date or set is_global, not both.",
})
if (not self.is_global) and self.date is None:
raise ValidationError({
"date": "Dated items must include a date. To create a trip-wide item, set is_global=true.",
})
@property
def start_datetime(self):
obj = self.item
for field in ("start_date", "check_in", "date"):
if hasattr(obj, field):
value = getattr(obj, field)
if value:
return value
return None
@property
def end_datetime(self):
obj = self.item
for field in ("end_date", "check_out"):
if hasattr(obj, field):
value = getattr(obj, field)
if value:
return value
return None
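The `start_datetime`/`end_datetime` properties above resolve the linked object's timestamp by duck typing over a fixed field-priority list. A minimal standalone sketch of the same lookup (the `resolve` helper and `SimpleNamespace` stand-ins are illustrative, not part of the codebase):

```python
from types import SimpleNamespace
from datetime import datetime

# Fields probed in priority order, mirroring the model properties.
START_FIELDS = ("start_date", "check_in", "date")
END_FIELDS = ("end_date", "check_out")

def resolve(obj, fields):
    """Return the first truthy attribute from `fields`, else None."""
    for field in fields:
        value = getattr(obj, field, None)
        if value:
            return value
    return None

# A lodging-like object exposes check_in/check_out; a transportation-like
# object exposes date/end_date. Both resolve without any type checks.
lodging = SimpleNamespace(check_in=datetime(2026, 6, 1, 15, 0),
                          check_out=datetime(2026, 6, 3, 11, 0))
transport = SimpleNamespace(date=datetime(2026, 6, 1, 9, 30), end_date=None)

print(resolve(lodging, START_FIELDS))   # 2026-06-01 15:00:00
print(resolve(transport, END_FIELDS))   # None
```

Because every supported content type (Visit, Lodging, Transportation, Note) exposes at least one of these fields, the itinerary item never needs to branch on the concrete model class.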


@@ -23,15 +23,28 @@ class IsPublicReadOnly(permissions.BasePermission):
class CollectionShared(permissions.BasePermission):
"""
Permission class for collection sharing functionality.
This permission handles access control for collections and objects that are shared
through collections. It supports:
- Collection invite acceptance/declining for users with pending invites
- Full access for collection owners and shared users
- Read-only access for public objects (anonymous and authenticated users)
- Write access for owners and users shared via collections
Access Rules:
- Anonymous users: read-only access to public objects
- Invited users: can accept/decline invites for collections they're invited to
- Collection owners: full access to their collections and related objects
- Shared users: full access to collections they're shared with and related objects
- Public access: read-only for objects marked as public
"""
def has_object_permission(self, request, view, obj):
user = request.user
if not user or not user.is_authenticated:
# Anonymous: only read public
return request.method in permissions.SAFE_METHODS and getattr(obj, 'is_public', False)
# Special case for accept_invite and decline_invite actions
# Allow access if user has a pending invite for this collection
@@ -55,10 +68,10 @@ class CollectionShared(permissions.BasePermission):
# Read permission if public or owner
if request.method in permissions.SAFE_METHODS:
return getattr(obj, 'is_public', False) or getattr(obj, 'user', None) == user
# Write permission only if owner or shared user via collections
if getattr(obj, 'user', None) == user:
return True
if hasattr(obj, 'collections'):
@@ -120,6 +133,12 @@ class IsOwnerOrSharedWithFullAccess(permissions.BasePermission):
if hasattr(obj, 'location'):
obj = obj.location
if type(obj).__name__ == 'CollectionItineraryItem':
if hasattr(obj, 'object_id') and hasattr(obj, 'content_type'):
content_object = obj.content_type.get_object_for_this_type(id=obj.object_id)
obj = content_object
# Anonymous users only get read access to public objects
if not user or not user.is_authenticated:
return is_safe_method and getattr(obj, 'is_public', False)


@@ -1,5 +1,5 @@
import os
from .models import Location, ContentImage, ChecklistItem, Collection, Note, Transportation, Checklist, Visit, Category, ContentAttachment, Lodging, CollectionInvite, Trail, Activity, CollectionItineraryItem, CollectionItineraryDay
from rest_framework import serializers
from main.utils import CustomModelSerializer
from users.serializers import CustomUserDetailsSerializer
@@ -7,11 +7,38 @@ from worldtravel.serializers import CountrySerializer, RegionSerializer, CitySer
from geopy.distance import geodesic
from integrations.models import ImmichIntegration
from adventures.utils.geojson import gpx_to_geojson
import gpxpy
import logging
logger = logging.getLogger(__name__)
def _build_profile_pic_url(user):
"""Return absolute-ish profile pic URL using PUBLIC_URL if available."""
if not getattr(user, 'profile_pic', None):
return None
public_url = os.environ.get('PUBLIC_URL', 'http://127.0.0.1:8000').rstrip('/')
public_url = public_url.replace("'", "")
return f"{public_url}/media/{user.profile_pic.name}"
def _serialize_collaborator(user, owner_id=None, request_user=None):
if not user:
return None
return {
'uuid': str(user.uuid),
'username': user.username,
'first_name': user.first_name,
'last_name': user.last_name,
'profile_pic': _build_profile_pic_url(user),
'public_profile': bool(getattr(user, 'public_profile', False)),
'is_owner': owner_id == user.id,
'is_current_user': bool(request_user and request_user.id == user.id),
}
class ContentImageSerializer(CustomModelSerializer):
class Meta:
model = ContentImage
@@ -205,6 +232,31 @@ class VisitSerializer(serializers.ModelSerializer):
if not validated_data.get('end_date') and validated_data.get('start_date'):
validated_data['end_date'] = validated_data['start_date']
return super().create(validated_data)
class CalendarVisitSerializer(serializers.ModelSerializer):
class Meta:
model = Visit
fields = ['id', 'start_date', 'end_date', 'timezone']
class CalendarLocationSerializer(serializers.ModelSerializer):
visits = CalendarVisitSerializer(many=True, read_only=True)
category = serializers.SerializerMethodField()
class Meta:
model = Location
fields = ['id', 'name', 'location', 'category', 'visits']
def get_category(self, obj):
if not obj.category:
return None
return {
"name": obj.category.name,
"icon": obj.category.icon,
}
class LocationSerializer(CustomModelSerializer):
images = serializers.SerializerMethodField()
@@ -227,7 +279,8 @@ class LocationSerializer(CustomModelSerializer):
fields = [
'id', 'name', 'description', 'rating', 'tags', 'location',
'is_public', 'collections', 'created_at', 'updated_at', 'images', 'link', 'longitude',
'latitude', 'visits', 'is_visited', 'category', 'attachments', 'user', 'city', 'country', 'region', 'trails'
'latitude', 'visits', 'is_visited', 'category', 'attachments', 'user', 'city', 'country', 'region', 'trails',
'price', 'price_currency'
]
read_only_fields = ['id', 'created_at', 'updated_at', 'user', 'is_visited']
@@ -424,17 +477,19 @@ class TransportationSerializer(CustomModelSerializer):
distance = serializers.SerializerMethodField()
images = serializers.SerializerMethodField()
attachments = serializers.SerializerMethodField()
travel_duration_minutes = serializers.SerializerMethodField()
class Meta:
model = Transportation
fields = [
'id', 'user', 'type', 'name', 'description', 'rating', 'price', 'price_currency',
'link', 'date', 'flight_number', 'from_location', 'to_location',
'is_public', 'collection', 'created_at', 'updated_at', 'end_date',
'origin_latitude', 'origin_longitude', 'destination_latitude', 'destination_longitude',
'start_timezone', 'end_timezone', 'distance', 'images', 'attachments', 'start_code', 'end_code',
'travel_duration_minutes'
]
read_only_fields = ['id', 'created_at', 'updated_at', 'user', 'distance', 'travel_duration_minutes']
def get_images(self, obj):
serializer = ContentImageSerializer(obj.images.all(), many=True, context=self.context)
@@ -447,6 +502,10 @@ class TransportationSerializer(CustomModelSerializer):
return [attachment for attachment in serializer.data if attachment is not None]
def get_distance(self, obj):
gpx_distance = self._get_gpx_distance_km(obj)
if gpx_distance is not None:
return gpx_distance
if (
obj.origin_latitude and obj.origin_longitude and
obj.destination_latitude and obj.destination_longitude
@@ -459,6 +518,68 @@ class TransportationSerializer(CustomModelSerializer):
return None
return None
def _get_gpx_distance_km(self, obj):
gpx_attachments = obj.attachments.filter(file__iendswith='.gpx')
for attachment in gpx_attachments:
distance_km = self._parse_gpx_distance_km(attachment.file)
if distance_km is not None:
return distance_km
return None
def _parse_gpx_distance_km(self, gpx_file_field):
try:
with gpx_file_field.open('r') as gpx_file:
gpx = gpxpy.parse(gpx_file)
total_meters = 0.0
for track in gpx.tracks:
for segment in track.segments:
segment_length = segment.length_3d() or segment.length_2d()
if segment_length:
total_meters += segment_length
for route in gpx.routes:
route_length = route.length_3d() or route.length_2d()
if route_length:
total_meters += route_length
if total_meters > 0:
return round(total_meters / 1000, 2)
except Exception as exc:
logger.warning(
"Failed to calculate GPX distance for file %s: %s",
getattr(gpx_file_field, 'name', 'unknown'),
exc,
)
return None
def get_travel_duration_minutes(self, obj):
if not obj.date or not obj.end_date:
return None
if self._is_all_day(obj.date) and self._is_all_day(obj.end_date):
return None
try:
total_minutes = int((obj.end_date - obj.date).total_seconds() // 60)
return total_minutes if total_minutes >= 0 else None
except Exception:
logger.warning(
"Failed to calculate travel duration for transportation %s",
getattr(obj, "id", "unknown"),
exc_info=True,
)
return None
def _is_all_day(self, dt_value):
return (
dt_value.time().hour == 0
and dt_value.time().minute == 0
and dt_value.time().second == 0
and dt_value.time().microsecond == 0
)
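The `travel_duration_minutes`/`_is_all_day` pair above skips duration for entries stored as bare dates (midnight-to-midnight) and rejects negative spans. A hedged stdlib sketch of the same rule (function names here are mine, not the serializer's API):

```python
from datetime import datetime

def is_all_day(dt):
    # Midnight with no sub-day component is treated as a date-only value.
    return dt.time() == datetime.min.time()

def travel_duration_minutes(start, end):
    """Mirror of the serializer rule: None for all-day pairs or negative spans."""
    if start is None or end is None:
        return None
    if is_all_day(start) and is_all_day(end):
        return None
    minutes = int((end - start).total_seconds() // 60)
    return minutes if minutes >= 0 else None

print(travel_duration_minutes(datetime(2026, 6, 1, 9, 30),
                              datetime(2026, 6, 1, 12, 15)))  # 165
print(travel_duration_minutes(datetime(2026, 6, 1),
                              datetime(2026, 6, 2)))          # None (all-day pair)
```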
class LodgingSerializer(CustomModelSerializer):
images = serializers.SerializerMethodField()
attachments = serializers.SerializerMethodField()
@@ -467,7 +588,7 @@ class LodgingSerializer(CustomModelSerializer):
model = Lodging
fields = [
'id', 'user', 'name', 'description', 'rating', 'link', 'check_in', 'check_out',
'reservation_number', 'price', 'price_currency', 'latitude', 'longitude', 'location', 'is_public',
'collection', 'created_at', 'updated_at', 'type', 'timezone', 'images', 'attachments'
]
read_only_fields = ['id', 'created_at', 'updated_at', 'user']
@@ -583,16 +704,74 @@ class ChecklistSerializer(CustomModelSerializer):
return data
class CollectionSerializer(CustomModelSerializer):
collaborators = serializers.SerializerMethodField()
locations = serializers.SerializerMethodField()
transportations = serializers.SerializerMethodField()
notes = serializers.SerializerMethodField()
checklists = serializers.SerializerMethodField()
lodging = serializers.SerializerMethodField()
status = serializers.SerializerMethodField()
days_until_start = serializers.SerializerMethodField()
primary_image = ContentImageSerializer(read_only=True)
primary_image_id = serializers.PrimaryKeyRelatedField(
queryset=ContentImage.objects.all(),
source='primary_image',
write_only=True,
required=False,
allow_null=True,
)
class Meta:
model = Collection
fields = [
'id',
'description',
'user',
'name',
'is_public',
'locations',
'created_at',
'start_date',
'end_date',
'transportations',
'notes',
'updated_at',
'checklists',
'is_archived',
'shared_with',
'collaborators',
'link',
'lodging',
'status',
'days_until_start',
'primary_image',
'primary_image_id',
]
read_only_fields = ['id', 'created_at', 'updated_at', 'user', 'shared_with', 'status', 'days_until_start', 'primary_image']
def get_collaborators(self, obj):
request = self.context.get('request')
request_user = getattr(request, 'user', None) if request else None
users = []
if obj.user:
users.append(obj.user)
users.extend(list(obj.shared_with.all()))
collaborators = []
seen = set()
for user in users:
if not user:
continue
key = str(user.uuid)
if key in seen:
continue
seen.add(key)
serialized = _serialize_collaborator(user, owner_id=obj.user_id, request_user=request_user)
if serialized:
collaborators.append(serialized)
return collaborators
def get_locations(self, obj):
if self.context.get('nested', False):
@@ -629,6 +808,72 @@ class CollectionSerializer(CustomModelSerializer):
return []
return LodgingSerializer(obj.lodging_set.all(), many=True, context=self.context).data
def get_status(self, obj):
"""Calculate the status of the collection based on dates"""
from datetime import date
# If no dates, it's a folder
if not obj.start_date or not obj.end_date:
return 'folder'
today = date.today()
# Future trip
if obj.start_date > today:
return 'upcoming'
# Past trip
if obj.end_date < today:
return 'completed'
# Current trip
return 'in_progress'
def get_days_until_start(self, obj):
"""Calculate days until start for upcoming collections"""
from datetime import date
if not obj.start_date:
return None
today = date.today()
if obj.start_date > today:
return (obj.start_date - today).days
return None
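The derived `status` and `days_until_start` fields are pure date arithmetic; a self-contained sketch of the rules above, with `today` passed in explicitly so the logic is testable (the standalone function names are mine):

```python
from datetime import date

def collection_status(start, end, today):
    """Status rule from the serializer: folder / upcoming / completed / in_progress."""
    if not start or not end:
        return 'folder'          # no dates: treated as a folder, not a trip
    if start > today:
        return 'upcoming'
    if end < today:
        return 'completed'
    return 'in_progress'

def days_until_start(start, today):
    """Countdown only for trips that have not started yet."""
    return (start - today).days if start and start > today else None

today = date(2026, 6, 10)
print(collection_status(date(2026, 6, 20), date(2026, 6, 25), today))  # upcoming
print(days_until_start(date(2026, 6, 20), today))                      # 10
print(collection_status(None, None, today))                            # folder
```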
def validate(self, attrs):
data = super().validate(attrs)
# Only validate primary image when explicitly provided
if 'primary_image' not in data:
return data
primary_image = data.get('primary_image')
if primary_image is None:
return data
request = self.context.get('request')
if request and primary_image.user != request.user:
raise serializers.ValidationError({
'primary_image_id': 'You can only choose cover images you own.'
})
if self.instance and not self._image_belongs_to_collection(primary_image, self.instance):
raise serializers.ValidationError({
'primary_image_id': 'Cover image must come from a location in this collection.'
})
return data
def _image_belongs_to_collection(self, image, collection):
if ContentImage.objects.filter(id=image.id, location__collections=collection).exists():
return True
if ContentImage.objects.filter(id=image.id, visit__location__collections=collection).exists():
return True
return False
def to_representation(self, instance):
representation = super().to_representation(instance)
@@ -660,34 +905,109 @@ class CollectionInviteSerializer(serializers.ModelSerializer):
class UltraSlimCollectionSerializer(serializers.ModelSerializer):
location_images = serializers.SerializerMethodField()
location_count = serializers.SerializerMethodField()
status = serializers.SerializerMethodField()
days_until_start = serializers.SerializerMethodField()
primary_image = ContentImageSerializer(read_only=True)
collaborators = serializers.SerializerMethodField()
class Meta:
model = Collection
fields = [
'id', 'user', 'name', 'description', 'is_public', 'start_date', 'end_date',
'is_archived', 'link', 'created_at', 'updated_at', 'location_images',
'location_count', 'shared_with', 'collaborators', 'status', 'days_until_start', 'primary_image'
]
read_only_fields = fields # All fields are read-only for listing
def get_collaborators(self, obj):
request = self.context.get('request')
request_user = getattr(request, 'user', None) if request else None
users = []
if obj.user:
users.append(obj.user)
users.extend(list(obj.shared_with.all()))
collaborators = []
seen = set()
for user in users:
if not user:
continue
key = str(user.uuid)
if key in seen:
continue
seen.add(key)
serialized = _serialize_collaborator(user, owner_id=obj.user_id, request_user=request_user)
if serialized:
collaborators.append(serialized)
return collaborators
def get_location_images(self, obj):
"""Get primary images from locations in this collection, optimized with select_related"""
images = list(
ContentImage.objects.filter(location__collections=obj)
.select_related('user')
)
def sort_key(image):
if obj.primary_image and image.id == obj.primary_image.id:
return (0, str(image.id))
if image.is_primary:
return (1, str(image.id))
return (2, str(image.id))
images.sort(key=sort_key)
serializer = ContentImageSerializer(
images,
many=True,
context={'request': self.context.get('request')}
)
# Filter out None values from the serialized data
return [image for image in serializer.data if image is not None]
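The tuple sort key above puts the collection's explicitly chosen cover first, then images flagged `is_primary`, then everything else, with the string id as a stable tiebreaker. A minimal sketch of that ordering with stand-in objects (`SimpleNamespace` is illustrative only):

```python
from types import SimpleNamespace

def sort_key(image, primary_image_id=None):
    """Lower tuples sort first: chosen cover, then primary-flagged, then the rest."""
    if primary_image_id is not None and image.id == primary_image_id:
        return (0, str(image.id))
    if image.is_primary:
        return (1, str(image.id))
    return (2, str(image.id))

images = [
    SimpleNamespace(id='c', is_primary=False),
    SimpleNamespace(id='b', is_primary=True),
    SimpleNamespace(id='a', is_primary=False),
]
images.sort(key=lambda img: sort_key(img, primary_image_id='c'))
print([img.id for img in images])  # ['c', 'b', 'a']
```

Note that the explicit `primary_image` outranks `is_primary`, so a collection-level cover choice always wins over per-image defaults.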
def get_location_count(self, obj):
"""Get count of locations in this collection"""
# This uses the cached count if available, or does a simple count query
return obj.locations.count()
def get_status(self, obj):
"""Calculate the status of the collection based on dates"""
from datetime import date
# If no dates, it's a folder
if not obj.start_date or not obj.end_date:
return 'folder'
today = date.today()
# Future trip
if obj.start_date > today:
return 'upcoming'
# Past trip
if obj.end_date < today:
return 'completed'
# Current trip
return 'in_progress'
def get_days_until_start(self, obj):
"""Calculate days until start for upcoming collections"""
from datetime import date
if not obj.start_date:
return None
today = date.today()
if obj.start_date > today:
return (obj.start_date - today).days
return None
def to_representation(self, instance):
representation = super().to_representation(instance)
@@ -700,4 +1020,47 @@ class UltraSlimCollectionSerializer(serializers.ModelSerializer):
shared_uuids.append(str(user.uuid))
representation['shared_with'] = shared_uuids
return representation
class CollectionItineraryDaySerializer(CustomModelSerializer):
class Meta:
model = CollectionItineraryDay
fields = ['id', 'collection', 'date', 'name', 'description', 'created_at', 'updated_at']
read_only_fields = ['id', 'created_at', 'updated_at']
def update(self, instance, validated_data):
# Security: Prevent changing collection or date after creation
# This prevents shared users from reassigning itinerary days to themselves
validated_data.pop('collection', None)
validated_data.pop('date', None)
return super().update(instance, validated_data)
class CollectionItineraryItemSerializer(CustomModelSerializer):
item = serializers.SerializerMethodField()
start_datetime = serializers.ReadOnlyField()
end_datetime = serializers.ReadOnlyField()
object_name = serializers.ReadOnlyField(source='content_type.model')
class Meta:
model = CollectionItineraryItem
fields = ['id', 'collection', 'content_type', 'object_id', 'item', 'date', 'is_global', 'order', 'start_datetime', 'end_datetime', 'created_at', 'object_name']
read_only_fields = ['id', 'created_at', 'start_datetime', 'end_datetime', 'item', 'object_name']
def update(self, instance, validated_data):
# Security: Prevent changing collection, content_type, or object_id after creation
# This prevents shared users from reassigning itinerary items to themselves
# or linking items to objects they don't have permission to access
validated_data.pop('collection', None)
validated_data.pop('content_type', None)
validated_data.pop('object_id', None)
return super().update(instance, validated_data)
def get_item(self, obj):
"""Return id and type for the linked item"""
if not obj.item:
return None
return {
'id': str(obj.item.id),
'type': obj.content_type.model,
}
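Both itinerary serializers enforce write-once fields by popping them from `validated_data` before delegating to `super().update()`. The same idea, sketched with a plain dict (the helper name is mine, not part of the serializers):

```python
# Immutable-on-update fields, mirroring the serializer update() overrides above.
ITEM_IMMUTABLE = ('collection', 'content_type', 'object_id')

def strip_immutable(validated_data, immutable=ITEM_IMMUTABLE):
    """Return a copy with write-once fields removed, as the update() hooks do."""
    data = dict(validated_data)
    for field in immutable:
        data.pop(field, None)
    return data

# A shared user trying to reassign the item to another collection on PATCH:
payload = {'collection': 'other-users-trip', 'order': 3, 'date': '2026-06-02'}
print(strip_immutable(payload))  # {'order': 3, 'date': '2026-06-02'}
```

Dropping the fields silently (rather than raising) keeps PATCH requests that echo back the original values from failing, while still making reassignment impossible.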


@@ -1,7 +1,10 @@
from django.db.models.signals import m2m_changed, post_delete
from django.dispatch import receiver
from django.contrib.contenttypes.models import ContentType
from adventures.models import Location
@receiver(m2m_changed, sender=Location.collections.through)
def update_adventure_publicity(sender, instance, action, **kwargs):
"""
@@ -24,3 +27,48 @@ def update_adventure_publicity(sender, instance, action, **kwargs):
elif not has_public_collection and instance.is_public:
instance.is_public = False
instance.save(update_fields=['is_public'])
@receiver(post_delete)
def _remove_collection_itinerary_items_on_object_delete(sender, instance, **kwargs):
"""
When any model instance is deleted, remove any CollectionItineraryItem that
refers to it via the GenericForeignKey (matches by ContentType and object_id).
This ensures that if a referenced item (e.g. a `Location`, `Visit`, `Transportation`,
`Note`, etc.) is deleted, the itinerary entry that pointed to it is also removed.
"""
# Avoid acting when a CollectionItineraryItem itself is deleted
# to prevent needless extra queries.
if sender.__name__ == 'CollectionItineraryItem':
return
# Resolve the content type for the model that was deleted
try:
ct = ContentType.objects.get_for_model(sender)
except Exception:
return
# Import here to avoid circular import problems at module import time
from adventures.models import CollectionItineraryItem
# Try matching the primary key in its native form first, then as a string.
# CollectionItineraryItem.object_id is a UUIDField in the model, but some
# senders might have different PK representations; handle both safely.
pk = instance.pk
deleted = False
try:
qs = CollectionItineraryItem.objects.filter(content_type=ct, object_id=pk)
if qs.exists():
qs.delete()
deleted = True
except Exception:
pass
if not deleted:
try:
CollectionItineraryItem.objects.filter(content_type=ct, object_id=str(pk)).delete()
except Exception:
# If deletion fails for any reason, do nothing; we don't want to
# raise errors during another model's delete.
pass


@@ -23,6 +23,8 @@ router.register(r'backup', BackupViewSet, basename='backup')
router.register(r'trails', TrailViewSet, basename='trails')
router.register(r'activities', ActivityViewSet, basename='activities')
router.register(r'visits', VisitViewSet, basename='visits')
router.register(r'itineraries', ItineraryViewSet, basename='itineraries')
router.register(r'itinerary-days', ItineraryDayViewSet, basename='itinerary-days')
urlpatterns = [
# Include the router under the 'api/' prefix


@@ -0,0 +1,221 @@
from typing import List
from datetime import date, timedelta
from django.db import transaction
from django.contrib.contenttypes.models import ContentType
from django.utils import timezone
from pytz import timezone as pytz_timezone
from adventures.models import Collection, CollectionItineraryItem, Visit, Lodging, Transportation, Note, Checklist
from rest_framework.exceptions import ValidationError
def _datetime_to_date_in_timezone(dt, timezone_str: str | None) -> date:
"""
Convert a datetime to a date, accounting for timezone only if there's actual time information.
If the datetime is at UTC midnight (00:00:00), treat it as a date-only value and don't convert.
If the datetime has a time component, apply timezone conversion.
Args:
dt: datetime object (can be timezone-aware or naive)
timezone_str: IANA timezone string (e.g., 'America/New_York')
Returns:
date: The date in the specified timezone (or UTC if date-only)
"""
if dt is None:
return None
# If it's already a date, return it
if isinstance(dt, date) and not hasattr(dt, 'time'):
return dt
# Check if this is a date-only value (stored as UTC midnight)
# If time is 00:00:00, treat it as date-only and don't apply timezone conversion
if hasattr(dt, 'hour') and dt.hour == 0 and dt.minute == 0 and dt.second == 0:
return dt.date() if hasattr(dt, 'date') else dt
# Ensure datetime is timezone-aware (assume UTC if naive)
if hasattr(dt, 'tzinfo') and dt.tzinfo is None:
dt = timezone.make_aware(dt, pytz_timezone('UTC'))  # django.utils.timezone.utc no longer exists in Django 5
# Convert to target timezone if provided, otherwise use UTC
if timezone_str:
try:
target_tz = pytz_timezone(timezone_str)
dt = dt.astimezone(target_tz)
except Exception:
# If timezone conversion fails, use UTC
pass
return dt.date() if hasattr(dt, 'date') else dt
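The helper's two-part rule (UTC midnight means "date-only, don't convert"; anything with a time component gets shifted into the record's timezone first) can be sketched with the stdlib `zoneinfo` in place of pytz. This is a hedged stand-in, not the production helper; the function name and the `zoneinfo` swap are mine:

```python
from datetime import date, datetime, timezone
from zoneinfo import ZoneInfo  # stdlib stand-in for the pytz usage above

def to_local_date(dt, tz_name=None):
    """UTC midnight is treated as date-only; otherwise convert, then take the date."""
    if dt is None:
        return None
    if not isinstance(dt, datetime):
        return dt  # already a plain date
    if dt.hour == dt.minute == dt.second == 0:
        return dt.date()  # date-only sentinel: no timezone conversion
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=timezone.utc)  # assume UTC for naive values
    if tz_name:
        dt = dt.astimezone(ZoneInfo(tz_name))
    return dt.date()

# 03:00 UTC on Jan 1 is still Dec 31 in New York (UTC-5 in winter).
print(to_local_date(datetime(2026, 1, 1, 3, 0, tzinfo=timezone.utc),
                    'America/New_York'))  # 2025-12-31
# UTC midnight: treated as a date-only value, no conversion.
print(to_local_date(datetime(2026, 1, 1, 0, 0, tzinfo=timezone.utc),
                    'America/New_York'))  # 2026-01-01
```

The midnight heuristic is lossy by design: a visit that genuinely starts at exactly 00:00:00 local time will be treated as date-only, which the itinerary generator accepts as a reasonable trade-off.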
def auto_generate_itinerary(collection: Collection) -> List[CollectionItineraryItem]:
"""
Auto-generate itinerary items for a collection based on dated records.
Rules:
- Visits: Create one item per day of the visit (spanning multiple days)
- Lodging: Create one item on check_in date only
- Transportation: Create one item on start date
- Notes: Create one item on their date if present
- Checklists: Create one item on their date if present
Order within a day (incremental):
1. Lodging (check-ins)
2. Visits
3. Transportation
4. Notes
5. Checklists
Args:
collection: Collection to generate itinerary for
Returns:
List[CollectionItineraryItem]: Created itinerary items
Raises:
ValidationError: If collection already has itinerary items or has no dated records
"""
# Validation: collection must have zero itinerary items
if collection.itinerary_items.exists():
raise ValidationError({
"detail": "Collection already has itinerary items. Cannot auto-generate."
})
# Get collection date range
if not collection.start_date or not collection.end_date:
raise ValidationError({
"detail": "Collection must have start_date and end_date set."
})
start_date = collection.start_date
end_date = collection.end_date
# Collect all items to be added, grouped by date
items_by_date = {} # date -> [(content_type, object_id, priority)]
# Priority order for sorting within a day
PRIORITY_LODGING = 1
PRIORITY_VISIT = 2
PRIORITY_TRANSPORTATION = 3
PRIORITY_NOTE = 4
PRIORITY_CHECKLIST = 5
# Process Visits: one location item per day of the visit
# Note: We reference the Location, not the Visit itself
from adventures.models import Location
visits = Visit.objects.filter(location__collections=collection).select_related('location').distinct()
for visit in visits:
if visit.start_date and visit.location:
# Convert to date using visit's timezone
visit_start = _datetime_to_date_in_timezone(visit.start_date, visit.timezone)
visit_end = _datetime_to_date_in_timezone(visit.end_date, visit.timezone) if visit.end_date else visit_start
# Only include dates within collection range
visit_start = max(visit_start, start_date)
visit_end = min(visit_end or visit_start, end_date)
current_date = visit_start
while current_date <= visit_end:
if current_date not in items_by_date:
items_by_date[current_date] = []
items_by_date[current_date].append((
ContentType.objects.get_for_model(Location),
visit.location.id, # Use Location ID, not Visit ID
PRIORITY_VISIT
))
current_date += timedelta(days=1)
# Process Lodging: one item on check_in date only
lodgings = Lodging.objects.filter(collection=collection)
for lodging in lodgings:
if lodging.check_in:
# Convert to date using lodging's timezone
checkin_date = _datetime_to_date_in_timezone(lodging.check_in, lodging.timezone)
# Only include if within collection range
if start_date <= checkin_date <= end_date:
if checkin_date not in items_by_date:
items_by_date[checkin_date] = []
items_by_date[checkin_date].append((
ContentType.objects.get_for_model(Lodging),
lodging.id,
PRIORITY_LODGING
))
# Process Transportation: one item on start date
transportations = Transportation.objects.filter(collection=collection)
for transportation in transportations:
if transportation.date:
# Convert to date using transportation's start timezone
trans_date = _datetime_to_date_in_timezone(transportation.date, transportation.start_timezone)
# Only include if within collection range
if start_date <= trans_date <= end_date:
if trans_date not in items_by_date:
items_by_date[trans_date] = []
items_by_date[trans_date].append((
ContentType.objects.get_for_model(Transportation),
transportation.id,
PRIORITY_TRANSPORTATION
))
# Process Notes: one item on their date
notes = Note.objects.filter(collection=collection)
for note in notes:
if note.date:
# Notes don't have timezone field, use UTC
note_date = _datetime_to_date_in_timezone(note.date, None)
# Only include if within collection range
if start_date <= note_date <= end_date:
if note_date not in items_by_date:
items_by_date[note_date] = []
items_by_date[note_date].append((
ContentType.objects.get_for_model(Note),
note.id,
PRIORITY_NOTE
))
# Process Checklists: one item on their date
checklists = Checklist.objects.filter(collection=collection)
for checklist in checklists:
if checklist.date:
# Checklists don't have timezone field, use UTC
checklist_date = _datetime_to_date_in_timezone(checklist.date, None)
# Only include if within collection range
if start_date <= checklist_date <= end_date:
if checklist_date not in items_by_date:
items_by_date[checklist_date] = []
items_by_date[checklist_date].append((
ContentType.objects.get_for_model(Checklist),
checklist.id,
PRIORITY_CHECKLIST
))
# Validation: must have at least one dated record
if not items_by_date:
raise ValidationError({
"detail": "No dated records found within collection date range."
})
# Create itinerary items
created_items = []
for day_date in sorted(items_by_date.keys()):
# Sort items by priority within the day
items = sorted(items_by_date[day_date], key=lambda x: x[2])
for order, (content_type, object_id, priority) in enumerate(items):
itinerary_item = CollectionItineraryItem.objects.create(
collection=collection,
content_type=content_type,
object_id=object_id,
date=day_date,
order=order
)
created_items.append(itinerary_item)
return created_items
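The per-day grouping and priority ordering implemented above can be sketched independently of the ORM. This is a minimal illustration of the algorithm only; the labels and the helper name `order_items_by_day` are made up for the example:

```python
from datetime import date

# Priority order within a day, mirroring the constants above.
PRIORITY_LODGING, PRIORITY_VISIT, PRIORITY_TRANSPORTATION, PRIORITY_NOTE, PRIORITY_CHECKLIST = 1, 2, 3, 4, 5

def order_items_by_day(items_by_date):
    """Flatten {date: [(label, priority), ...]} into (date, order, label) tuples,
    sorting each day's items by priority and assigning incremental order values."""
    result = []
    for day in sorted(items_by_date):
        day_items = sorted(items_by_date[day], key=lambda x: x[1])
        for order, (label, _priority) in enumerate(day_items):
            result.append((day, order, label))
    return result

items = {
    date(2025, 6, 2): [("Museum visit", PRIORITY_VISIT), ("Hotel check-in", PRIORITY_LODGING)],
    date(2025, 6, 1): [("Flight", PRIORITY_TRANSPORTATION)],
}
print(order_items_by_day(items))
# Days come out in calendar order; lodging sorts before the visit within June 2.
```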


@@ -0,0 +1,115 @@
from typing import List
from django.db import transaction
from django.utils.dateparse import parse_date, parse_datetime
from rest_framework.exceptions import ValidationError, PermissionDenied
from adventures.models import CollectionItineraryItem
@transaction.atomic
def reorder_itinerary_items(user, items_data: List[dict]):
"""Reorder itinerary items in bulk.
Args:
user: requesting user (for permission checks)
items_data: list of dicts with keys `id`, `date`, `order`
Returns:
List[CollectionItineraryItem]: updated items (unsaved instances are saved by this function)
Raises:
ValidationError, PermissionDenied
"""
if not isinstance(items_data, list):
raise ValidationError({"items": "Must be a list of item updates."})
if not items_data:
raise ValidationError({"items": "This field is required and must not be empty."})
# Resolve ids and fetch items
item_ids = [str(item.get('id')) for item in items_data if item.get('id')]
items_qs = CollectionItineraryItem.objects.filter(id__in=item_ids).select_related('collection')
if items_qs.count() != len(item_ids):
raise ValidationError({"items": "One or more items not found."})
items_map = {str(it.id): it for it in items_qs}
# Permission checks: user must be collection owner or in shared_with
for item_id in item_ids:
item = items_map.get(item_id)
if not item:
continue
collection = item.collection
if not (collection.user == user or collection.shared_with.filter(id=user.id).exists()):
raise PermissionDenied("You do not have permission to modify items in this collection.")
# Two-phase update to avoid unique constraint races:
# 1) assign very large temporary order values (guaranteed > existing orders)
# 2) assign final date/order values
temp_offset = 1_000_000
temp_updates = []
for i, item_data in enumerate(items_data):
item_id = item_data.get('id')
if not item_id:
continue
item = items_map.get(str(item_id))
if not item:
continue
item.order = temp_offset + i
temp_updates.append(item)
if temp_updates:
CollectionItineraryItem.objects.bulk_update(temp_updates, ['order'])
# Finalize
updated_items = []
for item_data in items_data:
item_id = item_data.get('id')
if not item_id:
continue
item = items_map.get(str(item_id))
if not item:
continue
new_date = item_data.get('date')
new_is_global = item_data.get('is_global')
new_order = item_data.get('order')
# If is_global is explicitly provided, set it and reconcile date accordingly
if new_is_global is not None:
item.is_global = bool(new_is_global)
if item.is_global:
item.date = None
if (new_date is not None) and (not item.is_global):
# validate date is within collection bounds (if collection has start/end)
parsed = None
try:
parsed = parse_date(str(new_date))
except Exception:
parsed = None
if parsed is None:
try:
dt = parse_datetime(str(new_date))
if dt:
parsed = dt.date()
except Exception:
parsed = None
collection = item.collection
if parsed and collection:
if collection.start_date and parsed < collection.start_date:
raise ValidationError({"items": f"Item {item_id} date {parsed} is before collection start date {collection.start_date}."})
if collection.end_date and parsed > collection.end_date:
raise ValidationError({"items": f"Item {item_id} date {parsed} is after collection end date {collection.end_date}."})
item.date = parsed if parsed else new_date
if new_order is not None:
item.order = new_order
updated_items.append(item)
if updated_items:
CollectionItineraryItem.objects.bulk_update(updated_items, ['date', 'is_global', 'order'])
return updated_items
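The two-phase trick above (park every row at a large temporary order, then write the final values) avoids transient collisions with a unique order constraint when items swap positions. A minimal sketch of the same idea outside the ORM, with the constraint simulated per write (function and variable names are illustrative, not from the codebase):

```python
def two_phase_reorder(orders, new_orders, temp_offset=1_000_000):
    """orders: dict id -> current order. Every single write must keep orders
    unique, mimicking a per-row unique constraint. Returns the final dict."""
    def write(item_id, value):
        # A unique constraint would reject a value already held by another row.
        assert value not in (v for k, v in orders.items() if k != item_id), \
            f"unique constraint violation: order {value} already taken"
        orders[item_id] = value
    # Phase 1: park every item at a guaranteed-free temporary slot.
    for i, item_id in enumerate(new_orders):
        write(item_id, temp_offset + i)
    # Phase 2: assign the requested final orders.
    for item_id, order in new_orders.items():
        write(item_id, order)
    return orders

# Swapping two items directly would violate uniqueness midway;
# the temporary phase makes each individual write safe.
print(two_phase_reorder({"a": 0, "b": 1}, {"a": 1, "b": 0}))
```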


@@ -17,4 +17,5 @@ from .recommendations_view import *
from .import_export_view import *
from .trail_view import *
from .activity_view import *
from .visit_view import *
from .itinerary_view import *


@@ -2,11 +2,13 @@ from rest_framework import viewsets, status
from rest_framework.decorators import action
from rest_framework.response import Response
from django.db.models import Q
from adventures.models import Checklist, CollectionItineraryItem
from adventures.serializers import ChecklistSerializer
from rest_framework.exceptions import PermissionDenied
from adventures.permissions import IsOwnerOrSharedWithFullAccess
from rest_framework.permissions import IsAuthenticated
from django.contrib.contenttypes.models import ContentType
from django.db import transaction
class ChecklistViewSet(viewsets.ModelViewSet):
serializer_class = ChecklistSerializer
@@ -43,15 +45,18 @@ class ChecklistViewSet(viewsets.ModelViewSet):
# Retrieve the current object
instance = self.get_object()
# Store the old date before updating
old_date = instance.date
# Partially update the instance with the request data
serializer = self.get_serializer(instance, data=request.data, partial=True)
serializer.is_valid(raise_exception=True)
# Retrieve the collection from the validated data
new_collection = serializer.validated_data.get('collection')
new_date = serializer.validated_data.get('date', old_date)
user = request.user
if new_collection is not None and new_collection != instance.collection:
# Check if the user is the owner of the new collection
@@ -62,34 +67,15 @@ class ChecklistViewSet(viewsets.ModelViewSet):
if instance.collection is not None and instance.collection.user != user:
raise PermissionDenied("You cannot remove the collection as you are not the owner.")
# If the date changed, remove old itinerary items for this checklist on the old date
if old_date and new_date and old_date != new_date:
checklist_ct = ContentType.objects.get_for_model(Checklist)
old_itinerary_items = CollectionItineraryItem.objects.filter(
content_type=checklist_ct,
object_id=str(instance.id),
date=old_date
)
old_itinerary_items.delete()
# Perform the update
self.perform_update(serializer)
# Return the updated instance
return Response(serializer.data)


@@ -4,9 +4,18 @@ from django.db import transaction
from rest_framework import viewsets
from rest_framework.decorators import action
from rest_framework.response import Response
from rest_framework.parsers import MultiPartParser
from rest_framework import status
from django.http import HttpResponse
from django.conf import settings
import io
import os
import json
import zipfile
import tempfile
from adventures.models import Collection, Location, Transportation, Note, Checklist, CollectionInvite, ContentImage, CollectionItineraryItem, Lodging, CollectionItineraryDay, ContentAttachment, Category
from adventures.permissions import CollectionShared
from adventures.serializers import CollectionSerializer, CollectionInviteSerializer, UltraSlimCollectionSerializer, CollectionItineraryItemSerializer, CollectionItineraryDaySerializer
from users.models import CustomUser as User
from adventures.utils import pagination
from users.serializers import CustomUserDetailsSerializer as UserSerializer
@@ -38,22 +47,47 @@ class CollectionViewSet(viewsets.ModelViewSet):
if order_by == 'name':
queryset = queryset.annotate(lower_name=Lower('name'))
ordering = 'lower_name'
if order_direction == 'asc':
ordering = f'-{ordering}'
elif order_by == 'start_date':
ordering = 'start_date'
if order_direction == 'desc':
ordering = 'start_date'
else:
ordering = '-start_date'
else:
ordering = 'updated_at'
if order_direction == 'desc':
ordering = '-updated_at'
return queryset.order_by(ordering)
def apply_status_filter(self, queryset):
"""Apply status filtering based on query parameter"""
from datetime import date
status_filter = self.request.query_params.get('status', None)
if not status_filter:
return queryset
today = date.today()
if status_filter == 'folder':
# Collections without dates
return queryset.filter(Q(start_date__isnull=True) | Q(end_date__isnull=True))
elif status_filter == 'upcoming':
# Start date in the future
return queryset.filter(start_date__gt=today)
elif status_filter == 'in_progress':
# Currently ongoing
return queryset.filter(start_date__lte=today, end_date__gte=today)
elif status_filter == 'completed':
# End date in the past
return queryset.filter(end_date__lt=today)
return queryset
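The same status classification can be expressed as a small standalone helper (the status names mirror the query parameter values above; the helper itself is illustrative, not part of the view):

```python
from datetime import date, timedelta

def collection_status(start, end, today=None):
    """Classify a collection the same way the status filter does:
    no dates -> 'folder', future start -> 'upcoming',
    spanning today -> 'in_progress', past end -> 'completed'."""
    today = today or date.today()
    if start is None or end is None:
        return 'folder'
    if start > today:
        return 'upcoming'
    if start <= today <= end:
        return 'in_progress'
    return 'completed'

today = date(2025, 6, 15)
print(collection_status(None, None, today))                                            # folder
print(collection_status(today + timedelta(days=1), today + timedelta(days=5), today))  # upcoming
print(collection_status(today - timedelta(days=1), today + timedelta(days=1), today))  # in_progress
print(collection_status(today - timedelta(days=5), today - timedelta(days=1), today))  # completed
```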
def get_serializer_context(self):
"""Override to add nested and exclusion contexts based on query parameters"""
context = super().get_serializer_context()
@@ -80,45 +114,49 @@ class CollectionViewSet(viewsets.ModelViewSet):
def get_optimized_queryset_for_listing(self):
"""Get optimized queryset for list actions with prefetching"""
return self.get_base_queryset().select_related('user', 'primary_image').prefetch_related(
Prefetch(
'locations__images',
queryset=ContentImage.objects.filter(is_primary=True).select_related('user'),
to_attr='primary_images'
),
'shared_with'
)
def get_base_queryset(self):
"""Base queryset logic extracted for reuse"""
if self.action == 'destroy':
queryset = Collection.objects.filter(user=self.request.user.id)
elif self.action in ['update', 'partial_update', 'leave']:
queryset = Collection.objects.filter(
Q(user=self.request.user.id) | Q(shared_with=self.request.user)
).distinct()
# Allow access to collections with pending invites for accept/decline actions
elif self.action in ['accept_invite', 'decline_invite']:
if not self.request.user.is_authenticated:
queryset = Collection.objects.none()
else:
queryset = Collection.objects.filter(
Q(user=self.request.user.id)
| Q(shared_with=self.request.user)
| Q(invites__invited_user=self.request.user)
).distinct()
elif self.action == 'retrieve':
if not self.request.user.is_authenticated:
queryset = Collection.objects.filter(is_public=True)
else:
queryset = Collection.objects.filter(
Q(is_public=True)
| Q(user=self.request.user.id)
| Q(shared_with=self.request.user)
).distinct()
else:
# For list action and default base queryset, return collections owned by the user (exclude shared)
queryset = Collection.objects.filter(
Q(user=self.request.user.id) & Q(is_archived=False)
).distinct()
return queryset.select_related('primary_image').prefetch_related('shared_with')
def get_queryset(self):
"""Get queryset with optimizations for list actions"""
@@ -131,9 +169,11 @@ class CollectionViewSet(viewsets.ModelViewSet):
if not request.user.is_authenticated:
return Response({"error": "User is not authenticated"}, status=400)
# List should only return collections owned by the requesting user (shared collections are available
# via the `shared` action).
queryset = Collection.objects.filter(
Q(user=request.user.id) & Q(is_archived=False)
).distinct().select_related('user', 'primary_image').prefetch_related(
Prefetch(
'locations__images',
queryset=ContentImage.objects.filter(is_primary=True).select_related('user'),
@@ -141,6 +181,7 @@ class CollectionViewSet(viewsets.ModelViewSet):
)
)
queryset = self.apply_status_filter(queryset)
queryset = self.apply_sorting(queryset)
return self.paginate_and_respond(queryset, request)
@@ -151,7 +192,7 @@ class CollectionViewSet(viewsets.ModelViewSet):
queryset = Collection.objects.filter(
Q(user=request.user)
).select_related('user', 'primary_image').prefetch_related(
Prefetch(
'locations__images',
queryset=ContentImage.objects.filter(is_primary=True).select_related('user'),
@@ -171,7 +212,7 @@ class CollectionViewSet(viewsets.ModelViewSet):
queryset = Collection.objects.filter(
Q(user=request.user.id) & Q(is_archived=True)
).select_related('user', 'primary_image').prefetch_related(
Prefetch(
'locations__images',
queryset=ContentImage.objects.filter(is_primary=True).select_related('user'),
@@ -184,67 +225,23 @@ class CollectionViewSet(viewsets.ModelViewSet):
return Response(serializer.data)
def retrieve(self, request, pk=None):
"""Retrieve a collection and include itinerary items and day metadata in the response."""
collection = self.get_object()
serializer = self.get_serializer(collection)
data = serializer.data
# Include itinerary items inline with collection details
itinerary_items = CollectionItineraryItem.objects.filter(collection=collection)
itinerary_serializer = CollectionItineraryItemSerializer(itinerary_items, many=True)
data['itinerary'] = itinerary_serializer.data
# Include itinerary day metadata
itinerary_days = CollectionItineraryDay.objects.filter(collection=collection)
days_serializer = CollectionItineraryDaySerializer(itinerary_days, many=True)
data['itinerary_days'] = days_serializer.data
return Response(data)
# make an action to retrieve all locations that are shared with the user
@action(detail=False, methods=['get'])
@@ -477,10 +474,433 @@ class CollectionViewSet(viewsets.ModelViewSet):
return Response({"success": success_message})
@action(detail=True, methods=['get'], url_path='export')
def export_collection(self, request, pk=None):
"""Export a single collection and its related content as a ZIP file."""
collection = self.get_object()
export_data = {
'version': getattr(settings, 'ADVENTURELOG_RELEASE_VERSION', 'unknown'),
# Omit export_date to keep template-friendly exports (no dates)
'collection': {
'id': str(collection.id),
'name': collection.name,
'description': collection.description,
'is_public': collection.is_public,
# Omit start/end dates
'link': collection.link,
},
'locations': [],
'transportation': [],
'notes': [],
'checklists': [],
'lodging': [],
# Omit itinerary_items entirely
'images': [],
'attachments': [],
'primary_image_ref': None,
}
image_export_map = {}
for loc in collection.locations.all().select_related('city', 'region', 'country'):
loc_entry = {
'id': str(loc.id),
'name': loc.name,
'description': loc.description,
'location': loc.location,
'tags': loc.tags or [],
'rating': loc.rating,
'link': loc.link,
'is_public': loc.is_public,
'longitude': float(loc.longitude) if loc.longitude is not None else None,
'latitude': float(loc.latitude) if loc.latitude is not None else None,
'city': loc.city.name if loc.city else None,
'region': loc.region.name if loc.region else None,
'country': loc.country.name if loc.country else None,
'images': [],
'attachments': [],
}
for img in loc.images.all():
img_export_id = f"img_{len(export_data['images'])}"
image_export_map[str(img.id)] = img_export_id
export_data['images'].append({
'export_id': img_export_id,
'id': str(img.id),
'name': os.path.basename(getattr(img.image, 'name', 'image')),
'is_primary': getattr(img, 'is_primary', False),
})
loc_entry['images'].append(img_export_id)
for att in loc.attachments.all():
att_export_id = f"att_{len(export_data['attachments'])}"
export_data['attachments'].append({
'export_id': att_export_id,
'id': str(att.id),
'name': os.path.basename(getattr(att.file, 'name', 'attachment')),
})
loc_entry['attachments'].append(att_export_id)
export_data['locations'].append(loc_entry)
if collection.primary_image:
export_data['primary_image_ref'] = image_export_map.get(str(collection.primary_image.id))
# Related content (if models have FK to collection)
for t in Transportation.objects.filter(collection=collection):
export_data['transportation'].append({
'id': str(t.id),
'type': getattr(t, 'transportation_type', None),
'name': getattr(t, 'name', None),
# Omit date
'notes': getattr(t, 'notes', None),
})
for n in Note.objects.filter(collection=collection):
export_data['notes'].append({
'id': str(n.id),
'title': getattr(n, 'title', None),
'content': getattr(n, 'content', ''),
# Omit created_at
})
for c in Checklist.objects.filter(collection=collection):
items = []
if hasattr(c, 'items'):
items = [
{
'name': getattr(item, 'name', None),
'completed': getattr(item, 'completed', False),
} for item in c.items.all()
]
export_data['checklists'].append({
'id': str(c.id),
'name': getattr(c, 'name', None),
'items': items,
})
for l in Lodging.objects.filter(collection=collection):
export_data['lodging'].append({
'id': str(l.id),
'type': getattr(l, 'lodging_type', None),
'name': getattr(l, 'name', None),
# Omit start_date/end_date
'notes': getattr(l, 'notes', None),
})
# Intentionally omit itinerary_items from export
# Create ZIP in temp file
with tempfile.NamedTemporaryFile(delete=False, suffix='.zip') as tmp_file:
with zipfile.ZipFile(tmp_file, 'w', zipfile.ZIP_DEFLATED) as zipf:
zipf.writestr('metadata.json', json.dumps(export_data, indent=2))
# Write image files
for loc in collection.locations.all():
for img in loc.images.all():
export_id = image_export_map.get(str(img.id))
if not export_id:
continue
try:
file_name = os.path.basename(getattr(img.image, 'name', 'image'))
storage = getattr(img.image, 'storage', None)
if storage:
with storage.open(img.image.name, 'rb') as f:
zipf.writestr(f'images/{export_id}-{file_name}', f.read())
elif hasattr(img.image, 'path'):
with open(img.image.path, 'rb') as f:
zipf.writestr(f'images/{export_id}-{file_name}', f.read())
except Exception:
continue
# Write attachment files
for loc in collection.locations.all():
for att in loc.attachments.all():
try:
file_name = os.path.basename(getattr(att.file, 'name', 'attachment'))
storage = getattr(att.file, 'storage', None)
if storage:
with storage.open(att.file.name, 'rb') as f:
zipf.writestr(f'attachments/{file_name}', f.read())
elif hasattr(att.file, 'path'):
with open(att.file.path, 'rb') as f:
zipf.writestr(f'attachments/{file_name}', f.read())
except Exception:
continue
with open(tmp_file.name, 'rb') as fh:
data = fh.read()
os.unlink(tmp_file.name)
filename = f"collection-{collection.name.replace(' ', '_')}.zip"
response = HttpResponse(data, content_type='application/zip')
response['Content-Disposition'] = f'attachment; filename="{filename}"'
return response
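An exported archive can be inspected with the standard library alone; the layout assumed here matches the writer above (metadata.json at the root, files under images/ and attachments/), while the `read_export` helper name is made up for the example:

```python
import io
import json
import zipfile

def read_export(zip_bytes):
    """Open an export ZIP and return (metadata dict, member names),
    mirroring the validation the import endpoint performs."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes), 'r') as zf:
        try:
            metadata = json.loads(zf.read('metadata.json').decode('utf-8'))
        except KeyError:
            raise ValueError('metadata.json missing')
        return metadata, zf.namelist()

# Build a minimal archive in memory to exercise the reader.
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED) as zf:
    zf.writestr('metadata.json', json.dumps({'collection': {'name': 'Trip'}}))
    zf.writestr('images/img_0-photo.jpg', b'...')

meta, names = read_export(buf.getvalue())
print(meta['collection']['name'], sorted(names))
```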
@action(detail=False, methods=['post'], url_path='import', parser_classes=[MultiPartParser])
def import_collection(self, request):
"""Import a single collection from a ZIP file. Handles name conflicts by appending (n)."""
upload = request.FILES.get('file')
if not upload:
return Response({'detail': 'No file provided'}, status=status.HTTP_400_BAD_REQUEST)
# Read zip
file_bytes = upload.read()
with zipfile.ZipFile(io.BytesIO(file_bytes), 'r') as zipf:
try:
metadata = json.loads(zipf.read('metadata.json').decode('utf-8'))
except KeyError:
return Response({'detail': 'metadata.json missing'}, status=status.HTTP_400_BAD_REQUEST)
base_name = (metadata.get('collection') or {}).get('name') or 'Imported Collection'
# Ensure unique name per user
existing_names = set(request.user.collection_set.values_list('name', flat=True))
unique_name = base_name
if unique_name in existing_names:
i = 1
while True:
candidate = f"{base_name} ({i})"
if candidate not in existing_names:
unique_name = candidate
break
i += 1
from datetime import date
coll_meta = metadata.get('collection') or {}
new_collection = Collection.objects.create(
user=request.user,
name=unique_name,
description=coll_meta.get('description'),
is_public=coll_meta.get('is_public', False),
start_date=date.fromisoformat(coll_meta['start_date']) if coll_meta.get('start_date') else None,
end_date=date.fromisoformat(coll_meta['end_date']) if coll_meta.get('end_date') else None,
link=coll_meta.get('link'),
)
image_export_map = {img['export_id']: img for img in metadata.get('images', [])}
attachment_export_map = {att['export_id']: att for att in metadata.get('attachments', [])}
# Import locations
for loc_data in metadata.get('locations', []):
cat_obj = None
if loc_data.get('category'):
cat_obj, _ = Category.objects.get_or_create(user=request.user, name=loc_data['category'])
# Attempt to find a very similar existing location for this user
from difflib import SequenceMatcher
def _ratio(a, b):
a = (a or '').strip().lower()
b = (b or '').strip().lower()
if not a and not b:
return 1.0
return SequenceMatcher(None, a, b).ratio()
def _coords_close(lat1, lon1, lat2, lon2, threshold=0.02):
try:
if lat1 is None or lon1 is None or lat2 is None or lon2 is None:
return False
return abs(float(lat1) - float(lat2)) <= threshold and abs(float(lon1) - float(lon2)) <= threshold
except Exception:
return False
incoming_name = loc_data.get('name') or 'Untitled'
incoming_location_text = loc_data.get('location')
incoming_lat = loc_data.get('latitude')
incoming_lon = loc_data.get('longitude')
existing_loc = None
best_score = 0.0
for cand in Location.objects.filter(user=request.user):
name_score = _ratio(incoming_name, cand.name)
loc_text_score = _ratio(incoming_location_text, getattr(cand, 'location', None))
close_coords = _coords_close(incoming_lat, incoming_lon, cand.latitude, cand.longitude)
# Define "very similar": strong name match OR decent name with location/coords match
combined_score = max(name_score, (name_score + loc_text_score) / 2.0)
if close_coords:
combined_score = max(combined_score, name_score + 0.1) # small boost for coord proximity
if combined_score > best_score and (
name_score >= 0.92 or (name_score >= 0.85 and (loc_text_score >= 0.85 or close_coords))
):
best_score = combined_score
existing_loc = cand
if existing_loc:
# Link existing location to the new collection, skip creating a duplicate
loc = existing_loc
loc.collections.add(new_collection)
created_new_loc = False
else:
# Create a brand-new location
loc = Location.objects.create(
user=request.user,
name=incoming_name,
description=loc_data.get('description'),
location=incoming_location_text,
tags=loc_data.get('tags') or [],
rating=loc_data.get('rating'),
link=loc_data.get('link'),
is_public=bool(loc_data.get('is_public', False)),
longitude=incoming_lon,
latitude=incoming_lat,
category=cat_obj,
)
loc.collections.add(new_collection)
created_new_loc = True
# Images
# Only import images for newly created locations to avoid duplicating user content
if created_new_loc:
for export_id in loc_data.get('images', []):
img_meta = image_export_map.get(export_id)
if not img_meta:
continue
prefix = f"images/{export_id}-"
member = next((m for m in zipf.namelist() if m.startswith(prefix)), None)
if not member:
continue
file_bytes_img = zipf.read(member)
file_name_img = os.path.basename(member)
from django.core.files.base import ContentFile
image_obj = ContentImage(
user=request.user,
image=ContentFile(file_bytes_img, name=file_name_img),
)
# Assign to the generic relation for Location
image_obj.content_object = loc
image_obj.save()
if img_meta.get('is_primary'):
new_collection.primary_image = image_obj
new_collection.save(update_fields=['primary_image'])
# Attachments
if created_new_loc:
for export_id in loc_data.get('attachments', []):
att_meta = attachment_export_map.get(export_id)
if not att_meta:
continue
file_name_att = att_meta.get('name', '')
member = next((m for m in zipf.namelist() if m == f"attachments/{file_name_att}"), None)
if not member:
continue
file_bytes_att = zipf.read(member)
from django.core.files.base import ContentFile
attachment_obj = ContentAttachment(
user=request.user,
file=ContentFile(file_bytes_att, name=file_name_att),
)
# Assign to the generic relation for Location
attachment_obj.content_object = loc
attachment_obj.save()
serializer = self.get_serializer(new_collection)
return Response(serializer.data, status=status.HTTP_201_CREATED)
def perform_create(self, serializer):
# This is ok because you cannot share a collection when creating it
serializer.save(user=self.request.user)
def _cleanup_out_of_range_itinerary_items(self, collection):
"""Delete itinerary items and day metadata outside the collection's date range."""
if not collection.start_date or not collection.end_date:
# If no date range is set, don't delete anything
return
# Delete itinerary items outside the date range
deleted_items = CollectionItineraryItem.objects.filter(
collection=collection
).exclude(
date__range=[collection.start_date, collection.end_date]
).delete()
# Delete day metadata outside the date range
deleted_days = CollectionItineraryDay.objects.filter(
collection=collection
).exclude(
date__range=[collection.start_date, collection.end_date]
).delete()
return deleted_items, deleted_days
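The cleanup keeps only items whose `date` falls inside the collection's bounds; Django's `date__range` lookup is inclusive on both ends, so boundary dates survive. The rule in plain Python (a sketch of the filter semantics, not the ORM call):

```python
import datetime

def out_of_range(dates, start, end):
    """Return the dates an inclusive [start, end] range filter would exclude."""
    if start is None or end is None:
        return []  # mirror the early return: no range set, delete nothing
    return [d for d in dates if not (start <= d <= end)]

start = datetime.date(2024, 6, 1)
end = datetime.date(2024, 6, 7)
dates = [
    datetime.date(2024, 5, 31),  # before the range: deleted
    datetime.date(2024, 6, 1),   # on the start boundary: kept
    datetime.date(2024, 6, 7),   # on the end boundary: kept
    datetime.date(2024, 6, 8),   # after the range: deleted
]
assert out_of_range(dates, start, end) == [datetime.date(2024, 5, 31), datetime.date(2024, 6, 8)]
```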
@transaction.atomic
def update(self, request, *args, **kwargs):
"""Override update to handle is_public cascading and clean up out-of-range itinerary items when dates change."""
instance = self.get_object()
old_is_public = instance.is_public
old_start_date = instance.start_date
old_end_date = instance.end_date
# Perform the standard update
partial = kwargs.pop('partial', False)
serializer = self.get_serializer(instance, data=request.data, partial=partial)
serializer.is_valid(raise_exception=True)
self.perform_update(serializer)
# Check if is_public changed
new_is_public = serializer.instance.is_public
is_public_changed = old_is_public != new_is_public
# Handle is_public cascading
if is_public_changed:
if new_is_public:
# Collection is being made public, update all linked items to public
serializer.instance.locations.filter(is_public=False).update(is_public=True)
serializer.instance.transportation_set.filter(is_public=False).update(is_public=True)
serializer.instance.note_set.filter(is_public=False).update(is_public=True)
serializer.instance.checklist_set.filter(is_public=False).update(is_public=True)
serializer.instance.lodging_set.filter(is_public=False).update(is_public=True)
else:
# Collection is being made private, check each linked item
# Only set an item to private if it doesn't belong to any other public collection
# Handle locations (many-to-many relationship)
locations_in_collection = serializer.instance.locations.filter(is_public=True)
for location in locations_in_collection:
# Check if this location belongs to any other public collection
has_other_public_collection = location.collections.filter(
is_public=True
).exclude(id=serializer.instance.id).exists()
if not has_other_public_collection:
location.is_public = False
location.save(update_fields=['is_public'])
# Handle transportations, notes, checklists, lodging (foreign key relationships)
# Transportation
transportations_to_check = serializer.instance.transportation_set.filter(is_public=True)
for transportation in transportations_to_check:
transportation.is_public = False
transportation.save(update_fields=['is_public'])
# Notes
notes_to_check = serializer.instance.note_set.filter(is_public=True)
for note in notes_to_check:
note.is_public = False
note.save(update_fields=['is_public'])
# Checklists
checklists_to_check = serializer.instance.checklist_set.filter(is_public=True)
for checklist in checklists_to_check:
checklist.is_public = False
checklist.save(update_fields=['is_public'])
# Lodging
lodging_to_check = serializer.instance.lodging_set.filter(is_public=True)
for lodging in lodging_to_check:
lodging.is_public = False
lodging.save(update_fields=['is_public'])
# Check if dates changed
new_start_date = serializer.instance.start_date
new_end_date = serializer.instance.end_date
dates_changed = (old_start_date != new_start_date or old_end_date != new_end_date)
# Clean up out-of-range items if dates changed
if dates_changed:
self._cleanup_out_of_range_itinerary_items(serializer.instance)
if getattr(instance, '_prefetched_objects_cache', None):
# If 'prefetch_related' has been applied to a queryset, we need to
# forcibly invalidate the prefetch cache on the instance.
instance._prefetched_objects_cache = {}
return Response(serializer.data)
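When a collection is made private, the code above flips a location to private only if no *other* public collection still contains it (foreign-key items such as notes and lodging are privatized unconditionally). The membership rule, sketched without the ORM:

```python
def should_stay_public(location_collections, closing_id):
    """A location stays public if any *other* collection it belongs to is still public."""
    return any(c['is_public'] for c in location_collections if c['id'] != closing_id)

collections = [
    {'id': 1, 'is_public': False},  # the collection being made private
    {'id': 2, 'is_public': True},   # another public collection still holds the location
]
assert should_stay_public(collections, closing_id=1) is True
assert should_stay_public([{'id': 1, 'is_public': False}], closing_id=1) is False
```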
def paginate_and_respond(self, queryset, request):
paginator = self.pagination_class()
page = paginator.paginate_queryset(queryset, request)

View File

@@ -91,8 +91,13 @@ class GenerateDescription(viewsets.ViewSet):
try:
candidates = self.get_candidate_pages(name, lang)
found_images = []
for candidate in candidates:
# Stop after collecting 8 valid images
if len(found_images) >= 8:
break
page_data = self.fetch_page(
lang=lang,
candidate=candidate,
@@ -113,13 +118,29 @@ class GenerateDescription(viewsets.ViewSet):
# Try original image first
original_image = page_data.get('original')
if original_image and self.is_valid_image(original_image.get('source')):
return Response(original_image)
found_images.append({
'source': original_image.get('source'),
'width': original_image.get('width'),
'height': original_image.get('height'),
'title': page_data.get('title'),
'type': 'original'
})
continue
# Fall back to thumbnail
thumbnail_image = page_data.get('thumbnail')
if thumbnail_image and self.is_valid_image(thumbnail_image.get('source')):
return Response(thumbnail_image)
found_images.append({
'source': thumbnail_image.get('source'),
'width': thumbnail_image.get('width'),
'height': thumbnail_image.get('height'),
'title': page_data.get('title'),
'type': 'thumbnail'
})
if found_images:
return Response({"images": found_images})
return Response({"error": "No image found"}, status=404)
except requests.exceptions.RequestException:
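The image-collection loop above gathers up to eight results, preferring each page's original image and falling back to its thumbnail. A minimal sketch of that pattern (the `fetch` callable and page shapes here are illustrative, not the real Wikipedia client):

```python
def collect_images(candidates, fetch, limit=8):
    """Gather up to `limit` image dicts, preferring 'original' over 'thumbnail'."""
    found = []
    for candidate in candidates:
        if len(found) >= limit:
            break  # stop once enough valid images are collected
        page = fetch(candidate)
        for key in ('original', 'thumbnail'):
            image = page.get(key)
            if image and image.get('source'):
                found.append({'source': image['source'], 'type': key})
                break  # original found: skip the thumbnail fallback
    return found

pages = {'a': {'original': {'source': 'a.jpg'}}, 'b': {'thumbnail': {'source': 'b_thumb.jpg'}}}
result = collect_images(['a', 'b'], lambda c: pages.get(c, {}))
assert [img['type'] for img in result] == ['original', 'thumbnail']
```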

View File

@@ -4,64 +4,89 @@ from rest_framework.decorators import action
from rest_framework.permissions import IsAuthenticated
from icalendar import Calendar, Event, vText, vCalAddress
from datetime import datetime, timedelta
from adventures.models import Location
from adventures.serializers import LocationSerializer
from django.db.models import Prefetch
from adventures.models import Location, Visit
class IcsCalendarGeneratorViewSet(viewsets.ViewSet):
permission_classes = [IsAuthenticated]
@action(detail=False, methods=['get'])
def generate(self, request):
locations = Location.objects.filter(user=request.user)
context={'nested': True, 'allowed_nested_fields': ['visits']}
serializer = LocationSerializer(locations, many=True, context=context)
locations = (
Location.objects.filter(user=request.user)
.prefetch_related(
Prefetch(
'visits',
queryset=Visit.objects.only('id', 'start_date', 'end_date')
)
)
.only('id', 'name', 'description', 'location', 'link')
)
user = request.user
name = f"{user.first_name} {user.last_name}"
cal = Calendar()
cal.add('prodid', '-//My Adventure Calendar//example.com//')
cal.add('version', '2.0')
for location in serializer.data:
if location['visits']:
for visit in location['visits']:
# Skip if start_date is missing
if not visit.get('start_date'):
continue
for location in locations:
visits = list(location.visits.all())
if not visits:
continue
# Parse start_date and handle end_date
for visit in visits:
start_value = getattr(visit, 'start_date', None)
if not start_value:
continue
# Normalize start date
if isinstance(start_value, str):
try:
start_date = datetime.strptime(visit['start_date'], '%Y-%m-%d').date()
start_dt = datetime.fromisoformat(start_value.replace('Z', '+00:00'))
except ValueError:
continue # Skip if the start_date is invalid
continue
else:
start_dt = start_value
start_date = start_dt.date() if hasattr(start_dt, 'date') else start_dt
# Normalize end date (inclusive of final day)
end_value = getattr(visit, 'end_date', None) or start_dt
if isinstance(end_value, str):
try:
end_dt = datetime.fromisoformat(end_value.replace('Z', '+00:00'))
except ValueError:
end_dt = start_dt
else:
end_dt = end_value
end_date = end_dt.date() if hasattr(end_dt, 'date') else end_dt
end_date = end_date + timedelta(days=1)
event = Event()
event.add('summary', location.name)
event.add('dtstart', start_date)
event.add('dtend', end_date)
event.add('dtstamp', datetime.now())
event.add('transp', 'TRANSPARENT')
event.add('class', 'PUBLIC')
event.add('created', datetime.now())
event.add('last-modified', datetime.now())
event.add('description', getattr(location, 'description', '') or '')
if getattr(location, 'location', None):
event.add('location', location.location)
if getattr(location, 'link', None):
event.add('url', location.link)
organizer = vCalAddress(f'MAILTO:{user.email}')
organizer.params['cn'] = vText(name)
event.add('organizer', organizer)
cal.add_component(event)
end_date = (
datetime.strptime(visit['end_date'], '%Y-%m-%d').date() + timedelta(days=1)
if visit.get('end_date') else start_date + timedelta(days=1)
)
# Create event
event = Event()
event.add('summary', location['name'])
event.add('dtstart', start_date)
event.add('dtend', end_date)
event.add('dtstamp', datetime.now())
event.add('transp', 'TRANSPARENT')
event.add('class', 'PUBLIC')
event.add('created', datetime.now())
event.add('last-modified', datetime.now())
event.add('description', location['description'])
if location.get('location'):
event.add('location', location['location'])
if location.get('link'):
event.add('url', location['link'])
organizer = vCalAddress(f'MAILTO:{user.email}')
organizer.params['cn'] = vText(name)
event.add('organizer', organizer)
cal.add_component(event)
response = HttpResponse(cal.to_ical(), content_type='text/calendar')
response['Content-Disposition'] = 'attachment; filename=adventures.ics'
return response
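The visit-date handling accepts date objects as well as ISO strings with a trailing `Z`, and adds one day to the end date because iCalendar's `DTEND` is exclusive. A dependency-free sketch of that normalization:

```python
import datetime

def normalize(value, fallback=None):
    """Accept a date/datetime or ISO string (optional trailing 'Z') and return a date."""
    if isinstance(value, str):
        try:
            value = datetime.datetime.fromisoformat(value.replace('Z', '+00:00'))
        except ValueError:
            return fallback  # unparseable string: caller decides what to do
    return value.date() if isinstance(value, datetime.datetime) else value

start = normalize('2024-06-01T09:30:00Z')
end = normalize(None) or start           # missing end date falls back to the start
dtend = end + datetime.timedelta(days=1)  # iCalendar DTEND is exclusive
assert start == datetime.date(2024, 6, 1)
assert dtend == datetime.date(2024, 6, 2)
```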

View File

@@ -19,7 +19,8 @@ from django.contrib.contenttypes.models import ContentType
from adventures.models import (
Location, Collection, Transportation, Note, Checklist, ChecklistItem,
ContentImage, ContentAttachment, Category, Lodging, Visit, Trail, Activity
ContentImage, ContentAttachment, Category, Lodging, Visit, Trail, Activity,
CollectionItineraryItem
)
from worldtravel.models import VisitedCity, VisitedRegion, City, Region, Country
@@ -52,7 +53,8 @@ class BackupViewSet(viewsets.ViewSet):
'checklists': [],
'lodging': [],
'visited_cities': [],
'visited_regions': []
'visited_regions': [],
'itinerary_items': []
}
# Export Visited Cities
@@ -75,6 +77,9 @@ class BackupViewSet(viewsets.ViewSet):
'icon': category.icon,
})
# Track images so we can reference them for collection primary images
image_export_map = {}
# Export Collections
for idx, collection in enumerate(user.collection_set.all()):
export_data['collections'].append({
@@ -175,7 +180,7 @@ class BackupViewSet(viewsets.ViewSet):
location_data['trails'].append(trail_data)
# Add images
for image in location.images.all():
for image_index, image in enumerate(location.images.all()):
image_data = {
'immich_id': image.immich_id,
'is_primary': image.is_primary,
@@ -184,6 +189,13 @@ class BackupViewSet(viewsets.ViewSet):
if image.image:
image_data['filename'] = image.image.name.split('/')[-1]
location_data['images'].append(image_data)
image_export_map[image.id] = {
'location_export_id': idx,
'image_index': image_index,
'immich_id': image.immich_id,
'filename': image_data['filename'],
}
# Add attachments
for attachment in location.attachments.all():
@@ -196,14 +208,21 @@ class BackupViewSet(viewsets.ViewSet):
location_data['attachments'].append(attachment_data)
export_data['locations'].append(location_data)
# Attach collection primary image references (if any)
for idx, collection in enumerate(user.collection_set.all()):
primary = collection.primary_image
if primary and primary.id in image_export_map:
export_data['collections'][idx]['primary_image'] = image_export_map[primary.id]
# Export Transportation
for transport in user.transportation_set.all():
for idx, transport in enumerate(user.transportation_set.all()):
collection_export_id = None
if transport.collection:
collection_export_id = collection_name_to_id.get(transport.collection.name)
export_data['transportation'].append({
'export_id': idx,
'type': transport.type,
'name': transport.name,
'description': transport.description,
@@ -225,12 +244,13 @@ class BackupViewSet(viewsets.ViewSet):
})
# Export Notes
for note in user.note_set.all():
for idx, note in enumerate(user.note_set.all()):
collection_export_id = None
if note.collection:
collection_export_id = collection_name_to_id.get(note.collection.name)
export_data['notes'].append({
'export_id': idx,
'name': note.name,
'content': note.content,
'links': note.links,
@@ -240,12 +260,13 @@ class BackupViewSet(viewsets.ViewSet):
})
# Export Checklists
for checklist in user.checklist_set.all():
for idx, checklist in enumerate(user.checklist_set.all()):
collection_export_id = None
if checklist.collection:
collection_export_id = collection_name_to_id.get(checklist.collection.name)
checklist_data = {
'export_id': idx,
'name': checklist.name,
'date': checklist.date.isoformat() if checklist.date else None,
'is_public': checklist.is_public,
@@ -263,12 +284,13 @@ class BackupViewSet(viewsets.ViewSet):
export_data['checklists'].append(checklist_data)
# Export Lodging
for lodging in user.lodging_set.all():
for idx, lodging in enumerate(user.lodging_set.all()):
collection_export_id = None
if lodging.collection:
collection_export_id = collection_name_to_id.get(lodging.collection.name)
export_data['lodging'].append({
'export_id': idx,
'name': lodging.name,
'type': lodging.type,
'description': lodging.description,
@@ -286,6 +308,41 @@ class BackupViewSet(viewsets.ViewSet):
'collection_export_id': collection_export_id
})
# Export Itinerary Items
# Create export_id mappings for all content types
location_id_to_export_id = {loc.id: idx for idx, loc in enumerate(user.location_set.all())}
transportation_id_to_export_id = {t.id: idx for idx, t in enumerate(user.transportation_set.all())}
note_id_to_export_id = {n.id: idx for idx, n in enumerate(user.note_set.all())}
lodging_id_to_export_id = {l.id: idx for idx, l in enumerate(user.lodging_set.all())}
checklist_id_to_export_id = {c.id: idx for idx, c in enumerate(user.checklist_set.all())}
for collection_idx, collection in enumerate(user.collection_set.all()):
for itinerary_item in collection.itinerary_items.all():
content_type_str = itinerary_item.content_type.model
item_reference = None
# Determine how to reference the item based on content type using export_ids
if content_type_str == 'location':
item_reference = location_id_to_export_id.get(itinerary_item.object_id)
elif content_type_str == 'transportation':
item_reference = transportation_id_to_export_id.get(itinerary_item.object_id)
elif content_type_str == 'note':
item_reference = note_id_to_export_id.get(itinerary_item.object_id)
elif content_type_str == 'lodging':
item_reference = lodging_id_to_export_id.get(itinerary_item.object_id)
elif content_type_str == 'checklist':
item_reference = checklist_id_to_export_id.get(itinerary_item.object_id)
if item_reference is not None:
export_data['itinerary_items'].append({
'collection_export_id': collection_idx,
'content_type': content_type_str,
'item_reference': item_reference,
'date': itinerary_item.date.isoformat() if itinerary_item.date else None,
'is_global': itinerary_item.is_global,
'order': itinerary_item.order
})
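Database PKs are not portable across installs, so the export above references every object by its position in the export (`export_id`) rather than by PK. The mapping trick in isolation (the `Item` tuple and names are illustrative):

```python
from collections import namedtuple

Item = namedtuple('Item', ['id', 'name'])
items = [Item(id=907, name='flight'), Item(id=42, name='train')]

# Build the PK -> positional export_id map, as the backup does per content type
id_to_export_id = {item.id: idx for idx, item in enumerate(items)}
assert id_to_export_id == {907: 0, 42: 1}

# On import, the positional reference resolves against the freshly created rows,
# whose new PKs need not match the originals
restored = {idx: Item(id=1000 + idx, name=item.name) for idx, item in enumerate(items)}
assert restored[id_to_export_id[42]].name == 'train'
```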
# Create ZIP file
with tempfile.NamedTemporaryFile(delete=False, suffix='.zip') as tmp_file:
with zipfile.ZipFile(tmp_file.name, 'w', zipfile.ZIP_DEFLATED) as zip_file:
@@ -402,6 +459,9 @@ class BackupViewSet(viewsets.ViewSet):
def _clear_user_data(self, user):
"""Clear all existing user data before import"""
# Delete itinerary items first (they reference collections and content)
CollectionItineraryItem.objects.filter(collection__user=user).delete()
# Delete in reverse order of dependencies
user.activity_set.all().delete() # Delete activities first
user.trail_set.all().delete() # Delete trails
@@ -439,7 +499,7 @@ class BackupViewSet(viewsets.ViewSet):
'transportation': 0, 'notes': 0, 'checklists': 0,
'checklist_items': 0, 'lodging': 0, 'images': 0,
'attachments': 0, 'visited_cities': 0, 'visited_regions': 0,
'trails': 0, 'activities': 0, 'gpx_files': 0
'trails': 0, 'activities': 0, 'gpx_files': 0, 'itinerary_items': 0
}
# Import Visited Cities
@@ -475,6 +535,9 @@ class BackupViewSet(viewsets.ViewSet):
category_map[cat_data['name']] = category
summary['categories'] += 1
pending_primary_images = []
location_images_map = {}
# Import Collections
for col_data in backup_data.get('collections', []):
collection = Collection.objects.create(
@@ -498,6 +561,13 @@ class BackupViewSet(viewsets.ViewSet):
collection.shared_with.add(shared_user)
except User.DoesNotExist:
pass
# Defer primary image assignment until images are created
if col_data.get('primary_image'):
pending_primary_images.append({
'collection_export_id': col_data['export_id'],
'data': col_data['primary_image'],
})
# Import Locations
for adv_data in backup_data.get('locations', []):
@@ -541,6 +611,7 @@ class BackupViewSet(viewsets.ViewSet):
)
location.save(_skip_geocode=True) # Skip geocoding for now
location_map[adv_data['export_id']] = location
location_images_map.setdefault(adv_data['export_id'], [])
# Add to collections using export_ids - MUST be done after save()
for collection_export_id in adv_data.get('collection_export_ids', []):
@@ -638,13 +709,14 @@ class BackupViewSet(viewsets.ViewSet):
for img_data in adv_data.get('images', []):
immich_id = img_data.get('immich_id')
if immich_id:
ContentImage.objects.create(
new_img = ContentImage.objects.create(
user=user,
immich_id=immich_id,
is_primary=img_data.get('is_primary', False),
content_type=content_type,
object_id=location.id
)
location_images_map[adv_data['export_id']].append(new_img)
summary['images'] += 1
else:
filename = img_data.get('filename')
@@ -652,13 +724,14 @@ class BackupViewSet(viewsets.ViewSet):
try:
img_content = zip_file.read(f'images/{filename}')
img_file = ContentFile(img_content, name=filename)
ContentImage.objects.create(
new_img = ContentImage.objects.create(
user=user,
image=img_file,
is_primary=img_data.get('is_primary', False),
content_type=content_type,
object_id=location.id
)
location_images_map[adv_data['export_id']].append(new_img)
summary['images'] += 1
except KeyError:
pass
@@ -682,14 +755,32 @@ class BackupViewSet(viewsets.ViewSet):
pass
summary['locations'] += 1
# Apply primary image selections now that images exist
for entry in pending_primary_images:
collection = collection_map.get(entry['collection_export_id'])
data = entry.get('data', {}) or {}
if not collection:
continue
loc_export_id = data.get('location_export_id')
img_index = data.get('image_index')
if loc_export_id is None or img_index is None:
continue
images_for_location = location_images_map.get(loc_export_id, [])
if 0 <= img_index < len(images_for_location):
collection.primary_image = images_for_location[img_index]
collection.save(update_fields=['primary_image'])
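Primary-image restoration above is a two-phase pattern: collections are created first with a recorded forward reference, and the reference is resolved only after the images exist. Sketched with plain dicts (all names here are illustrative):

```python
# Two-phase restore: create parents first, record forward references, resolve later
pending = []
created = {}

def create_collection(export_id, primary_ref=None):
    """Create the collection immediately; defer the primary-image link."""
    created[export_id] = {'name': f'col-{export_id}', 'primary_image': None}
    if primary_ref is not None:
        pending.append((export_id, primary_ref))

create_collection(0, primary_ref={'location_export_id': 5, 'image_index': 1})

# Images come into existence in a later phase of the import
images = {5: ['img-a', 'img-b']}

# Resolution pass: only now can the forward references be satisfied
for export_id, ref in pending:
    imgs = images.get(ref['location_export_id'], [])
    if 0 <= ref['image_index'] < len(imgs):
        created[export_id]['primary_image'] = imgs[ref['image_index']]

assert created[0]['primary_image'] == 'img-b'
```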
# Import Transportation
transportation_map = {} # Map export_id to actual transportation object
for trans_data in backup_data.get('transportation', []):
collection = None
if trans_data.get('collection_export_id') is not None:
collection = collection_map.get(trans_data['collection_export_id'])
Transportation.objects.create(
transportation = Transportation.objects.create(
user=user,
type=trans_data['type'],
name=trans_data['name'],
@@ -710,15 +801,19 @@ class BackupViewSet(viewsets.ViewSet):
is_public=trans_data.get('is_public', False),
collection=collection
)
# Only add to map if export_id exists (for backward compatibility with old backups)
if 'export_id' in trans_data:
transportation_map[trans_data['export_id']] = transportation
summary['transportation'] += 1
# Import Notes
note_map = {} # Map export_id to actual note object
for note_data in backup_data.get('notes', []):
collection = None
if note_data.get('collection_export_id') is not None:
collection = collection_map.get(note_data['collection_export_id'])
Note.objects.create(
note = Note.objects.create(
user=user,
name=note_data['name'],
content=note_data.get('content'),
@@ -727,9 +822,13 @@ class BackupViewSet(viewsets.ViewSet):
is_public=note_data.get('is_public', False),
collection=collection
)
# Only add to map if export_id exists (for backward compatibility with old backups)
if 'export_id' in note_data:
note_map[note_data['export_id']] = note
summary['notes'] += 1
# Import Checklists
checklist_map = {} # Map export_id to actual checklist object
for check_data in backup_data.get('checklists', []):
collection = None
if check_data.get('collection_export_id') is not None:
@@ -753,15 +852,19 @@ class BackupViewSet(viewsets.ViewSet):
)
summary['checklist_items'] += 1
# Only add to map if export_id exists (for backward compatibility with old backups)
if 'export_id' in check_data:
checklist_map[check_data['export_id']] = checklist
summary['checklists'] += 1
# Import Lodging
lodging_map = {} # Map export_id to actual lodging object
for lodg_data in backup_data.get('lodging', []):
collection = None
if lodg_data.get('collection_export_id') is not None:
collection = collection_map.get(lodg_data['collection_export_id'])
Lodging.objects.create(
lodging = Lodging.objects.create(
user=user,
name=lodg_data['name'],
type=lodg_data.get('type', 'other'),
@@ -779,6 +882,51 @@ class BackupViewSet(viewsets.ViewSet):
is_public=lodg_data.get('is_public', False),
collection=collection
)
# Only add to map if export_id exists (for backward compatibility with old backups)
if 'export_id' in lodg_data:
lodging_map[lodg_data['export_id']] = lodging
summary['lodging'] += 1
# Import Itinerary Items
# Maps already created during import of each content type
for itinerary_data in backup_data.get('itinerary_items', []):
collection = collection_map.get(itinerary_data['collection_export_id'])
if not collection:
continue
content_type_str = itinerary_data['content_type']
item_reference = itinerary_data['item_reference']
# Get the actual object based on content type
content_object = None
content_type = None
if content_type_str == 'location':
content_object = location_map.get(item_reference) # item_reference is export_id
content_type = ContentType.objects.get(model='location')
elif content_type_str == 'transportation':
content_object = transportation_map.get(item_reference) # item_reference is export_id
content_type = ContentType.objects.get(model='transportation')
elif content_type_str == 'note':
content_object = note_map.get(item_reference) # item_reference is export_id
content_type = ContentType.objects.get(model='note')
elif content_type_str == 'lodging':
content_object = lodging_map.get(item_reference) # item_reference is export_id
content_type = ContentType.objects.get(model='lodging')
elif content_type_str == 'checklist':
content_object = checklist_map.get(item_reference) # item_reference is export_id
content_type = ContentType.objects.get(model='checklist')
if content_object and content_type:
CollectionItineraryItem.objects.create(
collection=collection,
content_type=content_type,
object_id=content_object.id,
date=itinerary_data.get('date') if not itinerary_data.get('is_global') else None,
is_global=bool(itinerary_data.get('is_global', False)),
order=itinerary_data['order']
)
summary['itinerary_items'] += 1
return summary

View File

@@ -0,0 +1,518 @@
from adventures.models import Location, Collection, CollectionItineraryItem, Transportation, Note, Lodging, Visit, Checklist, CollectionItineraryDay
import datetime
from django.utils.dateparse import parse_date, parse_datetime
from django.contrib.contenttypes.models import ContentType
from django.db import models
from adventures.serializers import CollectionItineraryItemSerializer, CollectionItineraryDaySerializer
from adventures.utils.itinerary import reorder_itinerary_items
from adventures.utils.autogenerate_itinerary import auto_generate_itinerary
from rest_framework import viewsets, status
from rest_framework.decorators import action
from rest_framework.response import Response
from rest_framework.exceptions import ValidationError, PermissionDenied
from adventures.permissions import IsOwnerOrSharedWithFullAccess
from django.db.models import Q
from django.db import transaction
from django.utils import timezone
class ItineraryViewSet(viewsets.ModelViewSet):
serializer_class = CollectionItineraryItemSerializer
permission_classes = [IsOwnerOrSharedWithFullAccess]
def get_queryset(self):
user = self.request.user
if not user.is_authenticated:
return CollectionItineraryItem.objects.none()
# Return itinerary items from collections the user owns or is shared with
return CollectionItineraryItem.objects.filter(
Q(collection__user=user) | Q(collection__shared_with=user)
).distinct().select_related('collection', 'collection__user').order_by('date', 'order')
def create(self, request, *args, **kwargs):
"""
Accept 'content_type' as either a ContentType PK or a model name string
(e.g. 'location', 'lodging', 'transportation', 'note', 'visit'). If a
string is provided we resolve it to the appropriate ContentType PK and
validate the referenced object exists and the user has permission to
access it.
Optional parameter 'update_item_date': if True, update the actual item's
date field to match the itinerary date.
"""
if not request.user.is_authenticated:
return Response({"error": "User is not authenticated"}, status=status.HTTP_401_UNAUTHORIZED)
data = request.data.copy()
content_type_val = data.get('content_type')
object_id = data.get('object_id')
update_item_date = data.get('update_item_date', False)
target_date = data.get('date')
is_global = data.get('is_global', False)
# Normalize is_global to boolean
if isinstance(is_global, str):
is_global = is_global.lower() in ['1', 'true', 'yes']
data['is_global'] = is_global
# Support legacy field 'location' -> treat as content_type='location'
if not content_type_val and data.get('location'):
content_type_val = 'location'
object_id = object_id or data.get('location')
data['content_type'] = content_type_val
data['object_id'] = object_id
# If content_type is provided as a string model name, map to ContentType PK.
# Initialize content_object so the update_item_date check below never hits a NameError
content_object = None
if content_type_val and isinstance(content_type_val, str):
# If it's already numeric-like, leave it
if not content_type_val.isdigit():
content_map = {
'location': Location,
'transportation': Transportation,
'note': Note,
'lodging': Lodging,
'visit': Visit,
'checklist': Checklist,
}
if content_type_val not in content_map:
return Response({
'error': f"Invalid content_type. Must be one of: {', '.join(content_map.keys())}"
}, status=status.HTTP_400_BAD_REQUEST)
model_class = content_map[content_type_val]
# Validate referenced object exists
try:
content_object = model_class.objects.get(id=object_id)
except (ValueError, model_class.DoesNotExist):
return Response({'error': f"{content_type_val} not found"}, status=status.HTTP_404_NOT_FOUND)
# Permission check
permission_checker = IsOwnerOrSharedWithFullAccess()
if not permission_checker.has_object_permission(request, self, content_object):
return Response({'error': 'User does not have permission to access this content'}, status=status.HTTP_403_FORBIDDEN)
ct = ContentType.objects.get_for_model(model_class)
data['content_type'] = ct.pk
# If update_item_date is True and target_date is provided, update the item's date
if update_item_date and target_date and content_object:
# Extract just the date part if target_date is datetime
clean_date = str(target_date).split('T')[0] if 'T' in str(target_date) else str(target_date)
# For locations, create an all-day visit instead of updating a date field
if content_type_val == 'location':
# Determine start/end bounds. Support single date or optional start_date/end_date in payload.
# Prefer explicit start_date/end_date if provided, otherwise use the single target date.
start_input = data.get('start_date') or clean_date
end_input = data.get('end_date') or clean_date
def parse_bounds(val):
if not val:
return None
s = str(val)
# If datetime string provided, parse directly
if 'T' in s:
dt = parse_datetime(s)
return dt
# Otherwise parse as date and convert to datetime at start/end of day
d = parse_date(s)
if d:
return d
return None
# Normalize to date or datetime values
parsed_start = parse_bounds(start_input)
parsed_end = parse_bounds(end_input)
# Convert plain dates to datetimes spanning the full day
if isinstance(parsed_start, datetime.date) and not isinstance(parsed_start, datetime.datetime):
new_start = datetime.datetime.combine(parsed_start, datetime.time.min)
elif isinstance(parsed_start, datetime.datetime):
new_start = parsed_start
else:
new_start = None
if isinstance(parsed_end, datetime.date) and not isinstance(parsed_end, datetime.datetime):
new_end = datetime.datetime.combine(parsed_end, datetime.time.max)
elif isinstance(parsed_end, datetime.datetime):
new_end = parsed_end
else:
new_end = None
# If we couldn't parse bounds, fallback to the all-day target date
if not new_start or not new_end:
try:
d = parse_date(clean_date)
new_start = datetime.datetime.combine(d, datetime.time.min)
new_end = datetime.datetime.combine(d, datetime.time.max)
except Exception:
new_start = None
new_end = None
# Update existing visit or create new one
# When moving between days, update the existing visit to preserve visit ID and data
if new_start and new_end:
source_visit_id = data.get('source_visit_id')
visit_updated = False
# If a source visit is provided, update it in place to preserve its ID and data
if source_visit_id:
try:
source_visit = Visit.objects.get(id=source_visit_id, location=content_object)
source_visit.start_date = new_start
source_visit.end_date = new_end
source_visit.save(update_fields=['start_date', 'end_date'])
visit_updated = True
except Visit.DoesNotExist:
# Fall back to the overlap/create logic below
pass
# If no source visit was updated, check for overlapping visits
if not visit_updated:
# Check for exact match to avoid duplicates
exact_match = Visit.objects.filter(
location=content_object,
start_date=new_start,
end_date=new_end
).exists()
if not exact_match:
# Check for any overlapping visits
overlap_q = Q(start_date__lte=new_end) & Q(end_date__gte=new_start)
existing = Visit.objects.filter(location=content_object).filter(overlap_q).first()
if existing:
# Update existing overlapping visit
existing.start_date = new_start
existing.end_date = new_end
existing.save(update_fields=['start_date', 'end_date'])
else:
# Create new visit
Visit.objects.create(
location=content_object,
start_date=new_start,
end_date=new_end,
notes="Created from itinerary planning"
)
else:
# For other item types, update their date field and preserve duration
if content_type_val == 'transportation':
# For transportation: update date and end_date, preserving duration and times
if hasattr(content_object, 'date') and hasattr(content_object, 'end_date'):
old_date = content_object.date
old_end_date = content_object.end_date
if old_date and old_end_date:
# Extract time from original start date
original_time = old_date.time()
# Create new_date with the new date but preserve the original time
new_date = datetime.datetime.combine(parse_date(clean_date), original_time)
# Duration = end_date - date
duration = old_end_date - old_date
# Apply same duration to new date
new_end_date = new_date + duration
else:
# No original end date, set to same as start date
new_date = datetime.datetime.combine(parse_date(clean_date), datetime.time.min)
new_end_date = new_date
content_object.date = new_date
content_object.end_date = new_end_date
content_object.save(update_fields=['date', 'end_date'])
elif content_type_val == 'lodging':
# For lodging: update check_in and check_out, preserving duration and times
if hasattr(content_object, 'check_in') and hasattr(content_object, 'check_out'):
old_check_in = content_object.check_in
old_check_out = content_object.check_out
if old_check_in and old_check_out:
# Extract time from original check_in
original_time = old_check_in.time()
# Create new_check_in with the new date but preserve the original time
new_check_in = datetime.datetime.combine(parse_date(clean_date), original_time)
# Duration = check_out - check_in
duration = old_check_out - old_check_in
# Apply same duration to new check_in
new_check_out = new_check_in + duration
else:
# No original dates: check_in at midnight on selected day, check_out at midnight next day
new_check_in = datetime.datetime.combine(parse_date(clean_date), datetime.time.min)
new_check_out = new_check_in + datetime.timedelta(days=1)
content_object.check_in = new_check_in
content_object.check_out = new_check_out
content_object.save(update_fields=['check_in', 'check_out'])
else:
# For note, checklist, etc. - just update the date field
date_field = None
if hasattr(content_object, 'date'):
date_field = 'date'
elif hasattr(content_object, 'start_date'):
date_field = 'start_date'
if date_field:
setattr(content_object, date_field, clean_date)
content_object.save(update_fields=[date_field])
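The transportation and lodging branches above apply the same move rule: keep the original time-of-day and total duration, changing only the calendar date. A self-contained sketch of that rule:

```python
import datetime

def move_to_date(start, end, new_day):
    """Reschedule a [start, end] span onto new_day, preserving time-of-day and duration."""
    new_start = datetime.datetime.combine(new_day, start.time())
    return new_start, new_start + (end - start)

check_in = datetime.datetime(2024, 6, 1, 15, 0)   # 3 pm check-in
check_out = datetime.datetime(2024, 6, 3, 11, 0)  # 11 am check-out, two nights later
new_in, new_out = move_to_date(check_in, check_out, datetime.date(2024, 7, 10))
assert new_in == datetime.datetime(2024, 7, 10, 15, 0)
assert new_out == datetime.datetime(2024, 7, 12, 11, 0)  # same 1 day 20 h stay
```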
# Ensure order is unique for this collection+group combination (day or global)
collection_id = data.get('collection')
item_date = data.get('date')
item_order = data.get('order', 0)
# Basic XOR validation between date and is_global
if is_global and item_date:
return Response({'error': 'Global itinerary items must not include a date.'}, status=status.HTTP_400_BAD_REQUEST)
if (not is_global) and not item_date:
return Response({'error': 'Dated itinerary items must include a date.'}, status=status.HTTP_400_BAD_REQUEST)
# Validate that the itinerary date (if provided) falls within the
# collection's start_date/end_date range (if those bounds are set).
if collection_id and item_date and not is_global:
# Try parse date or datetime-like values
parsed_date = None
try:
parsed_date = parse_date(str(item_date))
except Exception:
parsed_date = None
if parsed_date is None:
try:
dt = parse_datetime(str(item_date))
if dt:
parsed_date = dt.date()
except Exception:
parsed_date = None
if parsed_date is not None:
try:
collection_obj = Collection.objects.get(id=collection_id)
except Collection.DoesNotExist:
return Response({'error': 'Collection not found'}, status=status.HTTP_404_NOT_FOUND)
if collection_obj.start_date and parsed_date < collection_obj.start_date:
return Response({'error': 'Itinerary item date is before the collection start_date'}, status=status.HTTP_400_BAD_REQUEST)
if collection_obj.end_date and parsed_date > collection_obj.end_date:
return Response({'error': 'Itinerary item date is after the collection end_date'}, status=status.HTTP_400_BAD_REQUEST)
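The bounds check above first tries to parse the incoming value as a plain date and, failing that, as a full datetime whose date part is taken. That two-step fallback can be exercised in isolation; `coerce_to_date` is a hypothetical stdlib stand-in for Django's `parse_date`/`parse_datetime` pair:

```python
from datetime import date, datetime

def coerce_to_date(value):
    """Return a date from a date- or datetime-like string, else None.

    Mirrors the view's two-step fallback: try a plain ISO date first,
    then a full datetime and keep only its date part.
    """
    text = str(value)
    try:
        return date.fromisoformat(text)
    except ValueError:
        pass
    try:
        return datetime.fromisoformat(text).date()
    except ValueError:
        return None
```

Either branch yields a `date`, so the caller can compare it directly against the collection's `start_date`/`end_date`.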
if collection_id:
if is_global:
# Max order within global group
existing_max = CollectionItineraryItem.objects.filter(
collection_id=collection_id,
is_global=True
).aggregate(max_order=models.Max('order'))['max_order']
if existing_max is None:
existing_max = -1
if item_order is None or item_order <= existing_max:
data['order'] = existing_max + 1
elif item_date:
# Find the maximum order for this collection+date
existing_max = CollectionItineraryItem.objects.filter(
collection_id=collection_id,
date=item_date,
is_global=False
).aggregate(max_order=models.Max('order'))['max_order']
# Check if the requested order conflicts with existing items
if existing_max is not None and item_order <= existing_max:
# Assign next available order
data['order'] = existing_max + 1
# Proceed with normal serializer flow using modified data
serializer = self.get_serializer(data=data)
serializer.is_valid(raise_exception=True)
self.perform_create(serializer)
# If we updated the item's date, include the updated object in response for frontend sync
response_data = serializer.data
if update_item_date and content_type_val and object_id:
if content_type_val == 'transportation':
try:
t = Transportation.objects.get(id=object_id)
from adventures.serializers import TransportationSerializer
response_data['updated_object'] = TransportationSerializer(t).data
except Transportation.DoesNotExist:
pass
elif content_type_val == 'lodging':
try:
l = Lodging.objects.get(id=object_id)
from adventures.serializers import LodgingSerializer
response_data['updated_object'] = LodgingSerializer(l).data
except Lodging.DoesNotExist:
pass
headers = self.get_success_headers(serializer.data)
return Response(response_data, status=status.HTTP_201_CREATED, headers=headers)
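The create path above reschedules a linked transportation or lodging record to a new day while preserving its original time-of-day and duration. A minimal standalone sketch of that arithmetic (function name and signature are illustrative, not the view's API):

```python
from datetime import date, datetime, time, timedelta

def shift_to_day(start, end, new_day):
    """Move a (start, end) datetime pair to new_day, keeping the
    original start time and the original duration."""
    if start and end:
        new_start = datetime.combine(new_day, start.time())
        return new_start, new_start + (end - start)
    # No prior dates: midnight-to-midnight on the selected day,
    # matching the lodging fallback in the view.
    new_start = datetime.combine(new_day, time.min)
    return new_start, new_start + timedelta(days=1)
```

A three-night stay checked in at 15:00 stays a three-night, 15:00 check-in stay no matter which day it is dragged to.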
@transaction.atomic
def destroy(self, request, *args, **kwargs):
"""
Override destroy to remove associated visits when deleting a location itinerary item.
When removing a location from the itinerary, any PLANNED visits (future visits) at
that location on the same date as the itinerary item should also be removed.
If preserve_visits=true query parameter is provided, visits will NOT be deleted.
This is useful when moving items to global/trip context where we want to keep the visits.
"""
instance = self.get_object()
preserve_visits = request.query_params.get('preserve_visits', 'false').lower() == 'true'
# Check if this is a location type itinerary item
location_ct = ContentType.objects.get_for_model(Location)
if instance.content_type == location_ct and instance.object_id and not preserve_visits:
try:
location = Location.objects.get(id=instance.object_id)
itinerary_date = instance.date
if itinerary_date:
# Convert itinerary date to datetime for comparison
if isinstance(itinerary_date, str):
itinerary_date = parse_date(itinerary_date)
# Find and delete visits at this location on this date
# When removing from itinerary, we remove the associated visit
visits_to_delete = Visit.objects.filter(
location=location,
start_date__date=itinerary_date
)
deleted_count = visits_to_delete.count()
if deleted_count > 0:
visits_to_delete.delete()
except Location.DoesNotExist:
# Location doesn't exist, just proceed with deleting the itinerary item
pass
# Call parent destroy to delete the itinerary item
return super().destroy(request, *args, **kwargs)
@action(detail=False, methods=['post'], url_path='reorder')
@transaction.atomic
def reorder(self, request):
"""
Reorder itinerary items in bulk.
Expected payload:
{
"items": [
{"id": "uuid", "date": "2024-01-01", "order": 0},
{"id": "uuid", "date": "2024-01-01", "order": 1},
...
]
}
"""
items_data = request.data.get('items', [])
# Delegate to reusable helper which handles validation, permission checks
# and the two-phase update to avoid unique constraint races.
updated_items = reorder_itinerary_items(request.user, items_data)
serializer = self.get_serializer(updated_items, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
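`reorder` delegates to a helper described as performing a "two-phase update to avoid unique constraint races". The idea, sketched in plain Python over a dict rather than a queryset (the real helper also validates ownership): with a unique constraint on `(group, order)`, writing final orders directly could collide with a row that has not moved yet, so every touched row is first parked at a temporary non-colliding order.

```python
def two_phase_reorder(items, updates):
    """items: dict id -> order; updates: list of (id, new_order)."""
    for i, (item_id, _) in enumerate(updates):
        items[item_id] = -(i + 1)      # phase 1: temporary negative orders
    for item_id, new_order in updates:
        items[item_id] = new_order     # phase 2: final orders
    return items
```

In a plain dict the first phase is a no-op; against a database table with a uniqueness constraint it is what keeps each intermediate write valid.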
@action(detail=False, methods=['post'], url_path='auto-generate')
@transaction.atomic
def auto_generate(self, request):
"""
Auto-generate itinerary items for a collection based on dated records.
Only works when:
- Collection has zero itinerary items
- Collection has dated records (visits, lodging, transportation, notes, checklists)
Expected payload:
{
"collection_id": "uuid"
}
Returns: List of created itinerary items
"""
collection_id = request.data.get('collection_id')
if not collection_id:
return Response(
{"error": "collection_id is required"},
status=status.HTTP_400_BAD_REQUEST
)
# Get collection and check permissions
try:
collection = Collection.objects.get(id=collection_id)
except Collection.DoesNotExist:
return Response(
{"error": "Collection not found"},
status=status.HTTP_404_NOT_FOUND
)
# Permission check: user must be collection owner or in shared_with
if not (collection.user == request.user or collection.shared_with.filter(id=request.user.id).exists()):
return Response(
{"error": "You do not have permission to modify this collection"},
status=status.HTTP_403_FORBIDDEN
)
try:
created_items = auto_generate_itinerary(collection)
serializer = self.get_serializer(created_items, many=True)
return Response({
"message": f"Successfully generated {len(created_items)} itinerary items",
"items": serializer.data
}, status=status.HTTP_201_CREATED)
except ValidationError as e:
return Response(e.detail, status=status.HTTP_400_BAD_REQUEST)
class ItineraryDayViewSet(viewsets.ModelViewSet):
"""ViewSet for managing itinerary day metadata (names and descriptions)"""
serializer_class = CollectionItineraryDaySerializer
permission_classes = [IsOwnerOrSharedWithFullAccess]
def get_queryset(self):
user = self.request.user
if not user.is_authenticated:
return CollectionItineraryDay.objects.none()
# Return day metadata from collections the user owns or is shared with
return CollectionItineraryDay.objects.filter(
Q(collection__user=user) | Q(collection__shared_with=user)
).distinct().select_related('collection', 'collection__user').order_by('date')
def perform_create(self, serializer):
"""Ensure the user has permission to modify the collection"""
collection = serializer.validated_data.get('collection')
if not collection:
raise ValidationError("Collection is required")
# Check if user has permission to modify this collection
if not (collection.user == self.request.user or
collection.shared_with.filter(id=self.request.user.id).exists()):
raise PermissionDenied("You do not have permission to modify this collection")
serializer.save()
def perform_update(self, serializer):
"""Ensure the user has permission to modify the collection"""
instance = self.get_object()
collection = instance.collection
# Check if user has permission to modify this collection
if not (collection.user == self.request.user or
collection.shared_with.filter(id=self.request.user.id).exists()):
raise PermissionDenied("You do not have permission to modify this collection")
serializer.save()
def perform_destroy(self, instance):
"""Ensure the user has permission to modify the collection"""
collection = instance.collection
# Check if user has permission to modify this collection
if not (collection.user == self.request.user or
collection.shared_with.filter(id=self.request.user.id).exists()):
raise PermissionDenied("You do not have permission to modify this collection")
instance.delete()
@@ -1,15 +1,16 @@
from django.utils import timezone
from django.db import transaction
from django.core.exceptions import PermissionDenied
from django.db.models import Q, Max, Prefetch
from django.db.models.functions import Lower
from rest_framework import viewsets, status
from rest_framework.decorators import action
from rest_framework.response import Response
import requests
from adventures.models import Location, Category, CollectionItineraryItem, Visit
from django.contrib.contenttypes.models import ContentType
from adventures.permissions import IsOwnerOrSharedWithFullAccess
from adventures.serializers import LocationSerializer, MapPinSerializer, CalendarLocationSerializer
from adventures.utils import pagination
class LocationViewSet(viewsets.ModelViewSet):
@@ -208,6 +209,29 @@ class LocationViewSet(viewsets.ModelViewSet):
serializer = self.get_serializer(queryset, many=True, context={'nested': nested, 'allowed_nested_fields': allowedNestedFields})
return Response(serializer.data)
@action(detail=False, methods=['get'])
def calendar(self, request):
"""Return a lightweight payload for calendar rendering."""
if not request.user.is_authenticated:
return Response({"error": "User is not authenticated"}, status=400)
queryset = (
self.get_queryset()
.filter(visits__isnull=False)
.select_related('category')
.prefetch_related(
Prefetch(
'visits',
queryset=Visit.objects.only('id', 'start_date', 'end_date', 'timezone')
)
)
.only('id', 'name', 'location', 'category__name', 'category__icon')
.distinct()
)
serializer = CalendarLocationSerializer(queryset, many=True)
return Response(serializer.data)
@action(detail=True, methods=['get'], url_path='additional-info')
def additional_info(self, request, pk=None):
"""Get adventure with additional sunrise/sunset information."""
@@ -277,6 +301,25 @@ class LocationViewSet(viewsets.ModelViewSet):
raise PermissionDenied(
"You don't have permission to remove this location from one of the collections it's linked to."
)
else:
# If the removal is permitted, also remove any itinerary items
# in this collection that reference this Location instance.
try:
ct = ContentType.objects.get_for_model(instance.__class__)
# Try deleting by native PK type first, then by string.
qs = CollectionItineraryItem.objects.filter(
collection=collection, content_type=ct, object_id=instance.pk
)
if qs.exists():
qs.delete()
else:
CollectionItineraryItem.objects.filter(
collection=collection, content_type=ct, object_id=str(instance.pk)
).delete()
except Exception:
# Don't raise on cleanup failures; deletion of itinerary items
# is best-effort and shouldn't block the update operation.
pass
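The generic relation's `object_id` may hold either the native key or its string form, hence the two queryset filters above. The fallback in isolation, as a plain-Python stand-in over a list of row dicts (names are illustrative):

```python
import uuid

def matching_items(items, pk):
    """Return rows whose object_id matches pk, preferring the native
    type and falling back to its string form."""
    exact = [it for it in items if it['object_id'] == pk]
    return exact if exact else [it for it in items if it['object_id'] == str(pk)]
```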
def _validate_collection_permissions(self, collections):
"""Validate permissions for all collections (used in create)."""
@@ -1,11 +1,13 @@
from rest_framework import viewsets, status
from rest_framework.response import Response
from django.db.models import Q
from adventures.models import Note, CollectionItineraryItem
from adventures.serializers import NoteSerializer
from rest_framework.exceptions import PermissionDenied
from adventures.permissions import IsOwnerOrSharedWithFullAccess
from rest_framework.decorators import action
from django.contrib.contenttypes.models import ContentType
from django.db import transaction
class NoteViewSet(viewsets.ModelViewSet):
queryset = Note.objects.all()
@@ -53,15 +55,18 @@ class NoteViewSet(viewsets.ModelViewSet):
# Retrieve the current object
instance = self.get_object()
# Store the old date before updating
old_date = instance.date
# Partially update the instance with the request data
serializer = self.get_serializer(instance, data=request.data, partial=True)
serializer.is_valid(raise_exception=True)
# Retrieve the collection from the validated data
new_collection = serializer.validated_data.get('collection')
new_date = serializer.validated_data.get('date', old_date)
user = request.user
if new_collection is not None and new_collection!=instance.collection:
# Check if the user is the owner of the new collection
@@ -72,34 +77,15 @@ class NoteViewSet(viewsets.ModelViewSet):
if instance.collection is not None and instance.collection.user != user:
raise PermissionDenied("You cannot remove the collection as you are not the owner.")
# If the date changed, remove old itinerary items for this note on the old date
if old_date and new_date and old_date != new_date:
note_ct = ContentType.objects.get_for_model(Note)
old_itinerary_items = CollectionItineraryItem.objects.filter(
content_type=note_ct,
object_id=str(instance.id),
date=old_date
)
old_itinerary_items.delete()
# Perform the update
self.perform_update(serializer)
# Return the updated instance
return Response(serializer.data)
@@ -5,197 +5,398 @@ from rest_framework.response import Response
from django.conf import settings
import requests
from geopy.distance import geodesic
import logging
from ..geocoding import search_google, search_osm
logger = logging.getLogger(__name__)
class RecommendationsViewSet(viewsets.ViewSet):
permission_classes = [IsAuthenticated]
OVERPASS_URL = "https://overpass-api.de/api/interpreter"
NOMINATIM_URL = "https://nominatim.openstreetmap.org/search"
HEADERS = {'User-Agent': 'AdventureLog Server'}
# Quality thresholds
MIN_GOOGLE_RATING = 3.0 # Minimum rating to include
MIN_GOOGLE_REVIEWS = 5 # Minimum number of reviews
MAX_RESULTS = 50 # Maximum results to return
def calculate_quality_score(self, place_data):
"""
Calculate a quality score based on multiple factors.
Higher score = better quality recommendation.
"""
import math
score = 0.0
# Rating contribution (0-50 points)
rating = place_data.get('rating')
if rating is not None and rating > 0:
score += (rating / 5.0) * 50
# Review count contribution (0-30 points, logarithmic scale)
review_count = place_data.get('review_count')
if review_count is not None and review_count > 0:
# Logarithmic scale: 10 reviews = ~10 pts, 100 = ~20 pts, 1000 = ~30 pts
score += min(30, math.log10(review_count) * 10)
# Distance penalty (0-20 points, closer is better)
distance_km = place_data.get('distance_km')
if distance_km is not None:
if distance_km < 1:
score += 20
elif distance_km < 5:
score += 15
elif distance_km < 10:
score += 10
elif distance_km < 20:
score += 5
# Verified/business status bonus (0-10 points)
if place_data.get('is_verified') or place_data.get('business_status') == 'OPERATIONAL':
score += 10
# Has photos bonus (0-5 points)
photos = place_data.get('photos')
if photos and len(photos) > 0:
score += 5
# Has opening hours bonus (0-5 points)
opening_hours = place_data.get('opening_hours')
if opening_hours and len(opening_hours) > 0:
score += 5
return round(score, 2)
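As a worked example of the scoring above: a 4.5-star place with 200 reviews, 0.5 km away, operational, with photos and opening hours scores 45 + ~23 + 20 + 10 + 5 + 5 ≈ 108. The same arithmetic as a standalone function (a copy of the method, not the method itself):

```python
import math

def quality_score(place):
    score = 0.0
    rating = place.get('rating')
    if rating:
        score += (rating / 5.0) * 50                # rating: 0-50 pts
    reviews = place.get('review_count')
    if reviews:
        score += min(30, math.log10(reviews) * 10)  # reviews: 0-30 pts, log scale
    d = place.get('distance_km')
    if d is not None:                               # proximity: 0-20 pts
        score += 20 if d < 1 else 15 if d < 5 else 10 if d < 10 else 5 if d < 20 else 0
    if place.get('is_verified'):
        score += 10                                 # operational/verified bonus
    if place.get('photos'):
        score += 5                                  # has photos
    if place.get('opening_hours'):
        score += 5                                  # has hours
    return round(score, 2)
```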
def parse_google_places(self, places, origin):
"""
Parse Google Places API results into unified format.
Enhanced with quality filtering and comprehensive data extraction.
"""
locations = []
api_key = getattr(settings, 'GOOGLE_MAPS_API_KEY', None)
for place in places:
location = place.get('location', {})
types = place.get('types', [])
# Extract display name
display_name = place.get("displayName", {})
name = display_name.get("text") if isinstance(display_name, dict) else display_name
# Extract coordinates
lat = location.get('latitude')
lon = location.get('longitude')
if not name or not lat or not lon:
continue
# Extract rating information
rating = place.get('rating')
review_count = place.get('userRatingCount', 0)
# Quality filter: Skip low-rated or unreviewed places
if rating and rating < self.MIN_GOOGLE_RATING:
continue
if review_count < self.MIN_GOOGLE_REVIEWS:
continue
# Calculate distance
distance_km = geodesic(origin, (lat, lon)).km
# Extract address information
formatted_address = place.get("formattedAddress") or place.get("shortFormattedAddress")
# Extract business status
business_status = place.get('businessStatus')
is_operational = business_status == 'OPERATIONAL'
# Extract opening hours
opening_hours = place.get('regularOpeningHours', {})
current_opening_hours = place.get('currentOpeningHours', {})
is_open_now = current_opening_hours.get('openNow')
# Extract photos and construct URLs
photos = place.get('photos', [])
photo_urls = []
if photos and api_key:
# Get first 5 photos and construct full URLs
for photo in photos[:5]:
photo_name = photo.get('name', '')
if photo_name:
# Construct Google Places Photo API URL
# Format: https://places.googleapis.com/v1/{name}/media?key={key}&maxHeightPx=800&maxWidthPx=800
photo_url = f"https://places.googleapis.com/v1/{photo_name}/media?key={api_key}&maxHeightPx=800&maxWidthPx=800"
photo_urls.append(photo_url)
# Extract contact information
phone_number = place.get('nationalPhoneNumber') or place.get('internationalPhoneNumber')
website = place.get('websiteUri')
google_maps_uri = place.get('googleMapsUri')
# Extract price level
price_level = place.get('priceLevel')
# Extract editorial summary/description
editorial_summary = place.get('editorialSummary', {})
description = editorial_summary.get('text') if isinstance(editorial_summary, dict) else None
# Filter out unwanted types (generic categories)
filtered_types = [t for t in types if t not in ['point_of_interest', 'establishment']]
# Build unified response
place_data = {
"id": f"google:{place.get('id')}",
"external_id": place.get('id'),
"source": "google",
"name": name,
"description": description,
"latitude": lat,
"longitude": lon,
"address": formatted_address,
"tag": types[0] if types else None,
"distance_km": round(distance_km, 2),
"rating": rating,
"review_count": review_count,
"price_level": price_level,
"types": filtered_types,
"primary_type": filtered_types[0] if filtered_types else None,
"business_status": business_status,
"is_open_now": is_open_now,
"opening_hours": opening_hours.get('weekdayDescriptions', []) if opening_hours else None,
"phone_number": phone_number,
"website": website,
"google_maps_url": google_maps_uri,
"photos": photo_urls,
"is_verified": is_operational,
}
# Calculate quality score
place_data['quality_score'] = self.calculate_quality_score(place_data)
locations.append(place_data)
return locations
def parse_overpass_response(self, data, request, origin):
"""
Parse Overpass API (OSM) results into unified format.
Enhanced with quality filtering and comprehensive data extraction.
"""
nodes = data.get('elements', [])
locations = []
for node in nodes:
if node.get('type') not in ['node', 'way', 'relation']:
continue
tags = node.get('tags', {})
# Get coordinates (for ways/relations, use center)
lat = node.get('lat') or node.get('center', {}).get('lat')
lon = node.get('lon') or node.get('center', {}).get('lon')
# Extract name (with fallbacks)
name = tags.get('name') or tags.get('official_name') or tags.get('alt_name')
if not name or lat is None or lon is None:
continue
# Calculate distance if possible
distance_km = None
if origin:
distance_km = round(geodesic(origin, (lat, lon)).km, 2)
# Extract address information
address_parts = [
tags.get('addr:housenumber'),
tags.get('addr:street'),
tags.get('addr:suburb') or tags.get('addr:neighbourhood'),
tags.get('addr:city'),
tags.get('addr:state'),
tags.get('addr:postcode'),
tags.get('addr:country')
]
formatted_address = ", ".join(filter(None, address_parts)) or None
# Extract contact information
phone = tags.get('phone') or tags.get('contact:phone')
website = tags.get('website') or tags.get('contact:website') or tags.get('url')
# Extract opening hours
opening_hours = tags.get('opening_hours')
# Extract rating/stars (if available)
stars = tags.get('stars')
# Determine category/type hierarchy
category_keys = ['tourism', 'leisure', 'amenity', 'natural', 'historic', 'attraction', 'shop', 'sport']
types = [tags.get(key) for key in category_keys if key in tags]
primary_type = types[0] if types else None
# Extract description and additional info
description = tags.get('description') or tags.get('note')
wikipedia = tags.get('wikipedia') or tags.get('wikidata')
# Extract image if available
image = tags.get('image') or tags.get('wikimedia_commons')
# Quality filters for OSM data
# Skip if it's just a generic POI without specific category
if not primary_type:
continue
# Skip construction or disused places
if tags.get('disused') or tags.get('construction'):
continue
# Build unified response
place_data = {
"id": f"osm:{node.get('type')}:{node.get('id')}",
"external_id": str(node.get('id')),
"source": "osm",
"name": name,
"description": description,
"latitude": lat,
"longitude": lon,
"address": formatted_address,
"tag": next((tags.get(key) for key in ['leisure', 'tourism', 'natural', 'historic', 'amenity'] if key in tags), None),
"distance_km": distance_km,
"rating": None, # OSM doesn't have ratings
"review_count": None,
"price_level": None,
"types": types,
"primary_type": primary_type,
"business_status": None,
"is_open_now": None,
"opening_hours": [opening_hours] if opening_hours else None,
"phone_number": phone,
"website": website,
"google_maps_url": None,
"photos": [image] if image else [],
"is_verified": bool(wikipedia), # Has Wikipedia = more verified
"osm_type": node.get('type'),
"wikipedia": wikipedia,
"stars": stars,
}
# Calculate quality score (will be lower without ratings)
place_data['quality_score'] = self.calculate_quality_score(place_data)
locations.append(place_data)
return locations
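The address flattening inside the parser joins whichever `addr:*` tags are present, in order, skipping the missing ones via `filter(None, ...)`. Extracted as its own helper (a hypothetical name for the inline expression above):

```python
def flatten_address(tags):
    """Join the OSM addr:* components that exist, in parser order."""
    keys = ['addr:housenumber', 'addr:street', 'addr:suburb',
            'addr:city', 'addr:state', 'addr:postcode', 'addr:country']
    return ", ".join(filter(None, (tags.get(k) for k in keys))) or None
```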
def query_overpass(self, lat, lon, radius, category, request):
"""
Query Overpass API (OpenStreetMap) for nearby places.
Enhanced with better queries and error handling.
"""
# Limit radius for OSM to prevent timeouts (max 5km for OSM due to server limits)
osm_radius = min(radius, 5000)
# Build optimized query - use simpler queries and limit results
# Reduced timeout and simplified queries to prevent 504 errors
if category == 'tourism':
query = f"""
[out:json][timeout:25];
(
nwr["tourism"~"attraction|viewpoint|museum|gallery|zoo|aquarium"](around:{osm_radius},{lat},{lon});
nwr["historic"~"monument|castle|memorial"](around:{osm_radius},{lat},{lon});
nwr["leisure"~"park|garden|nature_reserve"](around:{osm_radius},{lat},{lon});
);
out center tags 50;
"""
elif category == 'lodging':
query = f"""
[out:json][timeout:25];
nwr["tourism"~"hotel|motel|guest_house|hostel"](around:{osm_radius},{lat},{lon});
out center tags 50;
"""
elif category == 'food':
query = f"""
[out:json][timeout:25];
nwr["amenity"~"restaurant|cafe|bar|pub"](around:{osm_radius},{lat},{lon});
out center tags 50;
"""
else:
logger.error(f"Invalid category requested: {category}")
return {"error": "Invalid category.", "results": []}
try:
response = requests.post(
self.OVERPASS_URL,
data=query,
headers=self.HEADERS,
timeout=30
)
response.raise_for_status()
data = response.json()
except requests.exceptions.Timeout:
logger.warning(f"Overpass API timeout for {category} at ({lat}, {lon}) with radius {osm_radius}m")
return {"error": f"OpenStreetMap query timed out. The service is overloaded. Radius limited to {int(osm_radius)}m.", "results": []}
except requests.exceptions.HTTPError as e:
if e.response.status_code == 504:
logger.warning(f"Overpass API 504 Gateway Timeout for {category}")
return {"error": "OpenStreetMap server is overloaded. Try again later or use Google source.", "results": []}
logger.warning(f"Overpass API HTTP error: {e}")
return {"error": "OpenStreetMap error: please try again later.", "results": []}
except requests.exceptions.RequestException as e:
logger.warning(f"Overpass API error: {e}")
return {"error": "OpenStreetMap temporarily unavailable: please try again later.", "results": []}
except ValueError as e:
logger.error(f"Invalid JSON response from Overpass: {e}")
return {"error": "Invalid response from OpenStreetMap.", "results": []}
origin = (float(lat), float(lon))
locations = self.parse_overpass_response(data, request, origin)
logger.info(f"Overpass returned {len(locations)} results")
return {"error": None, "results": locations}
def query_google_nearby(self, lat, lon, radius, category, request):
"""
Query Google Places API (New) for nearby places.
Enhanced with comprehensive field masks and better error handling.
"""
api_key = settings.GOOGLE_MAPS_API_KEY
# Updated to use new Places API endpoint
url = "https://places.googleapis.com/v1/places:searchNearby"
# Comprehensive field mask to get all useful information
headers = {
'Content-Type': 'application/json',
'X-Goog-Api-Key': api_key,
'X-Goog-FieldMask': (
'places.id,'
'places.displayName,'
'places.formattedAddress,'
'places.shortFormattedAddress,'
'places.location,'
'places.types,'
'places.rating,'
'places.userRatingCount,'
'places.businessStatus,'
'places.priceLevel,'
'places.websiteUri,'
'places.googleMapsUri,'
'places.nationalPhoneNumber,'
'places.internationalPhoneNumber,'
'places.editorialSummary,'
'places.photos,'
'places.currentOpeningHours,'
'places.regularOpeningHours'
)
}
# Map categories to place types - use multiple types for better coverage
type_mapping = {
'lodging': ['lodging', 'hotel', 'hostel', 'resort_hotel', 'extended_stay_hotel'],
'food': ['restaurant', 'cafe', 'bar', 'bakery', 'meal_takeaway', 'meal_delivery'],
'tourism': ['tourist_attraction', 'museum', 'art_gallery', 'aquarium', 'zoo', 'amusement_park', 'park', 'natural_feature'],
}
payload = {
"includedTypes": type_mapping.get(category, ['tourist_attraction']),
"maxResultCount": 20,
"rankPreference": "DISTANCE", # Sort by distance first
"locationRestriction": {
"circle": {
"center": {
@@ -208,7 +409,7 @@ class RecommendationsViewSet(viewsets.ViewSet):
}
try:
response = requests.post(url, json=payload, headers=headers, timeout=15)
response.raise_for_status()
data = response.json()
@@ -216,41 +417,265 @@ class RecommendationsViewSet(viewsets.ViewSet):
origin = (float(lat), float(lon))
locations = self.parse_google_places(places, origin)
logger.info(f"Google Places returned {len(locations)} quality results for category '{category}'")
return Response(self._prepare_final_results(locations))
except requests.exceptions.Timeout:
logger.warning("Google Places API timeout, falling back to OSM")
return self.query_overpass(lat, lon, radius, category, request)
except requests.exceptions.RequestException as e:
logger.warning(f"Google Places API error: {e}, falling back to OSM")
return self.query_overpass(lat, lon, radius, category, request)
except Exception as e:
logger.error(f"Unexpected error with Google Places API: {e}")
return self.query_overpass(lat, lon, radius, category, request)
def _prepare_final_results(self, locations):
"""
Prepare final results: sort by quality score and limit results.
"""
# Sort by quality score (highest first)
locations.sort(key=lambda x: x.get('quality_score', 0), reverse=True)
# Limit to MAX_RESULTS
locations = locations[:self.MAX_RESULTS]
return locations
def _deduplicate_results(self, google_results, osm_results):
"""
Deduplicate results from both sources based on name and proximity.
Prioritize Google results when duplicates are found.
"""
from difflib import SequenceMatcher
def is_similar(name1, name2, threshold=0.85):
"""Check if two names are similar using fuzzy matching."""
return SequenceMatcher(None, name1.lower(), name2.lower()).ratio() > threshold
def is_nearby(loc1, loc2, max_distance_m=50):
"""Check if two locations are within max_distance_m meters."""
dist = geodesic(
(loc1['latitude'], loc1['longitude']),
(loc2['latitude'], loc2['longitude'])
).meters
return dist < max_distance_m
# Start with all Google results (higher quality)
deduplicated = list(google_results)
# Add OSM results that don't match Google results
for osm_loc in osm_results:
is_duplicate = False
for google_loc in google_results:
if (is_similar(osm_loc['name'], google_loc['name']) and
is_nearby(osm_loc, google_loc)):
is_duplicate = True
break
if not is_duplicate:
deduplicated.append(osm_loc)
return deduplicated
@action(detail=False, methods=['get'])
def query(self, request):
"""
Query both Google Places and OSM for recommendations.
Returns unified, high-quality results sorted by quality score.
Query Parameters:
- lat (required): Latitude
- lon (required): Longitude
- radius (optional): Search radius in meters (default: 5000, max: 50000)
- category (required): Category - 'tourism', 'food', or 'lodging'
- sources (optional): Comma-separated sources - 'google', 'osm', or 'both' (default: 'both')
"""
lat = request.query_params.get('lat')
lon = request.query_params.get('lon')
# Allow a free-text `location` parameter which will be geocoded
location_param = request.query_params.get('location')
radius = request.query_params.get('radius', '5000')
category = request.query_params.get('category')
sources = request.query_params.get('sources', 'both').lower()
# If lat/lon not supplied, try geocoding the free-text location param
if (not lat or not lon) and location_param:
geocode_results = None
# Try Google first if API key configured
if getattr(settings, 'GOOGLE_MAPS_API_KEY', None):
try:
geocode_results = search_google(location_param)
except Exception:
logger.warning("Google geocoding failed; falling back to OSM")
geocode_results = None
# Fallback to OSM Nominatim
if not geocode_results:
try:
geocode_results = search_osm(location_param)
except Exception:
logger.warning("OSM geocoding failed")
geocode_results = None
# Validate geocode results
if isinstance(geocode_results, dict) and geocode_results.get('error'):
# Log internal geocoding error but avoid exposing sensitive details
logger.warning("Geocoding helper returned an internal error")
return Response({"error": "Geocoding failed. Please try a different location or contact support."}, status=400)
if not geocode_results:
return Response({"error": "Could not geocode provided location."}, status=400)
# geocode_results expected to be a list of results; pick the best (first)
best = None
if isinstance(geocode_results, list) and len(geocode_results) > 0:
best = geocode_results[0]
elif isinstance(geocode_results, dict):
# Some helpers might return a dict when only one result found
best = geocode_results
if not best:
return Response({"error": "No geocoding results found."}, status=400)
try:
lat = float(best.get('lat') or best.get('latitude'))
lon = float(best.get('lon') or best.get('longitude'))
except Exception:
return Response({"error": "Geocoding result missing coordinates."}, status=400)
# Replace location_param with display name when available for logging/debug
location_param = best.get('display_name') or best.get('name') or location_param
# Validation: require lat and lon at this point
if not lat or not lon:
return Response({
"error": "Latitude and longitude parameters are required (or provide a 'location' parameter to geocode)."
}, status=400)
try:
lat = float(lat)
lon = float(lon)
radius = min(float(radius), 50000) # Max 50km radius
except ValueError:
return Response({
"error": "Invalid latitude, longitude, or radius value."
}, status=400)
valid_categories = ['lodging', 'food', 'tourism']
if category not in valid_categories:
return Response({
"error": f"Invalid category. Valid categories: {', '.join(valid_categories)}"
}, status=400)
valid_sources = ['google', 'osm', 'both']
if sources not in valid_sources:
return Response({
"error": f"Invalid sources. Valid options: {', '.join(valid_sources)}"
}, status=400)
api_key = getattr(settings, 'GOOGLE_MAPS_API_KEY', None)
google_results = []
osm_results = []
# Query Google Places if available and requested
if api_key and sources in ['google', 'both']:
try:
url = "https://places.googleapis.com/v1/places:searchNearby"
headers = {
'Content-Type': 'application/json',
'X-Goog-Api-Key': api_key,
'X-Goog-FieldMask': (
'places.id,places.displayName,places.formattedAddress,'
'places.shortFormattedAddress,places.location,places.types,'
'places.rating,places.userRatingCount,places.businessStatus,'
'places.priceLevel,places.websiteUri,places.googleMapsUri,'
'places.nationalPhoneNumber,places.internationalPhoneNumber,'
'places.editorialSummary,places.photos,'
'places.currentOpeningHours,places.regularOpeningHours'
)
}
type_mapping = {
'lodging': ['lodging', 'hotel', 'hostel', 'resort_hotel'],
'food': ['restaurant', 'cafe', 'bar', 'bakery'],
'tourism': ['tourist_attraction', 'museum', 'art_gallery', 'aquarium', 'zoo', 'park'],
}
payload = {
"includedTypes": type_mapping.get(category, ['tourist_attraction']),
"maxResultCount": 20,
"rankPreference": "DISTANCE",
"locationRestriction": {
"circle": {
"center": {"latitude": lat, "longitude": lon},
"radius": radius
}
}
}
response = requests.post(url, json=payload, headers=headers, timeout=15)
response.raise_for_status()
data = response.json()
places = data.get('places', [])
origin = (lat, lon)
google_results = self.parse_google_places(places, origin)
logger.info(f"Google Places: {len(google_results)} quality results")
except Exception as e:
logger.warning(f"Google Places failed: {e}")
# Query OSM if requested or as fallback
osm_error = None
if sources in ['osm', 'both'] or (sources == 'google' and not google_results):
osm_response = self.query_overpass(lat, lon, radius, category, request)
osm_results = osm_response.get('results', [])
osm_error = osm_response.get('error')
if osm_error:
logger.warning(f"OSM query had issues: {osm_error}")
# Combine and deduplicate if using both sources
if sources == 'both' and google_results and osm_results:
all_results = self._deduplicate_results(google_results, osm_results)
else:
all_results = google_results + osm_results
# Prepare final results
final_results = self._prepare_final_results(all_results)
logger.info(f"Returning {len(final_results)} total recommendations")
# Build response with metadata
response_data = {
"count": len(final_results),
"results": final_results,
"sources_used": {
"google": len(google_results),
"osm": len(osm_results),
"total_before_dedup": len(google_results) + len(osm_results)
}
}
# Add warnings if there were errors but we still have some results
warnings = []
if osm_error and len(osm_results) == 0:
warnings.append(osm_error)
if warnings:
response_data["warnings"] = warnings
# If no results at all and user requested only OSM, return error status
if len(final_results) == 0 and sources == 'osm' and osm_error:
# Log internal error notice for investigation but do not expose details to clients
logger.debug("OSM query error (internal)")
return Response({
"error": "OpenStreetMap service temporarily unavailable. Please try again later.",
"count": 0,
"results": [],
"sources_used": response_data["sources_used"]
}, status=503)
return Response(response_data)
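Reviewer note: the dedup path above hinges on `difflib.SequenceMatcher` name similarity plus a 50 m geodesic proximity check. A minimal standalone sketch of the similarity half (the proximity half needs geopy, so it is left out here):

```python
from difflib import SequenceMatcher

def is_similar(name1: str, name2: str, threshold: float = 0.85) -> bool:
    # Same check as _deduplicate_results: case-insensitive ratio above 0.85.
    return SequenceMatcher(None, name1.lower(), name2.lower()).ratio() > threshold

print(is_similar("Hotel de Ville", "HOTEL DE VILLE"))  # identical after lowercasing -> True
print(is_similar("Louvre Museum", "Eiffel Tower"))     # -> False
```

The 0.85 threshold is a tunable; lower values merge more aggressively at the cost of collapsing genuinely distinct nearby venues.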

View File

@@ -45,41 +45,90 @@ class ReverseGeocodeViewSet(viewsets.ViewSet):
@action(detail=False, methods=['post'])
def mark_visited_region(self, request):
"""
Marks regions and cities as visited based on user's visited locations.
Uses the pre-stored region/city data on locations to avoid expensive reverse geocoding.
"""
new_region_count = 0
new_regions = {}
new_city_count = 0
new_cities = {}
# Get all visited locations with their region and city data
visited_locations = Location.objects.filter(
user=self.request.user
).select_related('region', 'city')
# Track unique regions and cities to create VisitedRegion/VisitedCity entries
regions_to_mark = set()
cities_to_mark = set()
for location in visited_locations:
# Only process locations that are marked as visited
if not location.is_visited_status():
continue
# Collect regions
if location.region:
regions_to_mark.add(location.region.id)
# Collect cities
if location.city:
cities_to_mark.add(location.city.id)
# Get existing visited regions for this user
existing_visited_regions = set(
VisitedRegion.objects.filter(
user=self.request.user,
region_id__in=regions_to_mark
).values_list('region_id', flat=True)
)
# Create new VisitedRegion entries
new_visited_regions = []
for region_id in regions_to_mark:
if region_id not in existing_visited_regions:
new_visited_regions.append(
VisitedRegion(region_id=region_id, user=self.request.user)
)
if new_visited_regions:
VisitedRegion.objects.bulk_create(new_visited_regions)
new_region_count = len(new_visited_regions)
# Get region names for response
regions = Region.objects.filter(
id__in=[vr.region_id for vr in new_visited_regions]
)
new_regions = {r.id: r.name for r in regions}
# Get existing visited cities for this user
existing_visited_cities = set(
VisitedCity.objects.filter(
user=self.request.user,
city_id__in=cities_to_mark
).values_list('city_id', flat=True)
)
# Create new VisitedCity entries
new_visited_cities = []
for city_id in cities_to_mark:
if city_id not in existing_visited_cities:
new_visited_cities.append(
VisitedCity(city_id=city_id, user=self.request.user)
)
if new_visited_cities:
VisitedCity.objects.bulk_create(new_visited_cities)
new_city_count = len(new_visited_cities)
# Get city names for response
cities = City.objects.filter(
id__in=[vc.city_id for vc in new_visited_cities]
)
new_cities = {c.id: c.name for c in cities}
return Response({
"new_regions": new_region_count,
"regions": new_regions,
"new_cities": new_city_count,
"cities": new_cities
})
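The rewritten `mark_visited_region` replaces per-location reverse geocoding with a set-difference plus `bulk_create` pattern: collect candidate IDs, subtract the IDs already marked, and create only the remainder. The core of that pattern, sketched with plain Python sets (`ids_to_create` is illustrative, not a name from the codebase):

```python
def ids_to_create(candidate_ids, existing_ids):
    # Set difference: only rows not already present get bulk-created.
    return sorted(set(candidate_ids) - set(existing_ids))

print(ids_to_create([3, 1, 2, 3], [2]))  # -> [1, 3]
```

This turns N reverse-geocode calls plus N existence queries into two `values_list` queries and at most two `bulk_create` calls.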

View File

@@ -1,11 +1,8 @@
"""
AdventureLog Server settings
Reference:
- Django settings: https://docs.djangoproject.com/en/stable/ref/settings/
"""
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
@@ -15,14 +12,21 @@ from os import getenv
from pathlib import Path
from urllib.parse import urlparse
from publicsuffix2 import get_sld
# Load environment variables from .env file
# ---------------------------------------------------------------------------
# Environment & Paths
# ---------------------------------------------------------------------------
# Load environment variables from .env file early so getenv works everywhere.
load_dotenv()
BASE_DIR = os.path.dirname(os.path.dirname(__file__))
# See Django deployment checklist for production hardening.
# ---------------------------------------------------------------------------
# Core Security & Debug
# ---------------------------------------------------------------------------
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = getenv('SECRET_KEY')
@@ -34,9 +38,13 @@ DEBUG = getenv('DEBUG', 'true').lower() == 'true'
# '127.0.0.1',
# 'server'
# ]
ALLOWED_HOSTS = ['*'] # In production, restrict to known hosts.
# ---------------------------------------------------------------------------
# Installed Apps
# ---------------------------------------------------------------------------
INSTALLED_APPS = (
"allauth_ui",
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
@@ -44,7 +52,6 @@ INSTALLED_APPS = (
'django.contrib.messages',
'django.contrib.staticfiles',
'django.contrib.sites',
'rest_framework',
'rest_framework.authtoken',
'allauth',
@@ -54,7 +61,9 @@ INSTALLED_APPS = (
'allauth.socialaccount',
'allauth.socialaccount.providers.github',
'allauth.socialaccount.providers.openid_connect',
'invitations',
'drf_yasg',
'djmoney',
'corsheaders',
'adventures',
'worldtravel',
@@ -62,11 +71,14 @@ INSTALLED_APPS = (
'integrations',
'django.contrib.gis',
# 'achievements', # Not done yet, will be added later in a future update
'widget_tweaks',
'slippers',
)
# ---------------------------------------------------------------------------
# Middleware
# ---------------------------------------------------------------------------
MIDDLEWARE = (
'whitenoise.middleware.WhiteNoiseMiddleware',
'adventures.middleware.XSessionTokenMiddleware',
@@ -83,9 +95,9 @@ MIDDLEWARE = (
'allauth.account.middleware.AccountMiddleware',
)
# ---------------------------------------------------------------------------
# Caching
# ---------------------------------------------------------------------------
CACHES = {
'default': {
'BACKEND': 'django.core.cache.backends.memcached.PyMemcacheCache',
@@ -101,9 +113,9 @@ ROOT_URLCONF = 'main.urls'
# WSGI_APPLICATION = 'demo.wsgi.application'
# ---------------------------------------------------------------------------
# Database
# ---------------------------------------------------------------------------
# Using legacy PG environment variables for compatibility with existing setups
def env(*keys, default=None):
@@ -131,58 +143,52 @@ DATABASES = {
# ---------------------------------------------------------------------------
# Internationalization
# ---------------------------------------------------------------------------
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
# ---------------------------------------------------------------------------
# Frontend URL & Cookies
# ---------------------------------------------------------------------------
# Derive frontend URL from environment and configure cookie behavior.
unParsedFrontenedUrl = getenv('FRONTEND_URL', 'http://localhost:3000')
FRONTEND_URL = unParsedFrontenedUrl.translate(str.maketrans('', '', '\'"'))
SESSION_COOKIE_SAMESITE = 'Lax'
SESSION_COOKIE_NAME = 'sessionid'
# Secure cookies if frontend is served over HTTPS
SESSION_COOKIE_SECURE = FRONTEND_URL.startswith('https')
CSRF_COOKIE_SECURE = FRONTEND_URL.startswith('https')
# Dynamically determine cookie domain to support subdomains while avoiding IPs
hostname = urlparse(FRONTEND_URL).hostname
is_ip_address = hostname.replace('.', '').isdigit()
is_single_label = '.' not in hostname # single-label hostnames (e.g., "localhost")
if is_ip_address or is_single_label:
# Do not set a domain for IP addresses or single-label hostnames
SESSION_COOKIE_DOMAIN = None
else:
# Use publicsuffix2 to calculate the correct cookie domain
cookie_domain = get_sld(hostname)
SESSION_COOKIE_DOMAIN = f".{cookie_domain}" if cookie_domain else hostname
# ---------------------------------------------------------------------------
# Static & Media Files
# ---------------------------------------------------------------------------
SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')
BASE_DIR = Path(__file__).resolve().parent.parent
STATIC_ROOT = BASE_DIR / "staticfiles"
STATIC_URL = '/static/'
MEDIA_URL = '/media/'
MEDIA_ROOT = BASE_DIR / 'media' # Must match NGINX root for media serving
STATICFILES_DIRS = [BASE_DIR / 'static']
STORAGES = {
@@ -196,6 +202,9 @@ STORAGES = {
SILENCED_SYSTEM_CHECKS = ["slippers.E001"]
# ---------------------------------------------------------------------------
# Templates
# ---------------------------------------------------------------------------
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
@@ -212,31 +221,37 @@ TEMPLATES = [
},
]
ALLAUTH_UI_THEME = "dim"
# ---------------------------------------------------------------------------
# Authentication & Accounts
# ---------------------------------------------------------------------------
DISABLE_REGISTRATION = getenv('DISABLE_REGISTRATION', 'false').lower() == 'true'
DISABLE_REGISTRATION_MESSAGE = getenv('DISABLE_REGISTRATION_MESSAGE', 'Registration is disabled. Please contact the administrator if you need an account.')
SOCIALACCOUNT_ALLOW_SIGNUP = getenv('SOCIALACCOUNT_ALLOW_SIGNUP', 'false').lower() == 'true'
AUTH_USER_MODEL = 'users.CustomUser'
ACCOUNT_ADAPTER = 'users.adapters.CustomAccountAdapter'
INVITATIONS_ADAPTER = ACCOUNT_ADAPTER
INVITATIONS_ACCEPT_INVITE_AFTER_SIGNUP = True
INVITATIONS_EMAIL_SUBJECT_PREFIX = 'AdventureLog: '
SOCIALACCOUNT_ADAPTER = 'users.adapters.CustomSocialAccountAdapter'
ACCOUNT_SIGNUP_FORM_CLASS = 'users.form_overrides.CustomSignupForm'
SESSION_SAVE_EVERY_REQUEST = True
# Set login redirect URL to the frontend
LOGIN_REDIRECT_URL = FRONTEND_URL # Redirect to frontend after login
SOCIALACCOUNT_LOGIN_ON_GET = True
INVITATIONS_INVITE_FORM = 'users.form_overrides.UseAdminInviteForm'
INVITATIONS_SIGNUP_REDIRECT_URL = f"{FRONTEND_URL}/signup"
HEADLESS_FRONTEND_URLS = {
"account_confirm_email": f"{FRONTEND_URL}/user/verify-email/{{key}}",
"account_reset_password": f"{FRONTEND_URL}/user/reset-password",
"account_reset_password_from_key": f"{FRONTEND_URL}/user/reset-password/{{key}}",
"account_signup": f"{FRONTEND_URL}/signup",
# Fallback if handshake with provider fails and `next` URL is lost.
"socialaccount_login_error": f"{FRONTEND_URL}/account/provider/callback",
}
@@ -249,8 +264,14 @@ AUTHENTICATION_BACKENDS = [
EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
SITE_ID = 1
ACCOUNT_EMAIL_REQUIRED = True
ACCOUNT_AUTHENTICATION_METHOD = 'username'
ACCOUNT_UNIQUE_EMAIL = True
ACCOUNT_EMAIL_VERIFICATION = getenv('ACCOUNT_EMAIL_VERIFICATION', 'none') # 'none', 'optional', 'mandatory'
SOCIALACCOUNT_EMAIL_AUTHENTICATION = True
SOCIALACCOUNT_EMAIL_AUTHENTICATION_AUTO_CONNECT = True # Auto-link by email
SOCIALACCOUNT_AUTO_SIGNUP = True # Allow auto-signup post adapter checks
# When true, only social login is allowed (no password login); the login page shows
# only social providers, or redirects directly to the first provider if only one is configured.
FORCE_SOCIALACCOUNT_LOGIN = getenv('FORCE_SOCIALACCOUNT_LOGIN', 'false').lower() == 'true'
if getenv('EMAIL_BACKEND', 'console') == 'console':
EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
@@ -274,6 +295,9 @@ else:
# DEFAULT_FROM_EMAIL = 'mail@mail.user.com'
# ---------------------------------------------------------------------------
# Django REST Framework
# ---------------------------------------------------------------------------
REST_FRAMEWORK = {
'DEFAULT_AUTHENTICATION_CLASSES': (
'rest_framework.authentication.SessionAuthentication',
@@ -292,15 +316,18 @@ else:
)
# ---------------------------------------------------------------------------
# CORS & CSRF
# ---------------------------------------------------------------------------
CORS_ALLOWED_ORIGINS = [origin.strip() for origin in getenv('CSRF_TRUSTED_ORIGINS', 'http://localhost').split(',') if origin.strip()]
CSRF_TRUSTED_ORIGINS = [origin.strip() for origin in getenv('CSRF_TRUSTED_ORIGINS', 'http://localhost').split(',') if origin.strip()]
CORS_ALLOW_CREDENTIALS = True
DEFAULT_AUTO_FIELD = 'django.db.models.AutoField'
# ---------------------------------------------------------------------------
# Logging
# ---------------------------------------------------------------------------
LOGGING = {
'version': 1,
'disable_existing_loggers': False,
@@ -326,6 +353,9 @@ LOGGING = {
},
}
# ---------------------------------------------------------------------------
# Public URLs & Third-Party Integrations
# ---------------------------------------------------------------------------
PUBLIC_URL = getenv('PUBLIC_URL', 'http://localhost:8000')
# ADVENTURELOG_CDN_URL = getenv('ADVENTURELOG_CDN_URL', 'https://cdn.adventurelog.app')
@@ -334,9 +364,9 @@ PUBLIC_URL = getenv('PUBLIC_URL', 'http://localhost:8000')
ADVENTURELOG_RELEASE_VERSION = 'v0.11.0'
# https://github.com/dr5hn/countries-states-cities-database/tags
COUNTRY_REGION_JSON_VERSION = 'v3.0'
# External service keys (do not hardcode secrets)
GOOGLE_MAPS_API_KEY = getenv('GOOGLE_MAPS_API_KEY', '')
STRAVA_CLIENT_ID = getenv('STRAVA_CLIENT_ID', '')
STRAVA_CLIENT_SECRET = getenv('STRAVA_CLIENT_SECRET', '')
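The cookie-domain logic in this settings file resolves to three cases: no domain for IP addresses, no domain for single-label hosts like `localhost`, otherwise the registrable domain via `publicsuffix2.get_sld`. A dependency-free sketch of just the classification step (`cookie_domain_mode` is a hypothetical helper, for illustration only):

```python
from urllib.parse import urlparse

def cookie_domain_mode(frontend_url: str) -> str:
    # Mirrors the settings logic: IPs and single-label hosts get no cookie domain.
    hostname = urlparse(frontend_url).hostname or ""
    is_ip = hostname.replace(".", "").isdigit()
    if is_ip or "." not in hostname:
        return "none"         # SESSION_COOKIE_DOMAIN = None
    return "registrable"      # derived via publicsuffix2.get_sld in settings

print(cookie_domain_mode("http://localhost:3000"))     # -> none
print(cookie_domain_mode("https://app.example.com"))   # -> registrable
print(cookie_domain_mode("http://192.168.1.10:8000"))  # -> none
```

Skipping the domain for IPs and single-label hosts matters because browsers reject `Domain=` attributes on such hosts.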

View File

@@ -33,6 +33,8 @@ urlpatterns = [
path('csrf/', get_csrf_token, name='get_csrf_token'),
path('public-url/', get_public_url, name='get_public_url'),
path("invitations/", include('invitations.urls', namespace='invitations')),
path('', TemplateView.as_view(template_name='home.html')),

View File

@@ -6,5 +6,6 @@ def get_user_uuid(user):
class CustomModelSerializer(serializers.ModelSerializer):
def to_representation(self, instance):
representation = super().to_representation(instance)
if hasattr(instance, 'user') and instance.user:
representation['user'] = get_user_uuid(instance.user)
return representation

View File

@@ -1,6 +1,8 @@
Django==5.2.8
djangorestframework>=3.15.2
django-allauth==0.63.3
django-money==3.5.4
django-invitations==2.1.0
drf-yasg==1.21.4
django-cors-headers==4.4.0
coreapi==2.3.3
@@ -14,7 +16,7 @@ setuptools==79.0.1
gunicorn==23.0.0
qrcode==8.0
slippers==0.6.2
django-allauth-ui==1.7.0
django-widget-tweaks==1.5.0
django-ical==1.9.2
icalendar==6.1.0

View File

@@ -0,0 +1,105 @@
#!/usr/bin/env python3
"""
Periodic sync runner for AdventureLog.
Runs the sync_visited_regions management command once per day at local midnight.
Managed by supervisord so it inherits the container environment variables.
"""
import os
import sys
import time
import logging
import signal
import threading
from datetime import datetime, timedelta
from pathlib import Path
# Setup Django
sys.path.insert(0, str(Path(__file__).parent))
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'main.settings')
import django
django.setup()
from django.core.management import call_command
# Configure logging
logging.basicConfig(
level=logging.INFO,
format='%(asctime)s [%(levelname)s] %(message)s',
handlers=[logging.StreamHandler(sys.stdout)]
)
logger = logging.getLogger(__name__)
# Event used to signal shutdown from signal handlers
_stop_event = threading.Event()
def _seconds_until_next_midnight() -> float:
"""Return number of seconds until the next local midnight."""
now = datetime.now()
next_midnight = (now + timedelta(days=1)).replace(
hour=0, minute=0, second=0, microsecond=0
)
return (next_midnight - now).total_seconds()
def _handle_termination(signum, frame):
"""Signal handler for SIGTERM and SIGINT: request graceful shutdown."""
logger.info(f"Received signal {signum}; shutting down gracefully...")
_stop_event.set()
def run_sync():
"""Run the sync_visited_regions command."""
try:
logger.info("Running sync_visited_regions...")
call_command('sync_visited_regions')
logger.info("Sync completed successfully")
except Exception as e:
logger.error(f"Sync failed: {e}", exc_info=True)
def main():
"""Main loop: run the sync once per day at local midnight."""
logger.info("Starting periodic sync worker for midnight background jobs...")
# Install signal handlers so supervisord (or other process managers)
# can request a clean shutdown using SIGTERM/SIGINT.
signal.signal(signal.SIGTERM, _handle_termination)
signal.signal(signal.SIGINT, _handle_termination)
try:
while not _stop_event.is_set():
# Wait until the next local midnight (or until shutdown)
wait_seconds = _seconds_until_next_midnight()
hours = wait_seconds / 3600.0
logger.info(
f"Next sync scheduled in {wait_seconds:.0f}s (~{hours:.2f}h) at local midnight"
)
# Sleep until midnight or until stop event is set
if _stop_event.wait(wait_seconds):
break
# It's midnight (or we woke up), run the sync once
run_sync()
# After running at midnight, loop continues to compute next midnight
except Exception:
logger.exception("Unexpected error in periodic sync loop")
finally:
logger.info("Periodic sync worker exiting")
if __name__ == '__main__':
try:
main()
except KeyboardInterrupt:
# Fallback in case the signal is delivered as KeyboardInterrupt
logger.info("KeyboardInterrupt received — exiting")
_stop_event.set()
except SystemExit:
logger.info("SystemExit received — exiting")
finally:
logger.info("run_periodic_sync terminated")
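The scheduling math in `_seconds_until_next_midnight` can be checked in isolation; a self-contained sketch of the same computation, parameterized on `now` so it is testable:

```python
from datetime import datetime, timedelta

def seconds_until_next_midnight(now: datetime) -> float:
    # Roll forward one day, truncate to 00:00:00, and take the difference.
    next_midnight = (now + timedelta(days=1)).replace(
        hour=0, minute=0, second=0, microsecond=0
    )
    return (next_midnight - now).total_seconds()

print(seconds_until_next_midnight(datetime(2024, 1, 1, 23, 59, 0)))  # -> 60.0
print(seconds_until_next_midnight(datetime(2024, 1, 1, 0, 0, 0)))    # -> 86400.0
```

Note the edge case: at exactly midnight the wait is a full 86400 s, which is what lets the loop run once per day without double-firing.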

View File

@@ -0,0 +1,2 @@
@import url('https://cdn.tailwindcss.com');
@plugin "daisyui";

Binary file not shown (new image, 87 KiB).

View File

@@ -0,0 +1,3 @@
{% extends "admin/base.html" %} {% load static %} {% block extrahead %}
<link rel="icon" href="{% static 'favicon.png' %}" />
{% endblock %}

View File

@@ -4,6 +4,7 @@
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<meta name="description" content="AdventureLog API Server" />
<link rel="icon" href="/static/favicon.png" />
<meta name="author" content="Sean Morley" />
<title>AdventureLog API Server</title>

View File

@@ -0,0 +1,96 @@
{% load i18n %} {% autoescape off %} {% blocktrans %}
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<title>You're Invited to AdventureLog</title>
<style>
body {
font-family: "Helvetica Neue", Helvetica, Arial, sans-serif;
background-color: #f0f4f8;
margin: 0;
padding: 0;
color: #1f2937;
}
.container {
max-width: 600px;
margin: 50px auto;
background: #ffffff;
border-radius: 16px;
box-shadow: 0 8px 30px rgba(0, 0, 0, 0.08);
padding: 40px 30px;
text-align: center;
}
.logo {
width: 80px;
height: 80px;
margin-bottom: 24px;
}
h1 {
font-size: 28px;
color: #111827;
margin-bottom: 16px;
}
p {
font-size: 16px;
line-height: 1.65;
margin-bottom: 24px;
color: #1f2937;
}
.highlight {
font-weight: 600;
color: #4f46e5;
}
.button {
display: inline-block;
padding: 16px 32px;
background: linear-gradient(90deg, #4f46e5, #6366f1);
color: #ffffff !important;
text-decoration: none;
font-weight: 600;
font-size: 16px;
border-radius: 12px;
}
.footer {
margin-top: 40px;
font-size: 12px;
line-height: 1.6;
color: #9ca3af;
text-align: center;
}
</style>
</head>
<body>
<div class="container">
<img
src="https://adventurelog.app/adventurelog.png"
alt="AdventureLog"
class="logo"
/>
<h1>You're Invited to AdventureLog</h1>
<p>Hello <strong>{{ email }}</strong>,</p>
<p>
Adventure awaits! You've been invited to join
<span class="highlight">AdventureLog</span>, the ultimate travel
companion to track, plan, and collaborate on your journeys.
</p>
<p>Hit the button below to accept your invitation and start exploring!</p>
<p><a href="{{ invite_url }}" class="button">Join AdventureLog</a></p>
<div class="footer">
You're receiving this email because someone invited you to join
AdventureLog.<br />
If you weren't expecting this invitation, you can safely ignore this
email.<br /><br />
© 2023–2025 AdventureLog
</div>
</div>
</body>
</html>
{% endblocktrans %} {% endautoescape %}

View File

@@ -0,0 +1,23 @@
{# templates/invitations/invite.html #} {% load i18n %}
<h4 class="title">{% trans "Send Invitation" %}</h4>
<div class="alert alert-info">
{% blocktrans %}To send an invitation, please go to the Django admin and
create a new Invitation object.{% endblocktrans %}
</div>
<p>
<a
href="{% url 'admin:invitations_invitation_add' %}"
class="btn btn-primary"
>
{% trans "Go to Django Admin" %}
</a>
</p>
{% if request.user.is_staff %}
<p class="text-muted small">
{% blocktrans %}Only staff users can send invitations.{% endblocktrans %}
</p>
{% endif %}

View File

@@ -0,0 +1,39 @@
{% extends "mfa/authenticate.html" %}
{% load allauth %}
{% load allauth_ui %}
{% load i18n %}
{% block content %}
{% trans "Two-Factor Authentication" as heading %}
{% blocktranslate asvar subheading %}Your account is protected by two-factor authentication. Please enter an authenticator code:{% endblocktranslate %}
{% url 'mfa_authenticate' as action_url %}
{% #container heading=heading subheading=subheading %}
{% trans "Sign In" as button_text %}
{% #form form=form url=action_url button_text=button_text %}
{% csrf_token %}
{% /form %}
{% if "webauthn" in MFA_SUPPORTED_TYPES %}
<div class="divider"></div>
<h2 class="my-3 text-lg">{% translate "Alternative options" %}</h2>
{% #form form=webauthn_form url=action_url use_default_button="false" %}
<button type="submit" class="btn btn-neutral">
{% trans "Use a security key" %}
</button>
<a href="{% url "account_login" %}" class="btn btn-accent">{% trans "Cancel" %}</a>
{% csrf_token %}
{% /form %}
{% endif %}
{% /container %}
{{ js_data|json_script:"js_data" }}
{# djlint:off #}
<script type="text/javascript">
allauth.webauthn.forms.authenticateForm({
ids: {
authenticate: "mfa_webauthn_authenticate",
credential: "{{ webauthn_form.credential.auto_id }}"
},
data: JSON.parse(document.getElementById('js_data').textContent)
})
</script>
{# djlint:on #}
{% endblock content %}

View File

@@ -1,10 +1,65 @@
from allauth.socialaccount.adapter import DefaultSocialAccountAdapter
from allauth.account.adapter import DefaultAccountAdapter
from allauth.account.signals import user_signed_up
from django.conf import settings
from django.urls import resolve, Resolver404
from invitations.models import Invitation
class CustomAccountAdapter(DefaultAccountAdapter):
"""Control regular signup based on DISABLE_REGISTRATION, but allow invites."""
def is_open_for_signup(self, request):
"""
Allow signup only if:
- DISABLE_REGISTRATION is False, OR
- the request is for the invitation acceptance URL, OR
- there's a valid invitation key in the request parameters.
"""
# If registration is globally open, allow as usual
if settings.DISABLE_REGISTRATION is False:
return True
# If an invitation-verified email is stashed in the session, allow signup
if hasattr(request, "session") and request.session.get("account_verified_email"):
return True
# When disabled, allow signups via invitation accept URL
try:
match = resolve(request.path_info)
if match.view_name == "invitations:accept-invite":
return True
except Resolver404:
pass
# Block any other signup
return False
def get_user_signed_up_signal(self):
"""Return the allauth `user_signed_up` signal for compatibility with
django-invitations which expects this method on the adapter.
"""
return user_signed_up
class CustomSocialAccountAdapter(DefaultSocialAccountAdapter):
"""Control social signup based on SOCIALACCOUNT_ALLOW_SIGNUP setting"""
def is_open_for_signup(self, request, sociallogin):
"""
Determines if social signup is allowed.
Check SOCIALACCOUNT_ALLOW_SIGNUP env variable.
Returning False shows the same 'signup_closed.html' template
as regular signup, but only blocks NEW social signups.
Existing users can still log in.
"""
# If social signup is disabled, only allow existing users
if not settings.SOCIALACCOUNT_ALLOW_SIGNUP:
return sociallogin.is_existing
return True
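For reference, wiring these adapters up is just a matter of pointing allauth at them in settings. A minimal sketch, assuming the module lives at `users.adapters` (the exact dotted path depends on the app layout):

```python
# settings.py (sketch) -- the module path "users.adapters" is an assumption
ACCOUNT_ADAPTER = "users.adapters.CustomAccountAdapter"
SOCIALACCOUNT_ADAPTER = "users.adapters.CustomSocialAccountAdapter"

# Feature flags read by the adapters above
DISABLE_REGISTRATION = True          # block open signup (invites still work)
SOCIALACCOUNT_ALLOW_SIGNUP = False   # block NEW social signups; existing users can still log in
```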

View File

@@ -4,3 +4,8 @@ from django.apps import AppConfig
class UsersConfig(AppConfig):
default_auto_field = 'django.db.models.BigAutoField'
name = 'users'
def ready(self):
# Ensure signal handlers are registered
import users.signals # noqa: F401

View File

@@ -1,3 +1,4 @@
from django.conf import settings
from django.contrib.auth.backends import ModelBackend
from allauth.socialaccount.models import SocialAccount
from allauth.account.auth_backends import AuthenticationBackend as AllauthBackend
@@ -7,6 +8,10 @@ User = get_user_model()
class NoPasswordAuthBackend(ModelBackend):
def authenticate(self, request, username=None, password=None, **kwargs):
# Block all password-based logins when social-only mode is enforced
if getattr(settings, "FORCE_SOCIALACCOUNT_LOGIN", False) and password:
return None
# Handle allauth-specific authentication (like email login)
allauth_backend = AllauthBackend()
allauth_user = allauth_backend.authenticate(request, username=username, password=password, **kwargs)
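The guard at the top of `authenticate` reduces to a two-input rule. A framework-free sketch (the `password_login_allowed` helper is hypothetical, not part of the codebase):

```python
def password_login_allowed(force_social_login: bool, password_supplied: bool) -> bool:
    """Mirror of the guard in NoPasswordAuthBackend.authenticate:
    when social-only mode is enforced, any password attempt is rejected."""
    return not (force_social_login and password_supplied)
```

Only the enforced-mode-plus-password combination is blocked; passwordless and social flows pass through to the allauth backend.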

View File

@@ -14,4 +14,17 @@ class CustomSignupForm(forms.Form):
# Save the user instance
user.save()
return user
class UseAdminInviteForm(forms.Form):
"""
Dummy form that just tells admins to use the Django admin to send invites.
"""
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
# Remove any fields; we only want to show a message
self.fields.clear()
def as_widget(self):
# This is not needed; we'll just use a template
pass

View File

@@ -0,0 +1,18 @@
# Generated by Django 5.2.8 on 2026-01-03 20:30
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('users', '0005_customuser_measurement_system'),
]
operations = [
migrations.AddField(
model_name='customuser',
name='default_currency',
field=models.CharField(choices=[('USD', 'US Dollar'), ('EUR', 'Euro'), ('GBP', 'British Pound'), ('JPY', 'Japanese Yen'), ('AUD', 'Australian Dollar'), ('CAD', 'Canadian Dollar'), ('CHF', 'Swiss Franc'), ('CNY', 'Chinese Yuan'), ('HKD', 'Hong Kong Dollar'), ('SGD', 'Singapore Dollar'), ('SEK', 'Swedish Krona'), ('NOK', 'Norwegian Krone'), ('DKK', 'Danish Krone'), ('NZD', 'New Zealand Dollar'), ('INR', 'Indian Rupee'), ('MXN', 'Mexican Peso'), ('BRL', 'Brazilian Real'), ('ZAR', 'South African Rand'), ('AED', 'UAE Dirham'), ('TRY', 'Turkish Lira')], default='USD', max_length=5),
),
]

View File

@@ -3,6 +3,30 @@ from django.contrib.auth.models import AbstractUser
from django.db import models
from django_resized import ResizedImageField
CURRENCY_CHOICES = (
('USD', 'US Dollar'),
('EUR', 'Euro'),
('GBP', 'British Pound'),
('JPY', 'Japanese Yen'),
('AUD', 'Australian Dollar'),
('CAD', 'Canadian Dollar'),
('CHF', 'Swiss Franc'),
('CNY', 'Chinese Yuan'),
('HKD', 'Hong Kong Dollar'),
('SGD', 'Singapore Dollar'),
('SEK', 'Swedish Krona'),
('NOK', 'Norwegian Krone'),
('DKK', 'Danish Krone'),
('NZD', 'New Zealand Dollar'),
('INR', 'Indian Rupee'),
('MXN', 'Mexican Peso'),
('BRL', 'Brazilian Real'),
('ZAR', 'South African Rand'),
('AED', 'UAE Dirham'),
('TRY', 'Turkish Lira'),
)
class CustomUser(AbstractUser):
email = models.EmailField(unique=True) # Override the email field with unique constraint
profile_pic = ResizedImageField(force_format="WEBP", quality=75, null=True, blank=True, upload_to='profile-pics/')
@@ -10,6 +34,8 @@ class CustomUser(AbstractUser):
public_profile = models.BooleanField(default=False)
disable_password = models.BooleanField(default=False)
measurement_system = models.CharField(max_length=10, choices=[('metric', 'Metric'), ('imperial', 'Imperial')], default='metric')
default_currency = models.CharField(max_length=5, choices=CURRENCY_CHOICES, default='USD')
def __str__(self):
return self.username

View File

@@ -50,7 +50,7 @@ class UserDetailsSerializer(serializers.ModelSerializer):
class Meta:
model = CustomUser
extra_fields = ['profile_pic', 'uuid', 'public_profile']
if hasattr(UserModel, 'USERNAME_FIELD'):
extra_fields.append(UserModel.USERNAME_FIELD)
@@ -68,6 +68,8 @@ class UserDetailsSerializer(serializers.ModelSerializer):
extra_fields.append('disable_password')
if hasattr(UserModel, 'measurement_system'):
extra_fields.append('measurement_system')
if hasattr(UserModel, 'default_currency'):
extra_fields.append('default_currency')
fields = ['pk', *extra_fields]
read_only_fields = ('email', 'date_joined', 'is_staff', 'is_superuser', 'is_active', 'pk', 'disable_password')
@@ -98,7 +100,7 @@ class CustomUserDetailsSerializer(UserDetailsSerializer):
class Meta(UserDetailsSerializer.Meta):
model = CustomUser
fields = UserDetailsSerializer.Meta.fields + ['has_password', 'disable_password']
read_only_fields = UserDetailsSerializer.Meta.read_only_fields + ('uuid', 'has_password', 'disable_password')
@staticmethod

View File

@@ -0,0 +1,53 @@
from django.db.models.signals import post_save, post_delete, pre_delete
from django.dispatch import receiver
from django.contrib.auth import get_user_model
from django.contrib.auth.models import AbstractUser
from allauth.account.models import EmailAddress
User = get_user_model()
def _sync_user_email(user: AbstractUser):
# Prefer the primary email if it exists
primary = EmailAddress.objects.filter(user=user, primary=True).first()
if primary:
if user.email != primary.email:
user.email = primary.email
user.save(update_fields=['email'])
return
# Fallback: if any email exists, use the first; if none, leave user.email unchanged
any_email = EmailAddress.objects.filter(user=user).order_by('id').first()
if any_email and user.email != any_email.email:
user.email = any_email.email
user.save(update_fields=['email'])
@receiver(post_save)
def emailaddress_post_save(sender, instance, **kwargs):
# Only react to allauth EmailAddress saves
try:
from allauth.account.models import EmailAddress
except Exception:
return
if sender is EmailAddress:
_sync_user_email(instance.user)
@receiver(post_delete)
def emailaddress_post_delete(sender, instance, **kwargs):
# Only react to allauth EmailAddress deletes
try:
from allauth.account.models import EmailAddress
except Exception:
return
if sender is EmailAddress:
_sync_user_email(instance.user)
# Prevent deleting the last email address for a user
@receiver(pre_delete, sender=EmailAddress)
def prevent_deleting_last_email(sender, instance, using, **kwargs):
user = instance.user
email_count = EmailAddress.objects.filter(user=user).count()
if email_count <= 1:
raise ValueError("Cannot delete the last email address of a user.")
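The selection rule in `_sync_user_email` (prefer the primary address, else fall back to the lowest-id address, else leave the email unchanged) can be sketched without Django. The `pick_email` helper and the dict rows are hypothetical stand-ins for `EmailAddress` records:

```python
def pick_email(addresses):
    """Return the address user.email should be synced to, or None if there are none."""
    primary = next((a for a in addresses if a["primary"]), None)
    if primary:
        return primary["email"]
    ordered = sorted(addresses, key=lambda a: a["id"])  # mirrors order_by('id')
    return ordered[0]["email"] if ordered else None
```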

View File

@@ -171,7 +171,8 @@ class EnabledSocialProvidersView(APIView):
providers.append({
'provider': provider.provider,
'url': f"{getenv('PUBLIC_URL')}/accounts/{new_provider}/login/",
'name': provider.name,
'usage_required': settings.FORCE_SOCIALACCOUNT_LOGIN
})
return Response(providers, status=status.HTTP_200_OK)

View File

@@ -167,7 +167,7 @@ class Command(BaseCommand):
# Process regions/states
if country['states']:
for state in country['states']:
state_id = f"{country_code}-{state['iso2']}"
state_name = state['name']
state_lat = round(float(state['latitude']), 6) if state['latitude'] else None
state_lng = round(float(state['longitude']), 6) if state['longitude'] else None

View File

@@ -1,4 +1,4 @@
from django.shortcuts import get_object_or_404
from .models import Country, Region, VisitedRegion, City, VisitedCity
from .serializers import CitySerializer, CountrySerializer, RegionSerializer, VisitedRegionSerializer, VisitedCitySerializer
from rest_framework import viewsets, status
@@ -6,35 +6,8 @@ from rest_framework.permissions import IsAuthenticated
from rest_framework.response import Response
from rest_framework.decorators import api_view, permission_classes, action
from django.contrib.gis.geos import Point
from django.core.cache import cache
from django.views.decorators.cache import cache_page
from django.utils.decorators import method_decorator
from adventures.models import Location
# Cache TTL
CACHE_TTL = 60 * 60 * 24 # 1 day
def invalidate_visit_caches_for_region_and_user(region, user):
"""Invalidate cached visit lists for a given region and user.
Removes both the per-region and per-country per-user cache keys so
UI calls will refetch updated visited lists.
"""
try:
if region is None or user is None:
return
# per-region cache
cache.delete(f"visits_by_region_{region.id}_{user.id}")
# per-country cache (region -> country -> country_code)
country_code = getattr(region.country, 'country_code', None)
if country_code:
cache.delete(f"visits_by_country_{country_code}_{user.id}")
except Exception:
# Avoid raising cache-related exceptions; best-effort invalidation
pass
@cache_page(CACHE_TTL)
@api_view(['GET'])
@permission_classes([IsAuthenticated])
def regions_by_country(request, country_code):
@@ -46,17 +19,11 @@ def regions_by_country(request, country_code):
@api_view(['GET'])
@permission_classes([IsAuthenticated])
def visits_by_country(request, country_code):
cache_key = f"visits_by_country_{country_code}_{request.user.id}"
data = cache.get(cache_key)
if data is not None:
return Response(data)
country = get_object_or_404(Country, country_code=country_code)
visits = VisitedRegion.objects.filter(region__country=country, user=request.user.id)
serializer = VisitedRegionSerializer(visits, many=True)
cache.set(cache_key, serializer.data, CACHE_TTL)
return Response(serializer.data)
@cache_page(CACHE_TTL)
@api_view(['GET'])
@permission_classes([IsAuthenticated])
def cities_by_region(request, region_id):
@@ -68,14 +35,9 @@ def cities_by_region(request, region_id):
@api_view(['GET'])
@permission_classes([IsAuthenticated])
def visits_by_region(request, region_id):
cache_key = f"visits_by_region_{region_id}_{request.user.id}"
data = cache.get(cache_key)
if data is not None:
return Response(data)
region = get_object_or_404(Region, id=region_id)
visits = VisitedCity.objects.filter(city__region=region, user=request.user.id)
serializer = VisitedCitySerializer(visits, many=True)
cache.set(cache_key, serializer.data, CACHE_TTL)
return Response(serializer.data)
# "Spin the globe" view: returns a random country, a random region in that country, and a random city in that region
@@ -99,7 +61,6 @@ def globespin(request):
return Response(data)
@method_decorator(cache_page(CACHE_TTL), name='list')
class CountryViewSet(viewsets.ReadOnlyModelViewSet):
queryset = Country.objects.all().order_by('name')
serializer_class = CountrySerializer
@@ -134,7 +95,6 @@ class CountryViewSet(viewsets.ReadOnlyModelViewSet):
continue
return Response({'regions_visited': count})
@method_decorator(cache_page(CACHE_TTL), name='list')
class RegionViewSet(viewsets.ReadOnlyModelViewSet):
queryset = Region.objects.all()
serializer_class = RegionSerializer
@@ -158,22 +118,13 @@ class VisitedRegionViewSet(viewsets.ModelViewSet):
serializer.is_valid(raise_exception=True)
self.perform_create(serializer)
headers = self.get_success_headers(serializer.data)
# Invalidate caches for this region and its country for the user
try:
region = serializer.validated_data.get('region')
invalidate_visit_caches_for_region_and_user(region, request.user)
except Exception:
pass
return Response(serializer.data, status=status.HTTP_201_CREATED, headers=headers)
def destroy(self, request, **kwargs):
region = get_object_or_404(Region, id=kwargs['pk'])
visited_region = VisitedRegion.objects.filter(user=request.user.id, region=region)
if visited_region.exists():
# capture region before deleting so we can invalidate caches
affected_region = visited_region.first().region
visited_region.delete()
invalidate_visit_caches_for_region_and_user(affected_region, request.user)
return Response(status=status.HTTP_204_NO_CONTENT)
else:
return Response({"error": "Visited region not found."}, status=status.HTTP_404_NOT_FOUND)
@@ -193,14 +144,10 @@ class VisitedCityViewSet(viewsets.ModelViewSet):
serializer = self.get_serializer(data=request.data)
serializer.is_valid(raise_exception=True)
self.perform_create(serializer)
# Ensure a VisitedRegion exists for the city and invalidate caches
# Ensure a VisitedRegion exists for the city
region = serializer.validated_data['city'].region
if not VisitedRegion.objects.filter(user=request.user.id, region=region).exists():
VisitedRegion.objects.create(user=request.user, region=region)
try:
invalidate_visit_caches_for_region_and_user(region, request.user)
except Exception:
pass
headers = self.get_success_headers(serializer.data)
return Response(serializer.data, status=status.HTTP_201_CREATED, headers=headers)
@@ -208,9 +155,7 @@ class VisitedCityViewSet(viewsets.ModelViewSet):
city = get_object_or_404(City, id=kwargs['pk'])
visited_city = VisitedCity.objects.filter(user=request.user.id, city=city)
if visited_city.exists():
region = city.region
visited_city.delete()
invalidate_visit_caches_for_region_and_user(region, request.user)
return Response(status=status.HTTP_204_NO_CONTENT)
else:
return Response({"error": "Visited city not found."}, status=status.HTTP_404_NOT_FOUND)

View File

@@ -8,7 +8,8 @@ stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr
[program:gunicorn]
command=/usr/local/bin/gunicorn main.wsgi:application --bind [::]:8000 --workers 2 --timeout 120
directory=/code
autorestart=true
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr
@@ -22,3 +23,12 @@ stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr
stdout_logfile_maxbytes=0
stderr_logfile_maxbytes=0
[program:sync_visited_regions]
command=/usr/local/bin/python3 /code/run_periodic_sync.py
directory=/code
autorestart=true
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr
stdout_logfile_maxbytes=0
stderr_logfile_maxbytes=0

Binary files not shown: seven documentation screenshots updated (six replaced with larger versions, e.g. 1.3 MiB → 4.4 MiB, 387 KiB → 1002 KiB, 882 KiB → 4.2 MiB; one new 677 KiB image added).

docker-compose.dev.yml (new file, 59 lines)
View File

@@ -0,0 +1,59 @@
services:
web:
build: ./frontend/
image: ghcr.io/seanmorley15/adventurelog-frontend:latest
container_name: adventurelog-frontend
restart: unless-stopped
user: root
env_file: .env
environment:
- CI=true
- NODE_OPTIONS=--max-old-space-size=4096
ports:
- "${FRONTEND_PORT:-8015}:3000"
depends_on:
- server
volumes:
- ./frontend:/app
- pnpm_store:/pnpm-store
command: sh -c "mkdir -p /pnpm-store && chown -R node:node /pnpm-store && su node -c 'pnpm config set store-dir /pnpm-store && pnpm install --frozen-lockfile && pnpm exec vite dev --host 0.0.0.0 --port 3000 --strictPort'"
db:
image: postgis/postgis:16-3.5
container_name: adventurelog-db
restart: unless-stopped
env_file: .env
volumes:
- postgres_data:/var/lib/postgresql/data/
server:
build: ./backend/
image: ghcr.io/seanmorley15/adventurelog-backend:latest
container_name: adventurelog-backend
restart: unless-stopped
env_file: .env
environment:
- DJANGO_SUPERUSER_USERNAME=${DJANGO_ADMIN_USERNAME}
- DJANGO_SUPERUSER_PASSWORD=${DJANGO_ADMIN_PASSWORD}
- DJANGO_SUPERUSER_EMAIL=${DJANGO_ADMIN_EMAIL}
ports:
- "${BACKEND_PORT:-8016}:8000"
depends_on:
- db
volumes:
- ./backend/server:/code
- adventurelog_media:/code/media/
command: >
sh -c "memcached -u nobody -m 64 -p 11211 -d;
until pg_isready -h db -p 5432 >/dev/null 2>&1; do sleep 1; done;
python manage.py migrate --noinput;
python manage.py shell -c \"from worldtravel.models import Country; import sys; sys.exit(0 if Country.objects.exists() else 1)\" || python manage.py download-countries;
if [ -n \"$$DJANGO_SUPERUSER_USERNAME\" ] && [ -n \"$$DJANGO_SUPERUSER_PASSWORD\" ] && [ -n \"$$DJANGO_SUPERUSER_EMAIL\" ]; then
python manage.py createsuperuser --noinput --username \"$$DJANGO_SUPERUSER_USERNAME\" --email \"$$DJANGO_SUPERUSER_EMAIL\" || true;
fi;
python manage.py runserver 0.0.0.0:8000"
volumes:
postgres_data:
adventurelog_media:
pnpm_store:

View File

@@ -108,7 +108,7 @@ export default defineConfig({
footer: {
message: "AdventureLog",
copyright: "Copyright © 2023-2026 Sean Morley",
},
logo: "/adventurelog.png",
@@ -138,6 +138,10 @@ export default defineConfig({
link: "/docs/install/kustomize",
},
{ text: "Unraid 🧡", link: "/docs/install/unraid" },
{
text: "Dev Container + WSL 🧰",
link: "/docs/install/dev_container_wsl",
},
{
text: "With A Reverse Proxy",
@@ -203,6 +207,10 @@ export default defineConfig({
text: "Authelia",
link: "https://www.authelia.com/integration/openid-connect/adventure-log/",
},
{
text: "Pocket ID",
link: "/docs/configuration/social_auth/pocket_id",
},
{
text: "Open ID Connect",
link: "/docs/configuration/social_auth/oidc",
@@ -219,6 +227,10 @@ export default defineConfig({
},
{ text: "SMTP Email", link: "/docs/configuration/email" },
{ text: "Umami Analytics", link: "/docs/configuration/analytics" },
{
text: "Advanced Configuration",
link: "/docs/configuration/advanced_configuration",
},
],
},
{
@@ -247,6 +259,10 @@ export default defineConfig({
text: "Admin Panel",
link: "/docs/guides/admin_panel",
},
{
text: "Invite a User",
link: "/docs/guides/invite_user",
},
{
text: "v0.7.1 Migration Guide",
link: "/docs/guides/v0-7-1_migration",
@@ -257,6 +273,10 @@ export default defineConfig({
text: "Changelogs",
collapsed: false,
items: [
{
text: "Development Timeline",
link: "/docs/changelogs/development_timeline",
},
{
text: "v0.11.0",
link: "/docs/changelogs/v0-11-0",

View File

@@ -0,0 +1,204 @@
# AdventureLog: Development Timeline & Origin Story
By: Sean Morley, Founder & Lead Developer
This is the timeline of **how AdventureLog came to be, how it kept surviving my terrible early design choices, and how it slowly learned to be useful**. I wrote this as a detailed, phase-by-phase story so contributors, users, and future-me can see what decisions were made, why, and what problems we hit (and fixed) along the way.
> TL;DR: started as _NextVenture_, learned web dev the hard way, resurrected as _AdventureLog_, switched stacks twice, survived a chaotic Reddit launch, grew through community requests, and today the project is very much alive.
## Quick roadmap
- **Phase 0 — Ideation & Prototyping:** March 2023 → July 2023
The seed. Lots of learning, lots of scrapped prototypes.
- **Phase 1 — AdventureLog, SvelteKit roots:** March 2024 → July 2024
Frontend-first, local-storage MVP, severe Docker struggles, file storage chaos, MinIO pain.
- **Phase 2 — Django saves the day:** July 2024 → August 2024
Backend matured quickly; REST API, Django admin, file storage sanity.
- **Phase 3 — Definition, Community Growth, and Integrations:** Sept 2024 → June 2025
Feature solidification, sharing, world data, Immich integration, big UX decisions.
- **Phase 4 — Solidification & Expansion:** June 2025 → Present
UI rebuild, rename of core concepts, activities/trails, heavy QoL and performance work.
## Phase 0 — Initial Ideation and Prototyping
**Dates:** March 2023 — July 2023
This phase was basically me being excited, naive, and wildly optimistic.
### What I planned
- Start as **NextVenture**: a curated list of national parks, cities, landmarks — places people check off. Simple premise, obvious joy.
- A focus on letting users mark where they've been, build a list of places to visit, and keep a little travel log.
### What actually happened
- I was learning the **React / Node / Express** (MERN) stack on the fly. Every problem felt like a mountain and every mountain required rewriting whole parts of the codebase once I learned better practices (it was not that enjoyable to be honest).
- I produced multiple small prototypes, each progressively less terrible than the last. Progress! But also lots of “why did I do that” moments.
- Burnout + humility set in: I needed a break to learn fundamentals rather than pile band-aids on a shaky codebase (yeah I didn't really learn a lot during that break either, but at least I wasn't actively writing bad code).
### Small but important pivots
- While on break I renamed the project to **AdventureLog** — it felt better, cleaner, and more fitting than the working title.
- I played with tiny experiments, tested UI ideas, and mentally tried different stacks so the next attempt wouldn't be purely guesswork. I was still very interested in the project; it just lacked the right technical foundation.
### Takeaway
Phase 0 was less about shipping and more about surviving the learning curve. The project's DNA (places, visits, memories) was clear; I just needed the right tools and patience to implement it.
## Phase 1 — Initial Development of AdventureLog (SvelteKit era)
**Dates:** March 2024 — July 2024
**Versions:** v0.1.0-alpha → v0.3.1
This was the “frontend-first, learn-by-doing” era. SvelteKit won me over because it's delightful to write and let me prototype fast. I still use SvelteKit for the frontend today and love it.
### Core progress
- Built a single-page app MVP where adventures were stored in **localStorage**. Simple, demoable, and enough to prove the concept.
- Learned SvelteKit app structure, routing, and how to think in reactive UI.
### Auth and backend beginnings
- Implemented authentication with **Lucia** so users could create accounts and persist data beyond local storage. That transition felt like leveling up.
- Switched from local-only to a backend API using **SvelteKit's API routes** to centralize storage and enable multi-device access.
### Deployment & DevOps pain
- Began containerizing everything: Dockerfiles (frontend & backend), `docker-compose`, and env variables. Took days of hair-pulling but I got a reliably deployable container. Victory was greatly needed at this point.
- File uploads became a major sticking point: SvelteKit had no baked-in file handling story. I experimented with a self-hosted S3-compatible solution — **MinIO**. It worked, but felt hacky: extra moving parts, weird integration edges, and a general “this isn't elegant” feeling. I pretty much knew at this point I was walking down a dead-end path...
### Major decision to pivot
- The MinIO + SvelteKit upload situation (and the need for a more robust API/admin story) made me decide to **rewrite the backend in Django**. I started the backend from scratch with a fresh project layout and a clearer architecture plan. This felt like ripping off a bandage: painful but necessary.
### Lessons learned
- Rapid frontend iteration is fantastic for shaping UX, but for persistent data and file handling, I needed a backend that provided batteries-included features (auth, file storage, admin) — enter Django.
## Phase 2 — Django Backend & Early Stability
**Dates:** July 2024 — August 2024
**Versions:** v0.4.0 → v0.6.0
After the SvelteKit experiment I rewrote the backend in Django + Django REST Framework. This phase is where the project matured technically in a big way.
### Why Django?
- **Django's admin**, built-in auth, and mature file handling made life dramatically easier. I could iterate on the API fast and manage the DB through a UI when debugging or testing. Django REST Framework allowed a clean separation between API and frontend.
### What changed (notably)
- Reused frontend SvelteKit components where possible, but the API endpoints were completely reworked to talk to Django.
- Switched file uploads from MinIO to Django's file storage on the server filesystem — simpler and, honestly, a relief.
- Introduced **collections**, **lodgings**, **notes**, **checklists** — broadening the scope beyond “just places” into trip planning and trip context. (Restaurants were later pruned and replaced with transportation models for better clarity.)
### Stability and schema work
- One big database change (v0.5.1): I switched primary keys to **UUIDs** instead of auto-incrementing integers. That was scary but intentional: UUIDs make merging and scaling safer later on. Happily, it was done early — before many users existed — which avoided painful migrations later.
### Community & launch
- I drafted a release post for r/selfhosted and decided to ship _before_ college started. On **August 15, 2024** I posted it, and it blew up more than I dared hope: **~400 upvotes, 180+ comments**, and a surge of installs and conversations. The repo got a large influx of attention and traffic, the kind of validation that keeps a project alive through times of doubt.
### Immediate aftermath
- I spent the next week triaging issues, helping users deploy, and shipping fixes. It was a stressful but extremely educational crunch while simultaneously moving to college. That crunch was intense, but it was also the moment I learned how real user feedback shapes a project's priorities.
### Takeaway
Switching to Django was the right move: it reduced friction, sped up backend feature development, and made the application more maintainable.
## Phase 3 — Defining AdventureLog & Community-Guided Growth
**Dates:** September 2024 — June 2025
**Versions:** v0.7.0 → v0.10.0
This phase is about defining the product: what is AdventureLog, what is it not, and how do we make it useful for other people?
### Core feature evolution
- **Multiple visits per location:** Users wanted to track repeat trips to the same place. Adding visit history became central to the data model.
- **Collection sharing & permissions:** Collections could be shared with other accounts for collaborative trip planning; implementing the permission logic was fiddly and involved a lot of debugging. But once it worked, collaboration felt genuinely useful.
- **World travel data:** Initially we were manually entering countries and regions. A generous contributor pointed to a JSON dataset with countries/regions/cities; integrating that made world travel features robust and maintainable. (City support came later.)
- **Categories & tags:** After debating categories vs tags, we leaned into categories as the primary organizational mechanism (with tags available as flexible metadata). Later, custom categories were added so users could create their own classification schemes.
### UX polish & identity
- Logo: I swapped out the placeholder Windows map emoji for a proper logo designed by a friend (thanks, Jacob!). It made the app look more “real” and brandable.
- Localization: Frontend got translations to make AdventureLog accessible to more users.
- Calendar view: added a calendar to visualize trips over time, another highly requested feature.
### Integrations & polish
- **Immich integration** (huge win): Sync photos from Immich to AdventureLog adventures. This solved the “where do my travel photos live?” problem for many self-hosters and reduced friction for users who already had an Immich instance.
- **Backend optimizations:** performance tweaks, PWA support, OIDC support for enterprise-friendly auth, and other server configurability options.
### Community milestones
- Docker image downloads crossed **100K** — a concrete, surreal milestone. GitHub stars crossed **1K** shortly after. These numbers matter because they mean people are using and relying on AdventureLog.
- Collections received “smart recommendations” — algorithmic suggestions for new locations to add to a collection based on existing entries. This added a bit of discovery and delight.
### Ops & deployment improvements
- Simplified deployment by removing an extra Nginx container. Instead the backend serves media via an internal Nginx proxy. Fewer containers made deployment easier for hobbyist hosts.
### Takeaway
Phase 3 is where AdventureLog stopped being “my little project” and started becoming a community-shaped tool. The roadmap was heavily guided by user requests, and that made the app both more useful and more fun to build.
## Phase 4 — Solidification & Expansion of the Core Platform
**Dates:** June 2025 — Present
**Versions:** v0.11.0 → Present
Now the project focuses on _polish, robustness, and expanding the core platform_ rather than constantly changing directions.
### Primary themes
- **Solidifying core UX**: a major UI rebuild to improve accessibility, usability, and cohesion. The goal was not only to look nicer but to make features easier to discover and use.
- **Expanding travel tracking & trip planning**: deeper integrations, better activity support, and more ways to view and interact with your travel history.
### Notable changes & features
- **Rename: “adventures” → “locations”**: This semantic pivot helped clarify the data model. A _location_ is a place that can have multiple _visits_; collections are groups of locations for trip planning. The rename reduced user confusion and aligned the product to real-world mental models.
- **Activities & Trails**:
- Activities: connect visits to activity providers (e.g., Strava imports) so users can show what they did at a location — not just that they were there.
- Trails: link trail data either via a URL or by integrating with self-hosted platforms (like Wanderer). This enriches the outdoor-adventure use case.
- **File attachments & broader media options:** allow PDFs and other travel documents to be attached to locations/visits.
- **Server configurability & geocoding:** more options for self-hosted operators, plus an optional Google Maps integration for geocoding.
### Ongoing priorities
- Performance tuning and bug fixes continue to be the top priority — the fewer regressions, the more people trust the app.
- Accessibility improvements, better testing, and expanding integrations in a way that doesn't bloat the core experience.
### Takeaway
This phase is about turning AdventureLog from “a promising tool” into “a dependable tool.” It's less about big rewrites and more about incremental, meaningful improvements.
## Lessons, patterns, and a few thoughts
1. **Pick the right tool for the job**
- The SvelteKit prototype phase taught me how fast UI iteration can progress. The Django rewrite taught me you can't ignore backend primitives (auth, file handling, admin) if you want to ship a stable self-hosted app. Each stack had strengths; use them where they matter.
2. **Community feedback is gold**
- The Reddit launch pushed the project into real usage. Responding to issues and user requests shaped core features more than any design doc ever could.
3. **Keep breaking changes reasonable**
- UUIDs as primary keys were scary, but doing it early saved headaches. Plan big breaking changes early; avoid them once people rely on your software.
4. **Simplicity wins in deployment**
- Removing extra containers and simplifying deployment options made AdventureLog more approachable for hobbyist hosts — which is the core audience.
5. **Iterate visibly**
- Small, visible wins (better login flow, calendar, Immich sync) build momentum and community trust.
## Current state & what's next
AdventureLog is alive, maintained, and focused on being the best self-hosted travel app it can be: accessible, performant, and useful for trip planning and personal travel history.
Writing this made me realize how much of AdventureLog's identity came from mistakes, feedback, and stubbornness. It's the result of learning, throwing away things that didn't work, embracing tools that did, and listening to people who actually used it. I'm proud of how it's evolved and excited for the next phase.
If you made it this far: thanks. If you want to help — issues, PRs, ideas, or design feedback are always welcome. The project is alive because of an amazing community of users and contributors!
— Sean

View File

@@ -0,0 +1,9 @@
# Advanced Configuration
In addition to the primary configuration variables listed above, there are several optional environment variables that can be set to further customize your AdventureLog instance. These variables are not required for a basic setup but can enhance functionality and security.
| Name | Required | Description | Default Value | Variable Location |
| ---------------------------- | -------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | ------------- | ----------------- |
| `ACCOUNT_EMAIL_VERIFICATION` | No | Enable email verification for new accounts. Options are `none`, `optional`, or `mandatory` | `none` | Backend |
| `FORCE_SOCIALACCOUNT_LOGIN` | No | When set to `True`, only social login is allowed (no password login). The login page will show only social providers or redirect directly to the first provider if only one is configured. | `False` | Backend |
| `SOCIALACCOUNT_ALLOW_SIGNUP` | No | When set to `True`, signup will be allowed via social providers even if registration is disabled. | `False` | Backend |
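As an illustrative sketch (not part of the official compose file — the service name `server` is an assumption), these variables could be set in the backend service's `environment` block:

```yaml
services:
  server:                                      # backend service name is an assumption
    environment:
      - ACCOUNT_EMAIL_VERIFICATION=mandatory   # require email verification for new accounts
      - FORCE_SOCIALACCOUNT_LOGIN=True         # hide password login; show social providers only
      - SOCIALACCOUNT_ALLOW_SIGNUP=True        # allow signup via social providers
```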

View File

@@ -0,0 +1,143 @@
# Pocket ID OIDC Authentication
<img src="https://pocket-id.org/logo.png" alt="Pocket ID Logo" width="400" />
Pocket ID is a lightweight, self-hosted OpenID Connect (OIDC) identity provider. AdventureLog can be configured to use Pocket ID for social authentication using its built-in OpenID Connect support.
Once Pocket ID is configured by an administrator, users can sign in to AdventureLog using their Pocket ID account and optionally link it to an existing AdventureLog account.
---
# Configuration
To enable Pocket ID as an identity provider, both Pocket ID and AdventureLog must be configured correctly. The most important (and least obvious) part of this setup is the **callback URL**, which must match AdventureLog's internal OIDC routing.
---
## Pocket ID Configuration
1. Log in to your Pocket ID admin interface.
2. Navigate to **Clients** and create a new client.
3. Name the client something like `AdventureLog`.
4. Set the **Redirect / Callback URL** to:
```
https://<adventurelog-backend.example.com>/accounts/oidc/<CLIENT_ID>/login/callback/
```
- Replace `<adventurelog-backend.example.com>` with the **backend** URL of your AdventureLog instance.
- Replace `<CLIENT_ID>` with the **Pocket ID client ID** exactly as generated.
- This path is required and currently not auto-documented by Pocket ID or AdventureLog.
5. Ensure the client type is **Confidential**.
6. Copy the generated **Client ID** and **Client Secret** — you will need both for AdventureLog.
---
## AdventureLog Configuration
This configuration is done in the [Admin Panel](../../guides/admin_panel.md). You can launch it from the `Settings` page or navigate directly to `/admin` on your AdventureLog server.
1. Log in to AdventureLog as an administrator.
2. Navigate to **Settings** → **Administration Settings** and launch the admin panel.
3. Go to **Social Accounts**.
4. Under **Social applications**, click **Add**.
5. Fill in the fields as follows:
### Social Application Settings
- **Provider**: `OpenID Connect`
- **Provider ID**: Pocket ID Client ID
- **Name**: `Pocket ID`
- **Client ID**: Pocket ID Client ID
- **Secret Key**: Pocket ID Client Secret
- **Key**: _(leave blank)_
- **Settings**:
```json
{
"server_url": "https://<pocketid-url>/.well-known/openid-configuration"
}
```
- Replace `<pocketid-url>` with the base URL of your Pocket ID instance.
::: warning
Do **not** use `localhost` unless Pocket ID is running on the same machine and is resolvable from inside the AdventureLog container or service. Use a domain name or LAN IP instead.
:::
- **Sites**: Move the sites on which Pocket ID should be enabled into the chosen list (usually `example.com` and `www.example.com`).
6. Save the configuration.
Ensure Pocket ID is running and reachable by AdventureLog.
---
## What It Should Look Like
Once configured correctly:
- Pocket ID appears as a login option on the AdventureLog login screen.
- Logging in redirects to Pocket ID, then back to AdventureLog without errors.
---
## Linking to an Existing Account
If a user already has an AdventureLog account:
1. Log in to AdventureLog normally.
2. Go to **Settings**.
3. Click **Launch Account Connections**.
4. Choose **Pocket ID** to link the identity to the existing account.
This allows future logins using Pocket ID without creating a duplicate account.
---
## Troubleshooting
### 404 Error After Login
Ensure that:
- `/accounts` routes are handled by the **backend**, not the frontend.
- Your reverse proxy (Nginx, Traefik, Caddy, etc.) forwards `/accounts/*` correctly.
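As a hedged sketch for Nginx specifically (the upstream host and port are assumptions — adapt them to your deployment), the backend route could look like:

```nginx
# Forward auth routes to the AdventureLog backend, not the frontend.
location /accounts/ {
    proxy_pass http://adventurelog-backend:8000;   # assumed backend host:port
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
```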
---
### Invalid Redirect URI
- Double-check that the callback URL in Pocket ID exactly matches:
```
/accounts/oidc/<CLIENT_ID>/login/callback/
```
- The `<CLIENT_ID>` must match the value used in the AdventureLog social application.
---
### Cannot Reach Pocket ID
- Verify that the `.well-known/openid-configuration` endpoint is accessible from the AdventureLog server.
- Test by opening:
```
https://<pocketid-url>/.well-known/openid-configuration
```
in a browser.
---
## Notes
- Pocket ID configuration is very similar to Authentik.
- The main difference is the **explicit callback URL requirement** and the use of the `.well-known/openid-configuration` endpoint as the `server_url`.
- This setup works with Docker, Docker Compose, and bare-metal deployments as long as networking is configured correctly.

View File

@@ -0,0 +1,6 @@
# Invite a User
AdventureLog allows admin users to invite new users to create an account via an email invitation.
Invites are sent from the AdventureLog [Admin Panel](admin_panel.md) or via a custom invite form. The invite section of the admin panel can be found at `/admin/invitations/invitation/add/`.
Tip: make sure you have a working email setup for AdventureLog to send emails. See the [Email Configuration Guide](../configuration/email.md) for more information.

View File

@@ -0,0 +1,180 @@
# Dev Container + WSL 🧰
Running AdventureLog in a **Dev Container** allows you to contribute to the project or work on features locally in a fully reproducible development environment with hot reloading, debugging, and tooling isolated inside Docker.
This guide focuses on **Windows using WSL 2**, but the workflow is similar on other platforms.
## Prerequisites
Before starting, ensure you have the following installed:
* **Docker Desktop**
Download from: [https://www.docker.com/products/docker-desktop/](https://www.docker.com/products/docker-desktop/)
> Docker Desktop must be configured to use **WSL 2**
> Make sure Docker Desktop is running before you start the steps below.
* **WSL 2 with a Linux distribution installed**
Ubuntu is recommended.
```bash
wsl --install -d Ubuntu
```
Run this in **Windows PowerShell** (or **Windows Terminal**).
* **Visual Studio Code**
[https://code.visualstudio.com/](https://code.visualstudio.com/)
* **VS Code Extensions**
* Dev Containers
* WSL
## ⚠️ Important Notes (Read First)
> **TIP**
> Do not use the `docker-desktop` WSL distribution for development.
> Always use a real Linux distro such as **Ubuntu**.
> **TIP**
> Avoid working in `/mnt/c/...`.
> Clone and work inside your Linux home directory (`/home/<user>`), otherwise file watching and container mounts may behave incorrectly.
> **TIP**
> Docker must be available *inside* WSL. Make sure WSL integration is enabled in Docker Desktop:
>
> **Docker Desktop → Settings → Resources → WSL Integration → Enable Ubuntu**
## Getting Started
### 1. Clone the Repository (inside WSL)
Open a WSL terminal (search for "WSL" in the Windows Start menu and open the WSL terminal), then run:
```bash
cd ~
git clone https://github.com/seanmorley15/AdventureLog.git
cd AdventureLog
```
> **TIP**
> If you plan to contribute changes, fork the repository on GitHub and clone your fork instead:
>
> ```bash
> git clone https://github.com/<your-username>/AdventureLog.git
> ```
### 2. Create the Development `.env` File (via WSL)
```bash
cp .env.example .env && sed -i 's/^DEBUG=.*/DEBUG=True/' .env
```
This creates the `.env` file required for the containers to start and enables DEBUG for local development.
> **NOTE**
> The rest of the defaults in `.env.example` are sufficient for running the project.
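To illustrate what the one-liner does (using a tiny stand-in `.env.example` — the real file has more keys; run this in a scratch directory, not the repo):

```shell
# Create a minimal stand-in for .env.example
printf 'DEBUG=False\nSECRET_KEY=changeme\n' > .env.example

# Same pattern as above: copy the template, then flip DEBUG on
cp .env.example .env && sed -i 's/^DEBUG=.*/DEBUG=True/' .env

grep '^DEBUG=' .env   # prints: DEBUG=True
```

Only the `DEBUG` line is rewritten; every other key keeps its default value.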
#### Environment Variables
The Dev Container setup uses the same `.env` configuration as the standard Docker installation.
For a full list of available environment variables and optional configuration options, see the
[**Docker 🐋 installation guide**](docker.md#configuration).
### 3. Open the Project in VS Code (via WSL)
From the project directory:
```bash
code .
```
VS Code should indicate that the folder is opened **in WSL**.
### 4. Reopen the Project in a Dev Container
In VS Code:
1. Press **Ctrl + Shift + P**
2. Select **Dev Containers: Reopen in Container**
VS Code will:
* Build the development containers
* Install dependencies
* Attach the editor to the running container
The first build usually takes around 30 seconds.
## Running the Application
Once the Dev Container is running, the services are started using Docker Compose.
Use the VS Code terminal (inside the Dev Container) for the commands below.
To start the app, enter the following command:
```bash
docker compose -f docker-compose.dev.yml up --build
```
Bringing the app up usually takes around 1-2 minutes.
To fully reset the database and media volumes, run:
```bash
docker compose -f docker-compose.dev.yml down -v
```
## Accessing the App
* **Frontend (Web UI)**
[http://localhost:8015](http://localhost:8015)
* **Backend (API)**
[http://localhost:8016](http://localhost:8016)
Admin credentials are taken from your `.env` file. The `docker-compose.dev.yml` setup auto-creates a superuser on startup using those values so you can log in right away.
It also checks whether the countries/flags data already exists before re-importing it, so the first build can take longer and subsequent `down`/`up` runs are faster.
This dev setup can feel a bit slower because hot reload, dependency installs, and initial database bootstrapping all happen inside containers.
## Common Issues
### Docker Not Found Inside WSL
If you see:
```
The command 'docker' could not be found in this WSL 2 distro
```
Ensure:
* Docker Desktop is running
* WSL integration is enabled for **Ubuntu**
* Docker Desktop has been restarted after enabling integration
### Accidentally Using `/mnt/c`
If the project lives under `/mnt/c/...`, move it to:
```bash
/home/<user>/AdventureLog
```
This avoids performance issues and file watcher bugs.
## Dev vs Production
| Feature | Docker Install | Dev Container |
| ------------ | --------------- | ------------------ |
| Intended use | Running the app | Developing the app |
| Hot reload | ❌ | ✅ |
| Debugging | ❌ | ✅ |
| Code editing | ❌ | ✅ |
For production or personal hosting, follow the standard
[**Docker 🐋 installation guide**](docker.md).
Enjoy contributing to AdventureLog! 🎉
If you run into issues not covered here, please open a discussion or issue so the docs can be improved.

View File

@@ -22,6 +22,7 @@ Perfect for Docker beginners.
## ⚙️ Advanced & Alternative Setups
- [Nginx Proxy Manager](nginx_proxy_manager.md) Easy reverse proxy config
- [Nginx Proxy Manager](nginx_proxy_manager.md) - Easy reverse proxy config
- [Traefik](traefik.md) — Dynamic reverse proxy with automation
- [Caddy](caddy.md) — Automatic HTTPS with a clean config
- [Dev Container + WSL](dev_container_wsl.md) - Windows dev environment with WSL 2 + Dev Containers

View File

@@ -7,7 +7,7 @@ Community Applications Page for PostGIS: [PostGIS on CA Store](https://unraid.ne
## Installation Configuration
- **Note:** It is recommended to install the applications in the order of these instructions, as failing to do so could cause issues.\
- **Note:** It is recommended to install the applications in the order of these instructions, as failing to do so could cause issues.
- Container names can be set to whatever you desire.
- Also ensure they are all on the same custom network so they can communicate with one another. You can create one by running the following command in your command line, with `example` being set to your desired name. This network will then show up for selection when making the apps/containers.

View File

@@ -31,3 +31,5 @@ AdventureLog is open-source software, licensed under the GPL-3.0 license. This m
Hi, I'm [Sean Morley](https://seanmorley.com), the creator of AdventureLog. I'm an Electrical Engineering student at the University of Connecticut, and I'm passionate about open-source software and building modern tools that help people solve real-world problems. I created AdventureLog to solve a problem: the lack of a modern, open-source, user-friendly travel companion. Many existing travel apps are either too complex, too expensive, or too closed-off to be useful for the average traveler. AdventureLog aims to be the opposite: simple, beautiful, and open to everyone.
I hope you enjoy using AdventureLog as much as I enjoy creating it! If you have any questions, feedback, or suggestions, feel free to reach out to me via the email address listed on my website. I'm always happy to hear from users and help in any way I can. Thank you for using AdventureLog, and happy travels! 🌍
If you want to check out the origins of AdventureLog and the timeline, process, and story of its development, check out the [Development Timeline](/docs/changelogs/development_timeline.md).

View File

@@ -1,6 +1,6 @@
{
"devDependencies": {
"vitepress": "^1.6.3"
"vitepress": "^1.6.4"
},
"scripts": {
"docs:dev": "vitepress dev",
@@ -8,7 +8,7 @@
"docs:preview": "vitepress preview"
},
"dependencies": {
"prettier": "^3.3.3",
"vue": "^3.5.13"
"prettier": "^3.7.4",
"vue": "^3.5.26"
}
}

View File

@@ -9,15 +9,15 @@ importers:
.:
dependencies:
prettier:
specifier: ^3.3.3
version: 3.3.3
specifier: ^3.7.4
version: 3.7.4
vue:
specifier: ^3.5.13
version: 3.5.13
specifier: ^3.5.26
version: 3.5.26
devDependencies:
vitepress:
specifier: ^1.6.3
version: 1.6.3(@algolia/client-search@5.15.0)(postcss@8.4.49)(search-insights@2.17.3)
specifier: ^1.6.4
version: 1.6.4(@algolia/client-search@5.15.0)(postcss@8.5.6)(search-insights@2.17.3)
packages:
@@ -93,21 +93,21 @@ packages:
resolution: {integrity: sha512-b1jTpbFf9LnQHEJP5ddDJKE2sAlhYd7EVSOWgzo/27n/SfCoHfqD0VWntnWYD83PnOKvfe8auZ2+xCb0TXotrQ==}
engines: {node: '>= 14.0.0'}
'@babel/helper-string-parser@7.25.9':
resolution: {integrity: sha512-4A/SCr/2KLd5jrtOMFzaKjVtAei3+2r/NChoBNoZ3EyP/+GlhoaEGoWOZUmFmoITP7zOJyHIMm+DYRd8o3PvHA==}
'@babel/helper-string-parser@7.27.1':
resolution: {integrity: sha512-qMlSxKbpRlAridDExk92nSobyDdpPijUq2DW6oDnUqd0iOGxmQjyqhMIihI9+zv4LPyZdRje2cavWPbCbWm3eA==}
engines: {node: '>=6.9.0'}
'@babel/helper-validator-identifier@7.25.9':
resolution: {integrity: sha512-Ed61U6XJc3CVRfkERJWDz4dJwKe7iLmmJsbOGu9wSloNSFttHV0I8g6UAgb7qnK5ly5bGLPd4oXZlxCdANBOWQ==}
'@babel/helper-validator-identifier@7.28.5':
resolution: {integrity: sha512-qSs4ifwzKJSV39ucNjsvc6WVHs6b7S03sOh2OcHF9UHfVPqWWALUsNUVzhSBiItjRZoLHx7nIarVjqKVusUZ1Q==}
engines: {node: '>=6.9.0'}
'@babel/parser@7.26.2':
resolution: {integrity: sha512-DWMCZH9WA4Maitz2q21SRKHo9QXZxkDsbNZoVD62gusNtNBBqDg9i7uOhASfTfIGNzW+O+r7+jAlM8dwphcJKQ==}
'@babel/parser@7.28.5':
resolution: {integrity: sha512-KKBU1VGYR7ORr3At5HAtUQ+TV3SzRCXmA/8OdDZiLDBIZxVyzXuztPjfLd3BV1PRAQGCMWWSHYhL0F8d5uHBDQ==}
engines: {node: '>=6.0.0'}
hasBin: true
'@babel/types@7.26.0':
resolution: {integrity: sha512-Z/yiTPj+lDVnF7lWeKCIJzaIkI0vYO87dMpZ4bg4TDrFe4XXLFWL1TbXU27gBP3QccxV9mZICCrnjnYlJjXHOA==}
'@babel/types@7.28.5':
resolution: {integrity: sha512-qQ5m48eI/MFLQ5PxQj4PFaprjyCTLI37ElWMmNs0K8Lk3dVeOdNpB3ks8jc7yM5CDmVC73eMVk/trk3fgmrUpA==}
engines: {node: '>=6.9.0'}
'@docsearch/css@3.8.2':
@@ -277,8 +277,8 @@ packages:
'@iconify/types@2.0.0':
resolution: {integrity: sha512-+wluvCrRhXrhyOmRDJ3q8mux9JkKy5SJ/v8ol2tu4FVjyYvtEzkc/3pK15ET6RKg4b4w4BmTk1+gsCUhf21Ykg==}
'@jridgewell/sourcemap-codec@1.5.0':
resolution: {integrity: sha512-gv3ZRaISU3fjPAgNsriBRqGWQL6quFx04YMPW/zD8XMLsU32mhCCbfbO6KZFLjvYpCZ8zyDEgqsgf+PwPaM7GQ==}
'@jridgewell/sourcemap-codec@1.5.5':
resolution: {integrity: sha512-cYQ9310grqxueWbl+WuIUIaiUaDcj7WOq5fVhEljNVgRfOUhY9fy2zTvfoqWsnebh8Sl70VScFbICvJnLKB0Og==}
'@rollup/rollup-android-arm-eabi@4.27.4':
resolution: {integrity: sha512-2Y3JT6f5MrQkICUyRVCw4oa0sutfAsgaSsb0Lmmy1Wi2y7X5vT9Euqw4gOsCyy0YfKURBg35nhUKZS4mDcfULw==}
@@ -428,17 +428,17 @@ packages:
vite: ^5.0.0 || ^6.0.0
vue: ^3.2.25
'@vue/compiler-core@3.5.13':
resolution: {integrity: sha512-oOdAkwqUfW1WqpwSYJce06wvt6HljgY3fGeM9NcVA1HaYOij3mZG9Rkysn0OHuyUAGMbEbARIpsG+LPVlBJ5/Q==}
'@vue/compiler-core@3.5.26':
resolution: {integrity: sha512-vXyI5GMfuoBCnv5ucIT7jhHKl55Y477yxP6fc4eUswjP8FG3FFVFd41eNDArR+Uk3QKn2Z85NavjaxLxOC19/w==}
'@vue/compiler-dom@3.5.13':
resolution: {integrity: sha512-ZOJ46sMOKUjO3e94wPdCzQ6P1Lx/vhp2RSvfaab88Ajexs0AHeV0uasYhi99WPaogmBlRHNRuly8xV75cNTMDA==}
'@vue/compiler-dom@3.5.26':
resolution: {integrity: sha512-y1Tcd3eXs834QjswshSilCBnKGeQjQXB6PqFn/1nxcQw4pmG42G8lwz+FZPAZAby6gZeHSt/8LMPfZ4Rb+Bd/A==}
'@vue/compiler-sfc@3.5.13':
resolution: {integrity: sha512-6VdaljMpD82w6c2749Zhf5T9u5uLBWKnVue6XWxprDobftnletJ8+oel7sexFfM3qIxNmVE7LSFGTpv6obNyaQ==}
'@vue/compiler-sfc@3.5.26':
resolution: {integrity: sha512-egp69qDTSEZcf4bGOSsprUr4xI73wfrY5oRs6GSgXFTiHrWj4Y3X5Ydtip9QMqiCMCPVwLglB9GBxXtTadJ3mA==}
'@vue/compiler-ssr@3.5.13':
resolution: {integrity: sha512-wMH6vrYHxQl/IybKJagqbquvxpWCuVYpoUJfCqFZwa/JY1GdATAQ+TgVtgrwwMZ0D07QhA99rs/EAAWfvG6KpA==}
'@vue/compiler-ssr@3.5.26':
resolution: {integrity: sha512-lZT9/Y0nSIRUPVvapFJEVDbEXruZh2IYHMk2zTtEgJSlP5gVOqeWXH54xDKAaFS4rTnDeDBQUYDtxKyoW9FwDw==}
'@vue/devtools-api@7.7.6':
resolution: {integrity: sha512-b2Xx0KvXZObePpXPYHvBRRJLDQn5nhKjXh7vUhMEtWxz1AYNFOVIsh5+HLP8xDGL7sy+Q7hXeUxPHB/KgbtsPw==}
@@ -449,23 +449,26 @@ packages:
'@vue/devtools-shared@7.7.6':
resolution: {integrity: sha512-yFEgJZ/WblEsojQQceuyK6FzpFDx4kqrz2ohInxNj5/DnhoX023upTv4OD6lNPLAA5LLkbwPVb10o/7b+Y4FVA==}
'@vue/reactivity@3.5.13':
resolution: {integrity: sha512-NaCwtw8o48B9I6L1zl2p41OHo/2Z4wqYGGIK1Khu5T7yxrn+ATOixn/Udn2m+6kZKB/J7cuT9DbWWhRxqixACg==}
'@vue/reactivity@3.5.26':
resolution: {integrity: sha512-9EnYB1/DIiUYYnzlnUBgwU32NNvLp/nhxLXeWRhHUEeWNTn1ECxX8aGO7RTXeX6PPcxe3LLuNBFoJbV4QZ+CFQ==}
'@vue/runtime-core@3.5.13':
resolution: {integrity: sha512-Fj4YRQ3Az0WTZw1sFe+QDb0aXCerigEpw418pw1HBUKFtnQHWzwojaukAs2X/c9DQz4MQ4bsXTGlcpGxU/RCIw==}
'@vue/runtime-core@3.5.26':
resolution: {integrity: sha512-xJWM9KH1kd201w5DvMDOwDHYhrdPTrAatn56oB/LRG4plEQeZRQLw0Bpwih9KYoqmzaxF0OKSn6swzYi84e1/Q==}
'@vue/runtime-dom@3.5.13':
resolution: {integrity: sha512-dLaj94s93NYLqjLiyFzVs9X6dWhTdAlEAciC3Moq7gzAc13VJUdCnjjRurNM6uTLFATRHexHCTu/Xp3eW6yoog==}
'@vue/runtime-dom@3.5.26':
resolution: {integrity: sha512-XLLd/+4sPC2ZkN/6+V4O4gjJu6kSDbHAChvsyWgm1oGbdSO3efvGYnm25yCjtFm/K7rrSDvSfPDgN1pHgS4VNQ==}
'@vue/server-renderer@3.5.13':
resolution: {integrity: sha512-wAi4IRJV/2SAW3htkTlB+dHeRmpTiVIK1OGLWV1yeStVSebSQQOwGwIq0D3ZIoBj2C2qpgz5+vX9iEBkTdk5YA==}
'@vue/server-renderer@3.5.26':
resolution: {integrity: sha512-TYKLXmrwWKSodyVuO1WAubucd+1XlLg4set0YoV+Hu8Lo79mp/YMwWV5mC5FgtsDxX3qo1ONrxFaTP1OQgy1uA==}
peerDependencies:
vue: 3.5.13
vue: 3.5.26
'@vue/shared@3.5.13':
resolution: {integrity: sha512-/hnE/qP5ZoGpol0a5mDi45bOd7t3tjYJBjsgCsivow7D48cJeV5l05RD82lPqi7gRiphZM37rnhW1l6ZoCNNnQ==}
'@vue/shared@3.5.26':
resolution: {integrity: sha512-7Z6/y3uFI5PRoKeorTOSXKcDj0MSasfNNltcslbFrPpcw6aXRUALq4IfJlaTRspiWIUOEZbrpM+iQGmCOiWe4A==}
'@vueuse/core@12.8.2':
resolution: {integrity: sha512-HbvCmZdzAu3VGi/pWYm5Ut+Kd9mn1ZHnn4L5G8kOQTPs/IwIAmJoBrmYk2ckLArgMXZj0AW3n5CAejLUO+PhdQ==}
@@ -539,8 +542,8 @@ packages:
resolution: {integrity: sha512-yCEafptTtb4bk7GLEQoM8KVJpxAfdBJYaXyzQEgQQQgYrZiDp8SJmGKlYza6CYjEDNstAdNdKA3UuoULlEbS6w==}
engines: {node: '>=12.13'}
csstype@3.1.3:
resolution: {integrity: sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw==}
csstype@3.2.3:
resolution: {integrity: sha512-z1HGKcYy2xA8AGQfwrn0PAy+PB7X/GSj3UVJW9qKyn43xWa+gl5nXmU4qqLMRzWVLFC8KusUX8T/0kCiOYpAIQ==}
dequal@2.0.3:
resolution: {integrity: sha512-0je+qPKHEMohvfRTCEo3CrPG6cAzAYgmzKyxRiYSSDkS6eGJdyVJm7WaYA5ECaAD9wLB2T4EEeymA5aFVcYXCA==}
@@ -552,8 +555,8 @@ packages:
emoji-regex-xs@1.0.0:
resolution: {integrity: sha512-LRlerrMYoIDrT6jgpeZ2YYl/L8EulRTt5hQcYjy5AInh7HWXKimpqx68aknBFpGL2+/IcogTcaydJEgaTmOpDg==}
entities@4.5.0:
resolution: {integrity: sha512-V0hjH4dGPh9Ao5p0MoRY6BVqtwCjhz6vI5LT8AJ55H+4g9/4vbHx1I54fS0XuclLhDHArPQCiMjDxjaL8fPxhw==}
entities@7.0.0:
resolution: {integrity: sha512-FDWG5cmEYf2Z00IkYRhbFrwIwvdFKH07uV8dvNy0omp/Qb1xcyCWp2UDtcwJF4QZZvk0sLudP6/hAu42TaqVhQ==}
engines: {node: '>=0.12'}
esbuild@0.21.5:
@@ -588,8 +591,8 @@ packages:
resolution: {integrity: sha512-ZhMwEosbFJkA0YhFnNDgTM4ZxDRsS6HqTo7qsZM08fehyRYIYa0yHu5R6mgo1n/8MgaPBXiPimPD77baVFYg+A==}
engines: {node: '>=12.13'}
magic-string@0.30.13:
resolution: {integrity: sha512-8rYBO+MsWkgjDSOvLomYnzhdwEG51olQ4zL5KXnNJWV5MNmrb4rTZdrtkhxjnD/QyZUqR/Z/XDsUs/4ej2nx0g==}
magic-string@0.30.21:
resolution: {integrity: sha512-vd2F4YUyEXKGcLHoq+TEyCjxueSeHnFxyyjNp80yg0XV4vUhnDer/lvvlqM/arB5bXQN5K2/3oinyCRyx8T2CQ==}
mark.js@8.11.1:
resolution: {integrity: sha512-1I+1qpDt4idfgLQG+BNWmrqku+7/2bi5nLf4YwF8y8zXvmfiTBY3PV3ZibfrjBueCByROpuBjLLFCajqkgYoLQ==}
@@ -618,6 +621,11 @@ packages:
mitt@3.0.1:
resolution: {integrity: sha512-vKivATfr97l2/QBCYAkXYDbrIWPM2IIKEl7YPhjCvKlG3kE2gm+uBo6nEXK3M5/Ffh/FLpKExzOQ3JJoJGFKBw==}
nanoid@3.3.11:
resolution: {integrity: sha512-N8SpfPUnUp1bK+PMYW8qSWdl9U+wwNWI4QKxOYDy9JAro3WMX7p2OeVRF9v+347pnakNevPmiHhNmZ2HbFA76w==}
engines: {node: ^10 || ^12 || ^13.7 || ^14 || >=15.0.1}
hasBin: true
nanoid@3.3.8:
resolution: {integrity: sha512-WNLf5Sd8oZxOm+TzppcYk8gVOgP+l58xNy58D0nbUnOxOWRWvlcCV4kUF7ltmI6PsrLl/BgKEyS4mqsGChFN0w==}
engines: {node: ^10 || ^12 || ^13.7 || ^14 || >=15.0.1}
@@ -636,11 +644,15 @@ packages:
resolution: {integrity: sha512-OCVPnIObs4N29kxTjzLfUryOkvZEq+pf8jTF0lg8E7uETuWHA+v7j3c/xJmiqpX450191LlmZfUKkXxkTry7nA==}
engines: {node: ^10 || ^12 || >=14}
postcss@8.5.6:
resolution: {integrity: sha512-3Ybi1tAuwAP9s0r1UQ2J4n5Y0G05bJkpUIO0/bI9MhwmD70S5aTWbXGBwxHrelT+XM1k6dM0pk+SwNkpTRN7Pg==}
engines: {node: ^10 || ^12 || >=14}
preact@10.25.0:
resolution: {integrity: sha512-6bYnzlLxXV3OSpUxLdaxBmE7PMOu0aR3pG6lryK/0jmvcDFPlcXGQAt5DpK3RITWiDrfYZRI0druyaK/S9kYLg==}
prettier@3.3.3:
resolution: {integrity: sha512-i2tDNA0O5IrMO757lfrdQZCc2jPNDVntV0m/+4whiDfWaTKfMNgR7Qz0NAeGz/nRqF4m5/6CLzbP4/liHt12Ew==}
prettier@3.7.4:
resolution: {integrity: sha512-v6UNi1+3hSlVvv8fSaoUbggEM5VErKmmpGA7Pl3HF8V6uKY7rvClBOJlH6yNwQtfTueNkGVpOv/mtWL9L4bgRA==}
engines: {node: '>=14'}
hasBin: true
@@ -746,8 +758,8 @@ packages:
terser:
optional: true
vitepress@1.6.3:
resolution: {integrity: sha512-fCkfdOk8yRZT8GD9BFqusW3+GggWYZ/rYncOfmgcDtP3ualNHCAg+Robxp2/6xfH1WwPHtGpPwv7mbA3qomtBw==}
vitepress@1.6.4:
resolution: {integrity: sha512-+2ym1/+0VVrbhNyRoFFesVvBvHAVMZMK0rw60E3X/5349M1GuVdKeazuksqopEdvkKwKGs21Q729jX81/bkBJg==}
hasBin: true
peerDependencies:
markdown-it-mathjax3: ^4
@@ -758,8 +770,8 @@ packages:
postcss:
optional: true
vue@3.5.13:
resolution: {integrity: sha512-wmeiSMxkZCSc+PM2w2VRsOYAZC8GdipNFRTsLSfodVqI9mbejKeXEGr8SckuLnrQPGe3oJN5c3K0vpoU9q/wCQ==}
vue@3.5.26:
resolution: {integrity: sha512-SJ/NTccVyAoNUJmkM9KUqPcYlY+u8OVL1X5EW9RIs3ch5H2uERxyyIUI4MRxVCSOiEcupX9xNGde1tL9ZKpimA==}
peerDependencies:
typescript: '*'
peerDependenciesMeta:
@@ -876,18 +888,18 @@ snapshots:
dependencies:
'@algolia/client-common': 5.15.0
'@babel/helper-string-parser@7.25.9': {}
'@babel/helper-string-parser@7.27.1': {}
'@babel/helper-validator-identifier@7.25.9': {}
'@babel/helper-validator-identifier@7.28.5': {}
'@babel/parser@7.26.2':
'@babel/parser@7.28.5':
dependencies:
'@babel/types': 7.26.0
'@babel/types': 7.28.5
'@babel/types@7.26.0':
'@babel/types@7.28.5':
dependencies:
'@babel/helper-string-parser': 7.25.9
'@babel/helper-validator-identifier': 7.25.9
'@babel/helper-string-parser': 7.27.1
'@babel/helper-validator-identifier': 7.28.5
'@docsearch/css@3.8.2': {}
@@ -988,7 +1000,7 @@ snapshots:
'@iconify/types@2.0.0': {}
'@jridgewell/sourcemap-codec@1.5.0': {}
'@jridgewell/sourcemap-codec@1.5.5': {}
'@rollup/rollup-android-arm-eabi@4.27.4':
optional: true
@@ -1109,40 +1121,40 @@ snapshots:
'@ungap/structured-clone@1.2.0': {}
'@vitejs/plugin-vue@5.2.4(vite@5.4.19)(vue@3.5.13)':
'@vitejs/plugin-vue@5.2.4(vite@5.4.19)(vue@3.5.26)':
dependencies:
vite: 5.4.19
vue: 3.5.13
vue: 3.5.26
'@vue/compiler-core@3.5.13':
'@vue/compiler-core@3.5.26':
dependencies:
'@babel/parser': 7.26.2
'@vue/shared': 3.5.13
entities: 4.5.0
'@babel/parser': 7.28.5
'@vue/shared': 3.5.26
entities: 7.0.0
estree-walker: 2.0.2
source-map-js: 1.2.1
'@vue/compiler-dom@3.5.13':
'@vue/compiler-dom@3.5.26':
dependencies:
'@vue/compiler-core': 3.5.13
'@vue/shared': 3.5.13
'@vue/compiler-core': 3.5.26
'@vue/shared': 3.5.26
'@vue/compiler-sfc@3.5.13':
'@vue/compiler-sfc@3.5.26':
dependencies:
'@babel/parser': 7.26.2
'@vue/compiler-core': 3.5.13
'@vue/compiler-dom': 3.5.13
'@vue/compiler-ssr': 3.5.13
'@vue/shared': 3.5.13
'@babel/parser': 7.28.5
'@vue/compiler-core': 3.5.26
'@vue/compiler-dom': 3.5.26
'@vue/compiler-ssr': 3.5.26
'@vue/shared': 3.5.26
estree-walker: 2.0.2
magic-string: 0.30.13
postcss: 8.4.49
magic-string: 0.30.21
postcss: 8.5.6
source-map-js: 1.2.1
'@vue/compiler-ssr@3.5.13':
'@vue/compiler-ssr@3.5.26':
dependencies:
'@vue/compiler-dom': 3.5.13
'@vue/shared': 3.5.13
'@vue/compiler-dom': 3.5.26
'@vue/shared': 3.5.26
'@vue/devtools-api@7.7.6':
dependencies:
@@ -1162,36 +1174,38 @@ snapshots:
dependencies:
rfdc: 1.4.1
'@vue/reactivity@3.5.13':
'@vue/reactivity@3.5.26':
dependencies:
'@vue/shared': 3.5.13
'@vue/shared': 3.5.26
'@vue/runtime-core@3.5.13':
'@vue/runtime-core@3.5.26':
dependencies:
'@vue/reactivity': 3.5.13
'@vue/shared': 3.5.13
'@vue/reactivity': 3.5.26
'@vue/shared': 3.5.26
'@vue/runtime-dom@3.5.13':
'@vue/runtime-dom@3.5.26':
dependencies:
'@vue/reactivity': 3.5.13
'@vue/runtime-core': 3.5.13
'@vue/shared': 3.5.13
csstype: 3.1.3
'@vue/reactivity': 3.5.26
'@vue/runtime-core': 3.5.26
'@vue/shared': 3.5.26
csstype: 3.2.3
'@vue/server-renderer@3.5.13(vue@3.5.13)':
'@vue/server-renderer@3.5.26(vue@3.5.26)':
dependencies:
'@vue/compiler-ssr': 3.5.13
'@vue/shared': 3.5.13
vue: 3.5.13
'@vue/compiler-ssr': 3.5.26
'@vue/shared': 3.5.26
vue: 3.5.26
'@vue/shared@3.5.13': {}
'@vue/shared@3.5.26': {}
'@vueuse/core@12.8.2':
dependencies:
'@types/web-bluetooth': 0.0.21
'@vueuse/metadata': 12.8.2
'@vueuse/shared': 12.8.2
vue: 3.5.13
vue: 3.5.26
transitivePeerDependencies:
- typescript
@@ -1199,7 +1213,7 @@ snapshots:
dependencies:
'@vueuse/core': 12.8.2
'@vueuse/shared': 12.8.2
vue: 3.5.13
vue: 3.5.26
optionalDependencies:
focus-trap: 7.6.5
transitivePeerDependencies:
@@ -1209,7 +1223,7 @@ snapshots:
'@vueuse/shared@12.8.2':
dependencies:
vue: 3.5.13
vue: 3.5.26
transitivePeerDependencies:
- typescript
@@ -1243,7 +1257,7 @@ snapshots:
dependencies:
is-what: 4.1.16
csstype@3.1.3: {}
csstype@3.2.3: {}
dequal@2.0.3: {}
@@ -1253,7 +1267,7 @@ snapshots:
emoji-regex-xs@1.0.0: {}
entities@4.5.0: {}
entities@7.0.0: {}
esbuild@0.21.5:
optionalDependencies:
@@ -1314,9 +1328,9 @@ snapshots:
is-what@4.1.16: {}
magic-string@0.30.13:
magic-string@0.30.21:
dependencies:
'@jridgewell/sourcemap-codec': 1.5.0
'@jridgewell/sourcemap-codec': 1.5.5
mark.js@8.11.1: {}
@@ -1353,6 +1367,8 @@ snapshots:
mitt@3.0.1: {}
nanoid@3.3.11: {}
nanoid@3.3.8: {}
oniguruma-to-es@3.1.1:
@@ -1371,9 +1387,15 @@ snapshots:
picocolors: 1.1.1
source-map-js: 1.2.1
postcss@8.5.6:
dependencies:
nanoid: 3.3.11
picocolors: 1.1.1
source-map-js: 1.2.1
preact@10.25.0: {}
prettier@3.3.3: {}
prettier@3.7.4: {}
property-information@7.1.0: {}
@@ -1486,7 +1508,7 @@ snapshots:
optionalDependencies:
fsevents: 2.3.3
vitepress@1.6.3(@algolia/client-search@5.15.0)(postcss@8.4.49)(search-insights@2.17.3):
vitepress@1.6.4(@algolia/client-search@5.15.0)(postcss@8.5.6)(search-insights@2.17.3):
dependencies:
'@docsearch/css': 3.8.2
'@docsearch/js': 3.8.2(@algolia/client-search@5.15.0)(search-insights@2.17.3)
@@ -1495,7 +1517,7 @@ snapshots:
'@shikijs/transformers': 2.5.0
'@shikijs/types': 2.5.0
'@types/markdown-it': 14.1.2
'@vitejs/plugin-vue': 5.2.4(vite@5.4.19)(vue@3.5.13)
'@vitejs/plugin-vue': 5.2.4(vite@5.4.19)(vue@3.5.26)
'@vue/devtools-api': 7.7.6
'@vue/shared': 3.5.13
'@vueuse/core': 12.8.2
@@ -1505,9 +1527,9 @@ snapshots:
minisearch: 7.1.1
shiki: 2.5.0
vite: 5.4.19
vue: 3.5.13
vue: 3.5.26
optionalDependencies:
postcss: 8.4.49
postcss: 8.5.6
transitivePeerDependencies:
- '@algolia/client-search'
- '@types/node'
@@ -1535,12 +1557,12 @@ snapshots:
- typescript
- universal-cookie
vue@3.5.13:
vue@3.5.26:
dependencies:
'@vue/compiler-dom': 3.5.13
'@vue/compiler-sfc': 3.5.13
'@vue/runtime-dom': 3.5.13
'@vue/server-renderer': 3.5.13(vue@3.5.13)
'@vue/shared': 3.5.13
'@vue/compiler-dom': 3.5.26
'@vue/compiler-sfc': 3.5.26
'@vue/runtime-dom': 3.5.26
'@vue/server-renderer': 3.5.26(vue@3.5.26)
'@vue/shared': 3.5.26
zwitch@2.0.4: {}

File diff suppressed because it is too large


@@ -25,6 +25,7 @@
"@types/node": "^22.15.2",
"@types/qrcode": "^1.5.5",
"autoprefixer": "^10.4.21",
"baseline-browser-mapping": "^2.9.9",
"daisyui": "^4.12.24",
"postcss": "^8.5.3",
"prettier": "^3.5.3",
@@ -47,6 +48,7 @@
"marked": "^15.0.11",
"psl": "^1.15.0",
"qrcode": "^1.5.4",
"svelte-dnd-action": "^0.9.68",
"svelte-i18n": "^4.0.1",
"svelte-maplibre": "^0.9.14"
},

frontend/pnpm-lock.yaml generated



@@ -18,6 +18,7 @@ declare global {
has_password: boolean;
disable_password: boolean;
measurement_system: 'metric' | 'imperial';
default_currency: string;
} | null;
locale: string;
}


@@ -8,8 +8,7 @@
<!-- iOS / Safari PWA support -->
<meta name="apple-mobile-web-app-capable" content="yes" />
<meta name="apple-mobile-web-app-status-bar-style" content="black-translucent" />
<meta name="mobile-web-app-capable" content="yes" />
<!-- Apple touch icons (place files in frontend/static/) -->
<link rel="apple-touch-icon" sizes="180x180" href="%sveltekit.assets%/apple-touch-icon.png" />
<link


@@ -1,415 +0,0 @@
<script lang="ts">
import type { Checklist, Lodging, Note, Transportation } from '$lib/types';
import { deserialize } from '$app/forms';
import { createEventDispatcher } from 'svelte';
const dispatch = createEventDispatcher();
import { t } from 'svelte-i18n';
import { addToast } from '$lib/toasts';
export let object: Lodging | Transportation;
export let objectType: 'lodging' | 'transportation' | 'note' | 'checklist';
export let isAttachmentsUploading: boolean = false;
let attachmentInput: HTMLInputElement;
let attachmentFiles: File[] = [];
let editingAttachment: { id: string; name: string } | null = null;
function handleAttachmentChange(event: Event) {
const target = event.target as HTMLInputElement;
if (target?.files) {
attachmentFiles = Array.from(target.files);
console.log('Attachments selected:', attachmentFiles.length);
if (object.id) {
// If object exists, upload immediately
uploadAttachments();
}
}
}
// Watch for external trigger to upload attachments
$: {
if (isAttachmentsUploading && attachmentFiles.length > 0 && object.id) {
// Immediately clear the trigger to prevent infinite loop
const filesToUpload = [...attachmentFiles];
attachmentFiles = []; // Clear immediately
if (attachmentInput) {
attachmentInput.value = '';
}
uploadAttachmentsFromList(filesToUpload);
}
}
async function uploadAttachments() {
if (attachmentFiles.length === 0) {
isAttachmentsUploading = false;
return;
}
const filesToUpload = [...attachmentFiles];
// Clear immediately to prevent re-triggering
attachmentFiles = [];
if (attachmentInput) {
attachmentInput.value = '';
}
await uploadAttachmentsFromList(filesToUpload);
}
async function uploadAttachmentsFromList(files: File[]) {
if (files.length === 0) {
isAttachmentsUploading = false;
return;
}
try {
// Upload all attachments concurrently
const uploadPromises = files.map((file) => uploadAttachment(file));
await Promise.all(uploadPromises);
} catch (error) {
} finally {
isAttachmentsUploading = false;
}
}
async function uploadAttachment(file: File): Promise<void> {
let formData = new FormData();
formData.append('file', file);
formData.append('object_id', object.id);
formData.append('content_type', objectType);
let res = await fetch(`/locations?/attachment`, {
method: 'POST',
body: formData
});
if (res.ok) {
let newData = deserialize(await res.text()) as {
data: {
id: string;
file: string;
name: string;
extension: string;
size: number;
};
};
let newAttachment = {
id: newData.data.id,
file: newData.data.file,
name: newData.data.name,
extension: newData.data.extension,
size: newData.data.size,
user: '',
geojson: null
};
object.attachments = [...(object.attachments || []), newAttachment];
} else {
throw new Error(`Failed to upload ${file.name}`);
}
}
async function removeAttachment(id: string) {
let res = await fetch(`/api/attachments/${id}/`, {
method: 'DELETE'
});
if (res.status === 204) {
object.attachments = object.attachments.filter(
(attachment: { id: string }) => attachment.id !== id
);
addToast('success', $t('adventures.attachment_removed_success'));
} else {
addToast('error', $t('adventures.attachment_removed_error'));
console.error('Error removing attachment:', await res.text());
}
}
async function updateAttachmentName(attachmentId: string, newName: string) {
let res = await fetch(`/api/attachments/${attachmentId}/`, {
method: 'PATCH',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify({ name: newName })
});
if (res.ok) {
object.attachments = object.attachments.map((attachment) => {
if (attachment.id === attachmentId) {
return { ...attachment, name: newName };
}
return attachment;
});
editingAttachment = null;
} else {
}
}
function startEditingName(attachment: { id: string; name: string }) {
editingAttachment = { id: attachment.id, name: attachment.name };
}
function cancelEditingName() {
editingAttachment = null;
}
function handleNameKeydown(event: KeyboardEvent, attachmentId: string) {
if (event.key === 'Enter') {
updateAttachmentName(attachmentId, editingAttachment?.name || '');
} else if (event.key === 'Escape') {
cancelEditingName();
}
}
function getFileIcon(filename: string): string {
const extension = filename.toLowerCase().split('.').pop() || '';
switch (extension) {
case 'pdf':
return 'M19 3H5c-1.1 0-2 .9-2 2v14c0 1.1.9 2 2 2h14c1.1 0 2-.9 2-2V5c0-1.1-.9-2-2-2zM9.5 11.5c0 .83-.67 1.5-1.5 1.5s-1.5-.67-1.5-1.5.67-1.5 1.5-1.5 1.5.67 1.5 1.5zM17 17H7l3-3.99 2 2.67L16 12l1 5z';
case 'doc':
case 'docx':
return 'M14,2H6A2,2 0 0,0 4,4V20A2,2 0 0,0 6,22H18A2,2 0 0,0 20,20V8L14,2M18,20H6V4H13V9H18V20Z';
case 'xls':
case 'xlsx':
return 'M14,2H6A2,2 0 0,0 4,4V20A2,2 0 0,0 6,22H18A2,2 0 0,0 20,20V8L14,2M18,20H6V4H13V9H18V20M8.93,12.22H10.66L12.03,14.71L13.4,12.22H15.13L13.15,15.31L15.13,18.4H13.4L12.03,15.91L10.66,18.4H8.93L10.91,15.31L8.93,12.22Z';
case 'txt':
return 'M14,2H6A2,2 0 0,0 4,4V20A2,2 0 0,0 6,22H18A2,2 0 0,0 20,20V8L14,2M18,20H6V4H13V9H18V20Z';
case 'jpg':
case 'jpeg':
case 'png':
case 'gif':
return 'M19,3H5C3.89,3 3,3.89 3,5V19A2,2 0 0,0 5,21H19A2,2 0 0,0 21,19V5C21,3.89 20.1,3 19,3M19,19H5V5H19V19M13.96,12.29L11.21,15.83L9.25,13.47L6.5,17H17.5L13.96,12.29Z';
default:
return 'M14,2H6A2,2 0 0,0 4,4V20A2,2 0 0,0 6,22H18A2,2 0 0,0 20,20V8L14,2M18,20H6V4H13V9H18V20Z';
}
}
function formatFileSize(bytes: number): string {
if (bytes === 0) return '0 Bytes';
const k = 1024;
const sizes = ['Bytes', 'KB', 'MB', 'GB'];
const i = Math.floor(Math.log(bytes) / Math.log(k));
return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + ' ' + sizes[i];
}
// Export function to check if attachments are ready to upload
export function hasAttachmentsToUpload(): boolean {
return attachmentFiles.length > 0;
}
</script>
<div
class="collapse collapse-plus bg-base-200/50 border border-base-300/50 mb-6 rounded-2xl overflow-hidden"
>
<input type="checkbox" />
<div class="collapse-title text-xl font-semibold bg-gradient-to-r from-primary/10 to-primary/5">
<div class="flex items-center gap-3">
<div class="p-2 bg-primary/10 rounded-lg">
<svg class="w-5 h-5 text-primary" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M15.172 7l-6.586 6.586a2 2 0 102.828 2.828l6.414-6.586a4 4 0 00-5.656-5.656l-6.415 6.585a6 6 0 108.486 8.486L20.5 13"
/>
</svg>
</div>
{$t('adventures.attachments')}
{#if isAttachmentsUploading}
<span class="loading loading-spinner loading-sm text-primary"></span>
{/if}
</div>
</div>
<div class="collapse-content bg-base-100/50 pt-4 p-6">
<div class="form-control">
<label class="label" for="attachment">
<span class="label-text font-medium">{$t('adventures.upload_attachment')}</span>
</label>
<input
type="file"
id="attachment"
name="attachment"
multiple
bind:this={attachmentInput}
on:change={handleAttachmentChange}
class="file-input file-input-bordered file-input-primary w-full bg-base-100/80 focus:bg-base-100"
disabled={isAttachmentsUploading}
/>
</div>
{#if attachmentFiles.length > 0 && !object.id}
<div class="mt-4">
<h4 class="font-semibold text-base-content mb-2">
{$t('adventures.selected_attachments')} ({attachmentFiles.length})
</h4>
<div class="alert alert-info">
<svg
xmlns="http://www.w3.org/2000/svg"
fill="none"
viewBox="0 0 24 24"
class="stroke-current shrink-0 w-6 h-6"
><path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M13 16h-1v-4h-1m1-4h.01M21 12a9 9 0 11-18 0 9 9 0 0118 0z"
></path></svg
>
<span>{$t('adventures.attachments_upload_info')} {objectType}</span>
</div>
<div class="space-y-2 mt-3">
{#each attachmentFiles as file}
<div class="flex items-center gap-3 p-3 bg-base-200/60 rounded-lg">
<div class="p-2 bg-secondary/20 rounded-lg">
<svg class="w-4 h-4 text-secondary" fill="currentColor" viewBox="0 0 24 24">
<path d={getFileIcon(file.name)} />
</svg>
</div>
<div class="flex-1 min-w-0">
<p class="text-sm font-medium text-base-content truncate">
{file.name}
</p>
<p class="text-xs text-base-content/60">
{formatFileSize(file.size)}
</p>
</div>
</div>
{/each}
</div>
</div>
{/if}
{#if object.id}
<div class="divider my-6"></div>
<!-- Current Attachments -->
<div class="space-y-4">
<h4 class="font-semibold text-lg">{$t('adventures.my_attachments')}</h4>
{#if object.attachments && object.attachments.length > 0}
<div class="space-y-3">
{#each object.attachments as attachment}
<div
class="group relative flex items-center gap-4 p-4 bg-base-200/60 hover:bg-base-200 rounded-xl border border-base-300/50 transition-all duration-200 hover:shadow-sm"
>
<div class="p-3 bg-secondary/20 rounded-lg">
<svg class="w-6 h-6 text-secondary" fill="currentColor" viewBox="0 0 24 24">
<path d={getFileIcon(attachment.name || attachment.file)} />
</svg>
</div>
<div class="flex-1 min-w-0">
{#if editingAttachment?.id === attachment.id}
<div class="flex items-center gap-2">
<!-- svelte-ignore a11y-autofocus -->
<input
type="text"
bind:value={editingAttachment.name}
on:keydown={(e) => handleNameKeydown(e, attachment.id)}
class="input input-sm input-bordered flex-1 bg-base-100"
placeholder="Enter attachment name"
autofocus
/>
<button
type="button"
class="btn btn-success btn-sm"
on:click={() =>
updateAttachmentName(attachment.id, editingAttachment?.name || '')}
>
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M5 13l4 4L19 7"
/>
</svg>
</button>
<button
type="button"
class="btn btn-ghost btn-sm"
on:click={cancelEditingName}
>
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M6 18L18 6M6 6l12 12"
/>
</svg>
</button>
</div>
{:else}
<div class="flex items-center gap-2">
<h5 class="text-sm font-semibold text-base-content truncate flex-1">
{attachment.name || attachment.file.split('/').pop() || 'Untitled'}
</h5>
<button
type="button"
class="btn btn-ghost btn-xs opacity-0 group-hover:opacity-100 transition-opacity"
on:click={() => startEditingName(attachment)}
title="Edit name"
>
<svg class="w-3 h-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M15.232 5.232l3.536 3.536m-2.036-5.036a2.5 2.5 0 113.536 3.536L6.5 21.036H3v-3.572L16.732 3.732z"
/>
</svg>
</button>
</div>
{/if}
</div>
<div class="flex items-center gap-2">
<a
href={attachment.file}
target="_blank"
rel="noopener noreferrer"
class="btn btn-ghost btn-sm opacity-60 group-hover:opacity-100 transition-opacity"
title="Download"
>
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M12 10v6m0 0l-3-3m3 3l3-3m2 8H7a2 2 0 01-2-2V5a2 2 0 012-2h5.586a1 1 0 01.707.293l5.414 5.414a1 1 0 01.293.707V19a2 2 0 01-2 2z"
/>
</svg>
</a>
<button
type="button"
class="btn btn-error btn-sm opacity-0 group-hover:opacity-100 transition-opacity"
on:click={() => removeAttachment(attachment.id)}
title="Remove"
>
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16"
/>
</svg>
</button>
</div>
</div>
{/each}
</div>
{:else}
<div class="text-center py-8">
<div class="text-base-content/60 text-lg mb-2">
{$t('adventures.no_attachments')}
</div>
<p class="text-sm text-base-content/40">{$t('adventures.no_attachments_desc')}</p>
</div>
{/if}
</div>
{/if}
</div>
</div>
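The deleted component's `formatFileSize` helper uses the standard log-base-1024 unit selection. A standalone sketch of the same approach (names taken from the removed code):

```typescript
// Pick the largest binary unit that keeps the value >= 1,
// then round to at most two decimal places.
function formatFileSize(bytes: number): string {
  if (bytes === 0) return '0 Bytes';
  const k = 1024;
  const sizes = ['Bytes', 'KB', 'MB', 'GB'];
  const i = Math.floor(Math.log(bytes) / Math.log(k));
  // parseFloat drops trailing zeros from toFixed's output ("1.00" -> 1)
  return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + ' ' + sizes[i];
}
```

For example, `formatFileSize(1536)` yields `'1.5 KB'` and `formatFileSize(1024)` yields `'1 KB'`.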


@@ -0,0 +1,360 @@
<script lang="ts">
import type { Attachment, User } from '$lib/types';
import { createEventDispatcher } from 'svelte';
import { t } from 'svelte-i18n';
import { deserialize } from '$app/forms';
// Icons
import TrashIcon from '~icons/mdi/delete';
import EditIcon from '~icons/mdi/pencil';
import FileIcon from '~icons/mdi/file-document';
import AttachmentIcon from '~icons/mdi/attachment';
import LaunchIcon from '~icons/mdi/open-in-new';
import CheckIcon from '~icons/mdi/check';
import CloseIcon from '~icons/mdi/close';
import LightbubOnIcon from '~icons/mdi/lightbulb-on';
import { addToast } from '$lib/toasts';
import StravaGpxList from './transportation/StravaGpxList.svelte';
// Props
export let attachments: Attachment[] = [];
export let itemId: string = '';
export let contentType: 'location' | 'lodging' | 'transportation' | '' = 'location';
export let user: User | null = null;
export let start_date: string | null = null;
export let end_date: string | null = null;
// Component state
let attachmentFileInput: HTMLInputElement;
let attachmentError: string = '';
let isAttachmentLoading: boolean = false;
// Attachment state
let selectedFile: File | null = null;
let attachmentName: string = '';
let attachmentToEdit: Attachment | null = null;
let editingAttachmentName: string = '';
// Allowed file types for attachments
const allowedFileTypes = [
'.gpx',
'.kml',
'.kmz',
'.pdf',
'.doc',
'.docx',
'.txt',
'.md',
'.json',
'.xml',
'.csv',
'.xlsx'
];
const dispatch = createEventDispatcher();
// Helper functions
function updateAttachmentsList(newAttachment: Attachment) {
attachments = [...attachments, newAttachment];
dispatch('attachmentsUpdated', attachments);
}
// Attachment event handlers
function handleAttachmentFileChange(event: Event) {
const files = (event.target as HTMLInputElement).files;
if (files && files.length > 0) {
selectedFile = files[0];
// Auto-fill attachment name if empty
if (!attachmentName.trim()) {
attachmentName = selectedFile.name.split('.')[0];
}
} else {
selectedFile = null;
}
attachmentError = '';
}
async function uploadAttachment() {
if (!selectedFile) {
attachmentError = $t('adventures.no_file_selected');
return;
}
if (!attachmentName.trim()) {
attachmentError = $t('adventures.attachment_name_required');
return;
}
isAttachmentLoading = true;
attachmentError = '';
const formData = new FormData();
formData.append('file', selectedFile);
formData.append('name', attachmentName.trim());
formData.append('object_id', itemId);
formData.append('content_type', contentType);
try {
const res = await fetch('/locations?/attachment', {
method: 'POST',
body: formData
});
if (res.ok) {
const newData = deserialize(await res.text()) as { data: Attachment };
updateAttachmentsList(newData.data);
addToast('success', $t('adventures.attachment_upload_success'));
// Reset form
attachmentName = '';
selectedFile = null;
if (attachmentFileInput) {
attachmentFileInput.value = '';
}
} else {
throw new Error('Upload failed');
}
} catch (error) {
console.error('Attachment upload error:', error);
attachmentError = $t('adventures.attachment_upload_error');
addToast('error', $t('adventures.attachment_upload_error'));
} finally {
isAttachmentLoading = false;
}
}
function startEditingAttachment(attachment: Attachment) {
attachmentToEdit = attachment;
editingAttachmentName = attachment.name;
}
function cancelEditingAttachment() {
attachmentToEdit = null;
editingAttachmentName = '';
}
async function saveAttachmentEdit() {
if (!attachmentToEdit || !editingAttachmentName.trim()) return;
try {
const res = await fetch(`/api/attachments/${attachmentToEdit.id}`, {
method: 'PATCH',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify({
name: editingAttachmentName.trim()
})
});
if (res.ok) {
attachments = attachments.map((att) =>
att.id === attachmentToEdit!.id ? { ...att, name: editingAttachmentName.trim() } : att
);
dispatch('attachmentsUpdated', attachments);
addToast('success', $t('adventures.attachment_updated'));
cancelEditingAttachment();
} else {
throw new Error('Failed to update attachment');
}
} catch (error) {
console.error('Error updating attachment:', error);
addToast('error', $t('adventures.attachment_update_error'));
}
}
async function removeAttachment(attachmentId: string) {
try {
const res = await fetch(`/api/attachments/${attachmentId}`, {
method: 'DELETE'
});
if (res.status === 204) {
attachments = attachments.filter((attachment) => attachment.id !== attachmentId);
dispatch('attachmentsUpdated', attachments);
addToast('success', $t('adventures.attachment_removed'));
} else {
throw new Error('Failed to remove attachment');
}
} catch (error) {
console.error('Error removing attachment:', error);
addToast('error', $t('adventures.attachment_remove_error'));
}
}
</script>
<div class="card bg-base-100 border border-base-300 shadow-lg">
<div class="card-body p-6">
<div class="flex items-center gap-3 mb-6">
<div class="p-2 bg-secondary/10 rounded-lg">
<AttachmentIcon class="w-5 h-5 text-secondary" />
</div>
<h2 class="text-xl font-bold">{$t('adventures.attachment_management')}</h2>
</div>
<!-- transportation GPX tip box -->
{#if contentType === 'transportation'}
<div class="alert alert-neutral mb-6">
<div class="flex-1">
<div class="flex items-center gap-2">
<LightbubOnIcon class="w-5 h-5 " />
<p class="text-sm">
{$t('adventures.transportation_gpx_tip')}
</p>
</div>
</div>
</div>
{/if}
<!-- Upload Options -->
<div class="grid gap-4 mb-6">
<!-- File Upload -->
<div class="bg-base-50 p-4 rounded-lg border border-base-200">
<h4 class="font-medium mb-3 text-base-content/80">
{$t('adventures.upload_attachment')}
</h4>
<div class="grid gap-3 md:grid-cols-3">
<input
type="file"
bind:this={attachmentFileInput}
class="file-input file-input-bordered col-span-2 md:col-span-1"
accept={allowedFileTypes.join(',')}
disabled={isAttachmentLoading}
on:change={handleAttachmentFileChange}
/>
<input
type="text"
bind:value={attachmentName}
class="input input-bordered"
placeholder={$t('adventures.attachment_name')}
disabled={isAttachmentLoading}
/>
<button
class="btn btn-secondary btn-sm md:btn-md"
class:loading={isAttachmentLoading}
disabled={isAttachmentLoading || !selectedFile || !attachmentName.trim()}
on:click={uploadAttachment}
>
{$t('adventures.upload')}
</button>
</div>
{#if contentType === 'transportation'}
<StravaGpxList {start_date} {end_date} {user} />
{/if}
{#if attachmentError}
<div class="alert alert-error mt-2 py-2">
<span class="text-sm">{attachmentError}</span>
</div>
{/if}
</div>
</div>
<!-- Attachment Gallery -->
{#if attachments.length > 0}
<div class="divider">{$t('adventures.current_attachments')}</div>
<div class="grid gap-4 sm:grid-cols-2 lg:grid-cols-3">
{#each attachments as attachment (attachment.id)}
<div class="relative group">
{#if attachmentToEdit?.id === attachment.id}
<!-- Edit Mode -->
<div class="bg-warning/10 p-4 rounded-lg border border-warning/30">
<div class="flex items-center gap-2 mb-3">
<EditIcon class="w-4 h-4 text-warning" />
<span class="text-sm font-medium text-warning">Editing</span>
</div>
<input
type="text"
bind:value={editingAttachmentName}
class="input input-bordered input-sm w-full mb-3"
placeholder="Attachment name"
/>
<div class="flex gap-2">
<button class="btn btn-success btn-xs flex-1" on:click={saveAttachmentEdit}>
<CheckIcon class="w-3 h-3" />
Save
</button>
<button class="btn btn-ghost btn-xs flex-1" on:click={cancelEditingAttachment}>
<CloseIcon class="w-3 h-3" />
{$t('adventures.cancel')}
</button>
</div>
</div>
{:else}
<!-- Normal Display -->
<div
class="bg-base-50 p-4 rounded-lg border border-base-200 hover:border-base-300 transition-colors"
>
<div class="flex items-center gap-3 mb-2">
<div class="p-2 bg-secondary/10 rounded flex items-center justify-center">
<FileIcon class="w-4 h-4 text-secondary" />
</div>
<div class="flex-1 min-w-0">
{#if attachment.file}
<a
href={attachment.file}
target="_blank"
rel="noopener noreferrer"
class="font-medium truncate hover:underline underline-offset-2"
aria-label="Open attachment in new tab">{attachment.name}</a
>
{:else}
<div class="font-medium truncate">{attachment.name}</div>
{/if}
<div class="text-xs text-base-content/60">
{attachment.extension.toUpperCase()}
</div>
</div>
</div>
<!-- Attachment Controls -->
<div class="flex gap-2 mt-3 justify-end">
<!-- Open in new tab button -->
{#if attachment.file}
<a
href={attachment.file}
target="_blank"
rel="noopener noreferrer"
class="btn btn-xs btn-square bg-neutral-200 hover:bg-neutral-300 text-base-content tooltip tooltip-top"
data-tip="Open in new tab"
aria-label="Open attachment in new tab"
>
<LaunchIcon class="w-4 h-4" />
</a>
{/if}
<button
type="button"
class="btn btn-warning btn-xs btn-square tooltip tooltip-top"
data-tip="Edit Name"
on:click={() => startEditingAttachment(attachment)}
>
<EditIcon class="w-4 h-4" />
</button>
<button
type="button"
class="btn btn-error btn-xs btn-square tooltip tooltip-top"
data-tip="Remove Attachment"
on:click={() => removeAttachment(attachment.id)}
>
<TrashIcon class="w-4 h-4" />
</button>
</div>
</div>
{/if}
</div>
{/each}
</div>
{:else}
<div class="bg-base-200/50 rounded-lg p-8 text-center">
<div class="text-base-content/60 mb-2">
{$t('adventures.no_attachments_uploaded_yet')}
</div>
<div class="text-sm text-base-content/40">
{$t('adventures.upload_first_attachment')}
</div>
</div>
{/if}
</div>
</div>
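The new component auto-fills the attachment name with `selectedFile.name.split('.')[0]`, which keeps only the text before the first dot (so `track.2024.gpx` becomes `track`). If stripping just the final extension were preferred, a minimal sketch (hypothetical helper, not part of the diff) would be:

```typescript
// Return the file name with only the last extension removed.
// Leading-dot names like ".gitignore" are returned unchanged.
function baseName(fileName: string): string {
  const i = fileName.lastIndexOf('.');
  return i > 0 ? fileName.slice(0, i) : fileName;
}
```

With this, `baseName('track.2024.gpx')` yields `'track.2024'` rather than `'track'`.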


@@ -1,11 +1,12 @@
<script lang="ts">
import { onMount } from 'svelte';
import type { Category } from '$lib/types';
import { onMount, tick } from 'svelte';
import { t } from 'svelte-i18n';
import type { Category } from '$lib/types';
export let selected_category: Category | null = null;
export let searchTerm: string = '';
let new_category: Category = {
export let searchTerm = '';
const emptyCategory: Category = {
name: '',
display_name: '',
icon: '',
@@ -14,72 +15,123 @@
num_locations: 0
};
$: {
console.log('Selected category changed:', selected_category);
let newCategory: Category = { ...emptyCategory };
let categories: Category[] = [];
let isOpen = false;
let isEmojiPickerVisible = false;
let dropdownRef: HTMLDivElement;
let mobileSearchInputRef: HTMLInputElement;
let desktopSearchInputRef: HTMLInputElement;
$: sortedCategories = [...categories].sort(
(a, b) => (b.num_locations || 0) - (a.num_locations || 0)
);
$: filteredCategories = sortedCategories.filter((category) => {
if (!searchTerm) return true;
return category.display_name.toLowerCase().includes(searchTerm.toLowerCase());
});
function closeDropdown() {
isOpen = false;
isEmojiPickerVisible = false;
}
let categories: Category[] = [];
async function openDropdown() {
isOpen = true;
await tick();
(mobileSearchInputRef ?? desktopSearchInputRef)?.focus();
}
let isOpen: boolean = false;
let isEmojiPickerVisible: boolean = false;
function toggleDropdown() {
isOpen ? closeDropdown() : openDropdown();
}
function toggleEmojiPicker() {
isEmojiPickerVisible = !isEmojiPickerVisible;
}
function toggleDropdown() {
isOpen = !isOpen;
}
function selectCategory(category: Category) {
console.log('category', category);
selected_category = category;
isOpen = false;
closeDropdown();
}
function custom_category() {
new_category.name = new_category.display_name.toLowerCase().replace(/ /g, '_');
if (!new_category.icon) {
new_category.icon = '🌎'; // Default icon if none selected
}
selectCategory(new_category);
function createCustomCategory() {
const displayName = newCategory.display_name.trim();
if (!displayName) return;
const generatedId =
newCategory.id ||
(typeof crypto !== 'undefined' && 'randomUUID' in crypto
? crypto.randomUUID()
: `custom-${Date.now()}`);
const category: Category = {
...newCategory,
id: generatedId,
name: displayName.toLowerCase().replace(/\s+/g, '_'),
icon: newCategory.icon || '🌎'
};
categories = [category, ...categories];
selectCategory(category);
newCategory = { ...emptyCategory };
}
function handleEmojiSelect(event: CustomEvent) {
new_category.icon = event.detail.unicode;
newCategory.icon = event.detail.unicode;
}
// Close dropdown when clicking outside
let dropdownRef: HTMLDivElement;
onMount(() => {
const loadData = async () => {
await import('emoji-picker-element');
let res = await fetch('/api/categories');
categories = await res.json();
categories = categories.sort((a, b) => (b.num_locations || 0) - (a.num_locations || 0));
try {
await import('emoji-picker-element');
} catch (error) {
console.error('Emoji picker failed to load', error);
}
try {
const res = await fetch('/api/categories');
const data = await res.json();
categories = Array.isArray(data) ? data : [];
} catch (error) {
console.error('Unable to load categories', error);
}
};
loadData();
const handleClickOutside = (event: MouseEvent) => {
if (dropdownRef && !dropdownRef.contains(event.target as Node)) {
isOpen = false;
const handleOutside = (event: Event) => {
if (!dropdownRef) return;
const target = event.target as Node | null;
if (target && !dropdownRef.contains(target)) {
closeDropdown();
}
};
document.addEventListener('click', handleClickOutside);
const handleKeyDown = (event: KeyboardEvent) => {
if (event.key === 'Escape') {
closeDropdown();
}
};
const outsideEvents: Array<keyof DocumentEventMap> = ['pointerdown', 'mousedown', 'touchstart'];
outsideEvents.forEach((eventName) => document.addEventListener(eventName, handleOutside));
document.addEventListener('keydown', handleKeyDown);
return () => {
document.removeEventListener('click', handleClickOutside);
outsideEvents.forEach((eventName) => document.removeEventListener(eventName, handleOutside));
document.removeEventListener('keydown', handleKeyDown);
};
});
</script>
<div class="dropdown w-full" bind:this={dropdownRef}>
<!-- Main dropdown trigger -->
<div
tabindex="0"
role="button"
<div class="dropdown w-full" class:dropdown-open={isOpen} bind:this={dropdownRef}>
<button
type="button"
class="btn btn-outline w-full justify-between sm:h-auto h-12"
aria-haspopup="listbox"
aria-expanded={isOpen}
on:click={toggleDropdown}
>
<span class="flex items-center gap-2">
@@ -91,29 +143,39 @@
{/if}
</span>
<svg
class="w-4 h-4 transition-transform duration-200 {isOpen ? 'rotate-180' : ''}"
class={`w-4 h-4 transition-transform duration-200 ${isOpen ? 'rotate-180' : ''}`}
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
aria-hidden="true"
>
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 9l-7 7-7-7" />
</svg>
</div>
</button>
{#if isOpen}
<!-- Mobile Modal Overlay (only on small screens) -->
<div class="fixed inset-0 bg-black/50 z-40 sm:hidden" on:click={() => (isOpen = false)}></div>
<button
type="button"
class="fixed inset-0 bg-black/50 z-40 sm:hidden focus:outline-none"
aria-label={$t('adventures.back')}
on:click={closeDropdown}
on:keydown={(event) => event.key === 'Enter' && closeDropdown()}
></button>
<!-- Mobile Bottom Sheet -->
<div
class="fixed bottom-0 left-0 right-0 z-50 bg-base-100 rounded-t-2xl shadow-2xl border-t border-base-300 max-h-[90vh] flex flex-col sm:hidden"
>
<!-- Mobile Header -->
<div class="flex-shrink-0 bg-base-100 border-b border-base-300 p-4">
<div class="flex items-center justify-between">
<h2 class="text-lg font-semibold">{$t('categories.select_category')}</h2>
<button class="btn btn-ghost btn-sm btn-circle" on:click={() => (isOpen = false)}>
<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<button type="button" class="btn btn-ghost btn-sm btn-circle" on:click={closeDropdown}>
<svg
class="w-5 h-5"
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
aria-hidden="true"
>
<path
stroke-linecap="round"
stroke-linejoin="round"
@@ -126,10 +188,15 @@
</div>
<div class="flex-1 overflow-y-auto min-h-0">
<!-- Mobile Category Creator Section -->
<div class="p-4 border-b border-base-300">
<h3 class="font-semibold text-sm text-base-content/80 mb-3 flex items-center gap-2">
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<div class="p-4 border-b border-base-300 space-y-4">
<div class="flex items-center gap-2 text-sm font-semibold text-base-content/80">
<svg
class="w-4 h-4"
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
aria-hidden="true"
>
<path
stroke-linecap="round"
stroke-linejoin="round"
@@ -138,41 +205,45 @@
/>
</svg>
{$t('categories.add_new_category')}
</h3>
</div>
<div class="space-y-3">
<div class="space-y-2">
<input
type="text"
placeholder={$t('categories.category_name')}
class="input input-bordered w-full h-12 text-base"
bind:value={newCategory.display_name}
/>
<div class="join w-full">
<input
type="text"
placeholder={$t('categories.category_name')}
class="input input-bordered w-full h-12 text-base"
bind:value={new_category.display_name}
placeholder={$t('categories.icon')}
class="input input-bordered join-item flex-1 h-12 text-base"
bind:value={newCategory.icon}
/>
<div class="join w-full">
<input
type="text"
placeholder={$t('categories.icon')}
class="input input-bordered join-item flex-1 h-12 text-base"
bind:value={new_category.icon}
/>
<button
on:click={toggleEmojiPicker}
type="button"
class="btn join-item h-12 w-12 text-lg"
class:btn-active={isEmojiPickerVisible}
>
😊
</button>
</div>
<button
type="button"
class="btn join-item h-12 w-12 text-lg"
on:click={toggleEmojiPicker}
class:btn-active={isEmojiPickerVisible}
>
😊
</button>
</div>
<button
on:click={custom_category}
type="button"
class="btn btn-primary h-12 w-full"
disabled={!new_category.display_name.trim()}
on:click={createCustomCategory}
disabled={!newCategory.display_name.trim()}
>
<svg class="w-4 h-4 mr-1" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<svg
class="w-4 h-4 mr-1"
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
aria-hidden="true"
>
<path
stroke-linecap="round"
stroke-linejoin="round"
@@ -191,29 +262,38 @@
</div>
</div>
<!-- Mobile Categories List -->
<div class="p-4">
<h3 class="font-semibold text-sm text-base-content/80 mb-3">
<div class="p-4 space-y-4">
<div class="flex items-center gap-2 text-sm font-semibold text-base-content/80">
<svg
class="w-4 h-4"
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
aria-hidden="true"
>
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10"
/>
</svg>
{$t('categories.select_category')}
</h3>
</div>
{#if categories.length > 0}
<div class="form-control mb-4">
<div class="form-control">
<input
type="text"
placeholder={$t('navbar.search')}
class="input input-bordered w-full h-12 text-base"
bind:value={searchTerm}
bind:this={mobileSearchInputRef}
/>
</div>
<div class="space-y-2">
{#each categories
.slice()
.sort((a, b) => (b.num_locations || 0) - (a.num_locations || 0))
.filter((category) => !searchTerm || category.display_name
.toLowerCase()
.includes(searchTerm.toLowerCase())) as category}
{#each filteredCategories as category}
<button
type="button"
class="w-full text-left p-4 rounded-lg border border-base-300 hover:border-primary hover:bg-primary/5 transition-colors"
@@ -236,22 +316,43 @@
</button>
{/each}
</div>
{:else}
<div class="text-center py-8 text-base-content/60">
<svg
class="w-12 h-12 mx-auto mb-2 opacity-50"
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
aria-hidden="true"
>
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M7 7h.01M7 3h5c.512 0 1.024.195 1.414.586l7 7a2 2 0 010 2.828l-7 7a2 2 0 01-2.828 0l-7-7A1.99 1.99 0 013 12V7a4 4 0 014-4z"
/>
</svg>
<p class="text-sm">{$t('categories.no_categories_yet')}</p>
</div>
{/if}
</div>
</div>
<!-- Bottom safe area -->
<div class="flex-shrink-0 h-4"></div>
<div class="flex-shrink-0 h-4"></div>
</div>
</div>
<!-- Desktop Dropdown -->
<div
class="dropdown-content z-[1] w-full mt-1 bg-base-300 rounded-box shadow-xl border border-base-300 max-h-96 overflow-y-auto hidden sm:block"
class="dropdown-content z-[1] w-full mt-1 bg-base-100 rounded-box shadow-xl border border-base-300 max-h-[28rem] overflow-y-auto hidden sm:block"
>
<!-- Desktop Category Creator Section -->
<div class="p-4 border-b border-base-300">
<h3 class="font-semibold text-sm text-base-content/80 mb-3 flex items-center gap-2">
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<div class="p-4 border-b border-base-300 space-y-3">
<div class="flex items-center gap-2 text-sm font-semibold text-base-content/80">
<svg
class="w-4 h-4"
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
aria-hidden="true"
>
<path
stroke-linecap="round"
stroke-linejoin="round"
@@ -260,69 +361,74 @@
/>
</svg>
{$t('categories.add_new_category')}
</h3>
</div>
<div class="space-y-3">
<div class="grid grid-cols-1 md:grid-cols-2 gap-2">
<div class="form-control">
<input
type="text"
placeholder={$t('categories.category_name')}
class="input input-bordered input-sm w-full"
bind:value={new_category.display_name}
/>
</div>
<div class="form-control">
<div class="input-group">
<input
type="text"
placeholder={$t('categories.icon')}
class="input input-bordered input-sm flex-1"
bind:value={new_category.icon}
/>
<button
on:click={toggleEmojiPicker}
type="button"
class="btn btn-square btn-sm btn-secondary"
class:btn-active={isEmojiPickerVisible}
>
😊
</button>
</div>
</div>
</div>
<div class="flex justify-end">
<div class="grid grid-cols-1 md:grid-cols-2 gap-2">
<input
type="text"
placeholder={$t('categories.category_name')}
class="input input-bordered input-sm w-full"
bind:value={newCategory.display_name}
/>
<div class="input-group">
<input
type="text"
placeholder={$t('categories.icon')}
class="input input-bordered input-sm flex-1"
bind:value={newCategory.icon}
/>
<button
on:click={custom_category}
type="button"
class="btn btn-primary btn-sm"
disabled={!new_category.display_name.trim()}
class="btn btn-square btn-sm btn-secondary"
on:click={toggleEmojiPicker}
class:btn-active={isEmojiPickerVisible}
>
<svg class="w-4 h-4 mr-1" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M12 6v6m0 0v6m0-6h6m-6 0H6"
/>
</svg>
{$t('adventures.add')}
😊
</button>
</div>
{#if isEmojiPickerVisible}
<div class="p-3 rounded-lg border border-base-300">
<emoji-picker on:emoji-click={handleEmojiSelect}></emoji-picker>
</div>
{/if}
</div>
<div class="flex justify-end">
<button
type="button"
class="btn btn-primary btn-sm"
on:click={createCustomCategory}
disabled={!newCategory.display_name.trim()}
>
<svg
class="w-4 h-4 mr-1"
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
aria-hidden="true"
>
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M12 6v6m0 0v6m0-6h6m-6 0H6"
/>
</svg>
{$t('adventures.add')}
</button>
</div>
{#if isEmojiPickerVisible}
<div class="p-3 rounded-lg border border-base-300">
<emoji-picker on:emoji-click={handleEmojiSelect}></emoji-picker>
</div>
{/if}
</div>
<!-- Desktop Categories List Section -->
<div class="p-4">
<h3 class="font-semibold text-sm text-base-content/80 mb-3 flex items-center gap-2">
<svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<div class="p-4 space-y-3">
<div class="flex items-center gap-2 text-sm font-semibold text-base-content/80">
<svg
class="w-4 h-4"
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
aria-hidden="true"
>
<path
stroke-linecap="round"
stroke-linejoin="round"
@@ -331,27 +437,22 @@
/>
</svg>
{$t('categories.select_category')}
</h3>
</div>
{#if categories.length > 0}
<div class="form-control mb-3">
<input
type="text"
placeholder={$t('navbar.search')}
class="input input-bordered input-sm w-full"
bind:value={searchTerm}
/>
</div>
<input
type="text"
placeholder={$t('navbar.search')}
class="input input-bordered input-sm w-full"
bind:value={searchTerm}
bind:this={desktopSearchInputRef}
/>
<div
class="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 gap-2 max-h-60 overflow-y-auto"
role="listbox"
>
{#each categories
.slice()
.sort((a, b) => (b.num_locations || 0) - (a.num_locations || 0))
.filter((category) => !searchTerm || category.display_name
.toLowerCase()
.includes(searchTerm.toLowerCase())) as category}
{#each filteredCategories as category (category.id)}
<button
type="button"
class="btn btn-ghost btn-sm justify-start h-auto py-2 px-3"
@@ -374,15 +475,14 @@
{/each}
</div>
{#if categories.filter((category) => !searchTerm || category.display_name
.toLowerCase()
.includes(searchTerm.toLowerCase())).length === 0}
{#if filteredCategories.length === 0}
<div class="text-center py-8 text-base-content/60">
<svg
class="w-12 h-12 mx-auto mb-2 opacity-50"
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
aria-hidden="true"
>
<path
stroke-linecap="round"
@@ -401,6 +501,7 @@
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
aria-hidden="true"
>
<path
stroke-linecap="round"


@@ -1,104 +0,0 @@
<script lang="ts">
import { addToast } from '$lib/toasts';
import type { Checklist, Collection, User } from '$lib/types';
import { createEventDispatcher } from 'svelte';
const dispatch = createEventDispatcher();
import { t } from 'svelte-i18n';
import Launch from '~icons/mdi/launch';
import TrashCan from '~icons/mdi/trash-can';
import Calendar from '~icons/mdi/calendar';
import DeleteWarning from './DeleteWarning.svelte';
import { isEntityOutsideCollectionDateRange } from '$lib/dateUtils';
export let checklist: Checklist;
export let user: User | null = null;
export let collection: Collection;
let isWarningModalOpen: boolean = false;
let outsideCollectionRange: boolean = false;
$: {
outsideCollectionRange = isEntityOutsideCollectionDateRange(checklist, collection);
}
function editChecklist() {
dispatch('edit', checklist);
}
async function deleteChecklist() {
const res = await fetch(`/api/checklists/${checklist.id}`, {
method: 'DELETE'
});
if (res.ok) {
addToast('success', $t('checklist.checklist_deleted'));
isWarningModalOpen = false;
dispatch('delete', checklist.id);
} else {
addToast('error', $t('checklist.checklist_delete_error'));
}
}
</script>
{#if isWarningModalOpen}
<DeleteWarning
title={$t('adventures.delete_checklist')}
button_text={$t('adventures.delete')}
description={$t('adventures.checklist_delete_confirm')}
is_warning={false}
on:close={() => (isWarningModalOpen = false)}
on:confirm={deleteChecklist}
/>
{/if}
<div
class="card w-full max-w-md bg-base-300 text-base-content shadow-2xl hover:shadow-3xl transition-all duration-300 border border-base-300 hover:border-primary/20 group"
>
<div class="card-body p-6 space-y-4">
<!-- Header -->
<div class="flex flex-col sm:flex-row sm:items-center sm:justify-between gap-2">
<h2 class="text-xl font-bold break-words">{checklist.name}</h2>
<div class="flex flex-wrap gap-2">
<div class="badge badge-primary">{$t('adventures.checklist')}</div>
{#if outsideCollectionRange}
<div class="badge badge-error">{$t('adventures.out_of_range')}</div>
{/if}
</div>
</div>
<!-- Checklist Stats -->
{#if checklist.items.length > 0}
<p class="text-sm">
{checklist.items.length}
{checklist.items.length > 1 ? $t('checklist.items') : $t('checklist.item')}
</p>
{/if}
<!-- Date -->
{#if checklist.date && checklist.date !== ''}
<div class="inline-flex items-center gap-2 text-sm">
<Calendar class="w-5 h-5 text-primary" />
<p>{new Date(checklist.date).toLocaleDateString(undefined, { timeZone: 'UTC' })}</p>
</div>
{/if}
<!-- Actions -->
<div class="pt-4 border-t border-base-300 flex justify-end gap-2">
<button class="btn btn-neutral btn-sm flex items-center gap-1" on:click={editChecklist}>
<Launch class="w-5 h-5" />
{$t('notes.open')}
</button>
{#if checklist.user === user?.uuid || (collection && user && collection.shared_with?.includes(user.uuid))}
<button
id="delete_adventure"
data-umami-event="Delete Checklist"
class="btn btn-secondary btn-sm flex items-center gap-1"
on:click={() => (isWarningModalOpen = true)}
>
<TrashCan class="w-5 h-5" />
{$t('adventures.delete')}
</button>
{/if}
</div>
</div>
</div>


@@ -11,6 +11,7 @@
export let checklist: Checklist | null = null;
export let collection: Collection;
export let user: User | null = null;
export let initialVisitDate: string | null = null;
let items: ChecklistItem[] = [];
@@ -58,14 +59,29 @@
warning = '';
}
const getSeedDate = (): string | null => {
if (checklist?.date) return checklist.date;
if (initialVisitDate) return initialVisitDate;
return null;
};
let newChecklist = {
name: checklist?.name || '',
date: checklist?.date || undefined || null,
date: getSeedDate(),
items: checklist?.items || [],
collection: collection.id,
is_public: collection.is_public
};
const hasVisitDateSuggestion = !!initialVisitDate && !checklist?.date;
function useVisitDate() {
if (isReadOnly) return;
if (initialVisitDate) {
newChecklist = { ...newChecklist, date: initialVisitDate };
}
}
onMount(() => {
modal = document.getElementById('my_modal_1') as HTMLDialogElement;
if (modal) {
@@ -249,6 +265,17 @@
<label class="label" for="date">
<span class="label-text font-medium">{$t('adventures.date')}</span>
</label>
{#if !isReadOnly && hasVisitDateSuggestion}
<div
class="flex flex-wrap items-center gap-2 mb-2 text-xs text-base-content/70"
>
<span class="badge badge-primary badge-soft">Itinerary day</span>
<span>Prefilled to match your selected day.</span>
<button type="button" class="btn btn-ghost btn-xs" on:click={useVisitDate}>
Reapply date
</button>
</div>
{/if}
{#if collection && collection.start_date && collection.end_date && !isReadOnly}
<div class="flex items-center gap-2 mb-2">
<input


@@ -1,8 +1,9 @@
<script lang="ts">
import { createEventDispatcher } from 'svelte';
import { CircleLayer, GeoJSON, MapLibre, MarkerLayer, SymbolLayer } from 'svelte-maplibre';
import { createEventDispatcher, onMount } from 'svelte';
import { CircleLayer, GeoJSON, MapLibre, MarkerLayer } from 'svelte-maplibre';
import type { ClusterOptions, LayerClickInfo } from 'svelte-maplibre';
import { getBasemapUrl } from '$lib';
import { resolveThemeColor, withAlpha } from '$lib/utils/resolveThemeColor';
type PointGeometry = {
type: 'Point';
@@ -41,6 +42,22 @@
export let mapClass = '';
export let zoom = 2;
export let standardControls = true;
export let fitToBounds: boolean = true;
export let fitPadding: number = 40;
export let fitMaxZoom: number = 8;
// Optional level context (e.g. 'country' | 'region' | 'city'). When provided,
// `fitMaxZooms` can supply level-specific maximum zoom values used when
// fitting bounds. This lets callers choose different fit zooms for country,
// region, and city views.
export let fitLevel: string = '';
export let fitMaxZooms: Record<string, number> = { country: 4, region: 7, city: 12 };
// Effective fit max zoom (prefers level-specific value if available)
let effectiveFitMaxZoom: number = fitMaxZoom;
$: effectiveFitMaxZoom =
fitLevel && fitMaxZooms && fitMaxZooms[fitLevel] !== undefined
? fitMaxZooms[fitLevel]
: fitMaxZoom;
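The level-specific max-zoom resolution above can be sketched as a plain function (a minimal sketch with an assumed helper name, not part of the component):

```typescript
// Prefer the per-level zoom from the overrides map; otherwise fall back to
// the generic fitMaxZoom value. Mirrors the reactive statement above.
function resolveFitMaxZoom(
  level: string,
  levelZooms: Record<string, number>,
  fallback: number
): number {
  return level && levelZooms[level] !== undefined ? levelZooms[level] : fallback;
}

const zooms = { country: 4, region: 7, city: 12 };
console.log(resolveFitMaxZoom('city', zooms, 8)); // 12
console.log(resolveFitMaxZoom('', zooms, 8)); // 8 (no level context)
```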
export let getMarkerProps: (feature: unknown) => MarkerProps = (feature) =>
feature && typeof feature === 'object' && feature !== null && 'properties' in (feature as any)
@@ -58,19 +75,92 @@
export let markerLabel: (props: MarkerProps) => string = markerTitle;
export let clusterCirclePaint = {
const DEFAULT_CLUSTER_CIRCLE_PAINT: Record<string, any> = {
'circle-color': ['step', ['get', 'point_count'], '#60a5fa', 20, '#facc15', 60, '#f472b6'],
'circle-radius': ['step', ['get', 'point_count'], 24, 20, 34, 60, 46],
'circle-opacity': 0.85
};
export let clusterSymbolLayout = {
export let clusterCirclePaint: Record<string, any> = DEFAULT_CLUSTER_CIRCLE_PAINT;
const DEFAULT_CLUSTER_SYMBOL_LAYOUT: Record<string, any> = {
'text-field': '{point_count_abbreviated}',
'text-font': ['Open Sans Bold', 'Arial Unicode MS Bold'],
// Use a font stack that works across more basemap styles.
// Many raster-only styles rely on an external `glyphs` endpoint and won't have Open Sans.
'text-font': ['Noto Sans Regular', 'Arial Unicode MS Regular'],
'text-size': 12
};
export let clusterSymbolPaint = { 'text-color': '#1f2937' };
export let clusterSymbolLayout: Record<string, any> = DEFAULT_CLUSTER_SYMBOL_LAYOUT;
const DEFAULT_CLUSTER_SYMBOL_PAINT: Record<string, any> = { 'text-color': '#1f2937' };
export let clusterSymbolPaint: Record<string, any> = DEFAULT_CLUSTER_SYMBOL_PAINT;
onMount(() => {
// Only apply theme-based defaults when the consumer hasn't overridden them.
const shouldThemeCircle = clusterCirclePaint === DEFAULT_CLUSTER_CIRCLE_PAINT;
const shouldThemeLayout = clusterSymbolLayout === DEFAULT_CLUSTER_SYMBOL_LAYOUT;
const shouldThemeSymbol = clusterSymbolPaint === DEFAULT_CLUSTER_SYMBOL_PAINT;
if (!shouldThemeCircle && !shouldThemeLayout && !shouldThemeSymbol) return;
const baseContent = resolveThemeColor('--color-base-content', '#111827');
// Softer/pastel-ish cluster palette using daisyUI semantic tokens.
const info = resolveThemeColor('--color-info', '#38bdf8');
const warning = resolveThemeColor('--color-warning', '#f59e0b');
const error = resolveThemeColor('--color-error', '#f87171');
const infoContent = resolveThemeColor('--color-info-content', '#082f49');
const warningContent = resolveThemeColor('--color-warning-content', '#111827');
const errorContent = resolveThemeColor('--color-error-content', '#450a0a');
if (shouldThemeCircle) {
clusterCirclePaint = {
// Use daisyUI semantic colors so clusters pop against any basemap.
'circle-color': [
'step',
['get', 'point_count'],
withAlpha(info, 0.7),
25,
withAlpha(warning, 0.7),
80,
withAlpha(error, 0.65)
],
'circle-radius': ['step', ['get', 'point_count'], 22, 20, 32, 60, 44],
'circle-opacity': 1,
'circle-stroke-color': withAlpha(baseContent, 0.25),
'circle-stroke-width': 2,
// Keep clusters crisp; blur can look fuzzy on some displays.
'circle-blur': 0
};
}
if (shouldThemeLayout) {
clusterSymbolLayout = {
...clusterSymbolLayout,
'text-size': 13
};
}
if (shouldThemeSymbol) {
clusterSymbolPaint = {
// Keep numbers highly readable: use each fill's matching *-content color.
'text-color': [
'step',
['get', 'point_count'],
infoContent,
25,
warningContent,
80,
errorContent
],
// Tiny crisp halo just to help glyph edges.
'text-halo-color': withAlpha(baseContent, 0.12),
'text-halo-width': 0.75,
'text-halo-blur': 0
};
}
});
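The `onMount` block above relies on a "default sentinel" pattern: each default is a module-level object, so a reference-equality check tells whether the consumer passed an override. A minimal sketch of that pattern, with assumed names:

```typescript
// Module-level default object acts as a sentinel value.
const DEFAULT_PAINT: Record<string, unknown> = { 'circle-opacity': 0.85 };

function applyThemeDefaults(paint: Record<string, unknown>): Record<string, unknown> {
  // Reference equality (not deep equality): only true when the consumer
  // left the default in place, so user-supplied overrides are never touched.
  const isDefault = paint === DEFAULT_PAINT;
  return isDefault ? { ...paint, 'circle-opacity': 1 } : paint;
}
```

Note that a consumer passing a structurally identical object still counts as an override, which is exactly the behavior the component wants.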
const dispatch = createEventDispatcher<{
markerSelect: { feature: unknown; markerProps: MarkerProps; countryCode?: string };
@@ -80,6 +170,75 @@
let resolvedClusterCirclePaint: Record<string, unknown> = clusterCirclePaint;
$: resolvedClusterCirclePaint = clusterCirclePaint as Record<string, unknown>;
// Map instance (bound from MapLibre) and bounding state
let map: any = undefined;
let _lastBoundsKey: string | null = null;
// When `geoJson` changes, compute bounding box and fit map to bounds (only when changed)
$: if (
map &&
fitToBounds &&
geoJson &&
Array.isArray(geoJson.features) &&
geoJson.features.length > 0
) {
let minLon = 180;
let minLat = 90;
let maxLon = -180;
let maxLat = -90;
for (const f of geoJson.features) {
const coords = (f && f.geometry && f.geometry.coordinates) || null;
if (!coords || !Array.isArray(coords) || coords.length < 2) continue;
const lon = Number(coords[0]);
const lat = Number(coords[1]);
if (!Number.isFinite(lon) || !Number.isFinite(lat)) continue;
minLon = Math.min(minLon, lon);
minLat = Math.min(minLat, lat);
maxLon = Math.max(maxLon, lon);
maxLat = Math.max(maxLat, lat);
}
if (minLon <= maxLon && minLat <= maxLat) {
const boundsKey = `${minLon},${minLat},${maxLon},${maxLat}`;
if (boundsKey !== _lastBoundsKey) {
_lastBoundsKey = boundsKey;
// If bounds represent effectively a single point, use easeTo with a sensible zoom
const lonDelta = Math.abs(maxLon - minLon);
const latDelta = Math.abs(maxLat - minLat);
const isSinglePoint = lonDelta < 1e-6 && latDelta < 1e-6;
try {
if (isSinglePoint) {
const center = [(minLon + maxLon) / 2, (minLat + maxLat) / 2];
map.easeTo({ center, zoom: Math.max(zoom, effectiveFitMaxZoom), duration: 1000 });
} else {
const bounds: [[number, number], [number, number]] = [
[minLon, minLat],
[maxLon, maxLat]
];
// Use fitBounds to contain all points with padding and a max zoom
if (typeof map.fitBounds === 'function') {
map.fitBounds(bounds, {
padding: fitPadding,
maxZoom: effectiveFitMaxZoom,
duration: 1000
});
} else {
// Fallback: center the map when fitBounds is unavailable (zoom is left unchanged)
const center = [(minLon + maxLon) / 2, (minLat + maxLat) / 2];
map.easeTo({ center, duration: 1000 });
}
}
} catch (err) {
// If fitting fails (e.g. the map isn't fully ready), log and continue; the block re-runs when inputs change.
console.error('ClusterMap: fitBounds failed', err);
}
}
}
}
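The reactive block above can be distilled into a standalone bounds computation (a sketch with simplified assumed types, not the component's actual code):

```typescript
type PointFeature = { geometry?: { coordinates?: [number, number] } | null };

// Scan point features, tracking the min/max longitude/latitude, and flag
// degenerate single-point bounds so a caller can easeTo instead of fitBounds.
function computeBounds(features: PointFeature[]) {
  let minLon = 180, minLat = 90, maxLon = -180, maxLat = -90;
  for (const f of features) {
    const coords = f?.geometry?.coordinates;
    if (!coords || coords.length < 2) continue;
    const lon = Number(coords[0]);
    const lat = Number(coords[1]);
    if (!Number.isFinite(lon) || !Number.isFinite(lat)) continue;
    minLon = Math.min(minLon, lon);
    minLat = Math.min(minLat, lat);
    maxLon = Math.max(maxLon, lon);
    maxLat = Math.max(maxLat, lat);
  }
  if (minLon > maxLon || minLat > maxLat) return null; // no valid points
  const isSinglePoint =
    Math.abs(maxLon - minLon) < 1e-6 && Math.abs(maxLat - minLat) < 1e-6;
  return { minLon, minLat, maxLon, maxLat, isSinglePoint };
}
```

Keying on the serialized bounds (as `_lastBoundsKey` does above) then prevents re-fitting when the same feature set triggers the reactive statement again.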
function handleClusterClick(event: CustomEvent<LayerClickInfo>) {
const { clusterId, features, map, source } = event.detail;
if (!clusterId || !features?.length) {
@@ -131,7 +290,7 @@
}
</script>
<MapLibre style={mapStyle} class={mapClass} {standardControls} {zoom}>
<MapLibre bind:map style={mapStyle} class={mapClass} {standardControls} {zoom}>
<GeoJSON id={sourceId} data={geoJson} cluster={clusterOptions} generateId>
<CircleLayer
id={`${sourceId}-clusters`}
@@ -140,12 +299,19 @@
paint={resolvedClusterCirclePaint}
on:click={handleClusterClick}
/>
<SymbolLayer
id={`${sourceId}-cluster-count`}
applyToClusters
layout={clusterSymbolLayout}
paint={clusterSymbolPaint}
/>
<!-- Render cluster counts as HTML so they don't depend on map glyph/font availability -->
<MarkerLayer applyToClusters let:feature={clusterFeature}>
{@const clusterProps = getMarkerProps(clusterFeature)}
{@const abbreviated = clusterProps && clusterProps['point_count_abbreviated']}
{@const count = abbreviated ?? (clusterProps && clusterProps['point_count'])}
{#if typeof count !== 'undefined' && count !== null}
<div
class="pointer-events-none select-none font-sans text-xs font-bold text-base-content drop-shadow-sm"
>
{count}
</div>
{/if}
</MarkerLayer>
<MarkerLayer applyToClusters={false} on:click={handleMarkerClick} let:feature={featureData}>
{@const markerProps = getMarkerProps(featureData)}
<slot name="marker" {featureData} {markerProps}>

View File

@@ -1,551 +0,0 @@
<script lang="ts">
import type {
Location,
Transportation,
Lodging,
Note,
Checklist,
User,
Collection
} from '$lib/types';
import { createEventDispatcher } from 'svelte';
import { t } from 'svelte-i18n';
const dispatch = createEventDispatcher();
// Icons
import Adventures from '~icons/mdi/map-marker-path';
import TransportationIcon from '~icons/mdi/car';
import Hotel from '~icons/mdi/hotel';
import NoteIcon from '~icons/mdi/note-text';
import ChecklistIcon from '~icons/mdi/check-box-outline';
import Search from '~icons/mdi/magnify';
import Clear from '~icons/mdi/close';
import Filter from '~icons/mdi/filter-variant';
// Component imports
import LocationCard from './LocationCard.svelte';
import TransportationCard from './TransportationCard.svelte';
import LodgingCard from './LodgingCard.svelte';
import NoteCard from './NoteCard.svelte';
import ChecklistCard from './ChecklistCard.svelte';
// Props
export let adventures: Location[] = [];
export let transportations: Transportation[] = [];
export let lodging: Lodging[] = [];
export let notes: Note[] = [];
export let checklists: Checklist[] = [];
export let user: User | null;
export let collection: Collection;
// State
let searchQuery: string = '';
let filterOption: string = 'all';
let sortOption: string = 'name_asc';
// Filtered arrays
let filteredAdventures: Location[] = [];
let filteredTransportations: Transportation[] = [];
let filteredLodging: Lodging[] = [];
let filteredNotes: Note[] = [];
let filteredChecklists: Checklist[] = [];
// Helper function to sort items
function sortItems(items: any[], sortOption: string) {
const sorted = [...items];
switch (sortOption) {
case 'name_asc':
return sorted.sort((a, b) =>
(a.name || a.title || '').localeCompare(b.name || b.title || '')
);
case 'name_desc':
return sorted.sort((a, b) =>
(b.name || b.title || '').localeCompare(a.name || a.title || '')
);
case 'date_newest':
return sorted.sort(
(a, b) => new Date(b.created_at || 0).getTime() - new Date(a.created_at || 0).getTime()
);
case 'date_oldest':
return sorted.sort(
(a, b) => new Date(a.created_at || 0).getTime() - new Date(b.created_at || 0).getTime()
);
case 'visited_first':
return sorted.sort((a, b) => {
const aVisited = a.visits && a.visits.length > 0;
const bVisited = b.visits && b.visits.length > 0;
if (aVisited && !bVisited) return -1;
if (!aVisited && bVisited) return 1;
return 0;
});
case 'unvisited_first':
return sorted.sort((a, b) => {
const aVisited = a.visits && a.visits.length > 0;
const bVisited = b.visits && b.visits.length > 0;
if (!aVisited && bVisited) return -1;
if (aVisited && !bVisited) return 1;
return 0;
});
default:
return sorted;
}
}
// Clear all filters function
function clearAllFilters() {
searchQuery = '';
filterOption = 'all';
sortOption = 'name_asc';
}
// Reactive statements for filtering and sorting
$: {
// Filter adventures
let filtered = adventures;
if (searchQuery !== '') {
filtered = filtered.filter((adventure) => {
const nameMatch =
adventure.name?.toLowerCase().includes(searchQuery.toLowerCase()) || false;
const locationMatch =
adventure.location?.toLowerCase().includes(searchQuery.toLowerCase()) || false;
const descriptionMatch =
adventure.description?.toLowerCase().includes(searchQuery.toLowerCase()) || false;
return nameMatch || locationMatch || descriptionMatch;
});
}
filteredAdventures = sortItems(filtered, sortOption);
}
$: {
// Filter transportations
let filtered = transportations;
if (searchQuery !== '') {
filtered = filtered.filter((transport) => {
const nameMatch =
transport.name?.toLowerCase().includes(searchQuery.toLowerCase()) || false;
const fromMatch =
transport.from_location?.toLowerCase().includes(searchQuery.toLowerCase()) || false;
const toMatch =
transport.to_location?.toLowerCase().includes(searchQuery.toLowerCase()) || false;
return nameMatch || fromMatch || toMatch;
});
}
filteredTransportations = sortItems(filtered, sortOption);
}
$: {
// Filter lodging
let filtered = lodging;
if (searchQuery !== '') {
filtered = filtered.filter((hotel) => {
const nameMatch = hotel.name?.toLowerCase().includes(searchQuery.toLowerCase()) || false;
const locationMatch =
hotel.location?.toLowerCase().includes(searchQuery.toLowerCase()) || false;
return nameMatch || locationMatch;
});
}
filteredLodging = sortItems(filtered, sortOption);
}
$: {
// Filter notes
let filtered = notes;
if (searchQuery !== '') {
filtered = filtered.filter((note) => {
const titleMatch = note.name?.toLowerCase().includes(searchQuery.toLowerCase()) || false;
const contentMatch =
note.content?.toLowerCase().includes(searchQuery.toLowerCase()) || false;
return titleMatch || contentMatch;
});
}
filteredNotes = sortItems(filtered, sortOption);
}
$: {
// Filter checklists
let filtered = checklists;
if (searchQuery !== '') {
filtered = filtered.filter((checklist) => {
const titleMatch =
checklist.name?.toLowerCase().includes(searchQuery.toLowerCase()) || false;
return titleMatch;
});
}
filteredChecklists = sortItems(filtered, sortOption);
}
// Calculate total items
$: totalItems =
filteredAdventures.length +
filteredTransportations.length +
filteredLodging.length +
filteredNotes.length +
filteredChecklists.length;
// Event handlers
function handleEditAdventure(event: { detail: any }) {
dispatch('editAdventure', event.detail);
}
function handleDeleteAdventure(event: { detail: any }) {
dispatch('deleteAdventure', event.detail);
}
function handleEditTransportation(event: { detail: any }) {
dispatch('editTransportation', event.detail);
}
function handleDeleteTransportation(event: { detail: any }) {
dispatch('deleteTransportation', event.detail);
}
function handleEditLodging(event: { detail: any }) {
dispatch('editLodging', event.detail);
}
function handleDeleteLodging(event: { detail: any }) {
dispatch('deleteLodging', event.detail);
}
function handleEditNote(event: { detail: any }) {
dispatch('editNote', event.detail);
}
function handleDeleteNote(event: { detail: any }) {
dispatch('deleteNote', event.detail);
}
function handleEditChecklist(event: { detail: any }) {
dispatch('editChecklist', event.detail);
}
function handleDeleteChecklist(event: { detail: any }) {
dispatch('deleteChecklist', event.detail);
}
</script>
<!-- Search and Filter Controls -->
<div
class="bg-base-100/90 backdrop-blur-lg border border-base-300/50 rounded-2xl p-6 mx-4 mb-6 shadow-lg mt-4"
>
<!-- Header with Stats -->
<div class="flex items-center justify-between mb-4">
<div class="flex items-center gap-3">
<div class="p-2 bg-primary/10 rounded-xl">
<Adventures class="w-6 h-6 text-primary" />
</div>
<div>
<h2 class="text-xl font-bold text-primary">
{$t('adventures.collection_contents')}
</h2>
<p class="text-sm text-base-content/60">
{totalItems}
{$t('worldtravel.total_items')}
</p>
</div>
</div>
<!-- Quick Stats -->
<div class="hidden md:flex items-center gap-2">
<div class="stats stats-horizontal bg-base-200/50 border border-base-300/50">
<div class="stat py-2 px-3">
<div class="stat-title text-xs">{$t('locations.locations')}</div>
<div class="stat-value text-sm text-info">{adventures.length}</div>
</div>
<div class="stat py-2 px-3">
<div class="stat-title text-xs">{$t('adventures.transportations')}</div>
<div class="stat-value text-sm text-warning">{transportations.length}</div>
</div>
<div class="stat py-2 px-3">
<div class="stat-title text-xs">{$t('adventures.lodging')}</div>
<div class="stat-value text-sm text-success">{lodging.length}</div>
</div>
</div>
</div>
</div>
<!-- Search Bar -->
<div class="flex flex-col lg:flex-row items-stretch lg:items-center gap-4 mb-4">
<div class="relative flex-1 max-w-md">
<Search class="absolute left-3 top-1/2 -translate-y-1/2 w-4 h-4 text-base-content/40" />
<input
type="text"
placeholder="{$t('navbar.search')} {$t('adventures.name_location')}..."
class="input input-bordered w-full pl-10 pr-10 bg-base-100/80"
bind:value={searchQuery}
/>
{#if searchQuery.length > 0}
<button
class="absolute right-3 top-1/2 -translate-y-1/2 text-base-content/40 hover:text-base-content"
on:click={() => (searchQuery = '')}
>
<Clear class="w-4 h-4" />
</button>
{/if}
</div>
{#if searchQuery || filterOption !== 'all' || sortOption !== 'name_asc'}
<button class="btn btn-ghost btn-sm gap-1" on:click={clearAllFilters}>
<Clear class="w-3 h-3" />
{$t('worldtravel.clear_all')}
</button>
{/if}
</div>
<!-- Sort Labels (Mobile Friendly) -->
<div class="flex flex-wrap gap-2 mb-4">
<div class="badge badge-outline gap-1">
<Filter class="w-3 h-3" />
{$t('adventures.sort')}:
</div>
<div class="flex flex-wrap gap-1">
<button
class="badge {sortOption === 'name_asc'
? 'badge-primary'
: 'badge-ghost'} cursor-pointer hover:badge-primary"
on:click={() => (sortOption = 'name_asc')}
>
A-Z
</button>
<button
class="badge {sortOption === 'name_desc'
? 'badge-primary'
: 'badge-ghost'} cursor-pointer hover:badge-primary"
on:click={() => (sortOption = 'name_desc')}
>
Z-A
</button>
<button
class="badge {sortOption === 'date_newest'
? 'badge-primary'
: 'badge-ghost'} cursor-pointer hover:badge-primary"
on:click={() => (sortOption = 'date_newest')}
>
{$t('worldtravel.newest_first')}
</button>
<button
class="badge {sortOption === 'date_oldest'
? 'badge-primary'
: 'badge-ghost'} cursor-pointer hover:badge-primary"
on:click={() => (sortOption = 'date_oldest')}
>
{$t('worldtravel.oldest_first')}
</button>
<button
class="badge {sortOption === 'visited_first'
? 'badge-primary'
: 'badge-ghost'} cursor-pointer hover:badge-primary"
on:click={() => (sortOption = 'visited_first')}
>
{$t('worldtravel.visited_first')}
</button>
<button
class="badge {sortOption === 'unvisited_first'
? 'badge-primary'
: 'badge-ghost'} cursor-pointer hover:badge-primary"
on:click={() => (sortOption = 'unvisited_first')}
>
{$t('worldtravel.unvisited_first')}
</button>
</div>
</div>
<!-- Filter Tabs -->
<div class="flex flex-col sm:flex-row sm:items-center gap-2">
<span class="text-sm font-medium text-base-content/60">
{$t('adventures.show')}:
</span>
<!-- Scrollable container on mobile -->
<div class="w-full overflow-x-auto">
<div class="tabs tabs-boxed bg-base-200 flex-nowrap flex sm:flex-wrap w-max sm:w-auto">
<button
class="tab tab-sm gap-2 {filterOption === 'all' ? 'tab-active' : ''} whitespace-nowrap"
on:click={() => (filterOption = 'all')}
>
<Adventures class="w-3 h-3" />
{$t('adventures.all')}
</button>
<button
class="tab tab-sm gap-2 {filterOption === 'adventures'
? 'tab-active'
: ''} whitespace-nowrap"
on:click={() => (filterOption = 'adventures')}
>
<Adventures class="w-3 h-3" />
{$t('locations.locations')}
</button>
<button
class="tab tab-sm gap-2 {filterOption === 'transportation'
? 'tab-active'
: ''} whitespace-nowrap"
on:click={() => (filterOption = 'transportation')}
>
<TransportationIcon class="w-3 h-3" />
{$t('adventures.transportations')}
</button>
<button
class="tab tab-sm gap-2 {filterOption === 'lodging'
? 'tab-active'
: ''} whitespace-nowrap"
on:click={() => (filterOption = 'lodging')}
>
<Hotel class="w-3 h-3" />
{$t('adventures.lodging')}
</button>
<button
class="tab tab-sm gap-2 {filterOption === 'notes' ? 'tab-active' : ''} whitespace-nowrap"
on:click={() => (filterOption = 'notes')}
>
<NoteIcon class="w-3 h-3" />
{$t('adventures.notes')}
</button>
<button
class="tab tab-sm gap-2 {filterOption === 'checklists'
? 'tab-active'
: ''} whitespace-nowrap"
on:click={() => (filterOption = 'checklists')}
>
<ChecklistIcon class="w-3 h-3" />
{$t('adventures.checklists')}
</button>
</div>
</div>
</div>
</div>
<!-- Adventures Section -->
{#if (filterOption === 'all' || filterOption === 'adventures') && filteredAdventures.length > 0}
<div class="mb-8">
<div class="flex items-center justify-between mx-4 mb-4">
<h1 class="text-3xl font-bold text-primary">
{$t('adventures.linked_locations')}
</h1>
<div class="badge badge-primary badge-lg">{filteredAdventures.length}</div>
</div>
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4 gap-4 mx-4">
{#each filteredAdventures as adventure}
<LocationCard
{user}
on:edit={handleEditAdventure}
on:delete={handleDeleteAdventure}
{adventure}
{collection}
/>
{/each}
</div>
</div>
{/if}
<!-- Transportation Section -->
{#if (filterOption === 'all' || filterOption === 'transportation') && filteredTransportations.length > 0}
<div class="mb-8">
<div class="flex items-center justify-between mx-4 mb-4">
<h1 class="text-3xl font-bold bg-clip-text text-primary">
{$t('adventures.transportations')}
</h1>
<div class="badge badge-warning badge-lg">{filteredTransportations.length}</div>
</div>
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4 gap-4 mx-4">
{#each filteredTransportations as transportation}
<TransportationCard
{transportation}
{user}
on:delete={handleDeleteTransportation}
on:edit={handleEditTransportation}
{collection}
/>
{/each}
</div>
</div>
{/if}
<!-- Lodging Section -->
{#if (filterOption === 'all' || filterOption === 'lodging') && filteredLodging.length > 0}
<div class="mb-8">
<div class="flex items-center justify-between mx-4 mb-4">
<h1 class="text-3xl font-bold bg-clip-text text-primary">
{$t('adventures.lodging')}
</h1>
<div class="badge badge-success badge-lg">{filteredLodging.length}</div>
</div>
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4 gap-4 mx-4">
{#each filteredLodging as hotel}
<LodgingCard
lodging={hotel}
{user}
on:delete={handleDeleteLodging}
on:edit={handleEditLodging}
{collection}
/>
{/each}
</div>
</div>
{/if}
<!-- Notes Section -->
{#if (filterOption === 'all' || filterOption === 'notes') && filteredNotes.length > 0}
<div class="mb-8">
<div class="flex items-center justify-between mx-4 mb-4">
<h1 class="text-3xl font-bold bg-clip-text text-primary">
{$t('adventures.notes')}
</h1>
<div class="badge badge-info badge-lg">{filteredNotes.length}</div>
</div>
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4 gap-4 mx-4">
{#each filteredNotes as note}
<NoteCard
{note}
{user}
on:edit={handleEditNote}
on:delete={handleDeleteNote}
{collection}
/>
{/each}
</div>
</div>
{/if}
<!-- Checklists Section -->
{#if (filterOption === 'all' || filterOption === 'checklists') && filteredChecklists.length > 0}
<div class="mb-8">
<div class="flex items-center justify-between mx-4 mb-4">
<h1 class="text-3xl font-bold bg-clip-text text-primary">
{$t('adventures.checklists')}
</h1>
<div class="badge badge-secondary badge-lg">{filteredChecklists.length}</div>
</div>
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4 gap-4 mx-4">
{#each filteredChecklists as checklist}
<ChecklistCard
{checklist}
{user}
on:delete={handleDeleteChecklist}
on:edit={handleEditChecklist}
{collection}
/>
{/each}
</div>
</div>
{/if}
<!-- Empty State -->
{#if totalItems === 0}
<div class="hero min-h-96">
<div class="hero-content text-center">
<div class="max-w-md">
<Clear class="w-16 h-16 text-base-content/30 mb-4" />
<h1 class="text-3xl font-bold text-base-content/70">
{$t('immich.no_items_found')}
</h1>
</div>
</div>
</div>
{/if}

View File

@@ -3,7 +3,7 @@
import { createEventDispatcher } from 'svelte';
const dispatch = createEventDispatcher();
import { onMount } from 'svelte';
-import CollectionCard from './CollectionCard.svelte';
+import CollectionCard from './cards/CollectionCard.svelte';
let modal: HTMLDialogElement;
import { t } from 'svelte-i18n';

View File

@@ -1,13 +1,22 @@
<script lang="ts">
import { createEventDispatcher } from 'svelte';
import type { Collection, Transportation } from '$lib/types';
const dispatch = createEventDispatcher();
import { onMount } from 'svelte';
import { addToast } from '$lib/toasts';
let modal: HTMLDialogElement;
import { t } from 'svelte-i18n';
import MarkdownEditor from './MarkdownEditor.svelte';
import { addToast } from '$lib/toasts';
import type { Collection, ContentImage, SlimCollection } from '$lib/types';
// Icons
import CollectionIcon from '~icons/mdi/folder-multiple';
import InfoIcon from '~icons/mdi/information';
import CalendarIcon from '~icons/mdi/calendar';
import LinkIcon from '~icons/mdi/link';
import SaveIcon from '~icons/mdi/content-save';
import CloseIcon from '~icons/mdi/close';
import ImageIcon from '~icons/mdi/image-multiple';
const dispatch = createEventDispatcher();
let modal: HTMLDialogElement;
export let collectionToEdit: Collection | null = null;
@@ -19,18 +28,99 @@
end_date: collectionToEdit?.end_date || null,
user: collectionToEdit?.user || '',
is_public: collectionToEdit?.is_public || false,
-adventures: collectionToEdit?.adventures || [],
+locations: collectionToEdit?.locations || [],
link: collectionToEdit?.link || '',
-shared_with: undefined
+shared_with: collectionToEdit?.shared_with || [],
itinerary: collectionToEdit?.itinerary || [],
status: collectionToEdit?.status || 'folder',
days_until_start: collectionToEdit?.days_until_start ?? null,
primary_image: collectionToEdit?.primary_image ?? null,
primary_image_id: collectionToEdit?.primary_image_id ?? null,
itinerary_days: []
};
console.log(collection);
let availableImages: ContentImage[] = [];
let coverImageId: string | null = collection.primary_image?.id || null;
function setImagesFromCollection(col: Collection) {
const seen = new Map<string, ContentImage>();
(col.locations || []).forEach((loc) => {
(loc.images || []).forEach((img) => {
if (!seen.has(img.id)) {
seen.set(img.id, img);
}
});
});
const deduped = Array.from(seen.values());
deduped.sort((a, b) => {
if (coverImageId && a.id === coverImageId) return -1;
if (coverImageId && b.id === coverImageId) return 1;
if (a.is_primary && !b.is_primary) return -1;
if (!a.is_primary && b.is_primary) return 1;
return a.id.localeCompare(b.id);
});
availableImages = deduped;
}
function selectCover(imageId: string | null) {
coverImageId = imageId;
collection.primary_image_id = imageId;
setImagesFromCollection(collection);
}
function toSlimCollection(col: Collection): SlimCollection {
return {
id: col.id,
user: col.user,
name: col.name,
description: col.description,
is_public: col.is_public,
start_date: col.start_date,
end_date: col.end_date,
is_archived: col.is_archived ?? false,
link: col.link ?? null,
created_at: col.created_at ?? '',
updated_at: col.updated_at ?? '',
location_images: (col.locations || []).flatMap((loc) => loc.images || []),
location_count: (col.locations || []).length,
shared_with: col.shared_with || [],
status: col.status ?? 'folder',
days_until_start: col.days_until_start ?? null,
primary_image: col.primary_image ?? null
};
}
async function loadCollectionDetails() {
if (!collectionToEdit?.id) {
setImagesFromCollection(collection);
return;
}
try {
const res = await fetch(`/api/collections/${collectionToEdit.id}?nested=true`);
if (res.ok) {
const data = (await res.json()) as Collection;
collection = { ...collection, ...data };
coverImageId = data.primary_image?.id ?? coverImageId;
collection.primary_image_id = coverImageId;
setImagesFromCollection(collection);
return;
}
} catch (err) {
console.error(err);
}
setImagesFromCollection(collection);
}
onMount(async () => {
modal = document.getElementById('my_modal_1') as HTMLDialogElement;
if (modal) {
modal.showModal();
}
await loadCollectionDetails();
});
function close() {
@@ -45,7 +135,6 @@
async function handleSubmit(event: Event) {
event.preventDefault();
-console.log(collection);
if (collection.start_date && !collection.end_date) {
collection.end_date = collection.start_date;
@@ -69,19 +158,32 @@
collection.end_date = null;
}
const payload = {
name: collection.name,
description: collection.description,
start_date: collection.start_date,
end_date: collection.end_date,
is_public: collection.is_public,
link: collection.link,
primary_image_id: coverImageId
};
if (collection.id === '') {
let res = await fetch('/api/collections', {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
-body: JSON.stringify(collection)
+body: JSON.stringify(payload)
});
let data = await res.json();
if (data.id) {
collection = data as Collection;
coverImageId = collection.primary_image?.id ?? null;
collection.primary_image_id = coverImageId;
setImagesFromCollection(collection);
addToast('success', $t('collection.collection_created'));
-dispatch('save', collection);
+dispatch('save', toSlimCollection(collection));
} else {
console.error(data);
addToast('error', $t('collection.error_creating_collection'));
@@ -92,13 +194,16 @@
headers: {
'Content-Type': 'application/json'
},
-body: JSON.stringify(collection)
+body: JSON.stringify(payload)
});
let data = await res.json();
if (data.id) {
collection = data as Collection;
coverImageId = collection.primary_image?.id ?? null;
collection.primary_image_id = coverImageId;
setImagesFromCollection(collection);
addToast('success', $t('collection.collection_edit_success'));
-dispatch('save', collection);
+dispatch('save', toSlimCollection(collection));
} else {
addToast('error', $t('collection.error_editing_collection'));
}
@@ -106,141 +211,328 @@
}
</script>
-<dialog id="my_modal_1" class="modal">
+<dialog id="my_modal_1" class="modal backdrop-blur-sm">
<!-- svelte-ignore a11y-no-noninteractive-tabindex -->
<!-- svelte-ignore a11y-no-noninteractive-element-interactions -->
<div class="modal-box w-11/12 max-w-3xl" role="dialog" on:keydown={handleKeydown} tabindex="0">
<h3 class="font-bold text-2xl">
{collectionToEdit ? $t('adventures.edit_collection') : $t('collection.new_collection')}
</h3>
<div class="modal-action items-center">
<form method="post" style="width: 100%;" on:submit={handleSubmit}>
<!-- Basic Information Section -->
<div class="collapse collapse-plus bg-base-200 mb-4">
<input type="checkbox" checked />
<div class="collapse-title text-xl font-medium">
{$t('adventures.basic_information')}
<div
class="modal-box w-11/12 max-w-6xl bg-gradient-to-br from-base-100 via-base-100 to-base-200 border border-base-300 shadow-2xl max-h-[85vh] flex flex-col"
role="dialog"
on:keydown={handleKeydown}
tabindex="0"
>
<!-- Header Section -->
<div
class="top-0 z-10 bg-base-100/90 backdrop-blur-lg border-b border-base-300 -mx-6 -mt-6 px-6 py-4 mb-6"
>
<div class="flex items-center justify-between">
<div class="flex items-center gap-3">
<div class="p-2 bg-primary/10 rounded-xl">
<CollectionIcon class="w-8 h-8 text-primary" />
</div>
<div class="collapse-content">
<!-- Name -->
<div>
<label for="name">
{$t('adventures.name')}<span class="text-red-500">*</span>
</label>
<input
type="text"
id="name"
name="name"
bind:value={collection.name}
class="input input-bordered w-full"
required
/>
<div>
<h1 class="text-3xl font-bold text-primary bg-clip-text">
{collectionToEdit
? $t('adventures.edit_collection')
: $t('collection.new_collection')}
</h1>
<p class="text-sm text-base-content/60">
{collectionToEdit
? $t('collection.update_collection_details')
: $t('collection.create_new_collection')}
</p>
</div>
</div>
<!-- Close Button -->
<button class="btn btn-ghost btn-square" on:click={close}>
<CloseIcon class="w-5 h-5" />
</button>
</div>
</div>
<!-- Main Content -->
<div class="p-6 overflow-auto max-h-[70vh]">
<form method="post" on:submit={handleSubmit} class="space-y-6">
<!-- Basic Information Section -->
<div class="card bg-base-100 border border-base-300 shadow-lg">
<div class="card-body p-6">
<div class="flex items-center gap-3 mb-6">
<div class="p-2 bg-primary/10 rounded-lg">
<InfoIcon class="w-5 h-5 text-primary" />
</div>
<h2 class="text-xl font-bold">{$t('adventures.basic_information')}</h2>
</div>
<!-- Description -->
<div>
<label for="description">{$t('adventures.description')}</label><br />
<MarkdownEditor bind:text={collection.description} editor_height={'h-32'} />
</div>
<!-- Start Date -->
<div>
<label for="start_date">{$t('adventures.start_date')}</label>
<input
type="date"
id="start_date"
name="start_date"
bind:value={collection.start_date}
class="input input-bordered w-full"
/>
</div>
<!-- End Date -->
<div>
<label for="end_date">{$t('adventures.end_date')}</label>
<input
type="date"
id="end_date"
name="end_date"
bind:value={collection.end_date}
class="input input-bordered w-full"
/>
</div>
<!-- Public -->
<div>
<label class="label cursor-pointer flex items-start space-x-2">
<span class="label-text">{$t('collection.public_collection')}</span>
<input
type="checkbox"
class="toggle toggle-primary"
id="is_public"
name="is_public"
bind:checked={collection.is_public}
/>
</label>
</div>
<!-- Link -->
<div>
<label for="link">{$t('adventures.link')}</label>
<input
type="text"
id="link"
name="link"
bind:value={collection.link}
class="input input-bordered w-full"
/>
<div class="grid grid-cols-1 lg:grid-cols-2 gap-6">
<!-- Left Column -->
<div class="space-y-4">
<!-- Name Field -->
<div class="form-control">
<label class="label" for="name">
<span class="label-text font-medium"
>{$t('adventures.name')}<span class="text-error ml-1">*</span></span
>
</label>
<input
type="text"
id="name"
name="name"
bind:value={collection.name}
class="input input-bordered w-full"
placeholder={$t('collection.enter_collection_name')}
required
/>
</div>
<!-- Description Field -->
<div class="form-control">
<label class="label" for="description">
<span class="label-text font-medium">{$t('adventures.description')}</span>
</label>
<MarkdownEditor bind:text={collection.description} editor_height={'h-32'} />
</div>
<!-- Link Field -->
<div class="form-control">
<label class="label" for="link">
<span class="label-text font-medium flex items-center gap-2">
<LinkIcon class="w-4 h-4" />
{$t('adventures.link')}
</span>
</label>
<input
type="text"
id="link"
name="link"
bind:value={collection.link}
class="input input-bordered w-full"
placeholder="https://example.com"
/>
</div>
</div>
<!-- Right Column -->
<div class="space-y-4">
<!-- Start Date -->
<div class="form-control">
<label class="label" for="start_date">
<span class="label-text font-medium flex items-center gap-2">
<CalendarIcon class="w-4 h-4" />
{$t('adventures.start_date')}
</span>
</label>
<input
type="date"
id="start_date"
name="start_date"
bind:value={collection.start_date}
class="input input-bordered w-full"
/>
</div>
<!-- End Date -->
<div class="form-control">
<label class="label" for="end_date">
<span class="label-text font-medium flex items-center gap-2">
<CalendarIcon class="w-4 h-4" />
{$t('adventures.end_date')}
</span>
</label>
<input
type="date"
id="end_date"
name="end_date"
bind:value={collection.end_date}
class="input input-bordered w-full"
/>
</div>
<!-- Public Toggle -->
<div class="form-control">
<label class="label cursor-pointer justify-start gap-3">
<input
type="checkbox"
class="toggle toggle-primary"
id="is_public"
name="is_public"
bind:checked={collection.is_public}
/>
<span class="label-text font-medium">{$t('collection.public_collection')}</span>
</label>
<div class="pl-12">
<span class="text-sm text-base-content/60"
>{$t('collection.public_collection_description')}</span
>
</div>
</div>
</div>
</div>
</div>
</div>
<!-- Date Warning Alert -->
{#if !collection.start_date && !collection.end_date}
<div class="mt-4">
<div role="alert" class="alert alert-neutral">
<svg
xmlns="http://www.w3.org/2000/svg"
fill="none"
viewBox="0 0 24 24"
class="h-6 w-6 shrink-0 stroke-current"
>
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M13 16h-1v-4h-1m1-4h.01M21 12a9 9 0 11-18 0 9 9 0 0118 0z"
></path>
</svg>
<span>{$t('adventures.collection_no_start_end_date')}</span>
<div role="alert" class="alert alert-info shadow-lg">
<svg
xmlns="http://www.w3.org/2000/svg"
fill="none"
viewBox="0 0 24 24"
class="h-6 w-6 shrink-0 stroke-current"
>
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M13 16h-1v-4h-1m1-4h.01M21 12a9 9 0 11-18 0 9 9 0 0118 0z"
></path>
</svg>
<span>{$t('adventures.collection_no_start_end_date')}</span>
</div>
{:else if collection.id && collectionToEdit?.start_date && collectionToEdit?.end_date && (collection.start_date !== collectionToEdit.start_date || collection.end_date !== collectionToEdit.end_date)}
<div role="alert" class="alert alert-warning shadow-lg">
<svg
xmlns="http://www.w3.org/2000/svg"
class="stroke-current shrink-0 h-6 w-6"
fill="none"
viewBox="0 0 24 24"
>
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z"
/>
</svg>
<div>
<div class="font-bold">{$t('collection.changing_date_title')}</div>
<div class="text-sm">
{$t('collection.changing_date_warning')}
</div>
</div>
</div>
{/if}
<div class="mt-4">
<button type="submit" class="btn btn-primary">
{$t('notes.save')}
</button>
<button type="button" class="btn" on:click={close}>
<!-- Cover Image Selection -->
{#if collection.id}
<div class="card bg-base-100 border border-base-300 shadow-lg">
<div class="card-body p-6 space-y-4">
<div class="flex items-center gap-3">
<div class="p-2 bg-primary/10 rounded-lg">
<ImageIcon class="w-5 h-5 text-primary" />
</div>
<div>
<h3 class="text-lg font-semibold">
{$t('collection.cover_image') ?? 'Cover image'}
</h3>
<p class="text-sm text-base-content/60">
{$t('collection.cover_image_hint') ??
'Choose a cover from images in this collection.'}
</p>
</div>
</div>
{#if availableImages.length === 0}
<div class="alert alert-info shadow-sm">
<span>
{$t('collection.no_images_available') ??
'No images available from linked locations yet.'}
</span>
</div>
{:else}
<div class="grid sm:grid-cols-2 md:grid-cols-3 lg:grid-cols-4 gap-3">
{#each availableImages as image (image.id)}
<button
type="button"
class="relative group rounded-xl overflow-hidden border border-base-300 bg-base-200/30 hover:border-primary transition shadow-sm {coverImageId ===
image.id
? 'ring-2 ring-primary ring-offset-2 ring-offset-base-100'
: ''}"
on:click={() => selectCover(image.id)}
aria-pressed={coverImageId === image.id}
>
<img
src={image.image}
alt="Cover candidate"
class="w-full h-32 object-cover"
/>
<div
class="absolute inset-0 bg-gradient-to-t from-base-300/60 to-transparent opacity-0 group-hover:opacity-100 transition"
/>
{#if coverImageId === image.id}
<div class="absolute top-2 left-2 badge badge-primary gap-2 shadow">
{$t('collection.cover') ?? 'Cover'}
</div>
{:else if image.is_primary}
<div class="absolute top-2 left-2 badge badge-ghost shadow">
{$t('collection.location_primary') ?? 'Location cover'}
</div>
{/if}
<div
class="absolute bottom-2 right-2 btn btn-xs btn-ghost bg-base-100/90 shadow"
>
{coverImageId === image.id
? ($t('collection.cover') ?? 'Cover')
: ($t('collection.set_cover') ?? 'Set cover')}
</div>
</button>
{/each}
</div>
<div class="flex justify-end">
<button
type="button"
class="btn btn-ghost btn-sm"
on:click={() => selectCover(null)}
>
<CloseIcon class="w-4 h-4" />
<span>{$t('collection.clear_cover') ?? 'Clear cover'}</span>
</button>
</div>
{/if}
</div>
</div>
{/if}
<!-- Share Link Section (only if public and has ID) -->
{#if collection.is_public && collection.id}
<div class="card bg-base-100 border border-base-300 shadow-lg">
<div class="card-body p-6">
<h3 class="font-semibold text-lg mb-3">{$t('adventures.share_collection')}</h3>
<div class="flex items-center gap-3">
<input
type="text"
value="{window.location.origin}/collections/{collection.id}"
readonly
class="input input-bordered flex-1 font-mono text-sm"
/>
<button
type="button"
on:click={() => {
navigator.clipboard.writeText(
`${window.location.origin}/collections/${collection.id}`
);
addToast('success', $t('adventures.link_copied'));
}}
class="btn btn-primary gap-2"
>
<LinkIcon class="w-4 h-4" />
{$t('adventures.copy_link')}
</button>
</div>
</div>
</div>
{/if}
<!-- Action Buttons -->
<div class="flex gap-3 justify-end pt-4">
<button type="button" class="btn btn-neutral gap-2" on:click={close}>
<CloseIcon class="w-5 h-5" />
{$t('about.close')}
</button>
<button type="submit" class="btn btn-primary gap-2">
<SaveIcon class="w-5 h-5" />
{$t('notes.save')}
</button>
</div>
-{#if collection.is_public && collection.id}
-<div class="bg-neutral p-4 mt-2 rounded-md shadow-sm text-neutral-content">
-<p class=" font-semibold">{$t('adventures.share_collection')}</p>
-<div class="flex items-center justify-between">
-<p class="text-card-foreground font-mono">
-{window.location.origin}/collections/{collection.id}
-</p>
-<button
-type="button"
-on:click={() => {
-navigator.clipboard.writeText(
-`${window.location.origin}/collections/${collection.id}`
-);
-}}
-class="inline-flex items-center justify-center whitespace-nowrap rounded-md text-sm font-medium ring-offset-background transition-colors focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:pointer-events-none disabled:opacity-50 h-10 px-4 py-2"
->
-{$t('adventures.copy_link')}
-</button>
-</div>
-</div>
-{/if}
</form>
</div>
</div>
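The cover-image picker in the modal above dedupes images collected from all linked locations, then floats the currently selected cover to the front, followed by each location's primary image. Extracted as a standalone helper, the dedupe-and-sort logic looks roughly like this (a sketch mirroring `setImagesFromCollection`, with `ContentImage` reduced to the fields the sort touches — not the component's actual export):

```typescript
// Minimal shape of the fields the sort logic relies on.
type ContentImage = { id: string; image: string; is_primary: boolean };

// Dedupe by id (first occurrence wins), then order: current cover first,
// location-primary images next, the rest in stable id order.
function sortCoverCandidates(
  images: ContentImage[],
  coverImageId: string | null
): ContentImage[] {
  const seen = new Map<string, ContentImage>();
  for (const img of images) {
    if (!seen.has(img.id)) seen.set(img.id, img);
  }
  return Array.from(seen.values()).sort((a, b) => {
    if (coverImageId && a.id === coverImageId) return -1;
    if (coverImageId && b.id === coverImageId) return 1;
    if (a.is_primary && !b.is_primary) return -1;
    if (!a.is_primary && b.is_primary) return 1;
    return a.id.localeCompare(b.id);
  });
}
```

Because the comparator returns `-1` for the cover before checking `is_primary`, a selected cover always outranks a location's primary image; ties fall back to `localeCompare` on ids so the grid order is deterministic across re-renders.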

View File

@@ -0,0 +1,800 @@
<script lang="ts">
import type { Collection, User, ContentImage } from '$lib/types';
import { onMount } from 'svelte';
import { t } from 'svelte-i18n';
import { DefaultMarker, MapLibre, Popup } from 'svelte-maplibre';
import { getBasemapUrl } from '$lib';
import MagnifyIcon from '~icons/mdi/magnify';
import MapMarker from '~icons/mdi/map-marker';
import Star from '~icons/mdi/star';
import StarHalfFull from '~icons/mdi/star-half-full';
import StarOutline from '~icons/mdi/star-outline';
import AccountMultiple from '~icons/mdi/account-multiple';
import Phone from '~icons/mdi/phone';
import Web from '~icons/mdi/web';
import OpenInNew from '~icons/mdi/open-in-new';
import ClockOutline from '~icons/mdi/clock-outline';
import CurrencyUsd from '~icons/mdi/currency-usd';
import TuneVariant from '~icons/mdi/tune-variant';
import CloseCircle from '~icons/mdi/close-circle';
import Compass from '~icons/mdi/compass';
import ImageDisplayModal from '$lib/components/ImageDisplayModal.svelte';
import LocationModal from '$lib/components/locations/LocationModal.svelte';
import LodgingModal from '$lib/components/lodging/LodgingModal.svelte';
import { createEventDispatcher } from 'svelte';
import type { Location, Lodging } from '$lib/types';
export let collection: Collection;
export let user: User | null;
// Whether the current user can modify this collection (owner or shared user)
type RecommendationResult = {
name: string;
latitude: number;
longitude: number;
distance_km: number;
source: 'google' | 'osm';
type: string;
tags?: Record<string, string>;
rating?: number;
review_count?: number;
address?: string;
business_status?: string;
opening_hours?: string[];
is_open_now?: boolean;
photos?: string[];
phone_number?: string;
website?: string;
google_maps_uri?: string;
price_level?: string;
description?: string;
quality_score?: number;
};
let searchQuery = '';
let selectedCategory: 'tourism' | 'lodging' | 'food' = 'tourism';
let radiusValue = 5000; // Default 5km
let loading = false;
let results: RecommendationResult[] = [];
let error: string | null = null;
let selectedLocationId: string | null = null;
let showFilters = false;
let mapCenter: { lng: number; lat: number } = { lng: 0, lat: 0 };
let mapZoom = 12;
// Filters
let minRating = 0;
let minReviews = 0;
let showOpenOnly = false;
// Photo modal
let photoModalOpen = false;
let selectedPhotos: ContentImage[] = [];
let selectedPhotoIndex = 0;
let selectedPlaceName = '';
let selectedPlaceAddress = '';
const dispatch = createEventDispatcher();
// Modals for creating autofilled items
let showLocationModal = false;
let showLodgingModal = false;
let modalLocationToEdit: Location | null = null;
let modalLodgingToEdit: Lodging | null = null;
function mapPhotosToContentImages(photos: string[]): ContentImage[] {
return photos.map((url, i) => ({
id: `rec-${i}-${Date.now()}`,
image: url,
is_primary: i === 0,
immich_id: null
}));
}
function openCreateLocationFromResult(result: RecommendationResult) {
modalLocationToEdit = {
id: '',
name: result.name || '',
location: result.address || result.description || '',
tags: [],
description: result.description || null,
rating: result.rating ?? NaN,
price: null,
price_currency: null,
link: result.website || null,
images: mapPhotosToContentImages(result.photos || []),
visits: [],
collections: [collection.id],
latitude: result.latitude ?? null,
longitude: result.longitude ?? null,
is_public: false,
user: user ?? null,
category: null,
attachments: [],
trails: []
} as Location;
showLocationModal = true;
}
function openCreateLodgingFromResult(result: RecommendationResult) {
modalLodgingToEdit = {
id: '',
user: user ? user.uuid : '',
name: result.name || '',
type: '',
description: result.description || null,
rating: result.rating ?? null,
link: result.website || null,
check_in: null,
check_out: null,
timezone: null,
reservation_number: null,
price: null,
price_currency: null,
latitude: result.latitude ?? null,
longitude: result.longitude ?? null,
location: result.address || result.description || null,
is_public: false,
collection: collection.id,
created_at: '',
updated_at: '',
images: mapPhotosToContentImages(result.photos || []),
attachments: []
} as Lodging;
showLodgingModal = true;
}
function handleLocationCreate(e: CustomEvent) {
const created: Location = e.detail;
showLocationModal = false;
modalLocationToEdit = null;
collection.locations = [...collection.locations, created];
}
function handleLodgingCreate(e: CustomEvent) {
const created: Lodging = e.detail;
showLodgingModal = false;
modalLodgingToEdit = null;
collection.lodging = [...(collection.lodging ?? []), created];
}
function closeLocationModal() {
showLocationModal = false;
modalLocationToEdit = null;
}
function closeLodgingModal() {
showLodgingModal = false;
modalLodgingToEdit = null;
}
$: isMetric = user?.measurement_system === 'metric';
$: radiusDisplay = isMetric
? `${(radiusValue / 1000).toFixed(1)} km`
: `${(radiusValue / 1609.34).toFixed(1)} mi`;
$: radiusOptions = isMetric
? [
{ value: 1000, label: '1 km' },
{ value: 2000, label: '2 km' },
{ value: 5000, label: '5 km' },
{ value: 10000, label: '10 km' },
{ value: 20000, label: '20 km' },
{ value: 50000, label: '50 km' }
]
: [
{ value: 1609, label: '1 mi' },
{ value: 3219, label: '2 mi' },
{ value: 8047, label: '5 mi' },
{ value: 16093, label: '10 mi' },
{ value: 32187, label: '20 mi' },
{ value: 80467, label: '50 mi' }
];
// Get locations with coordinates for dropdown
$: locationsWithCoords = collection.locations.filter(
(l) => l.latitude != null && l.longitude != null
);
// Set default selected location and map center
onMount(() => {
if (locationsWithCoords.length > 0) {
selectedLocationId = locationsWithCoords[0].id;
mapCenter = {
lng: locationsWithCoords[0].longitude!,
lat: locationsWithCoords[0].latitude!
};
}
});
// Update map center when selected location changes
$: if (selectedLocationId) {
const location = locationsWithCoords.find((l) => l.id === selectedLocationId);
if (location && location.latitude != null && location.longitude != null) {
mapCenter = { lng: location.longitude, lat: location.latitude };
}
}
// Filter results
$: filteredResults = results.filter((r) => {
if (minRating > 0 && (r.rating === undefined || r.rating < minRating)) return false;
if (minReviews > 0 && (r.review_count === undefined || r.review_count < minReviews))
return false;
if (showOpenOnly && !r.is_open_now) return false;
return true;
});
async function searchRecommendations() {
if (!searchQuery.trim() && !selectedLocationId) {
error = 'Please select a location or enter a search query';
return;
}
loading = true;
error = null;
results = [];
try {
const params = new URLSearchParams();
if (selectedLocationId) {
const location = locationsWithCoords.find((l) => l.id === selectedLocationId);
if (location && location.latitude != null && location.longitude != null) {
params.append('lat', location.latitude.toString());
params.append('lon', location.longitude.toString());
}
} else if (searchQuery.trim()) {
params.append('location', searchQuery);
}
params.append('radius', radiusValue.toString());
params.append('category', selectedCategory);
const response = await fetch(`/api/recommendations/query?${params.toString()}`);
if (!response.ok) {
const errorData = await response.json();
throw new Error(errorData.error || 'Failed to fetch recommendations');
}
const data = await response.json();
if (data.error) {
throw new Error(data.error);
}
results = data.results || [];
// Update map if we have results
if (results.length > 0) {
// Center the map on the average position of all results
const lats = results.map((r) => r.latitude);
const lngs = results.map((r) => r.longitude);
const avgLat = lats.reduce((a, b) => a + b, 0) / lats.length;
const avgLng = lngs.reduce((a, b) => a + b, 0) / lngs.length;
mapCenter = { lng: avgLng, lat: avgLat };
}
} catch (err) {
error = err instanceof Error ? err.message : 'An error occurred';
console.error('Error fetching recommendations:', err);
} finally {
loading = false;
}
}
function openPhotoModal(
photos: string[],
placeName: string,
placeAddress: string = '',
startIndex: number = 0
) {
// Convert photo URLs to ContentImage format
selectedPhotos = photos.map((url, index) => ({
id: `photo-${index}`,
image: url,
is_primary: index === 0,
immich_id: null
}));
selectedPlaceName = placeName;
selectedPlaceAddress = placeAddress;
selectedPhotoIndex = startIndex;
photoModalOpen = true;
}
function closePhotoModal() {
photoModalOpen = false;
selectedPhotos = [];
selectedPhotoIndex = 0;
selectedPlaceName = '';
selectedPlaceAddress = '';
}
function renderStars(rating: number | undefined) {
if (!rating) return [];
const stars = [];
const fullStars = Math.floor(rating);
const hasHalfStar = rating % 1 >= 0.5;
for (let i = 0; i < 5; i++) {
if (i < fullStars) {
stars.push({ type: 'full', key: i });
} else if (i === fullStars && hasHalfStar) {
stars.push({ type: 'half', key: i });
} else {
stars.push({ type: 'empty', key: i });
}
}
return stars;
}
function getPriceLevelDisplay(priceLevel: string | undefined) {
if (!priceLevel) return '';
const levels: Record<string, string> = {
FREE: 'Free',
INEXPENSIVE: '$',
MODERATE: '$$',
EXPENSIVE: '$$$',
VERY_EXPENSIVE: '$$$$'
};
return levels[priceLevel] || '';
}
function formatDistance(km: number) {
if (isMetric) {
return km < 1 ? `${Math.round(km * 1000)} m` : `${km.toFixed(1)} km`;
} else {
const miles = km / 1.60934;
const feet = miles * 5280;
return miles < 0.1 ? `${Math.round(feet)} ft` : `${miles.toFixed(1)} mi`;
}
}
</script>
<!-- Photo Modal -->
{#if photoModalOpen}
<ImageDisplayModal
images={selectedPhotos}
initialIndex={selectedPhotoIndex}
name={selectedPlaceName}
location={selectedPlaceAddress}
on:close={closePhotoModal}
/>
{/if}
{#if showLocationModal}
<LocationModal
{user}
{collection}
locationToEdit={modalLocationToEdit}
on:create={handleLocationCreate}
on:save={handleLocationCreate}
on:close={closeLocationModal}
/>
{/if}
{#if showLodgingModal}
<LodgingModal
{user}
{collection}
lodgingToEdit={modalLodgingToEdit}
on:create={handleLodgingCreate}
on:close={closeLodgingModal}
on:save={handleLodgingCreate}
/>
{/if}
<div class="space-y-6">
<!-- Search & Filter Card -->
<div class="card bg-base-200 shadow-xl">
<div class="card-body">
<h2 class="card-title text-2xl mb-4">
<Compass class="w-8 h-8" />
{$t('recomendations.discover_places')}
</h2>
<!-- Search Options -->
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
<!-- Location Selector -->
{#if locationsWithCoords.length > 0}
<div class="form-control">
<label class="label">
<span class="label-text font-semibold"
>{$t('recomendations.search_around_location')}</span
>
</label>
<select class="select select-bordered w-full" bind:value={selectedLocationId}>
<option value={null}>{$t('recomendations.use_search_instead')}...</option>
{#each locationsWithCoords as location}
<option value={location.id}>{location.name}</option>
{/each}
</select>
</div>
{/if}
<!-- Search Input -->
<div class="form-control">
<label class="label">
<span class="label-text font-semibold">{$t('recomendations.search_by_address')}</span>
</label>
<input
type="text"
placeholder={$t('adventures.search_placeholder')}
class="input input-bordered w-full"
bind:value={searchQuery}
disabled={selectedLocationId !== null}
on:keydown={(e) => e.key === 'Enter' && searchRecommendations()}
/>
</div>
<!-- Category Selector -->
<div class="form-control">
<label class="label">
<span class="label-text font-semibold">{$t('adventures.category')}</span>
</label>
<select class="select select-bordered w-full" bind:value={selectedCategory}>
<option value="tourism">🏛️ {$t('recomendations.tourism')}</option>
<option value="lodging">🏨 {$t('recomendations.lodging')}</option>
<option value="food">🍴 {$t('recomendations.food')}</option>
</select>
</div>
<!-- Radius Selector -->
<div class="form-control">
<label class="label">
<span class="label-text font-semibold"
>{$t('recomendations.search_radius_label')} {radiusDisplay}</span
>
</label>
<select class="select select-bordered w-full" bind:value={radiusValue}>
{#each radiusOptions as option}
<option value={option.value}>{option.label}</option>
{/each}
</select>
</div>
</div>
<!-- Filters Toggle -->
<div class="flex gap-2 mt-4">
<button class="btn btn-primary flex-1" on:click={searchRecommendations} disabled={loading}>
{#if loading}
<span class="loading loading-spinner loading-sm"></span>
{$t('recomendations.searching')}
{:else}
<MagnifyIcon class="w-5 h-5" />
{$t('navbar.search')}
{/if}
</button>
<button class="btn btn-ghost" on:click={() => (showFilters = !showFilters)}>
<TuneVariant class="w-5 h-5" />
{$t('adventures.filter')}
</button>
</div>
<!-- Advanced Filters -->
{#if showFilters}
<div class="divider">{$t('adventures.filter')}</div>
<div class="grid grid-cols-1 md:grid-cols-3 gap-4">
<div class="form-control">
<label class="label">
<span class="label-text">{$t('recomendations.minimum_rating')}</span>
</label>
<select class="select select-bordered select-sm" bind:value={minRating}>
<option value={0}>{$t('recomendations.any')}</option>
<option value={3}>3+ ⭐</option>
<option value={3.5}>3.5+ ⭐</option>
<option value={4}>4+ ⭐</option>
<option value={4.5}>4.5+ ⭐</option>
</select>
</div>
<div class="form-control">
<!-- svelte-ignore a11y-label-has-associated-control -->
<label class="label">
<span class="label-text">{$t('recomendations.minimum_reviews')}</span>
</label>
<select class="select select-bordered select-sm" bind:value={minReviews}>
<option value={0}>{$t('recomendations.any')}</option>
<option value={10}>10+</option>
<option value={50}>50+</option>
<option value={100}>100+</option>
<option value={500}>500+</option>
</select>
</div>
<div class="form-control">
<label class="label cursor-pointer">
<span class="label-text">{$t('recomendations.open_now_only')}</span>
<input type="checkbox" class="toggle toggle-primary" bind:checked={showOpenOnly} />
</label>
</div>
</div>
{/if}
<!-- Error Message -->
{#if error}
<div class="alert alert-error mt-4">
<CloseCircle class="w-6 h-6" />
<span>{error}</span>
</div>
{/if}
</div>
</div>
<!-- Results -->
{#if loading}
<div class="flex justify-center py-12">
<span class="loading loading-spinner loading-lg text-primary"></span>
</div>
{:else if filteredResults.length > 0}
<!-- Results Stats -->
<div class="stats shadow w-full">
<div class="stat">
<div class="stat-title">{$t('recomendations.total_results')}</div>
<div class="stat-value text-primary">{filteredResults.length}</div>
</div>
<div class="stat">
<div class="stat-title">{$t('recomendations.average_rating')}</div>
<div class="stat-value text-secondary">
{filteredResults.some((r) => r.rating)
? (
filteredResults.filter((r) => r.rating).reduce((sum, r) => sum + (r.rating || 0), 0) /
filteredResults.filter((r) => r.rating).length
).toFixed(1)
: 'N/A'}
</div>
</div>
<div class="stat">
<div class="stat-title">{$t('recomendations.search_radius_label')}</div>
<div class="stat-value text-accent">{radiusDisplay}</div>
</div>
</div>
<!-- Map View -->
<div class="card bg-base-200 shadow-xl">
<div class="card-body">
<h3 class="card-title text-xl mb-4">📍 {$t('recomendations.map_view')}</h3>
<div class="rounded-lg overflow-hidden shadow-lg">
<MapLibre
style={getBasemapUrl()}
class="w-full h-[500px]"
standardControls
center={mapCenter}
zoom={mapZoom}
>
<!-- Collection Locations -->
{#each collection.locations as location}
{#if location.latitude && location.longitude}
<DefaultMarker lngLat={{ lng: location.longitude, lat: location.latitude }}>
<Popup openOn="click" offset={[0, -10]}>
<div class="p-2">
<a
href={`/adventures/${location.id}`}
class="text-lg font-bold text-black hover:underline mb-1 block"
>
{location.name}
</a>
<p class="text-xs text-black opacity-70">
{$t('recomendations.your_location')}
</p>
</div>
</Popup>
</DefaultMarker>
{/if}
{/each}
<!-- Recommendation Results -->
{#each filteredResults as result}
<DefaultMarker lngLat={{ lng: result.longitude, lat: result.latitude }}>
<Popup openOn="click" offset={[0, -10]}>
<div class="p-3 max-w-xs">
<h4 class="text-base font-bold text-black mb-2">{result.name}</h4>
{#if result.rating}
<div class="flex items-center gap-2 mb-2">
<div class="flex text-yellow-500">
{#each renderStars(result.rating) as star}
{#if star.type === 'full'}
<Star class="w-4 h-4" />
{:else if star.type === 'half'}
<StarHalfFull class="w-4 h-4" />
{:else}
<StarOutline class="w-4 h-4" />
{/if}
{/each}
</div>
<span class="text-sm text-black">{result.rating.toFixed(1)}</span>
{#if result.review_count}
<span class="text-xs text-black opacity-70">
({result.review_count})
</span>
{/if}
</div>
{/if}
{#if result.address}
<p class="text-xs text-black opacity-70 mb-2">📍 {result.address}</p>
{/if}
<p class="text-xs text-black font-semibold">
🚶 {formatDistance(result.distance_km)}
{$t('recomendations.away')}
</p>
</div>
</Popup>
</DefaultMarker>
{/each}
</MapLibre>
</div>
</div>
</div>
<!-- Results Grid -->
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-4">
{#each filteredResults as result}
<div class="card bg-base-100 shadow-xl hover:shadow-2xl transition-shadow">
<!-- Photo Carousel -->
{#if result.photos && result.photos.length > 0}
<figure class="relative h-48 cursor-pointer">
<button
class="w-full h-full"
on:click={() =>
openPhotoModal(result.photos || [], result.name, result.address || '')}
>
<img src={result.photos[0]} alt={result.name} class="w-full h-full object-cover" />
</button>
{#if result.photos.length > 1}
<div
class="badge badge-neutral badge-sm absolute bottom-2 right-2 bg-black/70 text-white border-none"
>
📷 {result.photos.length}
</div>
{/if}
</figure>
{:else}
<div
class="bg-gradient-to-br from-primary/20 to-secondary/20 h-48 flex items-center justify-center"
>
<MapMarker class="w-16 h-16 opacity-30" />
</div>
{/if}
<div class="card-body p-4">
<!-- Title & Type -->
<h3 class="card-title text-lg">
{result.name}
{#if result.is_open_now}
<span class="badge badge-success badge-sm">{$t('recomendations.open')}</span>
{/if}
</h3>
<!-- Rating -->
{#if result.rating}
<div class="flex items-center gap-2 mb-2">
<div class="flex text-yellow-500">
{#each renderStars(result.rating) as star}
{#if star.type === 'full'}
<Star class="w-4 h-4" />
{:else if star.type === 'half'}
<StarHalfFull class="w-4 h-4" />
{:else}
<StarOutline class="w-4 h-4" />
{/if}
{/each}
</div>
<span class="text-sm font-semibold">{result.rating.toFixed(1)}</span>
{#if result.review_count}
<span class="text-xs opacity-70">
<AccountMultiple class="w-3 h-3 inline" />
{result.review_count}
</span>
{/if}
{#if result.quality_score}
<div class="badge badge-primary badge-sm ml-auto">
Score: {result.quality_score}
</div>
{/if}
</div>
{/if}
<!-- Address -->
{#if result.address}
<p class="text-sm opacity-70 line-clamp-2">
<MapMarker class="w-4 h-4 inline" />
{result.address}
</p>
{/if}
<!-- Distance & Price -->
<div class="flex gap-2 flex-wrap mt-2">
<div class="badge badge-outline badge-sm">
🚶 {formatDistance(result.distance_km)}
</div>
{#if result.price_level}
<div class="badge badge-outline badge-sm">
<CurrencyUsd class="w-3 h-3" />
{getPriceLevelDisplay(result.price_level)}
</div>
{/if}
<div class="badge badge-ghost badge-sm">
{result.source === 'google' ? '🔍 Google' : '🗺️ OSM'}
</div>
</div>
<!-- Description -->
{#if result.description}
<p class="text-sm mt-2 line-clamp-2 opacity-80">
{result.description}
</p>
{/if}
<!-- Opening Hours -->
{#if result.opening_hours && result.opening_hours.length > 0}
<div class="collapse collapse-arrow bg-base-200 mt-2">
<input type="checkbox" />
<div class="collapse-title text-sm font-medium">
<ClockOutline class="w-4 h-4 inline" />
{$t('recomendations.hours')}
</div>
<div class="collapse-content text-xs">
{#each result.opening_hours as hours}
<p>{hours}</p>
{/each}
</div>
</div>
{/if}
<!-- Action Buttons -->
<div class="card-actions justify-end mt-4">
{#if result.phone_number}
<a href={`tel:${result.phone_number}`} class="btn btn-sm btn-neutral-100">
<Phone class="w-4 h-4" />
</a>
{/if}
{#if result.website}
<a
href={result.website}
target="_blank"
rel="noopener noreferrer"
class="btn btn-sm btn-neutral-100"
>
<Web class="w-4 h-4" />
</a>
{/if}
{#if result.google_maps_uri}
<a
href={result.google_maps_uri}
target="_blank"
rel="noopener noreferrer"
class="btn btn-sm btn-primary"
>
View on Maps
<OpenInNew class="w-4 h-4" />
</a>
{/if}
<!-- Create from recommendation -->
<button
class="btn btn-sm btn-outline"
on:click={() => openCreateLocationFromResult(result)}
>
{$t('recomendations.add_location')}
</button>
<button
class="btn btn-sm btn-ghost"
on:click={() => openCreateLodgingFromResult(result)}
>
{$t('recomendations.add_lodging')}
</button>
</div>
</div>
</div>
{/each}
</div>
{:else if !loading && results.length === 0 && !error}
<div class="card bg-base-200 shadow-xl">
<div class="card-body text-center py-12">
<MagnifyIcon class="w-24 h-24 mx-auto opacity-30 mb-4" />
<h3 class="text-2xl font-bold mb-2">{$t('recomendations.no_results_yet')}</h3>
<p class="opacity-70">{$t('recomendations.select_location_or_query')}</p>
</div>
</div>
{/if}
</div>


@@ -1,506 +0,0 @@
<script lang="ts">
import type { Collection } from '$lib/types';
import TimezoneSelector from './TimezoneSelector.svelte';
import { t } from 'svelte-i18n';
export let collection: Collection | null = null;
import { updateLocalDate, updateUTCDate, validateDateRange, formatUTCDate } from '$lib/dateUtils';
import { onMount } from 'svelte';
import { isAllDay } from '$lib';
export let type: 'adventure' | 'transportation' | 'lodging' = 'adventure';
// Initialize with browser's timezone
export let selectedStartTimezone: string = Intl.DateTimeFormat().resolvedOptions().timeZone;
export let selectedEndTimezone: string = Intl.DateTimeFormat().resolvedOptions().timeZone;
let allDay: boolean = false;
// Store the UTC dates as source of truth
export let utcStartDate: string | null = null;
export let utcEndDate: string | null = null;
export let note: string | null = null;
type Visit = {
id: string;
start_date: string;
end_date: string;
notes: string;
timezone: string | null;
};
type TransportationVisit = {
id: string;
start_date: string;
end_date: string;
notes: string;
start_timezone: string;
end_timezone: string;
};
export let visits: (Visit | TransportationVisit)[] | null = null;
// Local display values
let localStartDate: string = '';
let localEndDate: string = '';
let fullStartDate: string = '';
let fullEndDate: string = '';
let constrainDates: boolean = false;
let isEditing = false; // Disable reactivity when editing
onMount(async () => {
// Auto-detect all-day for transportation and lodging types
if ((type === 'transportation' || type === 'lodging') && utcStartDate) {
allDay = isAllDay(utcStartDate);
}
// Initialize UTC dates
localStartDate = updateLocalDate({
utcDate: utcStartDate,
timezone: selectedStartTimezone
}).localDate;
localEndDate = updateLocalDate({
utcDate: utcEndDate,
timezone: type === 'transportation' ? selectedEndTimezone : selectedStartTimezone
}).localDate;
if (!selectedStartTimezone) {
selectedStartTimezone = Intl.DateTimeFormat().resolvedOptions().timeZone;
}
if (!selectedEndTimezone) {
selectedEndTimezone = Intl.DateTimeFormat().resolvedOptions().timeZone;
}
});
// Set the full date range for constraining purposes
$: if (collection && collection.start_date && collection.end_date) {
fullStartDate = `${collection.start_date}T00:00`;
fullEndDate = `${collection.end_date}T23:59`;
}
function formatDateInTimezone(utcDate: string, timezone: string): string {
try {
return new Intl.DateTimeFormat(undefined, {
timeZone: timezone,
year: 'numeric',
month: 'short',
day: 'numeric',
hour: '2-digit',
minute: '2-digit',
hour12: true
}).format(new Date(utcDate));
} catch {
return new Date(utcDate).toLocaleString(); // fallback
}
}
// Get constraint dates in the right format based on allDay setting
$: constraintStartDate = allDay
? fullStartDate && fullStartDate.includes('T')
? fullStartDate.split('T')[0]
: ''
: fullStartDate || '';
$: constraintEndDate = allDay
? fullEndDate && fullEndDate.includes('T')
? fullEndDate.split('T')[0]
: ''
: fullEndDate || '';
// Update local display dates whenever timezone or UTC dates change
$: if (!isEditing) {
if (allDay) {
localStartDate = utcStartDate?.substring(0, 10) ?? '';
localEndDate = utcEndDate?.substring(0, 10) ?? '';
} else {
const start = updateLocalDate({
utcDate: utcStartDate,
timezone: selectedStartTimezone
}).localDate;
const end = updateLocalDate({
utcDate: utcEndDate,
timezone: type === 'transportation' ? selectedEndTimezone : selectedStartTimezone
}).localDate;
localStartDate = start;
localEndDate = end;
}
}
// Update UTC dates when local dates change
function handleLocalDateChange() {
utcStartDate = updateUTCDate({
localDate: localStartDate,
timezone: selectedStartTimezone,
allDay
}).utcDate;
utcEndDate = updateUTCDate({
localDate: localEndDate,
timezone: type === 'transportation' ? selectedEndTimezone : selectedStartTimezone,
allDay
}).utcDate;
}
function createVisitObject(): Visit | TransportationVisit {
// Generate a unique ID using built-in methods
const uniqueId = Date.now().toString(36) + Math.random().toString(36).substring(2);
if (type === 'transportation') {
const transportVisit: TransportationVisit = {
id: uniqueId,
start_date: utcStartDate ?? '',
end_date: utcEndDate ?? utcStartDate ?? '',
notes: note ?? '',
start_timezone: selectedStartTimezone,
end_timezone: selectedEndTimezone
};
return transportVisit;
} else {
const regularVisit: Visit = {
id: uniqueId,
start_date: utcStartDate ?? '',
end_date: utcEndDate ?? utcStartDate ?? '',
notes: note ?? '',
timezone: selectedStartTimezone
};
return regularVisit;
}
}
</script>
<div
class="collapse collapse-plus bg-base-200/50 border border-base-300/50 mb-6 rounded-2xl overflow-hidden"
>
<input type="checkbox" />
<div class="collapse-title text-xl font-semibold bg-gradient-to-r from-primary/10 to-primary/5">
<div class="flex items-center gap-3">
<div class="p-2 bg-primary/10 rounded-lg">
<svg class="w-5 h-5 text-primary" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M8 7V3m8 4V3m-9 8h10M5 21h14a2 2 0 002-2V7a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z"
/>
</svg>
</div>
{$t('adventures.date_information')}
</div>
</div>
<div class="collapse-content bg-base-100/50 p-6">
<!-- Settings -->
<div class="card bg-base-100 border border-base-300/50 mb-6">
<div class="card-body p-4">
<h3 class="text-lg font-bold mb-4">Settings</h3>
<div class="space-y-3">
{#if type === 'transportation'}
<div class="grid grid-cols-2 gap-4">
<div>
<label class="label-text text-sm font-medium">Departure Timezone</label>
<div class="mt-1">
<TimezoneSelector bind:selectedTimezone={selectedStartTimezone} />
</div>
</div>
<div>
<label class="label-text text-sm font-medium">Arrival Timezone</label>
<div class="mt-1">
<TimezoneSelector bind:selectedTimezone={selectedEndTimezone} />
</div>
</div>
</div>
{:else}
<div>
<label class="label-text text-sm font-medium">Timezone</label>
<div class="mt-1">
<TimezoneSelector bind:selectedTimezone={selectedStartTimezone} />
</div>
</div>
{/if}
<div class="flex items-center justify-between">
<label class="label-text text-sm font-medium">All Day</label>
<input
type="checkbox"
class="toggle toggle-primary"
bind:checked={allDay}
on:change={() => {
if (allDay) {
localStartDate = localStartDate ? localStartDate.split('T')[0] : '';
localEndDate = localEndDate ? localEndDate.split('T')[0] : '';
} else {
localStartDate = localStartDate ? localStartDate + 'T00:00' : '';
localEndDate = localEndDate ? localEndDate + 'T23:59' : '';
}
utcStartDate = updateUTCDate({
localDate: localStartDate,
timezone: selectedStartTimezone,
allDay
}).utcDate;
utcEndDate = updateUTCDate({
localDate: localEndDate,
timezone: type === 'transportation' ? selectedEndTimezone : selectedStartTimezone,
allDay
}).utcDate;
localStartDate = updateLocalDate({
utcDate: utcStartDate,
timezone: selectedStartTimezone
}).localDate;
localEndDate = updateLocalDate({
utcDate: utcEndDate,
timezone: type === 'transportation' ? selectedEndTimezone : selectedStartTimezone
}).localDate;
}}
/>
</div>
{#if collection?.start_date && collection?.end_date}
<div class="flex items-center justify-between">
<label class="label-text text-sm font-medium">Constrain to Collection Dates</label>
<input
type="checkbox"
class="toggle toggle-primary"
bind:checked={constrainDates}
/>
</div>
{/if}
</div>
</div>
</div>
<!-- Date Selection -->
<div class="card bg-base-100 border border-base-300/50 mb-6">
<div class="card-body p-4">
<h3 class="text-lg font-bold mb-4">Date Selection</h3>
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
<div>
<label class="label-text text-sm font-medium">
{type === 'transportation'
? 'Departure Date'
: type === 'lodging'
? 'Check In'
: 'Start Date'}
</label>
{#if allDay}
<input
type="date"
class="input input-bordered w-full mt-1"
bind:value={localStartDate}
on:change={handleLocalDateChange}
min={constrainDates ? constraintStartDate : ''}
max={constrainDates ? constraintEndDate : ''}
/>
{:else}
<input
type="datetime-local"
class="input input-bordered w-full mt-1"
bind:value={localStartDate}
on:change={handleLocalDateChange}
min={constrainDates ? constraintStartDate : ''}
max={constrainDates ? constraintEndDate : ''}
/>
{/if}
</div>
{#if localStartDate}
<div>
<label class="label-text text-sm font-medium">
{type === 'transportation'
? 'Arrival Date'
: type === 'lodging'
? 'Check Out'
: 'End Date'}
</label>
{#if allDay}
<input
type="date"
class="input input-bordered w-full mt-1"
bind:value={localEndDate}
on:change={handleLocalDateChange}
min={constrainDates ? localStartDate : ''}
max={constrainDates ? constraintEndDate : ''}
/>
{:else}
<input
type="datetime-local"
class="input input-bordered w-full mt-1"
bind:value={localEndDate}
on:change={handleLocalDateChange}
min={constrainDates ? localStartDate : ''}
max={constrainDates ? constraintEndDate : ''}
/>
{/if}
</div>
{/if}
</div>
{#if type === 'adventure'}
<div class="mt-4">
<label class="label-text text-sm font-medium">Notes</label>
<textarea
class="textarea textarea-bordered w-full mt-1"
rows="3"
placeholder="Add notes..."
bind:value={note}
></textarea>
</div>
<div class="flex justify-end mt-4">
<button
class="btn btn-primary btn-sm"
type="button"
on:click={() => {
const newVisit = createVisitObject();
if (visits) {
visits = [...visits, newVisit];
} else {
visits = [newVisit];
}
note = '';
localStartDate = '';
localEndDate = '';
utcStartDate = null;
utcEndDate = null;
}}
>
Add Visit
</button>
</div>
{/if}
</div>
</div>
<!-- Validation -->
{#if !validateDateRange(utcStartDate ?? '', utcEndDate ?? '').valid}
<div class="alert alert-error mb-6">
<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M12 8v4m0 4h.01M21 12a9 9 0 11-18 0 9 9 0 0118 0z"
/>
</svg>
<span>Invalid date range</span>
</div>
{/if}
<!-- Visits List -->
{#if type === 'adventure'}
<div class="card bg-base-100 border border-base-300/50">
<div class="card-body p-4">
<h3 class="text-lg font-bold mb-4">Visits</h3>
{#if visits && visits.length === 0}
<div class="text-center py-8 text-base-content/60">
<p class="text-sm">No visits added yet</p>
</div>
{/if}
{#if visits && visits.length > 0}
<div class="space-y-3">
{#each visits as visit}
<div class="p-3 bg-base-200/50 rounded-lg border border-base-300/30">
<div class="flex items-start justify-between">
<div class="flex-1">
<div class="text-sm font-medium mb-1">
{#if isAllDay(visit.start_date)}
<span class="badge badge-outline badge-sm mr-2">All Day</span>
{visit.start_date && typeof visit.start_date === 'string'
? visit.start_date.split('T')[0]
: ''}
{visit.end_date && typeof visit.end_date === 'string'
? visit.end_date.split('T')[0]
: ''}
{:else if 'start_timezone' in visit}
{formatDateInTimezone(visit.start_date, visit.start_timezone)}
{formatDateInTimezone(visit.end_date, visit.end_timezone)}
{:else if visit.timezone}
{formatDateInTimezone(visit.start_date, visit.timezone)}
{formatDateInTimezone(visit.end_date, visit.timezone)}
{:else}
{new Date(visit.start_date).toLocaleString()}
{new Date(visit.end_date).toLocaleString()}
{/if}
</div>
{#if visit.notes}
<p class="text-xs text-base-content/70 mt-1">"{visit.notes}"</p>
{/if}
</div>
<div class="flex gap-2">
<button
class="btn btn-primary btn-xs"
type="button"
on:click={() => {
isEditing = true;
const isAllDayEvent = isAllDay(visit.start_date);
allDay = isAllDayEvent;
if ('start_timezone' in visit) {
selectedStartTimezone = visit.start_timezone;
selectedEndTimezone = visit.end_timezone;
} else if (visit.timezone) {
selectedStartTimezone = visit.timezone;
}
if (isAllDayEvent) {
localStartDate = visit.start_date.split('T')[0];
localEndDate = visit.end_date.split('T')[0];
} else {
localStartDate = updateLocalDate({
utcDate: visit.start_date,
timezone: selectedStartTimezone
}).localDate;
localEndDate = updateLocalDate({
utcDate: visit.end_date,
timezone:
'end_timezone' in visit ? visit.end_timezone : selectedStartTimezone
}).localDate;
}
if (visits) {
visits = visits.filter((v) => v.id !== visit.id);
}
note = visit.notes;
constrainDates = true;
utcStartDate = visit.start_date;
utcEndDate = visit.end_date;
setTimeout(() => {
isEditing = false;
}, 0);
}}
>
Edit
</button>
<button
class="btn btn-error btn-xs"
type="button"
on:click={() => {
if (visits) {
visits = visits.filter((v) => v.id !== visit.id);
}
}}
>
Remove
</button>
</div>
</div>
</div>
{/each}
</div>
{/if}
</div>
</div>
{/if}
</div>
</div>


@@ -1,284 +0,0 @@
<script lang="ts">
import type { Checklist, Lodging, Note, Transportation } from '$lib/types';
import { deserialize } from '$app/forms';
import { t } from 'svelte-i18n';
import { addToast } from '$lib/toasts';
export let object: Lodging | Transportation;
export let objectType: 'lodging' | 'transportation' | 'note' | 'checklist';
export let isImagesUploading: boolean = false;
let imageInput: HTMLInputElement;
let imageFiles: File[] = [];
function handleImageChange(event: Event) {
const target = event.target as HTMLInputElement;
if (target?.files) {
imageFiles = Array.from(target.files);
console.log('Images selected:', imageFiles.length);
if (object.id) {
// If object exists, upload immediately
uploadImages();
}
}
}
// Watch for external trigger to upload images
$: {
if (isImagesUploading && imageFiles.length > 0 && object.id) {
// Immediately clear the trigger to prevent infinite loop
const filesToUpload = [...imageFiles];
imageFiles = []; // Clear immediately
if (imageInput) {
imageInput.value = '';
}
uploadImagesFromList(filesToUpload);
}
}
async function uploadImages() {
if (imageFiles.length === 0) {
isImagesUploading = false;
return;
}
const filesToUpload = [...imageFiles];
// Clear immediately to prevent re-triggering
imageFiles = [];
if (imageInput) {
imageInput.value = '';
}
await uploadImagesFromList(filesToUpload);
}
async function uploadImagesFromList(files: File[]) {
if (files.length === 0) {
isImagesUploading = false;
return;
}
console.log('Starting image upload for', files.length, 'files');
try {
// Upload all images concurrently
const uploadPromises = files.map((file) => uploadImage(file));
await Promise.all(uploadPromises);
} catch (error) {
console.error('Error uploading images:', error);
addToast('error', $t('adventures.image_upload_error'));
} finally {
isImagesUploading = false;
}
}
async function uploadImage(file: File): Promise<void> {
let formData = new FormData();
formData.append('image', file);
formData.append('object_id', object.id);
formData.append('content_type', objectType);
let res = await fetch(`/locations?/image`, {
method: 'POST',
body: formData
});
if (res.ok) {
let newData = deserialize(await res.text()) as { data: { id: string; image: string } };
let newImage = {
id: newData.data.id,
image: newData.data.image,
is_primary: false,
immich_id: null
};
object.images = [...(object.images || []), newImage];
} else {
throw new Error(`Failed to upload ${file.name}`);
}
}
async function removeImage(id: string) {
let res = await fetch(`/api/images/${id}/image_delete`, {
method: 'POST'
});
if (res.status === 204) {
object.images = object.images.filter((image: { id: string }) => image.id !== id);
addToast('success', $t('adventures.image_removed_success'));
} else {
addToast('error', $t('adventures.image_removed_error'));
}
}
async function makePrimaryImage(image_id: string) {
let res = await fetch(`/api/images/${image_id}/toggle_primary`, {
method: 'POST'
});
if (res.ok) {
object.images = object.images.map((image) => {
if (image.id === image_id) {
return { ...image, is_primary: true };
} else {
return { ...image, is_primary: false };
}
});
} else {
console.error('Error in makePrimaryImage:', res);
}
}
// Export function to check if images are ready to upload
export function hasImagesToUpload(): boolean {
return imageFiles.length > 0;
}
</script>
<div
class="collapse collapse-plus bg-base-200/50 border border-base-300/50 mb-6 rounded-2xl overflow-hidden"
>
<input type="checkbox" />
<div class="collapse-title text-xl font-semibold bg-gradient-to-r from-primary/10 to-primary/5">
<div class="flex items-center gap-3">
<div class="p-2 bg-primary/10 rounded-lg">
<svg class="w-5 h-5 text-primary" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M4 16l4.586-4.586a2 2 0 012.828 0L16 16m-2-2l1.586-1.586a2 2 0 012.828 0L20 14m-6-6h.01M6 20h12a2 2 0 002-2V6a2 2 0 00-2-2H6a2 2 0 00-2 2v12a2 2 0 002 2z"
/>
</svg>
</div>
{$t('adventures.images')}
{#if isImagesUploading}
<span class="loading loading-spinner loading-sm text-primary"></span>
{/if}
</div>
</div>
<div class="collapse-content bg-base-100/50 pt-4 p-6">
<div class="form-control">
<label class="label" for="image">
<span class="label-text font-medium">{$t('adventures.upload_image')}</span>
</label>
<input
type="file"
id="image"
name="image"
accept="image/*"
multiple
bind:this={imageInput}
on:change={handleImageChange}
class="file-input file-input-bordered file-input-primary w-full bg-base-100/80 focus:bg-base-100"
disabled={isImagesUploading}
/>
</div>
{#if imageFiles.length > 0 && !object.id}
<div class="mt-4">
<h4 class="font-semibold text-base-content mb-2">
{$t('adventures.selected_images')} ({imageFiles.length})
</h4>
<div class="alert alert-info">
<svg
xmlns="http://www.w3.org/2000/svg"
fill="none"
viewBox="0 0 24 24"
class="stroke-current shrink-0 w-6 h-6"
><path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M13 16h-1v-4h-1m1-4h.01M21 12a9 9 0 11-18 0 9 9 0 0118 0z"
></path></svg
>
<span>{$t('adventures.image_upload_info')} {objectType}</span>
</div>
<ul class="list-disc pl-5 space-y-1 mt-2">
{#each imageFiles as file}
<li>{file.name} ({Math.round(file.size / 1024)} KB)</li>
{/each}
</ul>
</div>
{/if}
{#if object.id}
<div class="divider my-6"></div>
<!-- Current Images -->
<div class="space-y-4">
<h4 class="font-semibold text-lg">{$t('adventures.my_images')}</h4>
{#if object.images && object.images.length > 0}
<div class="grid gap-4 sm:grid-cols-2 md:grid-cols-3 lg:grid-cols-4">
{#each object.images as image}
<div class="relative group">
<div class="aspect-square overflow-hidden rounded-lg bg-base-300">
<img
src={image.image}
alt={image.id}
class="w-full h-full object-cover transition-transform group-hover:scale-105"
/>
</div>
<!-- Image Controls -->
<div
class="absolute inset-0 bg-black/50 opacity-0 group-hover:opacity-100 transition-opacity rounded-lg flex items-center justify-center gap-2"
>
{#if !image.is_primary}
<button
type="button"
class="btn btn-success btn-sm"
on:click={() => makePrimaryImage(image.id)}
title="Make Primary"
>
<svg class="h-4 w-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M11.049 2.927c.3-.921 1.603-.921 1.902 0l1.519 4.674a1 1 0 00.95.69h4.915c.969 0 1.371 1.24.588 1.81l-3.976 2.888a1 1 0 00-.363 1.118l1.518 4.674c.3.922-.755 1.688-1.538 1.118l-3.976-2.888a1 1 0 00-1.176 0l-3.976 2.888c-.783.57-1.838-.197-1.538-1.118l1.518-4.674a1 1 0 00-.363-1.118l-3.976-2.888c-.784-.57-.38-1.81.588-1.81h4.914a1 1 0 00.951-.69l1.519-4.674z"
></path>
</svg>
</button>
{/if}
<button
type="button"
class="btn btn-error btn-sm"
on:click={() => removeImage(image.id)}
title="Remove"
>
<svg class="h-4 w-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16"
></path>
</svg>
</button>
</div>
<!-- Primary Badge -->
{#if image.is_primary}
<div
class="absolute top-2 left-2 bg-warning text-warning-content rounded-full p-1"
>
<svg class="h-4 w-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M5 3l14 9-14 9V3z"
></path>
</svg>
</div>
{/if}
</div>
{/each}
</div>
{:else}
<div class="text-center py-8">
<div class="text-base-content/60 text-lg mb-2">
{$t('adventures.no_images')}
</div>
<p class="text-sm text-base-content/40">{$t('adventures.no_images_desc')}</p>
</div>
{/if}
</div>
{/if}
</div>
</div>


@@ -0,0 +1,545 @@
<script lang="ts">
import type { ContentImage } from '$lib/types';
import { createEventDispatcher, onMount } from 'svelte';
import { t } from 'svelte-i18n';
import { deserialize } from '$app/forms';
// Icons
import Star from '~icons/mdi/star';
import Crown from '~icons/mdi/crown';
import TrashIcon from '~icons/mdi/delete';
import CheckIcon from '~icons/mdi/check';
import CloseIcon from '~icons/mdi/close';
import ImageIcon from '~icons/mdi/image';
import { addToast } from '$lib/toasts';
import ImmichSelect from './ImmichSelect.svelte';
// Props
export let images: ContentImage[] = [];
export let objectId: string = '';
export let contentType: string = 'location'; // 'location', 'adventure', 'collection', etc.
export let defaultSearchTerm: string = '';
export let immichIntegration: boolean = false;
export let copyImmichLocally: boolean = false;
// Component state
let fileInput: HTMLInputElement;
let url: string = '';
let imageSearch: string = defaultSearchTerm;
let imageError: string = '';
let wikiImageError: string = '';
let isLoading: boolean = false;
// Wikipedia image selection
let wikiImageResults: Array<{
source: string;
width: number;
height: number;
title: string;
type: string;
}> = [];
const dispatch = createEventDispatcher<{
imagesUpdated: ContentImage[];
}>();
// Helper functions
function createImageFromData(data: {
id: string;
image: string;
immich_id?: string | null;
}): ContentImage {
return {
id: data.id,
image: data.image,
is_primary: false,
immich_id: data.immich_id || null
};
}
function updateImagesList(newImage: ContentImage) {
images = [...images, newImage];
dispatch('imagesUpdated', images);
}
// API calls
async function uploadImageToServer(file: File) {
const formData = new FormData();
formData.append('image', file);
formData.append('object_id', objectId);
formData.append('content_type', contentType);
try {
const res = await fetch(`/locations?/image`, {
method: 'POST',
credentials: 'same-origin',
body: formData
});
if (res.ok) {
const newData = deserialize(await res.text()) as { data: { id: string; image: string } };
return createImageFromData(newData.data);
} else {
throw new Error('Upload failed');
}
} catch (error) {
console.error('Upload error:', error);
return null;
}
}
// Import temporary recommendation images (id starting with 'rec-') once objectId is available
export let importInProgress: boolean = false;
async function importPrefilledImagesIfNeeded() {
if (importInProgress) return;
if (!objectId || !images || images.length === 0) return;
const prefilled = images.filter((img) => img.id && img.id.startsWith('rec-'));
if (prefilled.length === 0) return;
importInProgress = true;
for (const img of prefilled) {
try {
const res = await fetch(img.image);
if (!res.ok) throw new Error('Failed to fetch image');
const blob = await res.blob();
const file = new File([blob], 'image.jpg', { type: blob.type || 'image/jpeg' });
const newImage = await uploadImageToServer(file);
if (newImage) {
images = images.map((i) => (i.id === img.id ? newImage : i));
dispatch('imagesUpdated', images);
addToast('success', $t('adventures.image_upload_success'));
} else {
throw new Error('Upload failed');
}
} catch (err) {
console.error('Error importing prefilled image:', err);
addToast('error', $t('adventures.image_upload_error'));
}
}
importInProgress = false;
}
onMount(() => {
importPrefilledImagesIfNeeded();
});
// React to objectId becoming available later
$: if (objectId) {
importPrefilledImagesIfNeeded();
}
async function fetchImageFromUrl(imageUrl: string): Promise<Blob | null> {
try {
const res = await fetch(imageUrl);
if (!res.ok) throw new Error('Failed to fetch image');
return await res.blob();
} catch (error) {
console.error('Fetch error:', error);
return null;
}
}
// Image event handlers
async function handleMultipleFiles(event: Event) {
const files = (event.target as HTMLInputElement).files;
if (!files) return;
isLoading = true;
imageError = '';
try {
for (const file of Array.from(files)) {
const newImage = await uploadImageToServer(file);
if (newImage) {
updateImagesList(newImage);
}
}
addToast('success', $t('adventures.image_upload_success'));
} catch (error) {
addToast('error', $t('adventures.image_upload_error'));
imageError = $t('adventures.image_upload_error');
} finally {
isLoading = false;
if (fileInput) fileInput.value = '';
}
}
async function handleUrlUpload() {
if (!url.trim()) return;
isLoading = true;
imageError = '';
try {
const blob = await fetchImageFromUrl(url);
if (!blob) {
imageError = $t('adventures.no_image_url');
return;
}
const file = new File([blob], 'image.jpg', { type: blob.type || 'image/jpeg' });
const newImage = await uploadImageToServer(file);
if (newImage) {
updateImagesList(newImage);
addToast('success', $t('adventures.image_upload_success'));
url = '';
} else {
throw new Error('Upload failed');
}
} catch (error) {
imageError = $t('adventures.image_fetch_failed');
addToast('error', $t('adventures.image_upload_error'));
} finally {
isLoading = false;
}
}
async function handleWikiImageSearch() {
if (!imageSearch.trim()) return;
isLoading = true;
wikiImageError = '';
try {
const res = await fetch(`/api/generate/img/?name=${encodeURIComponent(imageSearch)}`);
// Check the status before parsing so a non-JSON error body doesn't throw in res.json()
if (!res.ok) {
wikiImageError = $t('adventures.image_fetch_failed');
return;
}
const data = await res.json();
if (!data.images || data.images.length === 0) {
wikiImageError = $t('adventures.image_fetch_failed');
return;
}
// Store results to display inline, deduplicated by source URL
const seen = new Set<string>();
wikiImageResults = (data.images || []).filter((img: { source?: string }) => {
if (!img?.source || seen.has(img.source)) return false;
seen.add(img.source);
return true;
});
} catch (error) {
wikiImageError = $t('adventures.wiki_image_error');
addToast('error', $t('adventures.image_upload_error'));
} finally {
isLoading = false;
}
}
async function selectWikiImage(imageUrl: string) {
isLoading = true;
try {
const blob = await fetchImageFromUrl(imageUrl);
if (!blob) {
wikiImageError = $t('adventures.image_fetch_failed');
return; // the finally block resets isLoading
}
const file = new File([blob], `${imageSearch}.jpg`, { type: blob.type || 'image/jpeg' });
const newImage = await uploadImageToServer(file);
if (newImage) {
updateImagesList(newImage);
addToast('success', $t('adventures.image_upload_success'));
// Keep results open to allow adding multiple images
} else {
throw new Error('Upload failed');
}
} catch (error) {
wikiImageError = $t('adventures.wiki_image_error');
addToast('error', $t('adventures.image_upload_error'));
} finally {
isLoading = false;
}
}
async function makePrimaryImage(imageId: string) {
try {
const res = await fetch(`/api/images/${imageId}/toggle_primary`, {
method: 'POST'
});
if (res.ok) {
images = images.map((image) => ({
...image,
is_primary: image.id === imageId
}));
dispatch('imagesUpdated', images);
addToast('success', 'Primary image updated');
} else {
throw new Error('Failed to update primary image');
}
} catch (error) {
console.error('Error in makePrimaryImage:', error);
addToast('error', 'Failed to update primary image');
}
}
async function removeImage(imageId: string) {
try {
const res = await fetch(`/api/images/${imageId}/image_delete`, {
method: 'POST'
});
if (res.status === 204) {
images = images.filter((image) => image.id !== imageId);
dispatch('imagesUpdated', images);
addToast('success', 'Image removed');
} else {
throw new Error('Failed to remove image');
}
} catch (error) {
console.error('Error removing image:', error);
addToast('error', 'Failed to remove image');
}
}
function handleImmichImageSaved(event: CustomEvent) {
const newImage = createImageFromData(event.detail);
updateImagesList(newImage);
addToast('success', $t('adventures.image_upload_success'));
}
// Watch for defaultSearchTerm changes
$: if (defaultSearchTerm && !imageSearch) {
imageSearch = defaultSearchTerm;
}
</script>
<div class="card bg-base-100 border border-base-300 shadow-lg">
<div class="card-body p-6">
<div class="flex items-center gap-3 mb-6">
<div class="p-2 bg-primary/10 rounded-lg">
<ImageIcon class="w-5 h-5 text-primary" />
</div>
<h2 class="text-xl font-bold">{$t('adventures.image_management')}</h2>
</div>
<!-- Upload Options Grid -->
<div class="grid gap-4 lg:grid-cols-2 mb-6">
<!-- File Upload -->
<div class="bg-base-50 p-4 rounded-lg border border-base-200">
<h4 class="font-medium mb-3 text-base-content/80">
{$t('adventures.upload_from_device')}
</h4>
<input
type="file"
bind:this={fileInput}
class="file-input file-input-bordered w-full"
accept="image/*"
multiple
disabled={isLoading}
on:change={handleMultipleFiles}
/>
</div>
<!-- URL Upload -->
<div class="bg-base-50 p-4 rounded-lg border border-base-200">
<h4 class="font-medium mb-3 text-base-content/80">
{$t('adventures.upload_from_url')}
</h4>
<div class="flex gap-2">
<input
type="url"
bind:value={url}
class="input input-bordered flex-1"
placeholder="https://example.com/image.jpg"
disabled={isLoading}
/>
<button
class="btn btn-primary btn-sm"
class:loading={isLoading}
disabled={isLoading || !url.trim()}
on:click={handleUrlUpload}
>
{$t('adventures.fetch_image')}
</button>
</div>
{#if imageError}
<div class="alert alert-error mt-2 py-2">
<span class="text-sm">{imageError}</span>
</div>
{/if}
</div>
<!-- Wikipedia Search -->
<div class="bg-base-50 p-4 rounded-lg border border-base-200">
<h4 class="font-medium mb-3 text-base-content/80">
{$t('adventures.wikipedia')}
</h4>
<div class="flex gap-2">
<input
type="text"
bind:value={imageSearch}
class="input input-bordered flex-1"
placeholder="Search Wikipedia for images"
disabled={isLoading}
/>
<button
class="btn btn-primary btn-sm"
class:loading={isLoading}
disabled={isLoading || !imageSearch.trim()}
on:click={handleWikiImageSearch}
>
{$t('navbar.search')}
</button>
</div>
{#if wikiImageError}
<div class="alert alert-error mt-2 py-2">
<span class="text-sm">{wikiImageError}</span>
</div>
{/if}
<!-- Wikipedia Image Results (Inside Box) -->
{#if wikiImageResults.length > 0}
<div class="mt-4">
<div class="flex items-center justify-between mb-3">
<span class="text-sm text-base-content/70">
{$t('adventures.wiki_results_found', {
values: { count: wikiImageResults.length, query: imageSearch }
})}
</span>
<button
class="btn btn-ghost btn-xs"
on:click={() => {
wikiImageResults = [];
imageSearch = defaultSearchTerm;
}}
>
<CloseIcon class="h-4 w-4" />
</button>
</div>
<div class="grid grid-cols-2 sm:grid-cols-3 gap-2 max-h-96 overflow-y-auto">
{#each wikiImageResults as result, i (result.source + '-' + i)}
<button
type="button"
class="card bg-base-100 border border-base-300 hover:border-primary hover:shadow-lg transition-all duration-200 cursor-pointer group"
on:click={() => selectWikiImage(result.source)}
disabled={isLoading}
>
<figure class="aspect-square bg-base-200 overflow-hidden">
<img
src={result.source}
alt={result.title}
class="w-full h-full object-cover transition-transform group-hover:scale-105"
loading="lazy"
/>
</figure>
<div class="card-body p-2">
<h4 class="text-xs font-medium line-clamp-1 text-left" title={result.title}>
{result.title}
</h4>
<div
class="text-xs text-base-content/60 flex items-center justify-between gap-1"
>
<span class="truncate">{result.width} × {result.height}</span>
</div>
</div>
<div
class="absolute inset-0 bg-primary/10 opacity-0 group-hover:opacity-100 transition-opacity duration-200 flex items-center justify-center rounded-2xl"
>
<div class="btn btn-primary btn-sm gap-2">
<CheckIcon class="h-4 w-4" />
{$t('adventures.select')}
</div>
</div>
</button>
{/each}
</div>
</div>
{/if}
</div>
<!-- Immich Integration -->
{#if immichIntegration}
<div class="bg-base-50 p-4 rounded-lg border border-base-200">
<h4 class="font-medium mb-3 text-base-content/80">
{$t('immich.immich')}
</h4>
<ImmichSelect
{objectId}
{contentType}
{copyImmichLocally}
on:fetchImage={(e) => {
url = e.detail;
handleUrlUpload();
}}
on:remoteImmichSaved={handleImmichImageSaved}
/>
</div>
{/if}
</div>
<!-- Image Gallery -->
{#if images.length > 0}
<div class="divider">Current Images</div>
<div class="grid gap-4 sm:grid-cols-2 md:grid-cols-3 lg:grid-cols-4">
{#each images as image, i (image.id ?? image.image ?? `img-${i}`)}
<div class="relative group">
<div
class="aspect-square overflow-hidden rounded-lg bg-base-200 border border-base-300"
>
<img
src={image.image}
alt="Uploaded content"
class="w-full h-full object-cover transition-transform group-hover:scale-105"
loading="lazy"
/>
</div>
<!-- Image Controls Overlay -->
<div
class="absolute inset-0 bg-black/60 opacity-0 group-hover:opacity-100 transition-all duration-200 rounded-lg flex items-center justify-center gap-2"
>
{#if !image.is_primary}
<button
type="button"
class="btn btn-success btn-sm tooltip tooltip-top"
data-tip="Make Primary"
on:click={() => image.id && makePrimaryImage(image.id)}
disabled={!image.id}
>
<Star class="h-4 w-4" />
</button>
{/if}
<button
type="button"
class="btn btn-error btn-sm tooltip tooltip-top"
data-tip="Remove Image"
on:click={() => image.id && removeImage(image.id)}
disabled={!image.id}
>
<TrashIcon class="h-4 w-4" />
</button>
</div>
<!-- Primary Badge -->
{#if image.is_primary}
<div
class="absolute top-2 left-2 bg-warning text-warning-content rounded-full p-1 shadow-lg"
>
<Crown class="h-4 w-4" />
</div>
{/if}
</div>
{/each}
</div>
{:else}
<div class="bg-base-200/50 rounded-lg p-8 text-center">
<div class="text-base-content/60 mb-2">{$t('adventures.no_images_uploaded_yet')}</div>
<div class="text-sm text-base-content/40">
{$t('adventures.upload_first_image')}
</div>
</div>
{/if}
</div>
</div>
