* Fixes [REQUEST] Email-based auto-linking for OIDC (fixes #921)
* Add ClusterMap integration for regions and cities with fit-to-bounds functionality
* Update COUNTRY_REGION_JSON_VERSION to v3.0 and modify state ID generation to use the ISO2 code
* fix: handle email verification required case during signup
  Updated the signup action to return a specific message when the backend responds with a 401 status, indicating that the signup succeeded but email verification is required. This allows the frontend to display the appropriate message using an i18n key.
* feat: add Advanced Configuration documentation with optional environment variables
* Fixes #511
* fix: update appVersion to v0.11.0-main-121425 and enhance socialProviders handling in settings page
* feat: implement social signup controls and update documentation for new environment variables
* fix: update LocationCard props and enhance restore data functionality
  - Changed the user prop to null in the LocationCard component on the dashboard page.
  - Added an isRestoring state to manage the loading state during data restoration in settings.
  - Updated the restore button to show a loading spinner while a restore operation is in progress.
* fix: update appVersion to v0.12.0-pre-dev-121625
* feat: implement itinerary planning feature with CollectionItineraryPlanner component and related updates
* feat: add overnight lodging indicator and functionality to CollectionItineraryPlanner
* feat: add compact display option to LocationCard and enhance lodging filtering in CollectionItineraryPlanner
* feat(itinerary): add itinerary management features and link modal
  - Introduced ItineraryViewSet for managing itinerary items with create and reorder functionality.
  - Added itinerary linking capabilities to the CollectionModal and CollectionItineraryPlanner components.
  - Implemented a new ItineraryLinkModal for linking existing items to specific dates.
  - Enhanced the frontend with new modals for creating locations, lodging, transportation, notes, and checklists.
  - Updated the backend to handle itinerary item creation and reordering with appropriate permissions.
  - Improved data handling for unscheduled items and their association with the itinerary.
  - Added new frontend dependencies for the enhanced functionality.
* feat(itinerary): implement auto-generate functionality for itinerary items based on dated records
* feat(collection): enhance collection sharing logic and improve data handling on invite acceptance
* fix: update appVersion to the correct pre-dev version
* feat(wikipedia): implement image selection from Wikipedia with enhanced results display
* Refactor code structure for improved readability and maintainability
* feat: add CollectionRecommendationView component for displaying location recommendations
  - Implemented CollectionRecommendationView.svelte to handle location recommendations based on user input and selected categories.
  - Added Recommendation and RecommendationResponse types to types.ts for better type safety and structure.
  - Updated collections/[id]/+page.svelte to include a new view for recommendations, allowing users to switch between views seamlessly.
* fix: update appVersion and improve button accessibility in collection views
* feat: add canModify prop to collection components for user permission handling
* feat: add itinerary removal functionality to various cards and update UI components
  - Implemented a `removeFromItinerary` function in `LodgingCard`, `NoteCard`, and `TransportationCard` so users can remove items from their itinerary.
  - Replaced the trash icon with a calendar-remove icon in `LocationCard`, `LodgingCard`, `NoteCard`, and `TransportationCard` for better visual representation.
  - Updated the dropdown menus in `LodgingCard`, `NoteCard`, and `TransportationCard` to include the new remove-from-itinerary option.
  - Enhanced `CollectionItineraryPlanner` to pass itinerary items to the respective cards.
  - Removed `PointSelectionModal.svelte` as it is no longer needed.
  - Refactored `LocationMedia.svelte` to integrate the `ImageManagement` component and clean up unused image-handling code.
* feat: enhance itinerary management with deduplication and initial visit date handling
* feat: add FullMap component for enhanced map functionality with clustering support
  - Introduced FullMap.svelte to handle map rendering, clustering, and marker management.
  - Updated the map page to use the FullMap component, replacing direct MapLibre usage.
  - Implemented clustering options and marker-property handling in FullMap.
  - Added utility functions for resolving theme colors and managing marker states.
  - Enhanced the user experience with hover popups and improved loading states for location details.
  - Updated app version to v0.12.0-pre-dev-122225.
* feat: enhance map interaction for touch devices with custom popup handling
* feat: add progress tracker for folder views to display visited and planned locations
* feat: add map center and zoom state management with URL synchronization
* feat: add status and days-until-start fields to collections with filtering options
* Component folder structure changes
* feat: add LodgingMedia and LodgingModal components for managing lodging details and media attachments; implement LocationSearchMap component for interactive location searching and mapping
* fix: update contentType in ImageManagement component to 'lodging' for correct media handling
* feat: enhance lodging management with date validation and update messages
* feat: implement lodging detail page with server-side loading and image modal functionality
  - Added a new server-side load function to fetch lodging details by ID.
  - Created a new Svelte component for the lodging detail page, including an image carousel and map integration.
  - Implemented a modal for displaying images with navigation.
  - Enhanced URL handling on the locations page to only read parameters.
* feat: add Transportation modal component and related routes
  - Implemented the TransportationModal component for creating and editing transportation entries.
  - Added server-side loading for transportation details in the new route [id]/+page.server.ts.
  - Created a new Svelte page for displaying transportation details with image and attachment handling.
  - Integrated a modal for editing transportation on the transportation details page.
  - Updated lodging routes to include a modal for editing lodging entries.
  - Removed the unused delete action from the lodging server-side logic.
* feat: add start_code and end_code fields to Transportation model and update related components
* feat: implement date validation for itinerary items and add a day picker modal for scheduling
* Reorder town and county checks in geocoding.py: fix detection when only a town exists for a location but the county name is not a city name
* Use address keys only if a city is found
* Make sure reverse geocoding uses correct key for cities (#938)
  - Reorder town and county checks in geocoding.py: fix detection when only a town exists for a location but the county name is not a city name.
  - Use address keys only if a city is found.
* Refactor code structure for improved readability and maintainability
* Enhance collection management with modal updates and item handling
* feat: integrate CollectionMap component in collections page and update map titles in lodging and transportation pages
  - Replaced the inline map implementation with the CollectionMap component in collections/[id]/+page.svelte for better modularity.
  - Updated the map title in lodging/[id]/+page.svelte to reflect the lodging context.
  - Updated the map title in transportations/[id]/+page.svelte to reflect the transportation context.
  - Added functionality to collect and render GeoJSON data from transportation attachments in transportations/[id]/+page.svelte.
* chore: update copyright year to 2026 in various files
* feat: enhance backup export functionality with itinerary items and export IDs
* fix: improve dropdown close behavior by handling multiple event types
* fix: remove unnecessary cache decorator from globespin function
* feat: add initial visit date support in ChecklistModal and NoteModal, with UI suggestions for prefilled dates
* feat: add details view for checklist and note cards with edit functionality
* feat: add travel duration and GPX distance calculation to Transportation model and UI
* feat: add primary image support to Collection model, serializers, and UI components
* Refactor calendar components and enhance event detail handling
  - Replaced the direct calendar implementation with a reusable CalendarComponent in the calendar route.
  - Introduced EventDetailsModal for displaying event details, improving modularity and readability.
  - Added functionality to fetch event details asynchronously when an event is clicked.
  - Implemented ICS calendar download functionality with loading-state management.
  - Enhanced the collections page to support a calendar view, integrating event handling and timezone management.
  - Improved the lodging and transportation pages to display local time for stays and trips, including timezone badges.
  - Cleaned up unused code and comments for better maintainability.
* feat: enhance hero image handling in collection view by prioritizing the primary image
* chore: update .env.example to include account email verification configuration
* feat: enhance LodgingCard and TransportationCard components with expandable details and improved layout
* feat: add price and currency fields to locations, lodging, and transportation components
  - Introduced price and price_currency fields in the LocationModal, LodgingDetails, LodgingModal, TransportationDetails, and TransportationModal components.
  - Implemented MoneyInput and CurrencyDropdown components for handling monetary values and currency selection.
  - Updated data structures and types to accommodate the new price and currency fields across various models.
  - Enhanced cost summary calculations in collections and routes to display total costs by currency.
  - Added a user preference for default currency in settings, affecting new item forms.
  - Updated the UI to display price information in relevant components, ensuring consistent formatting and user experience.
* feat: add Development Timeline link to overview and create timeline documentation
* feat: enhance map functionality with search and zoom features
  - Updated availableViews on the collection page to include a map view based on lodging and transportation locations.
  - Added search functionality to the map page, allowing users to filter pins by name and category.
  - Implemented an auto-zoom feature to adjust the map view based on filtered search results.
  - Introduced a search bar with a clear button for a better user experience.
* feat: enhance ISO code extraction and region matching logic in extractIsoCode function
* feat: enhance extractIsoCode function with normalization for locality matching
* feat: update extractIsoCode function to include additional ISO3166 levels for improved region matching
* feat: enhance extractIsoCode function to handle cases without city information and update CollectionMap to bind user data
* feat: add cron job for syncing visited regions and cities; enhance Docker and supervisord configurations
* feat: add CollectionItineraryDay model and related functionality for itinerary day metadata management
* feat: implement cleanup of out-of-range itinerary items and notify users of potential impacts on the itinerary when dates change
* Refactor collection page for improved localization and code clarity
  - Removed unused imports and consolidated cost category labels to be reactive.
  - Updated the cost summary function to accept localized labels.
  - Enhanced localization for various UI elements, including buttons, headings, and statistics.
  - Improved user feedback messages for better clarity and consistency.
  - Ensured all relevant text is translatable using the i18n library.
* feat: add collaborator serialization and display in collections
  - Implemented `_build_profile_pic_url` and `_serialize_collaborator` functions for user profile picture URLs and serialization.
  - Updated `CollectionSerializer` and `UltraSlimCollectionSerializer` to include collaborators in the serialized output.
  - Enhanced `CollectionViewSet` to prefetch shared_with users for optimized queries.
  - Modified frontend components to display collaborators in collection details, including profile pictures and initials.
  - Added new localization strings for collaborators.
  - Refactored map and location components to improve usability and functionality.
  - Updated the app version to reflect the new changes.
* feat: add dynamic lodging icons based on type in CollectionMap component
* feat: add CollectionStats component for detailed trip statistics
  - Implemented CollectionStats.svelte to display statistics related to the collection, including distances, activities, and locations visited.
  - Enhanced CollectionMap.svelte to filter activities by date range using a new getActivityDate function.
  - Updated LocationSearchMap.svelte to handle airport mode for start and end locations.
  - Modified types.ts to include an is_global property in CollectionItineraryItem for trip-wide items.
  - Updated +page.svelte to integrate the new stats view and manage view state accordingly.
* feat: enhance itinerary management by removing old items on date change for notes and checklists; normalize date handling in CollectionMap
* feat: add functionality to change day and move items to trip-wide itinerary
  - Implemented a changeDay function in the ChecklistCard, LocationCard, LodgingCard, NoteCard, and TransportationCard components to allow users to change the scheduled day of items.
  - Added a button to move items to the global (trip-wide) itinerary in the aforementioned components, with appropriate dispatch events.
  - Enhanced CollectionItineraryPlanner to handle moving items to the global itinerary and added UI elements for unscheduled items.
  - Updated ItineraryDayPickModal to support deleting source visits when moving locations.
  - Added new translations for "Change Day" and "Move Trip Wide" in the English locale.
* fix: specify full path for python3 in cron job and add shell and path variables
* fix: update appVersion to v0.12.0-pre-dev-010726
* feat: enhance CollectionItineraryPlanner and CollectionStats with dynamic links and transport type normalization
* Add Dev Container + WSL install docs and link in install guide (#944) (#951)
* feat: enhance internationalization support in CollectionMap and CollectionStats components
  - Added translation support for various labels and messages in CollectionMap.svelte and CollectionStats.svelte using svelte-i18n.
  - Updated the English and Chinese locale files with new translation keys for an improved user experience.
  - Simplified the rendering of recommendation views on the collections page.
* Refactor itinerary management and UI components
  - Updated ItineraryViewSet to handle visit updates and creations more efficiently, preserving visit IDs when moving between days.
  - Enhanced ChecklistCard, LodgingCard, TransportationCard, and NoteCard to include a new "Change Day" option in the actions menu.
  - Improved the user experience in CollectionItineraryPlanner by tracking the specific itinerary items being moved and ensuring only the relevant entries are deleted.
  - Added new location sharing options in LodgingCard and TransportationCard for Apple Maps, Google Maps, and OpenStreetMap.
  - Updated translations in en.json for consistency and clarity.
  - Minor UI adjustments for better accessibility and usability across various components.
* feat: implement action menus and close event handling in card components
* feat: refactor Dockerfile and supervisord configuration to remove cron and add a periodic sync script
* feat: enhance LocationSearchMap and TransportationDetails components with initialization handling and airport mode logic
* feat: add airport and location search mode labels to localization file
* feat: enhance periodic sync logging and improve airport mode handling in LocationSearchMap
* feat: enhance unscheduled items display with improved card interactions and accessibility
* Add dev compose for hot reload and update WSL dev container docs (#958)
* feat: enhance localization for itinerary linking and transportation components
* Localization: update localization files with new keys and values
* fix: improve error messages for Overpass API responses
* chore: update dependencies in frontend package.json and pnpm-lock.yaml
  - Updated @sveltejs/adapter-node from ^5.2.12 to ^5.4.0
  - Updated @sveltejs/adapter-vercel from ^5.7.0 to ^6.3.0
  - Updated tailwindcss from ^3.4.17 to ^3.4.19
  - Updated typescript from ^5.8.3 to ^5.9.3
  - Updated vite from ^5.4.19 to ^5.4.21
* chore: update dependencies in pnpm-lock.yaml to latest versions
* Refactor code structure for improved readability and maintainability
* Refactor code structure for improved readability and maintainability
* fix: update package dependencies to resolve compatibility issues
* Add "worldtravel" translations to multiple locale files
  - Added a "worldtravel" key with translations for Spanish, French, Hungarian, Italian, Japanese, Korean, Dutch, Norwegian, Polish, Brazilian Portuguese, Russian, Slovak, Swedish, Turkish, Ukrainian, and Chinese.
  - Updated the navigation section in each locale file to include the new "worldtravel" entry.
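In the spirit of the Overpass error-message fix above, the improvement can be sketched as a lookup from common Overpass API failure statuses to clearer user-facing text. The exact wording and the set of handled statuses here are assumptions, not the project's actual messages.

```python
# Illustrative sketch only: friendlier messages for Overpass failures.
OVERPASS_ERROR_MESSAGES = {
    400: "The Overpass query was malformed and was rejected.",
    429: "Overpass rate limit reached; please try again in a moment.",
    504: "The Overpass server timed out; the query may cover too large an area.",
}

def overpass_error_message(status_code: int) -> str:
    """Return a user-facing message for an Overpass HTTP status code."""
    return OVERPASS_ERROR_MESSAGES.get(
        status_code,
        f"Overpass request failed with HTTP status {status_code}.",
    )
```

A generic fallback keeps unexpected statuses from surfacing as raw exceptions while still exposing the status code for debugging.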
* Add new screenshots and update email verification message in locale file
* feat: Implement data restoration functionality with file import
  - Added a new action `restoreData` in `+page.server.ts` to handle file uploads for restoring collections.
  - Enhanced the UI in `+page.svelte` with an import button and a modal for import progress.
  - Integrated file input handling to trigger form submission upon file selection.
  - Removed unused GSAP animations from the login, profile, and signup pages for cleaner code.
* feat: Add modals for creating locations and lodging from recommendations; enhance image import functionality
* fix: Adjust styles to prevent horizontal scroll and enhance floating action button visibility
* feat: Enhance error handling and messaging for Google Maps and OpenStreetMap geocoding functions
* fix: Enhance error messaging for Google Maps access-forbidden response
* feat: Add User-Agent header to Google Maps API requests and refine error messaging for the access-forbidden response
* fix: Update User-Agent header in Google Maps API requests for improved compatibility
* fix: Disable proxy settings in Google Maps API request to prevent connection issues
* fix: Update Trivy security scan configuration and add .trivyignore for known false positives
* fix: Refactor update method to handle is_public cascading for related items
* feat: Integrate django-invitations for user invitation management and update settings
* feat: Add Tailwind CSS and DaisyUI plugin for styling
* feat: Add Tailwind CSS and DaisyUI plugin for styling
* feat: Add "Invite a User" guide and update navigation links
* docs: Update "Invite a User" guide to include email configuration tip
* feat: Update email invitation template for improved styling and clarity
* fix: Remove trailing backslash from installation note in Unraid documentation
* feat: Add export/import messages and user email verification prompts in multiple languages
* Squashed Weblate translation commits for AdventureLog/Web App:
  - a993a15b93ebb7521ae2e5cc31596b98b29fcd6c, Alex <div@alexe.at>, Mon Jan 12 20:44:47 2026 +0100: German, 100.0% (1048 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/de/
  - fdc455d9424fbb0f6b72179d9eb1340411700773, Ettore Atalan <atalanttore@googlemail.com>, Sat Jan 10 23:24:23 2026 +0100: German, 100.0% (1048 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/de/
  - 5942129c55e89dd999a13d4df9c40e6e3189355c, Orhun <orhunavcu@gmail.com>, Sun Jan 11 13:05:31 2026 +0100: Turkish, 100.0% (1048 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/tr/
  - 8712e43d8ba4a7e7fe163fb454d6577187f9a375, Henrique Fonseca Veloso <henriquefv@tutamail.com>, Fri Jan 9 22:53:11 2026 +0100: Portuguese (Brazil), 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/pt_BR/
  - 18ee56653470413afe8d71ecd2b5028f6e4cf118, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:57 2026 +0100: Dutch, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/nl/
  - 57783c544e583c035c8b57b5c10ca320f25f399e, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:14 2026 +0100: Arabic, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/ar/
  - fb09edfd85bc85234b1c1ba7dd499f2915093fff, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:26 2026 +0100: Spanish, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/es/
  - 554a207d8e454a1f7ae826e2a40d389b94be5512, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:21 2026 +0100: German, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/de/
  - b70b9db27fb8607beefeb288185601c8f5eae28d, Anonymous <noreply@weblate.org>, Fri Jan 9 22:53:02 2026 +0100: Norwegian Bokmål, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/nb_NO/
  - 3b467caa9007c553e4ae7de97f53b6e462161ea3, Anonymous <noreply@weblate.org>, Fri Jan 9 22:53:07 2026 +0100: Polish, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/pl/
  - 30fbbfba3572c8f78ec7c7e1a231e363aca1ef10, Anonymous <noreply@weblate.org>, Fri Jan 9 22:53:17 2026 +0100: Russian, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/ru/
  - 8cecb492cfcac0a1f93ee8919f7b41d978d331ee, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:42 2026 +0100: Italian, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/it/
  - f0d3d41029c89bfa83d5891ee7af70241f27b7be, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:38 2026 +0100: Hungarian, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/hu/
  - 102e0f1912d010d38755a1713abb2a7f7564aafb, Anonymous <noreply@weblate.org>, Fri Jan 9 22:53:21 2026 +0100: Slovak, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/sk/
  - 428b8f18cf6195a96b55109e0221413d82415a2f, Максим Горпиніч <gorpinicmaksim0@gmail.com>, Sat Jan 10 08:55:28 2026 +0100: Ukrainian, 100.0% (1048 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/uk/
  - 1a71aaf279ecab26c0c1fede05025732e6dcfa5e, Anonymous <noreply@weblate.org>, Fri Jan 9 22:53:27 2026 +0100: Swedish, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/sv/
  - 36ec3701f3a1a904e7c42ac4ffbe6a050dc6d1ed, Anonymous <noreply@weblate.org>, Fri Jan 9 22:53:43 2026 +0100: Chinese (Simplified Han script), 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/zh_Hans/
  - 65d8b74b340c877cad2028b7142c783a1b568d49, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:48 2026 +0100: Japanese, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/ja/
  - 4d11d1d31022583657e93aee70301a8ffcde1340, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:52 2026 +0100: Korean, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/ko/
  - bd1135bcb965ad73cf493771b15081cc97cf513a, Orhun <orhunavcu@gmail.com>, Fri Jan 9 22:53:33 2026 +0100: Turkish, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/tr/
  - 2c3d814119f4cf2dabd20933699f5b991f20f3e6, Anonymous <noreply@weblate.org>, Fri Jan 9 22:52:32 2026 +0100: French, 99.9% (1047 of 1048 strings), https://hosted.weblate.org/projects/adventurelog/web-app/fr/
* Refactor code structure and remove redundant code blocks for improved readability and maintainability
* fix: Correct appVersion to match the latest pre-release version
* fix: Add missing vulnerability reference for jaraco.context in .trivyignore

--------

Co-authored-by: Lars Lehmann <33843261+larsl-net@users.noreply.github.com>
Co-authored-by: Lars Lehmann <lars@lmail.eu>
Co-authored-by: Nick Petrushin <n.a.petrushin@gmail.com>
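The `is_public` cascading fix listed above ("Refactor update method to handle is_public cascading for related items") can be sketched minimally. Plain objects stand in for the project's Django models here; the function name and shape are assumptions, not AdventureLog's actual update method.

```python
# Hedged sketch: keep related items' visibility consistent with their
# collection, so items in a newly private collection do not stay
# publicly reachable on their own detail pages.

class RelatedItem:
    def __init__(self, is_public: bool = False):
        self.is_public = is_public

def cascade_is_public(collection_is_public: bool, related_items: list) -> None:
    """Propagate a collection's is_public flag to its related items."""
    for item in related_items:
        item.is_public = collection_is_public

items = [RelatedItem(True), RelatedItem(False)]
cascade_is_public(False, items)  # collection made private: hide all items
```

In the real Django code this kind of cascade would typically run inside the serializer's `update()` within a transaction, so the collection and its items never disagree about visibility.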
views.py — 932 lines · 44 KiB · Python
# views.py
|
|
import json
|
|
import zipfile
|
|
import tempfile
|
|
import os
|
|
from datetime import datetime
|
|
from django.http import HttpResponse
|
|
from django.core.files.storage import default_storage
|
|
from django.core.files.base import ContentFile
|
|
from django.db import transaction
|
|
from django.contrib.auth import get_user_model
|
|
from rest_framework import viewsets, status
|
|
from rest_framework.decorators import action
|
|
from rest_framework.response import Response
|
|
from rest_framework.parsers import MultiPartParser
|
|
from rest_framework.permissions import IsAuthenticated
|
|
from django.conf import settings
|
|
from django.contrib.contenttypes.models import ContentType
|
|
|
|
from adventures.models import (
|
|
Location, Collection, Transportation, Note, Checklist, ChecklistItem,
|
|
ContentImage, ContentAttachment, Category, Lodging, Visit, Trail, Activity,
|
|
CollectionItineraryItem
|
|
)
|
|
from worldtravel.models import VisitedCity, VisitedRegion, City, Region, Country
|
|
|
|
User = get_user_model()
|
|
|
|
class BackupViewSet(viewsets.ViewSet):
|
|
permission_classes = [IsAuthenticated]
|
|
"""
|
|
Simple ViewSet for handling backup and import operations
|
|
"""
|
|
|
|
@action(detail=False, methods=['get'])
|
|
def export(self, request):
|
|
"""
|
|
Export all user data as a ZIP file containing JSON data and files
|
|
"""
|
|
user = request.user
|
|
|
|
# Build export data structure
|
|
export_data = {
|
|
'version': settings.ADVENTURELOG_RELEASE_VERSION,
|
|
'export_date': datetime.now().isoformat(),
|
|
'user_email': user.email,
|
|
'user_username': user.username,
|
|
'categories': [],
|
|
'collections': [],
|
|
'locations': [],
|
|
'transportation': [],
|
|
'notes': [],
|
|
'checklists': [],
|
|
'lodging': [],
|
|
'visited_cities': [],
|
|
'visited_regions': [],
|
|
'itinerary_items': []
|
|
}
|
|
|
|
# Export Visited Cities
|
|
for visited_city in user.visitedcity_set.all():
|
|
export_data['visited_cities'].append({
|
|
'city': visited_city.city.id,
|
|
})
|
|
|
|
# Export Visited Regions
|
|
for visited_region in user.visitedregion_set.all():
|
|
export_data['visited_regions'].append({
|
|
'region': visited_region.region.id,
|
|
})
|
|
|
|
# Export Categories
|
|
for category in user.category_set.all():
|
|
export_data['categories'].append({
|
|
'name': category.name,
|
|
'display_name': category.display_name,
|
|
'icon': category.icon,
|
|
})
|
|
|
|
# Track images so we can reference them for collection primary images
|
|
image_export_map = {}
|
|
|
|
# Export Collections
|
|
for idx, collection in enumerate(user.collection_set.all()):
|
|
export_data['collections'].append({
|
|
'export_id': idx, # Add unique identifier for this export
|
|
'name': collection.name,
|
|
'description': collection.description,
|
|
'is_public': collection.is_public,
|
|
'start_date': collection.start_date.isoformat() if collection.start_date else None,
|
|
'end_date': collection.end_date.isoformat() if collection.end_date else None,
|
|
'is_archived': collection.is_archived,
|
|
'link': collection.link,
|
|
'shared_with_user_ids': [str(uuid) for uuid in collection.shared_with.values_list('uuid', flat=True)]
|
|
})
|
|
|
|
# Create collection name to export_id mapping
|
|
collection_name_to_id = {col.name: idx for idx, col in enumerate(user.collection_set.all())}
|
|
|
|
# Export locations with related data
|
|
for idx, location in enumerate(user.location_set.all()):
|
|
location_data = {
|
|
'export_id': idx, # Add unique identifier for this export
|
|
'name': location.name,
|
|
'location': location.location,
|
|
'tags': location.tags,
|
|
'description': location.description,
|
|
'rating': location.rating,
|
|
'link': location.link,
|
|
'is_public': location.is_public,
|
|
'longitude': str(location.longitude) if location.longitude else None,
|
|
'latitude': str(location.latitude) if location.latitude else None,
|
|
'city': location.city_id,
|
|
'region': location.region_id,
|
|
'country': location.country_id,
|
|
'category_name': location.category.name if location.category else None,
|
|
'collection_export_ids': [collection_name_to_id[col_name] for col_name in location.collections.values_list('name', flat=True) if col_name in collection_name_to_id],
|
|
'visits': [],
|
|
'trails': [],
|
|
'images': [],
|
|
'attachments': []
|
|
}
|
|
|
|
            # Add visits
            for visit_idx, visit in enumerate(location.visits.all()):
                visit_data = {
                    'export_id': visit_idx,  # Add unique identifier for this visit
                    'start_date': visit.start_date.isoformat() if visit.start_date else None,
                    'end_date': visit.end_date.isoformat() if visit.end_date else None,
                    'timezone': visit.timezone,
                    'notes': visit.notes,
                    'activities': []
                }

                # Add activities for this visit
                for activity in visit.activities.all():
                    activity_data = {
                        'name': activity.name,
                        'sport_type': activity.sport_type,
                        'distance': float(activity.distance) if activity.distance else None,
                        'moving_time': activity.moving_time.total_seconds() if activity.moving_time else None,
                        'elapsed_time': activity.elapsed_time.total_seconds() if activity.elapsed_time else None,
                        'rest_time': activity.rest_time.total_seconds() if activity.rest_time else None,
                        'elevation_gain': float(activity.elevation_gain) if activity.elevation_gain else None,
                        'elevation_loss': float(activity.elevation_loss) if activity.elevation_loss else None,
                        'elev_high': float(activity.elev_high) if activity.elev_high else None,
                        'elev_low': float(activity.elev_low) if activity.elev_low else None,
                        'start_date': activity.start_date.isoformat() if activity.start_date else None,
                        'start_date_local': activity.start_date_local.isoformat() if activity.start_date_local else None,
                        'timezone': activity.timezone,
                        'average_speed': float(activity.average_speed) if activity.average_speed else None,
                        'max_speed': float(activity.max_speed) if activity.max_speed else None,
                        'average_cadence': float(activity.average_cadence) if activity.average_cadence else None,
                        'calories': float(activity.calories) if activity.calories else None,
                        'start_lat': float(activity.start_lat) if activity.start_lat else None,
                        'start_lng': float(activity.start_lng) if activity.start_lng else None,
                        'end_lat': float(activity.end_lat) if activity.end_lat else None,
                        'end_lng': float(activity.end_lng) if activity.end_lng else None,
                        'external_service_id': activity.external_service_id,
                        'trail_name': activity.trail.name if activity.trail else None,  # Link by trail name
                        'gpx_filename': None
                    }

                    # Handle GPX file
                    if activity.gpx_file:
                        activity_data['gpx_filename'] = activity.gpx_file.name.split('/')[-1]

                    visit_data['activities'].append(activity_data)

                location_data['visits'].append(visit_data)

            # Add trails for this location
            for trail in location.trails.all():
                trail_data = {
                    'name': trail.name,
                    'link': trail.link,
                    'wanderer_id': trail.wanderer_id,
                    'created_at': trail.created_at.isoformat() if trail.created_at else None
                }
                location_data['trails'].append(trail_data)

            # Add images
            for image_index, image in enumerate(location.images.all()):
                image_data = {
                    'immich_id': image.immich_id,
                    'is_primary': image.is_primary,
                    'filename': None,
                }
                if image.image:
                    image_data['filename'] = image.image.name.split('/')[-1]
                location_data['images'].append(image_data)

                image_export_map[image.id] = {
                    'location_export_id': idx,
                    'image_index': image_index,
                    'immich_id': image.immich_id,
                    'filename': image_data['filename'],
                }

            # Add attachments
            for attachment in location.attachments.all():
                attachment_data = {
                    'name': attachment.name,
                    'filename': None
                }
                if attachment.file:
                    attachment_data['filename'] = attachment.file.name.split('/')[-1]
                location_data['attachments'].append(attachment_data)

            export_data['locations'].append(location_data)

        # Attach collection primary image references (if any)
        for idx, collection in enumerate(user.collection_set.all()):
            primary = collection.primary_image
            if primary and primary.id in image_export_map:
                export_data['collections'][idx]['primary_image'] = image_export_map[primary.id]

        # Export Transportation
        for idx, transport in enumerate(user.transportation_set.all()):
            collection_export_id = None
            if transport.collection:
                collection_export_id = collection_name_to_id.get(transport.collection.name)

            export_data['transportation'].append({
                'export_id': idx,
                'type': transport.type,
                'name': transport.name,
                'description': transport.description,
                'rating': transport.rating,
                'link': transport.link,
                'date': transport.date.isoformat() if transport.date else None,
                'end_date': transport.end_date.isoformat() if transport.end_date else None,
                'start_timezone': transport.start_timezone,
                'end_timezone': transport.end_timezone,
                'flight_number': transport.flight_number,
                'from_location': transport.from_location,
                'origin_latitude': str(transport.origin_latitude) if transport.origin_latitude else None,
                'origin_longitude': str(transport.origin_longitude) if transport.origin_longitude else None,
                'destination_latitude': str(transport.destination_latitude) if transport.destination_latitude else None,
                'destination_longitude': str(transport.destination_longitude) if transport.destination_longitude else None,
                'to_location': transport.to_location,
                'is_public': transport.is_public,
                'collection_export_id': collection_export_id
            })

        # Export Notes
        for idx, note in enumerate(user.note_set.all()):
            collection_export_id = None
            if note.collection:
                collection_export_id = collection_name_to_id.get(note.collection.name)

            export_data['notes'].append({
                'export_id': idx,
                'name': note.name,
                'content': note.content,
                'links': note.links,
                'date': note.date.isoformat() if note.date else None,
                'is_public': note.is_public,
                'collection_export_id': collection_export_id
            })

        # Export Checklists
        for idx, checklist in enumerate(user.checklist_set.all()):
            collection_export_id = None
            if checklist.collection:
                collection_export_id = collection_name_to_id.get(checklist.collection.name)

            checklist_data = {
                'export_id': idx,
                'name': checklist.name,
                'date': checklist.date.isoformat() if checklist.date else None,
                'is_public': checklist.is_public,
                'collection_export_id': collection_export_id,
                'items': []
            }

            # Add checklist items
            for item in checklist.checklistitem_set.all():
                checklist_data['items'].append({
                    'name': item.name,
                    'is_checked': item.is_checked
                })

            export_data['checklists'].append(checklist_data)

        # Export Lodging
        for idx, lodging in enumerate(user.lodging_set.all()):
            collection_export_id = None
            if lodging.collection:
                collection_export_id = collection_name_to_id.get(lodging.collection.name)

            export_data['lodging'].append({
                'export_id': idx,
                'name': lodging.name,
                'type': lodging.type,
                'description': lodging.description,
                'rating': lodging.rating,
                'link': lodging.link,
                'check_in': lodging.check_in.isoformat() if lodging.check_in else None,
                'check_out': lodging.check_out.isoformat() if lodging.check_out else None,
                'timezone': lodging.timezone,
                'reservation_number': lodging.reservation_number,
                'price': str(lodging.price) if lodging.price else None,
                'latitude': str(lodging.latitude) if lodging.latitude else None,
                'longitude': str(lodging.longitude) if lodging.longitude else None,
                'location': lodging.location,
                'is_public': lodging.is_public,
                'collection_export_id': collection_export_id
            })

        # Export Itinerary Items
        # Create export_id mappings for all content types
        location_id_to_export_id = {loc.id: idx for idx, loc in enumerate(user.location_set.all())}
        transportation_id_to_export_id = {t.id: idx for idx, t in enumerate(user.transportation_set.all())}
        note_id_to_export_id = {n.id: idx for idx, n in enumerate(user.note_set.all())}
        lodging_id_to_export_id = {l.id: idx for idx, l in enumerate(user.lodging_set.all())}
        checklist_id_to_export_id = {c.id: idx for idx, c in enumerate(user.checklist_set.all())}

        for collection_idx, collection in enumerate(user.collection_set.all()):
            for itinerary_item in collection.itinerary_items.all():
                content_type_str = itinerary_item.content_type.model
                item_reference = None

                # Determine how to reference the item based on content type using export_ids
                if content_type_str == 'location':
                    item_reference = location_id_to_export_id.get(itinerary_item.object_id)
                elif content_type_str == 'transportation':
                    item_reference = transportation_id_to_export_id.get(itinerary_item.object_id)
                elif content_type_str == 'note':
                    item_reference = note_id_to_export_id.get(itinerary_item.object_id)
                elif content_type_str == 'lodging':
                    item_reference = lodging_id_to_export_id.get(itinerary_item.object_id)
                elif content_type_str == 'checklist':
                    item_reference = checklist_id_to_export_id.get(itinerary_item.object_id)

                if item_reference is not None:
                    export_data['itinerary_items'].append({
                        'collection_export_id': collection_idx,
                        'content_type': content_type_str,
                        'item_reference': item_reference,
                        'date': itinerary_item.date.isoformat() if itinerary_item.date else None,
                        'is_global': itinerary_item.is_global,
                        'order': itinerary_item.order
                    })

        # Create ZIP file
        with tempfile.NamedTemporaryFile(delete=False, suffix='.zip') as tmp_file:
            with zipfile.ZipFile(tmp_file.name, 'w', zipfile.ZIP_DEFLATED) as zip_file:
                # Add JSON data
                zip_file.writestr('data.json', json.dumps(export_data, indent=2))

                # Add images, attachments, and GPX files
                files_added = set()

                for location in user.location_set.all():
                    # Add images
                    for image in location.images.all():
                        if image.image and image.image.name not in files_added:
                            try:
                                image_content = default_storage.open(image.image.name).read()
                                filename = image.image.name.split('/')[-1]
                                zip_file.writestr(f'images/{filename}', image_content)
                                files_added.add(image.image.name)
                            except Exception as e:
                                print(f"Error adding image {image.image.name}: {e}")

                    # Add attachments
                    for attachment in location.attachments.all():
                        if attachment.file and attachment.file.name not in files_added:
                            try:
                                file_content = default_storage.open(attachment.file.name).read()
                                filename = attachment.file.name.split('/')[-1]
                                zip_file.writestr(f'attachments/{filename}', file_content)
                                files_added.add(attachment.file.name)
                            except Exception as e:
                                print(f"Error adding attachment {attachment.file.name}: {e}")

                    # Add GPX files from activities
                    for visit in location.visits.all():
                        for activity in visit.activities.all():
                            if activity.gpx_file and activity.gpx_file.name not in files_added:
                                try:
                                    gpx_content = default_storage.open(activity.gpx_file.name).read()
                                    filename = activity.gpx_file.name.split('/')[-1]
                                    zip_file.writestr(f'gpx/{filename}', gpx_content)
                                    files_added.add(activity.gpx_file.name)
                                except Exception as e:
                                    print(f"Error adding GPX file {activity.gpx_file.name}: {e}")

            # Return ZIP file as response
            with open(tmp_file.name, 'rb') as zip_file:
                response = HttpResponse(zip_file.read(), content_type='application/zip')
                filename = f"adventurelog_backup_{user.username}_{datetime.now().strftime('%Y%m%d_%H%M%S')}.zip"
                response['Content-Disposition'] = f'attachment; filename="{filename}"'

            # Clean up
            os.unlink(tmp_file.name)
            return response

    @action(
        detail=False,
        methods=['post'],
        parser_classes=[MultiPartParser],
        url_path='import',  # changes the URL path to /import
        url_name='import'  # changes the reverse name to 'import'
    )
    def import_data(self, request):
        """
        Import data from a ZIP backup file
        """
        if 'file' not in request.FILES:
            return Response({'error': 'No file provided'}, status=status.HTTP_400_BAD_REQUEST)

        if 'confirm' not in request.data or request.data['confirm'] != 'yes':
            return Response({'error': 'Confirmation required to proceed with import'},
                            status=status.HTTP_400_BAD_REQUEST)

        backup_file = request.FILES['file']
        user = request.user

        # Save file temporarily
        with tempfile.NamedTemporaryFile(delete=False, suffix='.zip') as tmp_file:
            for chunk in backup_file.chunks():
                tmp_file.write(chunk)
            tmp_file_path = tmp_file.name

        try:
            with zipfile.ZipFile(tmp_file_path, 'r') as zip_file:
                # Validate backup structure
                if 'data.json' not in zip_file.namelist():
                    return Response({'error': 'Invalid backup file - missing data.json'},
                                    status=status.HTTP_400_BAD_REQUEST)

                # Load data
                backup_data = json.loads(zip_file.read('data.json').decode('utf-8'))

                # Import with transaction
                with transaction.atomic():
                    # Clear existing data first
                    self._clear_user_data(user)
                    summary = self._import_data(backup_data, zip_file, user)

                return Response({
                    'success': True,
                    'message': 'Data imported successfully',
                    'summary': summary
                }, status=status.HTTP_200_OK)

        except json.JSONDecodeError:
            return Response({'error': 'Invalid JSON in backup file'},
                            status=status.HTTP_400_BAD_REQUEST)
        except Exception:
            import logging
            logging.error("Import failed", exc_info=True)
            return Response({'error': 'An internal error occurred during import'},
                            status=status.HTTP_400_BAD_REQUEST)
        finally:
            os.unlink(tmp_file_path)

    def _clear_user_data(self, user):
        """Clear all existing user data before import"""
        # Delete itinerary items first (they reference collections and content)
        CollectionItineraryItem.objects.filter(collection__user=user).delete()

        # Delete in reverse order of dependencies
        user.activity_set.all().delete()  # Delete activities first
        user.trail_set.all().delete()  # Delete trails
        user.checklistitem_set.all().delete()
        user.checklist_set.all().delete()
        user.note_set.all().delete()
        user.transportation_set.all().delete()
        user.lodging_set.all().delete()

        # Delete location-related data
        user.contentimage_set.all().delete()
        user.contentattachment_set.all().delete()
        # Visits are deleted via cascade when locations are deleted
        user.location_set.all().delete()

        # Delete collections and categories last
        user.collection_set.all().delete()
        user.category_set.all().delete()

        # Clear visited cities and regions
        user.visitedcity_set.all().delete()
        user.visitedregion_set.all().delete()

    def _import_data(self, backup_data, zip_file, user):
        """Import backup data and return summary"""
        from datetime import timedelta

        # Track mappings and counts
        category_map = {}
        collection_map = {}  # Map export_id to actual collection object
        location_map = {}  # Map location export_id to actual location object
        trail_name_map = {}  # Map (location_id, trail_name) to trail object
        summary = {
            'categories': 0, 'collections': 0, 'locations': 0,
            'transportation': 0, 'notes': 0, 'checklists': 0,
            'checklist_items': 0, 'lodging': 0, 'images': 0,
            'attachments': 0, 'visited_cities': 0, 'visited_regions': 0,
            'trails': 0, 'activities': 0, 'gpx_files': 0, 'itinerary_items': 0
        }

        # Import Visited Cities
        for city_data in backup_data.get('visited_cities', []):
            try:
                city_obj = City.objects.get(id=city_data['city'])
                visited_city, created = VisitedCity.objects.get_or_create(user=user, city=city_obj)
                if created:
                    summary['visited_cities'] += 1
            except City.DoesNotExist:
                # If city does not exist, we can skip or log it
                pass

        # Import Visited Regions
        for region_data in backup_data.get('visited_regions', []):
            try:
                region_obj = Region.objects.get(id=region_data['region'])
                visited_region, created = VisitedRegion.objects.get_or_create(user=user, region=region_obj)
                if created:
                    summary['visited_regions'] += 1
            except Region.DoesNotExist:
                # If region does not exist, we can skip or log it
                pass

        # Import Categories
        for cat_data in backup_data.get('categories', []):
            category = Category.objects.create(
                user=user,
                name=cat_data['name'],
                display_name=cat_data['display_name'],
                icon=cat_data.get('icon', '🌍')
            )
            category_map[cat_data['name']] = category
            summary['categories'] += 1

        pending_primary_images = []
        location_images_map = {}

        # Import Collections
        for col_data in backup_data.get('collections', []):
            collection = Collection.objects.create(
                user=user,
                name=col_data['name'],
                description=col_data.get('description', ''),
                is_public=col_data.get('is_public', False),
                start_date=col_data.get('start_date'),
                end_date=col_data.get('end_date'),
                is_archived=col_data.get('is_archived', False),
                link=col_data.get('link')
            )
            collection_map[col_data['export_id']] = collection
            summary['collections'] += 1

            # Handle shared users
            for uuid in col_data.get('shared_with_user_ids', []):
                try:
                    shared_user = User.objects.get(uuid=uuid)
                    if shared_user.public_profile:
                        collection.shared_with.add(shared_user)
                except User.DoesNotExist:
                    pass

            # Defer primary image assignment until images are created
            if col_data.get('primary_image'):
                pending_primary_images.append({
                    'collection_export_id': col_data['export_id'],
                    'data': col_data['primary_image'],
                })

        # Import Locations
        for adv_data in backup_data.get('locations', []):
            city = None
            if adv_data.get('city'):
                try:
                    city = City.objects.get(id=adv_data['city'])
                except City.DoesNotExist:
                    city = None

            region = None
            if adv_data.get('region'):
                try:
                    region = Region.objects.get(id=adv_data['region'])
                except Region.DoesNotExist:
                    region = None

            country = None
            if adv_data.get('country'):
                try:
                    country = Country.objects.get(id=adv_data['country'])
                except Country.DoesNotExist:
                    country = None

            location = Location(
                user=user,
                name=adv_data['name'],
                location=adv_data.get('location'),
                tags=adv_data.get('tags', []),
                description=adv_data.get('description'),
                rating=adv_data.get('rating'),
                link=adv_data.get('link'),
                is_public=adv_data.get('is_public', False),
                longitude=adv_data.get('longitude'),
                latitude=adv_data.get('latitude'),
                city=city,
                region=region,
                country=country,
                category=category_map.get(adv_data.get('category_name'))
            )
            location.save(_skip_geocode=True)  # Skip geocoding for now
            location_map[adv_data['export_id']] = location
            location_images_map.setdefault(adv_data['export_id'], [])

            # Add to collections using export_ids - MUST be done after save()
            for collection_export_id in adv_data.get('collection_export_ids', []):
                if collection_export_id in collection_map:
                    location.collections.add(collection_map[collection_export_id])

            # Import trails for this location first
            for trail_data in adv_data.get('trails', []):
                trail = Trail.objects.create(
                    user=user,
                    location=location,
                    name=trail_data['name'],
                    link=trail_data.get('link'),
                    wanderer_id=trail_data.get('wanderer_id'),
                    created_at=trail_data.get('created_at')
                )
                trail_name_map[(location.id, trail_data['name'])] = trail
                summary['trails'] += 1

            # Import visits and their activities
            for visit_data in adv_data.get('visits', []):
                visit = Visit.objects.create(
                    location=location,
                    start_date=visit_data.get('start_date'),
                    end_date=visit_data.get('end_date'),
                    timezone=visit_data.get('timezone'),
                    notes=visit_data.get('notes')
                )

                # Import activities for this visit
                for activity_data in visit_data.get('activities', []):
                    # Find the trail if specified
                    trail = None
                    if activity_data.get('trail_name'):
                        trail = trail_name_map.get((location.id, activity_data['trail_name']))

                    # Convert time durations back from seconds
                    moving_time = None
                    if activity_data.get('moving_time') is not None:
                        moving_time = timedelta(seconds=activity_data['moving_time'])

                    elapsed_time = None
                    if activity_data.get('elapsed_time') is not None:
                        elapsed_time = timedelta(seconds=activity_data['elapsed_time'])

                    rest_time = None
                    if activity_data.get('rest_time') is not None:
                        rest_time = timedelta(seconds=activity_data['rest_time'])

                    activity = Activity(
                        user=user,
                        visit=visit,
                        trail=trail,
                        name=activity_data['name'],
                        sport_type=activity_data.get('sport_type'),
                        distance=activity_data.get('distance'),
                        moving_time=moving_time,
                        elapsed_time=elapsed_time,
                        rest_time=rest_time,
                        elevation_gain=activity_data.get('elevation_gain'),
                        elevation_loss=activity_data.get('elevation_loss'),
                        elev_high=activity_data.get('elev_high'),
                        elev_low=activity_data.get('elev_low'),
                        start_date=activity_data.get('start_date'),
                        start_date_local=activity_data.get('start_date_local'),
                        timezone=activity_data.get('timezone'),
                        average_speed=activity_data.get('average_speed'),
                        max_speed=activity_data.get('max_speed'),
                        average_cadence=activity_data.get('average_cadence'),
                        calories=activity_data.get('calories'),
                        start_lat=activity_data.get('start_lat'),
                        start_lng=activity_data.get('start_lng'),
                        end_lat=activity_data.get('end_lat'),
                        end_lng=activity_data.get('end_lng'),
                        external_service_id=activity_data.get('external_service_id')
                    )

                    # Handle GPX file
                    gpx_filename = activity_data.get('gpx_filename')
                    if gpx_filename:
                        try:
                            gpx_content = zip_file.read(f'gpx/{gpx_filename}')
                            gpx_file = ContentFile(gpx_content, name=gpx_filename)
                            activity.gpx_file = gpx_file
                            summary['gpx_files'] += 1
                        except KeyError:
                            pass  # GPX file not found in backup

                    activity.save()
                    summary['activities'] += 1

            # Import images
            content_type = ContentType.objects.get(model='location')

            for img_data in adv_data.get('images', []):
                immich_id = img_data.get('immich_id')
                if immich_id:
                    new_img = ContentImage.objects.create(
                        user=user,
                        immich_id=immich_id,
                        is_primary=img_data.get('is_primary', False),
                        content_type=content_type,
                        object_id=location.id
                    )
                    location_images_map[adv_data['export_id']].append(new_img)
                    summary['images'] += 1
                else:
                    filename = img_data.get('filename')
                    if filename:
                        try:
                            img_content = zip_file.read(f'images/{filename}')
                            img_file = ContentFile(img_content, name=filename)
                            new_img = ContentImage.objects.create(
                                user=user,
                                image=img_file,
                                is_primary=img_data.get('is_primary', False),
                                content_type=content_type,
                                object_id=location.id
                            )
                            location_images_map[adv_data['export_id']].append(new_img)
                            summary['images'] += 1
                        except KeyError:
                            pass

            # Import attachments
            for att_data in adv_data.get('attachments', []):
                filename = att_data.get('filename')
                if filename:
                    try:
                        att_content = zip_file.read(f'attachments/{filename}')
                        att_file = ContentFile(att_content, name=filename)
                        ContentAttachment.objects.create(
                            user=user,
                            file=att_file,
                            name=att_data.get('name'),
                            content_type=content_type,
                            object_id=location.id
                        )
                        summary['attachments'] += 1
                    except KeyError:
                        pass

            summary['locations'] += 1

        # Apply primary image selections now that images exist
        for entry in pending_primary_images:
            collection = collection_map.get(entry['collection_export_id'])
            data = entry.get('data', {}) or {}
            if not collection:
                continue

            loc_export_id = data.get('location_export_id')
            img_index = data.get('image_index')
            if loc_export_id is None or img_index is None:
                continue

            images_for_location = location_images_map.get(loc_export_id, [])
            if 0 <= img_index < len(images_for_location):
                collection.primary_image = images_for_location[img_index]
                collection.save(update_fields=['primary_image'])

        # Import Transportation
        transportation_map = {}  # Map export_id to actual transportation object
        for trans_data in backup_data.get('transportation', []):
            collection = None
            if trans_data.get('collection_export_id') is not None:
                collection = collection_map.get(trans_data['collection_export_id'])

            transportation = Transportation.objects.create(
                user=user,
                type=trans_data['type'],
                name=trans_data['name'],
                description=trans_data.get('description'),
                rating=trans_data.get('rating'),
                link=trans_data.get('link'),
                date=trans_data.get('date'),
                end_date=trans_data.get('end_date'),
                start_timezone=trans_data.get('start_timezone'),
                end_timezone=trans_data.get('end_timezone'),
                flight_number=trans_data.get('flight_number'),
                from_location=trans_data.get('from_location'),
                origin_latitude=trans_data.get('origin_latitude'),
                origin_longitude=trans_data.get('origin_longitude'),
                destination_latitude=trans_data.get('destination_latitude'),
                destination_longitude=trans_data.get('destination_longitude'),
                to_location=trans_data.get('to_location'),
                is_public=trans_data.get('is_public', False),
                collection=collection
            )
            # Only add to map if export_id exists (for backward compatibility with old backups)
            if 'export_id' in trans_data:
                transportation_map[trans_data['export_id']] = transportation
            summary['transportation'] += 1

        # Import Notes
        note_map = {}  # Map export_id to actual note object
        for note_data in backup_data.get('notes', []):
            collection = None
            if note_data.get('collection_export_id') is not None:
                collection = collection_map.get(note_data['collection_export_id'])

            note = Note.objects.create(
                user=user,
                name=note_data['name'],
                content=note_data.get('content'),
                links=note_data.get('links', []),
                date=note_data.get('date'),
                is_public=note_data.get('is_public', False),
                collection=collection
            )
            # Only add to map if export_id exists (for backward compatibility with old backups)
            if 'export_id' in note_data:
                note_map[note_data['export_id']] = note
            summary['notes'] += 1

        # Import Checklists
        checklist_map = {}  # Map export_id to actual checklist object
        for check_data in backup_data.get('checklists', []):
            collection = None
            if check_data.get('collection_export_id') is not None:
                collection = collection_map.get(check_data['collection_export_id'])

            checklist = Checklist.objects.create(
                user=user,
                name=check_data['name'],
                date=check_data.get('date'),
                is_public=check_data.get('is_public', False),
                collection=collection
            )

            # Import checklist items
            for item_data in check_data.get('items', []):
                ChecklistItem.objects.create(
                    user=user,
                    checklist=checklist,
                    name=item_data['name'],
                    is_checked=item_data.get('is_checked', False)
                )
                summary['checklist_items'] += 1

            # Only add to map if export_id exists (for backward compatibility with old backups)
            if 'export_id' in check_data:
                checklist_map[check_data['export_id']] = checklist
            summary['checklists'] += 1

        # Import Lodging
        lodging_map = {}  # Map export_id to actual lodging object
        for lodg_data in backup_data.get('lodging', []):
            collection = None
            if lodg_data.get('collection_export_id') is not None:
                collection = collection_map.get(lodg_data['collection_export_id'])

            lodging = Lodging.objects.create(
                user=user,
                name=lodg_data['name'],
                type=lodg_data.get('type', 'other'),
                description=lodg_data.get('description'),
                rating=lodg_data.get('rating'),
                link=lodg_data.get('link'),
                check_in=lodg_data.get('check_in'),
                check_out=lodg_data.get('check_out'),
                timezone=lodg_data.get('timezone'),
                reservation_number=lodg_data.get('reservation_number'),
                price=lodg_data.get('price'),
                latitude=lodg_data.get('latitude'),
                longitude=lodg_data.get('longitude'),
                location=lodg_data.get('location'),
                is_public=lodg_data.get('is_public', False),
                collection=collection
            )
            # Only add to map if export_id exists (for backward compatibility with old backups)
            if 'export_id' in lodg_data:
                lodging_map[lodg_data['export_id']] = lodging
            summary['lodging'] += 1

        # Import Itinerary Items
        # Maps already created during import of each content type

        for itinerary_data in backup_data.get('itinerary_items', []):
            collection = collection_map.get(itinerary_data['collection_export_id'])
            if not collection:
                continue

            content_type_str = itinerary_data['content_type']
            item_reference = itinerary_data['item_reference']

            # Get the actual object based on content type
            content_object = None
            content_type = None

            if content_type_str == 'location':
                content_object = location_map.get(item_reference)  # item_reference is export_id
                content_type = ContentType.objects.get(model='location')
            elif content_type_str == 'transportation':
                content_object = transportation_map.get(item_reference)  # item_reference is export_id
                content_type = ContentType.objects.get(model='transportation')
            elif content_type_str == 'note':
                content_object = note_map.get(item_reference)  # item_reference is export_id
                content_type = ContentType.objects.get(model='note')
            elif content_type_str == 'lodging':
                content_object = lodging_map.get(item_reference)  # item_reference is export_id
                content_type = ContentType.objects.get(model='lodging')
            elif content_type_str == 'checklist':
                content_object = checklist_map.get(item_reference)  # item_reference is export_id
                content_type = ContentType.objects.get(model='checklist')

            if content_object and content_type:
                CollectionItineraryItem.objects.create(
                    collection=collection,
                    content_type=content_type,
                    object_id=content_object.id,
                    date=itinerary_data.get('date') if not itinerary_data.get('is_global') else None,
                    is_global=bool(itinerary_data.get('is_global', False)),
                    order=itinerary_data['order']
                )
                summary['itinerary_items'] += 1

        return summary