Activities, Trails, Wanderer + Strava Integration, UI Refresh, DevOps Improvements, and more (#785)
* Implement code changes to enhance functionality and improve performance
* Update nl.json: fix Dutch translations
* feat(security): add Trivy security scans for Docker images and source code
* feat(security): restructure Trivy scans for improved clarity and organization
* fix(dependencies): update Django version to 5.2.2
* style(workflows): standardize quotes and fix typo in frontend-test.yml
* feat(workflows): add job names for clarity in backend and frontend test workflows
* refactor(workflows): remove path filters from pull_request and push triggers in backend and frontend workflows
* feat(workflows): add paths to push and pull_request triggers for backend and frontend workflows
* refactor(workflows): simplify trigger paths for backend and frontend workflows; fix(dependencies): add overrides for esbuild in frontend package.json
* fix(package): add missing pnpm overrides for esbuild in package.json
* fix(workflows): add missing severity parameter for Trivy filesystem scan
* fix(workflows): add missing severity parameter for Docker image scans in Trivy workflow
* fix(workflows): remove MEDIUM severity from Trivy scans in security workflow
* Fix image deletion (#681)
  * feat(commands): add image cleanup command to find and delete unused files
  * fix(models): ensure associated AdventureImages are deleted and files cleaned up on Adventure deletion
  * fix(models): ensure associated Attachment files are deleted and their filesystem cleaned up on Adventure deletion
  Co-authored-by: ferdousahmed <taninme@gmail.com>
  Co-authored-by: Sean Morley
* Rename Adventures to Locations (#696)
  * Refactor user_id to user in adventures and related models, views, and components
    - Updated all instances of user_id to user in the adventures app, including models, serializers, views, and frontend components.
    - Adjusted queries and filters to reflect the new user field naming convention.
    - Ensured consistency across the codebase for user identification in adventures, collections, notes, and transportation entities.
    - Modified frontend components to align with the updated data structure, ensuring proper access control and rendering based on user ownership.
  * Refactor adventure-related views and components to use "Location" terminology
    - Updated GlobalSearchView to replace AdventureSerializer with LocationSerializer.
    - Modified IcsCalendarGeneratorViewSet to use LocationSerializer instead of AdventureSerializer.
    - Created new LocationImageViewSet for managing location images, including primary image toggling and image deletion.
    - Introduced LocationViewSet for managing locations with enhanced filtering, sorting, and sharing capabilities.
    - Updated ReverseGeocodeViewSet to utilize LocationSerializer.
    - Added ActivityTypesView to retrieve distinct activity types from locations.
    - Refactored user views to replace AdventureSerializer with LocationSerializer.
    - Updated frontend components to reflect changes from "adventure" to "location", including AdventureCard, AdventureLink, AdventureModal, and others.
    - Adjusted API endpoints in frontend routes to align with the new location-based structure.
    - Ensured all references to adventures are replaced with locations across the codebase.
  * refactor: rename adventures to locations across the application
    - Updated localization files to replace adventure-related terms with location-related terms.
    - Refactored TypeScript types and variables from Adventure to Location in various routes and components.
    - Adjusted UI elements and labels to reflect the change from adventures to locations.
    - Ensured all references to adventures in the codebase are consistent with the new location terminology.
* Refactor code structure for improved readability and maintainability
* feat: Implement location details page with server-side loading and deletion functionality
  - Added +page.server.ts to handle server-side loading of additional location info.
  - Created +page.svelte for displaying location details, including images, visits, and maps.
  - Integrated GPX file handling and rendering on the map.
  - Updated map route to link to locations instead of adventures.
  - Refactored profile and search routes to use LocationCard instead of AdventureCard.
* docs: Update terminology from "Adventure" to "Location" and enhance project overview
* docs: Clarify collection examples in usage documentation
* feat: Enable credentials for GPX file fetch and add CORS_ALLOW_CREDENTIALS setting
* Refactor adventure references to locations across the backend and frontend
  - Updated CategoryViewSet to reflect location context instead of adventures.
  - Modified ChecklistViewSet to include locations in retrieval logic.
  - Changed GlobalSearchView to search for locations instead of adventures.
  - Adjusted IcsCalendarGeneratorViewSet to handle locations instead of adventures.
  - Refactored LocationImageViewSet to remove an unused import.
  - Updated LocationViewSet to clarify public access for locations.
  - Changed LodgingViewSet to reference locations instead of adventures.
  - Modified NoteViewSet to prevent listing all locations.
  - Updated RecommendationsViewSet to handle locations in parsing and response.
  - Adjusted ReverseGeocodeViewSet to search through user locations.
  - Updated StatsViewSet to count locations instead of adventures.
  - Changed TagsView to reflect activity types for locations.
  - Updated TransportationViewSet to reference locations instead of adventures.
  - Added new translations for search results related to locations in multiple languages.
  - Updated dashboard and profile pages to reflect location counts instead of adventure counts.
  - Adjusted search routes to handle locations instead of adventures.
* Update banner image
* style: Update stats component background and border for improved visibility
* refactor: Rename AdventureCard and AdventureModal to LocationCard and LocationModal for consistency
* Import and Export Functionality (#698)
  * feat(backup): add BackupViewSet for data export and import functionality
  * Fixed frontend returning corrupt binary data
  * feat(import): enhance import functionality with confirmation check and improved city/region/country handling
  * Potential fix for code scanning alert no. 29: information exposure through an exception
    Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
  * Refactor response handling to use arrayBuffer instead of bytes
  * Refactor image cleanup command to use LocationImage model and update import/export view to include backup and restore functionality
  * Update backup export versioning and improve data restore warning message
  * Enhance image navigation and localization support in modal components
  * Refactor location handling in Immich integration components for consistency
  * Enhance backup and restore functionality with improved localization and error handling
  * Improve accessibility by adding 'for' attribute to backup file input label
  Co-authored-by: Christian Zäske <blitzdose@gmail.com>
  Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
* refactor(serializers): rename Location to Adventure and update related fields
* refactor(serializers): rename Adventure to Location and update related fields
* chore(requirements): update pillow version to 11.3.0
* Add PT-BR translations (#739)
  * Fixed frontend returning corrupt binary data
  * fix(adventure): enhance collection ownership validation in AdventureSerializer (#723)
  * Add PT-BR translations: add Brazilian Portuguese translation to the project
  Signed-off-by: Lucas Zampieri <lzampier@redhat.com>
  Co-authored-by: Sean Morley <98704938+seanmorley15@users.noreply.github.com>
  Co-authored-by: Christian Zäske <blitzdose@gmail.com>
* fix: update date formatting for adventure items to include timezone
* Image/attachment overhaul, activities, trails and integrations with Strava and Wanderer (#726)
  * refactor(models, views, serializers): rename LocationImage and Attachment to ContentImage and ContentAttachment, update related references
  * feat: Enhance collection sharing and location management features
    - Implemented unsharing functionality in CollectionViewSet, including removal of user-owned locations from collections.
    - Refactored ContentImageViewSet to support multiple content types and improved permission checks for image uploads.
    - Added user ownership checks in LocationViewSet for delete operations.
    - Enhanced collection management in the frontend to display owned and shared collections separately.
    - Updated Immich integration to handle access control based on location visibility and user permissions.
    - Improved UI components to show creator information and manage collection links more effectively.
    - Added loading states and error handling in collection fetching logic.
  * feat: enhance transportation card and modal with image handling
    - Added CardCarousel component to TransportationCard for image display.
    - Implemented privacy indicator with Eye and EyeOff icons.
    - Introduced image upload functionality in TransportationModal, allowing users to upload multiple images.
    - Added image management features: remove image and set primary image.
    - Updated Transportation and Location types to include images as a ContentImage array.
    - Enhanced UI for image upload and display in the modal, including selected-images preview and current-images management.
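The backup export added in #698 produces a ZIP archive containing a JSON manifest plus media folders. As a rough sketch of that layout (field values here are illustrative, not AdventureLog's actual output), Python's zipfile module can write and read such an archive in memory:

```python
import io
import json
import zipfile

# Build a tiny backup archive in memory: a JSON manifest plus one media file,
# mirroring the data.json + images/ layout used by the export endpoint.
manifest = {
    'version': '0.10.0',                    # illustrative version string
    'export_date': '2025-01-01T00:00:00',   # illustrative timestamp
    'locations': [],
}
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED) as zf:
    zf.writestr('data.json', json.dumps(manifest, indent=2))
    zf.writestr('images/example.jpg', b'\xff\xd8\xff')  # placeholder bytes

# Read it back the way an importer would.
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    restored = json.loads(zf.read('data.json'))
    names = zf.namelist()

print(restored['version'])  # -> 0.10.0
print(names)                # -> ['data.json', 'images/example.jpg']
```

Using `writestr` with in-memory bytes keeps the round-trip free of temporary files, which also makes the format easy to unit-test.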
* feat: update CardCarousel component to handle images, name, and icon props across various cards
* feat: add Discord link to AboutModal and update appVersion in config
* feat: add LocationQuickStart and LocationVisits components for enhanced location selection and visit management
  - Implemented LocationQuickStart.svelte for searching and selecting locations on a map with reverse geocoding.
  - Created LocationVisits.svelte to manage visit dates and notes for locations, including timezone handling and validation.
  - Updated types to remove the location property from the Attachment type.
  - Modified the locations page to integrate NewLocationModal for creating and editing locations, syncing updates with adventures.
* feat: update button styles and add back and close functionality in location components
* Collection invite system
* feat: update CollectionSerializer to include 'shared_with' as a read-only field; update app version; add new background images and localization strings for invites
* feat: add Strava integration with OAuth flow and activity management
  - Implemented IntegrationView for listing integrations, including Immich, Google Maps, and Strava.
  - Created StravaIntegrationView for handling OAuth authorization and token exchange.
  - Added functionality to refresh Strava access tokens when needed.
  - Implemented endpoints to fetch user activities from Strava and extract essential information.
  - Added Strava logo asset and integrated it into the frontend settings page.
  - Updated settings page to display Strava integration status.
  - Enhanced location management to include trails with create, edit, and delete functionality.
  - Updated types and localization files to support the new features.
* feat: enhance Strava integration with user-specific settings and management options; update localization strings
* feat: update Strava integration settings and add Wanderer logo; enhance user experience with active section management
* Add StravaActivity and Activity types to types.ts
  - Introduced StravaActivity type to represent detailed activity data from Strava.
  - Added Activity type to encapsulate user activities, including optional trail and GPX file information.
  - Updated Location type to include an array of activities associated with each visit.
* feat: streamline location and activity management; enhance Strava import functionality and add activity handling in server actions
* feat: add ActivityCard component and update LocationVisits to use it; modify Activity type to reference trail as string
* feat: add geojson support to ActivitySerializer and ActivityCard; enhance location page with activity summaries and GPS tracks
* feat: add trails property to recommendation object in collection page
* feat: add Wanderer integration with authentication and management features
* feat: implement Wanderer integration with trail management and UI components; enhance settings for reauthentication
* feat: add measurement system field to CustomUser model and update related serializers, migrations, and UI components
* feat: add measurement system support across ActivityCard, StravaActivityCard, NewLocationModal, LocationVisits, and related utility functions
* feat: enhance Wanderer integration with trail data fetching and UI updates; add measurement system support
* feat: add TrailCard component for displaying trail details with measurement system support
* feat: add wanderer link support in TrailSerializer and TrailCard; update measurement system handling in location page
* feat: integrate memcached for caching in Wanderer services; update Docker, settings, and supervisord configurations
* feat: add activity statistics to user profile, including distance, moving time, elevation, and total activities
* feat: enhance import/export functionality to include trails and activities; update UI components and localization
* feat: integrate NewLocationModal across various components; update location handling and state management
* Refactor Location and Visit types: replace the visits structure in Location with the Visit type and add location, created_at, and updated_at fields to Visit
* feat: enhance permissions and validation in activity, trail, and visit views; add unique constraint to CollectionInvite model
* feat: sync visits when updating adventures in collection page
* feat: add geojson support for attachments and refactor GPX handling in location page
* chore: remove unused dependencies from pnpm-lock.yaml
* feat: add Strava and Wanderer integration documentation and configuration options
* Add support for Japanese and Arabic languages in localization
* Add new localization strings for Russian, Swedish, and Chinese languages
  - Updated translations in ru.json, sv.json, and zh.json to include new phrases related to collections, activities, and integrations.
  - Added strings for leaving collections, loading collections, and quick start instructions.
  - Included new sections for invites and Strava integration with relevant messages.
  - Enhanced Google Maps integration descriptions for clarity.
* Add localization support for activity-related features and update UI labels
  - Added new Russian, Swedish, and Chinese translations for activity statistics, achievements, and related terms.
  - Updated UI components to use localized strings for activity statistics, distance, moving time, and other relevant fields.
  - Enhanced user experience by ensuring all relevant buttons and labels are translated and accessible.
* fix: update appVersion to reflect the latest development version
* feat: add getActivityColor function and integrate activity color coding in map and location pages
* feat: add support for showing activities and visited cities on the map
* feat: update map page to display counts for visited cities and activities
* fix: remove debug print statement from IsOwnerOrSharedWithFullAccess permission class
* feat: add MapStyleSelector component and integrate basemap selection in map page
* feat: enhance basemap functions with 3D terrain support and update XYZ style handling
* feat: add management command to recalculate elevation data from GPX files and update activity view to handle elevation data extraction
* feat: update MapStyleSelector component and enhance basemap options for improved user experience
* feat: refactor activity model and admin to use sport_type; update serializers and components for improved activity handling
* feat: update Activity model string representation to use sport_type instead of type
* feat: update activity handling to use sport_type for color determination in map and location components
* feat: Add attachments support to Transportation and Lodging types
  - Updated Transportation and Lodging types to include an attachments array.
  - Enhanced localization files for multiple languages to include new strings related to attachments, lodging, and transportation.
  - Added error and success messages for attachment removal and upload information.
  - Included new prompts for creating and updating lodging and transportation details across various languages.
* feat: Enhance activity statistics and breakdown by category in user profile
* feat: Add SPORT_CATEGORIES for better organization of sport types and update StatsViewSet to use it
* feat: Enhance CategoryDropdown for mobile responsiveness and add category creation functionality
* feat: Update inspirational quote in adventure log
* feat: Localize navigation labels in Navbar and add translations to en.json
* feat: Update navigation elements to use anchor tags for better accessibility and add new fields to signup form
* Translate login button text to support internationalization
* feat: Refactor location visit status logic and add utility function for visited locations count
* chore: Upgrade GitHub Actions and remove unused timezone import
* fix: Update Docker image tags in GitHub Actions workflow for consistency
* fix: Update Docker image build process to use BuildKit cache for improved performance
* chore: Remove unused imports from stats_view.py for cleaner code
* Increase background image opacity on login and signup pages for improved visibility
* fix: Add postgresql-client to runtime dependencies in Dockerfile
* fix: Update workflow files to include permissions for GitHub Actions
* fix: Update esbuild version to ^0.25.9 in package.json and pnpm-lock.yaml for compatibility
* chore: improve Chinese translation (#796)
* fix: update adventure log quote and remove unused activity type field
* fix: optimize import process by using get_or_create for visited cities and regions
* fix: update README to reflect changes from adventures to locations and enhance feature descriptions
* fix: update documentation to reflect changes from adventures to locations and enhance feature descriptions
* Update google_maps_integration.md (#743)
  * Explain the APIs needed for each AdventureLog version. Fixes #731 and #727
  * Fix a typo in google_maps_integration.md
  Co-authored-by: Sean Morley <98704938+seanmorley15@users.noreply.github.com>
* fix: update appVersion to reflect the main branch version
* fix: update image source for satellite map in documentation
* Update frontend/src/lib/components/NewLocationModal.svelte
  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
* Add localization updates for multiple languages
  - Japanese (ja.json): added new activity-related phrases and checklist terms.
  - Korean (ko.json): included activity breakdown and checklist enhancements.
  - Dutch (nl.json): updated activity descriptions and added checklist functionality.
  - Norwegian (no.json): enhanced activity and checklist terminology.
  - Polish (pl.json): added new phrases for activities and checklist management.
  - Brazilian Portuguese (pt-br.json): updated activity-related terms and checklist features.
  - Russian (ru.json): included new phrases for activities and checklist management.
  - Swedish (sv.json): enhanced activity descriptions and checklist functionality.
  - Chinese (zh.json): added new activity-related phrases and checklist terms.
* fix: enhance image upload handling to support immich_id
* Add "not_enabled" message for Strava integration in multiple languages
  - Updated Spanish, French, Italian, Japanese, Korean, Dutch, Norwegian, Polish, Brazilian Portuguese, Russian, Swedish, and Chinese locale files to include a new message indicating that Strava integration is not enabled in the current instance.
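One of the commits above adds a management command to recalculate elevation data from GPX files. The core computation — summing positive and negative elevation deltas between consecutive track points — can be sketched with the standard library's XML parser (this is an illustrative sketch, not the project's actual command):

```python
import xml.etree.ElementTree as ET

# A tiny GPX 1.1 track with four points; elevations 100 -> 130 -> 120 -> 150.
GPX = """<?xml version="1.0"?>
<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1" creator="demo">
  <trk><trkseg>
    <trkpt lat="45.0" lon="7.0"><ele>100.0</ele></trkpt>
    <trkpt lat="45.0" lon="7.1"><ele>130.0</ele></trkpt>
    <trkpt lat="45.0" lon="7.2"><ele>120.0</ele></trkpt>
    <trkpt lat="45.0" lon="7.3"><ele>150.0</ele></trkpt>
  </trkseg></trk>
</gpx>"""

def elevation_stats(gpx_text: str) -> tuple[float, float]:
    """Return (total gain, total loss) in meters from a GPX document."""
    ns = {'gpx': 'http://www.topografix.com/GPX/1/1'}
    root = ET.fromstring(gpx_text)
    eles = [float(e.text) for e in root.findall('.//gpx:trkpt/gpx:ele', ns)]
    gain = sum(max(b - a, 0.0) for a, b in zip(eles, eles[1:]))
    loss = sum(max(a - b, 0.0) for a, b in zip(eles, eles[1:]))
    return gain, loss

print(elevation_stats(GPX))  # -> (60.0, 10.0)
```

Real GPS traces are noisy, so production implementations typically smooth or threshold the deltas before summing; the sketch above skips that step.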
---------
Signed-off-by: Lucas Zampieri <lzampier@redhat.com>
Co-authored-by: Ycer0n <37674033+Ycer0n@users.noreply.github.com>
Co-authored-by: taninme <5262715+taninme@users.noreply.github.com>
Co-authored-by: ferdousahmed <taninme@gmail.com>
Co-authored-by: Christian Zäske <blitzdose@gmail.com>
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
Co-authored-by: Lucas Zampieri <lcasmz54@gmail.com>
Co-authored-by: pplulee <pplulee@live.cn>
Co-authored-by: Cathelijne Hornstra <github@hornstra.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
backend/server/adventures/views/import_export_view.py (new file, 784 lines)
@@ -0,0 +1,784 @@
|
||||
# views.py
|
||||
import json
|
||||
import zipfile
|
||||
import tempfile
|
||||
import os
|
||||
from datetime import datetime
|
||||
from django.http import HttpResponse
|
||||
from django.core.files.storage import default_storage
|
||||
from django.core.files.base import ContentFile
|
||||
from django.db import transaction
|
||||
from django.contrib.auth import get_user_model
|
||||
from rest_framework import viewsets, status
|
||||
from rest_framework.decorators import action
|
||||
from rest_framework.response import Response
|
||||
from rest_framework.parsers import MultiPartParser
|
||||
from rest_framework.permissions import IsAuthenticated
|
||||
from django.conf import settings
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
|
||||
from adventures.models import (
|
||||
Location, Collection, Transportation, Note, Checklist, ChecklistItem,
|
||||
ContentImage, ContentAttachment, Category, Lodging, Visit, Trail, Activity
|
||||
)
|
||||
from worldtravel.models import VisitedCity, VisitedRegion, City, Region, Country
|
||||
|
||||
User = get_user_model()
|
||||
|
||||
class BackupViewSet(viewsets.ViewSet):
|
||||
permission_classes = [IsAuthenticated]
|
||||
"""
|
||||
Simple ViewSet for handling backup and import operations
|
||||
"""
|
||||
|
||||
@action(detail=False, methods=['get'])
|
||||
def export(self, request):
|
||||
"""
|
||||
Export all user data as a ZIP file containing JSON data and files
|
||||
"""
|
||||
user = request.user
|
||||
|
||||
# Build export data structure
|
||||
export_data = {
|
||||
'version': settings.ADVENTURELOG_RELEASE_VERSION,
|
||||
'export_date': datetime.now().isoformat(),
|
||||
'user_email': user.email,
|
||||
'user_username': user.username,
|
||||
'categories': [],
|
||||
'collections': [],
|
||||
'locations': [],
|
||||
'transportation': [],
|
||||
'notes': [],
|
||||
'checklists': [],
|
||||
'lodging': [],
|
||||
'visited_cities': [],
|
||||
'visited_regions': []
|
||||
}
|
||||
|
||||
# Export Visited Cities
|
||||
for visited_city in user.visitedcity_set.all():
|
||||
export_data['visited_cities'].append({
|
||||
'city': visited_city.city.id,
|
||||
})
|
||||
|
||||
# Export Visited Regions
|
||||
for visited_region in user.visitedregion_set.all():
|
||||
export_data['visited_regions'].append({
|
||||
'region': visited_region.region.id,
|
||||
})
|
||||
|
||||
# Export Categories
|
||||
for category in user.category_set.all():
|
||||
export_data['categories'].append({
|
||||
'name': category.name,
|
||||
'display_name': category.display_name,
|
||||
'icon': category.icon,
|
||||
})
|
||||
|
||||
# Export Collections
|
||||
for idx, collection in enumerate(user.collection_set.all()):
|
||||
export_data['collections'].append({
|
||||
'export_id': idx, # Add unique identifier for this export
|
||||
'name': collection.name,
|
||||
'description': collection.description,
|
||||
'is_public': collection.is_public,
|
||||
'start_date': collection.start_date.isoformat() if collection.start_date else None,
|
||||
'end_date': collection.end_date.isoformat() if collection.end_date else None,
|
||||
'is_archived': collection.is_archived,
|
||||
'link': collection.link,
|
||||
'shared_with_user_ids': [str(uuid) for uuid in collection.shared_with.values_list('uuid', flat=True)]
|
||||
})
|
||||
|
||||
# Create collection name to export_id mapping
|
||||
collection_name_to_id = {col.name: idx for idx, col in enumerate(user.collection_set.all())}
|
||||
|
||||
# Export locations with related data
|
||||
for idx, location in enumerate(user.location_set.all()):
|
||||
location_data = {
|
||||
'export_id': idx, # Add unique identifier for this export
|
||||
'name': location.name,
|
||||
'location': location.location,
|
||||
'tags': location.tags,
|
||||
'description': location.description,
|
||||
'rating': location.rating,
|
||||
'link': location.link,
|
||||
'is_public': location.is_public,
|
||||
'longitude': str(location.longitude) if location.longitude else None,
|
||||
'latitude': str(location.latitude) if location.latitude else None,
|
||||
'city': location.city_id,
|
||||
'region': location.region_id,
|
||||
'country': location.country_id,
|
||||
'category_name': location.category.name if location.category else None,
|
||||
'collection_export_ids': [collection_name_to_id[col_name] for col_name in location.collections.values_list('name', flat=True) if col_name in collection_name_to_id],
|
||||
'visits': [],
|
||||
'trails': [],
|
||||
'images': [],
|
||||
'attachments': []
|
||||
}
|
||||
|
||||
# Add visits
|
||||
for visit_idx, visit in enumerate(location.visits.all()):
|
||||
visit_data = {
|
||||
'export_id': visit_idx, # Add unique identifier for this visit
|
||||
'start_date': visit.start_date.isoformat() if visit.start_date else None,
|
||||
'end_date': visit.end_date.isoformat() if visit.end_date else None,
|
||||
'timezone': visit.timezone,
|
||||
'notes': visit.notes,
|
||||
'activities': []
|
||||
}
|
||||
|
||||
# Add activities for this visit
|
||||
for activity in visit.activities.all():
|
||||
activity_data = {
|
||||
'name': activity.name,
|
||||
'sport_type': activity.sport_type,
|
||||
'distance': float(activity.distance) if activity.distance else None,
|
||||
'moving_time': activity.moving_time.total_seconds() if activity.moving_time else None,
|
||||
'elapsed_time': activity.elapsed_time.total_seconds() if activity.elapsed_time else None,
|
||||
'rest_time': activity.rest_time.total_seconds() if activity.rest_time else None,
|
||||
'elevation_gain': float(activity.elevation_gain) if activity.elevation_gain else None,
|
||||
'elevation_loss': float(activity.elevation_loss) if activity.elevation_loss else None,
|
||||
'elev_high': float(activity.elev_high) if activity.elev_high else None,
|
||||
'elev_low': float(activity.elev_low) if activity.elev_low else None,
|
||||
'start_date': activity.start_date.isoformat() if activity.start_date else None,
|
||||
'start_date_local': activity.start_date_local.isoformat() if activity.start_date_local else None,
|
||||
'timezone': activity.timezone,
|
||||
'average_speed': float(activity.average_speed) if activity.average_speed else None,
|
||||
'max_speed': float(activity.max_speed) if activity.max_speed else None,
|
||||
'average_cadence': float(activity.average_cadence) if activity.average_cadence else None,
|
||||
'calories': float(activity.calories) if activity.calories else None,
|
||||
'start_lat': float(activity.start_lat) if activity.start_lat else None,
|
||||
'start_lng': float(activity.start_lng) if activity.start_lng else None,
|
||||
'end_lat': float(activity.end_lat) if activity.end_lat else None,
|
||||
'end_lng': float(activity.end_lng) if activity.end_lng else None,
|
||||
'external_service_id': activity.external_service_id,
|
||||
'trail_name': activity.trail.name if activity.trail else None, # Link by trail name
|
||||
'gpx_filename': None
|
||||
}
|
||||
|
||||
# Handle GPX file
|
||||
if activity.gpx_file:
|
||||
activity_data['gpx_filename'] = activity.gpx_file.name.split('/')[-1]
|
||||
|
||||
visit_data['activities'].append(activity_data)
|
||||
|
||||
location_data['visits'].append(visit_data)
|
||||
|
||||
# Add trails for this location
|
||||
for trail in location.trails.all():
|
||||
trail_data = {
|
||||
'name': trail.name,
|
||||
'link': trail.link,
|
||||
'wanderer_id': trail.wanderer_id,
|
||||
'created_at': trail.created_at.isoformat() if trail.created_at else None
|
||||
}
|
||||
location_data['trails'].append(trail_data)
|
||||
|
||||
# Add images
|
||||
for image in location.images.all():
|
||||
image_data = {
|
||||
'immich_id': image.immich_id,
|
||||
'is_primary': image.is_primary,
|
||||
'filename': None,
|
||||
}
|
||||
if image.image:
|
||||
image_data['filename'] = image.image.name.split('/')[-1]
|
||||
location_data['images'].append(image_data)
|
||||
|
||||
# Add attachments
|
||||
for attachment in location.attachments.all():
|
||||
attachment_data = {
|
||||
'name': attachment.name,
|
||||
'filename': None
|
||||
}
|
||||
if attachment.file:
|
||||
attachment_data['filename'] = attachment.file.name.split('/')[-1]
|
||||
location_data['attachments'].append(attachment_data)
|
||||
|
||||
export_data['locations'].append(location_data)
|
||||
|
||||
# Export Transportation
|
||||
for transport in user.transportation_set.all():
|
||||
collection_export_id = None
|
||||
if transport.collection:
|
||||
collection_export_id = collection_name_to_id.get(transport.collection.name)
|
||||
|
||||
export_data['transportation'].append({
|
||||
'type': transport.type,
|
||||
'name': transport.name,
|
||||
'description': transport.description,
|
||||
'rating': transport.rating,
|
||||
'link': transport.link,
|
||||
'date': transport.date.isoformat() if transport.date else None,
|
||||
'end_date': transport.end_date.isoformat() if transport.end_date else None,
|
||||
'start_timezone': transport.start_timezone,
|
||||
'end_timezone': transport.end_timezone,
|
||||
'flight_number': transport.flight_number,
|
||||
'from_location': transport.from_location,
|
||||
'origin_latitude': str(transport.origin_latitude) if transport.origin_latitude else None,
|
||||
'origin_longitude': str(transport.origin_longitude) if transport.origin_longitude else None,
|
||||
'destination_latitude': str(transport.destination_latitude) if transport.destination_latitude else None,
|
||||
'destination_longitude': str(transport.destination_longitude) if transport.destination_longitude else None,
|
||||
'to_location': transport.to_location,
|
||||
'is_public': transport.is_public,
|
||||
'collection_export_id': collection_export_id
|
||||
})
|
||||
|
||||
# Export Notes
|
||||
for note in user.note_set.all():
|
||||
collection_export_id = None
|
||||
if note.collection:
|
||||
collection_export_id = collection_name_to_id.get(note.collection.name)
|
||||
|
||||
export_data['notes'].append({
|
||||
'name': note.name,
|
||||
'content': note.content,
|
||||
'links': note.links,
|
||||
'date': note.date.isoformat() if note.date else None,
|
||||
'is_public': note.is_public,
|
||||
'collection_export_id': collection_export_id
|
||||
})
|
||||
|
||||
# Export Checklists
|
||||
for checklist in user.checklist_set.all():
|
||||
collection_export_id = None
|
||||
if checklist.collection:
|
||||
collection_export_id = collection_name_to_id.get(checklist.collection.name)
|
||||
|
||||
checklist_data = {
|
||||
'name': checklist.name,
|
||||
'date': checklist.date.isoformat() if checklist.date else None,
|
||||
'is_public': checklist.is_public,
|
||||
'collection_export_id': collection_export_id,
|
||||
'items': []
|
||||
}
|
||||
|
||||
# Add checklist items
|
||||
for item in checklist.checklistitem_set.all():
|
||||
checklist_data['items'].append({
|
||||
'name': item.name,
|
||||
'is_checked': item.is_checked
|
||||
})
|
||||
|
||||
export_data['checklists'].append(checklist_data)
|
||||
|
||||
        # Export Lodging
        for lodging in user.lodging_set.all():
            collection_export_id = None
            if lodging.collection:
                collection_export_id = collection_name_to_id.get(lodging.collection.name)

            export_data['lodging'].append({
                'name': lodging.name,
                'type': lodging.type,
                'description': lodging.description,
                'rating': lodging.rating,
                'link': lodging.link,
                'check_in': lodging.check_in.isoformat() if lodging.check_in else None,
                'check_out': lodging.check_out.isoformat() if lodging.check_out else None,
                'timezone': lodging.timezone,
                'reservation_number': lodging.reservation_number,
                'price': str(lodging.price) if lodging.price else None,
                'latitude': str(lodging.latitude) if lodging.latitude else None,
                'longitude': str(lodging.longitude) if lodging.longitude else None,
                'location': lodging.location,
                'is_public': lodging.is_public,
                'collection_export_id': collection_export_id
            })

        # Create ZIP file
        with tempfile.NamedTemporaryFile(delete=False, suffix='.zip') as tmp_file:
            with zipfile.ZipFile(tmp_file.name, 'w', zipfile.ZIP_DEFLATED) as zip_file:
                # Add JSON data
                zip_file.writestr('data.json', json.dumps(export_data, indent=2))

                # Add images, attachments, and GPX files
                files_added = set()

                for location in user.location_set.all():
                    # Add images
                    for image in location.images.all():
                        if image.image and image.image.name not in files_added:
                            try:
                                image_content = default_storage.open(image.image.name).read()
                                filename = image.image.name.split('/')[-1]
                                zip_file.writestr(f'images/{filename}', image_content)
                                files_added.add(image.image.name)
                            except Exception as e:
                                print(f"Error adding image {image.image.name}: {e}")

                    # Add attachments
                    for attachment in location.attachments.all():
                        if attachment.file and attachment.file.name not in files_added:
                            try:
                                file_content = default_storage.open(attachment.file.name).read()
                                filename = attachment.file.name.split('/')[-1]
                                zip_file.writestr(f'attachments/{filename}', file_content)
                                files_added.add(attachment.file.name)
                            except Exception as e:
                                print(f"Error adding attachment {attachment.file.name}: {e}")

                    # Add GPX files from activities
                    for visit in location.visits.all():
                        for activity in visit.activities.all():
                            if activity.gpx_file and activity.gpx_file.name not in files_added:
                                try:
                                    gpx_content = default_storage.open(activity.gpx_file.name).read()
                                    filename = activity.gpx_file.name.split('/')[-1]
                                    zip_file.writestr(f'gpx/{filename}', gpx_content)
                                    files_added.add(activity.gpx_file.name)
                                except Exception as e:
                                    print(f"Error adding GPX file {activity.gpx_file.name}: {e}")

            # Return ZIP file as response (use a distinct name to avoid shadowing zip_file above)
            with open(tmp_file.name, 'rb') as zip_fh:
                response = HttpResponse(zip_fh.read(), content_type='application/zip')
                filename = f"adventurelog_backup_{user.username}_{datetime.now().strftime('%Y%m%d_%H%M%S')}.zip"
                response['Content-Disposition'] = f'attachment; filename="{filename}"'

            # Clean up
            os.unlink(tmp_file.name)
            return response

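The export routine above writes a `data.json` manifest plus media payloads under prefixed paths (`images/`, `attachments/`, `gpx/`) into a single ZIP. A minimal, self-contained sketch of that archive layout, with illustrative names rather than the production models:

```python
import io
import json
import zipfile

def build_backup_zip(data, files):
    """Build an in-memory ZIP holding a data.json manifest plus raw files.

    `data` is any JSON-serializable manifest; `files` maps archive paths
    (e.g. 'images/photo.jpg') to bytes. Hypothetical helper for illustration.
    """
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED) as zf:
        # Manifest first, then the binary payloads under their prefixes
        zf.writestr('data.json', json.dumps(data, indent=2))
        for path, content in files.items():
            zf.writestr(path, content)
    buf.seek(0)
    return buf

archive = build_backup_zip({'notes': []}, {'images/a.jpg': b'\xff\xd8'})
with zipfile.ZipFile(archive) as zf:
    print(sorted(zf.namelist()))  # -> ['data.json', 'images/a.jpg']
```

The same `writestr` calls work against a named temporary file, as the view does; BytesIO just keeps the sketch self-contained.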
    @action(
        detail=False,
        methods=['post'],
        parser_classes=[MultiPartParser],
        url_path='import',  # changes the URL path to /import
        url_name='import'   # changes the reverse name to 'import'
    )
    def import_data(self, request):
        """
        Import data from a ZIP backup file
        """
        if 'file' not in request.FILES:
            return Response({'error': 'No file provided'}, status=status.HTTP_400_BAD_REQUEST)

        if 'confirm' not in request.data or request.data['confirm'] != 'yes':
            return Response({'error': 'Confirmation required to proceed with import'},
                            status=status.HTTP_400_BAD_REQUEST)

        backup_file = request.FILES['file']
        user = request.user

        # Save file temporarily
        with tempfile.NamedTemporaryFile(delete=False, suffix='.zip') as tmp_file:
            for chunk in backup_file.chunks():
                tmp_file.write(chunk)
            tmp_file_path = tmp_file.name

        try:
            with zipfile.ZipFile(tmp_file_path, 'r') as zip_file:
                # Validate backup structure
                if 'data.json' not in zip_file.namelist():
                    return Response({'error': 'Invalid backup file - missing data.json'},
                                    status=status.HTTP_400_BAD_REQUEST)

                # Load data
                backup_data = json.loads(zip_file.read('data.json').decode('utf-8'))

                # Import with transaction
                with transaction.atomic():
                    # Clear existing data first
                    self._clear_user_data(user)
                    summary = self._import_data(backup_data, zip_file, user)

                return Response({
                    'success': True,
                    'message': 'Data imported successfully',
                    'summary': summary
                }, status=status.HTTP_200_OK)

        except json.JSONDecodeError:
            return Response({'error': 'Invalid JSON in backup file'},
                            status=status.HTTP_400_BAD_REQUEST)
        except Exception:
            import logging
            logging.error("Import failed", exc_info=True)
            return Response({'error': 'An internal error occurred during import'},
                            status=status.HTTP_400_BAD_REQUEST)
        finally:
            os.unlink(tmp_file_path)

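The validation steps in `import_data` (readable ZIP, `data.json` present, parseable JSON) can be exercised independently of Django. A hedged, framework-free sketch; `validate_backup` is a hypothetical helper, not part of the view:

```python
import io
import json
import zipfile

def validate_backup(fileobj):
    """Check a backup file the way the import view does and return the
    parsed data.json manifest, raising ValueError on any structural problem."""
    try:
        zf = zipfile.ZipFile(fileobj)
    except zipfile.BadZipFile:
        raise ValueError('not a ZIP archive')
    if 'data.json' not in zf.namelist():
        raise ValueError('missing data.json')
    try:
        return json.loads(zf.read('data.json').decode('utf-8'))
    except json.JSONDecodeError:
        raise ValueError('invalid JSON in data.json')

# Build a minimal valid backup in memory and validate it
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w') as zf:
    zf.writestr('data.json', '{"locations": []}')
buf.seek(0)
manifest = validate_backup(buf)
print(manifest)  # -> {'locations': []}
```

Each ValueError branch corresponds to one of the 400 responses returned by the view above.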
    def _clear_user_data(self, user):
        """Clear all existing user data before import"""
        # Delete in reverse order of dependencies
        user.activity_set.all().delete()  # Delete activities first
        user.trail_set.all().delete()  # Delete trails
        user.checklistitem_set.all().delete()
        user.checklist_set.all().delete()
        user.note_set.all().delete()
        user.transportation_set.all().delete()
        user.lodging_set.all().delete()

        # Delete location-related data
        user.contentimage_set.all().delete()
        user.contentattachment_set.all().delete()
        # Visits are deleted via cascade when locations are deleted
        user.location_set.all().delete()

        # Delete collections and categories last
        user.collection_set.all().delete()
        user.category_set.all().delete()

        # Clear visited cities and regions
        user.visitedcity_set.all().delete()
        user.visitedregion_set.all().delete()

    def _import_data(self, backup_data, zip_file, user):
        """Import backup data and return summary"""
        from datetime import timedelta

        # Track mappings and counts
        category_map = {}
        collection_map = {}  # Map export_id to actual collection object
        location_map = {}  # Map location export_id to actual location object
        trail_name_map = {}  # Map (location_id, trail_name) to trail object
        summary = {
            'categories': 0, 'collections': 0, 'locations': 0,
            'transportation': 0, 'notes': 0, 'checklists': 0,
            'checklist_items': 0, 'lodging': 0, 'images': 0,
            'attachments': 0, 'visited_cities': 0, 'visited_regions': 0,
            'trails': 0, 'activities': 0, 'gpx_files': 0
        }

        # Import Visited Cities
        for city_data in backup_data.get('visited_cities', []):
            try:
                city_obj = City.objects.get(id=city_data['city'])
                visited_city, created = VisitedCity.objects.get_or_create(user=user, city=city_obj)
                if created:
                    summary['visited_cities'] += 1
            except City.DoesNotExist:
                # If the city no longer exists, skip it
                pass

        # Import Visited Regions
        for region_data in backup_data.get('visited_regions', []):
            try:
                region_obj = Region.objects.get(id=region_data['region'])
                visited_region, created = VisitedRegion.objects.get_or_create(user=user, region=region_obj)
                if created:
                    summary['visited_regions'] += 1
            except Region.DoesNotExist:
                # If the region no longer exists, skip it
                pass

        # Import Categories
        for cat_data in backup_data.get('categories', []):
            category = Category.objects.create(
                user=user,
                name=cat_data['name'],
                display_name=cat_data['display_name'],
                icon=cat_data.get('icon', '🌍')
            )
            category_map[cat_data['name']] = category
            summary['categories'] += 1

        # Import Collections
        for col_data in backup_data.get('collections', []):
            collection = Collection.objects.create(
                user=user,
                name=col_data['name'],
                description=col_data.get('description', ''),
                is_public=col_data.get('is_public', False),
                start_date=col_data.get('start_date'),
                end_date=col_data.get('end_date'),
                is_archived=col_data.get('is_archived', False),
                link=col_data.get('link')
            )
            collection_map[col_data['export_id']] = collection
            summary['collections'] += 1

            # Handle shared users
            for uuid in col_data.get('shared_with_user_ids', []):
                try:
                    shared_user = User.objects.get(uuid=uuid)
                    if shared_user.public_profile:
                        collection.shared_with.add(shared_user)
                except User.DoesNotExist:
                    pass

        # Import Locations
        for adv_data in backup_data.get('locations', []):
            city = None
            if adv_data.get('city'):
                try:
                    city = City.objects.get(id=adv_data['city'])
                except City.DoesNotExist:
                    city = None

            region = None
            if adv_data.get('region'):
                try:
                    region = Region.objects.get(id=adv_data['region'])
                except Region.DoesNotExist:
                    region = None

            country = None
            if adv_data.get('country'):
                try:
                    country = Country.objects.get(id=adv_data['country'])
                except Country.DoesNotExist:
                    country = None

            location = Location(
                user=user,
                name=adv_data['name'],
                location=adv_data.get('location'),
                tags=adv_data.get('tags', []),
                description=adv_data.get('description'),
                rating=adv_data.get('rating'),
                link=adv_data.get('link'),
                is_public=adv_data.get('is_public', False),
                longitude=adv_data.get('longitude'),
                latitude=adv_data.get('latitude'),
                city=city,
                region=region,
                country=country,
                category=category_map.get(adv_data.get('category_name'))
            )
            location.save(_skip_geocode=True)  # Skip geocoding for now
            location_map[adv_data['export_id']] = location

            # Add to collections using export_ids - MUST be done after save()
            for collection_export_id in adv_data.get('collection_export_ids', []):
                if collection_export_id in collection_map:
                    location.collections.add(collection_map[collection_export_id])

            # Import trails for this location first
            for trail_data in adv_data.get('trails', []):
                trail = Trail.objects.create(
                    user=user,
                    location=location,
                    name=trail_data['name'],
                    link=trail_data.get('link'),
                    wanderer_id=trail_data.get('wanderer_id'),
                    created_at=trail_data.get('created_at')
                )
                trail_name_map[(location.id, trail_data['name'])] = trail
                summary['trails'] += 1

            # Import visits and their activities
            for visit_data in adv_data.get('visits', []):
                visit = Visit.objects.create(
                    location=location,
                    start_date=visit_data.get('start_date'),
                    end_date=visit_data.get('end_date'),
                    timezone=visit_data.get('timezone'),
                    notes=visit_data.get('notes')
                )

                # Import activities for this visit
                for activity_data in visit_data.get('activities', []):
                    # Find the trail if specified
                    trail = None
                    if activity_data.get('trail_name'):
                        trail = trail_name_map.get((location.id, activity_data['trail_name']))

                    # Convert time durations back from seconds
                    moving_time = None
                    if activity_data.get('moving_time') is not None:
                        moving_time = timedelta(seconds=activity_data['moving_time'])

                    elapsed_time = None
                    if activity_data.get('elapsed_time') is not None:
                        elapsed_time = timedelta(seconds=activity_data['elapsed_time'])

                    rest_time = None
                    if activity_data.get('rest_time') is not None:
                        rest_time = timedelta(seconds=activity_data['rest_time'])

                    activity = Activity(
                        user=user,
                        visit=visit,
                        trail=trail,
                        name=activity_data['name'],
                        sport_type=activity_data.get('sport_type'),
                        distance=activity_data.get('distance'),
                        moving_time=moving_time,
                        elapsed_time=elapsed_time,
                        rest_time=rest_time,
                        elevation_gain=activity_data.get('elevation_gain'),
                        elevation_loss=activity_data.get('elevation_loss'),
                        elev_high=activity_data.get('elev_high'),
                        elev_low=activity_data.get('elev_low'),
                        start_date=activity_data.get('start_date'),
                        start_date_local=activity_data.get('start_date_local'),
                        timezone=activity_data.get('timezone'),
                        average_speed=activity_data.get('average_speed'),
                        max_speed=activity_data.get('max_speed'),
                        average_cadence=activity_data.get('average_cadence'),
                        calories=activity_data.get('calories'),
                        start_lat=activity_data.get('start_lat'),
                        start_lng=activity_data.get('start_lng'),
                        end_lat=activity_data.get('end_lat'),
                        end_lng=activity_data.get('end_lng'),
                        external_service_id=activity_data.get('external_service_id')
                    )

                    # Handle GPX file
                    gpx_filename = activity_data.get('gpx_filename')
                    if gpx_filename:
                        try:
                            gpx_content = zip_file.read(f'gpx/{gpx_filename}')
                            gpx_file = ContentFile(gpx_content, name=gpx_filename)
                            activity.gpx_file = gpx_file
                            summary['gpx_files'] += 1
                        except KeyError:
                            pass  # GPX file not found in backup

                    activity.save()
                    summary['activities'] += 1

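The duration fields round-trip through plain seconds: export serializes with `total_seconds()` and the import code above reconstructs with `timedelta(seconds=...)`. A standalone sketch of that round-trip (values are illustrative):

```python
from datetime import timedelta

# An activity duration as stored on the model
moving_time = timedelta(hours=1, minutes=23, seconds=45)

# Export side: serialize to a plain number of seconds
exported = moving_time.total_seconds()   # -> 5025.0

# Import side: rebuild the timedelta, as done above
restored = timedelta(seconds=exported)
assert restored == moving_time  # lossless for whole-microsecond values
print(exported)  # -> 5025.0
```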
            # Import images
            content_type = ContentType.objects.get(model='location')

            for img_data in adv_data.get('images', []):
                immich_id = img_data.get('immich_id')
                if immich_id:
                    ContentImage.objects.create(
                        user=user,
                        immich_id=immich_id,
                        is_primary=img_data.get('is_primary', False),
                        content_type=content_type,
                        object_id=location.id
                    )
                    summary['images'] += 1
                else:
                    filename = img_data.get('filename')
                    if filename:
                        try:
                            img_content = zip_file.read(f'images/{filename}')
                            img_file = ContentFile(img_content, name=filename)
                            ContentImage.objects.create(
                                user=user,
                                image=img_file,
                                is_primary=img_data.get('is_primary', False),
                                content_type=content_type,
                                object_id=location.id
                            )
                            summary['images'] += 1
                        except KeyError:
                            pass

            # Import attachments
            for att_data in adv_data.get('attachments', []):
                filename = att_data.get('filename')
                if filename:
                    try:
                        att_content = zip_file.read(f'attachments/{filename}')
                        att_file = ContentFile(att_content, name=filename)
                        ContentAttachment.objects.create(
                            user=user,
                            file=att_file,
                            name=att_data.get('name'),
                            content_type=content_type,
                            object_id=location.id
                        )
                        summary['attachments'] += 1
                    except KeyError:
                        pass

            summary['locations'] += 1

        # Import Transportation
        for trans_data in backup_data.get('transportation', []):
            collection = None
            if trans_data.get('collection_export_id') is not None:
                collection = collection_map.get(trans_data['collection_export_id'])

            Transportation.objects.create(
                user=user,
                type=trans_data['type'],
                name=trans_data['name'],
                description=trans_data.get('description'),
                rating=trans_data.get('rating'),
                link=trans_data.get('link'),
                date=trans_data.get('date'),
                end_date=trans_data.get('end_date'),
                start_timezone=trans_data.get('start_timezone'),
                end_timezone=trans_data.get('end_timezone'),
                flight_number=trans_data.get('flight_number'),
                from_location=trans_data.get('from_location'),
                origin_latitude=trans_data.get('origin_latitude'),
                origin_longitude=trans_data.get('origin_longitude'),
                destination_latitude=trans_data.get('destination_latitude'),
                destination_longitude=trans_data.get('destination_longitude'),
                to_location=trans_data.get('to_location'),
                is_public=trans_data.get('is_public', False),
                collection=collection
            )
            summary['transportation'] += 1

        # Import Notes
        for note_data in backup_data.get('notes', []):
            collection = None
            if note_data.get('collection_export_id') is not None:
                collection = collection_map.get(note_data['collection_export_id'])

            Note.objects.create(
                user=user,
                name=note_data['name'],
                content=note_data.get('content'),
                links=note_data.get('links', []),
                date=note_data.get('date'),
                is_public=note_data.get('is_public', False),
                collection=collection
            )
            summary['notes'] += 1

        # Import Checklists
        for check_data in backup_data.get('checklists', []):
            collection = None
            if check_data.get('collection_export_id') is not None:
                collection = collection_map.get(check_data['collection_export_id'])

            checklist = Checklist.objects.create(
                user=user,
                name=check_data['name'],
                date=check_data.get('date'),
                is_public=check_data.get('is_public', False),
                collection=collection
            )

            # Import checklist items
            for item_data in check_data.get('items', []):
                ChecklistItem.objects.create(
                    user=user,
                    checklist=checklist,
                    name=item_data['name'],
                    is_checked=item_data.get('is_checked', False)
                )
                summary['checklist_items'] += 1

            summary['checklists'] += 1

        # Import Lodging
        for lodg_data in backup_data.get('lodging', []):
            collection = None
            if lodg_data.get('collection_export_id') is not None:
                collection = collection_map.get(lodg_data['collection_export_id'])

            Lodging.objects.create(
                user=user,
                name=lodg_data['name'],
                type=lodg_data.get('type', 'other'),
                description=lodg_data.get('description'),
                rating=lodg_data.get('rating'),
                link=lodg_data.get('link'),
                check_in=lodg_data.get('check_in'),
                check_out=lodg_data.get('check_out'),
                timezone=lodg_data.get('timezone'),
                reservation_number=lodg_data.get('reservation_number'),
                price=lodg_data.get('price'),
                latitude=lodg_data.get('latitude'),
                longitude=lodg_data.get('longitude'),
                location=lodg_data.get('location'),
                is_public=lodg_data.get('is_public', False),
                collection=collection
            )
            summary['lodging'] += 1

        return summary
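Throughout `_import_data`, the `collection_map` pattern re-links records by their serialized `export_id`, since database ids change on re-import. A minimal dict-based sketch of that remapping, with plain dicts standing in for model instances:

```python
# Map each exported collection's export_id to its freshly created object
collection_map = {}
for col_data in [{'export_id': 0, 'name': 'Summer'}, {'export_id': 1, 'name': 'Winter'}]:
    # Stand-in for Collection.objects.create(...)
    collection_map[col_data['export_id']] = {'name': col_data['name']}

# A dependent record references its collection only by export_id
lodg_data = {'name': 'Hotel', 'collection_export_id': 1}
collection = None
if lodg_data.get('collection_export_id') is not None:
    collection = collection_map.get(lodg_data['collection_export_id'])

print(collection['name'])  # -> Winter
```

Missing or dangling export_ids resolve to `None` via `.get()`, mirroring how the real import tolerates references to collections that were not in the backup.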