Bug Fixes + Duplicate Support (#1016)
* Update README.md supporter list

* Fix: Multiple bug fixes and features bundle (#888, #991, #617, #984) (#1007)

* fix: resolve location creation failures, broken image uploads, and invalid URL handling

  - Add missing addToast import in LocationDetails.svelte for proper error feedback
  - Add objectId check and error-response handling in ImageManagement.svelte to prevent ghost images
  - Add Content-Type check in +page.server.ts image action to handle non-JSON backend responses
  - Add client-side URL validation in LocationDetails.svelte (invalid URLs → null)
  - Improve Django field-error extraction for user-friendly toast messages
  - Clean up empty description fields (whitespace → null)
  - Update BUGFIX_DOCUMENTATION.md with detailed fix descriptions

* feat: bug fixes and new features bundle

  Bug fixes:
  - fix: resolve PATCH location with visits (#888)
  - fix: Wikipedia/URL image upload via server-side proxy (#991)
  - fix: private/public toggle race condition (#617)
  - fix: location creation feedback (addToast import)
  - fix: invalid URL handling for locations and collections
  - fix: world map country highlighting (bg-*-200 -> bg-*-400)
  - fix: clipboard API polyfill for HTTP contexts
  - fix: MultipleObjectsReturned for duplicate images
  - fix: SvelteKit proxy sessionid cookie forwarding

  Features:
  - feat: duplicate location button (list + detail view)
  - feat: duplicate collection button
  - feat: i18n translations for 19 languages
  - feat: improved error handling and user feedback

  Technical:
  - Backend: fetch_from_url endpoint with SSRF protection
  - Backend: validate_link() for collections
  - Backend: file_permissions filter() instead of get()
  - Frontend: copyToClipboard() helper function
  - Frontend: clipboard polyfill via server-side injection

* chore: switch docker-compose from image to build

  Use local source-code builds instead of upstream :latest images to preserve our custom patches and fixes.
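The "fetch_from_url endpoint with SSRF protection" listed above implies rejecting URLs whose host resolves to an internal address before the server fetches them. A minimal stdlib-only sketch of such a guard (the helper name and exact policy are illustrative assumptions, not the project's actual code):

```python
import ipaddress
import socket
from urllib.parse import urlparse


def is_safe_fetch_url(url: str) -> bool:
    """Basic SSRF guard: allow only http(s) URLs whose host resolves
    exclusively to public addresses (illustrative sketch)."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.hostname:
        return False
    try:
        infos = socket.getaddrinfo(parsed.hostname, None)
    except socket.gaierror:
        return False
    for info in infos:
        # info[4][0] is the resolved address; strip any IPv6 scope id.
        ip = ipaddress.ip_address(info[4][0].split("%")[0])
        if ip.is_private or ip.is_loopback or ip.is_link_local or ip.is_reserved:
            return False
    return True
```

Checking *every* resolved address (not just the first) matters because a hostname can resolve to a mix of public and private IPs; the hardening commit later in this message ("check all IPs, re-validate redirects") points at the same pitfall.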
* fix: lodging save errors, AI language support, and i18n improvements

  - Fix Lodging save: add res.ok checks, error toasts, isSaving state (#984)
  - Fix URL validation: silently set invalid URLs to null (Lodging, Transportation)
  - Fix AI description language: pass user locale to Wikipedia API
  - Fix missing i18n keys: Strava toggle buttons (show/hide)
  - Add CHANGELOG.md
  - Remove internal documentation from public tracking
  - Update .gitignore for Cursor IDE and internal docs

  Co-authored-by: Cursor <cursoragent@cursor.com>

* feat: update location duplication handling, improve UI feedback, and enhance localization support

---------

Co-authored-by: AdventureLog Bugfix <bugfix@adventurelog.local>
Co-authored-by: madmp87 <info@so-pa.de>
Co-authored-by: Mathias Ponnwitz <devuser@dockge-dev.fritz.box>
Co-authored-by: Cursor <cursoragent@cursor.com>
Co-authored-by: Sean Morley <mail@seanmorley.com>

* Enhance duplication functionality for collections and locations; update UI to reflect changes

* Potential fix for code scanning alert no. 49: Information exposure through an exception

  Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>

* Update Django and Pillow versions in requirements.txt

* Fix error logging for image fetch timeout in ContentImageViewSet

* Update requirements.txt to include jaraco.context and wheel for security fixes

* Update app version and add security vulnerabilities to .trivyignore

* Update backend/server/adventures/views/collection_view.py

  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Update frontend/src/lib/types.ts

  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Reorder build and image directives in docker-compose.yml for clarity

* Refactor code structure for improved readability and maintainability

* Remove inline clipboard polyfill script injection from server hooks (#1019)

  * Initial plan
  * Remove inline clipboard polyfill script injection from hooks.server.ts

  Co-authored-by: seanmorley15 <98704938+seanmorley15@users.noreply.github.com>
  ---------
  Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
  Co-authored-by: seanmorley15 <98704938+seanmorley15@users.noreply.github.com>

* Fix unhandled promise rejections in copyToClipboard click handlers (#1018)

  * Initial plan
  * Fix: make copyToClipboard handlers async with try/catch error toast

  Co-authored-by: seanmorley15 <98704938+seanmorley15@users.noreply.github.com>
  ---------
  Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
  Co-authored-by: seanmorley15 <98704938+seanmorley15@users.noreply.github.com>

* Harden `fetch_from_url` image proxy: require auth, rate-limit, and strengthen SSRF protections (#1017)

  * Initial plan
  * Harden fetch_from_url: require auth, rate-limit, block non-standard ports, check all IPs, re-validate redirects

  Co-authored-by: seanmorley15 <98704938+seanmorley15@users.noreply.github.com>
  ---------
  Co-authored-by: copilot-swe-agent[bot]
<198982749+Copilot@users.noreply.github.com>
  Co-authored-by: seanmorley15 <98704938+seanmorley15@users.noreply.github.com>

* Fix subregion filtering in world travel page to exclude null values

* Update package.json to use caret (^) for versioning in overrides

* fix: update package dependencies for compatibility and stability

  - Added cookie dependency with version constraint <0.7.0
  - Updated svelte dependency to allow versions <=5.51.4
  - Updated @sveltejs/adapter-vercel dependency to allow versions <6.3.2

* Refactor code structure for improved readability and maintainability

---------

Co-authored-by: madmp87 <79420509+madmp87@users.noreply.github.com>
Co-authored-by: AdventureLog Bugfix <bugfix@adventurelog.local>
Co-authored-by: madmp87 <info@so-pa.de>
Co-authored-by: Mathias Ponnwitz <devuser@dockge-dev.fritz.box>
Co-authored-by: Cursor <cursoragent@cursor.com>
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Copilot <198982749+Copilot@users.noreply.github.com>
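Several of the fixes above ("invalid URLs → null" for locations, lodging, and transportation) describe the same normalization: instead of rejecting a save, an unparseable link is silently dropped. A minimal sketch of that behaviour (the function name is hypothetical; the project's frontend does this in Svelte, shown here in Python for illustration):

```python
from urllib.parse import urlparse


def clean_link(value):
    """Return the URL if it looks valid, otherwise None
    (mirrors the 'invalid URLs -> null' behaviour described above)."""
    if not value or not value.strip():
        return None
    candidate = value.strip()
    parsed = urlparse(candidate)
    # Require an http(s) scheme and a host; anything else becomes None.
    if parsed.scheme in ("http", "https") and parsed.netloc:
        return candidate
    return None
```

The design choice here is deliberate: a bad link is cosmetic, so nulling it out lets the rest of the record save instead of failing the whole form.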
@@ -8,12 +8,13 @@ from rest_framework.parsers import MultiPartParser
 from rest_framework import status
 from django.http import HttpResponse
 from django.conf import settings
+from django.core.files.base import ContentFile
 import io
 import os
 import json
 import zipfile
 import tempfile
-from adventures.models import Collection, Location, Transportation, Note, Checklist, CollectionInvite, ContentImage, CollectionItineraryItem, Lodging, CollectionItineraryDay, ContentAttachment, Category
+from adventures.models import Collection, Location, Transportation, Note, Checklist, ChecklistItem, CollectionInvite, ContentImage, CollectionItineraryItem, Lodging, CollectionItineraryDay, ContentAttachment, Category
 from adventures.permissions import CollectionShared
 from adventures.serializers import CollectionSerializer, CollectionInviteSerializer, UltraSlimCollectionSerializer, CollectionItineraryItemSerializer, CollectionItineraryDaySerializer
 from users.models import CustomUser as User
@@ -791,6 +792,241 @@ class CollectionViewSet(viewsets.ModelViewSet):
         serializer = self.get_serializer(new_collection)
         return Response(serializer.data, status=status.HTTP_201_CREATED)
 
+
+    @action(detail=True, methods=['post'])
+    def duplicate(self, request, pk=None):
+        """Create a duplicate of an existing collection.
+
+        Copies collection metadata and linked content:
+        - locations (linked, not cloned)
+        - transportation, notes, checklists (with items), lodging
+        - itinerary days and itinerary items
+        Shared users are not copied and the new collection is private.
+        """
+        original = self.get_object()
+
+        # Only the owner can duplicate
+        if original.user != request.user:
+            return Response(
+                {"error": "You do not have permission to duplicate this collection."},
+                status=status.HTTP_403_FORBIDDEN,
+            )
+
+        try:
+            with transaction.atomic():
+                new_collection = Collection.objects.create(
+                    user=request.user,
+                    name=f"Copy of {original.name}",
+                    description=original.description,
+                    link=original.link,
+                    is_public=False,
+                    is_archived=False,
+                    start_date=original.start_date,
+                    end_date=original.end_date,
+                )
+
+                # Link existing locations to the new collection
+                linked_locations = list(original.locations.all())
+                if linked_locations:
+                    new_collection.locations.set(linked_locations)
+
+                # Duplicate primary image if it exists so permissions align with the new collection
+                if original.primary_image:
+                    original_primary = original.primary_image
+                    if original_primary.image:
+                        try:
+                            original_primary.image.open('rb')
+                            image_bytes = original_primary.image.read()
+                        finally:
+                            try:
+                                original_primary.image.close()
+                            except Exception:
+                                pass
+
+                        file_name = (original_primary.image.name or '').split('/')[-1] or 'image.webp'
+                        new_primary = ContentImage(
+                            user=request.user,
+                            image=ContentFile(image_bytes, name=file_name),
+                            immich_id=None,
+                            is_primary=original_primary.is_primary,
+                        )
+                    else:
+                        new_primary = ContentImage(
+                            user=request.user,
+                            immich_id=original_primary.immich_id,
+                            is_primary=original_primary.is_primary,
+                        )
+
+                    new_primary.content_object = new_collection
+                    new_primary.save()
+                    new_collection.primary_image = new_primary
+                    new_collection.save(update_fields=['primary_image'])
+
+                def _copy_generic_media(source_obj, target_obj):
+                    # Images
+                    for img in source_obj.images.all():
+                        if img.image:
+                            try:
+                                img.image.open('rb')
+                                image_bytes = img.image.read()
+                            finally:
+                                try:
+                                    img.image.close()
+                                except Exception:
+                                    pass
+
+                            file_name = (img.image.name or '').split('/')[-1] or 'image.webp'
+                            media = ContentImage(
+                                user=request.user,
+                                image=ContentFile(image_bytes, name=file_name),
+                                immich_id=None,
+                                is_primary=img.is_primary,
+                            )
+                        else:
+                            media = ContentImage(
+                                user=request.user,
+                                immich_id=img.immich_id,
+                                is_primary=img.is_primary,
+                            )
+
+                        media.content_object = target_obj
+                        media.save()
+
+                    # Attachments
+                    for attachment in source_obj.attachments.all():
+                        try:
+                            attachment.file.open('rb')
+                            file_bytes = attachment.file.read()
+                        finally:
+                            try:
+                                attachment.file.close()
+                            except Exception:
+                                pass
+
+                        file_name = (attachment.file.name or '').split('/')[-1] or 'attachment'
+                        new_attachment = ContentAttachment(
+                            user=request.user,
+                            file=ContentFile(file_bytes, name=file_name),
+                            name=attachment.name,
+                        )
+                        new_attachment.content_object = target_obj
+                        new_attachment.save()
+
+                # Copy FK-based related content and track ID mapping for itinerary relinks
+                object_id_map = {}
+
+                for item in Transportation.objects.filter(collection=original):
+                    new_item = Transportation.objects.create(
+                        user=request.user,
+                        collection=new_collection,
+                        type=item.type,
+                        name=item.name,
+                        description=item.description,
+                        rating=item.rating,
+                        price=item.price,
+                        link=item.link,
+                        date=item.date,
+                        end_date=item.end_date,
+                        start_timezone=item.start_timezone,
+                        end_timezone=item.end_timezone,
+                        flight_number=item.flight_number,
+                        from_location=item.from_location,
+                        origin_latitude=item.origin_latitude,
+                        origin_longitude=item.origin_longitude,
+                        destination_latitude=item.destination_latitude,
+                        destination_longitude=item.destination_longitude,
+                        start_code=item.start_code,
+                        end_code=item.end_code,
+                        to_location=item.to_location,
+                        is_public=item.is_public,
+                    )
+                    object_id_map[item.id] = new_item.id
+                    _copy_generic_media(item, new_item)
+
+                for item in Note.objects.filter(collection=original):
+                    new_item = Note.objects.create(
+                        user=request.user,
+                        collection=new_collection,
+                        name=item.name,
+                        content=item.content,
+                        links=item.links,
+                        date=item.date,
+                        is_public=item.is_public,
+                    )
+                    object_id_map[item.id] = new_item.id
+                    _copy_generic_media(item, new_item)
+
+                for item in Lodging.objects.filter(collection=original):
+                    new_item = Lodging.objects.create(
+                        user=request.user,
+                        collection=new_collection,
+                        name=item.name,
+                        type=item.type,
+                        description=item.description,
+                        rating=item.rating,
+                        link=item.link,
+                        check_in=item.check_in,
+                        check_out=item.check_out,
+                        timezone=item.timezone,
+                        reservation_number=item.reservation_number,
+                        price=item.price,
+                        latitude=item.latitude,
+                        longitude=item.longitude,
+                        location=item.location,
+                        is_public=item.is_public,
+                    )
+                    object_id_map[item.id] = new_item.id
+                    _copy_generic_media(item, new_item)
+
+                for checklist in Checklist.objects.filter(collection=original):
+                    new_checklist = Checklist.objects.create(
+                        user=request.user,
+                        collection=new_collection,
+                        name=checklist.name,
+                        date=checklist.date,
+                        is_public=checklist.is_public,
+                    )
+                    object_id_map[checklist.id] = new_checklist.id
+
+                    for checklist_item in checklist.checklistitem_set.all():
+                        ChecklistItem.objects.create(
+                            user=request.user,
+                            checklist=new_checklist,
+                            name=checklist_item.name,
+                            is_checked=checklist_item.is_checked,
+                        )
+
+                # Copy itinerary day metadata
+                for day in CollectionItineraryDay.objects.filter(collection=original):
+                    CollectionItineraryDay.objects.create(
+                        collection=new_collection,
+                        date=day.date,
+                        name=day.name,
+                        description=day.description,
+                    )
+
+                # Copy itinerary items and relink to duplicated FK-based content where applicable
+                for item in CollectionItineraryItem.objects.filter(collection=original):
+                    CollectionItineraryItem.objects.create(
+                        collection=new_collection,
+                        content_type=item.content_type,
+                        object_id=object_id_map.get(item.object_id, item.object_id),
+                        date=item.date,
+                        is_global=item.is_global,
+                        order=item.order,
+                    )
+
+                serializer = self.get_serializer(new_collection)
+                return Response(serializer.data, status=status.HTTP_201_CREATED)
+
+        except Exception:
+            import logging
+            logging.getLogger(__name__).exception("Failed to duplicate collection %s", pk)
+            return Response(
+                {"error": "An error occurred while duplicating the collection."},
+                status=status.HTTP_500_INTERNAL_SERVER_ERROR,
+            )
+
     def perform_create(self, serializer):
         # This is ok because you cannot share a collection when creating it
         serializer.save(user=self.request.user)
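A note on the `object_id_map.get(item.object_id, item.object_id)` relink in the diff above: FK-based content (transportation, notes, lodging, checklists) is cloned with fresh primary keys, so generic itinerary items must be re-pointed at the new rows, while linked-only content (locations) keeps its original id via the fallback. A toy illustration with made-up ids:

```python
# Hypothetical old-id -> new-id pairs produced while cloning FK-based content.
object_id_map = {101: 501, 102: 502}


def relink(object_id):
    # Same pattern as object_id_map.get(item.object_id, item.object_id):
    # cloned content is re-pointed, linked content falls through unchanged.
    return object_id_map.get(object_id, object_id)
```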