Bug Fixes + Duplicate Support (#1016)
* Update README.md supporter list

* Fix: Multiple bug fixes and features bundle (#888, #991, #617, #984) (#1007)

* fix: resolve location creation failures, broken image uploads, and invalid URL handling

  - Add missing addToast import in LocationDetails.svelte for proper error feedback
  - Add objectId check and error response handling in ImageManagement.svelte to prevent ghost images
  - Add Content-Type check in +page.server.ts image action to handle non-JSON backend responses
  - Add client-side URL validation in LocationDetails.svelte (invalid URLs → null)
  - Improve Django field error extraction for user-friendly toast messages
  - Clean up empty description fields (whitespace → null)
  - Update BUGFIX_DOCUMENTATION.md with detailed fix descriptions

* feat: bug fixes and new features bundle

  Bug fixes:
  - fix: resolve PATCH location with visits (#888)
  - fix: Wikipedia/URL image upload via server-side proxy (#991)
  - fix: private/public toggle race condition (#617)
  - fix: location creation feedback (addToast import)
  - fix: invalid URL handling for locations and collections
  - fix: world map country highlighting (bg-*-200 -> bg-*-400)
  - fix: clipboard API polyfill for HTTP contexts
  - fix: MultipleObjectsReturned for duplicate images
  - fix: SvelteKit proxy sessionid cookie forwarding

  Features:
  - feat: duplicate location button (list + detail view)
  - feat: duplicate collection button
  - feat: i18n translations for 19 languages
  - feat: improved error handling and user feedback

  Technical:
  - Backend: fetch_from_url endpoint with SSRF protection
  - Backend: validate_link() for collections
  - Backend: file_permissions filter() instead of get()
  - Frontend: copyToClipboard() helper function
  - Frontend: clipboard polyfill via server-side injection

* chore: switch docker-compose from image to build

  Use local source code builds instead of upstream :latest images to preserve our custom patches and fixes.

* fix: lodging save errors, AI language support, and i18n improvements

  - Fix Lodging save: add res.ok checks, error toasts, isSaving state (#984)
  - Fix URL validation: silently set invalid URLs to null (Lodging, Transportation)
  - Fix AI description language: pass user locale to Wikipedia API
  - Fix missing i18n keys: Strava toggle buttons (show/hide)
  - Add CHANGELOG.md
  - Remove internal documentation from public tracking
  - Update .gitignore for Cursor IDE and internal docs

  Co-authored-by: Cursor <cursoragent@cursor.com>

* feat: update location duplication handling, improve UI feedback, and enhance localization support

---------

Co-authored-by: AdventureLog Bugfix <bugfix@adventurelog.local>
Co-authored-by: madmp87 <info@so-pa.de>
Co-authored-by: Mathias Ponnwitz <devuser@dockge-dev.fritz.box>
Co-authored-by: Cursor <cursoragent@cursor.com>
Co-authored-by: Sean Morley <mail@seanmorley.com>

* Enhance duplication functionality for collections and locations; update UI to reflect changes

* Potential fix for code scanning alert no. 49: Information exposure through an exception

  Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>

* Update Django and Pillow versions in requirements.txt

* Fix error logging for image fetch timeout in ContentImageViewSet

* Update requirements.txt to include jaraco.context and wheel for security fixes

* Update app version and add security vulnerabilities to .trivyignore

* Update backend/server/adventures/views/collection_view.py

  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Update frontend/src/lib/types.ts

  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Reorder build and image directives in docker-compose.yml for clarity

* Refactor code structure for improved readability and maintainability

* Remove inline clipboard polyfill script injection from server hooks (#1019)

  * Initial plan
  * Remove inline clipboard polyfill script injection from hooks.server.ts

  Co-authored-by: seanmorley15 <98704938+seanmorley15@users.noreply.github.com>

  ---------

  Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
  Co-authored-by: seanmorley15 <98704938+seanmorley15@users.noreply.github.com>

* Fix unhandled promise rejections in copyToClipboard click handlers (#1018)

  * Initial plan
  * Fix: make copyToClipboard handlers async with try/catch error toast

  Co-authored-by: seanmorley15 <98704938+seanmorley15@users.noreply.github.com>

  ---------

  Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
  Co-authored-by: seanmorley15 <98704938+seanmorley15@users.noreply.github.com>

* Harden `fetch_from_url` image proxy: require auth, rate-limit, and strengthen SSRF protections (#1017)

  * Initial plan
  * Harden fetch_from_url: require auth, rate-limit, block non-standard ports, check all IPs, re-validate redirects

  Co-authored-by: seanmorley15 <98704938+seanmorley15@users.noreply.github.com>

  ---------

  Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
  Co-authored-by: seanmorley15 <98704938+seanmorley15@users.noreply.github.com>

* Fix subregion filtering in world travel page to exclude null values

* Update package.json to use caret (^) for versioning in overrides

* fix: update package dependencies for compatibility and stability

  - Added cookie dependency with version constraint <0.7.0
  - Updated svelte dependency to allow versions <=5.51.4
  - Updated @sveltejs/adapter-vercel dependency to allow versions <6.3.2

* Refactor code structure for improved readability and maintainability

---------

Co-authored-by: madmp87 <79420509+madmp87@users.noreply.github.com>
Co-authored-by: AdventureLog Bugfix <bugfix@adventurelog.local>
Co-authored-by: madmp87 <info@so-pa.de>
Co-authored-by: Mathias Ponnwitz <devuser@dockge-dev.fritz.box>
Co-authored-by: Cursor <cursoragent@cursor.com>
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Copilot <198982749+Copilot@users.noreply.github.com>
@@ -435,9 +435,8 @@ class LocationSerializer(CustomModelSerializer):
        return location

    def update(self, instance, validated_data):
        has_visits = 'visits' in validated_data
        category_data = validated_data.pop('category', None)

        visits_data = validated_data.pop('visits', None)
        collections_data = validated_data.pop('collections', None)

        # Update regular fields
@@ -452,12 +451,22 @@ class LocationSerializer(CustomModelSerializer):
            instance.category = category
        # If not the owner, ignore category changes

        # Handle collections - only update if collections were provided
        # Save the location first so that user-supplied field values (including
        # is_public) are persisted before the m2m_changed signal fires.
        instance.save()

        # Handle collections - only update if collections were provided.
        # NOTE: .set() triggers the m2m_changed signal which may override
        # is_public based on collection publicity. By saving first we ensure
        # the user's explicit value reaches the DB before the signal runs.
        if collections_data is not None:
            instance.collections.set(collections_data)

        # call save on the location to update the updated_at field and trigger any geocoding
        instance.save()
        # Handle visits - replace all visits if provided
        if visits_data is not None:
            instance.visits.all().delete()
            for visit_data in visits_data:
                Visit.objects.create(location=instance, **visit_data)

        return instance
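The save-before-`.set()` ordering in the hunk above can be sketched without Django. This is a hypothetical, framework-free model of the race: an m2m-changed-style hook may force `is_public` based on collection publicity, so the user's explicit value must reach the "database" before the hook runs.

```python
# Plain-Python sketch (no Django) of the ordering fix: persist the
# user's explicit is_public value BEFORE the collection hook fires.

class FakeLocation:
    def __init__(self):
        self.is_public = False
        self.saved_is_public = None  # what actually reached the "DB"

    def save(self):
        self.saved_is_public = self.is_public


def on_collections_changed(location, collections):
    # Hypothetical signal handler: a public collection forces the
    # location public, overriding whatever the user submitted.
    if any(c["is_public"] for c in collections):
        location.is_public = True
        location.save()


def update(location, is_public, collections):
    location.is_public = is_public
    location.save()  # save first, as in the serializer fix
    on_collections_changed(location, collections)  # signal runs after


loc = FakeLocation()
update(loc, is_public=False, collections=[{"is_public": False}])
assert loc.saved_is_public is False  # user's value survived
```

Saving after `.set()` instead would let the hook's write be the only one persisted, which is the #617 race condition the commit describes.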
@@ -720,6 +729,9 @@ class CollectionSerializer(CustomModelSerializer):
        required=False,
        allow_null=True,
    )
    # Override link as CharField so DRF's URLField doesn't reject invalid
    # values before validate_link() can clean them up.
    link = serializers.CharField(required=False, allow_blank=True, allow_null=True)

    class Meta:
        model = Collection
@@ -749,6 +761,19 @@ class CollectionSerializer(CustomModelSerializer):
        ]
        read_only_fields = ['id', 'created_at', 'updated_at', 'user', 'shared_with', 'status', 'days_until_start', 'primary_image']

    def validate_link(self, value):
        """Convert empty or invalid URLs to None so Django doesn't reject them."""
        if not value or not value.strip():
            return None
        from django.core.validators import URLValidator
        from django.core.exceptions import ValidationError as DjangoValidationError
        validator = URLValidator()
        try:
            validator(value)
        except DjangoValidationError:
            return None
        return value

    def get_collaborators(self, obj):
        request = self.context.get('request')
        request_user = getattr(request, 'user', None) if request else None
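The `validate_link` pattern above (clean bad input to `None` instead of rejecting the whole request) can be approximated without Django's `URLValidator`. The `clean_link` helper below is a hypothetical dependency-free stand-in, not the serializer code itself:

```python
from urllib.parse import urlparse


def clean_link(value):
    """Return the URL if it looks valid, else None (mirrors validate_link)."""
    if not value or not value.strip():
        return None
    parsed = urlparse(value.strip())
    # Accept only absolute http(s) URLs that actually have a host.
    if parsed.scheme in ("http", "https") and parsed.netloc:
        return value
    return None


assert clean_link("https://example.com/trip") == "https://example.com/trip"
assert clean_link("not a url") is None
assert clean_link("   ") is None
```

Returning `None` rather than raising keeps a stray invalid link from blocking an otherwise valid collection save, which is the behavior the commit message describes for both collections and locations.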
@@ -4,83 +4,55 @@ from adventures.models import Visit

protected_paths = ['images/', 'attachments/']

def _check_content_object_permission(content_object, user):
    """Check if user has permission to access a content object."""
    # handle differently when content_object is a Visit, get the location instead
    if isinstance(content_object, Visit):
        if content_object.location:
            content_object = content_object.location

    # Check if content object is public
    if hasattr(content_object, 'is_public') and content_object.is_public:
        return True

    # Check if user owns the content object
    if hasattr(content_object, 'user') and content_object.user == user:
        return True

    # Check collection-based permissions
    if hasattr(content_object, 'collections') and content_object.collections.exists():
        for collection in content_object.collections.all():
            if collection.user == user or collection.shared_with.filter(id=user.id).exists():
                return True
        return False
    elif hasattr(content_object, 'collection') and content_object.collection:
        if content_object.collection.user == user or content_object.collection.shared_with.filter(id=user.id).exists():
            return True
        return False
    else:
        return False

def checkFilePermission(fileId, user, mediaType):
    if mediaType not in protected_paths:
        return True
    if mediaType == 'images/':
        try:
            # Construct the full relative path to match the database field
            image_path = f"images/{fileId}"
            # Fetch the ContentImage object
            content_image = ContentImage.objects.get(image=image_path)

            # Get the content object (could be Location, Transportation, Note, etc.)
            content_object = content_image.content_object

            # handle differently when content_object is a Visit, get the location instead
            if isinstance(content_object, Visit):
                # check visit.location
                if content_object.location:
                    # continue with the location check
                    content_object = content_object.location

            # Check if content object is public
            if hasattr(content_object, 'is_public') and content_object.is_public:
                return True

            # Check if user owns the content object
            if hasattr(content_object, 'user') and content_object.user == user:
                return True

            # Check collection-based permissions
            if hasattr(content_object, 'collections') and content_object.collections.exists():
                # For objects with multiple collections (like Location)
                for collection in content_object.collections.all():
                    if collection.user == user or collection.shared_with.filter(id=user.id).exists():
                        return True
                return False
            elif hasattr(content_object, 'collection') and content_object.collection:
                # For objects with single collection (like Transportation, Note, etc.)
                if content_object.collection.user == user or content_object.collection.shared_with.filter(id=user.id).exists():
                    return True
                return False
            else:
                return False

        except ContentImage.DoesNotExist:
            image_path = f"images/{fileId}"
            # Use filter() instead of get() to handle multiple ContentImage entries
            # pointing to the same file (e.g. after location duplication)
            content_images = ContentImage.objects.filter(image=image_path)
            if not content_images.exists():
                return False
            # Grant access if ANY associated content object permits it
            for content_image in content_images:
                content_object = content_image.content_object
                if content_object and _check_content_object_permission(content_object, user):
                    return True
            return False
    elif mediaType == 'attachments/':
        try:
            # Construct the full relative path to match the database field
            attachment_path = f"attachments/{fileId}"
            # Fetch the ContentAttachment object
            content_attachment = ContentAttachment.objects.get(file=attachment_path)

            # Get the content object (could be Location, Transportation, Note, etc.)
            content_object = content_attachment.content_object

            # Check if content object is public
            if hasattr(content_object, 'is_public') and content_object.is_public:
                return True

            # Check if user owns the content object
            if hasattr(content_object, 'user') and content_object.user == user:
                return True

            # Check collection-based permissions
            if hasattr(content_object, 'collections') and content_object.collections.exists():
                # For objects with multiple collections (like Location)
                for collection in content_object.collections.all():
                    if collection.user == user or collection.shared_with.filter(id=user.id).exists():
                        return True
                return False
            elif hasattr(content_object, 'collection') and content_object.collection:
                # For objects with single collection (like Transportation, Note, etc.)
                if content_object.collection.user == user or content_object.collection.shared_with.filter(id=user.id).exists():
                    return True
                return False
            else:
                return False

            return _check_content_object_permission(content_object, user) if content_object else False
        except ContentAttachment.DoesNotExist:
            return False
    return False
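The `filter()` fallback above exists because, after duplication, several `ContentImage` rows can point at the same file path, so `get()` raises `MultipleObjectsReturned`. The access rule becomes "grant if any associated object permits it." A Django-free sketch of that any-of check (all names here are illustrative, not the module's API):

```python
def check_any_permission(content_images, user, check):
    # Grant access if ANY associated content object permits it,
    # instead of assuming exactly one row matches the file path.
    for image in content_images:
        obj = image.get("content_object")
        if obj is not None and check(obj, user):
            return True
    return False


def owner_or_public(obj, user):
    # Simplified stand-in for _check_content_object_permission.
    return bool(obj.get("is_public")) or obj.get("user") == user


images = [
    {"content_object": {"is_public": False, "user": "alice"}},
    {"content_object": None},  # orphaned row: skipped, not an error
    {"content_object": {"is_public": True, "user": "bob"}},
]
assert check_any_permission(images, "carol", owner_or_public) is True
```

With `get()` the second or third row would have triggered an exception before any permission was evaluated; iterating a filtered set makes duplicates harmless.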
@@ -8,12 +8,13 @@ from rest_framework.parsers import MultiPartParser
from rest_framework import status
from django.http import HttpResponse
from django.conf import settings
from django.core.files.base import ContentFile
import io
import os
import json
import zipfile
import tempfile
from adventures.models import Collection, Location, Transportation, Note, Checklist, CollectionInvite, ContentImage, CollectionItineraryItem, Lodging, CollectionItineraryDay, ContentAttachment, Category
from adventures.models import Collection, Location, Transportation, Note, Checklist, ChecklistItem, CollectionInvite, ContentImage, CollectionItineraryItem, Lodging, CollectionItineraryDay, ContentAttachment, Category
from adventures.permissions import CollectionShared
from adventures.serializers import CollectionSerializer, CollectionInviteSerializer, UltraSlimCollectionSerializer, CollectionItineraryItemSerializer, CollectionItineraryDaySerializer
from users.models import CustomUser as User
@@ -791,6 +792,241 @@ class CollectionViewSet(viewsets.ModelViewSet):
        serializer = self.get_serializer(new_collection)
        return Response(serializer.data, status=status.HTTP_201_CREATED)


    @action(detail=True, methods=['post'])
    def duplicate(self, request, pk=None):
        """Create a duplicate of an existing collection.

        Copies collection metadata and linked content:
        - locations (linked, not cloned)
        - transportation, notes, checklists (with items), lodging
        - itinerary days and itinerary items
        Shared users are not copied and the new collection is private.
        """
        original = self.get_object()

        # Only the owner can duplicate
        if original.user != request.user:
            return Response(
                {"error": "You do not have permission to duplicate this collection."},
                status=status.HTTP_403_FORBIDDEN,
            )

        try:
            with transaction.atomic():
                new_collection = Collection.objects.create(
                    user=request.user,
                    name=f"Copy of {original.name}",
                    description=original.description,
                    link=original.link,
                    is_public=False,
                    is_archived=False,
                    start_date=original.start_date,
                    end_date=original.end_date,
                )

                # Link existing locations to the new collection
                linked_locations = list(original.locations.all())
                if linked_locations:
                    new_collection.locations.set(linked_locations)

                # Duplicate primary image if it exists so permissions align with the new collection
                if original.primary_image:
                    original_primary = original.primary_image
                    if original_primary.image:
                        try:
                            original_primary.image.open('rb')
                            image_bytes = original_primary.image.read()
                        finally:
                            try:
                                original_primary.image.close()
                            except Exception:
                                pass

                        file_name = (original_primary.image.name or '').split('/')[-1] or 'image.webp'
                        new_primary = ContentImage(
                            user=request.user,
                            image=ContentFile(image_bytes, name=file_name),
                            immich_id=None,
                            is_primary=original_primary.is_primary,
                        )
                    else:
                        new_primary = ContentImage(
                            user=request.user,
                            immich_id=original_primary.immich_id,
                            is_primary=original_primary.is_primary,
                        )

                    new_primary.content_object = new_collection
                    new_primary.save()
                    new_collection.primary_image = new_primary
                    new_collection.save(update_fields=['primary_image'])

                def _copy_generic_media(source_obj, target_obj):
                    # Images
                    for img in source_obj.images.all():
                        if img.image:
                            try:
                                img.image.open('rb')
                                image_bytes = img.image.read()
                            finally:
                                try:
                                    img.image.close()
                                except Exception:
                                    pass

                            file_name = (img.image.name or '').split('/')[-1] or 'image.webp'
                            media = ContentImage(
                                user=request.user,
                                image=ContentFile(image_bytes, name=file_name),
                                immich_id=None,
                                is_primary=img.is_primary,
                            )
                        else:
                            media = ContentImage(
                                user=request.user,
                                immich_id=img.immich_id,
                                is_primary=img.is_primary,
                            )

                        media.content_object = target_obj
                        media.save()

                    # Attachments
                    for attachment in source_obj.attachments.all():
                        try:
                            attachment.file.open('rb')
                            file_bytes = attachment.file.read()
                        finally:
                            try:
                                attachment.file.close()
                            except Exception:
                                pass

                        file_name = (attachment.file.name or '').split('/')[-1] or 'attachment'
                        new_attachment = ContentAttachment(
                            user=request.user,
                            file=ContentFile(file_bytes, name=file_name),
                            name=attachment.name,
                        )
                        new_attachment.content_object = target_obj
                        new_attachment.save()

                # Copy FK-based related content and track ID mapping for itinerary relinks
                object_id_map = {}

                for item in Transportation.objects.filter(collection=original):
                    new_item = Transportation.objects.create(
                        user=request.user,
                        collection=new_collection,
                        type=item.type,
                        name=item.name,
                        description=item.description,
                        rating=item.rating,
                        price=item.price,
                        link=item.link,
                        date=item.date,
                        end_date=item.end_date,
                        start_timezone=item.start_timezone,
                        end_timezone=item.end_timezone,
                        flight_number=item.flight_number,
                        from_location=item.from_location,
                        origin_latitude=item.origin_latitude,
                        origin_longitude=item.origin_longitude,
                        destination_latitude=item.destination_latitude,
                        destination_longitude=item.destination_longitude,
                        start_code=item.start_code,
                        end_code=item.end_code,
                        to_location=item.to_location,
                        is_public=item.is_public,
                    )
                    object_id_map[item.id] = new_item.id
                    _copy_generic_media(item, new_item)

                for item in Note.objects.filter(collection=original):
                    new_item = Note.objects.create(
                        user=request.user,
                        collection=new_collection,
                        name=item.name,
                        content=item.content,
                        links=item.links,
                        date=item.date,
                        is_public=item.is_public,
                    )
                    object_id_map[item.id] = new_item.id
                    _copy_generic_media(item, new_item)

                for item in Lodging.objects.filter(collection=original):
                    new_item = Lodging.objects.create(
                        user=request.user,
                        collection=new_collection,
                        name=item.name,
                        type=item.type,
                        description=item.description,
                        rating=item.rating,
                        link=item.link,
                        check_in=item.check_in,
                        check_out=item.check_out,
                        timezone=item.timezone,
                        reservation_number=item.reservation_number,
                        price=item.price,
                        latitude=item.latitude,
                        longitude=item.longitude,
                        location=item.location,
                        is_public=item.is_public,
                    )
                    object_id_map[item.id] = new_item.id
                    _copy_generic_media(item, new_item)

                for checklist in Checklist.objects.filter(collection=original):
                    new_checklist = Checklist.objects.create(
                        user=request.user,
                        collection=new_collection,
                        name=checklist.name,
                        date=checklist.date,
                        is_public=checklist.is_public,
                    )
                    object_id_map[checklist.id] = new_checklist.id

                    for checklist_item in checklist.checklistitem_set.all():
                        ChecklistItem.objects.create(
                            user=request.user,
                            checklist=new_checklist,
                            name=checklist_item.name,
                            is_checked=checklist_item.is_checked,
                        )

                # Copy itinerary day metadata
                for day in CollectionItineraryDay.objects.filter(collection=original):
                    CollectionItineraryDay.objects.create(
                        collection=new_collection,
                        date=day.date,
                        name=day.name,
                        description=day.description,
                    )

                # Copy itinerary items and relink to duplicated FK-based content where applicable
                for item in CollectionItineraryItem.objects.filter(collection=original):
                    CollectionItineraryItem.objects.create(
                        collection=new_collection,
                        content_type=item.content_type,
                        object_id=object_id_map.get(item.object_id, item.object_id),
                        date=item.date,
                        is_global=item.is_global,
                        order=item.order,
                    )

                serializer = self.get_serializer(new_collection)
                return Response(serializer.data, status=status.HTTP_201_CREATED)

        except Exception:
            import logging
            logging.getLogger(__name__).exception("Failed to duplicate collection %s", pk)
            return Response(
                {"error": "An error occurred while duplicating the collection."},
                status=status.HTTP_500_INTERNAL_SERVER_ERROR,
            )

    def perform_create(self, serializer):
        # This is ok because you cannot share a collection when creating it
        serializer.save(user=self.request.user)
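The `object_id_map.get(item.object_id, item.object_id)` relink above keeps itinerary items pointing at the duplicated copies of FK-based content (notes, lodging, transportation, checklists) while leaving linked-only content such as locations untouched. In isolation, with illustrative string IDs:

```python
def relink(itinerary_object_ids, object_id_map):
    # Duplicated content gets its new id from the map; anything that was
    # not duplicated (e.g. locations, which are linked rather than
    # cloned) falls through and keeps its original id.
    return [object_id_map.get(oid, oid) for oid in itinerary_object_ids]


object_id_map = {"note-1": "note-9", "lodging-2": "lodging-8"}
assert relink(["note-1", "location-3", "lodging-2"], object_id_map) == [
    "note-9", "location-3", "lodging-8"
]
```

Without the fallback, itinerary items for linked locations would be dropped or mis-pointed; with it, both cloned and linked content stay addressable in the new collection.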
@@ -1,6 +1,12 @@
from rest_framework import viewsets, status
from rest_framework.decorators import action
from rest_framework.response import Response
from rest_framework.permissions import IsAuthenticated
from rest_framework.throttling import UserRateThrottle
from django.http import HttpResponse
import ipaddress
import socket
from urllib.parse import urlparse
from django.db.models import Q
from django.core.files.base import ContentFile
from django.contrib.contenttypes.models import ContentType
@@ -10,6 +16,55 @@ from integrations.models import ImmichIntegration
from adventures.permissions import IsOwnerOrSharedWithFullAccess  # Your existing permission class
import requests
from adventures.permissions import ContentImagePermission
import logging

logger = logging.getLogger(__name__)


class ImageProxyThrottle(UserRateThrottle):
    scope = 'image_proxy'


def _is_safe_url(image_url):
    """
    Validate a URL for safe proxy use.
    Returns (True, parsed) on success or (False, error_message) on failure.
    Checks:
    - Scheme is http or https
    - No non-standard ports (only 80 and 443 allowed)
    - All resolved IPs are public (no private/loopback/reserved/link-local/multicast)
    """
    parsed = urlparse(image_url)

    if parsed.scheme not in ('http', 'https'):
        return False, "Invalid URL scheme. Only http and https are allowed."

    port = parsed.port
    if port is not None and port not in (80, 443):
        return False, "Non-standard ports are not allowed."

    hostname = parsed.hostname
    if not hostname:
        return False, "Invalid URL: missing hostname."

    try:
        addr_infos = socket.getaddrinfo(hostname, None)
    except socket.gaierror:
        return False, "Could not resolve hostname."

    if not addr_infos:
        return False, "Could not resolve hostname."

    for addr_info in addr_infos:
        try:
            ip = ipaddress.ip_address(addr_info[4][0])
        except ValueError:
            return False, "Invalid IP address resolved from hostname."
        if (ip.is_private or ip.is_loopback or ip.is_reserved
                or ip.is_link_local or ip.is_multicast):
            return False, "Access to internal networks is not allowed."

    return True, parsed


class ContentImageViewSet(viewsets.ModelViewSet):
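The SSRF screening above layers three checks: scheme, port, then every resolved IP. A trimmed, standalone version shows the shape of those checks; `check_url` here is a simplified stand-in for illustration, not the endpoint's exact code:

```python
import ipaddress
import socket
from urllib.parse import urlparse


def check_url(url):
    """Return (True, parsed) if the URL is safe to proxy, else (False, reason)."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https"):
        return False, "scheme"
    if parsed.port is not None and parsed.port not in (80, 443):
        return False, "port"
    if not parsed.hostname:
        return False, "hostname"
    try:
        infos = socket.getaddrinfo(parsed.hostname, None)
    except socket.gaierror:
        return False, "resolve"
    for info in infos:
        ip = ipaddress.ip_address(info[4][0])
        # Block private, loopback, reserved, link-local, multicast ranges.
        if (ip.is_private or ip.is_loopback or ip.is_reserved
                or ip.is_link_local or ip.is_multicast):
            return False, "internal"
    return True, parsed


assert check_url("ftp://example.com/a")[0] is False
assert check_url("http://127.0.0.1/etc/passwd") == (False, "internal")
assert check_url("https://example.com:8443/img")[0] is False
```

Checking every address returned by `getaddrinfo` (not just the first) matters because a hostname can resolve to both a public and a private IP; the `fetch_from_url` view additionally re-runs this check on every redirect target.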
@@ -119,6 +174,101 @@ class ContentImageViewSet(viewsets.ModelViewSet):
        instance.save()
        return Response({"success": "Image set as primary image"})


    @action(detail=False, methods=['post'],
            permission_classes=[IsAuthenticated],
            throttle_classes=[ImageProxyThrottle])
    def fetch_from_url(self, request):
        """
        Authenticated proxy endpoint to fetch images from external URLs.
        Avoids CORS issues when the frontend downloads images from third-party
        servers (e.g. wikimedia.org). Requires a logged-in user and is
        rate-limited to 60 requests/minute.
        """
        image_url = request.data.get('url')
        if not image_url:
            return Response(
                {"error": "URL is required"},
                status=status.HTTP_400_BAD_REQUEST
            )

        # Validate the initial URL (scheme, port, SSRF check on all resolved IPs)
        safe, result = _is_safe_url(image_url)
        if not safe:
            return Response({"error": result}, status=status.HTTP_400_BAD_REQUEST)

        try:
            headers = {'User-Agent': 'AdventureLog/1.0 (Image Proxy)'}
            max_redirects = 3
            current_url = image_url

            for _ in range(max_redirects + 1):
                response = requests.get(
                    current_url,
                    timeout=10,
                    headers=headers,
                    stream=True,
                    allow_redirects=False,
                )

                if not response.is_redirect:
                    break

                # Re-validate every redirect destination before following
                redirect_url = response.headers.get('Location', '')
                if not redirect_url:
                    return Response(
                        {"error": "Redirect with missing Location header"},
                        status=status.HTTP_502_BAD_GATEWAY,
                    )

                safe, result = _is_safe_url(redirect_url)
                if not safe:
                    return Response(
                        {"error": f"Redirect blocked: {result}"},
                        status=status.HTTP_400_BAD_REQUEST,
                    )

                current_url = redirect_url
            else:
                return Response(
                    {"error": "Too many redirects"},
                    status=status.HTTP_400_BAD_REQUEST,
                )

            response.raise_for_status()

            content_type = response.headers.get('Content-Type', '')
            if not content_type.startswith('image/'):
                return Response(
                    {"error": "URL does not point to an image"},
                    status=status.HTTP_400_BAD_REQUEST
                )

            content_length = response.headers.get('Content-Length')
            if content_length and int(content_length) > 20 * 1024 * 1024:
                return Response(
                    {"error": "Image too large (max 20MB)"},
                    status=status.HTTP_400_BAD_REQUEST
                )

            image_data = response.content

            return HttpResponse(image_data, content_type=content_type, status=200)

        except requests.exceptions.Timeout:
            logger.error("Timeout fetching image from URL %s", image_url)
            return Response(
                {"error": "Download timeout - image may be too large or server too slow"},
                status=status.HTTP_504_GATEWAY_TIMEOUT
            )
        except requests.exceptions.RequestException as e:
            logger.error("Failed to fetch image from URL %s: %s", image_url, str(e))
            return Response(
                {"error": "Failed to fetch image from the remote server"},
                status=status.HTTP_502_BAD_GATEWAY
            )

    def create(self, request, *args, **kwargs):
        # Get content type and object ID from request
        content_type_name = request.data.get('content_type')
@@ -163,6 +313,20 @@ class ContentImageViewSet(viewsets.ModelViewSet):
                "error": f"Invalid content_type. Must be one of: {', '.join(content_type_map.keys())}"
            }, status=status.HTTP_400_BAD_REQUEST)

        # Validate object_id format (must be a valid UUID, not "undefined" or empty)
        if not object_id or object_id == 'undefined':
            return Response({
                "error": "object_id is required and must be a valid UUID"
            }, status=status.HTTP_400_BAD_REQUEST)

        import uuid as uuid_module
        try:
            uuid_module.UUID(str(object_id))
        except (ValueError, AttributeError):
            return Response({
                "error": f"Invalid object_id format: {object_id}"
            }, status=status.HTTP_400_BAD_REQUEST)

        # Get the content object
        try:
            content_object = content_type_map[content_type_name].objects.get(id=object_id)
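The `object_id` guard above explicitly rejects the literal string `"undefined"` (what a frontend sends when a JavaScript variable was never set, the "ghost images" bug) before any database lookup. The same check in isolation, with an illustrative helper name:

```python
import uuid


def validate_object_id(object_id):
    """Return True only for a well-formed UUID string.

    Rejects None, '', and the literal 'undefined' that an unset
    JavaScript variable serializes to.
    """
    if not object_id or object_id == "undefined":
        return False
    try:
        uuid.UUID(str(object_id))
    except (ValueError, AttributeError):
        return False
    return True


assert validate_object_id("undefined") is False
assert validate_object_id("not-a-uuid") is False
assert validate_object_id("12345678-1234-5678-1234-567812345678") is True
```

Failing fast with a 400 here prevents an image row from being created against a nonexistent object, which is what previously left orphaned "ghost" images behind.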
@@ -1,18 +1,22 @@
import logging
from django.utils import timezone
from django.db import transaction
from django.core.exceptions import PermissionDenied
from django.core.files.base import ContentFile
from django.db.models import Q, Max, Prefetch
from django.db.models.functions import Lower
from rest_framework import viewsets, status
from rest_framework.decorators import action
from rest_framework.response import Response
import requests
from adventures.models import Location, Category, CollectionItineraryItem, Visit
from adventures.models import Location, Category, Collection, CollectionItineraryItem, ContentImage, Visit
from django.contrib.contenttypes.models import ContentType
from adventures.permissions import IsOwnerOrSharedWithFullAccess
from adventures.serializers import LocationSerializer, MapPinSerializer, CalendarLocationSerializer
from adventures.utils import pagination

logger = logging.getLogger(__name__)

class LocationViewSet(viewsets.ModelViewSet):
    """
    ViewSet for managing Adventure objects with support for filtering, sorting,
@@ -254,6 +258,131 @@ class LocationViewSet(viewsets.ModelViewSet):

         return Response(response_data)

+    @action(detail=True, methods=['post'])
+    def duplicate(self, request, pk=None):
+        """Create a duplicate of an existing location.
+
+        Copies all fields except collections and visits. Images are duplicated as
+        independent files (not shared references). The name is prefixed with
+        "Copy of " and is_public is reset to False.
+        """
+        original = self.get_object()
+
+        # Verify the requesting user owns the location or has access
+        if not self._has_adventure_access(original, request.user):
+            return Response(
+                {"error": "You do not have permission to duplicate this location."},
+                status=status.HTTP_403_FORBIDDEN,
+            )
+
+        try:
+            with transaction.atomic():
+                target_collection = None
+                target_collection_id = request.data.get('collection_id')
+
+                if target_collection_id:
+                    try:
+                        target_collection = Collection.objects.get(id=target_collection_id)
+                    except Collection.DoesNotExist:
+                        return Response(
+                            {"error": "Collection not found."},
+                            status=status.HTTP_404_NOT_FOUND,
+                        )
+
+                    user_can_link_to_collection = (
+                        target_collection.user == request.user
+                        or target_collection.shared_with.filter(uuid=request.user.uuid).exists()
+                    )
+                    if not user_can_link_to_collection:
+                        return Response(
+                            {"error": "You do not have permission to add locations to this collection."},
+                            status=status.HTTP_403_FORBIDDEN,
+                        )
+
+                # Snapshot original images before creating the copy
+                original_images = list(original.images.all())
+
+                # Build the new location
+                new_location = Location(
+                    user=request.user,
+                    name=f"Copy of {original.name}",
+                    description=original.description,
+                    rating=original.rating,
+                    link=original.link,
+                    location=original.location,
+                    tags=list(original.tags) if original.tags else None,
+                    is_public=False,
+                    longitude=original.longitude,
+                    latitude=original.latitude,
+                    city=original.city,
+                    region=original.region,
+                    country=original.country,
+                    price=original.price,
+                    price_currency=original.price_currency,
+                )
+
+                # Handle category: reuse the user's own matching category or
+                # create one if necessary.
+                if original.category:
+                    category, _ = Category.objects.get_or_create(
+                        user=request.user,
+                        name=original.category.name,
+                        defaults={
+                            'display_name': original.category.display_name,
+                            'icon': original.category.icon,
+                        },
+                    )
+                    new_location.category = category
+
+                new_location.save()
+
+                # If requested, link the duplicate only to the current collection.
+                # This avoids accidentally inheriting all source collections.
+                if target_collection:
+                    new_location.collections.set([target_collection])
+
+                # Duplicate images as independent files/new records
+                location_ct = ContentType.objects.get_for_model(Location)
+                for img in original_images:
+                    if img.image:
+                        try:
+                            img.image.open('rb')
+                            image_bytes = img.image.read()
+                        finally:
+                            try:
+                                img.image.close()
+                            except Exception:
+                                pass
+
+                        file_name = (img.image.name or '').split('/')[-1] or 'image.webp'
+
+                        ContentImage.objects.create(
+                            content_type=location_ct,
+                            object_id=str(new_location.id),
+                            image=ContentFile(image_bytes, name=file_name),
+                            immich_id=None,
+                            is_primary=img.is_primary,
+                            user=request.user,
+                        )
+                    else:
+                        ContentImage.objects.create(
+                            content_type=location_ct,
+                            object_id=str(new_location.id),
+                            immich_id=img.immich_id,
+                            is_primary=img.is_primary,
+                            user=request.user,
+                        )
+
+            serializer = self.get_serializer(new_location)
+            return Response(serializer.data, status=status.HTTP_201_CREATED)
+
+        except Exception:
+            logger.exception("Failed to duplicate location %s", pk)
+            return Response(
+                {"error": "An error occurred while duplicating the location."},
+                status=status.HTTP_500_INTERNAL_SERVER_ERROR,
+            )
+
     # view to return location name and lat/lon for all locations a user owns for the global map
     @action(detail=False, methods=['get'], url_path='pins')
     def map_locations(self, request):
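With DRF's default router, the `@action(detail=True)` decorator above exposes the method as `POST /locations/{pk}/duplicate/`. A minimal client sketch; the `API_BASE` value and route prefix are assumptions about the deployment, and the session-cookie auth mirrors the `SessionAuthentication` configured in settings:

```python
import json
import urllib.request

API_BASE = "http://localhost:8000/api"  # assumption: adjust to your deployment

def duplicate_url(base: str, location_id: str) -> str:
    """Build the URL DRF's router generates for the `duplicate` extra action."""
    return f"{base}/locations/{location_id}/duplicate/"

def duplicate_location(location_id: str, sessionid: str, collection_id=None) -> dict:
    """POST to the duplicate action, forwarding the sessionid cookie.
    Returns the new location's serialized JSON on HTTP 201."""
    payload = {"collection_id": collection_id} if collection_id else {}
    req = urllib.request.Request(
        duplicate_url(API_BASE, location_id),
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "Cookie": f"sessionid={sessionid}"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Passing `collection_id` triggers the server-side permission check and links the copy to that one collection only; omitting it creates an unlinked duplicate.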
@@ -303,6 +303,13 @@ REST_FRAMEWORK = {
         'rest_framework.authentication.SessionAuthentication',
     ),
     'DEFAULT_SCHEMA_CLASS': 'rest_framework.schemas.coreapi.AutoSchema',
+    'DEFAULT_THROTTLE_CLASSES': [
+        'rest_framework.throttling.UserRateThrottle',
+    ],
+    'DEFAULT_THROTTLE_RATES': {
+        'user': '1000/day',
+        'image_proxy': '60/minute',
+    },
 }

 if DEBUG:
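The rate strings added above (`'1000/day'`, `'60/minute'`) are resolved by DRF roughly as in this sketch, modeled on `SimpleRateThrottle.parse_rate`; only the first letter of the period matters, so `'60/min'` and `'60/minute'` are equivalent:

```python
def parse_rate(rate):
    """Split a DRF-style rate string like '60/minute' into
    (number of requests, window duration in seconds)."""
    if rate is None:
        return (None, None)
    num, period = rate.split('/')
    duration = {'s': 1, 'm': 60, 'h': 3600, 'd': 86400}[period[0]]
    return (int(num), duration)
```

`UserRateThrottle` looks up its rate under the `'user'` scope key; the `'image_proxy'` rate only takes effect on views whose throttle class declares `scope = 'image_proxy'`.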
@@ -1,4 +1,4 @@
-Django==5.2.8
+Django==5.2.11
 djangorestframework>=3.15.2
 django-allauth==0.63.3
 django-money==3.5.4
@@ -8,7 +8,7 @@ django-cors-headers==4.4.0
 coreapi==2.3.3
 python-dotenv==1.1.0
 psycopg2-binary==2.9.10
-pillow==11.3.0
+pillow==12.1.1
 whitenoise==6.9.0
 django-resized==1.0.3
 django-geojson==4.2.0
@@ -29,4 +29,5 @@ psutil==6.1.1
 geojson==3.2.0
 gpxpy==1.6.2
 pymemcache==4.0.0
 legacy-cgi==2.6.3
+requests>=2.31.0