So, I’m self-hosting Immich. The issue is we tend to take a lot of pictures of the same scene/thing to later pick the best one, so we can end up with 5~10 photos which are basically duplicates, but not quite.
Some duplicate finding programs put those images at 95% or more similarity.

I’m wondering if there’s any way, probably at the file system level, for these near-identical images to be compressed together.
Maybe deduplication?
Have any of you guys handled a similar situation?

  • pe1uca@lemmy.pe1uca.devOP · 4 months ago

    I’m not saying to delete anything, I’m saying for the file system to save space with something similar to deduping.
    If I understand correctly, deduping works by sharing the same data blocks between files with identical content, so there’s no actual data loss.

    • Dave.@aussie.zone · 4 months ago

      I don’t think there’s anything commercially available that can do it.

      However, as an experiment, you could:

      • Get a group of photos from a burst shot.
      • Encode them as individual frames of a video using a modern codec, e.g. with VLC.
      • See what file size you get for the resulting video output.
      • See what artifacts are introduced as you play with the encoder settings.

      You could probably script this kind of operation eventually, if you have software that can automatically identify and group the images.
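      As a minimal sketch of the steps above: this builds (but deliberately doesn’t run) an ffmpeg command that packs a burst of similar photos into one video, letting the codec’s inter-frame prediction exploit the similarity. The filenames, the choice of libx265, and the CRF value are hypothetical examples, not settled recommendations.

```python
# Sketch only: construct an ffmpeg command that encodes a burst of similar
# JPEGs as the frames of a single video. Filename pattern, codec, and CRF
# are placeholder assumptions to experiment with.

def burst_to_video_cmd(pattern: str, output: str, crf: int = 18) -> list[str]:
    return [
        "ffmpeg",
        "-framerate", "1",        # one photo per second of video
        "-pattern_type", "glob",  # expand e.g. "burst_*.jpg" in sorted order
        "-i", pattern,
        "-c:v", "libx265",        # modern codec with strong inter-frame prediction
        "-crf", str(crf),         # lower = better quality, bigger file
        output,
    ]

cmd = burst_to_video_cmd("burst_*.jpg", "burst.mkv")
print(" ".join(cmd))  # run with subprocess.run(cmd) once the files exist
```

      Individual frames can be pulled back out with something like `ffmpeg -i burst.mkv frame_%03d.png`, but every extraction is a lossy re-decode, so this really only suits photos you rarely open at full quality.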

    • WhatAmLemmy@lemmy.world · 4 months ago

      I believe this is what some compression algorithms do if you compress the similar photos into a single archive. It sounds like that’s what you want: archive each day, have Immich cache the thumbnails, and only decompress the originals when you view them at full resolution. Maybe test an algorithm like zstd on a group of similar photos archived together vs. compressed individually?
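      A toy illustration of why archiving similar files together can help, using Python’s built-in zlib as a stand-in for zstd (the blobs here are synthetic, not real photos):

```python
# Two ~20 KB blobs that differ in only a few bytes, standing in for two
# near-duplicate files. zlib is a stdlib stand-in for zstd here.
import random
import zlib

random.seed(42)
original = bytes(random.randrange(256) for _ in range(20_000))
near_dup = bytearray(original)
for i in range(0, len(near_dup), 1000):  # sprinkle small differences
    near_dup[i] ^= 0xFF
near_dup = bytes(near_dup)

separate = len(zlib.compress(original, 9)) + len(zlib.compress(near_dup, 9))
together = len(zlib.compress(original + near_dup, 9))

# Compressed together, the second blob mostly becomes back-references to
# the first, so the combined stream is far smaller than the two separately.
print(f"separately: {separate} bytes, together: {together} bytes")
```

      Two caveats: this works because both blobs fit inside deflate’s 32 KB match window, so full-size photos would need something like zstd’s `--long` mode; and two JPEGs of the same scene are not byte-similar the way these blobs are, so real-world gains on already-compressed photos will be smaller than this toy suggests.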

      FYI, file system deduplication works on content hashes (per file, or per block on e.g. ZFS and Btrfs). Only exact 1:1 binary duplicates produce the same hash, so near-duplicate photos won’t dedupe at all.
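      A quick demonstration of that point: flipping a single byte yields a completely different hash, so a hash-comparing dedup engine sees two near-identical files (or blocks) as unrelated data.

```python
# One flipped byte => a completely different SHA-256 digest, which is why
# hash-based dedup can never share storage between near-duplicate photos.
import hashlib

a = bytes(1_000_000)   # stand-in for a photo's raw bytes
b = bytearray(a)
b[500_000] ^= 0x01     # change exactly one byte in the middle
h_a = hashlib.sha256(a).hexdigest()
h_b = hashlib.sha256(bytes(b)).hexdigest()
print(h_a[:16], h_b[:16])
```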

      Also, modern image and video codecs are already about as heavily optimized as computer scientists can currently achieve on consumer hardware, which is why re-compressing a JPG or MP4 offers negligible savings and sometimes even increases the file size.