Yet another "Deleting unwanted files …" stuck for days

Sorry, I meant faster compared to uploading a single file as I was doing before.

Yes, large files. For instance, I back up the Immich DB and assets as described here. Then I get two files:

  • immich_backup_20240807_030001.tar.gz (160GB)
  • immich_db_20240807_030001.sql.gz (128MB)

Backing up those two files takes about 5-6 hours.

This is the expected outcome, for the reasons explained earlier: Duplicati has the best chance of uploading only a small set of changes when it sees individual files that are mostly unchanged, not one big archive where everything may have shifted, which can dramatically increase the upload. Only you can see the actual upload amounts, reported in a job log’s Complete log as "BytesUploaded", so you can compare runs yourself.
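If you copy the Complete log JSON out to files, a minimal sketch like this can compare runs (file names here are made up, and it assumes BytesUploaded sits under BackendStatistics as it does in current job logs):

```python
import json

# Hypothetical file names: each holds the Complete log JSON copied from a
# Duplicati job log (job -> Show log -> Complete log).
LOG_FILES = ["complete_log_before.json", "complete_log_after.json"]

for path in LOG_FILES:
    with open(path) as f:
        log = json.load(f)
    # BackendStatistics.BytesUploaded is what was actually sent to the destination.
    uploaded = log.get("BackendStatistics", {}).get("BytesUploaded", 0)
    print(f"{path}: {uploaded / 1024**3:.2f} GiB uploaded")
```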

So this is a two-file backup where both files are .gz, which probably means deduplication can’t do much, for the reason explained: compression shifts everything whenever the content changes. The total size is only slightly above what suits the default blocksize (now 100 KB, going to 1 MB next), but you can probably still speed things up a little with a larger blocksize, since the main reason for a small blocksize is better deduplication in scenarios where deduplication actually helps.
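To illustrate why a shifted archive defeats deduplication (a toy sketch with random data, not Duplicati’s actual code): blocks are fixed-size, so inserting a single byte near the front moves every later block boundary and almost nothing matches the previous upload:

```python
import hashlib
import os

BLOCKSIZE = 100 * 1024  # roughly the current default blocksize

def block_hashes(data: bytes) -> set:
    """Hash fixed-size blocks, the way fixed-block deduplication sees a file."""
    return {
        hashlib.sha256(data[i:i + BLOCKSIZE]).hexdigest()
        for i in range(0, len(data), BLOCKSIZE)
    }

original = os.urandom(10 * 1024 * 1024)   # stand-in for yesterday's archive
shifted = b"\x00" + original              # same content, shifted by one byte

shared = block_hashes(original) & block_hashes(shifted)
print(f"blocks shared after a 1-byte shift: {len(shared)}")  # typically 0
```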

So if 278 GB is the worst-case upload size, about 6 hours for the backup seems to match the math.
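As a quick sanity check on those numbers (plain arithmetic, not anything Duplicati reports):

```python
size_gb = 278          # worst-case upload size
hours = 6              # observed backup time
mb_per_s = size_gb * 1000 / (hours * 3600)
print(f"{mb_per_s:.1f} MB/s, about {mb_per_s * 8:.0f} Mbit/s sustained upload")
# -> roughly 12.9 MB/s, about 103 Mbit/s
```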

I use neither Immich nor PostgreSQL, but if the 160GB tar.gz came from the database itself, you might be stuck with it. If it’s your own packaging, having more and smaller files might reduce the upload.
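For example, a hypothetical packaging change (paths made up, and only relevant if the tar.gz comes from your own script rather than from Immich or pg_dump): one uncompressed archive per top-level folder gives Duplicati many smaller, mostly-unchanged files to deduplicate instead of one big shifting .gz:

```python
import tarfile
from pathlib import Path

# Hypothetical paths; adjust to your own backup script.
ASSETS = Path("/srv/immich/library")
STAGING = Path("/srv/backup-staging")

STAGING.mkdir(parents=True, exist_ok=True)

# One uncompressed archive per top-level folder: folders that didn't change
# produce near-identical archives, which deduplicate almost entirely.
for folder in sorted(p for p in ASSETS.iterdir() if p.is_dir()):
    out = STAGING / f"{folder.name}.tar"
    with tarfile.open(out, "w") as tar:
        tar.add(folder, arcname=folder.name)
```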
