How to repack existing backup remote volumes after volume size change?

There’s no GUI or CLI tool specifically designed for all the things you’re after.

The COMPACT command can be persuaded to increase (not decrease) the volume size; however, I’m not sure how much it will improve your performance, and it’s hard to control how much work it does in one run. There are a couple of posts about this, with a basic sketch after the links:

Feature Request: Time Limit for Compaction

Compact - Limited / Partial
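
A minimal sketch of the basic idea (destination-URL and path-to-database are placeholders for your own values, and 100MB is only an example target size; keeping --dry-run=True previews the work without changing anything):

Duplicati.CommandLine.exe compact destination-URL --dbpath=path-to-database --passphrase=REDACTED --dblock-size=100MB --dry-run=True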

I’m not sure it would need to be purged. The current compact works a little at a time, downloading files so that still-in-use blocks can be repacked into a new file, which is then uploaded.

v2.1.0.119_canary_2025-05-29

Preserve space during compact by deleting files early, instead of at the end of the compact

You can see this in CommandLine by using the --dry-run flag to prevent actual changes:

  Downloading file duplicati-b79b25742c67f4def900c8edf919fee12.dblock.zip.aes (1.93 MiB) ...
  Downloading file duplicati-b70708693fbbb4707a4da5f1f83dfc1e6.dblock.zip.aes (1.05 MiB) ...
  Downloading file duplicati-b42f28d1af6724da9bd00ecca4e31d3ac.dblock.zip.aes (2.11 MiB) ...
[Dryrun]: Would upload generated blockset of size 10.91 MiB
[Dryrun]: Would delete remote file: duplicati-bf64e95282d714c3492449377ed87f142.dblock.zip.aes, size: 9.99 MiB
[Dryrun]: Would delete remote file: duplicati-i7b6b165265204c3ca35f66120af65e1f.dindex.zip.aes, size: 66.29 KiB
[Dryrun]: Would delete remote file: duplicati-b8d0c9f99438c4af1bae5234210af768c.dblock.zip.aes, size: 4.73 MiB
[Dryrun]: Would delete remote file: duplicati-i9807acdf586c430e849b44bd8290d4f6.dindex.zip.aes, size: 42.64 KiB
[Dryrun]: Would delete remote file: duplicati-b79b25742c67f4def900c8edf919fee12.dblock.zip.aes, size: 1.93 MiB
[Dryrun]: Would delete remote file: duplicati-i76a6f06a0a574ed7ae5185cf8c384c01.dindex.zip.aes, size: 38.25 KiB
  Downloading file duplicati-ba24dc45e49cf4c7c91ffd11678c7acee.dblock.zip.aes (1.28 MiB) ...
  Downloading file duplicati-b46124700bbb244db869c1dc7fa80ee31.dblock.zip.aes (279.58 KiB) ...
[Dryrun]: Would upload generated blockset of size 3.68 MiB
[Dryrun]: Would delete remote file: duplicati-b70708693fbbb4707a4da5f1f83dfc1e6.dblock.zip.aes, size: 1.05 MiB
[Dryrun]: Would delete remote file: duplicati-ifea8b78ddaad47d5a87c0979aed10f63.dindex.zip.aes, size: 37.06 KiB
[Dryrun]: Would delete remote file: duplicati-b42f28d1af6724da9bd00ecca4e31d3ac.dblock.zip.aes, size: 2.11 MiB
[Dryrun]: Would delete remote file: duplicati-i8bb745e6fdb1413ea922004167ac0cce.dindex.zip.aes, size: 41.67 KiB
[Dryrun]: Would delete remote file: duplicati-ba24dc45e49cf4c7c91ffd11678c7acee.dblock.zip.aes, size: 1.28 MiB
[Dryrun]: Would delete remote file: duplicati-i055b72e2df354f7285e994986b062c06.dindex.zip.aes, size: 37.67 KiB
[Dryrun]: Would delete remote file: duplicati-b46124700bbb244db869c1dc7fa80ee31.dblock.zip.aes, size: 279.58 KiB
[Dryrun]: Would delete remote file: duplicati-idfa32ce5d0d246b998e35f39b3069810.dindex.zip.aes, size: 1.26 KiB

Is leaving the old backup in the cloud as-is viable? A long compact may delay your usual backups.
My test run above used --dblock-size=11MB because 10MB did nothing,
while 20MB did more than I wanted. Basically, it appears only “roughly” controllable.

Duplicati.CommandLine.exe compact destination-URL --dbpath=path-to-database --passphrase=REDACTED --dblock-size=11MB --dry-run=True

I could get 10MB to do something by adding --small-file-max-count=0, and I had some luck reducing the amount of processing by starting with --threshold=1000 and lowering it. Yes, that option is a percentage, but it seems values above 100 are accepted.
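
Putting those knobs together, a sketch (same placeholders as above; how much work one run does still varies, so treat the exact values as starting points to experiment with):

Duplicati.CommandLine.exe compact destination-URL --dbpath=path-to-database --passphrase=REDACTED --dblock-size=10MB --small-file-max-count=0 --threshold=500 --dry-run=True

When a dry run shows a reasonable amount of work, drop --dry-run to let it actually repack.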

So it’s sort of possible, but it’s indirect and finicky, persuading the wrong tool…