Target larger than Source

Target 5.91 TiB

Source 4.34 TiB

--dblock-size=2GB
--all-versions=True
--auto-cleanup=True
--auto-vacuum=True
--backup-test-percentage=0.2
--blocksize=64MB
--debug-retry-errors=True
--number-of-retries=5
--rebuild-missing-dblock-files=True
--restore-with-local-blocks=True
--snapshot-policy=Auto
--usn-policy=Auto
--asynchronous-concurrent-upload-limit=8
--zip-compression-level=9
--disable-synthetic-filelist=True
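
For reference, these options would be passed to the CLI roughly like this (a sketch only; the destination URL and source path are placeholders, not the real ones):

    rem Placeholder destination and source; substitute your own values.
    Duplicati.CommandLine.exe backup "ssh://backup.example.com/data" "D:\Data" ^
      --dblock-size=2GB --blocksize=64MB --zip-compression-level=9 ^
      --snapshot-policy=Auto --usn-policy=Auto --number-of-retries=5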

How can a zipped target with only 1 version be a lot larger than the source? There are no symlinks that I know of; it's just a data partition.

That is very odd. There is a slight amount of overhead, but it is usually massively offset by deduplication and compression.
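
To put a rough number on that (my own back-of-the-envelope arithmetic, not from measurements): with --dblock-size=2GB, a 4.34 TiB source splits into about

    4.34 TiB ≈ 4,444 GiB; 4,444 GiB / 2 GiB per dblock ≈ 2,222 volumes

and the per-volume overhead (zip headers plus the small dindex files) is in the kilobyte-to-megabyte range, so overhead alone is nowhere near the ~1.6 TiB difference here.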

Is it possible that your backup was including the temporary folder where Duplicati stores files during upload, so a number of these files were picked up as well?

Also, for the “Source” number, does the source disk have compression enabled? How did you get this number?
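
If the source volume is NTFS, one way to check for filesystem compression (a sketch; the folder path is a placeholder) is the built-in compact tool, which reports the compression state and ratio of each file:

    rem Placeholder path; lists NTFS compression state of files under D:\Data
    compact /s:"D:\Data"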

Symlinks and hardlinks do not matter because the content is deduplicated, unless the symlinks point outside the source folder.

Does that mean there has only been 1 backup ever done (so just the initial backup), or a configuration that keeps only one version, e.g. --keep-versions=1?

If the latter, you're losing the protection of having old versions. One side effect on space usage is this:

  --threshold (Integer): The maximum wasted space in percent
    As files are changed, some data stored at the remote destination may
    not be required. This option controls how much wasted space the
    destination can contain before being reclaimed. This value is a
    percentage used on each volume and the total storage.
    * default value: 25

The COMPACT command

By default this is run automatically as needed, although options can control it.
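
It can also be triggered by hand if you want to force a pass, for example with a lower threshold (the destination URL here is a placeholder):

    rem Placeholder destination; compacts once 5% of the stored data is waste
    Duplicati.CommandLine.exe compact "ssh://backup.example.com/data" --threshold=5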

If this is the first backup ever, then I agree it seems odd.