Just starting out with Duplicati, backing up to OneDrive. I've got two separate jobs set up, one for my "Backups and Photos" directories, the other for everything else. This is on an OpenMediaVault NAS; the local filesystem is ZFS. I am only working with the "Backups and Photos" job at the moment.
The Photos portion is fairly straightforward and almost entirely photos with a video here or there. The Backups directory is the target for my UrBackup server that I use for full-system images and incremental backups for all our PCs. When I set up the backup job, it was with a new OneDrive account, and I ran into the “5GB free tier” issue, despite not starting a free trial; that’s been cleared up and it’s uploading cleanly.
The problem is, it's completed the file count (which looks right; the output of `find . -type f | wc -l` comes within shouting distance), but the total file size it reports is 770GB. On disk, the Backups directory is 227GB and the Photos directory is 232GB. I have compression turned on in ZFS; as expected, Photos is at a compression ratio of 1.00. Backups is at 1.31, which at worst would bring the 227GB up to about 300GB. So somehow, Duplicati is seeing an extra 250GB of file size.
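For what it's worth, backup tools generally read the logical (apparent) size of each file, while `du`'s default numbers on ZFS reflect the compressed blocks actually allocated, so the two views can differ by exactly the compression ratio. A rough way to compare them (a sketch using GNU `du`; the temp directory and file are just placeholders, not your real data):

```shell
# Demo: apparent (logical) size vs. allocated (on-disk) size.
# On ZFS with compression, allocated can be much smaller than apparent,
# and a backup tool scans the apparent size.
demo=$(mktemp -d)
dd if=/dev/zero of="$demo/file.img" bs=1M count=1 status=none

# Logical bytes -- what a backup tool sees when it reads the file:
apparent=$(du --apparent-size --block-size=1 "$demo/file.img" | cut -f1)
# Allocated bytes -- what the filesystem actually stores:
ondisk=$(du --block-size=1 "$demo/file.img" | cut -f1)

echo "apparent=$apparent on-disk=$ondisk"
rm -rf "$demo"
```

Run against the real directories, `du -sh --apparent-size` vs `du -sh` on the Backups tree should show the compression gap directly, and tell you how much of the 770GB it explains.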
I thought it might be running both backup jobs combined for some reason, but the other job is well over 400GB in size (and backing up to a different OneDrive account, actually). Does anyone know why this is happening? UrBackup uses a ton of symlinks; is it possible that Duplicati is counting each symlinked file as a full copy? The Advanced options seemed to indicate the default is to just record that the symlink exists and where it points, so I wouldn't have expected that.
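One quick way to test the symlink theory is to count the links and total up their target sizes, since that's the extra data you'd see if each link were counted as a full copy. A sketch (the layout below just imitates an UrBackup-style tree where an increment symlinks an unchanged image back into the previous full backup; the names are hypothetical):

```shell
# Hypothetical UrBackup-style layout: an incremental backup directory
# symlinks an unchanged image back into the previous full backup.
demo=$(mktemp -d)
mkdir "$demo/full" "$demo/incr"
dd if=/dev/zero of="$demo/full/disk.img" bs=1M count=1 status=none
ln -s ../full/disk.img "$demo/incr/disk.img"

# How many symlinks are in the tree:
links=$(find "$demo" -type l | wc -l)
# Extra bytes you'd see if every symlink were counted as a full
# copy of its target (stat -L follows the link to the target):
extra=$(find "$demo" -type l -exec stat -Lc %s {} + | awk '{s+=$1} END {print s+0}')

echo "symlinks=$links extra-if-followed=$extra"
rm -rf "$demo"
```

Running those two `find` commands against the real Backups directory would show whether the symlinked data roughly accounts for the missing ~250GB.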
Thanks for any help you folks can provide!