I’m backing up ~10 TB from a 30 TB drive on a Linux (Ubuntu) server, and the Duplicati progress bar shows that over 120 TB is being backed up. That would seem to be impossible; the source disk isn’t even close to that size. Any ideas what could be happening?
Assuming “version canary” means the latest canary (currently 2.0.3.11), my guess would be that you’ve got a shortcut / hard link / mount point that’s either including unexpected data or causing a recursive loop.
What is your current shortcut-following setting set to?
You may want to consider editing your backup job to have a smaller source where the “to go” count is what’s expected, then slowly add folders between runs until the count balloons.
Have you gotten a full backup completed yet? If so, check the job log to see what was reported as scanned and backed up (the first 10 or so lines of the log). It’s possible there’s just a bug in the GUI.
Thanks for the response. I hadn’t set a shortcut-following setting. I haven’t completed a backup of that particular file system yet, but I have completed smaller backups.
How would I set up the backup to only backup files and not shortcuts/hard links/mount points?
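Not the original posters, but for anyone landing here: Duplicati 2 exposes advanced options that control how symlinks and hard links are treated. A sketch of what this might look like from the command line, assuming the standard `duplicati-cli` command and that `/data` and `file:///mnt/backup-target` stand in for your actual source and destination (the same options can be added in the web UI under the job’s Advanced options):

```shell
# Back up file data only, without following links:
#   --symlink-policy=store   records the symlink itself rather than
#                            recursing into its target (this is the default;
#                            "follow" traverses targets, "ignore" skips links)
#   --hardlink-policy=first  backs up hard-linked data only the first time
#                            it is seen, instead of once per link
duplicati-cli backup file:///mnt/backup-target /data \
    --symlink-policy=store \
    --hardlink-policy=first
```

If the 120 TB figure comes from a mount point or a `follow`-style loop, setting the symlink policy to `store` (or `ignore`) should bring the “to go” count back in line with the actual disk usage.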