Enormous backup size?

I’m trying Duplicati with B2 storage. I selected a bunch of files that can’t possibly be more than 2.5 TB, probably more like 1 TB, but Duplicati says it’s backing up 650 TB!!!

2664956 files (642.70 TB) to go at 836.02 KB/s

Is this a bug? Or maybe it’s due to following symlinks and backing up the same data multiple times? How do I find out what it thinks it’s backing up?

Also, if I stop the backup, edit the configuration, and try to start it again, it won’t start. I get this error:

The database file is locked database is locked

This is on Ubuntu 18.04.2, installed from the latest .deb file.

I tried selecting just 2 folders for backup, with a total size of 3 GB, and now it says the backup size is negative!?!

6 files (-615 bytes) to go at 715.39 KB/s

Can you enable debug logging and provide the log file? Remove any sensitive info from the log before uploading.

Can you link to instructions on how to do that?

Execute the job from the command line and use the following options: set log-file-level to profiling, and use the log-file option to send the output to a file.
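
For example, roughly like this (just a sketch: the storage URL, source folder, and log path are placeholders, and it’s worth checking duplicati-cli help or the job’s Export → As Command-line option in the web UI for the exact command and option names for your version):

duplicati-cli backup &lt;your-b2-url&gt; &lt;source-folder&gt; --log-file=/tmp/duplicati-profiling.log --log-file-log-level=Profiling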

I haven’t figured out the debug logging yet, but when running more backups I get error messages:

[Screenshot from 2019-02-17 15-16-42]

If I click on Show, it doesn’t show me the errors; it just shows me “Log data” with a list of dates, with no indication that they are clickable:

[Screenshot from 2019-02-17 15-16-59]

But they are. If you click on one, it shows a big dump of text:

[Screenshot from 2019-02-17 15-23-40]

and if you scroll down enough, there are some nested brackets with this inside:

Errors: [
2019-02-17 15:04:15 -05 - [Error-Duplicati.Library.Main.Database.LocalBackupDatabase-CheckingErrorsForIssue1400]: Checking errors, related to #1400. Unexpected result count: 0, expected 1, hash: 8cqGCuMqPNZVNoHXNJlsIZ3tBwreriRrTUa42BO15oA=, size: 14424, blocksetid: 285924, ix: 129, fullhash: +gNLX5B4EQxl5tACu9u7gkcoC9TIcjc3GBWVYkJ2x4g=, fullsize: 13224024,
2019-02-17 15:04:15 -05 - [Error-Duplicati.Library.Main.Database.LocalBackupDatabase-FoundIssue1400Error]: Found block with ID 569826 and hash 8cqGCuMqPNZVNoHXNJlsIZ3tBwreriRrTUa42BO15oA= and size 10320
]

This is a pretty bad user experience. Yet it’s the highest-rated backup solution on AlternativeTo, so the others must be even worse?