Backup failing due to e.g. ArgumentNullException

I started using Duplicati a short while ago, and while I’m extremely excited about the feature set (I honestly wish I’d known about it years earlier), I’m currently not impressed with the reliability and the reporting around failed backup jobs. Hopefully someone can help me here:

I started by adding a backup job of a few folders totalling around 150 GB. That one worked fine. Then I added another ~500 GB and waited until it finished, which took around 1–2 days (lots of small files, uploading to cloud storage). I’m not completely sure it actually succeeded: although I saw it run for at least 1–2 days, the backup size afterwards still shows 150 GB, even though the ~500 GB were stored on the cloud side (it’s around 650 GB there now). Oddly, neither the log data for the backup job shows that run, nor does the general Duplicati log show any errors for that time.

The next run then failed with “Duplicati.Library.Interface.UserInformationException: Found 1 remote files that are not recorded in local storage, please run repair”. A further attempt did start running (and took ~1 day), but it appears to have failed with “System.ArgumentNullException: Path cannot be null.”.

It’s a bit hard to investigate since a full backup run takes around two days. It would probably be easier to just delete the whole backup job, re-create it, and back everything up again, but honestly I need to know these problems are solvable before I can trust a backup solution.

Basically I don’t know how to proceed: neither running the backup again nor running repair has gotten the job working again. I would appreciate any kind of help here. Thanks!


Hello and welcome to the forum!

What back end are you using?

As long as you’re doing backups through the web UI (and not the command line), the reported sizes should be accurate. The “Source” size is the total of all files you have selected for backup; the “Backup” size is how much space the backup occupies on the back end. The backup size can certainly be smaller than the source size, depending on how well deduplication and compression work on your data.

Sounds like Duplicati sees an extra file in your remote storage. Do you have more than one backup job? Make sure each backup job on each PC has its own unique target folder - sharing a target folder between jobs is not allowed.
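If the repair button in the UI doesn’t clear the error, you can run the same operations from the Duplicati command line, where the output is easier to read and capture. A rough sketch, assuming a hypothetical WebDAV target URL and database path - substitute the real values from your job’s Export → As Command-line output (on Windows the tool is Duplicati.CommandLine.exe instead of duplicati-cli):

```shell
# List the files Duplicati sees on the remote - the unrecorded extra
# file from the error message should show up here.
duplicati-cli list "webdavs://example.com/duplicati-target" \
    --dbpath=/path/to/local-job-database.sqlite

# Try to reconcile the local database with the remote files.
duplicati-cli repair "webdavs://example.com/duplicati-target" \
    --dbpath=/path/to/local-job-database.sqlite

# If repair still reports problems, list (and then remove) the
# broken filesets before backing up again.
duplicati-cli list-broken-files "webdavs://example.com/duplicati-target" \
    --dbpath=/path/to/local-job-database.sqlite
duplicati-cli purge-broken-files "webdavs://example.com/duplicati-target" \
    --dbpath=/path/to/local-job-database.sqlite
```

Purge-broken-files deletes data that can no longer be restored anyway, so run list-broken-files first and check what it reports before purging.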