Over time, I have started running into errors very often, and it gets in the way a lot. I understand that each error needs to be dealt with separately, but they have become very frequent. Most of my backups are local, but I get failures with both local backups and backups to the cloud.
For example, here are the errors that I have now:
- ArgumentNullException: Value cannot be null. (Parameter '_entriesForCurrentReadStream')
- DatabaseInconsistencyException: Detected 1 volume with missing filesets: VolumeId = 121, Name = duplicati-20250927T230334Z.dlist.zip.aes, State = Deleting
Which version and which back-end are you using, and do you ever abort backups or compaction (soft or hard) for any reason?
This error message points to SharpCompress, which is only used as a fallback. Are you using the latest version of Duplicati? Are you perhaps using non-standard compression settings?
This looks like an issue that has been fixed in recent versions of Duplicati.
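As a side note, for the DatabaseInconsistencyException it may be worth letting Duplicati itself check and clean up the destination before recreating anything. A minimal sketch, assuming a Windows install, a local file destination, and placeholder paths/passphrase; if the job was created in the GUI, you may also need --dbpath pointing at that job's database (shown on the job's Database page) so the commands operate on the right database:

```
# Read-only check: list files affected by missing or broken volumes
Duplicati.CommandLine.exe list-broken-files "file://D:/backups/myjob" --passphrase="..."

# Remove references to broken volumes so future backups can proceed
Duplicati.CommandLine.exe purge-broken-files "file://D:/backups/myjob" --passphrase="..."

# Repair or recreate the local database from the destination
Duplicati.CommandLine.exe repair "file://D:/backups/myjob" --passphrase="..."
```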
I was seeing that same ArgumentNullException and wasn’t sure if it was my settings or a bug. Good to know it’s likely just a version/fallback issue.
I very rarely interrupt it manually, but shutting down or restarting the computer of course interrupts the backup process quite often. I understand that this is bad, but I can't always wait for Duplicati to complete the backup. The program should handle such situations.
2.2.0.1_stable_2025-11-09
“Are you perhaps using non-standard compression settings?” - No, I have never used any.
As a result, I lost several years of backups.
Tip: do not create backups larger than 300 GB.
1) Creating the backup itself takes a long time, so it is often interrupted by shutting down/restarting the computer, which leads to errors.
2) A big backup means a big database. If it needs to be recreated due to errors, the recreate can take more than a day. (And recreating the database doesn't always help.)
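On top of that tip: the local database size is driven largely by the block count, so for big source sets it can help to split the data into several smaller jobs and pick a larger block size up front. A hedged sketch of what that could look like; the destination URLs, source paths, and passphrase are placeholders, and note that --blocksize can only be chosen when a backup is first created:

```
# Two smaller jobs instead of one 300+ GB job, with a larger block size
# so the local database stays smaller and faster to recreate.
Duplicati.CommandLine.exe backup "file://E:/backups/photos" "D:/photos" --blocksize=1MB --dblock-size=200MB --passphrase="..."
Duplicati.CommandLine.exe backup "file://E:/backups/documents" "D:/documents" --blocksize=1MB --dblock-size=100MB --passphrase="..."
```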
A disadvantage of Duplicati is that it does not let you restore data while a backup is running (even if they are different backups).
That’s a GUI limitation only; nothing prevents you from running parallel instances of Duplicati. I do that every time I run a full restore test on a set of backups. Right now I’m running 8 restore processes in parallel, always restoring from the backup destination, without the local database, because that’s the only sane way to test backups.
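For completeness, this is roughly what one of those restore processes looks like; a sketch with placeholder paths and passphrase, using the --no-local-db option so the restore ignores any existing local database and reads from the backup destination:

```
# Restore everything from the latest version into a scratch folder,
# ignoring the job's existing local database.
Duplicati.CommandLine.exe restore "file://E:/backups/photos" "*" --restore-path="D:/restore-test/photos" --no-local-db --passphrase="..."

# Run one such command per backup set (in separate terminals/processes)
# to test several backups in parallel.
```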