Duplicati fails deadly?


#1

I set up Duplicati to back up lots of folders. Checked on it and it hasn’t backed up for days. It says

The source folder /home/…/… does not exist, aborting backup

Seriously?? If a single folder gets deleted, it doesn’t back up anything at all? This is not the kind of behavior I’d expect from backup software. The whole point is to prevent data loss.


#2

I partially agree, but it’s just how you look at it.

If a backup solution says the backup completed successfully, but the contents of one or more source folders are skipped completely, you could be unpleasantly surprised when you have to restore files from that source folder.

Duplicati could continue the backup operation for the sources that still exist, but the main developer chose to abort the backup and throw an error message in this case. The underlying assumption is that the backup operator should regularly check and/or monitor the backups for errors or warnings. Duplicati-Monitoring and dupReport can help you monitor your backups.

There’s some discussion about this topic here and here.


#3

IMO the backup should still take place (for the valid source paths) but a Warning should be thrown for the invalid paths.


#4

In this case that wouldn’t happen, because the folder has ceased to exist.

Ok, but I would be more unpleasantly surprised to find out that nothing at all had been backed up.


#5

This is a tough call.

While at first it makes sense to warn if one of many source folders disappears and to error if the ONLY source folder goes away, there are side effects to this.

For example, when a folder (or file) is no longer found, Duplicati considers it deleted. This has potential follow-on actions, including:

  • “disconnect” in history for EVERYTHING in that path
  • potential removal from backups of the disappeared items depending on retention policy

So a warning-with-backup that is missed or ignored for longer than the retention policy’s keep rules allow could result in deletion of valid, previously backed up content.


#6

Continue backing up the data that exists, don’t delete the data that’s missing, and warn the user. I don’t know what’s so difficult to understand.

Failing to back up data at all is obviously wrong.


#7

It’s not difficult to understand. It’s difficult to know.

Duplicati currently has no way of knowing if the file disappeared, was moved, or was deleted intentionally. It can only know how everything looks in the snapshot that is taken.

If you understand the risk and accept it, then you can change this behavior with --allow-missing-source=true, but we cannot in good conscience enable this by default for everyone.
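For example, when running a backup from the command line, the option can be appended to the backup command. This is only a sketch: the target URL and source paths below are placeholders to be replaced with your own.

```shell
# Back up two source folders, tolerating a missing source
# (sftp URL and paths are example values, not defaults)
duplicati-cli backup "sftp://backup.example.com/duplicati" \
    /home/user/docs /home/user/pics \
    --allow-missing-source=true
```

In the GUI the same option can be added under the job’s advanced options.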


#8

So you think it’s better for backup software to have a default configuration that fails to backup data?


#9

In a number of situations: yes.

Some (probably more than you would expect) users set the retention policy to “Keep 3 backups”. If a source folder is renamed and the backup is scheduled to run daily, after 3 days you end up with no backup of one or more of your important source folders.
In that situation it’s better to have an older backup of all sources, than a recent backup that misses source folders.
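The scenario above can be sketched in a few lines. This is not Duplicati’s actual code, just a minimal simulation of a “Keep 3 backups” policy when a renamed folder silently drops out of the daily snapshots; the paths are made up for illustration.

```python
from collections import deque

def run_backups(snapshots, keep=3):
    """Apply each daily snapshot in order, retaining only the last `keep` versions."""
    retained = deque(maxlen=keep)  # old versions fall off the left as new ones arrive
    for snap in snapshots:
        retained.append(set(snap))
    return list(retained)

# Day 1: both folders present; days 2-4: /home/user/docs was renamed away
# and the backups silently continued without it.
history = run_backups([
    {"/home/user/docs", "/home/user/pics"},
    {"/home/user/pics"},
    {"/home/user/pics"},
    {"/home/user/pics"},
], keep=3)

# After three more daily runs, no retained version contains the docs folder.
print(any("/home/user/docs" in version for version in history))  # False
```

With abort-on-missing-source instead, the day-1 backup (which still contains the folder) would not have been pushed out by three newer, incomplete backups.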

Too bad that the default setting is not optimal for your situation, but others may benefit from this default behaviour. Duplicati is designed in such a way that it can be adjusted to virtually any situation. The mentioned --allow-missing-source setting, the built-in reporting options and the external services Duplicati-Monitoring and dupReport can help detect and fix problems when a source folder is missing.