Automatic backups stopped on June 23rd!

Hi

Running “ps -ef | grep -i Duplicati” in a terminal confirms that the Duplicati service is running, but when I log in to the Duplicati web interface I can see that it hasn’t executed the scheduled automatic daily backups since June 23rd(!?) And when I press “Run now”, nothing happens! This is all quite disturbing!
If it were possible, I would attach a screenshot so you could see it for yourself.

PS: Does this problem have anything to do with this red error message at the bottom of the screen:

“Error while running Lenovo-Ubuntu
The source folder /mnt/4AF15A0435E762B4/Paragon backups/Lenovo_Ubuntu_Backup_20180523/ does not exist, aborting backup”

If the deletion (or renaming?) of a single folder on the PC I am backing up can make Duplicati stop backing up anything, then I would have to declare Duplicati totally useless…

I look forward to hearing from anyone!

I suspect it has everything to do with that “source folder xxx does not exist” message.

My guess is that if you check the logs you’ll find Duplicati actually ran as scheduled even after the 23rd, but every run failed with the “source folder does not exist” message.

Basically, if a folder specified in step 3 (Source Data) of the job doesn’t exist, the backup will fail UNLESS you tell Duplicati to ignore that issue by adding --allow-missing-source=true in step 5 (Options) of the backup job.

--allow-missing-source
Use this option to continue even if some source entries are missing.
Default value: “false”
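
For reference, this is how the option would look on a command-line run (just a sketch; the destination URL and passphrase below are placeholders for whatever your job actually uses):

duplicati-cli backup "file:///mnt/backup-target" \
  "/mnt/4AF15A0435E762B4/Paragon backups/Lenovo_Ubuntu_Backup_20180523/" \
  --allow-missing-source=true \
  --passphrase="your-passphrase"

In the web UI you don’t need the command line at all: just add allow-missing-source as an advanced option in step 5.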

Alternatively, you could simply remove the missing folder from the Source Data so Duplicati doesn’t try backing up a non-existent location anymore.

Note that I’m making some assumptions here (like that /mnt/4AF15A0435E762B4/Paragon backups/Lenovo_Ubuntu_Backup_20180523/ really doesn’t exist). If it DOES exist and you’re still getting the error, then there’s something else going on (perhaps a permissions problem) and we can try looking at it from that angle.
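
If you want to rule out the permissions angle, something like this would show whether the path exists and whether the service user can read it (the user name “duplicati” is an assumption on my part; check the owner of the process in your ps -ef output):

ls -ld "/mnt/4AF15A0435E762B4/Paragon backups/Lenovo_Ubuntu_Backup_20180523/"
# If the folder exists, verify the Duplicati service user can list it:
sudo -u duplicati ls "/mnt/4AF15A0435E762B4/Paragon backups/Lenovo_Ubuntu_Backup_20180523/"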

Yes. That helped! Thank you. But I am still shocked…!
I have used many online/cloud backup systems and this is the worst default setting I have seen. It’s great if you get some warning that a certain folder or sub-folder wasn’t found, but the fact that the program totally stops all future backups is shocking! It gives me a fundamental distrust of Duplicati.

This feature/issue is tracked here: Entire backup fails when one path is missing · Issue #2919 · duplicati/duplicati · GitHub

What you experienced is currently expected behavior, but there is a lot of debate 🙂

Hi Pectojin

You say “It’s currently expected behavior”, but, just like me, the users in that discussion did NOT expect Duplicati to abandon all future backups just because a folder had been deleted…!
Inspired by that discussion, I have now configured Duplicati to send me a mail ‘on all occasions’. That’s necessary because you apparently can’t trust this backup software.

Hello @Henr, you’re right. This should not be the default behavior.
But if you are serious about backups, you should set up email notifications on the backup job in case of errors/warnings…
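
For reference, these are the advanced options involved (settable in step 5 of the job, or as default options for all jobs); the SMTP server, credentials, and addresses below are placeholders:

--send-mail-url=smtp://smtp.example.com:587
--send-mail-username=me@example.com
--send-mail-password=app-password
--send-mail-from=me@example.com
--send-mail-to=me@example.com
--send-mail-level=Warning,Error,Fatal

Set send-mail-level to All if you want a mail on every run, not just on problems.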

You cannot trust any backup solution blindly without monitoring/notifications.

There are also other tools to notify you about Duplicati backups:
Self-hosted solution:
dupReport

Hosted solution:
https://www.duplicati-monitoring.com/

Both of them can notify you if a backup stops working.
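
For what it’s worth, as far as I understand it the hosted service works by giving you a reporting URL that you put into Duplicati’s --send-http-url advanced option, so every run reports its result; the token part of the URL below is a placeholder you get when registering:

--send-http-url=https://www.duplicati-monitoring.com/log/your-token

dupReport instead parses the result emails that Duplicati sends, so it builds on the email notifications mentioned above.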

https://www.duplicati-monitoring.com

This is another great Duplicati add-on service that will help you spot backups that are not working.

edit: lol, I see it was also mentioned above.

But I suppose third-party monitoring is only necessary for businesses that want extra security, since Duplicati has the ‘send mail’ options.

That’s only partly true. Duplicati’s email option won’t save anyone from a Duplicati crash, for example. But dupReport or www.duplicati-monitoring.com will alert you that a backup job hasn’t had any new successful backups.

Yes, OK! I see the point: Independent confirmation!

I think that is a big point of contention.
Duplicati handles file and folder changes/removals just fine, provided they are at least one level below the entered location/path; what some see as a problem only happens when the entire entered path disappears.

Example: If you set it to back up /path/A/subfolder, /path/B/subfolder, and /path/C/subfolder, any changes to the files and folders at least one level below these will continue backing up, so deleting/moving/renaming /path/A/subfolder/removedfolder means Duplicati will continue to work as normal.

The point of contention comes in when someone deletes/moves/renames the top-level path set in Duplicati. If /path/B/subfolder is deleted, or moved/renamed to /path/B/foldermoved, Duplicati is still looking for “/path/B/subfolder”; it does not know the folder was deleted/moved/renamed, so it fails the entire job because the entered path is not available. The job itself needs to be changed to reflect this change.
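
A quick shell illustration of the difference (all paths made up):

mkdir -p /path/B/subfolder/removedfolder
# Changes below the configured path are fine; Duplicati still finds it:
rm -r /path/B/subfolder/removedfolder        # backups keep working
# Remove/rename the configured path itself and the existence check fails:
mv /path/B/subfolder /path/B/foldermoved
test -d /path/B/subfolder || echo "source missing -> whole job fails"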

It is the equivalent of a Windows shortcut or hard link that points at drive F:. If Windows or a person changes the drive letter to G:, the shortcut or hard-linked item no longer works because it still points at “F”. This is also why the documentation says the default setup assumes backup locations that are always available, or at least available again by the time of a scheduled run. A run may fail, but if you re-add that path, the next backup completes based on the changes to that folder since the last (or first) run.

In the case of the original poster, I suggest changing the job to point at /mnt/4AF15A0435E762B4/Paragon backups/ instead. That way, if any of the folders below it get added, removed, deleted, or changed by Paragon, Duplicati can keep backing them up without failing the job, since “Lenovo_Ubuntu_Backup_20180523” is one sub-level below the entered path. The other option, as mentioned above, is setting --allow-missing-source=true.

The problem there is that if you keep adding new jobs pointing at folders that get deleted by other programs/software (like Paragon), you end up with dozens or hundreds of backup jobs pointing at non-existent folders/paths, which uses a lot of resources each time they attempt to run. I also understand that each Paragon backup may be anywhere from 5 to 500 GB (or more), so it may be worth doing a little creative programming: a standalone cron job that, once a day, checks the available folders under the Paragon folder and then creates/removes the Duplicati jobs accordingly (restarting the Duplicati service after changes are made).
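
Here’s a rough sketch of that idea (untested; the destination URL, passphrase, and database path are placeholders). Rather than editing job files, it just runs one duplicati-cli backup per Paragon folder that currently exists, so a deleted folder can never fail another folder’s backup:

#!/bin/sh
# Cron sketch: back up whatever Paragon folders exist right now,
# one backup run per folder. DEST and the passphrase are placeholders.
DEST="file:///mnt/backup-target"
SRC="/mnt/4AF15A0435E762B4/Paragon backups"

for dir in "$SRC"/*/; do
    [ -d "$dir" ] || continue
    name=$(basename "$dir")
    duplicati-cli backup "$DEST/$name" "$dir" \
        --passphrase="your-passphrase" \
        --dbpath="/var/lib/duplicati/$name.sqlite"
done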