I think that is a big point of contention.
Duplicati handles file and folder changes/removals just fine, provided they happen at least one level below the entered location/path. It is when the entire configured path disappears that you get what some see as a problem.
Example: if you set it to back up /path/A/subfolder, /path/B/subfolder, and /path/C/subfolder, any changes to files and folders at least one level below these will keep getting backed up, so deleting/moving/renaming /path/A/subfolder/removedfolder means Duplicati will continue to work as normal.
The point of contention comes in when someone deletes/moves/renames the top-level path set in Duplicati. If /path/B/subfolder is deleted, or moved/renamed to /path/B/foldermoved, Duplicati is still looking for “/path/B/subfolder”. It does not know the folder was deleted/moved/renamed, so it fails the entire job because the configured path is not available. The job itself needs to be edited to reflect the change.
It is the equivalent of a Windows shortcut pointing to drive F: when Windows or the user changes the drive letter to G: the shortcut no longer works because it still points at “F”. This is also why the documentation says the default assumption is that source locations are always available, or at least available often enough that a scheduled run can pick the backup up when the drive/backup location reappears. The job may fail in the meantime, but if you re-add that path, the next run completes the backup based on whatever changed in that folder since the last successful run.
In the case of the original poster, I suggest changing the job to point to /mnt/4AF15A0435E762B4/Paragon backups/. That way, if any of the folders below that path get added, removed, deleted, or changed by Paragon, Duplicati can keep backing them up without failing the job, since “Lenovo_Ubuntu_Backup_20180523” is one sub-level below the entered path. The other option, as mentioned above, is setting “--allow-missing-source=true”.
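For reference, here is roughly how that option looks on a command-line run; the destination URL and source path below are placeholders, the real job would use its own target and options:

```
duplicati-cli backup file:///mnt/backup-destination "/path/B/subfolder" --allow-missing-source=true
```

With that set, the job should continue even when a source entry is missing, rather than erroring out on the whole run.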
The problem there is that if you keep adding new jobs pointing to folders that other programs/software (like Paragon) later delete, you end up with dozens or hundreds of backup jobs pointing at non-existent folders/paths, which wastes resources every time they attempt to run. I also understand that each Paragon backup may be anywhere from 5-500GB (or more), so it may be worth a little creative programming: a standalone cron job that once a day checks which folders exist under the Paragon folder and then creates/removes the matching Duplicati jobs (restarting the Duplicati service after changes are made); see the sketch below.
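As a rough sketch of that idea, here is a hypothetical Python script you could run from cron. Rather than editing the job definitions in Duplicati’s server database (whose format I won’t guess at here), it sidesteps job management entirely and invokes duplicati-cli once per folder that actually exists, so nothing ever points at a deleted path and no service restart is needed. The Paragon root path comes from this thread; the destination URL and encryption setting are placeholders.

```python
#!/usr/bin/env python3
# Hypothetical daily cron job: back up whichever Paragon image folders
# exist right now, skipping anything Paragon has already deleted.
import subprocess
from pathlib import Path

PARAGON_ROOT = Path("/mnt/4AF15A0435E762B4/Paragon backups")
TARGET_URL = "file:///mnt/backup-destination"  # placeholder: use your real storage URL

def main():
    # Only folders present today get backed up; a folder deleted since the
    # last run is simply absent from the loop instead of failing a job.
    for folder in sorted(p for p in PARAGON_ROOT.iterdir() if p.is_dir()):
        result = subprocess.run(
            [
                "duplicati-cli", "backup",
                f"{TARGET_URL}/{folder.name}",  # separate destination per image set
                str(folder),
                "--no-encryption=true",  # placeholder: set a real passphrase instead
            ],
            capture_output=True,
            text=True,
        )
        status = "ok" if result.returncode == 0 else f"failed ({result.returncode})"
        print(f"{folder.name}: {status}")

if __name__ == "__main__":
    main()
```

Scheduling it is then a single crontab entry, something like `0 3 * * * /usr/local/bin/paragon_backup.py`, and the Duplicati server/UI never needs to know about the individual folders at all.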