I can’t think of one that’s both human-simple and resource-light. If you want to brute-force it, restoring everything to an area with enough free space and then overwriting the source files only where the backup copy is newer might do, e.g. with `cp -u`:
-u , --update
copy only when the SOURCE file is newer than the destination file or when the destination file is missing
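Here is a minimal sketch of that brute-force idea, using temporary directories as stand-ins for the restore area and the live source tree (the file names and contents are fabricated just to show the `-u` behavior):

```shell
#!/bin/sh
set -eu

scratch=$(mktemp -d)   # stand-in for the restore-everything area
src=$(mktemp -d)       # stand-in for the live source tree

# Simulate a restore: the backup holds a stale copy of a.txt and a file
# b.txt that is missing from the source.
echo "stale" > "$scratch/a.txt"
touch -t 202001010000 "$scratch/a.txt"   # old mtime, older than src copy
echo "newer" > "$scratch/b.txt"

echo "current" > "$src/a.txt"            # live copy, newer than the backup

# Copy back only where the restored copy is newer or missing at the
# destination: a.txt is left alone, b.txt gets copied in.
cp -ru "$scratch/." "$src/"

cat "$src/a.txt"   # still "current"
cat "$src/b.txt"   # "newer", brought in from the restore
```

Note this relies on GNU `cp`'s `-u` semantics; the big downside, as said above, is the wasted space and I/O of restoring everything first.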
If you want to script getting only the missing source files, you could probably use Duplicati's find command, perhaps running it from the true CLI by adapting the GUI's Export As Command-line output. Note that find doesn't show file timestamps, and it only shows exact sizes for smaller files; once files get big enough it starts showing rounded sizes in larger units.
After getting find or something else to tell you what you still have locally, you can sort that list and a sorted version of the Duplicati backup's find output, compare them with comm, and run a true CLI restore of exactly the files you need.
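The compare step above can be sketched like this. The two list files here are fabricated stand-ins; the Duplicati invocations in the comments are assumptions (check the GUI's Export As Command-line for the real arguments and storage URL on your system):

```shell
#!/bin/sh
set -eu
work=$(mktemp -d)

# In real use, something like (syntax and URL are placeholders):
#   Duplicati.CommandLine find <storage-url> "*" | ... > backup-list.txt
#   find /data -type f | sort > local-list.txt
printf '%s\n' /data/a /data/b /data/c | sort > "$work/backup-list.txt"
printf '%s\n' /data/a /data/c         | sort > "$work/local-list.txt"

# comm -23 keeps lines unique to the first file: paths that exist only
# in the backup, i.e. exactly the files you'd want to restore.
comm -23 "$work/backup-list.txt" "$work/local-list.txt" > "$work/missing.txt"

cat "$work/missing.txt"   # -> /data/b
```

`comm` requires both inputs to be sorted the same way, which is why the sort happens first; the resulting list can then be fed to a CLI restore.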
Or you can let Duplicati do the work, having it not overwrite files where the restored version differs from the current one. That option in the GUI is called Save different versions with timestamp in file name, and the remaining manual job (or scripted job, or one aided by a merge tool) is to figure out which file to keep when you end up with two.
Ordinarily I’d expect collisions to be cases where the backup is older than the current file, so another way to reduce them would be to move aside the files newer than the last backup, do the restore (wasting some effort on stale versions), then move them back. You could do some sort of date-based find command to locate the new files…
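One way to sketch that date-based find, assuming you keep (or create) a marker file whose mtime records when the last backup ran; everything here uses throwaway temp paths just to show the `-newer` test:

```shell
#!/bin/sh
set -eu
src=$(mktemp -d)
marker="$src/.last-backup"   # hypothetical marker touched after each backup

# One file older than the marker, one newer.
echo old > "$src/old.txt"
touch -t 202001010000 "$src/old.txt"
touch -t 202306150000 "$marker"
echo new > "$src/new.txt"    # mtime is "now", after the marker

# Files modified since the last backup: the ones to move aside before
# doing the big restore, then move back afterwards.
find "$src" -type f -newer "$marker" ! -name .last-backup
```

Only `new.txt` is printed; with a real marker you could pipe this into `mv` (or `find ... -exec mv`) to stash the new files before restoring.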