Right now I use Duplicati 2 to back up to three different backup targets (a locally attached USB drive, a NAS, and cloud storage). This results in three backup jobs with different target settings (which is fine) but identical source settings.
Drawback: whenever I adjust the set of source files to include/exclude/filter in the backup jobs (which happens frequently), I have to do it three times, always trying to ensure that all three jobs are configured identically.
I’d very much like the idea of having the backup source definition separated from the actual backup job in some kind of “Backup Source Set” which could then be “linked” to one or more backup jobs. The “add backup” wizard could offer both the option to create a new Backup Set (which would be mostly the same as the current process, aside from having to choose a name for the new Backup Set) and the option to select an already existing Backup Set.
I’m looking forward to any feedback for this suggestion for improvement.
While this would no doubt suit you, most users are not advanced users, and the risk of them clicking the option without knowing exactly what it does is too high. I would not recommend implementing something like this. To mimic what you’re doing, just export the job as a JSON file and import it, changing only the parameters that need to differ per job. I did this when I deployed Duplicati at my workplace for beta testing.
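To illustrate the export/import approach: a minimal sketch of cloning an exported job config while changing only the name and target URL. The key names (`Backup`, `Name`, `TargetURL`, `Sources`) reflect what a typical Duplicati 2 JSON export looks like, but verify them against your own exported file before relying on this.

```python
import json

def clone_job(config: dict, name: str, target_url: str) -> dict:
    """Deep-copy an exported job config, changing only name and target."""
    clone = json.loads(json.dumps(config))  # deep copy via JSON round-trip
    clone["Backup"]["Name"] = name
    clone["Backup"]["TargetURL"] = target_url
    return clone

# Illustrative stand-in for a real exported config file
base = {"Backup": {"Name": "Docs to USB",
                   "TargetURL": "file:///mnt/usb/backup",
                   "Sources": ["/home/me/Documents"]}}

nas_job = clone_job(base, "Docs to NAS", "ssh://nas/backup")
print(nas_job["Backup"]["Sources"])  # sources stay identical across jobs
```

The clones share identical sources at creation time, which is the point of the template; keeping them identical afterwards is still a manual task.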
I think Duplicati has so many “dangerous” switches…
I had the same thought as abacus. It would be really nice to define a source and then use one or more of these sources in a backup job, and/or use a source in more than one job.
I back up my important stuff to 4 different cloud providers, so I have 4 jobs with the same source. It is not a really big problem for me because my source does not change very often, but I agree with abacus that this would be a nice feature.
I’m not sure what you mean by dangerous. They provide advanced functionality and should be skipped by most users. Duplicati works just fine with the default settings, although I would change the hashsize from 100KiB to 250KiB to prevent local database bloat on larger backups.
What I think @abacus is after is multiple destinations. So you define a single job and it sends the data to multiple locations such as NAS, Cloud, etc. That is a feature I could get behind, but seems like a bit overkill for a backup solution such as Duplicati.
Multiple destinations within a single backup job is definitely not what I’m after primarily. As I already wrote, having three or more different jobs with different backup targets is fine with me, especially because I’ve got more sophisticated scheduling needs. For example, the backup job targeting the local NAS runs every 30 minutes while the one targeting the cloud storage runs only twice a day. Having both multiple targets and multiple scheduling settings within one single backup job may indeed become rather confusing to code and to use. So, when it comes to backup targets, I’m totally happy with the current situation.
However, what I’d very much appreciate is the “Backup Source Set” design. And no, I definitely don’t believe this would necessarily complicate the configuration of a backup job. With the most basic “one backup source, one backup target, one schedule” configuration, things could look very much the same as today (perhaps a default Backup Source Set name could be used automatically when nothing else is selected). Done.
For any somewhat more sophisticated backup job configuration, having the option to use a common set of backup sources in multiple backup jobs would most definitely help to keep the backups consistent.
Your recommendation to export an existing backup job and use it as a kind of template may help when configuring the jobs initially. However, it doesn’t help at all if the goal is, for example, simply to consistently add one more source path to multiple existing backup jobs (all of them with an identical backup source configuration).
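Today, the only way to make such a change consistently is to patch every exported job config by hand or by script, then re-import each one. A hedged sketch of what that scripting looks like (key names assumed from a typical Duplicati 2 export; this is exactly the per-job repetition a shared Backup Source Set would eliminate):

```python
def add_source(config: dict, path: str) -> dict:
    """Append a source path to one exported job config, skipping duplicates."""
    sources = config["Backup"]["Sources"]
    if path not in sources:
        sources.append(path)
    return config

# Illustrative stand-ins for several exported job files,
# all sharing the same source configuration
jobs = [
    {"Backup": {"Name": "To USB", "Sources": ["/home/me/Documents"]}},
    {"Backup": {"Name": "To NAS", "Sources": ["/home/me/Documents"]}},
    {"Backup": {"Name": "To Cloud", "Sources": ["/home/me/Documents"]}},
]
for job in jobs:
    add_source(job, "/home/me/Pictures")
```

Even with the script, every patched config still has to be re-imported and the old job replaced, once per destination.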
To date, most people looking for a single source list usable in multiple jobs seem to have opted for a “Duplicati sources” type folder containing various symlinks. All their jobs, with their various schedules and destinations, then just point to that single symlink-filled folder for their source.
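A minimal sketch of setting up that workaround (paths are illustrative; note that the jobs would also need Duplicati configured to follow symlinks, e.g. via its symlink policy option, for the linked content to be backed up):

```python
import os
import tempfile

# One "Duplicati sources" directory holds symlinks to the real paths;
# every backup job then uses this directory as its single source.
sources_dir = os.path.join(tempfile.mkdtemp(), "duplicati-sources")
os.makedirs(sources_dir)

real_paths = ["/home/me/Documents", "/home/me/Pictures"]  # hypothetical
for path in real_paths:
    link = os.path.join(sources_dir, os.path.basename(path))
    os.symlink(path, link)  # creating the link does not require the target to exist

print(sorted(os.listdir(sources_dir)))  # ['Documents', 'Pictures']
```

Adding or removing a source path then means editing one folder of symlinks instead of three job configurations.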