I suspect I know the answer…

I’m using linuxserver.io’s Duplicati container (2.0.5.1_beta_2020-01-18) and I’ve been backing up some very large folders to Backblaze. To keep the initial run times down I split the work into smaller jobs, e.g. all sub-folders A-E in one job, F-J in the next, and so on.

They’ve all gone to different folders in the same bucket using the same encryption settings and application keys. This works fine, but it means that every now and then I will need to check each individual job to make sure any newly created sub-folders are picked up.

Now that all the data is up in the cloud, is there a way to move it all into one folder and merge the databases into a single job that just backs up the parent folder?

By the way, I’ve only been using Duplicati for a couple of months, from initial playing around to actually moving data, and so far I am very pleased and impressed. Thanks for the great application.

No, you cannot merge the files on the back end. Each job keeps its own local database that has to match its remote files exactly, so mixing files from different jobs in one folder will cause all sorts of problems and probably break your backups.

I totally understand your goal of doing the initial backup in manageable chunks. But I would have suggested having only one backup job and adding folders to its selection list gradually, instead of setting up separate backup jobs that can never be merged later.

At this point what I’d probably do is pick one of your backup jobs to be the “main” one. Add the source folders from the OTHER backup jobs to the main job’s source selection list. (You can do this gradually if you want.) Then eventually you can delete those other backup jobs and end up with just a single job. You will lose the history for the backup jobs you delete, so make sure you think about whether you want to do that.
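For what it’s worth, under the hood a backup job is just one destination URL plus a list of source paths, so the “main” job can cover all of your folders at once. Here’s a rough command-line sketch of what that single job looks like; every value below (the B2 URL, folder names, source paths and passphrase) is a placeholder, and the safest way to get the real command for your setup is to use Export → As Command-line on the job in the web UI:

```
# Hypothetical example only - swap in the values from your own exported command.
# In the linuxserver.io container the CLI may need to be invoked via mono,
# e.g. "mono /app/duplicati/Duplicati.CommandLine.exe" instead of "duplicati-cli".
# (B2 credentials are omitted here; the exported command includes them.)
duplicati-cli backup \
  "b2://my-bucket/main-folder" \
  /data/subfolders-a-e \
  /data/subfolders-f-j \
  /data/subfolders-k-o \
  --passphrase="my-encryption-passphrase"
```

Adding the folders from the other jobs is then just a matter of appending them to that source list (or ticking them in the web UI), one at a time if you want to keep each run small.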

Let me know if this makes sense…

That confirms what I thought, but your suggestion is perfect.

I’m disappointed I didn’t think of that. I suppose I was looking for ‘quick and easy’ rather than ‘do it properly’.

Thanks.