Create New Folder with Each Backup

Thanks for the context. My impression is that SQL backups and transaction logs tend to defeat Duplicati's deduplication, due to the scattered changes. That's certainly been the experience for people who back up Duplicati's own SQLite database in a second job (hoping to avoid Recreate, which can be slow).

There are other hazards too. Some people use the database's own facilities to write backup-ready files in an application-consistent state, rather than relying on --snapshot-policy (VSS can sometimes achieve this with help from the database). I'm not familiar with database dump formats, but Duplicati uses fixed deduplication block sizes and boundaries, so it will probably lose its deduplication advantage if offsets shift (e.g. in a sequential dump, where one insertion moves everything after it). Even if a changed-blocks-only scheme worked, it interferes with fast restores (your goal), because blocks must be collected from various sources, and that slows a restore.
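
To make the offset problem concrete, here's a toy Python sketch (nothing to do with Duplicati's actual code, and its 4-byte blocks are absurdly small) showing why fixed-boundary blocking survives appends but not insertions:

```python
# Sketch: why fixed-offset blocking loses dedup when data shifts.
# A fixed-boundary dedup engine hashes blocks at fixed offsets; this
# toy version uses a 4-byte "blocksize" so the effect is easy to see.
import hashlib

def block_hashes(data: bytes, blocksize: int = 4) -> set[bytes]:
    """Hash each fixed-size block, as a fixed-boundary dedup engine would."""
    return {hashlib.sha256(data[i:i + blocksize]).digest()
            for i in range(0, len(data), blocksize)}

original = b"AAAABBBBCCCCDDDDEEEE"
appended = original + b"FFFF"   # append-only change: old offsets preserved
inserted = b"X" + original      # one byte inserted at the front

base = block_hashes(original)
print(len(base & block_hashes(appended)))  # 5 -> all old blocks still match
print(len(base & block_hashes(inserted)))  # 0 -> every boundary shifted
```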

Choosing sizes in Duplicati may be relevant. You might want a big --blocksize if dedup does little anyway. Some people use 1 MB (up from the 100 KB default) to reduce the overhead of slicing a backup too finely. (The 50 MB default is --dblock-size, the remote volume size, which is a separate setting.)
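
If you script the backup (more on that at the end), the size options go right on the command line. A minimal sketch, where the destination URL and source path are placeholders; note that --blocksize only takes effect when a backup is first created, so it can't be changed on an existing backup:

```python
# Sketch: running a backup with a larger --blocksize from a script.
import subprocess

subprocess.run([
    "Duplicati.CommandLine.exe", "backup",
    "s3://my-bucket/sql-backups",   # hypothetical destination URL
    r"D:\SQLBackups",               # hypothetical source folder
    "--blocksize=1MB",              # larger blocks, less slicing overhead
    "--dblock-size=200MB",          # remote volume size (default 50MB)
], check=True)
```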

So yours might actually be (in some ways) a reasonable case, although I'm not sure a direct copy using rclone wouldn't get you there more directly, faster, and with less risk of Duplicati fumbling block operations that may not help your case anyway. Duplicati is also not very happy with cold storage, because by default it performs remote operations all the time. It can be somewhat pacified by reducing verifications, but that has its own dangers. You could try searching the forum for earlier topics about adapting to cold storage; Converting already-started S3 backup to Glacier and Misc Glacier Questions touch on some of the issues.
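
For reference, these are the sorts of options people use to quiet down the remote traffic. This is a sketch, not a recommendation, since skipping verification trades away safety checks; the URL and path are placeholders again:

```python
# Sketch: reducing Duplicati's routine remote operations for cold storage,
# where downloads are slow or billed. Each option below removes a check
# that exists for a reason, so weigh the risks before copying this.
import subprocess

subprocess.run([
    "Duplicati.CommandLine.exe", "backup",
    "s3://my-bucket/sql-backups",       # hypothetical destination URL
    r"D:\SQLBackups",
    "--no-backend-verification=true",   # skip the pre-backup file listing check
    "--backup-test-samples=0",          # skip post-backup sample downloads
    "--no-auto-compact=true",           # avoid compacting, which downloads dblocks
], check=True)
```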

There are lots of ways to get some indication that the backup ran, including reporting options and third-party processors of the raw reports, such as duplicati-monitoring, or dupReport, which I think supports Apprise (extending notifications even further). You could also use --upload-verification-file to write a file with a fixed name, then make sure the file time changes. Or you could use Duplicati.CommandLine.BackendTool.exe to grab a full file listing and pull the backup date out of the dlist file name, to make sure the right one is seen. Or, instead of a listing, use it to download duplicati-verification.json to get the dlist file information. You can run a restore by date, in theory, but in my experience it's a bit buggy compared to --version, where 0 is newest. The FIND command can show the available versions of a backup, and can list the files it finds in them.
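
Here's a rough sketch of the dlist-name approach in Python. It assumes you've already obtained a file listing (for example from Duplicati.CommandLine.BackendTool.exe's list command) and just parses out the newest backup timestamp; the sample names are made up:

```python
# Sketch: confirming the latest backup date from a remote file listing.
# dlist files are named duplicati-<UTC timestamp>.dlist.zip (plus .aes
# when encrypted), so the newest timestamp is the last backup time.
import re
from datetime import datetime, timezone

DLIST_RE = re.compile(r"duplicati-(\d{8}T\d{6}Z)\.dlist\.zip")

def latest_backup_time(filenames):
    """Return the newest dlist timestamp in the listing, or None."""
    stamps = [datetime.strptime(m.group(1), "%Y%m%dT%H%M%SZ")
                      .replace(tzinfo=timezone.utc)
              for name in filenames if (m := DLIST_RE.search(name))]
    return max(stamps) if stamps else None

listing = [
    "duplicati-20240101T020000Z.dlist.zip.aes",
    "duplicati-20240102T020000Z.dlist.zip.aes",
    "duplicati-b1a2.dblock.zip.aes",   # data volume, ignored here
]
print(latest_backup_time(listing))  # 2024-01-02 02:00:00+00:00
```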

If you still want a new folder each time (faster restores, maybe more reliable, likely more storage), the command-line backup command is scriptable, whereas changing the GUI config is not; a sketch follows below. There's a third-party client that tries to get both, though: a CLI client that drives the same server interface the GUI uses.
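
Here's a rough sketch of what a scripted new-folder-per-run backup could look like. The URL and paths are placeholders, and one caveat: each new destination folder needs its own local database (hence the per-run --dbpath), because the old database would expect the previous files to still exist at the destination:

```python
# Sketch: a scripted CLI backup that targets a fresh, date-stamped folder
# each run, so every backup restores independently of earlier ones.
import subprocess
from datetime import datetime, timezone

stamp = datetime.now(timezone.utc).strftime("%Y%m%d")
target = f"s3://my-bucket/sql-backups/{stamp}"   # new folder per run

subprocess.run([
    "Duplicati.CommandLine.exe", "backup",
    target,
    r"D:\SQLBackups",                                       # hypothetical source
    "--dbpath=" + rf"C:\DuplicatiDBs\sql-{stamp}.sqlite",   # fresh local DB per folder
    "--blocksize=1MB",
], check=True)
```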