New retention policy deletes old backups in a smart way

Starting with the easier question, backing up the configuration starts with Configuration --> Export on the job menu. The export merges in the default options from the global settings, but omits non-job items such as the UI password. Store the exported configuration somewhere other than inside Duplicati itself, so you can use it to get Duplicati back if a disk is lost.

Database backup is not as easy. Backup duplicati database: is it better or doesn’t matter? gives another view of when it helps, and I still think intentionally deleting the database runs contrary to the expected usage. The reasons for backing it up probably don’t include “quieter”, unless something goes wrong during the recreate and things get noisy. If no issues occur, having the database mainly makes recovery “quicker”, and that matters more as the backup gets larger.

Pulling back closer to on-topic, you could use one of --retention-policy, --keep-time, or --keep-versions to trim your backup, but you might also need to run The COMPACT command to reclaim destination space. These retention operations run after the backup, so they can help shrink destination usage (which helps your copy-off of the destination); however, my belief is that, because of that timing, they won’t delete the last backup before a new backup runs (which would be dangerous anyway), even if you turn off the safety with --allow-full-removal.
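
As a rough sketch of what that can look like from the command line (the destination URL, database path, and retention tiers below are placeholders, not a recommendation):

```
# Tiered thinning during backup: keep dailies for a week, weeklies for a month,
# and monthlies for a year, similar in spirit to the GUI's smart retention.
Duplicati.CommandLine.exe backup "ssh://server/backups" /home/user/data --retention-policy="1W:1D,4W:1W,12M:1M" --dbpath=/home/user/.config/Duplicati/JOB.sqlite

# Reclaim destination space afterwards if auto-compact has not already done it.
Duplicati.CommandLine.exe compact "ssh://server/backups" --dbpath=/home/user/.config/Duplicati/JOB.sqlite
```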

Typically, people who back up the database seem to do something like a separate job that runs after the real backup. You can find the path of your job’s local database under Advanced --> Database for a backup, and that file is also what you would delete if you insist on resetting things to force a complete upload. Backing up the whole database folder needs to carefully avoid the database that is actively running that backup…
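
A minimal sketch of that separate job, assuming the job databases live in Duplicati’s default folder; the folder locations, destination, and file names below are placeholders, and the point is only that the second job keeps its own active database out of what it backs up:

```
# Second job: back up the folder holding the job databases, but keep this
# job's own database elsewhere (--dbpath) so it never tries to capture the
# database that is busy running this very backup.
Duplicati.CommandLine.exe backup "file:///mnt/usb/duplicati-db-copy" /home/user/.config/Duplicati --dbpath=/tmp/db-copy-job.sqlite
```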

While a database backup is much smaller than the actual backup (because it’s basically cached information attempting to stay in sync with the destination data), some people find that Duplicati’s deduplication doesn’t help much because so much of the database changes between runs, so each upload winds up being pretty much the whole database. Possibly that won’t matter to you, because the wish seems to be a large self-contained backup to then park.

To get back to an earlier suggestion, maybe you could use a different backup program to make a self-contained backup for loss-of-disk cases, while letting Duplicati run a continuing backup trimmed by its retention policy. rclone may be useful for remote copying. Local copying has many options, but a big local disaster would leave nothing.
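
For instance, a single rclone command can mirror the destination folder to a remote you have already set up with rclone config (the local path and remote name below are placeholders):

```
# Copy the Duplicati destination folder to an rclone remote so an off-site
# copy of the backup files exists.
rclone copy /mnt/backups/duplicati remote:duplicati-copy --progress
```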

If you like complete file backups, possibly you could also consider even-more-complete image backups. Free versions of commercial products exist, likely limited in fancy options you might not want anyway. One example.

If you decide to design a complex strategy that you will count on completely, please be sure to test it very well.